If you have taken Andrew Ng’s deeplearning.ai specialization on Coursera, you will have learned in Course 1 about computation graphs and how back propagation computes derivatives over such a graph. This graph model of computation is also the basis of TensorFlow, a Python library widely used in deep learning, and it is what lets us view TensorFlow as a “programming language.”
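The idea can be sketched with a tiny hand-rolled reverse-mode autodiff in plain Python. This is an illustrative toy, not TensorFlow’s actual API: each `Node` records its parents in the graph and the local derivative with respect to each parent, and `backward` propagates gradients in reverse topological order, exactly the pattern the course describes.

```python
# Toy computation graph with reverse-mode autodiff (illustrative
# names; not TensorFlow's real API).

class Node:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # forward value
        self.parents = parents    # upstream nodes in the graph
        self.grad_fns = grad_fns  # local derivative w.r.t. each parent
        self.grad = 0.0           # accumulated gradient

def add(a, b):
    return Node(a.value + b.value, (a, b), (lambda g: g, lambda g: g))

def mul(a, b):
    return Node(a.value * b.value, (a, b),
                (lambda g: g * b.value, lambda g: g * a.value))

def backward(out):
    # Visit nodes in reverse topological order, accumulating gradients.
    out.grad = 1.0
    order, seen = [], set()
    def topo(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for p in n.parents:
            topo(p)
        order.append(n)
    topo(out)
    for n in reversed(order):
        for p, fn in zip(n.parents, n.grad_fns):
            p.grad += fn(n.grad)

# f(x, y) = x*y + x  =>  df/dx = y + 1, df/dy = x
x, y = Node(3.0), Node(4.0)
f = add(mul(x, y), x)
backward(f)
print(f.value, x.grad, y.grad)  # 15.0 5.0 3.0
```

TensorFlow’s graphs work on tensors rather than scalars and add sessions, devices, and control flow on top, but the two-pass structure (build the graph, then walk it backwards for gradients) is the same.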
Google published a paper describing the computational model underlying TensorFlow:
> TensorFlow is a powerful, programmable system for machine learning. This paper aims to provide the basics of a conceptual framework for understanding the behavior of TensorFlow models during training and inference: it describes an operational semantics, of the kind common in the literature on programming languages. More broadly, the paper suggests that a programming-language perspective is fruitful in designing and in explaining systems such as TensorFlow.
Note that this model is not limited to deep learning.
- Coursera: Deep Learning Specialization. [Coursera]
- TensorFlow. [TensorFlow]
- Martin Abadi, Michael Isard, Derek G. Murray, “A Computational Model for TensorFlow: An Introduction,” MAPL 2017. [GoogleResearch]