Computation graph
Everything in TensorFlow is represented as a computational graph consisting of nodes and edges, where nodes are mathematical operations, such as addition and multiplication, and edges are tensors. Representing computation as a graph makes it efficient to optimize resource usage, and it also promotes distributed computing.
Say we have node B, whose input depends on the output of node A; this type of dependency is called a direct dependency.
For example:
import tensorflow as tf
# B takes the output of A as its input, so B depends directly on A
A = tf.multiply(8, 5)
B = tf.multiply(A, 1)
When node B doesn't depend on node A for its input, it is called an indirect dependency.
For example:
# B does not take the output of A as its input, so the two nodes are independent
A = tf.multiply(8, 5)
B = tf.multiply(4, 3)
So, if we can understand these dependencies, we can distribute the independent computations across the available resources and reduce the computation time.
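For instance, since A and B in the previous example do not depend on each other, they can in principle be placed on different devices and computed in parallel. The following is a minimal sketch of this idea using the TensorFlow 1.x API; it assumes a GPU may or may not be present, so soft placement is enabled to let TensorFlow fall back to another device:
import tensorflow as tf
# Pin the two independent nodes to different devices
with tf.device('/cpu:0'):
    A = tf.multiply(8, 5)
with tf.device('/gpu:0'):
    B = tf.multiply(4, 3)
# allow_soft_placement lets TensorFlow pick another device if '/gpu:0' is unavailable
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    print(sess.run([A, B]))  # prints [40, 12]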
Whenever we import TensorFlow, a default graph is created automatically, and all the nodes we create are associated with that default graph.
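A quick way to verify this, again assuming the TensorFlow 1.x API, is to check the graph attribute of a newly created node against tf.get_default_graph():
import tensorflow as tf
A = tf.multiply(8, 5)
# The new node is attached to the default graph automatically
print(A.graph is tf.get_default_graph())  # prints True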