Hands-On Deep Learning Architectures with Python
Yuxi (Hayden) Liu, Saransh Mehta
Building a graph
A TensorFlow graph is a series of operations organized as a graph. The model architecture is first built in the form of a TensorFlow graph. You need to keep three basic operations in mind:
- tf.constant: Holds a constant tensor, just like a constant in Python, but unlike a Python constant, its value is only evaluated when the graph is run inside a TensorFlow Session.
- tf.Variable: Holds a variable tensor that is learnable; its value is updated during training.
- tf.placeholder: This is an interesting feature of TensorFlow. At the time of building the graph, we don't provide the input data; we only lay out the shape and data type of the input the graph will receive. A placeholder thus acts as a container through which input tensors flow once a Session is activated (see the short sketch after this list).
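For illustration, here is a minimal sketch of the three operations in use. The variable names a, w, and x are hypothetical, and the TensorFlow 1.x API is assumed:
>>> import tensorflow as tf
>>> a = tf.constant(3.0)                          # fixed value, baked into the graph
>>> w = tf.Variable(tf.random_normal([1]))        # learnable, updated during training
>>> x = tf.placeholder(tf.float32, shape=[None])  # only shape/dtype declared; data is fed at run time
>>> y = w * x + a                                 # operations on these tensors simply extend the graph
Nothing is computed at this point; each line only adds nodes to the graph.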
Let's try to add two constants in TensorFlow as follows:
>>> import tensorflow as tf
>>> t1 = tf.constant('hey')
>>> t2 = tf.constant('there')
>>> sum = t1 + t2
>>> print(sum)
This will print something like Tensor("add:0", shape=(), dtype=string). Were you expecting heythere? That doesn't happen because TensorFlow runs the graph only when a Session is activated. By defining the constants, we have only built a graph, so the print statement merely describes the tensor that the sum will produce once the graph is run. So, let's create a Session.
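As a brief sketch of that next step (assuming the TensorFlow 1.x Session API), evaluating the sum inside a Session returns the concatenated string:
>>> with tf.Session() as sess:
...     print(sess.run(sum))   # expected output: b'heythere'
Here sess.run executes the graph nodes needed to produce sum, which is why the concrete value only appears at this point.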