Hands-On Deep Learning Architectures with Python
Yuxi (Hayden) Liu, Saransh Mehta
Building a graph
A TensorFlow graph is a set of operations organized into a computational graph. The model architecture is first built in the form of such a graph. You need to keep three basic constructs in mind:
- tf.constant: Holds a constant tensor, just like a constant in Python, but unlike Python it is only evaluated when the graph is run inside a TensorFlow Session.
- tf.Variable: Holds a variable tensor whose value is learnable and gets updated during training.
- tf.placeholder: This is an interesting feature of TensorFlow. At the time of building the graph, we don't provide the input data; we only declare the shape and data type of the input the graph will receive. The placeholder thus acts as a container through which input tensors flow when a Session is run (see the sketch after this list).
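Here is a minimal sketch that puts the three constructs together; the names (weights, x) and shapes are illustrative choices for this example, not taken from the book:

import tensorflow as tf

# Constant: a fixed tensor, evaluated only when the graph is run
bias = tf.constant([0.5, 0.5])

# Variable: a learnable tensor, updated during training (initialized to zeros here)
weights = tf.Variable(tf.zeros([3, 2]), name='weights')

# Placeholder: declares shape and dtype now, receives data only at run time
x = tf.placeholder(tf.float32, shape=[None, 3], name='x')

# An operation that combines all three constructs
y = tf.matmul(x, weights) + bias

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # Variables must be initialized first
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))  # feeds the placeholder

Since the weights start at zero, this prints a 1 x 2 matrix equal to the bias, [[0.5 0.5]]; the point is that nothing is computed until sess.run is called with data fed into the placeholder.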
Let's try to add two constants in TensorFlow as follows:
>>> import tensorflow as tf
>>> t1 = tf.constant('hey')
>>> t2 = tf.constant('there')
>>> sum = t1 + t2
>>> print(sum)
This will output something like this: Tensor("add:0", shape=(), dtype=string). Were you expecting heythere? That doesn't happen because TensorFlow runs the graph only when a Session is active. By defining the constants we have only built a graph, so print can only describe what the sum will be once the graph is run. So, let's create a Session.
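As a minimal sketch of that next step (assuming the TensorFlow 1.x Session API used above), running the sum tensor inside a Session produces the concatenated string:

>>> with tf.Session() as sess:
...     print(sess.run(sum))
b'heythere'

Note that string tensors come back as Python bytes objects under Python 3, which is why the output shows b'heythere' rather than plain heythere.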