
  • Deep Learning Essentials
  • Wei Di Anurag Bhardwaj Jianing Wei

Automatic differentiation

TensorFlow provides a very convenient API that derives the gradients (the deltas) for us and updates the network parameters accordingly:

# Define the cost as the square of the errors
cost = tf.square(error)

# The Gradient Descent Optimizer will do the heavy lifting
learning_rate = 0.01
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

# Define the function we want to approximate
def linear_fun(x):
    y = x[:, 0] * 2 + x[:, 1] * 4 + 1
    return y.reshape(y.shape[0], 1)

# Other variables during learning
train_batch_size = 100
test_batch_size = 50

# Normal TensorFlow - initialize values, create a session and run the model
sess = tf.Session()
sess.run(tf.global_variables_initializer())

for i in range(1000):
    x_value = np.random.rand(train_batch_size, 2)
    y_value = linear_fun(x_value)
    sess.run(optimizer, feed_dict={a_0: x_value, y: y_value})
    if i % 100 == 0:
        test_x = np.random.rand(test_batch_size, 2)
        res_val = sess.run(res, feed_dict={a_0: test_x, y: linear_fun(test_x)})
        print(res_val)
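To make the "automatic" part concrete, the following is a plain-NumPy sketch (an illustration, not code from this book) of what GradientDescentOptimizer.minimize effectively does here: differentiate the squared-error cost with respect to each parameter and apply the update w ← w − η·∇w. The weight and bias names are hypothetical; the target function and learning setup mirror the example above.

```python
import numpy as np

np.random.seed(0)

def linear_fun(x):
    # Same target function as in the text: y = 2*x0 + 4*x1 + 1
    y = x[:, 0] * 2 + x[:, 1] * 4 + 1
    return y.reshape(y.shape[0], 1)

# Hypothetical parameters of a single linear layer
w = np.random.randn(2, 1)
b = np.zeros((1, 1))
learning_rate = 0.1

for i in range(10000):
    x = np.random.rand(100, 2)
    target = linear_fun(x)
    pred = x.dot(w) + b          # forward pass
    error = pred - target        # cost is mean(error**2)
    # Gradients of the cost, written out by hand here;
    # TensorFlow derives these for us via automatic differentiation
    grad_w = 2 * x.T.dot(error) / len(x)
    grad_b = 2 * np.mean(error, keepdims=True)
    # The gradient-descent update that minimize() applies each step
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b
```

After training, w and b recover the true coefficients (approximately 2, 4, and 1), which is exactly what the TensorFlow graph above converges to without us ever writing a gradient formula.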

In addition to this basic setting, let’s now talk about a few important concepts you might encounter in practice.
