- Neural Network Programming with TensorFlow
- Manpreet Singh Ghotra, Rajdeep Dua
Hessian
The gradient is the first derivative of a function of a vector, whereas the Hessian is its second derivative. We will go through the notation now:

Similar to the gradient, the Hessian is defined only when f(x) is real-valued. It is the square matrix of all second-order partial derivatives:

H(f)(x)_{i,j} = \frac{\partial^2 f(x)}{\partial x_i \, \partial x_j}
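Before turning to TensorFlow, the definition above can be sketched numerically with plain NumPy. The helper below, `numerical_hessian` (a hypothetical name, not from the book), approximates each second partial derivative with central finite differences:

```python
import numpy as np

def numerical_hessian(f, x, eps=1e-4):
    """Approximate the Hessian of a scalar function f at point x
    using central finite differences."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            # central difference for d^2 f / (dx_i dx_j)
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
    return H

# Example: f(x) = x0^2 + x0*x1 has the constant Hessian [[2, 1], [1, 0]]
f = lambda x: x[0]**2 + x[0] * x[1]
print(numerical_hessian(f, np.array([1.0, 2.0])))
```

Automatic differentiation, as used by TensorFlow below, computes the same matrix exactly rather than approximately.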

The following example shows the hessian implementation using TensorFlow:
import tensorflow as tf
import numpy as np

X = tf.Variable(np.random.random_sample(), dtype=tf.float32)
y = tf.Variable(np.random.random_sample(), dtype=tf.float32)

def createCons(x):
    return tf.constant(x, dtype=tf.float32)

# f(X, y) = X^2 + 2Xy + 3y^2 + 4X + 5y + 6
function = tf.pow(X, createCons(2)) + createCons(2) * X * y \
    + createCons(3) * tf.pow(y, createCons(2)) \
    + createCons(4) * X + createCons(5) * y + createCons(6)

# compute the hessian
def hessian(func, varbles):
    matrix = []
    for v_1 in varbles:
        tmp = []
        for v_2 in varbles:
            # calculate the derivative twice, first w.r.t. v_2 and then w.r.t. v_1
            tmp.append(tf.gradients(tf.gradients(func, v_2)[0], v_1)[0])
        # a None gradient means the second derivative is zero
        tmp = [createCons(0) if t is None else t for t in tmp]
        tmp = tf.stack(tmp)
        matrix.append(tmp)
    matrix = tf.stack(matrix)
    return matrix

hessian_matrix = hessian(function, [X, y])

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(hessian_matrix))
The output of this is shown as follows:
[[ 2.  2.]
 [ 2.  6.]]
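As a sanity check, the function is quadratic, so its Hessian is constant: differentiating f(X, y) = X² + 2Xy + 3y² + 4X + 5y + 6 by hand gives ∂²f/∂X² = 2, ∂²f/∂X∂y = 2, and ∂²f/∂y² = 6, matching the TensorFlow output. A quick numerical check with central finite differences (the point and step size are arbitrary choices, not from the book) confirms this without any TensorFlow machinery:

```python
# The quadratic from the text; its Hessian should be [[2, 2], [2, 6]].
f = lambda X, y: X**2 + 2*X*y + 3*y**2 + 4*X + 5*y + 6

# Central finite differences at an arbitrary point (the Hessian of a
# quadratic is constant, so any point works).
X0, y0, h = 0.7, -1.3, 1e-4
d2f_dX2 = (f(X0 + h, y0) - 2*f(X0, y0) + f(X0 - h, y0)) / h**2
d2f_dy2 = (f(X0, y0 + h) - 2*f(X0, y0) + f(X0, y0 - h)) / h**2
d2f_dXdy = (f(X0 + h, y0 + h) - f(X0 + h, y0 - h)
            - f(X0 - h, y0 + h) + f(X0 - h, y0 - h)) / (4 * h**2)
print(d2f_dX2, d2f_dXdy, d2f_dy2)
```

Note that automatic differentiation gives these entries exactly (2.0, 2.0, 6.0), while the finite-difference values only agree up to rounding error.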