- Neural Network Programming with TensorFlow
- Manpreet Singh Ghotra, Rajdeep Dua
Understanding backpropagation
When a feedforward neural network accepts an input x and produces an output ŷ, information flows forward through the network. The input x provides the initial information, which propagates through the hidden units at each layer and finally produces ŷ. This is called forward propagation. During training, forward propagation continues until it produces a scalar cost J(θ). The backpropagation algorithm, often called backprop, allows information from the cost to flow backward through the network in order to compute the gradient.
Computing an analytical expression for the gradient is straightforward, but numerically evaluating such an expression can be computationally expensive. The backpropagation algorithm computes the gradient using a simple and inexpensive procedure.
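As a minimal sketch of these two phases, the following example runs forward propagation through a small one-hidden-layer network to obtain a scalar cost J(θ), and then lets TensorFlow's automatic differentiation propagate information from the cost backward to every parameter. The layer sizes, variable names, and the use of the TensorFlow 2 `tf.GradientTape` API are illustrative assumptions, not code from the book:

```python
import tensorflow as tf

# Illustrative toy network: shapes and names are assumptions for this sketch.
x = tf.random.normal([32, 10])                      # batch of inputs x
y = tf.random.normal([32, 1])                       # targets
W1 = tf.Variable(tf.random.normal([10, 16]) * 0.1)  # hidden-layer weights
b1 = tf.Variable(tf.zeros([16]))
W2 = tf.Variable(tf.random.normal([16, 1]) * 0.1)   # output-layer weights
b2 = tf.Variable(tf.zeros([1]))

with tf.GradientTape() as tape:
    # Forward propagation: information flows from x through the hidden units to y_hat.
    h = tf.nn.relu(tf.matmul(x, W1) + b1)
    y_hat = tf.matmul(h, W2) + b2
    # Forward propagation ends in a scalar cost J(theta).
    J = tf.reduce_mean(tf.square(y_hat - y))

# Backpropagation: the cost's gradient flows backward to each parameter.
grads = tape.gradient(J, [W1, b1, W2, b2])
```

The gradients returned by `tape.gradient` are exactly what a training step would feed to an optimizer to update θ; the backward pass reuses the intermediate values recorded during the forward pass, which is what keeps the procedure inexpensive.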