
Batch normalization

Batch normalization is a technique that normalizes feature vectors to have zero mean and unit variance. It is used to stabilize learning and to deal with poor weight initialization problems. Rather than a pre-processing step applied only to the input, it is applied to the hidden layers of the network, and it helps us to reduce internal covariate shift.

Batch normalization was introduced by Ioffe and Szegedy in their 2015 paper, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. This can be found at the following link: https://arxiv.org/pdf/1502.03167.pdf.

The benefits of batch normalization are as follows:

  • Reduces the internal covariate shift: Batch normalization reduces internal covariate shift by normalizing the inputs to each layer.
  • Faster training: Networks train faster when layer inputs follow a normal/Gaussian distribution. Batch normalization whitens the inputs to the internal layers of the network. Overall training converges faster, although each iteration is slightly slower because of the extra calculations involved.
  • Higher accuracy: Networks trained with batch normalization generally achieve better accuracy.
  • Higher learning rate: Generally, when we train neural networks, we use a lower learning rate, which makes the network slow to converge. With batch normalization, we can use higher learning rates, making our network converge faster.
  • Reduces the need for dropout: When we use dropout, we compromise some of the essential information in the internal layers of the network. Batch normalization acts as a regularizer, meaning we can train the network without a dropout layer.

In batch normalization, we apply normalization to all the hidden layers, rather than applying it only to the input layer.
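The normalization step described above can be sketched in a few lines of NumPy. This is a minimal illustration of the forward pass only (the function name, the `eps` default, and the toy data are our own choices; in practice, frameworks such as Keras or PyTorch provide this as a ready-made layer that also tracks running statistics for inference):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of feature vectors to zero mean and unit
    variance per feature, then scale and shift with the learnable
    parameters gamma and beta."""
    mean = x.mean(axis=0)          # per-feature mean over the batch
    var = x.var(axis=0)            # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta    # scale and shift

# Toy batch: 64 samples, 4 features, deliberately off-center
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))

out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0))  # close to 0 for every feature
print(out.std(axis=0))   # close to 1 for every feature
```

With `gamma=1` and `beta=0`, the output is simply the whitened input; during training, `gamma` and `beta` are learned so the network can recover any scale and shift it finds useful.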
