
Batch normalization

Batch normalization is a technique that normalizes the feature vectors to have zero mean and unit variance. It is used to stabilize learning and to deal with poor weight initialization problems. It is a normalization step applied to the hidden layers of the network, and it helps us to reduce internal covariate shift.

Batch normalization was introduced by Ioffe and Szegedy in their 2015 paper, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. This can be found at the following link: https://arxiv.org/pdf/1502.03167.pdf.
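The normalization described above can be sketched in a few lines of NumPy. This is a simplified illustration, not the full training-time algorithm from the paper: the learnable scale (`gamma`) and shift (`beta`) parameters are shown as plain arguments, and the running statistics used at inference time are omitted.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a mini-batch to zero mean and unit variance per feature,
    then apply the learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)               # per-feature mean over the batch
    var = x.var(axis=0)                 # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# A mini-batch of 64 samples with 10 features, deliberately off-center
batch = np.random.randn(64, 10) * 5.0 + 3.0
out = batch_norm(batch)
# out now has (approximately) zero mean and unit variance per feature
```

The small `eps` term guards against division by zero when a feature's variance is close to zero.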

The benefits of batch normalization are as follows:

  • Reduces the internal covariate shift: Batch normalization helps us to reduce the internal covariate shift by normalizing values.
  • Faster training: Networks train faster when the values fed to each layer follow a normal/Gaussian distribution. Batch normalization helps to whiten the inputs to the internal layers of our network. Overall training is faster, even though each iteration slows down slightly because of the extra calculations involved.
  • Higher accuracy: Batch normalization provides better accuracy.
  • Higher learning rate: Generally, when we train neural networks, we use a lower learning rate, which makes the network take a long time to converge. With batch normalization, we can use higher learning rates, making our network converge faster.
  • Reduces the need for dropout: When we use dropout, we compromise some of the essential information in the internal layers of the network. Batch normalization acts as a regularizer, meaning we can train the network without a dropout layer.

In batch normalization, we apply normalization to all the hidden layers, rather than applying it only to the input layer.
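As a minimal sketch of this idea, the forward pass below applies the per-batch normalization at each hidden layer of a tiny two-layer network. The `bn` helper, the layer sizes, and the deliberately poorly scaled weights are all illustrative assumptions, not code from the source.

```python
import numpy as np

def bn(x, eps=1e-5):
    # Zero-mean, unit-variance normalization per feature over the mini-batch
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 8))           # mini-batch of 32 samples, 8 features
w1 = rng.normal(size=(8, 16)) * 10.0   # poorly scaled hidden-layer weights
w2 = rng.normal(size=(16, 4))

h = np.maximum(bn(x @ w1), 0.0)        # hidden layer: normalize, then ReLU
out = bn(h @ w2)                       # second hidden layer, also normalized
```

Even though `w1` is badly scaled, the activations reaching each layer stay well-conditioned because they are renormalized at every hidden layer rather than only at the input.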
