
Batch normalization

Batch normalization is a technique that normalizes the feature vectors to have zero mean and unit variance. It is used to stabilize learning and to deal with poor weight initialization problems. Rather than being a pre-processing step applied only to the input, it is applied to the hidden layers of the network and helps us to reduce internal covariate shift.

Batch normalization was introduced by Ioffe and Szegedy in their 2015 paper, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. This can be found at the following link: https://arxiv.org/pdf/1502.03167.pdf.
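As a concrete illustration of the transform described in the paper, the following sketch normalizes a mini-batch of feature vectors to zero mean and unit variance per feature, then applies the learnable scale and shift parameters, gamma and beta. The batch size, feature dimension, and epsilon value are illustrative assumptions, not taken from the text above.

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x has shape (batch_size, num_features); gamma and beta hold one
    # learnable scale and shift value per feature
    mu = x.mean(axis=0)                    # per-feature mean over the mini-batch
    var = x.var(axis=0)                    # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta            # scale and shift

# Illustrative usage with random activations from a hidden layer
x = np.random.randn(32, 4) * 3.0 + 5.0     # mini-batch of 32 samples, 4 features
gamma, beta = np.ones(4), np.zeros(4)
y = batch_norm(x, gamma, beta)
print(y.mean(axis=0), y.std(axis=0))       # approximately 0 and 1 per feature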

The benefits of batch normalization are as follows:

  • Reduces the internal covariate shift: Batch normalization reduces the internal covariate shift by normalizing the activations of each layer.
  • Faster training: Networks train faster when layer inputs follow a normal/Gaussian distribution. Batch normalization helps to whiten the inputs to the internal layers of our network. Overall training converges faster, even though each iteration is slightly slower because of the extra calculations involved.
  • Higher accuracy: Models trained with batch normalization generally achieve better accuracy.
  • Higher learning rate: Generally, when we train neural networks, we use a low learning rate, which makes the network take a long time to converge. With batch normalization, we can use higher learning rates, allowing the network to reach a minimum faster.
  • Reduces the need for dropout: When we use dropout, we compromise some of the essential information in the internal layers of the network. Batch normalization acts as a regularizer, meaning we can train the network without a dropout layer.

In batch normalization, we apply normalization to all the hidden layers, rather than applying it only to the input layer.
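As a minimal sketch of this idea (assuming a Keras/TensorFlow setup, which the text above does not specify), the model below inserts a BatchNormalization layer after each hidden Dense layer while leaving the input and output layers unnormalized; the layer sizes and activations are illustrative.

import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative fully connected classifier with batch normalization
# applied to the hidden layers rather than only to the input
model = models.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(256),
    layers.BatchNormalization(),     # normalize the first hidden layer's activations
    layers.Activation('relu'),
    layers.Dense(128),
    layers.BatchNormalization(),     # normalize the second hidden layer's activations
    layers.Activation('relu'),
    layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()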
