
Regularization techniques

Overfitting is a common problem in ML, where a model indiscriminately learns all of the patterns in the training data, including the noise. A neural network is especially prone to overfitting during training because of its large number of parameters. In theory, given input data of any size, a sufficiently large Artificial Neural Network (ANN) can memorize all of its patterns, noise included. The weights of a model therefore have to be regularized to keep it from overfitting the data.

We will look at three types of regularization:

  • Dropout
  • Batch normalization
  • L1 and L2 regularization
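
To give a quick sense of where these techniques appear in practice, the following is a minimal sketch, assuming PyTorch (the framework, layer sizes, and hyperparameter values are illustrative assumptions, not taken from this chapter). It builds a small feed-forward network that uses batch normalization and dropout, and applies L2 regularization through the optimizer's weight decay:

import torch
import torch.nn as nn

# Illustrative network: sizes (784 inputs, 256 hidden units, 10 outputs) are assumed
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),   # batch normalization: normalizes activations over each mini-batch
    nn.ReLU(),
    nn.Dropout(p=0.5),     # dropout: randomly zeroes activations during training
    nn.Linear(256, 10),
)

# L2 regularization is commonly applied as weight decay in the optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

# L1 regularization is usually added to the loss by hand, for example:
# loss = criterion(outputs, targets) + 1e-5 * sum(p.abs().sum() for p in model.parameters())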