
Regularization techniques

Overfitting is a common problem in ML, where a model learns all of the patterns in the training data, including the noise. A neural network is particularly prone to overfitting during training because of its large number of parameters. In theory, given input data of any size, a sufficiently large Artificial Neural Network (ANN) can memorize all of its patterns, noise included. Therefore, a model's weights have to be regularized to keep it from overfitting the data.

We will look at three types of regularization:

  • Dropout
  • Batch normalization
  • L1 and L2 regularization
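
To give a sense of where these techniques fit, here is a minimal Keras sketch of a simple feed-forward classifier that applies all three. The layer sizes, input shape, and hyperparameter values are illustrative assumptions, not settings from this book:

```python
# A minimal sketch: a feed-forward classifier with all three regularization
# techniques applied (illustrative layer sizes and hyperparameters).
from tensorflow.keras import layers, models, regularizers

model = models.Sequential([
    # L1 and L2 regularization penalize large weights in this dense layer
    layers.Dense(128, activation='relu', input_shape=(784,),
                 kernel_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4)),
    # Batch normalization re-scales and re-centers the activations
    layers.BatchNormalization(),
    # Dropout randomly zeroes 50% of the units during training
    layers.Dropout(0.5),
    layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```

The following sections look at each of these techniques in turn.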