Regularization techniques
Overfitting is a common problem in machine learning (ML), where a model learns not only the underlying patterns in the training data but also its noise. A neural network is especially prone to overfitting during training because of its large number of trainable parameters. In theory, given input data of any size, a large enough Artificial Neural Network (ANN) can memorize all of the patterns in it, along with the noise. Therefore, the weights of a model have to be regularized to avoid overfitting the data.
We will look at three types of regularization (a brief code sketch follows this list):
- Dropout
- Batch normalization
- L1 and L2 regularization
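As a minimal sketch of how these three techniques are typically applied, the following Keras model (an illustrative example, not the book's own code; the layer sizes and input dimension are hypothetical) uses dropout layers, batch normalization, and L1/L2 weight penalties together:

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    layers.Input(shape=(100,)),  # hypothetical input feature size
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty on weights
    layers.BatchNormalization(),  # normalize activations of the previous layer
    layers.Dropout(0.5),          # randomly drop 50% of units during training
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l1(1e-5)),  # L1 penalty on weights
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

Dropout and batch normalization are active only during training and are automatically disabled at inference time, while the L1/L2 penalties are added to the loss that the optimizer minimizes.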