- Hands-On Natural Language Processing with Python
- Rajesh Arumugam, Rajalingappaa Shanmugamani
Regularization techniques
Overfitting is a common problem in machine learning, where a model learns not only the underlying patterns in the training data but also its noise. Neural networks are particularly prone to overfitting because of their large number of parameters: in principle, given training data of any size, a large enough Artificial Neural Network (ANN) can memorize all of its patterns, noise included. The model's weights therefore have to be regularized to avoid overfitting the data.
We will look at three types of regularization:
- Dropout
- Batch normalization
- L1 and L2 regularization
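Before looking at each technique in detail, the following is a minimal NumPy sketch of what the three do to a batch of activations and a weight vector. The array shapes, the keep probability, and the penalty strength `lam` are illustrative choices, not values from this book:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # a small batch of 4 examples, 8 features

# Dropout (inverted): randomly zero units during training and scale the
# survivors by 1/keep_prob, so the expected activation is unchanged.
keep_prob = 0.8
mask = (rng.random(x.shape) < keep_prob) / keep_prob
x_drop = x * mask  # at test time, x is used unchanged

# Batch normalization: normalize each feature over the batch, then apply
# a learnable scale (gamma) and shift (beta). Here gamma=1, beta=0.
gamma, beta, eps = 1.0, 0.0, 1e-5
mean = x.mean(axis=0)
var = x.var(axis=0)
x_bn = gamma * (x - mean) / np.sqrt(var + eps) + beta

# L1 and L2 regularization: penalty terms added to the training loss
# that shrink the weights toward zero.
w = rng.standard_normal(8)
lam = 1e-2  # illustrative regularization strength
l1_penalty = lam * np.abs(w).sum()
l2_penalty = lam * (w ** 2).sum()
```

After batch normalization, each feature of `x_bn` has approximately zero mean and unit variance over the batch; the L1 and L2 penalties grow with the magnitude of the weights, which is what discourages the network from fitting noise with large weight values.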