- Effective Amazon Machine Learning
- Alexis Perrier
L2 regularization and Ridge
L2 regularization prevents the weights {wi} from spreading too far apart. The smaller weights that arise for non-correlated, though potentially meaningful, features will not become insignificant when compared to the weights associated with the important correlated features. L2 regularization enforces similar scaling of the weights. A direct consequence of L2 regularization is to reduce the negative impact of collinearity, since the weights can no longer diverge from one another.
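The collinearity effect can be seen in a small sketch, assuming NumPy and illustrative synthetic data (the dataset, the λ value, and the closed-form solutions below are assumptions for demonstration, not material from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two nearly collinear features: x2 is x1 plus a tiny perturbation.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=n)  # the target depends on the shared direction

# Ordinary least squares: w = (X^T X)^{-1} X^T y
# With collinear columns, X^T X is nearly singular and the weights
# can diverge from one another (large values of opposite sign).
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge (L2-regularized): w = (X^T X + lam * I)^{-1} X^T y
# The lam * I term keeps the system well conditioned, so the two
# weights stay on a similar scale.
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("OLS weights:  ", w_ols)
print("Ridge weights:", w_ridge)
```

Running this, the OLS weights on the two collinear columns drift far apart while the ridge weights stay close together, which is exactly the "similar scaling" effect described above.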
The Stochastic Gradient Descent algorithm with L2 regularization is known as the Ridge algorithm.
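As a minimal sketch of how SGD with an L2 penalty recovers the ridge solution, here is a toy implementation in NumPy (the function name, learning rate, epoch count, and per-sample penalty scaling are illustrative assumptions, not Amazon ML's actual learner settings):

```python
import numpy as np

def sgd_ridge(X, y, lam=1.0, lr=0.01, epochs=50, seed=0):
    """SGD for the ridge objective: ||Xw - y||^2 / 2 + lam * ||w||^2 / 2.

    Each step uses the gradient of one sample's squared error plus its
    share (lam / n) of the L2 penalty.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):          # shuffle samples each epoch
            err = X[i] @ w - y[i]
            grad = err * X[i] + (lam / n) * w  # per-sample penalized gradient
            w -= lr * grad
    return w
```

Summed over one pass, the per-sample gradients approximate the full ridge gradient X^T(Xw - y) + lam * w, so with a small enough learning rate the iterates settle near the closed-form ridge solution (X^T X + lam * I)^{-1} X^T y.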