- Effective Amazon Machine Learning
- Alexis Perrier
L2 regularization and Ridge
L2 regularization prevents the weights {wi} from spreading too far apart. The smaller weights that arise for uncorrelated, though potentially meaningful, features will not become insignificant compared to the weights associated with the important correlated features: L2 regularization enforces similar scaling of the weights. A direct consequence is that it reduces the negative impact of collinearity, since the weights can no longer diverge from one another.
The Stochastic Gradient Descent algorithm with L2 regularization is known as the Ridge algorithm.
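The idea above can be sketched in a few lines of plain NumPy: the L2 penalty 0.5·alpha·||w||² simply adds alpha·w to the gradient of each update, shrinking every weight toward zero by the same relative amount. The function name, hyperparameters, and synthetic data here are illustrative choices, not Amazon ML's actual implementation.

```python
import numpy as np

def sgd_ridge(X, y, alpha=0.1, lr=0.01, epochs=200, seed=0):
    """SGD for linear regression with an L2 penalty (Ridge) -- a sketch.

    Per-sample loss: 0.5 * (w.x + b - y)^2 + 0.5 * alpha * ||w||^2
    alpha, lr, and epochs are illustrative hyperparameters.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            err = X[i] @ w + b - y[i]
            # The L2 term adds alpha * w to the gradient, so every
            # update also shrinks the weights toward zero.
            w -= lr * (err * X[i] + alpha * w)
            b -= lr * err
    return w, b
```

On two nearly collinear features, unregularized least squares can let the two weights diverge to large opposite values; with the L2 term their updates stay balanced, so the learned coefficients remain close to each other, and a larger alpha yields a smaller overall weight norm.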