
Getting to know additional linear regressors

Before moving on to linear classifiers, it makes sense to also add the following additional linear regression algorithms to your toolset:

  • Elastic-net uses a mixture of L1 and L2 regularization techniques, where l1_ratio controls the mix of the two. This is useful when you want to learn a sparse model where few of the weights are non-zero (as in lasso) while keeping the benefits of ridge regularization.
  • Random Sample Consensus (RANSAC) is useful when your data contains outliers. It tries to separate the outliers from the inlier samples, and then fits the model on the inliers only.
  • Least-Angle Regression (LARS) is useful when dealing with high-dimensional data—that is, when there is a significant number of features compared to the number of samples. You may want to try it with the polynomial features example we saw earlier and see how it performs there.
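To make the elastic-net point concrete, here is a minimal sketch using scikit-learn's ElasticNet on synthetic data where only two of ten features actually matter; the data, alpha, and l1_ratio values are illustrative choices, not prescriptions:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Synthetic data: only features 0 and 1 are informative; the rest are noise
rng = np.random.RandomState(0)
X = rng.randn(100, 10)
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.randn(100)

# l1_ratio=0.5 mixes the L1 and L2 penalties equally;
# l1_ratio=1.0 would be pure lasso, l1_ratio=0.0 pure ridge
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)

# The L1 component drives the uninformative weights toward zero
print(model.coef_)
```

Because of the L1 component, most of the eight noise-feature coefficients shrink to (near) zero, while the two informative coefficients stay large, which is the sparsity-plus-ridge behavior described above.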
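A quick sketch of RANSAC in the same spirit: we corrupt part of a clean linear dataset with large outliers and let RANSACRegressor (whose default base estimator is ordinary linear regression) recover the underlying line. The data and the proportion of outliers are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor

# Clean linear data: y = 2x + 1 plus small noise
rng = np.random.RandomState(42)
X = rng.uniform(-5, 5, size=(100, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(0, 0.2, size=100)

# Corrupt the first 20 samples with large outliers
y[:20] += 30

ransac = RANSACRegressor(random_state=0)
ransac.fit(X, y)

# The final model is fit on the inliers only, so the slope stays close to 2
print(ransac.estimator_.coef_)
# inlier_mask_ flags which samples RANSAC kept as inliers
print(ransac.inlier_mask_.sum())
```

A plain LinearRegression on the same data would be pulled noticeably off the true line by the 20 outliers; RANSAC's consensus step discards them before the final fit.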
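Finally, a minimal LARS sketch on the kind of data it is meant for: more features than samples. The dimensions and the n_nonzero_coefs setting are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import Lars

# High-dimensional data: 50 samples but 80 features,
# with only features 0 and 1 carrying signal
rng = np.random.RandomState(0)
X = rng.randn(50, 80)
y = X[:, 0] - 2 * X[:, 1] + 0.05 * rng.randn(50)

# LARS adds one feature at a time along the least-angle path;
# here we stop after it has selected two features
lars = Lars(n_nonzero_coefs=2)
lars.fit(X, y)

# Only the two informative coefficients should be non-zero
print(lars.coef_[:2])
```

Even with 80 candidate features and only 50 samples, LARS picks out the two informative features in its first two steps, which is why it suits wide datasets such as the polynomial-features example mentioned above.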

Let's move on to the next section of the book where you will learn to use logistic regression to classify data.
