
Getting to know additional linear regressors

Before moving on to linear classifiers, it makes sense to also add the following additional linear regression algorithms to your toolset:

  • Elastic-net uses a mixture of L1 and L2 regularization techniques, where l1_ratio controls the mix of the two. This is useful when you want to learn a sparse model where only a few of the weights are non-zero (as in lasso) while keeping the benefits of ridge regularization.
  • Random Sample Consensus (RANSAC) is useful when your data contains outliers. It tries to separate the outliers from the inlier samples, then fits the model on the inliers only.
  • Least-Angle Regression (LARS) is useful when dealing with high-dimensional data—that is, when the number of features is large relative to the number of samples. You may want to try it with the polynomial features example we saw earlier and see how it performs there.
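As a minimal sketch, here is how the three regressors above can be tried in scikit-learn. The toy data, the `alpha` value, and the injected outliers are illustrative choices, not values from the book; RANSAC's strength shows in how its coefficient stays close to the true slope despite the outliers:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, RANSACRegressor, Lars

# Toy data: y = 3x + noise, with a handful of extreme outliers injected
rng = np.random.RandomState(42)
x = rng.uniform(0, 10, size=(100, 1))
y = 3 * x.ravel() + rng.normal(scale=0.5, size=100)
y[:5] += 50  # outliers that will pull an ordinary fit off course

# Elastic-net: l1_ratio mixes the L1 (lasso) and L2 (ridge) penalties
elastic = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(x, y)

# RANSAC: separates inliers from outliers, then fits on the inliers only
ransac = RANSACRegressor(random_state=42).fit(x, y)

# LARS: suited to data with many features relative to the sample count
lars = Lars().fit(x, y)

print("elastic-net slope:", elastic.coef_[0])
print("RANSAC slope:     ", ransac.estimator_.coef_[0])
print("LARS slope:       ", lars.coef_[0])
```

After fitting, `ransac.inlier_mask_` tells you which samples RANSAC treated as inliers, which is handy for inspecting what it discarded.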

Let's move on to the next section of the book where you will learn to use logistic regression to classify data.
