
Getting to know additional linear regressors

Before moving on to linear classifiers, it makes sense to also add the following additional linear regression algorithms to your toolset:

  • Elastic-net uses a mixture of L1 and L2 regularization, where the l1_ratio parameter controls the mix of the two. This is useful when you want to learn a sparse model where only a few of the weights are non-zero (as in lasso) while keeping the benefits of ridge regularization.
  • Random Sample Consensus (RANSAC) is useful when your data has outliers. It separates the outliers from the inlier samples and then fits the model on the inliers only.
  • Least-Angle Regression (LARS) is useful when dealing with high-dimensional data, that is, when the number of features is large compared to the number of samples. You may want to try it with the polynomial features example we saw earlier and see how it performs there.
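As a minimal sketch, here is how the three regressors above can be fitted with scikit-learn (the synthetic data from make_regression and all parameter values are illustrative choices, not taken from the book's examples):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, RANSACRegressor, Lars

# Synthetic regression data: 100 samples, 20 features
X, y = make_regression(n_samples=100, n_features=20, noise=5.0, random_state=42)

# Elastic-net: l1_ratio=1.0 is pure lasso (L1), l1_ratio=0.0 is pure ridge (L2);
# values in between mix the two penalties
enet = ElasticNet(alpha=1.0, l1_ratio=0.5).fit(X, y)

# RANSAC: fits its base estimator (linear regression by default) on inliers only;
# inlier_mask_ tells you which samples were treated as inliers
ransac = RANSACRegressor(random_state=42).fit(X, y)
inliers = ransac.inlier_mask_

# LARS: suited to high-dimensional data (many features relative to samples)
lars = Lars().fit(X, y)
```

All three expose the usual fit/predict interface, so they drop into the same workflow as the linear regressors covered earlier.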

Let's move on to the next section of the book where you will learn to use logistic regression to classify data.
