- Hands-On Machine Learning with scikit-learn and Scientific Python Toolkits
- Tarek Amr
Getting to know additional linear regressors
Before moving on to linear classifiers, it makes sense to also add the following additional linear regression algorithms to your toolset:
- Elastic-net uses a mixture of L1 and L2 regularization techniques, where l1_ratio controls the mix of the two. This is useful when you want to learn a sparse model where only a few of the weights are non-zero (as in lasso), while keeping the benefits of ridge regularization.
- Random Sample Consensus (RANSAC) is useful when your data has outliers. It separates the outliers from the inlier samples, then fits the model on the inliers only.
- Least-Angle Regression (LARS) is useful when dealing with high-dimensional data—that is, when there is a significant number of features compared to the number of samples. You may want to try it with the polynomial features example we saw earlier and see how it performs there.
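As a minimal sketch of how the three estimators above can be tried side by side, the following fits each of them on small synthetic data with a few injected outliers (the data-generation details here are illustrative assumptions, not from the book):

```python
import numpy as np
from sklearn.linear_model import ElasticNet, RANSACRegressor, Lars

# Synthetic data: y = 3x + noise, with a handful of outliers added
rng = np.random.RandomState(42)
x = rng.uniform(-5, 5, size=(100, 1))
y = 3 * x.ravel() + rng.normal(scale=0.5, size=100)
y[:5] += 50  # inject outliers to give RANSAC something to reject

preds = {}
for model in (
    ElasticNet(alpha=0.1, l1_ratio=0.5),       # l1_ratio mixes the L1 and L2 penalties
    RANSACRegressor(random_state=42),          # fits on the detected inliers only
    Lars(),                                    # least-angle regression
):
    model.fit(x, y)
    preds[model.__class__.__name__] = model.predict([[1.0]])[0]

print(preds)
```

Because RANSAC discards the shifted samples before fitting, its prediction at x = 1 should land close to the true value of 3, while the estimators that fit on all samples are pulled away by the outliers.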
Let's move on to the next section of the book, where you will learn to use logistic regression to classify data.