- Mastering Machine Learning with R
- Cory Lesmeister
Advanced Feature Selection in Linear Models
So far, we've examined the use of linear models for both quantitative and qualitative outcomes, with an eye on techniques for feature selection, that is, methods that exclude useless or unwanted predictor variables. We saw that linear models can be quite useful in machine learning problems, and that piecewise linear models, in the form of multivariate adaptive regression splines (MARS), can capture non-linear relationships. Additional techniques developed and refined over the last couple of decades can improve predictive ability and interpretability beyond the linear models discussed in the preceding chapters. These days, many datasets, such as those in the two prior chapters, have numerous features; it isn't unreasonable to have datasets with thousands of potential predictors.
The methods in this chapter may prove a better way to approach feature reduction and selection. We'll look at the concept of regularization, where the coefficients are constrained or shrunk towards zero. There are many methods and permutations of regularization, but we'll focus on ridge regression, the Least Absolute Shrinkage and Selection Operator (LASSO), and, finally, elastic net, which combines the benefits of both techniques into one; a brief code sketch follows the topic list below.
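For orientation, the standard penalized least-squares formulations (written here in generic notation, which may differ slightly from the notation used later in this chapter) are:

$$\hat{\beta}^{\text{ridge}} = \underset{\beta}{\arg\min} \sum_{i=1}^{n} \left(y_i - x_i^{T}\beta\right)^2 + \lambda \sum_{j=1}^{p} \beta_j^2$$

$$\hat{\beta}^{\text{lasso}} = \underset{\beta}{\arg\min} \sum_{i=1}^{n} \left(y_i - x_i^{T}\beta\right)^2 + \lambda \sum_{j=1}^{p} \left|\beta_j\right|$$

The squared (L2) penalty shrinks coefficients smoothly toward zero, while the absolute-value (L1) penalty can set them exactly to zero, performing selection; elastic net mixes the two penalties via a mixing parameter.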
The following are the topics we'll cover in this chapter:
- Overview of regularization
- Dataset creation
- Ridge regression
- LASSO
- Elastic net
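To make the three techniques concrete before we begin, here is a minimal sketch using the glmnet package, a common choice for fitting these models in R. The simulated data below is purely illustrative, not the dataset we'll build later in this chapter:

```r
# A minimal sketch of ridge, LASSO, and elastic net with glmnet.
library(glmnet)

set.seed(123)
n <- 100; p <- 20
x <- matrix(rnorm(n * p), nrow = n)        # predictor matrix
beta <- c(rep(2, 5), rep(0, p - 5))        # only 5 truly useful features
y <- as.vector(x %*% beta + rnorm(n))      # quantitative outcome

# alpha = 0 gives ridge, alpha = 1 gives LASSO, 0 < alpha < 1 gives elastic net
ridge <- glmnet(x, y, alpha = 0)
lasso <- glmnet(x, y, alpha = 1)
enet  <- glmnet(x, y, alpha = 0.5)

# Cross-validation to choose the shrinkage penalty lambda
cv_lasso <- cv.glmnet(x, y, alpha = 1)
coef(cv_lasso, s = "lambda.min")           # shrunk coefficients; many exactly zero
```

Note how the LASSO fit drives the coefficients of the irrelevant predictors to exactly zero, which is the feature-selection behavior this chapter explores in depth.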