Advanced Machine Learning with R
Cory Lesmeister, Dr. Sunil Kumar Chinnamgari
Advanced Feature Selection in Linear Models
So far, we've examined the use of linear models for both quantitative and qualitative outcomes, with an eye on the techniques of feature selection, that is, the methods and techniques for excluding useless or unwanted predictor variables. We saw that linear models can be quite useful in machine learning problems, and that piecewise linear models, such as multivariate adaptive regression splines (MARS), can capture non-linear relationships. Additional techniques have been developed and refined over the last couple of decades that can improve predictive ability and interpretability above and beyond the linear models discussed in the preceding chapters. Nowadays, many datasets, such as those in the two prior chapters, have numerous features; it isn't unreasonable to have datasets with thousands of potential features.
The methods in this chapter may prove to be a better way to approach feature reduction and selection. We'll look at the concept of regularization, where the coefficients are constrained or shrunk towards zero. There are many methods and permutations of regularization, but we'll focus on ridge regression, the Least Absolute Shrinkage and Selection Operator (LASSO), and, finally, elastic net, which combines the benefits of both techniques into one.
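To make the idea of constraining coefficients concrete, the three penalized objectives can be sketched as follows, using standard notation: \(\lambda \ge 0\) controls the strength of the penalty and \(\alpha \in [0, 1]\) mixes the two penalty types (the elastic net form shown here follows the convention used by the glmnet package):

```latex
\text{Ridge:} \quad
\min_{\beta} \; \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
+ \lambda \sum_{j=1}^{p} \beta_j^{2}

\text{LASSO:} \quad
\min_{\beta} \; \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
+ \lambda \sum_{j=1}^{p} \lvert\beta_j\rvert

\text{Elastic net:} \quad
\min_{\beta} \; \sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
+ \lambda \sum_{j=1}^{p} \Bigl(\tfrac{1-\alpha}{2}\,\beta_j^{2} + \alpha\,\lvert\beta_j\rvert\Bigr)
```

The squared (L2) penalty shrinks coefficients smoothly but never to exactly zero, while the absolute-value (L1) penalty can set coefficients exactly to zero, which is what makes LASSO perform feature selection.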
The following are the topics we'll cover in this chapter:
- Overview of regularization
- Dataset creation
- Ridge regression
- LASSO
- Elastic net
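Before diving into the R implementations, the contrast between the three techniques can be illustrated with a minimal sketch. This example uses Python's scikit-learn rather than the glmnet package used later in the chapter, and the penalty strengths (`alpha` values) are illustrative, not tuned; the point is only that the L1-based methods drive some coefficients exactly to zero while ridge does not:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet

# Synthetic data: 5 informative predictors plus 15 pure-noise columns,
# mimicking the "many useless features" situation described above.
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=42)

models = {
    "ridge": Ridge(alpha=1.0),        # L2 penalty: shrinks, never zeroes
    "lasso": Lasso(alpha=1.0),        # L1 penalty: can zero coefficients
    "elastic net": ElasticNet(alpha=1.0, l1_ratio=0.5),  # blend of both
}

# Count how many fitted coefficients each method sets exactly to zero.
counts = {}
for name, model in models.items():
    model.fit(X, y)
    counts[name] = int(np.sum(model.coef_ == 0.0))
    print(f"{name}: {counts[name]} of {X.shape[1]} coefficients are exactly zero")
```

Running this, ridge retains all 20 predictors with nonzero (if small) coefficients, whereas LASSO discards most of the noise columns outright, which is exactly the feature-selection behavior this chapter exploits.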