- Mastering Machine Learning for Penetration Testing
- Chiheb Chebbi
Dimensionality reduction
Dimensionality reduction is used to reduce the number of dimensions in a dataset. It is especially helpful when a problem becomes intractable as the number of variables increases. By dimensionality, we are referring to the number of features. One of the most basic reduction techniques is feature engineering.
Generally, there are many dimensionality reduction algorithms; a few of them are sketched in the Python example after this list:
- Low variance filter: Dropping variables that have low variance compared to the others.
- High correlation filter: This identifies pairs of variables with high correlation, using the Pearson or polychoric correlation coefficient, and selects one of them using the Variance Inflation Factor (VIF).
- Backward feature elimination: This is done by computing the sum of squared errors (SSE) after eliminating each variable in turn, and dropping the variable whose removal affects the error the least.
- Linear Discriminant Analysis (LDA): This reduces the number of features from the original n to at most the number of classes minus one.
- Principal Component Analysis (PCA): This is a statistical procedure that transforms the variables into a new set of uncorrelated variables (principal components).
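As a rough sketch (not taken from the book), the following Python example applies three of these techniques with scikit-learn on a synthetic dataset; the dataset shape, the variance threshold, and the component counts are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import VarianceThreshold
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic data: 500 samples, 20 features, 3 classes (assumed for illustration).
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

# Low variance filter: drop features whose variance falls below a threshold.
low_var = VarianceThreshold(threshold=0.1)
X_lv = low_var.fit_transform(X)
print("After low variance filter:", X_lv.shape)

# PCA: project onto the principal components that explain most of the variance.
pca = PCA(n_components=0.95)  # keep enough components for 95% of the variance
X_pca = pca.fit_transform(X)
print("After PCA:", X_pca.shape)

# LDA: supervised reduction to at most (number of classes - 1) features.
lda = LinearDiscriminantAnalysis(n_components=2)  # 3 classes -> at most 2
X_lda = lda.fit_transform(X, y)
print("After LDA:", X_lda.shape)
```

Note that LDA is supervised (it uses the class labels y), while the low variance filter and PCA operate on the features alone.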