- Machine Learning for Cybersecurity Cookbook
- Emmanuel Tsukerman
Standardizing your data
For many machine learning algorithms, performance is highly sensitive to the relative scale of features. For that reason, it is often important to standardize your features. To standardize a feature means to shift all of its values so that their mean = 0 and to scale them so that their variance = 1.
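For concreteness, here is a minimal sketch of this transformation using NumPy; the feature values are made up purely for illustration:

```python
# A minimal sketch of standardization (z-scoring); the values are invented.
import numpy as np

X = np.array([3.0, 7.0, 11.0, 15.0, 19.0])  # hypothetical feature values

# Shift so the mean is 0 and scale so the variance is 1.
X_standardized = (X - X.mean()) / X.std()

print(X_standardized.mean())  # approximately 0.0
print(X_standardized.var())   # approximately 1.0
```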
One instance in which standardizing is useful is when featurizing the PE header of a file. The PE header contains extremely large values (for example, the SizeOfInitializedData field) alongside very small ones (for example, the number of sections). For certain ML models, such as neural networks, this large discrepancy in magnitude between features can reduce performance.
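To illustrate, the following sketch applies scikit-learn's StandardScaler to a toy feature matrix; the columns and values are invented to mimic the scale gap described above and are not taken from the recipe itself:

```python
# A hedged sketch using scikit-learn's StandardScaler on toy values meant to
# mimic the gap between a SizeOfInitializedData-like feature and a
# number-of-sections-like feature; the numbers are invented for illustration.
import numpy as np
from sklearn.preprocessing import StandardScaler

# Column 0: SizeOfInitializedData-like values (very large).
# Column 1: number-of-sections-like values (very small).
X = np.array([
    [512000.0, 4],
    [1048576.0, 5],
    [204800.0, 3],
    [4194304.0, 8],
])

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# After scaling, each column has mean ~0 and unit variance, so both
# features contribute on comparable scales.
print(X_scaled)
```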