- Machine Learning Quick Reference
- Rahul Kumar
- 2021-08-20 10:05:07
Cross-validation and model selection
We have already spoken about overfitting. It concerns the stability of a model, since the real test of a model comes when it is applied to new, unseen data. One of the most important properties of a model is that it should capture the regular patterns in the data, not the noise.
Validation is an assurance that what the model has learned is a genuine relationship between the predictors and the response, not noise. Training error alone is a poor indicator of a model's quality. That's why we need cross-validation.
Here, we will stick with k-fold cross-validation and understand how it can be used.
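As a minimal sketch of the idea, the snippet below runs 5-fold cross-validation with scikit-learn; the dataset (iris) and the model (logistic regression) are illustrative assumptions, not choices made in the text:

```python
# A minimal sketch of k-fold cross-validation with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# Split the data into k=5 folds; each fold serves once as the
# held-out validation set while the model trains on the other four.
scores = cross_val_score(model, X, y, cv=5)

print("Fold accuracies:", scores)
print("Mean CV accuracy: %.3f" % scores.mean())
```

The mean of the fold scores is a far more honest estimate of generalization performance than the training error, since every score is computed on data the model did not see during fitting.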