Cross-validation
Cross-validation is another way of ensuring robustness in the model, at the expense of extra computation. In the ordinary modeling methodology, a model is developed on the train data and evaluated on the test data. In some extreme cases, the train and test sets might not have been split homogeneously, and unseen extreme cases might appear in the test data, which will drag down the performance of the model.
In the cross-validation methodology, on the other hand, the data is divided into equal parts; training is performed on all parts except one, and performance is evaluated on the part that was held out. This process is repeated as many times as the number of parts the user has chosen.
Example: in five-fold cross-validation, the data is divided into five parts; the model is then trained on four parts and tested on the remaining part. This process runs five times, so that every point in the data is used for testing exactly once. Finally, the reported error is the average of all the fold errors:
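In standard notation, for $k$ folds with per-fold errors $\text{error}_1, \dots, \text{error}_k$, the averaged error is:

$$\text{CV error} = \frac{1}{k}\sum_{i=1}^{k} \text{error}_i$$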

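The following is a minimal sketch of the five-fold procedure described above, assuming scikit-learn is available; the synthetic dataset and the linear regression model are illustrative placeholders, not taken from the original text.

```python
# Five-fold cross-validation: train on four parts, test on the held-out part,
# repeat five times, and average the fold errors.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold

# Synthetic data standing in for any training dataset
X, y = make_regression(n_samples=100, n_features=5, noise=10.0, random_state=42)

kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_errors = []

for train_idx, test_idx in kf.split(X):
    # Train on four parts, evaluate on the held-out fifth part
    model = LinearRegression()
    model.fit(X[train_idx], y[train_idx])
    preds = model.predict(X[test_idx])
    fold_errors.append(mean_squared_error(y[test_idx], preds))

# Final cross-validation error is the average of the five fold errors
cv_error = np.mean(fold_errors)
print("Per-fold MSE:", fold_errors)
print("Average CV error:", cv_error)
```

Shuffling before splitting (the `shuffle=True` argument) helps when the data is ordered, so that each fold is a more homogeneous sample of the whole dataset.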