- Deep Learning Quick Reference
- Mike Bernico
K-Fold cross-validation
If you're experienced with machine learning, you may be wondering why I would opt for Hold-Out (train/val/test) validation over K-Fold cross-validation. Training a deep neural network is a very expensive operation, and, put simply, training K of them for every set of hyperparameters we'd like to explore is usually not practical.
We can be reasonably confident that Hold-Out validation does a very good job, given a large enough val and test set. Most of the time, we are hopefully applying deep learning in situations where we have an abundance of data, which makes building an adequate val and test set straightforward.
Ultimately, it's up to you. As we will see later, Keras provides a scikit-learn interface that allows Keras models to be integrated into a scikit-learn pipeline. This allows us to perform K-Fold, Stratified K-Fold, and even grid searches with K-Fold. It's both possible and sometimes appropriate to use K-Fold CV when training deep models. That said, for the rest of the book we will focus on using Hold-Out validation.
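To make the idea concrete, here is a minimal sketch of running K-Fold cross-validation on a Keras model through the scikit-learn wrapper. It assumes the standalone Keras 2.x API (keras.wrappers.scikit_learn; newer releases moved this wrapper into the separate scikeras package), and the network architecture and data below are illustrative placeholders, not a recommendation.

```python
# A minimal sketch: K-Fold CV over a Keras model via the scikit-learn wrapper.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import KFold, cross_val_score

def build_model():
    # A small binary classifier; swap in your own architecture.
    model = Sequential()
    model.add(Dense(32, activation='relu', input_dim=20))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

# Placeholder data: 1000 samples, 20 features, binary labels.
X = np.random.rand(1000, 20)
y = np.random.randint(2, size=1000)

# Wrapping the model lets scikit-learn treat it like any other estimator,
# so KFold and cross_val_score work as usual -- at the cost of training
# one full network per fold.
clf = KerasClassifier(build_fn=build_model, epochs=10, batch_size=32,
                      verbose=0)
scores = cross_val_score(clf, X, y, cv=KFold(n_splits=5, shuffle=True))
print("5-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```

Note that the five calls to fit here are exactly the cost the section warns about: each fold retrains the network from scratch, which is why Hold-Out validation is usually the pragmatic choice for deep models.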