Deep Learning Quick Reference
Mike Bernico
K-Fold cross-validation
If you're experienced with machine learning, you may be wondering why I would opt for Hold-Out (train/val/test) validation over K-Fold cross-validation. Training a deep neural network is a very expensive operation, and, put simply, training K of them for every set of hyperparameters we'd like to explore is usually impractical.
We can be reasonably confident that Hold-Out validation does a very good job, provided the val and test sets are large enough. Most of the time, we are applying deep learning in situations where data is abundant, which makes adequately sized val and test sets easy to come by.
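To make the Hold-Out scheme concrete, here is a minimal sketch of a train/val/test split using scikit-learn's train_test_split. The 80/10/10 proportions and the toy data are my own illustrative assumptions, not a prescription from the book.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy data standing in for a real dataset.
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

# Carve off the test set first, then split val out of what remains.
# 10% test, then 1/9 of the remaining 90% gives an 80/10/10 split.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.10, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=1 / 9, random_state=42)
```

Each set is created exactly once, so the network only ever needs to be trained once per hyperparameter configuration.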
Ultimately, it's up to you. As we will see later, Keras provides a scikit-learn interface that allows Keras models to be integrated into a scikit-learn pipeline. This allows us to perform K-Fold, Stratified K-Fold, and even grid searches across K-Fold splits. It is sometimes both possible and appropriate to use K-Fold CV when training deep models, as the sketch below illustrates. That said, for the rest of the book we will focus on using Hold-Out validation.
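As a sketch of that scikit-learn interface, the snippet below wraps a tiny Keras network in KerasClassifier and runs Stratified K-Fold with cross_val_score. It assumes the keras.wrappers.scikit_learn module that shipped with Keras at the time of writing (newer TensorFlow releases moved this wrapper to the separate scikeras package); the architecture and data are illustrative only.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

def build_model():
    # A small binary classifier; the architecture here is illustrative.
    model = Sequential()
    model.add(Dense(32, activation='relu', input_dim=20))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

# Toy data standing in for a real dataset.
X = np.random.rand(500, 20)
y = np.random.randint(0, 2, size=500)

# The wrapper makes the Keras model look like a scikit-learn estimator.
clf = KerasClassifier(build_fn=build_model, epochs=10, batch_size=32,
                      verbose=0)

# Stratified 5-fold CV: each fold trains a fresh network from scratch,
# which is exactly why K-Fold is K times as expensive as Hold-Out.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(clf, X, y, cv=cv)
print("Fold accuracies:", scores, "mean:", scores.mean())
```

Because the wrapped model satisfies the scikit-learn estimator contract, the same object can also be dropped into GridSearchCV for the K-Fold grid searches mentioned above.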