
K-Fold cross-validation

If you're experienced with machine learning, you may be wondering why I would opt for Hold-Out (train/val/test) validation over K-Fold cross-validation. Training a deep neural network is a very expensive operation, and put very simply, training K of them for every set of hyperparameters we'd like to explore is usually not practical.

We can be somewhat confident that Hold-Out validation does a very good job, given large enough val and test sets. Most of the time, we are hopefully applying deep learning in situations where we have an abundance of data, which results in adequate val and test sets.

Ultimately, it's up to you. As we will see later, Keras provides a scikit-learn interface that allows Keras models to be integrated into a scikit-learn pipeline. This allows us to perform K-Fold, Stratified K-Fold, and even grid searches with K-Fold, as sketched below. It is sometimes both possible and appropriate to use K-Fold CV when training deep models. That said, for the rest of the book we will focus on using Hold-Out validation.
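
Here is a minimal sketch of what that looks like: a Keras model wrapped so scikit-learn can cross-validate it like any other estimator. The `build_model` architecture and the synthetic data are illustrative placeholders, not an example from this book, and depending on your Keras/TensorFlow version the wrapper may instead live in the separate `scikeras` package (`scikeras.wrappers.KerasClassifier`).

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

def build_model():
    # A small binary classifier; substitute your own architecture here.
    model = Sequential()
    model.add(Dense(32, activation='relu', input_dim=20))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model

# Synthetic data purely for illustration.
X = np.random.rand(1000, 20)
y = (X.sum(axis=1) > 10).astype(int)

# Wrap the Keras model so scikit-learn treats it like any estimator.
clf = KerasClassifier(build_fn=build_model, epochs=10, batch_size=32, verbose=0)

# Stratified 5-fold CV trains five separate models -- which is exactly
# why this gets expensive for deep networks.
kfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(clf, X, y, cv=kfold)
print("Mean accuracy: %.3f (+/- %.3f)" % (scores.mean(), scores.std()))
```

The same wrapped estimator can be dropped into `GridSearchCV` if you want to combine K-Fold with a hyperparameter grid search, at K times the training cost per candidate.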
