
K-Fold cross-validation

If you're experienced with machine learning, you may be wondering why I would opt for Hold-Out (train/val/test) validation over K-Fold cross-validation. Training a deep neural network is a very expensive operation, and put very simply, training K of them for every set of hyperparameters we'd like to explore is usually impractical.

We can be fairly confident that Hold-Out validation does a good job, provided the validation and test sets are large enough. Most of the time, we are hopefully applying deep learning in situations where we have an abundance of data, which yields adequately sized validation and test sets.
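A Hold-Out split like the one described above can be sketched with scikit-learn's `train_test_split`; the 80/10/10 proportions and the toy data here are illustrative, not a prescription:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy data standing in for a real dataset.
X = np.arange(1000).reshape(-1, 1)
y = np.arange(1000) % 2

# First carve off 20% as a temporary pool, then split that pool in half
# into validation and test sets, giving an 80/10/10 split overall.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.2, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.5, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 800 100 100
```

The model is then trained on `X_train`, tuned against `X_val`, and evaluated once on `X_test` at the very end.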

Ultimately, it's up to you. As we will see later, Keras provides a scikit-learn interface that allows Keras models to be integrated into a scikit-learn pipeline. This allows us to perform K-Fold, Stratified K-Fold, and even grid searches with K-Fold. It's both possible and sometimes appropriate to use K-Fold CV when training deep models. That said, for the rest of the book we will focus on using Hold-Out validation.
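For reference, the K-Fold mechanics look like this in scikit-learn. A wrapped Keras model (`KerasClassifier`, provided by the `scikeras` package in current releases) would slot in wherever the estimator goes; a `LogisticRegression` stands in here so the sketch runs without TensorFlow:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic classification data for illustration.
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

# K=5: the estimator is trained five times, each fold serving once
# as the validation set.
kfold = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=kfold)

print(f"Mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Note the cost: every candidate hyperparameter configuration is trained K times, which is exactly why this approach is often impractical for deep networks.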
