
Model checkpoints based on validation log loss 

It is good practice to save the model whenever the validation score chosen for evaluation improves. For our project, we will be tracking the validation log loss, and will save the model each time the validation score improves across epochs. This way, after training, we retain the model weights that produced the best validation score, rather than the final weights from when training stopped. Training will continue until the maximum number of epochs is reached, or until the validation log loss hasn't reduced for 10 epochs in a row. We will also reduce the learning rate when the validation log loss doesn't improve for 3 epochs. The following code block can be used to perform the learning rate reduction and checkpoint operations:

import keras
from keras.callbacks import EarlyStopping, CSVLogger, ModelCheckpoint

# Halve the learning rate when the validation loss plateaus for 3 epochs
reduce_lr = keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.50,
                                              patience=3, min_lr=0.000001)

# k is the fold index from the enclosing cross-validation loop
callbacks = [
    EarlyStopping(monitor='val_loss', patience=10, mode='min', verbose=1),
    CSVLogger('keras-5fold-run-01-v1-epochs_ib.log', separator=',', append=False),
    reduce_lr,
    ModelCheckpoint(
        'kera1-5fold-run-01-v1-fold-' + str('%02d' % (k + 1)) + '-run-' + str('%02d' % (1 + 1)) + '.check',
        monitor='val_loss', mode='min',  # mode is 'min' because a lower validation loss is better
        save_best_only=True,
        verbose=1)
]

As you can see in the preceding code block, the learning rate is halved (factor=0.50) if the validation loss hasn't improved in 3 epochs (patience=3). Similarly, training is stopped (via EarlyStopping) if the validation loss hasn't reduced in 10 epochs (patience=10). The model is saved whenever the validation log loss reduces, under a filename built as shown in the following code snippet:

'kera1-5fold-run-01-v1-fold-' + str('%02d' % (k + 1)) + '-run-' + str('%02d' % (1 + 1)) + '.check'
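For illustration, here is how that filename expression expands for the first fold; the value k = 0 is hypothetical and used only for this example:

```python
# k is the 0-based fold index from the cross-validation loop; k = 0 is
# the first fold (a hypothetical value, for illustration only)
k = 0
checkpoint_path = ('kera1-5fold-run-01-v1-fold-' + str('%02d' % (k + 1)) +
                   '-run-' + str('%02d' % (1 + 1)) + '.check')
print(checkpoint_path)  # kera1-5fold-run-01-v1-fold-01-run-02.check
```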

The validation log loss for each epoch of the training process is recorded in the keras-5fold-run-01-v1-epochs_ib.log log file; the same metric is what the callbacks monitor to decide when to save the model, reduce the learning rate, or stop the training.
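The log file written by CSVLogger is plain CSV: a header row with epoch and each tracked metric, then one row per epoch. A minimal sketch of inspecting it afterwards, using a small synthetic log (the metric values below are made up for illustration):

```python
import csv

# Synthetic stand-in for keras-5fold-run-01-v1-epochs_ib.log, in the
# CSVLogger layout: header row, then one row per epoch
with open('keras-5fold-run-01-v1-epochs_ib.log', 'w', newline='') as f:
    f.write('epoch,loss,val_loss\n'
            '0,0.92,0.85\n'
            '1,0.71,0.66\n'
            '2,0.63,0.69\n')

# Find the epoch with the lowest validation log loss -- the epoch at
# which ModelCheckpoint would last have saved the weights
with open('keras-5fold-run-01-v1-epochs_ib.log') as f:
    rows = list(csv.DictReader(f))
best = min(rows, key=lambda r: float(r['val_loss']))
print(best['epoch'], best['val_loss'])
```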

The model for each fold is saved to a user-defined path by using the Keras save function, while during inference, the models are loaded back into memory by using the keras.models.load_model function.
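A minimal sketch of that save/load round trip, using a tiny stand-in model and a hypothetical path (fold_model.keras); the real project builds one model per fold and saves it via the checkpoint callback shown earlier:

```python
import numpy as np
from tensorflow import keras

# A tiny stand-in model, just to demonstrate the round trip
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(4, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Save to a user-defined path, then reload for inference
model.save('fold_model.keras')  # hypothetical path, for this sketch only
restored = keras.models.load_model('fold_model.keras')

# The restored model reproduces the original model's predictions
x = np.random.rand(2, 8).astype('float32')
preds = restored.predict(x, verbose=0)
```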
