Defining the custom loss function
In the previous section, we used the predefined mean absolute error loss function to perform the optimization. In this section, we will define a custom loss function to perform the optimization.
The custom loss function that we will build is a modified mean squared error, where the error is the difference between the square root of the actual value and the square root of the predicted value.
The custom loss function is defined as follows:
import keras.backend as K

def loss_function(y_true, y_pred):
    # Squared difference between the square roots of the actual and predicted values;
    # Keras averages the per-sample values to obtain the final loss.
    return K.square(K.sqrt(y_pred) - K.sqrt(y_true))
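Before training, we can sanity-check the loss on small constant tensors (a minimal sketch; the sample values here are purely illustrative):

import keras.backend as K

y_true = K.constant([4.0, 9.0])
y_pred = K.constant([1.0, 16.0])
# (sqrt(1) - sqrt(4))^2 = 1 and (sqrt(16) - sqrt(9))^2 = 1
print(K.eval(loss_function(y_true, y_pred)))  # [1. 1.]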
Now that we have defined the loss function, we will be reusing the same input and output datasets that we prepared in the previous section, and we will also be using the same model that we defined earlier, as sketched below.
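For reference, a minimal sketch of the assumed setup from the previous section is shown here; the exact architecture and preprocessing may differ, and train_data2, test_data2, train_targets, and test_targets are simply the variable names used in the fit call that follows:

from keras.models import Sequential
from keras.layers import Dense

# Illustrative only: a small fully connected regression network on the scaled features
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=train_data2.shape[1]))
model.add(Dense(1))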
Now, let's compile the model:
model.compile(loss=loss_function, optimizer='adam')
In the preceding code, note that we pass the custom loss function we defined earlier, loss_function, as the loss argument.
history = model.fit(train_data2, train_targets, validation_data=(test_data2, test_targets), epochs=100, batch_size=32, verbose=1)
Once the model is fit, we will note that the mean absolute error on the test data is ~6.5 units, which is slightly lower than in the previous iteration, where we used the mean_absolute_error loss function.
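One way to verify this figure is to predict on the test set and compute the mean absolute error directly (a minimal sketch, assuming test_targets is a 1D array of target values):

import numpy as np

pred = model.predict(test_data2)
# Mean absolute error between the actual and predicted values
print(np.mean(np.abs(test_targets - pred.flatten())))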