- Neural Networks with Keras Cookbook
- V Kishore Ayyadevara
Varying the learning rate to improve network accuracy
So far, in the previous recipes, we used the default learning rate of the Adam optimizer in Keras, which is 0.001.
In this section, we will manually set the learning rate to a higher value and observe the impact of changing the learning rate on model accuracy, reusing the same scaled MNIST training and test datasets from the previous recipes.
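The following is a minimal sketch of how a custom learning rate can be passed to the Adam optimizer in Keras. It assumes the same flattened, scaled MNIST setup as in the earlier recipes; the network architecture (a single 1000-unit hidden layer) and the learning rate of 0.01 are illustrative assumptions rather than the exact values used in this recipe.

```python
# Sketch: specifying an explicit learning rate for Adam in Keras.
# The architecture and the 0.01 learning rate are assumptions for illustration.
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.utils import to_categorical

# Load MNIST, flatten the images, and scale pixel values to [0, 1],
# as done in the previous recipes
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 784).astype('float32') / 255.0
x_test = x_test.reshape(-1, 784).astype('float32') / 255.0
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)

# A simple fully connected network (assumed architecture)
model = Sequential([
    Dense(1000, activation='relu', input_shape=(784,)),
    Dense(10, activation='softmax'),
])

# Instantiate Adam with a higher learning rate instead of the default 0.001
adam = Adam(learning_rate=0.01)
model.compile(optimizer=adam,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

history = model.fit(x_train, y_train,
                    validation_data=(x_test, y_test),
                    epochs=10, batch_size=32, verbose=1)
```

Comparing the validation accuracy curve from this run against the earlier run with the default learning rate shows how a larger step size affects convergence.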