- Neural Networks with Keras Cookbook
- V Kishore Ayyadevara
Varying the learning rate to improve network accuracy
In the previous recipes, we used the Adam optimizer's default learning rate, which is 0.001.
In this section, we will manually set the learning rate to a higher value and examine the impact of this change on model accuracy, reusing the same scaled MNIST training and test datasets from the previous recipes.
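Setting an explicit learning rate amounts to passing an optimizer instance to `model.compile` instead of the string `"adam"`. The following is a minimal sketch assuming the TensorFlow 2.x Keras API; the layer sizes and the value `0.01` are illustrative choices, not the book's exact architecture.

```python
from tensorflow import keras

def build_model(learning_rate=0.01):
    # Simple fully connected network for flattened 28x28 MNIST images.
    model = keras.Sequential([
        keras.layers.Input(shape=(784,)),
        keras.layers.Dense(512, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    # Pass the learning rate explicitly instead of relying on the
    # Adam default of 0.001.
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# A higher-than-default learning rate, as explored in this recipe.
model = build_model(learning_rate=0.01)
```

Training then proceeds exactly as before (`model.fit(...)` on the scaled MNIST data); only the optimizer's step size differs.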