- Deep Learning Quick Reference
- Mike Bernico
The Adam optimizer
Adam is one of the best-performing optimizers available, and it's my first choice. It works well across a wide variety of problems. It combines the best parts of both momentum and RMSProp into a single update rule:

$$m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2$$

$$\hat{m}_t = \frac{m_t}{1 - \beta_1^t} \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}$$

$$w_t = w_{t-1} - \frac{\eta \, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}$$

where $\epsilon$ is some very small number to prevent division by zero.
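To make the update rule concrete, here is a minimal NumPy sketch of a single Adam step. The function name `adam_step` and its signature are illustrative, not from this book; the defaults for `lr`, `beta1`, `beta2`, and `eps` are the values recommended in the original Adam paper.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one Adam update to parameters w given their gradient grad."""
    m = beta1 * m + (1 - beta1) * grad         # momentum-style first moment
    v = beta2 * v + (1 - beta2) * grad ** 2    # RMSProp-style second moment
    m_hat = m / (1 - beta1 ** t)               # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # eps prevents division by zero
    return w, m, v

# Toy usage: minimize a quadratic loss around a target point
w = np.zeros(3)
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 101):
    grad = 2 * (w - np.array([1.0, -2.0, 3.0]))  # gradient of the toy loss
    w, m, v = adam_step(w, grad, m, v, t)
```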
Adam is often a great choice and a great place to start when you're prototyping, so save yourself some time by starting with it.
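In Keras that might look like the following sketch. It assumes the standalone Keras API; in newer `tf.keras` releases the argument is `learning_rate` rather than `lr`. The model itself is a throwaway example, and Keras's own defaults for Adam already match the values above, so passing `lr` explicitly is optional.

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

# A toy binary classifier; the point here is the optimizer argument.
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=20))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer=Adam(lr=0.001),
              loss='binary_crossentropy',
              metrics=['accuracy'])
```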