- Deep Learning Quick Reference
- Mike Bernico
The Adam optimizer
Adam is one of the best-performing optimizers known, and it's my first choice. It works well across a wide variety of problems. It combines the best parts of both momentum and RMSProp into a single update rule:
$$m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2$$

$$\hat{m}_t = \frac{m_t}{1 - \beta_1^t} \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}$$

$$\theta_t = \theta_{t-1} - \frac{\eta \, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}$$

Where $\epsilon$ is some very small number to prevent division by 0.
Adam is often a great choice, and it's a great place to start when you're prototyping, so save yourself some time by starting with Adam.
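To make the update rule concrete, here is a minimal NumPy sketch of a single Adam step (the function name `adam_step` and the toy quadratic objective are illustrative, not from the book; the default hyperparameters match the commonly published values):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum-style first moment plus RMSProp-style second moment."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum part)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSProp part)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # eps prevents division by 0
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
```

In Keras, the equivalent is simply passing `optimizer='adam'` (or `keras.optimizers.Adam()`) to `model.compile`.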