- Deep Learning with Keras
- Antonio Gulli, Sujit Pal
Activation function — ReLU
The sigmoid is not the only smooth activation function used in neural networks. Recently, a very simple function called the rectified linear unit (ReLU) has become very popular because it produces very good experimental results. A ReLU is simply defined as f(x) = max(0, x); as the following graph shows, the function is zero for negative values and grows linearly for positive values:
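A minimal sketch of this definition, assuming a NumPy environment (the sample inputs are illustrative, not from the book):

```python
import numpy as np

# ReLU as defined above: zero for negative inputs, identity for positive inputs.
def relu(x):
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```

In Keras the same nonlinearity is requested by name when building a layer, for example Dense(64, activation='relu'); the unit count 64 here is an arbitrary example.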