- Deep Learning with Keras
- Antonio Gulli, Sujit Pal
Activation function — ReLU
The sigmoid is not the only smooth activation function used in neural networks. Recently, a very simple function called the rectified linear unit (ReLU) has become very popular because it produces very good experimental results. A ReLU is simply defined as f(x) = max(0, x), and the nonlinear function is plotted in the following graph: it is zero for negative values and grows linearly for positive values.
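As a quick illustration (a minimal NumPy sketch, not taken from the book), ReLU can be written directly from its definition:

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

# Sample points showing the shape of the function
print(relu(np.array([-2.0, -0.5, 0.0, 0.5, 2.0])))
# [0.  0.  0.  0.5 2. ]
```

In Keras, the same nonlinearity is typically applied by passing `activation='relu'` to a layer or by adding an `Activation('relu')` layer.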