Neural Networks with R
Giuseppe Ciaburro, Balaji Venkateswaran
Rectified Linear Unit
The Rectified Linear Unit (ReLU) has been the most widely used activation function since around 2015. It is computationally simple and has advantages over the other functions. The function is defined by the following formula:

f(x) = max(0, x)
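As a quick illustration (not taken from the book's code), ReLU can be implemented in one line of base R using `pmax`; the `relu` name here is just a hypothetical helper:

```r
# ReLU: element-wise max(0, x); returns x for positive inputs, 0 otherwise
relu <- function(x) pmax(0, x)

relu(c(-2, -0.5, 0, 1.5, 3))
# [1] 0.0 0.0 0.0 1.5 3.0
```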
The following figure shows the ReLU activation function:

[Figure: the ReLU activation function, equal to 0 for x < 0 and rising linearly with slope 1 for x > 0]
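A figure like this can be reproduced with base R graphics; the following is a minimal sketch, assuming nothing beyond base R:

```r
# Plot ReLU over [-4, 4]: flat at 0 for x < 0, slope 1 for x > 0
x <- seq(-4, 4, by = 0.1)
plot(x, pmax(0, x), type = "l", lwd = 2,
     xlab = "x", ylab = "f(x)", main = "ReLU activation function")
```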
The output range is from 0 to infinity. ReLU is widely applied in computer vision and speech recognition with deep neural networks. There are various other activation functions as well, but we have covered the most important ones here.