- Neural Networks with R
- Giuseppe Ciaburro, Balaji Venkateswaran
Rectified Linear Unit
Rectified Linear Unit (ReLU) has been the most widely used activation function since 2015. It applies a simple threshold condition and has advantages over the other activation functions. The function is defined by the following formula:

f(x) = max(0, x)
The following figure shows the ReLU activation function:

[Figure: ReLU activation function, zero for negative inputs and linear for positive inputs]
The output range is 0 to infinity. ReLU is widely used in computer vision and speech recognition with deep neural networks. There are various other activation functions as well, but we have covered the most important ones here.
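As a quick illustration, here is a minimal sketch of ReLU in base R; the `relu` function name and the plotting code are our own and not from the original text:

```r
# Minimal sketch of the ReLU activation in base R.
# relu() is an illustrative helper, not a function from the book.
relu <- function(x) {
  pmax(0, x)  # element-wise maximum: 0 for negative inputs, x otherwise
}

# Plot the function over [-5, 5] to reproduce the shape described above.
x <- seq(-5, 5, by = 0.1)
plot(x, relu(x), type = "l",
     main = "ReLU activation function",
     xlab = "x", ylab = "f(x) = max(0, x)")
```

Because `pmax()` computes an element-wise maximum, `relu()` works on whole vectors at once, which matches how an activation is applied to all the outputs of a layer.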