- Hands-On Natural Language Processing with Python
- Rajesh Arumugam, Rajalingappaa Shanmugamani
Rectified linear unit
ReLU clamps negative inputs to zero and passes positive inputs through unchanged, so ReLU(x) = max(0, x). It has a constant gradient of 1 for positive values and a zero gradient for negative values. The following is a graph of ReLU:
[Figure: plot of the ReLU activation function]
As shown, ReLU doesn't fire at all for negative values. This activation function is also computationally cheaper than the functions described previously, so prediction is faster.
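As a minimal sketch, assuming only NumPy is available, ReLU and its gradient can be written as follows (the function and variable names here are illustrative, not from the book):

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: clamps negative inputs to zero."""
    return np.maximum(0, x)

def relu_grad(x):
    """ReLU gradient: 1 for positive inputs, 0 otherwise."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0.  0.  0.  1.  1. ]
```

Both functions reduce to simple element-wise comparisons with no exponentials, which is why ReLU is cheap to evaluate. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.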