- Hands-On Natural Language Processing with Python
- Rajesh Arumugam, Rajalingappaa Shanmugamani
Rectified linear unit
ReLU caps negative values at zero; for positive inputs, its output is equal to the input. It has a constant gradient for positive values and a zero gradient for negative values. The following is a graph of ReLU:
[Graph of the ReLU activation function]
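In formula form, the function the graph depicts and its gradient can be written as follows (this is the standard definition of ReLU, not notation taken from the book):

```latex
\[
\mathrm{ReLU}(x) = \max(0, x),
\qquad
\frac{d}{dx}\,\mathrm{ReLU}(x) =
\begin{cases}
1 & \text{if } x > 0,\\
0 & \text{if } x < 0.
\end{cases}
\]
```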
As shown, ReLU doesn't fire at all for negative values. The computational complexity of this activation function is lower than that of the functions described previously; hence, prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.
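Before moving on, here is a minimal NumPy sketch of ReLU and its gradient, illustrating the behaviour described above (an illustrative sketch, not code from the book):

```python
import numpy as np

def relu(x):
    # Negative values are capped at zero; positive values pass through unchanged.
    return np.maximum(0, x)

def relu_grad(x):
    # The gradient is 1 for positive inputs and 0 for negative inputs.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # negatives become 0, positives are unchanged
print(relu_grad(x))  # 0 for negative inputs, 1 for positive inputs
```

Because the forward pass is a single elementwise comparison and the gradient is just an indicator, both are cheaper to compute than the exponentials required by sigmoid or tanh, which is the source of the speed advantage mentioned above.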