- Machine Learning Projects for Mobile Applications
- Karthikeyan NG
Rectified linear units
The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0. This helps keep the CNN mathematically well behaved by removing negative activations:
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, with only the negative values replaced by 0.
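The behavior described above can be sketched in a few lines of plain Python (a minimal illustration, not the implementation used in any particular framework), applying ReLU element-wise to a 2D feature map:

```python
# A minimal sketch of ReLU applied element-wise to a 2D feature map:
# each negative value becomes 0, positive values pass through
# unchanged, and the output has the same dimensions as the input.
def relu(feature_map):
    return [[max(0.0, value) for value in row] for row in feature_map]

feature_map = [[-2.0, 1.5],
               [ 3.0, -0.5]]

print(relu(feature_map))  # → [[0.0, 1.5], [3.0, 0.0]]
```

Note that the output is a 2x2 grid, just like the input: only the values change, never the shape.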