Machine Learning Projects for Mobile Applications
Karthikeyan NG
Rectified linear units
The logic behind a rectified linear units (ReLUs) layer is very simple: it replaces every negative value in its input with 0. This keeps the CNN mathematically well behaved, since no negative activations are passed on to later layers.
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, and only the negative values are replaced by 0.
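As a minimal sketch of this behavior (the `relu` helper below is hypothetical, written with NumPy rather than any specific mobile framework), notice how negatives become 0 while the shape stays unchanged:

```python
import numpy as np

def relu(feature_map: np.ndarray) -> np.ndarray:
    # Element-wise max against 0: negatives -> 0, positives pass through.
    return np.maximum(feature_map, 0)

# A small 2x3 "feature map" containing some negative activations.
x = np.array([[ 1.5, -2.0,  0.3],
              [-0.7,  4.0, -1.1]])
y = relu(x)

print(y)
# [[1.5 0.  0.3]
#  [0.  4.  0. ]]
print(x.shape == y.shape)  # True: the layer does not alter the size
```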