- Deep Learning with PyTorch
- Vishnu Subramanian
Sigmoid
The sigmoid activation function has a simple mathematical form, as follows:

sigmoid(x) = 1 / (1 + e^(-x))
Intuitively, the sigmoid function takes a real-valued number and squashes it into the range between zero and one: for a large negative input it returns a value close to zero, and for a large positive input it returns a value close to one. Plotting the function over a range of inputs produces its characteristic S-shaped curve.
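As a quick sketch of this behavior (using plain Python's `math.exp` rather than PyTorch, purely for illustration), the squashing effect is easy to verify at the extremes:

```python
import math

def sigmoid(x):
    """Squash a real-valued number into the (0, 1) range."""
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(-10.0))  # a large negative input maps close to zero
print(sigmoid(0.0))    # zero maps to exactly 0.5
print(sigmoid(10.0))   # a large positive input maps close to one
```

In PyTorch itself, the same function is available as `torch.sigmoid` or as the `torch.nn.Sigmoid` layer.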

The sigmoid function has historically been used across many architectures, but it has recently fallen out of favor because of one major drawback. When the output of the sigmoid saturates close to zero or one, its gradient is close to zero, so the learnable parameters of the preceding layers receive near-zero gradients and their weights are rarely updated, resulting in dead neurons.
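This saturation effect can be made concrete with the sigmoid's derivative, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A minimal sketch in plain Python (names and values here are illustrative, not from the book):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: sigma(x) * (1 - sigma(x)).
    s = sigmoid(x)
    return s * (1.0 - s)

# The gradient peaks at 0.25 when x = 0 and vanishes at the extremes,
# which is why saturated sigmoid units pass almost no gradient backward.
for x in (-10.0, 0.0, 10.0):
    print(f"x = {x:6.1f}  grad = {sigmoid_grad(x):.6f}")
```

Because the maximum gradient is only 0.25, stacking many sigmoid layers also shrinks the backpropagated signal multiplicatively, which compounds the problem in deep networks.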