- Hands-On Deep Learning Architectures with Python
- Yuxi (Hayden) Liu Saransh Mehta
Sigmoid activation
The output range of this function is from zero to one for all real-number inputs, which makes it well suited to producing probabilistic scores from neurons. The function is also continuous and non-linear, which preserves the non-linearity of the outputs. The gradient of the curve is steep near the origin and saturates as we move away from it along the x-axis, so a small change in input around the origin produces a significant change in output. This characteristic aids classification, as it pushes the output toward either zero or one. Following is the equation for the sigmoid activation of an input x:

sigmoid(x) = 1 / (1 + e^(-x))
The following is a plot of the sigmoid activation function:
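A minimal NumPy sketch (not from the book; the function name `sigmoid` is our own) that computes the activation and illustrates the properties described above, namely outputs bounded in (0, 1) and saturation far from the origin:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: maps any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Sample inputs across the x-axis
x = np.linspace(-8.0, 8.0, 9)
y = sigmoid(x)

# The gradient sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) is largest
# at x = 0 (value 0.25) and shrinks toward zero as |x| grows,
# which is the saturation behaviour visible in the plot.
grad = y * (1.0 - y)
```

A plot like the one shown can be reproduced by passing `x` and `y` to `matplotlib.pyplot.plot`.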
