- Deep Learning Essentials
- Wei Di Anurag Bhardwaj Jianing Wei
Activation functions
The activation function in each artificial neuron decides whether the incoming signal is strong enough to produce an output for the next layer. Choosing the right activation function is crucial because of the vanishing gradient problem, which we will discuss later.
Another important property of an activation function is that it should be differentiable. The network learns from the errors computed at the output layer, and backpropagation requires a differentiable activation function: as the error propagates backwards through the network, the gradients of the error (loss) with respect to the weights are computed via the chain rule, and the weights are then updated using gradient descent or another optimization technique to reduce the error.
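To make the role of the derivative concrete, here is a minimal sketch of one gradient-descent step for a single sigmoid neuron with squared-error loss. The function names and learning rate are illustrative assumptions, not code from the book:

```python
import math

def sigmoid(x):
    # Squashes the pre-activation into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative of the sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def update_weight(w, x, y, lr=0.1):
    """One gradient-descent step for a single sigmoid neuron
    with squared-error loss L = 0.5 * (y_hat - y)**2."""
    z = w * x           # pre-activation
    y_hat = sigmoid(z)  # neuron output
    # Chain rule: dL/dw = (y_hat - y) * sigmoid'(z) * x
    grad = (y_hat - y) * sigmoid_prime(z) * x
    return w - lr * grad

# With a target above the current output, the weight moves upward.
new_w = update_weight(w=0.5, x=1.0, y=1.0)
```

Notice that the update is only possible because `sigmoid_prime` exists everywhere; a non-differentiable activation would leave the chain rule with no gradient to propagate.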
The following table lists a few common activation functions. We will examine them in more depth, discuss the differences between them, and explain how to choose the right activation function:
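The table itself is not reproduced in this excerpt, but the most commonly listed activations can be sketched as follows. Sigmoid, tanh, and ReLU are typical entries (the exact set in the table is an assumption); each is shown with its derivative, since that is what backpropagation consumes:

```python
import math

def sigmoid(x):
    # Maps any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh(x):
    # Maps any real input into (-1, 1); zero-centered
    return math.tanh(x)

def tanh_prime(x):
    return 1.0 - math.tanh(x) ** 2

def relu(x):
    # Passes positive inputs through unchanged, zeros out negatives
    return max(0.0, x)

def relu_prime(x):
    # Conventionally 0 at x = 0
    return 1.0 if x > 0 else 0.0
```

The derivatives hint at the vanishing gradient issue mentioned earlier: `sigmoid_prime` peaks at only 0.25, so gradients shrink as they pass through many sigmoid layers, whereas `relu_prime` is exactly 1 for positive inputs.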
