- Deep Learning Essentials
- Wei Di Anurag Bhardwaj Jianing Wei
Sigmoid or logistic function
A sigmoid function has a distinctive S shape, and it is a differentiable real function for any real input value. Its output range is between 0 and 1. As an activation function, it takes the following form:

σ(x) = 1 / (1 + e^(-x))
Its first derivative, which is used during the backpropagation step of training, has the following form:

σ'(x) = σ(x) · (1 − σ(x))
The implementation in TensorFlow is as follows (note that tf.neg and tf.div from older TensorFlow releases were renamed tf.negative and tf.divide):

import tensorflow as tf

def sigmoid(x):
    return tf.divide(tf.constant(1.0),
                     tf.add(tf.constant(1.0), tf.exp(tf.negative(x))))
The derivative of the sigmoid function is implemented as follows:

def sigmoidprime(x):
    return tf.multiply(sigmoid(x),
                       tf.subtract(tf.constant(1.0), sigmoid(x)))
However, the sigmoid function can cause the vanishing gradient problem (also called gradient saturation), and it is known for slow convergence. For these reasons, the sigmoid is not recommended as an activation function in practice; ReLU has become more popular.
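The saturation behavior can be seen numerically. The following sketch uses plain Python rather than the TensorFlow helpers above, so it can be run standalone; the function names mirror the ones in this section:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# The derivative peaks at 0.25 when x = 0 and decays toward 0
# for large |x| -- this near-zero gradient is what "saturation" means.
print(sigmoid_prime(0.0))   # 0.25
print(sigmoid_prime(10.0))  # roughly 4.5e-05
```

Because the maximum gradient is only 0.25, repeatedly multiplying such factors through many layers during backpropagation shrinks the gradient exponentially, which is why deep networks trained with sigmoid activations converge slowly.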