
Sigmoid or logistic function

A sigmoid function has a distinctive S shape, and it is a differentiable real function defined for any real input value. Its range is between 0 and 1. It is an activation function of the following form:

sigmoid(x) = 1 / (1 + e^(-x))

Its first derivative, which is used during the backpropagation step of training, has the following form:

sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))

The implementation is as follows:

def sigmoid(x):
    # 1 / (1 + e^(-x)); tf.divide and tf.negative replace the
    # deprecated tf.div and tf.neg
    return tf.divide(tf.constant(1.0),
                     tf.add(tf.constant(1.0), tf.exp(tf.negative(x))))
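As a quick sanity check, the same formula can be evaluated in plain Python (so it runs without TensorFlow); this is a sketch equivalent to the TensorFlow function above, not part of the original text:

```python
import math

def sigmoid(x):
    # Plain-Python equivalent of the TensorFlow sigmoid above:
    # 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))   # 0.5: the curve crosses its midpoint at x = 0
print(sigmoid(6.0))   # close to 1 for large positive inputs
print(sigmoid(-6.0))  # close to 0 for large negative inputs
```

Note how the outputs stay strictly inside (0, 1), which is why the sigmoid is often read as a probability.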

The derivative of a sigmoid function is as follows:

def sigmoidprime(x):
    # sigmoid(x) * (1 - sigmoid(x))
    return tf.multiply(sigmoid(x), tf.subtract(tf.constant(1.0), sigmoid(x)))
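The analytic derivative can be verified against a numerical gradient. The sketch below uses plain Python and a central finite difference (`numeric_derivative` is a helper introduced here for illustration, not from the original text):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoidprime(x):
    # Analytic derivative: sigmoid(x) * (1 - sigmoid(x))
    return sigmoid(x) * (1.0 - sigmoid(x))

def numeric_derivative(f, x, h=1e-5):
    # Central finite difference, for comparison with the analytic form
    return (f(x + h) - f(x - h)) / (2.0 * h)

for x in (-2.0, 0.0, 3.0):
    analytic = sigmoidprime(x)
    numeric = numeric_derivative(sigmoid, x)
    assert abs(analytic - numeric) < 1e-8
```

This kind of gradient check is a standard way to catch mistakes in hand-derived backpropagation formulas.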

However, a sigmoid function can cause the vanishing gradient problem, also known as saturation of the gradient, and it is known to converge slowly. Therefore, in practice a sigmoid is not recommended as the activation function; ReLU has become more popular.
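The vanishing gradient problem can be seen directly from the derivative: it peaks at 0.25 and decays toward zero for large |x|, so the factors multiplied together during backpropagation shrink quickly. A small plain-Python illustration (an assumption-free consequence of the derivative formula above):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoidprime(x):
    return sigmoid(x) * (1.0 - sigmoid(x))

# The derivative is at most 0.25 (attained at x = 0) and is tiny
# for saturated units with large |x|.
print(sigmoidprime(0.0))   # 0.25, the maximum
print(sigmoidprime(10.0))  # a saturated unit passes almost no gradient

# Backpropagating through n sigmoid layers multiplies these factors,
# so even the best case shrinks as 0.25 ** n.
grad = 1.0
for _ in range(10):
    grad *= sigmoidprime(0.0)
print(grad)  # 0.25 ** 10, already below 1e-6
```

ReLU avoids this in the positive regime, where its derivative is exactly 1 regardless of depth.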
