
Sigmoid or logistic function

A sigmoid function has a distinctive S shape, and it is a differentiable real function for any real input value. Its range is between 0 and 1. It is an activation function of the following form:

sigmoid(x) = 1 / (1 + e^(-x))

Its first derivative, which is used during backpropagation in the training step, has the following form:

sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))

The implementation is as follows:

def sigmoid(x):
    # 1 / (1 + exp(-x)); tf.divide and tf.negative are the TF 1.x+ names
    # for the older tf.div and tf.neg
    return tf.divide(tf.constant(1.0),
                     tf.add(tf.constant(1.0), tf.exp(tf.negative(x))))

The derivative of a sigmoid function is as follows:

def sigmoidprime(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    return tf.multiply(sigmoid(x), tf.subtract(tf.constant(1.0), sigmoid(x)))
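As a quick sanity check, the analytic derivative can be compared against a finite-difference approximation. This is a minimal pure-Python sketch using the math module instead of TensorFlow, so it runs standalone; the function names mirror the TensorFlow versions above:

```python
import math

def sigmoid(x):
    # Pure-Python sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Analytic derivative: sigmoid(x) * (1 - sigmoid(x))
    return sigmoid(x) * (1.0 - sigmoid(x))

# sigmoid(0) = 0.5, so the derivative there is 0.5 * (1 - 0.5) = 0.25
print(sigmoid(0.0))        # 0.5
print(sigmoid_prime(0.0))  # 0.25

# Compare with a central finite difference at an arbitrary point
h = 1e-6
x = 0.7
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(numeric - sigmoid_prime(x)) < 1e-8)  # True
```

The close agreement between the numeric and analytic values confirms the derivative identity used in backpropagation.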

However, the sigmoid function suffers from the vanishing gradient problem: its gradient saturates, approaching zero for inputs of large magnitude. It is also known for slow convergence. Therefore, in practice, the sigmoid is not recommended as an activation function; ReLU has become more popular.
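The saturation can be seen numerically. In this pure-Python sketch (again using the math module so it runs without TensorFlow), the gradient peaks at x = 0 and collapses as the input moves away from zero:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    return sigmoid(x) * (1.0 - sigmoid(x))

# The gradient is largest at x = 0 and shrinks rapidly as |x| grows
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}  sigmoid'(x) = {sigmoid_prime(x):.2e}")

# At x = 10 the gradient is roughly 4.5e-05: effectively zero, so
# almost no error signal flows back through a saturated sigmoid unit.
```

Because these near-zero gradients are multiplied layer by layer during backpropagation, deep stacks of sigmoid units learn very slowly, which is what motivates ReLU.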
