
  • Deep Learning Essentials
  • Wei Di Anurag Bhardwaj Jianing Wei

Sigmoid or logistic function

A sigmoid function has a distinctive S shape, and it is a differentiable real function for any real input value. Its range is between 0 and 1. It is an activation function of the following form:

f(x) = 1 / (1 + e^(-x))

Its first derivative, which is used during backpropagation of the training step, has the following form:

f'(x) = f(x) * (1 - f(x))

The implementation is as follows:

import tensorflow as tf

def sigmoid(x):
    return tf.divide(tf.constant(1.0),
                     tf.add(tf.constant(1.0), tf.exp(tf.negative(x))))

The derivative of the sigmoid function is implemented as follows:

def sigmoidprime(x):
    return tf.multiply(sigmoid(x), tf.subtract(tf.constant(1.0), sigmoid(x)))
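As a quick sanity check of the derivative formula, the analytic form can be compared against a central finite-difference approximation. This is a plain-Python sketch independent of TensorFlow; the helper `numeric_grad` is introduced here for illustration only:

```python
import math

def sigmoid(x):
    # Same function as above, in plain Python
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Analytic derivative: sigmoid(x) * (1 - sigmoid(x))
    return sigmoid(x) * (1.0 - sigmoid(x))

def numeric_grad(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

# The analytic and numeric derivatives agree to high precision
for x in [-2.0, 0.0, 3.0]:
    assert abs(sigmoid_prime(x) - numeric_grad(sigmoid, x)) < 1e-8
```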

However, the sigmoid function can cause the vanishing gradient problem, also called saturation of the gradient, and it is known to converge slowly. Therefore, a sigmoid is not recommended as the activation function in practice; ReLU has become more popular.
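The saturation can be seen directly from the derivative: its maximum value is only 0.25 (at x = 0), and it decays toward zero as |x| grows, so gradients shrink rapidly when multiplied through many sigmoid layers. A minimal plain-Python sketch:

```python
import math

def sigmoid_prime(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

# The gradient peaks at a mere 0.25 at x = 0 and collapses toward
# zero for large |x|; deep stacks of sigmoids multiply many such
# small factors together -- the vanishing gradient problem.
print(sigmoid_prime(0.0))   # 0.25
print(sigmoid_prime(10.0))  # roughly 4.5e-05
```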
