
Tanh

As we said, the logistic sigmoid can cause a neural network to get stuck during training: for strongly positive or negative inputs, the sigmoid saturates near 1 or 0, and its gradient becomes very close to zero. With a near-zero gradient, gradient descent barely updates the weights, so the model stops learning.

The hyperbolic tangent, or the tanh function, is an alternative to the sigmoid, and it still has a sigmoidal shape. The difference is that it outputs a value between -1 and 1. Hence, strongly negative input to the tanh function maps to strongly negative output, and only zero-valued input maps to near-zero output. Because its outputs are zero-centered, these properties make the network less likely to get stuck during training:

Hyperbolic tangent function
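To make the comparison concrete, here is a minimal NumPy sketch (the function names are our own, not from a particular library) that evaluates tanh and the logistic sigmoid on the same inputs, along with their gradients:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

x = np.array([-5.0, 0.0, 5.0])

# tanh maps strongly negative input to output near -1,
# and only zero input to zero output; sigmoid maps it near 0.
print(np.tanh(x))
print(sigmoid(x))

# Both gradients vanish for large-magnitude inputs,
# but tanh's gradient peaks at 1.0 (at x = 0) versus sigmoid's 0.25.
print(tanh_grad(x))
print(sigmoid_grad(x))
```

Note that tanh still saturates for large-magnitude inputs; its advantage over the sigmoid is the zero-centered output and the steeper gradient around zero.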