Tanh
As we said, the logistic sigmoid can cause a neural network to get stuck: a very high or very low input saturates the function, so its gradient is very near zero. This means that gradient descent will barely update the weights and the model will not train.
The hyperbolic tangent, or tanh function, is an alternative to the sigmoid, and it still has a sigmoidal shape. The difference is that it outputs values between -1 and 1. Hence, a strongly negative input to tanh maps to a strongly negative output, and only a zero-valued input maps to a near-zero output. These properties make the network less likely to get stuck during training:

Hyperbolic tangent function
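
To see these properties numerically, here is a minimal NumPy sketch (an illustration, not code from the book) that evaluates both activations at a strongly negative, a zero, and a strongly positive input. The sigmoid collapses the negative input toward 0, while tanh keeps it strongly negative and maps only the zero input to zero:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

xs = np.array([-3.0, 0.0, 3.0])
print("sigmoid:", np.round(sigmoid(xs), 3))  # [0.047 0.5   0.953]
print("tanh:   ", np.round(np.tanh(xs), 3))  # [-0.995  0.     0.995]
```

Because tanh is zero-centered (it can be written as tanh(x) = 2·sigmoid(2x) − 1), its outputs average closer to zero than the sigmoid's strictly positive outputs, which tends to keep the gradient updates in the next layer better behaved.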