Keras 2.x Projects
Giuseppe Ciaburro
Hyperbolic tangent
Another very popular and widely used activation function is the tanh function. If you look at the figure that follows, you can see that it looks very similar to the sigmoid; in fact, it is a scaled sigmoid function. It is a nonlinear function whose output is bounded in the range (-1, 1), so you need not worry about activations blowing up. One thing to note is that the gradient is stronger for tanh than for the sigmoid (its derivatives are steeper). The function is defined by the following formula:

$\tanh(x) = \dfrac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$
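As a quick check of this relationship, the short NumPy sketch below (an illustrative addition, not code from the book) evaluates tanh both from the definition above and via the scaled-sigmoid identity tanh(x) = 2·sigmoid(2x) − 1:

```python
import numpy as np

x = np.linspace(-4.0, 4.0, 9)

# Direct definition: tanh(x) = (e^x - e^-x) / (e^x + e^-x)
tanh_direct = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

# Scaled-sigmoid form: tanh(x) = 2 * sigmoid(2x) - 1
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
tanh_scaled = 2.0 * sigmoid(2.0 * x) - 1.0

print(np.allclose(tanh_direct, np.tanh(x)))   # True
print(np.allclose(tanh_scaled, np.tanh(x)))   # True
```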
The following figure shows a hyperbolic tangent activation function:

Deciding between sigmoid and tanh will depend on your requirement for gradient strength. Like the sigmoid, tanh also suffers from the vanishing gradient problem.
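In Keras, tanh can be selected by passing activation='tanh' to a layer. The following minimal sketch (the layer sizes and the 8-feature input shape are illustrative assumptions, not values from the book) builds a small model with a tanh hidden layer:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small illustrative model: one tanh hidden layer, sigmoid output.
# Layer sizes and the 8-feature input shape are arbitrary assumptions.
model = keras.Sequential([
    layers.Dense(16, activation='tanh', input_shape=(8,)),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
```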