The hyperbolic tangent activation function
The output, y, of a hyperbolic tangent activation function (tanh) as a function of its total input, x, is given as follows:

$$y = \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
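As a quick sanity check, here is a minimal sketch (assuming NumPy is available; the name tanh_from_exp is illustrative, not from the book) that evaluates this expression directly and compares it against NumPy's built-in tanh:

```python
import numpy as np

def tanh_from_exp(x):
    """tanh computed directly from its exponential definition."""
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)
y = tanh_from_exp(x)

# The direct formula agrees with NumPy's built-in implementation,
# and every output lies strictly inside (-1, 1).
assert np.allclose(y, np.tanh(x))
assert np.all((y > -1.0) & (y < 1.0))
print(np.round(y, 4))
```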
The tanh activation function outputs values in the range (-1, 1), asymptotically approaching -1 and 1 as the magnitude of the input grows, as you can see in the following graph:

[Figure: graph of the tanh activation function, input x versus output y]
One thing to note is that both the sigmoid and the tanh activation functions are approximately linear within a small range of input around zero, beyond which the output saturates. In the saturation zone, the gradients of the activation functions (with respect to the input) are very small or close to zero, which makes these functions prone to the vanishing gradient problem. As you will see later on, neural networks learn through backpropagation, where the gradient at a layer depends on the gradients of the activation units in the succeeding layers, up to the final output layer. Therefore, if the activation units are operating in their saturation region, much less of the error is backpropagated to the early layers of the network. Neural networks learn their weights and biases by using these gradients to minimize the prediction error, so if the gradients are small or vanish to zero, the network will fail to learn the weights properly.
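To make the saturation argument concrete, the following sketch (again assuming NumPy; not code from the book) evaluates tanh and its derivative, d/dx tanh(x) = 1 - tanh²(x), at inputs of increasing magnitude:

```python
import numpy as np

def tanh_grad(x):
    """Derivative of tanh: d/dx tanh(x) = 1 - tanh(x)**2."""
    return 1.0 - np.tanh(x) ** 2

# Gradient magnitude shrinks rapidly once the input leaves the
# near-linear region around zero and enters the saturation zone.
for x in [0.0, 0.5, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}   tanh(x) = {np.tanh(x):+.6f}   gradient = {tanh_grad(x):.2e}")
```

Near x = 0 the gradient is close to 1 (the roughly linear region), but by x = 5 it has fallen to about 1.8e-04, and by x = 10 to about 8.2e-09. A unit operating that deep in the saturation zone passes essentially no error signal back to earlier layers during backpropagation.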