- Applied Deep Learning and Computer Vision for Self-Driving Cars
- Sumit Ranjan; Dr. S. Senthamilarasu
Understanding activation functions
Activation functions are important to neural networks because they introduce non-linearity into the network. Deep learning consists of multiple non-linear transformations, and activation functions are the tools that perform them. An activation function is applied to a layer's output before the signal is sent to the next layer. It is thanks to activation functions that a neural network has the power to learn complex features.
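To see why non-linearity matters, consider the following small NumPy illustration (ours, not the book's): two stacked linear layers with no activation between them collapse into a single linear layer, so depth alone adds no expressive power:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)                # an input vector
W1 = rng.normal(size=(4, 3))          # first linear layer
W2 = rng.normal(size=(2, 4))          # second linear layer

deep = W2 @ (W1 @ x)                  # two linear layers, no activation...
shallow = (W2 @ W1) @ x               # ...are exactly one linear layer
print(np.allclose(deep, shallow))     # True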
Deep learning uses many activation functions, including the following:
The threshold function
The sigmoid function
The rectifier (ReLU) function
The hyperbolic tangent (tanh) function
In the next section, we will start with one of the most important activation functions, called the threshold activation function.
The threshold function
The threshold function can be seen in the following diagram:
On the x axis, we have the weighted sum of the inputs, and on the y axis, we have the output of the threshold function, which is either 0 or 1. The threshold function is very simple: if the weighted sum is less than 0, the output is 0, and if it is greater than or equal to 0, the output is 1. It works as a yes-or-no function.
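As a minimal sketch (not the book's code), the threshold function can be written in NumPy; the function name and the convention that an input of exactly 0 maps to 1 are our own assumptions:

import numpy as np

def threshold(x):
    # Step function: 1 when the weighted input sum is >= 0, otherwise 0.
    return np.where(x >= 0, 1.0, 0.0)

print(threshold(np.array([-2.0, 0.0, 3.5])))  # [0. 1. 1.]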
The sigmoid function
The sigmoid function is a very interesting type of function; we can see it in the following diagram:
The sigmoid function is nothing but the logistic function, sigmoid(x) = 1 / (1 + e^(-x)). It squashes any real-valued input into the range (0, 1). This function is often used in the output layer, especially when you are trying to predict probabilities.
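A minimal NumPy sketch of the sigmoid, assuming the standard logistic form given above (the function name is our own choice):

import numpy as np

def sigmoid(x):
    # Logistic function: maps any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # approximately [0.018, 0.5, 0.982]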
The rectified linear function
The Rectified Linear Unit (ReLU) function is one of the most popular activation functions in the field of ANNs. It is defined as f(x) = max(0, x): if the input is less than or equal to 0, the output is 0, and for positive inputs, the output grows linearly with the input. We can observe this in the following diagram:
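The following is a minimal NumPy sketch of ReLU, again with a function name of our own choosing:

import numpy as np

def relu(x):
    # max(0, x): zero for non-positive inputs, identity for positive ones.
    return np.maximum(0.0, x)

print(relu(np.array([-1.5, 0.0, 2.0])))  # [0. 0. 2.]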
In the next section, we will learn about the hyperbolic tangent activation function.
The hyperbolic tangent activation function
Finally, we have the hyperbolic tangent (tanh) activation function, which looks as follows:
The tanh function is very similar to the sigmoid function, but its range is (-1, 1). Tanh functions are also S-shaped, like sigmoid functions. The advantage of the tanh function is that strongly negative inputs are mapped to strongly negative outputs, strongly positive inputs are mapped to strongly positive outputs, and inputs near 0 are mapped to outputs near 0, as shown in Fig 2.16.
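A minimal sketch of tanh, written out from its definition tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)); in practice, NumPy's built-in np.tanh computes the same thing:

import numpy as np

def tanh(x):
    # (e^x - e^-x) / (e^x + e^-x); equivalent to np.tanh(x).
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(np.array([-2.0, 0.0, 2.0])))  # approximately [-0.964, 0.0, 0.964]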
In the next section of this chapter, we will learn about the cost function.