Mobile Artificial Intelligence Projects
Karthikeyan NG, Arun Padmanabhan, Matt R. Cole
Activation functions
We now know that an ANN is created by stacking individual computing units called perceptrons. We have also seen how a perceptron works and have summarized it as: Output = 1 IF (w·x + b) > 0, otherwise Output = 0.
That is, it either outputs a 1 or a 0 depending on the values of the weight, w, and bias, b.
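To make this concrete, here is a minimal sketch of that hard-threshold rule (this is not the book's code; the weights and bias values are chosen purely for illustration):

```python
import numpy as np

def perceptron(x, w, b):
    """A single perceptron with a hard step output: 1 if w.x + b > 0, else 0."""
    z = np.dot(w, x) + b
    return 1 if z > 0 else 0

# Illustrative weights and bias for a perceptron with two inputs
w = np.array([0.6, -0.4])
b = -0.1
print(perceptron(np.array([1.0, 0.5]), w, b))  # z = 0.3  -> outputs 1
print(perceptron(np.array([0.2, 0.9]), w, b))  # z = -0.34 -> outputs 0
```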
Let's look at the following diagram to understand why there is a problem with just outputting either a 1 or a 0. The following is a diagram of a simple perceptron with just a single input, x:
[Figure: a simple perceptron with a single input, x]
For simplicity, let's call z = wx + b, where the following applies:
- w is the weight of the input, x, and b is the bias
- a is the output, which is either 1 or 0
Here, as the value of z changes, at some point (when z crosses 0) the output, a, flips from 0 to 1. As you can see, the change in the output, a, is sudden and drastic:
[Figure: the output, a, jumps abruptly from 0 to 1 as z crosses the threshold]
What this means is that for some small change, Δz, we get a dramatic change in the output, a. This is not particularly helpful if the perceptron is part of a network, because if every perceptron changes its output this drastically, the network becomes unstable and therefore fails to learn.
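A quick numeric sketch (the z values here are illustrative, not from the book) makes the problem visible: nudging z across zero by a tiny amount flips the output all the way from 0 to 1:

```python
def step(z):
    """Hard threshold: the perceptron's original output rule."""
    return 1 if z > 0 else 0

# A tiny change in z around the threshold flips the output completely
for z in [-0.01, -0.001, 0.001, 0.01]:
    print(f"z = {z:+.3f} -> a = {step(z)}")
# z = -0.001 gives a = 0, but z = +0.001 gives a = 1: a drastic jump for a tiny delta
```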
Therefore, to make the network more efficient and stable, we need to slow down the rate at which each perceptron learns. In other words, we need to smooth this sudden jump in the output from 0 to 1 into a more gradual transition:
[Figure: a gradual, smooth transition of the output from 0 to 1]
This is made possible by activation functions. An activation function is applied to a perceptron so that, instead of outputting just a 0 or a 1, it outputs any value between 0 and 1.
This means that each neuron can learn more slowly and at a greater level of detail, using smaller changes, Δa. Activation functions can be viewed as transformation functions that transform the binary output into a range of smaller values between a given minimum and maximum.
There are a number of ways to transform the binary outcome into such a range of values, namely the sigmoid function, the tanh function, and the ReLU function. We will take a quick look at each of these activation functions now.
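As a preview, here is a minimal NumPy sketch of the three functions (my own illustrative definitions, not the book's code), showing how each maps the same inputs to gradual outputs instead of a hard 0/1 jump:

```python
import numpy as np

def sigmoid(z):
    """Squashes z into (0, 1): sigmoid(z) = 1 / (1 + e^-z)."""
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    """Squashes z into (-1, 1)."""
    return np.tanh(z)

def relu(z):
    """Outputs 0 for negative z, and z itself otherwise."""
    return np.maximum(0.0, z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))  # gradual values between 0 and 1, no sudden jump
print(tanh(z))     # gradual values between -1 and 1
print(relu(z))     # 0 below zero, then grows linearly
```

Note that sigmoid keeps the output between 0 and 1, while tanh and ReLU use different minimum and maximum ranges, in line with the "given minimum and maximum" mentioned above.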