Activation functions
We now know that an ANN is created by stacking individual computing units called perceptrons. We have also seen how a perceptron works and have summarized it as: Output = 1, IF w·x + b > 0; otherwise, Output = 0.
That is, it either outputs a 1 or a 0 depending on the values of the weight, w, and bias, b.
Let's look at the following diagram to understand why there is a problem with just outputting either a 1 or a 0. The following is a diagram of a simple perceptron with just a single input, x:

For simplicity, let's call z = wx + b, where the following applies:
- w is the weight of the input, x, and b is the bias
- a is the output, which is either 1 or 0
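To make this concrete, the following is a minimal Python sketch of such a single-input perceptron; the particular values of w and b are illustrative assumptions, not values from the text:

```python
# A minimal single-input perceptron; w and b are illustrative assumptions.
def perceptron(x, w=0.5, b=-0.2):
    z = w * x + b               # z = wx + b
    return 1 if z > 0 else 0    # a: a hard 0 or 1 output

print(perceptron(0.3))  # z = -0.05 -> a = 0
print(perceptron(0.5))  # z = +0.05 -> a = 1
```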
Here, as the value of z changes, at some point, the output, a, changes from 0 to 1. As you can see, the change in output a is sudden and drastic:

What this means is that for some small change, Δz, we get a dramatic change in the output, a. This is not particularly helpful if the perceptron is part of a network, because if each perceptron exhibits such a drastic change, the network becomes unstable and hence fails to learn.
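A quick numeric sketch makes this visible: with a hard threshold at z = 0, an arbitrarily small Δz is enough to flip the output completely (the step function below mirrors the rule stated earlier):

```python
def step(z):
    return 1 if z > 0 else 0

# A delta-z of just 2e-6 around the threshold flips a from 0 to 1.
print(step(-1e-6))  # a = 0
print(step(+1e-6))  # a = 1
```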
Therefore, to make the network more efficient and stable, we need to slow down the way each perceptron learns. In other words, we need to replace this sudden jump in output from 0 to 1 with a more gradual change:

This is made possible by activation functions. An activation function is applied to a perceptron so that, instead of outputting a 0 or a 1, it outputs any value between 0 and 1.
This means that each neuron can learn more slowly and at a greater level of detail by using smaller changes, Δz. Activation functions can be looked at as transformation functions that map a binary outcome onto a continuous range of values between a given minimum and maximum.
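As a sketch of this gradual behavior, the sigmoid function (covered shortly) turns the same tiny Δz from the earlier example into a correspondingly tiny Δa instead of a jump:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# The same tiny delta-z now produces a tiny delta-a.
print(sigmoid(-1e-6))  # ~0.49999975
print(sigmoid(+1e-6))  # ~0.50000025
```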
There are a number of ways to transform the binary outcome into a range of values, namely the sigmoid function, the tanh function, and the ReLU function. We will have a quick look at each of these activation functions now.
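For reference before the detailed discussion, here is a minimal NumPy sketch of all three functions; the sample input values are arbitrary assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes z into (0, 1)

def tanh(z):
    return np.tanh(z)                 # squashes z into (-1, 1)

def relu(z):
    return np.maximum(0.0, z)         # 0 for z < 0, z itself otherwise

z = np.array([-2.0, 0.0, 2.0])        # arbitrary sample inputs
print(sigmoid(z))  # approx. [0.119 0.5   0.881]
print(tanh(z))     # approx. [-0.964  0.     0.964]
print(relu(z))     # [0. 0. 2.]
```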