
ReLU function 

Then there is an activation function called the Rectified Linear Unit, ReLU(z), which transforms any value, z, into either 0 or a value above 0. In other words, it outputs 0 for any value below 0 and outputs any value above 0 unchanged:

ReLU(z) = max(0, z)
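
As a minimal sketch of this behavior (assuming a NumPy environment; the function name and sample values here are illustrative choices, not from the book's code):

```python
import numpy as np

def relu(z):
    # Rectified Linear Unit: 0 for negative inputs, the input itself otherwise.
    return np.maximum(0, z)

# Negative values are clipped to 0; positive values pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# Output: [0.  0.  0.  1.5 3. ]
```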

Just to summarize our understanding so far: the perceptron is the traditional, now outdated, neuron that is rarely used in real implementations. Perceptrons are great for getting a simple understanding of the underlying principle; however, they had the problem that learning was too abrupt, owing to the drastic changes in their output values.

We use activation functions to temper this learning speed and detect finer changes in z or, equivalently, in the weighted input w · x + b. Let's sum up these activation functions:

  • The sigmoid neuron is the neuron that uses the sigmoid activation function to transform the output to a value between 0 and 1.
  • The tanh neuron is the neuron that uses the tanh activation function to transform the output to a value between -1 and 1.
  • The ReLU neuron is the neuron that uses the ReLU activation function to output either 0 (for negative inputs) or the input value itself (for positive inputs); a sketch of all three neurons follows this list.
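
To make the summary concrete, here is a small sketch of all three activations applied to a neuron's weighted input, z = w · x + b, assuming NumPy; the weights, bias, and input below are our own illustrative values, not from the book:

```python
import numpy as np

def sigmoid(z):
    # Squashes z into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes z into the range (-1, 1).
    return np.tanh(z)

def relu(z):
    # Outputs 0 for negative z, and z itself otherwise.
    return np.maximum(0, z)

# A single neuron computes z = w . x + b, then applies an activation function.
w = np.array([0.4, -0.6, 0.9])   # illustrative weights
x = np.array([1.0, 2.0, 3.0])    # illustrative input
b = 0.1                          # illustrative bias

z = np.dot(w, x) + b             # weighted input, ≈ 2.0 here
print("sigmoid :", sigmoid(z))   # ≈ 0.88
print("tanh    :", tanh(z))      # ≈ 0.96
print("relu    :", relu(z))      # ≈ 2.0
```

Note how sigmoid and tanh squash the same z into their bounded ranges, while ReLU passes the positive value through unchanged.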

The sigmoid function is used in practice, but it is slow compared to the tanh and ReLU functions. The tanh and ReLU functions are the most commonly used activation functions; the ReLU function in particular is considered state of the art and is usually the first choice of activation function when building ANNs.

Here is a list of commonly used activation functions:

  • Sigmoid: sigmoid(z) = 1 / (1 + e^(-z)), with outputs in the range (0, 1)
  • Tanh: tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z)), with outputs in the range (-1, 1)
  • ReLU: ReLU(z) = max(0, z), with outputs in the range [0, ∞)

In the projects within this book, we will primarily use sigmoid, tanh, or ReLU neurons to build our ANNs.
