
ReLU

ReLU is one of the most commonly used activation functions. It behaves like the identity function when the input is greater than 0; otherwise, its output is always 0. It is the analog of half-wave rectification in electrical engineering:

f(x) = max(0, x)

The ReLU function

The range of this function is from 0 to infinity. The issue is that negative inputs map to zero, so the derivative in that region is zero as well, which means the corresponding units receive no gradient during backpropagation. This is clearly an issue in principle, but in practical cases it rarely has a noticeable effect.
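To make this behavior concrete, the following is a minimal NumPy sketch of ReLU and its derivative; the function names relu and relu_grad are illustrative and not taken from any particular library:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: pass positive values through, clamp negatives to 0
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise,
    # so units with negative inputs receive no gradient
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```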

There are a few variants of ReLU; one of the most common is Leaky ReLU, which aims to allow a small positive gradient when the unit is not active. Its formula is as follows:

f(x) = x if x > 0, and f(x) = αx otherwise

Here, α is typically 0.01, as shown in the following diagram:

The Leaky ReLU function
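The following is a minimal NumPy sketch of Leaky ReLU and its derivative, assuming the α = 0.01 default mentioned above; the names leaky_relu and leaky_relu_grad are illustrative:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Element-wise Leaky ReLU: keep positive values, scale negatives by alpha
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Derivative: 1 for positive inputs, alpha otherwise,
    # so the gradient never becomes exactly zero
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(leaky_relu(x))       # [-0.02  -0.005  0.     0.5    2.   ]
print(leaky_relu_grad(x))  # [0.01 0.01 0.01 1.   1.  ]
```

Unlike plain ReLU, the gradient for negative inputs is α rather than zero, which is what keeps inactive units from stalling during training.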