
ReLU activation

The rectified linear unit, more commonly known as the ReLU function, is the most widely used activation function in deep learning models. It suppresses negative values to zero. The reason ReLU is so widely used is that it deactivates the neurons that produce negative values, a behavior that is desirable in most networks containing thousands of neurons. The following is the plot of the ReLU activation function:
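As a concrete reference, here is a minimal sketch of the ReLU function and its plot. It assumes NumPy and Matplotlib and a helper name relu, none of which come from the original text:

import numpy as np
import matplotlib.pyplot as plt

def relu(x):
    # ReLU keeps positive values unchanged and suppresses negative values to zero
    return np.maximum(0, x)

x = np.linspace(-10, 10, 200)
plt.plot(x, relu(x))
plt.title('ReLU activation')
plt.xlabel('input')
plt.ylabel('output')
plt.show()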

A modified form of ReLU is leaky ReLU. Whereas ReLU completely deactivates a neuron that receives a negative value, leaky ReLU instead reduces the effect of those neurons by a factor, say c. The following equation defines the leaky ReLU activation function:
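f(x) = \begin{cases} x & \text{if } x > 0 \\ c \cdot x & \text{if } x \le 0 \end{cases}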

The following is the plot of output values from the leaky ReLU activation function:
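A minimal sketch of leaky ReLU and its plot, again assuming NumPy and Matplotlib; the factor c = 0.1 is an illustrative choice, as the text does not fix a value:

import numpy as np
import matplotlib.pyplot as plt

def leaky_relu(x, c=0.1):
    # Scale negative inputs down by the factor c instead of zeroing them out
    return np.where(x > 0, x, c * x)

x = np.linspace(-10, 10, 200)
plt.plot(x, leaky_relu(x))
plt.title('Leaky ReLU activation')
plt.xlabel('input')
plt.ylabel('output')
plt.show()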
