
Rectified Linear Unit

Rectified Linear Unit (ReLU) has been the most widely used activation function since 2015. It amounts to a simple thresholding condition, yet it has advantages over the other activation functions. The function is defined by the following formula:

f(x) = max(0, x)
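As a minimal sketch of this formula (the function name relu and the use of NumPy are illustrative assumptions, not from the book), ReLU can be written as an element-wise maximum:

import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs map to 0,
    # non-negative inputs pass through unchanged.
    return np.maximum(0, x)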

The following figure shows the ReLU activation function:

The output ranges from 0 to infinity, so ReLU never produces negative activations. ReLU finds wide application in computer vision and speech recognition with deep neural networks. There are various other activation functions as well, but we have covered the most important ones here.
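As a quick illustrative check of that output range (again a sketch using NumPy, not the book's own code), negative inputs are clipped to 0 while positive inputs pass through:

import numpy as np

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(np.maximum(0, x))  # -> [0.  0.  0.  0.5 2. ]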
