
Rectified Linear Unit

The Rectified Linear Unit (ReLU) has been the most widely used activation function since around 2015. It is a simple thresholding operation and has advantages over the other functions, such as cheap computation and reduced vanishing gradients. The function is defined by the following formula:

f(x) = max(0, x)

The following figure shows the ReLU activation function:

The output range is [0, infinity). ReLU is widely used in deep neural networks for computer vision and speech recognition. There are various other activation functions as well, but we have covered the most important ones here.
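As a minimal sketch of the formula above, ReLU can be applied element-wise to an array of pre-activations; the function name relu and the sample inputs here are illustrative, assuming NumPy is available:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: returns max(0, x) for each element."""
    return np.maximum(0, x)

# Negative inputs are clipped to 0; positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # [0.  0.  0.  0.5 2. ]
```

Note that np.maximum compares each element against 0, which matches the piecewise definition: the output is 0 for negative inputs and the identity for non-negative inputs.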
