
Rectified linear unit 

ReLU clamps negative values to zero and passes positive values through unchanged. It has a constant gradient of one for positive inputs and a zero gradient for negative inputs. The following is a graph of ReLU:

As shown, ReLU doesn't fire at all for negative values. Its computational cost is lower than that of the activation functions described previously, so prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network. 
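The behavior described above can be captured in a few lines of NumPy. The following is a minimal sketch; the names relu and relu_grad are illustrative helpers, not part of any library used in this book:

import numpy as np

def relu(x):
    # Clamp negative values to zero; positive values pass through unchanged
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 for negative inputs
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]

Because both the forward pass and the gradient reduce to a simple comparison, ReLU is cheaper to evaluate than sigmoid or tanh, which require exponentials.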
