
Rectified linear unit 

ReLU clamps negative values to zero, while positive values pass through unchanged, so the output equals the input. It has a constant gradient for positive values and a zero gradient for negative values. The following is a graph of ReLU:

As shown, ReLU doesn't fire at all for negative values. The computational complexity of this activation function is lower than that of the functions described previously; hence, prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.
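Before moving on, here is a minimal NumPy sketch of ReLU and its gradient; it is illustrative only, and the relu and relu_grad helper names are our own rather than part of any library:

```python
import numpy as np

def relu(x):
    # Clamp negative inputs to zero; positive inputs pass through unchanged.
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 for negative inputs
    # (here the gradient at exactly 0 is taken as 0 by convention).
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Because the forward pass is just a comparison and the gradient is a constant for each sign of the input, both are cheap to compute, which is why prediction with ReLU is faster than with the activations covered earlier.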
