
Rectified linear unit 

ReLU caps negative values at zero, while for positive values its output equals the input. It therefore has a constant gradient for positive inputs and a zero gradient for negative inputs. The following is a graph of ReLU:

As shown, ReLU doesn't fire at all for negative values. The computational complexity of this activation function is lower than that of the functions described previously; hence, prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.
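Before moving on, here is a minimal NumPy sketch (not from the original text) of ReLU and its gradient as described above; the function names relu and relu_grad are illustrative:

```python
import numpy as np

def relu(x):
    # Negative values are capped at zero; positive values pass through unchanged
    return np.maximum(0, x)

def relu_grad(x):
    # Constant gradient (1) for positive inputs, zero gradient for negative inputs
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```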
