
Rectified linear unit 

ReLU caps negative values at zero, while positive inputs pass through unchanged, so the output equals the input for positive values; in other words, it computes f(x) = max(0, x). It has a constant gradient for positive values and a zero gradient for negative values. The following is a graph of ReLU:

As shown, ReLU doesn't fire at all for negative values. The computational complexity of this activation function is lower than that of the functions described previously; hence, prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.
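
To make the behavior concrete, here is a minimal NumPy sketch of ReLU and its gradient (this is an illustrative example, not code from the book; the function names relu and relu_gradient are placeholders chosen for this sketch):

```python
import numpy as np

def relu(x):
    # Pass positive values through unchanged; clamp negatives to zero
    return np.maximum(0, x)

def relu_gradient(x):
    # Constant gradient of 1 for positive inputs, 0 for negative inputs
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))           # [0.  0.  0.  1.5 3. ]
print(relu_gradient(x))  # [0. 0. 0. 1. 1.]
```

The example shows why ReLU is cheap to compute: both the forward pass and the gradient reduce to a simple element-wise comparison, with no exponentials involved.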
