
Rectified linear unit 

ReLU clamps negative inputs to zero, while positive inputs pass through unchanged, so f(x) = max(0, x). It has a constant gradient of 1 for positive values and a zero gradient for negative values. The following is a graph of ReLU:

As shown, ReLU doesn't fire at all for negative values. The computational complexity of this activation function is lower than that of the functions described previously; hence, prediction is faster. In the next section, you will see how to interconnect several perceptrons to form a deep neural network.
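As a concrete illustration, the following is a minimal NumPy sketch of ReLU and its gradient (the function names relu and relu_grad are illustrative, not taken from any particular library):

    import numpy as np

    def relu(x):
        # Negative inputs are clamped to zero; positive inputs pass through unchanged.
        return np.maximum(0, x)

    def relu_grad(x):
        # Gradient is 1 for positive inputs and 0 for negative inputs.
        return (x > 0).astype(x.dtype)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))       # [0.  0.  0.  1.5 3. ]
    print(relu_grad(x))  # [0. 0. 0. 1. 1.]

Because both the forward pass and the gradient reduce to a simple comparison against zero, ReLU avoids the exponentials required by sigmoid and tanh, which is why it is cheaper to compute.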
