
ReLU

The ReLU non-linearity is a piecewise linear function defined as ReLU(x) = max(0, x), where the non-linearity is introduced by rectification. Unlike the sigmoid and Tanh non-linearities, which have continuous gradients, the gradient of ReLU takes only two values: 0 for inputs smaller than 0, and 1 for inputs larger than 0. Hence, the gradients of ReLU are sparse. Although the gradient of ReLU at 0 is undefined, common practice sets it to 0. There are variations of the ReLU non-linearity, including the ELU and the Leaky ReLU. Compared to sigmoid and Tanh, the derivative of ReLU is faster to compute and induces sparsity in models:
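As a minimal sketch of this behavior (assuming PyTorch, which is not confirmed by this excerpt; any autograd framework would work similarly), applying ReLU to a few inputs straddling zero and backpropagating shows the two-valued, sparse gradient, with the gradient at exactly 0 set to 0:

```python
import torch

# A few inputs straddling zero; track gradients to inspect ReLU's derivative.
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0], requires_grad=True)
y = torch.relu(x)

# Backpropagate the sum so x.grad[i] holds d relu(x_i) / d x_i.
y.sum().backward()

print(y)       # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000], ...)
print(x.grad)  # tensor([0., 0., 0., 1., 1.]) -- 0 below zero, 1 above, 0 at 0
```

Note that the gradient vector contains mostly zeros for negative inputs, which is the sparsity the text refers to, and that PyTorch follows the common convention of returning 0 for the gradient at exactly 0.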
