
ReLU

The ReLU non-linearity, defined as ReLU(x) = max(0, x), is a piecewise linear function whose non-linearity comes from rectifying negative inputs to zero. Unlike the sigmoid and Tanh non-linearities, whose gradients vary continuously, the gradient of ReLU takes only two values: 0 for inputs smaller than 0, and 1 for inputs larger than 0. Hence, the gradients of ReLU are sparse. Although the gradient of ReLU at 0 is undefined, common practice sets it to 0. There are variations of the ReLU non-linearity, including the ELU and the Leaky ReLU. Compared to sigmoid and Tanh, the derivative of ReLU is faster to compute, and because negative inputs are zeroed out, ReLU induces sparsity in models:

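As a minimal sketch, assuming PyTorch and Matplotlib are available, the following plots the ReLU activation over a range of inputs and then uses autograd to confirm the two-valued gradient described above, including the conventional value of 0 at the kink:

```python
import torch
import matplotlib.pyplot as plt

# ReLU(x) = max(0, x): zero for negative inputs, identity for positive inputs
relu = torch.nn.ReLU()
x = torch.arange(-5.0, 5.0, 0.1)
y = relu(x)

plt.plot(x.numpy(), y.numpy())
plt.title("ReLU activation")
plt.show()

# The gradient is 0 for negative inputs, 1 for positive inputs,
# and PyTorch follows the common convention of 0 at exactly 0.
x = torch.tensor([-2.0, 0.0, 2.0], requires_grad=True)
relu(x).sum().backward()
print(x.grad)  # tensor([0., 0., 1.])
```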