
Activation function — ReLU

The sigmoid is not the only kind of smooth activation function used for neural networks. Recently, a very simple function called the rectified linear unit (ReLU) became very popular because it generates very good experimental results. A ReLU is simply defined as f(x) = max(0, x), and this nonlinear function is represented in the following graph. As you can see, the function is zero for negative values and grows linearly for positive values:

[Figure: the ReLU activation function, zero for negative inputs and linear for positive inputs]
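To make the definition concrete, here is a minimal sketch of ReLU in practice, assuming a standard Keras installation. The NumPy helper and the layer sizes (64 hidden units, 784 inputs, 10 outputs) are illustrative choices, not taken from the text; in Keras the activation is simply selected by name when a layer is defined:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# ReLU in plain NumPy: f(x) = max(0, x)
def relu(x):
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]

# In Keras, ReLU is chosen by name in the layer definition.
# The layer sizes below are illustrative, not from the text.
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))
model.summary()
```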
