
Activation function — ReLU

The sigmoid is not the only activation function used in neural networks. Recently, a very simple function called the rectified linear unit (ReLU) has become very popular because it produces very good experimental results. A ReLU is simply defined as f(x) = max(0, x). As you can see in the following graph, the function is zero for negative values, and it grows linearly for positive values:

[Figure: Graph of the ReLU function f(x) = max(0, x)]
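To make the definition concrete, here is a minimal NumPy sketch of ReLU and the derivative used during backpropagation (the names relu and relu_derivative are our own, not from the text):

import numpy as np

def relu(x):
    # Element-wise ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0, x)

def relu_derivative(x):
    # Gradient of ReLU: 0 for x < 0, 1 for x > 0
    # (at x = 0 the derivative is undefined; using 0 there is a common convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))             # [0.  0.  0.  1.5 3. ]
print(relu_derivative(x))  # [0. 0. 0. 1. 1.]

The piecewise-constant gradient is part of why ReLU works well in practice: for positive inputs the gradient is exactly 1, so it does not shrink as it propagates through many layers the way the sigmoid's gradient does.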
