What happens if we use too many neurons?

If we make our network architecture too complicated, two things will happen:

  • We're likely to develop a high variance model
  • The model will train slower than a less complicated model

If we add many layers, our gradients will get smaller and smaller until the first few layers barely train, which is called the vanishing gradient problem. We're nowhere near that yet, but we will talk about it later.

In (almost) the words of rap legend Christopher Wallace, aka Notorious B.I.G., the more neurons we come across, the more problems we see. With that said, the variance can be managed with dropout, regularization, and early stopping, and advances in GPU computing make deeper networks possible.
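To make one of those remedies concrete, here is a minimal sketch of the logic behind early stopping: halt training once the validation loss has failed to improve for a set number of epochs (the patience). The function name and the loss values are invented for illustration; in practice a framework callback would track this for you.

```python
# Minimal sketch of early stopping: stop training when validation loss
# hasn't improved for `patience` consecutive epochs.

def early_stop_epoch(val_losses, patience=3):
    """Return the 0-indexed epoch after which training would stop,
    or the last epoch if training runs to completion."""
    best = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            return epoch
    return len(val_losses) - 1

# Hypothetical validation losses: they improve, then plateau,
# so training halts before wasting further epochs.
losses = [0.90, 0.70, 0.60, 0.58, 0.59, 0.60, 0.61, 0.62]
print(early_stop_epoch(losses))  # stops at epoch 6
```

This is exactly why early stopping pairs well with an over-sized network: the extra capacity is there if the data needs it, but training stops before the model memorizes the training set.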

If I had to pick between a network with slightly too many neurons and one with too few, and I could run only one experiment, I'd err on the side of slightly too many.
