
What happens if we use too many neurons?

If we make our network architecture too complicated, two things will happen:

  • We're likely to develop a high variance model
  • The model will train more slowly than a less complicated model

If we add many layers, the gradients flowing back to the first few layers get smaller and smaller until those layers barely train at all, which is called the vanishing gradient problem. We're nowhere near that yet, but we will talk about it later.
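To get a feel for why this happens, here is a minimal NumPy sketch (not code from this book) that backpropagates a gradient through a stack of sigmoid layers. The sigmoid's derivative is at most 0.25, so each layer multiplies the gradient by a small factor, and the gradient's magnitude shrinks geometrically with depth; the layer width and weight scale are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
width, depth = 32, 10              # hypothetical network size
x = rng.normal(size=(1, width))    # a single input example
grad = np.ones((1, width))         # gradient arriving from the loss
norms = []

for _ in range(depth):
    W = rng.normal(scale=1.0 / np.sqrt(width), size=(width, width))
    a = sigmoid(x @ W)             # forward pass through one layer
    # backward pass: multiply by the sigmoid derivative a*(1-a),
    # which is at most 0.25, then by the transposed weights
    grad = (grad * a * (1.0 - a)) @ W.T
    norms.append(float(np.linalg.norm(grad)))
    x = a

print(norms)  # magnitudes shrink layer by layer
```

Printing `norms` shows the gradient reaching the earliest layers is orders of magnitude smaller than the gradient at the top, which is exactly why those layers barely train.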

In (almost) the words of rap legend Christopher Wallace, aka Notorious B.I.G., the more neurons we come across, the more problems we see. With that said, the variance can be managed with dropout, regularization, and early stopping, and advances in GPU computing make deeper networks possible.
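Of those variance-management tools, dropout is the simplest to see in code. Here is a minimal NumPy sketch (not this book's implementation) of inverted dropout: during training, a random fraction of activations is zeroed and the survivors are rescaled so the expected activation is unchanged; at inference time the layer is a no-op.

```python
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Inverted dropout: zero a random fraction `rate` of units and
    rescale the survivors by 1/(1 - rate) so the expected value
    of each activation is unchanged."""
    if not training or rate == 0.0:
        return activations
    keep = rng.random(activations.shape) >= rate
    return activations * keep / (1.0 - rate)

rng = np.random.default_rng(42)
a = np.ones((4, 8))
train_out = dropout(a, rate=0.5, rng=rng)   # survivors become 2.0, rest 0.0
eval_out = dropout(a, rate=0.5, rng=rng, training=False)  # unchanged
```

The rescaling during training is what lets you leave the network untouched at inference time, which is how most frameworks implement dropout.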

If I had to pick between a network with too many neurons or too few, and I only got to try one experiment, I'd prefer to err on the side of slightly too many.  
