
Tuning the model hyperparameters

Now that we've trained an MLP and a six-layer deep neural network on the problem, we're ready to tune the model's hyperparameters.

We will discuss model tuning in depth in Chapter 6, Hyperparameter Optimization. There are a variety of strategies that you can use to choose the best parameters for your model. As you've probably noticed, there are many possible parameters and hyperparameters that we could still optimize.

If you wanted to fully tune this model, you should do the following (a code sketch covering these experiments follows the list):

  • Experiment with the number of hidden layers. It appears that five might be too many, and one might not be enough.
  • Experiment with the number of neurons in each hidden layer, relative to the number of layers.
  • Experiment with adding dropout or regularization.
  • Attempt to further reduce model error by trying SGD or RMSProp instead of Adam, or by using a different learning rate for Adam.
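As a minimal sketch of how these experiments might be wired together, the helper below parameterizes depth, width, dropout, and the optimizer so each can be varied independently. The function name build_model and all of the default values shown are illustrative placeholders, not values from the chapter, and the single-output regression head is an assumption based on the earlier networks:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

def build_model(input_dim, hidden_layers=3, units=64,
                dropout_rate=0.2, optimizer="adam"):
    """Build an MLP whose depth, width, dropout rate, and
    optimizer are all exposed as tunable hyperparameters."""
    model = Sequential()
    # The first hidden layer needs to know the input dimension
    model.add(Dense(units, activation="relu", input_dim=input_dim))
    model.add(Dropout(dropout_rate))
    # Add the remaining hidden layers
    for _ in range(hidden_layers - 1):
        model.add(Dense(units, activation="relu"))
        model.add(Dropout(dropout_rate))
    # Single linear output for regression; swap for your task
    model.add(Dense(1, activation="linear"))
    # optimizer can be "adam", "sgd", "rmsprop", or an
    # optimizer instance with a custom learning rate
    model.compile(optimizer=optimizer, loss="mean_absolute_error")
    return model

# Compare a few candidate depths against validation error
for n_layers in (1, 3, 5):
    model = build_model(input_dim=10, hidden_layers=n_layers)
```

Wrapping model construction in a single function like this makes it easy to loop over candidate configurations by hand, and later to hand the same function to an automated search tool of the kind covered in Chapter 6.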

Deep neural networks have so many moving parts that reaching an optimal configuration can be an exhausting pursuit. You'll have to decide whether your model is good enough.
