
Tuning the model hyperparameters

Now that we've trained an MLP and a six-layer deep neural network on the problem, we're ready to tune and optimize model hyperparameters.

We will discuss model tuning in depth in Chapter 6, Hyperparameter Optimization. There are a variety of strategies you can use to choose the best parameters for your model, and as you've probably noticed, there are many parameters and hyperparameters that we could still optimize.

If you wanted to fully tune this model, you should do the following (a code sketch follows the list):

  • Experiment with the number of hidden layers. It appears that five might be too many, and one might not be enough.
  • Experiment with the number of neurons in each hidden layer, relative to the number of layers.
  • Experiment with adding dropout or regularization.
  • Attempt to further reduce model error by trying SGD or RMSprop instead of Adam, or by using a different learning rate for Adam.
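
To make these experiments cheap to run, it can help to expose each knob as a function argument. The following is a minimal sketch assuming a tf.keras Sequential regression model; the build_model helper, its argument names, and input_dim=10 are hypothetical illustrations, not the chapter's actual code:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.optimizers import Adam

def build_model(n_hidden=3, n_neurons=64, dropout_rate=0.0,
                optimizer="adam", input_dim=10):
    """Build an MLP regressor whose tuning knobs are function arguments."""
    model = Sequential()
    # The first hidden layer declares the input shape.
    model.add(Dense(n_neurons, activation="relu", input_dim=input_dim))
    for _ in range(n_hidden - 1):
        model.add(Dense(n_neurons, activation="relu"))
        if dropout_rate > 0:
            # Optional dropout after each additional hidden layer.
            model.add(Dropout(dropout_rate))
    model.add(Dense(1))  # linear output for a regression target
    model.compile(optimizer=optimizer, loss="mean_squared_error")
    return model

# One experiment per call: vary depth, width, dropout, and optimizer.
shallow = build_model(n_hidden=1, n_neurons=32)
regularized = build_model(n_hidden=3, dropout_rate=0.5, optimizer="rmsprop")
slow_adam = build_model(optimizer=Adam(learning_rate=1e-4))
```

Wrapping model construction this way also plays nicely with the search strategies covered in Chapter 6, since a tuner only needs to call one function with different arguments.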

Deep neural networks have so many moving parts that getting to optimal can be exhausting. You'll have to decide whether your model is good enough.
