
Visualizing training

Since we've written log data from both models in Chapter 2, Using Deep Learning to Solve Regression Problems, we can use TensorBoard to compare the two models graphically. Open up TensorBoard and head to the SCALARS tab. You should see something like this; you may need to click loss and val_loss to expand the graphs:

TensorBoard displaying the loss and val_loss plots for the models

If you look at the bottom-left corner of the screen, you should notice that each directory we created has a run associated with it. Both are currently selected. This means that on our graphs, we will see output for both models.
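As a reminder, those per-run directories come from giving each model its own log_dir when creating Keras's TensorBoard callback. The following is a minimal sketch of that setup, not the book's exact code; the directory names and model variable names (mlp_model, dnn_model) are assumptions for illustration:

from tensorflow.keras.callbacks import TensorBoard

def make_tensorboard_callback(run_name, base_dir="./logs"):
    """Return a TensorBoard callback that writes this run's logs to base_dir/run_name."""
    return TensorBoard(log_dir="{}/{}".format(base_dir, run_name))

# Each model writes to its own directory, so both show up as separate runs
# in TensorBoard's run list (bottom-left of the SCALARS tab).
# mlp_model.fit(x_train, y_train,
#               validation_data=(x_val, y_val),
#               epochs=100,
#               callbacks=[make_tensorboard_callback("mlp")])
# dnn_model.fit(x_train, y_train,
#               validation_data=(x_val, y_val),
#               epochs=100,
#               callbacks=[make_tensorboard_callback("dnn")])

With both runs written under ./logs, launching TensorBoard with tensorboard --logdir ./logs shows the two runs together, which is what produces the overlaid loss and val_loss curves shown above.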

TensorBoard can accommodate many, many runs, and you can filter them with a regular expression (for example, ^dnn would show all runs whose names start with dnn). This means that if you're searching for the best model across many experiments or runs (such as during hyperparameter optimization), you can navigate them quickly if you explicitly and consistently name your runs and include meaningful hyperparameter and architecture information in the name, so do that!
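For example, here's a hypothetical helper (not from the book) showing one way to bake hyperparameter and architecture details into a run name, so runs can later be filtered with a regex such as ^dnn:

from datetime import datetime

def run_name(prefix, **hparams):
    """Build a run name such as 'dnn_layers-3_lr-0.001_20190101-120000'."""
    parts = [prefix] + ["{}-{}".format(k, v) for k, v in sorted(hparams.items())]
    parts.append(datetime.now().strftime("%Y%m%d-%H%M%S"))
    return "_".join(parts)

# Passing this name as the log directory suffix keeps every run self-describing.
print(run_name("dnn", layers=3, lr=0.001))
# dnn_layers-3_lr-0.001_20190101-120000  (timestamp will vary)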

The default X scale on these graphs is epochs. The Y value is the loss function we chose, which was MAE. You can click on the graphs to explore them and drag to zoom. 

Viewing the graphs like this, we can really see the relative bias and variance of each network. While there is a clear separation between the models in training loss, the deep neural network gets only marginally better on the validation set, suggesting that we've headed into overfitting territory.
