
Visualizing a broken network

TensorBoard is a great troubleshooting tool. To demonstrate this, I'm going to copy our deep neural network and break it! Luckily, breaking a neural network is really easy. Trust me, I've done it enough unintentionally that I'm basically an expert at this point.

Imagine that you have just trained a new neural network and seen that the loss looked like this:

The loss function for this network is stuck, and it's way higher than our previous run. What went wrong?

Navigate to the HISTOGRAMS section of TensorBoard and visualize the first hidden layer. Let's compare the histogram of the weights for hidden layer 1 in both networks: 

Screenshot displaying the histogram of the weights for hidden layer 1 in both networks

For both the biases and weights of the network labelled dnn, you'll see that the weights are spread out across the graph. You might even say that the distribution of each could be normal(ish).

You can also compare the weights and biases in the distributions section. Both present mostly the same information in slightly different ways.
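If you're wondering how those histograms get into TensorBoard in the first place: in Keras (which I'm assuming here; the log directory name is hypothetical), histogram logging is enabled through the TensorBoard callback. Setting `histogram_freq=1` writes weight and bias histograms for every layer at the end of each epoch, which is exactly what populates the HISTOGRAMS and DISTRIBUTIONS tabs.

```python
from tensorflow.keras.callbacks import TensorBoard

# Hypothetical log directory; point TensorBoard at its parent with
# `tensorboard --logdir ./logs` and open the HISTOGRAMS tab.
tb_callback = TensorBoard(log_dir="./logs/dnn", histogram_freq=1)

# Then pass the callback when training, e.g.:
# model.fit(X, y, epochs=10, callbacks=[tb_callback])
```

Runs from different `log_dir` subdirectories (say `./logs/dnn` and `./logs/dnn_broken`) show up side by side in TensorBoard, which is what makes the comparison above possible.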

Now, look at the weights and biases of our broken network. They're not so spread out; in fact, the weights are all basically the same. The network isn't really learning. Every neuron in the layer appears to be more or less the same. If you look at the other hidden layers, you'll see more of the same.

You might be wondering what I did to make this happen. You're in luck; I'll share my secret. After all, you never know when you might need to break your own network. To break things, I initialized every neuron in the network to the exact same value. When this happens, the error every neuron receives during backprop is exactly the same, so every neuron changes in exactly the same way. The network then fails to break symmetry. Initializing the weights of a deep neural network randomly is really important, and this is what happens if you break that rule!
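The symmetry failure is easy to reproduce in a few lines. Here's a minimal sketch (mine, not the book's network) of a tiny two-layer network trained with plain NumPy, where every hidden weight starts at the same constant. After training, every hidden neuron's incoming weights are still identical to its neighbors', which is exactly the flat histogram you'd see in TensorBoard:

```python
import numpy as np

# Tiny network: 2 inputs -> 3 hidden (sigmoid) -> 1 output.
# Broken on purpose: every hidden weight starts at the SAME value.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = (X[:, :1] + X[:, 1:] > 0).astype(float)  # simple separable target

W1 = np.full((2, 3), 0.5)  # constant init -> symmetry never breaks
b1 = np.zeros(3)
W2 = np.full((3, 1), 0.5)
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(100):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass (binary cross-entropy gradients)
    dp = (p - y) / len(X)
    dW2 = h.T @ dp
    dh = (dp @ W2.T) * h * (1 - h)   # identical error for every hidden unit
    dW1 = X.T @ dh
    # SGD step
    W2 -= 1.0 * dW2
    b2 -= 1.0 * dp.sum(axis=0)
    W1 -= 1.0 * dW1
    b1 -= 1.0 * dh.sum(axis=0)

# All three hidden neurons learned the exact same weights.
print(W1)
```

Because every hidden unit starts identical and receives the identical gradient at every step, the three columns of `W1` never diverge; swap `np.full` for a random initializer and they separate immediately.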

You can use TensorBoard exactly like this when you have a problem. Keep in mind that our deep neural network has 4,033 parameters, and that still qualifies as tiny in the world of deep learning. With TensorBoard, we were able to visually inspect all 4,033 of them and identify a problem. TensorBoard is an amazing flashlight in the dark room that is deep learning.
