
The need for Dropout

Now, let's go back to our much-needed discussion about Dropout. We use dropout in deep learning as a way of randomly disconnecting a fraction of the network connections between layers during each training iteration. An example of dropout being applied to three network layers during one iteration is shown in the following diagram:



Before and after dropout

The important thing to understand is that the same connections are not always cut; a new random set is dropped on each iteration. This prevents the network from becoming overly specialized to its training data and forces it to become more generalized. Generalizing a model is a common theme in deep learning, and we often do this so our models can learn a broader set of problems more quickly. Of course, there may be times when generalizing a network limits its ability to learn.

If we go back to the previous sample now and look at the code, we see a Dropout layer being used like so:

x = Dropout(.5)(x)

That simple line of code tells the network to randomly drop out, or disconnect, 50% of the connections during each training iteration. Dropout is most commonly applied between fully connected layers (Input -> Dense -> Dense) and is very useful as a way of improving performance or accuracy. This may or may not account for some of the improved performance in the previous example.
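To make the usage concrete, the following is a minimal sketch of a small fully connected network with Dropout between its Dense layers, written with the same Keras functional API used in the sample. The input shape, layer widths, and 10-class output are hypothetical placeholders here, not the book's actual model:

from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

inputs = Input(shape=(64,))                    # hypothetical 64-feature input
x = Dense(128, activation='relu')(inputs)
x = Dropout(.5)(x)                             # randomly disable 50% of units each training step
x = Dense(128, activation='relu')(x)
x = Dropout(.5)(x)
outputs = Dense(10, activation='softmax')(x)   # hypothetical 10-class output

model = Model(inputs, outputs)
model.compile(optimizer='adam', loss='categorical_crossentropy')

Note that Keras only applies dropout while training; at inference time the layer passes its inputs through unchanged, so nothing extra is needed to disable it.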

In the next section, we will look at how deep learning mimics the memory sub-process, or temporal sense.
