
The need for Dropout

Now, let's go back to our much-needed discussion about Dropout. We use dropout in deep learning as a way of randomly disabling a fraction of a layer's units, and thus the connections they carry, on each training iteration. An example showing an iteration of dropout being applied to three network layers is shown in the following diagram:



Before and after dropout

The important thing to understand is that the same connections are not cut every time; a fresh random set is dropped on each iteration. This forces the network to become less specialized and more generalized, because no single unit can rely on any particular connection always being present. Generalizing a model is a common theme in deep learning, and we often do this so our models can learn a broader set of problems more quickly. Of course, there may be times when generalizing a network limits its ability to learn.

If we go back to the previous sample now and look at the code, we see a Dropout layer being used like so:

x = Dropout(.5)(x)

That simple line of code tells the network to randomly drop out, or disconnect, 50% of that layer's input units on each training iteration. Dropout is most commonly applied between fully connected layers (Input -> Dense -> Dense) and is very useful as a way of improving generalization and accuracy; note that Keras automatically disables it when the model is used for inference. This may or may not account for some of the improved performance from the previous example.
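To see the mechanics behind that one line, here is a minimal NumPy sketch of inverted dropout, the variant that Keras's Dropout layer implements internally. The `dropout` function and its argument names are illustrative only, not Keras's API:

```python
import numpy as np

def dropout(activations, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero a random fraction of units and rescale
    the survivors so the expected activation stays unchanged."""
    if not training or rate == 0.0:
        # At inference time, dropout is a no-op.
        return activations
    rng = rng or np.random.default_rng()
    keep_prob = 1.0 - rate
    # Each unit independently survives with probability keep_prob.
    mask = rng.random(activations.shape) < keep_prob
    # Dividing by keep_prob keeps the expected sum of activations constant.
    return activations * mask / keep_prob

x = np.ones((4, 8))            # a batch of 4 vectors with 8 units each
y = dropout(x, rate=0.5)       # roughly half the entries become zero
z = dropout(x, training=False) # unchanged at inference time
```

Because the surviving activations are scaled up during training, no rescaling is needed at test time; that is why the Keras layer can simply pass inputs through unchanged during inference.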

In the next section, we will look at how deep learning mimics the memory sub-process, or temporal sense.
