Dropout

Dropout is a widely used technique to improve the convergence and robustness of a neural net and to prevent it from overfitting. It consists of setting a random subset of values to zero in the layers to which we apply it, introducing some randomness into the activations at every epoch.
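Before looking at the Theano version below, the masking idea can be sketched in plain NumPy (this is an illustrative sketch, not the book's script; the function name and shapes are chosen for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(x, p_drop=0.5):
    """Zero each activation independently with probability p_drop."""
    mask = rng.binomial(n=1, p=1 - p_drop, size=x.shape)
    return x * mask

x = np.ones((2, 4), dtype=np.float32)
y = dropout_mask(x, p_drop=0.5)
# each entry of y is either 0 (dropped) or 1 (kept)
```

Because a fresh mask is drawn every time, different units are silenced on each call, which is what gives dropout its model-averaging flavor.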

Usually, dropout is applied before the fully connected layers, and less often in convolutional layers. Let's add the following lines before each of our two fully connected layers:

dropout = 0.5

if dropout > 0:
    # srng is a Theano shared random stream, defined earlier in the script, e.g.:
    # from theano.tensor.shared_randomstreams import RandomStreams
    # srng = RandomStreams()
    mask = srng.binomial(n=1, p=1 - dropout, size=hidden_input.shape)
    # The cast is important: int * float32 promotes to float64,
    # which makes execution slower
    hidden_input = hidden_input * T.cast(mask, theano.config.floatX)

The full script is in 5-cnn-with-dropout.py. After 1,000 iterations, the validation error of the CNN with dropout continues to drop, down to 1.08%, while the validation error of the CNN without dropout does not go below 1.22%.

Readers who would like to go further with dropout should have a look at maxout units. They work well with dropout and replace the tanh non-linearities to get even better results. As dropout performs a kind of model averaging, maxout units try to learn the optimal non-linearity for the problem.
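The idea behind a maxout unit can be sketched as follows: instead of a fixed non-linearity such as tanh, the unit computes k affine pieces and outputs their maximum, so the shape of the non-linearity is learned. This NumPy sketch is illustrative only; the shapes and names are assumptions, not the book's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def maxout(x, W, b):
    """Maxout unit: the max over k learned affine pieces.

    x: (batch, d_in), W: (d_in, d_out, k), b: (d_out, k)
    """
    z = np.einsum('bi,iok->bok', x, W) + b  # all k pieces: (batch, d_out, k)
    return z.max(axis=-1)                   # pick the largest piece per unit

x = rng.standard_normal((8, 5)).astype(np.float32)
W = rng.standard_normal((5, 3, 4)).astype(np.float32)  # k = 4 pieces
b = np.zeros((3, 4), dtype=np.float32)
h = maxout(x, W, b)  # shape (8, 3)
```

With k pieces per unit, a maxout layer can approximate any convex activation function, which is why it pairs well with the averaging effect of dropout.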