
Dropout

Dropout is a widely used technique to improve the convergence and robustness of a neural net and to prevent it from overfitting. It consists of randomly setting some values to zero in the layers to which it is applied, which introduces some randomness into the data at every epoch.

Dropout is usually applied before the fully connected layers and less often in the convolutional layers. Let's add the following lines before each of our two fully connected layers:

dropout = 0.5

if dropout > 0:
    # srng is a Theano random stream defined earlier in the script,
    # e.g. theano.sandbox.rng_mrg.MRG_RandomStreams
    mask = srng.binomial(n=1, p=1 - dropout, size=hidden_input.shape)
    # The cast is important because
    # int * float32 = float64, which makes execution slower
    hidden_input = hidden_input * T.cast(mask, theano.config.floatX)
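
With the mask applied as above, activations are dropped only during training, so at test time the dropout has to be disabled and the remaining activations rescaled. A common alternative, shown here as a minimal sketch (not taken from 5-cnn-with-dropout.py), is inverted dropout, which divides by the keep probability during training so that nothing needs to change at test time:

# Inverted dropout (variant, for illustration): scale the kept activations
# by 1 / (1 - dropout) at training time so that their expected value matches
# test time, where the mask is simply not applied.
if dropout > 0:
    mask = srng.binomial(n=1, p=1 - dropout, size=hidden_input.shape)
    hidden_input = hidden_input * T.cast(mask, theano.config.floatX) / (1 - dropout)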

The full script is in 5-cnn-with-dropout.py. After 1,000 iterations, the validation error of the CNN with dropout continues to drop, down to 1.08%, while the validation error of the CNN without dropout does not go below 1.22%.

Readers who would like to go further with dropout should have a look at maxout units. They work well with dropout and replace the tanh non-linearity to get even better results. As dropout performs a kind of model averaging, maxout units try to find the optimal non-linearity for the problem.
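
As a rough illustration (not taken from the book's scripts), a maxout unit can be written in Theano by taking the maximum over several linear pieces produced by the preceding fully connected layer; the function name and the num_pieces parameter below are illustrative assumptions:

import theano.tensor as T

def maxout(hidden_input, num_pieces):
    # hidden_input is expected to have shape (batch_size, num_units * num_pieces):
    # group the columns into num_pieces linear pieces per output unit and keep
    # the maximum, which acts as a learned, piecewise-linear activation
    new_shape = (hidden_input.shape[0],
                 hidden_input.shape[1] // num_pieces,
                 num_pieces)
    return T.max(hidden_input.reshape(new_shape, ndim=3), axis=2)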
