Analysis on the Dropout Effect in Convolutional Neural Networks
Cited 93 times in Web of Science
Cited 129 times in Scopus
- Authors
- Issue Date
- 2017
- Publisher
- Springer International Publishing AG
- Citation
- Computer Vision - ACCV 2016, Part II, Vol. 10112, pp. 189-204
- Abstract
- Regularizing neural networks is an important task for reducing overfitting. Dropout [1] has been a widely used regularization technique for neural networks. In convolutional neural networks (CNNs), dropout is usually applied to the fully connected layers, while its regularization effect in the convolutional layers has not been thoroughly analyzed in the literature. In this paper, we analyze the effect of dropout in the convolutional layers and show that it is indeed a powerful generalization method. We observe that dropout in CNNs regularizes the networks by adding noise to the output feature maps of each layer, yielding robustness to variations of images. Based on this observation, we propose stochastic dropout, whose drop ratio varies at each iteration. Furthermore, we propose a new regularization method, max-drop, which is inspired by the behavior of image filters: rather than dropping activations at random, we selectively drop the activations that have high values across the feature map or across the channels. Experimental results validate that the regularization performance of max-drop and stochastic dropout is competitive with that of dropout and spatial dropout [2].
- ISSN
- 0302-9743
- Files in This Item:
- There are no files associated with this item.
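To make the two methods described in the abstract concrete, here is a minimal NumPy sketch of the ideas: a dropout whose drop ratio is resampled on every call, and a channel-wise max-drop that zeroes the strongest activation at randomly chosen spatial locations. The function names, the uniform sampling range for the drop ratio, and the exact masking procedure are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_dropout(x, low=0.0, high=0.5, training=True):
    """Dropout whose drop ratio is resampled at every iteration.
    The uniform [low, high) range is an assumption for illustration;
    the paper may draw the ratio from a different distribution."""
    if not training:
        return x
    p = rng.uniform(low, high)           # drop ratio varies per call
    mask = rng.random(x.shape) >= p      # keep each activation with prob. 1 - p
    return x * mask / (1.0 - p)          # inverted-dropout scaling

def max_drop_channels(x, p=0.1, training=True):
    """Sketch of max-drop across channels. x has shape (C, H, W).
    At each spatial location selected with probability p, the activation
    of the maximally responding channel is set to zero (an illustrative
    reading of the method, not the paper's exact procedure)."""
    if not training:
        return x
    c_max = x.argmax(axis=0)                      # (H, W): index of max channel
    drop = rng.random(x.shape[1:]) < p            # spatial locations to drop
    out = x.copy()
    h_idx, w_idx = np.nonzero(drop)
    out[c_max[h_idx, w_idx], h_idx, w_idx] = 0.0  # zero the strongest response
    return out

# Example usage on a dummy feature map of 8 channels, 16x16 spatial size:
feat = rng.random((8, 16, 16)).astype(np.float32)
feat = stochastic_dropout(feat)
feat = max_drop_channels(feat, p=0.1)
```

The abstract also mentions dropping high values across the feature map rather than across channels; that variant would take the argmax over the spatial axes per channel instead of over the channel axis.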
Related Researcher
- Graduate School of Convergence Science & Technology
- Department of Intelligence and Information
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.