Deterministic dropout for deep neural networks using composite random forest

Article Type

Research Article

Publication Title

Pattern Recognition Letters

Abstract

Dropout prevents overfitting in deep neural networks. The typical dropout strategy terminates connections at random, irrespective of their importance, and such termination can block the propagation of class-discriminative information across the network. As a result, dropout may lead to inferior performance. We propose a deterministic dropout in which only unimportant connections are dropped, ensuring the propagation of class-discriminative information. We identify the unimportant connections using a novel composite random forest integrated into the network, and we prove that terminating these unimportant connections yields better generalization. The proposed algorithm is useful for preventing overfitting on noisy datasets and is equally effective on datasets with a small number of training examples. Experiments on several benchmark datasets show up to 8% improvement in classification accuracy.
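The abstract only sketches the method, but its core idea, dropping the least important connections rather than random ones, can be illustrated with a minimal sketch. The snippet below assumes PyTorch and, for simplicity, drops whole units rather than individual connections; the importance scores are a hypothetical stand-in for the output of the paper's composite random forest, whose construction the abstract does not detail.

    # Minimal sketch of deterministic dropout: zero out the fraction `p` of
    # units with the LOWEST importance scores instead of a random subset.
    # `importance` is a hypothetical stand-in for the composite random
    # forest's per-unit scores; the paper's actual scoring is not given here.
    import torch
    import torch.nn as nn

    class DeterministicDropout(nn.Module):
        def __init__(self, p: float = 0.5):
            super().__init__()
            self.p = p

        def forward(self, x: torch.Tensor, importance: torch.Tensor) -> torch.Tensor:
            if not self.training or self.p == 0.0:
                return x
            k = int(self.p * importance.numel())        # number of units to drop
            # indices of the k least important units
            _, drop_idx = torch.topk(importance, k, largest=False)
            mask = torch.ones_like(importance)
            mask[drop_idx] = 0.0
            # rescale the kept units, as in standard (inverted) dropout
            return x * mask / (1.0 - self.p)

    # Usage: per-unit importance scores from some feature-importance model
    layer = DeterministicDropout(p=0.3)
    layer.train()
    h = torch.randn(8, 10)        # batch of hidden activations
    scores = torch.rand(10)       # stand-in importance scores
    out = layer(h, scores)

Unlike standard dropout, the mask here is a deterministic function of the importance scores, so the same low-importance units are suppressed on every forward pass until the scores are updated.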

First Page

205

Last Page

212

DOI

10.1016/j.patrec.2019.12.023

Publication Date

3-1-2020
