Feature Selection Using a Neural Network with Group Lasso Regularization and Controlled Redundancy

Article Type

Research Article

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

Abstract

We propose a neural network-based feature selection (FS) scheme that can control the level of redundancy in the selected features by integrating two penalties into a single objective function. The Group Lasso penalty aims to produce sparsity in features in a grouped manner. The redundancy-control penalty, which is defined based on a measure of dependence among features, is utilized to control the level of redundancy among the selected features. Both penalty terms involve the $L_{2,1}$-norm of the weight matrix between the input and hidden layers. These penalty terms are nonsmooth at the origin, so a simple but efficient smoothing technique is employed to overcome this issue. The monotonicity and convergence of the proposed algorithm are stated and proved under suitable assumptions. Extensive experiments are then conducted on both artificial and real data sets. Empirical results clearly demonstrate the effectiveness of the proposed FS scheme and its ability to control redundancy, and the empirical simulations are consistent with the theoretical results.
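
For illustration only, the sketch below shows one plausible way such an objective could be assembled; it is not the authors' implementation. It assumes a single-hidden-layer network in PyTorch, a smoothed per-feature group norm of the input-to-hidden weight matrix (whose sum is the $L_{2,1}$-norm), and absolute Pearson correlation as a stand-in for the paper's dependence measure. The names FSNet, group_norms, and penalties, the layer sizes, the penalty coefficients lam_sparse and lam_red, the smoothing constant eps, and the selection threshold are all hypothetical choices.

# Minimal sketch (assumptions noted above), not the authors' exact formulation.
import torch
import torch.nn as nn

class FSNet(nn.Module):
    def __init__(self, n_features, n_hidden, n_classes):
        super().__init__()
        self.fc1 = nn.Linear(n_features, n_hidden)  # weight shape: (n_hidden, n_features)
        self.fc2 = nn.Linear(n_hidden, n_classes)

    def forward(self, x):
        return self.fc2(torch.tanh(self.fc1(x)))

def group_norms(W, eps=1e-4):
    # Smoothed L2 norm of each per-feature column group of the input-to-hidden
    # weights; the eps term removes the nonsmoothness at the origin.
    return torch.sqrt((W ** 2).sum(dim=0) + eps ** 2)  # shape: (n_features,)

def penalties(model, corr_abs, lam_sparse=1e-3, lam_red=1e-3):
    W = model.fc1.weight                              # (n_hidden, n_features)
    g = group_norms(W)
    sparsity = g.sum()                                # Group Lasso (L_{2,1}) term
    # Redundancy term: discourage jointly large groups for dependent features,
    # weighted here by absolute correlation (an assumed dependence measure).
    off_diag = corr_abs - torch.diag(torch.diag(corr_abs))
    redundancy = g @ off_diag @ g
    return lam_sparse * sparsity + lam_red * redundancy

# Usage sketch with random data (hypothetical sizes).
torch.manual_seed(0)
X = torch.randn(256, 20)
y = torch.randint(0, 3, (256,))
corr_abs = torch.corrcoef(X.T).abs()

model = FSNet(n_features=20, n_hidden=16, n_classes=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y) + penalties(model, corr_abs)
    loss.backward()
    opt.step()

# Features whose group norm survives training are treated as selected.
selected = (group_norms(model.fc1.weight) > 1e-2).nonzero().squeeze(1)
print(selected.tolist())

In this sketch, input features whose group norm is driven toward zero by the Group Lasso term are discarded, while the redundancy term additionally discourages retaining pairs of strongly dependent features.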

First Page

1110

Last Page

1123

DOI

10.1109/TNNLS.2020.2980383

Publication Date

March 1, 2021
