A nonparametric algorithm for estimating mutual information between digital signals

Article Type

Research Article

Publication Title

Digital Signal Processing: A Review Journal


It has recently been proved mathematically that any digital signal can be expressed as a string of 3-point motifs, and that the information in the signal is encoded in terms of those motifs. This result has several important applications in statistical signal processing, such as entropy estimation, transfer entropy, and data mining. In the current work we show an application to the estimation of mutual information (MI) across any number of signals. MI is an important measure of interdependence among signals, with wide applications. However, estimating the MI between two signals treated as random variables, whose probability distributions are not known, is quite challenging. Here we propose a nonparametric, real-time method, formulated in terms of motifs, to estimate the MI between two signals, and then generalize it to any number of signals. On simulated data, in which the MI can be controlled, the method performed on par with the k-nearest-neighbor (kNN) based method and the permutation-motif based method, while being much faster and more memory efficient. On real data with added noise and a varying signal-to-noise ratio (SNR), the proposed method performed on par with the other two methods. A novel hypothesis-testing method is also introduced for comparing performances.
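The general idea of motif-based MI estimation can be illustrated with a minimal sketch. The abstract does not specify the paper's exact motif definition, so this example makes an assumption: each consecutive 3-point window is mapped to its ordinal (rank-order) pattern, and MI is then estimated with a plug-in formula over the joint and marginal motif frequencies. The function names and encoding are hypothetical, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def motif_codes(x):
    """Map each consecutive 3-point window of a signal to an ordinal motif code.

    Hypothetical encoding: the motif of a triple is its rank pattern, giving
    at most 3! = 6 distinct motifs. The paper's motif definition may differ.
    """
    windows = np.lib.stride_tricks.sliding_window_view(np.asarray(x, float), 3)
    return [tuple(np.argsort(w)) for w in windows]

def mutual_information(x, y):
    """Plug-in MI estimate (in bits) between two signals via motif frequencies."""
    mx, my = motif_codes(x), motif_codes(y)
    n = min(len(mx), len(my))
    joint = Counter(zip(mx[:n], my[:n]))          # joint motif distribution
    px, py = Counter(mx[:n]), Counter(my[:n])     # marginal motif distributions
    mi = 0.0
    for (a, b), c in joint.items():
        pxy = c / n
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), with counts rescaled by n
        mi += pxy * np.log2(pxy * n * n / (px[a] * py[b]))
    return mi
```

Because only motif counts are stored, the estimator runs in a single pass over the signals, which is consistent with the speed and memory efficiency claimed in the abstract; a generalization to more signals would replace the pairwise joint counter with a joint counter over motif tuples.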



Publication Date

