"A new family of bounded divergence measures and application to signal detection" by Shivakumar Jolad, Ahmed Roman et al.
 

A new family of bounded divergence measures and application to signal detection

Document Type

Conference Article

Publication Title

ICPRAM 2016 - Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods

Abstract

We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric, and positive semi-definite, and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some properties such as curvature and the relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences. We also derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
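The abstract does not give the closed form of the BBD family, so the sketch below computes only the standard quantities it is built around: the Bhattacharyya coefficient ρ(P, Q) = Σᵢ √(pᵢ qᵢ) for discrete distributions, and the squared Hellinger distance H² = 1 − ρ, which the abstract states is the asymptotic limit of the BBD measure. The function names and the discrete setting are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient rho = sum_i sqrt(p_i * q_i)
    for two discrete probability distributions on the same support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def squared_hellinger(p, q):
    """Squared Hellinger distance H^2 = 1 - rho; per the abstract,
    the asymptotic limit of the BBD measure."""
    return 1.0 - bhattacharyya_coefficient(p, q)

# Illustrative distributions (hypothetical example, not from the paper)
p = [0.5, 0.5]
q = [0.9, 0.1]
rho = bhattacharyya_coefficient(p, q)   # 1.0 iff p == q; smaller when they differ
h2 = squared_hellinger(p, q)            # 0.0 iff p == q
```

Both quantities are bounded and symmetric in (p, q), matching the properties the abstract claims for the BBD family itself.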

First Page

72

Last Page

83

DOI

10.5220/0005695200720083

Publication Date

1-1-2016

Comments

Open Access; Hybrid Gold Open Access
