Document Type: Article
Journal/Book Title/Conference: Entropy
Volume: 20
Issue: 8
Publication Date: 7-27-2018
First Page: 1
Last Page: 27
Abstract
Recent work has focused on the problem of nonparametric estimation of information divergence functionals between two continuous random variables. Many existing approaches require either restrictive assumptions about the density support set or difficult calculations at the support set boundary, which must be known a priori. The mean squared error (MSE) convergence rate of a leave-one-out kernel density plug-in divergence functional estimator is derived for general bounded density support sets, where knowledge of the support boundary, and therefore boundary correction, is not required. The theory of optimally weighted ensemble estimation is generalized to derive a divergence estimator that achieves the parametric rate when the densities are sufficiently smooth. Guidelines for tuning parameter selection and the asymptotic distribution of this estimator are provided. Based on the theory, an empirical estimator of Rényi-α divergence is proposed that greatly outperforms the standard kernel density plug-in estimator in terms of MSE, especially in high dimensions. The estimator is shown to be robust to the choice of tuning parameters. We present extensive simulation results that verify the theoretical results. Finally, we apply the proposed estimator to estimate bounds on the Bayes error rate for a cell classification problem.
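To give a concrete picture of the two ideas the abstract names, the sketch below implements a leave-one-out Gaussian-kernel plug-in estimator of the Rényi-α divergence, D_α(f‖g) = (α−1)^{−1} log ∫ f^α g^{1−α} dx, and then combines plug-in estimates computed at several bandwidths using minimum-norm weights chosen to cancel lower-order bias terms, in the spirit of optimally weighted ensemble estimation. The bandwidth schedule h(ℓ) = ℓ·N^{−1/(2d)}, the Gaussian kernel, the clipping constant, and the particular bias-cancellation constraints are simplifying assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def kde(eval_pts, data, h, loo=False):
    """Gaussian product-kernel density estimate of `data` evaluated at
    `eval_pts`. With loo=True, `eval_pts` must be `data` itself and each
    point is left out of its own estimate (leave-one-out)."""
    n, d = data.shape
    sq_dists = np.sum((eval_pts[:, None, :] - data[None, :, :]) ** 2, axis=2)
    k = np.exp(-0.5 * sq_dists / h ** 2)
    norm = (np.sqrt(2.0 * np.pi) * h) ** d
    if loo:
        np.fill_diagonal(k, 0.0)          # drop each point's own kernel
        return k.sum(axis=1) / ((n - 1) * norm)
    return k.sum(axis=1) / (n * norm)

def renyi_plugin(x, y, alpha, h, eps=1e-12):
    """Plug-in estimate of D_alpha(f_x || f_y), using the identity
    integral(f_x^alpha * f_y^(1-alpha)) = E_{f_x}[(f_x/f_y)^(alpha-1)]."""
    fx = kde(x, x, h, loo=True)           # leave-one-out f_x at each X_i
    fy = kde(x, y, h)                     # f_y at each X_i
    ratio = np.clip(fx, eps, None) / np.clip(fy, eps, None)
    return np.log(np.mean(ratio ** (alpha - 1.0))) / (alpha - 1.0)

def renyi_ensemble(x, y, alpha, ells=(1.0, 1.5, 2.0, 2.5, 3.0)):
    """Weighted ensemble of plug-in estimates over a bandwidth grid.
    Weights are the minimum-L2-norm solution of sum(w) = 1 and
    sum_l w_l * l^(i/d) = 0 for i = 1..d-1, a simplified stand-in for
    the paper's optimal weight computation (requires len(ells) >= d)."""
    n, d = x.shape
    ells = np.asarray(ells, dtype=float)
    hs = ells * n ** (-1.0 / (2.0 * d))   # assumed schedule h(l) = l*n^(-1/(2d))
    ests = np.array([renyi_plugin(x, y, alpha, h) for h in hs])
    A = np.vstack([np.ones_like(ells)] + [ells ** (i / d) for i in range(1, d)])
    b = np.zeros(A.shape[0])
    b[0] = 1.0
    w = np.linalg.pinv(A) @ b             # minimum-norm weights solving A w = b
    return float(w @ ests)

# Example: two shifted 2-D Gaussians.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(0.5, 1.0, size=(500, 2))
print(renyi_ensemble(x, y, alpha=0.8))
```

Taking the minimum-norm weights mirrors the intent of the paper's weight optimization, which keeps the variance inflation from ‖w‖₂ small while zeroing the dominant bias terms; the paper's actual optimization and constraint set differ in detail.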
Recommended Citation
Moon, Kevin R.; Sricharan, Kumar; Greenewald, Kristjan; and Hero, Alfred O. III, "Ensemble Estimation of Information Divergence" (2018). Mathematics and Statistics Faculty Publications. Paper 236.
https://digitalcommons.usu.edu/mathsci_facpub/236