Class

Article

College

College of Science

Department

English Department

Faculty Mentor

Kevin Moon

Presentation Type

Poster Presentation

Abstract

The problem of estimating a probability density function from data has many applications in machine learning and data science. Nonparametric estimators are useful in this context because they require relatively few assumptions on the densities. Unfortunately, standard nonparametric methods such as kernel density estimation tend to converge slowly to the true density in high dimensions as a function of the sample size. Recent work has shown that optimally weighted ensembles of nonparametric estimators can achieve a fast convergence rate when estimating information-theoretic functionals such as information divergence. We explore the extension of this theory to density estimation to derive a nonparametric kernel density estimator that converges quickly to the true density function.
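The ensemble idea described above can be sketched as follows: fit several kernel density estimators with different bandwidths, then combine them with weights that sum to one. This is a minimal illustration only; the fixed weights below are placeholders, whereas the work described in the abstract would choose them by solving an optimization problem to cancel lower-order bias terms.

```python
import numpy as np

def kde(x_eval, data, h):
    """Gaussian kernel density estimate at points x_eval with bandwidth h."""
    diffs = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 500)          # sample from a standard normal
x = np.linspace(-3, 3, 101)               # evaluation grid

# Base estimators: one KDE per bandwidth
bandwidths = [0.2, 0.4, 0.8]
estimates = np.stack([kde(x, data, h) for h in bandwidths])

# Illustrative fixed weights summing to 1 (optimally chosen weights
# would instead come from a convex optimization over the ensemble)
weights = np.array([0.5, 0.3, 0.2])
ensemble = weights @ estimates            # weighted combination of the KDEs
```

Because the weights sum to one and each base estimator is itself a density estimate, the weighted combination remains (approximately) a valid density.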

Location

Logan, UT

Start Date

4-6-2022 12:00 AM

Included in

Mathematics Commons


Title

Ensemble Kernel Density Estimation
