A Comparative Study of Various Probability Density Estimation Methods for Data Analysis
Abstract
Estimating the probability density function (PDF) of a data sample is a task of primary importance in many contexts, including Bayesian learning and novelty detection. Despite the wide variety of methods available for PDF estimation, only a few of them are widely used in practice by data analysts. Among the most widely used are histograms, Parzen windows, vector-quantization-based Parzen estimators, and finite Gaussian mixtures. This paper compares these estimation methods from a practical point of view, i.e. when the user is faced with the various requirements of real applications. In particular, it addresses the question of which method to use when the learning sample is large or small, the computational complexity entailed by the choice (by cross-validation) of external parameters such as the number of kernels and their widths in kernel mixture models, the robustness to initial conditions, etc. The expected behaviour of the estimation algorithms is derived from an algorithmic analysis; numerical experiments are used to illustrate these results.
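To make the bandwidth-selection issue concrete, the following Python sketch (illustrative only, not the paper's code) fits a Gaussian Parzen-window estimator and selects its kernel width by leave-one-out likelihood cross-validation, one example of the cross-validation schemes the abstract alludes to; the function names and the toy sample are assumptions for the sake of the example.

```python
# Minimal sketch: Gaussian Parzen-window density estimation with the
# bandwidth h chosen by leave-one-out likelihood cross-validation.
import numpy as np

def parzen_pdf(x, sample, h):
    """Gaussian Parzen-window estimate of the density at points x."""
    # One Gaussian kernel of width h is centred on every training point.
    diffs = (x[:, None] - sample[None, :]) / h
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / h

def loo_log_likelihood(sample, h):
    """Leave-one-out log-likelihood of the sample under bandwidth h."""
    n = len(sample)
    diffs = (sample[:, None] - sample[None, :]) / h
    k = np.exp(-0.5 * diffs**2) / (np.sqrt(2 * np.pi) * h)
    np.fill_diagonal(k, 0.0)            # exclude each point from its own estimate
    loo = k.sum(axis=1) / (n - 1)
    return np.log(loo + 1e-300).sum()   # small constant guards against log(0)

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=200)  # toy learning sample

# Cross-validate the kernel width over a coarse grid.
grid = np.logspace(-1.5, 0.5, 40)
best_h = max(grid, key=lambda h: loo_log_likelihood(sample, h))
print(f"selected bandwidth: {best_h:.3f}")
print("density estimate at 0:", parzen_pdf(np.array([0.0]), sample, best_h))
```

The same grid-search pattern extends to the other external parameters the abstract mentions, such as the number of kernels in a finite Gaussian mixture, at a correspondingly higher computational cost.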