A recursive Renyi's entropy estimator
Abstract
Estimating the entropy of a sample set is required in numerous learning scenarios that involve information-theoretic optimization criteria. A number of entropy estimators are available in the literature; however, they require a batch of samples to operate on before yielding an estimate. We derive a recursive formula to estimate Renyi's (1970) quadratic entropy on-line, using each new sample to update the entropy estimate, so that the estimate becomes more accurate in stationary situations and can track the changing entropy of a signal in nonstationary situations.
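For a concrete picture of what an on-line update of this kind can look like, the sketch below is an illustrative Python example rather than the recursion derived in the paper. It uses the fact that Renyi's quadratic entropy can be written as H2 = -log V, where V = E[p(X)] is the information potential, estimates V with a Gaussian Parzen window over a short buffer of recent samples, and blends each new sample's contribution in with a forgetting factor. The class name, window length, and forgetting-factor value are assumptions made for this example.

```python
import numpy as np
from collections import deque

def gauss_kernel(u, sigma):
    """Gaussian kernel with bandwidth sigma (1-D)."""
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

class RecursiveRenyiQuadraticEntropy:
    """Illustrative online estimator of Renyi's quadratic entropy H2 = -log V.
    The information potential V is tracked with an exponentially weighted
    recursion over a sliding window of recent samples. This is a sketch under
    assumed design choices, not the exact recursion of the paper."""

    def __init__(self, sigma=0.5, window=50, forgetting=0.05):
        self.sigma = sigma              # Parzen kernel bandwidth
        self.window = window            # number of past samples kept
        self.lam = forgetting           # forgetting factor in (0, 1)
        self.buffer = deque(maxlen=window)
        self.V = None                   # running information-potential estimate

    def update(self, x_new):
        """Incorporate one new sample and return the current entropy estimate."""
        if self.buffer:
            past = np.asarray(self.buffer)
            # Kernel evaluations between the new sample and recent samples;
            # the effective bandwidth sigma*sqrt(2) comes from convolving
            # two Gaussian Parzen kernels.
            v_new = gauss_kernel(x_new - past, self.sigma * np.sqrt(2.0)).mean()
            if self.V is None:
                self.V = v_new
            else:
                # Exponentially weighted recursion: the old estimate decays,
                # the new sample's contribution is blended in.
                self.V = (1.0 - self.lam) * self.V + self.lam * v_new
        self.buffer.append(x_new)
        if self.V is None or self.V <= 0.0:
            return None
        return -np.log(self.V)          # quadratic entropy estimate

# Usage: track the entropy of a signal whose variance changes halfway through.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 3, 500)])
est = RecursiveRenyiQuadraticEntropy(sigma=0.5, window=50, forgetting=0.05)
entropies = [est.update(x) for x in signal]
print(entropies[499], entropies[-1])   # estimate rises after the variance increase
```

In this sketch the forgetting factor plays the role of a memory depth: a larger value tracks nonstationary changes faster, while a smaller value averages over more samples and gives a smoother estimate in stationary conditions.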
Keywords
#Entropy #Recursive estimation #Information theory #Adaptive systems #Biomedical computing #Biomedical engineering #Yield estimation #Digital communication #Neural networks #Stochastic processes
References
10.1109/NNSP.2001.943148
Erdogmus, 2001, An On-Line Adaptation Algorithm for Adaptive System Training with Minimum Error Entropy: Stochastic Information Gradient, Proc. of Independent Component Analysis 2001 (ICA'01)
Renyi, 1970
Parzen, 1967, On Estimation of a Probability Density Function and Mode, Time Series Analysis Papers
Principe, 2000, Information Theoretic Learning, Unsupervised Adaptive Filtering, 1
Erdogmus, 2002, Information Theoretic Learning: Renyi's Entropy and Its Applications to Adaptive System Training
10.1109/TNN.2002.1031936
10.1109/TSP.2002.1011217
10.1088/0954-898X/3/2/009
Fisher, 1997, Nonlinear Extensions to the Minimum Average Correlation Energy Filter
Linsker, 1988, An Application of the Principle of Maximum Information Preservation to Linear Systems
10.1162/neco.1993.5.1.45
10.1109/9.587329
10.1002/j.1538-7305.1948.tb01338.x
Erdogmus, 2001, Convergence Analysis of the Information Potential Criterion in Adaline Training, Proc. Neural Networks for Signal Processing (NNSP), 123