Quasi-continuous maximum entropy distribution approximation with kernel density
Online publication date: Thu, 30-Oct-2014
by Thomas Mazzoni, Elmar Reucher
International Journal of Information and Decision Sciences (IJIDS), Vol. 3, No. 4, 2011
Abstract: This paper extends maximum entropy estimation of discrete probability distributions to the continuous case. This transition leads to a non-parametric estimate of a probability density function that preserves the maximum entropy principle. Furthermore, the derived density estimate attains a minimal mean integrated squared error. In a second step, it is shown how boundary conditions can be included, resulting in a probability density function obeying maximum entropy. The criterion for deviation from a reference distribution is the Kullback-Leibler entropy. It is further shown how the characteristics of a particular distribution can be preserved by using integration kernels with mimetic properties.
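The abstract does not spell out the authors' estimator, but the general idea it builds on, non-parametric kernel density estimation with an (approximately) MISE-minimizing bandwidth, can be sketched as follows. This is a generic Gaussian-kernel example using Silverman's rule-of-thumb bandwidth, not the entropy-constrained or mimetic-kernel construction of the paper; all names here are illustrative.

```python
import math

def gaussian_kde(sample, bandwidth=None):
    """Return a kernel density estimate f(x) from a 1-D sample.

    Illustrative sketch only: a plain Gaussian kernel with Silverman's
    rule-of-thumb bandwidth (an approximate MISE-minimizing choice for
    near-Gaussian data), not the paper's maximum entropy estimator.
    """
    n = len(sample)
    if bandwidth is None:
        mean = sum(sample) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
        bandwidth = 1.06 * sd * n ** (-1 / 5)  # Silverman's rule
    h = bandwidth

    def density(x):
        # Average of Gaussian kernels centred on the sample points.
        return sum(
            math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample
        ) / (n * h * math.sqrt(2.0 * math.pi))

    return density

# Usage: estimate the density of a small sample and evaluate it.
f = gaussian_kde([0.9, 1.1, 1.0, 1.2, 0.8])
```

The resulting `f` is a smooth, everywhere-positive density; the paper's contribution, per the abstract, is choosing the kernel and constraints so that the estimate additionally satisfies maximum entropy subject to boundary conditions.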