The extended Kullback-Leibler divergence measure in the unknown probability density function cases and applications
by Hoa Le; Hoang Van Truong; Pham The Bao
International Journal of Intelligent Information and Database Systems (IJIIDS), Vol. 14, No. 4, 2021

Abstract: The Kullback-Leibler divergence measure is used to evaluate the similarity between two probability distributions. In theory, the probability density functions are known before the formula is applied. However, estimating these density functions from real data is challenging, so the Kullback-Leibler divergence needs to be modified to serve as a similarity measure in such cases. In this paper, we propose an extended Kullback-Leibler divergence similarity measure and evaluate it through two experiments. The first experiment uses two datasets whose probability density functions are both unknown, while the second uses one dataset with an unknown probability density function and another with a known one. In addition, the proposed method is applied to simulated data and to plagiarism detection.

Online publication date: Thu, 28-Oct-2021
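The abstract does not give the extended measure itself, but the underlying problem can be illustrated. The classical divergence KL(P || Q) = ∫ p(x) log(p(x)/q(x)) dx requires the densities p and q, which are unknown for real data. Below is a minimal sketch, not the authors' method, that estimates the divergence from two sample sets by substituting Gaussian kernel density estimates for the unknown densities; the function name, grid size, and smoothing constant are illustrative assumptions.

    import numpy as np
    from scipy.stats import gaussian_kde

    def kl_divergence_from_samples(x, y, grid_size=512, eps=1e-12):
        """Estimate KL(P || Q) from samples of P and Q via Gaussian KDE.

        A generic sample-based estimator, not the paper's extended
        measure; it only illustrates the unknown-density setting.
        """
        # Fit a kernel density estimate to each sample set.
        p_kde = gaussian_kde(x)
        q_kde = gaussian_kde(y)

        # Evaluate both densities on a common grid covering all samples.
        lo = min(x.min(), y.min())
        hi = max(x.max(), y.max())
        grid = np.linspace(lo, hi, grid_size)
        p = p_kde(grid) + eps  # eps avoids log(0) and division by zero
        q = q_kde(grid) + eps

        # Normalise to discrete distributions on the grid, then apply
        # the discrete KL formula: sum over the grid of p * log(p / q).
        p /= p.sum()
        q /= q.sum()
        return float(np.sum(p * np.log(p / q)))

    # Usage: two Gaussian samples with shifted means should give a
    # small positive divergence; identical samples give roughly zero.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 1000)
    y = rng.normal(0.5, 1.0, 1000)
    print(kl_divergence_from_samples(x, y))

Note that such plug-in estimators inherit the bias of the density estimator, which is one motivation for modifying the divergence when densities must be estimated from data.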
