Approximating the covariance matrix of GMMs with low-rank perturbations
by Malik Magdon-Ismail; Jonathan T. Purnell
International Journal of Data Mining, Modelling and Management (IJDMMM), Vol. 4, No. 2, 2012

Abstract: Covariance matrices capture correlations that are invaluable in modelling real-life datasets. Using all d² elements of the covariance (in d dimensions) is costly and can result in over-fitting, while the simple diagonal approximation can be over-restrictive. In this work, we present a new model, the low-rank Gaussian mixture model (LRGMM), for modelling data, which can be extended to identifying partitions or overlapping clusters. The curse of dimensionality that arises in calculating the covariance matrices of the GMM is countered by using low-rank perturbed diagonal matrices. The efficiency is comparable to the diagonal approximation, yet correlations among the dimensions can still be captured. Our experiments reveal the LRGMM to be an efficient and highly applicable tool for working with large high-dimensional datasets.

Online publication date: Sat, 23-Aug-2014
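The full text is paywalled, but the covariance structure the abstract describes can be illustrated directly. Below is a minimal NumPy sketch of evaluating a Gaussian log-density under a low-rank perturbed diagonal covariance Sigma = diag(d) + VVᵀ, using the Woodbury identity and the matrix determinant lemma so the cost is O(dk²) for rank k instead of O(d³) for a full covariance. The function name and this particular evaluation route are our own assumptions for illustration, not the authors' published algorithm.

```python
import numpy as np

def lowrank_gaussian_logpdf(x, mu, d_diag, V):
    """Log-density of N(mu, Sigma) with Sigma = diag(d_diag) + V @ V.T.

    Uses the Woodbury identity and the matrix determinant lemma so the
    cost is O(d * k^2) for a rank-k perturbation, not O(d^3).
    """
    d = d_diag.shape[0]
    k = V.shape[1]
    r = x - mu                        # residual, shape (d,)
    Dinv_r = r / d_diag               # D^{-1} r
    Dinv_V = V / d_diag[:, None]      # D^{-1} V, shape (d, k)
    # Capacitance matrix C = I_k + V^T D^{-1} V  (small, k x k)
    C = np.eye(k) + V.T @ Dinv_V
    # Woodbury: r^T Sigma^{-1} r = r^T D^{-1} r
    #           - (D^{-1} r)^T V C^{-1} V^T (D^{-1} r)
    quad = r @ Dinv_r - (Dinv_r @ V) @ np.linalg.solve(C, V.T @ Dinv_r)
    # Determinant lemma: log|Sigma| = log|C| + sum(log d_diag)
    logdet = np.linalg.slogdet(C)[1] + np.sum(np.log(d_diag))
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

# Example: a 1000-dimensional Gaussian with a rank-3 perturbation.
rng = np.random.default_rng(0)
d, k = 1000, 3
mu = np.zeros(d)
d_diag = rng.uniform(0.5, 2.0, size=d)   # positive diagonal entries
V = 0.1 * rng.normal(size=(d, k))        # low-rank factor
x = rng.normal(size=d)
print(lowrank_gaussian_logpdf(x, mu, d_diag, V))
```

This is the standard trick that makes such a parameterisation practical: only the k-by-k capacitance matrix is ever factored, so per-point evaluation stays close to the cost of a diagonal covariance while still capturing cross-dimensional correlations.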
