A comparison of generalised maximum entropy and ordinary least square
by Manije Sanei Tabass; G.R. Mohtashami Borzadaran
International Journal of Information and Decision Sciences (IJIDS), Vol. 10, No. 4, 2018

Abstract: The generalised maximum entropy (GME) estimation method is based on the classic maximum entropy approach of Jaynes (1957). It can estimate the parameters of a regression model without imposing any constraints on the probability distribution of the errors, and it remains robust even for ill-posed problems. In this paper, we simulate two datasets from a regression model with different disturbance distributions, standard normal and Cauchy respectively. For these datasets, the regression coefficients are estimated by the GME and ordinary least squares (OLS) methods, and the two techniques are compared for several sample sizes. Moreover, we use prior information on the parameters to obtain the GME estimators. The estimation results of GME in the case of non-normally distributed errors are discussed.

Online publication date: Mon, 08-Oct-2018
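
To give a feel for how such a comparison can be set up, the sketch below simulates a small regression with standard normal and Cauchy disturbances and estimates the coefficients by OLS and by the dual (concentrated) form of the GME estimator of Golan, Judge and Miller. It is a minimal illustration, not the authors' code: the sample size, the true coefficients, and the support points z_support (the assumed prior bounds on the coefficients) and v_support (the error supports) are assumptions chosen only for this example.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import logsumexp, softmax

    def gme_regression(X, y, z_support, v_support):
        # Dual (concentrated) GME: each coefficient is a convex combination of the
        # support points in z_support, each error a convex combination of v_support,
        # and the entropy of those weights is maximised subject to y = X beta + e.
        # The dual minimises a convex function of the n Lagrange multipliers.
        n, K = X.shape
        Z = np.tile(z_support, (K, 1))              # K x M coefficient supports
        V = np.tile(v_support, (n, 1))              # n x J error supports

        def weights(lam):
            a = X.T @ lam                           # (K,) aggregated multipliers
            P = softmax(-Z * a[:, None], axis=1)    # coefficient weights p_km
            W = softmax(-V * lam[:, None], axis=1)  # error weights w_ij
            return P, W

        def dual(lam):
            a = X.T @ lam
            return (lam @ y
                    + logsumexp(-Z * a[:, None], axis=1).sum()
                    + logsumexp(-V * lam[:, None], axis=1).sum())

        def grad(lam):
            P, W = weights(lam)
            beta = (Z * P).sum(axis=1)
            e = (V * W).sum(axis=1)
            return y - X @ beta - e                 # zero when the data constraints hold

        res = minimize(dual, np.zeros(n), jac=grad, method="BFGS",
                       options={"maxiter": 5000})
        P, _ = weights(res.x)
        return (Z * P).sum(axis=1)                  # estimated beta

    rng = np.random.default_rng(0)
    n, beta_true = 50, np.array([1.0, 2.0])
    X = np.column_stack([np.ones(n), rng.normal(size=n)])

    for label, noise in [("normal", rng.standard_normal(n)),
                         ("Cauchy", rng.standard_cauchy(n))]:
        y = X @ beta_true + noise
        b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
        # Prior information enters through the supports: z_support says each
        # coefficient is believed to lie in [-10, 10]; the error support is made
        # wide enough to cover the sample (the GME literature often uses a
        # three-sigma rule instead).
        v_bound = 1.5 * (np.abs(y).max() + 1.0)
        b_gme = gme_regression(X, y,
                               z_support=np.array([-10.0, 0.0, 10.0]),
                               v_support=np.array([-v_bound, 0.0, v_bound]))
        print(f"{label:7s}  OLS: {np.round(b_ols, 3)}  GME: {np.round(b_gme, 3)}")

The contrast studied in the paper shows up directly in this set-up: under Cauchy disturbances a single OLS fit can be pulled far from the true coefficients by outliers, whereas the GME estimate is confined to the assumed coefficient support, which is how the prior information constrains the estimator.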
