Integrated models and features-based speaker independent emotion recognition
by C. Jeyalakshmi; A. Revathi; Y. Venkataramani
International Journal of Telemedicine and Clinical Practices (IJTMCP), Vol. 1, No. 3, 2016

Abstract: Speech emotion recognition has become a challenging task in speech technology, as it is essential for better and more effective human-machine interaction. Improving the accuracy of a speaker-independent emotion recognition system is especially challenging when the database contains speech samples of a limited number of utterances spoken by a limited set of speakers. This paper discusses the use of integrated features and integrated models for improving the accuracy of the emotion recognition system. Integrated features are obtained by concatenating the probability with the normal system features. Integrated models are obtained by combining clustering and continuous density hidden Markov models. Integrated models using the Mel-frequency perceptual linear predictive cepstrum concatenated with the probability provide a better accuracy of 89% for recognising emotions on EMO-DB, which contains speech samples of ten different utterances from ten different speakers in emotions such as anger, boredom, disgust, fear, happiness, neutral and sadness, with spectral analysis applied as an additional pre-processing technique.
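The full method is only described in the paywalled article, but the abstract outlines a recognisable pipeline: per-frame cepstral features concatenated with a probability score ("integrated features"), and clustering combined with continuous-density HMMs ("integrated models"). The sketch below is a rough, non-authoritative illustration of that pipeline under stated assumptions: librosa MFCCs stand in for the Mel-frequency perceptual linear predictive cepstrum, a GMM frame log-likelihood stands in for the concatenated probability, and k-means centroids used to initialise a per-emotion GaussianHMM stand in for the clustering/CDHMM combination. None of these choices are confirmed by the paper.

```python
# Minimal sketch of an "integrated features + integrated models" pipeline,
# assuming: MFCCs as the base cepstral features, a GMM log-likelihood as the
# "probability" feature, and k-means + GaussianHMM as the integrated model.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans
from hmmlearn import hmm

N_MFCC = 13      # base cepstral coefficients per frame (assumed)
N_GMM = 8        # GMM components for the probability feature (assumed)
N_STATES = 5     # HMM states per emotion model (assumed)

def base_features(wav_path):
    """Per-frame cepstral features; MFCCs stand in for MFPLP here."""
    y, sr = librosa.load(wav_path, sr=16000)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC).T   # (frames, N_MFCC)

def integrated_features(wav_path, gmm):
    """Base features concatenated with a per-frame probability score."""
    base = base_features(wav_path)
    log_prob = gmm.score_samples(base)[:, None]                # (frames, 1)
    return np.hstack([base, log_prob])                         # (frames, N_MFCC + 1)

def train_emotion_models(files_by_emotion):
    """files_by_emotion: dict mapping emotion label -> list of wav paths."""
    # Fit one GMM on all base features so every frame gets a probability score.
    all_base = np.vstack([base_features(f)
                          for paths in files_by_emotion.values() for f in paths])
    gmm = GaussianMixture(n_components=N_GMM, random_state=0).fit(all_base)

    models = {}
    for emotion, paths in files_by_emotion.items():
        feats = [integrated_features(f, gmm) for f in paths]
        X = np.vstack(feats)
        # Cluster the integrated features and use the centroids to initialise a
        # continuous-density HMM for this emotion (the "integrated model" idea).
        km = KMeans(n_clusters=N_STATES, n_init=10, random_state=0).fit(X)
        model = hmm.GaussianHMM(n_components=N_STATES, covariance_type="diag",
                                n_iter=20, init_params="stc")
        model.means_ = km.cluster_centers_
        model.fit(X, lengths=[len(f) for f in feats])
        models[emotion] = model
    return gmm, models

def recognise(wav_path, gmm, models):
    """Return the emotion whose HMM scores the utterance highest."""
    feats = integrated_features(wav_path, gmm)
    return max(models, key=lambda e: models[e].score(feats))
```

Usage would follow the speaker-independent protocol implied by the abstract: train the per-emotion models on utterances from a subset of EMO-DB speakers and call recognise() on utterances from held-out speakers.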

Online publication date: Fri, 22-Jul-2016
