Cross-corpus classification of affective speech
by Imen Trabelsi; Med Salim Bouhlel
International Journal of Advanced Intelligence Paradigms (IJAIP), Vol. 22, No. 3/4, 2022

Abstract: Automatic speech emotion recognition still has to overcome several obstacles before it can be employed in realistic situations. One of these barriers is the lack of suitable training data, in both quantity and quality. The aim of this study is to investigate the effect of cross-corpus data on the automatic classification of emotional speech. In this work, feature vectors composed of Mel-frequency cepstral coefficients (MFCCs) extracted from the speech signal are used to train support vector machines (SVMs) and Gaussian mixture models (GMMs). The study evaluates three emotional databases in three different languages (English, Polish and German) under three cross-corpus strategies. In the intra-corpus scenario, accuracies varied widely between 70% and 87%. In the inter-corpus scenario, the average recall obtained was 70.87%. Accuracies in the cross-corpus scenario fell below 50%.
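The abstract describes a standard pipeline: MFCC feature vectors feed both a discriminative classifier (SVM) and a generative one (one GMM per emotion class, with classification by maximum log-likelihood). A minimal sketch of that comparison, using scikit-learn and synthetic Gaussian clusters as stand-ins for real per-utterance MFCC vectors (the class means, dimensionality, and GMM component count here are illustrative assumptions, not values from the paper):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_mfcc = 13  # a common MFCC dimensionality; assumed, not from the paper

# Synthetic stand-ins for MFCC feature vectors of two emotion classes
X_train = np.vstack([rng.normal(0.0, 1.0, (50, n_mfcc)),
                     rng.normal(3.0, 1.0, (50, n_mfcc))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.vstack([rng.normal(0.0, 1.0, (10, n_mfcc)),
                    rng.normal(3.0, 1.0, (10, n_mfcc))])
y_test = np.array([0] * 10 + [1] * 10)

# Discriminative route: a single SVM over all classes
svm = SVC(kernel="rbf").fit(X_train, y_train)
svm_acc = (svm.predict(X_test) == y_test).mean()

# Generative route: one GMM per class, decide by max log-likelihood
gmms = {c: GaussianMixture(n_components=2, random_state=0)
           .fit(X_train[y_train == c]) for c in (0, 1)}
loglik = np.column_stack([gmms[c].score_samples(X_test) for c in (0, 1)])
gmm_acc = (loglik.argmax(axis=1) == y_test).mean()

print(f"SVM accuracy: {svm_acc:.2f}")
print(f"GMM accuracy: {gmm_acc:.2f}")
```

In a cross-corpus setting, the train and test splits above would come from different databases (e.g. train on the German corpus, test on the Polish one), which is what drives the accuracy drop the abstract reports.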

Online publication date: Fri, 22-Jul-2022
