Rule grouping and multiple minimum support thresholds for semantic multi-label associative classifier using feature reoccurrences
Online publication date: Sat, 05-Aug-2017
by Preeti A. Bailke; S.T. Patil
International Journal of Data Mining, Modelling and Management (IJDMMM), Vol. 9, No. 2, 2017
Abstract: Multi-label classification is an important task in data mining. Supervised classification has been addressed and extensively studied, and has vast applications across many domains. Associative classifiers perform well, but several of their issues still need to be addressed. This paper handles the class imbalance problem, semantically organises the vast number of generated rules, and applies only the relevant rules during classification. An algorithm called semantic multi-label associative classifier using feature reoccurrences (SeMACR) is proposed. Considering the reoccurrence of features while generating rules proves beneficial, particularly for text documents. The class imbalance problem is handled through balanced training and the use of multiple minimum support thresholds based on the class distribution. A novel semantic approach is proposed for grouping association rules using a relatedness score between features rather than a traditional distance-based measure. Such an organisation of rules makes them manageable and interpretable. During classification, only the relevant rules, i.e., the rules in the semantically most related group, are applied. The SeMACR algorithm shows improved or comparable performance relative to state-of-the-art techniques.
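The abstract does not give the paper's exact formula, but the idea of multiple minimum support thresholds derived from the class distribution can be sketched as follows: each class receives a threshold scaled by its relative frequency, so that rules predicting rare classes are not pruned by a single global minimum support. The function name, the `base_minsup` parameter, and the proportional scaling scheme are illustrative assumptions, not the authors' definitions.

```python
from collections import Counter

def per_class_min_support(labels, base_minsup=0.05):
    """Assign each class a minimum support threshold scaled by its
    relative frequency (illustrative scheme, not the paper's formula).
    Rare classes get lower thresholds, so their rules survive pruning."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: base_minsup * count / total for cls, count in counts.items()}

# Example: a skewed label distribution over three hypothetical classes
labels = ["sports"] * 80 + ["finance"] * 15 + ["health"] * 5
thresholds = per_class_min_support(labels)
# Rare classes receive proportionally lower thresholds than frequent ones
```

A rule mined for a given class would then be kept only if its support meets that class's own threshold, rather than one global cutoff.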