Title: MLNMF: multi-label learning based on non-negative matrix factorisation

Authors: Yu Han; Cheng Shao; ShouTao Yang; Weiwei Deng

Addresses: Dalian University of Technology, Dalian, China (all authors)

Abstract: Multi-label learning deals with the problem where each instance is associated with a set of labels. The task is to predict the label sets of unseen instances by training on instances with known label sets. Existing approaches make predictions by learning from the distribution of multi-label instances; however, the direct relevance between features and labels has often been overlooked in the literature. In this paper, a multi-label learning approach named MLNMF is presented, which is derived from the traditional non-negative matrix factorisation (NMF) algorithm. In detail, we first propose a label probability prediction model (LPPM), generated with the NMF method, to capture the direct relevance between features and labels. We then exploit the decision stump method to generate a classifier for each label. The proposed method is a first-order approach, assuming that the labels are independent of each other. Compared to existing approaches for multi-label learning, the proposed approach has the advantage of exploring the direct correlation between features and labels, and its operating efficiency is much higher than that of other algorithms. The experimental results on a total of nine benchmark datasets illustrate the competitiveness of MLNMF against some well-established multi-label learning algorithms.
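The two-step pipeline the abstract describes (an NMF-based label probability model, followed by a per-label thresholding classifier) could be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the joint factorisation of the stacked matrix [X | Y], the toy data, and the helper names (`nmf`, `label_scores`) are all assumptions made for the sketch.

```python
import numpy as np

def nmf(V, k, iters=200, eps=1e-9):
    """Basic multiplicative-update NMF: V ~= W @ H, all factors non-negative."""
    rng = np.random.default_rng(0)
    n, m = V.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(iters):
        # standard Lee-Seung updates for the Frobenius objective
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Hypothetical toy data: X = non-negative features, Y = binary label matrix.
X = np.array([[1., 0., 2.], [0., 1., 1.], [2., 0., 1.], [0., 2., 0.]])
Y = np.array([[1., 0.], [0., 1.], [1., 0.], [0., 1.]])

# Factorise the stacked [X | Y] matrix so the shared latent factors
# directly link the feature block to the label block.
V = np.hstack([X, Y])
W, H = nmf(V, k=2)
Hx, Hy = H[:, :X.shape[1]], H[:, X.shape[1]:]

def label_scores(x, Hx, Hy, iters=200, eps=1e-9):
    """Estimate a new instance's latent coding from features alone
    (non-negative least squares via multiplicative updates), then read
    off label scores through the label-side factor Hy."""
    w = np.full(Hx.shape[0], 0.5)
    for _ in range(iters):
        w *= (Hx @ x) / (Hx @ Hx.T @ w + eps)
    return w @ Hy

scores = label_scores(X[0], Hx, Hy)
# A per-label decision stump would then threshold each score to
# produce the final binary label set.
```

The per-label decision stump of the second stage is simply a learned cut-off on each label's score, which is what makes the method first-order: each label's classifier is fitted independently of the others.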

Keywords: multi-label learning; non-negative matrix factorisation; algorithm; prediction model.

DOI: 10.1504/IJMIC.2018.093545

International Journal of Modelling, Identification and Control, 2018 Vol.30 No.1, pp.1 - 8

Received: 28 Mar 2017
Accepted: 31 Aug 2017

Published online: 27 Jul 2018
