Title: MA-HGCN: modality attentive hierarchical graph convolutional network for EEG-fNIRS brain-computer interface system
Authors: Jie Zhang
Addresses: Guangzhou Xinhua University, Guangzhou, Guangdong, 510000, China
Abstract: To address the challenging problem of cross-modal correlation modelling in EEG-fNIRS multimodal fusion, this paper proposes a novel multi-feature graph construction approach. Graph-structured data are constructed by extracting multiple time-domain and frequency-domain features from the raw EEG-fNIRS signals. A modality attentive hierarchical graph convolutional network then exploits the positional relationships among the scalp electrodes to extract high-level regional features. To better capture the connectivity between nodes of different modalities, the hierarchical graph convolution handles the adjacency between vertically and horizontally neighbouring electrodes independently, while the modality attention mechanism assigns distinct weights to nodes of different modalities. Experiments show that this approach achieves 96.13% and 98.25% classification accuracy on motor imagery and mental arithmetic tasks, respectively, improving accuracy by 2.51% to 5.37% over current state-of-the-art methods. Furthermore, by validating simplified feature combinations, it offers a new perspective on multimodal brain signal fusion.
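The abstract describes a modality attention mechanism that assigns different weights to EEG and fNIRS graph nodes before fusion. The paper's actual architecture is not detailed here, so the following is only a minimal NumPy sketch under assumed conventions: node features are `(n_nodes, n_features)` arrays per modality, and a hypothetical scoring vector `w` (which would be learned in the full model) produces one attention weight per modality via a softmax.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def modal_attention(eeg_feats, fnirs_feats, w):
    """Weight EEG and fNIRS graph node features by modality attention.

    eeg_feats, fnirs_feats: (n_nodes, n_features) node feature arrays.
    w: (n_features,) hypothetical scoring vector (learned in practice).
    Returns the fused node matrix and the two modality weights.
    """
    # Score each modality by projecting its mean node feature onto w.
    scores = np.array([eeg_feats.mean(axis=0) @ w,
                       fnirs_feats.mean(axis=0) @ w])
    alpha = softmax(scores)  # modality attention weights, summing to 1
    # Scale each modality's nodes before stacking them into one graph.
    fused = np.vstack([alpha[0] * eeg_feats, alpha[1] * fnirs_feats])
    return fused, alpha
```

In the full MA-HGCN these weighted nodes would then pass through the hierarchical graph convolution layers; this sketch only illustrates the per-modality weighting step.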
Keywords: BCIs; EEG-fNIRS; MA-HGCN; motor imagery; mental activity.
DOI: 10.1504/IJICT.2025.148134
International Journal of Information and Communication Technology, 2025 Vol.26 No.31, pp.40 - 71
Received: 21 May 2025
Accepted: 16 Jul 2025
Published online: 26 Aug 2025