Title: A multimodal health data fusion and deep analysis approach in smart campus systems
Authors: Fang Fu; Zhiguang Li
Addresses: School of Sport and Physical Education, North University of China, Taiyuan 030051, China; School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 06974, South Korea
Abstract: Health monitoring, a key component of smart campus systems, involves multimodal health data. To address the limited intermodal interaction in existing research, we first vectorise multimodal health data such as audio, images and text, and design a multiscale convolutional neural network (MCNN) to extract multimodal features, applying statistical pooling to obtain the standard deviation, maximum and mean of the feature vectors. A dense attention mechanism (DAM) is then designed to achieve interactive fusion of the modalities, a multivariate Gaussian distribution is used to classify health states, and a multilayer perceptron is combined with these components to construct the health data analysis algorithm. Experimental results show that the fusion efficiency of the proposed method exceeds 85% and the classification accuracy reaches 95.07%, significantly improving multimodal health data monitoring.
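The statistical-pooling step in the abstract (concatenating per-channel mean, maximum and standard deviation of the extracted features) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function name and the (time_steps, channels) feature layout are assumptions for illustration only.

```python
import numpy as np

def statistical_pooling(features: np.ndarray) -> np.ndarray:
    """Pool a (time_steps, channels) feature matrix into one fixed-length
    vector by concatenating the per-channel mean, maximum and standard
    deviation, as the abstract describes.

    NOTE: hypothetical helper; the paper's actual pooling code is not shown.
    """
    mean = features.mean(axis=0)      # per-channel average
    maximum = features.max(axis=0)    # per-channel maximum
    std = features.std(axis=0)        # per-channel standard deviation
    return np.concatenate([mean, maximum, std])

# Example: 5 frames of 4-channel features -> a 12-dimensional vector.
feats = np.arange(20, dtype=float).reshape(5, 4)
pooled = statistical_pooling(feats)
```

Pooling variable-length feature sequences into a fixed-length vector like this lets downstream components (here, the DAM fusion and the multilayer perceptron) operate on inputs of constant dimension regardless of recording length.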
Keywords: smart campus; health monitoring; multimodal data fusion; multiscale convolutional neural network; MCNN; dense attention mechanism; DAM.
DOI: 10.1504/IJICT.2025.146674
International Journal of Information and Communication Technology, 2025 Vol.26 No.17, pp.147 - 162
Received: 15 Apr 2025
Accepted: 29 Apr 2025
Published online: 11 Jun 2025