Title: Rupture risk prediction of intracranial aneurysm by using gene expression data mining and intelligent optimisation algorithm
Authors: Yueling Xiong; Mingquan Ye; Peipei Wang; Qingqing Li
Addresses: Translational Medicine Centre, The Second Affiliated Hospital, Wannan Medical College, Wuhu, China; School of Medical Information, Wannan Medical College, Wuhu, China and Institute of Artificial Intelligence, Hefei Comprehensive National Science Centre, Hefei, China; School of Medical Information, Wannan Medical College, Wuhu, China; School of Medical Information, Wannan Medical College, Wuhu, China
Abstract: Intracranial aneurysm (IA) rupture can precipitate severe subarachnoid haemorrhage. Despite the importance of uncovering key disease traits from high-throughput gene expression data, the application of machine learning to identify informative genes linked to IA rupture remains limited. Hence, we present a novel machine-learning model, built on intelligent optimisation algorithms, to forecast IA rupture status and pinpoint efficacious informative genes. The model integrates adaptive boosting (AdaBoost) with particle swarm optimisation (PSO) to eliminate redundant genes, followed by ReliefF for further refinement. This yields a small set of informative genes that fully represents the IA rupture state, which we evaluated using various classification models. The experimental results showed that the proposed particle swarm optimisation-adaptive boosting-ReliefF (PSO-AdaBoost-ReliefF) algorithm achieved significant improvements on all evaluation metrics. Additionally, Gene Ontology (GO) and enrichment analyses were performed to reveal gene-IA associations. The PSO-AdaBoost-ReliefF model can effectively mine informative genes and accurately evaluate rupture status, while potentially identifying new target genes.
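The ReliefF refinement stage mentioned in the abstract can be sketched as follows. This is a simplified, hypothetical illustration (Relief-style weighting with a single nearest hit/miss neighbour), not the authors' implementation; the function name `relieff_scores` and the toy expression data are assumptions for demonstration only.

```python
import numpy as np

def relieff_scores(X, y):
    """Simplified ReliefF-style feature weighting (k = 1 neighbour).

    For each sample, find its nearest same-class neighbour (hit) and
    nearest different-class neighbour (miss); features that differ
    little from hits but a lot from misses receive higher weights.
    Hypothetical sketch, not the paper's exact procedure.
    """
    n, d = X.shape
    scale = X.max(axis=0) - X.min(axis=0) + 1e-12  # per-feature range
    W = np.zeros(d)
    for i in range(n):
        diffs = np.abs(X - X[i]) / scale           # normalised feature diffs
        dist = diffs.sum(axis=1)                   # Manhattan distance
        dist[i] = np.inf                           # exclude the sample itself
        same = (y == y[i])
        same[i] = False
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        W += (diffs[miss] - diffs[hit]) / n        # reward class-separating features
    return W

# Toy data standing in for gene expression: feature 0 separates the two
# "rupture status" classes, feature 1 is pure noise (assumed example).
rng = np.random.default_rng(42)
y = np.array([0] * 20 + [1] * 20)
X = np.column_stack([y + 0.1 * rng.standard_normal(40),
                     rng.standard_normal(40)])
scores = relieff_scores(X, y)
```

In a pipeline like the one described, genes would first be pruned by the PSO-driven AdaBoost stage, and weights such as `scores` would then rank the surviving genes so that only the top few are passed to the final classifiers.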
Keywords: intracranial aneurysm; IA; informative gene selection; PSO-AdaBoost-ReliefF; rupture status; gene expression data mining.
DOI: 10.1504/IJBIC.2025.148394
International Journal of Bio-Inspired Computation, 2025 Vol.26 No.1, pp.24 - 34
Received: 17 Oct 2024
Accepted: 07 May 2025
Published online: 03 Sep 2025