Title: Brain tumour classification in MRI: self-improved osprey optimised U-Net model for segmentation and fused deepnet model-based classification

Authors: P. Devisivasankari; K. Lavanya

Addresses: School of Computer Science and Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India; School of Computer Science and Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India

Abstract: Tumours rank as the tenth most prevalent cause of death globally, and brain tumours represent a particularly serious condition: the uncontrolled proliferation of brain cells disrupts normal brain function. Because the causes of brain tumours remain unknown, early detection, identification, and treatment are essential. In recent decades, researchers have explored novel technologies for brain tumour diagnosis, but traditional diagnostic methods perform poorly. Our brain tumour classification model addresses this issue with advanced deep learning (DL) classifiers and a rapid-convergence, self-improved osprey optimisation. The proposed SIOO-U-Net pipeline begins with enhanced image fusion and Wiener filtering. The pre-processed image is segmented using SIOO-U-Net. The segmented map is then used to extract features, including improved local gradient increasing pattern (LGIP), residual network (ResNet), and Visual Geometry Group 16 (VGG16) features. These extracted features feed a novel hybrid DL classification model that combines PyramidNet and Bi-GRU classifiers. Classification outcomes are further improved by Bruce's formula-based score-level fusion with weight-initialisation conditions. We evaluated the proposed SIOO-U-Net-based brain tumour classification model on the BraTS 2015 dataset.
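The score-level fusion step combines the per-class outputs of the PyramidNet and Bi-GRU branches. The paper's specific Bruce's-formula weighting is not reproduced here; the sketch below shows generic weighted score-level fusion, with the fixed weights `w_a` and `w_b` standing in as placeholders for the learned/initialised weights.

```python
def fuse_scores(scores_a, scores_b, w_a=0.5, w_b=0.5):
    """Weighted score-level fusion of two classifiers' per-class scores.

    scores_a, scores_b: per-class probabilities from the two branches.
    w_a, w_b: fusion weights. The paper derives these via Bruce's formula
    and weight-initialisation conditions; fixed values here are placeholders.
    """
    return [w_a * a + w_b * b for a, b in zip(scores_a, scores_b)]

def classify(fused_scores):
    """Return the index of the class with the highest fused score."""
    return max(range(len(fused_scores)), key=fused_scores.__getitem__)

# Hypothetical per-class scores from the PyramidNet and Bi-GRU branches
pyramidnet_scores = [0.2, 0.7, 0.1]
bigru_scores = [0.3, 0.5, 0.2]
fused = fuse_scores(pyramidnet_scores, bigru_scores)
print(classify(fused))  # index of the predicted tumour class
```

With equal weights, fusion simply averages the two branches' scores; unequal weights let the stronger branch dominate where its scores are more reliable.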

Keywords: brain tumour classification; MRI; SIOO-U-Net segmentation; PyramidNet; score-level fusion; visual geometry group 16; VGG16.

DOI: 10.1504/IJBRA.2024.140007

International Journal of Bioinformatics Research and Applications, 2024 Vol.20 No.3, pp.287 - 321

Received: 16 Oct 2023
Accepted: 02 Jan 2024

Published online: 15 Jul 2024
