Title: Liver tumour segmentation and classification using MV3CNN-KHO: a combination of multiparameterised inception V3 CNN and Krill Herd optimisation
Authors: A. Bathsheba Parimala; R.S. Shanmugasundaram
Addresses: Department of Computer Science, Vinayaka Missions Research Foundation, Salem, 636308, Tamil Nadu, India; Department of Computer Science, Vinayaka Missions Research Foundation, Salem, 636308, Tamil Nadu, India
Abstract: The segmentation and classification of liver tumours are crucial in medical imaging, aiding early detection and treatment planning for liver diseases. Deep learning-based liver lesion segmentation has the potential to enhance the precision and effectiveness of liver disease detection. Recent studies have shown promising results in liver cancer prediction using convolutional neural network (CNN)-based techniques. This work proposes a multiparameterised Inception V3 CNN to improve feature extraction for liver cancer prediction. Additionally, Krill Herd optimisation (KHO) is applied to identify ideal hyperparameters, further enhancing the system's performance. By integrating KHO, the proposed model achieves higher accuracy in predicting liver cancer, benefiting both patients and medical professionals. The study, conducted on the liver tumour segmentation (LiTS) dataset, evaluates accuracy, sensitivity, and specificity, with the MV3CNN-KHO model achieving 96% accuracy, 0.96 sensitivity, and 0.94 specificity. The system was implemented in Python using Jupyter Notebook. The optimised system offers an improved solution for liver cancer detection and prognosis, making it a valuable tool in medical imaging.
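The three metrics the abstract reports (accuracy, sensitivity, specificity) follow standard confusion-matrix definitions. A minimal sketch of their computation, with purely hypothetical counts chosen for illustration (not taken from the paper):

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)  # true positive rate (recall)
    specificity = tn / (tn + fp)  # true negative rate
    return accuracy, sensitivity, specificity

# Hypothetical example counts, for illustration only
acc, sens, spec = classification_metrics(tp=96, fp=6, tn=94, fn=4)
print(round(acc, 2), round(sens, 2), round(spec, 2))  # 0.95 0.96 0.94
```

Sensitivity measures how many true tumour cases are caught; specificity measures how many healthy cases are correctly ruled out, which is why both are reported alongside raw accuracy.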
Keywords: segmentation; classification; multiparameterised Inception V3; KHO; Krill Herd optimisation; convolutional neural network; medical imaging; feature extraction; model evaluation; accuracy analysis.
DOI: 10.1504/IJCBDD.2025.146185
International Journal of Computational Biology and Drug Design, 2025 Vol.16 No.3, pp.212 - 232
Received: 22 Nov 2023
Accepted: 23 Aug 2024
Published online: 09 May 2025