Title: AMAA-GMM: adaptive Mexican axolotl algorithm based enhanced Gaussian mixture model to segment the cervigram images

Authors: Lalasa Mukku; Jyothi Thomas

Addresses: CHRIST (Deemed to be University), Kanmanike, Kumbalgodu, Mysore Road, Bangalore, Karnataka, 560074, India (both authors)

Abstract: Colposcopy is a crucial imaging technique for detecting cervical abnormalities. Colposcopic image evaluation, particularly the accurate delineation of the cervix region, has considerable medical significance. Cervical cancer can be detected by visual inspection with acetic acid, which turns precancerous and cancerous areas white; these aceto-white areas are signs of abnormality. However, bright white regions known as specular reflections obstruct the identification of aceto-white areas and should therefore be removed before the cervix region is segmented. Accordingly, this paper proposes specular reflection removal combined with cervix-region segmentation in colposcopy images. The proposed approach consists of two main stages: pre-processing and segmentation. In the pre-processing stage, specular reflections are detected and removed using a Swin transformer. The cervical region is then segmented using an enhanced Gaussian mixture model (EGMM), whose parameters are tuned via the adaptive Mexican axolotl optimisation (AMAO) algorithm for better segmentation accuracy. The performance of the proposed approach is analysed in terms of accuracy, sensitivity, specificity, Jaccard index, and Dice coefficient, and compared with various existing methods.
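To illustrate the segmentation and evaluation steps the abstract describes, the following minimal sketch clusters pixel intensities with a plain two-component Gaussian mixture (scikit-learn's EM-fitted `GaussianMixture` stands in for the paper's EGMM; no AMAO parameter tuning is shown) and scores the result with the Dice coefficient and Jaccard index. The synthetic image is a hypothetical stand-in for a real cervigram.

```python
# Illustrative sketch only: a basic GMM segmentation, NOT the authors' EGMM/AMAO method.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic 64x64 image: a bright circular region of interest on a darker background.
h = w = 64
yy, xx = np.mgrid[:h, :w]
truth = ((yy - 32) ** 2 + (xx - 32) ** 2) < 15 ** 2          # ground-truth mask
image = np.where(truth, 0.8, 0.2) + rng.normal(0, 0.05, (h, w))

# Fit a 2-component Gaussian mixture to the pixel intensities and label each pixel.
gmm = GaussianMixture(n_components=2, random_state=0).fit(image.reshape(-1, 1))
labels = gmm.predict(image.reshape(-1, 1)).reshape(h, w)

# Component labels are arbitrary, so pick the brighter component as the foreground.
bright = int(np.argmax(gmm.means_.ravel()))
pred = labels == bright

# Overlap metrics named in the paper's evaluation.
inter = np.logical_and(pred, truth).sum()
dice = 2 * inter / (pred.sum() + truth.sum())
jaccard = inter / np.logical_or(pred, truth).sum()
print(f"Dice: {dice:.3f}, Jaccard: {jaccard:.3f}")
```

Because the two intensity classes here are well separated, both scores come out high; on real cervigrams, specular highlights would confuse such a model, which motivates the pre-processing stage.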

Keywords: Gaussian mixture models; machine learning; segmentation; metaheuristics; deep learning; enhanced Gaussian mixture model; EGMM; adaptive Mexican axolotl optimisation; AMAO.

DOI: 10.1504/IJRIS.2026.150612

International Journal of Reasoning-based Intelligent Systems, 2026 Vol.18 No.1, pp.33 - 40

Received: 01 Jun 2023
Accepted: 22 Nov 2023

Published online: 18 Dec 2025
