Open Access Article

Title: English text classification model based on graph neural networks and contrastive learning

Authors: Yingying Cao

Addresses: Department of Education and Art, Minxi Vocational & Technical College, Longyan 364000, China

Abstract: To address traditional graph neural networks' (GNNs) neglect of word-order features and sensitivity to adversarial noise, this study proposes DGCL-TC, a text classification model integrating dual-graph fusion and adaptive contrastive learning. The framework leverages bidirectional encoder representations from transformers (BERT) to encode contextual semantics and constructs dual graphs capturing local and global text structures. A learnable augmentation module dynamically generates contrastive views via node dropout and attribute masking, optimising representations through cross-view consistency. A gated graph attention network fuses topology-aware graph features with BERT embeddings, balancing structural and sequential cues. Evaluations on benchmark datasets confirm that DGCL-TC significantly outperforms baseline methods in accuracy and robustness, particularly under adversarial perturbations and sparse data conditions. The model advances text classification by unifying semantic, structural, and noise-resistant representation learning.
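The learnable augmentation step described above, which generates contrastive views via node dropout and attribute masking, can be sketched as follows. This is a minimal illustration of the two perturbations on an adjacency/feature pair, not the authors' implementation; the function name, parameters, and zero-masking convention are assumptions for exposition.

```python
import numpy as np

def augment_graph(adj, feats, node_drop_p=0.1, attr_mask_p=0.2, seed=None):
    """Generate one contrastive view of a graph.

    adj:   (N, N) adjacency matrix
    feats: (N, D) node feature (attribute) matrix

    Node dropout zeroes the rows/columns of dropped nodes in the
    adjacency and their feature rows; attribute masking zeroes
    individual feature entries independently.
    """
    rng = np.random.default_rng(seed)
    n, d = feats.shape
    keep_node = rng.random(n) >= node_drop_p        # True = node retained
    keep_attr = rng.random((n, d)) >= attr_mask_p   # True = attribute retained
    adj_view = adj * keep_node[:, None] * keep_node[None, :]
    feat_view = feats * keep_node[:, None] * keep_attr
    return adj_view, feat_view
```

Two independently augmented views of the same text graph would then be encoded and pulled together by a cross-view consistency (contrastive) loss, while views of different texts are pushed apart.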

Keywords: text classification; graph neural network; GNNs; contrastive learning; BERT.

DOI: 10.1504/IJICT.2025.147141

International Journal of Information and Communication Technology, 2025 Vol.26 No.25, pp.48 - 69

Received: 16 May 2025
Accepted: 29 May 2025

Published online: 10 Jul 2025