Title: Text Swin Transformer: a new transformer model for enterprise management text classification
Authors: Bo Yuan; Yang Wang; Jing Chi
Addresses: School of Economics and Management, Qiqihar University, Qiqihar 161000, China ' School of Economics and Management, Qiqihar University, Qiqihar 161000, China ' School of Economics and Management, Qiqihar University, Qiqihar 161000, China
Abstract: As the Internet and information technology advance rapidly, enterprises have amassed vast quantities of textual data from their operational, managerial, and other business activities. These data contain valuable knowledge closely tied to the enterprise, which can provide critical decision-making support for enterprise management. How to discover valuable knowledge in voluminous textual data, turn that data into a genuine asset, and use this asset to inform decisions and drive growth has therefore become a challenging task. Motivated by this, this paper adapts the Swin Transformer model, which has achieved wide success in computer vision tasks, and proposes a Text Swin Transformer (TST) model for enterprise management text classification. Specifically, we revise the original window attention, shifted window attention, and patch merging strategies into text window attention, text shifted window attention, and window scaling strategies to meet the needs of text classification. Compared with state-of-the-art text classification approaches, the proposed model not only achieves excellent results on multiple standard datasets but also requires less computation and runs faster.
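The abstract describes adapting the Swin Transformer's 2-D mechanisms to 1-D token sequences. The following is a minimal sketch of what text window attention with an optional half-window shift, plus a pairwise "window scaling" reduction (analogous to patch merging), could look like in PyTorch; the class names, window size, and dimensions are illustrative assumptions, not the authors' exact design.

# Hedged sketch: 1-D window attention for text, assuming a PyTorch implementation.
import torch
import torch.nn as nn

class TextWindowAttention(nn.Module):
    """Multi-head self-attention restricted to fixed-size 1-D windows of tokens."""

    def __init__(self, dim: int, window_size: int, num_heads: int, shift: bool = False):
        super().__init__()
        self.window_size = window_size
        self.shift = shift
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); seq_len is assumed divisible by window_size.
        b, n, d = x.shape
        w = self.window_size
        if self.shift:
            # Shift tokens by half a window so successive layers mix across window borders.
            x = torch.roll(x, shifts=-w // 2, dims=1)
        # Partition the sequence into non-overlapping windows and attend within each.
        windows = x.reshape(b * n // w, w, d)
        out, _ = self.attn(windows, windows, windows)
        out = out.reshape(b, n, d)
        if self.shift:
            out = torch.roll(out, shifts=w // 2, dims=1)  # undo the shift
        return out

class WindowScaling(nn.Module):
    """Halve sequence length by merging adjacent token pairs (analogue of patch merging)."""

    def __init__(self, dim: int):
        super().__init__()
        self.reduce = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, d = x.shape
        x = x.reshape(b, n // 2, 2 * d)  # concatenate each pair of neighbouring tokens
        return self.reduce(x)

if __name__ == "__main__":
    tokens = torch.randn(2, 64, 128)                    # (batch, tokens, embedding dim)
    block = TextWindowAttention(128, window_size=8, num_heads=4, shift=True)
    merged = WindowScaling(128)(block(tokens))          # shape becomes (2, 32, 128)
    print(merged.shape)

Stacking such blocks and alternating shift=False/True layers, with window scaling between stages, would mirror the hierarchical Swin-style pipeline the abstract outlines.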
Keywords: enterprise management text classification; Text Swin Transformer; window attention; shifted window attention.
DOI: 10.1504/IJCAT.2025.148142
International Journal of Computer Applications in Technology, 2025 Vol.76 No.1/2, pp.36 - 45
Received: 29 Oct 2023
Accepted: 23 Jul 2024
Published online: 27 Aug 2025