Title: Large-scale text classification with deeper and wider convolution neural network

Authors: Min Huang; Wei Huang

Addresses: Department of Software Engineering, South China University of Technology, Guangzhou, Guangdong Province 510006, China; Department of Software Engineering, South China University of Technology, Guangzhou, Guangdong Province 510006, China

Abstract: The dominant approaches for most natural language processing (NLP) tasks, such as text classification, are recurrent neural networks (RNNs) and convolutional neural networks (CNNs). These architectures are usually shallow, with only one or two layers, and therefore struggle to extract the inner patterns of natural language. Unlike raw image pixels, which exhibit spatial regularity, words and phrases are high-level abstractions of human knowledge without direct correlation, so shallow models capture only surface relations between them, while deep models cannot be applied to them directly. Therefore, a shuffle convolutional neural network (SCNN) is proposed to address this shallow-learning problem by introducing wider inception cells and deeper residual connections. In this paper, the difficulty of applying deep models to NLP is overcome by shuffling the channel input and reshaping the output dimension of the first layer. Experimental results demonstrate that the proposed SCNN achieves a substantial improvement in accuracy and efficiency over shallow models.
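The full text is not reproduced here, but the "shuffling channel input" trick mentioned in the abstract typically refers to the ShuffleNet-style channel shuffle, which interleaves the channels produced by grouped convolutions so that information can flow between groups. A minimal NumPy sketch of that operation follows; the function name, tensor shapes, and group count are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Interleave channels across groups (ShuffleNet-style shuffle).

    x has shape (batch, channels, length), as in a 1-D convolution over text.
    The shapes here are illustrative assumptions, not the paper's exact setup.
    """
    batch, channels, length = x.shape
    assert channels % groups == 0, "channels must divide evenly into groups"
    # Split channels into (groups, channels_per_group), swap those two axes,
    # then flatten back: channels from different groups become interleaved.
    x = x.reshape(batch, groups, channels // groups, length)
    x = x.transpose(0, 2, 1, 3)
    return x.reshape(batch, channels, length)

# Example: 4 channels in 2 groups; channel order [0, 1, 2, 3] becomes [0, 2, 1, 3]
x = np.arange(4).reshape(1, 4, 1)
print(channel_shuffle(x, groups=2).ravel().tolist())  # [0, 2, 1, 3]
```

Because the shuffle is just a reshape–transpose–reshape, it adds no learnable parameters and negligible cost, which is consistent with the abstract's claim of improved efficiency.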

Keywords: text classification; shuffle channel; inception cell; residual connection.

DOI: 10.1504/IJSPM.2020.106977

International Journal of Simulation and Process Modelling, 2020 Vol.15 No.1/2, pp.120 - 133

Received: 24 Aug 2018
Accepted: 03 May 2019

Published online: 21 Apr 2020
