Title: Automatic insect identification system based on SE-ResNeXt

Authors: Yao Xiao; Aocheng Zhou; Lin Zhou; Yue Zhao

Addresses: The School of Technology, Beijing Forestry University, Beijing 100083, China (all authors)

Abstract: The Wudalianchi Scenic Area in Heilongjiang Province is an exceptional place to study species adaptation and the evolution of biological communities. To address the heavy workload, poor timeliness, high expertise requirements, and low accuracy of manual insect identification, an automatic insect identification system based on SE-ResNeXt is proposed. Firstly, to suit the study of Wudalianchi insects, the dataset comprises images of 105 insect species from eight orders collected in Wudalianchi. Then, in a comparison of three convolutional neural networks, SE-ResNeXt achieves higher identification accuracy than ResNet and Inception-V4, with recall, precision, F1-score and accuracy all exceeding 98%. Finally, a website and app are built on the Django framework to visualise identification results and digitally store insect data from Wudalianchi. The system is highly interactive and easy to operate, and it was designed to provide technical assistance for insect protection and the popularisation of insect knowledge in agriculture and forestry, as well as a data foundation for studying the long-term evolution of insect diversity in Wudalianchi, China.
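The abstract does not detail the network internals, but the "SE" in SE-ResNeXt refers to the squeeze-and-excitation block, which recalibrates channel responses before they pass to the next stage. A minimal NumPy sketch of that recalibration (with randomly initialised illustrative weights; the function name `se_block` and the `reduction` ratio of 16 are assumptions, not taken from the paper) might look like:

```python
import numpy as np

def se_block(x, reduction=16):
    """Squeeze-and-Excitation recalibration for a feature map x of
    shape (C, H, W). Weights are randomly initialised here purely
    for illustration; in a trained network they are learned."""
    c, h, w = x.shape
    # Squeeze: global average pooling over spatial dims -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    s = np.maximum(w1 @ z, 0.0)           # ReLU
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))   # sigmoid gates in (0, 1)
    # Scale: reweight each channel of x by its gate
    return x * s[:, None, None]

# Example: recalibrate a 32-channel 8x8 feature map
feat = np.ones((32, 8, 8))
out = se_block(feat)
print(out.shape)  # (32, 8, 8)
```

The per-channel sigmoid gates let the network emphasise informative channels and suppress less useful ones, which is what gives SE-augmented backbones their accuracy edge over the plain ResNet variants compared in the paper.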

Keywords: insect image; image identification; deep learning; convolutional neural network; CNN; Wudalianchi.

DOI: 10.1504/IJSCC.2023.127487

International Journal of Systems, Control and Communications, 2023 Vol.14 No.1, pp.81 - 98

Received: 06 May 2022
Accepted: 07 Jul 2022

Published online: 06 Dec 2022