Title: SE-SqueezeNet: SqueezeNet extension with squeeze-and-excitation block

Authors: Supasit Kajkamhaeng; Chantana Chantrapornchai

Addresses: Department of Computer Engineering, Faculty of Engineering, Kasetsart University, Bangkok, Thailand

Abstract: Convolutional neural networks have been widely used for image recognition tasks. It is known that a deep convolutional neural network can yield high recognition accuracy, while training it can be very time-consuming. AlexNet was one of the first networks shown to be effective for these tasks; however, due to its large kernel sizes and fully connected layers, its training time is significant. SqueezeNet is known as a smaller network that yields the same performance as AlexNet. Based on SqueezeNet, we are interested in exploring the effective insertion of the squeeze-and-excitation (SE) module into SqueezeNet to further improve performance and cost efficiency. Promising methodologies and patterns of module insertion are explored. The experimental results for evaluating the module insertion show improvements in top-1 accuracy of 1.55% and 3.32%, while the model size is enlarged by up to 16% and 10%, for the CIFAR100 and ILSVRC2012 datasets respectively.
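For context, the squeeze-and-excitation operation the abstract refers to recalibrates channel responses: a global average pool ("squeeze") is followed by a small bottleneck MLP with a sigmoid gate ("excitation"), and each channel is rescaled by its gate value. The NumPy sketch below is a minimal illustration of that operation only; all function names, weight shapes, and the reduction ratio are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """Illustrative squeeze-and-excitation recalibration (hypothetical helper).

    x  : (C, H, W) feature map
    w1 : (C//r, C) reduction weights of the bottleneck
    w2 : (C, C//r) expansion weights of the bottleneck
    """
    # Squeeze: global average pooling over the spatial dims -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP, ReLU then a sigmoid gate in (0, 1)
    s = np.maximum(w1 @ z, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Scale: reweight every channel of x by its gate value
    return x * gate[:, None, None]

# Example with assumed sizes: 64 channels, reduction ratio r = 16
rng = np.random.default_rng(0)
C, r = 64, 16
x = rng.standard_normal((C, 8, 8))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
y = squeeze_excite(x, w1, w2)
print(y.shape)  # (64, 8, 8)
```

Because the sigmoid gate lies in (0, 1), the output preserves the feature-map shape while attenuating less informative channels; where such a block is inserted into SqueezeNet's Fire modules is the design question the paper studies.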

Keywords: convolutional neural network; deep learning; image classification; residual network; SENet; SqueezeNet.

DOI: 10.1504/IJCSE.2021.115105

International Journal of Computational Science and Engineering, 2021 Vol.24 No.2, pp.185 - 199

Received: 21 Apr 2020
Accepted: 08 Sep 2020

Published online: 12 May 2021
