SE-SqueezeNet: SqueezeNet extension with squeeze-and-excitation block
by Supasit Kajkamhaeng; Chantana Chantrapornchai
International Journal of Computational Science and Engineering (IJCSE), Vol. 24, No. 2, 2021

Abstract: Convolutional neural networks have been widely used for image recognition tasks. It is known that deep convolutional neural networks can yield high recognition accuracy, although training them can be very time-consuming. AlexNet was one of the first networks shown to be effective for such tasks. However, due to its large kernel sizes and fully connected layers, its training time is significant. SqueezeNet is known as a smaller network that yields performance comparable to AlexNet. Based on SqueezeNet, we are interested in exploring the effective insertion of the squeeze-and-excitation (SE) module into SqueezeNet to further improve performance and cost efficiency. A promising methodology and pattern of module insertion are explored. The experimental results for evaluating the module insertion show top-1 accuracy improvements of 1.55% and 3.32%, while the model size is enlarged by up to 16% and 10%, on the CIFAR100 and ILSVRC2012 datasets respectively.
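The SE module referenced in the abstract recalibrates channel responses in three steps: squeeze (global average pooling), excitation (a bottleneck of two fully connected layers with a sigmoid gate), and scale (channel-wise reweighting). As a rough illustration only, and not the paper's exact insertion pattern into SqueezeNet's Fire modules, a minimal NumPy sketch of one SE block might look like this (all weight shapes and the reduction ratio are illustrative assumptions):

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-excitation over a feature map x of shape (C, H, W).

    w1: (C, C // r) and w2: (C // r, C) are the excitation weights for a
    hypothetical reduction ratio r; b1 and b2 are the matching biases.
    """
    # Squeeze: global average pooling -> one descriptor per channel
    z = x.mean(axis=(1, 2))                      # shape (C,)
    # Excitation: FC bottleneck -> ReLU -> FC -> sigmoid gate in (0, 1)
    s = np.maximum(z @ w1 + b1, 0.0)             # reduced to C // r units
    s = 1.0 / (1.0 + np.exp(-(s @ w2 + b2)))     # back to C channel gates
    # Scale: reweight each channel of the original feature map
    return x * s[:, None, None]
```

In the SE-ResNet style this block would be appended after a convolutional unit (here, plausibly after a Fire module's expand layers), so the gated output replaces the original feature map before the next layer.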

Online publication date: Tue, 18-May-2021
