Title: Performance analysis of nonlinear activation function in convolution neural network for image classification

Authors: Edna C. Too; Li Yujian; Pius Kwao Gadosey; Sam Njuki; Firdaous Essaf

Addresses: Edna C. Too: Department of Computer Science and Technology, Beijing University of Technology, Beijing 100124, China; Department of Computer Science, Chuka University, P.O. Box 109-60400, Chuka, Kenya. Li Yujian; Pius Kwao Gadosey; Sam Njuki; Firdaous Essaf: Department of Computer Science and Technology, Beijing University of Technology, Beijing 100124, China.

Abstract: Deep learning architectures that are exceptionally deep have proven to be incredibly powerful models for image processing. As architectures become deeper, the training process faces challenges such as overfitting, computational cost, exploding/vanishing gradients, and degradation. A new state-of-the-art densely connected architecture, called DenseNet, has shown exceptionally good results for image classification. However, it is still computationally costly to train DenseNets. The choice of activation function is also an important aspect of training deep learning networks, because it has a considerable impact on the training and performance of a network model. Therefore, an empirical analysis of some of the nonlinear activation functions used in deep learning is carried out for image classification. The activation functions evaluated include ReLU, Leaky ReLU, ELU, SELU, and an ensemble of SELU and ELU. The publicly available datasets CIFAR-10, SVHN, and PlantVillage are used for evaluation.
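For reference, the standard definitions of the evaluated activation functions can be sketched as below. This is a minimal NumPy illustration of the textbook forms only; the paper's ensemble of SELU and ELU is not described in the abstract and is therefore not reproduced here, and the SELU constants are the fixed values from Klambauer et al. (2017).

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x); zero gradient for negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha on the negative side avoids dead units
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation toward -alpha for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # SELU: ELU scaled by fixed constants lambda and alpha chosen so that
    # activations self-normalize (Klambauer et al., 2017)
    lam, alpha = 1.0507009873554805, 1.6732632423543772
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```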

Keywords: deep learning; convolution neural networks; activation functions; nonlinear activation functions; image classification; rectified linear unit; exponential linear unit; scaled exponential linear unit; leaky rectified linear unit; DenseNet.

DOI: 10.1504/IJCSE.2020.106866

International Journal of Computational Science and Engineering, 2020, Vol. 21, No. 4, pp. 522-535

Received: 31 Oct 2018
Accepted: 18 Mar 2019

Published online: 24 Apr 2020
