An effective learning rate scheduler for stochastic gradient descent-based deep learning model in healthcare diagnosis system
by K. Sathyabama; K. Saruladha
International Journal of Electronic Healthcare (IJEH), Vol. 12, No. 1, 2022

Abstract: This study develops an effective stochastic gradient descent (SGD) and time-based exponential decay (TED) learning rate scheduler, called the SGD-TED model, for deep learning-based healthcare diagnosis. The presented SGD-TED model involves pre-processing, classification, SGD-based parameter tuning and TED-based learning rate scheduling. Once the data is pre-processed, three DL models, namely recurrent neural network (RNN), long short-term memory (LSTM) and gated recurrent unit (GRU), are used for diagnosis. Hyperparameter tuning is then performed with SGD, and TED is applied to schedule the learning rate proficiently. Applying the SGD-TED approach to the DL models considerably improves classification performance. The effectiveness of the SGD-TED model is assessed on three benchmark medical datasets, and the experimental outcomes show that the SGD-TED-LSTM model achieves accuracies of 98.59%, 93.68% and 95.20% on the diabetes, EEG Eye State and sleep stage datasets, respectively.
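The abstract does not specify the exact decay formula used in the SGD-TED model. The sketch below is a minimal illustration, not the authors' code: it assumes the standard time-based exponential decay lr_t = lr_0 · exp(-k · t) applied once per epoch, and demonstrates it with a plain-NumPy SGD loop on a toy binary classification problem rather than the paper's RNN/LSTM/GRU models; all hyperparameter values are illustrative.

```python
# Minimal sketch: SGD with a time-based exponential-decay learning-rate
# schedule (assumed form lr_t = lr_0 * exp(-k * t)), shown on a toy
# logistic-regression task. Not the authors' implementation.
import numpy as np

def exponential_decay(initial_lr, decay_rate, epoch):
    """Time-based exponential decay of the learning rate."""
    return initial_lr * np.exp(-decay_rate * epoch)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                   # toy feature matrix
true_w = rng.normal(size=8)
y = (sigmoid(X @ true_w) > 0.5).astype(float)   # toy binary labels

w = np.zeros(8)
initial_lr, decay_rate, epochs, batch_size = 0.5, 0.05, 50, 32

for epoch in range(epochs):
    lr = exponential_decay(initial_lr, decay_rate, epoch)   # scheduled LR
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        preds = sigmoid(X[batch] @ w)
        grad = X[batch].T @ (preds - y[batch]) / len(batch)  # logistic-loss gradient
        w -= lr * grad                                        # SGD update
    if epoch % 10 == 0:
        p = sigmoid(X @ w)
        loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
        print(f"epoch {epoch:3d}  lr={lr:.4f}  loss={loss:.4f}")
```

The same schedule can be attached to a deep-learning optimiser (e.g. a framework's exponential-decay schedule passed to its SGD optimiser) when training recurrent models such as LSTM or GRU, which is the setting the abstract describes.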

Online publication date: Thu, 09-Dec-2021

