An effective learning rate scheduler for stochastic gradient descent-based deep learning model in healthcare diagnosis system
Online publication date: Thu, 09-Dec-2021
by K. Sathyabama; K. Saruladha
International Journal of Electronic Healthcare (IJEH), Vol. 12, No. 1, 2022
Abstract: This study develops an effective learning rate scheduler, called the SGD-TED model, based on stochastic gradient descent (SGD) and time with exponential decay (TED), for deep learning-based healthcare diagnosis. The presented SGD-TED model involves pre-processing, classification, SGD-based parameter tuning and TED-based learning rate scheduling. Once the data is pre-processed, three DL models, namely the recurrent neural network (RNN), long short-term memory (LSTM) and gated recurrent unit (GRU), are used for diagnosis. Hyperparameter tuning then takes place via SGD, and TED is applied to schedule the learning rate proficiently. Applying the SGD-TED approach to the DL models considerably increases classification performance. The effectiveness of the SGD-TED model is assessed on three benchmark medical datasets, and the experimental outcomes show that the SGD-TED-LSTM model achieves accuracies of 98.59%, 93.68% and 95.20% on the applied diabetes, EEG Eye State and sleep stage datasets, respectively.
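The abstract does not give the exact TED formula, but a time-based exponential-decay schedule paired with SGD is conventionally of the form lr_t = lr_0 · exp(−k·t). The sketch below illustrates that standard form; the decay constant, initial rate, and function name are illustrative assumptions, not taken from the paper.

```python
import math

def ted_learning_rate(initial_lr: float, decay_rate: float, epoch: int) -> float:
    """Hypothetical time-based exponential-decay schedule.

    Assumes the conventional form lr_t = lr_0 * exp(-k * t); the paper's
    exact TED formulation is not stated in the abstract.
    """
    return initial_lr * math.exp(-decay_rate * epoch)

# Example: how the learning rate would shrink over the first few epochs
# for an assumed initial rate of 0.1 and decay constant of 0.05.
schedule = [ted_learning_rate(0.1, 0.05, t) for t in range(5)]
```

In an SGD training loop, the scheduled rate would simply replace the fixed step size at each epoch, so later epochs take smaller parameter updates.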