Title: Enhanced gradient boosting - a novel method to improve performance of XGB technique

Authors: Kolluru Venkata Nagendra; Maligela Ussenaiah

Addresses: Department of Computer Science, Vikrama Simhapuri University, SPSR Nellore, AP, India; Department of Computer Science, Vikrama Simhapuri University, SPSR Nellore, AP, India

Abstract: The gradient boosting algorithm (Friedman, 1999) was developed for its high predictive capacity. Its adoption was limited because each new tree is built to minimise the errors of the previous trees and only one decision tree is created at a time, so even small models take a long time to build. To overcome these drawbacks, extreme gradient boosting (XGBoost) (Chen and Guestrin, 2016) was developed; it reduces model-building time and improves performance. The experimental results demonstrate that the enhanced gradient boosting (EGB) algorithm performs better than algorithms such as XGB and gradient boosting (GB) on class-imbalanced datasets. EGB works in the same way as XGB and also handles balanced data with high accuracy, so it performs well on both balanced and imbalanced data. The results show that the area under the curve obtained with EGB is higher than that obtained with XGB.
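The abstract's comparison rests on two ideas: boosting builds trees sequentially to correct residual errors, and model quality on imbalanced data is judged by the area under the ROC curve. The sketch below illustrates both with a minimal gradient-boosting loop (squared-loss residual fitting with depth-1 stumps) and a pairwise-ranking AUC computation. It is not the paper's EGB algorithm; the toy dataset, learning rate, and stump learner are illustrative assumptions only.

```python
# Minimal gradient-boosting sketch: fit stumps to residuals, sum their
# (learning-rate-scaled) predictions, then score with AUC.
# Hypothetical toy data; this is NOT the EGB algorithm from the paper.

def fit_stump(x, residual):
    """Best single-threshold split on a 1-D feature minimising squared error."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residual) if xi <= t]
        right = [r for xi, r in zip(x, residual) if xi > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    return best[1], best[2], best[3]

def boost(x, y, n_rounds=20, lr=0.3):
    """Additive model: each round fits a stump to the current residuals."""
    F = [0.0] * len(x)
    stumps = []
    for _ in range(n_rounds):
        residual = [yi - fi for yi, fi in zip(y, F)]
        t, lv, rv = fit_stump(x, residual)
        stumps.append((t, lv, rv))
        F = [fi + lr * (lv if xi <= t else rv) for xi, fi in zip(x, F)]
    return stumps

def predict(stumps, x, lr=0.3):
    return [sum(lr * (lv if xi <= t else rv) for t, lv, rv in stumps)
            for xi in x]

def auc(scores, labels):
    """AUC via pairwise ranking: fraction of pos/neg pairs ranked correctly."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Tiny class-imbalanced toy set: 8 negatives, 2 positives, separable by x.
x = [0.1, 0.2, 0.3, 0.35, 0.4, 0.45, 0.5, 0.55, 0.9, 0.95]
y = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
model = boost(x, y)
scores = predict(model, x)
print(round(auc(scores, y), 2))  # separable toy data -> AUC of 1.0
```

On this separable toy set a perfect AUC is expected; on real imbalanced data the AUC gap between methods is what the paper's experiments measure. XGBoost's practical gains over this naive loop come from regularised objectives, histogram-based split finding, and multithreaded tree construction.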

Keywords: machine learning; boosting; gradient boosting; enhanced gradient boosting; EGB; extreme gradient boosting; XGB; multithreading.

DOI: 10.1504/IJAIP.2025.149743

International Journal of Advanced Intelligence Paradigms, 2025 Vol.30 No.5, pp.371 - 378

Received: 31 Dec 2018
Accepted: 24 Feb 2019

Published online: 12 Nov 2025
