Title: A comprehensive survey on the role of explanation in artificial intelligence: a case study on prediction of gross calorific value of coal
Authors: Sindhu P. Menon
Addresses: School of Computing and Information Technology, REVA University, Bengaluru, India
Abstract: The study presented here can serve as a basis for researchers interested in the essential components of the nascent and quickly developing field of explainable artificial intelligence (XAI). SHAP-Xgboost is applied to demonstrate the working principle of XAI; this is achieved by analysing the chemical content of coal reserves. SHapley Additive exPlanations (SHAP) is proposed as the XAI method for this aim. SHAP allows users to quantify the relationship between each input variable and the corresponding output, and to rank the input variables by their influence. SHAP was combined with extreme gradient boosting (Xgboost), one of the latest technological developments, to form SHAP-Xgboost. SHAP-Xgboost was able to model gross calorific value (GCV) accurately (R2 = 0.99) using proximate and ultimate analysis (the chemical content of coal) from the coal samples. These findings pave the way for the development of high-interpretability algorithms that learn coal properties and identify the crucial variables.
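The Shapley attribution principle the abstract refers to can be illustrated with a minimal, self-contained sketch. The toy linear "GCV model", the feature names (carbon, hydrogen, ash, moisture), and all numeric values below are hypothetical stand-ins for the paper's trained SHAP-Xgboost model; the exact coalition enumeration shown here is what the SHAP library approximates efficiently for tree ensembles.

```python
from itertools import combinations
from math import factorial

# Hypothetical linear stand-in for a trained GCV regressor.
# Weights and feature values are illustrative only.
WEIGHTS = {"carbon": 0.35, "hydrogen": 1.2, "ash": -0.1, "moisture": -0.05}

def gcv_model(x):
    """Toy model: a weighted sum of coal-property features."""
    return sum(WEIGHTS[f] * x[f] for f in WEIGHTS)

def shapley_values(x, baseline):
    """Exact Shapley attributions by enumerating feature coalitions.

    v(S) evaluates the model with the features in S at their actual
    values and all other features held at baseline (average) values.
    Each feature's attribution is the classic weighted average of its
    marginal contributions v(S ∪ {f}) - v(S) over all coalitions S.
    """
    feats = list(x)
    n = len(feats)

    def v(subset):
        mixed = {f: (x[f] if f in subset else baseline[f]) for f in feats}
        return gcv_model(mixed)

    phi = {}
    for f in feats:
        others = [g for g in feats if g != f]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(set(S) | {f}) - v(set(S)))
        phi[f] = total
    return phi

sample   = {"carbon": 78.0, "hydrogen": 5.1, "ash": 9.0, "moisture": 4.0}
baseline = {"carbon": 65.0, "hydrogen": 4.5, "ash": 15.0, "moisture": 8.0}
phi = shapley_values(sample, baseline)
```

For a linear model the attributions reduce to `WEIGHTS[f] * (sample[f] - baseline[f])`, and by the efficiency property they always sum to `gcv_model(sample) - gcv_model(baseline)`; ranking features by `abs(phi[f])` mirrors the variable-importance ordering the abstract describes.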
Keywords: explainable artificial intelligence; XAI; artificial intelligence; gross calorific value; explainability.
DOI: 10.1504/IJCIS.2026.151576
International Journal of Critical Infrastructures, 2026 Vol.22 No.1, pp.25 - 58
Received: 14 Jul 2023
Accepted: 22 Jan 2024
Published online: 09 Feb 2026