International Journal of Advanced Computer Science and Applications (IJACSA), Volume 15 Issue 10, 2024.
Abstract: Bringing transparency to ML-assisted decision making is important across many fields. ML tools need to be designed so that they are more understandable and explainable to end users, thereby earning their trust. The field of XAI, although a mature area of research, is increasingly being seen as a way to address these missing aspects of ML systems. In this paper, we focus on transparency issues when using ML tools in decision making in general, and specifically when recruiting candidates for high-profile positions. In software development, it is important to correctly identify and differentiate highly skilled developers from developers who are adept only at regular, mundane programming tasks. If AI is used in the decision process, HR recruiting agents need to justify to their managers why certain candidates were selected and others rejected. Online Judges (OJs) are increasingly being used for developer recruitment at various levels, attracting thousands of candidates. Automating this decision-making process with ML tools can bring speed while mitigating bias in the selection process. However, the huge raw datasets available on OJs need to be well curated and enhanced to make the decision process accurate and explainable. To address this, we built and subsequently enhanced an ML regressor model and its underlying dataset using XAI tools. We evaluated the model to show how XAI can be used actively and iteratively during the pre-deployment stage to improve both the quality of the dataset and the prediction accuracy of the regression model. We show how these iterative changes improved the r2-score of the GradientRegressor model used in our experiments from 0.3507 to 0.9834 (an improvement of 63.27%). We also show how the explainability of the LIME and SHAP tools increased with these steps.
A unique contribution of this work is the application of XAI to a very niche area in recruitment, i.e., the evaluation of user performance on OJs during software developer recruitment.
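The build-evaluate-explain loop the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, assuming scikit-learn is available; it uses a `GradientBoostingRegressor` and permutation importance as a stand-in for the paper's LIME/SHAP analysis, and none of the paper's actual OJ dataset or engineered features are reproduced here.

```python
# Sketch of the iterative pre-deployment loop: train a gradient-boosting
# regressor, score it with R^2 on held-out data, then rank features by a
# model-agnostic importance measure to guide dataset curation and feature
# creation. Synthetic data only; the paper uses LIME and SHAP for the
# explanation step.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for a curated OJ dataset: 6 candidate features,
# only 3 of which actually carry signal.
X, y = make_regression(n_samples=500, n_features=6, n_informative=3,
                       noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
r2 = model.score(X_test, y_test)  # R^2 on held-out data

# Rank features by permutation importance; low-ranked features are
# candidates for removal or re-engineering in the next iteration.
imp = permutation_importance(model, X_test, y_test,
                             n_repeats=5, random_state=0)
ranked = np.argsort(imp.importances_mean)[::-1]
print(f"R^2 = {r2:.4f}, feature ranking = {ranked.tolist()}")
```

In practice the explanation step would be repeated after each round of feature creation, with the change in R² and in the explanation quality guiding whether the new features are kept.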
Waseem Ahmed, “Feature Creation to Enhance Explainability and Predictability of ML Models Using XAI,” International Journal of Advanced Computer Science and Applications (IJACSA), 15(10), 2024. http://dx.doi.org/10.14569/IJACSA.2024.01510101
@article{Ahmed2024,
title = {Feature Creation to Enhance Explainability and Predictability of ML Models Using XAI},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2024.01510101},
url = {http://dx.doi.org/10.14569/IJACSA.2024.01510101},
year = {2024},
publisher = {The Science and Information Organization},
volume = {15},
number = {10},
author = {Waseem Ahmed}
}
Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.