Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.
Digital Object Identifier (DOI): 10.14569/IJACSA.2021.0120480
Article Published in International Journal of Advanced Computer Science and Applications (IJACSA), Volume 12 Issue 4, 2021.
Abstract: Training a machine-learning model on a dataset with many attributes is often problematic: as the number of features grows, the model becomes increasingly prone to overfitting. This happens because not all features are important; some only make the data noisier. Dimensionality reduction techniques are used to overcome this problem. This paper presents a detailed comparative study of nine dimensionality reduction methods: missing-values ratio, low variance filter, high-correlation filter, random forest, principal component analysis, linear discriminant analysis, backward feature elimination, forward feature construction, and rough set theory. The effects of these methods on both training and testing performance were compared on two different datasets and three different models: Artificial Neural Network (ANN), Support Vector Machine (SVM), and Random Forest Classifier (RFC). The results showed that the RFC model was able to achieve dimensionality reduction while limiting overfitting, and it showed a general improvement in both accuracy and efficiency over the compared approaches. The results also revealed that dimensionality reduction can reduce overfitting while keeping performance close to, or better than, that obtained with the original feature set.
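To make the experimental pattern described in the abstract concrete, the sketch below shows one way such a comparison might be set up in Python with scikit-learn. It is an illustrative example only, not the authors' code: it uses PCA as a stand-in for the nine reduction methods, a built-in dataset (breast cancer) rather than the paper's datasets, an assumed component count of 10, and default hyperparameters for the three model types.

```python
# Illustrative sketch: compare train/test accuracy of ANN, SVM, and RFC
# with and without a dimensionality reduction step (PCA here).
# Dataset, component count, and hyperparameters are assumptions, not the paper's setup.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

models = {
    "ANN": MLPClassifier(max_iter=1000, random_state=42),
    "SVM": SVC(),
    "RFC": RandomForestClassifier(random_state=42),
}

for name, model in models.items():
    for reduced in (False, True):
        steps = [StandardScaler()]
        if reduced:
            steps.append(PCA(n_components=10))  # assumed number of components
        steps.append(model)
        pipe = make_pipeline(*steps)
        pipe.fit(X_train, y_train)
        # The gap between training and testing accuracy is a simple overfitting signal.
        print(f"{name} (PCA={reduced}): "
              f"train={pipe.score(X_train, y_train):.3f}, "
              f"test={pipe.score(X_test, y_test):.3f}")
```

A shrinking gap between training and testing accuracy after the reduction step, with little or no loss in test accuracy, is the kind of outcome the abstract reports.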
Mustafa Abdul Salam, Ahmad Taher Azar, Mustafa Samy Elgendy and Khaled Mohamed Fouad, “The Effect of Different Dimensionality Reduction Techniques on Machine Learning Overfitting Problem,” International Journal of Advanced Computer Science and Applications (IJACSA), 12(4), 2021. http://dx.doi.org/10.14569/IJACSA.2021.0120480
@article{Salam2021,
title = {The Effect of Different Dimensionality Reduction Techniques on Machine Learning Overfitting Problem},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2021.0120480},
url = {http://dx.doi.org/10.14569/IJACSA.2021.0120480},
year = {2021},
publisher = {The Science and Information Organization},
volume = {12},
number = {4},
author = {Mustafa Abdul Salam and Ahmad Taher Azar and Mustafa Samy Elgendy and Khaled Mohamed Fouad}
}