Mitigating Catastrophic Forgetting in Continual Learning Using the Gradient-Based Approach: A Literature Review
International Journal of Advanced Computer Science and Applications (IJACSA), Volume 16 Issue 4, 2025.
Abstract: Continual learning, also referred to as lifelong learning, has emerged as a significant advancement for model adaptation and generalization in deep learning: it enables models to be trained sequentially on a continuous stream of data across multiple tasks while retaining previously acquired knowledge. By learning from new data as it arrives and preserving old experience, continual learning eliminates the need to retrain from scratch on old and new data combined, saving time, computational and memory resources, and effort. It thereby supports models that adapt efficiently to dynamic environments and fast-shifting preferences while remaining scalable as new skills are acquired over time. Despite these advantages, continual learning still faces a significant challenge known as catastrophic forgetting: a phenomenon in which a model forgets previously learned knowledge when trained on new tasks, making it difficult to preserve performance on earlier tasks while learning new ones. Catastrophic forgetting is a central obstacle to advancing the field because it undermines the main goal of continual learning, namely maintaining long-term performance across all encountered tasks, and several studies have therefore been proposed recently to address and mitigate it. This research provides a detailed and comprehensive review of one of the state-of-the-art approaches to mitigating catastrophic forgetting in continual learning, known as the gradient-based approach. Furthermore, a performance evaluation of recent gradient-based models is conducted, including their limitations and promising directions for future research.
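To give a concrete sense of what the gradient-based approach entails, the following is a minimal sketch (not taken from the paper) of the gradient projection rule popularized by methods such as A-GEM: when the current task's gradient conflicts with a reference gradient computed on stored examples from earlier tasks, the update is projected so that it no longer increases the past-task loss. Variable names here are illustrative assumptions.

import numpy as np

def project_gradient(g_new: np.ndarray, g_ref: np.ndarray) -> np.ndarray:
    """A-GEM-style projection. If the current-task gradient g_new conflicts
    with the reference gradient g_ref from replayed past examples (negative
    dot product), project g_new onto the half-space where the past-task
    loss does not increase; otherwise return it unchanged."""
    dot = float(np.dot(g_new, g_ref))
    if dot < 0:  # the raw update would raise the loss on earlier tasks
        g_new = g_new - (dot / float(np.dot(g_ref, g_ref))) * g_ref
    return g_new

# Toy usage with two conflicting gradient directions.
g_new = np.array([1.0, -2.0])  # gradient from the current task
g_ref = np.array([1.0, 1.0])   # gradient from replayed past examples
g_safe = project_gradient(g_new, g_ref)
print(g_safe, np.dot(g_safe, g_ref))  # the dot product is now >= 0

In this toy case the raw gradient has a negative dot product with the reference gradient, so the projected update is orthogonal to it, illustrating how gradient-based methods trade off plasticity on the new task against stability on earlier ones.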
Haitham Ghallab, Mona Nasr and Hanan Fahmy, “Mitigating Catastrophic Forgetting in Continual Learning Using the Gradient-Based Approach: A Literature Review,” International Journal of Advanced Computer Science and Applications (IJACSA), 16(4), 2025. http://dx.doi.org/10.14569/IJACSA.2025.0160414
@article{Ghallab2025,
title = {Mitigating Catastrophic Forgetting in Continual Learning Using the Gradient-Based Approach: A Literature Review},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2025.0160414},
url = {http://dx.doi.org/10.14569/IJACSA.2025.0160414},
year = {2025},
publisher = {The Science and Information Organization},
volume = {16},
number = {4},
author = {Haitham Ghallab and Mona Nasr and Hanan Fahmy}
}
Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.