International Journal of Advanced Computer Science and Applications (IJACSA), Volume 16, Issue 3, 2025.
Abstract: In recommendation systems, negative sampling strategies are crucial to the computation of the contrastive learning loss. Traditional random negative sampling can yield low-quality negative samples during training, harming the convergence and performance of the model. In addition, the Bayesian Personalized Ranking (BPR) loss function typically converges slowly and is prone to falling into suboptimal local solutions. To address these problems, this paper proposes a recommendation algorithm based on popularity-corrected sampling and an improved contrastive loss. First, a dynamic negative sampling method with popularity correction is proposed, which reduces the impact of item-popularity distribution bias on model training and dynamically screens negative samples to improve the quality of model recommendations. Second, an improved contrastive loss is proposed, which selects the most challenging negative samples and introduces a boundary threshold to control the sensitivity of the loss, enabling the model to focus on samples that are difficult to distinguish and further improve recommendation performance. Experimental results on the Amazon-Book, Yelp2018, and Gowalla datasets show that the proposed model significantly outperforms mainstream state-of-the-art models in recommendation tasks. Specifically, the Recall metric, which reflects model accuracy, improves by 16.8%, 12.9%, and 5.72% on these three datasets, respectively, while the NDCG metric, which measures ranking quality, increases by 20.7%, 16.4%, and 7.76%, respectively. These results confirm the effectiveness and superiority of the recommendation algorithm across different scenarios. Compared with baseline models, it demonstrates stronger adaptability in complex situations, such as the sparse Gowalla dataset and the long-tail Amazon-Book dataset, with the highest improvement in core metrics exceeding 20%.
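The two components described in the abstract can be illustrated with a rough sketch. This is not the paper's exact formulation: the hyperparameters (`beta`, `margin`, `num_candidates`) and function names are hypothetical, and the loss shown is a generic margin-based hardest-negative hinge loss standing in for the improved contrastive loss with its boundary threshold.

```python
import numpy as np

def popularity_corrected_negative_sampling(item_popularity, interacted,
                                           num_candidates=10, beta=0.75):
    """Sample negative-item candidates with popularity correction.

    Raising popularity counts to a power beta < 1 flattens the sampling
    distribution, reducing the bias toward head (popular) items.
    `beta` and `num_candidates` are illustrative, not from the paper.
    """
    probs = item_popularity.astype(float) ** beta
    probs[list(interacted)] = 0.0          # never sample observed positives
    probs /= probs.sum()
    return np.random.choice(len(probs), size=num_candidates,
                            replace=False, p=probs)

def margin_hardest_negative_loss(user_emb, pos_emb, neg_embs, margin=0.3):
    """Hinge-style loss on the hardest (highest-scoring) negative candidate.

    The margin plays the role of the boundary threshold described in the
    abstract: only negatives scoring within `margin` of the positive item
    contribute to the loss, so training focuses on hard-to-distinguish
    samples.
    """
    pos_score = user_emb @ pos_emb
    neg_scores = neg_embs @ user_emb
    hardest = neg_scores.max()             # dynamic screening: keep hardest negative
    return max(0.0, margin - (pos_score - hardest))
```

In this sketch, the sampler first draws a popularity-debiased candidate pool, and the loss then selects the single hardest candidate from that pool, which is one common way to combine corrected sampling with hard-negative mining.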
Wei Lu, Xiaodong Cai and Minghui Li, “Popularity-Correction Sampling and Improved Contrastive Loss Recommendation,” International Journal of Advanced Computer Science and Applications (IJACSA), 16(3), 2025. http://dx.doi.org/10.14569/IJACSA.2025.0160333
@article{Lu2025,
title = {Popularity-Correction Sampling and Improved Contrastive Loss Recommendation},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2025.0160333},
url = {http://dx.doi.org/10.14569/IJACSA.2025.0160333},
year = {2025},
publisher = {The Science and Information Organization},
volume = {16},
number = {3},
author = {Wei Lu and Xiaodong Cai and Minghui Li}
}
Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.