Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.
Digital Object Identifier (DOI): 10.14569/IJACSA.2023.0140423
Article Published in International Journal of Advanced Computer Science and Applications (IJACSA), Volume 14 Issue 4, 2023.
Abstract: The comments generated during the evolution of a network public opinion event not only reflect netizens' attitudes towards the event itself, but also serve as a key basis for tracking public opinion dynamics. Based on the comment data produced as an event evolves, an event feature vector pre-training model, NL2ER-Transformer, is constructed to extract event features automatically and in real time. First, a semi-supervised multi-label curriculum learning model is proposed to generate comment words, event word vectors, event words, and event sentences, so that a public opinion event is mapped into a vectorized sequence analogous to natural language. Second, a training method based on the Transformer structure is proposed to simulate the evolution process of events, so that the event vector generation model can learn the laws of evolution and the characteristics of reversal events. Finally, the event vectors generated by the proposed NL2ER-Transformer model are compared with those generated by current mainstream models such as XLNet and RoBERTa. The pre-trained NL2ER-Transformer and three pre-trained benchmark models are tested on four downstream classification models. The experimental results show that, compared with training the downstream models on vectors generated by the other pre-trained benchmark models, training them on vectors generated by NL2ER-Transformer improves accuracy, recall, and F1 by 16.66%, 44.44%, and 19%, respectively, over the best baseline. At the same time, in the evolutionary capability analysis test, only four events show partial errors. In terms of the performance of the semi-supervised model, the proposed semi-supervised multi-label curriculum learning model outperforms mainstream models on four indicators by 6%, 23%, 8%, and 15%, respectively.
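The abstract describes the pipeline only at a high level (event comments mapped to a vectorized sequence, a Transformer-based encoder that learns the evolution process, and downstream classifiers fed by the resulting event vectors). As a rough, hypothetical illustration of that flow, and not the authors' implementation, the sketch below shows how a Transformer encoder could pool a vectorized event sequence into a single event vector and pass it to a downstream classifier such as a reversal-event detector. The class names, dimensions, pooling strategy, and classification head are all assumptions introduced here.

# Hypothetical sketch only: the paper's actual architecture, hyper-parameters, and
# training objective are not given in the abstract. EventEncoder, ReversalClassifier,
# d_model=256, and mean pooling are assumptions, not the NL2ER-Transformer design.
import torch
import torch.nn as nn


class EventEncoder(nn.Module):
    """Transformer encoder mapping a sequence of comment/event-word embeddings
    (one evolution step per position) to a single pooled event vector."""

    def __init__(self, d_model: int = 256, nhead: int = 8, num_layers: int = 4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -- a vectorized "event sentence"
        h = self.encoder(x)
        return h.mean(dim=1)  # pooled event vector


class ReversalClassifier(nn.Module):
    """Downstream head that predicts, e.g., whether an event is a reversal event
    from the pre-trained event vector."""

    def __init__(self, d_model: int = 256, num_classes: int = 2):
        super().__init__()
        self.fc = nn.Linear(d_model, num_classes)

    def forward(self, event_vec: torch.Tensor) -> torch.Tensor:
        return self.fc(event_vec)


if __name__ == "__main__":
    encoder, clf = EventEncoder(), ReversalClassifier()
    fake_events = torch.randn(2, 32, 256)   # 2 events, 32 evolution steps each
    logits = clf(encoder(fake_events))
    print(logits.shape)                     # torch.Size([2, 2])

In such a setup, the encoder would be pre-trained on event-evolution sequences and then frozen or fine-tuned while the downstream classification heads are trained on the generated event vectors, mirroring the evaluation protocol described in the abstract.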
WANG Nan, TAN Shu-Ru, XIE Xiao-Lan, LI Hai-Rong and JIANG Jia-Hui, "Event Feature Pre-training Model Based on Public Opinion Evolution," International Journal of Advanced Computer Science and Applications (IJACSA), 14(4), 2023. http://dx.doi.org/10.14569/IJACSA.2023.0140423
@article{Nan2023,
title = {Event Feature Pre-training Model Based on Public Opinion Evolution},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2023.0140423},
url = {http://dx.doi.org/10.14569/IJACSA.2023.0140423},
year = {2023},
publisher = {The Science and Information Organization},
volume = {14},
number = {4},
author = {WANG Nan and TAN Shu-Ru and XIE Xiao-Lan and LI Hai-Rong and JIANG Jia-Hui}
}