International Journal of Advanced Computer Science and Applications(IJACSA), Volume 15 Issue 6, 2024.
Abstract: Assessing language proficiency across diverse linguistic and cultural settings requires a cross-cultural language proficiency scale. This study proposes a hybrid model that combines a Transformer architecture with attention mechanisms to account for cross-cultural characteristics and scale language proficiency effectively. By capturing both linguistic context and cross-cultural nuances, the approach aims to improve the precision and consistency of language proficiency assessment. The proposed hybrid model comprises several key components. The input text is first tokenized into subword units and converted into embeddings using word2vec, a pre-trained word-embedding technique, to capture semantic information. A Transformer encoder stack then extracts contextual information from the input sequence, using multi-head self-attention to focus on distinct textual elements. In addition to the Transformer encoder, an attention layer (or layers) specifically tailored to attend to cross-cultural traits is introduced. By learning cross-cultural patterns and relationships across languages and cultural settings, this attention mechanism improves the model's ability to understand and incorporate cross-cultural nuances. The outputs of the Transformer encoder and the cross-cultural attention layer(s) are fused into a representation that blends linguistic context with cross-cultural features, which is then passed to a classifier to predict language proficiency levels. The hybrid model is trained with categorical cross-entropy as the objective function on diverse datasets spanning multiple languages and cultural contexts, and the proposed work is implemented in Python. The proposed model achieves an accuracy of 97.3%, outperforming the T-TC-INT model and BERT + MECT.
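The pipeline described in the abstract (subword embeddings, a Transformer encoder stack, a cross-cultural attention layer, fusion, and a proficiency classifier trained with categorical cross-entropy) can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: the dimensions, the use of learned culture-ID embeddings as attention queries, and mean-pooling for fusion are all assumptions made for the sake of a self-contained example.

```python
import torch
import torch.nn as nn

class CrossCulturalProficiencyModel(nn.Module):
    """Hypothetical sketch of the hybrid architecture from the abstract."""

    def __init__(self, vocab_size=10000, d_model=128, n_heads=4,
                 n_layers=2, n_cultures=8, n_levels=6):
        super().__init__()
        # Subword-token embeddings (stand-in for pre-trained word2vec vectors)
        self.embed = nn.Embedding(vocab_size, d_model)
        # Transformer encoder stack with multi-head self-attention
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        # Cross-cultural attention: a learned culture embedding queries the
        # encoded sequence to extract culture-relevant features (assumption)
        self.culture_embed = nn.Embedding(n_cultures, d_model)
        self.cultural_attn = nn.MultiheadAttention(d_model, n_heads,
                                                   batch_first=True)
        # Classifier over the fused linguistic + cultural representation
        self.classifier = nn.Linear(2 * d_model, n_levels)

    def forward(self, token_ids, culture_ids):
        x = self.embed(token_ids)                         # (B, T, d)
        ctx = self.encoder(x)                             # linguistic context
        q = self.culture_embed(culture_ids).unsqueeze(1)  # (B, 1, d)
        cult, _ = self.cultural_attn(q, ctx, ctx)         # cultural summary
        # Fuse pooled linguistic context with the cultural summary
        fused = torch.cat([ctx.mean(dim=1), cult.squeeze(1)], dim=-1)
        return self.classifier(fused)                     # proficiency logits

model = CrossCulturalProficiencyModel()
token_ids = torch.randint(0, 10000, (2, 16))   # batch of 2 token sequences
culture_ids = torch.tensor([0, 3])             # one culture ID per sequence
logits = model(token_ids, culture_ids)         # shape (2, n_levels)
# Categorical cross-entropy objective, as stated in the abstract
loss = nn.CrossEntropyLoss()(logits, torch.tensor([1, 4]))
```

In this sketch the culture embedding acts as a query over the encoder output, so the cultural-attention output is a single vector per sequence summarizing culture-relevant textual features; concatenating it with the mean-pooled encoder output is one simple way to realize the "fused representation" the abstract describes.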
Anna Gustina Zainal, M. Misba, Punit Pathak, Indrajit Patra, Adapa Gopi, Yousef A. Baker El-Ebiary and Prema S, “Cross-Cultural Language Proficiency Scaling using Transformer and Attention Mechanism Hybrid Model,” International Journal of Advanced Computer Science and Applications (IJACSA), 15(6), 2024. http://dx.doi.org/10.14569/IJACSA.2024.01506116
@article{Zainal2024,
title = {Cross-Cultural Language Proficiency Scaling using Transformer and Attention Mechanism Hybrid Model},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2024.01506116},
url = {http://dx.doi.org/10.14569/IJACSA.2024.01506116},
year = {2024},
publisher = {The Science and Information Organization},
volume = {15},
number = {6},
author = {Anna Gustina Zainal and M. Misba and Punit Pathak and Indrajit Patra and Adapa Gopi and Yousef A. Baker El-Ebiary and Prema S}
}
Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.