The Science and Information (SAI) Organization

DOI: 10.14569/IJACSA.2024.01506116

Cross-Cultural Language Proficiency Scaling using Transformer and Attention Mechanism Hybrid Model

Author 1: Anna Gustina Zainal
Author 2: M. Misba
Author 3: Punit Pathak
Author 4: Indrajit Patra
Author 5: Adapa Gopi
Author 6: Yousef A.Baker El-Ebiary
Author 7: Prema S

International Journal of Advanced Computer Science and Applications (IJACSA), Volume 15 Issue 6, 2024.


Abstract: Assessing language proficiency across diverse linguistic and cultural settings requires a cross-cultural language proficiency scale. This study proposes a hybrid model that accounts for cross-cultural characteristics and scales language proficiency effectively by combining a Transformer architecture with attention mechanisms. The approach aims to improve the precision and consistency of language proficiency assessment by capturing both cross-cultural nuances and linguistic context. The proposed hybrid model consists of several key components. To capture semantic information, the input text is first tokenized into subword units and transformed into embeddings using word2vec, a pre-trained word embedding method. A Transformer encoder stack then extracts contextual information from the input sequence, using multi-head self-attention to focus on distinct textual elements. In addition to the Transformer encoder, one or more attention layers specifically tailored to attend to cross-cultural traits are introduced. By learning cross-cultural patterns and relationships between different languages and cultural settings, this attention mechanism improves the model's comprehension and incorporation of cross-cultural nuances. The outputs of the Transformer encoder and the cross-cultural attention layer(s) are fused into a representation that blends linguistic context with cross-cultural features, and this fused representation is passed to a classifier to predict language proficiency levels. The hybrid model uses categorical cross-entropy as the objective function and is trained on diverse datasets spanning several languages and cultural settings. The proposed work is implemented in Python. Compared with the T-TC-INT model and BERT + MECT, the proposed approach achieves an accuracy of 97.3%.

Keywords: Cross-cultural; language proficiency; transformer; attention mechanism; hybrid model
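The pipeline described in the abstract — embeddings, self-attention over the token sequence, a separate attention pass over cross-cultural features, fusion, and a classifier trained with categorical cross-entropy — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the dimensions, the single-head attention (the paper uses multi-head), and the `culture_bank` of cultural-feature vectors are all illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_levels, seq_len, n_cult = 16, 4, 10, 5  # illustrative sizes

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    return softmax(Q @ K.T / np.sqrt(Q.shape[-1])) @ V

# Token embeddings for one input sequence (stand-in for word2vec output)
X = rng.standard_normal((seq_len, d_model))

# Self-attention over the token sequence (one simplified encoder layer;
# no multi-head split, feed-forward sublayer, or residual/norm)
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
ctx = attention(X @ Wq, X @ Wk, X @ Wv)               # (seq_len, d_model)

# Cross-cultural attention: tokens attend over a small bank of
# cultural-feature vectors (hypothetical stand-in for the paper's layer)
culture_bank = rng.standard_normal((n_cult, d_model))
cult = attention(X @ Wq, culture_bank, culture_bank)  # (seq_len, d_model)

# Fuse the two representations and mean-pool over the sequence
fused = np.concatenate([ctx, cult], axis=-1).mean(axis=0)  # (2*d_model,)

# Linear classifier over proficiency levels, categorical cross-entropy loss
Wc = rng.standard_normal((2 * d_model, n_levels)) * 0.1
probs = softmax(fused @ Wc)
label = 2                          # index of the true proficiency level
loss = -np.log(probs[label])
```

In a trained model the weight matrices would be learned by minimizing this loss over the multilingual training data; here they are random, so only the shapes and data flow are meaningful.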

Anna Gustina Zainal, M. Misba, Punit Pathak, Indrajit Patra, Adapa Gopi, Yousef A.Baker El-Ebiary and Prema S. “Cross-Cultural Language Proficiency Scaling using Transformer and Attention Mechanism Hybrid Model”. International Journal of Advanced Computer Science and Applications (IJACSA) 15.6 (2024). http://dx.doi.org/10.14569/IJACSA.2024.01506116

@article{Zainal2024,
title = {Cross-Cultural Language Proficiency Scaling using Transformer and Attention Mechanism Hybrid Model},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2024.01506116},
url = {http://dx.doi.org/10.14569/IJACSA.2024.01506116},
year = {2024},
publisher = {The Science and Information Organization},
volume = {15},
number = {6},
author = {Anna Gustina Zainal and M. Misba and Punit Pathak and Indrajit Patra and Adapa Gopi and Yousef A.Baker El-Ebiary and Prema S}
}



Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

The Science and Information (SAI) Organization Limited is a company registered in England and Wales under Company Number 8933205.