The Science and Information (SAI) Organization
DOI: 10.14569/IJACSA.2023.0140478

Integrating Dropout Regularization Technique at Different Layers to Improve the Performance of Neural Networks

Author 1: B. H. Pansambal
Author 2: A. B. Nandgaokar

International Journal of Advanced Computer Science and Applications (IJACSA), Volume 14 Issue 4, 2023.


Abstract: In many facial expression recognition models it is necessary to prevent overfitting by ensuring that no units (neurons) become dependent on each other. Dropout regularization addresses this by randomly ignoring a few nodes while processing the remaining neurons. Dropout therefore helps deal with overfitting and improves prediction accuracy when applied at different layers of the neural network, such as the visible, hidden and convolutional layers. Neural networks contain layers such as dense (fully connected), convolutional and recurrent (LSTM, long short-term memory) layers, and a dropout layer can be embedded alongside any of them. The model drops units from the network at random, meaning it removes their connections to other units. Many researchers regard dropout regularization as one of the most powerful techniques in machine learning and deep learning. Randomly dropping a few units and processing the remaining ones can be viewed in two phases, the forward and backward passes. Once the model drops a few units at random and processes the remaining ones, the weights of the surviving units change during training; it must be noted that these weight updates do not reach the dropped units. Dropping some units while others step in works well because the units that step in represent the whole network; these units have the greatest chance of low mutual dependency, so the model gives better results with higher accuracy.

Keywords: Convolutional layer; visible layer; hidden layer; dropout regularization; long short-term memory (LSTM); deep learning; facial expression recognition
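The dropout mechanism the abstract describes — randomly zeroing a few units in the forward pass and blocking weight updates to those units in the backward pass — can be sketched in a few lines of NumPy. This is a generic illustration of inverted dropout, not the authors' implementation; the function names and the 0.5 rate are chosen here for illustration only.

```python
import numpy as np

def dropout_forward(x, rate, rng, training=True):
    """Inverted dropout: randomly zero a fraction `rate` of the units and
    scale the survivors by 1/(1 - rate), so the expected activation is
    unchanged between training and inference."""
    if not training or rate == 0.0:
        return x, None  # at inference time all units participate
    keep = (rng.random(x.shape) >= rate).astype(x.dtype)
    mask = keep / (1.0 - rate)
    return x * mask, mask

def dropout_backward(grad_out, mask):
    """Gradients flow only through the surviving units; dropped units get
    zero gradient, so their incoming weights receive no update this step."""
    if mask is None:
        return grad_out
    return grad_out * mask
```

Applied after a hidden (or convolutional) layer's activations, the mask from the forward pass is reused in the backward pass, which is exactly why the updated weights "do not reflect on the dropped units".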

B. H. Pansambal and A. B. Nandgaokar, “Integrating Dropout Regularization Technique at Different Layers to Improve the Performance of Neural Networks,” International Journal of Advanced Computer Science and Applications (IJACSA), 14(4), 2023. http://dx.doi.org/10.14569/IJACSA.2023.0140478

@article{Pansambal2023,
title = {Integrating Dropout Regularization Technique at Different Layers to Improve the Performance of Neural Networks},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2023.0140478},
url = {http://dx.doi.org/10.14569/IJACSA.2023.0140478},
year = {2023},
publisher = {The Science and Information Organization},
volume = {14},
number = {4},
author = {B. H. Pansambal and A. B. Nandgaokar}
}



Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org