Ladder Networks: Learning under Massive Label Deficit
International Journal of Advanced Computer Science and Applications (IJACSA), Volume 8 Issue 7, 2017.
Abstract: Advancements in deep unsupervised learning are finally bringing machine learning close to natural learning, which happens with as few as one labeled instance. Ladder Networks are the newest deep learning architecture to propose semi-supervised learning at scale. This work discusses how the ladder network model successfully combines supervised and unsupervised learning, taking it beyond the pre-training realm. The model learns from the structure of the data rather than from the labels alone, transforming it from a label learner into a structural observer. We extend the previously reported results by lowering the number of labels, and report an error of 1.27 using only 40 labels on the MNIST dataset, which in a fully supervised setting uses 60,000 labeled training instances.
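To make the combined objective concrete, the sketch below shows one way such a model can be trained in PyTorch: an unsupervised denoising term computed on every example, labeled or not, plus a supervised cross-entropy term added only when labels are available. This is a minimal illustration under stated assumptions, not the paper's implementation; it omits the batch normalization and layer-wise combinator function of the full ladder network, and all names and hyperparameters (TinyLadder, noise_std, the 784-500-10 layer stack) are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLadder(nn.Module):
    # Greatly simplified ladder-style network: a noisy encoder pass, a
    # clean encoder pass, and a decoder that denoises every layer.
    # (Assumption: the real model also uses batch norm and a combinator.)
    def __init__(self, sizes=(784, 500, 10), noise_std=0.3):
        super().__init__()
        self.noise_std = noise_std
        self.enc = nn.ModuleList([nn.Linear(a, b) for a, b in zip(sizes, sizes[1:])])
        self.dec = nn.ModuleList([nn.Linear(b, a) for a, b in zip(sizes, sizes[1:])])

    def encode(self, x, noisy):
        corrupt = (lambda t: t + torch.randn_like(t) * self.noise_std) if noisy else (lambda t: t)
        acts = [corrupt(x)]
        for i, layer in enumerate(self.enc):
            h = corrupt(layer(acts[-1]))
            # ReLU on hidden layers; the top layer stays linear (logits).
            acts.append(torch.relu(h) if i < len(self.enc) - 1 else h)
        return acts

    def forward(self, x, y=None):
        clean = self.encode(x, noisy=False)   # denoising targets
        noisy = self.encode(x, noisy=True)    # corrupted forward pass
        # Unsupervised term: reconstruct each clean layer from the noisy
        # top layer; this uses the structure of the data, no labels needed.
        loss, z = 0.0, noisy[-1]
        for layer, target in zip(reversed(self.dec), reversed(clean[:-1])):
            z = layer(z)
            loss = loss + F.mse_loss(z, target)
        if y is not None:
            # Supervised term, added only for the few labeled examples.
            loss = loss + F.cross_entropy(noisy[-1], y)
        return loss

# Usage sketch: a large unlabeled batch contributes only the denoising
# loss, while a small labeled batch contributes both terms.
model = TinyLadder()
x_unlab = torch.rand(32, 784)
x_lab, y = torch.rand(8, 784), torch.randint(0, 10, (8,))
total = model(x_unlab) + model(x_lab, y)
total.backward()

Because the reconstruction term applies to every example, the few labeled instances only have to anchor the class identities, which is what lets the label count drop as far as the abstract describes.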
Behroz Mirza, Tahir Syed, Jamshed Memon and Yameen Malik, “Ladder Networks: Learning under Massive Label Deficit,” International Journal of Advanced Computer Science and Applications (IJACSA), 8(7), 2017. http://dx.doi.org/10.14569/IJACSA.2017.080769
@article{Mirza2017,
title = {Ladder Networks: Learning under Massive Label Deficit},
journal = {International Journal of Advanced Computer Science and Applications},
doi = {10.14569/IJACSA.2017.080769},
url = {http://dx.doi.org/10.14569/IJACSA.2017.080769},
year = {2017},
publisher = {The Science and Information Organization},
volume = {8},
number = {7},
author = {Behroz Mirza and Tahir Syed and Jamshed Memon and Yameen Malik}
}
Copyright Statement: This is an open access article licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.