The Science and Information (SAI) Organization
IJACSA Volume 1 Issue 3

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: Comparative Study of Gaussian Mixture Model and Radial Basis Function for Voice Recognition

Abstract: A comparative study of the application of the Gaussian Mixture Model (GMM) and the Radial Basis Function (RBF) network to biometric voice recognition has been carried out and presented. The application of machine learning techniques to biometric authentication and recognition problems has gained widespread acceptance. In this research, a GMM was trained, using the Expectation Maximization (EM) algorithm, on a dataset containing 10 classes of vowels, and the model was used to predict the appropriate classes on a validation dataset. For experimental validity, the model was compared to the performance of two different versions of the RBF model using the same learning and validation datasets. The results showed very close recognition accuracy between the GMM and the standard RBF model, with the GMM performing better than the standard RBF by less than 1%, and both models outperformed similar models reported in the literature. The DTREG version of RBF outperformed the other two models, producing 94.8% recognition accuracy. In terms of recognition time, the standard RBF was found to be the fastest of the three models.

Author 1: Fatai Adesina Anifowose

Keywords: Gaussian Mixture Model, Radial Basis Function, Artificial Intelligence, Computational Intelligence, Biometrics, Optimal Parameters, Voice Pattern Recognition, DTREG

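The EM training of a GMM that the abstract describes can be sketched in miniature. This is not the authors' code: the two-component, 1-D setup, the synthetic data, and all initial parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data from two Gaussians (a stand-in for vowel features).
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])

# EM for a two-component 1-D GMM.
w = np.array([0.5, 0.5])          # mixture weights
mu = np.array([-1.0, 1.0])        # initial component means
var = np.array([1.0, 1.0])        # initial component variances

for _ in range(100):
    # E-step: responsibility of each component for each point
    dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances from weighted data
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

mu_sorted = np.sort(mu)           # recovered means, close to 0 and 5
```

For classification, one such mixture would be fit per vowel class and a validation sample assigned to the class whose mixture gives it the highest likelihood.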

Paper 2: Multiphase Scalable Grid Scheduler Based on Multi-QoS Using Min-Min Heuristic

Abstract: In scheduling, the main factor that affects searching speed and mapping performance is the number of resources, i.e., the size of the search space. In grid computing, scheduler performance plays an essential role in overall performance, so there is an obvious need for a scalable scheduler that can manage growth in resources. Under the assumption that each resource has its own specifications and each job has its own requirements, searching the whole search space (all the resources) can waste plenty of scheduling time. In this paper, we propose a two-phase scheduler that uses the min-min algorithm to speed up mapping time with almost the same efficiency. The scheduler is also based on the assumption that the resources in grid computing can be classified into clusters. The scheduler first schedules jobs to a suitable cluster (the first phase), and then each cluster schedules its incoming jobs to suitable resources (the second phase). The scheduler is based on multidimensional QoS to enhance the mapping as much as possible. The simulation results show that the two-phase strategy can support a scalable scheduler.

Author 1: Nawfal A Mehdi
Author 2: Ali Mamat
Author 3: Hamidah Ibrahim
Author 4: Shamala A/P K

Keywords: Multi-phase; QoS; Grid Scheduling

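A minimal sketch of the min-min heuristic the abstract builds on, independent of the authors' two-phase, QoS-aware design; the expected-time-to-compute (ETC) matrix interface is an assumption:

```python
# Min-min heuristic: among all unmapped tasks, pick the (task, resource) pair
# with the minimum completion time, map it, then repeat until all are mapped.
def min_min(etc):
    """etc[i][j] = expected time to compute task i on resource j."""
    n_res = len(etc[0])
    ready = [0.0] * n_res                    # time each resource becomes free
    unmapped = set(range(len(etc)))
    schedule = {}
    while unmapped:
        best = None                          # (completion_time, task, resource)
        for t in sorted(unmapped):
            for r in range(n_res):
                ct = ready[r] + etc[t][r]
                if best is None or ct < best[0]:
                    best = (ct, t, r)
        ct, t, r = best
        schedule[t] = r
        ready[r] = ct
        unmapped.remove(t)
    return schedule, max(ready)              # task->resource map and makespan

schedule, makespan = min_min([[2, 4], [3, 1], [4, 5]])
```

In the paper's two-phase setting, the same heuristic would run once over clusters and then again over the resources inside the chosen cluster, shrinking each search space.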

Paper 3: Loss Reduction in Distribution System Using Fuzzy Techniques

Abstract: In this paper, a novel approach using approximate reasoning is used to determine suitable candidate nodes in a distribution system for capacitor placement. Voltage and power loss reduction indices of distribution system nodes are modeled by fuzzy membership functions. A fuzzy expert system (FES) containing a set of heuristic rules is then used to determine the capacitor placement suitability of each node in the distribution system. Capacitors are placed on the nodes with the highest suitability. A new design methodology for determining the size, location, type and number of capacitors to be placed on a radial distribution system is presented. The objective is to minimize the peak power losses and the energy losses in the distribution system while considering the capacitor cost. Test results are presented along with a discussion of the algorithm.

Author 1: Sheeraz kirmani
Author 2: Md. Farrukh Rahman
Author 3: Chakresh Kumar

Keywords: Capacitor placement, Distribution systems, Fuzzy expert system.

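The fuzzification and rule evaluation described above can be illustrated with a toy example. The triangular membership shapes, their breakpoints, and the single rule below are hypothetical, not the paper's actual rule base:

```python
# Triangular membership function used to fuzzify a crisp node measurement.
def tri(x, a, b, c):
    """Membership rises linearly from a to the peak at b, then falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical heuristic rule: a node is a good capacitor site when its
# voltage is "low" AND its loss-reduction index is "high"; min() is the
# standard fuzzy AND operator.
def suitability(voltage_pu, loss_index):
    low_voltage = tri(voltage_pu, 0.85, 0.92, 1.0)
    high_loss = tri(loss_index, 0.4, 1.0, 1.6)
    return min(low_voltage, high_loss)
```

An FES would evaluate many such rules per node and place capacitors at the nodes with the largest aggregated suitability.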

Paper 4: A Threat Risk Modeling Framework for the Geospatial Weather Information System (GWIS): a DREAD-based Study

Abstract: Over the years, the focus has been on protecting networks, hosts, databases and standard applications from internal and external threats. The Rapid Application Development (RAD) process makes the web application development cycle extremely short and makes it difficult to eliminate vulnerabilities. Here we study a web application risk assessment technique called threat risk modeling to improve the security of the application. We implement our proposed mechanism, application risk assessment, using Microsoft's DREAD threat risk model to evaluate the application's security risk against vulnerability parameters. The study led to quantifying different levels of risk for the Geospatial Weather Information System (GWIS) using the DREAD model.

Author 1: K Ram Mohan Rao
Author 2: Durgesh Pant

Keywords: Rapid Application Development, Risk rating, Security assessment.

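DREAD quantification of the kind the abstract applies can be sketched as follows. The 1-3 rating per factor is one common convention (other scales, such as 1-10, are also used), and the bucket thresholds here are illustrative assumptions:

```python
# DREAD scores a threat on five factors; here each factor is rated 1 (low)
# to 3 (high) and the overall risk is the mean, bucketed into a level.
DREAD_FACTORS = ("damage", "reproducibility", "exploitability",
                 "affected_users", "discoverability")

def dread_risk(ratings):
    score = sum(ratings[f] for f in DREAD_FACTORS) / len(DREAD_FACTORS)
    level = "high" if score >= 2.5 else "medium" if score >= 1.5 else "low"
    return score, level

score, level = dread_risk({"damage": 3, "reproducibility": 2,
                           "exploitability": 2, "affected_users": 1,
                           "discoverability": 1})
```

Repeating this scoring per identified vulnerability is what yields the "different levels of risk" the study reports for GWIS.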

Paper 5: Council-based Distributed Key Management Scheme for MANETs

Abstract: Mobile ad hoc networks (MANETs) have been proposed as an extremely flexible technology for establishing wireless communications. In comparison with fixed networks, some new security issues have arisen with the introduction of MANETs. Secure routing, in particular, is an important and complicated issue. Clustering is commonly used in order to limit the amount of secure routing information. In this work, we propose an enhanced solution for ad hoc key management based on a clustered architecture. This solution uses clusters as a framework to manage cryptographic keys in a distributed way. This paper sheds light on the key management algorithm for the OLSR protocol standard. Our algorithm takes into account node mobility and yields major improvements in the number of elected cluster heads that form a PKI council. Our objective is to distribute the certification authority functions over a reduced set of less mobile cluster heads that will serve for key exchange.

Author 1: Abdelmajid HAJAMI
Author 2: Mohammed ELKOUTBI

Keywords: Key Management; MANET; Clustering


Paper 6: Improved Spectrogram Analysis for ECG Signal in Emergency Medical Applications

Abstract: This paper presents the spectrogram analysis of biomedical signals, especially the ECG. A simulation module was developed for the spectrogram implementation. Spectrograms of the ECG signal and its power spectral density, together with off-line evaluation, have been observed. The ECG contains very important clinical information about the cardiac activity of the heart. Small variations in the ECG signal with time-varying morphological characteristics need to be extracted by signal processing methods because they are not visible in the graphical ECG signal. Small variations of simulated normal and noise-corrupted ECG signals have been extracted using the spectrogram. The spectrogram was found to be more precise than the conventional FFT in finding small abnormalities in the ECG signal. Spectrograms form time-frequency representations for processing time-varying signals. Using the presented method, it is ensured that high-resolution time-varying spectrum estimation with no lag error can be produced. Another benefit of the method is the straightforward procedure for evaluating the statistics of the spectrum estimate.

Author 1: A.K.M Fazlul Haque
Author 2: Md. Hanif Ali
Author 3: M Adnan Kiber

Keywords: Spectrogram, ECG, PSD, Periodogram, Time-varying signal, FFT.

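The core of any spectrogram is a short-time Fourier transform: window the signal, FFT each frame, and stack the frame spectra over time. A minimal numpy sketch, with window length, hop size, and the test tone chosen purely for illustration:

```python
import numpy as np

# Short-time Fourier transform: slide a window over the signal, FFT each frame.
def spectrogram(signal, win_len=64, hop=32):
    window = np.hanning(win_len)
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = signal[start:start + win_len] * window
        frames.append(np.abs(np.fft.rfft(seg)) ** 2)   # power per frequency bin
    return np.array(frames)                            # (n_frames, win_len//2 + 1)

fs = 256
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 8 * t)     # 8 Hz test tone (stand-in for an ECG component)
spec = spectrogram(sig)             # energy concentrates in bin 2 (= 8 Hz here)
```

Unlike a single FFT over the whole record, each row of `spec` is localized in time, which is what lets small transient abnormalities show up.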

Paper 7: High Quality Integrated Data Reconstruction for Medical Applications

Abstract: In this paper, the implementation of a high quality integrated data reconstruction model and algorithm is proposed, especially for medical applications. Patients' information is acquired at the sending end and reconstructed at the receiving end using a technique that provides high quality in the signal reconstruction process. A method is proposed in which data such as ECG, audio and other patients' vital parameters are acquired in the time domain and operated on in the frequency domain. The data are then reconstructed in the time domain from the frequency domain where high quality data are required. In this particular case, high quality ensures a distortionless and noiseless recovered baseband signal. This requires the application of the Fast Fourier Transform (FFT) and the Inverse Fast Fourier Transform (IFFT) to return the data to the spatial domain. The simulation is performed using Matlab. The composite baseband signal has been generated by developing a program as well as by acquiring it into the workspace. The feature of the method is that it can achieve high-quality integrated data reconstruction and can be associated easily with the spatial domain.

Author 1: A.K.M Fazlul Haque
Author 2: Md. Hanif Ali
Author 3: M Adnan Kiber

Keywords: FFT, IFFT, ECG, Baseband, Reconstruction, Noise, FDA tool.

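The FFT/IFFT round trip at the heart of the method is easy to demonstrate. The paper uses Matlab; this is an equivalent numpy sketch, and the composite two-tone "ECG-like" signal is an illustrative assumption:

```python
import numpy as np

# Frequency-domain round trip: FFT the composite baseband signal (it could be
# processed there), then IFFT back to the time domain with negligible error.
fs = 500
t = np.arange(fs) / fs
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)

spectrum = np.fft.fft(ecg_like)          # time -> frequency domain
restored = np.fft.ifft(spectrum).real    # frequency -> time domain

max_err = np.max(np.abs(restored - ecg_like))   # round-trip error, ~1e-15
```

The near-zero `max_err` is what "distortionless" recovery means in practice: any distortion comes from processing applied in between, not from the transform pair itself.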

Paper 8: An Electronic Design of a Low Cost BRAILLE HANDGLOVE

Abstract: This paper documents a new design for a Braille hand glove comprising mainly electronic components. The design aims to produce a product that performs vibrations at six positions on a blind person's right hand. A low cost and robust design will provide the blind with an affordable and reliable tool, and it introduces a new technique and communication method for blind persons.

Author 1: M Rajasenathipathi
Author 2: M.Arthanari
Author 3: M.Sivakumar

Keywords: Braille, cell, vibration, dots, motor


Paper 9: Test-Bed for Emergency Management Simulations

Abstract: We present a test-bed for Emergency Management Simulations by contrasting two prototypes we have built, CAVIAR and Reverse 111. We outline the desirable design principles that guide our choices for simulating emergencies and implement these ideas in a modular system, which utilizes proactive crowd-sourcing to enable emergency response centers to contact civilians co-located with an emergency and obtain more information about the events. This aspect of proactive crowd-sourcing enables emergency response centers to take into account that an emergency situation is inherently dynamic and that initial assumptions made while deploying resources may not hold as the emergency unfolds. A number of independent entities, governmental and non-governmental, are known to interact while mitigating emergencies. Our test-bed utilizes a number of agents to simulate various resource sharing policies amongst different administrative domains and non-profit civilian organizations that might pool their resources at the time of an emergency. A common problem amongst first responders is the lack of interoperability amongst their devices. In our test-bed, we integrate live caller data obtained from traces generated by Telecom New Zealand, which tracks cell-phone users and their voice and data calls across the network, to identify co-located crowds. The test-bed has five important components, including means to select and simulate Events, Resources and Crowds, and additionally provides a visual interface as part of a massive online multi-player game to simulate emergencies in any part of the world. We also present our initial evaluation of some resource sharing policies in the intelligent agents that are part of our test-bed.

Author 1: Anu Vaidyanathan

Keywords: test-bed, Emergency Management, Live Call Records, PCMD, Proactive Crowd-Sourcing, Agents


Paper 10: Emerging Trends of Ubiquitous Computing

Abstract: Ubiquitous computing is a method of enhancing computer use by making many computers available throughout the physical environment while making them effectively invisible to the user. The background network that supports ubiquitous computing is the ubiquitous network, through which users can enjoy network services whenever and wherever they want (home, office, outdoors). In this paper, issues related to the ubiquitous network, smart objects and wide area ubiquitous networks are discussed. We also discuss various elements used in ubiquitous computing along with the challenges of this computing environment.

Author 1: Prakriti Trivedi
Author 2: Kamal Kishore Sagar
Author 3: Vernon

Keywords: Braille, cell, vibration, dots, motor


Paper 11: Modelling and Analysing of Software Defect Prevention Using ODC

Abstract: As time passes, software complexity increases, and this affects software reliability and quality. To measure software reliability and quality, various defect measurement and defect tracing mechanisms are used. Software defect prevention work typically focuses on individual inspection and testing techniques. ODC is a mechanism by which we exploit software defects that occur during the software development life cycle. Orthogonal Defect Classification is a concept which enables developers, quality managers and project managers to evaluate the effectiveness and correctness of the software.

Author 1: Prakriti Trivedi
Author 2: Som Pachori

Keywords: Defect Prevention, ODC, Defect Trigger

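The core of ODC is labelling each defect with orthogonal attributes (such as defect type and trigger) and analysing the resulting distributions. A toy sketch; the sample records and the specific label values are illustrative assumptions:

```python
from collections import Counter

# ODC sketch: each defect gets exactly one orthogonal defect-type label and
# one trigger; skews in the distribution point at process weaknesses.
defects = [
    {"id": 1, "type": "function",   "trigger": "design conformance"},
    {"id": 2, "type": "assignment", "trigger": "unit test"},
    {"id": 3, "type": "assignment", "trigger": "unit test"},
    {"id": 4, "type": "interface",  "trigger": "system test"},
]

by_type = Counter(d["type"] for d in defects)        # e.g. many "assignment"
by_trigger = Counter(d["trigger"] for d in defects)  # which activities find bugs
```

A cluster of "assignment" defects found only at unit test, for example, would suggest strengthening code inspection earlier in the life cycle.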

Paper 12: Enhanced Segmentation Procedure for Intima-Adventitial Layers of Common Carotid Artery

Abstract: This paper presents an enhanced segmentation technique for use on noisy B-mode ultrasound images of the carotid artery. The method is based on image enhancement, edge detection and morphological operations for boundary detection. This procedure may simplify the practitioner's job of analyzing the accuracy and variability of segmentation results. Possible plaque regions are also highlighted. A thorough evaluation of the method in the clinical environment shows that inter-observer variability is evidently decreased, and so is the overall analysis time. The results demonstrate that it has the potential to perform qualitatively better than existing methods for intima and adventitial layer detection on B-mode images.

Author 1: V Savithri
Author 2: S.Purushothaman

Keywords: Artery, boundary detection, imaging, Ultrasonic, parallel programming

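The edge detection stage of such a pipeline can be sketched with a plain Sobel operator and a threshold; this is a generic illustration, not the paper's enhanced procedure, and the synthetic step-edge image is an assumption:

```python
import numpy as np

# Sobel edge detection followed by a binary threshold -- the kind of edge map
# that morphological operations would then clean up for boundary detection.
def sobel_edges(img, thresh=1.0):
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal gradient
    ky = kx.T                                                    # vertical gradient
    h, w = img.shape
    mag = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            mag[i, j] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return mag > thresh

img = np.zeros((8, 8))
img[:, 4:] = 1.0            # vertical step edge, like a bright/dark boundary
edges = sobel_edges(img)    # True only along the two columns flanking the step
```

On real B-mode images the enhancement step would precede this, and morphological closing/thinning would turn the thresholded map into continuous intima and adventitia contours.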

Paper 13: Application of Locality Preserving Projections in Face Recognition

Abstract: Face recognition technology has evolved as an appealing solution to address contemporary needs in the identification and verification of identity claims. By advancing the feature extraction methods and dimensionality reduction techniques of pattern recognition, a number of face recognition systems have been developed with distinct degrees of success. Locality preserving projection (LPP) is a recently proposed method for unsupervised linear dimensionality reduction. LPP preserves the local structure of the face image space, which is usually more significant than the global structure preserved by principal component analysis (PCA) and linear discriminant analysis (LDA). This paper focuses on a systematic analysis of locality preserving projections and the application of LPP in combination with an existing technique. This combined approach of LPP through MPCA can preserve both the global and the local structure of the face image, which proves very effective. The proposed approach is tested using the AT & T face database. Experimental results show significant improvements in face recognition performance in comparison with some previous methods.

Author 1: Shermina J

Keywords: Defect Prevention, ODC, Defect Trigger

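Basic linear LPP, the building block the paper combines with MPCA, can be sketched in numpy. This is the textbook formulation under a full-graph heat-kernel assumption (real systems use k-nearest-neighbour graphs), not the paper's combined method:

```python
import numpy as np

# LPP: build a heat-kernel affinity graph, form the graph Laplacian L = D - W,
# and solve the generalized eigenproblem  X^T L X a = lambda X^T D X a;
# eigenvectors with the SMALLEST eigenvalues preserve local structure.
def lpp(X, dim=2, t=1.0, eps=1e-6):
    """X: (n_samples, n_features); returns an (n_features, dim) projection."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / t)                          # pairwise affinities (full graph)
    D = np.diag(W.sum(axis=1))
    L = D - W                                    # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + eps * np.eye(X.shape[1])   # regularized for stability
    C = np.linalg.cholesky(B)                    # whiten: B = C C^T
    Cinv = np.linalg.inv(C)
    vals, vecs = np.linalg.eigh(Cinv @ A @ Cinv.T)
    return Cinv.T @ vecs[:, :dim]

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))    # 40 toy "images" with 5 features each
P = lpp(X)
Y = X @ P                       # 2-D embedding of the 40 samples
```

In a face pipeline, `X` would hold vectorized face images (typically after a PCA-style step, as in the paper's LPP-through-MPCA combination) and recognition would run a nearest-neighbour match in the embedded space `Y`.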

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org