IJACSA Volume 6 Issue 3

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: Skew Detection/Correction and Local Minima/Maxima Techniques for Extracting a New Arabic Benchmark Database

Abstract: We propose a set of techniques for extracting a new standard benchmark database for Arabic handwritten scripts. Thresholding, filtering, and skew detection/correction techniques are developed as a pre-processing step for the database. Local minima and maxima of horizontal and vertical histograms are used to extract the script elements of the database, which include pages, paragraphs, lines, and characters. The database is divided into two major parts: the first contains the original elements without modification, and the second contains the elements after applying the proposed techniques. The final database has been collected, extracted, validated, and saved, and all techniques are tested for extracting and validating its elements. In this respect, ACDAR proposes a first release of the Arabic benchmark databases. In addition, the paper announces the establishment of a specialized research-oriented center for learning, teaching, and collaboration activities, the "Arabic Center for Document Analysis and Recognition (ACDAR)", similar to centers developed for other languages such as English.
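
The projection-profile step described above is the key extraction mechanism. A minimal sketch of horizontal-projection line segmentation, assuming a binarized page where foreground pixels are 1; this illustrates the general technique and is not the authors' code:

```python
import numpy as np

def extract_lines(binary_page, min_height=2):
    """Split a binarized page (2-D array of 0/1, foreground = 1) into text lines
    by locating local minima (near-empty rows) of the horizontal projection."""
    profile = binary_page.sum(axis=1)           # ink count per row
    blank = profile <= profile.max() * 0.05     # rows with almost no ink
    lines, start = [], None
    for y, is_blank in enumerate(blank):
        if not is_blank and start is None:
            start = y                           # a text line begins
        elif is_blank and start is not None:
            lines.append((start, y))            # a text line ends at a local minimum
            start = None
    if start is not None:
        lines.append((start, len(blank)))
    return [binary_page[a:b, :] for a, b in lines if b - a > min_height]
```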

Author 1: Husam Ahmed Al Hamad

Keywords: ACDAR; Arabic benchmark database; Arabic scripts; document analysis; handwriting recognition; skew detection and correction

PDF

Paper 2: Android Application to Assess Smartphone Accelerometers and Bluetooth for Real-Time Control

Abstract: Modern smartphones have evolved into sophisticated embedded systems, incorporating hardware and software features that make the devices potentially useful for real-time control operations. An object-oriented Android application was developed to quantify the performance of the smartphone's on-board linear accelerometers and Bluetooth wireless module, with a view to transmitting accelerometer data wirelessly between Bluetooth-enabled devices. A portable Bluetooth library was developed which runs the Bluetooth functionality of the application as an independent background service. Bluetooth performance was tested by pinging data between two smartphones, measuring round-trip time and round-trip-time variation (jitter) against variations in data size, transmission distance, and sources of interference. The accelerometers were tested for sampling frequency and sampling-frequency jitter.
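
For orientation, a minimal sketch of how round-trip time and jitter could be computed from matched ping timestamps; the helper name and sample values are illustrative and are not part of the application described above:

```python
from statistics import mean, stdev

def rtt_stats(send_times, recv_times):
    """Round-trip time (RTT) and jitter from matched send/receive timestamps (seconds)."""
    rtts = [r - s for s, r in zip(send_times, recv_times)]
    jitter = [abs(b - a) for a, b in zip(rtts, rtts[1:])]   # RTT variation between consecutive pings
    return {
        "mean_rtt_ms": 1000 * mean(rtts),
        "rtt_stdev_ms": 1000 * stdev(rtts) if len(rtts) > 1 else 0.0,
        "mean_jitter_ms": 1000 * mean(jitter) if jitter else 0.0,
    }

# Example with hypothetical measurements
sends = [0.000, 0.100, 0.200, 0.300]
recvs = [0.021, 0.119, 0.225, 0.318]
print(rtt_stats(sends, recvs))
```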

Author 1: M. A. Nugent
Author 2: Dr. Harold Esmonde

Keywords: Android; Bluetooth; control; real-time; sensors; smartphone

PDF

Paper 3: Design of a Cloud Learning System Based on Multi-Agents Approach

Abstract: Cloud computing can provide many benefits for universities. It is a new IT paradigm that provides all resources, such as software (SaaS), platform (PaaS), and infrastructure (IaaS), as services over the Internet. In cloud computing, users can access services anywhere, at any time, and from any device (smartphones, tablet computers, laptops, desktops, and so on). The multi-agent system approach provides an ideal solution for open and scalable systems whose structure can change dynamically. Educational institutions all over the world have already adapted the cloud to their own settings and made use of its great potential for innovation. Based on an analysis of the advantages of cloud computing and of the multi-agent system approach for supporting e-learning sessions, the paper presents the complete design and experimentation of a new layer in cloud computing called the Smart Cloud Learning System.

Author 1: Mohammed BOUSMAH
Author 2: Ouidad LABOUIDYA
Author 3: Najib EL KAMOUN

Keywords: Cloud computing; Multi-Agents System; Project Based Learning

PDF

Paper 4: Standard Positioning Performance Evaluation of a Single-Frequency GPS Receiver Implementing Ionospheric and Tropospheric Error Corrections

Abstract: This paper evaluates the positioning performance of a single-frequency software GPS receiver using ionospheric and tropospheric corrections. While a dual-frequency user can eliminate the ionospheric error by taking a linear combination of observables, a single-frequency user must remove or calibrate this error by other means. To remove the ionospheric error we take advantage of the Klobuchar correction model, while for tropospheric error mitigation the Hopfield correction model is used. Real GPS measurements were gathered using a single-frequency receiver and post-processed by our proposed adaptive positioning algorithm. The integrated Klobuchar and Hopfield error correction models yield a considerable reduction of the vertical error. The positioning algorithm automatically combines all available GPS pseudorange measurements when more than four satellites are in use. Experimental results show that improved standard positioning is achieved after error mitigation.
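
A minimal sketch of how such model-based delays are typically removed from a single-frequency pseudorange before the position solution; the sign convention is the usual one, but the function name and values are illustrative and the paper's Klobuchar/Hopfield implementations are not reproduced here:

```python
C = 299_792_458.0  # speed of light, m/s

def corrected_pseudorange(raw_pr_m, sat_clock_bias_s, iono_delay_s, tropo_delay_m):
    """Apply satellite clock, ionospheric (Klobuchar-style, in seconds) and
    tropospheric (Hopfield-style, in metres) corrections to a raw pseudorange."""
    return raw_pr_m + C * sat_clock_bias_s - C * iono_delay_s - tropo_delay_m

# Hypothetical values: 10 microsecond clock bias, 40 ns ionospheric delay, 2.4 m tropospheric delay
print(corrected_pseudorange(22_000_000.0, 1.0e-5, 40e-9, 2.4))
```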

Author 1: Alban Rakipi
Author 2: Bexhet Kamo
Author 3: Shkelzen Cakaj
Author 4: Algenti Lala

Keywords: algorithm; Global Positioning System; GDOP; Hopfield model; Klobuchar model; receiver; PVT; Raw measurements

PDF

Paper 5: Steganography: Applying and Evaluating Two Algorithms for Embedding Audio Data in an Image

Abstract: Information transmission is increasing with the growth of Web usage, so information security has become very important. Securing data and information is a major task for scientists as well as for political and military organizations. One of the most secure methods is embedding data (steganography) in different media such as text, audio, and digital images. This paper presents two experiments in steganography of digital audio data files. It empirically applies two algorithms for steganography in images through random insertion of digital audio data, using bytes and pixels in image files. Finally, it evaluates both experiments in order to enhance the security of transmitted data.
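
For context, a minimal sketch of the common least-significant-bit (LSB) baseline for hiding audio bytes inside image pixel bytes; this generic scheme is only an illustration and does not reproduce the paper's two random-insertion algorithms:

```python
def embed_lsb(pixels, audio_bytes):
    """Hide audio_bytes in the least significant bits of a flat list of pixel bytes.
    Requires len(pixels) >= 8 * len(audio_bytes)."""
    bits = [(byte >> i) & 1 for byte in audio_bytes for i in range(8)]
    stego = list(pixels)
    for idx, bit in enumerate(bits):
        stego[idx] = (stego[idx] & 0xFE) | bit      # overwrite the lowest bit
    return stego

def extract_lsb(stego, n_bytes):
    """Recover n_bytes of hidden audio from the stego pixel bytes."""
    out = []
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (stego[8 * b + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = [200, 13, 54, 91, 7, 66, 128, 255, 34, 90, 11, 250, 77, 8, 19, 100]
print(extract_lsb(embed_lsb(cover, b"\x5A\x3C"), 2))   # -> b'Z<'
```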

Author 1: Khaled Nasser ElSayed

Keywords: Steganography; Encryption and Decryption; Data and Information Security; Data Hiding; Images; Data Communication

PDF

Paper 6: A Minimum Number of Features with Full-Accuracy Iris Recognition

Abstract: A minimum number of features for 100% iris recognition accuracy is developed in this paper. This number is based on dividing the unwrapped iris into vertical and horizontal segments for single-iris recognition, and into vertical segments only for dual-iris recognition. In both cases a simple technique that takes the mean of a segment as a feature is adopted. Algorithms and flowcharts for finding the minimum Euclidean distance (ED) between a test iris and a matching database (DB) iris are discussed. A threshold is selected to discriminate between a genuine acceptance (recognition) and a false acceptance of an imposter. The minimum number of features is found to be 47 for single-iris and 52 for dual-iris recognition. Comparison with recently published techniques shows the superiority of the proposed technique regarding accuracy and recognition speed. Results were obtained using the Phoenix (UPOL) database.
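
A minimal sketch of the matching pipeline the abstract outlines: segment means as features, Euclidean distance to database templates, and a threshold separating genuine from false acceptance. Array shapes and the threshold value are illustrative, not the paper's:

```python
import numpy as np

def segment_mean_features(unwrapped_iris, n_segments):
    """Split the unwrapped iris (2-D array) into vertical segments and use each segment's mean as a feature."""
    segments = np.array_split(unwrapped_iris, n_segments, axis=1)
    return np.array([seg.mean() for seg in segments])

def match(test_features, db_features, threshold):
    """Return the index of the closest database template if within threshold, else None (rejection)."""
    dists = np.linalg.norm(db_features - test_features, axis=1)   # Euclidean distances
    best = int(np.argmin(dists))
    return best if dists[best] <= threshold else None

# Hypothetical usage: 10 enrolled irises with 47 features each
rng = np.random.default_rng(0)
db = rng.random((10, 47))
probe = db[3] + rng.normal(0, 0.01, 47)    # noisy copy of template 3
print(match(probe, db, threshold=0.5))     # -> 3
```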

Author 1: Ibrahim E. Ziedan
Author 2: Mira Magdy Sobhi

Keywords: Iris recognition; Iris features; Speed of Iris recognition; Features reduction

PDF

Paper 7: Apply Metaheuristic ANGEL to Schedule Multiple Projects with Resource-Constrained and Total Tardy Cost

Abstract: In this paper the multi-project resource-constrained project scheduling problem is considered. Several projects are to be scheduled simultaneously while sharing several kinds of limited resources. Each project contains non-preemptive activities of deterministic duration which compete for the limited resources under resource and precedence constraints. Moreover, each project has a due date and a tardy cost per day, which incurs a penalty when the project cannot be completed by its due date. The objective is to find schedules for the considered projects that minimize the total tardy cost subject to the resource and precedence constraints. Since the resource-constrained project scheduling problem has been proven to be NP-hard, no deterministic algorithm is known to solve it efficiently, and metaheuristics or evolutionary algorithms are developed instead. The problem considered in this paper is harder than the original problem because the due date and tardy cost of each project are considered in addition. The metaheuristic method called ANGEL was applied to this problem. ANGEL combines ant colony optimization (ACO), a genetic algorithm (GA), and a local search strategy; in ANGEL, ACO and the GA run alternately and cooperatively. The experimental results reveal that ANGEL performs very well on the multi-project resource-constrained project scheduling problem.
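
A minimal sketch of the objective function described above, the total tardy cost over all projects, given each project's finish day, due date, and per-day tardy cost (the data below are hypothetical):

```python
def total_tardy_cost(projects):
    """projects: list of (finish_day, due_day, tardy_cost_per_day).
    A project finishing after its due date pays tardy_cost_per_day for each late day."""
    return sum(max(0, finish - due) * cost for finish, due, cost in projects)

# Three hypothetical projects: only the second finishes late (3 days * 50 = 150)
print(total_tardy_cost([(20, 25, 40), (33, 30, 50), (18, 18, 60)]))   # -> 150
```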

Author 1: Shih-Chieh Chen
Author 2: Ching-Chiuan Lin

Keywords: multiple project scheduling; resource-constrained project scheduling; ANGEL; ant colony optimization; genetic algorithms; local search; metaheuristics

PDF

Paper 8: Development and Role of Electronic Library in Information Technology Teaching in Bulgarian Schools*

Abstract: The electronic library can be considered an interactive information space. Its creation substantially supports communication between teachers and students, as well as between teachers and parents. Enlarging the information space makes it possible to improve the efficiency and quality of teaching and to assign more projects for realization. The main purpose of this paper is to examine the role of the electronic library in the teaching of information technologies in Bulgarian schools, providing more time for applying the learned material in order to increase the effectiveness of the educational process. We summarize and present the advantages and disadvantages of using digital libraries in teaching information technologies, together with the main features of an electronic library developed for teaching and educational subsidiary materials.

Author 1: Tsvetanka Georgieva-Trifonova
Author 2: Gabriela Chotova

Keywords: electronic library; information technology teaching; multimedia information system

PDF

Paper 9: Implementation of Binary Search Trees Via Smart Pointers

Abstract: The study of binary trees has a prominent place in the training course on DSA (Data Structures and Algorithms). Their implementation in C++, however, is traditionally difficult for students. To a large extent these difficulties are due not so much to the complexity of the algorithms as to the complexity of the language in terms of memory management with raw pointers: the programmer must consider too many details to ensure a reliable, efficient, and secure implementation. The evolution of C++ with regard to automated resource management, as well as experience in implementing linear lists with C++11/14, led to an attempt to implement binary search trees (BST) via smart pointers as well. In the present paper, the authors share their experience in this direction. Some conclusions are drawn about pedagogical aspects and the effectiveness of the new classes, compared to traditional library containers and to an implementation with built-in pointers.

Author 1: Ivaylo Donchev
Author 2: Emilia Todorova

Keywords: abstract data structures; binary search trees; C++; smart pointers; teaching and learning

PDF

Paper 10: Revised Use Case Point (Re-UCP) Model for Software Effort Estimation

Abstract: At present, the most challenging issue the software development industry encounters is inefficient management of software development budget projections. This problem has put modern software development companies in a situation where they are dealing with improper requirements engineering, ambiguous resource elicitation, and uncertain cost and effort estimation. It is indispensable for any software development company to build a counter-mechanism for the problems that lead to this chaos. An effective way to deal with the problem is to subject the whole development process to a proper and efficient estimation process, in which all resources are estimated well in advance in order to check whether the conceived project is feasible within the available resources. The basic building block in any object-oriented design is the use case diagram, which is prepared in the early stages of design once the requirements are clearly understood, and use case diagrams are considered useful for approximating estimates for software development projects. This research work gives a detailed overview of the Re-UCP (Revised Use Case Point) method of effort estimation for software projects. Re-UCP is a modified approach based on the UCP method of effort estimation. In this study, the effort of 14 projects was estimated using the Re-UCP method and the results were compared with the UCP and e-UCP models. The comparison of the 14 projects shows that Re-UCP significantly outperforms the existing UCP and e-UCP effort estimation techniques.
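
For reference, a minimal sketch of the classic Use Case Point calculation that Re-UCP revises; the weights and factor formulas below are the standard Karner values, not the revised ones proposed in the paper:

```python
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def ucp_effort(actors, use_cases, tcf_score, ecf_score, hours_per_ucp=20):
    """actors / use_cases: dicts mapping complexity class -> count.
    tcf_score / ecf_score: weighted sums of the technical and environmental factor ratings."""
    uaw = sum(ACTOR_WEIGHTS[k] * n for k, n in actors.items())         # Unadjusted Actor Weight
    uucw = sum(USE_CASE_WEIGHTS[k] * n for k, n in use_cases.items())  # Unadjusted Use Case Weight
    tcf = 0.6 + 0.01 * tcf_score        # Technical Complexity Factor
    ecf = 1.4 - 0.03 * ecf_score        # Environmental Complexity Factor
    ucp = (uaw + uucw) * tcf * ecf      # adjusted Use Case Points
    return ucp, ucp * hours_per_ucp     # UCP and estimated person-hours

# Hypothetical project
ucp, hours = ucp_effort({"simple": 2, "average": 3, "complex": 1},
                        {"simple": 4, "average": 6, "complex": 2},
                        tcf_score=34, ecf_score=18)
print(round(ucp, 1), round(hours))
```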

Author 1: Mudasir Manzoor Kirmani
Author 2: Abdul Wahid

Keywords: Use Case Point; Extended Use case point; Revised Use case Point; Software Effort Estimation

PDF

Paper 11: Bootstrap Approximation of Gibbs Measure for Finite-Range Potential in Image Analysis

Abstract: This paper presents a method for approximating a Gibbs measure through adjustment of the associated estimated potential. We use an information criterion to prove the accuracy of this approach and a bootstrap computation method to determine its explicit form. The Gibbs sampler is the tool of our simulations, taking advantage of the use of only one MCMC chain instead of the multiple chains necessary in the classical approximation. We focus on the validity of our approach for the Gibbs measure of a Markov random field with an interaction potential function and the associated uniqueness condition. Some theoretical and numerical results are given.
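
As a rough illustration of the simulation tool mentioned above, a minimal Gibbs-sampler sweep for a simple binary Markov random field with a nearest-neighbour interaction potential; this is a textbook sketch and does not reproduce the paper's bootstrap adjustment:

```python
import numpy as np

def gibbs_sweep(field, beta, rng):
    """One Gibbs-sampler sweep over a binary (+1/-1) Markov random field with
    nearest-neighbour interaction strength beta and periodic boundaries."""
    h, w = field.shape
    for i in range(h):
        for j in range(w):
            s = (field[(i - 1) % h, j] + field[(i + 1) % h, j] +
                 field[i, (j - 1) % w] + field[i, (j + 1) % w])
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))   # P(x_ij = +1 | neighbours)
            field[i, j] = 1 if rng.random() < p_plus else -1
    return field

# Hypothetical run: 50 sweeps over a 32x32 field
rng = np.random.default_rng(0)
field = rng.choice([-1, 1], size=(32, 32))
for _ in range(50):
    gibbs_sweep(field, beta=0.4, rng=rng)
print(field.mean())
```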

Author 1: Abdeslam EL MOUDDEN

Keywords: bootstrap computation; Gibbs measure; Markov Chain Monte Carlo; image analysis; parameter estimation; likelihood inference

PDF

Paper 12: Jigsopu: Square Jigsaw Puzzle Solver with Pieces of Unknown Orientation

Abstract: In this paper, we consider the square jigsaw puzzle problem, in which one is required to reassemble the complete image from a number of unordered square puzzle pieces. We focus on the special case where both the location and the orientation of each piece are unknown. We propose a new automatic solver for this problem that assumes no prior knowledge about the original image or its dimensions. We use an accelerated edge-matching-based greedy method with combined compatibility measures to provide fast performance while maintaining robust results. Complexity analysis and experimental results show that the new solver is fast and efficient.
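
A minimal sketch of the edge-matching compatibility idea such greedy solvers rely on: a sum-of-squared-differences score between abutting pixel columns, with all four rotations tried because piece orientation is unknown. This generic measure stands in for the paper's combined compatibility measures:

```python
import numpy as np

def right_edge_dissimilarity(piece_a, piece_b):
    """Sum of squared differences between piece_a's rightmost column and piece_b's
    leftmost column (pieces are H x W x 3 arrays); lower means a better fit."""
    a_edge = piece_a[:, -1, :].astype(float)
    b_edge = piece_b[:, 0, :].astype(float)
    return float(np.sum((a_edge - b_edge) ** 2))

def best_right_neighbour(piece, candidates):
    """Greedy choice: among candidate pieces in all four rotations, pick the placement
    whose edge dissimilarity with `piece` is smallest."""
    best = None
    for idx, cand in enumerate(candidates):
        for rot in range(4):                       # unknown orientation: try all rotations
            d = right_edge_dissimilarity(piece, np.rot90(cand, rot))
            if best is None or d < best[0]:
                best = (d, idx, rot)
    return best   # (dissimilarity, candidate index, rotation)
```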

Author 1: Abdullah M. Moussa

Keywords: jigsaw puzzle; image merging; edge matching; jigsaw puzzle assembly; automatic solver

PDF

Paper 13: Construction of FuzzyFind Dictionary using Golay Coding Transformation for Searching Applications

Abstract: Searching through a large volume of data is critical for companies, scientists, and search-engine applications because of its time and memory complexity. In this paper, a new technique for generating a FuzzyFind dictionary for text mining is introduced. We map the English alphabet into 23-bit vectors (or more than 23 bits by using additional FuzzyFind dictionaries), reflecting the presence or absence of particular letters. This representation preserves the closeness of word distortions in terms of the closeness of the created binary vectors, within a Hamming distance of 2 deviations. The paper describes the Golay coding transformation hash table and how it can be used with a FuzzyFind dictionary as a new technique for searching through big data. The method has linear time complexity for generating the dictionary and constant time complexity for accessing the data, while updating with new data sets takes linear time in the number of new data points. The technique is based on searching over the letters of the English alphabet, where each segment has 23 bits; it can also work with more than 23 bits and with more segments used as reference tables.
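
A minimal sketch of the letter-presence bit-vector idea the abstract describes; folding the 26 letters onto 23 bit positions is an illustrative assumption, and the Golay-code hash table itself is not reproduced:

```python
def presence_vector(word, n_bits=23):
    """23-bit vector reflecting the presence or absence of particular letters.
    The 26 letters are folded onto 23 positions here purely for illustration."""
    v = 0
    for ch in word.lower():
        if ch.isalpha():
            v |= 1 << ((ord(ch) - ord('a')) % n_bits)
    return v

def hamming(a, b):
    """Number of differing bits between two presence vectors."""
    return bin(a ^ b).count("1")

# Small distortions of a word stay within a small Hamming distance of the original
print(hamming(presence_vector("recognition"), presence_vector("recognation")))  # small
print(hamming(presence_vector("recognition"), presence_vector("algorithm")))    # larger
```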

Author 1: Kamran Kowsari
Author 2: Maryam Yammahi
Author 3: Nima Bari
Author 4: Roman Vichr
Author 5: Faisal Alsaby
Author 6: Simon Y. Berkovich

Keywords: FuzzyFind Dictionary; Golay Code; Golay Code Transformation Hash Table; Unsupervised learning; Fuzzy search engine; Big Data; Approximate search; Information Retrieval; Pigeonhole Principle; Learning Algorithms; Data Structure

PDF

Paper 14: An Approach to Extend WSDL-Based Data Types Specification to Enhance Web Services Understandability

Abstract: Web Services are important for integrating distributed heterogeneous applications. One of the problems facing Web Services is the difficulty for a service provider to represent the datatypes of the parameters of the operations provided by a Web service inside the Web Service Description Language (WSDL). This makes it difficult for service requesters to understand and reverse-engineer a Web service, and to decide whether it is applicable to the required task of their application. This paper introduces an approach to extending Web service datatype specifications inside WSDL in order to solve these challenges. The approach is based on adding more description to the datatypes of the provided operations' parameters and on simplifying the WSDL document in a new enriched XML Schema. The main contributions of this paper are: (1) a comprehensive study of 33 datatypes in the C# language and how they are represented inside a WSDL document; (2) a classification of these datatypes into three categories (clear, indistinguishable, and unclear); (3) an enhanced representation of the 18% of C# datatypes that are not supported by XML, through a new simple enriched XML-based schema; and (4) enhanced Web service understandability, achieved by simplifying the WSDL document through the new summarized enrichment schema.

Author 1: Fuad Alshraiedeh
Author 2: Samer Hanna
Author 3: Raed Alazaidah

Keywords: Datatypes; Understandability; Web Service

PDF

Paper 15: Modifications of Particle Swarm Optimization Techniques and Its Application on Stock Market: A Survey

Abstract: Particle Swarm Optimization (PSO) has become a popular choice for solving complex and intricate problems that are otherwise difficult to solve by traditional methods. Using the Particle Swarm Optimization technique to cope with portfolio selection problems is one of its most important applications: predicting the stocks that offer maximum profit with minimum risk, using common indicators that give buy and sell advice. This paper gives the reader the state of the art of the various modifications of PSO and studies whether each has been applied to the stock market or not.
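
For readers new to PSO, a minimal sketch of the canonical global-best velocity and position update that the surveyed modifications start from; the inertia weight and acceleration coefficients are typical textbook values, not taken from the paper:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical global-best PSO minimizing f over [-5, 5]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v                                                   # position update
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Hypothetical objective: sphere function, minimum at the origin
print(pso_minimize(lambda p: float(np.sum(p ** 2)), dim=3))
```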

Author 1: Razan A. Jamous
Author 2: Essam El. Seidy
Author 3: Assem A. Tharwat
Author 4: Bayoumi Ibrahim Bayoum

Keywords: Computational intelligence; Particle Swarm Optimization; modification; Stock Market; Portfolio Selection

PDF

Paper 16: A Survey on Top Security Threats in Cloud Computing

Abstract: Cloud computing enables the sharing of resources such as storage, networks, applications, and software through the Internet. Cloud users can lease multiple resources according to their requirements and pay only for the services they use. However, despite all the cloud's benefits, there are many security concerns related to hardware, virtualization, the network, data, and service providers that act as a significant barrier to the adoption of cloud computing in the IT industry. In this paper, we survey the top security concerns related to cloud computing. For each of these security threats we describe (i) how it can be used to exploit cloud components and its effect on cloud entities such as providers and users, and (ii) the security solutions that must be taken to prevent it. These solutions include security techniques from the existing literature as well as the best security practices that cloud administrators must follow.

Author 1: Muhammad Kazim
Author 2: Shao Ying Zhu

Keywords: Cloud computing; Data security; Network security; Cloud service provider security

PDF

Paper 17: Allocation of Roadside Units for Certificate Update in Vehicular Ad Hoc Network Environments

Abstract: The roadside unit (RSU) plays an important role in privacy preservation in VANET environments. In order to preserve the privacy of a vehicle, the issued certificate must be updated frequently via RSUs. If a certificate expires without being updated, the services for the vehicle are terminated. Therefore, deploying as many RSUs as possible ensures that certificates can be updated before they expire; however, the cost of allocating an RSU is very high. In this paper, we consider the roadside unit allocation problem such that certificates can be updated before they expire. Previous research focuses on the roadside unit placement problem in a small city, in which for any origin-destination pair the certificate is updated at most once. This paper discusses the RSU placement problem in which more than one certificate update is required. The RSU allocation problem is formulated, and its decision version is proved to be NP-complete. We propose three roadside unit placement algorithms that work well for a large city. In order to reduce the number of RSUs required for certificate updates, we also propose three backward removing methods that remove intersections found by the RSU allocation methods. Simulation results show that the proposed algorithms yield a lower number of required RSUs than a simple method, the most-driving-routes-first method. One backward removing method, the least-driving-routes-first backward removing method, was shown to further reduce the number of required RSUs.
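
A minimal sketch of the simple baseline named above, the most-driving-routes-first method, read as a greedy covering heuristic: repeatedly place an RSU at the intersection crossed by the most still-uncovered routes. This simplified version ignores the certificate-timing constraints the paper handles:

```python
def most_routes_first(routes):
    """Greedy baseline: place an RSU at the intersection crossed by the most routes
    that still lack coverage, until every route passes at least one RSU.
    routes: list of sets of intersection ids."""
    uncovered = list(range(len(routes)))
    rsus = []
    while uncovered:
        counts = {}
        for r in uncovered:
            for node in routes[r]:
                counts[node] = counts.get(node, 0) + 1
        best = max(counts, key=counts.get)            # intersection used by most uncovered routes
        rsus.append(best)
        uncovered = [r for r in uncovered if best not in routes[r]]
    return rsus

# Hypothetical driving routes through numbered intersections
print(most_routes_first([{1, 2, 3}, {2, 4}, {4, 5}, {3, 5}]))   # -> [2, 5]
```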

Author 1: Sheng-Wei Wang

Keywords: Roadside units allocation; VANET; certificate update; privacy conservation; NP-complete

PDF
