The Science and Information (SAI) Organization
IJACSA Volume 8 Issue 11

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: A Multiple-Criteria Decision Making Model for Ranking Refactoring Patterns

Abstract: The analytic network process (ANP) is capable of structuring decision problems and deriving mathematically determined judgments built on knowledge and experience. Research suggests that the ANP can be useful in software development, where complicated decisions happen routinely. In extreme programming (XP), refactoring is applied wherever the code smells bad, which can cost extra effort and time. To increase the benefits of refactoring while reducing this effort and time, the analytic network process is applied here. This paper presents an example of using the ANP to rank refactoring patterns with respect to internal code quality attributes. A case study conducted in an academic environment is presented, and its results show the benefits of using the ANP in the XP development cycle.

Author 1: Abdulmajeed Aljuhani
Author 2: Luigi Benedicenti
Author 3: Sultan Alshehri

Keywords: Analytic network process; extreme programming; refactoring practice; refactoring patterns

PDF

Paper 2: Classification of Alzheimer Disease based on Normalized Hu Moment Invariants and Multiclassifier

Abstract: Alzheimer's disease (AD) classification is of great benefit for health-care applications. AD is the most common form of dementia. This paper presents a new invariant interest point descriptor methodology for Alzheimer's disease classification. The descriptor is based on the normalized Hu Moment Invariants (NHMI). The proposed approach works on raw Magnetic Resonance Imaging (MRI) scans of Alzheimer's disease. Seven Hu moments are computed to extract image features. These moments are then normalized, yielding new, more powerful features that substantially improve the classification system's performance. The invariance of the Hu moments is what makes the algorithm robust for feature extraction. The classification process is implemented using two different classifiers, the K-Nearest Neighbors algorithm (KNN) and linear Support Vector Machines (SVM), and their performances are compared. The results are evaluated on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. The best classification accuracy is 91.4% for the KNN classifier and 100% for the SVM classifier.

Author 1: Arwa Mohammed Taqi
Author 2: Fadwa Al-Azzo
Author 3: Mariofanna Milanova

Keywords: Alzheimer disease; machine learning; Hu moment invariants; SVM; K-Nearest Neighbors (KNN) classifier

PDF
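The moment-based descriptor described in the abstract can be sketched in a few lines. This is a minimal stdlib-only illustration: it computes only the first two of the seven Hu invariants, and the log-magnitude normalization shown is a common convention, assumed here since the abstract does not spell out the paper's exact NHMI normalization.

```python
import math

def hu_first_two(img):
    """First two Hu moment invariants of a 2-D grayscale grid (list of lists)."""
    m00 = sum(v for row in img for v in row)
    xbar = sum(x * v for row in img for x, v in enumerate(row)) / m00
    ybar = sum(y * v for y, row in enumerate(img) for v in row) / m00
    def mu(p, q):  # central moment mu_pq (translation invariant)
        return sum((x - xbar) ** p * (y - ybar) ** q * v
                   for y, row in enumerate(img) for x, v in enumerate(row))
    def eta(p, q):  # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2

def log_normalize(moments):
    """Log-magnitude normalization, a common way to compress the moments' range."""
    return [-math.copysign(1, h) * math.log10(abs(h)) if h else 0.0
            for h in moments]
```

Because the moments are central and scale-normalized, the same shape at a different image position yields the same feature vector, which is what makes them usable as classifier inputs.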

Paper 3: Multi-Valued Autoencoders and Classification of Large-Scale Multi-Class Problem

Abstract: Two-layer neural networks known as autoencoders (AEs) are widely used to reduce the dimensionality of data, and AEs have been successfully employed as pre-trained layers of neural networks for classification tasks. Most existing studies have conceived real-valued AEs in real-valued neural networks. This study investigates complex- and quaternion-valued AEs for complex- and quaternion-valued neural networks. Inputs, weights, biases, and outputs in the complex-valued AE (CAE) are complex variables, whereas those in the quaternion-valued AE (QAE) are quaternions. In both methods, a split-type activation function is used in the hidden and output units. To handle images with the proposed methods, pairs of pixels are allotted to complex-valued inputs in the CAE and quartets of pixels are allotted to quaternion-valued inputs in the QAE. The proposed autoencoders are tested and their performance compared with a conventional AE on several tasks: encoding/decoding, handwritten numeral recognition, and large-scale multi-class classification. The proposed CAE and QAE proved to be good recognition methods for these tasks and significantly outperformed the conventional AE on large-scale multi-class image recognition.

Author 1: Ryusuke Hata
Author 2: M. A. H. Akhand
Author 3: Kazuyuki Murase

Keywords: Autoencoder; classification; complex-valued autoencoder; quaternion-valued autoencoder; recognition

PDF
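The two CAE-specific ingredients named in the abstract, the split-type activation and the pairing of pixels into complex inputs, can be sketched as follows. The choice of tanh as the real-valued activation and the consecutive-pixel pairing scheme are assumptions for illustration; the abstract does not fix either.

```python
import math

def split_tanh(z):
    """Split-type activation: apply tanh to real and imaginary parts independently."""
    return complex(math.tanh(z.real), math.tanh(z.imag))

def pixels_to_complex(pixels):
    """Pack consecutive pixel pairs into complex-valued inputs, as in the CAE."""
    return [complex(pixels[i], pixels[i + 1]) for i in range(0, len(pixels) - 1, 2)]
```

A quaternion-valued QAE would do the same with quartets of pixels and a four-way split activation.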

Paper 4: An Adaptive Intrusion Detection Method for Wireless Sensor Networks

Abstract: Current intrusion detection systems for Wireless Sensor Networks (WSNs), which are usually designed to detect a specific form of intrusion or are only applicable to one specific type of network structure, have apparent restrictions in facing various attacks and different network structures. To bridge this gap, based on the observation that attacks are likely to deviate from normal features and to form different shapes of aggregation in feature space, we propose a knowledge-based intrusion detection strategy (KBIDS) to detect multiple forms of attack over different network structures. In the training stage, we first used a modified unsupervised mean shift clustering algorithm to discover clusters in the network features. A discovered cluster was then classified as an anomaly if it deviated by a certain amount from the normal cluster captured at the initial stage, when no attacks could yet occur. The training data combined with a weighted support vector machine were then used to build the decision function used to flag network behaviors. The decision function is updated periodically after training by merging newly added network features, both to adapt to network variability and to achieve time efficiency. While the network is running, each node captures its status as a feature vector at a fixed interval and forwards it to the base station, on which the model is deployed and run. In this way, the model can work independently of network structure in both detection and deployment. The efficiency and adaptability of the proposed method were tested and evaluated through simulation experiments in QualNet. The simulations were conducted as a full-factorial experiment in which all combinations of three forms of attack and two types of WSN structure were tested. Results demonstrate that the detection accuracy and network structure adaptability of the proposed method outperform state-of-the-art intrusion detection methods for WSNs.

Author 1: Hongchun Qu
Author 2: Zeliang Qiu
Author 3: Xiaoming Tang
Author 4: Min Xiang
Author 5: Ping Wang

Keywords: Wireless sensor network; intrusion detection system; knowledge based detection; clustering algorithm; weighted support vector machine

PDF
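The clustering step in the training stage can be illustrated with a bare-bones mean shift on one-dimensional feature values. The paper's modified, unsupervised variant operates on full network feature vectors; this flat-kernel, one-dimensional version is only an assumption-level sketch of the mechanism (each point is repeatedly shifted to the mean of its neighbors until points pile up at cluster modes).

```python
def mean_shift_1d(points, bandwidth, iters=50):
    """Shift each point to the mean of its neighbors within `bandwidth`
    (flat kernel), then group the converged points into clusters."""
    modes = list(points)
    for _ in range(iters):
        modes = [sum(q for q in points if abs(q - m) <= bandwidth) /
                 max(1, sum(1 for q in points if abs(q - m) <= bandwidth))
                 for m in modes]
    clusters = []
    for m in modes:
        for c in clusters:
            if abs(c[0] - m) <= bandwidth / 2:  # same mode, within tolerance
                c.append(m)
                break
        else:
            clusters.append([m])
    return [sum(c) / len(c) for c in clusters]
```

In the KBIDS setting, a cluster whose mode lies far from the normal cluster's mode would be flagged as anomalous.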

Paper 5: User based Recommender Systems using Implicative Rating Measure

Abstract: This paper proposes an implicative rating measure developed on the typicality measure, together with a new recommendation model presenting the top N items to the active user. The proposed model is based on the user-based collaborative filtering approach, using the implicative intensity measure to find the nearest neighbors of the active user and the proposed measure to predict the user's ratings for items. The model is evaluated on two datasets, MovieLens and CourseRegistration, and compared with several existing models: the item-based collaborative filtering model using the Jaccard measure, the user-based collaborative filtering model using the Jaccard measure, the popular-items-based model, the latent-factor-based model, and the association-rule-based model using the confidence measure. The experimental results show that the proposed model outperforms the other five models.

Author 1: Lan Phuong Phan
Author 2: Hung Huu Huynh
Author 3: Hiep Xuan Huynh

Keywords: Implicative rating measure; recommender system; user-based collaborative filtering

PDF
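One of the baselines named in the abstract, user-based collaborative filtering with the Jaccard measure, can be sketched as follows. This shows only the baseline's neighbor-and-score mechanics over implicit (set-valued) data; the paper's own implicative rating and intensity measures are defined in the paper itself and are not reproduced here.

```python
def jaccard(a, b):
    """Jaccard similarity of two users' item sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_top_n(active, users, n=2):
    """Score items the active user has not seen by the similarity-weighted
    votes of the other users, and return the top-N item ids."""
    scores = {}
    for other in users:
        s = jaccard(active, other)
        for item in other - active:
            scores[item] = scores.get(item, 0.0) + s
    ranked = sorted(scores.items(), key=lambda kv: (-kv[1], kv[0]))
    return [item for item, _ in ranked][:n]
```

Replacing `jaccard` with the paper's implicative intensity measure, and the additive vote with its rating prediction, yields the proposed model's structure.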

Paper 6: A New Architecture for Real Time Data Stream Processing

Abstract: Processing a data stream in real time is a crucial issue for several applications; however, processing a large amount of data from different sources, such as sensor networks, web traffic, social media, and video streams, represents a huge challenge. The main problem is that big data systems are based on Hadoop technology, especially MapReduce, for processing. The latter is a highly scalable and fault-tolerant framework; it processes large amounts of data in batches and provides deep insight into older data, but it can only process a limited set of data. MapReduce is not appropriate for real-time stream processing, yet it is very important to process data the moment they arrive in order to obtain a fast response and good decision making. Hence the need for a new architecture that allows real-time data processing at high speed and with low latency. The major aim of this paper is to give a clear survey of the different open-source technologies for real-time data stream processing, including their system architectures. We also provide a new architecture, based on a comparison of existing real-time processing systems, powered by machine learning and Storm technology.

Author 1: Soumaya Ounacer
Author 2: Mohamed Amine TALHAOUI
Author 3: Soufiane Ardchir
Author 4: Abderrahmane Daif
Author 5: Mohamed Azouazi

Keywords: Data stream processing; real-time processing; Apache Hadoop; Apache spark; Apache storm; Lambda architecture; Kappa architecture

PDF

Paper 7: A Brief Survey on 5G Wireless Mobile Network

Abstract: The upcoming fifth-generation wireless mobile network is advertised as lightning-fast internet, everywhere, for everything, and for everyone in the near future. A great deal of effort and research is being carried out on many aspects, e.g., millimetre-wave (mmW) radio transmission, massive multiple-input multiple-output (Massive-MIMO) antenna technology, the promising SDN architecture, the Internet of Things (IoT), and many more. In this brief survey, we highlight some of the most recent developments towards the 5G mobile network.

Author 1: Marwan A. Al-Namari
Author 2: Ali Mohammed Mansoor
Author 3: Mohd. Yamani Idna Idris

Keywords: 5G; millimetre wave (mmW); Internet of Thing (IoT); SDN; massive multiple input and multiple output (Massive-MIMO)

PDF

Paper 8: Improved-Node-Probability Method for Decision Making in Priority Determination of Village Development Proposed Program

Abstract: This research proposes a new method, node probability (NP) with the cumulative frequency of indicators, within the framework of Bayesian networks to calculate participation weights. The method uses the PLS-PM approach to examine the relationship structure of participatory factors and to estimate latent variables. Data were collected using questionnaires given to the participants offering proposals, the village residents themselves. The participation factors identified in this research fall into two categories: internal factors (abilities) and external factors (motivation). The internal factors include gender, age, education, occupation, and income, while the external factors include motivation relating to economic, political, socio-cultural, norm-related, and knowledge-related issues. Moreover, three factors directly affect the level of participation: attendance at meetings, participation in giving suggestions, and involvement in decision making. The test results showed that applying the participation weight to the decision-making priority of proposed village development programs changes the final decision ranking, with test results of 50% recall, 80% precision, and 50% accuracy.

Author 1: Dedi Trisnawarman
Author 2: Sri Hartati
Author 3: Edi Winarko
Author 4: Purwo Santoso

Keywords: Bayesian networks; PLS-PM; participation weight; decision making; village

PDF

Paper 9: Low Cost Countermeasure at Authentication Protocol Level against Electromagnetic Side Channel Attacks on RFID Tags

Abstract: Radio Frequency Identification (RFID) technology is widely used in many security applications. Producing secure low-cost and low-power RFID tags is a challenge, and the use of lightweight encryption algorithms can be an economical solution for these RFID security applications. This article proposes a low-cost countermeasure to secure RFID tags against electromagnetic side channel attacks (EMA). First, we propose a parallel architecture of the PRESENT block cipher that represents one type of hiding countermeasure against EMA: 200,000 electromagnetic traces are used to attack the proposed architecture, whereas only 10,000 EM traces are needed to attack an existing serial architecture of PRESENT. Then we propose a countermeasure at the mutual authentication protocol level that progressively limits the number of EM traces an attacker can collect, preventing the attacker from performing the EMA. The proposed countermeasure is based on a time delay function. It requires 960 GEs and represents a low-cost solution compared to existing countermeasures at the block cipher primitive level (2471 GEs).

Author 1: Yassine NAIJA
Author 2: Vincent BEROULLE
Author 3: Mohsen MACHHOUT

Keywords: Radio Frequency Identification (RFID); electromagnetic side channel attack; PRESENT; mutual authentication protocol; countermeasures

PDF

Paper 10: Task Scheduling in Cloud Computing using Lion Optimization Algorithm

Abstract: Cloud computing has spread quickly because of its high-performance distributed computing, offering services and access to shared resources to internet users through service providers. Efficient task scheduling in clouds is one of the most important research issues to be addressed. Various metaheuristic task scheduling algorithms for the cloud have been examined and have shown high performance in reasonable time, such as scheduling algorithms based on Ant Colony Optimization (ACO), the Genetic Algorithm (GA), and Particle Swarm Optimization (PSO). In this paper, we propose a new task scheduling algorithm for cloud computing based on the Lion Optimization Algorithm (LOA), a nature-inspired, population-based metaheuristic for global optimization over a search space, proposed by Maziar Yazdani and Fariborz Jolai in 2015 and inspired by the special lifestyle of lions and their cooperative characteristics. The proposed task scheduling algorithm is compared with scheduling algorithms based on the Genetic Algorithm and Particle Swarm Optimization, and the results demonstrate its high performance compared with the other algorithms.

Author 1: Nora Almezeini
Author 2: Alaaeldin Hafez

Keywords: Cloud computing; task scheduling algorithm; cloud scheduling; lion optimization algorithm; optimization algorithm

PDF
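Whatever metaheuristic drives the search (LOA, GA, or PSO), a cloud task scheduler needs a fitness function to compare candidate schedules. The abstract does not state the paper's objective, so the usual choice, makespan of an assignment of tasks to virtual machines, is assumed here as an illustration:

```python
def makespan(assignment, task_lengths, vm_speeds):
    """Completion time of the busiest VM: assignment[t] is the VM index for
    task t; each task contributes length/speed to its VM's finish time."""
    finish = [0.0] * len(vm_speeds)
    for task, vm in enumerate(assignment):
        finish[vm] += task_lengths[task] / vm_speeds[vm]
    return max(finish)
```

A metaheuristic such as LOA would evolve a population of `assignment` vectors, keeping those with lower makespan.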

Paper 11: Restructuring of System Analysis and Design Course with Agile Approach for Computer Engineering/Programming Departments

Abstract: Today, software plays an increasingly important and central role in every aspect of everyday life, and the number, size, complexity, and application areas of the programs developed continue to grow. Many software products have serious problems with cost, timing, and quality: it has become almost normal for software projects to exceed their planned cost and schedule, a significant number of development projects are never completed, and many do not meet user requirements. Employers are not satisfied with new graduates for a variety of reasons: they do not know how to communicate, they do not have enough experience and preparation to work as team members, and they do not have the ability to manage their individual work efficiently and productively. In this work, the System Analysis and Design course, taught especially in vocational schools and engineering faculties, was restructured and conducted with Scrum, one of the Agile methodologies. The recommended approach is suitable for software development departments and is applied here to the System Analysis and Design course.

Author 1: Ahmet ALBAYRAK

Keywords: Agile software development; scrum; course re-design; computer science education

PDF

Paper 12: Linear Prediction Model for Effort in Programming based on User Acceptance and Revised use Case Point Method

Abstract: Since most of the processes for verifying and validating software in order to obtain acceptance by the customer/user are subjective, this work aims to design a standard, empirically based mathematical model that identifies the areas or stages where development teams most often fail in large-scale software projects. The model is based on a survey that the user fills in while testing and validating the software, and whose response curve must be linear with respect to the software development process. This paper discusses the aspects surrounding the estimation of a mathematical model for validation and acceptance by a user through the revised Use Case Point method. First, an assessment of the most recent techniques for applying the method is made, and then a simulation of the acceptance and validation process by a standard user (beta test) is taken as a practical example. For the purposes of this paper, the revised Use Case Point method (Re-UCP) must have a specific weight, based on the prerequisites for the development of large-scale software. Once this weighting is obtained, the user assesses the finished product, and an approximation function is then used to determine the coefficients of the final model, indicating the efficiency trend of the development team.

Author 1: Fahad H. Alshammari

Keywords: Function; point; software; engineering; mathematical; model; large-scale; programming; acceptance; validation

PDF

Paper 13: Repository of Static and Dynamic Signs

Abstract: Gesture-based communication is on the rise in human-computer interaction. Advancements in smart phones have made it possible to introduce a new kind of communication, and gesture-based interfaces are increasingly popular for communicating in public places. These interfaces are also an effective communication medium for deaf and mute people: gestures help convey their thoughts to others, eliminating the need for a human interpreter. Such gestures are stored in datasets, so efficient dataset design is needed. Many datasets have been developed for languages like American Sign Language; however, for other sign languages, such as Pakistani Sign Language (PSL), little work has been done. This paper presents a technique for storing datasets of static and dynamic signs for sign languages other than American Sign Language or British Sign Language, for which many datasets are already publicly available, whereas other regional languages lack publicly available datasets. Pakistani Sign Language has been taken as a case study: more than 5000 gestures have been collected, and they will be made part of a public database as part of this research. The research is a first initiative towards building a universal sign language, as every region has a different sign language. Its second focus is to explore methodologies in which a signer communicates without constraints such as data gloves or a particular background. Thirdly, the paper proposes the use of spelling-based gestures for easier communication, so the suggested dataset design is not affected by constraints of any kind.

Author 1: Shazia Saqib
Author 2: Syed Asad Raza Kazmi

Keywords: Data gloves; feature extraction; human computer interaction; image segmentation; object recognition

PDF

Paper 14: Lightweight Internet Traffic Classification based on Packet Level Hidden Markov Models

Abstract: Over the last decade, Internet traffic classification has become important not only to safeguard the integrity and security of network resources, but also to ensure quality of service for business-critical applications by optimizing existing network resources. Optimization, however, first requires the correct identification of different traffic flows. In this paper, we suggest a framework based on Hidden Markov Models that uses the intrinsic statistical characteristics of Internet packets for traffic classification. Packet inspection based on statistical analysis of these characteristics helps reduce overall computational complexity. The major challenges for any Internet traffic classifier are: 1) the inability to accurately identify encrypted traffic when classification is performed using traditional port-based techniques; 2) overall computational complexity; and 3) achieving high accuracy in traffic identification. Our methodology exploits the statistical characteristics of Internet packets, namely their size and inter-arrival time, to model different traffic flows. For the experiments, a dataset of commonly used Internet applications was used. The proposed HMM models fit the observed traffic with high accuracy: traffic identification accuracy was 91% for the packet-size-based classifier and 82% for the inter-packet-time-based classifier.

Author 1: Naveed Akhtar
Author 2: Muhammad Kamran

Keywords: Hidden Markov model; traffic classification; net- work security; deep packet inspection; internet traffic modeling; Internet of Things

PDF
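Classification with packet-level HMMs, as described in the abstract, means scoring an observed sequence (e.g., discretized packet-size bins) under each application's model and picking the most likely one. A minimal discrete-HMM sketch using the forward algorithm, with made-up toy parameters (the paper's models are trained on real traffic; nothing below reproduces them):

```python
def forward_likelihood(obs, start, trans, emit):
    """P(obs | model) for a discrete HMM via the forward algorithm.
    start[i], trans[i][j], emit[i][o] are the usual HMM probabilities."""
    n = len(start)
    alpha = [start[i] * emit[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][o]
                 for j in range(n)]
    return sum(alpha)

def classify(obs, models):
    """Pick the traffic class whose HMM gives the observation the highest likelihood."""
    return max(models, key=lambda name: forward_likelihood(obs, *models[name]))
```

With observation symbol 0 standing for "small packet" and 1 for "large packet", two single-state toy models already separate the classes:

```python
models = {
    "web":   ([1.0], [[1.0]], [[0.9, 0.1]]),  # mostly small packets
    "video": ([1.0], [[1.0]], [[0.1, 0.9]]),  # mostly large packets
}
```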

Paper 15: Efficient K-Nearest Neighbor Searches for Multiple-Face Recognition in the Classroom based on Three Levels DWT-PCA

Abstract: The main weakness of the k-Nearest Neighbor algorithm in face recognition is that it computes the distance to, and sorts, all training data for each prediction, which can be slow when there are many training instances. This problem can be solved by using priority k-d tree search to speed up k-NN classification. This paper proposes a method for student attendance systems in the classroom using facial recognition techniques, combining three levels of the Discrete Wavelet Transform (DWT) with Principal Component Analysis (PCA) to extract facial features, followed by priority k-d tree search to speed up facial classification with k-Nearest Neighbor. The proposed algorithm is tested on two datasets: the Honda/UCSD video dataset and our own dataset (the AtmafaceDB dataset). This research uses k-fold cross-validation to find the best value of k for correct facial recognition. With 10-fold cross-validation at level-3 DWT-PCA, face recognition accuracy using k-Nearest Neighbor is 95.56% on our dataset with k = 5, whereas on the Honda/UCSD dataset it is only 82% with k = 3. The proposed method achieves a recognition time of 40 milliseconds on our dataset.

Author 1: Hadi Santoso
Author 2: Agus Harjoko
Author 3: Agfianto Eko Putra

Keywords: Multiple-face recognition; DWT; PCA; priority k-d tree; k-Nearest Neighbor

PDF
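The speed-up claimed above comes from pruning: a k-d tree search can skip any subtree whose splitting plane lies farther away than the best match found so far. A compact nearest-neighbor sketch (single neighbor, squared Euclidean distance; the paper's priority search and k > 1 case generalize this):

```python
def build_kdtree(points, depth=0):
    """Recursively split the points on alternating axes (median split)."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, target, best=None):
    """Nearest neighbor, pruning subtrees beyond the current best distance."""
    if node is None:
        return best
    d2 = sum((a - b) ** 2 for a, b in zip(node["point"], target))
    if best is None or d2 < sum((a - b) ** 2 for a, b in zip(best, target)):
        best = node["point"]
    axis = node["axis"]
    diff = target[axis] - node["point"][axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, best)
    # Only descend the far side if the splitting plane is closer than the best match.
    if diff ** 2 < sum((a - b) ** 2 for a, b in zip(best, target)):
        best = nearest(far, target, best)
    return best
```

In the attendance system, the points would be DWT-PCA feature vectors and each query a detected face.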

Paper 16: Optimization and Evaluation of Hybrid PV/WT/BM System in Different Initial Costs and LPSP Conditions

Abstract: A modelling and optimization study was performed to meet the energy demand of a faculty building on the Karabuk University campus with a hybrid energy production system, using a genetic algorithm (GA). The hybrid system consists of photovoltaic (PV) panels, wind turbines (WT), and biomass (BM) energy production units, where the BM unit is considered a back-up generator. The objective function in the optimization was formulated to minimize total net present cost (TNPC). To obtain more accurate results, measurements were performed with a weather station and data were read from an electricity meter. The system was also checked for reliability via the loss of power supply probability (LPSP). Changes in TNPC and the levelized cost of energy (LCOE) were interpreted by varying the LPSP and economic parameters such as PV investment cost, WT investment cost, BM investment cost, and interest rates. As a result, a hybrid system consisting of PV and BM units, combined with an effective flow algorithm benefiting from the GA, was seen to meet the energy demand of the faculty.

Author 1: Abdülsamed Tabak
Author 2: Mehmet Özkaymak
Author 3: Muhammet Tahir Güneser
Author 4: Hüseyin Oktay Erkol

Keywords: Photovoltaic (PV)/wind turbines (WT)/ biomass (BM); hybrid system; optimization; sizing; cost-effective; reliability; genetic algorithm

PDF

Paper 17: A Web based Inventory Control System using Cloud Architecture and Barcode Technology for Zambia Air Force

Abstract: Inventory management of spares is one of the activities the Zambia Air Force (ZAF) undertakes to ensure an optimal serviceability state of its equipment and thus effectively achieve its roles. This obligation can only be met by automating the current manual, paper-based inventory system, so a web-based inventory management system using cloud architecture and barcode technology is proposed. A literature review was conducted on three technologies used in inventory management, namely Radio Frequency Identification (RFID), barcode technology, and Near Field Communication (NFC), as well as on related work, to identify concepts that could be adopted in the proposed system. A baseline study was performed to understand the challenges ZAF faces in the inventory management of spares. Analysis of the baseline study results showed that the challenges were attributable to the current manual inventory management system, mainly human errors, incorrect inventory reporting, and pilferage of items. The proposed prototype system was developed and tested, and proved to be faster, more efficient, and more reliable than the manual, paper-based system.

Author 1: Thomas Muyumba
Author 2: Jackson Phiri

Keywords: Zambia Air Force (ZAF); inventory system; barcode technology; Radio Frequency Identification (RFID); Near Field Communication (NFC); cloud computing; web based application

PDF

Paper 18: A Model for Forecasting the Number of Cases and Distribution Pattern of Dengue Hemorrhagic Fever in Indonesia

Abstract: Dengue Hemorrhagic Fever (DHF) outbreaks are one of the lethal health problems in Indonesia. The proliferation of the Aedes aegypti mosquito, the main vector of DHF, is affected by climate factors such as temperature, humidity, rainfall, and irradiation time. Projecting the number of DHF cases is therefore a very important task for the Ministry of Health in contingency planning, as a preventive step against an increasing number of DHF cases in the near future. This study aims to develop a forecasting model that anticipates the number of cases and the distribution pattern of DHF from multivariate time series using Vector Autoregression with Spatial Autocorrelation (VARSA). The VARSA model uses multivariate time series comprising the number of DHF cases, minimum temperature, maximum temperature, rainfall, average humidity, irradiation time, and population density. The modeling is done in two steps: vector autoregressive modeling to predict the number of DHF cases, and the Local Indicators of Spatial Association (LISA) method to visualize the distribution pattern of DHF based on the spatial connectivity of the number of DHF cases among neighboring districts. The study covers 17 districts in Sleman, Yogyakarta, and achieves low errors, with a Root Mean Square Error (RMSE) of 2.10 and a Mean Absolute Error (MAE) of 1.51. The model produces smaller errors than univariate time series methods such as linear regression and the Autoregressive Integrated Moving Average (ARIMA).

Author 1: Deni Mahdiana
Author 2: Ahmad Ashari
Author 3: Edi Winarko
Author 4: Hari Kusnanto

Keywords: Dengue Hemorrhagic Fever (DHF); Vector Autoregressive Spatial Autocorrelation (VARSA); forecasting; multivariate time series; Local Indicators of Spatial Association (LISA)

PDF

Paper 19: An AHP Model towards an Agile Enterprise

Abstract: Companies face different challenges in adapting to their environmental context. They should be aware of changes on the social, political, ecological, and economic levels, and they should act efficiently and rapidly by leveraging new and reconfigurable resources. Organizational agility is the firm's key dynamic capability, enabling it to deal with changes and exploit them as opportunities. Firms' objective is thus to attain a higher degree of agility, which can help them perform durably. In this article, a new model based on the analytical hierarchy process (AHP) method is proposed. It can help companies raise their agility level by deploying the most suitable agility enablers, which can be either general or specific to information technologies, and thus develop the most appropriate strategy towards agility with regard to their internal and external contexts.

Author 1: Mohamed Amine Marhraoui
Author 2: Abdellah El Manouar

Keywords: Organizational agility; analytical hierarchy process; information technology; agility enablers

PDF
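The core AHP computation behind a model like the one above is deriving a priority vector from a reciprocal pairwise comparison matrix. The row geometric mean is a standard approximation of the principal eigenvector; this is a generic sketch, not the paper's specific hierarchy of agility enablers.

```python
import math

def ahp_priorities(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix,
    via the row geometric mean, normalized to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

For a perfectly consistent matrix the geometric-mean weights coincide with the eigenvector weights; in practice one would also compute a consistency ratio before trusting the judgments.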

Paper 20: Ghanaian Consumers’ Online Privacy Concerns: Causes and its Effects on E-Commerce Adoption

Abstract: Online privacy has gradually become a concern for internet users over the years as a result of the interconnection of customers' devices with other internet-enabled devices. This research investigates and discusses the factors that influence the privacy concerns of online consumers of internet services, and the possible effects of these concerns on the African online market, with Ghana as the primary focus. Results from this study indicate that only 10.1% of respondents felt the internet was safe for purchase and payment transactions in Ghana; however, respondents were willing to shop online if e-Commerce was the only means of getting their products. Respondents also had a high sense of perceived vulnerability, and their perceived vulnerability to unauthorized data collection and misuse of personal information could affect the adoption of Ghanaian e-Commerce platforms. The perceived ability of users of e-Commerce platforms in Ghana to control data collection and its subsequent use by third parties was also found to negatively impact customers' willingness to fully transact and share their personal information online. Perceived vulnerability was found to be affected by high levels of internet illiteracy, while the perceived ability to control the collection and use of information was influenced by both the internet literacy level and the level of social awareness of the Ghanaian internet consumer.

Author 1: E. T. Tchao
Author 2: Kwasi Diawuo
Author 3: Christiana Aggor
Author 4: Seth Djane Kotey

Keywords: E-Commerce; technology adoption; online privacy; perceived vulnerability; perceived control

PDF

Paper 21: Expert System of Chili Plant Disease Diagnosis using Forward Chaining Method on Android

Abstract: This research develops an expert system able to diagnose diseases in chili plants based on knowledge provided directly by experts. The system uses the classical probability method to calculate diagnosis percentages and is implemented on Android mobile devices. The knowledge base consists of 37 symptoms, 10 chili diseases caused by fungi, and 10 rules, and the system uses the forward chaining inference method. Test results show: (1) Functional testing using the Black Box Equivalence Partitioning (EP) method gave the expected results for every test scenario in each test class. (2) In expert testing, manual calculations and system results matched. (3) A user acceptance test was conducted with 53 respondents divided into four groups. The first group, consisting of chili disease experts, gave an average score of 85.14% (excellent); the second group, Agriculture Department students, gave 84.13% (excellent); the third group, Computer Science Department students, gave 84.28% (excellent); and the last group, chili farmers, gave 86% (excellent).
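
Forward chaining of the kind used in this system repeatedly fires any rule whose premises are all satisfied until no new conclusion can be derived; the classical-probability score is then the share of a disease's symptoms that were actually observed. A minimal sketch in Python (the rules and symptom names below are hypothetical, not taken from the paper's knowledge base):

```python
# Forward chaining: fire every rule whose premises are all known facts,
# adding its conclusion, until no new fact can be derived.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Classical-probability score: share of a disease's symptoms observed.
def diagnosis_score(observed, disease_symptoms):
    return 100.0 * len(observed & disease_symptoms) / len(disease_symptoms)

# Hypothetical rule base: symptom sets -> diseases (illustrative only).
rules = [
    ({"yellow_leaves", "white_spots"}, "powdery_mildew"),
    ({"wilting", "dark_stem"}, "fusarium_wilt"),
]
observed = {"yellow_leaves", "white_spots", "wilting"}
derived = forward_chain(observed, rules)
print(sorted(derived - observed))   # diseases inferred from the symptoms
```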

Author 1: Aristoteles
Author 2: Mita Fuljana
Author 3: Joko Prasetyo
Author 4: Kurnia Muludi

Keywords: Android; classic probability; expert system; forward chaining; likert scale

PDF

Paper 22: Cross-Organizational Information Systems: A Case for Educational Data Mining

Abstract: Establishing a new organization is becoming more difficult day by day due to the extremely competitive business environment. A new organization may not have enough experience to survive in the competitive market, which in turn may damage its reputation and the trust of investors. The goal of this research is to design a framework for a cross-organizational information system for assessment and decision making using machine learning, with emphasis on the educational sector. In the proposed framework, organizations share information (even raw data) with each other, and a machine learning tool analyzes the shared data to support decision making for a particular organization. Such a framework can help new organizations benefit from the experience of other, older organizations and institutions, and a knowledge-based machine learning system of this kind helps improve the organizational capability of newly established institutions. As an implementation of the framework, we build a fuzzy system that can effectively work as a cross-platform system for educational entities.

Author 1: Gufran Ahmad Ansari
Author 2: Mohammad Tanvir Parvez
Author 3: Ali Al Khalifah

Keywords: Information system; machine learning; cross-organization; decision making; education; fuzzy matching; data mining

PDF

Paper 23: A Comparative Study between Applications Developed for Android and iOS

Abstract: Nowadays, mobile applications implement complex functionalities that make extensive use of a device’s core features. This paper presents a performance analysis of the most important core features used frequently in mobile application development: asynchronous multi-threaded code execution, drawing views/elements on the screen, and basic network communications. While multiple mobile platforms have emerged in recent years, two well-established and popular operating systems were considered for comparison and testing: Android and iOS. Two basic applications featuring the same functionality and complexity were developed to run natively on both platforms, using the development languages and tools recommended for each operating system. This paper aims to highlight the differences between the two operating systems by analyzing core-feature performance metrics for the two functionally identical mobile applications. The results obtained could further guide the optimization of the application development process for each operating system.

Author 1: Robert Gyorödi
Author 2: Doina Zmaranda
Author 3: Vlad Georgian Adrian
Author 4: Cornelia Gyorödi

Keywords: Android; iOS; mobile application development; mobile device core features; common scenario performance comparison; development optimization

PDF

Paper 24: Recognizing Human Actions by Local Space Time and LS-TSVM over CUDA

Abstract: Local space-time features can be used to adapt events to the velocity of moving patterns, the size of the object, and the frequency in captured video. This paper proposes a new implementation approach for Human Action Recognition (HAR) using the Compute Unified Device Architecture (CUDA). First, local space-time features are extracted from a customized dataset of videos using the Histogram of Optical Flow (HOF) descriptor and the Harris detector. A new extended version of the SVM classifier, the Least Squares Twin SVM (LS-TSVM), which is four times faster and more precise than the classical SVM, is then applied to the extracted video features; it is a binary classifier that uses two non-parallel hyperplanes. The paper evaluates the performance of LS-TSVM on the customized data, and experimental results show significant improvements.

Author 1: Mohsin Raza Siyal
Author 2: Muhammad Saeed
Author 3: Jibran R. Khan
Author 4: Farhan A. Siddiqui
Author 5: Kamran Ahsan

Keywords: Motion detection; human action recognition; LS-TSVM; GPU Programming; Compute Unified Device Architecture (CUDA)

PDF

Paper 25: A Generic Methodology for Clustering to Maximise Inter-Cluster Inertia

Abstract: This paper proposes a novel clustering methodology that offers results with higher inter-cluster inertia, and hence better-separated clusters. Its advantage comes from combining MC-DBSCAN, an algorithm that has previously shown its efficiency in clustering tasks, with an iterative process that auto-adjusts the weights of the pertinent criteria, allowing the reclassification of objects between the two closest clusters at each iteration, and that can auto-evaluate the precision of the clustering as it proceeds. Experiments are conducted on the well-known benchmarks ‘Seismic’, ‘Landform-Identification’ and ‘Image Segmentation’ to compare the performance of the proposed methodology with other algorithms (K-means, EM, CURE and MC-DBSCAN). The experimental results demonstrate that the proposed solution yields good-quality clustering results.
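
Inter-cluster (between-cluster) inertia, the quantity this methodology seeks to maximize, is commonly defined as the sum over clusters of the cluster size times the squared distance from the cluster centroid to the global mean; the paper's exact formulation may differ. A small self-contained illustration:

```python
# Inter-cluster (between-cluster) inertia: sum over clusters of
# n_k * squared distance from the cluster centroid to the global mean.
def centroid(points):
    n = len(points)
    return [sum(p[d] for p in points) / n for d in range(len(points[0]))]

def inter_cluster_inertia(clusters):
    all_points = [p for c in clusters for p in c]
    g = centroid(all_points)
    total = 0.0
    for c in clusters:
        m = centroid(c)
        total += len(c) * sum((mi - gi) ** 2 for mi, gi in zip(m, g))
    return total

# Two well-separated 1-D clusters: centroids 1.0 and 5.0, global mean 3.0.
clusters = [[(0.0,), (2.0,)], [(4.0,), (6.0,)]]
print(inter_cluster_inertia(clusters))  # 2*(1-3)^2 + 2*(5-3)^2 = 16.0
```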

Author 1: A. Alaoui
Author 2: B. Olengoba Ibara
Author 3: B. Ettaki
Author 4: J. Zerouaoui

Keywords: MC-DBSCAN; iterative process; inter-cluster inertia; unsupervised precision-recall metrics

PDF

Paper 26: Software Migration Frameworks for Software System Solutions: A Systematic Literature Review

Abstract: This study examines and reviews current software migration frameworks. With rapid technological advancement, companies need to move their software from one platform to another, for example in cloud-based migration. Different types of risks are involved during migration, and performing migration activities correctly can reduce them. Due to limited resources such as workforce, time and budget, small organizations often do not perform software migration in an optimized way, and consequently many functionalities are not reproduced exactly after migration. In this paper, we describe different methods and frameworks that provide guidelines for developers to enhance the software migration process.

Author 1: Muhammad Shoaib
Author 2: Adeed Ishaq
Author 3: Muhammad Awais Ahmad
Author 4: Sidra Talib
Author 5: Ghulam Mustafa
Author 6: Aqeel Ahmed

Keywords: Software migration; frameworks; system migration; cloud migration; migration risk

PDF

Paper 27: Comparison of Machine Learning Algorithms to Classify Web Pages

Abstract: The World Wide Web, or simply the web, represents one of the largest sources of information in the world; almost any topic one can think of can probably be found on it. Web information comes in different forms and types, such as text documents, images and videos. However, extracting useful information without the help of web tools is not an easy process. Here comes the role of web mining, which provides tools that help us extract useful knowledge from data on the internet. Many researchers focus on web page classification technology that provides high accuracy. In this paper, several supervised learning algorithms are evaluated for assigning web documents to predefined categories. We use the machine learning algorithms Artificial Neural Networks (ANN), Random Forest (RF) and AdaBoost to compare their behavior on the web page classification problem.

Author 1: Ansam A. AbdulHussien

Keywords: Web page classification; artificial neural networks; random forest; adaboost

PDF

Paper 28: GDPI: Signature based Deep Packet Inspection using GPUs

Abstract: Deep Packet Inspection (DPI) is required by many networked application systems in order to protect against cyber threats. A signature-based Network Intrusion Detection System (NIDS) relies on packet inspection and pattern matching mechanisms to detect malicious content in network traffic. The rapid growth of high-speed networks in data centers demands an efficient high-speed packet processing mechanism that is also capable of detecting malicious packets. In this paper, we propose GDPI, a framework for efficient packet processing that inspects the payload of every incoming packet against known signature patterns, such as those commonly available in Snort. The framework is developed using enhanced GPU programming techniques: asynchronous packet processing using streams, minimizing CPU-to-GPU latency using pinned memory and zero copy, and memory coalescing with shared memory, which reduces read operations from the GPU’s global memory. The overall performance of GDPI was tested on heterogeneous NVIDIA GPUs (Tegra TK1, GTX 780, and Tesla K40); the highest throughput was achieved with the Tesla K40. The code of GDPI is made available for the research community.
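
The matching step that GDPI offloads to the GPU, checking each payload against a set of known byte signatures, can be sketched on the CPU as follows. The patterns are illustrative placeholders, not real Snort rules, and production DPI engines use multi-pattern automata such as Aho-Corasick rather than this naive scan:

```python
# CPU baseline for signature matching: scan a packet payload for
# any known signature pattern and report the hits.
def match_signatures(payload: bytes, signatures):
    return [sig for sig in signatures if sig in payload]

# Hypothetical Snort-style byte patterns (not real rules).
signatures = [b"/etc/passwd", b"cmd.exe", b"SELECT * FROM"]
packet = b"GET /../../etc/passwd HTTP/1.1"
print(match_signatures(packet, signatures))  # [b'/etc/passwd']
```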

Author 1: Nausheen Shoaib
Author 2: Jawwad Shamsi
Author 3: Tahir Mustafa
Author 4: Akhter Zaman
Author 5: Jazib ul Hasan
Author 6: Mishal Gohar

Keywords: Packet processing; Graphic Processing Units (GPUs); deep packet inspection; network security; parallel computing; heterogeneity; CUDA

PDF

Paper 29: Multi-Target Tracking Using Hierarchical Convolutional Features and Motion Cues

Abstract: In this paper, the problem of multi-target tracking with a single camera in complex scenes is addressed. A new approach is proposed that learns from a hierarchy of convolutional features. First, a Fast Region-based Convolutional Neural Network is trained to detect pedestrians in each frame. It is then combined with a correlation filter tracker that learns each target’s appearance from a pretrained convolutional neural network, using the middle and last convolutional layers to enhance target localization. However, correlation filters fail under full occlusion, which leads to separated tracklets (mini-trajectories). A post-processing step is therefore added that links separated tracklets with minimum-cost network flow, using a cost function based on motion cues to associate short tracklets. Experimental results on the MOT2015 benchmark show that the proposed approach produces results comparable to state-of-the-art approaches, with a 4.5% increase in multiple object tracking accuracy; mostly tracked targets also improve to 12.9% versus 7.5% for the state-of-the-art minimum-cost network flow tracker.

Author 1: Heba Mahgoub
Author 2: Khaled Mostafa
Author 3: Khaled T. Wassif
Author 4: Ibrahim Farag

Keywords: Multi-target tracking; correlation filters; convolution neural networks

PDF

Paper 30: Machine Learning for Bioelectromagnetics: Prediction Model using Data of Weak Radiofrequency Radiation Effect on Plants

Abstract: Identifying the key parameters that affect plant sensitivity to non-thermal, weak radio-frequency electromagnetic fields (RF-EMF) using big data analytics and machine learning concepts is quite significant. Despite its benefits, no single study yet adequately covers machine learning in the bioelectromagnetics domain. This study aims to demonstrate the usefulness of machine learning algorithms for predicting the possible damage of electromagnetic radiation from mobile phones and base stations to plants and, consequently, to develop a prediction model of plant sensitivity to RF-EMF. We used raw data on plant exposure from our previous review study (data extracted from 45 peer-reviewed scientific publications published between 1996 and 2016, covering 169 experimental case studies) to predict the potential effects of RF-EMF on plants. We used the values of six attributes: frequency, specific absorption rate (SAR), power flux density, electric field strength, exposure time and plant type (species). The results demonstrate that machine learning algorithms (classification and clustering) can predict 1) under which conditions RF-EMF exposure of a given plant species may not produce an effect; 2) which frequency and electric field strength values are safer; and 3) which plant species are affected by RF-EMF. Moreover, this paper illustrates the development of an optimal attribute selection protocol to identify the key parameters that matter most when designing standardized in-vitro experimental protocols. Our analysis shows that the Random Forest classification algorithm achieves the highest classification accuracy, 95.26% (0.084 error), with only 4% fluctuation among the algorithms measured. K-Means clustering shows that pea, mungbean and duckweed plants are more sensitive to RF-EMF (p <= 0.0001). Although the sample of 169 reported experimental case studies may be small in a statistical sense, the analysis still provides useful insight into exploiting machine learning in the bioelectromagnetics domain. As a direct outcome of this research, more efficient RF-EMF exposure prediction tools can be developed to improve the quality of epidemiological studies and long-term experiments using whole organisms.

Author 1: Malka N. Halgamuge

Keywords: Machine learning; plants; prediction; mobile phones; base station; radiofrequency electromagnetic fields; RFEMF; plant sensitivity; classification; clustering

PDF

Paper 31: K-means Based Automatic Pests Detection and Classification for Pesticides Spraying

Abstract: Agriculture is the backbone of livelihoods and plays a vital role in a country’s economy. Agricultural production is adversely affected by pest infestation and plant diseases, and pests directly degrade plant vitality. Automatic pest detection and classification is an essential research problem, as early detection and classification of pests as they appear on plants can minimize production losses. This study puts forth a comprehensive model that facilitates pest detection and classification using an Artificial Neural Network (ANN). In this approach, images captured in the field are segmented using an enhanced K-Means technique that identifies pests or other objects in the image. Features are then extracted using the Discrete Cosine Transform (DCT) and classified with the ANN. The proposed approach was verified on five pest species and exhibited 94% effectiveness in classifying the pests.
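
The segmentation stage rests on K-Means clustering; a textbook 1-D version over grayscale pixel intensities is sketched below. The paper's enhancements to K-Means are not specified here, and the pixel values are illustrative:

```python
# Minimal K-Means on grayscale pixel intensities: alternate between
# assigning pixels to the nearest center and recomputing the centers.
def kmeans_1d(values, centers, iters=20):
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda k: abs(v - centers[k]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# Dark "pest" pixels vs. bright "leaf" pixels (illustrative values).
pixels = [12, 15, 20, 200, 210, 220]
centers, groups = kmeans_1d(pixels, centers=[0.0, 255.0])
print(centers)  # roughly [15.67, 210.0]
```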

Author 1: Muhammad Hafeez Javed
Author 2: M Humair Noor
Author 3: Babar Yaqoob Khan
Author 4: Nazish Noor
Author 5: Tayyaba Arshad

Keywords: Automatic plant pest detection; pest classification; Delta-E; discrete wavelet transform; support vector machine

PDF

Paper 32: Deployment Protocol for Underwater Wireless Sensors Network based on Virtual Force

Abstract: Recently, Underwater Sensor Networks (UWSNs) have attracted researchers’ attention due to the challenges and peculiar characteristics of the underwater environment. The initial random deployment of a UWSN, where sensors are scattered over the area by planes or ships, is inefficient: it achieves neither full coverage nor network connectivity. Moreover, energy efficiency in underwater networks is a crucial issue, since nodes run on battery power and it is difficult, and sometimes impossible, to change or replenish these batteries. Our contribution in this research is to improve the performance of UWSNs by designing UW-DVFA, an underwater 3-D self-distributed deployment algorithm based on virtual forces. The main goal of this work is to stretch the randomly deployed network across the 3-D area in a way that guarantees full area coverage and network connectivity.

Author 1: Abeer Almutairi
Author 2: Saoucene Mahfoudh

Keywords: Deployment algorithm; underwater wireless sensor network; virtual force; coverage; connectivity

PDF

Paper 33: FabricVision: System of Error Detection in the Manufacture of Garments

Abstract: A computer vision system is implemented to detect errors in the cutting stage of the garment manufacturing process in the textile industry. It provides a solution for errors in the process that cannot easily be detected by employees, in addition to significantly increasing the speed of quality review. In the textile industry, as in many others, quality control of manufactured products is required, and over the years it has been carried out manually through visual inspection by employees. For this reason, the objective of this project is to design a quality control system using computer vision to identify errors in the cutting stage of the garment manufacturing process, increasing the productivity of textile processes while reducing costs.

Author 1: Jaime Moreno
Author 2: Arturo Aguila
Author 3: Eduardo Partida
Author 4: Oswaldo Morales
Author 5: Ricardo Tejeida

Keywords: Computer vision; histogram of oriented gradient; segmentation; object detection; image capture

PDF

Paper 34: An Efficient Method for Breast Mass Segmentation and Classification in Mammographic Images

Abstract: According to the World Health Organization, breast cancer is the main cause of cancer death among women in the world. Until now, there have been no effective ways of preventing this disease, so early screening and detection is the most effective method for raising treatment success rates and reducing death rates due to breast cancer. Mammography is still the most widely used diagnostic and screening tool for early breast cancer detection. In this work, we propose a method to segment and classify masses using regions of interest of mammographic images. Mass segmentation is performed using a fuzzy active contour model obtained by combining Fuzzy C-Means and the Chan-Vese model. Shape and margin features are then extracted from the segmented masses and used to classify them as benign or malignant. The generated features are usually imprecise and reflect an uncertain representation, so we propose to analyze them with possibility theory to deal with these imprecise and uncertain aspects. The experimental results on Regions Of Interest (ROIs) extracted from the MIAS database indicate that the proposed method yields good mass segmentation and classification results.

Author 1: Marwa Hmida
Author 2: Kamel Hamrouni
Author 3: Basel Solaiman
Author 4: Sana Boussetta

Keywords: Mammography; breast mass; mass segmentation; fuzzy active contour; mass classification; possibility theory

PDF

Paper 35: Collaborative Editing over Opportunistic Networks: State of the Art and Challenges

Abstract: Emerging Opportunistic Networks (ON) are under intensive research and development by many academics. However, research efforts on ON have addressed only routing protocols and data dissemination; too little attention has been given to the applications that can be deployed over ON, which are assumed to use immutable data (e.g., photo/video files). Nevertheless, Collaborative Editors (CE), which are based on mutable messages, are widely used in many fields. Indeed, they allow many users to concurrently edit the same shared document (e.g., Google Docs). Consequently, it becomes necessary to adapt CE to ON, which is a challenging task: CE synchronization algorithms must ensure the convergence of shared content modified concurrently by users. In this work, we give an overview of ON and CE in an attempt to combine both states of the art, and we highlight the challenges that could be faced when deploying CE over ON.
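
The convergence requirement for CE synchronization is classically met with operational transformation (OT, listed in the keywords): a remote operation is transformed against the concurrent local one so that both replicas end in the same state regardless of application order. A minimal character-insert sketch (positions, sites and the tie-breaking rule are illustrative):

```python
# Apply an insert of one character at a position.
def apply_insert(text, pos, ch):
    return text[:pos] + ch + text[pos:]

# Transform pos1 against a concurrent insert at pos2; ties are broken
# by site id so both replicas make the same choice.
def transform(pos1, pos2, site1, site2):
    if pos1 < pos2 or (pos1 == pos2 and site1 < site2):
        return pos1
    return pos1 + 1

doc = "abc"
# Site 1 inserts 'X' at 1; site 2 concurrently inserts 'Y' at 2.
# Order A: apply site 1's op first, then site 2's transformed op.
a = apply_insert(doc, 1, "X")
a = apply_insert(a, transform(2, 1, 2, 1), "Y")
# Order B: apply site 2's op first, then site 1's transformed op.
b = apply_insert(doc, 2, "Y")
b = apply_insert(b, transform(1, 2, 1, 2), "X")
print(a, b)  # both "aXbYc": the replicas converge
```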

Author 1: Noha Alsulami
Author 2: Asma Cherif

Keywords: Collaborative editors; opportunistic networks; operational transformation

PDF

Paper 36: Realtime Application of Constrained Predictive Control for Mobile Robot Navigation

Abstract: This work addresses the implementation of constrained Model Predictive Control (MPC) for the autonomous trajectory-tracking problem. The process to be controlled is a Wheeled Mobile Robot (WMR) described by a discrete, Multiple Input Multiple Output (MIMO), state-space, linear parameter-varying kinematic model. The main motivation for using constrained MPC in this case is its ability to handle, in a straightforward way, the control and state constraints that naturally arise in practical trajectory-tracking problems. The efficiency of the presented control scheme is validated through experimental results on a two-wheeled mobile robot using both STM32F429II and STM32F407ZG microcontrollers. The controller implementation is facilitated by automatic C code generation and optimization before real-time execution. The experimental results establish the good performance and robustness of the proposed control scheme.

Author 1: Ibtissem Malouche
Author 2: Faouzi Bouani

Keywords: Embedded C; STM32; microcontrollers; constrained model predictive control; optimization

PDF

Paper 37: Contextual Requirements for Mobile Native Applications

Abstract: Mobile apps have found wide acceptance in today’s world, which depends heavily on smart technology to access data from any location. Most are native apps, which can be used to access data even without internet availability. Developing native mobile applications requires assimilating various analytical contexts depending on user requirements. We conducted an empirical study of papers on ubiquitous systems and mobile apps to identify the contexts involved in building native mobile apps: device context, user context, mobility context and social context. The overall weight of each mobile context was then determined empirically. We compiled the various activities performed between a user and native mobile apps into questionnaires, which were sent to native app developers in different software companies, and mapped these activities to attributes and their associated mobile contexts. We identified four contexts as the main requirements for developing native mobile apps in any domain. The requirements are analyzed by modeling the contexts and their attributes in the OWL-DL language. The empirical study shows that the overall weight of the device context exceeds that of the other contexts; hence the device context, with its numerous features, has a great impact on developing native mobile apps in any domain.

Author 1: Sasmita Pani
Author 2: Jibitesh Mishra

Keywords: Mobile contexts; pervasiveness; device usability; mobility interaction

PDF

Paper 38: UML based Formal Model of Smart Transformer Power System

Abstract: Recently, many significant improvements have been made to the traditional power system, but much work is still needed to address its remaining challenges. We propose a formal, subnet-based model for the smart power system. Formal methods are mathematics-based techniques used to develop, specify and verify models in a systematic manner. The model involves the following components: power plant, smart grid, transformers and smart meters. The power plant produces electricity and distributes it to the smart grid; the smart grid delivers electricity to transformers, which transfer it to smart meters. Smart transformers and smart meters are deployed in the form of subnets, which increases the energy efficiency of the smart power system. In this paper our main focus is on two components of the smart power system: transformers and smart meters. Graph theory is used for the semi-formal representation of the model. We present system requirements through UML use case diagrams that describe the actions of the system; the real topology is then transformed into a graph-theoretic model topology that represents the structure of the system. Mathematical, notation-based formal method approaches are used to describe and analyze the system: the VDM-SL formal specification language is used for the formal specification, and the VDM toolbox for verification and analysis.

Author 1: Muniba Sultan
Author 2: Amna Pir
Author 3: Nazir Ahmad Zafar

Keywords: Smart power system; unified model language (UML); formal method; VDM-SL

PDF

Paper 39: FPGA Prototyping and Design Evaluation of a NoC-Based MPSoC

Abstract: Chip communication architectures have become a critical element when designing a complex MultiProcessor System-on-Chip (MPSoC). This has led to the emergence of new interconnection architectures, such as the Network-on-Chip (NoC). NoCs have proven to be a promising solution to the concerns of MPSoCs in terms of data parallelism. Field-Programmable Gate Arrays (FPGAs) present some challenges, but overcoming them with the right prototyping solutions is easy and cost-effective, leading to much faster time-to-market. In this paper, we present FPGA-based rapid prototyping in hardware/software co-design and a design evaluation of a mixed HW/SW MPSoC using a NoC. A case study of a two-dimensional mesh NoC-based MPSoC architecture is presented with a validation environment. The synthesis and implementation results of the NoC-based MPSoC on a Virtex 5 ML507 show a reasonable frequency (151.5 MHz) and a resource usage of 58% (6,586 of 11,200 slices).

Author 1: Ridha SALEM
Author 2: Yahia SALAH
Author 3: Imed BENNOUR
Author 4: Mohamed ATRI

Keywords: MultiProcessor System-on-Chip; Network-on-Chip; Field-Programmable Gate Array (FPGA) prototyping; design evaluation

PDF

Paper 40: Investigate the use of Anchor-Text and of Query-Document Similarity Scores to Predict the Performance of Search Engine

Abstract: Query difficulty prediction aims to estimate, in advance, whether the answers returned by a search engine in response to a query are likely to be useful. This paper proposes new predictors based on the similarity between the query and the answer documents, as calculated by three different models. It examines the use of anchor-text-based document surrogates and how their similarity to queries can be used to estimate query difficulty. The predictors are evaluated on the correlations between 1) the average precision (AP) and 2) the precision at 10 (P@10) of the full-text retrieval results, and 3) the anchor-text and 4) full-text similarity scores, using the WT10g collection of web data. Experimental evaluation shows that five of the proposed predictors perform reliably and consistently across a variety of retrieval models.
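
Predictor quality in this setting is typically reported as the correlation between per-query predictor scores and the AP the engine actually achieves. A self-contained Pearson correlation sketch (the similarity and AP values are made up for illustration, not WT10g results):

```python
import math

# Pearson correlation between per-query predictor scores and AP values:
# covariance of the two series divided by the product of their norms.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-query similarity scores and AP values.
similarity = [0.9, 0.7, 0.4, 0.2]
ap         = [0.80, 0.65, 0.30, 0.25]
print(round(pearson(similarity, ap), 3))  # close to 1: a useful predictor
```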

Author 1: Abdulmohsen Almalawi
Author 2: Rayed AlGhamdi
Author 3: Adel Fahad

Keywords: Data mining; information retrieval; web search; query prediction

PDF

Paper 41: A Survey on the Cryptographic Encryption Algorithms

Abstract: Security is the major concern when sensitive information is stored and transferred across the internet, where it is no longer protected by physical boundaries. Cryptography is an essential, effective and efficient means of ensuring secure communication between entities: information is transferred in unintelligible form, and only the authorized recipient is able to access it. The right choice of cryptographic algorithm is important for secure communication, providing security, accuracy and efficiency. In this paper, we examine the security aspects and processes involved in the design and implementation of the most widely used symmetric encryption algorithms: the Data Encryption Standard (DES), Triple Data Encryption Standard (3DES), Blowfish, the Advanced Encryption Standard (AES) and the Hybrid Cubes Encryption Algorithm (HiSea). Furthermore, this paper evaluates and compares the performance of these encryption algorithms based on encryption and decryption time, throughput, key size, avalanche effect, memory, correlation assessment and entropy. A suitable encryption algorithm can thus be chosen from the existing ones based on the parameters that best fit user requirements.
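
One of the compared metrics, the avalanche effect, measures the fraction of output bits that flip when a single input bit changes; a strong primitive flips about half of them. The measurement can be sketched as below; SHA-256 stands in for a block cipher here only because AES is not in the Python standard library, but the measurement itself is identical:

```python
import hashlib

# Count differing bits between two equal-length byte strings.
def bit_diff(a: bytes, b: bytes) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

msg = bytearray(b"sixteen byte msg")
h1 = hashlib.sha256(bytes(msg)).digest()
msg[0] ^= 0x01                       # flip a single input bit
h2 = hashlib.sha256(bytes(msg)).digest()

flipped = bit_diff(h1, h2)
print(f"{flipped}/256 output bits flipped ({100 * flipped / 256:.1f}%)")
```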

Author 1: Muhammad Faheem Mushtaq
Author 2: Sapiee Jamel
Author 3: Abdulkadir Hassan Disina
Author 4: Zahraddeen A. Pindar
Author 5: Nur Shafinaz Ahmad Shakir
Author 6: Mustafa Mat Deris

Keywords: Cryptography; encryption algorithms; Data Encryption Standard (DES); Triple Data Encryption Standard (3DES); Blowfish; Advanced Encryption Standard (AES); Hybrid Cubes Encryption Algorithm (HiSea)

PDF

Paper 42: University ERP Preparation Analysis: A PPU Case Study

Abstract: Enterprise Resource Planning (ERP) systems are among the systems most frequently used by business organizations. Recently, the university sector began using ERP systems in order to increase the quality of its academic and administrative services. However, implementing ERP is complicated and risky, and no single factor can guarantee a successful system. Previous studies were primarily concerned with Critical Success Factors (CSFs) in business organizations and organizational success factors, producing plenty of information about these topics. However, the university environment and structure are different, which encourages us to study their specific technical critical success factors. In this paper, Palestine Polytechnic University (PPU) is our case study, and our attention is concentrated on technical success factors at PPU. Firstly, the paper examines the technical problems from which current PPU systems suffer, in order to extract the particular CSFs needed to implement ERP systems. Secondly, it identifies the technical factors most critical to successful implementation of an ERP project. Thirdly, it studies the degree to which PPU’s technical staff use software engineering practices during the development process, focusing on phase activities. Our main aim is to assemble a pool of parameters related to the successful preparation of university ERP systems.

Author 1: Islam K. Sowan
Author 2: Radwan Tahboub
Author 3: Faisal Khamayseh

Keywords: Enterprise Resource Planning (ERP); University ERP; software engineering practices; software engineering phase activities; critical success factor; technical success factors; ERP implementation; successful ERP

PDF

Paper 43: Improved QoS for Multimedia Transmission using Buffer Management in Wireless Sensor Network

Abstract: Wireless Sensor Networks (WSNs) attract the attention of the research community because they are easy to deploy, self-maintained, and do not require a predefined infrastructure. These networks are commonly used to transmit multimedia data from source to destination. However, this kind of data transmission faces several challenges, i.e. power and bandwidth limitations combined with small tolerable delay. The state of the art mainly focuses on optimization, either by finding the shortest route or by minimizing delay through increased bandwidth. However, buffer management is the main constraint causing delay and packet loss. In this paper, an approach is presented to manage the buffer, increase the packet delivery ratio (PDR) and reduce delay by dynamically assigning priorities to intra-coded (I), predictive-coded (P) and bidirectional-coded (B) frames. This approach is highly effective in controlling packet loss in WSNs. The presented approach is validated using Network Simulator 2.

Author 1: Majid Alotaibi

Keywords: packet delivery ratio (PDR); multimedia; buffer; priority; delay

PDF

Paper 44: A Comparative Study of Stereovision Algorithms

Abstract: Stereo vision has been and continues to be one of the most researched domains of computer vision, with many applications, among them the extraction of scene depth. This paper provides a comparative study of stereo vision and matching algorithms used to solve the correspondence problem. The study of matching algorithms was followed by experiments on the Middlebury benchmarks. The tests focused on a comparison of six stereo vision methods. To assess performance, the RMS error and related statistics were computed. In order to emphasize the advantages of each stereo algorithm considered, two-frame methods, both local and global, have been employed. The experiments conducted have shown that the best results are obtained by Graph Cuts; unfortunately, it has a higher computational cost. If the highest quality is not required, local methods provide reasonable results within a much shorter time frame and offer the possibility of parallel implementations.

Author 1: Elena Bebeselea-Sterp
Author 2: Raluca Brad
Author 3: Remus Brad

Keywords: Stereo vision; disparity; correspondence; comparative study; Middlebury benchmark

PDF

Paper 45: A Systematic Report on Issue and Challenges during Requirement Elicitation

Abstract: Retracted: After careful and considered review of the content of this paper by a duly constituted expert committee, this paper has been found to be in violation of IJACSA's Publication Principles. We hereby retract the content of this paper. Reasonable effort should be made to remove all past references to this paper.

Author 1: Burhan Mohy-ud-din
Author 2: Muhammad Awais
Author 3: Muhammad Sheraz Arshad Malik
Author 4: Ayesha Shahid

Keywords: Problems and difficulties in requirement gathering; process of eliciting requirements; requirement gathering

PDF

Paper 46: Resilient Framework for Distributed Computation Offloading: Overview, Challenges and Issues

Abstract: Mobile and smart computing devices are gradually becoming pervasive and prevalent in society and, compared with their desktop counterparts, are increasingly used to undertake the daily tasks and business activities of individuals and organizations worldwide. However, these devices are resource constrained and sometimes lack the needed computational capacities (memory, energy, storage and processor) to run the plethora of resource-intensive applications available to mobile users. There is much benefit in offloading resource-demanding applications and intensive computations from mobile devices to systems with higher resource capacities in the cloud. Mobile cloud computing is a form of cloud computing that seeks to enhance the capacity and capabilities of mobile and smart computing devices by enabling them to offload some computational tasks to the cloud for processing, which would otherwise be a challenge. The study set up an experiment to investigate computation offloading for mobile devices and also presents an energy model for computation offloading. It was observed during the experiment that by offloading intensive applications from mobile and smart computing devices to systems with higher resource capacities, a great amount of resource efficiency is achieved.

Author 1: Collinson Colin M. Agbesi
Author 2: Jamal-Deen Abdulai
Author 3: Katsriku Apietu Ferdinand
Author 4: Kofi Adu-Manu Sarpong

Keywords: Cloud computing; mobile cloud computing; computation offloading; distributed computation offloading

PDF

Paper 47: Fuzzy Logic Tsukamoto for SARIMA On Automation of Bandwidth Allocation

Abstract: Wireless networks are used in different fields to enhance information transfer between remote areas. In education, they can support knowledge transfer among academic members, including lecturers, students and staff. To achieve this purpose, the wireless network should be well managed to accommodate all users. The Department of Electrical Engineering and Information Technology UGM sets the wireless network bandwidth for its daily campus activity manually, monitoring data traffic at a point in time and then sharing it among users, which makes bandwidth sharing less effective. This study builds a dynamic bandwidth allocation management system that automatically determines bandwidth allocation based on a prediction of future bandwidth, implemented using the Seasonal Autoregressive Integrated Moving Average (SARIMA) model with the addition of outlier detection, since this gives more accurate results. Moreover, the determination of the fixed bandwidth allocation was done using fuzzy logic with the Tsukamoto inference method. The results demonstrate that bandwidth allocations can be classified into three fuzzy classes from the quantitative forecasting results. Furthermore, manual and automatic bandwidth allocation were compared: manual allocation yielded a MAPE of 70.76% with an average false positive value of 56 MB, while dynamic allocation using fuzzy logic and SARIMA yielded a MAPE of 38.9% and an average false positive value of around 13.84 MB. In conclusion, dynamic allocation was more effective than manual allocation.
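MAPE (mean absolute percentage error), the accuracy measure quoted in the abstract, can be computed in a few lines. The bandwidth figures below are hypothetical placeholders, not the study's traffic data:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent (actual values must be nonzero)."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical bandwidth usage (MB) vs. forecast values
actual = [100.0, 200.0, 400.0]
forecast = [90.0, 220.0, 400.0]
print(f"MAPE: {mape(actual, forecast):.2f}%")  # lower MAPE means a better forecast
```

A lower MAPE is what makes the SARIMA-plus-fuzzy allocation preferable to the manual policy in the comparison above.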

Author 1: Isna Alfi Bustoni
Author 2: Adhistya Erna Permanasari
Author 3: Indriana Hidayah
Author 4: Indra Hidayatulloh

Keywords: Bandwidth allocation management; dynamic allocation; fuzzy logic; Tsukamoto inference method; SARIMA

PDF

Paper 48: A Figural Flexibility Test for Improving Creative Thinking in an Arabic Learning Environment: A Saudi Arabia-Based Case Study

Abstract: The capability of graduates to be flexible in the face of rapidly altering situations is an increasingly crucial requirement that teachers should be conscious of, given that persistent development and technological progress are characteristic of contemporary life. Proficiency for learning and cognitive abilities are two areas in which learners need to acquire knowledge. Cognitive spatial ability has various dynamics, the assessment of which can be undertaken through numerous techniques. The major objectives of this paper are to develop a web-based system for measuring adult cognitive ability within an Arabic learning environment, in addition to enhancing creative thinking and learning capabilities through utilising the kit of factor-referenced cognitive tests devised by Ekstrom et al. (1976). The web-based system focuses on the figural flexibility test (Toothpicks test, Planning Patterns test and Storage test). Each test has its own objective with regard to measuring people's creative ability in different ways, as a means of enhancing creative thinking and learning. Prior to constructing the figural flexibility test system, a questionnaire was distributed in order to assess and examine certain crucial aspects to inform the construction of the system. The questionnaires were distributed to university students in the Faculty of Computing and Information Technology (FCIT), in addition to random distribution via email and social media, namely Facebook, Twitter and WhatsApp. Over 500 questionnaires were distributed, with 400 responses received. The objective was to assess the new system's feasibility, as well as to design a system that meets the users' requirements. As a result of the questionnaire, 77% of respondents were found to believe that creating a web-based system can assist students with developing their creative thinking and learning abilities.

Author 1: Nahla Aljojo

Keywords: Creative thinking; kit of factor referenced cognitive tests; students; toothpicks; planning patterns; storage test; cognitive abilities

PDF

Paper 49: Development of Self-Learning Program for the Bending Process of Quartz Glass

Abstract: Quartz glass is a high-performance glass material with high heat and chemical resistance, wide optical transparency ranging from ultraviolet to infrared light, and high formability as a glass material. Because it has high morphological stability due to its heat resistance and low thermal expansion, it is widely used as a material for specialized research and development or for high-precision components. There are several techniques for processing quartz glass material; an important one among them is fire processing. Fire processing requires the technology to heat and mold glass material at high temperature, and high-quality processing is done through the manual work of experts. In this study, we focused on bending work, the process that demands particularly high skill among fire processing techniques. We developed a self-learning program for beginners to improve their skill in a short time, using the bending know-how of experts clarified through process analysis, product evaluation and an interview with an expert, and examined its effectiveness. As a result, a consistent educational effect was observed in the improvement of a beginner's bending skill within a short period of time.

Author 1: Masamichi Suda
Author 2: Akio Hattori
Author 3: Akihiko Goto
Author 4: Noriaki Kuwahara
Author 5: Hiroyuki Hamada

Keywords: Quartz glass; self-learning; bending; experts; beginner; process analysis; text analysis

PDF

Paper 50: A Remote Sensor Network using Android Things and Cloud Computing for the Food Reserve Agency in Zambia

Abstract: In order to introduce modern warehousing and improve upon the storage of grain and the grain marketing business processes of the Food Reserve Agency in Zambia, a prototype of a remote sensor network was developed and built as a proof of concept for a much wider deployment using cloud computing and the internet of things concept. It was determined that a wireless sensor network would aid the Food Reserve Agency in analytics, timely action and real-time reporting from all its food depots spread throughout Zambia. Google's Android Things platform was used to achieve the objectives. The advantages of Android Things over traditional platforms that have been used to develop wireless sensor networks were investigated and are presented in this paper.

Author 1: Mulima Chibuye
Author 2: Jackson Phiri

Keywords: Internet of things; android things; wireless sensor network (WSN); remote sensor network; Food Reserve Agency (FRA); grain marketing; modern warehousing; cloud computing

PDF

Paper 51: Implementation of an Image Processing Algorithm on a DSP TMS320C6416 Platform

Abstract: In the context of emerging technologies, cloud computing (CC) was introduced as a new paradigm to host and deliver information technology services, and as a new model for delivering resources. However, many critical problems have appeared with cloud computing, such as data privacy, security and reliability, with security the most important among them. Biometric identification is a reliable and one of the easiest ways to recognize a person using extractable characteristics. In addition, biometric applications require fast and powerful processing systems, hence the increased use of embedded systems in biometric applications, especially in image processing. Embedded systems come in a wide variety, and the choice of a well-designed processor is one of the most important factors that directly affect the overall performance of the system. This study highlights the performance of a Texas Instruments DSP for processing in a biometric fingerprint recognition system.

Author 1: Farah Dhib Tatar
Author 2: Mohsen Machhout

Keywords: Fingerprint; biometrics; images processing; embedded systems; DSP

PDF

Paper 52: Efficient Node Monitoring Mechanism in WSN using Contikimac Protocol

Abstract: A Wireless Sensor Network is monitored with the ContikiMAC flavor of Cooja to diagnose the nodes' energy utilization ratio and the fault detection process in a distributed approach; the Low Power Listening (LPL) mechanism is adopted with ContikiMAC to prolong the network's lifetime. LPL locates the root cause of communication issues, gets rid of interruption problems, and restores the normal communication state. The LPL mechanism reduces energy utilization in both the centralized and distributed approaches; moreover, the distributed approach is best suited for network monitoring when energy utilization is the main objective in the presence of LPL. It is also important how soon a faulty node can be detected. In this regard, latency makes a vital contribution to the monitoring mechanism, and low latency is achieved by developing an efficient faulty node detection methodology.

Author 1: Shahzad Ashraf
Author 2: Mingsheng Gao
Author 3: Zhengming Chen
Author 4: Syed Kamran Haider
Author 5: Zeeshan Raza

Keywords: Wireless sensor networks; low power listening; ContikiMAC; Cooja

PDF

Paper 53: Relaxed Random Search for Solving K-Satisfiability and its Information Theoretic Interpretation

Abstract: The problem of finding satisfying assignments for a conjunctive normal form formula with K literals in each clause, known as K-SAT, has attracted much attention over the previous three decades. Since it is a known NP-complete problem, an effective solution (finding a solution within polynomial time) would be of great interest due to its relation to the most well-known open problem in computer science (the P=NP conjecture). Different strategies have been developed to solve this problem, but in all of them the complexity remains in the NP class. In this paper, following the recent approach of applying statistical physics methods to analyze the phase transition in the complexity of algorithms used for solving K-SAT, we compute the complexity of using a randomized algorithm to find a solution of K-SAT in more relaxed regions. It is shown how the probability used in the literal flipping process can change the complexity of the algorithm substantially. An information-theoretic interpretation of this reduction in time complexity is also presented.
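The literal-flipping process described in the abstract can be sketched as a random-walk local search in the style of WalkSAT, where the flip probability `p` is precisely the kind of parameter whose effect on complexity the paper studies. This is an illustrative sketch under that assumption, not the authors' algorithm:

```python
import random

def satisfies(assign, clause):
    # clause: list of ints; +v means variable v is true, -v means it is false
    return any((lit > 0) == assign[abs(lit)] for lit in clause)

def random_walk_sat(clauses, n_vars, p=0.5, max_flips=10000, seed=0):
    """Flip variables from unsatisfied clauses: random with prob. p, greedy otherwise."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfies(assign, c)]
        if not unsat:
            return assign          # all clauses satisfied
        clause = rng.choice(unsat)
        if rng.random() < p:
            v = abs(rng.choice(clause))   # random literal flip
        else:
            # greedy: flip the variable that leaves the fewest unsatisfied clauses
            def cost(var):
                assign[var] = not assign[var]
                bad = sum(not satisfies(assign, c) for c in clauses)
                assign[var] = not assign[var]
                return bad
            v = min((abs(lit) for lit in clause), key=cost)
        assign[v] = not assign[v]
    return None                     # no solution found within the flip budget

clauses = [[1, 2, -3], [-1, 3], [2, 3], [-2, -3, 1]]  # toy satisfiable 3-SAT instance
model = random_walk_sat(clauses, n_vars=3)
```

Varying `p` between purely random (p=1) and purely greedy (p=0) walks is what shifts the expected number of flips, which is the quantity the complexity analysis tracks.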

Author 1: Amirahmad Nayyeri
Author 2: Gholamhossein Dastghaibyfard

Keywords: Constraint satisfaction problem; K-SAT; threshold phenomena; randomized algorithm; entropy; NP-completeness

PDF

Paper 54: A Review of State-of-the-Art on Wireless Body Area Networks

Abstract: During the last few years, Wireless Body Area Networks (WBANs) have emerged in many application domains, such as medicine, sport, entertainment, the military, and monitoring. This emerging networking technology can be used for e-health monitoring. In this paper, we review the literature and investigate the challenges in the development and architecture of WBANs, and then classify the challenges that need to be addressed for their development. Moreover, we investigate various diseases and healthcare systems and the current state of the art of applications, focusing mainly on remote monitoring for elderly and chronically ill patients. Finally, relevant research issues and future developments are discussed.

Author 1: Fatemeh Rismanian Yazdi
Author 2: Mehdi Hosseinzadeh
Author 3: Sam Jabbehdari

Keywords: Wireless body area networks; review; challenges; applications; architecture; radio technologies; telemedicine

PDF

Paper 55: Performances Analysis of a SCADA Architecture for Industrial Processes

Abstract: SCADA (Supervisory Control And Data Acquisition) systems are used to monitor and control various industrial processes, and have been continuously developed in order to incorporate new technologies from the software development and fieldbus areas. Middleware communication plays the most relevant role in the development of such complex distributed systems as SCADA systems. These systems are very complex and must be reliable and predictable; furthermore, their performance capabilities are very important. This paper presents a performance analysis of a SCADA system developed for the Windows platform, including Windows Embedded Compact. The analysis focuses on the performance difference between computing systems based on desktop Windows and Windows CE operating systems. Windows CE is useful for applications with real-time requirements that cannot be met by desktop Windows. Testing the application and analyzing the results led to the validation of the proposed SCADA system.

Author 1: Simona-Anda TCACIUC

Keywords: SCADA systems; middleware; data acquisition; data stream; distributed systems

PDF

Paper 56: Performances Comparison of IEEE 802.15.6 and IEEE 802.15.4 Optimization and Exploitation in Healthcare and Medical Applications

Abstract: In this paper, we simulate the energy consumption, throughput and reliability of both the ZigBee IEEE 802.15.4 MAC protocol and BAN IEEE 802.15.6, as exploited in medical applications, using the Guaranteed Time Slot (GTS) and polling mechanisms in the CASTALIA simulator. We then compare and analyze the simulation results. The originality of this work lies in providing decisive factors for choosing the appropriate MAC protocol in a medical context, depending on energy consumption, the number of nodes used, and sensor data rates.

Author 1: C.E. AIT ZAOUIAT
Author 2: A. LATIF

Keywords: Guaranteed Time Slot (GTS); polling; WBAN; IEEE 802.15.6; IEEE 802.15.4; energy consumption

PDF

Paper 57: Model Study and Fault Detection for the Railway System

Abstract: The wheel-rail-sleepers system is simulated as a series of moving point loads on an Euler–Bernoulli beam resting on a visco-elastic half space. This paper concentrates on the rail-sleepers interaction system (railway system) and on fault detection. The main objective is to mathematically develop and implement a dynamic model of a railway system and then diagnose system defects using a Luenberger observer (LO). The simulation results are based on a physical description, mathematical equations and simulations with the MATLAB simulation program.

Author 1: ARFA Rawia
Author 2: TLIJANI Hatem
Author 3: KNANI Jilani

Keywords: Dynamic model; the wheel-rail–sleepers system; interaction system; Euler–Bernoulli; Luenberger observer (LO); fault detection

PDF

Paper 58: NHCA: Developing New Hybrid Cryptography Algorithm for Cloud Computing Environment

Abstract: The amount of data transmitted through the internet becomes larger every day, and the need for an encryption algorithm that guarantees transmitting data speedily and in a secure manner has become a must. The aim of this research is to encrypt and decrypt data efficiently and to effectively protect the transmitted data. This paper presents a model for encrypting transmitted cloud data. The model uses the encryption algorithms RSA, Triple DES, RC4 and Krishna to generate a new encryption algorithm that encrypts and decrypts transmitted data. The algorithm will help cloud agencies and users secure their transmitted data and prevent it from being stolen.

Author 1: Ali Abdulridha Taha
Author 2: Diaa Salama AbdElminaam
Author 3: Khalid M Hosny

Keywords: Hybrid cryptography algorithms; symmetric encryption algorithms; asymmetric encryption algorithms

PDF

Paper 59: Recognizing Rainfall Pattern for Pakistan using Computational Intelligence

Abstract: Across the world, rainfall patterns and seasons are shifting in new directions due to global warming. In the case of Pakistan, unusual rainfall events may result in droughts, floods and other natural disasters, along with disturbance of the economy, so a scientific understanding of rainfall patterns will be very helpful for water management and for the economy. In this paper, we have attempted to recognize rainfall patterns over selected regions of Pakistan. All the time-series data of meteorological stations are taken from the PMD (Pakistan Meteorological Department). Using PCA (Principal Component Analysis), monthly meteorological observations of all the stations in Punjab, which covers an area of 205,344 km² and includes monsoon-dominated regions, have been analyzed. To tackle the problems of inter-annual variations, trend detection and seasonality, rainfall data for Lahore, Pakistan, covering the period 1976-2006, are used. To obtain results, MASH (Moving Average over Shifting Horizon) and PCA were applied, along with other supporting techniques such as bi-plots and pair-wise correlation. The results of this study successfully show seasonal patterns, variations and hidden information in a complex precipitation data structure.

Author 1: M. Ali Aun
Author 2: Abdul Ghani
Author 3: M.Azeem
Author 4: M. Adnan
Author 5: M. Ahsan Latif

Keywords: Rainfall patterns; trend detection; time-series analysis; principal component analysis; box-plot; moving average over shifting horizon; inter-annual variability

PDF

Paper 60: Analysis and Formal Model of RFID-Based Patient Registration System

Abstract: The Patient Registration System (PRS) is an important part of the hospital environment. Therefore, a semiformal model of a Patient Registration System that registers patients by assigning a Radio Frequency Identification (RFID) card or bracelet is presented in this paper. Existing Patient Registration Systems do not work properly due to ambiguities arising from semiformal modeling techniques, which is why we propose formal modeling for the PRS using the Vienna Development Method (VDM-SL). Firstly, we develop a Unified Modeling Language (UML) based semiformal model of the PRS, because UML aids understanding of the system architecture. Formal methods are used to ensure accuracy and robustness of the system; therefore, we transform the UML-based model into a formal model by writing a formal specification of the system, improving the accuracy and efficiency of the PRS. In this way, development time and the testing and maintenance costs of building RFID-based PRS software are reduced to a great extent.

Author 1: Marrium Khalid
Author 2: Hamra Afzaal
Author 3: Shoaib Hassan
Author 4: Nazir Ahmad Zafar

Keywords: Patient registration system; Radio Frequency Identification (RFID); bracelet; formal method; semiformal modeling; verification

PDF

Paper 61: Adaptive Multilayered Particle Swarm Optimized Neural Network (AMPSONN) for Pipeline Corrosion Prediction

Abstract: Artificial Neural Network (ANN) design has long been a complex problem, because ANN performance depends heavily on the network topology and on the algorithm used to train the set of synaptic weights. Particle Swarm Optimization (PSO) has been the favored optimization algorithm to complement ANN, but a thorough literature study has shown that there are gaps in current approaches that integrate PSO with ANN, including the optimization of network topology and an unreliable weight training process. These gaps have an adverse effect on critical Artificial Intelligence (AI) applications and systems, particularly when predicting plant machinery and piping failure due to corrosion. The problem of corrosion prediction in the oil and gas domain remains unsolved due to the lack of a flexible prediction method that targets the specific damage mechanisms causing corrosion. This paper proposes a hybrid prediction method known as the Adaptive Multilayered Particle Swarm Optimized Neural Network (AMPSONN), which integrates several layers of PSO to optimize different parameters of the ANN. The multilayered PSO enables the method to optimize the network topology and train the set of synaptic weights at the same time, using a hierarchical optimization approach. Following a detailed discussion and literature study, the damage mechanism focused on in this research is CO2 corrosion, and the dataset is obtained from the NORSOK empirical model. The proposed AMPSONN method is tested against the BP, MPSO and PSOBP methods on an industrial corrosion dataset under different test conditions. The results show that AMPSONN performs best on all three problems, exhibiting high classification accuracy and time efficiency.

Author 1: Kien Ee Lee
Author 2: Izzatdin bin Abdul Aziz
Author 3: Jafreezal bin Jaafar

Keywords: Corrosion; damage mechanism; prediction method; artificial neural network; particle swarm optimization

PDF

Paper 62: Implementation of Pattern Matching Algorithm for Portable Document Format

Abstract: With internet availability, e-documents are freely used in the community. This condition creates the potential for acts of plagiarism against e-documents of scientific work. The process of detecting plagiarism is in some cases done manually, which carries the potential for mistakes in observing and remembering the checkpoints already covered. The method used in this research represents the similarity of two sets of compared objects in the form of a probability. To make the method work, the Rabin-Karp algorithm is applied. Rabin-Karp is a string matching algorithm that uses hash functions to compare the searched string (of length m) against substrings of the text (of length n); if the two hash values are equal, a character-by-character comparison is performed to confirm the match. The resulting system is a web-based application that shows the similarity value of two sets of objects.
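The hash-then-verify scheme the abstract describes can be sketched as follows: a rolling polynomial hash lets each length-m window of the text be compared against the pattern in O(1), with a direct character comparison only when the hashes match. This is a generic Rabin-Karp sketch (the base and modulus are illustrative choices), not the authors' implementation:

```python
def rabin_karp(text, pattern, base=256, mod=101):
    """Return all start indices where pattern occurs in text."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    h = pow(base, m - 1, mod)          # weight of the leading character
    p_hash = t_hash = 0
    for i in range(m):                 # hash the pattern and the first window
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        # hashes equal: confirm character by character to rule out collisions
        if p_hash == t_hash and text[i:i + m] == pattern:
            hits.append(i)
        if i < n - m:                  # roll the window one character to the right
            t_hash = ((t_hash - ord(text[i]) * h) * base + ord(text[i + m])) % mod
    return hits
```

In a plagiarism checker the same idea is applied to k-grams of both documents, and the fraction of shared hashes yields the similarity value the system reports.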

Author 1: Anton Yudhana
Author 2: Sunardi
Author 3: Abdul Djalil Djayali

Keywords: Pattern matching; Rabin-Karp algorithm; data mining; web

PDF

Paper 63: Collective Movement Method for Swarm Robot based on a Thermodynamic Model

Abstract: In this paper, a distributed collective movement control method is proposed for a swarm robotics system based on an internal-energy thermodynamic model. The system can move between obstacles, changing its aggregation to suit the obstacle arrangements encountered in the environment. Under the proposed method, the swarm forms a fixed aggregation shaped by virtual attraction and repulsion forces, and follows a leader agent while retaining its shape. When the aggregation shape cannot be maintained during movement through narrow spaces with obstacles, the swarm flexibly changes shape according to the local environment. To this end, it employs virtual thermal motion, enabled by directives, which allows continuous movement. A simulation confirmed the capability of the proposed method to enable both solid and flexible collective movement of swarm robots. The results furthermore showed that the parameter setting range is important for applying the proposed method to collective movement.
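The virtual attraction and repulsion forces that hold such an aggregation together can be illustrated with a minimal pairwise-force sketch. The gains `k_a`, `k_r` and the equilibrium distance `d_eq` below are hypothetical, not taken from the paper:

```python
import math

def pair_force(p_i, p_j, d_eq=1.0, k_a=0.5, k_r=0.8):
    """Virtual force on robot i from robot j (2D): repulsive inside the
    equilibrium distance d_eq, attractive outside it."""
    dx, dy = p_j[0] - p_i[0], p_j[1] - p_i[1]
    d = math.hypot(dx, dy)
    if d == 0.0:
        return (0.0, 0.0)              # coincident robots: no defined direction
    if d < d_eq:
        mag = -k_r * (d_eq - d)        # too close: push apart
    else:
        mag = k_a * (d - d_eq)         # too far: pull together
    return (mag * dx / d, mag * dy / d)

# Robots farther apart than d_eq attract; closer ones repel
attract = pair_force((0.0, 0.0), (2.0, 0.0))
repel = pair_force((0.0, 0.0), (0.5, 0.0))
```

Summing such pairwise forces over a robot's neighbors yields an equilibrium spacing, and perturbing the forces (the paper's virtual thermal motion) lets the shape deform in narrow passages.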

Author 1: Kouhei YAMAGISHI
Author 2: Tsuyoshi SUZUKI

Keywords: Swarm robotics system; solidity and flexibility collective movement; thermodynamics; distributed control

PDF

Paper 64: Optimization and Simulation Approach for Empty Containers Handling

Abstract: Container handling problems at container terminals are NP-hard. In this paper, we propose a new handling operation design and simulation for empty containers, taking into account the interrelated activities at the container terminal. The simulation is built around a doubled trailer, which moves containers from the quayside to the yard side or in the opposite direction depending on the flow in the container terminal, and which is used to optimize cycle time and improve the efficiency of the other equipment. Our interest is to test this new model first for empty containers. The proposed model is applied to real case-study data from the container terminal at Tanger Med port. The new design was developed using Arena software, verifying the strength-of-materials constraint for the loaded containers. The computational results show the effectiveness of the proposed model: the cycle time of the port equipment has been reduced by 58%, and efficiency has increased, with 47% more moves achieved in the container terminal.

Author 1: Chafik Razouk
Author 2: Youssef Benadada

Keywords: Container terminal; design; doubled trailer; simulation; arena; strength of materials; quay

PDF

Paper 65: Traffic Signs Recognition using HP and HOG Descriptors Combined to MLP and SVM Classifiers

Abstract: Detection and recognition of traffic signs in a video stream consists of two steps: the detection of signs in the road scene and the recognition of their type. This process is usually evaluated globally, which unfortunately does not allow a fine analysis of the performance of each step; it is difficult to know which step needs improvement to obtain a more efficient system. Our previous work focused on real-time detection of road signs, improving the performance of the detection step. In this paper, we complete that work by focusing on the recognition step, where we compare the performance of the histogram projection (HP) descriptor and the histogram of oriented gradients (HOG) descriptor, each combined with the Multi-Layer Perceptron (MLP) classifier and the Support Vector Machine (SVM) classifier, to compute characteristics and descriptors of the objects extracted in the detection step and identify the kind of traffic sign. Experimental results present the performance of the four descriptor-classifier combinations, identifying which of them has the highest performance for traffic sign recognition.

Author 1: A. Salhi
Author 2: B. Minaoui
Author 3: M. Fakir
Author 4: H. Chakib
Author 5: H. Grimech

Keywords: Traffic signs detection and recognition; Histogram of oriented gradient (HOG); Support Vector Machine (SVM); Histogram projection (HP); Multi-layer perceptron (MLP)

PDF

Paper 66: Model Driven Development Transformations using Inductive Logic Programming

Abstract: Model transformation by example is a novel approach in model-driven software engineering. The rationale behind the approach is to derive transformation rules from an initial set of interrelated source and target models, e.g., requirements analysis and software design models. The derived rules describe the different transformation steps in a purely declarative way. Inductive Logic Programming utilizes the power of machine learning and the capability of logic programming to induce valid hypotheses from given examples. In this paper, we use Inductive Logic Programming to derive transformation rules from given examples of analysis-design pairs. As a proof of concept, we applied the approach to two major software design tasks: class packaging and introducing a Façade design. Various analysis-design model pairs collected from different sources were used as case studies. The resulting performance measures show that the approach is promising.

Author 1: Hamdi A. Al-Jamimi
Author 2: Moataz A. Ahmed

Keywords: Transformation model; software design models; transformation rules; inductive logic programming

PDF
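
To give a concrete flavor of a declaratively expressed transformation rule, here is a toy sketch (not taken from the paper): a hypothetical class-packaging rule of the kind ILP might induce, stated over a made-up set of dependency facts.

```python
# Facts about a hypothetical analysis model: which class depends on which.
depends_on = {
    ("Order", "Product"), ("Order", "Customer"),
    ("Invoice", "Product"), ("Invoice", "Customer"),
    ("Logger", "File"),
}

def deps(cls):
    """All classes that `cls` depends on."""
    return {t for s, t in depends_on if s == cls}

def same_package(a, b):
    """An induced rule of the kind ILP could learn for class packaging:
    put two classes in one package when they share at least two
    dependencies. The threshold is illustrative."""
    return len(deps(a) & deps(b)) >= 2
```

In ILP terms, `same_package/2` is the target predicate and `depends_on/2` the background knowledge; the learner's job is to find the rule body from example analysis-design pairs.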

Paper 67: Software Refactoring Approaches: A Survey

Abstract: The objective of software refactoring is to improve a software product's quality by improving its performance and understandability; refactoring can also improve other quality attributes. This study gives a broad overview of five primary approaches to software refactoring: two clustering approaches at the class level, two clustering approaches at the package level, and one graph-transformation approach at the class level. The study also compares the approaches using several evaluation criteria.

Author 1: Ismail M. Keshta

Keywords: Software refactoring; refactoring tool; machine learning; hierarchical clustering; graph transformations

PDF

Paper 68: Investigating Clinical Decision Support Systems Success Factors with Usability Testing

Abstract: Clinical Decision Support Systems (CDSS) have been used widely since the 2000s to improve healthcare quality. A CDSS can support healthcare services as a tool to diagnose and predict, as well as to provide clinical interpretation, alerts, and reminders. The literature contains many studies of CDSS implementation, but few present evidence of successful implementation; indeed, despite the potential of CDSS, some studies reveal failed implementations. This paper contributes to CDSS development by investigating and exploring CDSS success factors through usability testing. The testing involves participants from different backgrounds (physicians, IT developers, and students), who were asked to use three different CDSS to predict cardiovascular risk factors. The results show that involving different types of users gives more insight into the design process. It can be concluded that user-centered design is critical to producing a successful CDSS.

Author 1: Vitri Tundjungsari
Author 2: Abdul Salam Mudzakir Sofro
Author 3: Ahmad Sabiq
Author 4: Aan Kardiana

Keywords: Clinical decision support systems; success factors; user; usability testing

PDF

Paper 69: Design and Simulation of Adaptive Controller for Single Phase Grid Connected Photovoltaic Inverter under Distorted Grid Conditions

Abstract: This paper presents an adaptive controller for a single-phase grid-connected photovoltaic inverter under abnormal grid conditions. The main problem with controllers for grid-connected inverters is that they are tuned for assumed values of the electrical grid parameters. When parameters such as voltage and frequency change, or the grid is subjected to uncertain distortion, these controllers are unable to track the variations of the grid parameters and keep the output power within the allowable limits. To overcome this problem, a control strategy is proposed based on frequency-adaptive current control and accurate grid detection. For validation, a controlled 3 kW system with specific features was designed and simulated. The simulation results confirm that the strategy is an effective means of control.

Author 1: Mohamed Alswidi
Author 2: Abdulaziz Aldobhani
Author 3: Abdurraqib Assad

Keywords: Single phase grid-connected photovoltaic inverter; adaptive controller; grid parameter variations

PDF
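
As a small illustration of the grid-detection side of such a strategy, the sketch below estimates grid frequency from a sampled voltage waveform by timing rising zero crossings. This is an assumed, simplified stand-in for the paper's detection scheme, with illustrative sampling values.

```python
import numpy as np

def estimate_frequency(v, fs):
    """Estimate grid frequency (Hz) from samples `v` taken at rate `fs`
    by averaging the intervals between rising zero crossings -- one
    simple form of grid detection an adaptive controller can use
    before retuning its current loop."""
    signs = np.sign(v)
    rising = np.where((signs[:-1] <= 0) & (signs[1:] > 0))[0]
    if len(rising) < 2:
        return None                    # not enough cycles observed
    periods = np.diff(rising) / fs     # seconds per cycle
    return 1.0 / periods.mean()

fs = 10_000                            # 10 kHz sampling, illustrative
t = np.arange(0, 0.2, 1 / fs)
v = np.sin(2 * np.pi * 49.3 * t)       # grid drifted below nominal 50 Hz
```

A real implementation would typically use a phase-locked loop for robustness to harmonics, but the zero-crossing version shows the adaptation input plainly.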

Paper 70: A Short Review of Gender Classification based on Fingerprint using Wavelet Transform

Abstract: In some cases, knowing the gender of the owner of a fingerprint found at a crime or disaster scene is advantageous. Theoretically, if the numbers of male and female fingerprints in a database are equal, then identifying a fingerprint in that database would be twice as fast. Several methods have been used to classify gender based on fingerprints, most of them based on ridge density. This approach has shown good results; however, it is sensitive to the location of the fingerprint area in which the ridge density is determined. This paper reviews literature that uses the wavelet transform to generate fingerprint features. As far as we could determine, the number of papers on this topic is very limited. Based on the literature we reviewed, however, the wavelet transform offers some advantages over ridge density counting.

Author 1: Sri Suwarno
Author 2: P. Insap Santosa

Keywords: Fingerprint; gender; ridge density; wavelet transform

PDF
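
To illustrate the kind of wavelet features the surveyed papers build on, the sketch below implements one level of a 2D Haar transform in plain NumPy and takes the detail sub-band energies as a feature vector. The choice of the Haar wavelet and of energy features is an assumption for illustration; the reviewed papers may use other wavelets and statistics.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2D Haar wavelet transform on an even-sized image,
    returning the approximation (LL) and detail (LH, HL, HH) sub-bands."""
    a = (img[0::2] + img[1::2]) / 2       # rows: low-pass
    d = (img[0::2] - img[1::2]) / 2       # rows: high-pass
    ll = (a[:, 0::2] + a[:, 1::2]) / 2
    lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2
    hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def subband_energies(img):
    """Feature vector: energy of each detail sub-band, a common
    wavelet-based texture feature for fingerprint images."""
    _, lh, hl, hh = haar_dwt2(img)
    return np.array([np.sum(b ** 2) for b in (lh, hl, hh)])
```

Unlike ridge-density counting, these features summarize the whole image, which is one reason wavelet methods are less sensitive to where the measurement window is placed.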

Paper 71: An Optimal Load Balanced Resource Allocation Scheme for Heterogeneous Wireless Networks based on Big Data Technology

Abstract: An important issue in heterogeneous wireless networks is how to optimally utilize the various radio resources. While many methods have been proposed for managing radio resources within a single network, these methods are not suitable for heterogeneous wireless networks. In this study, a new management method is proposed which provides acceptable service quality and roaming rates, reduces the cost of service, and utilizes big data technology in its operation. In the proposed scheme, the most suitable radio access technology is selected by considering various parameters such as the type of service, the user's location, the user's direction of movement, the cost of service, and a number of other statistical measures. Besides improving decision-making accuracy in selecting the radio access technology and balancing the distribution of network resources, the proposed method is expected to provide lower roaming rates and a lower probability of blocking roaming requests entering the network. By considering the various service classes and their quality-of-service requirements regarding delay, jitter, and so on, the method can be useful for the optimal implementation of heterogeneous wireless networks.

Author 1: Abbas Mirzaei
Author 2: Morteza Barari
Author 3: Houman Zarrabi

Keywords: Heterogeneous wireless networks; radio resource management; quality of service; big data technology; decision making

PDF
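
One simple way a radio-access-technology selection of this kind can be realized is a weighted-sum score over normalized attributes. The candidate networks, attribute values, and weights below are purely illustrative, not the paper's.

```python
# Hypothetical candidate radio access technologies, with attributes
# normalized to [0, 1] (for "cost", higher means cheaper).
candidates = {
    "LTE":  {"throughput": 0.9, "cost": 0.4, "load": 0.7},
    "WiFi": {"throughput": 0.7, "cost": 0.9, "load": 0.3},
    "3G":   {"throughput": 0.4, "cost": 0.6, "load": 0.5},
}

# Illustrative weights; current load is penalized to balance resources.
weights = {"throughput": 0.5, "cost": 0.3, "load": -0.2}

def score(attrs):
    """Weighted sum of a candidate's attributes."""
    return sum(weights[k] * attrs[k] for k in weights)

best = max(candidates, key=lambda name: score(candidates[name]))
```

In a real scheme the weights would differ per service class (delay-sensitive vs. bulk traffic), and the attribute values would come from the big-data layer the abstract describes.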

Paper 72: Forced-Driven Wet Cloth Simulation based on External Physical Dynamism

Abstract: Cloth simulation has remained challenging for the past two decades. Several factors contribute to this challenge, such as internal and external forces and fluid elements like water and oil. This paper focuses on simulating wet cloth by considering external forces and a water element. Initially, the mass-spring technique is used to produce a cloth sheet composed of a matrix of points connected by springs; external and internal forces are then applied to the cloth surface. Internal forces are represented by the stiffness of the springs between cloth particles, while external forces depend on wind pressure and gravity acting on the object's mass. The wet cloth simulation starts by adding a fluid component to the textile elements, which affects the mass of the cloth itself. The cloth absorbs a significant quantity of fluid, which alters the tension between the spring particles inside the cloth. The experiment was conducted by simulating the cloth while it absorbs fluid, governed by a particular equation. It shows that the saturation level of the cloth changes and its texture becomes darker compared to dry cloth; the darkest color reflects the highest saturation level, meaning the cloth cannot absorb more fluid because it has reached capacity. The evaluation compares the dry and wet cloth in terms of motion and physical appearance. It is concluded that the proposed method is able to produce a convincing wet cloth simulation with a high frames-per-second (FPS) rate and realistic motion and appearance. Future work can focus on simulating the interaction between the fluid and cloth elements, such as a spill scene or washing cloth, which remain challenging.

Author 1: Ahmad Hoirul Basori
Author 2: Hani Moaiteq Abdullah AlJahdali
Author 3: Omar Salim Abdullah

Keywords: Wet cloth simulation; fabric; mass spring; fluid; wind; gravity forces

PDF
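
A minimal mass-spring sketch conveys the core mechanics described above: one pinned particle and one free particle joined by a spring, under gravity, with absorbed fluid modeled as extra mass. All constants are illustrative, not the paper's.

```python
import numpy as np

K, REST, DT = 50.0, 1.0, 0.001           # stiffness, rest length, time step
G = np.array([0.0, -9.81])               # gravity

def step(pos, vel, mass, fluid_mass=0.0):
    """Advance the free particle pos[1] one step (pos[0] is pinned).
    Absorbed fluid adds to the mass, so wet cloth sags further.
    Symplectic Euler (velocity first) keeps the oscillation stable."""
    total = mass + fluid_mass
    d = pos[1] - pos[0]
    length = np.linalg.norm(d)
    force = -K * (length - REST) * d / length + G * total
    vel = vel + (force / total) * DT
    pos = pos.copy()
    pos[1] = pos[1] + vel * DT
    return pos, vel

def lowest_point(fluid_mass, steps=2000):
    """Hang the particle at rest length and record how far it sags."""
    pos = np.array([[0.0, 0.0], [0.0, -1.0]])
    vel = np.zeros(2)
    lo = 0.0
    for _ in range(steps):
        pos, vel = step(pos, vel, mass=0.1, fluid_mass=fluid_mass)
        lo = min(lo, pos[1, 1])
    return lo

dry_low = lowest_point(0.0)
wet_low = lowest_point(0.1)   # absorbed fluid doubles the particle's mass
```

A full cloth sheet applies the same per-spring force over a grid of particles, adding wind as an external force on each face.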

Paper 73: Tsunami Warning System with Sea Surface Features Derived from Altimeter Onboard Satellites

Abstract: A tsunami warning system is proposed, based on an active database system with real-time satellite-derived data on tides, significant wave height, and ocean wind speed, together with assimilation data on sea level changes, as part of a global risk management system. A Geographic Information System (GIS) built on the free open-source software PostGIS is also proposed for the active database system. The results suggest that the proposed system for providing tsunami warning and evacuation information is recommendable.

Author 1: Kohei Arai

Keywords: Active database system; ocean related data stream; assimilation data; altimeter onboard satellites; Geographic Information System (GIS); tsunami

PDF
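
The active-database behavior can be caricatured as an insert trigger: a rule fires automatically when a newly arrived sea-level observation deviates from the tidal prediction by more than a threshold. The station names, values, and threshold below are hypothetical; a PostGIS deployment would express the same rule as a database trigger.

```python
THRESHOLD_M = 0.5   # illustrative anomaly (observed minus predicted tide)

def check_observation(station, observed_m, predicted_tide_m, alerts):
    """Insert-trigger analogue: append an alert when the sea-level
    anomaly at a station exceeds the threshold."""
    anomaly = observed_m - predicted_tide_m
    if abs(anomaly) > THRESHOLD_M:
        alerts.append((station, round(anomaly, 2)))

alerts = []
check_observation("ST-01", 1.9, 1.2, alerts)   # 0.7 m anomaly -> alert
check_observation("ST-02", 1.0, 0.9, alerts)   # 0.1 m anomaly -> no alert
```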

Paper 74: Information Security and Learning Content Management System (LCMS)

Abstract: The learning environment has recently undergone a quantum leap due to rapid growth in information technology. This development has allowed the e-learning environment to take advantage of electronic tools to improve teaching methods using an LCMS. The emergence of many e-learning institutions has accelerated the adoption of information and communication technology without due care for, and understanding of, security concerns. The LCMS is a new learning method that ultimately relies on the web for its implementation. This article discusses essential elements of information security (IS) that must be applied through the learning content management system. The paper also identifies IS countermeasures that can strengthen security within the learning content management system.

Author 1: Walid Qassim Qwaider

Keywords: E-Learning; LCMS; LMS; CMS; information security (IS)

PDF

Paper 75: Examining Software Intellectual Property Rights

Abstract: The intellectual property rights (IPR) of computer software assign the software to its creator; they are not limited in time or space and are non-transferable. Proving the IPR of the creators of computer software requires a rigorous review of the ways in which these rights may be violated. The present study compared two populations in Iran with the aim of identifying their level of familiarity with, and observance of, software IPR: 1) 96 software engineers who are IEEE members, and 2) 386 randomly selected students. The results were analyzed with SPSS software and their validity verified using a t-test. The comparison showed that the first population observed these rights significantly more often. A model is then presented for protecting software IPR so as to reduce these challenges. This research completes our previous work, where it was proposed as future work.

Author 1: Ehsan Sargolzaei
Author 2: Fateme Keikha

Keywords: Software intellectual property rights (IPR); software piracy; copyright; patent

PDF

Paper 76: Dimensionality Reduction using Hybrid Support Vector Machine and Discriminant Independent Component Analysis for Hyperspectral Image

Abstract: A hyperspectral image is an image obtained from a satellite sensor. It has more than 100 bands covering a wide spectral range with increased spatial resolution, providing detailed information on the objects and materials present on the ground. For this reason, hyperspectral images are well suited to classifying the earth's surface cover, and research on hyperspectral data has recently been increasing. Transforming the original data into a reduced-dimension space is often done to overcome the 'curse of dimensionality', in which complexity tends to grow exponentially with the number of dimensions. In a dimensionality reduction procedure, the data are mapped from the original space to a lower-dimensional space that must still represent the input observations effectively. In this research we therefore propose a hybrid dimensionality reduction method for hyperspectral images that combines Support Vector Machine (SVM) and Discriminant Independent Component Analysis (DICA) techniques to reduce the original data and obtain better accuracy, with KNN as the classifier. Experiments on the AVIRIS dataset yield an average accuracy of 0.7527, an overall accuracy of 0.7901, and a Kappa of 0.7608.

Author 1: Murinto
Author 2: Nur Rochmah Dyah PA

Keywords: Classification; discriminant independent component analysis; support vector machine; hyperspectral image

PDF
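
The reduce-then-classify pipeline can be sketched with scikit-learn, using FastICA as a stand-in for the paper's discriminant ICA and synthetic "spectra" as a stand-in for AVIRIS data. All dimensions, class counts, and noise levels below are illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n_bands = 100                                   # stand-in for >100 bands
centers = rng.normal(size=(3, n_bands))         # 3 toy land-cover classes
X = np.vstack([c + 0.2 * rng.normal(size=(30, n_bands)) for c in centers])
y = np.repeat([0, 1, 2], 30)

# Reduce 100 bands to 10 independent components, then classify with KNN,
# mirroring the abstract's reduce-then-classify structure.
model = make_pipeline(FastICA(n_components=10, random_state=0),
                      KNeighborsClassifier(n_neighbors=3))
model.fit(X, y)
```

The paper's hybrid adds an SVM-based discriminant step to the reduction itself; the sketch shows only the overall pipeline shape.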

Paper 78: NoSQL Racket: A Testing Tool for Detecting NoSQL Injection Attacks in Web Applications

Abstract: A NoSQL injection attack targets interactive web applications that employ NoSQL database services. These applications accept user inputs and use them to form query statements at runtime. During a NoSQL injection attack, an attacker may provide malicious query segments as user input, resulting in a different database request. In this paper, a testing tool called "NoSQL Racket" is presented to detect NoSQL injection attacks in web applications. The basic idea of the tool is to check the intended structure of the NoSQL query by comparing the query structure in the source code (static code analysis) with the runtime query statement (dynamic analysis). A major challenge is that, unlike relational databases with SQL as a standardized query language, NoSQL databases have no common query language. The proposed tool was tested on four different vulnerable web applications and its effectiveness compared against three well-known testing tools; none of them was able to detect any NoSQL injection attacks, whereas the implemented tool detected them.

Author 1: Ahmed M. Eassa
Author 2: Omar H. Al-Tarawneh
Author 3: Hazem M. El-Bakry
Author 4: Ahmed S. Salama

Keywords: NoSQL; injection attack; web application; web security; testing tool

PDF
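
The structural check this abstract describes can be sketched as comparing query "shapes": the set of keys and operators in the runtime query must match the template extracted from the source code, ignoring user-supplied values. The MongoDB-style query documents below are hypothetical examples, not the paper's test cases.

```python
def query_shape(q, prefix=""):
    """Flatten a query document into its structural signature (key paths
    and operators), discarding user-supplied scalar values."""
    shape = set()
    for k, v in q.items():
        path = f"{prefix}.{k}" if prefix else k
        shape.add(path)
        if isinstance(v, dict):               # nested operators like $ne
            shape |= query_shape(v, path)
    return shape

intended = {"user": "?", "pwd": "?"}              # from static code analysis
benign   = {"user": "alice", "pwd": "s3cret"}     # runtime, legitimate input
injected = {"user": "alice", "pwd": {"$ne": ""}}  # runtime, injected operator
```

A benign request preserves the intended shape, while the classic `$ne` injection adds an operator key and so changes it, which is what the static-vs-dynamic comparison catches.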

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org