IJACSA Volume 10 Issue 2

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: A Hazard Detection and Tracking System for People with Peripheral Vision Loss using Smart Glasses and Augmented Reality

Abstract: Peripheral vision loss is the inability to recognise objects and shapes in the outer area of the visual field. This condition can affect people’s daily activities and reduce their quality of life. In this work, a smart technology that implements computer vision algorithms in real time to detect and track moving hazards around people with peripheral vision loss is presented. Using smart glasses, the system processes real-time captured video and produces warning notifications based on predefined hazard danger levels. Unlike other obstacle avoidance systems, this system can track moving objects in real time and classify them based on their motion features (such as speed, direction, and size) to display early warning notifications. A moving-camera motion compensation method was used to overcome artificial motion caused by camera movement before the object detection phase. The detected moving objects were tracked to extract motion features, which were used to check whether the moving object is a hazard. A detection system for camera motion states was implemented and tested on real street videos as the first step before the object detection phase. The system shows promising results in the motion detection, motion tracking, and camera motion detection phases. Initial tests have been carried out on Epson’s smart glasses to evaluate the real-time performance of the system. The proposed system will be implemented as an assistive technology that can be used in daily life.

Author 1: Ola Younis
Author 2: Waleed Al-Nuaimy
Author 3: Mohammad H. Alomari
Author 4: Fiona Rowe

Keywords: Peripheral vision loss; vision impairment; computer vision; assistive technology; motion compensation; optical flow; smart glasses

PDF
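
As a rough illustration of the moving-camera compensation step described above, the sketch below estimates the global camera motion from matched ORB features, warps the previous frame, and differences the frames so that independently moving objects stand out. OpenCV and a hypothetical input file street.mp4 are assumed; this is a minimal sketch, not the authors' implementation.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("street.mp4")          # hypothetical street-scene video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
orb = cv2.ORB_create(500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Estimate the global (camera) motion from matched ORB keypoints.
    k1, d1 = orb.detectAndCompute(prev_gray, None)
    k2, d2 = orb.detectAndCompute(gray, None)
    matches = matcher.match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches])
    dst = np.float32([k2[m.trainIdx].pt for m in matches])
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    if M is None:                             # fall back to "no camera motion"
        M = np.eye(2, 3, dtype=np.float32)

    # Warp the previous frame to cancel camera motion, then difference the
    # frames so that only independently moving objects remain.
    warped = cv2.warpAffine(prev_gray, M, (gray.shape[1], gray.shape[0]))
    diff = cv2.absdiff(gray, warped)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    prev_gray = gray

cap.release()
```

The resulting motion mask would then feed the tracking and hazard-classification stages the abstract describes.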

Paper 2: Adaptive Generalized Gaussian Distribution Oriented Thresholding Function for Image De-Noising

Abstract: In this paper, an Adaptive Generalized Gaussian Distribution (AGGD) oriented thresholding function for image de-noising is proposed. This technique utilizes a unique threshold function derived from the generalized Gaussian function obtained from the HH sub-band in the wavelet domain. Two-dimensional discrete wavelet transform is used to generate the decomposition. Having the threshold function formed by using the distribution of the high frequency wavelet HH coefficients makes the function data dependent, hence adaptive to the input image to be de-noised. Thresholding is performed in the high frequency sub-bands of the wavelet transform in the interval [-t, t], where t is calculated in terms of the standard deviation of the coefficients in the HH sub-band. After thresholding, inverse wavelet transform is applied to generate the final de-noised image. Experimental results show the superiority of the proposed technique over other alternative state-of-the-art methods in the literature.

Author 1: Noorbakhsh Amiri Golilarz
Author 2: Hasan Demirel
Author 3: Hui Gao

Keywords: Adaptive generalized Gaussian distribution; thresholding function; image de-noising; high frequency sub-bands

PDF
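
The thresholding step described above can be sketched with PyWavelets as follows, assuming a grayscale image held in a NumPy array `img`. The threshold here is simply a multiple of the HH-subband standard deviation, standing in for the paper's AGGD-derived threshold function, which is not reproduced.

```python
import numpy as np
import pywt

def denoise(img, wavelet="db4", k=3.0):
    # Single-level 2-D DWT: approximation plus horizontal/vertical/diagonal details.
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), wavelet)
    t = k * np.std(cD)                      # threshold derived from the HH (diagonal) subband
    # Soft-threshold the high-frequency subbands in the interval [-t, t].
    cH, cV, cD = (pywt.threshold(c, t, mode="soft") for c in (cH, cV, cD))
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)
```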

Paper 3: Smart Building’s Elevator with Intelligent Control Algorithm based on Bayesian Networks

Abstract: Implementation of intelligent elevator control systems based on machine-learning algorithms should play an important role in our effort to improve the sustainability and convenience of multi-floor buildings. Traditional elevator control algorithms are not capable of operating efficiently in the presence of uncertainty caused by the random flow of people. As opposed to the conventional elevator control approach, the proposed algorithm utilizes information about passenger group sizes and their waiting time, provided by an image acquisition and processing system. Next, this information is used by the probabilistic decision-making model to conduct Bayesian inference and update the variable parameters. The proposed algorithm utilizes the variable elimination technique to reduce the computational complexity associated with the calculation of marginal and conditional probabilities, and the Expectation-Maximization algorithm to ensure the completeness of the data sets. The proposed algorithm was evaluated by assessing the correspondence level of the resulting decisions with the expected ones. A significant improvement in correspondence level was obtained by adjusting the probability distributions of the variables affecting the decision-making process. The aim was to construct a decision engine capable of controlling the elevator’s actions in a way that improves user satisfaction. Both a sensitivity analysis and an evaluation study of the implemented model, according to several scenarios, are presented. The overall algorithm proved to exhibit the desired behavior in 94% of the scenarios tested.

Author 1: Yerzhigit Bapin
Author 2: Vasilios Zarikas

Keywords: Bayesian network; smart city; smart building; elevator control algorithm; intelligent elevator system; decision theory; decision support systems

PDF

Paper 4: Investigating the Impact of Mobility Models on MANET Routing Protocols

Abstract: A mobile ad hoc network (MANET) is a type of multi-hop network operating under different movement patterns without requiring any fixed infrastructure or centralized control. The mobile nodes in this network move arbitrarily and the topology changes frequently. In MANETs, routing protocols play an important role in providing reliable communication between nodes. There are several issues affecting the performance of MANET routing protocols. Mobility is one of the most significant factors that has an impact on the routing process. In this paper, FCM, SCM, RWM and HWM mobility models are designed to analyze the performance of the AODV, OLSR and GRP protocols, with ten pause time values. These models are based on varying speeds and pause times of MANET participants. Different node parameters such as data drop rate, average end-to-end delay, media access delay, network load, retransmission attempts and throughput are used to make a performance comparison between mobility models. The simulation results showed that in most of the cases the OLSR protocol provides better performance than the other two routing protocols, and that it is more suitable for networks that require low delay and retransmission attempts, and high throughput.

Author 1: Ako Muhammad Abdullah
Author 2: Emre Ozen
Author 3: Husnu Bayramoglu

Keywords: MANET; routing protocols; AODV; OLSR; GRP; node mobility

PDF

Paper 5: Several Jamming Attacks in Wireless Networks: A Game Theory Approach

Abstract: Wireless jamming attacks have recently been the subject of several studies, due to the exposed nature of the wireless medium. This paper studies anti-jamming resistance in the presence of several attackers. Two kinds of jammers are considered: smart jammers, which have the ability to sense the legitimate signal power, and regular jammers, which do not have this ability. An Anti Multi-Jamming based Power Control problem, modeled as a non-zero-sum game, is proposed to study how the transmitter can adjust its signal power against several jamming attacks. A closed-form expression of the Nash Equilibrium is derived when the players’ actions are taken simultaneously. In addition, a closed-form expression of the Stackelberg Equilibrium is derived when a hierarchical behavior between the transmitter and jammers is assumed. Simulation results show that the proposed scheme can enhance the anti-jamming resistance against several attackers. Furthermore, this study shows that, on the transmitter side, the most dangerous jammer is the one with the highest ratio between channel gain and jamming cost. Finally, based on the Q-Learning technique, the transmitter can learn autonomously without knowing the patterns of the attackers.

Author 1: Moulay Abdellatif Lmater
Author 2: Majed Haddad
Author 3: Abdelillah Karouit
Author 4: Abdelkrim Haqiq

Keywords: Wireless communications; game theory; jamming attacks; stackelberg game; nash game

PDF

Paper 6: Clustering of Multidimensional Objects in the Formation of Personalized Diets

Abstract: When developing personalized diets (personalized nutrition) it is necessary to take into account the individual physiological nutritional needs of the body associated with the presence of gene polymorphism among consumers. This greatly complicates the development of rations and increases their cost. A methodology for the formation of target diets based on a multidimensional object clustering method is proposed. Clustering in the experimental group was carried out on the basis of a calculation of the integral assessment of reliable risks of developing disease conditions according to selected metabolic processes, and the genetic data of participants was taken into account. The use of the proposed method reduced the number of typical individual diet solutions needed for the experimental group from 10 to 3.

Author 1: Valentina N. Ivanova
Author 2: Igor A. Nikitin
Author 3: Natalia A. Zhuchenko
Author 4: Marina A. Nikitina
Author 5: Yury I. Sidorenko
Author 6: Vladimir I. Karpov
Author 7: Igor V. Zavalishin

Keywords: Multidimensional objects clustering method; integral assessment of reliable risks; nutritional needs of the body; personalized nutrition

PDF

Paper 7: Optimized Field Oriented Control Design by Multi Objective Optimization

Abstract: Permanent Magnet Synchronous Motors are popular electrical machines in industry because they have high efficiency, a low weight-to-power ratio and smooth torque with little or no ripple. At the same time, control of a synchronous motor is a complex process. Vector control techniques are widely used for the control of synchronous motors because they simplify the control of AC machines. In this study, the Field Oriented Control technique is used as a speed controller of a Permanent Magnet Synchronous Motor. The controller must be well tuned for applications that need high performance, and classical methods are not sufficient or need more time to achieve the requested performance criteria. Optimization algorithms are good options for the tuning process of controllers. They guarantee finding one of the best solutions and need less time for solving the problem. Therefore, in this study, the Tree-Seed Algorithm is used for the tuning process of the controller parameters, and the results show that the Tree-Seed Algorithm is a good tool for controller tuning. The controller is also tuned by the Particle Swarm Algorithm to make a comparison. The results show that the system optimized by the Tree-Seed Algorithm has good performance for applications that need changing speed and load torque. It also has better performance than the system optimized by the Particle Swarm Optimization algorithm.

Author 1: Hüseyin Oktay ERKOL

Keywords: Permanent magnet synchronous motor; field oriented control; speed controller; tree-seed algorithm; optimization

PDF

Paper 8: Proposal of Automatic Methods for the Reuse of Software Components in a Library

Abstract: The increasing complexity of applications is forcing developers to use reusable components from component markets, and mainly free software components. However, the selected components may only partially satisfy the requirements of users. In this article, we propose an approach for optimizing the selection of software components based on their quality. It consists of: (1) selecting components that satisfy the customer's non-functional needs; (2) calculating the quality score of each of these candidate components; (3) selecting the best component meeting the customer's non-functional needs with constraint-based linear programming. Our aim is to optimize this selection by taking into account the financial cost of the component and the adaptation effort. In the literature, researchers are unanimous that software component reuse reduces the cost of development and maintenance time and also increases the quality of the software. However, the models already developed to evaluate the quality of a component do not simultaneously take into account the financial cost and adaptation effort factors. So, in our research, we established a connection between the financial cost and the adaptation time of the selected component through a linear programming model with constraints. To validate our work, we propose an algorithm that supports the developed theory. The user will then be able to choose the relevant software component for his system from the available components.

Author 1: Koffi Kouakou Ive Arsene
Author 2: Samassi Adama
Author 3: Kimou Kouadio Prosper
Author 4: Brou Konan Marcellin

Keywords: Method development; reuse; software component; quality of component; functional size; functional processes; financial cost; adaptation effort

PDF

Paper 9: Extracting the Features of Modern Web Applications based on Web Engineering Methods

Abstract: With the information revolution, advanced versions of the web have been proposed, from Web 1.0 to Web 4.0. In each version, many web applications have appeared, and in the newer versions modern web applications (MWAs) have been proposed. These applications have specific and distinctive features, and these features create a new challenge for web engineering methods. The problem is that web engineering methods have limitations for MWAs, and the gap is that developers cannot highlight the new features using web engineering methods. In this paper, we extract the features of MWAs based on web engineering methods. We extract web application modules to show the interaction and structure of their features based on the models and elements of web engineering methods. The result of this work helps developers design MWAs through web engineering methods, and it also leads researchers to improve web engineering methods for developing MWA features.

Author 1: Karzan Wakil
Author 2: Dayang N.A. Jawawi

Keywords: Modern web applications; MWA; web engineering; extracting features; web versions

PDF

Paper 10: Development of Home Network Sustainable Interface Tools

Abstract: The home network has become a norm in today's life. Previous studies have shown that home network management is a problem for users who are not in the field of network technology. The existing network management tools are far too difficult for ordinary home network users to understand. Their interfaces are complex and do not address the home user's needs in daily use. This paper presents an interactive network management tool that emphasizes support features for home network users. The tool combines an interactive visual appearance with a persuasive approach that supports sustainability. It is not only understandable to all categories of home network users, but also provides features that help the user achieve usability.

Author 1: Erman Hamid
Author 2: Nazrulazhar Bahaman
Author 3: Azizah Jaafar
Author 4: Ang Mei Choo
Author 5: Akhdiat Abdul Malek

Keywords: Home network; visualization; sustainable interface

PDF

Paper 11: Comparison of Multilevel Wavelet Packet Entropy using Various Entropy Measurement for Lung Sound Classification

Abstract: Wavelet Entropy (WE) is one of the entropy measurement methods based on the discrete wavelet transform (DWT) subbands. Among the developments of WE are wavelet packet entropy (WPE) and wavelet time entropy. WPE has several variations, such as calculating the Shannon entropy on each subband of the wavelet packet decomposition (WPD), which produces 2^N entropy values, or WPE, which yields a single entropy value. One of the WPE improvements is multilevel wavelet packet entropy (MWPE), which yields as many entropy values as the N decomposition levels. In previous research, MWPE was calculated using the Shannon method; hence, in this research the MWPE calculation was done using the Renyi and Tsallis methods. The results showed that MWPE using the Shannon calculation could yield the highest accuracy of 97.98% for the N = 4 decomposition level. On the other hand, MWPE using Renyi entropy yielded the highest accuracy of 93.94%, and the one using Tsallis entropy yielded 57.58% accuracy. Here, the test was performed on five lung sound data classes using a multilayer perceptron as the classifier.

Author 1: Achmad Rizal
Author 2: Risanuri Hidayat
Author 3: Hanung Adi Nugroho

Keywords: Wavelet packet entropy; lung sound; Shannon entropy; Renyi entropy; Tsallis entropy

PDF
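
The entropy measures compared above can be sketched over wavelet packet subband energies as follows. The wavelet, decomposition levels and entropy orders are illustrative assumptions, and a random signal stands in for a lung sound recording.

```python
import numpy as np
import pywt

def subband_energies(x, level, wavelet="db2"):
    # Normalised energy of each wavelet packet subband at the given level.
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, maxlevel=level)
    e = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, "natural")])
    return e / e.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha=2.0):
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q=2.0):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

x = np.random.randn(4096)                       # placeholder for a lung sound signal
# Multilevel WPE: one entropy value per decomposition level (here levels 1..4).
mwpe_shannon = [shannon(subband_energies(x, n)) for n in range(1, 5)]
mwpe_renyi = [renyi(subband_energies(x, n)) for n in range(1, 5)]
mwpe_tsallis = [tsallis(subband_energies(x, n)) for n in range(1, 5)]
```

Such per-level feature vectors would then feed a classifier such as the multilayer perceptron mentioned in the abstract.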

Paper 12: Implementation of Efficient Speech Recognition System on Mobile Device for Hindi and English Language

Abstract: Speech recognition, or speech-to-text conversion, has rapidly gained a lot of interest from large organizations in order to ease the process of human-to-machine communication. Optimization of the speech recognition process is of utmost importance, because real-time users want to perform actions based on the speech input they give, and these actions sometimes shape the lifestyle of the users; thus, the process of speech-to-text conversion should be carried out accurately. This work aims to improve the accuracy of this process with the help of natural language processing and speech analysis. Some existing speech recognition systems from Google, Amazon, and Microsoft tend to have an accuracy of more than 90% in real-time speech detection. This system combines the speech recognition approach used by these systems with language processing to improve the overall accuracy of the process with the help of phonetic analysis. The proposed phonetic model supports multilingual speech recognition, and the observed accuracy of the system is 90% for Hindi and English speech-to-text recognition. The Hindi WordNet database provided by IIT Mumbai was used in this research work for Hindi speech-to-text conversion.

Author 1: Gulbakshee Dharmale
Author 2: Dipti D. Patil
Author 3: V. M. Thakare

Keywords: Automatic Speech Recognition (ASR); Mel Frequency Cepstral Coefficient (MFCC); Vector Quantization (VQ); Gaussian Mixture Model (GMM); Hidden Markov Model (HMM); Receiver Operating Characteristics (ROC)

PDF

Paper 13: Ensuring Privacy Protection in Location-based Services through Integration of Cache and Dummies

Abstract: Location-Based Services (LBS) have recently gained much attention from the research community due to the openness of wireless networks and the daily development of mobile devices. However, using LBS is not risk free. Location privacy protection is a major issue that concerns users. Since users utilize their real location to get the benefits of the LBS, this gives an attacker the chance to track their real location and collect sensitive and personal information about the user. If the attacker is the LBS server itself, privacy issues may reach dangerous levels because all information related to the user's activities is stored and accessible on the LBS server. In this paper, we propose a novel location privacy protection method called the Safe Cycle-Based Approach (SCBA). Specifically, the SCBA ensures location privacy by generating strong dummy locations that are far away from each other and belong to different sub-areas at the same time. This ensures robustness against advanced inference attacks such as location homogeneity attacks and semantic location attacks. To achieve location privacy protection, as well as high performance, we integrate the SCBA approach with a cache. The key performance enhancement is storing the responses of historical queries to answer future ones using a bloom filter-based search technique. Compared to well-known approaches, namely the ReDS, RaDS, and HMC approaches, experimental results showed that the proposed SCBA approach produces better outputs in terms of privacy protection level, robustness against inference attacks, communication cost, cache hit ratio, and response time.

Author 1: Sara Alaradi
Author 2: Nisreen Innab

Keywords: Privacy protection; dummy; cache; safe cycle; location homogeneity attack; semantic location attack

PDF

Paper 14: Improved Industrial Modeling and Harmonic Mitigation of a Grid Connected Steel Plant in Libya

Abstract: Currently, we are living through a new transition towards the fourth phase of industrialization, widely known as Industry 4.0. The backbone of this development presupposes sustainable manufacturing, where the optimal functioning of a factory's components, especially energy rationalization and enhanced power quality, is no longer a privilege but an obligation in order to introduce artificial intelligence (AI), smart metering (SM) and automated decision making (ADM) efficiently. Along the same axis of mitigating power quality issues, this paper first builds a virtual-reality (VR) model of a complex grid-connected steel power plant and then identifies the harmonic sources, caused essentially by the installed nonlinear loads, which manifest as power quality issues and periodic signal distortion, in order to moderate them. Accordingly, it was essential to assess the diverse origins of the harmonic problems and to present the most suitable and economical mitigation techniques. The related voltage and current harmonic flows at the 30 kV level of the General Electricity Company of Libya (GECOL), located in Tripoli, are examined, and their harmful effects on plant components are then investigated. In order to attenuate the distortion, a harmonic analysis has been carried out; appropriate filters have then been sized, designed, simulated and added to the panel. Simulation results are presented and validated using the ETAP industrial software against real measurements.

Author 1: Abeer Oun
Author 2: Ibrahim Benabdallah
Author 3: Adnen Cherif

Keywords: Industry 4.0; distribution systems; THD; harmonic load flow; passive filters

PDF

Paper 15: Multi-Depots Vehicle Routing Problem with Simultaneous Delivery and Pickup and Inventory Restrictions: Formulation and Resolution

Abstract: Reverse logistics can be defined as a set of practices and processes for managing returns from the consumer to the manufacturer, simultaneously with direct flow management. In this context, we have chosen to study an important variant of the Vehicle Routing Problem (VRP), namely the Multi-Depot Vehicle Routing Problem with Simultaneous Delivery and Pickup and Inventory Restrictions (MD-VRPSDP-IR). This problem involves designing routes from multiple depots that simultaneously satisfy delivery and pickup requests from a set of customers, while taking into account depot stock levels. This study proposes a hybrid Genetic Algorithm which incorporates three different procedures: a newly developed one called the K-Nearest Depot heuristic to assign customers to depots, the Sweep algorithm for route construction, and the Farthest Insertion heuristic to improve solutions. Computational results show that our methods outperform previous ones for the MD-VRPSDP.

Author 1: BOUANANE Khaoula
Author 2: BENADADA Youssef
Author 3: BENCHEIKH Ghizlane

Keywords: Reverse logistic; inventory restrictions; VRPSDP; multi-depots version; Genetic Algorithm

PDF
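
Two of the building blocks named above, assigning each customer to its nearest depot and ordering a depot's customers by polar angle (the Sweep rule), can be sketched as follows. The coordinates are made up, and the details of the authors' K-Nearest Depot heuristic and hybrid Genetic Algorithm are not reproduced here.

```python
import math

depots = [(0.0, 0.0), (10.0, 10.0)]                       # hypothetical depot locations
customers = [(1.0, 2.0), (9.0, 8.0), (2.0, -1.0), (11.0, 9.0)]

def nearest_depot(c):
    # Index of the depot closest to customer c (simplest depot-assignment rule).
    return min(range(len(depots)), key=lambda d: math.dist(c, depots[d]))

assignment = {d: [] for d in range(len(depots))}
for c in customers:
    assignment[nearest_depot(c)].append(c)

def sweep_order(depot, custs):
    # Sweep rule: sort customers by polar angle around the depot; routes would
    # then be cut from this ordering according to vehicle capacity.
    return sorted(custs, key=lambda c: math.atan2(c[1] - depot[1], c[0] - depot[0]))

routes = {d: sweep_order(depots[d], cs) for d, cs in assignment.items()}
print(routes)
```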

Paper 16: An Automated Advice Seeking and Filtering System

Abstract: Advice seeking and knowledge exchange over the Internet and social networks have become very common activities. The system proposed in this work aims to assist users in choosing the best possible advice and allows them to exchange advice automatically without knowing each other. The approach used in this work is based on a newly proposed dynamic version of the hidden metric model, where the distance between each pair of users is computed and used to represent the users in a d-dimensional Euclidean space. In addition to the position, a degree is also assigned to each user, which represents his/her popularity, or how much he/she is trusted by the system. The two factors, distance and degree, are used in selecting advice providers. Both the positions of the users and their degrees are adjusted according to the feedback of the users. The proposed feedback algorithm is based on a Bayesian framework and has the goal of obtaining more accurate advice in the future. The system was evaluated and tested using simulation. In the applied experiments, the mean square error was measured for different parameters. All parts of the experiments were performed on a varying number of users (100, 500 and 1000 users), which shows that the system can scale to a large number of users.

Author 1: Reham Alskireen
Author 2: Dr. Said Kerrache
Author 3: Dr. Hafida Benhidour

Keywords: Recommender system; hidden metric; advice; Bayesian framework

PDF

Paper 17: Existing Trends of Digital Watermarking and its Significant Impact on Multimedia Streaming: A Survey

Abstract: Nowadays, digital media has reached the level of a general resource sharing system and has become a convenient way of sharing a lot of information among various individuals. However, these digital data are stored and shared over the Internet, which is largely unsecured and frequently attacked, resulting in massive losses and creating severe issues of copyright protection, ownership protection, authentication, secure communication, etc. In recent years, digital watermarking technology has received extensive attention from users and researchers for content protection and digital data authentication. However, before implementing digital watermarking techniques in practical applications, there are still many problems that need to be solved technically and efficiently. The purpose of this manuscript is to provide a detailed survey of current digital watermarking techniques for all media formats, along with their applications and operational processes. The prime objective of this manuscript is to identify the research problems and the requirements for implementing a robust watermarking technique, after analyzing the progress of watermarking schemes and the current research trends.

Author 1: R. Radha Kumari
Author 2: V. Vijaya Kumar
Author 3: K. Rama Naidu

Keywords: Authentication; copyright-protection; digital information; digital watermark; robustness; security

PDF

Paper 18: A Usability Model for Mobile Applications Generated with a Model-Driven Approach

Abstract: Usability evaluation of mobile applications (referred to as apps) is an emerging research area in the field of Software Engineering. Several research studies have focused their interest on the challenge of usability evaluation in the mobile context. Typically, usability is measured once the mobile app is implemented. At this stage of the development process, it is costly to go back and make the required changes in the design in order to overcome usability problems. Model-Driven Engineering (MDE) has been proven a promising solution for this problem. In such an approach, a model can be built and analyzed early in the design cycle to identify key characteristics like usability. The traceability established between this model and the final application by means of model transformation plays a key role in preserving its usability or even improving it. This paper attempts to review existing usability studies and subsequently proposes a usability model for conducting early usability evaluation of mobile apps generated with an MDE tool.

Author 1: Lassaad Ben Ammar

Keywords: Usability; mobile apps; model-driven engineering

PDF

Paper 19: Analysis of Efficient Cognitive Radio MAC Protocol for Ad Hoc Networks

Abstract: Cognitive Radio (CR) is an emerging technology for exploiting the existing spectrum dynamically. It can intelligently access vacant spectrum frequency bands. Although a number of methodologies have been suggested for improving the performance of CR networks, little attention has been given to efficient usage, management and energy efficiency. In this paper, a modern paradigm pertaining to spectrum allotment and usage, manifested as CR, is introduced as a potential solution to this problem, where the CR (unlicensed) users can opportunistically use the available free licensed spectrum bands in such a way that the degree of interference is restricted to the extent that the primary (licensed) users can allow. In this article, we analyze and compare various protocols; in addition, we evaluate the CREAM MAC, RMC MAC, SWITCH MAC and EECR MAC protocols related to the CR MAC in terms of different parameters such as throughput, data transmission and time efficiency. We conclude with the most efficient protocol, which combines their similar features, named the Proposed Efficient Cognitive Radio MAC (PECR-MAC) protocol.

Author 1: Muhammad Yaseer
Author 2: Haseeb Ur Rehman
Author 3: Amir Usman
Author 4: Muhammad Tayyab Shah

Keywords: Ad Hoc networks; cognitive radio (CR); backup channel; energy efficient protocols; MAC protocol; primary users; secondary users

PDF

Paper 20: Fuzzy Logic Driven Expert System for the Assessment of Software Projects Risk

Abstract: This paper presents an expert risk evaluation system developed on the basis of an up-to-date empirical study that uses real data from a large number of software projects to identify the factors that most affect project success. A software project can be affected by a range of risk factors throughout all phases of the development process. Therefore, it has become necessary to consider risk concerns while developing the software project. Risk assessment and management play a significant role in avoiding the failure of a software project, and can help in mitigating the effect of undesirable events that could affect the project outcomes. In this paper, the researchers have developed a novel expert fuzzy-logic tool that can be used by project decision makers to evaluate the expected risks. The developed tool helps in estimating the risk probability based on the software project’s critical success factors. A user-friendly interface is created to enable project managers to perform general risk evaluation during any stage of the software development process. The proposed tool can be helpful in achieving effective risk control, and therefore improving the overall project outcomes.

Author 1: Mohammad Ahmad Ibraigheeth
Author 2: Syed Abdullah Fadzli

Keywords: Risk assessment; critical success factors; fuzzy expert systems; fuzzy rule-base; risk probability

PDF
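
The kind of fuzzy rule evaluation such a tool relies on can be sketched briefly. The sketch below is a generic Mamdani-style evaluation with triangular membership functions and centroid defuzzification; the single input (an aggregated critical-success-factor score), the rule base and all membership parameters are illustrative assumptions, not the authors' actual fuzzy rule base.

```python
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with tolerant handling of shoulder edges.
    x = np.asarray(x, dtype=float)
    rising = (x - a) / (b - a) if b != a else np.ones_like(x)
    falling = (c - x) / (c - b) if c != b else np.ones_like(x)
    return np.clip(np.minimum(rising, falling), 0.0, 1.0)

def project_risk(csf_score):
    """Map an aggregated critical-success-factor rating in [0, 10] to a risk level in [0, 1]."""
    poor = tri(csf_score, 0, 0, 5)
    fair = tri(csf_score, 2, 5, 8)
    good = tri(csf_score, 5, 10, 10)
    risk = np.linspace(0.0, 1.0, 101)
    # Mamdani rules: poor CSFs -> high risk, fair -> medium risk, good -> low risk.
    agg = np.maximum.reduce([
        np.minimum(poor, tri(risk, 0.6, 1.0, 1.0)),
        np.minimum(fair, tri(risk, 0.2, 0.5, 0.8)),
        np.minimum(good, tri(risk, 0.0, 0.0, 0.4)),
    ])
    return float(np.sum(risk * agg) / np.sum(agg))   # centroid defuzzification

print(round(project_risk(7.5), 3))   # strong success factors -> lower risk probability
```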

Paper 21: Self Adaptable Deployment for Heterogeneous Wireless Sensor Network

Abstract: Wireless Sensor Networks (WSNs) are becoming a crucial component of most fields of engineering. A Heterogeneous WSN (HWSN) is characterized by wireless sensor nodes having link (communication), computation or energy heterogeneity for a specific application. WSN applications are constrained by the availability of power; hence, conserving energy in a sensor network becomes a major challenge. The literature survey shows that node deployment can have a significant impact on energy conservation. Previous works show that self-adaptable nodes can save significant energy compared to other types of deployment. This work uses the concept of self-adaptation of nodes to conserve energy in an HWSN. A deployment strategy driven by some dynamic decision-making capability can boost the overall performance of a WSN. The work presents an analysis of three types of deployment, keeping all nodes fixed, all nodes moving, and only high-energy nodes moving, with respect to throughput, delay and energy consumption. Experimental results show that self-adaptable dynamic deployment gives 10% better throughput and 6% better energy conservation than static deployment strategies.

Author 1: Umesh M. Kulkarni
Author 2: Harish H. Kenchannavar
Author 3: Umakant P. Kulkarni

Keywords: Wireless sensor network (WSN); deployment strategy; self-adaptable

PDF

Paper 22: Document Similarity Detection using K-Means and Cosine Distance

Abstract: A two-year study by the Ministry of Research, Technology and Education in Indonesia presented an evaluation of most universities in Indonesia. The evaluation found peculiarities in various dissertation softcopies of doctoral students that are similar to texts available on the Internet. The suspected plagiarism behavior has a negative effect on both students and faculty members. The main reason behind this behavior is the lack of standardized awareness of plagiarism among faculty members. Therefore, this study proposes a computerized system that is able to detect suspected plagiarism by using the K-means and cosine distance algorithms. The process starts with preprocessing, which includes a novel step of checking against the Indonesian big dictionary, followed by vector space model design and the combined calculation of K-means and cosine distance on 17 documents as test data. The results of this study show an overall detection accuracy of 93.33%.

Author 1: Wendi Usino
Author 2: Anton Satria Prabuwono
Author 3: Khalid Hamed S. Allehaibi
Author 4: Arif Bramantoro
Author 5: Hasniaty A
Author 6: Wahyu Amaldi

Keywords: K-means; cosine distance; cluster; document similarity; document frequency; inverse document frequency; preprocessing; vector space model

PDF
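
The core K-means plus cosine-distance pipeline can be sketched with scikit-learn as follows. The example documents are made up, and the Indonesian-dictionary preprocessing step from the study is not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "metode klasifikasi data dengan algoritma naive bayes",
    "klasifikasi data menggunakan algoritma naive bayes",
    "sistem informasi akademik berbasis web",
    "perancangan sistem informasi akademik",
]

X = TfidfVectorizer().fit_transform(docs)                      # vector space model
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Flag suspiciously similar pairs: high cosine similarity within the same cluster.
sim = cosine_similarity(X)
for i in range(len(docs)):
    for j in range(i + 1, len(docs)):
        if labels[i] == labels[j] and sim[i, j] > 0.5:
            print(f"doc {i} vs doc {j}: cosine similarity {sim[i, j]:.2f}")
```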

Paper 23: Smart City and Smart-Health Framework, Challenges and Opportunities

Abstract: The new age of mobile health is accompanied by the wider implementation of ubiquitous and pervasive mobile communication and computing, which in turn has brought enormous opportunities for organizations and governments to reconsider their healthcare concept. Alongside this, the global process of urbanization poses a daunting challenge and draws expert attention towards cities that must absorb significantly high populations and serve people in a humane and efficient way. The convergence of these two trends led to the evolution of the concept of smart cities combined with mobile healthcare. This article is intended to provide an overview of smart health, understood as context-aware mobile health within smart cities. The purpose of the article is to offer a standpoint on the main fields of research and knowledge involved in the process of establishing this new idea. Furthermore, the article focuses on the major opportunities and challenges implied by s-health and offers common ground for future research.

Author 1: Majed Kamel Al-Azzam
Author 2: Malik Bader Alazzam

Keywords: Smart city; challenges; opportunities; smart health

PDF

Paper 24: Impact of Privacy Issues on Smart City Services in a Model Smart City

Abstract: With recent technological developments, there is a prevalent trend of smart infrastructure deployment with the intention of providing smart services for inhabitants. City governments of the current era are under huge pressure to facilitate their residents by offering state-of-the-art services equipped with modern technology gadgets. To achieve this goal, they have been forced into massive investment in IT infrastructure deployment, and as a result they are collecting huge amounts of data from users with the intention of providing them with better or improved services. These services are very exciting, but on the other hand they also pose a big threat to the privacy of individuals. This paper designs and simulates a smart city model. The model is connected with some mandatory communication devices which also produce data for different sensors. Based on the simulation results and the possible threats of alteration of this data, it suggests solutions for privacy issues, which are to be considered a top priority to ensure the secrecy and privacy of smart city residents.

Author 1: Nasser H. Abosaq

Keywords: IOT; Public-Wi-Fi; Privacy; D2D; D2U; industrial 4.0; 5G; Secrecy; FIDO

PDF

Paper 25: Networking Issues for Security and Privacy in Mobile Health Apps

Abstract: It is highly important to take due care of the personal information that is collected by mobile health applications. There has been a rise in mobile applications, which are applied in almost all sectors, as a result of the high technological advancement globally. The developers of these applications tend to be somewhat lax in maintaining the privacy of the information collected through the applications, and many release insecure apps. The aim of this report is to analyze the status of privacy and security in relation to mobile health. The analysis has been carried out through an academic literature review and a study of the laws which regulate mobile health in the EU and the USA, and concludes by giving recommendations for mobile application developers on how to maintain privacy and security. As a result, certifications and standards are proposed for app developers, together with a guide for researchers and developers.

Author 1: Yasser Mohammad Al-Sharo

Keywords: Wireless networks; security; privacy; mobile; analyses

PDF

Paper 26: A Survey on Techniques to Detect Malicious Activities on Web

Abstract: The World Wide Web is increasingly vulnerable to malicious activities. Spam advertisements, Sybil attacks, rumour propagation, financial fraud, malware dissemination, and SQL injection are some of the malicious activities on the web. Terrorists are using the web as a weapon to propagate false information. Many innocent youths have been trapped by web terrorists. It is very difficult to trace the footprint of malicious activities on the web. Much research is under way to find mechanisms to protect web users and prevent malicious activities. The aim of this survey is to provide a study of recent techniques for detecting malicious activities on the web.

Author 1: Abdul Rahaman Wahab Sait
Author 2: Dr. M. Arunadevi
Author 3: Dr. T. Meyyappan

Keywords: Malware detection; malicious behavior; spam detection; web terrorism; Sql injection

PDF

Paper 27: The Growing Role of Complex Sensor Systems and Algorithmic Pattern Recognition for Vascular Dementia Onset

Abstract: Vascular Dementia is often clinically diagnosed once the effects of the disease are already prevalent in a person’s daily living routines. However, previous research has shown various behavioral and physiological changes linked to the development of Vascular Dementia, with these changes beginning to present earlier than clinical diagnosis is currently possible. In this review, works focused on these early signs of Vascular Dementia are highlighted. Recognizing these changes, however, is difficult. Many computational systems have been proposed for the evaluation of these early signs of Vascular Dementia. The chosen works have largely focused on sensor systems or algorithmic evaluation that can be incorporated into a person’s environment to measure behavioral and physiological metrics. This raw data can then be computationally analyzed to draw conclusions about the patterns of change surrounding the onset of Vascular Dementia. This compilation of works presents a current framework for investigating the various behavioral and physiological metrics, as well as potential avenues for further investigation of sensor system and algorithmic design, with the goal of enabling earlier Vascular Dementia detection.

Author 1: Janna Madden
Author 2: Arshia Khan

Keywords: Vascular dementia; pattern recognition; machine learning; artificial intelligence; algorithmic disease detection; vascular dementia onset

PDF

Paper 28: Graphic User Interface Design Principles for Designing Augmented Reality Applications

Abstract: Reality is a combination of perception, reconstruction, and interaction. Augmented Reality is an advancement layered over ordinary everyday life that includes content-based interfaces, voice-based interfaces, and guide-based or gesture-based interfaces, so designing augmented reality application interfaces is a difficult task for the designer. A user interface should be not only easy to use and easy to learn but also interactive and self-explanatory, with high perceived affordance, perceived usefulness, consistency and discoverability, so that the user can easily recognize and understand the design. For this purpose, many interface design principles such as learnability, affordance, simplicity, memorability, feedback, visibility and flexibility have been introduced, but no existing set of principles identifies the most appropriate ones for designing Augmented Reality application interfaces. The basic goal of introducing design principles for Augmented Reality application interfaces is therefore to match user effort to the computer display (“plot user input onto computer output”) using appropriate interface action symbols (“metaphors”), and to make the application easy to use, easy to understand and easy to discover. In this study, by observing augmented reality systems and interfaces, several well-known design principles related to the GUI (“user-centered design”) are identified, and through them some issues are shown which can be addressed through the design principles. Drawing on multiple studies, our study suggests interface design principles which make designing Augmented Reality application interfaces easier and more helpful for the designer, as these principles make the interface more interactive, learnable and usable. To test our findings, Pokémon Go, an Augmented Reality game, was selected, and all the suggested principles were applied to and tested on its interface. From the results, our study concludes that the identified principles are the most important principles for developing and testing any Augmented Reality application interface.

Author 1: Afshan Ejaz
Author 2: Dr Syed Asim Ali
Author 3: Muhammad Yasir Ejaz
Author 4: Dr Farhan Ahmed Siddiqui

Keywords: GUI; augmented reality; metaphors; affordance; perception; satisfaction; cognitive burden

PDF

Paper 29: The Photometric Stereo Approach and the Visualization of 3D Face Reconstruction

Abstract: 3D morphable models of the human face have enabled a myriad of applications in computer vision, human-computer interaction and security surveillance. However, due to variations in size, the complexity of the training data set and the landmark mapping, real-time representation and the rendering or synthesis of images in three dimensions is limited. In this paper, we extend the photometric stereo approach and provide human face reconstruction in three dimensions. The proposed method consists of two steps. First, it automatically detects the face and segments the iris along with statistical features of the pupil location. Second, it selects a minimum of six features, which are processed together with the iris to generate the 3D face. Compared with existing methods, our approach provides automation, which produces better and more efficient results than manual methods.

Author 1: Muhammad Sajid Khan
Author 2: Zabeeh Ullah
Author 3: Maria Shahid Butt
Author 4: Zohaib Arshad
Author 5: Sobia Yousaf

Keywords: 3D face; photometric stereo; reconstruction; recognition; feature selection

PDF

Paper 30: MINN: A Missing Data Imputation Technique for Analogy-based Effort Estimation

Abstract: The success and failure of a complex software project are strongly associated with accurate estimation of the development effort. Numerous estimation models have been developed, but the most widely used among them is Analogy-Based Estimation (ABE). The ABE model follows human nature, as it estimates a future project's effort by making analogies with past projects' data. Since ABE relies on historical datasets, the quality of the datasets affects the accuracy of estimation. Most software engineering datasets have missing values. Researchers either delete the projects containing missing values or avoid treating the missing values, which reduces ABE performance. In this study, the Numeric Cleansing (NC), K-Nearest Neighbor Imputation (KNNI) and Median Imputation of the Nearest Neighbor (MINN) methods are used to impute the missing values in the Desharnais and DesMiss datasets for ABE. The MINN technique is introduced in this study. A comparison among these imputation methods is performed to identify the most suitable missing data imputation method for ABE. The results suggest that MINN imputes more realistic values in the missing datasets as compared to values imputed through NC and KNNI. It was also found that the imputation treatment helped in better prediction of the software development effort with the ABE model.

Author 1: Muhammad Arif Shah
Author 2: Dayang N. A. Jawawi
Author 3: Mohd Adham Isa
Author 4: Karzan Wakil
Author 5: Muhammad Younas
Author 6: Ahmed Mustafa

Keywords: Analogy-based estimation; effort estimation; missing data imputation; software development

PDF
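
The idea of imputing a missing value with the median of the nearest neighbours can be sketched as follows. The distance measure, the value of k and the small project matrix are illustrative assumptions; this is an illustration of the idea rather than the authors' exact MINN procedure.

```python
import numpy as np

def minn_impute(data, k=3):
    """Fill each missing value with the median of that feature over the k nearest
    projects, with distances computed on the commonly observed features."""
    data = np.asarray(data, dtype=float)
    filled = data.copy()
    for i in range(len(data)):
        missing = np.where(np.isnan(data[i]))[0]
        if missing.size == 0:
            continue
        dists = []
        for other in range(len(data)):
            if other == i:
                continue
            common = ~np.isnan(data[i]) & ~np.isnan(data[other])
            if not common.any():
                continue
            d = np.sqrt(np.mean((data[i, common] - data[other, common]) ** 2))
            dists.append((d, other))
        neighbours = [o for _, o in sorted(dists)[:k]]
        for j in missing:
            vals = data[neighbours, j]
            vals = vals[~np.isnan(vals)]
            if vals.size:
                filled[i, j] = np.median(vals)
    return filled

# Hypothetical effort-estimation features (e.g. team size, duration, function points).
projects = np.array([
    [4, 12, 250], [5, np.nan, 300], [3, 10, np.nan], [6, 14, 340], [4, 11, 260],
])
print(minn_impute(projects, k=2))
```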

Paper 31: Automatic Structured Abstract for Research Papers Supported by Tabular Format using NLP

Abstract: The abstract is an extensive summary of a scientific paper that supports making a quick decision about reading it. The use of a structured abstract is useful for representing the major components of the paper; this, in turn, enhances the extraction of information about the study. Despite the importance of the structured abstract, many computer science research papers do not apply it, which may lead to weak abstracts. This paper aims at applying natural language processing (NLP) techniques and machine learning to conventional abstracts in order to automatically generate structured abstracts formatted using the IMRaD (Introduction, Methods, Results, and Discussion) format, which is considered predominant in medical and scientific writing. The effectiveness of such sentence classification, that is, the capability of a method to produce the expected outcome of classifying unstructured abstracts of computer science research papers into IMRaD sections, depends on both feature selection and the classification algorithm. This can be achieved via the IMRaD Classifier by measuring the similarity of sentences between the structured and the unstructured abstracts of different research papers. After that, the sentences can be classified into one of the IMRaD format tags based on the measured similarity value. Finally, the IMRaD Classifier is evaluated by applying Naïve Bayes (NB) and Support Vector Machine (SVM) classifiers on the same dataset. To conduct this work, we use a dataset containing 250 conventional computer science abstracts from the period 2015 to 2018. This dataset was collected from two main websites: DBLP and the IOS Press content library. In this paper, 200 XML-based files are used for training, and 50 XML-based files are used for testing. Thus, the dataset is 4x250 files, where each file contains a set of sentences that belong to different abstracts but to the same IMRaD section. The experimental results show that Naïve Bayes (NB) can predict better outcomes for each class (Introduction, Methods, Results, Discussion and Conclusion) than Support Vector Machine (SVM). Furthermore, the performance of the classifier depends on an appropriate number of representative features selected from the text.

Author 1: Zainab Almugbel
Author 2: Nahla El Haggar
Author 3: Neda Bugshan

Keywords: Natural language processing (NLP); Naïve Bayes (NB) classifier; SVM

PDF
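
A minimal sketch of the NB-versus-SVM comparison for sentence classification is shown below. The example sentences, their labels and the TF-IDF representation are assumptions for illustration, not the paper's actual feature set or dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Toy training sentences labelled with IMRaD-style sections.
sentences = [
    "this paper addresses the problem of weak abstracts",          # Introduction
    "we apply natural language processing and machine learning",   # Methods
    "the classifier achieved higher accuracy on the test set",     # Results
    "the findings suggest that feature selection matters",         # Discussion
]
labels = ["Introduction", "Methods", "Results", "Discussion"]

for name, clf in [("NB", MultinomialNB()), ("SVM", LinearSVC())]:
    model = make_pipeline(TfidfVectorizer(), clf).fit(sentences, labels)
    print(name, model.predict(["we use a support vector machine on tf-idf features"]))
```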

Paper 32: A Framework to Automate Cloud based Service Attacks Detection and Prevention

Abstract: With the increasing demand for high availability, scalability and cost minimization, the adoption of cloud computing is also increasing. Driven by the demands of data consumers and application customers, service providers and application owners are migrating their applications into the cloud. These migrations of traditional applications, and the deployment of new ones, benefit both consumers and service providers. Consumers get higher availability of the applications, while application owners benefit from cost reduction through optimal scalability and from deploying additional features at the lowest cost, which in turn provides better customer satisfaction. Nevertheless, these migrations and new deployments also attract the attention of hackers and attackers. In the recent past, several attacks have been reported on various popular services such as search engines, storage services, and critical applications ranging from healthcare to defence. The attacks are sometimes limited to data exploration, where the attackers only consume the data, and sometimes the attackers destroy crucial services. The major challenge in detecting these attacks is mostly identifying the nature of the connection request. Moreover, identifying the attacks is not sufficient for securing cloud services; security must be deployed as a service in the applications, or in the data centre, as an automatic and continuous measure. Various research endeavours have shown critical enhancements in the recent past for recognizing security attacks. Nonetheless, these attempts have not provided any solution for preventing the attacks, and the existing methods are not automated and cannot be included in the services. Thus, this work provides a unique automated framework for detecting the application traffic pattern and generating the rule sets for detecting any anomalies in the request types. The major outcome of this work is to identify the attack types and prevent further damage to the cloud services with a minimal computational load. An additional benefit of this work is the preventive measures for popular attack types. The work also demonstrates the ability to detect new types of attacks based on traffic pattern analysis and provides preventive measures for making the cloud computing application hosting industry a safer place.

Author 1: P Ravinder Rao
Author 2: Dr. V.Sucharita

Keywords: Data breach; HoA; insider threat; malware injection; ACS; insecure APIs; DoS; automated attack detection; automated prevention; characteristics based detection

PDF

Paper 33: Smart Book Reader for Visual Impairment Person using IoT Device

Abstract: This paper focuses on the development of a Smart Book Reader that will help blind people, or those who have low vision, to read books without using Braille. The project utilises IoT technology through an IoT device, IoT infrastructure and services. The IoT device used is a Raspberry Pi, which is very energy efficient, requiring only a 5V supply to run. It is also highly portable, being only credit-card sized, and can be carried anywhere. The book reader captures pictures of book pages using a camera and processes the images using Optical Character Recognition software. When the image is recognised, the book reader reads it aloud. Therefore, blind people or those who have low vision can hear the text without needing to read by touch with their fingertips. With this book reader, the user can enjoy both softcopy and hardcopy books by using an online text-to-voice converter with the help of IoT connectivity such as WiFi and 4G services. For hardcopy books, a camera is embedded to capture the page. The motivation for developing this product is to encourage all blind people to read ordinary books; this will help them to gain knowledge from reading without the need to learn Braille.

Author 1: Norharyati binti Harum
Author 2: Nurul Azma Zakaria
Author 3: Nurul A. Emran
Author 4: Zakiah Ayop
Author 5: Syarulnaziah Anawar

Keywords: Internet of Things; Raspberry Pi; image processing; wellness; IR4.0; smart book reader

PDF
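
The capture-recognise-speak loop can be sketched as follows, assuming the pytesseract and gTTS packages and a hypothetical captured image page.jpg. This is an illustration of the idea, not the authors' implementation.

```python
from PIL import Image
import pytesseract
from gtts import gTTS

image = Image.open("page.jpg")                    # photo of a book page from the camera
text = pytesseract.image_to_string(image)         # optical character recognition
if text.strip():
    # Online text-to-speech (requires connectivity, e.g. WiFi or 4G).
    gTTS(text=text, lang="en").save("page.mp3")
    # On the device, the MP3 could then be played back, e.g. with `mpg123 page.mp3`.
```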

Paper 34: Sentiment Analysis of Arabic Jordanian Dialect Tweets

Abstract: Sentiment Analysis (SA) of social media content has become one of the growing areas of research in data mining. SA provides the ability to mine public opinions expressed in a subjective manner in real time. This paper proposes an SA model for Arabic Jordanian dialect tweets. Tweets are annotated into three different classes: positive, negative, and neutral. Support Vector Machines (SVM) and Naïve Bayes (NB) are used as supervised machine learning classification tools. Preprocessing of such tweets for SA is done via cleaning noisy tweets, normalization, tokenization, Named Entity Recognition, removal of stop words, and stemming. The results of the experiments conducted on this model showed encouraging outcomes when an Arabic light stemmer/segmenter is applied to Arabic Jordanian dialect tweets. The results also showed that SVM has better performance than NB on the classification of such tweets.

Author 1: Jalal Omer Atoum
Author 2: Mais Nouman

Keywords: Sentiment analysis; Arabic Jordanian dialect; tweets; machine learning; text mining

PDF

Paper 35: Query Expansion in Information Retrieval using Frequent Pattern (FP) Growth Algorithm for Frequent Itemset Search and Association Rules Mining

Abstract: Documents on the Internet have increased in number exponentially; this has resulted in users having difficulty finding the documents or information they need. Special techniques are needed to retrieve documents that are relevant to user queries. One technique that can be used is Information Retrieval (IR). IR is the process of finding data (generally documents) in the form of text that matches the information needed from a collection of documents stored on a computer. A problem that often appears in IR is incorrect user queries; this is caused by users' limitations in representing their needs in the query. Researchers have proposed various solutions to overcome these limitations, one of which is to use Query Expansion (QE). Various methods that have been applied to QE include Ontology, Latent Semantic Indexing (LSI), Local Co-Occurrence, Relevance Feedback, Concept Based, and WordNet/Synonym Mapping. However, these methods still have limitations, one of which concerns displaying the connection, or relevance, between the occurrences of words or phrases in the document collection. To overcome this limitation, in this study we propose an approach to QE using the FP-Growth algorithm for frequent itemset search and Association Rules (AR). We apply AR to QE to expose the relevance of the occurrence of one word or term to another in the collection of documents, where the resulting terms are used to expand the user's query. The main contribution of this study is the use of association rules with FP-Growth on the document collection to discover co-occurring words, which are then used to expand the user's original query in IR. For the evaluation of QE performance, we use recall, precision, and F-measure. Based on the research that has been done, it can be concluded that the use of AR in QE can improve the relevance of the documents retrieved. This is indicated by the average recall, precision, and F-measure values of 94.44%, 89.98%, and 92.07%. Comparing the IR process without QE to IR using QE, recall increased by 25.65%, precision by 1.93%, and F-measure by 15.78%.

Author 1: Lasmedi Afuan
Author 2: Ahmad Ashari
Author 3: Yohanes Suyanto

Keywords: IR; query expansion; association rules; support; confidence; recall; precision

PDF

Paper 36: Predicting 30-Day Hospital Readmission for Diabetes Patients using Multilayer Perceptron

Abstract: Hospital readmission is considered a key metric for assessing health center performance. Indeed, readmissions have consequences for the patient's health condition and hospital operational efficiency, as well as a cost burden from a wider perspective. Prediction of 30-day readmission for diabetes patients is therefore of prime importance. Existing models are characterized by limited prediction power, generalizability, and pre-processing. For instance, the benchmark LACE (Length of stay, Acuity of admission, Charlson comorbidity index and Emergency visits) index trades prediction performance against ease of use for the end user. This study therefore proposes a comprehensive pre-processing framework to improve model performance while exploring and selecting prominent features for 30-day unplanned readmission among diabetes patients. To deal with readmission prediction, this study also proposes a Multilayer Perceptron (MLP) model on data collected from 130 US hospitals. More specifically, the pre-processing technique includes comprehensive data cleaning, data reduction, and transformation. The Random Forest algorithm for feature selection and the SMOTE algorithm for data balancing are examples of methods used in the proposed pre-processing framework. The proposed combination of data engineering and MLP capabilities was found to outperform existing research when implemented and tested on health center data. The performance of the designed model was particularly balanced across the different metrics of interest, with accuracy and Area Under the Curve (AUC) of 95% and a recall close to the optimum at 99%.
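
A rough sketch of the described combination (Random Forest feature selection, SMOTE balancing, then an MLP scored by AUC and recall) is shown below using scikit-learn and imbalanced-learn; the random arrays stand in for the cleaned 130-hospital dataset, and the layer sizes and feature count are assumptions.

```python
# Hedged sketch of the pre-processing-plus-MLP idea described above.
# X and y are random placeholders for the cleaned diabetes readmission data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, recall_score
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                      # placeholder features
y = (rng.random(500) < 0.15).astype(int)            # imbalanced readmission flag

# 1) feature selection with Random Forest importances (top 10 kept here)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[-10:]
X_sel = X[:, top]

# 2) balance the training split with SMOTE, then train the MLP
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, stratify=y, random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500).fit(X_bal, y_bal)

print("AUC:", roc_auc_score(y_te, mlp.predict_proba(X_te)[:, 1]))
print("recall:", recall_score(y_te, mlp.predict(X_te)))
```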

Author 1: Ti’jay Goudjerkan
Author 2: Manoj Jayabalan

Keywords: Readmission; diabetes; multilayer perceptron; feature engineering

PDF

Paper 37: One-Lead Electrocardiogram for Biometric Authentication using Time Series Analysis and Support Vector Machine

Abstract: In this research, a person identification system using electrocardiogram (ECG) signals as biometrics has been simulated. Ten adults participated as subjects; their ECG signals were recorded using a one-lead ECG machine. A total of 65 raw ECG waves from the 10 subjects were analyzed. The raw signal is then processed using the Hjorth descriptor and Sample Entropy (SampEn) to obtain the signal features. The Support Vector Machine (SVM) algorithm was used as the classifier for subject authentication based on the recorded ECG signal. The results showed that the highest accuracy, 93.8%, was obtained with the Hjorth descriptor. Compared to SampEn, this method is quite promising for implementation, offering good performance with fewer features.
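
The Hjorth descriptor features mentioned above (activity, mobility, complexity) are straightforward to compute; the sketch below pairs them with an SVM as the abstract describes, using synthetic arrays in place of the real one-lead ECG recordings.

```python
# Sketch of Hjorth-descriptor feature extraction for ECG biometrics.
# The signals and subject labels below are synthetic placeholders.
import numpy as np
from sklearn.svm import SVC

def hjorth(x):
    """Return (activity, mobility, complexity) of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

rng = np.random.default_rng(1)
signals = [rng.normal(size=1000) for _ in range(20)]   # placeholder ECG segments
subjects = [i % 10 for i in range(20)]                 # 10 hypothetical subjects

X = np.array([hjorth(s) for s in signals])
clf = SVC(kernel="rbf").fit(X, subjects)
print(clf.predict(X[:3]))
```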

Author 1: Sugondo Hadiyoso
Author 2: Suci Aulia
Author 3: Achmad Rizal

Keywords: ECG; biometric; Hjorth; sample entropy; SVM

PDF

Paper 38: Analysis of Resource Utilization on GPU

Abstract: The problems arising from massive data storage and data analysis can be handled by recent technologies such as cloud computing and parallel computing. MapReduce, MPI, CUDA, OpenMP, and OpenCL are some of the widely available tools and techniques that use a multithreading approach. However, it is a challenging task to use these technologies effectively to handle compute-intensive problems in fields like life science, environment, fluid dynamics, image processing, etc. In this paper, we have used many-core platforms with graphics processing units (GPU) to implement one of the most important and fundamental problems in bioinformatics, sequence alignment. Dynamic and concurrent kernel features offered by the graphics card are used to speed up the performance. With these features, we achieved speedups of around 120X and 55X. We coupled the well-known tiling technique with these features and observed performance improvements of up to 4X and 2X compared to non-tiled execution. The paper also analyses resource parameters and GPU occupancy, and proposes their relationship with the design parameters of the chosen algorithm. These observations have been quantified and the relationship between the parameters is presented. The results of this study can be extended to study similar algorithms in this area.

Author 1: M. R. Pimple
Author 2: S.R. Sathe

Keywords: Dynamic kernel; GPU; Multithreading; occupancy; parallel computing

PDF

Paper 39: Minimizing Load Shedding in Electricity Networks using the Primary, Secondary Control and the Phase Electrical Distance between Generator and Loads

Abstract: This paper proposes a method for determining the location and calculating the minimum amount of load power that needs to be shed in order to recover the frequency back to the allowable range. Based on consideration of the primary control of the turbine governor and the reserve power of the generators for secondary control, the minimum amount of load shedding is calculated to recover the frequency of the power system. The phase electrical distance between the outage generator and the loads is computed and analyzed to prioritize the distribution of the load shedding amount across load bus positions. The nearer a load bus is to the outage generator, the larger the amount of load it sheds, and vice versa. With this technique, a large amount of load shedding can be avoided, saving economic losses and reducing customer service interruption. The case study simulation was verified using the PowerWorld software. The proposed method was tested on the standard IEEE 37-bus, 9-generator power system and its effectiveness was demonstrated.

Author 1: Nghia. T. Le
Author 2: Anh. Huy. Quyen
Author 3: Binh. T. T. Phan
Author 4: An. T. Nguyen
Author 5: Hau. H. Pham

Keywords: Load shedding; primary control; secondary control; phase electrical distance

PDF

Paper 40: Improving Modified Grey Relational Method for Vertical Handover in Heterogeneous Networks

Abstract: With the advent of next-generation wireless network technologies, vertical handover has become indispensable to keep the mobile user always best connected (ABC) in a heterogeneous environment, especially for the significant number of multimedia applications that require good quality of service (QoS). To handle this issue, an improvement of the modified Grey Relational Analysis (MGRA) for selecting the Always-Suitable-Connection (ASC) network is proposed. The Fuzzy Analytic Hierarchy Process (FAHP) method is then used to determine the weights of the criteria. To validate our contribution, the proposed method, called E-MGRA, is applied to obtain a ranking of suitable networks. Finally, a simulation is presented to demonstrate the performance of the developed approach in reducing the number of handovers compared to the classical method.

Author 1: Imane Chattate
Author 2: Mohamed El Khaili
Author 3: Jamila Bakkoury

Keywords: Component; vertical handover; network selection; Quality of Service (QoS); Multi Criteria Decision-Making (MCDM); Grey Relational Analysis (GRA); Fuzzy Analytic Hierarchy Process (FAHP)

PDF

Paper 41: Evaluation of APi Interface Design by Applying Cognitive Walkthrough

Abstract: The usability evaluation of the APi interface design was conducted using the Cognitive Walkthrough method. APi is a mobile game application designed specifically for preschool children of Tabika Kemas Kampung Berawan, Limbang, Sarawak, to teach fire safety education. Existing fire safety games tested on preschool children have a few interaction-style and interface-design issues. A key ingredient in encouraging preschool children to learn basic fire safety skills is providing them with interactive learning as a new learning method. A low-fidelity APi prototype was designed based on the user requirements of the preschool children, focusing on cognitive, psychomotor and behavioural aspects. The Cognitive Walkthrough method applied to the APi interface design involved a small group of professional designers and developers. As a result, the high-fidelity APi prototype interface design was developed for the preschool children.

Author 1: Nur Atiqah Zaini
Author 2: Siti Fadzilah Mat Noor
Author 3: Tengku Siti Meriam Tengku Wook

Keywords: Cognitive Walkthrough; interface design; usability evaluation

PDF

Paper 42: An Adaptive Neural Network State Estimator for Quadrotor Unmanned Air Vehicle

Abstract: An adaptive neural observer design is presented for the nonlinear quadrotor unmanned aerial vehicle (UAV). The proposed observer design is motivated by the practical quadrotor, where the complete dynamical model of the system is unavailable. In this paper, the dynamics of the quadrotor UAV system and its state space model are discussed, and a neural observer design using a backpropagation algorithm is presented. The steady-state error is reduced by the neural network term in the estimator design, and the transient performance of the system is improved. The proposed methodology reduces the number of sensors and the weight of the quadrotor, which results in decreased manufacturing cost. A Lyapunov-based stability analysis is utilized to prove the convergence of the error to a neighborhood of zero. The performance and capabilities of the design procedure are demonstrated by simulation results.

Author 1: Jiang Yuning
Author 2: Muhammad Ahmad Usman Rasool
Author 3: Qian Bo
Author 4: Ghulam Farid
Author 5: Sohaib Tahir Chaudary

Keywords: Neural network observer; quadrotor; nonlinear systems; state estimator

PDF

Paper 43: A Real-Time Street Actions Detection

Abstract: Human action detection in real time is one of the most important and challenging problems in computer vision. Nowadays, CCTV cameras exist everywhere in our lives; however, the content of these cameras is monitored and analyzed by human operators. This paper proposes a real-time human action detection approach which efficiently detects basic and common actions in the street such as stopping, walking, running, group stopping, group walking, and group running. The proposed approach measures the type of object movement based on three techniques: YOLO object detection, the Kalman filter, and homography. Real videos from a CCTV camera and the BEHAVE dataset are used to test the proposed method. The experimental results show that the proposed method is very effective and accurate in detecting basic human actions in the street. The accuracies of the proposed method on the tested videos are 96.9% and 88.4% for the BEHAVE and the created CCTV datasets, respectively. The proposed approach runs in real time at more than 50 fps for the BEHAVE dataset and 32 fps for the created CCTV dataset.
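
The track-then-classify idea (detections fed into a Kalman filter, with motion features such as speed deciding the action label) could look roughly like the OpenCV sketch below; the detector output is hard-coded here in place of YOLO, and the speed thresholds are hypothetical values, not the paper's.

```python
# Hedged sketch of the track-then-classify step described above (OpenCV).
# Centroid detections would come from a detector such as YOLO; here they are
# hard-coded, and the speed thresholds are hypothetical.
import numpy as np
import cv2

kf = cv2.KalmanFilter(4, 2)                      # state: x, y, vx, vy ; measurement: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)
kf.statePost = np.array([[100.], [200.], [0.], [0.]], np.float32)  # start at first detection

detections = [(112, 201), (125, 203), (139, 204), (154, 206)]      # later frame centroids
for cx, cy in detections:
    kf.predict()
    kf.correct(np.array([[cx], [cy]], np.float32))

vx, vy = float(kf.statePost[2]), float(kf.statePost[3])
speed = (vx ** 2 + vy ** 2) ** 0.5               # pixels per frame
action = "running" if speed > 8 else "walking" if speed > 1 else "stopping"
print(action, round(speed, 1))
```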

Author 1: Salah Alghyaline

Keywords: Online human action detection; group behavior analysis; CCTV cameras; computer vision

PDF

Paper 44: A Qualitative Comparison of NoSQL Data Stores

Abstract: Due to the proliferation of big data with large volume, velocity, complexity, and distribution among remote servers, it became obvious that traditional relational databases are unsuitable for meeting the requirements of such data. This led to the emergence of a novel technology among organizations and business enterprises: NoSQL datastores. Today such datastores have become popular alternatives to traditional relational databases, since their schema-less data models can manipulate and handle huge amounts of structured, semi-structured and unstructured data, with high speed and immense distribution. These datastores are of four basic types, and numerous instances have been developed under each type. This implies the need to understand the differences among them and how to select the most suitable one for any given data. Unfortunately, research efforts in the literature either consider differences from a theoretical point of view (without real use cases), or address performance issues such as speed and storage, which is insufficient to give researchers deep insight into the mapping of a given data structure to a given NoSQL datastore type. Hence, this paper provides a qualitative comparison among three popular datastores of different types (Redis, Neo4j, and MongoDB) using a real use case of each type, translated to the others. It thus highlights the inherent differences among them, and hence which data structures each of them suits most.
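
For orientation only (this is not taken from the paper's use cases), the snippet below writes the same small record to the three datastore types compared here using their common Python clients; the hosts, ports and credentials are assumptions, and local servers would need to be running.

```python
# Illustrative sketch: one user record in a key-value, a document and a graph store,
# using redis-py, pymongo and the neo4j driver. Connection details are assumptions.
import redis
from pymongo import MongoClient
from neo4j import GraphDatabase

user = {"name": "alice", "city": "Cairo"}

# key-value store: one flat hash per key
r = redis.Redis(host="localhost", port=6379)
r.hset("user:1", mapping=user)

# document store: a schema-less JSON-like document
mongo = MongoClient("mongodb://localhost:27017")
mongo.testdb.users.insert_one(dict(user))

# graph store: a node (relationships to other nodes omitted here)
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
with driver.session() as session:
    session.run("CREATE (u:User {name: $name, city: $city})", **user)
driver.close()
```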

Author 1: Sarah H. Kamal
Author 2: Hanan H. Elazhary
Author 3: Ehab E. Hassanein

Keywords: Document datastores; graph datastores; key-value datastores; MongoDB; Neo4j; NoSQL datastores; Redis

PDF

Paper 45: JWOLF: Java Free French Wordnet Library

Abstract: WordNet electronic lexical databases have become essential for many computer applications, especially in linguistic research. The Free French WordNet is an XML lexical database for the French language based on the Princeton WordNet for English and other multilingual resources. So far, research on the Free French WordNet has focused on the construction and relevance of lexico-semantic information. However, no effort has been made to facilitate the exploitation of this database in the Java language. In this context, this paper proposes our approach for the development of a new Java API based on the Java Architecture for XML Binding (JAXB). This Java API makes it easier for developers to exploit and use the Free French WordNet to create applications for natural language processing. To assess the usefulness of our API, its performance has been evaluated in the context of a browser that we developed to extract the semantic and lexical relations connecting synsets contained in this database, such as the hypernymy tree, the hyponymy tree, synonyms, etc. The results showed that our API fully meets the needs of programmatic exploitation, exploration and consultation of this database in a Java application.

Author 1: Morad HAJJI
Author 2: Mohammed QBADOU
Author 3: Khalifa MANSOURI

Keywords: JAVA; API; WordNet; WOLF; JAXB; natural language processing

PDF

Paper 46: Flood Analysis in Peru using Satellite Image: The Summer 2017 Case

Abstract: At the beginning of 2017, different regions of Peru suffered from heavy rains, mainly due to the 'El Niño' and 'La Niña' phenomena. As a result of these massive storms, several cities were affected by overflows and landslides; Chosica and Piura were the most affected. Satellite images have many applications, one of which is aiding better management of natural disasters (post-disaster management). In this sense, the present work proposes the use of radar satellite images from the Sentinel constellation to analyze the areas most affected by floods in the cities of Chosica and Piura. The applied methodology is to analyse and compare two images (one before and one after the disaster) to identify the affected areas based on the differences between them. The analysis process includes radiometric calibration, speckle filtering, terrain correction, histogram plotting, and image binarization. The results show maps of the analysed cities and identify a significant number of flooded areas according to satellite images from March 2017. Using the resulting maps, authorities can make better decisions. The satellite images used were from the Sentinel-1 satellite belonging to the European Union.
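
The core change-detection step (compare the pre- and post-event images and binarize the difference) can be sketched with plain numpy as follows; the simulated backscatter arrays and the threshold rule are placeholders for the calibrated, speckle-filtered Sentinel-1 scenes and the histogram-based binarization used in the paper.

```python
# Minimal sketch of before/after change detection and binarization (numpy only).
# Real Sentinel-1 scenes would be loaded with a raster library such as rasterio.
import numpy as np

rng = np.random.default_rng(0)
before = rng.gamma(shape=4.0, scale=0.05, size=(256, 256))   # placeholder backscatter
after = before.copy()
after[100:160, 80:200] *= 0.2          # simulated flooded patch (water backscatters less)

log_ratio = np.log10(after / before)   # difference image between the two dates
threshold = log_ratio.mean() - 2 * log_ratio.std()   # simple histogram-based cut
flood_mask = log_ratio < threshold     # binarized map of likely flooded pixels

print("flooded fraction:", flood_mask.mean())
```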

Author 1: Avid Roman-Gonzalez
Author 2: Brian A. Meneses-Claudio
Author 3: Natalia I. Vargas-Cuentas

Keywords: Overflow; landslide; Chosica; Piura; satellite image processing; Sentinel 1

PDF

Paper 47: Application of Sentiment Lexicons on Movies Transcripts to Detect Violence in Videos

Abstract: In the modern era of technological development and the emergence of Web 2.0 applications related to social media, the dissemination of opinions and feelings and participation in discussions on various issues have become very easy, which has led to a boom in text mining and natural language processing research. YouTube is one of the most popular social sites for video sharing. It may contain different types of unwanted content such as violence, which is the cause of many social problems such as aggression and bullying among children at home, in school and in public places. This work reports the performance of two different sentiment lexicons when applied to video transcripts to detect violence in YouTube videos. Automating the process of detecting violence in videos can be helpful for censor boards, which can use the technology to restrict violent videos to certain age groups or block an entire video regardless of age. The models were built using existing sentiment lexicons. The dataset consists of 100 English video transcripts collected from the web and annotated manually as violent or non-violent. Various experiments were performed on the dataset using English SentiWordNet (ESWN) and the Vader package with different text preprocessing settings. The Vader package outperformed ESWN, providing 75% accuracy. ESWN results for all-POS tagging, with 66% accuracy, were better than its results for adjective-only POS tagging, with 58% accuracy.
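
Lexicon-based scoring with the Vader package, one of the two lexicons evaluated above, can be sketched as below; the transcript string and the compound-score cut-off used to flag violence are illustrative assumptions, not the paper's settings.

```python
# Sketch of lexicon-based scoring of a transcript with the VADER package.
# The decision threshold on the compound score is a hypothetical choice.
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
transcript = "He grabbed the knife and threatened to hurt everyone in the room."

scores = analyzer.polarity_scores(transcript)      # neg / neu / pos / compound
is_violent = scores["compound"] <= -0.4            # hypothetical decision rule
print(scores, "violent" if is_violent else "non-violent")
```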

Author 1: Badriya Murdhi Alenzi
Author 2: Muhammad Badruddin Khan

Keywords: Sentiment lexicons; sentiment analysis; video transcript; part-of-speech tagging; English SentiWordNet; Vader Package; violence detection

PDF

Paper 48: A Study on Sentiment Analysis Techniques of Twitter Data

Abstract: The entire world is transforming quickly under the present innovations. The Internet has become a basic requirement for everybody, with the Web being utilized in every field. With the rapid increase in social network applications, people are using these platforms to voice their opinions on daily issues. Gathering and analyzing people's reactions toward buying a product, public services, and so on are vital. Sentiment analysis (or opinion mining) is a common language processing task that aims to discover the sentiments behind opinions in texts on varying subjects. In recent years, researchers in the field of sentiment analysis have been concerned with analyzing opinions on different topics such as movies, commercial products, and daily societal issues. Twitter is an enormously popular microblog on which users voice their opinions. Opinion analysis of Twitter data is a field that has been given much attention over the last decade and involves dissecting "tweets" (comments) and the content of these expressions. As such, this paper explores the various sentiment analysis techniques applied to Twitter data and their outcomes.

Author 1: Abdullah Alsaeedi
Author 2: Mohammad Zubair Khan

Keywords: Twitter; sentiment; Web data; text mining; SVM; Bayesian algorithm; hybrid; ensembles

PDF

Paper 49: Optimization and Deployment of Femtocell: Operator’s Perspectives

Abstract: This study examines the deployment issues of femtocells, which require satisfying users' expectations of the available bandwidth. Femtocells are small base stations installed in homes to improve the coverage and capacity of cellular networks. Femtocells are connected to the network over traditional DSL or FTTH (fiber to the home). Optimization of the cellular network is required for efficient utilization of the available bandwidth and resources. In this paper, we present deployment issues, optimizations of femtocells, operator-perspective survey results, and a service level agreement (SLA) between cellular operators, which satisfies users' needs and supports the deployment of femtocell networks.

Author 1: Javed Iqbal
Author 2: Zuhaibuddin Bhutto
Author 3: Zahid latif
Author 4: M. Zahid Tunio
Author 5: Ramesh Kumar
Author 6: Murtaza Hussain Shaikh
Author 7: Muhammad Nawaz

Keywords: Femtocell; deployment; optimization; service level agreement; fixed mobile convergence; cellular networks

PDF

Paper 50: Breast Cancer Classification using Global Discriminate Features in Mammographic Images

Abstract: Breast cancer has become a rapidly prevailing disease among women all over the world. In terms of mortality, it is considered the second leading cause of death. The risk of death can be reduced by early-stage detection followed by a suitable treatment procedure. Contemporary literature shows that mammographic imaging is widely used for early detection of breast cancer. In this paper, we propose an efficient Computer Aided Diagnostic (CAD) system for the detection of breast cancer using mammography images. The CAD system extracts highly discriminative features at the global level to represent the target categories in two sets: all 20 extracted features and the top 7 ranked features among them. Texture characteristics using co-occurrence matrices are calculated via a single offset vector. A multilayer perceptron neural network with optimized architecture is fed with the individual feature sets and results are produced. The data is divided as 60%, 20%, and 20% for training, cross-validation, and testing, respectively. Robust results are achieved and presented after rotating the data up to five times, showing higher than 99% accuracy for both target categories, and hence outperforming existing solutions.
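
A minimal sketch of the texture-feature extraction with a single co-occurrence offset, feeding a multilayer perceptron, is given below using scikit-image and scikit-learn; the random images, labels and the four properties computed are placeholders for the paper's mammograms and its set of 20 features.

```python
# Sketch of single-offset co-occurrence texture features plus an MLP classifier.
# Random uint8 images stand in for real mammogram regions of interest.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older skimage
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def texture_features(img):
    glcm = graycomatrix(img, distances=[1], angles=[0],   # single offset vector
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy", "correlation")]

images = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(10)]
labels = [0, 1] * 5                                      # benign / malignant placeholders

X = np.array([texture_features(im) for im in images])
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(X, labels)
print(mlp.predict(X[:2]))
```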

Author 1: Nadeem Tariq
Author 2: Beenish Abid
Author 3: Khawaja Ali Qadeer
Author 4: Imran Hashim
Author 5: Zulfiqar Ali
Author 6: Ikramullah Khosa

Keywords: Breast cancer; mammography; pattern recognition; classification

PDF

Paper 51: Cervical Cancer Prediction through Different Screening Methods using Data Mining

Abstract: Cervical cancer remains an important cause of death worldwide because effective access to cervical screening methods is a big challenge. Data mining techniques, including decision tree algorithms, are used in biomedical research for predictive analysis. The imbalanced dataset was obtained from the dataset archive of the University of California, Irvine. The Synthetic Minority Oversampling Technique (SMOTE) was used to balance the dataset, increasing the number of instances. The dataset consists of patient age, number of pregnancies, contraceptive usage, smoking patterns and chronological records of sexually transmitted diseases (STDs). The Microsoft Azure machine learning tool was used for simulation of results. This paper mainly focuses on cervical cancer prediction through different screening methods using data mining techniques, namely boosted decision tree, decision forest and decision jungle algorithms, with performance evaluation based on the AUROC (Area Under the Receiver Operating Characteristic) curve, accuracy, specificity and sensitivity. The 10-fold cross-validation method was utilized to validate the results, and the boosted decision tree gave the best results. The boosted decision tree provided very high prediction with 0.978 AUROC when the Hinselmann screening method was used. The results obtained by the other classifiers were significantly worse than the boosted decision tree.

Author 1: Talha Mahboob Alam
Author 2: Muhammad Milhan Afzal Khan
Author 3: Muhammad Atif Iqbal
Author 4: Abdul Wahab
Author 5: Mubbashar Mushtaq

Keywords: Boosted decision tree; cervical cancer; data mining; decision trees; decision forest; decision jungle; screening methods

PDF

Paper 52: Active and Reactive Power Control of Wind Turbine based on Doubly Fed Induction Generator using Adaptive Sliding Mode Approach

Abstract: In this work, a robust adaptive sliding mode controller (ASMC) is proposed to improve the dynamic performance of a doubly fed induction generator (DFIG) based wind system under variable wind speed conditions. First, the dynamic modeling of the main components of the system is performed. Thereafter, the ASMC is designed to control the active and reactive powers of the machine stator. The structure of these controllers is improved by adding two integral terms. Their sliding gains are determined using the Lyapunov stability theorem so that they are automatically adjusted to reject external disturbances. A Maximum Power Point Tracking (MPPT) strategy is also applied to enhance the power system efficiency. Then, a comparison study with Field Oriented Control (FOC) based on conventional PI control is conducted to assess the robustness of this technique under DFIG parameter variations. Finally, a computer simulation is carried out in the MATLAB/SIMULINK environment using a 2 MW wind system model. Satisfactory performance of the proposed strategy is clearly confirmed under variable operating conditions.

Author 1: Othmane Zamzoum
Author 2: Youness El Mourabit
Author 3: Mustapha Errouha
Author 4: Aziz Derouich
Author 5: Abdelaziz El Ghzizal

Keywords: Wind turbine; DFIG; OP-MPPT; ASMC; adaptive sliding gains

PDF

Paper 53: Ontological Model to Predict user Mobility

Abstract: With the remarkable technological evolution of mobile devices, the use of computing resources has become possible at any time, independent of the geographical position of the user. This phenomenon has various names such as omnipresent diffuse computing, pervasive computing, or ubiquitous systems. This new form of computing allows users to access shared and ubiquitous services focused on their needs, and it is based on context prediction, especially the prediction of the user's location. This paper presents a new approach for predicting a user's next probable location through an ontological model based on the pattern technique. This is carried out using an ontological model that comprises different user behaviors and presents details about the environment where the user is located. The results, after testing on real data, show that the presented ontological model was able to achieve 85% future location-prediction accuracy (in the case of no similar patterns). Future work will focus on the integration of a Bayesian network to improve the results. This approach will be implemented in smart homes or smart cities to reduce energy consumption.

Author 1: Atef Zaguia
Author 2: Roobaea Alroobaea

Keywords: Context prediction; pervasive system; context-aware system; pattern; ontology; ontological model

PDF

Paper 54: Modelling, Command and Treatment of a PV Pumping System Installed in Tunisia

Abstract: This paper studies the modelling, command and optimization of a photovoltaic (PV) pumping system using efficient command-law strategies. The system is formed by a PV generator, a DC-DC converter with a maximum power point tracking (MPPT) command, a DC-AC converter with a V/f command law, and a submersible motor-pump. The first part of this paper presents the models obtained for the various components of the PV pumping system. Dynamic commands composed of V/f and MPPT laws are designed around the converters. The MPPT command ensures the power adaptation between the PV generator and the load, whereas the V/f command ensures PWM control of the asynchronous motor and a sinusoidal output signal. Important simulation results for the PV pumping system under the MATLAB/SIMULINK environment are presented. In the second part of this paper, experimental results of a PV pumping system installed in Tunisia are developed. Those results are used to validate the simulation model and to test the performance of the command approach.
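
The abstract does not state which MPPT algorithm is used, so the sketch below only illustrates the general power-adaptation idea with the classic perturb-and-observe scheme on a toy PV curve; the curve, step size and starting point are made-up assumptions.

```python
# Minimal perturb-and-observe MPPT sketch (illustrative only; the paper's own
# MPPT law is not specified in the abstract). Perturb the operating voltage and
# keep moving in the direction that increases the PV power.
def pv_power(v):
    """Toy PV power curve with a maximum near 30 V (placeholder for the real array)."""
    return max(0.0, v * (8.0 - 0.003 * v ** 2))

def perturb_and_observe(v=20.0, step=0.5, iterations=100):
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:            # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"operating point ~{v_mpp:.1f} V, {p_mpp:.0f} W")
```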

Author 1: Nejib Hamrouni
Author 2: Sami Younsi
Author 3: Moncef Jraidi

Keywords: Stand-alone PV systems; PV pumping; modelling; Louata pumping system

PDF

Paper 55: Unique Analytical Modelling of Secure Communication in Wireless Sensor Network to Resist Maximum Threats

Abstract: Security problems in Wireless Sensor Networks (WSN) are still open-ended problems. Qualitative evaluation of the existing approaches to security in WSN shows the adoption of either complex cryptography or attack-specific solutions. As WSN is an integral part of the upcoming Internet-of-Things (IoT), the attack scenario becomes more complicated owing to the integration of two different forms of networks, and so it is for the attackers. Therefore, this paper introduces a novel secure communication technique that considers time, energy, and the traffic environment as prominent constraints in the security modeling. The proposed solution, designed using an analytical methodology, has the unique capability to resist any form of illegitimate network-participation query and yet maintain a superior communication service. The simulated outcome shows that the proposed system offers reduced end-to-end delay and the highest energy retention compared to other existing security approaches.

Author 1: Manjunath B. E
Author 2: Dr. P.V. Rao

Keywords: Encryption; energy; secure communication; threats; traffic environment; wireless sensor network

PDF

Paper 56: IoT Technological Development: Prospect and Implication for Cyberstability

Abstract: Failure to address the risks posed by future technological development could cause devastating damage to public trust in the technologies. Therefore, ascendant technologies such as artificial intelligence are the key components for providing solutions to new cybersecurity threats and strengthening the capabilities of future technological developments. In effect, the ability of these technologies to prevent and withstand a cyber-attack could become the new deterrence. This paper identifies gaps to guide the government, industry, and the research community in pursuing Internet of Things (IoT) technological development that may be in need of improvement. The contribution of this paper is as follows: first, a roadmap that outlines the security requirements and concerns of future technology and the significance of IoT technology in addressing those concerns; second, an assessment that illustrates the expected and unexpected impact of future technology adoption and its significant geopolitical implications for potentially impacted areas such as regulatory, legal, political, military, and intelligence.

Author 1: Syarulnaziah Anawar
Author 2: Nurul Azma Zakaria
Author 3: Mohd Zaki Masu’d
Author 4: Zulkiflee Muslim
Author 5: Norharyati Harum
Author 6: Rabiah Ahmad

Keywords: Internet of things; cybersecurity; geopolitical; artificial intelligence; technology adoption

PDF

Paper 57: Using Academy Awards to Predict Success of Bollywood Movies using Machine Learning Algorithms

Abstract: Motion picture production has always been a risky and pricey venture. Bollywood alone released approximately 120 movies in 2017. It is disappointing that only 8% of the movies made it to the box office, while the remaining 92% failed to return the total cost of production. Studies have explored several determinants that make a motion picture successful at the box office for Hollywood movies, including academy awards. However, the same cannot be said for Bollywood movies, as significantly less research has been conducted to predict their success. Research also shows no evidence of using academy awards to predict a Bollywood movie's success. This paper investigates the possibility: does an academy award such as ZeeCine or IIFA, previously won by an actor playing an important role in a movie, impact its success or not? To measure the importance of these academy awards to a movie's success, a possible revenue for the movie is predicted using the academy award information and categorizing the movie into different revenue-range classes. We collected data from multiple sources such as Wikipedia, IMDB and BoxOfficeIndia. Various machine learning algorithms such as Decision Tree, Random Forest, Artificial Neural Networks, Naïve Bayes and Bayesian Networks are used for this purpose. Experiments and their results show that academy awards slightly increase the accuracy, making an academy award a non-dominant ingredient in predicting a movie's box office success.

Author 1: Salman Masih
Author 2: Imran Ihsan

Keywords: Machine learning; supervised learning; classification

PDF

Paper 58: A Novel Scheme for Address Assignment in Wireless Sensor Networks

Abstract: Assigning network addresses to nodes in a wireless sensor network is a crucial task that has implications for the functionality, scalability, and performance of the network. Since sensor nodes generally have scarce resources, the address assignment scheme must be efficient in terms of communication and storage. Most addressing schemes reported in the literature or employed in standard specifications have weak aspects. In this paper, a distributed addressing scheme is proposed that first organizes the raw address space into a regular structure and then maps it into a logical tree structure that is subsequently used to assign addresses in a distributed but conflict-free manner. As an additional benefit, this approach allows the underlying tree structure to be used as a default routing mechanism in the network, thus avoiding costly route discovery mechanisms.

Author 1: Ghulam Bhatti

Keywords: Wireless sensor networks; address assignment; logical network topology; routing; address conflict; IEEE 802.15.4; address space; ZigBee

PDF

Paper 59: Customer Value Proposition for E-Commerce: A Case Study Approach

Abstract: E-commerce tools have become a human need everywhere and are important not only to customers but also to industry players. The intention to use e-commerce tools among practitioners, especially in the Malaysian retail sector, is not widespread, as many businesses still choose to use expensive traditional marketing. This research applies academic models and frameworks to a real-life situation to develop a value proposition in the practical world, considering 11Street as the company under study and comparing it with Lazada as a leading competitor in the market. The objectives include identifying customers' perception of value for e-commerce businesses, followed by a critical evaluation of the existing value proposition of 11Street against Lazada to identify the gap, and finally proposing a new value proposition for 11Street. This paper first identifies the customer-perceived value of e-commerce, followed by a critical review of the existing value proposition of 11Street, comparing and contrasting it with the leading player Lazada. By the end of this research, a new consumer value proposition for 11Street is proposed for consideration, matching the Malaysian consumers' value criteria.

Author 1: Nurhizam Safie Mohd Satar
Author 2: Omkar Dastane
Author 3: Muhamad Yusnorizam Ma’arif

Keywords: Online consumer; perceived value; e-commerce; value proposition

PDF

Paper 60: Forensic Analysis of Docker Swarm Cluster using Grr Rapid Response Framework

Abstract: Attacks on the Internet do not only happen to web applications running natively on a web server under an operating system, but also to web applications running inside containers. Currently popular container engines such as Docker are not always secure from Internet attacks, which can disable servers through DoS/DDoS. Therefore, to improve the performance of the server running the web application and to provide application logs, DevOps engineers build an advanced method by transforming the system into a cluster of computers. Currently this method can be easily implemented using Docker Swarm. This research has successfully investigated digital evidence in the log files of a containerized web application running on a cluster system built with Docker Swarm. The investigation was carried out using the Grr Rapid Response (GRR) framework.

Author 1: Sunardi
Author 2: Imam Riadi
Author 3: Andi Sugandi

Keywords: Forensics; Network; Docker Swarm; Grr Rapid Response

PDF

Paper 61: Hypercube Graph Decomposition for Boolean Simplification: An Optimization of Business Process Verification

Abstract: This paper deals with the optimization of business process (BP) verification by simplifying their equivalent algebraic expressions. Current approaches to business process verification use formal methods such as automated theorem proving and model checking to verify the accuracy of the business process design. Those processes are abstracted to mathematical models in order to make the verification task possible. However, the structure of those mathematical models is usually a Boolean expression of the business process variables and gateways, thus leading to a combinatorial explosion when the number of literals is above a certain threshold. This work aims at optimizing the verification task by managing the problem size. A novel algorithm for Boolean simplification is proposed. It uses hypercube graph decomposition to find the minimal equivalent formula of a business process model given in its disjunctive normal form (DNF). Moreover, the optimization method is fully automated and can be applied to any business process having the same formula, due to the independence of the Boolean simplification rules from the studied processes. This new approach has been numerically validated by comparing its performance against the state-of-the-art Quine-McCluskey (QM) method through the optimization of several processes with various types of branching.
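
The paper's hypercube-decomposition algorithm itself is not reproduced here; as a point of reference for the kind of DNF reduction being benchmarked, the sketch below performs one merging round in the Quine-McCluskey style used as the baseline, with implicants written as strings over {'0', '1', '-'}.

```python
# Sketch of one round of implicant merging in the Quine-McCluskey style
# (baseline method named above); not the paper's hypercube decomposition.
from itertools import combinations

def merge(a, b):
    """Combine two implicants differing in exactly one specified bit, else None."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) == 1 and a[diff[0]] != '-' and b[diff[0]] != '-':
        return a[:diff[0]] + '-' + a[diff[0] + 1:]
    return None

def reduce_once(implicants):
    merged, used = set(), set()
    for a, b in combinations(implicants, 2):
        m = merge(a, b)
        if m is not None:
            merged.add(m)
            used.update((a, b))
    return merged | (set(implicants) - used)

# DNF of (x AND y) OR (x AND NOT y) OR (NOT x AND y) over variables (x, y, z = don't care)
terms = {"11-", "10-", "01-"}
print(reduce_once(terms))     # {'1--', '-1-'}, i.e. x OR y (set order may vary)
```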

Author 1: Mohamed NAOUM
Author 2: Outman EL HICHAMI
Author 3: Mohammed AL ACHHAB
Author 4: Badr eddine EL MOHAJIR

Keywords: Business process verification; minimal disjunctive normal form; Boolean reduction; hypercube graph; Karnaugh map; Quine-McCluskey

PDF

Paper 62: Service-Oriented Context-Aware Messaging System

Abstract: In service-oriented computing, location or spatial models are required to model the domain environment whenever location or spatial relationships are utilised by users and/or services. This research presents an ontology-based methodology for context-aware messaging services. There are five main contributions of this research. First, the research provides a service-oriented methodology for modelling and building context-aware messaging systems based on ontological principles. Second, it describes a method that assists in understanding the domain's spatial environment. Third, it proposes the generic Mona-ServOnt core service ontology, which offers context-aware reasoning for the capture and use of context. Mona-ServOnt is able to support the deployment of context-aware messaging services in both indoor and outdoor environments. Fourth, a novel generic architecture that captures the requirements for context-aware messaging services is given. Fifth, the generic messaging protocols that describe the exchange of messages within context-aware messaging services are modelled. A few experiments were completed to measure the performance of the peer-to-peer services using an actual smartphone with Bluetooth capability. In addition, the methodology's main steps have been validated individually in various context-aware messaging domains. It has been evaluated using competency questions that gauge the scope of the proposed ontology. Furthermore, the generic architecture and messaging protocols have been verified in the construction of each domain.

Author 1: Alaa Omran Almagrabi
Author 2: Arif Bramantoro

Keywords: Context-Awareness; messaging service; service ontology; semantic web service

PDF

Paper 63: Browsing Behaviour Analysis using Data Mining

Abstract: Nowadays most of our time is spent online using some form of digital technology such as search engines, news portals, or social media websites. Our online presence keeps us engaged most of the time and leads us to become oblivious to our important work, resulting in a form of procrastination that decreases our productivity significantly. Some desktop and mobile applications have recently emerged to counter the problem by introducing various means of self-tracking to reduce wasted time and encourage productive activities. However, these systems suffer several shortcomings, being static or providing a limited view of actions along one aspect only. To promote self-awareness that helps bring positive changes in an individual's performance, there is a need to present the data in more persuasive ways, making it interactive and presenting the same data in different ways using both temporal and categorical dimensions. We describe a framework that collects and processes browsing data and creates a user behavior model to extract valuable and interesting temporal and categorical patterns regarding a user's online behavior and interests. To discover valuable behavior patterns from an individual's browsing data, different web usage mining techniques have been used. Finally, we demonstrate interactive visualizations for the analysis and monitoring of web browsing behavior patterns with the goal of providing the individual with a detailed understanding of his/her behavior. We also present a small-scale study involving university students, which demonstrates the importance of our work.

Author 1: Hamid Mukhtar
Author 2: Farhana Seemi
Author 3: Hania Aslam
Author 4: Sana Khattak

Keywords: Pattern discovery; visualization; behavior modeling; web usage mining; browsing

PDF

Paper 64: Design and Analysis of DNA Encryption and Decryption Technique based on Asymmetric Cryptography System

Abstract: The security of sensitive information during transmission over public channels is one of the critical issues in the digital society. DNA-based cryptography is a new paradigm in the cryptography field that is used to protect data during transmission. In this paper we introduce an asymmetric DNA cryptography technique for encrypting and decrypting plaintexts. This technique is based on the concepts of data dependency, dynamic encoding and an asymmetric cryptosystem (i.e., the RSA algorithm). The asymmetric cryptosystem is used solely to initiate the encryption and decryption processes, which are completely conducted using DNA computing. The basic idea is to create a dynamic DNA table based on the plaintext, using multi-level security, data dependency and the generation of 14 dynamic round keys. The proposed technique is implemented on the Java platform and its efficiency is examined in terms of the avalanche property. The evaluation process shows that the proposed technique outperforms the RSA algorithm in terms of the avalanche property.
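
The paper's dynamic, plaintext-dependent DNA table and 14 round keys are not reproduced here; the tiny sketch below only shows the basic binary-to-nucleotide encoding that DNA cryptography builds on, using a fixed 2-bit mapping as an illustrative simplification.

```python
# Tiny sketch of binary-to-nucleotide encoding. The paper uses a *dynamic*,
# plaintext-dependent table plus 14 round keys; this fixed mapping is only
# an illustrative simplification of the underlying representation.
ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {v: k for k, v in ENCODE.items()}

def to_dna(text):
    bits = "".join(f"{b:08b}" for b in text.encode("utf-8"))
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand):
    bits = "".join(DECODE[n] for n in strand)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

strand = to_dna("hi")
print(strand, from_dna(strand))   # prints: CGGACGGC hi
```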

Author 1: Hassan Al-Mahdi
Author 2: Meshrif Alruily
Author 3: Osama R.Shahin
Author 4: Khalid Alkhaldi

Keywords: DNA cryptography; asymmetric encryption; block cipher; data dependency; dynamic encoding

PDF

Paper 65: Towards a Fine-Grained Access Control Mechanism for Privacy Protection and Policy Conflict Resolution

Abstract: Access control is a security technique that specifies access rights to resources in a computing environment. As information systems nowadays become more complex, it plays an important role in authenticating and authorizing users and preventing attackers from targeting sensitive information. However, privacy protection has not been fully investigated so far. While many studies have acknowledged this issue, recent studies have not provided a fine-grained access control system for data privacy protection. As the data set becomes larger, we have to confront more privacy challenges. For example, the access control mechanism must be able to guarantee fine-grained access control and privacy protection and resolve conflicts and redundancies between rules of the same policy or between different policies. In this paper, we propose a comprehensive framework for enforcing attribute-based security policies stored in JSON documents, together with data privacy protection, and incorporate a policy structure based on the prioritization of functions to resolve conflicts at a fine-grained level, called the "Privacy aware access control model for policy conflict resolution". We also use Polish notation for modeling conditional expressions, which are combinations of subject, action, resource, and environment attributes, so that privacy policies are flexible, dynamic and fine-grained. Experiments are carried out on two aspects: (i) illustrating the relationship between the processing time for an access decision and the complexity of policies; (ii) illustrating the relationship between the processing time for the traditional approaches (single policy, multi-policy without priority) and our approach (multi-policy with priority). Experimental results show that the evaluation performance satisfies the privacy requirements defined by the user.

Author 1: Ha Xuan Son
Author 2: En Chen

Keywords: ABAC; privacy; JSON; policy conflict resolving; document store; fine-grained access control

PDF

Paper 66: Effect of Routing Protocols and Layer 2 Mediums on Bandwidth Utilization and Latency

Abstract: Computer networks (CNs) are progressing as an emerging field in information and communication technology (ICT). Various computer network problems depend on network performance, specifically bandwidth utilization and network latency. Routing protocols in particular play a vital role in managing network resources and network performance, but on the other hand they can have an adverse effect on network performance. Network routing protocols, bandwidth and the latency rate of any computer network are tightly bound to each other with respect to network performance. This research analyzes the relationship between the performance of different protocols and their effect on bandwidth utilization and network latency rate over layer 2 mediums. After analyzing the relationship between these parameters, suggestions are made for enhancing network performance over layer 2 mediums.

Author 1: Ghulam Mujtaba
Author 2: Babar Saeed
Author 3: Furhan Ashraf
Author 4: Fiaz Waheed

Keywords: Routing protocols; layer2 technologies; FDDI; latency rate; bandwidth utilization

PDF

Paper 67: A Survey on Wandering Behavior Management Systems for Individuals with Dementia

Abstract: Alzheimer's and related dementias are associated with a gradual decline in an individual's cognitive abilities, impairing independent living. Wandering, a purposeless, disoriented locomotion tendency or behavior of dementia patients, requires constant caregiver supervision to reduce the risk of physical harm to patients. Integrating technology into the care ecology has the potential to alleviate stress and expense. An automatic wandering detection system integrated with an intervention module may provide warnings and assistive suggestions in times of abnormal behavior. In this study, we survey existing research on technology-aided methodologies and algorithms used in the detection and management of wandering behavior of individuals affected by dementia. Our study provides insights into mechanisms for collecting movement data and finding patterns that distinguish wandering from normal behavior.

Author 1: Arshia Zernab Hassan
Author 2: Arshia Khan

Keywords: Dementia; wandering behavior; technology; algorithm

PDF

Paper 68: Framework for Disease Outbreak Notification Systems with an Optimized Federation Layer

Abstract: The data needed to detect outbreaks of known and unknown diseases is often gathered from sources scattered across many geographical locations. Often these scattered data exist in a wide variety of formats, structures, and models. The collection, pre-processing, and analysis of these data to detect potential disease outbreaks is very challenging, time-consuming and error-prone. To fight disease outbreaks, healthcare practitioners, epidemiologists and researchers need to access the scattered data in a secure and timely manner. They also require a uniform and logical framework or methodology to access the relevant data. In this paper, the authors propose a federated framework for Disease Outbreak Notification Systems (DONSFed). Using an advanced design and an XML technique patented in the US in 2016 by our team, the framework was tested and validated as part of this work. The proposed approach enables healthcare professionals to quickly and uniformly access the data required to detect potential disease outbreaks. This research focuses on implementing a cloud-based prototype as a proof of concept to demonstrate the functionality and to verify the concept of the proposed framework.

Author 1: Farag Azzedin
Author 2: Mustafa Ghaleb
Author 3: Salahadin Adam Mohammed
Author 4: Jaweed Yazdani

Keywords: Disease outbreak notification system; database federation; web services; service oriented architecture; health systems

PDF

Paper 69: Towards an Architecture for Handling Big Data in Oil and Gas Industries: Service-Oriented Approach

Abstract: Existing architectures to handle big data in the oil and gas industry are based on industry-specific platforms and hence limited to specific tools and technologies. With these architectures, we are confined to single-provider big data solutions. The idea of multi-provider big data solutions is essential: when building up big data solutions, organizations should embrace the best-in-class technologies and tools that different providers offer. In this article, we hypothesize that the limitations of the proposed big data architectures for the oil and gas industry can be addressed by a Service Oriented Architecture (SOA) approach. We propose the idea of breaking complex systems into simple, separate, yet reliable distributed services, with loose coupling between the interacting services. Thus, our proposed architecture enables petroleum industries to select the necessary services from the SOA-based ecosystem and create viable big data solutions.

Author 1: Farag Azzedin
Author 2: Mustafa Ghaleb

Keywords: Service-oriented architecture; big data; Hadoop; oil and gas; big data architecture

PDF

Paper 70: Parallel Backpropagation Neural Network Training Techniques using Graphics Processing Unit

Abstract: Training an artificial neural network using backpropagation is a computationally expensive process in machine learning. Parallelization of neural networks using a Graphics Processing Unit (GPU) can help to reduce the time needed to perform the computations. The GPU uses a Single Instruction Multiple Data (SIMD) architecture to perform high-speed computing. The use of the GPU shows a remarkable performance gain when compared to the CPU. This work discusses different parallel techniques for the backpropagation algorithm using the GPU. Most of the techniques include a comparative analysis between CPU and GPU.

Author 1: Muhammad Arslan Amin
Author 2: Muhammad Kashif Hanif
Author 3: Muhammad Umer Sarwar
Author 4: Abdur Rehman
Author 5: Fiaz Waheed
Author 6: Haseeb Rehman

Keywords: Artificial neural network; backpropagation; SIMD; CPU; GPU; machine learning

PDF

Paper 71: Overlapped Apple Fruit Yield Estimation using Pixel Classification and Hough Transform

Abstract: Researchers have proposed various vision-based methods for estimating fruit quantity and performing qualitative analysis, using aerial and ground vehicles to capture fruit images in orchards. Fruit yield estimation is a challenging task under environmental noise such as illumination changes, color variation, overlapped fruits, cluttered environments, and shading from branches or leaves. In this paper, we propose a fast, learning-free, vision-based method to correctly count apple fruits that are tightly overlapped in a complex outdoor orchard environment. We first carefully build an HS color model to perform color-based segmentation. This step extracts the apple fruits from the complex orchard background and produces blobs representing apples along with additional noisy regions. We use fine-tuned morphological operators to refine the blobs from the previous step and remove the noisy regions, followed by Gaussian smoothing. Finally, we apply the Hough transform to the circular blobs to calculate the center coordinates of each apple edge, and the method correctly locates the apples in the images. The results show that the proposed algorithm successfully detects and counts apple fruits in images captured from an apple orchard and outperforms the standard state-of-the-art contour-based method.
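
The segment-refine-Hough pipeline summarized above can be sketched with OpenCV roughly as below; the synthetic image, HSV range, kernel size and Hough parameters are placeholders that would need tuning on real orchard images.

```python
# Hedged sketch of the segment-refine-Hough pipeline described above (OpenCV).
# All colors, ranges and Hough parameters below are illustrative placeholders.
import numpy as np
import cv2

img = np.zeros((240, 320, 3), np.uint8)
cv2.circle(img, (100, 120), 30, (0, 0, 220), -1)    # two synthetic red "apples",
cv2.circle(img, (150, 120), 30, (0, 0, 220), -1)    # partially overlapped

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))        # red-ish HS range

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)        # remove small noise
mask = cv2.GaussianBlur(mask, (9, 9), 2)

circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                           param1=100, param2=15, minRadius=15, maxRadius=60)
count = 0 if circles is None else circles.shape[1]
print("apples detected:", count)
```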

Author 1: Zartash Kanwal
Author 2: Abdul Basit
Author 3: Muhammad Jawad
Author 4: Ihsan Ullah
Author 5: Anwar Ali Sanjrani

Keywords: Apple detection; pixel classification; curvature estimation; Hough circle transform; visual tracking; color segmentation

PDF

Paper 72: Comparative Analysis of Network Libraries for Offloading Efficiency in Mobile Cloud Environment

Abstract: In the modern era, smartphones are increasingly becoming an integral and essential part of our daily life. Although the hardware capabilities of smartphones (i.e., processing, memory, battery, and communication) are improving every day, they are not enough to handle computation-intensive applications, such as image processing, data analytics, and encryption. To overcome these limitations, mobile cloud computing (MCC) was introduced, which augments the capabilities of smartphones with the resources of the cloud to provide better QoS to the user. The idea is to save resources on the smartphone by offloading computationally intensive tasks to the cloud. In this context, researchers have proposed several offloading frameworks, mainly addressing the challenges of why, what, when and where to offload. In this paper, however, we explore another challenging issue of offloading, i.e., how to offload. More specifically, we analyze different networking libraries (HttpURLConnection, OkHttp, Volley, Retrofit) and study their performance under various dynamic factors such as data size, communication medium, and the hardware and software of the smartphone. Our objective is to explore whether an application can use the same networking library for all smartphones and all purposes, or whether there is a need to make an adaptive decision based on local constraints. To understand this, we perform a comprehensive analysis of the networking libraries on different Android smartphones in a real environment and find that adaptive network library selection is needed because library performance changes in different scenarios.

Author 1: Farhan Sufyan
Author 2: Amit Banerjee

Keywords: Android; Mobile Cloud Computing (MCC); network libraries; offloading; performance

PDF

Paper 73: A Novel Data Aggregation Scheme for Wireless Sensor Networks

Abstract: Wireless sensor networks (WSN) consist of diverse and minute sensor nodes which are widely employed in different applications, for example, atmosphere monitoring, search and rescue activities, disaster management, wildlife monitoring and so on. A WSN is a collection of clusters, and information exchange occurs with the assistance of a cluster head (CH). A lot of sensor node energy is utilized in procedures like sensing, information exchange and cluster formation using various protocols. In a cluster-based WSN, it is profitable to segregate the tasks performed by cluster heads, as a fair amount of energy can be conserved. Following this, in this work we propose a solution that includes a supplementary node, named a 'super node', alongside the cluster heads in a cluster-based WSN. This node is in charge of all the clusters in the WSN and keeps track of every cluster's energy information. It manages the cluster heads from their creation to the end. All the clusters in the network send their respective information to this node, which eliminates redundant information and forwards the aggregated information towards the sink. This not only saves CH energy but also conserves individual cluster nodes' energy through proper monitoring of the energy levels. This mechanism enhances the lifetime of the network by minimizing the number of communications between nodes and the sink. In order to evaluate the performance of our proposed mechanism, we use various parameters like packet delay, communication overhead and energy consumption, which show the optimality of our approach.

Author 1: Syed Gul Shah
Author 2: Atiq Ahmed
Author 3: Ihsan Ullah
Author 4: Waheed Noor

Keywords: Wireless Sensor Networks; energy consumption; energy-aware routing; clustering; data aggregation

PDF

Paper 74: Review of Community Detection over Social Media: Graph Prospective

Abstract: A community over social media is a group of globally distributed end users having a similar attitude towards a particular topic or product. Community detection algorithms are used to identify the social atoms that are more densely interconnected relative to the rest of the social media platform. Recently, researchers have focused on group-based and member-based algorithms for community detection over social media. This paper presents a comprehensive overview of community detection techniques based on recent research and subsequently explores the graph perspective of social media mining and social theories (balance theory, status theory, correlation theory) for community detection. Along with that, this paper presents a comparative analysis of three different state-of-the-art community detection algorithms available in the igraph package for Python, i.e., walktrap, edge betweenness and fast greedy, over six different social media datasets. This yields interesting facts about the capabilities and deficiencies of community analysis methods.
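
The three igraph algorithms compared in the paper can be run side by side as in the short sketch below; Zachary's karate club graph stands in for the six social media datasets, and modularity is used here simply as a convenient comparison score.

```python
# Sketch of comparing the three igraph community-detection algorithms named above.
# Zachary's karate club is a placeholder for the paper's social media datasets.
import igraph as ig

g = ig.Graph.Famous("Zachary")          # placeholder social network

methods = {
    "walktrap": g.community_walktrap().as_clustering(),
    "edge betweenness": g.community_edge_betweenness().as_clustering(),
    "fast greedy": g.community_fastgreedy().as_clustering(),
}
for name, clustering in methods.items():
    print(f"{name}: {len(clustering)} communities, "
          f"modularity={clustering.modularity:.3f}")
```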

Author 1: Pranita Jain
Author 2: Deepak Singh Tomar

Keywords: Community detection; social media; social media mining; homophily; influence; confounding; social theory; community detection algorithm

PDF

Paper 75: Text Mining Techniques for Intelligent Grievances Handling System: WECARE Project Improvements in EgyptAir

Abstract: This work aims to provide quick responses and to minimize the processing time of incoming grievances through automated categorization that analyzes the English text content and predicts the category. We built a model using text mining and NLP to extract useful information from customer grievance data, to be used as guidance for the air transport industry. The customer grievance system at EGYPTAIR, called WECARE, receives large feeds of data collected through various channels such as e-mail, the website, and mobile apps. The incoming data are currently analyzed and assessed by the organization's staff and assigned to the related department through manual classification, after which a solution for the issue is proposed. Manual categorization of grievances is therefore a time-consuming process, so this work proposes a model to improve the WECARE system at Egypt Airlines. Classification-based data mining techniques are used to assign the data to groups of categories across the various touch points. The system has 166 categories of problems, but for experimental purposes we study only 6 categories. We applied four commonly used classifiers, namely Support Vector Machine (SVM), K-Nearest Neighbours (KNN), Naïve Bayesian, and Decision Tree, to our data set to classify the grievance data, and then selected the best of them as the candidate grievance classifier for the enhanced WECARE system. Among the four classifiers, KNN achieved the highest average accuracy (97.5%) with acceptable running time. The work is also extended to give the system user a hint on how to solve a grievance issue based on previous issues saved in a Knowledge Base (KB). Several experiments were conducted to test the solution-hint module by varying the similarity score. The benefits of performing a thorough analysis of problems include a better understanding of service performance.
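
A minimal scikit-learn sketch of the classifier comparison described above is given below. The two-category toy texts are placeholders; the paper works with six grievance categories drawn from the WECARE data, and the exact features and parameters used there are not specified in the abstract.

```python
# Illustrative sketch (assumed): compare SVM, KNN, Naive Bayes, and
# Decision Tree on TF-IDF features. The toy texts and labels below are
# placeholders for the WECARE grievance data set.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier

texts = ["my baggage was lost on arrival", "the flight was delayed for hours",
         "bag never delivered to my hotel", "departure postponed twice"]
labels = ["baggage", "delay", "baggage", "delay"]   # placeholder categories

classifiers = {
    "SVM": LinearSVC(),
    "KNN": KNeighborsClassifier(n_neighbors=1),
    "NaiveBayes": MultinomialNB(),
    "DecisionTree": DecisionTreeClassifier(),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipe, texts, labels, cv=2)  # accuracy per fold
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```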

Author 1: Shahinaz M. Al-Tabbakh
Author 2: Hanaa M. Mohammed
Author 3: Hayam El-zahed

Keywords: Knowledge base; Grievances; NLP; SVM; KNN; Naïve Bayesian; Decision Tree

PDF

Paper 76: Designing Smart Sewerbot for the Identification of Sewer Defects and Blockages

Abstract: The Internet of Things (IoT) is a concept in which the term 'thing' refers to configurable sensors and devices, whether domestic or industrial, that are connected to one another through the Internet Protocol. The same concept has been carried into robotics as the 'Internet of Robotic Things (IoRT)', which is mainly concerned with sensors duly interfaced with any type of robot, such as an autonomous unmanned ground vehicle (UGV). This paper describes the prototyping of an autonomous sewerbot that not only identifies defects in sewerage pipelines but also identifies the type of blockage using digital image processing. Furthermore, the deployed configurable sensors share the attributes of the particular sewerage line over IoT, such as temperature, humidity, the presence of hazardous gases, the exact depth at which the robot is located, and its global position obtained from a GPS module. The paper also briefly describes the construction of this mechatronic, amphibian system, which can extricate blockages from sewerage lines while providing wireless camera surveillance.
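
The abstract states that blockage identification relies on digital image processing but does not specify the algorithm, so the OpenCV sketch below (dark-region thresholding with an arbitrary area cutoff) is purely an assumed illustration of what such a check might look like on a camera frame.

```python
# Hypothetical OpenCV sketch; the paper does not specify its image
# processing method, so the threshold and cutoff here are assumptions.

import cv2

def blockage_ratio(frame_path: str) -> float:
    """Fraction of the frame covered by dark regions (candidate blockage)."""
    gray = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(frame_path)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # dark regions are treated as possible obstructions
    _, mask = cv2.threshold(blurred, 60, 255, cv2.THRESH_BINARY_INV)
    return cv2.countNonZero(mask) / mask.size

if __name__ == "__main__":
    ratio = blockage_ratio("sewer_frame.jpg")   # hypothetical frame from the onboard camera
    print("possible blockage" if ratio > 0.4 else "clear")
```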

Author 1: Ghulam E Mustafa Abro
Author 2: Bazgha Jabeen
Author 3: Ajodhia
Author 4: Kundan Kumar
Author 5: Abdul Rauf
Author 6: Ali Noman
Author 7: Syed Faiz ul Huda
Author 8: Amjad Ali Qureshi

Keywords: Internet of Robotic Things (IoRT); GPS; humidity; Internet Protocol; temperature; wireless communication; sewer defects

PDF

Paper 77: Thinging for Computational Thinking

Abstract: This paper examines conceptual models and their application to computational thinking. Computational thinking is a fundamental skill for everybody, not just for computer scientists, and it has been promoted as being as fundamental for all as numeracy and literacy. According to authorities in the field, the best way to characterize computational thinking is as the way in which computer scientists think and the manner in which they reason. Core concepts in computational thinking include such notions as algorithmic thinking, abstraction, decomposition, and generalization. This raises several issues and challenges that still need to be addressed, including the fundamental characteristics of computational thinking and its relationship with the modeling patterns (e.g., object-oriented) that lead to programming/coding. A thinking pattern refers to a recurring template used by designers in thinking. In this paper, we propose a representation of thinking activity by adopting a thinking pattern called thinging, which utilizes a diagrammatic technique called the thinging machine (TM). We claim that thinging is a valuable process and a fundamental skill for everybody in computational thinking. The viability of this claim is illustrated through examples and a case study.

Author 1: Sabah Al Fedaghi
Author 2: Ali Abdullah Alkhaldi

Keywords: Computational thinking; conceptual modeling; abstract machine; thinging; abstraction

PDF

Paper 78: Genetic Algorithm for Data Exchange Optimization

Abstract: Dynamic architectures have emerged as a promising implementation platform, providing flexibility, high performance, and low power consumption for computing devices. They bring unique capabilities to computational tasks and offer the performance and energy efficiency of hardware with the flexibility of software. This paper proposes a genetic algorithm that develops an optimum configuration, optimizing the routing among communicating processing nodes by minimizing path length and maximizing the number of possible parallel paths. In addition, the paper proposes forward, virtually inverse, and hybrid data exchange approaches to generate dynamic configurations that achieve data exchange optimization. Intensive experiments and qualitative comparisons have been conducted to show the effectiveness of the presented approaches. The results show significant improvement in total execution time, of up to 370%, 408%, 477%, and 550% when using configurations developed with the genetic algorithm, forward, virtually inverse, and hybrid data exchange techniques, respectively.
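
The sketch below is not the paper's implementation; it only illustrates a genetic algorithm of the kind described, evolving a placement of nodes on a mesh so that total routing path length between communicating pairs is minimized. The mesh size, traffic pairs, encoding, and GA parameters are all assumptions.

```python
# Illustrative GA sketch (assumed): evolve a mapping of nodes onto a mesh
# to minimize total Manhattan path length between communicating pairs.

import random

SIZE = 4                                                # 4x4 mesh of processing nodes
PAIRS = [(0, 5), (1, 10), (2, 15), (3, 12), (6, 9)]     # hypothetical communicating pairs

def path_length(perm):
    """Total Manhattan distance between the placed endpoints of each pair."""
    pos = {node: (i // SIZE, i % SIZE) for i, node in enumerate(perm)}
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1]) for a, b in PAIRS)

def crossover(p1, p2):
    """Order-style crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = p1[a:b]
    child += [n for n in p2 if n not in child]
    return child

def mutate(perm, rate=0.1):
    perm = perm[:]
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def evolve(generations=200, pop_size=30):
    pop = [random.sample(range(SIZE * SIZE), SIZE * SIZE) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=path_length)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=path_length)

if __name__ == "__main__":
    best = evolve()
    print("best total path length:", path_length(best))
```

A fitness term rewarding disjoint (parallel) paths could be added to the same objective; it is omitted here to keep the sketch short.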

Author 1: Medhat H A Awadalla

Keywords: Genetic algorithm; dynamic architectures; forward data exchange; virtually inverse data exchange; hybrid data exchange

PDF

Paper 79: Video Watermarking System for Copyright Protection based on Moving Parts and Silence Deletion

Abstract: In recent years, video watermarking has emerged as a powerful technique for ensuring copyright protection. A watermarking system should nevertheless satisfy several important properties: the lowest possible distortion, high transparency with transparency control, integrity of the watermarked video, and robustness against attacks that aim to destroy the embedded watermark. In this paper, we propose a video watermarking system that hides a watermark in both the visual and the audio stream to ensure the integrity of the watermarked video. Specifically, we propose the moving block detection (MBD) algorithm for hiding the watermark in the moving parts of the original visual stream. The MBD algorithm ensures that embedding the watermark causes minimal distortion; it uses entropy to find the moving parts of the visual stream in which to hide the watermark. Hiding in the visual stream is performed using the DWT to ensure both transparency and resistance against attacks, and we employ the power factors of the DWT to control the level of transparency. In addition, we propose the silence deletion algorithm (SDA), which generates a pure original audio stream by removing noise from the original audio stream to form the hiding place of the watermark within the audio stream; the DCT is then employed to hide the watermark within this pure audio stream to ensure resistance against attacks. Under a threat model that includes geometric attacks (bilinear, curved, and LPF) and non-geometric attacks (compression and Gaussian noise), the experimental results demonstrate that the proposed system outperforms four similar systems based on key frames, I-frames, spread spectrum, and LSB.
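
To make the DWT-domain embedding with a transparency ("power") factor concrete, a minimal PyWavelets sketch is given below. It covers only the embedding step on a single block; the block selection by entropy (MBD) and the DCT-based audio embedding after silence deletion (SDA) described in the abstract are not shown, and the coefficient band and strength value are assumptions.

```python
# Minimal PyWavelets sketch (assumed): add a scaled watermark to the
# diagonal detail coefficients of one block; alpha acts as the
# transparency/strength factor.

import numpy as np
import pywt

def embed_dwt(block: np.ndarray, watermark_bits: np.ndarray, alpha: float = 5.0) -> np.ndarray:
    """Embed bits into the cD sub-band of a single-level Haar DWT of the block."""
    cA, (cH, cV, cD) = pywt.dwt2(block.astype(float), "haar")
    wm = np.resize(watermark_bits, cD.shape)          # tile/crop bits to fit cD
    cD_marked = cD + alpha * (2 * wm - 1)             # map {0,1} -> {-alpha, +alpha}
    return pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=(64, 64))       # stand-in for a moving block
    bits = rng.integers(0, 2, size=128)               # hypothetical watermark payload
    marked = embed_dwt(block, bits, alpha=5.0)
    print("mean absolute distortion:", float(np.mean(np.abs(marked - block))))
```

Lowering alpha improves transparency at the cost of robustness, which is the trade-off the power factor controls.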

Author 1: Shahad Almuzairai
Author 2: Nisreen Innab

Keywords: Watermark; audio stream; visual stream; moving block; silence deletion; DWT; DCT; attacks

PDF
