The Science and Information (SAI) Organization

IJACSA Volume 8 Issue 9

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: The Analysis of Anticancer Drug Sensitivity of Lung Cancer Cell Lines by using Machine Learning Clustering Techniques

Abstract: Lung cancer is the most common type of cancer and has the highest fatality rate worldwide. Ongoing drug-development research for lung cancer patients assesses their responses to chemotherapeutic treatments in order to select novel targets for improved therapies. This study aims to analyze anticancer drug sensitivity in human lung cancer cell lines by using machine learning techniques. The data for this analysis were extracted from the National Cancer Institute (NCI). The experiment uses 408,291 small-molecule screening records for human lung cancer cell lines, describing the raw viability values for 91 human lung cancer cell lines treated with 354 different chemical compounds at 432 concentration points tested in replicate experiments. Our analysis clustered the data from this considerable number of cell lines using Simple K-means and Filtered clustering, and by calculating the sensitive drugs for each lung cancer cell line. Additionally, our analysis demonstrated that the anticancer chemical compounds Neopeltolide, Parbendazole, Phloretin and Piperlongumine showed greater sensitivity across all 91 cell lines under different concentrations (p-value < 0.001). Our findings indicated that the Simple K-means and Filtered clustering methods produce essentially the same results. The available literature on lung cancer cell line data reports a significant relationship between lung cancer and anticancer drugs. Our analysis of the reported experimental results demonstrated that some compounds are more sensitive than others; Phloretin was the most sensitive compound, covering roughly 59% of the 91 lung cancer cell lines. Hence, our observation provides a methodology for analyzing the anticancer drug sensitivity of lung cancer cell lines using machine learning techniques such as clustering algorithms. This inquiry is a useful reference for researchers working on drug development for lung cancer in the future.

Author 1: Chandi S. Wanigasooriya
Author 2: Malka N. Halgamuge
Author 3: Azeem Mohammad

Keywords: Data analysis; clustering; filtered clustering; simple k-means clustering; cancer; lung cancer; cancer cell lines; drug sensitivity

PDF
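
As a companion to the clustering approach described in Paper 1 above, the following is a minimal Python sketch of grouping cell-line drug-sensitivity profiles with Simple K-means using scikit-learn. The synthetic viability matrix, its shape, and the cluster count are illustrative assumptions, not the authors' actual NCI data or pipeline.

```python
# Minimal k-means sketch for grouping cell-line drug-sensitivity profiles.
# Assumptions: each row is one cell line, each column a compound viability value;
# k=4 clusters is arbitrary and purely illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
viability = rng.random((91, 354))               # 91 cell lines x 354 compounds (synthetic)

X = StandardScaler().fit_transform(viability)   # normalise each compound's values
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

for cluster_id in range(kmeans.n_clusters):
    members = np.where(kmeans.labels_ == cluster_id)[0]
    print(f"cluster {cluster_id}: {len(members)} cell lines")
```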

Paper 2: A Minimum Redundancy Maximum Relevance-Based Approach for Multivariate Causality Analysis

Abstract: Causal analysis, a form of root cause analysis, has been applied to explore causes rather than indications, so that the methodology can identify the direct influences of variables. This study focuses on observational data-based causal analysis for factor selection, in place of a correlation approach that does not imply causation. The study analyzes the causality relationship between a set of categorical response variables (binary and with more than two categories) and a set of explanatory dummy variables by using multivariate joint factor analysis. The paper uses the Minimum Redundancy Maximum Relevance (MRMR) algorithm to identify the causation, utilizing data obtained from the National Automotive Sampling System’s Crashworthiness Data System (NASS-CDS) database.

Author 1: Yawai Tint
Author 2: Yoshiki Mikami

Keywords: Causal analysis; dummy variable; Minimum Redundancy Maximum Relevance (MRMR); multivariate analysis

PDF
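
To illustrate the MRMR idea used in Paper 2 above, here is a small, self-contained greedy mRMR sketch based on mutual information. The synthetic dummy variables and categorical response stand in for the NASS-CDS data; the scoring (relevance minus mean redundancy) is the common difference formulation, not necessarily the exact variant used in the paper.

```python
# Greedy minimum-Redundancy-Maximum-Relevance (mRMR) feature selection sketch.
# Relevance = MI(feature, target); redundancy = mean MI with already-selected features.
# The data below is synthetic and stands in for the study's dummy variables.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def mrmr(X, y, k):
    n_features = X.shape[1]
    relevance = mutual_info_classif(X, y, discrete_features=True, random_state=0)
    selected = [int(np.argmax(relevance))]          # start with the most relevant feature
    while len(selected) < k:
        best_score, best_j = -np.inf, None
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, s]) for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_score, best_j = score, j
        selected.append(best_j)
    return selected

X = np.random.default_rng(0).integers(0, 2, size=(500, 12))  # explanatory dummy variables
y = np.random.default_rng(1).integers(0, 3, size=500)        # categorical response
print(mrmr(X, y, k=5))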

Paper 3: Root-Cause and Defect Analysis based on a Fuzzy Data Mining Algorithm

Abstract: Manufacturing organizations have to improve the quality of their products regularly to survive in today’s competitive production environment. This paper presents a method for identification of unknown patterns between the manufacturing process parameters and the defects of the output products and also of the relationships between the defects. Discovery of these patterns helps practitioners to achieve two main goals: first, identification of the process parameters that can be used for controlling and reducing the defects of the output products and second, identification of the defects that very probably have common roots. In this paper, a fuzzy data mining algorithm is used for discovery of the fuzzy association rules for weighted quantitative data. The application of the association rule algorithm developed in this paper is illustrated based on a net making process at a netting plant. After implementation of the proposed method, a significant reduction was observed in the number of defects in the produced nets.

Author 1: Seyed Ali Asghar Mostafavi Sabet
Author 2: Alireza Moniri
Author 3: Farshad Mohebbi

Keywords: Data mining; association rules; defect analysis; fuzzy sets; root cause analysis; quality

PDF

Paper 4: Design and Control of Self-Stabilizing Angular Robotics Anywalker

Abstract: Walking robots are designed to overcome obstacles when moving. The walking robot AnyWalker is developed; its design solves the task of self-stabilization of the center of mass, and a special type of chassis is developed that provides movement with high cross-country capability. The paper presents the results of designing and controlling the robot; the architecture of the software suite provides control and management of the hardware platform. AnyWalker is in fact a chassis on which robots can be built for many different purposes, such as surveying complex environments, industrial operations, and work in hazardous environments.

Author 1: Igor Ryadchikov
Author 2: Semyon Sechenev
Author 3: Sergey Sinitsa
Author 4: Alexander Svidlov
Author 5: Pavel Volkodav
Author 6: Anton Feshin
Author 7: Anas Alotaki
Author 8: Aleksey Bolshakov
Author 9: Michail Drobotenko
Author 10: Evgeny Nikulchev

Keywords: Walking robots; self-stabilization platform; stability of dynamic systems; chassis of robotic complexes

PDF

Paper 5: A New 30 GHz AMC/PRS RFID Reader Antenna with Circular Polarization

Abstract: This work focuses on the development and design of a circularly polarized metallic EBG antenna fed by two microstrip lines. To achieve that purpose, a list of indicative specifications was established, namely an antenna operating from 29.5 to 30 GHz, with a high gain value, an ellipticity ratio of less than 3 dB and secondary lobes below -12 dB, designed for Radio Frequency Identification (RFID) readers operating in the millimeter band. The size of the patch is 17 × 17 mm². Artificial materials, such as an artificial magnetic conductor (AMC) and a Partially Reflective Surface (PRS), were added as an upper layer to this antenna in order to expand its bandwidth for RFID reader applications. The new antenna has -35 dB of insertion loss with an impedance bandwidth of 0.6 GHz and a gain of 12.4 dB at 30 GHz. Analysis of the proposed antenna was carried out based on the finite element method using two electromagnetic simulation packages: CST-MW Studio® and ANSYS HFSS. The simulation results obtained are presented and discussed.

Author 1: Omrane NECIBI
Author 2: Chaouki GUESMI
Author 3: Ali GHARSALLAH

Keywords: Radio Frequency Identification (RFID); Fabry-Perot Cavity Antenna (FPCA); Electromagnetics Band Gap (EBG); circular polarization; high impedance surface (HIS); artificial magnetic conductor (AMC); millimeter wave identification; Partially Reflective Surface (PRS); axial ratio (AR)

PDF

Paper 6: Modelling Planar Electromagnetic Levitation System based on Phase Lead Compensation Control

Abstract: The electromagnetic levitation system is commonly used in the field of Maglev (magnetic levitation) trains. Modelling the Maglev system includes all the magnetic force characteristics based on the current and position. This paper presents a 2DOF model which represents a sample of a uniform rigid plane body as a function of the current and the air gap. The present work identifies the dynamic correlation of the levitation system of the Maglev using three sub-models. A lead controller is developed to achieve system stability by considering the correlation of system moments and inductance variations. The control properties of the present model are obtained through a SIMLAB microcontroller board to achieve a stable Maglev system.

Author 1: Mundher H. A. YASEEN

Keywords: Electromagnetic levitation system; lead controller; (magnetic levitation) maglev system; SIMLAB board

PDF

Paper 7: Enhanced Mechanism to Detect and Mitigate Economic Denial of Sustainability (EDoS) Attack in Cloud Computing Environments

Abstract: Cloud computing (CC) is the next revolution in the Information and Communication Technology arena. CC is often provided as a service comparable to utility services such as electricity, water, and telecommunications. Cloud service providers (CSPs) offer tailored CC services which are delivered as subscription-based services, in which customers pay based on usage. Many organizations and service providers have started shifting from traditional server-cluster infrastructure to cloud-based infrastructure. Nevertheless, security is one of the main factors that inhibit the proliferation of cloud computing. The threat of Distributed Denial of Service (DDoS) attacks continues to wreak havoc in these cloud infrastructures. In addition to DDoS attacks, a new form of attack known as the Economic Denial of Sustainability (EDoS) attack has emerged in recent years. A DDoS attack in a conventional computing setup usually disrupts the service, which affects the client's reputation and results in financial loss. In a CC environment, service disruption is very rare due to the auto-scalability (elasticity) capability and the availability of service level agreements (SLAs). However, auto-scalability utilizes more computing resources in the event of a DDoS attack, exceeding the economic bounds for service delivery and thereby triggering EDoS for the targeted organization. Although EDoS attacks are small at the moment, they are expected to grow in the near future in tandem with the growth in cloud usage. The few available EDoS detection and mitigation techniques have weaknesses and are not efficient in mitigating EDoS. Hence, an enhanced EDoS mitigation mechanism (EDoS-EMM) has been proposed. The aim of this mechanism is to provide real-time detection and effective mitigation of EDoS attacks.

Author 1: Parminder Singh Bawa
Author 2: Shafiq Ul Rehman
Author 3: Selvakumar Manickam

Keywords: Cloud computing; Economic Denial of Sustainability (EDoS) attack; security; Distributed Denial of Service (DDoS) attack; mitigation mechanism; anomaly detection technique

PDF

Paper 8: An Automated Surveillance System based on Multi-Processor System-on-Chip and Hardware Accelerator

Abstract: Video surveillance, as an example of a security system, is one of the most powerful techniques used in advanced systems. Manual viewing, which is used to analyze video in the traditional approach, should be avoided. An automated surveillance system based on suspicious behavior presents a great challenge to developers: detection is hampered by complexity and is a time-consuming process. An abnormal behavior can be identified in different ways: actions, face, trajectory, etc. Characterizing an abnormal behavior still presents a great problem. This paper proposes a specific System-on-Chip architecture for a surveillance system based on a Multi-Processor System-on-Chip (MPSOC) and a hardware accelerator. The aim is to accelerate the processing and obtain reliable and fast suspicious-behavior recognition. Finally, the experiment section demonstrates the suitability of the proposed system in terms of performance and cost.

Author 1: Mossaad Ben Ayed
Author 2: Sabeur Elkosantini
Author 3: Mohamed Abid

Keywords: Surveillance system; suspicious behaviors; multi-processor; accelerator; architecture

PDF

Paper 9: Gait Identification using Neural Network

Abstract: Biometric systems have become more important in the security and verification of any human who is under surveillance. Identification from a distance is also possible with this technology. Researchers are interested in identifying gait unobtrusively, without informing the human subject. We offer a self-similarity gait recognition system for identification using an artificial neural network, in which background modeling is performed from a video camera; movement in front of the camera is captured and frames are collected as segments using a background subtraction algorithm. The head (skeleton) is then used logically to locate the walking subject as a walking figure. In short, when a video frame sequence is entered, the offered system identifies gait and body-based properties. The offered system was evaluated on a collected gait dataset with different trials, and the video frame sequences showed that the algorithm attains good recognition performance. Human identification using gait is a distinct technique to verify an individual by the way he or she moves or walks and by the intensity of movement on the feet. Biometric recognition is a method to assess the behavioural properties of a person by setting up different patterns according to need. Gait recognition is a type of biometric system that works without quickly giving any hint to the moving subject. This is one of the best ways of monitoring people. Using this system, different environments such as airports, banks, and airbases can be monitored to detect danger and threats.

Author 1: Muhammad Ramzan Talib
Author 2: Ayesha Shafique
Author 3: Muhammad Kashif Hanif
Author 4: Muhammad Umer Sarwar

Keywords: Gait recognition; biometric identification; neural network; back propagation; human detection and tracking; morphological operator; feature extraction

PDF

Paper 10: Phishing Website Detection based on Supervised Machine Learning with Wrapper Features Selection

Abstract: The problem of Web phishing attacks has grown considerably in recent years and phishing is considered as one of the most dangerous Web crimes, which may cause tremendous and negative effects on online business. In a Web phishing attack, the phisher creates a forged or phishing website to deceive Web users in order to obtain their sensitive financial and personal information. Several conventional techniques for detecting phishing website have been suggested to cope with this problem. However, detecting phishing websites is a challenging task, as most of these techniques are not able to make an accurate decision dynamically as to whether the new website is phishing or legitimate. This paper presents a methodology for phishing website detection based on machine learning classifiers with a wrapper features selection method. In this paper, some common supervised machine learning techniques are applied with effective and significant features selected using the wrapper features selection approach to accurately detect phishing websites. The experimental results demonstrated that the performance of the machine learning classifiers was improved by using the wrapper-based features selection. Moreover, the machine learning classifiers with the wrapper-based features selection outperformed the machine learning classifiers with other features selection methods.

Author 1: Waleed Ali

Keywords: Phishing website; machine learning; wrapper features selection

PDF
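
As a rough illustration of the wrapper-based feature selection described in Paper 10 above, the sketch below wraps a random forest inside scikit-learn's forward sequential selector. The synthetic dataset and the choice of classifier and feature count are assumptions; the paper's actual phishing features and learners may differ.

```python
# Wrapper-based feature selection sketch: a classifier is wrapped inside the
# selection loop, and features are kept only if they improve cross-validated accuracy.
# The phishing dataset here is simulated; in practice the URL/page features of a
# phishing corpus would be used instead.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=30, n_informative=8, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
wrapper = SequentialFeatureSelector(clf, n_features_to_select=10, direction="forward", cv=5)
wrapper.fit(X, y)

X_selected = wrapper.transform(X)
score = cross_val_score(clf, X_selected, y, cv=5).mean()
print("selected feature indices:", np.where(wrapper.get_support())[0])
print("cross-validated accuracy with selected features: %.3f" % score)
```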

Paper 11: Basic Health Screening by Exploiting Data Mining Techniques

Abstract: This study aimed at proposing a basic health screening system based on data mining techniques in order to help related personnel with basic health screening and to let citizens self-examine their health conditions. The research comprised two steps. The first step was to create a model using classification techniques, namely Bayesian methods (Naïve Bayes, Bayesian networks, and Naïve Bayesian Updateable) and decision tree methods (C4.5, ID3, Partial Rule), to find the important attributes causing the disease. In this step, the accuracy of each method was compared to the others to select the most efficient model as input for the next step. The second step was to develop a basic health screening system by exploiting the rules from the model developed in the first step, classifying from a citizen’s health profile whether a given citizen is in the normal group, risk group or sick group. Research findings revealed two important attributes directly contributing to diabetes: blood pressure (BP) and docetaxel (DTX). Furthermore, the C4.5 algorithm provided the highest accuracy, with accuracy of 99.7969%, precision of 99.8%, recall of 99.8% and F-measure of 99.8%.

Author 1: Dolluck Phongphanich
Author 2: Nattayanee Prommuang
Author 3: Benjawan Chooprom

Keywords: Bayesian methods; classification technique; data-mining; decision tree methods

PDF
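
Paper 11 above compares Bayesian and decision-tree classifiers for screening. The sketch below trains a Naïve Bayes model and an entropy-based decision tree (scikit-learn's CART approximation of C4.5) on synthetic screening attributes; the toy labeling rule and attribute values are invented for illustration only.

```python
# Sketch comparing a Naive Bayes model with an entropy-based decision tree
# (a CART approximation of C4.5) for classifying a health profile into
# normal / risk / sick groups.  The screening data below is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(120, 15, 600),   # hypothetical blood-pressure readings
                     rng.normal(100, 25, 600)])  # hypothetical second attribute (the paper's DTX)
y = np.digitize(X[:, 1], bins=[110, 140])        # 0 = normal, 1 = risk, 2 = sick (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("NaiveBayes", GaussianNB()),
                    ("DecisionTree", DecisionTreeClassifier(criterion="entropy", random_state=0))]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy: %.3f" % accuracy_score(y_te, model.predict(X_te)))
```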

Paper 12: Fuzzy-Semantic Similarity for Automatic Multilingual Plagiarism Detection

Abstract: A word may have multiple meanings or senses; this can be modeled by considering that each word in a sentence has a fuzzy set containing words with similar meaning, which makes detecting plagiarism a hard task, especially when dealing with semantic meaning, and even harder for cross-language plagiarism detection. Arabic is known for its richness and the diversity of its word constructions and meanings; hence translating texts from/to Arabic is a complex task, and therefore adopting a fuzzy semantic-based approach seems to be the best solution. In this paper, we propose a detailed fuzzy semantic-based similarity model for analyzing and comparing texts in cross-language plagiarism (CLP) cases, in accordance with the WordNet lexical database, to detect plagiarism in documents translated from/to Arabic; a preprocessing phase is essential to form operable data for the fuzzy process. The proposed method was applied to two texts (Arabic/English), taking into consideration the specificities of the Arabic language. The results show that the proposed method can detect 85% of the plagiarism cases.

Author 1: Hanane EZZIKOURI
Author 2: Mohamed ERRITALI
Author 3: Mohamed OUKESSOU

Keywords: CLPD; fuzzy similarity; natural language processing; plagiarism detection; semantic similarity

PDF
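
As a minimal building block for the WordNet-based semantic similarity used in Paper 12 above, the sketch below computes a word-to-sentence similarity with NLTK's WordNet interface. It covers only the English side and omits the Arabic preprocessing and the fuzzy aggregation; it assumes the WordNet corpus has been downloaded via nltk.download("wordnet").

```python
# Sketch of WordNet-based word-to-sentence similarity, the kind of building block a
# fuzzy semantic similarity model can be built on.  Requires `nltk` with the WordNet
# corpus downloaded.  English only; the paper's Arabic preprocessing and the full
# fuzzy aggregation are omitted.
from nltk.corpus import wordnet as wn

def word_similarity(w1, w2):
    """Best path similarity over all synset pairs of the two words (0 if none)."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def sentence_similarity(sent1, sent2):
    """Average, over the words of sent1, of the best match found in sent2."""
    words1, words2 = sent1.lower().split(), sent2.lower().split()
    best = [max(word_similarity(w1, w2) for w2 in words2) for w1 in words1]
    return sum(best) / len(best)

print(sentence_similarity("students copy original documents",
                          "pupils duplicate source papers"))
```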

Paper 13: Customization of Graphical Visualization for Health Parameters in Health Care Applications

Abstract: In the 21st century, health care systems worldwide are facing many challenges as a result of growing concern over diseases in humans, such as intestinal, breathing, paralysis, nutritional and urogenital disorders. The use of mobile technology in the field of healthcare not only reduces cost but also facilitates quality long-term health care, intelligent automation, and rationalization of patient health monitoring wherever needed. While regular monitoring of vital-sign readings is critical, it is often overlooked because of busy life schedules. There are a number of apps for health monitoring systems, but users are generally not satisfied with these applications because of the lack of custom graphical visualization of parameter representations, such as daily, weekly or yearly graphs, and the relationships between the vital signs. In this research study, we identify custom graphical visualization of parameter representations as a principal issue in health monitoring applications. To solve the identified problems, we focus on the design and implementation of custom graphical visualization of parameters for health monitoring applications. The model emphasizes monitoring, saving and retrieving logs. The System Usability Scale has been modified and evaluated for usability, learnability, and customization of the graphs. In this research study, we observed N = 20 participants while collecting readings of heart rate, skin temperature, respiration, and glucose rate. A total of R = 60 responses were collected from the age group of 24 to 40 years. Comparisons were made between three different Android-based health monitoring applications, i.e., S-Health, Health Monitoring and the developed application. The usability and learnability responses for the developed application were significantly higher than for the other two applications, and the overall System Usability Score for the developed application was significantly high.

Author 1: Saima Tunio
Author 2: Hameedullah Kazi
Author 3: Sirajuddin Qureshi

Keywords: Custom graphical visualization; health monitoring application; learnability; usability

PDF

Paper 14: Face Extraction from Image based on K-Means Clustering Algorithms

Abstract: This paper proposes a new application of the K-means clustering algorithm. Due to its ease of implementation and application, the K-means algorithm is widely used. However, one of the disadvantages of clustering algorithms is that there is no balance between the clustering algorithm and its applications, and many researchers have paid less attention to clustering algorithm applications. The purpose of this paper is to apply the clustering algorithm to face extraction. An improved K-means clustering algorithm is proposed in this study, along with a new method for the use of clustering algorithms in image processing. To evaluate the proposed method, two case studies were used, including four standard images and five images selected from the LFW standard database. These images were processed first by the K-means clustering algorithm and then by the RER-K-means and FE-RER-clustering algorithms. This study showed that the K-means clustering algorithm can extract faces from images and that the proposed algorithm increases the accuracy rate while reducing the number of iterations, the intra-cluster distance, and the related processing time.

Author 1: Yousef Farhang

Keywords: K-means; RER-K-means; clustering algorithm; face extraction; edge detection; image clustering

PDF

Paper 15: Estimating Evapotranspiration using Machine Learning Techniques

Abstract: The measurement of evapotranspiration is the most important factor in irrigation scheduling. Evapotranspiration is the loss of water from plant and soil surfaces. Evaporation parameters are used in studying water balances, water resource management, and irrigation system design, and for estimating plant growth and height as well. Evapotranspiration is measured by different methods using various parameters. Evapotranspiration varies with climate change, and since the climate varies greatly across geographies, previously developed systems that have not used all available meteorological data are not robust models. In this research work, a model is developed to estimate evapotranspiration more accurately from a reduced set of meteorological parameters using different machine learning techniques, learning and generalizing the relationships among the different parameters. The dataset with reduced dimension is modeled through a time-series neural network, giving a regression value of R = 83%.

Author 1: Muhammad Adnan
Author 2: M. Ahsan Latif
Author 3: Abaid-ur-Rehman
Author 4: Maria Nazir

Keywords: Evapotranspiration; principal component analysis; neural network; irrigation scheduling

PDF
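
To illustrate the reduced-dimension modeling described in Paper 15 above, here is a sketch that chains principal component analysis with a neural-network regressor in scikit-learn. The synthetic weather matrix, the toy ET target, and the plain MLPRegressor (standing in for the paper's time-series network) are all assumptions.

```python
# PCA + neural-network regression sketch for estimating evapotranspiration (ET)
# from meteorological parameters.  The data is synthetic, and the plain MLPRegressor
# is only a stand-in for the time-series network used in the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
weather = rng.normal(size=(1000, 8))                            # e.g. temperature, humidity, wind...
et = weather[:, :3].sum(axis=1) + 0.1 * rng.normal(size=1000)   # toy ET target

X_tr, X_te, y_tr, y_te = train_test_split(weather, et, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=4),                      # reduced meteorological dimension
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data: %.3f" % model.score(X_te, y_te))
```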

Paper 16: A Novel Approach for Boosting Base Station Anonymity in a WSN

Abstract: Nodes in a wireless sensor network scrutinize the nearby region and transmit their findings to the base station (BS) using multi-hop transmission. As the BS plays an important role in a wireless sensor network, an adversary who wants to interrupt the operation of the network would actively look for the BS location and impose maximum damage by destroying the BS physically. The multi-hop data transmission towards the BS creates a prominent traffic pattern (heavy traffic near the BS region) that indicates the presence of the BS in the nearby region, and thus the location of the BS may be exposed to adversaries. This work aims to provide a novel approach that increases BS anonymity. For this purpose, a randomly roaming BS and special nodes are proposed to achieve the above-mentioned objective. The special nodes produce a large number of high-traffic regions that are similar to the BS region. With many regions that look like the BS region, the probability of finding the BS region using traffic analysis is very low. Therefore, this approach increases the effort required by adversaries to find the exact BS position. We have used a standard entropy model to measure the anonymity of the base station, and the GSAT test is used to calculate the number of steps required to find the base station. The results show that the proposed technique provides better anonymity compared to the existing techniques.

Author 1: Vicky Kumar
Author 2: Ashok Kumar

Keywords: Anonymity; network lifetime; wireless sensor networks

PDF
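
Paper 16 above measures anonymity with a standard entropy model. The sketch below shows the usual Shannon-entropy formulation: the more evenly an adversary's belief is spread over candidate base-station regions, the higher the entropy. The probability values are illustrative only.

```python
# Shannon-entropy sketch of base-station anonymity: an adversary assigns a
# probability to each high-traffic region being the real BS; a uniform belief
# over many look-alike regions maximises entropy, i.e. anonymity.
import math

def anonymity_entropy(probabilities):
    """H = -sum(p * log2(p)) over the adversary's belief distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Without decoy regions the adversary is almost certain of the BS location.
print(anonymity_entropy([0.9, 0.05, 0.05]))          # low entropy -> low anonymity

# With special nodes creating 8 equally plausible high-traffic regions.
print(anonymity_entropy([1 / 8] * 8))                # 3 bits -> high anonymity
```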

Paper 17: Effectiveness of Existing CAD-Based Research Work towards Screening Breast Cancer

Abstract: Accurate detection as well as classification of breast cancer is still an unsolved question in medical image processing. We reviewed the existing Computer Aided Diagnosis (CAD)-based techniques and found that considerable work has been carried out towards both detection and classification of breast cancer; however, all the existing techniques were implemented in highly controlled research environments. The prime contribution of this paper is that it reviews some of the significant journal publications from 2005–2016 and discusses their effectiveness thoroughly. The paper then discusses the open research issues that require serious attention from the research community. At the end, the paper makes some suggestions for future work directions in order to bridge the research gap identified in the existing systems.

Author 1: Vidya Kattepura
Author 2: Dr. Kurian M Z

Keywords: Breast cancer detection; computer aided diagnosis; cancer; classification

PDF

Paper 18: Embedded System Design and Implementation of an Intelligent Electronic Differential System for Electric Vehicles

Abstract: This paper presents an experimental study of an electronic differential system for an electric vehicle with four wheels and two independently driven rear in-wheel motors. It is worth bearing in mind that the electronic differential is a new technology used in electric vehicles that provides better balance on curved paths. In addition, it is lighter than a mechanical differential and can be controlled by a single controller. In this study, an intelligently supervised electronic differential is designed and controlled for electric vehicles. An embedded system is used to provide motor control with a fuzzy logic controller. High accuracy is obtained in the experimental study.

Author 1: Ali UYSAL
Author 2: Emel SOYLU

Keywords: Electronic differential; electric vehicle; embedded system; fuzzy logic controller; in-wheel motor

PDF

Paper 19: Educational Game Application Development on Classification of Diseases and Related Health Problems Treatment in Android Platform

Abstract: The classification and codification of diseases and related health problems is one of the competences of a medical recorder, as stated in Kepmenkes RI.377 of 2007. The current problem is the lack of reference exercises for learning KKPMT (Klasifikasi dan Kodifikasi Penyakit dan Masalah Terkait) in the Diploma-III Program in Medical Records and Health Information at Malang State Health Polytechnics. The purpose of this research is to design an Android-based KKPMT educational application to improve students' understanding of the KKPMT course. This investigation used a pre-experimental, one-group pretest-posttest design with the waterfall development method. The population in this study was all active second-year students of the Diploma-III Program in Medical Records and Health Information at Malang State Health Polytechnics. The results of the implementation showed that, after using the KKPMT educational game application with diagnosis code G, the percentage of students scoring above the minimum passing value increased from 6% before using the game to 94% after implementation of the game application. The results of the paired t-test showed a p-value of 0.000 < 0.05. The conclusion was that the Android game software helps students understand the KKPMT subject matter.

Author 1: Bernadus Rudy Sunindya
Author 2: Nur Hasti Purwani

Keywords: Game; KKPMT (Klasifikasi dan Kodifikasi Penyakit dan Masalah Terkait); android

PDF

Paper 20: Uniform Segregation of Densely Deployed Wireless Sensor Networks

Abstract: In wireless sensor networks, the selection of cluster heads relies upon various selection parameters, such as energy, distance, node concentration and rate of retransmission. There is always uncertainty in the suitability of a sensor node for the cluster head role due to these various selection parameters. Fuzzy logic is capable of overcoming uncertainties even with incomplete available information. This quality of fuzzy logic can reduce uncertainty in cluster head selection to a large extent. Therefore, in this paper, a fuzzy logic based clustering approach is proposed to enhance the network operational lifetime. Cluster formation is done on the basis of the spatial correlation value between sensors to organize clusters uniformly in the network. The results are compared with the well-known approaches CHEF and LEACH.

Author 1: Manjeet Singh
Author 2: Surender Soni

Keywords: Clustering; fuzzy logic; wireless sensor network; cluster head; uncertainty

PDF

Paper 21: Analysis of Zigbee Data Transmission on Wireless Sensor Network Topology

Abstract: The purpose of this study is to measure distance in a line-of-sight environment and to examine the data resulting from Zigbee transmission using star, mesh and tree topologies, evaluated with delay, throughput and packet loss parameters. The results showed that the star topology had average values that tended to be stable in the throughput and packet loss measurements, because there are no router nodes in a star topology, so the accuracy of data delivery was better; it also had the smallest delay value because the number of nodes was smaller than in the mesh and tree topologies. The mesh and tree topologies had poorer average values in the throughput and packet loss measurements, since they have to go through many processes, passing through router nodes to transmit the data to the coordinator node. However, the mesh and tree topologies have the advantage that data delivery can cover greater distances than the star topology and more nodes can be added.

Author 1: Sigit Soijoyo
Author 2: Ahmad Ashari

Keywords: Zigbee; delay; throughput; packet loss; topology

PDF

Paper 22: A New Strategy in Trust-Based Recommender System using K-Means Clustering

Abstract: Recommender systems are among the most important parts of online systems, including online services such as Amazon and Netflix, that have become very popular in recent years. These systems lead users to the desired information and goods in electronic environments. Recommender systems are one of the main tools for overcoming the problem of information overload. Collaborative filtering (CF) is one of the best approaches for recommender systems and is spreading as a dominant approach. However, it suffers from the problems of cold-start and data sparsity. Trust-based approaches try to create a neighborhood and network of trusted users that reflects users' trust in each other's opinions; such systems recommend items based on users' relationships. In the proposed method, we try to resolve the problems of low coverage rate and high RMSE in trust-based recommender systems using k-means clustering and an ant colony algorithm (TBRSK). For clustering the data, the k-means method has been used on the MovieLens and Epinions datasets, and the rating matrix is calculated so as to have the least overlap.

Author 1: Naeem Shahabi Sani
Author 2: Ferial Najian Tabriz

Keywords: Recommendation systems; collaborative filtering; trust-based recommendation system; k-means; ant colony

PDF

Paper 23: Relevance of the Indicators Observed in the Measurement of Social Resilience

Abstract: This article examines the validation of the properties observed by experts in the study of social resilience. To that purpose, it applies the method of factorial analysis of multi-correspondences (ACM) to the reflections and practices of resilience observatories. Furthermore, a mathematical modeling of the concept of social resilience and a description of the databases of the resilience observatory are provided to aid understanding of the process of analyzing the resilience of an individual.

Author 1: Ida Brou ASSIE
Author 2: Amadou SAWADOGO
Author 3: Jérôme K. ADOU
Author 4: Souleymane OUMTANAGA

Keywords: Social resilience; observatory of social resilience; mathematical modeling of the resilience; analysis of multi-correspondences (ACM)

PDF

Paper 24: QR Code Patterns Localization based on Hu Invariant Moments

Abstract: The widespread use of QR codes, coinciding with the swift growth of e-commerce transactions, has pushed computer vision researchers to continuously devise a variety of QR code recognition algorithms. Their performance is generally limited by two main factors. Firstly, most of them are computationally expensive because of the complexity of the implemented feature descriptors. Secondly, the algorithms are often sensitive to pattern geometric deformations. In this paper a robust approach is proposed, whose architecture is based on three distinct treatments: 1) An image quality assessment stage, which evaluates the quality of the captured image, given that the presence of blur significantly decreases recognition accuracy. 2) This stage is followed by an image segmentation based on an achromatic filter, through which only the regions of interest are highlighted and consequently the execution time is reduced. 3) Finally, the Hu invariant moments technique is used as a feature descriptor, permitting the removal of false positives. This technique is implemented to filter the set of candidate QR code patterns that have been roughly extracted by a scanning process. The Hu moments descriptor is able to recognize patterns independently of the geometric transformations they undergo. The experiments show that the incorporation of the aforementioned three stages significantly enhances recognition accuracy along with a notable reduction in processing time. This makes the proposed approach suitable for embedded systems and devices with limited performance.

Author 1: Hicham Tribak
Author 2: Youssef Zaz

Keywords: QR code; Hu invariant moments; pattern recognition; image blur estimation

PDF
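
As an illustration of the Hu invariant-moment descriptor used in Paper 24 above, the sketch below computes log-scaled Hu moments for a candidate region with OpenCV and compares them with a reference template. The file names, the Otsu binarisation, and the distance threshold are assumptions, not the paper's parameters.

```python
# Hu invariant-moment sketch: compute a rotation/scale/translation-invariant
# descriptor for a candidate QR finder-pattern region and compare it with a
# reference template.  File names and the match threshold are illustrative.
import cv2
import numpy as np

def hu_descriptor(gray_patch):
    """Log-scaled Hu moments of a binarised patch."""
    _, binary = cv2.threshold(gray_patch, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(binary)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)   # compress the dynamic range

template = cv2.imread("finder_pattern_template.png", cv2.IMREAD_GRAYSCALE)
candidate = cv2.imread("candidate_region.png", cv2.IMREAD_GRAYSCALE)

distance = np.linalg.norm(hu_descriptor(template) - hu_descriptor(candidate))
print("accepted as QR pattern" if distance < 0.5 else "rejected as false positive")
```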

Paper 25: Determinants Impacting the Adoption of E-Government Information Systems and Suggesting Cloud Computing Migration Framework

Abstract: This research investigates the underlying elements that affect the adoption of e-government information systems in the Boards of Intermediate and Secondary Education (BISE), Pakistan. The study is grounded in the technology, organization and environment (TOE) model. Cloud computing is becoming a viable alternative for system analysts and IT managers to consider in today's information technology environment, with its dynamic changes in the technology landscape. The second purpose of this study is to help government decision makers decide appropriately on the suitability of applications for migration to cloud computing. Considering that the services provided in e-government (BISE) are available by means of the Internet, cloud computing can be used in the implementation of the e-government architecture to provide better services by exploiting its benefits.

Author 1: Muhammad Aatif Shafique
Author 2: Babar Hayat Malik
Author 3: Yasar Mahmood
Author 4: Sadaf Nawaz Cheema
Author 5: Khizar Hameed
Author 6: Shabana Tabassum

Keywords: E-government information systems; adoption; TOE; cloud computing migration; Board of Intermediate and Secondary Education (BISE), Pakistan

PDF

Paper 26: Developing an Assessment Tool of ITIL Implementation in Small Scale Environments

Abstract: This paper considers the problematic implementation of IT Service Management (ITSM) frameworks in SMEs. Among the various frameworks available for companies to manage their IT services, ITIL is recognized as the most structured and effective; nevertheless, ITIL has been criticized for not being appropriate for small-scale enterprises. This paper provides a practical tool, formally developed according to the Design Science Research (DSR) approach, aimed at finding the key factors that affect ITIL implementation success in SMEs; the objective is to eliminate misunderstanding of the purpose of implementing an IT service management model. It determines various Critical Success Factors (CSFs) of ITIL implementation, the weight of each CSF is calculated with the Analytical Hierarchy Process (AHP), and the evaluation was carried out in a Moroccan SME. The paper therefore provides an evaluation method to help researchers and managers determine the issues related to the local culture of SMEs when adopting the ITIL framework. Results show that top management support is the most important factor for Moroccan SMEs. It is also found that an approach for determining the sequencing order of ITIL process implementation needs to be developed in order to achieve quick wins.

Author 1: Abir EL YAMAMI
Author 2: Souad AHRIZ
Author 3: Khalifa MANSOURI
Author 4: Mohammed QBADOU
Author 5: Elhossein ILLOUSSAMEN

Keywords: IT Service Management (ITSM); Information Technology Infrastructure Library (ITIL); CSFs (Critical Success Factors); Design Science Research (DSR); Analytical Hierarchy Process (AHP); Small and Medium-sized enterprises (SMEs)

PDF

Paper 27: Method for Productive Cattle Finding with Estrus Cycle Estimated with BCS and Parity Number and Hormone Treatments based on a Regressive Analysis

Abstract: An estrus cycle estimation method based on correlation analysis among influencing factors, using regression analysis, is carried out for Japanese dairy cattle productivity analysis. Through experiments with 280 anestrus Japanese Holstein dairy cows, it is found that the estrus cycle can be estimated from the visually measured Body Condition Score (BCS), hormone treatments, and parity number, using a regression equation. It is also found that the time from delivery to the next estrus can be expressed in terms of BCS, hormonal treatments, and parity. Thus the productivity of cattle can be identified.

Author 1: Kohei Arai
Author 2: Narumi Suzaki
Author 3: Iqbal Ahmed
Author 4: Osamu Fukuda
Author 5: Hiroshi Okumura
Author 6: Kenji Endo
Author 7: Kenichi Yamashita

Keywords: Body Condition Score (BCS); postpartum interval; parity number; estrous cycle; cattle productivity

PDF

Paper 28: Active and Reactive Power Control of a Variable Speed Wind Energy Conversion System based on Cage Generator

Abstract: This manuscript presents the modeling and control design for a variable-speed wind energy conversion system (VS-WECS). The control scheme is based on a three-phase squirrel cage induction generator driven by a horizontal-axis wind turbine and connected through an overhead transmission network. In this manuscript, a static VAR compensator is proposed and connected to the squirrel cage induction generator terminals in order to regulate system parameters such as voltage and power. The mechanical power is controlled through the pitch angle using Simulink (MATLAB) software. The simulation results show that the response of the proposed system offers good robustness and fast recovery under various dynamic system disturbances.

Author 1: Mazhar Hussain Baloch
Author 2: Waqas Ahmed Wattoo
Author 3: Dileep Kumar
Author 4: Ghulam Sarwar Kaloi
Author 5: Ali Asghar Memon
Author 6: Sohaib Tahir

Keywords: VAR compensator; wind turbine; cage generator

PDF

Paper 29: Hyperspectral Image Segmentation using Homogeneous Area Limiting and Shortest Path Algorithm

Abstract: Segmentation, as a preprocessing step, plays an important role in hyperspectral imaging. In this paper, considering the similarity of neighboring pixels and using a size measure, the image spectrum is divided into several segments in which several sub-areas may exist. Then, using area limiting and the shortest path to the seed pixel, and considering the pixel spectra in all bands, the areas within each section are separated. The area limiting method controls the amplitude changes of area pixels relative to the seed pixel, and the shortest path method controls the size of the area by considering the shortest path to the seed. The proposed method is implemented on AVIRIS images and, in terms of the number of areas, the borders between areas, and the possibility of area interference, shows better results than other methods.

Author 1: Fatemeh Hajiani
Author 2: Azar Mahmoodzadeh

Keywords: Segmentation; hyperspectral; shortest path; area limiting

PDF

Paper 30: Aquabot: A Diagnostic Chatbot for Achluophobia and Autism

Abstract: Chatbots, or chatter bots, have long been a good way to entertain users. This paper emphasizes the use of a chatbot in the diagnosis of achluophobia – the fear of darkness – and autism disorder. Autism and achluophobia (fear of darkness) are the most common neurodevelopmental disorders usually found in children. Conventional diagnosis methods require a lot of time and are also unable to maintain the case history of a psychological condition. A chatbot has been developed in this work which can diagnose the severity of the condition based on the user's text-based answers. It performs Natural Language Processing (NLP) for meaning extraction and uses decision trees to characterize a patient in terms of a possible condition. The NLP unit extracts the meaning of keywords defining the intensity of the condition's symptoms from the user's chat; similarity matching of sentences containing the keywords is then performed. A Depth First Search (DFS) technique is used for traversing the decision tree and making a decision about the severity of the condition. The proposed system, named Aquabot, proves to be an efficient technique for diagnosing achluophobia and autism, and it can assist practising psychologists. Aquabot not only saved time and resources but also achieved an accuracy of 88 percent when compared against a human psychologist's diagnoses.

Author 1: Sana Mujeeb
Author 2: Muhammad Hafeez Javed
Author 3: Tayyaba Arshad

Keywords: Chatbot; Achluophobia; autism; expert system

PDF
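
To make the decision-tree traversal in Paper 30 above concrete, here is a minimal depth-first walk over a hand-built symptom tree, the kind of control flow a diagnostic chatbot can use after its NLP unit has extracted yes/no answers. The questions, keywords, and severity labels are invented for illustration and are not Aquabot's actual rule base.

```python
# Depth-first traversal of a tiny symptom decision tree, mimicking how a diagnostic
# chatbot can map keyword answers extracted from chat onto a severity label.
# Questions, keywords and labels below are invented purely for illustration.
TREE = {
    "question": "fear of dark rooms?",
    "yes": {
        "question": "avoids sleeping alone?",
        "yes": "severe achluophobia (refer to psychologist)",
        "no": "mild achluophobia",
    },
    "no": "no indication of achluophobia",
}

def dfs_diagnose(node, answers):
    """Walk the tree depth-first, consuming one yes/no answer per internal node."""
    if isinstance(node, str):              # leaf: severity decision reached
        return node
    answer = answers.pop(0)                # would be extracted from the user's chat by the NLP unit
    return dfs_diagnose(node[answer], answers)

print(dfs_diagnose(TREE, ["yes", "no"]))   # -> mild achluophobia
```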

Paper 31: Question Answering Systems: A Review on Present Developments, Challenges and Trends

Abstract: Question Answering Systems (QAS) are becoming a model for the future of web search. In this paper we present a study of the latest research in this area. We collected publications from top conferences and journals on information retrieval, knowledge management, artificial intelligence, web intelligence, natural language processing and the semantic web. We identified and classified the topics of Question Answering (QA) being researched on and the solutions that are being proposed. In this study we also identified the issues being most researched on, the most popular solutions being proposed and the newest trends to help researchers gain an insight on the latest developments and trends of the research being done in the area of question answering.

Author 1: Lorena Kodra
Author 2: Elinda Kajo Meçe

Keywords: Question answering systems; community question answering systems

PDF

Paper 32: Framework for Applicability of Agile Scrum Methodology: A Perspective of Software Industry

Abstract: The agile Scrum methodology has evolved over time largely through the software industry, where it has grown and developed through empirical progress. The research work presented in this paper proposes a framework by identifying critical elements for the applicability of the agile Scrum methodology in the software industry. The proposed framework is based on four elements, i.e. technical, people, environmental and organizational. The proposed framework is validated through statistical analysis, i.e. Structural Equation Modeling (SEM), after collecting data from software industry personnel who work with agile methodologies. The research concludes that 15 out of 18 hypotheses were found significant, which include Training & Learning, Societal Culture, Communication & Negotiation, Personal Characteristics, Customer Collaboration, Customer Commitment, Decision Time, Team Size, Corporate Culture, Planning, Control, Development, Information Administration, and Working Environment.

Author 1: Anum Ali
Author 2: Mariam Rehman
Author 3: Maria Anjum

Keywords: Scrum agile methodology; framework; software industry; critical factors

PDF

Paper 33: A New Design of in-Memory File System based on File Virtual Address Framework

Abstract: Technology is growing rapidly day by day, which demands that computer systems work better and be reliable, with faster performance, fair cost and rich functionality. In the modern era of technology, in-memory file systems are used to shorten the performance gap between memory and storage. The Sustainable In-Memory File System (SIMFS) was the first to introduce the concept of the open-file address space into the address space of the process and to exploit the memory mapping hardware while accessing files. The purpose of designing and implementing the SIMFS architecture is to improve the performance of in-memory file systems. SCMFS is designed for storage-class memory systems; it uses the memory management component present in the operating system to assist in managing blocks, and it keeps the space for each file contiguous in the virtual address space. A recent study has proposed that non-volatile memories are now powerful enough to minimize the performance gap, as compared to previous-generation non-volatile memories. This is because the performance gap between non-volatile and volatile memories has been reduced, and there are possibilities of using non-volatile memory as a computer's main memory in the near future. Lately, high-speed non-volatile storage media such as Phase Change Memory (PCM) have emerged, and it is expected that PCM will replace the hard disk as a storage device in the coming years. Moreover, PCM is byte-addressable, which means that it can access individual bytes of data rather than words, and its data access time is expected to be almost indistinguishable from that of DRAM, a volatile memory. These features and innovations in computer architecture are making computer systems more reliable and faster.

Author 1: Fahad Samad
Author 2: Zulfiqar Ali Memon

Keywords: Phase change memory; non-volatile memory; Spin Transfer Torque – RAM; sustainable in-memory file system; journaling file system

PDF

Paper 34: Medicloud: Hybrid Cloud Computing Framework to Optimize E-Health Activities

Abstract: Cloud computing is an emerging technology, and its use in the health sector is remarkable. It enhances the patient treatment process and allows physicians to remotely access patient medical records anywhere and anytime. Numerous cloud-based solutions are currently in operation, offering facilities to people in rural areas of developing countries. Global healthcare estimates suggest that cloud adoption in the health sector will increase drastically within a few years, although cloud-based health services present both opportunities and challenges. Privacy, security, interoperability and standards are the factors that influence cloud computing in e-health. For cloud adoption, an organization must understand its existing requirements and make a strategy for further development. The cloud offers service and deployment models, and each organization selects the appropriate model according to its requirements. An interesting aspect of the cloud is that responsibility is shared between provider and customer from a usage perspective. To initiate the whole procedure, a service level agreement is signed between customer and provider. An organization can access cloud services from multiple providers. Hybrid cloud computing is the most suitable architecture for health organizations. The overall approach provides ease to physician and patient and maximizes work productivity.

Author 1: Hina Kunwal
Author 2: Dr. Babur Hayat Malik
Author 3: Amber Saeed
Author 4: Husnain Mushtaq
Author 5: Hassan Bilal Cheema
Author 6: Farhat Mehmood

Keywords: E-health; cloud computing; hybrid cloud; cloud based services; patient; security; cloud adoption

PDF

Paper 35: Rising Issues in VANET Communication and Security: A State of Art Survey

Abstract: VANETs (Vehicular Adhoc Networks) have brought an evolution in intelligent transportation in most developed countries, and a VANET plays an important role in an intelligent transportation system (ITS). This paper gives an overall survey of the research in VANET security and communication and lists the parameters considered by previous researchers. The survey finds that the authentication and message forwarding issues require more research. Authentication is the first line of security in a VANET; it prevents attacks made by malicious nodes. Previous research has produced cryptographic, trust-based, ID-based, and group-signature-based authentication schemes. The speed of authentication and privacy preservation are important parameters in VANET authentication. This paper presents the AECC (Adaptive Elliptic Curve Cryptography) and EECC (Enhanced Elliptic Curve Cryptography) schemes to improve the speed and security of authentication. In AECC, the key size is adaptive, i.e. keys of different sizes are generated during the key generation phase; three ranges are specified for key sizes: small, medium, and large. In EECC, an extra parameter is added during the transmission of information from the vehicle to the RSU for key generation; this additional parameter gives information about the vehicle ID and the location of the vehicle to the RSU and the other vehicles. For the communication issue in VANETs, the paper proposes priority-based message forwarding to improve the message forwarding scheme and handle emergency situations more effectively.

Author 1: Sachin P. Godse
Author 2: Parikshit N. Mahalle
Author 3: Sanjeev J. Wagh

Keywords: Vehicular Adhoc Network (VANET); Adaptive Elliptic Curve Cryptography (AECC); Enhanced Elliptic Curve Cryptography (EECC); authentication; message forwarding

PDF
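
As background for the adaptive key sizes mentioned for AECC in Paper 35 above, the sketch below generates elliptic-curve key pairs on curves of different sizes with the `cryptography` package. The mapping from a priority level to a curve is purely an assumption; the paper's actual AECC/EECC protocol details are not reproduced.

```python
# Sketch of "adaptive" elliptic-curve key generation: pick a smaller or larger
# curve depending on a priority level, as an AECC-style scheme might.  Uses the
# `cryptography` package; the priority-to-curve mapping is purely illustrative.
from cryptography.hazmat.primitives.asymmetric import ec

CURVES = {"small": ec.SECP192R1(), "medium": ec.SECP256R1(), "large": ec.SECP384R1()}

def generate_keypair(priority):
    """Return (private_key, public_key) on a curve chosen by the given priority."""
    private_key = ec.generate_private_key(CURVES[priority])
    return private_key, private_key.public_key()

priv, pub = generate_keypair("medium")
print("curve:", priv.curve.name, "- key size:", priv.curve.key_size, "bits")
```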

Paper 36: Using Hybrid Evolutionary Algorithm based Adaptive Filtering

Abstract: Noise degrades the overall efficiency of data transmission in networking models, and Cognitive Radio Adhoc Networks (CRAHNs) are no different. For efficient opportunistic routing in CRAHNs, the Modified SMOR (M-SMOR) and Sparsity based Distributed Spectrum Map M-SMOR (SDS-M-SMOR) have been developed, which provide significant improvement in the overall routing behavior. However, an increase in noise is inevitable, especially in large-scale networks, which motivates a hybrid of Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA), together termed HPSOGA. The proposed HPSOGA-based adaptive filter readjusts the filter constraints in accordance with the channel and the signals, and thus mitigates the noise in reconfigurable systems such as CRAHNs. The key benefit of the HPSOGA-based adaptive filter is its global optimization compared to other approaches; the proposed model with noise cancellation achieves better performance values than other routing models.

Author 1: Adnan Alrabea

Keywords: Cognitive radio adhoc networks; distributed spectrum map; swarm optimization; genetic algorithm

PDF

Paper 37: A Fast Method to Estimate Partial Weights Enumerators by Hash Techniques and Automorphism Group

Abstract: BCH codes have a high error-correcting capability, which allows them to be classed as good cyclic error-correcting codes. This important characteristic is very useful in communication and data storage systems. Yet, almost 60 years after their discovery, their weight enumerators, and therefore their analytical performance, are known only for lengths less than or equal to 127 and only for some codes of length 255. The Partial Weights Enumerator (PWE) algorithm makes it possible to obtain partial weight enumerators for linear codes; it is based on the Multiple Impulse Method combined with a Monte Carlo method, and its main inconvenience is the relatively long run time. In this paper we present an improvement of PWE that accelerates it by integrating hash techniques and a part of the automorphism group (PWEHA). The chosen approach applies at two levels. The first is to expand the sample that contains codewords of the same weight from a given codeword; this is done by adding a part of the automorphism group. The second level is to simplify the search in the sample by the use of hash techniques. PWEHA has allowed us to considerably reduce the run time of the PWE algorithm; for example, the run time is reduced by more than 3900% for the BCH(127,71,19) code. The method is validated and used to approximate the partial weight enumerators of some BCH codes with unknown weight enumerators.

Author 1: Moulay Seddiq EL KASMI ALAOUI
Author 2: Saïd NOUH
Author 3: Abdelaziz MARZAK

Keywords: Partial weights enumerator; PWEHA; automorphism group; hash function; hash table; BCH codes

PDF

Paper 38: Colored Image Retrieval based on Most used Colors

Abstract: The fast development of digital image capture has led to the availability of large databases of images. The manipulation and management of images within these databases depend mainly on the user interface and the search algorithm used to query them; there are two search methods for image databases: text-based and content-based. In this paper, we present a method for content-based image retrieval that extracts image features based on the most used colors. Preprocessing steps, namely smoothing, quantization and edge detection, are applied to enhance the extracted features. Color quantization is applied in the RGB (Red, Green, Blue) color space to reduce the range of colors in the image, after which the most used colors are extracted. In this approach, color distance is computed in the HSV (Hue, Saturation, Value) color space for comparing a query image with database images, because it is the color space closest to human color perception. This approach provides an accurate, efficient and less complex retrieval system.

Author 1: Sarmad O. Abter
Author 2: Dr. Nada A.Z Abdullah

Keywords: Most used colors feature; color histogram; content-based image retrieval (CBIR); contour analysis; HSV color space

PDF

Paper 39: A Fuzzy based Model for Effort Estimation in Scrum Projects

Abstract: This paper aims to utilize fuzzy logic concepts to improve effort estimation in the Scrum framework and, in turn, add a significant enhancement to Scrum. The Scrum framework is one of the most popular agile methods, in which the team accomplishes its work by breaking it down into a series of sprints. In Scrum, there are many factors that have a significant influence on the effort estimation of each task in a Sprint. These factors are: Development Team Experience, Task Complexity, Task Size, and Estimation Accuracy. These factors are usually expressed using linguistic quantifiers. Therefore, this paper utilizes fuzzy logic concepts to build a fuzzy-based model that can improve effort estimation in the Scrum framework. The proposed model includes three components: a fuzzifier, an inference engine, and a defuzzifier. In addition, the proposed model takes into consideration the feedback that results from comparing the estimated effort with the actual effort. The researcher designed the proposed model using MATLAB. The proposed model is applied to three Sprints of a real software development project to show how it works and how it becomes more accurate over time, giving better effort estimates. In addition, the Scrum Master and the development team can use the proposed model to monitor the improvement in effort estimation accuracy over the project life.
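
Since the abstract names the inputs but not the rule base, the sketch below only illustrates the shape of such a model: triangular fuzzification of two of the four factors, a tiny Mamdani-style rule base, and weighted-average defuzzification. The membership ranges, rules and story-point consequents are invented; the paper's MATLAB implementation is not reproduced.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def estimate_effort(team_experience, task_complexity):
        # Fuzzify two of the paper's factors on a 0-10 scale (made-up ranges).
        exp_low, exp_high = tri(team_experience, -6, 0, 6), tri(team_experience, 4, 10, 16)
        cplx_low, cplx_high = tri(task_complexity, -6, 0, 6), tri(task_complexity, 4, 10, 16)

        # Tiny Mamdani-style rule base; rule strength = min of the antecedents.
        # Consequents are effort levels in story points (also made up).
        rules = [
            (min(exp_high, cplx_low), 2),    # experienced team, simple task -> low effort
            (min(exp_high, cplx_high), 5),
            (min(exp_low, cplx_low), 4),
            (min(exp_low, cplx_high), 8),    # inexperienced team, complex task -> high effort
        ]

        # Defuzzify with a weighted average of the rule consequents (a common
        # simplification of centroid defuzzification).
        total = sum(strength for strength, _ in rules)
        return sum(strength * effort for strength, effort in rules) / total if total else 0.0

    print(estimate_effort(team_experience=7, task_complexity=8))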

Author 1: Jasem M. Alostad
Author 2: Laila R. A. Abdullah
Author 3: Lamya Sulaiman Aali

Keywords: Scrum; sprint; effort estimation; fuzzy logic; fuzzy inference system

PDF

Paper 40: Clustering based Max-Min Scheduling in Cloud Environment

Abstract: Cloud computing ensures the Service Level Agreement (SLA) by provisioning resources to cloudlets. This provisioning can be achieved through scheduling algorithms that properly map given tasks while considering different heuristics such as execution time and completion time. This paper builds on the concept of the max-min algorithm with a unique proposed modification. A novel idea of a clustering-based max-min scheduling algorithm is introduced to decrease the overall makespan and achieve better VM utilization for tasks of variable length. Experimental analysis shows that, thanks to clustering, it provides better results than the different variations of max-min as well as other heuristic algorithms, in terms of effective utilization of faster VMs and proper scheduling of tasks, considering all possible scheduling scenarios and picking the best solution.
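
For reference, a compact version of the underlying Max-Min batch heuristic is sketched below on a toy task set with VMs of different speeds; the clustering of tasks by length that the paper adds on top is only noted in a comment.

    def max_min_schedule(task_lengths, vm_speeds):
        """Classic Max-Min batch heuristic: repeatedly take the task whose best
        (minimum) completion time is the largest and assign it to that VM."""
        ready = [0.0] * len(vm_speeds)               # when each VM becomes free
        assignment = {}
        remaining = dict(enumerate(task_lengths))
        while remaining:
            best = None                              # (min completion time, task, vm)
            for t, length in remaining.items():
                # Completion time of task t on each VM.
                ct = [(ready[v] + length / vm_speeds[v], v) for v in range(len(vm_speeds))]
                mct, vm = min(ct)
                if best is None or mct > best[0]:    # Max over the Min completion times
                    best = (mct, t, vm)
            mct, t, vm = best
            ready[vm] = mct
            assignment[t] = vm
            del remaining[t]
        return assignment, max(ready)                # schedule and makespan

    # Toy run; the paper additionally groups tasks of similar length into clusters
    # before applying the heuristic, which is omitted here.
    tasks = [40, 7, 15, 90, 3, 22]
    speeds = [1.0, 2.5]                              # the second VM is faster
    plan, makespan = max_min_schedule(tasks, speeds)
    print(plan, makespan)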

Author 1: Zonayed Ahmed
Author 2: Adnan Ferdous Ashrafi
Author 3: Maliha Mahbub

Keywords: Cloud computation; cluster; heuristics; batch-mode heuristics; cluster based max-min scheduling

PDF

Paper 41: Performance Chronicles of Multicast Routing Protocol in Wireless Sensor Network

Abstract: Routing protocols in wireless sensor networks (WSNs) have always been a frequently adopted research topic owing to the many unsolved issues surrounding them. This paper discusses multicast routing protocols in WSNs and briefly reviews different forms of standard research contributions as well as significant recent research techniques aimed at improving multicast routing performance. The paper then discusses the beneficial and limiting factors of existing multicast techniques and highlights the research gap. To overcome this gap, a novel architecture is proposed that addresses the optimization as a cost-minimization problem associated with multicast routing in WSNs. This paper presents the current state of multicast routing performance in WSNs and thereby gives readers a possible direction for future work with a clear visualization of the system architecture.

Author 1: Nandini G
Author 2: J. Anitha

Keywords: Complexity; multicast routing techniques; overhead; optimization; routing protocol; wireless sensor network

PDF

Paper 42: Enhancing the Administration of National Examinations using Mobile Cloud Technologies: A Case of Malawi National Examinations Board

Abstract: Technological advances and the search for efficiency have recently catalyzed a migration from paper-and-pencil processes to computer-based ones in education and training at all levels, driven by faster administration, processing and delivery of examination results, error-free marking of test items and enhanced interactivity. This research paper aims at establishing the challenges currently faced by the Malawi National Examinations Board (MANEB) when registering candidates for national examinations as well as when disseminating examination results. A Short Message Service/Unstructured Supplementary Service Data (SMS/USSD) based mobile application using cloud infrastructure is proposed to address these challenges. Data was collected from 80 respondents consisting of teachers, parents and students; the analysis shows that current MANEB business processes have a number of irregularities that result in candidates' registration records being missing or incorrect, as well as delayed access to examination results by candidates. The proposed SMS/USSD application was tested and proved to be faster and more reliable than the traditional computer-based approach currently in use.

Author 1: Lovemore Solomon
Author 2: Jackson Phiri

Keywords: National examinations; Short Message Service (SMS); Unstructured Supplementary Service Data (USSD); candidate; cloud computing; Malawi National Examinations Board (MANEB)

PDF

Paper 43: Distributed Swarm Optimization Modeling for Waste Collection Vehicle Routing Problem

Abstract: In this paper, we consider a complex garbage collection problem in which the residents of a particular area dispose of recyclable garbage that is collected and managed using a fleet of trucks with different weight capacities and volumes. Each tour is characterized by a set of constraints such as the maximum tour duration (in terms of distance and time) consumed to collect waste from several locations. The problem is modeled as a garbage collection vehicle routing problem, which aims to minimize the cost of the traveled routes (minimizing the distance traveled) by finding optimal routes for vehicles such that all waste bins are emptied and the waste is driven to the disposal locations. We propose a distributed technique based on the Ant Colony System algorithm to find optimal routes that help vehicles visit all the waste bins, using interactive agents inspired by the behavior of real ants. The designed solution creates a set of layers to control and manage the waste collection; each layer is handled by an intelligent agent characterized by a specific behavior. In this architecture, a set of behaviors has been designed to optimize routes, control the real-time capacity of vehicles and, finally, manage the traffic of messages between the different agents in order to select the best solutions to assign to each vehicle. The developed solution performs well compared to the traditional solution on small cases.
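
A single-colony, Ant Colony System-style sketch of the route construction is shown below on a toy distance matrix; the pheromone and heuristic weights are arbitrary, and the paper's multi-agent layered architecture and vehicle-capacity handling are not reproduced.

    import random

    random.seed(1)

    # Toy symmetric distance matrix between a depot (node 0) and 4 waste bins.
    D = [[0, 4, 9, 7, 6],
         [4, 0, 5, 8, 7],
         [9, 5, 0, 3, 6],
         [7, 8, 3, 0, 4],
         [6, 7, 6, 4, 0]]
    n = len(D)
    tau = [[1.0] * n for _ in range(n)]              # pheromone trails
    ALPHA, BETA, RHO, ANTS, ITERS = 1.0, 2.0, 0.5, 10, 50

    def build_tour():
        tour, unvisited = [0], set(range(1, n))
        while unvisited:
            i = tour[-1]
            # Probability of moving to bin j ~ pheromone^ALPHA * (1/distance)^BETA.
            weights = [tau[i][j] ** ALPHA * (1.0 / D[i][j]) ** BETA for j in unvisited]
            j = random.choices(list(unvisited), weights=weights)[0]
            tour.append(j)
            unvisited.remove(j)
        tour.append(0)                               # return to the depot
        return tour

    def length(tour):
        return sum(D[a][b] for a, b in zip(tour, tour[1:]))

    best = None
    for _ in range(ITERS):
        tours = [build_tour() for _ in range(ANTS)]
        best = min(tours + ([best] if best else []), key=length)
        # Evaporate, then reinforce the edges of the best tour found so far.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - RHO)
        for a, b in zip(best, best[1:]):
            tau[a][b] += 1.0 / length(best)
            tau[b][a] += 1.0 / length(best)

    print(best, length(best))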

Author 1: ELGAREJ Mouhcine
Author 2: MANSOURI Khalifa
Author 3: YOUSSFI Mohamed
Author 4: BENMOUSSA Nezha
Author 5: EL FAZAZI Hanae

Keywords: Vehicle routing system; ant colony optimization; multi-agent system; garbage collection system

PDF

Paper 44: An Intelligent Security Approach using Game Theory to Detect DoS Attacks In IoT

Abstract: The Internet of Things (IoT) is a new concept in the world of Information and Communication Technology (ICT). The structure of this global network is highly interconnected and presents a new category of challenges from the security, trust and privacy perspectives. Data transfer problems caused by Denial-of-Service (DoS) attacks readily occur in this network and lead to service slowdown or system crashes. At present, traditional techniques are widely used to confront denial-of-service attacks in the Internet of Things, while smart techniques have unfortunately been less studied and exploited. In this research, a security solution based on game theory is proposed to detect denial-of-service attacks and prevent problems in the services of the Internet of Things network. In order to scrutinize the performance of the suggested method, it was simulated using the NS2 simulator. The simulation results confirmed that the game-theory strategies in the proposed method outperformed the existing methods. Furthermore, in order to verify the findings, a comparative evaluation was carried out according to three factors: operational throughput, latency and energy consumption.
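
The abstract does not specify the game model, so the sketch below only illustrates the kind of analysis involved: a 2x2 defender/attacker game with invented payoffs, a best-response check for pure Nash equilibria, and the indifference conditions for the mixed equilibrium.

    import itertools

    # Payoffs (defender, attacker); rows = defender action, columns = attacker action.
    # Numbers are purely illustrative.
    defender_actions = ["inspect", "ignore"]
    attacker_actions = ["attack", "idle"]
    payoff = {
        ("inspect", "attack"): (2, -3),   # attack detected
        ("inspect", "idle"):   (-1, 0),   # wasted inspection cost
        ("ignore",  "attack"): (-5, 4),   # successful DoS
        ("ignore",  "idle"):   (0, 0),
    }

    def is_nash(d, a):
        d_pay, a_pay = payoff[(d, a)]
        best_d = all(payoff[(d2, a)][0] <= d_pay for d2 in defender_actions)
        best_a = all(payoff[(d, a2)][1] <= a_pay for a2 in attacker_actions)
        return best_d and best_a

    found = [(d, a) for d, a in itertools.product(defender_actions, attacker_actions)
             if is_nash(d, a)]
    print("pure Nash equilibria:", found or "none")

    # With these payoffs the game has only a mixed equilibrium; solve the 2x2
    # indifference conditions directly.
    Dp = {k: v[0] for k, v in payoff.items()}
    Ap = {k: v[1] for k, v in payoff.items()}
    p_attack = (Dp[("ignore", "idle")] - Dp[("inspect", "idle")]) / (
        Dp[("inspect", "attack")] - Dp[("inspect", "idle")]
        - Dp[("ignore", "attack")] + Dp[("ignore", "idle")])
    q_inspect = (Ap[("ignore", "idle")] - Ap[("ignore", "attack")]) / (
        Ap[("inspect", "attack")] - Ap[("ignore", "attack")]
        - Ap[("inspect", "idle")] + Ap[("ignore", "idle")])
    print("attacker attacks with prob %.3f, defender inspects with prob %.3f"
          % (p_attack, q_inspect))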

Author 1: Farzaneh Yazdankhah
Author 2: Ali Reza Honarvar

Keywords: Internet of Things (IoT); Network security; Attack detection

PDF

Paper 45: Hybrid Forecasting Scheme for Financial Time-Series Data using Neural Network and Statistical Methods

Abstract: Time-series prediction is currently an active research area in temporal data mining. Financial Time Series (FTS) forecasting is one of the most challenging tasks, because the data lack linearity and stationarity, are noisy, carry a high degree of uncertainty and contain hidden relations. Several single models based on both statistical and data mining approaches have been proposed but are unable to deal with these issues. The main objective of this study is to propose a hybrid model that uses additive and linear regression methods to combine linear and non-linear models. Three component models are investigated, namely ARIMA, EXP and ANN. Firstly, these models are fed with an exchange-rate data set (SDG-EURO). Then, the numerical outcome of each model is examined against benchmark models and the set of hybrid models reported in the related literature. The results show the superiority of the hybrid model over all other investigated models, with a MAPE of 0.82%. Based on the results of this study, we conclude that further experiments are desirable to estimate the weights of the combination method more accurately, and that more models should be surveyed in the area of time-series prediction.
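
A minimal numpy sketch of the combination step is given below, assuming the linear and non-linear component forecasts already exist (they are stood in by noisy copies of the series): regression weights are learned by least squares on a validation window and compared with a simple additive (average) combination using MAPE.

    import numpy as np

    def mape(actual, forecast):
        return 100 * np.mean(np.abs((actual - forecast) / actual))

    rng = np.random.default_rng(0)

    # Stand-in series and two component forecasts (in the paper these would come
    # from ARIMA/EXP as the linear part and an ANN as the non-linear part).
    actual = 100 + np.cumsum(rng.normal(0, 1, 120))
    linear_fc = actual + rng.normal(0, 1.5, 120)          # pretend ARIMA output
    nonlinear_fc = actual + rng.normal(0, 1.0, 120)       # pretend ANN output

    # Regression combination: learn weights on a validation window by least squares.
    train = slice(0, 90)
    X = np.column_stack([np.ones(90), linear_fc[train], nonlinear_fc[train]])
    w, *_ = np.linalg.lstsq(X, actual[train], rcond=None)

    test = slice(90, 120)
    X_test = np.column_stack([np.ones(30), linear_fc[test], nonlinear_fc[test]])
    hybrid_fc = X_test @ w

    # Additive combination for comparison: simple average of the two forecasts.
    additive_fc = (linear_fc[test] + nonlinear_fc[test]) / 2

    for name, fc in [("linear", linear_fc[test]), ("nonlinear", nonlinear_fc[test]),
                     ("additive", additive_fc), ("regression hybrid", hybrid_fc)]:
        print(name, round(mape(actual[test], fc), 3), "% MAPE")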

Author 1: Mergani Khairalla
Author 2: Xu-Ning
Author 3: Nashat T. AL-Jallad

Keywords: Financial time series; hybrid model; additive combination; regression combination; exchange rate

PDF

Paper 46: Data Distribution Aware Classification Algorithm based on K-Means

Abstract: Making data-driven decisions based on precise data analysis is widely required by different businesses, and many different data mining strategies exist for this purpose. Nevertheless, existing strategies need attention from researchers so that they can be adapted to modern data analysis needs. One of the popular algorithms is K-Means. This paper proposes a novel improvement to the classical K-Means classification algorithm. It is known that data characteristics such as data distribution, high dimensionality, size and sparseness have a great impact on the success of K-Means clustering, which directly affects the accuracy of classification. In this study, the K-Means algorithm was modified to remedy the algorithm's classification accuracy degradation, which is observed when the data distribution is not suitable to be clustered by centroids in which each centroid is represented by a single mean. Specifically, this paper proposes to intelligently include the effect of variance based on the detected distribution of the data. To assess the performance improvement of the proposed method, several experiments were carried out using different real datasets. The presented results, achieved after extensive experiments, show that the proposed algorithm improves the classification accuracy of K-Means. The achieved performance was also compared against several recent classification studies based on different classification schemes.
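
One plausible way to "include the effect of variance", sketched below as an assumption rather than the paper's exact modification, is to assign points with a per-cluster, per-feature variance-scaled distance (a diagonal-Gaussian criterion) instead of plain Euclidean distance, updating means and variances at each iteration.

    import numpy as np

    def variance_aware_kmeans(X, k, iters=20, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        variances = np.ones((k, X.shape[1]))
        for _ in range(iters):
            # Variance-scaled squared distance plus a log-variance penalty,
            # i.e. a diagonal-Gaussian assignment rather than plain Euclidean.
            d = ((X[:, None, :] - centers[None]) ** 2 / variances[None]
                 + np.log(variances[None])).sum(axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                pts = X[labels == j]
                if len(pts):
                    centers[j] = pts.mean(axis=0)
                    variances[j] = pts.var(axis=0) + 1e-6   # avoid zero variance
        return centers, variances, labels

    # Toy data: two clusters with very different spreads.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(3, 2.0, (100, 2))])
    centers, variances, labels = variance_aware_kmeans(X, k=2)
    print(centers.round(2), variances.round(2))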

Author 1: Tamer Tulgar
Author 2: Ali Haydar
Author 3: Ibrahim Ersan

Keywords: Classification; k-means; variance effect; big data

PDF

Paper 47: A Knowledge-based Topic Modeling Approach for Automatic Topic Labeling

Abstract: Probabilistic topic models, which aim to discover latent topics in text corpora, define each document as a multinomial distribution over topics and each topic as a multinomial distribution over words. Although humans can infer a proper label for each topic by looking at its top representative words, this is not feasible for machines. Automatic topic labeling techniques try to address this problem; their ultimate goal is to assign interpretable labels to the learned topics. In this paper, we take the concepts of an ontology into consideration, instead of words alone, to improve the quality of the generated labels for each topic. Our work differs from previous efforts in this area, where topics are usually represented by a batch of words selected from the topics. The main aspects of our approach are: 1) we incorporate ontology concepts with statistical topic modeling in a unified framework, where each topic is a multinomial probability distribution over concepts and each concept is represented as a distribution over words; and 2) a topic labeling model selects labels according to the meaning of the ontology concepts included in the learned topics. The best topic labels are selected with respect to the semantic similarity of the concepts and their ontological categorizations. We demonstrate the effectiveness of considering ontological concepts as a richer layer between topics and words through comprehensive experiments on two different data sets. In other words, representing topics via ontological concepts proves to be an effective way of generating descriptive and representative labels for the discovered topics.

Author 1: Mehdi Allahyari
Author 2: Seyedamin Pouriyeh
Author 3: Krys Kochut
Author 4: Hamid Reza Arabnia

Keywords: Topic modeling; topic labeling; statistical learning; ontologies; linked open data

PDF

Paper 48: A Comparative Study of Mamdani and Sugeno Fuzzy Models for Quality of Web Services Monitoring

Abstract: This paper presents a comparative study of fuzzy inference systems (FIS), comparing Mamdani and Sugeno FISs with respect to the accuracy and precision of quality of web service (QoWS) compliance monitoring. We used these two types of FIS to design the QoWS compliance monitoring model. A clustering validity index is used to optimize the number of clusters of both models, and both models are then constructed using the Fuzzy C-Means (FCM) clustering algorithm. Simulation results with a Mamdani model, a Sugeno model and a crisp-based benchmark model are presented. We consider different levels of noise (to represent uncertainties) in the simulations in order to compare and analyze the performance of the models when applied to QoWS compliance monitoring. The results show that the Sugeno FIS outperforms the Mamdani FIS in terms of accuracy and precision by producing better total error, error percentage, precision, mean squared error and root mean squared error measurements. The advantage of using a fuzzy-based model is also verified against the benchmark model.

Author 1: Mohd Hilmi Hasan
Author 2: Izzatdin Abdul Aziz
Author 3: Jafreezal Jaafar
Author 4: Lukman AB Rahim
Author 5: Joseph Mabor Agany Manyiel

Keywords: Quality of web service (QoWS) monitoring; fuzzy inference system; QoS

PDF

Paper 49: A P System for Solving All-Solutions of TSP

Abstract: The P system is a parallel computing system based on the membrane computing model. Since the calculation process of a P system is characterized by maximal parallelism and non-determinism, it has been used to solve NP-hard problems in polynomial time. This paper designs a P system for solving the TSP. The P system can not only determine whether the TSP instance has a solution, but also give all solutions when it does. Finally, an example is given to illustrate the feasibility and effectiveness of the P system designed in this paper.
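
Membrane-computing semantics cannot be shown in a few lines, so the sketch below only shows, sequentially, the task the P system parallelizes: enumerating every tour of a toy instance and keeping all tours whose cost does not exceed a threshold, thereby deciding whether a solution exists and listing all of them. The instance and threshold are invented.

    from itertools import permutations

    # Toy TSP instance: distance matrix and a cost threshold. The P system in the
    # paper explores candidate tours in parallel inside membranes; this sequential
    # sketch only reproduces the all-solutions enumeration it performs.
    D = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    THRESHOLD = 21

    def tour_cost(order):
        path = (0,) + order + (0,)
        return sum(D[a][b] for a, b in zip(path, path[1:]))

    solutions = [(0,) + p + (0,) for p in permutations(range(1, len(D)))
                 if tour_cost(p) <= THRESHOLD]
    print("has solution:", bool(solutions))
    for s in solutions:
        print(s, tour_cost(s[1:-1]))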

Author 1: Ping Guo
Author 2: Junqi Xiang
Author 3: Jingya Xie
Author 4: Jinhang Zheng

Keywords: P system; TSP; membrane computing; natural computing

PDF

Paper 50: Camera Calibration for 3D Leaf-Image Reconstruction using Singular Value Decomposition

Abstract: Features of leaves can be captured more precisely using 3D imaging. A 3D leaf image is reconstructed from two 2D images taken with stereo cameras. Reconstructing 3D from 2D images is not straightforward, and one of the important steps to improve accuracy is to perform camera calibration correctly. By calibrating the camera precisely, it is possible to project measurements of real-world distances onto the image plane. To maintain the accuracy of the reconstruction, the camera must also use correct parameter settings. This paper aims at designing a method to calibrate a camera to obtain its parameters and then using the method in the reconstruction of 3D images. Camera calibration is performed using region-based correlation methods, and several steps are necessary. First, the world coordinates and the 2D image coordinates are measured. The intrinsic and extrinsic camera parameters are then extracted using singular value decomposition. Using the available disparity image and the parameters obtained through camera calibration, 3D leaf-image reconstruction can finally be performed. Furthermore, the experimental depth-map reconstruction using the intrinsic parameters of the camera shows a rough surface, so a smoothing process is necessary to improve the depth map.
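
For the SVD step, the classic Direct Linear Transform is sketched below: the 3x4 projection matrix is recovered from 3D-2D correspondences as the right singular vector of the design matrix with the smallest singular value, and checked on synthetic points. The paper's region-based correlation step and the decomposition into intrinsic/extrinsic parameters are not reproduced, and the numeric values are made up.

    import numpy as np

    def estimate_projection_matrix(world_pts, image_pts):
        """Direct Linear Transform: recover the 3x4 camera matrix P from >= 6
        world/image correspondences using the SVD of the design matrix."""
        rows = []
        for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
        A = np.array(rows)
        _, _, vt = np.linalg.svd(A)
        # Right singular vector with the smallest singular value solves A p = 0.
        return vt[-1].reshape(3, 4)

    # Synthetic check: project known 3D points with a known camera, then recover it.
    P_true = np.array([[800, 0, 320, 10],
                       [0, 800, 240, 20],
                       [0, 0, 1, 5]], dtype=float)
    world = np.array([[0, 0, 1], [1, 0, 3], [0, 2, 2], [2, 1, 5],
                      [1, 3, 2], [3, 2, 4], [2, 2, 7], [4, 1, 3]], dtype=float)
    homog = np.hstack([world, np.ones((len(world), 1))])
    proj = (P_true @ homog.T).T
    image = proj[:, :2] / proj[:, 2:3]

    P_est = estimate_projection_matrix(world, image)
    # P is only defined up to scale; normalize both matrices for comparison.
    print(np.round(P_est / P_est[-1, -1], 2))
    print(np.round(P_true / P_true[-1, -1], 2))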

Author 1: Hermawan Syahputra
Author 2: Reza Pulungan

Keywords: Camera calibration; image reconstruction; 3D leaf images; singular value decomposition

PDF

Paper 51: Modeling and Simulation of the Effects of Social Relation and Emotion on Decision Making in Emergency Evacuation

Abstract: Applying agent-based modeling to simulate evacuation in emergency situations is recognized by many research works as an efficient tool for understanding the behavior and decision making of occupants in these situations. In this paper, we present our work aiming to model the influence of occupants' emotion and social relationships on their behaviors and decision making in emergencies such as fire disasters. Firstly, we propose a formalization of occupants' behavior at the group level in emergency situations based on social theory. This formalization details the possible behaviors and actions of people in emergency evacuations, taking into account occupants' social relationships, and it facilitates the construction of emergency evacuation simulations. Secondly, we model the influence of emotion and group behavior on the decision making of occupants in crisis situations. Thirdly, we develop an agent-based simulation that takes into account the effect of group and emotion on the decision making of occupants in emergency situations. We conducted a set of experiments allowing us to observe and analyze the behavior of people in emergency evacuation.

Author 1: Xuan Hien Ta
Author 2: Dominique Longin
Author 3: Benoit Gaudou
Author 4: Tuong Vinh Ho
Author 5: Manh Hung Nguyen

Keywords: Agent-based simulation; emotion; social relation; emergency evacuation

PDF

Paper 52: Design Patterns and General Video Game Level Generation

Abstract: Design patterns have become a vital solution for a number of problems in software engineering. In this paper, we have performed a rhythmic analysis of the General Video Game Level Generation (GVG-LG) framework and discerned 23 common design patterns. In addition, we have segregated the identified patterns into four unique classes; the categorization is based on the usage of the identified patterns in game levels. Our future aim is to employ these patterns as input for a search-based level generator.

Author 1: Mudassar Sharif
Author 2: Adeel Zafar
Author 3: Uzair Muhammad

Keywords: General video game level generation; rhythmic analysis; procedural content generation; design pattern; search based level generator

PDF

Paper 53: Person re-ID while Crossing Different Cameras: Combination of Salient-Gaussian Weighted BossaNova and Fisher Vector Encodings

Abstract: Person re-identification (re-ID) is a challenging task in the camera surveillance field, since it addresses the problem of re-identifying people across multiple non-overlapping cameras. Most existing approaches have concentrated on: 1) achieving a robust and effective feature representation; and 2) enforcing discriminative metric learning to predict whether two images represent the same identity. In this context, we present a new approach for person re-ID built upon multi-level descriptors. This is achieved by combining three complementary representations: the salient-Gaussian Fisher Vector (SGFV) encoding method, the salient-Gaussian BossaNova (SGBN) histogram encoding method and deep Convolutional Neural Network (CNN) features. The first two methods adapt the histogram encoding framework to the person re-ID task by integrating the pedestrian saliency map and the spatial location information into the histogram encoding process. On the one hand, human saliency is reliable and distinctive in the person re-ID task, since it can model the uniqueness of the identity; on the other hand, localizing a person in the image can effectively discard noisy background information. Finally, one of the most advanced metric learning methods in person re-ID, the Cross-view Quadratic Discriminant Analysis (XQDA), is applied on top of the resulting description. The proposed method yields promising person re-ID results on two challenging image-based person re-ID benchmarks: CUHK03 and Market-1501.

Author 1: Mahmoud Mejdoub
Author 2: Salma Ksibi
Author 3: Chokri Ben Amar
Author 4: Mohamed Koubaa

Keywords: Person re-identification; histogram encoding; fisher vector; BossaNova; Convolutional Neural Network (CNN); salient weight; Gaussian weight

PDF

Paper 54: DBpedia based Ontological Concepts Driven Information Extraction from Unstructured Text

Abstract: In this paper, a knowledge-base concept-driven named entity recognition (NER) approach is presented. The technique is used to extract information from news articles and link it to background concepts in a knowledge base. The work specifically focuses on extracting entity mentions from unstructured articles. The extraction of entity mentions is based on existing concepts from the DBpedia ontology, which represents the knowledge associated with the concepts present in the Wikipedia knowledge base. A collection of Wikipedia concepts has been extracted through the structured DBpedia ontology. For the processing of unstructured text, Dawn news articles have been scraped and preprocessed, and a corpus has thereby been built. Given an article, the proposed knowledge-base-driven system identifies the entity mentions in the text and automatically links them to the concepts representing their respective pages on Wikipedia. The system is evaluated on three test collections of news articles covering the politics, sports and entertainment domains. The experimental results for entity mentions are reported as precision, recall and f-measure, where the precision of extracting relevant entity mentions yields the best results with little variation in recall and f-measure. Additionally, facts associated with the extracted entity mentions are presented both as sentences and as Resource Description Framework (RDF) triples, so as to enhance the user's understanding of the related facts presented in the article.
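
The sketch below is only a toy, gazetteer-style illustration of mention extraction and linking: known surface forms are matched in a sentence and emitted as RDF-style triples pointing at DBpedia-style URIs. The surface forms, URIs and predicate are invented; the paper's ontology extraction and Dawn corpus are not reproduced.

    import re

    # Tiny "knowledge base": surface forms mapped to concept URIs (illustrative
    # values written in the style of DBpedia resources).
    gazetteer = {
        "Imran Khan": "http://dbpedia.org/resource/Imran_Khan",
        "Lahore": "http://dbpedia.org/resource/Lahore",
        "cricket": "http://dbpedia.org/resource/Cricket",
    }

    article = ("Imran Khan addressed a rally in Lahore on Sunday and later "
               "spoke about the future of cricket in the country.")

    triples = []
    for surface, uri in gazetteer.items():
        for m in re.finditer(re.escape(surface), article):
            # (mention, predicate, linked concept) in RDF-triple style,
            # together with the character offset of the mention.
            triples.append((surface, "rdfs:seeAlso", uri, m.start()))

    for t in triples:
        print(t)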

Author 1: Adeel Ahmed
Author 2: Syed Saif ur Rahman

Keywords: Ontology-based information extraction; semantic web; named entity recognition; entity linking

PDF

Paper 55: Classification of Human Emotions from Electroencephalogram (EEG) Signal using Deep Neural Network

Abstract: The estimation of human emotions from Electroencephalogram (EEG) signals plays a vital role in developing robust Brain-Computer Interface (BCI) systems. In our research, we used a Deep Neural Network (DNN) to address EEG-based emotion recognition, motivated by the recent gains in accuracy and efficiency from applying deep learning techniques in pattern recognition and classification applications. We adapted a DNN to identify the human emotions of a given EEG signal (DEAP dataset) from power spectral density (PSD) and frontal asymmetry features. The proposed approach is compared to state-of-the-art emotion detection systems on the same dataset. The results show how EEG-based emotion recognition can greatly benefit from using DNNs, especially when a large amount of training data is available.
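
A minimal stand-in for the classifier is sketched below using scikit-learn's MLPClassifier on random feature vectors in place of the DEAP PSD/asymmetry features; the layer sizes are arbitrary and do not reflect the paper's architecture.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in feature matrix: one row per EEG trial; in the real pipeline the
    # columns would hold PSD and frontal-asymmetry features, and the labels
    # would be the DEAP emotion classes.
    X = rng.normal(size=(400, 32))
    y = (X[:, :4].sum(axis=1) + 0.3 * rng.normal(size=400) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    # A small fully connected network; depth and width chosen arbitrarily here.
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", round(clf.score(X_te, y_te), 3))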

Author 1: Abeer Al-Nafjan
Author 2: Manar Hosny
Author 3: Areej Al-Wabil
Author 4: Yousef Al-Ohali

Keywords: Electroencephalogram (EEG); Brain-Computer Interface (BCI); emotion recognition; affective state; Deep Neural Network (DNN); DEAP dataset

PDF

Paper 56: A Proposed Framework for Generating Random Objective Exams using Paragraphs of Electronic Courses

Abstract: Objective exams (OE) play a major role in educational assessment as well as in electronic learning. The main problem with the traditional exam system is the low quality of questions caused by human factors; for example, the traditional method of exam development covers only a narrow scope of curriculum topics and does nothing to separate the teaching process from the examination process. In this study, we present a framework that generates three types of objective exam questions (multiple choice questions (MCQ), true/false questions (T/FQ), and completion questions (CQ)) from the paragraphs of an electronic course. The proposed framework consists of several main stages: it uses natural language processing (NLP) techniques to generate the three types of questions (GFQ, T/FQ, and MCQ), and an exam maker (EM) that uses the generated questions to produce the objective exams. The proposed system was evaluated by the extent of its ability to generate multiple objective questions. The questions generated by the proposed system were presented to three arbitrators specializing in the field of computer networks, who gave their opinions on how closely the questions relate to the e-course and on the accuracy of their linguistic and scientific formulation. The results of the study showed an increase in the accuracy and number of the objective exams generated by the proposed system compared to those created by the traditional system, which demonstrates the efficiency of the proposed system.
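
The sketch below is a toy illustration of generating a completion, a true/false and a multiple-choice question from one course sentence by blanking or swapping a key term; the term list and distractors are hand-written here, whereas the paper derives them with its NLP stages.

    import random

    random.seed(0)

    sentence = "A router forwards packets between different networks based on IP addresses."
    key_terms = ["router", "packets", "IP addresses"]     # would come from NLP term extraction
    distractors = {"router": ["switch", "hub"], "packets": ["frames", "segments"],
                   "IP addresses": ["MAC addresses", "port numbers"]}

    term = random.choice(key_terms)

    # Completion question (CQ / gap-fill): blank out the chosen term.
    cq = sentence.replace(term, "________")
    print("CQ:", cq, "   answer:", term)

    # True/False question: either keep the sentence or swap in a distractor.
    make_false = random.random() < 0.5
    tf_sentence = sentence.replace(term, random.choice(distractors[term])) if make_false else sentence
    print("T/FQ:", tf_sentence, "   answer:", not make_false)

    # MCQ: the blanked sentence plus the correct answer shuffled among distractors.
    options = [term] + distractors[term]
    random.shuffle(options)
    print("MCQ:", cq, "   options:", options, "   answer:", term)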

Author 1: Elsaeed E. AbdElrazek

Keywords: Objective exams (OE); Applications Artificial Intelligence (AAI); Random Objective Exams Generation (ROEG)

PDF

Paper 57: Defense against SYN Flood Attack using LPTR-PSO: A Three Phased Scheduling Approach

Abstract: Security has become a critical factor in today's computing systems. The security threats that put our confidential information at risk can come in the form of seemingly legitimate client requests to a server. When illegitimate requests consume the connections a server can handle, no valid new connections can be made. This scenario, known as a SYN-flooding attack, can be controlled through a fair scheduling algorithm that gives more opportunity to legal requests. This paper proposes a detailed scheduling approach named Largest Processing Time Rejection-Particle Swarm Optimization (LPTR-PSO) that defends the server against SYN-flood attack scenarios of varying intensity through a three-phased algorithm. This novel approach considers the number of half-open connections in the server buffer and chooses a phase accordingly. The simulation results show that the proposed defense strategy improves the performance of the system under attack in terms of the memory occupancy of legal requests and the residence time of attack requests.
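
Since the abstract describes a buffer-occupancy-driven, three-phase policy without giving the thresholds, the sketch below only illustrates that control structure plus a largest-processing-time ordering of the backlog; the thresholds, phase names and the omitted PSO component are assumptions, not the paper's actual design.

    def choose_phase(buffer_used, buffer_size):
        """Pick a defense phase from the half-open-connection occupancy
        (thresholds are illustrative, not the paper's)."""
        load = buffer_used / buffer_size
        if load < 0.5:
            return "normal"            # accept requests as they arrive
        if load < 0.85:
            return "lpt"               # rank the backlog by processing time
        return "reject-suspicious"     # aggressively drop likely attack half-opens

    def lpt_order(requests):
        """Largest Processing Time first over (request_id, processing_time) pairs;
        the paper's LPT-based rejection/scheduling decision would act on this ranking."""
        return sorted(requests, key=lambda r: r[1], reverse=True)

    backlog = [("r1", 0.2), ("r2", 1.5), ("r3", 0.7), ("r4", 2.1)]
    phase = choose_phase(buffer_used=70, buffer_size=100)
    print(phase)
    if phase == "lpt":
        print(lpt_order(backlog))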

Author 1: Zonayed Ahmed
Author 2: Maliha Mahbub
Author 3: Sultana Jahan Soheli

Keywords: SYN flood; LPTR-PSO; three-phased algorithm; legal request; buffer

PDF

Paper 58: Secure Device Pairing Methods: An Overview

Abstract: The procedure of setting up a secure communication channel among unfamiliar human-operated devices is called "secure device pairing". Secure binding of electronic devices is a challenging task because there are no prior security measures or commonly trusted infrastructure, which opens the door to many security threats and attacks, e.g. man-in-the-middle and evil twin attacks. In order to mitigate these attacks, different techniques have been proposed; some level of user participation is required to reduce attacks in the device pairing process. A comparative and comprehensive evaluation of prominent secure device pairing methods is presented here. The main motive of this research is to summarize the cryptographic protocols used in the pairing process and to compare the existing methods for securing pairing devices. This will help in selecting the best method for a given situation, rather than always choosing the most popular or easiest method, since different methods suit different circumstances.

Author 1: Aatifah Noureen
Author 2: Umar Shoaib
Author 3: Muhammad Shahzad Sarfraz

Keywords: Device pairing methods; binding method; OOB channel; cryptographic protocols

PDF

Paper 59: A Proposed Approach for Image Compression based on Wavelet Transform and Neural Network

Abstract: Over recent years, wavelet theory has been used with great success in a wide range of applications such as signal de-noising and image compression. An ideal image compression system must yield a high-quality compressed image with a high compression ratio. This paper attempts to find the most useful wavelet function for compressing an image among the existing members of the wavelet families. Our idea is to train a backpropagation neural network to select the suitable wavelet function between two families, orthogonal (Haar) and biorthogonal (bior4.4), to be used to compress an image efficiently and accurately with an ideal and optimum compression ratio. The simulation results indicate that the proposed technique can achieve good compressed images in terms of peak signal-to-noise ratio (PSNR) and compression ratio (t), in comparison with a random selection of the mother wavelet.
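
For the compression metrics mentioned above, the sketch below applies one level of an orthonormal Haar transform (one of the two candidate wavelets) to a stand-in image, discards small coefficients, and reports PSNR and the resulting compression ratio; the neural network that chooses between Haar and bior4.4 is not shown, and the image and threshold are placeholders.

    import numpy as np

    def haar2d(img):
        """One level of a 2D orthonormal Haar transform (rows, then columns)."""
        def step(x):                      # pairwise averages and differences along axis 0
            a = (x[0::2] + x[1::2]) / np.sqrt(2)
            d = (x[0::2] - x[1::2]) / np.sqrt(2)
            return np.vstack([a, d])
        return step(step(img).T).T

    def ihaar2d(coef):
        def istep(y):
            n = y.shape[0] // 2
            a, d = y[:n], y[n:]
            x = np.empty_like(y)
            x[0::2] = (a + d) / np.sqrt(2)
            x[1::2] = (a - d) / np.sqrt(2)
            return x
        return istep(istep(coef.T).T)

    def psnr(orig, rec):
        mse = np.mean((orig - rec) ** 2)
        return 10 * np.log10(255.0 ** 2 / mse)

    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in for a real image

    coef = haar2d(image)
    kept = np.where(np.abs(coef) > 40, coef, 0.0)           # discard small coefficients
    reconstructed = ihaar2d(kept)

    compression_ratio = coef.size / np.count_nonzero(kept)
    print("PSNR: %.2f dB, compression ratio: %.2f"
          % (psnr(image, reconstructed), compression_ratio))

A real image compresses far better than this random stand-in, since its detail coefficients are mostly small; the comparison between Haar and bior4.4 on such metrics is what the paper's neural network is trained to predict.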

Author 1: Houda Chakib
Author 2: Brahim Minaoui
Author 3: Mohamed Fakir
Author 4: Abderrahim Salhi
Author 5: Imad Badi

Keywords: Haar wavelet transform; biorthogonal wavelet; backpropagation neural network; scaled conjugate gradient algorithm

PDF
