IJACSA Volume 10 Issue 7

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.


Paper 1: Hybrid Geo-Location Routing Protocol for Indoor and Outdoor Positioning Applications

Abstract: The Internet of Things (IoT) essentially demands smart connectivity and contextual awareness of current networks with low-power, cost-effective wireless solutions. Routing is the backbone of the system, controlling the flow of transmission. This work compares the performance of the location-based routing protocol Geocast Adaptive Mesh Environment for Routing (GAMER) when its contextual information is collected from the Global Positioning System (GPS) and from the Framework for Internal Navigation and Discovery (FIND), respectively. The systems are evaluated on several metrics, namely accuracy, packet delivery ratio and packet overhead, by means of Network Simulator 2 (NS-2). FIND shows enhanced performance in most cases compared to GPS for indoor and outdoor environments. The results of this research can be deployed in different areas such as in-building navigation, hospital patient tracking, Smart City context-aware service provisioning and Industry 4.0 deployments.

Author 1: Sania Mushtaq
Author 2: Dr. Gasim Alandjani
Author 3: Saba Farooq Abbasi
Author 4: Dr. Nasser Abosaq
Author 5: Dr. Adeel Akram
Author 6: Dr. Shahbaz Pervez

Keywords: Internet of Things (IoT); Network Simulator (NS-2); Routing; Geocast Adaptive Mesh Environment for Routing (GAMER); Mobility; Global Positioning System (GPS); Framework for Internal Navigation and Discovery (FIND)

PDF

Paper 2: Phishing Websites Detection using Machine Learning

Abstract: Tremendous resources are spent by organizations guarding against and recovering from cybersecurity attacks by online hackers who gain access to sensitive and valuable user data. Many cyber infiltrations are accomplished through phishing attacks where users are tricked into interacting with web pages that appear to be legitimate. In order to successfully fool a human user, these pages are designed to look like legitimate ones. Since humans are so susceptible to being tricked, automated methods of differentiating between phishing websites and their authentic counterparts are needed as an extra line of defense. The aim of this research is to develop these methods of defense utilizing various approaches to categorize websites. Specifically, we have developed a system that uses machine learning techniques to classify websites based on their URL. We used four classifiers: the decision tree, Naïve Bayesian classifier, support vector machine (SVM), and neural network. The classifiers were tested with a data set containing 1,353 real world URLs where each could be categorized as a legitimate site, suspicious site, or phishing site. The results of the experiments show that the classifiers were successful in distinguishing real websites from fake ones over 90% of the time.

Author 1: Arun Kulkarni
Author 2: Leonard L. Brown III

Keywords: Phishing websites; classification; features; machine learning

PDF
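
The pipeline sketched below is a minimal, hypothetical illustration of the approach the abstract describes: the four classifier families trained on URL-derived features. It assumes the features have already been extracted into a numeric matrix X with labels y (legitimate, suspicious, phishing); it is not the authors' code.

```python
# Illustrative sketch only: assumes URL features are already extracted into a
# numeric matrix X with labels y in {legitimate, suspicious, phishing}.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

def compare_classifiers(X, y, seed=42):
    """Train the four classifier families named in the abstract and report test accuracy."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed, stratify=y)
    models = {
        "decision tree": DecisionTreeClassifier(random_state=seed),
        "naive Bayes": GaussianNB(),
        "SVM": SVC(kernel="rbf"),
        "neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=seed),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(f"{name}: {accuracy_score(y_te, model.predict(X_te)):.3f}")
```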

Paper 3: Analysis of Software Deformity Prone Datasets with Use of AttributeSelectedClassifier

Abstract: Software deformity-prone (defect-prone) datasets and models are an interesting research direction in the software world. In this study, the class of interest is the defective module in defect-prone datasets. There are different techniques for predicting defect-prone modules. Our proposed technique applies the AttributeSelectedClassifier, with selected attribute evaluators and search methods, to reduce the dimensionality of the training and testing data drawn from the NASA defect datasets by performing attribute selection before the data are passed to the classifiers. We used three evaluators, CFSSubsetEval, GainRatio and Principal Component Analysis (PCA), together with the BestFirst and Ranker search methods, and analyzed the performance of these evaluators and search methods with 12 different classifiers. The experimental results are measured in terms of true positive rate (TP rate), positive accuracy, area under the ROC curve and correctly classified instances. The results show that CFSSubsetEval and GainRatio perform better with almost all classifiers. Hoeffding Tree, Naive Bayes, Multiclass, IBk and Randomizable Filtered classifiers increased positive accuracy under all techniques, whereas Stacking showed the worst positive accuracy and TP rate across all techniques.

Author 1: Maaz Rasheed Malik
Author 2: Liu Yining
Author 3: Salahuddin Shaikh

Keywords: GainRatio; CFSSubsetEval; PCA; classification; defect prediction; deformity prone; defect model; classifier; bug model; software; AttributeSelectedClassifier

PDF
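
Weka's AttributeSelectedClassifier wraps attribute selection around a base classifier so that dimensionality is reduced before classification. The sketch below is a rough scikit-learn analogue of that idea only; CFSSubsetEval, GainRatio and BestFirst have no direct scikit-learn equivalents, so mutual-information ranking and PCA are used here as stand-ins.

```python
# Rough scikit-learn analogue (not Weka itself) of wrapping attribute selection
# around a classifier, in the spirit of AttributeSelectedClassifier.
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def build_pipelines(k=10, n_components=5):
    rank_then_classify = Pipeline([
        ("select", SelectKBest(mutual_info_classif, k=k)),  # stand-in for GainRatio + Ranker
        ("clf", GaussianNB()),
    ])
    pca_then_classify = Pipeline([
        ("pca", PCA(n_components=n_components)),            # stand-in for the PCA evaluator
        ("clf", GaussianNB()),
    ])
    return rank_then_classify, pca_then_classify

# Usage (X, y are a NASA defect dataset loaded elsewhere):
# for name, pipe in zip(["MI+NB", "PCA+NB"], build_pipelines()):
#     print(name, cross_val_score(pipe, X, y, cv=10).mean())
```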

Paper 4: Detecting Inter-Component Vulnerabilities in Event-based Systems

Abstract: Event-based systems (EBS) have become popular because of their high flexibility, scalability, and adaptability. These advantages are enabled by their communication mechanism: implicit invocation and implicit concurrency between components. The communication mechanism is based on non-determinism in event processing, which can introduce inherent security vulnerabilities into a system, referred to as event attacks. An event attack is a particular type of attack that can abuse, incapacitate, and damage a target system by exploiting the system's event-based communication model. It is hard to prevent event attacks because they are administered in a way that generally does not differ from ordinary event-based communication. While a number of techniques have focused on security threats in EBS, they do not appropriately resolve the event attack issues or they suffer from inaccuracy in detecting and preventing event attacks. To address the risk of event attacks, I present a novel vulnerability detection technique for EBSs that are implemented using a message-oriented middleware platform. My technique has been evaluated on 25 open-source benchmark apps and eight real-world EBSs. The evaluation exhibited my technique's higher accuracy in detecting vulnerabilities to event attacks compared with existing techniques, as well as its applicability to real-world EBSs.

Author 1: Youn Kyu Lee

Keywords: Event-based system; program analysis; software security

PDF

Paper 5: Towards the Adoption of Smart Manufacturing Systems: A Development Framework

Abstract: Today, a new era of manufacturing innovation is being introduced as Smart Manufacturing Systems (SMS), or Industry 4.0. Many studies have discussed the different characteristics and technologies associated with SMS; however, little attention has been devoted to studying the development process when establishing a new SMS. The study's objective is to propose a development framework that increases the adoption and awareness of Industry 4.0 among manufacturers and aids decision-makers in designing better SMS capabilities. The framework consists of three phases: an iterative process of application modelling; evaluation to ensure optimal configuration and adoption; and finally, implementation. The proposed framework is intended to assist industry management in planning for the adoption of the technology, whether establishing a new SMS or assessing the needs of an existing one. Indirectly, more industries will gain benefits in support of their initiatives to transform into Industry 4.0.

Author 1: Moamin A Mahmoud
Author 2: Jennifer Grace

Keywords: Component; smart manufacturing; industry 4.0; development framework; simulation-based evaluation

PDF

Paper 6: Smart City Parking Lot Occupancy Solution

Abstract: In the context of Smart City projects, the management of parking lots is one of the main concerns of local administrations and of industrial solution providers. In this respect, we present an image processing application that overcomes the issues of classical electro-mechanical solutions and employs the feed of a surveillance camera. The final web-based interface provides clients with the real-time availability and position of parking spaces. The proposed method uses a series of feature measures in order to speed up processing and accurately classify the occupancy of each space. On a published benchmark, our method has proved to provide very accurate results, and it has been extensively tested on two proprietary parking locations.

Author 1: Paula Tatulea
Author 2: Florina Calin
Author 3: Remus Brad
Author 4: Lucian Brâncovean
Author 5: Mircea Greavu

Keywords: Smart city; car parking; occupancy; monitoring system; image features

PDF

Paper 7: FLACC: Fuzzy Logic Approach for Congestion Control

Abstract: The popularity of network applications has increased the number of packets travelling through the routers in networks. This traffic consumes most of the resources in such networks and consequently leads to congestion, which worsens network performance measures such as delay, packet loss and bandwidth. This study proposes a new method called Fuzzy Logic Approach for Congestion Control (FLACC), which uses fuzzy logic to decrease delay and packet loss. This method also improves network performance. In addition, FLACC employs average queue length (aql) and packet loss (PL) as input linguistic variables to control congestion at an early stage. In this study, the proposed and compared methods were simulated and evaluated. Results reveal that fuzzy-logic Gentle Random Early Detection (FLGRED) showed better performance than Gentle Random Early Detection (GRED) and GRED Fuzzy Logic in terms of delay and packet loss, particularly when the router buffer was heavily congested.

Author 1: Mahmoud Baklizi

Keywords: Congestion; Network Result Performance; GREDFL

PDF
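
To make the idea of fuzzy-logic congestion control concrete, the sketch below shows a tiny zero-order Sugeno-style rule base that maps normalised average queue length and packet loss onto a drop probability. The membership functions and rules are illustrative assumptions, not the FLACC rule base from the paper.

```python
# Minimal fuzzy-control sketch (not the paper's FLACC rules): inputs aql and PL
# are assumed normalised to [0, 1]; output is a packet-drop probability.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return float(np.clip(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0, 1.0))

def drop_probability(aql, pl):
    # Fuzzify the two inputs into low / medium / high sets.
    low_q, med_q, high_q = tri(aql, -0.5, 0.0, 0.5), tri(aql, 0.0, 0.5, 1.0), tri(aql, 0.5, 1.0, 1.5)
    low_l, high_l = tri(pl, -0.5, 0.0, 0.6), tri(pl, 0.4, 1.0, 1.5)
    # Rule strengths (AND = min) with crisp consequents (zero-order Sugeno style).
    rules = [
        (min(low_q, low_l), 0.0),    # light load        -> do not drop
        (min(med_q, low_l), 0.1),    # queue building up -> drop a little
        (min(high_q, low_l), 0.5),   # long queue        -> drop moderately
        (min(high_q, high_l), 0.9),  # heavy congestion  -> drop aggressively
    ]
    weights = sum(w for w, _ in rules) + 1e-9
    return sum(w * out for w, out in rules) / weights  # weighted-average defuzzification

print(drop_probability(aql=0.8, pl=0.6))  # high drop probability under heavy congestion
```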

Paper 8: Measuring the Effect of Packet Corruption Ratio on Quality of Experience (QoE) in Video Streaming

Abstract: The volume of Internet video traffic, which consists of video downloaded or streamed from the Internet, is projected to increase from 42,029 PB per month in 2016 to 159,161 PB per month in 2021, a Compound Annual Growth Rate (CAGR) of about 31%. The market for mobile network operators is unpredictable, fast-paced and very competitive, and end users now have more options when choosing service providers. With superior network Quality of Experience (QoE), service providers can increase margins by charging more for better quality. Packet corruption occurs when the receiver cannot correctly decode transmitted bits. This study identified the threshold at which the QoE of video streaming services becomes unacceptable due to the effect of packet corruption. In this paper, several experiments were carried out on video streaming services, creating disturbances to evaluate the user satisfaction level using mean opinion scores. The Network Emulator (NetEm) tool was used to create the packet corruption experienced during the video sessions, and the QoE for different packet corruption percentages was established. From the experiments conducted, we found that user QoE decreased as the Packet Corruption Ratio (PCR) increased. With knowledge of the effect of the PCR, service providers can ensure that the PCR is kept within acceptable limits end-to-end; this will ultimately lead to superior QoE for end users, which will in turn translate into an improved subscriber base and profitability.

Author 1: Jonah Joshua
Author 2: Akpovi Ominike
Author 3: Oludele Awodele
Author 4: Achimba Ogbonna

Keywords: Network Emulator; Packet Corruption Ratio (PCR); Quality of Experience (QoE)

PDF

Paper 9: Vision based Indoor Localization Method via Convolution Neural Network

Abstract: Existing indoor localization methods have bottleneck constraints, such as the multipath effect for Wi-Fi-based methods, high cost for ultra-wideband-based methods and poor interference resistance for Bluetooth-based methods. In order to avoid these problems, a vision-based indoor localization method is proposed. Firstly, the whole deployment environment is divided into several regions and each region is assigned a location center. Then, in offline mode, the VGG16NET is pre-trained on the ImageNet dataset and fine-tuned on images from a custom dataset for indoor localization. In online mode, the fully trained and converged VGG16NET takes as input a video stream captured by the front RGB camera of a mobile robot and outputs features specific to the current location. The features are then used as input to an ArcFace classifier, which outputs the current location of the mobile robot. Experimental results show that our method can accurately estimate the location of a mobile object with imaging capability in cluttered, unstructured scenes without any additional device. The localization accuracy reaches 94.7%.

Author 1: Zeyad Farisi
Author 2: Tian Lianfang
Author 3: Li Xiangyang
Author 4: Zhu Bin

Keywords: Indoor localization; VGG16NET; transfer learning; ArcFace classifier

PDF
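
A minimal sketch of the offline transfer-learning stage described above, using tf.keras. The ArcFace classifier is replaced here by an ordinary softmax head, and the number of location regions is assumed, so this is only an approximation of the authors' pipeline.

```python
# Sketch of VGG16 transfer learning for region classification; ArcFace head is
# simplified to a softmax layer, and NUM_REGIONS is an assumed value.
import tensorflow as tf

NUM_REGIONS = 8          # assumed number of location regions/centres
IMG_SHAPE = (224, 224, 3)

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False   # offline stage: keep the ImageNet features frozen

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(NUM_REGIONS, activation="softmax"),  # predicts the region label
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Online stage: classify each frame of the robot's RGB video stream.
# region_id = model.predict(frame_batch).argmax(axis=-1)
```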

Paper 10: Fixation Detection with Ray-casting in Immersive Virtual Reality

Abstract: This paper demonstrates the application of a proposed eye fixation detection algorithm to eye movements recorded during eye gaze input within immersive Virtual Reality, and compares it with the standard frame-by-frame analysis for validation. Pearson correlations and a paired-sample t-test indicated strong correlations between the two analysis methods in terms of fixation duration. The results show that the principle of eye movement event detection in 2D can be applied successfully in a 3D environment and ensures efficient detection when combined with ray-casting and event time.

Author 1: Najood Alghamdi
Author 2: Wadee Alhalabi

Keywords: Eye Movement; Eye Tracking; Virtual Reality; Fixation Detection; HMD

PDF
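
For readers unfamiliar with 2D fixation detection, the sketch below implements one common scheme, the dispersion-threshold (I-DT) algorithm, on raw gaze samples. It omits the ray-casting and 3D aspects that the paper adds, and the thresholds are placeholders.

```python
# Generic dispersion-threshold (I-DT) fixation detection sketch; thresholds are
# placeholders (dispersion in whatever unit the gaze samples use, duration in s).
def dispersion(points):
    xs, ys = zip(*points)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, timestamps, max_dispersion=1.0, min_duration=0.1):
    """gaze: list of (x, y) samples; timestamps: seconds. Returns (start_t, end_t) pairs."""
    fixations, start, n = [], 0, len(gaze)
    while start < n:
        end = start
        # Grow the window until it spans at least the minimum fixation duration.
        while end + 1 < n and timestamps[end] - timestamps[start] < min_duration:
            end += 1
        if (timestamps[end] - timestamps[start] >= min_duration
                and dispersion(gaze[start:end + 1]) <= max_dispersion):
            # Keep growing while the gaze points stay spatially compact.
            while end + 1 < n and dispersion(gaze[start:end + 2]) <= max_dispersion:
                end += 1
            fixations.append((timestamps[start], timestamps[end]))
            start = end + 1
        else:
            start += 1
    return fixations
```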

Paper 11: Traceability Establishment and Visualization of Software Artefacts in DevOps Practice: A Survey

Abstract: DevOps-based software processes have become popular with the vision of an effective collaboration between development and operations teams that continuously integrates frequent changes. Traceability manages artefact consistency during a software process. This paper explores trace-link creation and visualization between software artefacts, existing tool support, quality aspects and applicability in a DevOps environment. As the novelty of this study, we identify the challenges that limit traceability considerations in DevOps and suggest research directions. Our methodology consists of concept identification, state-of-practice exploration and analytical review. Despite the existing related work, there is a lack of tool support for traceability management between heterogeneous artefacts in software development with DevOps practices. Although many existing studies have low industrial relevance, a few proprietary traceability tools have shown high relevance. The lack of evidence of related applications indicates the need for a generalized traceability approach. Accordingly, we conclude that software artefact traceability is maturing and being applied collaboratively in the software process. This work can be extended to explore features such as artefact change impact analysis, change propagation and continuous integration to improve software development in DevOps environments.

Author 1: D A Meedeniya
Author 2: I. D. Rubasinghe
Author 3: I. Perera

Keywords: Software traceability; visualization; comparative study; DevOps; continuous software development

PDF

Paper 12: Blood Diseases Detection using Classical Machine Learning Algorithms

Abstract: Blood analysis is an essential indicator for many diseases; it contains several parameters that are signs of specific blood diseases. To predict a disease from a blood analysis, the patterns that lead to identifying the disease precisely should be recognized. Machine learning is the field responsible for building models that predict an output based on previous data. The accuracy of machine learning algorithms depends on the quality of the data collected for the learning process; this research presents a novel benchmark data set that contains 668 records. The data set was collected from highly trusted sources and verified by expert physicians. Several classical machine learning algorithms were tested and achieved promising results.

Author 1: Fahad Kamal Alsheref
Author 2: Wael Hassan Gomaa

Keywords: Machine learning; classification algorithms; decision trees; KNN; k-means; blood disease

PDF

Paper 13: A Mobile Robot Teleoperation System with a Wireless Communication Infrastructure using a Leaky Coaxial Cable based on TCP/IP

Abstract: In this study, we propose and develop a wireless teleoperation system for mobile robots using a leaky coaxial cable (LCX) with a wireless communication infrastructure. In closed spaces resulting from disasters, some problems have been reported, such as cable entanglement, disconnection of wired communications, and problems with teleoperations (e.g., unstable communication quality for wireless communications). In this paper, we propose a communication infrastructure system for teleoperation of a mobile robot using LCXs as a communication infrastructure that considers the above issues. In addition, the communication quality was measured for the operability of the mobile robot by constructing an IEEE 802.11b/g/n network using an LCX, and the effectiveness of the proposed system in an actual environment was confirmed. In the evaluation of the communication quality, bandwidth compression throughput values and packet jitter were measured as evaluation items at the packet level to objectively consider the teleoperation controllability.

Author 1: Kei Sawai
Author 2: Satoshi Aoyama
Author 3: Tatsuo Motoyoshi
Author 4: Toru Oshima
Author 5: Ken’ichi Koyanagi
Author 6: Hiroyuki Masuta
Author 7: Takumi Tamamoto

Keywords: Mobile robot; teleoperation; leaky coaxial cable; teleoperation infrastructure

PDF

Paper 14: Communication Disconnection Prevention System by Bandwidth Depression-Type Traffic Measurement in a Multi-Robot Environment using an LCX Network

Abstract: In this paper, we propose and develop a method for determining the transmission amount of each mobile robot connected to a network constructed with a leaky coaxial cable (LCX), by using broadcast packets. Tele-operation of mobile robots over an LCX network is more effective as an information collection method in closed spaces than existing methods, in terms of maintaining the mobile robots' running performance and the stability of the communication quality for disaster reduction activities. However, when the transmission and reception of information exceed the maximum transmission amount, communication disconnection and transmission amount reduction occur because of band division in the communication path, and there is a risk that mobile robots will be separated from the LCX network. Therefore, to prevent network division and the decrease of the transmission amount during multi-robot operation on an LCX network, we propose a method for determining the transmission amount of each mobile robot using broadcast packets. The proposed method is evaluated on an LCX network, and its effectiveness is confirmed by evaluating the transmittability of broadcast packets and the operability of the mobile robots.

Author 1: Kei Sawai
Author 2: Satoshi Aoyama
Author 3: Takumi Tamamoto
Author 4: Tatsuo Motoyoshi
Author 5: Hiroyuki Masuta
Author 6: Ken’ichi Koyanagi
Author 7: Toru Oshima

Keywords: Multi-robot; tele-operation; leaky coaxial cable; LCX networks; operability; broadcast packets; transmittability of broadcast packets; network disconnection prevention; disaster reduction activity

PDF

Paper 15: Developing an Integrated Cloud-based Framework for Securing Dataflow of Wireless Sensors

Abstract: The cloud computing environment has developed rapidly and has become a popular trend in recent years. It provides on-demand services to several applications with access to an unlimited number of resources such as servers, storage and networks. Wireless sensor networks, on the other hand, have progressed enormously in various applications and produce a considerable amount of sensor data. Sensor networks are based on groups of interconnected, small-size sensor nodes that can be distributed over different geographical areas to observe environmental and physical phenomena. Nevertheless, they have limitations concerning power, storage and scalability that need to be addressed adequately. Integrating wireless sensor networks with cloud computing can overcome these problems. Cloud computing provides a more secure and highly available platform for effective management of sensor data. This paper proposes a framework to secure the dataflow of sensor devices from wireless sensor networks to cloud computing in an integrated environment. The framework presents an authentication scheme to validate the identity of sensor devices connected to the cloud environment. Furthermore, it provides a secure environment with high availability and data integrity.

Author 1: Habibah AL-Harbi
Author 2: Khalil H. A. Al-Shqeerat
Author 3: Ibrahim S. Alsukayti

Keywords: Framework; security; wireless sensor; cloud computing; data integrity; availability

PDF

Paper 16: Cancer Classification from DNA Microarray Data using mRMR and Artificial Neural Network

Abstract: Cancer, the uncontrolled growth of abnormal cells in the body, is a major cause of death nowadays. Notably, cancer treatment is much easier in the initial stage than after the disease spreads. DNA microarray-based gene expression profiling has become an efficient technique for early-stage cancer identification, and a number of studies are available in this regard. Existing methods use different feature selection methods to select relevant genes and then employ distinct classifiers to identify cancer. This study uses the information-theoretic minimum Redundancy Maximum Relevance (mRMR) method to select important genes and then employs an artificial neural network (ANN) for cancer classification. The proposed mRMR-ANN method has been tested on a suite of benchmark datasets for various cancers. Experimental results reveal the proposed method to be effective for cancer classification when its performance is compared with several related existing methods.

Author 1: M A H Akhand
Author 2: Md. Asaduzzaman Miah
Author 3: Mir Hussain Kabir
Author 4: M. M. Hafizur Rahman

Keywords: Cancer classification; gene expression data; minimum redundancy maximum relevance method; artificial neural network

PDF
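
A sketch of a greedy mRMR-style gene selection step followed by an ANN classifier is shown below. It assumes the common difference criterion (relevance minus mean redundancy) and uses scikit-learn mutual-information estimators; it is not the authors' implementation.

```python
# Greedy mRMR-style selection sketch (assumed MID criterion) followed by an MLP.
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.neural_network import MLPClassifier

def mrmr_select(X, y, n_select=50):
    relevance = mutual_info_classif(X, y)             # MI(gene; class label)
    selected = [int(np.argmax(relevance))]
    candidates = set(range(X.shape[1])) - set(selected)
    while len(selected) < n_select and candidates:
        best, best_score = None, -np.inf
        for f in candidates:
            # Redundancy: average MI between the candidate gene and selected genes.
            redundancy = np.mean([mutual_info_regression(X[:, [f]], X[:, s])[0] for s in selected])
            score = relevance[f] - redundancy
            if score > best_score:
                best, best_score = f, score
        selected.append(best)
        candidates.remove(best)
    return selected

# genes = mrmr_select(X_train, y_train, n_select=50)
# clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000).fit(X_train[:, genes], y_train)
# print(clf.score(X_test[:, genes], y_test))
```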

Paper 17: A Convolutional Neural Network for Automatic Identification and Classification of Fall Army Worm Moth

Abstract: To combat the problem caused by the Fall Armyworm in the country, there is a need to develop robust early warning and monitoring systems, as the current manual system is labor-intensive and time-consuming. Automating the identification and classification of the insect is one novel approach that can be undertaken. Therefore, this paper presents the results of training a Convolutional Neural Network model, using Google's TensorFlow deep learning framework, for the identification and classification of the Fall Armyworm moth. Due to the lack of a sufficiently large training dataset and of good computing power, we used transfer learning, which is the process of reusing a model trained on one task as a starting point for a model on a second task. Google's pre-trained InceptionV3 model was used as the underlying model. Data was collected from four sources, namely the field, a lab setup, crawling the internet and data augmentation. We present the results of the best three trials in terms of training accuracy, after several attempts to find the best learning rate and number of training steps. The best model gave an average prediction accuracy of 82% and a 32% average prediction accuracy on false positives. The results show that it is possible to automate the identification and classification of the Fall Armyworm moth using Convolutional Neural Networks.

Author 1: Francis Chulu
Author 2: Jackson Phiri
Author 3: Phillip O.Y. Nkunika
Author 4: Mayumbo Nyirenda
Author 5: Monica M.Kabemba
Author 6: Philemon H.Sohati

Keywords: Augmentation; convolutional neural networks; classification; fall army worm; machine learning; tensorflow; transfer learning

PDF
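
The sketch below illustrates InceptionV3 transfer learning with data augmentation in tf.keras. The class count, augmentation choices and learning rate are assumptions for illustration; the paper used TensorFlow's retraining tooling rather than this exact code.

```python
# Sketch of InceptionV3 transfer learning with augmentation; NUM_CLASSES and all
# hyperparameters are illustrative assumptions, not the paper's configuration.
import tensorflow as tf

NUM_CLASSES = 2  # e.g. Fall Armyworm moth vs. other insects (assumed)

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

base = tf.keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                         input_shape=(299, 299, 3))
base.trainable = False  # reuse ImageNet features; only the new head is trained

inputs = tf.keras.Input(shape=(299, 299, 3))
x = augment(inputs)
x = tf.keras.applications.inception_v3.preprocess_input(x)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```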

Paper 18: Communication and Computation Aware Task Scheduling Framework Toward Exascale Computing

Abstract: The race toward Exascale computing has naturally led computer architecture to transition from the multicore era into the heterogeneous era. Exascale computing in a heterogeneous environment necessarily requires best-fit scheduling and improved resource utilization. Task scheduling is the most critical aspect of managing the challenges of Exascale in a heterogeneous computing environment. In this paper, a Communication and Computation Aware Task Scheduling Framework (CCATSF) is introduced. The CCATSF framework consists of four parts: the first is the resource monitor, the second the resource manager, the third the task scheduler and the fourth the dispatcher. The framework is based on a new hybrid task scheduling algorithm for heterogeneous computing environments. Our results, based on the random job generator that we implemented, indicate that the CCATSF framework, built on the proposed dynamic variant heterogeneous early finish time (DVR-HEFT) algorithm, is able to reduce the scheduler's makespan and increase efficiency without increasing the algorithm's time complexity.

Author 1: Suhelah Sandokji
Author 2: Fathy Eassa

Keywords: Exascale computing; resource utilization; hybrid task scheduling; heterogeneous computing environment; task scheduler framework

PDF

Paper 19: A Mobile-based Tremor Detector Application for Patients with Parkinson’s Disease

Abstract: Parkinson’s disease affects millions of people worldwide and its frequency is steadily increasing. No cure is currently available for Parkinson’s disease patients, and most medications only treat the symptoms. This treatment depends on the quantification of Parkinson’s symptoms, such as hand tremors. The most commonly used method to measure human tremors is a severity scale, which lacks accuracy because it is based on the subjective review of a neurologist. Furthermore, the use of severity scales prevents the extraction of information from tremor activity, such as speed, amplitude and frequency. Therefore, a mobile application was developed to measure the hand tremor level of Parkinson’s patients using a mobile phone-based accelerometer. The Agile method was used to develop this application, and Android Studio and the Android Software Development Kit were utilized. The application runs on an Android smartphone. It allows patients to identify their tremor activity and subsequently seek relevant medical advice. In addition, a neurologist can monitor the tremor activity of patients by analyzing the records generated by the application.

Author 1: Pang Xin Yi
Author 2: Maryati Mohd. Yusof
Author 3: Kalaivani Chelappan

Keywords: Agile; mhealth; mobile application; Parkinson; tremor detector

PDF
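
As a rough illustration of the signal-processing idea behind an accelerometer-based tremor detector, the sketch below estimates the dominant frequency and amplitude of an acceleration trace with an FFT and flags the commonly reported 4-6 Hz Parkinsonian rest-tremor band. The sampling rate and amplitude threshold are assumptions, and the app described in the paper runs on Android rather than in Python.

```python
# Sketch only: spectral analysis of an accelerometer trace for tremor detection.
import numpy as np

def tremor_features(accel, fs=50.0):
    """accel: 1-D acceleration-magnitude samples (m/s^2); fs: sampling rate in Hz."""
    accel = accel - np.mean(accel)                    # remove gravity / DC offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    peak = int(np.argmax(spectrum[1:]) + 1)           # skip the DC bin
    return freqs[peak], spectrum[peak] / len(accel)   # dominant frequency, rough amplitude

def looks_like_rest_tremor(accel, fs=50.0, min_amplitude=0.05):
    """Flags activity in the commonly reported 4-6 Hz rest-tremor band (threshold assumed)."""
    freq, amp = tremor_features(accel, fs)
    return 4.0 <= freq <= 6.0 and amp >= min_amplitude
```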

Paper 20: The Effect of Social Feature Quality on the Social Commerce System

Abstract: The emergence of social networks has triggered the evolution of e-commerce into what is now known as social commerce (s-commerce). However, s-commerce users experience problems related to its social features that affect s-commerce effectiveness. Therefore, this paper examines the effect of social feature quality (SFQ) determinants on s-commerce from the customer perspective by adapting the information systems success model. A total of 220 online survey responses were analyzed using confirmatory factor analysis and structural equation modeling to test the proposed model. SFQ shows a significant effect on perceived usefulness and customer satisfaction with an s-commerce system, whereas relationship support quality shows a significant effect on perceived usefulness and customer satisfaction with an s-commerce system but not on social support. A significant relationship is also identified among perceived usefulness, customer satisfaction, and the net benefits of an s-commerce system.

Author 1: Nona M Nistah
Author 2: Maryati Mohd. Yusof
Author 3: Suaini Sura
Author 4: Ook Lee

Keywords: Social feature quality; relationship support; social support; s-commerce; e-commerce; customer satisfaction

PDF

Paper 21: New Approach of Automatic Modulation Classification based on in Phase-Quadrature Diagram Combined with Artificial Neural Network

Abstract: Automatic Modulation Classification (AMC) with intelligent systems is an attractive area of research due to the development of Software Defined Radio (SDR). This paper proposes a new algorithm based on a combination of k-means clustering and an Artificial Neural Network (ANN). We use the I-Q (in-phase, quadrature) constellation diagram as the basic information. The k-means algorithm is used to normalize the transmitted data polluted by Additive White Gaussian Noise (AWGN); the resulting diagram is then treated as an image and coded into pixels before being fed to a Multi-Layer Perceptron (MLP) neural network. Simulation results show an improvement in recognition rate at low SNR (signal-to-noise ratio) compared to some results reported in the literature.

Author 1: Jean Baptiste Bi Gouho
Author 2: Désiré Melèdje
Author 3: Boko Aka
Author 4: Michel Babri

Keywords: Modulations; artificial neural network; clustering; machine learning

PDF
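
A minimal sketch of the two-stage idea in the abstract: k-means re-centres the noisy I-Q samples, the resulting constellation is rasterised into a small binary image, and an MLP classifies the modulation. The grid size, cluster count and layer sizes are assumptions, not the paper's settings.

```python
# Sketch: k-means denoising of I/Q samples, binary pixel coding, MLP classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

def iq_to_image(iq, n_clusters=16, bins=32):
    """iq: complex array of received symbols for one signal; returns a flattened image."""
    points = np.column_stack([iq.real, iq.imag])
    centres = KMeans(n_clusters=n_clusters, n_init=10).fit(points).cluster_centers_
    img, _, _ = np.histogram2d(centres[:, 0], centres[:, 1],
                               bins=bins, range=[[-2, 2], [-2, 2]])
    return (img > 0).astype(np.float32).ravel()        # binary pixel coding

# X = np.array([iq_to_image(sig) for sig in signals]); y = modulation_labels
# clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=1000).fit(X, y)
```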

Paper 22: User Perspective on External Value Creation Factors in Indonesia e-Commerce

Abstract: Value creation is very important for e-commerce companies in order to reach customers and increase the company's value in the eyes of the customer. Value creation is mostly developed based on the internal factors of the company, a statement supported by many studies that have researched value creation from within the company. The purpose of this research is to find out customers' perspectives on the external environmental factors that can affect the value creation of e-commerce companies, especially in Indonesia. This research uses primary and secondary methods of data collection. A questionnaire is used as the primary research method to gather data from e-commerce users; for the secondary method, a literature review of previous research and existing journals is used. The results of this research are the respondents' statements regarding external environmental factors, which can be classified into five factors: government policy and legal, telecommunication infrastructure, financial and capital investment, physical environment, and payment system.

Author 1: Sfenrianto Sfenrianto
Author 2: Hilda Oktavianni JM
Author 3: Hafid Prima Putra
Author 4: Khoerintus

Keywords: External environment factors; e-commerce; user perspective; value creation

PDF

Paper 23: Double Diode Ideality Factor Determination using the Fixed-Point Method

Abstract: In this paper, we are interested in studying the diode ideality factor of the double-exponential equivalent model, based on the properties of the fixed-point method. The optimal choice of this factor will improve the profitability of a photovoltaic installation. The diode ideality factor is a crucial parameter for describing solar cell behavior. Different methods have been elaborated to determine its value; some of them are analytical, such as the Lambert function, and others are direct, such as the normal method of the coordinates of the parameters. In our case, we applied the fixed-point method, which is an iterative algorithm for solving non-linear equations. The values obtained by this method are compared with the values calculated by other methods to prove its significance and effectiveness.

Author 1: Traiki Ghizlane
Author 2: Ouajji Hassan
Author 3: Bifadene Abdelkader
Author 4: Bouattane Omar

Keywords: Ideality factor; fixed point; double diode; solar cell

PDF
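
For readers unfamiliar with the fixed-point method, the sketch below shows the generic iteration the abstract refers to. The update function g for the double-diode ideality factors is specific to the paper's model and is therefore left as a placeholder; a known contraction mapping is used in the usage example.

```python
# Generic fixed-point iteration x_{k+1} = g(x_k); g is a placeholder here, not
# the paper's expression for the double-diode ideality factor.
import math

def fixed_point(g, x0, tol=1e-9, max_iter=1000):
    """Iterate x_{k+1} = g(x_k) until successive values differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("fixed-point iteration did not converge")

# Example with a known contraction mapping (solves x = cos(x)):
print(fixed_point(math.cos, 1.0))   # ~0.739085
```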

Paper 24: The Impact of Flyweight and Proxy Design Patterns on Software Efficiency: An Empirical Evaluation

Abstract: In this era of technology, delivering quality software has become a crucial requirement for developers. Quality software can help an organization to succeed and gain a competitive edge in the market. There are numerous quality attributes introduced by various quality models, and various studies show that the quality of object-oriented software can be improved by using design patterns. The main purpose of this research is to identify the relationships between design patterns and the software efficiency quality attribute. This research is focused on the impact of the Flyweight and Proxy design patterns on the efficiency of software. An example scenario is used to empirically evaluate the effectiveness of the applied design refinements on the efficiency of a system. The techniques used to measure software efficiency and the results obtained for each solution are elaborated in detail. At the end of this research, a comparative analysis is provided to show the relative impact of each selected design pattern on software efficiency.

Author 1: Muhammad Ehsan Rana
Author 2: Wan Nurhayati Wan Ab Rahman
Author 3: Masrah Azrifah Azmi Murad
Author 4: Rodziah Binti Atan

Keywords: Software efficiency; design patterns; flyweight design pattern; proxy design pattern; measuring software efficiency; empirical evaluation of software

PDF

Paper 25: Micro Agent and Neural Network based Model for Data Error Detection in a Real Time Data Stream

Abstract: In this paper, we present a model for learning and detecting the presence of data type errors in a real-time big data stream processing context. The proposed approach is based on a collection of micro-agents. Each micro-agent is trained to detect a specific type of error using an atomic neural network based on a simple multilayer perceptron. The supervised learning process is based on a binary classifier whose training inputs are represented by data types and data values. The Micro-Agent for Error Detection (MAED) is deployed in several instances, depending on the number of error types to be handled. The orchestration of the data streams to be examined is performed by a special Host Micro-Agent (HMA). The latter receives a data stream in real time and splits the current record into several elementary fields. Each field value is streamed to a MAED instance, which responds with a signal indicating whether a specific data type error is present in the corresponding field. For each detected data type error, the HMA selects and runs the appropriate cleaning algorithm from a repository to correct the errors present in the data stream. To validate this approach, we propose an implementation based on the Deeplearning4j framework for the machine learning part and JADE as the multi-agent system (MAS) platform. The dataset used is generated by an event generator for smart highways.

Author 1: Sidi Mohamed Snineh
Author 2: Mohamed Youssfi
Author 3: Abdelaziz Daaif
Author 4: Omar Bouattane

Keywords: Micro-agent; machine learning; errors; big data; multilayer perceptron; stream processing

PDF

Paper 26: An Automatic Multiple Sclerosis Lesion Segmentation Approach based on Cellular Learning Automata

Abstract: Multiple Sclerosis (MS) is a demyelinating nerve disease in which, for reasons that are unknown, the body's immune system is affected and immune cells begin to destroy the myelin sheath of nerve cells. In pathology, the areas of distributed demyelination are called lesions, which are the pathologic characteristics of MS. In this research, the segmentation of the lesions from one another is studied by using gray-scale features and the dimensions of the lesions. Brain Magnetic Resonance Imaging (MRI) images in three planes (T1, T2, PD) containing MS lesions have been used. Cellular Learning Automata (CLA) are applied to the MRI images with a novel trial-and-error approach to set penalty and reward frames for each pixel. The images were analyzed in MATLAB, and the results show the MS lesions in white and the brain anatomy in red on a black background. The proposed approach can be considered a supplement or a superior alternative to other methods such as Graph Cuts (GC), fuzzy c-means, mean-shift, k-Nearest Neighbors (KNN) and Support Vector Machines (SVM).

Author 1: Mohammad Moghadasi
Author 2: Dr. Gabor Fazekas

Keywords: Multiple Sclerosis (MS); MATLAB; Magnetic Resonance Imaging (MRI); MS Lesions; Cellular Learning Automata (CLA); Segmentation; Support Vector Machines (SVM)

PDF

Paper 27: A Constraint-based Approach to Deal with Self-Adaptation: The Case of Smart Irrigation Systems

Abstract: Smart irrigation is a specific application of the IoT in which devices composed of sensors and actuators collect environmental data, such as soil humidity, air temperature and brightness, in order to launch or plan irrigation cycles. These systems function according to a configuration that dictates the way in which every component should operate. Static configurations are limited, as they only represent a set of fixed requirements. However, in domains such as the IoT, technology is continuously evolving, and various users, sometimes with various needs, interact with the system. This leads to dynamic requirements, which are fulfilled by dynamic configurations. This paper uses the case of an irrigation system to illustrate such requirements and proposes a constraint-based approach to designing self-adaptive smart irrigation systems.

Author 1: Asmaa Achtaich
Author 2: Nissrine Souissi
Author 3: Camille Salinesi
Author 4: Raúl Mazo
Author 5: Ounsa Roudies

Keywords: IoT; irrigation systems; smart; constraint; product lines; self-adaptive systems

PDF

Paper 28: Vietnamese Speech Command Recognition using Recurrent Neural Networks

Abstract: Voice control is an important function in many mobile devices and in smart homes, especially for providing people with disabilities a convenient way to communicate with their devices. Despite many studies on this problem around the world, there has not been a formal study for the Vietnamese language. In addition, many studies did not offer a solution that can be extended easily in the future. In this study, a dataset of Vietnamese speech commands was labeled and organized to be shared with the language research community in general and the Vietnamese language research community in particular. This paper also provides speech collection and processing software. Furthermore, this study designs and evaluates Recurrent Neural Networks and applies them to the collected data. The average recognition accuracy on the set of 15 commands for controlling smart home devices is 98.19%.

Author 1: Phan Duy Hung
Author 2: Truong Minh Giang
Author 3: Le Hoang Nam
Author 4: Phan Minh Duong

Keywords: Vietnamese speech command; voice control; recurrent neural networks

PDF

Paper 29: Investigating Students’ Acceptance of Online Courses at Al-Ahliyya Amman University

Abstract: Online courses allow students to access course materials anytime and anywhere. These courses are meant to enhance and improve the learning process. Unfortunately, by analyzing data from an online course at Al-Ahliyya Amman University, it was found that only 51% of the enrolled students accessed the animated course material. This study proposed a model to understand the factors that affect students' intention to use an online course, by extending the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM). The proposed research model investigated the effects of experience, perceived usefulness, awareness, effort expectancy, cost, subjective (social) norms and behavioral intention to use online courses on students' adoption of online courses. In addition, the model investigated the effects of moderators, such as college, college level, personal computer ownership, internet access and online course enrollment, on these relations. A questionnaire was distributed, and a structural equation modeling (SEM) approach was then used to analyze the responses using SmartPLS.

Author 1: Qasem Kharma

Keywords: TAM; UTAUT; online learning; e-learning

PDF

Paper 30: A Comprehensive Evaluation of Cue-Words based Features and In-text Citations based Features for Citation Classification

Abstract: Citation plays a vital role in the scientific community for evaluating the contributions of scientific authors. Citing sources provides a measurable way of evaluating the impact factor of journals and authors and allows for the recognition of new research issues. Different techniques for classifying citations have been proposed. Citations that provide background knowledge in the citing document have been classified by previous researchers as non-important or incidental, while citations that extend previous work in the citing document are classified as important. The accuracy achieved by existing citation models is not very high, and better features need to be included for accurate predictions. A hybrid approach would present all possible combinations of cue-word and in-text citation-based features for citation classification.

Author 1: Syed Jawad Hussain
Author 2: Sohail Maqsood
Author 3: NZ Jhanjhi
Author 4: Azeem Khan
Author 5: Mahadevan Supramaniam
Author 6: Usman Ahmed

Keywords: Cue-words; in-text citation; hybrid

PDF

Paper 31: Analysis of Accuracy and Precision of WLAN Position Estimation System based on RSS

Abstract: The coordinates of the position of a wireless access point are the main goal of a wireless localization technique. Commonly, outdoor wireless localization techniques use trilateration, installing several devices as anchors, which makes installation and maintenance costs higher. An outdoor wireless localization system that is more efficient, effective and flexible, yet still accurate and precise, therefore becomes indispensable. This paper proposes a wireless position estimation system based on received signal strength (RSS) values, called the WLAN Position Estimation System (WLAN PES), which integrates the WLAN Distance Estimation System (WLAN DES) with the WLAN PES formula. To confirm the accuracy and precision of WLAN PES, this paper focuses on the analysis of WLAN PES tests conducted at ten points, where each measurement point had coordinates and angles different from the others. The test points circled the target WLAN access point, and each test point was 1,000 meters from the targeted access point. At each test point, the WLAN finder read RSSfnd. The system then calculated the estimated distance using WLAN DES based on the RSSfnd value. After the system obtained a distance estimate, the estimated position was calculated with the WLAN PES formula. In addition to the distance estimate, the WLAN PES formula requires variables such as the latitude and longitude coordinates of the WLAN finder position and the bearing of the WLAN finder. WLAN PES was found to be capable of determining the estimated position coordinates of a WLAN access point with accuracy and precision values of 93.26% and 98.77%, respectively.

Author 1: Sutiyo
Author 2: Risanuri Hidayat
Author 3: I Wayan Mustika
Author 4: Sunarno

Keywords: WLAN position estimation system; WLAN distance estimation system; accuracy and precision wireless localization; efficient and flexible outdoor wireless localization

PDF
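
The sketch below illustrates the two computational steps the abstract describes: converting an RSS reading into a distance with a log-distance path-loss model, and projecting a destination point from the finder's coordinates, bearing and that distance using the standard great-circle formula. The path-loss constants are placeholders, not the calibrated WLAN DES values.

```python
# Sketch: RSS -> distance via a log-distance path-loss model, then a destination
# point from (lat, lon, bearing, distance). Constants are assumptions.
import math

def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exponent=2.7):
    """Log-distance model: RSS(d) = RSS(1 m) - 10*n*log10(d)."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10.0 * path_loss_exponent))

def destination(lat_deg, lon_deg, bearing_deg, distance_m, R=6371000.0):
    """Point reached from (lat, lon) after travelling distance_m along bearing_deg."""
    lat1, lon1, brg = map(math.radians, (lat_deg, lon_deg, bearing_deg))
    ang = distance_m / R
    lat2 = math.asin(math.sin(lat1) * math.cos(ang) +
                     math.cos(lat1) * math.sin(ang) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# est_lat, est_lon = destination(finder_lat, finder_lon, finder_bearing,
#                                rss_to_distance(rss_fnd))
```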

Paper 32: Metric-based Measurement and Selection for Software Product Quality Assessment: Qualitative Expert Interviews

Abstract: A systematic and efficient measurement process can assist in the production of a quality software product. Metric-based measurement methods are often used to assess product quality, and several hundred metrics have been proposed by previous researchers. However, there is no specific and structured mechanism for the metric selection process. Lack of awareness, knowledge and experience leads practitioners and stakeholders in industry to select inappropriate and unsuitable metrics for assessing software product quality. A literature study found that the existing selection models are irrelevant and insufficient for assisting and supporting the metric selection process, which should consist of selection criteria together with systematic and practical selection methods. A qualitative interview study was conducted involving 12 experts and practitioners to reveal current issues in software measurement, to identify the elements relevant to the software metric selection process and to identify appropriate and valid software metric selection criteria. The findings from these expert interviews revealed important input from industry: five main issues in software measurement, six elements associated with the metric selection process and 13 criteria relevant to software metric selection.

Author 1: Zubaidah Bukhari
Author 2: Jamaiah Yahaya
Author 3: Aziz Deraman

Keywords: Software product metric; metric selection criteria; software quality; software measurement; selection process; qualitative study

PDF

Paper 33: Systematic Review of Existing IoT Architectures Security and Privacy Issues and Concerns

Abstract: The Internet of Things (IoT) has become one of the most prominent technologies that the world is witnessing nowadays. It provides great solutions to humanity in many significant fields of life. IoT refers to a collection of sensors or objects with the capability of communicating with each other through the internet without human intervention. Currently, there is no standard IoT architecture. As it is in its infancy, IoT is surrounded by numerous security and privacy concerns. Thus, to avoid such concerns, which may hinder its deployment, an IoT architecture has to be carefully designed to incorporate security and privacy solutions. In this paper, a systematic literature review was conducted to trace the evolution of IoT architectures from their initial development in 2008 until 2018. The comparison among these architectures is based on the architectural stack, the issues covered, the technology used and the consideration of security and privacy aspects. The findings of the review show that the initial IoT architectures did not provide a comprehensive meaning for IoT that describes its nature, whereas the recent IoT architectures convey a comprehensive meaning of IoT, starting from data collection, followed by data transmission and processing, and ending with data dissemination. Moreover, the findings reveal that IoT architecture has evolved gradually over the years through improvements to the architecture stack with new solutions that mitigate IoT challenges such as scalability, interoperability, extensibility and management, but with little consideration of security solutions. The findings also disclose that none of the discussed IoT architectures considers privacy concerns, which are indeed a critical factor in the sustainability and success of IoT. Therefore, there is an inevitable need to consider security and privacy solutions when designing IoT architectures.

Author 1: Fatma Alshohoumi
Author 2: Mohammed Sarrab
Author 3: Abdulla AlHamadani
Author 4: Dawood Al-Abri

Keywords: Internet of things; IoT architecture stack; IoT layers; IoT privacy concerns; IoT security

PDF

Paper 34: Complex Binary Adder Designs and their Hardware Implementations

Abstract: The Complex Binary Number System (CBNS) is a (-1+j)-based binary number system that allows both the real and imaginary components of a complex number to be represented as a single binary number. In this paper, we present three designs of nibble-size complex binary adders (ripple-carry, decoder-based and minimum-delay) and implement them on various Xilinx FPGAs. Designs of a base-2 4-bit binary adder have also been implemented so that the statistics of the different adders can be compared.

Author 1: Tariq Jamil
Author 2: Medhat Awadalla
Author 3: Iftaquaruddin Mohammed

Keywords: Complex number; complex binary; adder; ripple carry; decoder; minimum delay

PDF

Paper 35: Enhanced Mutual Authenticated Key Agreement Protocol for Anonymous Roaming Service in Global Mobility Networks

Abstract: With the rapid development of mobile intelligent terminals, users can enjoy ubiquitous life in global mobility networks (GLOMONET). It is essential to secure user information in order to provide secure roaming service in GLOMONET. Recently, Xu et al. proposed a mutual authentication and key agreement (MAKA) protocol as a basic security building block. The purpose of this paper is not only to show some security problems in Xu et al.’s MAKA protocol but also to propose an enhanced MAKA protocol as a remedy. The proposed protocol ensures higher security than the well-known authentication and key agreement protocols, at the cost of slightly more computational overhead due to the security enhancements.

Author 1: Hyunsung Kim

Keywords: Information security; roaming security; anonymity; authenticated key agreement; cryptanalysis

PDF

Paper 36: Thermal Pain Level Estimation Method with Heart Rate and Cerebral Blood Flow

Abstract: A method for thermal pain level estimation from heart rate and cerebral blood flow using a support vector machine (SVM) is proposed. Through experiments, it is found that the thermal pain level is much more sensitive to cerebral blood flow than to heart rate. It is also found that the performance of this thermal pain estimation is much better than that of the previously proposed method based on the number of blinks and the rate of pupil enlargement.

Author 1: Kohei Arai
Author 2: Asato Mizoguchi
Author 3: Hiroshi Okumura

Keywords: Thermal pain; support vector machine; thermal stimulus; classification; cerebral blood flow; heart rate

PDF

Paper 37: Citizen Attention Web Application for the Municipality of Sabinas, Coahuila, Mexico

Abstract: Information systems are fundamental to performing the daily activities of any organization, and organizations increasingly depend on information technology to achieve their objectives. This article presents the web information system that has been developed and implemented to support the management of the administrative service needs of the citizens of the municipality of Sabinas, Coahuila, México, who seek to be served in the best way and to obtain faster, more reliable information for follow-up. Previously, the work was done manually, keeping the records in a Microsoft Excel format. The agile XP methodology was used to develop the system; the database was created in MySQL, and development was done in Visual Studio 2015, with web programming in ASP.NET and programming in C#. With the implementation of the system, there is now electronic control of the requests made by citizens, providing integrity, availability and confidentiality of information while streamlining the process of capturing and receiving applications in each department of the municipality of Sabinas, Coahuila, Mexico. In addition, the system provides statistics on the requests that were attended to, those that are in process and those that were not attended to.

Author 1: Griselda Cortes
Author 2: Alicia Valdez
Author 3: Laura Vazquez
Author 4: Alma Dominguez
Author 5: Cesar Gonzalez
Author 6: Ernestina Leija
Author 7: Jose Cendejas

Keywords: Web system; citizen attention; database

PDF

Paper 38: Usability of “Traysi”: A Web Application for Tricycle Commuters

Abstract: This study measured the usability of a web application for tricycle commuters that was developed using Hypertext Markup Language (HTML), Cascading Style Sheets (CSS) and JavaScript (JS) with the aid of Google Application Programming Interfaces (APIs). Toward this goal, effectiveness, efficiency and user satisfaction were measured using common usability metrics. Effectiveness was measured in terms of task completion rate and user errors, while efficiency was measured in terms of time on task. For user satisfaction, the post-task questionnaire Single Ease Question (SEQ) was used. In order to check whether the web application would be usable even for first-time users, the usability test was conducted three times. The results revealed that the usability of the web application was acceptable on the first trial. However, usability improved with subsequent use, as evident in the third trial, which yielded a 93.33% task completion rate with only one user error. The average time on task in every trial was lower than the maximum acceptable task time, and user satisfaction was high (x̅ = 6.00). Thus, the web application was highly usable for its intended purpose, especially when used repeatedly.

Author 1: Ertie C Abana

Keywords: Tricycle; usability metrics; web application; fare calculation; Google API

PDF

Paper 39: A Decision Tree Approach for Predicting Student Grades in Research Project using Weka

Abstract: Data mining in education is an emerging multidisciplinary research field, especially with the upsurge of new technologies used in educational systems that has led to the storage of massive amounts of student data. This study used classification, a data mining process, to evaluate computer engineering students' data and identify students who need academic counseling in the subject. Five attributes were considered in building the classification model, and the decision tree was chosen as the classifier. The accuracies of the decision tree algorithms Random Tree, REPTree and J48 were compared using cross-validation, wherein Random Tree returned the highest accuracy of 75.188%. The Waikato Environment for Knowledge Analysis (WEKA) data mining tool was used to generate the classification model. The classification rules extracted from the decision tree were used in the algorithm of the Research Project Grade Predictor application, which was developed using Visual C#. The application will help research instructors and advisers easily identify students who need more attention because they are predicted to obtain low grades.

Author 1: Ertie C Abana

Keywords: Data mining; classification rules; decision tree; educational data mining; WEKA

PDF

Paper 40: A Review of Ontology Development Aspects

Abstract: Although it is widely recognized that ontology is the main approach towards semantic interoperability among information systems and services, the understanding of ontology aspects among researchers is limited. To provide clear insight into this problem and support researchers, we need a background understanding of the various aspects related to ontology. Consequently, this paper conducts a comprehensive review that maps the literature to a coherent taxonomy covering the benefits of ontology, types of ontology, application domains, development platforms, languages, tools and methodologies. The paper also discusses the concepts of ontology and the Semantic Web and their contribution to several research fields such as artificial intelligence, library science and shared knowledge. The fundamentals of ontology presented in this paper can benefit readers who wish to embark on ontology-based research and application development.

Author 1: Nur Liyana Law Mohd Firdaus Law
Author 2: Moamin A. Mahmoud
Author 3: Alicia Y.C. Tang
Author 4: Fung-Cheng Lim
Author 5: Hairoladenan Kasim
Author 6: Marini Othman
Author 7: Christine Yong

Keywords: Ontology; semantic web; artificial intelligence

PDF

Paper 41: Post Treatment of Guided Wave by using Wavelet Transform in the Presence of a Defect on Surface

Abstract: This article presents Lamb wave processing using two methods: the two-dimensional Fast Fourier Transform (FFT2D) and the Continuous Wavelet Transform (CWT) with the Morlet wavelet. The processing is carried out for a structure of two aluminum-copper plates in edge-to-edge contact at a perpendicular junction of thickness “e”, in the presence of a rectangular, symmetrical defect of depth “d” located on the surface of the junction. The aim of this study is to calculate the transmission and reflection energy coefficients with the two methods. The simulation results obtained with COMSOL software for an incident S0 wave at F = 800 kHz show good agreement between the two methods (FFT2D and CWT).
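To illustrate the wavelet side of such an analysis, the sketch below applies a Morlet CWT to a synthetic tone burst standing in for a simulated Lamb-wave record; the sampling rate and signal are assumptions, and the authors' COMSOL data are not reproduced.

```python
# Minimal sketch: Morlet CWT of a synthetic 800 kHz tone burst; energy in
# time windows around the incident/reflected/transmitted packets can then
# be compared to estimate reflection and transmission coefficients.
import numpy as np
import pywt

fs = 10e6                                    # assumed sampling frequency [Hz]
t = np.arange(0, 200e-6, 1 / fs)             # 200 microseconds of signal
signal = np.sin(2 * np.pi * 800e3 * t) * np.exp(-((t - 50e-6) ** 2) / (10e-6) ** 2)

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

# Energy envelope over time: integrate |CWT|^2 across scales
energy = np.sum(np.abs(coeffs) ** 2, axis=0)
print("total wavelet energy:", energy.sum())
```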

Author 1: MARRAKH Rachid
Author 2: ALOUANE Houda
Author 3: BELHOUSSINE DRISSI Taoufiq
Author 4: NSIRI Benayad
Author 5: ZAYRIT Soumaya

Keywords: Lamb wave; defect; reflection; transmission; CWT; FFT2D

PDF

Paper 42: Vectorization of Text Documents for Identifying Unifiable News Articles

Abstract: Vectorization is imperative for processing textual data in natural language processing applications. Vectorization enables machines to understand textual content by converting it into meaningful numerical representations. The proposed work aims at identifying unifiable news articles for performing multi-document summarization. A framework is introduced for identifying news articles related to top trending topics/hashtags and for multi-document summarization of unifiable news articles based on the trending topics, in order to capture opinion diversity on those topics. Text clustering is applied to the corpus of news articles related to each trending topic to obtain smaller unifiable groups. The effectiveness of various text vectorization methods, namely bag-of-words representations with tf-idf scores, word embeddings, and document embeddings, is investigated for clustering news articles using the k-means algorithm. The paper presents a comparative analysis of the different vectorization methods on documents from the DUC 2004 benchmark dataset in terms of purity.
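A hedged sketch of the tf-idf plus k-means pipeline and the purity measure named in the abstract; the toy documents, topic labels, and number of clusters are hypothetical, and the DUC 2004 corpus is not used here.

```python
# Sketch: tf-idf vectorization, k-means clustering, and cluster purity.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["stocks rally on earnings", "team wins championship game",
        "market falls amid inflation", "coach praises winning squad"]
true_topics = np.array([0, 1, 0, 1])        # known topics, used only for purity

X = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

def purity(y_true, y_pred):
    # For each cluster, count its most frequent true topic, then normalize.
    total = sum(np.bincount(y_true[y_pred == c]).max() for c in np.unique(y_pred))
    return total / len(y_true)

print("cluster purity:", purity(true_topics, labels))
```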

Author 1: Anita Kumari Singh
Author 2: Mogalla Shashi

Keywords: Vectorization; news articles; tf-idf; word embeddings; document embeddings; text clustering

PDF

Paper 43: A Proposed Model for Detecting Facebook News’ Credibility

Abstract: Social networks are currently one of the main news sources for most of their users. Moreover, news channels also consider social networks to be primary channels, not only for spreading news but also for measuring feedback from their followers. Facebook followers can comment on or react to the news, which represents the follower’s feedback on the topic. Therefore, measuring news credibility is one of the important tasks that could control the propagation of fake news as well as the number of a news outlet’s followers. The proposed model in this research highlights the impact of news followers on detecting the polarity of the news, i.e., whether it is fake or not. The proposed model focuses on applying intelligent sentiment analysis, using the Vector Space Model (VSM), one of the most successful techniques, to the users’ comments and their reactions through emoji. The degree of credibility is then determined according to the correlation coefficient. An experimental study was conducted using a Facebook news dataset, which included the news and the followers’ feedback.

Author 1: Amira M Idrees
Author 2: Fahad Kamal Alsheref
Author 3: Ahmed I. ElSeddawy

Keywords: Social network; vector space model; correlation coefficient; sentiment analysis

PDF

Paper 44: Cyber Terrorist Detection by using Integration of Krill Herd and Simulated Annealing Algorithms

Abstract: This paper presents a technique to detect suspected cyber terrorist activities over the net by integrating the Krill Herd and Simulated Annealing algorithms. Three new levels of categorization, namely low, high, and interleave, are introduced in this paper to optimize the accuracy rate. Two thousand data records were used for training and testing with 10-fold cross-validation, and the simulations were performed using Matlab®. Based on the conducted experiment, the technique produced a 73.01% accuracy rate for the interleave level, thus outperforming the benchmark work. The findings can be used as guidance and a baseline for other researchers with the same interest in this area.

Author 1: Hassan Awad Hassan Al-Sukhni
Author 2: Azuan Bin Ahmad
Author 3: Madihah Mohd Saudi
Author 4: Najwa Hayaati Mohd Alwi

Keywords: Krill Herd; web content classification; cyber terrorists; simulated annealing

PDF

Paper 45: Knowledge Discovery based Framework for Enhancing the House of Quality

Abstract: Mining techniques have proved to have a successful impact in different fields for many purposes; one of these purposes is gaining customer satisfaction by enhancing product quality according to the voice of the customers. This research proposes a framework based on mining techniques and an adapted Saaty method, aiming to gain customer satisfaction and, consequently, a competitive advantage in the real estate market. The proposed framework is applied during the design phase of a real estate residential building project as an improvement tool for designing the building according to the customers’ requirements, representing the voice of the customer (VOC). The proposed Saaty method adaptation increased the number of consistent samples that would have been incorrectly excluded using the traditional Saaty method. The Saaty method adaptation has succeeded in enhancing the house of quality (HOQ) by achieving consistent technical customer requirements for residential buildings, while customer segmentation succeeded in focusing on homogeneous groupings of customers.
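For readers unfamiliar with Saaty's consistency check, the sketch below shows the standard AHP formulas (CI = (λmax − n)/(n − 1), CR = CI/RI); it is a generic illustration, not the authors' specific adaptation, and the pairwise comparison matrix is an example.

```python
# Standard Saaty consistency ratio for a pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])             # example comparison matrix

n = A.shape[0]
lambda_max = np.linalg.eigvals(A).real.max()
CI = (lambda_max - n) / (n - 1)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
CR = CI / RI
print(f"lambda_max={lambda_max:.3f}  CI={CI:.3f}  CR={CR:.3f}  consistent={CR < 0.1}")
```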

Author 1: Amira M Idrees
Author 2: Ahmed I. ElSeddawy
Author 3: Mohammed Ossama Zeidan

Keywords: Knowledge discovery; mining techniques; customer segmentation; Saaty’s method; customer satisfaction

PDF

Paper 46: Convolutional Neural Network for Diagnosing Skin Cancer

Abstract: Diagnosis of melanoma (a skin cancer) is a challenging task in the medical field due to the amount and nature of the data. Skin cancer datasets usually come in different formats and shapes, including medical images; hence, the data require tremendous preprocessing effort before the auto-diagnostic task itself. In this work, deep learning (a convolutional neural network) is used to build a computer model for predicting new cases of skin cancer. The first phase of this work is to prepare the image data; this includes image segmentation to find useful parts that are easier to analyze, to detect regions of interest in the digital images, to reduce noise and illumination effects, and to detect sharp object edges (boundaries). The proposed approach then builds a convolutional neural network model consisting of three convolution layers, three max-pooling layers, and four fully connected layers. Testing the model produced promising results, with an accuracy of 0.74. The results encourage future improvement and research on online diagnosis of melanoma at early stages. Therefore, a web application was built to utilize the model and provide online diagnosis of melanoma.
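A minimal sketch of a network with the layer counts the abstract describes (three convolution, three max-pooling, four fully connected layers); the input size, filter counts, and training data are assumptions, not the authors' exact configuration.

```python
# Sketch of a 3-conv / 3-pool / 4-dense CNN for binary lesion classification.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(128, 128, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),      # four fully connected layers
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),     # melanoma vs. benign
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=..., validation_data=...)
```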

Author 1: Mohammad Ashraf Ottom

Keywords: Convolutional neural network CNN; melanoma; skin cancer; image preprocessing

PDF

Paper 47: A Review of Embedding Hexagonal Cells in the Circular and Hexagonal Region of Interest

Abstract: Hexagonal cells are applied in various fields of research. They exhibit many advantages, one of the most important being that they can be closely packed to form a hexagonal grid that fully covers the Region of Interest (ROI) without overlaps or gaps. The ROI can have various geometrical shapes, but this paper deals with circular or hexagonal ROI approximations. The main purpose of our research is to provide a short review of the literature concerning the hexagonal grid, summarizing the existing state-of-the-art approaches to embedding hexagonal cells in the targeted ROI shapes and their application-specific advantages. We report the formulas and algebraic expressions given in existing research for calculating the number of embedded inner hexagonal cells or their vertices and/or edges. We contribute by integrating this research in one place, finding connections between previously unrelated applications of the embedded hexagonal grid, and extracting commonalities among previous studies according to whether they provide formulas for calculating the inner hexagonal cells. In cases where only the number of edges or vertices is provided for the targeted application, we derive formulas for calculating the number of inner hexagons. Our survey therefore results in an overview of solutions to the problem of embedding hexagonal cells in the desired circular or hexagonal ROI. The contribution of the review is twofold: first, it provides the existing and the derived formulas for calculating the embedded hexagons; second, it provides the theoretical background necessary to encourage further research. In particular, our main motivation, the geometrical design of one of the world’s largest CERN particle detectors, the Compact Muon Solenoid (CMS), is analyzed as a source of future research directions.
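One commonly cited closed form for this kind of counting problem, shown below as a generic illustration (not necessarily among the exact formulas derived in the paper), is the centered hexagonal number, which counts the unit hexagons in a hexagon-shaped ROI made of a central cell surrounded by k concentric rings.

```python
# Centered hexagonal number: cells in a hexagonal ROI with k rings around a center cell.
def centered_hexagonal(k: int) -> int:
    """Number of hexagonal cells in a hexagon-shaped grid with k rings."""
    return 3 * k * (k + 1) + 1

for k in range(5):
    print(k, centered_hexagonal(k))   # 1, 7, 19, 37, 61
```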

Author 1: Marina Prvan
Author 2: Julije Ožegovic
Author 3: Arijana Burazin Mišura

Keywords: Detector design; hexagon tessellation; region of Interest; regular grid

PDF

Paper 48: Efficient Software Testing Technique based on Hybrid Database Approach

Abstract: In the field of computer science, software testing is regarded as a critical process executed to assess and analyze the performance of, and risks existing in, software applications. There is an emphasis on integrating specific approaches to carry out testing activities effectively; one strategy being explored recently is the adoption of a hybrid database approach. For this purpose, a hybrid algorithm is proposed to ensure the functionality and outcomes of the testing procedure. The technical processes and their impact on the current methodology help to evaluate the approach’s effectiveness in software testing, from which specific conclusions can be drawn. The findings of the research elaborate the effectiveness of the proposed algorithm for software testing and suggest that the new technique makes it easier and simpler to assess and analyze the reliability of the software. Essentially, the hybrid database approach comprises traditional and modern techniques deployed to achieve testing outcomes. Various testing methods reveal challenges associated with focusing only on traditional techniques, which is why the hybrid approach is now being developed in many areas. In light of these concepts, the paper aims to investigate the complexity and efficiency of the hybrid database approach in software testing, as well as its scope in the IT industry.

Author 1: Humma Nargis Aleem
Author 2: Mirza Mahmood Baig
Author 3: Muhammad Mubashir Khan

Keywords: Software testing; database testing; hypothetical database testing; traditional database testing; test case(s); grey box testing; software quality assurance

PDF

Paper 49: Agile Methods Selection Model: A Grounded Theory Study

Abstract: Agile methods adoption has increased in recent years because of its contribution to the success rate of project development. Nevertheless, the success rate of projects implemented using Agile methods has not completely reached its expected mark, and selecting the appropriate Agile methods is one of the reasons for this lag. Selecting the appropriate Agile methods is a challenging task because there are so many methods to select from. In addition, many organizations consider the selection of Agile methods a mammoth task. Therefore, to assist Agile team members, this study aimed to investigate how the appropriate Agile methods can be determined for different projects. In a Grounded Theory study, 23 Agile experts drawn from 19 teams across thirteen countries were interviewed, and a grounded theory of Agile methods selection was developed. Sixteen factors, grouped into five categories, were found to affect the selection of twenty Agile methods. The nature of the project (size, maturity, criticality and decomposability), development team skills (communication skills, domain knowledge, team technical skills and maturity), project constraints (cost/value/ROI, cost of change, time, scope and requirements volatility), customer involvement (collaboration, commitment and domain knowledge) and organizational culture (type of organizational culture) are the key factors that should guide Agile team members in selecting an appropriate Agile method, based on the value these factors have for different organizations and/or different projects.

Author 1: Mashal Kasem Alqudah
Author 2: Rozilawati Razali
Author 3: Musab Kasim Alqudah

Keywords: Agile methods selection; factors; model; grounded theory analysis

PDF

Paper 50: A Group Cooperative Coding Model for Dense Wireless Networks

Abstract: Node groups in dense wireless networks (WNs) often pose the problem of communication between the central node and the rest of the nodes in a group. Adaptive Network Coded Cooperation (ANCC) for wireless central-node networks adapts precisely to extensive and dense WNs; at this level, the random linear network coding (RLNC) scheme together with Low-Density Parity-Check (LDPC) codes is used as the essential ANCC evolution. This paper proposes an effective two-phase technique and studies the influence of the randomly chosen number of coded symbols correctly received in the second phase on the bit error rate (BER). The proposed technique also focuses on the role of the dispersion of the LDPC code generator matrix. The simulation results allow the best compromise to be selected for the best BER.

Author 1: El Miloud Ar-Reyouchi
Author 2: Ahmed Lichioui
Author 3: Salma Rattal

Keywords: Dense Wireless networks; Bit Error Rate (BER); Low-Density Parity-Check (LDPC) codes

PDF

Paper 51: Entanglement Classification for a Three-qubit System using Special Unitary Groups, SU(2) and SU(4)

Abstract: Entanglement is a physical phenomenon that links a pair, or a set, of particles that correlate with each other, regardless of the distance between them. Recent research on entanglement has mostly focused on measurement and classification in multiqubit systems. Classification of two qubits only distinguishes the quantum state as either separable or entangled, and it can be done by measurement. In a three-qubit system, classification becomes more complex because of the structure of the three qubits itself. Measurement alone is not sufficient because the states are divided into three types: the fully separable state, the biseparable state, and the genuinely entangled state. Therefore, classification is needed to distinguish the types of states in the three-qubit system. This study aims to classify the entanglement of three-qubit pure states using a combination model of the special unitary groups SU(2) and SU(4), by changing the angle of selected parameters in SU(4) and acting on a separable pure state. The matrix representing SU(2) is a 2×2 matrix, while the matrix for SU(4) is a 4×4 matrix; hence, their combination is represented by an 8×8 matrix. The classification uses the von Neumann entropy and the three-tangle measure, respectively, to determine the class. The results indicate that three-qubit pure states have been successfully classified into different classes, namely A-B-C, A-BC, C-AB, GHZ, and W, with A-B-C being a fully separable state, A-BC and C-AB biseparable states, and GHZ and W genuinely entangled states. The results show that this model can change separable pure states into other entanglement classes after the transformation is applied.
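To illustrate the entropy-based part of such a classification, the sketch below computes the von Neumann entropy of one qubit's reduced density matrix for a GHZ example state; the state is an assumed example, not one of the SU(2)×SU(4)-generated states from the paper, and the three-tangle is not computed here.

```python
# von Neumann entropy of qubit A in a three-qubit pure state.
import numpy as np

# |GHZ> = (|000> + |111>)/sqrt(2) in the 8-dimensional computational basis
psi = np.zeros(8, dtype=complex)
psi[0] = psi[7] = 1 / np.sqrt(2)

# Reduced density matrix of qubit A: reshape |psi> to 2 x 4 and trace out BC
M = psi.reshape(2, 4)
rho_A = M @ M.conj().T

evals = np.linalg.eigvalsh(rho_A)
evals = evals[evals > 1e-12]
S = -np.sum(evals * np.log2(evals))
print("von Neumann entropy of qubit A:", S)   # 1.0 -> qubit A is entangled with BC
```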

Author 1: Siti Munirah Mohd
Author 2: Bahari Idrus
Author 3: Hishamuddin Zainuddin
Author 4: Muriati Mukhtar

Keywords: Quantum entanglement; multiqubit entanglement; entanglement classification; special unitary group; three-qubit system; quantum information

PDF

Paper 52: Proposal Models for Personalization of e-Learning based on Flow Theory and Artificial Intelligence

Abstract: This paper presents a comparison of the results of two models for the personalization of learning resource sequences in a Massive Open Online Course (MOOC). The compared models are very similar and differ only in the way they recommend learning resource sequences to each participant of the MOOC. In the first model, Case-Based Reasoning (CBR) and Euclidean distance are used to recommend learning resource sequences that were successful in the past, while in the second model, the Q-Learning algorithm from Reinforcement Learning is used to recommend optimal learning resource sequences. The design of the learning resources is based on flow theory, considering dimensions such as the student’s knowledge level versus the complexity level of the learning resource, with the aim of avoiding problems of anxiety or boredom during the learning process of the MOOC.
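A hedged sketch of the tabular Q-learning update used in the second model; the states, actions, and reward function here are hypothetical placeholders that merely echo the flow-theory idea of matching resource complexity to knowledge level.

```python
# Tabular Q-learning sketch for sequencing learning resources.
import numpy as np

n_states, n_actions = 5, 4            # e.g., knowledge levels x resource choices
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2

def step(state, action):
    # Placeholder environment: reward is higher when resource complexity (action)
    # matches the learner's knowledge level (state), echoing flow theory.
    reward = 1.0 - abs(state / (n_states - 1) - action / (n_actions - 1))
    next_state = min(state + 1, n_states - 1)
    return next_state, reward

rng = np.random.default_rng(0)
for episode in range(500):
    s = 0
    for _ in range(n_states):
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
        s2, r = step(s, a)
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])   # Q-learning update
        s = s2

print("greedy resource sequence:", [int(Q[s].argmax()) for s in range(n_states)])
```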

Author 1: Anibal Flores
Author 2: Luis Alfaro
Author 3: José Herrera
Author 4: Edward Hinojosa

Keywords: Massive Online Open Course; MOOC; e-learning; flow-theory; learning resource sequence; case based reasoning; reinforcement learning; q-learning

PDF

Paper 53: Evaluation of LoRa-based Air Pollution Monitoring System

Abstract: Air pollution is a threat to human health and the environment. Pollution is caused by harmful gases emitted from car exhausts, factories, forest fires and other sources. Carbon monoxide, nitrogen oxides and carbon dioxide are the main elements of air pollution. Serious air pollution may harm our health; thus, a real-time air pollution system to measure existing pollution is needed to classify the pollution level so that appropriate actions can be taken. In a high-density area, a large number of sensor nodes are deployed to cover such places and to allow long-range communication between the nodes and a gateway. This paper presents a real-time, long-range air pollution monitoring system for indoor and outdoor environments. The system implements a wireless sensor network using LoRa technology for data communication between all nodes and sensors. The system consists of three nodes distributed within a 900 m distance from the gateway for measuring the concentration of carbon monoxide, carbon dioxide and nitrogen oxide. Experimental results show the system is reliable in both indoor and outdoor applications. Coverage of up to 900 m was achieved, and readings can be displayed through a web-based system. The experiments with LoRa transmission have shown that LoRa technology is very suitable for air pollution monitoring, especially for long-range transmission, compared to other wireless transmission techniques.

Author 1: Nael Abd Alfatah Husein
Author 2: Abdul Hadi Abd Rahman
Author 3: Dahlila Putri Dahnil

Keywords: Air pollution; carbon monoxide; wireless sensor network; LoRa; communication; gateway; transmission

PDF

Paper 54: New Quintupling Point Arithmetic 5P Formulas for Lopez-Dahab Coordinate over Binary Elliptic Curve Cryptography

Abstract: In Elliptic Curve Cryptography (ECC), scalar multiplication involves three computational levels: scalar arithmetic, point arithmetic and field arithmetic. To achieve efficient ECC performance, precomputed points help realize faster computation by removing the need to repeat the addition process every time. This paper introduces new quintupling point (5P) formulas which can be precomputed once and reused at the scalar multiplication level. We consider mixed addition in Affine and Lopez-Dahab coordinates, since the computation cost of mixed addition is better than that of traditional addition in Lopez-Dahab coordinates over a binary curve. Two formulas are introduced for point quintupling, namely Double-Double-Add and Triple-Add-Double, with costs of 17 multiplications + 12 squarings and 23 multiplications + 13 squarings, respectively. Both formulas are proven to yield valid points. The new quintupling point can be used with different scalar multiplication methods.
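As a toy illustration of the two composition orders behind these names (not the Lopez-Dahab binary-field formulas derived in the paper), the sketch below computes 5P over a small prime-field curve in affine coordinates as 2(2P)+P and as (2P+P)+2P and checks that both agree; the curve and base point are arbitrary examples.

```python
# Toy prime-field illustration of composing 5P in two orders.
p, a, b = 97, 2, 3                     # curve y^2 = x^3 + 2x + 3 over GF(97)
P = (3, 6)                             # 6^2 = 36 = 27 + 6 + 3 (mod 97), so P is on the curve

def add(Pt, Qt):
    if Pt is None: return Qt
    if Qt is None: return Pt
    (x1, y1), (x2, y2) = Pt, Qt
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                    # point at infinity
    if Pt == Qt:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # doubling slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

two_P = add(P, P)
five_via_dda = add(add(two_P, two_P), P)       # double, double, add
five_via_tad = add(add(two_P, P), two_P)       # triple (2P+P), then add 2P
print(five_via_dda, five_via_tad, five_via_dda == five_via_tad)
```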

Author 1: Waleed K AbdulRaheem
Author 2: Sharifah Bte Md Yasin
Author 3: Nur Izura Binti Udzir
Author 4: Muhammad Rezal bin Kamel Ariffin

Keywords: Elliptic Curve Cryptosystem (ECC); scalar multiplication algorithm; point arithmetic; point quintupling; Lopez-Dahab (LD); binary curve

PDF

Paper 55: YAWARweb: Pilot Study about the usage of a Web Service to Raise Awareness of Blood Donation Campaigns on University Campuses in Lima, Peru

Abstract: This document presents a preliminary study of a pilot deployment of a web service. The service is used as a means to raise awareness on university campuses prior to blood donation campaigns and to measure its effect on subsequent donor enrollment. To measure the level of awareness, a score ranging from zero to four inclusive was defined and quantified before and after the information was provided, allowing evaluation of the score change influenced by the received information. Another important metric was the contrast in community participation between the blood donation campaigns of 12 June 2018 and June 2017, during which 41 and 25 blood units were collected following the new approach and the traditional way, respectively. This variation represents an increase of 64% with respect to the campaign carried out in 2017 by INSN-SB, where the only difference was the use of the YAWARweb application. Moreover, in 2018 there were 36 people interested in donating who could not do so because of insufficient hemoglobin, narrow veins, and other causes. The goal of this research is to evaluate the use of our survey, delivered through a web service, as a tool to raise awareness on university campuses prior to blood donation campaigns. The survey provides participants with information about the benefits of blood donation, thus creating an incentive to participate in the campaigns and increasing the number of participants. Our group keeps working on preventive health and on changing the picture of blood donation, leveraged by technology development. The document starts with a general summary of the situation of blood donation in Peru and then analyzes the population where the tool is applied. It then proceeds to the methodology of implementation of YAWARweb and finally presents the results of the use of the web application in the community as a method of raising awareness.

Author 1: Alva Mantari Alicia
Author 2: Lipa Cueva Alonso
Author 3: Trinidad Quiñonez Oscar
Author 4: Brian Meneses-Claudio
Author 5: Zamora Benavente Isabel
Author 6: Arias Guzmán Belinda
Author 7: Delgado-Rivera Gerson
Author 8: Roman-Gonzalez Avid

Keywords: Application; survey; blood donation; donor benefits

PDF

Paper 56: Transforming Service Delivery with TOGAF and Archimate in a Government Agency in Peru

Abstract: The application of The Open Group Architecture Framework (TOGAF) and ArchiMate to transform citizen service delivery by the Ministry of Labor and Employment Promotion of Peru is presented. The enterprise architecture development followed the phases of the TOGAF Architecture Development Method (ADM): Architecture Vision and the Business, Data, Application and Technology Architecture definitions, to determine the source architecture, the target architecture, and the gaps to be closed to meet the target architecture requirements. The strategic motivations, active structure, passive structure, behavior and different viewpoint models in the business, data, application and technology domains were constructed with the ArchiMate descriptive language. The viewpoints achieved by combining these open standards allowed the identification of fragmented and isolated business services, as well as duplicated data and duplicated application functions. A proposal for integrated and transversal services emerges as a result of the applied enterprise architecture approach. Design science is applied to obtain knowledge from the generated artifacts. The knowledge generated from this application can be useful for new initiatives to improve the delivery of services to citizens in the Peruvian government.

Author 1: Jorge Valenzuela Posadas

Keywords: Enterprise architecture; business architecture; digital government; The Open Group Architecture Framework (TOGAF); archimate; design science

PDF

Paper 57: Let’s Code: A Kid-friendly Interactive Application Designed to Teach Arabic-speaking Children Text-based Programming

Abstract: Programming is the cornerstone for the development of all of the technologies we encounter in our daily lives. It also plays an important role in enhancing creativity, problem-solving, and logical thinking. Due to the importance of programming in combination with the shortage of Arabic content that aims to teach children programming, we decided to develop Let’s Code, an interactive mobile-based application designed for Arabic-speaking children from 8 to 12 years old. The application focuses on the basics of programming such as data types, variables, and control structures using the Python programming language through a simple, attractive, and age-appropriate design. The application presents its users with an interesting storyline that involves a trip to space with “Labeeb”, a robot character designed to explain programming concepts to the child throughout the trip. Each planet represents a level in the application and introduces a programming concept through a set of lessons and exercises. The application can be used by educational institutions and parents to teach programming and will provide an opportunity through which Arabic-speaking children can keep up with the development and dissemination of technology.

Author 1: Tahani Almanie
Author 2: Shorog Alqahtani
Author 3: Albatoul Almuhanna
Author 4: Shatha Almokali
Author 5: Shaima Guediri
Author 6: Reem Alsofayan

Keywords: Edutainment; mobile application; interactive application; Arabic; children; programming; coding for kids; Python

PDF

Paper 58: Boosted Constrained K-Means Algorithm for Social Networks Circles Analysis

Abstract: The volume of information generated by a huge number of social network users is increasing every day. Social network analysis has gained intensive attention in the data mining research community for identifying circles of users based on the characteristics of individual profiles or the structure of the network. In this paper, we propose using the boosting principle to find circles in social networks. The constrained k-means clustering method is used as a weak learner within the boosting framework. This method generates a constrained clustering represented by a kernel matrix according to the priorities of the pairwise constraints. The experimental results show that the proposed boosting-based algorithm for social network analysis improves clustering performance and outperforms the state of the art.

Author 1: Intisar M Iswed
Author 2: Yasser F. Hassan
Author 3: Ashraf S. Elsayed

Keywords: Constrained clustering; boosting; social networks; k-means; kernel matrix

PDF

Paper 59: Collaborative Integrated Model in Agile Software Development (MDSIC/MDSIC–M)-Case Study and Practical Advice

Abstract: The rapid increase in mobile device users, driven by wider and easier Internet access, has triggered the development of mobile applications (apps) and web applications. Therefore, improvement and innovation have become a top priority for businesses and consumer relations. The functional quality and interface aspects of applications (software) drive companies to succeed in the competitive mobile apps market. This paper introduces an agile software development methodology, denominated MDSIC and MDSIC-M, focused on the rapid application development required by small and medium software enterprises (SMEs), which results in better quality and competitiveness. MDSIC and MDSIC-M propose several levels with best practices that should be followed in software development projects. This article also aims to show matching indicators and results of MDSIC and MDSIC-M implementations in software projects, by assessing the parameters needed to generate quality software and thus align technology with the goals of the organizations.

Author 1: José L Cendejas Valdez
Author 2: Heberto Ferreira Medina
Author 3: Gustavo A. Vanegas Contreras
Author 4: Griselda Cortes Morales
Author 5: Alfonso Hiram Ginori González

Keywords: Agile methodology; software development (web-mobile); MDSIC/MDSIC-M; quality assurance

PDF

Paper 60: Low Power and High Reliable Triple Modular Redundancy Latch for Single and Multi-node Upset Mitigation

Abstract: CMOS-based circuits are more susceptible to the radiation environment as the critical charge (Qcrit) decreases with technology scaling. A single ionizing radiation particle is more likely to upset the sensitive nodes of the circuit and cause a Single Event Upset (SEU). Consequently, hardening latches against transient faults at control inputs, due to either single or multiple node upsets, is increasingly important. This paper proposes a Fully Robust Triple Modular Redundancy (FRTMR) latch. In the FRTMR latch, a novel majority voter circuit is proposed with a minimum number of sensitive nodes, making it highly immune to single- and multi-node upsets. The proposed latch is implemented in a CMOS 45 nm process and simulated in the Cadence Spectre environment. Results demonstrate that the proposed latch achieves 17.83% lower power and 13.88% lower area compared to the existing Triple Modular Redundancy (TMR) latch. The current induced by transient faults at various sensitive nodes is modeled with a double-exponential current source for circuit simulation, with a minimum threshold current value of 40 µA.

Author 1: S Satheesh Kumar
Author 2: S Kumaravel

Keywords: Multiple Event Transient (MET); Single Event Upset (SEU); Single Event Transient (SET); Radiation hardening; Reliability; Transient fault; Triple Modular Redundancy (TMR)

PDF

Paper 61: Efficient Algorithm for Maximal Clique Size Evaluation

Abstract: A large network dataset is considered for the computation of maximal clique size (MC). Additionally, its link with popular centrality metrics, used to decrease uncertainty and complexity and to find influential nodes in a network, has been investigated. Previous studies focus on centrality metrics such as degree centrality (DC), closeness centrality (CC), betweenness centrality (BC) and eigenvector centrality (EVC) and compare them with maximal clique size; in this study, however, the Katz centrality measure is also considered and shows a fairly robust relation with maximal clique size. Secondly, the maximal clique size algorithm is revised for network analysis to avoid computational complexity. The association between MC and the five centrality metrics has been evaluated through recognized methods, namely Pearson’s correlation coefficient (PCC), Spearman’s correlation coefficient (SCC) and Kendall’s correlation coefficient (KCC). A strong association between them is observed with all three correlation coefficient measures.
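A hedged sketch of the comparison the abstract describes, run on a small example graph rather than the paper's dataset: per-node maximal clique size against five centralities, correlated with Pearson, Spearman and Kendall coefficients.

```python
# Maximal clique size per node vs. five centrality metrics on an example graph.
import networkx as nx
from scipy.stats import pearsonr, spearmanr, kendalltau

G = nx.karate_club_graph()
cliques = list(nx.find_cliques(G))
mc = [max(len(c) for c in cliques if n in c) for n in G.nodes()]   # maximal clique size per node

centralities = {
    "DC":   nx.degree_centrality(G),
    "CC":   nx.closeness_centrality(G),
    "BC":   nx.betweenness_centrality(G),
    "EVC":  nx.eigenvector_centrality(G, max_iter=1000),
    "Katz": nx.katz_centrality(G, alpha=0.1),
}
for name, cent in centralities.items():
    values = [cent[n] for n in G.nodes()]
    print(name,
          f"PCC={pearsonr(mc, values)[0]:.2f}",
          f"SCC={spearmanr(mc, values)[0]:.2f}",
          f"KCC={kendalltau(mc, values)[0]:.2f}")
```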

Author 1: Ubaida Fatima
Author 2: Saman Hina

Keywords: Centrality measures; network analysis; maximal clique size

PDF

Paper 62: A Conceptual Smart City Framework for Future Industrial City in Indonesia

Abstract: In Indonesia, the growth of big cities and industrial cities poses many challenges. To face these challenges, policy makers can apply the concept of smart cities. This paper aims to analyze studies that discuss prospective industrial city planning from a smart city perspective. The research draws on information from studies, models, frameworks, and tools that discuss IoT, smart cities, and industrial cities, and provides the latest insight into smart city frameworks for industrial cities. The study identifies the pillars that form a smart city for industrial cities. The framework can also be used by governments, such as that of Kulonprogo District in the Special Region of Yogyakarta, Indonesia, in preparing to transform into a smart industrial city. The latest use of information technology in this concept, together with prioritized implementation steps, is recommended.

Author 1: Julius Galih Prima Negara
Author 2: Andi W. R. Emanuel

Keywords: Smart city; industrial city; smart industrial city; framework; Kulonprogo District

PDF

Paper 63: A Tool for C++ Header Generation

Abstract: This paper presents a novel approach in the field of C++ development for increasing performance by reducing cognitive overhead and complexity, which results in lower costs. C++ code is split into header and cpp files, and this split induces code redundancy. In addition, there are commonly used features for classes in C++ that are not supported by recent compilers. The developer must maintain two different files for a single piece of content and implement unsupported features by hand. This leads to unnecessary cognitive overhead and complex sources, and the result is low development performance and high development cost. Our approach utilizes an enhanced syntax inside cpp files. It allows header file generation and therefore obsoletes the need to maintain a header file. It also enables the generation of features/methods for classes. It aims to decrease cognitive overhead and complexity, so developers can focus on more sophisticated tasks. This will lead to increased performance and lower costs.

Author 1: Patrick Hock
Author 2: Koichi Nakayama
Author 3: Kohei Arai

Keywords: Development; C++; header file generation; feature generation

PDF

Paper 64: Heuristic Evaluation of Serious Game Application for Slow-reading Students

Abstract: The findings of preliminary studies show that conventional approaches are still relevant, but students show weak to moderate interest and quickly lose focus with them, compared to when technology approaches such as serious games are used, especially for slow-reading students (SRS). Most teachers use interventions that are not specifically designed to help SRS; they usually use teaching aids below the literacy level of the SRS. Therefore, an easy and user-friendly game application called “Mari Membaca” or M2M was developed. The objective is to make sure the application is free from design and interface problems by applying expert-based usability evaluation techniques such as heuristic evaluation. This paper reports the experimental heuristic evaluation of M2M for SRS among expert evaluators, including remedial teachers and game developers. The study adopted the ten usability heuristics and the seven brain-compatible instructional phases of brain-based learning in the questionnaire. Overall, 14 of the 17 items obtained an above-average mean score (3.41-5.00), while one domain was rated neutral (2.61-3.40). Several comments and feedback from the experts were essential for further improvement of the game application to ensure it meets user requirements and expectations.

Author 1: Saffa Raihan Zainal Abidin
Author 2: Siti Fadzilah Mat Noor
Author 3: Noraidah Sahari Ashaari

Keywords: Serious game; brain-based learning; heuristic evaluation; literacy skills; slow-reading students

PDF

Paper 65: Seamless Connectivity for Adaptive Multimedia Provisioning over P2P-enabled IP Multimedia Subsystem

Abstract: The IP Multimedia Subsystem (IMS) has been upgraded to support peer-to-peer content distribution services. The peer heterogeneity constraint imposes the challenge of guaranteeing a certain level of video QoS for the different peers, and handover poses a challenge for maintaining quality of service (QoS) in IMS. In this article, we have extended a Scalable Video Coding (SVC) peer-to-peer streaming adaptation model. Our extension adds another parameter to this adaptation scheme, the IP address of the peers, in order to manage multiple network accesses for a peer when the user changes the access type due to a handover or a loss of connectivity. The model is used to adapt the video quality to the static resources of the peers in order to avoid long start-up times. To compare the results obtained when changing the network access type, we performed two simulation scenarios: one with multiple peers that remain connected to the network until the end of the video download, and the other with a change of a peer’s address during the download of the video sequence. We quantified streaming performance using two evaluation metrics, the peak signal-to-noise ratio (PSNR) and the video quality metric (VQM), and also extracted packet loss rate (PLR) values. Our results show that the model achieves better quality adaptation according to the peers’ network resources, in terms of the bandwidth available in the network and the capabilities of the user devices (CPU, RAM, battery autonomy), and also allows continuity of service in the network by ensuring that the list of peers is updated after each change. The results show clear quality adaptation with heterogeneous terminals.
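As a small aside on the evaluation, the sketch below shows the generic PSNR definition used as one of the quality metrics; the two random frames are placeholders standing in for reference and decoded video, not the authors' test sequences.

```python
# Generic PSNR computation between a reference frame and a decoded frame.
import numpy as np

def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(144, 176), dtype=np.uint8)      # hypothetical QCIF frame
dec = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(ref, dec):.2f} dB")
```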

Author 1: Adnane Ghani
Author 2: El Hassan Ibn Elhaj
Author 3: Ahmed Hammouch
Author 4: Abdelaali Chaoub

Keywords: Next generation networks; NS2; quality adaptation; scalable video coding; peer to peer

PDF

Paper 66: Cas-GANs: An Approach of Dialogue Policy Learning based on GAN and RL Techniques

Abstract: Dialogue management systems are commonly applied in daily life, for tasks such as online shopping, hotel booking, and ride booking. An efficient dialogue management policy helps systems respond to the user in an effective way, and policy learning is a complex task in building a dialogue system. Different approaches have been proposed in the last decade to build goal-oriented dialogue agents and train systems with an efficient policy. Generative adversarial networks (GANs) have been used for dialogue generation in previous works to build dialogue agents by selecting the optimal policy learning. Efficient dialogue policy learning aims to improve the fluency and diversity of generated dialogues. Reinforcement learning (RL) algorithms are used to optimize the policies because the sequences are discrete. In this study, we propose a new technique called Cascade Generative Adversarial Network (Cas-GAN), which is a combination of GAN and RL for dialogue generation. Cas-GAN can model the relations between dialogues (sentences) by using Graph Convolutional Networks (GCN). The graph consists of different high-level and low-level nodes representing the vertices and edges of the graph. We then use the maximum log-likelihood (MLL) approach to train the parameters and choose the best nodes. The experimental results were compared with HRL and RL agents, and we obtained state-of-the-art results.

Author 1: Muhammad Nabeel
Author 2: Adnan Riaz
Author 3: Wang Zhenyu

Keywords: Generative Adversarial Networks (GANs); Graph Convolutional Network (GCN); Reinforcement Learning (RL); Dialogue policy learning; Maximum Log-Likelihood (MLL)

PDF

Paper 67: Prioritization of Software Functional Requirements: Spanning Tree based Approach

Abstract: Requirements prioritization plays a significant role in the effective implementation of requirements. Prioritizing requirements is not an easy process, particularly when the number of requirements is large. Current prioritization techniques for functional requirements face limitations because they rely on stakeholder responses instead of prioritizing requirements on the basis of internal dependencies among requirements. Moreover, there is a need to classify requirements on the basis of their importance, i.e., how much they are needed by, or dependent on, other requirements. In the proposed approach, requirements are first represented with spanning trees and then prioritized. The suggested spanning-tree-based approach is evaluated on the requirements of the ODOO ERP, with requirements assigned to four developers. Time estimates with and without prioritization are calculated, and the difference between them shows the significance of prioritizing functional requirements.
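A hedged sketch of the underlying idea of dependency-driven prioritization; the requirement IDs and edges are hypothetical (not the ODOO ERP set), and ranking by the number of dependent requirements is one simple ordering criterion, not necessarily the exact rule used in the paper.

```python
# Represent requirement dependencies as a tree and rank by how many
# other requirements transitively depend on each one.
import networkx as nx

# Edge R1 -> R2 means "R2 depends on R1" (R1 should be implemented first).
edges = [("R1", "R2"), ("R1", "R3"), ("R2", "R4"), ("R3", "R5"), ("R3", "R6")]
T = nx.DiGraph(edges)

priority = sorted(T.nodes(),
                  key=lambda r: len(nx.descendants(T, r)),   # requirements that need r
                  reverse=True)
print("implementation order:", priority)   # e.g. R1 first, leaf requirements last
```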

Author 1: Muhammad Yaseen
Author 2: Aida Mustapha
Author 3: Noraini Ibrahim

Keywords: Requirements prioritization; functional requirements; spanning tree

PDF

Paper 68: A Behavioral Study of Task Scheduling Algorithms in Cloud Computing

Abstract: All the services offered by cloud computing are bundled into one service known as IT as a Service (ITaaS), and users’ processes are executed using these services. The scheduling techniques used in the cloud computing environment execute tasks at different datacenters considering the needs of the consumers. As requirements vary from consumer to consumer, priorities also change, and jobs are executed either preemptively or non-preemptively. Tasks in cloud computing may also migrate from one datacenter to another for load balancing. This research mainly focuses on studying how the Round Robin (RR) and Throttled (TR) scheduling techniques behave for different tasks given for processing. An analysis is carried out to measure performance based on metrics such as response time and service time for different user bases and datacenters. Consumers have the option to select the service broker policy, as they are the ultimate users and payers.
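A simplified sketch of the two allocation ideas being compared: Round Robin cycles through virtual machines regardless of load, while a throttled allocator only hands a task to a VM that is currently idle. The VM and task counts are hypothetical, and this is a toy model rather than the simulator configuration used in the study.

```python
# Toy comparison of Round Robin vs. throttled task allocation.
from itertools import cycle

vms = ["VM0", "VM1", "VM2"]
tasks = [f"task{i}" for i in range(7)]

# Round Robin: next VM in the cycle, regardless of its load
rr = cycle(vms)
print("Round Robin :", [(t, next(rr)) for t in tasks])

# Throttled (simplified): only assign to an idle VM, otherwise queue the task
busy, queued, assigned = set(), [], []
for t in tasks:
    idle = [v for v in vms if v not in busy]
    if idle:
        assigned.append((t, idle[0])); busy.add(idle[0])
    else:
        queued.append(t)
print("Throttled   :", assigned, "waiting:", queued)
```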

Author 1: Mohammad Riyaz Belgaum
Author 2: Shahrulniza Musa
Author 3: M. S. Mazliham
Author 4: Muhammad Alam

Keywords: Cloud computing; load balancing; service broker policy; scheduling

PDF

Paper 69: Assessing Assistive Learning Technologies with Experimental Design

Abstract: Assistive learning technologies are generally computer-based instruments focused on supporting individuals with disabilities in enhancing their learning sessions with minimal intervention from parents, guardians, and helpers. Assessments using experimental research designs have frequently been utilized to evaluate their efficacy and feasibility. An experimental design is characterized by the experimental units or treatments used, the tendencies that are tested, and the way treatments are assigned to units. The experimental or treatment units need a sufficient number of representative respondents or samples. Even so, due to the limited number of sample units or respondents, such experiments are subtle yet challenging undertakings. Based on our substantial experience, this article attempts to share these valuable research lessons.

Author 1: Gede Pramudya
Author 2: Aliza Che Amran
Author 3: Muhammad Suyanto
Author 4: Siti Nur Azrreen Ruslan
Author 5: Helmi Adly Mohd Noor
Author 6: Zuraida Abal Abas

Keywords: Assistive learning technology; disabilities; experimental design; mathematical learning; serious games; autism; visual perception

PDF

Paper 70: Feature Fusion: H-ELM based Learned Features and Hand-Crafted Features for Human Activity Recognition

Abstract: Recognizing human activities is one of the main goals of human-centered intelligent systems. Smartphone sensors produce a continuous sequence of observations which are noisy, unstructured and high dimensional. Therefore, efficient features have to be extracted in order to perform accurate classification. This paper proposes a combination of Hierarchical and Kernel Extreme Learning Machine (HK-ELM) methods to learn features and map them to specific classes in a short time. Moreover, a feature fusion approach is proposed to combine H-ELM-based learned features with hand-crafted ones. Our proposed method was found to outperform the state of the art in terms of accuracy and training time, achieving an accuracy of 97.62% with a training time of 3.4 seconds on a standard Central Processing Unit (CPU).
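For context, the sketch below shows the single-layer Extreme Learning Machine that hierarchical and kernel ELMs build upon: random, untrained hidden weights with output weights solved in closed form via a pseudo-inverse. The data here are random placeholders, not the activity-recognition features from the paper.

```python
# Minimal single-layer ELM: random hidden layer, closed-form output weights.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                 # e.g., accelerometer-derived features
y = (X[:, 0] + X[:, 1] > 0).astype(int)       # toy activity labels
T = np.eye(2)[y]                              # one-hot targets

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                        # hidden-layer activations
beta = np.linalg.pinv(H) @ T                  # output weights via pseudo-inverse

pred = (H @ beta).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```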

Author 1: Nouar AlDahoul
Author 2: Rini Akmeliawati
Author 3: Zaw Zaw Htike

Keywords: Hierarchical extreme learning machine; kernel extreme learning machine; deep learning; feature learning; human activity recognition; feature fusion

PDF

Paper 71: Comparison Shopping Engines

Abstract: Since the stimulation of both feelings of need and temptation has become excessive with the spread of Internet advertising, e-consumers have begun to feel increasingly lost and overwhelmed by offers in a purchasing cycle whose process is mostly unstructured, unguided, and unassisted, or in other words non-user-friendly. As a result, the e-consumer displays a confused and suspicious attitude and turns to comparison shopping engines (CSEs) to save time and identify the offer that best matches his or her search request. This article therefore investigates comparison shopping engines to determine whether they are up to the task of satisfying the needs of the e-consumer. The study adopts an exploratory approach to the history of online shopping engines, their operating modes, categories, and business plans, as well as how they are perceived, used and evaluated. The various shortcomings that CSEs exhibit on the side of both e-consumers and e-merchants are then identified in detail, before discussing the numerous innovations and scientific research developed on the subject.

Author 1: Ghizlane LAGHMARI
Author 2: Sanae KHALI ISSA
Author 3: M’hamed AIT KBIR

Keywords: Price comparators; shopbots; e-consumer; online consumption; shopping engines

PDF

Paper 72: Classifying Red and Healthy Eyes using Deep Learning

Abstract: The eye is one of the most vital organs of the human body; despite its small size, humans cannot see the world around them without it. The human eye is protected by a thin covering termed the conjunctiva, which protects the eye from dust particles and acts as a lubricant, preventing friction when opening and closing the eye. Broadly, there are two kinds of conjunctiva: bulbar and palpebral. The membrane covering the inner portion of the eyelids is termed the palpebral conjunctiva, and the one covering the outside portion of the eye is called the bulbar conjunctiva (the white portion of the eye). Due to the dilation of blood vessels, the white portion of the eye, also termed the sclera, becomes red in color; this condition is termed hyperemia. The study of this development is vital in the diagnosis of various pathologies, as it could be the result of trauma, injury or other eye-related diseases that need to be identified for timely treatment. An enormous number of studies have been conducted on the structure and functionality of the human eye. This paper highlights the work done so far on measuring the level of redness in the eye using various methodologies, ranging from statistical approaches to machine learning techniques, and proposes a methodology using Matlab and a convolutional neural network to automate this evaluation process.

Author 1: Sherry Verma
Author 2: Latika Singh
Author 3: Monica Chaudhry

Keywords: Bulbar conjunctiva; hyperemia; convolutional neural network

PDF

Paper 73: Development and Evaluation of Massive Open Online Course (MOOC) as a Supplementary Learning Tool: An Initial Study

Abstract: The popularity of Massive Open Online Courses (MOOCs) is prevalent among researchers and practitioners as a new paradigm of open educational resources. Since the development of this technology may entail an enormous investment, it is critical for institutions to clearly plan the process of designing, developing and evaluating MOOCs that fulfill the needs of target users while keeping the investment to a minimum. Evaluation plays a vital role in ensuring that the developed product meets users’ satisfaction. This study presents the process of developing a MOOC as a supplementary learning tool for students in higher education and its usability evaluation, which are rarely discussed in detail in the prior literature. Evaluation was done through a questionnaire whose items were adapted from the Computer System Usability Questionnaire (CSUQ). The MOOC development process in this research, which was based on the ADDIE (Analysis, Design, Development, Implementation and Evaluation) model, and the MOOC usability evaluation results enrich the existing literature on MOOCs. Overall, findings showed that users were satisfied with the developed MOOC, with most items gaining a high mean score above 4.00. When respondents were asked to comment on the strengths of the MOOC, the most prominent one turned out to be the MOOC’s ability to make students’ learning easier.

Author 1: Husna Hafiza R Azami
Author 2: Roslina Ibrahim

Keywords: MOOC; development; usability evaluation

PDF

Paper 74: Simultaneous Stream Transmission Methods for Free Viewpoint TV: A Comparative Study

Abstract: Free Viewpoint TV is a system that allows users to view natural video and control the viewpoint interactively. The main idea is that users can switch between multiple video streams to find viewpoints of their own choice. The purpose of this research is to provide fast switching between video streams so that users experience less delay while switching viewpoints. In this paper, we discuss different stream switching methods in detail, including their transmission issues, as well as various scenarios for fast stream switching that make services more interactive by minimizing delays. Quality of service is another factor which can be improved by assigning priorities to the packets. We also discuss simultaneous stream transmission methods which are based on predictions and reduced-quality streams for fast switching. Finally, we propose a prediction algorithm (linear regression) and a system model for fast viewpoint switching and evaluate simultaneous stream transmission methods for Free Viewpoint TV. The results indicate that the proposed system model improves viewpoint switching and achieves fast switching.
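A hedged sketch of the linear-regression prediction idea: fit the user's recent viewpoint indices against time and pre-fetch the stream predicted for the next instant. The viewpoint history here is a hypothetical example, not the paper's evaluation data.

```python
# Predict the next viewpoint from recent switching history with least squares.
import numpy as np

times = np.arange(6)                       # last six sampling instants
viewpoints = np.array([2, 2, 3, 3, 4, 4])  # viewpoints the user switched through

slope, intercept = np.polyfit(times, viewpoints, deg=1)   # least-squares line
next_view = int(round(slope * 6 + intercept))
print("predicted next viewpoint:", next_view)   # stream to request in advance
```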

Author 1: Mudassar Hussain
Author 2: Abdurahman Hassan A Alhazmi
Author 3: Rashid Amin
Author 4: Muhammad Almas Anjum
Author 5: Ali Tahir

Keywords: Free viewpoint TV; comparison of stream switching methods; video stream switching; fast switching; simultaneous transmissions; linear regression

PDF

Paper 75: Non-intrusive Driver Drowsiness Detection based on Face and Eye Tracking

Abstract: The rate of annual road accidents attributed to drowsy driving is significantly high. Because of this, researchers have proposed several methods aimed at detecting drivers’ drowsiness, including subjective, physiological, behavioral, vehicle-based, and hybrid methods. However, recent reports on road safety still indicate drowsy driving as a major cause of road accidents. This is plausible because current driver drowsiness detection (DDD) solutions are either intrusive or expensive, which hinders their ubiquity. This research serves to bridge this gap by providing a test-bed for achieving a non-intrusive and low-cost DDD solution. A behavioral DDD solution is proposed based on tracking the face and eye state of the driver, with the aim of making this research a starting point toward pervasive DDD. To achieve this, the National Tsing Hua University (NTHU) Computer Vision Lab’s driver drowsiness detection video dataset was utilized. Several video and image processing operations were performed on the videos to detect the drivers’ eye state. From the eye states, three important drowsiness features were extracted: percentage of eyelid closure (PERCLOS), blink frequency (BF), and maximum closure duration (MCD) of the eyes. These features were then fed as inputs into several machine learning models for drowsiness classification. Models based on the K-nearest Neighbors (KNN), Support Vector Machine (SVM), Logistic Regression, and Artificial Neural Network (ANN) machine learning algorithms were experimented with and evaluated by calculating their accuracy, sensitivity, specificity, miss rate, and false alarm rate values. Although these five metrics were evaluated, the focus was on obtaining optimal accuracies and miss rates. The results show that the best models were a KNN model with k = 31 and an ANN model that used an Adadelta optimizer with a three-hidden-layer network of 3, 27, and 9 neurons, respectively. The KNN model obtained an accuracy of 72.25% with a miss rate of 16.67%, while the ANN model obtained an accuracy of 71.61% and a miss rate of 14.44%.
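A hedged sketch of how the three named features can be derived from a per-frame eye-state sequence (1 = closed, 0 = open); the sequence and frame rate below are hypothetical and the thresholds used in the paper are not reproduced.

```python
# PERCLOS, blink frequency, and maximum closure duration from eye-state frames.
import numpy as np

fps = 30
eye_closed = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 1, 0, 0])

perclos = eye_closed.mean() * 100                       # % of frames with eyes closed

# Blink frequency: count open->closed transitions, scale to blinks per minute
transitions = np.diff(eye_closed.astype(int))
blinks = int((transitions == 1).sum())
bf = blinks / (len(eye_closed) / fps) * 60

# Maximum closure duration: longest run of consecutive closed frames, in seconds
runs, current = [], 0
for v in eye_closed:
    current = current + 1 if v else 0
    runs.append(current)
mcd = max(runs) / fps

print(f"PERCLOS={perclos:.1f}%  BF={bf:.1f} blinks/min  MCD={mcd:.2f} s")
```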

Author 1: Ameen Aliu Bamidele
Author 2: Kamilia Kamardin
Author 3: Nur Syazarin Natasha Abd Aziz
Author 4: Suriani Mohd Sam
Author 5: Irfanuddin Shafi Ahmed
Author 6: Azizul Azizan
Author 7: Nurul Aini Bani
Author 8: Hazilah Mad Kaidi

Keywords: Driver Drowsiness Detection (DDD); face tracking; eye tracking; K-nearest Neighbors (KNN); Support Vector Machine (SVM); Logistic Regression; Artificial Neural Networks (ANN)

PDF

Paper 76: QoS Analysis to Optimize the Indoor Network IEEE 802.11 at UNTELS

Abstract: This paper arose from the need to improve mobility and connectivity for network users of the Universidad Nacional Tecnológica de Lima Sur and to address quality of service (QoS) problems such as signal intermittence, high latency, low received power, low coverage, equipment that does not support heavy data traffic, and exceeding the limit of connected users. An analysis is presented for optimizing IEEE 802.11 wireless QoS in indoor environments. Coverage data were collected from the different equipment based on received power measurements made with the Wi-Fi Network Analyzer software, the ideal placement for new coverage areas was mapped, and access points were located in areas with high concentrations of students, administrative and teaching staff of the university through simulations of equipment coverage made with NETSPOT software. The results show that the current design of the wireless network suffers interference from equipment configured on the same and adjacent transmission channels, as well as insufficient radiated power coverage for indoor environments, due to the poor placement and choice of access point models. When these are replaced with access points that support a high density of simultaneously connected users per device and are suitable for indoor environments, with technologies such as MU-MIMO and ARM, the results are favorable and optimized.

Author 1: Jhon S Acuña-Aroni
Author 2: Brayan W. Alvarado-Gomez
Author 3: Avid Roman-Gonzalez

Keywords: Access point; QoS; coverage; indoor; software

PDF

Paper 77: Graduation Certificate Verification Model: A Preliminary Study

Abstract: Graduation certificates issued by universities and other educational institutions are among the most important documents for a graduate. A certificate is proof of a graduate’s qualifications and can be used to advance in one’s career and life. However, due to advances in software, printing and photocopying technologies, forging these certificates has become easy, and forgeries can be as good as the original, making them difficult to detect. Several universities, educational institutions and businesses have started to dedicate resources to verifying certificates; however, this is usually a tedious and quite costly process, and there is no clear model that those institutions can adopt to minimize cost and speed up the process. Many techniques have been proposed for paper-based document verification, and this paper analyzes and expatiates on the issues with those techniques. Most verification techniques require a change in the process of certificate generation, either by changing the template, paper or printers, adding hardware, or adding extra information. This change may mean that the university or verifier needs specific knowledge to execute and run the proposed technique, and older certificates may not work with the newly introduced techniques. In addition, some proposed techniques require a change that is not always easy or cheap, such as creating a third party to verify certificates.

Author 1: Omar S Sale
Author 2: Osman Ghazali
Author 3: Qusay Al Maatouk

Keywords: Graduation certificate verification; graduation certificate authentication; graduation certificate forgery

PDF

Paper 78: New Approach based on Machine Learning for Short-Term Mortality Prediction in Neonatal Intensive Care Unit

Abstract: Mortality remains one of the most important outcomes to predict in Intensive Care Units (ICUs). The sooner mortality is predicted, the better the critical decisions doctors can make based on a patient’s illness severity. In this paper, a new approach based on Machine Learning (ML) techniques for short-term mortality prediction in the Neonatal Intensive Care Unit (NICU) is proposed. The approach consists of several steps. First, relevant features are selected from the data available upon neonates’ admission and from the time-series variables collected within the first two hours of stay in the NICU, drawn from the Medical Information Mart for Intensive Care III (MIMIC-III). Next, several classifiers were tested for mortality prediction: Linear Discriminant Analysis (LDA), K-Nearest Neighbors (KNN), Classification and Regression Trees (CART), Logistic Regression (LR), Support Vector Machine (SVM), Naïve Bayes (NB) and Random Forest (RF). The experimental results showed that LDA was the best-performing classifier, with an accuracy of 0.947 and an AUROC of 0.97 using 31 features. The third step of the approach is mortality-time prediction using the Galaxy-Random Forest method, which achieves an F-score of 0.871. The proposed approach compares favorably in terms of time, accuracy and AUROC with existing scoring systems and ML techniques. It is the first work to predict neonatal mortality based on ML techniques and time-series data after only two hours of admission to the NICU.
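As an informal illustration of the classifier-comparison step, the sketch below (not the authors' code; the data is synthetic and merely stands in for the 31 selected MIMIC-III features) runs the seven classifiers named in the abstract under cross-validated AUROC with scikit-learn.

    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.naive_bayes import GaussianNB
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.datasets import make_classification

    # Placeholder data standing in for the 31 selected features.
    X, y = make_classification(n_samples=500, n_features=31, random_state=0)

    models = {
        "LDA": LinearDiscriminantAnalysis(),
        "KNN": KNeighborsClassifier(),
        "CART": DecisionTreeClassifier(random_state=0),
        "LR": LogisticRegression(max_iter=1000),
        "SVM": SVC(),
        "NB": GaussianNB(),
        "RF": RandomForestClassifier(random_state=0),
    }

    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUROC = {scores.mean():.3f}")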

Author 1: Zaineb Kefi
Author 2: Kamel Aloui
Author 3: Mohamed Saber Naceur

Keywords: Mortality prediction; neonates; Intensive Care Units; machine learning

PDF

Paper 79: Unmanned Ground Vehicle with Stereoscopic Vision for a Safe Autonomous Exploration

Abstract: Many current cars include systems that assist the driver, and the trend is for these systems to become increasingly efficient and to operate without driver intervention. Computer vision is relevant in this sector because of what it contributes: through image-processing algorithms it enables object detection, pedestrian detection, traffic-sign detection, lane tracking and parking assistance. In this work we present an Unmanned Ground Vehicle system based on computer vision, specifically stereoscopic vision. Through a sensor we obtain a disparity map that allows us to quantify the depth of each point in the captured image. The system captures an image of its environment and, through the internal processing of the sensor, returns a disparity map that is then processed by an algorithm allowing the system to navigate a region of the space in which it is positioned, for autonomous exploration.
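For readers unfamiliar with disparity maps, the following sketch is an illustration under assumptions rather than the authors' pipeline (their sensor returns the disparity map directly): it computes a disparity map from a rectified stereo pair with OpenCV's block matcher and converts it to depth with the standard pinhole relation Z = f·B/d. The file names, focal length and baseline are placeholders.

    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Block-matching stereo: numDisparities must be a multiple of 16.
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0

    # Depth (same units as the baseline) from Z = f * B / d,
    # guarding against zero or negative disparity.
    focal_px, baseline_m = 700.0, 0.12      # assumed calibration values
    depth = np.where(disparity > 0, focal_px * baseline_m / disparity, 0)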

Author 1: Jesús Jaime Moreno-Escobar
Author 2: Oswaldo Morales-Matamoros
Author 3: Ricardo Tejeida-Padilla

Keywords: Unmanned ground vehicle; stereoscopic vision; computer vision; autonomous exploration

PDF

Paper 80: Privacy Concerns in Online Social Networks: A Users’ Perspective

Abstract: Social networking has elevated human life to new heights of interaction, response and content sharing, and has been offering state-of-the-art facilities to its users for a long time. Although the systems have matured considerably over time, alongside their benefits there remain multiple user concerns regarding privacy and information security. The multidimensional threat spectrum facing the Internet also applies to social networking tools, and a great deal of work is being done to understand privacy concerns in social networks. In this context, a survey of privacy concerns in online social networks is conducted. Risks, privacy issues and threats that have arisen in recent years are highlighted, with a focus on analyzing attackers’ targets, their methods of attack, and the measures taken to counter or manage these threats. A social network depends on the user, the social network site or application, and the communication medium provider, i.e. the Internet facility. Existing research contains domain-specific work on privacy issues in social networks; however, a comprehensive study covering the overall infrastructure of online social networks is missing. The development of a taxonomy of threats and a categorization of frauds relevant to social networks is an important contribution of this survey. After completing a comprehensive research survey on privacy concerns in online social networks, a set of privacy guidelines is provided and open research challenges are highlighted.

Author 1: Ahmad Ali
Author 2: Ahmad Kamran Malik
Author 3: Mansoor Ahmed
Author 4: Basit Raza
Author 5: Muhammad Ilyas

Keywords: Online social networks; information security; privacy; social networking; attribute disclosure

PDF

Paper 81: New Criteria for Comparing Global Stochastic Derivative-Free Optimization Algorithms

Abstract: In many situations, the function that best models a problem or data set may have a derivative that is difficult or impossible to find, which complicates obtaining information about the function’s optimal values. Numerical methods have therefore been developed to find these important values without directly involving the derivative, making the representation and interpretation of such algorithms’ results important to the researchers who use them. This is the motivation for using and comparing derivative-free optimization (DFO) algorithms. The comparison methods developed in this paper were tested using three global solvers: Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Simulated Annealing (SA), on a set of 26 n-dimensional test problems of varying convexity, continuity, differentiability, separability and modality. Each solver was run 100 times per problem at 2, 20, 50 and 100 dimensions. The formulation of each algorithm comes from the MATLAB Optimization Toolbox, without modification. New criteria for comparing DFO solver performance are introduced in terms of Speed, Accuracy and Efficiency, taken at different levels of precision and dimensionality. The numerical results for these benchmark problems are analyzed using these methods.
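As a rough illustration of this kind of benchmarking, the sketch below uses SciPy's stochastic derivative-free solvers (differential_evolution and dual_annealing) in place of the paper's MATLAB GA/PSO/SA solvers, recording an accuracy-style hit rate and average runtime on the Rastrigin test function; the tolerance, run count and dimension are arbitrary assumptions, not the paper's settings.

    import time
    import numpy as np
    from scipy.optimize import differential_evolution, dual_annealing

    def rastrigin(x):
        """Multimodal benchmark whose global minimum is 0 at the origin."""
        x = np.asarray(x)
        return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    dim, runs, tol = 2, 20, 1e-3            # assumed settings, kept small
    bounds = [(-5.12, 5.12)] * dim
    solvers = {
        "differential_evolution": lambda s: differential_evolution(rastrigin, bounds, seed=s),
        "dual_annealing": lambda s: dual_annealing(rastrigin, bounds, seed=s),
    }

    for name, solve in solvers.items():
        hits, start = 0, time.perf_counter()
        for seed in range(runs):
            if solve(seed).fun < tol:       # count runs that reach the optimum
                hits += 1
        elapsed = (time.perf_counter() - start) / runs
        print(f"{name}: hit rate {hits / runs:.2f}, {elapsed:.2f} s per run")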

Author 1: Jonathan McCart
Author 2: Ahmad Almomani

Keywords: Derivative-free optimization; algorithm comparison; test problem benchmarking

PDF

Paper 82: A Survey on Location Privacy-Preserving Mechanisms in Mobile Crowdsourcing

Abstract: Mobile Crowdsourcing (MCS) has surfaced as a rich new method for data collection and processing, driven by the booming popularity of sensor-rich mobile devices. MCS still has room for improvement, particularly in protecting workers’ private information such as location. The deployment of privacy-preserving mechanisms that insulate sensitive information and prevent attackers from obtaining it is therefore a necessity. In this paper, we discuss location privacy threats and analyze some recently proposed mechanisms that target location privacy in mobile crowdsourcing. Finally, we compare and evaluate these mechanisms against criteria that we define in this paper.

Author 1: Arwa Bashanfar
Author 2: Eman Al-Zahrani
Author 3: Maram Alutebei
Author 4: Wejdan Aljagthami
Author 5: Suhari Alshehri

Keywords: Mobile crowdsourcing; privacy; security; location privacy-preserving

PDF

Paper 83: Ionospheric Anomalies before the 2015 Deep Earthquake Doublet, Mw 7.5 and Mw 7.6, in Peru

Abstract: Two major earthquakes separated by ∼5 minutes occurred on the same fault in Peru at depths of 606.2 and 620.6 km on November 24, 2015. Using Global Ionospheric Maps (GIMs) from the Center for Orbit Determination in Europe (CODE) and a widely used statistical method, differential Vertical Total Electron Content (VTEC) maps were derived. Two positive ionospheric anomalies were clearly identified in the differential VTEC maps 2 days and 1 day prior to the earthquakes. These anomalies were located inside the earthquakes’ preparation regions as defined by the Dobrovolsky equation. In addition, because of the low-latitude nature of the seismic events, the shape of the Equatorial Ionization Anomaly (EIA) was also analyzed. A third positive disturbance was revealed between November 20 and 21, 2015. For that anomaly and the one on November 22 (2 days before the earthquakes), an enhancement of the VTEC was observed through a considerable modification of the EIA shape into a well-defined double crest with a trough. The Dst and Kp indices show that geomagnetic conditions from November 20 until the 24th were very quiet; thus, the three detected anomalies are considered precursors to the earthquake doublet. Moreover, it is suggested that the mechanism that produced the positive disturbances is air ionization through the release of radon from the Earth’s crust.
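For reference, the strain radius commonly attributed to Dobrovolsky (assumed here to be the definition of the preparation region the paper uses) depends only on the magnitude M; for the two events it gives roughly:

    \rho = 10^{0.43\,M}\ \text{km}, \qquad
    \rho(7.5) \approx 1.7 \times 10^{3}\ \text{km}, \qquad
    \rho(7.6) \approx 1.9 \times 10^{3}\ \text{km}.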

Author 1: Carlos Sotomayor-Beltran

Keywords: Ionospheric anomalies; earthquakes; total electron content

PDF

Paper 84: NetMob: A Mobile Application Development Framework with Enhanced Large Objects Access for Mobile Cloud Storage Service

Abstract: Mobile enterprise applications are primarily developed for existing backend enterprise systems, and some usage scenarios require storing large files on the mobile devices. These files range from large PDFs to media files in various formats (e.g. MPEG videos), and they need to be used offline, sometimes updated, and shared among users. The present work studied different Mobile Backend as a Service (MBaaS) platforms to understand the techniques used to handle large files, and found that many either lack this feature or do not address the performance issues of large files. In this paper we propose NetMob, a mobile synchronization platform that allows resource-limited mobile devices to access large objects from the cloud. The framework focuses on large-file handling and supports both table and object data models, which can be tuned to three consistency semantics resembling strong, causal and eventual consistency. Experimental results with representative workloads showed that NetMob can handle access to large files ranging from 100 MB up to 1 GB and is able to reduce sync time with object chunking in our experimental settings.
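A minimal sketch of the object-chunking idea follows; it is an assumption for illustration, not NetMob's implementation. Each large file is split into fixed-size chunks whose hashes form a manifest, so a subsequent sync only needs to transfer the chunks whose hashes changed.

    import hashlib

    CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB, an arbitrary choice

    def chunk_manifest(path, chunk_size=CHUNK_SIZE):
        """Return a list of (index, sha256) pairs describing the file."""
        manifest = []
        with open(path, "rb") as f:
            index = 0
            while True:
                chunk = f.read(chunk_size)
                if not chunk:
                    break
                manifest.append((index, hashlib.sha256(chunk).hexdigest()))
                index += 1
        return manifest

    def changed_chunks(old_manifest, new_manifest):
        """Indices whose hash differs or that are new since the last sync."""
        old = dict(old_manifest)
        return [i for i, h in new_manifest if old.get(i) != h]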

Author 1: Yunus Parvej Faniband
Author 2: Iskandar Ishak
Author 3: Fatimah Sidi
Author 4: Marzanah A. Jabar

Keywords: Mobile cloud computing; data consistency; mobile back-end as a service; mobile apps; distributed systems

PDF

Paper 85: Visualization and Analysis in Bank Direct Marketing Prediction

Abstract: Gaining the most benefit from a data set is a difficult task because it requires an in-depth investigation of its features and their corresponding values. This task is usually approached by presenting data in a visual format to reveal hidden patterns. In this study, several visualization techniques are applied to a bank’s direct marketing data set. The data set, obtained from the UCI machine learning repository, is imbalanced; thus, several oversampling methods are used to improve the accuracy of predicting a client’s subscription to a term deposit. Visualization efficiency is assessed alongside the oversampling techniques’ influence on the performance of multiple classifiers. Results show that the agglomerative hierarchical clustering technique outperforms the other oversampling techniques and that the Naive Bayes classifier gives the best prediction results.
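To make the oversampling-plus-classification setup concrete, the sketch below uses the standard SMOTE oversampler from imbalanced-learn as a stand-in for the clustering-based oversampling the paper favors, followed by the Naive Bayes classifier the abstract highlights; the data is synthetic and imbalanced, merely mimicking a minority term-deposit class.

    from imblearn.over_sampling import SMOTE
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import classification_report

    # Synthetic imbalanced data: ~10% positive (subscribed) class.
    X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=0)

    # Oversample only the training split, then fit the classifier.
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)
    model = GaussianNB().fit(X_res, y_res)
    print(classification_report(y_test, model.predict(X_test)))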

Author 1: Alaa Abu-Srhan
Author 2: Bara’a Alhammad
Author 3: Sanaa Al zghoul
Author 4: Rizik Al-Sayyed

Keywords: Bank direct marketing; prediction; visualization; oversampling; Naive Bayes

PDF

Paper 86: Evaluation of Peer Robot Communications using CryptoROS

Abstract: The demand for cloud robotics makes data encryption essential for peer robot communications. Certain types of data, such as odometry, action controller and perception data, need to be secured to prevent attacks. However, introducing data encryption adds overhead to data-stream communication. This paper presents an evaluation of the CryptoROS architecture on the Robot Operating System (ROS), focusing on peer-to-peer conversations between nodes under confidentiality and integrity violations. OpenSSL is used to create a private key and generate a Certificate Signing Request (CSR) containing the public key and a signature. The CSR is submitted to a Certificate Authority (CA) to chain the root CA certificate, and the RSA private key is encrypted with AES-256 and a passphrase. The protected private key is securely backed up, transported and stored. Experiments were carried out multiple times with and without the proposed protocol to assess the performance impact of the Manager. As the number of messages transmitted per run increased from 100 to 250 to 500, the performance impact was 1.7%, 0.5% and 0.2%, respectively. It is concluded that CryptoROS is capable of protecting messages and service requests from unauthorized intentional alteration, with authenticity verification in all components.
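The key-and-CSR step can be pictured with the short sketch below, which uses Python's cryptography package rather than the OpenSSL command line described in the paper; the node name and passphrase are placeholders, and the submission of the CSR to the CA is omitted.

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generate the node's RSA private key.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Build a CSR carrying the public key and a signature over the request.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "robot-node-1")]))
        .sign(key, hashes.SHA256())
    )

    # Protect the private key at rest with a passphrase before backup/transport.
    protected_key = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.BestAvailableEncryption(b"change-me"),
    )
    csr_pem = csr.public_bytes(serialization.Encoding.PEM)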

Author 1: Abdul Hadi Abd Rahman
Author 2: Rossilawati Sulaiman
Author 3: Nor Samsiah Sani
Author 4: Afzan Adam
Author 5: Roham Amini

Keywords: Robot communication; encryption; robot operating system

PDF

Paper 87: A Novel Image Encryption using Memetic Differential Expansion based Modified Logistic Chaotic Map

Abstract: In this paper, the initial conditions of a modified logistic chaotic map are generated with the help of memetic differential expansion. First, the color image is decomposed into its red, green and blue channels. The essential variables of the modified logistic chaotic map are then optimized through memetic differential expansion, with the fitness evaluation employing the association coefficient and the entropy function. The private keys are produced by the modified logistic chaotic map, and the encrypted image is obtained by combining the individually encrypted color channels. Extensive experiments were carried out on different standard images, comparing the memetic-differential-expansion-based encryption with previous image encryption techniques. Evaluation of the results shows that the proposed technique offers better security and efficiency than the previously implemented image encryption techniques.
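As background only, the sketch below shows the plain logistic map x_{n+1} = r·x_n(1 − x_n) driving a byte keystream that is XORed with a single color channel; it is an assumption for illustration and does not include the paper's modifications to the map or the memetic-differential-expansion tuning of its parameters.

    import numpy as np

    def logistic_keystream(length, x0=0.3141, r=3.99):
        """Generate `length` pseudo-random bytes from the logistic map."""
        x, out = x0, np.empty(length, dtype=np.uint8)
        for i in range(length):
            x = r * x * (1 - x)
            out[i] = int(x * 256) % 256
        return out

    def xor_channel(channel, x0=0.3141, r=3.99):
        """Encrypt (or decrypt) a uint8 image channel by XOR with the keystream."""
        flat = channel.reshape(-1)
        return (flat ^ logistic_keystream(flat.size, x0, r)).reshape(channel.shape)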

Author 1: Anvita Gupta
Author 2: Dilbag Singh
Author 3: Manjit Kaur

Keywords: Image encryption; modified logistic chaotic map; memetic differential expansion

PDF

Paper 88: Associative Classification using Automata with Structure based Merging

Abstract: Associative Classification (AC), a combination of two important and different fields (classification and association rule mining), aims at building accurate and interpretable classifiers by means of association rules. The process used to generate association rules is exponential by nature; thus, in AC, researchers have focused on reducing redundant rules via rule pruning and rule ranking techniques. These techniques play an important part in improving efficiency; however, pruning may negatively affect accuracy by removing interesting rules. Further, these techniques are time-consuming in terms of processing and also require domain-specific knowledge to select the best ranking and pruning strategy. To overcome these limitations, this research proposes an automata-based solution that improves the classifier’s accuracy while replacing ranking and pruning. A new merging concept is introduced that uses structure-based similarity to merge the association rules. The merging not only helps to reduce the classifier size but also minimizes the loss of information by avoiding pruning. Extensive experiments showed that the proposed algorithm is more efficient than AC, Naive Bayesian, and rule- and tree-based classifiers in terms of accuracy, space, and speed. The merging takes advantage of repetition in the rule set and keeps the classifier as small as possible.

Author 1: Mohammad Abrar
Author 2: Alex Tze Hiang Sim
Author 3: Sohail Abbas

Keywords: Associative classification; automata; ranking and pruning; rules merging; classification

PDF

Paper 89: Smart Coaching: Enhancing Weightlifting and Preventing Injuries

Abstract: Getting injured is one of the most devastating and dangerous challenges an athlete can face, and a serious injury can end an athletic career. In this paper, we propose a system that automates the coaching of an athlete by using an IR camera (Microsoft Kinect for Xbox 360) to detect misplaced joints while the athlete performs a lift, alerting the athlete before an injury can occur. The system detects whether a lift was performed correctly and, using the Fast Dynamic Time Warping (FastDTW) method, identifies what kind of mistake the athlete made during the lift. The FastDTW method outperformed the other classification methods and achieves recognition with 100% accuracy for user-dependent movements.
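A rough sketch of the trajectory-matching idea follows, using the open-source fastdtw package (an assumption about tooling, not the authors' code): a recorded joint trajectory is scored against a reference template of a correct lift, and a large DTW distance is treated as a possible form error. The trajectories and threshold are placeholders.

    import numpy as np
    from scipy.spatial.distance import euclidean
    from fastdtw import fastdtw

    # Placeholder 3-D joint trajectories (frames x coordinates), e.g. a knee joint.
    reference_lift = np.random.rand(120, 3)   # template of a correct lift
    candidate_lift = np.random.rand(110, 3)   # the athlete's attempt

    distance, path = fastdtw(reference_lift, candidate_lift, dist=euclidean)
    THRESHOLD = 25.0                          # assumed, tuned per joint
    print("correct" if distance < THRESHOLD else "possible form error")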

Author 1: Ammar Yasser
Author 2: Doha Tariq
Author 3: Radwa Samy
Author 4: Mennat Allah Hassan
Author 5: Ayman Atia

Keywords: Weightlifting; joints; KNN; fastDTW; Naive Bayes; SVM; detecting injuries; machine learning; IR camera

PDF
