The Science and Information (SAI) Organization

IJACSA Volume 11 Issue 8

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: LXPER Index: A Curriculum-specific Text Readability Assessment Model for EFL Students in Korea

Abstract: Automatic readability assessment is one of the most important applications of Natural Language Processing (NLP) in education. Because it allows the fast selection of reading material appropriate for readers at all proficiency levels, it can be particularly useful for the English education of English as a Foreign Language (EFL) students around the world. However, most readability assessment models are developed for native readers of English and have low accuracy for texts in non-native English Language Training (ELT) curricula. We introduce the LXPER Index, a readability assessment model for non-native EFL readers in the ELT curriculum of Korea. To compute the LXPER Index, we use a mixture of 22 features that we show to be significant for text readability assessment. We also introduce the Text Corpus of the Korean ELT Curriculum (CoKEC-text), the first collection of English texts from a non-native country's ELT curriculum in which each text is labeled with its target grade level. In addition, we assembled the Word Corpus of the Korean ELT Curriculum (CoKEC-word), a collection of words from the Korean ELT curriculum with word-difficulty labels. Our experiments show that our new model, trained on CoKEC-text, significantly improves the accuracy of automatic readability assessment for texts in the Korean ELT curriculum. The methodology used in this research can be applied to other ELT curricula around the world.

Author 1: Bruce W. Lee
Author 2: Jason Hyung-Jong Lee

Keywords: Natural language processing; machine learning; text readability assessment; EFL (English as a Foreign Language) education

PDF
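The surface-level statistics that feature-based readability models such as LXPER build on can be illustrated with a minimal sketch. These three features are illustrative stand-ins only; the paper's actual 22 features are not specified in the abstract.

```python
import re

def surface_features(text):
    """Compute a few surface readability features (illustrative only)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_sent_len = len(words) / len(sentences)               # words per sentence
    avg_word_len = sum(len(w) for w in words) / len(words)   # characters per word
    type_token = len({w.lower() for w in words}) / len(words)  # lexical diversity
    return {"avg_sent_len": avg_sent_len,
            "avg_word_len": avg_word_len,
            "type_token_ratio": type_token}

feats = surface_features("The cat sat. The cat ran away quickly!")
```

A trained regression or classification model would then map such feature vectors to grade levels.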

Paper 2: Developing a Radiating L-shaped Search Algorithm for NASA Swarm Robots

Abstract: This paper describes the search algorithm that the DustySWARM team designed for the 2019 NASA Swarmathon competition. The algorithm was implemented and tested on multiple rovers, a.k.a. Swarmies or swarm robots. Swarmies are compact rovers designed by NASA to mimic ant behavior and perform an autonomous search for simulated Mars resources. This effort aimed to assist NASA's mission to explore space and discover new resources on the Moon and Mars. NASA's ongoing project has the goal of sending robots that explore and collect resources for analysis before sending astronauts, as the swarm option is safer and more affordable. All rovers must run the same algorithm and collaborate to find all available resources in their search path and return them to the space station location. Additionally, the Swarmies search autonomously while avoiding obstacles and mapping the surrounding environment for future missions. The algorithm allows a swarm of six robots to search an unknown area for simulated resources called AprilTags (cubes with QR codes). The code was developed using C/C++, GitHub, and the Robot Operating System (ROS), and was tested both in the Gazebo simulation environment and in physical trials on the Swarmies. The team analyzed several algorithms from previous years and other researchers, then developed the Radiating L-Shape Search (RLS) algorithm. This paper summarizes the algorithm design, code development, and trial results that were provided to the NASA Space Exploration Engineering team.

Author 1: Tariq Tashtoush
Author 2: Jalil Ahmed
Author 3: Valeria Arce
Author 4: Heriberto Dominguez
Author 5: Kevin Estrada
Author 6: William Montes
Author 7: Ashley Paredez
Author 8: Pedro Salce
Author 9: Alexia Serna
Author 10: Mireya Zarazua

Keywords: NASA Swarmathon competition; swarm robotics; search algorithm; autonomous; Robot Operating System (ROS); NASA space exploration; simulation; autonomous robot swarm; collaborative robots; autonomous behavior; cooperative robots; swarmies; L-Shaped search; GitHub

PDF

Paper 3: Impacts of Decomposition Techniques on Performance and Latency of Microservices

Abstract: Microservice architecture (MSA) has undoubtedly become the most popular modern-day architecture, often used in conjunction with rapidly advancing public cloud platforms to reap the benefits of scalability, elasticity, and agility. Though MSA is highly advantageous and comes with a large set of benefits, it has its own set of challenges. To achieve separation of concerns and optimal performance, it is essential to clearly define the boundaries of the services and their underlying persistent stores. Logically segregating the services, however, is a major challenge in designing an MSA. Guiding principles such as the Single Responsibility Principle (SRP) and the Common Closure Principle (CCP) are used to drive the design and separation of microservices. With these techniques, the service layer can be designed by (i) building the services related to a business subdomain and packaging them as a microservice; (ii) defining the entity-relationship model and then building the services based on the business capabilities, which are grouped together as a microservice; or (iii) understanding the big picture of the application scope and combining both strategies to achieve the best of both worlds. This paper explains these decomposition approaches in detail, compares them with real-world use cases, explains which pattern is suitable under which circumstances, and examines the impacts of these approaches on performance and latency using a research project.

Author 1: Chaitanya K. Rudrabhatla

Keywords: Microservices; decomposition techniques; single responsibility principle; common closure principle; performance; latency

PDF

Paper 4: A Home Intrusion Detection System using Recycled Edge Devices and Machine Learning Algorithm

Abstract: This paper proposes a home intrusion detection system that makes the best use of a retired smartphone and an existing Wi-Fi access point. The on-board sensors of a smartphone mounted on an entrance door record signals upon unwanted door opening. The access point is reconfigured to serve as a home server, so it can process the sensor data to detect unauthorized access to the home by an intruder. Recycling devices enables a homeowner to build their own security system at no cost, and also helps society deal with millions of retired devices and the waste of computing resources in already-deployed IT devices. To improve detection accuracy, this paper proposes a detection method that employs a machine learning algorithm and a time-series data analysis technique. To minimize energy consumption on the battery-powered smartphone, the proposed system utilizes as few sensors as possible and offloads all computation to the home edge server. We develop a prototype and run experiments to evaluate the accuracy of the proposed system. Results show that it can detect intrusion with a probability of 95% to 100%.

Author 1: Daewoo Kwon
Author 2: Jinseok Song
Author 3: Chanho Choi
Author 4: Eun-Kyu Lee

Keywords: Security; intrusion detection; edge computing; Internet of Things; recycling

PDF
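As a rough illustration of the time-series analysis step, the sketch below extracts per-window features from a sensor stream and flags windows whose peak exceeds a hypothetical threshold. The paper's actual system feeds such features into a trained machine learning model rather than a fixed threshold; the window size and threshold here are assumptions.

```python
def window_features(samples, win=4):
    """Split a sensor stream into fixed windows and compute mean/peak per window."""
    feats = []
    for i in range(0, len(samples) - win + 1, win):
        w = samples[i:i + win]
        feats.append({"mean": sum(w) / win, "peak": max(w)})
    return feats

def flag_intrusion(feats, peak_threshold=2.0):
    """Flag any window whose peak exceeds the (hypothetical) threshold."""
    return [f["peak"] > peak_threshold for f in feats]

# quiet door, then a sharp opening event
stream = [0.1, 0.2, 0.1, 0.1, 2.5, 3.0, 2.8, 0.2]
flags = flag_intrusion(window_features(stream))
```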

Paper 5: Development and Verification of Serial Fault Simulation for FPGA Designs using the Proposed RASP-FIT Tool

Abstract: Fault simulation, together with fault injection, is a critical approach for many applications such as fault detection and diagnostics, test-set quality measurement, test-vector generation, and circuit testability analysis. Fault simulation is divided into several types, the most straightforward of which is serial fault simulation. In the simulation process, the circuit under test is faulted, and a faulty copy is obtained using either a simulator-command technique or an instrumentation technique. A fault simulator must examine the behaviour of each specified target fault in the design and classify it as detected or undetected by the applied test patterns. Modifying the original code by hand is a challenging and time-consuming task. Therefore, the RASP-FIT tool was developed; it alters the fault-free FPGA design under investigation at the Verilog HDL code level and produces faulty copies of the design along with a top design file for several fault simulation methods. Using this tool, a serial fault simulation environment can be created with little effort. In this work, a serial fault simulation method is verified and validated using the RASP-FIT tool on an ISCAS'85 benchmark design as an example.

Author 1: Abdul Rafay Khatri
Author 2: Ali Hayek
Author 3: Josef Borcsok

Keywords: FPGA design; fault injection; fault simulation; RASP-FIT tool; Verilog HDL

PDF
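The idea of serial fault simulation (comparing the fault-free design against one faulty copy at a time under the same test patterns) can be sketched in miniature. A toy two-gate function stands in here for a Verilog netlist, with a stuck-at fault forced on one internal net; the net name is an illustrative assumption.

```python
from itertools import product

def circuit(a, b, c, fault=None):
    """Tiny combinational design: y = (a AND b) OR c.
    fault=('n1', value) forces internal net n1 (a AND b) to a stuck-at value."""
    n1 = a & b
    if fault and fault[0] == "n1":
        n1 = fault[1]
    return n1 | c

def detected(fault):
    """A fault is detected if any applied test vector produces a faulty output."""
    return any(circuit(a, b, c) != circuit(a, b, c, fault)
               for a, b, c in product((0, 1), repeat=3))

sa0 = detected(("n1", 0))  # stuck-at-0 on n1
sa1 = detected(("n1", 1))  # stuck-at-1 on n1
```

RASP-FIT automates the equivalent instrumentation at the Verilog HDL level, generating one faulty copy per target fault.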

Paper 6: Study of K-Nearest Neighbour Classification Performance on Fatigue and Non-Fatigue EMG Signal Features

Abstract: For our body to move, muscles must activate by relaxing and contracting. Muscle activation produces bio-electric signals that can be detected using electromyography (EMG). The signal produced by a muscle is affected by the type of contraction the muscle performs: eccentric contraction generates EMG signals different from those of concentric contraction. An EMG signal contains multiple features, which can be extracted using MATLAB software. This paper focuses on the biceps brachii and brachioradialis in the upper arm and forearm, respectively. The EMG signals are acquired using surface EMG, whereby electrode pads are placed on the surface of the muscle. Features can then be extracted from the EMG signal; this paper focuses on the MAV, VAR, and RMS features. The features are then classified into eccentric, concentric, or isometric contraction. The performance of the K-Nearest Neighbour (KNN) classifier is inconsistent due to variability in the EMG data: the accuracy varies from one data set to another. However, it is concluded that non-fatigue signal classification accuracy is higher than fatigue signal classification accuracy.

Author 1: W. M. Bukhari
Author 2: C. J. Yun
Author 3: A. M. Kassim
Author 4: M. O. Tokhi

Keywords: Electromyography; surface electromyography; k-nearest neighbour classifier; feature extraction; dynamic contraction

PDF
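The three features the paper classifies on (MAV, VAR, and RMS) have standard definitions over a window of EMG samples; a minimal sketch follows. Note that some authors normalize VAR by N-1 rather than N; population variance is used here.

```python
import math

def emg_features(signal):
    """Mean Absolute Value, Variance, and Root Mean Square of an EMG window."""
    n = len(signal)
    mav = sum(abs(x) for x in signal) / n
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n   # population variance
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return mav, var, rms

mav, var, rms = emg_features([1.0, -1.0, 1.0, -1.0])
```

Feature vectors like (MAV, VAR, RMS) per window are then fed to the KNN classifier.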

Paper 7: Road Object Detection using Yolov3 and Kitti Dataset

Abstract: Detection of road objects (such as pedestrians and vehicles) is a very important step toward enhancing road safety and achieving autonomous driving. Many on-vehicle sensors, such as radars, lidars, and ultrasonic sensors, are used to detect surrounding objects. However, cameras are widely used for road object detection because of the rich information they provide and their low price compared to other sensors. Machine learning and computer vision algorithms are utilized to classify objects in the collected images and videos. Many computer vision algorithms have been proposed for image and video object detection, e.g. logistic regression and SVM with feature extraction. However, Convolutional Neural Network (CNN) algorithms have shown high detection accuracy compared to other approaches. This research implements the You Only Look Once (YOLO) algorithm, which uses the Darknet-53 CNN, to detect four classes: pedestrians, vehicles, trucks, and cyclists. The model is trained using the KITTI image dataset, which was collected from public roads using a vehicle's front-looking camera. The algorithm is tested, and detection results are presented.

Author 1: Ghaith Al-refai
Author 2: Mohammed Al-refai

Keywords: Pedestrian detection; computer vision; CNN; machine learning; artificial intelligence; vehicle safety

PDF
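Detection results of this kind are conventionally scored with the Intersection-over-Union (IoU) between predicted and ground-truth boxes, the standard matching criterion for detectors such as YOLO; a minimal sketch:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)   # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

score = iou((0, 0, 10, 10), (5, 5, 15, 15))
```

A prediction is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5.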

Paper 8: Predicting Breast Cancer via Supervised Machine Learning Methods on Class Imbalanced Data

Abstract: Breast cancer, the second leading cause of fatality among women, is a widespread global health concern. Predicting the occurrence of breast cancer from risk factors paves the way to earlier diagnosis and more efficient treatment. Although many predictive models have been developed for breast cancer in the past, most of them are built from highly imbalanced data. Imbalanced data biases a model toward the majority class, but in cancer diagnosis it is crucial to correctly diagnose the patients with cancer, who are often the minority class. This study applies three class-balancing techniques, namely oversampling (Synthetic Minority Oversampling Technique, SMOTE), undersampling (SpreadSubsample), and a hybrid method (SMOTE and SpreadSubsample), to the Breast Cancer Surveillance Consortium (BCSC) dataset before constructing supervised learning models. The algorithms employed in this study are Naïve Bayes, Bayesian Network, Random Forest, and Decision Tree (C4.5). The balancing method that yields the best performance across all four classifiers was tested on the validation data to determine the final predictive model. The performance of the classifiers was evaluated using the Receiver Operating Characteristic (ROC) curve, sensitivity, and specificity.

Author 1: Keerthana Rajendran
Author 2: Manoj Jayabalan
Author 3: Vinesh Thiruchelvam

Keywords: Breast cancer; class imbalance; diagnosis; bayesian network

PDF
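The simplest form of the oversampling idea can be sketched as plain duplication of minority samples until the classes balance. SMOTE, used in the paper, instead synthesizes new points by interpolating between minority-class neighbours; duplication is only the most basic stand-in.

```python
import random

def oversample_minority(X, y, minority_label, seed=0):
    """Naive random oversampling: duplicate minority samples until classes balance."""
    rng = random.Random(seed)  # seeded for reproducibility
    minority = [x for x, lbl in zip(X, y) if lbl == minority_label]
    majority_n = sum(1 for lbl in y if lbl != minority_label)
    X_out, y_out = list(X), list(y)
    while sum(1 for lbl in y_out if lbl == minority_label) < majority_n:
        X_out.append(rng.choice(minority))
        y_out.append(minority_label)
    return X_out, y_out

# four majority samples (label 0), one minority sample (label 1)
X, y = [[0], [1], [2], [3], [9]], [0, 0, 0, 0, 1]
Xb, yb = oversample_minority(X, y, minority_label=1)
```

Balancing is done on the training split only, so the validation data keeps its natural class distribution.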

Paper 9: A Novel Fuzzy Clustering Approach for Gene Classification

Abstract: Automatic cluster detection is crucial for real-time gene expression data, where the quantity of missing values and the noise ratio are relatively high. In this paper, algorithms for dynamically determining the number of clusters and for clustering are proposed without any pre- or post-clustering assumptions. The proposed fuzzy Meskat-Hasan (MH) clustering provides solutions for sophisticated datasets by extracting hidden information from unknown datasets; based on these findings, it determines the number of clusters and performs seed-based clustering dynamically. The MH Extended K-Means clustering algorithm is a nonparametric extension of the traditional K-Means algorithm and provides automatic cluster detection, including runtime cluster selection. To ensure accuracy and optimal partitioning, seven validation techniques were used for cluster evaluation, and four well-known datasets were used for validation purposes. In the end, the MH clustering and MH Extended K-Means clustering algorithms were found to outperform the traditional algorithms.

Author 1: Meskat Jahan
Author 2: Mahmudul Hasan

Keywords: Meskat-Hasan clustering (MH clustering); MH Extended K-Means clustering; K-Means; fuzzy clustering

PDF
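The traditional K-Means baseline that MH Extended K-Means extends can be sketched in one dimension (Lloyd's algorithm: assign points to the nearest center, then recompute centers). The MH extensions themselves (dynamic cluster-count selection) are not reproduced here.

```python
def kmeans_1d(points, centers, iters=10):
    """Plain 1-D K-Means (Lloyd's algorithm), the baseline MH clustering extends."""
    clusters = []
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # update step: each center moves to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 10.0])
```

Note that plain K-Means requires the number of clusters up front, which is exactly the assumption the paper's dynamic approach removes.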

Paper 10: Local Binary Pattern Method (LBP) and Principal Component Analysis (PCA) for Periocular Recognition

Abstract: Identification of identity through the eye is gaining more and more importance. Researchers commonly approach the eye through one of three regions: the iris, the area around the eye, or both together. This study follows a holistic approach to identity recognition by using the iris and the whole periocular area, and proposes a periocular recognition system (PRS) developed using the Local Binary Pattern (LBP) technique combined with Principal Component Analysis (PCA) at the feature extraction stage and the k-nearest neighbors (k-NN) algorithm as a classifier at the classification stage. The system achieves identity recognition in three steps: pre-processing, feature extraction, and classification. Pre-processing converts the images to grayscale. In the feature extraction step, the LBP method extracts the texture features from the images, and PCA reduces the data dimensionality so that only the important features are retained. These two steps are applied in both the training and testing phases of image processing, and the testing data sets are then processed by the k-NN classifier. The proposed PRS was tested on data drawn from the PolyU database using more than one experimental setting. Specifically, system performance was tested once on all 209 subjects present in the database and once on 140 subjects. The database contains images taken in the visible (VIS) and near-infrared (NIR) regions of the electromagnetic radiation (EMR) spectrum, so the system was tested on images from each region separately for matching. The proposed PRS also benefited from the availability of images of both the right and left perioculars: performance was tested on images of each side of the periocular area separately, as well as on the combination of the two sides. The identity recognition rates of the proposed PRS were most often higher than the recognition rates reported in the literature. The highest recognition accuracy obtained from the proposed system, 98.21%, was achieved on the 140-subject data subset.

Author 1: Sereen Alkhazali
Author 2: Mohammad El-Bashir

Keywords: Periocular recognition; Local Binary Pattern (LBP); Principal Component Analysis (PCA); k-Nearest Neighbors (k-NN)

PDF
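The LBP feature extraction step computes, for each pixel, an 8-bit code from its 3x3 neighbourhood by thresholding the neighbours against the centre value; a minimal sketch follows. The neighbour ordering is a convention and may differ from the paper's.

```python
def lbp_code(patch):
    """8-bit Local Binary Pattern of the centre pixel of a 3x3 patch.
    Neighbours are read clockwise from the top-left; each contributes a 1
    when it is >= the centre value."""
    c = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, col) in enumerate(order):
        if patch[r][col] >= c:
            code |= 1 << bit
    return code

code = lbp_code([[6, 5, 2],
                 [7, 6, 1],
                 [9, 8, 7]])
```

The histogram of LBP codes over an image forms the texture descriptor that PCA then compresses before k-NN classification.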

Paper 11: Design and Performance Analysis of Different Dielectric Substrate based Microstrip Patch Antenna for 5G Applications

Abstract: In this paper, a 3.5 GHz microstrip patch antenna has been designed using three different substrate materials with varying relative permittivity. The substrate thicknesses differ slightly from each other: 1.6 mm for FR-4, 1.575 mm for RT-5880, and 1.58 mm for TLC-30. The three substrate materials are FR-4 (Design-1), RT-5880 (Design-2), and TLC-30 (Design-3), with relative permittivities of 4.3, 2.2, and 3, respectively. The antennas' performance in terms of reflection coefficient, voltage standing wave ratio (VSWR), bandwidth, gain, and efficiency is simulated, analyzed, and compared using CST Microwave Studio (CST 2019). The findings reveal a significant change in gain and bandwidth due to the different relative permittivities and thicknesses of the substrate materials. The gains achieved were 3.338 dB, 4.660 dB, and 5.083 dB for Design-1, Design-2, and Design-3, respectively. TLC-30 gave the best efficiency at 75.70%, compared to 60.13% for FR-4 and 61.51% for RT-5880. All the proposed antennas have a bandwidth above 100 MHz: Design-1 had a bandwidth of 247.1 MHz, Design-2 of 129.7 MHz, and Design-3 of 177.2 MHz.

Author 1: Nurulazlina Ramli
Author 2: Shehab Khan Noor
Author 3: Taher Khalifa
Author 4: N. H. Abd Rahman

Keywords: Efficiency; gain; microstrip patch antenna; permittivity; substrates

PDF
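Initial patch dimensions for such designs are commonly estimated with the textbook transmission-line-model equations before tuning in a solver like CST. The sketch below uses the FR-4 parameters quoted above (3.5 GHz, relative permittivity 4.3, thickness 1.6 mm); the paper's final dimensions may well differ after optimization.

```python
import math

C = 3e8  # speed of light, m/s

def patch_dimensions(f, er, h):
    """Textbook transmission-line-model estimate of patch width and length
    for resonant frequency f (Hz), relative permittivity er, substrate height h (m)."""
    w = C / (2 * f) * math.sqrt(2 / (er + 1))                       # patch width
    e_eff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / w) ** -0.5  # effective permittivity
    dl = 0.412 * h * ((e_eff + 0.3) * (w / h + 0.264)) / \
         ((e_eff - 0.258) * (w / h + 0.8))                          # fringing extension
    l = C / (2 * f * math.sqrt(e_eff)) - 2 * dl                     # patch length
    return w, l

# Design-1 (FR-4): 3.5 GHz, er = 4.3, h = 1.6 mm
w, l = patch_dimensions(3.5e9, 4.3, 1.6e-3)
```

Lower-permittivity substrates such as RT-5880 give a physically wider patch and, typically, the higher gain the paper reports.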

Paper 12: Monopole Antenna on Transparent Substrate and Rectifier for Energy Harvesting Applications in 5G

Abstract: In line with the energy harvesting required for emerging 5G technology, this article proposes a planar monopole antenna and a rectifier. The proposed Coplanar Waveguide (CPW)-fed antenna is printable on a transparent Poly-Ethylene Terephthalate (PET) substrate. The antenna has a center frequency of 3.51 GHz within a bandwidth of 307 MHz that covers the pioneer 5G band in Malaysia. The designed omnidirectional antenna exhibits a maximum gain of 1.51 dBi with a total efficiency of 95.17 percent. For the antenna frequency, a rectifier has been designed using the voltage-doubler technique on a Rogers RO3003 substrate. At an input RF power of 0 dBm, the rectifier has a power conversion efficiency of around 42 percent. The proposed antenna rejects harmonics up to at least 16 GHz, which makes it compatible with the rectifier and eliminates the need for an additional bandpass filter or impedance matching network in the energy harvesting system.

Author 1: S. M. Kayser Azam
Author 2: Md. Shazzadul Islam
Author 3: A. K. M. Zakir Hossain
Author 4: Mohamadariff Othman

Keywords: Monopole antenna; transparent substrate; 5G; energy harvesting; rectifier

PDF

Paper 13: Blockchain-based Global Travel Review Framework

Abstract: An online review system is an important part of almost every e-commerce platform, especially in tourism e-commerce. However, various problems exist in current online review systems. Review content is stored in the centralized database of each individual platform, and each platform differs in its review management methods; in some cases, the review score of the same product disagrees across different platforms. Moreover, a centralized system has low transparency because it is difficult to trace individual actions within the system. As a result, some users are skeptical of the reliability of online reviews in centralized systems. This work proposes a global travel review framework based on blockchain technology. Blockchain improves an online review system through its unique features of high transparency, security, and reliability. The best practices for online review management from popular platforms and the guidelines from trusted sources are used to develop the new system. Additionally, the proposed framework relies on a community-driven environment: the access level of users is controlled by a smart contract, there is no single authoritative owner of the system, and all participants can exert control over the system equally. This work illustrates the details of the blockchain-based global travel review framework and discusses the advantages and disadvantages of such a system. The proposed framework can be easily integrated with existing platforms since it can be accessed publicly.

Author 1: Tanakorn Karode
Author 2: Warodom Werapun
Author 3: Tanwa Arpornthip

Keywords: Consumer online review; traveling; blockchain; smart contract

PDF

Paper 14: Assessing Vietnamese Text Readability using Multi-Level Linguistic Features

Abstract: Text readability is the problem of determining whether a text is suitable for a certain group of readers; building a model to assess the readability of text is thus of great significance across the disciplines of science, publishing, and education. While text readability has attracted attention since the late nineteenth century for English and other widely spoken languages, it remains relatively underexplored in Vietnamese. Previous studies on this topic in Vietnamese have only examined shallow word-level features using surface statistics such as frequency and ratio; features at higher levels, such as sentence structure and meaning, remain untapped. In this study, we propose the most comprehensive analysis of Vietnamese text readability to date, targeting features at all linguistic levels, ranging from lexical and phrasal elements to syntactic and semantic factors. This work pioneers the investigation of the effects of multi-level linguistic features on text readability in the Vietnamese language.

Author 1: An-Vinh Luong
Author 2: Diep Nguyen
Author 3: Dien Dinh
Author 4: Thuy Bui

Keywords: Text readability; text difficulty; readability formula; linguistics features; Vietnamese

PDF

Paper 15: An ACM\IEEE and ABET Compliant Curriculum and Accreditation Management Framework

Abstract: Following methodological and systemized approaches in creating course syllabi and program curricula is crucial for assuring the coherence (correctness, completeness, consistency, and validity) of curricula. Furthermore, designing coherent curricula has a direct impact on achieving curriculum outcomes, and for institutions seeking accreditation, presenting evidence of curriculum coherence is mandatory. In this paper, a general framework architecture for curriculum and accreditation management is proposed. Furthermore, we propose a detailed design for a knowledge base that comprises: a) the ACM\IEEE body of knowledge for the Computer Science department, b) course syllabi, and c) course articulation matrices. We show how to utilize the proposed knowledge base in the quality improvement life cycle and in ABET accreditation, as a significant step towards curriculum coherence.

Author 1: Manar Salamah Ali

Keywords: Curriculum coherence; body of knowledge; accreditation; knowledge base design; ABET

PDF

Paper 16: A Hybrid Deep Learning Model for Arabic Text Recognition

Abstract: Arabic text recognition is challenging because of the cursive nature of the Arabic writing system, its joined writing scheme, its large number of ligatures, and many other factors. Deep Learning (DL) models have achieved significant progress in numerous domains, including computer vision and sequence modelling. This paper presents a model that can recognize Arabic text printed in multiple font types, including fonts that mimic Arabic handwritten scripts. The proposed model employs a hybrid DL network that recognizes Arabic printed text without the need for character segmentation. The model was tested on a custom dataset comprising over two million word samples generated using 18 different Arabic font types; the objective was to assess the model's capability to recognize a diverse set of Arabic fonts representing varied cursive styles. The model achieved good results in recognizing characters and words, and it also achieved promising results when tested on unseen data. The prepared model, the custom datasets, and the toolkit for generating similar datasets are made publicly available; these tools can be used to prepare models for recognizing other font types as well as to further extend and enhance the performance of the proposed model.

Author 1: Mohammad Fasha
Author 2: Bassam Hammo
Author 3: Nadim Obeid
Author 4: Jabir AlWidian

Keywords: Arabic optical character recognition; deep learning; convolutional neural networks; recurrent neural networks

PDF

Paper 17: Cyber Security Defence Policies: A Proposed Guidelines for Organisations Cyber Security Practices

Abstract: Many organisations have been struggling to defend their cyberspace without specific directions or guidelines to follow, and cyber attacks have been identified as a potentially devastating threat to business operations in a broader perspective. Researchers in cyber security have since produced numerous reports on threats and attacks on organisations. This study develops and proposes Cyber Security Defence Policies (CSDP) by harmonising and synthesizing existing practices identified in the literature review. Observation and a questionnaire were adopted to evaluate, review, and collect data under ethical agreement from 10 organisations. The validation is based on the principal components of the proposed CSDP and on the proposed CSDP itself, using SPSS as the statistical tool. Validation of the proposed CSDP by 20 experts yielded standard deviations of 0.607, 0.759, 0.801, 0.754, 0.513, 0.587, and 0.510 on the respective principal components, with no missing values. The correlation matrix and the reproduced correlation matrix for the proposed CSDP indicated 61%, and the percentage of acceptance of each principal component of the proposed CSDP was higher than 50%. The outcome therefore shows acceptance of the proposed CSDP, and the results of the principal component analysis (eigenvalue analysis) are significant enough for implementation; the CSDP can be adopted by organisations as guidelines for organisational cyber security practices.

Author 1: Julius Olusegun Oyelami
Author 2: Azleena Mohd Kassim

Keywords: Cyber security; cyber defence policy; organisation; cyber security practices

PDF

Paper 18: New RTP Packet Payload Shrinking Method to Enhance Bandwidth Exploitation Over RTP Protocol

Abstract: Telecommunications is one of the pillars of running a business, and most institutions have migrated, or are migrating, to Voice over Internet Protocol (VoIP) technology. However, VoIP still needs improvements in network bandwidth exploitation and call quality to meet business expectations. Network bandwidth exploitation, which is our concern in this paper, has been enhanced using different approaches and methods. This paper proposes a new method to enhance network bandwidth exploitation using a packet-payload shrinking (compression) approach. The proposed method works with the RTP protocol and is called the RTP Payload Shrinking (RPS) method. As the name implies, RPS reduces the size of the RTP packet payload by shrinking it with a specific algorithm, which improves network bandwidth exploitation. The RPS method utilizes RTP header fields to store the values needed to apply the shrinking algorithm at the sender and receiver sides. The effectiveness of the proposed RPS method has been examined in comparison to the conventional RTP protocol without shrinking. The deployment results show that the saved-bandwidth ratio reaches nearly 17% in the tested scenarios, thereby enhancing network bandwidth exploitation.

Author 1: AbdelRahman H. Hussein
Author 2: Mwaffaq Abu-Alhaija
Author 3: Kholoud Nairoukh

Keywords: VoIP; RTP protocol; payload compression; bandwidth exploitation

PDF
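The RTP header fields that RPS reuses sit in the 12-byte fixed header defined by RFC 3550. A minimal parsing sketch (the RPS shrinking algorithm itself is not reproduced, only the header layout it works against):

```python
import struct

def parse_rtp_header(packet):
    """Parse the 12-byte fixed RTP header (RFC 3550)."""
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for RTP
        "padding": (b0 >> 5) & 1,
        "extension": (b0 >> 4) & 1,
        "csrc_count": b0 & 0x0F,
        "marker": b1 >> 7,
        "payload_type": b1 & 0x7F,     # e.g. 0 = PCMU (G.711 u-law)
        "sequence": seq,
        "timestamp": ts,
        "ssrc": ssrc,
        "payload": packet[12:],
    }

# build a sample packet: V=2, PT=0, seq=1, ts=160, plus a 20-byte payload
pkt = struct.pack("!BBHII", 0x80, 0x00, 1, 160, 0xDEADBEEF) + b"\x11" * 20
hdr = parse_rtp_header(pkt)
```

The saved-bandwidth ratio comes from reducing the payload portion while leaving this fixed header (and thus interoperability) intact.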

Paper 19: Arabic Handwritten Character Recognition based on Convolution Neural Networks and Support Vector Machine

Abstract: Recognition of Arabic characters is essential in the fields of natural language processing and computer vision, and the ability to recognize and classify handwritten Arabic letters and characters is in strong demand. In this paper, we present an algorithm for recognizing Arabic letters and characters based on deep convolutional neural networks (DCNN) and a support vector machine (SVM). The algorithm addresses the problem of recognizing handwritten Arabic characters by determining the similarity between input templates and pre-stored templates using both a fully connected DCNN and a dropout SVM. Furthermore, this paper determines the correct classification rate (CRR) based on the accuracy of the correctly classified templates of the recognized handwritten Arabic characters, as well as the error classification rate (ECR). The experimental results indicate the ability of the proposed algorithm to recognize, identify, and verify the input handwritten Arabic characters. Furthermore, the proposed system identifies similar Arabic characters using a clustering algorithm based on the K-means approach to handle the problem of multi-stroke Arabic characters. A comparative evaluation is presented, and the system accuracy reached 95.07% CRR with 4.93% ECR, compared with the state of the art.

Author 1: Mahmoud Shams
Author 2: Amira. A. Elsonbaty
Author 3: Wael. Z. ElSawy

Keywords: Handwritten Arabic recognition; convolutional neural networks; support vector machine

PDF

Paper 20: Noise Reduction on Bracketed Images for High Dynamic Range Imaging

Abstract: The quality of a high dynamic range (HDR) image produced from bracketed images taken at different camera exposure times can be degraded by noise contained in the bracketed images. In this paper, we propose a noise reduction method for bracketed images based on the exposure time ratio. First, for each pixel pair of the same scene point lying on two different images, the ratio of their intensity values is compared with the ratio of the exposure times of the images on which the pixels lie. If the compared ratios are close, the two pixels are included in the noise-free pixel set; its complement is defined as the noisy pixel set. Then, the intensity value of each pixel in the noisy pixel set is corrected to its expected value, computed from a noise-free pixel of the same scene point lying on another image. Experimental results show that all noisy intensity values can be correctly restored from noise-free pixels, except for saturated pixels. Denoising results measured by PSNR show that the proposed method outperforms other recent denoising methods such as the based-on-pixel-density filter (BPDF), the noise adaptive fuzzy switching median filter (NAFSMF), and the adaptive Riesz mean filter (ARmF).
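The detection-and-correction rule described above can be sketched directly (the tolerance value here is an assumption of this illustration, not taken from the paper):

```python
def is_noise_free(i1, i2, t1, t2, tol=0.05):
    """A pixel pair is noise-free if its intensity ratio is close to
    the exposure-time ratio of the two images."""
    return abs((i1 / i2) - (t1 / t2)) <= tol

def correct_noisy(i_ref, t_ref, t_noisy):
    """Expected value of the noisy pixel, scaled from the noise-free one."""
    return i_ref * (t_noisy / t_ref)

# Same scene point on two exposures: 1/60 s and 1/30 s (twice as long).
t1, t2 = 1/60, 1/30
clean, noisy = 50.0, 180.0          # the longer exposure should read ~100
assert not is_noise_free(noisy, clean, t2, t1)   # flagged as noisy
restored = correct_noisy(clean, t1, t2)
print(restored)  # 100.0
```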

Author 1: Seong-O Shim

Keywords: Image denoising; high dynamic range imaging; noise detection; noise removal; image restoration

PDF

Paper 21: Quality in Use of an Android-based Mobile Application for Calculation of Bone Mineral Density with the Standard ISO/IEC 25022

Abstract: One of the most critical bone diseases is osteoporosis, which can be evaluated through measurements of bone mineral density (BMD). Though there are many commercial portable applications for health in general, few are oriented towards bone health, and those that exist lack a user-friendly interface and data management system. This paper presents the development of a mobile application for the calculation of bone mineral density that integrates with Google technology. The Mobile-D methodology for the development of mobile applications was used, owing to the sequentiality of its processes and stages. BMD was calculated using anthropometric regression equations, and an Android-based mobile application built on Google technology was developed. Firebase Authentication and Firebase Storage, provided by Google, give the administrator full control over database management. In short, this mobile application allows the calculation of students' BMD, data storage, data uploading to cloud storage for post-processing, and an online data management system with user authentication. In addition, the International Organization for Standardization / International Electrotechnical Commission (ISO/IEC) 25022 standard was used to evaluate the quality in use of the mobile app, resulting in a quality-in-use score of 93%; the app can therefore be used by health professionals for better decision-making.
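The abstract states that BMD is estimated from anthropometric regression equations but gives no coefficients; the sketch below shows only the shape of such a calculation, with placeholder coefficients that are NOT the paper's equation:

```python
def estimate_bmd(weight_kg, height_cm, age_years,
                 b0=0.5, b_w=0.004, b_h=0.001, b_a=-0.002):
    """Linear anthropometric regression for BMD (g/cm^2).
    The coefficients b0, b_w, b_h, b_a are hypothetical placeholders,
    not the fitted values used in the paper."""
    return b0 + b_w * weight_kg + b_h * height_cm + b_a * age_years

print(round(estimate_bmd(70, 175, 20), 3))
```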

Author 1: Jose Sulla-Torres
Author 2: Andrea Gutierrez-Quintanilla
Author 3: Henry Pinto-Rodriguez
Author 4: Rossana Gomez-Campos
Author 5: Marco Cossio-Bolanos

Keywords: Android mobile application; bone mineral density; firebase; software quality; ISO/IEC 25022

PDF

Paper 22: Fuzzy based Reliable Cooperative Spectrum Sensing for Smart Grid Environment

Abstract: The huge demand for spectrum has created an immediate need to make new licensed and/or unlicensed spectrum bands available, to satisfy the explosive growth of spectrum demands and the quality-of-service requirements of diverse applications. Spectrum shortage and harsh environments have become a challenging bottleneck to achieving reliable communications in the smart grid. Cognitive radio is the emerging technology for achieving both spectrum and reliability awareness, and cooperative spectrum sensing takes advantage of spatial diversity to reduce the impact of receiver uncertainty. However, harsh smart grid environments limit the advantages of cooperation due to variations in the signal-to-noise ratio, on which the energy detection technique depends. This paper proposes reliable spectrum detection for cluster-based cooperative spectrum sensing in harsh smart grid environments, where cognitive cluster heads and a centralized cognitive-radio-based fusion center are deployed to solve both the spectrum and the reliability problems. The proposed fuzzy inference system is based on three fuzzy descriptors: energy difference, link quality, and local probability of detection. The results show the superiority of the proposed fuzzy-based fusion scheme in enhancing the accuracy of spectrum decisions in harsh environments.
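The abstract names the three fuzzy descriptors but not the rule base; the deliberately simplified sketch below (triangular membership functions and a single min/AND rule, both assumptions of this illustration rather than the paper's inference system) shows how such descriptors can be fused into one detection confidence:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def detection_confidence(energy_diff, link_quality, local_pd):
    """Toy Mamdani-style rule: IF energy difference, link quality, and
    local detection probability are all 'high' THEN confidence is the
    min (fuzzy AND) of the three memberships."""
    high_e = tri(energy_diff, 0.2, 1.0, 1.8)
    high_q = tri(link_quality, 0.2, 1.0, 1.8)
    high_p = tri(local_pd, 0.2, 1.0, 1.8)
    return min(high_e, high_q, high_p)

print(detection_confidence(0.9, 1.0, 0.8))
```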

Author 1: Laila Nassef
Author 2: Reemah Al-Hebshi

Keywords: Cognitive radio; wireless networks; cooperative spectrum sensing; reliable fusion; fuzzy inference system

PDF

Paper 23: Modeling and Assessing the Power Consumption Behavior of Sensor Nodes using Petri Nets

Abstract: Power consumption is an influential and important concern in Wireless Sensor Networks (WSNs). Assessing and determining the impact of the power behavior of sensor nodes is a broad concern that should be addressed at the network pre-deployment phase rather than the post-deployment phase. Accurate power modeling improves the development process of WSN applications and protocols. This paper introduces the use of Colored Petri Nets (CPNs) to model the power behavior of, and the relations among, the different sensor node components when operated in an event-driven environment. The objective is to determine the overall power behavior of the nodes by considering the power consumed during the different state transitions at specific packet arrival and service rate values. Colored Petri Nets are a modeling language employed for validating and evaluating concurrent and distributed systems. The introduced model is beneficial since it provides the network designer with a way to evaluate alternative designs or to check the consistency of an existing power model's behavior. The proposed model has been validated through a comparison of the power behaviors of two sensor nodes, Mica2 and Telos. The results demonstrate the efficiency of the proposed model in characterizing and analyzing the energy consumption behavior of WSNs.
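A CPN model itself is not reproduced here; the sketch below only mimics the quantity such a model accumulates — energy as per-state power multiplied by time across state transitions — with illustrative, Mica2-like power figures that are assumptions of this example, not values from the paper:

```python
# Per-state power draw in milliwatts (illustrative figures only).
POWER_MW = {"sleep": 0.03, "idle": 24.0, "rx": 29.0, "tx": 42.0}

def energy_mj(schedule):
    """Total energy (millijoules) for a list of (state, seconds) steps,
    mimicking the state transitions a CPN power model would fire."""
    return sum(POWER_MW[state] * secs for state, secs in schedule)

# One duty cycle: mostly sleep, with short wake-ups to receive and transmit.
cycle = [("sleep", 0.95), ("idle", 0.02), ("rx", 0.02), ("tx", 0.01)]
print(round(energy_mj(cycle), 4))
```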

Author 1: Alaa E. S. Ahmed

Keywords: Wireless sensor networks; power modeling; event-trigger; colored petri net; WSN protocols; distributed systems

PDF

Paper 24: The TPOA Telecentre: A Community Sustainable Telecentre Architecture

Abstract: This paper presents the telecentre implementation for the Orang Asli villages in remote rural areas under the Telecentre Program for Orang Asli (TPOA). The TPOA telecentre architecture aims to help rural communities achieve sustainable telecentres through innovation and the strategic adoption of ICT technology. Lessons learned from our past telecentre experience outlined various challenges in the technical aspects of telecentre implementation and operation. The TPOA telecentre ICT architecture has been designed to address these issues, producing smoother telecentre operation that enables the rural communities to self-sustain their own telecentres. Technical support for a remote rural telecentre can be very expensive and impractical due to extreme physical access conditions; hence, the rural communities themselves have to carry out the support and maintenance needed to sustain the operation of the telecentre. The TPOA telecentre architecture provides a relatively easy-to-operate ICT platform that makes it possible for the Orang Asli to sustain, support, and maintain the telecentre operation themselves.

Author 1: Chong Eng Tan
Author 2: Poline Bala
Author 3: Sei Ping Lau
Author 4: Siew Mooi Wong

Keywords: Telecentre; sustainability; TPOA; telecentre architecture; ICT4D; rural development

PDF

Paper 25: A Computational Approach to Explore Extremist Ideologies in Daesh Discourse

Abstract: This paper uses computer-based frequency analysis to present an ideological discourse analysis of extremist ideologies in Daesh discourse. More specifically, by using computer-assisted text analysis, the paper investigates the hidden extremist ideologies behind the discourse of the first issue of Rumiyah, one of the main digital publications of Daesh. The paper's main objectives are to expose hidden ideologies beneath the mere linguistic form of discourse, to offer a better linguistic understanding of the manipulative use of language in religious discourse, and to highlight the relevance of computer-based frequency analysis to discourse studies and corpus linguistics. The paper also employs van Dijk's ideological discourse analysis, adopting his positive self-presentation and negative other-presentation strategies. Findings reveal that Daesh discourse in Rumiyah is rhetorically structured to hide the manipulative ideologies of its producers, which in turn functions to reshape the social, political, and religious attitudes of its readers.
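The core technique above — counting word frequencies and scoring them against self-presentation and other-presentation lexicons — can be sketched with the standard library; the toy sentence and word lists below are illustrative and not the corpus or lexicons used in the study:

```python
from collections import Counter

# Toy corpus and illustrative in-group / out-group lexicons.
text = ("the believers are steadfast and the believers are patient "
        "while the enemies are treacherous and the enemies spread corruption")
SELF_TERMS = {"believers", "steadfast", "patient"}
OTHER_TERMS = {"enemies", "treacherous", "corruption"}

freq = Counter(text.split())
self_score = sum(freq[w] for w in SELF_TERMS)    # positive self-presentation
other_score = sum(freq[w] for w in OTHER_TERMS)  # negative other-presentation
print(freq.most_common(2), self_score, other_score)
```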

Author 1: Ayman F. Khafaga

Keywords: Computational linguistics; concordance; Daesh; frequency analysis; ideology; Rumiyah

PDF

Paper 26: Optimized Cardiovascular Disease Detection and Features Extraction Algorithms from ECG Data

Abstract: Cardiovascular disease (CVD) is a leading cause of death, and several factors contribute to CVD in human beings. Early detection of CVD allows the necessary medical attention to be provided in time to prevent harm. Conventional techniques for CVD detection were manual and expensive, and often delivered inaccurate diagnoses. Over the last decade, less expensive methods based on Computer Aided Diagnosis (CAD) have gained significant medical attention. CAD-based techniques rely mainly on a patient's raw electrocardiogram (ECG) signals for the accurate and economical detection of CVD at an early stage. Several CAD systems have recently been designed for CVD diagnosis from raw ECG signals; however, the accuracy of ECG-based CVD detection is hampered by several research issues, such as QRS beat extraction, artefacts, and efficient feature extraction. This paper presents a novel CVD framework that uses raw ECG signals and a hybrid pre-processing algorithm designed to remove artefacts and noise from the raw ECG signal. A simple and efficient dynamic-thresholding technique is then designed to extract the beats, namely the Q, R, and S points and the ST segment, from the pre-processed ECG signal. The third step fuses the extracted beats and applies a feature extraction method called Normalized Higher Order Statistics (NHOS). The normalized HOS technique assesses the complexity among all QRS-based beats and delivers more discriminative features for accuracy enhancement. The final step is classification, using five different classifiers for CVD detection. The simulation results presented in this paper demonstrate that the proposed framework achieves a significant accuracy improvement.
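The paper's dynamic-thresholding step is not specified in detail; a minimal stand-in — flagging samples that exceed a threshold derived from the signal's own mean and standard deviation — can be sketched on a synthetic trace:

```python
def detect_r_peaks(signal, k=2.0):
    """Flag samples exceeding mean + k*std as candidate R peaks
    (a simple stand-in for a dynamic-thresholding beat extractor)."""
    n = len(signal)
    mean = sum(signal) / n
    std = (sum((x - mean) ** 2 for x in signal) / n) ** 0.5
    threshold = mean + k * std
    return [i for i, x in enumerate(signal) if x > threshold]

# Synthetic trace: flat baseline with three R-like spikes.
ecg = [0.1] * 30
for i in (5, 15, 25):
    ecg[i] = 1.2
print(detect_r_peaks(ecg))  # [5, 15, 25]
```

Because the threshold adapts to the signal's statistics, the same code works across traces with different baselines and amplitudes.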

Author 1: Sanjay Ghodake
Author 2: Shashikant Ghumbre
Author 3: Sachin Deshmukh

Keywords: Electrocardiogram; heart disease; cardiovascular disease; hybrid filtering; features extraction; QRS and ST beats

PDF

Paper 27: Facilitating the Detection of ASD in Ultrasound Video using RHOOF and SVM

Abstract: In the medical field, various motion tracking techniques, such as block matching, optical flow, and histograms of oriented optical flow (HOOF), are being explored for abnormality detection. The information furnished by the existing techniques is inadequate for medical diagnosis, and they share an inherent drawback: the entire image is considered for motion vector calculation, increasing the time complexity. Also, the motion vectors of unwanted objects are taken into account during abnormality detection, leading to misidentification and misdiagnosis. In this research, our main objective is to focus on the region of abnormality while avoiding the unwanted motion vectors from the rest of the heart, thereby achieving better time complexity. We propose a region-based HOOF (RHOOF) for blood motion tracking and estimation; experiments show that RHOOF is four times faster than HOOF. The performance of supervised machine learning techniques was evaluated based on accuracy, precision, sensitivity, specificity, and area under the curve. In the medical field, more importance is given to sensitivity than to accuracy. The support vector machine (SVM) outperformed the other techniques on sensitivity and time complexity, and was therefore chosen for abnormality classification in this work. An algorithm has been devised that combines RHOOF and SVM for the detection of atrial septal defect (ASD).
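The defining step of a region-based HOOF — binning only the flow vectors inside a region of interest by orientation, weighted by magnitude — can be sketched as follows (the toy vectors and bin count are assumptions of this illustration):

```python
import math

def rhoof(flow_vectors, bins=4):
    """Histogram of oriented optical flow restricted to a region of
    interest: bin each (dx, dy) vector by its angle, weight by magnitude,
    and normalize so the histogram sums to 1."""
    hist = [0.0] * bins
    for dx, dy in flow_vectors:
        angle = math.atan2(dy, dx) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += math.hypot(dx, dy)
    total = sum(hist)
    return [h / total for h in hist] if total else hist

# Flow sampled only inside the region of interest (toy vectors).
region_flow = [(1, 0), (0, 1), (1, 0), (-1, 0)]
print(rhoof(region_flow))  # [0.5, 0.25, 0.25, 0.0]
```

Restricting the input to the region's vectors is what gives the speed-up over whole-image HOOF: the histogram cost scales with the number of vectors considered.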

Author 1: Mrunal Ninad Annadate
Author 2: Manoj Nagmode

Keywords: Two dimensional; apical four chamber; region-based histograms of oriented optical flow; machine learning; area under the curve; support vector machine; congenital

PDF

Paper 28: Impact of Circular Field in Underwater Wireless Sensor Networks

Abstract: Underwater Wireless Sensor Networks (UWSNs) face challenges regarding high propagation delay, limited bandwidth, 3D topology, and excessive energy consumption. In this paper, a routing scheme with a circular field is proposed for the efficient collection of data packets using two mobile sinks in UWSNs. Results of the proposed scheme are compared with previously implemented schemes that measure the contribution of a mobile sink to data packet collection. In this study, we have compared the proposed scheme with current state-of-the-art routing protocols, and the statistical significance of this work was analyzed in MATLAB. Marked observations emerging from the obtained results include an improvement in network lifetime, increased throughput, an increase in alive nodes, and balanced energy consumption. In our view, these results strengthen the validity of the proposed circular field. A significant increase in received packets is observed because most nodes remain alive until 1500 rounds, which provides maximum communication and less chance of creating void holes.

Author 1: Syed Agha Hassnain Mohsan
Author 2: Mushtaq Ali Khan
Author 3: Arfan Mahmood
Author 4: Muhammad Hammad Akhtar
Author 5: Hussain Amjad
Author 6: Asad Islam
Author 7: Alireza Mazinani
Author 8: Syed Muhammad Tayyab Shah

Keywords: Underwater wireless sensor networks; mobile sink; routing scheme; circular field

PDF

Paper 29: A Comparative Analysis of Data Mining Techniques on Breast Cancer Diagnosis Data using WEKA Toolbox

Abstract: Breast cancer is considered the second most common cancer in women compared with all other cancers. It is fatal in less than half of all cases but is the main cause of mortality in women, accounting for 16% of all cancer mortalities worldwide. Early diagnosis of breast cancer increases the chance of recovery, and data mining techniques can be utilized in that early diagnosis. In this paper, an academic experimental breast cancer dataset is used to perform a practical data mining experiment using the Waikato Environment for Knowledge Analysis (WEKA) tool. The WEKA Java application is a rich resource for collecting performance metrics during the execution of experiments. Pre-processing and feature extraction are used to optimize the data. The classification process used in this study is summarized in thirteen experiments; ten of these use different classification algorithms: Naïve Bayes, Logistic Regression, Lazy IBk (instance-based learning with parameter k), Lazy KStar, Lazy Locally Weighted Learning, Rules ZeroR, Decision Stump, Decision Trees J48, Random Forest, and Random Trees. The production of a predictive model was automated using classification accuracy. Further, several classification experiments on the Wisconsin Diagnostic Breast Cancer and Wisconsin Breast Cancer datasets were conducted to compare the success rates of the different methods. The results show that the Lazy IBk (k-NN) classifier can achieve 98% accuracy, the best among the tested classifiers. The main strengths of the study are the compactness of using 13 different data mining models and 10 different performance measurements, and the plotting of classification error figures.
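The best-performing classifier here, Lazy IBk, is WEKA's k-nearest-neighbour learner; its core idea fits in a few lines. The sketch below uses a toy two-feature dataset as a stand-in for the Wisconsin measurements, which are not reproduced here:

```python
def knn_predict(train, query, k=3):
    """Minimal IBk-style (k-nearest-neighbour) classifier: majority
    vote among the k training points closest to the query."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Toy two-feature dataset standing in for tumour measurements.
train = [((1.0, 1.1), "benign"), ((0.9, 1.0), "benign"),
         ((1.2, 0.8), "benign"), ((3.0, 3.2), "malignant"),
         ((3.1, 2.9), "malignant"), ((2.8, 3.0), "malignant")]
print(knn_predict(train, (1.0, 1.0)))   # benign
print(knn_predict(train, (3.0, 3.0)))   # malignant
```

"Lazy" refers to the fact that no model is built up front: all work happens at query time, against the stored training instances.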

Author 1: Majdah Alshammari
Author 2: Mohammad Mezher

Keywords: Data mining; breast cancer; data mining techniques; classification; WEKA toolbox

PDF

Paper 30: Performance Analysis of Efficient Pre-trained Networks based on Transfer Learning for Tomato Leaf Diseases Classification

Abstract: Early diagnosis and accurate identification of tomato leaf diseases help control the spread of infection and keep plants healthy, which in turn increases the crop harvest. Nine common types of tomato leaf disease have a great effect on the quality and quantity of the tomato crop yield, and traditional approaches to feature extraction and image classification cannot ensure a high accuracy rate in identifying them. This paper suggests an automatic detection approach for tomato leaf diseases based on fine-tuning and transfer learning of pre-trained deep convolutional neural networks. Three pre-trained deep networks, AlexNet, VGG-16 Net, and SqueezeNet, are evaluated for tomato leaf disease classification. The networks are trained on two different datasets: a small dataset covering only four diseases, and a large dataset of leaves exhibiting symptoms of nine diseases plus healthy leaves. The performance of the networks is evaluated in terms of classification accuracy and the elapsed training time. Performance on the small dataset is also compared with that of the state-of-the-art technique in the literature: the suggested networks outperform its classification accuracy by 8.1% and 15%. On the large dataset, the pre-trained AlexNet achieves a high classification accuracy of 97.4% with a shorter training time than the other pre-trained networks. Overall, AlexNet shows outstanding performance for diagnosing tomato leaf diseases in terms of accuracy and execution time; by contrast, VGG-16 Net achieves the best classification accuracy on the small dataset but the longest training time among the networks.

Author 1: Sawsan Morkos Gharghory

Keywords: Deep learning; Alex; squeeze; VGG16 networks; tomato leaf diseases diagnosis and classification

PDF

Paper 31: A New Big Data Architecture for Real-Time Student Attention Detection and Analysis

Abstract: Big Data technologies and their analytical methods can help improve the quality of education. They can be used to process and analyze classroom video streams to predict student attention, which would greatly improve the learning-teaching experience. With the increasing number of students and the expansion of educational institutions, processing and analyzing video streams in real time becomes a complicated issue. In this paper, we review the existing systems for student attention detection, open-source real-time data stream processing technologies, and the two major data stream processing architectures. We also propose a new Big Data architecture for real-time student attention detection.

Author 1: Tarik Hachad
Author 2: Abdelalim Sadiq
Author 3: Fadoua Ghanimi

Keywords: Attention detection; big data analysis; stream processing; real-time processing; Apache Flink; Apache Spark; Apache Storm; Lambda architecture; Kappa architecture

PDF

Paper 32: Analysis of K-means, DBSCAN and OPTICS Cluster Algorithms on Al-Quran Verses

Abstract: Chapter Al-Baqarah is the longest chapter in the Holy Quran, and it covers various topics. The Quran is the primary text of Islamic faith and practice; millions of Muslims worldwide use it as their reference book, and it guides Muslims and Islamic scholars in matters of law and life. Text clustering (unsupervised learning) is the process of partitioning a text collection into groups of similar documents. Many text clustering algorithms and techniques are used to form clusters, including partitioning and density-based methods. In this paper, k-means is chosen as a partitioning method, and DBSCAN and OPTICS as density-based methods. This study aims to investigate which algorithm produces the most accurate clustering of the English tafseer of chapter Al-Baqarah. Data preprocessing and feature extraction using Term Frequency-Inverse Document Frequency (TF-IDF) were applied to the dataset. The results show that k-means performed best, even with the smallest Silhouette Coefficient (SC) score, owing to its shorter implementation time and the absence of noise across seven clusters of the Al-Baqarah chapter. OPTICS also produced no noise, with a medium SC score, but had the longest implementation time due to its complexity.
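The TF-IDF feature extraction step named above weights each term by its frequency in a document and by how rare it is across the collection; a minimal sketch on toy tokenised documents (not the actual tafseer text):

```python
import math

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenised documents."""
    n = len(docs)
    df = {}                                  # document frequency per term
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        tf = {t: doc.count(t) / len(doc) for t in set(doc)}
        weights.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return weights

# Toy documents; real input is the tokenised English tafseer text.
docs = [["guidance", "for", "mankind"],
        ["guidance", "and", "mercy"],
        ["signs", "for", "people"]]
w = tf_idf(docs)
print(round(w[0]["mankind"], 3))  # a term unique to one doc scores highest
```

These weight vectors are then fed to k-means, DBSCAN, or OPTICS, which cluster documents by the similarity of their TF-IDF profiles.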

Author 1: Mohammed A. Ahmed
Author 2: Hanif Baharin
Author 3: Puteri N.E. Nohuddin

Keywords: K-means; DBSCAN; OPTICS; Al-Baqarah clustering; Silhouette Coefficient; Tafseer; text clustering

PDF

Paper 33: Image Restoration based on Maximum Entropy Method with Parameter Estimation by Means of Annealing Method

Abstract: Image restoration based on the Maximum Entropy Method (MEM), with parameter estimation by means of an annealing method, is proposed. The proposed method allows spatial resolution enhancement: using overlap sampling with a low-resolution sensor, high spatial resolution (corresponding to the sampling interval) can be achieved through ground data processing with image restoration methods. Through experiments with simulated imagery derived from Advanced Very High Resolution Radiometer (AVHRR) data, it was found that spatial resolution enhancement can be achieved; MEM is superior to the other methods when the S/N ratio is poor (less than 33), while the Conjugate Gradient Method (CGM) is superior when the S/N ratio is higher than 33. It was also found that CGM is superior to the proposed method in the presence of sampling jitter.
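The abstract does not spell out the MEM formulation; in its standard form (an assumption here, not quoted from the paper), MEM restoration maximizes image entropy subject to a data-fidelity constraint:

```latex
\max_{f \ge 0}\; S(f) = -\sum_{i} f_i \ln f_i
\quad \text{subject to} \quad
\chi^2(f) = \sum_{j} \frac{\bigl[(Hf)_j - g_j\bigr]^2}{\sigma_j^2} \le C
```

where f is the restored image, g the observed (overlap-sampled) data, H the blur/sampling operator, and sigma_j the noise level per sample. The Lagrange multiplier that balances S against chi-square is the kind of parameter the annealing schedule is used to estimate.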

Author 1: Kohei Arai

Keywords: Image restoration; Maximum Entropy Method (MEM); annealing; Advanced Very High Resolution Radiometer (AVHRR); Conjugate Gradient Method (CGM)

PDF

Paper 34: Design and Implementation of 6LoWPAN Application: A Performance Assessment Analysis

Abstract: Industrial Revolution 4.0 promises an overall improvement to communications technology by improving the quality and flexibility of IoT application deployment. Currently, most of these applications are embedded devices from various manufacturers, networks, and technologies. As such, simply getting the myriad devices and technologies to work together would be chaotic, let alone making them work in perfect harmony. Regardless, the IoT aspires to the seamless integration and interoperability of these devices and technologies. In realizing this goal, the ability of an IoT system to adopt and adapt to new devices, services, and applications is crucial, while in no way jeopardizing or compromising the existing system, especially the routing protocol. In view of IP-based communication technology in WSNs, the 6LoWPAN network has been chosen for the task, and the RPL protocol has been strongly considered as the 6LoWPAN solution. However, RPL overhead tends to spiral upwards when additional information transmission occurs. To mitigate this anomaly, HRPL was proposed as an enhancement of the RPL protocol that reduces routing overhead. This study focuses on the performance analysis of RPL and HRPL based on physical experimentation with a 6LoWPAN network in a real scenario. The results show that the HRPL protocol outperforms RPL in all the tested performance evaluations: CTO (38.7%), latency (26%), and convergence time (37%). It was also discovered that the number of DIS and DAO (RPL control message) packets is significantly reduced when DIO messages are reduced; latency and convergence time registered corresponding decreases. Based on our observations, further experiments are needed to investigate how topology variants affect HRPL's capabilities.

Author 1: Nin Hayati Mohd Yusoff
Author 2: Nurul Azma Zakaria
Author 3: Adil Hidayat Rosli

Keywords: 6LoWPAN protocol; performance analysis; overhead

PDF

Paper 35: An Automated Framework for Detecting Change in the Source Code and Test Case Change Recommendation

Abstract: Improvements in, and the acceleration of, software development have contributed to high-quality services in all domains and fields of industry, increasing the demand for high-quality software development. To match this demand, the software development industry is adopting highly skilled human resources and advanced methodologies and technologies to accelerate the development life cycle. In the software development life cycle, one of the biggest challenges is change management between versions of the source code. Versioning of the source code can be caused by various factors, such as changes in requirements, functional updates, or technological upgrades. Change management affects not only the correctness of a software service release but also the number of test cases. The development life cycle is often delayed by the lack of proper version control and, owing to improper version control, by repetitive testing iterations. Hence, the demand for version-control-driven test case reduction methods cannot be ignored. A number of version control mechanisms have been proposed in parallel research; nevertheless, most are criticized for not contributing to test case generation or reduction. This work therefore proposes a novel probabilistic refactoring detection and rule-based test case reduction method that simplifies testing and version control for software development. The refactoring process is widely adopted by software developers to make efficient changes to code structure or functionality, or to apply changes in requirements. This work demonstrates very high accuracy for change detection and management, which results in higher accuracy for test case reduction. The final outcome of this work is a reduction in software development time, making the software development industry more efficient.
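The combination described above — detect what changed between versions, then keep only the test cases covering it — can be sketched with a rule-based selector; the function-body hashes and the function-to-test map below are hypothetical, not the paper's probabilistic detector:

```python
def affected_tests(old_funcs, new_funcs, test_map):
    """Select only the test cases covering functions whose source
    changed between two versions (a rule-based reduction sketch)."""
    changed = {name for name in new_funcs
               if old_funcs.get(name) != new_funcs[name]}
    return sorted({t for f in changed for t in test_map.get(f, [])})

# Function name -> hash of its body (hypothetical version snapshots).
v1 = {"login": "a1", "pay": "b7", "report": "c3"}
v2 = {"login": "a1", "pay": "b9", "report": "c3"}   # only pay() changed
test_map = {"login": ["t_login"], "pay": ["t_pay", "t_refund"],
            "report": ["t_report"]}
print(affected_tests(v1, v2, test_map))  # ['t_pay', 't_refund']
```

Instead of rerunning all four tests, only the two covering the changed function are selected, which is the source of the claimed reduction in testing iterations.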

Author 1: Niladri Shekar Dey
Author 2: Purnachand Kollapudi
Author 3: M V Narayana
Author 4: I Govardhana Rao

Keywords: Change detection; pre-requisite detection; feature detection; functionality detection and test case change recommendation

PDF

Paper 36: Investigating Transmission Power Control Strategy for Underwater Wireless Sensor Networks

Abstract: Underwater wireless sensor networks (UWSNs) have proven their high stature in both civil and military operations, including underwater life monitoring, communication, and invasion detection. However, UWSNs are vulnerable to a wide class of power consumption issues. Underwater sensor nodes consume power provided by integrated, limited batteries, and replacing these batteries under harsh aquatic conditions is a challenging issue. Thus, in an energy-constrained underwater system it is pivotal to seek strategies for improving the life expectancy of the sensors. In this paper, we propose a transmission power control mechanism for UWSNs. We experimentally investigate the impact of transmission power and propose a control mechanism to enhance the performance of the underwater wireless sensor network. In the proposed mechanism, source nodes adjust their transmission power according to the location of the destination node. This paper aims to provide a mechanism that can be incorporated into SEEC, and it also outlines the mathematical modeling of the proposed idea. Moreover, we compare the results of our scheme with previously implemented schemes.
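The abstract says a source node adjusts its transmit power to the destination's location; a sketch under an assumed, simplified underwater path-loss model (geometric spreading plus a linear absorption term — not the paper's mathematical model):

```python
import math

def required_tx_power_db(distance_m, p_rx_min_db=-90.0,
                         alpha_db_per_m=0.05, spreading_k=1.5):
    """Transmit power needed so the received signal still meets the
    receiver threshold after spreading and absorption losses.
    All parameter values are illustrative assumptions."""
    loss = 10 * spreading_k * math.log10(distance_m) \
           + alpha_db_per_m * distance_m
    return p_rx_min_db + loss

near, far = required_tx_power_db(100), required_tx_power_db(500)
print(round(near, 1), round(far, 1))
assert far > near  # farther destinations need more transmit power
```

A node that knows the destination's distance can thus spend only the power that link actually needs, rather than always transmitting at maximum power.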

Author 1: Syed Agha Hassnain Mohsan
Author 2: Hussain Amjad
Author 3: Alireza Mazinani
Author 4: Sahibzada Adil Shahzad
Author 5: Mushtaq Ali Khan
Author 6: Asad Islam
Author 7: Arfan Mahmood
Author 8: Ahmad Soban

Keywords: Underwater wireless sensor networks; transmission power; sensor; power consumption

PDF

Paper 37: Attendance System using Machine Learning-based Face Detection for Meeting Room Application

Abstract: In a modern meeting room, a smart system for recording attendance quickly is mandatory. Most existing systems handle attendance manually, through registration or fingerprints. Although the fingerprint method can reject an unknown person and grant access to a known person, it takes time to register each person one by one; moreover, long queues can form for fingerprint checking before entering the meeting room. Machine learning, along with Internet of Things (IoT) technology, is a strong solution that offers many advantages when applied in meeting rooms. Generally, the method used is to record attendance by detecting faces. In this paper, we present facial-recognition-based authentication built on machine learning technology for meeting room access. Furthermore, a dedicated website to display the detection results, together with the data storage design, was developed and tested. The method uses 1) the Dlib library for deep learning purposes, 2) OpenCV for video camera processing, and 3) the Face Recognition library for Dlib processing. The proposed system allows multiple cameras to be placed in a meeting room as needed; however, in this work, we used only one camera as the main system. The tests conducted include identification of one known person, identification of one unknown person, and identification of two and three people. The parameter of interest is the time required to detect the number of faces recorded by the camera. The results show whether each face is recognized or not, and the outcome is displayed on the website.
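In the face_recognition library the authors use, each detected face becomes a 128-dimensional encoding, and a match is a Euclidean distance below the library's default tolerance of about 0.6. The sketch below mirrors that Known/Unknown decision on toy 4-dimensional stand-in vectors, so it runs without a camera or dlib:

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def identify(encoding, known, threshold=0.6):
    """Return the registered name whose stored encoding is closest to the
    query, or 'Unknown' if even the closest one exceeds the threshold.
    Toy 4-d vectors stand in for dlib's 128-d face encodings."""
    best = min(known, key=lambda name: euclidean(encoding, known[name]))
    return best if euclidean(encoding, known[best]) < threshold else "Unknown"

known = {"alice": (0.1, 0.9, 0.3, 0.5), "bob": (0.8, 0.2, 0.7, 0.1)}
print(identify((0.12, 0.88, 0.31, 0.52), known))  # alice
print(identify((0.5, 0.5, 0.5, 0.5), known))      # Unknown
```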

Author 1: Rahmat Muttaqin
Author 2: Nopendri
Author 3: Syifaul Fuada
Author 4: Eueung Mulyana

Keywords: Detection; face; IoT; meeting room; attendance

PDF

Paper 38: Combined Text Mining: Fuzzy Clustering for Opinion Mining on the Traditional Culture Arts Work

Abstract: The Indonesian government is currently intensifying work programs in the field of traditional arts and culture. In order to promote the country's culture, the government has enacted a law on cultural promotion. One indicator of the achievement of cultural promotion is the collection of data on traditional culture; the mapped and inventoried data can then be processed into information and knowledge. In this research, performance indicators were compiled from connoisseurs of traditional works of art using data from the city of Malang, East Java, Indonesia. The audience's opinions of cultural performances can be used as a benchmark for the success of traditional cultural promotion: when a culture is explored and displayed again, it is important to know the audience's satisfaction with, and understanding of, the performance they have just witnessed. Respondents' descriptions, in the form of opinions on the artwork, were collected as data and processed using text mining with Fuzzy C-Means clustering, to determine the audience's opinion along three dimensions: Feeling, which relates to feelings when viewing the beauty of the artwork; Value, which relates to the assessment of an artwork, such as the artistic weight it contains; and Empathizing, which relates to empathy or respect for the art world, including related professions such as dancers and musicians. The results of this study show that the proposed method performs well, as indicated by a cluster variance of V = 0.00000217901 on the test data. Based on this value, it can be concluded that all cluster variants are good.
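The defining step of Fuzzy C-Means — unlike k-means, each point receives a graded membership in every cluster — can be sketched with one membership update; the 1-D toy scores and centers below are assumptions of this illustration, not the paper's data:

```python
def fcm_memberships(points, centers, m=2.0):
    """One fuzzy C-means membership update: each point's membership in a
    cluster depends on the ratio of its distances to all centers
    (m is the fuzzifier; memberships in each row sum to 1)."""
    memberships = []
    for x in points:
        dists = [abs(x - c) for c in centers]
        if 0.0 in dists:  # point sits exactly on a center
            memberships.append([1.0 if d == 0.0 else 0.0 for d in dists])
            continue
        row = [1.0 / sum((d / dk) ** (2 / (m - 1)) for dk in dists)
               for d in dists]
        memberships.append(row)
    return memberships

# 1-D toy opinion scores clustered around two centers.
u = fcm_memberships([1.0, 2.0, 9.0], centers=[1.0, 9.0])
print([[round(v, 3) for v in row] for row in u])
```

The graded rows are what lets an opinion contribute partially to Feeling, Value, and Empathizing at once, rather than being assigned to exactly one dimension.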

Author 1: Elta Sonalitha
Author 2: Anis Zubair
Author 3: Priyo Dari Molyo
Author 4: Salnan Ratih Asriningtias
Author 5: Bambang Nurdewanto
Author 6: Bondhan Rio Prambanan
Author 7: Irfan Mujahidin

Keywords: Text mining; opinion mining; fuzzy clustering; arts work

PDF

Paper 39: Increasing User Satisfaction of Mobile Commerce using Usability

Abstract: Online shopping continues to simplify people’s way of life in the present world. People no longer need to physically visit stores to buy items for home use or other purposes; this can be done online using mobile applications to order preferred items, which are then delivered. However, the increase in the features of mobile applications and mobile devices makes usability testing a necessary aspect of today’s technological advancement. This paper uses an experiment-based usability testing evaluation of the Lazada Indonesia mobile application to understand four usability factors, namely ease of use, efficiency, functionality, and satisfaction. The test was performed with 40 participants, all students from Universitas Atma Jaya Yogyakarta, Universitas Gadjah Mada, and Universitas Sanata Dharma. They performed six tasks on the mobile application and then answered a questionnaire to capture their views on its usability. The results were analyzed using SPSS software, with descriptive statistics reported as arithmetic means and standard deviations. The evaluation showed that the mobile application is easy to use, efficient, and has good functionality. However, participants mentioned some issues indicating that users were not fully satisfied with the application; designers should therefore address these usability issues to increase user satisfaction.

Author 1: Ninyikiriza Deborah Lynn
Author 2: Arefin Islam Sourav
Author 3: Djoko Budiyanto Setyohadi

Keywords: Usability; usability testing; mobile commerce; mobile application

PDF

Paper 40: Recognition of Local Birds of Bangladesh using MobileNet and Inception-v3

Abstract: Recognition of bird species can be a challenging task due to various complex factors. The purpose of this work is to distinguish various local bird species of Bangladesh from image data. MobileNet and Inception-v3, which are primarily image classification models, are used here to accomplish this work. We have used a total of four approaches, namely Inception-v3 without transfer learning, Inception-v3 with transfer learning, MobileNet without transfer learning, and MobileNet with transfer learning. To evaluate our experimental results, we have calculated the F1 score alongside each model’s accuracy and also presented the ROC curve to evaluate each model’s output quality. We have then compared the four approaches. The experimental results demonstrate the working capability of all four approaches. Among them, MobileNet with transfer learning outperforms the others, obtaining a test accuracy of 91.00%. For each of the classes, MobileNet with transfer learning also obtained a higher F1 score than the other approaches.
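
The F1 score reported alongside accuracy combines per-class precision and recall; a small sketch with hypothetical confusion counts for one bird class:

```python
# Per-class precision, recall and F1 from raw confusion counts -- the
# metrics reported alongside accuracy. The counts below are hypothetical.

def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# e.g. one bird class: 91 correct detections, 5 false alarms, 9 misses
p, r, f1 = precision_recall_f1(tp=91, fp=5, fn=9)
```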

Author 1: Md. Mahbubur Rahman
Author 2: Al Amin Biswas
Author 3: Aditya Rajbongshi
Author 4: Anup Majumder

Keywords: Recognition; MobileNet; Inception-v3; transfer learning; computer vision; Bangladeshi bird

PDF

Paper 41: EDES-ACM: Enigma Diagonal Encryption Standard Access Control Model for Data Security in Cloud Environment

Abstract: Data management across different domains is a foremost requirement for many organizations worldwide. Organizations establish the cloud computing paradigm to handle data effectively due to its robust scaling at low cost. In recent times the usage of the cloud and its data has been increasing in multiuser environments, raising the issue of ensuring the security of data uploaded to the cloud by its owners. Cloud service providers and researchers have implemented several schemes to ensure data security; however, providing security in a multiuser setting remains difficult, with data leakage a persistent risk. A novel Enigmatic Diagonal Encryption Standard (EDES) algorithm to provide access control over the cloud is proposed; the framework built around it is named EDES-ACM. The Inverse Decisional Diffie-Hellman (IDDH) technique is used for generating the group signature. Data is encrypted with the EDES algorithm by the data owner; the encrypted data is provided to the user and accessed with an EDES-based private key. A group manager monitors the cloud and provides an activity report to the owners, based on which revocation is performed. The framework is validated for its performance on security parameters and compared with existing models on computation cost; EDES-ACM proves effective with low computation cost. Future work on the proposed framework is to include blockchain technology, which may improve security and the accumulation of data.

Author 1: Sameer
Author 2: Harish Rohil

Keywords: Cloud; security; multiuser; EDES-ACM; computation cost

PDF

Paper 42: Detecting Health-Related Rumors on Twitter using Machine Learning Methods

Abstract: Nowadays, heavy internet usage leads to tremendous information growth as a result of our daily activities across different sources such as news articles, forums, websites, emails, and social media. Social media is a rich source of information that deeply affects users through its content. However, these platforms also carry many rumors, which can have critical consequences for people’s lives, especially when they concern health-related information. Several studies have focused on automatically detecting rumors on social media by applying machine learning and other intelligent methods; however, few have addressed health-related rumors in the Arabic language. Therefore, this paper deals with detecting health-related rumors, focusing on cancer treatment information spread over social media in Arabic. In addition, it presents the process of creating a dataset, called the Health-Related Rumors Dataset (HRRD), which will be made available for further studies in health-related research. Furthermore, an experiment has been conducted to investigate the performance of several machine learning methods in detecting Arabic health-related rumors on social media. The experimental results showed that rumors can be detected with an accuracy of 83.50%.
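
As a rough illustration of the kind of classifier such an experiment might include (the abstract does not name the winning method), here is a toy bag-of-words Naive Bayes rumor detector; the tiny English examples are hypothetical stand-ins for the Arabic HRRD data:

```python
# Toy bag-of-words Naive Bayes for rumor vs. non-rumor text -- one classic
# text classifier such a comparison could include. Data is hypothetical.
import math
from collections import Counter, defaultdict

train = [
    ("miracle herb cures cancer overnight", "rumor"),
    ("secret remedy doctors hide cures cancer", "rumor"),
    ("clinical trial shows modest survival benefit", "real"),
    ("hospital publishes peer reviewed treatment results", "real"),
]

counts = defaultdict(Counter)   # class -> word frequency
priors = Counter()
for text, label in train:
    priors[label] += 1
    counts[label].update(text.split())

vocab = {w for c in counts.values() for w in c}

def predict(text):
    best, best_lp = None, -math.inf
    for label in priors:
        lp = math.log(priors[label] / sum(priors.values()))
        total = sum(counts[label].values())
        for w in text.split():
            # Laplace smoothing so unseen words do not zero the score
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

label = predict("herb remedy cures cancer")
```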

Author 1: Faisal Saeed
Author 2: Wael M.S. Yafooz
Author 3: Mohammed Al-Sarem
Author 4: Essa Abdullah Hezzam

Keywords: Health-related misinformation; cancer disease; fake information; Twitter; classification

PDF

Paper 43: A Novel Low Power, Minimal Dead Zone Digital PFD for Biomedical Applications

Abstract: Chronic diseases and rising aging populations are the major drivers of demand for low-power, low-noise, lifetime-performance biomedical implantable devices. Efficient architectural designs are responsible for meeting the requirements set out above. This paper focuses on an ADPLL DPFD architecture for implantable biomedical devices. For a high-performance DPFD, the dead zone and lock-in time remain limitations of ADPLLs. In this paper, designing a dead-zone-free DPFD with fast locking and low phase noise is taken up as the challenge. This is accomplished by carefully controlling the reference and feedback clock frequencies of the phase detector with the proposed NIKSTRO/SURAV latch-based sense amplifier. The proposed architecture was developed and simulated using 45nm technology; it is observed to provide a 20ns dead zone with 4.8mW of power consumption at 1.8GHz, while its lock-in time is 340ns with moderate phase noise. The design also showed better results when compared with existing ones.

Author 1: Sudhakiran Gunda
Author 2: Ernest Ravindran R. S

Keywords: Biomedical Implantable Device (BIMD); Digital Phase Frequency Detector (DPFD); Digital Controlled Oscillator (DCO); Sense Amplifier Based Flip-flop (SAFF); NIKSTRO or SURAV

PDF

Paper 44: Comprehensive Interaction Model for Cloud Management

Abstract: Cloud computing is readily being adopted by enterprises due to benefits such as the ability to provide better service to customers, improved flexibility, a lower barrier to entry, lower maintenance cost for IT services, and availability. However, the interaction between cloud service provider and customer is not yet well defined. Understanding the service offered when approaching the cloud computing paradigm, as well as the actions required while receiving a cloud service (e.g. provisioning of new resources, scaling up/down, billing), remains a concern for enterprises. This paper proposes a segregated interaction model to manage the receipt of a cloud service in a hierarchical way.

Author 1: Md. Nasim Adnan
Author 2: Md. Majharul Haque
Author 3: Mohammad Rifat Ahmmad Rashid
Author 4: Mohammod Akbar Kabir
Author 5: Abu Sadat Mohammad Yasin
Author 6: Muhammad Shakil Pervez

Keywords: Cloud computing; cloud management; cloud customer

PDF

Paper 45: Determining the Presence of Metabolic Pathways using Machine Learning Approach

Abstract: The reconstruction of the metabolic network of an organism from its genome sequence is a key challenge in systems biology. One strategy for addressing this problem is to predict the presence or absence of a metabolic pathway from a reference database of known pathways. Although such models have been constructed manually, a manual approach obviously cannot cover the thousands of genomes that have been sequenced; more advanced techniques are therefore needed for the computational representation of metabolic networks. In this research, we explore a machine learning approach to determine the presence or absence of a metabolic pathway based on its annotated genome. We have built our own dataset of 4978 pathway instances, consisting of 1585 pathways, each with 20 different representations from 20 organisms. The pathways were obtained from the BioCyc Database Collection, and each pathway is described by 20 features. To identify a suitable classifier, we experimented with five machine learning algorithms, with and without feature selection methods: Decision Tree, Naive Bayes, Support Vector Machine, K-Nearest Neighbor, and Logistic Regression. Our experiments show that the Support Vector Machine is the best classifier, with an accuracy of 96.9%, while the maximum accuracy reached by previous work is 91.2%. Furthermore, adding more data to the pathway dataset can improve the performance of the machine learning classifiers.
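
To illustrate the presence/absence classification task, here is a compact sketch of one of the five compared algorithms, K-Nearest Neighbor, over hypothetical stand-ins for the pathway features (the paper's best model was the SVM):

```python
# k-nearest-neighbour sketch of pathway presence/absence classification.
# Feature vectors are hypothetical stand-ins for the paper's 20 features.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote of k nearest."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [
    ([0.9, 0.8, 1.0], "present"),
    ([0.8, 0.9, 0.9], "present"),
    ([0.1, 0.2, 0.0], "absent"),
    ([0.0, 0.1, 0.1], "absent"),
    ([0.7, 0.7, 0.8], "present"),
]
label = knn_predict(train, [0.85, 0.8, 0.9])
```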

Author 1: Yara Saud Aljarbou
Author 2: Fazilah Haron

Keywords: Metabolic pathway prediction; pathway dataset; metabolic network of organism; machine learning; support vector machine

PDF

Paper 46: Intelligent and Scalable IoT Edge-Cloud System

Abstract: Scalability is an absolute necessity for the success of the IoT’s unprecedentedly growing network. The operational and financial bottlenecks that accompany growth can be overwhelming for those looking to integrate IoT solutions. As IoT technology advances, so does the scale of operations required to reach a wider target region. Breakdown may occur not because of a device’s inability to scale, but because of data scale: as more devices are incorporated, more data will be amassed, stored, processed, and scrutinized, and this volume simply cannot be managed from a single edge device using a vertical approach. When starting small, it is important to look to the future and anticipate growth; companies that cannot adapt to unpredictable market changes will fold without the right IoT architecture in place. Therefore, a scalable IoT framework is proposed in this paper, which provides load balancing and scalability by deploying horizontal scaling for the system. The framework utilizes a Self-Organizing Map (SOM) to classify applications as delay-sensitive or delay-insensitive, so that proper decisions can be made based on the incoming data (typically signals); if an edge node gets flooded with data, it is scaled out by instigating another edge node to compute the additional requests. The proposed system is termed intelligent because its algorithm empowers the edge to make decisions and classify applications based on the application’s type of requirement.
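
A minimal self-organizing map sketch, assuming hypothetical 1-D "deadline" inputs, shows the classification idea (the paper's actual SOM topology and features are not specified in the abstract):

```python
# Minimal SOM: four units on a line organise 1-D "deadline" values so
# that delay-sensitive (short-deadline) requests map to one end of the
# grid and delay-insensitive ones to the other. Data is hypothetical.
import random

random.seed(0)
weights = [random.random() for _ in range(4)]      # 4 SOM units, 1-D input

def best_matching_unit(x):
    return min(range(len(weights)), key=lambda i: abs(weights[i] - x))

def train(samples, epochs=50, lr=0.3, radius=1):
    for _ in range(epochs):
        for x in samples:
            b = best_matching_unit(x)
            for i in range(len(weights)):
                if abs(i - b) <= radius:           # neighbourhood update
                    weights[i] += lr * (x - weights[i])

# short deadlines (delay-sensitive) vs. long deadlines (delay-insensitive)
train([0.05, 0.1, 0.08, 0.9, 0.95, 0.85])
bmu_low = best_matching_unit(0.07)
bmu_high = best_matching_unit(0.9)
```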

Author 1: Shifa Manihar
Author 2: Tasneem Bano Rehman
Author 3: Ravindra Patel
Author 4: Sanjay Agrawal

Keywords: Scalability; internet of things; self organizing map; edge; horizontal scalability

PDF

Paper 47: Scalable Asymmetric Security Mechanism for Internet of Things

Abstract: The Internet of Things places rigorous demands on quality of service and on the vitality of security. It is vital to provide a highly reliable encryption algorithm with low complexity and computational expense in the IoT paradigm. Most protocols designed in the past for communication between sender and receiver are based on asymmetric cryptography algorithms with high computational cost. Therefore, this paper presents a less complex, more secure, and fast encryption scheme for communication between devices, namely Asymmetric Scalable Security between the sender and the receiver of information. We present a reliable, secure, scalable, and efficient communication protocol that uses an asymmetric algorithm to secure the exchange of information between sender and receiver. The proposed protocol is a lightweight encryption method that does not require complex resources to perform the computations involved in asymmetric cryptography. Simulation results show that the proposed method is efficient in terms of time and space and ensures confidentiality. The proposed scheme is therefore beneficial for secure communication among power- and resource-constrained IoT devices.
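
The abstract does not specify the proposed algorithm's internals; as a generic illustration of asymmetric key agreement between a sender and a receiver, a textbook Diffie-Hellman exchange (toy parameters, not production-safe, and not the paper's scheme) is:

```python
# Textbook Diffie-Hellman key agreement: both parties derive the same
# shared secret without ever transmitting it. Illustrative parameters
# only -- never use these sizes or this raw form in production.
import secrets

P = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a small prime modulus (toy size)
G = 5                    # public generator

a = secrets.randbelow(P - 2) + 1   # sender's private exponent
b = secrets.randbelow(P - 2) + 1   # receiver's private exponent
A = pow(G, a, P)                   # sender's public value
B = pow(G, b, P)                   # receiver's public value

shared_sender = pow(B, a, P)
shared_receiver = pow(A, b, P)
```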

Author 1: Ayesha Siddiqa
Author 2: Sohail Ahmed

Keywords: Asymmetric cryptography; confidentiality; internet of things; security

PDF

Paper 48: A Framework for Brain Tumor Segmentation and Classification using Deep Learning Algorithm

Abstract: A brain tumor is a cluster of abnormal tissue, and it is essential to categorize brain tumors for treatment using Magnetic Resonance Imaging (MRI). Segmenting tumors from brain MRI is understood to be a complicated yet crucial task, with further use in surgery, medical preparation, and assessment; brain MRI classification is equally essential. Advances in machine learning and technology will aid radiologists in diagnosing tumors without taking invasive steps. In this paper, a method for brain tumor detection and classification is presented. Brain tumor detection proceeds through pre-processing, skull stripping, and tumor segmentation, employing a thresholding method followed by morphological operations. Because the number of training images influences the features extracted by a CNN, and CNN models overfit after some epochs, deep learning CNNs with transfer learning techniques are adopted. Tumorous brain MRI scans are classified using a CNN based on the AlexNet architecture, and malignant brain tumors are further classified using the GoogLeNet transfer learning architecture. The performance of this approach is evaluated with precision, recall, F-measure, and accuracy metrics.
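
The thresholding-plus-morphology segmentation stage can be sketched as follows, using a hypothetical 5x5 intensity grid in place of an MRI slice:

```python
# Thresholding followed by one erosion pass with a cross-shaped
# (4-neighbour) structuring element, mirroring the segmentation stage
# described above. The 5x5 grid is a hypothetical stand-in for an MRI slice.

def threshold(img, t):
    return [[1 if v >= t else 0 for v in row] for row in img]

def erode(mask):
    """Keep a pixel only if it and its 4-neighbourhood are all set."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (mask[y][x] and mask[y - 1][x] and mask[y + 1][x]
                    and mask[y][x - 1] and mask[y][x + 1]):
                out[y][x] = 1
    return out

img = [
    [10, 12, 11, 13, 10],
    [11, 200, 210, 205, 12],
    [10, 205, 220, 208, 11],
    [12, 202, 207, 203, 13],
    [10, 11, 12, 10, 11],
]
mask = erode(threshold(img, 100))   # only the interior of the bright blob survives
```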

Author 1: Sunita M. Kulkarni
Author 2: G. Sundari

Keywords: Brain MRI; segmentation; CNN; deep learning; transfer learning

PDF

Paper 49: Movie Rating Prediction using Ensemble Learning Algorithms

Abstract: Over the last few decades, social media platforms have gained a great deal of popularity. People of all ages, genders, and areas of life have a presence on at least one social platform. The data generated on these platforms has been, and is being, used for better recommendations, marketing activities, forecasting, and predictions. With regard to predictions, the movie industry worldwide produces a large number of movies per year, and the success of these movies depends on various factors such as budget, director, and actors. It has become a trend to predict the rating of a movie based on data collected from social media related to the movie; this helps the many businesses relying on the movie industry to make promotional and marketing decisions. In this paper, the aim is to collect movie data from IMDb and its social media data from YouTube and Wikipedia, and to compare the performance of two machine learning algorithms, Random Forest and XGBoost, which are best known for their high accuracy on small datasets with large feature sets. The data is collected from multiple sources and APIs.

Author 1: Zahabiya Mhowwala
Author 2: A. Razia Sulthana
Author 3: Sujala D. Shetty

Keywords: Machine learning; ensemble learning; random forest algorithm; XGBoost; movie rating prediction

PDF

Paper 50: A Comparative Study of Microservices-based IoT Platforms

Abstract: The Internet of Things (IoT) is a set of technologies that aim to fit together smart devices and applications to build an IoT ecosystem. The target of such an ecosystem is to enhance interaction between machines and humans through hardware-to-software binding while reducing cost and resource consumption. On the application level, IoT ecosystems have been implemented with various technologies that all seek better interconnection, monitoring, and control of IoT smart devices. Among recent technologies, Microservices, a variant of the service-oriented architecture, have generated great excitement. Microservices are an emerging technology built around a paradigm whose goal is to offer services with small granularity, which exactly meets the distributed nature of IoT devices while maintaining a loosely coupled architecture between IoT components, among other advantages. Efforts to build Microservice-based IoT platforms soon emerged to take advantage of the numerous benefits of the Microservice paradigm for building scalable, interoperable, and dynamic ecosystems. The goal of this paper is to list these approaches, classify them, and compare them using the Weighted Scoring Model (WSM) method. This involves studying the platforms, establishing relevant criteria for comparison, assigning a weight to each criterion, and finally calculating scores. The obtained results reveal the weaknesses and strengths of each of the studied platforms.
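
A Weighted Scoring Model of the kind described reduces to a weighted sum per platform; the criteria, weights, and ratings below are hypothetical placeholders, not the paper's actual comparison data:

```python
# Weighted Scoring Model (WSM): each platform's score is the weighted sum
# of its per-criterion ratings. All names and numbers are hypothetical.

def wsm_score(ratings, weights):
    """ratings/weights: dicts keyed by criterion; weights sum to 1."""
    return sum(weights[c] * ratings[c] for c in weights)

weights = {"scalability": 0.4, "interoperability": 0.35, "monitoring": 0.25}
platforms = {
    "platform_A": {"scalability": 4, "interoperability": 3, "monitoring": 5},
    "platform_B": {"scalability": 5, "interoperability": 4, "monitoring": 3},
}
scores = {name: wsm_score(r, weights) for name, r in platforms.items()}
best = max(scores, key=scores.get)
```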

Author 1: Badr El Khalyly
Author 2: Abdessamad Belangour
Author 3: Mouad Banane
Author 4: Allae Erraissi

Keywords: IoT platforms; microservices; WSM method; IoT architecture; smart devices

PDF

Paper 51: Weight Prediction System for Nile Tilapia using Image Processing and Predictive Analysis

Abstract: Fish farmers are likely to cultivate poor-quality fish to accommodate the rising demand for food due to the ever-increasing population. Fish growth monitoring greatly helps in producing higher-quality fish products, which has a positive impact on the aquatic animal food production industry. However, monitoring through manual weighing and measuring stresses the fish, affecting their health and resulting in poorer quality or even fish kills. This paper presents a low-cost monitoring and Hough-gradient-method-based weight prediction system for Nile Tilapia (Oreochromis niloticus) using a Raspberry Pi and two low-cost USB cameras. This study aims to improve fish growth rate by monitoring the growth of the fish with image processing, eliminating the traditional way of obtaining fish measurements. A paired t-test on the acquired values implies that the algorithm used to measure the weight of the fish is accurate and acceptable for use. The growth performance of 10 Nile Tilapia was measured in two intensive aquaculture setups: one with automated fish weighing through image processing and predictive analysis, and the other with manual weighing. With the weight prediction application, the growth of the fish increased by 47.88%.
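
The paired t-test used to validate the weight algorithm reduces to the following statistic; the difference values below are hypothetical, not the study's measurements:

```python
# Paired t-test statistic: t = mean(d) / (sd(d) / sqrt(n)), where d are
# the per-fish differences (predicted minus manual weight) and sd uses
# the n-1 sample denominator. Differences below are hypothetical grams.
import math

def paired_t(diffs):
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

t = paired_t([1.0, 2.0, 3.0, 2.0, 2.0])
```

The resulting t value is then compared against the t-distribution's critical value for n-1 degrees of freedom to decide whether the two weighing methods differ significantly.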

Author 1: Lean Karlo S. Tolentino
Author 2: Celline P. De Pedro
Author 3: Jatt D. Icamina
Author 4: John Benjamin E. Navarro
Author 5: Luigi James D. Salvacion
Author 6: Gian Carlo D. Sobrevilla
Author 7: Apolo A. Villanueva
Author 8: Timothy M. Amado
Author 9: Maria Victoria C. Padilla
Author 10: Gilfred Allen M. Madrigal

Keywords: Fish; growth; Tilapia; image processing; predictive analysis; weight prediction

PDF

Paper 52: Detection of Plant Disease on Leaves using Blobs Detection and Statistical Analysis

Abstract: Plants are exposed to many attacks from various micro-organisms, bacterial diseases, and pests. The symptoms of these attacks are usually identified through inspection of the leaves, stem, or fruit. Diseases that commonly attack plants include Powdery Mildew and Leaf Blight, and they may cause severe damage if not controlled in the early stages. Image processing has been widely used for identification, detection, grading, and quality inspection in agriculture. Detecting and identifying plant disease is very important, especially for producing high-quality fruit, and the leaves of a plant can be used to determine its health status. The objective of this work is to develop a system capable of detecting and identifying the type of disease based on blob detection and statistical analysis. A total of 45 sample leaf images of different colours and types were used and the accuracy was analysed. The blob detection technique is used to assess the healthiness of plant leaves, while statistical analysis, through the standard deviation and mean values, is used to identify the type of disease. Compared with manual inspection, the system achieves 86% accuracy relative to the manual detection process.
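
The mean/standard-deviation decision rule can be sketched as below; the thresholds, disease labels, and grey levels are hypothetical, not the paper's calibrated values:

```python
# Mean / standard-deviation rule sketch for leaf classification: compute
# the two statistics over pixel intensities, then compare against
# per-disease ranges. All thresholds and values here are hypothetical.
import math

def mean_std(pixels):
    m = sum(pixels) / len(pixels)
    sd = math.sqrt(sum((p - m) ** 2 for p in pixels) / len(pixels))
    return m, sd

def classify(pixels):
    m, sd = mean_std(pixels)
    if sd < 10:                       # uniform surface -> healthy
        return "healthy"
    return "powdery mildew" if m > 150 else "leaf blight"

# a bright, high-variance patch (hypothetical grey levels)
label = classify([200, 120, 210, 90, 220, 100, 205, 230])
```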

Author 1: N. S. A. M Taujuddin
Author 2: A.I.A Mazlan
Author 3: R. Ibrahim
Author 4: S. Sari
Author 5: A. R. A Ghani
Author 6: N. Senan
Author 7: W.H.N.W Muda

Keywords: Image processing; blob detection; edge detection; statistical analysis; disease detection

PDF

Paper 53: IoT based Automatic Damaged Street Light Fault Detection Management System

Abstract: The IoT (Internet of Things) is a blooming technology that mainly concentrates on interconnecting devices and components with one another and with people. Over time, many of these connections are changing from “Human to Device” to “Device to Device”. Automatically finding faulty street lights has become a vital milestone for this technology. The primary goal of this project is to provide automatic control and identification of damaged street lights: a lighting system that targets energy-efficient, automatic operation that is economically affordable for streets, with an immediate information response about street light faults. In general, street light damage is noticed only through complaints from the residents of the street, whereas in the proposed work sensors capture the working status of the lights without any manual interaction, reducing manual effort and the delay in fixing problems. The system automatically detects street light issues at night, i.e. whether a street light is working or not, sends a notification to the authorized person when there is a problem with a particular street light, and reports the location of the damaged light. The street lights are also switched ON/OFF automatically using IoT: the LDR sensor turns the lights ON or OFF based on ambient light conditions.

Author 1: Ashok Kumar Nanduri
Author 2: Siva Kumar Kotamraju
Author 3: G L Sravanthi
Author 4: Sadhu Ratna Babu
Author 5: K V K V L Pavan Kumar

Keywords: IoT (Internet of Things); GSM (Global System for Mobile); LDR (Light Dependent Resistor); LED (Light-Emitting Diode); GPS (Global Positioning System); Raspberry; Twilio

PDF

Paper 54: Lake Data Warehouse Architecture for Big Data Solutions

Abstract: A traditional Data Warehouse is a multidimensional repository of nonvolatile, subject-oriented, integrated, time-variant, non-operational data gathered from multiple heterogeneous data sources. Traditional Data Warehouse architecture must be adapted to deal with the new challenges imposed by the abundance of data and current big data characteristics, including volume, value, variety, validity, volatility, visualization, variability, and venue. The new architecture also needs to address existing drawbacks, including availability, scalability, and, consequently, query performance. This paper introduces a novel Data Warehouse architecture, named Lake Data Warehouse Architecture, that equips the traditional Data Warehouse with the capabilities to overcome these challenges. Lake Data Warehouse Architecture merges traditional Data Warehouse architecture with big data technologies, such as the Hadoop framework and Apache Spark, providing a hybrid solution in a complementary way. The main advantage of the proposed architecture is that it integrates the current features of traditional Data Warehouses with the big data features acquired by integrating the traditional Data Warehouse with the Hadoop and Spark ecosystems. Furthermore, it is tailored to handle a tremendous volume of data while maintaining availability, reliability, and scalability.

Author 1: Emad Saddad
Author 2: Ali El-Bastawissy
Author 3: Hoda M. O. Mokhtar
Author 4: Maryam Hazman

Keywords: Traditional data warehouse; big data; semi-structured data; unstructured data; novel data warehouses architecture; Hadoop; spark

PDF

Paper 55: An Improved Ant Colony Optimization Algorithm: A Technique for Extending Wireless Sensor Networks Lifetime Utilization

Abstract: Wireless sensor networks (WSNs) are one of the most essential technologies of the 21st century due to their growing range of application areas, and they can be deployed where cabling and power supply are difficult to provide. However, the sensor nodes that form these networks are energy-constrained because they are powered by small non-rechargeable batteries. It is therefore imperative to design a routing protocol that is energy-efficient and reliable, to extend network lifetime utilization. In this article, we propose an improved ant colony optimization algorithm, a technique for extending wireless sensor network lifetime utilization, called AMACO. We present a new clustering method that avoids the overhead usually incurred when electing cluster heads in previous approaches, as well as energy holes within the network. Moreover, fog computing is integrated into the scheme due to its ability to optimize the limited power source of WSNs and to scale up to the requirements of Internet of Things applications. All data packets received by the fog nodes are transmitted to the cloud for further analysis and storage. An improved ant colony optimization (ACO) algorithm is used to construct optimal paths between the cluster heads and fog nodes for reliable end-to-end data packet delivery. The simulation results show that the network lifetime in AMACO increased by 22.0%, 30.7%, and 32.0% in comparison with EBAR, IACO-MS, and RRDLA, respectively, before the first node dies (FND); by 15.2%, 18.4%, and 33.5% before half of the nodes die (HND); and by 28.2%, 24.9%, and 58.9% before the last node dies (LND).
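
The core ACO step for building paths between cluster heads and fog nodes can be sketched as follows; the link costs, constants, and node names are hypothetical, and AMACO's exact update rules may differ:

```python
# Core ant-colony step: choose the next hop with probability proportional
# to pheromone^alpha * (1/cost)^beta, then evaporate all pheromone and
# reinforce the path actually used. Constants and costs are hypothetical.
import random

random.seed(1)
ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.1, 1.0

def choose_next_hop(pheromone, cost):
    """pheromone/cost: dicts keyed by candidate next hop (roulette wheel)."""
    weight = {h: pheromone[h] ** ALPHA * (1.0 / cost[h]) ** BETA
              for h in pheromone}
    r, acc = random.random() * sum(weight.values()), 0.0
    for h, w in weight.items():
        acc += w
        if acc >= r:
            return h

def reinforce(pheromone, path_hops, path_cost):
    for h in pheromone:                 # evaporation everywhere
        pheromone[h] *= (1.0 - RHO)
    for h in path_hops:                 # deposit on the used path
        pheromone[h] += Q / path_cost

pheromone = {"fog1": 1.0, "fog2": 1.0}
cost = {"fog1": 2.0, "fog2": 5.0}
hop = choose_next_hop(pheromone, cost)
reinforce(pheromone, [hop], cost[hop])
```

Over repeated ants, cheaper links accumulate pheromone and are chosen increasingly often, which is how the optimal cluster-head-to-fog paths emerge.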

Author 1: Ademola P. Abidoye
Author 2: Elisha O. Ochola
Author 3: Ibidun C. Obagbuwa
Author 4: Desmond W. Govender

Keywords: Sensor nodes; advanced nodes; fog nodes; data centre; cloud computing; ant colony optimization; visual sensor networks

PDF

Paper 56: A Flood Forecasting Model based on Wireless Sensor and Actor Networks

Abstract: Flood forecasting is a challenging area of research that can help save precious lives through timely warnings about possible floods. Advancements in computing and allied technologies have moved the research towards a new horizon: researchers around the world are using Artificial Neural Networks (ANN), Geographic Information Systems (GIS), and Wireless Sensor Network (WSN) based schemes for flash flood forecasting and hydrological risk analysis. ANN- and GIS-based solutions are costly, whereas analysis and prediction using WSNs require much less infrastructure cost, enabling the use of flood prediction mechanisms in developing and poor countries. Variation in storage capacity can be a vital signal for eliminating or reducing the chance of a flood. Based on this observation, we propose a generic flood prediction scheme that can manage the system according to available resources and environmental conditions. A heterogeneous WSN is considered in which powerful Collector Nodes (CN) continuously take readings from member sensor nodes in their region; a CN transmits alerts to the Administration Unit (AU) when threshold values are crossed. Threshold values for all water sources, such as storage capacity, water inflow, and outflow, are saved in a repository for decision making. Moreover, environmental parameters including water level, humidity, temperature, air pressure, rainfall, and soil moisture are considered for the analysis of flood prediction, and we evaluate these parameters against specified boundary values, which existing schemes did not consider. In this research study, we use ArcGIS and NS2 simulation tools to analyze the parameters and predict the probability of the occurrence of a flood.

Author 1: Sheikh Tahir Bakhsh
Author 2: Naveed Ahmed
Author 3: Basit Shahzad
Author 4: Mohammed Basheri

Keywords: Flood forecasting; GIS; remote sensing; hydrology; particle swarm optimization

PDF

Paper 57: Automatic Extraction of Rarely Explored Materials and Methods Sections from Research Journals using Machine Learning Techniques

Abstract: The scientific community expands by leaps and bounds every day owing to pioneering, path-breaking scientific literature published in journals around the globe. Viewing and retrieving this data is a challenging task in today’s fast-paced world. For the expert, the essence and importance of scientific research papers lie in their experimental and theoretical results, along with the research projects sanctioned by organizations. Since scant work has been done in this direction, the alternative is to explore text mining with machine learning techniques. Myriad journals on materials research throw light on a gamut of materials, synthesis methods, and characterization methods used to study material properties. Because materials find application in many diversified areas, we selected papers from the “Journal of Materials Science”, whose “Materials and Methods” sections contain the names of methods, characterization techniques (instrumental methods), algorithms, images, etc. used in the research work. The “Acknowledgment” section conveys information about authors’ proximity and collaborations with organizations, which again has not been explored for the citation network. In the present work, our attempt is to derive a means to automatically extract the methods and terminologies used in characterization techniques, along with author and organization data, from the “Materials and Methods” and “Acknowledgment” sections using machine learning techniques. Another goal of this research is to provide a dataset of characterization terms, a classification, and an extended version of the existing citation network for materials research. The complete dataset will help new researchers select research work and find new domains and techniques for solving advanced scientific research problems.

Author 1: Kavitha Jayaram
Author 2: Prakash G
Author 3: Jayaram V

Keywords: Data-mining; rule-based; machine-learning; term extraction; classification; materials and methods; acknowledgment

PDF

Paper 58: Improvement of Body Movements and Stability of Blind or Visually Impaired Adults by Physical Activity using Kinect V2

Abstract: People who are blind or have low vision need to follow activity routines for their mental and physical health, to minimize the risk of bleeding in the joints, but displacement difficulties and inaccessibility stand in their way. This paper introduces and evaluates a set of exercises to improve bodily movement and stability using body tracking with the Microsoft Kinect V2 and audio feedback. The exercises are composed of sequences of different postures, provide personalized audio feedback that helps users understand each gesture and correct it when it is performed incorrectly, and generate a summary graph to evaluate the success rate of the exercises. To obtain the 3D joint coordinates from the depth sensor, we used SDK V2.0 of the Microsoft Kinect. We use these coordinates to calculate the distances and angles between joints of interest, first to position the user in the field of view of the Kinect sensor, then to evaluate the different postures and movements of the knees, elbows, and shoulders, and finally to detect whether the body is leaning, and in which direction, to avoid falls. These physical exercises were evaluated for feasibility and feedback with persons who are blind or have low vision.

Author 1: Marwa Bouri
Author 2: Ali Khalfallah
Author 3: Med Salim Bouhlel

Keywords: Posture; visually impaired; physical exercise; audio feedback; Kinect; body tracking; balance; falling
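The joint-angle computation described in the abstract can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the joint coordinates are made up, and the paper's exact angle definitions are not given. It computes the angle at a middle joint from three 3D points using the dot-product formula commonly applied to Kinect skeleton data.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c,
    where a, b, c are (x, y, z) joint coordinates from the depth sensor."""
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for float safety
    return math.degrees(math.acos(cos_t))

# Hypothetical shoulder-elbow-wrist coordinates on one line: arm fully extended
print(round(joint_angle((0, 2, 0), (0, 1, 0), (0, 0, 0)), 1))  # 180.0
# Hypothetical hip-knee-ankle coordinates at a right-angle knee bend
print(round(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)), 1))  # 90.0
```

Comparing such angles against per-posture thresholds is one simple way the audio feedback could decide whether a gesture was performed correctly.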

PDF

Paper 59: CAREdio: Health Screening and Heart Disease Prediction System for Rural Communities in the Philippines

Abstract: Cardiovascular diseases account for a large share of the worldwide disease burden, making them the leading cause of death. In the Philippines, despite rapid economic advancement and urbanization, the most vulnerable sector has not benefited from this development. Data from the Philippine Statistics Authority (PSA) in 2016 revealed that six out of ten of the country’s recorded deaths were medically unattended, the largest portion of which came from the rural population. Medical analysis therefore needs to be performed effectively and precisely; however, most developing countries have limited resources and lack medical experts in specialized fields such as cardiology. This work seeks to address these issues in the Philippine health sector, specifically for rural and remote populations, by implementing an efficient, low-cost health screening and disease prediction system that uses commercially available medical devices and machine learning algorithms to predict three of the most common heart-related diseases (hypertension, heart attack, and diabetes). The system is composed of the CAREdio mobile app, prototype hardware consisting of different health sensors and devices, and a machine learning model that estimates the user’s individual probability of having a specific disease. The machine learning models were trained using data gathered from the Rosario Reyes Health Center and Ospital ng Sampaloc (Sampaloc Hospital), both located in Manila City, Philippines. CAREdio achieves accuracy values over 0.80 for all diseases. The system can screen for multiple cardiovascular conditions in a single app, which will benefit people in rural communities.

Author 1: Lean Karlo S. Tolentino
Author 2: John Erick L. Isoy
Author 3: Kayne Adriane A. Bulawan
Author 4: Mary Claire T. Co
Author 5: Caryl Faye C. Monreal
Author 6: Ian Joshua W. Vitto
Author 7: Maria Victoria C. Padilla
Author 8: Jay Fel C. Quijano
Author 9: Romeo Jr. L. Jorda
Author 10: Jessica S. Velasco

Keywords: Cardiovascular diseases; health screening; disease prediction; mobile application; machine learning; rural population

PDF

Paper 60: Lean IT Transformation Plan for Information Systems Development

Abstract: Information systems development (ISD) is prone to failure, a time-consuming and costly phenomenon that yields value with little direct appeal to clients. While ISD can be enhanced using various tools, models, and frameworks, ISD failures remain evident and costly. These failures are related to human, organizational, and technological factors, as well as waste in ISD. This study identifies the information system (IS) success criteria and the factors that contribute to ISD waste. A qualitative case study of an ICT research unit was conducted using interview, observation, and document analysis techniques to analyze IS success criteria, leanness level, and waste. Findings show that lean IT approaches and IS success criteria can be combined to develop a holistic transformation plan for organizational ISD. This transformation plan can potentially assist IS developers in delivering high-value IS while driving organizational growth towards the fourth industrial revolution.

Author 1: Muhammad K. A. Kiram
Author 2: Maryati Mohd Yusof

Keywords: Failure; lean IT; information systems; information systems development; socio-technical; waste

PDF

Paper 61: Automatic Hate Speech Detection using Machine Learning: A Comparative Study

Abstract: The increasing use of social media and information sharing has brought major benefits to humanity. However, it has also given rise to a variety of challenges, including the spreading of hate speech messages. To address this emerging issue on social media sites, recent studies have employed a variety of feature engineering techniques and machine learning algorithms to automatically detect hate speech messages on different datasets. However, to the best of our knowledge, no study has compared feature engineering techniques and machine learning algorithms to establish which perform best on a standard, publicly available dataset. Hence, the aim of this paper is to compare the performance of three feature engineering techniques and eight machine learning algorithms on a publicly available dataset with three distinct classes. The experimental results showed that bigram features used with the support vector machine algorithm performed best, with 79% overall accuracy. Our study has practical implications and can serve as a baseline in the area of automatic hate speech detection. Moreover, the reported comparisons can serve as a state-of-the-art reference against which future research on automated text classification techniques can be compared.

Author 1: Sindhu Abro
Author 2: Sarang Shaikh
Author 3: Zahid Hussain Khand
Author 4: Zafar Ali
Author 5: Sajid Khan
Author 6: Ghulam Mujtaba

Keywords: Hate speech; online social networks; natural language processing; text classification; machine learning
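The best-performing combination reported above (bigram features with a support vector machine) can be illustrated with a minimal scikit-learn sketch. The tiny corpus below is invented for illustration and is not the paper's dataset; the hyperparameters are library defaults, not the authors' settings.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented examples for three distinct classes:
# 0 = neutral, 1 = offensive, 2 = hate speech
texts = ["have a nice day", "what a lovely game",
         "you are so stupid", "shut up you idiot",
         "go back to your country", "your kind should disappear"]
labels = [0, 0, 1, 1, 2, 2]

# Bigram features (ngram_range=(2, 2)) feeding a linear SVM
model = make_pipeline(CountVectorizer(ngram_range=(2, 2)), LinearSVC())
model.fit(texts, labels)
print(model.predict(["have a lovely day"]))
```

On a real dataset, the same pipeline would be evaluated with held-out data rather than training accuracy, and the bigram vectorizer would be compared against the other feature engineering techniques.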

PDF

Paper 62: Study on Dominant Factor for Academic Performance Prediction using Feature Selection Methods

Abstract: Educational institutions seek to understand the learning behaviors of students and to predict student outcomes early, in order to intervene and improve learning performance. Educational data mining (EDM) offers various effective models for predicting student performance. Within EDM, feature selection (FS) is used to determine the dominant factors that are necessary and sufficient for the target concept. FS extracts high-quality data, reducing the complexity of the prediction task and increasing the robustness of the decision rules. In this paper, we provide a comparative study of feature selection methods for determining the dominant factors that most strongly affect classification performance and improve prediction models. A new feature selection method, CHIMI, based on ranked vector scores, is proposed. The feature sets produced by each FS method are analyzed to obtain the dominant set. The experimental results show that using the dominant set from the proposed CHIMI method significantly improves the classification performance of the proposed models.

Author 1: Phauk Sokkhey
Author 2: Takeo Okazaki

Keywords: Educational data mining; dominant factors; feature selection methods; prediction models; student performance
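The abstract does not spell out the CHIMI formula; one plausible reading of the name, sketched below on synthetic data, scores features by chi-squared and mutual information and combines the two rankings to pick the "dominant" set. The data and the rank-combination rule are assumptions for illustration, not the paper's method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import chi2, mutual_info_classif

# Synthetic student-performance-like data: 10 features, 3 of them informative
X, y = make_classification(n_samples=200, n_features=10, n_informative=3,
                           n_redundant=0, random_state=0)
X = X - X.min(axis=0)            # chi2 requires non-negative features

chi_scores, _ = chi2(X, y)
mi_scores = mutual_info_classif(X, y, random_state=0)

# Rank each feature under both criteria (rank 0 = best), then add the ranks
chi_rank = np.argsort(np.argsort(-chi_scores))
mi_rank = np.argsort(np.argsort(-mi_scores))
combined = chi_rank + mi_rank

dominant = np.argsort(combined)[:3]   # the 3 "dominant factors"
print(sorted(dominant.tolist()))
```

The dominant set would then be fed to the downstream prediction models in place of the full feature vector.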

PDF

Paper 63: A Novel Framework for Mobile Telecom Network Analysis using Big Data Platform

Abstract: Social network analysis measures the interconnections between humans, entities, or communities and the flow of messages between them. This kind of analysis studies the relationships between people in depth; it shows how one node (subscriber) in the network can affect the others. This research studies the connections between customers in several ways to help a telecom operator increase cross- and up-selling of its products and services: detecting communities of subscribers (groups of nodes gathered together to form a community); identifying connection types and labeling the links between customers as business, friends, family, or others; identifying the top influencers in the network, who can spread positive or negative messages about the company’s products and services through communities; and determining off-net customers who can be targeted by specific marketing campaigns for acquisition. A real cell phone dataset of 116 million call detail records of SMS and voice calls from an Egyptian Communication Service Provider (CSP) is used.

Author 1: M. M. Abo Khedra
Author 2: A. A. Abd EL-Aziz
Author 3: Hedi HAMDI
Author 4: Hesham A. Hefny

Keywords: SNA; influencer; acquisition; community detection; link prediction; call detailed record; on-net node; off-net node

PDF

Paper 64: Application of Kinect Technology and Artificial Neural Networks in the Control of Rehabilitation Therapies in People with Knee Injuries

Abstract: In the field of physiotherapy, recognition of human body poses is attracting increasing research interest so that patients can recover faster during rehabilitation. Nowadays, devices like the Microsoft Kinect make it practical to interact with users for the recognition of poses and body gestures. The objective of this work was to capture data on the joints of a person’s body as a set of angles using the Kinect device; artificial neural networks trained with the back-propagation algorithm were then used for machine learning, and their precision was determined. The results on the performance of the neural network show that 99.70% accuracy was achieved in classifying the patients’ postures, so the approach can be used as an alternative in rehabilitation therapies for patients with knee injuries.

Author 1: Bisset Gonzales Loayza
Author 2: Alberto Calla Bendita
Author 3: Mario Huaypuna Cjuno
Author 4: Jose Sulla-Torres

Keywords: Machine learning; artificial neural network; kinect; physiotherapy; rehabilitation
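A minimal sketch of the pipeline, under heavy assumptions: hypothetical joint-angle features for two knee postures are generated synthetically, and a small feed-forward network is trained by back-propagation (here via scikit-learn's MLPClassifier with the sgd solver). The paper's real features, network architecture, and data all differ.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical angle features (degrees) for two posture classes:
# 0 = leg extended (knee angle ~170-180), 1 = knee flexed (~80-100)
extended = rng.uniform(170, 180, size=(50, 3))
flexed = rng.uniform(80, 100, size=(50, 3))
X = np.vstack([extended, flexed]) / 180.0   # normalize angles to [0, 1]
y = np.array([0] * 50 + [1] * 50)

# Small feed-forward network trained with back-propagation (SGD updates)
clf = MLPClassifier(hidden_layer_sizes=(8,), solver="sgd",
                    learning_rate_init=0.1, max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

In practice, accuracy would be reported on a held-out test split rather than the training set used in this toy sketch.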

PDF

Paper 65: Detecting Violent Radical Accounts on Twitter

Abstract: In the past few years, as a result of the enormous worldwide spread of social media platforms, many radical groups have tried to invade social media cyberspace in order to disseminate their ideologies and destructive plans. This brutal invasion of daily life must be resisted, as social media networks are used on a daily basis. Some violent radical groups, such as ISIS, have developed well-designed propaganda strategies that enable them to recruit members and supporters all over the world through social media. It is therefore crucial to find an efficient way to detect violent radical accounts on social media networks. In this paper, an intelligent system that autonomously detects the ISIS online community on the Twitter social media platform is proposed. The proposed system analyzes both linguistic features and behavioral features such as hashtags, mentions, and who the accounts follow. The system consists of two main subsystems: crawling and inquiring. The crawling subsystem uses initially known ISIS-related accounts to build an ISIS-account detector. The inquiring subsystem aims to detect pro-ISIS accounts.

Author 1: Ahmed I. A. Abd-Elaal
Author 2: Ahmed Z. Badr
Author 3: Hani M. K. Mahdi

Keywords: Machine learning; ISIS; Daesh; extremism; data mining; social media; Twitter

PDF

Paper 66: Intelligent Tutoring Supported Collaborative Learning (ITSCL): A Hybrid Framework

Abstract: Recently, Intelligent Tutoring Systems (ITS) and Computer-Supported Collaborative Learning (CSCL) have received much attention in the fields of computer science, artificial intelligence, cognitive psychology, and educational technology. An ITS is a technologically intelligent system that provides an adaptive learning paradigm for an individual learner, while CSCL is a technology-driven learning paradigm that supports groups of learners in acquiring knowledge through collaboration. In the multidisciplinary field of the learning sciences, both individual and collaborative learning have their own significance. This research aims to extend ITS toward a collaborative, constructivist view of learning using CSCL. Integrating the design architectures of both CSCL and ITS, we propose a new conceptual framework, Intelligent Tutoring Supported Collaborative Learning (ITSCL). ITSCL extends ITS by supporting multiple learners interacting with the system, at three levels of interaction. The first level supports individual learning through learner-tutor interaction. The second and third levels support collaborative learning through learner-learner interaction and tutor-group interaction, respectively. To evaluate ITSCL, a prototype was implemented and a few experiments were conducted. The statistical results, measured with paired t-tests and frequency analysis, show a significant learning gain and improved learning performance.

Author 1: Ijaz Ul Haq
Author 2: Aamir Anwar
Author 3: Iqra Basharat
Author 4: Kashif Sultan

Keywords: Intelligent Tutoring System (ITS); Computer-Supported Collaborative Learning (CSCL); Artificial Intelligence (AI); individual learning; collaborative learning

PDF

Paper 67: Secure Access Control Model for Cloud Computing Environment with Fuzzy Max Interval Trust Values

Abstract: Cloud computing needs service providers with reliable communication in order to increase user trust. As the viability of a cloud depends on the quality of its services, this trust value needs to be evaluated by the cloud. Many web services in e-commerce, social networks, and digital platforms maintain user confidence by estimating the reliability of the service provider. This paper presents a model that can identify genuine nodes in the cloud by their behavior. Fuzzy max interval values are evaluated from the transactional behavior of each node over fixed intervals: as the transaction count grows, the trust value of genuine nodes increases and the trust value of malicious nodes decreases. The work is based on Role-Based Access Control (RBAC), with three roles (admin, data owner, node). The security of the data owner’s content is achieved with the AES algorithm, and only trusted nodes can access those resources. Experiments were performed through simulations in both an ideal environment and an environment under attack. Analysis of the evaluation parameters shows that the proposed fuzzy max interval trust model identifies malicious nodes better than the existing Domain Partition Trust Model (DPTM).

Author 1: Aakib Jawed Khan
Author 2: Shabana Mehfuz

Keywords: Cloud computing; encryption; fuzzy logic; trust computing; role based access control; resource management

PDF

Paper 68: Improving Palmprint based Biometric System Performance using Novel Multispectral Image Fusion Scheme

Abstract: Nowadays, there are several identification systems based on different biometric modalities. In particular, multispectral images of palmprints captured in different spectral bands provide a very distinctive biometric identifier. This paper proposes a novel fusion scheme for a biometric recognition system based on multispectral palmprints. The system is composed of three blocks: (1) extraction of the region of interest (ROI) from the multispectral images, (2) a new image fusion architecture based on a decorrelation measure, and (3) a dimension reduction and classification scheme. The proposed image fusion system combines the information from the left and right images of the same spectral band using the 2D discrete wavelet transform (DWT). In addition, feature extraction is performed using the Log-Gabor transform, and the feature size is reduced using Kernel Principal Component Analysis (KPCA). In our experiments we use the CASIA multispectral palmprint database. We obtained an accuracy rate (ACC) of 99.50% for the WHT (white light) and 940 nm spectral bands, and an equal error rate (EER) of 0.05%. These results show that our system is robust against spoofing.

Author 1: Essia Thamri
Author 2: Kamel Aloui
Author 3: Mohamed Saber Naceur

Keywords: Biometric recognition; palmprint; multispectral images; image fusion; Log-Gabor; KPCA; DWT; CASIA
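A wavelet-based fusion step of the kind described can be sketched with a hand-rolled one-level 2D Haar DWT (the simplest wavelet; the paper does not state which wavelet family it uses). Here approximation coefficients are averaged and detail coefficients fused by maximum magnitude, a common heuristic used as an assumption in place of the authors' decorrelation-based rule.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar DWT: returns (LL, LH, HL, HH) sub-bands."""
    a = (img[0::2] + img[1::2]) / 2.0      # row pairs: average
    d = (img[0::2] - img[1::2]) / 2.0      # row pairs: difference
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    h, w = LL.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2] = LL + LH; a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH; d[:, 1::2] = HL - HH
    out = np.empty((2 * h, 2 * w))
    out[0::2] = a + d; out[1::2] = a - d
    return out

def fuse(img1, img2):
    """Average the approximation band; keep max-magnitude detail coefficients."""
    b1, b2 = haar2d(img1), haar2d(img2)
    fused = [(b1[0] + b2[0]) / 2.0]
    for c1, c2 in zip(b1[1:], b2[1:]):
        fused.append(np.where(np.abs(c1) >= np.abs(c2), c1, c2))
    return ihaar2d(*fused)
```

Fusing two aligned ROI images of the same palm captured in different bands would then produce a single image combining the salient structure of both.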

PDF

Paper 69: A Review on Research Challenges, Limitations and Practical Solutions for Underwater Wireless Power Transfer

Abstract: Wireless power transfer (WPT) is the process of transmitting electrical energy without wires or any physical link. WPT is mainly used where it is difficult to transfer energy using conventional wires. In this work, we investigate the need for, and feasibility of, wireless power transfer for underwater applications. The paper outlines research challenges, limitations, and practical considerations for underwater wireless power transfer (UWPT). Recent research has focused on WPT in air; WPT in the underwater environment remains a challenging task. We present a review of previous research on UWPT and provide a comparison of the different techniques implemented for wireless power transfer. The paper also proposes the idea of MIMO wireless power transmission for charging autonomous underwater vehicles (AUVs). We elaborate on the capabilities and limitations of WPT systems in underwater media, since the stochastic nature of the ocean is a major challenge for wireless power transmission, and we address design challenges and the effects of seawater on UWPT systems.

Author 1: Syed Agha Hassnain Mohsan
Author 2: Asad Islam
Author 3: Mushtaq Ali Khan
Author 4: Arfan Mahmood
Author 5: Laraba Selsabil Rokia
Author 6: Alireza Mazinani
Author 7: Hussain Amjad

Keywords: Underwater wireless power transfer; charging; MIMO; AUV

PDF

Paper 70: An Intermediate Representation-based Approach for Query Translation using a Syntax-Directed Method

Abstract: In this research, we aspire to make a single query sufficient to extract data regardless of the underlying data model. In this way, users can freely use any query language they have mastered to interrogate a heterogeneous database, not necessarily the query language associated with the model, thus overcoming the need to deal with multiple query languages, which is usually unwelcome for non-expert users and even for experts. To do so, we propose a new translation approach that relies on an intermediate query language to convert the user query into a suitable target language according to the nature of the data interrogated. This is more beneficial than repeating the whole process for each new query submission, and it makes the system modular, divided into multiple, more flexible, and less complicated components. It therefore becomes possible to perform independent transformations and to switch between several query languages efficiently. With our system, querying each data model with the corresponding query language is no longer bothersome. As a start, we cover the eXtensible Markup Language (XML) and relational data models, whether native or hybrid. Users can retrieve data from sources over these models using just one query, expressed in either the XML Path Language (XPath) or the Structured Query Language (SQL).

Author 1: Hassana NASSIRI
Author 2: Mustapha MACHKOUR
Author 3: Mohamed HACHIMI

Keywords: Data Model; Relational Database; eXtensible Markup Language (XML); translation; model integration; intermediate representation; ANTLR (ANother Tool for Language Recognition)

PDF

Paper 71: An Adaptive Quality Switch-aware Framework for Optimal Bitrate Video Streaming Delivery

Abstract: Given the large number of online video viewers, video streaming over various networks is an important communication technology. The multitude of viewers makes it challenging for service providers to deliver a good viewing experience to subscribers. Video streaming capabilities are designed around concepts including quality, viewing flexibility, changing network conditions, and the specifications of different customer devices. Adjusting the quality levels and controlling the relevant parameters so that video content streams with good quality and without interruption is vital. This paper proposes an adaptive framework to balance average video bitrates against appropriate quality switches, making transitions to higher levels more seamless. The quality adaptation scheme increases the bitrate to the maximum value of the current quality level before shifting to a higher one. This reduces the number of switches between levels, guarantees viewing stability, and avoids interruptions. A dynamic system ensures optimal performance by controlling system parameters and making the algorithm more tunable. We built the system using an open-source DASH library (libdash) with the QuickTime player and studied the effect of video load on two performance parameters, CPU and memory usage, which have a high impact on multimedia quality. Consequently, the values of the parameters that affect video streaming performance could be decreased, permitting users to tune the parameters according to their preferences. Furthermore, reducing the number of level switches reduces the overhead incurred when transferring from one level to another.

Author 1: Wafa A. Alqhtani
Author 2: Ashraf A. Taha
Author 3: Maazen S. Alsabaan

Keywords: Adaptive video streaming; average bit rate; mobile devices; modeling; quality of experience; quality switches; wireless networks

PDF

Paper 72: A New Clustering Algorithm for Live Road Surveillance on Highways based on DBSCAN and Fuzzy Logic

Abstract: Video streaming over Vehicular Ad Hoc Networks (VANETs) is a promising technique that has gained great importance in the last few years. The highly dynamic topology of VANETs makes high-quality video streaming very challenging. In order to provide the most useful camera views to vehicles, clustering and cluster-head selection techniques are used. Because too-frequent camera view changes can be annoying, we propose a new stable clustering algorithm that ensures an uninterrupted live road surveillance service for vehicles that lack sufficient vision area. In the proposed solution, we integrate Density-Based Spatial Clustering of Applications with Noise (DBSCAN) with Fuzzy Logic Control (FLC): DBSCAN forms the clusters, while FLC finds the best cluster head for each cluster. Several parameters are utilized: density parameters for DBSCAN, and relative speed, acceleration, leadership degree, and vision area for the fuzzy logic. Compared with another clustering scheme, our proposed algorithm showed better results in terms of cluster lifetime and vehicle status changes.

Author 1: Hasanain Alabbas
Author 2: Árpád Huszák

Keywords: Vehicular ad hoc networks (VANETs); V2V; intelligent transportation systems; clustering algorithms; road surveillance; DBSCAN algorithm; fuzzy logic control
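For reference, DBSCAN's core idea (density-reachable points form clusters; sparse points are noise) can be shown with a minimal pure-Python implementation over 1-D vehicle positions. The positions and parameters are invented, and the paper's fuzzy-logic cluster-head selection is not reproduced here.

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN over 1-D positions (e.g. distance along a highway).
    Returns one label per point: a cluster id >= 0, or -1 for noise."""
    labels = [None] * len(points)
    cid = 0

    def neighbors(i):
        return [j for j in range(len(points)) if abs(points[j] - points[i]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:          # not dense enough: noise (for now)
            labels[i] = -1
            continue
        labels[i] = cid                  # i is a core point: start a cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:          # noise reachable from a core point
                labels[j] = cid          # becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbors(j)
            if len(jn) >= min_pts:       # j is also a core point: expand
                seeds.extend(jn)
        cid += 1
    return labels

# Two platoons of vehicles and one isolated car far from both
positions = [0.0, 1.0, 2.0, 50.0, 51.0, 52.0, 200.0]
print(dbscan(positions, eps=5.0, min_pts=2))  # [0, 0, 0, 1, 1, 1, -1]
```

In the proposed scheme, the fuzzy controller would then pick a head for each cluster from inputs such as relative speed, acceleration, leadership degree, and vision area.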

PDF

Paper 73: Towards an Integrated Model of Data Governance and Integration for the Implementation of Digital Transformation Processes in the Saudi Universities

Abstract: In the face of the challenges of the digital age and the outbreak of the COVID-19 pandemic, which have changed higher education institutions remarkably, universities are urgently required to speed up their digitalization initiatives and keep pace with global digital developments in order to survive. For digital transformation to succeed, however, data governance must be considered. Despite the extensive literature on data governance and digital transformation, there has been no focus on the issue in Saudi higher education institutions. The current study therefore investigates data governance policies and practices in Saudi universities. The study uses a case study design: nine universities in Saudi Arabia were selected, including the public, community, and private universities that make up the country's higher education system. Three data collection methods were used: a survey, focus group discussions, and in-depth interviews. The findings indicate that data governance is an effective tool for implementing digital transformation processes in higher education institutions and should therefore be embedded into universities' strategies for using digital technologies appropriately. Good data governance practices are required for smooth and effective digital transformation. Universities need to create an effective functional team for data governance tasks, develop an internal audit of data governance, follow up on regulatory compliance procedures, define the priorities of data governance activities, provide frequent data governance training for employees and faculty members, set enforcement and follow-up standards, and conduct frequent assessments of data governance plans and policies.
Although the study is limited to Saudi universities, its results and implications can be a useful reference for choosing effective data governance practices that universities can adopt and implement to manage critical information and use it to transform their day-to-day operations.

Author 1: Abdulfattah Omar
Author 2: Ahmed almaghthawi

Keywords: COVID-19; data governance; digital transformation; higher education; Saudi Arabia

PDF

Paper 74: Prudently Secure Information Theoretic LSB Steganography for Digital Grayscale Images

Abstract: The danger of online data breaches calls for exploring new, and enhancing existing, covert means of clandestine communication, tailored to present and future technological and environmental needs, to which malicious intruders would have no answer. Cryptography and steganography are the two distinct techniques that have long remained the priority choices for hiding vital information from unauthorized parties. But the visibility of encrypted content makes it vulnerable to attack, and the recent legislative protection granted to law enforcement authorities in Australia to access pre-shared cryptographic secret keys (PSKs) could have a devastating impact on people's privacy. Hence, the need of the hour is to veil encrypted data underneath the cover of steganography, whose sole intent is to hide the very existence of information. This research enhances one of the most famous image steganography techniques, Least Significant Bit (LSB) steganography, from a security and information-theoretic standpoint, considering a known-cover and known-message attack scenario. The central claim of this work is that the security of LSB steganography lies in inducing uncertainty during the bit-embedding process. The test results of the proposed methodology confirm the non-detectability and imperceptibility of the hidden information, along with its strong resistance against LSB steganalysis techniques.

Author 1: Khan Farhan Rafat

Keywords: Clandestine communication; covert channel; hiding data in plain sight; inveil communication; LSB steganography
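Classical LSB embedding, plus one way to "induce uncertainty" in the embedding process as the abstract argues, can be sketched as below: the embedding positions are chosen by a keyed pseudo-random permutation, so an observer without the key cannot tell which pixels carry message bits. This is a generic sketch, not the paper's exact scheme; the pixel values, message, and key are illustrative.

```python
import random

def embed(cover, bits, key):
    """Hide message bits in the LSBs of pixels at key-dependent positions.
    The secret permutation of embedding sites is the source of uncertainty."""
    positions = random.Random(key).sample(range(len(cover)), len(bits))
    stego = list(cover)
    for pos, bit in zip(positions, bits):
        stego[pos] = (stego[pos] & 0xFE) | bit   # overwrite only the LSB
    return stego

def extract(stego, n_bits, key):
    """Recover the bits by regenerating the same keyed position sequence."""
    positions = random.Random(key).sample(range(len(stego)), n_bits)
    return [stego[pos] & 1 for pos in positions]

cover = [120, 121, 119, 200, 54, 37, 90, 88, 61, 142]   # 8-bit grey pixels
secret = [1, 0, 1, 1, 0, 1]
stego = embed(cover, secret, key="shared-secret")
assert extract(stego, len(secret), key="shared-secret") == secret
# Each pixel changes by at most one grey level: visually imperceptible
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

A production scheme would derive the permutation from a cryptographically secure keyed PRNG rather than Python's `random`, which is shown here only for brevity.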

PDF

Paper 75: Towards Securing Cloud Computing from DDOS Attacks

Abstract: Cloud computing (CC) is an advanced technology that provides data sharing and access to computing resources. The cloud deployment model represents the exact type of cloud environment in terms of ownership, size, and access rights, and also describes the purpose and nature of the cloud. Since most processes today are computerized, consumers need large amounts of data and cache. Cloud security is enforced at many levels, but the scope of intrusions makes it necessary to understand the factors that affect it, especially since cloud users rely on third parties to handle important security issues in third-party computing clouds. A DDoS attack is one in which many distributed sources flood a server with requests, making it impossible for legitimate users to access it. In this work, a DDoS attack was launched, and a tool for launching such attacks is discussed. The DDoS attacks were then rejected using three different SNORT rules. The predefined SNORT rules detect and prevent DDoS attacks, but because they also block certain legitimate requests and generate false alarms, this should be the subject of future research.

Author 1: Ishtiaq Ahmed
Author 2: Sheeraz Ahmed
Author 3: Asif Nawaz
Author 4: Sadeeq Jan
Author 5: Zeeshan Najam
Author 6: Muneeb Saadat
Author 7: Rehan Ali Khan
Author 8: Khalid Zaman

Keywords: Cloud computing; denial of service; SNORT rules; network; energy consumption
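For illustration, a SNORT rule of the kind that flags a SYN flood might look like the following; the threshold values, port, and SID are hypothetical, and the paper's three rules are not reproduced here.

```
# Hypothetical rule: alert when one source sends more than 100 TCP SYN
# packets to a web server in the home network within 10 seconds
alert tcp any any -> $HOME_NET 80 (msg:"Possible DDoS SYN flood"; \
    flags:S; detection_filter:track by_src, count 100, seconds 10; \
    sid:1000001; rev:1;)
```

As the abstract notes, such threshold-based rules inevitably trade off detection coverage against false alarms on bursty but legitimate traffic.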

PDF

Paper 76: An Innovative Approach of Verification Mechanism for both Electronic and Printed Documents

Abstract: Documents are inevitably relevant in our day-to-day life. Forgery of a document can have severe repercussions, including financial loss, misjudgment, and damage to reputation and goodwill. Hence, documents need to be secured from threats such as counterfeiting, falsification, and tampering, and there should be an easy way to verify the originality of a document. Several existing methods ensure authenticity and integrity with modern technologies such as blockchain and digital signatures. However, most of these methods are not immediately suitable for public use due to their intricacy, excessive cost, and implementation problems, so an easy means of verification is still not available to the general public. In this situation, this paper proposes a document verification method that intends to provide (i) authenticity, (ii) integrity, (iii) availability, and (iv) non-repudiation. The proposed method will serve the general public, as it has no licensing fee and is easily implementable and effortlessly usable for both electronic and printed documents. It is worth mentioning that the proposed method can confirm the originality of a document in no time using only a smartphone.

Author 1: Md. Majharul Haque
Author 2: Md. Nasim Adnan
Author 3: Mohammod Akbar Kabir
Author 4: Mohammad Rifat Ahmmad Rashid
Author 5: Abu Sadat Mohammad Yasin
Author 6: Muhammad Shakil Pervez

Keywords: Document verification; integrity; non-repudiation; blockchain; printed documents

PDF

Paper 77: Feature Expansion using Lexical Ontology for Opinion Type Detection in Tourism Reviews Domain

Abstract: Tourism review platforms such as TripAdvisor have become a major source for tourists to share their experiences and gather ideas for decision making. Since millions of reviews are generated daily on travel websites, tourists are often overwhelmed by the volume of information. This is where opinion type detection is important, as it makes it easy for a tourist to obtain useful reviews for their understanding and planning based on each review's opinion type. The opinion types of travel texts mostly involve different aspects of the travel process, such as transportation, accommodation, price, food, entertainment, and so on. The challenge of this research is to improve this detection by proposing a lexical ontology approach to address the issue of out-of-vocabulary (OOV) keywords during supervised detection of opinion type. In addition, training data for detection often has poor coverage or is limited to a certain domain. In this paper, we propose a review opinion type detection approach that integrates a word (feature) expansion approach into machine learning. The suggested approach consists of two stages: feature expansion and classification. For feature expansion, a Lexical Ontology (LO) is used to expand each feature with domain-related words such as synonyms. For classification, the expanded features are incorporated into a machine learning approach to detect the opinion type.

Author 1: Lim Jie Chen
Author 2: Gan Keng Hoon

Keywords: Tourism domain; online review; opinion type detection; text classification; lexical ontology

PDF

Paper 78: A Novel Approach for Computer Assisted Sleep Scoring Mechanism using ANN

Abstract: Sleep analysis and the categorization of sleep stages in a sleep scoring system are considered helpful in the areas of sleep research and sleep medicine. This study employs a novel approach to a computer-assisted automated sleep scoring system using physiological signals and an artificial neural network. The data were recorded for seven hours in 30-second epochs for each subject. The data procured from the physiological signals were cleaned and prepared to remove degraded segments in order to extract the essential features used in the study. The human body generates its own electrical signals, known as artifacts, which need to be filtered out. In this study, signal filtering is achieved using a Butterworth low-pass filter. The extracted features were trained and classified using an artificial neural network classifier. Although this is a highly complicated concept, applying it in the biomedical field to electrical signals obtained from the body is novel. The estimated accuracy of the system was found to be good, and thus the procedure can be very helpful in clinics, particularly for neurologists diagnosing sleep disorders.

Author 1: Hemu Farooq
Author 2: Anuj Jain
Author 3: V.K. Sharma
Author 4: Iflah Aijaz
Author 5: Sheikh Mohammad Idrees

Keywords: Sleep scoring stages; EEG; EMG; EOG; artificial neural network

PDF

Paper 79: Cloud of Things (CoT) based Parking Prediction

Abstract: Cloud computing, in amalgamation with the Internet of Things (IoT), has given rise to a field called the Cloud of Things (CoT). CoT offers revolutionary services in every domain and has quickly become a rising star in smart transportation, because providing well-organized facilities is a challenge in cities whose populations are expanding exponentially. Poor transport management causes public distress, and parking has become one of the critical issues faced by the public daily. In this paper, we present a parking availability prediction model, implemented within a geo-fence ranging from 100 to 150 meters, based on the cloud, IoT, and GIS. In contrast to existing models, instead of merely reporting that no space is available or that parking is full, our model accurately determines the ETA of a vehicle and estimates the chance of parking availability. It also calculates the time to the next parking space if the current parking availability is zero. Moreover, our model includes exogenous factors, such as weather and time zone conditions, to improve prediction accuracy.
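The abstract's 100-150 m geo-fence check can be sketched with the haversine formula; this minimal example (the coordinates and radius are hypothetical) tests whether a vehicle lies inside a 150 m fence around a parking lot:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(vehicle, parking, radius_m=150):
    """True when the vehicle is within the parking lot's geo-fence."""
    return haversine_m(*vehicle, *parking) <= radius_m

# About 111 m apart (0.001 degrees of latitude is roughly 111 m)
print(inside_geofence((31.5204, 74.3587), (31.5214, 74.3587)))  # True
```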

Author 1: Nazish Razzaq
Author 2: Muhammad Asaad Subih
Author 3: Madiha Khatoon
Author 4: Amir Razi
Author 5: Babur Hayat Malik
Author 6: Nimra Ashraf
Author 7: Tehseen Kausar
Author 8: Rashida Tarrar
Author 9: Muhammad Usman Sabir
Author 10: Syed Izaz ul Hassan Bukhari

Keywords: Cloud computing; internet of things (IoT); parking; prediction; availability; Estimated Time of Arrival (ETA); Geofence; Geographic Information System (GIS)

PDF

Paper 80: xMatcher: Matching Extensible Markup Language Schemas using Semantic-based Techniques

Abstract: Schema matching is a critical step in data integration systems. Most recent schema matching systems require a manual double-check of the matching results to add missed matches and remove incorrect ones. Manual correction is labor-intensive and time-consuming; however, without it the accuracy of the results is significantly lower. In this paper, we present xMatcher, an approach to automatically match XML schemas. Given two schemas S1 and S2, xMatcher identifies semantically similar schema elements between S1 and S2. To obtain correct matches, xMatcher first transforms S1 and S2 into sets of words; then, it uses a context-based measure to identify the meanings of words in their contexts; next, it captures semantic relatedness between sets of words in different schemas; finally, it uses WordNet information to calculate similarity values between semantically related sets and matches the pairs of sets whose similarity values are greater than or equal to 0.8. The results show that xMatcher provides superior matching accuracy compared to state-of-the-art matching systems. Overall, our proposal can be a stepping stone toward decreasing human assistance and overcoming the weaknesses of current matching initiatives in terms of matching accuracy.
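The final matching step above (keep pairs whose similarity is at least 0.8) can be sketched as follows; a toy Jaccard measure stands in for xMatcher's WordNet-based similarity, and the schema element names are hypothetical:

```python
def jaccard(a, b):
    """Toy word-set similarity (a stand-in for the WordNet-based
    measure xMatcher actually uses)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def match(schema1, schema2, threshold=0.8):
    """Return element pairs whose word-set similarity >= threshold."""
    return [(e1, e2)
            for e1, words1 in schema1.items()
            for e2, words2 in schema2.items()
            if jaccard(words1, words2) >= threshold]

# Hypothetical schemas: element name -> set of words describing it
s1 = {"authorName": ["author", "name"], "bookTitle": ["book", "title"]}
s2 = {"writer":     ["author", "name"], "price":     ["price"]}
print(match(s1, s2))  # [('authorName', 'writer')]
```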

Author 1: Aola Yousfi
Author 2: Moulay Hafid El Yazidi
Author 3: Ahmed Zellou

Keywords: Schema matching; matching accuracy; semantic similarity; semantic relatedness; WordNet

PDF

Paper 81: Performance Analysis of a Graph-Theoretic Load Balancing Method for Data Centers

Abstract: Modern data centers can process a massive amount of data in a short time with minimal errors. Data center networks (DCNs) use equal-cost, multi-path topologies to deliver split flows across alternative paths between the core layer and hosted servers, which could lead to significant overload if path scheduling is inefficient. Thus, distributing incoming requests among these paths is crucial for providing higher throughput and protection against link or switch failures. Several approaches have been proposed for path selection, mainly relying on round-robin and least-congested methods. In this paper, we propose a load-balancing method based on betweenness centrality to improve the overall performance of a data center in terms of throughput, delay, and energy consumption. For evaluation, we compare our method with baseline methods of different DCN topologies: fat-tree, DCell, and BCube. On average, the evaluation results show that our method outperforms the others. It increases throughput by 202% and 33% while reducing delay by 20% and 22%, and energy consumption by 40% and 41% compared to the round-robin and least-congested methods, respectively.
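The scheduler itself is not detailed in the abstract; as background, the betweenness centrality that the path selection relies on can be computed for an unweighted graph with Brandes' algorithm, sketched here in plain Python:

```python
from collections import deque

def betweenness(graph):
    """Brandes' algorithm for unweighted betweenness centrality.
    `graph` maps each node to a list of its neighbours."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        stack, pred = [], {v: [] for v in graph}
        sigma = {v: 0 for v in graph}; sigma[s] = 1
        dist = {v: -1 for v in graph}; dist[s] = 0
        q = deque([s])
        while q:                          # BFS shortest-path counting
            v = q.popleft()
            stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:           # first visit
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:  # shortest path through v
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in graph}
        while stack:                      # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc  # halve the values for undirected graphs

# On the path a-b-c, every shortest a-c path passes through b.
g = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(betweenness(g))
```

A load balancer can then prefer paths whose switches have lower centrality, spreading flows away from heavily shared links.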

Author 1: Walaa M. AlShammari
Author 2: Mohammed J.F. Alenazi

Keywords: Data center; load balancing; path diversity; network management; load management; throughput; topology; performance metrics; betweenness centrality; flow scheduling; modeling; DCNs

PDF

Paper 82: Extending Shared-Memory Computations to Multiple Distributed Nodes

Abstract: With the emergence of accelerators like GPUs, MICs, and FPGAs, the availability of domain-specific libraries (like MKL), and the ease of parallelization associated with CUDA and OpenMP based shared-memory programming, node-based parallelization has recently become a popular choice among developers in the field of scientific computing. This is evident from the large volume of recently published work in various domains of scientific computing where shared-memory programming and accelerators have been used to accelerate applications. Although these approaches are suitable for small problem sizes, several issues need to be addressed for them to be applicable to larger input domains. Firstly, the primary focus of these works has been to accelerate the core kernel; acceleration of input/output operations is seldom considered. Many operations in scientific computing operate on large matrices - both sparse and dense - that are read from and written to external files. These input/output operations present themselves as bottlenecks and significantly affect the overall application time. Secondly, node-based parallelization prevents a developer from distributing the computation beyond a single node without learning an additional programming paradigm like MPI. Thirdly, the problem size that can be effectively handled by a node is limited by the memory of the node and its accelerator. In this paper, an Asynchronous Multi-node Execution (AMNE) approach is presented that uses a unique combination of the shared file system and pseudo-replication to extend node-based algorithms to a distributed multi-node implementation with minimal changes to the original node-based code. We demonstrate this approach by applying it to GEMM, a popular kernel in dense linear algebra, and show that the presented methodology significantly advances the state of the art in parallelization and scientific computing.

Author 1: Waseem Ahmed

Keywords: GPU; OpenMP; shared memory programming; distributed programming; CUDA

PDF

Paper 83: Deep Learning Approach for Forecasting Water Quality in IoT Systems

Abstract: The effects of global climate change and water pollution have caused many problems for farmers raising fish and shrimp; for example, shrimp and fish may die early, before harvest. Monitoring and managing water quality to help farmers tackle this problem is essential. Water quality monitoring is important when developing IoT systems, especially for aquaculture and fisheries. By monitoring real-time sensor indicators (such as salinity, temperature, pH, and dissolved oxygen - DO) and forecasting them to obtain early warnings, we can manage the quality of the water, thus improving both quality and quantity in shrimp and fish farming. In this work, we introduce an architecture with a forecasting model for IoT systems that monitor water quality in aquaculture and fisheries. Since these indicators are collected every day, they become sequential/time-series data, so we propose using deep learning with the Long Short-Term Memory (LSTM) algorithm to forecast them. Experimental results on several data sets show that the proposed approach works well and can be applied to real systems.
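The abstract does not show how the daily indicator readings are prepared for the LSTM; a common sketch (with hypothetical pH values) turns the series into supervised (window, next value) pairs:

```python
def make_windows(series, lookback=3):
    """Turn a daily sensor series into (input window, next value)
    pairs, the supervised form an LSTM forecaster is trained on."""
    xs, ys = [], []
    for i in range(len(series) - lookback):
        xs.append(series[i:i + lookback])
        ys.append(series[i + lookback])
    return xs, ys

# Hypothetical daily pH readings from a pond sensor
ph = [7.1, 7.0, 6.9, 6.8, 6.9, 7.0]
x, y = make_windows(ph, lookback=3)
print(x[0], "->", y[0])  # [7.1, 7.0, 6.9] -> 6.8
```

Each window of the last three days becomes one training input, and the following day's reading becomes its target.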

Author 1: Nguyen Thai-Nghe
Author 2: Nguyen Thanh-Hai
Author 3: Nguyen Chi Ngon

Keywords: Forecasting model; deep learning; Long-Short Term Memory (LSTM); water quality indicators

PDF

Paper 84: A Complete Methodology for Kuzushiji Historical Character Recognition using Multiple Features Approach and Deep Learning Model

Abstract: Over many decades, substantial research effort has been devoted to character recognition. This task is not as easy as it appears; in fact, humans have an error rate of more than 6% when reading and recognizing handwritten characters. To address this problem, an effort has been made to apply multiple features for recognizing Kuzushiji characters without any knowledge of the font family presented. At the outset, a pre-processing step that includes image binarization, noise removal, and enhancement was applied. The second step segmented the page sample by applying a contour technique along with the convex hull method to detect individual characters. The third step was feature extraction, which included zonal features (ZF), structural features (SF), and invariant moments (IM). These feature vectors were passed for training and testing to various machine learning and deep learning models to classify and recognize the given character image sample. The accuracy achieved was about 85-90% on the dataset, which consisted of 3,929 classes and 392,990 samples.

Author 1: Aravinda C. V
Author 2: Lin Meng
Author 3: ATSUMI Masahiko
Author 4: Udaya Kumar Reddy K.R
Author 5: Amar Prabhu G

Keywords: Kuzushiji character; zonal features; structural features; invariant moments

PDF

Paper 85: Impact Analysis of Network Layer Attacks in Real-Time Wireless Sensor Network Testbed

Abstract: With the rapid increase in demand for Wireless Sensor Network (WSN) applications, intrusive activities have also risen. To protect these networks from intruders, it is necessary to understand the implications of any malicious act. Most researchers have used simulation software to understand the impact of such intrusions; however, real network conditions differ from a simulated environment. Therefore, the current work focuses on analyzing the impact of network layer attacks on a real-time WSN testbed. The contributions of this work are threefold. Firstly, it presents the deployment of a real-time experimental testbed using standardized sensor devices in a multi-hop topological arrangement. Secondly, it provides the implementation details of seven network layer attacks: Blackhole (BH), Dropping Node (DN), Drop Route Request (DRREQ), Drop Route Reply (DRREP), Drop Route Error (DRERR), Grayhole (GH), and Sinkhole (SH) in a single testbed. Finally, the testbed performance with and without each attack is monitored and compared in terms of network performance metrics to understand the attacks' impact. This work will help the research community propose efficient attack detection and prevention solutions for these networks.

Author 1: Navjot Sidhu
Author 2: Monika Sachdeva

Keywords: Attack; impact; performance; real-time; Wireless Sensor Network (WSN)

PDF

Paper 86: Deep Learning with Data Transformation and Factor Analysis for Student Performance Prediction

Abstract: Student performance prediction is one of the most pressing issues in the field of education and training, especially in educational data mining. Prediction helps students select courses and design appropriate study plans for themselves. Moreover, student performance prediction enables lecturers and educational managers to identify which students should be monitored and supported so that they complete their programs with the best results. Such support can reduce formal warnings and expulsions from universities due to students' poor performance. This study proposes a method to predict student performance using various deep learning techniques. We also analyze and present several techniques for data pre-processing (e.g., quantile transforms and MinMax scaling) before feeding the data into well-known deep learning models such as Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN) to perform the prediction tasks. Experiments were conducted on 16 datasets covering numerous majors, with approximately four million samples collected from the student information system of a Vietnamese multidisciplinary university. Results show that the proposed method provides good prediction results, especially when using data transformation, and is feasible to apply in practical cases.
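As a minimal sketch of the MinMax scaling step mentioned above (the marks are hypothetical), each feature column is rescaled to the range [0, 1] before being fed to the models:

```python
def minmax_scale(column):
    """Rescale a feature column to [0, 1], as MinMax scaling does
    before data is fed to the LSTM/CNN models."""
    lo, hi = min(column), max(column)
    if hi == lo:                 # constant column: map everything to 0
        return [0.0] * len(column)
    return [(v - lo) / (hi - lo) for v in column]

marks = [4.0, 6.5, 9.0]  # hypothetical course marks on a 10-point scale
print(minmax_scale(marks))  # [0.0, 0.5, 1.0]
```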

Author 1: Tran Thanh Dien
Author 2: Sang Hoai Luu
Author 3: Nguyen Thanh-Hai
Author 4: Nguyen Thai-Nghe

Keywords: Deep learning; student performance; mark prediction; Long Short Term Memory (LSTM); Convolutional Neural Networks (CNN); data pre-processing; multidisciplinary university

PDF

Paper 87: Scalability Validation of the Posting Access Method through UPPAAL-SMC Model-Checker

Abstract: The IEEE 802.15.6 standard provides new physical layer (PHY) and medium access control sublayer (MAC) specifications that address several challenges of wireless body area networks (WBANs). Posting is the access method of the IEEE 802.15.6 MAC protocol used by the hub to send data to the nodes. In this paper, we use a formal method to evaluate the posting access method under the stochastic environment of WBANs. Based on the statistical model checking (SMC) toolset UPPAAL-SMC, we model and evaluate the behavior of the posting access method in terms of scalability. The evaluation results validated the scalability with respect to the allocated time intervals, energy consumption, and throughput.

Author 1: Bethaina Touijer
Author 2: Yann Ben Maissa
Author 3: Salma Mouline

Keywords: WBANs; IEEE 802.15.6 MAC protocol; posting access method; UPPAAL-SMC; energy consumption; throughput

PDF

Paper 88: A Prototype of an Automatic Irrigation System for Peruvian Crop Fields

Abstract: Water is an important factor in sustaining life, and for that reason it is necessary to conserve it, since it is a limited resource. In Peruvian agriculture, however, a high percentage of water is wasted, as this activity consumes 92% of the country's fresh water, making Peru the 37th country worldwide in misusing water. Given the above, and considering that the agricultural sector is an important part of the Peruvian economy, the current study aims to implement a system for the automatic irrigation of crop fields in Peru, with the goal of optimizing water use rather than wasting it as usually happens. After implementing the first prototype of the irrigation system using an Arduino microcontroller and low-cost electronic components, it was observed during the tests that 75% and 76.5% of the water normally used for irrigation was saved for a dry (rainless) and a rainy patch of crop field, respectively. Monitoring of soil humidity was possible via Bluetooth communication. The presented results show the viability of the system, and large-scale tests are expected in a follow-up study.

Author 1: Luis Nunez-Tapia

Keywords: Automatic irrigation; crop fields; Arduino; bluetooth

PDF

Paper 89: Non-invasive Device to Lessen Tremors in the Hands due to Parkinson’s Disease

Abstract: One of the severe neurological disorders affecting the central nervous system is Parkinson's disease, which prevents patients from performing routine tasks such as eating and writing. According to statistical data, more than 10 million people in the world suffer from this disease, and the Latin American nation of Peru is no stranger to it, since approximately 30 thousand Peruvians are affected. To date there is no cure for this disease; however, different chemical, biological, and electronic methods help improve the quality of life of patients. This research aims to design a low-cost device that is able to diminish tremors in patients with Parkinson's. The non-invasive device presented and developed in this study works with the help of five vibratory motors and a microcontroller. The vibrations generated by the motors on the patient's wrist distract the brain, and as a result the hand tremors due to Parkinson's disease are reduced.

Author 1: Juan Hinostroza-Quinones
Author 2: Manuel Vasquez-Cunia

Keywords: Parkinson; non-invasive device; vibrations; Arduino

PDF

Paper 90: Evaluating the Quality of a Person’s Calligraphy using Image Recognition

Abstract: Failing to develop good handwriting as a child has serious consequences for learning, ranging from the training of human memory to the capacity for innovation. Assessing the quality of a person's handwriting requires processing large numbers of images; with current improvements in machine learning this process is increasingly precise, but developing such algorithms is complicated. For this reason, this article presents a proposal to evaluate the quality of a person's handwriting through image recognition, in order to assist in its improvement, while performing image processing in a practical way. This evaluation was carried out on a group of university students from the Arequipa region of Peru, using an image processing system that allows character recognition. According to the degree of proximity, the level of handwriting is determined quantitatively, in percentage degrees. The tests carried out show that the quality of calligraphy among university students in the Arequipa region varies between low and medium.

Author 1: Aaron Walter Avila Cordova
Author 2: Armando Flores Choque
Author 3: Joseph Clinthon Paucar Nuñez

Keywords: Calligraphy; image processing; character recognition

PDF

Paper 91: Nitrogen Fertilizer Recommendation for Paddies through Automating the Leaf Color Chart (LCC)

Abstract: Nitrogen fertilizer is indispensable for rice production to ensure that the crop's nitrogen needs are adequately supplied during the growing season. The International Rice Research Institute (IRRI) has proposed the Leaf Color Chart (LCC) to detect the exact nitrogen needs of paddy. Farmers generally monitor the plant's growth (which is also an indicator of the nitrogen concentration of leaves) by comparing the leaf color with the corresponding color on the LCC. Currently, in most cases, the LCC is used manually to determine the fertilizer need, and thus there is a chance of either overestimating or underestimating the amount of fertilizer. To avoid this problem, a smart fertilizer recommendation system is proposed in this paper. The proposed method automates the manual acquisition and interpretation of leaf color for classification through the LCC. The experimentation considers a sample of 6,000 Aman paddy leaf images. The data acquisition process followed IRRI's guidance of capturing paddy leaf images within body shade, using our developed application. The data/images have been made public on Kaggle, a well-known dataset website. Semantic segmentation of the dataset was performed with a powerful Convolutional Neural Network (CNN) backbone architecture, DeepLabV3+. Color classification into the four categories of the LCC was performed by a CNN architecture consisting of seven layers. An information gain based evaluation was performed in the Decision Tree (DT) approach to select features, and with the selected features the DT classified images into the four categories. Color classification by our two proposed methods achieved 94.22% accuracy with the CNN model and 91.22% accuracy with the DT classifier.

Author 1: Torikul Islam
Author 2: Rafsan Uddin Beg Rizan
Author 3: Yeasir Arefin Tusher
Author 4: Md Shafiuzzaman
Author 5: Md. Alam Hossain
Author 6: Syed Galib

Keywords: Leaf Color Chart (LCC); Convolutional Neural Net-work (CNN); fertilizer recommendation system; color classification; Decision Tree (DT)

PDF

Paper 92: Performance Comparison of Natural Language Understanding Engines in the Educational Domain

Abstract: Recently, chatbots have gained great importance in different domains and are becoming more and more common in customer service. One possible cause is the wide variety of platforms that offer natural language understanding as a service, for which no programming skills are required. The problem, then, is which platform to use to develop a chatbot in the educational domain. Therefore, the main objective of this paper is to compare the main natural language understanding (NLU) engines and determine which performs better in the educational domain. In this way, researchers can make better-justified decisions about which NLU engine to use to develop an educational chatbot. In this study, six NLU platforms were compared, and performance was measured with the F1 score. Training data and input messages were extracted from Mariateguino Bot, the chatbot of the José Carlos Mariátegui University during 2018. The results of this comparison indicate that Watson Assistant has the best performance, with an average F1 score of 0.82, which means that it is able to answer correctly in most cases. Finally, other factors can condition the choice of a natural language understanding engine, so ultimately the choice is left to the user.
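The F1 score used in the comparison is the harmonic mean of precision and recall; a minimal sketch with hypothetical intent-classification counts:

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall, computed from
    true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical intent-classification counts for one NLU engine
print(round(f1_score(tp=82, fp=10, fn=25), 2))  # 0.82
```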

Author 1: Victor Juan Jimenez Flores
Author 2: Oscar Juan Jimenez Flores
Author 3: Juan Carlos Jimenez Flores
Author 4: Juan Ubaldo Jimenez Castilla

Keywords: Chatbot; natural language understanding; NLU; F1 score; performance

PDF

Paper 93: Date Grading using Machine Learning Techniques on a Novel Dataset

Abstract: Date grading is a crucial stage in date factories. However, it is done manually in most Middle Eastern industries. This study, using a novel dataset, identifies suitable machine learning techniques to grade dates based on an image of the date. The dataset consists of three different types of dates, namely Ajwah, Mabroom, and Sukkary, each with three different grades. The dates were obtained from the Manafez company and graded by their experts. The color, size, and texture of the dates are the features considered in this work. To determine the color, we used color properties in the RGB (red, green, and blue) color space. For measuring the size, we applied best least-squares ellipse fitting. To analyze the texture, we used the Weber local descriptor to distinguish between texture patterns. To identify a suitable grading classifier, we experimented with three approaches, namely k-nearest neighbor (KNN), support vector machine (SVM), and convolutional neural network (CNN). Our experiments have shown that CNN is the best classifier, with an accuracy of 98% for Ajwah, 99% for Mabroom, and 99% for Sukkary. Hence, the CNN classifier has been incorporated into our date grading system.
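Of the three classifiers compared, KNN is the simplest to sketch; this illustrative example (the color features and grade labels are hypothetical) classifies a date by majority vote of its nearest labelled neighbours:

```python
from collections import Counter

def knn_classify(sample, training, k=3):
    """Classify a feature vector by majority vote of its k nearest
    neighbours, using squared Euclidean distance."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(training, key=lambda t: dist(sample, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical (mean red, mean green) color features per date grade
train = [((200, 120), "grade1"), ((198, 118), "grade1"),
         ((150, 90), "grade2"), ((148, 88), "grade2"),
         ((100, 60), "grade3")]
print(knn_classify((199, 119), train, k=3))  # grade1
```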

Author 1: Hafsa Raissouli
Author 2: Abrar Ali Aljabri
Author 3: Sarah Mohammed Aljudaibi
Author 4: Fazilah Haron
Author 5: Ghada Alharbi

Keywords: Date grading; machine learning; k-nearest neigh-bor; support vector machine; convolutional neural network

PDF

Paper 94: Robust Control and Fuzzy Logic Guidance for an Unmanned Surface Vehicle

Abstract: This work deals with the guidance and control of an unmanned surface vehicle whose mission is to autonomously monitor water quality conditions in Peruvian coastal waters. The vehicle is a catamaran with two slender hulls, propelled by two electric thrusters operating in differential and common modes in order to maneuver in the surge and yaw directions. A multivariable control approach is proposed to control these two variables, and fuzzy logic-based guidance tracks predefined trajectories on the sea surface. The conjunction of the robust control and guidance algorithms is validated numerically, and the results show good stability and performance despite the presence of disturbances, sensor noise, and model uncertainties.

Author 1: Marcelo M. Huayna-Aguilar
Author 2: Juan C. Cutipa-Luque
Author 3: Pablo Raul Yanyachi

Keywords: Robust control; guidance; fuzzy; unmanned surface vehicle

PDF

Paper 95: Passenger Communication System for Next-Generation Self-Driving Cars: A Buddy

Abstract: With the rapid emergence of autonomous vehicles (AVs), there is a need to build communication systems that help passengers communicate with AVs robustly. In this regard, this research work presents a multimodal passenger communication system, known as "Buddy", for AVs. Buddy is an all-in-one control system for AVs which incorporates touch, speech, text, and emotion recognition methods of interaction. Buddy makes it easy for passengers to interact with AVs. It enables communication between the passengers and the AV, which ultimately provides a safe driving experience. Moreover, we have proposed and developed our own simulator to evaluate the performance of our proposed passenger communication system. We have also conducted extensive in-field tests to assess the effectiveness of the proposed system. The extensive and rigorous analysis validates the results and hence the significance of the proposed passenger communication system.

Author 1: M Talha Bin Ahmed Lodhi
Author 2: Faisal Riaz
Author 3: Yasir Mehmood
Author 4: Muhammad Farrukh Farid
Author 5: Abdul Ghafoor Dar
Author 6: Muhammad Atif Butt
Author 7: Samia Abid
Author 8: Hasan Ali Asghar

Keywords: Autonomous Vehicles (AVs); passenger communi-cation system; the simulation engine

PDF

Paper 96: A Hybrid Model based on Radial basis Function Neural Network for Intrusion Detection

Abstract: An Intrusion Detection System (IDS) is a system that monitors the network to identify malicious activities. Upon identifying unusual activities, the IDS sends a notification to the network administrators to warn them about hackers' hostile activities. For detecting intrusions, signature-based systems are considered one of the most effective methods. However, they cannot detect new attacks. Additionally, it is costly and challenging to keep the attack signature database up to date with known signatures, which constitutes a significant drawback. Neural networks are capable of learning from input patterns and have the potential to generalize data. In this paper, we propose a hybrid model based on a Directed Batch Growing Self-Organizing Map (DBGSOM) combined with a Radial Basis Function Neural Network (RBFNN) to detect abnormalities in the network. Based on our experiments, the proposed model performed well and resulted in satisfactory performance measures compared to the Self-Organizing Map and Radial Basis Function Neural Network (SOM&RBFNN) model.
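The abstract does not give the network's equations; as background, a hidden unit of an RBFNN typically applies a Gaussian radial basis activation, responding most strongly to inputs near its centre:

```python
import math

def rbf(x, center, sigma=1.0):
    """Gaussian radial basis activation:
    exp(-||x - c||^2 / (2 * sigma^2))."""
    sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-sq / (2 * sigma ** 2))

print(rbf((0.0, 0.0), (0.0, 0.0)))          # 1.0 at the centre
print(rbf((3.0, 4.0), (0.0, 0.0)) < 0.001)  # far inputs decay to ~0
```

In an anomaly detector of this kind, the centres are typically learned (here, by the DBGSOM stage), and traffic that activates no unit strongly is flagged as abnormal.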

Author 1: Marwan Albahar
Author 2: Ayman Alharbi
Author 3: Manal Alsuwat
Author 4: Hind Aljuaid

Keywords: Intrusion detection; neural network; radial basis function; directed batch growing self-organizing map

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org