The Science and Information (SAI) Organization

IJACSA Volume 11 Issue 6

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: Modeling Real-World Load Patterns for Benchmarking in Clouds and Clusters

Abstract: Cloud computing has now permeated all walks of life. It has proven extremely useful for organizations and individual users, who save costs by leasing the compute resources they need. This has led to exponential growth in cloud-computing research and development: a substantial number of frameworks, approaches, and techniques are being proposed to enhance various aspects of clouds and add new features. A constant concern in this scenario is creating a testbed that faithfully reflects a real-world cloud datacenter. It is vital to simulate realistic, repeatable, standardized CPU and memory workloads to compare and evaluate the impact of different approaches in a cloud environment. This paper introduces Cloudy, an open-source workload generator that can be used within cloud instances, Virtual Machines (VMs), containers, or local hosts. Cloudy utilizes resource usage traces of machines from Google and Alibaba clusters to simulate up to 16000 different, real-world CPU and memory load patterns. The tool also provides a variety of machine metrics for each run that can be used to evaluate and compare the performance of the VM, container, or host. Additionally, it includes a web-based visualization component that offers a number of real-time statistics, as well as overall statistics of the workload such as seasonal trends and autocorrelation. These statistics can be used to further analyze the real-world traces and enhance the understanding of workloads in the cloud.

Author 1: Kashifuddin Qazi

Keywords: Cloud computing; workload generator; cluster computing

PDF
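Cloudy's internal mechanism is not described in the abstract; as a hypothetical illustration of the general idea behind trace-driven CPU load generation (busy/idle duty-cycling within fixed intervals; all names are ours, not Cloudy's), consider:

```python
import time

def replay_cpu_trace(trace, interval=0.1):
    """Replay a CPU-utilization trace (values in [0, 1]) by
    busy-looping for a fraction of each interval and sleeping
    for the remainder -- the basic duty-cycling idea behind
    trace-driven load generators."""
    for util in trace:
        busy_until = time.perf_counter() + interval * util
        while time.perf_counter() < busy_until:
            pass                                  # burn CPU for the busy share
        time.sleep(interval * (1.0 - util))       # stay idle for the rest

# A toy 3-step trace: 20%, 80%, 50% CPU utilization per interval.
replay_cpu_trace([0.2, 0.8, 0.5], interval=0.05)
```

A real generator would read the utilization series from the Google or Alibaba cluster traces instead of a hand-written list, and would typically pin one such loop per core.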

Paper 2: Performance Evaluation of LoRa ES920LR 920 MHz on the Development Board

Abstract: This study tests the LoRa ES920LR under obstructed conditions and compares the results with Free Space Path Loss (FSPL) measured obstacle-free by means of a drone. The ES920LR uses a 920 MHz frequency channel, 125 kHz bandwidth, and SF 7-12, with 13 dBm output power, and offers a sleep mode configured through command-prompt-based settings. The development board used is a Leafony board, a small-sized board compatible with the Arduino IDE and built around an ATmega328P microcontroller. The board is tiled with complete facilities, e.g., a power supply board, four different sensor boards, MCU boards, and communication boards such as WiFi and Bluetooth; this article specifically mounts the LoRa ES920LR on the Leafony board, choosing LoRa because it offers long range (km) at low power, together with expansion boards that can be further developed. Expansion of the Leafony board is expected to reduce the power consumption of the sensor node and to improve its lifetime, small size, and light weight. Furthermore, this algorithm is used to optimize LoRa coverage and LoRa lifetime. The FSPL results for the LoRa ES920LR show a Power Receiver (Pr) of 30 dB at a distance of 1 meter, 85 dB at 500 meters, 95 dB at 1500 meters, and 100 dB at 2.5 km. Attenuation is caused mainly by distance, though not significantly; other factors are obstacles and bad weather (rain, snow).

Author 1: Puput Dani Prasetyo Adi
Author 2: Akio Kitagawa

Keywords: Coverage; lifetime; low power; lightweight; long range; development board; free space; drone

PDF
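The path-loss values quoted above are broadly consistent with the standard free-space model, FSPL(dB) = 20 log10(d_km) + 20 log10(f_MHz) + 32.44. A minimal sketch (function name ours): at 920 MHz it gives about 31.7 dB at 1 m and 99.7 dB at 2.5 km, close to the reported 30 dB and 100 dB.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB:
    20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

loss = fspl_db(2.5, 920)   # ~99.7 dB, close to the ~100 dB reported at 2.5 km
```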

Paper 3: Adapting CRISP-DM for Idea Mining: A Data Mining Process for Generating Ideas Using a Textual Dataset

Abstract: Data mining project managers can benefit from using standard data mining process models. The benefits of using standard process models for data mining, such as the de facto standard and most popular Cross-Industry Standard Process for Data Mining (CRISP-DM), are reduced cost and time. Standard models also facilitate knowledge transfer and the reuse of best practices, and minimize knowledge requirements. On the other hand, to unlock the potential of ever-growing textual data such as publications, patents, social media data, and documents of various forms, digital innovation is increasingly needed. Furthermore, cutting-edge machine learning tools and techniques enable the elicitation of ideas. The processing of unstructured textual data to generate new and useful ideas is referred to as idea mining. The existing literature on idea mining largely overlooks the utilization of standard data mining process models. Therefore, the purpose of this paper is to propose a reusable model to generate ideas: CRISP-DM for Idea Mining (CRISP-IM). The design and development of CRISP-IM follow the design science approach. CRISP-IM facilitates idea generation through the use of Dynamic Topic Modeling (DTM), unsupervised machine learning, and subsequent statistical analysis on a dataset of scholarly articles. The adapted CRISP-IM can be used to guide the process of identifying trends using scholarly literature datasets, temporally organized patents, or any other textual dataset of any domain to elicit ideas. The ex-post evaluation of CRISP-IM is left for future study.

Author 1: Workneh Y. Ayele

Keywords: CRISP-IM; idea generation; idea evaluation; idea mining evaluation; dynamic topic modeling; CRISP-DM

PDF

Paper 4: A Pedestrian Detection and Tracking Method for Robot Equipped with Laser Radar

Abstract: In order to detect and track pedestrians against complex indoor backgrounds, a pedestrian detection and tracking method for indoor robots equipped with laser radar is proposed. First, SLAM (Simultaneous Location and Mapping) is applied to obtain a 2D grid map of an unknown environment; Monte Carlo localization is then employed to obtain the posterior pose of the robot in the map. Next, an improved likelihood-field background subtraction algorithm is proposed to extract the foreground of interest in a changing environment, and a hierarchical clustering algorithm combined with an improved leg model is proposed to detect the target pedestrian. Finally, an improved tracking intensity formula is designed to track and follow the target pedestrian. Experimental results in several complex environments show that our method can effectively reduce the impact of confusing scenarios that challenge other algorithms, such as a moving chair, a person suddenly passing by, or the target pedestrian walking close to the wall, and can detect, track, and follow pedestrians in real time with high accuracy.

Author 1: Zhu Bin
Author 2: Zhang Jian Rong
Author 3: Wang Yan Fang
Author 4: Wu Jin Ping

Keywords: Laser radar; likelihood field model; pedestrian detection; pedestrian tracking; simultaneous location and mapping

PDF

Paper 5: Comparative Study of EIGRP and OSPF Protocols based on Network Convergence

Abstract: Dynamic routing protocols are among the fastest growing routing protocols in networking technologies because of characteristics such as high throughput, flexibility, low overhead, scalability, easy configuration, and efficient bandwidth and CPU utilization. However, convergence time is a critical problem in any of these routing protocols: it is the time required for all routers to obtain updated, complete, and accurate information about the network. Several studies have investigated EIGRP and OSPF on the internet; however, only a few of these studies have considered link failure and the addition of new links using different network scenarios. This research contributes to this area. This comparative study uses the network simulator GNS3 to simulate different network topologies; the topologies implemented are star and mesh. The results are validated using Cisco hardware equipment in the laboratory. Wireshark is used to capture and analyze the packets in the networks, which helps in monitoring accurate response times for the various packets. The results obtained from Wireshark suggest that EIGRP has higher performance than the OSPF routing protocol in terms of convergence duration when a link fails or a new link is added to the network. Following this study, EIGRP is recommended over OSPF for most heterogeneous network implementations.

Author 1: Ifeanyi Joseph Okonkwo
Author 2: Ikiomoye Douglas Emmanuel

Keywords: OSPF (Open Shortest Path First); EIGRP (Enhanced Interior Gateway Routing Protocol); routing; protocol; network; convergence; topology; routers; packets; Wireshark

PDF

Paper 6: Estimate the Total Completion Time of the Workload

Abstract: Business intelligence workloads serve analytical processing over data warehouses, which hold very large collections of digital data. The main problem for such complex workloads is estimating the total completion time, which is required when a workload is executed as a batch of queries. Because queries run in batches, they should be estimated according to an interaction-aware scheme. Database administrators often need to know how much longer a business intelligence workload will take to complete; this question arises when the administrator must accomplish workloads within an available time frame. The database system executes mixes of multiple queries concurrently, so we measure the query interactions of a mix rather than follow the practiced approach of considering each query separately. A novel estimation framework is presented to estimate the running time of a workload based on experiment-driven modeling coupled with workload simulation. The framework has two major parts: an offline phase and an online phase. The offline phase collects experimental samplings of mixes containing different query types. The approach achieves good accuracy in estimating the running time of a workload, as evaluated with TPC-H queries on PostgreSQL.

Author 1: Muhammad Amjad
Author 2: Waqas Ahmad
Author 3: Zia Ur Rehman
Author 4: Waqar Hussain
Author 5: Syed Badar Ud Duja
Author 6: Bilal Ahmed
Author 7: Usman Ali
Author 8: M. Abdul Qadoos
Author 9: Ammad Khan
Author 10: M. Umar Farooq Alvi

Keywords: Query interactions; estimate time; running time

PDF

Paper 7: Improving Disease Prediction using Shallow Convolutional Neural Networks on Metagenomic Data Visualizations based on Mean-Shift Clustering Algorithm

Abstract: Metagenomic data is a novel and valuable source for personalized medicine approaches to improving human health. Data visualization is a crucial technique in data analysis for exploring and finding patterns in data. Metagenomic data resources in particular often have very high dimensionality, so humans face big challenges in understanding them. In this study, we introduce a visualization method based on the Mean-shift algorithm which enables us to observe high-dimensional data via images exhibiting features clustered by the clustering method. These generated synthetic images are then fed into a convolutional neural network for disease prediction tasks. The proposed method shows promising results when evaluated on four metagenomic bacterial species abundance datasets related to four diseases: Liver Cirrhosis, Colorectal Cancer, Obesity, and Type 2 Diabetes.

Author 1: Hai Thanh Nguyen
Author 2: Toan Bao Tran
Author 3: Huong Hoang Luong
Author 4: Trung Phuoc Le
Author 5: Nghi C. Tran

Keywords: Clustering algorithm; metagenomic; visualization; disease prediction; mean-shift; personalized medicine; species abundance; bacterial

PDF
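The abstract does not give the Mean-shift details; as a rough, hypothetical illustration of the clustering idea it relies on (a flat-kernel Mean-shift on 1-D points, implementation and names entirely ours), consider:

```python
def mean_shift_1d(points, bandwidth, iters=20):
    """Toy flat-kernel mean-shift: each point repeatedly moves to the
    mean of all points within `bandwidth` of it, so points drift toward
    local density peaks (the cluster modes)."""
    modes = list(points)
    for _ in range(iters):
        for i, x in enumerate(modes):
            window = [p for p in points if abs(p - x) <= bandwidth]
            modes[i] = sum(window) / len(window)
    # Collapse near-identical modes into distinct cluster centers.
    centers = []
    for m in modes:
        if not any(abs(m - c) < 1e-3 for c in centers):
            centers.append(m)
    return centers

centers = mean_shift_1d([1.0, 1.2, 0.9, 8.0, 8.3, 7.9], bandwidth=1.5)
# Two well-separated groups -> two cluster centers (near 1.03 and 8.07).
```

The paper works on high-dimensional species-abundance vectors rather than scalars, but the mode-seeking principle is the same.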

Paper 8: Hybrid Memory Design for High-Throughput and Low-Power Table Lookup in Internet Routers

Abstract: Table lookup is a major process that decides the packet-processing throughput and power efficiency of routers. To realize high-throughput and low-power table lookup, recent routers have employed several table lookup approaches, such as the TCAM (Ternary Content Addressable Memory) based approach and the DRAM (Dynamic Random Access Memory) based approach, depending on the purpose. However, it is difficult to realize both ultrahigh throughput and significantly low power due to the trade-off between them. To satisfy both demands, this study proposes a hybrid memory design, which combines TCAM, DRAM, PPC (Packet Processing Cache), CMH (Cache Miss Handler), and IP Cache, to enable high-throughput and low-power table lookup. Simulation results using an in-house cycle-accurate simulator showed that the proposed memory design achieved nearly 1 Tbps throughput with power consumption similar to the DRAM-based approach. Compared to the approach proposed in a recent study, the proposed memory design realizes 1.95x higher throughput with 11% of the power consumption.

Author 1: Hayato Yamaki

Keywords: Internet routers; packet processing; table lookup; hybrid memory architecture; Packet Processing Cache (PPC)

PDF

Paper 9: Artificial Intelligence: What it Was, and What it Should Be?

Abstract: Artificial Intelligence was embraced as an idea of simulating unique abilities of humans, such as thinking, self-improvement, and expressing their feelings in different languages. The idea of “Programs with Common Sense” was the central goal of Classical AI; it was mainly built around an internal, updatable cognitive model of the world. Now, however, almost all proposed models and approaches lack reasoning and cognitive models and have become more data-driven. In this paper, different approaches and techniques of AI are reviewed, specifying how these approaches strayed from the main goal of Classical AI and emphasizing how to return to its main objective. Additionally, most of the terms and concepts used in this field, such as Machine Learning, Neural Networks, and Deep Learning, are highlighted. Moreover, the relations among these terms are determined, in an effort to remove the mystery and ambiguity around them. The transition from Classical AI to Neuro-Symbolic AI and the need for new cognitive-based models are also explained and discussed.

Author 1: Hala Abdel Hameed

Keywords: Classical AI; machine learning; Neuro-Symbolic AI; Cognitive-based AI; deep learning

PDF

Paper 10: Performance Assessment and Analysis of Blended Learning in IT Education: A Longitudinal Study in Saudi Electronic University

Abstract: Blended learning is a new educational model that binds traditional face-to-face learning with the application of modern tools and technologies. This helps retain the positive features of traditional learning while allowing students to realize the potential of modern technologies. In blended learning, student perceptions and satisfaction play a key role. Longitudinal studies can help identify patterns in these perceptions and expectations so that blended learning evolves with changing times and technologies. In this paper, a longitudinal study has been carried out with the students and faculty of Saudi Electronic University to identify the major drivers and their role in shaping student perceptions and satisfaction. The results of this longitudinal study have been validated, and their subsequent comparisons ascertained, with the application of a decision-tree-based data mining technique. Based on the analysis and findings of this study, the paper presents recommendations to improve the blended learning experience and enhance the effectiveness of the teaching pedagogies developed consequently.

Author 1: Mohamed Habib
Author 2: Muhammad Ramzan

Keywords: Blended learning; educational data; information technology; longitudinal study; data mining; decision tree

PDF

Paper 11: Usability and Design Issues of Mobile Assisted Language Learning Application

Abstract: This paper examines the views of teachers, government officials, and students on the Literacy & Numeracy Drive (LND), a smartphone app for teaching languages and mathematics to students in Punjab province, Pakistan. It further identifies usability and design problems of LND during its use in grade three in schools, as these issues have not been discussed since the launch of the application. The methodology for this study is a questionnaire for teachers and semi-structured interviews for government officials of District Sheikhupura and for students. The results show that LND has various usability and design problems in its current form, i.e., buttons, icons, color schemes, sluggish performance, and fonts. Teachers, government officials, and students also suggested that game-based learning consisting of an interactive interface, phonics, and key animations be created and adopted. Highly engaging and appealing delivery of the curriculum, along with improvements in the appraisal, will improve the participation of students and deliver better outcomes.

Author 1: Kashif Ishaq
Author 2: Fadhilah Rosdi
Author 3: Nor Azan Mat Zin
Author 4: Adnan Abid

Keywords: Educational technology; language learning; literacy and numeracy drive; mobile application (App); m-learning; usability; user interface design

PDF

Paper 12: Intelligent Risk Alarm for Asthma Patients using Artificial Neural Networks

Abstract: Asthma is a chronic disease of the airways of the lungs. It results in inflammation and narrowing of the respiratory passages, which prevents air flow into the airways and leads to frequent bouts of shortness of breath with wheezing, accompanied by coughing and phlegm, after inhalation of substances that provoke allergic reactions or irritation of the respiratory system. Data mining in healthcare is very important in diagnosing and understanding data; it aims to solve basic problems in diagnosing diseases, given the complexity of diagnosing asthma. Predicting chemicals in the atmosphere is very important and has been one of the most difficult problems since the last century. In this paper, the impact of chemicals on asthma patients is presented and discussed. A sensor system called MQ5 is used to examine the smoke and nitrogen content in the atmosphere. The MQ5 is inserted in a wristwatch that checks the smoke and nitrogen content at the patient’s location, and the system issues a warning alarm if the gas can affect a person with asthma. It is based on an Artificial Neural Network (ANN) algorithm built using data containing a set of chemicals such as carbon monoxide, NMHC (GT) acid gas, C6H6 (GT) gasoline, NOx (GT) nitrogen oxide, and NO2 (GT) nitrogen dioxide. Temperature and humidity are also used, as they can negatively affect asthma patients. Finally, the classification model was evaluated and achieved 99.58% classification accuracy.

Author 1: Rawabi A. Aroud
Author 2: Anas H. Blasi
Author 3: Mohammed A. Alsuwaiket

Keywords: Asthma; ANN; data mining; intelligent systems; machine learning; traffic-related pollution

PDF

Paper 13: Cultural Algorithm Initializes Weights of Neural Network Model for Annual Electricity Consumption Prediction

Abstract: The accurate prediction of annual electricity consumption is crucial in managing energy operations. The neural network (NN) has achieved considerable success in annual electricity consumption prediction due to its universal approximation property. However, the well-known back-propagation (BP) algorithm for training NNs easily gets stuck in local optima. In this paper, we study the weight initialization of NNs for the prediction of annual electricity consumption using the Cultural Algorithm (CA); the proposed algorithm is named NN-CA. The NN-CA was compared to weight initialization using six other metaheuristic algorithms, as well as to BP. The experiments were conducted on annual electricity consumption datasets from 21 countries. The experimental results showed that the proposed NN-CA achieved better prediction accuracy than the other competitors, indicating the potential of NN-CA in the application of annual electricity consumption prediction.

Author 1: Gawalee Phatai
Author 2: Sirapat Chiewchanwattana
Author 3: Khamron Sunat

Keywords: Neural network; weights initialization; metaheuristic algorithm; cultural algorithm; annual electricity consumption prediction

PDF

Paper 14: Learning based Coding for Medical Image Compression

Abstract: The area of image processing has produced different coding approaches and applications, ranging from fundamental image compression models to high-quality applications. The advancement of image processing has brought the advantage of automation to various image coding applications, among which medical image processing is one of the prime areas. Medical diagnosis has always been a time-consuming and sensitive process requiring accurate medical treatment, and automation systems have been developed to improve it. In the process of automation, images are coded and passed to a remote processing unit for processing and decision making. Images are coded for compression to minimize the processing and computational overhead; however, trading compression against accuracy always remains a challenge. Thus, for optimal image compression, compression should proceed through the reduction of non-relevant coefficients in medical images. The proposed image compression model develops a coding technique that attains accurate compression by retaining image precision with lower computational overhead in clinical image coding. To make image compression more efficient, this research work introduces an approach to image compression based on learning-based coding. This research achieves superior results in terms of compression rate, encoding time, decoding time, total processing time, and peak signal-to-noise ratio (PSNR).

Author 1: Abdul Khader Jilani Saudagar

Keywords: Image compression; medical image processing; neural network; learning based coding; peak signal-to-noise ratio

PDF

Paper 15: An IoT based Approach for Efficient Home Automation with ThingSpeak

Abstract: Technology is growing rapidly, and people and daily-life processes are highly dependent on the internet. The Internet of Things (IoT) is an area of magnificent impact, growth, and potential with the advent and rapid growth of smart homes, smart agriculture, smart cities, and smart everything; it constructs an environment in which everything is integrated and digitalized. People depend on smartphones and want to do their daily routine tasks in an easy and quick way. Ordinary homes contain multiple digital appliances that are controlled or managed by individual remote systems, and it is very hectic to use multiple individual remotes to control the various components of a home. In the current technological era, almost all types of home components, not just home appliances, are available in digital form. Various home automation systems with different specifications and implementations have been proposed in the literature. The objective of this research is to introduce an IoT-based approach for an efficient home automation system using Arduino and ThingSpeak. We have automated almost all essential aspects of a smart home. The proposed system is efficient in terms of low power consumption, green building, and increasing the life of digital appliances. The ThingSpeak cloud platform is used to integrate the home components and to analyze and process the data. The state-of-the-art MQTT protocol is implemented for LAN communication. This paper provides a path for IoT developers and researchers to sense, digitalize, and control homes in the perspective of the future IoT. Moreover, this work serves as an instance of how life can be made easier with the help of IoT applications.

Author 1: Mubashir Ali
Author 2: Zarsha Nazim
Author 3: Waqar Azeem
Author 4: Muhammad Haroon
Author 5: Aamir Hussain
Author 6: Khadija Javed
Author 7: Maria Tariq

Keywords: Internet of Things (IoT); home automation; Arduino; ThingSpeak; sensors; cloud computing; mobile computing

PDF

Paper 16: Discrete Cosine Transformation based Image Data Compression Considering Image Restoration

Abstract: A Discrete Cosine Transformation (DCT) based image data compression method featuring image restoration is proposed. DCT image compression is widely used but suffers from four major image defects. In order to reduce the noise and distortions, the proposed method estimates a set of parameters for an assumed distortion model based on an image restoration method. The results from the experiment with Landsat TM (Thematic Mapper) data of Saga show good image compression performance in compression factor and image quality: the proposed method achieved a 25% improvement in compression factor compared to the existing DCT method, with almost comparable image quality between both methods.

Author 1: Kohei Arai

Keywords: Discrete Cosine Transformation; data compression; image restoration; Landsat TM

PDF
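The paper's restoration step is not specified in the abstract, but the underlying DCT compression idea, keeping low-frequency coefficients and discarding the rest, can be sketched as follows (1-D orthonormal DCT, our own naive implementation, not the paper's method):

```python
import math

def dct(x):
    """Orthonormal DCT-II of a 1-D signal."""
    N = len(x)
    out = []
    for k in range(N):
        a = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(a * sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N)
                           for n in range(N)))
    return out

def idct(X):
    """Inverse (DCT-III) of the orthonormal DCT-II above."""
    N = len(X)
    return [sum((math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
                * X[k] * math.cos(math.pi * (n + 0.5) * k / N)
                for k in range(N))
            for n in range(N)]

def compress(x, keep):
    """Keep only the first `keep` (low-frequency) DCT coefficients."""
    X = dct(x)
    return idct([c if k < keep else 0.0 for k, c in enumerate(X)])

signal = [10, 11, 12, 13, 13, 12, 11, 10]   # smooth toy "image row"
approx = compress(signal, keep=3)           # 3 of 8 coefficients survive
```

Because the basis is orthonormal, the reconstruction error equals the energy of the dropped coefficients, which is small for smooth signals; JPEG-style codecs apply the same idea blockwise in 2-D.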

Paper 17: DVB-T2 Radio Frequency Signal Observation and Parameter Correlation

Abstract: In this paper, field test measurements are described and statistically correlated to obtain useful information about the radiofrequency (RF) behavior of second-generation Digital Video Broadcasting - Terrestrial (DVB-T2) channels. The monitored radiofrequency data parameters are analyzed from a statistical perspective to find any linear correlation between them. A practical series of field measurements in the surroundings of Korça city in Albania was performed for 48 consecutive hours, with data sampled each second. The obtained results show the main issues that need to be considered in monitoring service reception quality, which is not strongly related to the received channel power level but rather to the Modulation Error Rate (MER) parameter.

Author 1: Bexhet Kamo
Author 2: Elson Agastra
Author 3: Shkelzen Cakaj

Keywords: DVB-T2; radio coverage; statistical correlation of RF data; field measurements

PDF

Paper 18: Causes of Failure in the Implementation and Functioning of Information Systems in Organizations

Abstract: When implementing or starting up an information system, a number of causes can lead to its failure. Today, few companies do not rely on technology to carry out their business processes. The desire for a competitive advantage over competitors, together with the changing global business environment, puts pressure on information system implementation projects, be it an ERP (Enterprise Resource Planning) system, a CRM (Customer Relationship Management) system, or a Big Data project to manage a central repository of all internal and external data that a company handles. Although starting an information system implementation project is an exciting prospect for a company, its failure can prevent key business processes from being carried out correctly. This article exposes the most common causes of failure both when implementing an information system and during the operation of the system, which can lead to organizational chaos and to measures that no company wishes to take. A real case of failure during the implementation of an information system in an important Mexican company is presented. The research team was allowed to interview general and systems-area managers as well as employees. In addition, a survey was carried out among 30 people, between managers and heads of department, who closely followed the implementation process of the global operations and technology system within the company. The most influential factors were deficient administration, a bad definition of the project, and inappropriate consultancy.

Author 1: José Ramón Figueroa-Flores
Author 2: Elizabeth Acosta-Gonzaga
Author 3: Elena Fabiola Ruiz-Ledesma

Keywords: Information systems; outsourcing; resistance to change; organizational culture; decision making; information systems implementation failures

PDF

Paper 19: Handwritten Arabic Characters Recognition using a Hybrid Two-Stage Classifier

Abstract: Handwritten Arabic character recognition presents a big challenge to researchers in the field of pattern recognition. Arabic characters are characterized by their highly cursive nature, and many of them have a similar appearance; for example, the only difference between some alphabet characters is the existence of a number of dots above or below the main character shape. This paper proposes a system for isolated off-line handwritten Arabic character recognition using the Discrete Cosine Transform (DCT) as the feature extraction method and a two-stage hybrid classifier. The two stages are a Support Vector Machine (SVM) and a neural network (NN). The first stage is a two-class SVM classifier which classifies a character as either a character with dot(s) or one without dot(s). The output of this stage is used to extend the feature vector of the character by the class value, giving it an extra unique feature. The extended feature vector is fed to a multi-class neural network model to classify the character. The proposed approach is tested on a database of Arabic handwritten characters called AlexU Isolated Alphabet (AIA9K) containing 8,737 character images. The experimental results of the first-stage classifier showed a high recognition accuracy rate of 99.14%. The proposed two-stage hybrid classifier obtained an average recognition accuracy rate of 91.84% over all Arabic alphabet characters.

Author 1: Amjad Ali Al-Jourishi
Author 2: Mahmoud Omari

Keywords: Arabic character recognition; Support Vector Machine (SVM); neural network (NN); hybrid classifier

PDF
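The two-stage pipeline described in this abstract can be sketched minimally as follows, with trivial stand-in models in place of the paper's SVM and neural network (all names and thresholds here are hypothetical illustrations, not the paper's code):

```python
def two_stage_classify(features, stage1, stage2):
    """Stage 1 predicts a coarse binary attribute (e.g. dot(s) vs no
    dot(s)); its output is appended to the feature vector as one extra
    feature before the stage-2 multi-class model runs."""
    has_dots = stage1(features)               # binary class value: 0 or 1
    extended = features + [float(has_dots)]   # extended feature vector
    return stage2(extended)

# Hypothetical stand-in models (the paper uses an SVM and a neural net
# over DCT features).
stage1 = lambda f: 1 if sum(f) > 1.0 else 0
stage2 = lambda f: "class_with_dots" if f[-1] == 1.0 else "class_plain"

label = two_stage_classify([0.7, 0.6], stage1, stage2)  # -> "class_with_dots"
```

The design point is that the stage-1 decision becomes a discriminative extra feature, so stage 2 never has to confuse characters that differ only in their dots.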

Paper 20: Critical Factors Affecting the Intention to Adopt Big Data Analytics in Apparel Sector, Sri Lanka

Abstract: Big data has become a potential research area in the apparel industry due to the vast amount of data generated in a short period of time. The inability to adapt to this challenging digital environment has pushed weaker players out of the industry while making adopters more and more powerful. As the insights generated from data become core competitive advantages, it is now pertinent to identify which factors affect the intention to adopt big data analytics in an apparel-sector organization. The three contexts of the Technology-Organization-Environment (TOE) framework, along with the Technology Acceptance Model (TAM), were used as foundational frameworks to explore the influences on users’ attitude towards using, which ultimately affects the intention to adopt big data analytics. The findings of the study denote that the factors considered in both the TOE framework and the TAM model, except the organizational context, have a positive correlation with the user’s attitude towards using, which ultimately leads the organization to enhance its intention to adopt big data analytics. Finally, the research concludes that the variable attitude towards using plays a positive mediating role in the direct relationship between the critical factors and the intention to adopt big data analytics. It is hoped that the findings of this research will enrich the existing literature while encouraging practitioners to adopt big data analytics by prioritizing investments accordingly.

Author 1: Hiruni Bolonne
Author 2: Piyavi Wijewardene

Keywords: Critical factors; TOE framework; Technology Acceptance Model (TAM); attitude towards using; intention to adopt; Big Data Analytics (BDA); apparel sector; Sri Lanka

PDF

Paper 21: Distributed Denial of Service Attacks in Cloud Computing

Abstract: Attacks on cloud computing have increased with its expanded use. One of the best-known attacks targeting cloud computing is the distributed denial of service (DDoS) attack. The common features and components of the cloud structure make it more exposed to this kind of attack. DDoS attacks target the large number of devices connected to any cloud service provider, exploiting the scalability and reliability features that make the cloud available from anywhere at any time. The attack mainly generates a large number of malicious packets to keep the targeted server busy handling them. Many techniques exist to defend against DDoS attacks in conventional networks, whereas in cloud computing this task is more complicated because of the various characteristics of the cloud that make defense difficult. This paper investigates most of the methods used to detect, prevent, and recover from DDoS attacks in the cloud computing environment.

Author 1: Hesham Abusaimeh

Keywords: Cloud; cloud computing; DoS attacks; DDoS attacks; DDoS prevention; DDoS mitigation

PDF

Paper 22: Analyzing the Performance of Web-services during Traffic Anomalies

Abstract: Whether intentional or unintentional, service denial leads to substantial economic and reputation losses for users and the web-service provider. However, proper measures can be taken only if we understand and quantify the impact of such anomalies on the victim. In this paper, essential performance metrics distinguishing both transmission issues and application issues are discussed and evaluated. The legitimate and attack traffic was synthetically generated in a hybrid testbed using open-source software tools. The experiment covers two scenarios, representing DDoS attacks and flash events, with varying attack strengths to analyze the impact of anomalies on the server and the network. It is demonstrated that as the traffic surges, response time increases and the performance of the target web server degrades. The performance of the server and the network is measured using various network-level, application-level, and aggregate-level metrics, including throughput, average response time, number of legitimate active connections, and percentage of failed transactions.
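The aggregate metrics named above can be computed from a simple request log, as in the sketch below. The record layout (arrival time, response time, success flag) and the sample values are illustrative assumptions, not the paper's testbed format.

```python
# Aggregate server performance metrics from a request log. Each record
# is (arrival_time_s, response_time_s, succeeded).

def summarize(records):
    n = len(records)
    ok = [r for r in records if r[2]]
    span = max(t for t, _, _ in records) - min(t for t, _, _ in records)
    return {
        "throughput_rps": len(ok) / span if span > 0 else float(len(ok)),
        "avg_response_s": sum(d for _, d, _ in ok) / len(ok) if ok else 0.0,
        "failed_pct": 100.0 * (n - len(ok)) / n,
    }

# Illustrative log: three successful requests and one failed one.
log = [(0.0, 0.10, True), (1.0, 0.20, True), (2.0, 0.30, True), (4.0, 9.0, False)]
m = summarize(log)
```

Under attack, one would expect the average response time and failure percentage to rise while goodput (successful throughput) falls.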

Author 1: Avneet Dhingra
Author 2: Monika Sachdeva

Keywords: Denial of service; DDoS attack; flash event; performance metrics; throughput; response time

PDF

Paper 23: 4GL Code Generation: A Systematic Review

Abstract: Code generation is a longstanding goal in software engineering. It increases programming productivity by automating the transformation of models into actual source code. This process has been covered adequately for many programming languages. However, the topic has not been covered sufficiently with regard to Fourth Generation Languages (4GL), which are highly specialized in nature. The goal of this paper is to present a systematic literature review of 4GL code generation. The paper systematically reviews the studies published on the topic in the past 20 years in order to investigate trends, survey the approaches introduced, and identify potential new lines of research.

Author 1: Abdullah A H Alzahrani

Keywords: Software engineering; code transformation; 4GL; code generation; Model Driven Development (MDD); Extraction Transform Load (ETL); Model Driven Engineering (MDE); Rapid Application Development (RAD)

PDF

Paper 24: Measuring the Performance of Inventory Management System using Arena Simulator

Abstract: This study demonstrates an inventory management system for situations in which organizations face challenges due to the uncertain behavior of demand. The implementation was conducted using simulation to generate sampling experiments through computing and statistical methods. An important aspect of the simulation is understanding the level of satisfaction with the proposed system and its attributes. To assist organizations in controlling and managing inventory, the research provides a solution using the Arena simulation tool. The study first explains the use of simulation and reviews common simulation approaches; it then proposes a framework for an inventory control system, which is implemented in Arena. The main purpose of the experiment is to measure and analyze the applicability of the potential system using a simulation tool. The results indicate a successful implementation of the proposed framework for the inventory system. The model uses multiple inventory-system variables, such as demand, inventory stock, and realized cost; to reproduce real-world behavior, demand is modeled stochastically. The model was executed using a single-server queuing (M/M/1) approach and replicated several times. The results highlight the high performance of machines located at multiple places processing demand requests in each replication run. The study demonstrates the association between demand and inventory, and the proposed model can support manufacturing organizations in controlling and managing their inventory systems.
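Outside of Arena, the M/M/1 approach with replications can be sketched in a few lines using the Lindley recursion for waiting times. The arrival and service rates below are illustrative choices, not the study's parameters.

```python
import random

def mm1_avg_wait(lam, mu, n_customers, seed):
    """Average waiting time in queue for an M/M/1 system, via the
    Lindley recursion W[i+1] = max(0, W[i] + S[i] - A[i+1])."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)        # exponential service time
        interarrival = rng.expovariate(lam)  # exponential interarrival time
        wait = max(0.0, wait + service - interarrival)
    return total / n_customers

# Replicate the run several times, as the study does with Arena.
reps = [mm1_avg_wait(0.5, 1.0, 20000, seed) for seed in range(5)]
avg = sum(reps) / len(reps)
# Queueing theory predicts Wq = lam / (mu * (mu - lam)) = 1.0 here,
# so the replicated average should land in that neighbourhood.
```

Averaging across independent replications, as above, is the standard way to tame the variance of a single stochastic run.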

Author 1: Fawaz J. Alsolami

Keywords: Inventory management system; simulation; arena simulation tool; demand and inventory

PDF

Paper 25: Investigating the Awareness and Usage of Moodle Features at Hashemite University

Abstract: E-learning plays a vital role in the educational process, and learning management systems are an essential component of e-learning. The Moodle learning management system is widely used in Higher Education Institutions due to the rich features it provides to support the learning process. Standard Moodle comprises 21 features (14 activities and 7 resources). Little research has been carried out to examine these features in particular. In this research, the awareness and usage of Moodle features among faculty members at Hashemite University, Jordan, are investigated. A sample of 140 instructors was surveyed, and the responses were analyzed to find the overall awareness and usage of each feature. Furthermore, the relationship between awareness and usage was examined through correlation and regression analysis. The study revealed that instructors expressed the highest awareness of the File, Folder, Assignment, URL, and Quiz features, while the least awareness was of the SCORM package and IMS content package features. Regarding usage, the study identified the File, Folder, Assignment, and URL features as the most heavily used, whereas the least commonly used were IMS content package, SCORM package, Wiki, Glossary, Workshop, Database, Survey, External tool, and Choice. Moreover, the study statistically demonstrated a strong correlation between the awareness and usage of features: changes in the awareness of Moodle features are significantly associated with changes in their usage, features with low awareness tend to have low usage, and usage increases as awareness increases. The study should help Moodle administrators in Higher Education Institutions decide on the most important features to install in their customized instances of Moodle.
Furthermore, the study should help the responsible parties at Hashemite University identify the least commonly used and least well-known features, allowing them to focus on increasing the levels of awareness and usage of those features in a way that may reflect positively on the learning process.
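The correlation and regression analysis used in such studies boils down to a Pearson coefficient and an ordinary-least-squares fit, sketched below. The awareness and usage scores are made-up illustrations, not the survey data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ols_slope_intercept(x, y):
    """Simple linear regression of y on x (least squares)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

awareness = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical mean scores per feature
usage     = [1.2, 2.1, 2.9, 4.2, 5.1]
r = pearson_r(awareness, usage)
slope, intercept = ols_slope_intercept(awareness, usage)
```

A coefficient near 1 with a positive slope is the pattern the study reports: usage rising with awareness.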

Author 1: Haneen Hijazi
Author 2: Ghadeer Al-Kateb
Author 3: Eslam Alkhawaldeh

Keywords: Moodle; learning management system; features; awareness; usage; activities; resources; tools; correlation; regression

PDF

Paper 26: The Role of ICT Projects in Enterprises: Investments, Benefits and Evaluation

Abstract: Enterprises depend heavily on Information and Communication Technology (ICT) resources, which cover several of their business and operational activities. Enhancing operational capabilities, advancing the working environment, and improving employees' skills are major benefits provided by modern ICT resources, and organizations face real pressure to upgrade their ICT infrastructure with the latest developments in order to compete in the market. This research investigates the role of ICT projects in an organization from the investment, benefit, and evaluation perspectives. Based on the literature review, a conceptual framework is proposed to understand the relationship between ICT project investments, benefits, and evaluation. The main purpose of this study is to investigate the approach of enterprises toward ICT investments and to understand the types of ICT evaluation strategies practiced by organizations. The proposed framework is applied and validated through multiple case studies to confirm the list of variables collected from the literature review; this investigation helps certify the findings of the literature review through the selected case studies. The analysis of responses is presented in different formats to capture the current role and status of ICT project investment, benefits, and evaluation in different organizations. The outcome of this study identifies substantial factors and offers references for organizations building their own ICT investment and evaluation models; the types of ICT investment, benefit, and measurement models extracted in this research can act as a reference for organizations developing their own ICT investment policies.

Author 1: Khaled H. Alyoubi

Keywords: ICT projects; ICT investment; ICT benefits; ICT evaluation strategies

PDF

Paper 27: Serious Games Requirements for Higher-Order Thinking Skills in Science Education

Abstract: Education in the 21st century emphasises the mastery of higher-order thinking skills (HOTS) in the pursuit of developing globally competitive human capital. HOTS can be taught through science education. However, science education is considered very challenging, leaving students feeling less interested and less motivated. Students have also been found to be weak in mastering thinking skills, as shown by the decline in students' achievements in the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA) tests. This situation highlights the need to change the approach to teaching and learning science in line with current technological changes to meet the challenges of globalisation. Previous studies have shown that the use of serious games in learning can enhance students' thinking skills; thus, serious games can be used to develop higher-order thinking skills among students. This paper presents the results of a preliminary study using interviews, document analysis, and a questionnaire survey. The findings reveal several issues and challenges of teaching and learning in implementing HOTS in science education, in addition to game design requirements for science education. These requirements will be used to design a serious game implementing HOTS in science education.

Author 1: Siti Norliza Awang Noh
Author 2: Nor Azan Mat Zin
Author 3: Hazura Mohamed

Keywords: Higher-Order Thinking (HOT) skills; educational games; serious games; interface design; science education

PDF

Paper 28: Transitioning to Online Learning during COVID-19 Pandemic: Case Study of a Pre-University Centre in Malaysia

Abstract: In the last decade, online learning has grown rapidly. However, the outbreak of coronavirus (COVID-19) forced learning institutions to embrace online learning due to lockdowns and campus closures. This paper presents an analysis of students' feedback (n=354) from the Centre of Pre-University Studies (PPPU), Universiti Malaysia Sarawak (UNIMAS), Malaysia, during the transition to fully online learning. Three phases of online surveys were conducted to measure the learners' acceptance of the migration and to identify related problems. The results show increased positivity among the students in their view of teaching and learning in STEM during the pandemic. It was found that online learning is not a hindrance but a blessing for academic excellence in the face of a calamity like the COVID-19 pandemic. The suggested future research directions will be of interest to educators, academics, and the research community.

Author 1: Ahmad Alif Kamal
Author 2: Norhunaini Mohd Shaipullah
Author 3: Liyana Truna
Author 4: Muna Sabri
Author 5: Syahrul N. Junaini

Keywords: E-learning; STEM; coronavirus; pandemic; education technology; assessment; technology acceptance

PDF

Paper 29: Optimised Tail-based Routing for VANETs using Multi-Objective Particle Swarm Optimisation with Angle Searching

Abstract: Routing protocols for vehicular ad hoc networks (VANETs) are highly important, as they are essential for realizing the concept of the intelligent transportation system and several other applications. VANET routing requires awareness of the nature of the road and of various other parameters that affect the performance of the protocol. Optimising VANET routing yields optimal metrics, such as low end-to-end (E2E) delay, high packet delivery ratio (PDR), and low overhead. Since its performance is multi-objective in nature, it calls for multi-objective optimisation as well; most researchers have focused on a single objective or on a weighted average of objectives, and only a few studies have tackled true multi-objective optimisation of VANET routing. In this article, we propose a novel reactive routing protocol named tail-based routing, based on the concept of location-aided routing (LAR). We first re-define the request zone to reduce its lateral width with respect to the lateral distance between the source and destination, naming the result the tail. Next, we incorporate angle searching with crowding distance into multi-objective particle swarm optimisation (MO-PSO), yielding MO-PSO-angle. We then optimise tail-based routing using MO-PSO-angle and compare it with optimised LAR; the comparison demonstrates the superiority of the proposed protocol, with a best-case improvement at the optimisation point of 96% in PDR and 313% in E2E delay.
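Crowding distance, one ingredient of the MO-PSO-angle scheme mentioned above, can be sketched generically in NSGA-II style: boundary solutions in each objective get infinite distance, interior ones accumulate normalized neighbour gaps. This is a generic sketch; the paper's angle-searching variant is not reproduced here.

```python
def crowding_distance(objectives):
    """Crowding distance for a list of objective vectors (all minimized
    or all maximized; only ordering matters)."""
    n = len(objectives)
    m = len(objectives[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: objectives[i][k])
        lo, hi = objectives[order[0]][k], objectives[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep boundary points
        if hi == lo:
            continue
        for pos in range(1, n - 1):
            prev_v = objectives[order[pos - 1]][k]
            next_v = objectives[order[pos + 1]][k]
            dist[order[pos]] += (next_v - prev_v) / (hi - lo)
    return dist

# Illustrative front of three trade-off points, e.g. (delay, 1/PDR).
front = [[0.0, 2.0], [1.0, 1.0], [2.0, 0.0]]
d = crowding_distance(front)
```

Solutions with larger crowding distance sit in sparser regions of the front and are preferred when the archive must be truncated.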

Author 1: Mustafa Qasim AL-Shammari
Author 2: Ravie Chandren Muniyandi

Keywords: VANETs; routing; PDR; E2E delay; optimization; multi-objective particle swarm; location-based routing; MOPSO

PDF

Paper 30: Enhancement of Fundus Images for Diagnosing Diabetic Retinopathy using B-Spline

Abstract: Medical images, such as CT scans, MRI, X-ray, mammography, and fundus images, are commonly used in the medical diagnosis process; they improve the diagnosis of disease and reduce the chances of ambiguous perceptions. Medical images are mostly captured with low contrast, low brightness, and noise due to the intrinsic properties of the camera or radio waves, which disrupts diagnosis based on them, so enhancing these images can improve the diagnosis process. The proposed enhancement technique for fundus images is based on B-spline interpolation, in which the intensity transformation curve is defined by the curve's control points. The Messidor and DRIVE datasets for Diabetic Retinopathy (DR) are used to evaluate the proposed enhancement technique. The results show that the fundus images exhibit reasonable visual and quantitative enhancement in comparison with recent techniques, evidencing that the proposed approach yields substantial improvement and preserves the important information in fundus images while lowering noise.

Author 1: Tayba Bashir
Author 2: Khurshid Asghar
Author 3: Mubbashar Saddique
Author 4: Shafiq Hussain
Author 5: Inam Ul Haq

Keywords: B-spline; medical images enhancement; fundus images; diabetic retinopathy; interpolation

PDF

Paper 31: Developing Skills of Cloud Computing to Promote Knowledge in Saudi Arabian Students

Abstract: The present study aims to develop skills in cloud computing applications and the knowledge economy among university students by designing a participatory electronic learning environment. A sample was chosen from the students of the General Diploma in the Faculty of Education, King Khalid University, and divided into two groups: an experimental group of 15 students trained through the participatory e-learning environment, and a control group of 17 students trained through the Blackboard Learning Management System. Scales for cloud computing application skills and knowledge economy skills were developed. The Kolmogorov-Smirnov test was used to check the normality of the variables, and the Mann-Whitney test and Spearman correlation test were used to analyze the results. The analysis indicated that the design of a participatory e-learning environment based on the theory of communication contributed to improving the level of cloud computing application skills and knowledge economy skills. The results showed that a participatory e-learning environment based on the theory of communication significantly improves the level of cloud computing application skills and knowledge economy skills among students at Saudi Arabian universities. Future studies should focus on a blueprint for the context of the Saudi Arabian educational system.

Author 1: Ahmed Sadik
Author 2: Mohammed Albahiri

Keywords: Cloud computing applications; e-learning environment; higher education; knowledge economy

PDF

Paper 32: Image Detection Model for Construction Worker Safety Conditions using Faster R-CNN

Abstract: Many accidents occur on construction sites, leading to injury and death. According to the Occupational Safety and Health Administration (OSHA), falls, electrocutions, being struck by objects, and being caught in or between objects are the four main causes of worker deaths on construction sites. Many factors contribute to the increase in accidents, and personal protective equipment (PPE) is one of the defense mechanisms used to mitigate them. Thus, this paper presents an image detection model for workers' safety conditions based on PPE compliance, using the Faster Region-based Convolutional Neural Network (R-CNN) algorithm. The experiment was conducted using TensorFlow with 1,129 images from the MIT Places Database (for scene recognition) as the training dataset and 333 anonymised images from real construction sites for evaluation. The experimental results showed 276 of the images being detected as safe, with an average accuracy rate of 70%. The strength of this paper lies in detecting the combination of three PPE items worn by construction workers: hardhats, vests, and boots. In future work, the detection threshold and image sharpness (low resolution) will be the two main targets of refinement to improve the accuracy rate.
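Downstream of the detector, the safety decision reduces to a compliance rule over the detected PPE classes, as in the sketch below. The detections would come from a Faster R-CNN model; the label names and the 0.5 confidence threshold are illustrative assumptions, not the paper's settings.

```python
# A worker is labelled "safe" only when all three PPE classes are
# detected above a confidence threshold.

REQUIRED_PPE = {"hardhat", "vest", "boots"}

def is_safe(detections, threshold=0.5):
    """detections: list of (label, confidence) pairs for one worker."""
    found = {label for label, conf in detections if conf >= threshold}
    return REQUIRED_PPE <= found  # all required items present

worker_a = [("hardhat", 0.91), ("vest", 0.84), ("boots", 0.66)]
worker_b = [("hardhat", 0.88), ("vest", 0.31)]  # low-confidence vest, no boots
```

Raising the threshold trades missed detections for fewer false "safe" labels, which is why the paper flags the threshold as a refinement target.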

Author 1: Madihah Mohd Saudi
Author 2: Aiman Hakim Ma’arof
Author 3: Azuan Ahmad
Author 4: Ahmad Shakir Mohd Saudi
Author 5: Mohd Hanafi Ali
Author 6: Anvar Narzullaev
Author 7: Mohd Ifwat Mohd Ghazali

Keywords: PPE; OSH; accident; construction site; image detection; faster R-CNN

PDF

Paper 33: A Systematic Overview and Comparative Analysis of Service Discovery Protocols and Middlewares

Abstract: Context is a major source of communication: thanks to the progress of smart, context-aware systems, information can easily be gathered from the user's context, and a service directory further supports such systems in responding to requests sent by clients. In this paper, the authors give an overview of context-aware systems and their sensing capabilities, both location-based and beyond location, along with COIVA (a context-aware system). Eight discovery protocols (DEAPspace, DNS-SD, JXTA, RDP, LDAP, CORBA Trader, UDDI, and Superstring) are discussed, along with their functionalities, and compared to evaluate system performance and efficiency. In addition, six middleware platforms (CAMPUS, CASF, SeCoMan, CoCaMAAL, BDCaM, and FlexRFID) are compared with respect to factors such as architectural style, context abstraction/reasoning level, context-awareness level, contextual adaptation approaches, decision making, and programming model. The authors further group them into the sub-categories discussed in Section 4 and identify CoCaMAAL as a better middleware than the others.

Author 1: Jawad Hussain Awan
Author 2: Usman Naseem
Author 3: Shah Khalid Kha
Author 4: Nazar Waheed

Keywords: Pervasive computing; service discovery protocols; context-aware; middleware; privacy

PDF

Paper 34: Classification of Multiple Sclerosis Disease using Cumulative Histogram

Abstract: Multiple sclerosis (MS) is a chronic disease that affects different body parts, including the brain. Detection and classification of MS brain lesions is of immense importance to physicians for the administration of appropriate treatment. Thus, this study investigates an automated framework for the diagnosis and classification of MS lesions in the brain using magnetic resonance imaging (MRI). First, each patient's MRI images are converted from DICOM to TIF format, since MS lesions appear clearly in the white matter (WM). This is followed by brain tissue segmentation using a k-nearest neighbor classifier. Then, cumulative empirical distributions, or cumulative histograms (CH), of the segmented lesions are estimated, along with other texture/statistical features that exploit the difference in intensity between MS lesions and the surrounding tissue. Finally, these cumulative distributions are fused with the statistical features for the classification of MS using a k-means classifier. Experiments were conducted using transverse T2-weighted MR brain scans from 20 patients, which are highly sensitive in detecting MS plaques, with the gold-standard classification obtained from an experienced MS expert. Compared with the statistical features alone, the proposed fusion scored the highest accuracy, 98%, with a false-positive rate of 1%.
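The cumulative-histogram feature at the heart of this framework can be sketched in a few lines of NumPy: a binned empirical CDF of the segmented region's intensities, optionally fused with statistical features. The bin count and the fused statistics are illustrative choices, not taken from the paper.

```python
import numpy as np

def cumulative_histogram(region, bins=16, lo=0, hi=256):
    """Normalized cumulative histogram (empirical CDF) of a region's
    intensities, as a fixed-length feature vector."""
    counts, _ = np.histogram(region, bins=bins, range=(lo, hi))
    return np.cumsum(counts) / counts.sum()

# Toy "lesion" intensities; real input would be the segmented voxels.
lesion = np.array([10, 12, 200, 220, 230, 240], dtype=np.uint8)
ch = cumulative_histogram(lesion)
# Fuse the CDF with simple statistical features, as the framework does.
feature_vector = np.concatenate([ch, [lesion.mean(), lesion.std()]])
```

Because a CDF is non-decreasing and ends at 1, it gives a scale-stable descriptor of the intensity distribution regardless of region size.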

Author 1: Menna Safwat
Author 2: Fahmi Khalifa
Author 3: Hossam El-Din Moustafa

Keywords: Cumulative Histogram (CH); Magnetic Resonance Image (MRI); Multiple Sclerosis (MS); White Matter (WM)

PDF

Paper 35: A Categorization of Relevant Sequence Alignment Algorithms with Respect to Data Structures

Abstract: Sequence alignment is an active research subfield of bioinformatics. Sequence databases are growing rapidly and steadily; to cope with this growth, many efficient algorithms have been developed that depend on various data structures, which have demonstrated considerable efficacy in terms of run time and memory consumption. In this paper, we briefly outline existing methods applied to the sequence alignment problem. We then present a qualitative categorization of some remarkable algorithms based on their data structures, focusing on research published in the last two decades (from 2000 to 2020). We describe the data structures employed, expose some important algorithms using each, and discuss the potential strengths and weaknesses of these structures. This should guide biologists in deciding which program is best suited for a given purpose, and it also intends to highlight weak points that deserve the attention of bioinformaticians in future research.

Author 1: Hasna El Haji
Author 2: Larbi Alaoui

Keywords: Sequence alignment; data structures; bioinformatics

PDF

Paper 36: Bangla Optical Character Recognition and Text-to-Speech Conversion using Raspberry Pi

Abstract: Optical Character Recognition (OCR) technology is very helpful for visually impaired or illiterate persons who are unable to read text documents but need to access their content. In this paper, a camera-based assistive device is presented that allows visually impaired or illiterate people to understand Bangla text documents by listening to the contents of Bangla text images. The work mainly involves extracting the Bangla text from a Bangla text image and converting the extracted text to speech. It was implemented with a Raspberry Pi and a camera module, applying the Tesseract OCR engine, the Open Source Computer Vision library, and the Google Speech Application Program Interface. This work can help speakers of the Bangla language who are unable to read or who have a significant loss of visual sight.

Author 1: Aditya Rajbongshi
Author 2: Md. Ibadul Islam
Author 3: Al Amin Biswas
Author 4: Md. Mahbubur Rahman
Author 5: Anup Majumder
Author 6: Md. Ezharul Islam

Keywords: Optical character recognition; Bangla text; speech conversion; Raspberry Pi; camera module

PDF

Paper 37: IoT Enabled Air Quality Monitoring for Health-Aware Commuting Recommendation in Smart Cities

Abstract: The importance of air pollution control in smart cities has been realized by almost every part of society. The research community has been working in collaboration with industry to craft sensors for measuring different types of pollution levels in the environment. However, it is rarely possible to deploy sensors in all geographical areas, even though pollution levels must be measured in almost every inhabited part of the world in order to implement clean-environment policies. In unplanned areas in particular, the implementation of environmental policies faces problems because such areas lack communication infrastructure and cannot bear the cost of a huge number of fixed, static sensors. This work envisions a sensor-equipped, VANET-based system to monitor pollution levels in different areas of an unplanned city. The paper proposes an autonomous VANET system in which vehicles carry environmental sensors that collect data from an area at different intervals, process the data to transform it into information, and forward the information to a node that collects all the information and sends it to a server for further processing, either over the VANET or over some reliable network connection. Based on the collected data, this research further contributes health-aware commuting recommendations built on cost-effective monitoring of air quality.
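Turning sensed concentrations into the AQI mentioned in the keywords is a piecewise-linear interpolation over breakpoint tables, sketched below. The PM2.5 breakpoints follow the widely cited (pre-2024) US EPA table; they are quoted here as an assumption and should be checked against current local regulations.

```python
# Piecewise-linear AQI from a pollutant concentration, using the
# standard formula I = (Ihi - Ilo) / (Chi - Clo) * (C - Clo) + Ilo.

PM25_BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi), C in ug/m3
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 500.4, 301, 500),
]

def pm25_aqi(c):
    """AQI sub-index for a 24-hour PM2.5 concentration."""
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration out of table range")
```

A commuting recommender would compute such sub-indices per road segment and steer users toward routes with lower values.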

Author 1: Riaz UlAmin
Author 2: Muhammad Akram
Author 3: Najeeb Ullah
Author 4: Muhammad Ashraf
Author 5: Abdul Sattar Malik

Keywords: Unplanned areas; vehicular ad hoc networks; pollution monitoring; AQI; health-aware commuting recommendations

PDF

Paper 38: ER Model Partitioning: Towards Trustworthy Automated Systems Development

Abstract: In database development, a conceptual model is created in the form of an Entity-Relationship (ER) model and transformed into a relational database schema (RDS) to create the database. However, some important information represented in the ER model may not be transformed and represented in the RDS, causing a loss of information during the transformation process. With a view to preserving information, in our previous study we standardized the transformation process as a one-to-one and onto mapping from the ER model to the RDS; for this purpose, we modified the ER model and the transformation algorithm, resolving some deficiencies in them. Since the mapping was established using only a few real-world cases as a basis, a formal proof is necessary to validate the work for verification purposes. The ongoing research aims to create such a proof: it will show how a given ER model can be partitioned into a unique set of segments that can represent the ER model itself, and it will explain how these findings can be used to complete the proof in the future. The significance of the research for automating database development, teaching conceptual modeling, and using formal methods is also discussed.

Author 1: Dhammika Pieris
Author 2: M. C Wijegunesekera
Author 3: N. G. J Dias

Keywords: Conceptual model; Entity Relationship (ER) model; relational database schema; information preservation; transformation

PDF

Paper 39: Generic Framework Architecture for Verifying Embedded Components

Abstract: This dissertation presents a framework for the formal verification of standard embedded components such as bus protocols, microprocessors, memory blocks, various IP blocks, and software components, including model checking of embedded system components. The algorithms are modeled in SystemC and transformed into Promela (PROcess or PROtocol MEta LAnguage), with the integration of LTL (Linear Temporal Logic) properties extracted from state machines in order to reduce verification complexity. Thus, SysVerPml is dedicated not only to verifying generated properties but also to automating the integration of other properties into models when needed. In the following, we address how components are represented in the system design, which properties are appropriate for each component, and how those properties are verified.

Author 1: Lamia ELJADIRI
Author 2: Ismail ASSAYAD

Keywords: Algorithms; automation; embedded components; embedded systems; formal verification; framework; LTL properties; Promela; SystemC; SysVerPml; system design

PDF

Paper 40: Architectural Proposal for a Syllabus Management System using the ISO/IEC/IEEE 42010

Abstract: Efficiency in the academic and administrative procedures of higher education clearly marks a competitive advantage in terms of quality, which consists in continuous improvement toward the achievement of educational objectives. In our institution, syllabus handling is currently carried out manually, delaying many educational processes. Therefore, we propose to innovate through a software architecture approach based on the standard "ISO/IEC/IEEE 42010: Systems and software engineering - Architecture description" to describe the architecture of a Syllabus Management System. It is developed in three stages: analysis, design, and verification. This will allow professors to carry out their research, training, teaching, and the timely presentation of reports, with measurement of the skills and abilities achieved by students; it will also allow managers and academic authorities to make decisions based on the results obtained by the tool, improving the quality of the contents and the development flow of the syllabus.

Author 1: Anthony Meza-Luque
Author 2: Alvaro Fernández Del Carpio
Author 3: Karina Rosas Paredes
Author 4: Jose Sulla-Torres

Keywords: Management; architecture; software; syllabus; skills; ISO/IEC IEEE 42010

PDF

Paper 41: Preparing Graduates with Digital Literacy Skills Toward Fulfilling Employability Need in 4IR Era: A Review

Abstract: This systematic review aims to review and synthesize employer expectations of digital skills among graduates, the steps and measures taken by higher education institutions to prepare students, and how motivation can be harnessed among students to make themselves competitive and marketable toward fulfilling employability needs in the 4IR era. It was designed based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Articles published between January 2016 and 2020 were sought from three electronic databases: Science Direct, Scopus, and Web of Science; additional items gained from the Universiti Kebangsaan Malaysia repository were also considered for review. All papers were reviewed and a quality assessment was performed; twenty articles were finally selected. Data were extracted, organized, and analyzed using a narrative synthesis. The review identified three overarching themes: (1) employers' perspectives on their expectations of young graduates; (2) institutions' views on how they should prepare their students for the 4IR era; and (3) students' perspectives on how they can motivate themselves. The review provides insightful information on the digital literacy skills required of young graduates, the expectations of industry players, and how digital literacies can be developed in institutions.

Author 1: Khuraisah MN
Author 2: Fariza Khalid
Author 3: Hazrati Husnin

Keywords: Digital literacy; computer literacy; information literacy; employability

PDF

Paper 42: A Comparative Study of Eight Crossover Operators for the Maximum Scatter Travelling Salesman Problem

Abstract: The maximum scatter traveling salesman problem (MSTSP), a variation of the famous travelling salesman problem (TSP), is considered in this study. The aim of the problem is to maximize the minimum edge in a salesman’s tour that visits each city in a network exactly once. It is proven to be NP-hard and is considered very difficult to solve. To solve such problems efficiently, one must use heuristic/metaheuristic algorithms, and the genetic algorithm (GA) is one of them. Of the three operators in GAs, crossover is the most important, so we consider eight crossover operators in GAs for solving the MSTSP. These operators were originally designed for the TSP but can also be applied to the MSTSP after some modifications. The crossover operators are first illustrated manually through an example and then executed on some well-known TSPLIB instances of different types and sizes. The resulting comparative study clearly demonstrates the usefulness of the sequential constructive crossover operator for the MSTSP. Finally, a relative ranking of the crossover operators is reported.
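
The MSTSP objective the abstract describes (maximize the smallest edge in a closed tour) can be sketched in a few lines. The helper and distance matrix below are illustrative only, not the paper's GA:

```python
def tour_min_edge(tour, dist):
    """MSTSP objective: the smallest edge weight in a closed tour."""
    n = len(tour)
    return min(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Toy symmetric distance matrix for 4 cities (hypothetical values).
dist = [
    [0, 3, 5, 9],
    [3, 0, 7, 4],
    [5, 7, 0, 8],
    [9, 4, 8, 0],
]

# MSTSP prefers the tour whose worst (shortest) hop is largest.
print(tour_min_edge([0, 1, 2, 3], dist))  # → 3
print(tour_min_edge([0, 2, 1, 3], dist))  # → 4 (the better tour here)
```

A GA for the MSTSP would use this value as the fitness to maximize while crossover operators recombine permutations.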

Author 1: Zakir Hussain Ahmed

Keywords: Traveling salesman problem; maximum scatter; genetic algorithms; crossover operators; sequential constructive crossover

PDF

Paper 43: Review on Personality Types and Learning Styles in Team-based Learning for Information Systems Students

Abstract: Team-based learning (TBL) has become a preferred learning approach at the higher-education level. Many articles discuss the benefits and implementation of team-based learning, but few studies focus on the composition of team members or the effects of personality types and learning styles on it. This article sets out to analyze the existing literature on team-based learning implementation at the undergraduate level and how personality types and learning styles affect the learning process, and to explore these topics in the information systems field. Guided by the Okoli systematic review method, a systematic review of the Scopus, Web of Science and Association of Information Systems (AIS) databases was conducted. Results show that TBL received positive feedback from scholars, with issues only in the implementation process: the use of students’ personalities and learning styles, the roles of team members, TBL management in the classroom, the fact that TBL is not a “fit for all”, and the state of current TBL studies. Using personality and learning style instruments is one suggested way to improve implementation, but no detailed guidelines on how to use them are available yet. There is also a lack of studies on team-based learning in the information systems field.

Author 1: Muhammad Zul Aiman Zulkifli
Author 2: K.S. Savita
Author 3: Noreen Izza Arshad

Keywords: Team-based learning; personality type; learning style; undergraduate students

PDF

Paper 44: Data Fusion-Link Prediction for Evolutionary Network with Deep Reinforcement Learning

Abstract: The sophistication of the covert activities that criminal networks carry out with technology has proven very challenging for the law enforcement fraternity attempting to cripple those activities. In view of this, law enforcement agencies need to be equipped with criminal network analysis (CNA) technology that can provide advanced and comprehensive intelligence to uncover the primary members (nodes) and associations (links) within a network. The design of tools to predict links between members relies mainly on Social Network Analysis (SNA) models and machine learning (ML) techniques to improve the precision of the model. The primary challenge in constructing classical ML models, such as random forest (RF), with an acceptable level of accuracy is obtaining a dataset large enough to train the model. Obtaining such a dataset is a significant problem in the criminal network domain because, compared with social networks, criminal activities are stealthy and covert. The main objective of this research is to demonstrate that a link prediction model constructed with a relatively small dataset, augmented with data generated through self-simulation by leveraging deep reinforcement learning (DRL), can achieve higher precision in predicting links. The training of the model was further fused with metadata (i.e. environmental attributes such as criminal records, education level, age and police station proximity) in order to capture the real-life attributes of organised crime, which is expected to improve the performance of the model. To validate the results, a baseline model designed without metadata (CNA-DRL) was compared with a model incorporating metadata (MCNA-DRL).

Author 1: Marcus Lim
Author 2: Azween Abdullah
Author 3: NZ Jhanjhi

Keywords: Metadata; time-series network; social network analysis; criminal network; deep reinforcement learning

PDF

Paper 45: Underwater Wireless Sensor Network Route Optimization using BIHH Technique

Abstract: Underwater wireless sensor networks (UWSN) are established in water bodies such as oceans, seas and rivers to observe military activity, perform rescue operations and support resource mining. The sensor nodes communicate through acoustic channels. These nodes have limited battery life (energy) and narrow bandwidth, and the channels suffer from delays and noise, posing security threats. The state of the art offers various routing protocols that aim to utilize energy and bandwidth efficiently with low delay and to provide security against black hole attacks. However, these methods do not adequately enhance security or utilize bandwidth efficiently in a mobile environment, and as a result the delay also increases. In this paper a secure, bandwidth-efficient routing path is obtained using the Bellman Inora Hex Hamming (BIHH) technique, which not only improves routing performance but also saves energy. The presented approach is validated with a network simulator.
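
The Hamming code named among the BIHH ingredients is a standard error-correcting technique. As a hedged illustration of that building block only (not the paper's exact scheme), a Hamming(7,4) encoder with single-bit correction looks like:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into 7 bits, parity at positions 1,2,4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Fix any single-bit error via syndrome decoding; return the codeword."""
    c = c[:]
    s = 0
    for i, bit in enumerate(c, start=1):
        if bit:
            s ^= i            # XOR of 1-based positions of set bits
    if s:                     # non-zero syndrome = position of the flipped bit
        c[s - 1] ^= 1
    return c

code = hamming74_encode([1, 0, 1, 1])
corrupted = code[:]
corrupted[2] ^= 1             # flip one bit in transit
print(hamming74_correct(corrupted) == code)  # → True
```

In a UWSN context such a code lets a receiver repair isolated bit flips caused by acoustic channel noise without retransmission.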

Author 1: Turki Ali Alghamdi

Keywords: Sensor nodes; energy; routing; black hole; hamming code; hex code

PDF

Paper 46: Application of Homomorphic Encryption on Neural Network in Prediction of Acute Lymphoid Leukemia

Abstract: Machine learning is now a widely used mechanism, and applying it in sensitive fields such as medical and financial data has only made things easier. Accurate diagnosis of cancer is essential to treating it properly, yet medical tests for cancer are currently quite expensive and unavailable in many parts of the world. CryptoNets demonstrates the use of neural networks over data encrypted with homomorphic encryption. This project demonstrates the use of homomorphic encryption for outsourcing neural-network predictions in the case of Acute Lymphoid Leukemia (ALL). Using CryptoNets, patients or doctors in need of the service can encrypt their data with homomorphic encryption and send only the encrypted message to the service provider (hospital or model owner). Since homomorphic encryption allows the provider to operate on the data while it is encrypted, the provider can make predictions using a pre-trained neural network while the data remains encrypted throughout the process, finally sending the prediction to the user, who can decrypt the result. The service provider thus gains no knowledge of the data used or of the result, since everything is encrypted throughout. Our work proposes a neural network model able to predict Acute Lymphoid Leukemia (ALL) with approximately 80% accuracy using the C_NMC Challenge dataset. Before building our own model, we pre-processed the dataset using a different approach. We then ran various machine learning and neural network models, including VGG16, SVM, AlexNet and ResNet50, and compared their validation accuracies with our own model, which ultimately gives better accuracy than the rest of the models used. We then use our own pre-trained neural network to make predictions using CryptoNets. We achieved an encrypted prediction accuracy of about 78%, close to the 80% validation accuracy of our own CNN model for the prediction of Acute Lymphoid Leukemia (ALL).
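
CryptoNets itself relies on a leveled homomorphic scheme; as an illustration only of the homomorphic property the abstract describes (computing on data while it stays encrypted), here is a toy additively homomorphic Paillier sketch with deliberately tiny primes, nothing like production parameters:

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

# Toy Paillier keypair (tiny primes for illustration only).
p, q = 61, 53
n = p * q                 # public modulus
n2 = n * n
g = n + 1                 # standard choice of generator
lam = lcm(p - 1, q - 1)   # private key λ
mu = pow(lam, -1, n)      # private key μ = λ⁻¹ mod n (valid for g = n+1)

def encrypt(m, r):
    """c = g^m · r^n mod n², with r coprime to n (randomness)."""
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

c1, c2 = encrypt(42, 23), encrypt(100, 57)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
print(decrypt((c1 * c2) % n2))  # → 142
```

Schemes used by CryptoNets additionally support (a limited number of) multiplications, which is what lets whole neural-network layers run over ciphertexts.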

Author 1: Ishfaque Qamar Khilji
Author 2: Kamonashish Saha
Author 3: Jushan Amin Shonon
Author 4: Muhammad Iqbal Hossain

Keywords: CryptoNets; neural network; Acute Lymphoid Leukemia (ALL); homomorphic

PDF

Paper 47: On the Digital Applications in the Thematic Literature Studies of Emily Dickinson’s Poetry

Abstract: Thematic studies in literature have traditionally been based on philological methods supported by personal knowledge and evaluation of the texts. A major problem with studies in this tradition is that they are neither objective nor replicable. With the development of digital technologies and applications, it is now possible for theme analysis of literary texts to be based, at least partially, on objective, replicable methods. To address issues of objectivity and replicability in the thematic classification of literary text, this study proposes a computational model for theme analysis of the poems of Emily Dickinson using cluster analysis based on a vector space model (VSM) representation of the lexical content of the selected texts. The results indicate that the proposed model yields usable results for understanding the thematic structure of Dickinson’s texts, and does so in an objective and replicable way. Although the results of the analysis broadly agree with existing, philologically based critical opinion about the thematic structure of Dickinson’s work, the contribution of this study is to give that critical opinion a scientific, objective, and replicable basis. The methodology is mathematically based, clear, objective, and replicable. Finally, the results have positive implications for the use of computational models in literary criticism and literature studies: the success of computer-aided approaches in addressing the field’s inherent problems of subjectivity and selectivity argues against theoretical objections to involving computers and digital applications in the study of literature.
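
The VSM representation underlying such a cluster analysis can be sketched with term-frequency vectors and cosine similarity. The mini-corpus below is hypothetical, standing in for the study's actual poem texts:

```python
from collections import Counter
from math import sqrt

def vectorize(text):
    """Bag-of-words term-frequency vector (the VSM representation)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = lambda x: sqrt(sum(c * c for c in x.values()))
    return dot / (norm(u) * norm(v))

# Hypothetical mini-corpus standing in for poem texts.
docs = {
    "d1": "hope is the thing with feathers",
    "d2": "hope perches in the soul",
    "d3": "because i could not stop for death",
}
v = {k: vectorize(t) for k, t in docs.items()}

# A clustering step would group documents by pairwise similarity;
# here d1 and d2 share lexical content, d3 shares none with d1.
print(cosine(v["d1"], v["d2"]) > cosine(v["d1"], v["d3"]))  # → True
```

Clustering algorithms (k-means, hierarchical, etc.) then operate on these pairwise similarities to recover thematic groups.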

Author 1: Abdulfattah Omar

Keywords: Cluster analysis; digital applications; Emily Dickinson; lexical content; philological methods; thematic studies; Vector Space Model (VSM)

PDF

Paper 48: Situational Modern Code Review Framework to Support Individual Sustainability of Software Engineers

Abstract: Modern Code Review (MCR) is a socio-technical practice to improve source code quality and ensure successful software development. It involves the interaction of software engineers from different cultures and backgrounds. As a result, a variety of unknown situational factors arise that affect the individual sustainability of MCR team members and reduce their productivity by causing mental distress and fear of unknown and varying situations. The MCR team therefore needs to be aware of the relevant situational factors, yet it lacks the competency to identify them. This study conducts a Delphi survey to investigate an optimal and well-balanced set of MCR-related situational factors, and to recognize and prioritize the factors that most influence MCR activities. The findings report 21 situational factors, 147 sub-factors, and 5 categories. Based on the Delphi survey results, the identified situational factors are transformed into a situational MCR framework. This study may help support the individual sustainability of MCR team members by making them aware of the situations that can occur and vary during the MCR process, help MCR teams improve their productivity and remain in the industry longer, and support software researchers who want to contribute to situational software engineering from varying software engineering contexts.

Author 1: Sumaira Nazir
Author 2: Nargis Fatima
Author 3: Suriayati Chuprat

Keywords: Situations; situational factors; Modern Code Review (MCR); sustainable software engineer; situational software engineering

PDF

Paper 49: Solving Travelling Salesman Problem (TSP) by Hybrid Genetic Algorithm (HGA)

Abstract: The Traveling Salesman Problem (TSP) is easy to state and describe but very hard to solve. No known algorithm can find the optimal solution in polynomial time, so it is an NP-complete problem. The TSP is related to many other problems, because the techniques used to solve it can easily be applied to other hard optimization problems, allowing its results to carry over to them. Many techniques have been proposed and developed to solve such problems, including genetic algorithms. The aim of this paper is to improve the performance of genetic algorithms on the TSP by proposing and developing a new crossover mechanism and a local search algorithm, called the Search for Neighboring Solution Algorithm, with the goal of producing a better solution in a shorter time and in fewer generations. Results on standard TSP benchmarks of different sizes show that the proposed algorithms using the proposed crossover mechanism find the optimal solution (100%) for many of these benchmarks, and come within 96%-99% of the optimal solution for the others. A comparison between the proposed crossover mechanism and other known crossover mechanisms shows that it improves the quality of the solutions. The proposed local search algorithm and crossover mechanism produce superior results compared to previously proposed local search algorithms and crossover mechanisms, yielding near-optimal solutions in less time and fewer generations.
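
The paper's new crossover mechanism is its own contribution; for orientation, a simplified variant of the classic order crossover (OX) for permutation tours, which such mechanisms build on, can be sketched as:

```python
def order_crossover(p1, p2, i, j):
    """Simplified OX variant: copy a slice from parent 1, then fill the
    remaining positions left-to-right in parent-2 order. (A standard
    crossover for permutation tours, not the paper's proposed operator.)"""
    n = len(p1)
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [city for city in p2 if city not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

p1 = [1, 2, 3, 4, 5, 6, 7]
p2 = [3, 7, 5, 1, 6, 2, 4]
print(order_crossover(p1, p2, 2, 5))  # → [7, 1, 3, 4, 5, 6, 2]
```

Any valid TSP crossover must, like this one, produce a child that is still a permutation of the cities, which is why TSP operators differ from generic bit-string crossover.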

Author 1: Ali Mohammad Hussein Al-Ibrahim

Keywords: Traveling Salesman Problem (TSP); NP-Complete problem; genetic algorithms; local Search algorithm; crossover mechanism; neighboring solution algorithm

PDF

Paper 50: Efficiency and Performance of Optimized Robust Controllers in Hydraulic System

Abstract: Hydraulic systems are common in real-time applications including heavy machinery, aircraft systems, and transportation. Real-time applications, however, notoriously suffer from nonlinearities due to unavoidable mechanical structures, so control systems are employed to manipulate the plant and minimize the effects of these nonlinearities. In this paper, several control approaches are implemented on an Electro-Hydraulic Actuator (EHA) system: the widely used Proportional-Integral-Derivative (PID) controller, the reinforced Fractional-Order PID (FOPID) controller, the Sliding Mode Controller (SMC), and the enhanced hybrid SMC-PID controller. To obtain proper parameters for each controller, the Particle Swarm Optimization (PSO) technique is applied. The output data are then analysed using performance indices covering energy consumption and the error produced: Root Mean Square Error/Voltage (RMSE/V), Integral Square Error/Voltage (ISE/V), Integral Time Square Error/Voltage (ITSE/V), Integral Absolute Error/Voltage (IAE/V), and Integral Time Absolute Error/Voltage (ITAE/V). The results show that, in terms of error and voltage, the hybrid SMC-PID controller generates better outcomes with respect to tracking capability and energy usage.

Author 1: Chong Chee Soon
Author 2: Rozaimi Ghazali
Author 3: Shin Horng Chong
Author 4: Chai Mau Shern
Author 5: Yahaya Md. Sam
Author 6: Zulfatman Has

Keywords: Robust Control Design; optimization; tracking efficiency analysis; controller effort analysis; electro-hydraulic actuator system

PDF

Paper 51: Automation of Traditional Exam Invigilation using CCTV and Bio-Metric

Abstract: In education, whilst e-learning has played a significant role over the last few years due to its flexibility and remote delivery, the majority of courses still rely upon traditional approaches to learning because online examinations and assessments in e-learning lack integrity and security. The traditional examination system is thus considered superior to e-examination, but it has limitations: an excessive number of physical resources (invigilators) is required, and malpractice by students during exams is frequent. The objective of this paper is to develop a framework for the traditional pen-and-paper examination system in which the number of invigilators is substantially reduced and student malpractice during exams is abolished. To implement the proposed system, educational institutions must maintain a database, built with the Parallax Data Acquisition tool (PLX-DAQ), that incorporates biometric information for all students. Before entering the examination hall, examinees go through an authentication process via a biometric reader attached at the entrance of each exam hall. During the examination, examinees are monitored and controlled by an invigilator from a distance through 360-degree Closed-Circuit Television (CCTV) cameras as well as ultra-sensitive microphones and speakers: the CCTV cameras monitor physical malpractice and the microphones detect vocal malpractice. Only one invigilator is required for n exam halls in this process. Communication between students and the invigilator is handled by microphones and speakers installed in both the exam halls and the invigilator room. This model will wipe out malpractice during examinations and offers a cost-effective, simple and secure solution to the complex traditional invigilation process.

Author 1: MD Jiabul Hoque
Author 2: Md. Razu Ahmed
Author 3: Md. Jashim Uddin
Author 4: Muhammad Mostafa Amir Faisal

Keywords: Automation in invigilation; bio-metric authentication; CCTV monitoring; e-assessment; Parallax Data Acquisition Tool (PLX-DAQ); traditional invigilation

PDF

Paper 52: Usability Evaluation of Open Source Learning Management Systems

Abstract: Advancements in Information and Communications Technology have enabled learning to be conducted online, frequently through Learning Management Systems (LMS). The use of LMS as learning tools in the present Internet age is seen as an important solution to major problems faced by higher-education instructors, students and universities. However, quality- and usability-related information about such widely used learning management systems is rarely encountered in the literature. The main objective of this study is to evaluate the system quality of the five most widely used open source learning management systems — Moodle, ATutor, Eliademy, Forma LMS and Dokeos — through the external characteristics of the ISO/IEC 9126 quality standards evaluation model, with two experts. The ISO/IEC 9126 quality model is adequate for evaluating important system quality metrics. The results highlight in detail a set of usability and quality issues associated with the external characteristics of each open LMS, which require further attention from developers, educators and researchers to improve the quality of learning.

Author 1: Seren Basaran
Author 2: Rafia Khalleefah Hamad Mohammed

Keywords: E-learning; ISO/IEC 9126; learning management systems; quality model; usability evaluation

PDF

Paper 53: Evaluation Criteria for RDF Triplestores with an Application to Allegrograph

Abstract: Since its launch as the standard language of the semantic web, the Resource Description Framework (RDF) has gained enormous importance in many fields. This has led to the appearance of a variety of data systems to store and process RDF data. To help users identify the RDF data stores best suited to their needs, we establish a list of evaluation and comparison criteria for existing RDF management systems, also called triplestores. This is the first work addressing such a topic for triplestores. The criteria list highlights various aspects and is not limited to particular stores; it covers all types, including relational, native, centralized, distributed and big data stores. Furthermore, the list is established taking into account relevant issues in accordance with triplestore tasks, with respect to the main concerns of RDF data storage, RDF data processing, performance, distribution and ease of use. As a case study we apply the evaluation criteria to the graph RDF triplestore AllegroGraph.
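
For readers unfamiliar with triplestores, the core abstraction being evaluated can be sketched as an in-memory store of (subject, predicate, object) triples with wildcard matching; real systems such as AllegroGraph add indexing, SPARQL querying, transactions and distribution on top (the data below is hypothetical):

```python
# A minimal in-memory triplestore sketch: triples are (subject, predicate,
# object); a query is a pattern in which None acts as a wildcard.
triples = {
    ("ex:Alice", "ex:knows", "ex:Bob"),
    ("ex:Alice", "ex:age", "30"),
    ("ex:Bob", "ex:knows", "ex:Carol"),
}

def match(pattern, store):
    """Return every triple consistent with the (s, p, o) pattern."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Who does Alice know?
print(match(("ex:Alice", "ex:knows", None), triples))
# → [('ex:Alice', 'ex:knows', 'ex:Bob')]
```

Criteria such as storage layout, query performance and distribution amount to how well a real store implements and scales exactly this matching operation.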

Author 1: Khadija Alaoui
Author 2: Mohamed Bahaj

Keywords: RDF; RDFS; SPARQL; triplestore; big data; NoSQL; AllegroGraph

PDF

Paper 54: A Review on Honeypot-based Botnet Detection Models for Smart Factory

Abstract: Since the Swiss Davos Forum in January 2017, the most searched keywords related to the Fourth Industrial Revolution have been AI technology, big data, and IoT. In particular, the manufacturing industry seeks to advance information and communication technology (ICT) to build smart factories that integrate the management of production processes, safety, procurement, and logistics services. Such smart factories can effectively address frequent accidents and high fault rates. An increasing number of incidents in smart factories caused by botnet DDoS attacks have been reported in recent times, so Internet of Things security is of paramount importance in this emerging field. In response to these cyberattacks, smart factory security needs to gain the ability to defend against botnets. Various security solutions have been proposed, yet the emerging approaches to IoT security have not effectively dealt with IoT malware, also known as zero-day attacks. Botnet detection using honeypots has recently been studied in a few works and shows potential to detect botnets effectively in some applications. Honeypot-based detection intentionally creates a resource within a network as a trap to attract botnet attackers, in order to closely monitor and record their behavior in a log file; the log is then analyzed by machine learning, and responding actions are generated against the botnet attack. In this work, a literature review provides insight into two main areas: 1) botnets and their severity in cybersecurity, and 2) botnet attacks on smart factories and the potential of the honeypot approach as an effective solution. Notably, a comparative analysis of the effectiveness of honeypot detection in various applications is provided, and the application of honeypots in smart factories is reviewed.

Author 1: Lee Seungjin
Author 2: Azween Abdullah
Author 3: NZ Jhanjhi

Keywords: IoT; smart factory; honeypot; Botnets; detection; security; model

PDF

Paper 55: Routing Protocol based on Floyd-Warshall Algorithm Allowing Maximization of Throughput

Abstract: A routing protocol based on the Floyd-Warshall algorithm which allows maximization of throughput is proposed. The metric function in the proposed routing protocol is throughput, including not only sent packets but also retransmitted packets, in order to improve the effectiveness and efficiency of the network in question. Through simulation studies, the proposed routing protocol is found to be superior, from the point of view of maximizing throughput, to the conventional Open Shortest Path First (OSPF) protocol, which uses the Dijkstra algorithm for shortest-path determination. A routing protocol for Virtual Private Networks (VPN) in an Autonomous System (AS) based on maximizing throughput is also proposed. Through a comparison between the proposed protocol and the widely used OSPF, it is further found that the time required to transmit packets from one node to another with the proposed protocol is 56.54% less than with OSPF.
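
The throughput-maximizing idea maps naturally onto the widest-path (maximin) variant of Floyd-Warshall: instead of minimising summed cost, each relaxation maximises the bottleneck capacity along the path. The sketch below uses a hypothetical capacity matrix and illustrates the general technique, not the paper's exact metric:

```python
INF = float("inf")

def widest_paths(cap):
    """Floyd-Warshall, maximin variant: w[i][j] becomes the largest
    achievable bottleneck (min edge throughput) over any i→j path."""
    n = len(cap)
    w = [row[:] for row in cap]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                w[i][j] = max(w[i][j], min(w[i][k], w[k][j]))
    return w

# cap[i][j] = link throughput (0 = no direct link); diagonal unlimited.
cap = [
    [INF, 10, 2, 0],
    [10, INF, 0, 5],
    [2, 0, INF, 9],
    [0, 5, 9, INF],
]
w = widest_paths(cap)
print(w[0][3])  # → 5  (route 0→1→3 beats 0→2→3, whose bottleneck is 2)
```

Compare with the classic shortest-path recurrence `w[i][j] = min(w[i][j], w[i][k] + w[k][j])`: only the combining operators change, which is why Floyd-Warshall adapts cleanly to a throughput metric.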

Author 1: Kohei Arai

Keywords: Network routing protocol; virtual private network; autonomous system; open shortest path first; Floyd-Warshall algorithm; Dijkstra algorithm; throughput

PDF

Paper 56: Knowledge Sharing Framework for Modern Code Review to Diminish Software Engineering Waste

Abstract: Modern Code Review (MCR) is a quality assurance technique that involves massive interaction between MCR team members. Presently, MCR team members are confronted with the problem of waiting waste, which results in psychological distress and project delays. The MCR team therefore needs effective knowledge sharing during MCR activities to avoid the circumstances that lead team members into a waiting state. The objective of this study is to develop a knowledge sharing framework for MCR teams to reduce waiting waste. The research methodology used is the Delphi survey, intended to produce a finalized list of knowledge sharing factors and to recognize and prioritize the knowledge sharing factors that most influence MCR activities. The results report 22 knowledge sharing factors, 135 sub-factors, and 5 categories. Based on the Delphi survey results, a knowledge sharing framework for MCR has been developed. The study is useful for software engineering researchers wishing to extend the research, and can help MCR team members apply the designed framework to increase knowledge sharing and diminish waiting waste.

Author 1: Nargis Fatima
Author 2: Sumaira Nazir
Author 3: Suriayati Chuprat

Keywords: Knowledge sharing; modern code review; software engineering wastes; waiting waste; lean software development

PDF

Paper 57: A Framework for Semantic Text Clustering

Abstract: Existing approaches to text clustering are agglomerative, divisive or based on frequent itemsets. However, most of the suggested solutions do not take the semantic associations between words into account, and documents are regarded only as bags of unrelated words. Indeed, traditional text clustering methods usually focus on the frequency of terms in documents to create connected homogeneous clusters without considering the associated semantics, which of course leads to inaccurate clustering results. Accordingly, this research aims to take the meanings of text phrases into account in the clustering process in order to make the best use of documents. The semantic web provides useful techniques for this purpose; the goal is to exploit them, in particular the Resource Description Framework (RDF), to represent textual data as triplets. To obtain a more effective clustering method, we provide a semantic representation of the textual data on which the clustering process is based. This study also implements other techniques within the clustering process, such as ontology representation, to manipulate and extract meaningful information using RDF, RDF Schema (RDFS), and the Web Ontology Language (OWL). Since text clustering is an indispensable task for better exploitation of documents, considering semantics in the clustering process allows the more closely related groups in a document collection to be identified efficiently. To this end, the proposed framework combines multiple techniques into an efficient approach uniting machine learning tools with semantic web principles. The framework supports RDF representation of documents, clustering, topic modeling, cluster summarization, and information retrieval based on RDF querying and reasoning tools. It also highlights the advantages of using semantic web techniques in clustering, topic modeling and knowledge extraction based on querying, reasoning and inferencing.

Author 1: Soukaina Fatimi
Author 2: Chama EL Saili
Author 3: Larbi Alaoui

Keywords: Text clustering; similarity measure; ontology; semantic web; RDF; RDFS; OWL; reasoning; inferencing rules; SPARQL; topic modeling; summarization

PDF

Paper 58: Comparative Study of Time Series and Deep Learning Algorithms for Stock Price Prediction

Abstract: Stock price prediction has always been an intriguing research problem in the financial domain. In the past decade, various methodologies based on classical time series, machine learning, deep learning and hybrid models combining these algorithms have been proposed, with reasonable effectiveness in predicting stock prices, and there is considerable research comparing the performance of these models. However, the literature review raises a concern: the lack of a formal methodology for comparing the performance of the different models. For example, the lack of guidance on the generalizability of time series models and optimised deep learning models is concerning, as is the lack of guidance on the general fit of models, which can vary with the forecasting requirements for a stock price. This study aims to establish a formal methodology for comparing different types of time series forecasting models on a like-for-like basis. The effectiveness of deep learning and time-series models is evaluated by predicting the closing prices of three banking stocks. The generalizability of the models is compared, and the impact of the forecasting period on performance is evaluated on a common metric. In most previous studies, forecasting was done for periods of 1 day, 5 days or 31 days. To minimize the impact of stock market volatility due to political and economic shocks in both international and domestic domains, forecasting periods of up to 2 days for the short term and 5 days for the long term are considered. The evidence shows that the deep learning models outperform the time series models in generalizability as well as in short- and long-term forecasts.

Author 1: Santosh Ambaprasad Sivapurapu

Keywords: Time series; deep learning; ARIMA; VAR; LSTM; GRU; CNN 1D; genetic algorithm; Tree Structured Parzen Estimator (TPE)

PDF

Paper 59: Enhanced Insertion Sort by Threshold Swapping

Abstract: Sorting is an essential operation that arranges data in a specific order, such as ascending or descending, for numeric and alphabetic data. There are various sorting algorithms for each situation. For applications that have incremental data and require an adaptive sorting algorithm, insertion sort is the most suitable choice, because it can handle each element without re-sorting the whole dataset. Insertion sort may also be the most popular sorting algorithm because of its simple and straightforward steps. However, its performance degrades on large datasets. In this paper, an algorithm is designed to empirically improve the performance of insertion sort, especially for large datasets. The proposed approach is stable, adaptive and very simple to translate into programming code, and can easily be modified to obtain in-place variations while maintaining its main features. Our experimental results show that the proposed algorithm is very competitive with classic insertion sort: the time taken to sort a given dataset was reduced by 23%, regardless of the dataset’s size, and the advantage of the enhanced algorithm grows with the size of the dataset. The algorithm requires no additional resources and does not need to re-sort the whole dataset every time a new element is added.
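
For reference, the classic insertion sort that the proposed enhancement targets (the paper's threshold-swapping step is its own contribution and is not shown here):

```python
def insertion_sort(a):
    """Classic insertion sort: each new element is shifted left into its
    place within the already-sorted prefix. O(n²) worst case, but adaptive:
    nearly-sorted input approaches O(n)."""
    a = a[:]
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift larger elements one slot right
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

Its adaptivity is what makes it attractive for incremental data: inserting one new element into an already-sorted list costs only a single inner-loop pass.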

Author 1: Basima Elshqeirat
Author 2: Muhyidean Altarawneh
Author 3: Ahmad Aloqaily

Keywords: Sorting; design of algorithm; insertion sort; enhanced insertion sort; threshold swapping

PDF

Paper 60: A Comparison of Data Sampling Techniques for Credit Card Fraud Detection

Abstract: Credit card fraud is a tough reality that continues to constrain the financial sector, and its detrimental effects are felt across the entire financial market. Criminals are continuously on the lookout for ingenious methods of fraud and are a real threat to security. Therefore, there is a need for early detection of fraudulent activity to preserve customer trust and safeguard business. A major challenge in designing fraud detection systems is the class imbalance in the data: genuine transactions vastly outnumber fraudulent ones, which typically account for less than 1% of total transactions. This is an important area of study, as the positive (fraudulent) cases are hard to distinguish, and become even harder with the inflow of data, where the representation of such cases decreases further. This study trained predictive models, including an Artificial Neural Network (ANN), Gradient Boosting Machine (GBM), and Random Forest (RF), on different sampling methods. Random Under Sampling (RUS), the Synthetic Minority Over-sampling Technique (SMOTE), the Density-Based Synthetic Minority Over-Sampling Technique (DBSMOTE), and SMOTE combined with Edited Nearest Neighbour (SMOTEENN) were used for all models. The findings indicate promising results with SMOTE-based sampling techniques. The best recall score obtained was 0.81, achieved by the DRF classifier under the SMOTE sampling strategy; the precision score for this classifier was 0.86. A Stacked Ensemble was trained on all the sampled datasets and found to have the best average performance, at 0.78, showing promise in the detection of fraudulent transactions across most of the sampling strategies.
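Of the sampling methods compared, Random Under Sampling (RUS) is the simplest to illustrate: keep every minority-class (fraud) sample and randomly discard majority-class samples until the classes balance. A minimal plain-Python sketch, assuming labels 1 = fraud and 0 = genuine (the label convention is our assumption for the example, not the paper's):

```python
import random

def random_undersample(X, y, seed=0):
    """Random Under Sampling: keep all minority (label 1) samples and
    randomly drop majority (label 0) samples until classes are balanced."""
    rng = random.Random(seed)
    minority = [i for i, label in enumerate(y) if label == 1]
    majority = [i for i, label in enumerate(y) if label == 0]
    kept = rng.sample(majority, len(minority)) + minority
    kept.sort()  # preserve original ordering of the retained samples
    return [X[i] for i in kept], [y[i] for i in kept]
```

SMOTE and its variants go in the opposite direction, synthesising new minority samples instead of discarding majority ones; library implementations exist, but the balancing goal is the same.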

Author 1: Abdulla Muaz
Author 2: Manoj Jayabalan
Author 3: Vinesh Thiruchelvam

Keywords: Data imbalance; credit card fraud; sampling techniques

PDF

Paper 61: Usability Evaluation of a Tangible User Interface and Serious Game for Identification of Cognitive Deficiencies in Preschool Children

Abstract: Detecting deficits in reading and writing literacy skills has been of great interest to the scientific community as a way to correlate executive functions with future academic skills. In the present study, a prototype of a serious multimedia runner-type game, Play with SID, was developed to detect deficiencies in cognitive abilities in preschool children (sustained attention, memory, working memory, visuospatial abilities, and reaction time) before they learn to read and write. Usability tests are used in Human-Computer Interaction to determine the feasibility of a system; they serve as proof of concept before the development of real systems. The aim of this paper was to evaluate the usability of the interface of the serious game, as well as of the tangible user interface, a teddy bear with motion sensors. A usability study using the Wizard of Oz technique was conducted with 18 neurotypical preschool participants, ages 4 to 6. Concepts related to interactivity (interaction, fulfillment of the activity objective, reaction to stimuli, and game time without distraction) were observed, eye-tracking was used to assess attention, and the System Usability Scale (SUS) was used to measure usability. According to the usability evaluation (confidence interval between 74.74% and 90.47%), the prototype has good to excellent usability, with no statistically significant differences between the age groups. The observed concept with the highest score was game time without distraction, a characteristic that will allow sustained attention to be evaluated. We also found that use of the tangible interface allows observation of laterality development, which will be added to the design of the serious game. Observation-based usability assessment techniques are useful for obtaining information from participants whose communication skills are still developing and whose ability to express their perceptions in detail is limited.
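The SUS measure used in this study follows a standard scoring rule: ten items rated 1 to 5, where odd-numbered (positively worded) items contribute (rating - 1), even-numbered items contribute (5 - rating), and the summed contributions are multiplied by 2.5 to give a 0-100 score. A minimal sketch of that rule:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.
    Odd items add (rating - 1); even items add (5 - rating);
    the total is scaled by 2.5 onto a 0-100 range."""
    assert len(responses) == 10
    total = 0
    for idx, r in enumerate(responses, start=1):
        total += (r - 1) if idx % 2 == 1 else (5 - r)
    return total * 2.5
```

A score around 68 is conventionally treated as average usability, which puts the study's 74.74-90.47% interval in the good-to-excellent band reported.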

Author 1: Sánchez-Morales. A
Author 2: Durand-Rivera J.A
Author 3: Martínez-González C.L

Keywords: User interface; wizard of Oz; usability; HCI; input device

PDF

Paper 62: Comparative Analytics of Classifiers on Resampled Datasets for Pregnancy Outcome Prediction

Abstract: The main challenges of predictive analytics revolve around the handling of datasets, especially the disproportionate distribution of instances among classes, in addition to classifier-suitability issues. This unequal spread causes imbalanced learning and severely obstructs prediction accuracy. In this paper, the performance of six classifiers and the effect of data balancing (DB) and formation approaches for predicting pregnancy outcome (PO) were investigated. The synthetic minority oversampling technique (SMOTE) and resampling with and without replacement were adopted for data imbalance treatment. Six classifiers, including random forest (RF), were evaluated on each resampled dataset with four test modes, using the Waikato Environment for Knowledge Analysis and R programming libraries. Analyses of variance performed separately on F-measure and root mean squared error showed that the mean performance of classifiers across the datasets varied significantly (F=117.9; p=0.00) at a 95% confidence interval, while the Tukey multiple-comparison test revealed RF (mean=0.78) and SMOTE (mean=0.73) as having significantly different means. The RF model on SMOTE produced a per-class PO accuracy ≥ 0.89, area under the curve ≥ 0.96, and coverage of 97.8%, and was adjudged the best classifier-DB method pair. However, there was no significant difference (F=0.07, 0.01; p=1.000) in the mean performance of classifiers across test data modes, revealing that train/test data modes affect classification accuracy insignificantly, although there are noticeable variations in computational cost. The methodology significantly enhances the predictive accuracy of minority classes, and confirms the importance of data-imbalance treatment and the suitability of RF for PO classification.

Author 1: Udoinyang G. Inyang
Author 2: Francis B. Osang
Author 3: Imo J. Eyoh
Author 4: Adenrele A. Afolorunso
Author 5: Chukwudi O. Nwokoro

Keywords: Imbalance learning; pregnancy outcome; random forest; SMOTE; imbalance data

PDF

Paper 63: Neuro-fuzzy System with Particle Swarm Optimization for Classification of Physical Fitness in School Children

Abstract: Physical fitness is widely known to be one of the critical elements of a healthy life. The sedentary attitude of school children is related to several health problems caused by physical inactivity. This article aims to classify physical fitness in school children, using a database of 1813 children of both sexes, aged six to twelve years. The physical tests were flexibility, horizontal jump, and agility, which served to classify physical fitness using neural networks and fuzzy logic. For this, the ANFIS (adaptive network fuzzy inference system) model was used, optimized with the Particle Swarm Optimization (PSO) algorithm. The experimental tests carried out showed an RMSE of 3.41 after 500 iterations of the PSO algorithm. This result is considered acceptable within the conditions of this investigation.

Author 1: Jose Sulla-Torres
Author 2: Gonzalo Luna-Luza
Author 3: Doris Ccama-Yana
Author 4: Juan Gallegos-Valdivia
Author 5: Marco Cossio-Bolaños

Keywords: Classification; ANFIS; particle swarm optimization; physical fitness; RMSE

PDF

Paper 64: An Efficient Classifier using Machine Learning Technique for Individual Action Identification

Abstract: Human action recognition is an important branch of computer vision and is receiving increasing attention from researchers. It has been applied in many areas, including surveillance, healthcare, sports, and computer games. This work focuses on designing a human action recognition system for a human interaction dataset. A literature review was conducted to determine suitable algorithms for action recognition. Three machine learning models are implemented as the classifiers for human actions. An image processing method and a projection-based feature extraction algorithm are presented to generate training examples for the classifiers. The action recognition task is divided into two parts: 4-class human posture recognition and 5-class human motion recognition. Classifiers are trained to classify input data into one of the posture or motion classes. Performance evaluations are carried out to assess validation accuracy and test accuracy for action recognition. Architecture designs for centralized and distributed recognition systems are presented, and these architectures are simulated on a sensor network to evaluate feasibility and recognition performance. Overall, the designed classifiers show promising performance for action recognition.

Author 1: G. L. Sravanthi
Author 2: M.Vasumathi Devi
Author 3: K.Satya Sandeep
Author 4: A.Naresh
Author 5: A.Peda Gopi

Keywords: Human action recognition; machine learning; neural networks

PDF

Paper 65: A Survey on Detection and Prevention of Web Vulnerabilities

Abstract: The Internet provides a vast range of benefits to society and empowers users in a variety of ways to use web applications. The Internet has become the most transformative and fastest-growing technology ever built, but it also brings new security challenges to web services because of its scattered and open nature. A single vulnerability in program code can allow an attacker to obtain unauthorized access and perform adversarial actions, so securing web applications against hacking attempts is of paramount importance. This paper presents a literature survey recapitulating security solutions and major vulnerabilities, systemizing the existing methods on a broader horizon to promote further research. The data is collected from a total of 86 primary studies taken from well-known digital libraries. The articles are classified by method: secure programming, static analysis, dynamic analysis, hybrid analysis, and machine learning. The number of citations and the significance of a developing strategy were taken into account when selecting articles. Overall, our survey suggests that there is no way to alleviate all web vulnerabilities; therefore, more studies are needed in the area of web information security. The complexity of each method is addressed, and recommendations are provided regarding when to apply the given methods. Finally, we summarize the experience gained and examine future research openings in web application security.

Author 1: Muhammad Noman
Author 2: Muhammad Iqbal
Author 3: Amir Manzoor

Keywords: Web security survey; web vulnerabilities; detection and prevention techniques

PDF

Paper 66: Data Warehouse System for Multidimensional Analysis of Tuition Fee Level in Higher Education Institutions in Indonesia

Abstract: In this study, we developed a data warehouse (DW) system for tuition-fee-level management for higher education institutions (HEIs) in Indonesia. The system was developed to provide sufficient information to administrators for decision making on applicants' tuition fees by integrating multisource data. A simple but sufficient method using open-source software was introduced, following the business requirements of the HEI's administrators. Following a business intelligence (BI) approach, four procedures are applied (preparation, integration, analysis, and visualization) to construct the tuition-fee-level management system. The DW comprises seven dimensions, of which four are basic (faculty, year, entrant type, and tuition fee level), and three fact data sets regarding applicants, tuition fee level, and payment status. The analytical results were tuition-fee-level trends, the top five faculties by applicants, and trends in fees collected from students. These results are presented in various charts and graphics on a tuition-fee-level dashboard, which has many functions that provide insight into business performance. The DW system described in this paper can be used as a guideline for tuition-fee-level management for HEIs in Indonesia.

Author 1: Ardhian Agung Yulianto
Author 2: Yoshiya Kasahara

Keywords: Data warehouse; higher education institution; multidimensional analysis; Indonesia; tuition-fee-level management

PDF

Paper 67: Signature based Network Intrusion Detection System using Feature Selection on Android

Abstract: This paper presents a Smart Intrusion Detection System (SIDS), a contribution to efforts towards detecting intrusions and malicious activities on Android phones. The goal of this paper is to raise users' awareness of the high rate of intrusions and malicious activities on Android phones and to provide a countermeasure system for more secure operation. The proposed system detects any intrusion or illegal activity on Android, and also takes a selfie of the intruder without their knowledge, keeping it in a log for the user to view. The object-oriented analysis and design method (OOADM) was adopted in the development. This approach was used to model and develop the system using real intrusion features and processes, to detect intrusions more flexibly and efficiently. Signature detection was used to detect attacks by looking for specific patterns. The system detects intrusions and immediately sends an alert to notify the user of an illegal or malicious attempt and the location of the intruder.

Author 1: Onyedeke Obinna Cyril
Author 2: Taoufik Elmissaoui
Author 3: Okoronkwo M.C
Author 4: Ihedioha Uchechi .M
Author 5: Chikodili H.Ugwuishiwu
Author 6: Okwume .B. Onyebuchi

Keywords: Signature detection; feature selection; Android phone; Smart Intrusion Detection System (SIDS)

PDF

Paper 68: Hybrid Machine Learning: A Tool to Detect Phishing Attacks in Communication Networks

Abstract: Phishing is a cyber-attack that uses disguised email as a weapon and has been on the rise in recent times. An innocent Internet user who happens to click on a fraudulent link may fall victim to divulging personal information such as a credit card PIN, login credentials, banking information, or other sensitive data. There are many ways in which attackers can trick victims into revealing their personal information. In this article, we select important features of phishing URLs that an attacker can use to trick Internet users into taking the attacker's desired action. We use two machine learning techniques to accurately classify our data sets, and compare the performance of related techniques with our scheme. The results of the experiments show that the approach is highly effective in detecting phishing URLs, attaining an accuracy of 97.8% with a 1.06% false-positive rate, a 0.5% false-negative rate, and an error rate of 0.3%. The proposed scheme performs better than the other selected related work, showing that our approach can be used for real-time applications in detecting phishing URLs.
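The rates reported above follow directly from confusion-matrix counts; a small sketch of the standard definitions (the counts in the test are illustrative, not the paper's):

```python
def rates(tp, fp, tn, fn):
    """Derive accuracy, false-positive rate, false-negative rate,
    and error rate from confusion-matrix counts."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    fpr = fp / (fp + tn)  # legitimate URLs wrongly flagged as phishing
    fnr = fn / (fn + tp)  # phishing URLs that slip through undetected
    error = (fp + fn) / total
    return accuracy, fpr, fnr, error
```

Note that accuracy and error rate are complements by these definitions (accuracy + error = 1), which is a useful sanity check when reading reported metrics.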

Author 1: Ademola Philip Abidoye
Author 2: Boniface Kabaso

Keywords: Phishing attack; data sets; URL classification; phishing URL; attackers; machine learning; classifiers; Internet

PDF

Paper 69: Enhanced Pre-processing and Parameterization Process of Generic Code Clone Detection Model for Clones in Java Applications

Abstract: Code clones are repeated source code fragments in a program. There are four types of code clone: Type-1, Type-2, Type-3, and Type-4. Various models have been used to detect code clones. The Generic Code Clone Detection (GCCD) model combines five processes to detect code clones of Type-1 through Type-4 in Java applications: Pre-processing, Transformation, Parameterization, Categorization, and Match Detection. This work aims to improve code clone detection by enhancing the GCCD model; to this end, the Pre-processing and Parameterization processes are enhanced. The enhancement determines the best constant and weightage for improving the code clone detection result. The results from the proposed enhancement show that the "private" keyword, with its weightage, is the best constant and weightage for the GCCD model.

Author 1: Nur Nadzirah Mokhtar
Author 2: Al-Fahim Mubarak-Ali
Author 3: Mohd Azwan Mohamad Hamza

Keywords: Code clone; code clone detection model; java applications; computational intelligence

PDF

Paper 70: Improving Intrusion Detection System using Artificial Neural Network

Abstract: Currently, network communication is more susceptible to different forms of attack due to its expanded usage, accessibility, and complexity in most areas, consequently imposing greater security risks. One method to halt attacks is to identify different forms of irregularity in the data transmitted and processed during communication; anomaly detection is a vital process for securing a system. To this end, machine learning plays a key role in identifying abnormalities and intrusions in network communication. Regularization is one of the major aspects of training machine learning models and plays a primary role in several successful artificial neural network models. In this work, a regularization technique is integrated with an Artificial Neural Network (ANN) for classifying and detecting irregularities in network communication efficiently. The purpose of regularization is to discourage learning an overly flexible or complex model, so that the model generalizes well enough to perform accurately on unseen data. For training and testing purposes, the NSL-KDD, CIDDS-001 (external and internal server data), and UNSW-NB15 datasets were utilized. Through extensive experiments, the proposed regularizer reaches a higher True Positive Rate (TPR) and precision compared to the L1 and L2 norm regularization algorithms. It is thus concluded that the proposed regularizer demonstrates a strong intrusion detection ability.

Author 1: Marwan Ali Albahar
Author 2: Muhammad Binsawad
Author 3: Jameel Almalki
Author 4: Sherif El-etriby
Author 5: Sami Karali

Keywords: New regularizer; anomaly detection; NSL-KDD dataset; CIDDS-001 dataset; UNSW-NB15

PDF

Paper 71: Successive Texture and Shape based Active Contours for Train Bogie Part Segmentation in Rolling Stock Videos

Abstract: Train Rolling Stock Examination (TRSE) is a procedure for checking for damage in the undercarriage of a train moving at around 30 km/h. The undercarriage of a train is called the bogie in railway manuals. Traditionally, TRSE is performed manually by a set of highly skilled railway personnel near train stations. This paper presents a new method to segment bogie parts which can assist trained railway personnel to perform better and consequently reduce train accidents. This work uses visualization techniques as a pair of virtual eyes to help check each bogie part remotely using high-speed video data. Our previous active contour (AC) models were supervised by a weak shape image, which was shown to improve segmentation accuracies on a closely packed, inhomogeneous train bogie object space. However, the inner texture of the objects in the bogies is found to be necessary for better object segmentation. This paper therefore proposes an algorithm for bogie part segmentation as a successive texture- and shape-based AC model (STSAC), in which the texture of the bogie part is applied serially before the shape, to guide the contour towards the desired object of interest. This contrasts with previous approaches where texture is applied to extract the object shape, losing texture information completely in the output image. To test the proposed method's ability to extract objects from videos captured under ambient conditions, a train rolling stock video database was built with 5 videos. In contrast to previous models, the proposed method produces shape-rich textured objects through contour evolution performed sequentially.

Author 1: Kaja Krishnamohan
Author 2: Ch.Raghava Prasad
Author 3: P.V.V.Kishore

Keywords: Automation of Train Rolling Stock Examination; level sets; shape priors; texture priors; expert system models

PDF

Paper 72: Exerting 2D-Space of Sentiment Lexicons with Machine Learning Techniques: A Hybrid Approach for Sentiment Analysis

Abstract: Sentiment mining from textual content on the web can give valuable insights for discernment, strategic decision making, targeted advertisement, and much more. Supervised machine learning (ML) approaches do not capture the sentiment inherent in individual terms, whereas unsupervised sentiment lexicon (SL) based approaches lag behind ML approaches because of a bias they have towards one sentiment over the other. In this paper, we propose a hybrid approach that uses unsupervised sentiment lexicons to transform the term space into a two-dimensional sentiment space, on which a discriminative classifier is trained in a supervised fashion. This hybrid approach yields higher accuracy, faster training, and a lower memory footprint than the ML approaches, and is more suitable for scenarios where training data is scarce. We support our claim by reporting results on six social media datasets using five sentiment lexicons and four ML algorithms.
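The core transformation described above can be sketched simply: each document is mapped from the high-dimensional term space to a single (positive, negative) point using lexicon scores, and a classifier is then trained on those 2-D points. The toy lexicon below is hypothetical; real lexicons assign graded polarity scores to many thousands of terms:

```python
def to_sentiment_space(tokens, lexicon):
    """Project a tokenized document from term space onto a 2-D point
    (total positive score, total negative score) using a sentiment lexicon."""
    pos = neg = 0.0
    for t in tokens:
        score = lexicon.get(t, 0.0)  # unknown terms contribute nothing
        if score > 0:
            pos += score
        elif score < 0:
            neg += -score  # store negative evidence as a magnitude
    return (pos, neg)
```

Because every document collapses to two features regardless of vocabulary size, the downstream classifier trains quickly and with a small memory footprint, which matches the advantages claimed in the abstract.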

Author 1: Muhammad Yaseen Khan
Author 2: Khurum Nazir Junejo

Keywords: Hybrid approach; machine learning; sentiment analysis; sentiment lexicons; sentiment space; social media analysis

PDF

Paper 73: Capsule Network for Cyberthreat Detection

Abstract: In cybersecurity, analyzing social network data has become an essential research area due to its property of providing real-time updates about real-world events. Studies have shown that Twitter can contain information about security threats before some specialized sites. Thus, classifying tweets as security-related or not security-related can help provide early warnings of such attacks. In this study, the use of a capsule network (CapsNet), a new deep learning algorithm, is investigated for the first time in the field of security attack detection using Twitter. The aim was to increase the accuracy of tweet classification by using CapsNet rather than a convolutional neural network (CNN). To achieve this objective, the original implementation of CapsNet with dynamic routing was adapted to be suitable for text analysis using a tweet data set, and a random search technique was used to tune the model's hyperparameters. The experimental results showed that CapsNet exceeded the baseline CNN on the same data set, with an accuracy of 92.21% and a 92.2% F1 score; word2vec embedding also performed better than random initialization.

Author 1: Sahar Altalhi
Author 2: Maysoon Abulkhair
Author 3: Entisar Alkayal

Keywords: Capsule network; dynamic routing; deep learning; Twitter; text analysis; attack detection

PDF

Paper 74: Estimating the Causes of Poor Academic Performance of Students: A Case Study

Abstract: Poor academic performance of students is not only a concern for parents and teachers, but also a concern for the country as a whole. This paper attempts to identify the causes of poor academic achievement, presenting a method for identifying the factors that most influence academic performance. The proposed method is capable of using qualitative ratings as input for the factors considered, finds the correlation of each factor with academic performance, and finally ranks the influence of the factors on performance to sort out the most influential one. The study was carried out on the academic performance of 189 B.Tech students over five academic semesters. The results indicate the degree of influence of various factors on performance, with the most influential being the academic ability of the students.
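The ranking step described above reduces to computing a correlation coefficient per factor and sorting by magnitude. A sketch using the Pearson correlation (the factor names and values here are illustrative, not the study's data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_factors(factors, performance):
    """Rank candidate factors by |correlation| with performance, strongest first."""
    scored = [(name, pearson(vals, performance)) for name, vals in factors.items()]
    return sorted(scored, key=lambda kv: abs(kv[1]), reverse=True)
```

Qualitative ratings (e.g. poor/fair/good) would first be encoded on a numeric scale before being passed in, which is consistent with the abstract's use of qualitative ratings as input.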

Author 1: Juwesh Binong

Keywords: Academic performance; qualitative rating; factors; correlation coefficient; analytic hierarchy process

PDF

Paper 75: Sensing of Environmental Variables for the Analysis of Indoor Air Pollution

Abstract: Ambient intelligence systems try to perceive the environment and react, proactively and pervasively, to improve people's environmental conditions. A current challenge in ambient intelligence is trying to mitigate environmental risks that affect global public health, such as increasing air pollution. This paper presents an analysis of environmental variables related to indoor air pollutants, such as CO, PM 2.5, PM 10, humidity, and temperature, all captured in a university environment. The environmental measurements were carried out through a wireless sensor network consisting of two nodes, with the cloud computing service ThingSpeak used as the storage medium. With this network, the presence of pollutants in the study area was detected, with concentration levels within the permitted ranges, as well as their correlation with the atmospheric variables of temperature and humidity. The implementation of the sensor network allowed data capture in a transparent and non-intrusive way, and the analysis allowed an understanding of the behavior of pollutants in indoor spaces, where air circulation is limited and where high levels of pollution can be harmful to human health.

Author 1: Jaime Xilot
Author 2: Guillermo Molero-Castillo
Author 3: Edgard Benitez-Guerrero
Author 4: Everardo Barcenas

Keywords: Air pollution; Ambient Intelligence (AmI); indoor air quality; wireless sensor network

PDF

Paper 76: Future of the Internet of Things Emerging with Blockchain and Smart Contracts

Abstract: The Internet of Things (IoT) has the potential to change the way the world works, from home automation to smart cities, from improved healthcare to efficient supply chain management and the Industry 4.0 revolution. IoT is increasingly becoming an essential part of home and industrial automation; nevertheless, there are still many challenges that need to be fixed. IoT solutions are costly and complicated, while issues regarding security and privacy must be addressed with a sustainable plan. To support the growing number of connected devices, the IoT is in dire need of a reboot, and blockchain technology might be the answer. Starting as a decentralized financial solution in the form of Bitcoin, blockchain technology has expanded to diverse areas and information technology applications. Blockchain technology and smart contracts can address the outstanding security and privacy issues that impede further development of the IoT. Blockchain is a decentralized system with no central governance; it facilitates interactions, promotes new and improved transaction models, and allows autonomous coordination of devices using enhanced encryption techniques. The primary purpose of this paper is to showcase the challenges and problems we face with current Internet of Things solutions and to analyze how the use of blockchain and smart contracts can help achieve a new, more robust Internet of Things system. Finally, we examine some of the many projects using the Internet of Things together with blockchain and smart contracts to create new solutions that are only possible by integrating these technologies.

Author 1: Mir Hassan
Author 2: Chen Jincai
Author 3: Adnan Iftekhar
Author 4: Xiaohui Cui

Keywords: Internet of Things (IoT); blockchain; smart contracts; peer-to-peer security

PDF

Paper 77: News Aggregator and Efficient Summarization System

Abstract: A news aggregator is an online software system that collects news stories and events from around the world, from various sources, all in one place. News aggregators play a very important role in reducing time consumption, as news that would otherwise be explored through more than one website is placed in a single location; summarizing this aggregated content saves further reading time. The proposed technique uses the TextRank algorithm, which showed promising results for summarization. The main goal of this project is to develop a news aggregator able to aggregate relevant articles for a given input keyword or key-phrase, and to summarize the relevant articles after enhancing the text, giving the reader an understandable and efficient summary.
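TextRank scores sentences by running a PageRank-style iteration over a sentence-similarity graph and keeping the top-ranked sentences as the summary. A compact sketch of that idea; the sentence splitting and similarity measure here are simplified assumptions, not the project's exact pipeline:

```python
import math
import re

def textrank_summary(text, n=2, d=0.85, iters=50):
    """Extractive summary: build a sentence-similarity graph, run a
    PageRank-style power iteration, return the top-n sentences in order."""
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    words = [set(s.lower().split()) for s in sents]

    def sim(a, b):  # word overlap normalised by sentence lengths
        if len(a) <= 1 or len(b) <= 1:
            return 0.0
        return len(a & b) / (math.log(len(a)) + math.log(len(b)))

    m = len(sents)
    w = [[sim(words[i], words[j]) if i != j else 0.0 for j in range(m)]
         for i in range(m)]
    scores = [1.0] * m
    for _ in range(iters):  # damped power iteration over the graph
        scores = [(1 - d) + d * sum(w[j][i] * scores[j] / max(sum(w[j]), 1e-9)
                                    for j in range(m))
                  for i in range(m)]
    top = sorted(range(m), key=lambda i: scores[i], reverse=True)[:n]
    return [sents[i] for i in sorted(top)]  # restore document order
```

Because the method is extractive and unsupervised, it needs no training data, which suits an aggregator summarizing arbitrary incoming articles.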

Author 1: Alaa Mohamed
Author 2: Marwan Ibrahim
Author 3: Mayar Yasser
Author 4: Mohamed Ayman
Author 5: Menna Gamil
Author 6: Walaa Hassan

Keywords: News aggregator; text summarization; text enhancement; TextRank algorithm

PDF

Paper 78: Acoustic Frequency Optimization for Underwater Wireless Sensor Network

Abstract: In recent years, research in Underwater Wireless Sensor Networks (UWSN) has interested many research groups, as UWSNs can be used for many important applications such as disaster management, marine environment monitoring, fish farming, and military surveillance. There are many challenges in underwater acoustic communication: strong signal attenuation, limited bandwidth, long propagation delay, high transmission loss, and energy consumption. In this paper, we present a simple flow of mathematical models for the underwater acoustic communication channel. We also investigate the influence of different parameters governing the channel's performance, such as temperature and wind speed, and show the importance of selecting the optimal communication frequency to increase the communication SNR. We implemented the mathematical model in MATLAB and made it available online for other researchers. We found that selecting the optimal frequency is crucial when wind speed is high.
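A standard component of such channel models is the frequency-dependent absorption of sound in seawater. The paper's full MATLAB model is not reproduced here, but Thorp's widely used empirical formula (frequency in kHz, result in dB/km) gives the flavour of why higher frequencies attenuate much faster and why an optimal operating frequency exists:

```python
def thorp_absorption(f_khz):
    """Thorp's empirical absorption coefficient for seawater, in dB/km,
    with frequency in kHz (commonly used above a few hundred Hz)."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)      # boric acid relaxation term
            + 44 * f2 / (4100 + f2)   # magnesium sulfate relaxation term
            + 2.75e-4 * f2            # pure-water viscous absorption
            + 0.003)                  # low-frequency constant
```

Since absorption rises steeply with frequency while ambient noise (wind, shipping) falls with frequency, the SNR-maximising frequency sits at the trade-off between the two, which is the optimisation the paper studies.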

Author 1: Emad Felemban

Keywords: Underwater Wireless Sensor Network (UWSN); acoustic signal; mathematical modeling; optimization; noise level; optimal frequency

PDF

Paper 79: Automated Recognition of Sincere Apologies from Acoustics of Speech

Abstract: Sincerity is an important characteristic of communicative behavior, representing an honest, truthful, and genuine display of verbal and non-verbal expressions. Individuals who are deemed sincere often appear more charismatic and can influence a large number of people. In this paper, we propose a multi-model fusion framework to identify sincerely delivered apologies by modelling the difference between the acoustics of sincere and insincere utterances. The efficacy of this framework is benchmarked using the Sincere Apology Corpus (SAC). We show that our proposed methods can improve the baseline classification performance (in terms of unweighted average recall) on SAC from 66.02% to 70.97% for the validation partition and from 66.61% to 75.49% for the test partition. Moreover, as part of our investigation, we found that gender dependency can influence the classification performance of machine learning models, with models trained for male subjects performing better than those trained for female subjects.

Author 1: Zafi Sherhan Syed
Author 2: Muhammad Shehram Shah
Author 3: Abbas Shah Syed

Keywords: Sincerity; affective computing; social signal processing

PDF

Paper 80: Document Classification Method based on Graphs and Concepts of Non-rigid 3D Models Approach

Abstract: Text document classification is an important research topic in the field of information retrieval, and so is how we represent the information extracted from the documents to be classified. Existing document classification methods and techniques based on the vector space model do not capture the relations between words, which are considered important for better comparison and therefore better classification. For this reason, two significant contributions are made. The first is the way the feature vector for document comparison is created, using adapted concepts from non-rigid 3D model comparison and graphs as the data structure to represent the documents. The second is the classification method itself, which uses the representative feature vectors of each category to classify new documents.

Author 1: Lorena Castillo Galdos
Author 2: Cristian Lopez Del Alamo
Author 3: Grimaldo Dávila Guillén

Keywords: Document classification; graphs; non-rigid 3D models; Universidad Nacional de San Agustín de Arequipa (UNSA)

PDF

Paper 81: Predicting Number of Hospital Appointments When No Data Is Available

Abstract: Usually, in a hospital, the data generated by each department or section is treated in isolation, under the belief that there is no relationship between them: while one department is in high demand, this is assumed to have no influence on whether another department faces the same demand or none at all. In this paper, we question this approach by considering the departments as components of one large system within the hospital. Thus, we present an algorithm that predicts a department's appointments when its own data is not available, using data from other departments. The algorithm uses a model based on multiple linear regression, with a correlation matrix measuring the relationship between departments over different time windows. After running our algorithm for different time windows and departments, we experimentally find that as we increase the extension of a time window and learn dependencies in the data, the corresponding precision decreases. Indeed, a month of data is the minimum sweet spot to leverage information from other departments and still provide accurate predictions. These results are important for developing per-department health policies under limited data, an interesting problem that we plan to investigate in future work.
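The core idea, predicting one department's appointments from a correlated department, can be sketched in the single-predictor case with ordinary least squares; the paper itself uses multiple linear regression over a correlation matrix, and the appointment counts below are invented for illustration:

```python
def ols_fit(x, y):
    # Ordinary least squares for y = a + b*x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical daily appointment counts for two departments.
cardiology  = [30, 34, 28, 40, 36, 32]
dermatology = [15, 17, 14, 20, 18, 16]

a, b = ols_fit(cardiology, dermatology)
# Predict dermatology's load on a day cardiology sees 38 patients.
print(round(a + b * 38))
```

With real data, one would first use the correlation matrix to select which other departments are informative predictors, then fit the regression on those.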

Author 1: Harold Caceres
Author 2: Nelson Fuentes
Author 3: Julio Aguilar
Author 4: Cesar Baluarte
Author 5: Karim Guevara
Author 6: Eveling Castro-Gutierrez
Author 7: Omar U. Florez

Keywords: Multiple linear regression; hospital appointments; machine learning; correlation matrix

PDF

Paper 82: Road Damage Detection Utilizing Convolution Neural Network and Principal Component Analysis

Abstract: Roads should always be in a reliable condition and maintained regularly. One of the problems that should be handled well is pavement cracking, a challenging problem for road engineers, since keeping roads in a stable condition is necessary for both drivers and pedestrians. Many methods have been proposed to handle this problem while saving time and cost. In this paper, we propose a two-stage method to detect pavement cracks based on Principal Component Analysis (PCA) and a Convolutional Neural Network (CNN). We employ PCA to extract the most significant features, experimenting with different numbers of PCA components. The proposed approach was trained on the Mendeley Asphalt Crack dataset, which contains 400 images of road cracks at 480×480 resolution. The obtained results show how PCA helped speed up the learning process of the CNN.
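The first stage, reducing each image to its most significant components before the CNN sees it, can be sketched with PCA via singular value decomposition. The data below is random noise standing in for flattened crack images, and the component count is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for flattened grayscale patches: 100 samples, 64 "pixels".
X = rng.normal(size=(100, 64))

# PCA via SVD: centre the data, project onto the top-k principal axes.
k = 8
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:k].T  # (100, 8) features fed to the classifier

print(X_reduced.shape)
```

Varying `k` trades reconstruction fidelity against input size, which is what the abstract's experiments with "different numbers of PCA components" explore.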

Author 1: Elizabeth Endri
Author 2: Alaa Sheta
Author 3: Hamza Turabieh

Keywords: Pavement crack; Convolutional Neural Network (CNN); Principal Component Analysis (PCA)

PDF

Paper 83: Automatic Building Change Detection on Aerial Images using Convolutional Neural Networks and Handcrafted Features

Abstract: In this article, we present a new framework for the task of building change detection, using a convolutional neural network (CNN) for the building detection step and a set of handcrafted features for the change detection step. The buildings are extracted using Mask R-CNN, a neural network for object-based instance segmentation that has been tested in different case studies to segment various types of objects with good results. The buildings are detected in bitemporal images, where three comparison metrics, MSE, PSNR, and SSIM, are used to determine whether buildings have changed; we apply these metrics to the Hue, Saturation, and Brightness representation of the images. Finally, the features are classified by two algorithms, Support Vector Machine and Random Forest, so that both results can be compared. The experiments were performed on the large WHU building dataset, which contains very high-resolution (VHR) aerial images. The results obtained are comparable to the state of the art.
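Two of the three comparison metrics are straightforward to state: MSE is the mean squared pixel difference between the two dates, and PSNR is the peak signal power relative to that error (SSIM is more involved and omitted here). A minimal sketch over invented pixel values from one channel of a bitemporal pair:

```python
import math

def mse(a, b):
    # Mean squared error between two equal-length pixel sequences.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, max_val=255.0):
    # Peak signal-to-noise ratio in dB; identical patches give infinity.
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)

before = [120, 130, 125, 128]
after  = [121, 129, 126, 127]
print(mse(before, after))             # 1.0
print(round(psnr(before, after), 2))  # 48.13
```

A low PSNR (large difference) between the two dates is evidence of change; the framework feeds such per-building scores to the SVM and Random Forest classifiers.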

Author 1: Diego Alonso Javier Quispe
Author 2: Jose Sulla-Torres

Keywords: Bi-temporal images; convolutional neural network (CNN); building detection; building change detection; Mask R-CNN

PDF

Paper 84: ParaCom: An IoT based Affordable Solution Enabling People with Limited Mobility to Interact with Machines

Abstract: There are many people in this world who have lost the ability to communicate with others due to some unforeseen accident. This work aims to help users who are paralyzed and/or suffering from Motor Neuron Diseases (MND) such as Amyotrophic Lateral Sclerosis (ALS) and Primary Lateral Sclerosis by making them more independent. Patients suffering from these diseases are unable to move their arms and legs, lose their body balance, and lose the ability to speak. Here we propose an IoT-based communication controller, built on Morse code, which controls the smartphone of the user. This paper proposes a solution that gives the user the ability to communicate with other people using a machine as an intermediary. The device requires minimal input from the user.
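Morse code suits this setting because it needs only one binary input (short vs. long activation) from the user. A minimal sketch of the decoding side; the token format (letters separated by spaces, words by "/") is an assumption, not the paper's protocol:

```python
MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
         "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
         "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
         ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
         "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
         "--..": "Z"}

def decode(signal):
    # Letters separated by spaces, words by " / "; unknown tokens -> "?".
    return "".join(" " if tok == "/" else MORSE.get(tok, "?")
                   for tok in signal.split())

print(decode("... --- ..."))  # SOS
```

On the device side, the controller would timestamp button presses to classify them as dots or dashes before passing the token stream to a decoder like this.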

Author 1: Siddharth Sekar
Author 2: Nirmit Agarwal
Author 3: Vedant Bapodra

Keywords: Internet of Things (IoT); Motor Neuron Disease (MND); Amyotrophic Lateral Sclerosis (ALS); Arduino

PDF

Paper 85: Security of a New Hybrid Ciphering System

Abstract: The protection of privacy is a very sensitive subject that comes into force in all areas, and it is the first priority in the development of new technologies. In fact, opting for a new Big Data or IoT technology is a very difficult decision for organizations, as it calls into question the confidentiality, integrity, authenticity, and non-repudiation of their data. Convincing these organizations to adhere to technological intelligence amounts to providing them with powerful security tools and mechanisms that are resistant to new types of vulnerability. The problem today, however, is that most security tools are based on old cryptographic primitives. Certainly, these have proven their resistance until today, but the need for new ones is becoming crucial in order to meet new technological requirements. In this paper, we propose a new hybrid encryption alternative based on two encryption systems: the first is an evolutionary encryption system and the second is an asymmetric encryption system. To present this work, we begin with a description of our evolutionary ciphering system. Then, we present the principle of the proposed hybridization and its contribution compared to other existing systems. Finally, we perform a detailed study of the security of this system and its long-term resistance.

Author 1: Mohammed BOUGRINE
Author 2: Fouzia OMARY
Author 3: Salima TRICHNI

Keywords: Security; confidentiality; hybrid encryption; evolutionary algorithms; symmetrical encryption; cryptography

PDF

Paper 86: Factors Influencing Practice of Human Resource Information System in Organizations: A Hybrid Approach of AHP and DEMATEL

Abstract: This paper blends the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology (HOT) fit model to identify the factors that influence the administrative choice of adopting a human resource information system (HRIS) in organizations. A hybrid Multi-Criteria Decision Making (MCDM) model combining the Decision Making Trial and Evaluation Laboratory (DEMATEL) and the Analytic Hierarchy Process (AHP) is used to achieve the objective of the study. In this study, the experts agree that staff IT skill is more significant than the other factors in the Human dimension. Similarly, IT infrastructure, top-level support, and competitive pressure are the most vital factors in the Technology, Organization, and Environment dimensions, respectively. Moreover, this paper will help managers attend to the factors that are vital for HRIS implementation in organizations.
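The AHP part of such a hybrid model derives factor weights from expert pairwise comparisons. A common approximation of the AHP priority vector is the row geometric mean of the comparison matrix; the matrix below is a hypothetical comparison of three factors on Saaty's 1-9 scale, not the paper's data:

```python
import math

def ahp_priorities(pairwise):
    # Geometric-mean approximation of the AHP priority vector:
    # take each row's geometric mean, then normalise to sum to 1.
    gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparisons of three Technology factors.
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
weights = ahp_priorities(M)
print([round(w, 3) for w in weights])
```

The first factor dominates both comparisons, so it receives the largest weight; DEMATEL would then be used to map cause-effect influence among the dimensions.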

Author 1: Abdul Kadar Muhammad Masum
Author 2: Faisal Bin Abid
Author 3: ABM Yasir Arafat
Author 4: Loo-See Beh

Keywords: Analytic Hierarchy Process (AHP); Decision Making Trial and Evaluation Laboratory (DEMATEL); factor; Human Resource Information System (HRIS); Multi-Criteria Decision Making (MCDM) Model

PDF

Paper 87: A Review on Virtual Machine Positioning and Consolidation Strategies for Energy Efficiency in Cloud Data Centers

Abstract: Cloud data centers consume massively increasing amounts of energy, which is considered unacceptable. Therefore, further efforts are needed to improve the energy efficiency of such data centers by using server consolidation to minimize the number of Active Physical Machines (APMs) in a data center. Strategies for the positioning and migration of VMs remain useful as a roadmap to maximum consolidation, and the latest techniques perform complex restructuring to optimize VM positioning. This paper provides a detailed review of state-of-the-art strategies for VM positioning and consolidation that help improve energy efficiency in cloud data centers. A comparison between the strategies is provided, revealing their worth and limitations, along with suggestions for strengthening other methods along the way.
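A common baseline for the consolidation problem the review surveys is to treat VM positioning as bin packing and apply a heuristic such as first-fit decreasing, minimizing the number of powered-on hosts. A sketch with a single CPU dimension and invented demands (real consolidation strategies are multi-dimensional and migration-aware):

```python
def first_fit_decreasing(vm_demands, host_capacity):
    # Pack VMs (demand as a fraction of one host's CPU) onto as few
    # active physical machines as possible.
    hosts = []  # remaining capacity of each active host
    for d in sorted(vm_demands, reverse=True):
        for i, free in enumerate(hosts):
            if d <= free:
                hosts[i] = free - d  # place on an existing host
                break
        else:
            hosts.append(host_capacity - d)  # power on a new host
    return len(hosts)

print(first_fit_decreasing([0.5, 0.7, 0.3, 0.2, 0.4], 1.0))  # 3 active hosts
```

Every host left out of the packing can be switched to a low-power state, which is the energy saving the surveyed strategies pursue.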

Author 1: Nahuru Ado Sabongari
Author 2: Abdulsalam Ya’u Gital
Author 3: Souley Boukari
Author 4: Badamasi Ja’afaru
Author 5: Muhammad Auwal Ahmed
Author 6: Haruna Chiroma

Keywords: Energy efficiency; optimization; cloud data centers

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org