IJACSA Volume 10 Issue 6

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: AHP-based Security Decision Making: How Intention and Intrinsic Motivation Affect Policy Compliance

Abstract: The analytic hierarchy process is a multiple-criteria tool used in decision-making applications. In this paper, the analytic hierarchy process is used to guide information security policy decision-making by identifying influencing factors and their weights for information security policy compliance. Drawing on self-determination theory, weights are determined for the intrinsic motivators autonomy, competence and relatedness as essential criteria, together with behavioural intention towards compliance and four awareness focus areas. A survey of cyber-security decision-makers at a Fortune 600 organisation provided the data. The results suggest that behavioural intention (52% of the weight of influencing factors) is more important than autonomy (21%), competence (21%) or relatedness (6%) in influencing behaviour towards information security policy compliance. Determining the weights of intrinsic motivation, intention and awareness focus areas can support security decision-making and policy compliance, and aid the design of effective security awareness programmes. However, these weights may in turn be affected by local organisational and cultural factors.
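
For illustration, the sketch below shows how AHP priority weights of the kind reported above can be derived from a pairwise comparison matrix via its principal eigenvector, together with a consistency check. The judgement values are hypothetical assumptions, not the survey data or tooling used in the paper.

```python
# Illustrative AHP priority-weight computation (hypothetical judgements,
# not the survey data reported in the paper).
import numpy as np

# Pairwise comparison matrix for four criteria, e.g.
# [intention, autonomy, competence, relatedness] on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 3.0, 7.0],
    [1/3, 1.0, 1.0, 4.0],
    [1/3, 1.0, 1.0, 4.0],
    [1/7, 1/4, 1/4, 1.0],
])

# Priority vector = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal_idx = np.argmax(np.real(eigvals))
weights = np.real(eigvecs[:, principal_idx])
weights = weights / weights.sum()

# Consistency ratio (random index RI = 0.90 for n = 4 criteria).
n = A.shape[0]
lambda_max = np.real(eigvals[principal_idx])
consistency_index = (lambda_max - n) / (n - 1)
consistency_ratio = consistency_index / 0.90

print("weights:", np.round(weights, 3), "CR:", round(consistency_ratio, 3))
```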

Author 1: Ahmed Alzahrani
Author 2: Christopher Johnson

Keywords: Analytic Hierarchy Process; behavioural intention; autonomy; competence; relatedness; information security policy compliance

PDF

Paper 2: Prediction of Visibility for Color Scheme on the Web Browser with Neural Networks

Abstract: In this study, we propose neural networks for predicting the visibility of color schemes. In recent years, most people have come to access websites owing to the spread of the Internet, so it is necessary to design web pages that allow users to access information easily. The color scheme is one of the most important elements of website design; we therefore focus on the visibility of background and character colors in this study. Several methods for predicting the visibility of color schemes have been proposed. In one of them, neural networks are used to forecast pairwise comparison tables that indicate the visibility of background and character colors. Our model employs neural networks for color recognition and visibility prediction. The neural networks used for color recognition forecast the color class name from a color and extract the features of the color. The neural networks used for visibility prediction take the features of background and character colors extracted by the color recognition networks and forecast the visibility of a color scheme. Pairwise comparison tables are then forecast from the outputs of the visibility prediction networks. We conducted a pairwise comparison experiment on a web browser, as well as a color recognition experiment, and evaluated our model. The results of the experiments suggest that our model improves the accuracy of pairwise comparison tables compared to existing methods. Thus, the proposed model can be used to predict the visibility of color schemes.
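
As a rough illustration of the idea (not the authors' architecture), the following sketch trains a small neural network that maps a background/character colour pair to a visibility score. The training data and the luminance-contrast proxy used as the target are assumptions made here for the sake of a runnable example.

```python
# Minimal sketch: a small neural network mapping a background/character colour
# pair (6 RGB values) to a visibility score. Data and target are stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: RGB of background and character colour in [0, 1].
X = rng.random((200, 6))

# Stand-in target: absolute luminance contrast as a crude proxy for visibility.
def luminance(rgb):
    return 0.299 * rgb[:, 0] + 0.587 * rgb[:, 1] + 0.114 * rgb[:, 2]

y = np.abs(luminance(X[:, :3]) - luminance(X[:, 3:]))

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict the visibility of white text on a dark blue background.
print(model.predict([[0.1, 0.1, 0.5, 1.0, 1.0, 1.0]]))
```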

Author 1: Miki Yamaguchi
Author 2: Yoshihisa Shinozawa

Keywords: Visibility prediction; color recognition; pairwise comparison experiment; human color vision; neural networks

PDF

Paper 3: Application of the Scattering Bond Graph Methodology for Composite Right/Left Handed Transmission Lines

Abstract: The metamaterial approach based on transmission line theory has influenced the development of the radiofrequency domain, so the types and fabrication techniques of these artificial lines have become quite diversified. This paper presents two models of a Composite Right/Left Handed (CRLH) artificial transmission line composed of a combination of Open Split Ring Resonators (OSRR) and Complementary Open Split Ring Resonators (OSCRR). The paper also shows how the Scattering Bond Graph (SBG) methodology provides electromagnetic simulation results (scattering parameters, phase response) well suited to Bond Graph (BG) modeling, and demonstrates the possibility of obtaining a nearly ideal line (solving the problem of impedance matching) and a better understanding of the behavior of radiofrequency systems.

Author 1: Islem Salem
Author 2: Hichem Taghouti
Author 3: Ahmed Rahmani
Author 4: Abdelkader Mami

Keywords: Scattering Bond Graph (SBG); metamaterials; wave matrix [W]; modelization; transmission line; scattering matrix [S]; CRLH; OSCRR; OSRR

PDF

Paper 4: Expert System for Milk and Animal Monitoring

Abstract: Expert systems (ES) are one of the prominent research domains of artificial intelligence (AI). They are applications developed to solve complex problems in a particular domain at the level of extraordinary human intelligence and expertise. This paper presents the design and development of an expert system for data collection, analysis and decision making for early mastitis detection. It focuses on both milk quality and animal health.
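
A toy rule-based sketch of the kind of if-then reasoning such an expert system might apply when screening a milk sample is shown below. The indicator thresholds are illustrative assumptions and not the rules encoded in the paper's knowledge base.

```python
# Toy rule-based sketch for early mastitis screening. The thresholds are
# illustrative assumptions, not the rules in the paper's expert system.
def assess_milk_sample(somatic_cell_count, conductivity_ms_cm, body_temp_c):
    """Return a decision and a list of warnings derived from simple rules."""
    warnings = []
    if somatic_cell_count > 200_000:       # cells/mL, common subclinical indicator
        warnings.append("elevated somatic cell count")
    if conductivity_ms_cm > 6.0:           # mS/cm, illustrative cut-off
        warnings.append("elevated milk conductivity")
    if body_temp_c > 39.5:                 # degrees C, possible fever in the animal
        warnings.append("elevated body temperature")
    if len(warnings) >= 2:
        return "possible mastitis - alert the veterinarian", warnings
    return "no action required", warnings

print(assess_milk_sample(somatic_cell_count=350_000,
                         conductivity_ms_cm=6.4,
                         body_temp_c=38.9))
```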

Author 1: Todor Todorov
Author 2: Juri Stoinov

Keywords: Expert system; milk quality; animal health

PDF

Paper 5: Innovative Means of Medical Students Teaching through Graphical Methods for Cardiac Data Estimating and Serious Games

Abstract: Nowadays, non-traditional methods and tools mediated by the rapid development of information technologies are being introduced into the training of medical students: software training systems, serious games, and video materials. The paper presents a software system for the processing and analysis of physiological data that is suitable for use in the medical students' training process, together with a study of the use of serious games in medical education. For optimal work with the software system, a database of cardiac data has been created for healthy individuals and individuals with various cardiac diseases. The established system and the cardiology database can be used by medical doctors to study statistical parameters and graphical cardiac data representations in various diseases. The results obtained from the analysis of the data through graphical methods can be used as an effective visual means for increasing the success of medical students' training. The paper presents the results of a survey of the interest of medical students at higher education institutions in Bulgaria in the inclusion of serious games in their medical training. The results show a high interest in game-based learning: by including serious games as an innovative form of training, it will be possible to build on theoretical education and to increase the efficiency of the education process.
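
The sketch below shows the kind of time-domain heart-rate-variability statistics (mean RR, SDNN, RMSSD, pNN50) such a system can report, computed from a series of RR intervals; it is a minimal illustration, not the paper's software, and the sample intervals are synthetic.

```python
# Time-domain heart-rate-variability statistics computed from RR intervals (ms).
import numpy as np

def hrv_time_domain(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                          # mean RR interval (ms)
        "sdnn": rr.std(ddof=1),                        # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)),          # short-term variability
        "pnn50": 100.0 * np.mean(np.abs(diff) > 50),   # % successive diffs > 50 ms
        "mean_hr": 60000.0 / rr.mean(),                # mean heart rate (bpm)
    }

# Example with a short synthetic recording.
print(hrv_time_domain([812, 790, 845, 830, 901, 870, 815, 795]))
```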

Author 1: Galya N. Georgieva-Tsaneva

Keywords: Serious games; medical education; cardiovascular disease; heart rate variability; time domain analysis; spectrogram

PDF

Paper 6: Muscles Heating Analysis in Sportspeople to Prevent Muscle Injuries using Thermal Images

Abstract: Muscle warm-up (heating) is the process that every athlete follows before any physical activity or sport. The greatest force is exerted on the legs, and if a good warm-up routine is not practiced, the muscles can suffer tears, cramps or fractures due to sudden movements while the muscles are cold. According to the National Institute of Arthritis and Musculoskeletal and Skin Diseases of the United States, the most common injuries occur in the ankles because they are a central point where greater force is exerted by the weight of the athlete, which makes careful muscle care important. For this reason, this research work addresses the evaluation of muscle heating in athletes to prevent muscle damage. First, two thermal images, taken before and after warm-up, are obtained using the FLIR ONE Pro thermal camera following a protocol for distance, position and temperature range. The images are then processed in the MATLAB software to map them onto the temperature range and subtracted to obtain the zones where the temperature variations indicate that adequate heating has been carried out. As a result, the areas where the subtraction of the two images was positive were obtained; this difference image is superimposed on the real image, showing the areas where an optimal warm-up has taken place.
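
A Python/NumPy analogue of the subtraction step described above is sketched below (the paper itself works in MATLAB). The temperature maps and the minimum-increase threshold are hypothetical; the point is only to show the before/after differencing and masking.

```python
# Python/NumPy analogue of the before/after thermal subtraction step.
# Inputs are two registered temperature maps (degrees C) of the same region.
import numpy as np

def heated_zones(temp_before, temp_after, min_increase_c=1.0):
    """Boolean mask of pixels whose temperature rose by at least min_increase_c."""
    diff = temp_after - temp_before
    return diff > min_increase_c

# Hypothetical 4x4 temperature maps.
before = np.full((4, 4), 31.0)
after = before + np.array([[0.2, 0.4, 1.5, 2.0],
                           [0.1, 1.2, 1.8, 2.1],
                           [0.0, 0.3, 1.1, 1.6],
                           [0.0, 0.1, 0.2, 0.4]])

mask = heated_zones(before, after)
print(mask.astype(int))
# The mask can then be overlaid on the visible-light photograph (e.g. by
# colouring the masked pixels) to show where warm-up was adequate.
```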

Author 1: Brian Meneses-Claudio
Author 2: Witman Alvarado-Díaz
Author 3: Fiorella Flores-Medina
Author 4: Natalia I. Vargas-Cuentas
Author 5: Avid Roman-Gonzalez

Keywords: Thermal image; muscle heating; heat map; muscle injuries; temperature range

PDF

Paper 7: Advanced Metaheuristics-based Tuning of Effective Design Parameters for Model Predictive Control Approach

Abstract: This paper presents a systematic tuning approach for Model Predictive Control (MPC) parameters using an original LabVIEW implementation of advanced metaheuristic algorithms. Perturbed Particle Swarm Optimization (pPSO), the Gravitational Search Algorithm (GSA), Teaching-Learning Based Optimization (TLBO) and the Grey Wolf Optimizer (GWO) metaheuristics are proposed to solve the formulated MPC tuning problem under operational constraints. The MPC tuning is performed offline to select both the prediction and control horizons as well as the weighting matrices. All proposed algorithms are first evaluated and validated on a benchmark of standard test functions. The same algorithms are then used to solve the formulated MPC tuning problem for two dynamical systems, namely the magnetic levitation system MAGLEV 33-006 and the three-tank DTS200 process. Demonstrative results, in terms of statistical metrics and closed-loop system responses, are presented and discussed in order to show the effectiveness and superiority of the proposed metaheuristics-based tuning approach. The developed CAD interface for the LabVIEW implementation of the proposed metaheuristics is provided and is freely accessible for further optimization purposes.
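
To make the tuning idea concrete, the sketch below runs a plain particle swarm optimisation loop over three MPC design parameters (prediction horizon, control horizon, control weighting). The cost function is a stand-in for a simulation-based closed-loop performance index; the paper's pPSO perturbation, the other metaheuristics and the LabVIEW implementation are not reproduced.

```python
# Minimal PSO loop of the kind used for offline MPC tuning. `closed_loop_cost`
# is a placeholder for a simulation-based index (e.g. integral of squared error).
import numpy as np

rng = np.random.default_rng(1)

def closed_loop_cost(x):
    # x = [prediction horizon Np, control horizon Nc, control weighting lam]
    np_, nc, lam = x
    return (np_ - 20) ** 2 + (nc - 4) ** 2 + 10 * (lam - 0.5) ** 2  # stand-in

lower = np.array([5.0, 1.0, 0.01])
upper = np.array([50.0, 10.0, 2.0])
n_particles, n_iter, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5

pos = rng.uniform(lower, upper, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([closed_loop_cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lower, upper)
    cost = np.array([closed_loop_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

# Horizons would be rounded to integers before use in the controller.
print("tuned [Np, Nc, lambda]:", np.round(gbest, 3))
```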

Author 1: Mohamed Lotfi Derouiche
Author 2: Soufiene Bouallègue
Author 3: Joseph Haggège
Author 4: Guillaume Sandou

Keywords: Model predictive control; parameters tuning; advanced metaheuristics; MAGLEV 33-006; DTS200 three-tank process; LabVIEW implementation

PDF

Paper 8: Secure Medical Internet of Things Framework based on Parkerian Hexad Model

Abstract: Medical Internet of Things (MIoT) applications enhance medical services by collecting data using devices connected to the IoT. The collected data, which may include personal data and location, is transmitted to a mobile device and then to the health care provider via an Internet Service Provider (ISP). Unfortunately, connecting a device to a network or sending data over a wide area network may make those devices and data vulnerable to unauthorized access. In this research, a secure 3-tier MIoT framework is proposed. Tier 1 includes the devices and sensors that collect data; these have limited resources and therefore cannot apply complex security and privacy algorithms. Tier 2 includes the devices that collect data from Tier 1 and submit it to Tier 3 via an Internet Service Provider (ISP). Tier 3 includes the Health Information System. The framework defines the controls needed between layers to secure user privacy and data, based on the Parkerian Hexad model.

Author 1: Nidal Turab
Author 2: Qasem Kharma

Keywords: MIoT; Perkerian Hexad; PRMS; Lightweight Encryption

PDF

Paper 9: Two Dimensional Electronic Nose for Vehicular Central Locking System (E-Nose-V)

Abstract: A new approach to vehicle security is proposed, tried and tested. The designed and tested system comprises an odor detection system (E-Nose) that sends signals corresponding to selected odors to the smart vehicle's Electronic Control Unit (ECU), which is interfaced to a smart system with neural networks. The signal is interpreted in time and space, whereby a certain ordered number of samples must be obtained before the vehicle functions are unlocked. Correlation of the rise and decay times and amplitudes of the signal is carried out to ensure security. The proposed system is highly secure and could be further developed to become a vital and integrated part of Intelligent Transportation Systems (ITS) through the addition of the driver's body odor as an extra measure of security and, in the event of an accident, to automatically call emergency services with driver identification and some diagnostics. Such a system can be utilized in smart cities.

Author 1: Mahmoud Zaki Iskandarani

Keywords: Odors; E-Nose; neural networks; security; correlation; smart vehicles; intelligent transportation systems; smart cities

PDF

Paper 10: Forensic Analysis using Text Clustering in the Age of Large Volume Data: A Review

Abstract: Exploring digital devices in order to generate digital evidence related to an incident under investigation is essential in modern digital investigation. The emergence of text clustering methods plays an important role in developing effective digital forensics techniques. However, the growing number of text sources and the volume of digital devices seized for analysis have become a significant issue over the years, and many studies have indicated that this issue should be resolved urgently. In this paper, a comprehensive review of digital forensic analysis using text clustering methods is presented, investigating the challenges that large-volume data poses for digital forensic techniques. Moreover, a meaningful classification and comparison of the text clustering methods that have been frequently used for forensic analysis are provided. The major challenges, with solutions and future research directions, are also highlighted to open the door for researchers in the area of digital forensics in the age of large-volume data.
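
For readers unfamiliar with the technique being reviewed, the following is a minimal example of text clustering of the kind covered by the survey: TF-IDF vectorisation followed by k-means. The documents are hypothetical stand-ins for text extracted from seized devices.

```python
# Minimal text-clustering example: TF-IDF vectorisation followed by k-means.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "meeting scheduled at the warehouse on friday",
    "invoice for shipment of spare parts attached",
    "transfer the payment to the usual account",
    "warehouse keys left with the night guard",
    "payment received, send the remaining invoice",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for doc, label in zip(documents, labels):
    print(label, doc)
```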

Author 1: Bandar Almaslukh

Keywords: Digital investigation; forensic analysis; text clustering

PDF

Paper 11: Internet of Things (IOT): Research Challenges and Future Applications

Abstract: With the Internet of Things (IoT) gradually evolving as the subsequent phase of the evolution of the Internet, it becomes crucial to recognize the various potential domains for the application of IoT, and the research challenges that are associated with these applications. Ranging from smart cities to health care, smart agriculture, logistics and retail, and even smart living and smart environments, IoT is expected to infiltrate virtually all aspects of daily life. Even though current IoT enabling technologies have greatly improved in recent years, there are still numerous problems that require attention. Since the IoT concept ensues from heterogeneous technologies, many research challenges are bound to arise. The fact that IoT is so expansive and affects practically all areas of our lives makes it a significant research topic for studies in various related fields such as information technology and computer science. Thus, IoT is paving the way for new dimensions of research to be carried out. This paper presents the recent development of IoT technologies and discusses future applications and research challenges.

Author 1: AbdelRahman H. Hussein

Keywords: Internet of Things; IoT applications; IoT challenges; future technologies; smart cities; smart environment; smart agriculture; smart living

PDF

Paper 12: Classification of Hand Movements based on Discrete Wavelet Transform and Enhanced Feature Extraction

Abstract: Extraction of potential electromyography (EMG) features plays an important role in EMG pattern recognition. In this paper, two EMG features, namely enhanced wavelength (EWL) and enhanced mean absolute value (EMAV), are proposed. EWL and EMAV are modified versions of wavelength (WL) and mean absolute value (MAV), which aim to enhance prediction accuracy in the classification of hand movements. Initially, the proposed features are extracted from the EMG signals via the discrete wavelet transform (DWT). The extracted features are then fed into machine learning algorithms for the classification process. Four popular machine learning algorithms, namely k-nearest neighbor (KNN), linear discriminant analysis (LDA), Naïve Bayes (NB) and the support vector machine (SVM), are used in the evaluation. To examine the effectiveness of EWL and EMAV, several conventional EMG features are used for performance comparison. In addition, the efficacy of EWL and EMAV when combined with other features is also investigated. Based on the results obtained, the combination of EWL and EMAV with other features can improve the classification performance. Thus, EWL and EMAV can be considered valuable tools for rehabilitation and clinical applications.
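
The pipeline described above can be sketched as follows using the conventional MAV and WL features (the enhanced EMAV/EWL definitions are given in the paper itself): DWT decomposition of an EMG window, feature extraction per sub-band, then k-NN classification. The synthetic signals, wavelet choice and decomposition level are assumptions; PyWavelets and scikit-learn are assumed available.

```python
# DWT sub-band feature extraction (conventional MAV and WL) + k-NN classification.
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def mav(x):            # mean absolute value
    return np.mean(np.abs(x))

def wl(x):             # wavelength: sum of absolute successive differences
    return np.sum(np.abs(np.diff(x)))

def features(window, wavelet="db4", level=3):
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (mav, wl)])

# Hypothetical dataset: 40 EMG windows of 256 samples, two movement classes.
rng = np.random.default_rng(0)
X = np.array([features(rng.standard_normal(256) * (1 + cls))
              for cls in (0, 1) for _ in range(20)])
y = np.repeat([0, 1], 20)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([features(rng.standard_normal(256) * 2)]))
```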

Author 1: Jingwei Too
Author 2: Abdul Rahim Abdullah
Author 3: Norhashimah Mohd Saad

Keywords: Electromyography; feature extraction; discrete wavelet transform; classification; pattern recognition

PDF

Paper 13: Modified Graph-theoretic Clustering Algorithm for Mining International Linkages of Philippine Higher Education Institutions

Abstract: Graph-theoretic clustering either uses limited neighborhood or construction of a minimum spanning tree to aid the clustering process. The latter is challenged by the need to identify and consequently eliminate inconsistent edges to achieve final clusters, detect outliers and partition substantially. This work focused on mining the data of the International Linkages of Philippine Higher Education Institutions by employing a modified graph-theoretic clustering algorithm with which the Prim’s Minimum Spanning Tree algorithm was used to construct a minimum spanning tree for the internationalization dataset infusing the properties of a small world network. Such properties are invoked by the computation of local clustering coefficient for the data elements in the limited neighborhood of data points established using the von Neumann Neighborhood. The overall result of the cluster validation using the Silhouette Index with a score of .69 indicates that there is an acceptable structure found in the clustering result – hence, a potential of the modified MST-based clustering algorithm. The Silhouette per cluster with .75 being the least score means that each cluster derived for r=5 by the von Neumann Neighborhood has a strong clustering structure.
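
A minimal MST-based clustering sketch of the kind the paper builds on is shown below: construct a minimum spanning tree, cut its longest ("inconsistent") edges to form clusters, and validate with the silhouette index. The paper's von Neumann neighbourhood and local clustering coefficient modifications are not reproduced here, and the data is synthetic.

```python
# Minimal MST-based clustering: build MST, cut the k-1 longest edges,
# then validate the partition with the silhouette index.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),
               rng.normal(3, 0.3, (20, 2)),
               rng.normal([0, 3], 0.3, (20, 2))])

dist = squareform(pdist(X))
mst = minimum_spanning_tree(dist).toarray()

# Remove the k-1 longest MST edges to produce k clusters.
k = 3
cut = np.sort(mst[mst > 0])[-(k - 1):].min()
mst[mst >= cut] = 0

n_clusters, labels = connected_components(mst, directed=False)
print(n_clusters, round(silhouette_score(X, labels), 2))
```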

Author 1: Sheila R. Lingaya
Author 2: Bobby D. Gerardo
Author 3: Ruji P. Medina

Keywords: MST-based clustering; Small World Network; von Neumann Neighborhood; internationalization; Prim’s MST

PDF

Paper 14: Digital Preservation of Cultural Heritage: Terengganu Brassware Craft Knowledge Base

Abstract: Early exposure to cultural heritage is necessary to preserve it from extinction. One form of cultural heritage that is now on the brink of extinction is the Terengganu brassware craft. Current young generations are mostly not interested in this heritage. Furthermore, intangible heritage in the form of knowledge and skills are only stored in the memory of the practitioners. Lack of documentation has led to the sole reliance on practitioners such that the knowledge is lost upon their demise. Hence, intangible heritage knowledge has to be acquired and stored in a knowledge base system to keep them in a systematic and permanent form. Manipulating and transferring the knowledge and skills will also ensure the continuity of this heritage, and ensure it can be accessed by future generations. This paper discusses the development of the knowledge base of Terengganu Brassware Craft as a digital preservation of cultural heritage. Knowledge acquisition was carried out using interview and observation techniques. Then, the knowledge is represented using ontology. This knowledge, in digital form, can be manipulated and disseminated to the community to ensure the continuity of the knowledge.

Author 1: Wan Malini Wan Isa
Author 2: Nor Azan Mat Zin
Author 3: Fadhilah Rosdi
Author 4: Hafiz Mohd Sarim

Keywords: Cultural heritage; digital preservation; intangible cultural heritage; knowledge acquisition; knowledge base; knowledge representation; ontology

PDF

Paper 15: Computer Students Attitudes on the Integration of m-Learning Applications

Abstract: Technology plays an important role in our lives nowadays, particularly in the field of education, because of its accessibility and affordability. Mobile learning (m-Learning), which is a form of e-learning, is a novel approach in the arena of educational technology that offers personal, informal, blended and flexible learning opportunities to learners and instructors. The present study attempts to determine computer students' attitudes towards the integration of an m-Learning app (WhatsApp). A total of 143 students participated in the experiment. The study used a quasi-experimental research design, with the learners forming intact groups at a public university. They were asked to complete assignments through the WhatsApp application. Two questionnaires were used to gather the data, which were analyzed descriptively. The findings showed that learners had positive attitudes towards the integration of m-Learning apps. The study also reports suggestions for future research and implications for teachers.

Author 1: Abdulmohsin S. Alkhunaizan

Keywords: e-Learning; m-Learning; mobile applications; computer science; WhatsApp

PDF

Paper 16: Exploring the Use of Digital Games as a Persuasive Tool in Teaching Islamic Knowledge for Muslim Children

Abstract: Various digital games have been developed that focus on providing a sense of enjoyment and excitement for their players, serving as a modern tool for releasing stress or simply for pleasure. In recent years, digital games have also been used for teaching and learning. For example, in the History subject, games have been used for retelling historical stories and, at the same time, for preserving history for the next generation to learn, understand and appreciate. Similarly, digital games with Islamic values have been developed to teach Islamic values or knowledge to players, in other words to persuade players to learn or improve their knowledge of Islam. Many designers assume that games can be used as a persuasive tool to influence players to learn and understand Islam as a way of life. However, no prior research has examined the perception of players before and after playing Islamic digital games. To this end, this paper investigates whether Islamic digital games can persuade gamers to understand Islam by exploring the use of these games among gamers. A total of 20 school children voluntarily participated in the experiment, and the findings are reported in this paper. The study found positive effects on the users' perception toward playing digital games embedded with Islamic values.

Author 1: Madihah Sheikh Abdul Aziz
Author 2: Panadda Auyphorn
Author 3: Mohd Syarqawy Hamzah

Keywords: Digital games; persuasive tool; Islamic knowledge; Islamic values

PDF

Paper 17: Brain-Controlled for Changing Modular Robot Configuration by Employing Neurosky’s Headset

Abstract: Currently, Brain Computer Interface (BCI) systems are designed mostly for control or navigation purposes and are mainly employed with mobile robots, manipulator robots and humanoid robots using Motor Imagery. This study presents an implementation of a BCI system for the Modular Self-Reconfigurable (MSR) Dtto robot, so that the robot is able to propagate multiple configurations based on EEG brain signals. In this work, a NeuroSky MindWave Mobile EEG headset is used, and a framework for controlling the Dtto robot by EEG signals, processed with the OpenViBE software, is built. A connection is established between the NeuroSky headset and OpenViBE, where a Motor Imagery BCI is created to receive and process the EEG data in real time. The main idea of the developed system is to associate a direction (left, right, up and down), based on hand and feet Motor Imagery, with a command for Dtto robot control. The directions from OpenViBE are sent via the Lab Streaming Layer (LSL) and transmitted via Python software to the Arduino controllers in the robots. To test the system performance, real-time experiments were conducted, and the results are discussed in this paper.
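
A minimal sketch of the bridging step described above (LSL labels in, serial commands out) is given below. The stream name, the label-to-command mapping and the serial port are assumptions, not details taken from the paper; the pylsl and pyserial packages are assumed.

```python
# Read classified motor-imagery labels from a Lab Streaming Layer (LSL) outlet
# and forward one-character commands to the robot's Arduino over a serial port.
from pylsl import StreamInlet, resolve_byprop
import serial

COMMANDS = {0: b'L', 1: b'R', 2: b'U', 3: b'D'}   # left, right, up, down

# Hypothetical stream name and serial port.
streams = resolve_byprop('name', 'MotorImageryLabels', timeout=10.0)
inlet = StreamInlet(streams[0])
arduino = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)

try:
    while True:
        sample, _timestamp = inlet.pull_sample(timeout=5.0)
        if sample is None:
            continue                      # no new classification yet
        command = COMMANDS.get(int(sample[0]))
        if command:
            arduino.write(command)        # Arduino maps the letter to a motion
finally:
    arduino.close()
```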

Author 1: Muhammad Haziq Hasbulah
Author 2: Fairul Azni Jafar
Author 3: Mohd. Hisham Nordin
Author 4: Kazutaka Yokota

Keywords: Dtto robot; motor imagery; OpenVibe; modular robot; configuration; communication

PDF

Paper 18: Analysis of Spatially Modelled High Temperature Polymer Electrolyte Membrane Fuel Cell under Dynamic Load Conditions

Abstract: This paper presents an approach to observing the effects of load variations on the performance of a high temperature polymer electrolyte membrane fuel cell system, in terms of hydrogen and air flow rates, output voltage, power and efficiency. The main advantage of this approach is the ability to analyse the internal behaviour of the fuel cell, such as its current-voltage characteristics during energy conversion, when the load varies dynamically. The approach models the fuel cell system by integrating a 3D COMSOL model of the high temperature polymer electrolyte membrane fuel cell with a MATLAB/Simulink model of the fuel cell system. The MATLAB/Simulink model includes the fuel cell stack (single cell), the load (a sequence of currents), the air supply system (air compressor), the fuel supply system (hydrogen tank), and a power-efficiency block. The MATLAB/Simulink model is developed in such a way that one part behaves as an input model to the 3D COMSOL model of the fuel cell system, whereas the second part behaves as an output model that recovers the results obtained from the 3D COMSOL model of the fuel cell. This approach to power system modelling is useful for showing the performance of a high temperature polymer electrolyte membrane fuel cell in a much more accurate way.

Author 1: Jagdesh Kumar
Author 2: Jherna Devi
Author 3: Ghulam Mustafa Bhutto
Author 4: Sajida Parveen
Author 5: Muhammad Shafiq

Keywords: Current-voltage characteristics; energy conversion; fuel cells; power system modeling; power system simulation

PDF

Paper 19: A Collaborative Filtering Recommender System Model for Recommending Intervention to Improve Elderly Well-being

Abstract: In improving elderly well-being nowadays, people at home or in health care centres mostly focus on guarding and monitoring the elderly using tools such as CCTV, robots, and other appliances that require a great deal of cost and neat fixtures to prevent damage. Elderly observation using recommender systems has been implemented, but only focusing on single aspects such as nutrition or health. However, it is important to give interventions to the elderly by concentrating on the multiple aspects of successful ageing, such as social, environmental, health, physical and mental aspects, so as to help elderly people achieve successful ageing and improve their well-being. In this paper, two recommender system models are proposed to recommend interventions for improving elderly well-being across the multiple aspects of successful ageing. These models use a Collaborative Filtering (CF) technique to recommend interventions to an elderly person based on the interventions given to other elderly people who have similar conditions to the user. The process of recommending interventions involves the generation of user profiles representing the elderly person's conditions in multiple aspects of successful ageing. It also applies the k-Nearest Neighbor (kNN) method to find users with similar conditions and recommends interventions based on those given to the similar users. An experiment was conducted to determine the performance of the proposed Collaborative Filtering (CF) recommender system and the Collaborative Filtering and Profile Matching (CFS) model compared to Basic Search (BS). The results showed that the CF recommender system and the CFS model outperformed BS in terms of precision, recall and F1 measure, indicating that the proposed models are effective in recommending interventions using elderly profiles based on multiple aspects of successful ageing.
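
A toy sketch of the kNN-based collaborative filtering idea follows: represent each elderly person by a profile over successful-ageing aspects, find the most similar existing profiles, and recommend the interventions given to them. The profiles, aspects and interventions below are hypothetical, not the paper's data.

```python
# kNN-based collaborative filtering over hypothetical elderly profiles.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Profile aspects: [social, environment, health, physical, mental] in [0, 1].
profiles = np.array([
    [0.2, 0.5, 0.3, 0.4, 0.6],
    [0.8, 0.7, 0.6, 0.7, 0.8],
    [0.3, 0.4, 0.2, 0.3, 0.5],
    [0.7, 0.8, 0.7, 0.6, 0.9],
])
interventions = [
    {"weekly social club", "home safety check"},
    {"walking group"},
    {"dietary counselling", "weekly social club"},
    {"memory training"},
]

knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(profiles)

# New user: recommend the union of interventions given to the 2 nearest profiles.
new_profile = np.array([[0.25, 0.45, 0.25, 0.35, 0.55]])
_, idx = knn.kneighbors(new_profile)
recommended = set().union(*(interventions[i] for i in idx[0]))
print(recommended)
```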

Author 1: Aini Khairani Azmi
Author 2: Noraswaliza Abdullah
Author 3: Nurul Akmar Emran

Keywords: Collaborative filtering; elderly well-being; k-nearest neighbor; recommender system; successful ageing

PDF

Paper 20: Forecasting Feature Selection based on Single Exponential Smoothing using Wrapper Method

Abstract: Feature selection is one way to simplify the classification process. The aim is to use only the selected features in the classification process without decreasing its performance compared with classification without feature selection. This research uses a new feature matrix as the basis for selection. This feature matrix contains forecasting results obtained using Single Exponential Smoothing (FMF(SES)). The method uses the wrapper method of GASVM and is named FMF(SES)-GASVM. The result of this research is compared with other methods such as GA Bayes, Forward Bayes and Backward Bayes. The results show that FMF(SES)-GASVM has the highest accuracy compared with FMF(SES)-GA Bayes, FMF(SES)-Forward Bayes and FMF(SES)-Backward Bayes; however, the number of selected features is larger than with FMF(SES)-GA Bayes and FMF(SES)-Forward Bayes.
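
For reference, single exponential smoothing produces each forecast as a weighted average of the latest observation and the previous smoothed value, as sketched below. The smoothing constant and series are illustrative; the GASVM wrapper search over the resulting feature matrix is not reproduced here.

```python
# Single exponential smoothing (SES): s_t = alpha * x_t + (1 - alpha) * s_{t-1}.
def ses(series, alpha=0.3):
    """Return the SES-smoothed values (one-step-ahead forecasts) of a series."""
    smoothed = [series[0]]                 # initialise with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

values = [12.0, 13.5, 12.8, 14.1, 15.0, 14.6]
print([round(s, 2) for s in ses(values, alpha=0.3)])
```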

Author 1: Ani Dijah Rahajoe

Keywords: Single exponential smoothing; forecasting; feature selection; genetic algorithm

PDF

Paper 21: A Review on the Verification Approaches and Tools used to Verify the Correctness of Security Algorithms and Protocols

Abstract: Security algorithms and protocols are essential components that must be built into systems and their structures to provide the best performance. These protocols and systems should go through verification and testing processes in order to be more efficient and accurate. In software testing, traditional methods are used for accuracy checking; however, they cannot fulfil all the testing requirements. Formal verification approaches are considered the best environment for checking security properties. The available literature discusses several approaches for developing robust formal verification methods that address and analyse the errors systems face, whether during the implementation process, under unknown attacks, or against a nondeterministic adversary acting on the security protocols and algorithms. In this paper, a comprehensive review of the main formal verification approaches, such as model checking and theorem proving, has been conducted. Moreover, the use of verification tools is briefly presented and explained thoroughly. These formal verification methods can be involved in the design and redesign of security protocols and algorithms, based on standards and sizes determined by the analysis these techniques provide. The critical analysis of the methods used to verify the security of systems showed that model checking approaches and their tools were the most used among all the reviewed methods.

Author 1: Mohammed Abdulqawi Saleh Al-humaikani
Author 2: Lukman Bin Ab Rahim

Keywords: Security algorithms; security protocols; formal verification approaches; model checking; theorem proving

PDF

Paper 22: Performance Comparison of Detection, Recognition and Tracking Rates of the different Algorithms

Abstract: This article discusses an approach to human detection and tracking in a homogeneous domain using surveillance cameras. This is a vast area in which significant research has been taking place for more than a decade. The paper addresses the detection of a human and their face in a given video and stores Local Binary Pattern Histogram (LBPH) features of the detected faces. Once a human is detected in the video, that person is given a label and is then tracked in different videos taken by multiple cameras through the application of machine learning and image processing with the help of OpenCV. Many algorithms have been used for detection, recognition and tracking to date; thus, the main contribution of this paper is the comparison of the proposed algorithm with some of the state-of-the-art algorithms, showing how the proposed algorithm performs better than the other chosen algorithms.
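
A minimal OpenCV sketch of the detect-then-recognise step (Haar cascade detection followed by LBPH recognition) is shown below. It is not the paper's algorithm; the image and video paths are hypothetical, and the opencv-contrib-python package is assumed because the LBPH recogniser lives in the contrib modules.

```python
# Haar-cascade face detection + LBPH recognition across camera footage.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
recognizer = cv2.face.LBPHFaceRecognizer_create()

# Training: grayscale face crops and their integer person labels (hypothetical files).
faces, labels = [], []
for path, label in [("person0_a.jpg", 0), ("person0_b.jpg", 0), ("person1_a.jpg", 1)]:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    for (x, y, w, h) in detector.detectMultiScale(img, 1.1, 5):
        faces.append(img[y:y + h, x:x + w])
        labels.append(label)
recognizer.train(faces, np.array(labels, dtype=np.int32))

# Tracking in another camera's video: predict the label of each detected face.
cap = cv2.VideoCapture("camera2.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        label, confidence = recognizer.predict(gray[y:y + h, x:x + w])
        print("detected person", label, "confidence", round(confidence, 1))
cap.release()
```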

Author 1: Meghana Kavuri
Author 2: Kolla Bhanu Prakash

Keywords: Detection; recognition; tracking; local binary pattern histogram; Kalman filter; particle filter

PDF

Paper 23: QoS-based Semantic Micro Services Discovery and Composition using ACO Algorithm

Abstract: In this paper, we present a new model of e-Learning platforms based on semantic micro services, using discovery, selection and composition methods to generate learning paths. In this model, each semantic micro service represents an elementary educational resource, which can be a course, an exercise, a tutorial or an evaluation, implementing a precise learning path objective. The semantic micro services are described using ontologies and deployed in multiple instances in a cloud environment according to a load balancing and fault tolerance system. Learners' requests are sent to a proxy micro service holding abstract learning path structures represented as an oriented graph. The proxy micro service analyses the request to define the learner's profile and context in order to provide the learner with the semantic micro services responsible for the educational resources satisfying their functional and non-functional needs. In this model, a two-step process is employed to achieve optimal learning path generation: local optimization uses semantic discovery and selection based on a matchmaking algorithm and a quality of service measurement, and global optimization adopts an ant colony optimization algorithm to select the best resource combination. Our experimental results show that the proposed model effectively returns optimized learning paths considering individual, collective and pedagogical factors.

Author 1: Ahmed ESSAYAH
Author 2: Mohamed Youssfi
Author 3: Omar Bouattane
Author 4: Khalifa Mansouri
Author 5: Elhocein Illoussamen

Keywords: Semantic micro service; quality of service; learning path; e-learning platform; service discovery and composition; ant colony optimization algorithm

PDF

Paper 24: An Approach to Control the Positional Accuracy of Point Features in Volunteered Geographic Information Systems

Abstract: Volunteered geographic information (VGI) is a huge source of user-generated geographic information. There is an enormous potential to use VGI in different mapping activities due to its significant advantages. VGI is found to be richer and more up-to-date than authoritative geographic information. However, VGI quality is an obvious challenge that needs to be addressed in order to get the full potential of VGI. Positional accuracy is one of the important aspects of VGI quality. Although VGI positional accuracy can be high in some contexts, VGI datasets are characterized by a large spatial heterogeneity. This paper proposes an approach for controlling positional accuracy as well as decreasing the spatial heterogeneity of point features in VGI systems. A case study has been conducted in order to ensure the applicability and effectiveness of the proposed approach.
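
A basic building block of such a positional-accuracy control is sketched below: compute the great-circle (haversine) distance between each volunteered point and its authoritative reference point, and flag contributions whose error exceeds a tolerance. The coordinates and tolerance are hypothetical; the paper's full approach, including the handling of spatial heterogeneity, is not reproduced.

```python
# Haversine-based positional error check for volunteered point features.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

TOLERANCE_M = 15.0                           # illustrative acceptance threshold
volunteered = {"cafe": (30.0445, 31.2358), "school": (30.0500, 31.2400)}
reference   = {"cafe": (30.0446, 31.2357), "school": (30.0493, 31.2412)}

for name, (vlat, vlon) in volunteered.items():
    rlat, rlon = reference[name]
    err = haversine_m(vlat, vlon, rlat, rlon)
    status = "accept" if err <= TOLERANCE_M else "flag for review"
    print(f"{name}: error = {err:.1f} m -> {status}")
```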

Author 1: Mennatallah H. Ibrahim
Author 2: Nagy Ramadan Darwish
Author 3: Hesham A. Hefny

Keywords: Volunteered geographic information; quality control; positional accuracy; point features

PDF

Paper 25: Android Security Development: Spyware Detection, Apps Secure Level and Data Encryption Improvement

Abstract: Most Android users are unaware that their smartphones are as vulnerable as any computer, and that permission granting by Android users is an important part of maintaining the security of Android smartphones. We present a method that uses manifest files to determine the presence of spyware and the security level of apps. Furthermore, to ensure that no data is leaked from Android smartphones, we propose a new method for the encryption of data from Google Suite applications.

Author 1: Lim Wei Xian
Author 2: Chan Shao Hong
Author 3: Yap Ming Jie
Author 4: Azween Abdullah
Author 5: Mahadevan Supramaniam

Keywords: Android; spyware detection; security level index; data encryption

PDF

Paper 26: An Aspect Oriented Programming Framework to Support Transparent Runtime Monitoring of Applications

Abstract: Monitoring the runtime state and behavior of applications is very important for evaluating the performance of these applications and inspecting their behavior. In the case of legacy applications that have been developed without monitoring capabilities, accomplishing runtime state monitoring is a real challenge. This research redefines the runtime monitoring concept and then presents an Aspect Oriented Programming (AOP) framework to equip applications with the capability to monitor their runtime state transparently. The framework, called the RM Framework, supports three monitoring modes: Invasive-mode, Controlled-mode/(Functionality and Attribute), and Controlled-mode/Selective. The framework is applied to a Java application as a case study. The results show smooth integration between the application and the runtime monitoring capabilities without affecting the target application's consistency.

Author 1: Abdullah O. AL-Zaghameem

Keywords: Runtime state monitoring; application behavior; aspect oriented programming technique; statistical analysis; bytecode transformation

PDF

Paper 27: A Novel Intelligent Cluster-Head (ICH) to Mitigate the Handover Problem of Clustering in VANETs

Abstract: The huge growth in the number of manufactured vehicles has resulted in many people losing their lives in accidents, which has made vehicular ad-hoc networks (VANETs) a hot topic for enabling improved communication between vehicles aimed at reducing the loss of life. The main challenge in this area is vehicle mobility, which has a direct effect on network stability. Thus, most previous studies that discussed clustering focused on cluster formation, cluster-head selection and cluster stability to reduce the impact of mobility on the network, with little attention given to clusters passing from one base station to a neighbouring base station. Therefore, this study focuses on the handover problem that occurs, after cluster formation and cluster-head election, when a cluster passes from base station to base station through the so-called overlapping area. As a cluster in an overlapping area receives two signals from different base stations, the signal arriving at the cluster becomes weak due to interference between the two frequencies, resulting in loss of cluster information in the overlapping area. This study proposes a novel method named Intelligent Cluster-Head (ICH), a controller over two clusters that is used to change the uplink between clusters in order to solve the handover problem in the overlapping area. The proposed method was evaluated against the VMaSC-1hop method and achieved, at the cell edges, a packet loss of up to 0.8%, a packet delivery ratio (PDR) of 99%, a disconnected-link percentage of 0.12% and a network efficiency of 99%.

Author 1: A. H. Abbas
Author 2: Mohammed I. Habelalmateen
Author 3: L. Audah
Author 4: N.A.M. Alduais

Keywords: Vehicular Ad-Hoc networks; ITS; clustering; overlapping area; handover; ICH

PDF

Paper 28: Weld Defect Categorization from Welding Current using Principal Component Analysis

Abstract: Real-time welding quality control remains a challenging task due to the dynamic characteristics of welding. The welding current of gas metal arc welding possesses valuable information that can be analyzed for weld quality assessment purposes, and on-line monitoring of the current can provide information about the welding process. In this study, current signals obtained during welding in the short-circuit metal transfer mode were used for real-time categorization of deliberately induced weld defects and good welds. A Hall-effect current sensor was placed on the ground wire of the welding machine to acquire the welding current signals during the welding process. Vector reduction of the current signals in the time domain was achieved by principal component analysis. The reduced vector was then classified by various classification techniques, namely support vector machines, decision trees and nearest neighbor, to categorize the arc weld defects or pass the weld as good. The proposed technique proved successful, with accurate classification of the welding categories using all three classifiers. The classification is fast enough to be used for real-time weld quality control, as all the signal processing is carried out in the time domain.
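
The PCA-then-classify step can be sketched as follows: reduce windows of welding-current samples with principal component analysis and classify them with an SVM. The current signals below are synthetic stand-ins for the measured data, and the number of components is an assumption.

```python
# PCA dimensionality reduction of current windows followed by SVM classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_class, n_samples = 60, 500

good = rng.normal(150, 5, (n_per_class, n_samples))                 # stable arc
defect = rng.normal(150, 5, (n_per_class, n_samples))
defect[:, 200:260] += rng.normal(40, 10, (n_per_class, 60))         # current disturbance

X = np.vstack([good, defect])
y = np.array([0] * n_per_class + [1] * n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```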

Author 1: Hayri Arabaci
Author 2: Salman Laving

Keywords: Arc weld defects; feature extraction; PCA; classification techniques; on-line monitoring

PDF

Paper 29: Moving Object Detection in Highly Corrupted Noise using Analysis of Variance

Abstract: This paper implements a three-way nested design to mark moving objects in a sequence of images. The algorithm performs object detection through image motion analysis. The inter-frame changes (level-A) are marked as temporal content, while the intra-frame variations identify critical information. The spatial details are marked at two granular levels, comprising level-B and level-C. The segmentation is performed using analysis of variance (ANOVA). The algorithm gives excellent results in situations where images are corrupted with heavy Gaussian noise ~N(0,100). The sample images are selected from four categories: 'baseline', 'dynamic background', 'camera jitter', and 'shadows'. Results are compared with previously published results on four measures: false positive rate (FPR), false negative rate (FNR), percentage of wrong classification (PWC), and an F-measure. The qualitative and quantitative results show that the technique outperforms the previously reported results by a significant margin.
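
For reference, the four evaluation measures mentioned above can be computed from pixel-level counts of true/false positives and negatives, as in the standard change-detection definitions below; the counts used in the example are hypothetical.

```python
# Standard change-detection metrics from pixel-level confusion counts.
def detection_metrics(tp, fp, tn, fn):
    fpr = fp / (fp + tn)                              # false positive rate
    fnr = fn / (fn + tp)                              # false negative rate
    pwc = 100.0 * (fp + fn) / (tp + fp + tn + fn)     # % of wrong classifications
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return {"FPR": fpr, "FNR": fnr, "PWC": pwc, "F-measure": f_measure}

# Hypothetical counts for one test sequence.
print(detection_metrics(tp=9200, fp=400, tn=89000, fn=1400))
```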

Author 1: Asim ur Rehman Khan
Author 2: Muhammad Burhan Khan
Author 3: Haider Mehdi
Author 4: Syed Muhammad Atif Saleem

Keywords: Analysis of variance (ANOVA); image motion analysis; object detection

PDF

Paper 30: Smart Home Energy Management System Design: A Realistic Autonomous V2H / H2V Hybrid Energy Storage System

Abstract: Using the hybrid fuel cell electric vehicle to supply household power during peak use is another opportunity to reduce emissions and save money. For this reason, Vehicle-to-Home (V2H) and Home-to-Vehicle (H2V) systems have been proposed as a new method of exchanging smart energy. The main goal of this paper is to develop a smart home energy management system based on IoT, generate more energy efficiency and share production between home and vehicle. In fact, the hybrid fuel cell electric vehicle is used to power household appliances during peak electricity demand in order to balance energy consumption. The household's energy is derived from a realistic autonomous hybrid power system. Several technologies, such as a Proton Exchange Membrane Fuel Cell (PEMFC), a solar panel, a supercapacitor (SC) device and a water electrolyzer, are incorporated into the proposed system. Bidirectional electrical energy is exchanged between the PEMFC hybrid electric vehicle and the household by discharging the vehicle's energy storage to balance energy demand and supply. To this end, a smart energy management unit (EMU) is developed and discussed in order to meet the estimated fuel consumption mitigation goals during peak periods when demand is highest, coordinating between household power and vehicle energy storage. The presented design is simulated for one day in Matlab/Simulink, based on an experimental database extracted from household power, to demonstrate the effectiveness of the proposed strategy and its effects on V2H/H2V operations.

Author 1: Bassam Zafar
Author 2: Ben Slama Sami
Author 3: Sihem Nasri
Author 4: Marwan Mahmoud

Keywords: Home energy management system; fuel cell; super-capacitor; solar power; vehicle to home; home to vehicle

PDF

Paper 31: Evaluation of the Performance of the University Information Systems: Case of Moroccan Universities

Abstract: The purpose of this paper is to develop a conceptual model for measuring the performance of university information systems. To do this, the 3E-3P model was chosen. This model is developed within the framework of the systemic approach. The objective is to provide decision-makers with a tool for understanding the dynamics of performance measurement. The model is based on a logic of decomposing global performance into three partial performances. The measurement is carried out for each pillar individually using a multi-criteria approach (MACBETH), and the three partial performances are subsequently consolidated with the same multi-criteria logic.

Author 1: Ayoub Gacim
Author 2: Hicham Drissi
Author 3: Abdelwahed Namir

Keywords: Component; information system; performance; multicriteria modeling; university

PDF

Paper 32: Immuno-Computing-based Neural Learning for Data Classification

Abstract: The paper proposes two algorithms based on the artificial immune system of the human body, the Clonal Selection Algorithm (CSA) and a modified version of the Clonal Selection Algorithm (MCSA), and uses them to train a neural network. Conventional artificial neural network training algorithms such as backpropagation have the disadvantage that they can get trapped in local optima, so the neural network is often incapable of obtaining the best solution to the given problem. In the proposed CSA algorithm, the initial random weights chosen for the neural network are considered a foreign body, called an antigen. Just as the human body creates several antibodies to fight an antigen, the CSA algorithm creates antibodies to fight the antigen. Each antibody is evaluated based on its affinity, and clones are generated for each antibody. The number of clones depends on the algorithm: in CSA the number of clones is fixed, while in MCSA it is directly proportional to the affinity of the antibody. Mutation is performed on the clones to improve their affinity. The best antibody that emerges becomes the antigen for the next round, and the process is repeated for several iterations until the best antibody that satisfies the chosen criterion is found. The best antibody is problem specific; for neural network training for data classification, it represents the set of weights and biases that gives the least error. The efficiency of the algorithm was analyzed using the Iris dataset. The prediction accuracy of the algorithms was compared with other nature-inspired algorithms, such as Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and standard backpropagation. The performance of MCSA was ahead of the other algorithms, with an accuracy of 99.33%.
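
A compact clonal selection loop following the description above is sketched below: antibodies are candidate weight vectors, clones are generated per antibody (proportional to affinity, in the MCSA spirit), clones are mutated, and the best candidate survives to the next generation. The cost function here is a stand-in for the network's classification error, and the clone counts and mutation scale are assumptions.

```python
# Compact clonal selection sketch over candidate weight vectors.
import numpy as np

rng = np.random.default_rng(0)

def cost(antibody):                    # stand-in for network training error
    return np.sum(antibody ** 2)

dim, pop_size, generations = 10, 20, 100
population = rng.normal(0, 1, (pop_size, dim))

for _ in range(generations):
    affinities = 1.0 / (1.0 + np.array([cost(a) for a in population]))
    new_population = []
    for idx in range(pop_size):
        n_clones = max(1, int(round(10 * affinities[idx])))   # MCSA-style cloning
        sigma = 0.5 / (1.0 + affinities[idx])                 # less mutation when fit
        clones = population[idx] + rng.normal(0, sigma, (n_clones, dim))
        candidates = np.vstack([population[idx][None, :], clones])
        best = candidates[np.argmin([cost(c) for c in candidates])]
        new_population.append(best)                           # keep the fittest clone
    population = np.array(new_population)

best = population[np.argmin([cost(a) for a in population])]
print("best cost:", round(cost(best), 4))
```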

Author 1: Ali Al Bataineh
Author 2: Devinder Kaur

Keywords: AIS; CSA; MCSA; ACO; BBNN

PDF

Paper 33: Predictive Method for Service Composition in Heterogeneous Environments within Client Requirements

Abstract: Cloud computing is a new delivery model for Information Technology services. Many actors and parameters play an important role in the provisioning of dynamically elastic and virtualized resources at the infrastructure, platform, and software levels. Nowadays, many cloud services compete and often present similar offers. From the customer's side, it is not always easy to select a suitable service according to customer requirements and cloud service scoring. In a real-world scenario, this is more complicated, since service scoring may change over time and depends on many parameters such as hardware, network infrastructure, customer demand, etc. To tackle this issue, this research work presents a novel approach for predicting the future score of any service in order to satisfy user requirements when executing service composition in cloud environments. The approach uses regression techniques to predict the expected future offer of a service based on samples of the service's history as well as user expectations.
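
A tiny illustration of the regression idea is given below: predict a service's future QoS score from its scoring history. The robust Huber regressor is used here only as one example of a regression technique that tolerates outliers; the history data is hypothetical and not taken from the paper.

```python
# Robust regression on a hypothetical service-score history.
import numpy as np
from sklearn.linear_model import HuberRegressor

days = np.arange(30).reshape(-1, 1)                  # sampling times
scores = 0.70 + 0.004 * days.ravel() + np.random.default_rng(0).normal(0, 0.01, 30)
scores[12] = 0.30                                    # one outlier measurement

model = HuberRegressor().fit(days, scores)
future_day = np.array([[37]])                        # one week ahead
print("predicted score:", round(model.predict(future_day)[0], 3))
```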

Author 1: Saleh M. Altowaijri

Keywords: Workflow; robust regression; prediction; cloud computing

PDF

Paper 34: Spectral Classification of a Set of Hyperspectral Images using the Convolutional Neural Network, in a Single Training

Abstract: Hyperspectral imagery has seen great evolution in recent years. Consequently, several fields (medicine, agriculture, geosciences) need automatic classification of these hyperspectral images with a high rate and in an acceptable time. The state of the art presents several classification algorithms based on the Convolutional Neural Network (CNN), each of which is trained on part of an image and then performs prediction on the rest. This article proposes a new fast spectral classification algorithm based on CNN, which builds a composite image from multiple hyperspectral images and then trains the model only once on the composite image. After training, the model can predict each image separately. To test the validity of the proposed algorithm, two freely available hyperspectral images are taken, and the training time obtained by the proposed model on the composite image is better than the time obtained by the state-of-the-art model.

Author 1: Abdelali Zbakh
Author 2: Zoubida Alaoui Mdaghri
Author 3: Abdelillah Benyoussef
Author 4: Abdellah El Kenz
Author 5: Mourad El Yadari

Keywords: Classification; spectral; Convolutional Neural Network (CNN); deep learning; hyperspectral data; neural network

PDF

Paper 35: Optical Recognition of Isolated Machine Printed Sindhi Characters using Fourier Descriptors

Abstract: Scale invariance plays an essential role in pattern recognition applications, for example in computer vision, OCR (Optical Character Recognition), electronic publishing, etc. In this paper, shape-based feature extraction techniques are used for their invariance properties, and region-based Fourier Descriptors (FD) are used for the recognition of isolated printed Sindhi characters. There are 56 isolated characters in the Sindhi language, which can be categorized into 20 different classes considering the shape of the base of each character. In this work, the dataset contains 4704 images of isolated printed Sindhi characters. The simulation results show that the proposed method is capable of discriminating similar Sindhi characters and can easily extract scale-invariant features.
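
The following sketch illustrates how Fourier descriptors achieve scale invariance, using the simpler contour-based variant (the paper uses region-based descriptors): the boundary is treated as a complex sequence, transformed with the FFT, and the coefficient magnitudes are normalised so that scaling the glyph does not change the feature vector.

```python
# Scale-invariant contour-based Fourier descriptors (illustration only).
import numpy as np

def fourier_descriptors(boundary_xy, n_coeffs=16):
    z = boundary_xy[:, 0] + 1j * boundary_xy[:, 1]   # boundary as complex numbers
    F = np.fft.fft(z - z.mean())                     # remove position (translation)
    mags = np.abs(F)
    mags = mags / mags[1]                            # divide by |F1| -> scale invariant
    return mags[1:n_coeffs + 1]

# A circle sampled at 64 points, and the same circle scaled by a factor of 3.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
print(np.allclose(fourier_descriptors(circle), fourier_descriptors(3 * circle)))
```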

Author 1: Nasreen Nizamani
Author 2: Mujtaba Shaikh
Author 3: Jawed Unar
Author 4: Ehsan Ali
Author 5: Ghulam Mustafa Bhutto
Author 6: Abdul Rafay

Keywords: Features extraction; Sindhi optical character recognition; Fourier Descriptors; machine printed Sindhi characters

PDF

Paper 36: Junction Point Detection and Identification of Broken Character in Touching Arabic Handwritten Text using Overlapping Set Theory

Abstract: Touching characters are formed when two or more characters share the same space. Segmentation of these touching characters is therefore a very challenging research topic, especially for degraded handwritten Arabic documents, and is one of the key issues in the recognition of handwritten Arabic text. In order to make recognition systems more effective, segmentation of these touching handwritten Arabic characters is considered a very important research area. In this research, a new method is proposed to identify the junction, or common point, of a touching Arabic word image by applying the overlapping or intersection operation of set theory, which helps to trace the correct boundary of the touching characters, identify broken characters and segment this touching handwritten text efficiently. The proposed method has been evaluated on touching Arabic handwritten characters taken from handwritten datasets, and the results show its efficiency. The proposed method is applicable to both degraded handwritten documents and printed documents.

Author 1: Inam Ullah
Author 2: Mohd Sanusi Azmi
Author 3: Mohamad Ishak Desa

Keywords: Touching characters; segmentation and recognition; overlapping set theory; junction point; broken character

PDF

Paper 37: Implementation of Machine Learning Model to Predict Heart Failure Disease

Abstract: In the current era, Heart Failure (HF) is one of the common diseases that can lead to a dangerous situation. Every year almost 26 million patients are affected by this kind of disease. From the heart consultant's and surgeon's point of view, it is difficult to predict heart failure at the right time. Fortunately, classification and prediction models exist that can aid the medical field and illustrate how to use medical data in an efficient way. This paper aims to improve HF prediction accuracy using the UCI heart disease dataset. For this, multiple machine learning approaches were used to understand the data and predict the chances of HF in a medical database. Furthermore, the results and a comparative study show that the current work improved the previous accuracy scores in predicting heart disease. The integration of the machine learning model presented in this study with medical information systems would be useful for predicting HF or any other disease using live data collected from patients.
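
A minimal scikit-learn pipeline of the kind typically used on the UCI heart-disease data is sketched below. The CSV path and the "target" column name are assumptions about how the dataset has been stored locally, and the classifier choice is illustrative rather than the paper's best model.

```python
# Cross-validated classification pipeline for locally stored UCI heart-disease data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("heart.csv")            # hypothetical local copy of the dataset
X = df.drop(columns=["target"])          # clinical attributes (age, chol, thalach, ...)
y = df["target"]                         # 1 = presence of heart disease

model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print("mean 5-fold accuracy:", scores.mean().round(3))
```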

Author 1: Fahd Saleh Alotaibi

Keywords: Machine learning model; medical data; heart failure diagnoses

PDF

Paper 38: Hyperparameter Optimization in Convolutional Neural Network using Genetic Algorithms

Abstract: Optimizing hyperparameters in a Convolutional Neural Network (CNN) is a tedious problem for many researchers and practitioners. To obtain hyperparameters with better performance, experts are required to configure a set of hyperparameter choices manually. The best results of this manual configuration are thereafter modeled and implemented in the CNN. However, different datasets require different models or combinations of hyperparameters, which can be cumbersome and tedious. To address this, several approaches have been proposed, such as grid search, which is limited to low-dimensional spaces, and random search, which relies on random selection. Optimization methods such as evolutionary algorithms and Bayesian optimization have also been tested on the MNIST dataset, which is less costly and requires fewer hyperparameters than the CIFAR-10 dataset. In this paper, the authors investigate hyperparameter search methods on the CIFAR-10 dataset. During the investigation with various optimization methods, performance in terms of accuracy is tested and recorded. Although there is no significant difference between the proposed approach and the state of the art on the CIFAR-10 dataset, the actual potency lies in the hybridization of genetic algorithms with a local search method for optimizing both network structures and network training, which, to the best of the authors' knowledge, is yet to be reported.

Author 1: Nurshazlyn Mohd Aszemi
Author 2: P.D.D Dominic

Keywords: Hyperparameter; convolutional neural network; CNN; genetic algorithm; GA; random search; optimization

PDF

Paper 39: New Method of Faults Diagnostic based on Neuro-Dynamic Sliding Mode for Flat Nonlinear Systems

Abstract: This paper addresses the problem of simultaneous actuator, process and sensor Fault Detection and Isolation (FDI) for nonlinear systems with flatness properties that operate in closed loop in the presence of disturbances. In particular, the nonlinear system is corrupted by additive actuator, process or sensor faults that may occur simultaneously, so the residual signals may be sensitive to any of the faults that can appear in the system. The proposed FDI method is based on input and parameter estimators designed in parallel. Owing to the flatness property of such systems, the design of these two estimators requires information about the measured outputs and their successive derivatives. To estimate the latter, a new scheme for a 2nd-order dynamic sliding mode differentiator is proposed. Residuals are then defined as the difference between the estimated and expected behavior. To isolate the faults, a dynamic neural network technique is employed. In addition, a comparative study between the new differentiator and the well-known 2nd-order Levant differentiator is provided to show the pros and cons of the proposed FDI method, which is validated by simulation results on a three-tank system.

Author 1: O. Dhaou
Author 2: L.Sidhom
Author 3: A.Abdelkrim

Keywords: Flat system; fault detection and isolation; inputs/parameters estimator; higher order sliding mode differentiator; dynamic neural network

PDF

Paper 40: Big Data Technology-Enabled Analytical Solution for Quality Assessment of Higher Education Systems

Abstract: Educational intelligence is a broad area of big data analytics that makes use of big data technologies to implement solutions for education and research. This paper demonstrates the design, development and deployment of an educational intelligence application for real-world scenarios. First, a quality assessment framework for higher education systems is proposed that evaluates institutions on the basis of the performance of outgoing students. Second, a big data enabled technological setup is used for its implementation. The literature was surveyed to evaluate existing quality frameworks: most existing quality assessment systems take into account dimensions related to inputs, processes and outputs, but they tend to ignore the perspective that assesses an institution on the basis of the outcome of the educational process. This paper demonstrates the use of the outcome perspective to compute quality metrics and create visual analytics. To implement and test the framework, the R programming language and a cloud-based big data technology, Google BigQuery, were used.

Author 1: Samiya Khan
Author 2: Xiufeng Liu
Author 3: Kashish Ara Shakil
Author 4: Mansaf Alam

Keywords: Education big data; educational intelligence; educational technology; higher education; quality education

PDF

Paper 41: A Novel Network user Behaviors and Profile Testing based on Anomaly Detection Techniques

Abstract: The proliferation of smart devices and computer networks has led to a huge rise in internet traffic and network attacks that necessitates efficient network traffic monitoring. There have been many attempts to address these issues; however, agile detection solutions are still needed. This work deals with the detection of malware infections, one of the most challenging tasks in modern computer security. In recent years, anomaly detection has often been used as a first detection approach, followed by results from other classifiers. Anomaly detection methods are typically designed to model normal user behavior and then look for deviations from this model. However, anomaly detection techniques may suffer from a variety of problems, including a lack of validation and a large number of false positives. This work proposes and describes a new profile-based method for identifying anomalous changes in network user behavior. Profiles describe user behavior from different perspectives using different flags; each profile is composed of information about what the user has done over a period of time. The symptoms extracted into a profile cover a wide range of user actions and allow different actions to be analyzed. Compared to other symptom-based anomaly detectors, the profiles offer a higher-level view of user activity. The assumption is that searching for anomalies using such high-level symptoms produces fewer false positives while still finding real attacks effectively. The problem of obtaining truly labeled data for training anomaly detection algorithms is also addressed: datasets were designed and created that contain real normal user actions recorded while the user is infected with real malware, and these datasets were used to train and evaluate anomaly detection algorithms, including the local outlier factor (LOF) and the one-class support vector machine (SVM). The results show that the proposed anomaly- and profile-based algorithm produces very few false positives and a relatively high true positive detection rate. The two main contributions of this work are a new approach based on network anomaly detection and datasets containing a combination of genuine malware activity and actual user traffic. Future work will focus on applying the proposed approach to protecting Internet of Things (IoT) devices.

Author 1: Muhammad Tahir
Author 2: Mingchu Li
Author 3: Xiao Zheng
Author 4: Anil Carie
Author 5: Xing Jin
Author 6: Muhammad Azhar
Author 7: Naeem Ayoub
Author 8: Atif Wagan
Author 9: Muhammad Aamir
Author 10: Liaquat Ali Jamali
Author 11: Muhammad Asif Imran
Author 12: Zahid Hussain Hulio

Keywords: Network user behaviors; profile testing; anomaly detection techniques; datasets; anomaly detection algorithms; machine learning

PDF

Paper 42: Experimental Analysis of Color Image Scrambling in the Spatial Domain and Transform Domain

Abstract: This paper proposes two image-scrambling algorithms based on self-generated keys. The first color image scrambling method works in the spatial domain and the second in the transform domain. The proposed methods extract the R, G, and B planes from the color image and scramble each plane separately using the self-generated keys. The strength of the security of the proposed methods lies in the keys and parameters used in the scrambling process. The experimental results show that both proposed image scrambling techniques perform well in terms of Number of Pixel Change Rate (NPCR), Normalized Correlation (NC), entropy, and the time consumed in encoding and decoding. The adequacy of the proposed framework has been demonstrated on a dataset of five images. Furthermore, the paper gives a comparative performance analysis between the proposed spatial-domain and transform-domain scrambling methods and also sheds some light on the scrambling work reported in the literature.
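
A minimal sketch of the NPCR metric cited in the abstract (the percentage of pixel values that change between the original and the scrambled image); the random arrays below merely stand in for a real image pair:

    import numpy as np

    rng = np.random.default_rng(0)
    original  = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)  # stand-in original image
    scrambled = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)  # stand-in scrambled image

    npcr = np.mean(original != scrambled) * 100.0  # percentage of changed pixel values
    print(f"NPCR = {npcr:.2f}%")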

Author 1: R. Rama Kishore
Author 2: Sunesh

Keywords: Color image scrambling; pixel position modification; spatial domain; Red Green Blue (RGB); transform domain

PDF

Paper 43: Hijaiyah Letter Interactive Learning for Mild Mental Retardation Children using Gillingham Method and Augmented Reality

Abstract: Assistive technology for children with special needs is an interesting problem to study, as collaboration between teaching methods and the latest technology can serve as a learning aid for these children. Learning the Hijaiyah letters is the first step towards being able to read the Holy Qur'an. Mentally retarded children have IQs below the average of normal children, so their learning process is slower and requires special methods. This study aims to develop an application using the Gillingham method and augmented reality to help mildly mentally retarded children recognize Hijaiyah letters. The Gillingham method uses a visual, auditory, kinesthetic, and tactile (VAKT) approach that can be used to facilitate learning for these children, while augmented reality is used to develop a more interesting and interactive application. Based on the results of the research and testing, it can be concluded that the learning application can improve children's memory and understanding of Hijaiyah letters: the pretest and posttest results showed an increase of 12% for children who find it difficult to absorb learning material and 6% for children who absorb learning material easily.

Author 1: Irawan Afrianto
Author 2: Agung Faishal Faris
Author 3: Sufa Atin

Keywords: Hijaiyah; interactive learning; mild retarded child; Gillingham; VAKT; augmented reality

PDF

Paper 44: Causal Impact Analysis on Android Market

Abstract: The Google Play store contains a large repository of apps for Android users. The Play store has two billion active users and two million apps to download and use. App developers compete to achieve a higher success rate and increase user satisfaction, but little information is available to developers on how to succeed in the Android market. This paper presents a comprehensive analytical study of Google Play store app ratings, installs and reviews. The study focuses on evaluating the parameters required for the success of an app in different categories. For this purpose, data from 10k apps and their reviews are examined using exploratory data analysis. The study looks for correlations between higher ratings, number of installs and reviews on one hand, and app information such as category, size, and price on the other, and also analyzes user reviews to obtain useful insights. The evaluation shows that the personalization, productivity and games categories are performing very well in the Android market, both in terms of ratings and installs. Most highly rated apps are smaller than 40MB and priced below $30, except game apps, which perform well even when they are bulky. Common customer complaints are functional errors and issues such as infrequent updates, excessive ads, limited functionality and high purchase prices.

Author 1: Hadiqa AmanUllah
Author 2: Mishal Fatima
Author 3: Umair Muneer
Author 4: Sadaf Ilyas
Author 5: Rana Abdul Rehman
Author 6: Ibraheem Afzal

Keywords: Android; Google; statistics; mobile applications; data visualization

PDF

Paper 45: Emotion Detection in Text using Nested Long Short-Term Memory

Abstract: Humans can feel many different types of emotions, and human life is filled with them. Emotions can be reflected in the texts people read or write. In recent years, studies on emotion detection in text have been developed, most of them using machine learning techniques. In this paper, we classify seven emotions, namely anger, fear, joy, love, sadness, surprise, and thankfulness, using the deep learning techniques Long Short-Term Memory (LSTM) and Nested Long Short-Term Memory (Nested LSTM), and compare our results with a Support Vector Machine (SVM). Each model was trained with 980,549 training samples and tested with 144,160 test samples. Our experiments show that Nested LSTM and LSTM give better performance than SVM for detecting emotions in text. Nested LSTM achieves the best accuracy of 99.167%, while LSTM achieves the best average precision of 99.22%, average recall of 98.86%, and F1-score of 99.04%.
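
A minimal sketch of an LSTM classifier for the seven emotion classes listed above, using Keras; the vocabulary size, sequence length and layer sizes are assumptions, and a plain LSTM layer stands in for the paper's Nested LSTM, which is not part of standard Keras:

    import tensorflow as tf

    VOCAB_SIZE, MAX_LEN, NUM_CLASSES = 20000, 50, 7  # assumed vocabulary/sequence sizes; 7 emotions

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(MAX_LEN,), dtype="int32"),   # padded token ids
        tf.keras.layers.Embedding(VOCAB_SIZE, 128),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # model.fit(padded_token_ids, emotion_labels, epochs=5, validation_split=0.1)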

Author 1: Daniel Haryadi
Author 2: Gede Putra Kusuma

Keywords: Sentiment analysis; emotion detection; text mining; nested LSTM; machine learning

PDF

Paper 46: Dynamic Matrix Control DMC using the Tuning Procedure based on First Order Plus Dead Time for Infant-Incubator

Abstract: Model Predictive Control (MPC) is considered one of the most important control strategies. It is used in several fields, such as petrochemical plants, oil refineries, and fertilizer and chemical plants, and is also well established among clinicians and in the biomedical field. In this context, our paper investigates the thermal conditions inside an infant incubator for premature babies. We propose Dynamic Matrix Control (DMC) as the control strategy; a particular advantage of this strategy is that it is applicable to Multi-Input Multi-Output (MIMO) systems. The study compares different coupled transfer functions obtained by two identification methods in previous work, and simulates the air temperature and humidity inside the care unit. We focus on tuning the controller parameters, since tuning is a key step in achieving good DMC performance. To obtain the DMC tuning, we use an analytic tool, the Process Reaction Curve (PRC); because applying it directly to higher-order transfer functions requires considerable work, each higher-order transfer function is approximated by a low-order model with time delay using the First Order Plus Dead Time (FOPDT) process model. Finally, the comparison results for the infant incubator show good, near-optimal thermal performance of the proposed methodology and confirm that good identification ensures better control performance.

Author 1: J. ElHadj Ali
Author 2: E. Feki
Author 3: A. Mami

Keywords: Infant-incubator; DMC; MPC; higher-order; FOPDT; PRC and MIMO

PDF

Paper 47: Dense Hand-CNN: A Novel CNN Architecture based on Later Fusion of Neural and Wavelet Features for Identity Recognition

Abstract: Biometric recognition, or biometrics, has emerged as the best solution for criminal identification and access control applications where resources or information need to be protected from unauthorized access. Biometric traits such as fingerprint, face, palmprint, iris, and hand geometry have been well explored, and mature approaches are available for personal identification. This work emphasizes the opportunities for obtaining texture information from a palmprint using descriptors such as Curvelet, Wavelet, Wave Atom, SIFT, Gabor, LBP, and AlexNet features. The key contribution is the application of a mode-voting method for accurate identification of a person at the decision-level fusion stage. The proposed approach was tested in a number of experiments on the CASIA and IITD palmprint databases. The testing yielded positive results supporting the use of the described voting technique for human recognition.

Author 1: Elaraby A. Elgallad
Author 2: Wael Ouarda
Author 3: Adel M. Alimi

Keywords: Deep learning; fusion; palmprint; squeezenet; voting

PDF

Paper 48: Detection of Suspicious of Diabetic Feet using Thermal Image

Abstract: Diabetic foot is a chronic condition that occurs due to increased glucose levels and poorly controlled diabetes. The affected foot increases in temperature because it contains accumulated blood. According to the Alianza para el Salvataje del Pie Diabético en el Perú, cases of diabetic foot have been increasing, with 8% of the Peruvian population suffering from the condition. Many research papers note that the temperature difference between the two feet should be minimal because of the homogeneous distribution of the body; when the temperature of one foot is more than 2.2 °C higher than that of the other, it is an indicator of diabetic foot. For this reason, this research proposes a thermal evaluation of feet with suspected diabetic foot to prevent future damage or even amputation. First, a thermal image of both feet is captured using the FLIR ONE Pro thermal camera following a temperature-range protocol; then the images are processed in MATLAB to obtain the zones where the variation is greater than or equal to 2.2 °C; finally, these zones are superimposed on the warmer foot to determine the area where the highest temperature was detected. The results show that patients with diabetic foot lack sensitivity in both feet, which, together with the temperature difference between the feet, indicates a possible diabetic foot.

Author 1: Brian Meneses-Claudio
Author 2: Witman Alvarado-Díaz
Author 3: Fiorella Flores-Medina
Author 4: Natalia I. Vargas-Cuentas
Author 5: Avid Roman-Gonzalez

Keywords: Diabetic foot; thermal images; image processing; Roberts method; heat map

PDF

Paper 49: Blind Image Quality Evaluation of Stitched Image using Novel Hybrid Warping Technique

Abstract: Image stitching combines a collection of sequential images, captured from a fixed camera center with a considerable amount of overlap, into an aesthetically pleasing, seamless panoramic view. In practice, however, it is very difficult to obtain a clean and pristine stitched panorama of a scene, since such images are often visibly distorted. In this paper, a novel hybrid warping technique is used that combines two global warps and one local warp to refine the image alignment stage. Our proposed method uses homography screening to rectify the problem of perspective distortion and an edge strength similarity approach to quantify structural irregularities. Blind image quality evaluation models, namely the Blind Image Quality Index (BIQI), the Blind/Reference-less Image Spatial QUality Evaluator (BRISQUE) and the BLind Image Integrity Notator using DCT Statistics (BLIINDS-II), are employed to measure the objective quality of the stitched image. The experimental results show that the blind image quality scores of the proposed method are significantly better than those of recent existing methods.

Author 1: Sanjay T. Gandhe
Author 2: Omkar S. Vaidya

Keywords: Blind image quality evaluation; hybrid warping; image stitching; panoramic image

PDF

Paper 50: Bio-inspired Think-and-Share Optimization for Big Data Provenance in Wireless Sensor Networks

Abstract: Big data systems are being increasingly adopted by enterprises exploiting big data applications to manage data-driven processes, practices, and systems in an enterprise-wide context. Specifically, big data systems and their underlying applications empower enterprises with analytical decision making (e.g., recommender and decision support systems) to optimize organizational productivity, competitiveness, and growth. Despite these benefits, big data applications face challenges that include, but are not limited to, security and privacy, authenticity, and the reliability of critical data, which may result in the propagation of false information across systems. Data provenance, as an approach and enabling mechanism (to identify the origin, manage the creation, and track the propagation of information), can be a solution to the above challenges for data management in an enterprise context. Data provenance solutions can help stakeholders and enterprises assess the quality of data along with the authenticity, reliability, and trustworthiness of information on the basis of the identity, reproducibility and integrity of the data. Considering the widespread adoption of big data applications and the need for data provenance, this paper focuses on (i) analyzing the state of the art for a holistic presentation of provenance in big data applications and (ii) proposing a bio-inspired approach, with an underlying algorithm that exploits a human thinking approach, to support data provenance in Wireless Sensor Networks (WSNs). The proposed 'Think-and-Share Optimization' (TaSO) algorithm modularizes and automates data provenance in WSNs that are deployed and operated in enterprises. Evaluation of the TaSO algorithm demonstrates its efficiency in terms of connectivity, closeness to the sink node, coverage, and execution time. The proposed research contextualizes bio-inspired computation to enable and optimize data provenance in WSNs. Future research aims to exploit machine learning techniques (with underlying algorithms) to automate data provenance for big data systems in networked environments.

Author 1: Adel Alkhalil
Author 2: Rabie Ramadan
Author 3: Aakash Ahmad

Keywords: Big data systems; data provenance; fuzzy logic; bio-inspired computing

PDF

Paper 51: Fragile Watermarking based on Linear Cellular Automata using Manhattan Distances for 2D Vector Map

Abstract: There has been a growing demand for publishing maps in a secure digital format, since this ensures the integrity of the data. This has led us to put forward a method for detecting and locating data modifications that is extremely accurate and simultaneously guarantees that the exact original content can be recovered. More precisely, the method relies on a fragile watermarking algorithm that operates in the frequency domain and can embed hidden data in 2D vector maps for every spatial feature. The present paper proposes a frequency-domain data-hiding scheme based on the Linear Cellular Automata (LCA) transform using Manhattan distances. Various invertible integer mappings are applied in order to derive the Manhattan distances from the coordinates. First, the original map is transformed with the LCA; the watermark insertion process then embeds the watermark into the least significant bits (LSB) of the transform coefficients; finally, the watermarked map is created by applying the inverse LCA transform. The findings indicate that the suggested method is effective in terms of invisibility and capacity while still allowing modification. The method also detects data modifications and the addition or removal of features, and enables the exact original content of the 2D vector map to be recovered.

Author 1: Saleh AL-ardhi
Author 2: Vijey Thayananthan
Author 3: Abdullah Basuhail

Keywords: Reversible watermarking; fragile watermarking; linear cellular automata; Manhattan distances; vector map

PDF

Paper 52: A Watermarking System Architecture using the Cellular Automata Transform for 2D Vector Map

Abstract: Technological advancement, paired with the emergence of increasingly open and sophisticated communication systems, has contributed to the growing complexity of copyright protection and ownership identification for digital content. The technique of digital watermarking has been receiving attention in the literature as a way to address these complexities. Digital watermarking involves covertly embedding a marker in a piece of digital data (e.g., a vector map, database, or audio, image, or video data) such that the marker cannot be edited, does not interfere with the quality or size of the data, and can be extracted accurately even under the deterioration of the watermarked data (e.g., as a consequence of malicious activity). The purpose of this paper is to describe a watermarking system architecture that can be applied to a 2D vector map. The proposed scheme involves embedding the watermark into the frequency domain, namely, the linear cellular automata transform (LCAT) algorithm. To evaluate the performance of the proposed scheme, the algorithm was applied to vector maps from the Riyadh Development Authority. The results indicate that the watermarking system architecture described here is efficient in terms of its computational complexity, reversibility, fidelity, and robustness against well-known attacks.

Author 1: Saleh AL-ardhi
Author 2: Vijey Thayananthan
Author 3: Abdullah Basuhail

Keywords: Digital watermarking; spatial database; 2D vector map; linear cellular automata transform

PDF

Paper 53: Electromyography Signal Acquisition and Analysis System for Finger Movement Classification

Abstract: Electromyography (EMG) is very important for capturing muscle activity. Although many works establish data acquisition systems, it is also essential to demonstrate that the acquired data are reliable. In this sense, we propose the design and implementation of a data acquisition system based on the Myoware device and the ATmega329P microcontroller. We also demonstrated its reliability by classifying the movements of the fingers of the hand with the k-Nearest Neighbors (KNN) algorithm and the Classification Learner app in Matlab. The results show a success rate of 99.1%.
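
A minimal sketch of the classification step, assuming feature vectors have already been extracted from windows of Myoware readings; the synthetic features and the five finger-movement classes are placeholders, so the printed accuracy is not meaningful:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))     # hypothetical feature vectors, one per EMG window
    y = rng.integers(0, 5, size=500)  # hypothetical finger-movement class labels

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, knn.predict(X_test)))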

Author 1: Alvarado-Díaz Witman
Author 2: Meneses-Claudio Brian
Author 3: Roman-Gonzalez Avid

Keywords: EMG; muscles; disability; classification learner; Myoware

PDF

Paper 54: Blood Vessels Segmentation in Retinal Fundus Image using Hybrid Method of Frangi Filter, Otsu Thresholding and Morphology

Abstract: Computer-based diagnosis of hypertensive retinopathy is performed by analyzing retinal images. The analysis is carried out in several stages, one of which is blood vessel segmentation. Vascular segmentation of the retina is a complex problem because of non-uniform lighting, contrast variations and the presence of abnormalities due to disease, so segmentation is not successful if it relies on a single method. The aim of this study is to segment blood vessels in retinal images. The method used is divided into three stages: preprocessing, segmentation and testing. The first stage, preprocessing, improves image quality with the CLAHE method and a median filter applied to the green channel. The second stage segments the vessels using a combination of methods, namely the Frangi filter, 2D convolution filtering, median filtering, Otsu's thresholding, morphological operations, and background subtraction. The last stage tests the system using the DRIVE and STARE datasets. The tests yielded a sensitivity of 91.187%, a specificity of 86.896%, and an area under the curve (AUC) of 89.041%. Given this performance, the proposed model can be used as an alternative for blood vessel segmentation in retinal images.
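
A minimal sketch of this kind of pipeline in scikit-image, with skimage.data.retina() standing in for a DRIVE/STARE image; the Otsu threshold and the small-object size are assumptions:

    from skimage import data, exposure, filters, morphology

    rgb = data.retina()                      # sample fundus image (may download on first use)
    green = rgb[:, :, 1] / 255.0             # green channel, as in the paper
    enhanced = exposure.equalize_adapthist(green)           # CLAHE
    smoothed = filters.median(enhanced)                     # median filtering
    vesselness = filters.frangi(smoothed)                   # Frangi vesselness response
    mask = vesselness > filters.threshold_otsu(vesselness)  # Otsu's thresholding
    clean = morphology.remove_small_objects(mask, min_size=50)  # morphological cleanup
    print("vessel pixel ratio:", clean.mean())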

Author 1: Wiharto
Author 2: YS. Palgunadi

Keywords: Segmentation; morphology; frangi filter; retinal; blood vessels

PDF

Paper 55: A New Image Inpainting Approach based on Criminisi Algorithm

Abstract: In patch-based inpainting methods, the order in which the areas to be restored are filled is very important. This filling order is defined by a priority function that integrates two terms: a confidence term and a data term. The priority, as initially defined, is negatively affected by the mutual influence of the confidence and data terms. In addition, the rapid decrease of the confidence term to zero leads to numerical instability of the algorithms. Finally, the data term depends only on the central pixel of the patch, without taking into account the influence of neighboring pixels. Our aim in this paper is to propose an algorithm that solves these problems. The algorithm is based on a new definition of the priority function, a computation of an average data term obtained from the elementary data terms within a patch, and an update of the confidence term that slows its decrease and avoids convergence to zero. We evaluated our method by comparing it with algorithms from the literature. The results show that our method provides better results both visually and in terms of the Peak Signal-to-Noise Ratio (PSNR) and the Structural SIMilarity index (SSIM).
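
For orientation, the classical priority function from Criminisi's algorithm, which the modifications above start from, is commonly written as follows (this is the standard formulation from the literature, not the authors' revised definition):

    P(p) = C(p) \, D(p), \qquad
    C(p) = \frac{\sum_{q \in \Psi_p \cap \bar{\Omega}} C(q)}{|\Psi_p|}, \qquad
    D(p) = \frac{\lvert \nabla I_p^{\perp} \cdot n_p \rvert}{\alpha},

where \Psi_p is the patch centred at the front pixel p, \bar{\Omega} is the already-known region, n_p is the unit normal to the fill front at p, and \alpha is a normalisation factor (255 for grey-level images). The abstract's proposal replaces the point-wise data term with an average over the patch and slows the decay of C(p).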

Author 1: Nouho Ouattara
Author 2: Georges Laussane Loum
Author 3: Ghislain Koffi Pandry
Author 4: Armand Kodjo Atiampo

Keywords: Image inpainting; Criminisi algorithm; priority function; data term; confidence term; identity function

PDF

Paper 56: Improving Knowledge Sharing in Distributed Software Development

Abstract: Distributed software development has become an established software development paradigm that provides several advantages, but it also presents significant challenges in sharing and understanding the knowledge required to develop software, and organizations are expected to implement appropriate knowledge management practices. Existing studies show that problems of collaboration between distributed team members affect knowledge sharing. Documentation problems (such as missing, poor or outdated documents) and knowledge vaporization (much of the conversation and communication takes place via chat, and retrieving it later is a major headache) are major challenges for knowledge sharing in distributed software development. Our main objective is to improve knowledge sharing between distributed team members, prevent knowledge vaporization and reduce documentation problems, which in turn improves the software development process in a distributed environment. To eliminate these challenges, we propose a framework that deals with the documentation and knowledge-vaporization problems and evaluate it through an industrial case study in the real-life context where the problems actually arise. We conducted interviews and analyzed the data using thematic analysis and the SUS questionnaire; the team members' responses indicate that they are satisfied with the proposed solution and that it improved their knowledge-sharing process. The evaluation shows that the proposed solution resolves these problems.

Author 1: Sara Waheed
Author 2: Bushra Hamid
Author 3: NZ Jhanjhi
Author 4: Mamoona Humayun
Author 5: Nazir A Malik

Keywords: Distributed software development; knowledge sharing; knowledge management

PDF

Paper 57: IRPanet: Intelligent Routing Protocol in VANET for Dynamic Route Optimization

Abstract: This paper presents a novel routing protocol, IRPANET (Intelligent Routing Protocol in VANET), for Vehicular Ad Hoc Networks (VANETs). Vehicular Ad Hoc Networks are a special class of Mobile Ad Hoc Networks created by road vehicles equipped with wireless devices. Since the environment is highly dynamic due to high mobility and frequent topology changes, no stable connection or path can be established between nodes, which makes the design of an effective and efficient protocol challenging. This problem can be addressed using probabilistic, heuristic and even machine learning based approaches combined with a store-and-forward mechanism. Here, we propose a design framework that combines heuristic and probabilistic approaches with time-series techniques to select the best, optimized path for forwarding packets using OpenStreetMap (OSM). Our proposed algorithm uses various parameters (heuristics-based routing) to calculate the optimal path for the packets to be sent, such as geographical position (GPS installed in every vehicle), vehicle velocity, packet priority, inter-vehicle distances (Euclidean, Haversine, Vicinity), vehicle direction, communication range, free node buffer and network congestion. Such networks can be used for medical emergencies, security, entertainment and routing purposes (applications of VANET). Used in combination, these parameters provide a strong and admissible heuristic. We show mathematically that the proposed technique is efficient for packet routing, especially in medical emergency situations.

Author 1: Rafi Ullah
Author 2: Shah Muhammad Emad
Author 3: Taha Jilani
Author 4: Waqas Azam
Author 5: Muhammad Zain uddin

Keywords: Intelligent routing protocol; heuristics based routing; applications of VANET; Vehicular Adhoc Network; VANET routing protocol

PDF

Paper 58: Depth Limitation and Splitting Criteria Optimization on Random Forest for Efficient Human Activity Classification

Abstract: Random Forest (RF) is known as one of the best classifiers in many fields. RFs are parallelizable, fast to train and predict, robust to outliers, able to handle unbalanced data, and have low bias and moderate variance. Despite these advantages, there are still opportunities to increase RF efficiency. The absence of recommendations regarding the number of trees in an RF ensemble can make the number of trees very large, which increases the computational complexity of the RF, and the usual recommendation not to prune the decision trees further aggravates the situation. This research attempts to build an efficient RF ensemble for activity classification while maintaining its accuracy. Data were collected using the accelerometer sensor of a smartphone from five people performing 11 different activities; each activity was carried out five times to enrich the data. This study takes two steps to improve the efficiency of activity classification: 1) selecting optimal splitting criteria for activity classification, and 2) measured pruning that limits the tree depth in the RF ensemble. The first method can be applied to determine the splitting criterion most suitable for activity classification using Random Forest; in this case, the decision model built using the Gini index produced the highest accuracy. The second method successfully builds less complex pruned trees without reducing classification accuracy. The results show that the methods applied to the Random Forest in this study produce a decision model that is simple yet accurate for classifying activities.
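
A minimal sketch of the two levers discussed above, the splitting criterion and the maximum tree depth, evaluated on synthetic data standing in for the 11-activity accelerometer features:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for the 11-class activity data.
    X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                               n_classes=11, n_clusters_per_class=1, random_state=0)

    for criterion in ("gini", "entropy"):
        for max_depth in (None, 10, 5):
            rf = RandomForestClassifier(n_estimators=100, criterion=criterion,
                                        max_depth=max_depth, random_state=0)
            score = cross_val_score(rf, X, y, cv=5).mean()
            print(f"criterion={criterion} max_depth={max_depth}: {score:.3f}")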

Author 1: Syarif Hidayat
Author 2: Ahmad Ashari
Author 3: Agfianto Eko Putra

Keywords: Activity; accuracy; classification; fall; optimization; random forest

PDF

Paper 59: The Mathematical Model of Hybrid Schema Matching based on Constraints and Instances Similarity

Abstract: Schema matching is a crucial issue in applications that involve multiple databases from heterogeneous sources. Schema matching has evolved from a manual process to a semi-automated process that effectively guides users in finding commonalities between schema elements, and new models are generally developed using a combination of methods to improve the effectiveness of the matching results. Our previous research developed a prototype of hybrid schema matching that combines a constraint-based method and an instance-based method. The contribution of this paper is a mathematical formulation of the hybrid schema matching model, so that it can be applied to different cases and can serve as a basis for further improvements in the effectiveness and efficiency of the schema matching process. The developed mathematical model performs the main tasks of the schema matching process: matching similar attributes, calculating the similarity value of each attribute pair, and determining the matching attribute pairs. Based on the test results, the hybrid schema matching model is more effective than the constraint-based or instance-based method run individually, and using more matching criteria provides better mapping results. The model is limited to schema matching in relational databases.
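
As an illustration only (the paper's exact formulation is in the full text), a hybrid similarity of this kind is often expressed as a weighted combination of the constraint-based and instance-based scores for an attribute pair (a_i, b_j):

    sim(a_i, b_j) = w_c \, sim_{\mathrm{con}}(a_i, b_j) + w_v \, sim_{\mathrm{inst}}(a_i, b_j),
    \qquad w_c + w_v = 1,

with the pair reported as a match when sim(a_i, b_j) exceeds a chosen threshold t; the weights w_c, w_v and the threshold t are assumptions introduced here for illustration.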

Author 1: Edhy Sutanta
Author 2: Erna Kumalasari Nurnawati
Author 3: Rosalia Arum Kumalasanti

Keywords: Constraint-based; hybrid schema matching model; instance-based; mathematical model

PDF

Paper 60: The Role of Technical Analysis Indicators over Equity Market (NOMU) with R Programing Language

Abstract: The stock market is a potent, fickle and fast-changing domain. Unanticipated market events and unstructured financial information complicate the prediction of future market responses. A tool that remains advantageous when forecasting future market trends globally is correlation analysis against significant market events. Data analysis can be used for the difficult task of forecasting whether a stock price will rise or fall, and a large number of automated trades in the stock market are executed with advanced predictive software. Data analysis is centered on the idea that previously recorded data can be used to predict future patterns, and it aims to help investors pinpoint hidden patterns in real data that give them financial foresight when considering their investments. This paper critically investigates, develops and assesses systems that predict and evaluate future stock trades, each with its own process for forecasting fluctuations in stock prices. Several technical analysis indicators are applied in this study, including Chaikin Money Flow (CMF), the Stochastic Momentum Index (SMI), the Relative Strength Index (RSI), Bollinger Bands (BBands), and the Aroon indicator. The experiments were conducted using the R programming language on real-world datasets covering two years for two companies listed on the Saudi stock market (NOMU), a parallel stock market with lighter listing requirements that serves as an alternative platform for companies to go public outside the main market. To the best of our knowledge, this is the first such work conducted on the NOMU stock market.
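
As one concrete example of the indicators listed above, the Relative Strength Index can be computed from a closing-price series. The paper works in R; the pandas sketch below is only an illustrative equivalent, using the simple-moving-average form of RSI and an assumed "Close" column:

    import pandas as pd

    def rsi(close: pd.Series, period: int = 14) -> pd.Series:
        """Relative Strength Index (simple-moving-average variant) from closing prices."""
        delta = close.diff()
        gain = delta.clip(lower=0).rolling(period).mean()
        loss = (-delta.clip(upper=0)).rolling(period).mean()
        rs = gain / loss
        return 100 - 100 / (1 + rs)

    # Usage with a hypothetical NOMU price file having "Date" and "Close" columns:
    # prices = pd.read_csv("nomu_company.csv", parse_dates=["Date"], index_col="Date")
    # prices["RSI14"] = rsi(prices["Close"])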

Author 1: Mohammed A. Al Ghamdi

Keywords: Data mining; data analysis; R programming language; Chaikin Money Flow (CMF); Stochastic Momentum Index (SMI); Relative Strength Index (RSI); Bollinger Bands (BBands); Aroon indicator

PDF

Paper 61: A Circular Polarization RFID Tag for Medical Uses

Abstract: The aim of this paper is to present a Radio Frequency Identification (RFID) tag. The use of this kind of antenna in the medical field is of great importance for making people's lives easier and improving the way medical information is obtained. This article explains in detail the method used to obtain circular polarization using a shaped cross slot. A simulation study including SAR values is performed to assess the effects of the electromagnetic waves. In this study, a specific PIN diode, the CPINUC5206-HF, is used in order to operate at a high frequency (3 GHz). Two fabrication methods were adopted: the printed circuit board (PCB) method and metal cutting through laser ablation (MCTLA). A comparative study between the two methods is also conducted.

Author 1: Nada Jebali
Author 2: Ali Gharsallah

Keywords: Radio Frequency Identification (RFID); circular polarization; metal cutting through laser ablation method

PDF

Paper 62: Passenger and Luggage Weight Monitoring System for Public Transport based on Sensing Technology: A Case of Zambia

Abstract: Overloading, i.e., exceeding the maximum load weight, is rampant on public buses in Zambia because there is currently no system to measure and monitor load weight at bus stations, apart from weighbridges on a few selected roads located far from the loading points. The aim of this study was to design and develop a passenger and luggage weight monitoring system to mitigate the challenge of overloading on public buses. To achieve this, a baseline study was conducted to understand the challenges of the current system used to manage passenger and luggage load weight on public buses. The risk factors considered to contribute to compromised road safety leading to road traffic accidents were also established from all stakeholders as follows: 54 percent human, 39 percent road/environmental, 6 percent vehicle and 1 percent attributed to other factors. The results were then used as a basis to design and develop a load weight monitoring system (LWMS) based on sensing and other emerging technologies, such as load cells, Wireless Sensor Networks (WSN), the Internet of Things (IoT), and cloud computing, to automate the measurement of load weight and the capture and transmission of data.

Author 1: Apolinalious Bwalya
Author 2: Jackson Phiri
Author 3: Monica M. Kalumbilo
Author 4: David Zulu

Keywords: Overloading; load weight; load cells; emerging technologies

PDF

Paper 63: Survey Energy Management Approaches in Data Centres

Abstract: Data centers are today the technological backbone of any company. However, failure to control energy consumption leads to very high operating costs and carbon dioxide emissions. On the other hand, reducing power consumption in data centers can degrade application performance and quality of service in terms of the Service Level Agreement (SLA). It is therefore essential to find a compromise between energy efficiency and resource consumption. This paper highlights the different energy management approaches, related studies and the algorithms used, along with the advantages and weaknesses of each approach related to server virtualization and the consolidation and deconsolidation of virtual machines.

Author 1: Bouchra Morchid
Author 2: Siham Benhadou
Author 3: Mariam Benhadou
Author 4: Abdellah Haddout
Author 5: Hicham Medromi

Keywords: Data center; power management; virtual machines; physical servers "hosts"; energy efficiency; SLA (Service Level Agreement); PUE (Power Usage Effectiveness); QoS (Quality of Service)

PDF

Paper 64: Comparative Study of Methods that Detect Levels of Lead and its Consequent Toxicity in the Blood

Abstract: The present work studies the different methods used to determine the toxicity produced by the presence of a contaminating metal in the blood. The presence of lead in the blood was taken as the main reference for the work, bearing in mind that metals such as cadmium (Cd) and mercury (Hg) are also toxic to health and the environment. Although the available information on the methods studied is extensive and, in some cases, not detailed enough to define each process, a comparative study of the most relevant and currently used methods can be carried out, taking into account that the choice of method depends on the main characteristics of each one. Although all of them are electrochemical processes, details such as sensitivity, cost or even structural factors, such as the availability of a laboratory, determine which method to choose. Environmental pollution with toxic elements is very harmful to health; even small quantities can be very dangerous. These elements can be present in rivers, soil and even in the air, which is more than enough to contaminate human beings, since these particles persist for many years. It remains a problem today, and therein lies the importance of this study.

Author 1: Kevin J. Rodriguez
Author 2: Alicia Alva
Author 3: Virginia T. Santos
Author 4: Avid Roman-Gonzalez

Keywords: Blood lead; toxicity; voltammetry; absorption spectroscopy; healthcare

PDF

Paper 65: Smart Smoking Area based on Fuzzy Decision Tree Algorithm

Abstract: Cigarette smoke is very dangerous for both active smokers and passive smokers in a room, because nicotine from cigarette smoke can stick to the walls or furniture and produce carcinogenic substances when it reacts with the air. The carcinogenic chemicals in cigarettes are even more dangerous when cigarette smoke is trapped in a confined space. An exhaust fan is usually used in a special room for smokers to remove cigarette smoke without exchanging the air inside. A smart smoking room designed specifically for smokers was built to address this problem. The room uses 'in and out' exhaust fan ventilators whose speed is controlled according to the quantity of carbon monoxide (CO) gas detected in the room by a smoke sensor. An Arduino Uno running a fuzzy decision tree algorithm was used to control the input voltage level of the fan ventilators. The results show that, using this system, the cigarette smoke in the room can be controlled effectively.

Author 1: Iswanto
Author 2: Kunnu Purwanto
Author 3: Weni Hastuti
Author 4: Anis Prabowo
Author 5: Muhamad Yusvin Mustar

Keywords: Fuzzy decision tree algorithm; smart smoking room; microcontroller; smoke sensor

PDF

Paper 66: A Comprehensive Collaborating Filtering Approach using Extended Matrix Factorization and Autoencoder in Recommender System

Abstract: A recommender system is an approach in which users receive suggestions based on their previous preferences. Nowadays, people are overwhelmed by the huge amount of information present in any system, and it is sometimes difficult for a user to find an appropriate item by searching for the desired content. A recommender system assists users by suggesting the required information or items based on similar features among users. Collaborative filtering is one of the most well-known recommender system techniques, in which recommendations are derived from similar users or similar items. Matrix factorization is an approach that decomposes a matrix into two or more matrices to generate features, while an autoencoder is a deep learning technique used to find the hidden features of an object. In this paper, features are computed using extended matrix factorization and an autoencoder, and a new similarity metric is introduced that can efficiently calculate the similarity between each pair of users. An improved prediction method is then introduced that uses the proposed similarity measure to predict ratings accurately. The experimental section shows that the proposed method performs better in terms of mean absolute error, precision, recall, F-measure, and average reciprocal hit rank.

Author 1: Mahamudul Hasan
Author 2: Falguni Roy
Author 3: Tasdikul Hasan
Author 4: Lafifa Jamal

Keywords: Recommender system; deep learning; autoencoder; matrix factorization; similarity measures

PDF

Paper 67: Satellite Image Enhancement using Wavelet-domain based on Singular Value Decomposition

Abstract: Improving the quality of satellite images is an essential field of research in remote sensing and computer vision, and numerous techniques and algorithms have been proposed to enhance image quality. Nevertheless, satellite image enhancement remains a challenging task that plays an integral role in a wide range of applications. This manuscript proposes a methodology to enhance both the resolution and the contrast of satellite images. First, the resolution of the image is improved: the input image is decomposed into four frequency components (LL, LH, HL, and HH) using the stationary wavelet transform (SWT); the singular value matrices U_A and V_A, which contain the high-frequency elements of the input image, are obtained using singular value decomposition (SVD); the high-frequency components (LH, HL) of the input image are obtained using the discrete wavelet transform (DWT) and corrected with the singular value matrices and the SWT; then the interpolation factor is applied and the high-resolution image is obtained using the inverse discrete wavelet transform (IDWT). Second, the contrast of the image is optimized: the image is decomposed with the DWT into the sub-bands (LL, LH, HL, and HH); the singular value matrix of the LL sub-band, which contains the illumination information, is obtained and modified to enhance the contrast; finally, the image is reconstructed using the IDWT. The results of the proposed method are compared with existing approaches; the method achieves high performance and yields more insightful results than conventional techniques.
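
A minimal sketch of the contrast-enhancement half of the pipeline: DWT decomposition, SVD of the LL sub-band, scaling of its singular values, and reconstruction with the inverse DWT. The fixed scaling factor and the grayscale test image are assumptions standing in for the paper's adaptive correction and satellite imagery:

    import numpy as np
    import pywt
    from skimage import data

    img = data.camera().astype(float)          # stand-in for a grayscale satellite image

    LL, (LH, HL, HH) = pywt.dwt2(img, "haar")  # one-level DWT
    U, s, Vt = np.linalg.svd(LL, full_matrices=False)
    s_enhanced = s * 1.2                       # boost the illumination information in the singular values
    LL_enhanced = U @ np.diag(s_enhanced) @ Vt

    enhanced = pywt.idwt2((LL_enhanced, (LH, HL, HH)), "haar")
    enhanced = np.clip(enhanced, 0, 255)
    print("mean intensity before/after:", img.mean(), enhanced.mean())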

Author 1: Muhammad Aamir
Author 2: Ziaur Rahman
Author 3: Yi-Fei Pu
Author 4: Waheed Ahmed Abro
Author 5: Kanza Gulzar

Keywords: Satellite Images; Image Enhancement; Singular Value Decomposition (SVD); Discrete Wavelet Transforms (DWT); Stationary Wavelet Transform (SWT)

PDF

Paper 68: Convolutional Neural Networks in Predicting Missing Text in Arabic

Abstract: Missing text prediction is one of the major concerns of the Natural Language Processing and deep learning community. However, most text prediction research has been performed on languages other than Arabic. In this paper, we take a first step towards training a deep learning language model for Arabic. Our contribution is the prediction of missing text in documents by applying Convolutional Neural Networks (CNN) to Arabic language models. We built CNN-based language models with settings specific to the Arabic language. We prepared a dataset of a large number of text documents freely downloaded from the Arab World Books, Hindawi Foundation, and Shamela collections. To calculate prediction accuracy, we compared documents with complete text against the same documents with missing text. Training, validation and testing were carried out at three different stages with the aim of increasing prediction performance: at the first stage the model was trained on documents by the same author, at the second stage on documents from the same dataset, and at the third stage on all documents combined. The training, validation and test steps were repeated many times, changing the author, the dataset, and the author-dataset combination, respectively. We also enlarged the training data by feeding the CNN model a larger quantity of text each time. The model achieved a high Arabic text prediction performance, with an accuracy reaching 97.8% in the best case.

Author 1: Adnan Souri
Author 2: Mohamed Alachhab
Author 3: Badr Eddine Elmohajir
Author 4: Abdelali Zbakh

Keywords: Natural Language Processing; Convolutional Neural Networks; deep learning; Arabic language; text prediction; text generation

PDF

Paper 69: Multi-Modal Biometric: Bi-Directional Empirical Mode Decomposition with Hilbert-Hung Transformation

Abstract: Biometric systems (BS) help in the recognition of individual persons based on biological traits such as ears, veins, signatures, voices, typing styles, gaits, etc. Since a uni-modal BS does not provide sufficient security and recognition accuracy, multimodal BS have been introduced. In this paper, biological characteristics, namely the face, fingerprint and iris, are used in a feature-level-fusion-based multimodal BS to overcome those issues. Feature extraction is performed with Bi-directional Empirical Mode Decomposition (BEMD) and the Grey Level Co-occurrence Matrix (GLCM) algorithm. The Hilbert-Huang transform (HHT) is applied after feature extraction to obtain local features such as local amplitude and phase. The combination of BEMD, HHT and GLCM is used to achieve effective accuracy in the classification process. The MMB-BEMD-HHT method uses a multi-class support vector machine (MC-SVM) as the classifier. The false rejection ratio is improved using feature-level fusion (FLF) and the MC-SVM technique. The performance of the MMB-BEMD-HHT method is measured in terms of False Acceptance Ratio (FAR), False Rejection Ratio (FRR), and accuracy, and compared with an existing system. The MMB-BEMD-HHT method achieved 96% accuracy in identifying the biometric traits of individual persons.

Author 1: Gavisiddappa
Author 2: Chandrashekar Mohan Patil
Author 3: Shivakumar Mahadevappa
Author 4: Pramod KumarS

Keywords: Biometric Systems (BS); multimodal biometrics; bi-directional empirical mode decomposition; Hilbert-Huang transform; Multi-Class Support Vector Machines technique (MC-SVM); 2000 Mathematics Subject Classification: 92C55, 94A08, 92C10

PDF

Paper 70: Heuristics Applied to Mutation Testing in an Impure Functional Programming Language

Abstract: Elaborating accurate test suites for program testing can be an extensive computational task. Mutation testing is not immune to the problem of being a computationally expensive and time-consuming task, so it has found relief in the use of heuristic techniques. The use of genetic algorithms in mutation testing has proved useful for probing test suites, but it has mainly been confined to imperative programming paradigms. We therefore decided to test the feasibility of using genetic algorithms to perform mutation testing in functional programming environments. We tested our proposal by building graph representations of four different functional programs and applying a genetic algorithm to generate a population of mutant programs. We found that it is possible to obtain a set of mutants that can expose flaws in test suites for functional programming languages. Additionally, we observed that as the number of instructions in the source code increases, it becomes simpler for the genetic algorithm to find a mutant that evades all of the test cases.

Author 1: Juan Gutiérrez-Cárdenas
Author 2: Hernan Quintana-Cruz
Author 3: Diego Mego-Fernandez
Author 4: Serguei Diaz-Baskakov

Keywords: Mutation testing; heuristics; functional programming

PDF

Paper 71: Exploiting the Interplay among Products for Efficient Recommendations

Abstract: Recommender systems are built with the aim to reduce the cognitive load on the user. An efficient recommender system should ensure that a user spends minimal time in the process. Conversational Case-Based Recommender Systems (CCBR-RSs) depend on the feedback provided by the user to learn about the preferences of the user. Our goal is to use the feedback provided by the user effectively by exploiting the interplay among the products to build an efficient CCBR-RS. In this work, we propose two ways towards achieving that goal. In the first method, we utilize the higher order similarity and trade-off relationship among the products to propagate the evidence obtained through user feedback. In our second method, we utilize the diversity among cases/products along with the similarity and trade-off relationship to make the best use of the feedback provided by the user.

Author 1: Anbarasu Sekar
Author 2: Sutanu Chakraborti

Keywords: Preference-based feedback; case-based conversational recommender system; evidence; trade-offs; compromise; diversity

PDF

Paper 72: An Assessment of Open Data Sets Completeness

Abstract: The rapid growth of open data sources is driven by free-of-charge content and ease of accessibility. While it is convenient for public data consumers to use data sets extracted from open data sources, the decision to use these data sets should be based on their quality. Several data quality dimensions, such as completeness, accuracy, and timeliness, are common requirements for making data fit for use. More importantly, in many cases high-quality data sets are needed to ensure reliable outcomes of reports and analytics. Even though many open data sources provide data quality guidelines, ensuring high data quality requires commitment from data contributors. In this paper, an initial investigation into the quality of open data sets in terms of the completeness dimension was conducted. In particular, missing values were measured in 20 data sets extracted from open data sources; the analysis covered all representations of missing values, not only nulls or blank spaces. The results exhibit a range of missing-value ratios that indicate the level of completeness of the data sets. The limited coverage of this analysis does not hinder understanding of the current level of completeness of open data sets. The findings may motivate open data providers to design initiatives that empower data quality policies and guidelines for data contributors. In addition, this analysis may assist public data users in deciding on the acceptability of open data sets by applying the simple methods proposed in this paper or by performing data cleaning actions to improve the completeness of the data sets concerned.
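
A minimal sketch of the kind of completeness measurement described above, counting missing values including common textual placeholders that are not literal nulls; the placeholder list and the file name are assumptions:

    import pandas as pd

    MISSING_TOKENS = ["", " ", "NA", "N/A", "null", "NULL", "-", "?"]  # assumed placeholder set

    def completeness_report(path: str) -> pd.DataFrame:
        df = pd.read_csv(path, na_values=MISSING_TOKENS, keep_default_na=True)
        missing_ratio = df.isna().mean()            # per-column missing-value ratio
        return pd.DataFrame({
            "missing_ratio": missing_ratio,
            "completeness": 1 - missing_ratio,
        })

    # report = completeness_report("open_data_set.csv")   # hypothetical open data set
    # print(report.sort_values("completeness"))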

Author 1: Abdulrazzak Ali
Author 2: Nurul A. Emran
Author 3: Siti A. Asmai
Author 4: Amelia R. Ismail

Keywords: Data completeness; missing values; open data; open data sources; data collection

PDF

Paper 73: Design and Application of a Smart Diagnostic System for Parkinson’s Patients using Machine Learning

Abstract: Gait disability detection is essential for the analysis of Parkinson's disease. The motivation behind this study is to differentiate objectively and automatically between healthy subjects and those suffering from Parkinson's disease using an IoT-based diagnostic framework. In this study, a total of 16 force sensors attached to the subjects' shoes recorded multichannel Vertical Ground Reaction Force (VGRF) signals. Using a 1024-sample window over the raw sensor signals, five features, namely entropy, energy, variance, standard deviation and waveform length, were derived with the Wavelet Packet Transform (WPT), and a support vector machine (SVM) was used to distinguish Parkinson's patients from healthy subjects. The SVM was trained on 85% of the dataset and tested on the remaining 15%. The cohort consists of 93 patients with idiopathic PD (mean age: 66.3 years; 63% men and 37% women) and 73 healthy controls (mean age: 66.3 years; 55% men and 45% women). The IoT framework included all 16 sensors, with 8 force sensors attached to the left foot and 8 to the right foot. The results show that the fifth sensor, worn on the medial part of the dorsum of the right foot and denoted R5, alone yields 90.3% accuracy. Hence, this study provides insight into using a single wearable force sensor and concludes that a single sensor can help differentiate between Parkinson's patients and healthy subjects.
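
A minimal sketch of the feature-extraction and classification steps described above: wavelet-packet decomposition of 1024-sample windows, the five statistical features, and an SVM. The synthetic signals, labels and wavelet settings are stand-ins for the real VGRF recordings, so the printed accuracy is not meaningful:

    import numpy as np
    import pywt
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    def window_features(window, wavelet="db4", level=3):
        # Wavelet-packet decomposition, then five features over the leaf coefficients.
        wp = pywt.WaveletPacket(data=window, wavelet=wavelet, maxlevel=level)
        coeffs = np.concatenate([node.data for node in wp.get_level(level, order="natural")])
        energy = np.sum(coeffs ** 2)
        prob = coeffs ** 2 / (energy + 1e-12)
        entropy = -np.sum(prob * np.log2(prob + 1e-12))
        waveform_length = np.sum(np.abs(np.diff(coeffs)))
        return [entropy, energy, np.var(coeffs), np.std(coeffs), waveform_length]

    rng = np.random.default_rng(0)
    signals = rng.normal(size=(200, 1024))   # hypothetical 1024-sample VGRF windows
    labels = rng.integers(0, 2, size=200)    # synthetic labels (1 = PD, 0 = healthy)

    X = np.array([window_features(w) for w in signals])
    X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.15, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))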

Author 1: Asma Channa
Author 2: Attiya Baqai
Author 3: Rahime Ceylan

Keywords: Parkinson patients; force sensors; machine learning; Wavelet Packet Transform (WPT)

PDF

Paper 74: An Effective Framework for Tweet Level Sentiment Classification using Recursive Text Pre-Processing Approach

Abstract: Around 330 million people around the globe post roughly 6000 tweets per second to express their feelings about a product, policy, service, or event. A Twitter message mainly consists of thoughts, which are mostly expressed as free text, and extracting insight from free text is an open challenge. The scope of this work is to build an effective tweet-level sentiment classification framework that uses these thoughts to gauge the collective sentiment of the public on a particular subject. Furthermore, this work also analyses the impact of the proposed tweet-level recursive text pre-processing approach on overall classification results. This work achieved up to a 4-point accuracy improvement over the baseline approach while also reducing the feature vector space.
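The abstract does not spell out the recursive pre-processing steps; the sketch below is only one plausible interpretation, in which a set of cleaning rules is re-applied until the tweet text stops changing (the specific rules and regular expressions are assumptions, not the authors' method):

```python
import re

CLEANING_RULES = [
    (re.compile(r"https?://\S+"), " "),   # drop URLs
    (re.compile(r"@\w+"), " "),           # drop user mentions
    (re.compile(r"#"), ""),               # keep the hashtag word, drop '#'
    (re.compile(r"(.)\1{2,}"), r"\1\1"),  # squeeze elongations: "soooo" -> "soo"
    (re.compile(r"[^a-zA-Z\s]"), " "),    # drop digits and punctuation
    (re.compile(r"\s+"), " "),            # collapse whitespace
]

def preprocess_recursive(text: str) -> str:
    """Apply the cleaning rules repeatedly until the text reaches a fixed
    point, so artefacts exposed by one pass are caught by the next."""
    cleaned = text.lower().strip()
    while True:
        previous = cleaned
        for pattern, repl in CLEANING_RULES:
            cleaned = pattern.sub(repl, cleaned)
        cleaned = cleaned.strip()
        if cleaned == previous:
            return cleaned
```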

Author 1: Muhammad Bux Alvi
Author 2: Naeem A. Mahoto
Author 3: Mukhtiar A. Unar
Author 4: M. Akram Shaikh

Keywords: Machine learning; recursive text pre-processing; sentiment analysis; sentiment classification framework; Twitter

PDF

Paper 75: Competitive Algorithms for Online Conversion Problem with Interrelated Prices

Abstract: The classical uni-directional conversion algorithms are based on the assumption that prices are arbitrarily chosen from a fixed price interval [m, M], where m and M represent the estimated lower and upper bounds of possible prices, with 0 < m <= M. The estimated interval may be erroneous, yet the algorithms make no attempt to update the erroneous estimates. We consider a real-world setting where prices are interrelated, i.e., each price depends on its preceding price. Under this assumption, we derive a lower bound on the competitive ratio of randomized non-preemptive algorithms. Motivated by the fixed and erroneous price bounds, we present an update model that progressively improves the bounds. Based on the update model, we propose a non-preemptive reservation price algorithm RP* and analyze it under competitive analysis. Finally, we report the findings of an experimental study conducted on real-world stock index data. We observe that RP* consistently outperforms the classical algorithm.
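For context, the classical (non-updating) reservation price policy for uni-directional conversion accepts the first offered price at or above sqrt(mM). The sketch below illustrates only that classical baseline under fixed bounds m and M; it is not the RP* algorithm or the update model proposed in the paper:

```python
import math

def classical_reservation_price(m: float, M: float) -> float:
    """Classical reservation price for one-way search over [m, M]:
    accept the first price at or above sqrt(m * M)."""
    return math.sqrt(m * M)

def trade(prices, m, M):
    """Convert the whole amount at the first acceptable price,
    or at the last offered price if none qualifies."""
    p_star = classical_reservation_price(m, M)
    for p in prices:
        if p >= p_star:
            return p
    return prices[-1]

# Example: with bounds [100, 400] the reservation price is 200,
# so the sequence below converts at 210.
print(trade([150, 180, 210, 390], m=100, M=400))
```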

Author 1: Javeria Iqbal
Author 2: Iftikhar Ahmad
Author 3: Asadullah Shah

Keywords: Time series search; one-way trading; online algorithms; update model

PDF

Paper 76: Introducing Multi Shippers Mechanism for Decentralized Cash on Delivery System

Abstract: One of the major problems of global e-commerce is the selling and buying of goods over the Internet between parties who may not trust each other. Cash on delivery allows customers to pay in cash when the product is delivered to their home or to a location they choose; customers thus receive the goods before making a payment. This paper investigates a critical verification issue in the cash on delivery process. In particular, we propose a multi-shipper mechanism that combines blockchain technology, smart contracts and the Hyperledger Fabric platform to achieve distributed and trustworthy verification across participants in decentralized markets. The proposed mechanism not only protects the seller's interests but also prevents shipper fraud. The solution leverages the consistency and robustness of decentralized markets, where trust is flexible and effectively controlled. To demonstrate the application and implementation of the proposed framework, we conduct several case studies on real-world transaction datasets from a local computer retailer. We also provide our source code for reproducibility and further development. Our conclusion is that the continued integration of the multi-shipper mechanism and blockchain technology in decentralized markets will cause significant transformations across several disciplines.
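As a loose, hedged illustration of the multi-shipper hand-off idea (plain Python with hypothetical names, not Hyperledger Fabric chaincode and not the paper's smart contracts), the order can be modelled as a small state machine in which every hand-over between shippers is recorded and the collateral is released only on confirmed delivery:

```python
from dataclasses import dataclass, field

@dataclass
class CodOrder:
    """Toy model of a cash-on-delivery order passed along several shippers."""
    order_id: str
    deposit: float                        # shipper collateral locked up-front
    handovers: list = field(default_factory=list)
    state: str = "CREATED"

    def hand_over(self, from_party: str, to_party: str) -> None:
        # Each leg of the route is recorded so every participant can audit it.
        self.handovers.append((from_party, to_party))
        self.state = "IN_TRANSIT"

    def confirm_delivery(self, buyer_payment: float) -> str:
        # Buyer pays in cash on receipt; deposits are released to shippers.
        self.state = "DELIVERED"
        return f"release {self.deposit:.2f} to shippers, {buyer_payment:.2f} to seller"

order = CodOrder(order_id="ORD-1", deposit=50.0)
order.hand_over("seller", "shipper_A")
order.hand_over("shipper_A", "shipper_B")
print(order.confirm_delivery(buyer_payment=120.0))
```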

Author 1: Hai Trieu Le
Author 2: Ngoc Tien Thanh Le
Author 3: Nguyen Ngoc Phien
Author 4: Nghia Duong-Trung
Author 5: Ha Xuan Son
Author 6: Thai Tam Huynh
Author 7: The Phuc Nguyen

Keywords: Blockchain; smart contract; Cash on Delivery (COD); hyperledger fabric

PDF

Paper 77: Content-based Automatic Video Genre Identification

Abstract: Video content is growing enormously with the heavy usage of the Internet and social media websites. Properly searching and indexing such video content is a major challenge. Existing video search relies largely on information provided by the user, such as the video caption, description and subsequent comments on the video. In such cases, if users provide insufficient or incorrect information about the video genre, the video may not be indexed correctly and may be ignored during search and retrieval. This paper proposes a mechanism to understand the content of a video and categorize it as Music Video, Talk Show, Movie/Drama, Animation or Sports. For video classification, the proposed system uses audio and visual features such as audio signal energy, zero crossing rate and spectral flux from the audio, and shot boundaries, scene count and actor motion from the video. The system is tested on popular Hollywood, Bollywood and YouTube videos, achieving an accuracy of 96%.
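A minimal sketch of three of the audio descriptors named above, computed on short frames with NumPy only (the framing and normalisation choices are assumptions, not the authors' exact feature definitions):

```python
import numpy as np

def short_time_energy(frame: np.ndarray) -> float:
    """Mean squared amplitude of one audio frame."""
    return float(np.sum(frame ** 2) / len(frame))

def zero_crossing_rate(frame: np.ndarray) -> float:
    """Fraction of adjacent samples whose sign differs."""
    return float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0)

def spectral_flux(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    """Squared difference between the normalised magnitude spectra
    of two consecutive frames; large values suggest abrupt changes."""
    prev_mag = np.abs(np.fft.rfft(prev_frame))
    mag = np.abs(np.fft.rfft(frame))
    prev_mag = prev_mag / (prev_mag.sum() + 1e-12)
    mag = mag / (mag.sum() + 1e-12)
    return float(np.sum((mag - prev_mag) ** 2))
```

These per-frame values would typically be averaged over a clip and fed, together with the visual features, to a classifier.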

Author 1: Faryal Shamsi
Author 2: Sher Muhammad Daudpota
Author 3: Sarang Shaikh

Keywords: Motion detection; scene detection; shot boundary detection; video genre identification

PDF

Paper 78: A Comparison of Sentiment Analysis Methods on Amazon Reviews of Mobile Phones

Abstract: Consumer reviews serve as feedback for businesses in terms of performance, product quality, and customer service. In this research, we predict consumer opinion from mobile phone reviews and provide an analysis of the most important factors behind reviews being classified as positive, negative, or neutral. This insight could help companies improve their products and help potential buyers make the right decision. The research presented in this paper was carried out as follows: the data was pre-processed and then converted from text to vector representations using a range of feature extraction techniques such as bag-of-words, TF-IDF, GloVe, and word2vec. We study the performance of different machine learning algorithms, such as logistic regression, stochastic gradient descent, naive Bayes and convolutional neural networks. In addition, we evaluate our models using accuracy, F1-score, precision, recall and the log loss function. Moreover, we apply the LIME technique to provide analytical reasons for reviews being classified as positive, negative or neutral. Our experiments revealed that a convolutional neural network with word2vec as the feature extraction technique provides the best results for both the unbalanced and balanced versions of the dataset.
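A short sketch of one of the simpler configurations mentioned (TF-IDF features with logistic regression), assuming scikit-learn; this is a baseline illustration, not the paper's best-performing CNN-plus-word2vec model:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

def train_baseline(reviews, labels):
    """TF-IDF + logistic regression baseline for three-class review sentiment.
    `reviews` is a list of review strings; `labels` holds
    "positive" / "negative" / "neutral" tags (hypothetical input format)."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        reviews, labels, test_size=0.2, stratify=labels, random_state=0)
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),
        LogisticRegression(max_iter=1000))
    model.fit(X_tr, y_tr)
    print(classification_report(y_te, model.predict(X_te)))
    return model
```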

Author 1: Sara Ashour Aljuhani
Author 2: Norah Saleh Alghamdi

Keywords: Bag-of-words; TF-IDF; GloVe; word2vec; logistic regression; stochastic gradient descent; naive Bayes; Convolutional Neural Network; log loss; LIME

PDF

Paper 79: Virtualizing a Cluster to Optimize the Problems of High Scientific Complexity within an Organization

Abstract: The Image Processing Research Laboratory (INTI-Lab) of the Universidad de Ciencias y Humanidades runs several computer science research projects that require substantial computational resources. Some of these projects involve climate prediction, molecule modeling, physical simulations, and other applications that generate significant amounts of data, raising big data concerns. Despite excellent hardware, final results are obtained only after hours or days of computation, depending on the complexity of the algorithm, so optimal solutions cannot be delivered in an acceptable time. In this work, we propose the virtualization and configuration of a high-performance cluster (HPC), commercially known as a "supercomputer", composed of several computers connected over a high-speed network so that they behave like a single machine. The virtualized cluster is used to run a scientific algorithm and performance tests across four virtual computers, to demonstrate that computation time is reduced by using more machines and that the approach can be implemented in the institution's laboratories.
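As a generic illustration (not the laboratory's actual workload) of why adding nodes shortens wall-clock time, a compute-bound job can be split across the processes of a cluster with mpi4py, each rank handling a slice of the iterations and the root rank combining the partial results:

```python
# Run with, e.g.:  mpirun -np 4 python pi_mpi.py   (hypothetical file name)
from mpi4py import MPI

def partial_pi(start: int, stop: int, step: float) -> float:
    """Midpoint-rule slice of the integral of 4/(1+x^2) over [0, 1]."""
    acc = 0.0
    for i in range(start, stop):
        x = (i + 0.5) * step
        acc += 4.0 / (1.0 + x * x)
    return acc * step

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 10_000_000
step = 1.0 / n
chunk = n // size
start = rank * chunk
stop = n if rank == size - 1 else start + chunk  # last rank takes the remainder

local = partial_pi(start, stop, step)
total = comm.reduce(local, op=MPI.SUM, root=0)

if rank == 0:
    print(f"pi ~= {total:.8f} computed on {size} process(es)")
```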

Author 1: Enrique Lee Huamaní
Author 2: Patricia Condori
Author 3: Avid Roman-Gonzalez

Keywords: High-performance cluster; distributed programming; computational parallelism; supercomputer; high-efficiency computing

PDF

Paper 80: School Manager System based on a Personal Information Architecture

Abstract: The current technological revolution has brought multiple benefits to human activities. Organizations, in turn, have had to adapt their business requirements, which has led them to migrate to systems and services based on more complex models. Educational institutions have also felt the impact of technological progress, because school management requires information to be handled through automated processes that also protect data from any human or cyber attack. The purpose of this paper is to show the development and integration of a system dedicated to managing personal information within a school environment, through the implementation of an information management architecture whose main purpose is to create certified documents that can be shared with other information systems in the same trust environment. The research is descriptive in nature, as it aims to detect anomalies in the characteristics of PIMS, describe their associations, and confirm or reject the hypothesis so that results can be compared with subsequent studies.

Author 1: Elena Fabiola Ruiz Ledesma
Author 2: Elizabeth Moreno Galván
Author 3: Juan Jesús Gutiérrez García
Author 4: Chadwick Carreto Arellano

Keywords: Architecture; mobile computing; ubiquitous computing; information management system

PDF

Paper 81: Fusing Identity Management, HL7 and Blockchain into a Global Healthcare Record Sharing Architecture

Abstract: Healthcare record sharing among various medical roles is a critical and challenging research problem, especially given today's ever-changing global IT solutions. The emergence of blockchain as a new enabling technology has brought radical changes to numerous business applications, including healthcare. Blockchain is a trusted distributed ledger that forms a decentralized infrastructure. There have been several proposals for sharing critical healthcare records over blockchain infrastructure without requiring prior knowledge of, or trust between, the parties involved (patients, service providers, and insurance companies). Another important issue is how to securely share medical records across countries for travelling patients, to ensure an integrated and ubiquitous healthcare service. In this paper, we present a globally integrated healthcare record sharing architecture based on blockchain and an HL7 client. Healthcare records are stored in the hosting country and are not stored on the blockchain. The architecture makes the medical records of travelling patients available temporarily and only after the necessary authentication. The actual authorisation is performed on a federated identity management system such as Shibboleth. Although there are similarities with identity management systems, our system is unique in that it involves the patient in the permission process and discloses to them the identities of the entities that accessed their health records. Our solution also improves performance and guarantees privacy and security through the use of blockchain and an identity management system.
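A very small, hedged illustration of the patient-in-the-loop idea described above, in plain Python with hypothetical class and field names (not the paper's architecture, HL7 client or blockchain code): access requires authentication plus an explicit patient grant, and every access attempt is logged so it can later be disclosed to the patient.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PatientRecordGateway:
    """Toy gateway guarding one patient's record."""
    patient_id: str
    permitted_parties: set = field(default_factory=set)
    access_log: list = field(default_factory=list)

    def grant(self, party: str) -> None:
        # The patient explicitly adds a provider to the permission list.
        self.permitted_parties.add(party)

    def request_access(self, party: str, authenticated: bool) -> bool:
        allowed = authenticated and party in self.permitted_parties
        self.access_log.append((datetime.now(timezone.utc), party, allowed))
        return allowed

    def disclose_accesses(self):
        # The patient can see who tried to read their record and when.
        return list(self.access_log)
```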

Author 1: Mohammad A. R. Abdeen
Author 2: Toqeer Ali
Author 3: Yasar Khan
Author 4: M.C.E. Yagoub

Keywords: Healthcare; blockchain; electronic health record; identity management; Health Level Seven (HL7)

PDF

Paper 82: An Efficient Machine Learning Technique to Classify and Recognize Handwritten and Printed Digits of Sudoku Puzzle

Abstract: In this paper, we propose a convolutional neural network model to recognize and classify the handwritten and printed digits present in Sudoku puzzles captured with a smartphone camera from various magazines and printed papers. The Sudoku puzzle grid is detected using image processing and filtering techniques such as adaptive thresholding. The system described in the paper is thoroughly tested on a set of 100 Sudoku images captured with smartphone cameras under varying conditions and shows promising results with 98% accuracy. Our model can handle the more complex conditions often present in phone-camera images as well as the added complexity of mixed printed and handwritten digits.
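A minimal sketch of a CNN digit classifier for cropped Sudoku cells, assuming TensorFlow/Keras; the layer sizes, input shape and training setup are assumptions rather than the authors' exact architecture:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_digit_cnn(input_shape=(28, 28, 1), num_classes=10):
    """Small CNN that maps a cropped, thresholded Sudoku cell image
    to one of the digit classes 0-9."""
    model = tf.keras.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dropout(0.3),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# model = build_digit_cnn()
# model.fit(cell_images, cell_labels, epochs=10, validation_split=0.1)
```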

Author 1: Sang C. Suh
Author 2: Aghalya Dharshni Manmatharaj

Keywords: Convolutional Neural Network (CNN); Artificial Neural Network (ANN); Deep Belief Network (DBN); Optical Character Recognition (OCR); Open Source Computer Vision (OpenCV); Convolutional Deep Belief Network (CDBN)

PDF
