IJACSA Volume 9 Issue 11

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Exploring Identifiers of Research Articles Related to Food and Disease using Artificial Intelligence

Abstract: Hundreds of studies in the literature have shown the link between food and a reduced risk of chronic disease. This study investigates the use of natural language processing and artificial intelligence techniques to develop a classifier that can identify, extract, and analyze food-health articles automatically. In particular, this research focuses on the automatic identification of health articles pertinent to the role of food in lowering the risk of cardiovascular disease, type-2 diabetes, and cancer. Three hundred food-health articles on this topic were analyzed to help identify a unique key (identifier) for each set of publications. These keys were employed to construct a classifier capable of performing online searches to identify and extract the requested scientific articles. The classifier showed promising results for the automatic analysis of food-health articles, which in turn would help food professionals and researchers carry out efficient literature searches and analyses in a timely fashion.
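
As a rough illustration of the n-gram-based classification the abstract describes (the paper's exact features, model, and data are not given here), the following sketch trains a word n-gram text classifier with scikit-learn; the articles and labels are hypothetical.

```python
# Hypothetical n-gram classifier for food-health articles (illustrative data only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "Oat beta-glucan intake lowers LDL cholesterol and cardiovascular risk",
    "Quarterly earnings rose sharply amid broad market optimism",
]
labels = [1, 0]  # 1 = food-health article, 0 = unrelated

# Word uni- and bi-grams ("ngrams" in the paper's keywords) as features
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(docs, labels)
print(model.predict(["Dietary fiber and the risk of type-2 diabetes"]))
```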

Author 1: Marco Ross
Author 2: El Sayed Mahmoud
Author 3: El-Sayed M. Abdel-Aal

Keywords: Natural language processing; text classification; ngrams; bioinformatics; knowledge extraction; nutrition assessment; health promotion; research uptake

PDF

Paper 2: A Method for Implementing Probabilistic Entity Resolution

Abstract: Deterministic and probabilistic matching are the two approaches most commonly used in Entity Resolution (ER) systems. While many users are familiar with writing and using Boolean rules for deterministic matching, fewer are as familiar with the scoring-rule configuration used to support probabilistic matching. This paper describes a method that uses deterministic matching to “bootstrap” probabilistic matching. It also examines the effectiveness of three commonly used strategies for mitigating the effect of missing values in probabilistic matching. The results are based on experiments with different sets of synthetically generated data processed using the OYSTER open-source entity resolution system.
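
For orientation, here is a minimal sketch of probabilistic scoring in the Fellegi-Sunter style (an assumption; the paper's actual scoring-rule configuration is not shown in the abstract): each agreeing attribute adds a log-likelihood weight, missing values are skipped, and a record pair matches when the total crosses a threshold.

```python
import math

# Hypothetical per-attribute probabilities: m = P(agree | match), u = P(agree | non-match)
WEIGHTS = {"name": (0.95, 0.05), "dob": (0.90, 0.01), "city": (0.80, 0.20)}

def score(rec_a, rec_b):
    total = 0.0
    for field, (m, u) in WEIGHTS.items():
        a, b = rec_a.get(field), rec_b.get(field)
        if a is None or b is None:
            continue  # one common missing-value strategy: ignore the field
        total += math.log2(m / u) if a == b else math.log2((1 - m) / (1 - u))
    return total

s = score({"name": "Ann Lee", "dob": "1990-01-02", "city": "Ames"},
          {"name": "Ann Lee", "dob": "1990-01-02", "city": None})
print(s, "-> match" if s > 5.0 else "-> no match")
```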

Author 1: Awaad Alsarkhi
Author 2: John R. Talburt

Keywords: Entity resolution; probabilistic matching; deterministic matching; boolean rules; scoring rule; missing values

PDF

Paper 3: Intellectual Paradigm of Artificial Vision: from Video-Intelligence to Strong Artificial Intelligence

Abstract: A new (post-Shannon) informational approach is suggested in this paper, which allows a deep analysis of the nature of information. It was found that information can be presented as an aggregate of quantitative (physical) and qualitative (structural) components considered together. Such a full information theory can be efficiently used as the guiding theory in modeling video-information recognition, perception, and understanding. These hierarchical processes solve intellectual tasks step by step to form the corresponding video-information evaluations, and also represent strong interaction-measurements of video-information that ensure the adequacy of these assessments. That is why corresponding video-information macro-objects (video-thesauruses) need to be built at every level of the hierarchy of an artificial vision system; they are formed by training (self-training) and together form an upward hierarchy of qualitative measuring scales. The top of this hierarchy is video-intelligence. The information theory of artificial intelligence is a logical development of the new informational approach from analysis to synthesis. Further “analysis through synthesis” allows establishing the informational nature and structure not only of video-intelligence, but also of strong artificial intelligence, which constitutes an intellectual suprasystem for video-intelligence.

Author 1: E. M. Yarichin
Author 2: V. M. Gruznov
Author 3: G. F. Yarichina

Keywords: Gauge approach; fibration space; informational hypergraph; video-intelligence; strong artificial intelligence

PDF

Paper 4: Anomaly Detection with Machine Learning and Graph Databases in Fraud Management

Abstract: In this paper, the task of fraud detection using data analysis and machine learning methods on social and transaction graphs is considered. Algorithms for feature calculation, outlier detection, and the identification of specific sub-graph patterns are proposed. A software realization of the proposed algorithms is described, and the results of an experimental study of the algorithms on sets of real and synthetic data are presented.
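
A minimal sketch of the general idea (graph-derived features feeding an outlier detector), assuming networkx and scikit-learn are available; the paper's actual features, graph database, and sub-graph pattern algorithms are not detailed in the abstract.

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy transaction graph: edges are transfers between accounts
G = nx.DiGraph([("a", "b"), ("b", "c"), ("a", "c"),
                ("d", "a"), ("e", "f"), ("f", "e")])

pr = nx.pagerank(G)
nodes = list(G)
# Per-account features: in-degree, out-degree, PageRank
feats = np.array([[G.in_degree(n), G.out_degree(n), pr[n]] for n in nodes])

outliers = IsolationForest(random_state=0).fit_predict(feats)  # -1 flags outliers
print(dict(zip(nodes, outliers)))
```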

Author 1: Shamil Magomedov
Author 2: Sergei Pavelyev
Author 3: Irina Ivanova
Author 4: Alexey Dobrotvorsky
Author 5: Marina Khrestina
Author 6: Timur Yusubaliev

Keywords: Data analysis; machine learning; graph database; fraud detection; anti-money laundering

PDF

Paper 5: A Hybrid Intelligent Model for Enhancing Healthcare Services on Cloud Environment

Abstract: Cloud computing plays a major role in addressing the challenges of healthcare services, such as disease diagnosis, telemedicine, and maximizing the utilization of medical resources. Early detection of chronic kidney disease in a cloud environment is a big challenge facing healthcare providers. This paper concentrates on using intelligent techniques such as decision trees, clustering, linear regression, modular neural networks, and back-propagation neural networks to address this challenge. The researchers propose a hybrid intelligent model based on cloud computing for the early detection of chronic kidney disease. Two intelligent techniques were used: linear regression and a neural network. Linear regression was used to identify crucial factors that have an impact on chronic kidney disease, and the detection model itself was built using a neural network. The accuracy of the proposed model is 97.8%, outperforming the models in previous work in terms of accuracy, precision, recall, and F1 score.
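
A rough sketch of the two-stage pipeline the abstract outlines (regression to rank factors, then a neural network on the selected factors), assuming scikit-learn; the dataset and the factor construction here are entirely hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                  # hypothetical clinical measurements
y = (X[:, 0] + 0.8 * X[:, 2] > 0).astype(int)  # hypothetical CKD label

# Stage 1: rank factors by the magnitude of their regression coefficients
coef = LinearRegression().fit(X, y).coef_
top = np.argsort(np.abs(coef))[-3:]            # keep the 3 most influential factors

# Stage 2: train a neural network on the selected factors only
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X[:, top], y)
print("selected factors:", top, "accuracy:", clf.score(X[:, top], y))
```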

Author 1: Ahmed Abdelaziz
Author 2: Ahmed S. Salama
Author 3: A.M. Riad

Keywords: Chronic kidney disease; linear regression; neural network; cloud computing

PDF

Paper 6: Implementation of a basic Sonar of Echolocation for Education in Telecommunications

Abstract: Currently, having an echolocation sonar in an electronics lab is complicated due to the high cost of its implementation, which is why the implementation of a basic sonar is proposed using agile technologies such as the Arduino. The setup also has a servomotor and an ultrasonic sensor, which is responsible for detecting the distance at which objects are located. The Arduino is in charge of controlling the servomotor movements, which range between 15 and 165 degrees; in addition, it sends the information through the serial port to a computer, where the data is processed and displayed using the Processing software.

Author 1: Fredy Criollo-Sánchez
Author 2: Rodríguez-Villarreal, Kevin
Author 3: Mosquera-Sanchez, Cristian
Author 4: Medina-Alvarez, Marco
Author 5: Chavarry-Polanco, Danny
Author 6: Alvarado-Díaz, Witman
Author 7: Meneses-Claudio, Brian
Author 8: Roman-Gonzalez, Avid

Keywords: Arduino; processing software; sonar; ultrasonic sensor; servomotor

PDF

Paper 7: Experimental Study on an Efficient Dengue Disease Management System

Abstract: Dengue has become a serious health hazard in Sri Lanka, with increasing cases and loss of human lives. It is necessary to develop an efficient dengue disease management system that can predict dengue outbreaks, plan countermeasures accordingly, and allocate resources for those countermeasures. We have proposed a platform for dengue disease management with the following modules: (1) a prediction module to predict dengue outbreaks, and (2) an optimization module to allocate hospital staff according to the predicted future dengue patient counts. This paper focuses on the optimization module, which has been developed using two approaches: (1) a Genetic Algorithm (GA) and (2) Iterated Local Search (ILS). We present the performance of our optimization module with a comparison of the two approaches. Our results show that the GA approach is considerably more efficient and faster than the ILS approach.
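
A bare-bones genetic algorithm loop of the kind used for staff allocation; the paper's real encoding, constraints, and fitness are not given in the abstract, so this toy version simply matches staffing levels to hypothetical predicted patient counts.

```python
import random
random.seed(0)

predicted = [30, 45, 20, 60]            # hypothetical patient counts per ward
def fitness(plan):                      # penalize under/over-staffing (1 nurse per 10 patients)
    return -sum(abs(n - p / 10) for n, p in zip(plan, predicted))

pop = [[random.randint(0, 8) for _ in predicted] for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                  # elitist selection
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(predicted))
        child = a[:cut] + b[cut:]                       # one-point crossover
        if random.random() < 0.2:                       # mutation
            child[random.randrange(len(child))] = random.randint(0, 8)
        children.append(child)
    pop = parents + children
print("best staffing plan:", max(pop, key=fitness))
```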

Author 1: J M.M.C Jayasuriya
Author 2: G.K.K.T.Galappaththi
Author 3: M.A.Dilupa Sampath
Author 4: H.N.Nipunika
Author 5: W.H. Rankothge

Keywords: Optimization; genetic algorithm; iterated local search; algorithm comparison; nurse scheduling

PDF

Paper 8: Industrial Internet of Things as a Challenge for Higher Education

Abstract: This paper examines the adoption of the Internet of Things (IoT) in industry (the so-called Industrial Internet of Things, IIoT for short) and the requirements for higher education in the times of the fourth industrial revolution. The addition of a fourth letter, "I", in front of "IoT" coins the name of the new concept, "IIoT", in relation to another term, "Industry 4.0". Because these concepts have no precise and widely accepted definitions, we present some considered relevant by the scientific literature. The paper also highlights the most important similarities and differences between these concepts. IIoT is a very dynamic concept that will constantly bring changes in digital technologies, requirements, and markets, and will also transform industries and business practices. According to manifold studies, there is currently a skills gap which may widen in the future if no action is taken. Higher education must adopt the latest related technologies and adapt to the new ways in which people, machines, services, and data can interact. Consequently, employees, students, graduates, and others have to be equally dynamic in learning and acquiring new skills. The transition from higher education to employment is a challenge that could be more easily addressed through the efforts of all stakeholders, from individuals to organizations, and from businesses to governments. As changes in higher education take time, all stakeholders must act now in preparing for the Industrial Internet of Things.

Author 1: Corneliu Octavian Turcu
Author 2: Cristina Elena Turcu

Keywords: Industry 4.0; industrial internet of things; internet of things; higher education; skills gap

PDF

Paper 9: Coverage-Aware Protocols in Wireless Sensor Networks: A Review

Abstract: Coverage plays a vital role in wireless sensor networks (WSNs), since it is one of the important measures of sensor network performance. The sensor nodes in a WSN have limited power and energy resources, so energy efficiency is an essential factor that should be considered along with coverage when designing coverage protocols. During the past few years, researchers have made many efforts to design different coverage-aware protocols, and different protocols may take different approaches to solving coverage and the efficient utilization of energy among sensor nodes. In this paper, we present a review of coverage-aware protocols, highlighting their functionalities.

Author 1: Jahangir Khan
Author 2: Khalid Mahmood
Author 3: Ansar Munir Shah
Author 4: Babar Nawaz
Author 5: Mahmood ul Hassan

Keywords: Wireless sensor network; coverage area; energy efficiency; coverage protocols

PDF

Paper 10: Wireless Internet of Things-Based Air Quality Device for Smart Pollution Monitoring

Abstract: Living in a healthy environment, whether indoors or outdoors, is a need for every human being. However, pollution occurs everywhere, and most people are mindful only of the importance of having clean outdoor air to breathe, without being concerned about indoor air quality. Indoor air quality refers to the air quality within a building and relates to the health and comfort of its occupants. Dangerous particles in the outside air pollute the indoor environment and produce harmful conditions as the polluted air travels into the house or building through windows or doors. Therefore, a wireless Internet of Things-based air quality device is developed to monitor the air quality of the indoor environment. The proposed system integrates a low-cost air quality sensor, temperature and humidity sensors, a single-board computer (a Raspberry Pi 2) and cloud storage. The system provides real-time air quality readings, transfers the data through a wireless network to the Internet, and displays the data on a dedicated webpage. Furthermore, it stores records in cloud storage and sends an e-mail notification to the user when an unhealthy condition is detected. The study has a significant impact on promoting affordable and portable smart pollution monitoring, as the device is developed using low-cost, off-the-shelf components.

Author 1: Nurul Azma Zakaria
Author 2: Zaheera Zainal Abidin
Author 3: Norharyati Harum
Author 4: Low Chen Hau
Author 5: Nabeel Saleh Ali
Author 6: Fairul Azni Jafar

Keywords: Internet of things (IoT); single-board computer; cloud storage; smart pollution monitoring

PDF

Paper 11: Securing Locations of Mobile Nodes in Wireless Mesh Networks

Abstract: The current deployment of wireless mesh networks requires mobility management to track the current locations of mobile nodes around the network without service interruption. To do so, the Hierarchical Mobile IPv6 protocol has been chosen, which minimises the required signalling by introducing a new entity, the mobile anchor point, that acts as a local home agent for all visiting mobile nodes in a specific domain. It allows a mobile node to register its local/regional care-of addresses with a mobile anchor point by sending a local binding update message. However, the local binding update is quite sensitive, as it modifies the routing that enables mobility in wireless mesh networks. When a local binding update message is spoofed, an attacker can redirect traffic destined for a legitimate mobile node either to itself or to another node, which leads to an increased risk of attacks. Therefore, this paper contributes to addressing this security issue in wireless mesh networks through the cryptographic generation and verification of a mobile node's local and regional care-of addresses, together with a novel method to verify the reachability of a mobile node at its claimed local care-of address; together these form the enhanced mobile anchor point registration protocol. The Scyther tool has been used to verify the correctness of the proposed protocol, and its performance, in terms of mobile anchor point registration delay and signalling overhead, is evaluated using the OPNET Modeler simulator.

Author 1: Sultan Alkhliwi

Keywords: Wireless mesh networks; hierarchical mobile IPv6 protocol; authentication; secret key; Scyther tool; OPNET simulation

PDF

Paper 12: Voice Pathology Recognition and Classification using Noise Related Features

Abstract: Nowadays, diseases of the voice are increasing because of bad social habits and the misuse of the voice. These pathologies should be treated from the beginning; indeed, diseases of the voice do not necessarily affect the quality of the voice as heard by a listener. The most useful tool for diagnosing such diseases is acoustic analysis. In this work, we present new parameters to clarify the description of the vocal signal and help classify unhealthy voices. They essentially describe the fundamental frequency (F0), the harmonics-to-noise ratio (HNR), the noise-to-harmonics ratio (NHR), and Detrended Fluctuation Analysis (DFA). Classification is performed on two pathological voice databases, the Saarbruecken Voice Database and MEEI, using HTK classifiers. Two classification types are considered: the first is binary, between normal and pathological voices; the second is a four-category classification into spasmodic, polyp, nodule, and normal voices, for both female and male speakers. We also study the effects of these new parameters when combined with the MFCC, delta, delta-second, and energy coefficients.

Author 1: HAMDI Rabeh
Author 2: HAJJI Salah
Author 3: CHERIF Adnane

Keywords: HTK; MFCC; MEEI; SVD; pathological voices

PDF

Paper 13: Developing A Model for Predicting the Speech Intelligibility of South Korean Children with Cochlear Implantation using a Random Forest Algorithm

Abstract: The random forest technique, a tree-based learning model, makes predictions using an ensemble of random decision trees built on bootstrap samples, which gives it high predictive power and fewer errors. This study aimed to provide baseline data for language therapy after cochlear implantation by identifying the factors associated with the speech intelligibility of children with cochlear implants. The study targeted 82 hearing-impaired children living in the Seoul, Incheon, and Suwon areas, aged between 4 and 8 years, who had worn a cochlear implant for at least one year and less than five years. Explanatory variables included gender, age, household income, the wear time of the cochlear implant, vocabulary index, and corrected hearing. Speech intelligibility was analyzed using a 'speech intelligibility test tool' composed of nine sentences, and the predictive model was developed using random forest. The major predictors of the articulation accuracy of children with cochlear implants were the wear time of the cochlear implant, the time since cochlear implantation, vocabulary, household income, age, and gender, in descending order of magnitude. The final error rate of the random forest model, developed by generating 500 bootstrap samples, was 0.22, and the prediction rate was 78.8%. These results suggest that it would be necessary to implement cochlear implantation and develop a customized aural rehabilitation program that considers the linguistic ability of the subject in order to enhance the speech intelligibility of children with cochlear implants.
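
A small sketch of the approach described (a 500-tree random forest with variable importances and an out-of-bag error rate), assuming scikit-learn; the data below is synthetic, and only the variable names come from the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
cols = ["gender", "age", "income", "wear_time", "vocab_index", "corrected_hearing"]
X = rng.normal(size=(82, len(cols)))                   # synthetic stand-in data
y = (X[:, 3] + 0.5 * X[:, 4] > 0).astype(int)          # synthetic outcome

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=1)
rf.fit(X, y)
print("OOB error rate:", round(1 - rf.oob_score_, 2))  # cf. the paper's 0.22
for name, imp in sorted(zip(cols, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")                        # predictors by importance
```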

Author 1: Haewon Byeon

Keywords: Random forest; hearing impairment; vocabulary index; speech intelligibility; risk factor; data mining

PDF

Paper 14: Ranking Method in Group Decision Support to Determine the Regional Prioritized Areas and Leading Sectors using Garrett Score

Abstract: The main objective of regional development is to achieve equal development across different regions. However, the long duration and complexity of the process may leave some regions underdeveloped. To achieve a fair development process for each region, a standard approach must be developed to select suitable priority areas and leading sectors, so that these areas can support other underdeveloped regions that still require attention in development priorities. This research provides a new alternative for determining prioritized areas, not only by observing development data, but also by involving decision-making components consisting of government and community (including non-governmental organizations and academics). The study used a group decision support approach with the Garrett ranking technique. The results showed that 5 of the 29 regencies/municipalities in Papua Province can be used as prioritized areas, namely Jayapura Regency, Jayapura Municipality, Mimika, Merauke, and Nabire, and that there are three leading sectors for development, namely the agricultural, mining, and industrial-and-processing sectors. The ranking results were tested by calculating Spearman's correlation coefficient on the Garrett ranking results, giving a coefficient of 0.807, which indicates very strong agreement.
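
For illustration, a sketch of Garrett's ranking technique: each rank R given by a respondent is converted to a percent position 100(R - 0.5)/N, mapped to a Garrett score, and options are ordered by their mean score. Garrett scores are normally read from Garrett's table; approximating them from the normal distribution, as done here, is an assumption.

```python
import numpy as np
from scipy.stats import norm

# Rows = respondents, columns = options (e.g., candidate areas); values are ranks 1..N
ranks = np.array([[1, 2, 3],
                  [2, 1, 3],
                  [1, 3, 2]])
N = ranks.shape[1]

percent_pos = 100 * (ranks - 0.5) / N
# Approximate Garrett score via a normal curve (mean 50, sd 20) instead of the table
garrett = 50 + 20 * norm.ppf(1 - percent_pos / 100)
mean_scores = garrett.mean(axis=0)
print("mean Garrett scores:", mean_scores.round(1))
print("ranking (best first):", np.argsort(-mean_scores))
```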

Author 1: Heru Ismanto
Author 2: Suharto
Author 3: Azhari
Author 4: Lincolin Arsyad

Keywords: Priority area; leading sector; garrett; decision group support; spearman correlation

PDF

Paper 15: A Hybrid Technique for Tunneling Mechanism of IPv6 using Teredo and 6RD to Enhance the Network Performance

Abstract: Internet Protocol version 4 (IPv4) addresses are now depleted. Many Internet Service Providers (ISPs), researchers, and end users are migrating from IPv4 to IPv6 due to the strong features of IPv6 and the limitations of IPv4. Different tunneling techniques have been deployed to help ordinary users migrate to IPv6; however, these techniques create many issues, such as compatibility, complexity, connectivity, and traffic. Due to their dissimilar header structures and incompatibility, IPv4 and IPv6 devices are unable to communicate with each other when devices do not support IPv6 addresses directly, and network performance is also compromised by the huge increase in data transmission traffic. In this paper, we propose a technique that provides full IPv6 connectivity and enhances network performance by combining two tunneling techniques, IPv6 Rapid Deployment (6RD) and Teredo. To increase the throughput of the network, jumbo frames are used to carry large amounts of data. The main objective of combining both techniques is to provide a hybrid network offering full IPv6 connectivity; the proposed technique provides not only full IPv6 connectivity but also better performance in the network. Simulation results show that throughput and packet delivery ratio achieve their maximum gains at 9000-byte frames and 98%, respectively.

Author 1: Zulfiqar Ali Zardari
Author 2: Munwar Ali
Author 3: Reehan Ali Shah
Author 4: Ladha Hussain Zardari

Keywords: Hybrid network; IPv4; IPv6; 6RD; teredo tunneling mechanism; network performance; jumbo frames

PDF

Paper 16: An Empirical Study of App Permissions: A User Protection Motivation Behaviour

Abstract: The smartphone is a telecommunications medium that can be used anytime and anywhere. To support their activities, smartphone users install applications on their devices. When installing an application, the application presents permissions describing the data that will be collected. However, many users choose to ignore and not read these app permissions, since they are too long or difficult to understand; they accept the permissions without thinking, which consequently leads to security problems. This study aims to determine the factors that affect whether users read the app permissions provided by an application before installing it. Data were collected from 292 respondents who were active smartphone users and analyzed using Structural Equation Modeling (SEM). The results show that the factors that influence users to read the app permissions before installing an application are coping self-efficacy and personal responsibility.

Author 1: Ari Kusyanti
Author 2: Harin Puspa Ayu Catherina

Keywords: Protection motivation theory (PMT); application permission; smartphone; structural equation modeling (SEM)

PDF

Paper 17: Automated Extraction of Large Scale Scanned Document Images using Google Vision OCR in Apache Hadoop Environment

Abstract: Digitization of documents is now being carried out in all fields to reduce paper usage. The availability of modern technology in the form of scanners and cameras supports the growth of multimedia data, especially documents stored as image files. Searching for a particular text in a large-scale collection of scanned document images is a difficult task if the text has not yet been extracted from the images. In this research, a text extraction method for large-scale scanned document images using Google Vision OCR on a Hadoop architecture is proposed. The objects of the research are student thesis documents, including the cover page, the approval page, and the abstract, all stored in the university's digital library. The extraction process begins by preparing an input folder containing the image documents (in JPEG format) in the Hadoop Distributed File System (HDFS) and reading each image document. Each document is then extracted using Google Vision OCR to obtain a text document (in TXT format), and the result is saved to an output folder in HDFS; the same process is repeated for all documents in the folder. Test results show that the proposed method was able to extract all test documents successfully: the recognition process achieved 100% accuracy, and extraction was twice as fast as manual extraction. Google Vision OCR also showed better extraction performance than other OCR tools. The proposed automated extraction system can recognize text in large-scale image documents accurately and can be operated in a real-time environment.
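
For reference, a minimal Google Vision OCR call for one scanned page, assuming the google-cloud-vision Python client is installed and credentials are configured; the Hadoop-side orchestration described in the abstract is not shown, and the file names are hypothetical.

```python
# Minimal Google Vision OCR call for a single scanned page (JPEG in, TXT out).
# Assumes `pip install google-cloud-vision` and GOOGLE_APPLICATION_CREDENTIALS set.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("cover_page.jpg", "rb") as f:          # hypothetical input file
    image = vision.Image(content=f.read())

# document_text_detection is tuned for dense text such as scanned documents
response = client.document_text_detection(image=image)
with open("cover_page.txt", "w", encoding="utf-8") as out:
    out.write(response.full_text_annotation.text)
```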

Author 1: Rifiana Arief
Author 2: Achmad Benny Mutiara
Author 3: Tubagus Maulana Kusuma
Author 4: Hustinawaty

Keywords: Automation; extraction; google vision OCR; hadoop; scanned document images

PDF

Paper 18: Optimal Overcurrent Relays Coordination using an Improved Grey Wolf Optimizer

Abstract: Recently, nature-inspired algorithms (NIA) have been applied to various fields of optimization problems. In this paper, the application of an NIA to the overcurrent relay coordination problem is reported. The purpose is to find the optimal values of the Time Multiplier Setting (TMS) and Plug Setting (PS) in order to minimize the primary relays' operating time for near-end faults. The optimization is performed using an Improved Grey Wolf Optimization (IGWO) algorithm, in which some modifications to the original GWO have been made to improve the candidates' exploration ability. Comprehensive simulation studies demonstrate the reliability and efficiency of the proposed modification compared to the conventional GWO and some well-known algorithms. The generated results confirm that the proposed IGWO is able to optimize the objective function of the overcurrent relay coordination problem.
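
A sketch of the standard GWO position update that IGWO builds on (the specific improvement is not described in the abstract, so only the conventional update is shown), applied to a toy objective standing in for total relay operating time.

```python
import numpy as np
rng = np.random.default_rng(0)

def objective(x):                 # toy stand-in for the relay coordination objective
    return np.sum(x ** 2)

dim, n_wolves, iters = 4, 10, 200
X = rng.uniform(-5, 5, (n_wolves, dim))
for t in range(iters):
    a = 2 - 2 * t / iters                          # 'a' decreases linearly 2 -> 0
    alpha, beta, delta = sorted(X, key=objective)[:3]
    for i in range(n_wolves):
        new = np.zeros(dim)
        for leader in (alpha, beta, delta):
            A = a * (2 * rng.random(dim) - 1)      # exploration/exploitation coefficient
            C = 2 * rng.random(dim)
            new += leader - A * np.abs(C * leader - X[i])
        X[i] = new / 3                             # average of moves toward the 3 leaders
print("best solution:", min(X, key=objective))
```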

Author 1: Noor Zaihah Jamal
Author 2: Mohd Herwan Sulaiman
Author 3: Omar Aliman
Author 4: Zuriani Mustaffa

Keywords: Time multiplier setting (TMS); plug setting (PS); grey wolf optimization algorithm (GWO); overcurrent relay coordination

PDF

Paper 19: Risk Assessment Method for Insider Threats in Cyber Security: A Review

Abstract: Today, a major challenge in manufacturing is managing large-scale cybersecurity systems, which are potentially exposed to a multitude of threats. The riskiest of these are insider threats: an insider threat arises when a person authorized to perform certain actions in an organization decides to abuse that trust and harm the organization. To overcome these risks, this study evaluates various risk assessment methods for assessing the impact of insider threats and analyses the current gaps in risk assessment methods. Based on a manual literature search, we compare four methods: NIST, FRAP, OCTAVE, and CRAMM. The results of the study show that the method most used by organizations is NIST, because NIST combines human and system involvement in data collection. The significance of this study is its contribution towards developing a new method for analyzing threats that can be used in any organization.

Author 1: Nurul Akmal Hashim
Author 2: Zaheera Zainal Abidin
Author 3: A.P. Puvanasvaran
Author 4: Nurul Azma Zakaria
Author 5: Rabiah Ahmad

Keywords: Insider threats; manufacturing; risk assessment; cyber security; threats; risk

PDF

Paper 20: BAAC: Bangor Arabic Annotated Corpus

Abstract: This paper describes the creation of the new Bangor Arabic Annotated Corpus (BAAC), a Modern Standard Arabic (MSA) corpus comprising 50K words manually annotated with part-of-speech tags. For evaluating the quality of the corpus, the Kappa coefficient and a direct percent agreement for each tag were calculated, yielding a Kappa value of 0.956 with an average observed agreement of 94.25%. The corpus was used to evaluate the widely used Madamira Arabic part-of-speech tagger and to further investigate compression models for text compressed using part-of-speech tags. A new annotation tool was also developed and employed for the annotation process of BAAC.
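
As a reminder of the agreement measures used, a sketch computing raw percent agreement and Cohen's kappa between two annotators' tag sequences, assuming scikit-learn; the tags below are illustrative, not from BAAC.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical POS tags from two annotators over the same tokens
ann1 = ["NOUN", "VERB", "NOUN", "ADJ", "NOUN", "PART"]
ann2 = ["NOUN", "VERB", "NOUN", "NOUN", "NOUN", "PART"]

agreement = sum(a == b for a, b in zip(ann1, ann2)) / len(ann1)
print("observed agreement:", agreement)                 # cf. the paper's 94.25%
print("Cohen's kappa:", cohen_kappa_score(ann1, ann2))  # cf. the paper's 0.956
```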

Author 1: Ibrahim S Alkhazi
Author 2: William J. Teahan

Keywords: Component; arabic language; corpus; annotated corpora; analysis results

PDF

Paper 21: Implicit Thinking Knowledge Injection Framework for Agile Requirements Engineering

Abstract: Agile has become a commonly used software development methodology, and its success depends on face-to-face communication among software developers and faster software product delivery. Implicit thinking knowledge is considered very significant for organizational self-learning, and the main goal of managing it is to retrieve valuable information about how the software is developed. However, requirements documentation is a challenging task for Agile software engineers, and current Agile requirements documentation does not incorporate the implicit thinking knowledge with the values it intends to achieve in the software project. This research addresses this issue and introduces a framework that helps inject implicit thinking knowledge into Agile requirements engineering. A survey questionnaire and a case study of a real project were used to evaluate the framework. The results show that the framework enables software engineers to share and document their implicit thinking knowledge during Agile requirements documentation.

Author 1: Kaiss Elghariani
Author 2: Nazri Kama
Author 3: Nurulhuda Firdaus Mohd Azmi
Author 4: Nur Azaliah Abu bakar

Keywords: Software development methodology; agile methodology; requirements engineering; requirements documentation; implicit thinking documentation

PDF

Paper 22: Usability Testing for Crop and Farmer Activity Information System

Abstract: The usability level of an information system depends on user acceptance and how convenient the system is to operate. One method to measure the usability level is usability testing. This article describes usability testing for the Crop and Farmer Activity Information System, an agriculture information system developed to record activities for each farm field and an important application of Information and Communication Technology (ICT) for agriculture. The system has been under development since 2017 and needs to be assessed and tested. Usability testing was conducted with samples from two regions in Central Java, Temanggung and Gombong. The respondents were system administrators, farmers, and general users, each group with different criteria; 58 respondents participated in this research: 49 farmers, 3 system administrators, and 6 general users. Usability testing was carried out by giving respondents several test tasks, each in accordance with the system functionality for that type of user. The tests found that the user interface assessment averaged 69% for system administrators, 76% for farmers, and 79% for general users. The tests also yielded recommendations for system refinement, drawn from user input and test results, aimed at providing a better system environment.

Author 1: Halim Budi Santoso
Author 2: Rosa Delima
Author 3: Emylia Intan Listyaningsih
Author 4: Argo Wibowo

Keywords: Usability testing; crop and activity information system; improvement recommendation; precision farming; information technology for agriculture

PDF

Paper 23: A Novel Architecture for 5G Ultra Dense Heterogeneous Cellular Network

Abstract: The mounting use of wireless devices and the wide range of applications in ultra-dense heterogeneous wireless networks have led to challenging circumstances that could not be handled up to 4G. To deal with these critical challenges, the fifth generation (5G) wireless network architecture requires an efficient, well-organized wireless network. In this paper, a novel architecture for the 5G ultra-dense heterogeneous cellular network is proposed, considering two main aspects: massive MIMO-OFDMA and IP-based vertical handover. To achieve full network coverage, the whole macro area network is divided into microcells, and each microcell is further divided into smaller cells. The heterogeneity of different types of base stations (macro area network base stations, micro-cell base stations, and small cell base stations) provides efficient network coverage, and reducing the cell area improves frequency efficiency and coverage. All base stations are equipped with massive MIMO-OFDMA antennas and different radio access technologies, so a single wireless device can switch from one radio access technology to another. To prevent link disconnection and preserve the IP address, whenever a wireless device needs to perform a vertical handover, a new connection is first established with the new radio access technology using the same IP address, and only then is the current connection released. By utilizing IP-based vertical handover, the new 5G wireless network can achieve the principal goal of service continuity and minimize handover processing delay. Simulation results show the improvement in network performance.

Author 1: Sabeen Tahir

Keywords: Massive MIMO-OFDMA; 5G; IP-based interoperability; heterogeneous

PDF

Paper 24: Audio Augmentation for Traffic Signs: A Case Study of Pakistani Traffic Signs

Abstract: Augmented Reality (AR) extends the appearance of the real world by adding digital information to the scene using computer graphics and image processing techniques. Various approaches have been used to detect, identify, and track objects in real environments, depending upon the application, the shape of the tracked object, and the type of environment. The marker-based tracking technique, in which fiducial markers are placed in the real world for tracking, is the most commonly used method in augmented reality applications. In this work, we propose a model to detect and identify traffic signs through a marker-based technique, to improve the usability of marker-based detection in augmented reality applications. We developed an AR application that detects and recognizes markers designed for Pakistani traffic signs and augments them with a voice alert, so that the driver is prepared for upcoming hazards on the road. To the best of our knowledge, no previous work has augmented traffic signs with voice. Experiments show that the model outperforms baseline techniques.

Author 1: Abdul Wahab
Author 2: Aurangzeb Khan
Author 3: Ihsan Rabbi
Author 4: Khairullah Khan
Author 5: Nasir Gul

Keywords: Augmented reality; traffic sign detection; traffic sign recognition

PDF

Paper 25: DDoS Classification Using Neural Network and Naïve Bayes Methods for Network Forensics

Abstract: Distributed Denial of Service (DDoS) is a network security problem that continues to grow dynamically and has increased significantly to date. DDoS is a type of attack carried out by draining the available resources in the network, flooding it with packets at significant intensity so that the system becomes overloaded and stops. These attacks result in enormous losses for institutions and companies engaged in online services; prolonged downtime and substantial recovery costs are additional losses due to the loss of integrity. Damaging, disrupting, or stealing data, and anything else detrimental to the system owner on a computer network, is an illegal act that can be prosecuted in court, and criminals can be punished based on evidence found through network forensics mechanisms. In this work, DDoS attacks are classified based on network traffic activity using the neural network and naïve Bayes methods. Based on the experiments conducted, the artificial neural network achieved an accuracy of 95.23% and naïve Bayes 99.9%, showing that the naïve Bayes method performs better than the neural network on this task. The results of the experiment and analysis can be used as evidence in the trial process.
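
A small sketch of the classification step (naïve Bayes over traffic features), assuming scikit-learn; the features and data here are synthetic stand-ins for real flow records, not the authors' dataset.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Synthetic flow features: [packets/sec, bytes/sec, distinct source IPs]
normal = rng.normal([100, 5e4, 20], [30, 1e4, 5], size=(500, 3))
ddos = rng.normal([5000, 2e6, 900], [800, 3e5, 100], size=(500, 3))
X = np.vstack([normal, ddos])
y = np.array([0] * 500 + [1] * 500)          # 0 = benign, 1 = DDoS

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
nb = GaussianNB().fit(Xtr, ytr)
print("accuracy:", accuracy_score(yte, nb.predict(Xte)))
```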

Author 1: Anton Yudhana
Author 2: Imam Riadi
Author 3: Faizin Ridho

Keywords: DDoS; IDS; neural network; naïve bayes; network forensics

PDF

Paper 26: A Context-Sensitive Approach to Find Optimum Language Model for Automatic Bangla Spelling Correction

Abstract: Automated spelling correction is an important feature of typing aids, helping both literate and semi-literate people while using keyboards or similar devices, and it significantly helps students apply proper words during word processing in the course of learning. A lot of work has been conducted for English, but for Bangla it is still not adequate, and all work done so far for Bangla is context-free. Bangla is one of the most widely spoken languages (3.05% of the world population) and is considered the seventh most spoken language in the world. In this paper, we propose a context-sensitive approach for automated spelling correction in Bangla, making combined use of edit distance and a stochastic, i.e. N-gram, language model. We use six N-gram models in total, and a novel approach is deployed to find the optimum language model in terms of performance. In addition, a large Bangla corpus of different word types is used for evaluation. We have achieved a satisfactory and promising accuracy of 87.58%.
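
A compact sketch of the combination the abstract describes: edit distance proposes candidate words, and an N-gram (here bigram) model scores them in context. The vocabulary and counts are toy English stand-ins, purely illustrative of the mechanism.

```python
def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

vocab = ["their", "there", "then", "them"]
bigrams = {("over", "there"): 9, ("over", "their"): 1}  # toy corpus counts

def correct(prev_word, word):
    candidates = [w for w in vocab if edit_distance(word, w) <= 2]
    # Rank by bigram frequency with the preceding word (the context-sensitive part)
    return max(candidates, key=lambda w: bigrams.get((prev_word, w), 0))

print(correct("over", "ther"))  # -> "there": context decides among close candidates
```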

Author 1: Muhammad Ifte Khairul Islam
Author 2: Md. Tarek Habib
Author 3: Md. Sadekur Rahman
Author 4: Md. Riazur Rahman
Author 5: Farruk Ahmed

Keywords: Spelling correction; non-word error; N-gram; edit distance; magnifying search; accuracy

PDF

Paper 27: Performance Comparison between Merge and Quick Sort Algorithms in Data Structure

Abstract: In computer science, one of the basic operations is sorting, and many operations use it as an intermediate step. Sorting is the procedure of ordering a list of elements in ascending or descending order with the help of a key value. Many sorting algorithms have been designed and are in use. This paper presents a performance comparison between two of them, merge sort and quick sort, and evaluates their performance in terms of time and space complexity. Both algorithms are vital and have been studied for a long period, yet the question remains which of them to use and when; this research study was therefore carried out. Each algorithm resolves the problem of sorting data with a unique method, and this study offers a complete account of how both algorithms operate, then distinguishes them based on various constraints to reach an outcome.
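
To make the comparison concrete, here is a sketch of both algorithms with a simple timing harness; timings will vary with input size and distribution, and this is not the paper's benchmark setup.

```python
import random
import time

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def quick_sort(a):
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]                    # partition around a middle pivot
    return (quick_sort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quick_sort([x for x in a if x > pivot]))

data = random.sample(range(100000), 20000)
for f in (merge_sort, quick_sort):
    t0 = time.perf_counter()
    assert f(data) == sorted(data)
    print(f.__name__, round(time.perf_counter() - t0, 3), "s")
```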

Author 1: Irfan Ali
Author 2: Haque Nawaz
Author 3: Imran Khan
Author 4: Abdullah Maitlo
Author 5: M. Ameen Chhajro
Author 6: M. Malook Rind

Keywords: Performance; analysis of algorithm; merge sort; quick sort; complexity; time and space

PDF

Paper 28: Efficient Image Cipher using 2D Logistic Mapping and Singular Value Decomposition

Abstract: This paper proposes an efficient image cryptosystem that depends on the utilization of a two-dimensional (2D) chaotic logistic map (CLM) and singular value decomposition (SVD). The encryption process starts with a confusion stage that applies the 2D-CLM to the input plain image. The resulting logistic-transformed image is then decomposed using the SVD technique into three ciphered components: the horizontal, vertical, and diagonal components. These ciphered components are transmitted to the destination, which applies the reverse procedure to reconstruct the original plain image. A battery of encryption quality tests is performed to investigate the proposed 2D-CLM-based SVD image cipher, and the obtained test results confirm its efficiency.
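
A rough numerical sketch of the two building blocks, assuming numpy. For safety, the keystream here comes from the classic 1D logistic map; the paper uses a 2D variant whose exact equations the abstract does not give. The XOR-confusion plus SVD wiring is likewise only illustrative, not the authors' exact cipher.

```python
import numpy as np

def logistic_keystream(n, r=3.99, x=0.31):
    """Classic logistic map x <- r*x*(1-x); the paper uses a 2D variant."""
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)     # stand-in plain image
key = (logistic_keystream(img.size) * 255).astype(np.uint8).reshape(img.shape)
confused = img ^ key                                         # chaotic confusion stage

U, S, Vt = np.linalg.svd(confused.astype(float))             # decompose into 3 parts
restored = np.rint(U @ np.diag(S) @ Vt).astype(np.uint8) ^ key
assert np.array_equal(restored, img)                         # receiver's reverse path
```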

Author 1: Mohammed A. AlZain

Keywords: Image cipher; 2D-CLM; SVD

PDF

Paper 29: Agent-Based Co-Modeling of Information Society and Wealth Distribution

Abstract: With empirical studies suggesting that information technology influences wealth distribution in different ways, and with economic interaction and information technology adoption being two complex phenomena, there is a need for a simulation approach that addresses the whole complexity of the issue without being too costly in terms of computation and without ignoring relevant empirical facts in defining the behavior of the different agents. While this problem seems to require a bottom-up approach using agent-based modeling, the further complexity of managing heterogeneous agents in space and time, and the need for an appropriate separation of domain areas, show the limitations of that approach in practice. In this paper we illustrate the use of novel multi-level agent-based concepts on this socio-economic issue by considering the studied phenomenon as an interference of multiple simpler phenomena, namely a basic producer/consumer economy and a diffusion-of-information model. Such an approach involves writing models in a formalism that allows compatibility and the exchange of variables, in addition to implementing appropriate synchronization algorithms. Our simulation used LevelSpace, a recent extension of the NetLogo simulation tool, combined with data exploration tools, but the patterns described are generic and can be implemented in other simulation tools. Indeed, our case study offers a building block for a framework that can investigate wealth dynamics and other analogous cases with influence between models. Our approach successfully validates against empirical macro-trends in the distribution of wealth and other social patterns. Thanks to its flexibility in conducting experiments, we could relax the hypotheses that prevented previous models from conducting a multi-dimensional analysis of the Gini index, enabling conflicting research issues to be resolved.
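
As a flavor of the macro-level validation mentioned, here is a tiny exchange-economy sketch computing the Gini index of the resulting wealth distribution; this is a generic illustration, not the authors' NetLogo/LevelSpace model.

```python
import random
random.seed(0)

wealth = [100.0] * 200                      # 200 agents, equal starting wealth
for _ in range(50000):                      # random pairwise exchanges
    i, j = random.randrange(200), random.randrange(200)
    if wealth[i] >= 1:
        wealth[i] -= 1; wealth[j] += 1

def gini(w):
    w = sorted(w); n = len(w); total = sum(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return (2 * cum) / (n * total) - (n + 1) / n

print("Gini index:", round(gini(wealth), 3))   # inequality emerges from exchange
```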

Author 1: Fayçal Yahyaoui
Author 2: Mohamed Tkiouat

Keywords: Agent-based modeling; computational economics; multi-level; complex systems; parallel computing

PDF

Paper 30: Data Flows Management and Control in Computer Networks

Abstract: In computer networks, the loss of data packets is inevitable because of buffer memory overflow at one or more of the nodes located on the path from the source to the receiver, including the receiver itself. Such overflow-related losses are hereinafter referred to as congestion of network nodes. There are many ways to prevent and eliminate overloads, most of them based on the management of data flows; servicing packets according to their priorities occupies a special place among them. The ideas behind these solutions are simple enough to implement in the software and hardware of telecommunication devices. This article considers a number of original solutions to these problems at a level sufficient for the development of new generations of telecommunication devices and systems: allowing the transmission of a low-priority packet to be interrupted at practically any stage so that a high-priority packet can be transmitted and the interrupted transfer then resumed, as well as warning the data source in time about the threat of overloading one or several nodes along the packets' route.
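
A toy sketch of the priority-with-preemption idea described, using Python's heapq: a high-priority packet interrupts the in-progress low-priority one, which resumes afterwards. The packet model and chunk sizes are hypothetical.

```python
import heapq

queue = []                                  # (priority, name, bytes_remaining)
heapq.heappush(queue, (5, "bulk-data", 9000))

def transmit(mtu=1500):
    """Send one chunk of the highest-priority packet, then requeue the remainder."""
    prio, name, remaining = heapq.heappop(queue)
    chunk = min(mtu, remaining)
    remaining -= chunk
    print(f"sent {chunk}B of {name} (prio {prio}), {remaining}B left")
    if remaining > 0:
        heapq.heappush(queue, (prio, name, remaining))  # interrupted, resumes later

transmit()                                     # bulk transfer in progress
heapq.heappush(queue, (1, "voip-frame", 200))  # urgent packet arrives mid-transfer
transmit()                                     # preempts: voip-frame goes out first
transmit()                                     # bulk-data resumes where it left off
```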

Author 1: Ahmad AbdulQadir AlRababah

Keywords: Data transmission; data stream; input output buffers; telecommunication devices; data packets; blocks of memory; switching matrix; high priority packets; bitstaffing

PDF

Paper 31: Overview of Service and Deployment Models Offered by Cloud Computing, based on International Standard ISO/IEC 17788

Abstract: Cloud computing offers services over the Internet to support business processes; based on its deployment and service models, it meets business requirements efficiently and cost-effectively. A general picture of the types of service models it offers, as well as of its deployment models, is not available, so the following research questions are stated: Q1) How many studies refer to service models and deployment models in cloud computing? Q2) How are the service models classified in relation to the Application, Infrastructure, and Platform capability types in a cloud? Q3) What types of cloud computing deployment models currently exist? The objective of this paper is to investigate the service and deployment models that currently exist in cloud computing, for which a systematic literature review was used as the research methodology. The results show that 45 service models and 4 deployment models were found in cloud computing, which allows us to conclude that the models on offer provide many diverse solutions for business processes.

Author 1: Washington Garcia Quilachamin
Author 2: Igor Aguilar Alonso
Author 3: Jorge Herrera-Tapia

Keywords: Cloud computing service models; IT demand management; deployment models; applications; platform; infrastructure

PDF

Paper 32: Applying Machine Learning Techniques for Classifying Cyclin-Dependent Kinase Inhibitors

Abstract: The importance of protein kinases has made them a target for many drug design studies. They play an essential role in cell cycle development and many other biological processes, and are divided into different subfamilies according to the type and mode of their enzymatic activity. Computational studies targeting kinase inhibitor identification are widely considered for modelling kinase-inhibitor interactions, and this modelling is expected to help solve the selectivity problem arising from the high similarity between kinases and their binding profiles. In this study, we explore the ability of two machine-learning techniques to classify compounds as inhibitors or non-inhibitors of two members of the cyclin-dependent kinases, a subfamily of protein kinases. Random forest and genetic programming were used to classify CDK5 and CDK2 kinase inhibitors, based on calculated values of chemical descriptors. In addition, the response of the classifiers to adding prior information about compound promiscuity was investigated. The results from each classifier were analyzed by calculating different accuracy measures and metrics: confusion matrices, accuracy, ROC curves, AUC values, F1 scores, and Matthews correlation were obtained for the outputs. The analysis of these accuracy measures showed better performance for the RF classifier in most cases. The results also show that promiscuity information improves classification accuracy, with its significant effect notably clear for the GP classifiers.

Author 1: Ibrahim Z. Abdelbaky
Author 2: Ahmed F. Al-Sadek
Author 3: Amr A. Badr

Keywords: CDK inhibitors; random forest classification; genetic programming classification

PDF

Paper 33: Bound Model of Clustering and Classification (BMCC) for Proficient Performance Prediction of Didactical Outcomes of Students

Abstract: In this era of high-performance computing systems, large-scale data mining methodologies in the field of education have become a convenient way to discover and extract knowledge from the databases of educational archives. Typically, educational institutions around the world maintain student data repositories holding attributes of students such as name, gender, age group (date of birth), religion, eligibility details, and academic assessment details. With this knowledge, in this paper, didactical data mining (DDM) is used to predict student performance and analyse it proactively. As is known, classification and clustering are among the liveliest techniques for mining the required data; hence, a Bound Model of Clustering and Classification (BMCC) is proposed in this research for proficient educational data mining. Classification is one of the distinguished options in data mining for assigning an object to one of several pre-defined classes according to its attributes, and hence it is a supervised learning problem. On the other side, clustering is an unsupervised learning problem that involves grouping objects according to their similarities. This paper uses a dataset collected from Kerala Technological University-SNG College of Engineering (KTU_SNG) to perform the BMCC. An efficient J48 decision tree algorithm is used for classification, and the k-means algorithm is incorporated for clustering, optimised with Bootstrap Aggregation (Bagging). The implementation has been done and analysed with the data mining tool WEKA (Waikato Environment for Knowledge Analysis), and the results are compared with some of the most used classifiers: naïve Bayes (NB), a neural network (Multilayer Perceptron, MLP), and J48. The results show that the proposed model provides a high precision rate (PR), accuracy, and robustness with less computational time, even though the sample data set includes some missing values.

Author 1: Anoopkumar M
Author 2: A. M. J. Md. Zubair Rahman

Keywords: Classification; clustering; precision rate; accuracy; j48 decision tree; bagging; educational data mining

PDF

Paper 34: Brain Signal Classification using Genetic Algorithm for Right-Left Motion Pattern

Abstract: Brain signals, or EEG, are non-stationary signals that are difficult to analyze visually. The brain signal comprises five waves: alpha, beta, delta, gamma, and theta. Each of the five waves has its own frequency range, describing levels of attention, alertness, character, and external stimuli, and the five waves can be used to analyze stimulation patterns when turning left and right. A genetic algorithm is used to weight the five brain waves and combine them into one signal, since genetic algorithms can search for the best signal for classification. In this paper, the EEG signal is classified to determine right or left movement patterns. After the five brain waves are combined using the genetic algorithm, classification is performed using logistic regression, linear discriminant analysis, the K-neighbors classifier, a decision tree, Gaussian naïve Bayes, and a support vector machine. Of these six methods, the SVM achieved the highest accuracy, at 56%, performing better than the others on this problem.

Author 1: Cahya Rahmad
Author 2: Rudy Ariyanto
Author 3: Dika Rizky Yunianto

Keywords: Brain wave; EEG; genetic algorithm; classification; left right movement

PDF

Paper 35: Amharic based Knowledge-Based System for Diagnosis and Treatment of Chronic Kidney Disease using Machine Learning

Abstract: Chronic kidney disease is an important challenge for health systems around the world, consuming a huge proportion of healthcare finances. Around 85% of the world's population lives in developing countries, where chronic kidney disease prevention programs are undeveloped, and treatment options for chronic kidney disease are not readily available in most countries of sub-Saharan Africa, including Ethiopia. Many rural and urban communities in Ethiopia have extremely limited access to medical advice, as medical experts are not readily available. To address such problems, a medical knowledge-based system can play a significant role. Therefore, the aim of this research was to develop a self-learning knowledge-based system for the diagnosis and treatment of the first three stages of kidney disease, one that can update its knowledge without the involvement of a knowledge engineer. The following procedures were followed in its development: a knowledge engineering research design was used to develop the prototype system; purposive sampling strategies were utilized to choose specialists; the knowledge was acquired using both structured and unstructured interviews and represented using production rules; the represented rules were modeled using a decision tree approach; and the implementation was carried out using Prolog tools. Testing and evaluation were performed through test cases and user acceptance methods, and we extensively evaluated the prototype system through visual interactions and test cases. Finally, the results show that our approach performs better than current ones.

Author 1: Siraj Mohammed
Author 2: Tibebe Beshah

Keywords: Knowledge-based system; kidney diseases; machine learning; knowledge engineering; knowledge representation

PDF

Paper 36: Novel Method for Data Hiding using Steganography Discrete Wavelet Transformation and Cryptography Triple Data Encryption Standard: DES

Abstract: A novel method for data hiding using Discrete Wavelet Transformation (DWT) steganography and triple Data Encryption Standard (DES) cryptography is proposed. In the current era, information technology has become inseparable from human life, especially regarding the processing and dissemination of information. In line with advances in information technology, there are also parties who want to abuse such information by changing it or even damaging it. To prevent this, the data first needs to be hidden in other media using the DWT method, chosen because the image after data insertion closely resembles the original image. Triple DES is also required to encrypt the data and provide additional security, so that the hidden data will be difficult to recover; this method was chosen because it is resistant to brute-force, chosen-plaintext, and known-plaintext attacks. Based on the tests, the image insertion results are 100% immune to brightness and contrast manipulation, but not as resistant to cropping, resizing, and rotation. Other tests also indicate that the data in the picture can be extracted again without undergoing any changes.
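
A minimal sketch of DWT-based embedding, assuming PyWavelets (pywt): secret bits nudge the diagonal-detail coefficients of a Haar transform, leaving the image visually similar. The 3DES layer and the authors' exact embedding rule are omitted; the coefficient scheme here is only an assumption.

```python
import numpy as np
import pywt

cover = np.random.rand(8, 8) * 255          # stand-in cover image
bits = [1, 0, 1, 1]                         # secret payload (ideally encrypted first)

LL, (LH, HL, HH) = pywt.dwt2(cover, "haar") # one-level Haar DWT
flat = HH.flatten()
flat[: len(bits)] = [20.0 if b else -20.0 for b in bits]  # embed in diagonal band
stego = pywt.idwt2((LL, (LH, HL, flat.reshape(HH.shape))), "haar")

# Extraction: re-transform the stego image and read the coefficient signs back
_, (_, _, HH2) = pywt.dwt2(stego, "haar")
recovered = [1 if c > 0 else 0 for c in HH2.flatten()[: len(bits)]]
assert recovered == bits
```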

Author 1: Cahya Rahmad
Author 2: Kohei Arai
Author 3: Arief Prasetyo
Author 4: Novriza Arizki

Keywords: Data hiding; steganography; DWT; cryptography; 3DES

PDF

Paper 37: An Effective Lightweight Cryptographic Algorithm to Secure Resource-Constrained Devices

Abstract: In recent years, small computing devices like embedded devices, wireless sensors, RFID (Radio Frequency Identification) tags, and Internet of Things (IoT) devices have been increasing rapidly. They are expected to generate massive amounts of sensitive data for controlling and monitoring purposes, but their resources and capabilities are limited. They also work with valuable private data, making the security of these devices of paramount importance. Therefore, a secure encryption algorithm is needed to protect these vulnerable devices. Conventional ciphers like RSA or AES are computationally expensive and require large memory, hindering the performance of such devices. Simple encryption techniques, on the other hand, are easy to crack, compromising security. In this paper, a secure and efficient lightweight cryptographic algorithm for small computing devices is proposed. It is a symmetric-key block cipher employing a custom substitution-permutation (SP) network and a modified Feistel architecture. Two basic concepts from genetic algorithms are used. FELICS, a Linux-based benchmark tool, is used for measurement, and MATLAB is used for encryption-quality testing. As an improvement over the existing algorithm, the proposed algorithm reduces the number of processing cycles while providing sufficient security.

Author 1: Sohel Rana
Author 2: Saddam Hossain
Author 3: Hasan Imam Shoun
Author 4: Dr. Mohammod Abul Kashem

Keywords: Lightweight cryptography; IoT; RFID tags; genetic algorithm; feistel architecture; SP network; FELICS; MATLAB

PDF

Paper 38: A Hybrid Genetic Algorithm with Tabu Search for Optimization of the Traveling Thief Problem

Abstract: To date, several approaches such as evolutionary computing and heuristic methods have been presented to optimize the traveling thief problem (TTP). However, most of these approaches consider the TTP components independently, usually solving the traveling salesman problem (TSP) and then tackling the knapsack problem (KP), despite their interdependent nature. In this paper, we investigate the use of a hybrid genetic algorithm (GA) and tabu search (TS) for the TTP. A novel hybrid genetic approach called GATS is proposed and compared with state-of-the-art approaches. The key aspect of GATS is that TTP solutions are constructed while firmly taking into account the interdependent nature of the TTP subcomponents, with all of its operators simultaneously applied to TSP and KP solutions. A comprehensive set of TTP benchmark datasets was adopted to investigate the effectiveness of GATS. We selected 540 instances for our investigation, comprising five different groups of cities (51, 52, 76, 100 and 150 cities) and different groupings of items, from 50 to 745 items. All types of knapsack (uncorrelated, uncorrelated with similar weights, and bounded strongly correlated) with all different knapsack capacities were also taken into consideration. Different initialization methods were empirically investigated as well. The results of the computational experiments demonstrated that GATS is capable of surpassing the state-of-the-art results for various instances.

Author 1: Saad T Alharbi

Keywords: Combinatorial; hybrid approaches; genetic algorithm; optimization; tabu search; TTP

PDF

Paper 39: A Simple Approach for Representation of Gene Regulatory Networks (GRN)

Abstract: Gene expression is controlled by a series of processes known as gene regulation, and their abstract mapping is represented by a Gene Regulatory Network (GRN), a descriptive model of gene interactions. Reverse-engineering GRNs can reveal the complexity of gene interactions, whose comprehension can lead to several other insights. RNA-seq data provides better measurement of gene expression; however, it is difficult to infer GRNs from it because of its discreteness. Multiple methods have already been proposed to infer GRNs using RNA-seq data, but these methodologies are difficult to grasp. In this paper, a simple model is presented to infer GRNs using the RNA-seq-based co-expression map provided by the GeneFriends database, and a graph-database tool is used to create the regulatory network. The obtained results show that it is convenient to use graph-database tools to work with regulatory networks instead of developing a new model from scratch.
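
Since the keywords point to Neo4j, the hedged sketch below shows one plausible way to load a gene co-expression edge list into a graph database with the official neo4j Python driver; the node label, relationship type, and connection details are illustrative assumptions, not the authors' schema.

from neo4j import GraphDatabase

# Illustrative co-expression pairs (gene_a, gene_b, correlation).
edges = [("TP53", "MDM2", 0.83), ("BRCA1", "BARD1", 0.91)]

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))
with driver.session() as session:
    for a, b, w in edges:
        # MERGE keeps gene nodes unique; COEXPRESSED_WITH is an assumed type.
        session.run(
            "MERGE (x:Gene {name: $a}) "
            "MERGE (y:Gene {name: $b}) "
            "MERGE (x)-[r:COEXPRESSED_WITH]->(y) SET r.weight = $w",
            a=a, b=b, w=w,
        )
driver.close()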

Author 1: Raza ul Haq
Author 2: Javed Ferzund
Author 3: Shahid Hussain

Keywords: Graph theory; graph database; gene regulatory networks; RNA-seq; Genes Co-Expression; Neo4j

PDF

Paper 40: A Methodology for Identification of the Guilt Agent based on IP Binding with MAC using Bivariate Gaussian Model

Abstract: The enormous increase in data in the current world presents a major threat to organizations. Most organizations maintain some sort of data that is sensitive and must be protected against loss and leakage. In the IT field, large amounts of data are exchanged between multiple points at every moment. During this transfer of data from the organization to a third party, there are high probabilities of data loss, leakage or alteration. Email is the most widely used means of correspondence in the working environment, from web-based communication to account logins, and is thereby turning into a standard business application. Email can be abused to leave an organization's sensitive information open to compromise. Accordingly, it is of little surprise that attacks on email messages are common, and these issues need to be addressed. This paper focuses on the concept of data leakage, techniques to detect data leakage, and its prevention.

Author 1: B. Raja Koti
Author 2: Dr. G. V. S. Raj Kumar
Author 3: Dr. Y. Srinivas
Author 4: Dr. K. Naveen Kumar

Keywords: Data leakage; sensitive information; data leakage detection; bivariate normal distribution; probability density function

PDF

Paper 41: Short-Term Load Forecasting for Electrical Dispatcher of Baghdad City based on SVM-FA

Abstract: Improving load-forecasting accuracy is an important issue in the scientific optimization of power systems. The availability of accurate statistical data and a suitable scientific method are necessary for an accurate prediction of future occurrences. This research uses a regression forecasting model (Support Vector Machine, SVM) to predict electrical power load and temperature data for Baghdad city. The Firefly Algorithm (FA) was used to optimize the parameters of the SVM to improve its prediction accuracy. The mean absolute percentage error (MAPE) was used as a quantitative statistical measure to evaluate the performance of the optimization methods. The results show that the proposed method is more accurate than both the basic method and PSO-SVM.
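
Purely as a hedged illustration of firefly-tuned SVM regression, the sketch below tunes an SVR's C and gamma (in log-space) with the standard Yang firefly update, scoring candidates by MAPE on a validation split. The data, swarm size and coefficients are synthetic stand-ins, not the paper's setup.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)

# Toy load data (stand-in for the Baghdad load/temperature records).
X = rng.uniform(0, 40, (300, 1))                 # temperature
y = 500 + 8 * X[:, 0] + rng.normal(0, 10, 300)   # load
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

def mape_of(params):
    C, gamma = np.exp(params)   # search in log-space, keep values positive
    model = SVR(C=C, gamma=gamma).fit(X_tr, y_tr)
    return mean_absolute_percentage_error(y_va, model.predict(X_va))

# Minimal firefly algorithm over (log C, log gamma).
n, iters, beta0, gamma_f, alpha = 8, 15, 1.0, 1.0, 0.2
pos = rng.uniform(-3, 5, (n, 2))
cost = np.array([mape_of(p) for p in pos])
for _ in range(iters):
    for i in range(n):
        for j in range(n):
            if cost[j] < cost[i]:   # j is "brighter" (lower MAPE)
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                pos[i] += (beta0 * np.exp(-gamma_f * r2) * (pos[j] - pos[i])
                           + alpha * (rng.random(2) - 0.5))
                cost[i] = mape_of(pos[i])
best = pos[np.argmin(cost)]
print("best C, gamma:", np.exp(best), "MAPE:", cost.min())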

Author 1: Aqeel S. Jaber
Author 2: Kosay A. Satar
Author 3: Nadheer A. Shalash

Keywords: SVM; FA; Load forecasting; PSO

PDF

Paper 42: A Novel Student Risk Identification Model using Machine Learning Approach

Abstract: This research work aims at addressing the issue of detecting students who are at risk of failing to complete their course. The conceptual design presents a solution for efficient learning in the absence of data from previous courses, which is generally used for training state-of-the-art machine learning (ML) models. This scenario usually occurs when a university introduces new courses. To address it, we build a novel learning model trained on data constructed from the present course. The proposed model uses data about already-submitted tasks, which further induces the issue of imbalanced data for both training and testing the classification model. The contributions of the proposed model are: the design of a learning model for detecting at-risk students utilizing information from present courses; tackling the challenge of imbalanced data present in both training and testing data; defining the issue as a classification task; and, lastly, developing a novel non-linear support vector machine (NL-SVM) classification model. Experimental outcomes show that the proposed model attains significant results when compared with state-of-the-art models.
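
The abstract does not detail the NL-SVM, but a common baseline for a non-linear SVM on imbalanced data is an RBF-kernel SVC with class weighting; the hedged sketch below shows that baseline on synthetic data, not the authors' exact model.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Synthetic imbalanced data: roughly 10% "at-risk" students.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Non-linear (RBF) SVM; class_weight="balanced" counters the imbalance.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))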

Author 1: Nityashree Nadar
Author 2: Dr.R.Kamatchi

Keywords: Classification; imbalanced data; machine learning; virtual learning environment

PDF

Paper 43: Decision Support System for Agriculture Industry using Crowd Sourced Predictive Analytics

Abstract: It is very difficult to examine raw data manually. Data mining strategies are used to extract relevant information from raw data, and data mining algorithms are efficient at retrieving specific patterns. Among data mining techniques, decision trees are the most commonly used methods for predicting the outcome or behavior of a pattern because they can visualize the facts successfully and efficiently. Presently, several decision tree algorithms have been developed for predictive analysis. Here, we gathered a dataset for rubberized mattresses from the Coir Board CCRI, applied several decision tree algorithms to the dataset, and compared each one. Each algorithm produces a unique decision tree from the input data. This paper focuses in particular on the Fuzzy C4.5 algorithm and compares different decision tree algorithms for predictive analysis. Using predictive analytics, a decision can then be made for each rubberized-product firm.

Author 1: Remya S
Author 2: Dr.R.Sasikala

Keywords: Predictive analytics; coir fiber; fuzzy-C4.5; crowdsourcing

PDF

Paper 44: Multivariate Copula Modeling with Application in Software Project Management and Information Systems

Abstract: This paper discusses the application of copulas in software project management and information systems. Successful software projects depend on accurate estimation of the software development schedule. In this research, three major risk factors and their impact on the software development schedule are considered. The software development schedule is calculated with the COCOMO-II model. Two models are simulated 100,000 times: model I considers dependence among risk factors via a t-copula, while model II treats the risk factors as independent. The comparison of the two risk models revealed that model II always underestimates the software development schedule, while model I evaluates the schedule risk accurately. Therefore, it is necessary for software development experts to consider dependence among the various risk factors. The R package copula is employed to implement the algorithm for the multivariate t-copula. A multiplier goodness-of-fit test shows that the t-copula is a good choice for characterizing the dependence among the three risk factors.
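
The paper itself uses the R copula package; purely as an illustration of the same idea, the hedged Python sketch below draws correlated uniforms from a trivariate t-copula by sampling a multivariate t distribution and pushing each margin through the t CDF. The correlation matrix and degrees of freedom are made-up values, not the paper's fitted parameters.

import numpy as np
from scipy.stats import multivariate_t, t

df = 5  # degrees of freedom (illustrative)
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])

# Sample the multivariate t, then transform margins to uniforms:
# U = T_df(X) has a t-copula dependence structure.
mvt = multivariate_t(loc=np.zeros(3), shape=corr, df=df)
X = mvt.rvs(size=100_000, random_state=0)
U = t.cdf(X, df=df)   # each column ~ Uniform(0,1), dependence preserved

# U can now feed any marginal risk-factor distributions via their PPFs.
print(np.corrcoef(U, rowvar=False))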

Author 1: Syed Muhammad Aqil Burney
Author 2: Osama Ajaz
Author 3: Shamaila Burney

Keywords: T-Copula; COCOMO – II; software development schedule; risk analysis

PDF

Paper 45: The Fundamentals of Unimodal Palmprint Authentication based on a Biometric System: A Review

Abstract: A biometric system can be defined as an automated method of identifying or authenticating the identity of a living person based on physiological or behavioral traits. Palmprint-based biometric authentication has gained considerable attention in recent years. Globally, enterprises have been exploring biometric authorization for some time for purposes such as security, payment processing, law-enforcement CCTV systems, and even access to offices, buildings, and gyms via entry doors. Palmprint biometric systems can be divided into unimodal and multimodal systems. This paper investigates the biometric system and provides a detailed overview of palmprint technology with existing recognition approaches. Finally, we present a review of previous works on unimodal palmprint systems using different databases.

Author 1: Inass Shahadha Hussein
Author 2: Shamsul Bin Sahibuddin
Author 3: Nilam Nur Amir Sjarif

Keywords: Biometric system; palmprint; palmprint features; unimodal

PDF

Paper 46: Control of Grid Connected Three-Phase Inverter for Hybrid Renewable Systems using Sliding Mode Controller

Abstract: This paper presents a power control approach for a grid-connected three-phase inverter in a hybrid renewable energy system consisting of a wind generator, a flywheel energy storage system and a diesel generator. A sliding-mode controller is developed around the grid-connected inverter to control the injected currents, which in turn controls the active and reactive powers requested by the grid and/or isolated loads. In series with the controller, a Space Vector Pulse Width Modulation method is used to drive the six inverter switches to generate three-phase voltages and currents that transfer the desired powers requested by the AC side. Simulations of the hybrid renewable energy system under Matlab-Simulink are carried out to show the performance of the developed sliding-mode controller.

Author 1: Sami Younsi
Author 2: Nejib Hamrouni

Keywords: Grid connected systems; sliding controller; hybrid renewable systems; SVPWM

PDF

Paper 47: Security Issues in Cloud Computing and their Solutions: A Review

Abstract: Cloud computing is an emerging, internet-based technology that is becoming prevalent in our environment, especially in the fields of computer science and information technology, which require large-scale network computing. Cloud computing is a shared pool of services that is gaining popularity due to its cost effectiveness, availability and high productivity. Along with its numerous benefits, cloud computing brings challenging issues regarding data privacy, data protection and authenticated access. Due to these issues, the adoption of cloud computing is becoming difficult in today's era. In this research, various security issues regarding data privacy and reliability, the key factors affecting cloud computing, are addressed, and suggestions on particular areas are discussed.

Author 1: Sabiyyah Sabir

Keywords: Cloud computing; data protection; encryption; digital signature; security issues

PDF

Paper 48: Self Interference Cancellation in Co-Time-Co-Frequency Full Duplex Cellular Communication

Abstract: The performance of co-time co-frequency full duplex (CCFD) communication systems is limited by self-interference (SI), which results from using the same frequency for simultaneous transmission and reception, whereas current communication systems use separate frequencies for transmission and reception. SI is therefore an important issue to be fixed for future-generation systems, since the radio frequency (RF) spectrum is very scarce and a CCFD system has the potential to reduce current spectrum use by half. In this paper, a CCFD communication system is modeled, and a combination of RF and digital cancellation is used to mitigate the SI. The simulation results reveal that the proposed combination of RF and digital cancellation achieves a bit-error rate of 10^-11 at an interference-to-signal ratio of 10 dB, which is a satisfactory value for CCFD communication. The achieved efficiency of the proposed system is 13 bits/sec/Hz at a signal-to-noise ratio of 50 dB. An antenna separation of 35 dB is considered for the proposed model to keep data loss to a minimum. The performance can be improved further by increasing the digital-to-analog converter bits, but with added complexity.

Author 1: Sajjad Ali Memon
Author 2: Faisal Ahmed Dahri
Author 3: Farzana Rauf Abro
Author 4: Faisal Karim Shaikh

Keywords: Cellular; Co-Time Co-Frequency Full duplex (CCFD); Self-Interference Cancellation (SIC); Communication system

PDF

Paper 49: Semi Supervised Method for Detection of Ambiguous Word and Creation of Sense: Using WordNet

Abstract: Machine translation, information retrieval and knowledge acquisition are the three main applications of Word Sense Disambiguation (WSD). The sense of a target word can be identified from a dictionary using a 'bag of words', i.e. the neighbours of the target word. A target word keeps the same spelling but carries different meanings, e.g. chair, light, etc. In WSD, the key inputs are sentences and target words; however, instead of providing a target word, it should be detected automatically. If a sentence has more than one target word, the filtration process requires further processing. In this study, the proposed framework, consisting of buzz words and query words, has been developed to detect target words using the WordNet dictionary. Buzz words are defined as a 'bag of words' using POS tags, and query words are those words having multiple meanings. The proposed framework endeavors to find the sense of the detected target word using its gloss and examples containing buzz words. This is a semi-supervised approach because 266 words with multiple meanings have been labelled from various sources and used in an unsupervised manner to detect the target word and its sense (meaning). After experimenting on a dataset of 300 hotel reviews, 100% of the target words in each sentence were detected, with 84% matching the sense of each sentence or phrase.
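
As a hedged illustration of one step the abstract implies, the snippet below uses NLTK's WordNet interface to flag words with multiple senses as candidate target words and to print each sense's gloss and examples; treating "more than one synset" as the ambiguity test is our assumption.

# Requires: import nltk; nltk.download("wordnet")
from nltk.corpus import wordnet as wn

sentence = "the chair by the window caught the light".split()

for word in sentence:
    synsets = wn.synsets(word)
    if len(synsets) > 1:   # multiple senses -> candidate target word
        print(word, "has", len(synsets), "senses")
        for s in synsets[:2]:
            # Gloss and usage examples can be matched against buzz words.
            print(" ", s.name(), "-", s.definition(), s.examples())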

Author 1: Sheikh Muhammad Saqib
Author 2: Fazal Masud Kundi
Author 3: Asif Hassan Syed
Author 4: Shakeel Ahmad

Keywords: Word sense disambiguation; machine translation; information retrieval and knowledge acquisition; target word; WordNet; bag of words

PDF

Paper 50: Evaluation of Gated Recurrent Unit in Arabic Diacritization

Abstract: Recurrent neural networks are powerful tools that give excellent results in various tasks, including Natural Language Processing. In this paper, we use the Gated Recurrent Unit (GRU), a recurrent neural network implementing a simple gating mechanism, to improve the process of Arabic diacritization. The evaluation of the GRU for diacritization is performed in comparison with the state-of-the-art results obtained with Long Short-Term Memory (LSTM), a powerful RNN architecture that gives the best-known results in diacritization. The evaluation covers two performance aspects: error rate and training runtime.
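
A minimal PyTorch sketch of a GRU-based diacritizer, treating diacritization as per-character sequence labeling; the layer sizes, vocabulary sizes and bidirectionality are illustrative assumptions, not the paper's configuration.

import torch
import torch.nn as nn

class GRUDiacritizer(nn.Module):
    """Character-level GRU tagger: one diacritic class per input character."""
    def __init__(self, n_chars=60, n_diacritics=15, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(n_chars, emb)
        self.gru = nn.GRU(emb, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_diacritics)

    def forward(self, x):             # x: (batch, seq_len) of char ids
        h, _ = self.gru(self.embed(x))
        return self.out(h)            # (batch, seq_len, n_diacritics)

model = GRUDiacritizer()
x = torch.randint(0, 60, (8, 40))     # dummy batch of character ids
logits = model(x)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 15),
                             torch.randint(0, 15, (8 * 40,)))
loss.backward()                       # ready for an optimizer step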

Author 1: Rajae Moumen
Author 2: Raddouane Chiheb
Author 3: Rdouan Faizi
Author 4: Abdellatif El Afia

Keywords: Gated recurrent unit; long short-term memory; Arabic diacritization

PDF

Paper 51: Smile Detection Tool using OpenCV-Python to Measure Response in Human-Robot Interaction with Animal Robot PARO

Abstract: Human-robot interaction (HRI) is a field of study that defines the relationship between humans and robots. In robot-assisted mental healthcare, there is still a gap in methodology, especially in evaluating outcomes. In this study, PARO, a robot in the shape of a cute baby seal, is introduced as an adjunct therapy tool for six rehabilitation patients with post-stroke depression. Currently, therapy outcomes are measured using psychological tools; when a robot is introduced, a new measurement tool is needed to analyse the patients' responses. Thus, this study constructs a tool using OpenCV-Python to count smiles while each patient interacts with PARO. A smile is an indicator of positive emotion and of PARO helping to lift the patient's mood. The results were then compared with psychological evaluations, and both tools showed congruent results. The number of smiles increased when patients were holding PARO, and PARO helped all patients manage their psychological distress. This indicates that smile detection is an effective supporting tool for measuring response in human-robot interaction.
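
A common OpenCV-Python recipe for smile counting is sketched below using the Haar cascades that ship with OpenCV; the abstract does not state the authors' exact detector or parameters, so the cascades and thresholds here are assumptions.

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)       # webcam stand-in for the session video
smiles = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        # High minNeighbors because the smile cascade is noisy.
        if len(smile_cascade.detectMultiScale(roi, 1.7, 22)) > 0:
            smiles += 1
    if cv2.waitKey(1) == 27:    # Esc to stop
        break
cap.release()
print("smile frames:", smiles)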

Author 1: Winal Zikril Zulkifli
Author 2: Syamimi Shamsuddin
Author 3: Fairul Azni Jafar
Author 4: Rabiah Ahmad
Author 5: Azizah Abdul Manaf
Author 6: Alaa Abdulsalam Alarood
Author 7: Lim Thiam Hwee

Keywords: Human-robot interaction; OpenCV; PARO

PDF

Paper 52: Functionality Gaps in the Design of Learning Management Systems

Abstract: This research paper focuses on various gaps associated with Learning Management Systems (LMS) and their remedies. An LMS is a software application platform on which multiple tasks related to online tutoring are created. For organizations, it is crucial that the risks associated with any automated process are kept as low as possible; this also pertains to selecting the LMS platform used for educating professionals new to the organization. To this end, organizations should carry out due research before adopting any system as their primary LMS, even though such platforms provide many benefits to the organizations integrating them. Choosing a faulty LMS for training recruits can lead to a variety of issues later on. Thus, it becomes essential to select the best LMS platform available in the market, and the one that suits the organization's needs. The work proposed in this paper lists a number of problems that exist in any given LMS framework and attempts to address them according to the needs of the organization, so as to provide a feasible solution and deliver better guidance to recruits.

Author 1: Tallat Naz
Author 2: Momeen Khan

Keywords: Learning Management System (LMS); shortcomings of LMS; functional gaps in LMS; LMS design issues; remedies for gaps in design of LMS

PDF

Paper 53: Blockchain Traffic Offence Demerit Points Smart Contracts: Proof of Work

Abstract: In Malaysia, a new regulation on traffic offence demerit points has been under debate. Therefore, a blockchain model is formulated to address this issue, serving as a Proof of Work (PoW) for a blockchain system. The model contains an application layer and a blockchain layer with smart contracts inside. The smart contracts act as conditional filters that follow the regulation's rules. They comprise three contracts, from declaring each offence's demerit points and fines to applying the penalties incurred once a certain number of demerit points is accumulated, including revocation of the driver's license. The contracts are executed automatically when their conditions are fulfilled. A transaction schema is also designed to match the schema of a traffic offence system. The model is deployed in an online environment with two servers synced to each other to demonstrate the decentralized characteristic of blockchain. It is developed using NodeJS while preserving the JSON format for transactions between server and client. A user interface is also provided as a simulation medium, where a traffic officer can input offences and send them to the blockchain server, while public users or the drivers themselves can check the status of a driver's license recorded on the blockchain. Government officers can monitor the records through the provided analytics dashboard, which contains graphs and charts based on the records. This interface is used for evaluation, which produces satisfying results: the evaluation shows that the smart contracts are executed properly in accordance with the real regulations.

Author 1: Aditya Pradana
Author 2: Goh Ong Sing
Author 3: Yogan Jaya Kumar
Author 4: Ali A. Mohammed

Keywords: Blockchain; proof of work; smart contract; demerit points; decentralized system; distributed ledger

PDF

Paper 54: Features Optimization for ECG Signals Classification

Abstract: A new method is used in this work to classify ECG beats, based on applying an optimization algorithm to select the features of each beat before classification. For each beat, twenty-four higher-order statistical features and three timing-interval features are obtained. Five beat classes are used for classification in this work: atrial premature contractions (APC), normal (NOR), premature ventricular contractions (PVC), left bundle branch block (LBBB) and right bundle branch block (RBBB). The cuttlefish algorithm (CFA), a new bio-inspired optimization algorithm, is used for feature selection. Four classifiers are used within the CFA: a Scaled Conjugate Gradient Artificial Neural Network (SCG-ANN), K-Nearest Neighbors (KNN), Interactive Dichotomizer 3 (ID3) and a Support Vector Machine (SVM). The final results show an accuracy of 97.96% for ANN, 95.71% for KNN, 94.69% for ID3 and 93.06% for SVM. These results were tested on fourteen signal records from the MIT-BIH database, from which 1400 beats were extracted.
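
To make the feature pipeline concrete, the hedged sketch below computes a few higher-order statistical features (skewness, kurtosis) plus one timing feature per beat and feeds them to a KNN classifier on dummy data; the paper's full 27-feature set and cuttlefish selection step are not reproduced here.

import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def beat_features(beat, rr_interval):
    # A small subset of higher-order statistics plus one timing feature.
    return [skew(beat), kurtosis(beat), np.var(beat), rr_interval]

# Dummy beats standing in for MIT-BIH segments (two classes).
beats = rng.normal(0, 1, (200, 180))
rr = rng.uniform(0.6, 1.2, 200)
labels = rng.integers(0, 2, 200)

X = np.array([beat_features(b, r) for b, r in zip(beats, rr)])
clf = KNeighborsClassifier(n_neighbors=5).fit(X[:150], labels[:150])
print("toy accuracy:", clf.score(X[150:], labels[150:]))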

Author 1: Alan S. Said Ahmad
Author 2: Majd Salah Matti
Author 3: Adel Sabry Essa
Author 4: Omar A.M. Alhabib
Author 5: Sabri Shaikhow

Keywords: Features optimization; cuttlefish; ECG; ANN-SCG; ID3; KNN; SVM

PDF

Paper 55: Handling Class Imbalance in Credit Card Fraud using Resampling Methods

Abstract: Credit-card-based online payments have grown intensely, compelling financial organisations to implement and continuously improve their fraud detection systems. However, credit card fraud datasets are heavily imbalanced, and different types of misclassification errors may have different costs, so it is essential to control them to a certain degree. Classification techniques are promising solutions for detecting fraudulent and non-fraudulent transactions. Unfortunately, classification techniques do not perform well when there are huge differences between the numbers of minority and majority cases. Hence, in this study, the resampling methods Random Under-Sampling, Random Over-Sampling and the Synthetic Minority Oversampling Technique were applied to the credit card dataset to overcome the rarity of fraud events. The three resampled datasets were then classified using classification techniques, and performance was measured by sensitivity, specificity, accuracy, precision, area under the curve (AUC) and error rate. The findings show that, by resampling the dataset, the models were more practicable, gave better performance and were statistically better.
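
The three resampling strategies named in the abstract map directly onto the imbalanced-learn API; the hedged sketch below applies them to a synthetic imbalanced dataset as a stand-in for the credit card data.

from collections import Counter
from sklearn.datasets import make_classification
from imblearn.under_sampling import RandomUnderSampler
from imblearn.over_sampling import RandomOverSampler, SMOTE

# Synthetic data with roughly 1% minority (fraud-like) cases.
X, y = make_classification(n_samples=10_000, weights=[0.99, 0.01],
                           random_state=0)
print("original:", Counter(y))

for sampler in (RandomUnderSampler(random_state=0),
                RandomOverSampler(random_state=0),
                SMOTE(random_state=0)):
    X_rs, y_rs = sampler.fit_resample(X, y)
    print(type(sampler).__name__, Counter(y_rs))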

Author 1: Nur Farhana Hordri
Author 2: Siti Sophiayati Yuhaniz
Author 3: Nurulhuda Firdaus Mohd Azmi
Author 4: Siti Mariyam Shamsuddin

Keywords: Credit card; imbalanced dataset; misclassification error; resampling methods; random undersampling; random oversampling; synthetic minority oversampling technique

PDF

Paper 56: Exploring Saudi Citizens' Acceptance of Mobile Government Service

Abstract: Mobile government is considered an emerging technology that has been used in Saudi Arabia to enhance communication between the government and its citizens; it can also be considered a mechanism through which the government can effectively respond to citizens' needs and expectations. The current study seeks to propose and validate an M-government adoption model to fully understand the varied variables affecting adoption behavior. This model is based on the Technology Acceptance Model (TAM) and the DeLone and McLean Information Systems Success Model. The researchers use a descriptive survey approach with structured questionnaires to investigate the extent of M-government acceptance by Saudi users. Structural equation modeling is used as the method of statistical data analysis.

Author 1: Adnan Mustafa AlBar
Author 2: Mashael A. Hddas

Keywords: M-government; adoption; acceptance; citizen

PDF

Paper 57: The Role of User Involvement in the Success of Project Scope Management

Abstract: Greater emphasis is now being placed on user involvement as a factor imperative to success in project scope management. Although project scope management processes tend to centre on various factors pertaining to collecting requirements, defining scope and verifying scope, controlling scope is viewed as fundamental to the management process as a whole. Furthermore, success in project scope management in the modern-day competitive business setting is recognised as resting on efficient and effective processes applied across project scope management, and one essential factor in achieving success in this arena is user involvement. In this regard, the point is presented that project scope management and user involvement may be implemented in such a way as to enhance successful project scope management. A questionnaire-centred survey relating project scope management processes and user involvement to successful project scope management, encompassing 295 management- and strategy-level employees, was applied in order to establish the links, both indirect and direct, between particular elements influencing four different IT departments at the governmental level. The data gathered underwent analysis using SmartPLS (Smart Partial Least Squares). This work provides a valuable contribution for professionals in the field, both researchers and practitioners, and further highlights the different ways in which project managers can arrange and modify project scope management processes in pursuit of their efforts to enhance successful project scope management through the mediation of user involvement.

Author 1: Maha Alkhaffaf

Keywords: Gathering requirements; defining scope; verifying scope; controlling scope; user involvement

PDF

Paper 58: Experimental Evaluation of Security Requirements Engineering Benefits

Abstract: Security Requirements Engineering (SRE) approaches are designed to improve information system security by addressing security requirements at the beginning of the software development lifecycle. This paper is a quantitative evaluation of the benefits of applying such an SRE approach. The methodology followed was to develop two versions of the same web application, with and without SRE, then compare the level of security in each version by running different testing tools. The results clearly support the benefits of the early use of SRE, with a 38% security improvement in the secure version of the application. This security benefit reaches 67% for high-severity vulnerabilities, leaving only non-critical and easy-to-fix vulnerabilities.

Author 1: Jaouad Boutahar
Author 2: Ilham Maskani
Author 3: Souhaïl El Ghazi El Houssaïni

Keywords: Software security; security requirements engineering; security evaluation; security testing

PDF

Paper 59: Priority-Aware Virtual Machine Selection Algorithm in Dynamic Consolidation

Abstract: In the past few years, many researchers have attempted to tackle the problem of decreasing energy consumption in cloud data centers. One of the widely adopted techniques for this purpose is dynamic Virtual Machine (VM) consolidation. Consolidation moves VMs between hosts to decrease energy consumption; however, it has a negative impact on performance, leading to Service Level Agreement (SLA) violations. Accordingly, selecting which VM to migrate from one host to another is a challenging task, since it can affect performance. Researchers have come up with several solutions and policies for efficient VM selection. In this paper, we exploit the fact that many tasks and users may tolerate some performance degradation, meaning the tasks running on the VMs can have different priorities. Accordingly, we propose augmenting consolidation with the concept of priority, where low-priority tasks are always selected first for migration. Towards this goal, we modified the popular Minimum Migration Time VM selection algorithm using the priority concept. The efficiency of the proposed algorithm is confirmed through extensive simulations using the CloudSim toolkit and a real workload. The results show that priority awareness has a positive impact on decreasing energy consumption as well as maximizing SLA compliance.
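
A minimal sketch of the selection rule the abstract describes: among VMs on an overloaded host, prefer low-priority VMs and, within the same priority, the one with the minimum migration time (RAM to copy divided by available bandwidth). The VM record fields are assumptions, not the CloudSim types.

from dataclasses import dataclass

@dataclass
class VM:
    name: str
    ram_mb: int      # memory to copy during migration
    bw_mbps: int     # available network bandwidth
    priority: int    # 0 = low (migrate first), 1 = high

    @property
    def migration_time(self):
        return self.ram_mb / self.bw_mbps

def select_vm(vms):
    # Low priority first; ties broken by Minimum Migration Time (MMT).
    return min(vms, key=lambda v: (v.priority, v.migration_time))

vms = [VM("web", 2048, 100, 1), VM("batch", 4096, 100, 0),
       VM("cache", 1024, 100, 0)]
print(select_vm(vms).name)   # -> "cache"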

Author 1: Hanan A. Nadeem
Author 2: Hanan Elazhary
Author 3: Mai A. Fadel

Keywords: Cloud computing; energy efficiency; service level agreement; VM consolidation; VM selection

PDF

Paper 60: Three-Phase Approach for Developing Suitable Business Models for Exchanging Federated ERP Components as Web Services

Abstract: The importance of business models has increased significantly in the last decade, especially on the Internet. The cause of this increase is the effect of the Internet and its associated applications and business processes on the business model. These effects include, for example, the emerging technical and economic aspects of a business model on the Internet, the support and transformation of traditional business models, and the rise of new business ideas based on that technology. One of these new ideas is how distributed Enterprise Resource Planning (ERP) systems, or federated ERP (FERP) systems offered as web services (WSs), can cover the increasing demands of small and medium-sized enterprises (SMEs) for business software. This paper provides a derived development approach with three phases, leading to three suitable concepts that identify a suitable business model for the FERP system with different value-exchange scenarios. The results of this work are conceptual models that describe the characters, roles and revenue models that define the FERP exchange business model.

Author 1: Evan Asfoura
Author 2: Mohammad Samir Abdel-Haq
Author 3: Houcine Chatti

Keywords: FERP system; ERP web services; FERP mall; ERP workflow; developing approach

PDF

Paper 61: Conceptual Modeling of Inventory Management Processes as a Thinging Machine

Abstract: A control model is typically classified into one of three forms: conceptual, mathematical or simulation (computer). This paper analyzes a conceptual modeling application with respect to an inventory management system. Today, most organizations utilize computer systems for inventory control that provide protection when interruptions or breakdowns occur within work processes. Modeling inventory processes is an active area of research that utilizes many diagrammatic techniques, including data flow diagrams, Unified Modeling Language (UML) diagrams and Integration DEFinition (IDEF). We claim that current conceptual modeling frameworks lack uniform notions and fail to appeal to designers and analysts. We propose modeling an inventory system as an abstract machine, called a Thinging Machine (TM), with five operations: creation, processing, receiving, releasing and transferring. The paper provides side-by-side contrasts of some existing examples of conceptual modeling methodologies with TM. Additionally, TM is applied in a case study of an actual inventory system that uses IBM Maximo. The resulting conceptual depictions point to the viability of TM as a valuable tool for developing a high-level representation of inventory processes.

Author 1: Sabah Al-Fedaghi
Author 2: Nourah Al-Huwais

Keywords: Conceptual model; diagrammatic representation; inventory control; inventory management; workflow; thinging

PDF

Paper 62: Dynamic Tuning and Overload Management of Thread Pool System

Abstract: Distributed applications have been developed using thread pool systems (TPSs) in order to improve system performance. Dynamic optimization and overload management of the TPS are two crucial factors that affect the overall performance of a distributed thread pool (DTP). This paper presents a DTP based on a central management system, where a central manager forwards clients' requests in round-robin fashion to an available set of TPSs running on servers. The dynamic tuning of each TPS is done based on the request rate at that TPS. An overload condition at a TPS is detected by the TPS itself through a decline in throughput, and is resolved by reducing the size of the thread pool to its previous value, at which it was producing throughput matching the request rate. By reducing the size of the thread pool at high request rates, context-switch and thread-contention overheads are eliminated, enabling system resources to be utilized effectively by the threads available in the pool. The evaluation results prove the validity of the proposed system.
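
A hedged sketch of the shrink-on-throughput-decline idea: a toy pool grows with demand but reverts to its previous size when measured throughput drops after a growth step. The control loop and thresholds are illustrative, not the paper's algorithm.

import queue, threading, time

class TunableThreadPool:
    def __init__(self, size):
        self.tasks = queue.Queue()
        self.done = 0             # completed jobs (approximate counter)
        self.workers = []
        self.resize(size)

    def _work(self, stop):
        while not stop.is_set():
            try:
                job = self.tasks.get(timeout=0.1)
            except queue.Empty:
                continue
            job()
            self.done += 1

    def resize(self, n):
        # Grow or shrink the pool to n worker threads.
        while len(self.workers) < n:
            stop = threading.Event()
            t = threading.Thread(target=self._work, args=(stop,), daemon=True)
            t.start()
            self.workers.append((t, stop))
        while len(self.workers) > n:
            _, stop = self.workers.pop()
            stop.set()

pool = TunableThreadPool(4)
prev_size, prev_tput = 4, 0.0
for _ in range(5):
    for _ in range(200):
        pool.tasks.put(lambda: time.sleep(0.001))
    start_done, t0 = pool.done, time.time()
    time.sleep(1.0)
    tput = (pool.done - start_done) / (time.time() - t0)
    if tput < prev_tput:            # throughput decline detected
        pool.resize(prev_size)      # revert to the previous size
    else:
        prev_size = len(pool.workers)
        pool.resize(prev_size + 2)  # grow with demand
    prev_tput = tput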

Author 1: Faisal Bahadur
Author 2: Arif Iqbal Umar
Author 3: Fahad Khurshid

Keywords: Distributed system; distributed thread pool; thread pool system; performance; overload management

PDF

Paper 63: Development of Interactive Ophthalmology Hologram

Abstract: Ophthalmology is the branch of medicine that deals with the eye; the field is associated with the anatomy, physiology and diseases of the eye. The main objective of this paper is to develop a novel interactive three-dimensional (3D) simulation of eye anatomy using a holography-based approach, in order to create better visualization of the structures of the eye. Currently, the public can access medical information through conventional methods such as brochures, pamphlets and booklets, in addition to better ways such as 2D and 3D video. However, these methods do not offer interactive visualization of medical information that helps create engaging presentations for users, and medical doctors are unable to show and explain the occurrence of diseases in detail through them. An interactive method is therefore required to assist doctors in conveying disease information effectively. Since the human eye is the most complex organ in our body, an advanced technology, i.e., the hologram, should be used to visualize the visual system in an effective and interactive way and to support effective explanations of eye diseases. A hologram is three-dimensional and offers interactivity, allowing doctors to examine vital organs using 3D displays. It is envisaged that the proposed interactive holography technique for clinical purposes will greatly contribute to the management of eye diseases and other diseases as well, and it is hoped that the developed technique will assist clinicians in delivering disease information efficiently and attractively.

Author 1: Sarni Suhaila Rahim
Author 2: Nazreen Abdullasim
Author 3: Wan Sazli Nasaruddin Saifudin
Author 4: Raja Norliza Raja Omar

Keywords: 3D animation; hologram; ophthalmology; interactive; eye

PDF

Paper 64: Efficient Page Collection Scheme for QLC NAND Flash Memory using Cache

Abstract: Recently, semiconductor companies such as Samsung, Hynix, and Micron have focused on quad-level cell (QLC) NAND flash memory chips because of the increase in the capacity of storage systems. A QLC NAND flash memory chip stores 4 bits per cell, and a page consists of 16 sectors, which is two to four times larger than that of conventional triple-level cell NAND flash memory. Because of this large page size, when QLC NAND flash memory is applied directly to a current storage system, page space is not used efficiently, resulting in low space utilization across the storage system. To solve this problem, an efficient page collection scheme using a cache for QLC NAND flash memory (PCS) is proposed. The main role of PCS is to manage the data transmitted from the file system efficiently (according to the data pattern and size) and to reduce the number of unnecessary write operations. The efficiency of PCS was evaluated using SNIA IOTTA NEXUS5 trace-driven simulation on QLC NAND flash memory. According to close observation, PCS reduces write operations by 50% compared with previous page collection algorithms by efficiently collecting small data into a page. Furthermore, a cache idle-time determination algorithm is proposed to further increase the space utilization of each page, thereby reducing the overall number of write operations on the QLC flash memory.
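
A hedged sketch of the core idea: small sector writes are collected in a RAM cache and flushed to flash only when a full 16-sector page has accumulated, cutting the number of physical page programs. The sector representation and flush policy are assumptions, not the paper's PCS design.

SECTORS_PER_PAGE = 16          # QLC page = 16 sectors (per the abstract)

class PageCollector:
    def __init__(self):
        self.buffer = []       # cached sectors awaiting a full page
        self.page_writes = 0   # physical page programs issued

    def write_sector(self, sector):
        self.buffer.append(sector)
        if len(self.buffer) == SECTORS_PER_PAGE:
            self.flush()

    def flush(self):
        if self.buffer:        # program one full (or padded) page
            self.page_writes += 1
            self.buffer.clear()

pcs = PageCollector()
for i in range(100):           # 100 small writes -> ceil(100/16) = 7 pages
    pcs.write_sector(f"sector-{i}")
pcs.flush()                    # e.g. triggered on cache idle-time expiry
print(pcs.page_writes)         # -> 7, instead of 100 page programs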

Author 1: Seok-Bin Seo
Author 2: Wanil Kim
Author 3: Se Jin Kwon

Keywords: Solid state drive; storage systems; cache; flash translation layer

PDF

Paper 65: Edge Detection on DICOM Image using Triangular Norms in Type-2 Fuzzy

Abstract: In image processing, edge detection is an important task. Fuzzy logic plays a vital role in image processing when dealing with images that are lacking in quality or imprecise in nature. This study contributes an authentic method of fuzzy edge detection through image segmentation. The gradient of the image is computed using triangular norms to extract the information. Triangular norms (t-norms) and triangular conorms (t-conorms) are specialized in dealing with uncertainty; therefore, triangular norms with minimum and maximum operators are chosen for the morphological operations. In addition, mathematical properties of aggregation operators representing the role of morphological operations are derived using the Triangular Interval Type-2 Fuzzy Yager Weighted Geometric (TIT2FYWG) and Triangular Interval Type-2 Fuzzy Yager Weighted Arithmetic (TIT2FYWA) operators. These properties represent the components of image processing. Here, edge detection is performed on a DICOM image by converting it into a 2D gray-scale image using type-2 fuzzy logic in MATLAB, which is the novelty of this work.
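
To illustrate the min/max morphological step (the simplest t-norm/t-conorm pair), the hedged Python sketch below computes a morphological-gradient edge map; the paper's type-2 fuzzy Yager operators are not reproduced, and a random array stands in for the DICOM slice (which would normally come from pydicom).

import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

# img would normally come from pydicom: pydicom.dcmread(path).pixel_array
img = np.random.randint(0, 4096, (64, 64)).astype(float)   # stand-in slice
img = (img - img.min()) / (img.max() - img.min())           # normalize to [0,1]

# Dilation uses max (a t-conorm), erosion uses min (a t-norm);
# their difference is the classic morphological gradient edge map.
edges = grey_dilation(img, size=(3, 3)) - grey_erosion(img, size=(3, 3))
print(edges.shape, edges.max())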

Author 1: D. Nagarajan
Author 2: M.Lathamaheswari
Author 3: R.Sujatha
Author 4: J.Kavikumar

Keywords: Aggregation operators; T norm; T conorm; triangular interval type-2 fuzzy number (TIT2FN); fuzzy morphology; gray scale Image; medical image processing

PDF

Paper 66: Content Analysis of Privacy Management Features in Geosocial Networking Application

Abstract: Geosocial networking applications allow users to share information and communicate with other people within a virtual neighborhood or community. Although most geosocial networking applications include privacy management features, one of the challenges is improving the design of those features. To overcome this challenge, the adaptation of privacy-related theories offers a concrete way to comprehend and analyze how privacy management features are used, yielding tangible research results that help users and system developers understand privacy management. This paper attempts to propose standardized privacy management features for geosocial networking applications from a market perspective that could be utilized by researchers and application developers to demonstrate or measure privacy management features. The objective of this paper is twofold: first, to map the theoretical constructs of Communication Privacy Management (CPM) theory onto privacy management features in geosocial networking applications; second, to evaluate the reliability of the proposed features using content analysis. Content analysis was performed on 1326 geosocial networking apps in the market (Google Play Store and App Store) to determine the reliability of the proposed privacy management features through inter-coder reliability analysis. The primary findings show that many of the privacy management features with low reliability belong to the Boundary Turbulence construct. Furthermore, only 6 of the 13 proposed features were deemed reliable, namely: specific grouping, visibility setting, privacy policy, violation, imprecision and inaccuracy. The proposed privacy management features may aid researchers and system developers in focusing on the best privacy management features for improving geosocial networking application design.

Author 1: Syarulnaziah Anawar
Author 2: Yeoh Wai Hong
Author 3: Erman Hamid
Author 4: Zakiah Ayop

Keywords: Privacy management; communication privacy management theory; social network; geosocial network; content analysis

PDF

Paper 67: Differentiation of Brain Waves from the Movement of the Upper and Lower Extremities of the Human Body

Abstract: Currently, the study of brain waves has yielded a type of alternative communication, in addition to the different applications that can be built from the brain waves obtained from each individual. OpenBCI is an open-source platform for electroencephalography (EEG) whose Cyton Board device is capable of collecting brain waves and sending them to a computer to be processed. In this research work, a computer-machine interface is presented that collects individuals' brain waves and processes them in order to identify the differences, in the brain signals, between the imagined movement of the left and right arms and of the left and right legs. These brain wave differences can then be used in applications focused on people with physical disabilities.

Author 1: Brian Meneses-Claudio
Author 2: Witman Alvarado-Diaz
Author 3: Avid Roman-Gonzalez

Keywords: OpenBCI; cyton board; extremity movement; brain wave differentiation

PDF

Paper 68: Conceptual Model for Measuring Transparency of Inter-Organizational Information Systems in Supply Chain: Case Study of Cosmetic Industry

Abstract: The role of information systems in organizational performance has changed substantially, and today information systems create value for organizations. This study aims to provide a conceptual model for measuring the transparency of inter-organizational information systems in the supply chain. The statistical population of this research includes all managers and staff of cosmetics companies in Tehran, of which there are 500 engaged in different sectors. A sample of about 218 people, determined using Cochran's formula, was surveyed. A conceptual model for measuring the transparency of inter-organizational information systems in the supply chain was developed based on a review of the theoretical concepts. A researcher-made questionnaire was used to measure the variables of the research model. The validity of the research tool was confirmed by experts, and its reliability was reported as 0.85 by Cronbach's alpha. According to the t-statistics, transparency of resources, inter-organizational trust, and environmental assurance are positive and significant in measuring the transparency of inter-organizational information systems at the 0.01 level, and they are above average.

Author 1: Maryam Toofani
Author 2: Alireza Hassanzadeh
Author 3: Ali Rajabzadeh Ghatari

Keywords: Transparency of information systems; supply chain; cosmetic industry; measuring transparency; inter-organizational

PDF

Paper 69: Biological Feedback Controller Design for Handwriting Model

Abstract: This paper deals with a feedback controller of PD (proportional-derivative) type applied to the process of handwriting. The model considered in this study describes the behavior of the hand-pen system in response to the forearm muscle forces applied to produce handwriting. The approach uses memory recall of the error signal between model outputs and experimental data to reach a desired trajectory position, fast dynamics and a stable model response. The control technique is applied in order to extend the handwriting model's response to a larger database of graphic traces. The obtained results illustrate the reliability of closed-loop control for commanding the handwriting system and ensuring its robustness against unknown inputs, such as muscle forces that can vary from one individual to another and increase model complexity.

Author 1: Mariem BADRI
Author 2: Ines CHIHI
Author 3: Afef ABDELKRIM

Keywords: Handwriting system; biological system; feedback control; PD controller; muscles forces signals

PDF

Paper 70: Relationship of Liver Enzymes with Viral Load of Hepatitis C in HCV Infected Patients by Data Analytics

Abstract: Retracted: After careful and considered review of the content of this paper by a duly constituted expert committee, this paper has been found to be in violation of IJACSA's Publication Principles. We hereby retract the content of this paper. Reasonable effort should be made to remove all past references to this paper.

Author 1: Fahad Ahmad
Author 2: Kashaf Junaid
Author 3: Ata ul Mustafa

Keywords: Hepatic; hepatitis virus; liver markers; UCINET analysis

PDF

Paper 71: A Novel Method for Secured Transaction of Images and Text on Cloud

Abstract: Implementing privacy preservation for data in cloud storage is tedious and complex. The cloud is a third-party, on-demand service that holds data for a specific period, and cloud storage providers give no assurance about the security of that data, so it is necessary to secure it. Cryptographic algorithms are required to provide security to data on the cloud. The aim of this research is to develop a method that combines an artificial neural network with the Threefish algorithm for the secure transaction of images. Images are large in size and more sensitive than normal text. The proposed method secures images at a low computation cost compared to existing methods. The research is implemented on private and public clouds. The generated results prove that the proposed approach is more efficient in terms of compression ratio, mean square error, normalized absolute error, time, and space efficiency.

Author 1: John Jeya Singh. T
Author 2: Dr E.Baburaj

Keywords: Threefish; neural network; security; multimedia; cloud storage

PDF

Paper 72: The Role of Information Technology on the Teaching Process in Education: An Analytical Prospective Study at the University of Sulaimani

Abstract: Nowadays, Information Technology (IT) is engaged in all spheres of life. It plays an important role in developing and improving processes in all types of organizations, especially the teaching process in institutions and universities. The purpose of this paper is to present the impact of information technology on learning progress and teaching improvement at the University of Sulaimani, from the perspectives of both lecturers and students, and to determine the common key factors of the information technology framework on which the teaching process relies. The researchers created an online questionnaire survey and took a sample of academic staff and students in different colleges and departments of the University of Sulaimani in 2017, yielding 320 completed questionnaires. The paper shows that information technology has become a basic, indispensable need in the teaching process in universities and institutions in this era, and also emphasizes that different levels of understanding of information technology serve different learning and teaching processes.

Author 1: Mohammad Esmail Ahmad
Author 2: Ameer Sardar K. Rashid
Author 3: Amanj Anwar Abdullah
Author 4: Raza M. Abdulla

Keywords: Information and communication technology; education; evaluation of information technology

PDF

Paper 73: Three Levels Quality Analysis Tool for Object Oriented Programming

Abstract: As software engineering methods have evolved toward techniques for complex software development, new concepts have emerged in software languages, which are used to develop software quality models. In this research, the Multi-Level Quality Analysis tool (MLQA) is proposed as a computer-aided software engineering tool that classifies software complexity into three levels of analysis: package analysis, class analysis, and method-level analysis. MLQA supports visual analysis of the software contents with color alerts and a recommendation system, which can give a quick view of the software development and its complexity. The methodology of this work is a newly proposed software quality model based on standard object-oriented programming complexity metrics as well as threshold limits. In addition, a new quality attribute, namely the clean code attribute, is proposed and integrated with the proposed software quality model in a way that enables users of the model to rely on this attribute and reduces dependence on software experience, which is expensive and at times rare.

Author 1: Mustafa Ghanem Saeed
Author 2: Maher Talal Alasaady
Author 3: Fahad Layth Malallah
Author 4: Kamaran HamaAli Faraj

Keywords: Software quality models; software measurements; clean code; source code complexity metrics

PDF

Paper 74: Using an Integrated Framework for Conceptual Modeling

Abstract: The Integrated Framework for Conceptual Modeling (IFCMod) was created to contribute to the quality of information systems through the integration of functional and non-functional requirements. This paper explores the outcomes of using IFCMod in mixed-method case studies at a higher education institution and a central bank. The case study at the South East European University (SEEU) covered the analysis and design of an improvement to the e-Schedule system, while the case study at the Central Bank of the Republic of Kosovo (CBK) covered the analysis and design of the Data Collection System for Enterprise Surveys (DCSES). Based on the institutional perspective of community participation during the semi-structured interviews at the end of the Joint Approval Requirements (JAR) meetings, the outcomes in both cases showed that using IFCMod increases the quality of the information system by increasing the quality of the system requirements.

Author 1: Lindita Nebiu Hyseni
Author 2: Zamir Dika

Keywords: Integrated framework for conceptual modeling (IFCMod); joint approval requirements (JAR); system requirements; information system; mixed method case studies

PDF

Paper 75: Development of Mobile Health Application for Cardiovascular Disease Prevention

Abstract: Cardiovascular diseases (CVDs) are a major cause of death in the world, as well as in Indonesia. In spite of that fact, CVDs can be prevented with healthy behavior and lifestyle, such as regular health check-ups, healthy eating and drinking habits, stress management, sleep management, and regular physical activity. In this paper, we develop a mobile health application as a tool to record daily behavior and lifestyle. Mobile health was chosen because mobile devices are nowadays the most popular communication devices; thus, we believe that mobile health (mHealth) is a promising tool to promote a healthy lifestyle and behavior. The method used for developing the application is human-centered design (HCD). The application was evaluated iteratively from the first prototype (low-fidelity) to the final prototype (high-fidelity). Feedback collected with the User Experience Questionnaire (UEQ) shows that the application scores above average on all components, i.e., attractiveness, clarity, efficiency, accuracy and dependability, stimulation, and novelty. The best score is for stimulation (excellent), while the worst is for accuracy and dependability (above average). This shows that mHealth is a potential tool to stimulate users toward a healthy lifestyle; however, further validation by health experts is still required to ensure the accuracy of the application's results.

Author 1: Vitri Tundjungsari
Author 2: Abdul Salam M Sofro
Author 3: Heri Yugaswara
Author 4: Adhika Trisna Dwi Putra

Keywords: Mobile health; cardiovascular; disease; human-centered design; standard; user-centered design

PDF

Paper 76: Image Processing based Task Allocation for Autonomous Multi Rotor Unmanned Aerial Vehicles

Abstract: Nowadays, studies based on unmanned aerial vehicles draw attention, and image-processing-based tasks are especially important. In this study, several tasks were performed based on the autonomous flight, image processing and load-drop capabilities of an Unmanned Aerial Vehicle (UAV). Two main tasks were tested with an autonomous UAV, and the performance of the whole system was measured according to the duration and the image processing methods. In the first mission, the UAV flew over a 4x4 color matrix whose 16 tiles took three main colors, with the pattern changed three times. The UAV was sent to the matrix, recognized the 48 colors of the matrix and returned to the launch position autonomously. The second mission tested the load-drop and image processing abilities of the UAV: the UAV flew over the matrix, read the pattern, and went to the parachute drop area. The load was then dropped according to the pattern recognized by the UAV, which afterwards came back to the launch position.

Author 1: Akif Durdu
Author 2: Mehmet Celalettin Ergene
Author 3: Onur Demircan
Author 4: Hasan Uguz
Author 5: Mustafa Mahmutoglu
Author 6: Ender Kurnaz

Keywords: UAV; multi rotor; quad rotor; image processing; search and rescue; task allocation

PDF

Paper 77: Towards Adaptive User Interfaces for Mobile Phones in a Smart World

Abstract: Context-adaptive applications communicate with users through their interfaces; such applications offer new opportunities for developers as well as users by collecting context data and adapting system behavior accordingly. In mobile devices particularly, these mechanisms can increase usability tremendously, whereas a rigid, non-adaptive interface blocks the benefits of context awareness. In this paper, we study methods, technologies and criteria which have been proposed specifically for adaptive interfaces. Based on these guidelines, we elaborate the intelligence of adaptivity and the usage of context according to the user's mental model. Further, we propose a model for developing a user context ontology (UCO) and an adaptive interface ontology (AIO) to optimize the use of adaptive mobile interfaces in the context of user preferences; these ontologies organize the perceptions and thoughts of the user. The philosophy of User-Centered Design (UCD) is applied to analyze the usability and validity of mobile device interfaces according to user contexts.

Author 1: Muhammad Waseem Iqbal
Author 2: Nadeem Ahmad
Author 3: Syed Khuram Shahzad
Author 4: Irum Feroz
Author 5: Natash Ali Mian

Keywords: Adaptive features; smart-phone; usability experience; user interface; user context; usability engineering; UCD

PDF

Paper 78: Electronically Reconfigurable Two-Stage Schiffman Phase Shifter for Ku Band Beam Steering Applications

Abstract: An electronically reconfigurable phase shifter using two Schiffman sections is presented for beam steering applications in the Ku band. The proposed phase shifter consists of only two cascaded coupled-line sections with the reference line removed. The circuit is loaded with varactor diodes that ensure its tunability over a wide bandwidth. By supplying these varactor diodes with suitable bias voltages, the phase shift can be continuously adjusted, reaching up to 168° at 12.7 GHz with low insertion losses according to the simulations. The proposed two-stage phase shifter is thus able to reach a beam steering angle of 28.6° at 12.7 GHz with only one control voltage. The proposed structure gives the phase shifter a compact size and a large phase-shifting range throughout the Ku band. The tunable phase shifter was prototyped, and the measurement results are presented.

Author 1: Rawia Wali
Author 2: Lotfi Osman
Author 3: Tchanguiz Razban
Author 4: Yann Mahé

Keywords: Schiffman phase shifter; reconfigurable; varactor diode; beam steerability; Ku-band

PDF

Paper 79: Social Networking Sites Habits and Addiction Among Adolescents in Klang Valley

Abstract: Social networking sites (SNS) are among the most popular applications in today's society. To a certain extent, SNS have changed the way people communicate with each other. This technology has become a trend among users regardless of whether its impact on them is positive or negative. The level of SNS usage among adolescents has started to raise concern among parents and society at large. SNS addiction has become problematic in certain countries, especially the United States, and lately the issue has started to spread all over the world; Malaysia is also one of the countries affected. SNS addiction is not an isolated phenomenon: it starts from high engagement with SNS and originates in habitual behavior. Therefore, it is important to examine and understand SNS habits and addiction among adolescents in Malaysia. The purpose of this study is to analyze and explore SNS usage among adolescents in Malaysia, specifically in the Klang Valley, and to examine SNS usage behavior, namely habit and addiction. The data were collected from a sample of 60 respondents using an online survey and analyzed using SPSS for descriptive analysis. The analysis found that most of the adolescents use SNS on a daily basis and the majority use it for more than two hours per day. Patterns of habit and addiction in SNS usage show that some adolescents experience certain habitual and addictive behaviors.

Author 1: Yazriwati Yahya
Author 2: Nor Zairah Ab. Rahim
Author 3: Roslina Ibrahim
Author 4: Nurazean Maarop
Author 5: Haslina Md Sarkan
Author 6: Suriayati Chuprat

Keywords: Social networking sites; habit behavior; addiction behavior; SNS usage

PDF

Paper 80: Performance Evaluation of Trivium on Raspberry Pi

Abstract: The high connectivity of billions of IoT devices leads to many security issues, and Trivium is designed to overcome the security challenges of IoT. The objective of this study is to implement a security service that provides confidentiality for the communication of IoT devices. Furthermore, this study aims to analyze Trivium's performance in terms of keystream generation time and memory utilization on the Raspberry Pi Zero, Raspberry Pi 2B, and Raspberry Pi 3B. The results show a statistically significant difference in both keystream generation time and memory utilization across the three boards based on the Kruskal-Wallis H test. A further Jonckheere-Terpstra test indicates that the fastest keystream generation time was on the Raspberry Pi 3B, and the smallest memory utilization was on the Raspberry Pi 2B. The implementation of Trivium on the three versions of Raspberry Pi shows promising results: using less than 27 MB of memory for cryptography leaves ample resources available to applications.
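
The statistical comparison described here can be reproduced in outline with SciPy's Kruskal-Wallis H test. The timing samples below are hypothetical placeholders, not the paper's measurements, and the follow-up Jonckheere-Terpstra test is not part of SciPy, so it is omitted from the sketch.

```python
from scipy.stats import kruskal

# Hypothetical keystream-generation times (ms) measured on each board.
pi_zero = [41.2, 40.8, 42.1, 41.7, 40.9]
pi_2b   = [30.5, 29.9, 30.8, 31.0, 30.2]
pi_3b   = [22.4, 22.9, 22.1, 23.0, 22.6]

# Non-parametric test for a difference between the three groups.
h_stat, p_value = kruskal(pi_zero, pi_2b, pi_3b)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant difference between the three boards")
```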

Author 1: Ari Kusyanti
Author 2: Syahifudin Shahid
Author 3: Harin Puspa Ayu Catherina
Author 4: Yazid Samanhudi

Keywords: Trivium; Raspberry; Kruskal-Wallis

PDF

Paper 81: Improving K-Means Algorithm by Grid-Density Clustering for Distributed WSN Data Stream

Abstract: In recent years, Wireless Sensor Networks (WSNs) have had a wide range of applications in fields such as military surveillance, health monitoring, and habitat observation. WSNs contain individual nodes that interact with the environment by sensing and processing physical parameters. Sensor nodes often generate large amounts of sequential, tuple-oriented small data known as data streams. Data streams are typically huge, arrive online at very high speed, are unbounded, and cannot be ordered on arrival. Due to WSN limitations, several challenges must be faced and solved; extending network lifetime and reducing energy consumption are the main ones, and they can be addressed with data mining techniques. Clustering is a common data mining technique that effectively organizes WSN structure and has proven its efficiency for network performance by extending network lifetime and saving the energy of sensor nodes. This paper develops a grid-density clustering algorithm that enhances clustering in WSNs by combining grid and density techniques, helping to address the limitations of WSNs that carry data streams. The grid-density algorithm builds on and enhances the well-known K-Means clustering algorithm. Using Matlab, the grid-density clustering algorithm is compared with K-Means; the simulation results show that the grid-density algorithm outperforms K-Means by 15% in network lifetime and by 13% in energy consumption.
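
The paper's simulations were run in Matlab and the exact algorithm is not given in the abstract; the Python sketch below shows one way grid and density ideas can seed K-Means, using the centers of the densest grid cells as initial centroids. The function grid_density_kmeans, the grid resolution, and the random sensor field are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def grid_density_kmeans(points, k, grid=10):
    """Seed K-Means with the centers of the k densest grid cells.

    A sketch of combining grid-density binning with K-Means; the paper's
    exact algorithm and WSN energy model are not reproduced here.
    """
    mins, maxs = points.min(axis=0), points.max(axis=0)
    # Assign every sensor node to a grid cell.
    cells = np.floor((points - mins) / (maxs - mins + 1e-9) * grid).astype(int)
    cell_ids = cells[:, 0] * grid + cells[:, 1]
    ids, counts = np.unique(cell_ids, return_counts=True)
    dense = ids[np.argsort(counts)[::-1][:k]]          # k densest cells
    seeds = np.array([points[cell_ids == c].mean(axis=0) for c in dense])
    return KMeans(n_clusters=k, init=seeds, n_init=1).fit(points)

nodes = np.random.rand(200, 2) * 100        # hypothetical 100m x 100m field
model = grid_density_kmeans(nodes, k=5)
print(model.cluster_centers_)
```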

Author 1: Yassmeen Alghamdi
Author 2: Manal Abdullah

Keywords: WSNs; data mining; clustering; data stream; grid density

PDF

Paper 82: Conditional Text Paraphrasing: A Survey and Taxonomy

Abstract: This work presents a survey of the text paraphrasing task. The survey covers the different types of tasks around text paraphrasing and describes the techniques and models regularly used to approach it, alongside the datasets used for training and evaluating the models. Because text paraphrasing has an effective impact when used in other applications, the paper also mentions some text paraphrasing applications. In addition, this work proposes a new taxonomy called Conditional Text Paraphrasing. To the best of our knowledge, this is the first work that shows varieties and sub-problems of the original text paraphrasing task. The goal of this taxonomy is to expand the definition of text paraphrasing by adding conditional constraints as features that control either paraphrase generation or discrimination. This expanded definition opens a new domain for research in Natural Language Processing (NLP) and Machine Learning. Finally, some useful applications of conditional text paraphrasing are presented.

Author 1: Ahmed H. Al-Ghidani
Author 2: Aly A. Fahmy

Keywords: Natural Language Processing; Text Paraphrasing; Conditional Text Paraphrasing

PDF

Paper 83: TokenVote: Secured Electronic Voting System in the Cloud

Abstract: With the spread of democracy around the world, voting is considered a way to make decisions collectively. Recently, many government offices and private organizations have used voting to make decisions when the opinions of multiple decision makers must be accounted for. At the same time, cloud computing attracts many individuals and organizations due to its low cost, scalability, and ability to leverage big data. These considerations motivate our proposal of the TokenVote scheme: an electronic voting system in the cloud that uses revocable fingerprint biotokens with a secret sharing scheme to provide privacy, non-repudiation, and authentication. The TokenVote scheme splits each secret (vote) into shares, embeds them inside the encoded biometric data (i.e., the fingerprint), and distributes them over multiple clouds. During the voting process, each voter must provide his/her fingerprint, after which the TokenVote scheme collects the voting shares from all voters to compute the final voting result. TokenVote performs the voting computation in the cloud in parallel and in encoded form to prevent disclosure of both the voting shares and the fingerprint itself. Our experiments show that TokenVote achieves significant performance and comparable accuracy when compared with two baselines.
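
The share-splitting step can be illustrated with a standard Shamir secret sharing sketch over a prime field; the biotoken encoding and multi-cloud distribution that TokenVote layers on top are not reproduced, and the helper names below are hypothetical.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; a toy field modulus for illustration

def split_secret(secret, n, t):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Modular inverse of den via Fermat's little theorem.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split_secret(secret=1, n=5, t=3)   # e.g. the vote "1" split 5 ways
assert reconstruct(shares[:3]) == 1          # any 3 shares recover the vote
```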

Author 1: Fahad Alsolami

Keywords: Cloud; Fingerprint; Voting; Security

PDF

Paper 84: A New Uncertainty Measure in Belief Entropy Framework

Abstract: Belief entropy, which represents the uncertainty measure between several pieces of evidence in the Dempster-Shafer framework, is attracting increasing research interest. It has been used in many applications and is mainly based on the theory of evidence. Several measures have been proposed in the literature to quantify uncertainty. These measures, sometimes in extended or hybrid forms, use the Shannon entropy principle to determine the degree of uncertainty. However, the failure to consider the scale of the frame of discernment remains an open issue in quantifying uncertainty. In this paper, we propose a new uncertainty measure that takes into account the power set of the frame of discernment. After analysing the different existing methods, we show the performance and effectiveness of our proposed approach.
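
For reference, the keywords point to Deng entropy, a widely used belief-entropy baseline that weights each focal element by its cardinality; a minimal sketch of it follows (the paper's own new measure is not reproduced, and the example BBA is hypothetical).

```python
from math import log2

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment.

    `bba` maps focal elements (frozensets) to masses. Each focal element A
    contributes -m(A) * log2(m(A) / (2^|A| - 1)), so larger focal elements
    carry more uncertainty than in plain Shannon entropy.
    """
    return -sum(m * log2(m / (2**len(A) - 1)) for A, m in bba.items() if m > 0)

# Hypothetical BBA over the frame of discernment {a, b, c}.
bba = {frozenset("a"): 0.5, frozenset("ab"): 0.3, frozenset("abc"): 0.2}
print(deng_entropy(bba))
```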

Author 1: Moise Digrais Mambe
Author 2: Tchimou N'Takpé
Author 3: Nogbou Georges Anoh
Author 4: Souleymane Oumtanaga

Keywords: Dempster-Shafer Theory; Belief entropy; Uncertainty; Information management; Deng entropy

PDF

Paper 85: A Secure User Authentication Scheme with Biometrics for IoT Medical Environments

Abstract: The Internet of Things (IoT) is a ubiquitous network in which devices are interconnected and users can access those devices through the Internet. Recently, medical healthcare systems have been combined with these IoT networks and provide efficient and effective medical services to medical staff and patients. However, security threats increase along with the requirements of medical services in IoT medical environments, so it is essential to protect these networks from malicious attacks. In 2018, Roy et al. proposed a remote user authentication and key agreement scheme with biometrics for IoT medical environments. Unfortunately, our analysis demonstrates that their scheme does not withstand various attacks, such as replay attacks and password guessing attacks. We therefore propose a user authentication scheme that overcomes these security drawbacks. The proposed scheme withstands various attacks from adversaries in IoT medical environments and provides better security functionality than the scheme of Roy et al. We prove the authentication and session-key agreement of the proposed scheme using BAN logic and show that it is secure against various attacks.

Author 1: YoHan Park

Keywords: IoT medical environments; Cryptanalysis; User authentication; BAN logic

PDF

Paper 86: Round the Clock Vehicle Emission Monitoring using IoT for Smart Cities

Abstract: Vehicle emissions contribute a major part of the pollution in the world. Most countries have stringent rules for checking emission levels through their transport authorities, but achieving zero emissions requires continuous monitoring of the emission level, and smart cities need to maintain zero pollution throughout the year. In this paper, an IoT (Internet of Things) based system is proposed for continuous tracking and warning. The prototype developed is connected to the exhaust of the vehicle, and the data is collected in the cloud, where it can be further processed for warnings. The device was tested on several vehicles, and the results are comparable with the existing emission testing systems used in the market. The device could be adopted by vehicle manufacturing companies by embedding it in their products.

Author 1: Jagadish Nayak

Keywords: IoT; Emission; Sensor; Carbon; Tracking; Smartcity

PDF

Paper 87: The Implementation of an IoT-Based Flood Alert System

Abstract: Floods are among the most damaging natural disasters in the world. A heavy flood can destroy a community and claim many lives, and governments spend billions of dollars to recover affected areas. It is therefore crucial to develop a flood control system as a mechanism to reduce flood risk. Quick feedback on the occurrence of a flood is necessary to alert residents to take early action, such as evacuating quickly to a safer and higher place. As a solution, this paper proposes a system that not only detects the water level but also measures the speed at which the water level rises and alerts residents. The waterfall model is adopted as the methodology for this project. A Raspberry Pi is used to collect data from the water sensor and transmit it to a GSM module, which sends an alert via SMS. The analysis shows how the Raspberry Pi is integrated with the smartphone to deliver the alert. The system was tested in experiments covering two different environments to ensure that it provides accurate and reliable data. The project is IoT-based, in line with Industrial Revolution 4.0 and supporting the infrastructure of Cyber-Physical Systems.
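
A minimal sketch of the alerting loop described above, assuming a periodic sensor read and a GSM SMS hook; read_level_cm, send_sms, and both thresholds are placeholders rather than the paper's implementation.

```python
import time

RISE_THRESHOLD_CM_PER_MIN = 2.0   # assumed rate-of-rise alert threshold
ALERT_LEVEL_CM = 80.0             # assumed danger water level

def read_level_cm():
    """Placeholder for the Raspberry Pi water-level sensor read."""
    raise NotImplementedError("wire this to the actual sensor/GPIO")

def send_sms(message):
    """Placeholder for the SMS call made through the GSM module."""
    raise NotImplementedError("wire this to the GSM module")

def monitor(interval_s=60):
    """Alert on an absolute level or on how fast the water is rising."""
    last = read_level_cm()
    while True:
        time.sleep(interval_s)
        level = read_level_cm()
        rise = (level - last) / (interval_s / 60.0)   # cm per minute
        if level >= ALERT_LEVEL_CM or rise >= RISE_THRESHOLD_CM_PER_MIN:
            send_sms(f"FLOOD ALERT: level {level:.0f} cm, rising {rise:.1f} cm/min")
        last = level
```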

Author 1: Wahidah Md. Shah
Author 2: F. Arif
Author 3: A.A. Shahrin
Author 4: Aslinda Hassan

Keywords: Flood Alert System; Internet of Things; Cyber-Physical System; IR4.0

PDF

Paper 88: Evaluating the Quality of UCP-Based Framework using CK Metrics

Abstract: Software effort estimation is one of the most important concerns in the software industry. It has received much attention over the last 40 years in efforts to improve the accuracy of effort estimates at early stages of software development. For this reason, many software estimation models have been proposed, such as COCOMO, ObjectMetrix, Use Case Points (UCP), and many more. However, some of these estimation methods were not designed for object-oriented technology, which actively encourages reuse strategies. Therefore, given the popularity of the UCP model and the evolution of the object-oriented paradigm, a UCP-based framework and supporting program were developed to assist software developers in building good-quality software effort estimation programs. This paper evaluates the quality of the UCP-based framework using the CK metrics. The results show that implementing the UCP-based framework improves the quality of the UCP-based program in terms of understandability, testability, maintainability, and reusability.

Author 1: Zhamri Che Ani
Author 2: Nor Laily Hashim
Author 3: Hazaruddin Harun
Author 4: Shuib Basri
Author 5: Aliza Sarlan

Keywords: UCP-based framework; use case points; CK metrics

PDF

Paper 89: Tele-Ophthalmology Android Application: Design and Implementation

Abstract: Diabetic retinopathy is the leading cause of blindness in the world population. Early detection and appropriate treatment can significantly reduce the risk of sight loss, and medical authorities recommend an annual review of the fundus for diabetic patients. Several screening programs for diabetic retinopathy around the world have been set up to implement this recommendation. The purpose of this paper is to facilitate the detection of this disease using tele-ophthalmology and the latest telecommunications technologies. It presents an Android app named "RETINA" which captures retinal photographs using a microscopic lens, processes them via several operators based on mathematical morphology after a filtering step using the OpenCV library, and stores them in a local or remote MySQL database. The application also simplifies the work of doctors and ophthalmologists by allowing them to inspect the files of remote patients.
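
A rough sketch of the filtering-plus-morphology pipeline the abstract describes, written with OpenCV in Python rather than the app's Android code; the median-blur kernel, the black-hat operator, and the threshold value are assumptions about which operators are used.

```python
import cv2

# Load a fundus photograph in grayscale (hypothetical file name).
img = cv2.imread("fundus.jpg", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)                       # the filtering step

# Black-hat morphology highlights dark, thin structures (vessels, lesions)
# against the brighter fundus background.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
tophat = cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, kernel)

# Binarize, then open to remove isolated noise pixels.
_, mask = cv2.threshold(tophat, 15, 255, cv2.THRESH_BINARY)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                        cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3)))
cv2.imwrite("fundus_processed.png", mask)
```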

Author 1: Rachid Merzougui
Author 2: Mourad Hadjila
Author 3: Nadia Benmessaoud
Author 4: Mokhtaria Benaouali

Keywords: Diabetic retinopathy; Screening; Fundus; Tele-ophthalmology; Android; Mathematical morphology; OpenCV; Database

PDF

Paper 90: Virtual Rehabilitation Using Sequential Learning Algorithms

Abstract: Rehabilitation systems are becoming more important now that patients can access motor skills recovery treatment from home, reducing the limitations of time, space, and cost of treatment in a medical facility. Traditional rehabilitation systems served as movement guides, later as movement mirrors, and in recent years research has sought to generate feedback messages to patients based on the evaluation of their movements. Currently the most commonly used algorithms for exercise evaluation are Dynamic Time Warping (DTW), the Hidden Markov Model (HMM), and the Support Vector Machine (SVM). However, the larger the set of exercises to be evaluated, the less accurate the recognition becomes, generating confusion between exercises that have similar posture descriptors. This paper compares two classifiers, HMM and Hidden Conditional Random Fields (HCRF), together with two types of posture descriptors, one based on points and one based on angles. The point representation proves superior to the angle representation, although the latter is still acceptable. Similar results are found for HCRF and HMM.

Author 1: Gladys Calle Condori
Author 2: Eveling Castro-Gutierrez
Author 3: Luis Alfaro Casas

Keywords: Kinect Skeletal; Sequential Learning Algorithms; Virtual Rehabilitation; Virtual Reality Therapy

PDF

Paper 91: A Machine Learning based Fine-Tuned and Stacked Model: Predictive Analysis on Cancer Dataset

Abstract: Early prediction and localization of diseased cells can be useful in curing illnesses in medical applications. Knowledge discovery plays a significant role in the health sector, bioinformatics, and related fields. Plenty of hidden information is available in datasets across various domains, such as medical records, textual analysis, and image attribute exploration. Predictive analytics and modeling encompass a variety of statistical methodologies from machine learning that can analyze present and historical facts to make predictions about future events. Breast cancer research has made good progress in the recent decade, but advances in technology still leave room for improvement. In this paper, a fine-tuned and stacked model procedure is presented and evaluated on a standard breast cancer dataset. The results obtained show an improvement over state-of-the-art algorithms in performance parameters such as disease prediction accuracy, sensitivity, and F1 score.
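
A minimal sketch of a fine-tuned, stacked model on a standard breast cancer dataset using scikit-learn; the base learners, hyperparameter grid, and sklearn's built-in load_breast_cancer data are stand-ins, since the abstract does not specify the paper's exact configuration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # a standard breast cancer dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fine-tune one base learner, then stack it with another under a
# logistic-regression meta-learner.
svc = GridSearchCV(SVC(probability=True), {"C": [0.1, 1, 10]}, cv=5)
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("svc", svc)],
    final_estimator=LogisticRegression(max_iter=1000))
stack.fit(X_tr, y_tr)
print("test accuracy:", stack.score(X_te, y_te))
```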

Author 1: Ravi Aavula
Author 2: R. Bhramaramba

Keywords: Machine learning; Cancer prediction; Data mining and Knowledge discovery; Supervised learning; Neural Networks

PDF

Paper 92: Video Streaming Analytics for Traffic Monitoring Systems

Abstract: Monitoring traffic during rush hours is considered a difficult task. Traditional applications are manual, costly, and time-consuming, and they depend on human factors. Meanwhile, large-scale data is being generated from many different resources, and advancements in technology make it possible to store, process, analyze, and communicate large-scale video data. Manual applications are being replaced by automatic ones: automatic video streaming analytics applications help to reduce computational resources while providing cost-efficient and accurate predictions for monitoring traffic on roads. This study reviews previously developed applications of video streaming analytics for traffic monitoring systems using Hadoop that are able to analyze video streams efficiently.

Author 1: Muhammad Arslan Amin
Author 2: Muhammad Kashif Hanif
Author 3: Muhammad Umer Sarwar
Author 4: Muhammad Kamran Sarwar
Author 5: Ayesha Kanwal
Author 6: Muhammad Azeem

Keywords: Video streaming analytics; Traffic monitoring system; Video streams; Hadoop; GPU; CNN; Deep learning

PDF

Paper 93: Role of Bloom Filter in Big Data Research: A Survey

Abstract: Big Data is one of the most popular emerging trends; it has become a blessing for humankind and a necessity of day-to-day life, with Facebook as one example. Every person is involved in producing data, either directly or indirectly. Big Data is thus a high volume of data with an exponential growth rate that comprises a wide variety of data. It touches all fields, including the government sector, the IT industry, business, economics, engineering, bioinformatics, and other basic sciences, and it consequently forms data silos. Most of the data are duplicate and unstructured. To deal with such data silos, the Bloom Filter is a precious resource for filtering out duplicate data, and it is also invaluable in a Big Data storage system for optimizing memory consumption. A Bloom Filter uses a tiny amount of memory space to filter a very large volume of data, storing membership information for a large set. Although its functionality is limited to membership filtering, it can be adapted in various applications, and it is deployed in diverse fields and interdisciplinary research areas, bioinformatics for instance. In this article, we expose the usefulness of the Bloom Filter in Big Data research.
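
For readers new to the structure, a minimal Bloom filter looks like the following; the array size m, the hash count k, and the salted-SHA-1 hashing scheme are illustrative choices, not tied to any system surveyed in the paper.

```python
import hashlib

class BloomFilter:
    """A minimal Bloom filter: k salted SHA-1 hashes over an m-bit array."""

    def __init__(self, m=8192, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m // 8)

    def _positions(self, item):
        # Derive k bit positions by salting the item with 0..k-1.
        for salt in range(self.k):
            digest = hashlib.sha1(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("record-42")
print("record-42" in bf)   # True
print("record-43" in bf)   # False with high probability
```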

Author 1: Ripon Patgiri
Author 2: Sabuzima Nayak
Author 3: Samir Kumar Borgohain

Keywords: Bloom Filter; Big Data; Database; Membership Filter; Deduplication; Big Data Storage; Flash memory; Cloud Computing

PDF

Paper 94: Efficient Iris Pattern Recognition Method by using Adaptive Hamming Distance and 1D Log-Gabor Filter

Abstract: Iris recognition is one of the most reliable security methods compared with other biometric security techniques. The iris is an internal organ whose texture is randomly determined during embryonic gestation and is amenable to computerized machine-vision systems for remote examination. Previous researchers have used approaches such as the Hamming Distance in their iris recognition algorithms. In this paper, we propose a new method to improve the performance of the iris recognition matching system. Firstly, a 1D Log-Gabor Filter is used to encode the unique features of the iris into a binary template; the efficiency of the algorithm is increased by taking into account the locations of fragile bits alongside the 1D Log-Gabor filter. Secondly, an Adaptive Hamming Distance is used to examine the affinity of two templates. The main steps of the proposed iris recognition algorithm are segmentation using the circular Hough transform, normalization using Daugman's rubber sheet model, which provides a high percentage of accuracy, feature encoding, and matching. Simulation studies were made to test the validity of the proposed algorithm, and the results confirm its superior performance against several state-of-the-art iris matching algorithms. In experiments performed on the CASIA V1.0 iris database, the proposed method achieves a genuine acceptance rate of 99.92%.
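
The comparison step can be sketched as a fractional Hamming distance computed only over bits both templates mark as reliable; the paper's adaptive thresholding and fragile-bit weighting are not reproduced, and the synthetic templates below are placeholders.

```python
import numpy as np

def masked_hamming(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance over bits both templates mark as reliable."""
    usable = mask_a & mask_b                 # bits valid in both templates
    disagreements = (code_a ^ code_b) & usable
    return disagreements.sum() / max(usable.sum(), 1)

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 2048, dtype=np.uint8)   # synthetic 2048-bit iris code
b = a.copy()
b[:100] ^= 1                                   # simulate 100 flipped bits
mask = np.ones(2048, dtype=np.uint8)           # all bits marked reliable
print(masked_hamming(a, b, mask, mask))        # prints 0.048828125 (100/2048)
```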

Author 1: Rachida Tobji
Author 2: Wu Di
Author 3: Naeem Ayoub
Author 4: Samia Haouassi

Keywords: Iris recognition; bio-metric; Hamming Distance; iris recognition matching; Adaptive Hamming Distance; 1D Log-Gabor Filter; segmentation; normalization; feature encoding; genuine acceptance rate

PDF

Paper 95: Empirical Evaluation of SVM for Facial Expression Recognition

Abstract: Support Vector Machines (SVMs) have shown better generalization and classification capabilities in different applications of computer vision; an SVM classifies the underlying data by a hyperplane that separates the two classes while maintaining the maximum margin between the support vectors of the respective classes. An empirical analysis of SVMs on the facial expression recognition task, which exhibits high intra-class and low inter-class variation, is reported through an extensive set of experiments on the large-scale Fer 2013 dataset. Three different SVM kernel functions are used: linear, quadratic, and cubic, with the Histogram of Oriented Gradients (HoG) as the feature descriptor. The cubic kernel achieves the highest accuracy on the Fer 2013 dataset using HoG.
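
A compact sketch of the HoG-plus-cubic-SVM setup using scikit-image and scikit-learn; in sklearn a cubic kernel corresponds to a degree-3 polynomial kernel. The dataset loading is omitted, and the placeholder names train_imgs/train_labels are assumptions.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def extract_hog(images_48x48):
    """HoG descriptors for grayscale 48x48 faces (the Fer 2013 image size)."""
    return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for img in images_48x48])

# A cubic kernel is a degree-3 polynomial kernel in scikit-learn.
clf = SVC(kernel="poly", degree=3)
# clf.fit(extract_hog(train_imgs), train_labels)        # placeholder data
# print(clf.score(extract_hog(test_imgs), test_labels))
```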

Author 1: Saeeda Saeed
Author 2: Junaid Baber
Author 3: Maheen Bakhtyar
Author 4: Ihsan Ullah
Author 5: Naveed Sheikh
Author 6: Imam Dad
Author 7: Anwar Ali Sanjrani

Keywords: Facial Expression Recognition; Support Vector Machine (SVM); Histogram of Oriented Gradients (HoG)

PDF

Paper 96: Predicting Potential Banking Customer Churn using Apache Spark ML and MLlib Packages: A Comparative Study

Abstract: This study was conducted based on the assumption that the Spark ML package has much better performance and accuracy than the Spark MLlib package in dealing with big data. The dataset used in the comparison consists of bank customers' transactions. The decision tree algorithm was used with both packages to generate a model for predicting the churn probability of bank customers based on their transaction data. Detailed comparison results were recorded and show that the ML package and its new DataFrame-based APIs have better evaluation performance and prediction accuracy.
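
A minimal sketch of the DataFrame-based ML-package pipeline the study favors; the CSV path, column names, and split ratio are assumptions rather than the paper's setup.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import DecisionTreeClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = SparkSession.builder.appName("churn").getOrCreate()
# Hypothetical schema: numeric transaction features plus a 0/1 `churn` label.
df = spark.read.csv("bank_transactions.csv", header=True, inferSchema=True)

assembler = VectorAssembler(
    inputCols=["balance", "num_transactions", "avg_amount"],  # assumed columns
    outputCol="features")
train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)

tree = DecisionTreeClassifier(labelCol="churn", featuresCol="features")
predictions = tree.fit(train).transform(test)
acc = MulticlassClassificationEvaluator(
    labelCol="churn", metricName="accuracy").evaluate(predictions)
print(f"accuracy = {acc:.3f}")
```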

Author 1: Hend Sayed
Author 2: Manal A. Abdel-Fattah
Author 3: Sherif Kholief

Keywords: Churn prediction; Big data; Machine learning; Apache Spark; ML package; MLlib package; Decision tree

PDF

Paper 97: Multimodal Automatic Image Annotation Method using Association Rules Mining and Clustering

Abstract: Effective and fast retrieval of images from image datasets is not an easy task, especially with the continuous and rapid growth of the digital images added every day by users to the web. Automatic image annotation is an approach that has been proposed to facilitate the retrieval of images semantically related to a query image. A multimodal image annotation method is proposed in this paper. The goal is to benefit from both the visual features extracted from images and their associated user tags. The proposed method relies on clustering to group the text and visual features into clusters, and on association rules mining to generate the rules that associate text clusters with visual clusters. In the experimental evaluation, two datasets from the photo annotation tasks are considered: ImageCLEF 2011 and ImageCLEF 2012. The results achieved by the proposed method are better than all the multimodal methods of participants in the ImageCLEF 2011 photo annotation task and state-of-the-art methods. Moreover, the MiAP of the proposed method is better than the MiAP of 7 participants out of 11 when using ImageCLEF 2012 in the evaluation.

Author 1: Mounira Taileb
Author 2: Eman Alahmadi

Keywords: Automatic image annotation; association rules mining; clustering

PDF

Paper 98: Comparison between Commensurate and Non-commensurate Fractional Systems

Abstract: This article deals with fractional systems, which represent physical processes better and require only a small number of parameters, reducing computation time. It focuses in particular on the state-space representation, which highlights the state variables, allows the internal behavior of the system to be studied while taking the initial state into account, and adapts better to the multiple-input multiple-output case. It also discusses the discretization of fractional systems and the adaptation of Model Predictive Control to them, showing its efficiency and performance on these systems. The main objective of this article is to compare commensurate and non-commensurate fractional models in terms of performance, calculation time, and ease of use.
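
For concreteness, the two model classes being compared are commonly written as follows in the fractional-systems literature (a standard formulation, not reproduced from the paper):

```latex
% Commensurate fractional state-space model: a single base order \alpha
% (0 < \alpha < 1) shared by all states.
\[
  D^{\alpha} x(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t)
\]
% Non-commensurate model: each state x_i carries its own order \alpha_i.
\[
  D^{\alpha_i} x_i(t) = \sum_{j} a_{ij}\,x_j(t) + \sum_{k} b_{ik}\,u_k(t),
  \qquad i = 1,\dots,n
\]
```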

Author 1: Khaled HCHEICHI
Author 2: Faouzi BOUANI

Keywords: discretization; state-space; fractional; calculation time

PDF

Paper 99: A Case Study for the IONEX CODE-Database Processing Tool Software: Ionospheric Anomalies before the Mw 8.2 Earthquake in Mexico on September 7, 2017

Abstract: A software tool was developed in the Image Processing Research Laboratory (INTI-Lab) that automatically downloads several IONEX files around a specific user-input date and performs statistical calculations to look for ionospheric anomalies through the generation of differential vertical total electron content (∆VTEC) maps. The IONEX CODE-Database Processing Tool (ICPT) software saves a considerable amount of time otherwise spent gathering the IONosphere map EXchange (IONEX) files needed to produce differential VTEC maps. Using the ICPT software, we were able to detect ionospheric anomalies before the devastating earthquake that struck Mexico on September 7, 2017: a positive and a negative ionospheric anomaly were detected nine days and one day before the seismic event, respectively. Because geomagnetic conditions were stable, we suggest that the anomalies are associated with the earthquake. Furthermore, it is very likely that the collision between the North American and Cocos plates produced the ionization of the air necessary to generate the observed disturbances.
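
The ∆VTEC anomaly search can be sketched as a sliding-window envelope test over a VTEC time series; the window length and the 1.5-sigma bound below are assumptions, not ICPT's documented settings.

```python
import numpy as np

def vtec_anomalies(vtec, window=15, k=1.5):
    """Flag ∆VTEC anomalies outside a sliding mean ± k·std envelope.

    `vtec` is a VTEC series for one grid point (e.g. one value per day).
    Returns the differential VTEC against the trailing-window mean and a
    boolean mask of anomalous epochs.
    """
    dvtec = np.zeros(len(vtec), dtype=float)
    anomalies = np.zeros(len(vtec), dtype=bool)
    for i in range(window, len(vtec)):
        ref = np.asarray(vtec[i - window:i], dtype=float)
        mu, sigma = ref.mean(), ref.std()
        dvtec[i] = vtec[i] - mu
        anomalies[i] = abs(dvtec[i]) > k * sigma
    return dvtec, anomalies
```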

Author 1: Guillermo Wenceslao Zarate Segura
Author 2: Carlos Sotomayor-Beltran

Keywords: Earthquake; IONEX CODE-Database Processing Tool; ionospheric anomalies; geomagnetic storm; global ionospheric maps (GIM); IONosphere map EXchange (IONEX)

PDF

Paper 100: Hybrid Non-Reference QoE Prediction Model for 3D Video Streaming Over Wireless Networks

Abstract: With the rapid growth in mobile device users and the increasing demand for video applications, traffic from 2D/3D video services is expected to account for the largest proportion of internet traffic. The user's perceived quality of experience (QoE) and the quality of service (QoS) are the most important factors for the success of video delivery. In this regard, predicting QoE is highly important for provisioning 3D video services in the wireless domain due to limited resources and bandwidth constraints. This study presents a cross-layer, no-reference quality prediction model for wireless 3D video streaming. The model is based on fuzzy inference systems (FIS) and exploits several QoS key factors that are mapped to the QoE. The performance of the model was validated with unseen datasets and shows high prediction accuracy: the results show a high correlation between the objectively measured QoE and the QoE predicted by the FIS model.

Author 1: Ibrahim Alsukayti
Author 2: Mohammed Alreshoodi

Keywords: QoE; QoS; Video; Fuzzy Logic; Prediction

PDF

Paper 101: The Proposal of a Distributed Algorithm for Solving the Multiple Constraints Parking Problem

Abstract: The parking problem in big cities has become one of the key causes of city traffic congestion, driver frustration, and air pollution, so parking monitoring is an important solution for avoiding these problems. Recently, many new technologies have been developed that allow vehicle drivers to effectively find free parking places in the city, but these systems are still limited because they do not take road network constraints into consideration. In this paper, we design a distributed system that helps drivers find the optimal route between their position and an indoor parking facility in the city, taking into consideration a set of constraints such as distance, traffic, the amount of fuel in the car, the number of available places in the parking facility, and the parking cost. We propose a distributed technique based on multi-objective Ant Colony Optimization (ACO). The proposed method aims to manage the multi-objective parking problem in real time, using the behavior of real ants and multi-agent systems to decrease the traffic flow and find the optimal route for drivers.

Author 1: Khaoula Hassoune
Author 2: Wafaa Dachry
Author 3: Fouad Moutaouakkil
Author 4: Hicham Medromi

Keywords: ant colony optimization (ACO); multi-agent system; parking monitoring; intelligent systems

PDF

Paper 102: A Review on Event-Based Epidemic Surveillance Systems that Support the Arabic Language

Abstract: With the revolution of the internet, many event-based systems have been developed for monitoring epidemic threats. These systems rely on unstructured data gathered from various online sources, and some are able to handle more than one language so as to cover all news reports related to disease outbreaks worldwide. The aim of this paper is to examine existing systems in terms of their support for the Arabic language. The 28 identified systems were evaluated based on different criteria. The results of this evaluation show that only 5 systems support the Arabic language, and they do so via translation tools; hence, disease outbreaks reported in news written in Arabic are not processed directly. In other words, no existing event-based system in the literature has yet been developed specifically for Arabic health news reports to monitor epidemic diseases.

Author 1: Meshrif Alruily

Keywords: Public health; infectious disease; event extraction; disease surveillance system; Arabic language

PDF

Paper 103: KASP: A Cognitive-Affective Methodology for Designing Serious Learning Games

Abstract: Many research studies agree on the existence of a close link between emotion and cognition, and much research has demonstrated that students with learning disabilities (LD) experience emotional distress related to their difficulties. In this regard, this article proposes a new methodology for designing intelligent games called the KASP Methodology, a new approach to the field of serious games (SGs) design. It includes new decisive factors for designing SGs for children with LD. The proposed methodology rests on four pillars: Knowledge, Affect, Sensory, and Pedagogy. It aims to help designers of serious games build suitable serious learning games for children with LD, taking into account the cognitive and emotional aspects of the child learner in order to improve the child's learning rhythm and foster an emotional state conducive to learning in a playful and interactive environment.

Author 1: Tahiri Najoua
Author 2: El Alami Mohamed

Keywords: Methodology; Affective Computing; Serious Games; Learning Disabilities; Game Design; Knowledge; Pedagogy

PDF

Paper 104: Video Authentication using PLEXUS Method

Abstract: Digital video authentication is a very important issue in day-to-day life. Many devices are capable of recording or capturing digital video, and all these videos can be passed through the internet as well as many other non-secure channels. The development of video editing software creates a problem of illegal updating or manipulation of digital video, so video authentication techniques are required to ensure the trustworthiness of a video. Many techniques, such as digital signatures and watermarking, are used to address this issue; these solutions are successfully applied for copyright purposes, but they are still difficult to implement in many other situations, especially in video surveillance. In this paper, a new method called PLEXUS is proposed for digital video authentication against temporal attacks. In the authentication process, the sender generates a signature from a video and a private key according to the method's steps. In the verification process, the receiver also generates a signature using the same video and private key, and the two signatures are compared. If they match, the video has not been tampered with; otherwise, it has. The method was implemented on 10 different videos and proved to be efficient.
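
PLEXUS's exact signature construction is defined in the paper; as a hedged analogue of the sender/receiver comparison it describes, the sketch below chains the video frames through HMAC-SHA256 with the private key, so any temporal tampering (frame drop, swap, or insertion) changes the signature.

```python
import hashlib
import hmac

import cv2

def video_signature(path, private_key: bytes) -> str:
    """Keyed signature over the ordered frame sequence of a video.

    Not the PLEXUS algorithm itself, only an illustration of the
    sender/receiver signature-comparison workflow it describes.
    """
    mac = hmac.new(private_key, digestmod=hashlib.sha256)
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mac.update(frame.tobytes())   # frame order affects the digest
    cap.release()
    return mac.hexdigest()

# Receiver recomputes with the shared key and compares:
# tampered = video_signature("received.avi", key) != sender_signature
```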

Author 1: Hala Bahjat Abdulwahab
Author 2: Khaldoun L. Hameed
Author 3: Nawaf Hazim Barnouti

Keywords: PLEXUS; video authentication; video tampering; temporal attacks

PDF

Paper 105: An Automatic Cryptanalysis of Arabic Transposition Ciphers using Compression

Abstract: This paper introduces a compression-based method adapted for the automatic cryptanalysis of Arabic transposition ciphers. More specifically, it presents how a Prediction by Partial Matching (PPM) compression scheme, a method that shows a high level of performance when applied to different natural language processing tasks, can also be used for the automatic decryption of transposition ciphers in the Arabic language. Another well-known compression scheme, Gzip, is also investigated in this paper, with less efficient performance demonstrated by this method. In order to achieve readability, two further compression-based approaches for space insertion are evaluated as well. The results of our experiments with 125 Arabic cryptograms of different lengths show that 97% of the cryptograms are successfully decrypted without any errors using PPM compression models. Additionally, in a post-processing step, we can effectively segment the output by the automatic insertion of spaces, with only a few errors overall. As far as we know, this is the first work to demonstrate an effective automatic cryptanalysis of transposition ciphers in Arabic.
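
The core idea, scoring candidate decryptions by how well a compressor shrinks them, can be sketched with zlib standing in for PPM (the paper in fact found Gzip-style compression weaker than PPM); the toy columnar-transposition search below is an illustration, not the authors' algorithm.

```python
import zlib
from itertools import permutations

def score(text: bytes) -> int:
    """Compressed size as a language-model proxy (zlib stands in for PPM)."""
    return len(zlib.compress(text, 9))

def crack_columnar(cipher: bytes, max_cols=6) -> bytes:
    """Try every column count and order; keep the candidate that compresses best."""
    best = (float("inf"), cipher)
    for n in range(2, max_cols + 1):
        if len(cipher) % n:
            continue                      # toy assumption: no padding used
        rows = len(cipher) // n
        cols = [cipher[i * rows:(i + 1) * rows] for i in range(n)]
        for perm in permutations(range(n)):
            # Rebuild the plaintext row by row from the permuted columns.
            plain = bytes(cols[p][r] for r in range(rows) for p in perm)
            best = min(best, (score(plain), plain))
    return best[1]
```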

Author 1: Noor R. Al-Kazaz
Author 2: William J. Teahan

Keywords: Automatic cryptanalysis; Arabic transposition ciphers; compression; PPM; word segmentation

PDF

Paper 106: Systematic Analysis and Classification of Cardiac Rate Variability using Artificial Neural Network

Abstract: The electrocardiogram (ECG) is the acquisition of electrical activity signals in cardiology. It contains important information about the condition and diseases of the heart: the wave pattern, size, and shape of the ECG and the time intervals between different peaks of the P-QRS-T wave provide useful information about the diseases that afflict the heart. Heart rate signals vary, and this variation contains important indicators of cardiac disease. Heart rate variability is a popular and non-invasive tool for assessing the autonomic nervous system. The indicators contained in the ECG wave may appear throughout the day or occur randomly, so computer-based analysis over day-long intervals is very useful for diagnosing heart disease. This paper therefore deals with the classification of heart diseases on the basis of heart rate variability using an artificial neural network. The feed-forward neural network is correct on approximately 85% of the test results.
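
A minimal sketch of such a pipeline: standard time-domain HRV features (SDNN, RMSSD, pNN50) extracted from R-R intervals and fed to a feed-forward network via scikit-learn; the feature set, network size, and placeholder data are assumptions, since the abstract does not list the exact inputs.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def hrv_features(rr_intervals_ms):
    """Simple time-domain HRV features from an R-R interval series (ms)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)
    return [rr.mean(),
            rr.std(),                      # SDNN
            np.sqrt((diff ** 2).mean()),   # RMSSD
            (np.abs(diff) > 50).mean()]    # pNN50

# Placeholder data: rr_records and labels would come from annotated ECGs.
# X = np.array([hrv_features(rec) for rec in rr_records])
# y = labels
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
# clf.fit(X, y); print(clf.score(X_test, y_test))
```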

Author 1: Azizullah Kakar
Author 2: Naveed Sheikh
Author 3: Bilal Ahmed
Author 4: Saleem Iqbal

Keywords: Electrocardiogram (ECG); Cardiology; P-QRS-T wave; Autonomic nervous system; Heart rate variability; Artificial neural network; Time and frequency domain; Pattern recognition; Diseases classification

PDF

Paper 107: A New Steganography Technique using JPEG Images

Abstract: Steganography is a security technique that uses ambiguity to hide a secret message within an ordinary message exchanged between senders and receivers. In this paper, we propose a new steganography technique for hiding data in Joint Photographic Experts Group (JPEG) images, the best-known lossy image compression format. Our proposed work is based on lossy compression (the frequency domain) in images. This type of compression is susceptible to even the smallest change, which makes it difficult to find a proper location to embed data without affecting the image quality and without allowing anyone to notice the hidden message. On the sender's side, we first divide the image into 8x8 blocks, then apply a Discrete Cosine Transform (DCT), quantization, and zigzag processes respectively. Second, the secret message is embedded at the end of each selected zigzag block array using the best method found in our experimental results. Third, the rest of the pipeline applies Run Length Coding (RLC), Differential Pulse Code Modulation (DPCM), and a Huffman encoder to obtain the compressed image that includes the embedded message. On the receiver's side, we reverse the previous steps to extract the secret message using an encrypted shared key exchanged via a secure channel. Our experimental results show that the best zigzag array content sizes are between 1 and 20. This selection allows us to utilize more than half of the image blocks to embed the secret message, and the difference between the cover image that holds the secret message and the original cover image is minimal and hard to detect.
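
The sender-side steps up to embedding can be sketched as follows; the zigzag construction and the DCT/quantization are standard JPEG operations, while embed_bit is only a guess at the paper's "end of the zigzag array" rule, and q_table is a JPEG quantization table to be supplied by the caller.

```python
import numpy as np
from scipy.fftpack import dct

# Standard JPEG zigzag order: walk the 8x8 diagonals, alternating direction.
ZIGZAG = sorted(((r, c) for r in range(8) for c in range(8)),
                key=lambda rc: (rc[0] + rc[1],
                                rc[0] if (rc[0] + rc[1]) % 2 else -rc[0]))

def block_zigzag(block_8x8, q_table):
    """DCT -> quantize -> zigzag scan of one 8x8 block."""
    coeffs = dct(dct(block_8x8, axis=0, norm="ortho"), axis=1, norm="ortho")
    quant = np.round(coeffs / q_table).astype(int)
    return [quant[rc] for rc in ZIGZAG]

def embed_bit(zz, bit):
    """Place one secret bit just after the last nonzero coefficient.

    A hypothetical embedding rule, not the paper's exact method.
    """
    last = max((i for i, v in enumerate(zz) if v != 0), default=0)
    if last + 1 < len(zz):
        zz[last + 1] = 1 if bit else -1
    return zz
```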

Author 1: Rand A. Watheq
Author 2: Fadi Almasalha
Author 3: Mahmoud H. Qutqut

Keywords: Steganography; hide secret message; JPEG image; lossy compression; frequency domain; zigzag

PDF
