IJACSA Volume 9 Issue 1

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: Novel Methods for Resolving False Positives during the Detection of Fraudulent Activities on Stock Market Financial Discussion Boards

Abstract: Financial discussion boards (FDBs) have been widely used for a variety of financial knowledge exchange activities through the posting of comments. Popular public FDBs are prone to being used as a medium to spread false financial information because of their larger audience groups. Although online forums are usually integrated with anti-spam tools, such as Akismet, moderation of posted content relies heavily on manual work. Unfortunately, the volume of comments received daily on popular FDBs realistically prevents human moderators from watching closely and moderating possibly fraudulent content, and moderators are not usually assigned such a task in any case. In the absence of useful tools, it is extremely time-consuming and expensive to manually read each comment and determine whether it is potentially fraudulent. This paper presents novel forward and backward analysis methodologies implemented in an Information Extraction (IE) prototype system named FDBs Miner (FDBM). The methodologies aim to detect potentially illegal Pump and Dump comments on FDBs by integrating per-minute share prices into the detection process. This can reduce false positives during detection, as it categorises the potentially illegal comments into different risk levels for investigation purposes. The proposed system extracts companies' ticker symbols (i.e., the unique symbols that identify each listed company on a stock market), comments and share prices from UK-based FDBs. The forward analysis methodology flags potentially illegal Pump and Dump comments using a predefined keywords template and labels the flagged comments with price hike thresholds. Subsequently, the backward analysis methodology employs a moving average technique to determine price abnormalities and analyse the flagged comments retrospectively. The first detection stage in forward analysis found 9.82% of comments to be potentially illegal, and it is unrealistic and unaffordable for human moderators or financial surveillance authorities to read these comments on a daily basis. Integrating share prices to perform backward analysis therefore categorises the flagged comments into different risk levels, helping the relevant authorities to prioritise and investigate the higher-risk flagged comments, which could indicate a real Pump and Dump crime happening on FDBs when the system is used in real time.
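
The backward analysis relies on a moving average to spot price abnormalities. Below is a minimal Python sketch of that idea; the window size and deviation threshold are illustrative assumptions, not the paper's tuned values.

```python
# Sketch of moving-average-based price-abnormality flagging.
# window and threshold are illustrative assumptions.
import numpy as np

def flag_price_abnormalities(prices, window=30, threshold=0.05):
    """Return indices of minutes whose price deviates from the trailing
    moving average by more than `threshold` (as a fraction)."""
    prices = np.asarray(prices, dtype=float)
    flagged = []
    for t in range(window, len(prices)):
        ma = prices[t - window:t].mean()          # trailing moving average
        if abs(prices[t] - ma) / ma > threshold:  # relative deviation
            flagged.append(t)
    return flagged

# Comments posted near flagged minutes would be escalated
# to a higher risk level for investigation.
```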

Author 1: Pei Shyuan Lee
Author 2: Majdi Owda
Author 3: Keeley Crockett

Keywords: Financial discussion boards; financial crimes; pump and dump; text mining; information extraction


Paper 2: Inferring of Cognitive Skill Zones in Concept Space of Knowledge Assessment

Abstract: In this research, zones of knowledge in the assessed domain are identified. Explicitly, these zones are known as Verified Skills, Derived Skills and Potential Skills. In detail, the Verified Skills Zone is the set of tested concepts in the knowledge domain; the Derived Skills Zone is the set of concepts that are prerequisite to the tested concepts based on the cognitive skills relation; and the Potential Skills Zone is the set of concepts in the domain that have never been tested and are not prerequisite to the tested concepts, but are related to them based on the cognitive skills relation. Identifying cognitive relations between the concepts in one domain simplifies the structure of the assessment, which helps to find the knowledge state of the assessed individual in a short time and with a minimum number of questions. The existence of the concepts in the assessment domain helps us to estimate the set of concepts that are known, not known, ready to be known, or not ready to be known. In addition, it provides the output of the assessment as concept-centric values in addition to quantity values. The assessment result gives binary values over the assessed domain: "1" implies knowing the concept, whereas "0" implies not knowing it. The output is six sets of concepts: 1) Verified Known Skills; 2) Verified Not Known Skills; 3) Derived Known Skills; 4) Derived Not Known Skills; 5) Potential Known Skills; and 6) Potential Not Known Skills. An experiment is conducted to show the binary output of the assessed domain based on the participants' answers to the asked questions. The results also highlight the efficiency of the assessment.

Author 1: Rania Aboalela
Author 2: Javed Khan

Keywords: Cognitive skill; Bloom's taxonomy; assessment of knowledge; concept space; concepts zones


Paper 3: Credit Card Fraud Detection using Deep Learning based on Auto-Encoder and Restricted Boltzmann Machine

Abstract: Fraud has no constant patterns: fraudsters always change their behavior, so we need to use unsupervised learning. Fraudsters learn about new technology that allows them to execute frauds through online transactions. They assume the regular behavior of consumers, and fraud patterns change fast. Fraud detection systems therefore need to examine online transactions using unsupervised learning, because some fraudsters commit fraud once through online mediums and then switch to other techniques. This paper aims to 1) focus on fraud cases that cannot be detected based on previous history or supervised learning, and 2) create a model of a deep auto-encoder and restricted Boltzmann machine (RBM) that can reconstruct normal transactions to find anomalies from normal patterns. The proposed deep learning model based on the auto-encoder (AE) is an unsupervised learning algorithm that applies backpropagation by setting the inputs equal to the outputs. The RBM has two layers: the input (visible) layer and the hidden layer. In this research, we use the Tensorflow library from Google to implement the AE and RBM, and the H2O framework for deep learning. The results report the mean squared error, root mean squared error, and area under curve.
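
To illustrate the reconstruction-based detection the abstract describes, here is a hedged Keras sketch of an auto-encoder trained only on normal transactions; the layer sizes, feature count, and training settings are assumptions, not the authors' configuration.

```python
# Minimal reconstruction-based anomaly detector sketch (assumed sizes).
import numpy as np
import tensorflow as tf

n_features = 30  # assumption: e.g. 30 scaled transaction features

ae = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu",
                          input_shape=(n_features,)),   # encoder
    tf.keras.layers.Dense(8, activation="relu"),        # bottleneck
    tf.keras.layers.Dense(16, activation="relu"),       # decoder
    tf.keras.layers.Dense(n_features, activation="linear"),
])
ae.compile(optimizer="adam", loss="mse")  # inputs equal to outputs
# ae.fit(X_normal, X_normal, epochs=20, batch_size=256)

# Anomaly score = reconstruction error; a large MSE suggests fraud:
# errors = np.mean((X_test - ae.predict(X_test)) ** 2, axis=1)
```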

Author 1: Apapan Pumsirirat
Author 2: Liu Yan

Keywords: Credit card; fraud detection; deep learning; unsupervised learning; auto-encoder; restricted Boltzmann machine; Tensorflow


Paper 4: Voice Detection in Traditional Tunisian Music using Audio Features and Supervised Learning Algorithms

Abstract: The research presented in this paper aims to automatically detect the singing voice in traditional Tunisian music, taking into account the main characteristics of the sound of the voice in this particular music style. This means creating the possibility to automatically identify instrumental and singing sounds. Therefore, different methods for the automatic classification of sounds using supervised learning algorithms were compared and evaluated. The research is divided into four successive stages: first, the extraction of feature vectors from the audio tracks (through calculation of the parameters of sound perception), followed by the selection and transformation of relevant features for singing/instrumental discrimination; then, using learning algorithms, the instrumental and vocal classes were modeled from a manually annotated database; finally, the decision-making process (indexing) was evaluated on the test part of the database. The musical databases used for this study consist of extracts from the national sound archives of the Centre of Mediterranean and Arabic Music (CMAM) and recordings made especially for this research. The ability to index (classify/segment) audio data into vocal and instrumental parts enables content-based information retrieval from musical databases.

Author 1: Wissem Ziadi
Author 2: Hamid Amiri

Keywords: Tunisian voice timbre; audio features extraction; singing voice detection; sung/instrumental discrimination; supervised learning algorithms


Paper 5: Predicting Fork Visibility Performance on Programming Language Interoperability in Open Source Projects

Abstract: Despite the variety of programming languages adopted in open source (OS) projects, fork variation on some languages has been minimal and slow to develop, and there is little research as to why this is so. We therefore employed a K-nearest neighbours (KNN) technique to predict the fork visibility performance of a productive language from a pool of programming languages adopted in projects. In total, 38 showcase OS projects from 2012 to 2016 were downloaded from the GitHub website and categorized into different levels of programming language adoption clusters. Among the 33 languages, JavaScript is one of the most popular languages adopted by the community. We predict that a language chosen when fork visibility is high can increase project longevity, as a highly visible language is likely to occur more often in projects with a significant number of interoperable programming languages and a high language fork count. Conversely, a low-fork language reduces longevity in projects with an insignificant number of interoperable programming languages and a low fork count. Our results reveal that the survival of a productive language depends on high language visibility (a large fork number) and high interoperability among multiple programming languages.
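
As a rough illustration of the KNN prediction step, the sketch below classifies a project's fork visibility from two hypothetical features (interoperable language count and fork count); the data and feature choice are invented for illustration, not taken from the paper.

```python
# Hedged KNN sketch: predicting fork-visibility class from
# hypothetical project features.
from sklearn.neighbors import KNeighborsClassifier

# Each row: [number of interoperable languages, language fork count]
X = [[12, 5400], [3, 120], [9, 3100], [1, 40]]
y = ["high", "low", "high", "low"]   # observed fork visibility

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)
print(knn.predict([[10, 2500]]))     # e.g. ['high']
```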

Author 1: Bee Bee Chua
Author 2: D. Bernardo

Keywords: Open source programming languages; K-nearest neighbors (KNN) algorithm; interoperability; survivability


Paper 6: Cadastral and Tea Production Management System with Wireless Sensor Network, GIS based System and IoT Technology

Abstract: A cadastral and tea production management system utilizing a wireless sensor network and Internet of Things (IoT) technology is proposed. To improve the efficiency of tea production, cadastral management and tea production processes must be managed by a Geographical Information System (GIS) based system. Through experiments with sensor-acquired data, it is found that the required information can be estimated and represented efficiently. Thus, the system works to improve tea production management and quality control.

Author 1: Kohei Arai

Keywords: Internet of Things; geographical information system; tea production; quality control


Paper 7: LOD Explorer: Presenting the Web of Data

Abstract: The quantity of data published on the Web according to the principles of Linked Data is increasing intensely. However, this data is still largely limited to consumption by domain professionals and users who understand Linked Data technologies. Therefore, it is essential to develop tools that enhance intuitive perception of Linked Data for lay users. The features of Linked Data pose various challenges for an easy-to-use data presentation. In this paper, Semantic Web and Linked Data technologies are overviewed, challenges to the presentation of Linked Data are stated, and LOD Explorer is presented with the aim of delivering a simple application to discover triplestore resources. Furthermore, it hides the technical challenges behind Linked Data and provides both specialist and non-specialist users with an interactive and effective way to explore RDF resources.
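
The kind of triplestore exploration such a tool performs can be sketched with the SPARQLWrapper library; the public DBpedia endpoint and the query below are illustrative assumptions, not the paper's setup.

```python
# Sketch: fetch the outgoing triples of one RDF resource from a
# public SPARQL endpoint (DBpedia is an assumed example endpoint).
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    SELECT ?p ?o
    WHERE { <http://dbpedia.org/resource/Semantic_Web> ?p ?o }
    LIMIT 10
""")
sparql.setReturnFormat(JSON)

# Each binding is one (predicate, object) pair a browser could render
# as a clickable node for further exploration.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], "->", row["o"]["value"])
```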

Author 1: Karwan Jacksi
Author 2: Subhi R. M. Zeebaree
Author 3: Nazife Dimililer

Keywords: Semantic web; linked open data; linked data browsers; exploratory search systems; RDF; SPARQL


Paper 8: Agent-Based System for Efficient kNN Query Processing with Comprehensive Privacy Protection

Abstract: Recently, location based services (LBSs) have become increasingly popular due to advances in mobile devices and their positioning capabilities. In an LBS, the user sends a range of queries regarding his k-nearest neighbors (kNNs) that share common points of interest (POIs), based on his real geographic location. During the query sending, processing, and responding phases, private information may be collected by an attacker, either by tracking the real locations or by analyzing the sent queries. This compromises the privacy of the user and risks his/her safety in certain cases. Thus, the objective of this paper is to ensure comprehensive privacy protection, while also guaranteeing the efficiency of kNN query processing. We therefore propose an agent-based system for dealing with these issues. The system is managed by three software agents (selectorDL, fragmentorQ, and predictor). The selectorDL agent executes a Wise Dummy Selection Location (WDSL) algorithm to ensure location privacy. The mission of the selectorDL agent is integrated with that of the fragmentorQ agent, which ensures query privacy based on a Left-Right Fragmentation (LRF) algorithm. To guarantee the efficiency of kNN processing, the predictor agent executes a prediction phase based on a Cell Based Indexing (CBI) technique. Compared to similar privacy protection approaches, the proposed WDSL and LRF approaches showed higher resistance against location homogeneity attacks and query sampling attacks. In addition, the proposed CBI indexing technique obtains more accurate answers to kNN queries than previous indexing techniques.

Author 1: Mohamad Shady Alrahhal
Author 2: Maher Khemakhem
Author 3: Kamal Jambi

Keywords: Agents; attacks; dummies; fragmentation; indexing; privacy protection; resistance


Paper 9: A Multiclass Deep Convolutional Neural Network Classifier for Detection of Common Rice Plant Anomalies

Abstract: This study examines the use of a deep convolutional neural network in the classification of rice plants according to health status based on images of their leaves. A three-class classifier representing normal, unhealthy, and snail-infested plants was implemented via transfer learning from an AlexNet deep network. The network achieved an accuracy of 91.23%, using stochastic gradient descent with a mini-batch size of thirty (30) and an initial learning rate of 0.0001. Six hundred (600) images of rice plants representing the classes were used in the training. The training and testing dataset images were captured from rice fields around the district and validated by technicians in the field of agriculture.
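
A hedged PyTorch sketch of the transfer-learning setup follows: AlexNet's final layer is replaced with a three-class head and trained with the reported SGD settings (mini-batch 30, learning rate 0.0001). The paper's actual toolchain is not stated here, so PyTorch is an assumption.

```python
# Sketch of transfer learning from a pre-trained AlexNet (assumed toolchain).
import torch
import torch.nn as nn
from torchvision import models

model = models.alexnet(weights="IMAGENET1K_V1")     # ImageNet-pretrained
model.classifier[6] = nn.Linear(4096, 3)            # normal / unhealthy / snail-infested

optimizer = torch.optim.SGD(model.parameters(), lr=0.0001)
criterion = nn.CrossEntropyLoss()

# Training loop over a DataLoader built with batch_size=30:
# for images, labels in loader:
#     optimizer.zero_grad()
#     loss = criterion(model(images), labels)
#     loss.backward()
#     optimizer.step()
```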

Author 1: Ronnel R. Atole
Author 2: Daechul Park

Keywords: Deep neural network; convolutional neural network; rice; transfer learning; AlexNet


Paper 10: Improve Mobile Agent Performance by using Knowledge-Based Content

Abstract: Mobile agent technology is one of the mobile computing areas. This technology can be used in several types of applications, such as cloud computing, e-commerce, databases, distributed systems management and network management. The purpose of this paper is to propose a new model for increasing the performance of mobile agent systems. Performance is considered one of the important factors that make a system reliable. This paper suggests using knowledge-based content to improve the performance of mobile agent systems. This work started with an intensive survey of related models and mechanisms to investigate the gaps in performance. A comparative discussion has been conducted between published research and the proposed model. The proposed model has been described in full detail based on its components. A scenario-based approach has been used to implement the proposed model using the .Net framework and the C# language. The model has been tested and evaluated on different scenarios. The overall performance improved by 83% when the knowledge-based content was used. In addition, system performance improves automatically over time because the content of the knowledge base increases. The proposed model is suitable for use in any type of mobile agent application. The originality of the model is based on the conducted survey and the authors' own knowledge.

Author 1: Tarig Mohamed Ahmed

Keywords: Mobile agent; mobility; performance; intelligent system


Paper 11: Kit-Build Concept Map with Confidence Tagging in Practical Uses for Assessing the Understanding of Learners

Abstract: A learner's answer can be interpreted as learning evidence demonstrating the learner's understanding, while confidence in the answer represents the learner's belief in their degree of understanding. In this paper, we propose the Kit-Build concept map with confidence tagging. The Kit-Build concept map (KB map in short) is a digital tool for supporting a concept map strategy in which learners create the learning evidence and the instructor can access the correctness and confidence information of learners. Practical uses were conducted to demonstrate the value of correctness and confidence information in lecture classes. The correctness information was visualized in the control classes, while both correctness and confidence information were visualized in the experiment classes. The observed evidence illustrates that different information was used for selecting and ordering the supplementary content when the system visualized different information. The normalized learning gains and effect sizes demonstrate the different learning achievements between the control and experiment classes. The results suggest that learners' confidence information affects instructor behavior, a positive change for improving the understanding of their learners. The questionnaire results suggest that the KB map with confidence tagging is an accepted mechanism for representing learners' understanding and their confidence. The instructors also accepted that learners' confidence information is valuable for recognizing the learning situation.

Author 1: Jaruwat Pailai
Author 2: Warunya Wunnasri
Author 3: Yusuke Hayashi
Author 4: Tsukasa Hirashima

Keywords: Kit-Build concept map; confidence tagging; effect of confidence information; behavior changing of instructor


Paper 12: Method for Detection of Foreign Matters Contained in Dried Nori (Seaweed) based on Optical Property

Abstract: Optical properties of dried seaweeds, such as spectral reflectance, bidirectional reflectance distribution function and polarization characteristics, are clarified, together with an attempt at transparent foreign matter detection that considers optical properties of dried seaweeds such as spectral transparency, bidirectional reflectance and transparency as well as polarimetric properties. Through experiments, it is found that transparent foreign matter can be detected by using the bidirectional reflectance distribution function as well as polarization characteristics.

Author 1: Kohei Arai

Keywords: Seaweeds; optical characteristics; BRDF; polarization characteristics; foreign matter detection


Paper 13: On P300 Detection using Scalar Products

Abstract: Results concerning detection of the P300 wave in EEG segments using scalar products with signals of various shapes are presented, and their advantages and limitations are discussed. From the point of view of computational complexity, the proposed algorithm is simple, based on scalar products and a search for the maximum of 6 calculated values. Because the human subject is not a robot that precisely generates P300 waves, and there is also a human component of error in the involuntary generation of such waves, we have also calculated the rate of classification of characters in the human visual field. To validate the proposed method, electroencephalography recordings from the BCI Competition III (2005) spelling challenge, Dataset II, have been used.
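
A minimal sketch of the scalar-product detection: correlate the EEG segment with six candidate signals and take the maximum score. The random templates below are placeholders for the signal shapes evaluated in the paper.

```python
# Sketch: P300-style detection by scalar products with 6 templates.
import numpy as np

def detect(segment, templates):
    """segment: 1-D EEG window; templates: 6 same-length candidate signals."""
    scores = [np.dot(segment, t) for t in templates]  # 6 scalar products
    return int(np.argmax(scores))                      # best-matching index

rng = np.random.default_rng(0)
templates = [rng.standard_normal(240) for _ in range(6)]  # placeholder shapes
segment = templates[2] + 0.5 * rng.standard_normal(240)   # noisy instance
print(detect(segment, templates))  # -> 2
```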

Author 1: Monica Fira
Author 2: Liviu Goras
Author 3: Anca Lazar

Keywords: Electroencephalographic (EEG); brain computer interface; P300; spelling paradigm; classification; signal processing


Paper 14: Implicit and Explicit Knowledge Mining of Crowdsourced Communities: Architectural and Technology Verdicts

Abstract: The use of social media, especially community Q&A sites, by the software development community has increased significantly in the past few years. The ever-mounting data on these Q&A sites has opened up new horizons for research in multiple dimensions. Stack Overflow is a repository of a large amount of data related to software engineering. Software architecture and technology selection verdicts in SE have an enormous and ultimate influence on the overall properties and performance of a software system, and pose risks to change once implemented. Most of the risks in software engineering projects are directly or indirectly coupled with architectural and technology decisions (ATD). The availability and utilization of advance architectural knowledge are crucial for decision making. Existing architecture and technology knowledge management approaches using software repositories give a rich insight to support architects by offering a wide spectrum of architecture and technology verdicts. However, they are mostly insourced and still depend on manual generation and maintenance of the architectural knowledge. This paper compares various software development approaches, suggests crowdsourcing as a knowledge-rich approach, and brings into use the most popular crowdsourced online software development community (Stack Overflow) as a rich source of knowledge for technology decisions to support architecture knowledge management with a more reliable method of data mining for knowledge capturing. This is an exploratory study that follows a qualitative and quantitative e-content analysis approach. Our proposed framework finds relationships among technology- and architecture-related posts in this community to identify architecture-relevant and technology-related knowledge through explicit and implicit knowledge mining, and performs classification and clustering for the purpose of knowledge structuring for future work.

Author 1: Husnain Mushtaq
Author 2: Babur Hayat Malik
Author 3: Syed Azkar Shah
Author 4: Umair Bin Siddique
Author 5: Muhammad Shahzad
Author 6: Imran Siddique

Keywords: StackOverflow; architecture and technology verdicts; crowdsourcing; data mining; explicit and implicit knowledge; software repositories; knowledge structuring


Paper 15: Web-Based COOP Training System to Enhance the Quality, Accuracy and Usability Access

Abstract: In this paper, a web-based COOP training system is demonstrated to ensure a usable process of task interactions between various participants. In the existing method, issues related to paperwork, communication gaps, etc. raised serious problems between the colleges and industries while implementing COOP training programs. The primary data was collected by conducting interviews with the supervisors and by taking the opinions of students to improve the proposed COOP system. The proposed system is capable of reducing the complexity of operations to a great extent by avoiding overlapping of information, reducing the communication gap and increasing the accuracy of information. The outcomes of the proposed system proved to be very fruitful in terms of the results obtained from the point of view of all the participants in the COOP system. The performance, accuracy, quality and assessment of the student reports were found to be improved, delivering excellent results.

Author 1: Amr Jadi
Author 2: Eesa A. Alsolami

Keywords: COOP training; web applications; integration; quality; accuracy


Paper 16: Standard Intensity Deviation Approach based Clipped Sub Image Histogram Equalization Algorithm for Image Enhancement

Abstract: The limitations of the hardware and dynamic range of digital cameras have created the demand for post-processing software tools to improve image quality. Image enhancement is a technique that helps to improve the finer details of an image. This paper presents a new algorithm for contrast enhancement, where the enhancement rate is controlled by a clipped histogram approach that uses standard intensity deviation. Here, standard intensity deviation is used to divide and equalize the image histogram. The equalization process is applied to the sub-images independently, and they are combined into one complete enhanced image. Conventional histogram equalization stretches the dynamic range, which leads to a large gap between adjacent pixels and produces an over-enhancement problem. This drawback is overcome by defining a standard intensity deviation value to split and equalize the histogram. The selection of a suitable threshold value for clipping and splitting the image provides better enhancement than other methods. The simulation results show that the proposed method outperforms other conventional histogram equalization (HE) methods and effectively preserves entropy.
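
The clipping idea can be sketched as follows for a single sub-image: the histogram is clipped at a limit, the excess is redistributed, and the resulting CDF defines the intensity mapping. The clip limit shown and the omission of the standard-intensity-deviation split are simplifying assumptions.

```python
# Sketch of clipped histogram equalization for one uint8 grayscale sub-image.
import numpy as np

def clipped_he(img, clip_ratio=1.5):
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    limit = clip_ratio * hist.mean()                # assumed clip limit
    excess = np.sum(np.maximum(hist - limit, 0))
    hist = np.minimum(hist, limit) + excess / 256   # clip, then redistribute
    cdf = np.cumsum(hist) / hist.sum()
    lut = np.round(255 * cdf).astype(np.uint8)      # equalization mapping
    return lut[img]

# In the paper's scheme, the image is first split by a standard intensity
# deviation value, each sub-image is equalized, and the results recombined.
```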

Author 1: Sandeepa K S
Author 2: Basavaraj N Jagadale
Author 3: J S Bhat

Keywords: Standard intensity deviation; histogram clipping; histogram equalization; contrast enhancement; entropy


Paper 17: Quality Ranking Algorithms for Knowledge Objects in Knowledge Management Systems

Abstract: The emergence of web-based Knowledge Management Systems (KMS) has raised several concerns about the quality of Knowledge Objects (KO), which are the building blocks of knowledge expertise. Web-based KMSs offer large knowledge repositories with millions of resources added by experts or uploaded by users, and their content must be assessed for accuracy and relevance. To improve the efficiency of ranking KOs, two models are proposed for KO evaluation. Both models are based on user interactions and exploit user reputation as an important factor in quality estimation. For the purpose of evaluating the performance of the two proposed models, the algorithms were implemented and incorporated in a KMS. The results of the experiment indicate that the two models are comparable in accuracy, and that the algorithms can be integrated in the search engine of a KMS to estimate the quality of KOs and accordingly rank the results of user searches.

Author 1: Amal Al-Rasheed
Author 2: Jawad Berri

Keywords: Knowledge Management System (KMS); Knowledge Object (KO); knowledge evaluation; quality indicator; recommender system


Paper 18: The Effect of Music on Shoppers’ Shopping Behaviour in Virtual Reality Retail Stores: Mediation Analysis

Abstract: The aim of this study is to investigate the effect of music, as an atmospheric cue of 3D virtual reality retail (VRR) stores, on shoppers' emotions and behaviour. To complete this research, a major empirical study was conducted in Second Life (SL), one of the most mature virtual worlds (VWs). The effect of music on shoppers' emotions was experimentally tested in computer labs. A pre-test and post-test were conducted to evaluate emotion levels before and after experiencing 3D VRR stores. Detailed mediation analysis was done with the PROCESS tool at the later stage of the analysis. This research confirmed music as an atmospheric cue of the 3D servicescape. The results determined the effect of music on shoppers' arousal, pleasure and consequent shopping behaviour. Further, this research could not identify a direct effect of arousal on shoppers' behaviour; however, arousal was a major source of inducing pleasure and increasing shoppers' positive approach behaviour. This paper contributes to a better understanding of 3D VRR store atmospherics, the role of music in them, and shoppers' emotions and behaviour.

Author 1: Aasim Munir Dad
Author 2: Andrew Kear
Author 3: Asma Abdul Rehman
Author 4: Barry J. Davies

Keywords: Music; retail atmospherics; 3D virtual reality retailing; second life (SL); mediation analysis


Paper 19: A Web Service Composition Framework based on Functional Weight to Reach Maximum QoS

Abstract: The recent trend in the web world is to accomplish almost all user services in every field through the web portals of the respective organizations. But a specific task with a series of actions cannot be completed by a single web service with limited functionality. Therefore, multiple web services with different functionalities are composed together to attain the result. Web service composition is an approach that combines various services to fulfill web-related tasks with the preferred quality. Composition of such services becomes more challenging when the web services have similar functionalities and varying quality and come from several providers. Hence, the overall QoS (Quality of Service) can be considered the major factor for composition. Moreover, in most compositions the expected QoS is not attained when the task is finished; sometimes the complete task may be affected by a single poorly performing web service. So, during composition, utmost care should be taken in selecting a particular web service. Composing web services dynamically is the main method used to overcome these difficulties. However, to reach the actual functionality of the specific task, the quality of each individual service is very necessary. The QoS of a web service is normally evaluated using non-functional attributes, such as response time, availability, reliability, throughput, etc. Also, in a composition, the same level of quality is not expected for each individual web service included in the chain. So, a framework is proposed in this research paper for web service composition by setting appropriate weightage for the non-functional parameters. Experimental results show that implementation of this method paves the way to reach the maximum performance of the composition with improved QoS.
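
A minimal sketch of the weighted-QoS idea: normalize the non-functional attributes, apply weights, and rank candidate services. The weights and candidate values are illustrative assumptions, not the paper's parameters.

```python
# Sketch: rank candidate services by a weighted sum of normalized
# non-functional attributes (all values illustrative).
def qos_score(svc, weights):
    # Response time: lower is better, so invert its normalized value.
    return (weights["rt"] * (1 - svc["rt_norm"])
            + weights["avail"] * svc["avail"]
            + weights["rel"] * svc["rel"]
            + weights["tp"] * svc["tp_norm"])

weights = {"rt": 0.4, "avail": 0.2, "rel": 0.2, "tp": 0.2}
candidates = [
    {"name": "A", "rt_norm": 0.2, "avail": 0.99, "rel": 0.95, "tp_norm": 0.7},
    {"name": "B", "rt_norm": 0.6, "avail": 0.97, "rel": 0.99, "tp_norm": 0.9},
]
best = max(candidates, key=lambda s: qos_score(s, weights))
print(best["name"])  # service selected for this slot in the composition
```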

Author 1: M.Y. Mohamed Yacoab
Author 2: Abdalla AlAmeen
Author 3: M. Mohemmed Sha

Keywords: Web service; composition of services; non-functional parameters; QoS


Paper 20: Encrypted Fingerprint into VoIP Systems using Cryptographic Key Generated by Minutiae Points

Abstract: The transmission of encrypted voice over IP is challenging: the voice can be recorded, eavesdropped on, altered or stolen. Here, voice over IP is encrypted using the Advanced Encryption Standard (AES) algorithm, with the AES key generated from minutiae points in a fingerprint. In other words, this is a biometric cryptosystem, a hybrid between a cryptosystem and a biometric system, in which the fingerprint is used for authentication as well as to generate the cryptographic key that encrypts voice over IP with AES. In this paper, we define a new term, the Fingerprint Distribution Problem (FDP), based on the Key Distribution Problem. We also suggest a solution to this problem: encrypting the fingerprint before it is sent between users with a public key cryptosystem, the RSA algorithm.
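
One simple way to derive an AES key from minutiae points, sketched with the standard library only; the serialization and hashing choices are assumptions, as the abstract does not specify the exact construction.

```python
# Sketch: derive a 128-bit AES key from fingerprint minutiae
# (serialization and SHA-256 hashing are assumed choices).
import hashlib

# Each minutia: (x, y, orientation in degrees); values are illustrative.
minutiae = [(103, 57, 45), (88, 142, 90), (167, 201, 135)]

canonical = b"".join(
    x.to_bytes(2, "big") + y.to_bytes(2, "big") + o.to_bytes(2, "big")
    for x, y, o in sorted(minutiae)        # sort for a stable ordering
)
aes_key = hashlib.sha256(canonical).digest()[:16]  # 16 bytes = AES-128 key
print(aes_key.hex())
# The voice stream would then be encrypted with AES under this key, and the
# fingerprint itself protected with RSA before exchange.
```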

Author 1: Mohammad Fawaz Anagreh
Author 2: Anwer Mustafa Hilal
Author 3: Tarig Mohamed Ahmed

Keywords: IP; cryptography; fingerprint; minutiae; Advanced Encryption Standard (AES); RSA; information security


Paper 21: General Characteristics and Common Practices for ICT Projects: Evaluation Perspective

Abstract: In today's business world, organizations are increasingly dependent on Information and Communication Technologies (ICT) resources. Cloud services, communication services and software services are the most common resources on which enterprises spend large amounts. To install new services and upgrade existing ones, ICT projects are an essential part of an organization's business strategies. Researchers have highlighted that the real problem for organizations is how to initiate new ICT projects and evaluate them after implementation. This research investigated the common approaches organizations use to start ICT projects and how their impact is evaluated after implementation. For this, we extracted a number of steps with the help of a literature review. To validate those steps, six case studies were selected for collecting the samples. The findings of this study elaborate that every ICT project has a list of objectives: strategic, informational, IT infrastructure and others. Furthermore, the results highlight that organizations rely on both financial and non-financial evaluation methods depending on the type of organization, i.e. public or private. Moreover, the measurement process is applied on a project-wise, monthly and yearly basis. Importantly, we found that outsourcing currently plays a significant role in the success of ICT projects. The results of this study can help organizations understand the types of ICT investments, approaches and possible impacts on organizational goals.

Author 1: Abdullah Saad AL-Malaise AL-Ghamdi
Author 2: Farrukh Saleem

Keywords: ICT project; ICT evaluation; measurement process; case studies; common practices


Paper 22: Identification of Toddlers’ Nutritional Status using Data Mining Approach

Abstract: One of the problems in a community health center or health clinic is documenting toddlers' data. The number of malnutrition cases in developing countries is quite high. If the problem of malnutrition is not resolved, it can disrupt a country's economic development. This study identifies the malnutrition status of toddlers based on context data from a community health center (PUSKESMAS) in Jogjakarta, Indonesia. Currently, the patients' data cannot be directly mapped into appropriate groups of toddlers' malnutrition status. Therefore, the data mining concept with k-means clustering is used to map the data into several malnutrition status categories. The aim of this study is to build software that can assist the Indonesian government in making decisions to take preventive action against malnutrition.
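
A minimal scikit-learn sketch of the clustering step; the anthropometric features and the choice of three clusters are illustrative assumptions, not the study's actual feature set.

```python
# Sketch: group toddler records into nutrition-status clusters with k-means.
import numpy as np
from sklearn.cluster import KMeans

# Each row: [age in months, weight in kg, height in cm] (illustrative data)
X = np.array([[24, 9.5, 82], [30, 10.2, 88], [26, 7.1, 78],
              [36, 13.8, 95], [28, 8.0, 80], [33, 12.5, 92]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(km.labels_)  # clusters later labeled e.g. malnourished / normal / over
```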

Author 1: Sri Winiarti
Author 2: Herman Yuliansyah
Author 3: Aprial Andi Purnama

Keywords: Data mining; k-means clustering; malnutrition status of toddler


Paper 23: A Comparative Study on Steganography Digital Images: A Case Study of Scalable Vector Graphics (SVG) and Portable Network Graphics (PNG) Images Formats

Abstract: Today, image steganography plays a key role in exchanging secret data through the internet. However, the optimal choice of image format for steganography is still an open issue; this research therefore addresses it. This research conducts a comparative study between the Scalable Vector Graphics (SVG) image format and the Portable Network Graphics (PNG) image format. As the results show, the SVG image format is more efficient than the PNG image format in terms of capacity and scalability before and after steganography. As well, the SVG image format helps to increase simplicity and performance in steganography, since it is an XML text file. Our comparative study provides significant results between SVG and PNG images, which have not been seen in previous related studies.

Author 1: Abdulgader Almutairi

Keywords: Image steganography; data hiding; raster and vector images; Scalable Vector Graphics (SVG) and Portable Network Graphics (PNG) images format


Paper 24: Detection of Violations in Credit Cards of Banks and Financial Institutions based on Artificial Neural Network and Metaheuristic Optimization Algorithm

Abstract: Due to the popularity of the World Wide Web and e-commerce, electronic communications between people and different organizations through the virtual world of the Internet have provided a good basis for commercial and economic relations. Although these developments have occurred over less than a century, electronic communications have always been subject to interference, cheating, fraud, and other acts of sabotage. Along with this increase in trading volume, there is a huge increase in the number of online frauds, which results in billions of dollars of losses annually worldwide; this has a direct effect on the customer service of banking systems, particularly electronic banking systems, and on survival as a reliable financial service provider. Therefore, attention to fraud detection techniques is essential to prevent fraudulent acts and is the motive for much scientific research. For this reason, business intelligence is used to identify financial violations in various economic, banking and other fields. Here, the focus is on algorithms and methods presented in data mining to deal with fraud by using neural networks. The main objective is to improve these methods or present new algorithms by studying the behavioral patterns of customers and by combined use of a genetic algorithm to improve the performance of the neural network, finding appropriate models for better decision making by implementing and testing the performance of the suggested algorithms. The results show that the neural network was strengthened by using the genetic algorithm; in fact, the genetic algorithm can raise our ability to control the training process. Moreover, it was concluded that criteria such as age, gender and marital status had no effect on detection; the most important effective criteria are information related to the transaction.

Author 1: Zarrin Monirzadeh
Author 2: Mehdi Habibzadeh
Author 3: Nima Farajian

Keywords: Financial fraud detection; neural networks; data mining; genetic algorithm


Paper 25: Data Exfiltration from Air-Gapped Computers based on ARM CPU

Abstract: An air-gapped network is a network isolated from public networks. Several techniques for data exfiltration from air-gapped networks have recently been proposed. Air-gap malware is malware that breaks the isolation of an air-gapped computer using air-gap covert channels, which extract information from air-gapped computers running on air-gapped networks. Guri et al. presented an air-gap malware, "GSMem", which can exfiltrate data from air-gapped computers over GSM frequencies, 850 MHz to 900 MHz. GSMem makes it possible to send data using the radio waves leaked from the system bus between the CPU and RAM. It generates binary amplitude shift keying (B-ASK) modulated waves with x86 SIMD instructions. In order to efficiently emit electromagnetic waves from the system bus, it is necessary to access the RAM without being affected by the CPU caches. GSMem adopts an instruction that writes data without accessing the CPU cache on Intel CPUs. This paper proposes an air-gap covert channel for computers based on ARM CPUs, which includes a software algorithm that can effectively cause cache misses. It is also a technique to use NEON instructions and transmit B-ASK modulated data by radio waves radiated from an ARM-based computer (e.g. Raspberry Pi 3). The experiment shows that the proposed program sends binary data using radio waves (about 1000 kHz to 1700 kHz) leaked from the system bus between the ARM CPU and RAM. The program can also run on Android machines based on ARM CPUs (e.g. ASUS Zenpad 3S 10 and OnePlus 3).

Author 1: Kenta Yamamoto
Author 2: Miyuki Hirose
Author 3: Taiichi Saito

Keywords: Air-Gapped Network; ARM CPU; data exfiltration; SIMD; NEON; GSMem


Paper 26: A Seamless Network Database Migration Tool for Institutions in Zambia

Abstract: The objective of the research was to efficiently manage the migration process between different Database Management Systems (DBMS) by automating the database migration process. The automation of the database migration process involved database cloning between different platforms, exchange of data between a data center and different clients running non-identical DBMSs, and backing up the database in a flexible format such as eXtensible Markup Language (XML). This approach involved development of a "Database Migration Tool". The tool was developed on a Windows platform using Java Eclipse™ with four non-identical dummy relational databases (Microsoft Access, MySQL, SQL Server and Oracle). The tool was run in a controlled environment over the network, and databases were successfully migrated from the source to the specified target destination. The developed tool is efficient, timely and highly cost-effective.

Author 1: Mutale Kasonde
Author 2: Simon Tembo

Keywords: Database management system; database migration; database structure; database migration toolkits and database cloning


Paper 27: Software Engineering: Challenges and their Solution in Mobile App Development

Abstract: Mobile app development is increasing rapidly due to the popularity of smartphones. With billions of app downloads, the Apple App Store and Google Play Store have succeeded in dominating mobile devices. Throughout the last 10 years, the number of smartphones and mobile applications has been perpetually growing. Android and iOS are the two mobile platforms that cover most smartphones in the world in 2017. However, this success challenges app developers to publish high-quality apps to keep attracting and satisfying end-users. Developing a mobile app involves first selecting the platforms the app will run on, and then developing specific solutions (i.e., native apps). During application development, a developer comes across multiple challenges. In this paper, we have tried to find out the challenges faced by developers during the development life cycle, along with their possible solutions.

Author 1: Naila Kousar
Author 2: Muhammad Sheraz Arshad Malik
Author 3: Aramghan Sarwar
Author 4: Burhan Mohy-ud-din
Author 5: Ayesha Shahid

Keywords: Android; IOS; mobile apps; software quality; survey research; user requirements


Paper 28: Analysis of Valuable Clustering Techniques for Deep Web Access and Navigation

Abstract: A massive amount of content is available on the web, but a huge portion of it is still invisible. Users can only access this hidden web, also called the Deep Web, by entering a directed query in a web search form, thus accessing data from a database that is not indexed with hyperlinks. The inability to index particular types of content and restricted storage capacity are significant factors behind the invisibility of web content. Different clustering techniques offer a simple way to analyze large volumes of non-indexed content. The major focus of this research is to analyze different clustering techniques to find a more accurate and efficient method for accessing and navigating the deep web content. Analysis and comparison of Latent Dirichlet Allocation (LDA), Latent Semantic Analysis (LSA), and hierarchical and K-means methods have been carried out, and valuable factors for clustering in the deep web have been identified.
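
The compared representations can be sketched with scikit-learn: LSA (TF-IDF plus truncated SVD) and LDA, each feeding k-means. The toy corpus and dimensions are illustrative assumptions.

```python
# Sketch: LSA vs. LDA document representations before k-means clustering.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import TruncatedSVD, LatentDirichletAllocation
from sklearn.cluster import KMeans

docs = ["hidden web query forms", "database search interface",
        "deep web content access", "indexed hyperlink crawler"]

# LSA: TF-IDF weights reduced by truncated SVD.
lsa = TruncatedSVD(n_components=2).fit_transform(
    TfidfVectorizer().fit_transform(docs))

# LDA: term counts mapped to a topic mixture.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(
    CountVectorizer().fit_transform(docs))

print(KMeans(n_clusters=2, n_init=10).fit_predict(lsa))  # clusters from LSA
print(KMeans(n_clusters=2, n_init=10).fit_predict(lda))  # clusters from LDA
```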

Author 1: Qurat ul ain
Author 2: Asma Sajid
Author 3: Uzma Jamil

Keywords: Deep web; clustering; Latent Dirichlet Allocation; Latent Semantic Analysis; hierarchical methods; K-means methods


Paper 29: Pre-Trained Convolutional Neural Network for Classification of Tanning Leather Image

Abstract: Leather craft products, such as belts, gloves, shoes, bags, and wallets, mainly originate from cow, crocodile, lizard, goat, sheep, buffalo, and stingray skin. Before the skins are used as leather craft materials, they go through a tanning process. With the rapid development of the leather craft industry, an automation system for leather tanning factories is important to achieve large-scale production in order to meet the demand for leather craft materials. The challenge in an automatic leather grading system based on type and quality of leather is that the skin color and texture after the tanning process vary widely within the same skin category and are highly similar to other skin categories. Furthermore, skin from different parts of the animal body may have different color and texture. Therefore, a classification method for tanning leather images is proposed. The method uses a pre-trained deep convolutional neural network (CNN) to extract rich features from tanning leather images and a Support Vector Machine (SVM) to classify the features into several types of leather. Performance evaluation shows that the proposed method can classify various types of leather with good accuracy and is superior to other state-of-the-art leather classification methods in terms of accuracy and computational time.
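
A hedged sketch of the extract-then-classify pipeline: a pre-trained CNN serves as a fixed feature extractor feeding an SVM. The ResNet-18 backbone and input shapes are assumptions; the abstract does not name the backbone used.

```python
# Sketch: pre-trained CNN features + SVM classifier (assumed backbone).
import torch
from torchvision import models
from sklearn.svm import SVC

backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()     # drop classifier, keep 512-d features
backbone.eval()

def features(batch):                  # batch: (N, 3, 224, 224) image tensor
    with torch.no_grad():
        return backbone(batch).numpy()

clf = SVC(kernel="rbf")
# X_train = features(train_images)
# clf.fit(X_train, y_train)           # SVM separates the leather types
```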

Author 1: Sri Winiarti
Author 2: Adhi Prahara
Author 3: Murinto
Author 4: Dewi Pramudi Ismi

Keywords: Leather classification; tanning leather; convolution neural network (CNN); deep learning; support vector machine (SVM)


Paper 30: Iteration Method for Simultaneous Estimation of Vertical Profiles of Air Temperature and Water Vapor with AQUA/AIRS Data

Abstract: An iteration method for simultaneous estimation of vertical profiles of air temperature and water vapor from the high-spectral-resolution sounder data of AQUA/AIRS is proposed. Through a sensitivity analysis based on the proposed method for several atmospheric models simulated by MODTRAN, it is found that the proposed method is superior to the conventional method by 41.4% for the air temperature profile and by 88.9% for the relative humidity profile.

Author 1: Kohei Arai

Keywords: Inversion; tropopause; AQUA; AIRS; Air temperature; sounder; MODTRAN


Paper 31: A Robust System for Noisy Image Classification Combining Denoising Autoencoder and Convolutional Neural Network

Abstract: Image classification, a complex perceptual task with many important real-life applications, faces a major challenge in the presence of noise. Noise degrades the performance of classifiers and makes them less suitable in real-life scenarios. To solve this issue, several studies have been conducted utilizing a denoising autoencoder (DAE) to restore original images from noisy images, after which a Convolutional Neural Network (CNN) is used for classification. The existing models perform well only when the noise levels in the training set and test set are the same or differ only a little. To fit real-life applications, a model should be independent of the noise level. The aim of this study is to develop a robust image classification system that performs well from regular to massive noise levels. The proposed method first trains a DAE with low-level noise-injected images and a CNN with noiseless native images, independently. It then arranges these two trained models in three different combinational structures, CNN, DAE-CNN, and DAE-DAE-CNN, to classify images corrupted with zero, regular and massive noise, accordingly. The final system outcome is chosen by applying a winner-takes-all combination to the individual outcomes of the three structures. Although the proposed system consists of three DAEs and three CNNs in different structure layers, the DAEs and CNNs are copies of the same DAE and CNN trained initially, which also makes it computationally efficient. In DAE-DAE-CNN, two identical DAEs are arranged in a cascaded structure to make the structure well suited for classifying massively noisy data while the DAE is trained with low-noise image data. The proposed method is tested on the MNIST handwritten numeral dataset with different noise levels. Experimental results revealed the effectiveness of the proposed method, showing better results than the individual structures as well as other related methods.
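
The winner-takes-all combination can be sketched directly: stack the three structures' softmax outputs and pick the single most confident prediction. The probability vectors below are placeholders, not results from the paper.

```python
# Sketch of the winner-takes-all combination over three pipelines.
import numpy as np

def winner_takes_all(prob_cnn, prob_dae_cnn, prob_dae_dae_cnn):
    stacked = np.stack([prob_cnn, prob_dae_cnn, prob_dae_dae_cnn])
    pipeline, label = np.unravel_index(np.argmax(stacked), stacked.shape)
    return label, pipeline   # predicted class and which structure won

p1 = np.array([0.10, 0.70, 0.20])  # CNN on the raw image
p2 = np.array([0.20, 0.50, 0.30])  # DAE -> CNN
p3 = np.array([0.05, 0.90, 0.05])  # DAE -> DAE -> CNN
print(winner_takes_all(p1, p2, p3))  # -> (1, 2): class 1 via DAE-DAE-CNN
```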

Author 1: Sudipta Singha Roy
Author 2: Sk. Imran Hossain
Author 3: M. A. H. Akhand
Author 4: Kazuyuki Murase

Keywords: Image denoising; denoising autoencoder; cascaded denoising autoencoder; convolutional neural network


Paper 32: A New Healthcare Context Information: The Social Context

Abstract: During the treatment process, medical institutes collect context information about their patients and store it in their healthcare systems. The collected information describes measurable, risk, or medication information and is used to improve the performance of the institutes' healthcare systems by allowing diverse knowledge about patients. That said, other information is needed, since factors such as education and income influence a patient's lifestyle: a high level of education or income reflects positively on the patient's life and probably reduces the likelihood of disease or the incidence of infectious diseases. In this paper, a new type of healthcare context information, the Social Context, is proposed to address this need. It can be divided into four main categories: related people, behavior, income and education of the patient. We believe that the newly proposed context information should be considered in the design process of context-aware medical informatics systems, besides the well-known context information.

Author 1: Isra’a Ahmed Zriqat
Author 2: Ahmad Mousa Altamimi

Keywords: Context information; social context; healthcare; medical information


Paper 33: Brainwaves for User Verification using Two Separate Sets of Features based on DCT and Wavelet

Abstract: This paper discusses the effectiveness of brain waves for user verification using electroencephalogram (EEG) recordings of one channel belonging to a single task. Feature sets previously introduced for an EEG-based identification system are tested in this paper as suitable features for a verification system. The first feature set is based on the energy distribution of the DCT or DFT power spectrum, while the second set is based on the statistical moments of the wavelet transform; three types of wavelet transform are considered. Each set of features is tested using a normalized Euclidean distance measure for the matching purpose. The performance of the verification system is evaluated using FAR, FRR, and HTER measures. Two publicly available EEG datasets are used: the first is the Colorado State University (CSU) dataset, collected from seven healthy subjects, and the second is the Motor Movement/Imagery (MMI) dataset, a relatively large dataset collected from 109 healthy subjects. The attained verification results are encouraging when compared with the results of other recently published works: the best achieved HTER is 0.26 when the system was tested on the CSU dataset, and 0.16 when tested on the MMI dataset, for the features based on the energy of the DFT spectra.
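
A minimal sketch of the first feature set and the matching rule: band energies of the DCT power spectrum compared against an enrolled template by a normalized Euclidean distance. The band count, per-feature deviations, and acceptance threshold are assumptions.

```python
# Sketch: DCT band-energy features + normalized Euclidean verification.
import numpy as np
from scipy.fft import dct

def dct_energy_features(eeg, n_bands=16):
    spec = dct(eeg, norm="ortho") ** 2            # DCT power spectrum
    bands = np.array_split(spec, n_bands)         # assumed band split
    return np.array([b.sum() for b in bands])     # energy per band

def verify(sample, template, sigma, threshold=1.0):
    """Accept the claimed identity when the normalized distance between
    the sample features and the enrolled template is small enough."""
    d = np.sqrt(np.mean(((sample - template) / sigma) ** 2))
    return d < threshold
```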

Author 1: Loay E. George
Author 2: Hend A. Hadi

Keywords: Electroencephalogram (EEG); wavelet transforms; DCT; DFT; energy features; statistical moments; Euclidean measure


Paper 34: FARM: Fuzzy Action Rule Mining

Abstract: Action Mining is a sub-field of Data Mining that concerns finding ready-to-apply action rules. The majority of the patterns discovered by traditional data mining methods require analysis and further work by domain experts to be applicable in the target domain, while Action Mining methods try to find final cost-effective actions that can be applied immediately in the target domain. Current state-of-the-art methods in the AM domain consider only discrete attributes for action rule mining. Therefore, one must discretize continuous attributes using traditional discretization methods before using them for action rule mining. In this paper, the concept of the Fuzzy Action Rule is introduced. In this type of action rule, continuous attributes can be presented in fuzzy form, so that fuzzy changes can be suggested for continuous attributes instead of discretizing them. Because the space of all fuzzy action rules can be huge, a Genetic Algorithm-based Fuzzy Action Rule Mining (GA-FARM) method has been devised for finding the most cost-effective fuzzy action rules with tractable complexity. The proposed method has been implemented and tested on different real datasets. Results confirm that the proposed method is successful in finding cost-effective fuzzy action rules in acceptable time.

Author 1: Zahra Entekhabi
Author 2: Pirooz Shamsinejadbabki

Keywords: Action mining; fuzzy action rule mining; genetic algorithm


Paper 35: A Secured Interoperable Data Exchange Model

Abstract: Interoperability enables peer systems to communicate with each other and use each other's functionality effectively. It improves the ability of different cooperative systems to exchange information. It plays a vital role in educational information system institutions. Practically, there are two main technical reasons that restrain the interoperability of systems. First, these systems may be developed under various operating systems, programming languages and database management systems. Second, security concerns greatly impact the execution of interoperability among various educational institutions. This paper proposes a new RESTful secured interoperable model for data exchange among different information systems. This will help educational information systems exchange data using a pre-defined standard message format. Additionally, this paper designs a Cross Platform Web Application Interoperability Protocol (CPWAIP) to facilitate the interaction among components of the proposed model.

Author 1: A. Bahaa
Author 2: A. Sayed
Author 3: L. Elfangary

Keywords: Data sharing; security; integrity; protection


Paper 36: Iterative Removing Salt and Pepper Noise based on Neighbourhood Information

Abstract: Denoising images is a classical problem in low-level computer vision. In this paper, we propose an algorithm that iteratively removes salt and pepper noise based on neighbourhood information while preserving details. First, from the noise ratio we compute the probability that windows of different sizes contain no noise-free pixel, and then determine the window size accordingly. After that, each corrupted pixel is replaced by a weighted combination of its eight neighbourhood pixels. If the neighbourhood information does not satisfy the de-noising condition, the corrupted pixels are recovered in subsequent iterations.
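
One pass of the neighbourhood repair might look like the sketch below: pixels at the extreme values are treated as noise and replaced using clean neighbours, with unresolved pixels deferred to later iterations. The uniform averaging (instead of the paper's weighting) is a simplification.

```python
# Sketch: one iteration of salt-and-pepper repair on a uint8 grayscale image.
import numpy as np

def denoise_pass(img):
    out = img.astype(float).copy()
    noisy = (img == 0) | (img == 255)       # salt/pepper candidates
    for y, x in zip(*np.nonzero(noisy)):
        win = img[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        clean = win[(win != 0) & (win != 255)]
        if clean.size:                      # enough clean neighbours
            out[y, x] = clean.mean()        # simplified (unweighted) average
    return out.astype(np.uint8)

# Pixels with no clean neighbours are left for subsequent iterations,
# which are repeated until no further pixels can be recovered.
```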

Author 1: Liu Chun
Author 2: Sun Bishen
Author 3: Liu Shaohui
Author 4: Tan Kun
Author 5: Ma Yingrui

Keywords: Salt and pepper noise; noise detection; neighbourhood similarity; detail preserving denoising


Paper 37: Attendance and Information System using RFID and Web-Based Application for Academic Sector

Abstract: Recently, student attendance has been considered one of the crucial elements that reflect academic achievement and the performance contributed to any university, compared with traditional methods that are time-consuming and inefficient. Diverse automatic identification technologies have come into vogue, such as Radio Frequency Identification (RFID). Extensive research and several applications have been produced to take maximum advantage of this technology, which also brings about some concerns. RFID is a wireless technology used for identifying and tracking objects via radio waves that transfer data from an electronic tag, called an RFID tag or label, to an RFID reader. The current study focuses on proposing an RFID-based Attendance Management System (AMS) and an information service system for an academic domain, using RFID technology in addition to a programmable logic circuit (such as an Arduino) and a web-based application. The proposed system aims to manage student attendance recording and provides the capability of tracking student absentees as well, supporting information services that include student grading marks, daily timetables, lecture times and classroom numbers, and other student-related instructions provided by faculty department staff. Based on the results, the proposed attendance and information system is time-effective, reduces documentation effort, and has negligible power consumption. Besides, previously proposed RFID-based student attendance systems are analyzed and criticized with respect to system functionalities and main findings, and directions for further research are identified.

Author 1: Hasanein D. Rjeib
Author 2: Nabeel Salih Ali
Author 3: Ali Al Farawn
Author 4: Basheer Al-Sadawi
Author 5: Haider Alsharqi

Keywords: Student attendance; Attendance Management System (AMS); information service; RFID; IoT; radio-frequency identification; Arduino

Download PDF

Paper 38: Social Network Link Prediction using Semantics Deep Learning

Abstract: Social networks have attracted an enormous number of users over the past couple of years, and link mining is a key research track in this area. It has drawn the attention of several analysts as a powerful technique in social network studies for understanding the relations between nodes in social circles. Many data sets of current interest are most appropriately described as collections of interrelated linked objects, and the main challenge faced by analysts is to tackle the problem of structured data sets among these objects. For this purpose, we design a new comprehensive model that combines link mining techniques with semantics to perform link mining on structured data sets; to our knowledge, no past work has investigated these structured data sets using this technique. We extracted real-time post data from one of the famous social network platforms using different tools and checked society's behaviour against it. We have verified our model using diverse classifiers, and the derived outcomes are inspiring.

Author 1: Maria Ijaz
Author 2: Javed Ferzund
Author 3: Muhammad Asif Suryani
Author 4: Anam Sardar

Keywords: Link prediction system; post analysis; semantic similarity; data analysis; social network analysis; dictionary; co-similar links

Download PDF

Paper 39: Matrix Clustering based Migration of System Application to Microservices Architecture

Abstract: The microservice architecture (MSA) style is an emerging approach that is gaining strength with the passage of time. Microservices are recommended by a number of researchers to overcome the limitations and issues encountered with the aging monolithic architecture style. Monolithic applications are built as one single application and cannot be decomposed into smaller, separate services; microservices instead focus on lightweight, independent, self-contained services of manageable size, with a primary emphasis on maintenance, performance, scalability and online service delivery while eliminating dependency. Although these quality factors have been thoroughly discussed in the literature, the migration of system applications is an emerging issue with several challenges, and this study addresses the problem of tight coupling. The literature review indicates complex open problems around migrating or converting system applications into microservices; dependency between components remains a big architectural challenge, and guidance is needed on how to migrate existing system applications to microservices. A systematic mapping is essential in order to recap the progress made and identify the gaps and requirements for future studies. This study first presents the open issues and new findings on the quality attributes of microservices, then helps to understand the difference between previous traditional systems and microservice-based systems, and creates awareness about migrating systems to microservices.

Author 1: Shahbaz Ahmed Khan Ghayyur
Author 2: Abdul Razzaq
Author 3: Saeed Ullah
Author 4: Salman Ahmed

Keywords: Monolithic architecture; microservices architecture; systematic mapping; system migration; application transformation; traditional application development; emerging challenges; API

Download PDF

Paper 40: DoS/DDoS Detection for E-Healthcare in Internet of Things

Abstract: Internet of Things (IoT) has emerged as a new horizon in the communication age, providing a platform for various emerging technologies and applications to grow. E-Health services have also been integrated with, and have greatly benefited from, IoT. Due to the increased use of computer technology, computer networks face serious security challenges, and IoT faces the same threats. As IoT provides a platform for other fields, like E-Health, these services are also prone to such threats. Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks on E-Health servers in IoT would endanger real-time monitoring of patients and the overall reliability of E-Health services. In this paper, existing solutions to DoS/DDoS attacks in IoT are reviewed and a reliable solution is presented for securing the servers against these attacks.

Author 1: Iftikhar ul Sami
Author 2: Maaz Bin Ahmad
Author 3: Muhammad Asif
Author 4: Rafi Ullah

Keywords: E-Healthcare; DDoS attack; Internet of Things

Download PDF

Paper 41: Truncated Patch Antenna on Jute Textile for Wireless Power Transmission at 2.45 GHz

Abstract: Jute textile is made from natural fibres and is known for its strength and durability. To determine whether jute could be used as a substrate for microstrip antennas, its electromagnetic characteristics (permittivity and loss tangent) are measured in the band from 1 GHz to 5 GHz. The obtained data are used to compare the performance of a simple rectangular patch antenna resonating at 2.45 GHz on jute with antennas using other textiles as substrates. Comparing the simulation results gives an indication of the suitability of jute as a substrate for microstrip antennas. In the second part of this paper, a truncated patch antenna on jute is studied for use in wireless power transmission at 2.45 GHz. The antenna was simulated and then fabricated. The measured reflection coefficient shows a shift in the resonance frequency compared to the simulated one. The frequency shift is explained and a solution is proposed to correct it; a second antenna was fabricated and measured.

Author 1: Kais Zeouga
Author 2: Lotfi Osman
Author 3: Ali Gharsallah
Author 4: Bhaskar Gupta

Keywords: Jute textile; permittivity measurement; loss tangent measurement; patch antenna; truncated patch antenna; frequency shift; wireless power transmission

Download PDF

Paper 42: A Hybrid Approach for Feature Subset Selection using Ant Colony Optimization and Multi-Classifier Ensemble

Abstract: An active area of research in data mining and machine learning is dimensionality reduction. Feature subset selection is an effective technique for dimensionality reduction and an essential step in successful data mining applications. It reduces the number of features, removes irrelevant, redundant or noisy features, and enhances the predictive capability of the classifier. It provides fast and cost-effective predictors and leads to better model comprehensibility. In this paper, we propose a hybrid approach for feature subset selection. It is a filter-based method in which a classifier ensemble is coupled with an ant colony optimization algorithm to enhance the predictive accuracy of filters. Extensive experimentation has been carried out on eleven publicly available data sets over four different classifiers. We have compared our proposed method with numerous filter- and wrapper-based methods. Experimental results indicate that our method has a remarkable ability to generate subsets with a reduced number of features, while attaining higher classification accuracy.
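
A compact sketch of the general ant-colony subset-search loop such an approach relies on, assuming one pheromone value per feature and an externally supplied evaluate(subset) score; the paper's exact heuristic information, ensemble coupling and parameter values are not reproduced.

    import random

    def aco_feature_selection(n_features, evaluate, n_ants=20, n_iters=30,
                              rho=0.1, q0=0.8):
        """Generic ACO subset search (sketch). `evaluate` scores a subset,
        e.g. with a filter measure or classifier-ensemble accuracy."""
        tau = [1.0] * n_features                     # pheromone per feature
        best_subset, best_score = None, float("-inf")
        for _ in range(n_iters):
            for _ in range(n_ants):
                # Each ant includes feature j with probability proportional to tau[j].
                total = sum(tau)
                subset = [j for j in range(n_features)
                          if random.random() < q0 * tau[j] * n_features / total]
                if not subset:
                    subset = [random.randrange(n_features)]
                score = evaluate(subset)
                if score > best_score:
                    best_subset, best_score = subset, score
            # Evaporate, then reinforce features in the best subset found so far.
            tau = [(1 - rho) * t for t in tau]
            for j in best_subset:
                tau[j] += rho * (1.0 + best_score)
        return best_subset, best_score

Reinforcing only the best subset found so far keeps the sketch short; the proposed method couples the search to a multi-classifier ensemble rather than a single evaluate function.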

Author 1: Anam Naseer
Author 2: Waseem Shahzad
Author 3: Arslan Ellahi

Keywords: Ant colony optimization; predictive accuracy; classifier; feature selection

Download PDF

Paper 43: Efficient Smart Emergency Response System for Fire Hazards using IoT

Abstract: The Internet of Things pertains to connecting currently unconnected things and people. It marks a new era in transforming existing systems to improve the cost-effective quality of services for society. To support the smart-city vision, urban IoT designs exploit value-added services for citizens as well as city administration using the most advanced communication technologies. To make emergency response real time, IoT enhances the way first responders operate and provides emergency managers with the necessary up-to-date information and communication to make use of those assets. IoT mitigates many of the challenges to emergency response, including present problems like weak communication networks and information lag. In this paper, an emergency response system for fire hazards is designed using a standardized IoT structure. To implement the proposed scheme, a low-cost Espressif Wi-Fi module ESP-32, a flame detection sensor, a smoke detection sensor (MQ-5), a flammable gas detection sensor and a GPS module are used. The sensors detect a hazard and alert local emergency rescue organizations, such as fire departments and police, by sending the hazard location to the cloud service through which they are all connected. The network uses MQTT, a lightweight data-oriented publish-subscribe messaging protocol, for fast and reliable communication. Thus, an intelligent integrated system is designed with the help of IoT.
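
A minimal sketch of the sensing-to-alert path over MQTT using the paho-mqtt client; the broker address, topic name, thresholds and sensor-reading function are placeholders, not values from the paper.

    # Sketch: publish a fire-hazard alert over MQTT (paho-mqtt, 1.x-style constructor).
    import json
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.org"      # hypothetical cloud MQTT broker
    TOPIC = "city/fire/alerts"         # hypothetical topic

    def read_sensors():
        # Placeholder for flame / smoke (MQ-5) / gas sensor reads on the ESP-32.
        return {"flame": True, "smoke_ppm": 410, "gas": False}

    client = mqtt.Client()
    client.connect(BROKER, 1883)
    reading = read_sensors()
    if reading["flame"] or reading["smoke_ppm"] > 300 or reading["gas"]:
        alert = {"location": {"lat": 16.51, "lon": 80.65}, **reading}  # GPS fix
        client.publish(TOPIC, json.dumps(alert), qos=1)                # pub/sub alert
    client.disconnect()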

Author 1: Lakshmana Phaneendra Maguluri
Author 2: Tumma Srinivasarao
Author 3: Maganti Syamala
Author 4: R. Ragupathy
Author 5: N.J. Nalini

Keywords: Internet of Things (IoT); Arduino IDE; GPS

Download PDF

Paper 44: An Information Theoretic Analysis of Random Number Generator based on Cellular Automaton

Abstract: The realization of randomness has always been a controversial concept of great importance from both theoretical and practical perspectives. This realization has been revolutionized in the light of recent studies, especially in the realms of chaos theory, algorithmic information theory and emergent behaviour in complex systems. We briefly discuss different definitions of randomness as well as different methods for generating it, together with the connection between these approaches and the notion of normality as a necessary condition of being unpredictable. Then a complex-system-based random number generator is introduced. We analyze its paradoxical features (conservative nature and reversibility in spite of considerable variation) using information-theoretic measures in connection with other measures. The evolution of this random generator is equivalent to the evolution of its probabilistic description in terms of probability distributions over blocks of different lengths. With the aid of simulations, we show the ability of this system to preserve normality during the process of coarse graining.
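
For concreteness, a sketch of a cellular-automaton bit generator (rule 30 is used here purely as a familiar stand-in; the paper analyzes its own complex-system generator) together with the block-entropy measure used to probe the distribution over blocks of a given length.

    import math
    import random

    def eca_step(cells, rule=30):
        """One synchronous update of an elementary CA with periodic boundaries."""
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
                for i in range(n)]

    def generate_bits(n_bits, width=101, rule=30):
        cells = [random.randint(0, 1) for _ in range(width)]
        bits = []
        for _ in range(n_bits):
            cells = eca_step(cells, rule)
            bits.append(cells[width // 2])        # sample the centre cell
        return bits

    def block_entropy(bits, block_len):
        """Shannon entropy (bits) of the empirical distribution over blocks."""
        counts = {}
        for i in range(len(bits) - block_len + 1):
            blk = tuple(bits[i:i + block_len])
            counts[blk] = counts.get(blk, 0) + 1
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    # For a normal (unpredictable) sequence, block_entropy(bits, L) should approach L.
    bits = generate_bits(20000)
    print([round(block_entropy(bits, L), 3) for L in (1, 2, 4)])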

Author 1: Amirahmad Nayyeri
Author 2: Gholamhossein Dastghaibyfard

Keywords: Random number generators; entropy; correlation information; elementary cellular automata; reversibility

Download PDF

Paper 45: Requirement Elicitation Techniques for Open Source Systems: A Review

Abstract: The trend of open source software development has increased over the past few years and has gained much attention from developers in the industry. The development of open source software systems differs slightly from traditional software development. In open source software development, requirement elicitation is a very complex and critical process, because developers from different regions of the world develop the system, making it really difficult to gather requirements for such systems. A variety of tools, techniques and approaches are available and used to perform requirement elicitation. The purpose of this study is to examine how requirement elicitation is carried out for open source software and the different ways used to simplify the process. This paper comprehensively describes the techniques that are available and used for requirement elicitation in open source software development. To do so, a literature survey of existing requirement elicitation techniques is conducted, identifying the different techniques that can be used for open source software systems.

Author 1: Hafiza Maria Kiran
Author 2: Zulfiqar Ali

Keywords: Requirements engineering; requirement elicitation; open source system; requirement elicitation techniques

Download PDF

Paper 46: A Parallel Community Detection Algorithm for Big Social Networks

Abstract: Mining social networks has become an important task in the data mining field; it describes users and their roles and relationships in social networks. Processing social networks with graph algorithms is the source for discovering many features, and the most important algorithms applied to social networks are community detection algorithms. Communities in social networks are groups of people sharing common interests or activities. DenGraph is a density-based algorithm used to find clusters of arbitrary shape based on users' interactions in social networks. However, because of the rapidly growing size of social networks, it is impossible to process a huge graph on a single machine at an acceptable level of execution. In this article, the DenGraph algorithm has been redesigned to work in a distributed computing environment. We propose the ParaDengraph algorithm based on the Pregel parallel model for large graph processing.

Author 1: Yathrib AlQahtani
Author 2: Mourad Ykhlef

Keywords: Data mining; social networks; community detection; distributed computing; Pregel

Download PDF

Paper 47: Comparison between Two Adaptive Controllers Applied to Greenhouse Climate Monitoring

Abstract: This paper presents a study of a multivariable adaptive generalized predictive controller and its application to control the thermal behaviour of an agricultural greenhouse, which is composed of a number of different elements (cover, internal air, plants, soil, actuators and sensors). The thermal model was obtained after studying the energy balances reflecting the physical behaviour of the greenhouse. For this reason, we opted to estimate the dynamic model of the greenhouse with an algorithm based on the recursive least squares (RLS) method. Simulation results are presented to show the controller's performance in terms of response time, stability and disturbance rejection.
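
A minimal sketch of the standard recursive least squares update that such an adaptive scheme relies on, for a model y(t) = phi(t)^T theta with a forgetting factor lambda; the paper's multivariable formulation and the generalized predictive control layer on top are not reproduced.

    import numpy as np

    class RLS:
        """Standard recursive least squares with forgetting factor."""
        def __init__(self, n_params, lam=0.98, p0=1000.0):
            self.theta = np.zeros(n_params)        # parameter estimates
            self.P = np.eye(n_params) * p0         # covariance matrix
            self.lam = lam

        def update(self, phi, y):
            phi = np.asarray(phi, dtype=float)
            err = y - phi @ self.theta             # prediction error
            k = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
            self.theta = self.theta + k * err
            self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
            return self.theta

    # Usage: at each sample, build the regressor from past inputs/outputs
    # (e.g. phi = [-y(t-1), u(t-1)]) and call rls.update(phi, y_t).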

Author 1: Mohamed Essahafi
Author 2: Mustapha Ait Lafkih

Keywords: Generalized predictive control; greenhouse; multivariable control; identification; recursive least squares

Download PDF

Paper 48: A Predictive Model for Solar Photovoltaic Power using the Levenberg-Marquardt and Bayesian Regularization Algorithms and Real-Time Weather Data

Abstract: The stability of power production in photovoltaic (PV) power plants is an important issue for large-scale grid-connected systems, because it affects the control and operation of the electrical grid. An efficient forecasting model is proposed in this paper to predict next-day solar photovoltaic power using the Levenberg-Marquardt (LM) and Bayesian Regularization (BR) algorithms and real-time weather data. The correlations between global solar irradiance, temperature, solar photovoltaic power and the time of the year were studied to extract knowledge from the available historical data for the purpose of developing a real-time prediction system. The solar PV generated power data were extracted from the power plant installed on top of the Faculty of Engineering building at Applied Science Private University (ASU), Amman, Jordan, and real-time weather records were measured by the ASU weather station on the same university campus. A large number of training, validation and testing experiments were carried out on the available records to optimize the Neural Network (NN) configurations and compare the performance of the LM and BR algorithms with different sets and combinations of weather data. Promising results were obtained, with an excellent real-time overall performance for next-day forecasting: a Root Mean Square Error (RMSE) value of 0.0706 using the Bayesian regularization algorithm with 28 hidden layers and all weather inputs. The Levenberg-Marquardt algorithm provided a 0.0753 RMSE using 23 hidden layers for the same set of learning inputs. This research shows that the Bayesian regularization algorithm outperforms the reported real-time prediction systems for PV power production.

Author 1: Mohammad H. Alomari
Author 2: Ola Younis
Author 3: Sofyan M. A. Hayajneh

Keywords: Solar photovoltaic; solar irradiance; PV power forecasting; machine learning; artificial neural networks; Levenberg-Marquardt; Bayesian regularization

Download PDF

Paper 49: Improving Energy Conservation in Wireless Sensor Network Using Energy Harvesting System

Abstract: Wireless Sensor Networks (WSNs) play an imperative part in monitoring and gathering information from complex geographical areas. Energy conservation plays a fundamental role in WSNs, since such networks are designed to be located in dangerous and non-accessible areas, and the topic has gained popularity over the last decade. The main issue in a Wireless Sensor Network is energy consumption; therefore, managing the energy consumption of the sensor nodes is the main area of our research. Sensor nodes use non-replaceable batteries for power supply, and the lifetime of a sensor node greatly depends on these batteries; their replacement is very difficult in many applications. An alternative solution to this problem is to use an energy harvesting system in the Wireless Sensor Network to provide a permanent power supply to the sensor nodes. The process of extracting energy from nature and converting it into electrical energy is called energy harvesting, and there are many natural energy sources, such as solar, wind and thermal, which can be harvested and used for WSNs. In this research, we suggest using an energy harvesting system for the cluster heads in a clustering-based Wireless Sensor Network, and we compare our proposed technique to the well-known clustering algorithm Low Energy Adaptive Clustering Hierarchy (LEACH).
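
For reference, a sketch of the baseline LEACH cluster-head election the proposal is compared against: in each round, a node that has not recently served as cluster head elects itself with the standard threshold T(n). The energy-harvesting extension for cluster heads is the paper's contribution and is not modelled here.

    import random

    def leach_threshold(p, r):
        """Standard LEACH threshold T(n) for desired CH fraction p in round r."""
        return p / (1 - p * (r % int(1 / p)))

    def elect_cluster_heads(nodes, p, r):
        """nodes: dict node_id -> round when it last served as CH (or None)."""
        period = int(1 / p)
        heads = []
        for nid, last_ch in nodes.items():
            eligible = last_ch is None or r - last_ch >= period
            if eligible and random.random() < leach_threshold(p, r):
                heads.append(nid)
                nodes[nid] = r
        return heads

    # Example: 100 nodes, 5% desired cluster heads, simulate 3 rounds.
    nodes = {i: None for i in range(100)}
    for r in range(3):
        print(r, elect_cluster_heads(nodes, p=0.05, r=r))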

Author 1: Abdul Rashid
Author 2: Faheem Khan
Author 3: Toor Gul
Author 4: Fakhr-e-Alam
Author 5: Shujaat Ali
Author 6: Samiullah Khan
Author 7: Fahim Khan Khalil

Keywords: Wireless sensor network; Low Energy Adaptive Cluster Hierarchy (LEACH); clustering; cluster head; energy harvesting; energy conservation

Download PDF

Paper 50: A New Motion Planning Framework based on the Quantized LQR Method for Autonomous Robots

Abstract: This study addresses the disconnection between the computational side of the robot navigation problem and the control problem, including concerns about stability. We aim to constitute a framework that includes a novel approach of using quantizers for occupancy grids and vehicle control systems concurrently. This representation allows stability to be addressed within the navigation structure through input and output quantizers in the framework. We give theoretical proofs of qLQR in the sense of Lyapunov stability, alongside the implementation details. The experimental results demonstrate the effectiveness of the qLQR controller and the quantizers in the framework with real-time data and offline simulations.
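
A minimal sketch of the building blocks such a framework combines: a discrete-time LQR gain obtained from the algebraic Riccati equation, followed by a uniform input quantizer. The paper's specific qLQR construction, occupancy-grid quantizer and stability proofs are not reproduced here; the double-integrator plant is an illustrative assumption.

    import numpy as np
    from scipy.linalg import solve_discrete_are

    def dlqr_gain(A, B, Q, R):
        """Discrete-time LQR: u = -K x, from the discrete algebraic Riccati equation."""
        P = solve_discrete_are(A, B, Q, R)
        return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

    def quantize(u, step=0.1):
        """Uniform input quantizer with the given step size."""
        return np.round(u / step) * step

    # Double-integrator example (dt = 0.1 s).
    dt = 0.1
    A = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([[0.5 * dt**2], [dt]])
    K = dlqr_gain(A, B, Q=np.eye(2), R=np.array([[1.0]]))

    x = np.array([1.0, 0.0])
    for _ in range(50):                      # closed loop with quantized input
        u = quantize(-K @ x)
        x = A @ x + B @ u
    print(x)                                 # state driven near the origin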

Author 1: Onur Sencan
Author 2: Hakan Temeltas

Keywords: Robot motion; mobile robotics; hybrid systems; optimal control; quantization

Download PDF

Paper 51: Bearing Fault Classification based on the Adaptive Orthogonal Transform Method

Abstract: In this work, we propose an approach based on building an adaptive basis which permits making accurate decisions for diagnosis. The adaptive orthogonal transformation consists of calculating the adaptive operator and the standard spectrum for every state, using two sets of vibration signal records for each type of fault. To classify a new signal, we calculate the spectral vector of this signal in each basis; then the similarity between this vector and the standard spectra is computed. The experimental results show that the proposed method is very useful for improving fault detection.

Author 1: Mohamed Azergui
Author 2: Abdenbi Abenaou
Author 3: Hassane Bouzahir

Keywords: Condition monitoring; vibration analysis; adaptive orthogonal transformation; bearing fault

Download PDF

Paper 52: Improving Security of the Telemedicine System for the Rural People of Bangladesh

Abstract: Telemedicine is a healthcare system in which healthcare professionals have the capability to observe, diagnose, evaluate and treat patients from a remote location, and patients can easily access medical expertise quickly and efficiently. The increasing popularity of telemedicine also increases the security threats against it. In this paper, a security framework is implemented for a previously developed cost-effective telemedicine system. The proposed framework secures all sections of the model following the recommendations of Health Level 7 (HL7), Fast Healthcare Interoperability Resources (FHIR) and the Health Insurance Portability and Accountability Act (HIPAA). The implementation includes authenticating the different types of users, securing the connection between mobile devices and sensors through authentication, protecting the mobile application from hackers, ensuring data security through encryption, and securing the server using the Secure Sockets Layer (SSL). Finally, the developed telemedicine model is more secure and can be implemented in remote areas of developing countries such as Bangladesh.

Author 1: Toufik Ahmed Emon
Author 2: Uzzal Kumar Prodhan
Author 3: Mohammad Zahidur Rahman
Author 4: Israt Jahan

Keywords: Telemedicine; security; encryption; hashing

Download PDF

Paper 53: Nonlinear Model Predictive Control for pH Neutralization Process based on SOMA Algorithm

Abstract: In this work, the pH neutralization process is described by a neural network Wiener (NNW) model, and a nonlinear Model Predictive Control (NMPC) scheme is established for the considered process. The main difficulty encountered in NMPC is solving the optimization problem at each sampling time to determine an optimal solution in finite time. The aim of this paper is to use a global optimization method to solve the NMPC minimization problem; we therefore propose using the Self-Organizing Migrating Algorithm (SOMA). This algorithm proves its efficiency by determining the optimal control sequence with a lower computation time. The NMPC is then compared to an adaptive PID controller, where we also use the SOMA algorithm to determine the optimal parameters of the PID. The performance of the two controllers based on the SOMA algorithm is tested on the pH neutralization process.
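
A compact sketch of the SOMA (all-to-one) migration loop used as the optimizer, with typical parameter names (PathLength, Step, PRT); the NMPC cost function is abstracted as cost(x), and the parameter values are illustrative assumptions.

    import random

    def soma_all_to_one(cost, bounds, pop_size=20, migrations=50,
                        path_length=3.0, step=0.11, prt=0.1):
        """SOMA AllToOne sketch: individuals migrate toward the leader in
        jumps along a path, perturbed by a random PRT vector."""
        dim = len(bounds)
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(migrations):
            leader = min(pop, key=cost)
            for i, x in enumerate(pop):
                if x is leader:
                    continue
                best, best_c = x, cost(x)
                t = step
                while t <= path_length:
                    prt_vec = [1 if random.random() < prt else 0 for _ in range(dim)]
                    cand = [xi + (li - xi) * t * pv
                            for xi, li, pv in zip(x, leader, prt_vec)]
                    cand = [min(max(c, lo), hi) for c, (lo, hi) in zip(cand, bounds)]
                    if cost(cand) < best_c:
                        best, best_c = cand, cost(cand)
                    t += step
                pop[i] = best
        return min(pop, key=cost)

    # E.g. minimizing a quadratic NMPC-style cost over a control horizon of 5:
    print(soma_all_to_one(lambda u: sum(v * v for v in u), [(-1, 1)] * 5))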

Author 1: Hajer Degachi
Author 2: Wassila Chagra
Author 3: Moufida Ksouri

Keywords: Nonlinear model predictive control; optimization; SOMA algorithm; adaptive PID; pH neutralization process

Download PDF

Paper 54: An Efficient Participant’s Selection Algorithm for Crowdsensing

Abstract: With the advancement of mobile technology, the use of smartphones has greatly increased; everyone has a mobile phone, and it has become a necessity of life. Today, smart devices flood the internet with data at all times and in many forms, giving rise to mobile crowdsensing (MCS). One of the key challenges in a mobile crowdsensing system is how to effectively identify and select well-suited participants for recruitment from a large user pool. This research work presents the concept of crowdsensing along with the process of selecting participants from a large user pool. The proposed selection algorithm recruits participants from the large user pool based on their availability status. Finally, graphical results are presented with the suitable locations of the participants and their time slots.

Author 1: Tariq Ali
Author 2: Umar Draz
Author 3: Sana Yasin
Author 4: Javeria Noureen
Author 5: Ahmad shaf
Author 6: Munwar Ali

Keywords: Mobile crowdsensing (MCS); Mobile Sensing Platform (MSP); crowd sensing; participant; user pool; crowdsourcing

Download PDF

Paper 55: An Energy-Efficient User-Centric Approach for High-Capacity 5G Heterogeneous Cellular Networks

Abstract: Today's cellular networks (3G/4G) do not scale well in heterogeneous networks (HetNets) of multiple technologies that employ the network-centric (NC) model. This destabilization is due to the need for coordination and management of the multiple layers of the HetNet, which NC models cannot provide. The user-centric (UC) approach is one of the key enablers of 5G wireless cellular networks for rapid recovery from network failures and for ensuring a certain communication capability for users. In this paper, we present a resource-aware energy-saving technique based on the UC model for LTE-A HetNets. We formulate the UC optimization problem as a mixed integer linear programming (MILP) model that minimizes the total power consumption (energy efficiency) while respecting the data rate per user, and we propose a low-complexity iterative algorithm for user terminal (UE)-eNodeB association. In the UC model, a UE possessing terminal intelligence can establish transmission and reception with different cells within the LTE-A HetNet, assuming coordination exists between the different cells in the network. The performance is evaluated in terms of energy saving in the uplink and downlink and the capacity (data rate) added to the network, by comparing the UC model against a NC model with the same simulation setup. The results show a significant percentage of energy saving at the eNodeBs and UEs in the UC model. System capacity is also enhanced in the UC model in both the uplink and downlink due to utilizing the best channel gain for transmission and reception.
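
A toy sketch of a UE-eNodeB association MILP using PuLP, with made-up power and rate coefficients; the paper's full constraint set (and its low-complexity iterative heuristic) is not reproduced.

    # Toy UE-eNodeB association MILP (PuLP); coefficients are illustrative only.
    import pulp

    users, cells = range(4), range(2)
    power = {(u, c): 1.0 + 0.5 * ((u + c) % 3) for u in users for c in cells}  # W
    rate = {(u, c): 10.0 - 2.0 * ((u + c) % 3) for u in users for c in cells}  # Mbps
    demand = 6.0                                   # required rate per user

    prob = pulp.LpProblem("uc_association", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("x", (users, cells), cat="Binary")

    # Objective: minimize total transmit power.
    prob += pulp.lpSum(power[u, c] * x[u][c] for u in users for c in cells)
    for u in users:
        prob += pulp.lpSum(x[u][c] for c in cells) == 1          # one serving cell
        prob += pulp.lpSum(rate[u, c] * x[u][c] for c in cells) >= demand
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print([(u, c) for u in users for c in cells if x[u][c].value() == 1])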

Author 1: Abdulziz M. Ghaleb
Author 2: Ali Mohammed Mansoor
Author 3: Rodina Ahmad

Keywords: Energy efficiency; HetNets; green networks; user-centric; network-centric; 5G

Download PDF

Paper 56: Lifetime Maximization on Scalable Stable Election Protocol for Large Scale Traffic Engineering

Abstract: Recently, Wireless Sensor Networks (WSNs) have been gaining more fame because they are low cost and easy to manage and maintain. A WSN consists of sensor nodes and a Base Station (BS). Sensor nodes are responsible for sensing, transmitting and receiving data packets from the sensing field, and the BS is responsible for collecting this data and converting it into readable form. The main issue in these networks is the lack of power resources. As sensor nodes are restricted to limited energy, researchers always aim to produce energy-efficient clustered routing protocols, and heterogeneity of sensor nodes is one of the best possible solutions. The Stable Election Protocol (SEP) was the first heterogeneous-network protocol and proposed two levels of heterogeneity; it improved not only the network lifetime but also the stability of the sensor nodes. In order to maximize the network lifetime, we propose a scalable version of the SEP routing protocol (S-SEP) and check its reliability in large-scale networks for traffic engineering, comparing the results of the standard SEP routing protocol with a fourth level of heterogeneity. Simulation results prove that the S-SEP protocol works better in larger networks.

Author 1: Muhammad Asad
Author 2: Arsalan Ali Shaikh
Author 3: Soomro Pir Dino
Author 4: Muhammad Aslam
Author 5: Yao Nianmin

Keywords: Wireless sensor networks (WSN); heterogeneous network; clustered routing protocol; traffic engineering

Download PDF

Paper 57: Comparative Analysis of Raw Images and Meta Feature based Urdu OCR using CNN and LSTM

Abstract: The Urdu language uses a cursive script, which results in connected characters constituting ligatures. For identifying characters within ligatures of different scales (font sizes), a Convolutional Neural Network (CNN) and a Long Short-Term Memory (LSTM) network are used. Both network models are trained on previously extracted ligature thickness graphs, from which the models extract meta features; these thickness graphs provide consistent information across different font sizes. The LSTM and CNN are also trained on raw images to compare performance on both forms of input. For this research, two corpora are used: Urdu Printed Text Images (UPTI) and Centre for Language Engineering (CLE) Text Images. The overall performance of the networks ranges between 90% and 99.8%. The average accuracy on meta features is 98.08%, while 97.07% average accuracy is achieved using raw images.

Author 1: Asma Naseer
Author 2: Kashif Zafar

Keywords: Long Short Term Memory (LSTM); Convolution Neural Network (CNN); OCR; scale invariance; deep learning; ligature

Download PDF

Paper 58: An Empirical Evaluation of Error Correction Methods and Tools for Next Generation Sequencing Data

Abstract: Next Generation Sequencing (NGS) technologies produce massive amounts of low-cost data that are very useful in genomic study and research. However, data produced by NGS are affected by different errors such as substitutions, deletions and insertions, and it is essential to differentiate between true biological variants and alterations caused by errors for accurate downstream analysis. Many methods and tools have been developed for NGS error correction; some correct only substitution errors, whereas others correct multiple types of errors. In this article, a comprehensive evaluation of three types of methods (k-spectrum based, multiple-sequence-alignment based and hybrid based), as implemented and adopted by different tools, is presented. Experiments have been conducted to compare performance in terms of runtime and error correction rate, using two different computing platforms. The aim of this comparative evaluation is to provide recommendations for selecting suitable tools to cope with the specific needs of users and practitioners. It has been noticed that the k-mer spectrum based methodology generates superior results compared to the other methods. Among all the tools evaluated, RACER shows eminent performance in terms of error correction rate and execution time for both small and large data sets. Among the multiple-sequence-alignment based tools, Karect shows an excellent error correction rate, whereas Coral shows better execution time for all data sets. Among the hybrid tools, Jabba shows a better error correction rate and execution time than Brownie. Computing platforms mostly affect execution time but have no general effect on error correction rate.
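
To illustrate the k-spectrum idea the best-performing tools build on: k-mers occurring fewer times than a coverage-dependent threshold are treated as untrusted, and a correction is sought that turns every k-mer of a read into a trusted one. This is a highly simplified sketch; real tools such as RACER use far more efficient data structures and heuristics.

    from collections import Counter

    def kmer_spectrum(reads, k):
        """Count all k-mers across the read set."""
        counts = Counter()
        for r in reads:
            for i in range(len(r) - k + 1):
                counts[r[i:i + k]] += 1
        return counts

    def correct_read(read, counts, k, threshold=2):
        """Try single-base substitutions that make every k-mer trusted."""
        def weak(r):
            return [i for i in range(len(r) - k + 1) if counts[r[i:i + k]] < threshold]
        if not weak(read):
            return read
        for pos in range(len(read)):
            for base in "ACGT":
                if base != read[pos]:
                    cand = read[:pos] + base + read[pos + 1:]
                    if not weak(cand):
                        return cand            # corrected
        return read                            # leave uncorrectable reads as-is

    reads = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGAAC"]  # third read has an error
    counts = kmer_spectrum(reads, k=5)
    print(correct_read(reads[2], counts, k=5))          # -> ACGTACGTAC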

Author 1: Atif Mehmood
Author 2: Javed Ferzund
Author 3: Muhammad Usman Ali
Author 4: Abbas Rehman
Author 5: Shahzad Ahmed
Author 6: Imran Ahmad

Keywords: Next generation sequencing; bioinformatics; errors; error correction; execution time; k-spectrum; suffix tree based; hybrid based

Download PDF

Paper 59: Combinatorial Double Auction Winner Determination in Cloud Computing using Hybrid Genetic and Simulated Annealing Algorithm

Abstract: With the advancement of information technology, computing tasks need to be performed everywhere and at all times. In cloud computing environments, heterogeneous users have access to different resources with different characteristics that are geographically distributed across different areas. Because of this, the allocation of resources in cloud computing becomes a main issue and a major challenge for achieving high performance. Since cloud computing is a distributed system with a business aspect, economic methods such as auctions are used to allocate resources in a decentralized manner. The combinatorial double auction is an important economic model and a suitable solution for resource allocation in cloud computing, since providers can likewise offer their resources in combination. The problem of allocating resources efficiently in a combinatorial double auction, with maximum benefit for the parties to the transaction, is known as the winner determination problem, and it is NP-hard; consequently, several methods have been proposed to solve it. In this paper, taking into account the strength of the simulated annealing algorithm, a modified version of it is proposed for solving winner determination in combinatorial double auctions in cloud computing. The proposed approach is simulated alongside genetic and simulated annealing algorithms, and the results show that it finds better solutions than the two mentioned algorithms.

Author 1: Ali Sadigh Yengi Kand
Author 2: Ali Asghar Pourhaji Kazem

Keywords: Cloud computing; double auction; winner determination; genetic algorithm; simulated annealing

Download PDF

Paper 60: QoS-based Cloud Manufacturing Service Composition using Ant Colony Optimization Algorithm

Abstract: Cloud manufacturing (CMfg) is a service-oriented platform that enables engineers to use manufacturing capacity in the form of cloud-based services aggregated in service pools on demand. In CMfg, the integration of manufacturing resources across different areas and industries is accomplished using cloud services. In recent years, interest in cloud manufacturing service composition has grown due to its importance in different manufacturing applications. When no single service is capable of satisfying the needs of a manufacturing service requester, service composition may be used to fulfill the requester's purpose. Therefore, the problem of how to interconnect cloud manufacturing services efficiently and effectively has attracted much research. In this paper, a new algorithm based on ant colony optimization is presented for the problem of cloud manufacturing service composition considering the quality of service.

Author 1: Elsoon Neshati
Author 2: Ali Asghar Pourhaji Kazem

Keywords: Cloud computing; cloud manufacturing; service composition; ant colony optimization

Download PDF

Paper 61: Envisioning Internet of Things using Fog Computing

Abstract: Internet of Things is the future of the Internet, and it encircles a wide scope. There are currently billions of devices connected to the Internet, and this trend is expected to grow exponentially; Cisco predicts there are at present 20 billion connected devices. These devices, with their varied data types, transmission rates and communication protocols, connect to the Internet seamlessly. Futuristic implementations of the Internet of Things across various scenarios, ranging from RFID-connected devices to huge data centers, demand real-time performance delivery. To date, there is no single communication protocol available for envisioning IoT, and there is still no common, agreed-upon architecture, so huge challenges lie ahead. One way to envision the Internet of Things is to make use of fog networks. Fog is essentially a cloudlet located nearer to the ground; it offers lower latency and better bandwidth conservation. Fog computing is a recent concept, and the OpenFog Consortium, a joint effort of many vendors, has produced a background study for realizing fog as a possible platform for activating the Internet of Things. This paper revolves around envisioning the Internet of Things using fog computing. It begins with a detailed background study of the Internet of Things and the fog architecture, covers applications and scenarios where such knowledge is highly applicable, and concludes by proposing fog computing as a possible platform for the Internet of Things.

Author 1: Urooj Yousuf Khan
Author 2: Tariq Rahim Soomro

Keywords: IoT; fog computing; cloud computing

Download PDF

Paper 62: A Group Decision-Making Method for Selecting Cloud Computing Service Model

Abstract: Cloud computing is a new technology that has great potential for the business world. Many business firms have implemented, are implementing, or are planning to implement cloud computing technology. Cloud computing resources are delivered in various forms of service models, which makes it challenging for business customers to select the model that suits their business needs. This paper proposes a novel group-based decision-making method in which a group of decision makers is involved in the decision process. Each decision maker provides weights for the cloud selection criteria; based on weight aggregations and deviations, the alternative with the highest ratio of deviation to mean is selected. The method is illustrated with an example on the selection of cloud service models, and it is useful for IT managers in selecting the appropriate cloud service model for their organizations.
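
A small sketch of the selection rule as stated, under the simplifying assumption that each decision maker's weights reduce to one score per alternative, so the alternative with the highest deviation-to-mean ratio of its scores is chosen. This is an illustrative reading with made-up numbers, not the paper's exact procedure.

    import statistics

    # Hypothetical scores: one entry per decision maker, per service model.
    scores = {
        "IaaS": [0.62, 0.70, 0.55, 0.68],
        "PaaS": [0.40, 0.81, 0.35, 0.77],
        "SaaS": [0.66, 0.64, 0.69, 0.63],
    }

    def deviation_to_mean(values):
        return statistics.stdev(values) / statistics.mean(values)

    ratios = {alt: deviation_to_mean(v) for alt, v in scores.items()}
    selected = max(ratios, key=ratios.get)
    print(ratios, "->", selected)   # PaaS has the most dispersed scores here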

Author 1: Ibrahim M. Al-Jabri
Author 2: Mustafa I. Eid
Author 3: M. Sadiq Sohail

Keywords: Cloud computing; cloud service models; multi-criteria decision-making; group decision-making

Download PDF

Paper 63: Prediction of Stroke using Data Mining Classification Techniques

Abstract: Stroke is a neurological disease that occurs when brain cells die as a result of oxygen and nutrient deficiency. Stroke detection within the first few hours improves the chances of preventing complications and improving the health care and management of patients. In addition, the significant effect of the medications used as stroke treatment appears only if they are given within the first three hours after the onset of stroke. A framework has been designed based on data mining techniques applied to a stroke data set obtained from the Ministry of National Guard Health Affairs hospitals, Kingdom of Saudi Arabia. A data mining model was built with 95% accuracy. Furthermore, this study showed that patients with the following medical conditions have a higher probability of developing stroke: heart diseases (mainly hypertension), immune diseases, diabetes mellitus, kidney diseases, hyperlipidemia, epilepsy, and blood (platelet) disorders.

Author 1: Ohoud Almadani
Author 2: Riyad Alshammari

Keywords: Stroke; data mining; classification

Download PDF

Paper 64: Hardware Implementation for the Echo Canceller System based Subband Technique using TMS320C6713 DSP Kit

Abstract: The acoustic echo cancellation system is very important in the communication applications used these days; in view of this importance, we have implemented this system practically using the DSP TMS320C6713 Starter Kit (DSK). The acoustic echo cancellation system was implemented based on an 8-subband technique using the Least Mean Square (LMS) algorithm and the Normalized Least Mean Square (NLMS) algorithm. The system was evaluated by measuring performance in terms of the Echo Return Loss Enhancement (ERLE) factor and the Mean Square Error (MSE) factor.
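
A minimal sketch of the NLMS adaptive filter at the core of such an echo canceller, operating on a single band; the 8-subband decomposition, the DSK audio I/O and the ERLE/MSE instrumentation are omitted.

    import numpy as np

    def nlms_echo_canceller(x, d, n_taps=64, mu=0.5, eps=1e-8):
        """x: far-end reference, d: microphone signal (echo + near end).
        Returns the error signal e = d - y, i.e. the echo-cancelled output."""
        w = np.zeros(n_taps)                 # adaptive filter weights
        e = np.zeros(len(x))
        for n in range(n_taps, len(x)):
            u = x[n - n_taps:n][::-1]        # most recent n_taps reference samples
            y = w @ u                        # echo estimate
            e[n] = d[n] - y
            w += mu * e[n] * u / (u @ u + eps)   # normalized LMS update
        return e

    # Synthetic check: echo = reference filtered by a short random path.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(8000)
    path = rng.standard_normal(16) * 0.3
    d = np.convolve(x, path)[:len(x)]
    e = nlms_echo_canceller(x, d)
    print(float(np.mean(e[-1000:] ** 2)))    # residual echo power, near zero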

Author 1: Mahmod. A. Al Zubaidy
Author 2: Sura Z. Thanoon

Keywords: Acoustic echo canceller; Least Mean Square (LMS); Normalized Least Mean Square (NLMS); TMS320C6713; 8 subbands adaptive filter

Download PDF

Paper 65: SME Cloud Adoption in Botswana: Its Challenges and Successes

Abstract: The standard office or business in Botswana hosts its resources in-house, meaning a company will have its hardware, software and support staff as part of its daily work operations. Technology has brought a shift to the office environment with cloud computing. Botswana has seen the growth of cloud technologies within its own boundaries, where companies have embraced the new technology to mobilize and push their operational agenda with the same tenacity as the rest of the world. Cloud computing has taken root in Botswana: many SMEs are using cloud computing, whilst some have not adopted the technology, and Edgar Tsimane has reported on the uptake of cloud computing in the country. Botswana uses its national ICT policy, the Maitlamo policy, to guide technological advances and development. This paper considers the aspects influencing a company's decision to utilize the cloud as a service, covering both opportunities and challenges. Some of the questions the study addresses are: how effective is cloud computing for businesses in Botswana; what challenges and successes have these companies had; and is there any particular framework they followed to guide them in adopting the services? Finally, the paper recommends a framework that can be adopted within Botswana.

Author 1: Malebogo Khanda
Author 2: Srinath Doss

Keywords: Cloud computing; SMEs; cloud computing services; cloud business processes; cloud computing framework; cloud deployment models; cloud computing services model

Download PDF

Paper 66: Survey Paper for Software Project Team, Staffing, Scheduling and Budgeting Problem

Abstract: Software project scheduling is one of the most important scheduling areas faced by software project management teams. Software development companies are under substantial pressure to finish projects on time, within budget, and with the appropriate level of quality and value. An inexperienced development team or poor management can cause delays and costs that, given scheduling and spending constraints, are often unacceptable, leading to business-critical failures. Software development companies frequently struggle to deliver projects on time, within budget and with the required quality; for a successful project, both software engineering and software management are extremely important. One possible cause of this problem is poor software project management and, in particular, inadequate project scheduling and insufficient team staffing. The software project scheduling problem is one of the essential and challenging problems encountered by project managers in highly competitive software companies. Since the matter becomes harder with increasing numbers of employees and tasks, only a few algorithms exist and their performance is still not satisfying for building a flexible and effective model for software project planning. In this paper, we survey several techniques and strategies and explain the results they yield.

Author 1: Rizwan Akram
Author 2: Salman Ihsan
Author 3: Shaista Zafar
Author 4: Babar Hayat

Keywords: Software engineering; project management; software project resources; project scheduling; budgeting; team

Download PDF

Paper 67: Real-Time Experimentation and Analysis of Wifi Spectrum Utilization in Microwave Oven Noisy Environment

Abstract: The demand for broadband wireless communication in homes and offices has been increasing exponentially; thus, the need for reliable and effective communication is crucial. Both theoretical and experimental investigations have clearly shown that electromagnetic radiation from external sources such as microwave ovens (MWOs) has a detrimental impact on the wireless medium and the media content. This drastically degrades the signal strength of the wireless link and consequently affects the overall throughput due to noise and interference. This experimental study is primarily aimed at critically analyzing and evaluating the impact of electromagnetic radiation on spectrum utilization under different experimental scenarios. The experimental results clearly show that electromagnetic noise radiation from a microwave oven can seriously affect the performance of other devices operating in the 2.4 GHz frequency band, especially delay-sensitive applications and services.

Author 1: Yakubu S. Baguda

Keywords: Electromagnetic radiation; microwave oven; spectrum utilization; bandwidth; ISM band; signal strength; throughput; wireless channel

Download PDF

Paper 68: Deep Learning Technology for Predicting Solar Flares from (Geostationary Operational Environmental Satellite) Data

Abstract: Solar activity, particularly solar flares, can have significant detrimental effects on both space-borne and ground-based systems and industries, leading to subsequent impacts on our lives. As a consequence, there is much current interest in creating systems which can make accurate solar flare predictions. This paper aims to develop a novel framework to predict solar flares by making use of the Geostationary Operational Environmental Satellite (GOES) X-ray flux 1-minute time series data. This data is fed to three integrated neural networks to deliver the predictions. The first neural network (NN) converts the GOES X-ray flux 1-minute data into Markov Transition Field (MTF) images. The second neural network uses an unsupervised feature learning algorithm to learn the MTF image features. The third neural network uses both the learned features and the MTF images, which are then processed by a deep convolutional neural network to generate the flare predictions. To the best of our knowledge, this work is the first flare prediction system based entirely on the analysis of pre-flare GOES X-ray flux data. The results are evaluated using several performance measurement criteria that are presented in this paper.
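
A compact sketch of how a Markov Transition Field image can be computed from a univariate series such as the GOES X-ray flux: the series is quantile-binned, a Markov transition matrix is estimated between consecutive bins, and MTF[i][j] is the transition probability from the bin of x_i to the bin of x_j. This is a simplified illustration, not the paper's exact preprocessing.

    import numpy as np

    def markov_transition_field(x, n_bins=8):
        """Return the N x N MTF image of a 1-D series x of length N."""
        x = np.asarray(x, dtype=float)
        # 1) Quantile binning: assign each sample to one of n_bins states.
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
        states = np.digitize(x, edges)
        # 2) Markov transition matrix W between consecutive states.
        W = np.zeros((n_bins, n_bins))
        for a, b in zip(states[:-1], states[1:]):
            W[a, b] += 1
        W /= np.maximum(W.sum(axis=1, keepdims=True), 1)
        # 3) MTF: spread transition probabilities over all time-index pairs.
        return W[np.ix_(states, states)]

    flux = np.random.default_rng(1).lognormal(size=256)   # stand-in for GOES data
    mtf = markov_transition_field(flux)
    print(mtf.shape)                                      # (256, 256) image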

Author 1: Tarek A M Hamad Nagem
Author 2: Rami Qahwaji
Author 3: Stan Ipson
Author 4: Zhiguang Wang
Author 5: Alaa S. Al-Waisy

Keywords: Convolutional neural network; deep learning; solar flare prediction; space weather

Download PDF

Paper 69: Cyber-Security Incidents: A Review Cases in Cyber-Physical Systems

Abstract: Cyber-Physical Systems refer to systems in which computers, communication channels and physical devices interact to solve a real-world problem. With the move towards the Industry 4.0 revolution, Cyber-Physical Systems have become one of the main targets of hackers, and any damage to them can lead to high losses for a nation. According to valid sources, several reported cases involved security breaches of Cyber-Physical Systems. The fundamental and theoretical concepts of security in the digital world have been discussed worldwide; yet security cases regarding cyber-physical systems remain less explored, and only limited tools have been introduced to overcome their security problems. To improve understanding and introduce many more security solutions for cyber-physical systems, study of this matter is highly in demand. In this paper, we investigate the current threats to Cyber-Physical Systems, propose a classification and a matrix for these threats, and conduct a simple statistical analysis of the collected data using a quantitative approach. We confirm four components (type of attack, impact, intention and incident category) as the main contributors to a threat taxonomy for Cyber-Physical Systems.

Author 1: Mohammed Nasser Al-Mhiqani
Author 2: Rabiah Ahmad
Author 3: Warusia Yassin
Author 4: Aslinda Hassan
Author 5: Zaheera Zainal Abidin
Author 6: Nabeel Salih Ali
Author 7: Karrar Hameed Abdulkareem

Keywords: Cyber-Physical Systems; threats; incidents; security; cybersecurity; taxonomies; matrix; threats analysis

Download PDF

Paper 70: Measuring Quality of E-Learning and Desaire2Learn in the College of Science and Humanities at Alghat, Majmaah University

Abstract: E-learning and the Desire2Learn (D2L) system are used in several higher education institutions; learning satisfaction depends on the quality of the system applied and its importance in users' minds. Therefore, this study intended to explore the degree of satisfaction of students and faculty members with the importance and quality of the e-learning and D2L system used as a tool for learning some courses. A sample of 57 faculty members and 135 students participated in this study. We used two questionnaires as data collection tools, one for faculty members and the other for students; both questionnaires had the same idea with different questions, and the Statistical Package for the Social Sciences (SPSS) was used to analyze the data. The results show that faculty members are highly satisfied with the quality of e-learning and the D2L system as a method of teaching, moderately satisfied with using the D2L tools, and that there is a positive relationship between e-learning quality and using D2L tools in teaching. The results also record high satisfaction from students towards the quality of e-learning and the D2L system as a method of learning, and show no statistically significant effect of gender on D2L system quality. Finally, the study discusses the implications and recommendations of the work.

Author 1: Abdelmoneim Ali Mohamed
Author 2: Faisal Mohammed Nafie

Keywords: E-learning; Desire2Learn; D2L quality; E-learning quality; learning satisfaction

Download PDF

Paper 71: TSAN: Backbone Network Architecture for Smart Grid of P.R China

Abstract: The network architecture of any real-time system must be robust enough to absorb several network failures and still work smoothly. The smart grid network is one of those big networks that should be considered and designed carefully because of its dependencies. Several hybrid approaches have been proposed using wireless and wired technologies involving SDH/SONET as a backbone network, but all technologies have their own limitations and cannot be fully utilized due to various factors. In this paper, we propose a fiber-optic based Gigabit Ethernet (1000BASE-ZX) network, named the Territory Substation Area Network (T-SAN), as a smart grid backbone architecture. It is a scalable architecture with several desired features, such as wide coverage, fault tolerance, robustness, reliability and maximum availability. A use case mapping the T-SAN onto the map of the People's Republic of China demonstrates its strength as a candidate backhaul network for any territory or country, and the results of the implemented architecture and its protocol for fault detection and recovery reveal the ability of the system to survive several random, multiple and simultaneous faults efficiently.

Author 1: Raheel Ahmed Memon
Author 2: Jianping Li
Author 3: Anwar Ahmed Memon
Author 4: Junaid Ahmed
Author 5: Muhammad Irshad Nazeer
Author 6: Muhammad Ismail

Keywords: Smart Grid; TSAN; 1000BASE-ZX ethernet; backbone architecture

Download PDF

Paper 72: Data Synchronization Model for Heterogeneous Mobile Databases and Server-side Database

Abstract: Mobile devices, because they can be used to access corporate information anytime and anywhere, have recently received considerable attention, and several research efforts have been tailored towards addressing data synchronization problems. However, the existing solutions are either vendor-specific or homogeneous in nature. This paper proposes a Heterogeneous Mobile Database Synchronization Model (HMDSM) to enable all mobile databases, regardless of their individual differences, to participate in any data synchronization process. To accomplish this, an experimental approach (exploratory and confirmatory) was employed, and existing models and algorithms were classified, extended and applied. All database-specific information, such as triggers, timestamps and meta-data, is eliminated, and a listener is added to listen for any operation performed on either side. To prove its performance, the proposed model underwent rigorous experimentation and testing, and the chi-square test was used to analyze the data generated from the experiments. Results show the feasibility of an approach which can handle database synchronization between heterogeneous mobile databases and the server. The proposed model not only proves its generic applicability to all mobile databases but also reduces the use of mobile resources, making it suitable for mobile devices with low computing power to proficiently process large amounts of data.

Author 1: Abdullahi Abubakar Imam
Author 2: Shuib Basri
Author 3: Rohiza Ahmad
Author 4: Abdul Rehman Gilal

Keywords: Heterogeneous databases; data synchronization; mobile databases; mobile devices; NoSQL database; relational databases

Download PDF

Paper 73: Data Mining Techniques to Construct a Model: Cardiac Diseases

Abstract: This study uses a data set of reported Transthoracic Echocardiography examinations to build a prediction model for detecting heart disease using data mining techniques; such a model can improve the reliability of diagnosing cardiac diseases by echocardiography. Following the eight iterative and interactive steps of the Knowledge Discovery in Databases (KDD) methodology, the important features were extracted from the Transthoracic Echocardiography inspection reports of 209 patients treated at the Faisalabad Institute of Cardiology between 2012 and 2015. The models compared, a J48 decision tree, a naive Bayes classifier and a neural network, all achieved generally comparable classification precision in predicting heart disease cases; however, the J48 model performed slightly better, with a predictive classification accuracy of 80% based on the true positive rate. The results of this study can be used to make more consistent diagnoses of cardiac disease and can serve as a support tool for cardiac disease specialists.

Author 1: Noreen Akhtar
Author 2: Muhammad Ramzan Talib
Author 3: Nosheen Kanwal

Keywords: Knowledge Discovery in Database (KDD); data mining; decision trees; neural networks; Bayesian classifier; heart disease

Download PDF

Paper 74: Fuzzy Logic based Approach for VoIP Quality Maintaining

Abstract: Voice communication is an emerging technology and has great importance in our daily life. Perceptual Voice over Internet Protocol (VoIP) quality is an important issue for VoIP application services because they require real-time support. Many network factors (packet loss, packet delay and jitter) affect VoIP quality. To address this, we use an approach based on fuzzy logic: we configure a Resource Reservation Protocol (RSVP) application to control the Token Bucket Algorithm, and the simulation experiments are carried out with OPNET. In addition, we compare the token bucket with and without Quality of Service to measure the network factors. In this paper, we build a Fuzzy Token Bucket System consisting of three variables (bandwidth rate, buffer size and new token) in order to improve the token bucket shaper output variable (new token) through a fuzzy stability model for maintaining VoIP quality.
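
For reference, a sketch of the plain token bucket shaper that the fuzzy system adjusts. In the paper's fuzzy variant, the bucket parameters (token rate, buffer size) would be set by fuzzy rules over the measured traffic rather than fixed as in this illustration.

    import time

    class TokenBucket:
        """Classic token bucket: a packet may be sent only if enough tokens
        have accumulated; tokens accrue at `rate` per second up to `capacity`."""
        def __init__(self, rate, capacity):
            self.rate, self.capacity = rate, capacity
            self.tokens = capacity
            self.last = time.monotonic()

        def allow(self, packet_size):
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= packet_size:
                self.tokens -= packet_size     # conforming packet: consume tokens
                return True
            return False                       # non-conforming: delay or drop

    bucket = TokenBucket(rate=64_000, capacity=8_000)   # ~64 kbit/s voice stream
    print(bucket.allow(160 * 8))                        # one 160-byte voice frame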

Author 1: Mohamed E. A. Ebrahim
Author 2: Hesham A. Hefny

Keywords: Voice over Internet Protocol (VoIP); Fuzzy model System (FMS); Fuzzy Token Bucket Algorithm (FTBA); Resource Reservation Protocol (RSVP); Quality of Service (QoS)

Download PDF

Paper 75: Conceptual Modeling of a Procurement Process

Abstract: Procurement refers to a process resulting in the delivery of goods or services within a set time period. The process includes aspects of purchasing, specifications to be met, and solicitation notifications, as in the case of Requests For Proposals (RFPs). Typically, such an RFP is described in a verbal, ad hoc fashion in English, with tables and graphs, resulting in imprecise specifications of requirements. It has been proposed that BPMN diagrams be used to specify the requirements to be included in an RFP. This paper is a merger of three topics: 1) procurement development with a focus on the operational specification of RFPs; 2) public key infrastructure (PKI) as an RFP subject; and 3) conceptual modeling that produces a diagram as a supplement to an RFP to clarify requirements more precisely than traditional tools, such as natural language, tables, and ad hoc graphs.

Author 1: Sabah Al Fedaghi
Author 2: Mona Al-Otaibi

Keywords: Procurement; RFP; public key infrastructure; conceptual modeling; diagrammatic representation

Download PDF

Paper 76: Average Link Stability with Energy-Aware Routing Protocol for MANETs

Abstract: This paper proposes the A-LSEA (Average Link Stability and Energy Aware) routing protocol for Mobile Ad-hoc Networks (MANETs). The main idea behind this algorithm is that, on the one hand, a node must have enough Residual Energy (RE) before retransmitting a Route Request (RREQ) and declaring itself a participating node in the end-to-end path; on the other hand, the Link Life Time (LLT) between the sending node and the receiving node must be acceptable before the received RREQ is forwarded. The combination of these two conditions provides more stable paths and less frequent route breaks. The averaged simulation results for the proposed A-LSEA protocol show a fairly significant improvement in delivery ratio, exceeding 10%, and an increase in network lifetime of approximately 20%, compared to other reactive routing protocols.
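
The forwarding rule combines the two checks described above. A small sketch, with hypothetical thresholds and the LLT estimate assumed to come from signal strength or node mobility information (the paper's actual threshold values are not reproduced):

    # Sketch of the A-LSEA RREQ forwarding decision (thresholds are assumptions).
    RE_MIN = 0.2        # minimum residual energy fraction to join a path
    LLT_MIN = 5.0       # minimum acceptable link lifetime, seconds

    def should_forward_rreq(residual_energy, link_life_time):
        """A node retransmits a received RREQ (declaring itself part of the
        end-to-end path) only if BOTH conditions hold."""
        return residual_energy >= RE_MIN and link_life_time >= LLT_MIN

    # Example: a node at 35% energy with an 8 s predicted link lifetime forwards.
    print(should_forward_rreq(0.35, 8.0))   # True
    print(should_forward_rreq(0.35, 2.0))   # False: link too unstable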

Author 1: Sofian Hamad
Author 2: Salem Belhaj
Author 3: Muhana M. Muslam

Keywords: Mobile Ad-hoc Network (MANET); routing protocol; energy aware; link life time; AODV

Download PDF
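
A minimal sketch of the dual-condition forwarding rule stated in the abstract: a node retransmits an RREQ only if both its residual energy and the estimated link lifetime to the sender are acceptable. The thresholds and the deliberately crude LLT estimate below are invented; the paper's actual RE and LLT computations are not reproduced here.

```python
# Hypothetical A-LSEA-style RREQ forwarding decision (placeholder values).
from dataclasses import dataclass

@dataclass
class Node:
    residual_energy: float   # joules remaining
    position: tuple          # (x, y)
    velocity: tuple          # (vx, vy)

RE_MIN = 5.0    # placeholder energy threshold (J)
LLT_MIN = 2.0   # placeholder link-lifetime threshold (s)

def link_lifetime(sender: Node, receiver: Node, radio_range: float = 250.0) -> float:
    """Crude LLT estimate: time until relative motion carries the nodes
    out of radio range (simplified 1-D projection, for illustration only)."""
    dx = receiver.position[0] - sender.position[0]
    dv = receiver.velocity[0] - sender.velocity[0]
    if dv == 0:
        return float("inf")          # no relative motion on this axis
    return max(0.0, (radio_range - abs(dx)) / abs(dv))

def should_forward_rreq(node: Node, sender: Node) -> bool:
    # Both conditions must hold before the node joins the end-to-end path.
    return (node.residual_energy >= RE_MIN and
            link_lifetime(sender, node) >= LLT_MIN)

a = Node(10.0, (0.0, 0.0), (5.0, 0.0))
b = Node(8.0, (100.0, 0.0), (-5.0, 0.0))
print(should_forward_rreq(b, a))     # True: enough energy, link lives 15 s
```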

Paper 77: Agent based Architecture for Modeling and Analysis of Self Adaptive Systems using Formal Methods

Abstract: Self-adaptive systems (SAS) can modify their behavior during execution in response to changes in their internal or external environment. The need for self-adaptive software systems has increased tremendously in the last decade due to ever-changing user requirements, improvements in technology, and the need to build software that reacts to user preferences. Building this type of software requires well-established models that are flexible enough to adjust to new requirements and that ensure the adaptation is efficient and reliable. Feedback loops have proven very effective in modeling and developing SAS: they help the system sense, analyze, plan, test, and execute adaptive behavior at runtime. Formal methods are well-defined, rigorous, and reliable mathematical techniques that can be used to specify and reason about the behavior of SAS at design time and runtime. Agents can play an important role in modeling SAS because they can work independently, with other agents, and with the environment. Using agents to perform the individual steps of the feedback loop, and formalizing these agents using Petri nets, not only increases reliability but also allows adaptation decisions to be made efficiently at runtime with increased confidence. In this paper, we propose a multi-agent framework for modeling self-adaptive systems using agent-based modeling. This framework will help researchers implement SAS that are more dependable, reliable, autonomic, and flexible owing to the multi-agent-based formal approach. (An illustrative sketch of an agent-per-step feedback loop follows this entry.)

Author 1: Natash Ali Mian
Author 2: Farooq Ahmad

Keywords: Formal methods; self-adaptive systems; agent based modeling; feedback loop; Petri nets

Download PDF
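
A minimal sketch of one cycle of the agent-per-step feedback loop the abstract describes, with one agent per stage (monitor, analyze, plan, execute). The agent interfaces, system state, and adaptation action are all invented, and the Petri-net formalization of the agents is not shown.

```python
# Hypothetical agent-based feedback cycle (all names and rules invented).

class MonitorAgent:
    def run(self, system):
        return {"load": system["load"]}           # sensed state

class AnalyzeAgent:
    def run(self, state):
        return state["load"] > 0.8                # is adaptation needed?

class PlanAgent:
    def run(self):
        return [("scale_out", 1)]                 # ordered adaptation plan

class ExecuteAgent:
    def run(self, system, plan):
        for action, amount in plan:
            if action == "scale_out":
                system["replicas"] += amount      # enact the adaptation

def feedback_cycle(system):
    state = MonitorAgent().run(system)
    if AnalyzeAgent().run(state):
        ExecuteAgent().run(system, PlanAgent().run())

system = {"load": 0.9, "replicas": 2}
feedback_cycle(system)
print(system)   # {'load': 0.9, 'replicas': 3}
```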

Paper 78: Reverse Engineering State and Strategy Design Patterns using Static Code Analysis

Abstract: This paper presents an approach to detecting behavioral design patterns in source code using static analysis techniques. It builds on the concept of a Code Property Graph, enriching the graph with relationships and properties specific to design patterns in order to simplify pattern detection. The approach stores the graph in the NoSQL graph database Neo4j and uses the graph traversal language Gremlin for graph matching. It converts design pattern detection into a graph matching task by expressing design patterns as graph queries and running them against the graph database. (An illustrative sketch of pattern matching on such a graph follows this entry.)

Author 1: Khaled Abdelsalam Mohamed
Author 2: Amr Kamel

Keywords: Reverse engineering; source code analysis; design patterns; static analysis; graph matching; Gremlin; Joern; Neo4j

Download PDF
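
A minimal sketch of the "pattern as graph query" idea, using networkx in place of Neo4j and Gremlin: a tiny code property graph is searched for the structural core shared by the State and Strategy patterns (a context class holding a reference to an interface with several implementations). The node kinds and edge labels are invented, not the paper's schema.

```python
# Hypothetical code property graph and a State/Strategy-shaped match on it.
import networkx as nx

g = nx.DiGraph()
g.add_node("Context", kind="class")
g.add_node("Strategy", kind="interface")
g.add_node("ConcreteA", kind="class")
g.add_node("ConcreteB", kind="class")
g.add_edge("Context", "Strategy", rel="HAS_FIELD_OF_TYPE")
g.add_edge("ConcreteA", "Strategy", rel="IMPLEMENTS")
g.add_edge("ConcreteB", "Strategy", rel="IMPLEMENTS")

def strategy_like_candidates(g):
    # Find context -> interface field edges, then count implementations.
    for ctx, iface, data in g.edges(data=True):
        if data["rel"] != "HAS_FIELD_OF_TYPE":
            continue
        if g.nodes[iface]["kind"] != "interface":
            continue
        impls = [u for u, v, d in g.in_edges(iface, data=True)
                 if d["rel"] == "IMPLEMENTS"]
        if len(impls) >= 2:            # several interchangeable behaviors
            yield ctx, iface, impls

print(list(strategy_like_candidates(g)))
# [('Context', 'Strategy', ['ConcreteA', 'ConcreteB'])]
```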

Paper 79: OpenMP Implementation in the Characterization of an Urban Growth Model Cellular Automaton

Abstract: This paper presents the implementation of a parallelization strategy using the OpenMP library in a simulation tool based on a cellular automaton (CA) for running urban growth simulations. The characterization of the urban growth CA model is presented; it consists of a digitization process of land use that yields all the elements the CA needs to work. During the first simulation tests we observed high processing times due to the large number of calculations required for a single simulation; to reduce them, we implemented a parallelization strategy based on the fork-join model, making better use of the available hardware. The results show a significant improvement in execution times as a function of the number of available cores and the map size. As future work, we plan to incorporate artificial neural networks to generate more complex urban growth scenarios. (An illustrative sketch of the fork-join decomposition follows this entry.)

Author 1: Alvaro Peraza Garzón
Author 2: René Rodríguez Zamora
Author 3: Wenseslao Plata Rocha

Keywords: Cellular automata; parallel programming; simulation models; OpenMP; urban growth

Download PDF
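
A minimal sketch of the fork-join idea in Python rather than OpenMP (which the paper uses from compiled code): the CA grid is split into row blocks, each block's next state is computed by a separate worker, and the blocks are joined back into the full grid. The growth rule (a cell urbanizes if at least three neighbours are urban) is a placeholder, not the paper's model.

```python
# Hypothetical fork-join step over a CA grid using a process pool.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def step_rows(args):
    grid, lo, hi = args                            # whole grid + row range
    padded = np.pad(grid, 1)                       # zero border for neighbours
    out = np.zeros((hi - lo, grid.shape[1]), dtype=grid.dtype)
    for i in range(lo, hi):
        for j in range(grid.shape[1]):
            neighbours = padded[i:i + 3, j:j + 3].sum() - grid[i, j]
            out[i - lo, j] = 1 if (grid[i, j] or neighbours >= 3) else 0
    return lo, out

def parallel_step(grid, workers=4):
    bounds = np.linspace(0, grid.shape[0], workers + 1, dtype=int)
    tasks = [(grid, bounds[k], bounds[k + 1]) for k in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:   # fork
        results = list(pool.map(step_rows, tasks))
    nxt = np.empty_like(grid)
    for lo, block in results:                                # join
        nxt[lo:lo + block.shape[0]] = block
    return nxt

if __name__ == "__main__":
    grid = (np.random.default_rng(0).random((200, 200)) < 0.1).astype(np.int8)
    grid = parallel_step(grid)
    print(grid.sum(), "urban cells after one step")
```

Shipping the whole grid to every worker is wasteful but keeps the sketch simple; an OpenMP version would instead share the grid and parallelize the outer row loop with a `parallel for` directive.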
