The Science and Information (SAI) Organization
IJACSA Volume 9 Issue 8

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: Framework Utilizing Machine Learning to Facilitate Gait Analysis as an Indicator of Vascular Dementia

Abstract: Vascular dementia (VD), the second most common type of dementia, affects approximately 13.9 percent of people over the age of 71 in the United States alone, and 26% of individuals develop VD after being diagnosed with congestive heart failure. Memory and cognition are increasingly affected as dementia progresses; however, these are not the first symptoms to appear in some types of dementia. Alterations in gait and executive functioning have been associated with Vascular Cognitive Impairment (VCI). Research findings suggest that gait may be one of the earliest affected systems during the onset of VCI, immediately following a vascular episode. The diagnostic tools currently used for VD focus on memory impairment, which is only observed in later stages of VD. Hence, we propose a framework that isolates gait and executive functioning analysis by applying machine learning to predict VD before cognition is affected, so that pharmacological treatments can be used to postpone the onset of cognitive impairment. Over time, we hope to develop prediction algorithms that will not only identify but also predict vascular dementia.

Author 1: Arshia Khan
Author 2: Janna Madden
Author 3: Kristine Snyder

Keywords: Gait; machine learning; vascular dementia; early diagnosis; indicators; gait analysis

PDF

Paper 2: Recognition of Ironic Sentences in Twitter using Attention-Based LSTM

Abstract: Analyzing written language is an interesting topic that has been studied by many disciplines. Recently, owing to the explosive growth of the Internet, social media has become an attractive source of information for research on written communication. Different words in a sentence serve different purposes in conveying meaning and are of differing significance. Therefore, this paper employs the attention mechanism to find the relative contribution, or significance, of every word in a sentence. In this work, we address the problem of detecting whether a tweet is ironic by using an Attention-Based Long Short-Term Memory network. The results show that the proposed method achieves competitive performance on average recall and F1 score compared to state-of-the-art results.
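
The attention-pooling step this abstract refers to can be sketched independently of the LSTM: score each hidden state, turn the scores into weights with a softmax, and take the weighted sum. A minimal illustration in plain Python (the function name and toy vectors are ours, not the paper's):

```python
import math

def attention_pool(hidden_states, scores):
    """Soft attention: softmax the per-word scores into weights and
    return (weights, weighted sum of hidden-state vectors)."""
    m = max(scores)
    exp = [math.exp(s - m) for s in scores]   # numerically stable softmax
    total = sum(exp)
    weights = [e / total for e in exp]
    dim = len(hidden_states[0])
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(dim)]
    return weights, pooled
```

A word whose score dominates receives a weight near 1, so the pooled vector is driven by the words the model deems most indicative of irony.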

Author 1: Andrianarisoa Tojo Martini
Author 2: Makhmudov Farrukh
Author 3: Hongwei Ge

Keywords: Irony detection; attention; attention mechanism; sentiment analysis; long short-term memory

PDF

Paper 3: The Role of Camera Convergence in Stereoscopic Video See-through Augmented Reality Displays

Abstract: In the realm of wearable augmented reality (AR) systems, stereoscopic video see-through displays raise issues related to the user’s perception of three-dimensional space. This paper puts forward a few considerations regarding the perceptual artefacts common to standard stereoscopic video see-through displays with fixed camera convergence. Among the possible perceptual artefacts, the most significant is diplopia arising from reduced stereo overlap and excessively large screen disparities. Two state-of-the-art solutions are reviewed. The first suggests dynamically changing the virtual camera convergence via software, whereas the second suggests a matched hardware/software solution based on a series of predefined focus/vergence configurations. The potential and the limits of both solutions are outlined so as to provide the AR community with a yardstick for developing new stereoscopic video see-through systems suitable for different working distances.

Author 1: Fabrizio Cutolo
Author 2: Vincenzo Ferrari

Keywords: Augmented reality and visualization; stereoscopic display; stereo overlap; video see-through

PDF

Paper 4: Comparison of Event Choreography and Orchestration Techniques in Microservice Architecture

Abstract: Microservice Architecture (MSA) is an architectural design pattern introduced to solve the challenges of achieving horizontal scalability, high availability, modularity, and infrastructure agility in traditional monolithic applications. Though MSA brings a large set of benefits, it is challenging to design isolated services using the independent Database-per-Service pattern. We observed that, with each microservice having its own database, transactions spanning multiple services make it difficult to ensure data consistency across databases, particularly in the case of rollbacks. In monolithic applications using RDBMS databases, such distributed transactions and rollbacks can be handled efficiently using two-phase commit techniques. These techniques cannot be applied to the isolated NoSQL databases of microservices. This paper aims to: 1) elucidate the challenges of distributed transactions and rollbacks in isolated NoSQL databases with dependent collections in MSA, 2) examine the application of event choreography and orchestration techniques for Saga pattern implementation, and 3) present fact-based recommendations on Saga pattern implementations for these use cases.
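
The Saga pattern examined here replaces a distributed two-phase commit with a sequence of local transactions, each paired with a compensating action that undoes it. A minimal orchestration-style sketch, where a central coordinator runs each step and fires compensations in reverse order on failure (step and function names are illustrative, not from the paper):

```python
def run_saga(steps):
    """Run a saga: `steps` is a list of (action, compensation) callables.
    On any failure, compensate the already-completed steps in reverse
    order and report failure."""
    done = []
    for action, compensate in steps:
        try:
            action()
            done.append(compensate)
        except Exception:
            for comp in reversed(done):   # semantic rollback
                comp()
            return False
    return True
```

In the choreography variant there is no central coordinator: each service listens for the previous service's event and emits its own, including compensation events.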

Author 1: Chaitanya K. Rudrabhatla

Keywords: Microservice architecture; Database-per-Service pattern; Saga pattern; orchestration; event choreography; NoSQL database; two-phase commit

PDF

Paper 5: Location-based E-Commerce Services: (Re-) Designing using the ISO9126 Standard

Abstract: E-commerce services based on the user’s geographic location have emerged as a particularly important segment of modern information services. In these user-intensive applications, quality of service is important, and design methods increasingly rely on software standards to achieve quality. In this paper, we propose an evaluation model for location-based e-services that provides insight into how overall system quality can be strengthened by identifying the most important quality characteristics of specific facets of user-system interaction. The model categorizes location-based services into taxonomies of components/functions, which are further analyzed into interaction facets and significance levels. A further mapping to external qualitative sub-characteristics of the ISO9126 quality standard is used to formally decompose design quality into quality attributes. This view of software design through quality attributes is supported by a mathematical model, which calculates significance weights on service components, defined either by designers or by end users. An experiment in which this method is used to assess functionality is presented.
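
The significance-weight idea can be illustrated with a toy weighted aggregation over ISO9126 sub-characteristic scores; the sub-characteristic names and weight values below are invented for illustration, and the paper's actual mathematical model may differ:

```python
def quality_score(weights, scores):
    """Aggregate per-sub-characteristic scores (0..1) into one quality
    figure using designer- or user-defined significance weights."""
    total_w = sum(weights.values())
    return sum(weights[k] * scores[k] for k in weights) / total_w
```

Components whose facets carry larger significance weights pull the overall score harder, which is what directs redesign effort to the interactions that matter most.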

Author 1: Antonia Stefani
Author 2: Bill Vassiliadis
Author 3: Theofanis Efthimiades

Keywords: E-commerce; location based services; software quality; software design; ISO9126

PDF

Paper 6: Programming Technologies for the Development of Web-Based Platform for Digital Psychological Tools

Abstract: The choice of tools and programming technologies for building information systems is a relevant problem. For every projected system, a number of criteria must be defined for the development environment, the libraries, and the technologies used. The paper describes the choice of technological solutions using the example of a web-based platform developed for the Russian Academy of Education. This platform provides information support for psychologists’ research activities (including population and longitudinal studies). The system has the following features: a large scale and significant development time, requiring guaranteed computational reliability for a wide range of digital tools used in psychological research; operation in varied environments when conducting mass research in schools with differing computing resources and communication channels; scalability of services; security and privacy of data; and the use of technologies and programming tools that ensure compatibility and data conversion with other tools for processing psychological research. Criteria were introduced for the developed system that take into account the features of the software’s functioning and life cycle. A specific example shows the selection of appropriate technological solutions.

Author 1: Evgeny Nikulchev
Author 2: Dmitry Ilin
Author 3: Pavel Kolyasnikov
Author 4: Vladimir Belov
Author 5: Ilya Zakharov
Author 6: Sergey Malykh

Keywords: Psychological research tools; web-based platform; choice of the tools and programming technologies

PDF

Paper 7: Review of Prediction of Disease Trends using Big Data Analytics

Abstract: Big Data technologies promise to have a transformative impact in healthcare, public health, and medical research, among other application areas. Several intelligent machine learning techniques have been designed and used to provide big data predictive analytics solutions for different illnesses. Nevertheless, no research has been published on the prediction of allergy and respiratory system diseases, even though research findings across different cases are conducive to progress and further development in this area. One of the goals of this paper is to devise a systematic mapping study to explore and analyze existing research on disease prediction from healthcare information. Based on an investigation of research published from 2012 to today, we focus on studies concerning big data analytics. Given this high number of secondary studies, it is important to conduct a review and provide an overview of the research situation and current developments in this area.

Author 1: Diellza Nagavci
Author 2: Mentor Hamiti
Author 3: Besnik Selimi

Keywords: Big data; algorithms; data analytics; healthcare; disease prediction; data mining

PDF

Paper 8: The Role of Hyperspectral Imaging: A Literature Review

Abstract: Optical analysis techniques have recently been used to detect and identify objects in large collections of images, and hyperspectral imaging is one such technique. Human vision is based on three basic color bands (red, green, and blue), but spectral imaging divides vision into many more bands. Hyperspectral remote sensors acquire imagery data in the form of hundreds of adjoining spectral bands. In this paper, our purpose is to illustrate the fundamental concepts of hyperspectral remote sensing, remotely sensed information, methods for hyperspectral imaging, and applications based on hyperspectral imaging. Moreover, novel methods involving deep neural networks in the forensic context are elaborated. The ideas presented can be useful for further research in the field of hyperspectral imaging using deep learning.

Author 1: Muhammad Mateen
Author 2: Junhao Wen
Author 3: Nasrullah
Author 4: Muhammad Azeem Akbar

Keywords: Deep learning; electromagnetic spectrum; hyperspectral imaging; imaging spectroscopy; multispectral imaging; remote sensing

PDF

Paper 9: A Review on Scream Classification for Situation Understanding

Abstract: In our living environment, non-speech audio signals provide significant evidence for situation awareness, and they complement the information obtained from video signals. Among non-speech audio events, screaming is one in which people such as security guards, caretakers, and family members are particularly interested for care and surveillance, because screams are automatically taken as a sign of danger. Going beyond this assumption, this review specifically targets automated acoustic systems for the non-speech scream class, on the premise that screams can be further classified into various classes such as happiness, sadness, fear, and danger. Drawing on the scream audio detection and classification literature, a taxonomy is projected to highlight the target applications, significant sound features, classification techniques, and their impact on classification problems over the last few decades. This review will assist researchers in retrieving the most appropriate scream detection and classification techniques and the acoustic parameters for scream classification, which can help in understanding the vocalization condition of the speaker.

Author 1: Saba Nazir
Author 2: Muhammad Awais
Author 3: Sheraz Malik
Author 4: Fatima Nazir

Keywords: Scream classification; scream detection; acoustic parameters; surveillance; security

PDF

Paper 10: Performance Improvement of Web Proxy Cache Replacement using Intelligent Greedy-Dual Approaches

Abstract: This paper reports on how intelligent Greedy-Dual approaches based on supervised machine learning were used to improve web proxy caching performance. The proposed intelligent Greedy-Dual approaches predict the demand for significant web objects in web proxy caching using Naïve Bayes (NB), decision tree (C4.5), or support vector machine (SVM) classifiers. Accordingly, the proposed intelligent Greedy-Dual approaches effectively make the cache replacement decision based on the trained classifiers. Trace-driven simulation results on five real datasets indicated that, in terms of byte hit ratio and/or hit ratio, the performance of both the conventional Greedy-Dual-Size-Frequency (GDSF) and Greedy-Dual-Size (GDS) policies was noticeably enhanced by applying the proposed Greedy-Dual approaches.
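
For context, the conventional Greedy-Dual-Size (GDS) policy that these approaches build on assigns each cached object a credit H = L + cost/size, where L is a global inflation value raised to the credit of each evicted victim, so recently admitted objects outrank long-untouched ones. A minimal sketch of plain GDS with unit cost, not the paper's classifier-augmented variant:

```python
import heapq

class GDSCache:
    """Greedy-Dual-Size replacement sketch with a lazy-deletion min-heap."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.used = 0
        self.L = 0.0           # global inflation ("aging") value
        self.entries = {}      # key -> (credit H, size)
        self.heap = []         # (H, key); stale pairs skipped on pop

    def access(self, key, size, cost=1.0):
        """Admit or refresh `key`, evicting lowest-credit objects as needed."""
        if key in self.entries:
            _, size = self.entries[key]        # refresh with stored size
        else:
            if size > self.capacity:
                return                          # larger than the whole cache
            while self.used + size > self.capacity:
                h, victim = heapq.heappop(self.heap)
                entry = self.entries.get(victim)
                if entry and entry[0] == h:     # ignore stale heap pairs
                    self.L = h                  # inflate the baseline
                    self.used -= entry[1]
                    del self.entries[victim]
            self.used += size
        h = self.L + cost / size
        self.entries[key] = (h, size)
        heapq.heappush(self.heap, (h, key))
```

GDSF additionally multiplies the cost term by an access-frequency counter; the paper's contribution is to let a trained classifier steer which objects receive credit.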

Author 1: Waleed Ali

Keywords: Cache replacement; Greedy-Dual approaches; machine learning; proxy

PDF

Paper 11: El Niño / La Niña Identification based on Takens Reconstruction Theory

Abstract: An identification method for earth observation data exhibiting chaotic behavior, based on Takens’ reconstruction theory, is proposed. The proposed method is examined using observed time series data of Sea Surface Temperature (SST) and the Southern Oscillation Index (SOI). The experimental results show that the identification time of the proposed method is no later than that of the existing method. Using the definitions of the Japan Meteorological Agency and the proposed equations, El Niño / La Niña can be identified at an earlier time. In other words, the proposed method does not require ten months of numerical values for identification; the time required for its identification judgment is about one month. The proposed method is not based on extrapolation with a numerical model or governing equation, but on interpolation using only the actually observed time series.
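
The core of Takens' reconstruction is the delay-coordinate embedding, which maps a scalar series such as SST into state-space vectors. A minimal sketch (the embedding dimension and delay values used below are illustrative, not the paper's choices):

```python
def delay_embed(x, dim, tau):
    """Takens delay-coordinate embedding of a scalar time series:
    maps x[t] to the vector (x[t], x[t+tau], ..., x[t+(dim-1)*tau]),
    reconstructing a trajectory in dim-dimensional state space."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return [tuple(x[t + i * tau] for i in range(dim)) for t in range(n)]
```

Identification then works by comparing the geometry of the reconstructed trajectory against reference behavior, rather than extrapolating with a numerical model.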

Author 1: Kohei Arai
Author 2: Kaname Seto

Keywords: Time series analysis; Takens; sea surface temperature: SST; southern oscillation index: SOI; El Niño-southern oscillation: ENSO

PDF

Paper 12: Adaptive Return of e-Training (ROT) based on Communication Technology

Abstract: Persistent economic insecurity and harsh austerity measures across the world push businesses either to cut training costs or to be very painstaking in choosing a training program that delivers palpable outcomes in a short period of time. Nevertheless, in most cases businesses are still unable to reckon the Return of e-Training (ROT) in advance for better allocation of the training budget and for deciding on a proper training plan in line with business policy. The purpose of this paper is to appraise the practical worth, applicability, and usability of Adaptive ROT in enterprises, with particular regard to evaluating the impact of e-training in companies. A case study gauging the profit of e-training in the Blackboard system was conducted. The outcome of this study is judged to be positive, given the efficacy of the Adaptive ROT Evaluation Model for e-training in companies.

Author 1: Fahad Alotaibi

Keywords: Return of e-Training (ROT); evaluation models; blackboard; e-learning; Key Performance Indicator (KPI)

PDF

Paper 13: Comparison of Hash Function Algorithms Against Attacks: A Review

Abstract: Hash functions are key components of nearly all cryptographic protocols, as well as of many security applications such as message authentication codes, data integrity, password storage, and random number generation. Many hash function algorithms have been proposed to ensure the authentication and integrity of data, including MD5, SHA-1, SHA-2, SHA-3, and RIPEMD. This paper gives an overview of these standard algorithms and focuses on their limitations against common attacks. The study shows that these standard hash function algorithms suffer from collision attacks and time inefficiency. Other types of hash functions are also compared with the standard algorithms in their resistance against common attacks, showing that these algorithms are still too weak to resist collision attacks.
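
Python's standard hashlib module exposes the surveyed algorithms directly, which makes it easy to compare their digests side by side; note that digest length alone says nothing about strength, since MD5 and SHA-1 have practical collision attacks despite producing full-length digests:

```python
import hashlib

# One message hashed with four of the surveyed algorithms.
msg = b"The quick brown fox jumps over the lazy dog"
for name in ("md5", "sha1", "sha256", "sha3_256"):
    digest = hashlib.new(name, msg).hexdigest()
    print(f"{name:>8}: {len(digest) * 4}-bit digest  {digest}")
```

A collision attack means finding two distinct messages with the same digest faster than the generic birthday bound, which is exactly what has been demonstrated for MD5 and SHA-1.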

Author 1: Ali Maetouq
Author 2: Salwani Mohd Daud
Author 3: Noor Azurati Ahmad
Author 4: Nurazean Maarop
Author 5: Nilam Nur Amir Sjarif
Author 6: Hafiza Abas

Keywords: Hash function algorithms; MD5; RIPEMD-160; SHA-1; SHA-2; SHA-3

PDF

Paper 14: An Analysis of Cloud Computing Adoption Framework for Iraqi e-Government

Abstract: This paper presents an analysis of the factors that could possibly affect the adoption of cloud computing by the Iraqi e-government. A conceptual framework model for cloud computing within the Iraqi e-government is proposed, analyzed, evaluated, and discussed.

Author 1: Ban Salman Shukur
Author 2: Mohd Khanapi Abd Ghani
Author 3: M.A. Burhanuddin

Keywords: e-Government; cloud computing; framework; Iraq; Iraqi e-government

PDF

Paper 15: A Survey on Tor Encrypted Traffic Monitoring

Abstract: Tor (The Onion Router) is an anonymity tool that is widely used worldwide. Tor protects its users’ privacy against surveillance and censorship using strong encryption and obfuscation techniques, which makes it extremely difficult to monitor and identify user activity on the Tor network. It also implements strong defenses to protect users against traffic feature extraction and website fingerprinting. However, this strong anonymity has also become a haven for criminals seeking to avoid network tracing. Therefore, numerous studies have been performed on analyzing and classifying encrypted traffic using machine learning techniques. This paper presents a survey of existing approaches to the classification of Tor and other encrypted traffic. It begins with a preliminary discussion of machine learning approaches and the Tor network, followed by a comparison of the surveyed traffic classification approaches and a discussion of their classification properties.

Author 1: Mohamad Amar Irsyad Mohd Aminuddin
Author 2: Zarul Fitri Zaaba
Author 3: Manmeet Kaur Mahinderjit Singh
Author 4: Darshan Singh Mahinder Singh

Keywords: Encrypted traffic monitoring; Tor; machine learning; security; survey

PDF

Paper 16: The Implementation of Computer based Test on BYOD and Cloud Computing Environment

Abstract: Computer-based testing promises several benefits such as automatic grading, assessment features, and paper efficiency. Beyond these benefits, however, the organization must prepare adequate infrastructure, network connectivity, and user education. The problem grows when hundreds of users join a computer-based test. This article proposes Bring-Your-Own-Devices (BYOD) and the cloud computing approach to accommodate hundreds of exam participants. Through an experiment with 393 students, the article identifies five central practices that can be used by organizations that want to implement massive-scale computer-based testing.

Author 1: Ridi Ferdiana
Author 2: Obert Hoseanto

Keywords: Computer-based test; cloud computing; Bring-Your-Own-Devices (BYOD)

PDF

Paper 17: Method for Designing Scalable Microservice-based Application Systematically: A Case Study

Abstract: Microservices are a new transformation of Service-Oriented Architecture (SOA) that is gaining momentum in both academia and industry. The success of microservices began when giant companies like Netflix adopted them as the service architecture for serving customers. The monolithic architecture previously used by Netflix was no longer able to cope with business growth and was difficult to scale to meet user demands. Although Netflix has been successful with microservice architecture, no systematic method has been introduced for producing microservices. Academic studies related to microservices are still at an early stage and have not yet reached maturity. A method is therefore needed to help organizations systematically design microservices and replicate the success achieved by Netflix. To form such a method, existing approaches to building microservices were studied. Based on the Design Science Research method, two research artefacts have been produced. The first artefact is a systematic microservice design method with four main steps. The second artefact is an instantiation, applying the proposed design method to a case study, namely MyFlix. The new method was then evaluated by obtaining expert opinions through demonstration and interviews. The expert assessment found that the proposed method is able to produce a systematic microservice design based on the six proposed principles and the four main steps. The method can also produce microservices with the desired properties of cohesion, loose coupling, distribution, and decentralization, which contribute to a scalable system.

Author 1: Ahmad Tarmizi Abdul Ghani
Author 2: Mohamad Shanudin Zakaria

Keywords: Microservice; systematic method; scalable microservice design

PDF

Paper 18: Adaptive Simulated Evolution based Approach for Cluster Optimization in Wireless Sensor Networks

Abstract: Minimizing energy consumption is crucial for the constrained sensors in wireless sensor networks (WSNs). Partitioning a WSN into an optimal set of clusters is a promising technique for minimizing energy consumption and increasing the lifetime of the network. However, optimizing the network into an optimal set of clusters is an NP-hard problem, and the time needed to solve it increases exponentially with the number of sensors. In this paper, the simulated evolution (SimE) algorithm is engineered to tackle the problem of cluster optimization in WSNs. A goodness measure is developed to assess the accuracy of assigning nodes to clusters and to evaluate the clustering quality of the overall network. SimE was designed so that the number of clusters and cluster heads adapts to the number of alive nodes in the network. Extensive simulation results demonstrate that SimE provides near-optimal clustering and improves the lifetime of the network by about 21% compared to the traditional LEACH-C protocol.

Author 1: Abdulaziz Alsayyari

Keywords: Clustering algorithm; cluster optimization; network lifetime; simulated evolution; wireless sensor networks

PDF

Paper 19: Investigating the Acceptance of Mobile Health Application User Interface Cultural-Based Design to Assist Arab Elderly Users

Abstract: Mobile health (m-health) applications are one way to provide solutions to the non-availability of physical health services in the Arab world. However, end users of m-health around the world have cultural and personal differences that distinguish them from others, and studies suggest that culture is an essential component of the success of any product or technology. In view of this, this study investigated acceptance of a mobile health application User Interface (UI) designed for Arab elderly users based on their culture. The TAM model formed the theoretical basis, upon which a quantitative design was adopted with a questionnaire as the data collection instrument for 134 participants. The findings showed that Perceived Ease of Use (PEOU) and Attitude Towards Use (ATU) had a significant positive influence on Behavioural Intention (BI) to use the mobile health application UI. Overall, the results indicated that Arab elderly users found the mobile health application UI acceptable due to its culture-based design. To improve the design of mobile application UIs targeting elderly users, it is vital to gain insight into the cultural aspects that influence the usability of m-health application UIs, as well as into users’ personal characteristics and experiences.

Author 1: Ahmed Alsswey
Author 2: Irfan Naufal Bin Umar
Author 3: Brandford Bervell

Keywords: TAM; elderly users; mobile health applications; user interface; culture

PDF

Paper 20: Acoustic Classification using Deep Learning

Abstract: Acoustic classification is an important methodology for perceiving sounds from the environment; with it, machines such as smartphones, software systems, and security systems can gain a hearing capability under different conditions. Such systems can be implemented with conventional machine learning or with deep learning models, which have revolutionized speech recognition and the understanding of general environmental sounds. This work focuses on acoustic classification and improves the performance of deep neural networks by using hybrid feature extraction methods. The study improves classification efficiency in extracting features and predicting the cost graph. We adopted a hybrid feature extraction scheme combining DNN and CNN. The results show a 12% improvement over previous results using the mixed feature extraction scheme.

Author 1: Muhammad Ahsan Aslam
Author 2: Muhammad Umer Sarwar
Author 3: Muhammad Kashif Hanif
Author 4: Ramzan Talib
Author 5: Usama Khalid

Keywords: Acoustics; deep learning; machine learning; neural networks; audio sounds

PDF

Paper 21: Quality Assurance for Data Analytics

Abstract: Quality assurance is a technique for ensuring overall software quality, as suggested by global standards bodies such as the IEEE. Quality assurance for data analytics requires more time and a very different set of skills, because software products used for data analytics differ from traditional ones. These products require more complex algorithms to operate, and ensuring their quality therefore demands more advanced techniques. According to our survey, data analytical software products require more work because of their more complex nature; one possible reason is the volume and variety of the data involved. At the same time, this research emphasizes the testing of data analytical software products, which raises many issues because such testing requires real data. In practice, however, testing is based either on dummy data or on simulations, and these products fail when they operate in real time. To make these products work well before and after deployment, certain quality standards must be defined. In this way, we can obtain analytics software products that produce better results.

Author 1: Rakesh Kumar
Author 2: Birth Subhash
Author 3: Maria Fatima
Author 4: Waqas Mahmood

Keywords: Software Quality Assurance (SQA); data analytical softwares; data driven softwares; real time analytics; data analytics; quality issues; quality control

PDF

Paper 22: Developing Communication Strategy for Multi-Agent Systems with Incremental Fuzzy Model

Abstract: Communication can guarantee coordinated behavior in multi-agent systems. However, in many real-world problems, communication may not be available at all times because of limited bandwidth, noisy environments, or communication cost. In this paper, we introduce an algorithm to develop a communication strategy for cooperative multi-agent systems in which communication is limited. The method employs a fuzzy model to estimate the benefit of communication in each possible situation, specifying the minimal communication necessary for successful joint behavior. An incremental method is also presented to create and tune the fuzzy model, reducing the high computational complexity of multi-agent systems. We use several standard benchmark problems to assess the performance of the proposed method. Experimental results show that the generated communication strategy performs as well as a full-communication strategy while the agents use little communication.

Author 1: Sam Hamzeloo
Author 2: Mansoor Zolghadri Jahromi

Keywords: Multi-agent systems; decentralized partially observable Markov decision process; communication; planning under uncertainty; fuzzy inference systems

PDF

Paper 23: OpenSimulator based Multi-User Virtual World: A Framework for the Creation of Distant and Virtual Practical Activities

Abstract: The exponential growth of technology has contributed to a positive revolution in distance learning. E-learning is increasingly used in the transfer of knowledge, where instructors can model and script their courses in several formats such as files, videos, and quizzes. To complete such courses, practical activities are very important. Several instructors have joined Multi-User Virtual World (MUVW) communities such as Second Life, as they offer a degree of realism and interaction between users. However, modeling and scripting practical activities in MUVWs remains a very difficult task, given the technologies these MUVWs use and the prerequisites they demand. In this paper, we propose a framework for OpenSimulator MUVWs that simplifies the scripting of practical activities using the OpenSpace3D software, without requiring designers to have expertise in programming or coding.

Author 1: MOURDI Youssef
Author 2: SADGAL Mohamed
Author 3: BERRADA FATHI Wafaa
Author 4: EL KABTANE Hamada

Keywords: E-Learning; multi-user virtual world; practical activities; OpenSimulator; virtual reality; virtual laboratories

PDF

Paper 24: Performance Evaluation of Cloud Computing Resources

Abstract: Cloud computing is an emerging and rapidly growing information technology. However, measuring the performance of cloud-based applications in real environments is a challenging task for both the research and business communities. In this work, we focus on the Infrastructure-as-a-Service (IaaS) facet of cloud computing and evaluate the performance of two renowned public and private cloud platforms. Several performance metrics, such as integer and floating-point performance, GFLOPS, read, random read, write, random write, bandwidth, jitter, and throughput, were used to analyze the performance of cloud resources. The aim of this analysis is to help cloud providers adjust their data center parameters under different working conditions, and to help cloud customers monitor their hired resources. We analyzed and compared the performance of the OpenStack and Windows Azure platforms by considering resources such as CPU, memory, disk, and network in a real cloud setup. To evaluate each feature, we used related benchmarks: Geekbench and LINPACK for CPU performance, RAMspeed and STREAM for memory performance, IOzone for disk performance, and Iperf for network performance. Our experimental results showed that the performance of both clouds is almost the same; however, OpenStack seems to be the better option compared to Windows Azure, keeping in view its cost as well as its network performance.

Author 1: Muhammad Sajjad
Author 2: Arshad Ali
Author 3: Ahmad Salman Khan

Keywords: Cloud computing; OpenStack; Windows Azure

PDF

Paper 25: Intelligent Model Conception Proposal for Adaptive Hypermedia Systems

Abstract: The context of this article is to study and propose solutions for the major problems of adaptive hypermedia systems. The works and models proposed for these systems follow the tradition of first studying theories and rules, then modeling and designing a system that implements them. As a result, the adaptive hypermedia systems designed reflect and support only the elements and information studied during the design phase. These systems also require a huge amount of data to feed their architecture before they can start operating; this well-known problem is called the “cold start” and remains a challenge to this day. In this paper, we therefore propose an intelligent and flexible model, inspired by human nature, that offers a promising solution to these problems of adaptive hypermedia systems.

Author 1: Mehdi TMIMI
Author 2: Mohamed BENSLIMANE
Author 3: Mohammed BERRADA
Author 4: Kamar OUAZZANI

Keywords: Adaptive hypermedia; artificial intelligence; deep learning; learner model; domain model; adaptation model; brain; neuron

PDF

Paper 26: Cost Aware Resource Selection in IaaS Clouds

Abstract: One of the main challenges in cloud computing is to cope with the selection of cost-efficient resources. Various cloud service providers dynamically provide resources to customers through different pricing policies. Given the providers' different APIs and pricing policies, it becomes difficult for customers to select the best service provider in terms of cost. In some cases, if the usage of the resources provided by a datacenter exceeds a certain limit, the provider cannot offer more resources to customers because new VMs cannot be created. Hence, even if the customer chooses the best provider based on the least-cost parameter, there is still no guarantee that the provider can allocate the complete resources to the customer. For this reason, I present a system architecture that selects the best service provider based on customer requirements, mainly cost. The proposed architecture also performs resource management by automatically provisioning new VMs from the available service providers in the inter-cloud. The proposed system covers five clouds: Amazon EC2, CloudSigma, Google, GoGrid, and Windows Azure. An interface is designed for obtaining the user requirements; these requirements are matched against a database of the five cloud providers, and based on the matched values a catalog of optimal costs for each cloud is shown to the user. A Cost Aware Resource Selection algorithm is then run to determine the lowest optimal cost for an instance-based approach and a quantity-based approach. The algorithm tackles two cloud domains: single cloud and multi-cloud.
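The core selection step described above can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: given user requirements and per-provider instance catalogs (names and prices invented here), pick the cheapest offer that satisfies the request.

```python
# Hypothetical cost-aware selection sketch: provider names, instance
# shapes, and prices below are illustrative, not from the paper.

def select_provider(requirements, catalogs):
    """Return (provider, hourly_cost) of the cheapest matching offer."""
    candidates = []
    for provider, instances in catalogs.items():
        for inst in instances:
            # An offer qualifies only if it meets every stated requirement.
            if (inst["vcpus"] >= requirements["vcpus"]
                    and inst["ram_gb"] >= requirements["ram_gb"]):
                candidates.append((provider, inst["cost_per_hour"]))
    if not candidates:
        return None  # no provider can allocate the requested resources
    return min(candidates, key=lambda c: c[1])

catalogs = {
    "ProviderA": [{"vcpus": 2, "ram_gb": 4, "cost_per_hour": 0.10},
                  {"vcpus": 4, "ram_gb": 8, "cost_per_hour": 0.19}],
    "ProviderB": [{"vcpus": 4, "ram_gb": 8, "cost_per_hour": 0.17}],
}
best = select_provider({"vcpus": 4, "ram_gb": 8}, catalogs)
print(best)  # ('ProviderB', 0.17)
```

Returning `None` when no catalog entry qualifies mirrors the abstract's point that even the cheapest provider may be unable to allocate the requested resources.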

Author 1: Uzma Bibi

Keywords: Cloud computing; pay-as-you-go; infrastructure as a service; cost aware resource selection; virtual machines; hypervisor; instance based approach; quantity based approach; single cloud; multi cloud

PDF

Paper 27: ECG Abnormality Detection Algorithm

Abstract: The monitoring and early detection of abnormalities in cardiac-cycle morphology have a significant impact on the prevention of heart diseases and their associated complications. The electrocardiogram (ECG) is very effective in detecting irregularities of heart-muscle function. In this work, we investigate the real-time detection of possible abnormalities in the ECG signal and the identification of the corresponding heart disease using an efficient algorithm. The algorithm relies on cross-correlation theory and incorporates two cross-correlation steps: the first detects an abnormality in a real-time ECG trace, while the second identifies the corresponding disease. The optimization of search time is the main advantage of this algorithm.
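The building block of both steps is cross-correlation against a template. A minimal sketch (not the authors' implementation, which slides templates across a continuous trace) of the core operation: a high normalized correlation between an ECG segment and a disease template flags a likely match.

```python
# Normalized cross-correlation sketch; the template and trace values
# below are toy numbers, not real ECG data.

def norm_xcorr(x, y):
    """Normalized cross-correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

template = [0, 1, 5, 1, 0, -1, 0]   # toy "abnormal beat" template
trace    = [0, 2, 10, 2, 0, -2, 0]  # amplitude-scaled copy of the template
print(norm_xcorr(trace, template))  # close to 1.0: shapes match
```

Because the measure is normalized, it is insensitive to amplitude scaling, which is why the scaled copy still correlates near 1.0.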

Author 1: Soha Ahmed
Author 2: Ali Hilal-Alnaqbi
Author 3: Mohamed Al Hemairy
Author 4: Mahmoud Al Ahmad

Keywords: Cross-correlation; abnormalities detection; electrocardiogram (ECG); cardiac cycle; eHealth; remote monitoring; algorithm

PDF

Paper 28: Efficient Resource Consumption by Dynamic Clustering and Optimized Routes in Wireless Sensor Networks

Abstract: Energy is an important parameter in wireless sensor networks and must be managed across different applications. We propose a new energy-efficient routing algorithm that combines several approaches, namely dynamic clustering, spanning trees and self-configurable routing, and controls energy consumption through data-driven and power-management schemes. It has two main phases: the first consists of cluster formation, cluster-head election and spanning-tree creation within each cluster; the second is data transmission. The proposed protocol is compared with four other protocols in terms of network lifetime, network balance, average packet delay and packet delivery. Simulation results show that the proposed protocol's network lifetime is about 6 per cent higher than Improved-LEACH, 21.5 per cent higher than EESR and 5.8 per cent higher than DHCO. Its improvement in packet delivery is about 3.5 per cent over Improved-LEACH, 6.5 per cent over EESR and 3 per cent over DHCO. In addition, its packet-delay performance is about 17 per cent better than EESR and 6 per cent better than DHCO, although Improved-LEACH outperforms our protocol by about 4 per cent.
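A common rule in energy-aware clustering protocols, and a plausible reading of the cluster-head election step above, is to elect the node with the highest residual energy. The sketch below illustrates that rule only; the paper's actual election criteria are not specified in the abstract.

```python
# Illustrative cluster-head election by residual energy; node IDs and
# energy values are invented for the example.

def elect_cluster_head(cluster):
    """cluster: dict of node_id -> residual energy (J). Returns the head."""
    return max(cluster, key=cluster.get)

cluster = {"n1": 0.42, "n2": 0.77, "n3": 0.31}
print(elect_cluster_head(cluster))  # n2: the node with the most energy left
```

Rotating this election as energy drains is what balances consumption across the cluster over the network lifetime.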

Author 1: Farzad Kiani

Keywords: Energy efficiency; data-driven; spanning tree; sleep/wake up mode; power management

PDF

Paper 29: A Method of Automatic Domain Extraction of Text to Facilitate Retrieval of Arabic Documents

Abstract: Arabic content on the internet has increased with the growth in the number of Arabic speakers using the internet worldwide. Accordingly, this study introduces an automatic approach to domain extraction for information retrieval from such content, based on text classification. Text classification makes the search domain-specific, which facilitates the searching process. This paper discusses how to enhance information-retrieval capacity for Arabic documents by automatically classifying unlabelled Arabic text using text-classification algorithms. The classification of documents and texts is an important field in computer science and information retrieval; it aims to enhance the retrieval process by identifying the search domain of retrieval systems.

Author 1: Mohammad Khaled A. Al-Maghasbeh
Author 2: Mohd Pouzi bin Hamzah

Keywords: Arabic information retrieval; text classification; Arabic text mining; Arabic language processing; text clustering; text categorization; classification algorithms

PDF

Paper 30: Features and Potential Security Challenges for IoT Enabled Devices in Smart City Environment

Abstract: The introduction of the Internet of Things into our lives has brought drastic changes in social norms, working habits, ways of completing tasks and planning for the future. Data about our interactions with everyday objects can be effectively transmitted to their destinations by many communicating tags, which also often provide specific location information. The risk of potential eavesdropping is always a major concern for data owners. Since the Internet of Things primarily carries data of smart objects that are mostly connected over wireless technologies, securing the information carried by these wireless links to safeguard private information is of utmost importance. Encrypting the data carried by IoT networks is one possibility, but it is often not feasible due to the lack of sufficient computing resources at the sensor end of IoT devices. In this paper, we discuss various security issues that haunt secure IoT deployments and propose a layered solution model that prevents breaches of security during data transmission.

Author 1: Gasim Alandjani

Keywords: IoT; privacy; smart city; smart society; actuators; sensors; Industry 4.0; 5G

PDF

Paper 31: Comparative Study of PMSG Controllers for Variable Wind Turbine Power Optimization

Abstract: With the large increase in wind-power generation, the direct-driven Permanent Magnet Synchronous Generator (PMSG) is the most promising technology for variable-speed operation, and it also fulfills grid requirements with high efficiency. This paper studies and compares a conventional PI-based controller and a proposed control technique for a direct-driven PMSG wind turbine. The generator model is established in the Park synchronous rotating d-q reference frame. To achieve maximum power capture, the aeroturbine is controlled through Maximum Power Point Tracking (MPPT), while the PMSG control is handled through field orientation, where the two current control loops are regulated. A direct-current-based d-q vector control design is developed by integrating an Internal Model Controller. An optimal control is then developed for the integrated control of PMSG power optimization and Voltage Source Converter control. The system was designed using the SimWindFarm Matlab/Simulink toolbox to evaluate the performance of the conventional and proposed control techniques for the PMSG wind turbine. The analysis and simulation results prove the effectiveness and robustness of the proposed control strategy.

Author 1: Asma Hammami
Author 2: Imen Saidi
Author 3: Dhaou Soudani

Keywords: Wind turbine; internal model control; PI controller; Permanent Magnet Synchronous Generator (PMSG); vector control

PDF

Paper 32: Impact and Challenges of Requirements Management in Enterprise Resource Planning (ERP) via ERP Thesaurus

Abstract: Managing requirements efficiently helps the system design team understand the existence and significance of each individual requirement. Numerous requirements-management practices support decision making, but many fail to account for the important factors that substantially influence requirements management, particularly in the context of ERP systems. As discussed comprehensively in the literature review section, requirements-management failure is one of the pivotal causes of project failure. A prime shortcoming in software design and development is that, when it comes to requirements management, the most vital step, thinking before performing activities, is often ignored, even though it is the main step for saving time, money and effort. Further questions arise around the software already running in industry: when does a business need an ERP system, and when requirements change or new requirements emerge, what obstacles are faced and how are they overcome? ERP systems are becoming a necessity for industries, as many face problems such as data loss, difficulty for owners in fetching all the information when they need it, and slow, time-consuming accounting systems. This paper further illustrates in detail the important traits and issues businesses may face when ERP is implemented, the issues faced by the requirements-engineering team and industries when requirements change or are not managed professionally, and how to resolve them.

Author 1: Rahat Izhar
Author 2: Dr. Shahid Nazir Bhatti
Author 3: Saba Izhar
Author 4: Dr. Amr Mohsen Jadi

Keywords: Enterprise Resource Planning (ERP); Product Owner (PO); requirements elicitation; requirements management; change management

PDF

Paper 33: Implementation of Blended Learning in Teaching at the Higher Education Institutions of Pakistan

Abstract: Blended learning has emerged as one of the solutions to address the various needs of Higher Education Institutions around the world. Blended learning is the combination of the traditional classroom and online endeavour, providing the advantages of both face-to-face learning and e-learning. The main purpose of this study is to assess the level of adoption of blended learning in the teaching process at Higher Education Institutions. The study employed a mixed-methods approach using an explanatory sequential model. Teachers of general public universities were included as the sample, and questionnaires and interviews were used as data-gathering tools. The main findings showed that teachers have a positive perception of technology usage in the teaching process; most possessed expertise in the use of different software and were equipped with internet skills. The study concluded that, in terms of blended-learning implementation, universities are still at the awareness level and considerable effort is required for effective implementation. It is recommended that university administrations provide additional computing infrastructure (e.g. servers, bandwidth and storage capacity) to run courses in a blended format, and that blended learning be well defined and highlighted in universities' strategic plans.

Author 1: Saira Soomro
Author 2: Arjumand Bano Soomro
Author 3: Tariq Bhatti
Author 4: Najma Imtiaz Ali

Keywords: Blended learning; teaching-learning; university teachers

PDF

Paper 34: The Measurement of Rare Plants Learning Media using Backward Chaining Integrated with Context-Input-Process-Product Evaluation Model based on Mobile Technology

Abstract: This research aimed to determine the effectiveness of learning-media utilization for introducing the rare plants of the Alas Kedaton tourism forest in Tabanan, Bali, based on backward chaining, to students and the general public. The research is explorative and evaluative in type. The study population comprised the plant species in the Alas Kedaton tourism forest, and the human population comprised the entire community in its area. Plant species were sampled using the quadrat method, while human samples were drawn by purposive sampling. The collected data were analyzed descriptively. The results indicate that, through the learning media, information was obtained on the number of rare plant species in the Alas Kedaton tourism forest, 48 species across 26 families, as well as the factors causing their scarcity. Using the CIPP (Context-Input-Process-Product) evaluation model assisted by mobile technology, the overall average effectiveness of learning-media utilization for introducing the rare plants of the Alas Kedaton tourism forest based on backward chaining was 88.20%, which falls into the good category.

Author 1: Nyoman Wijana
Author 2: Ni Nyoman Parmithi
Author 3: I Gede Astra Wesnawa
Author 4: I Made Ardana
Author 5: I Wayan Eka Mahendra
Author 6: Dewa Gede Hendra Divayana

Keywords: Rare plants species; backward chaining; evaluation; CIPP; mobile technology

PDF

Paper 35: Energy Consumption Evaluation of AODV and AOMDV Routing Protocols in Mobile Ad-Hoc Networks

Abstract: Mobile Ad-hoc Networks (MANETs) are mobile, multi-hop wireless networks that can be set up anytime, anywhere, without pre-existing infrastructure. Due to their dynamic topology, the main challenge in such networks is to design dynamic routing protocols that are efficient in terms of energy consumption and produce less overhead. The main emphasis of this research is on the prominent issues of MANETs, such as energy efficiency and scalability, along with some traditional metrics for performance evaluation. The two reactive routing protocols studied are single-path AODV and multi-path AOMDV. Extensive simulation covering ten scenarios was carried out in the NS2 simulator. The results revealed that AOMDV performs better than AODV in terms of throughput, packet delivery fraction and end-to-end delay. However, in terms of energy consumption and NRL, AODV performed better than AOMDV.

Author 1: Fawaz Mahiuob Mohammed Mokbal
Author 2: Khalid Saeed
Author 3: Wang Dan

Keywords: MANETs; routing protocols; AODV; AOMDV; energy efficiency; routing performance

PDF

Paper 36: Piezoelectric based Biosignal Transmission using Xbee

Abstract: This paper showcases the development of an innovative healthcare solution that allows patients to be monitored remotely. The system uses a piezoelectric sheet sensor and the XBee wireless communication protocol to collect the heartbeat pressure signal from the human subject's neck and transmit it to a receiving node. Signal-processing techniques then extract a set of important vital parameters, such as heart rate and blood pressure, from the received signal; these parameters are needed to assess the subject's health continuously and in a timely manner. The architecture of our system, which enables wireless transmission of the raw acquired physiological signal, has three advantages over existing systems. First, it increases the user's mobility because the XBee wireless communication protocol is employed for signal transmission. Second, it increases usability, since the user has to carry only a single unit for signal acquisition while preprocessing is performed remotely. Third, it gives more flexibility in acquiring various vital parameters with great accuracy, since processing is done remotely on powerful computers.
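One of the extraction steps mentioned, heart rate, can be sketched as peak detection on the received pressure signal: find local maxima above a threshold and average the peak-to-peak interval. This is a simplified illustration only; the sampling rate, threshold, and waveform below are invented, not from the paper.

```python
# Toy heart-rate estimation from a periodic pressure-like signal.

def heart_rate_bpm(signal, fs, threshold):
    """Estimate beats per minute from peak-to-peak intervals."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > threshold
             and signal[i] >= signal[i - 1]
             and signal[i] > signal[i + 1]]
    if len(peaks) < 2:
        return None  # need at least two beats to form an interval
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

fs = 100  # Hz (assumed sampling rate)
beat = [0, 0, 1, 3, 1, 0] + [0] * 94  # one synthetic beat per second
signal = beat * 5
print(round(heart_rate_bpm(signal, fs, threshold=2)))  # 60 bpm
```

Averaging intervals rather than counting peaks makes the estimate robust to partial beats at the edges of the analysis window.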

Author 1: Mohammed Jalil
Author 2: Mohamed Al Hamadi
Author 3: Abdulla Saleh
Author 4: Omar Al Zaabi
Author 5: Soha Ahmed
Author 6: Walid Shakhatreh
Author 7: Mahmoud Al Ahmad

Keywords: Piezoelectric; XBee; medical sensors; vital signs; remote health monitoring

PDF

Paper 37: Performance Evaluation of a Smart Remote Patient Monitoring System based Heterogeneous WSN

Abstract: This paper investigates the development of a remote patient monitoring system based on a Wireless Body Sensor Network (WBAN). The main purpose of the design is to interconnect heterogeneous sensor networks not equipped with the HTTP/TCP/UDP stack. A novel gateway architecture is proposed to ensure interoperability and facilitate seamless access to data from different types of body sensors that communicate via different technologies, namely Bluetooth, IEEE 802.15.4/ZigBee and IEEE 802.15.6. Moreover, an application-layer approach for a Web Service Gateway is developed for interaction with heterogeneous WSNs. The gateway communicates with the server via the SOAP protocol and manages service consumption. Since the proposed platform is targeted at monitoring patient health status, a preliminary quality-of-service test of the link between the sensor and the server is essential. To evaluate the performance of the platform, results were compared across different communication scenarios (3G, ADSL and local). The results illustrate the QoS constraints, namely latency, packet loss and jitter.

Author 1: Mohamed EDDABBAH
Author 2: Mohamed MOUSSAOUI
Author 3: Yassin LAAZIZ

Keywords: WSN; body sensor networks; remote patient monitoring; e-health; SOA

PDF

Paper 38: Mapping Wheat Crop Phenology and the Yield using Machine Learning (ML)

Abstract: Wheat has been a prime source of food for mankind for centuries. The final wheat grain yield is the product of complex interactions among various yield attributes such as kernels per plant, spikes per plant, NSpt/s and Spike Dry Weight (SDW). Different approaches have been followed to understand the non-linear relationship between these attributes and the yield, in order to manage the crop better in the context of precision agriculture. In this study, Principal Component Analysis (PCA) and stepwise regression were used to reduce the dimensionality of the original data and identify the critical attributes. The reduced dataset was then modeled using a Radial Basis Neural Network (RBNN), which achieved a regression value greater than 0.95, indicating the strong dependence of the yield on the critical traits.
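The PCA step above can be sketched with a plain SVD-based projection. This is a generic, numpy-only illustration with random data standing in for the measured wheat traits; it is not the study's pipeline.

```python
import numpy as np

# Minimal PCA sketch: project an attribute matrix onto its top-k
# principal components. Data here is random, for illustration only.

def pca_reduce(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # center each attribute
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # scores in reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                     # 50 plants, 8 yield attributes
Z = pca_reduce(X, k=3)
print(Z.shape)  # (50, 3)
```

Because SVD returns components in descending singular-value order, the retained columns capture the largest shares of variance, which is what makes the reduced dataset a reasonable input to a downstream regressor such as an RBNN.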

Author 1: Muhammad Adnan
Author 2: Abaid-ur-Rehman
Author 3: M. Ahsan Latif
Author 4: Naseer Ahmad
Author 5: Maria Nazir
Author 6: Naheed Akhter

Keywords: RBNN; PCA; stepwise regression; attributes; yield

PDF

Paper 39: Soft Error Tolerance in Memory Applications

Abstract: This paper proposes a new method to detect and correct multi-bit errors in memory applications using a combination of a clustering approach, the Bit-Per-Byte error-detection technique, and Majority Logic Decodable (MLD) codes. The likelihood of soft errors increases with system complexity, reduction in operating voltages, exponential growth in transistors per chip, increases in clock frequencies, breakdown of memory reliability and device shrinking. Memories are a sensitive part of a computer system, and soft errors in them may cause instructions to malfunction. Several techniques are already in practice to mitigate soft errors. Majority logic decodable codes have proved effective for memory applications because of their ability to correct a massive number of errors; however, since memories hold large numbers of bits, the size of the data word is the constraint of the MLD method, and we therefore emphasize it in this work. The proposed method aims to detect and correct up to seven bit errors with less computational time, and it handles adjacent errors efficiently, which is not possible with MLD codes alone. Experimental results show that the proposed approach outperforms the existing dominant approach with respect to the number of erroneous bits detected and corrected, and computational time overhead.
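The Bit-Per-Byte idea can be illustrated with simple even parity: one extra bit per byte detects any single-bit flip inside that byte. This sketches only the detection half of the scheme; the paper combines it with clustering and MLD codes for correction.

```python
# Even-parity-per-byte detection sketch (illustrative, not the paper's
# exact encoding).

def add_parity(byte):
    """Return (byte, even-parity bit over its set bits)."""
    return byte, bin(byte).count("1") % 2

def check_parity(byte, parity):
    """True if the stored byte is still consistent with its parity bit."""
    return bin(byte).count("1") % 2 == parity

word, p = add_parity(0b10110100)
flipped = word ^ 0b00001000        # a soft error flips one bit
print(check_parity(word, p), check_parity(flipped, p))  # True False
```

Parity alone cannot locate or correct the flipped bit, which is exactly the gap the MLD codes fill in the proposed method.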

Author 1: Muhammad Sheikh Sadi
Author 2: Md. Shamimur Rahman
Author 3: Shaheena Sultana
Author 4: Golam Mezbah Uddin
Author 5: Kazi Md. Bodrul Kabir

Keywords: Soft error tolerance; bit-per-byte; majority logic decodable codes; clustering; adjacent errors

PDF

Paper 40: Safety and Performance Evaluation Method for Wearable Artificial Kidney Systems

Abstract: This paper focuses on international standards and guidelines related to evaluating the safety and performance of wearable dialysis systems and devices. The applicable standard and evaluation indices for safety and performance are determined, and the relevant international standards and guidelines are provided in a table. In addition, example experiments using a triaxial accelerometer and robot arm are presented for testing the endurance and safety of wearable artificial kidneys. The findings in this paper can be used to suggest new guidelines for the mechanical safety and performance evaluation of wearable artificial kidney systems.

Author 1: YeJi Ho
Author 2: SangHoon Park
Author 3: KyungMin Jo
Author 4: Barum Choi
Author 5: SangEun Park
Author 6: Jaesoon Choi

Keywords: Wearable artificial kidney; safety; hemodialysis; peritoneal dialysis; accelerometer

PDF

Paper 41: Data Mining Models Comparison for Diabetes Prediction

Abstract: Over the past few years, data mining has received a lot of attention for extracting information from large datasets to find patterns and establish relationships to solve problems. Well-known data mining techniques include classification, association, Naïve Bayes, clustering and decision trees. In medical science, these algorithms help to predict a disease at an early stage for future diagnosis. Diabetes mellitus is a fast-growing disease that needs to be predicted at an early stage, as it is a lifelong disease with no cure. This research provides a comparison of different data mining algorithms on the PID dataset for the early prediction of diabetes.
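One of the algorithm families typically compared in such studies, k-nearest neighbours, can be sketched from scratch. The glucose/BMI pairs below are made up for illustration; the study itself uses the full PID dataset.

```python
# Toy k-NN classifier on invented (glucose, BMI) -> label pairs.

def knn_predict(train, query, k=3):
    """train: list of ((features...), label); majority vote of k nearest."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)

train = [((85, 22), 0), ((90, 24), 0), ((100, 26), 0),
         ((160, 33), 1), ((170, 35), 1), ((155, 31), 1)]
print(knn_predict(train, (165, 34)))  # 1: nearest neighbours are diabetic
```

In a real comparison the features would be normalized first, since raw glucose values dominate the squared distance over BMI.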

Author 1: Amina Azrar
Author 2: Yasir Ali
Author 3: Muhammad Awais
Author 4: Khurram Zaheer

Keywords: Diabetes; data mining; classification; decision tree; Naïve Bayes; KNN

PDF

Paper 42: Using Artificial Intelligence Approaches to Categorise Lecture Notes

Abstract: Lecture materials cover a broad variety of documents, ranging from e-books, lecture notes and handouts to research papers and lab reports. Downloaded from the Internet, these documents generally end up in the Downloads folder or other folders specified by the students. Over time, the folders become so messy that it is quite difficult to find one's way through them. Sometimes files are downloaded without any certainty that they will be used or returned to in the future. Documents scattered all over the computer system make it troublesome and time-consuming for the user to search for a particular file. Another issue that adds to the difficulty is improper naming conventions: certain files bear names that are totally irrelevant to their contents, so the user has to open the documents one by one to know what they are about. One solution to this problem is a file classifier. In this paper, a file classifier is used to organise lecture materials into eight different categories, easing the students' tasks and helping them organise the files and folders on their workstations. Modules each containing about 25 files were used in this study. Two machine learning techniques were used, namely decision trees and support vector machines. For most categories, it was found that decision trees outperformed SVM.

Author 1: Naushine Bibi Baijoo
Author 2: Khusboo Bharossa
Author 3: Somveer Kishnah
Author 4: Sameerchand Pudaruth

Keywords: Classification; lecture materials; machine learning; support vector machines; decision trees

PDF

Paper 43: EEG-Based Emotion Recognition using 3D Convolutional Neural Networks

Abstract: Emotion recognition is a crucial problem in Human-Computer Interaction (HCI). Various techniques have been applied to enhance the robustness of emotion recognition systems using electroencephalogram (EEG) signals, especially for the problem of spatiotemporal feature learning. In this paper, a novel EEG-based emotion recognition approach is proposed that investigates the use of 3-Dimensional Convolutional Neural Networks (3D-CNN) on multi-channel EEG data. A data-augmentation phase is developed to enhance the performance of the proposed 3D-CNN approach, and a 3D data representation is formulated from the multi-channel EEG signals to serve as the input to the model. Extensive experiments are conducted on the DEAP dataset (Dataset of Emotion Analysis using EEG, Physiological and Video Signals). The proposed method achieves recognition accuracies of 87.44% and 88.49% for the valence and arousal classes, respectively, outperforming state-of-the-art methods.
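The 3D-input idea can be sketched by arranging EEG channels on a 2D grid and stacking time as the third axis, yielding a volume a 3D-CNN can convolve over. The 4x8 grid below is an assumed layout for illustration (DEAP provides 32 EEG channels); the paper's exact representation may differ.

```python
import numpy as np

# Shape-level sketch of forming a 3D volume from multi-channel EEG.
channels, samples = 32, 128
eeg = np.random.randn(channels, samples)   # random stand-in for an EEG segment
grid = eeg.reshape(4, 8, samples)          # (height, width, time) volume
print(grid.shape)  # (4, 8, 128)
```

A 3D convolution over this volume mixes neighbouring channels (spatial axes) and neighbouring time steps in one kernel, which is the spatiotemporal feature learning the abstract refers to.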

Author 1: Elham S. Salama
Author 2: Reda A.El-Khoribi
Author 3: Mahmoud E.Shoman
Author 4: Mohamed A.Wahby Shalaby

Keywords: Electroencephalogram; emotion recognition; deep learning; 3D convolutional neural networks; data augmentation; single-label classification; multi-label classification

PDF

Paper 44: Enhanced and Improved Hybrid Model to Prediction of User Awareness in Agriculture Sector

Abstract: Agriculture is the backbone of the Indian economy and the main source of income for most of the population in India, so farmers are always curious about yield prediction. Crop yield depends on various factors such as soil, weather, rain, fertilizers and pesticides. These factors have different impacts on agriculture, which can be quantified using appropriate statistical methodologies. Applying such methodologies and techniques to historical crop yields makes it possible to obtain information or knowledge that can help farmers and government organizations make better decisions and policies, leading to increased production. A major drawback for Indian farmers is that they lack proper knowledge of crop yield based on soil requirements. In this paper, we therefore propose and develop an Improved Hybrid Model, a combination of classification using Artificial Neural Networks and clustering using k-means (based on Euclidean distance), to provide each farmer with awareness, usage information and predictions that classify different crop-yield representations based on soil requirements. We collected farmers' data from standard repositories such as http://www.tropmet.res.in/static_page.php?page_id=52#data and then used that data to provide awareness and other parameter sequences to farmers across India. Our experimental results show efficient e-agriculture with respect to user awareness, usage and prediction, evaluated in terms of precision, recall and f-measure, supporting real-time marketing of different agricultural products.

Author 1: A.V.S. Pavan Kumar
Author 2: Dr. R. Bhramaramba

Keywords: Agriculture products; e-agriculture; classification; clustering; ensemble model

PDF

Paper 45: Identifying Dynamic Topics of Interest across Social Networks

Abstract: Information propagation plays a significant role in online social networks, and mining the latent information produced has become crucial to understanding how information is disseminated. It can be used for market prediction, rumor control and opinion monitoring, among other things. In this paper, an information-dissemination model based on dynamic individual interest is therefore proposed. The basic idea of this model is to extract each user's effective topics of interest over time and identify the most relevant topics with respect to seed users. A set of experiments on a real Twitter dataset showed that the proposed dynamic prediction model, which applies machine learning techniques, outperformed traditional models that rely only on words extracted from tweets.

Author 1: Mohamed Salaheldin Aly
Author 2: Abeer Al Korany

Keywords: Information propagation; topic modelling; dynamic user modelling; user behavior; machine learning; topic classification; social networks

PDF

Paper 46: Processing Sampled Big Data

Abstract: Big data processing requires an extremely powerful and large computing setup. This not only creates a bottleneck in the processing infrastructure but also denies many researchers the freedom to analyze large datasets. This paper therefore analyzes the processing of large amounts of data with machine-learnt models built on smaller samples of the data. More than 40 GB of data are analyzed by testing different strategies for reducing the processed data without losing or compromising detection and model learning. Many alternatives are analyzed, and it is observed that a 50% reduction does not drastically harm model performance: on average, using only 50% of the data reduces performance by only 3.6% for SVM and 1.8% for Random Forest. Halving the number of instances means that in most cases the data will fit in RAM and processing times will be considerably reduced, benefitting execution times and resource usage. From the incremental training and testing experiments, it is found that in special cases smaller sub-sampled data can be used for model generation in machine learning problems. This is useful where there are limitations on hardware or where one must select among many available machine learning algorithms.
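The sub-sampling experiment can be sketched in miniature: train the same model on 100% and on 50% of a dataset and compare test accuracy. Everything below is illustrative, a trivial nearest-centroid classifier on synthetic 1-D data, whereas the paper uses SVM and Random Forest on more than 40 GB of real data.

```python
import random

# Miniature sub-sampling experiment on synthetic two-class data.
random.seed(1)
data = [(random.gauss(m, 1.0), label)
        for label, m in [(0, -2.0), (1, 2.0)] for _ in range(200)]
test = [(random.gauss(m, 1.0), label)
        for label, m in [(0, -2.0), (1, 2.0)] for _ in range(100)]

def accuracy(train, test):
    """Nearest-centroid classification accuracy on the test set."""
    cen = {lab: sum(x for x, l in train if l == lab)
                / len([x for x, l in train if l == lab])
           for lab in (0, 1)}
    hits = sum(min(cen, key=lambda lab: abs(x - cen[lab])) == l
               for x, l in test)
    return hits / len(test)

full = accuracy(data, test)                              # all instances
half = accuracy(random.sample(data, len(data) // 2), test)  # 50% sample
print(full, half)  # the two accuracies stay close
```

The near-identical accuracies mirror the paper's finding that halving the instances costs only a few per cent of performance while halving memory and training time.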

Author 1: Waleed Albattah
Author 2: Rehan Ullah Khan

Keywords: Deep learning; content analysis; machine learning; support vector machines; random forest

PDF

Paper 47: Access Control Model for Modern Virtual e-Government Services: Saudi Arabian Case Study

Abstract: e-Government services require intensive information exchange and interconnection among governmental agencies to provide specialized online services and allow informed decision-making. This could compromise the integrity, confidentiality, and/or availability of the information being exchanged. Government agencies are accountable and liable for the protection of the information they possess and use, on a least-privilege basis, even after dissemination. However, traditional access control models fall short of achieving this: they do not allow dynamic access for users unknown to the system, they do not provide security controls at a fine-grained level, and they do not provide persistent control over information. This paper proposes a novel secure access control model for cross-governmental agencies. The model deploys a Role-centric Mandatory Access Control (R-MAC) model, suggests a classification scheme for e-Government information, and enforces its application using XML security technologies. By using the proposed model, agencies can preserve privacy through dynamic, persistent, and fine-grained control over their shared information.

Author 1: Rand Albrahim
Author 2: Hessah Alsalamah
Author 3: Shada Alsalamah
Author 4: Mehmet Aksoy

Keywords: Access control; cloud infrastructure; data classification scheme; data exchange; e-government; fine-grained access; implementation framework; persistent control; XML security technologies; Saudi Arabia

PDF

Paper 48: Evaluation of the Impact of Usability in Arabic University Websites: Comparison between Saudi Arabia and the UK

Abstract: Today, usability is a crucial factor that can affect any website. The purpose of this study is to explore major usability defects within Saudi university websites in comparison to British university websites, from a Saudi student perspective. Students expect to achieve their goals comfortably and efficiently, without complication, when surfing a Saudi Arabian university website. This study uses two methods to evaluate and measure usability problems: user testing and thinking aloud. Both methods are very useful and effective for collecting data from participants. Based on university rankings, 60 students were split evenly into three groups; each group was asked to evaluate a different pair of university websites from a different ranking tier, one from the UK and the other from KSA. Each group's evaluation was gathered using the System Usability Scale (SUS) questionnaire to find flaws in website usability. During the experiment, the participants' opinions were collected using the thinking-aloud method. The findings of this research showed that Saudi universities in all tiers had significant usability problems on their websites. The most frequent problems concerned inconsistency, integration, confidence and satisfaction. Less frequent problems found during the study concerned design concepts, ease of use of the websites and student comfort. Saudi universities can learn from the differences in quality between the two sides to upgrade and redesign their websites, achieving user satisfaction and thereby increasing users' confidence.

Author 1: Mohamed Benaida
Author 2: Abdallah Namoun
Author 3: Ahmad Taleb

Keywords: Usability; usability evaluation; factor analysis; student satisfaction

PDF

Paper 49: Using Sab-Iomha for an Alpha Channel based Image Forgery Detection

Abstract: Digital images are a very popular way of transferring media. However, their integrity remains challenging because these images can easily be manipulated with the help of software tools, and such manipulations cannot be verified with the naked eye. Although some techniques exist to validate digital images, in practice this is not a trivial task, as the existing approaches to forgery detection are not very effective. Therefore, there is a need for a simple and efficient solution to this challenge. On the other hand, digital image steganography is the concealing of a message within an image file. The secret message can be retrieved afterwards by the author to check the image file for its veracity. This research paper proposes Sabiomha, an image forgery detection technique that makes use of image steganography. The proposed technique is also supported by a software tool that demonstrates its usefulness. Sabiomha works by inserting an invisible watermark into certain alpha bits of the image file. The watermark used to steganograph an image is composed of a combination of text inputs with which the author can sign the image. Any attempt to tamper with the image distorts the sequence of the bits of the image pixels. Hence, the proposed technique can easily validate the originality of a digital image by exposing any tampering. The usability of our contribution is demonstrated by the software tool we developed to automate the proposed technique. The experiments we performed to further validate the technique suggested that Sabiomha can be flawlessly applied to image files.
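The general idea of alpha-channel watermarking can be sketched in a few lines (a minimal illustration, not the authors' Sabiomha implementation): embed the bits of a signature string into the least-significant bit of each pixel's alpha value, then verify authenticity by re-extracting them.

```python
def to_bits(text):
    # 8 bits per character, most significant bit first
    return [(ord(c) >> i) & 1 for c in text for i in range(7, -1, -1)]

def embed(pixels, signature):
    # Overwrite the LSB of each alpha value with one signature bit.
    bits = to_bits(signature)
    assert len(bits) <= len(pixels)
    out = list(pixels)
    for i, b in enumerate(bits):
        r, g, bl, a = out[i]
        out[i] = (r, g, bl, (a & ~1) | b)
    return out

def verify(pixels, signature):
    # The image is authentic if the embedded bits still match.
    bits = to_bits(signature)
    return all((pixels[i][3] & 1) == b for i, b in enumerate(bits))

image = [(120, 60, 30, 255)] * 64            # toy RGBA pixel list
signed = embed(image, "author")
tampered = list(signed)
r, g, b, a = tampered[1]
tampered[1] = (r, g, b, a ^ 1)               # simulated tampering flips one alpha bit
print(verify(signed, "author"), verify(tampered, "author"))  # True False
```

A real tool would operate on an actual image file (e.g. PNG) and would typically also distribute the signature bits pseudo-randomly across pixels; the verification logic is the same.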

Author 1: Muhammad Shahid Bhatti
Author 2: Syed Asad Hussain
Author 3: Abdul Qayyum
Author 4: Abdul Karim Shahid
Author 5: Muhammad Usman Akram
Author 6: Sajid Ibrahim Hashmi

Keywords: Digital images; tamper; steganography; metadata; forgery detection; cipher; image authentication; image validation; watermarking

PDF

Paper 50: Recommendations for Building Adaptive Cognition-based E-Learning

Abstract: Adaptive e-learning systems try to adapt the learning material to the student's preferences. Course authors design their courses with their students' learning styles in mind, course delivery should match the student's style, and student assessment should also be adapted to each specific student's learning style, while the student portfolio helps identify the student model. To the best of our knowledge, there is no clear recommendation for building community-wide adaptive and personalized e-learning systems. This paper presents recommendations for adding adaptation and personalization to one of the most common open-source Learning Management Systems (LMS), Moodle. The adaptation features are based on using learning styles, ontology, and the cognitive Bloom taxonomy in building and presenting the e-learning material (Learning Objects). This is helpful for establishing adaptive, cognition-based Learning Object repositories and course development centers.

Author 1: Mostafa Saleh
Author 2: Reda Mohamed Salama

Keywords: Adaptive e-learning; learning objects; learning styles; student models; open source LMS; Moodle; personalized teaching model

PDF

Paper 51: Segmentation Method for Pathological Brain Tumor and Accurate Detection using MRI

Abstract: Image segmentation is a challenging task in the field of medical image processing. Magnetic resonance imaging helps doctors detect human brain tumors using three image views (axial, coronal, sagittal). MR images are noisy, and detecting the brain tumor location as a feature is complicated. Level set methods have been applied, but they are affected by the required human interaction; accordingly, an appropriate contour is generated in discontinuous regions, and the pathological brain-tumor portion is highlighted after applying binarization and removing unessential objects, from which the contour is generated. Then, to classify the tumor for segmentation, a hybrid Fuzzy K-Means-Self-Organizing Map (FKM-SOM) is used to handle intensity variations. To improve segmentation accuracy, classification has been performed: features are extracted using the Discrete Wavelet Transform (DWT) and then reduced using Principal Component Analysis (PCA). Thirteen features from every image of the dataset have been classified using Support Vector Machine (SVM) kernels (RBF, linear, polynomial), and results have been evaluated using parameters such as F-score, precision, accuracy, specificity and recall.
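The feature-extraction step can be sketched as follows (a toy illustration, not the authors' pipeline: a one-level 2-D Haar transform stands in for the DWT, applied to a 4x4 "image"; a real pipeline would feed such sub-band statistics through PCA into an SVM).

```python
def haar_1d(row):
    # One level of the 1-D Haar transform: pairwise averages, then differences.
    avg = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
    dif = [(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)]
    return avg + dif

def haar_2d(img):
    # Apply the 1-D transform to every row, then to every column.
    rows = [haar_1d(r) for r in img]
    cols = [haar_1d([rows[i][j] for i in range(len(rows))])
            for j in range(len(rows[0]))]
    return [[cols[j][i] for j in range(len(cols))] for i in range(len(rows))]

img = [[4, 4, 8, 8],
       [4, 4, 8, 8],
       [2, 2, 6, 6],
       [2, 2, 6, 6]]
coeffs = haar_2d(img)
# Top-left quadrant (LL sub-band) holds the low-frequency approximation;
# its values (or statistics of them) are typical features fed to a classifier.
ll = [coeffs[i][j] for i in range(2) for j in range(2)]
print(ll)  # [4.0, 8.0, 2.0, 6.0]
```

The other three quadrants (LH, HL, HH) carry horizontal, vertical and diagonal detail and are all zero here because the toy image is piecewise constant.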

Author 1: Khurram Ejaz
Author 2: Mohd Shafry Mohd Rahim
Author 3: Amjad Rehman
Author 4: Huma Chaudhry
Author 5: Tanzila Saba
Author 6: Anmol Ejaz
Author 7: Chaudhry Farhan Ej

Keywords: Brain tumor; level set; Hybrid Fuzzy K Mean (Hybrid FKM); Discrete Wavelet Transformation (DWT); Support Vector Machine (SVM); Magnetic Resonance Image (MRI); Principal Component Analysis (PCA)

PDF

Paper 52: Skew Detection and Correction of Mushaf Al-Quran Script using Hough Transform

Abstract: Document skew detection and correction is one of the basic preprocessing steps in document analysis. Correcting skewed scanned images is critical because skew has a direct impact on image quality. In this paper, the authors propose a method for skew detection and correction of Mushaf Al-Quran image pages based on the Hough transform. The technique uses Hough transform line detection to calculate the skew angle. It works for different versions of Mushaf Al-Quran image pages with skewed text zones. Moreover, it can detect and correct skew angles within a range of ±20 degrees. Experiments conducted on different Mushaf Al-Quran image pages show the accuracy of the method.
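The core of Hough-based skew estimation can be sketched as follows (a toy sketch, not the authors' implementation): pixels on a synthetic skewed text baseline vote for candidate angles, and the angle whose votes concentrate on a single line wins.

```python
import math

def detect_skew(points, max_deg=20):
    # For each candidate skew angle, project every point along the line
    # normal; if the guess is right, all baseline points land in one bin.
    best_angle, best_votes = 0, 0
    for deg in range(-max_deg, max_deg + 1):
        t = math.radians(deg)
        bins = {}
        for x, y in points:
            rho = round(y * math.cos(t) - x * math.sin(t))
            bins[rho] = bins.get(rho, 0) + 1
        votes = max(bins.values())
        if votes > best_votes:
            best_angle, best_votes = deg, votes
    return best_angle

# Synthetic text baseline skewed by 10 degrees
skew = math.radians(10)
baseline = [(x, x * math.tan(skew) + 5) for x in range(0, 62, 2)]
print(detect_skew(baseline))  # 10
```

Correction is then just a rotation of the page by the negative of the detected angle; production implementations (e.g. with OpenCV) use a finer angle/rho quantization and vote with all foreground pixels.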

Author 1: Salem Saleh Bafjaish
Author 2: Mohd Sanusi Azmi
Author 3: Mohammed Nasser Al-Mhiqani
Author 4: Amirul Ramzani Radzid
Author 5: Hairulnizam Mahdin

Keywords: Skew detection; skew correction; Hough transform; preprocessing; binarization; image analysis

PDF

Paper 53: Review of Information Security Policy based on Content Coverage and Online Presentation in Higher Education

Abstract: Policies are high-level statements that are equal to organizational law and drive the decision-making process within the organization. An information security policy is not easy to develop unless the organization clearly identifies the necessary steps in the development process, particularly in institutions of higher education that make extensive use of IT. An inappropriate development process, or replication of security policy content from other organizations, can fail in execution: a duplicated policy may fail to act in accordance with enforceable rules and regulations even if it is well developed. Hence, organizations need to develop appropriate policies that comply with their own regulatory requirements. This paper reviews policies from selected universities against the ISO 27001:2013 minimum requirements as well as effective online presentation. The online presentation review covers the elements of aesthetics, navigation and content presentation. The reviewed information security policy documents reside on the universities' websites.

Author 1: Arash Ghazvini
Author 2: Zarina Shukur
Author 3: Zaihosnita Hood

Keywords: Information security policy; policy development; higher education

PDF

Paper 54: Implementation of a Formal Software Requirements Ambiguity Prevention Tool

Abstract: The success of the software engineering process depends heavily on clear, unambiguous software requirements. Ambiguity refers to the possibility of understanding a requirement in more than one way. Unfortunately, ambiguity is an inherent property of the natural languages used to write software user requirements. This can lead to a faulty final system implementation, which is too expensive to correct. The basic requirements-ambiguity resolution approaches in the literature are ambiguity detection, ambiguity avoidance, and ambiguity prevention. Ambiguity prevention is the least tackled approach because it requires designing formal languages and templates, which are hard to implement. The main goal of this paper is to provide a full implementation of an ambiguity prevention tool and then study its effectiveness using real requirements. Towards this goal, we developed a set of Finite State Machines (FSMs) implementing templates of various requirement types. We then used Python to implement the ambiguity prevention tool based on those FSMs. We also collected a benchmark of 2460 real requirements and selected a random set of forty real requirements to test the effectiveness of the developed tool. The experiment showed that the implemented ambiguity prevention tool can prevent critical requirements-ambiguity issues such as missing information or domain ambiguity. Nevertheless, there is a tradeoff between ambiguity prevention and the effort needed to write the requirements using the imposed templates.
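The FSM-template idea can be sketched as a toy automaton (a hypothetical template, not the authors' tool) that accepts only requirements of the form "The <system> shall <action> <object...>":

```python
# Toy requirement template as a finite state machine:
#   start --"The"--> subj --<word>--> modal --"shall"--> verb
#         --<word>--> obj --<word>*--> obj      (accepting state: obj)
def accepts(requirement):
    tokens = requirement.rstrip(".").split()
    state = "start"
    for tok in tokens:
        if state == "start" and tok == "The":
            state = "subj"
        elif state == "subj":            # any single-word system name
            state = "modal"
        elif state == "modal" and tok == "shall":
            state = "verb"
        elif state == "verb":            # any action verb
            state = "obj"
        elif state == "obj":             # the object may span several words
            state = "obj"
        else:
            return False                 # token not allowed here: rejected
    return state == "obj"

print(accepts("The system shall log every failed login attempt."))  # True
print(accepts("The system should maybe log stuff."))                # False
```

Rejecting a sentence at write time (rather than detecting ambiguity later) is exactly the prevention strategy the abstract describes; the real tool encodes many such templates, one FSM per requirement type.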

Author 1: Rasha Alomari
Author 2: Hanan Elazhary

Keywords: Software requirements; requirements ambiguity; natural language ambiguity; ambiguity prevention; controlled languages; finite state machines

PDF

Paper 55: A Comparative Study of the Decisional Needs Engineering Approaches

Abstract: Requirements Engineering (RE) is an important phase in systems development projects. It helps design-analysts design and model the expression of end-users' needs and their expectations vis-à-vis their future system. This engineering studies two major issues: what the system should do, in order to obtain a complete needs specification, and why it should be built ("Why do we need to build this system?"), without considering how to build it. The vast majority of needs engineering approaches are based on two concepts, scenario and goal; there are generally three types of approaches: scenario-oriented approaches, goal-oriented approaches, and approaches driven by the goal-scenario couple. In the remainder of this paper, we present a comparative study of the three types of RE approaches, then models of needs representation, and finally we draw our conclusions.

Author 1: OUTFAROUIN Ahmad
Author 2: ZAHID Noureddine
Author 3: ABDALI Abdelmounaïm

Keywords: Decisional information systems; decisional needs engineering; needs engineering approaches; goal; scenario; model of needs representation

PDF

Paper 56: A Blockchain Technology Evolution Between Business Process Management (BPM) and Internet-of-Things (IoT)

Abstract: A blockchain is considered the main mechanism underlying the Bitcoin cryptocurrency. A blockchain is a public ledger of transactions stored in a chain of blocks. Its properties include decentralization through distributed blocks, stability, anonymity, and auditability. Blockchain can enhance network efficiency and improve network security. It can also be applied in several fields, such as financial and banking services, healthcare systems, and public services. However, research in this area is still open: a number of technical challenges, such as the scalability problem and privacy leakage, prevent the wide application of blockchain. This paper presents a comprehensive study of blockchain technology and examines the research efforts in blockchain. It presents a proposed blockchain lifecycle that constitutes an evolution of, and a link between, business process management improvement and Internet-of-Things concepts. The paper then presents a practical proof of this relationship for a smart city: a new algorithm and a proposed blockchain framework for 38 blocks (recognized as smart houses). Finally, future directions in the blockchain field are presented.
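The tamper-evidence property described above can be sketched in a few lines of Python (a toy chain, not the paper's 38-block smart-house framework): each block commits to the hash of its predecessor, so editing any payload invalidates the chain.

```python
import hashlib

def block_hash(index, data, prev_hash):
    # Each block commits to its payload and to the previous block's hash.
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], "0" * 64                 # genesis predecessor
    for i, data in enumerate(payloads):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def valid(chain):
    # Re-derive every hash; any edited payload breaks all links after it.
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(b["index"], b["data"], prev):
            return False
        prev = b["hash"]
    return True

# Hypothetical smart-house readings as block payloads
chain = build_chain(["house-1: 3.2 kWh", "house-2: 1.7 kWh", "house-3: 2.5 kWh"])
ok_before = valid(chain)
chain[1]["data"] = "house-2: 0.0 kWh"          # tampering attempt
print(ok_before, valid(chain))                 # True False
```

A real blockchain adds consensus, timestamps and signatures on top of this hash-linking, but the auditability the abstract mentions rests on exactly this structure.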

Author 1: Doaa Mohey El-Din M. Hussein
Author 2: Mohamed Hamed N. Taha
Author 3: Nour Eldeen M. Khalifa

Keywords: Blockchain; bitcoin; business process; cryptography; decentralization; consensus; applications

PDF

Paper 57: Defects Prediction and Prevention Approaches for Quality Software Development

Abstract: The demand for distributed and complex business applications in the enterprise requires error-free, high-quality application systems. Unfortunately, most developed software contains defects that cause system failure. Such failures are unacceptable in critical or sensitive applications, which makes the development of high-quality, defect-free software extremely important. It is important to better understand and compute the association between software defects and failures so that these defects can be effectively predicted and eliminated, reducing failures and improving software quality. This paper presents a review of software defect prediction and prevention approaches for quality software development. It also reviews the potential and constraints of these mechanisms in quality product development and maintenance.

Author 1: Mashooque Ahmed Memon
Author 2: Mujeeb-Ur-Rhman Magsi Baloch
Author 3: Muniba Memon
Author 4: Syed Hyder Abbas Musavi

Keywords: Software; defects; predictions; preventions; software development

PDF

Paper 58: Design and Implementation of a Risk Management Tool: A Case Study of the Moodle Platform

Abstract: In recent years, the distinctive feature of our society has been the rapid pace of technological change. In the Moroccan context, universities have put digital learning at the heart of their development projects thanks to a wide range of hybrid training devices, Small Private Online Courses (SPOCs) and Massive Open Online Courses (MOOCs) delivered via a Virtual Work Environment (ENT, Environnement Numérique de Travail). On the one hand, the purpose of using these devices is to help universities improve their performance and enhance their attractiveness; on the other hand, it is to meet increasingly diverse student needs through reorganized infrastructure and a renovated pedagogy. However, extensive use of information and communication technologies at different universities exposes them to problems related to information system (IS) risks in general and e-learning in particular. Risk assessment is complicated and multidimensional: it must take into account many components, including assets, threats, vulnerabilities, controls already in place, and analyses. In this work, we first present risk management methods, and we then present a risk analysis of the Moodle platform.

Author 1: Nadia Chafiq
Author 2: Mohammed Talbi
Author 3: Mohamed Ghazouani

Keywords: Risk management; e-learning; MEHARI; platform

PDF

Paper 59: Artificial Neural Network based Weather Prediction using Back Propagation Technique

Abstract: Weather is a natural phenomenon that undergoes chaotic changes with the passage of time, and forecasting it has become an essential research topic due to abrupt weather scenarios. As forecast data are nonlinear and follow irregular trends and patterns, many traditional techniques in the literature, such as nonlinear statistics, have aimed to make models more efficient and predictions better than those of previous models. However, the Artificial Neural Network (ANN) has emerged as a better way to improve accuracy and reliability. The ANN is one of the fastest-growing machine learning techniques; as a non-linear predictive model it is used here to classify weather and to predict the maximum temperature for all 365 days of the year. A multi-layered neural network was therefore designed and trained on the existing dataset, obtaining a relationship between the existing non-linear weather parameters. Eleven weather features were used to classify weather into four types. Furthermore, twenty training examples from 1997-2015 were used to predict the eleven weather features. The results revealed that, by increasing the number of hidden layers, the trained neural network can classify and predict the weather variables with less error.
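The backpropagation training loop can be sketched on a toy problem (a 2-2-1 sigmoid network learning the AND function; the paper's network has eleven inputs and more hidden layers, so this is illustration only, showing the delta and weight-update rules):

```python
import math, random

random.seed(1)

sig = lambda x: 1 / (1 + math.exp(-x))

# 2-2-1 sigmoid network: hidden weights (2 inputs + bias), output weights (2 hidden + bias)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(3)]
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # AND function
lr = 0.5

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    return h, sig(w2[0] * h[0] + w2[1] * h[1] + w2[2])

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

loss_start = total_loss()
for _ in range(4000):
    for x, t in data:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)                                 # output delta
        d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(2)]   # hidden deltas (chain rule)
        for j in range(2):                                          # gradient-descent updates
            w2[j] -= lr * d_o * h[j]
            w1[j][0] -= lr * d_h[j] * x[0]
            w1[j][1] -= lr * d_h[j] * x[1]
            w1[j][2] -= lr * d_h[j]
        w2[2] -= lr * d_o
loss_end = total_loss()
print(loss_start, loss_end)   # the squared error drops as training proceeds
```

The same loop, with more units and real weather features as inputs, is what the backpropagation technique in the paper's title refers to.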

Author 1: Saboor Ahmad Kakar
Author 2: Naveed Sheikh
Author 3: Adnan Naseem
Author 4: Saleem Iqbal
Author 5: Abdul Rehman
Author 6: Aziz ullah Kakar
Author 7: Bilal Ahmad Kakar
Author 8: Hazrat Ali Kakar
Author 9: Bilal Khan

Keywords: Weather forecasting; artificial neural network; classification; prediction; backpropagation; hidden layers

PDF

Paper 60: An Incremental Technique of Improving Translation

Abstract: Statistical machine translation (SMT) refers to using probabilistic methods to learn the translation process, primarily from parallel text. In SMT, linguistic information such as morphology and syntax can be added to the parallel text for improved results. However, adding such linguistic annotation is costly in terms of time and expert effort. Here, we introduce a technique that can learn better shapes (the morphological process) and more appropriate positioning (syntactic realization) of target words without linguistic annotations. Our method improves results iteratively over multiple passes of translation. Our experiments showed better translation accuracy, measured using a well-known scoring tool. There is no language-specific step in this technique.

Author 1: Aasim Ali
Author 2: Arshad Hussain

Keywords: Statistical machine translation; incremental learning algorithm; English; Urdu

PDF

Paper 61: Role Term-Based Semantic Similarity Technique for Idea Plagiarism Detection

Abstract: Most text mining systems are based on statistical analysis of term frequency. The statistical analysis of term (phrase or word) frequency captures the importance of a term within a document, but the techniques proposed to date still need to be improved in their ability to detect plagiarized parts, especially in capturing the importance of a term within a sentence. Two terms can have the same frequency in their documents while one contributes more to the meaning of its sentences than the other. In this paper, we discriminate between terms that are important and unimportant to the meaning of sentences, and adapt this distinction for idea plagiarism detection. The paper introduces an idea plagiarism detection approach based on the semantic meaning frequency of important terms in sentences. The suggested method analyses and compares text based on the semantic role allocated to each term inside the sentence; Semantic Role Labeling (SRL) offers significant advantages when generating arguments for each sentence semantically. Promising experiments have been conducted on the CS11 dataset, and the results revealed that the proposed technique's performance surpasses recent peer methods of plagiarism detection in terms of recall, precision and F-measure.
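The role-weighting idea can be sketched as follows (a hypothetical weighting scheme, not the authors' SRL pipeline): terms filling core semantic roles such as agent or predicate count more than peripheral modifiers when comparing two sentences, so sentences sharing the same core idea score high even when surface words differ elsewhere.

```python
import math

# Hypothetical role weights: core roles contribute more to sentence meaning.
ROLE_WEIGHT = {"agent": 3.0, "predicate": 3.0, "patient": 2.0, "modifier": 1.0}

def vector(sentence):
    # sentence: list of (term, role) pairs -> role-weighted term vector
    v = {}
    for term, role in sentence:
        v[term] = v.get(term, 0.0) + ROLE_WEIGHT.get(role, 1.0)
    return v

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(x * x for x in a.values()))
    nb = math.sqrt(sum(x * x for x in b.values()))
    return dot / (na * nb) if na and nb else 0.0

s1 = [("researchers", "agent"), ("proposed", "predicate"),
      ("model", "patient"), ("novel", "modifier")]
s2 = [("researchers", "agent"), ("proposed", "predicate"),
      ("model", "patient"), ("recently", "modifier")]
s3 = [("weather", "agent"), ("changed", "predicate"),
      ("model", "modifier"), ("novel", "modifier")]

print(round(cosine(vector(s1), vector(s2)), 3))  # same core idea: high score
print(round(cosine(vector(s1), vector(s3)), 3))  # shared surface words only: low score
```

A plain term-frequency comparison would treat the shared words in s1 and s3 the same as those in s1 and s2; the role weights are what separate idea overlap from incidental word overlap.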

Author 1: Ahmed Hamza Osman
Author 2: Hani Moetque Aljahdali

Keywords: Plagiarism detection; semantic similarity; semantic role; term frequency; idea

PDF

Paper 62: Impact of Security in QoS Signaling in NGN: Registration Study

Abstract: New Generation Networks (NGN) use an IP base to transmit their services, including voice, video and other services. The IP Multimedia Subsystem (IMS), which represents the network core, controls access to the various services through a set of signalling protocols, the most common of which is the Session Initiation Protocol (SIP). After securing the most vulnerable interfaces in the core of the NGN/IMS architecture, the idea is to improve QoS in SIP signalling, especially in authentication and registration, which represent the first step of access. The proposed approach uses asymmetric encryption in the SIP registration process and studies the performance of the system in terms of QoS parameters.
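The asymmetric-encryption step in registration can be sketched with textbook RSA on toy numbers (illustration only; the key sizes here are wildly insecure, and real SIP deployments rely on standard certificate and TLS machinery rather than hand-rolled crypto):

```python
# Textbook RSA with toy primes (never use such sizes in practice).
p, q = 61, 53
n = p * q                      # 3233, public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

def encrypt(m):                # done with the registrar's public key (e, n)
    return pow(m, e, n)

def decrypt(c):                # done with the registrar's private key (d, n)
    return pow(c, d, n)

# A user agent encrypts a credential byte by byte before sending REGISTER;
# only the registrar, holding d, can recover it.
credential = "pin7"
cipher = [encrypt(ord(ch)) for ch in credential]
recovered = "".join(chr(decrypt(c)) for c in cipher)
print(recovered == credential)  # True
```

The point of the asymmetry for QoS studies is that registration-time encryption and decryption add measurable processing delay at the client and the registrar, which is what the paper quantifies.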

Author 1: RAOUYANE Brahim
Author 2: BELMEKKI Elmostafa
Author 3: KHAIRI sara
Author 4: BELLAFKIH mostafa

Keywords: Quality of Service (QoS); Security; New Generation Network (NGN); IP Multimedia Subsystem (IMS); Session Initiation Protocol (SIP)

PDF

Paper 63: Information System Quality: Managers' Perspective

Abstract: To evaluate Information System Quality (ISQ) quantitatively, a model was constructed from sub-models related to the five Information System (IS) components: human resources, hardware, software and applications, procedures, and data. The perspectives of all IS players are considered: managers, technical staff, functional staff and users. This paper focuses on the survey designed for managers, first in order to form the indicator variables from the question variables via appropriate formulas, and second to analyze data collected from IS managers of Moroccan universities. This approach allows precise diagnosis of malfunctioning areas of ISQ by emphasizing the components with the lowest quality level. It also enables comparison of ISQ across different organizations by means of standardized values.

Author 1: Sarah Aouhassi
Author 2: Mostafa Hanoune

Keywords: Information system; quality; managers; measurement indicator; university

PDF

Paper 64: Using Fuzzy Clustering Powered by Weighted Feature Matrix to Establish Hidden Semantics in Web Documents

Abstract: Digital data is growing exponentially on the World Wide Web. Orthodox clustering algorithms face various challenges, the most frequent of which is uncertainty. Web documents have become heterogeneous and very complex, and multiple relations exist between one web document and others in the form of embedded links. This can be viewed as a one-to-many (1-M) relationship: a particular web document may fit many cross-cutting domains, viz. politics, sports, utilities, technology, music, weather forecasting, links to e-commerce products, etc. There is therefore a need for efficient, effective and constructive context-driven clustering methods. Orthodox, well-established clustering algorithms partition the given data sets into exclusive clusters, meaning we can clearly state which cluster an object belongs to. Such a partition, however, is not sufficient to represent real-world data. So a fuzzy clustering method is presented that builds clusters with indeterminate boundaries and allows one object to belong to overlapping clusters with some membership degree. In other words, the crux of fuzzy clustering is to consider not only whether an object fits a cluster but also to what degree it belongs to that cluster. The aim of this study is to devise a fuzzy clustering algorithm which, with the help of a feature-weighted matrix, increases the probability of multi-domain overlap of web documents, overlapping in the sense that one document may fall into multiple domains. The feature weights act as a filter for the data extracted from each document, and the matrix allows us to compute a threshold value which in turn shapes the clustering result.
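The membership idea can be sketched with a generic fuzzy c-means loop using per-feature weights (a standard FCM sketch under assumed weights, not the authors' algorithm): membership degrees let a "document" belong to several clusters at once, and each row of memberships sums to 1.

```python
def wdist2(p, c, w):
    # Feature-weighted squared distance between a point and a centre.
    return sum(wi * (pi - ci) ** 2 for pi, ci, wi in zip(p, c, w))

def fcm(points, centres, weights, m=2.0, iters=20):
    u = []
    for _ in range(iters):
        # Membership update: closer centres get higher degrees; rows sum to 1.
        u = []
        for p in points:
            d = [max(wdist2(p, c, weights), 1e-12) for c in centres]
            u.append([1.0 / sum((d[i] / d[j]) ** (1.0 / (m - 1))
                                for j in range(len(centres)))
                      for i in range(len(centres))])
        # Centre update: fuzzy-weighted mean of all points.
        for i in range(len(centres)):
            s = sum(u[k][i] ** m for k in range(len(points)))
            centres[i] = [sum(u[k][i] ** m * points[k][dim]
                              for k in range(len(points))) / s
                          for dim in range(len(points[0]))]
    return u, centres

# Toy "documents" as 2-D topic scores (politics, sports); the second feature
# is weighted higher, as a feature matrix might dictate.
docs = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (0.8, 0.2), (0.5, 0.5)]
u, centres = fcm(docs, [[0.0, 1.0], [1.0, 0.0]], weights=(1.0, 2.0))
print([round(x, 2) for x in u[4]])  # the mixed document straddles both clusters
```

The last document sits between the two topics and receives roughly equal membership in both clusters, which is exactly the overlap a crisp partition cannot express.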

Author 1: Pramod D Patil
Author 2: Parag Kulkarni

Keywords: Fuzzy; clustering; web document; feature matrix

PDF

Paper 65: A New E-Health Tool for Early Identification of Voice and Neurological Pathologies by Speech Processing

Abstract: The objective of this study is to develop a non-invasive method for early identification and classification of voice pathologies and neurological diseases by speech processing. We present a new automatic medical diagnosis tool which can assist specialists in their diagnoses. The developed strategy is based on acquiring the patient's speech, followed by audio feature extraction, training and recognition using the HTK toolkit. The computed parameters are compared to standard values from a codebook database. The experiments and tests were conducted using the MEEI pathological database from KayPENTAX. The obtained results give good discrimination, with a mean pathology recognition rate of about 95%. This E-Health application is helpful for the prevention of specific diseases, improving the quality of patient care and reducing healthcare costs.

Author 1: Bouafif Lamia
Author 2: Ellouze Noureddine

Keywords: E-Health; voice disorder; HMM classification; feature extraction; MFCC; pathology recognition rate

PDF

Paper 66: An Overview of Mutation Strategies in Bat Algorithm

Abstract: The bat algorithm (BA) is a population-based stochastic search technique inspired by the echolocation behavior of bats searching for food. BA has been widely used to solve diverse kinds of optimization problems, and one major issue it faces is being frequently trapped in local optima while handling complex real-world problems. Many authors have improved the standard BA with different mutation strategies, but an exhaustive, comprehensive overview of these strategies is still lacking. This paper aims to furnish a concise and comprehensive study of the problems and challenges that limit the performance of BA, and attempts to provide guidelines for researchers who are active in the area of BA and its mutation strategies. The objective of this study is twofold: first, to present the improvements of BA through mutation strategies that may enhance the performance of the standard BA to a great extent; and second, to motivate researchers and developers to use BA to solve complex real-world problems. The study presents a comprehensive survey of the various BA variants based on mutation strategies. It is anticipated that this survey will be helpful to researchers studying the BA in detail.
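The core BA update equations (frequency, velocity, position) plus a simple local-search mutation can be sketched on a 1-D sphere function (a generic sketch following the standard BA formulation, not any specific variant from the survey; the loudness and pulse-rate constants here are illustrative choices):

```python
import random

random.seed(3)

def sphere(x):
    return x * x              # toy objective to minimise

n, iters = 10, 200
fmin, fmax = 0.0, 1.0
pos = [random.uniform(-5, 5) for _ in range(n)]
vel = [0.0] * n
loud, pulse = 0.9, 0.4        # loudness A and pulse emission rate r
best = min(pos, key=sphere)
start_fit = sphere(best)

for _ in range(iters):
    for i in range(n):
        # Standard BA updates: frequency, velocity, position.
        freq = fmin + (fmax - fmin) * random.random()
        vel[i] += (pos[i] - best) * freq
        cand = pos[i] + vel[i]
        # Mutation step: local random walk around the current best bat.
        if random.random() > pulse:
            cand = best + 0.01 * random.gauss(0, 1)
        # Accept the candidate greedily, with probability tied to loudness.
        if sphere(cand) <= sphere(pos[i]) and random.random() < loud:
            pos[i] = cand
        if sphere(pos[i]) < sphere(best):
            best = pos[i]

print(start_fit, sphere(best))  # the best fitness never worsens
```

It is precisely the local-walk line that the surveyed mutation strategies replace or augment, which is why it is the natural point of comparison between BA variants.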

Author 1: Waqas Haider Bangyal
Author 2: Jamil Ahmad
Author 3: Hafiz Tayyab Rauf
Author 4: Sobia Pervaiz

Keywords: Bat algorithm; optimization; local optima; mutation strategies; premature convergence; swarm intelligence

PDF

Paper 67: Arabic Chatbots: A Survey

Abstract: A chatbot is a programmed entity that handles human-like conversations between an artificial agent and humans. Such conversations have attracted the attention of researchers interested in human-machine interaction, who aim to make the conversation more rational and hence pass the Turing test. The available research in the field of Arabic chatbots is comparatively scarce. This paper presents a review of the published Arabic chatbot studies to identify the knowledge gap and to highlight the areas that need more study and research. The review concludes that research on Arabic chatbots is rare and that all available works are retrieval-based.

Author 1: Sarah AlHumoud
Author 2: Asma Al Wazrah
Author 3: Wafa Aldamegh

Keywords: Artificial intelligence; Arabic chatbot; conversational agent; ArabChat; human-machine interaction; utterance

PDF

Paper 68: Learner Cognitive Behavior and Influencing Factors in Web-based Learning Environment

Abstract: In educational institutions, information and communication technology has enabled web-based learning approaches that improve student learning outcomes and performance. The traditional web-based learning environment in higher education aims to supply users with the learning content prescribed by the course curriculum. But in modeling the course curriculum and content, the motivational factors through which learners' cognitive skills develop have been left out. This issue therefore needs to be addressed in e-learning courses. It can be resolved by incorporating suitable learning objectives and appropriate skills-based interactive learning resources, which can enhance learners' thinking skills and cognitive behavior. This paper provides a theoretical framework on the pedagogical factors that can influence the quality of students' learning experience and cognitive learning skills in a web-based learning environment. Furthermore, the study discusses the role of prior knowledge and the learner's thought-process model in cognition-based learning environments.

Author 1: Kalla Madhusudhana

Keywords: Learning environment; cognitive behavior; influencing factors; pedagogical; knowledge; curriculum content

PDF

Paper 69: New Hybrid Task Scheduling Algorithm with Fuzzy Logic Controller in Grid Computing

Abstract: Distributed heterogeneous architectures are extensively applied in a variety of large-scale research projects to solve complex computational problems. These distributed systems consist of multiple heterogeneous linked processing units that handle continuously arriving jobs. The task scheduling problem is concerned with resource allocation strategies for assigning jobs to the available computing resources. Load balancing across the linked resources then becomes a central issue: each scheduling step must select an adequate computing resource. Our proposal combines Q-learning with Ant Colony Optimization (ACO) to solve the task allocation problem. In the proposed Fuzzy Hybrid Framework, fuzzy ants calculate new reward values at each scheduling operation, whereas Q-learning selects the suitable worker machine. The simulation findings confirm the efficiency of the proposed framework through a significant decrease in makespan.

Author 1: Younes Hajoui
Author 2: Omar Bouattane
Author 3: Mohamed Youssfi
Author 4: Elhocein Illoussamen

Keywords: Distributed systems; computational problems; load balancing; Q-learning; ACO; fuzzy hybrid framework

PDF
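The Q-learning half of the hybrid scheduler above can be sketched in a few lines. This is an illustrative simplification, not the authors' implementation: a simple negative-load reward stands in for the fuzzy-ant reward values, and the agent is reduced to a stateless (bandit-style) learner that picks a worker machine per task.

```python
import random

# Hypothetical sketch: Q-learning picks a worker machine for each task.
# The reward favours lightly loaded machines, standing in for the paper's
# fuzzy-ant reward computation.
N_WORKERS = 3
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1  # learning rate, discount, exploration

q = [0.0] * N_WORKERS          # one Q-value per worker (stateless form)
load = [0.0] * N_WORKERS       # simulated accumulated load per worker

def schedule(task_cost):
    # epsilon-greedy selection of a worker machine
    if random.random() < EPSILON:
        w = random.randrange(N_WORKERS)
    else:
        w = max(range(N_WORKERS), key=lambda i: q[i])
    load[w] += task_cost
    reward = -load[w]          # lighter-loaded machine => higher reward
    q[w] += ALPHA * (reward + GAMMA * max(q) - q[w])
    return w

random.seed(0)
for cost in [2.0, 1.0, 3.0, 1.5, 2.5, 1.0]:
    schedule(cost)
makespan = max(load)           # makespan = finish time of the busiest machine
```

The negative-load reward steers the learner toward spreading work out, which is exactly what drives the makespan decrease the abstract reports.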

Paper 70: Performance Comparison of QEC Network based JAVA Application and Web based PHP Application

Abstract: Every organization wants to automate manual systems for moving and storing its data in a particular format. The QEC department of a university collects teacher-evaluation feedback from students manually, which makes teacher records harder to maintain, is costlier, and offers fewer chances of generating an accurate and optimized report. A computerized system has been developed that generates accurate and optimized reports and makes teacher records easy to maintain. Many options are available for designing and developing such an application using different programming languages. We have developed a network-based JAVA application and a web-based PHP application to automate the manual teacher-evaluation system. The GUI of the application contains 18 questions, as per HEC policy, to be answered by the students. After the answers are submitted to the server, an Excel report is ready to be generated. Our primary focus is to measure the server performance of the network-based JAVA application and the web-based PHP application. Both forms cover the same scenario, but we have to find which form is more suitable and beneficial for an organization in terms of server performance parameters such as average response time, throughput, standard deviation, and data transfer rate.

Author 1: Sanaullah Memon
Author 2: Rasool Bux Palh
Author 3: Muniba Memon
Author 4: Hina Siddique Memon

Keywords: QEC; network based JAVA; web based PHP; server; apache Jmeter

PDF

Paper 71: Aspect-Combining Functions for Modular MapReduce Solutions

Abstract: MapReduce is a programming framework for modular Big Data computation that uses a map function to identify and target intermediate data in the mapping phase, and a reduce function to summarize the output of the map function and give a final result. Because inputs for the reduce function depend on the map function's output, MapReduce permits defining a combining function for local aggregation in the mapping phase, in order to decrease the communication traffic from the output of the map functions to the input of the reduce functions. MapReduce Hadoop solutions do not guarantee the application of the combining function. Even though proposals exist for guaranteeing the execution of the combining function, they break the modular nature of MapReduce solutions. Because Aspect-Oriented Programming (AOP) is a programming paradigm that pursues modular software production, this article proposes and applies the Aspect-Combining function, an AOP combining function, to obtain a modular MapReduce solution. Experimental results of applying Aspect-Combining to MapReduce Hadoop highlight computing performance and modularity improvements, with guaranteed execution of the combining function, using an AOP framework such as AspectJ as a mandatory requisite.

Author 1: Cristian Vidal Silva
Author 2: Rodolfo Villarroel
Author 3: José Rubio
Author 4: Franklin Johnson
Author 5: Erika Madariaga
Author 6: Alberto Urzúa
Author 7: Luis Carter
Author 8: Camilo Campos-Valdés
Author 9: Xaviera A. López-Cortés

Keywords: Combining; Hadoop; MapReduce; AOP; AspectJ; aspects

PDF
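The role of the combining function the abstract discusses can be illustrated with a minimal word-count pipeline. This sketch is not Hadoop's API (function names here are illustrative); it only shows why local aggregation in the mapping phase shrinks the traffic that reaches the reducers.

```python
from collections import Counter
from itertools import chain

# Illustrative map -> combine -> reduce pipeline: the combiner aggregates
# each mapper's output locally, so fewer (word, count) pairs travel to the
# reduce phase.
def map_phase(line):
    return [(w, 1) for w in line.split()]

def combine(pairs):                      # local aggregation per mapper
    c = Counter()
    for word, n in pairs:
        c[word] += n
    return list(c.items())

def reduce_phase(all_pairs):             # global aggregation
    c = Counter()
    for word, n in all_pairs:
        c[word] += n
    return dict(c)

splits = ["big data big", "data big"]
mapped = [map_phase(s) for s in splits]          # one mapper per split
combined = [combine(m) for m in mapped]          # fewer pairs than mapped
result = reduce_phase(chain.from_iterable(combined))
```

Here the first mapper emits three pairs but ships only two after combining; the modularity question the paper addresses is how to guarantee this step runs without weaving it into the map code by hand.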

Paper 72: A New Message Encryption Method based on Amino Acid Sequences and Genetic Codes

Abstract: As the use of technology increases rapidly, the amount of shared, sent, and received information is increasing in the same way. As a result, techniques are needed that can store and secure information over the net. Many methods have been used to protect information, such as information hiding and encryption. In this study, we propose a new encryption method making use of amino acid and DNA sequences. In addition, several criteria, including data size, key size, and the probability of cracking, are used to evaluate the proposed method. The results show that, in terms of these evaluation criteria, the performance of the proposed method is better than that of many common encryption methods, such as RSA.

Author 1: Ahmed Mahdee Abdo
Author 2: Adel Sabry Essa
Author 3: Abdullah A. Abdullah

Keywords: Information; secure; encryption

PDF
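The DNA-encoding step that schemes of this kind typically build on can be sketched as follows. This is not the authors' full method (which also involves amino acid sequences and genetic codes); it only shows the common first stage of mapping binary data onto nucleotides, where the mapping table itself can serve as part of the key.

```python
# Illustrative sketch: each 2-bit pair maps to one nucleotide. The choice
# of table (24 possible orderings of A/C/G/T) is key material in many
# DNA-based schemes; this particular table is an assumption for the demo.
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
INV = {v: k for k, v in BASE.items()}

def encode(msg: str) -> str:
    bits = "".join(f"{b:08b}" for b in msg.encode())
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    bits = "".join(INV[b] for b in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode()

dna = encode("Hi")   # "Hi" = 01001000 01101001 -> "CAGACGGC"
```

A full cipher would then permute or translate the nucleotide string (e.g. via codon-to-amino-acid tables) under a secret key; the mapping alone is an encoding, not encryption.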

Paper 73: Using Artificial Neural Networks for Detecting Damage on Tobacco Leaves Caused by Blue Mold

Abstract: Worldwide, the monitoring of pests and diseases plays a fundamental role in agricultural sustainability, making it necessary to develop new tools for early pest detection. In this vein, we present a software application for detecting damage in tobacco (Nicotiana tabacum L.) leaves caused by the blue mold fungus (Peronospora tabacina Adam). The application processes images of tobacco leaves using a pattern recognition technique known as the Artificial Neural Network. For the training and testing stages, a total of 40 images of tobacco leaves were used. The experiments carried out show that the developed model has an accuracy higher than 97% and that there is no significant difference from a visual analysis carried out by experts in the tobacco crop.

Author 1: Himer Avila-George
Author 2: Topacio Valdez-Morones
Author 3: Humberto Pérez-Espinosa
Author 4: Brenda Acevedo-Juárez
Author 5: Wilson Castro

Keywords: Nicotiana tabacum L.; Peronospora tabacina Adam; image processing; artificial neural networks

PDF

Paper 74: Complex Shear Modulus Estimation using Integration of LMS/AHI Algorithm

Abstract: Elasticity and viscosity are two important tissue parameters that can be used to investigate the structure of tissues, and in particular to detect tumors. Using a force excitation, the shear wave is acquired and its amplitude and phase are extracted. This information is then used, directly or indirectly, to compute the Complex Shear Modulus (CSM), which comprises elasticity and viscosity. Among the existing methods, the Algebraic Helmholtz Inversion (AHI) algorithm can be combined with the Finite Difference Time Domain (FDTD) model to estimate the CSM effectively. However, this algorithm is strongly affected by measurement noise when acquiring the particle velocity. We therefore propose an LMS/AHI algorithm that can estimate the CSM correctly. A simulation scenario is built to confirm the performance of the proposed LMS/AHI algorithm, with an average error of 3.14%.

Author 1: Quang–Hai Luong
Author 2: Manh–Cuong Nguyen
Author 3: TonThat–Long
Author 4: Duc–Tan Tran

Keywords: Shear wave; elasticity; viscosity; CSM estimation; least mean square; Algebraic Helmholtz Inversion

PDF
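The Algebraic Helmholtz Inversion at the heart of the abstract solves the Helmholtz equation pointwise for the complex shear modulus. In its commonly stated form (the exact formulation used in the paper should be checked against the full text), for tissue density $\rho$, excitation frequency $\omega$, and measured particle velocity field $v$:

```latex
\rho\,\omega^{2}\, v(\mathbf{r}) + G^{*}(\omega)\,\nabla^{2} v(\mathbf{r}) = 0
\quad\Longrightarrow\quad
G^{*}(\omega) = -\,\frac{\rho\,\omega^{2}\, v(\mathbf{r})}{\nabla^{2} v(\mathbf{r})}
= \mu + i\,\omega\,\eta
```

where $\mu$ is the elasticity and $\eta$ the viscosity (Kelvin-Voigt model). The division by the Laplacian $\nabla^{2} v$ is what makes AHI noise-sensitive, which motivates the LMS pre-filtering the authors propose.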

Paper 75: RASP-TMR: An Automatic and Fast Synthesizable Verilog Code Generator Tool for the Implementation and Evaluation of TMR Approach

Abstract: The Triple Modular Redundancy (TMR) technique is one of the best-known techniques for error masking and protection against Single Event Effects (SEE) in FPGA designs. These FPGA designs are mostly expressed in hardware description languages such as Verilog and VHDL. The TMR technique involves triplicating the design module and adding a majority voter circuit for each output port. Building this triplication scheme is a non-trivial task and requires a lot of time and effort to alter the code of the design. In this paper, the RASP-TMR tool is developed and presented; it takes a synthesizable Verilog design file as input, parses the design, and triplicates it. The tool also generates a top-level module in which all three modules are instantiated, and finally adds the proposed majority voter circuit. The tool, with its graphical user interface, is implemented in MATLAB. It is simple, fast, and user-friendly, and generates a synthesizable design that lets the user evaluate and verify the TMR design for FPGA-based systems. A simulation scenario is created using Xilinx ISE tools and the ISim simulator. Different fault models, such as bit-flip and stuck-at-1/0, are examined during the simulations. Results on various benchmark designs demonstrate that the tool produces synthesizable code and that the proposed majority voter logic perfectly masks errors and failures.

Author 1: Abdul Rafay Khatri
Author 2: Ali Hayek
Author 3: Josef Borcsok

Keywords: Fault injection; fault tolerance; reliability; single event effects; triple modular redundancy; Verilog HDL

PDF
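The structure the tool generates in Verilog — three instances of the design plus a 2-of-3 majority voter per output — can be sketched behaviourally. Python stands in here for the emitted HDL, and the stand-in `module` function is an assumption; the voter expression `(a&b)|(b&c)|(a&c)` is the classic form.

```python
# Illustrative sketch of the TMR idea the tool automates: the design is
# triplicated and a bitwise majority voter masks a fault in any single copy.
def module(x):              # stand-in for the design under protection
    return x ^ 0b1010

def majority(a, b, c):      # classic 2-of-3 voter: (a&b)|(b&c)|(a&c)
    return (a & b) | (b & c) | (a & c)

def tmr(x, fault_mask=0):
    r1 = module(x) ^ fault_mask   # inject a bit-flip fault into one copy
    r2 = module(x)
    r3 = module(x)
    return majority(r1, r2, r3)

assert tmr(0b1100) == module(0b1100)                      # fault-free
assert tmr(0b1100, fault_mask=0b0100) == module(0b1100)   # single fault masked
```

The voter masks any single-copy fault bit-for-bit, which is exactly the property the paper verifies under bit-flip and stuck-at fault injection.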

Paper 76: Design of Linear Time Varying Flatness-Based Control for Single-Input Single-Output Systems

Abstract: In this paper, the control of linear discrete-time varying single-input single-output systems is tackled. Using flatness theory combined with a dead-beat observer, a two-degree-of-freedom controller is designed with high performance in terms of trajectory tracking. The aim of this work is to avoid the choice of closed-loop poles in the linear discrete-time varying framework, which poses a very serious problem in system control. The effectiveness of this control law is highlighted by simulation results.

Author 1: Marouen Sleimi
Author 2: Mohamed Ben Abdallah
Author 3: Mounir Ayadi

Keywords: Flatness theory; discrete-time systems; linear time varying; single-input single-output; dead-beat observer; two degree of freedom controller

PDF

Paper 77: Comparative Performance Analysis of Efficient MIMO Detection Approaches

Abstract: Promising massive MIMO (multiple-input multiple-output) systems based on extremely large antenna arrays have become a hot topic in wireless communication systems. This paper assesses the performance of a quasi-optimal MIMO detection approach based on semi-definite programming (SDP). The study also investigates the gain obtained when using the SDP detector by comparing its Bit Error Rate (BER) performance with that of linear detectors. The near-optimal Zero Forcing Maximum Likelihood (ZFML) detector is also implemented and included in the comparison. The ZFML detector reduces exhaustive ML searching using the multi-step reduced constellation (MSRC) detection technique, efficiently combining linear processing with a local ML search. Complexity is bounded by keeping the search areas small, while performance is maximized by relaxing this constraint and increasing the cardinality of the search space. The near-optimality of SDP is analyzed through BER performance for different antenna configurations using a 16-QAM signal constellation operating in a flat fading channel. Simulation results indicate that the SDP detector achieves better BER performance, in addition to a significant decrease in computational complexity, across different system/antenna configurations.

Author 1: Muhammad Faisal
Author 2: Fazal Wahab Karam
Author 3: Ali Zahir
Author 4: Sajid Bashir

Keywords: Multiple input multiple output antennas; MIMO detection approaches; performance analysis; semi-definite programming; zero forcing maximum likelihood

PDF

Paper 78: Sentiment Analysis, Visualization and Classification of Summarized News Articles: A Novel Approach

Abstract: Due to advances in technology, an enormous amount of data is generated every day. One of the main challenges of such large volumes of data is that users are overloaded with information. Effective methods are therefore needed to help users comprehend large amounts of data. This research work proposes effective methods to extract and represent data. Summarization is used to obtain a brief overview of a text, and sentiment analysis computationally identifies the emotions expressed in it. Combined text summarization and sentiment analysis is proposed on BBC news articles. A pronoun-replacement-based text summarization method is developed, and the VADER sentiment analyzer is used to determine sentiment information. 3-D visualization schemes are provided to represent the sentiment information. Sentiment analysis and classification are performed on the original BBC news articles as well as on the summarized articles, using classifiers such as Logistic Regression, Random Forest, and AdaBoost. On the original news articles, the highest classification rate observed is 84.93%; with summarization ratios of 25%, 50%, and 75%, the highest classification rates are 78.73%, 83.06%, and 83.23%, respectively.

Author 1: Siddhaling Urologin

Keywords: Summarization; sentiment analysis; 3-D visualization; sentiment classification

PDF
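The summarize-then-score pipeline the abstract describes can be sketched end to end. Everything here is a stand-in under stated assumptions: a frequency-based sentence ranker replaces the paper's pronoun-replacement summarizer, and a toy polarity lexicon replaces VADER's scoring.

```python
import re
from collections import Counter

# Toy polarity lexicon standing in for VADER (illustrative values only).
LEXICON = {"good": 1.0, "great": 1.5, "bad": -1.0, "terrible": -1.5}

def summarize(text, ratio=0.5):
    # Extractive summarizer: keep the top `ratio` of sentences ranked by
    # summed word frequency (a stand-in for the paper's method).
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freqs = Counter(w.lower() for w in re.findall(r"\w+", text))
    ranked = sorted(
        sentences,
        key=lambda s: -sum(freqs[w.lower()] for w in re.findall(r"\w+", s)),
    )
    keep = max(1, int(len(sentences) * ratio))
    return " ".join(s for s in sentences if s in ranked[:keep])

def sentiment(text):
    # Sum of lexicon polarities; positive total => positive article.
    return sum(LEXICON.get(w, 0.0) for w in re.findall(r"\w+", text.lower()))

article = "The economy grew. Analysts called it a great result. Weather was bad."
short = summarize(article, ratio=0.34)
label = "positive" if sentiment(article) > 0 else "negative"
```

The paper's finding that 50-75% summaries classify nearly as well as full articles corresponds to the idea that the dropped low-rank sentences carry little sentiment-bearing signal.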

Paper 79: Minimization of Information Asymmetry Interference using Partially Overlapping Channel Allocation in Wireless Mesh Networks

Abstract: The Wireless Mesh Network (WMN) is a developing technology that greatly improves performance, flexibility, and reliability over traditional wireless networks. Using multi-hop communication, these networks are installed as a solution to extend last-mile access to the Internet. WMNs have already been deployed, but they still face certain issues regarding channel assignment and interference. One well-known interference issue is Information Asymmetry (IA) interference, which increases the retransmission ratio and end-to-end delay, and thus decreases the overall network capacity of the WMN. Various past studies have tried to minimize information asymmetry interference using the limited number of orthogonal (non-overlapping) channels, i.e., channels 1, 6, and 11 of the IEEE 802.11b radio technology. Recent studies note that partially overlapping channels (POCs) can instead be used to maximize network capacity. The purpose of this research is to minimize the Information Asymmetry (IA) interference problem by proposing a channel assignment model called Optimal Partially Overlapping Channel Assignment (OPOCA). In this research, OPOCA is compared with the existing Information Asymmetry Minimization (IAM) model. Extensive simulations verify that the proposed OPOCA model gives 8% better results than the existing IAM model.

Author 1: Sadiq Shah
Author 2: Khalid Saeed
Author 3: Mustafa Khan
Author 4: Rafi Ullah Khan

Keywords: Wireless Mesh Network (WMN); information asymmetry; Optimal Partially Overlapping Channel Assignment (OPOCA); NOC; Information Asymmetry Minimization (IAM) model

PDF

Paper 80: Initialization Method for Communication and Data Sharing in P2P Environment Between Wireless Sensor Nodes

Abstract: Wireless Sensor Networks (WSNs) have attracted noteworthy attention nowadays, compared with wired sensor systems, by introducing multi-functional wireless nodes that are smaller in size. However, WSN communication is prone to negative effects from the physical environment, such as physical obstacles and interference. The purpose of this work is to design a testbed that introduces a method for communication startup and data sharing in a peer-to-peer (P2P) environment between wireless sensor nodes. The work targets both the IEEE 802.15.4 physical layer and the application layer. In this testbed, one channel from the IEEE 802.15.4 channel range is dedicated as an "emergency channel", used for handshaking or in case of communication failure between the transmitter (Tx) and receiver (Rx) nodes. The remaining 15 channels, called "data channels", are used for the actual data transmission and control signals. Linux-based TinyOS-2.x is used as the operating system for the low-power sensors. MICAz motes are used as nodes, and a MIB520 programming board is used for programming the motes and as a gateway.

Author 1: M. Asif Jamal
Author 2: Aziz Ur Rehman
Author 3: Moonisa Ahsan
Author 4: M. S. Riaz
Author 5: M. S. Zafar

Keywords: TinyOS; peer-to-peer; motes; testbed; nesC; MICAz; MIB520; handshaking

PDF

Paper 81: Formal Specification of Memory Coherence Protocol

Abstract: Memory coherence is the most fundamental requirement in a shared virtual memory system containing concurrent, loosely coupled processes. These processes can request a page for reading or writing. The memory is called coherent if the last update to a page remains visible to each process until the owner of that page changes it. Ownership is transferred to a process that wants to update the page. In [Kai Li and Paul Hudak, "Memory Coherence in Shared Virtual Memory Systems", Proc. of the Fifth Annual ACM Symposium on Principles of Distributed Computing, 1986], algorithms ensuring memory coherence are given. We formally specify these protocols and report the improvements found through formal analysis. The protocols are specified in UPPAAL, a tool for the modeling, validation, and verification of real-time systems.

Author 1: Jahanzaib Khan
Author 2: Muhammad Atif
Author 3: Muhammad Khurram Zahoor Bajwa
Author 4: Muhammad Sohaib Mahmood
Author 5: Sobia Usman

Keywords: Memory coherence; formal specification; shared memory; address space; analysis

PDF

Paper 82: Digital Technology Disorder: Justification and a Proposed Model of Treatment

Abstract: With advances in technology being made at an exponential rate, organisations attempt to compete with one another by utilising state-of-the-art technology to provide innovative products and services that encourage use. However, there is no moral code to inform sensitive technology design, a consequence of which is the emergence of so-called technology addiction. While addiction as a term is problematic, increasing evidence suggests that related conditions have implications for the individual, for organisations, and for wider society. In this research, a consideration of the potentially addictive elements of technology indicates that it can be possible to reverse engineer these systems, as it were, to promote the development of new behaviours that enable the individual to abstain from overuse. Using smartphones to deliver digital behavioural change interventions can leverage abundant data touchpoints to provide highly tailored treatment, in addition to allowing enhanced monitoring and accuracy. To inform understanding of this contemporary phenomenon, the literature on addiction has been reviewed, along with the literature on persuasion architecture, to build an understanding of the techniques that lend themselves to overuse and of how these can be leveraged to promote recovery. From this review, the authors have developed a proposed model to inform the practice of those operating in the domains of computer science.

Author 1: Andrew Kear
Author 2: Sasha L. Folkes

Keywords: Addiction; digital; treatment; data; smartphone; behaviour; overuse; interventions

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org