The Science and Information (SAI) Organization
IJACSA Volume 7 Issue 11

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: BITRU: Binary Version of the NTRU Public Key Cryptosystem via Binary Algebra

Abstract: New terms such as the closest vector problem (CVP) and the shortest vector problem (SVP), both shown to be NP-hard, have emerged, raising new hope for designing public key cryptosystems based on lattice hardness. The NTRU cryptosystem is computationally efficient and can be implemented at low cost. With these characteristics, NTRU has an advantage over other systems that rely on number-theoretical problems in a finite field (e.g., the integer factorization problem or the discrete logarithm problem). These advantages make NTRU a good choice for many applications. Since the adoption of NTRU, many attempts to generalize its algebraic structure have appeared. In this study, a new variant of the NTRU public key cryptosystem called BITRU is proposed. BITRU is based on a new algebraic structure, used as an alternative to the NTRU mathematical structure, called binary algebra, which is commutative and associative. Establishing two public keys distinguishes the proposed system from NTRU and NTRU-like cryptosystems, and this new structure helps to increase the security and complexity of BITRU. The components of BITRU, including key generation, encryption, decryption, and decryption failure, are explained in detail. The suitability of the proposed system is proven and its security is demonstrated by comparing it with NTRU.

Author 1: Nadia M.G. Alsaidi
Author 2: Hassan R. Yassein

Keywords: NTRU; BITRU; polynomial ring; binary algebra

PDF

Paper 2: OWLMap: Fully Automatic Mapping of Ontology into Relational Database Schema

Abstract: The Semantic Web is a prominent topic in the current research era. An automated approach is needed to transform ontology constructs into a relational database so that they can be queried efficiently. Previous work on transforming RDF/OWL concepts into relational databases is flawed and incomplete: some researchers claim their transformation technique is entirely automated, yet their mapping approach misses essential OWL constructs. This paper presents a tool called OWLMap that is fully automatic and provides a lossless approach for transforming an ontology into a relational database format. A number of experiments on ontology-to-relational-database transformation show that the proposed approach is fully automatic, effective and fast. OWLMap is based on an approach that is lossless: it does not lose data, data types or structure.

Author 1: Humaira Afzal
Author 2: Mahwish Waqas
Author 3: Tabbassum Naz

Keywords: Semantic Web; Ontology; Database; Mapping; OWL; Jena API

PDF
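The core idea of mapping ontology classes to relational tables can be illustrated with a minimal sketch. The class and property names below are hypothetical, and a real OWL-to-relational tool such as OWLMap (which uses the Jena API) handles far more constructs, such as object properties, class hierarchies and restrictions:

```python
# Minimal sketch: map one ontology class with datatype properties to SQL DDL.
# Names and type mappings are illustrative, not OWLMap's actual algorithm.

XSD_TO_SQL = {
    "xsd:string": "VARCHAR(255)",
    "xsd:integer": "INTEGER",
    "xsd:boolean": "BOOLEAN",
}

def class_to_ddl(class_name, datatype_properties):
    """Turn an OWL class and its datatype properties into a CREATE TABLE."""
    cols = ["id INTEGER PRIMARY KEY"]
    for prop, xsd_type in datatype_properties.items():
        cols.append(f"{prop} {XSD_TO_SQL.get(xsd_type, 'TEXT')}")
    return f"CREATE TABLE {class_name} ({', '.join(cols)});"

ddl = class_to_ddl("Person", {"name": "xsd:string", "age": "xsd:integer"})
print(ddl)
```

Each OWL class becomes a table, and each datatype property becomes a typed column; unknown XSD types fall back to TEXT.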

Paper 3: Vismarkmap – A Web Search Visualization Technique through Visual Bookmarking Approach with Mind Map Method

Abstract: Due to the massive growth of information on the Internet, bookmarking has become the most popular technique for keeping track of websites, with the expectation of easily finding previously searched websites whenever they are needed. However, present browser bookmark systems and online social bookmarking websites do not let users manage their searches in a way that helps them recognize or recall a previously searched website and its content from the bookmark when it is needed. In this paper, a new bookmarking technique is proposed that lets users organize their bookmarks using a mind map, a scientifically validated mental model. It helps users easily recall information about previously searched websites from their bookmarks and minimizes the tendency to revisit or re-search websites using search engines. The proposed system is more than a mind map, as it provides additional flexibility for organizing bookmarks.

Author 1: Abdullah Al-Mamun
Author 2: Sheak Rashed Haider Noori

Keywords: hci; visual bookmark; information retrieval; mind map; visualization

PDF
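The mind-map-style bookmark organization described above can be sketched as a simple tree of labeled nodes. The node fields and example labels here are hypothetical, not VismarkMap's actual data model:

```python
# Hypothetical sketch of a mind-map bookmark tree: each node holds a label
# and an optional URL, and bookmarks are found by walking the branches.

class Node:
    def __init__(self, label, url=None):
        self.label = label
        self.url = url
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, label):
        """Depth-first search for a node by label."""
        if self.label == label:
            return self
        for child in self.children:
            hit = child.find(label)
            if hit:
                return hit
        return None

root = Node("My searches")
ml = root.add(Node("Machine learning"))
ml.add(Node("Backprop tutorial", url="https://example.org/backprop"))
print(root.find("Backprop tutorial").url)
```

The branching structure is what lets a user recall a bookmark by retracing the topic path rather than scanning a flat list.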

Paper 4: A Sales Forecasting Model in the Automotive Industry using Adaptive Neuro-Fuzzy Inference System (ANFIS) and Genetic Algorithm (GA)

Abstract: Nowadays, sales forecasting is vital for any business in a competitive environment, and accurate forecasting requires considering the correct variables. In this paper, we address these problems and propose a technique that combines two artificial intelligence algorithms to forecast future automobile sales of the Saipa group, a leading automobile manufacturer in Iran. ANFIS is used as the base technique and is combined with GA, which is used to tune the ANFIS results. Forecasting is performed on annual data from 1990 to 2016. With this in mind, per capita income, inflation rate, housing, imports, currency rate (USD), loan interest rate and automobile import tariffs are selected as the explanatory variables in the proposed model. Finally, we compare our model with an ANN model, a well-known forecasting model.

Author 1: Amirmahmood Vahabi
Author 2: Shahrooz Seyyedi Hosseininia
Author 3: Mahmood Alborzi

Keywords: Sales Forecasting; Adaptive Neuro-fuzzy inference system (Anfis); Genetic Algorithm (GA)

PDF

Paper 5: Japanese Dairy Cattle Productivity Analysis using Bayesian Network Model (BNM)

Abstract: A productivity analysis of Japanese dairy cattle is carried out based on a Bayesian Network Model (BNM). In an experiment with 280 anestrus Japanese Holstein dairy cows, estimating the presence of the estrous cycle using BNM achieves about 55% accuracy when all samples are considered, whereas about 73% accuracy is achieved when using suspended likelihood on the sample datasets. Moreover, when the proposed BNM has higher confidence, the estimation accuracy lies between 93% and 100%. In addition, this research reveals the optimum factors for detecting the presence of the estrous cycle among the 270 individual dairy cows. The objective estimation method using BNM offers a way to overcome the errors of subjective estimation of the estrous cycle in Japanese dairy cattle.

Author 1: Iqbal Ahmed
Author 2: Kenji Endo
Author 3: Osamu Fukuda
Author 4: Kohei Arai
Author 5: Hiroshi Okumura
Author 6: Kenichi Yamashita

Keywords: Bayesian Network Model; BCS; Postpartum Interval; Parity Number; Estrous Cycle; Cattle Productivity

PDF

Paper 6: Analysis of Security Requirements Engineering: Towards a Comprehensive Approach

Abstract: Software’s security depends greatly on how a system was designed, so it’s very important to capture security requirements at the requirements engineering phase. Previous research proposes different approaches, but each is looking at the same problem from a different perspective such as the user, the threat, or the goal perspective. This creates huge gaps between them in terms of the used terminology and the steps followed to obtain security requirements. This research aims to define an approach as comprehensive as possible, incorporating the strengths and best practices found in existing approaches, and filling the gaps between them. To achieve that, relevant literature reviews were studied and primary approaches were compared to find their common and divergent traits. To guarantee comprehensiveness, a documented comparison process was followed. The outline of our approach was derived from this comparison. As a result, it reconciles different perspectives to security requirements engineering by including: the identification of stakeholders, assets and goals, and tracing them later to the elicited requirements, performing risk assessment in conformity with standards and performing requirements validation. It also includes the use of modeling artifacts to describe threats, risks or requirements, and defines a common terminology.

Author 1: Ilham Maskani
Author 2: Jaouad Boutahar
Author 3: Souhaïl El Ghazi El Houssaïni

Keywords: Security requirements; Requirements engineering; Security standards; Comparison; Risk assessment

PDF

Paper 7: TeachMe, a Gesture Recognition System with a Customization Feature

Abstract: Many presentations these days are done with the help of a presentation tool. Lecturers at universities and researchers at conferences use such tools to order the flow of a presentation and to help audiences follow its points. Presenters control presentation tools using a mouse and keyboard, which keeps them beside the computer, close enough to the keyboard and mouse. This reduces the lecturer's ability to move close to the audience and reduces eye contact with them. Moreover, such traditional techniques for controlling presentation tools lack naturalness of communication. Several gesture recognition tools have been introduced as solutions to these problems. However, these tools require the user to learn specific gestures to control the presentation and/or the mouse; these specific gestures can be considered a gesture vocabulary for the recognition system. This paper introduces a gesture recognition system, TeachMe, which controls the Microsoft PowerPoint presentation tool and the mouse pointer. TeachMe also has a gesture customization feature that allows users to customize some gestures according to their preference, and it uses the Kinect device as an interface for capturing gestures. This paper discusses in detail the techniques and factors taken into consideration in implementing the system and its customization feature.

Author 1: Hazem Qattous
Author 2: Bilal Sowan
Author 3: Omar AlSheikSalem

Keywords: Microsoft Kinect®; Gesture recognition system; Gesture customization

PDF

Paper 8: A Mobile Device Software to Improve Construction Sites Communications "MoSIC"

Abstract: Effective communication among project participants at construction sites is a real dilemma for construction project productivity. To improve the efficiency of participants in construction projects and speed up project delivery, this paper presents the development of a mobile application system to support construction site communication. The developed system is designed to enhance communication between home office employees, field office staff, and mobile users at construction sites. It has two components: a mobile application and a website. The mobile application provides users with valuable features such as receiving site instructions, sending requests for interpretation, and retrieving project information, while the website allows users such as home office employees to track project progress and locate projects. The system was tested first on emulators, then on Android devices, and finally on a highway improvement project. Through their mobile phones, site users are able to interact with field office and home office personnel, who use the web application to communicate with mobile users. This work is expected to facilitate communication at construction sites, which is much needed in this information-intensive sector.

Author 1: Adel Khelifi
Author 2: Khaled Hesham Hyari

Keywords: Mobile application; Construction communication; Construction site; Construction information

PDF

Paper 9: Framework of Resource Management using Server Consolidation to Minimize Live Migration and Load Balancing

Abstract: Live migration is one of the essential operations that require more attention in order to address its high-variability problems with virtual machines. We review the existing resource management techniques and find that little modeling has been done to solve this problem. The present paper introduces a novel framework aimed at a computationally effective resource management technique. The technique uses a stochastic modeling approach to design a new traffic management scheme that considers multiple traffic possibilities over VMs along with their switching states. Supported by an analytical modeling approach, the proposed technique offers efficient placement of virtual machines on physical servers, performs block computation, and reduces resource usage. The study outcome shows a potential reduction in live migration, a greater extent of VM mapping onto physical servers, and an increased level of capacity.

Author 1: Alexander Ngenzi
Author 2: Selvarani R
Author 3: Suchithra R

Keywords: Resource Management; Live Migration; Virtual Machine; Load Balancing; Cloud Computing

PDF

Paper 10: Automatic Rotation Recovery Algorithm for Accurate Digital Image and Video Watermarks Extraction

Abstract: Research in digital watermarking has evolved rapidly in the current decade, bringing many different methods and algorithms for watermarking digital images and videos. Methods in the field range from weak to robust, according to how well the watermark survives in the presence of attacks. Rotation of the watermarked media is one of the serious attacks that many, if not most, algorithms cannot survive. In this paper, a new automatic rotation recovery algorithm is proposed. The algorithm can be plugged into the extraction component of any image or video watermarking algorithm. Its main job is to detect the geometrical distortion applied to the watermarked image or image sequence, recover the distorted scene to its original state in a blind and automatic way, and then pass it to the extraction procedure. The work is currently limited to recovering zero-padded rotations; images cropped after rotation are left as future work. The proposed algorithm was tested on top of an extraction component, and both the recovery accuracy and the accuracy of the extracted watermarks showed a high performance level.

Author 1: Nasr addin Ahmed Salem Al-maweri
Author 2: Aznul Qalid Md Sabri
Author 3: Ali Mohammed Mansoor

Keywords: Rotation recovery; image watermarking; video watermarking; watermark extraction; robustness

PDF

Paper 11: A Comparative Study Between the Capabilities of MySQL vs. MongoDB as a Back-End for an Online Platform

Abstract: In this article we present a comparative study of the usage capabilities of MongoDB, a non-relational database, and MySQL, a relational database, as the back-end for an online platform. We also present the advantages of using a non-relational database, namely MongoDB, compared to a relational database, namely MySQL, when integrated into an online platform that allows users to publish articles, books, magazines and so on, and to share their items online with other people. Nowadays, most applications have thousands of users performing operations simultaneously, so more than one operation must be executed at a time to really see the differences between the two databases. This paper aims to highlight the differences between MySQL and MongoDB, integrated into an online platform, when various operations are executed in parallel by many users.

Author 1: Cornelia Gyorödi
Author 2: Robert Gyorödi
Author 3: Ioana Andrada Olah
Author 4: Livia Bandici

Keywords: MySQL; relational database; MongoDB; non-relational database; comparative study

PDF

Paper 12: Security Risk Assessment of Cloud Computing Services in a Networked Environment

Abstract: Different cloud computing service providers offer their customers services with different risk levels, and customers wish to minimize their risks for a given expenditure or investment. This paper concentrates on the consumers' point of view. Cloud computing services are organized in a hierarchy: software application services, beneath them platform services, which in turn use infrastructure services. Providers currently offer software services as bundles that include the software, platform and infrastructure services; they also offer platform services bundled with infrastructure services. Bundling prevents customers from splitting their service purchases between a provider of software and a different provider of the underlying platform or infrastructure. The underlying assumption in this paper is the existence of a free competitive market, in which consumers are free to switch their services among providers. The proposed model is aimed at the potential customer who wishes to compare the risks of the cloud service bundles offered by providers. The article identifies the major components of risk at each level of cloud computing services, and a computational scheme is offered to assess the overall risk on a common scale.

Author 1: Eli WEINTRAUB
Author 2: Yuval COHEN

Keywords: Cloud Computing; Risk Management; Information Security; Cloud Risks; Software as a service; Platform as a service; Infrastructure as a service

PDF
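A common-scale aggregation of the kind the abstract describes can be sketched as a weighted combination of per-layer risk scores. The weights, score scale and layer names below are illustrative assumptions, not the paper's actual computational scheme:

```python
# Illustrative sketch: combine per-layer risk scores (0-10) into one overall
# score using consumer-chosen weights. Weights and layers are hypothetical,
# not the scheme proposed in the paper.

def overall_risk(layer_risks, weights):
    """Weighted average of risk scores across SaaS/PaaS/IaaS layers."""
    total_w = sum(weights[layer] for layer in layer_risks)
    return sum(layer_risks[layer] * weights[layer]
               for layer in layer_risks) / total_w

bundle = {"SaaS": 4.0, "PaaS": 6.0, "IaaS": 2.0}
weights = {"SaaS": 0.5, "PaaS": 0.3, "IaaS": 0.2}
score = overall_risk(bundle, weights)
print(round(score, 2))  # 4.0*0.5 + 6.0*0.3 + 2.0*0.2 = 4.2
```

Putting every bundle's layers on one weighted scale is what lets a consumer compare bundles from different providers directly.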

Paper 13: Using Multiple Seasonal Holt-Winters Exponential Smoothing to Predict Cloud Resource Provisioning

Abstract: Elasticity is one of the key features of cloud computing that attracts many SaaS providers seeking to minimize the cost of their services. Cost is minimized by automatically provisioning and releasing computational resources according to actual computational needs. However, the delay in starting up new virtual resources can cause Service Level Agreement violations. Consequently, predicting cloud resource provisioning has gained a lot of attention as a way to scale computational resources in advance. However, most current approaches do not consider multi-seasonality in cloud workloads. This paper proposes a cloud resource provisioning prediction algorithm based on the Holt-Winters exponential smoothing method, which it extends to model cloud workloads with multiple seasonal cycles. The prediction accuracy of the proposed algorithm is improved by employing the Artificial Bee Colony algorithm to optimize its parameters. The performance of the proposed algorithm has been evaluated and compared with double and triple exponential smoothing methods, and our results show that it outperforms them.

Author 1: Ashraf A. Shahin

Keywords: auto-scaling; cloud computing; cloud resource scaling; holt-winters exponential smoothing; resource provisioning; virtualized resources

PDF
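As background for the method being extended, additive Holt-Winters smoothing with a single seasonal cycle can be sketched as follows. The paper's contributions (multiple seasonal cycles and Artificial Bee Colony parameter tuning) are not reproduced here, and the synthetic series and parameter values are illustrative:

```python
# Minimal additive Holt-Winters sketch with ONE seasonal cycle. The paper
# extends this to multiple seasonal cycles and tunes alpha/beta/gamma with
# Artificial Bee Colony; this sketch uses fixed, hand-picked parameters.

def holt_winters_additive(series, season_len, alpha, beta, gamma, n_forecast):
    # Initialize level, trend and seasonal terms from the first two seasons.
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len])
             - sum(series[:season_len])) / season_len ** 2
    seasonal = [series[i] - level for i in range(season_len)]
    for i in range(season_len, len(series)):
        s = seasonal[i % season_len]
        prev_level = level
        level = alpha * (series[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[i % season_len] = gamma * (series[i] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seasonal[(len(series) + h) % season_len]
            for h in range(n_forecast)]

# Synthetic workload: upward trend plus a 4-step seasonal pattern.
series = [0.5 * i + [0, 5, 10, 5][i % 4] for i in range(24)]
forecast = holt_winters_additive(series, 4, 0.3, 0.1, 0.2, 4)
print(forecast)
```

The forecast extrapolates the level and trend while repeating the learned seasonal offsets, which is why the method can pre-provision resources ahead of a predictable load peak.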

Paper 14: Optimal Path Planning using RRT* based Approaches: A Survey and Future Directions

Abstract: Optimal path planning refers to finding the collision-free, shortest, and smoothest route between start and goal positions. This task is essential in many robotic applications such as autonomous cars, surveillance operations, agricultural robots, and planetary and space exploration missions. Rapidly-exploring Random Tree Star (RRT*) is a renowned sampling-based planning approach that has gained immense popularity due to its support for high-dimensional complex problems. A significant body of research has addressed optimal path planning for mobile robots using RRT*-based approaches; however, no updated survey of such approaches is available. Considering the rapid pace of development in this field, this paper presents a comprehensive review of RRT*-based path planning approaches. Current issues relevant to notable advancements in the field are investigated, and the discussion concludes with challenges and future research directions.

Author 1: Iram Noreen
Author 2: Amna Khan
Author 3: Zulfiqar Habib

Keywords: optimal path; mobile robots; RRT*; sampling based planning; survey; future directions

PDF
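The core RRT* loop (sample, extend toward the sample, choose the lowest-cost parent within a radius, then rewire neighbors through the new node) can be sketched in an obstacle-free 2D unit square. This is an illustrative toy, not any surveyed algorithm: a real planner adds collision checking, goal biasing and a shrinking rewiring radius:

```python
import math
import random

# Toy RRT* in an obstacle-free unit square (illustrative sketch only).

def rrt_star(start, goal, iters=300, step=0.1, radius=0.2, seed=1):
    random.seed(seed)
    nodes = [start]
    parent = {0: None}
    cost = {0: 0.0}

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    for _ in range(iters):
        sample = (random.random(), random.random())
        near = min(range(len(nodes)), key=lambda i: dist(nodes[i], sample))
        d = dist(nodes[near], sample)
        if d == 0:
            continue
        t = min(1.0, step / d)  # steer at most `step` toward the sample
        new = (nodes[near][0] + t * (sample[0] - nodes[near][0]),
               nodes[near][1] + t * (sample[1] - nodes[near][1]))
        # Choose parent: lowest-cost node within the rewiring radius.
        best, best_cost = near, cost[near] + dist(nodes[near], new)
        for i in range(len(nodes)):
            c = cost[i] + dist(nodes[i], new)
            if dist(nodes[i], new) <= radius and c < best_cost:
                best, best_cost = i, c
        new_id = len(nodes)
        nodes.append(new)
        parent[new_id] = best
        cost[new_id] = best_cost
        # Rewire: adopt the new node as parent wherever it shortens a path.
        for i in range(new_id):
            c = cost[new_id] + dist(new, nodes[i])
            if dist(new, nodes[i]) <= radius and c < cost[i]:
                parent[i] = new_id
                cost[i] = c
    # Walk back from the tree node nearest the goal.
    node_id = min(range(len(nodes)), key=lambda i: dist(nodes[i], goal))
    path = []
    while node_id is not None:
        path.append(nodes[node_id])
        node_id = parent[node_id]
    return path[::-1]

path = rrt_star((0.0, 0.0), (1.0, 1.0))
print(len(path), path[0])
```

The parent-choice and rewiring steps are exactly what distinguish RRT* from plain RRT and give it its asymptotic optimality.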

Paper 15: Performance Analysis of In-Network Caching in Content-Centric Advanced Metering Infrastructure

Abstract: In-network caching is a key feature of content-centric networking. It is, however, a relatively costly mechanism, with hardware requirements in addition to the elaboration of placement and replication strategies. As content-centric networking has been proposed in the literature to manage smart grid (SG) communications, this research work investigates the cost effectiveness of in-network caching in that context. We consider in particular the Advanced Metering Infrastructure (AMI) service, which is prominent because its outputs are imperative inputs to most smart grid applications. In this work, the AMI communication topology and data traffic are characterized and a corresponding simulation environment is built. Various placement and replacement strategies are then compared in a simulation study in order to propose a suitable cache placement and replacement combination for AMI in the smart grid.

Author 1: Nour El Houda Ben Youssef
Author 2: Yosra Barouni
Author 3: Sofiane Khalfallah
Author 4: Jaleleddine Ben Hadj Slama
Author 5: Khaled Ben Driss

Keywords: caching; placement; replacement; content-centric networking; Named Data Networking; Advanced Metering Infrastructure; Smart Grid

PDF

Paper 16: Development of Dynamic Real-Time Navigation System

Abstract: This study aimed to develop a system that considers dynamic real-time situations to provide effective support for tourist activities. The conclusions of this study are summarized in the following three points: (1) The system was developed by integrating Web-GIS, social media, recommendation systems and AR terminals (smart glasses) into a single system, and was operated in the central part of Yokohama City in Kanagawa Prefecture, Japan. It enabled the accumulation, sharing and recommendation of information and navigation to guide users to their goals both under normal conditions and in the event of disasters. (2) The web-based system was aimed at members of the general public over 18 years old and operated for seven weeks. The total number of users was 86, and 170 items of information were contributed. A system using smart glasses operated for two days, with a total of 34 users. (3) The evaluation results clarified that it was possible to support user behavior both under normal conditions and in the event of disasters, and to conduct navigation efficiently and safely using smart glasses. Operation premised on disaster conditions showed that users accessing the system via mobile information terminals increased and actively used functions requiring location information.

Author 1: Shun FUJITA
Author 2: Kayoko YAMAMOTO

Keywords: Navigation System; Dynamic Real-Time; Web-Based Geographical Information Systems (GIS); Social Media; Recommendation System; Augmented Reality (AR); Smart Glasses

PDF

Paper 17: MIMC: Middleware for Identifying & Mitigating Congestion Level in Hybrid Mobile Adhoc Network

Abstract: Middleware systems that solve the congestion problem in mobile ad-hoc networks are rare in the existing literature. A research gap exists: existing congestion control mechanisms in MANETs do not use middleware designs, and existing middleware systems have never been investigated for their applicability to congestion control in mobile ad-hoc networks. Therefore, we introduce a novel middleware system called MIMC, or Middleware for Identifying and Mitigating Congestion in Hybrid Mobile Adhoc Networks. MIMC is also equipped with novel traffic modeling using a rule-based control matrix that not only provides a better picture of congestion but also assists in routing decisions, where existing techniques fail. This paper discusses the algorithms and presents results on multiple scenarios, showing that MIMC performs better congestion control than existing techniques.

Author 1: P. G. Sunitha Hiremath
Author 2: C.V. Guru Rao

Keywords: Middleware; Congestion Control; Traffic Management; Hybrid Mobile Adhoc network

PDF

Paper 18: Statistical Implicative Similarity Measures for User-based Collaborative Filtering Recommender System

Abstract: This paper proposes a new similarity measure for user-based collaborative filtering recommender systems. The similarity between two users is based on the implication intensity measure; it is called the statistical implicative similarity measure (SIS). This similarity measure is applied to build an experimental framework for a user-based collaborative filtering recommender model. Experiments on the MovieLens dataset show that the model using our similarity measure achieves fairly accurate results compared with user-based collaborative filtering models using traditional similarity measures such as Pearson correlation, cosine similarity, and Jaccard.

Author 1: Nghia Quoc Phan
Author 2: Phuong Hoai Dang
Author 3: Hiep Xuan Huynh

Keywords: Similarity measures; Implication intensity; User-based collaborative filtering recommender system; statistical implicative similarity measures

PDF
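The traditional baselines the paper compares against (Pearson correlation, cosine similarity and Jaccard between two users' ratings) can be sketched on a tiny hypothetical rating dictionary; the SIS measure itself is the paper's contribution and is not reproduced here:

```python
import math

# Baseline user-user similarities on co-rated items.
# The rating data below is hypothetical, not from MovieLens.

def co_rated(u, v):
    return [i for i in u if i in v]

def pearson(u, v):
    items = co_rated(u, v)
    if len(items) < 2:
        return 0.0
    mu = sum(u[i] for i in items) / len(items)
    mv = sum(v[i] for i in items) / len(items)
    num = sum((u[i] - mu) * (v[i] - mv) for i in items)
    den = math.sqrt(sum((u[i] - mu) ** 2 for i in items)
                    * sum((v[i] - mv) ** 2 for i in items))
    return num / den if den else 0.0

def cosine(u, v):
    items = co_rated(u, v)
    num = sum(u[i] * v[i] for i in items)
    den = math.sqrt(sum(u[i] ** 2 for i in items)
                    * sum(v[i] ** 2 for i in items))
    return num / den if den else 0.0

def jaccard(u, v):
    union = len(set(u) | set(v))
    return len(co_rated(u, v)) / union if union else 0.0

alice = {"m1": 5, "m2": 3, "m3": 4}
bob = {"m1": 4, "m2": 2, "m4": 5}
print(pearson(alice, bob), cosine(alice, bob), jaccard(alice, bob))
```

Note how the three measures disagree: Pearson and cosine use only co-rated items' values, while Jaccard ignores rating values entirely and compares the rated item sets.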

Paper 19: Applying Chatbots to the Internet of Things: Opportunities and Architectural Elements

Abstract: Internet of Things (IoT) is emerging as a significant technology in shaping the future by connecting physical devices or things with the web. It also presents various opportunities for the intersection of other technological trends which can allow it to become even more intelligent and efficient. In this paper, we focus our attention on the integration of Intelligent Conversational Software Agents or Chatbots with IoT. Prior literature has covered various applications, features, underlying technologies and known challenges of IoT. On the other hand, Chatbots are a relatively new concept, being widely adopted due to significant progress in the development of platforms and frameworks. The novelty of this paper lies in the specific integration of Chatbots in the IoT scenario. We analyzed the shortcomings of existing IoT systems and put forward ways to tackle them by incorporating chatbots. A general architecture is proposed for implementing such a system, as well as platforms and frameworks – both commercial and open source – which allow for the implementation of such systems. Identification of the newer challenges and possible future research directions with this new integration have also been addressed.

Author 1: Rohan Kar
Author 2: Rishin Haldar

Keywords: Internet of Things; Chatbots; Human-Computer Interaction; Conversational User Interfaces; Software Agents

PDF
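The chatbot-to-IoT bridge the paper envisions can be illustrated with a minimal keyword-matching intent parser dispatching commands to mock devices. The device names, states and intent rules are entirely hypothetical; a real system would use an NLU platform and actual device protocols:

```python
# Illustrative sketch: a tiny rule-based chatbot controlling mock IoT devices.
# Devices, states and intents are hypothetical.

DEVICES = {"living room light": "off", "thermostat": "20C"}

def handle(utterance):
    """Match a device name, then an intent, and act on the device state."""
    text = utterance.lower()
    for device in DEVICES:
        if device in text:
            if "turn on" in text:
                DEVICES[device] = "on"
                return f"{device} is now on"
            if "turn off" in text:
                DEVICES[device] = "off"
                return f"{device} is now off"
            return f"{device} is {DEVICES[device]}"  # status query
    return "Sorry, I don't know that device."

print(handle("Please turn on the living room light"))
print(handle("What is the thermostat?"))
```

Even this toy shows the architectural split the paper discusses: a conversational front end that parses intent, and a device layer whose state it reads and mutates.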

Paper 20: State of the Art Exploration Systems for Linked Data: A Review

Abstract: The ever-increasing amount of data available on the web is the result of the simplicity of sharing data over the current Web. To retrieve relevant information efficiently from this huge dataspace, sophisticated search technology is crucial, and it is further complicated by the various data formats in use. Semantic Web (SW) technology plays a prominent role in search engines by providing a way to understand the contextual meaning of data, so as to retrieve relevant, high-quality results. An Exploratory Search System (ESS) is a data seeking and search approach that helps searchers learn about and explore unclear topics and seeking goals through a set of actions. For high-quality retrievals in ESSs, Linked Open Data (LOD) is the optimal choice. In this paper, SW technology is reviewed and an overview of search strategies is provided, followed by a survey of state-of-the-art Linked Data Browsers (LDBs) and ESSs based on LOD. Finally, the LDBs and ESSs are compared with respect to several features such as algorithms, data presentation, and explanations.

Author 1: Karwan Jacksi
Author 2: Nazife Dimililer
Author 3: Subhi R. M. Zeebaree

Keywords: Exploratory Search System; Linked Data; Linked Data Browser; Semantic Web

PDF

Paper 21: QoS-based Computing Resources Partitioning between Virtual Machines in the Cloud Architecture

Abstract: Cloud services are used very widely, but the configuration of their parameters, including the efficient allocation of resources, is an important objective for the system architect. This article is devoted to solving the problem of choosing a computing architecture based on simulation and on a developed program for monitoring computing resources. Techniques were developed to provide the required quality of service and efficient use of resources. The article describes the program for monitoring computing resources and the time efficiency of the target application functions. On the basis of this application, a technique is shown and described in an experiment designed to ensure quality-of-service requirements by isolating one process from the others on different virtual machines inside the hypervisor.

Author 1: Evgeny Nikulchev
Author 2: Evgeniy Pluzhnik
Author 3: Oleg Lukyanchikov
Author 4: Dmitry Biryukov
Author 5: Elena Andrianova

Keywords: cloud computing architecture; simulation; software for monitoring computer resources

PDF

Paper 22: Multiobjective Optimization for the Forecasting Models on the Base of the Strictly Binary Trees

Abstract: The optimization problem of developing forecasting models on the basis of strictly binary trees is considered. The aim of the paper is a comparative analysis of two optimization variants applied to the development of forecasting models. The first variant uses one quality indicator of the forecasting model, the affinity indicator; the second uses two quality indicators, the affinity indicator and the tendencies discrepancy indicator. In both optimization variants, the search for the best forecasting models is carried out by means of a modified clonal selection algorithm. To obtain high variety in the population of forecasting models, crowding-distance values are considered in the realization of the second optimization variant. The results of experimental studies confirming the efficiency of the modified clonal selection algorithm with the second optimization variant are given.

Author 1: Nadezhda Astakhova
Author 2: Liliya Demidova
Author 3: Evgeny Nikulchev

Keywords: forecasting model; strictly binary tree; modified clonal selection algorithm; multiobjective optimization; affinity indicator; tendencies discrepancy indicator

PDF

Paper 23: Big Data Knowledge Mining

Abstract: The Big Data (BD) era has arrived: data accumulation in modern applications has grown beyond the ability of present software tools to capture, manage, and process it within a tolerably short time. Volume is not the only characteristic that defines big data; velocity, variety, and value matter as well. Many resources contain BD that should be processed. The biomedical research literature is one of many domains hiding rich knowledge. MEDLINE is a huge biomedical research database that remains a significantly underutilized source of biological information. Discovering useful knowledge from such a huge corpus raises many problems related to the type of information, such as the related concepts of the texts' domain and the semantic relationships among them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) knowledge base is proposed. The model uses a self-supervised approach to Relation Extraction (RE) by constructing enhanced training examples from UMLS information with hybrid text features. It incorporates the Apache Spark and HBase BD technologies together with multiple data mining and machine learning techniques in a Multi-Agent System (MAS). The system shows better results than the current state of the art and a naive approach in terms of accuracy, precision, recall, and F-score.

Author 1: Huda Umar Banuqitah
Author 2: Fathy Eassa
Author 3: Kamal Jambi
Author 4: Maysoon Abulkhair

Keywords: Knowledge Mining; Relation Extraction; Self-supervised; Big Data; Agent

PDF

Paper 24: Characterizations of Flexible Wearable Antenna based on Rubber Substrate

Abstract: Recent years have seen intense attention from both the scientific and academic communities in the field of flexible electronic systems. Most advanced flexible electronic systems require a flexible rubber-substrate antenna operating in specific bands to offer the wireless connectivity demanded by today's network-conscious society. This paper characterizes flexible antenna performance in environments created by using natural rubber as the substrate. A flexible antenna on a rubber substrate was simulated using CST Microwave Studio with diverse permittivity and loss tangent values. In our work, prototype antennas were built using natural rubber with different carbon filler substances. This paper reveals the effects of advanced flexible substrates on the antenna quality factor (Q) and its consequences for bandwidth and gain. Under bending and washing conditions, these antennas were also found to perform better than existing designs, showing less change in gain, frequency shift, and impedance mismatch.

Author 1: Saadat Hanif Dar
Author 2: Jameel Ahmed
Author 3: Muhammad Raees

Keywords: wearable antenna; antenna characterization; antennas

PDF

Paper 25: E-Commerce Adoption at Customer Level in Jordan: an Empirical Study of Philadelphia General Supplies

Abstract: E-commerce in developing countries has been studied by numerous researchers during the last decade, and a number of common and culturally specific challenges have been identified. This study considers Jordan as a case study of a developing country where E-commerce is still in its infancy. This research therefore complements previous work and offers an opportunity to refine E-commerce adoption research. It was conducted via a survey distributed randomly across branches of Philadelphia General Supplies (PGS), a small and medium enterprise (SME). The key findings indicate that Jordanian society is moving towards online shopping at very low adoption rates, due to barriers including weak infrastructure outside the capital, societal trends and culture, and low levels of education and computer literacy. E-commerce in Jordan thus remains an under-developed industry.

Author 1: Mohammed Al Masarweh
Author 2: Sultan Al-Masaeed
Author 3: Laila Al-Qaisi
Author 4: Ziad Hunaiti

Keywords: Information systems; E-commerce; E-commerce Adoption; E-commerce in Jordan; Jordan

PDF

Paper 26: Wavelet based Scalable Edge Detector

Abstract: Fixed-size kernels are used to extract the differential structure of images. Increasing the kernel size reduces localization accuracy and noise, at the cost of increased computational complexity. The computational cost of edge extraction is related to the image resolution or scale. In this paper, wavelet scale correlation for edge detection, together with scalability in the edge detector, is envisaged. The image is decomposed according to its resolution, structural parameters, and noise level by multilevel wavelet decomposition using Quadrature Mirror Filters (QMF). The property that image structural information is preserved at each decomposition level, whereas noise is partially reduced within subbands, is exploited. An innovative wavelet synthesis approach is conceived, based on scale correlation of the concordant detail bands, such that the reconstructed image constitutes an edge map. Although this technique fails to detect a few edge pixels at contours, the results are better than those of classical operators in noisy scenarios, and noise elimination in the edge maps is significant under the default threshold constraint.

Author 1: Imran Touqir
Author 2: Adil Masood Siddique
Author 3: Yasir Saleem

Keywords: Wavelet scales correlation; Edge detection; image denoising; Multiresolution analysis; entropy reduction

PDF

Paper 27: Variability Management in Business-IT Alignment: MDA based Approach

Abstract: The expansion of PAIS (Process-Aware Information Systems) has created the need for reuse in business processes. In fact, companies are left with directories containing several variants of the same business processes, which differ according to their application context. Consequently, the development of PAIS has become increasingly expensive. Research in the business process management domain therefore introduced the concept of the configurable process, with the aim of managing business process variability. However, with the emergence of the service-based development paradigm, the alignment of services with business processes is highly required in PAIS. Thus, this paper proposes an MDA-based method for generating configurable services from configurable processes.

Author 1: Hanae Sbai
Author 2: Mounia Fredj

Keywords: alignment; variability; MDA; PAIS; configurable service, configurable process

PDF

Paper 28: Performance Metrics for Decision Support in Big Data vs. Traditional RDBMS Tools & Technologies

Abstract: Research communities and data scientists in the IT industry have observed that Big Data has challenged legacy solutions. The term 'Big Data' is used for any collection of data or data sets so large and complex that it is difficult to process and manage using traditional data processing applications and existing Relational Database Management Systems (RDBMSs). The most important challenges in Big Data include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy. As data grows along various dimensions, with structured, semi-structured, and unstructured content at high velocity, high volume, and high variety, RDBMSs face a further set of challenges to be studied and analyzed. Owing to these limitations, data scientists and information managers are forced to rethink alternative solutions for handling data with the 3Vs. The study initially focused on developing an intelligent basis for decision makers so that suitable long-term solutions for handling data and information with the 3Vs can be designed. In this research, the feature-based capabilities of RDBMSs are analyzed, and then performance experimentation, observation, and analysis are carried out with Big Data tools and technologies. The features considered for observation and analysis were resource consumption, execution time, on-demand scalability, maximum data size, data structure, data visualization, ease of deployment, cost, and security. Finally, the research provides a decision support metric for selecting the appropriate tool or technology based on the nature of the data to be handled in the target organization.

Author 1: Alazar Baharu
Author 2: Durga Prasad Sharma

Keywords: Big Data; RDBMSs; big data tools; Variety; velocity; volume; Metrics

PDF

Paper 29: Solving Word Tile Puzzle using Bee Colony Algorithm

Abstract: In this paper, an attempt has been made to solve the word tile puzzle with the help of the Bee Colony Algorithm, in order to find the maximum number of words by moving a tile up, down, right, or left. The Bee Colony Algorithm is a heuristic algorithm that is more efficient than blind search algorithms in terms of running time and search cost. To examine the performance of the implemented algorithm, several experiments were performed with various parameter combinations. The algorithm was evaluated using statistical functions, such as average, maximum, and minimum, for one hundred and two hundred iterations. Results show that increasing the number of agents improves the average number of words found for both iteration counts. However, continually increasing the number of steps does not improve the results. Moreover, the results for both iteration counts showed that the overall performance of the algorithm was not much improved by increasing the number of iterations.

Author 1: Erum Naz
Author 2: Khaled Al-Dabbas
Author 3: Mahdi Abrishami
Author 4: Lars Mehnen
Author 5: Milan Cvetkovic

Keywords: slide tile puzzle; artificial bee colony algorithm; swarm intelligence; artificial intelligence; fitness function; loyalty function; word tile puzzle; Bee colony optimization

PDF

Paper 30: A Novel Approach to Automatic Road-Accident Detection using Machine Vision Techniques

Abstract: In this paper, a novel approach for automatic road accident detection is proposed. The approach is based on detecting damaged vehicles in footage from surveillance cameras installed on roads and highways, which indicates the occurrence of a road accident. Detection of damaged cars falls under the category of object detection in the field of machine vision and has not been achieved so far. This paper proposes a new supervised learning method comprising three different stages, combined serially into a single framework, that successfully detects damaged cars in static images. The three stages use five support vector machines trained with Histogram of Oriented Gradients (HOG) and Gray Level Co-occurrence Matrix (GLCM) features. Since damaged car detection has not been attempted before, two datasets of damaged cars, Damaged Cars Dataset-1 (DCD-1) and Damaged Cars Dataset-2 (DCD-2), were compiled for public release. Experiments were conducted on DCD-1 and DCD-2, which differ in capture distance and image quality. The accuracy of the system is 81.83% on DCD-1, captured at approximately 2 meters with good quality, and 64.37% on DCD-2, captured at approximately 20 meters with poor quality.

Author 1: Vaishnavi Ravindran
Author 2: Lavanya Viswanathan
Author 3: Shanta Rangaswamy

Keywords: Feature extraction; Image denoising; Machine vision; object detection; Supervised learning; Support vector machines

PDF

Paper 31: Real-Time Implementation of an Open-Circuit Dc-Bus Capacitor Fault Diagnosis Method for a Three-Level NPC Rectifier

Abstract: The main goal of this paper is to detect open-circuit faults of the electrolytic capacitors usually used in the dc-bus of a three-phase, three-level NPC active rectifier. This fault causes unavoidable overvoltage across the dc-bus, leading to the destruction of the converter's power semiconductors. Real-time detection of this fault is therefore vital to avoid severe damage as well as wasted repair time. The proposed diagnosis method is based on measuring the voltages across the two dc-bus capacitors. Their mean values are compared with half the dc-bus reference voltage; if the comparison result is under a predefined threshold value, a fault alarm signal is generated in real time by the monitoring system. The converter's control algorithm and the fault detection method are both implemented in real time on a DSP controller. The experimental results confirm the effectiveness of the proposed diagnosis technique: a fault signal is generated at a DSP peripheral within 60 ms of the fault occurrence.

Author 1: Fatma Ezzahra LAHOUAR
Author 2: Mahmoud HAMOUDA
Author 3: Jaleleddine BEN HADJ SLAMA

Keywords: fault detection; capacitor failure; open-circuit fault; real-time implementation; multilevel converters

PDF
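The comparison rule described in the abstract above, checking each capacitor's mean voltage against half the dc-bus reference, can be sketched as follows (function name, sampling scheme, and threshold value are illustrative assumptions, not the authors' DSP implementation):

```python
def capacitor_fault_alarm(vc_samples, v_dc_ref, threshold_ratio=0.1):
    """Flag an open-circuit dc-bus capacitor fault when the mean measured
    capacitor voltage deviates from half the dc-bus reference voltage by
    more than threshold_ratio (the 10% default is a placeholder)."""
    v_mean = sum(vc_samples) / len(vc_samples)
    half_ref = v_dc_ref / 2
    return abs(v_mean - half_ref) / half_ref > threshold_ratio
```

In the paper's setup such a check would run for each of the two capacitors inside the DSP control loop, raising the alarm signal when it returns true.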

Paper 32: Issue Tracking System based on Ontology and Semantic Similarity Computation

Abstract: A computer program is never truly finished; change is a constant feature of software development, and there is always something that needs to be added, redone, or fixed. Issue-tracking systems are therefore widely used in system development to keep track of reported issues. This paper proposes a new architecture for an automated issue-tracking system based on ontology and a semantic similarity measure. The proposed architecture integrates several natural language processing techniques, including the vector space model, domain ontology, term weighting, cosine similarity, and synonyms for semantic expansion. The proposed system searches for similar issue templates, which are characteristic of certain fields, and identifies similar issues automatically; possible experts and responses are then extracted. The experimental results demonstrate the accuracy of the new architecture, indicating that it reaches 94%.

Author 1: Habes Alkhraisat

Keywords: issue tracking; ontology; similarity computation; vector space model

PDF
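The vector space model with cosine similarity mentioned in the abstract above can be illustrated with a minimal term-frequency sketch (whitespace tokenization and raw counts are simplified stand-ins for the paper's term weighting and ontology-based synonym expansion):

```python
from collections import Counter
from math import sqrt

def cosine_similarity(text_a, text_b):
    """Cosine similarity between two raw term-frequency vectors built by
    whitespace tokenization (no TF-IDF weighting or semantic expansion)."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```

An issue tracker of this kind would rank stored issue reports by this score against a new report and surface the top matches along with their recorded resolvers.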

Paper 33: Constraints in the IoT: The World in 2020 and Beyond

Abstract: The Internet of Things (IoT), often referred to as the future Internet, is a collection of interconnected devices integrated into the worldwide network that covers almost everything and can be available anywhere. IoT is an emerging technology that aims to play an important role in saving money, conserving energy, bridging gaps, and enabling better monitoring for intensive, routine management. On the other hand, it also faces certain design constraints, such as technical challenges, social challenges, privacy compromises, and performance tradeoffs. This paper surveys major technical limitations hindering the successful deployment of the IoT, such as standardization, interoperability, networking issues, addressing and sensing issues, power and storage restrictions, and privacy and security. The paper categorizes existing research on these technical constraints published in recent years. With this categorization, we aim to provide an easy and concise view of the technical aspects of the IoT. Furthermore, we forecast the changes the IoT will bring about, providing an estimate of the world in the year 2020 and beyond.

Author 1: Asma Haroon
Author 2: Munam Ali Shah
Author 3: Yousra Asim
Author 4: Wajeeha Naeem
Author 5: Muhammad Kamran
Author 6: Qaisar Javaid

Keywords: Internet of Things; Future Internet; Next generation network issues; World-wide network; 2020

PDF

Paper 34: ETEEM- Extended Traffic Aware Energy Efficient MAC Scheme for WSNs

Abstract: The idle listening problem arises when a sensor node listens to the medium despite the absence of data, which wastes energy. ETEEM is a variant of the Traffic Aware Energy Efficient MAC protocol (TEEM) that focuses on energy optimization through reduced idle listening time and much lower overhead on energy sources. It uses a novel scheme for exploiting the idle listening time of sensor nodes: nodes are active only for a small amount of time and remain in sleep mode most of the time when no data is available. ETEEM reduces energy use at the byte level, replacing the longer SYNC packets of S-MAC and the SYNCrts packets of TEEM with a smaller packet called FLAG. It also uses a single acknowledgement packet per data set, saving energy by reducing the frequency of acknowledgment frames. The performance of ETEEM is 70% better compared to the other MAC protocols under consideration.

Author 1: Younas Khan
Author 2: Sheeraz Ahmed
Author 3: Fakhri Alam Khan
Author 4: Imran Ahmad
Author 5: Saqib Shahid Rahim
Author 6: M. Irfan Khattak

Keywords: Energy Consumption; Multi-hop; Network Allocator Vector; Throughput; Wireless Sensor Networks

PDF

Paper 35: Intelligent System for Detection of Abnormalities in Human Cancerous Cells and Tissues

Abstract: Due to the latest advances in the field of MML (Medical Machine Learning), a significant change has been witnessed, and traditional diagnostic procedures have been converted into DSS (Decision Support Systems). In particular, the classification problem of cancer discovery using DICOM (Digital Imaging and Communications in Medicine) is assumed to be one of the most important problems. For example, differentiation between the cancerous behaviours of chromatin deviations and nucleus-related changes in a finite set of nuclei may support the cytologist during the cancer diagnostic process. In order to assist doctors during cancer diagnosis, this paper proposes a novel algorithm, BCC (Bag_of_cancerous_cells), to select the most significant histopathological features from well-differentiated thyroid cancers. The methodology of the proposed system comprises three layers. In the first layer, data preparation is done using BMF (Bag of Malignant Features), where each nucleus is separated along with its related micro-architectural components and behaviours. In the second layer, a decision model is constructed using a CNN (Convolutional Neural Network) classifier trained on histopathological behaviours such as BCP (Bags of Chromatin Patches) and BNP (Bags of Nuclei Patches). In the final layer, performance evaluation is done. A total of 4520 nuclei observations were used to construct the decision models, of which BCP consists of 2650 instances and BNP comprises 1870. The best measured accuracy was 97.93% for BCP and 97.86% for BNP.

Author 1: Jamil Ahmed Chandio
Author 2: M. Abdul Rahman Soomrani

Keywords: Medical Image mining; Decision support system; Pre-process; DICOM; FNAB

PDF

Paper 36: Enhanced Re-Engineering Mechanism to Improve the Efficiency of Software Re-Engineering

Abstract: Generally, software re-engineering is an economical and effective way to provide a much-needed boost to an existing software system. Software re-engineering aims to obtain fully completed software from existing software, with additional features if needed. The overall process of software re-engineering is to analyze the needed requirements and contents, change them as required, or transform the existing software system to reconstruct a new one. The difficult part of re-engineering is understanding the legacy system. Most software re-engineering mechanisms aim to achieve the common re-engineering objectives: improved software quality, reduced complexity, reduced maintenance cost, and increased reliability. However, several traditional re-engineering mechanisms fail to verify the performance of individual functionality in the existing software, and this missing performance evaluation increases the complexity of the re-engineering process. To minimize the complexities in software re-engineering, this paper proposes a novel approach named the Enhanced Re-engineering mechanism. This mechanism introduces a new idea: before executing the rebuild process, the developer verifies the performance of each function in the existing system. The function's performance is then compared with the proposed algorithm, and only on the basis of this comparison is the rebuild process carried out. The proposed mechanism thus reduces the complexities of software re-engineering.

Author 1: A. Cathreen Graciamary
Author 2: Chidambaram

Keywords: Software Engineering; Software Re-engineering; Software Quality; Restructuring

PDF

Paper 37: Scalable Scientific Workflows Management System SWFMS

Abstract: In today’s electronic world, conducting scientific experiments, especially in the natural sciences, has become more and more challenging for domain scientists, since “science” today involves a two-dimensional intricacy: first, assorted and complex computational (analytical) applications; and second, increasingly large volumes and heterogeneity of the scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments, such as sensors and machines, makes scientific data management even more challenging, since the data generated by such instruments are highly complex. To reduce the complexity of conducting scientific experiments as much as possible, an integrated framework that transparently implements the conceptual separation between the two dimensions is direly needed. To facilitate scientific experiments, “workflow” technology has in recent years emerged in scientific disciplines like biology, bioinformatics, geology, environmental science, and eco-informatics, and much research has been done to develop scientific workflow systems. However, our analysis of the existing systems shows that they lack a well-structured conceptual modeling methodology that deals with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses these two-dimensional complexities.

Author 1: M. Abdul Rahman

Keywords: Scientific Workflows; Workflow Management System; Reference Architecture

PDF

Paper 38: Efficient Relay Selection Scheme based on Fuzzy Logic for Cooperative Communication

Abstract: The performance of a cooperative network can be increased by using relay selection, so interest in relay selection techniques is growing. We propose two new relay selection schemes based on fuzzy logic for dual-hop cooperative communication. These schemes require SNR (signal-to-noise ratio), cooperative gain, and channel gain as fuzzy input parameters for selecting the best relay. The performance of the first proposed scheme is evaluated in terms of BER (bit error rate) in Nakagami, Rician, and Rayleigh fading channels. In the second proposed scheme, a threshold is used with the objective of minimizing power consumption and channel estimation load. Its performance is analyzed in terms of BER, the number of active relays, and the channel estimation load.

Author 1: Shakeel Ahmad Waqas
Author 2: Imran Touqir
Author 3: Nasir Khan
Author 4: Imran Rashid

Keywords: Cooperative Networks; Relay selection schemes; Amplify and forward; Fuzzy logic; Nakagami Fading channel; Rician Fading Channel; Rayleigh Fading Channel

PDF
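A toy version of fuzzy relay selection over the three inputs named in the abstract above might look as follows (the normalization, min-aggregation, and field names are illustrative assumptions; the paper's actual fuzzy rule base and membership functions are not reproduced here):

```python
def select_relay(relays):
    """relays: list of dicts with 'id' plus 'snr', 'coop_gain' and
    'ch_gain' memberships already normalized to [0, 1]. The degree to
    which a relay is 'good' is taken as the min of its three memberships
    (a Mamdani-style fuzzy AND); the relay with the highest degree wins."""
    def degree(r):
        return min(r["snr"], r["coop_gain"], r["ch_gain"])
    return max(relays, key=degree)
```

The min-aggregation penalizes a relay that is strong on two inputs but weak on the third, which is the intuition behind combining SNR, cooperative gain, and channel gain rather than ranking by SNR alone.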

Paper 39: Wavelet-based Image Modelling for Compression Using Hidden Markov Model

Abstract: Statistical signal modeling using hidden Markov models is one of the techniques used for image compression. Wavelet-based statistical signal models are impractical for most real-time processing because they usually represent the wavelet coefficients as either jointly Gaussian or independent of each other. In this paper, we develop an algorithm that succinctly characterizes the interdependencies of wavelet coefficients and their non-Gaussian behavior, especially for image compression. This is done by combining the strengths of the hidden Markov model and the wavelet transform, which gives comparatively better results. To estimate the parameters of the wavelet-based hidden Markov model, an efficient expectation-maximization algorithm is developed.

Author 1: Muhammad Usman Riaz
Author 2: Imran Touqir
Author 3: Maham Haider

Keywords: Hidden Markov model; Wavelet transformation; Compression; Expectation Maximization

PDF

Paper 40: Image De-Noising and Compression Using Statistical based Thresholding in 2-D Discrete Wavelet Transform

Abstract: Images are very good information carriers, but they depart from their original condition during transmission and are corrupted by different kinds of noise. The purpose of de-noising is to remove the noisy coefficients such that a minimum amount of information is lost and a maximum amount of noise is suppressed. We considered the Generalized Gaussian distribution for modeling the noise. In the proposed technique, statistical thresholding methods are used to estimate the threshold value, while a bi-orthogonal wavelet is used for image decomposition and reconstruction. A qualitative and quantitative analysis of thresholding methods on different images shows that the statistical thresholding methods yield significantly better objective and subjective quality than other de-noising methods.

Author 1: Qazi Mazhar
Author 2: Adil Masood Siddique
Author 3: Imran Touqir
Author 4: Adnan Ahmad Khan

Keywords: Wavelet Thresholding; Statistical Thresholding; Image De-noising; Image Compression; Wavelet Sub-band Thresholding

PDF
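One standard statistical thresholding scheme consistent with the abstract's description is the universal (VisuShrink) threshold with soft shrinkage. A minimal sketch follows, assuming the noise level is estimated from wavelet detail coefficients via the median absolute deviation; the paper's exact estimator and wavelet choice may differ:

```python
from math import log, sqrt

def mad_sigma(detail):
    """Robust noise estimate from wavelet detail coefficients:
    median(|c|) / 0.6745, the usual MAD-based sigma estimator."""
    s = sorted(abs(c) for c in detail)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return med / 0.6745

def universal_threshold(detail):
    """Universal (VisuShrink) threshold t = sigma * sqrt(2 ln N)."""
    return mad_sigma(detail) * sqrt(2 * log(len(detail)))

def soft_threshold(coeffs, t):
    """Soft shrinkage: zero coefficients with magnitude below t,
    shrink the remaining ones toward zero by t."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0
            for c in coeffs]
```

De-noising then amounts to decomposing the image, applying `soft_threshold` to each detail subband with its estimated threshold, and reconstructing; compression follows from the many coefficients driven exactly to zero.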

Paper 41: Denoising in Wavelet Domain Using Probabilistic Graphical Models

Abstract: De-noising of real-world images degraded by Gaussian noise is a long-established problem in statistical signal processing. The existing models in the time-frequency domain typically model the wavelet coefficients as either independent or jointly Gaussian. However, in the compression arena, techniques like de-noising and detection require the models to be non-Gaussian in nature. Probabilistic graphical models designed in the time-frequency domain serve this purpose, achieving de-noising and compression with improved performance. In this work, a Hidden Markov Model (HMM) designed with the 2D Discrete Wavelet Transform (DWT) is proposed. A comparative analysis of the proposed method with different existing techniques, wavelet- and curvelet-based methods in the Bayesian network domain and an empirical Bayesian approach using the Hidden Markov Tree model, is presented for de-noising. Results are compared in terms of PSNR and visual quality.

Author 1: Maham Haider
Author 2: Muhammad Usman Riaz
Author 3: Imran Touqir
Author 4: Adil Masood Siddiqui

Keywords: Gaussian Mixture Models (GMM); Hidden Markov Model (HMM); Discrete Wavelet Transform (DWT); Hidden Markov Tree (HMT)

PDF

Paper 42: Connected Dominating Set based Optimized Routing Protocol for Wireless Sensor Networks

Abstract: Wireless Sensor Networks (WSNs) face energy starvation in their operations. This constraint demands that the topology of communicating nodes be limited. One way of restraining the communicating nodes is to create a Connected Dominating Set (CDS) out of them. In this paper, an Optimized Region Based Efficient Data (AORED) routing protocol for WSNs is proposed. A CDS is employed in AORED to create a virtual backbone of communicating nodes in the network. An empirical study involving extensive simulations shows that the proposed routing protocol outperforms the legacy DEEC and SEP protocols: AORED achieves more transmission rounds, more cluster heads, and fewer packets sent to the base station.

Author 1: Hamza Faheem
Author 2: Naveed Ilyas
Author 3: Siraj ul Muneer
Author 4: Sadaf Tanvir

Keywords: Connected Dominating Set; Wireless Sensor Networks; Energy Efficiency

PDF

Paper 43: Adaptive Error Detection Method for P300-based Spelling Using Riemannian Geometry

Abstract: Brain-Computer Interface (BCI) systems have become one of the valuable research areas of ML (Machine Learning), and AI-based techniques have brought significant change to traditional medical diagnostic systems. In particular, the Electroencephalogram (EEG) measures the electrical activity of the brain arising from ionic currents in neurons. A BCI system uses these EEG signals to assist humans in different ways. The P300 signal is one of the most important and most widely studied EEG phenomena in the BCI domain. For instance, the P300 signal can be used in a BCI to translate a subject’s intention from mere thoughts, via brain waves, into actual commands, which can eventually be used to control electromechanical devices and artificial body parts. The low Signal-to-Noise Ratio (SNR) of the P300 is a major challenge, because concurrent heterogeneous brain activities and artifacts make the underlying intention hard to recover. To address this challenge, this research proposes a system called Adaptive Error Detection for P300-Based Spelling using Riemannian Geometry, comprising three main steps. In the first step, the raw signal is cleaned by preprocessing. In the second step, the most relevant features are extracted using xDAWN spatial filtering together with covariance matrices to handle the high-dimensional data. In the final step, an elastic-net classification algorithm is applied after mapping from the Riemannian manifold to Euclidean space via the tangent space mapping. The results obtained by the proposed method are comparable to state-of-the-art methods while decreasing computation time drastically (about six times), and the method performs better under inter-session and inter-subject variability.

Author 1: Attaullah Sahito
Author 2: M. Abdul Rahman
Author 3: Jamil Ahmed

Keywords: Brain Computer Interface; EEG; P300; Riemannian geometry; xDAWN; Covariances; Tangent Space; Elastic net

PDF

Paper 44: Evaluation of OLSR Protocol Implementations using Analytical Hierarchical Process (AHP)

Abstract: Ad hoc networks are part of the IEEE 802.11 Wireless LAN standard, also called the Independent Basic Service Set (IBSS), and work as peer-to-peer networks by default. They operate without requiring an infrastructure (such as an access point) and demand specific routing support to work as multi-hop networks. Ad hoc routing protocols are categorized as proactive, reactive, and hybrid. OLSR, a proactive routing protocol, is one of the most widely used routing protocols in ad hoc networks. In this paper, an empirical study and analysis of the various OLSR implementations (by different research groups and individuals) has been conducted using Relative Opinion Scores (ROS) and the Analytical Hierarchical Process (AHP) Online System software. Based on a quantitative comparison of the results, it is concluded that the OLSRd project is the most up-to-date and best among the six OLSR implementations compared.

Author 1: Ashfaq Ahmad Malik
Author 2: Athar Mahboob
Author 3: Tariq Mairaj Rasool Khan

Keywords: OLSR; MANET; AHP; Routing Protocols

PDF

Paper 45: Fast Approximation for Toeplitz, Tridiagonal, Symmetric and Positive Definite Linear Systems that Grow Over Time

Abstract: Linear systems with tridiagonal structure are very common in problems related not only to engineering but also to chemistry, biomedicine, and finance, for example, real-time cubic B-spline interpolation of N-D images, real-time processing of electrocardiography (ECG), and handwriting recognition. In those problems in which the matrix is positive definite, it is possible to compute the solution in O(n) time. This paper considers such systems whose size grows over time and proposes an O(1)-time approximation based on a series of previous approximations. In addition, the development of the method is described, and the proposed solution is proved to converge linearly to the optimum. A real-time cubic B-spline interpolation of an ECG is computed with this proposal; for this application, the proposed method shows a global relative error near 10^-6 and is faster than traditional methods, as shown in the experiments.

Author 1: Pedro Mayorga
Author 2: Alfonso Estudillo
Author 3: A. Medina-Santiago
Author 4: José Vázquez
Author 5: Fernando Ramos

Keywords: real time interpolation; linear convergence; Cholesky decomposition; biomedical data acquisition

PDF

Paper 46: A Multi-Agent Framework for Data Extraction, Transformation and Loading in Data Warehouse

Abstract: The rapid growth in the size of data sets poses a challenge to extracting and analyzing information in a timely manner for better prediction and decision making. The data warehouse is the solution for strategic decision making: it serves as a repository to store historical and current data. The Extraction, Transformation and Loading (ETL) process gathers data from different sources and integrates it into the data warehouse. This paper proposes a multi-agent framework that enhances the efficiency of the ETL process. Agents perform the specific tasks assigned to them, so the identification of errors at different stages of the ETL process becomes easy; this was difficult and time-consuming in the traditional ETL process. The multi-agent framework identifies data sources and extracts, integrates, transforms, and loads data into the data warehouse. A monitoring agent remains active during this process and generates alerts if an issue arises at any stage.
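The stage-per-agent idea with a central monitoring agent can be sketched as follows. The agent names, row schema, and alert format are assumptions for illustration, not the paper's actual design.

```python
class MonitorAgent:
    """Collects (stage, message) alerts so errors are localized per stage."""
    def __init__(self):
        self.alerts = []
    def report(self, stage, message):
        self.alerts.append((stage, message))

def extract_agent(sources, monitor):
    """Gather raw rows from every source; alert on empty sources."""
    rows = []
    for src in sources:
        if not src:
            monitor.report("extract", "empty source skipped")
            continue
        rows.extend(src)
    return rows

def transform_agent(rows, monitor):
    """Clean and normalize rows; alert on malformed ones instead of failing."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "name": r["name"].strip().title()})
        except (KeyError, ValueError) as e:
            monitor.report("transform", f"bad row {r!r}: {e}")
    return out

def load_agent(rows, warehouse, monitor):
    """Load rows into the warehouse; alert on duplicate keys."""
    for r in rows:
        if r["id"] in warehouse:
            monitor.report("load", f"duplicate id {r['id']}")
        warehouse[r["id"]] = r
```

Because every alert is tagged with its stage, finding where a record went wrong is a lookup rather than a re-run of the whole pipeline.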

Author 1: Ramzan Talib
Author 2: Muhammad Kashif Hanif
Author 3: Fakeeha Fatima
Author 4: Shaeela Ayesha

Keywords: Data Warehouse; Extraction; Loading; Multi-Agent; Operational Data; Transformation

PDF

Paper 47: Polynomial based Channel Estimation Technique with Sliding Window for M-QAM Systems

Abstract: Pilot Symbol Assisted Modulation (PSAM) channel estimation techniques over Rayleigh fading channels have been analysed in recent years. Fluctuations in the Rayleigh fading channel gain degrade the performance of any modulation scheme. This paper develops and analyses a PSAM polynomial interpolation technique based on Least Squares (LS) approximations to estimate the Channel State Information (CSI) for M-ary Quadrature Amplitude Modulation (M-QAM) over flat Rayleigh fading channels. A sliding-window approach with pilot symbol adjustment is employed in order to minimize the computational time complexity of the estimation technique. The channel estimation performance, together with its computational delay and time complexity, is verified for different Doppler frequencies (fd), frame lengths (L), and polynomial orders (P-orders). Simulation results show that cubic polynomial interpolation gives better Symbol Error Rate (SER) performance than quadratic polynomial interpolation and higher P-orders, and that the performance of the polynomial estimation techniques degrades as the P-order increases.
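The LS polynomial fit at the heart of such an estimator can be sketched as below: fit a degree-P polynomial to the pilot observations in the current window via the normal equations, then evaluate it at data-symbol positions. This is a generic sketch with real-valued gains; an actual M-QAM receiver would fit the complex channel gain, and the pilot positions here are illustrative.

```python
def ls_polyfit(ts, ys, order):
    """Least-squares polynomial fit via the normal equations (A^T A) c = A^T y."""
    n = order + 1
    ata = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    aty = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        s = aty[i] - sum(ata[i][j] * coef[j] for j in range(i + 1, n))
        coef[i] = s / ata[i][i]
    return coef  # coef[i] multiplies t**i

def interpolate(coef, t):
    """Estimated channel gain at (possibly non-pilot) time t."""
    return sum(c * t ** i for i, c in enumerate(coef))
```

In a sliding-window scheme the fit is recomputed as the window advances one pilot at a time, which keeps each update's cost bounded by the window length rather than the frame length.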

Author 1: O. O. Ogundile
Author 2: M. O. Oloyede
Author 3: F. A. Aina
Author 4: S. S. Oyewobi

Keywords: Channel estimation; Doppler frequency; frame length; interpolation; polynomial order

PDF

Paper 48: Synergies of Advanced Technologies and Role of VANET in Logistics and Transportation

Abstract: In Intelligent Transport Systems (ITS), the Vehicular Ad-hoc Network (VANET) is one of the key wireless technologies, helping to manage road safety, traffic efficiency, fleet management, logistics, and transportation. The objective of this paper is to give an overview of the role of different technologies and the placement of VANET in transportation, and specifically in logistics. We provide researchers with an overview of the technologies considered in logistics scenarios and of current projects regarding VANET for safety and non-safety applications. We additionally discuss current and potential domains in logistics in which new applications can improve efficiency through the use of new and existing technologies.

Author 1: Kishwer Abdul Khaliq
Author 2: Amir Qayyum
Author 3: Jurgen Pannek

Keywords: VANET; IEEE802.11p; Logistics; Vehicular Ad-hoc Network; Transportation; Technology role

PDF

Paper 49: WQbZS: Wavelet Quantization by Z-Scores for JPEG2000

Abstract: In this document we present a methodology to quantize wavelet coefficients for any wavelet-based entropy coder, and we apply it to the particular case of JPEG2000. Any compression system has three main steps: transformation in terms of frequency, quantization, and entropy coding. The only step responsible for reducing or maintaining precision is the second, quantization, since it is the lossy element that reduces the precision of dequantized pixels in order to make quantized pixels more compressible. We modify the well-known dead-zone scalar quantization by introducing Z-scores in the process. Z-scores are expressed in terms of standard deviations from the mean; as a result, they follow a distribution with a mean of 0 and a standard deviation of 1. In this way we increase the redundancy in the image, which produces a lower compression ratio.
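The two ingredients named in the abstract, z-score normalization followed by dead-zone scalar quantization, can be sketched as follows. The step size is illustrative, and a real JPEG2000 codec would apply this per subband; this is not the paper's exact formulation.

```python
import statistics

def zscore(coeffs):
    """Normalize coefficients to zero mean and unit standard deviation."""
    mu = statistics.fmean(coeffs)
    sigma = statistics.pstdev(coeffs)
    return [(c - mu) / sigma for c in coeffs]

def deadzone_quantize(coeffs, step):
    """Dead-zone scalar quantizer: sign(c) * floor(|c| / step).
    int() truncates toward zero, so (-step, step) maps to 0 (the dead zone)."""
    return [int(c / step) for c in coeffs]
```

Because truncation toward zero widens the zero bin, many small normalized coefficients collapse to 0, which is exactly the redundancy the entropy coder exploits.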

Author 1: Jesus Jaime Moreno-Escobar
Author 2: Oswaldo Morales-Matamoros
Author 3: Ricardo Tejeida-Padilla
Author 4: Ana Lilia Coria-Paes
Author 5: Teresa Ivonne Contreras-Troya

Keywords: Z-Scores; Statistical Normalization; Wavelet Transformation; Scalar Quantization; Deadzone Quantization; JPEG2000

PDF

Paper 50: Determination of Child Vulnerability Level from a Decision-Making System based on a Probabilistic Model

Abstract: The purpose of this paper is to provide a decision-support tool based on a mathematical model and an algorithm that can help in assessing the level of vulnerability of children in Côte d'Ivoire. The study was conducted in three phases: the first covers the establishment of a data warehouse; the second involves the application of a probabilistic model; and the final phase deals with the classification of children considered vulnerable, in descending order from the most to the least vulnerable. The purpose of this classification is to better manage the resources of donors to support vulnerable children. This work is part of the activities of the UMRI on the resilience of Côte d'Ivoire, whose aim is to propose mathematical and computational tools to facilitate the work of the Centre for Social Resilience. The context of children made vulnerable by crises or diseases is an example of a practical application of our social resilience model.
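The final classification phase reduces to ranking by the model's probability output. A toy sketch, with child identifiers and probabilities invented for illustration:

```python
def rank_by_vulnerability(scores):
    """scores: dict mapping child id -> probability of vulnerability.
    Returns ids from most to least vulnerable (descending probability)."""
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical model outputs for three children.
ranking = rank_by_vulnerability({"C-01": 0.2, "C-02": 0.9, "C-03": 0.5})
```

Donor resources can then be allocated by walking this list from the top until the budget is exhausted.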

Author 1: SAHA Kouassi Bernard
Author 2: BROU Konan Marcelin
Author 3: Gooré Bi Tra
Author 4: Souleymane OUMTANAGA

Keywords: Crisis; Children; XML data warehouse; data mining; scheduling; Resilience; snowflake pattern; vulnerability level; probabilistic model

PDF

Paper 51: Software-Defined Networks (SDNs) and Internet of Things (IoTs): A Qualitative Prediction for 2020

Abstract: The Internet of Things (IoT) is an imminent technology attracting industry and research attention at a fast pace. Currently, more than 15 billion devices are connected to the Internet, and this number is expected to reach up to 50 billion by 2020. The data generated by these IoT devices are immensely high, creating resource allocation, flow management, and security challenges in the IoT network. Programmability and centralised control are considered an alternative solution to address IoT issues. On the other hand, a Software-Defined Network (SDN) provides centralised and programmable control and management for the underlying network without changing the existing network architecture. This paper surveys the state of the art on the integration of the IoT with the SDN. A comprehensive review and generalised solutions over the period 2010-2016 are presented for the different communication domains. Furthermore, a critical review of the IoT and SDN technologies, current trends in research, and the futuristic contributing factors form part of the paper. The comparative analysis of the existing solutions for SDN-based IoT implementation provides an easy and concise view of the emerging trends. Lastly, the paper predicts the future and presents a qualitative view of the world in 2020.

Author 1: Sahrish Khan Tayyaba
Author 2: Munam Ali Shah
Author 3: Naila Sher Afzal Khan
Author 4: Yousra Asim
Author 5: Wajeeha Naeem
Author 6: Muhammad Kamran

Keywords: SDN; IoT; Integration of SDN-IoT; WSN; LTE; M2M communication; NFV

PDF

Paper 52: Modified Random Forest Approach for Resource Allocation in 5G Network

Abstract: According to the annual Visual Networking Index (VNI) report, 4G will reach maturity by the year 2020 and an incremental approach will not meet demand. The only way forward is to switch to the newer generation of mobile technology, 5G. Resource allocation is a critical problem that strongly impacts 5G network operation. Timely and accurate assessment of bandwidth underutilized by the primary user is necessary in order to exploit it efficiently and increase network efficiency. This paper presents a decision-making system at the fusion center using a modified Random Forest. The modified Random Forest is first trained on a database accumulated by measuring different network parameters and can then take decisions on the allocation of resources. The Random Forest is retrained after a fixed time interval to account for the dynamic nature of the network. We also test its performance in comparison with the existing AND/OR decision logic at the fusion center.
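The AND/OR baselines that the learned model is compared against are simple hard-decision fusion rules over the sensors' busy/idle reports; a sketch follows (a majority-vote variant is added as a common middle ground, not something the abstract mentions).

```python
def and_fusion(reports):
    """Declare the channel busy only if every sensor reports busy."""
    return all(reports)

def or_fusion(reports):
    """Declare the channel busy if any sensor reports busy."""
    return any(reports)

def majority_fusion(reports):
    """Declare busy if a strict majority of sensors report busy."""
    return sum(reports) * 2 > len(reports)
```

A trained classifier at the fusion center generalizes these rules: instead of a fixed logic over binary reports, it learns a decision boundary over the measured network parameters and can be retrained as conditions change.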

Author 1: Parnika De
Author 2: Shailendra Singh

Keywords: 5G; Cognitive Radio; Clustering; Fusion Centre; Random Forest

PDF

Paper 53: Text Mining: Techniques, Applications and Issues

Abstract: Rapid progress in digital data acquisition techniques has led to huge volumes of data. More than 80 percent of today’s data is composed of unstructured or semi-structured data. The discovery of appropriate patterns and trends to analyze text documents in massive volumes of data is a big issue. Text mining is the process of extracting interesting and non-trivial patterns from large collections of text documents. Different techniques and tools exist to mine text and discover valuable information for future prediction and decision-making processes. Selecting the right and appropriate text mining technique enhances speed and decreases the time and effort required to extract valuable information. This paper briefly discusses and analyzes text mining techniques and their applications in diverse fields of life. Moreover, the issues in the field of text mining that affect the accuracy and relevance of results are identified.

Author 1: Ramzan Talib
Author 2: Muhammad Kashif Hanif
Author 3: Shaeela Ayesha
Author 4: Fakeeha Fatima

Keywords: Classification; Knowledge Discovery; Applications; Information Extraction; Patterns

PDF

Paper 54: A Generic Model for Assessing Multilevel Security-Critical Object-Oriented Programs

Abstract: The most promising approach for developing secure systems is one which allows software developers to assess and compare the relative security of their programs based on their designs. Software metrics provide an easy approach for evaluating the security of certain object-oriented designs; they can also measure the impact on security caused by modifications to existing programs. However, most studies in this area focus on a binary classification of data as either classified or unclassified. In fact, there are other models with other classifications of data, for instance, the common model used by defense departments that classifies data into four security levels. However, these various classifications have received little attention in terms of measuring their effect. This paper introduces a model for measuring the information flow of security-critical data within an object-oriented program with a multilevel classification of its security-critical data. It defines a set of object-oriented security metrics capable of assessing the security of a given program’s design from the point of view of potential information flow. These metrics can be used to compare the security of programs or to assess the effect of program modifications on security. Specifically, this paper proposes a generic model that consists of several security metrics to measure the relative security of object-oriented designs with respect to the design quality properties of accessibility, cohesion, coupling, and design size.
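To make the idea of a multilevel accessibility metric concrete, here is a small sketch: the fraction of a class's security-critical attributes that are publicly exposed, weighted by security level. The four-level scale mirrors the defense-style model the abstract mentions, but the weights and the metric definition itself are assumptions for illustration, not the paper's metrics.

```python
# Illustrative four-level scale; weight 0 means exposure carries no risk.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top-secret": 3}

def weighted_exposure(attributes):
    """attributes: list of (security_level, is_public) pairs for one class.
    Returns the level-weighted fraction of critical attributes exposed,
    in [0, 1]; lower is better."""
    total = sum(LEVELS[lvl] for lvl, _ in attributes)
    if total == 0:
        return 0.0
    exposed = sum(LEVELS[lvl] for lvl, public in attributes if public)
    return exposed / total
```

Comparing this value before and after a program modification gives a design-level signal of whether the change widened potential information flow.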

Author 1: Bandar M. Alshammari

Keywords: Multilevel Security Models; Object-Orientation; Security Metrics; Security Matrix; Unified Modeling Language

PDF

Paper 55: Towards Analytical Modeling for Persuasive Design Choices in Mobile Apps

Abstract: Persuasive technology has emerged as a new field of research in the past decade, with applications in various domains including web design, human-computer interaction, healthcare systems, and social networks. Although persuasive technology has its roots in psychology and cognitive science, researchers from the computing disciplines are also increasingly interested in it. Unfortunately, the existing theories, models, and frameworks for persuasive system design fall short due to the absence of the systematic design processes commonly used in the computing domains, as well as a lack of support for appropriate post-analysis. This work provides some insight into these limitations and identifies the importance of analytical modeling for persuasion in mobile application design. The authors illustrate, using a case study, that appropriate mathematical models can be applied together with user modeling to develop a persuasive system that allows the designer to consider several design choices simultaneously.

Author 1: Hamid Mukhtar

Keywords: goal; intent; analytics; modeling; feedback

PDF

Paper 56: Computer Science Approach to Philosophy: Schematizing Whitehead’s Processes

Abstract: Diagrams are used in many areas of study to depict knowledge and to assist in the understanding of problems. This paper aims to utilize schematic representation to facilitate understanding of certain philosophical works; specifically, it is an attempt, albeit tentative, to schematize A. N. Whitehead’s ontological approach. It targets professionals and students in fields outside of philosophy, such as computer science and engineering, who often look to sources in philosophy for design ideas or for a critical framework for practice, yet struggle to navigate thinkers’ writings. The paper employs schematization as an apparatus of specification for clarifying philosophical language by describing philosophical ideas in a form familiar to computer science. The resultant high-level representation seems to be a viable tool for enhancing the relationship between philosophy and computer science, especially in computer science education.

Author 1: Sabah Al-Fedaghi

Keywords: A. N. Whitehead; schematization; metaphysical ontology; diagrammatic representation; flow

PDF

Paper 57: Mood Extraction Using Facial Features to Improve Learning Curves of Students in E-Learning Systems

Abstract: Students’ interest and involvement during class lectures is imperative for grasping concepts and significantly improves the academic performance of students. Direct supervision of lectures by instructors is the main reason behind student attentiveness in class. Still, a sufficient percentage of students tend to lose concentration even under direct supervision. In an e-learning environment, this problem is aggravated by the absence of any human supervision. This calls for an approach to assess and identify lapses of attention by a student in an e-learning session. This study was carried out to improve students’ involvement in e-learning platforms by using their facial features to extract mood patterns. Analyzing the moods based on the emotional states of a student during an online lecture can provide interesting results which can readily be used to improve the efficacy of content delivery in an e-learning platform. A survey was carried out among instructors involved in e-learning to identify the facial features that most probably represent the facial expressions or mood patterns of a student. A neural network approach is used to train the system on facial feature sets to predict specific facial expressions. Moreover, a data-association-based algorithm for extracting information on emotional states by correlating multiple sets of facial features is also proposed.
This framework showed promising results in inciting students’ interest by varying the content being delivered. Different combinations of inter-related facial expressions over specific time frames were used to estimate mood patterns and, subsequently, the level of involvement of a student in an e-learning environment. The results achieved during the course of the research showed that the mood patterns of a student correlate well with his or her interest or involvement during online lectures and can be used to vary the content to improve students’ involvement in the e-learning system. More facial expressions and mood categories can be included to diversify the application of the proposed method.

Author 1: Abdulkareem Al-Alwani

Keywords: Mood extraction; Facial features; Facial recognition; Online education; E-Learning; Attention state; Learning styles

PDF

Paper 58: Impact of Domain Modeling Techniques on the Quality of Domain Model: An Experiment

Abstract: The Unified Modeling Language (UML) is widely used to analyze and design different software development artifacts in object-oriented development. The domain model is a significant artifact that models the problem domain and visually represents real-world objects and the relationships among them. It facilitates the comprehension process by identifying the vocabulary and key concepts of the business world. The category list technique identifies concepts and associations with the help of predefined categories that are important to business information systems, whereas the noun phrasing technique performs a grammatical analysis of the use case description to recognize concepts and associations. Both of these techniques are used for the construction of the domain model; however, no empirical evidence exists that evaluates the quality of the resultant domain model constructed via these two basic techniques. A controlled experiment was performed to investigate the impact of the category list and noun phrasing techniques on the quality of the domain model. The constructed domain model is evaluated for completeness, correctness, and the effort required for its design. The obtained results show that the category list technique is better than the noun phrasing technique for the identification of concepts, as it avoids generating unnecessary elements, i.e. extra concepts, associations, and attributes, in the domain model. The noun phrasing technique produces a comprehensive domain model and requires less effort compared to the category list technique. There is no statistically significant difference between the two techniques in terms of correctness.

Author 1: Hiqmat Nisa
Author 2: Salma Imtiaz
Author 3: Muhammad Uzair Khan
Author 4: Saima Imtiaz

Keywords: Domain Model; UML; Experiment; Noun Phrasing Technique; Category List Technique

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org