The Science and Information (SAI) Organization

IJACSA Volume 8 Issue 4

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: Teaching Software Testing using Data Structures

Abstract: Software testing is typically a rushed and neglected activity that is done at the final stages of software development. In particular, most students tend to test their programs manually and very seldom perform adequate testing. In this paper, two basic data structures are utilized to highlight the importance of writing effective test cases by testing their fundamental properties. The paper also includes performance testing, at the unit level, of a classic recursive problem called the Towers of Hanoi. This teaching approach accomplishes two important pedagogical objectives: (1) it allows students to think about how to find hidden bugs and defects in their programs, and (2) it encourages them to test more effectively by leveraging data structures that are already familiar to them.
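
As an illustration of the kind of property-based unit test the abstract alludes to, here is a minimal Python sketch; the stack, the `hanoi_moves` helper and the timing threshold are hypothetical stand-ins, not the authors' course material:

```python
import time
import unittest

def hanoi_moves(n):
    """Count moves for Towers of Hanoi recursively: T(n) = 2*T(n-1) + 1."""
    return 0 if n == 0 else 2 * hanoi_moves(n - 1) + 1

class TestStackAndHanoi(unittest.TestCase):
    def test_stack_lifo_property(self):
        # Fundamental ADT property: elements come back in reverse push order.
        stack = []
        for item in [1, 2, 3]:
            stack.append(item)
        self.assertEqual([stack.pop() for _ in range(3)], [3, 2, 1])

    def test_hanoi_unit_and_performance(self):
        # Correctness: the closed form is 2^n - 1 moves.
        self.assertEqual(hanoi_moves(10), 2**10 - 1)
        # Simple performance check at the unit level.
        start = time.perf_counter()
        hanoi_moves(20)
        self.assertLess(time.perf_counter() - start, 1.0)

if __name__ == "__main__":
    unittest.main()
```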

Author 1: Ingrid A. Buckley
Author 2: Winston S. Buckley

Keywords: Software Testing; Data Structures; Abstract Data Type (ADT); Unit Testing; Performance Testing; Stacks; Binary Search Tree; Towers of Hanoi

Download PDF

Paper 2: Deep Learning Approach for Secondary Structure Protein Prediction based on First Level Features Extraction using a Latent CNN Structure

Abstract: In bioinformatics, Protein Secondary Structure Prediction (PSSP) is considered one of the most challenging tasks in the field. Secondary structure prediction approaches are commonly categorized into three groups: neighbor-based, model-based, and meta predictor-based. The main purpose of model-based approaches is to detect the protein sequence-structure relationship by utilizing machine learning techniques to train a predictive model. Different supervised learning approaches have been proposed for this purpose, such as neural networks, hidden Markov chains, and support vector machines. In this paper, our proposed approach, a latent deep learning approach, relies on detecting first-level features using a Stacked Sparse Autoencoder. This allows us to learn new features from the training data, which are later used as convolution filters in a Convolutional Neural Network (CNN) structure. The experimental results show that the highest prediction accuracy of our approach on the testing set is 86.719%, obtained when the network is pre-trained in an unsupervised fashion using a backpropagation framework and the whole network is then fine-tuned in a supervised fashion.

Author 1: Adil Al-Azzawi

Keywords: Secondary structure protein prediction; secondary structure; fine-tuning; Stacked Sparse; Deep Learning; CNN

Download PDF

Paper 3: 3D Human Action Recognition using Hu Moment Invariants and Euclidean Distance Classifier

Abstract: This paper presents a new scale-, rotation-, and translation-invariant interest point descriptor for human action recognition. The descriptor, HMIV (Hu Moment Invariants on Videos), is used for solving surveillance camera recording problems under different conditions of side, position, direction and illumination. The proposed approach deals with raw input human action video sequences. Seven Hu moments are computed to extract human action features and store them in a 1D vector, which is condensed into one mean value over all the frames' moments. The moments are invariant to scale, translation and rotation, which is the key strength of the Hu moments algorithm. The experiments are evaluated using two different datasets, KTH and UCF101. The classification is performed by calculating the Euclidean distance between the training and testing datasets; the human action with the minimum distance is selected as the matching action. The maximum classification accuracy in this work is 93.4% for the KTH dataset and 92.11% for UCF101.
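
For readers unfamiliar with the pipeline, the following is a hedged Python sketch of the general idea (Hu moments per frame averaged into one 7-element vector, then nearest-neighbour matching by Euclidean distance); the frame data, the `hmiv_descriptor` name and the reference dictionary are illustrative assumptions, not the authors' code:

```python
import numpy as np
import cv2

def hmiv_descriptor(frames):
    """Average the 7 Hu moment invariants over all frames of an action clip."""
    hu_per_frame = [cv2.HuMoments(cv2.moments(f)).flatten() for f in frames]
    return np.mean(hu_per_frame, axis=0)  # 1-D vector of length 7

def classify(test_frames, references):
    """references: dict of action label -> descriptor; pick the minimum Euclidean distance."""
    d = hmiv_descriptor(test_frames)
    return min(references, key=lambda label: np.linalg.norm(d - references[label]))

# Toy usage with random grayscale frames standing in for real video.
rng = np.random.default_rng(0)
clip = [rng.integers(0, 255, (120, 160), dtype=np.uint8) for _ in range(30)]
refs = {"walk": hmiv_descriptor(clip), "run": rng.random(7)}
print(classify(clip, refs))
```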

Author 1: Fadwa Al-Azzo
Author 2: Arwa Mohammed Taqi
Author 3: Mariofanna Milanova

Keywords: human action recognition; Hu moment invariants; surveillance camera; Euclidean distance

Download PDF

Paper 4: Learning Analytics in a Shared-Network Educational Environment: Ethical Issues and Countermeasures

Abstract: The recent trend in the development of education across the globe is the use of new Learning Analytics (LA) tools and technologies in teaching and learning. The potential benefits of LA notwithstanding, potential ethical issues have to be considered and addressed in order to avoid any legal issues that might arise from its use. As a result, Higher Education Institutions (HEIs) involved in the development of LA tools need to pay particular attention to every ethical challenge or constraint that might arise. This paper aims to identify and discuss several ethical issues connected with the practice and use of LA tools and technologies in analysing and predicting the performance of students in a shared network environment of HEIs. The study discusses the four ethical issues of Information and Communication Technology, namely Privacy, Accuracy, Property and Accessibility (the PAPA model), as well as other approaches to explain these future concerns. The paper also presents empirical evidence of the views of students on the analytical use and storage of their data. The results indicate that even though students have high trust in the privacy and security of their data being used by their institutions, more than half of the students have ethical concerns with the accessibility and storage of their data beyond a certain period. In the light of this, generalised strategies on ethical issues of the use of learners' data in an HEI shared networked environment are proposed.

Author 1: Olugbenga Adejo
Author 2: Thomas Connolly

Keywords: Learning Analytics; Student’s data; Emerging technologies; Ethical Issues; Higher Education

Download PDF

Paper 5: Dynamic Service Adaptation Architecture

Abstract: This paper proposes a software architecture for dynamic service adaptation. The services are constituted by reusable software components. The goal of the adaptation is to optimize the service according to its execution context. As a first step, the context takes into account only user needs, but other elements will be added. A particular feature of our proposition is that profiles are used not only to describe the context's elements but also the components themselves. An Adapter analyzes the compatibility between all these profiles and detects the points where the profiles are not compatible. The same Adapter searches for and applies the possible adaptation solutions: component customization, insertion, extraction or replacement.

Author 1: Mohammed Yassine BAROUDI
Author 2: Abdelkrim BENAMAR
Author 3: Fethi Tarik BENDIMERAD

Keywords: Adaptative service; software component; service; dynamic adaptation

Download PDF

Paper 6: A Comprehensive Insight towards Research Direction in Information Propagation

Abstract: The concept of information propagation has been studied to illustrate the particular, discrete, and explicit behavior of nodes in complex, highly distributed and connected networks. The complex network structure poses various challenges to information propagation due to the use of diverse communication protocols and dynamic behavior under uncertainty. This paper is the first of its kind to review the frequently addressed problems and the most significant research techniques for addressing research problems associated with information propagation, covering social network analysis, data routing behavior in multi-path wireless networks, multimedia transmission, and security. This paper is useful for researchers, academicians, and industry practitioners with research interests in social network analysis, predictive modeling and information propagation analysis.

Author 1: Selva Kumar S
Author 2: Dr. Kayarvizhy N

Keywords: Information Propagation Prediction; Social Network Analysis; Predictive Modelling; Information Propagation

Download PDF

Paper 7: Visualizing Composition in Design Patterns

Abstract: Visualization of design pattern information plays a vital role in the analysis, design and comprehension of software applications. Different representations of design patterns have been proposed in the literature, but each representation has its strengths and limitations. State-of-the-art design pattern visualization approaches are unable to capture all the aspects of design pattern visualization which are important for the comprehension of any software application, e.g., the role that a class, attribute or operation plays in a design pattern. Additionally, there exist multiple instances of a design pattern and different types of overlapping in the design of different systems. Visualization of overlapping and composition in design patterns is important for the forward and reverse engineering domains. The focus of this paper is to analyze the characteristics, strengths and limitations of key design pattern representations used for visualization and to propose a hybrid approach which incorporates the best features of existing approaches while suppressing their limitations. The approach extends features which are important for visualizing different types of overlapping in design patterns. Stereotypes, tagged values, semantics and constraints are defined to represent the design pattern information related to attributes and/or operations of a class. A prototype tool named VisCDP is developed to demonstrate and evaluate our proposed approach.

Author 1: Zaigham Mushtaq
Author 2: Kiran Iqbal
Author 3: Ghulam Rasool

Keywords: Design patterns; Visualization; Program Comprehension; Reverse engineering; Composition

Download PDF

Paper 8: Adaptive Case Management Framework to Develop Case-based Emergency Response System

Abstract: Emergency response to crisis, disaster, or catastrophe incidents is a clear example of a knowledge-intensive and collaboration-heavy process facing all public safety-related organizations. Software systems to support emergency response have existed for decades. However, the limitations of these systems and their development approaches are still significant in terms of flexibility and dynamicity. With the emergence of Adaptive Case Management (ACM) as a new software development approach to support knowledge work and empower knowledge workers, the authors found that ACM is a promising approach that can be extended to support emergency response, especially in large-scale situations. This research aims to study how ACM can be leveraged to design and implement case-based emergency response systems (ERSs). In particular, the authors propose a domain-specific and vendor-neutral Case Management Framework (CMF) that incorporates the essential capabilities to support ERSs. As a proof of concept, the authors support the proposed CMF with a case-based ERS prototype. Finally, the authors conclude that ACM has great potential to enhance the effectiveness and efficiency of ERSs. This work can be considered an attempt to advocate the adoption of ACM in such a context.

Author 1: Abobakr Y. Shahrah
Author 2: Majed A. Al-Mashari

Keywords: Adaptive Case Management; Case Handling; Case Management; Emergency Response System

Download PDF

Paper 9: Novel Intra-Prediction Framework for H.264 Video Compression using Decision and Prediction Mode

Abstract: With the increasing usage of multimedia content and the advancement of communication devices (along with services), there is a heavy demand for an effective multimedia compression protocol. In this regard, H.264 has proven to be an effective video compression standard; however, its computational complexity, along with various other issues, has been an impediment to mainstream compression research. Therefore, we present a novel framework that enhances the capability of the H.264 compression method by emphasizing the cost effectiveness of computational operations during intra-prediction mode. A simple and novel encoding mechanism has been formulated for H.264/AVC using the macroblock decision mode as well as the selection of the prediction mode exclusively for intra-prediction. The study outcome is found to offer superior signal quality compared to the conventional H.264 encoding mechanism.

Author 1: Pradeep Kumar N.S.
Author 2: H.N. Suresh

Keywords: Encoding Mechanism; H.264 / AVC; Intra-Prediction Mode; Video Compression; Visual Quality

Download PDF

Paper 10: An Efficient Approach for the Security Threats on Data Centers in IOT Environment

Abstract: The Internet of Things has emerged from the conjunction of wireless technologies, micro-electromechanical systems (MEMS), micro-services and the Internet. This conjunction has helped break down the silos between operational technology (OT) and information technology (IT), allowing unstructured machine-generated data to be analysed for insights that drive improvements. The Internet of Things (IoT) is an arrangement of interconnected computing devices, mechanical and digital machines, and objects that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. However, security is one of the main concerns in the Internet of Things, and its risks should be minimized. Attackers send unnecessary requests to overload the data center, which results in hanging servers, decreased throughput, and additional transmissions to the data centers. This paper presents an efficient approach to decrease unwanted requests at the data centers, so that the number of sessions and the unnecessary load on the data centers are reduced, in order to mitigate the effect of such attacks as much as possible.

Author 1: Fahad H. Alshammari

Keywords: Internet of Things; Data centers; sessions; Security Threats; Networks

Download PDF

Paper 11: Using Weighted Bipartite Graph for Android Malware Classification

Abstract: The complexity and the number of mobile malware are increasing continually as the usage of smartphones continues to rise. The popularity of Android has increased the number of malware targeting Android-based smartphones. Developing efficient and effective approaches for Android malware classification is emerging as a new challenge. This paper introduces an effective Android malware classifier based on a weighted bipartite graph. The classifier includes two phases: in the first phase, the permissions and API calls used in the Android app are utilized to construct the weighted bipartite graph; the feature importance scores are integrated as weights in the bipartite graph to improve the discrimination between malware and goodware apps by incorporating extra meaningful information into the graph structure. In the second phase, multiple classifiers are applied to categorise the Android application as malware or goodware. The results on an Android malware dataset consisting of different malware families show the effectiveness of our approach for Android malware classification.
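
A minimal sketch of how such a weighted bipartite graph might be assembled with networkx is given below; the app names, permissions/API calls and importance scores are hypothetical, and the classifier stage is omitted:

```python
import networkx as nx

# Hypothetical apps with the permissions/API calls they use.
apps = {
    "app_a": ["SEND_SMS", "READ_CONTACTS", "HttpURLConnection.connect"],
    "app_b": ["INTERNET", "HttpURLConnection.connect"],
}
# Hypothetical feature-importance scores (e.g., from information gain).
importance = {"SEND_SMS": 0.9, "READ_CONTACTS": 0.7,
              "INTERNET": 0.1, "HttpURLConnection.connect": 0.3}

G = nx.Graph()
G.add_nodes_from(apps, bipartite=0)                 # app partition
G.add_nodes_from(importance, bipartite=1)           # feature partition
for app, feats in apps.items():
    for f in feats:
        G.add_edge(app, f, weight=importance[f])    # importance score as edge weight

# A simple per-app feature vector derived from the weighted graph,
# which could then be fed to the classifiers mentioned in the abstract.
features = sorted(importance)
vectors = {a: [G[a][f]["weight"] if G.has_edge(a, f) else 0.0 for f in features]
           for a in apps}
print(vectors)
```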

Author 1: Altyeb Altaher

Keywords: Android malware; Bipartite graph; Classification algorithms; machine learning

Download PDF

Paper 12: A Parallel Simulated Annealing Algorithm for Weapon-Target Assignment Problem

Abstract: Weapon-target assignment (WTA) is a combinatorial optimization problem and is known to be NP-complete. The WTA problem seeks the best assignment of weapons to targets in order to minimize the total expected value of the surviving targets. Exact methods can solve only small-size problems in a reasonable time. Although many heuristic methods have been studied for the WTA in the literature, few parallel methods have been proposed. This paper presents a parallel simulated annealing algorithm (PSA) to solve the WTA. The PSA runs on a GPU using the CUDA platform. A multi-start technique is used in the PSA to improve the quality of solutions. Twelve randomly generated problem instances (up to 200 weapons and 200 targets) are used to test the effectiveness of the PSA. Computational experiments show that the PSA outperforms SA on average and runs up to 250x faster than a single-core CPU implementation.
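
The following is a single-threaded Python sketch of simulated annealing on the standard WTA objective (minimising the total expected value of surviving targets); the instance sizes, cooling schedule and parameters are illustrative, and the GPU/multi-start parallelism described in the paper is not reproduced:

```python
import math
import random

random.seed(1)
W, T = 20, 20                                             # weapons, targets (toy sizes)
value = [random.uniform(1, 10) for _ in range(T)]         # target values
p_kill = [[random.uniform(0.1, 0.9) for _ in range(T)] for _ in range(W)]

def cost(assign):
    """Total expected value of the surviving targets."""
    survive = [1.0] * T
    for w, t in enumerate(assign):
        survive[t] *= 1.0 - p_kill[w][t]
    return sum(value[t] * survive[t] for t in range(T))

assign = [random.randrange(T) for _ in range(W)]          # initial random assignment
best, best_cost, temp = assign[:], cost(assign), 10.0
while temp > 1e-3:
    cand = assign[:]
    cand[random.randrange(W)] = random.randrange(T)       # move one weapon to a new target
    delta = cost(cand) - cost(assign)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        assign = cand
        if cost(assign) < best_cost:
            best, best_cost = assign[:], cost(assign)
    temp *= 0.995                                         # geometric cooling
print(round(best_cost, 3))
```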

Author 1: Emrullah SONUC
Author 2: Baha SEN
Author 3: Safak BAYIR

Keywords: Weapon-Target Assignment; Multi-start Simulated Annealing; Combinatorial optimization; Parallel algorithms; GPU

Download PDF

Paper 13: A Review on Urdu Language Parsing

Abstract: Natural Language Processing is the multidisciplinary area of Artificial Intelligence, Machine Learning and Computational Linguistics concerned with processing human language automatically. It involves understanding and processing of human language. The way in which we share content or feelings has always been of great importance in understanding and processing language. Parsing is the most suitable approach for identifying and analysing what the available sentences express. Parsing is the process in which the syntactic structure of a sentence is identified using grammatical tags. The syntactically correct sentence structure is achieved by assigning grammatical labels to its constituents using a lexicon and syntactic rules. Phrase structure and dependency are the two main formalisms for parsing natural language sentences. The growing use of Web 2.0 has produced novel research challenges, as people from different geographical areas use this channel and share content in their native languages. Urdu is one such free word order native language which is widely shared over social media sites, but identification and summarization of Urdu sentences is a challenging task. In this review paper, we present an overview of recent work in parsing fixed word order languages (e.g., English) and free word order languages (e.g., Urdu) in order to reveal the most suitable method for Urdu language parsing. This survey shows that dependency parsing is more appropriate for Urdu and other free word order languages, and that parsers for English are not useful in parsing Urdu sentences due to their morphological, syntactical and grammatical differences.

Author 1: Arslan Ali Raza
Author 2: Asad Habib
Author 3: Jawad Ashraf
Author 4: Muhammad Javed

Keywords: Natural Language Processing; Machine Learning; Urdu Language Processing and Dependency Parsing

Download PDF

Paper 14: A Novel Representation and Searching Algorithm for Opening Hours

Abstract: Opening hours can be considered a data type with a human representation; this means that they can be easily understood by human beings but hardly understood by computers because of the lack of a standard structured representation. In essence, the opening hours give us a simple piece of information, the opening state at a certain date and time, and this is our focus in this paper. So far, this kind of functionality does not exist in today's database management systems because no algorithms have been developed for it. The purpose of this paper is to present a novel and easy-to-implement algorithm for encoding opening hours in order to quickly search and get the opening state of records.
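
One plausible encoding, not necessarily the authors' scheme, is a week-long bitmask at minute resolution, which makes the open-state query a constant-time bit test; a Python sketch with a hypothetical shop schedule:

```python
from datetime import datetime

def encode(intervals):
    """intervals: list of (weekday 0-6, 'HH:MM', 'HH:MM') -> one big integer bitmask."""
    mask = 0
    for day, start, end in intervals:
        s = day * 1440 + int(start[:2]) * 60 + int(start[3:])
        e = day * 1440 + int(end[:2]) * 60 + int(end[3:])
        mask |= ((1 << (e - s)) - 1) << s       # set one bit per open minute of the week
    return mask

def is_open(mask, when):
    minute = when.weekday() * 1440 + when.hour * 60 + when.minute
    return bool(mask >> minute & 1)

# Hypothetical shop: Monday to Friday, 09:00-17:00.
shop = encode([(d, "09:00", "17:00") for d in range(5)])
print(is_open(shop, datetime(2017, 4, 3, 10, 30)))   # Monday morning -> True
print(is_open(shop, datetime(2017, 4, 2, 10, 30)))   # Sunday -> False
```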

Author 1: Teodora Husar
Author 2: Cornelia Gyorödi
Author 3: Robert Gyorödi
Author 4: Sorin Sarca

Keywords: Opening Hours; Java; optimizations

Download PDF

Paper 15: Improved Selfish Node Detection Algorithm for Mobile Ad Hoc Network

Abstract: Mobile Ad hoc Networks (MANETs) suffer from different security issues. In practice, not all nodes in a MANET cooperate in forwarding packets, even without malicious intention. Such a node is called a selfish node, and it behaves this way due to its internal state, such as limited energy concerns. Selfish nodes drop packets, which harms the process of route establishment and packet relaying. Therefore, it is very important to detect and avoid these nodes, which improves the performance of the overall network. Here, an improved scheme has been developed for detecting selfish nodes in networks routed with the Ad Hoc On-demand Distance Vector (AODV) protocol. Two algorithms were integrated to assure the lowest false positive rate in selfish node detection: the first avoids false positive detection of selfish nodes in forwarding Route Requests (RREQ), and the second avoids false positive detection of selfish nodes in forwarding data packets. This scheme guarantees improvement in packet forwarding performance in terms of Packet Delivery Ratio (PDR) and End-to-End delay (E2E delay).

Author 1: Ahmed. A. Hadi
Author 2: Zulkarnain Md. Ali
Author 3: Yazan Aljeroudi

Keywords: Selfish nodes detection; AODV routing; routing protocols; MANET

Download PDF

Paper 16: A Study on Ranking Key Factors of Virtual Teams Effectiveness in Saudi Arabian Petrochemical Companies

Abstract: This research ranks effectiveness-related factors of virtual teams. The literature suggests various factors which could motivate or discourage management from using virtual teams versus co-located teams. Forty-eight interviews were conducted in petrochemical companies in Saudi Arabia. The Echo Method was employed and eleven factors were identified. Results showed that the participants ranked efficiency and communication as the first and second motivating factors in adopting the virtual team approach, while the other three motivating factors, ranked lower, are flexibility, diversity and cooperation. On the other hand, the six discouraging factors (barriers) are miscommunication, scheduling preferences, unreliability of technology, incompetency of staff, varying standards and isolationist tendency. Suggestions were made to counteract the effects of the barrier-inducing factors and enhance the effects of the motivating factors.

Author 1: Abdullah Basiouni
Author 2: Kang Mun Arturo Tan
Author 3: Hafizi Muhamad Ali
Author 4: Walid Bahamdan
Author 5: Ahmad Khalifi

Keywords: Virtual teams; Social network; Echo method; Quantitative analysis

Download PDF

Paper 17: Prediction of Naturally Fractured Reservoir Performance using Novel Integrated Workflow

Abstract: Generating subsurface fracture maps of naturally fractured reservoirs and predicting their production potential are considered a complex process due to the insufficiency of available data such as borehole images, core data and a proper reservoir simulator model. To overcome such shortcomings, the industry has relied on geo-statistical analyses of hard and soft data, which are often referred to as static data. This paper presents an integrated workflow that models and predicts fractured reservoir performance through the use of gradient-based inversion techniques and discrete fracture network (DFN) modelling, which, through the inversion of well test data (i.e., dynamic data), aims to optimise fracture properties and then predict the reservoir's production potential. The first step in the workflow is to identify flow-contributing fracture sets by analysing available core descriptions, borehole images, conventional log data and production data. Once the fracture sets are identified, the fracture intensity is statistically populated in the inter-well space. In the second step, 3D block-based permeability tensors are calculated based on flow through discrete fractures, and the fracture intensity is then propagated away from the wellbore, i.e., by relating the permeability tensors to fracture intensity. In the final step (fracture optimisation), the fracture properties, including distribution, orientation and geometry in different realisations, are computed by DFN modelling. Fluid flow is simulated in these discrete fractures to estimate pressure change and pressure derivatives. The production rate associated with a drill stem test performed within this reservoir area has been successfully simulated using the optimised subsurface fracture map generated in the first step.

Author 1: Reda Abdel Azim

Keywords: fractured reservoirs; production potential; fracture network map and finite element

Download PDF

Paper 18: Secure Data Accumulation among Reliable Hops with Rest/Alert Scheduling in Wireless Sensor Networks

Abstract: Wireless Sensor Networks (WSNs) are prone to attacks from outside sources. The collected information must be secured to guarantee its integrity and privacy. In sensor networks, data collection and data accumulation depend mainly on the energy levels of the sensor hops. Due to energy drain, at a particular point in time the sensor hops become obsolete and data transmission no longer takes place. This research proposes a reliable and secure strategy with dependable hops, utilizing an own-key logic test protocol for the sensor network. The proposal is to use a few hops as dependable hops (reliable hops) to capture the insight of the nodes. For every hop, a secret authorization is shared between the sink and its neighboring hops. A network is then developed for sending information to the sink hops in a progressive design. The hops encode the information using the secret authorization and forward it to the next level in the network. By improving the transmission structure of the reliable hops, the accumulated value is verified to guarantee trustworthiness. The proposed system is demonstrated with various examples throughout the paper.

Author 1: Mohamed Mustaq AhmedA
Author 2: Abdalla AlAmeen
Author 3: Mohemmed Sha M
Author 4: Mohamed Yacoab M.Y
Author 5: Manesh.T

Keywords: Sensor Networks; Data Collection; Data Accumulation; Reliability; Security Key; Rest/Alert hops

Download PDF

Paper 19: Human Gesture Recognition using Keyframes on Local Joint Motion Trajectories

Abstract: Human Action Recognition (HAR) systems are systems that recognize and classify the actions that users perform in front of a sensor or camera. In most HAR systems, input test data is compared with reference data in a database using various methods, and the classification is performed according to the result obtained. The size of the test or reference data directly affects the operation speed of the system, so reducing the data size allows a significant increase in speed. In this study, an action recognition method is proposed using skeletal joint information obtained by the Microsoft Kinect sensor. Splitting keyframes are obtained from the skeletal joint information and treated as a distinguishing feature; these keyframes are therefore used for the classification process. Keeping the keyframes in the reference database, instead of the position or angle information of the whole action, saves memory and processing time. A weight value is calculated for each keyframe. The problem of temporal differences that occur when comparing test and reference actions is solved by Dynamic Time Warping (DTW). The k-nearest neighbors algorithm is used for classification according to the results obtained from DTW. The method was evaluated on a sample dataset, and 100% correct classification was achieved. It is also suitable for real-time systems. Breakpoints can also be used to provide feedback to the user as a result of the classification process: the magnitude and direction of the keyframes, the change in the joint trajectory, and the position and time of their occurrence give information about timing errors.
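
A minimal Python sketch of the DTW-plus-nearest-neighbour stage is shown below; the keyframe sequences are random stand-ins, the keyframe weighting described in the abstract is omitted, and the function names are assumptions:

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two sequences of feature vectors."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(test_seq, references, k=1):
    """references: list of (label, keyframe sequence); simple k-NN on DTW distance."""
    dists = sorted((dtw(test_seq, ref), label) for label, ref in references)
    labels = [label for _, label in dists[:k]]
    return max(set(labels), key=labels.count)

# Toy keyframe sequences (each keyframe: a small joint-feature vector).
rng = np.random.default_rng(0)
refs = [("wave", rng.random((8, 3))), ("clap", rng.random((5, 3)))]
print(classify(rng.random((6, 3)), refs))
```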

Author 1: Rafet Durgut
Author 2: Oguz FINDIK

Keywords: Human gesture recognition; dynamic time warping; local joint motion trajectory; Human action recognition; microsoft kinect

Download PDF

Paper 20: A Proposed Fuzzy Stability Model to Improve Multi-Hop Routing Protocol

Abstract: Today's widespread use of mobile devices such as mobile phones, tablets and laptops has driven the growth of wireless mobile networks, especially Mobile Ad hoc Networks, commonly referred to as MANETs. Since the routing process is regarded as the core of communication and is tied to the network performance metrics, its improvement is reflected in the improvement of overall network performance. Due to user mobility, limited battery power, and limited transmission ranges, current routing protocols should consider the stability of routes. Hence, the lack of resources in MANETs may result in imprecise routing decisions. In this paper, a fuzzy stability model for Dynamic Source Routing (FSDSR) is proposed to handle the imprecision of routing decisions. In terms of the number of hops per route, cache size, end-to-end delay and route discovery time, the results show that FSDSR outperforms the state-of-the-art Dynamic Source Routing (DSR) protocol.

Author 1: Hamdy A.M. Sayedahmed
Author 2: Hesham A. Hefny
Author 3: Imane M.A. Fahmy

Keywords: MANET; Fuzzy Model; Routes Stability; OPNET; DSR; FSDSR; MATLAB

Download PDF

Paper 21: An Improved Machine Learning Approach to Enhance the Predictive Accuracy for Screening Potential Active USP1/UAF1 Inhibitors

Abstract: DNA repair is an important mechanism employed by cancerous cells to survive the DNA damage induced during uncontrolled cell proliferation and anti-cancer drug treatment. In this context, Ubiquitin-Specific Protease 1 (USP1) in complex with Ubiquitin-Associated Factor 1 (UAF1) plays a key role in the survival of cancerous cells through the DNA repair mechanism. This puts the USP1/UAF1 complex forth as a striking target for screening anti-cancer molecules. The current research aims to improve the classification accuracy of the existing bioactivity-predictive chemoinformatics model for screening potential active USP1/UAF1 inhibitors from high-throughput screening data. The study employed a feature selection method to extract key molecular descriptors from the publicly available high-throughput screening dataset of small molecules used to screen active USP1/UAF1 complex inhibitors. This study proposes an improved predictive machine learning approach that uses the feature selection technique and a two-class Linear Discriminant Analysis (LDA) algorithm to accurately predict active novel USP1/UAF1 inhibitor compounds.
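
As a hedged illustration of the general pipeline (feature selection followed by a two-class linear discriminant classifier), the following scikit-learn sketch uses synthetic data in place of the real molecular descriptors; it is not the authors' exact configuration:

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for molecular descriptors of active/inactive compounds.
X, y = make_classification(n_samples=500, n_features=60, n_informative=12,
                           random_state=0)

model = make_pipeline(
    SelectKBest(score_func=f_classif, k=12),   # keep the most discriminative descriptors
    LinearDiscriminantAnalysis(),              # two-class LDA predictor
)
scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy: %.3f" % scores.mean())
```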

Author 1: Syed Asif Hassan
Author 2: Ahmed Hamza Osman

Keywords: Ubiquitinases; DNA repair mechanism; anti USP1/UAF1 molecule; High-throughput Dataset; Feature Selection and Discriminant Technique; Chemoinformatic Model; Classification accuracy; T-test

Download PDF

Paper 22: Instant Diacritics Restoration System for Sindhi Accent Prediction using N-Gram and Memory-Based Learning Approaches

Abstract: The script of the Sindhi language is highly complex, partly due to the abundance of homographic words. Interpreting the text becomes difficult because of the multiple meanings associated with a homographic word unless a specific pronunciation is given with the help of diacritics. Diacritics help readers comprehend the text easily, yet in today's fast-paced environment people rarely bother to write diacritics in routine applications. Besides creating difficulties for human reading, the absence of diacritics also makes the text hard for machines to read. Much like humans, machines can run into semantic and syntactic ambiguities during computational processing of the language. Instant diacritics restoration is an approach that emerged from text prediction systems. This type of diacritics restoration is unprecedented work in the realm of natural language processing, particularly for Indo-Aryan languages. This work proposes a framework using N-grams and a memory-based learning approach. The highlight of this mechanism is its 99.03% accuracy on a Sindhi corpus during the experiments. The comparative edge of instant diacritics restoration is that it can speed up other natural language and speech processing applications. The future development of this approach looks promising, since Sindhi orthography is highly similar to that of Arabic, Urdu, Persian and other languages based on this type of script.
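
A toy memory-based bigram sketch of instant diacritics restoration is given below; the Latin-script placeholder words stand in for Sindhi text, and the back-off rule is an assumption rather than the authors' design:

```python
from collections import Counter, defaultdict

# Toy training pairs: (previous word, undiacritized word) -> diacritized form.
# Latin placeholders stand in for Sindhi script here.
training = [
    (("<s>", "ktb"), "kitab"),
    (("hik", "ktb"), "kitab"),
    (("hn", "ktb"), "katab"),
]

memory = defaultdict(Counter)
for context, diacritized in training:
    memory[context][diacritized] += 1            # memorize observed instances

def restore(prev_word, bare_word):
    """Return the most frequent diacritized form seen in this bigram context,
    backing off to the bare word when the context was never observed."""
    candidates = memory.get((prev_word, bare_word))
    return candidates.most_common(1)[0][0] if candidates else bare_word

print(restore("hik", "ktb"))   # -> 'kitab'
print(restore("hn", "ktb"))    # -> 'katab'
```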

Author 1: Hidayatullah Shaikh
Author 2: Javed Ahmed Mahar
Author 3: Mumtaz Hussain Mahar

Keywords: Sindhi Language; Instant Diacritics Restoration; Text Prediction; N-Grams; Memory-Based Learning

Download PDF

Paper 23: An Enhanced Breast Cancer Diagnosis Scheme based on Two-Step-SVM Technique

Abstract: This paper proposes an automatic diagnostic method for breast tumour disease using a hybrid of the Support Vector Machine (SVM) and the Two-Step clustering technique. The hybrid technique aims to improve diagnostic accuracy and reduce diagnostic misclassification, thereby addressing the classification problems related to breast tumours. To distinguish the hidden patterns of malignant and benign tumours, the Two-Step algorithm and SVM have been combined and employed to differentiate incoming tumours. The developed hybrid method achieves an accuracy of 99.1% when examined on the UCI-WBC data set. Moreover, in terms of the evaluation measures, experimental results show that the hybrid method outperforms modern classification techniques for breast cancer diagnosis.

Author 1: Ahmed Hamza Osman

Keywords: Two-Step Clustering; Breast Cancer; SVM classification; Diagnosis; Tumors

Download PDF

Paper 24: Automatic Recognition of Medicinal Plants using Machine Learning Techniques

Abstract: The proper identification of plant species has major benefits for a wide range of stakeholders ranging from forestry services, botanists, taxonomists, physicians, pharmaceutical laboratories, organisations fighting for endangered species, government and the public at large. Consequently, this has fueled an interest in developing automated systems for the recognition of different plant species. A fully automated method for the recognition of medicinal plants using computer vision and machine learning techniques has been presented. Leaves from 24 different medicinal plant species were collected and photographed using a smartphone in a laboratory setting. A large number of features were extracted from each leaf such as its length, width, perimeter, area, number of vertices, colour, perimeter and area of hull. Several derived features were then computed from these attributes. The best results were obtained from a random forest classifier using a 10-fold cross-validation technique. With an accuracy of 90.1%, the random forest classifier performed better than other machine learning approaches such as the k-nearest neighbour, naïve Bayes, support vector machines and neural networks. These results are very encouraging and future work will be geared towards using a larger dataset and high-performance computing facilities to investigate the performance of deep learning neural networks to identify medicinal plants used in primary health care. To the best of our knowledge, this work is the first of its kind to have created a unique image dataset for medicinal plants that are available on the island of Mauritius. It is anticipated that a web-based or mobile computer system for the automatic recognition of medicinal plants will help the local population to improve their knowledge on medicinal plants, help taxonomists to develop more efficient species identification techniques and will also contribute significantly in the protection of endangered species.
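
The evaluation setup described (a random forest with 10-fold cross-validation over hand-crafted leaf features) can be sketched as follows; the synthetic features stand in for the real 24-class medicinal-plant dataset and the hyperparameters are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for leaf features (length, width, area, hull area, colour, ...)
# across 24 classes, in place of the real medicinal-plant dataset.
X, y = make_classification(n_samples=720, n_features=20, n_informative=10,
                           n_classes=24, n_clusters_per_class=1, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)       # 10-fold cross-validation
print("mean accuracy over 10 folds: %.3f" % scores.mean())
```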

Author 1: Adams Begue
Author 2: Venitha Kowlessur
Author 3: Upasana Singh
Author 4: Fawzi Mahomoodally
Author 5: Sameerchand Pudaruth

Keywords: leaf recognition; medicinal plants; random forest; Mauritius

Download PDF

Paper 25: E-exam Cheating Detection System

Abstract: With the expansion of the Internet and technology over the past decade, e-learning has grown rapidly. Cheating in exams has been a widespread phenomenon all over the world, regardless of the level of development. Traditional cheating detection methods may therefore no longer be wholly successful in preventing cheating during examinations. Online examination is an integral and vital component of e-learning. In e-learning, students' exams are submitted remotely without any monitoring from physical proctors. Because it is easy to cheat during e-exams, e-learning universities depend on an examination process in which students take a face-to-face examination at a physical place allocated on the institution's premises under supervised conditions; however, this conflicts with the concept of a distant e-learning environment. This paper investigates methods for detecting student cheating in online exams through continuous authentication and online proctoring. In addition, we have implemented an e-exam management system that is used to detect and prevent cheating in online exams. The system uses a fingerprint reader authenticator and an Eye Tribe tracker during the exam session. We studied two parameters that can define the examinee's status as cheating or non-cheating during the exam: the total time spent looking off-screen and the number of times the examinee looked off-screen.
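
The decision rule implied by the two parameters can be sketched as below; the thresholds and the gaze-event representation are hypothetical, not values reported by the authors:

```python
def examinee_status(gaze_events, time_threshold_s=30.0, count_threshold=5):
    """gaze_events: list of off-screen episode durations in seconds
    (e.g., derived from an eye-tracker log). Thresholds are illustrative only."""
    total_out_time = sum(gaze_events)       # total time spent off screen
    out_count = len(gaze_events)            # number of off-screen episodes
    if total_out_time > time_threshold_s or out_count > count_threshold:
        return "cheating"
    return "non-cheating"

print(examinee_status([2.0, 3.5, 1.0]))                   # -> non-cheating
print(examinee_status([12.0, 15.0, 9.0, 4.0, 6.0, 3.0]))  # -> cheating
```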

Author 1: Razan Bawarith
Author 2: Dr. Abdullah Basuhail
Author 3: Dr. Anas Fattouh
Author 4: Prof. Dr. Shehab Gamalel-Din

Keywords: online exam; cheating; continuous authentication; online proctor; fingerprint; eye tracking

Download PDF

Paper 26: Modeling and Control of a Multi-Machine Traction System Connected in Series using Two Static Converter

Abstract: Power may be segmented either at the converter, using a multilevel inverter, or at the machine, by using a polyphase winding. Moreover, increasing the number of phases improves power quality and reduces torque ripple, with the added advantage of fault tolerance against the loss of one or more phases. This class of system offers a reduction in design time and costs and optimization of the volume of embedded systems. The objective of this work is to control, model and characterize the behavior of a multi-machine drive system composed of two five-phase permanent magnet synchronous motors connected in series and fed by two static converters.

Author 1: Selimane MEGUENNI
Author 2: Abedelkhader. DJAHBAR

Keywords: synchronous machine; Multi-machine Multi-inverter; five-phase; vector control

Download PDF

Paper 27: Spatial Comprehension Exercise System with 3D CG of Toy Model for Disabled Children

Abstract: A spatial comprehension exercise system using three-dimensional computer graphics (3D CG) of a toy model for disabled children is proposed. In order to improve spatial comprehension in an attractive manner, a toy model is created together with a building block model. Through experiments, it is confirmed that spatial comprehension improves remarkably for the disabled children.

Author 1: Kohei Arai
Author 2: Taiki Ishigaki
Author 3: Mariko Oda

Keywords: Spatial Comprehension; Toy model; Augmented reality; Computer graphics

Download PDF

Paper 28: Gatekeepers Practices in Knowledge Diffusion within Saudi Organizations: KFMC Case Study

Abstract: Gatekeepers in organizations play a critical role in disseminating and transferring outside knowledge into their groups. This research contributes to identifying gatekeepers' practices in gathering, selecting, and diffusing knowledge. In the context of Saudi organizations, the exploratory case selected in this research is King Fahad Medical City (KFMC). The research is conducted on Health Informatics and Information Technology employees. A mixed-method design is applied in this research to provide a deep understanding of the knowledge interaction structure and the process of knowledge interactions across the organization network. Both methods, questionnaires and interviews, are used to investigate the context. The Social Network Analysis method is also used in this research to capture the "brokerage" network structure position using the Flow Betweenness Centrality algorithm. The findings reveal that gatekeepers use different knowledge sharing mechanisms: information retrieval, information pooling, pushing, diffusion, collaborative problem solving, and thinking along. In addition, the results present the distinct methods and technologies used by the gatekeepers to collect and share their knowledge with others. The findings of this research help managerial decision makers and strategic managers in start-up organizations as well as well-structured organizations by providing valuable insights for decisions on policies, strategies, and the appropriate collaborative tools that foster collaborative working.

Author 1: Mona Alawadh
Author 2: Abdullah Altameem

Keywords: Knowledge sharing; gatekeepers; brokerage; knowledge transfer; and SNA

Download PDF

Paper 29: Observation of Scintillation Events from GPS and NavIC (IRNSS) Measurements at Bangalore Region

Abstract: Ionospheric scintillation is a random phenomenon of the ionosphere, causing abrupt fluctuations in the amplitude and phase of signals traversing the medium and significantly impacting the performance of navigation systems, which signifies the need to take up scintillation studies. Scintillation events are monitored on the L5 and S band signals of IRNSS and the L1 band signals of GPS over the low-latitude Bangalore region during periods of moderate and low solar activity, 2015 and 2016 respectively. Investigations into scintillation variability with respect to local time, solar activity and season are conducted to draw out the trend of the scintillation pattern. Comparison of L5 and L1 band scintillation events demonstrates a similar scintillation pattern with varying scintillation magnitude, while S band signals exhibit minimal scintillation, suggesting a scintillation-free link for effective navigation.

Author 1: Manjula T R
Author 2: Raju Garudachar

Keywords: Ionosphere scintillation; Navigation; carrier to noise ratio; solar activity; equinox

Download PDF

Paper 30: A RDWT and Block-SVD based Dual Watermarking Scheme for Digital Images

Abstract: In the modern era, digital image watermarking is a successful method of protecting multimedia digital data, for example for copyright protection, content verification, rightful ownership identification and tamper detection. In this paper, to improve robustness and security, a dual watermarking approach using the Redundant Discrete Wavelet Transform (RDWT), block-based singular value decomposition (SVD) and the Arnold transform is presented. There are two gray-scale watermarks: one is the Prime watermark and the other is the Arnold-scrambled Second watermark. The Second watermark is embedded into all sub-bands of the RDWT-transformed Prime watermark to obtain the processed watermark image. After that, the transformed gray-scale cover image is partitioned into non-overlapping blocks, and the processed watermark image is embedded by modifying the SVD coefficients of each block to obtain the resulting watermarked image. A reverse algorithm is then developed to extract the Prime and Second watermarks from the noisy image. Analysis and experimental outcomes show that the presented method is more robust against numerous image processing attacks and performs better than previously introduced schemes related to the presented work.

Author 1: Sachin Gaur
Author 2: Vinay Kumar Srivastava

Keywords: Digital image watermarking; Redundant Discrete wavelet transform; Singular value decomposition; Arnold transform; NCC and PSNR

Download PDF

Paper 31: Optimized Routing Information Exchange in Hybrid IPv4-IPv6 Network using OSPFV3 & EIGRPv6

Abstract: IPv6 is the next generation internet protocol which is gradually replacing the IPv4. IPv6 offers larger address space, simpler header format, efficient routing, better QoS and built-in security mechanisms. The migration from IPv4 to IPv6 cannot be attained in a short span of time. The main issue is compatibility and interoperability between the two protocols. Therefore, both the protocols are likely to coexist for a long time. Usually, tunneling protocols are deployed over hybrid IPv4-IPv6 networks to offer end-to-end IPv6 connectivity. Many routing protocols are used for IPv4 and IPv6. In this paper, researchers analyzed the optimized routing information exchange of two routing protocols (OSPFv3 & EIGRPv6) in hybrid IPv4-IPv6 network. Experimental results show that OSPFv3 performs better than EIGRPv6 in terms of most of the parameters i.e. convergence time, RTT, response time, tunnel overhead, protocol traffic statistics, CPU and memory utilization.

Author 1: Zeeshan Ashraf
Author 2: Muhammad Yousaf

Keywords: EIGRPv6; OSPFv3; Hybrid IPv4-IPv6; Route Redistribution; Route Summarization; Tunneling

Download PDF

Paper 32: Critical Success Factors In Implementing ITIL in the Ministry of Education in Saudi Arabia: An Exploratory Study

Abstract: This paper engages with the ITIL framework for IT service delivery within the specific context of the Ministry of Education in the Kingdom of Saudi Arabia (KSA). A literature review process is used to develop a set of critical success factors (CSFs) for the implementation of the ITIL framework in an organisation, based on a series of models such as TAM and UTAUT, which are then put into an overall conceptual model of use behaviour towards ITIL described by [1]. The conceptual model is then deployed in the field through a series of interviews with IT professionals within the Ministry of Education in the KSA. The interviews are semi-structured and were intended to draw out the corresponding factors for success and the factors that have hindered the implementation of ITIL within this organisation. The data confirm the view of the literature that strong leadership and management involvement is essential, both as a success factor in its own right and as the means by which other success factors are enabled. The findings of the paper make two observations. First, the literature sets out a series of quite precise success factors that relate to project management, communication, and quality control; the data presented in this research project demonstrate that, in practice, it is hard to make all of these factors explicit to such a level of detail within the Ministry of Education in Saudi Arabia. Second, the implementation of ITIL is more reflexive than the literature and the conceptual model would initially suggest.

Author 1: Abdullah S Alqahtani

Keywords: Information Technology Infrastructure Library (ITIL); Information Technology Service Management (ITSM); Ministry of Education (MoE); the Kingdom of Saudi Arabia (KSA)

Download PDF

Paper 33: QR Code Recognition based on Principal Components Analysis Method

Abstract: QR (Quick Response) code recognition systems based on computer vision have always been challenging to devise accurately due to two main constraints: (1) the QR code recognition system must be able to localize QR codes in an acquired image even under unfavorable conditions (illumination variations, perspective distortions), and (2) the system must be adapted to embedded platforms in terms of processing complexity and resource requirements. Most earlier QR code recognition systems implemented complex feature descriptors, such as Harris features and the Hough transform, which aim at extracting QR code pattern features and subsequently estimating their positions. This process is reinforced by pattern classifiers (e.g., random forests, SVM), which are used to remove falsely detected patterns. Those approaches are very computationally expensive and thus cannot run in real-time systems. In this paper, a streamlined QR code recognition approach is proposed that operates efficiently on systems with limited performance. The approach proceeds as follows: the captured image is segmented in order to reduce the search space and extract the regions of interest. Afterwards, horizontal and vertical scans are performed to preliminarily localize QR code patterns, followed by the Principal Component Analysis (PCA) method, which removes false positives. Thereafter, the remaining patterns are assembled according to a constraint so as to localize the corresponding QR codes. Experimental results show that the incorporation of PCA notably decreases processing time and increases QR code recognition accuracy (96%).
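
One way PCA can remove falsely detected patterns is by measuring reconstruction error against a subspace learned from genuine finder-pattern patches; the sketch below uses synthetic patches and an illustrative threshold, and is not the paper's exact pipeline:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-ins: flattened candidate patches (16x16 = 256 pixels each).
ramp = np.linspace(0.0, 1.0, 256)
genuine = ramp + rng.normal(0.0, 0.1, (200, 256))         # "true" finder-pattern patches
candidates = np.vstack([genuine[:5],                      # 5 genuine candidates
                        rng.uniform(0.0, 1.0, (5, 256))]) # 5 false positives

pca = PCA(n_components=10).fit(genuine)                   # subspace of genuine patterns

def recon_error(X):
    return np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

threshold = recon_error(genuine).max() * 1.2              # illustrative threshold only
keep = recon_error(candidates) < threshold                # filter out false positives
print(keep)                                               # first five True, last five False
```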

Author 1: Hicham Tribak
Author 2: Youssef Zaz

Keywords: QR code; Image segmentation; Principal Components Analysis; Perspective rectification; Pattern similarity measurement

Download PDF

Paper 34: Segmentation of Brain Tumor in Multimodal MRI using Histogram Differencing & KNN

Abstract: Tumor segmentation in brain MRI is one of the trickiest and most demanding subjects for the research community due to the complex nature and structure of the human brain and the different types of abnormalities that grow inside it. A few common types of tumors are CNS lymphoma, meningioma, glioblastoma, and metastases. In this research work, our aim is to segment and classify the four most commonly diagnosed types of brain tumors. To do so, we propose a new challenging dataset comprising multimodal MRI along with healthy brain MRI images. The dataset contains 2000 images of about 80 patient cases collected from online sources. The segmentation method proposed in this research is based on histogram differencing with a rank filter. Morphological post-processing is applied to detect the brain tumor more clearly. KNN classification is applied to assign tumors to their respective category (i.e., benign or malignant) based on the size of the tumor. The average True Classification Rate (TCR) achieved is 97.3% and the False Classification Rate (FCR) is 2.7%.
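
A rough sketch of histogram differencing followed by KNN classification on tumour size is shown below; the synthetic slices, bin count, thresholds and size labels are assumptions, and the rank filter and morphological post-processing are omitted:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def tumor_size(test_slice, healthy_slice, bins=64, diff_threshold=50):
    """Rough tumour-pixel count from grey-level histogram differencing."""
    h_test, _ = np.histogram(test_slice, bins=bins, range=(0, 255))
    h_ref, _ = np.histogram(healthy_slice, bins=bins, range=(0, 255))
    extra = np.clip(h_test - h_ref, 0, None)        # intensity bins over-represented in test
    return int(extra[extra > diff_threshold].sum()) # count of suspicious pixels

healthy = rng.normal(100, 20, (128, 128)).clip(0, 255)
test = healthy.copy()
test[40:70, 40:70] = rng.normal(220, 5, (30, 30)).clip(0, 255)  # bright synthetic lesion

size = tumor_size(test, healthy)

# KNN on tumour size: toy training sizes labelled benign (0) / malignant (1).
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit([[50], [120], [200], [700], [900], [1200]], [0, 0, 0, 1, 1, 1])
print(size, "->", "malignant" if knn.predict([[size]])[0] else "benign")
```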

Author 1: Qazi Nida-Ur-Rehman
Author 2: Imran Ahmed
Author 3: Ghulam Masood
Author 4: Najam-U-Saquib
Author 5: Muhammad Khan
Author 6: Awais Adnan

Keywords: MRI imaging; tumor types; image segmentation; Histogram Differencing; KNN

Download PDF

Paper 35: VHDL Design and FPGA Implementation of LDPC Decoder for High Data Rate

Abstract: In this work, we present an FPGA design and implementation of a parallel architecture of a low-complexity LDPC decoder for high data rate applications. The selected code is a regular (3, 4) LDPC code. The VHDL design and synthesis of this architecture use decoding by the simplified "Min-Sum" belief propagation (BP) algorithm. The complexity of the proposed architecture was studied; it is 6335 LEs at a data rate of 2.12 Gbps for 8-bit quantization at the second iteration. We also built a co-simulation platform in Simulink to validate the BER (Bit Error Rate) performance of our architecture.

Author 1: A. Boudaoud
Author 2: M. El Haroussi
Author 3: E. Abdelmounim

Keywords: error correcting codes; LDPC codes; BP “Min-Sum”; VHDL language; FPGA

Download PDF

Paper 36: Resources Management of Mobile Network IEEE 802.16e WiMAX

Abstract: The evolution of telecommunications towards mobile multimedia, following technological advances, has demonstrated that providing access to the network is no longer sufficient. Users need to access value-added multimedia services in their own home environment regardless of how they access the systems. Multimedia services require high transfer rates and have quality-of-service requirements. They must coexist with services with real-time constraints, such as voice, which does not tolerate variation in the delay between sending and receiving packets. Guaranteeing these services becomes much more difficult for the operator in technologies that take into account the mobility of users. This paper studies the IEEE 802.16e system in the continuous modeling case. A model of the IEEE 802.16e cell is proposed that allows the decomposition of the cell according to the principle of the adaptive modulation and coding (AMC) technique. The model is based on an admission control mechanism in the presence of two traffic types, real-time and non-real-time, and on a new CAC strategy with intra-cell mobility that gives the same QoS to the calls of both traffic types by favoring calls in progress over new arrivals.

Author 1: Mubarak Elamin Elmubarak Daleel
Author 2: Marwa Eltigani Abubakar Ali

Keywords: Wireless Networks; IEEE 802.16; WiMAX; Radio Resource Allocation; Mobility; Admission Control

Download PDF

Paper 37: A Social Semantic Web based Conceptual Architecture of Disaster Trail Management System

Abstract: Disasters affect human lives severely. Due to these disasters, hundreds of thousands of human beings have lost their lives and precious property. Government agencies, non-government organizations and individual volunteers act to rescue the affected people and to mitigate the effects of disasters. These teams require real-time information about the nature, severity, area and number of affectees. Their efforts can be supported by providing timely, effective and specific information, so that rescuers get a better idea of the available routes to reach the affectees, the urgency and the scale of the loss. People share huge amounts of data through blogs and social media that can be utilized to help rescue operations. This information can be filtered, arranged and formatted electronically in a proper manner. Thus, semantic web technologies can play a vital role in providing timely information. The purpose of this research is to capture explicit knowledge of the domain in the form of ontologies, perform automatic information extraction, generate implicit knowledge and then disseminate this information to various stakeholders. The collection of implicit and explicit knowledge will help improve decision making for disaster trail management.

Author 1: Ashfaq Ahmad
Author 2: Roslina Othman
Author 3: Mohamad Fauzan

Keywords: Ontology; Disaster Trail Management; Information Extraction; Knowledge Management

Download PDF

Paper 38: Clustering Students’ Arabic Tweets using Different Schemes

Abstract: In this paper, Twitter has been chosen as a platform for clustering the topics that have been mentioned by King Abdulaziz University students, in order to understand students' behaviours and answer their inquiries. The aim of the study is to propose a model for clustering analysis of Saudi Arabian (standard and Arabian Gulf dialect) tweets to segment the topics included in the students' posts. A combination of natural language processing (NLP) and machine learning (ML) methods is used to build models that cluster tweets according to their text similarity. The k-means algorithm is utilised with different vector representation schemes, namely TF-IDF (term frequency-inverse document frequency) and BTO (binary term occurrence). Distinct preprocessing steps are explored to obtain the N-gram tokens. The cluster distance task is applied to determine the average distance between the cluster centroids. Moreover, human evaluation of the clustering is performed by inspecting the data source to make sure that the clusters make sense in the educational domain. Each cluster has been identified, and the students' Twitter accounts have been related to their facilities or their educational system, such as e-learning. The results show that the best vector representation was BTO, and that it is more useful than the TF-IDF scheme for clustering students' text.
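
The two vector representations compared in the study can be reproduced in outline with scikit-learn as follows; the toy English sentences stand in for the Arabic tweets and the cluster count is illustrative:

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

# Toy documents standing in for students' tweets (the real data is Arabic).
tweets = [
    "exam timetable question", "exam results delayed", "exam hall location",
    "blackboard login problem", "blackboard course missing", "e-learning site down",
]

representations = {
    "TF-IDF": TfidfVectorizer(ngram_range=(1, 2)),
    "BTO": CountVectorizer(binary=True, ngram_range=(1, 2)),  # binary term occurrence
}

for name, vectorizer in representations.items():
    X = vectorizer.fit_transform(tweets)
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print(name, "labels:", km.labels_.tolist())
```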

Author 1: Hamed Al-Rubaiee
Author 2: Khalid Alomar

Keywords: Twitter; Arabic tweets; Saudi Arabia; King Abdulaziz University; data mining; data preparation

Download PDF

Paper 39: Human Visual System-based Unequal Error Protection for Robust Video Coding

Abstract: To increase the overall visual quality of video services without increasing the data rate, a human visual system-based video coding scheme, founded on a hierarchy of the video stream in different levels of importance, is developed. Determining these importance levels takes into account three classification criteria: the position of the current image in the group of images (image level), the importance of the motion vectors of macroblocks in the current image (macroblock level), and whether or not a pixel belongs to a spatial region of interest (pixel level). At the end of this classification process, an interpolation between the results of the three-level selection establishes an index of importance for each macroblock of the image to be encoded. This index determines the type of channel coding to be applied to the corresponding macroblock. Tests have shown that the technique presented in this paper achieves better results in PSNR and SSIM (structural similarity) than an equal error protection technique.

Author 1: Ouafae Serrar
Author 2: Oum el kheir Abra
Author 3: Mohamed Youssfi

Keywords: video coding; unequal error protection; human visual system (HVS); Regions of Interest ROI; Significant Motion Vectors SVM; Classification; index of importance

Download PDF

Paper 40: Proposing a Keyword Extraction Scheme based on Standard Deviation, Frequency and Conceptual Relation of the Words

Abstract: Every text contains a few keywords that provide important information about its content. Since this limited set of words is supposed to describe the overall concept of a text (e.g. an article or a book), choosing the right keywords plays an important role in representing that text correctly. Despite several efforts in this field, none of the methods published so far is accurate enough to elicit representative words for retrieving a wide variety of different texts. In this study, an unsupervised scheme is proposed that is independent of the domain, language, structure and length of a text. The proposed method uses word frequency in conjunction with the standard deviation of the positions at which words occur in the text, while also considering the conceptual relation of words. In the next stage, a secondary score is given to the selected keywords using the statistical criterion TFISF in order to improve on the baseline TFIDF method. Moreover, the proposed hybrid method does not remove stopwords, since they might be part of bigram keywords, whereas similar approaches remove all stopwords in their first stage. Experimental results on the well-known SEMEVAL dataset indicate the superiority of the proposed method over state-of-the-art schemes in terms of F-score and accuracy. The introduced hybrid method can therefore be considered an alternative scheme for accurate keyword extraction.
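As a rough illustration of the position-aware scoring idea, the Python sketch below combines a word's frequency with the standard deviation of its normalised positions; the way the two quantities are combined here is an assumption made for illustration, not the paper's exact formula:

    from collections import defaultdict
    from statistics import pstdev

    def keyword_scores(text):
        # Record the normalised position of every occurrence of every word;
        # stopwords are deliberately kept, as in the proposed method.
        words = text.lower().split()
        positions = defaultdict(list)
        for i, w in enumerate(words):
            positions[w].append(i / max(len(words) - 1, 1))
        scores = {}
        for w, pos in positions.items():
            freq = len(pos) / len(words)
            spread = pstdev(pos) if len(pos) > 1 else 0.0
            scores[w] = freq * (1.0 + spread)  # frequent, widely spread words rank higher
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(keyword_scores("keyword extraction ranks keyword candidates by keyword spread and frequency")[:3])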

Author 1: Shadi Masaeli
Author 2: Seyed Mostafa Fakhrahmad
Author 3: Reza Boostani
Author 4: Betsabeh Tanoori

Keywords: Keyword extraction; key-phrase extraction; TFISF; standard deviation; frequency

Download PDF

Paper 41: Output Feedback Controller Synthesis for Discrete-Time Nonlinear Systems

Abstract: This paper presents a computational approach for solving the optimal control problem for a class of nonlinear discrete-time systems. We focus on problems in which N pre-specified local subsystems are given to describe the studied systems. For such problems, we derive an output feedback controller and a cost function such that the resulting closed-loop system is asymptotically stable and the closed-loop cost function is minimized. The main results are demonstrated numerically through the implementation of the proposed algorithm for solving the optimal control problem of a mechanical system.

Author 1: Hajer Bouzaouache

Keywords: nonlinear systems; discrete-time systems; optimal control; output feedback control

Download PDF

Paper 42: Interactive Mobile Health Monitoring System

Abstract: Health monitoring is an active application area in pervasive and ubiquitous computing. It applies mobile computing technology to enhance communication among health care workers, physicians and patients with a view to providing a better health care system. Recent advances in sensors, wireless communication and low-power integrated circuits have enabled the design of pocket-size, lightweight, low-cost, interactive bio-sensor nodes. These nodes are seamlessly integrated into a wireless body area network for mobile health monitoring and can sense, process and communicate one or more vital parameters. The proposed system can provide patient health parameters (such as temperature, heart rate and ECG) through a mobile device to the medical server, caretaker and medical practitioner, based on the biomedical and environmental data collected by the deployed sensors. In this system, multiple physiological parameters are monitored, as opposed to one or two parameters in legacy systems. This paper discusses the hardware, software and implementation of the system, with a focus on authentication, power consumption and accuracy in the transmission of health parameters to the medical server.

Author 1: Varsha Wahane
Author 2: Dr.P.V. Ingole

Keywords: Biomedical sensors; Wireless body area network; mobile device and microcontroller

Download PDF

Paper 43: Wireless Sensor Network Energy Efficiency with Fuzzy Improved Heuristic A-Star Method

Abstract: Energy is a major factor in designing wireless sensor networks (WSNs). In order to extend the network lifetime, researchers should consider energy consumption in WSN routing protocols. Routing helps the sensors in a WSN identify the optimal path and save energy when transmitting data. Current WSN energy-efficiency schemes use node selection as the main parameter without applying path-finding routing, which does not fully optimize energy use. This research addresses the problem of energy optimization by using a fuzzy improved heuristic A-Star method. A new algorithm, named improved heuristic A-Star, was developed from the original A-Star algorithm. The results show that fuzzy improved heuristic A-Star routing from a sensor node to the sink saved 0.3698 joules of energy dissipation, which resulted in a longer network lifetime.
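To make the routing idea concrete, the Python sketch below runs A* over a toy topology in which the path cost mixes link distance with a penalty for relays with low remaining energy; the fixed penalty term stands in for the paper's fuzzy-improved heuristic and is purely illustrative:

    import heapq, math

    def a_star(nodes, edges, energy, start, sink):
        def h(n):  # straight-line distance to the sink as the heuristic
            (x1, y1), (x2, y2) = nodes[n], nodes[sink]
            return math.hypot(x1 - x2, y1 - y2)

        frontier = [(h(start), 0.0, start, [start])]
        best = {start: 0.0}
        while frontier:
            _, g, current, path = heapq.heappop(frontier)
            if current == sink:
                return path, g
            for nxt, dist in edges.get(current, []):
                # Energy-aware cost: distance plus a penalty for low-energy relays (assumed form).
                cost = g + dist + 1.0 / max(energy[nxt], 1e-6)
                if cost < best.get(nxt, float("inf")):
                    best[nxt] = cost
                    heapq.heappush(frontier, (cost + h(nxt), cost, nxt, path + [nxt]))
        return None, float("inf")

    nodes = {"s": (0, 0), "a": (1, 0), "b": (1, 1), "t": (2, 0)}
    edges = {"s": [("a", 1.0), ("b", 1.4)], "a": [("t", 1.0)], "b": [("t", 1.4)]}
    energy = {"s": 1.0, "a": 0.2, "b": 0.9, "t": 1.0}  # node "a" is nearly depleted
    print(a_star(nodes, edges, energy, "s", "t"))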

Author 1: Sigit Soijoyo
Author 2: Retantyo Wardoyo

Keywords: Improved Heuristic A-Star; Fuzzy Logic; Wireless Sensor Network

Download PDF

Paper 44: Utilization of Finite Elements Programs and Matlab Simulink in the Study of a Special Electrical Motor

Abstract: This paper presents the study of a single-phase synchronous motor with permanent magnets (PM) using several computer programs. This motor type is used especially in household applications and has low power. It is known that PM synchronous motors have the great advantage of lacking rotor losses. For this motor, the starting problem has been solved by creating a variable air gap under the polar shoes, which produces a starting torque because the axis of the rotor field created by the PM in the rest position differs from the axis of the stator. First, the parameters of the motor were determined by tests and finite element (FE) simulations, without knowing the properties of the PM. Magnetostatic FE simulations were performed first, followed by simulations in the magnetodynamic regime; the results obtained in the two regimes are close. Secondly, with the determined parameters, a Matlab Simulink model was built (this being the final goal), and the dynamic regime of the motor was studied. The results regarding the motor speed during the starting process and the current variation are also presented and discussed.

Author 1: Olivian Chiver
Author 2: Liviu Neamt
Author 3: Oliviu Matei
Author 4: Zoltan Erdei
Author 5: Cristian Barz

Keywords: parameters; synchronous motor; single-phase; permanent magnet; finite elements programs; Matlab/Simulink

Download PDF

Paper 45: A Rich Feature-based Kernel Approach for Drug- Drug Interaction Extraction

Abstract: Discovering drug-drug interactions (DDIs) is a crucial issue for both patient safety and health care cost control. Developing text mining techniques for identifying DDIs has attracted a great deal of attention in the last few years. Unfortunately, state-of-the-art results have not exceeded the threshold of a 0.7 F1 score, which calls for further effort. In this work, we propose a new feature-based kernel method to extract and classify DDIs. Our approach consists of two steps: identifying DDIs and assigning one of four DDI types to the predicted drug pairs. We demonstrate that, by using new groups of features, non-linear kernels can achieve the best performance. When evaluated on the DDIExtraction 2013 challenge corpus, our system achieved an F1-score of 71.79%, compared to 69.75% and 68.4% reported by the top two state-of-the-art systems.
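A schematic Python sketch of the two-step scheme, assuming scikit-learn; the random feature vectors and labels below are placeholders for the paper's rich feature set and corpus annotations:

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))             # placeholder drug-pair feature vectors
    is_ddi = rng.integers(0, 2, size=200)      # step 1 labels: interaction or not
    ddi_type = rng.integers(0, 4, size=200)    # step 2 labels: one of four DDI types

    # Step 1: binary detection of interacting pairs; step 2: typing of detected pairs.
    detector = SVC(kernel="rbf").fit(X, is_ddi)
    typer = SVC(kernel="rbf").fit(X[is_ddi == 1], ddi_type[is_ddi == 1])

    pair = X[:1]
    if detector.predict(pair)[0] == 1:
        print("predicted DDI type:", typer.predict(pair)[0])
    else:
        print("no interaction predicted")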

Author 1: ANASS RAIHANI
Author 2: NABIL LAACHFOUBI

Keywords: Drug–drug interaction; Feature-based approach; Nonlinear kernel; Biomedical informatics; Natural Language Processing

Download PDF

Paper 46: OTSA: Optimized Time Synchronization Approach for Delay-based Energy Efficient Routing in WSN

Abstract: Time synchronization is a significant and still largely ignored problem in the area of wireless sensor networks (WSNs). A review of the existing literature shows that few studies jointly address energy conservation, clustering and routing together with minimizing the errors due to time synchronization in sensor networks. Therefore, this manuscript presents a delay-based routing scheme that considers propagation delay to formulate a delay-compensation mechanism for large-scale wireless sensor networks. The prime goal of this technique is to jointly address energy, time synchronization and routing in wireless sensor networks. The proposed approach was found to offer lower communication overhead, fewer synchronization errors, lower energy consumption and reduced processing time when compared with existing standard time synchronization techniques.

Author 1: K. Nagarathna
Author 2: Jayashree D Mallapur

Keywords: Wireless Sensor Network; Routing; Time Synchronization; Optimization; Hardware Clock

Download PDF

Paper 47: Formal Specification of a Truck Geo-Location Big-Data Application

Abstract: In the last few years, social networks, e-commerce, mobile commerce and sensor networks have resulted in an exponential increase in data size. This data comes in all formats: structured, unstructured and semi-structured. Efficiently extracting useful information from these huge data sources is important, as this information can play a central role in making future decisions and strategies. A truck geo-location big-data application integrated with a formal model is proposed. The truck geo-location data is unstructured and is accessed and manipulated by the Hadoop query engine. A labelled transition system based formal model of the application is proposed to ensure the safety and liveness properties of correctness.

Author 1: Ayman Naseem
Author 2: Nadeem Akhtar
Author 3: Malik Saad Missen

Keywords: Big-data; Formal methods; Correctness properties; Safety; Liveness; Internet-of-Things (IoT); MapReduce; Hadoop Distributed File System (HDFS); Finite State Processes (FSP); Labelled Transition System (LTS)

Download PDF

Paper 48: Wi-Fi Redux: Never Trust Untrusted Networks

Abstract: This study analyzes the dangers posed to computer users' information and equipment as they connect to untrusted networks, such as those found in coffee shops. Included in this study is a virtualized lab consisting of the target and attacker nodes and a router to facilitate communication. Also included are a binary for a reverse connection and a modified binary created to connect back to the attacker node while bypassing most anti-virus software.

Author 1: Young B. Choi
Author 2: Kenneth P. LaCroix

Keywords: Wi-Fi; Untrusted Network; Pineapple; MITM; DNS Spoofing; Least Privilege

Download PDF

Paper 49: Simplex Parallelization in a Fully Hybrid Hardware Platform

Abstract: The simplex method has been successfully used in solving linear programming (LP) problems for many years. Parallel approaches have also been extensively studied due to the intensive computations required, especially for the solution of large LP problems. Furthermore, the rapid proliferation of multicore CPU architectures, as well as the computational power provided by the massive parallelism of modern GPUs, has brought CPU/GPU collaboration models increasingly into focus over the last years for better performance. In this paper, a highly scalable implementation framework of the standard full-tableau simplex method is first presented, over a hybrid parallel platform which consists of multiple multicore nodes interconnected via a high-speed communication network. The proposed approach is based on the combined use of MPI and OpenMP, adopting a suitable column-based distribution scheme for the simplex tableau. The parallelization framework is then extended so that it can concurrently exploit the full power of the available resources on a multicore single-node environment with a CUDA-enabled GPU (i.e. using the CPU cores and the GPU concurrently), based on a suitable hybrid multithreading/GPU offloading scheme with OpenMP and CUDA. The corresponding experimental results show that the hybrid MPI+OpenMP parallelization scheme leads to particularly high speed-up and efficiency values, considerably better than other competitive approaches, and scales well even for very large linear problems. Furthermore, the performance of the hybrid multithreading/GPU offloading scheme is clearly superior to both the OpenMP-only and the GPU-only implementations in almost all cases, which validates the worth of using both resources concurrently. Most importantly, when used in combination with MPI in a multi-node (fully hybrid) environment, it leads to substantial improvements in the speedup achieved for large and very large LP problems.

Author 1: Basilis Mamalis
Author 2: Marios Perlitis

Keywords: Parallel Processing; Linear Programming; Simplex Algorithm; MPI; OpenMP; CUDA

Download PDF

Paper 50: Modern Data Formats for Big Bioinformatics Data Analytics

Abstract: Next Generation Sequencing (NGS) technology has resulted in massive amounts of proteomics and genomics data. This data is of no use if it is not properly analyzed. ETL (Extraction, Transformation, Loading) is an important step in designing data analytics applications, and it requires a proper understanding of the features of the data. The data format plays a key role in the understanding and representation of data, the space required to store it, data I/O during processing, intermediate results, in-memory analysis and the overall time required to process the data. Different data mining and machine learning algorithms require input data in specific types and formats. This paper explores the data formats used by different tools and algorithms and also presents modern data formats that are used on big data platforms. It will help researchers and developers in choosing the appropriate data format for a particular tool or algorithm.

Author 1: Shahzad Ahmed
Author 2: M. Usman Ali
Author 3: Javed Ferzund
Author 4: Muhammad Atif Sarwar
Author 5: Abbas Rehman
Author 6: Atif Mehmood

Keywords: Big Data; Machine Learning; Hadoop; MapReduce; Spark; Bioinformatics; Microarray; Data Models; Data Formats; Classification; Clustering

Download PDF

Paper 51: Proactive Intention-based Safety through Human Location Anticipation in HRI Workspace

Abstract: Safety in Human-Robot Interaction (HRI) is an important issue and a key factor in the growth or decline of HRI activity. A novel solution concerning the safety of HRI is proposed that considers near-future human intentions. A set of possible human intentions is known to the robot, and the robot also knows the places that the interacting human may visit according to his or her current intention. The proposed solution enables the robot to avoid a potential collision by anticipating the future human location and dividing the workspace into safe and unsafe zones. The solution contributes to the improvement of HRI safety measures, but further effort is required to achieve an enhanced safety level.

Author 1: Muhammad Usman Ashraf
Author 2: Muhammad Awais
Author 3: Muhammad Sheraz Arshad Malik
Author 4: Ijaz Shoukat
Author 5: Muhammad Sher

Keywords: Intention Recognition; Human-Robot Interaction; Human-Robot Interaction Safety; Unsafe Zone; Workspace

Download PDF

Paper 52: Impact of Story Point Estimation on Product using Metrics in Scrum Development Process

Abstract: Agile software development techniques are accepted worldwide; regardless of how agile is defined, it is clear that agile is maturing day by day, and suppliers of software systems are moving away from traditional waterfall techniques and other development practices in favor of agile methods. There are numerous agile methodologies and methods, which should be selected according to the situation and demands of the current project. As a case scenario, the following research discusses Scrum as a development technique, focusing on effort estimation and its effects by discussing distinct metrics. Estimation relates directly to the cost, time and complexity of the project life cycle. Metrics help teams better understand development progress and make building releases of software easier, more fluent and more robust. This paper thus identifies aspects mainly ignored by development teams during estimation.

Author 1: Ali Raza Ahmed
Author 2: Muhammad Tayyab
Author 3: Dr. Shahid Nazir Bhatti
Author 4: Dr. Abdullah J. Alzahrani
Author 5: Dr. Muhammad Imran Babar

Keywords: product backlog; sprint backlog; backlog item; front end designer; product owner; agile software development; scrum master; sprint planning; velocity chart; agile methodology; effort estimation; story points estimation

Download PDF

Paper 53: Optimized Quality Model for Agile Development: Extreme Programming (XP) as a Case Scenario

Abstract: Quality is a complex taxonomy of attributes; it cannot easily be weighed or measured, but it can be felt, discussed and judged. Early assessment and verification of functional attributes (requirements) are well supported by renowned standards, while non-functional attributes (requirements) are not. Agile software development methodologies are highly reputed as the most popular and effective approaches to developing software systems. Early requirements verification in agile software engineering is well studied, and most achievements concern functional requirements. To build quality into the design and development process, it is very important to consider non-functional requirement quality metrics (attributes) early. Comprehensive work has been done to propose and validate (using iThink) different quality models that could ensure the quality of agile software products; these are covered in detail in the literature review (Section II). Yet a generic, standard quality metrics model for agile software practices is still missing, one that ensures the agile product being developed will achieve the quality characteristics decided by the stakeholders as well as the quality standard being addressed. In this work, we propose a quality metrics model that covers, in early requirements, the desired quality attributes found in ISO/IEC standards (ISO 9126, ISO 25000), and we validate it by performing simulations in iThink to ensure that the quality of the item being produced meets the described criteria.

Author 1: Atika Tabassum
Author 2: Iqra Manzoor
Author 3: Dr. Shahid Nazir Bhatti
Author 4: Aneesa Rida Asghar
Author 5: Dr. Imtiaz Alam

Keywords: Agile Software Engineering (ASE); Agile Software Development (ASD); Extreme Programming (XP); ISO; ISO 9126; ISO 25000

Download PDF

Paper 54: The Design and Development of Spam Risk Assessment Prototype: In Silico of Danger Theory Variants

Abstract: Nowadays, data flows carrying various types of information are enormous and largely unstructured. Such raw data is meaningless unless it is processed and analyzed to retrieve the valuable and meaningful information it contains. In this paper, the design and principal functionalities of a system prototype are introduced. A process of information retrieval that applies text mining with an Artificial Immune System (AIS) is proposed to discover the possible level of severity of a Short Messaging Service (SMS) spam message. This is expected to be a useful tool in retrieving the implicit danger that a spam message might pose to recipients. Furthermore, the developed tool can be considered another data mining tool that could also readily be embedded into existing tools.

Author 1: Kamahazira Zainal
Author 2: Mohd Zalisham Jali

Keywords: Danger Theory Variants; Text Spam Messages; Severity Assessment; Text Mining; Information Retrieval; Knowledge Discovery

Download PDF

Paper 55: Impact and Challenges of Requirement Engineering in Agile Methodologies: A Systematic Review

Abstract: Requirement engineering is one of the important stages in the development life cycle; all requirements needed for the development of a product are collected in this phase. A high-standard product can be developed with agile methodology on a smaller budget and in less time. The importance of agile practices has grown because they support cooperation in software engineering. Being a basic phase of software engineering, requirement engineering involves different processes. Direct communication among stakeholders is one of the agile practices that distinguishes agile from other conventional and traditional approaches. Although a lot of research has been done on agile practices and the role of requirements in agile methodologies, there is still a need for studies on change management, requirement prioritization, prototyping and non-functional requirements in agile methodologies. The aim of this review paper is to present the limitations in the treatment of requirement engineering phases in agile practices and the issues and challenges that agile practitioners face in implementing agile practices. Many research studies from different sources have been reviewed on the basis of inclusion and exclusion criteria, and most RE activities are discussed in the review. Evidence helps to show how the RE process is performed in Scrum. Most research has been conducted on general agile methodologies; few authors have specified RE practices in other agile methodologies. The finding of this research is a synthesis of researchers' work that will be beneficial for those interested in identifying research areas in this field, because many agile techniques (extreme programming, crystal methodology, lean) require further study and practical results, as clarified by the reviewed studies.

Author 1: Sehrish Alam
Author 2: Shahid Nazir Bhatti
Author 3: S. Asim Ali Shah
Author 4: Dr. Amr Mohsen Jadi

Keywords: Requirement Engineering; Traditional approaches; Agile methodologies; Challenges in RE; Requirement prioritization; Nonfunctional Requirements; Dynamic system development method; Scrum; Extreme programming

Download PDF

Paper 56: Classifying and Segmenting Classical and Modern Standard Arabic using Minimum Cross-Entropy

Abstract: Text classification is the process of assigning a text or a document to various predefined classes or categories to reflect their contents. With the rapid growth of Arabic text on the Web, studies that address the problems of classification and segmentation of the Arabic language are limited compared to other languages, and most of them implement word-based and feature extraction algorithms. This paper adopts a PPM character-based compression scheme to classify and segment Classical Arabic (CA) and Modern Standard Arabic (MSA) texts. An initial experiment using the PPM classification method on samples of text resulted in an accuracy of 95.5%, an average precision of 0.958, an average recall of 0.955 and an average F-measure of 0.954, using the concept of minimum cross-entropy. PPM-based classification experiments on standard Arabic corpora showed that they contained different types of text (CA or MSA), or a mixture of both. Further experiments with the same corpora showed that a more accurate picture of their contents was possible using the PPM-based segmentation method. Tag-based compression experiments (using tags produced by part-of-speech Arabic taggers) also showed that the quality of the tagging (as measured by compression quality) is significantly affected when tagging either CA or MSA text. The conclusion is that NLP applications (such as taggers) should treat these texts separately and use different training data for each, or process them differently.
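The decision rule itself can be illustrated compactly: the class whose character model assigns the lowest cross-entropy (average code length) to a text wins. The Python sketch below uses a smoothed character bigram model as a simple stand-in for PPM, and the training strings are toy placeholders rather than CA/MSA corpora:

    import math
    from collections import Counter

    def train_bigram(text):
        # Character bigram and unigram counts as a simple stand-in for a PPM model.
        return Counter(zip(text, text[1:])), Counter(text)

    def cross_entropy(text, model, vocab_size=256):
        pairs, unigrams = model
        bits = 0.0
        for a, b in zip(text, text[1:]):
            p = (pairs[(a, b)] + 1) / (unigrams[a] + vocab_size)  # Laplace smoothing
            bits += -math.log2(p)
        return bits / max(len(text) - 1, 1)  # bits per character

    models = {"CA": train_bigram("toy classical arabic sample text " * 20),
              "MSA": train_bigram("toy modern standard arabic sample text " * 20)}
    query = "modern standard sample"
    # Classify by minimum cross-entropy over the candidate class models.
    print(min(models, key=lambda c: cross_entropy(query, models[c])))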

Author 1: Ibrahim S Alkhazi
Author 2: William J. Teahan

Keywords: text classification; Arabic language; Classical Arabic; Modern Standard Arabic

Download PDF

Paper 57: A Recent Study on Routing Protocols in UWSNs

Abstract: Recent research has seen remarkable advancement in the field of Underwater Sensor Networks (UWSNs), and many different protocols have been developed in this domain in recent years. As these protocols can be categorized in a variety of ways according to the mechanisms and functionalities they follow, it is important to understand their principal workings. In this research, we introduce three analysis categories, namely clustering-based, localization-based and cooperation-based routing, select some recent routing protocols in the field of UWSNs, and present a comparative analysis according to the categories in which they lie. This research is theoretical and qualitative. A detailed analysis of the protocols' key advantages and flaws is also provided.

Author 1: Muhammad Ahsan
Author 2: Sheeraz Ahmed
Author 3: Adil khan
Author 4: Mukhtaj khan
Author 5: Fazle Hadi
Author 6: Fazal Wahab
Author 7: Imran Ahmed

Keywords: UWSN; routing protocol; relay node; sink

Download PDF

Paper 58: A Framework to Reason about the Knowledge of Agents in Continuous Dynamic Systems

Abstract: Applying formal methods to a group of agents provides a precise and unambiguous definition of their behaviors and makes it possible to verify properties of agents against implementations. The hybrid automaton is one of the formal approaches used by several works to model a group of agents. Several logics have been proposed, as extensions of temporal logics, to specify and hence verify the quantitative and qualitative properties of systems modeled by hybrid automata. However, when it comes to agents, one needs to reason about the knowledge of other agents participating in the model. For this purpose, epistemic logic can be used to specify and reason about the knowledge of agents, but this logic assumes that the model of time is discrete. This paper proposes a novel framework that formally specifies and verifies the epistemic behaviors of agents within continuous dynamics. To do so, the paper first extends the hybrid automaton with knowledge. Second, the paper proposes a new logic that extends epistemic logic with quantitative real-time requirements. Finally, the paper shows how to specify several properties that can be verified within our framework.

Author 1: Ammar Mohammed
Author 2: Ahmed M. Elmogy

Keywords: Epistemic logic; Reasoning; Hybrid Automata; Agents

Download PDF

Paper 59: A Lexicon-based Approach to Build Service Provider Reputation from Arabic Tweets in Twitter

Abstract: Nowadays, social media has become a popular communication tool among Internet users. Many users share opinions and experiences of different service providers every day through social media platforms. Thus, these platforms become valuable sources of data which can be exploited and used efficiently to support decision-making. However, finding and monitoring customers' opinions on social media is a difficult task due to the fast growth of the content. This work focuses on using Twitter for the task of building service providers' reputation. In particular, a service provider's reputation is calculated from collected Saudi tweets on Twitter. To do so, a Saudi dialect lexicon has been developed as a basic component for sentiment polarity, classifying words extracted from Twitter as either positive or negative. Then, beta probability density functions have been used to combine feedback from the lexicon into reputation scores. Experimental evaluations show that the proposed approach is consistent with the results of Qaym, a website that calculates restaurant rankings based on consumer ratings and comments.
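A minimal Python sketch of the score-combination step, using the expectation of a Beta(r+1, s+1) distribution over positive and negative lexicon hits; the tiny lexicon and tweets are placeholders, not the Saudi-dialect lexicon built in the paper:

    positive = {"excellent", "great", "tasty"}   # toy positive lexicon entries
    negative = {"bad", "slow", "dirty"}          # toy negative lexicon entries

    def reputation(tweets):
        # Count positive (r) and negative (s) lexicon hits across all tweets.
        r = sum(sum(w in positive for w in t.lower().split()) for t in tweets)
        s = sum(sum(w in negative for w in t.lower().split()) for t in tweets)
        return (r + 1) / (r + s + 2)             # expected value of Beta(r+1, s+1)

    print(reputation(["Great food and excellent service", "a bit slow today"]))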

Author 1: Haifa Al-Hussaini
Author 2: Hmood Al-Dossari

Keywords: Reputation; Sentiment Analysis; Arabic Language; Saudi Dialect; Social Media

Download PDF

Paper 60: A Two Phase Hybrid Classifier based on Structure Similarities and Textural Features for Accurate Meningioma Classification

Abstract: Meningioma subtype classification is a complex pattern classification problem in digital pathology due to heterogeneity of tumor texture, low inter-class and high intra-class texture variation of tumor samples, and architectural variation of cellular components. The basic aim is to achieve significantly high classification results for all subtypes of meningioma while dealing with the inherent complexity and texture variations. The ultimate goal is to mimic the prognosis decisions of expert pathologists and assist newer pathologists in making correct and quick decisions. In this paper, a novel hybrid classification framework based on nuclei shape matching and texture analysis is proposed for the classification of four subtypes of grade-I benign meningioma. The meningothelial and fibroblastic subtypes are classified on the basis of nuclei shape matching through skeletons and shock graphs, while an optimized texture-based evolutionary framework is designed for the classification of the transitional and psammomatous subtypes. Classifier-based evolutionary feature selection is performed using a Genetic Algorithm (GA) in combination with a Support Vector Machine (SVM) to select the optimal combination of higher-order statistical features extracted from morphologically processed RGB color channel images. The proposed hybrid classifier employed leave-one-patient-out 5-fold cross validation and achieved an overall mean classification accuracy of 95.63%.

Author 1: Kiran Fatima
Author 2: Hammad Majeed

Keywords: Meningioma; Computer-Aided Diagnosis; Brain Tumour Classification; Cell Segmentation; Shape Analysis; Texture Analysis

Download PDF

Paper 61: DoS Detection Method based on Artificial Neural Networks

Abstract: DoS attack tools have become increasingly sophisticated, challenging existing detection systems to continually improve their performance. In this paper, we present a victim-end DoS detection method based on Artificial Neural Networks (ANN). In the proposed method, a Feed-forward Neural Network (FNN) is optimized to accurately detect DoS attacks with minimum resource usage. The proposed method consists of the following three major steps: (1) collection of the incoming network traffic, (2) selection of relevant features for DoS detection using an unsupervised Correlation-based Feature Selection (CFS) method, and (3) classification of the incoming network traffic into DoS traffic or normal traffic. Various experiments were conducted to evaluate the performance of the proposed method using two public datasets, namely UNSW-NB15 and NSL-KDD. The obtained results are satisfactory when compared to state-of-the-art DoS detection methods.
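A compact Python sketch of the three-step pipeline, assuming scikit-learn; a univariate filter stands in for CFS here, and the random data is a placeholder for UNSW-NB15 or NSL-KDD:

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 40))            # step 1: collected traffic features (placeholder)
    y = (X[:, 0] + X[:, 3] > 0).astype(int)   # synthetic DoS / normal labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    selector = SelectKBest(f_classif, k=10).fit(X_tr, y_tr)   # step 2: feature selection (stand-in for CFS)
    fnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=1)
    fnn.fit(selector.transform(X_tr), y_tr)                   # step 3: feed-forward network classification
    print("test accuracy:", fnn.score(selector.transform(X_te), y_te))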

Author 1: Mohamed Idhammad
Author 2: Karim Afdel
Author 3: Mustapha Belouch

Keywords: DoS detection; Artificial Neural Networks; Feed-forward Neural Networks; Network traffic classification; Feature selection

Download PDF

Paper 62: DSP Real-Time Implementation of an Audio Compression Algorithm by using the Fast Hartley Transform

Abstract: This paper presents a simulation and hardware implementation of a new audio compression scheme based on the fast Hartley transform in combination with a new modified run-length encoding. The proposed algorithm consists of analyzing signals with the fast Hartley transform, thresholding the obtained coefficients below a given threshold, and then encoding them using a new approach to run-length encoding. The thresholded coefficients are finally quantized and coded into a binary stream. The experimental results show the ability of the fast Hartley transform to compress audio signals: it concentrates the signal energy in a few coefficients, and the new run-length encoding approach increases the compression factor. The results of the current work are compared with wavelet-based compression using objective assessments, namely CR, SNR, PSNR and NRMSE. This study shows that the fast Hartley transform is more appropriate than the wavelet transform since it offers a higher compression ratio and better speech quality. In addition, we have tested the audio compression system on the DSP processor TMS320C6416. This test shows that our system meets real-time requirements and ensures low complexity. The perceptual quality is evaluated with the Mean Opinion Score (MOS).
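The core transform-threshold-encode chain can be sketched in a few lines of Python; the discrete Hartley transform is obtained here from an FFT (DHT = Re(FFT) - Im(FFT)), and a plain run-length encoding of zero runs stands in for the paper's modified run-length encoding:

    import numpy as np

    def hartley(x):
        F = np.fft.fft(x)
        return F.real - F.imag          # discrete Hartley transform via the FFT

    def rle_zeros(coeffs):
        # Plain run-length encoding of zero runs (illustrative stand-in).
        out, run = [], 0
        for c in coeffs:
            if c == 0:
                run += 1
            else:
                if run:
                    out.append(("Z", run))
                    run = 0
                out.append(("V", c))
        if run:
            out.append(("Z", run))
        return out

    frame = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 64))   # toy audio frame
    H = hartley(frame)
    H[np.abs(H) < 1.0] = 0.0                                # illustrative threshold
    print("nonzero coefficients:", np.count_nonzero(H), "RLE symbols:", len(rle_zeros(H)))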

Author 1: Souha BOUSSELMI
Author 2: Noureddine ALOUI
Author 3: Adnen CHERIF

Keywords: Speech compression; Fast Hartley transform (FHT); Discrete Wavelet Transform (DWT)

Download PDF

Paper 63: Dynamic Programming Inspired Genetic Programming to Solve Regression Problems

Abstract: In traditional Genetic Programming, the candidate solution is evolved through a prescribed number of generations using a fitness measure. It has been observed that the improvement of GP on different problems is insignificant at later generations. Furthermore, GP struggles to evolve on some symbolic regression problems due to high selective pressure, where the input range is very small and few generations are allowed. In such scenarios GP stagnates and cannot evolve a desired solution. Recent work addresses these issues by using a single run to reduce the residual error based on semantic concepts. A new approach is proposed called Dynamic Decomposition of Genetic Programming (DDGP), inspired by dynamic programming. DDGP decomposes a problem into sub-problems and initiates sub-runs in order to find sub-solutions. The algebraic sum of all the sub-solutions is merged into an overall solution, which provides the desired solution. Experiments conducted on well-known benchmarks with varying complexities validate the proposed approach, as the empirical results of DDGP are far superior to those of standard GP. Moreover, statistical analysis has been conducted using the t-test, which shows a significant difference on eight datasets. DDGP is therefore highly recommended for symbolic regression problems where other variants of GP stagnate and cannot evolve the required solution.

Author 1: Asim Darwaish
Author 2: Hammad Majeed
Author 3: M. Quamber Ali
Author 4: Abdul Rafay

Keywords: Genetic Programming; Evolutionary Computing; Machine Learning; Fitness Landscape; Semantic GP; Symbolic Regression; Dynamic Decomposition of GP

Download PDF

Paper 64: Identification and Nonlinear PID Control of Hammerstein Model using Polynomial Structures

Abstract: In this paper, a new nonlinear discrete-time PID controller is proposed to control the Hammerstein model. This model is composed of a static nonlinear gain associated with a linear dynamic subsystem. Nonlinear polynomial structures are used to identify and to control this class of systems. The determination of parameters is based on the RLS algorithm. A coupled two-tank process is given to illustrate the effectiveness of the proposed approach.
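A minimal recursive least squares (RLS) sketch in Python for a Hammerstein-type structure, in which the regressor stacks polynomial powers of the input (the static nonlinearity) with a delayed output term (the linear dynamics); the simulated system and forgetting factor are arbitrary illustrations, not the coupled two-tank process:

    import numpy as np

    def rls_identify(u, y, n_params=4, lam=0.99):
        theta = np.zeros(n_params)
        P = np.eye(n_params) * 1000.0
        for k in range(1, len(u)):
            phi = np.array([y[k - 1], u[k], u[k] ** 2, u[k] ** 3])  # polynomial regressor
            K = P @ phi / (lam + phi @ P @ phi)
            theta = theta + K * (y[k] - phi @ theta)                # parameter update
            P = (P - np.outer(K, phi @ P)) / lam                    # covariance update
        return theta

    rng = np.random.default_rng(2)
    u = rng.uniform(-1, 1, 300)
    y = np.zeros(300)
    for k in range(1, 300):   # toy Hammerstein-type system used only for illustration
        y[k] = 0.7 * y[k - 1] + 0.5 * u[k] + 0.3 * u[k] ** 2 + 0.01 * rng.normal()

    print("estimated parameters:", np.round(rls_identify(u, y), 3))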

Author 1: Zeineb RAYOUF
Author 2: Chekib GHORBEL
Author 3: Naceur BENHADJ BRAIEK

Keywords: Parametric identification; Hammerstein model; RLS algorithm; Polynomial structure; Nonlinear PID controller

Download PDF

Paper 65: Large Scale Graph Matching(LSGM): Techniques, Tools, Applications and Challenges

Abstract: Large Scale Graph Matching (LSGM) is one of the fundamental problems in graph theory and has applications in many areas such as Computer Vision, Machine Learning, Pattern Recognition and Big Data Analytics (Data Science). Matching belongs to the combinatorial class of problems and refers to finding correspondences between the nodes of a graph or among a set of graphs (subgraphs), either precisely or approximately. Precise matching is also known as exact matching, such as (sub)graph isomorphism, while approximate matching is called inexact matching, in which the matching activity concerns conceptual/semantic matching rather than the structural details of graphs. In this article, a review of the matching problem is presented, covering semantic matching (conceptual), syntactic matching (structural) and schematic matching (schema-based). The aim is to present the current state of the art in Large Scale Graph Matching (LSGM) and a systematic review of algorithms, tools and techniques, along with the existing challenges of LSGM. Moreover, the potential application domains and related research activities are provided.

Author 1: Azka Mahmood
Author 2: Hina Farooq
Author 3: Javed Ferzund

Keywords: Big Data; Graph Matching; Graph Isomorphism; Graph Analytics; Data Models; Large Scale Graphs

Download PDF

Paper 66: Medical Image Retrieval based on the Parallelization of the Cluster Sampling Algorithm

Abstract: The cluster sampling algorithm is a scheme for sequential data assimilation developed to handle general non-Gaussian and nonlinear settings. It can be used to solve a wide spectrum of problems that require data inversion, such as image retrieval, tomography and weather prediction, among others. This paper develops parallel cluster sampling algorithms and shows that a multi-chain version is embarrassingly parallel and can be used efficiently for medical image retrieval, among other applications. Moreover, it presents a detailed complexity analysis of the proposed parallel cluster sampling schemes and discusses their limitations. Numerical experiments are carried out using a synthetic one-dimensional example and a medical image retrieval problem. The experimental results show the accuracy of the cluster sampling algorithm in retrieving the original image from noisy measurements and uncertain priors. Specifically, the proposed parallel algorithm increases the acceptance rate of the sampler from 45% to 81% with a Gaussian proposal kernel, and achieves an improvement of 29% over the optimally-tuned Tikhonov-based solution for image retrieval. The parallel nature of the proposed algorithm makes it a strong candidate for practical and large-scale applications.

Author 1: Hesham Arafat Ali
Author 2: Salah Attiya
Author 3: Ibrahim El-henawy

Keywords: Bayes’ theorem; Hamiltonian Monte-Carlo; Inverse problems; Markov chain Monte-Carlo; Medical image reconstruction; Parallel programming

Download PDF

Paper 67: Online Reputation Model Using Moving Window

Abstract: Users are increasingly dependent on decision tools to facilitate their transactions on the internet. Reputation models offer a solution by supporting users' purchase decisions. A reputation model takes product ratings as input and produces a product quality score. Most existing reputation models use the naïve average method or a weighted average method to aggregate ratings. The naïve average method is unstable when there exists a clear trend in the rating sequence, and weighted methods are influenced by unfair and malicious ratings. This paper introduces a new, simple reputation model that aggregates ratings based on the concept of a moving window. This approach enables us to study the variability of ratings over time, which allows us to investigate the trend of ratings and account for sudden changes in that trend. The window size can be defined either by a number of ratings or by a duration. The proposed model has been validated against state-of-the-art reputation models using Mean Absolute Error and the Kendall tau correlation.
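A minimal Python sketch of the aggregation idea: the score is the mean of only the most recent ratings, with the window defined here by a count (a duration-based window would filter on timestamps instead); the ratings are illustrative:

    from collections import deque

    class MovingWindowReputation:
        def __init__(self, window_size=5):
            self.window = deque(maxlen=window_size)  # old ratings fall out automatically

        def add_rating(self, rating):
            self.window.append(rating)
            return self.score()

        def score(self):
            return sum(self.window) / len(self.window) if self.window else None

    rep = MovingWindowReputation(window_size=3)
    for r in [5, 5, 4, 2, 1, 1]:   # a sudden drop in the rating trend
        print(r, "->", round(rep.add_rating(r), 2))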

Author 1: Mohammad Azzeh

Keywords: Reputation Model; Moving Window; Ratings Aggregation Method; E-Commerce

Download PDF

Paper 68: PaMSA: A Parallel Algorithm for the Global Alignment of Multiple Protein Sequences

Abstract: Multiple sequence alignment (MSA) is a well-known problem in bioinformatics whose main goal is the identification of evolutionary, structural or functional similarities in a set of three or more related genes or proteins. We present a parallel approach for the global alignment of multiple protein sequences that combines dynamic programming, heuristics, and parallel programming techniques in an iterative process. In the proposed algorithm, the longest common subsequence technique is used to generate a first MSA by aligning identical residues. An iterative process improves the MSA by applying a number of operators that were defined in the present work, in order to produce more accurate alignments. The accuracy of the alignment was evaluated through the application of optimization functions. In the proposed algorithm, a number of processes work independently at the same time searching for the best MSA of a set of sequences. One process acts as a coordinator, whereas the rest are considered slave processes. The resulting algorithm was called PaMSA, which stands for Parallel MSA. The MSA accuracy and response time of PaMSA were compared against those of Clustal W, T-Coffee, MUSCLE, and Parallel T-Coffee on 40 datasets of protein sequences. When run as a sequential application, PaMSA turned out to be the second fastest when compared against the nonparallel MSA methods tested (Clustal W, T-Coffee, and MUSCLE). However, PaMSA was designed to be executed in parallel, and when run as a parallel application, it presented better response times than Parallel T-Coffee under the conditions tested. Furthermore, the sum-of-pairs scores achieved by PaMSA when aligning groups of sequences with an identity percentage from approximately 70% to 100% were the highest in all cases. PaMSA was implemented on a cluster platform in C++ using the standard Message Passing Interface (MPI) library.

Author 1: Irma R. Andalon-Garcia
Author 2: Arturo Chavoya

Keywords: Multiple Sequence Alignment; parallel programming; Message Passing Interface

Download PDF

Paper 69: Performance Evaluation of Anti-Collision Algorithms for RFID System with Different Delay Requirements

Abstract: The main purpose of Radio Frequency Identification (RFID) implementation is to keep track of tagged items. The basic components of an RFID system are tags and readers. Tags communicate with the reader through a shared wireless channel, and a tag collision occurs when more than one tag attempts to communicate with the reader simultaneously. Therefore, the second-generation UHF Electronic Product Code (EPC Gen 2) standard uses the Q algorithm to deal with the collision problem. In this paper, we introduce three new anti-collision algorithms to handle multiple priority classes of tags, namely the DC, DQ and DCQ algorithms. The goal is to achieve high system performance and enable each priority class to meet its delay requirement. The simulation results reveal that the DCQ algorithm is more effective than the DC and DQ algorithms, as it is designed to flexibly control and adjust system parameters to obtain the desired delay differentiation level. Finally, it can be concluded that the proposed DCQ algorithm can control the delay differentiation level and yet maintain high system performance.
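For context, the baseline Gen 2 mechanism that the proposed algorithms build on can be sketched in Python as follows: each tag draws a random slot in [0, 2^Q - 1], and the reader nudges a floating Q up on collisions and down on idle slots. This is a simplified single-class illustration with an arbitrary adjustment constant, not the DC, DQ or DCQ algorithms themselves:

    import random

    def inventory_round(n_tags, q=4, c=0.3, max_queries=500):
        qfp = float(q)
        identified = 0
        for _ in range(max_queries):
            # Remaining tags draw a slot counter in [0, 2^Q - 1]; only slot 0 replies now.
            slots = [random.randrange(2 ** round(qfp)) for _ in range(n_tags - identified)]
            replies = slots.count(0)
            if replies == 1:
                identified += 1               # singleton reply: tag identified
            elif replies == 0:
                qfp = max(0.0, qfp - c)       # idle slot: shrink the frame
            else:
                qfp = min(15.0, qfp + c)      # collision: grow the frame
            if identified == n_tags:
                break
        return identified

    random.seed(0)
    print("tags identified:", inventory_round(n_tags=20))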

Author 1: Warakorn Srichavengsup

Keywords: RFID; Anti-collision; Q algorithm; Priority

Download PDF

Paper 70: Privacy and Security Mechanisms for eHealth Monitoring Systems

Abstract: The rapid scientific and technological merging of the Internet of Things (IoT), cloud computing and wireless body area networks (WBANs) has significantly contributed to the advent of e-healthcare and has improved the quality of medical care. Specifically, patient-centric health monitoring plays an important role in e-healthcare facilities by providing assistance in different areas, including medical data collection and aggregation, data transmission, data processing and data query. This paper proposes an architectural framework to describe the complete monitoring life cycle and indicates the important service modules. More meticulous discussion is then devoted to data gathering at the patient side, which serves as an essential basis for achieving efficient, robust and protected patient health monitoring. Different design challenges are also analyzed in order to develop high-quality and protected patient-centric monitoring systems, along with their potential solutions.

Author 1: M. Ajmal Sawand
Author 2: Najeed Ahmed Khan

Keywords: Wireless body area network; e-healthcare; mobile crowd sensing

Download PDF

Paper 71: SVM based Emotional Speaker Recognition using MFCC-SDC Features

Abstract: Enhancing the performance of emotional speaker recognition has attracted increasing interest in recent years. This paper presents a methodology for speaker recognition under different emotional states based on the multiclass Support Vector Machine (SVM) classifier. We compare two feature extraction methods used to represent emotional speech utterances in order to obtain the best accuracies: the traditional Mel-Frequency Cepstral Coefficients (MFCC) and MFCC combined with Shifted-Delta-Cepstra (MFCC-SDC). Experiments are conducted on the IEMOCAP database using two multiclass SVM approaches: One-Against-One (OAO) and One-Against-All (OAA). The obtained results show that MFCC-SDC features outperform the conventional MFCC.
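A rough Python sketch of an MFCC-SDC front end followed by a multiclass SVM, assuming librosa and scikit-learn; the synthetic sine "utterance", the SDC stacking parameters and the random emotion labels are illustrative stand-ins for the IEMOCAP data and the paper's exact configuration:

    import numpy as np
    import librosa
    from sklearn.svm import SVC

    def mfcc_sdc(y, sr, n_mfcc=13, P=3, k=3):
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # shape (n_mfcc, frames)
        delta = librosa.feature.delta(mfcc)
        T = delta.shape[1] - (k - 1) * P
        sdc = np.vstack([delta[:, i * P:i * P + T] for i in range(k)])  # stack shifted delta blocks
        return np.vstack([mfcc[:, :T], sdc]).T                   # one feature row per frame

    sr = 16000
    y = np.sin(2 * np.pi * 220 * np.arange(sr) / sr).astype(np.float32)  # toy "utterance"
    features = mfcc_sdc(y, sr)
    labels = np.random.default_rng(3).integers(0, 4, size=len(features)) # fake emotion labels

    clf = SVC(decision_function_shape="ovo").fit(features, labels)       # one-against-one SVM
    print("frames:", features.shape, "training accuracy:", round(clf.score(features, labels), 2))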

Author 1: Asma Mansour
Author 2: Zied Lachiri

Keywords: Emotion; Speaker recognition; Mel Frequency Cepstral Coefficients (MFCC); Shifted-Delta-Cepstral (SDC); SVM

Download PDF

Paper 72: Improving Routing Performances to Provide Internet Connectivity in VANETs over IEEE 802.11p

Abstract: In intelligent transportation systems, many applications and services can be offered on the road via the Internet. Providing these applications over vehicular ad hoc network (VANET) technology requires good routing performance. Channel fading, the quality of the received signal and vehicle mobility are the main factors that affect mobile ad hoc network performance in terms of throughput and packet delay, which are relevant to the performance evaluation of routing protocols. In this paper, we propose an efficient relay selection scheme based on Contention Based Forwarding (CBF) and a Fuzzy Logic System (FLS) that considers two important quality-of-service parameters, link stability and received signal quality, to select a potential relay vehicle in order to improve routing performance in the network. The simulation results show that the proposed relay selection scheme enhances throughput and decreases packet delay and overhead compared with an existing link-stability-based routing protocol (MBRP) and M-AODV+.

Author 1: Driss ABADA
Author 2: Abdellah MASSAQ
Author 3: Abdellah BOULOUZ

Keywords: VANET; routing; link stability; fading; RSS; mobility

Download PDF

Paper 73: An Efficient Routing Protocol in Mobile Ad-hoc Networks by using Artificial Immune System

Abstract: Characteristics of mobile ad hoc networks such as high node mobility and limited energy are regarded as routing challenges in these networks. The OLSR protocol is one of the routing protocols in mobile ad hoc networks; it selects the shortest route between source and destination through Dijkstra's algorithm. However, OLSR suffers from a major problem: it does not consider parameters such as the nodes' energy level and link length in its route processing. This paper employs an artificial immune system (AIS) to enhance the efficiency of the OLSR routing protocol. The proposed algorithm, called AIS-OLSR, considers hop count, the remaining energy of intermediate nodes, and the distance among nodes, which is realized by the negative selection and ClonalG algorithms of AIS. Extensive packet-level simulation in the ns-2 environment shows that AIS-OLSR outperforms OLSR and EA-OLSR in terms of packet delivery ratio, throughput, end-to-end delay and lifetime.

Author 1: Fatemeh Sarkohaki
Author 2: Reza Fotohi
Author 3: Vahab Ashrafian

Keywords: AIS-OLSR; Routing protocol; Mobile ad hoc network; AIS

Download PDF

Paper 74: Mitigating Address Spoofing Attacks in Hybrid SDN

Abstract: Address spoofing attacks like ARP spoofing and DDoS attacks are mostly launched in a networking environment to degrade its performance, and they sometimes break down network services before the administrator becomes aware of the attack. Software Defined Networking (SDN) has emerged as a novel network architecture in which the data plane is isolated from the control plane, and the control plane is implemented at a central device called the controller. However, the SDN paradigm is not commonly used due to constraints such as budget, limited skills to operate SDN, and the flexibility of traditional protocols. To obtain SDN benefits in a traditional network, a limited number of SDN devices can be deployed among legacy devices; this technique is called hybrid SDN. In this paper, we propose a new approach to automatically detect the attack condition and mitigate the attack in hybrid SDN. We represent the network topology in the form of a graph, and a graph-based traversal mechanism is adopted to indicate the location of the attacker. Simulation results show that our approach enhances network efficiency and improves network security.

Author 1: Fahad Ubaid
Author 2: Rashid Amin
Author 3: Faisal Bin Ubaid
Author 4: Muhammad Muwar Iqbal

Keywords: Communication system security; Network Security; ARP Spoofing

Download PDF

Paper 75: Low Error Floor Concatenated LDPC for MIMO Systems

Abstract: Multiple-Input Multiple-Output (MIMO) is the use of multiple antennas at both the transmitter and receiver to improve communication performance. MIMO technology has attracted attention in wireless communications because it offers significant increases in data throughput and spectral efficiency without additional bandwidth or increased transmit power. To achieve this performance, bit error rates (BER) should be low, and for this reason efficient encoding and decoding algorithms should be used. MIMO systems rely on error-control coding to ensure reliable communication in the presence of noise. Forward Error Correction (FEC) codes such as convolutional and block codes have been investigated for MIMO systems. Low-Density Parity-Check (LDPC) codes show good performance, except that an error floor may appear at high Signal-to-Noise Ratio (SNR). In this work, we propose a concatenated error-control code that reduces the error floor of LDPC codes suffering from this problem. The proposed scheme is a good candidate for high-rate real-time communication since it also reduces the decoding latency.

Author 1: Lamia Berriche
Author 2: Areej Al Qahtani

Keywords: LDPC; error floor; MIMO; error control

Download PDF
