The Science and Information (SAI) Organization
IJACSA Volume 7 Issue 3

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: A Robust Algorithm of Forgery Detection in Copy-Move and Spliced Images

Abstract: This paper presents a new method to detect forgery by copy-move, splicing, or both in the same image. A multiscale approach, which limits the computational complexity, is used to check whether the image contains any counterfeit region. By applying a one-level Discrete Wavelet Transform, the sharp edges that are traces of cut-paste manipulation appear as high frequencies and are detected in the LH, HL and HH sub-bands. A threshold is proposed to filter the suspicious edges, and a morphological operation is applied to reconstruct the boundaries of forged regions. If dilation produces no shape and no sharp edges are highlighted, the image is considered authentic. For a forged image, if a region at another position is similar to the detected region, a copy-move is confirmed; otherwise, a splicing is detected. Features of the suspicious region are extracted using the Run Difference Method (RDM) to create a feature vector, and searching for regions with the same feature vector constitutes the detection phase. The algorithm, which applies multiscale analysis and morphological operations to detect sharp edges and RDM to extract image features, is simulated in Matlab and achieves high efficiency not only on copy-move or spliced images but also on images containing both copy-move and splicing.

Author 1: Tu Huynh-Kha
Author 2: Thuong Le-Tien
Author 3: Synh Ha-Viet-Uyen
Author 4: Khoa Huynh-Van
Author 5: Marie Luong

Keywords: Forgery detection (FD); Copy-Move; Discrete Wavelet Transform (DWT); Run Difference Method (RDM); Splicing; Sharpness

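The sub-band screening step described in this abstract can be sketched in a few lines. This is a minimal numpy-only illustration, not the authors' implementation: the Haar wavelet, the threshold value, and the synthetic test image are assumptions made here for demonstration.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar DWT: returns (LL, LH, HL, HH) sub-bands."""
    img = img.astype(float)
    # Transform rows: pairwise averages (low-pass) and differences (high-pass)
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Transform columns of each half
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return LL, LH, HL, HH

def suspicious_edge_mask(img, thresh=10.0):
    """Flag coefficients whose high-frequency energy exceeds a threshold."""
    _, LH, HL, HH = haar_dwt2(img)
    energy = np.abs(LH) + np.abs(HL) + np.abs(HH)
    return energy > thresh

# A flat image with a brighter pasted block leaves sharp edges:
img = np.zeros((8, 8))
img[3:6, 3:6] = 100.0
mask = suspicious_edge_mask(img)
print(mask.any())   # -> True (edges of the pasted block are flagged)
```

A real pipeline would follow this mask with the morphological reconstruction and RDM matching the abstract describes.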

Paper 2: Portable Facial Recognition Jukebox Using Fisherfaces (FRJ)

Abstract: A portable real-time facial recognition system that is able to play personalized music based on the identified person’s preferences was developed. The system is called Portable Facial Recognition Jukebox Using Fisherfaces (FRJ). Raspberry Pi was used as the hardware platform for its relatively low cost and ease of use. This system uses the OpenCV open source library to implement the computer vision Fisherfaces facial recognition algorithms, and uses the Simple DirectMedia Layer (SDL) library for playing the sound files. FRJ is cross-platform and can run on both Windows and Linux operating systems. The source code was written in C++. The accuracy of the recognition program can reach up to 90% under controlled lighting and distance conditions. The user is able to train up to 6 different people (as many as will fit in the GUI). When implemented on a Raspberry Pi, the system is able to go from image capture to facial recognition in an average time of 200ms.

Author 1: Richard Mo
Author 2: Adnan Shaout

Keywords: Facial Recognition; Raspberry Pi; Computer Vision; GNU/Linux Operating System; OpenCV; C++


Paper 3: Internet of Everything (IoE): Analysing the Individual Concerns Over Privacy Enhancing Technologies (PETs)

Abstract: This paper aims to investigate the effectiveness of providing individuals' privacy through privacy enhancing technologies (PETs). The successful merging of cyberspace with the real world through the "Internet of Everything (IoE)" has led to rapid progress in the research and development of predictive big data analysis. Individual privacy has gained considerable momentum in both industry and academia, since privacy-enhancing technologies (PETs) constitute a technical means to protect information. Privacy regulations and the law deem this an integral part of protecting the individual's private sphere when the infrastructure of Information and Communication Technologies (ICT) is laid out. Modern organisations use consent forms to gather individuals' sensitive personal information for a specific purpose, and the law prohibits using a person's information for purposes other than those for which consent was initially established. The ICT infrastructure should therefore be developed in line with privacy laws and made both compliant and intelligent, learning by itself from its environment. This extra layer embedded in the system would educate the ICT structure and help the system authenticate and communicate with prospective users. The existing literature on protecting individuals' privacy through privacy-enhancing technologies (PETs) is still embryonic, and it concludes that individuals' concerns about privacy are not fully considered in the technological sense. Among other contributions, this research paper devises a conceptual model to improve individuals' privacy.

Author 1: Asim Majeed
Author 2: Rehan Bhana
Author 3: Anwar Ul Haq
Author 4: Imani Kyaruzi
Author 5: Shaheed Pervaz
Author 6: Mike-Lloyd Williams

Keywords: privacy; privacy enhancing technology (PET); big data; information communication technology (ICT)


Paper 4: A General Evaluation Framework for Text Based Conversational Agent

Abstract: This paper details the development of a new evaluation framework for a text-based Conversational Agent (CA). A CA is an intelligent system that handles spoken and/or text-based conversations between machine and human. In general, the lack of evaluation frameworks for CAs hinders their development; the point of evaluating any system is to verify its functionality and to guide its further development. A specific CA, namely ArabChat, has been chosen to test the proposed framework. ArabChat is a rule-based CA that uses a pattern matching technique to handle users' Arabic text-based conversations. The evaluation framework proposed and developed in this paper is natural-language independent. It is based on the exchange of specific information between ArabChat and the user, called "Information Requirements". This information is tagged for each rule in the applied domain and should exist in a user's utterance (conversation). A real experiment has been conducted at the Applied Science University in Jordan, where ArabChat served as an information-point advisor for native Arabic-speaking students, in order to evaluate ArabChat and thereby the proposed evaluation framework.

Author 1: Mohammad Hijjawi
Author 2: Zuhair Bandar
Author 3: Keeley Crockett

Keywords: Artificial intelligence; Conversational Agent; evaluation


Paper 5: Effects of Walls and Floors in Indoor Localization Using Tracking Algorithm

Abstract: The advancement in wireless and mobile networks has led to an increase in location based services (LBS). LBS can be applied in many applications, such as vehicle systems, security systems, and patient tracking systems. Global Navigation Satellite Systems (GNSS) have become very popular due to their ability to provide highly accurate positions, especially in outdoor environments. However, GNSS signals become very weak when they pass through natural or man-made structures, as in urban canyons or indoor environments, which hinders the applicability of GNSS-based localization techniques in such challenging environments. Many indoor localization techniques are instead based on the received signal strength (RSS). The RSS depends on the distance to an access point (AP): the received power is stronger when the receiver is closer to the AP, provided the signal is not obstructed by walls or floors. This paper aims at studying the effect of walls and floors on the RSS, and at estimating the distribution of the RSS under such obstructions. Moreover, a tracking algorithm based on a multi-wall and floor propagation model is applied to increase the positioning accuracy.

Author 1: Farhat M. A. Zargoun
Author 2: Ibrahim M. Henawy
Author 3: Nesreen I. Ziedan

Keywords: Indoor localization; Tracking algorithm; Effects of wall and floor on RSS; Effects of obstruction; Multi wall and floor propagation model

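The multi-wall and floor propagation idea above can be illustrated with a standard log-distance path-loss model. The reference power, path-loss exponent, and per-wall/per-floor attenuation values below are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def rss_multi_wall_floor(d, n_walls, n_floors,
                         p0=-40.0, path_exp=2.0,
                         wall_loss=3.0, floor_loss=15.0, d0=1.0):
    """Received signal strength (dBm) at distance d metres from the AP.

    p0         : RSS at the reference distance d0 (assumed value)
    path_exp   : path-loss exponent (2.0 = free space)
    wall_loss  : attenuation per obstructing wall in dB (illustrative)
    floor_loss : attenuation per obstructing floor in dB (illustrative)
    """
    path_loss = 10.0 * path_exp * math.log10(d / d0)
    return p0 - path_loss - n_walls * wall_loss - n_floors * floor_loss

# RSS weakens with distance and with every obstructing wall/floor:
free = rss_multi_wall_floor(10.0, 0, 0)        # -> -60.0 dBm
obstructed = rss_multi_wall_floor(10.0, 2, 1)  # -> -81.0 dBm
print(free, obstructed)
```

Fitting the per-wall and per-floor losses to measurements is what turns a sketch like this into the paper's tracking model.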

Paper 6: E-Learning Collaborative System for Practicing Foreign Languages with Native Speakers

Abstract: The paper describes a novel social network-based open educational resource for practicing foreign languages with native speakers, based on the predefined teaching materials. This virtual learning platform, called i2istudy, eliminates misunderstanding by providing prepared and predefined scenarios, enabling the participants to understand each other and, as a consequence, to communicate freely. The developed system allows communication through the real time video and audio feed. In addition to establishing the communication link, it tracks the student progress and allows rating the instructor, based on the learner’s experience. The system went live in April 2014, and had over six thousand active daily users, with over 40,000 total registered users. Monetization has been added to the system, and time will show how popular the system will become in the future.

Author 1: Ilya V. Osipov
Author 2: Alex A. Volinsky
Author 3: Anna Y. Prasikova

Keywords: E-learning, learning tools; peer-to-peer network; social network; open educational resources; distance learning


Paper 7: Color Image Segmentation via Improved K-Means Algorithm

Abstract: Data clustering techniques are often used to segment real-world images. Unsupervised image segmentation algorithms based on clustering suffer from random initialization, and there is a need for an efficient and effective image segmentation algorithm that can be used in computer vision, object recognition, image recognition, or compression. To address these problems, the authors present a density-based initialization scheme to segment color images. In the kernel density based clustering technique, the data sample is mapped to a high-dimensional space for effective data classification. The Gaussian kernel is used for density estimation and for mapping the sample image into a high-dimensional color space. The proposed initialization scheme for the k-means clustering algorithm can homogeneously segment an image into regions of interest while avoiding the dead-centre and trapped-centre (local minima) phenomena. The experimental results indicate that the proposed approach is more effective than other existing clustering-based image segmentation algorithms. In the proposed approach, the Berkeley image database has been used for comparative analysis against recent clustering-based image segmentation algorithms such as k-means++, k-medoids and k-mode.

Author 1: Ajay Kumar
Author 2: Shishir Kumar

Keywords: k-means; k-means++; k-medoids; k-mode; kernel density component

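A density-based seeding step of the kind this abstract describes can be sketched as follows. This is a simplified stand-in for the authors' scheme, with the Gaussian bandwidth and the median-distance separation heuristic as assumptions; it picks seeds at density peaks that are mutually far apart, which is what avoids the dead-centre problem of random initialization.

```python
import numpy as np

def density_seeds(X, k, bandwidth=1.0):
    """Pick k initial k-means centres at high-density, mutually distant points.

    Density at each sample is estimated with a Gaussian kernel; the first
    seed is the global density peak, and each further seed is the densest
    point not too close to the seeds already chosen.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    density = np.exp(-d2 / (2 * bandwidth ** 2)).sum(1)   # unnormalized KDE
    seeds = [int(np.argmax(density))]
    min_sep = np.median(d2)                               # separation heuristic
    for _ in range(k - 1):
        far = d2[:, seeds].min(axis=1)                    # sq. dist to nearest seed
        cand = np.where(far >= min_sep)[0]
        if cand.size == 0:                                # fall back: farthest point
            cand = np.array([int(np.argmax(far))])
        seeds.append(int(cand[np.argmax(density[cand])]))
    return X[seeds]

# Two well-separated blobs: the two seeds land one per blob.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
c = density_seeds(X, 2)
print(c)   # one seed near (0, 0), one near (5, 5)
```

An ordinary k-means loop seeded with these centres would complete the segmentation pipeline.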

Paper 8: Detection and Feature Extraction of Collective Activity in Human-Computer Interaction

Abstract: Time-based online media, such as video, has been growing in importance, yet there is limited research on information retrieval of time-coded media content. This work elaborates on the idea of extracting feature characteristics from time-based online content by analysing users' interactions instead of analysing the content itself. Accordingly, a time series of users' activity in online media is constructed and shown to exhibit rich temporal dynamics. Additionally, it is demonstrated that it is also possible to detect characteristic patterns in collective activity while accessing time-based media. Pattern detection of collective activity, as well as feature extraction of the corresponding pattern, is achieved by means of a time series clustering approach. This is demonstrated with the proposed approach on information-rich videos. It is shown that the proposed probabilistic algorithm effectively detects distinct shapes of the users' time series, correctly predicting popularity dynamics as well as their scale characteristics.

Author 1: Ioannis Karydis
Author 2: Markos Avlonitis
Author 3: Phivos Mylonas
Author 4: Spyros Sioutas

Keywords: Users activity; aggregation modelling; collective intelligence; time-based media; pattern detection


Paper 9: NMVSA Greedy Solution for Vertex Cover Problem

Abstract: Minimum vertex cover (MVC) is a well-known NP-Complete optimization problem. The importance of MVC in theory and practice comes from the wide range of its applications. This paper describes a polynomial-time greedy algorithm to find near-optimal solutions for MVC. The new algorithm, NMVSA, is a modification of an existing algorithm called MVAS; it uses the same principle of selecting a candidate from the neighborhood of a vertex, with a modification in the selection procedure. A comparative study between NMVSA and MVAS shows that the proposed algorithm provides better or equal results in most cases on the underlying data sets, which leads to a better average approximation ratio for NMVSA. NMVSA inherits the simplicity of the original algorithm.

Author 1: Mohammed Eshtay
Author 2: Azzam Sleit
Author 3: Ahmad Sharieh

Keywords: Vertex Cover Problem (MVC); Combinatorial Problem; NP-Complete Problem; Approximation Algorithm; Greedy algorithms; Minimum Independent Set

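The abstract does not give NMVSA's exact selection rule, but the general shape of a greedy vertex-cover heuristic of this family can be sketched as follows; the max-degree selection here is the textbook choice, not necessarily the paper's.

```python
def greedy_vertex_cover(edges):
    """Greedy vertex cover sketch: repeatedly add the vertex touching the
    most uncovered edges until every edge is covered."""
    uncovered = set(frozenset(e) for e in edges)
    cover = set()
    while uncovered:
        # degree = number of uncovered edges incident to each vertex
        deg = {}
        for e in uncovered:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        best = max(deg, key=deg.get)          # most 'useful' vertex
        cover.add(best)
        uncovered = {e for e in uncovered if best not in e}
    return cover

# A star plus one extra edge: centre 0 covers the star, then edge (4, 5).
edges = [(0, 1), (0, 2), (0, 3), (4, 5)]
c = greedy_vertex_cover(edges)
print(len(c))   # -> 2
```

The result is not guaranteed optimal in general, which is exactly why approximation-ratio comparisons such as NMVSA vs. MVAS matter.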

Paper 10: Implementation of Pedestrian Dynamic

Abstract: Pattern generation is one of the ways to apply computer science in art, and many methods have been implemented. One of them is cellular automata. In a previous work, cellular automata (CA) were used to create an image with a stochastic and irregular pattern. There were problems with the performance of that method because the average number of occupied cells was less than 50 percent, so the method had to be improved. In this research, the pedestrian dynamics concept is incorporated into the pattern generation process, so that stochastic and deterministic approaches are combined in generating the pattern. This combination is the key element of the method. The proposed model also successfully produces irregular-pattern images. Based on a quantitative test, the occupied-cell ratio is still less than 50 percent, but the proposed model yields a better distance between the last-position and starting-point nodes of the pattern. When the number of agents is 75, the target of an occupied-cell ratio of more than 75 percent is achieved.

Author 1: Purba Daru Kusuma

Keywords: pattern generation; cellular automata; pedestrian dynamic; intelligent agent


Paper 11: Critical Path Reduction of Distributed Arithmetic Based FIR Filter

Abstract: Operating speed, the reciprocal of critical-path computation time, is one of the prominent design metrics of finite impulse response (FIR) filters. It is largely affected both by the system architecture and by the technique used to design the arithmetic modules. The large computation time of conventionally designed multipliers limits the speed of the system architecture. Distributed arithmetic (DA) is one technique used to provide multiplier-free multiplication in the implementation of FIR filters; however, it suffers from a severe limitation: the look-up table (LUT) grows exponentially with the filter order. An improved distributed arithmetic technique for the FIR filter system architecture is presented here. In the proposed technique, the single large LUT of conventional DA is replaced by a number of smaller indexed LUT pages to restrict the exponential growth and to reduce system access time; it also eliminates the use of adders. A selection module selects the desired value from the desired page, which reduces the computation time of the critical path. A trade-off between the access times of the LUT pages and the selection module helps achieve a minimum critical path and thus maximize the operating speed. Implementations target Xilinx ISE and Virtex IV devices. Results for an FIR filter with 8-bit input sample data width are presented. It is observed that the proposed design performs significantly faster than the conventional DA and existing DA-based designs.

Author 1: Sunita Badave
Author 2: Anjali Bhalchandra

Keywords: Critical Path; Multiplier less FIR filter; Distributed Arithmetic; LUT Design; Indexed LUT

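The core distributed-arithmetic trick, replacing multipliers with a precomputed LUT addressed by bit-slices of the inputs, can be modelled in software. This toy model uses a single LUT and unsigned samples for brevity; the paper's contribution is precisely to split such a LUT into indexed pages to avoid the exponential growth shown here.

```python
def build_da_lut(coeffs):
    """LUT of all partial sums of the filter taps: entry m holds the sum of
    coefficients whose select bit is set in m (2**N entries for N taps)."""
    n = len(coeffs)
    return [sum(c for i, c in enumerate(coeffs) if (m >> i) & 1)
            for m in range(1 << n)]

def da_inner_product(coeffs, samples, width=8):
    """Multiplier-free inner product sum(c[i]*x[i]) via bit-serial DA.

    Assumes unsigned `width`-bit samples for simplicity; each bit plane of
    the inputs forms one LUT address, weighted by its power of two.
    """
    lut = build_da_lut(coeffs)
    acc = 0
    for b in range(width):                       # one pass per bit position
        addr = 0
        for i, x in enumerate(samples):
            addr |= ((x >> b) & 1) << i          # bit-slice across all inputs
        acc += lut[addr] << b                    # shift-accumulate, no multiply
    return acc

coeffs, samples = [3, -1, 4, 2], [10, 20, 30, 40]
print(da_inner_product(coeffs, samples))         # -> 210 (= 3*10 - 1*20 + 4*30 + 2*40)
```

For N taps the LUT has 2**N entries, which is the exponential growth the paper's indexed LUT pages are designed to restrict.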

Paper 12: Explorative Study of SQL Injection Attacks and Mechanisms to Secure Web Application Database- A Review

Abstract: The increasing innovations in web development technologies drive the growth of user-friendly web applications. With activities like online banking, shopping, booking and trading, these applications have become an integral part of everyone's daily routine. The profit-driven online business industry has also acknowledged this growth, because a thriving application provides a global platform to an organization. The database of a web application is its most valuable asset, storing sensitive information of individuals and of the organization. SQLIA is the topmost threat, as it targets the database of the web application. It allows the attacker to gain control over the application, resulting in financial fraud, leakage of confidential data and even deletion of the database. The exhaustive survey of SQL injection attacks presented in this paper is based on empirical analysis. This comprises the deployment of the injection mechanism for each attack, with its respective types, on various websites, dummy databases and web applications. The paramount security mechanisms for web application databases are also discussed to mitigate SQL injection attacks.

Author 1: Chandershekhar Sharma
Author 2: Dr. S. C. Jain
Author 3: Dr. Arvind K Sharma

Keywords: Injection Attacks; SQL vulnerabilities; Web Application Attacks


Paper 13: Corrupted MP4 Carving Using MP4-Karver

Abstract: In digital forensics, the recovery of deleted and damaged video files plays an important role in the search for evidence. In this paper, the MP4-Karver tool is proposed to recover and repair corrupted videos. Moreover, MP4-Karver extracts frames from a video to automatically screen it for illegal content instead of watching the complete video. Many existing approaches, such as Scalpel's method, Garfinkel, Bi-Fragment Gap Carving, Smart Carving and Frame-Based Recovery, attempt to recover videos in different ways, but most of the recovered videos are usually not completely playable. The proposed MP4-Karver focuses on recovering video files and repairing corrupted videos so that they are complete and playable. Experimental results show that the proposed MP4-Karver effectively restores corrupted or damaged video, with an improved percentage of video restoration compared with existing tools.

Author 1: Ahmed Nur Elmi Abdi
Author 2: Kamaruddin Malik Mohamad
Author 3: Yusoof Mohammed Hasheem
Author 4: Rashid Naseem
Author 5: Jamaluddin
Author 6: Muhammad Aamir

Keywords: Digital forensic; File carving; Repairing; Corrupted; Frame extraction


Paper 14: Competitive Representation Based Classification Using Facial Noise Detection

Abstract: Linear representation based face recognition has been studied intensively in recent years. Competitive representation classification is a linear representation based method that uses the most competitive training samples to sparsely represent a probe. However, possible noise on a test face image can bias the representation results. In this paper we propose a facial noise detection method to remove noise from the test image during the competitive representation. We compare the proposed method with others on the AR, Extended Yale B, ORL, FERET, and LFW databases, and the experimental results show the good performance of our method.

Author 1: Tao Liu
Author 2: Cong Li
Author 3: Ying Liu
Author 4: Chao Li

Keywords: face recognition; sparse representation; biometrics; noise detection


Paper 15: Energy Efficient Routing Protocol for Maximizing the Lifetime in WSNs Using Ant Colony Algorithm and Artificial Immune System

Abstract: Energy limitations have become a fundamental challenge in designing wireless sensor networks, and network lifetime is the most important metric of interest. Several attempts have been made at efficient utilization of energy in routing techniques. This paper proposes an energy-efficient routing technique for maximizing network lifetime, called swarm intelligence routing, achieved by combining an ant colony optimization (ACO) algorithm with an artificial immune system (AIS). The AIS is used to solve the packet-loop problem and to control route direction, while the ACO algorithm determines the optimum route for sending data packets. The proposed routing technique seeks the optimum route from the nodes towards the base station so that energy exhaustion is balanced and lifetime is maximized. The proposed technique is compared with Dijkstra's routing method, and results show a significant increase in network lifetime, by a factor of about 1.2567.

Author 1: Safaa Khudair Leabi
Author 2: Turki Younis Abdalla

Keywords: ant colony algorithm; artificial immune system; adaptive routing; network lifetime; wireless sensor networks


Paper 16: Improving DNA Computing Using Evolutionary Techniques

Abstract: The field of DNA computing has attracted many biologists and computer scientists, as it has a biological interface, small size and substantial parallelism. DNA computing depends on the biochemical reactions of DNA molecules, which can anneal randomly and might accidentally cause improper or unattractive computations. This inspires opportunities to use evolutionary computation via DNA. Evolutionary computation emphasizes probabilistic search and optimization methods that mimic models of organic evolution. This research work aims at offering a simulated evolutionary DNA computing model that incorporates DNA computing with an evolutionary algorithm. This evolutionary approach makes it possible to increase dimensionality by replacing the typical filtering method with an evolutionary one. Thus, by iteratively growing and recombining a population of strands, eliminating incorrect solutions from the population, and choosing the best solutions via gel electrophoresis, an optimal or near-optimal solution can be evolved rather than extracted from the initial population.

Author 1: Godar J. Ibrahim
Author 2: Tarik A. Rashid
Author 3: Ahmed T. Sadiq

Keywords: Parallel Computation; DNA Computation Algorithm; Evolutionary DNA Computing Algorithm


Paper 17: A Novel Paradigm for Symmetric Cryptosystem

Abstract: The Playfair cipher is the first known digraph polyalphabetic method. It relies on a 5x5 matrix of uppercase alphabet characters and simple substitution processes for encryption and decryption. This paper proposes an enhanced variant of the Playfair cipher algorithm that incorporates an algorithm for elaborate key generation, starting with a seed accompanying the ciphertext, and is referred to as a Novel Paradigm for Symmetric Cryptosystem (NPSC). The key generation, encryption and decryption processes implement modular calculations instead of the simple substitution used in the traditional Playfair cipher, and both alphabetic characters and numerals are supported. As demonstrated by the experimental results, this variant considerably enhances the security strength without increasing the matrix size. Comparative studies of various critical factors against other reported versions of the Playfair cipher are also included.

Author 1: Shadi R. Masadeh
Author 2: Hamza A. Al_Sewadi
Author 3: Mohammad A. Wadi

Keywords: cryptography; security; symmetric systems; polyalphabetic cipher; key generation

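The abstract's idea of modular calculations driven by a seed-based key schedule can be illustrated with a toy substitution over a 36-symbol alphabet (letters and numerals, as the variant supports both). The key schedule and alphabet below are illustrative assumptions made here, not the NPSC specification.

```python
# 36-symbol alphabet: letters plus digits.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
M = len(ALPHABET)

def keystream(seed, n):
    """Toy key schedule (linear congruential generator) expanded from a
    seed -- a stand-in for the paper's elaborate key generation."""
    ks, state = [], seed
    for _ in range(n):
        state = (state * 1103515245 + 12345) % (2 ** 31)
        ks.append(state % M)
    return ks

def encrypt(plaintext, seed):
    """Modular substitution: c = (p + k) mod 36, symbol by symbol."""
    ks = keystream(seed, len(plaintext))
    return "".join(ALPHABET[(ALPHABET.index(ch) + k) % M]
                   for ch, k in zip(plaintext, ks))

def decrypt(ciphertext, seed):
    ks = keystream(seed, len(ciphertext))
    return "".join(ALPHABET[(ALPHABET.index(ch) - k) % M]
                   for ch, k in zip(ciphertext, ks))

msg = "MEET AT 9".replace(" ", "")
ct = encrypt(msg, seed=42)
print(decrypt(ct, seed=42))   # -> MEETAT9
```

A production key schedule would of course use a cryptographic generator; the point here is only the modular arithmetic replacing Playfair's matrix-lookup substitution.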

Paper 18: Integrating Semantic Features for Enhancing Arabic Named Entity Recognition

Abstract: Named Entity Recognition (NER) is currently an essential research area that supports many tasks in NLP. Its goal is to find a solution that accurately boosts the identification of named entities. This paper presents an integrated semantic-based machine learning (ML) model for the Arabic Named Entity Recognition (ANER) problem. The basic idea of the model is to combine several linguistic features and to utilize syntactic dependencies to infer semantic relations between named entities. The proposed model focuses on recognizing three types of named entities: person, organization and location. Accordingly, it combines internal features that represent linguistic properties with external features that represent the semantics of the relations between the three named entity types, using an external knowledge source such as the Arabic WordNet ontology (ANW) to enhance recognition accuracy. Both kinds of features are introduced to a CRF classifier and prove effective for ANER. Experimental results show that this approach achieves an overall F-measure of around 87.86% and 84.72% on the ANERCorp and ALTEC datasets respectively.

Author 1: Hamzah A. Alsayadi
Author 2: Abeer M. ElKorany

Keywords: Arabic Named Entity Recognition (ANER); Conditional Random Fields (CRF); Domain Ontology; Semantic Relation Feature (SRF); Arabic WordNet ontology (ANW)


Paper 19: Clustering Analysis of Wireless Sensor Network Based on Network Coding with Low-Density Parity Check

Abstract: The number of nodes in a wireless sensor network (WSN) is one of the fundamental parameters when developing an algorithm based on Network Coding (NC) with an LDPC (Low Density Parity Check) code, because it directly affects the size of the generator matrix of the LDPC code and its dispersion. Decreasing the BER (Bit Error Rate) to optimize wireless communication systems is one reason to analyse the network as clusters (at the level of their nodes). In this paper, the authors present a fully distributed clustering algorithm, consider different numbers of nodes per cluster, and select the curves that offer the best compromise. They examine the effects of SNR (Signal-to-Noise Ratio) quantization on the system performance obtained for different scenarios (by varying the parameter corresponding to the number of symbols during the forwarding phase). Finally, the results show that an increased number of nodes improves the LDPC code properties.

Author 1: Maria Hammouti
Author 2: El Miloud Ar-reyouchi
Author 3: Kamal Ghoumid
Author 4: Ahmed Lichioui

Keywords: Clustering Techniques; Network Coding; LDPC codes; distributed algorithms; wireless sensor network


Paper 20: Multi-Agent Based Model for Web Service Composition

Abstract: The evolution of the Internet and the competitiveness among companies were factors in the explosion of Web services. Web services are applications available on the Internet, each performing a particular task. Web users often need to call different services to achieve a more complex task that can't be satisfied by a single service, and they often prefer to have the best services responding to their requests. In this context, the Quality of Service (QoS), a very important aspect of Web services, should be measured in order to offer the user the best services. "How can we ensure the composition of different services to respond to the user's request" is the first problem we contribute to solving, by proposing a multi-agent based model for the automatic planning of Web services. "Guaranteeing the required quality of composite Web services" is a complex task, given the unpredictable and dynamic nature of composite Web services, so our contribution to remedy this problem consists of using two classes of quality attributes: the first contains generic attributes and the second contains specific ones.

Author 1: Karima Belmabrouk
Author 2: Fatima Bendella
Author 3: Maroua Bouzid

Keywords: Agents; Model; Quality of Service; Service composition; Web services


Paper 21: A Novel MapReduce Lift Association Rule Mining Algorithm (MRLAR) for Big Data

Abstract: Big Data mining is an analytic process used to discover hidden knowledge and patterns in massive, complex, and multi-dimensional datasets. A single processor's memory and CPU resources are very limited, which makes algorithm performance ineffective at this scale. Recently, there has been renewed interest in using association rule mining (ARM) on Big Data to uncover relationships between items that seem unrelated. However, traditional ARM discovery techniques are unable to handle this huge amount of data, so there is a vital need for scalable and parallel ARM strategies based on Big Data approaches. This paper develops a novel MapReduce framework for an association rule algorithm based on the Lift interestingness measure (MRLAR), which can handle massive datasets with a large number of nodes. The experimental results show the efficiency of the proposed algorithm in measuring the correlations between itemsets by integrating MapReduce and the Lift interestingness measure instead of depending on confidence.

Author 1: Nour E. Oweis
Author 2: Mohamed Mostafa Fouad
Author 3: Sami R. Oweis
Author 4: Suhail S. Owais
Author 5: Vaclav Snasel

Keywords: Big Data; Data Mining; Association Rule; MapReduce; Lift Interesting Measurement

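The Lift interestingness measure that MRLAR distributes over MapReduce is simple to state on its own. This single-machine sketch (the MapReduce partitioning is omitted, and the toy baskets are invented here) shows why lift is more informative than confidence: it normalizes by both itemsets' frequencies.

```python
def lift(transactions, a, b):
    """Lift of rule a -> b: P(a and b) / (P(a) * P(b)).
    Lift > 1: positive correlation; = 1: independence; < 1: negative."""
    n = len(transactions)
    p_a = sum(a <= t for t in transactions) / n          # support of a
    p_b = sum(b <= t for t in transactions) / n          # support of b
    p_ab = sum((a | b) <= t for t in transactions) / n   # joint support
    return p_ab / (p_a * p_b)

baskets = [{"bread", "butter"}, {"bread", "butter", "milk"},
           {"bread"}, {"milk"}]
print(lift(baskets, {"bread"}, {"butter"}))   # > 1: co-occur more than chance
```

In a MapReduce setting the three support counts would be computed by mappers over data shards and combined in the reducer before the final division.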

Paper 22: Knowledge Management of Best Practices in a Collaborative Environment

Abstract: Identifying and sharing best practices in a domain means duplicating successes, which helps people learn from each other and reuse proven practices. Successful sharing of best practices can be accomplished by establishing a collaborative environment where users, experts and communities can interact and cooperate. A detailed review of previous research in best practice knowledge management shows that existing models have focused on developing methodologies to manage best practices, but most of them did not propose solutions towards the development of full-fledged systems that make use of technology to allow effective sharing and reuse of best practices. This paper presents a life cycle model to manage expertise for communities of practice. The proposed model is implemented in the education field as a knowledge management system that promotes and values users' contributions. We focus on the case of best teaching practices (BTPs), as they develop instructors' abilities and improve overall instruction quality in higher education. For this purpose, we developed a computing environment, including a knowledge management system and a web portal, to assist instructors in higher education in the creation, sharing and application of BTPs.

Author 1: Amal Al-Rasheed
Author 2: Jawad Berri

Keywords: Best practice; knowledge management system; knowledge sharing; higher education; life cycle; portal

Download PDF

Paper 23: An Automated Recommender System for Course Selection

Abstract: Most electronic commerce and knowledge management systems use recommender systems as the underlying tools for identifying a set of items that will be of interest to a certain user. Collaborative recommender systems recommend items based on similarities and dissimilarities among users' preferences. This paper presents a collaborative recommender system that recommends university elective courses to students by exploiting courses that other, similar students have taken. The proposed system employs an association rules mining algorithm as the underlying technique to discover patterns between courses. Experiments were conducted with real datasets to assess the overall performance of the proposed approach.

Author 1: Amer Al-Badarenah
Author 2: Jamal Alsakran

Keywords: collaborative recommendation; association rule mining; data mining; recommender system; course selection

Download PDF
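A course recommender built on association rules, as the abstract describes, can be sketched roughly as follows; the single-antecedent mining step, the thresholds, and the course names are illustrative assumptions, not the authors' algorithm:

```python
from itertools import combinations
from collections import Counter

def mine_rules(enrollments, min_support=0.4, min_confidence=0.6):
    """Mine single-antecedent rules (A -> B) from course-enrollment sets."""
    n = len(enrollments)
    pair_counts, item_counts = Counter(), Counter()
    for courses in enrollments:
        for c in courses:
            item_counts[c] += 1
        for a, b in combinations(sorted(courses), 2):
            pair_counts[(a, b)] += 1
    rules = []
    for (a, b), cnt in pair_counts.items():
        if cnt / n >= min_support:
            for ante, cons in ((a, b), (b, a)):
                conf = cnt / item_counts[ante]
                if conf >= min_confidence:
                    rules.append((ante, cons, conf))
    return rules

def recommend(taken, rules):
    """Recommend consequents of rules whose antecedent the student has taken."""
    return {cons for ante, cons, _ in rules if ante in taken and cons not in taken}

history = [{"AI", "ML"}, {"AI", "ML", "DB"}, {"AI", "DB"}, {"ML", "NLP"}]
rules = mine_rules(history)
suggestions = recommend({"AI"}, rules)  # courses similar students also took
```

In a real deployment the enrollment history would come from student records, and multi-item antecedents (full Apriori) would typically replace the pairwise mining shown here.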

Paper 24: Fault Tolerant System for Sparse Traffic Grooming in Optical WDM Mesh Networks Using Combiner Queue

Abstract: Queuing theory is an important concept in current internet technology. As bandwidth requirements keep increasing, it is necessary to use optical communication for data transfer. Optical communication at the backbone network requires various devices for traffic grooming. The cost of these devices is very high, which increases the cost of the network. One solution to this problem is sparse traffic grooming in optical WDM mesh networks. Sparse traffic grooming allows only a few nodes in the network to act as grooming nodes (G-nodes). These G-nodes have grooming capability, while the remaining simple nodes cannot groom traffic. The grooming nodes are special, high-cost nodes, and the possibility of faults at such a node, or of link failure, is high. Resolving such faults and providing an efficient network is very important; hence the importance of a survivable sparse traffic grooming network. Queuing theory helps to improve network performance and groom the traffic in the network. The paper focuses on improving the performance of the backbone network and reducing blocking probability. To achieve these goals, we have simulated the model. The main contribution is to apply survivability to the sparse grooming network and to use combiner queues at each node. It has been observed that combiner queuing alone minimizes blocking probability and balances the load over the network. The model is not only cost effective but also increases network performance and minimizes call blocking probability.

Author 1: Sandip R. Shinde
Author 2: Dr. Suhas H. Patil
Author 3: Dr. S. Emalda Roslin
Author 4: Archana S. Shinde

Keywords: optical communication; sparse traffic grooming; survivability; fault tolerance; Combiner Queue; WDM

Download PDF
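The abstract reports blocking-probability results but gives no formulas. As background, the classic Erlang B recursion for an M/M/c/c loss system (our illustrative assumption, not necessarily the queueing model the authors simulate) computes the probability that an arriving request finds all channels busy and is blocked:

```python
def erlang_b(offered_load, servers):
    """Erlang B blocking probability via the numerically stable recursion
    B(0) = 1;  B(k) = a*B(k-1) / (k + a*B(k-1)), with a = offered load in Erlangs."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Blocking drops quickly as wavelengths/channels are added for a fixed load.
print([round(erlang_b(2.0, c), 4) for c in (1, 2, 3)])
```

The recursion avoids the overflow-prone factorials of the closed-form Erlang B expression, which matters when evaluating networks with many channels.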

Paper 25: The ECG Signal Compression Using an Efficient Algorithm Based on the DWT

Abstract: The storage capacity of ECG records presents an important issue in medical practice. These data can contain hours of recording, which requires a large storage space. Compression of the ECG signal is widely used to deal with this issue. The problem with this process is the possibility of losing some important features of the ECG signal; this loss could negatively influence the analysis of the heart's condition. In this paper, we propose an efficient method of ECG signal compression using the discrete wavelet transform and run-length encoding. The method is based on the decomposition of the ECG signal, a thresholding stage, and the encoding of the final data. The method is tested on some of the MIT-BIH arrhythmia signals from the international PhysioNet database, and shows high performance compared to other recently published methods.

Author 1: Oussama El B’charri
Author 2: Rachid Latif
Author 3: Wissam Jenkal
Author 4: Abdenbi Abenaou

Keywords: ECG compression; wavelet transform; lossy compression; hard thresholding

Download PDF
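The decompose–threshold–encode pipeline described above can be sketched with a one-level Haar transform, hard thresholding, and run-length encoding of the resulting zeros; this is a minimal illustration, not the authors' exact wavelet or coder:

```python
def haar_dwt(signal):
    """One-level Haar DWT: pairwise averages (approximation) and
    pairwise differences (detail), both scaled by 1/sqrt(2)."""
    s2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def hard_threshold(coeffs, thr):
    """Zero out coefficients whose magnitude falls below the threshold."""
    return [c if abs(c) >= thr else 0.0 for c in coeffs]

def run_length_encode(coeffs):
    """Encode runs of zeros as (0, run_length); keep other values verbatim."""
    out, zeros = [], 0
    for c in coeffs:
        if c == 0.0:
            zeros += 1
        else:
            if zeros:
                out.append((0, zeros))
                zeros = 0
            out.append(c)
    if zeros:
        out.append((0, zeros))
    return out
```

After thresholding, detail coefficients are mostly zero for smooth ECG segments, so the run-length stage is where the actual size reduction happens.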

Paper 26: AMBA Based Advanced DMA Controller for SoC

Abstract: This paper describes the implementation of an AMBA-based advanced DMA controller for SoC. It follows the AMBA specification, which defines two buses, AHB and APB, serving as the processor's system bus and the peripheral bus respectively. The DMA controller functions as a bridge between these two buses and allows them to work concurrently. A buffering mechanism is used to accommodate the differing speeds of peripherals; therefore an asynchronous FIFO is used to synchronize them. The proposed DMA controller can work in an SoC alongside the processor and achieve a fast data rate, transferring significant volumes of data with very low timing overhead. It is therefore a better choice with respect to both timing and data volume, and these two issues have been resolved in this research study. The results are compared with AMD processors such as the Geode GX 466, GX 500 and GX 533, and the presence and absence of a DMA controller alongside the processor is discussed and compared. The DMA controller stands as the better alternative in SoC design.

Author 1: Abdullah Aljumah
Author 2: Mohammed Altaf Ahmed

Keywords: FPGA; AMBA; DMA; DMA Controller; SoC; data transfer rate; FIFO

Download PDF

Paper 27: Role Based Multi-Agent System for E-Learning (MASeL)

Abstract: Software agents are autonomous entities that can interact intelligently with other agents as well as their environment in order to carry out a specific task. We have proposed a role-based multi-agent system for e-learning. This multi-agent system is based on Agent-Group-Role (AGR) method. As a multi-agent system is distributed, ensuring correctness is an important issue. We have formally modeled our role-based multi-agent system. The correctness properties of liveness and safety are specified as well as verified. Timed-automata based model checker UPPAAL is used for the specification as well as verification of the e-learning system. This results in a formally specified and verified model of the role-based multi-agent system.

Author 1: Mustafa Hameed
Author 2: Nadeem Akhtar
Author 3: Malik Saad Missen

Keywords: Management System (IMS); Multi-Agent System (MAS); Role Based Multi-Agent Systems; Agent-Group-Role (AGR); Agent-based Virtual Classroom (AVC); Intelligent Virtual Classroom (IVC); E-Learning; Information and Communication Technologies (ICTs); Formal verification; Model Checking

Download PDF

Paper 28: Detection and Identification System of Bacteria and Bacterial Endotoxin Based on Raman Spectroscopy

Abstract: Sepsis is a global health problem that carries a risk of death. In the developing world, about 60 to 80% of death cases are caused by Sepsis. Rapid methods for detecting its causes represent one of the major factors that may reduce Sepsis risks. Such methods can provide microbial detection and identification, which is critical to determine the right treatment for the patient. Microbial and pyrogen detection is important in quality control systems to ensure the absence of pathogens and pyrogens in the manufacturing of both medical and food products. Raman spectroscopy represents a quick and accurate identification and detection method for bacteria and bacterial endotoxin, and thus plays an important role in delivering high-quality biomedical products. It is a rapid method for chemical structure detection that can be used in identifying and classifying bacteria and bacterial endotoxin, acting as a solution for time- and cost-effective quality control procedures. This work presents an automatic system based on Raman spectroscopy to detect and identify bacteria and bacterial endotoxin. It uses the frequency properties of Raman scattering arising from the interaction between organic materials and electromagnetic waves. The scattered intensities are measured, wave numbers are converted into frequency, and then cepstral coefficients are extracted for both detection and identification. The methodology depends on normalization of the Fourier-transformed cepstral signal to extract classification features. Experimental results demonstrated effective identification and detection of bacteria and bacterial endotoxin, even at concentrations as low as 0.0003 Endotoxin Units (EU)/ml and 1 Colony Forming Unit (CFU)/ml, using a signal-processing-based enhancement technique.

Author 1: Muhammad Elsayeh
Author 2: Ahmed H.Kandil

Keywords: Rapid Microbial Detection; Rapid Pyrogen Detection; Microwave Spectroscopy; Dielectric Spectroscopy; Ultra Wide Band; Cepstral Analysis; Raman Spectroscopy

Download PDF

Paper 29: Feature Based Correspondence: A Comparative Study on Image Matching Algorithms

Abstract: Image matching and recognition are the crux of computer vision and play a major part in everyday life. From industrial robots to surveillance cameras, from autonomous vehicles to medical imaging, and from missile guidance to space exploration vehicles, computer vision, and hence image matching, is embedded in our lives. This communication presents a comparative study of the prevalent matching algorithms, addressing their restrictions and providing a criterion to define the level of efficiency likely to be expected from an algorithm. The study includes the feature detection and matching techniques used by these prevalent algorithms to allow a deeper insight. The chief aim of the study is to deliver a comprehensive source of reference for researchers involved in image matching, regardless of specific applications.

Author 1: Usman Muhammad Babri
Author 2: Munim Tanvir
Author 3: Khurram Khurshid

Keywords: computer vision; image matching; image recognition; algorithm comparison; feature detection

Download PDF

Paper 30: New Mathematical Modeling of Three-Level Supply Chain with Multiple Transportation Vehicles and Different Manufacturers

Abstract: Nowadays, no industry can move in global markets individually and independently of competitors, because each is part of a supply chain and the success of each member of the chain influences the others. In this paper, a three-level supply chain with several products, one manufacturer, one distributor, and several customers is reviewed. In the first part of the chain one type of vehicle is used, and in the second part two types of vehicles are used. The proposed model is an integrated mixed-integer programming model that minimizes costs, including transportation, inventory, and shortage penalty costs. The paper develops quantitative models in the field of three-level supply chains, contributes to theoretical research, and presents a case study of sending rolls produced by Mobarakeh Steel Structure Company to "Sazeh Gostar Saipa (S.G.S)" and then to suppliers. The proposed solution is an imperialist competitive algorithm, which is applied to 20 problem instances of different sizes; the results for small instances are compared with GAMS.

Author 1: Amir Sadeghi
Author 2: Amir Farmahini Farahani
Author 3: Hossein Beiki

Keywords: Transportation; Mathematical Model; Logistic Costs; Imperialist Competitive Algorithm

Download PDF

Paper 31: Wiki-Based Stochastic Programming and Statistical Modeling System for the Cloud

Abstract: Scientific software is a special type of software because its quality has a huge impact on the quality of scientific conclusions and on scientific progress. However, it is hard to ensure the required quality because of misunderstandings between scientists and software engineers. In this paper, we present a system for improving the quality of scientific software using elements of wikinomics and cloud computing, together with its implementation details. The system enables scientists to collaborate and directly evolve models, algorithms, and programs. WikiSPSM expands the limits of mathematical software.

Author 1: Vaidas Giedrimas
Author 2: Leonidas Sakalauskas
Author 3: Marius Neimantas
Author 4: Kestutis Žilinskas
Author 5: Nerijus Barauskas
Author 6: Remigijus Valciukas

Keywords: Wikinomics; open source; mathematical programming; software modeling; online computing

Download PDF

Paper 32: Multi Agent Architecture for Search Engine

Abstract: The process of retrieving information is becoming more ambiguous day by day due to the huge collection of documents present on the web. A single keyword produces millions of results related to a given query, but these results do not meet user expectations. The search results produced by traditional text search engines may be relevant or irrelevant, the underlying reason being that Web documents are HTML documents that do not contain semantic descriptors and annotations. This paper proposes a multi-agent architecture to produce fewer but personalized results. The purpose of the research is to provide a platform for domain-specific personalized search. Personalized search delivers web pages in accordance with the user's interest and domain. The proposed architecture uses both client-side and server-side personalization to provide the user with fewer but more accurate results. The multi-agent search engine architecture uses the concept of semantic descriptors to acquire knowledge about a given domain, leading to personalized search results. Semantic descriptors are represented as a network graph that holds the relationships within a given problem in the form of a hierarchy. This hierarchical classification is termed a taxonomy.

Author 1: Disha Verma
Author 2: Dr. Barjesh Kochar

Keywords: Search engine; Data mining; Multi agent systems (MAS); Semantic mapping; Hozo

Download PDF

Paper 33: Towards a New Approach to Improve the Classification Accuracy of the Kohonen’s Self-Organizing Map During Learning Process

Abstract: The Kohonen self-organization algorithm, known as the "topologic maps algorithm", has been widely used for classification in many applications. However, few theoretical studies have been proposed to improve and optimize the learning process of classification and clustering for dynamic and scalable systems, taking into account the evolution of multi-parameter objects. Our objective in this paper is to provide a new approach that improves the accuracy and quality of classification, building on the basic advantages of the Kohonen self-organization algorithm and on new network functions that pre-eliminate automatically detected drawbacks and redundancy.

Author 1: El Khatir HAIMOUDI
Author 2: Hanane FAKHOURI
Author 3: Loubna CHERRAT
Author 4: Mostafa Ezziyyani

Keywords: Artificial neural networks; self-organization map; Learning algorithm; Classification; Clustering; Principal components Analysis; power iteration

Download PDF
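The core Kohonen learning step the abstract builds on (find the best-matching unit, then pull its grid neighbourhood toward the input) can be sketched as follows; the map size, decay schedules, and data are illustrative choices, not the authors' configuration:

```python
import math, random

def train_som(data, rows, cols, dim, epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal Kohonen SOM: for each sample, locate the best-matching unit
    (BMU) and move nearby unit weights toward the sample, with a Gaussian
    neighbourhood and linearly decaying learning rate and radius."""
    rng = random.Random(seed)
    weights = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 1e-9
        for x in data:
            bmu = min(range(len(weights)),
                      key=lambda i: sum((weights[i][k] - x[k]) ** 2 for k in range(dim)))
            br, bc = divmod(bmu, cols)
            for i, w in enumerate(weights):
                r, c = divmod(i, cols)
                h = math.exp(-((r - br) ** 2 + (c - bc) ** 2) / (2 * sigma ** 2))
                for k in range(dim):
                    w[k] += lr * h * (x[k] - w[k])
    return weights
```

Improvements of the kind the paper targets typically act on this loop: pruning redundant units, adapting the neighbourhood function, or pre-processing inputs (e.g. with principal components analysis, as the keywords suggest) before training.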

Paper 34: A New Approach for Time Series Forecasting: Bayesian Enhanced by Fractional Brownian Motion with Application to Rainfall Series

Abstract: A new predictor algorithm based on a Bayesian enhanced approach (BEA) for long-term chaotic time series using artificial neural networks (ANN) is presented. The technique, based on stochastic models, uses Bayesian inference with fractional Brownian motion as the data model and a Beta model as prior information. However, the need for experimental data to specify and estimate causal models has not changed. Indeed, the Bayes method provides another way to incorporate prior knowledge into forecasting models; the simplest representations of prior knowledge in forecasting models are hard to beat in many forecasting situations, either because prior knowledge is insufficient to improve on models or because prior knowledge leads to the conclusion that the situation is stable. This work contributes to long-term time series prediction, giving forecast horizons of up to 18 steps ahead. The forecasted values and validation data are presented for benchmark chaotic series such as the Mackey-Glass, Lorenz, Henon, Logistic, Rössler, Ikeda and quadratic one-dimensional map series, and for monthly cumulative rainfall collected from Despeñaderos, Cordoba, Argentina. The computational results are evaluated against several previously proposed non-linear ANN predictors on high-roughness series, showing the better performance of the Bayesian enhanced approach in long-term forecasting.

Author 1: Cristian Rodriguez Rivero
Author 2: Daniel Patiño
Author 3: Julian Pucheta
Author 4: Victor Sauchelli

Keywords: long-term prediction; neural networks; Bayesian inference; Fractional Brownian Motion; Hurst parameter

Download PDF
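One way to connect a series to fractional Brownian motion, as the abstract does, is through its Hurst exponent. The sketch below estimates H from the scaling of increment variance, a standard estimator we assume purely for illustration, not the authors' Bayesian procedure:

```python
import math, random

def hurst_from_increments(series, lags=range(2, 20)):
    """Estimate the Hurst exponent H from increment-variance scaling:
    for fractional Brownian motion, Var[X(t+lag) - X(t)] ~ lag**(2H),
    so H is half the slope of log(variance) against log(lag)."""
    xs, ys = [], []
    for lag in lags:
        diffs = [series[i + lag] - series[i] for i in range(len(series) - lag)]
        mean = sum(diffs) / len(diffs)
        var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
        xs.append(math.log(lag))
        ys.append(math.log(var))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope / 2

# Sanity check on ordinary Brownian motion, whose Hurst exponent is 0.5
rng = random.Random(1)
walk, x = [], 0.0
for _ in range(5000):
    x += rng.gauss(0.0, 1.0)
    walk.append(x)
h = hurst_from_increments(walk)
```

Rough series (H < 0.5) and persistent series (H > 0.5) scale differently, which is why roughness matters when comparing long-term predictors as the abstract does.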

Paper 35: Performance Evaluation of Content Based Image Retrieval on Feature Optimization and Selection Using Swarm Intelligence

Abstract: The diversity and applicability of swarm intelligence is increasing every day in the fields of science and engineering. Swarm intelligence provides dynamic feature optimization. We have used swarm intelligence for feature optimization and feature selection in content-based image retrieval. The performance of content-based image retrieval is measured by precision and recall, whose values depend on the retrieval capacity of the system. The raw image content has visual features such as color, texture, shape and size, and the partial feature extraction technique is based on a geometric invariant function. Three swarm intelligence algorithms were used for feature optimization: ant colony optimization, particle swarm optimization (PSO), and the glowworm swarm optimization algorithm. The Corel image dataset and MATLAB software were used for performance evaluation.

Author 1: Kirti Jain
Author 2: Dr.Sarita Singh Bhadauria

Keywords: CBIR; swarm intelligence; feature extraction; SIFT transform; GSO (glowworm swarm optimization)

Download PDF
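Swarm-based feature selection of the kind described above is often cast as binary PSO over 0/1 feature masks; the following is a hedged sketch under that assumption (the authors' exact variants and CBIR scoring function are not specified in the abstract):

```python
import math, random

def binary_pso(score, n_features, n_particles=10, iters=30, seed=0):
    """Minimal binary PSO: each particle is a 0/1 mask over features;
    velocities are squashed through a sigmoid into bit-set probabilities."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_s = [score(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_s[i])
    gbest, gbest_s = pbest[g][:], pbest_s[g]
    for _ in range(iters):
        for i in range(n_particles):
            for k in range(n_features):
                r1, r2 = rng.random(), rng.random()
                vel[i][k] = (0.7 * vel[i][k]
                             + 1.4 * r1 * (pbest[i][k] - pos[i][k])
                             + 1.4 * r2 * (gbest[k] - pos[i][k]))
                prob = 1 / (1 + math.exp(-vel[i][k]))
                pos[i][k] = 1 if rng.random() < prob else 0
            s = score(pos[i])
            if s > pbest_s[i]:
                pbest[i], pbest_s[i] = pos[i][:], s
                if s > gbest_s:
                    gbest, gbest_s = pos[i][:], s
    return gbest, gbest_s
```

In a CBIR setting, `score` would evaluate retrieval precision/recall using only the features selected by the mask; here it is left abstract so the swarm mechanics stand on their own.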

Paper 36: Application of Artificial Neural Networks for Predicting Generated Wind Power

Abstract: This paper addresses the design and development of an artificial neural network based system for predicting the wind energy produced by wind turbines. In the last decade, renewable energy has emerged as an additional alternative source of electrical power generation. Because of its non-exhaustible nature, we need to assess the wind power generation capacity of wind turbines. The power generated by electric wind turbines depends on wind speed, flow direction, fluctuations, air density, generator hours, the seasons of an area, and wind turbine position. During a particular season, wind power generation can increase; in such cases, predicting wind energy generation is crucial for transmitting the generated energy to the power grid. It is therefore advisable for the wind power industry to predict wind power capacity. The present paper applies an artificial neural network technique to predict the wind energy generation capacity of wind farms in Harshnath, Sikar, Rajasthan, India.

Author 1: Vijendra Singh

Keywords: wind; neural network; wind power forecasting

Download PDF

Paper 37: Extract Five Categories CPIVW from the 9V’s Characteristics of the Big Data

Abstract: There is an exponential growth in the amount of data from different fields around the world, and this is known as Big Data. It needs more data management, analysis, and accessibility, which leads to an increase in the number of systems around the world that manage and manipulate data in different places at any time. Big Data is systematically analysed data whose processing depends on complex processes, devices, and resources. Data are no longer stored only in traditional databases, which are limited to structured data, but extend to unstructured and semi-structured data. Thus, Big Data has several characteristics and specific properties proportionate to the size of the data, with enormous and rapid development in all areas of business and life. In this work, we study the relationships between the characteristics of Big Data and extract categories from them. We conclude that there are five categories, and that these categories are related to each other.

Author 1: Suhail Sami Owais
Author 2: Nada Sael Hussein

Keywords: Big Data; Characteristics; Categories; Management; Analysis; Anywhere and Anytime

Download PDF

Paper 38: Hybrid Solution Methodology: Heuristic-Metaheuristic-Implicit Enumeration 1-0 for the Capacitated Vehicle Routing Problem (Cvrp)

Abstract: The capacitated vehicle routing problem (CVRP) is a difficult combinatorial optimization problem that has been intensively studied in the last few decades. We present a hybrid methodology to solve this problem which incorporates an improvement stage using a 1-0 implicit enumeration technique, or Balas's method. Other distinguishing features of the proposed methodology include a specially designed route-based crossover operator for solution recombination and an effective local search procedure as the mutation step. Finally, the methodology is tested on instances from the specialized literature and compared with the best-known solutions for the CVRP with a homogeneous fleet, in order to assess the efficiency of Balas's method in routing problems.

Author 1: David Escobar Vargas
Author 2: Ramón A. Gallego Rendón
Author 3: Antonio Escobar Zuluaga

Keywords: 1-0 implicit enumeration; CVRP; Operations research; Genetic algorithm; Chu-Beasley; Heuristics; Metaheuristics and exact methods

Download PDF

Paper 39: Automation of Optimized Gabor Filter Parameter Selection for Road Cracks Detection

Abstract: Automated systems for road crack detection are extremely important in road maintenance for vehicle safety and traveler comfort. Emerging cracks need to be detected and repaired as early as possible to avoid further damage, thus reducing rehabilitation costs. In this paper, a robust method for optimizing Gabor filter parameters for automatic road crack detection is discussed. The Gabor filter has been used in previous literature for similar applications; however, automatic selection of optimized Gabor filter parameters is needed due to variation in the texture of roads and cracks. The problem of changing background, which is in fact the road texture, is addressed through a learning process that uses synthetically generated road cracks for Gabor filter parameter tuning. The tuned parameters are then tested on real cracks, and a thorough quantitative analysis is performed for performance evaluation.

Author 1: Haris Ahmad Khan
Author 2: M. Salman
Author 3: Sajid Hussain
Author 4: Khurram Khurshid

Keywords: Pavement Cracks; Automated detection; Gabor Filters; Genetic Algorithm; Parameter Selection

Download PDF
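The Gabor filter whose parameters the paper tunes is a Gaussian envelope modulating a sinusoid; a minimal kernel generator (real part only, with commonly used but here assumed parameter names) looks like this:

```python
import math

def gabor_kernel(size, theta, lam, sigma, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor kernel: a Gaussian envelope (scale sigma,
    aspect ratio gamma) modulating a cosine of wavelength lam, with the
    coordinate frame rotated by orientation theta and phase offset psi."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
            row.append(g * math.cos(2 * math.pi * xr / lam + psi))
        kernel.append(row)
    return kernel
```

The tuple (theta, lam, sigma, gamma, psi) is exactly the kind of parameter vector a genetic algorithm, as named in the keywords, would search over: each candidate kernel is convolved with training images and scored on how well it highlights cracks against the road texture.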

Paper 40: Classification of Hand Gestures Using Gabor Filter with Bayesian and Naïve Bayes Classifier

Abstract: A hand gesture is the movement, position or posture of the hand, used extensively in our daily lives as part of non-verbal communication. A lot of research is being carried out to classify hand gestures in videos as well as images for various applications. The primary objective of this communication is to present an effective system that can classify various static hand gestures in complex background environments. The system localizes the hand region using a combination of morphological operations. A Gabor filter is applied to the extracted region of interest (ROI) to extract hand features, which are then fed to Bayesian and Naïve Bayes classifiers. The results of the system are very encouraging, with an average accuracy of over 90%.

Author 1: Tahira Ashfaq
Author 2: Khurram Khurshid

Keywords: Human Computer Interaction; Hand Segmentation; Gesture recognition; Gabor Filter; Bayesian and Naïve Bayes classifiers; Feature Extraction; Image Processing

Download PDF

Paper 41: Moon Landing Trajectory Optimization

Abstract: Trajectory optimization is a crucial process during the planning phase of a spacecraft landing mission. Once a trajectory is determined, guidance algorithms are created to guide the vehicle along the given trajectory. Because fuel mass is a major driver of the total vehicle mass, and thus mission cost, the objective of most guidance algorithms is to minimize the required fuel consumption. Most of the existing algorithms are termed as “near-optimal” regarding fuel expenditure. The question arises as to how close to optimal are these guidance algorithms. To answer this question, numerical trajectory optimization techniques are often required. With the emergence of improved processing power and the application of new methods, more direct approaches may be employed to achieve high accuracy without the associated difficulties in computation or pre-existing knowledge of the solution. An example of such an approach is DIDO optimization. This technique is applied in the current research to find these minimum fuel optimal trajectories.

Author 1: Ibrahim Mustafa MEHEDI
Author 2: Md. Shofiqul ISLAM

Keywords: lunar landing; trajectory optimization; optimization techniques; DIDO optimization

Download PDF

Paper 42: Evaluation of Navigational Aspects of Moodle

Abstract: A Learning Management System (LMS) is an effective platform for communication and collaboration among teachers and students to enhance learning. LMSs are now widely used in both conventional and virtual/distance learning paradigms. They have various limitations, as identified in the existing literature, including poor learning content, inappropriate use of technology, and usability issues. Poor usability distracts users. The literature covers many aspects of usability evaluation of LMSs but gives less focus to navigational issues, even though poor navigation can lead to disorientation and cognitive overload in users of any Web application. For this reason, we have proposed a navigational evaluation framework and applied it to evaluate the navigational structure of Moodle. We conducted a survey among students and teachers of two leading universities in Pakistan where Moodle is in use. This work summarizes the survey results and proposes guidelines to improve the usability of Moodle based on the feedback received from its users.

Author 1: Raheela Arshad
Author 2: Awais Majeed
Author 3: Hammad Afzal
Author 4: Muhammad Muzammal
Author 5: Arif ur Rahman

Keywords: e-Learning; Navigational Evaluation Framework; Learning management system (LMS); Moodle; Usability

Download PDF

Paper 43: Communication-Load Impact on the Performance of Processor Allocation Strategies in 2-D Mesh Multicomputer Systems

Abstract: A number of processor allocation strategies have been proposed in the literature. A key performance factor that can highlight the differences between these strategies is the amount of communication conducted between the parallel jobs to be allocated. This paper aims to identify how the density and pattern of communication affect the performance of these strategies. Compared to work already presented in the literature, we examined a wider range of communication patterns; other works consider only two types, namely the one-to-all and all-to-all patterns. We implemented the different allocation strategies in the C language and combined them with the ProcSimity simulation tool. The processor allocation strategies are examined under the First-Come-First-Serve scheduling strategy. Results show that communication pattern and load are factors with a significant impact on the performance of the processor allocation strategy used.

Author 1: Zaid Mustafa
Author 2: J. J. Alshaer
Author 3: O. Dorgham
Author 4: S. Bani-Ahmad

Keywords: Processor allocation; Parallel computing; 2-D Mesh; Communication patterns; Multicomputer systems

Download PDF

Paper 44: ECG Signal Compression Using the High Frequency Components of Wavelet Transform

Abstract: Electrocardiography (ECG) is the method of recording the electrical activity of the heart using electrodes. In ambulatory and continuous ECG monitoring, the amount of data to be handled is huge; hence an efficient compression technique is required. The data must also retain the clinically important features after compression. For most signals, the low-frequency component is considered the most important part. In wavelet analysis, the approximation coefficients are the low-frequency components of the signal and the detail coefficients are the high-frequency components; most of the time the detail coefficients are discarded. In this paper, we propose to use the detail coefficients of the wavelet transform for ECG signal compression. The Compression Ratio (CR) of the approximation and detail coefficients is compared. A threshold-based technique is adopted, removing coefficients that fall below the set threshold. Experiments are carried out using different types of wavelet transforms on the MIT-BIH ECG database, with MATLAB used for simulation. The novelty of the method is that the CR achieved by the detail coefficients is better: a CR of about 88% is achieved using the Sym3 wavelet. The quality of the reconstructed signal is measured by the PRD.

Author 1: Surekha K.S
Author 2: B. P. Patil

Keywords: ECG; PRD; transform

Download PDF
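The two figures of merit quoted above, CR from thresholded coefficients and PRD for reconstruction quality, can be computed as follows. The CR definition used here (percentage of coefficients zeroed by the threshold) is our assumption, as the abstract does not spell it out; PRD is the standard percentage root-mean-square difference:

```python
import math

def compression_ratio(coeffs, threshold):
    """Percentage of wavelet coefficients zeroed out by hard thresholding
    (one plausible CR definition; papers vary in how CR is defined)."""
    zeroed = sum(1 for c in coeffs if abs(c) < threshold)
    return 100.0 * zeroed / len(coeffs)

def prd(original, reconstructed):
    """Percentage Root-mean-square Difference between the original and
    reconstructed signals: lower PRD means better reconstruction quality."""
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

Raising the threshold increases CR but also PRD, so compression methods like the one above are judged on the trade-off between the two.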

Paper 45: Feature Selection Based on Minimum Overlap Probability (MOP) in Identifying Beef and Pork

Abstract: Feature selection is one of the most important techniques in image processing for classification. In classifying beef and pork based on texture features, overlapping features are a difficult issue. This paper proposes a feature selection method based on Minimum Overlap Probability (MOP) to obtain the best features. The method was tested on two datasets of features from digital images of beef and pork, which had similar textures and overlapping features. The selected features were used for training and testing with a Backpropagation Neural Network (BPNN). The training process used single features and several selected feature combinations. The test results showed that the BPNN detected beef or pork images with 97.75% accuracy. From this performance, it was concluded that the MOP method can select the best features for classifying/identifying two digital image objects with similar textures.

Author 1: Khoerul Anwar
Author 2: Agus Harjoko
Author 3: Suharto Suharto

Keywords: overlap; feature selection; best feature; minimum overlap probability (MOP); identifying

Download PDF
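An overlap probability between the feature distributions of two classes, in the spirit of MOP, can be approximated by histogram intersection: a feature whose beef and pork histograms barely intersect discriminates well. This is an illustrative estimator, not necessarily the paper's definition:

```python
def overlap_probability(a, b, bins=10):
    """Estimate the overlap probability of two 1-D feature samples as the
    intersection of their normalised histograms over a shared value range.
    0.0 means perfectly separable; 1.0 means identical distributions."""
    lo, hi = min(a + b), max(a + b)
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def hist(vals):
        h = [0] * bins
        for v in vals:
            h[min(int((v - lo) / width), bins - 1)] += 1
        return [c / len(vals) for c in h]

    ha, hb = hist(a), hist(b)
    return sum(min(x, y) for x, y in zip(ha, hb))
```

Feature selection under this criterion would score every candidate texture feature on the two class samples and keep those with the lowest overlap probability before training the classifier.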

Paper 46: Resource Allocation in Cloud Computing Using Imperialist Competitive Algorithm with Reliability Approach

Abstract: Cloud computing has become a universal trend, so for users, reliability is an important factor in adopting this technology. In addition, users prefer their work to be completed quickly. This paper takes these two parameters into account for resource allocation due to their importance. In this method, the Imperialist Competitive Algorithm (ICA) is used, with the addition of a cross layer in the cloud architecture for reliability evaluation. In this cross layer, an initial reliability is assigned to all resources; according to the success or failure of task execution, the reliability of a resource is increased or reduced. Reliability and makespan are used as the cost function in the ICA for resource allocation. Results show that the proposed method searches the problem space better and gives better performance when compared to other methods.

Author 1: Maryam Fayazi
Author 2: Mohammad Reza Noorimehr
Author 3: Sayed Enayatollah Alavi

Keywords: Imperialist Competitive algorithm; Reliability; makespan; Cloud Computing

Download PDF

Paper 47: Real-Time Gender Classification by Face

Abstract: The identification of human beings based on their biometric body parts, such as face, fingerprint, gait, iris, and voice, plays an important role in electronic applications and has become a popular area of research in image processing. It is also one of the most successful applications of computer–human interaction and understanding. Of all the abovementioned body parts, the face is one of the most popular traits because of its unique features. In fact, individuals can process a face in a variety of ways to classify it by its identity, along with a number of other characteristics, such as gender, ethnicity, and age. Specifically, recognizing human gender is important because people respond differently according to gender. In this paper, we present a robust method that uses global geometry-based features to classify gender and identify age and individuals from video sequences. The features are extracted based on face detection using skin color segmentation and the computed geometric features of the face ellipse region. These geometric features are then used to form face vector trajectories, which are input to a time delay neural network trained using the Broyden–Fletcher–Goldfarb–Shanno (BFGS) function. Results show that the suggested method, on our own dataset under unconstrained conditions, achieves a 100% classification rate on the training set for all applications, as well as 91.2% for gender classification, 88% for age identification, and 83% for human identification on the testing set. In addition, the proposed method establishes a real-time system usable in all three applications with a simple computation for feature extraction.

Author 1: Eman Fares Al Mashagba

Keywords: Biometrics; Face Detection; Geometry-based; Gender Classification; Quasi-Newton Algorithms

Download PDF

Paper 48: An Accelerated Architecture Based on GPU and Multi-Processor Design for Fingerprint Recognition

Abstract: Fingerprint recognition is widely used in security systems to recognize humans. In both industry and the scientific literature, many fingerprint identification systems have been developed using different techniques and approaches. Despite the number of research works conducted in this field, the developed systems suffer from some limitations, particularly those related to real-time computation and fingerprint recognition. Accordingly, this paper proposes a reliable algorithm for fingerprint recognition based on the extraction and matching of minutiae. We also present an accelerated architecture based on GPU and multi-processor design in which the suggested fingerprint recognition algorithm is implemented.

Author 1: Mossaad Ben Ayed
Author 2: Sabeur Elkosantini

Keywords: Minutia; Fingerprint; Architecture design; recognition; Gabor filter; MPSOC

Download PDF

Paper 49: Planning And Allocation of Tasks in a Multiprocessor System as a Multi-Objective Problem and its Resolution Using Evolutionary Programming

Abstract: The use of Linux-based clusters is a strategy for the development of multiprocessor systems. These systems face the problem of efficiently performing the planning and allocation of tasks so as to make efficient use of their resources. This paper addresses this as a multi-objective problem, analyzing the objectives that conflict during the planning of the tasks waiting in the queue, before tasks are assigned to processors. To this end, we propose a method that avoids strategies such as genetic operators, exhaustive searches for contiguous free processors on the target system, and the strict First-Come-First-Served (FIFO) allocation policy. Instead, we use estimation and simulation of the joint probability distribution as an evolutionary mechanism for obtaining assignments of a set of tasks, which are selected from the waiting queue through the Random-Order-of-Service (ROS) planning policy. A set of experiments comparing the results of the FIFO allocation policy with those of the proposed method shows better results in the criteria of utilization, throughput, mean turnaround time, waiting time, and total execution time when system loads are significantly increased.

Author 1: Apolinar Velarde Martinez
Author 2: Eunice Ponce de León Sentí
Author 3: Juan Antonio Nungaray Ornelas
Author 4: Juan Alejandro Montañez de la Torre

Keywords: Multicomputer system; Evolutionary Multi-objective Optimization; First Input First Output; Random-Order-of-Service; Estimation of Distribution Algorithms; Univariate Distribution Algorithm

Download PDF

Paper 50: An Improved Image Steganography Method Based on LSB Technique with Random Pixel Selection

Abstract: With the rapid advances in digital networks, information technology, digital libraries, and particularly World Wide Web services, many kinds of information can be retrieved at any time. Thus, security has become one of the most significant problems in distributing information, and it is necessary to protect this information while it passes over insecure channels. Steganography offers a strong approach to hiding secret data in appropriate media carriers such as images, audio files, text files, and video files. In this paper, a new image steganography method based on the spatial domain is proposed. In the proposed method, the secret message is embedded at pixel locations of the cover image chosen by a Pseudo Random Number Generator (PRNG), instead of sequentially in the pixels of the cover image. This randomization is expected to increase the security of the system. The proposed method works with two layers (Blue and Green), as a (2-1-2) layer scheme, and each byte of the message is embedded in only three pixels, in the form (3-2-3). The experimental results show that the proposed method achieves a very high Maximum Hiding Capacity (MHC) and higher visual quality, as indicated by the Peak Signal-to-Noise Ratio (PSNR).

Author 1: Marwa M. Emam
Author 2: Abdelmgeid A. Aly
Author 3: Fatma A. Omara

Keywords: Image Steganography; PRNG (Pseudorandom Number Generator); Peak Signal-to-Noise Ratio (PSNR); Mean Square Error (MSE)

Download PDF
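As a rough illustration of the randomized-embedding idea described in the abstract (not the paper's exact (2-1-2)/(3-2-3) Blue–Green scheme), the following Python sketch embeds message bits into the least significant bits of pixels visited in a PRNG-shuffled order, with the PRNG seed acting as the shared stego key; all names are hypothetical:

```python
import random

def embed(pixels, message_bits, seed):
    """Embed message bits into the LSBs of pixels chosen in a
    pseudo-random order derived from a shared seed (the stego key)."""
    stego = list(pixels)
    order = list(range(len(stego)))
    random.Random(seed).shuffle(order)        # PRNG-driven pixel order
    for bit, idx in zip(message_bits, order):
        stego[idx] = (stego[idx] & ~1) | bit  # overwrite the LSB only
    return stego

def extract(stego, n_bits, seed):
    """Recover the bits by regenerating the same pseudo-random order."""
    order = list(range(len(stego)))
    random.Random(seed).shuffle(order)
    return [stego[idx] & 1 for idx in order[:n_bits]]

cover = [200, 13, 77, 145, 90, 33, 250, 64]
bits = [1, 0, 1, 1]
stego = embed(cover, bits, seed=42)
assert extract(stego, 4, seed=42) == bits
```

Because only LSBs change, every stego pixel differs from its cover pixel by at most 1, which is why such schemes keep PSNR high.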

Paper 51: Developing a Feasible and Maintainable Ontology for Automatic Landscape Design

Abstract: In general, landscape architecture includes the analysis, planning, design, administration, and management of natural and artificial environments. An important aspect is the formation of so-called sustainable landscapes, which allow maximum use of the environment and natural resources and promote the sustainable restoration of ecosystems. For such purposes, a designer needs a complete database of existing and suitable plants, but no design tool has one. Therefore, we present the structure and development of an ontology suitable for storing and managing all information and knowledge about plants. The advantage is that the format of the ontology allows the storage of any plant species (e.g., living or fossil) and automated reasoning. An ontology is a formal conceptualization of a particular body of knowledge about the world, through the explicit representation of basic concepts, relations, and inference rules about them. The ontology may therefore be used by a design tool to help the designer choose the best options for a sustainable landscape.

Author 1: Pintescu Alina
Author 2: Matei Oliviu-Dorin
Author 3: Boanca Iuliana Paunita
Author 4: Honoriu Valean

Keywords: environment; landscapes; ontology; ontology-based simulation; sustainable landscapes

Download PDF

Paper 52: Parallel Implementation of Bias Field Correction Fuzzy C-Means Algorithm for Image Segmentation

Abstract: Image segmentation in the medical field is one of the most important phases of disease diagnosis. Bias field estimation is among the most interesting techniques for correcting the intensity inhomogeneity artifact in an image. However, this technique requires powerful processing and is quite expensive for large data such as medical images; hence, parallelism becomes increasingly necessary. Several researchers have followed this path, mainly in the bioinformatics field, where they have suggested different algorithm implementations. In this paper, a novel Single Instruction Multiple Data (SIMD) architecture for the bias field estimation and image segmentation algorithm is proposed. In order to accelerate compute-intensive portions of the sequential implementation, we have implemented this algorithm on three different graphics processing unit (GPU) cards, namely the GT740m, GTX760, and GTX580, using the Compute Unified Device Architecture (CUDA) programming tool. The computation speed-ups obtained allow us to identify the GPU architectures best suited to this kind of application.

Author 1: Nouredine AITALI
Author 2: Bouchaib CHERRADI
Author 3: Ahmed EL ABBASSI
Author 4: Omar BOUATTANE
Author 5: Mohamed YOUSSFI

Keywords: Image segmentation; Bias field correction; GPU; Non homogeneity intensity; CUDA; Clustering

Download PDF
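The GPU kernels themselves are not reproduced here, but the fuzzy c-means membership update at the core of such algorithms is computed independently per pixel, which is what makes the SIMD/GPU mapping natural. A plain-Python sketch of the standard update for 1-D intensities (function name illustrative, no bias field term) is:

```python
def fcm_memberships(data, centers, m=2.0):
    """Standard fuzzy c-means membership update for 1-D data:
    u[i][k] = 1 / sum_j (|x_k - c_i| / |x_k - c_j|)^(2/(m-1)).
    Each (i, k) entry is independent, so a GPU can compute one per thread."""
    u = []
    for c_i in centers:
        row = []
        for x in data:
            d_i = abs(x - c_i) or 1e-12  # guard against division by zero
            s = sum((d_i / (abs(x - c_j) or 1e-12)) ** (2.0 / (m - 1))
                    for c_j in centers)
            row.append(1.0 / s)
        u.append(row)
    return u

u = fcm_memberships([0.1, 0.9, 0.5], centers=[0.0, 1.0])
# memberships of each point sum to 1 across the two clusters
```

For each pixel, the memberships across clusters sum to 1, and a pixel near a center receives a membership close to 1 for that cluster.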

Paper 53: An Algerian dialect: Study and Resources

Abstract: Arabic is the official language of all Arab countries; it is used for official speech, newspapers, public administration, and school. In parallel, for everyday communication, non-official talks, songs, and movies, Arab people use their dialects, which are inspired by Standard Arabic and differ from one Arabic country to another. This linguistic phenomenon is called diglossia, a situation in which two distinct varieties of a language are spoken within the same speech community. It is observed throughout the Arab countries: Standard Arabic is widely written but not used in everyday conversation, while dialects are widely spoken in everyday life but almost never written. Thus, in the NLP area, many works have been dedicated to written Arabic, whereas Arabic dialects until recently were not studied enough; interest in them is recent. The first work on these dialects began in the last decade with Middle Eastern ones, and the dialects of the Maghreb are only beginning to be studied. Compared to written Arabic, the dialects are under-resourced languages that suffer from a lack of NLP resources despite their wide use. In this paper we deal with the Algerian Arabic dialect, a non-resourced language for which no resource is available to date. We present a first linguistic study introducing its most important features, and we describe the resources that we created from scratch for this dialect.

Author 1: Salima Harrat
Author 2: Karima Meftouh
Author 3: Mourad Abbas
Author 4: Khaled-Walid Hidouci
Author 5: Kamel Smaili

Keywords: Arabic dialect, Algerian dialect, Modern Standard Arabic, Grapheme to Phoneme Conversion, Morphological Analysis

Download PDF

Paper 54: An Enhanced Automated Test Item Creation Based on Learners Preferred Concept Space

Abstract: Recently, research has become increasingly interested in developing tools that can automatically create test items from text-based learning content. Such tools may support not only instructors in creating tests or exams but also learners in self-assessing their learning progress. This paper presents the recently developed enhanced automatic question-creation tool (EAQC). The EAQC extracts the most important key phrases (concepts) from textual learning content and automatically creates test items based on these concepts. Moreover, this paper discusses two studies evaluating the application of the EAQC in real learning settings. The first study showed that the concepts extracted by the EAQC often, but not always, reflect the concepts extracted by learners. Learners typically extracted fewer concepts than the EAQC, and there was great inter-individual variation among learners with regard to which concepts they considered relevant. Accordingly, the second study investigated whether the functionality of the EAQC could be improved so that valid test items are created when the tool is fed with concepts provided by learners. The results showed that the quality of the semi-automated creation of test items was satisfactory. Moreover, this demonstrates the EAQC's flexibility in adapting its workflow to the individual needs of learners.

Author 1: Mohammad AL-Smadi
Author 2: Margit Höfler
Author 3: Christian Gütl

Keywords: Automated Assessment; Automatic Test-Item Creation; Self-Regulated Learning; Evaluation of CAL systems; Pedagogical issues; Natural-Language Processing

Download PDF

Paper 55: Characterizing End-to-End Delay Performance of Randomized TCP Using an Analytical Model

Abstract: TCP (Transmission Control Protocol) is the main transport protocol used in high-speed networks. In the OSI model, TCP sits at the transport layer and serves as a connection-oriented protocol that performs handshaking to create a connection. In addition, TCP provides end-to-end reliability. Different standard variants of TCP (e.g., TCP Reno, TCP NewReno) implement mechanisms to dynamically control the size of the congestion window, but they have no control over the sending times of successive packets. TCP pacing introduces the concept of controlling packet sending times at TCP sources to reduce packet loss in bursty traffic networks. Randomized TCP is a new TCP pacing scheme that has shown better performance (in terms of throughput and fairness) than other TCP variants in bursty networks. The end-to-end delay of Randomized TCP is a very important performance measure that has not yet been addressed. In current high-speed networks, it is increasingly important to have mechanisms that keep end-to-end delay within an acceptable range. In this paper, we present a performance evaluation of the end-to-end delay of Randomized TCP. To this end, we use an analytical model and a simulation model to characterize its end-to-end delay performance.

Author 1: Mohammad Shorfuzzaman
Author 2: Mehedi Masud
Author 3: Md. Mahfuzur Rahman

Keywords: Randomized TCP, end-to-end delay, congestion window, TCP pacing, propagation delay, Markov chain.

Download PDF

Paper 56: Improving Vertical Handoffs Using Mobility Prediction

Abstract: The recent advances in wireless communications require integration of multiple network technologies in order to satisfy the increasing demand of mobile users. Mobility in such a heterogeneous environment entails that users keep moving between the coverage regions of different networks, which means that a non-trivial vertical handoff scheme is required in order to maintain a seamless transition from one network technology to another. A good vertical handoff scheme must provide the users with the best possible connection while keeping connection dropping probability to the minimum. In this paper, we propose a handoff scheme which employs the Markov model to predict the users’ future locations in order to make better handoff decisions with reduced connection dropping probability and number of unnecessary handoffs. Through simulation, the proposed scheme is compared with the SINR-based scheme, which was shown to outperform other vertical handoff schemes. The experiments show that the proposed scheme achieves significant improvements over the SINR-based scheme that can reach 51% in terms of the number of failed handoffs and 44% in terms of the number of handoffs.

Author 1: Mahmoud Al-Ayyoub
Author 2: Ghaith Husari
Author 3: Wail Mardini

Keywords: Heterogeneous wireless networks, Vertical handoff, Markov model, Artificial intelligence, Mobility management.

Download PDF
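As a minimal sketch of the prediction component only (the paper's handoff decision logic is not shown, and all names are hypothetical), a first-order Markov model can be estimated from a user's cell-visit history by counting transitions and predicting the most frequent successor of the current cell:

```python
from collections import defaultdict

def transition_counts(history):
    """First-order Markov model: count transitions between
    consecutive network cells in a user's movement history."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(history, history[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, current):
    """Predict the most likely next cell given the current one,
    or None if the current cell was never observed."""
    successors = counts.get(current)
    if not successors:
        return None
    return max(successors, key=successors.get)

history = ["A", "B", "C", "A", "B", "C", "A", "B", "A", "B", "C"]
counts = transition_counts(history)
# From cell B the user moved to C three times and to A once,
# so the predicted next cell from B is C.
```

A handoff scheme can use such a prediction to prepare resources in the network the user is most likely to enter, reducing failed and unnecessary handoffs.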

Paper 57: Performance Evaluation of Affinity Propagation Approaches on Data Clustering

Abstract: Classical clustering techniques, such as k-means, are very sensitive to the initial set of cluster centers, so they need to be rerun many times to obtain an optimal result. A relatively new clustering approach named Affinity Propagation (AP) was devised to resolve this problem. Although AP is very powerful, it still has several issues that need to be improved. In this paper, four improvements and developments of AP are discussed: Adaptive Affinity Propagation, Partition Affinity Propagation, Soft Constraint Affinity Propagation, and Fuzzy Statistic Affinity Propagation. These approaches are implemented and compared to identify the issues that AP really has to deal with and that need to be improved. According to the testing results, Partition Affinity Propagation is the fastest of the four approaches. On the other hand, Adaptive Affinity Propagation is much more tolerant of errors: it can remove oscillations when they occur, whereas otherwise the occurrence of oscillation causes the algorithm to fail to converge. Adaptive Affinity Propagation is thus more stable than the others, since it can deal with errors the others cannot. Finally, Fuzzy Statistic Affinity Propagation produces a smaller number of clusters than the others, since it generates its own preferences using fuzzy iterative methods.

Author 1: R. Refianti
Author 2: A.B. Mutiara
Author 3: A.A. Syamsudduha

Keywords: Affinity Propagation, Availability, Clustering, Exemplar, Responsibility, Similarity Matrix.

Download PDF
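The initialization sensitivity of k-means mentioned in the abstract is easy to demonstrate: the following plain-Python Lloyd's iteration on toy 1-D data converges to different centers from different starting points (AP sidesteps this by exchanging responsibility and availability messages between points instead of picking initial centers):

```python
def kmeans_1d(data, centers, iters=20):
    """Plain Lloyd's k-means on 1-D data from a given initialization."""
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in centers]
        for x in data:
            i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            clusters[i].append(x)
        # update step: recompute each center as its cluster mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [0, 1, 2, 10, 11, 12, 20, 21, 22]
a = kmeans_1d(data, [0, 10])   # converges to centers 1 and 16
b = kmeans_1d(data, [20, 22])  # converges to centers 6 and 21
```

Two different initializations on the same data yield two different local optima, which is exactly why k-means is typically rerun many times.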

Paper 58: Modified Grapheme Encoding and Phonemic Rule to Improve PNNR-Based Indonesian G2P

Abstract: Grapheme-to-phoneme conversion (G2P) is very important in both speech recognition and synthesis. The existing Indonesian G2P based on the pseudo nearest neighbour rule (PNNR) has two drawbacks: its grapheme encoding does not accommodate all Indonesian phonemic rules, and the PNNR must select the best phoneme from all possible conversions even though these could be filtered by some phonemic rules. In this paper, a modified partial orthogonal binary grapheme encoding and a phonemic-based rule are proposed to improve the performance of PNNR-based Indonesian G2P. Evaluation on 5-fold cross-validation, with 40K words to develop the model and 10K words for evaluation in each fold, shows that the two proposed concepts reduce the relative phoneme error rate (PER) by 13.07%. A more detailed analysis shows that most errors come from the grapheme 'e', which can be converted into either /E/ or /ə/, since the four prefixes 'ber', 'me', 'per', and 'ter' produce many ambiguous conversions with base words, and also from some similar compound words with different pronunciations of the grapheme 'e'. A stemming procedure can be applied to reduce those errors.

Author 1: Suyanto
Author 2: Sri Hartati
Author 3: Agus Harjoko

Keywords: Modified grapheme encoding; phonemic rule; Indonesian grapheme-to-phoneme conversion; pseudo nearest neighbour rule

Download PDF

Paper 59: Testing and Analysis of Activities of Daily Living Data with Machine Learning Algorithms

Abstract: It is estimated that 28% of the European Union's population will be aged 65 or older by 2060. Europe is getting older, and this has a high impact on the estimated cost of caring for older people. This is because, compared to the younger generation, older people are more at risk of facing cognitive impairment, frailty, and social exclusion, which can have negative effects on their lives as well as on the economy of the European Union. The 'active and independent ageing' concept aims to support older people in living an active and independent life in their preferred location, and this goal can be fully achieved only by understanding older people (i.e., their needs, abilities, preferences, and the difficulties they face during the day). One of the most reliable resources for such information is Activities of Daily Living (ADL) data, which gives essential information about people's lives. Understanding this kind of information is an important step towards providing the right support, facilities, and care for the older population. In the literature, there is a lack of studies evaluating the performance of machine learning algorithms in understanding ADL data. This work aims to test and analyze the performance of well-known machine learning algorithms with ADL data.

Author 1: Ayse Cufoglu
Author 2: Adem Coskun

Keywords: Activities of Daily Living (ADL); Machine Learning (ML); Classification Algorithms; Active and Independent Aging

Download PDF

Paper 60: A Context-Aware Recommender System for Personalized Places in Mobile Applications

Abstract: Selecting the most appropriate places under different contexts is an important contribution nowadays for people who visit places for the first time. The aim of the work in this paper is to build a context-aware recommender system that recommends places to users based on the current weather, the time of day, and the user's mood. This context-aware recommender system determines the current weather and the time of day at the user's location, then retrieves places appropriate to that context state. The recommender system also takes the user's current mood into account and selects the location the user should go to. Places are recommended based on what other users have visited under similar context conditions. The recommender system assigns a rating to each place in each context for each user; the place ratings are calculated by a genetic algorithm based on the Gamma function. Finally, a mobile application implementing the context-aware recommender system was developed.

Author 1: Soha A.El-Moemen Mohamed
Author 2: Taysir Hassan A.Soliman
Author 3: Adel A.Sewisy

Keywords: recommender system, context, context-aware, genetic algorithm, gamma function

Download PDF

Paper 61: Performance Enhancement of Patch-based Descriptors for Image Copy Detection

Abstract: Images have become main sources of information, learning, and entertainment, but due to advances in multimedia technologies, millions of images are shared on the Internet daily and can be easily duplicated and redistributed. The distribution of these duplicated and transformed images causes many problems and challenges, such as piracy, redundancy, and content-based image indexing and retrieval. To address these problems, copy detection systems based on local features are widely used. Initially, keypoints are detected and represented by some robust descriptors. The descriptors are computed over affine patches around the keypoints; these patches should be repeatable under photometric and geometric transformations. However, there are two main challenges with patch-based descriptors: (1) the affine patch around a keypoint can produce similar descriptors under entirely different scenes or contexts, which causes ambiguity, and (2) the descriptors are not sufficiently distinctive under image noise. Due to these limitations, copy detection systems suffer in performance. We present a framework that makes descriptors more distinguishable and robust by enriching them with the texture and gradients in their vicinity. Experimental evaluation on keypoint matching and image copy detection under severe transformations shows the effectiveness of the proposed framework.

Author 1: Junaid Baber
Author 2: Maheen Bakhtyar
Author 3: Waheed Noor
Author 4: Abdul Basit
Author 5: Ihsan Ullah

Keywords: Content-based image copy detection, SIFT, CSLBP, robust descriptors, patch based descriptors

Download PDF

© The Science and Information (SAI) Organization Limited. Registered in England and Wales. Company Number 8933205. All rights reserved. thesai.org