The Science and Information (SAI) Organization

IJACSA Volume 7 Issue 1

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Content Based Image Retrieval Using Gray Scale Weighted Average Method

Abstract: High feature-vector dimensionality has long been a curse for Content Based Image Retrieval (CBIR) systems, degrading their efficiency when indexing similar images in a database. This paper proposes a CBIR system that uses a Gray Scale Weighted Average technique to reduce the feature-vector dimension. The proposed method is better suited to color and texture image feature analysis than the color weighted average method described in the literature review. To prove the effectiveness of the retrieval system, two standard benchmark datasets, Wang and the Amsterdam Library of Texture Images (ALOT), covering color and texture respectively, were selected to evaluate the retrieval accuracy and efficiency of each method. For image similarity, Euclidean distance is employed to match the query image's feature vector against the feature vectors of the image database. The experimental results show that the overall performance of the proposed method is better in terms of average precision, average recall and average retrieval time.
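The Euclidean matching step described in the abstract can be sketched as follows; the four-dimensional vectors and image names are illustrative stand-ins, not the paper's actual gray-scale weighted-average features.

```python
import math

def euclidean(u, v):
    # Euclidean distance between two equal-length feature vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def retrieve(query_vec, database, top_k=3):
    # Rank database images by distance between their feature vectors
    # and the query feature vector; return the top_k closest names.
    ranked = sorted(database.items(), key=lambda kv: euclidean(query_vec, kv[1]))
    return [name for name, _ in ranked[:top_k]]

# Toy 4-dimensional feature vectors standing in for gray-scale weighted averages
db = {
    "beach.jpg":  [0.9, 0.1, 0.2, 0.8],
    "forest.jpg": [0.2, 0.9, 0.3, 0.1],
    "dunes.jpg":  [0.8, 0.2, 0.25, 0.7],
}
top2 = retrieve([0.85, 0.15, 0.2, 0.75], db, top_k=2)
```

In a real CBIR pipeline the dictionary values would be the reduced feature vectors produced by the weighted-average step, and precision/recall would be computed against labeled relevance sets.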

Author 1: Kamlesh Kumar
Author 2: Jian-Ping Li
Author 3: Zain-ul-abidin
Author 4: Riaz Ahmed Shaikh

Keywords: Color Weighted Average Method; Gray Scale Weighted Average Method; Feature Extraction; Precision; Recall; CBIR

PDF

Paper 2: 3D Virtual Worlds: Business and Learning Opportunities

Abstract: Virtual worlds (VWs) are now widespread and easily accessible to ordinary internet users. Millions of users already live virtual lives in these worlds, and their number is increasing continuously. The purpose of this paper is to review the business opportunities these virtual worlds offer, along with the learning opportunities for real-world companies and business students. The paper clearly and precisely defines virtual worlds in the context of social networking sites and also discusses the past, present and future of VWs. The main business opportunities for real-world companies, including advertising and communication, retailing, human resource management, marketing research and internal process management through virtual worlds, are critically reviewed. Current learning and training opportunities for real-world companies and business students are also reviewed. The paper aims to show that VWs are full of business and marketing applications and could be widely used by real-world companies for effective and efficient business operations.

Author 1: Aasim Munir Dad
Author 2: Professor Barry Davies
Author 3: Dr Andrew Kear

Keywords: Virtual Worlds; Social Networking Sites; Virtual Reality; Virtual Education Environments; Virtual Commerce

PDF

Paper 3: A Unified Forensic Framework for Data Identification and Collection in Mobile Cloud Social Network Applications

Abstract: Mobile Cloud Computing (MCC) is an emerging and well-accepted concept that significantly removes the storage and computing constraints of mobile devices, improves productivity, enhances performance, saves energy and elevates the user experience. The consolidation of cloud computing, wireless communication infrastructure, portable computing devices, location-based services and the mobile web has led to the inauguration of a novel computing model. Mobile social networks and cloud computing have gained rapid and intensive attention in recent years because of their numerous benefits. Despite being an advanced way to communicate and socialize with friends, the diverse and anonymous nature of mobile cloud social networking applications makes them very vulnerable to crimes and illegal activities. Given the benefits of mobile cloud computing, forensic assistance based on it could offer a solution to this problem. Therefore, this work proposes a Mobile Cloud Forensic Framework (MCFF) to facilitate forensic investigation in social networking applications. The MCFF comprises two components: a forensic logging module and a forensic investigation process. The forensic logging module is a readiness component installed on both the device and the cloud. The ClouDroid Inspector (CDI) tool uses the records traced by the forensic logging module to conduct the investigation on both the mobile device and the cloud. The MCFF identifies and collects the automatically synchronized copies of data in both the mobile and cloud environments to prove and establish the use of cloud services via smartphones.

Author 1: Muhammad Faheem
Author 2: Dr Tahar Kechadi
Author 3: Dr An Le Khac

Keywords: Mobile cloud computing; forensics; mobile cloud forensics; social networking applications

PDF

Paper 4: Features Management and Middleware of Hybrid Cloud Infrastructures

Abstract: The wide spread of cloud computing has created the need for specialized approaches to the design, management and programming of cloud infrastructures. This article reviews the peculiarities of hybrid clouds and of middleware development that adapts to governance principles and to changes in the structure of data storage in clouds. Examples and results of experimental research are presented.

Author 1: Evgeny Nikulchev
Author 2: Oleg Lukyanchikov
Author 3: Evgeniy Pluzhnik
Author 4: Dmitry Biryukov

Keywords: Cloud Infrastructure; Distributed Databases; Hybrid Clouds

PDF

Paper 5: Proposed Hyperchaotic System for Image Encryption

Abstract: This paper presents a new hyperchaotic system based on the Hénon and Logistic maps, which provides high capacity, security and efficiency. The proposed hyperchaotic system is employed to generate the diffusion key in an image encryption algorithm. Simulation experiments on the image encryption algorithm based on the proposed system show, through security analysis, that it has a large key space (10^84, ensuring strong resistance against exhaustive attacks), strong sensitivity to the encryption key and good statistical characteristics. Encryption and decryption times are suitable for different applications.
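As a rough illustration of driving a stream cipher from the two named maps, the sketch below mixes a logistic orbit and a Hénon orbit into key bytes and XORs them with the data. The coupling, constants and quantization here are assumptions for illustration only, not the paper's actual hyperchaotic system.

```python
def keystream(n, x0=0.7, hx=0.1, hy=0.3, a=1.4, b=0.3, r=3.99):
    # Generate n key bytes by mixing a logistic map with a Henon map.
    # The coupling and quantization below are illustrative, not the
    # paper's exact hyperchaotic construction.
    x, u, v = x0, hx, hy
    out = []
    for _ in range(n):
        x = r * x * (1 - x)                 # logistic map iterate
        u, v = 1 - a * u * u + v, b * u     # Henon map iterate
        out.append(int((abs(u) + x) * 1e6) % 256)
    return out

def xor_cipher(data, key):
    # Diffusion by XOR with the chaotic key bytes (its own inverse)
    return bytes(d ^ k for d, k in zip(data, key))

plain = bytes(range(16))                    # stand-in for image pixel bytes
ks = keystream(len(plain))
cipher = xor_cipher(plain, ks)
assert xor_cipher(cipher, ks) == plain      # decryption restores the data
```

A real implementation would diffuse whole pixel arrays and combine the keystream with permutation stages; the roundtrip assertion simply checks that XOR decryption inverts encryption.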

Author 1: Asst. Prof. Dr. Alia Karim Abdul Hassan

Keywords: hyperchaos; logistic map; Hénon map; image; encryption; decryption

PDF

Paper 6: Automatic Approach for Word Sense Disambiguation Using Genetic Algorithms

Abstract: Word sense disambiguation (WSD) is a significant field in computational linguistics, as it is indispensable for many language understanding applications. Automatic processing of documents is made difficult by the fact that many of the terms they contain are ambiguous. WSD systems try to resolve these ambiguities and find the correct meaning. Genetic algorithms can be applied to this problem, since they have been effectively used for many optimization problems. In this paper, a genetic algorithm is proposed to solve the word sense disambiguation problem by automatically selecting the intended meaning of a word in context without any additional resources. The proposed algorithm is evaluated on a collection of documents; it produces senses for the ambiguous words and creates dynamic, up-to-date word senses in a highly automatic manner.
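A minimal sketch of the genetic-algorithm idea: a chromosome assigns one sense per ambiguous word, and fitness rewards overlap between the chosen senses' glosses (a Lesk-style measure). The two-word sense inventory and the fitness function are invented for illustration; the paper's actual fitness and resources differ.

```python
import random
random.seed(1)

# Toy sense inventory: each word maps to candidate senses with tiny "glosses".
SENSES = {
    "bank":  [{"money", "deposit", "loan"}, {"river", "shore", "water"}],
    "check": [{"money", "payment", "bank"}, {"verify", "inspect", "test"}],
}
WORDS = list(SENSES)

def fitness(chrom):
    # Reward gloss overlap between the chosen senses (Lesk-style heuristic)
    g = [SENSES[w][s] for w, s in zip(WORDS, chrom)]
    return sum(len(g[i] & g[j]) for i in range(len(g)) for j in range(i + 1, len(g)))

def evolve(pop_size=8, gens=20):
    # Standard GA loop: elitist selection, one-point crossover, point mutation
    pop = [[random.randrange(len(SENSES[w])) for w in WORDS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(WORDS))       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                   # mutation
                i = random.randrange(len(WORDS))
                child[i] = random.randrange(len(SENSES[WORDS[i]]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()   # expected to favor the "money" senses of both words
```

With a real corpus the chromosome would span all ambiguous words of a sentence and the fitness would be computed from context words rather than hand-written glosses.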

Author 1: Dr. Bushra Kh. AlSaidi

Keywords: unsupervised method; genetic algorithms; word sense disambiguation; Natural Language Processing; Information Retrieval

PDF

Paper 7: Hybrid Motion Graphs for Character Animation

Abstract: Many works in the literature have improved the performance of motion graphs for synthesizing humanlike results in limited domains that require few constraints, such as dance, navigation in small game-like environments, or gesture feedback in a snowboard tutorial game. Humanlike characters cannot exist in an environment without interacting with the world surrounding them; the naturalness of the entire motion depends heavily on the animation of the walking character, the chosen path and the interaction motions. The main disadvantage of motion graphs is controlling the exact position of end-effectors, which is why little attention has been paid to searching for collision-free motions in complex environments or for manipulation motions. This motivates our approach: hybrid motion graphs that exploit the advantages of motion graphs to synthesize natural locomotion while overcoming their limitations in synthesizing manipulation motions by combining them with an inverse kinematics method for the upper-body motions.

Author 1: Kalouache Saida
Author 2: Cherif Foudil

Keywords: motion graphs; inverse kinematic; virtual human; animation

PDF

Paper 8: Toward a Hybrid Approach for Crowd Simulation

Abstract: We address the problem of simulating pedestrian crowd behaviors in real time. To date, two approaches have been used in modeling and simulating crowd behaviors: macroscopic and microscopic models. Microscopic techniques can accurately capture individual pedestrian behavior, while macroscopic simulations maximize efficiency; neither achieves both goals at the same time. To combine the strengths of the two classes of crowd modeling, we propose a hybrid architecture that defines the complex behaviors of a crowd at two levels: individual behaviors and the aggregate motion of pedestrian flow. It combines a microscopic and a macroscopic model in a unified framework: we simulate individual pedestrian behaviors in regions of low density using a microscopic model, and use a faster continuum model of pedestrian flow in the remaining regions of the simulation environment. We demonstrate the flexibility and scalability of our interactive hybrid simulation technique in a large environment, showing the applicability of hybrid techniques to the efficient simulation of large-scale flows with complex dynamics.
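The low-density/high-density model switch can be sketched as a per-cell decision: bucket agents into grid cells and pick the simulation model by local density. The grid size, threshold and model labels are illustrative assumptions, not the paper's actual criteria.

```python
from collections import defaultdict

DENSITY_THRESHOLD = 3   # agents per cell above which the continuum model is used

def assign_models(agents, cell_size=1.0):
    # Bucket agent positions into grid cells, then pick a model per cell:
    # sparse cells get the agent-based (microscopic) model, dense cells
    # the continuum (macroscopic) flow model.
    cells = defaultdict(list)
    for x, y in agents:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    return {cell: ("continuum" if len(pts) > DENSITY_THRESHOLD else "agent-based")
            for cell, pts in cells.items()}

agents = [(0.1, 0.2), (0.4, 0.6), (0.5, 0.5), (0.9, 0.1),   # 4 agents in cell (0, 0)
          (2.2, 2.3)]                                        # 1 agent in cell (2, 2)
models = assign_models(agents)
```

A full simulator would re-evaluate this assignment each timestep and exchange boundary conditions between neighboring cells running different models.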

Author 1: Chighoub Rabiaa
Author 2: Cherif Foudil

Keywords: crowd behavior; micro-scale representation; multi-layered framework; real time simulation

PDF

Paper 9: Data Mining and Intrusion Detection Systems

Abstract: The rapid evolution of technology and the increased connectivity among its components impose new cyber-security challenges. To tackle this growing trend in computer attacks and respond to threats, industry professionals and academics are joining forces to build Intrusion Detection Systems (IDS) that combine high accuracy with low complexity and time efficiency. The present article gives an overview of existing Intrusion Detection Systems (IDS) along with their main principles. It also discusses whether data mining and its core feature, knowledge discovery, can help create data-mining-based IDSs that achieve higher accuracy on novel types of intrusion and demonstrate more robust behaviour than traditional IDSs.

Author 1: Zibusiso Dewa
Author 2: Leandros A. Maglaras

Keywords: Intrusion Detection; NSL-KDD; Machine Learning; Datasets; Classifiers; Feature Selection; Waikato Environment for Knowledge Analysis; Anomaly detection; Misuse detection; Data mining

PDF

Paper 10: Dynamic Crypto Algorithm for Real-Time Applications DCA-RTA, Key Shifting

Abstract: The need for fast and attack-resistant crypto algorithms is a challenging issue in the era of the revolution in information and communication technologies. The authors' previous work, "Dynamic Crypto Algorithm for Real-Time Applications DCA_RTA", still needs further enhancement to bring the DCA_RTA to an acceptable security level. In this work, the author improves the generation of the Transformation Table (TT) from the Initial Table (IT), which affects the overall encryption/decryption process. The new TT generation is shown to be less correlated with the IT than the previous TT generation processes. The simulation results indicate more randomness in the TT, which means a more attack-resistant algorithm. Room for further algorithm enhancements remains.

Author 1: Ahmad H. Al-Omari

Keywords: Dynamic crypto algorithm; real time applications; shared key generation; symmetric key encryption

PDF

Paper 11: Human Object Tracking in Nonsubsampled Contourlet Domain

Abstract: Intelligent systems are becoming more important in daily life, and tracking moving objects is one of their tasks. This paper proposes an algorithm to track objects in the street. The proposed method uses the amplitude of the Zernike moment in the nonsubsampled contourlet transform domain to track objects based on context awareness. The algorithm successfully handles cases such as detecting new objects, detecting obscured objects after they reappear, and detecting and tracking objects that intertwine and then separate again. The proposed method was tested on standard large datasets, including the PEST, CAVIAR and SUN datasets. The author compared the results with other recent methods, and the proposed method performed well in the experiments relative to them.

Author 1: Nguyen Thanh Binh

Keywords: object tracking; zernike moment; nonsubsampled contourlet transform; context awareness; extracting features

PDF

Paper 12: Metrics for Event Driven Software

Abstract: The evaluation of a Graphical User Interface (GUI) plays a significant role in improving its quality, yet very few metrics exist for GUI evaluation. The purpose of metrics is to obtain better measurements in terms of risk management, reliability forecasting, project scheduling and cost repression. In this paper, a structural complexity metric is proposed for evaluating Graphical User Interfaces, with structural complexity treated as an indicator of complexity and used to measure GUI testability. An assessment process based on types of events is designed for GUI evaluation and for calculating structural complexity. A fuzzy model is developed that takes five types of events as input and returns the structural complexity of the GUI as output. A relationship is then established between structural complexity and the testability of event-driven software. The proposed model is evaluated on four different applications; the results show that the higher the complexity, the lower the testability of the application.

Author 1: Neha Chaudhary
Author 2: O.P. Sangwan

Keywords: Graphical User Interface; Structural Complexity; Testability; Fuzzy model

PDF

Paper 13: A Novel Adaptive Grey Verhulst Model for Network Security Situation Prediction

Abstract: Recently, researchers have shown increased interest in predicting the incoming security situation of an organization's network. Many prediction models have been produced for this purpose, but many of them have various limitations in practical applications. In addition, the literature shows that far too little attention has been paid to using the grey Verhulst model to predict the network security situation, although it has demonstrated satisfactory results in other fields. Considering the nature of intrusion attacks and the shortcomings of the traditional grey Verhulst model, this paper puts forward an adaptive grey Verhulst model with an adjustable generation sequence to improve prediction accuracy. The proposed model combines the Trapezoidal rule and Simpson's 1/3rd rule to obtain the background value in the grey differential equation, which directly influences the forecast result. To verify the performance of the proposed model, the benchmark DARPA 1999 and 2000 datasets were used to highlight its efficacy. The results show that the proposed adaptive grey Verhulst model surpasses GM(1,1) and the traditional grey Verhulst model in forecasting the incoming security situation in a network.
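The background-value construction can be illustrated as follows. The discrete Simpson form over three consecutive accumulated points and the mixing weight `w` are one plausible reading, assumed for illustration rather than taken from the paper's calibrated combination.

```python
def ago(x0):
    # First-order accumulated generating operation (1-AGO) of a grey series
    out, s = [], 0.0
    for v in x0:
        s += v
        out.append(s)
    return out

def background_trapezoid(x1, k):
    # Classic grey background value: trapezoidal rule over [k-1, k]
    return 0.5 * (x1[k - 1] + x1[k])

def background_simpson(x1, k):
    # Simpson's 1/3 rule over [k-2, k] (one plausible discrete reading)
    return (x1[k - 2] + 4.0 * x1[k - 1] + x1[k]) / 6.0

def background_combined(x1, k, w=0.5):
    # Weighted mix of the two rules; w is a tunable illustration weight,
    # not the paper's value. Fall back to the trapezoid near the start.
    if k < 2:
        return background_trapezoid(x1, k)
    return w * background_trapezoid(x1, k) + (1 - w) * background_simpson(x1, k)

x0 = [10, 18, 30, 42, 50, 54]          # logistic-shaped raw series
x1 = ago(x0)
z = [background_combined(x1, k) for k in range(1, len(x1))]
```

These background values z(k) would then enter the grey Verhulst least-squares estimation of the model parameters, which the sketch omits.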

Author 1: Yu-Beng Leau
Author 2: Selvakumar Manickam

Keywords: Grey Theory; Network Security Situation Prediction; Adaptive Grey Verhulst Model; Adjustable Generation Sequence; Prediction Accuracy

PDF

Paper 14: Adaptive Neuro-Fuzzy Inference Systems for Modeling Greenhouse Climate

Abstract: The objective of this work was to model the nonlinear, time-variant, multi-input multi-output internal climate of a greenhouse for tomato seedlings. Artificial intelligence approaches, including neural networks and fuzzy inference, have been widely used to model expert behavior. In this paper we propose Adaptive Neuro-Fuzzy Inference Systems (ANFIS) as a methodology for synthesizing a robust greenhouse climate model that predicts air temperature, air humidity, CO2 concentration and internal radiation during seedling growth. A set of ten input meteorological and control-actuator parameters with a major impact on the greenhouse climate was chosen to represent the growing process of tomato plants. We discuss the construction of an ANFIS system that provides a linguistic model for estimating the greenhouse climate from the meteorological data and control actuators during 48 days of seedling growth, embedded in the trained neural network and optimized using back propagation and the least squares algorithm with 500 iterations. The simulation results show the efficiency of the proposed model.

Author 1: Charaf eddine LACHOURI
Author 2: Khaled MANSOURI
Author 3: Mohamed mourad LAFIFI
Author 4: Aissa BELMEGUENAI

Keywords: Greenhouse climate; Modeling; ANFIS; Neuro-Fuzzy

PDF

Paper 15: Face Behavior Recognition Through Support Vector Machines

Abstract: Communication between computers and humans has grown into a major field of research. Facial behavior recognition through computer algorithms is a motivating and difficult research area for establishing emotional interaction between humans and computers. Although researchers have suggested numerous methods of emotion recognition in the literature, these works have mainly relied on a single facial database for assessing their systems, which may diminish generalization and shrink the scope for comparison. A technique is proposed for recognizing emotional expressions conveyed by facial aspects of still images, using Support Vector Machines (SVM) as the emotion classifier. Substantive problems are considered, such as diversity among facial databases, the samples included in each database, the number of facial expressions, an accurate method of extracting facial features, and the variety of structural models. After many experiments and comparison of the results of different models, it is determined that this approach produces high recognition rates.
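A tiny linear SVM trained by sub-gradient descent on the hinge loss illustrates the classifier idea. A real facial-expression system would use a library SVM (often with a kernel) over extracted facial features; the 2-D data, class labels and hyperparameters here are invented stand-ins.

```python
def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    # Minimal linear SVM via sub-gradient descent on the regularized hinge
    # loss. y must be +1/-1; X rows are feature vectors (toy stand-ins for
    # extracted facial features).
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:                      # inside margin: hinge active
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:                               # only the regularizer acts
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy two-class data: e.g. "smile" (+1) vs "neutral" (-1) feature vectors
X = [[2.0, 1.5], [1.8, 2.2], [2.5, 2.0], [-1.0, -1.2], [-2.0, -0.5], [-1.5, -2.0]]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
```

Multi-class emotion recognition would wrap such binary classifiers in a one-vs-rest or one-vs-one scheme.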

Author 1: Haval A. Ahmed
Author 2: Tarik A. Rashid
Author 3: Ahmed T. Sadiq

Keywords: Facial Behavior Recognition; Support Vector Machine; Human Computer Interaction

PDF

Paper 16: VoIP Forensic Analyzer

Abstract: People have been adopting Voice over Internet Protocol (VoIP) in most conventional communication facilities, which has helped enormously in reducing operating costs and in promoting next-generation IP-based communication services. As an intimidating upshot, cyber criminals have correspondingly started to infiltrate the environment, creating new challenges for law enforcement in any country. This paper presents a framework for the forensic analysis of VoIP traffic over the network. This forensic activity includes spotting and scrutinizing the network patterns of the VoIP-SIP stream, which is used to initiate a session for the communication, and regenerating the content of the VoIP-RTP stream, which is employed to convey the data. The proposed network forensic investigation framework also focuses on developing an efficient packet-restructuring algorithm for tracing the malicious users involved in a conversation. Network forensics is the basis of the proposed work, which performs packet-level surveillance of VoIP followed by reconstruction of the original malicious content or network session between users for their prosecution in court.
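The packet-restructuring idea can be sketched as reordering captured RTP packets by sequence number before reassembling the payload. The `(seq, payload)` tuple format is a simplification of parsed RTP headers, and the 16-bit wraparound handling below is a standard heuristic, not necessarily the paper's exact algorithm.

```python
def reorder_rtp(packets):
    # Reorder captured RTP packets by sequence number, handling 16-bit
    # wraparound. Each packet is a (seq, payload) pair. Sequence numbers
    # within +/-32768 of the first captured packet are mapped onto a line
    # centred on it, so a stream that wraps 65535 -> 0 still sorts correctly
    # (assuming the first captured packet is near the start of the window).
    if not packets:
        return []
    base = packets[0][0]
    return sorted(packets, key=lambda p: (p[0] - base + 32768) % 65536)

# Out-of-order capture that also wraps the 16-bit sequence counter
captured = [(65535, b"lo"), (1, b"ld"), (65534, b"hel"), (0, b" wor")]
stream = b"".join(payload for _, payload in reorder_rtp(captured))
```

After reordering, the concatenated payloads would be handed to a codec decoder to regenerate the audio of the session.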

Author 1: M Mohemmed Sha
Author 2: Manesh T
Author 3: Saied M. Abd El-atty

Keywords: Forensics; Packet Reordering; Session Initiation; Real Time Transfer

PDF

Paper 17: Comparing the Usability of M-Business and M-Government Software in Saudi Arabia

Abstract: This study presents a usability assessment of mobile presence in the Kingdom of Saudi Arabia (KSA), with a particular focus on the variance between M-business and M-government presence. A general hypothesis was developed that M-business software is more usable than M-government software, with eleven sub-hypotheses derived from Nielsen's heuristics method. To examine the hypotheses, a representative sample of thirty-six (n=36) mobile software applications in Saudi Arabia was identified from prior research, representing two main categories: M-business and M-government. Within each category, eighteen (n=18) mobile software applications were carefully chosen for further evaluation, representing a wide variety of sectors. A questionnaire was devised based on Nielsen's heuristics method and tailored to the context at hand (mobile computing) to establish a usability checklist consisting of eleven constructs. A group of thirty-six (n=36) participants was recruited to complete the usability assessment by examining each software application against the usability checklist and rating each item on a Likert scale. The results reveal that mobile interactions in KSA were, in general, of an acceptable design quality with respect to usability. The average percentage score for all heuristics met by the evaluated mobile software applications was 68.6%, which reflects how well usability practices in mobile presence were implemented. The scores for all usability components exceeded 60%, with five components below the average score (of 68.6%) and six components above it. The variance between M-business and M-government software usability was significant, particularly in favor of M-business. The general hypothesis was accepted along with seven sub-hypotheses, while only four sub-hypotheses were rejected.

Author 1: Mutlaq B. Alotaibi

Keywords: Usability; interaction; heuristics; interface; mobile; Saudi Arabia

PDF

Paper 18: Detection of Malware and Malicious Executables Using E-Birch Algorithm

Abstract: Malware detection is one of the challenges of the modern computing world. Web mining is the subset of data mining used to provide solutions for complex problems, and web intelligence is a new hope for computer science in bringing solutions to malware detection. Web mining is the method by which web intelligence makes the web an intelligent tool to combat malware and phishing websites. Generally, malware is injected through websites into the user's system, where it modifies executable files and paralyzes the whole activity of the system. Antivirus applications utilize data mining techniques to find malware on the web, but a heuristic approach is needed to solve the malware problem. Dynamic analysis methods yield better results than static methods, and data mining is the best option for the dynamic analysis of malware or malicious programs. The purpose of this research is to apply an enhanced Birch algorithm to find malware and modified executables on the Windows and Android operating systems.
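The core of BIRCH, absorbing points into clustering features (CF triples) under a radius threshold, can be sketched as below. This is plain single-pass BIRCH rather than the paper's enhanced E-Birch, and the 2-D "executable features" are invented for illustration.

```python
import math

class CF:
    # BIRCH clustering feature: (N, linear sum, squared sum) of one subcluster
    def __init__(self, point):
        self.n = 1
        self.ls = list(point)
        self.ss = sum(v * v for v in point)

    def centroid(self):
        return [v / self.n for v in self.ls]

    def radius_if_added(self, point):
        # Radius of the subcluster if `point` were absorbed, computable from
        # the CF triple alone -- the key BIRCH trick
        n = self.n + 1
        ls = [a + b for a, b in zip(self.ls, point)]
        ss = self.ss + sum(v * v for v in point)
        c = [v / n for v in ls]
        return math.sqrt(max(ss / n - sum(v * v for v in c), 0.0))

    def absorb(self, point):
        self.n += 1
        self.ls = [a + b for a, b in zip(self.ls, point)]
        self.ss += sum(v * v for v in point)

def birch_pass(points, threshold=1.0):
    # Single-pass CF construction: absorb each point into the nearest CF if
    # the resulting radius stays under the threshold, else start a new CF.
    cfs = []
    for p in points:
        best = min(cfs, default=None,
                   key=lambda cf: sum((a - b) ** 2 for a, b in zip(cf.centroid(), p)))
        if best is not None and best.radius_if_added(p) <= threshold:
            best.absorb(p)
        else:
            cfs.append(CF(p))
    return cfs

# Toy 2-D "feature vectors" for executables: two well-separated behaviour groups
pts = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (5.0, 5.0), (5.2, 4.9), (4.9, 5.1)]
cfs = birch_pass(pts)
```

Full BIRCH organizes these CFs in a height-balanced CF-tree and optionally refines the leaf clusters with a global clustering pass; a malware detector would then flag clusters whose centroids sit near known-malicious behaviour profiles.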

Author 1: Dr. Ashit Kumar Dutta

Keywords: Birch; Malware; Executables; Android and Windows

PDF

Paper 19: Formalization of Learning Patterns Through SNKA

Abstract: The learning patterns found in the learner community are steadily moving toward the digitalized world. Learning patterns arise from acquiring and sharing knowledge, and knowledge-sharing tools such as Facebook, LinkedIn and weblogs are increasingly dominating traditional means of learning. Since knowledge patterns acquired through unstructured web data are insecure, they lead to poor decision making, or decision making without a root cause. These acquired patterns are also shared with others, which indirectly affects the trust between users. In this paper, in order to streamline knowledge acquisition patterns and their sharing, a new framework called Social Networking based Knowledge Acquisition (SNKA) is defined to formalize the observed data, and the Dynamic Itemset Count (DIC) algorithm is applied to predict users' usage of web content before and after the knowledge is acquired. Finally, a rough idea for building a supporting tool is suggested.

Author 1: Mr Rajesh D
Author 2: Dr. K.David

Keywords: Data Acquiring methods; Learning Patterns; Knowledge Management; Data Mining Tools

PDF

Paper 20: Comprehensive Study and Comparison of Information Retrieval Indexing Techniques

Abstract: This research compares the indexing techniques used in current information retrieval processes: inverted files, suffix trees and signature files. Each technique is critically described and discussed, along with the differences in their use. The performance and stability of each indexing technique are studied and compared with the rest. The paper also aims to show the role that indexing plays in the process of retrieving information. While it is primarily a comparison of the three indexing techniques, the details arising from the detailed comparison will also deepen understanding of them.

Author 1: Zohair Malki

Keywords: Information Retrieval; Indexing Techniques; Inverted Files; Suffix Trees; Signature Files

PDF

Paper 21: Writing Kurdish Alphabetics in Java Programming Language

Abstract: Nowadays, Kurdish programmers often struggle to write Kurdish letters while programming in Java; no version of the Java Development Kit has supported Kurdish letters. Therefore, the aim of this study is to develop the Java Kurdish Language Package (JKLP) to solve writing Kurdish alphabetics in the Java programming language, so that Kurdish programmers and students can convert English alphabetics to Kurdish alphabetics, and to add Kurdish language support to the standard Java Development Kit (JDK). Additionally, this paper presents the JKLP standard documentation for users. Our object-oriented solution consists of a package of two classes implemented in the Java programming language.

Author 1: Rebwar Mala Nabi
Author 2: Sardasht M-Raouf Mahmood
Author 3: Mohammed Qadir Kheder
Author 4: Shadman Mahmood

Keywords: Java; Arabic Scripts; Java language support; Java issues; Kurdish Language

PDF

Paper 22: Modeling of Compensation in Long-Running Transactions

Abstract: Nowadays, one of the most controversial issues is transactions in database systems and web services, specifically in the area of service-oriented computing, where business transactions often need long periods of time to finish. In the case of a failure, rollback, the traditional method, is neither sufficient nor suitable for handling errors during long-running transactions. Instead, the most appropriate approach is compensation, used as an error-recovery mechanism; transactions that need a long time to complete are therefore programmed as a composition of a set of compensable transactions. This study designs several compensation policies for long-running web transactions, especially when the transaction has parallel threads and one thread in a sequence of steps may fail. The paper describes and models several different ways to compensate such a thread. Moreover, this study proposes a system for creating long-running transactions and simulating failures using the compensation policies.

Author 1: Rebwar Mala Nabi
Author 2: Sardasht M-Raouf Mahmood
Author 3: Rebaz Mala Nabi
Author 4: Rania Azad Mohammed

Keywords: transaction; compensation; long-running transaction and interruption

PDF

Paper 23: A Survey on Digital Watermarking and its Application

Abstract: Digital communication plays a vital role in the world of the Internet as well as in communication technology, and the secrecy of communication is an essential part of passing data or information. One noticeable protection technique is digital watermarking. Copyright owners seek methods to control and detect unauthorized reproduction, and hence research on digital product copyright protection has significant practical value for e-commerce and e-governance. This paper presents a survey of previous work in the watermarking field; experimentally evaluated algorithms are collected to cover the wide scope of encrypted digital watermarking for data transmission security and authentication.

Author 1: Ms. Mahua Pal

Keywords: Watermarking; Watermarking technique; DCT; DWT; LWM; DFRNT; PSNR

PDF

Paper 24: Database-as-a-Service for Big Data: An Overview

Abstract: The last two decades were marked by exponential growth in the volume of data originating from various sources, from mobile phones to social media content, through the multitude of Internet of Things devices. This flow of data cannot be managed using classical approaches and has led to the emergence of a new buzzword: Big Data. Among the research challenges related to Big Data is the issue of data storage: traditional relational database systems have proved unable to efficiently manage Big Data datasets. In this context, Cloud Computing plays a relevant role, as it offers interesting models to deal with Big Data storage, especially the model known as Database as a Service (DBaaS). We propose, in this article, a review of database solutions offered as DBaaS and discuss their suitability for Big Data applications.

Author 1: Manar Abourezq
Author 2: Abdellah Idrissi

Keywords: Cloud Computing; Big Data; Database as a Service

PDF

Paper 25: Analysis of the SNR Estimator for Speech Enhancement Using a Cascaded Linear Model

Abstract: Speech enhancement is the removal of contaminating noise to improve the overall quality of a speech signal. To combine the advantages of individual algorithms, we propose a new linear model in the form of cascaded adaptive filters for the suppression of non-stationary noise. We deploy NLMS (Normalized Least Mean Square), Sign LMS (Least Mean Square) and RLS (Recursive Least Square) as the main de-noising algorithms, and we demonstrate that no prior information about the noise is required; such information would in any case be difficult to estimate for fast-varying noise in a non-stationary environment. The approach estimates clean speech by recognizing long segments of the clean speech as one whole unit. In the experiments, an in-house database containing various types of non-stationary noise was used, and the proposed model improved on conventional algorithms in both objective and subjective evaluations. Simulations of the new linear model show good results compared with the individual algorithms.

Author 1: Harjeet Kaur
Author 2: Rajneesh Talwar

Keywords: Least Mean Square (LMS); Normalized Least Mean Square (NLMS); Recursive Least Square (RLS); Speech Enhancement; Non-stationary

PDF

Paper 26: A Game Theoretic Framework for E-Mail Detection and Forgery Analysis

Abstract: In email forensics, the conflict between email detection and forgery is an interdependent strategy selection process, and there exist complex dynamics between the detector and the forger, who have conflicting objectives and influence each other's performance and decisions. This paper studies those dynamics from the perspective of game theory. We first analyze the basic email structure and header information, then discuss email detection and forgery technologies. We propose a Detection-Forgery Game (DFG) model and classify the players' strategies by Operation Complexity (OC). In the DFG model, we regard the interactions between the detector and the forger as a two-player, non-cooperative, non-zero-sum, finite strategic game and formulate its Nash Equilibrium. Using the model, the optimal detection and forgery strategies, minimizing cost and maximizing reward, can be found. Finally, we perform empirical experiments to verify the effectiveness and feasibility of the model.

Author 1: Long Chen
Author 2: Yuan Lou
Author 3: Min Xiao
Author 4: Zhen-Xing Dong

Keywords: email detection; email forgery; game theoretic model; Nash Equilibrium; the optimal strategy

PDF
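For readers unfamiliar with the solution concept used in Paper 26, a pure-strategy Nash equilibrium of a two-player bimatrix game (the structure the DFG model uses) is a strategy pair where each player is a best response to the other. The payoff numbers below are hypothetical, not taken from the paper:

```python
import itertools

def pure_nash(pay_det, pay_forg):
    """Return all pure-strategy Nash equilibria (i, j) of a bimatrix game.

    pay_det[i][j]  : detector's payoff when it plays i and the forger plays j
    pay_forg[i][j] : forger's payoff for the same strategy pair
    """
    m, n = len(pay_det), len(pay_det[0])
    eqs = []
    for i, j in itertools.product(range(m), range(n)):
        det_best = all(pay_det[i][j] >= pay_det[k][j] for k in range(m))
        forg_best = all(pay_forg[i][j] >= pay_forg[i][l] for l in range(n))
        if det_best and forg_best:      # neither player gains by deviating
            eqs.append((i, j))
    return eqs

# Hypothetical 2x2 game: rows = detector {light scan, deep scan},
# columns = forger {simple forgery, elaborate forgery}.
PAY_DET = [[3, 0],
           [2, 1]]
PAY_FORG = [[1, 2],
            [0, 3]]
```

Here the unique pure equilibrium is (deep scan, elaborate forgery); a non-zero-sum game like the DFG may instead have only mixed equilibria, which require solving for probability distributions over strategies.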

Paper 27: Review of Energy Reduction Techniques for Green Cloud Computing

Abstract: The growth of cloud computing has led to uneconomical energy consumption in data processing, storage, and communications. This is unfriendly to the environment, because of the carbon emissions. Therefore, green IT is required to save the environment. The green cloud computing (GCC) approach is part of green IT; it aims to reduce the carbon footprint of datacenters by reducing their energy consumption. The GCC is a broad and exciting field for research. A plethora of research has emerged aiming to support the GCC vision by improving the utilization of computing resources from different aspects, such as: software optimization, hardware optimization, and network optimization techniques. This paper overviews the approaches to GCC and classifies them. Such a classification assists in comparisons between GCC approaches by identifying the key implementation approaches and the issues related to each.

Author 1: Shaden M. AlIsmail
Author 2: Heba A. Kurdi

Keywords: Cloud Computing; Green Computing; Energy Efficiency; Power Management; Virtualization

PDF

Paper 28: A Discrete Particle Swarm Optimization to Estimate Parameters in Vision Tasks

Abstract: Manufacturers increasingly demand powerful vision systems for quality control. Good outcomes require effort in tuning the vision system, both hardware and software. As time and accuracy are important, practitioners seek to automate parameter adjustment, at least in image processing. This paper suggests an approach based on discrete particle swarm optimization (DPSO) that automates software setting and provides optimal parameters for industrial vision applications. Novel update functions for our DPSO formulation are suggested. The proposed method is applied to real examples of quality control to validate its feasibility and efficiency, showing that the new DPSO model furnishes promising results.

Author 1: Benchikhi Loubna
Author 2: Sadgal Mohamed
Author 3: Elfazziki Abdelaziz
Author 4: Mansouri Fatimaezzahra

Keywords: industrial vision; image processing; optimization; DPSO; quality control

PDF

Paper 29: Improving Image Encryption Using 3D Cat Map and Turing Machine

Abstract: Security of data is of prime importance, and security is a complex and vast topic. One common way to protect digital data from unauthorized eavesdropping is encryption. This paper introduces an improved image encryption technique based on a chaotic 3D cat map and a Turing machine in the form of a dynamic random growth technique. The algorithm consists of two main sections: the first performs a preprocessing operation that shuffles the image using the 3D chaotic map in the form of a dynamic random growth technique; the second uses a Turing machine, simultaneously with the shuffling of pixel locations, to diffuse pixel values using a random key generated by the chaotic 3D cat map. The hybrid combination of a 3D chaotic system and a Turing machine strengthens the encryption performance and enlarges the key space required to resist brute-force attacks. The main advantages of the technique are its simplicity and efficiency, and these cryptographic properties make it secure enough for use in image transmission systems.

Author 1: Nehal A. Mohamed
Author 2: Mostafa A. El-Azeim
Author 3: Alaa Zaghloul

Keywords: chaotic 3D cat map; brute force attacks; Dynamic random growth technique; Turing machine; key space

PDF
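As an illustration of the chaotic shuffling stage described in Paper 29, the classical 2D Arnold cat map (used here as a lower-dimensional stand-in for the paper's 3D map) permutes pixel positions of a square image:

```python
import numpy as np

def cat_map(img, iterations=1):
    """Shuffle pixel positions of a square image with the 2D Arnold cat map
    (x, y) -> ((x + y) mod N, (x + 2y) mod N).

    This is purely a confusion step: pixel values are moved, never changed.
    Illustrative 2D variant; the paper uses a 3D cat map."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out
```

Because the map is a bijection on the pixel grid, shuffling preserves the histogram and is invertible; that is why the scheme pairs it with a diffusion step (here, Turing-machine driven) so that pixel values change as well.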

Paper 30: Performance Analysis of CPU Scheduling Algorithms with Novel OMDRRS Algorithm

Abstract: CPU scheduling is one of the most fundamental parts of any operating system. It prioritizes processes to execute user requests efficiently and helps in choosing the appropriate process for execution. Round Robin (RR) and Priority Scheduling (PS) are among the most widely used and accepted CPU scheduling algorithms, but their performance degrades with respect to turnaround time, waiting time and context switching with each recurrence. A new scheduling algorithm, OMDRRS, is developed to improve on the RR and priority scheduling algorithms. The new algorithm performs better than the popular existing algorithms, with drastic improvements in waiting time, turnaround time, response time and context switching. A comparative analysis of turnaround time (TAT), waiting time (WT) and response time (RT) is presented with the help of ANOVA and t-tests.

Author 1: Neetu Goel
Author 2: Dr. R. B. Garg

Keywords: Round Robin; Turn-around time; Waiting Time; t-test; Anova test

PDF
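The Round Robin baseline that Paper 30 measures against can be simulated in a few lines to compute the turnaround and waiting times used in such comparisons. The burst times and quantum below are hypothetical, and all processes are assumed to arrive at time 0:

```python
from collections import deque

def round_robin(burst, quantum):
    """Simulate Round Robin scheduling; returns (waiting, turnaround) per
    process.  All processes are assumed to arrive at time 0."""
    remaining = list(burst)
    finish = [0] * len(burst)
    t = 0
    q = deque(range(len(burst)))
    while q:
        i = q.popleft()
        run = min(quantum, remaining[i])   # run one quantum (or less, to finish)
        t += run
        remaining[i] -= run
        if remaining[i] > 0:
            q.append(i)                    # preempted: back of the queue
        else:
            finish[i] = t
    turnaround = finish                    # arrival time is 0 for everyone
    waiting = [ta - b for ta, b in zip(turnaround, burst)]
    return waiting, turnaround
```

For bursts [5, 3, 1] and quantum 2, the Gantt order is P0 P1 P2 P0 P1 P0; an improved scheduler like OMDRRS would be evaluated by recomputing these same metrics under its ordering rules.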

Paper 31: Assessment and Comparison of Fuzzy Based Test Suite Prioritization Method for GUI Based Software

Abstract: The testing of event-driven software plays a significant role in improving overall software quality. Due to the event-driven nature of GUI-based software, many test cases are generated, and it is difficult to identify the test cases whose fault-revealing capability is high. Test suite prioritization is performed to identify those test cases, and various prioritization methods for GUI-based software exist in the literature; such methods improve the rate of fault detection. In our previous work we proposed a fuzzy model for test suite prioritization of GUI-based software, in which priority is assigned on the basis of multiple factors: the type of event, event interaction, and parameter-value interaction coverage-based criteria. Using this method, a test suite is organized in descending order of effectiveness. In this paper we evaluate the proposed fuzzy model and compare its results with other prioritization methods.

Author 1: Neha Chaudhary
Author 2: O.P. Sangwan

Keywords: Test suite prioritization; Fuzzy Model; Comparison of prioritization methods

PDF

Paper 32: Bag of Features Model Using the New Approaches: A Comprehensive Study

Abstract: The major challenge in content-based image retrieval is the semantic gap. Images are described mainly on the basis of their numerical information, while users are more interested in their semantic content, and it is difficult to find a correspondence between the two. The bag of features (BoF) model is an efficient image representation technique for image classification. However, it has some limitations, for instance the information loss during encoding, an important step of BoF. This is because encoding is usually done by hard assignment, i.e., in vector quantization each feature is encoded by being assigned to a single visual word. Another notorious disadvantage of BoF is that it ignores the spatial relationships among the patches, which are very important in image representation. To address these limitations and enhance the results, novel approaches have been proposed at each level of the BoF pipeline: the combination of local and global descriptors for a better description, soft-assignment encoding with spatial pyramid partitioning for a more informative image representation, and maximum pooling to obtain the final descriptors. Our work aims to give a detailed account of BoF, covering all levels of the pipeline, as a support for better comprehension of the approach. We also compare and evaluate the state-of-the-art approaches and examine how these changes at each level of the pipeline affect performance and the overall classification results.

Author 1: CHOUGRAD Hiba
Author 2: ZOUAKI Hamid
Author 3: ALHEYANE Omar

Keywords: Bag of features; Image classification; Local and global descriptors; Locality-constrained Linear Coding; Spatial pyramid; Pooling

PDF
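The hard- versus soft-assignment distinction at the encoding step of the BoF pipeline can be made concrete with a small sketch. The toy 2-D descriptors and two-word codebook below are illustrative only, not the survey's experimental setup:

```python
import numpy as np

def encode(descriptors, codebook, soft=False, sigma=1.0):
    """Build a normalized BoF histogram from local descriptors.

    Hard assignment: each descriptor votes only for its nearest visual word
    (the lossy quantization the survey criticizes).
    Soft assignment: Gaussian-weighted votes spread over all words."""
    # pairwise distances, shape (n_descriptors, n_words)
    dist = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    if soft:
        w = np.exp(-dist ** 2 / (2 * sigma ** 2))
        w /= w.sum(axis=1, keepdims=True)       # each descriptor's votes sum to 1
        hist = w.sum(axis=0)
    else:
        hist = np.bincount(dist.argmin(axis=1),
                           minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```

A spatial pyramid extension would compute one such histogram per image cell and concatenate them, restoring some of the spatial layout that plain BoF discards.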

Paper 33: Maximally Distant Codes Allocation Using Chemical Reaction Optimization with Enhanced Exploration

Abstract: Error correcting codes, also known as error controlling codes, are sets of codes with redundancy that provide for error detection and correction, enabling fault-tolerant operations such as data transmission over noisy channels or data retention on storage media with possible physical defects. The challenge is to find a set of m codes out of the 2^n available n-bit combinations such that the aggregate Hamming distance among those codewords and/or the minimum distance is maximized. Due to the prohibitively large solution spaces of practically sized problems, greedy algorithms are used to generate quick and dirty solutions. However, modern evolutionary search techniques like genetic algorithms, particle swarms, gravitational search, and others offer more feasible approaches, yielding near-optimal solutions in exchange for some computational time. Chemical Reaction Optimization (CRO), which is inspired by molecular reactions toward a minimal energy state, emerged recently as an efficient optimization technique. However, like the other techniques, its internal dynamics are hard to steer toward convergence, yielding poor performance in many situations. In this research, we propose an enhanced exploration strategy to overcome this problem and compare it with the standard threshold-based exploration strategy in solving the maximally distant codes allocation problem. Test results show that the enhancement provides better performance on most metrics.

Author 1: Taisir Eldos
Author 2: Abdallah Khreishah

Keywords: Evolutionary Algorithms; Chemical Reaction Optimization; Maximally Distant Codes; Binary Knapsack Problem; Fault Tolerance

PDF
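The greedy baseline the abstract of Paper 33 alludes to (pick each next codeword to maximize its minimum Hamming distance to the words already chosen) can be sketched directly; the parameters n and m below are small illustrative values:

```python
def hamming(a, b):
    """Hamming distance between two codewords given as integers."""
    return bin(a ^ b).count("1")

def greedy_codes(n, m):
    """Greedily allocate m codewords out of the 2**n available n-bit
    combinations, each new word maximizing its minimum distance to the
    words already chosen: the quick baseline that evolutionary methods
    such as CRO try to beat."""
    chosen = [0]                                  # seed with the all-zeros word
    while len(chosen) < m:
        best = max(range(2 ** n),
                   key=lambda c: min(hamming(c, s) for s in chosen))
        chosen.append(best)
    return chosen
```

For n = 4, m = 4 this greedy pass reaches a minimum pairwise distance of 2; a metaheuristic searches the full solution space for sets with better aggregate and minimum distances, at higher computational cost.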

Paper 34: Enhancement Bag-of-Words Model for Solving the Challenges of Sentiment Analysis

Abstract: Sentiment analysis is a branch of natural language processing and machine learning, and it has become one of the most important inputs to decision making: it can extract, identify and evaluate sentiment from online reviews. Although the Bag-of-Words model is the most widely used technique for sentiment analysis, it has two major weaknesses: it relies on a manually evaluated lexicon to score words, and its accuracy is low because it neglects the grammatical effects and the semantics of words. In this paper, we propose a new technique to evaluate online sentiment within one topic domain and provide solutions for some significant sentiment analysis challenges, improving the accuracy of the analysis performed. The proposed technique relies on an enhanced bag-of-words model that evaluates sentiment polarity and score automatically by using word weights instead of term frequency. The technique can also classify reviews based on the features and keywords of the scientific topic domain. The paper introduces solutions for essential sentiment analysis challenges suited to the review structure and examines the effects of the proposed enhancements in reaching higher accuracy.

Author 1: Doaa Mohey El-Din

Keywords: Sentiment analysis; Bag-Of-Words; sentiment analysis challenges; text analysis; Reviews

PDF
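The core idea in Paper 34 (word weights instead of raw term frequency, plus some grammar handling) can be sketched as follows. The tiny lexicon and the single negation rule are illustrative assumptions, not the paper's actual model:

```python
NEGATORS = {"not", "no", "never"}

def sentiment_score(review, weights):
    """Score a review by summing per-word polarity weights rather than
    counting term frequencies; a preceding negator flips the polarity,
    one grammar effect plain bag-of-words ignores."""
    tokens = review.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        w = weights.get(tok)
        if w is None:
            continue                      # word not in the lexicon
        if i > 0 and tokens[i - 1] in NEGATORS:
            w = -w                        # "not good" counts against
        score += w
    return score

# Hypothetical lexicon weights for one topic domain
LEXICON = {"good": 1.0, "excellent": 2.0, "bad": -1.5}
```

Weighting lets "excellent" count for more than "good", and the negation check is the kind of grammatical effect whose omission the abstract identifies as a weakness of plain BoW.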

Paper 35: QRS Detection Based on an Advanced Multilevel Algorithm

Abstract: This paper presents an advanced multilevel algorithm for QRS complex detection. The method comprises three levels: the first extracts the higher peaks using an adaptive thresholding technique, the second detects the QRS region, and the last detects the Q, R and S waves. The proposed algorithm shows interesting results compared to recently published methods. A perspective of this work is the implementation of the method on an embedded system for real-time ECG monitoring.

Author 1: Wissam Jenkal
Author 2: Rachid Latif
Author 3: Ahmed Toumanari
Author 4: Azzedine Dliou
Author 5: Oussama El B’charri
Author 6: Fadel Mrabih Rabou Maoulainine

Keywords: ECG Signal; QRS Complex; multilevel algorithm; thresholding technique

PDF
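The first level of a multilevel detector like the one in Paper 35 (adaptive-threshold peak extraction) can be sketched as below. The smoothing constants and refractory period are typical values assumed for illustration, not the paper's:

```python
import numpy as np

def detect_r_peaks(sig, fs, alpha=0.5, refractory=0.2):
    """Adaptive-threshold peak picking, the usual first stage of a
    multilevel QRS detector.

    The threshold tracks a fraction `alpha` of a running peak-amplitude
    estimate; `refractory` (seconds) suppresses double detections."""
    peaks = []
    spki = np.max(sig[: int(2 * fs)])          # initialize from the first 2 s
    thr = alpha * spki
    last = -np.inf
    for n in range(1, len(sig) - 1):
        is_local_max = sig[n] >= sig[n - 1] and sig[n] > sig[n + 1]
        if sig[n] > thr and is_local_max and n - last > refractory * fs:
            peaks.append(n)
            last = n
            spki = 0.875 * spki + 0.125 * sig[n]   # update running estimate
            thr = alpha * spki
    return peaks
```

Later levels would then window around each detected R peak to delimit the QRS region and locate the Q and S waves.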

Paper 36: FPGA Implementation of Adaptive Neuro-Fuzzy Inference Systems Controller for Greenhouse Climate

Abstract: This paper describes a Field-Programmable Gate Array (FPGA) implementation of an Adaptive Neuro-Fuzzy Inference System (ANFIS), written in Very High-Speed Integrated Circuit Hardware Description Language (VHDL), for controlling temperature and humidity inside a tomato greenhouse. The main advantages of the HDL approach are rapid prototyping and access to powerful synthesis tools through the VHDL code; the design is also suitable for implementation in an Application-Specific Integrated Circuit (ASIC) using tools such as Quartus II 8.1. A set of six input parameters, covering meteorological data and control actuators with a major impact on the greenhouse climate, was chosen to represent the growing process of tomato plants. We discuss the construction of an ANFIS that provides a linguistic model for estimating the greenhouse climate from the meteorological data and control actuators during 48 days of seedling growth, embedded in the trained neural network and optimized using backpropagation and the least-squares algorithm over 500 iterations. The simulation results show the efficiency of the implemented controller.

Author 1: Charaf eddine LACHOURI
Author 2: Khaled MANSOURI
Author 3: Aissa BELMEGUENAI
Author 4: Mohamed mourad LAFIFI

Keywords: Neuro-Fuzzy; ANFIS; VHDL; FPGA; Quartus; ASIC

PDF

Paper 37: MR Brain Real Images Segmentation Based Modalities Fusion and Estimation Et Maximization Approach

Abstract: With the development of image acquisition techniques, more data coming from different imaging sources has become available. Multi-modality image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single modality. The main aim of this work is to improve the segmentation of real cerebral MRI images by fusing modalities (T1, T2 and PD) using the Estimation and Maximization (EM) approach. The adopted approaches were evaluated using four criteria: standard deviation (STD), information entropy (IE), correlation coefficient (CC) and spatial frequency (SF). The experimental results on real MRI brain images show that the adopted fusion scenarios are more accurate and robust than the standard EM approach.

Author 1: ASSAS Ouarda

Keywords: Data fusion; Segmentation; Estimation and Maximization; MRI images

PDF

Paper 38: A Leveled DAG Critical Task First Schedule Algorithm in Distributed Computing Systems

Abstract: In a distributed computing environment, efficient task scheduling is essential to obtain high performance, and a vital goal in designing and developing task scheduling algorithms is to achieve a better makespan. Several task scheduling algorithms have been developed for homogeneous and heterogeneous distributed computing systems. In this paper, a new static task scheduling algorithm is proposed, namely Leveled DAG Critical Task First (LDCTF), which improves on the Leveled DAG Prioritized Task (LDPT) algorithm to efficiently schedule tasks on homogeneous distributed computing systems. LDPT was previously compared with the B-level algorithm, the best-known algorithm for homogeneous distributed systems, and provided better results. LDCTF is a list-based scheduling algorithm: tasks are sorted into a list according to their priority and then scheduled one by one on the most suitable processor. LDCTF aims to improve system performance by producing shorter schedule lengths than the LDPT and B-level algorithms.

Author 1: Amal EL-NATTAT
Author 2: Nirmeen A. El-Bahnasawy
Author 3: Ayman EL-SAYED

Keywords: Task scheduling; Homogeneous distributed computing systems; Precedence constrained parallel applications; Directed Acyclic Graph; Critical path

PDF

Paper 39: No-Reference Perceived Image Quality Algorithm for Demosaiced Images

Abstract: Visual image quality assessment (IQA) plays a key role in every multimedia application, as the end user is a human being. Real-time applications demand no-reference (NR) IQA, since a reference image is unavailable. Most perceived/visual NR-IQA algorithms developed to date target distortions like blur, ringing and blocking artifacts; very few are available for color distortions. Visible color distortions, such as false color and zipper, are produced in a demosaiced image by incorrect interpolation of missing color values. In this paper, state-of-the-art zipper and false color artifact quantification algorithms and general-purpose NR-IQA algorithms are evaluated for visual quality assessment of demosaiced images. Separate NR-IQA algorithms are proposed for zipper and false color artifact quantification, and their scores are combined to obtain a final quality score for the demosaiced image. The zipper algorithm quantifies the zipper artifact by searching for zipper pixels in the image, while the false color algorithm measures the correlation between the color planes of local high-frequency regions to quantify false color.

Author 1: Lamb Anupama Balbhimrao
Author 2: Madhuri Khambete

Keywords: Demosaicing; Correlation; False color; Image quality; Regression; Zipper

PDF

Paper 40: Dynamic Clustering for Information Retrieval from Big Data Depending on Compressed Files

Abstract: Rapid growth in databases has produced large amounts of data, and accessing that data to answer user queries remains a significant problem. In this paper a novel approach for aggregating the required data, called dynamic clustering, is proposed, and several retrieval methods are used for retrieval purposes. The dynamic clustering method builds clusters according to the user's entries (queries). It has been applied to compressed database files of different sizes using different queries. The compressed database files result from applying the ICM (Ideal Compression Method) and the best-performing compression algorithms (improved k-means, k-means with medium probability, and k-means with maximum gain ratio). The retrieval methods were applied to the original database file, the compressed file, and the cluster resulting from the dynamic clustering algorithm, and the results were compared.

Author 1: Dr. Alaa Kadhim F.
Author 2: Prof. Dr. Ghassan H. Abdul
Author 3: Rasha Subhi Ali

Keywords: dynamic clustering; data retrieval methods; compression algorithm; ICM system; improved k-means algorithm and modified improved k-means algorithms

PDF
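As a reference point for the improved variants named in Paper 40, plain k-means (here on 2-D points, with the first k points taken as initial centers for simplicity) looks like this; the paper's query-driven dynamic clustering and its k-means variants build on this baseline:

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points given as (x, y) tuples.

    Initial centers are simply the first k points; improved variants
    (e.g. medium probability, maximum gain ratio) refine such choices."""
    centers = [points[i] for i in range(k)]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            groups[i].append(p)
        # update step: move each center to its group's mean
        centers = [(sum(p[0] for p in g) / len(g),
                    sum(p[1] for p in g) / len(g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups
```

In the retrieval setting described above, the "points" would be records (or their compressed representations) and a cluster would be built around the attributes a query touches.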

Paper 41: A Fast Adaptive Artificial Neural Network Controller for Flexible Link Manipulators

Abstract: This paper describes a hybrid approach to the problem of controlling flexible-link manipulators in the dynamic phase of the trajectory. A flexible beam/arm is an appealing option for civil and military applications, such as space-based robot manipulators. However, flexibility brings with it unwanted oscillations and severe chattering, which may even lead to an unstable system. To tackle these challenges, a novel control architecture is presented. First, a neural network controller based on the robot's dynamic equation of motion is elaborated; its aim is to produce fast, stable control of joint position and velocity and to damp the vibration of each arm. Then, an adaptive Cerebellar Model Articulation Controller (CMAC) is implemented to compensate for unmodeled dynamics, enhancing the precision of the control. The efficiency of the resulting controller is tested on a two-link flexible manipulator. Simulation results on a dynamic trajectory of sinusoidal form show the effectiveness of the proposed control strategy.

Author 1: Amin Riad Maouche
Author 2: Hosna Meddahi

Keywords: Adaptive control; CMAC neural network; artificial neural network; nonlinear control; flexible-link manipulator; dynamic motion equation

PDF

Paper 42: Performance Testing and Evaluation of the VoIPv6 Network-Related Functions (Sendto and Receivefrom)

Abstract: The network-related functions (Sendto and Receivefrom) in VoIPv6 are needed to obtain the communication socket, in both UDP and TCP, before communication can take place between the sending and receiving ends. The intent of testing and evaluating these functions for Voice over Internet Protocol version 6 (VoIPv6) in this work is not to provide a comprehensive benchmark, but rather to test how well TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) perform in sending and receiving VoIPv6 traffic and bulk data transfers. Because of the cumulative nature of VoIPv6 performance, part of this can be achieved by testing the network-related functions, the Sendto and Receivefrom socket calls: sending in UDP over IPv6 is best-effort delivery of packets, not guaranteed delivery as in TCP over IPv6. In this context, performance enhancement techniques need to be applied in VoIPv6 because there is no dedicated line between the sending and receiving ends; this is at once the strength and the drawback of VoIP. It is also why fully deploying time-sensitive real-time applications over IPv6 is expected to take longer to reach full maturity (Recommendation G.711 of the ITU anticipates this by the year 2050).

Author 1: Asaad Abdallah Yousif Malik Abusin
Author 2: Dr. Junaidi Abdullah
Author 3: Dr. Tan Saw Chin

Keywords: VoIPv6 (Voice over Internet Protocol v6) performance; VoIPv6 performance testing; VoIPv6 performance analysis; VoIPv6 quality testing in the protocol and application layers; Internet measurement research

PDF
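The Sendto/Receivefrom pair under test in Paper 42 corresponds to the standard datagram socket calls. A minimal IPv6 loopback round trip (illustrative only, not the paper's benchmark harness; assumes the host has the `::1` loopback address configured) looks like:

```python
import socket

# Receiver: bind an IPv6 UDP socket to an ephemeral loopback port.
rx = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
rx.bind(("::1", 0))
rx.settimeout(5)
port = rx.getsockname()[1]

# Sender: best-effort datagram; no connection and no delivery guarantee,
# which is exactly why UDP warrants separate performance evaluation.
tx = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
tx.sendto(b"voice frame", ("::1", port))

data, addr = rx.recvfrom(2048)   # blocks until a datagram arrives (or timeout)
rx.close()
tx.close()
```

A benchmark in the spirit of the paper would timestamp around the `sendto`/`recvfrom` calls over many frames; the TCP counterpart would use `send`/`recv` on a connected socket instead.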

Paper 43: fMRI Data Analysis Using Dempster-Shafer Method with Estimating Voxel Selectivity by Belief Measure

Abstract: In functional Magnetic Resonance Imaging (fMRI) data analysis, detecting activated voxels is a challenging research problem for which existing methods have shown limits. We propose a new method in which brain mapping is based on the Dempster-Shafer theory of evidence (DS), a useful framework for reasoning under uncertainty. Dempster-Shafer theory allows the activated regions to be found by checking the activated voxels in the fMRI data: the brain areas activated by a given stimulus are detected using a belief measure as the metric for evaluating activated voxels. To test the performance of the proposed method, artificial and real auditory data were employed. Comparison with the t-test and GLM methods clearly shows that the proposed method provides a higher rate of correct detection of activated voxels.

Author 1: ATTIA Abdelouahab
Author 2: MOUSSAOUI Abdelouahab
Author 3: TALEB-AHMED Abdelmalik

Keywords: Dempster-Shafer theory; fMRI; GLM; t-test; HRF; OTSU method

PDF
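The belief measure used in Paper 43 to rate voxels follows directly from a basic probability assignment (mass function) over a frame of discernment. A toy two-hypothesis example per voxel, with hypothetical masses:

```python
def belief(mass, hypothesis):
    """Dempster-Shafer belief: total mass of focal elements
    wholly contained in the hypothesis set."""
    return sum(m for focal, m in mass.items() if set(focal) <= set(hypothesis))

def plausibility(mass, hypothesis):
    """Plausibility: total mass of focal elements that intersect the hypothesis."""
    return sum(m for focal, m in mass.items() if set(focal) & set(hypothesis))

# Frame of discernment for one voxel: {"active", "inactive"};
# mass on the full frame expresses ignorance (hypothetical values).
MASS = {("active",): 0.6,
        ("inactive",): 0.1,
        ("active", "inactive"): 0.3}
```

A voxel would be declared activated when Bel({active}) exceeds a threshold; here Bel = 0.6 and Pl = 0.9 bracket the uncertainty that a point probability could not express.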

Paper 44: Statistical Quality of Service to Increase QoS/QoE of IP-Based Gateway for Integrating Heterogeneous Wireless Devices

Abstract: In broadcast services over cellular wireless networks, data is communicated to several receivers from an access point/base station. Multicast significantly improves network efficiency in distributing data to multiple receivers, compared with multiple unicast transmissions of the same data to each receiver independently, by taking advantage of the shared nature of the wireless medium. Multicast algorithms need to be designed to provide the required Quality of Service (QoS) for a wide assortment of applications while permitting seamless roaming among a multitude of access network technologies. This paper proposes a cellular-aided mobile ad hoc network (CAMA) architecture, in which a CAMA agent in the cellular network manages the control data while the data itself is transported over the mobile terminals (MTs). Routing and security information is exchanged between the MTs and the agent over cellular radio channels. A location-based routing protocol, the multi-selection greedy positioning routing (MSGPR) protocol, is proposed; this novel feature makes it more applicable in the real world. In addition, a dynamic new-call blocking probability is introduced to make handoff decisions for wireless networks. The paper also proposes a novel technique to provide QoS support by using an assistant network to recover multicast data lost in the main network, since a wireless device may lose some of the multicast records sent over the main network. The experimental results show that the proposed algorithm outperforms traditional algorithms in bandwidth utilization, handoff dropping rate and handoff rate.

Author 1: Pon. Arivanantham
Author 2: Dr. M. Ramakrishnan

Keywords: Heterogeneous Wireless Mobile Networks; Ad Hoc Networks; Cellular Networks; Quality of Service; Security; Wireless Networks

PDF

Paper 45: Design and Simulation of a Low-Voltage Low-Offset Operational Amplifier

Abstract: In many applications, the offset of an op-amp must be canceled for high accuracy to be achieved. In this work, an asymmetrical differential input circuit with an active DC offset rejection circuit was implemented to minimize the systematic offset of the amplifier. The proposed op-amp exhibits a systematic offset voltage of less than 80 µV.

Author 1: Babak Gholami

PDF

Paper 46: Enhanced Audio LSB Steganography for Secure Communication

Abstract: The ease with which data can be transmitted across the globe via the Internet has made it an obvious choice of medium for online data transmission and communication. This salient trait, however, is constrained by issues of privacy, the veracity of the information being exchanged, the legitimacy of its sender, and its availability when needed. Although cryptography is used to address the confidentiality concern, for many it is somewhat limited in scope because encrypted information is discernible as such. Further, owing to restrictions imposed by various governments on their citizens' use of cryptography for personal purposes, research has also steered toward another information-hiding discipline, steganography, whose sole purpose is to make the information being exchanged imperceptible. This research focuses on the evolution of a model-based secure LSB steganographic scheme for the digital audio WAV file format that withstands passive attack by Warden Wendy.

Author 1: Muhammad Junaid Hussain
Author 2: Khan Farhan Rafat

Keywords: Conceal; Human Auditory System (HAS); Imperceptible Communication; Internet as a Secure Communication Medium; LSB Based Audio Steganography; Modeling Security of Steganographic System; WAV File Steganography

PDF
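The basic embedding that any LSB audio scheme builds on replaces the least significant bit of each PCM sample with one message bit. A minimal sketch of plain LSB embedding and extraction (without the model-based hardening the paper adds):

```python
def embed(samples, message):
    """Hide `message` bytes in the LSBs of integer PCM samples.

    Each sample's amplitude changes by at most 1, keeping the distortion
    below the threshold of the Human Auditory System (HAS)."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(samples):
        raise ValueError("cover audio too short for the message")
    stego = [(s & ~1) | b for s, b in zip(samples, bits)]
    return stego + list(samples[len(bits):])

def extract(samples, length):
    """Recover `length` bytes from the sample LSBs (LSB-first per byte)."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for j in range(8):
            byte |= (samples[i * 8 + j] & 1) << j
        out.append(byte)
    return bytes(out)
```

Plain sequential LSB like this is exactly what statistical steganalysis targets; hence the paper's model-based approach to resisting Warden Wendy's passive attack.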

Paper 47: Eliminating Broadcast Storming in Vehicular Ad-Hoc Networks

Abstract: VANETs (Vehicular Ad-hoc Networks) offer a diversity of appealing applications, many of which depend on the propagation of messages from one vehicle to another in the network. Several algorithms for the effective broadcasting of safety/warning messages have been presented by different researchers. The number of vehicles on roads is increasing day by day, and with these increased numbers, especially during peak hours when networks become very dense, blindly disseminating messages causes problems like packet collisions, data thrashing and broadcast storming. In this research, a relative-speed-based waiting time algorithm is presented for avoiding the broadcast storming problem in VANETs, especially in dense environments. The proposed algorithm calculates the waiting time for each vehicle after it receives a safety/warning message, according to the relative speed of the vehicles, the distance between them, and their transmission range. The results show that the proposed relative-speed-based algorithm outperforms existing algorithms such as blind flooding and the dynamic broadcasting waiting time algorithm, which uses the number of neighbors and the distance between vehicles to calculate the waiting time.

Author 1: Umar Hayat
Author 2: Razi Iqbal
Author 3: Jamal Diab

Keywords: VANETs; Intelligent Transportation Systems; Broadcast Storming; Distance based flooding

PDF

Paper 49: Applications of Some Topological Near Open Sets to Knowledge Discovery

Abstract: In this paper, we use some topological near open sets to introduce rough set concepts such as near open lower and near open upper approximations. We also study the concept of near open rough sets and some of their basic properties, and we compare the near open concepts with the classical rough set concepts. Finally, we study the effect of these concepts in motivating the knowledge discovery process.

Author 1: A. S. Salama
Author 2: O. G. El-Barbary

Keywords: Topological spaces; Rough sets; Knowledge discovery; open sets; Accuracy measure

PDF
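The lower and upper approximations at the core of rough set theory, which the near open variants in Paper 49 generalize, can be computed directly from a partition of the universe into equivalence classes (toy partition and target set below):

```python
def approximations(classes, target):
    """Classical rough-set approximations of `target` with respect to a
    partition of the universe into equivalence classes.

    lower: union of classes wholly inside the target (certain members)
    upper: union of classes meeting the target (possible members)"""
    target = set(target)
    lower = {x for c in classes if set(c) <= target for x in c}
    upper = {x for c in classes if set(c) & target for x in c}
    return lower, upper
```

For the partition {1,2}, {3,4}, {5,6} and target {1, 2, 3}, the lower approximation is {1, 2} and the upper is {1, 2, 3, 4}; the accuracy measure listed in the keywords is |lower| / |upper| = 0.5, so the target is rough with respect to this partition.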

Paper 50: Analysis of Cloud Network Management Using Resource Allocation and Task Scheduling Services

Abstract: Network failure in a cloud datacenter can result from inefficient resource allocation, scheduling, or logical segmentation of physical machines (network constraints). This is highly undesirable in Distributed Cloud Computing Networks (DCCNs) running mission-critical services. Such failure has been identified in the University of Nigeria datacenter network situated in the south-eastern part of Nigeria. In this paper, the architectural decomposition of a proposed DCCN was carried out while exploring its functionalities for grid performance. Virtualization services such as resource allocation and task scheduling were employed in heterogeneous server clusters. The validation of DCCN performance was carried out using trace files from Riverbed Modeller 17.5 in order to ascertain the influence of virtualization on the server resource pool. The QoS metrics considered in the analysis are service delay time, resource availability, throughput and utilization. From the validation analysis of the DCCN, the following results were obtained: average throughput (bytes/sec) for DCCN = 40.00%, DCell = 33.33% and BCube = 26.67%; average resource availability response for DCCN = 38.46%, DCell = 33.33%, and BCube = 28.21%; DCCN density on resource utilization = 40% (when logically isolated) and 60% (when not logically isolated). From these results, it was concluded that using virtualization in cloud datacenter servers results in enhanced server performance, offering a lower average wait time even with a higher request rate and longer duration of resource use (service availability). By evaluating these recursive architectural designs for network operations, enterprises ready for the spine-and-leaf model could further develop their network resource management schemes for optimal performance.

Author 1: K.C. Okafor
Author 2: F.N.Ugwoke
Author 3: Obayi, Adaora Angela
Author 4: V.C Chijindu
Author 5: O.U Oparaku

Keywords: Resource Provisioning; Virtualization; Cloud Computing; Service Availability; Smart Green Energy; QoS

PDF

Paper 51: The Impact of the Implementation of ERP on End-User Satisfaction: The Case of Moroccan Companies

Abstract: In recent years, the implementation of ERP has acted as a lever for development and inter-organizational collaboration. ERP is a powerful tool for integration, information sharing and streamlining of processes within organizations (El Amrani et al., 2006; Kocoglu and Moatti, 2010). A company must not only equip and computerize itself; it must opt for an "optimal" IT infrastructure that will respond to its present and future needs. Hence the interest in application integration, and especially in ERP systems, which remedy the situations mentioned. This article proposes and tests a model to evaluate the success of an Enterprise Resource Planning (ERP) system based on a measure of user satisfaction, referring to the model of DeLone & McLean (1992) and the work of Seddon & Kiew (1994). The criteria that can influence user satisfaction, and thereby ensure the successful implementation of an ERP system, are identified. The results of the exploratory study, carried out on 60 users in 40 Moroccan companies, show that ERP user satisfaction is explained by the quality of the ERP system, perceived usefulness and the quality of the information provided by this type of system. The study also found that the quality of change management is a predictor of satisfaction, measured by user involvement in the ERP implementation, the quality of communication within such a project and the quality of training given to users.

Author 1: Fatima JALIL
Author 2: Abdellah ZAOUIA
Author 3: Rachid EL BOUANANI

Keywords: Enterprise Resource Planning (ERP); User Satisfaction; Quality of Change; Information Technology (IT); Information Systems (IS); success; evaluation approaches; Evaluation Success Factors

PDF

Paper 52: Vision Based Geo Navigation Information Retrieval

Abstract: In order to derive the three-dimensional camera position from monocular camera vision, a geo-reference database is needed. The floor plan is a ubiquitous geo-reference database that every building refers to during construction and facility maintenance. Compared with other popular geo-reference databases such as geo-tagged photos, the generation, update and maintenance of a floor plan database does not require costly and time-consuming survey tasks. In vision-based methods, the camera needs special attention: in contrast to other sensors, vision sensors typically yield vast amounts of information that require complex strategies to permit use in real time on computationally constrained platforms. This research shows that a map-based visual odometry strategy derived from a state-of-the-art structure-from-motion framework is particularly suitable for locally stable, pose-controlled flight. Issues concerning drift and robustness are analyzed and discussed with respect to the original framework. Additionally, various uses of vision-based localization algorithms are proposed here. A noteworthy drawback of vision-based algorithms, however, is their lack of robustness: most approaches are sensitive to scene variations (such as season or environment changes) because they rely on the Sum of Squared Differences (SSD). To overcome this, we use Mutual Information, which is highly robust to global and local scene variations. On the other hand, dense approaches are frequently affected by drift; here, we attempt to address this issue by using geo-referenced images. The localization algorithm has been implemented and experimental results are presented. Vision sensors possess the potential to extract information about the surrounding environment and determine the locations of features or points of interest. Having mapped out landmarks in an unknown environment, subsequent observations by the vision sensor can in turn be used to resolve position and orientation while continuing to map out new features. In addition, the experimental results of the proposed model also suggest a plausibility proof for feed-forward models of recognition in geo-location.
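The robustness argument for Mutual Information over SSD can be illustrated on plain intensity sequences. The binning and normalization below are assumptions of this sketch, not the paper's implementation.

```python
import math
from collections import Counter

def ssd(a, b):
    """Sum of squared differences: cheap, but sensitive to illumination change."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mutual_information(a, b, bins=8, levels=256):
    """MI between two equal-length intensity sequences; robust to global
    intensity changes because it compares distributions, not raw values."""
    qa = [x * bins // levels for x in a]   # quantize into histogram bins
    qb = [x * bins // levels for x in b]
    joint = Counter(zip(qa, qb))
    pa, pb = Counter(qa), Counter(qb)
    n = len(a)
    mi = 0.0
    for (i, j), c in joint.items():
        pxy = c / n
        mi += pxy * math.log(pxy * n * n / (pa[i] * pb[j]))
    return mi
```

A uniform brightness shift leaves MI essentially unchanged (the joint distribution keeps the same shape) while SSD grows with the square of the shift, which is the sensitivity the abstract criticizes.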

Author 1: Asif Khan
Author 2: Jian-Ping Li
Author 3: Riaz Ahmed Shaikh

Keywords: Vision; Geo-Navigation; Information Retrieval

PDF

Paper 53: Contemporary Layout’s Integration for Geospatial Image Mining

Abstract: Image classification and retrieval play a major role in dealing with large multimedia data on the Internet. Social networks, image sharing websites and mobile applications require categorizing multimedia items for more efficient search and storage. Therefore, image classification and retrieval methods have gained great importance for researchers and companies. Image classification can be performed in a supervised or semi-supervised manner: in order to categorize an unknown image, a statistical model created from pre-labeled samples is fed with the numerical representation of the visual features of images. Analysis of the keywords surrounding images, or of image content alone, has not yet achieved results that would allow deriving precise location information to select representative images. Photos that are reliably tagged with labels of place names or areas cover only a small fraction of available images and also remain at the keyword level. The state of the art in content-based retrieval is analyzed for Earth observation image archives, concentrating on complete frameworks that show promise for operational implementation. The methods are considered with a particular focus on the stages after the extraction of primitive features. The solutions conceived for issues such as synthesis and simplification of features, semantic labeling and indexing are reviewed. Approaches to query specification and execution are assessed, and conclusions are drawn on research in Earth observation mining.

Author 1: Riaz Ahmed Shaikh
Author 2: Jian-Ping Li
Author 3: Asif Khan

Keywords: Geo-Location; Spatial Layout; Feature Extraction; Image Mining

PDF

Paper 54: Arabic Stemmer for Search Engines Information Retrieval

Abstract: The Arabic language has a very different and more difficult structure than other languages, because it is a very rich language with complex morphology. Many stemmers have been developed for Arabic, but they still have many weaknesses and problems, and there is still a lack of use of Arabic stemming in search engines. This paper introduces a rooted-word Arabic stemming technique. The results of the introduced technique for six Arabic sentences are used in the widely used browsers Google Chrome, Internet Explorer and Mozilla Firefox to check the effect of using Arabic stemming on search, in terms of the total number of pages retrieved and the search-time ratio for the actual sentences and their stemmed forms. The results show that stemming Arabic words increases and accelerates search engine output.
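As background, the paper develops a root-extraction ("rooted word") technique; the sketch below shows only the simpler *light-stemming* idea of affix stripping, with an assumed (partial) affix list, to illustrate why stemming collapses inflected query terms onto a common form.

```python
# Minimal Arabic light-stemmer sketch: strip one common prefix and one
# common suffix, keeping at least three letters of the word. The affix
# lists are illustrative, not the paper's technique.
PREFIXES = ["وال", "بال", "كال", "فال", "ال", "و", "ب", "ل"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "ه", "ة", "ي"]

def light_stem(word):
    for p in sorted(PREFIXES, key=len, reverse=True):   # longest prefix first
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):   # longest suffix first
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word
```

For instance, both "المعلمون" and "معلم" reduce to the same stem, so a search engine indexing stems matches either query form; full root extraction goes further by also removing infixes.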

Author 1: Ahmed Khalid
Author 2: Zakir Hussain
Author 3: Mirza Anwarullah Baig

Keywords: Information Retrieval; Arabic Stemming; Search Engine; Arabic Morphology

PDF

Paper 55: Implementation of a Neural Network Using Simulator and Petri Nets

Abstract: This paper describes the construction of a multilayer perceptron using the open-source neural network simulator Neuroph and a Petri net. The described multilayer perceptron solves the logical function "xor" (exclusive or). The aim is to explore the possibilities of describing neural networks with Petri nets. The selected neural network (a multilayer perceptron) makes the advantages and disadvantages of realization through a simulator clearly visible, since the selected logical function is not linearly separable. After constructing the neural network in the simulator, its implementation with Petri nets was investigated. The results are used to identify and assess the opportunities for different discrete representations of the same model and the same subject area.
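To illustrate why XOR needs a hidden layer, a 2-2-1 perceptron with hand-picked weights (a sketch of the standard construction, not the Neuroph network from the paper) can be written as:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A 2-2-1 multilayer perceptron realizing XOR: no single-layer perceptron
# can, because XOR is not linearly separable. The hidden units act as a
# soft OR and a soft NAND; their conjunction is XOR.
def xor_mlp(x1, x2):
    h1 = sigmoid(20 * x1 + 20 * x2 - 10)    # ~ OR(x1, x2)
    h2 = sigmoid(-20 * x1 - 20 * x2 + 30)   # ~ NAND(x1, x2)
    out = sigmoid(20 * h1 + 20 * h2 - 30)   # ~ AND(h1, h2)
    return round(out)
```

The same three-neuron topology is what a Petri-net description of the network would have to encode as places and transitions.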

Author 1: Nayden Valkov Nenkov
Author 2: Elitsa Zdravkova Spasova

Keywords: neural networks; simulators; logical or; petri net

PDF

Paper 56: Content-Based Image Retrieval Using Texture Color Shape and Region

Abstract: Interest in accurately retrieving required images from databases of digital images is growing day by day. Images are represented by certain features to facilitate accurate retrieval; these features include texture, color, shape and region. This is a hot research area, and researchers have developed many techniques that use these features for accurate retrieval of required images from databases. In this paper we present a literature survey of Content Based Image Retrieval (CBIR) techniques based on texture, color, shape and region. We also review some of the state-of-the-art tools developed for CBIR.

Author 1: Syed Hamad Shirazi
Author 2: Arif Iqbal Umar
Author 3: Saeeda Naz
Author 4: Noor ul Amin Khan
Author 5: Muhammad Imran Razzak
Author 6: Bandar AlHaqbani

Keywords: CBIR; Color Space; Relevance Feedback; Texture Features; Shape; Color

PDF

Paper 57: SDAA: Towards Service Discovery Anywhere Anytime Mobile Based Application

Abstract: Providing on-demand services based on a customer's current location is an urgent need for many communities and individuals, especially women, elderly people, single mothers and the sick. Considering the need to provide localized services, this paper proposes a mobile application framework that allows an individual to receive services from neighborhood peers anywhere, anytime. The application allows an individual to find and select reliable service providers near his or her location, and gives interested individuals an opportunity to use their free time to provide services to the community and earn some extra money. The application will benefit many stakeholders, such as elderly people, women at home, or a person traveling in an unfamiliar place. A prototype application was developed, and an empirical evaluation was conducted to obtain qualitative measures of users' acceptance of and satisfaction with the application. It is observed that user satisfaction is high.

Author 1: Mehedi Masud

Keywords: mobile application; service discovery; mobile services; software engineering

PDF

Paper 58: Expectation-Maximization Algorithms for Obtaining Estimations of Generalized Failure Intensity Parameters

Abstract: This paper presents several iterative methods based on the Stochastic Expectation-Maximization (EM) methodology for estimating parametric reliability models from random lifetime data. The methodology is related to Maximum Likelihood Estimation (MLE) in the case of missing data. A bathtub-shaped failure intensity formulation of repairable system reliability is presented, and the estimation of its parameters through the EM algorithm is considered. Field failure data from an industrial site are used to fit the model. Finally, large-sample interval estimation from the literature is discussed, and the actual coverage probabilities of these confidence intervals are examined using the Monte Carlo simulation method.

Author 1: Makram KRIT
Author 2: Khaled MILI

Keywords: Repairable systems reliability; bathtub failure intensity; EM algorithm; estimation; likelihood; Monte Carlo simulation

PDF

Paper 59: Detecting Distributed Denial of Service Attacks Using Data Mining Techniques

Abstract: Users and organizations find it continuously challenging to deal with distributed denial of service (DDoS) attacks. The security engineer works to keep a service available at all times by dealing with intruder attacks. The intrusion-detection system (IDS) is one solution for detecting and classifying any anomalous behavior; it should always be updated with the latest intruder attack deterrents to preserve the confidentiality, integrity and availability of the service. In this paper, a new dataset is collected because there was no common dataset containing modern DDoS attacks at different network layers, such as SIDDoS and HTTP Flood. This work incorporates three well-known classification techniques: Multilayer Perceptron (MLP), Naïve Bayes and Random Forest. The experimental results show that MLP achieved the highest accuracy rate (98.63%).

Author 1: Mouhammd Alkasassbeh
Author 2: Ghazi Al-Naymat
Author 3: Ahmad B.A Hassanat
Author 4: Mohammad Almseidin

Keywords: DDoS; IDS; MLP; Naïve Bayes; Random Forest

PDF

Paper 60: Investigating the Effect of Different Kernel Functions on the Performance of SVM for Recognizing Arabic Characters

Abstract: Considerable progress has been achieved in recognition techniques for Latin and Chinese characters. By contrast, Arabic optical character recognition is still lagging, even though interest and research in this area are becoming more intensive than before. This is because Arabic is a cursive language, written from right to left, in which each character has two to four different forms according to its position in the word, and several characters are associated with complementary parts above, below, or inside the character. Support Vector Machines (SVMs) have been used successfully for recognizing Latin and Chinese characters. This paper studies the effect of different kernel functions on the performance of SVMs for recognizing Arabic characters; eleven different kernel functions are used throughout this study. The objective is to determine which type of kernel function gives the best recognition rate. The resulting kernel functions can be considered a base for future studies aiming to enhance their performance. The obtained results show that the Exponential and Laplacian kernels give excellent performance, while others, like the multiquadric kernel, fail to recognize the characters, especially with increased levels of noise.
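Typical definitions of several of the kernels mentioned can be sketched as below. Exact parameterizations vary across the literature, so these forms are illustrative rather than the paper's precise choices.

```python
import math

def _dist2(x, y):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((a - b) ** 2 for a, b in zip(x, y))

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF): exp(-||x-y||^2 / (2 sigma^2))."""
    return math.exp(-_dist2(x, y) / (2 * sigma ** 2))

def laplacian_kernel(x, y, sigma=1.0):
    """Laplacian: exp(-||x-y||_1 / sigma)."""
    return math.exp(-sum(abs(a - b) for a, b in zip(x, y)) / sigma)

def exponential_kernel(x, y, sigma=1.0):
    """Exponential: exp(-||x-y|| / (2 sigma^2)) -- RBF with L2 norm, unsquared."""
    return math.exp(-math.sqrt(_dist2(x, y)) / (2 * sigma ** 2))

def multiquadric_kernel(x, y, c=1.0):
    """Multiquadric: sqrt(||x-y||^2 + c^2). Not positive definite, which is
    one reason it can behave poorly compared with the kernels above."""
    return math.sqrt(_dist2(x, y) + c ** 2)
```

Note that the three exponential-family kernels all return 1 for identical inputs and decay with distance, whereas the multiquadric grows with distance.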

Author 1: Sayed Fadel
Author 2: Said Ghoniemy
Author 3: Mohamed Abdallah
Author 4: Hussein Abu Sorra
Author 5: Amira Ashour
Author 6: Asif Ansary

Keywords: SVM; Kernel Functions; Arabic Character Recognition

PDF

Paper 61: Cosine Based Latent Factor Model for Precision Oriented Recommendation

Abstract: Recommender systems suggest a list of interesting items to users based on their prior purchase or browsing behaviour on e-commerce platforms. Continuing research in recommender systems has primarily focused on developing algorithms for the rating prediction task. However, most e-commerce platforms provide a 'top-k' list of interesting items for every user. In line with this idea, the paper proposes a novel machine learning algorithm to predict a 'top-k' list of items by optimizing the latent factors of users and items with scores mapped from ratings. The basic idea is to learn latent factors based on the cosine similarity between users' and items' latent features, which is then used to predict scores for unseen items for every user. Comprehensive empirical evaluations on publicly available benchmark datasets reveal that the proposed model outperforms state-of-the-art algorithms in recommending good items to a user.
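The scoring step can be sketched in a few lines: once user and item latent vectors have been learned, unseen items are ranked by cosine similarity. This shows only the prediction idea, not the paper's optimization procedure.

```python
import math

def cosine(u, v):
    """Cosine similarity between two latent-factor vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def top_k(user_vec, item_vecs, k=10, seen=()):
    """Rank unseen items by cosine similarity with the user's latent vector."""
    scores = {item: cosine(user_vec, vec)
              for item, vec in item_vecs.items() if item not in seen}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Filtering out `seen` items mirrors the practical setting: top-k recommendation only scores items the user has not yet interacted with.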

Author 1: Bipul Kumar
Author 2: Pradip Kumar Bala
Author 3: Abhishek Srivastava

Keywords: collaborative filtering; recommender systems; precision; e-commerce; machine learning

PDF

Paper 62: Towards Building an Intelligent Call Routing System

Abstract: This paper presents EduICR, an intelligent call routing system. The system can route calls to the most appropriate agent using routing rules built by a text classifier. EduICR includes the following main components: a telephone communication network; Vietnamese speech recognition; a text classifier/natural language processor; and Vietnamese speech synthesis. To the best of our knowledge, this is one of the first systems in Vietnam to implement an integration mechanism for text processing and speech processing. This allows voice applications to be more intelligent: able to communicate with humans in natural language with high accuracy and reasonable speed. Built and tested in a real environment, our system achieves an accuracy of more than 95%.

Author 1: Thien Khai Tran
Author 2: Dung Minh Pham
Author 3: Binh Van Huynh

Keywords: EduICR; spoken dialog systems; intelligent call center; voice application

PDF

Paper 63: A Privacy-Preserving Roaming Authentication Scheme for Ubiquitous Networks

Abstract: A privacy-preserving roaming authentication scheme (PPRAS) for ubiquitous networks is proposed, in which a remote mobile user can obtain the service offered by a foreign agent after being authenticated. In order to protect the mobile user's privacy, the user presents an anonymous identity to the foreign agent and completes the authentication with the assistance of his or her home agent. After that, the user and the foreign agent can establish a session key using the semi-group property of Chebyshev polynomials; in this way, the heavy burden of key management is avoided. Furthermore, the user can update the login password and the session key shared with the foreign agent if necessary. The correctness of the scheme is proved using BAN logic, and a performance comparison against existing schemes is given as well.
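The semi-group property underlying the key agreement, T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)), can be checked numerically. This is a toy illustration over the reals using T_n(cos t) = cos(n t); a real chaotic-map scheme would use secure parameters (and typically an extended Chebyshev map over a finite field) rather than floating point.

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) for x in [-1, 1], via T_n(cos t) = cos(n t)."""
    return math.cos(n * math.acos(x))
```

Because T_r(T_s(x)) and T_s(T_r(x)) both equal T_rs(x), the user (holding secret r) and the foreign agent (holding secret s) each apply their own polynomial to the other party's public value and arrive at the same session key, in direct analogy to Diffie-Hellman.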

Author 1: You-sheng Zhou
Author 2: Jun-feng Zhou
Author 3: Feng Wang

Keywords: roaming authentication; anonymous; chaotic maps; key agreement

PDF

Paper 64: Segmentation and Recognition of Handwritten Kannada Text Using Relevance Feedback and Histogram of Oriented Gradients – A Novel Approach

Abstract: India is a multilingual country with 22 official languages and more than 1600 languages in existence. Kannada, one of the official and south Indian languages, is widely used in the state of Karnataka, whose population is over 65 million, and stands in 33rd position among the most widely spoken languages in the world. However, the survey reveals that much more effort is required to develop a complete Optical Character Recognition (OCR) system for it. In this direction, the present research work throws light on the development of a suitable methodology for building such an OCR system. It is noted that the overall accuracy of an OCR system largely depends on the accuracy of its segmentation phase, so a robust and efficient segmentation method is desirable. In this paper, a method is proposed for proper segmentation of the text to improve the performance of OCR at later stages. In the proposed method, segmentation is performed using the horizontal projection profile and windowing, and the result is passed to the recognition module. The Histogram of Oriented Gradients (HoG) is used for recognition in combination with a support vector machine (SVM). The recognition result is taken as feedback and fed to the segmentation module to improve accuracy. The experiments delivered promising results.
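The horizontal-projection step can be sketched as follows: count ink pixels per row of the binarized page and split at empty rows to find text lines. Binarization, windowing and the HoG/SVM recognition stages are omitted from this sketch.

```python
def horizontal_projection(image):
    """image: 2D list of 0/1 pixels (1 = ink). Returns the ink count per row."""
    return [sum(row) for row in image]

def segment_lines(image):
    """Split the page into text lines at rows whose projection is zero.
    Returns (start_row, end_row) pairs, inclusive."""
    profile = horizontal_projection(image)
    lines, start = [], None
    for r, ink in enumerate(profile):
        if ink and start is None:
            start = r                      # entering a text line
        elif not ink and start is not None:
            lines.append((start, r - 1))   # leaving a text line
            start = None
    if start is not None:
        lines.append((start, len(profile) - 1))
    return lines
```

The same idea applied to a vertical projection within each detected line yields character or akshara candidates, which the feedback loop described above can then refine.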

Author 1: Karthik S
Author 2: Srikanta Murthy K

Keywords: Optical character recognition; Histogram of oriented gradients; relevance feedback; segmentation; Support Vector Machine; handwritten Kannada documents

PDF

Paper 65: Intelligent Accreditation System: A Survey of the Issues, Challenges, and Solution

Abstract: International educational institutes aim to be accredited by local and international accreditation agencies, such as the Association to Advance Collegiate Schools of Business (AACSB) and the Accreditation Board for Engineering and Technology (ABET), in order to be recognized by stakeholders. The institutes strive to meet stakeholders' expectations by integrating quality into all standards of educational practice and guaranteeing continuous improvement. This study identifies the principal barriers that need to be addressed and resolved, such as the collection and population of data, time constraints, compensation, and lack of guidance and expertise. A web-based survey was conducted to identify the obstacles and the respondents' expectations for optimizing the accreditation process. This research proposes an Intelligent Web-Based Accreditation System (IWBAS) that addresses the above issues and streamlines the accreditation process.

Author 1: Fahim Akhter
Author 2: Yasser Ibrahim

Keywords: Challenges of Accreditation Process; Intelligent Accreditation System

PDF

Paper 66: Multi-Objective Optimization Algorithm to the Analyses of Diabetes Disease Diagnosis

Abstract: A huge amount of data is available in the health industry and is difficult to handle; mining this data is therefore necessary to uncover hidden patterns and their relevant features. Recently, many researchers have devoted themselves to the study of data mining for disease diagnosis. Mining biomedical data is one of the predominant research areas in which evolutionary algorithms and clustering techniques are emphasized for diabetes diagnosis. Therefore, this research focuses on the application of an evolutionary clustering multi-objective optimization algorithm (ECMO) to analyze data from patients suffering from diabetes. The main objectives of this work are to maximize cluster prediction accuracy and computational efficiency while minimizing the cost of data clustering. The experimental results show that this application attains maximum accuracy on the Pima Indians Diabetes dataset from the UCI repository; by analyzing the three objectives, ECMO achieves the best Pareto fronts.

Author 1: M. Anusha
Author 2: Dr. J.G.R. Sathiaseelan

Keywords: Clustering; Genetic Algorithm; Multi-objective Optimization; ECMO; Diabetes Disease

PDF

Paper 67: Comparative Analysis of Energy Detection Spectrum Sensing of Cognitive Radio Under Wireless Environment Using SEAMCAT

Abstract: In recent years, Cognitive Radio technology has imposed itself as a good solution to enhance the utilization of unused spectrum and to globalize the radio environment for users of different bands who utilize or require different transmission techniques. In this paper, the energy detection spectrum sensing technique, which is used to detect the presence of an unknown deterministic signal, is studied under a non-time-dispersive fading environment using the Hata propagation model for picocell communication systems. The different aspects of non-time-dispersive fading regions in energy detection spectrum sensing, and the impact of changing the detection threshold of the secondary (Cognitive Radio) user on interference at the primary user under non-cooperative spectrum access, are studied in terms of the probability of interference. The entire comparative analysis of spectrum sensing in Cognitive Radio is carried out with the aid of the SEAMCAT software platform.
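At its core, energy detection is a threshold test on the received signal energy; the sketch below illustrates the decision rule. Threshold selection, fading, and the Hata propagation model handled by SEAMCAT are out of scope here.

```python
def energy_detect(samples, threshold):
    """Classical energy detector: declare the band occupied when the
    average energy of the received samples exceeds the threshold.
    Returns (occupied, measured_energy)."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold, energy
```

Lowering the threshold makes the secondary user more conservative (it detects weaker primary signals), while raising it increases spectrum reuse at the cost of a higher probability of interfering with the primary user, which is the trade-off the paper studies.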

Author 1: A.S. Kang
Author 2: Renu Vig
Author 3: Jasvir Singh
Author 4: Jaisukh Paul Singh

Keywords: Cognitive Radio; Primary User; Secondary User; Detection Threshold; Interference Probability; Energy Detection; Desired/interfering/sensing received signal strength

PDF

Paper 68: Adaptive Lockable Units to Improve Data Availability in a Distributed Database System

Abstract: Distributed database systems have become a phenomenon and have been considered a crucial source of information for numerous users. Users with different jobs are using such systems locally or via the Internet to meet their professional requirements. Distributed database systems consist of a number of sites connected over a computer network. Each site deals with its own database and interacts with other sites as needed. Data replication in these systems is considered a key factor in improving data availability. However, it may affect system performance when most of the transactions that access the data contain write or a mix of read and write operations because of exclusive locks and update propagation. This research proposes a new adaptive approach for increasing the availability of data contained in a distributed database system. The proposed approach suggests a new lockable unit by increasing the database hierarchy tree by one level to include attributes as lockable units instead of the entire row. This technique may allow several transactions to access the database row simultaneously by utilizing some attributes and keeping others available for other transactions. Data in a distributed database system can be accessed locally or remotely by a distributed transaction, with each distributed transaction decomposed into several sub-transactions called participants or agents. These agents access the data at multiple sites and must guarantee that any changes to the data must be committed in order to complete the main transaction. The experimental results show that using attribute-level locking will increase data availability, reliability, and throughput, as well as enhance overall system performance. Moreover, it will increase the overhead of managing such a large number of locks, which will be managed according to the qualification of the query.
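The proposed finer granularity can be sketched as a lock table keyed by (row, attribute) instead of by row alone. The class and method names below are assumptions for illustration, not the paper's implementation; the point is that two transactions can simultaneously hold locks on different attributes of the same row.

```python
class AttributeLockManager:
    """Toy lock manager at (row, attribute) granularity: transactions
    touching disjoint attributes of the same row do not conflict."""
    def __init__(self):
        self.locks = {}                      # (row_id, attr) -> txn_id

    def acquire(self, txn_id, row_id, attr):
        key = (row_id, attr)
        holder = self.locks.get(key)
        if holder is None or holder == txn_id:
            self.locks[key] = txn_id
            return True
        return False                         # another txn holds this attribute

    def release_all(self, txn_id):
        self.locks = {k: v for k, v in self.locks.items() if v != txn_id}
```

The trade-off the abstract notes is visible here: availability rises because fewer requests block, but the lock table holds one entry per attribute rather than per row, increasing lock-management overhead.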

Author 1: Khaled Maabreh

Keywords: Granularity hierarchy tree; Lockable unit; Locks; Attribute level; Concurrency control; Data availability; Replication

PDF

Paper 69: A Multipath Lifetime-Prolonging Routing Algorithm for Wireless Ad Hoc Networks

Abstract: Dynamic networks can be tremendously challenging when deploying distributed applications on autonomous machines, and implementing services like routing and security for such networks is generally difficult. Consequently, multi-agent systems are well suited for designing distributed systems in which several autonomous agents interact or work together to perform a set of tasks or satisfy a set of goals, moving the problem of analysis from the global level to the local level and therefore reducing design complexity. In our previous paper, we presented a multi-agent system model adapted to develop a routing protocol for ad hoc networks. Wireless ad hoc networks are infrastructureless networks comprising wireless mobile nodes that are able to communicate with each other beyond the direct wireless transmission range. Due to frequent network topology changes and the limited energy and bandwidth available, routing becomes a challenging task. In this paper, we present a new version of a routing algorithm devoted to mobile ad hoc networks. Our new algorithm helps control network congestion and increase network lifetime by effectively managing node energy and link cost. The performance of the new version is validated through simulation, and the simulation results show the effectiveness and efficiency of our new algorithm compared to state-of-the-art solutions in terms of various performance metrics.

Author 1: Mohamed Amine RIAHLA
Author 2: Karim TAMINE

Keywords: Mobile Multi Agent System; Ad hoc Network Lifetime; Ant Routing Protocol; Distributed Algorithm; Network Congestion

PDF

Paper 70: High Lightweight Encryption Standard (HLES) as an Improvement of 512-Bit AES for Secure Multimedia

Abstract: In today's world, people frequently share information with one another over networks. Much of this information is highly private, and attackers and hackers have been attempting to steal it since 2001. The symmetric encryption algorithm known as 512-bit AES provides a high level of security, but it is almost impossible to use in multimedia transmission and mobile systems because of its large design area, which translates into large memory usage in each round, and its long encryption time. This paper presents an improvement of the 512-bit AES algorithm with efficient utilization of resources such as processor and memory space. The proposed approach resists linear and differential cryptanalysis and provides a high security level using a 512-bit key and data block size, while improving performance by minimizing memory usage and encryption time so that it can operate within the specific constraints of resource-limited systems. Experimental results on several data types (text, image, sound, video) show that the memory space used is reduced to a quarter and the encryption time is reduced almost by half. The adopted method is therefore very effective for the encryption of multimedia data.

Author 1: GUESMIA Seyf Eddine
Author 2: ASSAS Ouarda
Author 3: BOUDERAH Brahim

Keywords: Advanced Encryption Standard (AES); Encryption; multimedia data; security; resource-limited systems

PDF

Paper 71: A Distributed Framework for Content Search Using Small World Communities

Abstract: The continuous growth of multimedia content available all over the web is raising the importance of a distributed framework for searching it. One important parameter in a distributed environment is system response time, which plays an especially important role in search and retrieval. A novel two-tier structure is introduced in this paper, which focuses on the community concept to facilitate the creation of ontological small worlds that can effectively assist the search task. As a result, user queries are forwarded to nodes that are likely to contain the relevant resources. Evaluation of the framework shows that the small-world character of the proposed structure provides queries with better route selection and searching efficiency.

Author 1: Seyyed-Mohammad Javadi-Moghaddam
Author 2: Stefanos Kollias

Keywords: Small-world networks; distributed multimedia model; ontology; community; fuzzy similarity

PDF

Paper 72: Non Correlation DWT Based Watermarking Behavior in Different Color Spaces

Abstract: Digital watermarking techniques fall into two classes: those based on correlation and those not based on correlation. In previous work, we proposed a DWT2-based CDMA image watermarking scheme to study the effects of using eight color spaces (RGB, YCbCr, JPEG-YCbCr, YIQ, YUV, HSI, HSV and CIELab) on correlation-based watermarking algorithms. This paper proposes a non-correlation-based image watermarking scheme in the wavelet transform domain and tests it in the same color spaces, to extend that study, reach a comprehensive analysis and focus on satisfying the requirements of non-correlation-based watermarking algorithms. To achieve greater security, imperceptibility and robustness, the proposed scheme first encodes the binary watermark image by applying ATM, CCM and exclusive OR. The scrambled watermark is then embedded into the intended quantized approximation coefficients of the wavelet transform using the LSB insertion technique.
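The final embedding step, LSB insertion into quantized coefficients, can be sketched on plain integers standing in for the quantized DWT approximation coefficients. Only the exclusive-OR scrambling step is shown; the ATM and CCM encodings are omitted from this sketch.

```python
def xor_scramble(bits, key):
    """Exclusive-OR the watermark bits with a key stream (its own inverse)."""
    return [b ^ k for b, k in zip(bits, key)]

def embed_lsb(coeffs, bits):
    """Replace the least significant bit of each integer coefficient
    with one watermark bit."""
    return [(c & ~1) | b for c, b in zip(coeffs, bits)]

def extract_lsb(coeffs, n):
    """Read back the first n embedded bits."""
    return [c & 1 for c in coeffs[:n]]
```

Because XOR is its own inverse, scrambling before embedding and descrambling after extraction recover the original watermark bits, while each coefficient changes by at most 1, which is what keeps the watermark imperceptible.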

Author 1: Mehdi Khalili
Author 2: Mahsa Nazari

Keywords: ATM; CCM; DWT2; color spaces; non correlation watermarking technique

PDF

Paper 73: Semi-Automatic Segmentation System for Syllables Extraction from Continuous Arabic Audio Signal

Abstract: The paper describes a speaker-independent segmentation system for breaking uttered Arabic sentences into their constituent syllables. The goal is to construct a database of acoustic Arabic syllables as a step towards a syllable-based Arabic speech verification/recognition system. The proposed technique segments the utterances based on maxima extraction from the delta function of the 1st MFC coefficient. The method locates syllable boundaries by applying a template matching technique with reference utterances. The system was applied to a data set of 276 utterances to segment them into their 2544 constituent syllables, reaching a segmentation success rate of about 91.5%.

Author 1: Mohamed S. Abdo
Author 2: Ahmed H. Kandil

Keywords: Arabic speech syllables; automatic segmentation; boundaries detection; delta-MFCC features

PDF

Paper 74: Power-Controlled Data Transmission in Wireless Ad-Hoc Networks: Challenges and Solutions

Abstract: Energy scarcity and interference are two important factors determining the performance of wireless ad-hoc networks that should be considered in depth. A promising method of achieving energy conservation is transmission power control, which also contributes to the mitigation of interference and thereby promotes throughput by allowing multiple hosts to communicate in the same neighborhood simultaneously without impairing each other’s transmissions. However, as identified previously in the literature, the traditional hidden-terminal problem is exacerbated when a transmission power control mechanism is applied. In this article, we discuss the essentials of the power usage and throughput deficiency of the traditional 802.11 RTS/CTS mechanism. Improvements by means of power control are introduced, as well as solutions to the challenges likely to emerge from the use of diverse power levels throughout the network.

Author 1: Bilgehan Berberoglu
Author 2: Taner Cevik

Keywords: ad-hoc networks; energy conservation; power control; throughput

PDF

Paper 75: Face Recognition Based on Improved SIFT Algorithm

Abstract: People are usually identified by their faces. Developments in the past few decades have made it possible to automate the identification process, and face recognition now employs advanced statistical and matching methods. Improvements and innovations in face recognition technology during the past 10 to 15 years have propelled it to its current status. Due to the wide application of face recognition algorithms in many practical systems, including security control and human–computer interaction systems, algorithms with high success rates have attracted considerable research interest in recent years. Most of the suggested algorithms aim at correctly identifying face photos and assigning them to a person in a database. This study focuses on face recognition based on an improved SIFT algorithm, and the results indicate the superiority of the proposed algorithm over SIFT. To evaluate the proposed algorithm, it is applied to the ORL database and compared to other face recognition algorithms including Gabor, GPCA, GLDA, LBP, GLDP, KGWRCM, and SIFT. The results obtained from various tests show that the proposed algorithm achieves an accuracy of 98.75% and a shorter run time of 4.3 seconds, making it more efficient and more accurate than the other algorithms.

Author 1: EHSAN SADEGHIPOUR
Author 2: NASROLLAH SAHRAGARD

Keywords: face detection; improved SIFT descriptor; KGWRCM; GPCA; GLDA

PDF

Paper 76: Prediction of Mental Health Problems Among Children Using Machine Learning Techniques

Abstract: Early diagnosis of mental health problems helps professionals to treat them at an earlier stage and improves patients’ quality of life. There is therefore an urgent need to treat basic mental health problems prevailing among children, which may lead to complicated problems if not treated at an early stage. Machine learning techniques are currently well suited for analyzing medical data and diagnosing such problems. This research has identified eight machine learning techniques and compared their performance, on different measures of accuracy, in diagnosing five basic mental health problems. A data set consisting of sixty cases was collected for training and testing the techniques. Twenty-five attributes were identified from the documents as important for diagnosing the problems. The attributes were then reduced by applying feature selection algorithms over the full attribute set. The accuracy of the various machine learning techniques over the full attribute set and the selected attribute set has been compared. It is evident from the results that three classifiers, viz. Multilayer Perceptron, Multiclass Classifier and LAD Tree, produce more accurate results, with only a slight difference between their performance over the full attribute set and the selected attribute set.

Author 1: Ms. Sumathi M.R.
Author 2: Dr. B. Poorna

Keywords: Mental Health Diagnosis; Machine Learning; Prediction; Feature Selection; Basic Mental Health Problems

PDF

Paper 77: VLSI Design of a High Performance Decimation Filter Used for Digital Filtering

Abstract: With the rapid development of computers and communications, more and more chips are required to have small size, low power and high performance. The digital filter is one of the basic building blocks in Very Large Scale Integration (VLSI) implementations of mixed-signal circuits. This paper presents the design of a decimation filter used for digital filtering. It consists of Cascaded Integrator Comb (CIC) filters, using a Finite Impulse Response (FIR) and Infinite Impulse Response (IIR) filter structure. This architecture provides small area and low power consumption by avoiding multiplication structures. The design demonstrates a way of speeding up the route from the theoretical design in Simulink/Matlab, via behavioral simulation in fixed-point arithmetic, to an ASIC implementation. This has been achieved by porting the netlist of the Simulink system description into the Very high speed integrated circuit Hardware Description Language (VHDL). In the first instance, the Simulink-to-VHDL converter was designed to use structural VHDL code to describe system interconnections, allowing simple behavioral descriptions for basic blocks. A comparison of several architectures of this circuit, based on different architectures of the most popular filters, is presented. The comparison covers supply voltage, power consumption, area and technology. The proposed approach consumes only 2.94 mW of power at a supply voltage of 3 V. The core chip size of the filter block without bonding pads is 0.058 mm2 using the AMS 0.35 µm CMOS technology.

Author 1: Radhouane LAAJIMI
Author 2: Ali AJMI
Author 3: Randa KHEMIRI
Author 4: Mohsen Machout

Keywords: Digital circuit design; CIC decimation; Cascaded integrator comb filter (CIC); IIR-FIR structure

PDF

Paper 78: Proposal and Implementation of MPLS Fuzzy Traffic Monitor

Abstract: Multiprotocol Label Switched networks need highly intelligent controls to manage high-volume traffic, due to issues of traffic congestion and best path selection. The work demonstrated in this paper presents simulation results for building an optimal fuzzy-based algorithm for traffic splitting and congestion avoidance. The design and implementation of fuzzy-based software-defined networking is illustrated by introducing the Fuzzy Traffic Monitor in an ingress node. Compared to the default MPLS implementation, the results show improvements in mean delay (42.0%) and mean loss rate (2.4%) for video traffic, in mean delay (5.4%) and mean loss rate (3.4%) for data traffic, and in mean delay (44.9%) and mean loss rate (4.1%) for voice traffic.

Author 1: Anju Bhandari
Author 2: V.P.Singh

Keywords: Multiprotocol Label Switched Networks; Fuzzy Traffic Monitor; Network Simulator; Ingress; Traffic Splitting; Fuzzy Logic Control System; Label setup System; Traffic Splitting System

PDF

Paper 79: Verification of Statecharts Using Data Abstraction

Abstract: We present an approach for verifying Statecharts including infinite data spaces. We devise a technique for checking that a formula of the universal fragment of CTL is satisfied by a specification written as a Statechart. The approach is based on a property-preserving abstraction technique that additionally preserves structure. It is prototypically implemented in a logic-based framework using a theorem prover and a model checker. This paper reports the following results. (1) We present a proof infrastructure for Statecharts in the theorem prover Isabelle/HOL, which constitutes a basis for defining a mechanised data abstraction process. The formalisation is based on Hierarchical Automata (HA), which allow a structural decomposition of Statecharts into Sequential Automata. (2) Based on this theory we introduce a data abstraction technique that can be used to abstract the data space of a HA for a given abstraction function. The technique is based on constructing over-approximations; it is structure-preserving and designed in a compositional way. (3) For reasons of practicability, we finally present two tactics supporting the abstraction, which we have implemented in Isabelle/HOL. To make proofs more efficient, these tactics use the model checker SMV to check abstract models automatically.

Author 1: Steffen Helke
Author 2: Florian Kammuller

Keywords: Statecharts; CTL; Data Abstraction; Model Checking; Theorem Proving

PDF

Paper 80: Weighted Unsupervised Learning for 3D Object Detection

Abstract: This paper introduces a novel weighted unsupervised learning method for object detection using an RGB-D camera. The technique is feasible for detecting moving objects in noisy environments captured by an RGB-D camera. The main contribution of this paper is a real-time algorithm that detects each object as a separate cluster using weighted clustering. In a preprocessing step, the algorithm calculates the 3D position X, Y, Z and RGB color of each data point, and then calculates each data point’s normal vector using the point’s neighbors. After preprocessing, the algorithm calculates k weights for each data point, each weight indicating cluster membership, resulting in clustered objects of the scene.

Author 1: Kamran Kowsari
Author 2: Manal H. Alassaf

Keywords: Weighted Unsupervised Learning, Object Detection, RGB-D camera, Kinect

PDF

Paper 81: A Novel Approach for On-road Vehicle Detection and Tracking

Abstract: Given the necessary development of road safety, vision-based vehicle detection techniques have gained a significant amount of attention. This work presents a novel vehicle detection and tracking approach, structured as a pipeline from images or video data acquired by sensors installed on board the vehicle to vehicle detection and tracking. The features of the vehicle are extracted by the proposed GIST image processing algorithm and recognized by the state-of-the-art Support Vector Machine classifier. The tracking process is performed using an edge feature matching approach, with a Kalman filter used to correct the measurements. Extensive experiments carried out on real image data validate that the proposed approach is promising for on-road vehicle detection and tracking.

Author 1: Ilyas EL JAAFARI
Author 2: Mohamed EL ANSARI
Author 3: Lahcen KOUTTI
Author 4: Ayoub ELLAHYANI
Author 5: Said CHARFI

Keywords: Vehicle detection; Vehicle tracking; GIST; SVM; Edge features; Kalman filter

PDF

Paper 82: A Robust Hash Function Using Cross-Coupled Chaotic Maps with Absolute-Valued Sinusoidal Nonlinearity

Abstract: This paper presents a compact and effective chaos-based keyed hash function implemented by a cross-coupled topology of chaotic maps, which employs an absolute-valued sinusoidal nonlinearity and offers robust chaotic regions over broad parameter spaces with a high degree of randomness, as verified through chaoticity measurements using the Lyapunov exponent. Hash function operation involves an initial stage, in which the chaotic map accepts initial conditions, and a hashing stage, which accepts input messages and generates variable-length hash values. Hashing performance is evaluated in terms of changes to the original message condition, statistical analyses, and collision analyses. The results show that the mean changed probabilities are very close to 50%, and the mean number of bit changes is also close to half the hash value length. The collision tests reveal that the mean absolute difference of character values for hash values of 128, 160 and 256 bits is close to the ideal value of 85.43. The proposed keyed hash function enhances collision resistance compared to MD5, SHA-1, and other, more complicated chaos-based approaches. An Android application implementing the hash function is demonstrated.

Author 1: Wimol San-Um
Author 2: Warakorn Srichavengsup

Keywords: Hash Function; Cross-Coupled Chaotic Map; Sinusoidal Nonlinearity; Information security; Authentication

PDF
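The diffusion property reported above (a mean changed-bit probability close to 50%) can be checked with a standard avalanche test: flip one input bit and count how many output bits change. The sketch below is a minimal illustration using SHA-256 as a stand-in, since the chaos-based hash itself is not reproduced here; `avalanche_test` and its parameters are illustrative names, not from the paper.

```python
import hashlib
import random

def bit_changes(h1: bytes, h2: bytes) -> int:
    """Count differing bits between two equal-length hash values."""
    return sum(bin(a ^ b).count("1") for a, b in zip(h1, h2))

def avalanche_test(msg: bytes, trials: int = 1000) -> float:
    """Mean fraction of output bits that flip when one random input bit flips.
    Uses SHA-256 (256-bit output) as a stand-in for the chaos-based hash."""
    rng = random.Random(42)
    total, nbits = 0, 256
    for _ in range(trials):
        i = rng.randrange(len(msg) * 8)
        flipped = bytearray(msg)
        flipped[i // 8] ^= 1 << (i % 8)      # flip input bit i
        total += bit_changes(hashlib.sha256(msg).digest(),
                             hashlib.sha256(bytes(flipped)).digest())
    return total / (trials * nbits)

print(f"mean changed probability: {avalanche_test(b'The quick brown fox'):.3f}")
```

For a well-diffusing hash the printed value should be very close to 0.5, matching the paper's reported behavior.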

Paper 83: An Efficient Method for Distributing Animated Slides of Web Presentations

Abstract: Controlling the audience’s attention is required for successful presentations; therefore it is important to be able to give a presentation that reacts immediately, called a reactive presentation, to unexpected changes in the context given by the audience. Examples of functions for reactive presentation are shape animation effects on slides and slide transition effects. Understanding the functions that realize reactive presentation on the Web can be useful. In this work, we present an effective method for synchronizing shape animation effects on the Web, such as moving objects and changing the size and color of shape objects. The main idea is to make a video of the animated slides, called Web Slide Media, including the page information of the slides as movie chapter information for synchronization. Moreover, we explain a method to reduce the file size of the Web Slide Media by removing all shape animation effects and slide transition effects from a Web Slide Media item, called Sparse Web Slide Media. We demonstrate that the performance of the system is sufficient for practical use and that the file size of the Sparse Web Slide Media is smaller than that of the Web Slide Media.

Author 1: Yusuke Niwa
Author 2: Shun Shiramatsu
Author 3: Tadachika Ozono
Author 4: Toramatsu Shintani

Keywords: Collaborative tools; communication aids; information sharing; Web services

PDF

Paper 84: Applying data mining in the context of Industrial Internet

Abstract: Nowadays, industrial companies invest more and more in connecting with their clients and with the machines deployed at client sites. Mining all the collected data raises several technical challenges, but doing so yields a great deal of insight useful for improving equipment. We define two approaches for mining data in the context of the Industrial Internet, applied to one of the leading companies in shoe production lines but easily extendible to any producer. For each approach, various machine learning algorithms are applied along with a voting system. This leads to a robust model that is easy to adapt to any machine.

Author 1: Oliviu Matei
Author 2: Kevin Nagorny
Author 3: Karsten Stoebener

Keywords: machine learning; data mining; k-nearest neighbour; neural network; support vector machine; rule induction

PDF

Paper 85: Complex-Valued Neural Networks Training: A Particle Swarm Optimization Strategy

Abstract: QSAR (Quantitative Structure-Activity Relationship) modelling is one of the well-developed areas in drug development through computational chemistry. The relationship between molecular structure and change in biological activity is the centre of focus for QSAR modelling. Machine learning algorithms are important tools for QSAR analysis and, as a result, are integrated into the drug production process. In this paper we address the problem of training Complex-Valued Neural Networks (CVNNs) using Particle Swarm Optimization (PSO), an open topic in the machine learning community, since CVNNs are more complicated to train for complex-valued data processing due to constraints such as the activation function having to be bounded and differentiable over the entire complex space. A CVNN model for real-valued regression problems is presented, and the trained CVNN is tested on two drug sets as a real-world benchmark problem. The results show that the prediction and generalization abilities of CVNNs are superior to those of conventional real-valued neural networks (RVNNs). Moreover, the convergence of CVNNs is much faster than that of RVNNs in most cases.

Author 1: Mohammed E. El-Telbany
Author 2: Samah Refat

Keywords: Particle Swarm Optimization, Complex-Valued Neural Networks, QSAR, Drug Design, prediction

PDF
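The core PSO update driving the training above can be sketched as follows. This is a minimal real-valued PSO minimizing a toy objective, not the authors' complex-valued network training; the function name and parameter defaults (`w`, `c1`, `c2`, swarm size, search bounds) are illustrative assumptions.

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, seed=0,
                 w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimization of f over R^dim.
    Velocity update: v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # per-particle best positions
    pbest_f = [f(x) for x in X]
    g = pbest[min(range(n_particles), key=lambda i: pbest_f[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest_f[i]:               # update personal best
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < f(g):                 # update global best
                    g = X[i][:]
    return g, f(g)

# Toy usage: minimize the 3-D sphere function.
best, val = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=3)
print(best, val)
```

In the paper's setting, `f` would be the CVNN training loss and each particle a candidate set of complex-valued weights.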

Paper 86: Translation of the Mutation Operator from Genetic Algorithms to Evolutionary Ontologies

Abstract: Recently introduced, evolutionary ontologies represent a new concept combining genetic algorithms and ontologies. We have defined a framework comprising the set of parameters required for any evolutionary algorithm: the ontological space, the representation of individuals, and the main genetic operators such as selection, crossover, and mutation. Although a secondary operator, mutation proves its importance in creating and maintaining diversity in evolutionary ontologies. Therefore, in this article, we discuss the mutation operator in evolutionary ontologies at length, demonstrating its practical usefulness with experimental results. We also introduce a new mutation operator, called relational mutation, which mutates a relationship through its inverse.

Author 1: Diana Contras
Author 2: Oliviu Matei

Keywords: Evolutionary ontologies; Genetic algorithms; Mutation; Ontology

PDF

Paper 87: Faster Scalar Multiplication Algorithm to Implement a Secured Elliptic Curve Cryptography System

Abstract: Elliptic Curve Cryptography provides a similar strength of protection to other public-key cryptosystems but requires a significantly smaller key size. This paper proposes a new, faster scalar multiplication algorithm aimed at a more secure Elliptic Curve Cryptography scheme. It also proposes a novel Elliptic Curve Cryptography scheme in which a maximum-length random sequence generation method is used as the data mapping technique onto an elliptic curve over a finite field. The proposed scheme is tested on prime fields of various bit lengths and on various key sizes. The numerical experiments demonstrate that the proposed scheme reduces computation time compared to the conventional scheme and shows very high strength against cryptanalytic attack, particularly the random walk attack.

Author 1: Fatema Akhter

Keywords: Cryptography; Elliptic curve cryptography; Scalar multiplication; Random walk; Elliptic curve discrete logarithm problem

PDF
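For reference, the baseline operation the paper accelerates is elliptic-curve scalar multiplication kP. A minimal sketch of the conventional left-to-right double-and-add method over a toy prime-field curve is shown below; the paper's faster algorithm is not reproduced here, and the curve and generator are textbook examples, not taken from the paper.

```python
# Affine point arithmetic on y^2 = x^3 + a*x + b over F_p (None = infinity).
def ec_add(P, Q, a, p):
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P, a, p):
    """Left-to-right double-and-add: computes kP in O(log k) group operations."""
    R = None
    for bit in bin(k)[2:]:
        R = ec_add(R, R, a, p)          # double
        if bit == "1":
            R = ec_add(R, P, a, p)      # add
    return R

# Toy curve y^2 = x^3 + 2x + 2 over F_17; G = (5, 1) generates a group of order 19.
print(scalar_mult(19, (5, 1), 2, 17))   # 19*G is the point at infinity -> None
```

Scalar multiplication dominates ECC running time, which is why a faster algorithm for it directly speeds up the whole scheme.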

Paper 88: FPGA Prototype Implementation of Digital Hearing Aid from Software to Complete Hardware Design

Abstract: The design and implementation of digital hearing aids requires detailed knowledge of the various digital signal processing techniques used in hearing aids, such as wavelet transforms, uniform and non-uniform filter banks, and the Fast Fourier Transform (FFT). In this paper the design and development of the digital part of a hearing aid is divided into three phases. In the first phase, a review and Matlab simulation of the various signal processing techniques used in digital hearing aids is presented. In the second phase, a software implementation was carried out and the firmware was designed for the Xilinx MicroBlaze softcore processor system. In the third phase, everything was moved into hardware using the VHDL hardware description language. The implementation was done on a Xilinx Field Programmable Gate Array (FPGA) development board.

Author 1: Abdul Rehman Buzdar
Author 2: Azhar Latif
Author 3: Liguo Sun
Author 4: Abdullah Buzdar

Keywords: Hearing Aid; FPGA; CODEC; MicroBlaze; Wavelets; Filter Banks; FFT

PDF

Paper 89: Innovative Framework for e-Government adoption in Saudi Arabia: A Study from the business sector perspective

Abstract: E-Government increases transparency and improves communication between the government and its users. Providing e-Government services to the business sector is a fundamental mission of governmental agencies in Saudi Arabia. However, the adoption of e-Government systems is less than satisfactory in many countries, particularly developing countries. This is a significant factor that can lead to e-Government failure and, therefore, to wasted budget and effort. One pertinent, unanswered question is what key factors influence the adoption and utilisation level of users from the business sector. Unlike much research in the literature that has utilised common technology acceptance models and theories to analyse the adoption of e-Government, which may not be sufficient for such analysis, this study proposes a conceptual framework following a holistic approach to analyse key factors that influence the adoption and utilisation of e-Government in Saudi Arabia. The developed framework, the E-Government Adoption and Utilisation Model (EGAUM), was built on a critical evaluation of several common models and theories of technology acceptance and use, including the Technology Acceptance Model (TAM) and the Unified Theory of Acceptance and Use of Technology (UTAUT), in conjunction with an analysis of the e-Government adoption literature. The study involved 48 participating business entities from two major cities in Saudi Arabia, Riyadh and Jeddah. The descriptive and statistical analyses presented in this paper indicate that all the proposed factors have some degree of influence on the adoption and utilisation level. Perceived Benefits, Awareness, Previous Experience, and Regulations & Policies were found to be the significant factors most likely to influence the adoption and usage level of users from the business sector.

Author 1: Saleh Alghamdi
Author 2: Natalia Beloff

Keywords: E-Government; E-Services; Saudi Arabia; Technology Adoption; Influential Factors; Users’ Intention; Business Sector Perspective

PDF

Paper 90: Mobile computation offloading architecture for mobile augmented reality, case study: Visualization of cetacean skeleton

Abstract: Augmented reality applications can serve as teaching tools in different contexts of use. Augmented reality applications on mobile devices can help provide tourist information in cities or give information during museum visits. For example, during visits to natural history museums, augmented reality applications on mobile devices can let visitors interact with the skeleton of a whale. However, rendering heavy models can be computationally infeasible on devices with limited resources such as smartphones or tablets. One solution to this problem is to use mobile computation offloading techniques. This work proposes a mobile computation offloading architecture for mobile augmented reality, which allows users to interact with a whale skeleton through an augmented reality application on mobile devices. Finally, tests were conducted to assess the optimization of the mobile device’s resources when performing heavy rendering.

Author 1: Belen G. Rodriguez-Santana
Author 2: Amilcar Meneses Viveros
Author 3: Blanca Esther Carvajal-Gamez
Author 4: Diana Carolina Trejo-Osorio

Keywords: Mobile augmented reality, mobile devices, render, mobile computation offloading

PDF

Paper 91: Resolution Method in Linguistic Propositional Logic

Abstract: The present paper focuses on the resolution method for a linguistic propositional logic with truth values in a logical algebra, the refined hedge algebra. The preliminaries of refined hedge algebras are given first. Then the syntax and semantics of the linguistic propositional logic are defined. Finally, a resolution method based on the resolution principle in two-valued logic is established. Accordingly, the research in this paper provides helpful support for the application of intelligent reasoning systems based on linguistic-valued logic, which includes incomparable information.

Author 1: Thi-Minh-Tam Nguyen
Author 2: Duc-Khanh Tran

Keywords: Resolution; Linguistic Truth Value; Linguistic Propositional Logic; Hedge Algebra

PDF

Paper 92: Single-Handed Cursor Control Technique Optimized for Rear Touch Operation and Its Usability

Abstract: To improve single-handed operation of mobile devices, the use of a rear touch panel has potential for user interactions. In this paper, a basic study of operational control achieved simply through drag and tap of the index finger on a rear touch panel is conducted. Since a user has to hold the handheld device firmly with the thumb and fingers, the movable range of the tip of the index finger is limited. This restriction requires a user to perform several dragging actions to move a cursor to a distant target. Considering this kinematic restriction, a technique optimized for rear operation is proposed, wherein not only the position but also the velocity of fingertip movement is taken into account. Movement time, the number of dragging operations, and the throughput of the proposed technique have been evaluated in comparison with the generic technique using Fitts’s law. Experiments were conducted with ten participants performing target selection in the form of reciprocal 1D pointing tasks. The combinations of two ways of holding the device (landscape and portrait) and two directions of dragging (horizontal and vertical) were considered. As a result, the proposed technique achieved 5 to 13% shorter movement times, 20 to 40% higher throughput, and no deterioration in the number of dragging operations, even for more distant targets. In addition, further analysis revealed that there exist advantageous combinations of the way of holding and the direction of dragging, which would be beneficial for the better design of single-handed user interactions using rear touch.

Author 1: Yoshikazu Onuki
Author 2: Itsuo Kumazawa

Keywords: Rear touch; cursor control; mobile device; single-handed; Fitts’s law

PDF
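The throughput figures above come from Fitts's-law analysis. A minimal sketch of the standard Shannon formulation, where the index of difficulty is ID = log2(D/W + 1) bits and throughput is ID divided by mean movement time, is shown below; the distance, width and time values are illustrative, not the paper's actual task parameters.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s: index of difficulty over mean movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# Illustrative reciprocal 1D pointing condition: 96 mm amplitude, 32 mm target.
ID = index_of_difficulty(96, 32)          # log2(96/32 + 1) = 2.0 bits
print(ID, throughput(96, 32, 0.8))
```

Comparing such per-condition throughputs between the proposed and generic techniques is what yields the reported 20 to 40% improvement.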

Paper 93: Traffic Sign Detection and Recognition using Features Combination and Random Forests

Abstract: In this paper, we present a computer vision based system for fast and robust Traffic Sign Detection and Recognition (TSDR), consisting of three steps. The first step consists of image enhancement and thresholding using the three components of the Hue, Saturation and Value (HSV) space. Then the distance-to-border feature and a Random Forests classifier are used to detect circular, triangular and rectangular shapes in the segmented images. The last step consists of identifying the information contained in the detected traffic signs. We compare four feature descriptors, Histogram of Oriented Gradients (HOG), Gabor, Local Binary Pattern (LBP), and Local Self-Similarity (LSS), as well as their different combinations. For the classifiers, we have carried out a comparison between Random Forests and Support Vector Machines (SVMs). The best results are given by the combination of HOG with LSS together with the Random Forest classifier. The proposed method has been tested on the Swedish Traffic Signs data set and gives satisfactory results.

Author 1: Ayoub ELLAHYANI
Author 2: Mohamed EL ANSARI
Author 3: Ilyas EL JAAFARI
Author 4: Said CHARFI

Keywords: Traffic Sign Recognition (TSR); thresholding; Hue Saturation and Value (HSV); Histogram of Oriented Gradients (HOG); Gabor; Local Binary Pattern (LBP); Local Self-Similarity (LSS); Random forests

PDF

Paper 94: Risk Propagation Analysis and Visualization using Percolation Theory

Abstract: This article presents a percolation-based approach for the analysis of risk propagation, using malware spreading as a showcase example. Conventional risk management is often driven by human (subjective) assessment of how one risk influences another, respectively how security incidents can cause subsequent problems in interconnected (sub)systems of an infrastructure. Using percolation theory, a well-established methodology in the fields of epidemiology and disease spreading, a simple simulation-based method is described to assess risk propagation systematically. This simulation is formally analyzed using percolation theory, to obtain closed-form criteria that help predict a pandemic incident propagation (or a propagation with average-case bounded implications). The method is designed as a security decision support tool, e.g., to be used in security operation centers. For that purpose, a flexible visualization technique is devised, naturally induced by the percolation model and the simulation algorithm derived from it. The main output of the model is a graphical visualization of the infrastructure (its physical or logical topology). This representation uses color codes to indicate the likelihood of problems arising from a security incident that initially occurs at a given point in the system. Large likelihoods of problems thus indicate “hotspots” where additional action should be taken.

Author 1: Sandra Konig
Author 2: Stefan Rass
Author 3: Stefan Schauer
Author 4: Alexander Beck

Keywords: security operation center; malware infection; percolation; BYOD; risk propagation; visualization

PDF
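The simulation-based assessment described above can be sketched as a bond-percolation Monte Carlo on the infrastructure graph: each edge transmits the incident independently with some probability, and per-node affected frequencies over many trials give the color-coded likelihoods. The graph, edge probabilities and function names below are hypothetical illustrations, not taken from the paper.

```python
import random
from collections import deque

def propagate(adj, edge_prob, source, rng):
    """One percolation trial: each edge transmits the incident with its
    probability; returns the set of affected nodes (BFS from the source)."""
    affected, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in affected and rng.random() < edge_prob[(u, v)]:
                affected.add(v)
                queue.append(v)
    return affected

def risk_heatmap(adj, edge_prob, source, trials=2000, seed=1):
    """Per-node likelihood of being affected; high values mark 'hotspots'."""
    rng = random.Random(seed)
    counts = {v: 0 for v in adj}
    for _ in range(trials):
        for v in propagate(adj, edge_prob, source, rng):
            counts[v] += 1
    return {v: c / trials for v, c in counts.items()}

# Hypothetical 4-node infrastructure: an A-B-C chain plus a weak A-D link.
adj = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B"], "D": ["A"]}
p = {("A", "B"): 0.9, ("B", "A"): 0.9, ("B", "C"): 0.5, ("C", "B"): 0.5,
     ("A", "D"): 0.1, ("D", "A"): 0.1}
print(risk_heatmap(adj, p, "A"))
```

Mapping each node's frequency to a color then yields the kind of topology heatmap the paper proposes for security operation centers.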

Paper 95: MAI and Noise Constrained LMS Algorithm for MIMO CDMA Linear Equalizer

Abstract: This paper presents a constrained least mean squared (LMS) algorithm for a MIMO CDMA linear equalizer, which is constrained on the spreading sequence length, the number of subscribers, the variance of the Gaussian noise, and the multiple access interference (MAI) plus additive noise (introduced as a new constraint). The novelty of the proposed algorithm is that MAI and MAI-plus-noise variance have never before been used as constraints in MIMO CDMA systems. Convergence analysis is performed for the proposed algorithm for the case when the statistics of MAI and MAI plus noise are available. Simulation results compare the performance of the proposed constrained algorithm with other constrained algorithms and show that the new algorithm outperforms the existing ones.

Author 1: Khalid Mahmood
Author 2: Syed Muhammad Asad
Author 3: Muhammad Moinuddin
Author 4: Waqas Imtiaz

Keywords: Least mean squared (LMS), multiple input, multiple output (MIMO), linear equalizer, multiple access interference (MAI), variance, AWGN, adaptive algorithm

PDF
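For context, the unconstrained LMS baseline that the proposed constrained variant builds on updates the equalizer taps as w ← w + μ·e·x, with error e = d − wᵀx. Below is a minimal single-channel sketch (the paper's MAI-and-noise-constrained MIMO version is not reproduced); the channel, step size and signal model are illustrative assumptions.

```python
import numpy as np

def lms_equalizer(x, d, num_taps=8, mu=0.01):
    """Standard LMS adaptive equalizer.
    x: received samples, d: desired (training) symbols.
    Update rule: w <- w + mu * e[n] * x_vec, with e[n] = d[n] - w @ x_vec."""
    w = np.zeros(num_taps)
    err = np.zeros(len(d))
    for n in range(num_taps - 1, len(d)):
        x_vec = x[n - num_taps + 1:n + 1][::-1]   # x[n], x[n-1], ...
        y = w @ x_vec                             # equalizer output
        err[n] = d[n] - y
        w += mu * err[n] * x_vec
    return w, err

# Toy setup: BPSK symbols through a known 3-tap channel plus mild noise.
rng = np.random.default_rng(0)
sym = rng.choice([-1.0, 1.0], size=4000)
chan = np.array([1.0, 0.4, 0.2])                  # minimum-phase channel
rx = np.convolve(sym, chan)[:len(sym)] + 0.01 * rng.standard_normal(len(sym))
w, err = lms_equalizer(rx, sym)
print("final MSE:", np.mean(err[-500:] ** 2))
```

The constrained variants in the paper restrict this same update so the taps additionally satisfy conditions derived from the spreading sequence, noise and MAI statistics.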

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org