The Science and Information (SAI) Organization
IJACSA Volume 7 Issue 7

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: A Vertical Handover Management for Mobile Telemedicine System using Heterogeneous Wireless Networks

Abstract: Application of existing mobile telemedicine systems is restricted by imperfect network coverage, limited network capacity, and mobility. In this paper, a novel telemedicine-based handover decision-making (THODM) algorithm is proposed for mobile telemedicine systems using heterogeneous wireless networks. The proposed algorithm selects the best network based on the service requirements, ensuring that the connected or targeted candidate network has sufficient capacity to support the telemedicine services. The simulation results show that the proposed algorithm minimizes the number of unnecessary handovers to WLAN in high-speed environments. The throughput achieved by the proposed algorithm is up to 75% and 205% higher than the cellular- and RSS-based schemes, respectively. Moreover, the average data transmission cost of the THODM algorithm is 24% and 69.2% lower than the cellular and RSS schemes. The proposed algorithm minimizes the average transmission cost while maintaining telemedicine service quality at the highest level in high-speed environments.
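The service-aware, speed-aware selection rule the abstract describes can be illustrated with a minimal sketch. All network parameters, thresholds, and the cost-based tie-break below are hypothetical illustrations, not the authors' THODM algorithm:

```python
def select_network(networks, required_bandwidth_mbps, speed_kmh, wlan_speed_limit=60):
    """Pick a candidate network for a telemedicine session.

    networks: list of dicts with 'name', 'type', 'bandwidth_mbps', 'cost_per_mb'.
    A WLAN is skipped at high speed to avoid unnecessary handovers, and any
    network lacking capacity for the service requirement is rejected.
    """
    candidates = []
    for net in networks:
        if net['type'] == 'wlan' and speed_kmh > wlan_speed_limit:
            continue  # moving too fast: a WLAN handover would likely be wasted
        if net['bandwidth_mbps'] < required_bandwidth_mbps:
            continue  # insufficient capacity for the telemedicine service
        candidates.append(net)
    # among feasible networks, prefer the lowest transmission cost
    return min(candidates, key=lambda n: n['cost_per_mb']) if candidates else None

nets = [
    {'name': 'WLAN-1', 'type': 'wlan', 'bandwidth_mbps': 20, 'cost_per_mb': 0.01},
    {'name': 'LTE-1', 'type': 'cellular', 'bandwidth_mbps': 10, 'cost_per_mb': 0.05},
]
# at pedestrian speed the cheap WLAN wins; at vehicular speed it is skipped
low_speed = select_network(nets, 5, speed_kmh=5)
high_speed = select_network(nets, 5, speed_kmh=100)
```

The point of the sketch is only the two-part filter (capacity check plus speed-gated WLAN avoidance) followed by cost minimisation.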

Author 1: Hoe-Tung Yew
Author 2: Eko Supriyanto
Author 3: M Haikal Satria
Author 4: Yuan-Wen Hau

Keywords: Mobile telemedicine system; vertical handover; heterogeneous networks; unnecessary handover; throughput; cost

PDF

Paper 2: A New Approach for Improving Security against DoS Attacks in Vehicular Ad-hoc Networks

Abstract: Vehicular Ad-hoc Networks (VANETs) are a proper subset of mobile wireless networks in which the nodes are vehicles equipped with special electronic on-board units (OBUs) that enable them to transmit and receive messages from other vehicles in the VANET. In addition to vehicle-to-vehicle communication, the VANET interfaces with the road infrastructure through fixed contact points. VANET is a subgroup of MANETs; unlike MANET nodes, however, VANET nodes move very fast, so maintaining a stable route for the dissemination of emergency messages and alerts from a danger zone is a very challenging task. Routing therefore plays a significant role in VANETs. Decreasing network overhead, avoiding network congestion, and increasing the packet delivery ratio are the most important issues associated with routing in VANETs. In addition, VANETs are subject to various security attacks. In baseline VANET systems, an algorithm is used to discover attacks at confirmation time, which introduces delay overhead. This paper proposes the P-Secure approach, which detects DoS attacks before the confirmation time, reducing the processing delay overhead and increasing security in VANETs. Simulation results show that the P-Secure approach is more efficient than the OBUmodelVaNET approach in terms of PDR, e2e_delay, throughput, and packet drop rate.

Author 1: Reza Fotohi
Author 2: Yaser Ebazadeh
Author 3: Mohammad Seyyar Geshlag

Keywords: VANET; P-Secure protocol; DoS attack; detection; OBUmodelVaNET; security

PDF

Paper 3: Performance Evaluation of Routing Protocol (RPL) for Internet of Things

Abstract: Recently, the Internet Engineering Task Force (IETF) standardized a powerful and flexible routing protocol for Low-Power and Lossy Networks (RPL), intended for the Internet of Things. RPL is an extensible distance-vector protocol proposed for low-power and lossy networks in the global realm of IPv6 networks; it selects routes from a source to a destination node based on certain metrics injected into the objective function (OF). Previous work has investigated the performance of RPL in low-density networks. This study investigates the performance of RPL at medium density using two objective functions in various topologies (e.g., grid, random). The performance of RPL is studied using various metrics, such as Packet Delivery Ratio (PDR) and power consumption, under fixed Packet Reception Ratio (RX) values.
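An RPL objective function essentially ranks candidate parents by a routing metric. A toy, MRHOF-style sketch of that idea follows; the neighbour names, rank values, and ETX figures are invented for illustration and do not come from the paper:

```python
def choose_parent(candidates):
    """MRHOF-style parent choice: pick the neighbour that minimises
    path cost = neighbour's advertised rank + link ETX to that neighbour.
    Each candidate is a tuple (name, advertised_rank, link_etx)."""
    best = min(candidates, key=lambda c: c[1] + c[2])
    name, rank, etx = best
    return name, rank + etx  # our own rank becomes the minimised path cost

# hypothetical neighbour table: A is far up the DODAG but has a good link
neighbours = [('A', 256, 128), ('B', 128, 384), ('C', 384, 64)]
parent, my_rank = choose_parent(neighbours)
```

Swapping the metric inside the `key` function is exactly what comparing two objective functions (as the study does) amounts to at this level of abstraction.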

Author 1: Qusai Q. Abuein
Author 2: Muneer Bani Yassein
Author 3: Mohammed Q. Shatnawi
Author 4: Laith Bani-Yaseen
Author 5: Omar Al-Omari
Author 6: Moutaz Mehdawi
Author 7: Hussien Altawssi

Keywords: density network; objective function; zero grid; packet delivery; power consumption; Internet of Things

PDF

Paper 4: A New Method to Build NLP Knowledge for Improving Term Disambiguation

Abstract: Term sense disambiguation is essential for many NLP applications, including Internet search engines, information retrieval, data mining, and classification. However, older methods using case frames and semantic primitives are not adequate for resolving term ambiguities, which requires a great deal of information about sentences. This new approach introduces a system for building structured natural language knowledge. In this paper, all surface case patterns are classified in advance with consideration of the meaning of the noun. Moreover, the paper introduces an efficient data structure using a trie that defines the linkage among leaves and multi-attribute relations. Using this multi-attribute linkage, high-frequency associations between verbs and nouns can be accessed, with automatic generation of hierarchical relationships. In our experiment, a large tagged corpus (the Penn Treebank) is used to extract data. Around 11,000 verbs and nouns are used to verify the new method and build hierarchical groupings of nouns. Moreover, the accuracy of term disambiguation using our trie structure with linked leaves is 6% higher than that of the old method.
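The data structure at the core of the method, a trie whose leaves are threaded together, can be sketched generically. This is a plain character trie with a simple linked list through its leaves, standing in for the paper's richer multi-attribute leaf linkage:

```python
class TrieNode:
    def __init__(self):
        self.children = {}
        self.is_leaf = False
        self.next_leaf = None  # threads leaves together, mimicking the leaf linkage

class Trie:
    def __init__(self):
        self.root = TrieNode()
        self.last_leaf = None

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        if not node.is_leaf:
            node.is_leaf = True
            # chain the new leaf onto the previous one so all stored
            # entries can be traversed without re-walking the trie
            if self.last_leaf is not None:
                self.last_leaf.next_leaf = node
            self.last_leaf = node

    def contains(self, word):
        node = self.root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_leaf

t = Trie()
for w in ['eat', 'eats', 'eater']:
    t.insert(w)
```

Shared prefixes ('eat', 'eats', 'eater') occupy one path, which is what makes the structure compact for large verb/noun vocabularies.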

Author 1: E. MD. Abdelrahim
Author 2: El-Sayed Atlam
Author 3: R. F. Mansour

Keywords: Information Retrieval; NLP Knowledge; Disambiguation; Word Semantics; trie structure

PDF

Paper 5: Combination of Neural Networks and Fuzzy Clustering Algorithm to Evaluate Simulation-Based Training

Abstract: With the advancement of computer technology, computer simulations in the field of education have become more realistic and more effective. Simulation means creating a virtual environment that accurately reproduces real experiences in order to improve the individual, so Simulation-Based Training is the ability to improve, replace, create, or manage a real experience and training in a virtual mode. Simulation-Based Training also provides large amounts of information to learn from, so using data mining techniques to process this information can be very useful in education. Here we used data mining to examine the impact of simulation-based training. The database, created in cooperation with the relevant institutions, includes 17 features. To study the effect of the selected features, the LDA method and Pearson's correlation coefficient were used along with a genetic algorithm. We then used fuzzy clustering to produce a fuzzy system and improved it using neural networks. The results showed that the proposed method with reduced dimensions performs 3% better than other methods.

Author 1: Lida Pourjafar
Author 2: Mehdi Sadeghzadeh
Author 3: Marjan Abdeyazdan

Keywords: Educational Data Mining; Simulation-Based Training; Dimensions Reduction; ANFIS

PDF

Paper 6: A Proposed Quantitative Conceptual Model for the Assessment of Patient Clinical Outcome

Abstract: The assessment of patient clinical outcome focuses on measuring various aspects of the patient’s health status after medical treatments and interventions. Patient clinical outcome assessment is a major concern in the clinical field as the current measures are not well developed and, as a result, they may be used without sufficient understanding of their characteristics. This issue retards the development in the clinical field. This paper proposes a general pure quantitative conceptual model for the assessment of patient clinical outcome. The proposed model contains five WHO’s International Classification of Functioning, Disability, and Health (ICF) measurable components: body functions impairment, clinical elegancy distortion, pain, death, and shortening of life expectancy. Total patient clinical outcome is measured by summing the five WHO components. Five validity types are used to validate the proposed model: content, construct, criterion, descriptive, and predictive validities.

Author 1: Mou’ath Hourani

Keywords: quantitative; conceptual model; assessment; patient clinical outcome

PDF

Paper 7: Identifying and Prioritizing Evaluation Criteria for User-Centric Digital Identity Management Systems

Abstract: Identity management systems are used for securing users' digital identities in a reliable, automated, and compatible way. Service providers employ identity management systems that are cost-effective and scalable but offer poor usability for users. Identity management systems are user-centric applications and should be designed from the users' perspective. User centricity is a remarkable concept in identity management systems, as it provides more powerful user control and privacy; this approach evolved by amending past paradigms. Thus, evaluating digital identity management systems from the users' point of view is very important. The main objective of this paper is to identify the appropriateness of the criteria used in the evaluation of user-centric digital identity management systems. These criteria are gathered from the literature and then categorized, for the first time in this work, into four groups to examine the importance of each parameter. In this approach, several interviews were performed as a qualitative research method, and two questionnaires were filled out by forty-six users who were involved with identity management systems. Since the answers are perception-based data, the most important criteria in each category are assessed using a fuzzy method. This research found that the most important criteria are related to the security category. The results can provide valuable information for managers and decision makers of hosting companies, as well as system designers, to adapt and develop appropriate user-centric digital identity management systems.

Author 1: Sepideh Banihashemi
Author 2: Elaheh Homayounvala
Author 3: Alireza Talebpour
Author 4: Abdolreza Abhari

Keywords: management of information technology; digital identity management systems; evaluation criteria; fuzzy analytical hierarchy process (FAHP); user-centricity

PDF

Paper 8: Direct Torque Control of Saturated Doubly-Fed Induction Generator using High Order Sliding Mode Controllers

Abstract: The present work examines a direct torque control strategy using high-order sliding-mode controllers for a doubly-fed induction generator (DFIG) incorporated in a wind energy conversion system and operating in the saturated state. This research pursues two main objectives. First, to introduce more accuracy into the calculation of DFIG performance, an accurate model considering the magnetic saturation effect is developed. The second objective is to achieve robust control of the DFIG-based wind turbine. For this purpose, Direct Torque Control (DTC) combined with High-Order Sliding-Mode Control (HOSMC) is applied to the DFIG rotor-side converter. Conventional direct torque control with hysteresis comparators exhibits major flux and torque ripples at steady state, and its switching frequency varies over a large range. The new DTC method gives perfect decoupling between the flux and the torque and also reduces ripples in these quantities. Finally, simulation results show that accurate dynamic performance, faster transient responses, and more robust control are achieved.

Author 1: Elhadj BOUNADJA
Author 2: Abdelkader DJAHBAR
Author 3: Mohand Oulhadj MAHMOUDI
Author 4: Mohamed MATALLAH

Keywords: Doubly Fed Induction Generator (DFIG); Magnetic saturation; Direct Torque Control (DTC); High Order Sliding Mode Controller (HOSMC)

PDF

Paper 9: Energy Dissipation Model for 4G and WLAN Networks in Smart Phones

Abstract: With the modernization of the telecommunication standards, there has been considerable evolution of various technologies to assist cost effective communication. In this regard, the fourth generation communication services or commonly known as 4G mobile networks have penetrated almost every part of the world to offer faster and seamless data connectivity. However, such services come at the cost of energy drained from the smart phone supporting 4G services. This paper presents an algorithm that is capable of evaluating the actual amount of energy being dissipated while using next generation mobile networks. The study also performs a comparative analysis of energy dissipation of 4G networks with other wireless local area networks to understand the networks that cause more energy dissipation.

Author 1: Shalini Prasad
Author 2: S. Balaji

Keywords: 4G Wireless Networks; Energy Consumption; Smart phone; Wi-Fi

PDF

Paper 10: Effective Data Mining Technique for Classification of Cancers via Mutations in Genes using Neural Networks

Abstract: Prediction plays an important role in the efficient protection against, and therapy/treatment of, cancer. Predicting mutations in a gene requires diagnosis and classification based on a sufficiently large database to reach accurate results. The tumor suppressor p53 is involved in approximately fifty percent of all human tumors because of mutations that occur in the TP53 gene in cells, so this paper focuses on the tumor protein p53. The problem is that the several existing primitive databases (e.g., Excel genome and protein databases) containing datasets of the TP53 gene and its tumor protein p53 are rich datasets covering all mutations and the diseases (cancers) they cause, but they cannot by themselves predict and diagnose cancers; that is, these big datasets lack an efficient data mining method that can predict and diagnose the mutation and classify the patient's cancer. The goal of this paper is to develop a data mining technique employing a neural network over these big datasets that offers friendly prediction and flexible, effective cancer classification, overcoming the drawbacks of previous techniques. The proposed technique uses two approaches: first, bioinformatics techniques (BLAST, CLUSTALW, etc.) to determine whether malignant mutations are present; second, data mining using a neural network, for which 12 of the 53 fields of the TP53 gene database were selected. One of these 12 fields (the gene location field) did not exist in the TP53 gene database and was therefore added to it for training and testing the back-propagation algorithm, in order to classify the specific types of cancer. A feed-forward back-propagation network supports this data mining method with a training rate of 1 and a mean square error (MSE) of 0.00000000000001. This effective technique allows the type of cancer to be classified in a quick, accurate, and easy way.

Author 1: Ayad Ghany Ismaeel
Author 2: Dina Yousif Mikhail

Keywords: Detection; Classification; Data Mining; TP53 Gene; Tumor Protein P53; Back Propagation Network (BPN)

PDF

Paper 11: Firefly Algorithm for Adaptive Emergency Evacuation Center Management

Abstract: Flood disasters are among the most devastating natural disasters in the world, claiming many lives and causing extensive property damage. The pattern of floods across all continents has been changing, becoming more frequent, intense, and unpredictable for local communities. Due to unforeseen scenarios, some evacuation centers that host flood victims may themselves be flooded. Hence, prompt decision making is required to relocate the victims and resources to a safer center. This study proposes a Firefly Algorithm (FA) to be employed in emergency evacuation center management. Experimental analysis of a minimization problem was performed to compare the solutions produced by FA with those generated using Tabu Search. Results show that the proposed FA produced solutions with smaller utility values, indicating that it is better than the benchmark method.
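The Firefly Algorithm itself is a standard swarm method: each firefly moves toward brighter (lower-cost) fireflies, with an attractiveness that decays with distance plus a small random walk. A minimal sketch on a toy minimization problem follows; the population size, iteration count, and coefficient values are generic textbook choices, not the paper's tuned settings, and the sphere function stands in for the evacuation-center utility function:

```python
import math
import random

def firefly_minimize(f, dim=2, n=15, iters=100, alpha=0.2, beta0=1.0, gamma=1.0, seed=1):
    """Minimal firefly algorithm for minimisation (illustrative parameters)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    cost = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:  # firefly j is brighter: i moves toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    cost[i] = f(pop[i])
        alpha *= 0.97  # gradually cool down the random walk
    best = min(range(n), key=lambda k: cost[k])
    return pop[best], cost[best]

# toy objective: sphere function, minimum 0 at the origin
best_x, best_cost = firefly_minimize(lambda x: sum(v * v for v in x))
```

In the paper's setting, the candidate solutions would encode victim/resource relocation plans and `f` would be the utility value being minimized.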

Author 1: Yuhanis Yusof
Author 2: Nor Laily Hashim
Author 3: Noraziah ChePa
Author 4: Azham Hussain

Keywords: Firefly Algorithm; Swarm Intelligence; Flood Management; Evacuation Center Management

PDF

Paper 12: A Conversion of Empirical MOS Transistor Model Extracted From 180 nm Technology To EKV3.0 Model Using MATLAB

Abstract: In this paper, the EKV3.0 model used for RF analog design is validated in all inversion regions under various bias conditions and geometrical effects. A conversion of empirical data from a 180 nm CMOS process to the EKV model is proposed. A MATLAB algorithm for parameter extraction was developed to evaluate the basic EKV model parameters. With the source and drain voltages referred to the substrate held constant, the DC currents and the gm/ID ratio of real transistors can be reconstructed by means of the EKV model with acceptable accuracy, even for short-channel devices. The results verify that the model takes into account second-order effects such as DIBL and CLM. The sizing of an elementary amplifier is considered as a worked example: the sizing procedure based on the gm/ID methodology is described using both a semi-empirical model and the EKV model, and the two give close results.

Author 1: Amine AYED
Author 2: Mongi LAHIANI
Author 3: Hamadi GHARIANI

Keywords: EKV model; gm/ID methodology; analog design; MATLAB

PDF

Paper 13: PSO Algorithm based Adaptive Median Filter for Noise Removal in Image Processing Application

Abstract: An adaptive switching median filter for salt-and-pepper noise removal based on particle swarm optimization is presented. The proposed filter consists of two stages: a noise detection stage and a noise filtering stage. Particle swarm optimization is effective for single-objective problems, and the noise detection stage builds on it. In contrast to the standard median filter, the proposed algorithm generates a noise map of the corrupted image, which gives information about the corrupted and uncorrupted pixels. In the filtering stage, the filter calculates the median of the uncorrupted neighbouring pixels and replaces each corrupted pixel with it. Extensive simulations were performed to validate the proposed filter. Simulation results show improvement in both Peak Signal-to-Noise Ratio (PSNR) and Image Quality Index (IQI) values, and experimental results show that the proposed method is more effective than existing methods.
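The two-stage idea (noise map, then median of uncorrupted neighbours) can be sketched directly. In this simplified version the detection stage just flags extreme pixel values; the PSO-tuned adaptive detection the paper describes is replaced by fixed 0/255 thresholds:

```python
from statistics import median

def switching_median_filter(img, low=0, high=255):
    """Salt-and-pepper removal in two stages: build a noise map flagging
    extreme-valued pixels, then replace each flagged pixel with the median
    of its uncorrupted 3x3 neighbours. Clean pixels pass through unchanged,
    unlike a standard median filter which blurs them too."""
    h, w = len(img), len(img[0])
    noise_map = [[img[y][x] in (low, high) for x in range(w)] for y in range(h)]
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if not noise_map[y][x]:
                continue  # uncorrupted pixel: leave it alone
            clean = [img[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))
                     if not noise_map[j][i]]
            if clean:
                out[y][x] = median(clean)
    return out

img = [[10, 10, 10],
       [10, 255, 10],   # salt pixel
       [10, 10, 0]]     # pepper pixel
filtered = switching_median_filter(img)
```

Excluding flagged neighbours from the median is what lets the filter cope with heavy noise densities where a plain median window would itself be dominated by corrupted pixels.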

Author 1: Ruby Verma
Author 2: Rajesh Mehra

Keywords: Switching median filter; Particle Swarm algorithm; Noise removal; salt and pepper noise

PDF

Paper 14: Switched Control of a Time Delayed Compass Gait Robot

Abstract: The analysis and control of time-delayed systems is an increasingly active research topic, mainly because delays are frequently encountered in technological systems. Most control laws are implemented on digital computers, and delays are intrinsic to the process or arise in the control loop from the transmission time of control sequences or from computing time. On the other hand, the control of humanoid walking robots presents a common problem in robotics because it involves physical interaction between an articulated system and its environment; this close relationship raises a set of fundamental problems, such as the implementation of robust, stable dynamic control. This paper presents a complete approach, based on switched-system theory, for the stabilization of a compass gait robot subject to transmission time delays. The multiple feedback gains designed are based on multiple linear systems governed by a switching control law. The application of the control law in real time is affected by the unknown bounded random delay. The results obtained with this method show that the control law stabilizes the compass robot's walk despite a varying delay reaching six times the sampling period.

Author 1: Elyes Maherzi
Author 2: Walid Arouri
Author 3: Mongi Besbes

Keywords: Biped robot; delayed system; Switched system; Stability; Lagrange formulation; Lyapunov method; Relaxation; Linear matrix inequalities (LMI); bilinear matrix inequalities (BMI)

PDF

Paper 15: An Evolutionary Stochastic Approach for Efficient Image Retrieval using Modified Particle Swarm Optimization

Abstract: An image retrieval system is a reliable tool that can help people make efficient use of digital image collections, so finding efficient methods for image retrieval is important. Color and texture descriptors are two basic features in image retrieval. In this paper, an approach is employed that combines color moments and texture features to extract the low-level features of an image. Assigning equal weights to the different types of features does not yield good results; applying a different weight to each feature solves this problem. In this work, the weights are tuned using a modified Particle Swarm Optimization (PSO) method to increase the average precision of the system. In fact, a novel method based on an evolutionary approach is presented, and the motivation of this work is to enhance the precision of the retrieval system with an improved PSO algorithm. The average precision of the presented method using equally weighted features and optimally weighted features is 49.85% and 54.16%, respectively; the 4.31-point increase in average precision achieved by the proposed technique yields higher recognition accuracy and better search results after using PSO.
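The core mechanism, per-feature weights inside the query/database distance, can be shown in a few lines. The feature vectors, image names, and "texture-heavy" weight vector below are invented for illustration; in the paper the weights come from a modified PSO run rather than being hand-picked:

```python
def weighted_distance(q, d, weights):
    """Distance between query and database feature vectors in which each
    feature component (e.g. a colour moment or a texture measure) gets its
    own weight, instead of the uniform weighting used as a baseline."""
    return sum(w * (a - b) ** 2 for w, a, b in zip(weights, q, d)) ** 0.5

query = [0.2, 0.8, 0.5]                       # hypothetical low-level features
db = {'img1': [0.2, 0.7, 0.9],
      'img2': [0.9, 0.8, 0.5]}
equal_w = [1.0, 1.0, 1.0]
texture_heavy = [0.1, 0.1, 2.0]               # hypothetical PSO-found weights

rank_equal = sorted(db, key=lambda k: weighted_distance(query, db[k], equal_w))
rank_tuned = sorted(db, key=lambda k: weighted_distance(query, db[k], texture_heavy))
```

The example is constructed so that re-weighting flips the retrieval ranking, which is exactly the lever PSO optimizes to raise average precision.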

Author 1: Hadis Heidari
Author 2: Abdolah Chalechale

Keywords: color moments; content based image retrieval; particle swarm optimization (PSO); texture feature

PDF

Paper 16: Evaluating Web Accessibility Metrics for Jordanian Universities

Abstract: University web portals are considered one of the main access gateways for universities. Typically, they have a large candidate audience among current students, employees, and faculty members, aside from past and future students, employees, and faculty members. Web accessibility is the concept of providing universal access to web content for different machines and for people of different ages, skills, education levels, and abilities. Several web accessibility metrics have been proposed in recent years to measure web accessibility. We integrated and extracted common web accessibility metrics from the different accessibility tools used in this study, which evaluates web accessibility metrics for 36 Jordanian university and educational institute websites. We analyze the level of web accessibility using a number of available evaluation tools against the standard guidelines for web accessibility. Receiver operating characteristic quality measurements are used to evaluate the effectiveness of the integrated accessibility metrics.

Author 1: Israa Wahbi Kamal
Author 2: Heider A. Wahsheh
Author 3: Izzat M. Alsmadi
Author 4: Mohammed N. Al-Kabi

Keywords: web accessibility, web ranking, web evaluation, web testing

PDF

Paper 17: Ontology for Academic Program Accreditation

Abstract: Many educational institutions are adopting national and international accreditation programs to improve teaching, student learning, and the curriculum. There is a growing demand across higher education for automation and helpful educational resources to continuously improve student outcomes. Student outcomes are the knowledge and skill set that graduates of an accredited program must gain in order to enter the workforce or continue with their future education. To evaluate student outcomes, each assessment activity must map to a course learning outcome, which in turn maps to student outcomes. The problem is that all course learning outcome and student outcome mappings are kept in documents or databases, which requires considerable work and time to access and understand. This paper proposes an ontology-based solution that enables visual discovery of all course learning outcomes that map to a particular student outcome, and of the related assessments, to help faculty and curriculum committees avoid over- or under-mapping student outcomes.

Author 1: Jehad Sabri Alomari

Keywords: Accreditation; Ontology; Semantic Web; classification; Education

PDF

Paper 18: A Dual Cylindrical Tunable Laser Based on MEMS

Abstract: Free-space optics is a topical field with a large variety of applications in which free space separates the source from the destination, such as the external cavity tunable laser (ECTL). In an ECTL, the laser source emits a Gaussian beam that propagates in the plane of the substrate until it reaches an external reflector. The efficiency of these applications depends on the amount of light coupled back into the laser, called the coupling efficiency. Increasing the coupling efficiency usually depends on placing assembled lenses or other optical parts in the path between the laser front facet and the external reflector, which increases the cost and integration effort. We introduce here a new configuration of external cavity tunable laser based on cylindrical (curved) mirrors. The use of a cylindrical mirror affects the amount of light coupled back into the laser and decreases the alignment requirements in laser assembly compared with configurations based on flat mirrors. The fabrication of a cylindrical mirror is simple compared with a spherical mirror, so it is suited to batch fabrication. Tuning is achieved using micro-electro-mechanical systems (MEMS) technology. The system consists of a laser cavity and two filter cavities for wavelength selection. The cylindrical microstructures are formed in the substrate volume, so we also report the micromachining method used to fabricate the cylindrical mirror; anisotropic etching and deep reactive ion etching (DRIE) are especially useful for the batch fabrication of large optical mechanical devices. The characteristics of the laser's spectral response versus variations in laser facet reflectance are described via simulations, taking into account the diffraction of light in the ECTL formed by the laser front facet and the external reflector. We report the complete model, including the fabrication steps and simulation analysis.

Author 1: Ahmed Fawzy
Author 2: Osama M. EL-Ghandour
Author 3: Hesham F.A. Hamed

Keywords: Dual ECTL; wavelength tuning; MEMS; DRIE

PDF

Paper 19: Function-Behavior-Structure Model of Design: An Alternative Approach

Abstract: The Function-Behavior-Structure model (FBS) of design conceptualizes objects in terms of function, behavior, and structure. It has been widely utilized as a foundation for modelling the design process that transforms posited functions to a description of behaviors. Nevertheless, the FBS model is still regarded as a subjective and experience-based process and it provides no theory about how a function is transformed into behavior. Research has shown that the critical concepts of function and behavior have many different definitions. This paper suggests a viable alternative and contrasts it with the FBS framework of design using published study cases. The results point to several benefits gained by adopting the proposed method.

Author 1: Sabah Al-Fedaghi

Keywords: conceptual design; FBS framework; flow-based model; function; behaviour; structure

PDF

Paper 20: Evolutionary Strategy of Chromosomal RSOM Model on Chip for Phonemes Recognition

Abstract: This paper contributes to the modeling and implementation, on a system on chip (SoC), of a powerful technique for phoneme recognition in continuous speech. A neural model known for its efficiency in static data recognition, the self-organizing map (SOM), is developed into a recurrent model to incorporate the temporal aspect of these applications. The resulting RSOM model is subsequently introduced to ensure the diversification of the genetic algorithm (GA) populations, to expand the search space even further and optimize the obtained results. We assign a chromosomal vision to this model in an effort to improve the information recognition rate.

Author 1: Mohamed Salah Salhi
Author 2: Nejib Khalfaoui
Author 3: Hamid Amiri

Keywords: Information recognition; Recurrent SOM; Chromosomal RSOM model; Evolutionary RSOM; Implementation over SoC

PDF

Paper 21: An Intelligent Agent based Architecture for Visual Data Mining

Abstract: The aim of this paper is to present an intelligent architecture for Decision Support Systems (DSS) based on visual data mining. This architecture applies multi-agent technology to facilitate the design and development of DSS in complex and dynamic environments; multi-agent systems add a high level of abstraction. To validate the proposed architecture, it was implemented to develop a distributed visual data mining based DSS to predict nosocomial infection occurrence in intensive care units. The developed prototype was evaluated to verify the architecture's practicability.

Author 1: Hamdi Ellouzi
Author 2: Hela Ltifi
Author 3: Mounir Ben Ayed

Keywords: Multi Agent System; Decision Support System; Visualization; Knowledge Discovery from Data; Nosocomial Infection

PDF

Paper 22: A Zone Classification Approach for Arabic Documents using Hybrid Features

Abstract: Zone segmentation and classification is an important step in document layout analysis. It decomposes a given scanned document into zones. Zones need to be classified into text and non-text, so that only text zones are provided to a recognition engine. This eliminates garbage output resulting from sending non-text zones to the engine. This paper proposes a framework for zone segmentation and classification. Zones are segmented using morphological operation and connected component analysis. Features are then extracted from each zone for the purpose of classification into text and non-text. Features are hybrid between texture-based and connected component based features. Effective features are selected using genetic algorithm. Selected features are fed into a linear SVM classifier for zone classification. System evaluation shows that the proposed zone classification works well on multi-font and multi-size documents with a variety of layouts even on historical documents.

Author 1: Amany M. Hesham
Author 2: Sherif Abdou
Author 3: Amr Badr
Author 4: Mohsen Rashwan
Author 5: Hassanin M. Al-Barhamtoshy

Keywords: segmentation; layout analysis; texture features; connected component analysis; Arabic script; genetic algorithms

PDF

Paper 23: Air Pollution Analysis using Ontologies and Regression Models

Abstract: The explosive growth of the Web throughout the world economy, a rapidly growing market characterized by short product cycles, the demand for increased flexibility, and the extensive use of new ways of viewing data have produced a data-managed society. A new socio-economic system relies more and more on the movement and allocation of data in daily life, in its refinement, in the economy, and in exchange across industry. Cooperative engineering, built on multi-disciplinary cooperation between people, is a good example; the Semantic Web, a new form of Web content that is meaningful to computers, is another. Communication, vision sharing, and the exchange of data are society's new commercial stakes. Urban air pollution modeling and data processing techniques therefore need a close association. Artificial intelligence offers countless approaches and breakthrough technologies that can solve environmental problems. Expressing data in a formal ontology gives it a true, unambiguous meaning and allows us to describe it precisely. In this work we survey regression models for ontologies and air pollution.

Author 1: Parul Choudhary
Author 2: Dr. Jyoti Gautam

Keywords: Ontologies; Air pollution Analysis; Regression Models; Linear Regression

PDF

Paper 24: Decision Support System for Diabetes Mellitus through Machine Learning Techniques

Abstract: Recently, diabetes mellitus has grown into an extremely feared disease that can have damaging effects on the health of its sufferers globally. In this regard, several machine learning models have been used to predict and classify diabetes types. Nevertheless, most of these models attempted to solve two problems: categorizing patients in terms of diabetic types and forecasting the blood sugar rate of patients. This paper presents an automatic decision support system for diabetes mellitus through machine learning techniques that takes into account the above problems, while also reflecting the skills of medical specialists, who believe that there is a strong relationship between a patient's symptoms, some chronic diseases, and the blood sugar rate. Data sets were collected from the Layla Qasim Clinical Center in the Kurdistan Region; the data was then cleaned and preprocessed using feature selection techniques such as Sequential Forward Selection and the Correlation Coefficient; finally, the refined data was fed into machine learning models for prediction, classification, and description purposes. This system enables physicians and doctors to provide diabetes mellitus (DM) patients with good health treatments and recommendations.
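The Correlation Coefficient filter mentioned in the abstract can be sketched as follows; this is a minimal illustration of ranking features by their absolute Pearson correlation with the label, not the authors' implementation, and the function names and toy data are hypothetical:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_by_correlation(features, labels, k):
    """Rank feature columns by |correlation with the label| and keep the top k.
    `features` is a list of rows; returns the indices of the selected columns."""
    cols = list(zip(*features))
    scores = [abs(pearson(col, labels)) for col in cols]
    return sorted(range(len(cols)), key=lambda i: scores[i], reverse=True)[:k]

# Toy data: column 0 tracks the label perfectly, column 1 is noise-like.
X = [[1, 7], [2, 3], [3, 9], [4, 1], [5, 5]]
y = [1, 2, 3, 4, 5]
print(select_by_correlation(X, y, 1))  # column 0 is most correlated
```

Sequential Forward Selection would instead grow the feature set greedily, re-evaluating a model at each step; the filter above is the cheaper of the two techniques named.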

Author 1: Tarik A. Rashid
Author 2: Saman M. Abdulla
Author 3: Rezhna M. Abdulla

Keywords: Diabetes disease; Blood sugar rate and symptoms; ANN; Prediction and Classification models

PDF

Paper 25: An Efficient Application Specific Memory Storage and ASIP Behavior Optimization in Embedded System

Abstract: Low-power embedded systems require an effective memory design that improves system performance with the help of memory implementation techniques. Application-specific data allocation design patterns shape the memory storage area, while internal cell design techniques determine data transition speeds. The embedded cache design is implemented with simulator and scheduling approaches, which can reduce cache miss behavior and increase the number of cache hits. Cache hit optimization, delay reduction, and latency prediction techniques are effective for ASIP design. The design functionality specifies the trade-off among various design metrics such as performance, power, size, cost, and flexibility. The ASIP behavior and memory storage area are optimized for low-power embedded systems, and effective scheduling techniques set the cycle time, improving system performance with low power consumption.

Author 1: Ravi Khatwal
Author 2: Manoj Kumar Jain

Keywords: Memory design; Compiler; Processor design; Scheduling Techniques; Memory storage

PDF

Paper 26: MAS based on a Fast and Robust FCM Algorithm for MR Brain Image Segmentation

Abstract: With the aim of providing sophisticated applications and benefiting from the advantageous properties of agents, designing agent-based and multi-agent systems has become an important issue that has received further consideration from many application domains. Towards the same goal, this work gathers three different research fields: image segmentation, fuzzy clustering, and multi-agent systems (MAS). It furnishes a MAS for MR brain image segmentation that is based on a fast and robust FCM (FRFCM) algorithm. The proposed MAS was tested, along with the sequential version of the FRFCM algorithm and the standard FCM, on simulated and real normal brains. The experimental results were valuable in terms of both segmentation accuracy and running time.
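The standard FCM baseline referred to in the abstract alternates two updates: memberships from distances to the centres, and centres as membership-weighted means. A minimal one-dimensional sketch, not the authors' FRFCM variant, with hypothetical names and toy data:

```python
import random

def fcm_1d(data, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means on 1-D data: alternate membership and centre updates."""
    rng = random.Random(seed)
    centres = rng.sample(data, c)
    for _ in range(iters):
        # Membership: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) or 1e-12 for v in centres]  # guard against zero distance
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
                      for i in range(c)])
        # Centres: mean of the data weighted by u^m
        centres = [sum((u[k][i] ** m) * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centres)

# Two well-separated 1-D clusters around 0 and 10.
data = [0.0, 0.5, 1.0, 9.0, 9.5, 10.0]
print(fcm_1d(data))  # centres settle near the two cluster means
```

On images, each pixel intensity plays the role of a data point; the FRFCM speed-ups and the MAS distribution of work are beyond this sketch.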

Author 1: Hanane Barrah
Author 2: Abdeljabbar Cherkaoui
Author 3: Driss Sarsri

Keywords: agents; MAS; FCM; c-means algorithm; MRI images; image segmentation

PDF

Paper 27: Development of the System to Support Tourists’ Excursion Behavior using Augmented Reality

Abstract: The purpose of this study is to develop an information system (AR recommended GIS) to support tourists' excursion behavior by making it possible to accumulate, share, and recommend information concerning urban tourist spots. The conclusions of this study can be summarized in the three points below. (1) The AR recommended GIS was designed and developed to support tourists' excursion behavior by integrating SNS, Twitter, Web-GIS, a recommendation system, and Smart Eyeglasses, making it possible to accumulate, share, and recommend information regarding urban tourist spots. (2) Of the 91 users, 91% were between the ages of 20 and 40, and the final number of submitted items of information was 161. In addition, in the operation using Smart Eyeglasses conducted with tourists in the Minato Mirai area, the total number of users was 34, the users' ages were spread out, and no users had prior experience with Smart Eyeglasses. (3) The results of the Web questionnaire survey show that the system suits users as a method of collecting tourist spot information, and that it is mainly used to collect tourist spot information through the viewing and recommendation functions. The results of the access analysis using the log data from the operation show that the ways the system was used with PCs and with mobile information terminals were very similar. Additionally, as the system using AR Smart Eyeglasses was rated extremely highly, it was evident that the system can support tourists' excursion behavior using PCs, mobile information terminals, and AR Smart Eyeglasses.

Author 1: Jiawen ZHOU
Author 2: Kayoko YAMAMOTO

Keywords: Augmented Reality; Web-GIS; Social Media; Recommendation System; AR recommended GIS; Tourists’ Excursion Behavior

PDF

Paper 28: An Efficient Lossless Compression Scheme for ECG Signal

Abstract: Cardiac diseases constitute the main cause of mortality around the globe. For the detection and identification of cardiac problems, it is very important to monitor the patient's heart activity for long periods during normal daily life. The recorded signal that contains information about the condition of the heart is called the electrocardiogram (ECG). As a result, long recordings of the ECG signal amount to huge data sizes. In this work, a robust lossless ECG data compression scheme for real-time applications is proposed. The developed algorithm offers the compression advantages of lossy schemes without introducing any distortion into the reconstructed signal. The ECG signals under test were taken from the PTB Diagnostic ECG Database. The compression procedure is simple and provides a high compression ratio compared to other lossless ECG compression methods. The compressed ECG data is generated as a text file. The decompression scheme has also been developed using the reverse logic, and it is observed that there is no difference between the original and reconstructed ECG signals.
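The abstract does not spell out the encoding itself, but the defining property of any lossless scheme is a bit-exact round trip. A generic baseline often used for ECG, delta encoding, illustrates this; it is a sketch for illustration only, not the authors' algorithm, and the sample values are hypothetical:

```python
def delta_encode(samples):
    """Store the first sample and successive differences (losslessly invertible).
    Slowly varying signals like ECG yield small deltas that compress well."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(deltas):
    """Rebuild the original samples by cumulative summation."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

ecg = [512, 515, 521, 530, 526, 519, 512]   # toy integer ECG samples
enc = delta_encode(ecg)
assert delta_decode(enc) == ecg             # reconstruction is bit-exact
print(enc)
```

An entropy coder applied to the small deltas would then realize the actual size reduction.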

Author 1: O. El B’charri
Author 2: R. Latif
Author 3: A. Abenaou
Author 4: A. Dliou
Author 5: W. Jenkal

Keywords: ECG; lossless compression; data encoding; compression ratio

PDF

Paper 29: Albanian Sign Language (AlbSL) Number Recognition from Both Hand’s Gestures Acquired by Kinect Sensors

Abstract: Albanian Sign Language (AlbSL) is relatively new, and until now no system has existed that is able to recognize Albanian signs using natural user interfaces (NUI). The aim of this paper is to present a real-time gesture recognition system that is able to automatically recognize number signs of Albanian Sign Language captured from both of the signer's hands. A Kinect device is used to obtain the data streams. Every pixel generated by the Kinect device contains depth information, which is used to construct a depth map. The hand segmentation process is performed by applying a threshold constant to the depth map. In order to differentiate the signer's hands, a K-means clustering algorithm is applied to partition the pixels into two groups corresponding to the signer's two hands. The centroid distance function is calculated for each hand after extracting the hand's contour pixels. Fourier descriptors derived from the centroid distance are used as the hand shape representation. For each number gesture, 15 Fourier descriptor coefficients are generated, which uniquely represent that gesture. Every input is compared against the training data set by calculating the Euclidean distance over the Fourier coefficients. The sign with the lowest Euclidean distance is considered a match. The system is able to recognize number signs captured from one hand or both hands. When both of the signer's hands are used, some of the processing steps are executed in parallel in order to improve the overall performance. The proposed system achieves an accuracy of 91% and is able to process 55 frames per second.
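The centroid-distance Fourier descriptor pipeline described above can be sketched as follows; this is a minimal pure-Python illustration (naive DFT, toy square contour, small coefficient count), not the authors' Kinect pipeline:

```python
import cmath, math

def fourier_descriptors(contour, n=15):
    """Centroid-distance signature of a closed contour, then DFT magnitudes
    normalised by the DC term (translation- and scale-invariant)."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    r = [math.hypot(x - cx, y - cy) for x, y in contour]
    N = len(r)
    mags = []
    for k in range(n + 1):
        coef = sum(r[t] * cmath.exp(-2j * math.pi * k * t / N) for t in range(N))
        mags.append(abs(coef))
    dc = mags[0] or 1.0
    return [m / dc for m in mags[1:]]   # drop the DC term, keep n coefficients

square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
shifted = [(x + 5, y - 3) for x, y in square]
fd1, fd2 = fourier_descriptors(square, 4), fourier_descriptors(shifted, 4)
assert all(abs(a - b) < 1e-9 for a, b in zip(fd1, fd2))  # translation invariant
```

Matching then reduces to the Euclidean distance between descriptor vectors, as the abstract states.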

Author 1: Eriglen Gani
Author 2: Alda Kika

Keywords: Albanian Sign Language (AlbSL); Number Recognition; Microsoft Kinect; K-Means; Fourier Descriptors

PDF

Paper 30: A Collaborative Process of Decision Making in the Business Context based on Online Questionnaires

Abstract: This article is part of a series of articles and scientific research conducted by the research team dealing with Web 2.0 and its interactions with different technology areas. During recent years, the emergence of Web 2.0 has revolutionized the world of new technologies, in particular the business intelligence field, providing businesses with new and innovative ways to make use of information in order to improve their overall performance. This article consolidates the profit that can be drawn from the new technologies of Web 2.0, especially blogs, which constitute a valuable means of gathering the information exchanged and the results of collaboration between users. It offers a new collaborative tool for decision making based on online questionnaires in order to exploit collective intelligence, which represents a very important source of significant data, and it adopts the SCAMMPERR method, a creative technique for stimulating ideas and solving problems. This paper presents a practical innovation at the computing level and makes an impact on the economic and organizational sides of the enterprise by proposing a new methodology based on the SCAMMPERR technique and supported by the strengths of Web 2.0 to ensure collaborative decision making. As a result, it provides relevant decisions that support traditional decision support systems.

Author 1: Rhizlane Seltani
Author 2: Noura Aknin
Author 3: Souad Amjad
Author 4: Mohamed Chrayah
Author 5: Kamal Eddine El Kadiri

Keywords: Decision Making; Web 2.0; Blogs; Business Intelligence; SCAMMPERR Method; Online Questionnaire

PDF

Paper 31: Intelligent Sensor Based Bayesian Neural Network for Combined Parameters and States Estimation of a Brushed DC Motor

Abstract: The objective of this paper is to develop an Artificial Neural Network (ANN) model to estimate simultaneously the parameters and state of a brushed DC machine. The proposed ANN estimator is novel in the sense that it simultaneously estimates temperature, speed, and rotor resistance based only on measurements of the voltage and current inputs. Many types of ANN estimators have been designed by researchers during the last two decades, each for a specific application. The thermal behavior of the motor is very slow, which leads to large data sets. Standard ANNs often use a Multi-Layer Perceptron (MLP) with Levenberg-Marquardt Backpropagation (LMBP), but LMBP scales poorly to large amounts of data, so an MLP based on LMBP is no longer valid in our case. As a solution, we propose the use of a Cascade-Forward Neural Network (CFNN) based on Bayesian Regularization backpropagation (BRBP). To test the robustness of our estimator, random white Gaussian noise was added to the data sets. The proposed estimator is, in our view, accurate and robust.

Author 1: Hacene MELLAH
Author 2: Kamel Eddine HEMSAS
Author 3: Rachid TALEB

Keywords: DC motor; thermal modeling; state and parameter estimations; Bayesian regulation; backpropagation; cascade-forward neural network

PDF

Paper 32: Social Computing: The Impact on Cultural Behavior

Abstract: Social computing continues to become more and more popular and has impacted cultural behavior. While cultural behavior affects the way an individual does social computing, Hofstede's theory is still prevalent. The results of this literature review suggest that, at least for several cultural dimensions, some adjustments may be required to reflect the current time and the recognition of the role of technology nowadays. Thus, today, social computing has evolved into the continuous communication and interaction of many culturally diverse users.

Author 1: Naif Ali Almudawi

Keywords: social computing; Web 2.0; cultural behavior; culture; Power distance; Individualism vs. collectivism; masculinity vs. femininity; uncertainty; avoidance and time horizon

PDF

Paper 33: Improvisation of Security aspect of Steganographic System by applying RSA Algorithm

Abstract: Applications accessing multimedia systems and content over the internet have grown tremendously in the past few years. Moreover, end users or intruders can easily use tools to synthesize and modify valuable information. The safety of information over unsafe communication channels has constantly been a primary concern of researchers. It has become one of the most important problems in information technology, and it is essential to safeguard valuable information during transmission. It is also important to determine where and how such a multimedia file is kept confidential. Thus, a need exists for emerging technology that helps to defend the integrity of information and protect the intellectual property rights of owners. Various approaches have emerged to safeguard data from unauthorized persons. Steganography and Cryptography are two different techniques for securing data over a communication network. The primary purpose of Cryptography is to render the message unintelligible, although the ciphertext might arouse suspicion in the mind of an opponent. Steganography, on the other hand, embeds a secret message into a cover medium and hides its existence. As a normal practice, data embedding is employed in communication, image, text, or multimedia contents for the purposes of copyright, authentication, digital signatures, etc. Both techniques provide a sufficient degree of security but are vulnerable to intruder attacks when used over an unsecure communication channel. Attempts to combine the two techniques, i.e. Cryptography and Steganography, have resulted in improved security. Existing steganographic algorithms primarily focus on the embedding approach, with less attention to the pre-processing of data, which offers flexibility, robustness, and a high security level.
Our proposed model is based on a public key cryptosystem (the RSA algorithm), in which RSA is used for message encryption and the resulting encrypted data is hidden in a cover image using the Least Significant Bit (LSB) embedding method.
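The LSB embedding step named above can be sketched as follows; this is a generic illustration over raw bytes (standing in for 8-bit grayscale pixels), not the authors' implementation, and the payload here is a plain string rather than RSA ciphertext:

```python
def lsb_embed(cover, payload):
    """Hide each bit of `payload` (bytes) in the least significant bit of
    successive cover bytes, changing each pixel value by at most 1."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small for payload"
    stego = bytearray(cover)
    for k, bit in enumerate(bits):
        stego[k] = (stego[k] & 0xFE) | bit
    return bytes(stego)

def lsb_extract(stego, n_bytes):
    """Recover n_bytes of payload by reading the LSBs back in order."""
    bits = [b & 1 for b in stego[:n_bytes * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k*8:(k+1)*8]))
                 for k in range(n_bytes))

cover = bytes(range(64))                  # stand-in for pixel data
secret = b"hi"                            # in the paper this would be RSA ciphertext
stego = lsb_embed(cover, secret)
assert lsb_extract(stego, len(secret)) == secret
```

In the paper's model the payload is first RSA-encrypted, so even a recovered bit stream is unreadable without the private key.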

Author 1: Manoj Kumar Ramaiya
Author 2: Dr. Dinesh Goyal
Author 3: Dr. Naveen Hemrajani

Keywords: Image Steganography; Cryptography; LSB insertion; Public key Cryptosystem; RSA algorithm

PDF

Paper 34: New Modified RLE Algorithms to Compress Grayscale Images with Lossy and Lossless Compression

Abstract: New modified RLE algorithms are proposed to compress grayscale images with lossy and lossless compression, depending on the probability of repetition of pixels in the image and on the pixel values, reducing the size of the encoded data by sending bit 1 instead of the original value of a pixel when the pixel's value is repeated. The proposed algorithms achieved a good reduction in encoded size compared with the other compression methods used for comparison, and decreased encoding time by a good ratio.
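For reference, the classic run-length encoding that the paper modifies can be sketched as follows; this is the textbook baseline, not the authors' modified variant, and the pixel row is a toy example:

```python
def rle_encode(pixels):
    """Classic run-length encoding: collapse runs into (value, run-length) pairs."""
    out = []
    for p in pixels:
        if out and out[-1][0] == p:
            out[-1][1] += 1      # extend the current run
        else:
            out.append([p, 1])   # start a new run
    return out

def rle_decode(pairs):
    """Expand (value, run-length) pairs back into the pixel sequence."""
    return [v for v, n in pairs for _ in range(n)]

row = [200, 200, 200, 17, 17, 90]
enc = rle_encode(row)
assert enc == [[200, 3], [17, 2], [90, 1]]
assert rle_decode(enc) == row
```

The paper's modification replaces a repeated pixel value with a single flag bit rather than storing an explicit run count, which pays off when short repeats dominate.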

Author 1: Hassan K. Albahadily
Author 2: Alaa A. Jabbar Altaay
Author 3: Viktar U. Tsviatko
Author 4: Valery K. Kanapelka

Keywords: compression; Run Length Encoding; quantization

PDF

Paper 35: An Investigation and Comparison of Invasive Weed, Flower Pollination and Krill Evolutionary Algorithms

Abstract: Taking inspiration from natural phenomena and the biological processes found in nature is one established approach to problem solving in computer science. Evolutionary methods are a set of algorithms that are inspired by nature and based on its evolutionary mechanisms. Unlike other optimization methods, evolutionary algorithms do not require any prerequisites and usually offer solutions very close to the optimum. Based on their behavior, evolutionary algorithms are divided into two categories of biological processes: those based on plant behavior and those based on animal behavior. Various evolutionary algorithms have been proposed so far to solve optimization problems, including the invasive weed algorithm and the flower pollination algorithm, which are inspired by plants, and the krill algorithm, which is inspired by the behavior of sea animals. In this paper, a comparison is made for the first time between the accuracy of these new evolutionary algorithms and their tendency to become trapped in local optima, to identify the best algorithm in terms of efficiency. The results of various tests show that the invasive weed algorithm is more efficient and accurate than the flower pollination and krill algorithms.

Author 1: Marjan Abdeyazdan
Author 2: Samaneh Mehri Dehno
Author 3: Sayyed Hedayat Tarighinejad

Keywords: evolutionary algorithm; invasive weed algorithm; flower pollination algorithm; krill algorithm

PDF

Paper 36: Mobile Forensic Images and Videos Signature Pattern Matching using M-Aho-Corasick

Abstract: Mobile forensics is an exciting new field of research. An increasing number of open source and commercial digital forensics tools are focusing on reducing the time taken by a digital forensic examination. A major issue affecting some mobile forensic tools causes them to spend much time during the forensic examination; it stems from the poor file searching algorithms implemented by some forensic tool developers. This research focuses on reducing the time taken to search for a file by proposing a novel multi-pattern signature matching algorithm called M-Aho-Corasick, adapted from the original Aho-Corasick algorithm. Experiments are conducted on five different datasets, one of which is obtained from the Digital Forensic Research Workshop (DFRWS 2010). Comparisons are made between M-Aho-Corasick using M_Triage and Dec0de, Lifter, XRY, and Xaver. The results show that M-Aho-Corasick using M_Triage reduces the searching time by 75% compared to Dec0de, 36% compared to Lifter, 28% compared to XRY, and 71% compared to Xaver. Thus, the M-Aho-Corasick based M_Triage tool is more efficient than Dec0de, Lifter, XRY, and Xaver at avoiding the extraction of a high number of false positive results.
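The original Aho-Corasick algorithm that M-Aho-Corasick adapts matches many signatures in a single pass over the data: a trie of patterns plus failure links computed by BFS. A minimal sketch (the signature strings below are hypothetical stand-ins for real file signatures, and the paper's modifications are not reproduced):

```python
from collections import deque

def build_automaton(patterns):
    """Aho-Corasick: a trie of the patterns plus BFS-computed failure links."""
    trie = [{"next": {}, "fail": 0, "out": []}]
    for pat in patterns:
        node = 0
        for ch in pat:
            if ch not in trie[node]["next"]:
                trie[node]["next"][ch] = len(trie)
                trie.append({"next": {}, "fail": 0, "out": []})
            node = trie[node]["next"][ch]
        trie[node]["out"].append(pat)
    queue = deque(trie[0]["next"].values())
    while queue:
        node = queue.popleft()
        for ch, child in trie[node]["next"].items():
            f = trie[node]["fail"]
            while f and ch not in trie[f]["next"]:
                f = trie[f]["fail"]
            trie[child]["fail"] = trie[f]["next"].get(ch, 0)
            if trie[child]["fail"] == child:   # depth-1 nodes fall back to root
                trie[child]["fail"] = 0
            trie[child]["out"] += trie[trie[child]["fail"]]["out"]
            queue.append(child)
    return trie

def search(trie, text):
    """Return (end_index, pattern) for every match, in one pass over text."""
    hits, node = [], 0
    for i, ch in enumerate(text):
        while node and ch not in trie[node]["next"]:
            node = trie[node]["fail"]
        node = trie[node]["next"].get(ch, 0)
        hits += [(i, pat) for pat in trie[node]["out"]]
    return hits

trie = build_automaton(["FFD8", "D8FF"])       # hypothetical signature fragments
print(search(trie, "xxFFD8FFyy"))
```

Because the scan never backs up in the text, the running time is linear in the data size plus the number of matches, which is what makes it attractive for triage over large images.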

Author 1: Yusoof Mohammed Hasheem
Author 2: Kamaruddin Malik Mohamad
Author 3: Ahmed Nur Elmi Abdi
Author 4: Rashid Naseem

Keywords: mobile forensics; Images; Videos; M-Aho-Corasick; File Signature Pattern Matching

PDF

Paper 37: Visual Knowledge Generation from Data Mining Patterns for Decision-Making

Abstract: Visual data mining based decision support systems have already been recognized in the literature. They allow users to analyse large information spaces to support complex decision-making. Prior research provides frameworks focused on simply representing extracted patterns. In this paper, we present a new model for visually generating knowledge from these patterns and communicating it for intelligent decision-making. To prove the practicality of the proposed model, it was applied in the medical field to fight against nosocomial infections in intensive care units.

Author 1: Jihed Elouni
Author 2: Hela Ltifi
Author 3: Mounir Ben Ayed
Author 4: Mohamed Masmoudi

Keywords: Knowledge; patterns; visualization; data mining; Decision Support Systems

PDF

Paper 38: A Novel Design of Miniaturized Patch Antenna Using Different Substrates for S-Band and C-Band Applications

Abstract: In advanced communication technology, patch antennas are widely exploited due to their inexpensive and lightweight structure. This paper presents a novel design of a miniaturized multiband patch antenna using different substrates frequently used in patch antennas. Various substrates such as Teflon, Rogers 5880, Bakelite, and air are used to achieve better gain and directivity. The proposed miniaturized multiband patch antenna contains two substrates, where one substrate is FR4 (fixed and lossy) and the other substrates are changed to observe the gain, directivity, and return loss. Coaxial probe feeding is presented in this paper. This feeding mode is a contacting arrangement for the patch, in which the outer conductor is linked to the ground plane and the inner conductor of the coaxial connector extends through the dielectric and is bonded to the radiating patch. The proposed antenna can be used for various S-band and C-band applications.

Author 1: Saad Hassan Kiani
Author 2: Khalid Mahmood
Author 3: Sharyar Shafeeq
Author 4: Mehre Munir
Author 5: Khalil Muhammad Khan

Keywords: substrates; microstrip; return loss; directivity; miniaturized; Impedance bandwidth

PDF

Paper 39: Impact of Elliptical Holes Filled with Ethanol on Confinement Loss and Dispersion in Photonic Crystal Fibers

Abstract: To obtain the lowest possible confinement loss value, we are interested in optimizing an optical fiber whose cladding is formed by holes in silica. The geometry of the holes is special: they have an elliptical shape and are oriented at a certain angle. Introducing ethanol into the holes and omitting some rings allows us to obtain confinement loss values very close to zero. In this paper, we have designed an ultra-flat dispersion PCF. We note that the zero dispersion can lie in the range from 1000 nm to 1650 nm, with a value of 0 ± 0.14 ps/nm/km.

Author 1: Khemiri Kheareddine
Author 2: Ezzedine Tahar
Author 3: Houria Rezig

Keywords: confinement loss; dispersion; doped Photonic Crystal Fiber; ethanol-filled holes; elliptical holes; FDTD

PDF

Paper 40: Improving the Recognition of Heart Murmur

Abstract: The diagnosis of congenital cardiac defects is challenging; some are diagnosed during pregnancy, while others are diagnosed after birth or later during childhood. Prompt diagnosis allows early intervention and the best prognosis. Contemporary diagnosis relies upon the history, clinical examination, pulse oximetry, chest X-ray, electrocardiogram (ECG), echocardiography (ECHO), computed tomography (CT), and cardiac catheterization. These diagnostic modalities rely upon recording electrical activity or sound waves, or upon radiation. Yet congenital heart diseases are still liable to misdiagnosis because of the level of operator expertise and multiple other factors. In an attempt to minimize the effect of operator expertise, this paper builds a classification model for heart murmur recognition using a Hidden Markov Model (HMM). The paper uses 13 Mel Frequency Cepstral Coefficients (MFCC) as features. The machine learning model was built by studying 1069 different heart sounds covering normal heart sounds, ventricular septal defect (VSD), mitral regurgitation (MR), aortic stenosis (AS), aortic regurgitation (AR), patent ductus arteriosus (PDA), pulmonary regurgitation (PR), and pulmonary stenosis (PS). The MFCC features were used to extract a feature matrix for each type of heart sound after separation according to an amplitude threshold. The frequency of the normal heart sound (range = 1 Hz to 139 Hz) was specific, without overlap with any of the studied defects (range = 156-556 Hz). The frequency ranges for each of these defects were typical, without overlap, according to the examined heart area (aortic, pulmonary, tricuspid, and mitral areas). The overall correct classification rate (CCR) using this model was 96%, with a sensitivity of 98%. This model has great potential for prompt screening and specific defect detection. The effect of cardiac contractility, cardiomegaly, or cardiac electrical activity on this novel detection system needs to be verified in future work.
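The amplitude-threshold separation step mentioned above can be sketched as follows; this is a simplified illustration on a toy signal (real heart-sound segmentation would also smooth the envelope and enforce minimum event durations):

```python
def segment_by_amplitude(signal, threshold):
    """Split a 1-D signal into the maximal runs whose |sample| is at or above
    the threshold: a simple stand-in for isolating heart-sound events before
    MFCC feature extraction."""
    segments, start = [], None
    for i, s in enumerate(signal):
        if abs(s) >= threshold and start is None:
            start = i                      # an event begins
        elif abs(s) < threshold and start is not None:
            segments.append((start, i))    # the event ends
            start = None
    if start is not None:
        segments.append((start, len(signal)))
    return segments

sig = [0.01, 0.4, 0.6, 0.05, 0.02, 0.5, 0.7, 0.3, 0.01]
print(segment_by_amplitude(sig, 0.2))  # two events: samples 1-2 and 5-7
```

Each returned segment would then be windowed and passed to the MFCC front end, and the coefficient sequences to the HMM classifier.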

Author 1: Magd Ahmed Kotb
Author 2: Hesham Nabih Elmahdy
Author 3: Fatma El Zahraa Mostafa
Author 4: Mona El Falaki
Author 5: Christine William Shaker
Author 6: Mohamed Ahmed Refaey
Author 7: Khaled W Y Rjoob

Keywords: Hidden Markov Model (HMM); Heart Murmur; Mel Frequency Cepstral Coefficient (MFCC); Systolic Murmur; Diastolic Murmur; Auscultation Area; ventricular septal defect (VSD); mitral stenosis (MS); mitral regurgitation (MR); aortic stenosis (AS); aortic regurgitation (AR); patent ductus arteriosus (PDA); pulmonary regurgitation (PR); pulmonary stenosis (PS); Electrocardiogram (ECG); Echocardiography (ECHO); Computed Tomography (CT); Correct Classification Rate (CCR); Artificial Neural Network (ANN); Back Propagation Neural Network (BPNN); Empirical Mode Decomposition (EMD); Support Vector Machines (SVM); Adaptive Neuro-Fuzzy Inference System (ANFIS); MATRIX LABORATORY (MATLAB); Radial Basis Function (RBF)

PDF

Paper 41: Investigative Behavioral Intention to Knowledge Acceptance and Motivation in Cloud Computing Applications

Abstract: Recently, the number of Cloud Computing users in educational institutions has increased. Students have the chance to access various applications, which gives them the opportunity to take advantage of those applications. This study examined the behavioral intention toward Cloud Computing applications and evaluated the acceptance of these applications. The participant population consisted of 110 students from different Jordanian universities. The results showed that Performance Expectancy, Effort Expectancy, Attitude toward Using Technology, Social Influence, Self-Efficacy, Attention, and Relevance have different levels of correlation with Behavioral Intention in Cloud Computing applications, and that there was no correlation between Anxiety and Behavioral Intention in Cloud Computing applications.

Author 1: Sundus A. Hamoodi

Keywords: Cloud computing; ARCS model; UTAUT model

PDF

Paper 42: The Impact of Black-Hole Attack on ZRP Protocol

Abstract: The lack of infrastructure in ad hoc networks makes their deployment easier. Each node in an ad hoc network can route data using a routing protocol, which decreases the level of security. Ad hoc networks are exposed to several attacks, such as the blackhole attack. In this article, a study is made of the impact of this attack on the hybrid routing protocol ZRP (Zone Routing Protocol). In this attack, a malicious node is placed between two or more nodes in order to drop data. The trick of the attack is simple: the malicious node declares that it has the most reliable route to the destination so that the source node chooses this path. In this study, NS2 is used to assess the impact of the attack on ZRP. Two metrics are measured, namely the packet delivery ratio and the end-to-end delay.
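The two metrics named in the abstract are computed from the simulator's packet logs; a minimal sketch of both calculations, with hypothetical log data rather than real NS2 trace output:

```python
def pdr_and_delay(sent, received):
    """Packet delivery ratio and mean end-to-end delay from packet logs.
    `sent` maps packet id -> send time; `received` maps id -> receive time."""
    delivered = [pid for pid in sent if pid in received]
    pdr = len(delivered) / len(sent)
    delay = (sum(received[p] - sent[p] for p in delivered) / len(delivered)
             if delivered else float("inf"))
    return pdr, delay

sent = {1: 0.0, 2: 0.1, 3: 0.2, 4: 0.3}
received = {1: 0.05, 3: 0.26}            # packets 2 and 4 dropped by the blackhole
print(pdr_and_delay(sent, received))
```

Under a blackhole attack the PDR collapses because the malicious node silently discards the data it attracts, while the delay statistic only reflects the packets that survived.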

Author 1: CHAHIDI Badr
Author 2: EZZATI Abdellah

Keywords: ZRP; Blackhole; security; Routing

PDF

Paper 43: Design of Modulator and Demodulator for a 863-870 MHz BFSK Transceiver

Abstract: This paper presents the design of low-power modulator and demodulator circuits dedicated to a BFSK transceiver operating in the 863-870 MHz ISM band. The two circuits were designed using ams 0.35 µm technology with a 3 V DC voltage supply. Simulation results of the new Direct Digital Frequency Synthesizer in the modulator have shown good performance of the designed system, as the Spurious Free Dynamic Range (SFDR) reached -88 dBc while the circuit consumes only 47.7 µW at 43.3 MHz. The demodulator also presented a good BER of 10^-3 at an Eb/N0 of 10.9 and a sensitivity of about -115 dBm.

Author 1: A. Neifar
Author 2: G. Bouzid
Author 3: M. Masmoudi

Keywords: ISM band; FHSS; FSK modulator; BFSK demodulator; wireless sensor network

PDF

Paper 44: Reducing the Calculations of Quality-Aware Web Services Composition Based on Parallel Skyline Service

Abstract: The optimal composition of atomic services to provide users with services, applying qualitative parameters, is very important. As expected, web services with similar features lead to competition among service providers. The key challenge is to find an appropriate web service for composition when multiple aspects of quality (such as response time, cost, etc.) are considered in the optimal composition of services. The Skyline service provides the best services, taking the qualitative parameters into account, by using dominance analysis. In this study, a Skyline algorithm is used to find the set of best possible service compositions while taking the qualitative parameters into account. The parallelism technique in this study had a significant impact on reducing response time and increasing the speed of service composition, as well as on reducing the composition calculations. The results of the analysis and evaluation of the proposed method show an optimal runtime and the best composition.
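The dominance test at the heart of any skyline computation can be sketched as follows; this naive pairwise version is for illustration only (the QoS dimensions and candidate values are hypothetical, and the paper's parallel variant is not reproduced):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every QoS dimension (lower is
    better here) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(services):
    """Keep exactly the services that no other service dominates."""
    return [s for i, s in enumerate(services)
            if not any(dominates(t, s)
                       for j, t in enumerate(services) if j != i)]

# (response time, cost) per candidate service
candidates = [(120, 5), (100, 7), (150, 4), (130, 6), (100, 4)]
print(skyline(candidates))  # only the non-dominated candidates survive
```

Composition then only needs to combine skyline members, since any composition using a dominated service can be improved by swapping in its dominator; the parallelism in the paper splits this pruning across workers.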

Author 1: Maryam Moradi
Author 2: Sima Emadi

Keywords: service composition; parallel Skyline service; the dominant relationship; service quality

PDF

Paper 45: A New Strategy to Optimize the Load Migration Process in Cloud Environment

Abstract: Cloud computing is a model of internet-based service that provides users easy, on-demand access to a set of changeable computational resources through the internet. Load balancing in the cloud has to manage service provider resources appropriately. Load balancing in cloud computing is the process of distributing load between distributed computational nodes for the optimal use of resources, and it has to decrease latency in order to prevent situations in which some nodes are overloaded while others are under-loaded or idle. Load migration is a potential solution for most critical conditions such as load imbalance. However, many load migration methods are based on only one objective. In practice, considering just one objective for migration can run contrary to other objectives and may lose the optimal solution for the existing situation. Therefore, a strategy that makes the load migration process purposeful is essential in a cloud environment. The main idea of this research is to reduce cost and increase efficiency in order to be compatible with the cloud's different conditions. The recommended method tries to improve the load migration process using several different criteria simultaneously and applies some changes to previous methods. The simulated annealing algorithm is employed to implement the recommended strategy in the present research. The obtained results show the desired performance and efficiency in general. The algorithm is highly flexible, and several important criteria can be evaluated with it simultaneously.
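Simulated annealing applied to load migration can be sketched as follows; this toy version uses a single balance criterion (spread between the heaviest and lightest node) as a stand-in for the paper's multi-criteria cost, and all names and parameters are hypothetical:

```python
import math, random

def anneal(loads, n_nodes, t0=10.0, cooling=0.95, steps=2000, seed=1):
    """Simulated annealing for load placement: each move migrates one load to
    a random node; worse states are accepted with probability exp(-delta/T)."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in loads]

    def cost(a):
        totals = [0.0] * n_nodes
        for load, node in zip(loads, a):
            totals[node] += load
        return max(totals) - min(totals)   # imbalance between nodes

    best, best_c = list(assign), cost(assign)
    cur_c, t = best_c, t0
    for _ in range(steps):
        cand = list(assign)
        cand[rng.randrange(len(loads))] = rng.randrange(n_nodes)  # migrate one VM
        c = cost(cand)
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            assign, cur_c = cand, c
            if c < best_c:
                best, best_c = list(cand), c
        t *= cooling
    return best, best_c

loads = [8, 7, 6, 5, 4, 3, 2, 1]
assignment, spread = anneal(loads, n_nodes=3)
print(spread)   # close to 0: total 36 over 3 nodes is 12 each
```

A multi-criteria version would simply replace `cost` with a weighted sum of migration cost, latency, and imbalance, which is the flexibility the abstract highlights.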

Author 1: Hamid Mirvaziri
Author 2: Zhila Tajrobekar

Keywords: cloud computing; load balancing; migration; virtual machines; simulated annealing

PDF
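The generic accept/cool loop of the simulated annealing strategy mentioned above can be sketched as follows. This is a rough single-objective illustration (load imbalance between two hosts), a simplification of the paper's multi-criteria setting; the VM loads and all names are hypothetical:

```python
import math
import random

def simulated_annealing(cost, neighbor, state, t0=1.0, t_min=1e-3, alpha=0.9):
    """Generic annealing loop: worse states are accepted with probability
    exp(-delta / T), letting the search escape local minima while T cools."""
    best = current = state
    t = t0
    while t > t_min:
        candidate = neighbor(current)
        delta = cost(candidate) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= alpha  # geometric cooling schedule
    return best

random.seed(0)
loads = [8, 2, 5, 1]  # hypothetical VM loads to split across two hosts

def cost(assign):
    """Single-objective stand-in: load imbalance between the two hosts."""
    h0 = sum(l for l, a in zip(loads, assign) if a == 0)
    return abs(h0 - (sum(loads) - h0))

def neighbor(assign):
    """Migrate one randomly chosen VM to the other host."""
    flipped = list(assign)
    i = random.randrange(len(flipped))
    flipped[i] ^= 1
    return tuple(flipped)

best = simulated_annealing(cost, neighbor, (0, 0, 0, 0))
print(best, cost(best))
```

A multi-criteria version would replace `cost` with a weighted combination of several objectives (e.g. imbalance plus migration cost), which is where a flexible metaheuristic like annealing pays off.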

Paper 46: Investigating the Use of Machine Learning Algorithms in Detecting Gender of the Arabic Tweet Author

Abstract: Twitter is one of the most popular social network sites on the Internet for sharing opinions and knowledge extensively. Many advertisers use Tweets to collect features and attributes of Tweeters in order to target specific groups of highly engaged people. Gender detection is a sub-field of sentiment analysis concerned with extracting and predicting the gender of a Tweet's author. In this paper, we investigate the gender of Tweet authors using different classification mining techniques on the Arabic language, such as Naïve Bayes (NB), Support Vector Machine (SVM), Naïve Bayes Multinomial (NBM), the J48 decision tree, and KNN. The results show that the NBM, SVM, and J48 classifiers can achieve accuracy above 98% when the name of the Tweet author is added as a feature. The results also show that the preprocessing approach has a negative effect on the accuracy of gender detection. In a nutshell, this study demonstrates the ability of machine learning classifiers to detect the gender of an Arabic Tweet's author.

Author 1: Emad AlSukhni
Author 2: Qasem Alequr

Keywords: Social Networking; Data Mining; Sentiment Analysis; Sentiment Classification; Gender Detection; Twitter

PDF
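As an illustration of one of the classifiers named above, a toy multinomial Naïve Bayes with the author's name appended as a token can be sketched; the tiny "tweet" corpus and labels below are invented for illustration only, not data from the paper:

```python
import math
from collections import Counter

def train_nb(docs):
    """Tiny multinomial Naive Bayes; docs is a list of (tokens, label)."""
    priors = Counter(label for _, label in docs)
    words = {label: Counter() for label in priors}
    for tokens, label in docs:
        words[label].update(tokens)
    vocab = {w for counts in words.values() for w in counts}
    return priors, words, vocab

def predict(model, tokens):
    priors, words, vocab = model
    total = sum(priors.values())
    def log_score(label):
        n = sum(words[label].values())
        return math.log(priors[label] / total) + sum(
            math.log((words[label][w] + 1) / (n + len(vocab)))  # Laplace smoothing
            for w in tokens)
    return max(priors, key=log_score)

# invented toy corpus: tweet tokens with the author's name appended as a feature
docs = [(["hello", "sara"], "F"), (["match", "ahmad"], "M"),
        (["coffee", "sara"], "F"), (["goal", "ahmad"], "M")]
model = train_nb(docs)
print(predict(model, ["sara", "coffee"]))
```

Appending the author name as a token is what lets the classifier exploit it as a feature, which mirrors the feature-engineering step the abstract credits for the accuracy gain.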

Paper 47: Diagnosis of Diabetes by Applying Data Mining Classification Techniques

Abstract: Health care data are often huge, complex and heterogeneous because they contain different variable types as well as missing values. Nowadays, extracting knowledge from such data is a necessity. Data mining can be utilized to extract knowledge by constructing models from health care data such as diabetic patient data sets. In this research, three data mining algorithms, namely Self-Organizing Map (SOM), C4.5 and RandomForest, are applied to adult population data from the Ministry of National Guard Health Affairs (MNGHA), Saudi Arabia to predict diabetic patients using 18 risk factors. RandomForest achieved the best performance compared to the other data mining classifiers.

Author 1: Tahani Daghistani
Author 2: Riyad Alshammari

Keywords: Diabetes; Data mining; Self-Organizing Map; Decision tree; Classification

PDF

Paper 48: Finding Non Dominant Electrodes Placed in Electroencephalography (EEG) for Eye State Classification using Rule Mining

Abstract: Electroencephalography (EEG) measures brain activity through wave analysis using a number of electrodes. Finding the most non-dominant electrode positions for eye state classification is an important task: the proposed work identifies which electrodes are least responsible for classification. This is a feature selection step required for optimal EEG channel selection. Feature selection is a mechanism for selecting a subset of input features; in this work the input features are the EEG electrodes. The Most Non Dominant (MND) set gives the irrelevant input electrodes in eye state classification and thus reduces computation cost. The MND set is created in several stages. First, extreme values are removed from the EEG corpus for data cleaning. The next step is attribute selection, a preprocessing step because it is completed before classification. The MND set lists the electrodes that are least responsible for classification; if the features in this set are removed from an EEG electrode corpus, the time and space required to build the classification model are about 20% lower than with all electrodes, while classification accuracy is not greatly affected. The proposed article uses different attribute evaluation algorithms with the Ranker search method.

Author 1: Mridu Sahu
Author 2: N. K. Nagwani
Author 3: Shrish Verma

Keywords: Electroencephalography (EEG); Most Non Dominant (MND); Ranker algorithm; classification

PDF

Paper 49: Evaluation of Fault Tolerance in Cloud Computing using Colored Petri Nets

Abstract: Nowadays, rendering reliable services to customers in business markets is a crucial matter for service providers, and the importance of this subject in many fields is undeniable. The design of highly complex systems and the existence of different resources in the network cloud lead service providers to aim to provide the best services to their customers. One of the important challenges for service providers is fault tolerance and reliability, and different techniques and methods have been presented to address this challenge. The method presented in this paper analyzes the fault tolerance process in an interconnected network cloud in order to avoid problems and irreparable damage before implementation. In the offered method, fault tolerance is evaluated with the aid of colored Petri nets using the Byzantine technique. The results, analyzed with CPN Tools, demonstrate reliability. It was concluded that as requests increase, fault tolerance is reduced and consequently reliability is also reduced, and vice versa. In other words, resource management is affected by the requested services.

Author 1: Mehdi Effatparvar
Author 2: Seyedeh Solmaz Madani

Keywords: Cloud Computing; Fault Tolerance; Colored Petri Nets; Reliability

PDF

Paper 50: Reputation Management System for Fostering Trust in Collaborative and Cohesive Disaster Management

Abstract: The best management of a disaster requires knowledge, skills and other resources not only for relief and rehabilitation but also for recovery and mitigation of its effects. These multifaceted goals cannot be achieved by a single organization and require collaborative efforts in an agile manner. Blind trust cannot be applied when selecting collaborators, team members or partners; therefore, a good reputation of a collaborator is mandatory. Currently, various Information and Communication Technology based artifacts for collaborative disaster management have been developed; however, they do not employ trust and reputation as a key factor. In this paper, a framework for a reputation based trust management system is proposed to support disaster management. The key features of the framework are a meta-model, a Reputation Indicator Matrix and a computational algorithm, deployed using a Service Oriented Architecture. To evaluate the efficacy of the artifact, a prototype was implemented. Furthermore, an industrial survey was carried out to get feedback on the proposed framework. The results support that the proposed reputation management system provides significant support in collaborative disaster management by assisting agile and smart decision making in all phases of the disaster management cycle.

Author 1: Sabeen Javed
Author 2: Hammad Afzal
Author 3: Fahim Arif
Author 4: Awais Majeed

Keywords: reputation; trust; reputation management; disaster management; collaborators; collaborative management

PDF

Paper 51: Indirect Substitution Method in Combinable Services by Eliminating Incompatible Services

Abstract: Service-oriented architecture is a style of information systems architecture that aims to achieve loose coupling in communication between software components and services. A service, here meaning a software implementation, is a well-defined business function that can be used and called in various processes or software. An organization can choose and compose the Web services that fulfill its intended quality of service. As the number of available Web services increases, choosing the best services to compose becomes challenging and is the most important problem of service composition. In addition, because systems are used in dynamic environments, service characteristics and users' needs constantly change, which leads to service deterioration, unavailability and quality loss. One way to deal with this challenge is the substitution of a Web service with another service, performed dynamically at runtime. Substitution can be direct or indirect. Although there are many related works in the field of direct substitution, no work has yet explained substitution based on the indirect method. The direct method suffers from problems such as the incompatibility of important services in a composition. To solve these problems and other challenges, this paper considers a subset of inputs and outputs, qualitative parameters, and simultaneous and dynamic service composition, and uses the fitness function of a genetic algorithm to compare compositions. In addition, for substitution, a table containing the best possible substitutes is provided and updated dynamically through multi-threading techniques. The results obtained by the analysis and evaluation of the proposed method indicate that compatibility between services is established and that the best possible substitute is found, reducing substitution time.

Author 1: Forough Hematian Chahardah Cheriki
Author 2: Sima Emadi

Keywords: indirect substitution; SLA; service composition; quality of service

PDF

Paper 52: Optimum Access Analysis of Collaborative Spectrum Sensing in Cognitive Radio Network using MRC

Abstract: The performance of a cognitive radio network depends mainly on accurate sensing of the presence or absence of the Primary User (PU). The throughput of a Secondary User (SU) can be reduced by false detection of the PU, which deprives the SU of a transmission opportunity. Factorizing the probability of a correct decision is a hard task when false alarms are incorporated into it. Previous works focus on collaborative sensing in a normal environment. In this paper, we propose a collaborative sensing method in a cognitive radio network for optimal access to the PU's licensed band by the SU. We present a performance analysis of energy detection across different cognitive users and a clear comparison between local and collaborative sensing. The Maximal Ratio Combining (MRC) diversity technique with energy detection is employed to reduce the false alarm probability in the collaborative environment. The simulation results show a significant reduction in the probability of misdetection as the number of collaborative users increases. We also show that the MRC scheme exhibits the best detection performance in the collaborative environment.

Author 1: Risala Tasin Khan
Author 2: Shakila Zaman
Author 3: Md. Imdadul Islam
Author 4: M. R. Amin

Keywords: Fusion center; Local energy detection; Maximum Ratio Combining; Spectrum Sensing; Receiver Operating Characteristics

PDF
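The combination of MRC with energy detection described above can be illustrated with a small sketch; the branch gains, noise level and detection threshold below are arbitrary assumptions for illustration, not values from the paper:

```python
import random

def mrc_combine(branches, gains):
    """Maximal ratio combining: weight each diversity branch by its
    (assumed known) channel gain and sum sample-wise."""
    length = len(branches[0])
    return [sum(g * b[i] for g, b in zip(gains, branches)) for i in range(length)]

def energy_detect(samples, threshold):
    """Energy detector: declare the primary user present when the average
    energy of the received samples exceeds a threshold."""
    return sum(x * x for x in samples) / len(samples) > threshold

random.seed(1)
gains = [0.9, 0.6, 0.3]        # hypothetical branch gains
signal = [1.0] * 64            # PU signal of unit amplitude
branches = [[g * s + random.gauss(0, 0.1) for s in signal] for g in gains]
print(energy_detect(mrc_combine(branches, gains), threshold=1.0))
```

Because MRC weights strong branches more heavily, the test statistic under the PU-present hypothesis is pushed further from the noise-only statistic, which is why misdetection and false alarm probabilities improve with more collaborating branches.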

Paper 53: A Comparative Study of Classification Algorithms using Data Mining: Crime and Accidents in Denver City the USA

Abstract: In the last five years, crime and accident rates have increased in many cities of America. The advancement of new technologies can also lead to criminal misuse. In order to reduce incidents, there is a need to understand and examine emerging patterns of criminal activities. This paper analyzes crime and accident datasets from Denver City, USA from 2011 to 2015, consisting of 372,392 instances of crime. The dataset is analyzed with a number of classification algorithms: BayesNet, NaiveBayes, J48, JRip, OneR and Decision Table. The aim of this study is to highlight trends in incidents that will in turn help security agencies and police departments to derive precautionary measures from prediction rates. The outputs used in this study are correct classification, incorrect classification, True Positive Rate (TP), False Positive Rate (FP), Precision (P), Recall (R) and F-measure (F). These outputs are captured using two different test methods, k-fold cross-validation and percentage split, and the results are compared to understand the classifiers' performance. Our analysis illustrates that JRip achieved the highest rate of correct classifications at 73.71%, followed by Decision Table with 73.66%, whereas OneR produced the fewest correct predictions at 64.95%. NaiveBayes took the least time, 0.57 sec, to build the model and perform classification compared to all the classifiers. This study should help security agencies and police departments to discover data patterns and analyze trending criminal activity from prediction rates.

Author 1: Amit Gupta
Author 2: Azeem Mohammad
Author 3: Ali Syed
Author 4: Malka N. Halgamuge

Keywords: Data Mining; Classification; Big Data; Crime and Accident

PDF

Paper 54: Simulation and Analysis of Optimum Golomb Ruler Based 2D Codes for OCDMA System

Abstract: The need for high-speed communications networks has led research communities and industry to develop reliable, scalable transatlantic and transpacific fiber-optic communication links. In this paper, optimum Golomb ruler based 2D OCDMA codes are demonstrated. An OCDMA system based on the discussed 2D codes is designed and simulated in OptiSystem. The encoder and decoder structures of the OCDMA system are designed using filters and time delays. Further, the performance is analysed for various parameters such as bit rate, number of users, BER (Bit Error Rate), quality factor, eye diagram and signal diagram. The system is analyzed for up to 18 users at 1 Gbps and 1.25 Gbps bit rates.

Author 1: Dr. Gurjit Kaur
Author 2: Rajesh Yadav
Author 3: Disha Srivastava
Author 4: Aarti Bhardwaj
Author 5: Manu Gangwar
Author 6: Nidhi

Keywords: OCDMA System; 2D Codes; OOC; Golomb Ruler; BER; Eye Diagram; MAI

PDF
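The defining property of a Golomb ruler, on which the 2D codes above are based, is that all pairwise differences between marks are distinct, which keeps multiple-access interference low. The property can be checked in a few lines; the order-5 ruler [0, 1, 4, 9, 11] is a known optimal example:

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """A Golomb ruler has all pairwise differences between marks distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

print(is_golomb_ruler([0, 1, 4, 9, 11]))  # known optimal order-5 ruler
print(is_golomb_ruler([0, 1, 2, 4]))      # differences 1 and 2 repeat
```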

Paper 55: Crowding Optimization Method to Improve Fractal Image Compressions Based Iterated Function

Abstract: Fractals are geometric patterns generated by Iterated Function System theory. A popular technique known as fractal image compression is based on this theory, which assumes that redundancy in an image can be exploited through block-wise self-similarity and that the original image can be approximated by a finite iteration of fractal codes. This technique offers a high compression ratio compared with other image compression techniques. However, it has several drawbacks, such as the inverse proportionality between image quality and computational cost. Numerous approaches have been proposed to find a compromise between quality and cost. As an efficient optimization approach, the genetic algorithm is used for this purpose. In this paper, a crowding method, an improved genetic algorithm, is used to optimize the search space in the target image through good approximation of the global optimum in a single run. The experimental results for the proposed method show good efficiency, decreasing the encoding time while retaining a high-quality image compared with the classical method of fractal image compression.

Author 1: Shaimaa S. Al-Bundi
Author 2: Nadia M. G. Al-Saidi
Author 3: Neseif J. Al-Jawari

Keywords: Fractal; Iterated Function System (IFS); Genetic algorithm (GA); Crowding method; Fractal Image Compression (FIC)

PDF
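The crowding idea referenced above (an offspring competes with the most similar population member and replaces it only if fitter, preserving niches within a single run) can be sketched as follows; the bit-string encoding and one-max fitness are hypothetical stand-ins for fractal-code quality:

```python
def hamming(a, b):
    """Number of differing genes between two equal-length individuals."""
    return sum(x != y for x, y in zip(a, b))

def crowding_replace(population, offspring, fitness):
    """Deterministic-crowding sketch: the offspring competes with (and may
    replace) the most similar population member, which preserves diversity
    so several niches can be explored in a single run."""
    closest = min(range(len(population)),
                  key=lambda i: hamming(population[i], offspring))
    if fitness(offspring) > fitness(population[closest]):
        population[closest] = offspring
    return population

# toy one-max fitness as a hypothetical stand-in for fractal-code quality
fitness = lambda ind: sum(ind)
pop = [(0, 0, 0, 1), (1, 1, 0, 0), (0, 1, 1, 0)]
print(crowding_replace(pop, (1, 1, 1, 0), fitness))
```

Restricting replacement to the most similar member is what distinguishes crowding from plain generational replacement: distant niches survive even when one region of the search space produces fitter offspring.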

Paper 56: Current Trends and Research Challenges in Spectrum-Sensing for Cognitive Radios

Abstract: The ever-increasing demand for wireless communication systems has led to a search for suitable spectrum bands for data transmission. Past research has revealed that the radio spectrum is under-utilized in most scenarios. This prompted scientists to seek a solution for utilizing the spectrum efficiently. Cognitive Radios provided an answer to the problem by sensing idle (licensed) bands and allowing (secondary) users to transmit in these idle spaces. Spectrum sensing forms the main block of the cognition cycle. This paper reviews current research trends in the domain of spectrum sensing. The authors describe the types of channels modelled, the diversity-combining schemes used, the optimal algorithms applied at the fusion centre, and the spectrum-sensing techniques employed. Further, the research challenges are discussed. Various attributes such as sensing time, throughput, rate reliability, optimum number of cooperative users, and sensing frequency need to be addressed, and a trade-off needs to be established to optimize opposing parameters such as sensing time and throughput.

Author 1: Roopali Garg
Author 2: Dr. Nitin Saluja

Keywords: CR (cognitive radio); FC-PSO (fast-convergence particle swarm optimization); FC (fusion centre); KLMS (kernel least mean square); PU (primary user); ROC (receiver operating characteristic) curves; SU (secondary user); soft combination; spectrum hole

PDF

Paper 57: Evaluation of a Behind-the-Ear ECG Device for Smartphone Based Integrated Multiple Smart Sensor System in Health Applications

Abstract: In this paper, we present a wireless Multiple Smart Sensor System (MSSS) in conjunction with a smartphone to enable unobtrusive monitoring of the electrocardiogram (ear-lead ECG), integrated with multiple sensors for core body temperature and blood oxygen saturation (SpO2), for ambulatory patients. The proposed behind-the-ear device makes the system desirable for measuring ECG data: it is technically less complex, physically attached to non-hair regions and hence more suitable for long-term use, and user friendly since there is no need to remove the top garment. The proposed smart sensor device is similar to a hearing aid and is wirelessly connected to a smartphone for physiological data transmission and display. The device not only gives access to core temperature and ECG from the ear, but can also be controlled (removed and reapplied) by the patient at any time, increasing its usability in personal healthcare applications. A number of candidate ECG electrodes, varying in electrode area and in the dry/non-dry nature of the electrode surface, were tested at various locations behind the ear. The best ECG electrode was then chosen based on the Signal-to-Noise Ratio (SNR) of the measured ECG signals. These electrodes showed an acceptable SNR of ~20 dB, which is comparable with existing traditional ECG electrodes. The developed ECG electrode system was then integrated with a commercially available PPG sensor (Amperor pulse oximeter) and a core body temperature sensor (MLX90614) using a microcontroller (Arduino UNO), and the results were monitored using a newly developed smartphone (Android) application.

Author 1: Numan Celik
Author 2: Nadarajah Manivannan
Author 3: Wamadeva Balachandran

Keywords: wireless body area networks; body-worn sensors; ECG; core body temperature; oxygen saturation level (SpO2); biosensor integration; m-health

PDF

Paper 58: An Evaluation of Requirement Prioritization Techniques with ANP

Abstract: This article presents an evaluation of seven software requirements prioritization methods (ANP, binary search tree, AHP, hierarchy AHP, spanning tree matrix, priority group and bubble sort). Based on a case study of a local project (automation of the Mobilink franchise system), the experiment was conducted by students in the Requirement Engineering course of the Software Engineering department at the University of Science and Technology Bannu, Khyber Pakhtunkhwa, Pakistan. The parameters/measures used to evaluate the prioritization techniques are consistency indication, scale of measurement, interdependence, required number of decisions, total time consumption, time consumption per decision, ease of use, reliability of results and fault tolerance. The results of the experiment show that ANP is the most successful prioritization methodology among all those evaluated.

Author 1: Javed Ali Khan
Author 2: Izaz-ur-Rehman
Author 3: Iqbal Qasim
Author 4: Shah Poor Khan
Author 5: Yawar Hayat Khan

Keywords: Requirement Engineering; Requirement prioritization; ANP; AHP; Software Engineering; Comparison

PDF
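Several of the evaluation measures above, notably consistency indication, come from the AHP/ANP machinery. A minimal sketch of the AHP priority vector (geometric-mean approximation) and consistency index CI = (λmax − n)/(n − 1) follows; the 3×3 pairwise-comparison matrix is a hypothetical, perfectly consistent example:

```python
from math import prod

def ahp_priorities(matrix):
    """AHP priority vector via the geometric-mean approximation, plus the
    consistency index CI = (lambda_max - n) / (n - 1)."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    w = [g / sum(gm) for g in gm]
    # lambda_max estimated as the mean of (A.w)_i / w_i
    lmax = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
               for i in range(n)) / n
    return w, (lmax - n) / (n - 1)

# hypothetical, perfectly consistent 3x3 pairwise-comparison matrix
A = [[1, 2, 4],
     [1 / 2, 1, 2],
     [1 / 4, 1 / 2, 1]]
w, ci = ahp_priorities(A)
print([round(x, 3) for x in w])  # priorities sum to 1; CI is ~0 here
```

In practice CI is divided by a tabulated random index to give the consistency ratio, and judgments are usually revisited when that ratio exceeds about 0.1.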

Paper 59: Cyber Profiling Using Log Analysis And K-Means Clustering

Abstract: The activities of Internet users are increasing from year to year and have had an impact on the behavior of the users themselves. Assessment of user behavior is often based only on interaction across the Internet, without knowledge of other activities. Activity logs can be used as another way to study user behavior. Internet activity logs are a type of big data, so data mining with the K-Means technique can serve as a solution for analyzing user behavior. This study carried out clustering with the K-Means algorithm using three clusters: high, medium, and low. The results from a higher education institution show that each of these clusters contains websites frequented in the following order: search engines, social media, news, and information sites. This study also shows that the resulting cyber profiles are strongly influenced by environmental factors and daily activities.

Author 1: Muhammad Zulfadhilah
Author 2: Yudi Prayudi
Author 3: Imam Riadi

Keywords: Clustering; K-Means; Log; Network; Cyber Profiling

PDF
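The three-cluster K-Means step described above can be sketched for one-dimensional access counts; the counts below are invented for illustration, and the deterministic initialization is a simplification of the usual random seeding:

```python
def kmeans_1d(values, k=3, iters=20):
    """Minimal 1-D K-Means for access counts, with k clusters."""
    # deterministic init: spread initial centroids across the sorted data
    centroids = sorted(values)[::max(1, len(values) // k)][:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# hypothetical per-user daily access counts from a proxy log
counts = [2, 3, 4, 20, 22, 25, 90, 95, 100]
cents, clus = kmeans_1d(counts)
print(sorted(round(c, 2) for c in cents))  # low / medium / high centroids
```

Sorting the final centroids gives the "low", "medium" and "high" labels used for profiling.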

Paper 60: Enhancement in System Schedulability by Controlling Task Releases

Abstract: In real-time systems, fixed priority scheduling techniques are considered superior to their dynamic priority counterparts from an implementation perspective; however, dynamic priority assignments dominate the fixed priority mechanism when it comes to system utilization. Considering this gap, a number of results have recently been added to the real-time systems literature that achieve higher utilization at the cost of tuning task parameters. We further investigate this problem by proposing a novel fixed priority scheduling technique that keeps task parameters intact. The proposed technique favors lower priority tasks by blocking the release of higher priority tasks without hurting their deadlines. This strategy creates extra space that a lower priority task can use to complete its execution. It is proved that the proposed technique dominates pure preemptive scheduling. Furthermore, the results are applied to an example task set that is not schedulable with preemption threshold scheduling or quantum based scheduling but is schedulable with the proposed technique. The analyses show the superiority of our work over existing fixed priority alternatives from a utilization perspective.

Author 1: Basharat Mahmood
Author 2: Naveed Ahmad
Author 3: Saif ur Rehman Malik
Author 4: Adeel Anjum

Keywords: Real-time Systems; Fixed Priority Scheduling; RM Scheduling; Priority Inversion

PDF
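For background on the utilization gap the abstract refers to (and not as the paper's novel technique), the classical Liu–Layland sufficient test for rate-monotonic scheduling can be sketched; the task set below is hypothetical:

```python
def rm_utilization_test(tasks):
    """Liu-Layland sufficient test for rate-monotonic (RM) scheduling:
    total utilization U = sum(C_i / T_i) must not exceed n * (2**(1/n) - 1).
    Passing guarantees schedulability; failing is inconclusive and calls
    for an exact response-time analysis."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return u, bound, u <= bound

# hypothetical task set: (worst-case execution time C, period T)
u, bound, ok = rm_utilization_test([(1, 4), (1, 5), (2, 10)])
print(round(u, 2), round(bound, 3), ok)
```

As n grows the bound falls toward ln 2 ≈ 0.693, while dynamic-priority EDF schedules any set with U ≤ 1; that gap is what utilization-improving fixed-priority techniques like the one above target.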

Paper 61: An Emergency Unit Support System to Diagnose Chronic Heart Failure Embedded with SWRL and Bayesian Network

Abstract: In all regions of the world, heart failure is common and on the rise, caused by several aetiologies. Although treatment is developing quickly, many patients still lose their lives in emergency departments because of slow responses to their cases. In this paper we propose an expert system that can help practitioners in emergency rooms to diagnose the disease quickly and advise them on the appropriate actions that should be taken to save the patient's life. Because the information given to the system is mostly binary, a Bayesian Network model was selected to support reasoning under uncertain or missing information. The domain concepts and the relations between them were built using an ontology supported by the Semantic Web Rule Language (SWRL) to encode the rules. The system was tested on 105 patients; several classification functions were evaluated and showed remarkable results in the accuracy and sensitivity of the system.

Author 1: Baydaa Al-Hamadani

Keywords: Ontology Engineering; Bayesian Network; Heart Failure; Expert System; Validation Test

PDF

Paper 62: Analyzing Data Reusability of Raytrace Application in Splash2 Benchmark

Abstract: When designing a chip multiprocessor, Splash2 is used to estimate its performance. This benchmark contains eleven applications. The performance when running them is similar, except for Raytrace. We analyse it to clarify why its performance is not good. We discover that, in theory, Raytrace never reuses data. This leads to the fact that its performance is poor due to the low hit ratio in the data cache.

Author 1: Hao Do-Duc
Author 2: Vinh Ngo-Quang

Keywords: Chip multiprocessors; benchmark; ray tracing; reflection; intensity; ray-Tree

PDF

Paper 63: Management Information Systems in Public Institutions in Jordan

Abstract: Six constructs were utilized in this study to explore the factors affecting MIS implementation in Jordanian public institutions and to investigate the impact of MIS implementation on organizational (operational) performance. They were human factors, organizational factors, technological factors, environmental factors, MIS implementation components and organizational performance. The required data were collected using a valid and reliable questionnaire developed based on the literature review. Human factors were conceptualized as users’ computer skills and experience, IS usefulness and IS ease of use. Organizational factors were assessed using three sub-indicators, which were top-management support, user training and IS confidentiality. Technological factors were evaluated by systematic quality, information quality and service quality. The overall industry, industry environment and external pressure were three indicators used to measure the environmental factors. Two variables were selected to measure MIS implementation: IT/IS capability and technological aspects related to information service quality. Since the current study tackled public institutions, the indicators of organizational performance were limited to operational ones. The questionnaire was distributed to 125 informants from IT/IS departments. The findings of the study indicated the acceptance of the hypothesis that the factors in question are significantly and positively related to MIS implementation, which in turn, when measured by IT/IS capability and information service quality, significantly and positively affect organizational performance. The main contribution provided by this study is that MIS implementation is not limited to information technology and systems capabilities and usefulness. Other factors should be considered, particularly when examining the impact of MIS implementation on organizational performance.

Author 1: Ahmad A. Al-Tit

Keywords: management information systems; adoption success factors; organizational performance; public institutions

PDF

Paper 64: A Novel Method in Two-Step-Ahead Weight Adjustment of Recurrent Neural Networks: Application in Market Forecasting

Abstract: Gold price prediction is a complex, severely difficult nonlinear problem. Real-time price prediction, a cornerstone of many economic models, is one of the most challenging tasks for economists since the context of financial agents is often dynamic. Since direction prediction is important in financial time series, in this work an innovative Recurrent Neural Network (RNN) is utilized to obtain accurate Two-Step-Ahead (2SA) prediction results and to improve forecasting performance for the gold market. The training method of the proposed network is combined with an adaptive learning rate algorithm, and a linear combination of Directional Symmetry (DS) is utilized in the training phase. The proposed method has been developed for online and offline applications. Simulations and experiments on daily gold market data and the benchmark Lorenz and Rossler time series show the high efficiency of the proposed method, which can forecast future gold prices precisely.

Author 1: Narges Talebi Motlagh
Author 2: Amir RikhtehGar Ghiasi

Keywords: Recurrent Neural Network; Two Step Ahead Prediction; Reinforcement Learning; Directional Statistics; Gold Market

PDF
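The Directional Symmetry (DS) measure used in the training phase above is commonly defined as the percentage of steps in which the forecast moves in the same direction as the actual series; a sketch with invented price data:

```python
def directional_symmetry(actual, predicted):
    """Directional Symmetry: percentage of steps where the forecast and
    the actual series move in the same direction."""
    hits = sum(1 for t in range(1, len(actual))
               if (actual[t] - actual[t - 1]) * (predicted[t] - predicted[t - 1]) > 0)
    return 100.0 * hits / (len(actual) - 1)

# invented gold-price series and forecasts
actual = [100, 102, 101, 105, 104]
predicted = [100, 103, 102, 104, 106]
print(directional_symmetry(actual, predicted))  # 3 of 4 moves agree -> 75.0
```

Blending a term like this into the loss rewards getting the direction right even when the magnitude is off, which matters for trading decisions.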

Paper 65: Applications of Multi-criteria Decision Making in Software Engineering

Abstract: Nowadays, every complex problem requires multi-criteria decision making to reach the desired solution. Numerous multi-criteria decision making (MCDM) approaches have evolved over recent times to accommodate various application areas and have recently been explored as alternatives for solving complex software engineering problems. The most widely used approach is the Analytic Hierarchy Process (AHP), which combines mathematics and expert judgment but suffers from imprecision and subjectivity. This paper proposes using Fuzzy AHP (FAHP) instead of the traditional AHP method. FAHP helps decision makers make better choices with respect to both tangible and intangible criteria. The paper provides a clear guide on how FAHP can be applied, particularly in specific situations in the software engineering area. The conclusions of this study should help and motivate practitioners and researchers to use multi-criteria decision making approaches in software engineering.

Author 1: Sumeet Kaur Sehra
Author 2: Yadwinder Singh Brar
Author 3: Navdeep Kaur

Keywords: Multi-criteria Decision Making; Analytic Hierarchy Process; Fuzzy AHP; Software Engineering

PDF

Paper 66: Arabic Text Question Answering from an Answer Retrieval Point of View: A Survey

Abstract: Arabic Question Answering (QA) is gaining more importance due to the importance of the language and the dramatic increase in online Arabic content. The goal of this article is to review the state-of-the-art of Arabic QA methods, to classify them into different categories from an answer retrieval viewpoint and to present their applications, issues and new trends. The main components of question answering systems are also presented. Finally, this survey provides a comparative study of systems of each type of QA based on several criteria.

Author 1: Bodor A. B. Sati
Author 2: Mohammed A. S. Ali
Author 3: Sherif M. Abdou

Keywords: Question answering; Information retrieval; Answer retrieval; Arabic NLP

PDF

Paper 67: Comparative Analysis of ALU Implementation with RCA and Sklansky Adders In ASIC Design Flow

Abstract: An Arithmetic Logic Unit (ALU) is the heart of every central processing unit (CPU); it performs basic operations such as addition, subtraction, multiplication, division and bitwise logic operations on binary numbers. This paper deals with the implementation of a basic ALU using two different types of adder circuits: a ripple carry adder and a Sklansky-type adder. The ALU is designed on an application specific integrated circuit (ASIC) platform using the VHDL hardware description language and standard cells. The target process technology is 130 nm CMOS from the foundry STMicroelectronics. The Cadence EDA tools are used for the ASIC implementation. A comparative analysis of the two ALU circuits is provided in terms of area, power and timing requirements.

Author 1: Abdul Rehman Buzdar
Author 2: Liguo Sun
Author 3: Abdullah Buzdar

Keywords: Arithmetic Logic Unit; Ripple Carry Adder; Sklansky Adder; ASIC Design; EDA Tools

PDF
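The ripple carry adder compared above can be simulated at bit level in a few lines, which also makes its serial carry chain (the source of its longer critical path relative to the Sklansky adder) explicit; this is an illustrative model, not the paper's VHDL:

```python
def full_adder(a, b, cin):
    """One-bit full adder: returns (sum, carry-out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(x, y, width=8):
    """Ripple-carry adder: the carry propagates serially through every
    bit position, which makes the critical path O(width)."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

print(ripple_carry_add(100, 55))   # (155, 0)
print(ripple_carry_add(200, 100))  # (44, 1): 300 wraps mod 256 with carry-out
```

A parallel-prefix adder such as the Sklansky design computes the carries in O(log width) levels instead, trading wiring and area for speed.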

Paper 68: Computational Modeling of Proteins based on Cellular Automata

Abstract: The literature on building computational and mathematical models of proteins is rich and diverse, since its practical applications are of vital importance to the development of many fields. Modeling proteins is not a straightforward process, and some modeling strategies require combining concepts from different fields including physics, chemistry, thermodynamics, and computer science. The focus here is on models based on the concept of cellular automata and equivalent systems. Cellular automata are discrete computational models capable of universal computation; in other words, they can perform any computation that a normal computer can. What is special about cellular automata is their ability to produce complex and chaotic global behavior from local interactions. The paper discusses the effort made so far by the research community in this direction and proposes a computational model of protein folding based on 3D cellular automata. Unlike common models, the proposed model maintains the basic properties of cellular automata while keeping a realistic view of protein operations. As in any cellular automata model, the dimension, neighborhood, boundary, and rules are specified. In addition, a discussion is given to clarify why these parameters are in place and what alternatives can be used in the protein folding context.

Author 1: Alia Madain
Author 2: Abdel Latif Abu Dalhoum
Author 3: Azzam Sleit

Keywords: Proteins 3D Folding; Bioinformatics; Computational Modeling; Cellular Automata; Theoretical Computer Science

PDF
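The defining property the abstract names, complex global behavior emerging from purely local rules, is easy to see in even the simplest cellular automaton. The sketch below is a 1D elementary CA with a periodic boundary; it is a generic illustration, not the paper's 3D protein-folding model:

```python
def step(cells, rule=110):
    """Apply an elementary CA rule to a row of 0/1 cells with periodic boundary."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right  # 3-cell neighborhood as a 3-bit index
        out.append((rule >> idx) & 1)              # rule number encodes the lookup table
    return out
```

Rule 110 is itself capable of universal computation, which is the sense in which the abstract says cellular automata can do anything a normal computer can. The paper's model replaces the row with a 3D lattice and the 3-cell neighborhood with a 3D one, but the update principle is the same.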

Paper 69: Conditions Facilitating the Aversion of Unpopular Norms: An Agent-Based Simulation Study

Abstract: People mostly facilitate and manage their social lives by adhering to the prevalent norms. Some norms are unpopular, yet people adhere to them. Ironically, people at the individual level do not agree with these norms, but they still follow and even facilitate them. Irrespective of the social and psychological reasons behind their persistence, sometimes, for societal good, it is necessary to oppose and possibly avert the unpopular norms. In this paper, we model theory-driven computational specifications of the Emperor’s Dilemma in an agent-based simulation to understand the conditions that result in the emergence of unpopular norms. The reciprocal nature of persistence and aversion of norms is thus utilized to define situations under which these norms can be changed and averted. The simulation is performed under many interesting “what-if” questions. The simulation results reveal that under high-density conditions of the agent population with a high percentage of norm-aversion activists, the aversion of unpopular norms can be achieved.

Author 1: Zoofishan Zareen
Author 2: Muzna Zafar
Author 3: Kashif Zia

Keywords: Agent-based Modeling and Simulation; Emperor’s Dilemma; Complex Adaptive Systems.

PDF

Paper 70: Developing a Real-Time Web Questionnaire System for Interactive Presentations

Abstract: Conducting presentations with bi-directional communication requires extended presentation systems, e.g., ones having sophisticated expressions and gathering real-time feedback. We aim to develop an interactive presentation system that enhances presentations with bi-directional communication. We developed a hybrid interactive presentation system that combines a traditional presentation tool, e.g., PowerPoint, with a web application. To gather feedback from audiences during presentations, the web application delivers presentation slides to the audience. The client system provides features for creating annotations and answering questions on the delivered slides. Specifically, the system provides a real-time questionnaire function whose result is displayed on a shared screen in real time while answers are being gathered. Since users can create their questionnaires in PowerPoint, the task becomes quite easy. This paper explains the development of the system and demonstrates that the real-time questionnaire system achieves high performance and scalability.

Author 1: Yusuke Niwa
Author 2: Shun Shiramatsu
Author 3: Tadachika Ozono
Author 4: Toramatsu Shintani

Keywords: Interactive Presentation; Real-time Web questionnaire; Collaborative tools; communication aids; information sharing; Web services

PDF

Paper 71: FPGA implementation of filtered image using 2D Gaussian filter

Abstract: Image filtering is one of the most useful techniques in image processing and computer vision, used to eliminate useless details and noise from an image. In this paper, a hardware implementation of image filtering using a 2D Gaussian filter is presented. The Gaussian filter architecture is described using different ways of implementing the convolution module. Since multiplication is at the heart of the convolution module, three different ways to implement the multiplication operations are presented. The first way uses the standard method. The second way uses the Field Programmable Gate Array's (FPGA) embedded Digital Signal Processor (DSP) blocks to make effective use of FPGA resources and speed up the calculation. The third way uses a real (floating-point) multiplier for more precision, at the cost of maximum use of FPGA resources. In this paper, we compare the image quality of the hardware (VHDL) and software (MATLAB) implementations using the Peak Signal-to-Noise Ratio (PSNR). The FPGA resource usage for different sizes of the Gaussian kernel is also presented in order to compare fixed-point and floating-point implementations.

Author 1: Leila kabbai
Author 2: Anissa Sghaier
Author 3: Ali Douik
Author 4: Mohsen Machhout

Keywords: Gaussian Filter; convolution; fixed-point arithmetic; floating-point arithmetic; FPGA

PDF
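The two quantities central to the comparison, the 2D Gaussian kernel and the PSNR quality metric, can be sketched in a few lines of reference software. This Python version is illustrative only; the paper's implementations are in VHDL and MATLAB:

```python
import math

def gaussian_kernel(size=5, sigma=1.0):
    """Build a normalized 2D Gaussian kernel as a list of lists."""
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
          for x in range(size)] for y in range(size)]
    total = sum(sum(row) for row in k)          # normalize so the weights sum to 1
    return [[v / total for v in row] for row in k]

def psnr(img_a, img_b, peak=255.0):
    """Peak Signal-to-Noise Ratio (dB) between two same-sized grayscale images."""
    sq = [(a - b) ** 2 for ra, rb in zip(img_a, img_b) for a, b in zip(ra, rb)]
    mse = sum(sq) / len(sq)
    return float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

A fixed-point hardware kernel quantizes these weights, and the PSNR between the fixed-point and floating-point outputs measures exactly the quality loss the paper reports.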

Paper 72: From Emotion Recognition to Website Customizations

Abstract: A computer vision system that recognizes the emotions of a website’s user and customizes the content and presentation of the website accordingly is presented herein. A logistic regression classifier is trained over the Extended Cohn-Kanade dataset in order to recognize the emotions. The Scale-Invariant Feature Transform algorithm, applied over two different parts of an image, the face and the eyes, without any special pixel-intensity preprocessing, is used to describe each emotion. The testing phase shows a significant improvement in the classification results. A toy website is also developed as a proof of concept.

Author 1: O. B. Efremides

Keywords: Emotion recognition; classification; computer vision; web interfaces

PDF

Paper 73: New mechanism for Cloud Computing Storage Security

Abstract: Cloud computing, often referred to as simply the cloud, is an emerging computing paradigm that promises to radically change the way computer applications and services are constructed, delivered, managed, and finally guaranteed as dynamic computing environments for end users. The cloud is the delivery of on-demand computing resources, everything from applications to data centers, over the Internet on a pay-per-use basis. The revolution of cloud computing has provided opportunities for research in all its aspects. Despite the big progress in cloud computing technologies, security concerns may limit broader adoption. This paper presents a technique to tolerate both accidental and intentional faults: fragmentation-redundancy-scattering (FRS). The possibility of using the FRS technique for intrusion tolerance is investigated as a means of providing secure and dependable storage in the cloud environment. A cloud computing security (CCS) scheme based on the FRS technique is also proposed, and several scenarios explore how the proposal can be used. To demonstrate the robustness of the proposal, we formalize our design, carry out security and performance evaluations of the approach, and compare it with the classical model. The paper concludes by suggesting future research directions for the CCS framework.

Author 1: Almokhtar Ait El Mrabti
Author 2: Najim Ammari
Author 3: Anas Abou El Kalam
Author 4: Abdellah Ait Ouahman
Author 5: Mina De Montfort

Keywords: Cloud Computing; Data Security; Data Encryption; Fragmentation-Redundancy-Scattering

PDF

Paper 74: Quality of Service Provisioning in Biosensor Networks

Abstract: Biosensor networks are wireless networks consisting of tiny biological sensors (biosensors, for short) that can be implanted inside the body of human and animal subjects. Biosensors can measure various biological processes that occur inside the body of the subject under test. Applications of biosensor networks include automated drug delivery, heart-rate monitoring, and temperature sensing. Since biosensor networks employ wireless transmission, heat is generated in the tissues surrounding the implanted biosensors. Human and animal tissues are very sensitive to temperature increases. The generated heat is normally mitigated by the natural thermoregulatory system; however, excessive transmissions can cause a significant increase in temperature and thus tissue damage. Hence, there is a need for a mechanism to control the rate of wireless transmissions. Of course, controlling the rate of wireless transmissions leads to Quality-of-Service (QoS) issues such as the required minimum delay and throughput. In this paper, we investigate the above issues using the framework of Markov Decision Processes (MDPs). We develop several MDP models that enable us to study the different trade-offs involved in QoS provisioning in biosensor networks. The optimal policies computed using the proposed MDP models are compared with greedy policies to show their vigilant behavior and viable performance.

Author 1: Yahya Osais
Author 2: Muhammad Butt

Keywords: Biosensor networks; Quality of service; Markov decision processes

PDF
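The kind of transmit-versus-heat trade-off the abstract describes can be solved with standard value iteration over an MDP. The sketch below is generic; the states, actions, transition probabilities, and rewards in the usage are invented for illustration and are not the paper's models:

```python
def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-6):
    """Solve an MDP. P[s][a] = list of (prob, next_state); R[s][a] = reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            # Bellman backup: best expected return over all actions
            best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    # Greedy policy with respect to the converged value function
    policy = {s: max(actions, key=lambda a: R[s][a] +
                     gamma * sum(p * V[t] for p, t in P[s][a]))
              for s in states}
    return policy, V
```

With hypothetical tissue states {cool, hot} and actions {transmit, idle}, where transmitting earns throughput reward but risks heating, the optimal policy transmits only while the tissue is cool, which is the vigilant behavior the abstract contrasts with greedy policies.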

Paper 75: Software Architecture Quality Measurement Stability and Understandability

Abstract: Over the past years, software architecture has become an important sub-field of software engineering. There has been substantial advancement in developing new technical approaches that begin to handle architectural design as an engineering discipline. Measurement is an essential part of any engineering discipline. Quantifying the quality attributes of a software architecture reveals good insights about the architecture. It also helps architects and practitioners choose, among alternative architectures, the best fit for their needs. This work paves the way for researchers to start investigating ways to measure software architecture quality attributes. Measurement of these qualities is essential for this sub-field of software engineering. This work explores the stability and understandability of software architecture, several metrics that affect them, and the literature on these qualities.

Author 1: Mamdouh Alenezi

Keywords: Software Engineering; Software Architecture; Quality Attributes; Stability; Understandability

PDF

Paper 76: WE-MQS-VoIP Priority: An enhanced LTE Downlink Scheduler for voice services with the integration of VoIP priority mode

Abstract: Long Term Evolution (LTE) is a high-data-rate, fully all-IP network. It is designed to support multimedia services such as video, VoIP, and gaming well, so real-time services such as VoIP and video need to be optimized. Nevertheless, the deployment of such live-stream services poses many challenges. Scheduling and allocating radio resources are very important in an LTE network, especially for multimedia services such as VoIP. When a voice service is transmitted over an LTE network, it is affected by many network impairments, chiefly three factors: packet loss, delay, and jitter. This study proposes a new scheduler based on a VoIP priority mode, the Wideband (WB) E-model, and QoS- and channel-awareness (called the WE-MQS-VoIP Priority scheduler) for voice services in the LTE downlink direction. The proposed scheduling scheme is built on the WB E-model and Maximum Queue Size (MQS), and we integrate the VoIP priority mode into the scheme. Since the proposed scheduler considers the VoIP priority mode and user perception, it significantly improves system performance. The results demonstrate that the proposed scheduler not only meets the QoS demands of voice calls but also outperforms Modified Largest Weighted Delay First (M-LWDF) in terms of delay and Packet Loss Rate (PLR) for all numbers of users (NU), except for NU equal to 30. For Fairness Index (FI), cell throughput, and Spectral Efficiency (SE), the differences among the packet schedulers are not significant. The performance evaluation covers delay, PLR, throughput, FI, and SE.

Author 1: Duy-Huy Nguyen
Author 2: Hang Nguyen
Author 3: Eric Renault

Keywords: AMR-WB; Wideband E-model; VoLTE; VoIP priority mode; User perception

PDF

Paper 77: Sentiment Based Twitter Spam Detection

Abstract: Spam is becoming a serious threat for the users of online social networks, especially ones like Twitter. Twitter’s structural features make it more vulnerable to spam attacks. In this paper, we propose a spam detection approach for Twitter based on sentiment features. We perform our experiments on a collection of 29K tweets, with 1K tweets for each of 29 trending topics of 2012 on Twitter. We evaluate the usefulness of our approach using five classifiers: BayesNet, Naive Bayes, Random Forest, Support Vector Machine (SVM), and J48. The spam detection performance of Naive Bayes, Random Forest, J48, and SVM improved with the combination of all our proposed features. The results demonstrate that the proposed features provide better classification accuracy when combined with content- and user-oriented features.

Author 1: Nasira Perveen
Author 2: Malik M. Saad Missen
Author 3: Qaisar Rasool
Author 4: Nadeem Akhtar

Keywords: sentiment analysis; spam detection; twitter

PDF

Paper 78: WHITE - DONKEY: Unmanned Aerial Vehicle for searching missing people

Abstract: Searching for a missing person is not an easy task to accomplish, so search methods have been developed over the years. The problem is that the currently available methods have certain limitations, and these limitations are reflected in the time needed to locate the person. That time is a very important factor that rescuers cannot afford to waste, because the missing person is exposed to great dangers. In people search, the vision system of the human being plays a very important role. The human visual system has the ability to detect and identify objects such as trees, walls, and people, and to estimate the distance to them; this gives human beings the ability to move through their environment. With the development of artificial intelligence, primarily computer vision, it is possible to model human visual perception and build the software needed to simulate these capabilities. Using computer vision, we design and implement algorithms so that an Unmanned Aerial Vehicle can search for a missing person; thanks to the vehicle's speed, the time to locate the person is expected to decrease. The Unmanned Aerial Vehicle is not intended to replace the human being in the difficult task of searching for and rescuing people, but rather to serve as a support tool in performing this difficult task.

Author 1: Jaime Moreno
Author 2: Jesus Cruz
Author 3: Edgar Dominguez

Keywords: Computer Vision; Unmanned Aerial Vehicle; Search and Rescue System; Human Visual System; Quadricopter

PDF

Paper 79: Wyner-Ziv Video Coding using Hadamard Transform and Deep Learning

Abstract: Predictive schemes are the current standards of video coding. Unfortunately, they do not apply well to lightweight devices such as mobile phones. The high encoding complexity is the bottleneck of the Quality of Experience (QoE) of a video conversation between mobile phones. A considerable amount of research has been conducted towards tackling that bottleneck. Most of the schemes use the so-called Wyner-Ziv video coding paradigm, with results still not comparable to those of predictive coding. This paper shows a novel approach to Wyner-Ziv video compression based on reinforcement learning and the Hadamard transform. Our scheme shows very promising results.

Author 1: Jean-Paul Kouma
Author 2: Ulrik Soderstrom

Keywords: Wyner-Ziv; video coding; rate distortion; Hadamard transform; Deep learning; Expectation Maximization

PDF

Paper 80: Quartic approximation of circular arcs using equioscillating error function

Abstract: A high-accuracy quartic approximation for circular arcs is given in this article. The approximation is constructed so that the error function is of degree 8 with the least deviation from the x-axis; the error function equioscillates 9 times; the approximation order is 8. The numerical examples demonstrate the efficiency and simplicity of the approximation method, which satisfies the stated properties and yields the highest possible accuracy.

Author 1: Abedallah Rababah

Keywords: Bezier curves; quartic approximation; circular arc; high accuracy; approximation order; equioscillation; CAD

PDF

Paper 81: A Robust MAI Constrained Adaptive Algorithm for Decision Feedback Equalizer for MIMO Communication Systems

Abstract: A decision feedback equalizer uses prior symbol decisions to mitigate the damaging effects of intersymbol interference on the received symbols. Due to its inherently nonlinear nature, the decision feedback equalizer outperforms the linear equalizer in cases where the intersymbol interference in a communication system is severe. Equalization of multiple-input multiple-output fast-fading channels is a daunting job, as these equalizers must mitigate not only intersymbol interference but also interstream interference. Various equalization methods have been suggested in the adaptive filtering literature for multiple-input multiple-output systems. In this paper, we develop a novel algorithm for multiple-input multiple-output communication systems centered around a constrained optimization technique. It is attained by minimizing the mean-squared-error criterion with respect to the known variance statistics of multiple access interference and white Gaussian noise. The novelty of our paper is that such a constrained method has not previously been used in a multiple-input multiple-output decision feedback equalizer scheme. The performance of the proposed algorithm is compared to the least mean squares as well as normalized least mean squares algorithms. Simulation results demonstrate that the proposed algorithm outperforms competing multiple-input multiple-output decision feedback equalizer algorithms.

Author 1: Khalid Mahmood
Author 2: Syed Muhammad Asad
Author 3: Muhammad Moinuddin
Author 4: Waqas Imtiaz

Keywords: Decision feedback equalizer; single input single output; intersymbol interference; multiple access interference; Rayleigh fading; AWGN; adaptive algorithm

PDF

Paper 82: TGRP: A New Hybrid Grid-based Routing Approach for Manets

Abstract: Most existing grid-based routing protocols use reactive mechanisms to build routing paths. In this paper, we propose a new hybrid approach for grid-based routing in MANETs which uses a combination of reactive and proactive mechanisms. The proposed routing approach uses shortest-path trees to build the routing paths between source and destination nodes. We design a new protocol based on this approach called the Tree-based Grid Routing Protocol (TGRP). The main advantage of the new approach is the high routing path stability due to availability of readily constructed alternative paths. Our simulation results show that the stability of the TGRP paths results in a substantially higher performance compared to other protocols in terms of lower end-to-end delay, higher delivery ratio and reduced control overhead.

Author 1: Hussein Al-Maqbali
Author 2: Mohamed Ould-Khaoua
Author 3: Khaled Day
Author 4: Abderezak Touzene
Author 5: Nasser Alzeidi

Keywords: MANETs; routing protocols; NS2 simulation; performance evaluation

PDF
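The shortest-path trees that TGRP maintains proactively can be illustrated with a plain breadth-first-search tree over an unweighted graph. This is a generic sketch of the data structure, not the protocol's grid logic:

```python
from collections import deque

def shortest_path_tree(adj, root):
    """BFS from root; returns a parent map encoding a shortest-path tree."""
    parent = {root: None}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in parent:     # first visit = shortest hop count in unweighted graphs
                parent[v] = u
                queue.append(v)
    return parent

def path_to_root(parent, node):
    """Read the routing path from a node back to the tree root."""
    path = [node]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path
```

Keeping such trees ready is what gives the hybrid approach its stability: when a link breaks, an alternative branch of the tree already exists instead of triggering a fresh reactive route discovery.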

Paper 83: Balanced Distribution of Load on Grid Resources using Cellular Automata

Abstract: Load balancing is a technique for the equal and fair distribution of load on resources, maximizing their performance and reducing the overall execution time. However, meeting all of these goals in a single algorithm is not possible due to their inherent conflict, so some features must be prioritized based on the requirements and objectives of the system, and the desired algorithm must be designed with that orientation. In this article, a decentralized load balancing algorithm based on cellular automata and fuzzy logic is presented, which has the capabilities needed for fair distribution of resources at the Grid level. Each computing node is modeled as a cell of a cellular automaton and, with the help of fuzzy logic, can act as an expert system that decides the best choice for task assignment in a dynamic environment with uncertain data. Based on information exchanged with its neighboring nodes over certain time periods, and using fuzzy logic, each node is mapped to one of the VL, L, VN, H, and VH states, and it tries to estimate the status of the other nodes in subsequent periods to reduce communication overhead; the decision to send or receive task load is made based on the status of each node. An appropriate structure for the system can thus greatly improve the efficiency of the algorithm. The fuzzy control does not use search and optimization; it makes decisions based on inputs that are effective parameters of the system and are mostly incomplete and nonspecific.

Author 1: Amir Akbarian Sadeghi
Author 2: Ahmad Khademzadeh
Author 3: Mohammad Reza Salehnamadi

Keywords: computing Grid; load balancing; cellular automata; fuzzy logic

PDF

Paper 84: Goal Model Integration for Tailoring Product Line Development Processes

Abstract: Many companies rely on the promised benefits of product lines, targeting systems between fully custom-made software and mass products. Such customized mass products account for a large number of applications automatically derived from a product line. This gives product lines special importance for companies with a large part of their product portfolio based on a product line. The success of product line development efforts is highly dependent on tailoring the development process. This paper presents an integrative model of influence factors for tailoring product line development processes according to different project needs, organizational goals, individual goals of the developers, or constraints of the environment. The model integrates goal models, SPEM models, and requirements to tailor development processes.

Author 1: Arfan Mansoor
Author 2: Detlef Streitferdt
Author 3: Muhammad Kashif Hanif

Keywords: Goal model; Product Line; Development Process; Process Line

PDF

Paper 85: Cuckoo Search Optimization for Reduction of a Greenhouse Climate Model

Abstract: Greenhouse climate and crop models, and especially reduced models, are necessary for improving environmental management and control. In this paper, we present a metaheuristic method, the Cuckoo Search (CS) algorithm, inspired by the breeding behavior of a bird family, for selecting the parameters of a reduced model; it optimizes their choice by minimizing a cost function. The reduced model was already developed for control purposes and published in the literature [?]. The proposed models aim at simulating and predicting the greenhouse environment. This study focuses on the dynamical behavior of the inside air temperature and pressure under ventilation. Experimental results are used for model validation; the greenhouse is automated with actuators and sensors connected to a greenhouse control system. The cuckoo search method determines the best set of parameters, allowing for the convergence of a criterion based on the difference between calculated and observed state variables (inside air temperature and water vapour pressure content). The results show that the tested Cuckoo Search algorithm converges towards the optimal solution faster than classical optimization methods.

Author 1: Hasni Abdelhafid
Author 2: Haffane Ahmed
Author 3: Sehli Abdelkrim
Author 4: Draoui Belkacem

Keywords: optimization; cuckoo search; greenhouses; metaheuristics; climate models

PDF
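The Cuckoo Search loop itself is compact: Lévy flights around the best nest, plus abandonment of a fraction of the worst nests. The sketch below uses a stand-in sphere objective; the paper instead minimizes the gap between simulated and measured greenhouse state variables:

```python
import math
import random

def levy_step(beta=1.5):
    """Approximate a Lévy-distributed step via Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(cost, dim, n_nests=15, pa=0.25, iters=200, lo=-5.0, hi=5.0):
    """Minimize `cost` over [lo, hi]^dim with Lévy flights and nest abandonment."""
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=cost)
    for _ in range(iters):
        for i, nest in enumerate(nests):
            # New candidate by a Lévy flight relative to the current best nest
            new = [min(hi, max(lo, x + 0.01 * levy_step() * (x - b)))
                   for x, b in zip(nest, best)]
            if cost(new) < cost(nest):
                nests[i] = new
        # Abandon a fraction pa of the worst nests, replacing them with random ones
        nests.sort(key=cost)
        for i in range(int(pa * n_nests)):
            nests[-(i + 1)] = [random.uniform(lo, hi) for _ in range(dim)]
        best = min(nests + [best], key=cost)
    return best, cost(best)
```

For the paper's use case, `cost` would sum the squared differences between the reduced model's predicted and measured inside air temperature and water vapour pressure, and the search space bounds would come from the physical ranges of the model parameters.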

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org