The Science and Information (SAI) Organization
IJACSA Volume 7 Issue 6

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.

View Full Issue

Paper 1: Hybrid Multi-faceted Computational Trust Model for Online Social Network (OSN)

Abstract: An Online Social Network (OSN) is an online platform that enables people to exchange information, get in touch with family members or friends, and also serves as a marketing tool. However, OSNs suffer from various security and privacy issues. Trust is fundamentally built from hard trust (cryptographic mechanisms) and soft trust (recommender systems), and users' trust in these platforms has been declining. In this paper, the authors leverage the multi-faceted trust concept from user-centric and personalized trust models and present weightings and rankings for its important features by employing statistical means. Next, the multi-faceted trust model is combined with an existing action-based model and a context recommender. The contributions of this research are an enhanced trust algorithm and an enhanced context-based, recommender-based trust, which have been tested through user acceptance. Overall, the results show that an OSN fares better with a multi-faceted model that embeds actions than with a recommender-only model.

Author 1: Manmeet Mahinderjit Singh
Author 2: Teo Yi Chin

Keywords: Multi-facet Trust; Recommender Trust Model; Action Trust; Online Social Network; Security

PDF

Paper 2: Mobile Software Testing: Thoughts, Strategies, Challenges, and Experimental Study

Abstract: Mobile devices have become more pervasive in our daily lives and are gradually replacing regular computers for traditional tasks such as Internet browsing, editing photos, playing videos and audio tracks, and reading different files. The importance of mobile devices in our lives necessitates greater concern for the reliability and compatibility of mobile applications, and thus testing these applications arises as an important phase in the mobile device adoption process. This paper addresses various research directions in mobile application testing by investigating essential concepts, scope, features and requirements for testing mobile applications. We highlight the similarities and differences between mobile app testing and mobile web testing. Furthermore, we discuss and compare different mobile testing approaches and environments, and present the challenges as emergent needs in test environments. As a case study, we compare the testing experience of a hybrid application in an emulator and on a real-world device. The purpose of the experiment is to verify to what extent a virtual device can emulate a complete client experience. A set of experiments is conducted in which five Android mobile browsers are tested, each on a real device as well as an emulated device with the same specifications (CPU, memory size, etc.). The application is tested on the following metrics: performance and function/behavior testing.

Author 1: Mohammed Akour
Author 2: Bouchaib Falah
Author 3: Ahmad A. Al-Zyoud
Author 4: Salwa Bouriat
Author 5: Khalid Alemerien

Keywords: Software Mobile Testing; Software Testing; Mobile Performance Testing; Hybrid Mobile Application

PDF

Paper 3: A Hybrid Data Mining Approach for Intrusion Detection on Imbalanced NSL-KDD Dataset

Abstract: Intrusion detection systems aim to detect malicious activity in computer and network traffic, which is not possible using a common firewall. Most intrusion detection systems are developed based on machine learning techniques. Since the datasets used in intrusion detection are imbalanced, in previous methods the accuracy of detecting two attack classes, R2L and U2R, is lower than that of the normal and other attack classes. In order to overcome this issue, this study employs a hybrid approach: a combination of the synthetic minority oversampling technique (SMOTE) and the cluster center and nearest neighbor (CANN) method. Important features are selected using the leave-one-out (LOO) method. Moreover, this study employs the NSL-KDD dataset. Results indicate that the proposed method improves the accuracy of detecting U2R and R2L attacks in comparison to the baseline paper by 94% and 50%, respectively.
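
As a rough illustration of the oversampling idea behind SMOTE mentioned in this abstract (not the authors' hybrid SMOTE+CANN implementation), the following Python sketch synthesizes minority-class samples by interpolating toward minority-class nearest neighbours; the toy data and parameter values are assumptions.

```python
# Minimal SMOTE-style oversampling sketch (illustrative only, not the authors' code).
# Synthetic minority samples are created by interpolating between a minority sample
# and one of its k nearest minority-class neighbours.
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, rng=None):
    """X_min: (n, d) minority-class samples; returns (n_new, d) synthetic samples."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise squared distances within the minority class
    d2 = ((X_min[:, None, :] - X_min[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    neighbours = np.argsort(d2, axis=1)[:, :k]           # k nearest minority neighbours
    synthetic = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        a = rng.integers(n)                               # random minority sample
        b = neighbours[a, rng.integers(min(k, n - 1))]    # one of its neighbours
        gap = rng.random()                                # interpolation factor in [0, 1)
        synthetic[i] = X_min[a] + gap * (X_min[b] - X_min[a])
    return synthetic

# toy usage: oversample a 10-sample "U2R-like" minority class by 40 synthetic samples
X_minority = np.random.rand(10, 4)
X_extra = smote_like_oversample(X_minority, n_new=40, k=3, rng=0)
print(X_extra.shape)   # (40, 4)
```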

Author 1: Mohammad Reza Parsaei
Author 2: Samaneh Miri Rostami
Author 3: Reza Javidan

Keywords: intrusion detection system; feature selection; imbalanced dataset; SMOTE; NSL KDD

PDF

Paper 4: Rapid Control Prototyping and PIL Co-Simulation of a Quadrotor UAV Based on NI myRIO-1900 Board

Abstract: In this paper, a new Computer Aided Design (CAD) methodology for the Processor-In-the-Loop (PIL) co-simulation and Rapid Control Prototyping (RCP) of a Quadrotor Vertical Take-Off and Landing (VTOL) type of Unmanned Aerial Vehicle (UAV) is proposed and successfully implemented around an embedded NI myRIO-1900 target and a host PC. The developed software (SW) and hardware (HW) prototyping platform is based on the Control Design and Simulation (CDSim) module of the LabVIEW environment and an established Network Streams data communication protocol. A dynamical model of the Quadrotor UAV, which incorporates the dynamics of vertical and landing flights and aerodynamic forces, is obtained using the Newton-Euler formalism. PID and Model Predictive Control (MPC) approaches are chosen as examples for experiment prototyping. These control laws, as well as the dynamical model of the Quadrotor, are implemented and deployed as separate LabVIEW Virtual Instruments (VI) on the myRIO-1900 target and the host PC, respectively. Several demonstrative co-simulation results, obtained for a 3D LabVIEW emulator of the Quadrotor, are presented and discussed in order to demonstrate the effectiveness of the proposed Model Based Design (MBD) prototyping methodology.

Author 1: Soufiene Bouallègue
Author 2: Rabii Fessi

Keywords: Quadrotor UAV; modeling; Computer Aided Design; Model Based Design; Rapid Control Prototyping; PIL co-simulation; LabVIEW/CDSim; NI myRIO-1900; PID and MPC approaches

PDF

Paper 5: Ontology based Intrusion Detection System in Wireless Sensor Network for Active Attacks

Abstract: WSNs are vulnerable to attacks and demand special attention to the development of mechanisms that secure them against the various threats that could affect the overall infrastructure. WSNs are open to miscellaneous classes of attacks, and security breaches are intolerable in WSNs. Threats such as untrusted data transmissions and deployment in open and unfavorable environments are still open research issues. Safekeeping is an essential and complex requirement in WSNs. These issues raise the need to develop a security-based mechanism for Wireless Sensor Networks that categorizes the different attacks based on their relevance. A detailed survey of active attacks is presented based on the nature and attributes of those attacks, and an ontology-based mechanism is developed and tested for active attacks in WSNs.

Author 1: Naheed Akhter
Author 2: Maruf Pasha

Keywords: Semantics; Wireless Sensor Network; Intrusion detection and prevention system; ontologies

PDF

Paper 6: Secure Steganography for Digital Images

Abstract: The degree of imperceptibility of a hidden image in digital image steganography is mostly defined in relation to the limitations of the Human Visual System (HVS), its chances of detection using statistical methods, and its capacity to hide maximum information inside its body. However, a tradeoff exists between the data-hiding capacity of the cover image and the robustness of the underlying information-hiding scheme. This paper presents a technique to embed information inside the cover at Stego-key-dependent locations, which are hard to detect, in order to achieve optimal security. Hence, it is secure under the worst-case scenario where Wendy is in possession of the original image (cover) agreed upon by Alice and Bob for their secret communication. The reliability of our proposed solution can be appreciated by observing the differences between the cover, the preprocessed cover and the Stego object. The proposed scheme is equally good for color as well as grayscale images. Another interesting aspect of this research is that it implicitly presents the fusion of the cover and the information to be hidden in it while taking care of passive attacks on it.

Author 1: Khan Farhan Rafat
Author 2: Muhammad Junaid Hussain

Keywords: Steganography; Imperceptibility; Information Hiding; LSB Technique; Secure Communication; Information Security

PDF

Paper 7: Numerical Solutions of Heat and Mass Transfer with the First Kind Boundary and Initial Conditions in Capillary Porous Cylinder Using Programmable Graphics Hardware

Abstract: Recently, heat and mass transfer simulation has become more and more important in various engineering fields. In order to analyze how heat and mass transfer behave in a thermal environment, heat and mass transfer simulation is needed. However, it is very time-consuming to obtain numerical solutions to heat and mass transfer equations. Therefore, in this paper, one of the acceleration techniques developed in the graphics community that exploits a graphics processing unit (GPU) is applied to the numerical solution of heat and mass transfer equations. The nVidia Compute Unified Device Architecture (CUDA) programming model provides a straightforward means of describing inherently parallel computations. This paper improves the performance of numerically solving heat and mass transfer equations over a capillary porous cylinder with first-kind boundary and initial conditions on a GPU. Heat and mass transfer simulation using the CUDA platform on an nVidia Quadro FX 4800 is implemented. Our experimental results clearly show that the GPU can accurately perform heat and mass transfer simulation and can significantly accelerate performance, with a maximum observed speedup of 10 times. Therefore, the GPU is a good approach for accelerating heat and mass transfer simulation.

Author 1: Hira Narang
Author 2: Fan Wu
Author 3: Abisoye Ogunniyan

Keywords: General; Numerical Solution; Heat and Mass Transfer; General Purpose Graphics Processing Unit; CUDA

PDF

Paper 8: LNG Import Contract in the perspective of Associated Technical and Managerial Challenges for the Distribution Companies of Pakistan

Abstract: Energy managers and government office holders in Pakistan are nowadays pondering multiple options for the resolution of the ongoing energy crisis in the country. LNG (Liquefied Natural Gas) import has been finalized as the quickest remedy among all the available options, and the LNG import contract is on the verge of being implemented in Pakistan. However, there are several factors that need to be addressed while implementing the project. In this paper, the challenges affecting the optimized distribution of gas are identified. The sustainability of the LNG project depends upon the successful running of the project without facing any financial crisis arising from gas distribution losses. The motive of this paper is the identification of the factors that pose a risk to the sustainability and successful running of LNG import in Pakistan on a long-term basis. In addition, the technical and managerial challenges faced by the gas distribution companies in distributing LNG are identified. Moreover, recommendations are proposed for modifying the existing infrastructure and the governing policies of the gas distribution companies for the success and long-term sustainability of LNG import in Pakistan.

Author 1: Kawish Bakht
Author 2: Farhan Aslam
Author 3: Tahir Nawaz
Author 4: Bakhtawar Seerat

Keywords: LNG Import Contract; sustainable energy solution; UFG (Unaccounted for Gas) losses; Infrastructure and policy Amendments; Technical and Managerial Challenges

PDF

Paper 9: Assessment for the Model Predicting of the Cognitive and Language Ability in the Mild Dementia by the Method of Data-Mining Technique

Abstract: Assessments of cognitive and verbal functions are widely used as screening tests to detect early dementia. This study developed an early dementia prediction model for Korean elderly based on the random forest algorithm and compared its results and precision with those of a logistic regression model and a decision tree model. Subjects of the study were 418 elderly (135 males and 283 females) over the age of 60 in local communities. The outcome was defined as having dementia, and explanatory variables included digit span forward, digit span backward, confrontational naming, Rey Complex Figure Test (RCFT) copy score, RCFT immediate recall, RCFT delayed recall, RCFT recognition true positive, RCFT recognition false positive, Seoul Verbal Learning Test (SVLT) immediate recall, SVLT delayed recall, SVLT recognition true positive, SVLT recognition false positive, Korean Color Word Stroop Test (K-CWST) color reading correct, and K-CWST color reading error. The random forest algorithm was used to develop the prediction model, and the result was compared with a logistic regression model and a decision tree based on the chi-square automatic interaction detector (CHAID). As a result of the study, the tests with a high level of predictive power in the detection of early dementia were verbal memory, visuospatial memory, naming, visuospatial functions, and executive functions. In addition, the random forest model was more accurate than logistic regression and CHAID. In order to effectively detect early dementia, the development of screening test programs composed of tests with high predictive power is required.
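
The following Python sketch merely illustrates the kind of model comparison described above (random forest versus logistic regression under cross-validation), using scikit-learn on synthetic stand-in data; the dataset, feature layout and settings are placeholders, not the study's.

```python
# Illustrative sketch of a random-forest vs. logistic-regression comparison on
# synthetic stand-in data; not the study's dataset or exact protocol.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# stand-in for 14 neuropsychological test scores and a binary dementia label
X, y = make_classification(n_samples=418, n_features=14, n_informative=8,
                           weights=[0.8, 0.2], random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean 5-fold accuracy = {acc:.3f}")
```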

Author 1: Haewon Byeon
Author 2: Dongwoo Lee
Author 3: Sunghyoun Cho

Keywords: random forests; data mining; mild dementia; risk factors; neuropsychological test

PDF

Paper 10: Impact of IT Resources on IT Capabilities in Sudanese Insurance and Banking Sectors

Abstract: Previous studies that applied the Resource-Based View (RBV) to examine the impact of IT resources on IT (Information Technology) competencies often show different results. This study investigates the impact of IT resources (core communication technology, group collaboration enterprise competences, "inter-organization system usage") on IT capabilities (infrastructure, empowerment and functional capabilities), discussing some empirical limitations of testing the RBV, such as being able to disentangle the effects from a variety of sources and how IT resources complement other IT resources. Data were collected from 83 IT employees in the Sudanese banking and insurance sectors using a questionnaire. Reliability and factor analyses were conducted to ensure the goodness of the data, and regression analysis was conducted to test the relationships between variables. The findings of this study do disentangle the effects on IT capabilities from a variety of sources. Moreover, they show how IT resources complement each other in order to generate the capability outcome.

Author 1: Anwar Yahia Shams Eldin
Author 2: Abdel Hafiez Ali
Author 3: Ahmad A. Al-Tit

Keywords: IT resource; IT capabilities; RBV; process level; banking sector; insurance sector; Sudan

PDF

Paper 11: A New Artificial Neural Networks Approach for Diagnosing Diabetes Disease Type II

Abstract: Diabetes is one of the major health problems, as it causes physical disability and even death. Therefore, to better diagnose this dangerous disease, methods with a minimum error rate must be used. Different models of artificial neural networks have the capability to diagnose this disease with minimum error. Hence, in this paper we use probabilistic artificial neural networks as an approach to diagnosing type II diabetes. We use the Pima Indians Diabetes dataset, with 768 samples, in our experiments. Based on this dataset, the PNN is implemented in MATLAB. Furthermore, maximizing the accuracy of diagnosing type II diabetes in training and testing on the Pima Indians Diabetes dataset is the performance measure in this paper. Finally, we conclude that the training accuracy and testing accuracy of the proposed method are 89.56% and 81.49%, respectively.
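
A minimal NumPy sketch of a probabilistic neural network (PNN) classifier is shown below for illustration: each class density is estimated with Gaussian Parzen windows over its training samples, and a test point is assigned to the class with the largest estimated density. The smoothing parameter sigma and the toy data are assumptions, not the paper's MATLAB implementation.

```python
# Minimal probabilistic neural network (PNN) sketch; illustrative only.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    preds = []
    classes = np.unique(y_train)
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = ((Xc - x) ** 2).sum(axis=1)                      # squared distances to class-c samples
            scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean())  # Parzen density estimate for class c
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# toy usage with synthetic "patients" described by 8 features (Pima-style layout)
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 8))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
X_test = rng.normal(size=(10, 8))
print(pnn_predict(X_train, y_train, X_test, sigma=1.0))
```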

Author 1: Zahed Soltani
Author 2: Ahmad Jafarian

Keywords: diabetes type 2; probabilistic artificial neural networks; data mining; mean squares error; Naive Bayes

PDF

Paper 12: A Hybrid Algorithm Based on Firefly Algorithm and Differential Evolution for Global Optimization

Abstract: In this paper, a new and effective combination of two metaheuristic algorithms, namely the Firefly Algorithm and Differential Evolution, is proposed. This hybridization, called HFADE, consists of two phases: Differential Evolution (DE) and the Firefly Algorithm (FA). The Firefly Algorithm is a nature-inspired algorithm rooted in the light-intensity attraction behaviour of fireflies in nature. Differential Evolution is an evolutionary algorithm that uses evolutionary operators such as selection, recombination and mutation. FA and DE are together effective and powerful algorithms, but FA depends on random directions for its search, which slows the discovery of the best solution, and DE needs more iterations to find a proper solution. As a result, the proposed method has been designed to cover each algorithm's deficiencies so as to make them more suitable for optimization in real-world domains. To obtain the required results, experiments on a set of benchmark functions were performed, and the findings show that HFADE is a preferable and effective method for solving high-dimensional functions.
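
The sketch below only illustrates how a firefly attraction move and a DE/rand/1 mutation-with-crossover step can be combined in one loop (here minimizing the sphere benchmark); it is not the paper's exact HFADE phase structure, and the parameter values (beta0, gamma, alpha, F, CR) are assumed.

```python
# Minimal firefly + DE/rand/1 hybrid sketch on the sphere function; illustrative only.
import numpy as np

def sphere(x):
    return float((x ** 2).sum())

def hybrid_fa_de(dim=10, pop_size=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    beta0, gamma, alpha, F, CR = 1.0, 1.0, 0.2, 0.5, 0.9   # assumed constants
    pop = rng.uniform(-5, 5, (pop_size, dim))
    fit = np.array([sphere(p) for p in pop])
    for _ in range(iters):
        # firefly phase: move each firefly toward every brighter (better) one
        for i in range(pop_size):
            for j in range(pop_size):
                if fit[j] < fit[i]:
                    r2 = ((pop[i] - pop[j]) ** 2).sum()
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
            fit[i] = sphere(pop[i])
        # DE phase: DE/rand/1 mutation, binomial crossover, greedy selection
        for i in range(pop_size):
            a, b, c = rng.choice([k for k in range(pop_size) if k != i], 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                 # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            if sphere(trial) < fit[i]:
                pop[i], fit[i] = trial, sphere(trial)
    best = int(np.argmin(fit))
    return pop[best], fit[best]

best_x, best_f = hybrid_fa_de()
print("best objective:", best_f)
```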

Author 1: S. Sarbazfard
Author 2: A. Jafarian

Keywords: Differential Evolution; Firefly Algorithm; Global optimization; Hybrid algorithm

PDF

Paper 13: The Development of the Routing Pattern of the Backbone Data Transmission Network for the Automation of the Krasnoyarsk Railway

Abstract: The paper deals with the data transmission network of the Krasnoyarsk Railway: its structure, the topology of data transmission and the routing protocol that supports its operation, as well as the specifics of data transmission networking. The combination of the railway automation applications and the data transmission network makes up the automation systems, making it possible to improve performance, increase the freight traffic volume, and improve the quality of passenger service. The objective of this paper is to study the existing data transmission network of the Krasnoyarsk Railway and to develop ways of modernizing it in order to improve the reliability of the network and of the automated systems that use it. It was found that the IS-IS and OSPF routing protocols have many differences, primarily due to the fact that the IS-IS protocol does not use IP addresses as paths. On the one hand, this makes it possible to use the IS-IS protocol in IPv6 networks, whereas OSPF version 2 does not provide this opportunity; therefore, OSPF version 3 was developed to solve this problem. On the other hand, when using IPv4, routing configuration by means of the IS-IS protocol implies the need to study a significant volume of information and use unfamiliar methods of routing configuration.

Author 1: Sergey Victorovich Makarov
Author 2: Faridun Abdulnazarov
Author 3: Omurbek Anarbekov

Keywords: Router; Topology; IS-IS protocol; OSPF protocol

PDF

Paper 14: An Automatic Evaluation for Online Machine Translation: Holy Quran Case Study

Abstract: The number of Free Online Machine Translation (FOMT) users has witnessed spectacular growth since 1994. FOMT systems have changed the landscape of machine translation (MT) and of mass-translated materials across a wide range of natural languages and machine translation systems. Hundreds of millions of people use these FOMT systems to translate verses of the holy Quran (Al-Qur'an) from Arabic into other natural languages, and vice versa. In this study, an automatic evaluation of the use of FOMT systems to translate Arabic Quranic text into English is conducted. Two well-known FOMT systems (Google and Bing Translators) are chosen for evaluation in this study using a metric called Assessment of Text Essential Characteristics (ATEC). The ATEC metric is one of the automatic evaluation metrics for machine translation systems: it scores the correlation between the output of a machine translation system and a professional human reference translation based on word choice, word order and the similarity between the MT output and the human reference translation. Extensive evaluation has been conducted on the two FOMT systems for translating Arabic Quranic text into English. This evaluation shows that Google Translator performs better than Bing Translator in translating Quranic text. It is noticed that the average ATEC score does not exceed 41%, which indicates that FOMT systems are ineffective in translating Quranic texts accurately.

Author 1: Emad AlSukhni
Author 2: Mohammed N. Al-Kabi
Author 3: Izzat M. Alsmadi

Keywords: machine translation; language automatic evaluation; Statistical Machine Translation; Quran machine translation; Arabic MT

PDF

Paper 15: Efficiency in Motion: The New Era of E-Tickets

Abstract: The development of mobile applications has played an important role in technology. Due to recent advances in technology, mobile applications are attracting more attention across the world. Mobile applications are a very interesting field of research, with considerable ongoing research and development in both industry and academia. In this paper, we present the design and implementation of a mobile application that creates an electronic ticket, or e-ticket, for campus police. The goal of this mobile application is to make the ticketing process much faster and easier for campus police by using mobile devices. Furthermore, the results indicate an increase in performance and productivity for campus police through sending and retrieving data and printing tickets and permits for students, limiting the need for paper-based ticketing.

Author 1: Fan Wu
Author 2: Dwayne Clarke
Author 3: Jian Jiang
Author 4: Adontavius Turner

Keywords: General; Mobile Application; Mobile Device; E-Ticket

PDF

Paper 16: Detection of SQL Injection Using a Genetic Fuzzy Classifier System

Abstract: SQL Injection (SQLI) is one of the most popular vulnerabilities of web applications. The consequences of SQL injection attack include the possibility of stealing sensitive information or bypassing authentication procedures. SQL injection attacks have different forms and variations. One difficulty in detecting malicious attacks is that such attacks do not have a specific pattern. A new fuzzy rule-based classification system (FBRCS) can tackle the requirements of the current stage of security measures. This paper proposes a genetic fuzzy system for detection of SQLI where not only the accuracy is a priority, but also the learning and the flexibility of the obtained rules. To create the rules having high generalization capabilities, our algorithm builds on initial rules, data-dependent parameters, and an enhancing function that modifies the rule evaluation measures. The enhancing function helps to assess the candidate rules more effectively based on decision subspace. The proposed system has been evaluated using a number of well-known data sets. Results show a significant enhancement in the detection procedure

Author 1: Christine Basta
Author 2: Ahmed elfatatry
Author 3: Saad Darwish

Keywords: SQL injection; web security; genetic fuzzy system; fuzzy rule learning

PDF

Paper 17: Compressed Sensing of Multi-Channel EEG Signals: Quantitative and Qualitative Evaluation with Speller Paradigm

Abstract: In this paper the possibility of the electroencephalogram (EEG) compressed sensing based on specific dictionaries is presented. Several types of projection matrices (matrices with random i.i.d. elements sampled from the Gaussian or Bernoulli distributions, and matrices optimized for the particular dictionary used in reconstruction by means of appropriate algorithms) have been compared. The results are discussed from the reconstruction error point of view and from the classification rates of the spelling paradigm

Author 1: Monica Fira

Keywords: Compressed sensing; EEG; Brain computer interface; P300; Speller Paradigm

PDF

Paper 18: Multicast Routing Problem Using Tree-Based Cuckoo Optimization Algorithm

Abstract: The problem of QoS multicast routing is to find a multicast tree with the least cost that meets limitations such as bandwidth, delay and loss rate. This is an NP-complete problem. To solve the multicast routing problem, the complete routes from the source node to every destination node are often determined first, and the routes are then integrated and transformed into a single multicast tree. However, such methods are slow and complicated. The present paper introduces a new tree-based optimization method to overcome these weaknesses. The recommended method directly optimizes the multicast tree: a tree-based topology including several spanning trees is created, and the trees are combined two by two. For this purpose, the Cuckoo Algorithm is used, which is shown to converge well and to make quick calculations. Simulations conducted on different types of network topologies show that it is a practical and effective algorithm.

Author 1: Mahmood Sardarpour
Author 2: Hasan Hosseinzadeh
Author 3: Mehdi Effatparvar

Keywords: Multicast tree; routing; Cuckoo Algorithm; optimal function

PDF

Paper 19: Cost-effective and Green Manufacturing Substrate Integrated Waveguide (SIW) BPF for Wireless Sensor Network Applications

Abstract: This paper presents a comparison between an innovative technique for implementing a substrate integrated waveguide band-pass filter centered at 4 GHz and conventional PCB results. A two-pole filter is designed, simulated and fabricated. The novel fabrication process is based on green manufacturing, where a physically etched aluminum metal layer is glued onto a paper substrate. The band-pass filter is composed of two resonant, adjacent and symmetric cavities separated by a coupling iris. The same topology is adopted with a conventional substrate and PCB technology to validate the results of the new technique. The bandwidth achieved is almost four times wider than that obtained with PCB technology developed for UWB applications. The proposed technique keeps the advantages of conventional SIW technology, including low profile, compact size, complete shielding, easy fabrication, low cost for mass production, and convenient integration with planar waveguide components including transitions. Moreover, the flexibility of paper offers the possibility of fabricating conformal shapes of SIW components, which is not possible with conventional rigid substrates made with PCB technology, in addition to the advantages of eco-friendly, renewable, lightweight, and ultra-low-cost materials.

Author 1: Hiba Abdel Ali
Author 2: Rachida Bedira
Author 3: Hichem Trabelsi
Author 4: Ali Gharsallah

Keywords: Substrate Integrated Waveguide; Band pass filter; Wireless sensor network; green material technology; paper substrate

PDF

Paper 20: Evaluation and Comparison of Binary Trie base IP Lookup Algorithms with Real Edge Router IP Prefix Dataset

Abstract: The Internet is comprised of routers that forward packets towards their destinations. IP routing lookup requires computing the best-matching prefix, and the main function of a router is finding the appropriate path for each packet. There are many algorithms for IP lookup with different speed, complexity and memory usage. In this paper, three binary trie algorithms are considered for performance analysis: the priority trie, the disjoint binary trie and the binary trie. We consider three parameters for comparison: time, memory and algorithmic complexity. For the performance analysis, we implement and run the algorithms with real lookup tables that were used in an edge router.
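
For illustration, a minimal Python sketch of the plain binary trie with best-matching-prefix lookup follows (the priority and disjoint variants are not shown); the prefixes and next hops are made-up examples, not entries from the real edge-router tables.

```python
# Minimal binary-trie longest-prefix-match sketch; illustrative only.
class BinaryTrieNode:
    __slots__ = ("children", "next_hop")
    def __init__(self):
        self.children = [None, None]   # child for bit 0 / bit 1
        self.next_hop = None           # set only on nodes that end a stored prefix

def ip_to_bits(ip):
    return "".join(f"{int(octet):08b}" for octet in ip.split("."))

def insert(root, prefix, next_hop):
    ip, length = prefix.split("/")
    node = root
    for bit in ip_to_bits(ip)[: int(length)]:
        b = int(bit)
        if node.children[b] is None:
            node.children[b] = BinaryTrieNode()
        node = node.children[b]
    node.next_hop = next_hop

def lookup(root, ip):
    """Walk the trie, remembering the last (longest) matching prefix seen."""
    node, best = root, None
    for bit in ip_to_bits(ip):
        node = node.children[int(bit)]
        if node is None:
            break
        if node.next_hop is not None:
            best = node.next_hop
    return best

root = BinaryTrieNode()
insert(root, "10.0.0.0/8", "if1")
insert(root, "10.1.0.0/16", "if2")
print(lookup(root, "10.1.2.3"))   # if2 (the longer, best-matching prefix wins)
print(lookup(root, "10.9.9.9"))   # if1
```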

Author 1: Alireza Shirmarz
Author 2: Masoud Sabaei
Author 3: Mojtaba hosseini

Keywords: Binary Trie; IP-Lookup; Running Time; Memory Usage; Complexity

PDF

Paper 21: Flying Ad-Hoc Networks: Routing Protocols, Mobility Models, Issues

Abstract: Flying Ad-Hoc Networks (FANETs) are groups of Unmanned Air Vehicles (UAVs) that complete their work without human intervention. There are two main problems in this kind of network. The first is communication between UAVs; various routing protocols, classified into three categories (static, proactive and reactive), have been introduced to solve this problem. The second is the network design, which depends on network mobility, that is, the process of cooperation and collaboration between the UAVs; FANET mobility models are introduced to solve this problem. A mobility model defines the path and speed variations of the UAVs and represents their positions. Today, the Random Waypoint model is used as an artificial mobility model in the majority of simulation scenarios. However, the Random Waypoint model is not suitable for UAVs, because UAVs do not change their direction and speed abruptly; for this reason, we consider more realistic models, namely the Semi-Random Circular Movement (SRCM) mobility model. We also consider other mobility models: the Mission Plan-Based (MPB) mobility model, the Pheromone-Based model and the Paparazzi Mobility Model (PPRZM). This paper presents and discusses the main routing protocols and the main mobility models used to address communication, cooperation and collaboration in FANET networks.

Author 1: Muneer Bani Yassein
Author 2: “Nour Alhuda” Damer

Keywords: FANET; Ad-hoc Network; UAVs; MANET; Mobility Model; Networking Model

PDF

Paper 22: A Robust Audio Watermarking Technique Operates in MDCT Domain based on Perceptual Measures

Abstract: This paper presents a digital audio watermarking technique operating in the frequency domain, with two variants. The technique uses the Modified Discrete Cosine Transform (MDCT) to move to the frequency domain. To ensure greater inaudibility, we exploit the properties of psychoacoustic model 1 (PMH1) of the MPEG-1 Layer I encoder in the first variant and those of psychoacoustic model 2 (PMH2) of the MPEG-1 Layer III encoder in the second to search for the places to insert the watermark. In both variants, the bits of the mark are duplicated to increase the insertion capacity and then inserted into the least significant bit (LSB). For more reliable detection, we use an error-correcting code (Hamming) on the mark. Next, to analyze the performance of the proposed technique, we perform two comparative studies. In the first, we compare the proposed digital audio watermarking technique, with its two variants, against those of Luigi Rosa and Rolf Brigola, whose M-files we downloaded. The technique developed by Luigi Rosa operates in the frequency domain but uses the Discrete Cosine Transform (DCT), while that proposed by Rolf Brigola uses the Fast Fourier Transform (FFT). We study the robustness of each technique against different types of attacks, such as MP3 compression/decompression and the StirMark audio attack, and we evaluate inaudibility using an objective approach by calculating the SNR and the ODG scores given by PEAQ. The robustness of this technique against different types of attacks is shown. In the second study, we demonstrate the contribution of the proposed technique by comparing its data payload, imperceptibility and robustness against the MP3 attack with other existing techniques in the literature.

Author 1: Maha Bellaaj
Author 2: Kais Ouni

Keywords: digital audio watermarking; Hamming; LSB; psychoacoustic model1, 2; MDCT; DCT; FFT; SNR; ODG

PDF

Paper 23: Multi-Robot Path-Planning Problem for a Heavy Traffic Control Application: A Survey

Abstract: This survey looks at the methods used to solve multi-autonomous-vehicle path-planning for a heavy traffic control application in cities. Formally, the problem consists of a graph and a set of robots. Each robot has to reach its destination in the minimum time and number of movements, considering the obstacles and the other robots' paths; hence, the problem is NP-hard. The study found that decoupled centralised approaches are the most relevant for the autonomous vehicle path-planning problem, for three reasons: (1) a city is a large environment and coupled centralised approaches scale weakly; (2) the overhead of a coupled decentralised approach to achieve the global optimum affects the time and memory of the other robots, which is not important in a city configuration; and (3) coupled approaches suppose that the number of robots is defined before they start to find the paths and resolve collisions, while in a city any car can start at any time, so each car should work individually and resolve collisions as they arise. In addition, the study reviewed four decoupled centralised techniques for the problem: the multi-robot path-planning rapidly exploring random tree (MRRRT), push and swap (PAS), push and rotate (PAR) and the Bibox algorithm. The experiments showed that MRRRT is the best at exploring any search space and optimizing the solution. On the other hand, PAS, PAR and Bibox are better at providing a complete solution to the problem and resolving collisions in significantly less time; the analysis, however, shows that a wider class of solvable instances is excluded from the PAS and PAR domain. In addition, Bibox solves a smaller class than that solved by PAS and PAR, in less time in the worst case and with a shorter path than PAS and PAR.

Author 1: Ebtehal Turki Saho Alotaibi
Author 2: Hisham Al-Rawi

Keywords: component; Heavy traffic control; Multi robots; Coupled Path Planning; Decoupled Path Planning; Collision Avoidance; Heuristics; RRT; Push and Swap; Push and Rotate; Bibox

PDF

Paper 24: Improving Service-Oriented Architecture Processes in Process of Automatic Services Composition Using Memory and QF, QWV Factor

Abstract: The application of service-oriented architecture in organizations for implementing complicated workflows electronically, using composite web services, has become widespread, and challenging research issues have been raised in this regard. One of these issues is constructing composite web services from workflows composed of existing web services. Selecting a web service for each workflow activity while fulfilling users' conditions is still regarded as a major challenge. In fact, selecting one web service out of many with identical function is a critical task which generally depends on a composite evaluation of QoS. Previously proposed approaches do not consider exchange restrictions on the composition process, the internal processes of the architecture or previous experience, and they ignore the fact that the values of many QoS attributes depend on the time of execution. Selecting web services only on the basis of QoS does not produce an optimal composite web service; thus, until now, no solution has been proposed that performs the composition process automatically or semi-automatically in an optimal manner. Objective: identify the existing concerns in service composition, design a framework that provides a solution considering all of these concerns, and perform tests in order to examine and evaluate the proposed framework. Method: in the proposed framework, the elements affecting the management of service-oriented architecture processes are organized according to a logical procedure. The framework identifies the processes of this architectural style based on the requirements of service-oriented architecture process management and on the qualitative features in this area. In addition to using existing data in the problem area, the existing structures and patterns in software architecture are also utilized, and management processes in service-oriented architecture are improved based on the available requirements. QWV are weighted dynamic quality features which indicate the priorities of users, and QF is the quality factor of a service at execution time, which is weighted within the framework; these factors are used for constructing the composite web service. Multi-agent computing is known as a natural computational approach for automating the interaction between services; the agents in multi-agent systems can be used as the main reliable control mechanism and usually use data exchange to accelerate their evaluations. To identify all concerns in the solution space, many aspects should be examined; to this end, classes of agents are defined which investigate these aspects in the form of four components using repository data. Results: the proposed framework was simulated with the Arena software, and the results showed that this framework can be useful in the automatic generation of needed services while meeting all concerns at the same time. The results support that using agents in the model increased responsiveness, user satisfaction and system efficiency.

Author 1: Behnaz Nahvi
Author 2: Jafar Habibi

Keywords: Service-oriented architecture; process management; multi-agent systems

PDF

Paper 25: Modeling Access Control Policy of a Social Network

Abstract: Social networks bring users together on a virtual platform and offer them the ability to share, within the community, personal and professional information, photos, etc., which are sometimes sensitive. Although the majority of these networks provide access control mechanisms to their users (to manage who accesses which information), privacy settings are limited and do not respond to all users' needs. Hence, the published information remains vulnerable to illegal access. In this paper, the access control policy of the social network Facebook is analyzed in depth, starting with its modeling with the Organization Role Based Access Control (OrBAC) model, moving to the simulation of the policy with an appropriate simulator to test its coherence, and ending with a discussion of the analysis results, which shows the gap between the access control management options offered by Facebook and the real requirements of users in the same context. The extracted conclusions prove the need to develop a new access control model that meets most of these requirements, which will be the subject of forthcoming work.

Author 1: Chaimaa Belbergui
Author 2: Najib Elkamoun
Author 3: Rachid Hilal

Keywords: social network; Facebook; access control; OrBAC; study of coherence

PDF

Paper 26: BF-PSO-TS: Hybrid Heuristic Algorithms for Optimizing Task Scheduling on Cloud Computing Environment

Abstract: Task scheduling is a major problem in Cloud computing because the cloud provider has to serve many users, and a good scheduling algorithm helps in the proper and efficient utilization of the resources. Therefore, task scheduling is considered one of the major issues in Cloud computing systems. The objective of this paper is to assign tasks to multiple computing resources such that the total cost of execution is minimized and the load is shared between these computing resources. To this end, two hybrid algorithms based on Particle Swarm Optimization (PSO) have been introduced to schedule the tasks: Best-Fit-PSO (BFPSO) and PSO-Tabu Search (PSOTS). In the BFPSO algorithm, the Best-Fit (BF) algorithm has been merged into the PSO algorithm to improve performance; the main principle is that the BF algorithm is used to generate the initial population of the standard PSO algorithm instead of initializing it randomly. In the proposed PSOTS algorithm, Tabu Search (TS) is used to improve the local search by avoiding the trap of local optimality that can occur with the standard PSO algorithm. The two proposed algorithms (i.e., BFPSO and PSOTS) have been implemented using CloudSim and evaluated against the standard PSO algorithm using five problems with different numbers of independent tasks and resources. The performance parameters considered are the execution time (makespan), cost, and resource utilization. The implementation results prove that the proposed hybrid algorithms (i.e., BFPSO and PSOTS) outperform the standard PSO algorithm.
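
The sketch below shows only a plain PSO baseline applied to task-to-resource mapping with makespan as the fitness, with continuous particle positions rounded to resource indices; it is not the BFPSO/PSOTS hybrids or a CloudSim experiment, and the task lengths, resource speeds and PSO constants are assumed values.

```python
# Bare-bones PSO for mapping independent tasks to resources to minimize makespan.
import numpy as np

rng = np.random.default_rng(1)
task_len = rng.uniform(100, 1000, size=30)     # instructions per task (assumed)
speed = np.array([5.0, 10.0, 20.0, 40.0])      # instructions/sec per resource (assumed)
n_tasks, n_res = len(task_len), len(speed)

def makespan(position):
    assign = np.clip(np.rint(position), 0, n_res - 1).astype(int)  # round to resource index
    load = np.zeros(n_res)
    for t, r in enumerate(assign):
        load[r] += task_len[t] / speed[r]
    return load.max()

# standard PSO loop
n_particles, iters, w, c1, c2 = 25, 200, 0.7, 1.5, 1.5
pos = rng.uniform(0, n_res - 1, (n_particles, n_tasks))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([makespan(p) for p in pos])
g = int(np.argmin(pbest_val)); gbest, gbest_val = pbest[g].copy(), pbest_val[g]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_res - 1)
    vals = np.array([makespan(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    g = int(np.argmin(pbest_val))
    if pbest_val[g] < gbest_val:
        gbest, gbest_val = pbest[g].copy(), pbest_val[g]

print("best makespan found:", round(gbest_val, 2))
```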

Author 1: Hussin M. Alkhashai
Author 2: Fatma A. Omara

Keywords: Cloud computing; task scheduling; CloudSim; Particle Swarm Optimization; Tabu search; Best-Fit

PDF

Paper 27: An Approach for Integrating Data Mining with Saudi Universities Database Systems: Case Study

Abstract: This paper presents an approach for integrating data mining algorithms within a Saudi university's database system, viz., Prince Sattam Bin Abdulaziz University (PSAU), as a case study. The approach is based on a bottom-up methodology: it starts by providing a data mining application that represents a solution to one of the problems facing Saudi universities' systems, and then integrates and implements the solution inside the university's database system. This process is repeated to enhance the university system by providing data mining tools that help different parties, especially decision makers, to reach certain decisions. The paper presents a case study that includes analyzing and predicting student withdrawal from courses at PSAU using association rule mining, neural networks and decision trees, and then provides a conceptual and practical approach for integrating the resulting application within the university's database system. The experiment shows that this approach can be used as a framework for integrating data mining techniques within Saudi universities' database systems. The paper concludes that mining universities' data can be deployed as a computer system (an intelligent university system), that data mining algorithms can be adapted to any database system regardless of whether the system is new, existing or legacy, and that data mining algorithms can be a solution for some educational problems, in addition to providing information for decision makers and users.

Author 1: Mohamed Osman Hegazi
Author 2: Mohammad Alhawarat
Author 3: Anwer Hilal

Keywords: Data Mining; Database; Predict; Integration; Association rule mining; Neural networks; Decision tree; Educational Data Mining (EDM); University system

PDF

Paper 28: A Memetic Algorithm for the Capacitated Location-Routing Problem

Abstract: In this paper, a hybrid genetic algorithm is proposed to solve a Capacitated Location-Routing Problem. The objective is to minimize the total cost of distribution in a network composed of depots and customers; both depots and vehicles have limited capacities, each depot has a homogeneous vehicle fleet, and customers' demands are known and must be satisfied. Solving this problem involves making strategic decisions, such as the location of depots, as well as tactical and operational decisions, which include assigning customers to the opened depots and organizing the vehicle routing. To evaluate the performance of the proposed algorithm, its results are compared to those obtained by a greedy randomized adaptive search procedure; computational results show that the algorithm gives good-quality solutions.

Author 1: Laila KECHMANE
Author 2: Benayad NSIRI
Author 3: Azeddine BAALAL

Keywords: hybrid genetic algorithm; capacitated location-routing problem; location; assigning; vehicle routing

PDF

Paper 29: Design and Modeling of RF Power Amplifiers with Radial Basis Function Artificial Neural Networks

Abstract: A radial basis function (RBF) artificial neural network model for a designed high-efficiency radio frequency class-F power amplifier (PA) is presented in this paper. The presented amplifier is designed at a 1.8 GHz operating frequency with 12 dB of gain and a 36 dBm 1-dB output compression point. The power added efficiency (PAE) obtained for the presented PA is 76% at 26 dBm input power. The proposed RBF model uses the input power and DC power of the PA as input variables and considers the output power as the output variable. The presented RBF network models the designed class-F PA as a block, which could be applied in circuit design, and the model could be used to model any RF power amplifier. The obtained results show good agreement between the real data and the values predicted by the RBF model, and they clearly show that the presented RBF network is more precise than a multilayer perceptron (MLP) model. According to the results, improvements of more than 84% and 92% are achieved in MAE and RMSE, respectively.
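
As an illustration of the general RBF-network idea (not the paper's model of the class-F PA), the following sketch fits Gaussian hidden units centred by k-means and output weights by least squares on stand-in data; the centre count, kernel width and toy measurements are assumptions.

```python
# Minimal RBF-network regression sketch: Gaussian hidden units centred by k-means,
# linear output weights fitted by least squares. Illustrative only.
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 2))                                          # stand-in for normalized (Pin, Pdc)
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)    # stand-in for Pout

centers = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
width = 0.2
Phi = np.column_stack([rbf_design(X, centers, width), np.ones(len(X))])  # hidden outputs + bias
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = Phi @ w
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```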

Author 1: Ali Reza Zirak
Author 2: Sobhan Roshani

Keywords: Amplifier model; artificial neural network (ANN); class-F amplifier; radial basis function; RF amplifier

PDF

Paper 30: Denial of Service Attack in IPv6 Duplicate Address Detection Process

Abstract: IPv6 was designed to replace the existing Internet Protocol, IPv4. The main advantage of IPv6 over IPv4 is the vastness of its address space, and various improvements were brought to IPv6 to address the drawbacks of IPv4. Nevertheless, as with any new technology, IPv6 suffers from various security vulnerabilities. One of the vulnerabilities discovered allows a Denial of Service (DoS) attack using the Duplicate Address Detection mechanism. In order to study and analyse this attack, an IPv6 security testbed was designed and implemented. This paper presents our experience with the deployment and operation of the testbed, and a discussion of the outcome and data gathered from carrying out a DoS attack in this testbed.

Author 1: Shafiq Ul Rehman
Author 2: Selvakumar Manickam

Keywords: Duplicate Address Detection; Denial of Service; IPv6; Address autoconfiguration; Security; Internet Protocol

PDF

Paper 31: A Self-organizing Location and Mobility-Aware Route Optimization Protocol for Bluetooth Wireless

Abstract: Bluetooth allows multi-hop ad-hoc networks, in which multiple interconnected piconets in a common area form a scatternet. Routing is one of the technical issues in a scatternet because nodes can arrive and leave at arbitrary times; hence, node mobility has a serious impact on network performance. A Bluetooth network is built in an ad-hoc fashion; therefore, a fully connected network is not guaranteed. Moreover, a partially connected network may not find the shortest route between source and destination. In this paper, a new Self-organizing Location and Mobility-aware Route Optimization (LMRO) protocol is proposed for Bluetooth scatternets, based on node mobility and location. The proposed protocol finds the shortest route between the source and destination nodes using node location information. In addition, the proposed protocol guarantees network connectivity by executing a self-organizing procedure for a damaged route based on signal strength. The proposed LMRO protocol predicts node mobility through signal strength and activates an alternate link before the main link breaks. Simulation results show that the LMRO protocol reduces the average hop count by 20%-50% and increases network throughput by 30%-40% compared to existing protocols.

Author 1: Sheikh Tahir Bakhsh

Keywords: Bluetooth; Hop count; Mobility; Routing; Resource optimization; Scatternet; Self-healings

PDF

Paper 32: Face Retrieval Based On Local Binary Pattern and Its Variants: A Comprehensive Study

Abstract: Face retrieval (FR) is one of the specific fields in content-based image retrieval (CBIR). Its aim is to search for relevant faces in a large database based on the contents of the images rather than the metadata, and it has many applications in important areas such as face searching, forensics, and identification. In this paper, we experimentally evaluate face retrieval based on the Local Binary Pattern (LBP) and its variants: the Rotation Invariant Local Binary Pattern (RILBP) and the Pyramid of Local Binary Pattern (PLBP). We also use a grid-LBP-based operator, which divides an image into 6×7 sub-regions and then concatenates the LBP feature vectors from each of them into a spatially enhanced feature histogram. These features were first tested on three frontal face datasets: The Database of Faces (TDF), Caltech Faces 1999 (CF1999) and the combination of the two (CF). Good results on these datasets encouraged us to conduct tests on Labeled Faces in the Wild (LFW), where the images were taken under real-world conditions. Mean average precision (MAP) was used to measure the performance of the system. We carry out the experiments in two main stages, indexing and searching, with the use of k-fold cross-validation. We further boost the system by using Locality Sensitive Hashing (LSH) and also evaluate the impact of LSH on the searching stage. The experimental results show that LSH is effective for face searching and that LBP is a robust feature for frontal face retrieval.
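
A minimal NumPy sketch of the basic 8-neighbour LBP descriptor follows for illustration; the grid, rotation-invariant and pyramid variants evaluated in the paper are not shown, and the random image is only a placeholder for a face crop.

```python
# Minimal 8-neighbour LBP sketch: each pixel is compared with its eight neighbours
# to form an 8-bit code, and the image is described by the 256-bin code histogram.
import numpy as np

def lbp_histogram(img):
    img = img.astype(np.int32)
    center = img[1:-1, 1:-1]
    # neighbour offsets in clockwise order, each contributing one bit of the code
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((neighbour >= center).astype(np.int32) << bit)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()                      # normalized 256-bin descriptor

face = np.random.randint(0, 256, (64, 64))        # placeholder for a grayscale face crop
descriptor = lbp_histogram(face)
print(descriptor.shape, descriptor.sum())         # (256,) 1.0
```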

Author 1: Phan Khoi
Author 2: Lam Huu Thien
Author 3: Vo Hoai Viet

Keywords: Face Retrieval; LBP; PLBP; Grid LBP; LSH

PDF

Paper 33: Classifying Arabic Text Using KNN Classifier

Abstract: With the tremendous number of electronic documents available, there is a great need to classify documents automatically. Classification is the task of assigning objects (images, text documents, etc.) to one of several predefined categories. The selection of important terms is vital to classifier performance; feature-set reduction techniques such as stop-word removal, stemming and term thresholding are used in this paper. Three term-selection techniques are applied to a corpus of 1000 documents that fall into five categories. A comparison study is performed to find the effect of using full-word, stem and root term indexing methods. A k-nearest-neighbors classifier is used in this study. The averages over all folds of recall, precision, fallout and error rate were calculated. The results of the experiments carried out on the dataset show the importance of using k-fold testing, since it presents the variations of the averages of recall, precision, fallout and error rate for each category over the 10 folds.
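
The following scikit-learn sketch illustrates the general shape of such a pipeline (TF-IDF term weighting followed by a k-nearest-neighbours classifier); the tiny English documents and labels are placeholders, not the 1000-document Arabic corpus, and real use would add Arabic stop-word removal and stemming/rooting.

```python
# Illustrative TF-IDF + KNN text-classification pipeline; placeholder data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

train_docs = ["stock market prices rise", "team wins football match", "market shares fall"]
train_labels = ["economy", "sport", "economy"]

# k=1 because the toy corpus is tiny; a larger k would be used on a real corpus
model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=1))
model.fit(train_docs, train_labels)
print(model.predict(["football prices"]))   # the nearest training document decides the class
```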

Author 1: Amer Al-Badarenah
Author 2: Emad Al-Shawakfa
Author 3: Khaleel Al-Rababah
Author 4: Safwan Shatnawi
Author 5: Basel Bani-Ismail

Keywords: categorization; Arabic; KNN; stemming; cross validation

PDF

Paper 34: Knowledge Extraction from Metacognitive Reading Strategies Data Using Induction Trees

Abstract: The assessment of students’ metacognitive knowledge and skills about reading is critical in determining their ability to read academic texts and do so with comprehension. In this paper, we used induction trees to extract metacognitive knowledge about reading from a reading strategies dataset obtained from a group of 1636 undergraduate college students. Using a C4.5 algorithm, we constructed decision trees, which helped us classify participants into three groups based on their metacognitive strategy awareness levels consisting of global, problem-solving and support reading strategies. We extracted rules from these decision trees, and in order to evaluate accuracy of the extracted rules, we built a fuzzy inference system (FIS) with the extracted rules as a rule base and classified the test dataset with the FIS. The extracted rules are evaluated using measures such as the overall efficiency and Kappa coefficient

Author 1: Christopher Taylor
Author 2: Arun Kulkarni
Author 3: Kouider Mokhtar

Keywords: Metacognitive Reading Strategies; Classification; Induction Tree; Rule Extraction; Fuzzy Inference System

PDF

Paper 35: Development of a Fingerprint Gender Classification Algorithm Using Fingerprint Global Features

Abstract: In the forensic world, the process of identifying and computing fingerprint features is complex and time-consuming when done manually using a fingerprint laboratory magnifying glass. This study is meant to enhance the manual forensic method by proposing a new algorithm for fingerprint global feature extraction for gender classification. The results show that the new algorithm gives acceptably high readings, with a classification rate above 70%, when compared to the manual method. This algorithm is highly recommended for extracting fingerprint global features for the gender classification process.

Author 1: S. F. Abdullah
Author 2: A.F.N.A. Rahman
Author 3: Z.A.Abas
Author 4: W.H.M Saad

Keywords: fingerprint; gender classification; global features; algorithm

PDF

Paper 36: Towards Improving the Quality of Present MAC Protocols for LECIM Systems

Abstract: Wireless networking systems are growing quickly in the field of communication technology due to their usefulness and huge range of applications. To make such a system more effective for users, its energy consumption, security, reliability and cost must be considered under any circumstances. Low-energy wireless is especially required because the sensors are frequently located where mains power and network infrastructure are not reliably available. The recent development of Low Energy Critical Infrastructure Monitoring (LECIM) has vast applications, including water leak detection, bridge/structural integrity monitoring, oil and gas pipeline monitoring, electric plant monitoring, public transport tracking, cargo container monitoring, railroad condition monitoring, traffic congestion monitoring, border surveillance, medical alerts for at-risk populations and many more. LECIM is proposed by Task Group 4k under IEEE P802.15 WPAN. Many issues related to its quality are involved, and several Media Access Control (MAC) protocols with different objectives have been proposed for LECIM. In this research paper, issues related to energy consumption and wastage in LECIM systems, energy-saving mechanisms and relevant energy-conscious MAC protocols are briefly studied and analyzed. The Science Direct, Elsevier, Springer, IEEE Xplore, Google Scholar and Wiley digital library databases were used to search for articles related to existing MAC protocols well suited to LECIM systems. Finally, some ideas are proposed for developing an energy-efficient MAC protocol for LECIM applications in order to satisfy the major quality requirements of LECIM.

Author 1: Mohammad Arif Siddiqui
Author 2: Shah Murtaza Rashid Al Masud
Author 3: Mohammed Basit Kamal

Keywords: wireless networking; LECIM; IEEE P802.15; WPAN; MAC

PDF

Paper 37: Improvement of Adaptive Smart Concentric Circular Antenna Array Based Hybrid PSOGSA Optimizer

Abstract: Unlike recent research that used a Concentric Circular Antenna Array (CCAA) with one beam-former for each single main beam, this research presents a technique to adapt a smart CCAA using only a single beam-former for multiple main beams based on hybrid PSOGSA. Hybrid PSOGSA is a combination of Particle Swarm Optimization and the Gravitational Search Algorithm, which is applied to the feeding of the smart CCAA to enhance its performance. Phase excitation of an array with a large number of elements is suggested for different scenarios based on hybrid PSOGSA and other algorithms such as PSO and GSA in a High Altitude Platform (HAP) application. Simulation results show that hybrid PSOGSA achieves better performance than the other optimizers for excitation of the smart CCAA in all scenarios, for different parameters such as the normalized array factor, fitness values, convergence rate and directivity.

Author 1: Ahmed Magdy
Author 2: Osama M. EL-Ghandour
Author 3: Hesham F. A. Hamed

Keywords: Smart antenna; CCAA; PSOGSA; HAP; and Beam-forming

PDF
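
The following minimal Python sketch illustrates the kind of hybrid PSOGSA update the abstract refers to, in which each agent's velocity combines a GSA-style gravitational acceleration term with PSO's attraction to the global best. It is not the authors' implementation: the quadratic fitness function, the parameter values and the phase-excitation bounds are placeholder assumptions.

import numpy as np

def fitness(x):
    return np.sum(x ** 2)          # placeholder objective, not the CCAA array-factor cost

def psogsa(dim=8, agents=20, iters=100, w=0.6, c1=0.5, c2=1.5, g0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-np.pi, np.pi, (agents, dim))   # e.g. phase excitations in radians
    v = np.zeros_like(x)
    gbest, gbest_f = None, np.inf
    for t in range(iters):
        f = np.apply_along_axis(fitness, 1, x)
        if f.min() < gbest_f:
            gbest_f, gbest = f.min(), x[f.argmin()].copy()
        # GSA part: masses from normalized fitness, gravitational constant decaying over time
        m = (f.max() - f) / (f.max() - f.min() + 1e-12)
        m = m / (m.sum() + 1e-12)
        g = g0 * np.exp(-20 * t / iters)
        acc = np.zeros_like(x)
        for i in range(agents):
            for j in range(agents):
                if i == j:
                    continue
                r = np.linalg.norm(x[j] - x[i]) + 1e-12
                acc[i] += rng.random() * g * m[j] * (x[j] - x[i]) / r
        # PSO part: inertia + gravitational acceleration + attraction to the global best
        v = w * v + c1 * rng.random((agents, 1)) * acc + c2 * rng.random((agents, 1)) * (gbest - x)
        x = x + v
    return gbest, gbest_f

best, best_f = psogsa()
print(best_f)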

Paper 38: Optimized Image Scaling Using DWT and Different Interpolation Techniques

Abstract: The Discrete Wavelet Transform (DWT) has received considerable attention in recent years. The wavelet transform has an advantage over the Discrete Fourier Transform and the Discrete Cosine Transform because it captures both the frequency and the spatial information of a signal. In this paper, the DWT is used for image scaling. To achieve a higher-quality image, the DWT is applied to the grayscale image as a downscaling technique. The original image is then recovered using the IDWT, employing different interpolation techniques for upscaling: Nearest Neighbor, Bilinear and Bicubic. The Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) are calculated to quantify the effectiveness of the interpolated image. Results show that the reconstructed image is better when a combination of DWT and Bicubic interpolation is used. An illustrative sketch follows this entry.

Author 1: Wardah Aslam
Author 2: Khurram Khurshid
Author 3: Asfaq Ahmed Khan

Keywords: Bilinear Interpolation; Bicubic Interpolation; Discrete Wavelet Transform; Image Scaling

PDF
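
The sketch below illustrates the general pipeline described in the abstract under simplifying assumptions: a one-level 2-D DWT provides the downscaled image (its approximation band), and bicubic interpolation restores the original size so MSE and PSNR can be computed. PyWavelets and OpenCV are used for the transform and the interpolation; the file name and the Haar wavelet choice are hypothetical.

import numpy as np
import pywt                      # PyWavelets
import cv2                       # OpenCV, used here only for I/O and interpolation

img = cv2.imread("lena.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)  # hypothetical test image

# One-level 2-D DWT: the approximation band cA is a half-resolution version of the image.
cA, (cH, cV, cD) = pywt.dwt2(img, "haar")

# Upscale the approximation band back to the original size with bicubic interpolation
# (Nearest/Bilinear would use INTER_NEAREST / INTER_LINEAR). Dividing by 2 normalizes the
# Haar approximation coefficients back to the original intensity range.
upscaled = cv2.resize(cA.astype(np.float32) / 2.0, (img.shape[1], img.shape[0]),
                      interpolation=cv2.INTER_CUBIC)

mse = np.mean((img - upscaled) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)
print(f"MSE = {mse:.2f}, PSNR = {psnr:.2f} dB")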

Paper 39: A Survey on Smartphones Systems for Emergency Management (SPSEM)

Abstract: Emergencies never come with prior intimation or warning. In the real world, detecting and perceiving such emergencies and reporting them is a genuine test and a tough challenge. Smartphone Systems for Emergency Management (SPSEM) provide details of existing emergency applications and offer a new direction for overcoming the traditional problem of manual intervention in reporting emergencies. In this paper, we provide a comprehensive overview of SPSEM. We elaborate how embedded sensors automate the procedure of detecting and responding to an emergency crisis. Furthermore, we critically evaluate the operations, benefits, limitations, emergency applications and responsiveness of the different approaches in any emergency crisis. We provide an easy and concise view of the underlying model adopted by each SPSEM approach. Finally, we estimate their future utility and provide an insight into upcoming trends in SPSEM

Author 1: Hafsa Maryam
Author 2: Munam Ali Shah
Author 3: Qaisar Javaid
Author 4: Muhammad Kamran

Keywords: smartphone; emergency response; mobile; crisis management

PDF

Paper 40: PSO-based Optimized Canny Technique for Efficient Boundary Detection in Tamil Sign Language Digital Images

Abstract: For the hearing impaired, sign language is the most prevalent means of day-to-day communication. It is always a challenge to develop an optimized automated system to recognize and interpret the signs expressed by the hearing impaired. A wide range of algorithms has been developed for sign language recognition (SLR), among which only a few notable approaches address Tamil Sign Language recognition. This paper proposes a significant contribution to the segmentation process, the most important component of image analysis in constructing an SLR system. Segmentation is handled using an edge detection procedure that finds the borders of the hand sign within the captured images by detecting changes in image illumination. The objective of the edge function, finding the boundary intensity, is achieved by a particle swarm optimization technique that chooses the optimal threshold values, implemented in the Canny hysteresis thresholding method. The analysis primarily uses common edge detection algorithms, including Sobel, Robert, Canny and Prewitt, and the work is extended by introducing an optimization technique into the Canny method. The performance of the proposed algorithm is tested on a real-time Tamil Sign Language dataset and compared using standard segmentation metrics. An illustrative sketch follows this entry.

Author 1: Dr M Krishnaveni
Author 2: Dr P Subashini
Author 3: TT Dhivyaprabha

Keywords: Tamil Sign Language; Canny Edge Detection; PSO; Thresholding; Objective Function

PDF
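
A minimal sketch of PSO searching for Canny's low/high hysteresis thresholds is given below. The fitness used here (keeping the edge-pixel density near a target value) is a hypothetical surrogate for the paper's objective function, and the file name and parameter settings are illustrative only.

import numpy as np
import cv2

def fitness(img, low, high, target_density=0.08):
    edges = cv2.Canny(img, int(low), int(high))
    density = np.count_nonzero(edges) / edges.size
    return abs(density - target_density)               # smaller is better

def pso_canny(img, particles=15, iters=30, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = np.random.default_rng(seed)
    pos = rng.uniform([0, 100], [100, 255], (particles, 2))   # (low, high) threshold pairs
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(img, *p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((particles, 1)), rng.random((particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, [0, 0], [255, 255])
        pos[:, 1] = np.maximum(pos[:, 1], pos[:, 0] + 1)      # keep high > low
        f = np.array([fitness(img, *p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

img = cv2.imread("sign.png", cv2.IMREAD_GRAYSCALE)            # hypothetical input image
low, high = pso_canny(img)
edges = cv2.Canny(img, int(low), int(high))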

Paper 41: Enhancement of KaPoW Plugin to Defend Against DDoS Attacks

Abstract: A DDoS attack is one of the hardest attacks to detect and mitigate in the computing world. This paper introduces two quantitative models that use client puzzles to detect and thwart application-layer DDoS attacks. We simulated the models using probabilistic metrics to penalize malicious users and prevent them from launching a DDoS attack, while offering a stable environment to normal users and decreasing the number of false positives and false negatives. An illustrative client-puzzle sketch follows this entry.

Author 1: Farah Samir Barakat
Author 2: A. Prof. Amira Kotb

Keywords: Application Security; Client Puzzling; DDoS; Metrics; PHP; Puzzle

PDF
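
The sketch below shows a generic hash-based client puzzle of the kind such defenses build on: the server scales the puzzle difficulty with a per-client suspicion score, and the client must spend CPU time finding a matching nonce before its request is served. The difficulty formula and the suspicion score are hypothetical stand-ins for the paper's probabilistic metrics.

import hashlib, os

def issue_puzzle(suspicion):
    """Return (challenge, difficulty): more suspicious clients get harder puzzles."""
    difficulty = min(4 + int(suspicion * 16), 22)         # leading zero bits required
    return os.urandom(8).hex(), difficulty

def solve_puzzle(challenge, difficulty):
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        if int(digest, 16) >> (256 - difficulty) == 0:     # top 'difficulty' bits must be zero
            return nonce
        nonce += 1

def verify(challenge, difficulty, nonce):
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
    return int(digest, 16) >> (256 - difficulty) == 0

challenge, difficulty = issue_puzzle(suspicion=0.3)
nonce = solve_puzzle(challenge, difficulty)
assert verify(challenge, difficulty, nonce)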

Paper 42: Rule Based Approach for Arabic Part of Speech Tagging and Name Entity Recognition

Abstract: The aim of this study is to build a tool for Part-of-Speech (POS) tagging and Named Entity Recognition for the Arabic language using a rule-based approach. The POS tagger has two phases: the first passes each word through a lexicon lookup, and the second is a morphological phase; the tagset is (Noun, Verb and Determiner). The named-entity detector applies rules to the text and assigns the correct label to each word, the labels being Person (PERS), Location (LOC) and Organization (ORG). An illustrative sketch follows this entry.

Author 1: Mohammad Hjouj Btoush
Author 2: Abdulsalam Alarabeyyat
Author 3: Isa Olab

Keywords: POS; Speech tagging; Speech recognition; Text phrase; Phrase; NLP

PDF
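
The following toy tagger sketches the two-phase, rule-based idea from the abstract: a lexicon lookup followed by morphological (affix) rules that fall back to a default tag. The lexicon entries and the affix rules are simplified illustrations, not the authors' actual rule set.

# Tagset limited to the three tags named in the abstract: Noun, Verb, Determiner.
LEXICON = {
    "هذا": "Determiner",   # 'this'
    "ذهب": "Verb",         # 'went'
    "كتاب": "Noun",        # 'book'
}

def tag_word(word):
    # Phase 1: lexicon lookup
    if word in LEXICON:
        return LEXICON[word]
    # Phase 2: morphological rules on prefixes/suffixes (illustrative heuristics)
    if word.startswith("ال"):                  # definite article -> noun
        return "Noun"
    if word.endswith(("ون", "ات", "ين")):       # common plural suffixes -> noun
        return "Noun"
    if word[0] in "يتنأ" and len(word) >= 3:    # imperfect verb prefixes -> verb
        return "Verb"
    return "Noun"                               # default tag

sentence = "ذهب الطالب الى المدرسة"             # 'The student went to the school'
print([(w, tag_word(w)) for w in sentence.split()])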

Paper 43: Resource Utilization in Cloud Computing as an Optimization Problem

Abstract: In this paper, an algorithm for the resource utilization problem in cloud computing, based on the greedy method, is presented. A privately owned cloud that provides services to a huge number of users is assumed. For a given resource, hundreds or thousands of requests to use that resource accumulate over time from different users worldwide via the Internet. Prior knowledge of the requests to use that resource is also assumed. The main concern is to find the best utilization schedule for a given resource in terms of the profit obtained by utilizing it and the number of time slices during which it will be utilized. The problem is proved to be NP-Complete. A greedy algorithm is proposed and analyzed in terms of its runtime complexity. The proposed solution is based on a combination of the 0/1 Knapsack problem and the activity-selection problem. The algorithm is implemented in Java. Results show good performance, with a runtime complexity of O((F−S) n log n). An illustrative sketch follows this entry.

Author 1: Ala'a Al-Shaikh
Author 2: Hebatallah Khattab
Author 3: Ahmad Sharieh
Author 4: Azzam Sleit

Keywords: Activity Selection; NP-Complete; Optimization Problem; Resource Utilization; 0/1 Knapsack

PDF
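
A minimal greedy sketch in the spirit of the abstract is shown below: requests defined by (start, finish, profit) compete for a single resource over the window [S, F), are ranked by profit per time slice (the knapsack flavour) and accepted only if their slices are still free (the activity-selection flavour). The request data and the ranking rule are illustrative assumptions, not the authors' exact algorithm.

def schedule(requests, S, F):
    """requests: list of (start, finish, profit); [S, F) is the resource's availability window."""
    free = [True] * (F - S)                              # one flag per time slice
    ranked = sorted(requests, key=lambda r: r[2] / (r[1] - r[0]), reverse=True)  # O(n log n)
    accepted, profit = [], 0
    for start, finish, p in ranked:
        slices = range(max(start, S) - S, min(finish, F) - S)
        if all(free[t] for t in slices):                 # at most F - S checks per request
            for t in slices:
                free[t] = False
            accepted.append((start, finish, p))
            profit += p
    return accepted, profit

reqs = [(0, 3, 30), (2, 5, 50), (3, 6, 25), (5, 8, 40)]   # hypothetical requests
print(schedule(reqs, S=0, F=8))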

Paper 44: Improvement of Persian Spam Filtering by Game Theory

Abstract: There are different methods for dealing with spam; however, since spammers continuously use tricks to defeat the proposed methods, filters must be constantly updated. In this study, a Stackelberg game was used to produce a dynamic filter, and the relations between the filter and the adversary were modelled as a turn-based game with a leader and a follower. An attempt was then made to solve the game as an optimization program via the evolutionarily stable strategy (ESS). The dataset used for evaluating and analyzing the proposed method was a real dataset comprising the personal emails of four users. The results of the conducted evaluations indicated that the proposed method achieved an 8% improvement over the three-class classification method and a 0.8% improvement over the ESS-based equilibrium point method

Author 1: Seyedeh Tina Sefati
Author 2: Mohammad-Reza Feizi-Derakhshi
Author 3: Seyed Naser Razavi

Keywords: Spam Filtering; Game theory; Stackelberg game; Evolutionary Stable Strategy; Email Classification; Stackelberg equilibria

PDF

Paper 45: SecFHIR: A Security Specification Model for Fast Healthcare Interoperability Resources

Abstract: Patients receiving medical treatment in different healthcare institutions have their information deeply fragmented across very different locations. All of this information, probably in different formats, may be used or exchanged to deliver professional healthcare services. As the exchange of information (interoperability) is a key requirement for the success of the healthcare process, various predefined e-health standards have been developed. Such standards are designed to facilitate information interoperability in common formats. Fast Healthcare Interoperability Resources (FHIR) is a new open healthcare data standard that aims to provide electronic healthcare interoperability. FHIR was introduced in 2014 to address limitations caused by ad-hoc implementations and the distributed nature of modern medical care information systems. Patient data, or resources, are structured and standardized in FHIR using a highly readable format such as XML or JSON. However, despite the unique features of FHIR, it is not a security protocol, nor does it provide any security-related functionality. In this paper, we propose a security specification model (SecFHIR) to support the development of intuitive policy schemes that map directly to the healthcare environment. The formal semantics of SecFHIR are based on the well-established typing and platform-independent properties of XML. Specifically, patient data are modeled in FHIR using XML documents, and we assume that these XML resources are defined by a set of schemas. Since an XML Schema is itself a well-formed XML document, the permission specification can easily be integrated into the schema, and the specified permissions are then applied to instance objects without any change. In other words, our security model (SecFHIR) defines permissions at the XML schema level, which implicitly specifies the permissions on XML resources. SecFHIR can combine these schemas to support complex constraints over XML resources. This results in reusable permissions, which simplify security administration and achieve fine-grained access control. We also discuss the core elements of the proposed model, as well as its integration with the FHIR framework

Author 1: Ahmad Mousa Altamimi

Keywords: Healthcare; FHIR; Interoperability; Privacy preserving; Standards; XML schema

PDF

Paper 46: A Comparative Study of the Iterative Numerical Methods Used in Mine Ventilation Networks

Abstract: Ventilation is one of the key safety tasks in underground mines. Determining the airflow through mine openings and ducts is complex and often requires numerical analysis. The governing equations used in the computation of mine ventilation are discussed in matrix form. The aim of this paper is to compare the most frequently used numerical analysis methods, namely Newton-Raphson and the Linear Theory, and to investigate the influence of the initial flow rates and the fans in the network. A simulated mine ventilation network is presented in order to examine the two numerical methods. The numerical results obtained from the Newton-Raphson method exhibited a faster rate of convergence than those of the Linear Theory method. Mine ventilation networks are less extensive, and therefore the Newton-Raphson method converges faster; on the other hand, when computational tools and software are used, the advantage of faster convergence becomes less important and the Linear Theory method may be preferred. An illustrative single-mesh sketch follows this entry.

Author 1: B Maleki
Author 2: E. Mozaffari

Keywords: mine ventilation; network analysis; Newton-Raphson method; linear theory

PDF
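
The single-mesh sketch below shows the Newton-Raphson correction that underlies this kind of network analysis: each branch obeys the square law p = R·Q·|Q|, and the mesh flow is corrected until the pressure drops around the loop balance the fan pressure. The resistances, fan pressure and starting flow are hypothetical values chosen only for illustration.

R = [0.5, 1.2, 0.8]          # branch resistances (N*s^2/m^8) around one closed mesh
fan_pressure = 600.0         # fan pressure rise in the mesh (Pa)
Q = 10.0                     # initial guess for the mesh airflow (m^3/s)

for it in range(20):
    f = sum(r * Q * abs(Q) for r in R) - fan_pressure    # residual of the mesh equation
    df = sum(2 * r * abs(Q) for r in R)                  # derivative of the residual w.r.t. Q
    dQ = -f / df                                         # Newton-Raphson correction
    Q += dQ
    if abs(dQ) < 1e-6:
        break

print(f"converged in {it + 1} iterations, Q = {Q:.3f} m^3/s")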

Paper 47: Stable Beneficial Group Activity Formation

Abstract: Computational models are powerful tools for expressing everyday situations that arise from human interactions. In this paper, the problem of forming beneficial groups based on members' preferences and the coordinator's own strategy is investigated. It is assumed that the coordinator has a good intention behind trimming members' preferences to meet the ultimate goal of forming the group; this strategy is justified and evaluated using Nash stability. There are two variations of the problem: the Anonymous Stable Beneficial Group Activity Formation and the General Stable Beneficial Group Activity Formation. The computational complexity of solving both variations is analyzed; finding stable groups requires a non-polynomial-time algorithm. A polynomial-time solution is presented and enhanced with examples

Author 1: Noor Sami Al-Anbaki
Author 2: Azzam Sleit
Author 3: Ahmed Sharieh

Keywords: computational models; group formation; members' preferences; Nash stability

PDF

Paper 48: Network-State-Aware Quality of Service Provisioning for the Internet of Things

Abstract: The Internet of Things (IoT) encompasses a diverse range of technologies that enable a diverse range of applications using diverse communication platforms. IP-enabled Wireless Sensor Networks (6LoWPAN) are an integral part of the IoT because of their huge potential for sensing and communication. Provisioning Quality of Service (QoS) in the IoT is challenging because of device heterogeneity in the bandwidth, computing and communication capabilities of the diverse set of IoT nodes and networks. The sensor nodes in the IoT, e.g., 6LoWPAN nodes, have low battery power, limited bandwidth and extremely constrained computing power. Additionally, these IP-based sensor networks are inherently dynamic in nature due to node failures and mobility. The introduction of modern delay-sensitive applications for such networks has made the QoS provisioning task even harder. In this paper, we present a Network-State-Adaptive QoS provisioning algorithm for 6LoWPAN, which adapts to the changing network state to ensure that QoS requirements are met even under dynamic network conditions. It is a policy-based mechanism that collaborates with the underlying routing protocol to satisfy the QoS requirements specified in high-level policies. It is simple to implement yet considerably limits the degradation of best-effort traffic. Our implementation results show that the protocol adjusts well in a dynamic 6LoWPAN environment where multiple services compete for limited available resources

Author 1: Shafique Ahmad Chaudhry
Author 2: Jun Zhang

Keywords: Internet of Things; QoS provisioning; 6LoWPAN; Policy-based QoS; IP-based Wireless Sensor Network

PDF

Paper 49: Formal Verification of a Secure Model for Building E-Learning Systems

Abstract: The Internet is considered a common medium for E-learning, connecting parties such as instructors and students who are assumed to be far away from each other. Both wired and wireless networks are used in this learning environment to facilitate mobile access to educational systems, and a secure connection and data exchange are required. An E-learning model was implemented and evaluated by conducting student experiments. Before the approach is deployed in the real world, a formal verification of the model was completed, showing that no unreachability case exists. The model in this paper, which concentrates on the security of e-content, was successfully validated using the SPIN model checker, and no errors were found

Author 1: Farhan M Al Obisat
Author 2: Hazim S. AlRawashdeh

Keywords: Formal verification; SPIN Model Checking; E-content; E-protection; Encryption and Decryption; Security of e-content

PDF

Paper 50: Performance Evaluation of Support Vector Regression Models for Survival Analysis: A Simulation Study

Abstract: Desirable features of support vector regression (SVR) models have led researchers to extend them to survival problems. In the current paper we evaluate and compare the performance of different SVR models and the Cox model using simulated and real data sets with different characteristics. Several SVR models are applied: 1) SVR with only regression constraints (standard SVR); 2) SVR with regression and ranking constraints; 3) SVR with positivity constraints; and 4) L1-SVR. In addition, an SVR model based on mean residual life is proposed. Our findings from the evaluation of real data sets indicate that for data sets with a high censoring rate and a large number of features, the SVR model significantly outperforms the Cox model; the simulated data sets show similar results. For some real data sets, L1-SVR performs significantly worse than the standard SVR. The performance of the other SVR models is not substantially different from that of the standard SVR on the real data sets. Nevertheless, the results on the simulated data sets show that the standard SVR slightly outperforms SVR with regression and ranking constraints

Author 1: Hossein Mahjub
Author 2: Shahrbanoo Goli
Author 3: Javad Faradmal
Author 4: Ali-Reza Soltanian

Keywords: support vector machines; support vector regression; survival analysis; simulation study; Cox model; mean residual life

PDF

Paper 51: Data Security Using Cryptography and Steganography Techniques

Abstract: Although cryptography and steganography can each be used to provide data security, each has a weakness. The problem with cryptography is that the ciphertext looks meaningless, so an attacker will interrupt the transmission or check the data travelling from sender to receiver more carefully. The problem with steganography is that once the presence of hidden information is revealed, or even suspected, the message becomes known. In this paper, a merged technique for data security is proposed that combines cryptography and steganography to improve the security of the information. Firstly, the Advanced Encryption Standard (AES) algorithm is modified and used to encrypt the secret message. Secondly, the encrypted message is hidden using the method in [1]. Two levels of security are therefore provided by the proposed hybrid technique, which in addition offers high embedding capacity and high-quality stego images. An illustrative sketch follows this entry.

Author 1: Marwa E. Saleh
Author 2: Abdelmgeid A. Aly
Author 3: Fatma A. Omara

Keywords: Image Steganography; Pixel Value Difference (PVD); Encryption; Decryption; Advanced Encryption Standard (AES)

PDF
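
The two-level idea can be sketched as below: encrypt the secret with AES, then embed the ciphertext bits in a cover image. Standard AES-CBC (PyCryptodome) and plain LSB embedding are used here as stand-ins for the authors' modified AES and the PVD-based hiding method of [1]; the random cover image is purely illustrative.

import numpy as np
from Crypto.Cipher import AES                       # PyCryptodome
from Crypto.Util.Padding import pad, unpad
from Crypto.Random import get_random_bytes

# Level 1: encrypt the secret message with AES in CBC mode.
key = get_random_bytes(16)
cipher = AES.new(key, AES.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(pad("secret message".encode(), AES.block_size))

# Level 2: hide the ciphertext bits in the least significant bits of a cover image.
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # hypothetical cover image
bits = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
flat = cover.flatten()
flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits          # overwrite the LSBs
stego = flat.reshape(cover.shape)

# Extraction: read back the LSBs and decrypt.
recovered = np.packbits(stego.flatten()[: bits.size] & 1).tobytes()
iv, body = recovered[:16], recovered[16:]
plain = unpad(AES.new(key, AES.MODE_CBC, iv).decrypt(body), AES.block_size)
print(plain.decode())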

Paper 52: Authenticating Sensitive Speech-Recitation in Distance-Learning Applications using Real-Time Audio Watermarking

Abstract: This paper focuses on audio-watermarking authentication and integrity protection within the context of speech data transmitted over the Internet in a real-time learning environment. Arabic Quran recitation through distance learning is used as a case study, an example of sensitive data requiring robust authentication and integrity measures. This work proposes an approach for authenticating and validating audio data transmitted by a publisher or during communications between an instructor and students reciting over the Internet. The watermarking approach proposed here is based on detecting key patterns within the audio signal, which serve as input to the algorithm before the embedding phase is performed. The developed application can easily be used at both sides of the communication to ensure the authenticity and integrity of the transmitted speech signal, and it proves effective for many distance-learning applications that require low-complexity processing in real time

Author 1: Omar Tayan
Author 2: Lamri Laouamer
Author 3: Tarek Moulahi
Author 4: Yasser M. Alginahi

Keywords: Audio; Watermarking; Quran-recitation; Integrity; Authentication

PDF

Paper 53: Increasing the Target Prediction Accuracy of MicroRNA Based on Combination of Prediction Algorithms

Abstract: MicroRNA is an oligonucleotide that plays a role in the pathogenesis of several diseases, including cancer. It is a non-coding RNA involved in the control of gene expression through the binding and inhibition of mRNA. In this study, three algorithms were implemented in the WEKA software, using two testing modes, to analyze five datasets of miRNA families. Data mining techniques are used to compare miRNA-mRNA interactions that belong either to the same gene family or to different families, and to establish a biological scheme that explains how strongly the biological parameters are involved in miRNA-mRNA prediction. The factors involved in the prediction process include match, mismatch, bulge, loop and score, which represent the binding characteristics, while the position, 3'UTR length, chromosomal location and chromosomal categorization represent the characteristics of the target mRNA. These attributes can provide empirical guidance for studying a specific miRNA family when scanning the whole human genome for novel targets. This research provides promising results that can be utilized in current and future research in this field

Author 1: Mohammed Q. Shatnawi
Author 2: Mohammad Alhammouri
Author 3: Kholoud Mukdadi

Keywords: miRNA; chromosome; prediction; genome; disease; biology; DNA sequence; enzyme

PDF

Paper 54: Comparative Study from Several Business Cases and Methodologies for ICT Project Evaluation

Abstract: Achieving high competitive advantage through Information and Communication Technologies (ICT) has never been easy without proper management and appropriate utilization of ICT resources. Statistics suggest that ICT project failures are very common in organizations for several reasons: failure to deliver the required objectives of the investment, inaccurate budget planning, the lack of a risk management plan and time overruns are some of the basic causes of an ICT project's failure. To overcome these issues, ICT decision makers have recently placed more emphasis on ICT project evaluation rather than on investment alone. Practitioners broadly categorize evaluation techniques into post- and pre-evaluation methods, which are further divided according to whether the return is measured from a financial or a non-financial perspective. The main purpose of this paper is to provide a comparative analysis of ICT investment evaluation and its categories based on pre- and post-evaluation. The paper thus offers an extensive literature review that can help ICT decision makers and organizations better select among the available evaluation techniques, where the integration of multiple techniques can further improve this process

Author 1: Farrukh Saleem
Author 2: Naomie Salim
Author 3: Abdulrahman H. Altalhi
Author 4: Abdullah AL-Malaise AL-Ghamdi
Author 5: Zahid Ullah
Author 6: Fatmah A. Baothman
Author 7: Muhammad Haleem Junejo

Keywords: ICT Investment; Evaluation of ICT Investment; Multi-Dimensional Approaches; Multi-Criteria Approaches; Financial Approaches

PDF

Paper 55: A Novel Algorithm for Optimizing Multiple Services Resource Allocation

Abstract: Resource provisioning is becoming an increasingly challenging problem in cloud computing environments as cloud-based services become more numerous and dynamic. The problem of scheduling multiple tasks for multiple users on a given number of resources is NP-Complete, and therefore several heuristic-based methods have been proposed; yet many improvements can still be made, since the problem has several optimization parameters. In addition, most proposed solutions are built on top of several assumptions and simplifications, applying computational methods such as game theory, fuzzy logic or evolutionary computing. This paper presents an algorithm to address the problem of resource allocation across a cloud-based network, where several resources are available and the cost of a computational service depends on the amount of computation. The algorithm is applicable without restrictions on the cost vector or completion time matrix, as opposed to methods in the literature. In addition, the execution of the algorithm shows better utility compared to methods applied to similar problems

Author 1: Amjad Gawanmeh
Author 2: Alain April

Keywords: Cloud computing; Cloud Services; Scheduling; Parallel and Distributed systems

PDF

Paper 56: A Proposed Textual Graph Based Model for Arabic Multi-document Summarization

Abstract: Text summarization is still an active area of research in natural language processing. The methods proposed in the literature to solve this task have had mixed success; however, the methods developed for multi-document Arabic text summarization are based on extractive summaries, and none of them is oriented towards abstractive summarization. This is due to the challenges of the Arabic language and the lack of resources. In this paper, we present an abstractive Arabic multi-document summarizer with minimal language-dependent processing. The proposed model is based on a textual graph to remove multi-document redundancy and generate a coherent summary. First, the original text, a highly redundant and related set of documents, is converted into a textual graph. Next, graph traversal with structural rules is applied to concatenate related sentences into single ones. Finally, unwanted and low-weighted phrases are removed from the summarized sentences to generate the final summary. Preliminary results show that the proposed method achieves promising results for multi-document summarization.

Author 1: Muneer A. Alwan
Author 2: Hoda M. Onsi

Keywords: Text Summarization; Arabic Abstractive Summary; Textual Graph; Natural Language Processing

PDF

Paper 57: An Auction-Bidding Protocol for Distributed Bit Allocation in RSSI-based Localization Networks

Abstract: Several factors (e.g., target energy, sensor density) affect the estimation error at a point of interest in sensor networks. One of these factors is the number of bits allocated to the sensors that cover the point of interest when quantization is employed. In this paper, we investigate bit allocation in such networks so that the estimation error requirements at multiple points of interest are satisfied as well as possible. To solve this nonlinear integer programming problem, we propose an iterative distributed auction-bidding protocol. Starting with some initial bit distribution, the network is divided into a number of clusters, each with its own auction. Each cluster head (CH) acts as an auctioneer and divides sensors into buyers or sellers of bits (i.e., the commodity). With limited messaging, CHs redistribute bits among sensors, one bit at a time, such that the difference between the achieved and required estimation errors within each cluster is reduced in each round. We propose two bit-pricing schemes used by sensors to decide on exchanging bits. Finally, simulation results show that the error performance of our 'distributed' protocol can be within 5%-10% of that of a 'centralized' genetic algorithm (GA) solution. An illustrative sketch follows this entry.

Author 1: Ahmad A. Ababneh

Keywords: Target localization; Auction-bidding

PDF
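
The sketch below illustrates one cluster-head round of the auction idea: with a fixed bit budget, one bit at a time is moved from the sensor whose error grows least without it (the seller) to the sensor whose error shrinks most with it (the buyer). The error model error_i = c_i * 2^(-b_i) and the coefficients are hypothetical stand-ins for the paper's RSSI-based estimation error and pricing schemes.

def cluster_error(c, b):
    return sum(ci * 2.0 ** (-bi) for ci, bi in zip(c, b))

def auction_round(c, b, required_error, max_moves=50):
    for _ in range(max_moves):
        if cluster_error(c, b) <= required_error:
            break
        # price of selling a bit = error increase; gain of buying a bit = error decrease
        sell_cost = [c[i] * 2.0 ** (-b[i]) if b[i] > 0 else float("inf") for i in range(len(b))]
        buy_gain = [c[i] * 2.0 ** (-b[i] - 1) for i in range(len(b))]
        seller = min(range(len(b)), key=lambda i: sell_cost[i])
        buyer = max(range(len(b)), key=lambda i: buy_gain[i])
        if buyer == seller or buy_gain[buyer] <= sell_cost[seller]:
            break                                  # no trade improves the cluster error
        b[seller] -= 1
        b[buyer] += 1
    return b

c = [4.0, 1.0, 0.5, 2.0]        # hypothetical per-sensor error coefficients
b = [3, 3, 3, 3]                # initial uniform bit allocation
print(auction_round(c, b, required_error=0.9), cluster_error(c, b))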

Paper 58: An Extension of the Bisection Theorem to Symmetrical Circuits with Cross-Coupling

Abstract: This paper demonstrates that the bisection theorem can be applied to the differential- and common-mode analysis of balanced symmetrical circuits with cross-coupling, a class of circuits often found in the literature. Applying the theorem greatly reduces the complexity of the circuit, thereby greatly simplifying the analysis and providing insight

Author 1: Fadi Nessir Zghoul

Keywords: Bisection theorem; common-mode analysis; cross-coupling; differential amplifiers; differential-mode analysis

PDF

Paper 59: Data Mining in Education

Abstract: Data mining techniques are used to extract useful knowledge from raw data. The extracted knowledge is valuable and significantly affects the decision maker. Educational data mining (EDM) is a method for extracting useful information that could potentially affect an organization. The increase of technology use in educational systems has led to the storage of large amounts of student data, which makes it important to use EDM to improve teaching and learning processes. EDM is useful in many different areas including identifying at-risk students, identifying priority learning needs for different groups of students, increasing graduation rates, effectively assessing institutional performance, maximizing campus resources, and optimizing subject curriculum renewal. This paper surveys the relevant studies in the EDM field and includes the data and methodologies used in those studies

Author 1: Abdulmohsen Algarni

Keywords: Data mining; Educational Data Mining (EDM); Knowledge extraction

PDF

Paper 60: Exploiting Document Level Semantics in Document Clustering

Abstract: Document clustering is an unsupervised machine learning method that separates a large, subject-heterogeneous collection (corpus) into smaller, more manageable, subject-homogeneous collections (clusters). Traditional document clustering works by extracting textual features such as terms, sequences and phrases from documents. These features are independent of each other and do not capture the meaning behind the words in the clustering process. To perform semantically viable clustering, we believe the document clustering problem has two main components: (1) representing the document in a form that inherently captures the semantics of the text, which may also help to reduce the dimensionality of the document; and (2) defining a similarity measure based on lexical, syntactic and semantic features that assigns higher numerical values to document pairs with a stronger syntactic and semantic relationship. In this paper, we propose a document representation that extracts three different types of features from a given document: lexical, syntactic and semantic. A meta-descriptor for each document is built from these three features, first lexical, then syntactic and finally semantic. A document-to-document similarity matrix is produced in which each entry contains a three-valued vector for the lexical, syntactic and semantic components. The main contributions of this research are: (i) a document-level descriptor using the three text features (lexical, syntactic and semantic); (ii) a similarity function using these three; and (iii) a new candidate clustering algorithm that uses the three components of the similarity measure to guide the clustering process towards more semantically rich clusters. We performed an extensive series of experiments on standard text mining data sets with external clustering evaluations such as F-Measure and Purity, and obtained encouraging results.

Author 1: Muhammad Rafi
Author 2: Muhammad Naveed Sharif
Author 3: Waleed Arshad
Author 4: Habibullah Rafay

Keywords: Document Clustering; Text Mining; Similarity Measure; Semantics

PDF

Paper 61: HAMSA: Highly Accelerated Multiple Sequence Aligner

Abstract: For biologists, an efficient tool for multiple sequence alignment is essential. This work presents a new parallel aligner called HAMSA. HAMSA is a bioinformatics application designed for highly accelerated alignment of multiple protein and DNA/RNA sequences on a multi-core cluster system. The design of HAMSA is based on a combination of our recently proposed optimized algorithms for vectorization, partitioning and scheduling. It operates mainly on a distance vector instead of a distance matrix, and accomplishes similarity computations and guide tree generation in a highly accelerated and accurate manner. HAMSA outperforms MSAProbs with a 21.9-fold speedup and ClustalW-MPI with an 11-fold speedup. It can be considered an essential tool for structure prediction, protein classification, motif finding and drug design studies

Author 1: Naglaa M. Reda
Author 2: Mohammed Al-Neama
Author 3: Fayed F. M. Ghaleb

Keywords: Bioinformatics; Multiple sequence alignment; parallel programming; Clusters; Multi-cores

PDF

Paper 62: Hashtag the Tweets: Experimental Evaluation of Semantic Relatedness Measures

Abstract: On Twitter, hashtags are used to summarize the topic of a tweet and to help in searching tweets. However, hashtags are created in a free style and are therefore heterogeneous, which makes them harder to use. It is therefore important to evaluate whether they really represent the content to which they are attached. In this work, we perform detailed experiments to answer this question. In addition, we compare different semantic relatedness measures for computing the similarity between hashtags and tweets. Experiments are performed using ten different measures, and Adapted Lesk is found to be the best. An illustrative sketch follows this entry.

Author 1: Muhammad Asif
Author 2: Malik Muhammad Saad Missen
Author 3: Nadeem Akhtar
Author 4: Hina Asmat
Author 5: Mujtaba Husnain
Author 6: Muhammad Asghar

Keywords: component; formatting; style; styling; insert (key words)

PDF
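
As a rough illustration of gloss-overlap relatedness, the sketch below scores a hashtag against a tweet by counting shared words between WordNet sense definitions. The full Adapted Lesk measure also weights overlaps with related synsets' glosses; this simplified version, the NLTK/WordNet dependency and the example tweet are illustrative assumptions.

from nltk.corpus import wordnet as wn    # requires the WordNet corpus (nltk.download("wordnet"))

def gloss_words(word):
    """Collect the words of all sense definitions (glosses) of a word."""
    words = set()
    for synset in wn.synsets(word):
        words.update(synset.definition().lower().split())
    return words

def hashtag_tweet_relatedness(hashtag, tweet):
    tag_gloss = gloss_words(hashtag.lstrip("#"))
    score = 0
    for token in tweet.lower().split():
        score += len(tag_gloss & gloss_words(token))     # gloss overlap per tweet token
    return score

print(hashtag_tweet_relatedness("#election", "voters head to the polls today"))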

Paper 63: Knowledge-based Approach for Event Extraction from Arabic Tweets

Abstract: Tweets provide a continuous update on current events. However, tweets are short, personalized and noisy, which raises additional challenges for event extraction and representation. Extracting events from Arabic tweets is a new research domain in which few examples, if any, of previous work can be found. This paper describes a knowledge-based approach for fostering event extraction from Arabic tweets. The approach uses an unsupervised rule-based technique for event extraction and provides named entity disambiguation of event-related entities (i.e., person, organization and location). Extracted events and their related entities populate an event knowledge base, where tagged tweet entities are linked to their corresponding entities in the knowledge base. The proposed approach was evaluated on a dataset of 1K Arabic tweets covering different types of events (i.e., instant events and interval events). Results show that the approach achieves an accuracy of 75.9% for event trigger extraction, 87.5% for event time extraction and 97.7% for event type identification.

Author 1: Mohammad AL-Smadi
Author 2: Omar Qawasmeh

Keywords: Event Extraction; Knowledge base; Entity linking; Named entity disambiguation; Arabic NLP

PDF

Paper 64: Multivariable Decoupling Controller: Application to Multicellular Converter

Abstract: A new control strategy is presented in this paper, based on previous work limited to controlling the capacitor voltages considered as the outputs of a three-cell converter. An additional control input is proposed in order to obtain the desired output current. Experiments performed on a multicellular converter are presented, and the results discussed show the efficiency of the contribution

Author 1: Abir Smati
Author 2: Wassila Chagra
Author 3: Denis Berdjag
Author 4: Moufida Ksouri

Keywords: Hybrid systems; Multicellular series converters; PWM; Closed loop control

PDF

Paper 65: Performance of a Constrained Version of MOEA/D on CTP-series Test Instances

Abstract: Constrained multiobjective optimization arises in many real-life applications and is therefore gaining constantly growing attention from researchers. Constraint handling techniques differ in the way infeasible solutions are evolved alongside their feasible counterparts in the evolutionary process. Our recently proposed threshold-based penalty function gives a chance of evolution to infeasible solutions whose constraint violation is less than a specified threshold value. This paper embeds the threshold-based penalty function in the update and replacement scheme of the multi-objective evolutionary algorithm based on decomposition (MOEA/D) to find trade-off solutions for constrained multiobjective optimization problems (CMOPs). The modified algorithm is tested on the CTP-series test instances in terms of the hypervolume metric (HV-metric). The experimental results are compared with two well-known algorithms, NSGA-II and IDEA, and the sensitivity of the algorithm to the adopted parameters is also checked. Empirical results demonstrate the effectiveness of the proposed penalty function in the MOEA/D framework for CMOPs. An illustrative sketch follows this entry.

Author 1: Muhammad Asif Jan
Author 2: Rashida Adeeb Khanum
Author 3: Nasser Mansoor Tairan
Author 4: Wali Khan Mashwani

Keywords: Decomposition; MOEA/D; threshold based penalty function; constrained multiobjective optimization

PDF
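
The sketch below shows one plausible form of a threshold-based penalty inside MOEA/D's replacement test: violations below the threshold are penalized mildly, so near-feasible solutions can still replace neighbours, while larger violations are penalized steeply. The quadratic/linear penalty form, the coefficients and the example numbers are assumptions, not necessarily the authors' exact formulation.

import numpy as np

def tchebycheff(f, weights, z_star):
    """Weighted Tchebycheff scalarization used by MOEA/D."""
    w = np.asarray(weights)
    return float(np.max(w * np.abs(np.asarray(f) - np.asarray(z_star))))

def constraint_violation(g_values):
    """Sum of violations for constraints written as g_i(x) >= 0."""
    return sum(max(0.0, -g) for g in g_values)

def penalized_fitness(f, g_values, weights, z_star, tau, s1=0.01, s2=20.0):
    cv = constraint_violation(g_values)
    scalar = tchebycheff(f, weights, z_star)
    if cv <= tau:
        return scalar + s1 * cv ** 2                     # mild penalty below the threshold
    return scalar + s1 * tau ** 2 + s2 * (cv - tau)      # steep penalty above the threshold

# Replacement test for one subproblem: the child replaces a neighbour only if its
# penalized scalar value is not worse.
child = penalized_fitness([0.4, 0.7], [-0.02], weights=[0.5, 0.5], z_star=[0, 0], tau=0.05)
parent = penalized_fitness([0.5, 0.6], [0.0], weights=[0.5, 0.5], z_star=[0, 0], tau=0.05)
print("replace" if child <= parent else "keep parent")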

Paper 66: Scheduling on Heterogeneous Multi-core Processors Using Stable Matching Algorithm

Abstract: Heterogeneous Multi-core Processors (HMP) are better at scheduling jobs than homogeneous multi-core processors. Two main factors are considered when analyzing both architectures: performance and power consumption. HMP incorporates cores of various types or complexities on a single chip; hence, HMP can address both throughput and productivity for different workloads by matching execution resources to the needs of each application. The primary objective of this study is to improve the dynamic selection of the processor core to fulfill the power and performance requirements using a task scheduler. In the proposed solution, dynamic priority lists are maintained for tasks and available cores, and the mapping of tasks to cores is performed on the basis of the priorities of the tasks and cores. An illustrative matching sketch follows this entry.

Author 1: Muhammad Rehman Zafar
Author 2: Muhammad Asfand-e-Yar

Keywords: Heterogeneous; Performance; Scheduling; Multicore processors; Stable matching

PDF
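
A Gale-Shapley style matching between tasks and heterogeneous cores is sketched below, since stable matching is the mechanism the abstract names. The preference lists (tasks ranking cores by expected speed-up, cores ranking tasks by priority) are hypothetical; the paper builds its priority lists dynamically at run time.

def stable_match(task_prefs, core_prefs):
    free = list(task_prefs)                       # tasks still unassigned
    next_choice = {t: 0 for t in task_prefs}      # index of the next core each task proposes to
    match = {}                                    # core -> task
    rank = {c: {t: i for i, t in enumerate(p)} for c, p in core_prefs.items()}
    while free:
        task = free.pop(0)
        core = task_prefs[task][next_choice[task]]
        next_choice[task] += 1
        if core not in match:
            match[core] = task
        elif rank[core][task] < rank[core][match[core]]:   # core prefers the new task
            free.append(match[core])
            match[core] = task
        else:
            free.append(task)
    return {t: c for c, t in match.items()}

task_prefs = {"t1": ["big0", "big1", "little0"],
              "t2": ["big0", "little0", "big1"],
              "t3": ["little0", "big0", "big1"]}
core_prefs = {"big0": ["t2", "t1", "t3"],
              "big1": ["t1", "t3", "t2"],
              "little0": ["t3", "t2", "t1"]}
print(stable_match(task_prefs, core_prefs))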

Paper 67: Determining adaptive thresholds for image segmentation for a license plate recognition system

Abstract: A vehicle license plate recognition (LPR) system is useful in many applications, such as entrance admission, security, parking control, airport and cargo handling, and traffic and speed control. This paper describes an adaptive threshold for image segmentation applied to a Malaysian intelligent license plate recognition system (MyiLPR). Because of the different types of license plates used, the requirements of an automatic LPR system differ for each country. Upon receiving the input car image, the system (MyiLPR) detects and segments the license plate based on the proposed adaptive threshold, computed via the image and blob histograms and blob agglomeration, and finally extracts geometric character features and classifies them using a neural network. The proposed adaptive threshold increased the detection, segmentation and recognition rates to 99%, 94.98% and 90%, respectively, from the 95%, 78.27% and 71.08% obtained with the fixed threshold used in the originally proposed system. An illustrative sketch follows this entry.

Author 1: Siti Norul Huda Sheikh Abdullah
Author 2: Khairuddin Omar
Author 3: Abbas Salimi Zaini
Author 4: Maria Petrou
Author 5: Marzuki Khalid

Keywords: adaptive threshold; image segmentation; license plate recognition; neural network; computer surveillance

PDF
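
The sketch below picks a per-image threshold from the intensity histogram, using Otsu's between-class-variance criterion as a generic stand-in for the paper's image/blob-histogram rule; the point is that the threshold adapts to each plate image instead of being fixed. The file name is hypothetical.

import numpy as np
import cv2

def histogram_threshold(gray):
    """Choose a global threshold from the image histogram (Otsu's criterion)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

plate = cv2.imread("plate.png", cv2.IMREAD_GRAYSCALE)     # hypothetical plate region
t = histogram_threshold(plate)
binary = (plate >= t).astype(np.uint8) * 255              # segmented characters vs. background
print("adaptive threshold =", t)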
