IJACSA Volume 5 Issue 2

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Applying Cellular Automata for Simulating and Assessing Urban Growth Scenario Based in Nairobi, Kenya

Abstract: This research explores urban growth scenarios for the city of Nairobi using a cellular automata urban growth model (UGM). African cities have experienced rapid urbanization over the last decade due to increased population growth and high economic activity. We used multi-temporal Landsat imagery from 1976, 1986, 2000 and 2010 to investigate urban land-use changes in Nairobi. Our UGM used urban land-use data from 1986 and 2010, road data, slope data and an exclusion layer. A Monte Carlo technique was used for model calibration and a Multi Resolution Validation (MRV) technique for validation. Urban land-use was simulated up to the year 2030, when Kenya plans to attain Vision 2030. Three scenarios were explored in the modelling process: unmanaged growth with no restriction on environmental areas, managed growth with moderate protection, and managed growth with maximum protection of forest, agricultural areas, and urban green space. Alternative scenario development using a UGM is therefore useful for planning purposes to ensure that sustainable development is achieved. The UGM provides quantitative, visual, spatial and temporal information that helps policy and decision makers make informed decisions.
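
For readers unfamiliar with cellular automata urban growth models, the minimal sketch below shows the general shape of one CA transition step: a non-urban cell becomes urban based on urban neighbors, slope suitability, and an exclusion mask. The thresholds, weights and spread probability are hypothetical illustrations, not the calibrated parameters of the paper's UGM.

    # Minimal, generic cellular-automata urban growth step (illustration only;
    # neighbor_threshold, max_slope and p_spread are hypothetical parameters).
    import numpy as np

    def ca_growth_step(urban, slope, excluded, rng,
                       neighbor_threshold=3, max_slope=15.0, p_spread=0.5):
        """One transition of a simple urban-growth CA on a 2-D grid."""
        new_urban = urban.copy()
        rows, cols = urban.shape
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                if urban[r, c] or excluded[r, c]:
                    continue  # already urban, or protected by the exclusion layer
                # count urban cells in the 3x3 Moore neighborhood
                neighbors = urban[r-1:r+2, c-1:c+2].sum() - urban[r, c]
                if (neighbors >= neighbor_threshold
                        and slope[r, c] <= max_slope
                        and rng.random() < p_spread):
                    new_urban[r, c] = 1
        return new_urban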

Author 1: Kenneth Mubea
Author 2: Roland Goetzke
Author 3: Gunter Menz

Keywords: Urban Growth; Scenarios; Nairobi; Cellular automata; Simulation; sustainable development

PDF

Paper 2: Construction Strategy of Wireless Sensor Networks with Throughput Stability by Using Mobile Robot

Abstract: We propose a wireless sensor network deployment strategy for constructing wireless communication infrastructures for a rescue robot while taking into account the throughput between sensor nodes (SNs). Recent studies on reducing disaster damage focus on gathering information about disaster areas in underground spaces. Since information gathering in such post-disaster underground spaces carries a high risk of personal injury from secondary disasters, many rescue workers have been injured or killed in the past. Against this background, gathering information with rescue robots has been widely discussed. However, no wireless communication infrastructure for tele-operating a rescue robot exists in post-disaster environments such as underground spaces. We have therefore been studying a method for constructing wireless communication infrastructures for remote operation of the rescue robot by using the robot itself. In this paper, we evaluate the proposed method in a field operation test and confirm that the constructed networks maintain end-to-end communication connectivity and throughput.

Author 1: Kei Sawai
Author 2: Shigeaki Tanabe
Author 3: Hitoshi Kono
Author 4: Yuta Koike
Author 5: Ryuta Kunimoto
Author 6: Tsuyoshi Suzuki

Keywords: Wireless Sensor Networks; Rescue Robot Tele-Operation; Maintaining Throughput

PDF

Paper 3: A Secured Framework for Geographical Information Applications on Web

Abstract: Current geographical information applications increasingly require managing spatial data through the Web. Users of geographical information applications need not only to display spatial data but also to modify them interactively. As a result, the security risks facing geographical information applications are also increasing. In this paper, a secured framework is proposed. The goal of the proposed framework is to provide fine-grained access control for web-based geographic information applications. A case study is finally presented to demonstrate the feasibility and effectiveness of the proposed framework.

Author 1: Mennatallah H. Ibrahim
Author 2: Hesham A. Hefny

Keywords: spatial data; geographic information systems; access control; authorization

PDF

Paper 4: Bimodal Emotion Recognition from Speech and Text

Abstract: This paper presents an approach to emotion recognition from speech signals and textual content. In the analysis of speech signals, thirty-seven acoustic features are extracted from the speech input. Two different classifiers, Support Vector Machines (SVMs) and a BP neural network, are adopted to classify the emotional states. In text analysis, we use a two-step classification method to recognize the emotional states. The final emotional state is determined based on the emotion outputs from the acoustic and textual analyses. In this paper we use two parallel classifiers for acoustic information and two serial classifiers for textual information, and a final decision is made by combining these classifiers with decision-level fusion. Experimental results show that the emotion recognition accuracy of the integrated system is better than that of either of the two individual approaches.
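
The sketch below illustrates what decision-level fusion of the two modalities can look like. The weighted-sum rule and the weights are assumptions for illustration; the abstract does not state the exact fusion rule used in the paper.

    # Illustrative decision-level fusion of per-emotion scores from two modalities.
    # The weights and the weighted-sum rule are assumptions, not the paper's rule.
    def fuse_decisions(acoustic_scores, text_scores, w_acoustic=0.6, w_text=0.4):
        """Combine per-emotion scores from the speech and text classifiers."""
        fused = {e: w_acoustic * acoustic_scores[e] + w_text * text_scores.get(e, 0.0)
                 for e in acoustic_scores}
        return max(fused, key=fused.get)  # emotion with the highest fused score

    # Example: scores produced by the two analyses for one utterance
    print(fuse_decisions({"happy": 0.7, "angry": 0.2, "sad": 0.1},
                         {"happy": 0.4, "angry": 0.5, "sad": 0.1}))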

Author 1: Weilin Ye
Author 2: Xinghua Fan

Keywords: emotion recognition; acoustic features; textual features; decision level fusion

PDF

Paper 5: Performance of Window Synchronisation in Coherent Optical OFDM System

Abstract: In this paper we investigate the performance of a robust and efficient technique for frame/symbol timing synchronization in coherent optical OFDM. It uses a preamble consisting of only two training symbols with two identical parts to achieve a reliable synchronization scheme. The performance of the timing offset estimator at correct and incorrect timing in coherent optical OFDM is compared in terms of the mean and variance of the timing offset, and finally we study the influence of the number of subcarriers and of chromatic dispersion.
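
A preamble with two identical halves suggests an autocorrelation-based timing metric of the Schmidl-Cox type. The sketch below shows that generic metric under this assumption; it is not necessarily the exact estimator analyzed in the paper.

    # Generic autocorrelation timing metric for a preamble whose two halves are
    # identical (Schmidl-Cox style); an assumption-based illustration only.
    import numpy as np

    def timing_metric(rx, half_len):
        """Return M(d) for every candidate frame-start position d."""
        n = len(rx) - 2 * half_len
        metric = np.zeros(n)
        for d in range(n):
            first = rx[d:d + half_len]
            second = rx[d + half_len:d + 2 * half_len]
            p = np.sum(np.conj(first) * second)   # correlation of the two halves
            r = np.sum(np.abs(second) ** 2)       # received energy in the window
            metric[d] = np.abs(p) ** 2 / (r ** 2 + 1e-12)
        return metric  # the frame start is estimated as the argmax of the metric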

Author 1: Sofien Mhatli
Author 2: Bechir Nsiri
Author 3: Mutasam Jarajreh
Author 4: Basma Hammami
Author 5: Rabah Attia

Keywords: COFDM; timing offset; time synchronization; training symbol

PDF

Paper 6: SentiTFIDF – Sentiment Classification using Relative Term Frequency Inverse Document Frequency

Abstract: Sentiment classification refers to the computational techniques for classifying whether the sentiment of a text is positive or negative. Statistical techniques based on term presence and term frequency, using Support Vector Machines, are popularly used for sentiment classification. This paper presents an approach for classifying a term as positive or negative based on its proportional frequency count distribution and proportional presence count distribution across positively tagged documents in comparison with negatively tagged documents. Our approach is based on the term weighting techniques used for information retrieval and sentiment classification, but it differs significantly from these traditional methods through its model of logarithmic differential term frequency and term presence distribution for sentiment classification. Terms with nearly equal distributions in positively and negatively tagged documents are classified as Senti-stop-words and discarded; the proportional distribution at which a term is classified as a Senti-stop-word was determined experimentally. We evaluated the SentiTFIDF model by comparing it with state-of-the-art techniques for sentiment classification on the movie dataset.
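
The sketch below illustrates the general idea of classifying a term by the proportional distribution of its frequency across positively and negatively tagged documents. The log-ratio form, the smoothing, and the neutrality band are illustrative assumptions, not the exact SentiTFIDF formula.

    # Classify a term by its proportional frequency across positive/negative
    # document collections; the log-ratio and the 0.1 band are assumptions.
    import math

    def classify_term(pos_count, neg_count, total_pos_terms, total_neg_terms,
                      neutral_band=0.1):
        pos_rate = (pos_count + 1) / (total_pos_terms + 1)   # add-one smoothing
        neg_rate = (neg_count + 1) / (total_neg_terms + 1)
        score = math.log(pos_rate / neg_rate)
        if abs(score) < neutral_band:
            return "senti-stop-word"       # nearly equal distribution: discard
        return "positive" if score > 0 else "negative"

    print(classify_term(pos_count=120, neg_count=15,
                        total_pos_terms=50_000, total_neg_terms=48_000))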

Author 1: Kranti Ghag
Author 2: Ketan Shah

Keywords: Sentiment Classification; Term Weighting; Term Frequency; Term Presence; Document Vectors

PDF

Paper 7: New technique to insure data integrity for archival files storage (DIFCS)

Abstract: In this paper we develop an algorithm that increases the security of using the HMAC (keyed-hash message authentication code) function to ensure data integrity for exchanged archival files. A hash function is a very strong tool used in information security. The developed algorithm is safe and quick, and it allows University of Tabuk (UT) authorities to be sure that the data of archival documents will not be changed or modified by unauthorized personnel while being transferred over the network; it also increases the efficiency of the network in which archived files are exchanged. The basic issues of hash functions and data integrity are presented as well. The developed algorithm is effective and easy to implement, using the HMAC algorithm to guarantee data integrity for archival scanned documents in the document management system.
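
As background on the HMAC primitive the paper builds on, here is a minimal sketch of HMAC-based integrity checking for an archived file using Python's standard hmac and hashlib modules. The SHA-256 choice and key handling are illustrative; the paper's own scheme is not reproduced here.

    # Minimal HMAC integrity check for a file (background illustration only).
    import hmac, hashlib

    def file_tag(path, key):
        """Compute an HMAC-SHA256 tag over the file contents."""
        with open(path, "rb") as f:
            return hmac.new(key, f.read(), hashlib.sha256).hexdigest()

    def verify(path, key, expected_tag):
        """Constant-time comparison guards against timing attacks."""
        return hmac.compare_digest(file_tag(path, key), expected_tag)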

Author 1: Mohannad Najjar

Keywords: cryptography; hash functions; data integrity; authentication; HMAC; file archiving

PDF

Paper 9: A Greedy Algorithm for Load Balancing Jobs with Deadlines in a Distributed Network

Abstract: One of the most challenging issues when dealing with distributed networks is the efficiency of job load balancing. This paper presents a novel algorithm for load balancing jobs that have a given deadline in a distributed network, assuming central coordination. The algorithm uses a greedy strategy for global and local decision making: schedule a job as late as possible. It has increased overhead compared with other well-known methods, but its load-balancing policy provides a better fit for jobs.
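
The sketch below illustrates the "schedule a job as late as possible" idea in its simplest form: the job is slotted so it finishes right at its deadline, on the node offering the latest feasible start. The interval bookkeeping is simplified and the paper's full coordinator logic is not reproduced.

    # Greedy "as late as possible" placement sketch (simplified illustration).
    def latest_start(busy, duration, deadline):
        """Latest feasible start on one node, or None if the job cannot fit."""
        start = deadline - duration
        for b_start, b_end in sorted(busy, reverse=True):
            if start >= b_end or start + duration <= b_start:
                continue              # no overlap with this reservation
            start = b_start - duration    # push the job earlier, before the conflict
        return start if start >= 0 else None

    def place_job(nodes, duration, deadline):
        """Pick the node offering the latest feasible start (greedy choice)."""
        def score(node):
            s = latest_start(node["busy"], duration, deadline)
            return -1 if s is None else s
        best = max(nodes, key=score)
        s = latest_start(best["busy"], duration, deadline)
        if s is None:
            return None               # job cannot be scheduled anywhere
        best["busy"].append((s, s + duration))
        return best["id"], s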

Author 1: Ciprian I. Paduraru

Keywords: scheduling; greedy; coordination; network

PDF

Paper 10: Evaluation of Different Hypervisors Performance in the Private Cloud with SIGAR Framework

Abstract: To make the cloud computing model practical and to provide its essential characteristics, such as rapid elasticity, resource pooling, on-demand access and measured service, two prominent technologies are required: the Internet and virtualization technology. Virtualization technology plays a major role in the success of cloud computing. A virtualization layer that provides infrastructural support to multiple virtual machines above it, by virtualizing hardware resources such as CPU, memory, disk and NIC, is called a hypervisor. It is interesting to study how different hypervisors perform in a private cloud. Hypervisors come in paravirtualized, fully virtualized and hybrid flavors, and comparing them in a private cloud environment is a novel idea. This paper conducts different performance tests on three hypervisors, XenServer, ESXi and KVM, and the results are gathered using the SIGAR API (System Information Gatherer and Reporter) along with the Passmark benchmark suite. In the experiment, CloudStack 4.0.2 (open source cloud computing software) is used to create a private cloud, in which the management server is installed on the 64-bit Ubuntu 12.04 operating system. The hypervisors XenServer 6.0, ESXi 4.1 and KVM (Ubuntu 12.04) are installed as hosts in their respective clusters, and their performance has been evaluated in detail using the SIGAR framework, Passmark and NetPerf.

Author 1: P. Vijaya Vardhan Reddy
Author 2: Dr. Lakshmi Rajamani

Keywords: CloudStack; Hypervisor; Management Server; Private Cloud; Virtualization Technology; SIGAR; Passmark

PDF

Paper 11: TCP-Costco Reno: New Variant by Improving Bandwidth Estimation to Adapt over MANETs

Abstract: The Transmission Control Protocol (TCP) is the traditional, dominant and de facto standard transport protocol of the TCP/IP protocol suite. It is designed to provide reliability and to guarantee end-to-end delivery of data over unreliable networks. In practice, most TCP deployments have been carefully designed in the context of wired networks; ignoring the properties of wireless ad hoc networks can therefore lead to TCP implementations with poor performance. The problem of TCP and all its existing variants in MANETs lies in their inability to distinguish between the different causes of data packet loss: whenever a loss occurs, the traditional TCP congestion control algorithm assumes it is due to a congestion episode and reduces its sending parameters unnecessarily. Thus, TCP does not always behave optimally in the face of packet losses, which can cause network performance degradation and wasted resources. To adapt TCP to mobile ad hoc environments, improvements based on RTT and bandwidth estimation have been proposed in the literature to help TCP distinguish the causes of the different types of losses, but they still do not handle all the problems accurately and effectively. In this paper, the proposed TCP-Costco Reno, a new variant, accurately estimates the available bandwidth over mobile ad hoc networks and sets the sending rate accordingly to maximize utilization of the available resources, and hence improves the performance of TCP over mobile ad hoc networks. The simulation results indicate an improvement in throughput over interference, link failure and signal loss validation scenarios. Furthermore, the new variant shows a higher average throughput than the variants that are most successful over MANETs.

Author 1: Prakash B. Khelage
Author 2: Dr. Uttam D. Kolekar

Keywords: mobile ad hoc network (MANET); congestion; link failure; signal loss; interference; retransmission timeout; RTT and BW estimation

PDF

Paper 12: Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

Abstract: With the semiconductor industry trend of “smaller the better”, the path from an idea to a final product, greater innovation in the product portfolio, and the need to remain competitive and profitable all create pressure for more and more innovation in the CAD flow, process management and the project execution cycle. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment with wider test vectors, without waiting for the RTL to become available. In this paper, we present a novel approach for the early development of a reusable multi-language verification flow, addressing four major verification activities: 1) early creation of an executable specification, 2) early creation of the verification environment, 3) early development of test vectors, and 4) better and increased re-use of blocks. Although this paper focuses on early development of a UVM-based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image-signal-processing designs.

Author 1: Abhishek Jain
Author 2: Sandeep Jana
Author 3: Dr. Hima Gupta
Author 4: Krishna Kumar

Keywords: SystemVerilog; SystemC; Transaction Level Modeling; Universal Verification Methodology (UVM); Processor model; Universal Verification Component (UVC); Reference Model

PDF

Paper 13: Development of Rest Facility Information Exchange System by Utilizing Delay Tolerant Network

Abstract: In this paper, we propose a temporary rest facility information exchange system, based on a Delay Tolerant Network (DTN), for people unable to get home after a disaster. When public transportation services are interrupted by a disaster, such people try to get home on foot while taking rests at these facilities. However, it is difficult for them to obtain information about temporary rest facilities that are set up hurriedly, because communication infrastructures in the disaster area are disconnected by the disaster damage. We therefore propose a method for these people to exchange the information with one another using mobile devices over a DTN so that the information is diffused. Using the DTN, people can communicate with each other on their mobile devices and use the rest facilities on the basis of this information even when the communication infrastructures are disconnected. We developed mobile application software for exchanging rest facility information over a DTN. To evaluate the application, we verified its communication performance in practical experiments. The experimental results showed that the developed application had sufficient performance to exchange rest facility information over the DTN. We then verified the diffusivity of the rest facility information by network simulation. The simulation results showed that the rest facility information was diffused widely and effectively to those people.

Author 1: Masahiro Ono
Author 2: Kei Sawai
Author 3: Tsuyoshi Suzuki

Keywords: Delay Tolerant Network; rest facility; disaster; communication infrastructure; simulation

PDF

Paper 14: New Method Based on Multi-Threshold of Edges Detection in Digital Images

Abstract: Edges characterize object boundaries in images and are therefore useful for segmentation, registration, feature extraction, and identification of objects in a scene. Edge detection is used to classify, interpret and analyze digital images in various fields of application such as robotics, sensitive military applications, optical character recognition, infrared gait recognition, automatic target recognition, detection of video changes, real-time video surveillance, medical images, and scientific research images. There are different methods of edge detection in digital images, each suited to a particular type of image, but most of these methods have some defects in the quality of their results. Reduced computation time is also needed in many applications, especially with large images, which require more processing time. Thresholding is one of the powerful methods used for edge detection. In this paper, we propose a new method based on multiple threshold values computed using Shannon entropy to address the problems of the traditional methods. It minimizes the computation time, produces high-quality edge images, and has the additional benefit of being easy to implement.
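
For context, the sketch below shows a generic entropy-based threshold selection on a grayscale histogram (a Kapur-style maximum-entropy split). It only illustrates how Shannon entropy can drive threshold choice; the paper's actual multi-threshold scheme is not reproduced.

    # Generic maximum-entropy threshold selection (illustration, not the paper's method).
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def max_entropy_threshold(hist):
        """Pick the gray level t maximizing H(background) + H(foreground)."""
        p = np.asarray(hist, dtype=float)
        p = p / p.sum()
        best_t, best_h = 0, -np.inf
        for t in range(1, len(p)):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            h = entropy(p[:t] / w0) + entropy(p[t:] / w1)
            if h > best_h:
                best_t, best_h = t, h
        return best_t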

Author 1: Amira S. Ashour
Author 2: Mohamed A. El-Sayed
Author 3: Shimaa E. Waheed
Author 4: S. Abdel-Khalek

Keywords: image processing; multi-threshold; edges detection; clustering

PDF

Paper 15: Dynamic Software Architecture for Medical Domain Using Pop Counts

Abstract: Over the past few decades, the complexity of software in almost every domain has increased significantly. The aim of this paper is to provide an approach that is not only feasible but also decision-oriented in the medical domain. It focuses on careful planning and on organizing success through continuous process improvement, since rapid changes in software and hardware technology bring a lot of trouble to system development and maintenance. We have used the pop count method to develop a dynamic software architecture, together with the required quality attributes, in order to determine the severity level of a patient's disease from the specialist's perspective. This is useful for deciding on priority treatment and regular monitoring of patients even in the absence of the specialist. The method (model) was further tested on 25 symptoms of 100 patients, containing no dichotomous data, and statistical evaluation showed that it results in an almost perfect classification and that the architecture conforms to the medical software architecture quality requirements.
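
As a purely hypothetical illustration of the pop count (set-bit count) idea the abstract mentions, the sketch below encodes symptoms as a bitmask and counts the set bits as a crude severity score. The severity bands are invented for illustration; the paper's model is more involved.

    # Hypothetical use of a pop count over a symptom bitmask (illustration only).
    def popcount(x: int) -> int:
        """Count set bits (Python 3.10+ also offers int.bit_count())."""
        count = 0
        while x:
            x &= x - 1        # clear the lowest set bit
            count += 1
        return count

    def severity(symptom_mask: int) -> str:
        n = popcount(symptom_mask)
        if n >= 15:
            return "high"
        return "moderate" if n >= 8 else "low"

    # 25 symptoms encoded as bits 0..24; here bits 0, 3 and 7 are present
    print(severity(0b0000000000000000010001001))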

Author 1: UMESH BANODHA
Author 2: KANAK SAXENA

Keywords: Software Architecture; Quality Attributes; Pop count; medical process reengineering

PDF

Paper 16: OLAWSDS: An Online Arabic Web Spam Detection System

Abstract: For marketing purposes, some website designers and administrators use illegal Search Engine Optimization (SEO) techniques to optimize the ranking of their web pages and mislead the search engines. Some Arabic web pages use both content and link features to artificially increase the rank of their pages in the Search Engine Results Pages (SERPs). This study represents an enhancement of previous work in this field. It includes the design and implementation of an online Arabic web spam detection system, based on algorithms and mathematical foundations, which can detect Arabic content and link web spam using a tree of spam detection conditions as well as user feedback collected through a custom web browser. Users participate in the decision about any web page through their feedback, judging whether the Arabic web pages in the browser are relevant to their particular queries or not. The proposed system uses the content and link features extracted from Arabic web pages to determine whether to label each web page as spam or non-spam. The system also attempts to learn from user feedback to automatically enhance its performance. Statistical analysis is adopted in this study to evaluate the proposed system: the Statistical Package for the Social Sciences (SPSS) is used to evaluate the new system, with user feedback treated as the dependent variable and the Arabic content and link features as the independent variables. The statistical analysis with SPSS applies a variety of tests, such as the analysis of variance (ANOVA), which shows the relationships between the dependent and independent variables in the dataset and supports building sound decisions and results.

Author 1: Mohammed N. Al-Kabi
Author 2: Heider A. Wahsheh
Author 3: Izzat M. Alsmadi

Keywords: Arabic Web spam; content-based; link-based; Information Retrieval

PDF

Paper 17: RPOA Model-Based Optimal Resource Provisioning

Abstract: Optimal utilization of resources is at the core of the provisioning process in cloud computing. Sometimes the local resources of a data center are not adequate to satisfy the users' requirements, so providers need to create several data centers in different geographical areas around the world and spread the users' applications over these resources to satisfy both the service providers' and the customers' QoS requirements. With this expansion of resources and applications, transmission cost and time have to be treated as significant factors in the allocation process. In our previous paper, a Resource Provision Optimal Algorithm (RPOA) based on Particle Swarm Optimization (PSO) was introduced to find near-optimal resource utilization while respecting the customer's budget and deadline. This paper is an enhancement of the RPOA algorithm that finds near-optimal resource utilization by including data transfer time and cost, in addition to the customer's budget and deadline, in the performance measurement.
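
As background on the PSO machinery the abstract refers to, the sketch below shows a standard PSO velocity/position update step. The fitness function used by RPOA (cost- and deadline-aware allocation) is not reproduced, so `fitness` is a placeholder assumption.

    # Generic particle swarm optimization step (standard update equations only).
    import random

    def pso_step(particles, fitness, w=0.7, c1=1.5, c2=1.5):
        """particles: dicts with 'pos', 'vel', 'best_pos'; fitness is minimized."""
        g_best = min((p["best_pos"] for p in particles), key=fitness)
        for p in particles:
            for i in range(len(p["pos"])):
                r1, r2 = random.random(), random.random()
                p["vel"][i] = (w * p["vel"][i]
                               + c1 * r1 * (p["best_pos"][i] - p["pos"][i])
                               + c2 * r2 * (g_best[i] - p["pos"][i]))
                p["pos"][i] += p["vel"][i]
            if fitness(p["pos"]) < fitness(p["best_pos"]):
                p["best_pos"] = p["pos"][:]
        return min((p["best_pos"] for p in particles), key=fitness)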

Author 1: Noha El. Attar
Author 2: Samy Abd El -Hafeez
Author 3: Wael Awad
Author 4: Fatma Omara

Keywords: Cloud Computing; Resource Provision; Data Communication; Particle Swarm Optimization

PDF

Paper 18: Investigating the combination of structural and textual information about multimedia retrieval

Abstract: The expansion of structured information in different applications introduces a new ambiguity into multimedia retrieval in semi-structured documents. In this paper we investigate the combination of textual and structural context for multimedia retrieval in XML documents, and we present an indexing model that combines textual and structural information. We propose a geometric method that implicitly uses the textual and structural context of XML elements, and we are particularly interested in improving the effectiveness of various structural factors for multimedia retrieval. Using a geometric metric, we can represent the structural information of an XML document with a vector for each element. Given a textual query, our model lets us combine the scores obtained from each source of evidence and return a list of relevant retrieved multimedia elements. Experimental evaluation is carried out using the INEX Ad Hoc Task 2007 and the ImageCLEF Wikipedia Retrieval Task 2010. The results show that combining the scores of the textual and structural modalities significantly improves the results compared with using a single modality.

Author 1: Sana FAKHFAKH
Author 2: Mohamed TMAR
Author 3: Walid MAHDI

Keywords: Geometric distance; multimedia retrieval; element; structure; document modeling

PDF

Paper 19: Audio Search Based on Keyword Spotting in Arabic Language

Abstract: Keyword spotting is an important application of speech recognition. This research introduces a keyword spotting approach for performing audio searches of uttered words in Arabic speech. The matching process depends on the utterance nucleus, which is insensitive to its context. To spot the targeted utterances, the matched nuclei are expanded to cover the whole utterances. Applying this approach to the Quran and to standard Arabic gives promising results. To improve this spotting approach, it is combined with a text search whenever a transcript exists. This can be applied to the Quran, as there is exact correspondence between the audio and text files of each verse. The developed approach starts with a text search to identify the verses that include the target utterance(s). For each located verse, the occurrence(s) of the target utterance are determined. The targeted utterance (the reference) is manually segmented from a located verse, and keyword spotting is then performed for the extracted reference against the corresponding audio file. The accuracy of the spotted utterances reached 97%. The experiments showed that the combined text and audio search reduced the search time by 90% compared with audio search alone on the same content. The developed approach has also been applied to non-transcribed audio files (sermons and news) to search for chosen utterances, with promising results: the spotting accuracy was around 84% for sermons and 88% for news.

Author 1: Mostafa Awaid
Author 2: Sahar A. Fawzi
Author 3: Ahmed H. Kandil

Keywords: Speech Recognition; Keyword Spotting; Template Matching

PDF

Paper 20: A Tentative Analysis of the Rectangular Horizontal-slot Microstrip Antenna

Abstract: In this paper, we present a new type of microstrip antenna, referred to as the rectangular horizontal-slot patch antenna. Our main goal is to design a novel antenna that combines structural simplicity with higher return loss. We follow a tentative approach that leads to an exceptional result, better than the conventional design, and the experimental outcomes yield some guidelines for further practice. All of the antennas here were analyzed using GEMS (General Electro-Magnetic Solver), commercial software from 2COMU (Computer and Communication Unlimited).

Author 1: Md. Tanvir Ishtaique ul Huque
Author 2: Md. Imran Hasan

Keywords: GEMS; Microstrip antenna; Rectangular horizontal slot antenna; Return loss; Slot antenna; antenna

PDF

Paper 21: TCP I-Vegas in Mobile-IP Network

Abstract: Mobile Internet Protocol (Mobile-IP or MIP) provides hosts with the ability to change their point of attachment to the network without compromising their ability to communicate. However, when TCP Vegas is used over a MIP network, its performance degrades because it may respond to a handoff by invoking its congestion control algorithm. TCP Vegas is sensitive to changes in the Round-Trip Time (RTT) and may interpret an increased RTT as a sign of network congestion, because it cannot tell whether the increase is due to a route change or to congestion. This paper presents a new and improved version of conventional TCP Vegas, which we name TCP I-Vegas (where “I” stands for Improved). Vegas performs well compared with Reno, but its performance degrades when it shares bandwidth with Reno. I-Vegas has been designed so that whenever TCP variants such as Reno have to share bandwidth with it, the loss that Vegas would otherwise have to bear is avoided. We compared the performance of I-Vegas with Vegas in a MIP environment using the Network Simulator (NS-2). Simulation results show that I-Vegas performs better than Vegas in terms of throughput and congestion window behavior.
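
To see why a handoff-induced RTT increase can be misread as congestion, the sketch below shows the textbook TCP Vegas congestion-avoidance rule. This is standard Vegas, shown only as background; it is not the I-Vegas modification proposed in the paper.

    # Textbook TCP Vegas congestion-avoidance update (background illustration).
    def vegas_update(cwnd, base_rtt, current_rtt, alpha=2, beta=4):
        expected = cwnd / base_rtt            # throughput if there were no queuing
        actual = cwnd / current_rtt           # throughput actually observed
        diff = (expected - actual) * base_rtt # estimated packets queued in the network
        if diff < alpha:
            return cwnd + 1                   # spare capacity: grow the window
        if diff > beta:
            return cwnd - 1                   # queues building: shrink the window
        return cwnd

    # After a Mobile-IP handoff the RTT jumps from 50 ms to 120 ms, so plain Vegas
    # shrinks its window even though no congestion occurred.
    print(vegas_update(cwnd=20, base_rtt=0.050, current_rtt=0.120))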

Author 1: Nitin Jain
Author 2: Dr. Neelam Srivastava

Keywords: TCP Vegas; Mobile-IP; NS-2

PDF

Paper 22: Complexity of Network Design for Private Communication and the P-vs-NP Question

Abstract: We investigate infeasibility issues arising along network design for information-theoretically secure cryptography. In particular, we consider the problem of communication in perfect privacy and formally relate it to graph augmentation problems and the P-vs-NP-question. Based on a game-theoretic privacy measure, we consider two optimization problems related to secure infrastructure design with constraints on computational efforts and limited budget to build a transmission network. It turns out that information-theoretic security, although not drawing its strength from computational infeasibility, still can run into complexity-theoretic difficulties at the stage of physical network design. Even worse, if we measure (quantify) secrecy by the probability of information-leakage, we can prove that approximations of a network design towards maximal security are computationally equivalent to the exact solutions to the same problem, both of which are again equivalent to asserting that P = NP. In other words, the death of public-key cryptosystems upon P = NP may become the birth of feasible network design algorithms towards information-theoretically confidential communication.

Author 1: Stefan Rass

Keywords: Complexity; NP; Privacy; Security; Game Theory; Graph Augmentation

PDF

Paper 23: Energy Saving EDF Scheduling for Wireless Sensors on Variable Voltage Processors

Abstract: Advances in microtechnology have led to the development of miniaturized sensor nodes with wireless communication that perform several real-time computations. These systems are deployed wherever it is not possible to maintain a wired network infrastructure or to recharge/replace batteries, and the goal is then to prolong the lifetime of the system as much as possible. In this work, we modify the Earliest Deadline First (EDF) scheduling algorithm to minimize energy consumption using Dynamic Voltage and Frequency Selection. To this end, we propose an Energy Saving EDF (ES-EDF) algorithm that stretches the worst-case execution times of tasks as much as possible without violating deadlines. We prove that ES-EDF is optimal in minimizing processor energy consumption and maximum lateness, and we derive an upper bound on the processor energy saving. To demonstrate the benefits of our algorithm, we evaluate it by means of simulation. Experimental results show that ES-EDF outperforms the EDF and Enhanced EDF (E-EDF) algorithms in terms of both the percentage of feasible task sets and energy savings.
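
As background on stretching execution times under EDF, the sketch below shows the classical static slowdown rule for EDF with DVFS: if the total utilization U is at most 1, running at the normalized speed s = U stretches every worst-case execution time by 1/s while EDF still meets all deadlines. ES-EDF itself is a more elaborate scheme; this is only the standard baseline idea.

    # Classical static DVFS rule for EDF (baseline illustration, not ES-EDF).
    def static_edf_speed(tasks):
        """tasks: list of (wcet_at_full_speed, period); returns normalized speed."""
        utilization = sum(wcet / period for wcet, period in tasks)
        if utilization > 1.0:
            raise ValueError("task set not feasible even at full speed")
        return utilization        # lowest constant speed that keeps EDF feasible

    tasks = [(1.0, 10.0), (2.0, 20.0), (3.0, 30.0)]   # total utilization U = 0.3
    s = static_edf_speed(tasks)
    stretched = [(wcet / s, period) for wcet, period in tasks]
    print(s, stretched)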

Author 1: Hussein EL Ghor
Author 2: El-Hadi M Aggoune

PDF

Paper 24: Mobile Receiver-Assisted Localization Based on Selective Coordinates in Approach to Estimating Proximity for Wireless Sensor Networks

Abstract: Received signal strength (RSS)-based mobile localization has become popular because it offers inexpensive localization solutions for large areas. Compared with other physical properties of radio signals, RSS is an attractive basis for localization because it can easily be obtained through existing wireless devices without any additional hardware. Although RSS is not considered a good choice for estimating physical distances, it provides useful distance-related information for adding and indicating connectivity information between neighboring nodes. RSS-based localization is generally divided into range-based and range-free approaches. Range-based localization can achieve excellent accuracy but is too costly to apply to large-scale networks, while range-free methods are regarded as cost-effective solutions for localization in sensor networks. However, these localizations are subject to radio patterns that cause variations in the radial distance estimates between nodes, and it is a challenging task to select an efficient RSS value that yields small variations in the radial distance in wireless environments. We propose Mobile Localization using the Proximities of Selective coordinates (MoLPS), which localizes target nodes by using the proximities between target nodes and mobile receivers as a metric for estimating the locations of the target nodes. We ran a simulation experiment to assess the performance of MoLPS with 100 target nodes randomly deployed along a sensory field boundary. The results show that the localization error was reduced to below 2 m for more than 80% of the target nodes.
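
For context, the sketch below shows the standard log-distance path-loss inversion, which is the usual way an RSS reading is turned into a rough proximity estimate. The reference RSS and path-loss exponent are illustrative assumptions; the MoLPS selective-coordinate scheme itself is not shown.

    # Standard log-distance path-loss inversion (generic proximity estimate only).
    def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exp=2.5):
        """Invert RSS(d) = RSS(1m) - 10*n*log10(d) to estimate distance in meters."""
        return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exp))

    # A reading of -65 dBm maps to roughly 10 m under these assumed parameters.
    print(round(rss_to_distance(-65.0), 1))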

Author 1: Zulfazli Hussin
Author 2: Yukikazu Nakamoto

Keywords: Localization; proximity estimation; genetic algorithm; wireless sensor networks; received signal strength

PDF
