The Science and Information (SAI) Organization

IJACSA Volume 4 Issue 5

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: A Fuzzy Rough Rule Based System Enhanced By Fuzzy Cellular Automata

Abstract: Handling uncertain knowledge is a difficult problem, as the data we deal with is often uncertain, incomplete and even inconsistent. Finding an efficient intelligent framework for this kind of knowledge is a challenging task. The knowledge-based framework can be represented by a rule-based system that depends on a set of rules able to deal with uncertainty in the data. Fuzzy rough rules are a good candidate for handling uncertain cases. They consist of fuzzy rough variables in both the propositions and the consequences. The fuzzy rough variables represent the lower and upper approximations of the subsets of a fuzzy variable, and these fuzzy variables use labels (fuzzy subsets) instead of crisp values. An efficient fuzzy rough rule-based system must depend on good and accurate rules. Such a system also needs to be enhanced so that it can project future recommendations, in other words the behaviour of the system over time. This paper builds a rule-based system for uncertain knowledge, using fuzzy rough theory to generate the desired accurate rules, and then uses a fuzzy cellular automata parallel system to enhance the developed rule-based system and predict what the system would look like over time, so as to give good recommendations about the system in the future. The proposed model is illustrated along with experimental results and simulations of the rule-based systems of different data sets in time sequence.
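The lower and upper approximations mentioned above can be illustrated with a small sketch. The helper name, the data and the min/max aggregation below are assumptions for illustration, not taken from the paper:

```python
# Sketch of rough lower/upper approximations of a fuzzy set over an
# indiscernibility partition. Data and function names are hypothetical.

def rough_approximations(fuzzy_set, partition):
    """fuzzy_set: dict element -> membership degree in [0, 1].
    partition: list of blocks (sets of elements) induced by an
    indiscernibility relation.
    Returns (lower, upper): per-element min/max membership over its block."""
    lower, upper = {}, {}
    for block in partition:
        lo = min(fuzzy_set[x] for x in block)
        hi = max(fuzzy_set[x] for x in block)
        for x in block:
            lower[x] = lo   # membership that is certain
            upper[x] = hi   # membership that is possible
    return lower, upper

# Hypothetical universe of 4 cases with fuzzy membership in the label "High".
mu_high = {"a": 0.9, "b": 0.7, "c": 0.2, "d": 0.1}
blocks = [{"a", "b"}, {"c", "d"}]          # indiscernible pairs
low, up = rough_approximations(mu_high, blocks)
print(low["a"], up["a"])   # 0.7 0.9
```

A fuzzy rough rule's propositions would then be stated over these lower/upper bounds rather than over the raw memberships.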

Author 1: Mona Gamal
Author 2: Ahmed Abou El-Fetouh
Author 3: Shereef Barakat

Keywords: fuzzy rough reduction; fuzzy rough rules; fuzzy cellular automata; Self Organized Feature Maps (SOFM).

PDF

Paper 2: Advanced Personnel Vetting Techniques in Critical Multi-Tenant Hosted Computing Environments

Abstract: The emergence of cloud computing presents a strategic direction for critical infrastructures and promises to have far-reaching effects on their systems and networks, delivering better outcomes to nations at a lower cost. However, when considering cloud computing, government entities must address a host of security issues (such as malicious insiders) beyond those of service cost and flexibility. The scope and objective of this paper is to analyze, evaluate and investigate the insider threat to cloud security in sensitive infrastructures, and to propose two proactive socio-technical solutions for securing commercial and governmental cloud infrastructures. Firstly, it proposes an actionable framework, techniques and practices to ensure that disruptions caused by human threats are infrequent, of minimal duration, manageable, and cause the least damage possible. Secondly, it examines extreme security measures, analyzing and evaluating human-threat assessment methods for employee screening in certain high-risk situations using cognitive analysis technology, in particular functional Magnetic Resonance Imaging (fMRI). The significance of this research also lies in addressing human-rights and ethical dilemmas by presenting a set of ethical and professional guidelines. The main objective of this work is to analyze the related risks, identify countermeasures and present recommendations for developing a security awareness culture that will allow cloud providers to effectively utilize the benefits of these advanced techniques without sacrificing system security.

Author 1: Farhan Hyder Sahito
Author 2: Wolfgang Slany

Keywords: Cloud Computing; Human Threats; Multi Layered Security Strategy; Employee Screening; fMRI.

PDF

Paper 3: Fine Particulate Matter Concentration Level Prediction by using Tree-based Ensemble Classification Algorithms

Abstract: Pollutant forecasting is an important problem in the environmental sciences. Data mining is an approach to discovering knowledge from large data. This paper uses data mining methods to forecast the concentration level of PM2.5, an important air pollutant. There are several tree-based classification algorithms available in data mining, such as CART, C4.5, Random Forest (RF) and C5.0. RF and C5.0 are popular ensemble methods: RF builds on CART with Bagging, and C5.0 builds on C4.5 with Boosting. This paper builds PM2.5 concentration level predictive models based on RF and C5.0 using R packages. The data set covers the period 2000-2011 in a new town of Hong Kong. The PM2.5 concentration is divided into two levels; the critical point is 25 µg/m³ (24-hour mean). According to 100 runs of 10-fold cross-validation, the best testing accuracy is from the RF model, at around 0.845-0.854.
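The two-level labelling at the 25 µg/m³ critical point, and the majority vote that Bagging-style ensembles such as RF use at prediction time, can be sketched as follows. The readings and the simple vote combiner are illustrative assumptions; the paper's actual models are R implementations of RF and C5.0:

```python
# Minimal sketch: PM2.5 two-level labelling plus ensemble majority voting.
# Threshold is from the abstract; readings are hypothetical.

THRESHOLD = 25.0  # ug/m3, 24-hour mean critical point

def pm25_level(conc):
    """Binarise a 24-h mean PM2.5 concentration into 'low' / 'high'."""
    return "high" if conc > THRESHOLD else "low"

def majority_vote(predictions):
    """Combine an ensemble's votes, as Bagging does at prediction time."""
    return max(set(predictions), key=predictions.count)

readings = [12.0, 30.5, 24.9, 40.1]
labels = [pm25_level(c) for c in readings]
print(labels)                                   # ['low', 'high', 'low', 'high']
print(majority_vote(["high", "low", "high"]))   # high
```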

Author 1: Yin Zhao
Author 2: Yahya Abu Hasan

Keywords: Random Forest; C5.0; PM2.5 prediction; data mining.

PDF

Paper 4: Data Flow Sequences: A Revision of Data Flow Diagrams for Modelling Applications using XML

Abstract: Data Flow Diagrams (DFDs) were developed in the 1970s as a method of modelling data flow when developing information systems. While DFDs are still being used, modern web-based, client-server applications mean that DFDs are no longer as useful. This paper proposes a modified form of DFD that incorporates, amongst other features, sequences. The proposed notation, called Data Flow Sequences (DFS), is better able to model real-world systems in a way that simplifies application development. The paper also proposes an XML implementation for DFS, which allows analytical tools to be used to analyse DFS diagrams, and discusses a tool that is able to detect orphan data flow sequences and other potential problems.
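An XML encoding of a DFS makes mechanical checks like orphan detection straightforward. The element and attribute names below (`<dfs>`, `<process>`, `<flow>`, `from`/`to`/`seq`) are invented for illustration; the paper's actual schema may differ:

```python
# Hypothetical XML encoding of a Data Flow Sequence and an orphan check.
import xml.etree.ElementTree as ET

DFS_XML = """
<dfs>
  <process id="P1"/>
  <process id="P2"/>
  <flow from="P1" to="P2" seq="1"/>
  <flow from="P2" to="P3" seq="2"/>  <!-- P3 never declared: orphan -->
</dfs>
"""

def orphan_flows(xml_text):
    """Return the seq ids of flows whose endpoints are undeclared."""
    root = ET.fromstring(xml_text)
    declared = {p.get("id") for p in root.iter("process")}
    return [f.get("seq") for f in root.iter("flow")
            if f.get("from") not in declared or f.get("to") not in declared]

print(orphan_flows(DFS_XML))   # ['2']
```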

Author 1: James PH Coleman

Keywords: Data Flow Diagrams; Modelling diagrams; XML; Data Flow Sequence Diagrams

PDF

Paper 5: Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

Abstract: Authorship identification techniques are used to identify the most likely author of online messages from a group of potential suspects and to find evidence supporting that conclusion. Cybercriminals misuse online communication, for instance to send blackmail or spam email, and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. It is a highly interdisciplinary area, drawing on machine learning, information retrieval, and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify the authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.
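One stylometric feature commonly fed to classifiers such as SVMs in this literature is character n-gram frequency. The function below is a generic illustration of that feature, not a technique attributed to this survey:

```python
# Sketch of character n-gram frequency extraction, a common stylometric
# feature for authorship attribution. Example text is hypothetical.
from collections import Counter

def char_ngrams(text, n=3):
    """Relative frequencies of character n-grams in a message."""
    text = text.lower()
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

profile = char_ngrams("send the money now", n=3)
# Each author tends toward a characteristic n-gram distribution; comparing
# profiles (e.g. via cosine similarity) supports attribution.
```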

Author 1: Smita Nirkhi
Author 2: Dr. R. V. Dharaskar

Keywords: cyber crime; Author Identification; SVM

PDF

Paper 6: Privacy Impacts of Data Encryption on the Efficiency of Digital Forensics Technology

Abstract: Owing to a number of reasons, the deployment of encryption solutions is becoming ubiquitous at both organizational and individual levels. The most emphasized reason is the necessity of ensuring the confidentiality of privileged information. Unfortunately, encryption is also popular as cyber-criminals' escape route from the grasp of digital forensic investigations. The direct encryption of data, or indirect encryption of storage devices, more often than not prevents access to the information contained therein. This consequently leaves the forensics investigation team, and subsequently the prosecution, with little or no evidence to work with in sixty percent of such cases. However, it is unthinkable to jeopardize the successes brought by encryption technology to information security in favour of digital forensics technology. This paper examines what data encryption contributes to information security, and then highlights its contributions to the digital forensics of disk drives. The paper also discusses the available ways and tools in digital forensics to get around the problems posed by encryption. Particular attention is paid to the TrueCrypt encryption solution to illustrate the ideas discussed. The paper then compares encryption's contributions in both realms to justify the need for new technologies to forensically defeat data encryption as the only solution, whilst maintaining the privacy goals of users.

Author 1: Adedayo M. Balogun
Author 2: Shao Ying Zhu

Keywords: Encryption; Information Security; Digital Forensics; Anti-Forensics; Cryptography; TrueCrypt

PDF

Paper 7: Image Compression Using Real Fourier Transform, Its Wavelet Transform And Hybrid Wavelet With DCT

Abstract: This paper proposes a new image compression technique that uses the Real Fourier Transform. The Discrete Fourier Transform (DFT) contains complex exponentials, i.e. both cosine and sine functions, and therefore produces complex values in its output. To avoid these complex values, the complex terms in the Fourier Transform are eliminated. This can be done by using the coefficients of the Discrete Cosine Transform (DCT) and the Discrete Sine Transform (DST). Both DCT and DST remain orthogonal after sampling, and both are equivalent to an FFT of a data sequence of twice the length. The DCT uses real and even functions, while the DST uses real and odd functions, which correspond to the imaginary part of the Fourier Transform. Since the coefficients of both DCT and DST contain only real values, the Fourier Transform obtained using DCT and DST coefficients also contains only real values. This transform, called the Real Fourier Transform, is applied to colour images. RMSE values are computed for the column, row and full Real Fourier Transform. A wavelet transform of size N²×N² is generated using the N×N Real Fourier Transform. A Hybrid Wavelet Transform is also generated by combining the Real Fourier Transform with the Discrete Cosine Transform. The performance of these three transforms is compared using RMSE as a performance measure. It has been observed that the full hybrid wavelet transform obtained by combining the Real Fourier Transform and DCT gives the best performance of all, beating the performance of the full DCT Wavelet Transform. The reconstructed image quality obtained with the Real Fourier-DCT Full Hybrid Wavelet Transform is superior to that obtained with the DCT, DCT Wavelet and DCT Hybrid Wavelet Transforms.
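The building blocks named above can be sketched directly. Normalisation conventions for the DCT/DST vary between texts; the ones below are common textbook forms (DCT-II and DST-I) and are not necessarily the paper's exact definitions:

```python
# Illustrative N-point DCT-II, DST, and the RMSE quality measure.
import math

def dct2(x):
    """Unnormalised DCT-II of a real sequence (real, even basis)."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N)
                for n in range(N)) for k in range(N)]

def dst(x):
    """Unnormalised DST-I of a real sequence (real, odd basis)."""
    N = len(x)
    return [sum(x[n] * math.sin(math.pi * (n + 1) * (k + 1) / (N + 1))
                for n in range(N)) for k in range(N)]

def rmse(a, b):
    """Root-mean-square error between two equal-length signals."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)) / len(a))

x = [1.0, 2.0, 3.0, 4.0]
print(dct2(x)[0])   # DC term: the plain sum of the samples
print(rmse(x, x))   # 0.0 -- identical signals
```

Stacking the DCT coefficients (real part) with the DST coefficients (in place of the imaginary part) is the idea behind the all-real transform described in the abstract.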

Author 1: Dr. H. B. Kekre
Author 2: Dr. Tanuja Sarode
Author 3: Prachi Natu

Keywords: Real Fourier Transform; Hybrid Wavelet Transform; DCT

PDF

Paper 8: Building Low Cost Cloud Computing Systems

Abstract: Current models of cloud computing are based on extravagant hardware solutions whose implementation and maintenance are unaffordable for the majority of service providers. The use of jail services is an alternative to current models of cloud computing based on virtualization. Models based on jail environments, instead of the virtualization systems currently used, will provide huge gains in the optimization of hardware resources at the computation level, as well as in storage and energy consumption. This paper addresses the practical implementation of jail environments in real scenarios, which highlights the areas where their application will be relevant and will make inevitable the redefinition of the models currently defined for cloud computing. In addition, it will bring new opportunities for the development of support features for jail environments in the majority of operating systems.

Author 1: Carlos Antunes
Author 2: Ricardo Vardasca

Keywords: cloud computing; IAAS; jail environments; optimization; PAAS.

PDF

Paper 9: Application of a multiple linear regression model and neural network for wear prediction of grinding mill liners

Abstract: The liner of an ore grinding mill is a critical component in the grinding process, necessary for both high metal recovery and shell protection. From an economic point of view, it is important to keep mill liners in operation as long as possible, minimising the downtime for maintenance or repair; predicting their wear is therefore crucial. This paper tests different methods of predicting wear in terms of the remaining height and remaining life of the liners. The key concern is to make replacement and maintenance decisions without stopping the mill for extra inspection, as this leads to financial savings. The paper applies multiple linear regression and artificial neural network (ANN) techniques to determine the most suitable methodology for predicting wear. The advantages of the ANN model over the traditional multiple regression approach include its higher accuracy.
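The regression baseline can be sketched in closed form. One predictor (operating time) stands in for the paper's regressors, and the liner-height series is hypothetical; a real model would use several measured variables:

```python
# Minimal ordinary-least-squares fit, the core of the multiple-regression
# baseline that the ANN is compared against. Data are hypothetical.

def fit_line(xs, ys):
    """Closed-form OLS for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

hours  = [0, 100, 200, 300]        # operating time (hypothetical)
height = [50.0, 48.0, 46.0, 44.0]  # remaining liner height, mm
a, b = fit_line(hours, height)
print(a, b)   # 50.0 -0.02  -> a wear rate of 0.02 mm/h
# Remaining life down to a 40 mm replacement limit: (40 - a) / b = 500 h
```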

Author 1: Farzaneh Ahmadzadeh
Author 2: Jan Lundberg

Keywords: Wear prediction; Remaining useful life; Artificial neural network; Principal Component Analysis; Maintenance scheduling; Condition Monitoring.

PDF

Paper 10: DCaaS: Data Consistency as a Service for Managing Data Uncertainty on the Clouds

Abstract: Ensuring data correctness over partitioned distributed database systems is a classical problem. Classical solutions mainly adopt locking or blocking techniques, which are not suitable for cloud environments because they produce terrible response times, due to the long latency and faultiness of wide-area network connections among cloud datacenters. One way to improve performance is to restrict the access of user bases to specific datacenters and avoid data sharing between datacenters. However, conflicts might appear when data is replicated between datacenters, and change-propagation timeliness is not guaranteed. Such problems create data uncertainty in cloud environments. Managing data uncertainty is one of the main obstacles to supporting global distributed transactions on the clouds. To overcome this problem, this paper proposes a quota-based approach for managing data uncertainty on the clouds that guarantees global data correctness without global locking or blocking. To decouple service developers from the hassles of managing data uncertainty, we propose a new platform service (Data Consistency as a Service, DCaaS) that encapsulates the proposed approach. The DCaaS service also ensures the cloud portability of SaaS services, as it works as a cloud adapter between SaaS service instances. Experiments show that the proposed approach, realized by the DCaaS service, provides much better response times than classical locking and blocking techniques.

Author 1: Islam Elgedawy

Keywords: clouds; cloudlet; cloud adapter; data uncertainty; DCaaS; SaaS; PaaS

PDF

Paper 11: A Computational Model of Extrastriate Visual Area MT on Motion Perception

Abstract: The human visual system is sensitive to motion perception in complex scenes. Building motion attention models similar to the human visual attention system should be very beneficial to computer vision and machine intelligence; meanwhile, it has been a challenging task due to the complexity of the human brain and our limited understanding of the mechanisms underlying the human visual system. This paper computationally models the motion perception mechanisms of the extrastriate visual middle temporal (MT) area, a region sensitive to motion perception. The model can explain the attention selection mechanism and visual motion perception to some extent. With the proposed model, we analyse motion perception in daytime scenes with single or multiple moving objects, and then mimic the visual attention process, consisting of attention shifts and eye fixations, against a motion-feature map. The model produced similar gist perception outputs in our experiments when daytime and nocturnal images of the same scene were processed. Finally, we outline future directions of this research.

Author 1: Jiawei Xu
Author 2: Shigang Yue

Keywords: Motion perception; daytime and nocturnal scenes; spatio-temporal phase

PDF

Paper 12: A Modified Clustering for the LEACH Algorithm in WSN

Abstract: Node clustering and data aggregation are popular techniques for reducing energy consumption in large Wireless Sensor Networks (WSNs). Cluster-based routing remains a hot research area in wireless sensor networks. The classical LEACH protocol has many advantages in energy efficiency, data aggregation and so on. However, determining the number of clusters present in a network is an important problem. Conventional clustering techniques generally assume this parameter to be user-supplied, and very few techniques satisfactorily solve the problem of automatically detecting the number of clusters. Some of these techniques rely on user-supplied information, while others use cluster validity indices. In this paper, we propose a rather simple method to identify a number of clusters that gives satisfactory results. The proposed method is compared with the classical LEACH protocol and found to give better results.
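For context, the classical LEACH cluster-head election that the modified clustering builds on uses a rotating probability threshold; the desired cluster-head fraction P is exactly the kind of user-supplied parameter the paper aims to estimate. The code below is the standard textbook threshold, not the paper's modification:

```python
# Classical LEACH cluster-head election threshold T(n).
# P: desired fraction of cluster heads; r: current round number.
import random

def leach_threshold(P, r):
    """T(n) for a node that has not been a cluster head in the
    last 1/P rounds (nodes outside that set use T(n) = 0)."""
    return P / (1 - P * (r % round(1 / P)))

def elects_itself(P, r, rng=random.random):
    """A node becomes cluster head when its random draw falls below T(n)."""
    return rng() < leach_threshold(P, r)

print(leach_threshold(0.05, 0))    # 0.05 in round 0
print(leach_threshold(0.05, 19))   # 1.0 -- every remaining node elected
```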

Author 1: B. Brahma Reddy
Author 2: K.Kishan Rao

Keywords: Clustering Index; LEACH; Wireless Sensor Networks; Energy optimization; Network lifetime

PDF

Paper 13: Jabber-based Cross-Domain Efficient and Privacy-Ensuring Context Management Framework

Abstract: In pervasive environments, context-aware applications require global knowledge of the context information distributed across different spatial domains in order to establish context-based interactions. Therefore, the design of distributed storage, retrieval, and dissemination mechanisms for context information across domains becomes vital. In such environments, we envision the necessity of collaboration between context servers distributed in different domains, and thus the need for generic APIs and a protocol allowing context-information exchange between different entities: context servers, context providers, and context consumers. As a solution, this paper proposes ubique, a distributed middleware for context-aware computing that allows applications to maintain domain-based context interests and to access context information about users, places, events, and things, all made available by or brokered through the home domain server. The paper also proposes a new cross-domain protocol for context management that ensures the privacy and the efficiency of context-information dissemination. It is robustly built upon the Jabber protocol, a widely adopted open protocol for instant messaging designed for near-real-time communication. Simulation and experimental results show that the ubique framework supports robust cross-domain context management and collaboration.

Author 1: Zakwan Jaroucheh
Author 2: Xiaodong Liu
Author 3: Sally Smith

Keywords: pervasive computing; cross-domain context management; context modeling; Jabber protocol; privacy.

PDF

Paper 14: An efficient user scheduling scheme for downlink Multiuser MIMO-OFDM systems with Block Diagonalization

Abstract: The combination of multiuser multiple-input multiple-output (MU-MIMO) technology with orthogonal frequency division multiplexing (OFDM) is an attractive solution for next generation of wireless local area networks (WLANs), currently standardized within IEEE 802.11ac, and the fourth-generation (4G) mobile cellular wireless systems to achieve a very high system throughput while satisfying quality of service (QoS) constraints. In particular, Block Diagonalization (BD) scheme is a low-complexity precoding technique for MU-MIMO downlink channels, which completely pre-cancels the multiuser interference. The major issue of the BD scheme is that the number of users that can be simultaneously supported is limited by the ratio of the number of base station transmit antennas to the number of user receive antennas. When the number of users is large, a subset of users must be selected, and selection algorithms should be designed to maximize the total system throughput. In this paper, the BD technique is extended to MU-MIMO-OFDM systems and a low complexity user scheduling algorithm is proposed to find the optimal subset of users that should transmit simultaneously, in light of the instantaneous channel state information (CSI), such that the total system sum-rate capacity is maximized. Simulation results show that the proposed scheduling algorithm achieves a good trade-off between sum-rate capacity performance and computational complexity.
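The zero-interference property of BD, and the antenna-ratio limit on simultaneously served users mentioned above, follow from the standard BD formulation, which can be sketched as:

```latex
% User i's precoder W_i must lie in the null space of all other users'
% stacked channels, so multiuser interference is pre-cancelled:
\tilde{\mathbf{H}}_i \mathbf{W}_i = \mathbf{0},
\qquad
\tilde{\mathbf{H}}_i =
  \big[\mathbf{H}_1^{T}\,\cdots\,\mathbf{H}_{i-1}^{T}\,
       \mathbf{H}_{i+1}^{T}\,\cdots\,\mathbf{H}_{K}^{T}\big]^{T}.
% A non-trivial null space exists only when the base-station transmit
% antennas exceed the aggregate receive antennas of the co-scheduled users:
N_t > \sum_{j \neq i} N_{r,j}.
```

This feasibility condition is why, with many users, a subset must be scheduled, which is the selection problem the proposed algorithm addresses.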

Author 1: Mounir Esslaoui
Author 2: Mohamed Essaaidi

Keywords: MU-MIMO; OFDM; scheduling; precoding; Block Diagonalization

PDF

Paper 15: QoS Comparison of BNP Scheduling Algorithms with an Expanded Fuzzy System

Abstract: Parallel processing is a field in which different systems run together to save processing time and to increase system performance. It has also been seen to work, to some extent, on the load-balancing principle. Previous algorithms such as HLFET, MCP, DLS and ETF have shown that they can reduce the burden on the processor through simultaneous execution. In our research work, we combine HLFET, MCP, DLS and ETF with fuzzy logic to examine the effect on the parameters taken from previous work, namely makespan, SLR, speedup and processor utilization. It has been found that the fuzzy logic system works better than any single algorithm.

Author 1: Amita Sharma
Author 2: Harpreet Kaur

Keywords: Parallel Processing; DAG; BNP; Fuzzy logic; Multiprocessor.

PDF

Paper 16: LASyM: A Learning Analytics System for MOOCs

Abstract: Nowadays, the Web has revolutionized our vision of how to deliver courses in a radically transformed and enhanced way. Boosted by cloud computing, the use of the Web in education has revealed new challenges and new aspirations, such as MOOCs (Massive Open Online Courses), a technology-led revolution ushering in a new generation of learning environments. Expected to deliver effective education strategies, pedagogies and practices that lead to student success, massive open online courses, sometimes considered the "Linux of education", are increasingly developed by elite US institutions such as MIT, Harvard and Stanford, supplying open/distance learning to a large online community without any fees. MOOCs have the potential to enable free university-level education on an enormous scale. Nevertheless, a concern often raised about MOOCs is that while thousands enrol for courses, only a very small proportion of learners complete them. In this paper, we present LASyM, a learning analytics system for massive open online courses. The system is a Hadoop-based one whose main objective is to provide learning analytics for MOOC communities, as a means to help them investigate the massive raw data generated by MOOC platforms around learning outcomes and assessments, and to reveal any useful information to be used in designing learning-optimized MOOCs. To evaluate the effectiveness of the proposed system, we developed a method to identify, with low latency, online learners more likely to drop out.

Author 1: Yassine Tabaa
Author 2: Abdellatif Medouri

Keywords: Cloud Computing; MOOCs; Hadoop; Learning Analytics.

PDF

Paper 17: Research on Chinese University Students’ Media Images

Abstract: At present, university students, as the "post-90" generation of young intellectuals, receive broad attention from the mass media. Nevertheless, university students' public image is in decline, as negative news about them appears ceaselessly. Contemporary university students have become a group at which the media gazes fixedly, and this sustained media gaze constructs university students' media images. This kind of media behaviour affects public judgments of university students' images; in the eyes of the public, those images have become seriously distorted.

Author 1: Chengliang Zhang
Author 2: Haifei Yu

Keywords: University students’ media image; Content analytic method; the public opinion; Synergistic effect

PDF

Paper 18: Secure Medical Image Sharing over a Cloud Computing Environment

Abstract: Nowadays, many applications have appeared due to the rapid development of telecommunications. One of these applications is telemedicine, where patients' digital data can be transferred between doctors for further diagnosis. The protection of the exchanged medical data is therefore essential, especially when transferring these data over an insecure medium such as the cloud computing environment, where security is considered a major issue. In this paper, two security approaches are presented to guarantee the secure sharing of medical images over the cloud computing environment, by providing a means of trust management between the authorized parties and by allowing the private sharing of Electronic Patients' Records string data between those parties while preserving the shared medical image from distortion. The first approach applies a spatial watermarking technique, while the second implements hybrid spatial and transform-domain techniques to achieve the desired goal. The experimental results show the efficiency of the proposed approaches and their robustness against various types of attacks.
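A common form of spatial watermarking, the family of techniques the first approach is based on, embeds payload bits in pixel least-significant bits. The sketch below is a generic LSB illustration with hypothetical pixel values; the paper's exact embedding scheme may differ:

```python
# Sketch of spatial (LSB) watermarking on a flat list of gray levels.
# A real image would be a 2-D array; data here are hypothetical.

def embed_bits(pixels, bits):
    """Hide one payload bit in the least-significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract_bits(pixels, n):
    """Recover the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

cover = [200, 201, 198, 197, 120, 121]   # hypothetical gray levels
mark  = [1, 0, 1, 1, 0, 1]               # e.g. bits of an EPR string
stego = embed_bits(cover, mark)
print(extract_bits(stego, 6))            # [1, 0, 1, 1, 0, 1]
# Each pixel changes by at most 1, which keeps the visible distortion
# of the shared medical image minimal.
```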

Author 1: Fatma E.-Z. A. Elgamal
Author 2: Noha A. Hikal
Author 3: F.E.Z. Abou-Chadi

Keywords: Cloud computing; Electronic Patients' Records; Cloud drops; encryption; spatial synchronization; authentication; Hybrid image watermarking; spatial watermarking; Discrete cosine Transform

PDF

Paper 19: Revisit of Logistic Regression

Abstract: Logistic regression (LR) is widely applied as a powerful classification method in various fields, and a variety of optimization methods have been developed for it. To cope with large-scale problems, an efficient optimization method for LR is required in terms of computational cost and memory usage. In this paper, we propose an efficient optimization method using non-linear conjugate gradient (CG) descent. In each CG iteration, the proposed method employs an optimized step size without exhaustive line search, which significantly reduces the number of iterations and makes the whole optimization process fast. In addition, on the basis of this CG-based optimization scheme, a novel optimization method for kernel logistic regression (KLR) is proposed. Unlike ordinary KLR methods, the proposed method optimizes the kernel-based classifier, which is naturally formulated as a linear combination of sample kernel functions, directly in the reproducing kernel Hilbert space (RKHS) rather than over the linear coefficients. Subsequently, we also propose multiple-kernel logistic regression (MKLR) along with the optimization of KLR. MKLR effectively combines multiple types of kernels while optimizing the weights for the kernels in the logistic regression framework. These proposed methods are all based on CG-based optimization and matrix-matrix computation, which is easily parallelized, for example by using multi-threaded programming. In experiments on multi-class classification using various datasets, the proposed methods exhibit favorable performance in terms of classification accuracy and computation time.
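The nonlinear-CG structure can be illustrated on binary LR. This is a simplified sketch: it uses a Polak-Ribière direction with a crude backtracking search standing in for the paper's analytically optimized step size, and a hypothetical toy dataset:

```python
# Simplified binary logistic regression trained with nonlinear CG
# (Polak-Ribiere+ with restart). Illustrative only.
import math

def sigmoid(z):
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def loss(w, X, y):
    """Negative log-likelihood for labels in {0, 1}, computed stably."""
    total = 0.0
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi))
        total += max(z, 0.0) + math.log1p(math.exp(-abs(z))) - yi * z
    return total

def grad(w, X, y):
    g = [0.0] * len(w)
    for xi, yi in zip(X, y):
        err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
        for j, xj in enumerate(xi):
            g[j] += err * xj
    return g

def train_cg(X, y, iters=100):
    w = [0.0] * len(X[0])
    g = grad(w, X, y)
    d = [-gi for gi in g]
    for _ in range(iters):
        # Backtracking stand-in for the paper's optimised step size.
        step, base = 1.0, loss(w, X, y)
        while step > 1e-10:
            w_new = [wj + step * dj for wj, dj in zip(w, d)]
            if loss(w_new, X, y) < base:
                break
            step *= 0.5
        g_new = grad(w_new, X, y)
        beta = max(0.0, sum(gn * (gn - go) for gn, go in zip(g_new, g))
                        / max(sum(go * go for go in g), 1e-12))
        d = [-gn + beta * dj for gn, dj in zip(g_new, d)]
        w, g = w_new, g_new
    return w

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # bias + one feature
y = [0, 0, 1, 1]
w = train_cg(X, y)
preds = [1 if sigmoid(sum(a * b for a, b in zip(w, xi))) > 0.5 else 0
         for xi in X]
print(preds)   # matches y on this separable toy set
```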

Author 1: Takumi Kobayashi
Author 2: Kenji Watanabe
Author 3: Nobuyuki Otsu

PDF

Paper 20: Study of Current Femto-Satellite Approaches

Abstract: The success of space technology evolves with technological progress in the density of CMOS (Complementary Metal-Oxide-Semiconductor) integration and MEMS (Micro-Electro-Mechanical Systems) [4]. The need for space services has grown significantly due to several factors such as population increase, telecommunication applications, climate change, earth monitoring and observation, military goals, and so on. To cover this need, generations of space vehicles, dedicated computers and femto-cell systems have been developed. More recently, Ultra-Small Satellites (USS) have been proposed, and different approaches to developing this kind of space system have been presented in the literature. These miniature satellites are capable of flying in space and providing different services such as imagery, measurements and communications [4, 9, 10]. This paper studies two different USS femto-satellite architectures that exist in the literature, in order to propose a future architecture that optimizes power consumption and improves communication service quality.

Author 1: Nizar Tahri
Author 2: Chafaa Hamrouni
Author 3: Adel M. Alimi

Keywords: Femtosatellite; Communication; Spacecraft

PDF

Paper 21: Neural Network based Mobility aware Prefetch Caching and Replacement Strategies in Mobile Environment

Abstract: Location Based Services (LBS) have changed the way mobile applications access and manage the Mobile Database System (MDS). Caching frequently accessed data in the mobile database environment is an effective technique for improving MDS performance. The cache size limitation calls for an optimized cache replacement algorithm to find a suitable subset of items for eviction from the cache. In a wireless environment, mobile clients move freely from one location to another, and their access pattern exhibits temporal-spatial locality. To ensure efficient cache utilization, it is important to consider the movement direction, current and future location, cache invalidation and optimized prefetching of mobile clients when performing cache replacement. This paper proposes a Neural Network based Mobility aware Prefetch Caching and Replacement policy (NNMPCR) in a mobile environment to manage LBS data. The NNMPCR policy employs a neural network prediction system that is able to capture some of the spatial patterns exhibited by users moving in a wireless environment; it is used to predict the future behavior of the mobile client. A cache-miss-initiated prefetch is used to reduce future misses, and a valid-scope invalidation technique is used for cache invalidation. This makes the policy adaptive to clients' movement behavior and optimizes performance compared to earlier policies.
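The cache-miss-initiated prefetch idea can be sketched with a toy cache. The neighbour-prefetch rule below stands in for NNMPCR's neural-network prediction, and the LRU eviction is a simplification of the paper's replacement policy:

```python
# Toy LRU cache with a cache-miss-initiated prefetch. The predict()
# callback is a stand-in for the neural-network location predictor.
from collections import OrderedDict

class PrefetchCache:
    def __init__(self, capacity, predict):
        self.capacity = capacity
        self.predict = predict          # item -> item likely needed next
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def _put(self, key):
        self.store[key] = True
        self.store.move_to_end(key)
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

    def get(self, key):
        if key in self.store:
            self.hits += 1
            self.store.move_to_end(key)
        else:
            self.misses += 1
            self._put(key)
            self._put(self.predict(key))     # prefetch on miss
        return key

cache = PrefetchCache(4, predict=lambda k: k + 1)   # "next data item"
for k in [1, 2, 3, 2, 3, 4]:
    cache.get(k)
print(cache.hits, cache.misses)   # 4 2 -- prefetching turned the
                                  # accesses to 2 and 4 into hits
```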

Author 1: Hariram Chavan
Author 2: Suneeta Sane
Author 3: H. B. Kekre

Keywords: Location Based Services; Caching; backpropagation; cache-miss-initiated prefetch; cache replacement policy.

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org