The Science and Information (SAI) Organization
IJACSA Volume 4 Issue 4

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: E-learning in Higher Educational Institutions in Kuwait: Experiences and Challenges

Abstract: E-learning as an organizational activity started in the developed countries, and as such, the adoption models and experiences in the developed countries are taken as a benchmark in the literature. This paper investigated the barriers that affect or prevent the adoption of e-learning in higher educational institutions in Kuwait as an example of a developing country, and compared them with those found in developed countries. Semi-structured interviews were used to collect the empirical data from academics and managers in higher educational institutions in Kuwait. The research findings showed that the main barriers in Kuwait were lack of management awareness and support, technological barriers, and language barriers. From those, two barriers were specific to Kuwait (lack of management awareness and language barriers) when compared with developed countries. Recommendations for decision makers and suggestions for further research are also considered in this study.

Author 1: Mubarak M Alkharang
Author 2: George Ghinea

Keywords: e-learning; higher education; adoption; Kuwait; developed countries; e-learning barriers.

PDF

Paper 2: A Cost-Efficient and Reliable Resource Allocation Model Based on Cellular Automaton Entropy for Cloud Project Scheduling

Abstract: Resource allocation optimization is a typical cloud project scheduling problem: a problem that limits a cloud system’s ability to execute and deliver a project as originally planned. Entropy, as a measure of the degree of disorder in a system, is an indicator of a system’s tendency to progress out of order and into a chaotic condition, and it can thus serve to measure a cloud system’s reliability for project scheduling. In this paper, a cellular automaton is used to model the complex cloud project scheduling system. Additionally, a method is presented to analyze the reliability of the cloud scheduling system by measuring the average resource entropy (ARE). Furthermore, a new cost-efficient and reliable resource allocation (CERRA) model is proposed based on cellular automaton entropy to aid decision makers in planning projects on the cloud. Finally, the proposed model is designed using the Matlab toolbox and simulated with three basic cloud scheduling algorithms: First Come First Served (FCFS), Min-Min, and Max-Min. The simulation results show that the proposed model can yield a cost-efficient and reliable resource allocation strategy for running projects in a cloud environment.

Author 1: Huankai Chen
Author 2: Frank Wang
Author 3: Na Helian

Keywords: Resource Allocation; Cloud Project Scheduling; Entropy; Cellular Automaton; Cost-efficiency; Reliability; Complex System; Local Activity; Global Order; Disorder

PDF
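To make the entropy idea concrete, the sketch below computes the Shannon entropy of a normalized resource-load distribution: balanced use of resources gives maximal entropy, concentration on one resource gives zero. This is only a stand-in illustration; the paper's ARE is defined on a cellular automaton model of the scheduling system, which is its own contribution.

```python
import math

def resource_entropy(loads):
    """Shannon entropy (in bits) of the normalized load distribution across
    resources; a simple stand-in for the paper's cellular-automaton-based
    average resource entropy (ARE)."""
    total = sum(loads)
    probs = [load / total for load in loads if load > 0]
    return -sum(p * math.log2(p) for p in probs)

balanced = resource_entropy([25, 25, 25, 25])  # maximal: log2(4) = 2 bits
skewed = resource_entropy([97, 1, 1, 1])       # close to zero
```

How an entropy value maps onto a reliability judgment for a given scheduling algorithm is exactly what the CERRA model itself decides.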

Paper 3: Analysis of an Automatic Accessibility Evaluator to Validate a Virtual and Authenticated Environment

Abstract: This article’s objective is to analyze an automatic validation tool compatible with the Web Content Accessibility Guidelines (WCAG) 2.0 in an authenticated environment. The authenticated environment of Moodle, an open-source platform created for educational settings, was used as the test platform. Initially, a brief conceptualization of accessibility and the operation of these guidelines is given, and then the software to be tested was chosen: WAVE. In the next step, the tool’s operation was evaluated and the study’s analysis was carried out, which allowed a comparison between the errors testable by WAVE and the WCAG 2.0 guidelines. The research concluded that WAVE performed well, even though it does not cover several WCAG 2.0 guidelines and does not classify its results within the accessibility principles of the Web Accessibility Initiative (WAI). The tool also proved more suitable for developers than for ordinary users, who have no knowledge of Web programming languages.

Author 1: Elisa Maria Pivetta
Author 2: Carla Flor
Author 3: Daniela Satomi Saito
Author 4: Vania Ribas Ulbricht

Keywords: automatic validation tool; WCAG 2.0; accessibility; WAVE

PDF

Paper 4: Narrowing Down Learning Research: Technical Documentation in Information Systems Research

Abstract: Learning how to use technical products is of high interest to customers as well as businesses. Besides product usability, technical documentation in various forms plays a major role in the acceptance of innovative products. Software applications partly integrate personalized learning strategies, but recent developments in information and communication technology extend these potentials to the non-software sector too. Mobile devices such as smartphones allow linking the physical and virtual worlds and are thereby suitable instruments for product learning and the application of adequate learning theories. Very few scientific publications accurately addressing the learning of product features and functionalities can be found. By applying a research profiling approach as a stepwise analysis of available publications, relevant learning paradigms and their corresponding scientific areas are identified. As this research topic relates to marketing as well as information systems research, the applied approach may also prove beneficial for other interdisciplinary endeavors.

Author 1: Thomas Puchleitner

Keywords: problem-based learning; self-regulated learning; self-directed learning; product learning; customer learning; consumer learning

PDF

Paper 5: Detection and Isolation of Packet Dropping Attacker in MANETs

Abstract: Several approaches have been proposed for Intrusion Detection Systems (IDS) in Mobile Ad hoc Networks (MANETs). Due to their lack of infrastructure and a well-defined perimeter, MANETs are susceptible to a variety of attacker types. To develop a strong security mechanism, it is necessary to understand how malicious nodes can attack MANETs. A new IDS mechanism is presented based on End-to-End connections for securing the Optimized Link State Routing (OLSR) protocol. This mechanism is named Detection and Isolation of Packet Dropping Attackers in MANETs (DIPDAM). DIPDAM is based on three IDS messages, Path Validation Message (PVM), Attacker Finder Message (AFM), and Attacker Isolation Message (AIM), and on End-to-End (E2E) communication between the source and the destination. The simulation results showed that the proposed mechanism is able to detect any number of attackers while keeping a reasonably low overhead in terms of network traffic.

Author 1: Ahmed Mohamed Abdalla
Author 2: Ahmad H. Almazeed
Author 3: Imane Aly Saroit
Author 4: Amira Kotb

Keywords: MANETS; IDS; OLSR; DIPDAM

PDF

Paper 6: Comparative Analysis of K-Means and Fuzzy C-Means Algorithms

Abstract: In the software arena, data mining technology has been considered a useful means of identifying patterns and trends in large volumes of data. This approach is basically used to extract unknown patterns from large sets of data for business as well as real-time applications. It is a computational intelligence discipline which has emerged as a valuable tool for data analysis, new knowledge discovery, and autonomous decision making. The raw, unlabeled data from a large dataset can be classified initially in an unsupervised fashion by using cluster analysis, i.e., clustering: the assignment of a set of observations into clusters so that observations in the same cluster may in some sense be treated as similar. The outcome of the clustering process and the efficiency of its domain application are generally determined through algorithms, and various algorithms are used to solve this problem. In this research work, two important clustering algorithms, namely centroid-based K-Means and representative-object-based FCM (Fuzzy C-Means), are compared. These algorithms are applied and their performance is evaluated on the basis of the efficiency of the clustering output. The number of data points as well as the number of clusters are the factors upon which the behaviour patterns of both algorithms are analyzed. FCM produces results close to K-Means clustering, but it still requires more computation time than K-Means clustering.

Author 1: Soumi Ghosh
Author 2: Sanjay Kumar Dubey

Keywords: clustering; k-means; fuzzy c-means; time complexity

PDF
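For readers who want the comparison in outline, below is a minimal NumPy sketch of both algorithms on toy data. The initialization scheme, iteration counts, fuzzifier m = 2, and the two-blob test data are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain K-Means: hard assignment of each point to its nearest centroid."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """FCM: every point holds a graded membership in every cluster; centers
    are membership-weighted means. The extra membership updates are where
    FCM's higher computation time comes from."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=c, replace=False)]
    U = None
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = dists ** (-2.0 / (m - 1))
        U /= U.sum(axis=1, keepdims=True)
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
    return centers, U

# two well-separated toy blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
km_centers, km_labels = kmeans(X, 2)
fcm_centers, memberships = fuzzy_c_means(X, 2)
```

On well-separated data both recover the same centers, matching the paper's observation that FCM's results are close to K-Means while costing more per iteration.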

Paper 7: Developing a Stochastic Input Oriented Data Envelopment Analysis (SIODEA) Model

Abstract: Data Envelopment Analysis (DEA) is a powerful quantitative tool that provides a means to obtain useful information about the efficiency and performance of firms, organizations, and all sorts of functionally similar, relatively autonomous operating units, known as Decision Making Units (DMUs). Usually the investigated DMUs are characterized by a vector of multiple inputs and multiple outputs. Unfortunately, not all inputs and/or outputs are deterministic; some could be stochastic. The main concern in this paper is to develop an algorithm to help any organization evaluate its performance given that some inputs are stochastic. The developed algorithm is for a Stochastic Input Oriented Model based on Chance Constrained Programming, where the stochastic inputs are normally distributed, while the remaining inputs and all outputs are deterministic.

Author 1: Basma E. El-Demerdash
Author 2: Ihab A. El-Khodary
Author 3: Assem A. Tharwat

Keywords: Data Envelopment Analysis; Stochastic Variables; Input Oriented; Performance Measure; Efficiency Measurement.

PDF
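The core move in chance-constrained programming with normal inputs is to replace a stochastic input by its deterministic equivalent, the mean shifted by a quantile of the normal distribution. The sketch below shows only that standard step, with illustrative numbers; the full stochastic input-oriented DEA linear program is the paper's contribution.

```python
from statistics import NormalDist

def deterministic_equivalent(mean, std, alpha=0.05):
    """Chance-constrained conversion for a normally distributed input: a
    constraint required to hold with probability 1 - alpha is rewritten
    using the input value mean + z_{1-alpha} * std."""
    z = NormalDist().inv_cdf(1 - alpha)
    return mean + z * std

# e.g. a stochastic input with mean 10 and std 2, enforced at 95% confidence
x_eff = deterministic_equivalent(10, 2, alpha=0.05)  # about 13.29
```

Once every stochastic input is converted this way, the model can be solved with an ordinary deterministic DEA solver.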

Paper 8: Studies and a Method to Minimize and Control the Jitter in Optical Based Communication System

Abstract: In recent years, optical communication systems have been used extensively as attractive solutions to the increasing data rates in telecommunication systems and various other applications. At present, two types of communication schemes are mainly used in data communication, namely asynchronous transmission and synchronous transmission, depending on their timing and frame format. Both transmission systems, however, face serious complications from jitter in data propagation. Jitter can degrade the performance of a transmission system by introducing bit errors and uncontrolled offsets or displacements in the digital signals, and it is especially problematic in high data rate systems. Jitter must be minimized in the communication system; otherwise it also degrades the performance of the systems interconnected with the main circuit. This happens due to improper synchronization or management of the clock scheme in the communication system: an improperly organized clock scheme propagates faulty data and clock signals to all other interconnected circuits. In the present work, a new clock scheme is discussed to minimize jitter in data propagation.

Author 1: N. Suresh Kumar
Author 2: R. Sridevi
Author 3: Dr. D.V. R. K. Reddy
Author 4: V. Sridevi

Keywords: Optics; Jitter; pipeline; Clock; propagation delay; high speed data.

PDF

Paper 9: Novel Steganography System using Lucas Sequence

Abstract: Steganography is the process of embedding data into a media form such as image, voice, or video. The major methods used for data hiding operate in the frequency domain or the spatial domain. In the frequency domain, the secret data bits are inserted into the coefficients of the image pixels’ frequency representation, such as the Discrete Cosine Transform (DCT), Discrete Fourier Transform (DFT), and Discrete Wavelet Transform (DWT). On the other hand, in spatial domain methods, the secret data bits are inserted directly into the decomposition of the images’ pixel values. The Least Significant Bit (LSB) method is considered the most widely used spatial domain method for data hiding. LSB embeds the secret message’s bits into the least significant bit plane (binary decomposition) of the image in a sequential manner. The LSB method is simple, but it poses some critical issues: the secret message is easily detected and attacked due to the sequential embedding process, and embedding using a higher bit plane would degrade the image quality. In this paper, we propose a novel data hiding method based on the Lucas number system. We use the Lucas number system to decompose the images’ pixel values, allowing a higher bit plane to be used for embedding without degrading the image’s quality. The experimental results show that the proposed method achieves a better Peak Signal to Noise Ratio (PSNR) than the LSB method for both grayscale and color images. Moreover, the security of the hidden data is enhanced by using Pseudo Random Number Generators (PRNG) for selecting the secret data bits to be embedded and the image’s pixels used for embedding.

Author 1: Fahd Alharbi

Keywords: Steganography; LSB; Lucas; PSNR; PRNG

PDF
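A minimal sketch of the core idea, assuming a greedy decomposition of pixel values into distinct Lucas numbers and a nearest-valid-value embedding rule. How the paper actually selects planes, handles the non-uniqueness of Lucas representations, and randomizes embedding positions via the PRNG is its own contribution.

```python
LUCAS = [1, 2, 3, 4, 7, 11, 18, 29, 47, 76, 123, 199]  # Lucas numbers, sorted

def lucas_digits(n):
    """Greedy decomposition of 0 <= n <= 255 into distinct Lucas numbers,
    returned as a 0/1 'digit plane' vector (the Lucas analogue of the
    binary bit planes used by LSB)."""
    digits = [0] * len(LUCAS)
    for i in range(len(LUCAS) - 1, -1, -1):
        if LUCAS[i] <= n:
            digits[i] = 1
            n -= LUCAS[i]
    return digits

def embed_bit(pixel, bit, plane=2):
    """Return the pixel value closest to `pixel` whose greedy decomposition
    carries `bit` at the chosen plane. Lucas representations are not unique,
    so we search outward for a value that round-trips correctly."""
    for delta in range(256):
        for cand in (pixel - delta, pixel + delta):
            if 0 <= cand <= 255 and lucas_digits(cand)[plane] == bit:
                return cand

def extract_bit(pixel, plane=2):
    return lucas_digits(pixel)[plane]
```

The point of the Lucas basis is that the "weights" of the planes grow more slowly than powers of two, so forcing a mid-level plane perturbs the pixel value far less than flipping the corresponding binary bit would.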

Paper 10: 3D CAD model reconstruction of a human femur from MRI images

Abstract: Medical practice and the life sciences take full advantage of progress in engineering disciplines, in particular computer-assisted placement techniques in hip surgery. This paper describes the three-dimensional model reconstruction of a human femur from MRI images. The developed program makes it possible to obtain a digital 3D femur shape recognized by all CAD software and allows accurate placement of the femoral component. This technique provides precise measurement of implant alignment during hip resurfacing or total hip arthroplasty, thereby reducing the risk of component mal-positioning and femoral neck notching.

Author 1: Mohammed RADOUANI
Author 2: Youssef AOURA
Author 3: Benaissa EL FAHIME
Author 4: Latifa OUZIZI

Keywords: biomechanics; MRI imaging; 3D reconstruction; femur

PDF

Paper 11: Impact of other-cell interferences on downlink capacity in WCDMA Network

Abstract: Before establishing a UMTS network, operators must carry out a planning process to ensure a better quality of service (QoS) for the mobile stations belonging to WCDMA cells. This process consists of estimating a set of parameters characterizing the radio cell in the downlink direction. Among them, we are interested in the Node B total required power PTot and the maximum cell capacity, in the case of voice only and in the case of voice/video. To study the effect of the other-cell interference power, modeled as a fraction fDL of the own-cell received power, on the various radio parameters described previously, we focused our study on two different scenarios: the first based on an isolated cell and the second on multiple cells. In addition, when a WCDMA cell reaches its maximum capacity, the introduction of admission control algorithms is essential to maintain the QoS of the ongoing mobile stations. For this purpose, we have proposed an admission control algorithm based on both the Node B total required power and the cell loading factor. This algorithm gives rigorous results compared to the existing ones in the literature.

Author 1: Fadoua Thami Alami
Author 2: Noura Aknin
Author 3: Ahmed El Moussaoui

Keywords: WCDMA; planning process; downlink; capacity estimation; other-cell interference; own-cell interference; Node B total required power; admission control.

PDF
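The way the other-to-own-cell interference ratio enters downlink capacity can be illustrated with the textbook WCDMA load-factor formula. The sketch below is not the paper's model; the bearer parameters (Eb/N0 of 7 dB, 12.2 kbps voice, orthogonality 0.5, f of 0.55, load target 0.75) are common illustrative values only.

```python
import math

def dl_load_per_user(ebno_db, rate_bps, chip_rate=3.84e6, alpha=0.5, f=0.55):
    """Per-user downlink load from the textbook WCDMA formula:
    load = (Eb/N0) / (W/R) * ((1 - alpha) + f), where alpha is the downlink
    orthogonality factor and f the other-to-own-cell interference ratio."""
    ebno = 10 ** (ebno_db / 10)            # dB to linear
    processing_gain = chip_rate / rate_bps  # W / R
    return (ebno / processing_gain) * ((1 - alpha) + f)

def max_voice_users(eta_max=0.75, activity=0.67, **kwargs):
    """Users supportable before the cell load limit eta_max is reached."""
    per_user = activity * dl_load_per_user(ebno_db=7.0, rate_bps=12200, **kwargs)
    return math.floor(eta_max / per_user)
```

Raising f directly inflates each user's load share, which is why modeling the other-cell interference fraction, as the paper does, is central to realistic capacity planning.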

Paper 12: Hierarchical Low Power Consumption Technique with Location Information for Sensor Networks

Abstract: In wireless sensor networks composed of battery-powered sensor nodes, one of the main issues is how to reduce power consumption at each node. The usual approach to this problem is to activate only necessary nodes (e.g., those nodes which compose a backbone network) and to put the other nodes to sleep. One such algorithm using location information is GAF (Geographical Adaptive Fidelity), which has been enhanced to HGAF (Hierarchical Geographical Adaptive Fidelity). In this paper, we show that we can further improve the energy efficiency of HGAF by modifying the manner of dividing the sensor field. We also provide a theoretical bound on this problem.

Author 1: Susumu Matsumae
Author 2: Fukuhito Ooshita

Keywords: wireless sensor networks; geographical adaptive fidelity; energy conservation; network lifetime

PDF
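GAF's virtual-grid idea can be sketched in a few lines: with a cell side of r/sqrt(5), any node can reach any node in an adjacent cell, so keeping one node awake per cell preserves routing fidelity. This shows plain GAF only; HGAF's hierarchical cell division, the subject of the paper, is not reproduced here.

```python
import math

def gaf_cell(x, y, r):
    """GAF virtual grid: with cell side r / sqrt(5), the worst-case distance
    between nodes in horizontally or vertically adjacent cells is exactly r,
    so one awake node per cell keeps the network connected."""
    side = r / math.sqrt(5)
    return (int(x // side), int(y // side))

def elect_leaders(nodes, r):
    """Keep the first node seen in each cell awake; the rest may sleep."""
    leaders = {}
    for node_id, (x, y) in nodes.items():
        leaders.setdefault(gaf_cell(x, y, r), node_id)
    return leaders
```

HGAF (and the paper's refinement of it) changes how this field is divided so that larger cells, and hence fewer awake nodes, still satisfy the reachability constraint.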

Paper 13: Translation of Pronominal Anaphora from English to Telugu Language

Abstract: Discourses are linguistic structures above the sentence level; a discourse is a coherent sequence of sentences. Discourse analysis is concerned with the coherent processing of text segments larger than the sentence, and this requires something more than just the interpretation of the individual sentences. Cohesion is one phenomenon that operates at the discourse level: text is cohesive if its elements link together, and this linking can be either forward or backward. Pronominal referencing is one method for linking sentences. This paper presents the issues in translating pronominal references from English to the Telugu language. This work handles the resolution and generation of personal pronouns whose antecedents appear before the anaphora. An algorithm is developed for the translation of pronominal references.

Author 1: T. Suryakanthi
Author 2: Dr. S.V.A.V. Prasad
Author 3: Dr. T. V. Prasad

Keywords: GNP: gender, number, person; SL: source language (English); TL: target language (Telugu); S: singular; P: plural; M: masculine; F: feminine; N: neuter; VBD: past-tense verb form; VBZ: 3rd-person singular present verb form; VBP: non-3rd-person singular present verb form; MD: modal

PDF
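The gender/number (GNP) agreement filter at the heart of such resolution algorithms can be sketched as below: scan preceding nouns from most recent backward and keep the first one that agrees with the pronoun. The pronoun table and discourse are invented for illustration, and the Telugu surface-form generation step is the paper's own.

```python
# gender/number features for a few English personal pronouns (illustrative)
PRONOUN_GNP = {
    "he": ("M", "S"), "she": ("F", "S"), "it": ("N", "S"), "they": (None, "P"),
}

def resolve(pronoun, antecedents):
    """Return the most recent preceding noun whose gender and number agree
    with the pronoun; None gender means 'any' (English 'they')."""
    gender, number = PRONOUN_GNP[pronoun.lower()]
    for noun, (g, n) in reversed(antecedents):
        if (gender is None or gender == g) and number == n:
            return noun
    return None

# nouns seen so far in the discourse, oldest first, with (gender, number)
discourse = [("Ravi", ("M", "S")), ("Sita", ("F", "S")), ("books", ("N", "P"))]
```

After an antecedent is chosen, a translation system would emit the Telugu pronoun form matching that antecedent's gender, number, and person.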

Paper 14: Learning by Modeling (LbM): Understanding Complex Systems by Articulating Structures, Behaviors, and Functions

Abstract: Understanding the behavior of complex systems has become a focal issue for scientists in a wide range of disciplines. Making sense of a complex system requires that a student construct a network of concepts and principles about the phenomena being learned. This paper describes part of a project on Learning-by-Modeling (LbM). Many features of complex systems make it difficult for students to develop deep understanding. Previous research indicates that involvement with modeling scientific phenomena and complex systems can play a powerful role in science learning. Some researchers dispute this view, indicating that models and modeling do not contribute to understanding complexity concepts, since they increase the cognitive load on students. In this study we investigated the effect of different modes of involvement in exploring scientific phenomena using computer simulation tools on students’ mental models, from the perspective of structure, behaviour, and function. Quantitative and qualitative methods are used to report on 121 freshman students who engaged in participatory simulations of complex phenomena, showing emergent, self-organized, and decentralized patterns. Results show that LbM plays a major role in students’ concept formation about complexity concepts.

Author 1: Kamel Hashem
Author 2: David Mioduser

Keywords: learning by modeling; simulation; complexity; mental models; educational technology

PDF

Paper 15: A Review of Computation Solutions by Mobile Agents in an Unsafe Environment

Abstract: Exploration in an unsafe environment is one of the major problems that can be seen as a basic building block for many distributed mobile protocols. In such an environment, we consider that either the nodes (hosts) or the agents can pose some danger to the network. Two cases are considered. In the first case, the dangerous node is called a black hole, a node where incoming agents are trapped, and the problem is for the agents to locate it. In the second case, the dangerous agent is a virus, an agent moving between nodes and infecting them, and the problem is for the “good” agents to capture it and decontaminate the network. In this paper, we present several solutions to the black hole search and network decontamination problems, and we analyze their efficiency. Efficiency is evaluated based on complexity, and the effort lies in minimizing the number of simultaneous decontaminating elements active in the system while performing the decontamination techniques.

Author 1: Anis Zarrad
Author 2: Yassine Daadaa

Keywords: Distributed algorithm; Mobile Agent; Network Decontamination; Black Hole Search; and Network Exploration

PDF

Paper 16: Semantic Conflicts Reconciliation as a Viable Solution for Semantic Heterogeneity Problems

Abstract: Achieving semantic interoperability is a current challenge in the field of data integration, in order to bridge the semantic conflicts occurring when the participating sources and receivers use different or implicit data assumptions. Providing a framework that automatically detects and resolves semantic conflicts is a daunting task for many reasons: it should preserve the local autonomy of the integrated sources, as well as provide a standard query language for accessing the integrated data on a global basis. Many existing traditional and ontology-based approaches have tried to achieve semantic interoperability, but they have certain drawbacks that make them inappropriate for integrating data from a large number of participating sources. We propose the Semantic Conflicts Reconciliation (SCR) framework, an ontology-based system in which all data semantics are explicitly described in the knowledge representation phase and automatically taken into account through the interpretation mediation service phase, so conflicts are detected and resolved automatically at query time.

Author 1: Walaa S. Ismail
Author 2: Mona M. Nasr
Author 3: Torky I. Sultan
Author 4: Ayman E. Khedr

Keywords: Data Integration; Heterogeneous Sources; Interoperability; Semantic Conflicts; Context; Reconciliation Ontology.

PDF
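A classic illustration of interpretation mediation, in the general spirit of context-based systems rather than the SCR implementation itself: sources report values under different implicit currency and scale assumptions, and a mediator rewrites each value into the receiver's context at query time. The sources, contexts, and exchange rates below are invented purely for illustration.

```python
# Implicit assumptions ("context") each source makes about its own data.
CONTEXTS = {
    "src_A": {"currency": "USD", "scale": 1},
    "src_B": {"currency": "EUR", "scale": 1000},  # reports thousands of EUR
}
RATES_TO_USD = {"USD": 1.0, "EUR": 1.1}           # assumed conversion rates

def mediate(value, source, receiver={"currency": "USD", "scale": 1}):
    """Interpretation mediation: convert a value from the source's implicit
    context into the receiver's context, so queries see uniform semantics
    without either side changing how it stores data."""
    ctx = CONTEXTS[source]
    in_usd = value * ctx["scale"] * RATES_TO_USD[ctx["currency"]]
    return in_usd / (receiver["scale"] * RATES_TO_USD[receiver["currency"]])
```

Because the conversions are driven by declared context metadata, each source keeps its local autonomy, which is one of the requirements the abstract highlights.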

Paper 17: An Intelligent multi-object retrieval system for historical mosaics

Abstract: In this work we present a Mosaics Intelligent Retrieval System (MIRS) for digital museums. The objective of this work is to attain a semantic interpretation of images of historical mosaics. We use fuzzy logic techniques and a semantic similarity measure to extract knowledge from the images for multi-object indexing. The extracted knowledge provides users (experts and laypersons) with an intuitive way to describe and query the images in the database. Our contribution in this paper is, firstly, to define semantic fuzzy linguistic terms to encode object position and inter-object spatial relationships in the mosaic image; secondly, to present a fuzzy color quantization approach using the human-perceptual HSV color space; and finally, to classify the mosaic images semantically using a semantic similarity measure. The automatically extracted knowledge is collected and translated into XML to create mosaic metadata. The system uses a simple Graphical User Interface (GUI) in natural language and applies the classification approach both to the mosaics image database and to user queries, to limit the image classes considered in the retrieval process. MIRS is tested on images from the exceptional Tunisian collection of complex mosaics. Experimental results are based on queries of various complexities, which yielded recall and precision rates of 86.6% and 87.1%, respectively, while the classification approach gives an average success rate of 76%.

Author 1: Wafa Maghrebi
Author 2: Anis B. Ammar
Author 3: Adel M. Alimi
Author 4: Mohamed A. Khabou

Keywords: retrieval; mosaics; metadata; classification; multi-objects

PDF

Paper 18: Optimization Query Process of Mediators Interrogation Based On Combinatorial Storage

Abstract: In a distributed environment where a query involves several heterogeneous sources, communication costs must be taken into consideration. In this paper we describe a query optimization approach using a dynamic programming technique for a set of integrated heterogeneous sources. The objective of the optimization is to minimize the total processing time, including load processing, query rewriting, and communication costs, to facilitate inter-site communication, and to optimize the time of data transfer from one site to others. Moreover, the ability to store data in more than one central site provides more flexibility in terms of security/safety and network overload. In contrast to optimizers which consider a restricted search space, the proposed optimizer searches the closed subsets of sources and independence relationships, which may be deep linear or hierarchical trees. In particular, query execution can start its traversal anywhere over any subset, and not only from a specific source.

Author 1: L. Cherrat
Author 2: M. Ezziyyani
Author 3: M. Essaaidi

Keywords: Mediation; Datawarehouse; Optimisation; Classification

PDF

Paper 19: A Hybrid Framework using RBF and SVM for Direct Marketing

Abstract: One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. This paper addresses using an ensemble of classification methods for direct marketing. Direct marketing has become an important application field for data mining. In direct marketing, companies or organizations try to establish and maintain a direct relationship with their customers in order to target them individually for specific product offers or for fund raising. A variety of techniques have been employed for analysis, ranging from traditional statistical methods to data mining approaches. In this research work, a new hybrid classification method is proposed by combining classifiers in a heterogeneous environment using an arcing classifier, and their performance is analyzed in terms of accuracy. A classifier ensemble is designed using a Radial Basis Function (RBF) and a Support Vector Machine (SVM) as base classifiers. Here, modified training sets are formed by resampling from the original training set; classifiers are constructed using these training sets and then combined by voting. Empirical results illustrate that the proposed hybrid system provides a more accurate direct marketing system.

Author 1: M. Govidarajan

Keywords: Direct Marketing; Ensemble; Radial Basis Function; Support Vector Machine; Classification Accuracy.

PDF
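The resample-train-vote loop the abstract describes can be sketched generically. A 1-nearest-neighbour learner stands in for the paper's RBF and SVM base classifiers purely to keep the sketch dependency-free, and the toy "respond / don't respond" data are invented for illustration.

```python
import random
from collections import Counter

def nn_train(sample):
    """1-nearest-neighbour base learner, a stand-in for the paper's RBF and
    SVM base classifiers. Returns a predict function closed over the sample."""
    def predict(x):
        nearest = min(sample, key=lambda p: sum((a - b) ** 2
                                                for a, b in zip(p[0], x)))
        return nearest[1]
    return predict

def voting_ensemble(data, n_models=7, seed=0):
    """Draw bootstrap resamples of the training set (as in arcing/bagging),
    fit one base classifier per resample, and combine by majority vote."""
    rng = random.Random(seed)
    models = [nn_train([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    return lambda x: Counter(m(x) for m in models).most_common(1)[0][0]

# toy 1-D customer data: label 1 = responds to the offer
train = [([-3.0], 0), ([-2.0], 0), ([-1.0], 0),
         ([1.0], 1), ([2.0], 1), ([3.0], 1)]
classify = voting_ensemble(train)
```

An odd number of models avoids voting ties; arcing differs from plain bagging in that later resamples are biased toward previously misclassified examples, a refinement omitted here.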

Paper 20: Time Variant Change Analysis in Satellite Images

Abstract: This paper describes time variant change analysis in satellite images using the Self Organizing Feature Map (SOFM) technique associated with Artificial Neural Networks. We take a satellite image and find the time variant changes using the above technique with the help of MATLAB. The paper reviews remotely sensed data analysis with neural networks. First, we present an overview of the main concepts underlying Artificial Neural Networks (ANNs), including the main architectures and learning algorithms; then the main tasks that involve ANNs in remote sensing are described. As an application, we explain the back propagation algorithm, since it is widely used and many other algorithms are derived from it. Two techniques are used for classification in pattern recognition: supervised classification and unsupervised classification. In the supervised learning technique, the network knows the target and has to adapt accordingly to produce the desired output for the presented input sample data. Most of the previous work has been done on supervised classification. In this study we present the classification of satellite images using the unsupervised classification method of ANNs.

Author 1: Rachita Sharma
Author 2: Sanjay Kumar Dubey

Keywords: Satellite Images; SOFM; ANN; Supervised Classification; Unsupervised Classification.

PDF
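A minimal SOFM training loop, assuming the usual best-matching-unit (BMU) plus neighbourhood update with decaying learning rate and radius. The grid size, schedules, and toy pixel data are illustrative assumptions, not the paper's satellite imagery setup.

```python
import numpy as np

def train_som(X, grid=(5, 5), iters=500, lr0=0.5, seed=0):
    """Minimal SOFM: pick a sample, find its best-matching unit, and pull
    the BMU and its grid neighbours toward the sample, with the learning
    rate and neighbourhood radius decaying over time (unsupervised)."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    W = rng.random((n_units, X.shape[1]))  # one weight vector per map unit
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    sigma0 = max(grid) / 2.0
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))
        frac = 1.0 - t / iters
        lr, sigma = lr0 * frac, sigma0 * frac + 0.5
        h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                   / (2 * sigma ** 2))
        W += lr * h[:, None] * (x - W)
    return W

def quantization_error(X, W):
    """Mean distance from each sample to its BMU; lower means a better fit."""
    d2 = ((X[:, None, :] - W[None]) ** 2).sum(axis=2)
    return float(np.sqrt(d2.min(axis=1)).mean())

# toy 3-channel "pixel" vectors drawn from two spectral classes
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.2, 0.05, (100, 3)), rng.normal(0.8, 0.05, (100, 3))])
W = train_som(X)
```

After training, assigning each pixel to its BMU yields the unsupervised classes; change analysis then compares these assignments across images taken at different times.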

Paper 21: Reducing Attributes in Rough Set Theory with the Viewpoint of Mining Frequent Patterns

Abstract: The main objective of the Attribute Reduction problem in Rough Set Theory is to find and retain the set of attributes whose values vary most between objects in an Information System or Decision System. Mining Frequent Patterns, in turn, aims at finding items that appear together in transactions more often than a given threshold. The two problems therefore have similarities, which suggests solving the Attribute Reduction problem with the viewpoint and methods of Mining Frequent Patterns. The main difficulty of the Attribute Reduction problem is its execution time: the problem is NP-hard. This article proposes two new algorithms for Attribute Reduction: one with linear complexity, and one that is globally optimal, built on the concepts of Maximal Random Prior Set and Maximal Set.

Author 1: Thanh-Trung Nguyen
Author 2: Phi-Khu Nguyen

Keywords: accumulating frequent patterns; attribute reduction; maximal set; maximal random prior set; mining frequent patterns; rough set

PDF
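The link between attributes and object discernibility can be made concrete in a few lines: a subset of attributes is reduct material exactly when it induces the same indiscernibility classes as the full attribute set. The toy table below is invented for illustration; the paper's two algorithms and their Maximal Set machinery are its own contribution.

```python
def partition(table, attrs):
    """Indiscernibility classes: objects grouped by their values on attrs."""
    groups = {}
    for obj, row in table.items():
        groups.setdefault(tuple(row[a] for a in attrs), set()).add(obj)
    return {frozenset(g) for g in groups.values()}

def preserves_partition(table, subset, all_attrs):
    """True iff `subset` discerns objects exactly as well as all attributes,
    i.e. dropping the other attributes loses no information."""
    return partition(table, subset) == partition(table, all_attrs)

# toy information system: attribute "c" merely duplicates "a"
table = {
    "o1": {"a": 0, "b": 0, "c": 0},
    "o2": {"a": 0, "b": 1, "c": 0},
    "o3": {"a": 1, "b": 0, "c": 1},
}
```

A reduct is a minimal subset passing this check; the NP-hardness the abstract mentions comes from the exponential number of candidate subsets a naive search must test.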

Paper 22: Distributed Deployment Scheme for Homogeneous Distribution of Randomly Deployed Mobile Sensor Nodes in Wireless Sensor Network

Abstract: One of the most active research areas in wireless sensor networks is coverage. The efficiency of a sensor network is measured in terms of coverage area and connectivity; therefore these factors must be considered during deployment. In this paper, we present a scheme for the homogeneous distribution of randomly deployed mobile sensor nodes (MSNs) in the deployment area. The deployment area is square in shape and is divided into a number of concentric regions centered at the Base Station (BS), separated by half the communication range; the deployment area is further divided into a number of regular hexagons. To achieve maximum coverage and better connectivity, MSNs position themselves at the centers of the hexagons on instructions provided by the BS, which is located at one corner of the deployment area. The simulation results show that the presented scheme is better than the CPVF and FLOOR schemes in terms of the number of MSNs required for the same coverage area, the average movement required by MSNs to reach their desired locations, and energy efficiency.

Author 1: Ajay Kumar
Author 2: Vikrant Sharma
Author 3: D. Prasad

Keywords: Active MSNs, Desired location, Candidate location, Communication range, Sensing range

PDF
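The hexagonal candidate locations described in the abstract follow standard lattice geometry: if each hexagon's circumradius equals the sensing range, full coverage is obtained with columns 1.5 radii apart and rows sqrt(3) radii apart. A minimal sketch of generating such centers and assigning a node to the nearest one (illustrative only; function names and the assignment rule are assumptions, not the paper's protocol):

```python
import math

def hexagon_centers(side, r_s):
    """Centers of regular hexagons tiling a side x side square.
    With the hexagon circumradius set to the sensing range r_s, a flat-top
    hexagonal lattice has columns 1.5*r_s apart and rows sqrt(3)*r_s apart,
    with every other column shifted by half a row."""
    dy = math.sqrt(3) * r_s
    centers, col, x = [], 0, 0.0
    while x <= side:
        y = 0.0 if col % 2 == 0 else dy / 2
        while y <= side:
            centers.append((x, y))
            y += dy
        x += 1.5 * r_s
        col += 1
    return centers

def nearest_center(node, centers):
    """A BS could, for example, direct an MSN to its closest hexagon center."""
    return min(centers, key=lambda c: math.dist(node, c))
```

For a 10 x 10 area and sensing range 2, this yields 12 candidate locations; the paper's scheme additionally accounts for the concentric communication-range regions when issuing movement instructions.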

Paper 23: Towards a Fraud Prevention E-Voting System

Abstract: Election falsification is one of the biggest problems facing third world countries, as well as developed countries, with respect to cost and time. In this paper, guidelines for building a legally binding, fraud-proof electronic voting system are presented, and its limitations are discussed.

Author 1: Dr. Magdi Amer
Author 2: Dr. Hazem El-Gendy

Keywords: e-voting, security

PDF

Paper 24: Impact of Medical Technology on Expansion in Healthcare Expenses

Abstract: The impact of medical technology on the growth of health care expenses has long been a subject of essential interest, mainly in the context of long-term projections of health spending, which must deal with the question of whether historical trends apply to future periods. The aim of this paper is to estimate an approximate range for the contribution of technological change to growth in health spending, and to assess factors that might alter this impact in the future. Based on the studies re-examined, we estimate that roughly half of the growth in real per capita health care costs is attributable to the introduction and diffusion of new medical technology, within a plausible range of 38 to 62 percent of that growth.

Author 1: Shakir Khan
Author 2: Dr. Mohamed Fahad AlAjmi

Keywords: medical technology; health costs; health care; research and development

PDF

Paper 25: Graph Mining Sub Domains and a Framework for Indexing – A Graphical Approach

Abstract: Graphs are one of the popular models for effective representation of complex, structured, large-scale data, and similarity search over graphs has become a fundamental research problem in Graph Mining. In this paper, basic graph-related theorems are first reviewed, together with various research sub-domains such as Graph Classification, Graph Searching, Graph Indexing, and Graph Clustering; each is discussed with a few of the most dominant algorithms in its respective sub-domain. Finally, a model is proposed along with various algorithms and their future directions.

Author 1: K. Vivekanandan
Author 2: A. Pankaj Moses Monickaraj
Author 3: D. Ramya Chithra

Keywords: Graph; Graph Mining; Graph Classification; Graph Searching; Graph Indexing; Graph Clustering

PDF

Paper 26: Feedback Optimal Control of Low-thrust Orbit Transfer in Central Gravity Field

Abstract: Low-thrust trajectories with variable radial thrust are studied in this paper. The problem is tackled by solving the Hamilton-Jacobi-Bellman equation via the State-Dependent Riccati Equation (SDRE) technique devised for nonlinear systems. Instead of solving the two-point boundary value problem in which the classical optimal control problem is stated, this technique allows us to derive closed-loop solutions. The idea of the work consists in factorizing the original nonlinear dynamical system into a quasi-linear, state-dependent system of ordinary differential equations. The generating function technique is then applied to this new dynamical system, and the feedback optimal control is solved. We circumvent in this way the problem of expanding the vector field and truncating higher-order terms, because no remainders are lost in the undertaken approach. This technique can be applied to any planet-to-planet transfer; it is applied here to the Earth-Mars low-thrust transfer.

Author 1: Ashraf H. Owis

PDF
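The SDRE technique named in the abstract can be summarized in its standard textbook form (a generic sketch, not the paper's specific central-gravity-field equations): the nonlinear dynamics are factorized into a state-dependent linear structure, and a Riccati equation is solved pointwise along the trajectory to obtain a feedback law.

```latex
% State-dependent factorization of the nonlinear dynamics
\dot{\mathbf{x}} = A(\mathbf{x})\,\mathbf{x} + B(\mathbf{x})\,\mathbf{u}
% Quadratic-like cost with state-dependent weights
J = \tfrac{1}{2}\int_{0}^{\infty}
    \left( \mathbf{x}^{\top} Q(\mathbf{x})\,\mathbf{x}
         + \mathbf{u}^{\top} R(\mathbf{x})\,\mathbf{u} \right)\mathrm{d}t
% State-Dependent Riccati Equation, solved for P(x) at each state
A(\mathbf{x})^{\top} P + P\,A(\mathbf{x})
  - P\,B(\mathbf{x})\,R(\mathbf{x})^{-1} B(\mathbf{x})^{\top} P + Q(\mathbf{x}) = 0
% Resulting closed-loop (feedback) control law
\mathbf{u} = -R(\mathbf{x})^{-1} B(\mathbf{x})^{\top} P(\mathbf{x})\,\mathbf{x}
```

Because $P$ depends on the current state rather than on boundary conditions, the control is available in closed-loop form, which is what lets the approach bypass the two-point boundary value problem mentioned in the abstract.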

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org