The Science and Information (SAI) Organization
IJACSA Volume 5 Issue 6

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: An Open Source P2P Encrypted VoIP Application

Abstract: Open source is the future of technology. This community is growing by the day, developing and improving existing frameworks and software for free. Open source replacements are emerging for almost all proprietary software nowadays. This paper proposes an open source application which could replace Skype, a popular VoIP soft phone. The performance features of the developed software are analyzed and compared with Skype in order to determine whether it can be an efficient replacement. The application is developed in pure Java using various APIs and packages and offers features such as voice calling, chatting, and file sharing. The target audience for this software will initially be limited to organizations (for internal communication); later the software will be released on a larger scale.

Author 1: Ajay Kulkarni
Author 2: Saurabh Kulkarni

Keywords: voip; softphone; java; open source

PDF

Paper 2: The Coverage Analysis for Low Earth Orbiting Satellites at Low Elevation

Abstract: Low Earth Orbit (LEO) satellites are used for public networking and for scientific purposes. Communication via satellite begins when the satellite is positioned in its orbital position. Ground stations can communicate with LEO satellites only when the satellite is in their visibility region. The duration of visibility and of communication varies for each LEO satellite pass over a station, since LEO satellites move quickly over the Earth. The satellite coverage area is defined as the region of the Earth where the satellite is seen at a minimum predefined elevation angle; this coverage area depends on the orbital parameters. Communication under low elevation angles can be hindered by natural barriers. For safe communication and for savings within the link budget, coverage at very low elevation is not always provided. LEO satellites organized in constellations act as a convenient network solution for real-time global coverage; the global coverage model is, in fact, the complementary networking of individual satellites' coverage. Satellite coverage strongly depends on the elevation angle. To characterize the coverage variation for low-orbiting satellites at low elevation angles of up to 10°, this paper presents simulations for altitudes from 600 km to 1200 km.

Author 1: Shkelzen Cakaj
Author 2: Bexhet Kamo
Author 3: Algenti Lala
Author 4: Alban Rakipi

Keywords: LEO; satellite; coverage

PDF
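
The coverage geometry discussed in this abstract follows from standard spherical trigonometry: for a satellite at altitude h seen above a minimum elevation angle ε, the Earth central angle of the coverage circle is β = arccos((R_E/(R_E+h))·cos ε) − ε. The following is a minimal Python sketch of that relation for the altitudes and low elevation angles mentioned above; it is not the paper's own simulation code.

```python
# Illustrative sketch (not the paper's simulation): standard spherical-geometry
# relation between a LEO satellite's altitude, the minimum elevation angle, and
# the Earth central angle bounding its coverage area.
import math

EARTH_RADIUS_KM = 6371.0

def coverage_central_angle(altitude_km, min_elevation_deg):
    """Earth central angle (degrees) of the coverage circle above min_elevation_deg."""
    eps = math.radians(min_elevation_deg)
    beta = math.acos(EARTH_RADIUS_KM / (EARTH_RADIUS_KM + altitude_km) * math.cos(eps)) - eps
    return math.degrees(beta)

def coverage_radius_km(altitude_km, min_elevation_deg):
    """Great-circle radius (km) of the coverage area on the ground."""
    return EARTH_RADIUS_KM * math.radians(coverage_central_angle(altitude_km, min_elevation_deg))

if __name__ == "__main__":
    # Sweep the altitudes and low elevation angles discussed in the abstract.
    for h in (600, 800, 1000, 1200):
        for el in (0, 2, 5, 10):
            print(f"h={h:>4} km  el={el:>2} deg  coverage radius = {coverage_radius_km(h, el):7.1f} km")
```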

Paper 3: Estimating Null Values in Database Using CBR and Supervised Learning Classification

Abstract: Databases and database systems are used widely in almost all areas of life. Sometimes data items are missing and appear as null values in database tables. This paper proposes the design of a supervised learning system to estimate missing values found in a university database. The estimated data items, and the data items used in the estimation, are numeric and not computed. The system performs data classification based on Case-Based Reasoning (CBR) to estimate missing marks of students. A data set is used to train the system under the supervision of an expert. After being trained to classify and estimate null values under expert supervision, the system classifies and estimates null data by itself.

Author 1: Khaled Nasser ElSayed

Keywords: Database (DB); Data Mining; Case-Based Reasoning (CBR); Classification; Null Values; Supervised Learning

PDF
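
A minimal sketch of the CBR idea described above: retrieve the most similar complete student records and reuse their known marks to fill a null value. The record fields and the averaging of the k nearest cases are illustrative assumptions, not the paper's system.

```python
# Minimal sketch of CBR-style estimation of a missing (null) numeric value:
# retrieve the most similar past cases and reuse their known values.
# The student-record fields below are hypothetical.
def estimate_null(cases, query, target, k=3):
    """Estimate query[target] from the k most similar complete cases."""
    features = [f for f in query if f != target and query[f] is not None]

    def distance(case):
        return sum((case[f] - query[f]) ** 2 for f in features) ** 0.5

    usable = [c for c in cases if c.get(target) is not None]
    nearest = sorted(usable, key=distance)[:k]
    return sum(c[target] for c in nearest) / len(nearest)

cases = [
    {"midterm": 35, "coursework": 18, "final": 55},
    {"midterm": 20, "coursework": 10, "final": 30},
    {"midterm": 38, "coursework": 19, "final": 60},
    {"midterm": 25, "coursework": 12, "final": 35},
]
student = {"midterm": 36, "coursework": 17, "final": None}  # missing mark
print("estimated final mark:", round(estimate_null(cases, student, "final"), 1))
```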

Paper 4: An Experience of Taiwan Policy Development To Accelerate Cloud Migration

Abstract: Developing cloud computing is a key policy for government, while convenient services are an important issue for people's daily lives. At the beginning of 2010, the Taiwan Government launched a “Cloud Computing Development Project” and devoted itself to service planning and investment activities. At the end of 2012, following a three-year comprehensive review and the adoption of suggestions from the public and private sectors, the Taiwan Government adjusted the policy and renamed it the “Cloud Computing Application and Development Project”. From the perspectives of government application, industry development, and a cloud open platform, this study describes how the vision drives goals and how this thinking pushes strategies forward. In the process of collaboration between government and industry, value is progressively created for cloud services. The Cloud Computing Project Management Office plays a key role as policy advisor, matching platform, and technical support, contributing to the achievements of (1) policy assessment and strategy enhancement; (2) construction of a cloud open platform linking demand and supply; and (3) innovation and integration planning for government service applications, leading to industry development.

Author 1: Sheng-Chi Chen

Keywords: Cloud Computing; Action Research; Project Management Office

PDF

Paper 5: Educational Data Mining Model Using Rattle

Abstract: Data mining is the extraction of knowledge from large databases. It has affected fields ranging from combating terror attacks to human genome databases. R programming plays a key role in many kinds of data analysis. Rattle, an effective GUI for R, is used extensively for generating reports based on several current models such as random forests and support vector machines. It is otherwise hard to decide which model to choose for the data that needs to be mined. This paper proposes a method using Rattle for the selection of an educational data mining model.

Author 1: Sadiq Hussain
Author 2: G.C. Hazarika

Keywords: Educational Data Mining; R Programming; Rattle; ROC Curve; Support Vector Machine; Random Forest

PDF
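
The paper's workflow lives inside Rattle, the R GUI. As a language-neutral illustration of the underlying idea of fitting several candidate models and comparing them by ROC performance, here is a scikit-learn sketch on a synthetic (non-educational) dataset; it is not the Rattle/R procedure itself.

```python
# Illustration of model comparison by ROC AUC on a synthetic dataset;
# the paper performs the analogous comparison inside Rattle (R GUI).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(probability=True, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name:>13}: ROC AUC = {auc:.3f}")
```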

Paper 6: Individual Syllabus for Personalized Learner-Centric E-Courses in E-Learning and M-Learning

Abstract: Most e-learning and m-learning systems are course-centric. These systems provide services that concentrate on course material and pedagogy. They do not take into account the variety of student levels, skills, interests, or preferences. This paper presents the design of an approach for personalized, self-adapting, agent-based learning systems that make e-learning and mobile learning (m-learning) services learner-centric. It models the goals of different learners undergoing corporate training in computer courses at an educational institute. It shows how to customize and personalize learning paths (course syllabi) for e-learning and m-learning platforms. The delivery of e-courses becomes personalized and learner-centric, which improves learning outcomes, increases learner satisfaction, and enhances education.

Author 1: Khaled Nasser ElSayed

Keywords: AI; Agent; education; e-Learning; m-Learning; Semantic Net

PDF

Paper 7: Security Policies for Securing Cloud Databases

Abstract: Databases are an important and almost mandatory means of storing information for later use. Databases require effective security to protect the information stored within them. In particular, access control measures are especially important for cloud databases, because they can be accessed from anywhere in the world at any time via the Internet. The Internet has provided a plethora of advantages by increasing accessibility to various services, education, information, and communication. It also presents challenges and disadvantages, which include securing services, information, and communication. Naturally, the Internet is being used for good but also to carry out malicious attacks on cloud databases. In this paper we discuss approaches and techniques to protect cloud databases, including security policies which can be realized as security patterns.

Author 1: Ingrid A. Buckley
Author 2: Fan Wu

Keywords: relational database; cloud; security; threats; hackers; security patterns; cloud database

PDF

Paper 8: Comparative Performance Analysis of Feature(s)-Classifier Combination for Devanagari Optical Character Recognition System

Abstract: This paper presents a comparative performance analysis of feature(s)-classifier combinations for a Devanagari optical character recognition system. For performance evaluation, three classifiers, namely support vector machines, artificial neural networks, and k-nearest neighbors, and seven feature extraction approaches, viz. profile direction codes, transition, zoning, directional distance distribution, Gabor filter, discrete cosine transform, and gradient features, have been used. The first four features have been used jointly as statistical features. The performance has also been evaluated using combinations of these feature extraction approaches. In addition, performance has been evaluated by varying the feature vector length of the Gabor and DCT features. For training the classifiers, 7000 samples of the first 70 classes (out of 942 classes), recognized in earlier work, have been used. Such a large number of classes is due to horizontally and vertically fused/overlapping characters. We have chosen the first 70 classes because their contribution to the 942 classes has been found to be 96.69%. For testing, 1400 samples have been collected separately. A corpus of 25 books has been used for sample collection. Classifiers trained on different features have been compared for performance evaluation. It has been found that support vector machines trained with gradient features provide a classification accuracy of 99.429%, and that there is no significant increase in performance as the feature vector length increases.

Author 1: Jasbir Singh
Author 2: Gurpreet Singh Lehal

Keywords: Artificial Neural Network; DCT; Directional Distance Distribution; Feature extraction; Gabor; k-Nearest Neighbour; Profile direction codes; Support Vector Machines; Transition; Zoning

PDF
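
A compact sketch of the comparison methodology: train an SVM, an ANN (multi-layer perceptron), and k-NN on the same feature vectors and compare accuracy. scikit-learn's built-in digits dataset stands in for the Devanagari samples, and none of the paper's actual feature extractors (zoning, Gabor, gradient, etc.) are reproduced.

```python
# Illustrative feature(s)-classifier comparison: same feature matrix, three
# classifier families, accuracy compared on a held-out split.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

classifiers = {
    "SVM": SVC(kernel="rbf", gamma="scale"),
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    print(f"{name:>9}: accuracy = {clf.score(X_te, y_te):.3f}")
```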

Paper 9: Principle of Duality on Prognostics

Abstract: The accurate estimation of the remaining useful life (RUL) of various components and devices used in complex systems, e.g., airplanes, remains to be addressed by scientists and engineers. Currently, there are a wide range of innovative proposals that intend to solve this problem. Integrated System Health Management (ISHM) has thus far seen some growth in this sector, as a result of the extensive progress shown in demonstrating feasible and viable techniques. The problem with these techniques is that they are often time-consuming, expensive, and resource-intensive to develop. In this paper we present a radically novel approach for building prognostic models that compensates for and improves on the inconsistencies and problems of current prognostic models. Broadly speaking, the new approach proposes a state-of-the-art technique that utilizes the physics of a system rather than the physics of a component to develop its prognostic model. A positive aspect of this approach is that the prognostic model can be generalized, such that a new system could be developed on the basis and principles of the prognostic model of another system. This paper mainly explores single-switch DC-to-DC converters, which are used as an experiment to exemplify the potential of a novel prognostic model that can efficiently estimate the remaining useful life of one system based on the prognostics of its dual system.

Author 1: Mohammad Samie
Author 2: Amir M. S. Motlagh
Author 3: Alireza Alghassi
Author 4: Suresh Perinpanayagam
Author 5: Epaminondas Kapetanios

Keywords: Prognostic Model; Integrated System Health Management (ISHM); Degradation; Duality; Cuk Converter

PDF

Paper 10: Domain Based Prefetching in Web Usage Mining

Abstract: In the current web scenario, Internet users expect the web to be friendlier and more meaningful, with reduced network traffic. Every end user needs a high-bandwidth channel. In order to reduce web server load and access latency, and to relieve network bandwidth from heavy traffic, a model called Domain-based Prefetching (DoP) is recommended, which uses the technique of general access pattern tracking. DoP presents the user with several generic domains, each containing the top visited web requests in that domain, which are retrieved from the web log file for future web access.

Author 1: Dr. M. Thangaraj
Author 2: Mrs. V. T. Meenatchi

Keywords: Latency; Domain; Prefetching; bandwidth; Network Traffic; Web Log File

PDF
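
A minimal sketch of the general access pattern tracking behind DoP: group logged requests by domain and keep the most visited requests in each domain as prefetch candidates. The log lines and the top-2 cutoff are hypothetical; a real deployment would parse the server's actual access-log format.

```python
# Minimal sketch of domain-based prefetching: per-domain request counts from a
# (hypothetical) web log, keeping the top-N requests per domain for prefetching.
from collections import Counter, defaultdict
from urllib.parse import urlsplit

log = [
    "http://news.example.com/politics", "http://news.example.com/sports",
    "http://news.example.com/politics", "http://shop.example.org/cart",
    "http://shop.example.org/deals", "http://shop.example.org/deals",
]

def prefetch_table(requests, top_n=2):
    per_domain = defaultdict(Counter)
    for url in requests:
        per_domain[urlsplit(url).netloc][url] += 1
    return {d: [u for u, _ in c.most_common(top_n)] for d, c in per_domain.items()}

for domain, urls in prefetch_table(log).items():
    print(domain, "->", urls)
```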

Paper 11: Teaching Introductory Programming

Abstract: From the educational point of view, learning by mistake can be an influential teaching method, especially for teaching/learning Computer Science (CS) and/or Information Technologies (IT). Learning to program is a very difficult task, and it is perhaps an even more difficult and demanding job to teach novices how to write correct computer programs. The concept of pedagogical design patterns has so far received surprisingly little attention from researchers in the pedagogy/didactics of Computer Science. Pedagogical design patterns are descriptions of successful solutions to common problems that occur in teaching/learning CS and IT. Good pedagogical patterns can help teachers when they have to design a new course, lessons, topics, examples, and assignments in a particular context. Pedagogical patterns capture best practice in teaching/learning CS and/or IT. They can be very helpful to teachers in preparing their own lessons. In this paper a brief description of a special class of pedagogical design patterns, the group of patterns for learning by mistakes, is presented. In addition, the use of helpful and misleading pedagogical agents, developed in the Agent-based E-learning System (AE-lS) and based on the pedagogical pattern for explanation (Explain) and the pedagogical pattern for learning by mistakes (Wolf, Wolf, Mistake), is described.

Author 1: Ljubomir Jerinic

Keywords: Pedagogical Pattern; Pattern Design; Learning; Programming; Computer science education; Software agents; Electronic learning; Computer aided instruction

PDF

Paper 12: Development Process Patterns for Distributed Onshore/Offshore Software Projects

Abstract: The globalisation of the commercial world and the use of distributed working practices (offshore/onshore/near-shore) have increased dramatically with the improvement of information and communication technologies. Many organisations, especially those that operate within knowledge-intensive industries, have turned to distributed work arrangements to facilitate information exchange and to provide competitive advantage in terms of cost and quicker delivery of solutions. Information and communication technologies (ICT) must be able to provide services similar to face-to-face conditions. Additional organisational functions must be enhanced to overcome the shortcomings of ICT and to compensate for time gaps, cultural differences, and distributed teamwork. Our proposed model identifies four key work models or patterns that affect the operation of distributed work arrangements, and we also propose guidelines for managing distributed work efficiently and effectively.

Author 1: Ravinder Singh
Author 2: Dr. Kevin Lano

Keywords: Distributed work; Onshore; Offshore; Software; IT projects; Programme and project Management

PDF

Paper 13: System Autonomy Modeling During Early Concept Definition

Abstract: Current rapid systems engineering design methods, such as Agile, significantly reduce development time. This results in the early availability of incremental capabilities and increases the importance of accelerating and effectively performing early concept trade studies. Current system autonomy assessment tools are level-based and are used to report the levels of autonomy attained during field trials. These tools have limited applicability in earlier design definition stages. An algorithmic system autonomy tool is needed to facilitate the trade-off studies, analyses of alternatives, and concept-of-operations work performed during those very early phases. We have developed our contribution to such a tool and describe it in this paper.

Author 1: Rosteslaw M. Husar
Author 2: Jerrell Stracener

Keywords: Systems Engineering; Autonomous Systems; Requirements Engineering; System of Systems component; System Autonomy Modeling

PDF

Paper 14: Prototype of a Web ETL Tool

Abstract: Extract, transform and load (ETL) is a process that makes it possible to extract data from operational data sources, to transform the data in the way needed for data warehousing purposes, and to load the data into a data warehouse (DW). The ETL process is the most important part of building a data warehouse. Because the ETL process is very complex and time-consuming, this paper presents a prototype of a web ETL tool that offers the end user step-by-step guidance through the entire process. The tool is designed as a web application, so users save the time (and space) required for installation.

Author 1: Matija Novak
Author 2: Kornelije Rabuzin

Keywords: ETL; data warehouse; web; ETL tool

PDF
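
A minimal sketch of one extract-transform-load pass of the kind such a tool guides the user through. SQLite stands in for the data warehouse and the sales.csv schema is hypothetical; the paper's prototype wraps steps like these behind a step-by-step web interface.

```python
# Minimal ETL sketch: extract rows from a CSV source, transform them into a
# fact-table shape, and load them into a (stand-in) warehouse table.
import csv
import sqlite3

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        yield (row["date"], row["product"].strip().upper(),
               int(row["quantity"]) * float(row["unit_price"]))

def load(rows, db="warehouse.db"):
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS fact_sales (date TEXT, product TEXT, revenue REAL)")
    con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```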

Paper 15: Using an MPI Cluster in the Control of a Mobile Robots System

Abstract: Recently, HPC (High Performance Computing) systems have moved from supercomputers to clusters. Clusters are used in all tasks that require very high computing power, such as weather forecasting, climate research, molecular modeling, physical simulations, and cryptanalysis. The use of clusters is increasingly important in the scientific community, where the need for high performance computing (HPC) is still growing. In this paper, we propose an improvement to the control of a mobile robot system by using an MPI (Message Passing Interface) cluster. This cluster launches, manipulates, and processes data from multiple robots simultaneously.

Author 1: Mohamed Salim LMIMOUNI
Author 2: Saïd BENAISSA
Author 3: Hicham MEDROMI
Author 4: Adil SAYOUTI

Keywords: clusters; MPI; parallel programming; mobile systems; mobile robots

PDF
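
A schematic master/worker sketch of controlling several robots from an MPI cluster, written with mpi4py (the paper does not state which MPI binding it uses); the robot command handling is a placeholder.

```python
# Schematic master/worker pattern with mpi4py: rank 0 dispatches commands,
# every other rank stands in for the controller of one robot.
# Run with e.g. `mpiexec -n 4 python robots.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    # Master: send one (hypothetical) command per robot and collect replies.
    for robot in range(1, size):
        comm.send({"cmd": "move", "distance_m": 1.0}, dest=robot, tag=1)
    for robot in range(1, size):
        print("robot", robot, "reported:", comm.recv(source=robot, tag=2))
else:
    # Worker: a real controller would drive the robot's actuators/sensors here.
    command = comm.recv(source=0, tag=1)
    comm.send({"robot": rank, "status": "done", "executed": command}, dest=0, tag=2)
```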

Paper 16: Simulation of Performance Execution Procedure to Improve Seamless Vertical Handover in Heterogeneous Networks

Abstract: One challenge of wireless network integration is providing ubiquitous wireless access with seamless handover for any moving communication device between different types of technologies (3GPP and non-3GPP), such as the Global System for Mobile Communication (GSM), Wireless Fidelity (Wi-Fi), Worldwide Interoperability for Microwave Access (WiMAX), the Universal Mobile Telecommunications System (UMTS), and Long Term Evolution (LTE). This challenge is important as Mobile Users (MUs) are becoming increasingly demanding of services regardless of the technological complexities associated with them. To fulfill these requirements for seamless Vertical Handover (VHO), two main interworking architectures have been proposed by the European Telecommunications Standards Institute (ETSI) for integration between different types of technologies, namely loose and tight coupling. On the other hand, Media Independent Handover, IEEE 802.21 (MIH), is a framework proposed by the IEEE to provide seamless VHO between the aforementioned technologies by utilizing these interworking architectures to facilitate and complement their operation. This paper presents the design and simulation of a Mobile IPv4 (MIPv4) based procedure for the loose coupling architecture with MIH to optimize performance in heterogeneous wireless networks. The simulation results show that the proposed procedure provides seamless VHO with minimal latency and a zero packet loss ratio.

Author 1: Omar Khattab
Author 2: Omar Alani

Keywords: Vertical Handover (VHO); Media Independent Handover (MIH); Interworking Architectures; Mobile IPv4 (MIPv4); Heterogeneous Wireless Networks

PDF

Paper 17: Toward an Effective Information Security Risk Management of Universities’ Information Systems Using Multi Agent Systems, ITIL, ISO 27002, ISO 27005

Abstract: Universities in the public and private sectors depend on information technology and information systems to successfully carry out their missions and business functions. Information systems are subject to serious threats that can have adverse effects on organizational operations, assets, and individuals by exploiting both known and unknown vulnerabilities to compromise the confidentiality, integrity, or availability of the information being processed, stored, or transmitted by those systems. Threats to information systems include purposeful attacks, environmental disruptions, and human/machine errors, and can result in harm to the integrity of data. Therefore, it is imperative that all actors at all levels in a university information system understand their responsibilities and are held accountable for managing information security risk, that is, the risk associated with the operation and use of the information systems that support the missions and business functions of their university. The purpose of this paper is to propose an information security toolkit, namely URMIS (University Risk Management Information System), based on multi-agent systems and integrated with existing information security frameworks and standards, to enhance the security of university information systems.

Author 1: S. FARIS
Author 2: S. EL HASNAOUI
Author 3: H. MEDROMI
Author 4: H. IGUER
Author 5: A. SAYOUTI

Keywords: Information security; information systems; multi agent systems; ITIL V3; ISO 27002; ISO 27005

PDF

Paper 18: A Novel Cloud Computing Security Model to Detect and Prevent DoS and DDoS Attack

Abstract: Cloud computing is considered one of the crucial emerging networking technologies, and it has changed the architecture of computing over the last few years. Despite security concerns about protecting data and providing continuous service over the cloud, many organisations are considering different types of cloud services as potential solutions for their business. We are researching cloud computing security issues and potential cost-effective solutions for cloud service providers. In our first paper, we revealed a number of security risks for the cloud computing environment, focusing on the lack of awareness of cloud service providers. In our second paper, we investigated the technical security issues involved in the cloud service environment, where it was revealed that DoS or DDoS is one of the most common and significant dangers to the cloud computing environment. In this paper, we investigate different techniques that can be used for DoS or DDoS attacks and recommend a hardware-based watermarking framework to protect organisations from these threats.

Author 1: Masudur Rahman
Author 2: Wah Man Cheung

Keywords: Denial of Service attack; Distributed Denial of Service Attack; mechanism of DoS and DDoS attack; framework to prevent DDoS attack; hardware based watermarking

PDF

Paper 19: Fast Efficient Clustering Algorithm for Balanced Data

Abstract: Cluster analysis is a major technique in statistical analysis, machine learning, pattern recognition, data mining, image analysis, and bioinformatics. The k-means algorithm is one of the most important clustering algorithms. However, the k-means algorithm needs a large amount of computational time to handle large data sets. In this paper, we develop a more efficient clustering algorithm, named Fast Balanced k-means (FBK-means), to overcome this deficiency. This algorithm not only yields clustering results as good as those of the k-means algorithm but also requires less computational time. The algorithm works well in the case of balanced data.

Author 1: Adel A. Sewisy
Author 2: M. H. Marghny
Author 3: Rasha M. Abd ElAziz
Author 4: Ahmed I. Taloba

Keywords: Clustering; K-means algorithm; Bee algorithm; GA algorithm; FBK-means algorithm

PDF
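
For reference, a plain Lloyd's k-means loop in NumPy, the baseline that FBK-means is designed to outperform; the paper's balanced, faster variant (and the bee/GA ideas in the keywords) is not reproduced here.

```python
# Baseline Lloyd's k-means in NumPy, shown only as the reference point that
# FBK-means improves on.
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute the centers.
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

data_rng = np.random.default_rng(1)
X = np.vstack([data_rng.normal(c, 0.5, size=(50, 2)) for c in (0, 5, 10)])
centers, labels = kmeans(X, k=3)
print("cluster sizes:", np.bincount(labels))
```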

Paper 20: Encrypted With Fuzzy Compliment-Max-Product Matrix in Watermarking

Abstract: Watermarking is used to protect copyright and to authenticate images. In today’s digital world, images are in electronic form and available on the Internet. For their protection and authentication, invisible watermarking in encrypted form is used. In this paper, encryption is done using a fuzzy Compliment-Max-Product matrix, and the encrypted watermark is then embedded in the digital media at desired places using a fuzzy rule. The Region of Interest (ROI) is decided by fuzzification. Then, the watermark is inserted at the respective positions in the image. The robustness of the watermark is judged for the ROI. This method of watermarking works on all image file formats and is resistant to geometric, noise, and compression attacks.

Author 1: Sharbani Bhattacharya

Keywords: Watermarking; Fuzzy Compliment-Max-Product Matrix; Fuzzification; Encryption

PDF
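
The fuzzy operations named in the title are standard: the complement of a fuzzy matrix A is 1 − A, and the max-product composition takes (A∘B)ij = max_k A_ik·B_kj. Below is a NumPy sketch of just those two operations, not of the paper's encryption, fuzzification, or embedding steps.

```python
# Fuzzy complement and max-product composition of fuzzy matrices (entries in [0, 1]).
import numpy as np

def fuzzy_complement(A):
    return 1.0 - A

def max_product(A, B):
    # (A o B)[i, j] = max over k of A[i, k] * B[k, j]
    return np.max(A[:, :, None] * B[None, :, :], axis=1)

A = np.array([[0.2, 0.7], [0.9, 0.4]])
B = np.array([[0.5, 0.1], [0.6, 0.8]])
print(max_product(fuzzy_complement(A), B))
```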

Paper 21: Watermarking Digital Image Using Fuzzy Matrix Compositions and Rough Set

Abstract: Watermarking is applied to digital images for authentication and to restrict unauthorized use. The watermark is sometimes invisible and can be extracted only by an authenticated party. A text or other information is encrypted with a public-private key derived from two fuzzy matrices and embedded in the image as a watermark. In this paper we propose two fuzzy compositions, Product-Mod-Minus and Compliment-Product-Minus. The watermark is embedded using a fuzzy rough set created from the fuzzy matrix compositions.

Author 1: Sharbani Bhattacharya

Keywords: Fuzzy Product-Mod-Minus Matrix; Fuzzy Compliment-Product-Minus Matrix; Fuzzy Rough set; Watermarking; Encrypting

PDF

Paper 22: The Solution Structure and Error Estimation for The Generalized Linear Complementarity Problem

Abstract: In this paper, we consider the generalized linear complementarity problem (GLCP). Firstly, we develop some equivalent reformulations of the problem under milder conditions and then characterize the solution of the GLCP. Secondly, we establish a global error estimation for the GLCP by weakening the assumptions. The results obtained in this paper can be taken as an extension of those for classical linear complementarity problems.

Author 1: Tingfa Yan

Keywords: GLCP; solution structure; error estimation

PDF
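
The abstract does not spell out which formulation of the GLCP the paper uses; one common affine formulation, which reduces to the classical LCP as a special case, is the following.

```latex
% One common affine formulation of the GLCP (an assumption; the abstract does
% not state the exact variant): given $M, N \in \mathbb{R}^{m \times n}$ and
% $p, q \in \mathbb{R}^{m}$, find $x \in \mathbb{R}^{n}$ such that
\begin{align*}
  F(x) &:= Mx + p \;\ge\; 0,\\
  G(x) &:= Nx + q \;\ge\; 0,\\
  F(x)^{\mathsf{T}} G(x) &= 0 .
\end{align*}
% Taking $G(x) = x$ (i.e.\ $N = I$, $q = 0$) recovers the classical LCP, which
% is the sense in which results for the GLCP extend the classical problem.
```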

Paper 23: Forecasting Rainfall Time Series with stochastic output approximated by neural networks Bayesian approach

Abstract: The annual estimate of the amount of water available to the agricultural sector has become a lifeline in places where rainfall is scarce, as is the case in northwestern Argentina. This work proposes to model and simulate the monthly rainfall time series of one geographical location in Catamarca, Valle El Viejo Portezuelo. In this sense, time series prediction means the mathematical and computational modelling of the series of monthly cumulative rainfall, whose stochastic output is approximated by neural networks with a Bayesian approach. We propose to use an algorithm based on artificial neural networks (ANNs) using Bayesian inference. The prediction is evaluated on 20% of the provided data, which covers 2000 to 2010. A new analysis for modelling, simulation, and computational prediction of cumulative rainfall at one geographical location is presented. Only the historical time series of daily flows measured in mmH2O is used as input data. Preliminary results of the annual forecast in mmH2O with a prediction horizon of one and a half years (18 months) are presented. The methodology employs artificial neural network based tools, statistical analysis, and computational methods to complete the missing information and to characterize the qualitative and quantitative behavior. We also show some preliminary results of the proposed filter at different prediction horizons and compare its performance with the Gaussian process filter used in the literature.

Author 1: Cristian Rodriguez Rivero
Author 2: Julian Antonio Pucheta

Keywords: rainfall time series; stochastic method; bayesian approach; computational intelligence

PDF
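
A sketch of the basic lagged-input ANN forecasting setup, using a plain (non-Bayesian) multi-layer perceptron from scikit-learn; the synthetic monthly totals, the 12-month lag window, and the 80/20 split are illustrative assumptions, not the paper's data or its Bayesian treatment.

```python
# Lagged-input ANN forecasting sketch on synthetic monthly rainfall totals.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
months = np.arange(132)  # 11 years of monthly data
rain = 50 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 10, months.size)

LAGS = 12  # predict each month from the previous 12 months
X = np.array([rain[i:i + LAGS] for i in range(len(rain) - LAGS)])
y = rain[LAGS:]

split = int(0.8 * len(X))  # hold out the last 20%, as in the abstract
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("held-out RMSE (mm):", round(float(np.sqrt(np.mean((pred - y[split:]) ** 2))), 2))
```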

Paper 24: Estimation of Water Quality Parameters Using the Regression Model with Fuzzy K-Means Clustering

Abstract: The traditional remote sensing methods used for monitoring and estimating pollutants generally rely on the spectral response or scattering reflected from water. In this work, a new method is proposed to find contaminants and determine Water Quality Parameters (WQPs) based on texture analysis theory. Empirical statistical models have been developed to estimate and classify contaminants in the water. The Gray Level Co-occurrence Matrix (GLCM) is used to estimate six texture parameters: contrast, correlation, energy, homogeneity, entropy, and variance. These parameters are used to estimate a regression model with three WQPs. Finally, fuzzy K-means clustering is used to generalize the water quality estimation over the whole segmented image. Using in situ measurements and IKONOS data, the obtained results show that texture parameters and high-resolution remote sensing are able to monitor and predict the distribution of WQPs in large rivers.

Author 1: Muntadher A. SHAREEF
Author 2: Abdelmalek TOUMI
Author 3: Ali KHENCHAF

Keywords: In situ data measurements; IKONOS data; water quality parameters; GLCM; empirical models; fuzzy K-means clustering

PDF
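
A sketch of the GLCM-texture-to-WQP regression step with scikit-image and scikit-learn; the function names graycomatrix/graycoprops assume scikit-image ≥ 0.19 (older releases spell them greycomatrix/greycoprops), and the image patches and water-quality values are synthetic stand-ins for the IKONOS and in situ data.

```python
# GLCM texture features (contrast, correlation, energy, homogeneity, entropy,
# variance) regressed against a synthetic water-quality parameter.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LinearRegression

def texture_features(patch):
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    feats = [graycoprops(glcm, p)[0, 0]
             for p in ("contrast", "correlation", "energy", "homogeneity")]
    p = glcm[:, :, 0, 0]                     # normalized co-occurrence matrix
    i = np.arange(256)[:, None]
    mu = np.sum(p * i)
    feats.append(-np.sum(p[p > 0] * np.log2(p[p > 0])))  # entropy
    feats.append(np.sum(p * (i - mu) ** 2))               # variance over gray level
    return feats

rng = np.random.default_rng(0)
patches = [rng.integers(0, 256, size=(32, 32), dtype=np.uint8) for _ in range(30)]
wqp = rng.uniform(0, 10, size=30)  # synthetic stand-in for one WQP (e.g. turbidity)

X = np.array([texture_features(p) for p in patches])
model = LinearRegression().fit(X, wqp)
print("R^2 on the synthetic training data:", round(model.score(X, wqp), 3))
```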

Paper 25: A Compound Generic Quantitative Framework for Measuring Digital Divide

Abstract: The term digital divide has been used in the literature to conceptualize the gap in the use and utilization of information and communication technologies. The digital divide can be identified at different levels, such as individuals, groups, societies, organizations, and countries. On the other hand, the concept of e-Inclusion was coined to define the activities needed to bridge the digital divide. One of the most challenging research areas on the digital divide, and one that has been the subject of exhaustive study, is measuring it. Researchers have proposed many metrics and indices to measure the digital divide. However, most of the proposed measures are bivariate comparisons that reduce measurement to comparisons of Internet penetration rates or the like. This paper proposes a compound generic framework for quantitatively measuring the digital divide at the individual or group level. The proposed framework takes into account the context of the digital divide in each society.

Author 1: Noureldien A. Noureldien

Keywords: Digital Divide; Digital Divide Indicator; E-inclusion; Inclusion Factors; Inclusion Activities

PDF

Paper 26: XCS with an internal action table for non-Markov environments

Abstract: To cope with sequential decision problems in non-Markov environments, learning classifier systems using an internal register have been proposed. Since these systems control the internal register through the action part of classifiers, in the same way as they choose actions in the environment, they do not always work well. In this paper, we develop an effective learning classifier system with two different rule sets for internal and external actions. The first is used for determining internal actions, that is, rules for controlling the internal register. It provides stable performance by separating control of the internal register from the action part of classifiers; it is represented as “If [external state] & [internal state] then [internal action],” and we call the set of these rules the internal action table. The second is for selecting external actions, as in the classical classifier system, but its structure is slightly different from the classical one; it is represented as “If [external state] & [internal state] & [internal action] then [external action].” In the proposed system, aliased states in the environment are identified by observing the payoffs of a classifier and referring to the internal action table. To demonstrate the efficiency and effectiveness of the proposed system, we apply it to the woods environments used in related works and compare its performance with that of existing classifier systems.

Author 1: Tomohiro Hayashida
Author 2: Ichiro Nishizaki
Author 3: Keita Moriwake

Keywords: Learning classifier systems; Non-Markov environments; XCS; Internal register.

PDF
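
A schematic sketch of the two rule forms quoted in the abstract, expressed only as data structures; none of XCS's matching, payoff prediction, or rule discovery machinery is shown, and the toy corridor example is hypothetical.

```python
# The two rule forms described in the abstract, as plain data structures only.
from dataclasses import dataclass

@dataclass
class InternalRule:
    """Internal action table entry:
    if [external state] & [internal state] then [internal action]."""
    external_state: str
    internal_state: str
    internal_action: str   # new value written to the internal register

@dataclass
class ExternalRule:
    """Classifier for the environment:
    if [external state] & [internal state] & [internal action] then [external action]."""
    external_state: str
    internal_state: str
    internal_action: str
    external_action: str

# Hypothetical toy example: in an aliased state, the internal register value
# disambiguates which external move to take.
internal_table = [InternalRule("corridor", "0", "1")]
classifiers = [ExternalRule("corridor", "0", "1", "move-north"),
               ExternalRule("corridor", "1", "1", "move-south")]
print(internal_table[0], classifiers[0], sep="\n")
```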
