The Science and Information (SAI) Organization
IJACSA Volume 3 Issue 8

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially as long as the original work is properly cited.


Paper 1: Instruction Design Model for Self-Paced ICT System E-Learning in an Organization

Abstract: Adopting an Information and Communication Technology (ICT) system in an organization is challenging. User diversity, heavy workloads, and varying skill gaps slow the ICT adoption process. This research starts from the observation that conventional ICT learning through short workshops and guidance books does not work well. We propose a model called the ICT instruction design model (ICT-IDM), which provides fast-track learning by integrating multimedia learning with self-paced, hands-on E-learning. Through a case study, we found that the proposed model yields 27% faster learning adoption than the conventional learning model.

Author 1: Ridi Ferdiana
Author 2: Obert Hoseanto

Keywords: ICT System Adoption; Learning Model; Multimedia Learning; E-learning; Instruction Design Model; Learning Plan.

PDF

Paper 2: An Enhanced MPLS-TE For Transferring Multimedia packets

Abstract: Multi-Protocol Label Switching (MPLS) is useful for managing multimedia traffic when some links are congested, and MPLS Traffic Engineering (MPLS-TE) is increasingly deployed in today's service provider networks. In this paper we propose an improvement of MPLS-TE called EMPLS-TE, based on a modification of the operation of the Forwarding Equivalence Class (FEC), in order to provide quality of service for multimedia streams. The performance of EMPLS-TE is evaluated by a simulation model under a variety of network conditions and compared with that of unmodified MPLS-TE and MPLS. We demonstrate how a small change to the MPLS-TE protocol can lead to significantly improved performance, and present a comparative analysis of MPLS, MPLS-TE, and EMPLS-TE. The proposed EMPLS-TE has a performance advantage for multimedia applications in congested and dense environments, defining paths for network traffic based on quality-of-service criteria. The simulation study conducted in this paper illustrates the benefits of using the Enhanced MPLS-TE for multimedia applications.

Author 1: Abdellah Jamali
Author 2: Najib Naja
Author 3: Driss El Ouadghiri

Keywords: Multi-Protocol Label Switching (MPLS); Multi-Protocol Label Switching Traffic Engineering (MPLS-TE); Forwarding Equivalence Class (FEC); Quality Of Service (QoS); Simulation.

PDF
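The abstract describes modifying the FEC so that multimedia traffic maps onto traffic-engineered paths. As a hypothetical sketch only (the paper's actual FEC rule is not given here; the function name and table layout are ours), a class-aware FEC lookup might look like:

```python
def assign_fec(dst_prefix, traffic_class, fec_table):
    """Map a packet to an MPLS label, preferring an entry that matches
    both destination and traffic class (the class-aware FEC idea),
    falling back to the plain destination-based best-effort FEC."""
    return fec_table.get((dst_prefix, traffic_class),
                         fec_table.get((dst_prefix, "best-effort")))

# Illustrative label table: multimedia towards 10.1.0.0/16 takes an
# engineered LSP (label 301), everything else the default LSP (300).
table = {
    ("10.1.0.0/16", "multimedia"): 301,
    ("10.1.0.0/16", "best-effort"): 300,
}
```

With this table, `assign_fec("10.1.0.0/16", "multimedia", table)` selects the engineered path while an FTP flow to the same prefix falls back to the default label.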

Paper 3: A New Algorithm for Data Compression Optimization

Abstract: People tend to store a lot of files in their storage. When the storage nears its limit, they try to reduce the size of those files using data compression software. In this paper we propose a new algorithm for data compression, called j-bit encoding (JBE). The algorithm manipulates each bit of data inside a file to minimize its size without losing any data after decoding, so it is classified as lossless compression. This basic algorithm is intended to be combined with other data compression algorithms to optimize the compression ratio. Its performance is measured by comparing combinations of different data compression algorithms.

Author 1: I Made Agus Dwi Suarjaya

Keywords: algorithms; data compression; j-bit encoding; JBE; lossless.

PDF
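The j-bit encoding idea can be pictured as splitting a byte stream into its non-zero bytes plus a bitmap recording which positions held zeros, a form that downstream compressors handle well. This is only a rough sketch of the idea as the abstract describes it; function names and details are ours, not the paper's:

```python
def jbe_encode(data: bytes) -> tuple[bytes, bytes]:
    """Split input into (non-zero byte stream, bitmap), one bitmap
    bit per original byte: 1 = non-zero, 0 = zero."""
    nonzero = bytes(b for b in data if b != 0)
    bits, bitmap = 0, bytearray()
    for i, b in enumerate(data):
        bits = (bits << 1) | (1 if b != 0 else 0)
        if i % 8 == 7:                 # flush a full byte of flags
            bitmap.append(bits)
            bits = 0
    rem = len(data) % 8
    if rem:                            # left-align the trailing flags
        bitmap.append(bits << (8 - rem))
    return nonzero, bytes(bitmap)

def jbe_decode(nonzero: bytes, bitmap: bytes, length: int) -> bytes:
    """Reverse the split: re-insert zeros where the bitmap says so."""
    out, it = bytearray(), iter(nonzero)
    for i in range(length):
        bit = (bitmap[i // 8] >> (7 - i % 8)) & 1
        out.append(next(it) if bit else 0)
    return bytes(out)
```

The round trip is exact, which is what makes the scheme lossless; either output stream can then be fed to a conventional compressor.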

Paper 4: Monte Carlo Based Non-Linear Mixture Model of Earth Observation Satellite Imagery Pixel Data

Abstract: A Monte Carlo based non-linear mixel (mixed pixel) model for the visible to near infrared radiometer of earth observation satellite imagery is proposed. Through comparative studies on real earth observation satellite imagery data, between the conventional linear mixel model and the proposed non-linear mixel model, it is found that the proposed model represents the pixels in question much more precisely than the conventional linear mixel model.

Author 1: Kohei Arai

Keywords: remote sensing satellite; visible to near infrared radiometer; mixed pixel (mixel); Monte Carlo simulation model.

PDF

Paper 5: A Modified Feistel Cipher Involving Substitution, Shifting of rows, Mixing of columns, XOR operation with a Key and Shuffling

Abstract: In this paper we have developed a modification to the Feistel cipher by taking the plaintext in the form of a pair of matrices and introducing a set of functions, namely substitution, shifting of rows, mixing of columns, and XOR operation with a key. We have supplemented this process with another function, called shuffling, applied at the end of each round of the iteration process. The cryptanalysis clearly indicates that the strength of the cipher is quite significant, and that this is achieved by the introduction of the aforementioned functions.

Author 1: V U.K Sastry
Author 2: K. Anup Kumar

Keywords: encryption; decryption; cryptanalysis; avalanche effect; XOR operation.

PDF
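A toy single round of the described pipeline (substitute, shift rows, mix columns, XOR with a key, shuffle) over a 16-byte state viewed as a 4x4 matrix might look like the following. The S-box, shuffle permutation, and XOR-based column mix are illustrative stand-ins, not the authors' actual functions, and the Feistel halving itself is omitted:

```python
import random

rng = random.Random(42)                       # fixed toy parameters
SBOX = list(range(256)); rng.shuffle(SBOX)
INV_SBOX = [0] * 256
for i, v in enumerate(SBOX):
    INV_SBOX[v] = i
SHUFFLE = list(range(16)); rng.shuffle(SHUFFLE)

def encrypt_round(state, key):
    s = [SBOX[b] for b in state]                       # substitute
    m = [s[r*4:(r+1)*4] for r in range(4)]
    m = [row[r:] + row[:r] for r, row in enumerate(m)]  # shift row r by r
    for c in range(3):                                 # mix: XOR col c with c+1
        for r in range(4):
            m[r][c] ^= m[r][c + 1]
    s = [b for row in m for b in row]
    s = [b ^ k for b, k in zip(s, key)]                # XOR with round key
    return [s[SHUFFLE[i]] for i in range(16)]          # shuffle positions

def decrypt_round(state, key):
    s = [0] * 16
    for i in range(16):
        s[SHUFFLE[i]] = state[i]                       # un-shuffle
    s = [b ^ k for b, k in zip(s, key)]
    m = [s[r*4:(r+1)*4] for r in range(4)]
    for c in range(2, -1, -1):                         # un-mix right to left
        for r in range(4):
            m[r][c] ^= m[r][c + 1]
    m = [row[-r:] + row[:-r] for r, row in enumerate(m)]  # un-shift
    s = [b for row in m for b in row]
    return [INV_SBOX[b] for b in s]
```

Every step is invertible, so a round trip restores the plaintext; a real cipher would iterate many such rounds with distinct round keys.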

Paper 6: Automatic Association of Strahler’s Order and Attributes with the Drainage System

Abstract: A typical drainage pattern is an arrangement of river segments in a drainage basin and has several contributing identifiable features such as leaf segments, intermediate segments, and bifurcations. In studies related to the morphological assessment of drainage patterns, for estimating channel capacity, length, bifurcation ratio, and the contribution of segments to the main stream, associating an order with each identified segment and creating an attribute repository play a pivotal role. Strahler (1952) proposed an ordering technique that categorizes the identified stream segments into different classes based on their significance and contribution to the drainage pattern. This work aims at implementing procedures that efficiently associate an order with the identified segments and automatically create a repository storing the attributes and estimates of the different segments. Such techniques not only reduce the time and effort required by manual procedures, but also improve the confidence and reliability of the results.

Author 1: Mohan P Pradhan
Author 2: M. K. Ghose
Author 3: Yash R. Kharka

Keywords: Stream; digitization; Strahler’s order.

PDF
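Strahler ordering itself is standard: a headwater (leaf) segment has order 1, and a segment's order rises only where two or more upstream segments of equal maximal order meet. A minimal sketch over a drainage tree (the dict representation mapping a segment to its upstream children is ours):

```python
def strahler_order(tree, node):
    """Strahler order of `node` in a drainage tree given as
    {segment: [upstream child segments]}."""
    children = tree.get(node, [])
    if not children:
        return 1                       # headwater (leaf) segment
    orders = [strahler_order(tree, c) for c in children]
    top = max(orders)
    # order increases only when >= 2 children share the maximum
    return top + 1 if orders.count(top) >= 2 else top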

Paper 7: Performance model to predict overall defect density

Abstract: Management by metrics is what IT service providers are expected to practise in order to stay differentiated. Given a project and its associated parameters and dynamics, the behaviour and outcome need to be predicted. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. In most cases, the actions taken are reactive and come too late in the life cycle: root cause analysis and corrective actions can be implemented only to the benefit of the next project. The focus has to shift left, towards the execution phase, rather than waiting for lessons to be learnt after implementation. How do we proactively predict defect metrics and have a preventive action plan in place? This paper illustrates a process performance model to predict overall defect density based on data from projects in an organization.

Author 1: J Venkatesh
Author 2: Mr. Priyesh Cherurveettil
Author 3: Mrs. Thenmozhi. S
Author 4: Dr. Balasubramanie. P

Keywords: process; performance; defect density; metrics.

PDF

Paper 8: Spontaneous-braking and lane-changing effect on traffic congestion using cellular automata model applied to the two-lane traffic

Abstract: In real traffic situations, a vehicle brakes in response to another vehicle, to avoid a collision, or in response to an obstacle such as a pothole, snow, or a pedestrian crossing the road unexpectedly. However, in some cases spontaneous braking occurs even though there is no obstacle in front of the vehicle. In some countries, reckless driving behaviours such as sudden stops by public buses, motorcycles changing lanes too quickly, or tailgating increase the probability of braking. The new aspect of this paper is the simulation of the braking behaviour of drivers, for which we present a new cellular automata model. Moreover, this paper examines the impact of lane-changing maneuvers in reducing the traffic congestion caused by spontaneous braking.

Author 1: Kohei Arai
Author 2: Steven Ray Sentinuwo

Keywords: spontaneous-braking; traffic congestion; cellular automata; two-lane traffic.

PDF
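The spontaneous-braking idea fits naturally into a Nagel-Schreckenberg style cellular automaton, where a randomization step lowers a vehicle's speed with some probability even without an obstacle ahead. A single-lane sketch follows; the paper's actual two-lane model and parameters differ, so this is purely illustrative:

```python
import random

def nasch_step(cars, road_len, vmax=5, p_brake=0.3, rng=random):
    """One parallel update of Nagel-Schreckenberg rules on a ring road.
    `cars` is a list of (position, speed) tuples sorted by position;
    p_brake is the spontaneous-braking probability."""
    n = len(cars)
    updated = []
    for i, (pos, v) in enumerate(cars):
        lead_pos = cars[(i + 1) % n][0]
        gap = (lead_pos - pos - 1) % road_len   # empty cells ahead
        v = min(v + 1, vmax)                    # 1. accelerate
        v = min(v, gap)                         # 2. avoid collision
        if v > 0 and rng.random() < p_brake:    # 3. spontaneous braking
            v -= 1
        updated.append(((pos + v) % road_len, v))
    return sorted(updated)
```

With `p_brake = 0` the dynamics are deterministic and free-flowing traffic settles at `vmax`; raising `p_brake` reproduces the spontaneous jams the abstract discusses.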

Paper 9: Enhancing eHealth Information Systems for chronic diseases remote monitoring systems

Abstract: Statistics and demographics for the aging population in Europe are compelling. The stakes lie in disability and chronic diseases, whose proportions will increase because of increased life expectancy. Heart failure (HF), a serious chronic disease, induces frequent re-hospitalizations, some of which can be prevented by upstream actions. Managing HF is a complex process: long, often difficult, and expensive. In France, nearly one million people suffer from HF and 120,000 new cases are diagnosed every year. For such patients, a telemedicine system combined with motivation and education tools can significantly reduce the number of hospital days due to acute HF. Current development projects focus on prevention, human security, and remote monitoring of people in their day-to-day living spaces, from the perspective of health and wellness. These projects encompass gathering, organizing, structuring, and sharing medical information, and they must also take into account the main aspects of interoperability. Different approaches have been used to capitalize on such information: the data warehouse approach, the mediation approach (integration by views), and the integration-by-link approach (so-called mashup). In this paper, we focus on ontologies, which take a central place in the Semantic Web: on the one hand, they rely on modeling from conceptual representations of the areas concerned and, on the other hand, they allow programs to make inferences over them.

Author 1: Amir HAJJAM

Keywords: Ontologies; Semantic Web; Remote Monitoring; Chronic Diseases.

PDF

Paper 10: E-commerce Smartphone Application

Abstract: Mobile and e-commerce applications are tools for accessing the Internet and for buying products and services. These applications are constantly evolving due to the high rate of technological advances. This paper provides a new perspective on the types of applications that can be used. It describes and analyses device requirements, reviews the literature on important aspects of mobile devices that can use such applications, and examines the requirements of websites designed for m-commerce. The design and security aspects of mobile devices are also investigated. As an alternative to existing m-commerce applications, this paper also investigates the characteristics and potential of the PhoneGap cross-mobile platform. The results suggest that effective mobile applications exist for various smartphones and that web applications on mobile devices can be effective. PhoneGap and Spree applications can communicate using JSON instead of XML, and Android emulators can be used to verify functionality and to compile the applications.

Author 1: Abdullah Saleh Alqahtani
Author 2: Robert Goodwin

Keywords: E-commerce; PhoneGap; M-commerce; Smartphones; Spree commerce; Ruby on Rails.

PDF

Paper 11: SW-SDF Based Personal Privacy with QIDB-Anonymization Method

Abstract: Personalized anonymization is a method in which a guarding node indicates whether the record owner is willing to reveal the sensitivity of a record, based on which anonymization is performed. Most sensitive values present in a private database do not require privacy preservation, since the record owner's sensitivity is a general one; only a few records in the entire distribution require privacy. For example, a record owner with flu does not mind revealing his identity, in contrast to a record owner with cancer. Even then, some record owners who have cancer are willing to reveal their identity; this is the motivation for SW-SDF based personal privacy. In this paper we propose a novel personalized privacy-preserving technique that overcomes the disadvantages of previous personalized privacy and other anonymization techniques. The core of this method can be divided into two major components. The first component deals with additional attributes, in the form of flags, used to divide the sensitive attribute: the Sensitive Disclosure Flag (SDF) determines whether the record owner's sensitive information is to be disclosed or whether privacy should be maintained, and the Sensitive Weight (SW) indicates how sensitive the attribute value is compared with the rest. The second component deals with a novel representation, the Frequency Distribution Block (FDB) and Quasi-Identifier Distribution Block (QIDB), which is used in anonymization. Experimental results show that the method has lower information loss and faster execution time than existing methods.

Author 1: Kiran P
Author 2: Dr Kavya N P

Keywords: Privacy Preserving Data Mining (PPDM); Privacy Preserving Data Publishing (PPDP); Personal Anonymization.

PDF

Paper 12: Integration of data mining within a Strategic Knowledge Management framework

Abstract: In today's globally interconnected economy, knowledge is recognised as a valuable intangible asset and a source of competitive advantage for firms operating in both established and emerging industries. Within these contexts, Knowledge Management (KM) manifests as a set of organising principles and heuristics which shape management routines, structures, technologies and cultures within organisations. When employed as an integral part of business strategy, KM can blend and develop the expertise and capacity embedded in human and technological networks. This may improve processes or add value to products, services, brands and reputation. We argue that if located within a suitable strategic framework, KM can enable sustainable competitive advantage by mobilising the intangible value in networks to create products, processes or services with unique characteristics that are hard to substitute or replicate. Despite the promise of integrated knowledge strategies within high-technology and professional service industries, there has been limited discussion of business strategies linked to Knowledge Management in traditional capital-intensive industries such as mining and petroleum. Within these industries, IT-centric Knowledge Management Systems (KMS) have dominated, with varying degrees of success, as business analysis, process improvement and cost reduction tools. This paper explores the opportunities and benefits arising from applying a strategic KM and Data Mining framework within the local operations of large domestic or multinational mining companies located in Western Australia (WA). The paper presents a high-level conceptual framework for integrating so-called hard (ICT) and soft (human) systems, representing the explicit and tacit knowledge embedded within broader networks of mining activity.
This Strategic Knowledge Management (SKM) framework is presented as a novel first step towards improving organisational performance and realising the human and technological capability captured in organisational networks. The SKM framework represents a unique combination of concepts and constructs from the Strategy, Knowledge Management, Information Systems, and Data Mining literatures. It was generated from Stage 1 (a literature and industry documentation review) of a two-stage exploratory study. Stage 2 will comprise a quantitative case-based research approach employing clearly defined metrics to describe and compare SKM activity in designated mining companies.

Author 1: Sanaz Moayer
Author 2: Scott Gardner

Keywords: Knowledge Management (KM); data mining; sustainable competitive advantage; Strategic Knowledge Management (SKM) framework; integration; hard and soft systems; Australian mining organisation.

PDF

Paper 13: Managing Changes in Citizen-Centric Healthcare Service Platform using High Level Petri Net

Abstract: Healthcare organizations are facing a number of daunting challenges, pushing systems to deal with requirements changes and to benefit from modern technologies and telecom capabilities. System evolution through extension of the existing information technology infrastructure has become one of the most challenging aspects of healthcare, and adaptation to changes is a must. The paper presents a change management framework for a citizen-centric healthcare service platform. A combination of a Petri net model to handle changes and a reconfigurable Petri net model to react to these changes is introduced to fulfill healthcare goals. Thanks to this management framework model, the consistency and correctness of healthcare processes in the presence of frequent changes can be checked and guaranteed.

Author 1: Sabri MTIBAA
Author 2: Moncef TAGINA

Keywords: Healthcare; requirements changes; evolution; information technology; healthcare service platform; handle changes; reconfigurable Petri nets; consistency.

PDF

Paper 14: Software Architecture- Evolution and Evaluation

Abstract: The growth of various software architectural frameworks and models provides a standard governing structure for different types of organizations. Selecting a suitable framework for a particular environment requires much more detailed information on various aspects, and a reference guide of features should be provided. This paper traces the history of software architecture with a new evolution tree. It also technically analyses well-known frameworks used in industry and governmental organizations and lists the supporting tools for them. The comparative chart presented here can be used as a reference guide to understand top-level frameworks, and as a basis for further research to enable and promote their utilization in various environments.

Author 1: S Roselin Mary
Author 2: Dr. Paul Rodrigues

Keywords: Framework; Software Architecture; Views.

PDF

Paper 15: A hybrid Evolutionary Functional Link Artificial Neural Network for Data mining and Classification

Abstract: This paper presents a specific neural network structure, the functional link artificial neural network (FLANN), employed for classification tasks in data mining. In fact, only a few studies have used this tool for solving classification problems. In the present research, we propose a hybrid FLANN (HFLANN) model, in which the optimization process is performed using three well-known population-based techniques: genetic algorithms, particle swarm optimization, and differential evolution. The model is empirically compared to a FLANN trained with the back-propagation algorithm and to other classifiers such as decision trees, a multilayer perceptron trained with back-propagation, radial basis function networks, support vector machines, and K-nearest neighbor. Our results show that the proposed model outperforms the other single models.

Author 1: Faissal MILI
Author 2: Manel HAMDI

Keywords: Data mining; Classification; Functional link artificial neural network; genetic algorithms; Particle swarm; Differential evolution.

PDF
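A FLANN replaces hidden layers with a functional expansion of the inputs, commonly trigonometric. A minimal sketch of such an expansion and a single-layer output follows; the HFLANN's evolutionary weight optimization is omitted, and the names and basis choice are ours:

```python
import math

def flann_expand(features):
    """Trigonometric functional expansion used in FLANN-style models:
    each input x is expanded to (x, sin(pi*x), cos(pi*x))."""
    expanded = []
    for x in features:
        expanded.extend([x, math.sin(math.pi * x), math.cos(math.pi * x)])
    return expanded

def flann_output(weights, features, bias=0.0):
    """Single-layer FLANN output: weighted sum of the expanded
    features squashed through tanh."""
    z = bias + sum(w * e for w, e in zip(weights, flann_expand(features)))
    return math.tanh(z)
```

In the hybrid model described by the abstract, the `weights` vector would be searched by a genetic algorithm, particle swarm, or differential evolution rather than by back-propagation.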

Paper 16: Automatic Aircraft Target Recognition by ISAR Image Processing based on Neural Classifier

Abstract: This work proposes a new automatic target classifier based on a combined neural network system and ISAR image processing. The novelty introduced in our work is twofold: we first present a novel automatic classification procedure, and then discuss an improved multimedia processing of ISAR images for automatic object detection. The classifier, composed of a combination of 20 feed-forward artificial neural networks, is used to recognize aircraft targets extracted from ISAR images. Multimedia processing based on two recently introduced image processing techniques is exploited to improve the shape and feature extraction process. Performance analysis is carried out in comparison with conventional multimedia techniques and standard detectors. Numerical results obtained from wide simulation trials evidence the efficiency of the proposed method for automatic aircraft target recognition.

Author 1: F Benedetto
Author 2: F. Riganti Fulginei
Author 3: A. Laudani
Author 4: G. Albanese

Keywords: Automatic target recognition; artificial intelligence; neural classifiers; ISAR image processing; shape extraction.

PDF

Paper 17: An Effective Identification of Species from DNA Sequence: A Classification Technique by Integrating DM and ANN

Abstract: Species classification from DNA sequences remains an open challenge in bioinformatics, which deals with the collection, processing, and analysis of DNA and proteomic sequences. Though the incorporation of data mining can guide the process to perform well, the poor definition and heterogeneous nature of gene sequences remain a barrier. In this paper, an effective classification technique to identify an organism from its gene sequence is proposed. The proposed integrated technique is mainly based on pattern mining and neural network-based classification. In pattern mining, the technique mines nucleotide patterns and their support from the selected DNA sequence. The high dimension of the mined dataset is reduced using Multilinear Principal Component Analysis (MPCA). In classification, a well-trained neural network classifies the selected gene sequence, so the organism is identified even from a part of the sequence. The proposed technique is evaluated by performing 10-fold cross-validation, a statistical validation measure, and the obtained results prove the efficacy of the technique.

Author 1: Sathish Kumar S
Author 2: Dr. N. Duraipandian

Keywords: Pattern Generation; DNA Sequence; Pattern Support; Mining; Neural Network.

PDF
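The pattern-mining step, extracting nucleotide patterns and their support from a sequence, can be sketched as fixed-length (k-mer) counting over a sliding window. The authors' exact pattern definition may differ; this is illustrative:

```python
from collections import Counter

def kmer_support(sequence, k=3):
    """Mine fixed-length nucleotide patterns and their support
    (relative frequency among all k-length windows) from a DNA
    sequence given as a string over A, C, G, T."""
    windows = len(sequence) - k + 1
    counts = Counter(sequence[i:i + k] for i in range(windows))
    return {kmer: count / windows for kmer, count in counts.items()}
```

The resulting pattern-by-support table is the kind of high-dimensional feature set that the abstract then reduces with MPCA before neural classification.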

Paper 18: Brainstorming 2.0: Toward collaborative tool based on social networks

Abstract: Social networks are part of the Web 2.0 collaborative tools that have a major impact in enriching sharing and communication, enabling maximum collaboration and innovation between web users globally. It is in this context that this article is positioned, as part of a series of scientific research conducted by our team that combines social networks with collaborative decision-making on the net. It aims to provide a new open-source tool for solving various social problems posed by users in a collaborative Web 2.0 manner, based on the brainstorming method for generating ideas and on social networks for bringing the most suitable profiles into the virtual brainstorming session. The tool is run by a user called the expert, accompanied by a number of users called validators, who drive the process of extracting ideas from the various users of the net. It then measures the success of the process by sending a satisfaction questionnaire, administered by the expert to the affected user, to gauge the level of satisfaction. For its implementation, we propose a unified model using the UML language, followed by a realization in the JAVA language.

Author 1: Mohamed Chrayah
Author 2: Kamal Eddine El Kadiri
Author 3: Boubker Sbihi
Author 4: Noura Aknin

Keywords: Web 2.0, brainstorming, social networks, UML.

PDF

Paper 19: A Review On Cognitive Mismatch Between Computer and Information Technology And Physicians

Abstract: Health Information Technology (HIT) has great potential to transform existing health care systems by making them safe, effective, and efficient. Multi-functionality and interoperability of health information systems are very important functions, but these features cannot be achieved without addressing the knowledge and skills of health care personnel. There is a great mismatch between the Information Technology knowledge and skills of physicians, as this discipline is completely missing from their educational tenure. The usability of health information technologies and systems, as well as evidence-based practice in the future, can be improved by addressing this cognitive mismatch. This will result in a persistent partnership between physicians and IT personnel in HIS design, to get maximum usability out of the systems.

Author 1: Fozia Anwar
Author 2: Dr. Suziah Sulaiman
Author 3: Dr. P.D.D.Dominic

Keywords: cognitive mismatch; HIT; usability.

PDF

Paper 20: Techniques to improve the GPS precision

Abstract: The accuracy of a standard consumer GPS (Global Positioning System) receiver is around 10-15 meters 95% of the time. To reach a sub-metric level of accuracy, additional techniques must be used [1]. This article describes some of these procedures for improving positioning accuracy using a low-cost GPS receiver in a differential relative positioning mode. The proposed techniques are variations of Kalman filtering, fuzzy logic, and information selection.

Author 1: Nelson Acosta
Author 2: Juan Toloza

Keywords: GPS accuracy; relative positioning; DGPS; precision farming GPS.

PDF
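Of the techniques mentioned, a Kalman filter variant is the easiest to sketch: a scalar filter that smooths a stream of noisy position fixes. The noise parameters below are illustrative, not those of the article:

```python
def kalman_smooth(measurements, q=1e-4, r=4.0):
    """Scalar Kalman filter over noisy position fixes.
    q: process noise (how fast the true position may drift);
    r: measurement noise variance of the low-cost receiver."""
    x, p = measurements[0], 1.0        # initial state and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                         # predict: uncertainty grows
        k = p / (p + r)                # Kalman gain
        x += k * (z - x)               # correct with the new fix
        p *= (1.0 - k)
        estimates.append(x)
    return estimates
```

Because each new fix is blended with the running estimate in proportion to the gain, the filtered track has visibly less scatter than the raw fixes, which is the effect exploited for sub-metric relative positioning.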

Paper 21: M-Commerce service systems implementation

Abstract: Mobile commerce supports automated banking services, and the implementation of m-commerce service systems has become increasingly important in today's dynamic banking environment. This research studied the relationships between the technology acceptance model and m-commerce services. The results of a survey of 249 respondents in several Jordanian banks revealed that the technology acceptance model had a significant impact on m-commerce services, leading to the recommendation that the technology acceptance model is a suitable model for supporting the use of new electronic commerce services. In addition, managers play a significant role in influencing mobile services in banks through social interaction. Managers should focus on relative advantage, usefulness, and ease of use in order to develop the implementation of mobile commerce services.

Author 1: Asmahan Altaher

Keywords: M-commerce services; usefulness; ease of use; social interaction.

PDF

Paper 22: Clone Detection Using DIFF Algorithm For Aspect Mining

Abstract: Aspect mining is a reverse engineering process that aims at mining legacy systems to discover crosscutting concerns to be refactored into aspects. This process improves system reusability and maintainability. However, locating crosscutting concerns in legacy systems manually is very difficult and error-prone, so there is a need for automated techniques that can discover crosscutting concerns in source code. Aspect mining approaches are automated techniques that vary according to the type of crosscutting-concern symptoms they search for. Code duplication is one such symptom, and it puts software maintenance and evolution at risk, so many code clone detection techniques have been proposed to find duplicated code in legacy systems. In this paper, we present a clone detection technique that extracts exact clones from object-oriented source code using the Differential File Comparison (DIFF) algorithm, to improve system reusability and maintainability, a major objective of aspect mining.

Author 1: Rowyda Mohammed Abd El-Aziz
Author 2: Amal Elsayed Aboutabl
Author 3: Mostafa-Sami Mostafa

Keywords: aspect mining; reverse engineering; clone detection; DIFF algorithm.

PDF
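Python's difflib implements a longest-matching-block comparison in the spirit of the Unix diff algorithm, which is enough to sketch exact line-level clone detection between two files (the authors' actual tooling and clone granularity may differ):

```python
import difflib

def exact_clones(source_a, source_b, min_lines=3):
    """Report exact line-level clones between two source texts as
    (start_in_a, start_in_b, length) triples of at least
    `min_lines` identical consecutive lines."""
    a, b = source_a.splitlines(), source_b.splitlines()
    matcher = difflib.SequenceMatcher(None, a, b, autojunk=False)
    return [(m.a, m.b, m.size)
            for m in matcher.get_matching_blocks() if m.size >= min_lines]
```

Each reported block is a candidate crosscutting concern: the same statements scattered across two modules, ripe for refactoring into an aspect.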

Paper 23: On the Projection Matrices Influence in the Classification of Compressed Sensed ECG Signals

Abstract: In this paper the classification results of compressed sensed ECG signals, based on various types of projection matrices, are investigated. The compressed signals are classified using the KNN (K-Nearest Neighbour) algorithm. A comparative analysis is made with respect to the projection matrices used, as well as against the results obtained for the original (uncompressed) signals, for various compression ratios. For Bernoulli projection matrices, the classification results for compressed cardiac cycles are comparable to those obtained for uncompressed cardiac cycles. Thus, for normal uncompressed cardiac cycles a classification rate of 91.33% was obtained, while for signals compressed with a Bernoulli matrix, up to a compression ratio of 15:1, classification rates of approximately 93% were obtained. Significant classification quality in the compressed space is maintained up to a compression ratio of 30:1.

Author 1: Monica Fira
Author 2: Liviu Goras
Author 3: Nicolae Cleju
Author 4: Constantin Barabasa

Keywords: ECG; compressed sensing; projection matrix; classification; KNN.

PDF
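The pipeline of projecting signals with a random +/-1 Bernoulli matrix and classifying directly in the compressed domain with KNN can be sketched as follows. The synthetic two-class data stands in for real cardiac cycles, and the dimensions are illustrative:

```python
import math, random

def bernoulli_matrix(m, n, rng):
    """Random +/-1 Bernoulli projection matrix (m << n compresses)."""
    return [[rng.choice((-1.0, 1.0)) for _ in range(n)] for _ in range(m)]

def project(phi, signal):
    """Compressed measurements y = phi @ x."""
    return [sum(row[j] * signal[j] for j in range(len(signal)))
            for row in phi]

def knn_classify(train, query, k=3):
    """train: list of (compressed_vector, label); majority vote
    among the k nearest neighbours in the compressed domain."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)
```

Because the Bernoulli projection roughly preserves distances between well-separated classes, KNN decisions made on 8 measurements can match those made on the full 32-sample signals, mirroring the abstract's finding.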

Paper 24: An Approach of Improving Student’s Academic Performance by using K-means clustering algorithm and Decision tree

Abstract: Improving students' academic performance is not an easy task for the academic community of higher learning. The academic performance of engineering and science students during their first year at university is a turning point in their educational path and usually weighs on their Grade Point Average (GPA) in a decisive manner. Student evaluation factors such as class quizzes, mid-term and final exams, assignments, and lab work are studied. It is recommended that all this correlated information be conveyed to the class teacher before the final exam is conducted. This study will help teachers to reduce the drop-out ratio to a significant level and improve the performance of students. In this paper, we present a hybrid procedure based on the decision tree data mining method and data clustering that enables academicians to predict students' GPA, so that instructors can take the necessary steps to improve student academic performance.

Author 1: Hedayetul Islam Shovon
Author 2: Mahfuza Haque

Keywords: Database; Data clustering; Data mining; classification; prediction; Assessments; Decision tree; academic performance.

PDF
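The hybrid idea, first clustering students by assessment scores and then deriving per-cluster rules, can be sketched with a two-cluster k-means over (quiz, midterm) score pairs. The deterministic extreme-point seeding is our simplification of the usual random initialization:

```python
import math

def kmeans_2(points, iters=10):
    """Two-cluster k-means over (quiz, midterm) score pairs,
    deterministically seeded with the two extreme points."""
    cents = [min(points), max(points)]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for p in points:                          # assignment step
            i = 0 if math.dist(p, cents[0]) <= math.dist(p, cents[1]) else 1
            groups[i].append(p)
        cents = [tuple(sum(c) / len(g) for c in zip(*g)) if g else cents[i]
                 for i, g in enumerate(groups)]   # update step
    return cents, groups
```

A decision tree (or even a single threshold rule) fitted on the cluster labels could then flag the low-centroid cluster for intervention before the final exam, which is the preventive use the abstract advocates.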

Paper 25: Prevention and Detection of Financial Statement Fraud – An Implementation of Data Mining Framework

Abstract: Every day, news of financial statement fraud adversely affects the economy worldwide. Considering the losses incurred due to fraud, effective measures and methods should be employed for the prevention and detection of financial statement fraud. Data mining methods can assist auditors in fraud prevention and detection, because data mining can use past cases of fraud to build models that identify and assess the risk of fraud, and can support new techniques for preventing fraudulent financial reporting. In this study we implement a data mining methodology for preventing fraudulent financial reporting in the first place, and for detection when fraud has been perpetrated. The association rules generated in this study will be of great importance for both researchers and practitioners in preventing fraudulent financial reporting. The decision rules produced in this research complement the prevention mechanism by detecting financial statement fraud.

Author 1: Rajan Gupta
Author 2: Nasib Singh Gill

Keywords: Data mining framework; Rule engine; Rule monitor.

PDF
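Association rules are judged by support and confidence; over a set of financial-reporting "red flag" transactions these are computed as below. The indicator names are invented for illustration and are not the study's variables:

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`.
    `transactions` is a list of sets of indicator names."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))
```

Rules whose support and confidence clear chosen thresholds, for instance "inflated receivables implies fraud risk", are the kind of early-warning output the study proposes for prevention.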

Paper 26: Review of Remote Terminal Unit (RTU) and Gateways for Digital Oilfield Deployments

Abstract: The increasing decline in easy oil has led to a growing need for the optimization of oil and gas processes. Digital oilfields use remote operations to achieve these optimization goals, and the remote terminal unit (RTU) and gateways are critical to realizing this objective. This paper presents a review of the RTUs and gateways used in digital oilfield architectures, covering their architecture, functionality, and selection criteria. It also provides a comparison of the specifications of some popular RTUs.

Author 1: Francis Enejo Idachaba
Author 2: Ayobami Ogunrinde

Keywords: Digital Oilfield; Gateway; HMI; i-fields; RTU; Smartfields.

PDF

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org