The Science and Information (SAI) Organization
IJACSA Volume 2 Issue 12

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Estimation of the Visual Quality of Video Streaming Under Desynchronization Conditions

Abstract: This paper presents a method for assessing desynchronized video with the aid of a software package developed specifically for this purpose, including a unique methodology for substituting values for lost frames. It is shown that when the sent and received sequences differ because some frames were lost in transit, traditional (existing) software estimates the quality indicator inaccurately. We present a novel method of estimating the quality of desynchronized video streams: the developed software application can estimate the quality of video sequences even when frames are missing, by searching for contextually similar frames and "gluing" them in place of the lost ones. Comparing the obtained results with those from existing software validates their accuracy. Differences in the results and methods of estimating video sequences from different subject groups are also discussed. The paper concludes with recommendations on the best methodology to adopt for specific estimation scenarios.

Author 1: A A Atayero
Author 2: O.I. Sheluhin
Author 3: Y.A. Ivanov
Author 4: A.A. Alatishe

Keywords: video streaming, encoder, decoder, video streaming quality, PSNR.

PDF
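The quality indicator named in the keywords is PSNR. The following sketch (an illustration, not the authors' software package; the function names are our own) shows per-frame PSNR and the naive position-matched sequence average whose inaccuracy under frame loss motivates the paper:

```python
import numpy as np

def psnr(reference: np.ndarray, received: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between two equally sized frames."""
    mse = np.mean((reference.astype(np.float64) - received.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

def naive_sequence_psnr(sent, received):
    """Mean PSNR over frame pairs matched by position. If the sequences are
    desynchronized (frames lost in transit), the pairs are misaligned and this
    estimate becomes inaccurate -- the problem the paper addresses by
    substituting contextually similar frames for the lost ones."""
    return sum(psnr(a, b) for a, b in zip(sent, received)) / min(len(sent), len(received))
```

A frame that differs from the reference by a uniform offset of 16 gray levels, for instance, scores roughly 24 dB.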

Paper 2: A Novel Intra-Domain Continuous Handover Solution for Inter-Domain PMIPv6-Based Vehicular Networks

Abstract: IP mobility management protocols (e.g., host-based mobility protocols) incur significant handover latency, degrading QoS for end-user devices. Proxy Mobile IPv6 (PMIPv6) was proposed by the Internet Engineering Task Force (IETF) as a network-based mobility protocol to reduce host-based handover latency. However, current PMIPv6 cannot support the high mobility of vehicles moving within a PMIPv6 domain. In this paper we introduce a novel intra-domain PMIPv6 handover technique for vehicular networks that uses Media Independent Handover (MIH). The technique improves the handover performance of PMIPv6 by allowing the new PMIPv6 domain to obtain MIIS information and estimate whether a handover is necessary before the vehicle moves to the second MAG of the new domain. We evaluate the handover latency and data packet loss of the proposed handover process against those of PMIPv6. The analysis confirms that the novel handover process reduces handover latency compared to PMIPv6 and also prevents data packet loss.

Author 1: Haidar N Hussain
Author 2: Kamalrulnizam Abu Bakar
Author 3: Shaharuddin Salleh

Keywords: PMIPv6; MIH; MIIS.

PDF

Paper 3: Autonomous Control of Eye Based Electric Wheel Chair with Obstacle Avoidance and Shortest Path Finding Based on Dijkstra Algorithm

Abstract: An autonomous Eye-Based Electric Wheel Chair (EBEWC) control system is proposed that allows a handicapped user to control an electric wheelchair with their eyes only. Using EBEWC, the user can move anywhere on the same floor of a hospital autonomously, with obstacle avoidance based on a visible camera and an ultrasonic sensor; the user can also control the EBEWC directly with their eyes. The system must determine the most appropriate route while avoiding obstacles and then perform autonomous real-time control, so processing time, autonomous obstacle avoidance, and route determination are all important for the proposed EBEWC. All of the required performances are evaluated and validated. Obstacles are avoided using images acquired with a forward-looking camera, and the system creates a floor-layout map containing obstacle locations in real time; the created and updated maps can be shared by the electric wheelchairs on the same floor of a hospital. Experimental data show that the system supports computer input (more than 80 keys) almost perfectly and that the electric wheelchair can be controlled safely with human eyes alone.

Author 1: Kohei Arai
Author 2: Ronny Mardiyanto

Keywords: Human Computer Interaction; Gaze Estimation; Obstacle Avoidance; Electric Wheel Chair Control.

PDF
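The shortest-path component named in the title is Dijkstra's algorithm. A minimal sketch follows; the adjacency-dictionary representation and names are our assumptions, since the EBEWC system's actual map structure is not given in the abstract:

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from `start` over a weighted graph given as
    {node: {neighbor: edge_cost}}. Cells blocked by detected obstacles would
    simply be omitted from the graph before the search."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

On a shared floor map, re-running the search after each map update yields the adaptive routing the abstract describes.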

Paper 4: Eye-based Human Computer Interaction Allowing Phoning, Reading E-Book/E-Comic/E-Learning, Internet Browsing, and TV Information Extraction

Abstract: An eye-based Human-Computer Interaction (HCI) system which allows phoning, reading e-books/e-comics/e-learning content, Internet browsing, and TV information extraction is proposed for handicapped students in e-learning applications. Conventional eye-based HCI applications face problems with accuracy and processing speed. We develop new interfaces to improve the key-in accuracy and processing speed of eye-based key-in, for e-learning applications in particular. We propose an eye-based HCI that uses a camera mounted on glasses for gaze estimation, and use the user's sight to control the interface: navigating e-comic/e-book/e-learning content, phoning, Internet browsing, and TV information extraction. The interfaces developed include a standard navigator with five keys, a single-line moving keyboard, and a multi-line moving keyboard, which provide the aforementioned functions without sacrificing accuracy. Experimental results show that the proposed system performs these functions in real time.

Author 1: Kohei Arai
Author 2: Ronny Mardiyanto

Keywords: Eye-based HCI; E-Learning; Interface; Keyboard.

PDF

Paper 5: Very Low Power Viterbi Decoder Employing Minimum Transition and Exchangeless Algorithms for Multimedia Mobile Communication

Abstract: A very low power Viterbi decoder has been developed using a low supply voltage and 0.15 µm CMOS process technology. Significant power reduction is achieved by modifying the design and implementation of the Viterbi decoder from the conventional traceback and register exchange techniques to the Hybrid Register Exchange Method (HREM), Minimum Transition Register Exchange Method (MTREM), Minimum Transition Hybrid Register Exchange Method (MTHREM), register exchangeless method, and hybrid register exchangeless method. By employing these schemes, the Viterbi decoder achieves a drastic reduction in power consumption, below 100 µW at a supply voltage of 1.62 V, at a data rate of 5 Mb/s and a bit error rate of less than 10^-3. This performance paves the way for strong forward error correction in low-power portable terminals for personal communication, mobile multimedia communication, and digital audio broadcasting. Implementation insights and general conclusions that can particularly benefit from this approach are given.

Author 1: S L Haridas
Author 2: Dr. N. K. Choudhari

Keywords: Hybrid register exchange method; minimum transition register exchange method; minimum transition hybrid register exchange method; register exchangeless method; hybrid register exchangeless method.

PDF

Paper 6: Outlier-Tolerant Kalman Filter of State Vectors in Linear Stochastic System

Abstract: The Kalman filter is widely used in many different fields. Many practical applications and theoretical results show that the Kalman filter is very sensitive to outliers in the measurement process. In this paper, some reasons why the Kalman filter is sensitive to outliers are analyzed, and a series of outlier-tolerant algorithms are designed as substitutes for the Kalman filter. These outlier-tolerant filters are comparable to the Kalman filter in complexity, yet are highly capable of preventing the adverse effects of outliers that arise in the sampled data of linear stochastic systems. Simulation results show that these modified algorithms are safe and applicable.

Author 1: HU Shaolin
Author 2: Huajiang Ouyang
Author 3: Karl Meinke
Author 4: SUN Guoji

Keywords: Kalman filter; Outlier-tolerant; Outlier; Linear stochastic system.

PDF
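The abstract does not spell out the authors' outlier-tolerant algorithms, but a common way to make a Kalman filter tolerate outliers is innovation gating: skip the measurement update whenever the normalized innovation is implausibly large. The scalar sketch below illustrates that idea under an assumed random-walk state model; it is not the paper's filter:

```python
def tolerant_kalman_1d(measurements, q=1e-3, r=1.0, gate=3.0):
    """Scalar Kalman filter with an outlier guard: if the innovation exceeds
    `gate` standard deviations of the innovation variance, the measurement is
    treated as an outlier and the prediction is kept unchanged."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                    # predict step (random-walk model)
        s = p + r                 # innovation variance
        nu = z - x                # innovation
        if nu * nu > (gate ** 2) * s:
            estimates.append(x)   # outlier: skip the update
            continue
        k = p / s                 # Kalman gain
        x += k * nu
        p *= (1.0 - k)
        estimates.append(x)
    return estimates
```

A single wild measurement then leaves the estimate essentially untouched, whereas the plain Kalman update would be pulled far off.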

Paper 7: Handsets Malware Threats and Facing Techniques

Abstract: Nowadays, mobile handsets combine the functionality of mobile phones and PDAs. Unfortunately, the mobile handset development process has been driven by market demand, focusing on new features and neglecting security. It is therefore imperative to study the challenges facing the mobile handset threat-containment process, and the different techniques and methodologies used to meet those challenges and contain mobile handset malware. This paper also presents a new approach to grouping the different malware containment systems according to their typologies.

Author 1: Marwa M.A Elfattah
Author 2: Aliaa A.A Youssif
Author 3: Ebada Sarhan Ahmed

Keywords: mobile; malware; security; malicious programs.

PDF

Paper 8: Identifying Nursing Computer Training Requirements using Web-based Assessment

Abstract: Our work addresses inefficiency and ineffectiveness in the training of nurses in computer literacy by developing an adaptive questionnaire system. The system identifies the most effective training modules by evaluating applicants before and after training. Our system, the Systems Knowledge Assessment Tool (SKAT), aims to increase training proficiency, decrease training time, and reduce training costs by identifying the areas of training that are, and are not, required for each individual. Based on the project's requirements, a number of HTML documents were designed as templates for the implementation stage. During this stage, the milestone principle was used, in which a series of coding and testing was performed to generate an error-free product. The decision-making process and its components, together with the priority of each attribute in the application, determine the required training for each applicant; the decision-making process is therefore an essential aspect of the system design and greatly affects the applicant's training results. The SKAT system was evaluated to ensure that it meets the project's requirements. The evaluation stage was an important part of the project and required a number of nurses in different roles to evaluate the system; based on their feedback, changes were made.

Author 1: Naser Ghazi
Author 2: Gitesh Raikundalia
Author 3: Janette Gogler
Author 4: Leslie Bell

Keywords: Training Needs Analysis (TNA); Nursing Computer Literacy; Web-based Questionnaire.

PDF

Paper 9: A Comparative Study of Arabic Handwritten Character Invariant Features

Abstract: This paper is concerned with the invariant features of Arabic handwritten characters. It presents the results of a comparative study of several feature extraction techniques for handwritten characters, based on the Hough transform, Fourier transform, wavelet transform, and Gabor filter. The results show that the Hough transform and Gabor filter are insensitive to rotation and translation; the Fourier transform is sensitive to rotation but insensitive to translation; and, in contrast to the Hough transform and Gabor filter, the wavelet transform is sensitive to both rotation and translation.

Author 1: Hamdi Hassen
Author 2: Maher Khemakhem

Keywords: Arabic handwritten character; invariant feature; Hough transform; Fourier transform; Wavelet transform; Gabor Filter.

PDF

Paper 10: Pattern Discovery Using Association Rules

Abstract: The explosive growth of the Internet has given rise to many websites that maintain large amounts of user information. To utilize this information, identifying users' usage patterns is very important; Web usage mining is one process for finding such patterns and has many practical applications. This paper discusses how association rules can be used to discover patterns in Web usage mining. Our discussion starts with preprocessing of the given weblog, followed by clustering and finding association rules. These rules provide knowledge that helps improve website design, advertising, Web personalization, and so on.

Author 1: Ms chandra M
Author 2: Mr Rahul Jadhav
Author 3: Ms Dipa Dixit
Author 4: Ms Rashmi J
Author 5: Ms Anjali Nehete
Author 6: Ms Trupti Khodkar

Keywords: Weblogs; Pattern discovery; Association rules.

PDF
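The support/confidence step the abstract describes can be sketched as follows for pairwise rules over page-visit sessions. The function name, thresholds, and session representation are our own illustrative assumptions; the paper's preprocessing and clustering stages are not shown:

```python
from collections import Counter
from itertools import combinations

def association_rules(sessions, min_support=0.5, min_confidence=0.7):
    """Mine pairwise rules A -> B from web sessions (sets of visited pages).
    support(A -> B) = freq({A, B}) / n sessions;
    confidence(A -> B) = freq({A, B}) / freq({A})."""
    n = len(sessions)
    item_count = Counter()
    pair_count = Counter()
    for s in sessions:
        items = set(s)
        item_count.update(items)
        pair_count.update(frozenset(p) for p in combinations(sorted(items), 2))
    rules = []
    for pair, c in pair_count.items():
        if c / n < min_support:
            continue  # prune infrequent itemsets first, Apriori-style
        a, b = tuple(pair)
        for lhs, rhs in ((a, b), (b, a)):
            conf = c / item_count[lhs]
            if conf >= min_confidence:
                rules.append((lhs, rhs, c / n, conf))
    return rules
```

For example, if every session that visits "cart" also visits "products", the rule cart -> products comes out with confidence 1.0.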

Paper 11: The Macroeconomic Effect of Information and Communication Technology in Hungary

Abstract: It was not until the beginning of the 1990s that the effects of information and communication technology on economic growth, as well as on the profitability of enterprises, raised the interest of researchers. After giving a general description of the relationship between more intense use of ICT devices and dynamic economic growth, the author identifies and explains the four channels that have a robust influence on economic growth and productivity. Comparing the use of information technology devices in developed and developing countries, the author highlights the importance of available additional human capital and the elimination of organizational inflexibilities in narrowing the productivity gap between developed and developing nations. By processing a large quantity of information gathered from Hungarian enterprises operating in several economic sectors, the author attempts to find a strong correlation between the level of ICT use and profitability together with total factor productivity. Although the impact of ICT use cannot be measured unequivocally at the microeconomic level because of certain statistical and methodological imperfections, by applying analytical methods such as cluster analysis and correlation and regression calculation, the author shows that both the correlation coefficient and the gradient of the regression trend line indicate a positive relationship between the extensive use of information and communication technology and the profitability of enterprises.

Author 1: Peter Sasvari

Keywords: ICT; Economic sector; Profitability; Total Factor Productivity.

PDF

Paper 12: Preprocessor Agent Approach to Knowledge Discovery Using Zero-R Algorithm

Abstract: Data mining and multiagent approaches have been used successfully in the development of large, complex systems. Agents perform actions or activities on behalf of a user of a computer system. This study proposes an agent-based algorithm, PrePZero-r, built on the Zero-R algorithm in Weka. Such algorithms are a powerful technique for solving various combinatorial and optimization problems. Zero-R is a simple, trivial classifier, but it gives a lower bound on the performance achievable on a given dataset, which more complex classifiers should significantly improve upon. The proposed PrePZero-r algorithm significantly reduces the time taken to build the model compared with Zero-R by removing lower-bound values of 0 during preprocessing and comparing the result with the class values. The study also introduces a new factor, "Accuracy (1-e)", for each individual attribute.

Author 1: Inamdar S A
Author 2: Narangale S.M.
Author 3: G. N. Shinde

Keywords: Data mining; Zero-R algorithm; Lower Bound Value; Class values.

PDF
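Zero-R itself is simple enough to state in a few lines: it ignores all attributes and predicts the majority class, and its accuracy is the lower bound the abstract mentions. A sketch (the paper uses Weka's implementation, not this code):

```python
from collections import Counter

def zero_r(labels):
    """Zero-R: predict the majority class for every instance, ignoring all
    attributes. Returns (predicted_class, baseline_accuracy), the lower bound
    that any more complex classifier should beat."""
    majority, count = Counter(labels).most_common(1)[0]
    return majority, count / len(labels)
```

On a training set with 70% "yes" labels, the baseline is the prediction "yes" with 0.7 accuracy.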

Paper 13: Text Independent Speaker Identification using Integrating Independent Component Analysis with Generalized Gaussian Mixture Model

Abstract: Much work has recently been reported in the literature on text-independent speaker identification models. Sailaja et al. (2010) [34] developed a text-independent speaker identification model assuming that the speech spectra of each individual speaker can be modeled by Mel-frequency cepstral coefficients (MFCCs) and a generalized Gaussian mixture model. The limitation of this model is that the feature vectors (MFCCs) are high-dimensional and assumed to be independent; in fact, features represented by MFCCs are dependent, and discarding some of them distorts the model. Hence, in this paper a novel text-independent speaker identification model is developed by integrating MFCCs with independent component analysis (ICA) to obtain independence and low dimensionality in feature vector extraction. Assuming that the new feature vectors follow a generalized Gaussian mixture model (GGMM), the model parameters are estimated using the EM algorithm, and a Bayesian classifier is used to identify each speaker. Experimental results on a 50-speaker database reveal that the proposed procedure outperforms existing methods.

Author 1: N M Ramaligeswararao
Author 2: Dr.V Sailaja
Author 3: Dr.K. Srinivasa Rao

Keywords: Independent component analysis; Generalized Gaussian Mixture Model; Mel frequency cepstral coefficients; Bayesian classifier; EM algorithm.

PDF

Paper 14: Energy Efficient Zone Division Multihop Hierarchical Clustering Algorithm for Load Balancing in Wireless Sensor Network

Abstract: Wireless sensor nodes are used in most embedded computing applications. Multihop cluster hierarchies have been proposed for large wireless sensor networks (WSNs) to provide scalable routing, data aggregation, and querying. The energy consumption rate of sensors in a WSN varies greatly depending on the communication protocols the sensors use. In this paper we present a cluster-based routing algorithm whose main goal is energy-efficient routing, addressing the usual problems of WSNs. The efficiency of a WSN depends on the distance between each node and the base station and on the amount of data to be transferred, and the performance of clustering is greatly influenced by the selection of cluster heads, which are in charge of creating clusters and controlling member nodes. The algorithm makes best use of a node with a low number of cluster heads, known as the super node. The full region is divided into four equal zones, and the super node is selected from the central area of the region. Each zone is considered separately and may or may not be divided further, depending on the density of nodes in that zone and the capability of the super node. The algorithm forms multilayer communication, where the number of layers depends on the current network load and statistics, and it is easily extended to generate a hierarchy of cluster heads for better network management and energy efficiency.

Author 1: Ashim Kumar Ghosh
Author 2: Anupam Kumar Bairagi
Author 3: Dr. M. Abul Kashem
Author 4: Md. Rezwan-ul-Islam
Author 5: A J M Asraf Uddin

Keywords: routing protocol; WSN; multihop; load balancing; cluster based routing; zone division.

PDF

Paper 15: Eyes-Based Electric Wheel Chair Control System - I (eye) Can Control Electric Wheel Chair

Abstract: An eyes-based Electric Wheel Chair Control system (EBEWC) is proposed. The proposed EBEWC is controlled by human eyes only, so a disabled person can control it by themselves. Most eye-only computer input systems work only under specific conditions and not in real time; moreover, they are not robust to differences between users, illumination conditions, EWC vibration, or user movement. Through experiments, it is found that the proposed EBEWC is robust against these influencing factors, and it is confirmed that the proposed EBEWC can be controlled accurately and safely by human eyes alone.

Author 1: Kohei Arai
Author 2: Ronny Mardiyanto

Keywords: computer input by human eyes only; gaze estimation; electric wheelchair control.

PDF

Paper 16: Fuzzy Petri Nets for Human Behavior Verification and Validation

Abstract: Given the rapid growth in the size and complexity of simulation applications, designing applicable and affordable verification and validation (V&V) structures is an important problem. Human behavior models are central to decision making in many simulations, and for decisions based on a human decision model to be valid, the model must first pass verification and validation criteria. Human behavior models are usually represented as fuzzy rule bases, and in all recent work the V&V process has been applied to a given, ready-made rule base. In this work, we first construct a fuzzy rule base and then apply the V&V process to it. Taking professor-student interaction as the case study, a questionnaire is designed in a special way so that it can be transformed into a hierarchical fuzzy rule base. The constructed fuzzy rule base is then mapped to a fuzzy Petri net and searched for probable structural and semantic errors through verification (generating and searching the reachability graph) and validation (reasoning over the Petri net).

Author 1: M Kouzehgar
Author 2: M. A. Badamchizadeh
Author 3: S. Khanmohammadi

Keywords: human behavior; verification; validation; high-level fuzzy Petri nets; fuzzy rules.

PDF

Paper 17: SVD-EBP Algorithm for Iris Pattern Recognition

Abstract: This paper proposes a neural network approach based on Error Back Propagation (EBP) for the classification of different eye images. To reduce the complexity of the layered neural network, the dimensions of the input vectors are optimized using Singular Value Decomposition (SVD). The main objective of this work is to demonstrate the usefulness of SVD in forming a compact set of features for classification by the EBP algorithm. Our results indicate that optimum classification is obtained with an SVD dimension of 20 and a maximum of 9 classes on state-of-the-art computational resources. The details of this combined system, named the SVD-EBP system for iris pattern recognition, and the corresponding results are presented in this paper.

Author 1: Babasaheb G Patil
Author 2: Dr. Mrs. Shaila Subbaraman

Keywords: Singular value decomposition (SVD); Error back Propagation (EBP).

PDF
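The dimensionality-reduction step can be sketched as a truncated-SVD projection; this is a generic illustration of the idea (the paper's exact pipeline and the EBP network are not reproduced), with function and variable names of our own choosing:

```python
import numpy as np

def svd_reduce(X: np.ndarray, k: int) -> np.ndarray:
    """Project the rows of X (one feature vector per sample) onto the top-k
    right singular vectors, yielding compact k-dimensional inputs -- e.g. the
    20-dimensional vectors the abstract reports feeding to the EBP classifier."""
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T  # shape: (n_samples, k)
```

If the data matrix has rank at most k, the projection loses no energy at all; in general it keeps the directions of largest variance.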

Paper 18: Using Semantic Web to support Advanced Web-Based Environment

Abstract: In learning environments, users would be helpless without powerful searching and browsing tools to find their way. Web-based e-learning systems are used by a wide variety of learners with different skills, backgrounds, preferences, and learning styles. In this paper, we perform personalized semantic search and recommendation of learning content in Web-based learning environments to enhance the learning environment. Semantic, personalized search of learning content is based on a comparison of the learner profile, which is based on learning style, with the learning objects' metadata; this approach requires representing both the learner profile and the learning object description as well-defined data structures. Personalized recommendation of learning objects uses an approach that determines a more suitable relationship between learning objects and learner profiles, so that the most suitable learning objects can be recommended to a learner. Semantic search of learning objects is based on expansion of the user query and on semantic similarity for retrieving semantically matched learning objects.

Author 1: Khaled M Fouad
Author 2: Mostafa A. Nofal
Author 3: Hany M. Harb
Author 4: Nagdy M. Nagdy

Keywords: Semantic Web; Domain Ontology; Learner Profile; Adaptive Learning; Semantic Search ; Recommendation.

PDF

Paper 19: A Virtual Environment Using Virtual Reality and Artificial Neural Network

Abstract: In this paper we describe a model that provides a virtual environment to a group of people who use it. The model integrates an Immersible Virtual Reality (IVR) design with an Artificial Neural Network (ANN) interface and runs over the Internet; a user who wants to participate in the virtual environment needs the hybrid IVR-ANN model and an Internet connection. IVR is the advanced technology the model uses to make the virtual environment feel real to its users, while the ANN is used to give shape to the characters in the virtual environment (VE). The model gives users the illusion that they are in a real communication environment.

Author 1: Abdul Rahaman Wahab Sait
Author 2: Mohammad Nazim Raza

Keywords: Model; Virtual environment; Immersible virtual reality; Internet; Artificial neural networks.

PDF

Paper 20: Agent based Bandwidth Reservation Routing Technique in Mobile Ad Hoc Networks

Abstract: In mobile ad hoc networks (MANETs), inefficient resource allocation causes heavy losses to service providers and results in a poor user experience. Improving and automating the quality of service of MANETs requires efficient resource allocation techniques. In this paper, we propose an agent-based bandwidth reservation technique for MANETs. A mobile agent from the source starts forwarding data packets along the path with minimum cost, congestion, and bandwidth usage. The status of every node, including a bottleneck bandwidth field, is collected, and each intermediate node computes the available bandwidth on its link. At the destination, after the new bottleneck bandwidth field is updated, the data packet is fed back to the source. In the resource reservation step, if the available bandwidth is greater than the bottleneck bandwidth, bandwidth is reserved for the flow. Rate control is performed for congested flows using rate monitoring and adjustment methodologies. Simulation results show that the resource allocation technique reduces losses and improves network performance.

Author 1: Vishnu Kumar Sharma
Author 2: Dr. Sarita Singh Bhadauria

Keywords: Mobile Ad hoc Networks (MANETs); Mobile Agents (MA); Total Congestion Metric (TCM); Enhanced Distributed Channel Access (EDCA); Transmission opportunity limit (TXOP).

PDF

Paper 21: Sensor Node Deployment Strategy for Maintaining Wireless Sensor Network Communication Connectivity

Abstract: We propose a rescue robot sensor network system in which a teleoperated rescue robot sets up a wireless sensor network (WSN) to gather disaster information in post-disaster underground spaces. In this system, the rescue robot carries wireless sensor nodes (SNs) and deploys them between gateways in an underground space on demand, at the operator's command, to establish a safe approach path before rescue workers enter. However, because the rescue robot deploys SNs linearly between gateways, only a single communication path is set up; the robot therefore cannot be operated remotely if the communication path is disconnected by, for example, SN failure or changes in environmental conditions. SNs must therefore be deployed adaptively so as to maintain WSN communication connectivity and avoid such situations. This paper describes an SN deployment strategy for constructing a WSN robust to communication disconnection caused by SN failure or deterioration of communication quality, in order to maintain communication connectivity between SNs. We propose an SN deployment strategy that uses redundant communication connections and ensures communication conditions, such as throughput, between end-to-end communications of the WSN. Experimental results verifying the efficacy of the proposed method are also described.

Author 1: Shigeaki TANABE
Author 2: Kei SAWAI
Author 3: Tsuyoshi SUZUKI

Keywords: wireless sensor network; deployment strategy; communication connectivity

PDF

Paper 22: Detection and Extraction of Videos using Decision Trees

Abstract: This paper presents a new multimedia data mining framework for the extraction of events in videos using decision tree logic. The aim of our DEVDT (Detection and Extraction of Videos using Decision Trees) system is to improve the indexing and retrieval of multimedia information; the extracted events can be used to index the videos. The system uses the C4.5 decision tree algorithm [3], which manages both continuous and discrete attributes. First, an advanced video event detection method is adopted to produce event boundaries and important visual features. This rich multi-modal feature set is filtered by a preprocessing step that cleans noise and reduces irrelevant data, improving both precision and recall. The cleaned data is then mined and classified using a decision tree model, whose learning and classification steps are simple, fast, and accurate. Using our system, we consequently reach maximum precision and recall, i.e., we extract pure video events effectively and efficiently.

Author 1: Sk Abdul Nabi
Author 2: Shaik Rasool
Author 3: Dr.P. Premchand

Keywords: DEVDT; Data Processing; Data Pre-Processing; Decision Tree and Training Data.

PDF
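C4.5 picks split attributes by an entropy-based criterion (gain ratio, which builds on information gain). A minimal sketch of the information-gain part follows; the helper names are our own, and this is not the DEVDT system's code:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting `rows` on attribute index `attr` --
    the quantity C4.5's gain-ratio criterion is built on."""
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder
```

A split that separates the classes perfectly yields a gain equal to the full entropy of the label set.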

Paper 23: An Approach to Improve the Representation of the User Model in the Web-Based Systems

Abstract: A major shortcoming of content-based approaches lies in the representation of the user model. Content-based approaches often employ term vectors to represent each user's interests, thereby ignoring the semantic relations between terms of the vector space model, in which indexed terms are not orthogonal and often have semantic relatedness to one another. In this paper, we improve the representation of the user model built by content-based approaches in two steps. The first is domain concept filtering, in which concepts and items of interest are compared with the domain ontology, using ontology-based semantic similarity, to check which items are relevant to our domain. The second is incorporating semantic content into the term vectors: we use word definitions and relations provided by WordNet to perform word sense disambiguation, and employ domain-specific concepts as category labels for the semantically enhanced user models. Implicit information about user behavior is extracted from clickstream data, i.e., Web usage sessions captured in the Web server logs. To update the user model, we analyze the user's history of query keywords: for a given keyword, we extract the words that have semantic relationships with it in WordNet and add them to the user interest model as nodes, according to those semantic relationships.

Author 1: Yasser A Nada
Author 2: Khaled M. Fouad

Keywords: User model; Domain ontology; Semantic Similarity; Wordnet.

PDF

Paper 24: Solving the MDBCS Problem Using the Metaheuristic Genetic Algorithm

Abstract: This paper considers problems of degree-bounded subgraphs of weighted graphs: taking into account the weights of the vertices or edges, the aim is to find an optimal-weight subgraph subject to certain restrictions on the degrees of the vertices in the subgraph. This class of combinatorial problems has been extensively studied because of its implementation and application in network design, interconnection of networks, and routing algorithms, and it is likely that solutions of the MDBCS (Maximum Degree-Bounded Connected Subgraph) problem will find their place and application in these areas. The paper gives an ILP model for the MDBCS problem, as well as a genetic algorithm that calculates a good enough solution for input graphs with a larger number of nodes. An important feature of heuristic algorithms is that they can find approximate, but still good enough, solutions to problems of exponential complexity. However, heuristic algorithms may fail to reach a satisfactory solution, and for some problems they give relatively poor results; this is particularly true of problems for which no exact algorithm of polynomial complexity is known. Heuristic algorithms are also not all alike: some of their parts differ depending on the situation and problem in which they are used, most notably the objective (fitness) function, whose definition significantly affects the efficiency of the algorithm. By their mode of action, genetic algorithms are among the methods of directed random search of the solution space that look for a global optimum.
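A minimal sketch of the genetic-algorithm idea for a degree-bounded subgraph problem, not the paper's actual algorithm: chromosomes are bit-strings over the edge set, and the fitness function rewards total edge weight while penalizing degree-bound violations. The graph, degree bound, and GA parameters are all hypothetical:

```python
import random

random.seed(0)

# Hypothetical weighted graph: (u, v, weight), 4 vertices, degree bound 2.
EDGES = [(0, 1, 4.0), (1, 2, 3.0), (2, 3, 5.0), (0, 3, 2.0), (0, 2, 6.0)]
DEGREE_BOUND = 2
PENALTY = 100.0

def fitness(chrom):
    """Total weight of selected edges, heavily penalizing
    any vertex whose degree exceeds the bound."""
    weight, degree = 0.0, {}
    for bit, (u, v, w) in zip(chrom, EDGES):
        if bit:
            weight += w
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
    excess = sum(max(0, d - DEGREE_BOUND) for d in degree.values())
    return weight - PENALTY * excess

def evolve(pop_size=20, generations=60):
    pop = [[random.randint(0, 1) for _ in EDGES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(EDGES))  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(len(EDGES))       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The penalty term is the "transformation" the abstract alludes to: its choice decides whether infeasible chromosomes can ever outrank feasible ones, and so significantly affects the efficiency of the search. (The sketch also omits the connectivity constraint of MDBCS for brevity.)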

Author 1: Milena Bogdanovic

Keywords: graph theory; NP-complete problems; degree-bounded graphs; integer linear programming; genetic algorithms.

PDF

Paper 25: Optimized Min-Sum Decoding Algorithm for Low Density Parity Check Codes

Abstract: Low Density Parity Check (LDPC) codes approach Shannon-limit performance for the binary field and long code lengths. However, the performance of binary LDPC codes degrades when the code word length is small. An optimized min-sum algorithm for LDPC codes is proposed in this paper. Unlike other decoding methods, this algorithm introduces an optimization factor at both the check nodes and the bit nodes of the min-sum algorithm. The optimization factor is obtained before the decoding program runs, and the same factor is multiplied twice in one cycle, so the increase in complexity is fairly low. Simulation results show that the proposed optimized min-sum decoding algorithm performs very close to sum-product decoding while preserving the main features of min-sum decoding, namely low complexity and independence from noise variance estimation errors.
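The check-node half of such an update can be sketched as follows (a normalized min-sum variant; the factor alpha and the message values are hypothetical, and the abstract applies a factor at the bit nodes as well):

```python
def check_node_update(msgs, alpha):
    """Normalized min-sum check-node update: each outgoing message is
    the optimization factor alpha times the product of the signs and
    the minimum magnitude of all *other* incoming messages."""
    out = []
    for i in range(len(msgs)):
        others = msgs[:i] + msgs[i + 1:]
        sign = 1.0
        for m in others:
            sign = -sign if m < 0 else sign
        out.append(alpha * sign * min(abs(m) for m in others))
    return out

# Example with a hypothetical optimization factor alpha = 0.8:
updated = check_node_update([1.0, -2.0, 3.0], alpha=0.8)
# updated == [-1.6, 0.8, -0.8]
```

With alpha = 1 this reduces to plain min-sum; scaling toward sum-product behavior is what lets the optimized variant close most of the performance gap at negligible extra cost.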

Author 1: Mohammad Rakibul Islam
Author 2: Dewan Siam Shafiullah
Author 3: Muhammad Mostafa Amir Faisal
Author 4: Imran Rahman

Keywords: LDPC codes; Min-sum algorithm; Normalized min-sum algorithm; Optimization factor.

PDF

Paper 26: A New Approach of Digital Forensic Model for Digital Forensic Investigation

Abstract: This research introduces a structured and consistent approach to digital forensic investigation. Digital forensic science provides tools, techniques, and scientifically proven methods that can be used to acquire and analyze digital evidence. Digital evidence must be retrieved in a way that ensures it will be accepted in court. This research focuses on a structured and consistent approach to digital forensic investigation and aims at identifying activities that facilitate and improve the digital forensic investigation process. Existing digital forensic frameworks are reviewed and the analysis is then compiled; the result of the evaluation is a new model that improves the whole investigation process.

Author 1: Inikpi O Ademu
Author 2: Dr Chris O. Imafidon
Author 3: Dr David S. Preston

Keywords: Case Relevance; Exploratory Testing; Automated Collection; Pre-Analysis; Post-Analysis; Evidence Reliability.

PDF

Paper 27: A Data Mining Approach for the Prediction of Hepatitis C Virus protease Cleavage Sites

Abstract: Summary: Several papers have been published on the prediction of hepatitis C virus (HCV) polyprotein cleavage sites using symbolic and non-symbolic machine learning techniques. The published papers achieved different levels of prediction accuracy; the achieved results depend on the technique used and on the availability of adequate and accurate HCV polyprotein sequences with known cleavage sites. We try here to achieve more accurate prediction results, and more informative knowledge about the HCV protein cleavage sites, using a decision tree algorithm. Several factors can affect the overall prediction accuracy; one of the most important is the availability of acceptable and accurate HCV polyprotein sequences with known cleavage sites. We collected the latest accurate data sets to build the prediction model, and another dataset for model testing. Motivation: Hepatitis C virus is a global health problem affecting a significant portion of the world's population. The World Health Organization estimated that in 1999, 170 million HCV carriers were present worldwide, with 3 to 4 million new cases per year. Several approaches have been taken to analyze the HCV life cycle and find the important factors of the viral replication process. HCV polyprotein processing by the viral protease has a vital role in virus replication, and the prediction of HCV protease cleavage sites can help biologists design suitable viral inhibitors. Results: The decision tree's ease of use and interpretability enabled us to create a simple prediction model. Using the latest accurate viral datasets, the decision tree achieved high prediction accuracy and also generated informative knowledge about the cleavage process itself. These results can help researchers develop effective viral inhibitors.
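Before a decamer (a 10-residue window around a candidate cleavage site, per the keywords) can feed a decision-tree learner, it must be turned into attributes. A minimal sketch of one common encoding, assuming the standard 20-letter amino-acid alphabet; the example decamer is purely illustrative:

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # 20 standard residues

def encode_decamer(decamer):
    """One-hot encode a 10-residue string into a 200-dimensional
    0/1 feature vector (20 indicator features per position)."""
    assert len(decamer) == 10
    vec = []
    for residue in decamer:
        vec.extend(1 if aa == residue else 0 for aa in AMINO_ACIDS)
    return vec

features = encode_decamer("DEMEECSQHL")   # hypothetical decamer
# len(features) == 200, with exactly one 1 per position
```

Labeled vectors of this form (cleavage site vs. non-site) are what a decision-tree algorithm would then split on, and each resulting split is directly readable as "residue X at position Y", which is the kind of informative knowledge the abstract refers to.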

Author 1: Ahmed Mohamed Samir Ali Gamal Eldin

Keywords: HCV polyprotein; decision tree; protease; decamers

PDF

Paper 28: Enhancing Business Intelligence in a Smarter Computing Environment through Cost Analysis

Abstract: This paper aims at improving business intelligence in a smarter computing environment through cost analysis. Smarter Computing is a new approach to designing IT infrastructures that creates new opportunities: creating new business models, finding new ways of delivering technology-based services, and generating new insights from IT to fuel innovation and dramatically improve the economics of IT. The paper looks at various performance metrics to lower the cost of implementing business intelligence in a smarter computing environment and generate a cost-efficient system. To ensure this, smarter services are deployed in line with business strategy. The working principle is based on workload optimization and the corresponding performance metrics, such as value metrics, advanced data capabilities, and virtualization, so as to decrease the total IT cost.

Author 1: Saurabh Kacker
Author 2: Vandana Choudhary
Author 3: Tanupriya Choudhury
Author 4: Vasudha Vashisht

Keywords: Smarter Computing; Business Intelligence; Cost Analysis; Virtualizations; Advanced Data Capabilities; Value Metrics.

PDF

Paper 29: A Flexible Tool for Web Service Selection in Service Oriented Architecture

Abstract: Web services are emerging technologies that enable application-to-application communication and the reuse of services over the Web. The Semantic Web improves the quality of existing tasks, including Web service discovery, invocation, composition, monitoring, and recovery, by describing Web service capabilities and content in a computer-interpretable language. To provide most of the requested Web services, a Web service matchmaker is usually required. Web service matchmaking is the process of finding an appropriate provider for a requester through a middle agent. To provide the right service for the right user request, Quality of Service (QoS)-based Web service selection is widely used: employing QoS in Web service selection helps satisfy user requirements by discovering the best service(s) in terms of the required QoS. Inspired by the mode of Internet Web search engines such as Yahoo and Google, in this paper we provide a QoS-based service selection algorithm that is able to identify the best candidate semantic Web service(s) given the description of the requested service(s) and the QoS criteria of the user requirements. In addition, we propose a ranking method for those services, and we show how we employ data warehousing techniques to model the service selection problem. The proposed algorithm integrates a traditional matchmaking mechanism with data warehousing techniques; this integration enables us to employ the historical preferences of the user to provide better selection in future searches. The main result of the paper is a generic framework, implemented to demonstrate the feasibility of the proposed algorithm for QoS-based Web applications. Our experimental results show that the algorithm indeed performs well and increases system reliability.
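One common way to realize QoS-based ranking of the kind the abstract describes is simple additive weighting over normalized QoS attributes, sketched below. The attribute names, weights, and candidate services are all hypothetical, and this is not necessarily the authors' ranking method:

```python
def rank_services(candidates, weights):
    """Score each service by a weighted sum of its normalized QoS values.
    'benefit' attributes (e.g. reliability) score higher when larger;
    'cost' attributes (e.g. response time) score higher when smaller."""
    scored = []
    for name, qos in candidates.items():
        score = sum(w * (qos[attr] if kind == "benefit" else 1.0 - qos[attr])
                    for attr, (w, kind) in weights.items())
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

# QoS values assumed pre-normalized to [0, 1].
candidates = {
    "svcA": {"reliability": 0.95, "response_time": 0.2},
    "svcB": {"reliability": 0.70, "response_time": 0.5},
}
weights = {"reliability": (0.6, "benefit"), "response_time": (0.4, "cost")}
ranking = rank_services(candidates, weights)
# ranking == ["svcA", "svcB"]
```

Feeding the user's historical preferences back into the weight vector is one natural place where the data-warehousing side of the proposed framework could plug into such a scorer.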

Author 1: Walaa Nagy
Author 2: Hoda M. O. Mokhtar
Author 3: Ali El-Bastawissy

Keywords: Semantic Web; Web services; Web service matchmaking; Data warehouses; Quality of Service (QoS); Web service ranking.

PDF

The Science and Information (SAI) Organization

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org