IJACSA Volume 4 Issue 9

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.


Paper 1: An Autonomic Auto-scaling Controller for Cloud Based Applications

Abstract: One of the key promises of Cloud Computing is elasticity: applications have at their disposal a very large pool of resources from which they can allocate whatever they need. For any fair-sized application the amount of resources is significant, and both overprovisioning and underprovisioning have a negative impact on the customer: the first leads to excess costs, and the second to poor application performance with negative business repercussions as well. Provisioning resources appropriately is therefore an important problem. In addition, it is well known that application workloads exhibit high variability over short time periods. This creates the need for autonomic mechanisms that make resource management decisions in real time, optimizing both cost and performance. To address these problems we present an autonomic auto-scaling controller that, based on the stream of measurements from the system, maintains the optimal number of resources and responds efficiently to workload variations, without incurring excess costs from high churn of resources or short-duration peaks in the workload. To evaluate the performance of our system we conducted extensive evaluations based on traces of real applications deployed in the cloud. Our results show significant improvements over existing techniques.

Author 1: Jorge M. Londoño-Peláez
Author 2: Carlos A. Florez-Samur

Keywords: autonomic resource management; cloud computing

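The abstract does not spell out the controller's internals. As a hedged sketch only, the Python loop below illustrates the general idea of churn-resistant auto-scaling: smooth the measurement stream so short-duration peaks are ignored, and enforce a cooldown so resources are not churned. Every threshold and window size here is an illustrative assumption, not a value from the paper.

# Hedged sketch of a generic auto-scaling control loop (not the paper's
# controller): a moving average absorbs short peaks, and a cooldown
# between actions limits resource churn.
from collections import deque

class AutoScaler:
    def __init__(self, low=0.3, high=0.7, window=12, cooldown=6):
        self.low, self.high = low, high         # target utilization band
        self.samples = deque(maxlen=window)     # recent measurements
        self.cooldown, self.wait = cooldown, 0  # ticks between actions
        self.instances = 1

    def observe(self, utilization):
        """Feed one measurement from the stream; return the instance count."""
        self.samples.append(utilization)
        if self.wait > 0:                       # still cooling down
            self.wait -= 1
            return self.instances
        avg = sum(self.samples) / len(self.samples)
        if avg > self.high:                     # sustained overload
            self.instances += 1
            self.wait = self.cooldown
        elif avg < self.low and self.instances > 1:
            self.instances -= 1                 # sustained underload
            self.wait = self.cooldown
        return self.instances

An autonomic controller such as the paper's would replace these fixed thresholds with decisions that optimize cost and performance jointly; the sketch only shows where smoothing and cooldown enter the loop.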

Paper 2: Partition based Graph Compression

Abstract: Graphs are used in a diverse set of disciplines, ranging from computer networks to biological networks, social networks, and the World Wide Web. With advances in technology and the discovery of new knowledge, the size of graphs is increasing exponentially; a graph containing millions of nodes and billions of edges can occupy terabytes. At the same time, the size of graphs presents a big obstacle to understanding the essential information they contain, and with current main memory sizes it is often impossible to load a whole graph into main memory. Hence the need for graph compression techniques arises. In this paper, we present a graph compression technique which partitions a graph into subgraphs, each of which can then be compressed individually. For partitioning, the proposed approach identifies weak links present in the graph and partitions the graph at those weak links. During query processing, only the required partitions need to be decompressed, eliminating decompression of the whole graph.

Author 1: Meera Dhabu
Author 2: Dr. P. S. Deshpande
Author 3: Siyaram Vishwakarma

Keywords: graph compression; subgraph; partitioning

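The abstract does not define how weak links are identified. Purely as an illustrative assumption, the sketch below scores each edge by the Jaccard overlap of its endpoints' neighbourhoods, cuts low-overlap ("weak") edges, and compresses each resulting partition's edge list separately, so that a query needs to decompress only the partitions it touches.

import zlib
from collections import defaultdict

def partition_and_compress(edges, weak_threshold=0.1):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)

    def jaccard(u, v):
        # Neighbourhood overlap as a (hypothetical) link-strength score.
        union = len(adj[u] | adj[v])
        return len(adj[u] & adj[v]) / union if union else 0.0

    strong = [(u, v) for u, v in edges if jaccard(u, v) >= weak_threshold]

    # Union-find over the strong edges: components become partitions.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for u, v in strong:
        parent[find(u)] = find(v)

    parts = defaultdict(list)
    for u, v in edges:
        parts[find(u)].append((u, v))       # a cut edge is stored on one side

    # Compress each partition independently; decompress on demand at query time.
    return {root: zlib.compress(repr(sorted(es)).encode())
            for root, es in parts.items()}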

Paper 3: Performance Analysis and Comparison of 6to4 Relay Implementations

Abstract: The depletion of the public IPv4 address pool may speed up the deployment of IPv6. The coexistence of the two versions of IP requires transition mechanisms. One of them is 6to4, which provides a solution for an IPv6-capable device in an IPv4-only environment. From among the several 6to4 relay implementations, the following were selected for testing: sit under Linux, stf under FreeBSD and stf under NetBSD. Their stability and performance were investigated in a test network. The load on the 6to4 relay implementations was increased by incrementing the number of client computers that generated the traffic. The packet loss and the response time of the 6to4 relay, as well as the CPU utilization and the memory consumption of the computer running the tested 6to4 relay implementations, were measured. The implementations were also tested under very heavy load conditions to see whether they are safe to use in production systems.

Author 1: Gábor Lencse
Author 2: Sándor Répás

Keywords: IPv6 deployment; IPv6 transition solutions; 6to4; performance analysis


Paper 4: Review on Aspect Oriented Programming

Abstract: Aspect-oriented programming (AOP) has been introduced as a potential programming approach for the specification of nonfunctional component properties, such as fault-tolerance, logging and exception handling. Such properties are referred to as crosscutting concerns and represent critical issues that conventional programming approaches could not modularize effectively, leading to complex code. This paper discusses the AOP concept, the necessity that led to it, and how it provides better results in code quality and software development efficiency, followed by the challenges that developers and researchers face when dealing with this approach. It is concluded that AOP is promising and deserves more attention from developers and researchers. However, more systematic evaluation studies should be conducted to better understand its implications.

Author 1: Heba A. Kurdi

Keywords: Aspect Oriented Programming; software engineering; AspectJ


Paper 5: Fingerprint Image Segmentation Using Haar Wavelet and Self Organizing Map

Abstract: Fingerprint image segmentation is one of the important preprocessing steps in Automatic Fingerprint Identification Systems (AFIS). Segmentation separates the image background from the image foreground, removing unnecessary information from the image. This paper proposes a new fingerprint segmentation method using the Haar wavelet and Kohonen's Self Organizing Map (SOM). The fingerprint image was decomposed using the 2D Haar wavelet at two levels. To generate feature vectors, the decomposed image was divided into nonoverlapping blocks of 2x2 pixels, each converted into a four-element vector. These vectors were then fed into a SOM network that grouped them into foreground and background clusters. Finally, blocks in the background area were removed based on the indexes of blocks in the background cluster. From the research that has been carried out, we conclude that the proposed method is effective in segmenting the background from fingerprint images.

Author 1: Sri Suwarno
Author 2: Subanar
Author 3: Agus Harjoko
Author 4: Sri Hartati

Keywords: Fingerprint Segmentation; AFIS; background image; foreground image; Haar wavelet; SOM

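A minimal Python sketch of the pipeline the abstract describes, under simplifying assumptions: one Haar level instead of two, a two-unit competitive layer standing in for a full Kohonen SOM, and gray_image assumed to be a 2-D grayscale array.

import numpy as np

def haar2d(img):
    """One level of the 2D Haar transform: LL, LH, HL, HH subbands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    return (a+b+c+d)/4, (a-b+c-d)/4, (a+b-c-d)/4, (a-b-c+d)/4

def block_vectors(band, size=2):
    """Flatten non-overlapping size x size blocks into feature vectors."""
    h, w = band.shape
    h, w = h - h % size, w - w % size
    blocks = band[:h, :w].reshape(h//size, size, w//size, size)
    return blocks.transpose(0, 2, 1, 3).reshape(-1, size*size)

def som_2unit(vectors, epochs=20, lr=0.5):
    """Two-unit competitive layer (a crude stand-in for Kohonen's SOM)."""
    rng = np.random.default_rng(0)
    w = vectors[rng.choice(len(vectors), 2, replace=False)].astype(float)
    for e in range(epochs):
        eta = lr * (1 - e / epochs)                  # decaying learning rate
        for v in vectors:
            k = np.argmin(((w - v)**2).sum(axis=1))  # best-matching unit
            w[k] += eta * (v - w[k])
    return np.array([np.argmin(((w - v)**2).sum(axis=1)) for v in vectors])

# labels = som_2unit(block_vectors(haar2d(gray_image)[0]))
# Blocks in the cluster with lower mean energy would be treated as background.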

Paper 6: A Concept-to-Product Knowledge Management Framework: Towards a Cloud-based Enterprise 2.0 Environment at a Multinational Corporation in Penang

Abstract: Knowledge management initiatives of a multinational corporation in Penang are currently deployed via its enterprise-wide portal and Intranet. To build on their current strengths, efforts could now be focused on synergizing organizational workflow as well as computing resources and repositories. This paper proposes a concept-to-product knowledge management framework to be deployed in a cloud-based environment. It aims to provide effective support for collaborative knowledge management efforts at all stages of projects. The multi-layered framework is built upon the organizational memory, which drives the relevant processors and applications. The framework manifests itself as a cloud-based concept-to-product dashboard from which employees can access applications and tools that facilitate their day-to-day tasks in a seamless manner.

Author 1: Yu-N Cheah
Author 2: Soo Beng Khoh

Keywords: knowledge management framework; organizational memory; Enterprise 2.0; workflow management; cloud computing; concept to product


Paper 7: Pedagogy: Instructivism to Socio-Constructivism through Virtual Reality

Abstract: Learning theories have evolved over time, progressing from instructivism through constructivism to social constructivism. These theories were applied in education and had their effects on learners. Advancing technology created a paradigm shift by enabling new ways of teaching and learning, as found in virtual reality (VR). VR provides creative ways in which students learn and an opportunity to achieve learning goals by presenting artificial environments. We developed and simulated a virtual reality system on a desktop using Visual Basic.NET, Java and Macromedia Flash. This simulated environment enhanced students' understanding by providing a degree of reality unattainable in a traditional two-dimensional interface, creating a sensory-rich interactive learning environment.

Author 1: Moses O. Onyesolu
Author 2: Victor C. Nwasor
Author 3: Obiajulu E. Ositanwosu
Author 4: Obinna N. Iwegbuna

Keywords: learning theory; virtual reality; simulated environment; education; pedagogy


Paper 8: A Soft Processor MicroBlaze-Based Embedded System for Cardiac Monitoring

Abstract: This paper aims to contribute to the design community's efforts to demonstrate the effectiveness of state-of-the-art Field Programmable Gate Arrays (FPGAs) in embedded systems development, taking a case study in the biomedical field. With this design approach, we have developed a System on Chip (SoC) for cardiac monitoring based on the MicroBlaze soft processor and the Xilkernel Real Time Operating System (RTOS), both from Xilinx. The system permits the acquisition and digitizing of the analog Electrocardiogram (ECG) signal, displaying the heart rate on a seven-segment module and the ECG on a Video Graphics Adapter (VGA) screen, tracing the heart rate variability (HRV) tachogram, and communicating with a Personal Computer (PC) via the serial port. We used the MIT-BIH Database records to test and evaluate our implementation's performance. In terms of resource utilization, the implementation occupies around 70% of the target FPGA, namely the Xilinx Spartan-6 XC6SLX16. The accuracy of the QRS detection exceeds 96%.

Author 1: El Hassan El Mimouni
Author 2: Mohammed Karim

Keywords: ECG; FPGA; Heart Rate Variability; MicroBlaze; QRS detection; SoC; Xilkernel


Paper 9: Recommender System for Personalised Wellness Therapy

Abstract: Rising costs and risks in health care have shifted the preference of individuals from health treatment to disease prevention. This preventive approach is known as wellness. In recent years, the Internet has become a popular place for wellness-conscious users to search for wellness-related information and solutions. As the user community becomes more wellness conscious, service improvement is needed to help users find relevant personalised wellness solutions. Due to rapid development in the wellness market, users value convenient access to wellness services. Most wellness websites reflect common health informatics approaches; they amount to more than 70,000 sites worldwide. Thus, the wellness industry should improve its Internet services in order to provide better and more convenient customer service. This paper discusses the development of a wellness recommender system that helps users find and adapt suitable personalised wellness therapy treatments based on their individual needs, and introduces new approaches that enhance the convenience and quality of wellness information delivery on the Internet. The wellness recommendation task is performed using hybrid case-based reasoning (HCBR), an Artificial Intelligence technique that solves users' current wellness problems by applying solutions from similar cases in the past. From the evaluation results for our prototype wellness recommendation system, we conclude that wellness consultants use consistent wellness knowledge to recommend solutions for sample wellness cases generated through an online consultation form. Thus, the proposed model can be integrated into wellness websites to enable users to search for suitable personalised wellness therapy treatments based on their health condition.

Author 1: Thean Pheng Lim
Author 2: Wahidah Husain
Author 3: Nasriah Zakaria

Keywords: recommender system; rule-based reasoning; case-based reasoning; Wellness


Paper 10: A Survey of Network-On-Chip Tools

Abstract: Nowadays, Systems-on-Chip (SoCs) have evolved considerably in terms of performance, reliability and integration capacity. The last advantage has driven growth in the number of cores or Intellectual Properties (IPs) on a single chip. Unfortunately, this large number of IPs has raised a new issue: communication among the elements of the same chip. To resolve this problem, a new paradigm has been introduced: the Network-on-Chip (NoC). Since the introduction of the NoC paradigm in the last decade, new methodologies and approaches have been presented by the research community, and many of them have been adopted by industry. The literature contains many relevant studies and surveys discussing NoC proposals and contributions. However, few of them have discussed or proposed a comparative study of NoC tools. The objective of this work is to establish a reliable survey of available NoC design, simulation and implementation tools. We collected a substantial amount of information and characteristics about NoC-dedicated tools, which we present throughout this survey. This study is built on a substantial set of references, and we hope it will help researchers.

Author 1: Ahmed Ben Achballah
Author 2: Slim Ben Saoud

Keywords: Embedded Systems; Network-On-Chip; CAD Tools; Performance Analysis; Verification and Measurement


Paper 11: Construction of Neural Networks that Do Not Have Critical Points Based on Hierarchical Structure

Abstract: A critical point is a point at which the derivatives of an error function are all zero. It has been shown in the literature that critical points caused by the hierarchical structure of a real-valued neural network (NN) can be local minima or saddle points, whereas most critical points caused by the hierarchical structure are saddle points in the case of complex-valued neural networks. Several studies have demonstrated that singularities of these kinds have a negative effect on learning dynamics in neural networks. As described in this paper, decomposing high-dimensional neural networks into equivalent low-dimensional neural networks yields neural networks that have no critical points based on the hierarchical structure. Concretely, the following three cases are shown: (a) a 2-2-2 real-valued NN is constructed from a 1-1-1 complex-valued NN; (b) a 4-4-4 real-valued NN is constructed from a 1-1-1 quaternionic NN; (c) a 2-2-2 complex-valued NN is constructed from a 1-1-1 quaternionic NN. The NNs described above suffer comparatively little from the negative effects of singular points during learning because they have no critical points based on a hierarchical structure.

Author 1: Tohru Nitta

Keywords: critical point; singular point; redundancy; complex number; quaternion


Paper 12: Acoustic Strength of Green Turtle and Fish Based on FFT Analysis

Abstract: The acoustic power at different angles and distances was measured for Green Turtles of four different ages and three species of fish using a modified echo sounder (V1082). The echo signal from the TVG output was digitized at a sampling rate of 1 MHz using an analog-to-digital converter (Measurement Computing USB1208HS). The animals were tied to a wooden frame to ensure they could not move away from the sound beam. The scatter values for fish demonstrate that echo strength differs with the angle of measurement; the lowest acoustic power of fish was recorded from the tail. The findings show a significant difference between fish and turtles aged 12 to 18 years at 4.5 and 5 meters. The carapace and plastron of the sea turtle give higher backscattering strength compared to the other sides, probably because of their hard surfaces. This result is considered important in determining the best method of separating sea turtles from fish, and it reveals that size, surface and animal angle play an important role in determining the acoustic strength value.

Author 1: Azrul Mahfurdz
Author 2: Sunardi
Author 3: Hamzah Ahmad
Author 4: Syed Abdullah Syed Abdul Kadir
Author 5: Nazuki Sulong

Keywords: Echosounder; Green Turtle; acoustic power; TED


Paper 13: A Distributed Key Based Security Framework for Private Clouds

Abstract: Cloud computing in its various forms continues to grow in popularity as organizations of all sizes seek to capitalize on the cloud's scalability, externalization of infrastructure and administration, and generally reduced application deployment costs. But while the attractiveness of these public cloud services is obvious, the ability to capitalize on these benefits is significantly limited for those organizations requiring high levels of data security. It is often difficult if not impossible, from a legal or regulatory perspective, for government agencies or health services organizations, for instance, to use these cloud services, given their many documented data security issues. As a middle ground between the benefits and security concerns of public clouds, hybrid clouds have emerged as an attractive alternative, limiting access, conceptually, to users within an organization or within a specific subset of users within an organization. Private clouds, a significant option within hybrid clouds, are however still susceptible to security vulnerabilities, a fact which points to the necessity of security frameworks capable of addressing these issues. In this paper we introduce the Treasure Island Security Framework (TISF), a conceptual security framework designed specifically to address the security needs of private clouds. We have based our framework on a Distributed Key and Sequentially Addressing Distributed file system (DKASA), itself borrowing heavily from the Google File System and Hadoop. Our approach utilizes a distributed key methodology combined with sequential chunk addressing and dynamic reconstruction of metadata to produce a more secure private cloud. The goal of this work is not to evaluate the framework from an operational perspective but to provide the conceptual underpinning for the TISF. Experimental findings from our evaluation of the framework within a pilot project will be provided in a subsequent work.

Author 1: Ali Shahbazi
Author 2: Julian Brinkley
Author 3: Ali Karahroudy
Author 4: Nasseh Tabrizi

Keywords: private cloud security framework; distributed key; dynamic metadata reconstruction; cloud security


Paper 14: An Open Cloud Model for Expanding Healthcare Infrastructure

Abstract: With the rapid improvement of computation facilities, healthcare still suffers from limited storage space and under-utilization of computer infrastructure. That not only adds to the cost burden but also limits the possibility of expansion and integration with other healthcare services. Cloud computing, which is based on virtualization, elastic allocation of resources, and pay-as-you-go for used services, has opened the way to offering fully integrated and distributed healthcare systems that can expand globally. However, cloud computing, with its ability to virtualize resources, does not come cheap or safe from the healthcare perspective. The main objective of this paper is to introduce a new strategy for healthcare infrastructure implementation using a private cloud based on OpenStack, with the ability to expand over a public cloud in a hybrid cloud architecture. This research proposes the migration of legacy software and medical data to a secured private cloud, with the possibility of integrating with arbitrary public clouds for services that might be needed in the future. The tools used are mainly OpenStack, DeltaCloud, and OpenShift, open-source tools adopted by major cloud computing companies. Their optimized integration can give increased performance with a considerable reduction in cost, without sacrificing security. A simulation was then performed using CloudSim to measure the design's performance.

Author 1: Sherif E. Hussein
Author 2: Hesham Arafat

Keywords: Cloud Computing; OpenStack; OpenShift; CloudSim; e-health


Paper 15: A Novel Algorithm for Improving the ESP Game

Abstract: Games with a purpose (GWAP) and microtask crowdsourcing are human-computation techniques. These techniques can help make image retrieval (IR) more accurate and helpful, enriching the IR system's database by adding descriptions and annotations to images. One human-computation system is the ESP Game, a game with a purpose. Much work has been proposed to solve the ESP Game's problems and extract the most benefit from it. One of these problems is that the ESP Game neglects players' answers for the same image that do not match. This paper presents a new algorithm that uses this neglected data to generate new labels for images. We deployed our algorithm at the University of Menoufia for evaluation. In this trial, we first focused on measuring the total number of labels generated by our Recycle Unused Answers for Images (RUAI) algorithm. In our evaluation of the RUAI algorithm we also present a new evaluation measure, the quality-of-labels measure, which identifies the quality of the labels compared to pre-qualified labels. The results reveal that the proposed algorithm improves on the ESP Game in all cases.

Author 1: Mohamed Sakr
Author 2: Hany Mahgoub
Author 3: Arabi Keshk

Keywords: ESP game; Games with a purpose; Human computation; crowdsourcing


Paper 16: A quadratic convergence method for the management equilibrium model

Abstract: In this paper, we study a class of methods for solving the management equilibrium model. We first give an estimate of the error bound for the model and then, based on that estimate, propose a method for solving the model. We prove that our algorithm is quadratically convergent without requiring the existence of a non-degenerate solution.

Author 1: Jiayi Zhang

Keywords: Management equilibrium model; estimation of error bound; algorithm; quadratic convergence


Paper 17: The Bitwise Operations Related to a Fast Sorting Algorithm

Abstract: In this work we discuss the benefits of using bitwise operations in programming, with some interesting examples. We describe in detail an algorithm for sorting an integer array that makes substantial use of bitwise operations. Besides its correctness, we strictly prove that the described algorithm works in O(n) time. In realizing each of the examined algorithms we use object-oriented programming with the syntax and semantics of the programming language C++.

Author 1: Krasimir Yordzhev

Keywords: bitwise operations; programming languages C/C++ and Java; sorting algorithm

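The sorting algorithm itself is not reproduced in the abstract. One classic bitwise sorting idea, shown purely as a hedged illustration (and in Python rather than the paper's C++), is a bitset sort: mark one bit per key in a large integer, then read the bits back in ascending order. It assumes distinct non-negative keys whose range grows proportionally to n, so the scan remains O(n).

def bitset_sort(values, max_value):
    """Sort distinct integers in [0, max_value] using one bit per value."""
    mask = 0
    for v in values:                   # O(n): set the bit for each key
        mask |= 1 << v
    out = []
    for v in range(max_value + 1):     # O(max_value): read bits in order
        if (mask >> v) & 1:
            out.append(v)
    return out

# bitset_sort([5, 1, 4, 2], 5) -> [1, 2, 4, 5]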

Paper 18: Efficient Role Assignment Scheme for Multichannel Wireless Mesh Networks

Abstract: A wireless mesh network (WMN) is a cost-effective access network architecture. The performance of multi-hop communication degrades quickly as the number of hops grows. Nassiri et al. proposed the Molecular MAC protocol for the autonomic assignment and use of multiple channels to improve network performance. In the Molecular MAC protocol, the nodes form a shortest-path spanning tree rooted at a gateway node linked to the wired Internet. After the tree is formed, nodes at even-numbered and odd-numbered depths are assigned the roles of nucleus and electron, respectively, and each nucleus then selects an idle channel. However, this protocol has a drawback: since all nodes at even-numbered depths become nuclei, there are many nuclei in the topology, and the number of assigned channels tends to increase because each nucleus selects an idle channel not currently occupied by its neighboring nuclei. In wireless communication networks, channels are very important resources, so it is necessary to assign as few channels as possible. To do so, this paper proposes an efficient role assignment scheme, which reduces the number of assigned channels by reducing the number of nodes assigned as nuclei and by preventing nodes within each other's transmission range from becoming nuclei. The proposed scheme was verified through various simulation results.

Author 1: Sunmyeng Kim
Author 2: Hyun Ah Lee

Keywords: role assignment; multichannel; mesh network

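For orientation, here is a hedged Python sketch of the baseline Molecular MAC role assignment summarized in the abstract: build a BFS (shortest-path) spanning tree from the gateway and assign roles by depth parity. The paper's improved scheme, which further thins out the nuclei, is not reproduced here.

from collections import deque

def molecular_roles(adjacency, gateway):
    """BFS spanning tree from the gateway; even depth -> nucleus,
    odd depth -> electron (the baseline assignment the paper improves on)."""
    depth = {gateway: 0}
    queue = deque([gateway])
    while queue:
        u = queue.popleft()
        for v in adjacency[u]:
            if v not in depth:          # tree edge: node reached first time
                depth[v] = depth[u] + 1
                queue.append(v)
    return {node: "nucleus" if d % 2 == 0 else "electron"
            for node, d in depth.items()}

# net = {"gw": ["a", "b"], "a": ["gw", "c"], "b": ["gw"], "c": ["a"]}
# molecular_roles(net, "gw")  # gw and c become nuclei; a and b electrons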

Paper 19: A DIY Approach to Uni-Temporal Database Implementation

Abstract: When historical versions of data are of concern for a Management Information System (MIS), we might naturally resort to temporal database products. These bi-temporal products, however, are often extravagant and not easily mastered by most MIS practitioners. Hence we present a plain DIY (do it yourself) solution, the Audit & Change Logs Mechanism-based approach (ACLM), to meet the uni-temporal requirement of restoring historical versions of data. With ACLM, programmers can code SQL scripts on demand to trace and replay any snapshot of a historical data version via RDBMS built-in functions; they need not shift away from their usual way of coding stored procedures for data maintenance. Besides, the ACLM approach is compatible with mega-data changes, and its added overhead proved imperceptible for routine access throughput in a typical scenario.

Author 1: Haitao Yang
Author 2: Fei Xu
Author 3: Lating Xia

Keywords: DIY solution; recurrence; historical snapshot; uni-temporal database; MIS


Paper 20: The Impact of Cognitive Tools on the Development of the Inquiry Skills of High School Students in Physics

Abstract: The purpose of the study was to compare the effectiveness of two teaching strategies that utilize two different cognitive tools on the development of students' inquiry skills in mechanics. The strategies were used to help students formulate Newton's 2nd law of motion. Two cognitive tools were used: a computer simulation and manipulation of concrete objects in the physics laboratory. A quasi-experimental method that employed a 2 Cognitive Tools × 2 Time of Learning split-plot factorial design was applied in the study. The sample consisted of 54 Grade 11 students from two physics classes of the university preparation section in a high school in the province of Ontario (Canada). One class was assigned to interactive computer simulations (treatment) and the other to concrete objects in the physics laboratory (control). Both tools were embedded in the general framework of the guided-inquiry cycle approach. The results showed that the interaction effect of Cognitive Tools × Time of Learning was not statistically significant. However, the results also showed a significant effect on the development of students' inquiry skills regardless of the type of cognitive tool used. Although the findings suggest that both strategies are effective in developing students' inquiry skills in mechanics, students in the computer simulation group showed a larger gain on the inquiry skills test than their counterparts in the laboratory group.

Author 1: Mohamed I. Mustafa
Author 2: Louis Trudel

Keywords: inquiry skills; teaching strategy; cognitive tools; high school physics; simulation; science laboratory


Paper 21: A System for Multimodal Context-Awareness

Abstract: In this paper we present an improvement to our novel localization system, introducing radio-frequency identification (RFID), which adds person identification capabilities and increases multi-person localization robustness. Our system aims at achieving multi-modal context-awareness in an assistive, ambient intelligence environment. The unintrusive devices used are RFID and 3-D audio-visual information from two Kinect sensors deployed at various locations of a simulated apartment to continuously track and identify its occupants, thus enabling activity monitoring. More specifically, we use skeletal tracking conducted on the depth images and sound source localization conducted on the audio signals captured by the Kinect sensors to accurately localize and track multiple people. RFID information is used mainly for identification purposes but also for rough location estimation, enabling the mapping of location information from the Kinect sensors to the identification events of the RFID. Our system was evaluated in a real-world scenario and attained promising results exhibiting high accuracy, showing the great prospect of using RFID and Kinect sensors jointly to solve the simultaneous identification and localization problem.

Author 1: Georgios Galatas
Author 2: Fillia Makedon

Keywords: Multimodal; Context-awareness; Microsoft Kinect; RFID; Localization; Identification


Paper 22: Improving the Security of the Medical Images

Abstract: Applying security to transmitted medical images is important to protect the privacy of patients. Secure transmission requires cryptography and watermarking to achieve confidentiality and data integrity. Improving the cryptography part requires an encryption algorithm that withstands different attacks for a long time. The proposed method is based on number theory and uses the Chinese remainder theorem as its backbone; this approach achieves a high level of security and stands against different attacks for a long time. For the watermarking part, the medical image is divided into two regions: a region of interest (ROI) and a region of background (ROB). The pixel values of the ROI contain the important information, so this region must not experience any change. The proposed watermarking technique divides the medical image into blocks and inserts the watermark into the ROI by shifting the blocks; an equivalent number of blocks in the ROB is then removed. This approach can be considered lossless since it does not affect the ROI, and it does not increase the image size. In addition, it can withstand watermarking attacks such as cropping and noise.

Author 1: Ahmed Mahmood
Author 2: Tarfa Hamed
Author 3: Charlie Obimbo
Author 4: Robert Dony

Keywords: Medical Imaging Security; Telemedicine Security; Chinese remainder theorem; Watermarking


Paper 23: Mining Positive and Negative Association Rules Using FII-Tree

Abstract: Positive and negative association rules are important for finding useful information hidden in large datasets; negative association rules in particular can reflect mutually exclusive correlations among items. Association rule mining among frequent items has been extensively studied in data mining research. In recent years, however, there has been an increasing demand for mining infrequent items. In this paper, we propose a tree-based approach that stores both frequent and infrequent itemsets in order to mine positive and negative association rules from them. It minimizes I/O overhead by scanning the database only once. The performance study shows that the proposed method is more efficient than the previously proposed method.

Author 1: T Ramakrishnudu
Author 2: R B V Subramanyam

Keywords: data mining; association rule; frequent itemset; positive association rule; negative association rule

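As a hedged refresher on the rule types involved (not the proposed FII-tree itself), the sketch below computes the confidence of a positive rule A => B and its negative counterparts over a toy transaction list; it assumes that both A and not-A occur in the data so the denominators are non-zero.

def support(transactions, yes=(), no=()):
    """Fraction of transactions containing all of `yes` and none of `no`."""
    hits = sum(1 for t in transactions
               if all(i in t for i in yes) and all(i not in t for i in no))
    return hits / len(transactions)

def rule_confidences(transactions, a, b):
    sa = support(transactions, yes=(a,))    # support of A
    sna = support(transactions, no=(a,))    # support of not-A
    return {
        f"{a} => {b}":     support(transactions, yes=(a, b)) / sa,
        f"{a} => not {b}": support(transactions, yes=(a,), no=(b,)) / sa,
        f"not {a} => {b}": support(transactions, yes=(b,), no=(a,)) / sna,
    }

# rule_confidences([{"milk","bread"}, {"milk"}, {"tea"}, {"milk","tea"}],
#                  "milk", "bread")
# -> confidence 1/3 for "milk => bread", 2/3 for "milk => not bread"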

Paper 24: Link-Budget Design and Analysis showing Impulse-based UWB Performance Trade-Off flexibility as Integrator Solution for Different Wireless Short-Range Infrastructures

Abstract: Future wireless indoor scenarios are expected to be complex, requiring wireless nodes to respond adaptively to dynamic changes in channel conditions. Interacting with neighboring nodes to achieve optimized performance in terms of data rate, distance and BER is a main concern in designing future wireless solutions. IR-UWB enters the picture as the missing puzzle piece for achieving these requirements and gluing the different existing indoor wireless infrastructures into global platform solutions. This paper shows the flexibility of IR-UWB signal design at the physical layer level within a cross-layer architecture for optimized performance. A detailed performance analysis is presented as a mathematical model of the proposed wireless solution and described in a proposed link-budget design template. The performance evaluation shows the proposed system to be a good candidate for different wireless scenarios with different data-rate and distance requirements at a specified BER. Simulation results as well as experimental statistical analysis of the received signal under different channel models and conditions are carried out as a proof of concept of the proposed system.

Author 1: M. S Jawad
Author 2: Othman A. R
Author 3: Z. Zakaria
Author 4: A. A. Isa
Author 5: W. Ismail

Keywords: Ultra Wideband; Time Hopping-pulse position Modulation; Radio Frequency Identification; Wireless Sensors Networks; RAKE Receiver; Bit Error rate


Paper 25: Comparative Study of the Software Metrics for the Complexity and Maintainability of Software Development

Abstract: Software metrics is one of the well-known topics of research in software engineering. Metrics are used to improve the quality and validity of software systems. Research in this area focuses mainly on static metrics obtained by static analysis of the software. However, modern software systems are incomplete without object-oriented design. Every system has its own complexity, which should be measured to improve the quality of the system. This paper describes the different types of metrics, including static code metrics and object-oriented metrics. The metrics are then summarized on the basis of their relevance in measuring complexity, helping to improve the maintainability of software code while retaining quality and cost effectiveness.

Author 1: Dr Sonal Chawla
Author 2: Gagandeep Kaur

Keywords: Static metrics; OO metrics; MOOD


Paper 26: Multithreading Image Processing in Single-core and Multi-core CPU using Java

Abstract: Multithreading has been shown to be a powerful approach for boosting system performance. One good example of an application class that benefits from multithreading is image processing, which requires many resources and much processing time because the calculations are often done over a matrix of pixels. The Java programming language supports multithreading as part of the language itself instead of treating threads through the operating system. In this paper we explore the performance of Java image processing applications designed with a multithreading approach. In order to test how multithreading influences program performance, we tested several image processing algorithms implemented in Java using both a sequential single-threaded and a multithreaded approach on single-core and multi-core CPUs. The experiments were based not only on different platforms and algorithms of differing complexity, but also on varying the sizes of the images and the number of threads when the multithreading approach is applied. Performance increased on single-core and multi-core CPUs in different ways in relation to image size, algorithm complexity and platform.

Author 1: Alda Kika
Author 2: Silvana Greca

Keywords: multithreading; image processing; multi-core; Java

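The paper's Java code is not included in the abstract. As a loose illustration of the strip-parallel pattern such experiments typically use (written in Python with NumPy rather than Java, and with a trivial invented filter), the sketch below splits an image into horizontal strips and processes them in a thread pool; NumPy releases the interpreter lock during many array operations, so the threads can genuinely overlap.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

def invert_strip(strip):
    """Per-strip kernel: a trivial pixel-wise negative filter."""
    return 255 - strip

def process_image(image, n_threads=4):
    strips = np.array_split(image, n_threads, axis=0)   # horizontal strips
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        results = list(pool.map(invert_strip, strips))  # order preserved
    return np.vstack(results)                           # reassemble

# img = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
# out = process_image(img, n_threads=8)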

Paper 27: EICT Based Diagnostic Tool and Monitoring System for EMF Radiation to Sustain Environmental Safety

Abstract: The adverse effects of electromagnetic radiation from mobile phones and communication towers on health are being well documented today. However, the radiation levels of communication towers are not continuously monitored. The aim of this paper is to study, analyze and apply networking and data mining technologies to develop an EICT-based diagnostic tool and monitoring system for electromagnetic radiation levels in the environment. This system networks all mobile towers of each service provider as a single entity and then connects all service providers to a central monitoring agency online for continuous monitoring. Since a very large number of mobile towers exist in India, each state can have its own regional network, which is further networked with a national central network; this can be enlarged to the entire world for monitoring the EMF radiation levels near every mobile tower. For these regional, national and international networks, connectivity is to be instituted by the respective service provider. In this paper an attempt is made to logically apply data mining and networking technologies to develop a central EICT-based diagnostic tool and monitoring system for EMF radiation from each transmission tower. With this system, regional, national and international agencies and authorities can continuously monitor the EMF radiation in each transmission tower area and verify it against exposure standards. It is proposed to display this information using an Integrated Display System in front of the monitoring authority at appropriate levels.

Author 1: K Parandham
Author 2: RP Gupta
Author 3: Amitab Saxena

Keywords: EICT-based Diagnostic Tool; Electromagnetic Fields (EMF) Radiation; Mobile Telephony; Data Mining; Data Warehousing; Electronics, Information and Communication Technologies (EICT); International Commission on Non-Ionizing Radiation Protection (ICNIRP); Compressed Natural Gas (CNG)


Paper 28: Facing the challenges of the One-Tablet-Per-Child policy in Thai primary school education

Abstract: The Ministry of Education in Thailand is currently distributing tablets to all first-year primary (Prathom 1) school children across the country as part of the government's "One Tablet Per Child" (OTPC) project to improve education. Early indications suggest that there are many unexplored issues in designing and implementing tablet activities for such a large and varied group of students, and so far there is a lack of evaluation of the effectiveness of the tablet activities. In this article, the authors propose four challenges for improving Thailand's OTPC project: developing contextualised content, ensuring usability, providing teacher support, and assessing learning outcomes. A case study on developing science activities for first-year primary school children on the OTPC devices is the basis for presenting possible solutions to the four challenges. In addressing the challenge of providing teacher support, an architecture is described for collecting data from student interactions with the tablet in order to analyse the current progress of students in a live classroom setting. From tests in three local Thai schools, the authors evaluate the case study from both student and teacher perspectives. In concluding the paper, a framework for guiding mobile learning innovation is used to review the qualities and shortcomings of the case study.

Author 1: Ratchada Viriyapong
Author 2: Antony Harfield

Keywords: educational technology; m-learning; mobile computing; tablet-based education


Paper 29: A Remote Health Care System Combining a Fall Down Alarm and Biomedical Signal Monitor System in an Android Smart-Phone

Abstract: First aid and immediate help are very important following an accident: the earlier detection and treatment are carried out, the better the prognosis and chance of recovery. This is even more important for the elderly. Once elderly people have an accident, they not only injure their body physically but may also suffer impaired mental and social ability and develop severe sequelae. In the last few years, the continuously developing Android phone has been applied in many fields. Yet despite the GPS positioning built into current mobile phones, most applications in use remain SMS and file transfer, and there are few truly successful cases of biomedical measurement signals being passed through a transfer interface and uploaded to a mobile phone for remote health care. This research develops an Android application which combines the functionality of an ECG, a pulsimeter, SpO2 measurement, and a BAD (Body Activity Detector) for real-time monitoring of the activity of the body. When an accident occurs, the signals pass through the Android smart phone, immediately notifying the remote end and providing first-time help.

Author 1: Ching-Sung Wang
Author 2: Chien-Wei Li
Author 3: Teng-Hui Wang

Keywords: Health care; Biomedical signal; Fall down alarm; Real-time; Android smart phone


Paper 30: Detecting Linkedin Spammers and its Spam Nets

Abstract: Spam is one of the main problems of the WWW. Many studies exist on characterising and detecting several types of Spam (mainly Web Spam, Email Spam, Forum/Blog Spam and Social Networking Spam). Nevertheless, to the best of our knowledge, there are no studies on the detection of Spam in Linkedin. In this article, we propose a method for detecting Spammers and Spam nets in the Linkedin social network. As there are no public or private Linkedin datasets in the state of the art, we have manually built a dataset of real Linkedin users, classifying them as Spammers or legitimate users. The proposed method for detecting Linkedin Spammers consists of a set of new heuristics and their combinations using a kNN classifier. Moreover, we propose a method for detecting Spam nets (fake companies) in Linkedin, based on the idea that the profiles of these companies share content similarities. The proposed methods proved very effective: we achieved an F-measure of 0.971 and an AUC close to 1 in the detection of Spammer profiles, and an F-measure of 1 in the detection of Spam nets.

Author 1: Víctor M. Prieto
Author 2: Manuel Álvarez
Author 3: Fidel Cacheda

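The heuristics themselves are not listed in the abstract. As a hedged sketch of the classification stage only, the code below runs a plain kNN majority vote over per-profile feature vectors; the three features shown are invented for illustration and are not the paper's heuristics.

import numpy as np

def knn_predict(train_X, train_y, x, k=5):
    """Majority vote among the k nearest training profiles (Euclidean)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return int(votes.sum() * 2 > len(votes))   # 1 = Spammer, 0 = legitimate

# Hypothetical per-profile features:
# [number of connections, links in summary, ratio of copied profile text]
train_X = np.array([[500, 0, 0.1], [3, 9, 0.9], [200, 1, 0.2], [5, 7, 0.8]])
train_y = np.array([0, 1, 0, 1])
print(knn_predict(train_X, train_y, np.array([4, 8, 0.85]), k=3))  # -> 1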

Paper 31: Limit Cycle Generation for Multi-Modal and 2-Dimensional Piecewise Affine Control Systems

Abstract: This paper considers a limit cycle control problem for a multi-modal, 2-dimensional piecewise affine control system. Limit cycle control refers to a controller design method that generates a limit cycle for a given piecewise affine control system. First, we deal with a limit cycle synthesis problem and derive a new solution to it. In addition, a theoretical analysis of the rotational direction and the period of a limit cycle is given. Next, the limit cycle control problem for the piecewise affine control system is formulated. We then obtain matching conditions such that the piecewise affine control system with the state feedback law corresponds to the reference system which generates a desired limit cycle. Finally, in order to demonstrate the effectiveness of the new method, a numerical simulation is presented.

Author 1: Tatsuya Kai


Paper 32: Privacy-Preserving Clustering Using Representatives over Arbitrarily Partitioned Data

Abstract: The challenge in privacy-preserving data mining is avoiding the invasion of personal data privacy. Secure computation provides a solution to this problem. With the development of this technique, fully homomorphic encryption has been realized after decades of research; this encryption enables computing on encrypted data and obtaining results without accessing any plaintext or private key information. In this paper, we propose a privacy-preserving clustering using representatives (CURE) algorithm over arbitrarily partitioned data using fully homomorphic encryption. Our privacy-preserving CURE algorithm allows cooperative computation without revealing users' individual data. The method used in our algorithm enables the data to be arbitrarily distributed among different parties while receiving accurate clustering results.

Author 1: Yu Li
Author 2: Sheng Zhong


Paper 33: Texture Classification based on Bidimensional Empirical Mode Decomposition and Local Binary Pattern

Abstract: This paper presents a new, simple and robust texture analysis feature based on Bidimensional Empirical Mode Decomposition (BEMD) and the Local Binary Pattern (LBP). BEMD is a locally adaptive decomposition method suitable for the analysis of nonlinear or nonstationary signals. Texture images are decomposed by BEMD into several Bidimensional Intrinsic Mode Functions (BIMFs), which present a new set of multi-scale components of the image. In our approach, saddle points are first added as supporting points for interpolation to improve the original BEMD, and the images are then decomposed by the new BEMD into several components (BIMFs). Local Binary Patterns at different sizes are then used to extract features from the different BIMFs. Finally, normalization and a BIMF selection method are adopted for feature selection. The proposed feature is invariant while preserving LBP's simplicity. Our method has been evaluated on the CUReT and KTH-TIPS2a texture image databases. It is experimentally demonstrated that the proposed feature achieves higher classification accuracy than other state-of-the-art texture representation methods, especially with small training samples.

Author 1: JianJia Pan
Author 2: YuanYan Tang

Keywords: Texture classification; Empirical Mode Decomposition; Local Binary Pattern; Invariant feature

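As a hedged sketch of the LBP stage only (the basic 8-neighbour operator; the paper applies LBP at several sizes to the BEMD components, which is not reproduced here):

import numpy as np

def lbp8(img):
    """Basic 8-neighbour Local Binary Pattern for a 2-D grayscale array."""
    img = img.astype(np.int32)
    center = img[1:-1, 1:-1]
    # Neighbours in a fixed clockwise order, each contributing one bit.
    offsets = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    code = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1+dy:img.shape[0]-1+dy, 1+dx:img.shape[1]-1+dx]
        code |= (neigh >= center).astype(np.int32) << bit
    return code

def lbp_histogram(img):
    """256-bin normalized histogram of LBP codes, usable as a texture feature."""
    hist = np.bincount(lbp8(img).ravel(), minlength=256).astype(float)
    return hist / hist.sum()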

Paper 34: A Bayesian framework for glaucoma progression detection using Heidelberg Retina Tomograph images

Abstract: Glaucoma, the second leading cause of blindness in the United States, is an ocular disease characterized by structural changes of the optic nerve head (ONH) and changes in visual function; early detection is therefore of high importance to preserve remaining visual function. In this context, the Heidelberg Retina Tomograph (HRT), a confocal scanning laser tomograph, is widely used as a research tool as well as a clinical diagnostic tool for imaging the optic nerve head to detect glaucoma and monitor its progression. In this paper, a glaucoma progression detection technique using HRT images is proposed. Contrary to existing methods, which do not integrate spatial pixel dependency in the change detection map, we propose the use of a Markov Random Field (MRF) to handle such dependency. To estimate the model parameters, a Markov chain Monte Carlo procedure is used. We then compare the diagnostic performance of the proposed framework to existing methods of glaucoma progression detection.

Author 1: Akram Belghith
Author 2: Christopher Bowd
Author 3: Madhusudhanan Balasubramanian
Author 4: Robert N. Weinreb
Author 5: Linda M. Zangwill

Keywords: Glaucoma; Markov random field; change detection; Bayesian estimation

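The paper's exact model is not given in the abstract. As a loose, hedged sketch of MRF-based change detection in general, the code below runs a few Gibbs sweeps over a binary change map, combining an Ising smoothness prior with per-class Gaussian likelihoods; every parameter value is an illustrative assumption, not taken from the paper.

import numpy as np

def mrf_change_map(diff, mu=(0.0, 1.0), sigma=0.3, beta=1.5, sweeps=10):
    """Binary change detection on a difference image via Gibbs sampling:
    Ising prior (strength beta) + Gaussian likelihood per class (mu, sigma)."""
    rng = np.random.default_rng(0)
    H, W = diff.shape
    labels = (diff > np.mean(mu)).astype(int)    # crude initialization
    loglik = np.stack([-(diff - m)**2 / (2 * sigma**2) for m in mu])
    for _ in range(sweeps):
        for y in range(H):
            for x in range(W):
                nb = [labels[yy, xx]
                      for yy, xx in ((y-1,x), (y+1,x), (y,x-1), (y,x+1))
                      if 0 <= yy < H and 0 <= xx < W]
                # Posterior energy for each candidate label at this pixel.
                e = [beta * sum(n == k for n in nb) + loglik[k, y, x]
                     for k in (0, 1)]
                p1 = 1.0 / (1.0 + np.exp(e[0] - e[1]))
                labels[y, x] = int(rng.random() < p1)
    return labels    # 1 = change (possible progression), 0 = stable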

Paper 35: Towards a Seamless Future Generation Network for High Speed Wireless Communications

Abstract: MIMO technology for meeting the design criteria of future-generation broadband networks is presented, and typical next-generation scenarios are investigated. MIMO is integrated with OFDM for effective exploitation of space, time and frequency diversity in high-speed outdoor environments. Two different OFDM design kernels, the fast Fourier transform (FFT) and the wavelet packet transform (WPT), are used at the baseband for an OFDM system travelling at high terrestrial speed, at operating frequencies of 800 MHz and 2.6 GHz. Results show that the wavelet kernel for designing OFDM systems can withstand doubly selective channel fading at mobile speeds of up to 280 km/h, outperforming the traditional OFDM design kernel, the fast Fourier transform.

Author 1: Kelvin O. O. Anoh
Author 2: Raed A. A. Abd-Alhameed
Author 3: Michael Chukwu
Author 4: Mohammed Buhari
Author 5: Steve M. R. Jones

Keywords: Doppler Effect; Doubly selective fading; frequency-selective fading; OFDM; Wavelet; MIMO

