The Science and Information (SAI) Organization
IJACSA Volume 5 Issue 7

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Benefits Management of Cloud Computing Investments

Abstract: This paper examines investments in cloud computing using the Benefits Management approach. The major contribution of the paper is to provide a unique insight into how organizations derive value from cloud computing investments. The motivation for writing this paper is to consider the business benefits generated from utilizing cloud computing in a range of organizations. Case studies are used to describe a number of organizations' approaches to benefits exploitation using cloud computing. It was found that smaller organizations can generate rapid growth using strategies based on cloud computing. Larger organizations have used utility approaches to reduce the costs of IT infrastructure.

Author 1: Richard Greenwell
Author 2: Xiaodong Liu
Author 3: Kevin Chalmers

Keywords: Cloud Computing; Benefits Management; Information Systems Management

PDF

Paper 2: Image Segmentation Via Color Clustering

Abstract: This paper develops a computationally efficient process for segmentation of color images. The input image is partitioned into a set of output images according to the color characteristics of various image regions. The algorithm is based on random sampling of the input image and fuzzy clustering of the training data, followed by crisp classification of the input image. The user prescribes the number of randomly selected pixels comprising the training set and the number of color classes characterizing the image compartments. The algorithm developed here constitutes an effective preprocessing technique with various applications in machine vision systems. Spectral segmentation of the sensor image can potentially lead to enhanced performance of the object detection, classification, recognition, authentication and tracking modules of the autonomous vision system.

Author 1: Kaveh Heidary

Keywords: Clustering; Classification; Image Segmentation; Machine Vision

PDF
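The pipeline this abstract describes — sample pixels, fuzzy-cluster the sample, then crisply classify every pixel — can be sketched with textbook fuzzy c-means. This is an illustrative sketch, not the author's implementation; the deterministic initialization and function names are assumptions.

```python
def dist(a, b):
    """Euclidean distance between two color vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def fuzzy_c_means(points, c, m=2.0, iters=30):
    """Standard fuzzy c-means: returns the learned cluster centers."""
    # deterministic spread initialization across the training set
    centers = [points[i * len(points) // c] for i in range(c)]
    for _ in range(iters):
        # fuzzy membership of every training pixel in every cluster
        u = []
        for p in points:
            d = [max(dist(p, ctr), 1e-12) for ctr in centers]  # guard zero distance
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0)) for k in range(c))
                      for j in range(c)])
        # move each center to the membership-weighted mean of the pixels
        centers = [
            tuple(sum((u[i][j] ** m) * points[i][k] for i in range(len(points)))
                  / sum(u[i][j] ** m for i in range(len(points)))
                  for k in range(len(points[0])))
            for j in range(c)
        ]
    return centers

def classify(pixel, centers):
    """Crisp classification of an input pixel: index of the nearest color class."""
    return min(range(len(centers)), key=lambda j: dist(pixel, centers[j]))
```

In practice the training set would be the user-prescribed number of randomly sampled pixels, and `classify` would then be applied to every pixel of the input image to produce the segmented output.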

Paper 3: A Framework to Improve Communication and Reliability Between Cloud Consumer and Provider in the Cloud

Abstract: Cloud service consumers demand reliable methods for choosing an appropriate cloud service provider for their requirements. The number of cloud consumers is increasing day by day, and so is the number of cloud providers; hence the need for a common platform for interaction between cloud providers and cloud consumers is also on the rise. This paper introduces the Cloud Providers Market Platform Dashboard. It not only supports cloud provider discovery but also provides timely reports to consumers on cloud service usage, derives new requirements from a consumer's cloud usage, and estimates the cost of meeting them. The dashboard is also responsible for obtaining each service provider's cost for a particular requirement. Our solution learns from requirements and provides consumers with the details needed for effective usage of cloud services. It also enables service providers to understand requirements, deliver quality of service, and respond to new requirements. Finally, the framework addresses how usage data collected before and after consuming cloud services can be used to choose the right service provider for a particular requirement in a community.

Author 1: Vivek Sridhar

Keywords: cloud computing; requirement communication; requirement engineering; cloud service; cloud discoverability; data mining; artificial intelligence in cloud computing

PDF

Paper 4: A Wavelet-Based Approach for Ultrasound Image Restoration

Abstract: Ultrasound images are generally affected by speckle noise, which is mainly due to the coherent nature of the scattering phenomenon. Speckle filtering is usually accompanied by a loss of diagnostic features. In this paper a modest new approach is introduced to remove speckle while keeping the fine features of the tissue under diagnosis, by enhancing the image's edges via curvelet denoising and wavelet-based image fusion. Performance is evaluated with four quantitative measures: the peak signal-to-noise ratio (PSNR), the root mean square error (RMSE), a universal image quality index (Q), and Pratt's figure of merit (FOM) as a quantitative measure of edge preservation, plus a Canny edge map extracted as a qualitative measure of edge preservation. The measurements confirm the qualitative and quantitative success of the proposed approach in denoising images while preserving edges as far as possible. A gray phantom was designed to test the proposed enhancement method. The phantom results confirm the applicability of the approach not only to this work but also to grayscale diagnostic scans in general, including ultrasound B-scans.

Author 1: Mohammed Tarek GadAllah
Author 2: Samir Mohammed Badawy

Keywords: Ultrasound Medical Imaging; Curvelet Based Image Denoising; Wavelet Based Image Fusion

PDF

Paper 5: A Second Correlation Method for Multivariate Exchange Rates Forecasting

Abstract: The foreign exchange market is one of the most complex dynamic markets, with high volatility, nonlinearity and irregularity. As globalization spreads across the world, exchange rate forecasting becomes more important and more complicated. Many external factors influence exchange rate volatility; those external variables can be used for forecasting, and they are usually chosen based on their correlation with the predicted variable. A new second correlation method to improve forecasting accuracy is proposed. The second correlation is used to choose the external variable with a different time interval. The proposed method is tested on six major monthly exchange rates using a Nonlinear Autoregressive model with eXogenous input (NARX), with a Nonlinear Autoregressive (NAR) model for benchmarking. We find that the forecasting accuracy of the proposed method improves by 16.8% on the Dstat measure compared to the univariate NAR model, is slightly better on average than plain linear correlation, and gives almost no improvement in MSE.

Author 1: Agus Sihabuddin
Author 2: Subanar
Author 3: Dedi Rosadi
Author 4: Edi Winarko

Keywords: forecasting; foreign exchange; NARX; second correlation

PDF

Paper 6: Mitigation of Cascading Failures with Link Weight Control

Abstract: Cascading failures are crucial issues for the study of the survivability and resilience of our infrastructures and have attracted much interest in complex networks research. In this paper, we study the overload-based cascading failure model and propose a soft defense strategy to mitigate the damage from such cascading failures. In particular, we assign adjustable weights to individual links of a network and control the weight parameter. The information flow and the routing patterns in a network are then controlled based on the assigned weights. The main idea of this work is to control the traffic on the network, and we verify the effectiveness of load redistribution for mitigating cascading failures. Numerical results imply that network robustness can be enhanced significantly using the relevant smart routing strategy, in which loads in the network are properly redistributed.

Author 1: Hoang Anh Tran Quang
Author 2: Akira Namatame

Keywords: cascading failures; link weight; network robustness

PDF
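The load-redistribution idea — routing along paths whose link costs are base weights raised to a tunable exponent, so traffic can be steered away from heavily weighted links — can be sketched with a parameterized Dijkstra search. This is an illustrative sketch; the paper's actual weighting scheme may differ.

```python
import heapq

def shortest_path(adj, weights, src, dst, alpha):
    """Dijkstra where each link cost is its base weight raised to a tunable
    exponent alpha; raising alpha penalizes heavy links more strongly."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    seen = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v in adj[u]:
            nd = d + weights[(u, v)] ** alpha   # tunable link cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]
```

With a low exponent the direct heavy link is cheapest; raising the exponent makes two light hops cheaper than one heavy hop, which is exactly the kind of load redistribution the abstract describes.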

Paper 7: Applicability of the Maturity Model for IT Service Outsourcing in Higher Education Institutions

Abstract: Outsourcing is a strategic option which complements IT services provided internally in organizations. This study proposes the applicability of a new holistic maturity model based on the standards ISO/IEC 20000 and ISO/IEC 38500, and the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. This model allows independent validation and practical application in the field of higher education. In addition, this study shows how to achieve an effective transition to a model of good governance and management of outsourced IT services which, aligned with the core business of universities, improves the effectiveness and efficiency of their management, optimizes value and minimizes risk.

Author 1: Victoriano Valencia García
Author 2: Dr. Eugenio J. Fernández Vicente
Author 3: Dr. Luis Usero Aragonés

Keywords: IT governance; IT management; Outsourcing; IT services; Maturity model; Maturity measurement

PDF

Paper 8: Modification of CFCM in The Presence of Heavy AWGN for Bayesian Blind Channel Equalizer

Abstract: In this paper, a modification of conditional Fuzzy C-Means (CFCM) aimed at the estimation of unknown desired channel states is accomplished for a Bayesian blind channel equalizer in the presence of heavy additive white Gaussian noise (AWGN). To modify CFCM to search for the optimal channel states of a heavily noise-corrupted communication channel, a Gaussian weighted partition matrix, along with the Bayesian likelihood fitness function and the conditional constraint of ordinary CFCM, is developed and exploited. In the experiments, binary signals are generated at random and transmitted through both linear and nonlinear channels corrupted with various degrees of AWGN, and the modified CFCM estimates the channel states of those unknown channels. The simulation results, including a comparison with the previously developed algorithm exploiting ordinary CFCM, demonstrate the effectiveness of the proposed modification in terms of accuracy and speed, especially in the presence of heavy AWGN. Therefore, the proposed modification can constitute a search algorithm for the optimal channel states of a Bayesian blind channel equalizer in severely noise-corrupted communication environments.

Author 1: Changkyu Kim
Author 2: Soowhan Han

Keywords: Gaussian Partition Matrix; Conditional Fuzzy C-Means; Channel States; Bayesian Blind Equalizer

PDF

Paper 9: An Object-Oriented Smartphone Application for Structural Finite Element Analysis

Abstract: Smartphones are becoming increasingly ubiquitous both in general society and in the workplace. Recent increases in mobile processing power have shown that the current generation of smartphones has processing power equivalent to a supercomputer from the early 1990s. Many industries have abandoned desktop computing and are now entirely reliant on mobile devices. Given these facts, it is logical that smartphones be considered as the next platform for finite element analysis (FEA). This paper presents an architecture for a smartphone FEA application using object-oriented programming. An MVC design pattern is adopted and a demonstration FEA application for the Android smartphone platform is presented.

Author 1: B.J. Mac Donald

Keywords: Object-oriented programming; Finite Element Method; Java; Android

PDF

Paper 10: New Approach for Image Fusion Based on Curvelet Approach

Abstract: Most image fusion work has been limited to monochrome images. Algorithms which exploit human colour perception are attracting great interest in the image fusion community, mainly because the use of colour greatly expands the amount of information that can be conveyed in an image. Since the human visual system is very sensitive to colour, research was undertaken in mapping three individual monochrome multispectral images to the respective channels of an RGB image to produce a false-colour fused image. Producing a fused colour output image which maintains the original chromaticity of the input visual image is highly tricky. The focus of this paper is a new approach to fusing a colour (visual) image and a corresponding grayscale (infrared, IR) image using the curvelet transform with different fusion rules in new fields. The fused image obtained by the proposed approach maintains the high resolution of the coloured visual image, incorporates any hidden object revealed by the IR sensor, complements the two input images and keeps the natural colour of the visual image.

Author 1: Gehad Mohamed Taher
Author 2: Mohamed ElSayed Wahed
Author 3: Ghada EL Taweal

Keywords: Image fusion; visual colored image; monochrome images

PDF

Paper 11: Automated Menu Recommendation System Based on Past Preferences

Abstract: Data mining plays an important role in e-commerce in today's world. Time is critical when it comes to shopping, as options are unlimited and making a choice can be tedious. This study presents an application of data mining in the form of an Android application that provides the user with automated suggestions based on past preferences. The application helps a person choose what food they might want to order in a specific restaurant. The application learns user behavior with each order: what they order for each kind of meal and which products they select together. After gathering enough information, the application can suggest to the user the most selected dishes, both in the recent past and since the application started learning. Applications such as these can play a major role in helping make a decision based on past preferences, thereby reducing the user's involvement in decision making.

Author 1: Daniel Simon Sanz
Author 2: Ankur Agrawal

Keywords: data mining; Apriori; Android; restaurant; recommendation system

PDF
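The Apriori keyword suggests level-wise frequent itemset mining over past orders; dishes that are frequently selected together become candidate suggestions. A minimal sketch follows — the menu items and support threshold are invented for illustration.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every itemset appearing in at least min_support transactions.
    Level-wise search: a (k+1)-itemset is a candidate only if all of its
    k-item subsets are themselves frequent (the Apriori property)."""
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if set(itemset) <= t)

    frequent = {}
    level = [(i,) for i in items]   # frequent-candidate 1-itemsets
    k = 1
    while level:
        survivors = [s for s in level if support(s) >= min_support]
        frequent.update({s: support(s) for s in survivors})
        # generate (k+1)-item candidates and prune by the Apriori property
        pool = sorted({i for s in survivors for i in s})
        level = [c for c in combinations(pool, k + 1)
                 if all(sub in frequent for sub in combinations(c, k))]
        k += 1
    return frequent
```

A recommender would then suggest the highest-support itemsets that extend what the user has already placed in the current order.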

Paper 12: A Shape Based Image Search Technique

Abstract: This paper describes an interactive application we have developed based on a shape-based image retrieval technique. The key concepts in the project are: i) matching of images based on contour matching; ii) matching of images based on edge matching; iii) matching of images based on pixel matching of colours. Further, the application makes the matching invariant to transformations such as i) translation; ii) rotation; iii) scaling. A key feature of the system is that it graphically shows the percentage mismatch between the uploaded image and the images already in the database, while its integrity rests on the unique matching techniques used to obtain an optimal result. This increases the accuracy of the system. For example, when a user uploads an image, say an image of a mango leaf, the application shows all mango leaves present in the database as well as other leaves matching the colour and shape of the uploaded mango leaf.

Author 1: Aratrika Sarkar
Author 2: Pallabi Bhattacharjee

Keywords: shape-based image retrieval; contour matching; edge matching; pixel matching

PDF

Paper 13: Computer Ethics in the Semantic Web Age

Abstract: Computer ethics can be defined as a set of moral principles that govern the use of computers. Similar rules were then required for both programmers and users. Issues that were not anticipated in the past have arisen due to the introduction of newer platforms such as the Semantic Web. Both programmers and users are now obliged to consider phenomena such as informed consent. In this paper, I explore the ethical problems that arise for professionals and users with the advent of new technologies, especially privacy concerns and global information.

Author 1: Aziz Alotaibi

Keywords: Computer ethics; semantic web; privacy concerns; global information

PDF

Paper 14: A Tool Design of Cobit Roadmap Implementation

Abstract: Over the last two decades, the role of information technology in organizations has changed from a primarily supportive and transactional function to an essential prerequisite for strategic value generation. Organizations base their operational services on their Information Systems (IS), which need to be managed, controlled and monitored constantly. IT governance (ITG), i.e. the way organizations manage IT resources, has become a key factor for enterprise success due to the increasing dependency of enterprises on IT solutions. There are several approaches available to deal with ITG. These methods are diverse and, in some cases, long and complicated to implement. One well-accepted ITG framework is COBIT, designed as a global approach. This paper describes the design of a tool for COBIT roadmap implementation. The model is being developed in the course of ongoing PhD research.

Author 1: Karim Youssfi
Author 2: Jaouad Boutahar
Author 3: Souhail Elghazi

Keywords: IT governance; COBIT; Tool design; Roadmap; Implementation

PDF

Paper 15: Ontology Mapping of Business Process Modeling Based on Formal Temporal Logic

Abstract: A business process is the combination of a set of activities with logical order and dependence, whose objective is to produce a desired goal. Business process modeling (BPM), using knowledge of the available process modeling techniques, enables a common understanding and analysis of a business process. Industry and academia use informal and formal techniques, respectively, to represent business processes (BP), with the main objective of supporting an organization. Although both aim at BPM, the techniques used are quite different in their semantics. A review of the literature found that no general representation of business process modeling is available that is more expressive than the commercial modeling tools and techniques. Therefore, this work is primarily conceived to provide an ontology mapping of the modeling terms of Business Process Modeling Notation (BPMN), Unified Modeling Language (UML) Activity Diagrams (AD) and Event-driven Process Chains (EPC) to temporal logic. Being a formal system, first-order logic assists in a thorough understanding of process modeling and its application. Our contribution is to devise a versatile conceptual categorization of modeling terms/constructs and to formalize them, based on well-accepted business notions such as action, event, process, connector and flow. It is demonstrated that the new categorization of modeling terms, mapped to formal temporal logic, provides the expressive power to subsume the business process modeling techniques BPMN, UML AD and EPC.

Author 1: Irfan Chishti
Author 2: Jixin Ma
Author 3: Brian Knight

Keywords: Business Process Modeling techniques; Ontology; Temporal Logic; Semantics; Mapping

PDF

Paper 16: Adaptive Cache Replacement: A Novel Approach

Abstract: Cache replacement policies are developed to help ensure optimal use of limited resources. A variety of such algorithms exist, with relatively few that dynamically adapt to traffic patterns. Algorithms that are tunable typically rely on off-line training mechanisms or trial and error to determine optimal characteristics. Utilizing multiple algorithms to establish an efficient replacement policy that dynamically adapts to changes in traffic load and access patterns is a novel option introduced in this article. A simulation of this approach utilizing two existing, simple, and effective policies, namely LRU and LFU, was studied to assess the potential of the adaptive policy. This policy is compared and contrasted with other cache replacement policies using public traffic samples mentioned in the literature as well as a synthetic model created from existing samples. Simulation results suggest that the adaptive cache replacement policy is beneficial, primarily at smaller cache sizes.

Author 1: Sherif Elfayoumy
Author 2: Sean Warden

Keywords: cache replacement policy; high performance computing; adaptive caching; Web caching

PDF
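One simple way to realize the idea of adapting between LRU and LFU is to remember "ghost" keys each policy would have evicted and switch to the policy with fewer regretted evictions (misses on keys it threw away). This adaptation rule is an assumption of the sketch, not the article's mechanism.

```python
from collections import Counter, OrderedDict

class AdaptiveCache:
    """Toy adaptive replacement: keeps LRU and LFU bookkeeping side by side
    and evicts with whichever policy has caused fewer regretted evictions."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()                  # key -> value, recency order
        self.freq = Counter()                      # key -> access count
        self.ghosts = {"LRU": set(), "LFU": set()} # what each policy would evict
        self.regret = {"LRU": 0, "LFU": 0}
        self.policy = "LRU"

    def _victims(self):
        return {"LRU": next(iter(self.data)),                      # least recent
                "LFU": min(self.data, key=lambda k: self.freq[k])} # least frequent

    def get(self, key):
        if key in self.data:
            self.data.move_to_end(key)
            self.freq[key] += 1
            return self.data[key]
        # a miss on a key some policy discarded counts against that policy
        for name, ghost in self.ghosts.items():
            if key in ghost:
                self.regret[name] += 1
        self.policy = min(self.regret, key=self.regret.get)
        return None

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victims = self._victims()
            for name, v in victims.items():
                self.ghosts[name].add(v)
            evicted = victims[self.policy]     # evict under the current policy
            del self.data[evicted]
            self.freq.pop(evicted, None)
        self.data[key] = value
        self.data.move_to_end(key)
        self.freq[key] += 1
```

The ghost-set idea is loosely inspired by adaptive schemes such as ARC; a production design would bound the ghost sets and age the regret counters.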

Paper 17: Application of Fuzzy Self-Optimizing Control Based on Differential Evolution Algorithm for the Ratio of Wind to Coal Adjustment of Boiler in the Thermal Power Plant

Abstract: The types of coal used in domestic small and medium-sized boilers are diverse and their composition is unstable, so maintaining the wind (air) and coal supply in a fixed proportion does not always ensure the most economical boiler combustion; the key to optimizing combustion is to adjust the wind-to-coal ratio reasonably online. In this paper, a fuzzy self-optimizing control based on a differential evolution algorithm is proposed and applied in a power plant boiler system; boiler combustion efficiency is significantly improved over the previous indirect control. Taking a thermal power plant as the research object, once the optimum system performance is determined, the unit efficiency can be increased significantly using this method, and the important issue of the energy efficiency of power plants can be successfully addressed.

Author 1: Ting Hou
Author 2: Liping Zhang
Author 3: Yuchen Chen

Keywords: fuzzy self-optimizing control; differential evolution algorithm; best ratio of wind to coal; boiler efficiency

PDF

Paper 18: Automatic Optic Disc Boundary Extraction from Color Fundus Images

Abstract: Efficient optic disc segmentation is an important task in automated retinal screening. For the same reason, optic disc detection is fundamental for medical reference and important for retinal image analysis applications. The most difficult problem of optic disc extraction is locating the region of interest; moreover, it is a time-consuming task. This paper tries to overcome this barrier by presenting an automated method for optic disc boundary extraction using Fuzzy C-Means combined with thresholding. The discs determined by the new method agree relatively well with those determined by the experts. The present method has been validated on a data set of 110 colour fundus images from the DRION database, and has obtained promising results. The performance of the system is evaluated using the difference in horizontal and vertical diameters between the obtained disc boundary and the ground truth obtained from two expert ophthalmologists. For the 25 test images selected from the 110 colour fundus images, the Pearson correlations of the ground truth diameters with the diameters detected by the new method are 0.946 and 0.958, and 0.94 and 0.974, respectively. The scatter plot shows that the ground truth and detected diameters have a high positive correlation. This computerized analysis of the optic disc is very useful for the diagnosis of retinal diseases.

Author 1: Thresiamma Devasia
Author 2: Paulose Jacob
Author 3: Tessamma Thomas

Keywords: fundus image; optic nerve head; optic disc; Fuzzy C-Means clustering

PDF

Paper 19: Feature Descriptor Based on Normalized Corners and Moment Invariant for Panoramic Scene Generation

Abstract: Panorama generation systems aim at creating a wide-view image by aligning and stitching a sequence of images. The technology is extensively used in many fields such as virtual reality, medical image analysis, and geological engineering. This research is concerned with combining multiple images with a region of overlap to produce a wide field of view, by detecting feature points in images with different camera motion in an efficient and fast way. Feature extraction and description are important and critical steps in panorama construction. This study presents techniques of corner detection, moment invariants and random sampling to locate the important features and build descriptors that remain stable under noise, transformation, lighting changes, small viewpoint changes, blurring and compression. Corner detection and normalization are used to extract features in the image, while the descriptors are built efficiently from moment invariants. Finally, matching and motion estimation are implemented based on the random sampling method. Experiments were conducted on images and video sequences taken by a handheld camera and on images taken from the internet. The results show that the proposed algorithm generates panoramic images and panoramic video of good quality in a fast and efficient way.

Author 1: Kawther Abbas Sallal
Author 2: Abdul-Monem Saleh Rahma

Keywords: Feature extraction; feature description; motion estimation; registration; panoramic scene

PDF

Paper 20: Hybrid Client Side Phishing Websites Detection Approach

Abstract: Phishing steals personal or credential information by luring victims to a forged website similar to the original site and urging them to enter their information in the belief that the site is legitimate. The number of internet users who fall victim to phishing attacks is increasing, and phishing attacks have become more sophisticated. In this paper we propose a client-side solution to protect against phishing attacks: a Firefox extension, integrated as a toolbar, that checks whether a requested website is trusted by inspecting the URL of each requested webpage, and blocks the site if it is suspicious. Every URL is evaluated according to features extracted from it. Three heuristics (primary domain, subdomain, and path) and Naïve Bayes classification over four lexical features, combined with page ranking obtained from two different services (Alexa and Google PageRank), are used to classify each URL. The proposed method requires no server-side changes and protects internet users from fraudulent sites, especially phishing attacks based on deceptive URLs. Experimental results show that our approach achieves 48% accuracy on a test set of 246 URLs, and 87.5% accuracy, with the Naïve Bayes component excluded, on a set of 162 URLs.

Author 1: Firdous Kausar
Author 2: Bushra Al-Otaibi
Author 3: Asma Al-Qadi
Author 4: Nwayer Al-Dossari

Keywords: Phishing Attacks; Browser Plugin; Anti Phishing; Security; Firefox

PDF
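The lexical URL features the abstract mentions can be illustrated with a few simple checks. The exact feature set and thresholds below are invented for illustration; they are not the paper's.

```python
from urllib.parse import urlparse

def lexical_features(url):
    """Four illustrative lexical features of the kind a phishing
    classifier might combine with domain heuristics and page rank."""
    host = urlparse(url).netloc
    return {
        "url_length": len(url),                       # phishing URLs tend to be long
        "host_dots": host.count("."),                 # many subdomains is suspicious
        "has_at_symbol": "@" in url,                  # '@' can disguise the real host
        "has_ip_host": host.replace(".", "").isdigit(),  # raw-IP hosts are suspicious
    }

def looks_suspicious(url):
    """Crude rule-based score: flag the URL when two or more features fire."""
    f = lexical_features(url)
    score = ((f["url_length"] > 75) + (f["host_dots"] > 3)
             + f["has_at_symbol"] + f["has_ip_host"])
    return score >= 2
```

A Naïve Bayes classifier, as in the paper, would learn weights for such features from labelled URLs instead of using fixed thresholds.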

Paper 21: A Study of Scala Repositories on Github

Abstract: Functional programming appears to be enjoying a renaissance of interest for developing practical, “real-world” applications. Proponents have long maintained that the functional style is a better way to modularize programs and reduce complexity. What is new in this paper is that we test this claim by studying the complexity of open source code written in Scala, a modern language that unifies functional and object programming. We downloaded from GitHub, Inc., a portfolio of mostly “trending” Scala repositories that included the Scala compiler and standard library, much of it written in Scala; the Twitter, Inc., server and its support libraries; and many other repositories, several of them production-oriented and commercially inspired. In total we investigated approximately 22,000 source files with 2 million lines of code and 223,000 methods written by hundreds of programmers. To analyze these sources, we developed a novel compiler kit that measures lines of code and adaptively learns to estimate the cyclomatic complexity of functional-object code. The data show, first, that lines of code and cyclomatic complexity are positively correlated, as we expected, but only weakly, which we did not expect, with Kendall’s τ = 0.258–0.274. Second, 75% of the Scala methods are straight-line, that is, they have the lowest possible cyclomatic complexity. Third, nearly 70% of methods have three or fewer lines. Fourth, the distributions of lines of code and cyclomatic complexity are both non-Gaussian (P < 0.01), which is as surprising as it is interesting. These data may offer new insights into software complexity and the large-scale structure of applications including but not necessarily limited to Scala.

Author 1: Ron Coleman
Author 2: Matthew A. Johnson

Keywords: Functional programming; Scala; GitHub.com

PDF
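The Kendall rank correlation the abstract reports measures pairwise agreement between two rankings, here lines of code versus cyclomatic complexity per method. A minimal tau-a implementation follows; it ignores ties, unlike the tie-corrected variants (tau-b/tau-c) usually used in practice and possibly used by the authors.

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs.
    A pair (i, j) is concordant when x and y order it the same way."""
    pairs = list(combinations(range(len(x)), 2))
    concordant = discordant = 0
    for i, j in pairs:
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / len(pairs)
```

A value near 0.26, as reported, means only a modest majority of method pairs are ordered the same way by length and by complexity.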

Paper 22: A Crypto-Steganography: A Survey

Abstract: Steganography and cryptography are two important aspects of security for transmitting information over a medium such as the Internet. Steganography hides the presence of a message, while cryptography hides its contents. Both are used to ensure security, but neither alone can satisfy all the basic requirements of security, such as robustness, undetectability and capacity. So a new method based on the combination of both, known as crypto-steganography, which overcomes each one's weaknesses and makes it difficult for intruders to attack or steal sensitive information, is being proposed. This paper also describes the basic concepts of steganography and cryptography on the basis of the previous literature on the topic.

Author 1: Md. Khalid Imam Rahmani
Author 2: Kamiya Arora
Author 3: Naina Pal

Keywords: Steganography; Image Steganography; Cryptography; Least Significant Bit (LSB); Enhanced Least Significant Bit (ELSB); Compression; Decompression; Advanced Encryption Standard (AES); Data Encryption Standard (DES); Hashing algorithms

PDF
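The combination the survey describes — encrypt first, then hide the ciphertext bits in pixel LSBs — can be sketched as follows. The XOR cipher is a toy stand-in for a real cipher such as the AES mentioned in the keywords, and all names are illustrative.

```python
def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher (XOR is its own inverse); a real system
    would use AES or another vetted cipher instead."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(pixels, payload: bytes):
    """Write the payload bits, MSB first, into the least significant
    bit of successive grayscale pixel values."""
    bits = [(byte >> k) & 1 for byte in payload for k in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small for payload"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # clear the LSB, then set it
    return out

def extract_lsb(pixels, n_bytes):
    """Read n_bytes back out of the pixel LSBs."""
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(sum(bit << (7 - k) for k, bit in enumerate(bits[i:i + 8]))
                 for i in range(0, len(bits), 8))
```

Because only the LSB changes, each stego pixel differs from the cover pixel by at most 1, which is what makes plain LSB embedding visually undetectable yet statistically fragile.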

Paper 23: An ECN Approach to Congestion Control Mechanisms in Mobile Ad Hoc Networks

Abstract: When the nodes or links of a network are overloaded, network performance deteriorates substantially due to congestion. Network congestion can be mitigated with the Explicit Congestion Notification (ECN) technique. ECN notification is carried out by setting the ECN bit in the TCP header, which allows end-to-end notification of network congestion without dropping packets: the ECN bit notifies TCP sources of incipient congestion before losses occur. However, ECN is a binary (1-bit) indicator which does not fully reflect the congestion level, so its convergence speed is relatively low. In our work we use an extra ECN bit (2-bit ECN). The extra bit passes additional congestion feedback to the source node, enabling it to determine the level of congestion and take steps to ensure faster convergence. Compared to single-bit ECN, the additional information afforded by two-bit ECN allows more flexibility in adjusting the window size to handle congestion. Simulation results show that the proposed method improves the overall performance of the network by over 12%.

Author 1: Som Kant Tiwari
Author 2: Dr. Y. K. Rana
Author 3: Prof. Anurag Jain

Keywords: Explicit Congestion Notification (ECN); Mobile ad hoc Networks (MANET); Congestion control; Congestion window Transmission Control Protocol (TCP)

PDF
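A two-bit ECN field can encode four congestion levels instead of two, so the sender can react proportionally. One possible mapping from levels to congestion-window actions is sketched below; the mapping is an assumption for illustration, not the paper's exact scheme.

```python
def adjust_cwnd(cwnd, ecn_bits, min_cwnd=1):
    """Congestion-window update driven by a 2-bit ECN field:
       0b00 -> no congestion:       additive increase
       0b01 -> mild congestion:     hold the window
       0b10 -> moderate congestion: back off by 25%
       0b11 -> heavy congestion:    classic multiplicative halving
    """
    if ecn_bits == 0b00:
        return cwnd + 1
    if ecn_bits == 0b01:
        return cwnd
    if ecn_bits == 0b10:
        return max(min_cwnd, int(cwnd * 0.75))
    return max(min_cwnd, cwnd // 2)
```

The graded backoff is the point: a 1-bit ECN sender must treat every mark as the worst case, while the extra bit lets mild congestion cost only a pause rather than a halving.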

Paper 24: Clustering of Image Data Using K-Means and Fuzzy K-Means

Abstract: Clustering is a major technique used for grouping numerical and image data in data mining and image processing applications. Clustering makes the job of image retrieval easy by finding images similar to the given query image. The images are grouped together into a given number of clusters. Image data are grouped on the basis of features such as colour, texture and shape contained in the images in the form of pixels. For efficiency and better results, image data are segmented before clustering is applied. The techniques used here, K-Means and Fuzzy K-Means, are time-saving and efficient.

Author 1: Md. Khalid Imam Rahmani
Author 2: Naina Pal
Author 3: Kamiya Arora

Keywords: Clustering; Segmentation; K-Means Clustering; Fuzzy K-Means

PDF
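
The core K-Means loop on pixel data can be sketched in a few lines. This is a minimal illustration on assumed grayscale intensities, not the paper's code; Fuzzy K-Means would replace the hard nearest-center assignment below with per-cluster membership weights.

```python
# Minimal K-Means sketch on 1-D grayscale pixel intensities.

def kmeans(pixels, centers, iters=10):
    for _ in range(iters):
        # assignment step: each pixel joins its nearest center
        clusters = [[] for _ in centers]
        for p in pixels:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # update step: move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

pixels = [10, 12, 11, 200, 205, 198]   # two obvious intensity groups
print(kmeans(pixels, [0, 255]))        # centers converge near 11 and 201
```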

Paper 25: Identifying and Extracting Named Entities from Wikipedia Database Using Entity Infoboxes

Abstract: An approach for named entity classification based on Wikipedia article infoboxes is described in this paper. It identifies the three fundamental named entity types, namely Person, Location, and Organization. Entity classification is accomplished by matching entity attributes extracted from the relevant entity's article infobox against core entity attributes built from Wikipedia Infobox Templates. Experimental results showed that the classifier achieves high accuracy and F-measure scores of 97%. Based on this approach, a database of around 1.6 million 3-typed named entities was created from the 20140203 Wikipedia dump. Experiments on the CoNLL 2003 shared-task named entity recognition (NER) dataset showed the system's outstanding performance in comparison to three different state-of-the-art systems.

Author 1: Muhidin Mohamed
Author 2: Mourad Oussalah

Keywords: named entity identification; Wikipedia infobox; infobox templates; Named Entity Classification (NEC)

PDF
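
The attribute-matching idea can be sketched as a simple overlap score between an entity's infobox attributes and per-type core attribute sets. The attribute sets below are illustrative placeholders, not the actual contents of the Wikipedia Infobox Templates used in the paper.

```python
# Hedged sketch: classify an entity by how many of its infobox
# attributes overlap with the core attributes of each type template.

CORE_ATTRS = {
    "Person": {"birth_date", "birth_place", "occupation", "nationality"},
    "Location": {"coordinates", "population", "area", "country"},
    "Organization": {"founded", "headquarters", "industry", "key_people"},
}

def classify(infobox_attrs):
    # score each type by set intersection, pick the best match
    scores = {t: len(infobox_attrs & attrs) for t, attrs in CORE_ATTRS.items()}
    return max(scores, key=scores.get)

print(classify({"birth_date", "occupation", "spouse"}))  # Person
```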

Paper 26: Design and Implementation of an Interpreter Using Software Engineering Concepts

Abstract: In this paper, the design and implementation of an interpreter for a small subset of the C language using software engineering concepts are presented. The paper not only reinforces the argument for applying software engineering concepts in the area of interpreter design but also highlights its relevance to undergraduate computer science curricula. The design and development of the interpreter are also important to software engineering: some of its components form the basis for different engineering tools. This paper also demonstrates that standard software engineering concepts such as object-oriented design, design patterns, and UML diagrams can provide a useful track of the evolution of an interpreter, as well as enhance confidence in its correctness.

Author 1: Fan Wu
Author 2: Hira Narang
Author 3: Miguel Cabral

Keywords: Interpreter; Software Engineering; Computer Science Curricula

PDF
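
One of the object-oriented patterns such a design could employ is an AST whose node classes each know how to evaluate themselves (the Interpreter pattern). The node classes below are an assumed illustration, not the paper's actual design, and evaluate arithmetic expressions only.

```python
# Illustrative Interpreter-pattern sketch: each AST node type carries
# its own eval() method.

class Num:
    def __init__(self, value):
        self.value = value
    def eval(self):
        return self.value

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def eval(self):
        return self.left.eval() + self.right.eval()

class Mul:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def eval(self):
        return self.left.eval() * self.right.eval()

# (1 + 2) * 3
print(Mul(Add(Num(1), Num(2)), Num(3)).eval())  # 9
```

Keeping evaluation local to each node class is what makes this kind of design easy to document with UML class diagrams and to extend with new node types.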

Paper 27: A parallel line sieve for the GNFS Algorithm

Abstract: RSA is one of the most important public key cryptosystems for information security. The security of RSA depends on the integer factorization problem: it relies on the difficulty of factoring large integers. Much research has gone into the problem of factoring large numbers, and owing to advances in factoring algorithms and computing hardware, the size of the numbers that can be factorized increases exponentially year by year. The General Number Field Sieve (GNFS) algorithm is currently the best known method for factoring large numbers of more than 110 digits. In this paper, a parallel GNFS implementation on a BA-cluster is presented. The study begins with a discussion of the serial algorithm in general, covering its five steps, and then discusses the parallel algorithm for the sieving step. The experimental results show that the algorithm achieves a good speedup and can be used for factoring large integers.

Author 1: Sameh Daoud
Author 2: Ibrahim Gad

Keywords: parallel Algorithm; Public Key Cryptosystem; GNFS Algorithm

PDF
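
The parallelization opportunity in line sieving comes from independence between lines: each value of b defines a sieve line that can be processed without communication. The toy below sketches only that distribution idea, assuming a round-robin assignment of line indices to workers; it does not implement actual GNFS sieving.

```python
# Hedged sketch of distributing sieve lines (values of b) across
# cluster workers round-robin; the sieving itself is omitted.

def partition_lines(b_max, num_workers):
    """Assign line indices b = 1..b_max to workers by b mod num_workers."""
    assignment = {w: [] for w in range(num_workers)}
    for b in range(1, b_max + 1):
        assignment[b % num_workers].append(b)
    return assignment

work = partition_lines(10, 3)
print(work[0])  # worker 0 sieves lines 3, 6, 9
```

Because lines are independent, speedup is limited mainly by load balance and the final collection of relations, which is consistent with the good speedup reported above.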

Paper 28: Estimating the Number of Test Workers Necessary for a Software Testing Process Using Artificial Neural Networks

Abstract: Developing software projects on time and within budget represents a challenge for software project managers. Software management activities include, but are not limited to, estimation of project cost, development of schedules and budgets, meeting user requirements, and complying with standards. Recruiting development team members is a sophisticated problem for a software project manager. Since the largest cost in software development is manpower, software project effort and its associated cost estimation models are used to estimate the effort required to complete a project. This effort estimate can then be converted into dollars based on the applicable labor rates. A development team needs to be selected not only at the beginning of the project but also during the development process, so it is important to allocate the necessary team to a project and efficiently distribute their effort during the development life cycle. In this paper, we present our initial idea of a prediction model for estimating the required number of test workers of a software project during the software testing process. The developed models use the test instance and the number of observed faults as inputs. Artificial Neural Networks (ANNs) successfully capture the dynamic relationships between the inputs and the output and produce accurate prediction estimates.

Author 1: Alaa F. Sheta
Author 2: Sofian Kassaymeh
Author 3: David Rine

Keywords: Staff Management; Neural Networks; Software Testing; Estimation

PDF
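
The modelling setup described above can be sketched as a small feed-forward network mapping (test instance, observed faults) to an estimated worker count. All weights, layer sizes, and inputs below are illustrative placeholders, not the paper's trained model.

```python
# Minimal feed-forward sketch: two inputs, one hidden layer of two
# sigmoid neurons, linear output interpreted as a worker count.
import math

def neuron(inputs, weights, bias):
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))   # sigmoid activation

def predict_workers(test_instance, faults):
    # assumed weights; in the paper these would be learned by training
    h1 = neuron([test_instance, faults], [0.05, 0.1], -1.0)
    h2 = neuron([test_instance, faults], [0.02, 0.3], -0.5)
    return 2.0 * h1 + 3.0 * h2          # linear readout, scaled

print(round(predict_workers(10, 4), 2))
```

Training such a network on (test instance, observed faults, workers) triples is what lets the ANN capture the dynamic input-output relationship the abstract refers to.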

Paper 29: Natural Gradient Descent for Training Stochastic Complex-Valued Neural Networks

Abstract: In this paper, the natural gradient descent method for multilayer stochastic complex-valued neural networks is considered, and the natural gradient is given for a single stochastic complex-valued neuron as an example. Since the space of the learnable parameters of stochastic complex-valued neural networks is not a Euclidean space but a curved manifold, the complex-valued natural gradient method is expected to exhibit excellent learning performance.

Author 1: Tohru Nitta

Keywords: Neural network; Complex number; Learning; Singular point

PDF


© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org