2017-01-21T15:15:44Z http://thesai.org/oai/oai2.aspx
oai:thesai.org:10.14569/IJACSA.2010.010101 2010-08-01
A New Approach for Handling Null Values in Web Server Log Pradeep Ahirwar Deepak Singh Tomar Rajesh Wadhvani Null value, web mining, k-means clustering, fuzzy C-means clustering, log records, log parser. International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 Web log data embed much of the user’s browsing behavior, and the operational data generated through Internet end-user interaction may contain noise, which affects knowledge-based decisions. Handling such noisy data is a major challenge. Null value handling is an important noise-handling technique in relational database systems. In this work the issues related to null values are discussed, and a null value handling concept based on a training data set is applied to a real MANIT web server log. A prototype system based on fuzzy C-means clustering techniques with a trained data set is also proposed. The proposed method integrates the advantages of a fuzzy system and introduces a new criterion that enhances the estimated accuracy of the approximation. Comparisons between different methods for handling null values are depicted. The results show the effectiveness of the methods empirically on realistic web logs and explore the accuracy, coverage and performance of the proposed models. http://thesai.org/Downloads/Volume1No1/Paper_1-A_New_Approach_for_Handling_Null_Values_in_Web_Server_Log.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010101 eng
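The fuzzy C-means idea that the abstract relies on can be illustrated with a minimal sketch (toy values and hypothetical log fields, not the authors' MANIT pipeline): a record's missing field is imputed as a membership-weighted average of cluster centers, with memberships computed from distances on the fields that are present.

```python
# Minimal sketch of fuzzy C-means-style null-value imputation.
# The data and field names are illustrative, not the paper's.

def memberships(x, centers, m=2.0):
    """Fuzzy C-means membership of point x in each cluster center."""
    d = [abs(x - c) or 1e-12 for c in centers]   # guard against zero distance
    u = []
    for i in range(len(centers)):
        s = sum((d[i] / d[j]) ** (2.0 / (m - 1)) for j in range(len(centers)))
        u.append(1.0 / s)
    return u

def impute(known_value, centers_known, centers_missing):
    """Impute the missing field as a membership-weighted mean of the
    missing-field coordinates of the cluster centers."""
    u = memberships(known_value, centers_known)
    return sum(ui * c for ui, c in zip(u, centers_missing))

# Two clusters learned from complete records: (bytes_sent, duration).
centers = [(100.0, 1.0), (1000.0, 9.0)]
# A record with bytes_sent = 950 but a null duration:
est = impute(950.0, [c[0] for c in centers], [c[1] for c in centers])
```

Because 950 lies close to the second cluster's center, the membership weighting pulls the imputed duration toward 9.0 rather than simply averaging the two clusters.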
oai:thesai.org:10.14569/IJACSA.2010.010102 2012-07-01
Traffic Classification – Packet-, Flow-, and Application-based Approaches Sasan Adibi Traffic Classification; Packet; Flow; Applications; Delay; Payload Size. International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 Traffic classification is a very important mathematical and statistical tool in communications and computer networking, used to find average and statistical information about the traffic passing through a certain pipe or hub. The results achieved from a proper deployment of a traffic analysis method provide valuable insights, including how busy a link is, the average end-to-end delays, and the average packet size. These valuable pieces of information help engineers design robust networks, avoid possible congestion, and foresee future growth. This paper is designed to capture the essence of traffic classification methods and consider them in packet-, flow-, and application-based contexts. http://thesai.org/Downloads/Volume1No1/Paper_2-Traffic_Classification-Packet-Flow-and_Application-based_Approaches.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010102 eng
oai:thesai.org:10.14569/IJACSA.2010.010103 2012-07-01
Algebraic Specifications: Organised and focussed approach in software development Rakesh L Dr. Manoranjan Kumar Singh Abstract data types (ADTs), Formal-Methods, Abstraction, Equational reasoning, Symbolic computation. International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 Algebraic specification is a formal specification approach that deals with data structures in an implementation-independent way. Algebraic specification is a technique whereby an object is specified in terms of the relationships between the operations that act on that object. In this paper we are interested in proving facts about specifications, in general equations of the form t1 = t2, where t1 and t2 are members of Term(S), S being the signature of the specification. One way of executing the specification would be to compute the initial algebra for the specification and then to check whether t1 and t2 belong to the same equivalence class. The use of formal specification techniques for software engineering has the advantage that we can reason about the correctness of the software before its construction. Algebraic specification methods can be used in software development to support verifiability, reliability, and usability. The main aim of this research work is to put such intuitive ideas into a concrete setting in order to obtain a better quality product. (...) http://thesai.org/Downloads/Volume1No1/Paper_3-Algebraic_Specifications.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010103 eng
oai:thesai.org:10.14569/IJACSA.2010.010104 2012-07-01
Evaluating the impact of information systems on end user performance: A proposed model Ahed Abugabah Louis Sanzogni Osama Alfarraj Information systems, user performance, task technology fit, technology acceptance model. International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 In recent decades, information systems (IS) researchers have concentrated their efforts on developing and testing models that help with the investigation of IS and user performance in different environments. As a result, a number of models for studying end users’ system utilization and other related issues, including system usefulness, system success and user aspects in business organizations, have appeared. A synthesized model consolidating three well-known and widely used models in IS research is proposed. Our model was empirically tested in a sophisticated IS environment, investigating the impacts of enterprise resource planning (ERP) systems on user perceived performance. Statistical analysis, including factor analysis and regression, was performed to test the model and establish its validity. The findings demonstrated that the proposed model performed well, as most factors had direct and/or indirect significant influences on user perceived performance, suggesting that the model possesses the ability to explain the main impacts of these factors on ERP users. http://thesai.org/Downloads/Volume1No1/Paper_4-Evaluating_the_impact_of_information_systems_on_end_user_performance.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010104 eng
oai:thesai.org:10.14569/IJACSA.2010.010105 2012-07-01
Network Anomaly Detection via Clustering and Custom Kernel in MSVM Arvind Mewada Shamila Khan Prafful Gedam IDS; K-means; MSVM; RBF; KDD99; Custom Kernel International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 Multiclass Support Vector Machines (MSVM) have been applied to build classifiers that can help with network intrusion detection. Despite their high generalization accuracy, the learning time of MSVM classifiers is still a concern when they are applied to network intrusion detection systems. This paper speeds up the learning time of MSVM classifiers by reducing the number of support vectors. In this study, we propose the KMSVM method, which combines the K-means clustering technique with a custom kernel in MSVM. Experiments were performed on the KDD99 dataset using the KMSVM method, and the results show that KMSVM can speed up the learning time of classifiers by reducing support vectors while improving the detection rate on the testing dataset. http://thesai.org/Downloads/Volume1No1/Paper_5-Network_Anomaly_Detection_via_Clustering_and_Custom_Kernel_in_MSVM.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010105 eng
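The data-reduction step that makes this approach fast can be sketched as follows (a toy 1-D illustration under assumed data, not the paper's KDD99 pipeline): each class's training points are replaced by their k-means centroids before the (M)SVM is trained, shrinking the pool of candidate support vectors.

```python
# Sketch of K-means data reduction prior to classifier training.
# Toy 1-D feature values; illustrative only.

def kmeans_1d(points, k, iters=20):
    """Plain Lloyd's algorithm on scalars; returns k centroids."""
    centers = sorted(points)[:: max(1, len(points) // k)][:k]  # spread seeds
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Six training points per class collapse to two centroids per class;
# a classifier would then be trained on the centroids only.
normal = [0.1, 0.2, 0.15, 0.9, 1.0, 1.1]
attack = [5.0, 5.2, 4.9, 9.8, 10.1, 10.0]
reduced_normal = kmeans_1d(normal, 2)
reduced_attack = kmeans_1d(attack, 2)
```

The design point is the trade-off the abstract describes: fewer training points mean fewer support vectors and faster learning, at the cost of some loss of detail in each class's distribution.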
oai:thesai.org:10.14569/IJACSA.2010.010106 2012-07-01
Iris Recognition System Neha Kak Rishi Gupta Sanchit Mahajan iris recognition, biometric identification, pattern recognition, segmentation International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 In a biometric system a person is identified automatically by processing the unique features possessed by the individual. Iris recognition is regarded as the most reliable and accurate biometric identification system available. In iris recognition a person is identified by the iris, a part of the eye, using pattern matching or image processing based on concepts from neural networks. The aim is to identify a person in real time, with high efficiency and accuracy, by analysing the random patterns visible within the iris of an eye from some distance, by implementing a modified Canny edge detector algorithm. The major applications of this technology so far have been: substituting for passports (automated international border crossing); aviation security and controlling access to restricted areas at airports; database access and computer login. http://thesai.org/Downloads/Volume1No1/Paper_6-Iris_Recognition_System.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010106 eng
oai:thesai.org:10.14569/IJACSA.2010.010107 2012-07-01
Radio Frequency Identification Based Library Management System Priyanka Grover Anshul Ahuja Radio frequency identification technology; RFID Readers; RFID Tags; Inductive Coupling International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 Radio frequency identification (RFID) is a term used to describe a system that transfers the identity of an object or person wirelessly, using radio waves. It falls under the category of automatic identification technologies. This paper proposes an RFID based library management system that would allow fast transaction flow and make it easy to handle the issue and return of books from the library without much manual bookkeeping. The proposed system is based on RFID readers and passive RFID tags that are able to electronically store information that can be read with the help of an RFID reader. This system would be able to issue and return books via RFID tags and also calculate the corresponding fine associated with the period of absence of the book from the library database. http://thesai.org/Downloads/Volume1No1/Paper_7-Radio_Frequency_Identification_Based_Library_Management_System.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010107 eng
oai:thesai.org:10.14569/IJACSA.2010.010108 2012-07-01
PC And Speech Recognition Based Electrical Device Control Sanchit Dua Electronic device controller, Diode, Transformer International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 This paper describes the controlling of various appliances using a PC. It consists of a circuit for using the printer port of a PC for control applications, using software and some interface hardware. The interface circuit, along with the given software, can be used with the printer port of any PC for controlling up to eight devices. The parallel port is a simple and inexpensive tool for building computer controlled devices and projects. Its simplicity and ease of programming make the parallel port popular. http://thesai.org/Downloads/Volume1No1/Paper_8-PC_And_Speech_Recognition_Based_Electrical_Device_Control.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010108 eng
oai:thesai.org:10.14569/IJACSA.2010.010109 2012-07-01
[Survey Report] How WiMAX Will Deploy In India Rakesh Kumar Jha Dr Upena D Dalal International Journal of Advanced Computer Science and Applications(IJACSA), 1(1), 2010 http://thesai.org/Downloads/Volume1No1/Paper_9-How_WiMAX_will_deploy_in_India.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010109 eng
oai:thesai.org:10.14569/IJACSA.2010.010201 2012-07-01
The effect of Knowledge Characteristics in students performances Asmahan M. Altaher Codifiability, Explicitness, Availability, Teachability, student performance. International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 Knowledge characteristics are the essential step in leveraging knowledge value in the university. Shared documents and contributed knowledge may not be useful without the context provided by experience. This paper focuses on the characteristics of knowledge at a private applied sciences university and their effect on students’ performance, aiming to examine the nature of knowledge and the quality of material. A questionnaire was designed and sent to MIS students at the applied sciences university in order to improve the context of the knowledge and facilitate knowledge usage, so as to improve the students’ knowledge level. The results lead to the recommendation that the university should understand the knowledge characteristics and the potential techniques that support knowledge sharing. In addition, the university should know which types of knowledge can be articulated and which can be taught to individuals, through training, practice or apprenticeship, in order to improve student performance. http://thesai.org/Downloads/Volume1No2/Paper_1-The_effect_of_Knowledge_Characteristics_in_students_performances.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010201 eng
oai:thesai.org:10.14569/IJACSA.2010.010202 2012-07-01
A New Personalized Recommendation Technique Based on the Modified TOPSIS Method Guan-Dao Yang Lu Sun Personalized Recommendation Technique; Improved Gray Correlation Analysis; Modified TOPSIS Method. International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 Personalized recommendation services, which help users target interesting information within an excessive information set, have received wide attention. In this paper, we first propose a new method, named the Modified TOPSIS Method, utilizing the Improved Gray Correlation Analysis Method. Then, we present a new personalized recommendation technique based on the Modified TOPSIS Method. Finally, a verification method utilizing Spearman’s Rank Correlation Coefficient demonstrates that our new personalized recommendation technique is efficient. http://thesai.org/Downloads/Volume1No2/Paper_2-A_New_Personalized_Recommendation_Technique_Based_on_the_Modified_TOPSIS_Method.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010202 eng
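The ranking step underlying this technique can be sketched with the classic (unmodified) TOPSIS method, on made-up criteria: alternatives are scored by their relative closeness to an ideal solution, built from the best value of each criterion.

```python
# Classic TOPSIS ranking sketch (not the paper's modified variant).
# Toy decision matrix with hypothetical criteria.

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j.
    Returns the relative-closeness score of each alternative."""
    ncols = len(weights)
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)]
         for row in matrix]                      # weighted normalized matrix
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        dp = sum((a - b) ** 2 for a, b in zip(row, ideal)) ** 0.5
        dn = sum((a - b) ** 2 for a, b in zip(row, anti)) ** 0.5
        scores.append(dn / (dp + dn))            # closer to ideal -> nearer 1
    return scores

# Two items rated on (relevance, price): higher relevance and lower
# price are better, so item 0 dominates item 1.
scores = topsis([[0.9, 10.0], [0.4, 30.0]], [0.5, 0.5], [True, False])
```

The paper's modification replaces parts of this pipeline with gray correlation analysis; the sketch shows only the baseline it modifies.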
oai:thesai.org:10.14569/IJACSA.2010.010203 2012-07-01
Detection and Measurement of magnetic data for short length wireless communication using FT-IR Abu Saleh FT; FT-IR; Spectrum; prism. International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 Infrared (IR) radiation is a type of electromagnetic radiation. Infrared “light” has a longer wavelength than visible light. Red light has a longer wavelength than other colors of light, and infrared has even longer waves than red does; so infrared is a sort of “redder-than-red” light or “beyond red” light. Infrared radiation lies between visible light and radio waves on the electromagnetic spectrum. In this paper, infrared radiation is used to detect magnetic data for high-speed, short-range wireless communication, although infrared radiation may be used in various other ways. This paper examines the performance of the FT-IR technique, by which the transmissions of different users are multiplexed and viewed at the same time. http://thesai.org/Downloads/Volume1No2/Paper_3-Detection_and_Measurement_of_magnetic_data_for_short_length_wireless_communication.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010203 eng
oai:thesai.org:10.14569/IJACSA.2010.010204 2012-07-01
A Novel and Efficient countermeasure against Power Analysis Attacks using Elliptic Curve Cryptography M Prabu R.Shanmugalakshmi Simple Power Analysis, Differential Power Analysis, Security Analysis Model, Algorithm design, Side Channel Attacks International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 Recently, cryptographic processors have been shown to leak information through the communication channel, and preventing this leakage of secure data is a major challenge. In this paper, a new level of security analysis model for power analysis is constructed using Elliptic Curve Cryptography, and many side channel attacks and their countermeasures are explained. An algorithm design based on power analysis is also described, which makes our countermeasure more secure against simple power analysis, differential power analysis and other attacks. A theoretical analysis based on these results is presented, showing how the algorithm design should stand up to such side channel attacks. http://thesai.org/Downloads/Volume1No2/Paper_4-A_Novel_and_Efficient_countermeasure_against_Power_Analysis_Attacks.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010204 eng
oai:thesai.org:10.14569/IJACSA.2010.010205 2012-07-01
Dynamic path restoration for new call blocking versus handoff call blocking in heterogeneous network using buffers for QoS Ajai Kumar Daniel R Singh J P Saini Handoff call, New call buffer, Congestion, Heterogeneous network, Quality of Service (QoS) International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 An ad hoc network is a collection of wireless mobile nodes dynamically forming a temporary network without the use of any existing heterogeneous network infrastructure or centralized administration. Routing protocols used inside ad hoc networks must be prepared to adjust automatically to an environment that can vary between the extremes of high mobility with low bandwidth and low mobility with high bandwidth. The tremendous growth of wireless networks demands support for the different multimedia applications (such as voice, audio, video and data) available over the network. This application demand could lead to congestion if the network has to maintain such high resources for the quality of service (QoS) requirements of the applications. In this paper, a new protocol is proposed for wireless mobile heterogeneous networks, based on the use of path, traffic and bandwidth resource information at each node for the allocation of route paths and for the handoff problem. The proposed protocol uses two buffers, one for new calls and another for handoff calls: if no channel is available, calls are stored in the buffer instead of being dropped (rejected), and whenever a channel becomes free it is allocated for communication. The protocol improves the performance of the network, especially through the effect of the dynamic threshold on the sizes of the new call and handoff call buffers. In a link failure situation, another path is provided for communication by applying a restoration mechanism for link survivability, improving the QoS of the mobile network.
http://thesai.org/Downloads/Volume1No2/Paper_5-Dynamic_path_restoration_for_new_call_blocking_versus_handoff_call_blocking.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010205 eng
oai:thesai.org:10.14569/IJACSA.2010.010206 2012-07-01
PATTERN BASED SUBSPACE CLUSTERING: A REVIEW Debahuti Mishra Shruti Mishra Sandeep Satapathy Amiya Kumar Rath Milu Acharya Subspace clustering; Biclustering; p-cluster; z-cluster International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 The task of biclustering or subspace clustering is a data mining technique that allows simultaneous clustering of the rows and columns of a matrix. Though the definition of similarity varies from one biclustering model to another, in most of these models the concept of similarity is based on metrics such as Manhattan distance, Euclidean distance or other Lp distances. In other words, similar objects must have close values in at least a set of dimensions. Pattern-based clustering is important in many applications, such as DNA micro-array data analysis, automatic recommendation systems and target marketing systems. However, pattern-based clustering in large databases is challenging. On the one hand, there can be a huge number of clusters, many of them redundant, which makes pattern-based clustering ineffective. On the other hand, previously proposed methods may not be efficient or scalable when mining large databases. The objective of this paper is to perform a comparative study of subspace clustering algorithms in terms of efficiency, accuracy and time complexity. http://thesai.org/Downloads/Volume1No2/Paper_6-PATTERN_BASED_SUBSPACE_CLUSTERING.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010206 eng
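The p-cluster model named in the keywords illustrates how pattern-based similarity differs from the Lp distances mentioned in the abstract: coherence is judged by the pScore of every 2x2 submatrix, |(a - b) - (c - d)|, which is zero whenever rows follow the same shift pattern. A minimal sketch on toy data:

```python
# pScore check for the delta-pCluster model: a submatrix is coherent
# when every 2x2 block has |(a - b) - (c - d)| <= delta.
from itertools import combinations

def pscore(a, b, c, d):
    return abs((a - b) - (c - d))

def is_delta_pcluster(matrix, delta):
    rows, cols = range(len(matrix)), range(len(matrix[0]))
    return all(
        pscore(matrix[r1][c1], matrix[r1][c2],
               matrix[r2][c1], matrix[r2][c2]) <= delta
        for r1, r2 in combinations(rows, 2)
        for c1, c2 in combinations(cols, 2))

# The second row is the first shifted by +2, so every pScore is 0,
# even though the Euclidean distance between the rows is large:
shifted = [[1.0, 4.0, 2.0],
           [3.0, 6.0, 4.0]]
```

This is why an Lp metric can miss such clusters: the two rows above are far apart in Euclidean distance yet form a perfect shift pattern.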
oai:thesai.org:10.14569/IJACSA.2010.010207 2012-07-01
Real-time Facial Emotion Detection using Support Vector Machines Anvita Bajpai Emotion-Detection; Facial-expressions; libsvm; Support Vector Machines; Facial Action Coding System (FACS) International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 There has been continuous research in the field of emotion detection through the faces of biological species over the last few decades. This was further fuelled by the rise of artificial intelligence, which has added a new paradigm to ongoing research. This paper discusses the role of one artificial intelligence technique, Support Vector Machines, in efficient emotion detection. The study comprised experiments conducted on the Java platform using libsvm. The coordinates of vital points of a face were used to train the SVM network, which finally led to proper identification of various emotions appearing on a human face. http://thesai.org/Downloads/Volume1No2/Paper_7-Real-time_Facial_Emotion_Detection_using_Support_Vector_Machines.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010207 eng
oai:thesai.org:10.14569/IJACSA.2010.010208 2012-07-01
On-line Rotation Invariant Estimation and Recognition R Bremananth Andy W. H. Khong M. Sankari Feature extraction; Line integrals; Orientation-detection; Optimized Gabor filters; Rotation-invariant recognition; Radon transform. International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 Rotation invariant estimation is an important and computationally difficult process in real-time human computer interaction. New methodologies are proposed here for on-line image rotation angle estimation, correction and feature extraction based on line integrals. We show that a set of projection data of line integrals from a single point source (fan-arc and fan-beam) or multiple point sources (Radon transform) can be employed for orientation estimation. After estimating the orientation, image angle variations are corrected to the principal direction. We further combine a Boltzmann machine and k-means clustering to obtain parameter-optimized Gabor filters, which are used to extract a non-redundant, compact set of features for classification. The proposed fan-line, fan-arc and Radon transform methods are compared for real-time image orientation detection. Classification accuracy is evaluated with several classifiers, viz. back propagation, Hamming neural network, Euclidean-norm distance, and k-nearest neighbors. Experiments were conducted on a database of 535 images consisting of license plate and iris images, and the viability of the suggested algorithms has been tested with the different classifiers. Thus, this paper proposes an efficient rotation invariant recognition method for on-line image recognition. http://thesai.org/Downloads/Volume1No2/Paper_8-On-line_Rotation_Invariant_Estimation_and_Recognition.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010208 eng
oai:thesai.org:10.14569/IJACSA.2010.010209 2012-07-01
RSS-Crawler Enhancement for Blogosphere-Mapping Justus Bross Patrick Hennig Philipp Berger Christoph Meinel weblogs, rss-feeds, data mining, knowledge discovery, blogosphere, crawler, information extraction International Journal of Advanced Computer Science and Applications(IJACSA), 1(2), 2010 The massive adoption of social media has provided new ways for individuals to express their opinions online. The blogosphere, an inherent part of this trend, contains a vast array of information about a variety of topics. It is a huge think tank that creates an enormous and ever-changing archive of open source intelligence. Mining and modeling this vast pool of data to extract, exploit and describe meaningful knowledge, in order to leverage structures and dynamics of emerging networks within the blogosphere, is the higher-level aim of the research presented here. Our proprietary development of a tailor-made feed-crawler framework meets exactly this need. While the main concept, as well as the basic techniques and implementation details of the crawler, have already been dealt with in earlier publications, this paper focuses on several recent optimization efforts made on the crawler framework that proved to be crucial for the performance of the overall framework. http://thesai.org/Downloads/Volume1No2/Paper_9-RSS-Crawler_Enhancement_for_Blogosphere-Mapping.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010209 eng
oai:thesai.org:10.14569/IJACSA.2010.010301 2012-07-01
Comparative Study of Gaussian Mixture Model and Radial Basis Function for Voice Recognition Fatai Adesina Anifowose Gaussian Mixture Model, Radial Basis Function, Artificial Intelligence, Computational Intelligence, Biometrics, Optimal Parameters, Voice Pattern Recognition, DTREG International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 A comparative study of the application of the Gaussian Mixture Model (GMM) and the Radial Basis Function (RBF) in biometric recognition of voice has been carried out and presented. The application of machine learning techniques to biometric authentication and recognition problems has gained widespread acceptance. In this research, a GMM model was trained, using the Expectation Maximization (EM) algorithm, on a dataset containing 10 classes of vowels, and the model was used to predict the appropriate classes on a validation dataset. For experimental validity, the model was compared to the performance of two different versions of the RBF model using the same learning and validation datasets. The results showed very close recognition accuracy between the GMM and the standard RBF model, with GMM performing better than the standard RBF by less than 1%, and the two models outperformed similar models reported in the literature. The DTREG version of RBF outperformed the other two models by producing 94.8% recognition accuracy. In terms of recognition time, the standard RBF was found to be the fastest among the three models. http://thesai.org/Downloads/Volume1No3/Paper%201-A%20Comparative%20Study%20of%20Gaussian%20Mixture%20Model%20and%20Radial%20Basis%20Function%20for%20Voice%20Recognition.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010301 eng
oai:thesai.org:10.14569/IJACSA.2010.010302 2012-07-01
Multiphase Scalable Grid Scheduler Based on Multi-QoS Using Min-Min Heuristic Nawfal A Mehdi Ali Mamat Hamidah Ibrahim Shamala A/P K Multi-phase; QoS; Grid Scheduling International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 In scheduling, the main factor affecting search speed and mapping performance is the number of resources, i.e. the size of the search space. In grid computing, scheduler performance plays an essential role in overall performance, so the need for a scalable scheduler that can manage growth in resources is obvious. Under the assumption that each resource has its own specifications and each job its own requirements, searching the whole search space (all the resources) can waste plenty of scheduling time. In this paper, we propose a two-phase scheduler that uses the min-min algorithm to speed up mapping time with almost the same efficiency. The scheduler is also based on the assumption that the resources in grid computing can be classified into clusters. The scheduler first schedules jobs to a suitable cluster (the first phase), and then each cluster schedules the incoming jobs to suitable resources (the second phase). The scheduler is based on multidimensional QoS to enhance the mapping as much as it can. The simulation results show that the two-phase strategy can support a scalable scheduler. http://thesai.org/Downloads/Volume1No3/Paper%202-Multiphase%20Scalable%20Grid%20Scheduler%20Based%20on%20Multi-QoS%20Using%20Min-Min%20Heuristic.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010302 eng
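The min-min heuristic used within each phase can be sketched as follows (a toy expected-time-to-compute matrix, not the paper's simulation setup): at every step, the unscheduled job whose best achievable completion time is smallest is assigned to the resource that achieves it.

```python
# Min-min scheduling heuristic on a toy ETC (expected time to compute)
# matrix; the job/resource values are illustrative.

def min_min(etc):
    """etc[j][r]: execution time of job j on resource r.
    Returns a job -> resource assignment dict."""
    ready = [0.0] * len(etc[0])          # current ready time of each resource
    unscheduled = set(range(len(etc)))
    assignment = {}
    while unscheduled:
        # Among all (job, resource) pairs, pick the minimum completion time.
        ct, j, r = min(
            (ready[r] + etc[j][r], j, r)
            for j in unscheduled for r in range(len(ready)))
        assignment[j] = r
        ready[r] = ct                    # resource busy until this job ends
        unscheduled.remove(j)
    return assignment

# Job 0 is fast on resource 0 and job 1 is fast on resource 1,
# so min-min spreads them across the two resources:
plan = min_min([[2.0, 8.0], [9.0, 3.0]])
```

In the paper's two-phase design the same idea is applied twice: first over clusters, then over the resources inside the chosen cluster, which is what shrinks the search space.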
oai:thesai.org:10.14569/IJACSA.2010.010303 2012-07-01
Loss Reduction in Distribution System Using Fuzzy Techniques Sheeraz kirmani Md. Farrukh Rahman Chakresh Kumar Capacitor placement, Distribution systems, Fuzzy expert system. International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 In this paper, a novel approach using approximate reasoning is used to determine suitable candidate nodes in a distribution system for capacitor placement. Voltage and power loss reduction indices of distribution system nodes are modeled by fuzzy membership functions. A fuzzy expert system (FES) containing a set of heuristic rules is then used to determine the capacitor placement suitability of each node in the distribution system. Capacitors are placed on the nodes with the highest suitability. A new design methodology for determining the size, location, type and number of capacitors to be placed on a radial distribution system is presented. The objective is to minimize the peak power losses and the energy losses in the distribution system considering the capacitor cost. Test results are presented along with a discussion of the algorithm. http://thesai.org/Downloads/Volume1No3/Paper%203-Loss%20Reduction%20in%20Distribution%20System%20Using.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010303 eng
oai:thesai.org:10.14569/IJACSA.2010.010304 2012-07-01
A threat risk modeling framework for Geospatial Weather Information System (GWIS) a DREAD based study K Ram Mohan Rao Durgesh Pant Rapid Application Development, Risk rating, Security assessment. International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 Over the years, the focus has been on protecting networks, hosts, databases and standard applications from internal and external threats. The Rapid Application Development (RAD) process makes the web application development cycle extremely short and makes it difficult to eliminate vulnerabilities. Here we study a web application risk assessment technique called threat risk modeling to improve the security of the application. We implement the proposed application risk assessment mechanism using Microsoft’s DREAD threat risk model to evaluate application security risk against vulnerability parameters. The study led to quantifying different levels of risk for the Geospatial Weather Information System (GWIS) using the DREAD model. http://thesai.org/Downloads/Volume1No3/Paper%204-A%20threat%20risk%20modeling%20framework%20for%20Geospatial%20Weather%20Information%20System%20(GWIS)%20a%20DREAD%20based%20study.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010304 eng
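The DREAD rating itself is simple to sketch: each threat is scored on Damage potential, Reproducibility, Exploitability, Affected users and Discoverability, and a common formulation averages the five ratings into one risk score. The band thresholds below are illustrative, not the paper's GWIS calibration.

```python
# Sketch of DREAD risk scoring. Category ratings are on a 1-10 scale
# here; the banding cutoffs are assumptions for illustration.

def dread_score(damage, reproducibility, exploitability,
                affected_users, discoverability):
    """Average the five DREAD category ratings into a single risk score."""
    return (damage + reproducibility + exploitability
            + affected_users + discoverability) / 5.0

def risk_band(score):
    """Map a score to a qualitative band (illustrative thresholds)."""
    if score >= 8:
        return "high"
    if score >= 5:
        return "medium"
    return "low"

# A hypothetical injection-style threat rated by an assessor:
score = dread_score(8, 9, 7, 8, 10)
band = risk_band(score)
```

Quantifying each vulnerability this way is what lets the study rank GWIS threats into discrete risk levels.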
oai:thesai.org:10.14569/IJACSA.2010.010305 2012-07-01
Council-based Distributed Key Management Scheme for MANETs Abdelmajid HAJAMI Mohammed ELKOUTBI Key Management; MANET; Clustering International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 Mobile ad hoc networks (MANETs) have been proposed as an extremely flexible technology for establishing wireless communications. In comparison with fixed networks, some new security issues have arisen with the introduction of MANETs. Secure routing, in particular, is an important and complicated issue. Clustering is commonly used in order to limit the amount of secure routing information. In this work, we propose an enhanced solution for ad hoc key management based on a clustered architecture. This solution uses clusters as a framework to manage cryptographic keys in a distributed way. This paper sheds light on a key management algorithm for the OLSR protocol standard. Our algorithm takes node mobility into account and yields major improvements regarding the number of elected cluster heads that form a PKI council. Our objective is to distribute the certification authority functions to a reduced set of less mobile cluster heads that serve for key exchange. http://thesai.org/Downloads/Volume1No3/Paper%205-A%20Council-based%20Distributed%20Key%20Management.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010305 eng
oai:thesai.org:10.14569/IJACSA.2010.010306 2012-07-01
Improved Spectrogram Analysis for ECG Signal in Emergency Medical Applications A.K.M Fazlul Haque Md. Hanif Ali M Adnan Kiber Spectrogram, ECG, PSD, Periodogram, Time-varying signal, FFT. International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 This paper presents the spectrogram effect of biomedical signals, especially ECG. A simulation module was developed for the spectrogram implementation. The spectrogram of the ECG signal, together with the power spectral density, has been observed in off-line evaluation. ECG contains very important clinical information about the cardiac activities of the heart. The features of small variations in the ECG signal, with time-varying morphological characteristics, need to be extracted by signal processing methods because they are not visible in the graphical ECG signal. Small variations of simulated normal and noise-corrupted ECG signals have been extracted using the spectrogram. The spectrogram is found to be more precise than the conventional FFT in finding small abnormalities in the ECG signal. Spectrograms form time-frequency representations for processing time-varying signals. By using the presented method, it is ensured that a high-resolution time-varying spectrum estimate with no lag error can be produced. Another benefit of the method is the straightforward procedure for evaluating the statistics of the spectrum estimate. http://thesai.org/Downloads/Volume1No3/Paper%206-Improved%20Spectrogram%20Analysis%20for%20ECG%20Signal%20in%20Emergency%20Medical%20Applications.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010306 eng
oai:thesai.org:10.14569/IJACSA.2010.010307 2012-07-01
High Quality Integrated Data Reconstruction for Medical Applications A.K.M Fazlul Haque Md. Hanif Ali M Adnan Kiber FFT, IFFT, ECG, Baseband, Reconstruction, Noise, FDA tool. International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 In this paper, the implementation of a high quality integrated data reconstruction model and algorithm is proposed, especially for medical applications. Patients’ information is acquired at the sending end and reconstructed at the receiving end using a technique suited to high quality signal reconstruction. A method is proposed in which data like ECG, audio and other patients’ vital parameters are acquired in the time domain and operated on in the frequency domain. The data are then reconstructed in the time domain from the frequency domain where high quality data are required. In this particular case, high quality ensures a distortionless and noiseless recovered baseband signal. This requires the application of the Fast Fourier Transform (FFT) and the Inverse Fast Fourier Transform (IFFT) to return the data to the spatial domain. The simulation is performed using Matlab. The composite baseband signal has been generated by developing a program as well as by acquiring it into the workspace. The feature of the method is that it can achieve high-quality integrated data reconstruction and can be associated easily with the spatial domain. http://thesai.org/Downloads/Volume1No3/Paper%207-High%20Quality%20Integrated%20Data%20Reconstruction%20for%20Medical%20Applications.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010307 eng
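The FFT/IFFT round trip the abstract describes can be sketched in a few lines: a composite baseband signal is moved to the frequency domain and recovered by the inverse transform. The signal and sampling parameters below are illustrative, not the paper's patient data, and NumPy stands in for the Matlab workflow the authors used.

```python
# Sketch of the FFT -> IFFT reconstruction round trip: a composite
# baseband signal goes to the frequency domain and is recovered losslessly
# by the inverse transform. Sampling rate and tones are assumed values.
import numpy as np

fs = 500                                # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)             # one second of samples
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

spectrum = np.fft.fft(signal)           # time domain -> frequency domain
recovered = np.fft.ifft(spectrum).real  # frequency domain -> time domain

# The round trip is lossless up to floating-point error.
print(np.allclose(signal, recovered))   # True
```

In the paper's setting, filtering or compression would happen between the two transforms; here the frequency-domain step is left empty to show only the reconstruction guarantee.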
oai:thesai.org:10.14569/IJACSA.2010.010308 2012-07-01
An Electronic Design of a Low Cost BRAILLE HANDGLOVE M Rajasenathipathi M.Arthanari M.Sivakumar Braille, cell, vibration, dots, motor International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 This paper documents a new design for a Braille hand glove comprising mostly electrical components. The design aims to produce a product that performs vibrations at six positions on a blind person’s right hand. A low cost and robust design will provide the blind with an affordable and reliable tool, and it introduces a new technique and communication method for blind persons. http://thesai.org/Downloads/Volume1No3/Paper%208-AN%20ELECTRONIC%20DESIGN%20OF%20A%20%20LOW%20%20COST%20%20BRAILLE%20%20%20HANDGLOVE.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010308 eng
oai:thesai.org:10.14569/IJACSA.2010.010309 2012-07-01
Test-Bed for Emergency Management Simulations Anu Vaidyanathan test-bed, Emergency Management, Live Call Records, PCMD, Proactive Crowd-Sourcing, Agents International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 We present a test-bed for Emergency Management Simulations by contrasting two prototypes we have built, CAVIAR and Reverse 111. We outline the desirable design principles that guide our choices for simulating emergencies and implement these ideas in a modular system, which utilizes proactive crowd-sourcing to enable emergency response centers to contact civilians co-located with an emergency to provide more information about the events. This aspect of proactive crowd-sourcing enables emergency response centers to take into account that an emergency situation is inherently dynamic, and that initial assumptions made while deploying resources to the emergency may not hold as the emergency unfolds. A number of independent entities, governmental and non-governmental, are known to interact while mitigating emergencies. Our test-bed utilizes a number of agents to simulate various resource sharing policies amongst different administrative domains and non-profit civilian organizations that might pool their resources at the time of an emergency. A common problem amongst first responders is the lack of interoperability amongst their devices. In our test-bed, we integrate live caller data obtained from traces generated by Telecom New Zealand, which tracks cell-phone users and their voice and data calls across the network, to identify co-located crowds. The test-bed has five important components, including means to select and simulate Events, Resources and Crowds, and additionally provides a visual interface as part of a massive online multi-player game to simulate emergencies in any part of the world. We also present our initial evaluation of some resource sharing policies in our intelligent agents, which are part of our test-bed.
http://thesai.org/Downloads/Volume1No3/Paper%209-A%20Test-Bed%20for%20Emergency%20Management%20Simulation.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010309 eng
oai:thesai.org:10.14569/IJACSA.2010.010310 2012-07-01
Emerging Trends of Ubiquitous Computing Prakriti Trivedi Kamal Kishore Sagar Vernon Braille, cell, vibration, dots, motor International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 Ubiquitous computing is a method of enhancing computer use by making many computers available throughout the physical environment, while making them effectively invisible to the user. The background network to support ubiquitous computing is the ubiquitous network, by which users can enjoy network services whenever and wherever they want (home, office, outdoors). In this paper, issues related to the ubiquitous network, smart objects and wide area ubiquitous networks are discussed. We also discuss various elements used in ubiquitous computing along with the challenges in this computing environment. http://thesai.org/Downloads/Volume1No3/Paper%2010-Emerging_Trends_of_Ubiquitous_Computing.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010310 eng
oai:thesai.org:10.14569/IJACSA.2010.010311 2012-07-01
Modelling and Analysing of Software Defect Prevention Using ODC Prakriti Trivedi Som Pachori Defect Prevention, ODC, Defect Trigger International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 As time passes, software complexity increases, and software reliability and quality are affected as a result. To measure software reliability and quality, various defect measurement and defect tracing mechanisms are used. Software defect prevention work typically focuses on individual inspection and testing techniques. ODC is a mechanism by which we exploit software defects that occur during the software development life cycle. Orthogonal defect classification is a concept which enables developers, quality managers and project managers to evaluate the effectiveness and correctness of the software. http://thesai.org/Downloads/Volume1No3/Paper%2011-Modelling%20and%20Analysing%20of%20Software%20Defect.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010311 eng
oai:thesai.org:10.14569/IJACSA.2010.010312 2012-07-01
Enhanced Segmentation Procedure for Intima-Adventitial Layers of Common Carotid Artery V Savithri S.Purushothaman Artery, boundary detection, imaging, Ultrasonic, parallel programming International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 This paper presents an enhanced segmentation technique for use on noisy B-mode ultrasound images of the carotid artery. The method is based on image enhancement, edge detection and morphological operations in boundary detection. This procedure may simplify the practitioner’s job of analyzing the accuracy and variability of segmentation results. Possible plaque regions are also highlighted. A thorough evaluation of the method in the clinical environment shows that inter-observer variability is evidently decreased, and so is the overall analysis time. The results demonstrate that it has the potential to perform qualitatively better than existing methods in intima and adventitial layer detection on B-mode images. http://thesai.org/Downloads/Volume1No3/Paper%2012-Enhanced%20Segmentation%20Procedure%20for%20Intima%20Adventitial%20Layers%20of%20Common%20Carotid.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010312 eng
oai:thesai.org:10.14569/IJACSA.2010.010313 2012-07-01
Application of Locality Preserving Projections in Face Recognition Shermina J Defect Prevention, ODC, Defect Trigger International Journal of Advanced Computer Science and Applications(IJACSA), 1(3), 2010 Face recognition technology has evolved as an enchanting solution to address contemporary needs for identification and verification of identity claims. By advancing feature extraction methods and dimensionality reduction techniques in pattern recognition, a number of face recognition systems have been developed with distinct degrees of success. Locality preserving projection (LPP) is a recently proposed method for unsupervised linear dimensionality reduction. LPP preserves the local structure of the face image space, which is usually more significant than the global structure preserved by principal component analysis (PCA) and linear discriminant analysis (LDA). This paper focuses on a systematic analysis of locality preserving projections and the application of LPP in combination with an existing technique. This combined approach of LPP through MPCA can preserve both the global and the local structure of the face image, which proves very effective. The proposed approach is tested using the AT & T face database. Experimental results show significant improvements in face recognition performance in comparison with some previous methods. http://thesai.org/Downloads/Volume1No3/Paper%2013-Application%20of%20Locality%20Preserving%20Projections%20in%20Face%20Recognition.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010313 eng
oai:thesai.org:10.14569/IJACSA.2010.010401 2012-07-01
Improving the Technical Aspects of Software Testing in Enterprises Tim A Majchrzak Software testing, testing, software quality, design science, IT alignment, process optimization, technical aspects International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 Many software development projects fail due to quality problems. Software testing enables the creation of high quality software products. Since it is a cumbersome and expensive task, and often hard to manage, both its technical background and its organizational implementation have to be well founded. We worked with regional companies that develop software in order to learn about their distinct weaknesses and strengths with regard to testing. Analyzing and comparing the strengths, we derived best practices. In this paper we explain the project’s background and sketch the design science research methodology used. We then introduce a graphical categorization framework that helps companies in judging the applicability of recommendations. Eventually, we present details on five recommendations for technical aspects of testing. For each recommendation we give implementation advice based on the categorization framework. http://thesai.org/Downloads/Volume1No4/Paper%201-%20Improving%20the%20Technical%20Aspects%20of%20Software%20Testing%20in%20Enterprises.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010401 eng
oai:thesai.org:10.14569/IJACSA.2010.010402 2012-07-01
Organizational and collaborative knowledge management: a Virtual HRD model based on Web2.0 Musadaq Hanandi Michele Grimaldi Virtual Human Resource Development, Knowledge Management, Human Resource Management, Web2.0, Organizational model. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 Knowledge development and utilization can be facilitated by human resource practices. At the organizational level, competitive advantage depends upon the firm’s utilization of existing knowledge and its ability to generate new knowledge more efficiently. At the individual level, increased delegation of responsibility and freedom of creativity may better allow the discovery and utilization of local and dispersed knowledge in the organization. This paper aims at introducing an innovative organizational model to support enterprises, international companies, and governments in developing their human resources through the virtual human resource, as a tool for knowledge capturing and sharing inside the organization. The VHRD organizational model allows different actors (top management, employees, and external experts) to interact and participate in the learning process, by providing non-threatening self-evaluation and individualized feedback. In this way, the model, which is based on possible patterns and rules from existing learning systems, Web 2.0 and a homogeneous set of integrated systems and technologies, can support the enterprise human resource department. In addition, the paper presents an evaluation method to assess knowledge management results inside the organization, by connecting the financial impacts with the strategy map. http://thesai.org/Downloads/Volume1No4/Paper%202-Organizational%20and%20collaborative%20knowledge%20management%20a%20Virtual%20HRD%20model%20based%20on%20Web2.0.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010402 eng
oai:thesai.org:10.14569/IJACSA.2010.010403 2012-07-01
Development of a Low-Cost GSM SMS-Based Humidity Remote Monitoring and Control system for Industrial Applications Dr B Ramamurthy S.Bhargavi Dr.R.ShashiKumar Automation, GSM, SMS, Humidity Sensor (HSM-20G), ARM Controller LPC2148, Remote Monitoring & Control, AT Commands, Password Security, Mobile phone. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 This paper proposes a wireless solution, based on GSM (Global System for Mobile Communication) networks [1], for the monitoring and control of humidity in industries. The system provides an ideal solution for monitoring critical plant on unmanned sites. The system is wireless [2] and therefore more adaptable and cost-effective. Utilizing the humidity sensor HSM-20G, the ARM controller LPC2148 and GSM technology, this system offers a cost-effective solution to a wide range of remote monitoring and control applications. Historical and real-time data can be accessed worldwide using the GSM network. The system can also be configured to transmit data on alarm or at preset intervals to a mobile phone using SMS text messaging. The proposed system monitors and controls the humidity from the remote location, and whenever it crosses the set limit the LPC2148 processor sends an SMS to the concerned plant authority’s mobile phone via the GSM network. The concerned authority can control the system through his mobile phone by sending AT commands to the GSM modem and in turn to the processor. The system also provides password security against operator misuse/abuse. The system uses GSM technology [3], thus providing ubiquitous access for security and automated monitoring and control of humidity. http://thesai.org/Downloads/Volume1No4/Paper%203-Development%20of%20a%20Low-Cost%20GSM%20SMS-Based%20Humidity%20Remote%20Monitoring%20and%20Control%20system%20for%20Industrial%20Applications.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010403 eng
oai:thesai.org:10.14569/IJACSA.2010.010404 2012-07-01
Design, Development and Simulations of MHD Equations with its prototype implementations Rajveer S Yaduvanshi Harish Parthasarathy Lorentz force, Navier Stokes Equation, Maxwell’s Equation, Iterative Solution, Prototype. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 The equations of motion of a conducting fluid in a magnetic field are formulated. These consist of three sets. First is the mass conservation equation; second is the Navier Stokes equation, which is Newton’s second law taking into account the force of the magnetic field on moving charges. The electric field effects are neglected, as is usually done in MHD. The third set is Maxwell’s equations, especially the no-monopole condition, along with Ampere’s law with the current given by Ohm’s law in a moving frame (the frame in which the moving particles of fluid are at rest). The mass conservation equation, assuming the fluid to be incompressible, leads us to express the velocity field as the curl of a velocity vector potential. The curl of the Navier Stokes equation leads to the elimination of pressure, thereby leaving an equation involving only the magnetic field and the fluid velocity field. The curl of the Ampere law equation leads us to another equation relating the magnetic field to the velocity field. A special case is considered in which the only non-vanishing components of the fluid velocity are the x and y components and the only non-vanishing component of the magnetic field is the z component. In this special case the velocity vector potential has only one non-zero component, known as the stream function. The MHD equations in this case reduce to three partial differential equations for the three functions in the 2D model; the stream function embeds the x and y velocity components. An application of the MHD system prototype has been worked out and presented.
http://thesai.org/Downloads/Volume1No4/Paper%204-Design,%20Development%20and%20Simulations%20of%20MHD%20equations.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010404 eng
oai:thesai.org:10.14569/IJACSA.2010.010405 2012-07-01
Multipath Fading Channel Optimization for Wireless Medical Applications A.K.M Fazlul Haque Md. Hanif Ali M Adnan Kiber ISI, ICI, OFDM, TTL, Channel Fading International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 In this paper, a new method has been proposed to eliminate intersymbol interference (ISI) and interchannel interference (ICI) for discrete multitone/orthogonal frequency division multiplexing (DMT/OFDM) systems by considering the Time to Live (TTL) of multipath channel fading, especially for wireless medical applications. In this method, the existence time of the packet is considered as the maximum propagation time, and when the packet is sent to the receiver, the down count of the TTL starts. The existence of the packet in a network depends on the TTL in an Internet Protocol (IP) packet, which tells a network router whether or not the packet has been in the network too long and should be discarded. The proposed structure prevents ICI with a preprocessing method that utilizes a particular time equal to the continuation time of the packet, and removes ISI by canceling the replica at the receiver. The simulation results show that the proposed method reduces the BER/ISI better under a multipath fading environment than other existing systems. http://thesai.org/Downloads/Volume1No4/Paper%205-Multipath%20Fading%20Channel%20Optimization%20for%20Wireless%20Medical%20Applications.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010405 eng
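The TTL mechanism the abstract borrows from IP can be sketched in a few lines: each packet carries a hop count that every forwarding node decrements, and the packet is discarded once the count reaches zero. The starting TTL and hop counts below are illustrative values, not parameters from the proposed system.

```python
# Sketch of the IP Time-to-Live rule: the TTL field is decremented at
# each hop and the packet is dropped when it reaches zero, bounding how
# long a packet (or its delayed multipath replica) can survive in the
# network. Starting values are assumed for illustration.
def forward(ttl, hops):
    """Decrement TTL once per hop; return remaining TTL, or None if dropped."""
    for _ in range(hops):
        ttl -= 1
        if ttl == 0:
            return None   # packet has lived too long: discard it
    return ttl

print(forward(ttl=8, hops=3))   # 5  (packet survives)
print(forward(ttl=8, hops=8))   # None (packet discarded en route)
```

The paper's contribution is to tie this bounded lifetime to the duration of multipath echoes so that stale replicas cannot cause ISI at the receiver.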
oai:thesai.org:10.14569/IJACSA.2010.010406 2012-07-01
Performance Analysis of Indoor Positioning System Leena Arya S.C. Sharma Millie Pant WLAN, Access point, Path loss model, Qualnet 5.0. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 In the new era of wireless communication, the Wireless Local Area Network has emerged as one of the key players in the wireless communication family. It is now a trend to develop WLANs in various college and office campuses for increasing productivity and quality of goods. There are many obstacles when deploying a WLAN, which demands seamless indoor handover. The objective of the work reported here is to develop modeling tools using the QualNet 5.0 simulation design tool for performance optimization of WLAN access points. To predict the signal strength and interference in a WLAN system, a propagation model has been used. http://thesai.org/Downloads/Volume1No4/Paper%206-Performance%20Analysis%20of%20Indoor%20Positioning%20System.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010406 eng
oai:thesai.org:10.14569/IJACSA.2010.010407 2012-07-01
Performance Comparison between Ant Algorithm and Modified Ant Algorithm Shaveta Malik Ant Algorithm, Modified Ant algorithm, Travelling Salesman Problem, Quadratic problem, Ant System International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 This paper gives a brief overview of two meta-heuristic techniques that are used to find the best among the optimal solutions for complex problems like the travelling salesman problem and the quadratic problem. Both techniques are based on the natural phenomenon of ants. The ant algorithm finds good paths, but due to some of its shortcomings it is not able to give the best of the good or optimal solutions; the modified ant algorithm, which is based on probability, finds the best among the optimal paths. We will also see that the modified ant algorithm can obtain a smaller number of hops, which helps us to get the best solution to typical problems. http://thesai.org/Downloads/Volume1No4/Paper%207-Performance%20Comparison%20Between%20Ant%20Algorithm%20and%20Modified%20Ant%20Algorithm.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010407 eng
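The probabilistic step at the heart of ant algorithms for the travelling salesman problem can be sketched as follows: the next city is drawn with probability proportional to pheromone strength times a distance heuristic. This is the standard Ant System transition rule, not the specific modification proposed in the paper; distances, pheromone levels and the alpha/beta parameters are illustrative.

```python
# Sketch of the classic Ant System city-selection rule: an ant at
# `current` picks the next city j with probability proportional to
# tau(current,j)^alpha * (1/d(current,j))^beta (roulette-wheel draw).
# All numeric values below are assumed for illustration.
import random

def choose_next(current, unvisited, pheromone, dist, alpha=1.0, beta=2.0,
                rng=random):
    weights = [(pheromone[(current, j)] ** alpha)
               * ((1.0 / dist[(current, j)]) ** beta)
               for j in unvisited]
    r = rng.random() * sum(weights)
    acc = 0.0
    for j, w in zip(unvisited, weights):
        acc += w
        if r <= acc:
            return j
    return unvisited[-1]   # numerical fallback

dist = {(0, 1): 2.0, (0, 2): 4.0}    # city 1 is closer to city 0
tau = {(0, 1): 1.0, (0, 2): 1.0}     # equal pheromone at the start
print(choose_next(0, [1, 2], tau, dist) in (1, 2))   # True
```

With equal pheromone, the closer city (here city 1) is chosen four times as often because beta=2 squares the inverse-distance heuristic; pheromone updates then bias later ants toward good tours.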
oai:thesai.org:10.14569/IJACSA.2010.010408 2012-07-01
Dynamic Reduct and Its Properties in the Object-Oriented Rough Set Models M Srivenkatesh P.V.G.D.Prasadreddy Y.Srinivas Rough Set, Dynamic Reduct, Feature Core, Indiscernibility Relations, Discernibility Matrices. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 This paper deals with a new type of reduct in the object-oriented rough set model, called a dynamic reduct. In the object-oriented rough set models, objects are treated as instances of classes, and structural hierarchies among objects are illustrated based on the is-a relationship and the has-a relationship [6]. In this paper, we propose dynamic reducts and the notion of core according to the dynamic reduct in the object-oriented rough set models. The paper describes various formal definitions of core and discusses some properties of the dynamic core in the object-oriented rough set models. http://thesai.org/Downloads/Volume1No4/Paper%208-Dynamic%20reducts%20and%20its%20Properties%20in%20the%20Object-Oriented%20Rough%20Set%20Models.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010408 eng
oai:thesai.org:10.14569/IJACSA.2010.010409 2012-07-01
Parallel Printer Port for Phase Measurement Dr R.Padma Suvarna Dr.M. Usha Rani P.M.Kalyani Dr.R.Seshadri Yaswanth Kumar.Avulapati Phase detector, voltage controlled oscillator, phase locked loop and parallel printer port. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 This white paper discusses the measurement of phase angle using a phase locked loop and the printer port. The phase detector compares the phase of a periodic input signal against the phase of the output of a voltage controlled oscillator and generates an average output voltage Vout which is linearly proportional to the phase difference Δφ between its two inputs. This output voltage is measured using the parallel printer port of a PC. http://thesai.org/Downloads/Volume1No4/Paper%209-PARALLEL%20PRINTER%20PORT%20FOR%20PHASE%20MEASUREMENT.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010409 eng
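Since the detector output is linear in the phase difference (Vout = K · Δφ), the PC side only has to divide the sampled voltage by the detector gain K. A minimal sketch of that conversion follows; the gain value and the sampled reading are illustrative assumptions, not the calibration from the paper.

```python
# Sketch of recovering the phase angle from the phase detector's averaged
# output voltage, using Vout = K * delta_phi. Gain and reading are assumed
# example values; a real setup would calibrate K against known phases.
K_MV_PER_DEG = 10   # detector gain in millivolts per degree (assumed)

def phase_from_voltage(v_out_mv, gain=K_MV_PER_DEG):
    """Return the phase difference in degrees for a measured voltage in mV."""
    return v_out_mv / gain

print(phase_from_voltage(450))   # 45.0
```

Working in millivolts per degree keeps the arithmetic in integers; the linearity holds only over the detector's monotonic range, so readings outside it would need range checking.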
oai:thesai.org:10.14569/IJACSA.2010.010410 2012-07-01
Traffic Load based Performance Analysis of DSR, STAR & AODV Adhoc Routing Protocol Parma Nand Dr. S.C. Sharma Rani Astya Adhoc networks; wireless networks; CBR, routing protocols; route discovery; simulation; performance evaluation; MAC; IEEE 802.11; STAR; DSR; AODV. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 The wireless adhoc network comprises nodes (static or mobile) with wireless radio interfaces. These nodes are connected among themselves without central infrastructure and are free to move. Communication is a multihop process because of the limited transmission range of energy-constrained wireless nodes. Thus, in such a multihop network system each node (also known as a router) is independent, self-reliant and capable of routing packets over the dynamic network topology, and therefore routing becomes a very important and basic operation of the adhoc network. Many protocols have been reported in this field, but it is difficult to decide which one is best. In this paper the table-driven protocol STAR and the on-demand routing protocols AODV and DSR, based on IEEE 802.11, are surveyed and a characteristic summary of these routing protocols is presented. Their performance is analyzed on throughput, jitter, packet delivery ratio and end-to-end delay metrics by varying the CBR data traffic load, and their performance is then compared using the QualNet 5.0.2 network simulator. http://thesai.org/Downloads/Volume1No4/Paper_10-Traffic_Load_based_Performance_Analysis.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010410 eng
oai:thesai.org:10.14569/IJACSA.2010.010411 2012-07-01
Computing the Most Significant Solution from Pareto Front obtained in Multi-objective Evolutionary Algorithms P.M Chaudhari Dr. R.V. Dharaskar Dr. V. M. Thakare Multiobjective, Pareto front, Clustering techniques International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 Problems with multiple objectives can be solved by using Pareto optimization techniques in evolutionary multi-objective optimization algorithms. Many applications involve multiple objective functions, and the Pareto front may contain a very large number of points. Selecting a solution from such a large set is potentially intractable for a decision maker. Previous approaches to this problem aimed to find a representative subset of the solution set. Clustering techniques can be used to organize and classify the solutions. Implementation of this methodology for various applications and in a decision support system is also discussed. http://thesai.org/Downloads/Volume1No4/Paper%2011-Computing%20the%20Most%20Significant%20Solution%20from%20Pareto%20Front%20obtained%20in%20Multi-objective%20Evolutionary%20Algorithms.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010411 eng
oai:thesai.org:10.14569/IJACSA.2010.010412 2012-07-01
Applying Intuitionistic Fuzzy Approach to Reduce Search Domain in an Accidental Case Yasir Ahmad Sadia Husain Afshar Alam fuzzy sets, Intuitionistic fuzzy relation, Intuitionistic fuzzy database, Intuitionistic fuzzy tolerance International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 In this paper we use an intuitionistic fuzzy approach to minimize the search domain in an accidental case where the data collected for investigation are intuitionistic fuzzy in nature. To handle these types of imprecise information we use an intuitionistic fuzzy tolerance relation and translate an intuitionistic fuzzy query to reach the conclusion. Here we present an example of a vehicle hit-and-run case where the accused had fled the accident spot within seconds, leaving no clue behind. http://thesai.org/Downloads/Volume1No4/Paper%2012-Applying%20Intuitionistic%20Fuzzy%20Approach%20to%20Reduce%20Search%20Domain%20in%20an%20Accidental%20Case.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010412 eng
oai:thesai.org:10.14569/IJACSA.2010.010413 2012-07-01
Search Technique Using Wildcards or Truncation: A Tolerance Rough Set Clustering Approach Sandeep Kumar Satapathy Shruti Mishra Debahuti Mishra Clustering, Tolerance Rough Set, Search Engine, Wildcard Truncation International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 Search engine technology plays an important role in web information retrieval. However, with the Internet information explosion, traditional searching techniques cannot provide satisfactory results due to problems such as the huge number of result web pages, unintuitive ranking, etc. Therefore, the reorganization and post-processing of web search results have been extensively studied to help users effectively obtain useful information. This paper has three parts. The first part is a review of how a keyword is expanded through truncation or wildcards (a little-known but powerful feature) using symbols like * or !. The primary goal is to restrict ourselves to stating the keyword with truncation or wildcard symbols rather than expanding the keyword into sentential form. The second part gives a brief idea of the tolerance rough set approach to clustering the search results. In the tolerance rough set approach we use a tolerance factor, based on which we cluster the information-rich search results and discard the rest. But it may so happen that the discarded results still contain some information regarding the query, even though it is not up to the tolerance level. The third part depicts a proposed algorithm based on the above two parts, solving this problem that usually arises in the tolerance rough set approach. The main goal of this paper is to develop a search technique through which information retrieval will be very fast, reducing the amount of extra labor needed for expanding the query.
http://thesai.org/Downloads/Volume1No4/Paper%2013-Search%20Technique%20Using%20Wildcards%20or%20Truncation%20A%20Tolerance%20Rough%20Set%20Clustering%20Approach.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010413 eng
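The truncation/wildcard expansion the first part of the paper reviews can be sketched with the Python standard library's shell-style matcher: the user writes one stem such as "comput*" and the engine matches every vocabulary term sharing it. The vocabulary list is illustrative, not drawn from the paper.

```python
# Sketch of truncation/wildcard keyword expansion: a single truncated
# query term matches all vocabulary entries sharing the stem, so the
# user need not spell out every variant. fnmatch is the stdlib
# shell-style pattern matcher; the vocabulary below is an assumed example.
import fnmatch

vocabulary = ["compute", "computer", "computing", "computation", "commuter"]
matches = fnmatch.filter(vocabulary, "comput*")
print(matches)   # ['compute', 'computer', 'computing', 'computation']
```

A production engine would match against an inverted index rather than a flat list, but the expansion semantics are the same: the wildcard replaces manual enumeration of query variants.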
oai:thesai.org:10.14569/IJACSA.2010.010414 2012-07-01
Measuring Semantic Similarity between Words Using Web Documents Sheetal A Takale Sushma S. Nandgaonkar Semantic Similarity, Wikipedia, Web Search Engine, Natural Language Processing, Information Retrieval, Web Mining. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 Semantic similarity measures play an important role in the extraction of semantic relations. Semantic similarity measures are widely used in Natural Language Processing (NLP) and Information Retrieval (IR). The work proposed here uses web-based metrics to compute the semantic similarity between words or terms and also compares them with the state of the art. For a computer to decide semantic similarity, it should understand the semantics of the words. A computer being a syntactic machine, it cannot understand semantics, so an attempt is always made to represent semantics as syntax. There are various methods proposed to find the semantic similarity between words. Some of these methods use precompiled databases like WordNet and the Brown Corpus. Some are based on a Web search engine. The approach presented here is altogether different from these methods. It makes use of snippets returned by Wikipedia or any encyclopedia such as the Encyclopaedia Britannica. The snippets are preprocessed for stop-word removal and stemming. For suffix removal, M. F. Porter's algorithm is used. Luhn's idea is used for extraction of significant words from the preprocessed snippets. The similarity measures proposed here are based on five different association measures in information retrieval, namely simple matching, Dice, Jaccard, Overlap, and Cosine coefficients. Performance of these methods is evaluated using Miller and Charles' benchmark dataset. It gives a higher correlation value of 0.80 than some of the existing methods.
http://thesai.org/Downloads/Volume1No4/Paper%2014-Measuring%20Semantic%20Similarity%20Between%20Words%20Using%20Web%20Documents.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010414 eng
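The five association measures named in the abstract can be written directly over the two sets of significant words extracted from snippets. The following sketch assumes plain set inputs and is illustrative, not the authors' implementation:

```python
def association_measures(a, b):
    """Classic IR set-association measures between two sets of
    significant words extracted from snippets."""
    a, b = set(a), set(b)
    inter = len(a & b)                       # shared significant words
    return {
        "simple_matching": inter,
        "dice":    2 * inter / (len(a) + len(b)),
        "jaccard": inter / len(a | b),
        "overlap": inter / min(len(a), len(b)),
        "cosine":  inter / (len(a) * len(b)) ** 0.5,
    }
```

Each coefficient normalises the raw overlap differently; e.g. Jaccard divides by the union size, while Overlap divides by the smaller set, so Overlap reaches 1.0 whenever one word's snippet vocabulary subsumes the other's.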
oai:thesai.org:10.14569/IJACSA.2010.010415 2012-07-01
Classification of Self-Organizing Hierarchical Mobile Adhoc Network Routing Protocols - A Summary Udayachandran Ramasamy K. Sankaranarayanan MANET, Routing Protocols, Routing Topology, Routing Algorithms and QoS. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 A MANET is a special kind of wireless network: a collection of mobile nodes operating without the aid of an established infrastructure. A mobile ad hoc network removes the dependence on a fixed network infrastructure by treating every available mobile node as an intermediate switch, thereby extending the range of mobile nodes well beyond that of their base transceivers. Other advantages of MANETs include easy installation and upgrade, low cost and maintenance, more flexibility, and the ability to employ new and efficient routing protocols for wireless communication. In this paper we present four routing protocol classifications and discuss their advantages and disadvantages. http://thesai.org/Downloads/Volume1No4/Paper%2015-Classification%20of%20Self-Organizing%20Hierarchical%20Mobile%20Adhoc%20Network%20Routing%20Protocols%20-%20A%20Summary%20.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010415 eng
oai:thesai.org:10.14569/IJACSA.2010.010416 2012-07-01
Clustering Methods for Credit Card using Bayesian rules based on K-means classification S Jessica Saritha Prof. P.Govindarajulu K. Rajendra Prasad S.C.V. Ramana Rao C.Lakshmi Clusters, Probability, K-Means, Bayes' rule, Credit Card, attributes, banking. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 The K-means clustering algorithm is a method of cluster analysis which aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean. It is one of the simplest unsupervised learning algorithms that solve the well-known clustering problem. It is similar to the expectation-maximization algorithm for mixtures of Gaussians in that both attempt to find the centers of natural clusters in the data. Bayes' rule is a theorem in probability theory named after Thomas Bayes. It is used for updating probabilities by finding conditional probabilities given new data. In this paper, the K-means clustering algorithm and Bayesian classification are combined to analyze credit card data. The analysis results can be used to improve accuracy. http://thesai.org/Downloads/Volume1No4/Paper%2016-Clustering%20Methods%20for%20Credit%20Card%20using%20Bayesian%20rules%20based%20on%20K-means%20classification.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010416 eng
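The nearest-mean assignment/recompute loop described in this abstract can be sketched in a few lines for one-dimensional observations. The deterministic initialisation below (centres spread over the sorted data) is chosen for illustration; the paper's own variant and initialisation may differ:

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 1-D observations: assign each point to the
    nearest centre, then recompute each centre as its cluster mean."""
    pts = sorted(points)
    centres = pts[:: max(1, len(pts) // k)][:k]   # spread initial centres
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pts:
            i = min(range(k), key=lambda j: abs(p - centres[j]))
            clusters[i].append(p)
        # keep the old centre if a cluster ends up empty
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)
```

The per-cluster frequencies of known outcomes in the training data would then serve as the priors that Bayes' rule updates when classifying a new credit-card record.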
oai:thesai.org:10.14569/IJACSA.2010.010417 2012-07-01
Flow Controlling of Access at Edge Routers S.C.V Ramana Rao S.Naga Mallik Raj S. Neeraja P.Prathusha J.David Sukeerthi Kumar bandwidth, Traffic, edge-routers, routers, decision, multimedia, Quality of Service, framework, algorithm, domain. International Journal of Advanced Computer Science and Applications(IJACSA), 1(4), 2010 It is very important to allocate and manage resources for multimedia traffic flows with real-time performance requirements in order to guarantee quality of service (QoS). In this paper, we develop a scalable architecture and an algorithm for access control of real-time flows. Since individual management of each traffic flow on each transit router can cause a fundamental scalability problem in both the data and control planes, we consider that each flow is classified at the ingress router and data flows are aggregated according to class inside the core network, as in a DiffServ framework. In our approach, the access decision is made for each flow at the edge routers, but it is scalable because per-flow states are not maintained and the access algorithm is simple. In the proposed access control scheme, an admissible bandwidth, defined as the maximum rate of a flow that can be accommodated additionally while satisfying the delay performance requirements for both existing and new flows, is calculated based on the available bandwidth measured by the edge routers. The admissible bandwidth is the criterion for access control, and thus it is very important to estimate it accurately. The performance of the proposed algorithm is evaluated through a set of simulation experiments using bursty traffic flows. http://thesai.org/Downloads/Volume1No4/Paper%2017-Flow%20Controlling%20of%20Access%20at%20Edge%20Routers.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010417 eng
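The admission decision the abstract describes — accept a new flow only if its rate fits inside the measured admissible bandwidth — reduces to a comparison of roughly this shape. The function name and the safety-margin parameter are hypothetical illustrations, not the paper's algorithm:

```python
def admit(flow_rate, admissible_bw, margin=0.9):
    """Edge-router admission test sketch: accept a new real-time flow
    only if its requested rate fits within the measured admissible
    bandwidth, scaled by a margin that protects delay guarantees for
    both existing and new flows."""
    return flow_rate <= margin * admissible_bw
```

Because the check uses only a measured aggregate (no per-flow state is stored), it stays scalable at the edge, which is the paper's central design point.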
oai:thesai.org:10.14569/IJACSA.2010.010501 2012-07-01
Wavelet Time-frequency Analysis of Electro-encephalogram (EEG) Processing Zhang Xizheng Yin Ling Wang Weixiong EEG, time-frequency analysis, wavelet transform, de-noising. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 This paper applies time-frequency analysis to the EEG spectrum and wavelet analysis to EEG de-noising. The basic idea is to exploit the multi-scale, multi-resolution characteristics of the wavelet transform, using four different thresholds to remove interference and noise after decomposition of the EEG signals. By analyzing the results and the effects of the four different methods, it is concluded that wavelet de-noising with a soft threshold performs best. http://thesai.org/Downloads/Volume1No5/Paper%201-Wavelet%20Time-frequency%20Analysis%20of%20Electro-encephalogram%20(EEG)%20Processing.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010501 eng
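The soft-threshold rule this abstract favours is standard and compact: every detail coefficient is shrunk toward zero by the threshold, and coefficients whose magnitude falls below it are zeroed. A minimal NumPy sketch of just that rule (a full pipeline would also decompose and reconstruct the signal, e.g. with PyWavelets):

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Soft wavelet thresholding: shrink each detail coefficient
    toward zero by t; coefficients with |c| < t become exactly 0."""
    c = np.asarray(coeffs, dtype=float)
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
```

Unlike hard thresholding (which keeps surviving coefficients unchanged), the soft rule leaves no discontinuity at the threshold, which is why it tends to produce smoother de-noised EEG traces.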
oai:thesai.org:10.14569/IJACSA.2010.010502 2012-07-01
Requirements Analysis through Viewpoints Oriented Requirements Model (VORD) Ahmed M Salem Software Requirements; Requirements Modeling; Functional Requirements International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 This paper describes an extension to the Viewpoints Oriented Requirements Definition (VORD) model and attempts to resolve its lack of direct support for viewpoint interaction. Supporting the viewpoint interaction provides a useful tool for analyzing requirements changes and automating systems. It can also be used to indicate when multiple requirements are specified as a single requirement. The extension is demonstrated with the bank auto-teller system that was part of the original VORD proposal. http://thesai.org/Downloads/Volume1No5/Paper%202-Requirements%20Analysis%20through%20Viewpoints%20Oriented%20Requirements%20Model%20(VORD).pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010502 eng
oai:thesai.org:10.14569/IJACSA.2010.010503 2012-07-01
Model for Enhancing Requirements Traceability and Analysis Ahmed M Salem Requirements Traceability; Software Faults; Software Quality. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Software quality has been a challenge since the inception of computer software. Software requirements gathering, analysis, and specification; are viewed by many as the principle cause of many of the software complex problems. Requirements traceability is one of the most important and challenging tasks in ensuring clear and concise requirements. Requirements need to be specified and traced throughout the software life cycle in order to produce quality requirements. This paper describes a preliminary model to be used by software engineers to trace and verify requirements at the initial phase. This model is designed to be adaptable to requirement changes and to assess its impact. http://thesai.org/Downloads/Volume1No5/Paper%203-A%20Model%20for%20Enhancing%20Requirements%20Traceability%20and%20Analysis.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010503 eng
oai:thesai.org:10.14569/IJACSA.2010.010504 2012-07-12
The Result Oriented Process for Students Based on Distributed Datamining Pallamreddy Venkatasubbareddy Vuda Sreenivasarao Learning result evaluation system; distributed data mining; decision tree. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The student result-oriented learning process evaluation system is an essential tool and approach for monitoring and controlling the quality of the learning process. From the perspective of data analysis, this paper conducts research on a student result-oriented learning process evaluation system based on distributed data mining and a decision tree algorithm. Data mining technology has emerged as a means of identifying patterns and trends in large quantities of data. The paper aims at putting forward a rule-discovery approach suitable for student learning result evaluation and applying it in practice, so as to improve the learning evaluation of communication skills and finally better serve learning practice. http://thesai.org/Downloads/Volume1No5/Paper%204-THE%20RESULT%20ORIENTED%20PROCESS%20FOR%20STUDENTS%20BASED%20ON%20DISTRIBUTED%20DATAMINING.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010504 eng
oai:thesai.org:10.14569/IJACSA.2010.010505 2012-07-01
A New Filtering Method in the Wavelet Domain for Bowel Sounds Zhang Xizheng Yin Ling Wang Weixiong bowel sounds; de-noising; wavelet transform; threshold International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The bowel sounds (BS) signal is one of the important human physiological signals; analysis of the BS signal allows the study of gastrointestinal physiology and direct, effective diagnosis of gastrointestinal disorders. Different threshold de-noising methods were used to de-noise the original bowel sounds, simulated in the MATLAB environment; the de-noising results were then compared and the advantages and disadvantages of the threshold de-noising methods analyzed. http://thesai.org/Downloads/Volume1No5/Paper%205%20An%20New%20Filtering%20Methods%20in%20the%20Wavelet%20Domain%20for%20Bowel%20Sounds%20.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010505 eng
oai:thesai.org:10.14569/IJACSA.2010.010506 2012-07-01
2- Input AND Gate For Fast Gating and Switching Based On XGM Properties of InGaAsP Semiconductor Optical Amplifier Vikrant k Srivastava Devendra Chack Vishnu Priye Optical Logic Gates, Semiconductor Optical Amplifier, SOA, Four Wave Mixing, Cross Gain Modulation International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 We report on an all-optical AND gate using simultaneous Four-Wave Mixing (FWM) and Cross-Gain Modulation (XGM) in a semiconductor optical amplifier (SOA). The operation of the proposed AND gate is simulated and the results demonstrate its effectiveness. This AND gate could provide a new possibility for all-optical computing and all-optical routing in future all-optical networks. In an AND (AB) gate, an intermediate Boolean is first obtained by using signal B as a pump beam and a clock signal as a probe beam in SOA-1. By passing signal A as a probe beam and the intermediate output as a pump beam through SOA-2, Boolean AB is acquired. The proposed optical logic unit is based on coupled nonlinear equations describing the XGM and FWM effects. These equations are first solved to generate the pump, probe and conjugate pulses in an SOA. The pulse behavior is analyzed and applied to realize the all-optical AND gate, and its function is verified with the help of waveforms and analytical assumptions. http://thesai.org/Downloads/Volume1No5/Paper%206%202-%20Input%20AND%20Gate%20For%20Fast%20Gating%20and%20Switching%20Based%20On%20XGM%20Properties%20of%20InGaAsP%20Semiconductor%20Optical%20Amplifier.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010506 eng
oai:thesai.org:10.14569/IJACSA.2010.010507 2012-07-01
Analysis and Enhancement of BWR Mechanism in MAC 802.16 for WIMAX Networks R Bhakthavathsalam R. ShashiKumar V. Kiran Y. R. Manjunath WiMAX, MAC 802.16, BWR, NS-2, Tcl International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 WiMAX [Worldwide Interoperability for Microwave Access] is the latest contender as a last-mile solution for providing broadband wireless Internet access and is an IEEE 802.16 standard. In the IEEE 802.16 MAC protocol, Bandwidth Request (BWR) is the mechanism by which the Subscriber Station (SS) communicates its need for uplink bandwidth allocation to the Base Station (BS). The performance of the system is affected by collisions of BWR packets in uplink transmission, which have a direct impact on the size of the contention period in the uplink subframe, the uplink access delay, and the uplink throughput. This paper mainly deals with performance analysis and improvement of uplink throughput in MAC 802.16 through the application of a new mechanism of circularity. The implementation incorporates a generic simulation of the contention resolution mechanism at the BS. An analysis of the total uplink access delay and of uplink throughput is performed. A new paradigm of circularity is employed by selectively dropping appropriate control packets in order to obviate bandwidth request collisions, which yields minimum access delay and thereby effective utilization of available bandwidth for uplink throughput. This new paradigm improves contention resolution among the bandwidth request packets and thereby reduces delay and increases throughput regardless of the density and topological spread of subscriber stations handled by the BS in the network. http://thesai.org/Downloads/Volume1No5/Paper%207%20Analysis%20and%20Enhancement%20of%20BWR%20Mechanism%20in%20MAC%20802.16%20for%20WIMAX%20Networks.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010507 eng
oai:thesai.org:10.14569/IJACSA.2010.010508 2012-07-01
An Intelligent Software Workflow Process Design for Location Management on Mobile Devices N Mallikharjuna Rao P.Seetharam Wireless system, mobile, BPR, software design, intelligent design, Fuzzy database International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Advances in networking and wireless communication technologies and the miniaturization of computers have led to rapid development of mobile communication infrastructure and have drastically changed information processing on mobile devices. Users carrying portable devices can move around freely while still connected to the network. This provides flexibility in accessing information anywhere at any time. To further improve flexibility on mobile devices, the new challenges in designing software systems for mobile networks include location and mobility management, channel allocation, power saving and security. In this paper, we propose an intelligent software tool for software design on mobile devices to meet the new challenges of location and mobility management. The proposed Business Process Redesign (BPR) concept aims at extending the capabilities of an existing, widely used industrial process modeling tool with 'intelligent' capabilities to suggest favorable alternatives to an existing software workflow design, improving flexibility on mobile devices. http://thesai.org/Downloads/Volume1No5/Paper%208%20An%20Intelligent%20Software%20Workflow%20Process%20Design%20for%20Location%20Management%20on%20Mobile%20Devices.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010508 eng
oai:thesai.org:10.14569/IJACSA.2010.010509 2012-07-01
Pattern Discovery using Fuzzy FP-growth Algorithm from Gene Expression Data Sabita Barik Debahuti Mishra Shruti Mishra Sandeep Ku. Satapathy Amiya Ku. Rath Milu Acharya Gene Expression Data; Association Rule Mining; Apriori Algorithm; Frequent Pattern Mining; FP-growth Algorithm International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The goal of microarray experiments is to identify genes that are differentially transcribed with respect to different biological conditions of cell cultures and samples. Hence, methods of data analysis need to be carefully evaluated, such as clustering, classification, prediction, etc. In this paper, we propose an efficient frequent-pattern-based clustering to find the genes which form frequent patterns showing similar phenotypes leading to specific symptoms for a specific disease. In the past, most approaches to finding frequent patterns were based on the Apriori algorithm, which generates and tests candidate itemsets (gene sets) level by level. This processing causes iterative database (dataset) scans and high computational costs. The Apriori algorithm also suffers from mapping the support and confidence framework to a crisp boundary. Our hybridized fuzzy FP-growth approach not only outperforms Apriori with respect to computational cost, but also builds a tight tree structure to keep the membership values of fuzzy regions, overcoming the sharp-boundary problem, and it also takes care of scalability issues as the number of genes and conditions increases. http://thesai.org/Downloads/Volume1No5/Paper%209-Pattern%20Discovery%20using%20Fuzzy%20FP-growth%20Algorithm%20from%20Gene%20Expression%20Data.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010509 eng
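The "fuzzy region" idea that replaces the crisp support boundary can be illustrated with a triangular membership function: instead of an expression value falling squarely into one bin, it belongs to overlapping regions (e.g. low/middle/high) with graded membership that the FP-tree would accumulate. The region parameters below are hypothetical, for illustration only:

```python
def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership: rises linearly from a to a peak
    of 1 at b, falls linearly from b to c, and is 0 outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

A value near a region border then contributes partial membership to two adjacent regions rather than being forced across a sharp cut-off, which is the problem the abstract attributes to Apriori's crisp framework.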
oai:thesai.org:10.14569/IJACSA.2010.010510 2012-07-01
Identification and Evaluation of Functional Dependency Analysis using Rough sets for Knowledge Discovery Y V Sreevani Prof. T. Venkat Narayana Rao Rough set; knowledge base; data mining; functional dependency; core knowledge. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The process of data acquisition has gained momentum due to the efficient representation of storage/retrieval systems. Due to the commercial and application value of these stored data, database management has become essential for reasons like consistency and atomicity, giving birth to the DBMS. Existing database management systems cannot provide the needed information when the data are not consistent, so knowledge discovery in databases and data mining have become popular for these reasons. The non-trivial process of extracting implicit, previously unknown information can be classified as knowledge discovery, and the knowledge discovery process can be attempted with clustering tools. One of the upcoming tools for knowledge representation and knowledge acquisition is based on the concept of rough sets. This paper explores inconsistencies in existing databases by finding functional dependencies, extracting the required information or knowledge based on rough sets. It also discusses attribute reduction through cores and reducts, which helps in avoiding superfluous data. A method is suggested to solve this problem of data inconsistency in the medical domain, with an analysis. http://thesai.org/Downloads/Volume1No5/Paper%2010-Identification%20and%20Evaluation%20of%20Functional%20Dependency%20Analysis%20using%20%20Roughsets%20for%20Knowledge%20Discovery.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010510 eng
oai:thesai.org:10.14569/IJACSA.2010.010511 2012-07-01
Generalized Two Axes Modeling, Order Reduction and Numerical Analysis of Squirrel Cage Induction Machine for Stability Studies Sudhir Kumar P. K. Ghosh S. Mukherjee Induction machine, Model order reduction, Stability, Transient analysis International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 A substantial amount of power system load is made up of a large number of three-phase induction machines. The transient phenomena of these machines play an important role in the behavior of the overall system; thus, modeling of the induction machine is an integral part of some power system transient studies. The analysis takes a detailed form only when the modeling becomes accurate to a greater degree. When the stator eddy current path is taken into account, the uniform air-gap theory in phase-model analysis becomes inefficient at capturing the transients. This drawback necessitates analysis of the machine in the d-q axis frame. A widely accepted induction machine model for stability studies is the fifth-order model, which considers the electrical transients in both rotor and stator windings as well as the mechanical transients. In practice, some flux transients can be ignored due to the quasi-stationary nature of the variables concerned. This philosophy leads to the formation of reduced-order models. Model Order Reduction (MOR) encompasses a set of techniques whose goal is to generate reduced-order models of lower complexity while ensuring that the I/O response and other characteristics of the original model (such as passivity) are maintained. This paper takes the above matter as its main point of research. The authors use the speed build-up of the induction machine to find the speed-versus-time profile for various load conditions and supply voltage disturbances, using the Runge-Kutta, trapezoidal and Euler numerical methods on the MATLAB platform.
The established fact of lower computation time for the reduced-order model has been verified, and an improvement in accuracy is observed. http://thesai.org/Downloads/Volume1No5/Paper%2011%20A%20Generalized%20Two%20Axes%20Modeling%20,%20Order%20Reduction%20and%20Numerical%20Analysis%20of%20Squirrel%20Cage%20Induction%20Machine%20for%20Stability%20Studies.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010511 eng
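The three integrators named in this abstract (Euler, trapezoidal, Runge-Kutta) can be compared on any first-order system dy/dt = f(t, y). The sketch below uses a toy exponential-decay equation rather than the machine's fifth-order model, purely to show the accuracy ordering the methods exhibit:

```python
def euler(f, y0, t0, t1, n):
    """Explicit Euler: step along the slope at the start of each step."""
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def trapezoidal(f, y0, t0, t1, n):
    """Explicit trapezoidal (Heun): average the slope at the start
    and at an Euler-predicted endpoint."""
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        yp = y + h * f(t, y)                    # Euler predictor
        y += h * (f(t, y) + f(t + h, yp)) / 2   # trapezoidal corrector
        t += h
    return y

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta."""
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y
```

With the same step count, Euler is first-order accurate, the trapezoidal scheme second-order, and RK4 fourth-order, which is why the cheaper methods need many more steps for comparable speed-versus-time profiles.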
oai:thesai.org:10.14569/IJACSA.2010.010512 2012-07-01
Single Input Multiple Output (SIMO) Wireless Link with Turbo Coding M M Kamruzzaman Dr. Mir Mohammad Azad Diversity; multipath fading; Rayleigh fading; Turbo coding; Maximal Ratio Combining (MRC); maximum-likelihood detector; multiple antennas International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The performance of a wireless link is evaluated with turbo coding in the presence of Rayleigh fading, with a single transmitting antenna and multiple receiving antennas. A QAM modulator is considered with maximum-likelihood decoding. Performance results show a significant improvement in the signal-to-noise ratio (SNR) required to achieve a given BER. It is found that the system attains a coding gain of 14.5 dB and 13 dB for two and four receiving antennas respectively over the corresponding uncoded system. Further, there is an improvement in SNR of 6.5 dB for four receiving antennas over two receiving antennas for the turbo-coded system. http://thesai.org/Downloads/Volume1No5/Paper%2012%20Single%20Input%20Multiple%20Output%20(SIMO)%20Wireless%20Link%20with%20Turbo%20Coding.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010512 eng
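The receive-diversity step behind these results, Maximal Ratio Combining, weights each receive branch by the conjugate of its channel gain before summing, which maximises the post-combining SNR on a Rayleigh-faded SIMO link. A minimal NumPy sketch of just the combiner (the turbo encoder/decoder chain is a separate, much larger component):

```python
import numpy as np

def mrc_combine(received, channel):
    """Maximal Ratio Combining across receive antennas: weight each
    branch by its conjugate channel gain, sum, and normalise so the
    output is an estimate of the transmitted symbol."""
    received = np.asarray(received)
    channel = np.asarray(channel)
    return (np.conj(channel) * received).sum() / (np.abs(channel) ** 2).sum()
```

In the noiseless case the combiner returns the transmitted symbol exactly; with noise, branches with stronger channel gains contribute proportionally more, which is the source of the diversity gain the abstract measures.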
oai:thesai.org:10.14569/IJACSA.2010.010513 2012-07-01
Reliable Multicast Transport Protocol: RMTP Pradip M Jawandhiya Sakina F.Husain M.R.Parate Dr. M. S. Ali Prof. J.S.Deshpande multicast routing, MANET, acknowledgement implosion, designated receiver International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 This paper presents the design, implementation, and performance of a reliable multicast transport protocol (RMTP). RMTP is based on a hierarchical structure in which receivers are grouped into local regions or domains, and in each domain there is a special receiver called a designated receiver (DR), which is responsible for sending acknowledgments periodically to the sender, for processing acknowledgments from receivers in its domain, and for retransmitting lost packets to the corresponding receivers. Since lost packets are recovered by local retransmissions rather than retransmissions from the original sender, end-to-end latency is significantly reduced and overall throughput is improved as well. Also, since only the DRs send their acknowledgments to the sender, instead of all receivers doing so, a single acknowledgment is generated per local region, which prevents acknowledgment implosion. Receivers in RMTP send their acknowledgments to the DRs periodically, thereby simplifying error recovery. In addition, lost packets are recovered by selective repeat retransmissions, leading to improved throughput at the cost of minimal additional buffering at the receivers. This paper also describes the implementation of RMTP and its performance on the Internet. http://thesai.org/Downloads/Volume1No5/Paper%2013%20Reliable%20Multicast%20Transport%20Protocol%20RMTP.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010513 eng
oai:thesai.org:10.14569/IJACSA.2010.010514 2012-07-01
Modular Neural Network Approach for Short Term Flood Forecasting: A Comparative Study Rahul P Deshmukh A. A. Ghatol Artificial neural network, Forecasting, Rainfall, Runoff, Models. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Artificial neural networks (ANNs) have recently been applied to various hydrologic problems. This research demonstrates a static neural approach, applying a modular feedforward neural network to rainfall-runoff modeling for the upper catchment of the Wardha River in India. The model is developed by processing online data over time using static modular neural network modeling. Methodologies and techniques for four models are presented in this paper, and a comparison of their short-term runoff prediction results is also conducted. The prediction results of the modular feedforward neural network with model two indicate satisfactory performance in three-hours-ahead prediction. The conclusions also indicate that the modular feedforward neural network with model two is more versatile than the others and can be considered an alternative, practical tool for predicting short-term flood flow. http://thesai.org/Downloads/Volume1No5/Paper%2014-Modular%20neural%20network%20approach%20for%20short%20term%20flood%20forecasting%20a%20comparative%20study.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010514 eng
oai:thesai.org:10.14569/IJACSA.2010.010515 2012-07-01
A Suitable Segmentation Methodology Based on Pixel Similarities for Landmine Detection in IR Images Dr G Padmavathi Dr. P. Subashini Ms. M. Krishnaveni Segmentation, Global Consistency Error, h-maxima, threshold, Landmine detection International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Identification of masked objects, especially in the detection of landmines, is always a difficult problem due to environmental interference. Here, the segmentation phase is the main focus: an initial spatial segmentation is performed to achieve a minimal number of segmented regions while preserving the homogeneity criteria of each region. This paper aims at evaluating similarity-based segmentation methods to compose the partition of objects in infra-red images. The output is a set of non-overlapping homogeneous regions that compose the pixels of the image. These extracted regions are used as the initial data structure in the feature extraction process. Experimental results conclude that the h-maxima transformation provides better results for landmine detection by taking advantage of the threshold. The relative performance of different conventional methods and the proposed method is evaluated and compared using the Global Consistency Error and Structural Content. It proves that h-maxima gives significant results that facilitate the landmine classification system more effectively. http://thesai.org/Downloads/Volume1No5/Paper%2015-A%20suitable%20segmentation%20methodology%20based%20on%20pixel%20similarities%20for%20landmine.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010515 eng
oai:thesai.org:10.14569/IJACSA.2010.010516 2012-07-01
Decision Tree Classification of Remotely Sensed Satellite Data using Spectral Separability Matrix M K Ghose Ratika Pradhan Sucheta Sushan Ghose Decision Tree Classifier (DTC), Separability Matrix, Maximum Likelihood Classifier (MLC), Stopping Criteria. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 In this paper an attempt has been made to develop a decision tree classification algorithm for remotely sensed satellite data using the separability matrix of the spectral distributions of probable classes in the respective bands. The spectral distance between any two classes is calculated from the difference between the minimum spectral value of a class and the maximum spectral value of its preceding class for a particular band. The decision tree is then constructed by recursively partitioning the spectral distribution in a top-down manner. Using the separability matrix, a threshold and a band are chosen in order to partition the training set in an optimal manner. The classified image is compared with the image classified by the classical Maximum Likelihood Classifier (MLC). The overall accuracy was found to be 98% using the decision tree method and 95% using the maximum likelihood method, with kappa values of 97% and 94% respectively. http://thesai.org/Downloads/Volume1No5/Paper%2016-Decision%20Tree%20Classification%20Of%20Remotely%20Sensed%20Satellite%20Data%20Using%20Spectral%20Separability%20Matrixpaper.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010516 eng
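The split rule this abstract describes — measure the gap between the maximum spectral value of one class and the minimum of the class that follows it, and place a threshold in any positive gap — can be sketched for a single band as follows (a sketch of the stated rule, not the authors' full recursive tree builder):

```python
def separability_split(class_ranges):
    """Given per-class (min, max) spectral values in one band, return
    thresholds placed midway between the maximum of each class and
    the minimum of the next class (sorted by class minimum), for
    every pair whose spectral distance is positive."""
    ranges = sorted(class_ranges, key=lambda r: r[0])
    thresholds = []
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        gap = lo2 - hi1              # spectral distance between classes
        if gap > 0:                  # classes separable in this band
            thresholds.append((hi1 + lo2) / 2)
    return thresholds
```

The full algorithm would compute such distances for every band, pick the band and threshold with the best separability, split the training set there, and recurse on each side.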
oai:thesai.org:10.14569/IJACSA.2010.010517 2012-07-01
Texture Based Segmentation using Statistical Properties for Mammographic Images H B Kekre Saylee Gharge International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Segmentation is a basic and important step in computer vision and image processing. For medical images specifically, accuracy is much more important than computational complexity and hence the time required by the process. But as the volume of patient data goes on increasing, it becomes necessary to think about processing time along with accuracy. In this paper, a new algorithm is proposed for texture-based segmentation using statistical properties. The probability of each intensity value of the image is calculated directly, and a new image is formed by replacing each intensity with its probability. Variance is calculated in three different ways to extract the texture features of the mammographic images. The results of the proposed algorithm are compared with the well-known GLCM and Watershed algorithms. http://thesai.org/Downloads/Volume1No5/Paper%2017-Texture%20Based%20Segmentation%20using%20Statistical%20Properties%20for%20Mammographic%20Images.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010517 eng
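The probability-replacement step described in the abstract above can be sketched as follows. This is a minimal illustration, assuming the simplest reading of the abstract (whole-image variance stands in for the paper's three variance computations, and the tiny image is made up):

```python
# Hedged sketch: each pixel's intensity is replaced by the empirical
# probability of that intensity in the image; the variance of the
# resulting probability image is then used as a texture cue.

from collections import Counter

def probability_image(img):
    """img: 2D list of intensities. Replace each intensity by its
    empirical probability over the whole image."""
    flat = [p for row in img for p in row]
    n = len(flat)
    prob = {v: c / n for v, c in Counter(flat).items()}
    return [[prob[p] for p in row] for row in img]

def variance(values):
    """Population variance of a flat list of values."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

# Hypothetical 3x3 image: intensity 0 occurs 3/9, 1 occurs 2/9, 2 occurs 4/9.
img = [[0, 0, 1],
       [0, 2, 1],
       [2, 2, 2]]
pimg = probability_image(img)
tex = variance([p for row in pimg for p in row])
```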
oai:thesai.org:10.14569/IJACSA.2010.010518 2012-07-01
Enhancement of Passive MAC Spoofing Detection Techniques Aiman Abu Samra Ramzi Abed Intrusion Detection, RSS, RTT, Denial of Service International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The failure to address all IEEE 802.11i Robust Security Networks (RSNs) vulnerabilities has forced many researchers to revisit robust and reliable Wireless Intrusion Detection Techniques (WIDTs). In this paper we propose an algorithm to enhance the performance of the correlation of two WIDTs in detecting MAC spoofing Denial of Service (DoS) attacks. The two techniques are the Received Signal Strength Detection Technique (RSSDT) and the Round Trip Time Detection Technique (RTTDT). Two sets of experiments were done to evaluate the proposed algorithm. The absence of any false negatives and the low number of false positives in all experiments demonstrated the effectiveness of these techniques. http://thesai.org/Downloads/Volume1No5/Paper%2018-Enhancement%20of%20Passive%20MAC%20Spoofing%20Detection%20Techniques.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010518 eng
oai:thesai.org:10.14569/IJACSA.2010.010519 2012-07-01
A framework for Marketing Libraries in the Post-Liberalized Information and Communications Technology Era Garimella Bhaskar Narasimha Rao Digital Libraries, University Learning, Adaptability International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 The role of the library is professional and is fast adapting to changing technological platforms. Our subscribers' perceptions of the nature of our libraries are also changing like never before, particularly in universities. In this dynamic environment the efficient presentation of the library is an essential survival tool. This holds true whether a library is contributing effectively to an institution's overall management effort or, sensitively and subtly, promoting rational attitudes in the minds of its local constituency. The citations, references and annotations referred to here are intended to provide background, ideas, techniques and inspiration for the novice as well as the experienced professional. We hope readers will find this paper interesting and useful, and that libraries may gain useful collaborations and reputation from applying the information provided by the resources identified here. Result-oriented marketing is not, on its own, a substitute for a well-run service that meets the demands of its user population, but in the 21st century even the best-run centres of learning or information service will only prosper if effort and talent are devoted to growth orientation. Almost all of us can contribute to this effort, and this paper can help coax the roots of marketing talent into a complete, harvestable and fragrant flower. http://thesai.org/Downloads/Volume1No5/Paper%2019-A%20framework%20for%20Marketing%20Libraries%20in%20the%20Post-Liberalized%20Information%20and%20Communications%20Technology%20Era.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010519 eng
oai:thesai.org:10.14569/IJACSA.2010.010520 2012-07-01
OFDM System Analysis for reduction of Inter symbol Interference Using the AWGN Channel Platform D M Bappy Ajoy Kumar Dey Susmita Saha Avijit Saha Shibani Ghosh OFDM; Inter symbol Interference; AWGN; MATLAB; algorithm. International Journal of Advanced Computer Science and Applications(IJACSA), 1(5), 2010 Orthogonal Frequency Division Multiplexing (OFDM) transmission is emerging as an important modulation technique because of its capacity to ensure a high level of robustness against interference. This project is mainly concerned with how well an OFDM system performs in reducing Inter Symbol Interference (ISI) when the transmission is made over an Additive White Gaussian Noise (AWGN) channel. Since OFDM is a low-symbol-rate, long-symbol-duration modulation scheme, it is sensible to insert a guard interval between the OFDM symbols for the purpose of eliminating the effect of ISI with increasing Signal to Noise Ratio (SNR). http://thesai.org/Downloads/Volume1No5/Paper%2020%20OFDM%20System%20Analysis%20for%20reduction%20of%20Inter%20symbol%20Interference%20Using%20the%20AWGN%20Channel%20Platform.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010520 eng
oai:thesai.org:10.14569/IJACSA.2010.010601 2012-07-01
Pause Time Optimal Setting for AODV Protocol on RPGM Mobility Model in MANETs Sayid Mohamed Abdule Suhaidi Hassan Osman Ghazali Mohammed M. Kadhum MANETs, AODV, pause time, optimal setting, RPGM International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Over the last few years, a number of routing protocols have been proposed and implemented for wireless mobile ad hoc networks. The motivation behind this paper is to discover and study the effects of pause time on the Ad hoc On-demand Distance Vector (AODV) protocol and to find the optimal node pause time setting for this protocol, with the Reference Point Group Mobility (RPGM) model used as the reference model. In order to find the best performance of a particular routing protocol, there is a need to examine a number of parameters and analyze the optimal setting of that protocol and its network configuration environment. In this experiment, the speed is fixed at 20 m/s in all scenarios while the pause time varies from scenario to scenario, to observe the effect of the pause time setting on the protocol's performance in this configuration. The outcomes of the experiment are analyzed with different parameters, such as a varying number of nodes, increasing connections and increasing pause time, and the effects of the pause time are discussed. The results show that the value of the pause time can affect the performance of the protocol; in the experiment, we found that a lower pause time gives better protocol performance. This paper is part of ongoing research on the AODV protocol under link failure, so it is important to figure out the factors which can influence the performance of the protocol. http://thesai.org/Downloads/Volume1No6/Paper_1-PAUSE_TIME_OPTIMAL_SETTING_FOR_AODV_PROTOCOL_ON_RPGM_MOBILITY_MODEL_IN_MANETs.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010601 eng
oai:thesai.org:10.14569/IJACSA.2010.010602 2012-07-01
Automating Legal Research through Data Mining M F M Firdhous Text Mining; Legal Research; Term Weighting; Vector Space International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The term legal research generally refers to the process of identifying and retrieving appropriate information necessary to support legal decision-making from past case records. At present, the process is mostly manual, but some traditional technologies such as keyword searching are commonly used to speed the process up. But a keyword search is not a comprehensive search to cater to the requirements of legal research as the search result includes too many false hits in terms of irrelevant case records. Hence the present generic tools cannot be used to automate legal research. This paper presents a framework which was developed by combining several ‘Text Mining’ techniques to automate the process overcoming the difficulties in the existing methods. Further, the research also identifies the possible enhancements that could be done to enhance the effectiveness of the framework. http://thesai.org/Downloads/Volume1No6/Paper_2-Automating_Legal_Research_through_Data_Mining.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010602 eng
oai:thesai.org:10.14569/IJACSA.2010.010603 2012-07-01
Quantization Table Estimation in JPEG Images Salma Hamdy Haytham El-Messiry Mohamed Roushdy Essam Kahlifa Digital image forensics; forgery detection; compression history; Quantization tables. International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Most digital image forgery detection techniques require the questioned image to be uncompressed and of high quality. However, most image acquisition and editing tools use the JPEG standard for image compression. The histogram of Discrete Cosine Transform coefficients contains information on the compression parameters for JPEGs and previously compressed bitmaps. In this paper we present a straightforward method to estimate the quantization table from the peaks of the histogram of DCT coefficients. The estimated table is then used with two distortion measures to classify images as untouched or forged. Testing the procedure on a large set of images gave a reasonable average estimation accuracy of 80%, which increases up to 88% with increasing quality factors. Forgery detection tests on four different types of tampering resulted in average false negative rates of 7.95% and 4.35% for the two measures respectively. http://thesai.org/Downloads/Volume1No6/Paper_3_Quantization_Table_Estimation_in_JPEG_Images_.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010603 eng
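The histogram-peak idea in the abstract above can be sketched for a single DCT frequency. This is a hedged illustration, not the paper's exact estimator: dequantized JPEG DCT coefficients cluster at integer multiples of the quantization step q, so the histogram peaks sit at 0, ±q, ±2q, ..., and a crude estimate of q is the gcd of the rounded nonzero peak locations. The synthetic coefficients and the `min_count` peak filter are assumptions for the demo.

```python
# Hedged sketch: estimate the quantization step of one DCT frequency
# from the peaks of its coefficient histogram.

import random
from collections import Counter
from math import gcd

def estimate_q(coeffs, min_count=3):
    """coeffs: dequantized DCT coefficients for one frequency.
    Returns the estimated quantization step, or None if no peaks found."""
    hist = Counter(round(c) for c in coeffs)
    # Keep well-populated histogram bins (the peaks), excluding zero.
    peaks = [v for v, c in hist.items() if c >= min_count and v != 0]
    if not peaks:
        return None
    step = 0
    for p in peaks:
        step = gcd(step, abs(p))  # peaks at multiples of q -> gcd is q
    return step or None

# Synthetic coefficients quantized with q = 6, plus small rounding noise.
random.seed(0)
coeffs = [6 * random.choice([-3, -2, -1, 0, 1, 2, 3]) + random.uniform(-0.3, 0.3)
          for _ in range(500)]
q_hat = estimate_q(coeffs)
```

Repeating this per frequency fills in the full 8x8 quantization table the abstract refers to.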
oai:thesai.org:10.14569/IJACSA.2010.010605 2012-07-01
Modified ID-Based Public key Cryptosystem using Double Discrete Logarithm Problem Chandrashekhar Meshram Public key Cryptosystem, Identity based Cryptosystem, Discrete Logarithm Problem, Double Discrete Logarithm Problem. International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 In 1984, Shamir [1] introduced the concept of an identity-based cryptosystem. In this system, each user needs to visit a key authentication center (KAC) and identify himself before joining a communication network. Once a user is accepted, the KAC will provide him with a secret key. In this way, if a user wants to communicate with others, he only needs to know the "identity" of his communication partner and the public key of the KAC. There is no public file required in this system. However, Shamir did not succeed in constructing an identity-based cryptosystem, but only an identity-based signature scheme. Meshram and Agrawal [4] have proposed an ID-based cryptosystem based on the double discrete logarithm problem, which uses the public key cryptosystem based on the double discrete logarithm problem. In this paper, we propose a modification of the ID-based cryptosystem based on the double discrete logarithm problem, consider the security against a conspiracy of some entities in the proposed system, and show the possibility of establishing a more secure system. http://thesai.org/Downloads/Volume1No6/Paper_5_Modified_ID-Based_Public_key_Cryptosystem_using_Do.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010605 eng
oai:thesai.org:10.14569/IJACSA.2010.010606 2012-07-01
Efficient Implementation of Sample Rate Converter Charanjit Singh Manjeet Singh Patterh Sanjay Sharma WiMAX; Half Band filter; FIR filter; CIC Filter; Farrow Filter; FPGA International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Within wireless base station system design, manufacturers continue to seek ways to add value and performance while increasing differentiation. Transmit/receive functionality has become an area of focus as designers attempt to address the need to move data from very high frequency sample rates to chip processing rates. The Digital Up Converter (DUC) and Digital Down Converter (DDC) are used as sample rate converters. These are important blocks in every digital communication system; hence there is a need for an effective implementation of the sample rate converter so that cost can be reduced. With recent advances in FPGA technology, more complex devices providing the high speed required in DSP applications are available. A filter implementation in an FPGA, utilizing the dedicated hardware resources, can effectively achieve application-specific integrated circuit (ASIC)-like performance while reducing development time, cost and risk. So in this paper a technique for an efficient design of a DDC for reducing the sample rate is suggested which meets the specifications of a WiMAX system. Its effective implementation also paves the way for efficient applications in VLSI designs. Different design configurations for the sample rate converter are explored: the sample rate converter can be designed using half-band filters, fixed FIR filters, poly-phase filters, CIC filters or even Farrow filters. http://thesai.org/Downloads/Volume1No6/Paper_6-Efficient_Implementation_of_Sample_Rate_Converter.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010606 eng
oai:thesai.org:10.14569/IJACSA.2010.010607 2012-07-01
Cloud Computing Through Mobile-Learning N Mallikharjuna Rao C.Sasidhar V. Satyendra Kumar Cloud Computing, Education, SAAS, Quality Teaching, Cost effective Cloud, Mobile phone, Mobile Cloud International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Cloud computing is a new technology that has various advantages and is an adoptable technology in the present scenario. The main advantage of cloud computing is that it reduces the cost of implementing hardware, software and licensing for all. This is the right time to analyze the cloud and its implementation, and to use it for the development of quality, low-cost education all over the world. In this paper, we discuss how to build on cloud computing to take education to a wider mass of students across the country. We believe cloud computing will surely improve the current system of education and improve quality at an affordable cost. http://thesai.org/Downloads/Volume1No6/Paper_7_Cloud_Computing_Through_Mobile-Learning.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010607 eng
oai:thesai.org:10.14569/IJACSA.2010.010608 2012-07-01
Robust R Peak and QRS detection in Electrocardiogram using Wavelet Transform P Sasikala Dr. R.S.D. Wahidabanu Electrocardiogram, Wavelet Transform, QRS complex, Filters, Thresholds International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 In this paper a robust R Peak and QRS detection method using the Wavelet Transform has been developed. The Wavelet Transform provides efficient localization in both time and frequency. The Discrete Wavelet Transform (DWT) has been used to extract relevant information from the ECG signal in order to perform classification. Electrocardiogram (ECG) signal feature parameters are the basis for signal analysis, diagnosis, authentication and identification performance. These parameters can be extracted from the intervals and amplitudes of the signal. The first step in extracting ECG features is the exact detection of the R Peak in the QRS complex. The accuracy of the determined temporal locations of the R Peak and QRS complex is essential for the performance of other ECG processing stages. Individuals can be identified once an ECG signature is formulated. This is initial work towards establishing that the ECG signal is a signature, like a fingerprint or retinal signature, for individual identification. Analysis is carried out using MATLAB software. The correct detection rate of the peaks is up to 99% based on the MIT-BIH ECG database. http://thesai.org/Downloads/Volume1No6/Paper_8_Robust_R_Peak_and_QRS_detection_in_Electrocardiogram_using_Wavelet_Transform.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010608 eng
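The detection step referred to in the abstract above can be illustrated in miniature. This is a deliberately simplified amplitude-threshold detector, not the paper's wavelet-based method: it finds local maxima above a threshold, separated by a refractory period, on a made-up signal with synthetic "R peaks".

```python
# Hedged sketch: simplified R-peak detection by thresholded local maxima
# with a refractory period (the wavelet front end is omitted).

def detect_r_peaks(sig, fs, thresh, refractory_s=0.2):
    """sig: sampled ECG-like signal, fs: sampling rate (Hz).
    Returns sample indices of detected R peaks."""
    refractory = int(refractory_s * fs)
    peaks = []
    for i in range(1, len(sig) - 1):
        # Local maximum above threshold, outside the refractory window.
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]:
            if not peaks or i - peaks[-1] > refractory:
                peaks.append(i)
    return peaks

# Synthetic signal: flat baseline with sharp peaks every 100 samples (fs = 100 Hz).
fs = 100
sig = [0.0] * 1000
for k in range(100, 1000, 100):
    sig[k - 1], sig[k], sig[k + 1] = 0.4, 1.0, 0.4
peaks = detect_r_peaks(sig, fs, thresh=0.6)
```

In the paper's pipeline, the DWT would first enhance the QRS energy band before a detector of this kind is applied.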
oai:thesai.org:10.14569/IJACSA.2010.010609 2012-07-01
Performance Evaluation of Node Failure Prediction QoS Routing Protocol (NFPQR) in Ad Hoc Networks Dr D Srinivas Rao Sake Pothalaiah NFPQR; C-NFPQR; CEDAR; PLBQR; TDR; QoS; Routing Protocols International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The characteristics of ad hoc networks make QoS support a very complex process, unlike traditional networks. The nodes in ad hoc wireless networks have limited power capabilities. Node failure in the network leads to problems such as network topology changes, network partitions, packet losses and low signal quality. Many QoS routing protocols, such as the Predictive Location Based QoS Routing protocol (PLBQR), Ticket-based QoS routing, the Trigger-based Distributed QoS Routing (TDR) protocol, the Bandwidth Routing (BR) protocol and the Core Extracted Distributed Routing (CEDAR) protocol, have been proposed. However, these algorithms do not consider node failures and their consequences in routing; thus most of these routing protocols do not perform well under frequent or unpredictable node failure conditions. The Node Failure Prediction QoS Routing (NFPQR) scheme provides optimal route selection by predicting the possibility of failure of a node through its power level. The NFPQR protocol has been modified into C-NFPQR (Clustered NFPQR) in order to provide power optimization using a cluster-based approach. The performance of NFPQR and C-NFPQR is evaluated using the same QoS parameters. http://thesai.org/Downloads/Volume1No6/Paper_9-Performance_Evaluation_of_Node_Failure_Prediction_QoS_Routing_Protocol_(NFPQR)_in_Ad_Hoc_Network.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010609 eng
oai:thesai.org:10.14569/IJACSA.2010.010610 2012-07-01
Microcontroller Based Home Automation System with Security Inderpreet Kaur Automation, 8051 microcontroller, LDR, LED, ADC, Relays, LCD display, Sensors, Stepper motor International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 With the advancement of technology, things are becoming simpler and easier for us. Automatic systems are being preferred over manual systems. This paper covers the basic definitions needed to understand the project and further defines the technical criteria to be implemented as part of the project. http://thesai.org/Downloads/Volume1No6/Paper_10-Microcontroller_Based_Home_Automation_System_With_Security.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010610 eng
oai:thesai.org:10.14569/IJACSA.2010.010611 2012-07-01
Characterization and Architecture of Component Based Models Er. Iqbaldeep Kaur Dr. P.K. Suri Er. Amit Verma Components, CBSD, CORBA, KOALA, EJB, Component retrieval, repositories International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Component-based Software Engineering is the most common term nowadays in the field of software development. The CBSE approach is based on the principle of 'select and use' rather than 'design and test' as in traditional software development methods. Since this trend of using and reusing components is still in its developing stage, many advantages, as well as problems, arise from the use of components. This paper is an introductory study of the essential concepts, principles and steps that underlie the available commercialized models in CBD. This research work has a scope extending to component retrieval in repositories, their management, and verification of the results. http://thesai.org/Downloads/Volume1No6/Paper_11-CHARACTERIZATION%20AND%20ARCHITECTURE%20OF%20COMPONENT%20BASED%20MODELS.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010611 eng
oai:thesai.org:10.14569/IJACSA.2010.010612 2012-07-01
Performance Improvement by Changing Modulation Methods for Software Defined Radios Bhalchandra B Godbole Dilip S. Aldar Wireless mobile communication, SDR, reconfigurability, modulation switching International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 This paper describes automatic switching of the modulation method to reconfigure transceivers of a Software Defined Radio (SDR) based wireless communication system. The programmable architecture of the software radio promotes a flexible implementation of modulation methods. This flexibility also translates into adaptivity, which is used here to optimize the throughput of a wireless network operating under varying channel conditions. The technique is robust and efficient, with a processing time overhead that still allows the SDR to maintain its real-time operating objectives. This technique is studied for digital wireless communication systems. Tests and simulations using an AWGN channel show that the SNR threshold is 5 dB for the case study. http://thesai.org/Downloads/Volume1No6/Paper_12_Performance_Improvement_by_Changing_Modulation_Met.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010612 eng
oai:thesai.org:10.14569/IJACSA.2010.010613 2012-07-01
Randomized Algorithmic Approach for Biclustering of Gene Expression Data Sradhanjali Nayak Debahuti Mishra Satyabrata Das Amiya Kumar Rath Bicluster; microarray data; gene expression; randomized algorithm International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Microarray data processing revolves around the pivotal issue of locating genes that alter their expression in response to pathogens, other organisms or multiple environmental conditions, based on a comparison between infected and uninfected cells or tissues. Simultaneous observation and monitoring of the expression levels of multiple genes makes possible a comprehensive analysis of the effects of certain treatments, diseases and developmental stages, embodied as a data matrix of gene expression data. Clustering is the process of grouping genes into clusters, considering either one row at a time (row clustering) or one column at a time (column clustering). The application of the clustering approach is crippled by conditions which are unrelated to genes. To overcome these problems, a unique form of clustering technique has evolved which offers simultaneous clustering of both rows and columns, known as biclustering. A bicluster is a submatrix of data values, obtained by removing some of the rows and some of the columns of the given data matrix in such a fashion that each remaining row reads the same string. A fast, simple and efficient randomized algorithm is explored in this paper, which discovers the largest bicluster by random projections. http://thesai.org/Downloads/Volume1No6/Paper_13_Randomized_Algorithmic_Approach_for_Biclustering_of_Gene_Expression_Data.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010613 eng
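The random-projection idea described in the abstract above can be sketched as follows. This is a minimal illustration under stated assumptions (the tiny matrix, the trial count and the fixed projection width `k` are made up for the demo), not the paper's algorithm verbatim: project the matrix onto a small random subset of columns, group rows that read the same string on those columns, and keep the largest group as a candidate bicluster; repeating with fresh projections tends to find the largest bicluster.

```python
# Hedged sketch: largest constant-row bicluster via random column projections.

import random

def candidate_bicluster(matrix, k, rng):
    """One random projection: pick k columns, group rows by their values
    on those columns, return (rows, cols) of the largest group."""
    n_cols = len(matrix[0])
    cols = sorted(rng.sample(range(n_cols), k))
    groups = {}
    for r, row in enumerate(matrix):
        key = tuple(row[c] for c in cols)   # the "string" this row reads
        groups.setdefault(key, []).append(r)
    rows = max(groups.values(), key=len)
    return rows, cols

def largest_bicluster(matrix, k=2, trials=200, seed=1):
    """Repeat random projections; keep the candidate with the largest area."""
    rng = random.Random(seed)
    best = ([], [])
    for _ in range(trials):
        rows, cols = candidate_bicluster(matrix, k, rng)
        if len(rows) * len(cols) > len(best[0]) * len(best[1]):
            best = (rows, cols)
    return best

# Rows 0, 1 and 3 agree on columns 1 and 3: a planted 3x2 bicluster.
m = [[1, 5, 2, 7],
     [3, 5, 4, 7],
     [0, 1, 2, 3],
     [9, 5, 6, 7]]
rows, cols = largest_bicluster(m)
```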
oai:thesai.org:10.14569/IJACSA.2010.010614 2012-07-12
A Method of Genetic Algorithm (GA) for FIR Filter Construction: Design and Development with Newer Approaches in Neural Network Platform Ajoy Kumar Dey Avijit Saha Shibani Ghosh Genetic Algorithm; FIR; filter design; optimization; neural network. International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The main focus of this paper is to describe a dynamic method of designing finite impulse response (FIR) filters automatically, rapidly and with low computational complexity using an efficient genetic approach. To obtain such efficiency, a specific filter coefficient coding scheme has been studied and implemented. The algorithm generates a population of genomes that represent the filter coefficients, where new genomes are generated by crossover and mutation operations. Our proposed genetic technique is able to give better results compared to other methods. http://thesai.org/Downloads/Volume1No6/Paper_14%20A%20Method%20of%20Genetic%20Algorithm%20(GA)%20for%20FIR%20Filter%20Construction%20Design%20and%20Development%20with%20Newer%20Approaches%20in%20Neural%20Network%20Platform.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010614 eng
oai:thesai.org:10.14569/IJACSA.2010.010615 2012-07-01
Hybrid Technique for Human Face Emotion Detection Renu Nagpal Pooja Nagpal Sumeet Kaur biometrics; adaptive median filter; bacteria foraging optimization; feature detection; facial expression International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 This paper presents a novel approach for the detection of emotions using the cascading of Mutation Bacteria Foraging Optimization (MBFO) and an Adaptive Median Filter (AMF) in a highly corrupted noisy environment. The approach involves removing noise from the image by the combination of MBFO and AMF and then detecting local, global and statistical features from the image. The Bacterial Foraging Optimization Algorithm (BFOA), as it is now called, is currently gaining popularity in the research community for its effectiveness in solving certain difficult real-world optimization problems. Our results so far show the approach to have a promising success rate. An automatic system for the recognition of facial expressions is based on a representation of the expression, learned from a training set of pre-selected meaningful features. However, in reality the noise that may be embedded in an image will affect the performance of face recognition algorithms. As a first step, we investigate emotionally intelligent computers which can perceive human emotions. In this research paper four emotions, namely anger, fear and happiness along with neutral, are tested from a database under salt-and-pepper noise. A very high recognition rate has been achieved for all emotions, along with neutral, on the training dataset as well as a user-defined dataset. The proposed method uses the cascading of MBFO and AMF for the removal of noise, and neural networks by which emotions are classified. http://thesai.org/Downloads/Volume1No6/Paper_15_Hybrid_Technique_for_Human_Face_Emotion_Detection.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010615 eng
oai:thesai.org:10.14569/IJACSA.2010.010616 2012-07-01
Design Strategies for AODV Implementation in Linux Ms Prinima Gupta Dr. R. K Tuteja Ad-hoc Networking, AODV, MANET. International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 In a Mobile Ad hoc Network (MANET), mobile nodes construct a network; nodes may join and leave at any time, and the topology changes dynamically. Routing in a MANET is challenging because of the dynamic topology and the lack of an existing fixed infrastructure. In this paper, we explore the difficulties encountered in implementing MANET routing protocols in real operating systems, and study the common requirements imposed by MANET routing on the underlying operating system services. The paper also explains implementation techniques for the AODV protocol to determine the needed events, such as snooping, kernel modification and Netfilter. In addition, this paper presents a discussion of the advantages as well as disadvantages of each implementation of this architecture in Linux. http://thesai.org/Downloads/Volume1No6/Paper_16_DESIGN_STRATEGIES_FOR_AODV_IMPLEMENTATION_IN_LINUX.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010616 eng
oai:thesai.org:10.14569/IJACSA.2010.010617 2012-07-01
Model Based Test Case Prioritization For Testing Component Dependency In CBSD Using UML Sequence Diagram Arup Abhinna Acharya Vuda Sreenivasarao Namita Panda Regression Testing, Object Interaction Graph, Test Cases, CBSD International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Software maintenance is an important and costly activity of the software development lifecycle. To ensure proper maintenance, the software undergoes regression testing. It is very inefficient to re-execute every test case in regression testing for small changes. Hence test case prioritization is a technique to schedule the test cases in an order that maximizes some objective function. A variety of objective functions are applicable; one such function involves the rate of fault detection - a measure of how quickly faults are detected within the testing process. Early fault detection can provide faster feedback, generating scope for debuggers to carry out their task at an early stage. In this paper we propose a method to prioritize the test cases for testing component dependency in a Component Based Software Development (CBSD) environment using a greedy approach. An Object Interaction Graph (OIG) is generated from the UML sequence diagrams for interdependent components. The OIG is traversed to calculate the total number of inter-component and intra-component object interactions. Depending upon the number of interactions, the objective function is calculated and the test cases are ordered accordingly. This technique was applied to components developed in Java for a software system and found to be very effective in early fault detection as compared to the non-prioritized approach. http://thesai.org/Downloads/Volume1No6/Paper_17_Model_Based_Test_Case_Prioritization_For_Testing_Component_Dependency_In_CBSD_Using_UML_Sequence_Diagram.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010617 eng
oai:thesai.org:10.14569/IJACSA.2010.010618 2012-07-01
EM Wave Transport 2D and 3D Investigations Rajveer S Yaduvanshi Harish Parthasarathy Boltzmann Transport Equation, Probability Distribution Function, Coupled BTE-Maxwell’s. International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The Boltzmann Transport Equation [1-2] has been modelled in close conjunction with Maxwell’s equations, and investigations of 2D and 3D transport carriers are proposed. The exact solution of the Boltzmann equation still remains a core field of research. We have worked towards the evaluation of 2D and 3D solutions of the BTE. An application of our work can be extended to the study of electromagnetic wave transport in the upper atmosphere, i.e. the ionosphere. We give a theoretical and numerical analysis of the probability density function and the collision integral under various initial and final conditions. Modelling of the coupled Boltzmann-Maxwell equations, taking binary collision and multi-species collision terms, has been evaluated. Solutions for the electric field (E) and magnetic field (B) under coupled conditions have been obtained. PDF convergence in the absence of an electric field has been sketched with an iterative approach and is shown in figure 1. A general 3D algorithm for the solution of the BTE is also suggested. http://thesai.org/Downloads/Volume1No6/Paper_18_EM__WAVE__TRANSPORT_2D3D_INVESTIGATIONS.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010618 eng
oai:thesai.org:10.14569/IJACSA.2010.010619 2012-07-01
A Study on Associative Neural Memories B.D C.N Prasad P E S N Krishna Prasad Sagar Yeruva P Sita Rama Murty Associative memories; SAM; DAM; Hopfield model; BAM; Holographic Associative Memory (HAM); Context-sensitive Auto-associative Memory (CSAM); Context-sensitive Asynchronous Memory (CSYM) International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Memory plays a major role in Artificial Neural Networks. Without memory, a neural network cannot learn by itself. One of the primary concepts of memory in neural networks is associative neural memories. A survey has been made of associative neural memories such as Simple Associative Memories (SAM), Dynamic Associative Memories (DAM), Bidirectional Associative Memories (BAM), Hopfield memories, Context-Sensitive Auto-associative Memories (CSAM) and so on. These memories can be applied in various fields to obtain effective outcomes. We present a study of these associative memories in artificial neural networks. http://thesai.org/Downloads/Volume1No6/Paper_19_A_Study_on_Associative_Neural_Memories.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010619 eng
oai:thesai.org:10.14569/IJACSA.2010.010620 2012-07-01
Adaptive Channel Estimation Techniques for MIMO OFDM Systems Md Masud Rana Md. Kamal Hosain MIMO; NLMS; OFDM; RLS International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 In this paper, normalized least mean square (NLMS) and recursive least squares (RLS) adaptive channel estimators are described for multiple input multiple output (MIMO) orthogonal frequency division multiplexing (OFDM) systems. These channel estimation (CE) methods use adaptive estimators which are able to update the parameters of the estimator continuously, so that knowledge of channel and noise statistics is not necessary. The NLMS/RLS CE algorithms require knowledge of the received signal only. Simulation results demonstrate that the RLS CE method has better performance compared to the NLMS CE method for MIMO OFDM systems. In addition, utilizing more antennas at the transmitter and/or receiver provides much higher performance compared with fewer antennas. Furthermore, the RLS CE algorithm provides a faster convergence rate compared to the NLMS CE method. Therefore, in order to combat greater channel dynamics, the RLS CE algorithm is the better choice for MIMO OFDM systems. http://thesai.org/Downloads/Volume1No6/Paper_20_ADAPTIVE_CHANNEL_ESTIMATION_TECHNIQUES_FOR_MIMO_OFDM_SYSTEMS.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010620 eng
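The abstract above does not reproduce the estimator's update equations. As a rough illustration only, here is the standard single-channel NLMS tap update that such estimators build on, sketched in Python; the function name, tap count, and step size are assumptions, and this is not the paper's MIMO OFDM formulation:

```python
import numpy as np

def nlms_estimate(x, d, taps=4, mu=0.5, eps=1e-8):
    """Estimate an FIR channel from input x and observed output d
    using the normalized LMS update (generic sketch, hypothetical API)."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # most recent samples first
        e = d[n] - w @ u                  # a-priori estimation error
        w += mu * e * u / (u @ u + eps)   # normalized gradient step
    return w                              # converges toward the true taps
```

The normalization by `u @ u` is what distinguishes NLMS from plain LMS: the step size adapts to the input power, which is why, as the abstract notes, no prior knowledge of channel or noise statistics is needed.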
oai:thesai.org:10.14569/IJACSA.2010.010621 2012-07-01
The Impact of Social Networking Websites to Facilitate the Effectiveness of Viral Marketing Abed Abedniya Sahar Sabbaghi Mahmouei Social networks website, viral marketing, structural equation modeling International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The Internet and the World Wide Web have become two key components in today's technology-based organizations and businesses. As the Internet becomes more and more popular, it is starting to make a big impact on people's day-to-day lives. As a result of this revolutionary transformation towards modern technology, social networking on the World Wide Web has become an integral part of a large number of people's lives. Social networks are websites which allow users to communicate, share knowledge about similar interests, discuss favorite topics, review and rate products/services, etc. These websites have become a powerful source in shaping public opinion on virtually every aspect of commerce. Marketers are challenged with identifying influential individuals in social networks and connecting with them in ways that encourage viral marketing content movement, and there has been little empirical research on the ability of these websites to diffuse viral marketing content. In this article, we explore the role of social network websites in influencing viral marketing, and the characteristics of the most influential users in spreading viral content. Structural equation modeling is used to examine the patterns of inter-correlations among the constructs and to empirically test the hypotheses. http://thesai.org/Publication/IJACSA/Archives/Volume1No6.aspx The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010621 eng
oai:thesai.org:10.14569/IJACSA.2010.010622 2012-07-01
A Face Replacement System Based on Face Pose Estimation Kuo Yu Chiu Shih-Che Chien Sheng-Fuu Lin Facial feature, Face replacement, Neural network, Support vector machine (SVM) International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Face replacement systems play an important role in the entertainment industries. However, most of these systems nowadays are assisted by hand and specific tools. In this paper, a new face replacement system for automatically replacing a face using image processing techniques is described. The system is divided into two main parts: facial feature extraction and face pose estimation. In the first part, the face region is determined and the facial features are extracted and located. Eyes, mouth, and chin curve are extracted by their statistical and geometrical properties. These facial features are used as the information for the second part. A neural network is adopted here to classify the face pose according to the feature vectors which are obtained from the different ratios of facial features. Experiments and comparisons show that this system works well when dealing with different poses, especially non-frontal face poses. http://thesai.org/Downloads/Volume1No6/Paper_22_A_Face_Replacement_System_Based_on_Face_Pose_Estimation.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010622 eng
oai:thesai.org:10.14569/IJACSA.2010.010623 2012-07-01
A Comprehensive Analysis of Spoofing P Ramesh Babu D.Lalitha Bhaskari CH.Satyanarayana Spoofing, Filtering, Attacks, Information, Trust International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 The main intention of this paper is to educate students, computer users and novice researchers about spoofing attacks. Spoofing means impersonating another person or computer, usually by providing false information (e-mail name, URL or IP address). Spoofing can take on many forms in the computer world, all of which involve some type of false representation of information. There are a variety of methods and types of spoofing. We introduce and explain the following spoofing attacks in this paper: IP, ARP, e-mail, Web, and DNS spoofing. There are no legal or constructive uses for implementing spoofing of any type. Some of the motives might be sport, theft, vindication or some other malicious goal. The magnitude of these attacks can be very severe and can cost millions of dollars. This paper describes various spoofing types and gives a brief view of the detection and prevention of spoofing attacks. http://thesai.org/Downloads/Volume1No6/Paper_23_A_Comprehensive_Analysis_of_Spoofing.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010623 eng
oai:thesai.org:10.14569/IJACSA.2010.010624 2012-07-01
Key Management Techniques for Controlling the Distribution and Update of Cryptographic keys T Lalith R.Umarani G.M.Kadharnawaz International Journal of Advanced Computer Science and Applications(IJACSA), 1(6), 2010 Key management plays a fundamental role in cryptography as the basis for securing cryptographic techniques providing confidentiality, entity authentication, data origin authentication, data integrity, and digital signatures. The goal of a good cryptographic design is to reduce more complex problems to the proper management and safe-keeping of a small number of cryptographic keys, ultimately secured through trust in hardware or software by physical isolation or procedural controls. Reliance on physical and procedural security (e.g., secured rooms with isolated equipment), tamper-resistant hardware, and trust in a large number of individuals is minimized by concentrating trust in a small number of easily monitored, controlled, and trustworthy elements. http://thesai.org/Downloads/Volume1No6/Paper_24_Key_Management_Techniques_for_Controlling_the_Distribution_and_Update_of_Cryptographic_keys.pdf The Science and Information (SAI) Organization 2010 text http://dx.doi.org/10.14569/IJACSA.2010.010624 eng
oai:thesai.org:10.14569/IJACSA.2011.020101 2012-07-01
Computing knowledge and Skills Demand: A Content Analysis of Job Adverts in Botswana Y. Ayalew Z. A. Mbero T. Z. Nkgau P. Motlogelwa A. Masizana-Katongo computing job adverts; job adverts in Botswana; content analysis International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 This paper presents the results of a content analysis of computing job adverts to assess the types of skills required by employers in Botswana. Through the study of job adverts for computing professionals over one year (i.e., January 2008 to December 2008), we identified the types of skills required by employers for early career positions. The job adverts were collected from 7 major newspapers (published both daily and weekly) that are circulated throughout the country. The findings of the survey have been used for the revision and development of curricula for undergraduate degree programmes at the Department of Computer Science, University of Botswana. The content analysis focused on the identification of the most sought-after types of qualifications (i.e., degree types), job titles, skills, and industry certifications. Our analysis reveals that the majority of the adverts did not express a preference for a particular type of computing degree. Furthermore, our findings indicate that the job titles and computing skills in high demand are not consistent with previous studies carried out in developed countries. This requires further investigation to identify the reasons for these differences from the perspective of the practices in the IT industry. It also requires further investigation regarding the degree of mismatch between employers' computing skills demands and the knowledge and skills provided by academic programmes in the country. 
http://thesai.org/Downloads/Volume2No1/Paper%201-Computing%20knowledge%20and%20Skills%20Demand%20A%20Content%20Analysis%20of%20Job%20Adverts%20in%20Botswana.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020101 eng
oai:thesai.org:10.14569/IJACSA.2011.020102 2012-07-01
Open Source Software in Computer Science and IT Higher Education: A Case Study Dan R Lipsa Robert S. Laramee open source software (OSS), free software International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 The importance and popularity of open source software have increased rapidly over the last 20 years. This is due to the variety of advantages open source software has to offer and also the wide availability of the Internet in the early nineties. We identify and describe important open source software characteristics and then present a case study using open source software to teach three Computer Science and IT courses for one academic year. We compare how well open source software and proprietary software fulfill our educational requirements and goals, and present some of the advantages of using Open Source Software (OSS). Finally, we report on our experiences of using open source software in the classroom and describe the benefits and drawbacks of using this type of software over common proprietary software from both a financial and an educational point of view. http://thesai.org/Downloads/Volume2No1/Paper%202-Open%20source%20software%20in%20computer%20science%20and%20IT%20higher%20education.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020102 eng
oai:thesai.org:10.14569/IJACSA.2011.020103 2012-07-01
Analyzing the Load Balance of Term-based Partitioning Ahmad Abusukhon Mohammad Talib Term-partitioning schemes, Term-frequency partitioning, Term-length partitioning, Node utilization, Load balance International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 In parallel information retrieval (IR) systems, where a large-scale collection is indexed and searched, the query response time is limited by the time of the slowest node in the system. Thus distributing the load equally across the nodes is a very important issue. There are mainly two methods for collection indexing, namely document-based and term-based indexing. In term-based partitioning, the terms of the global index of a large-scale data collection are distributed or partitioned equally among nodes, and then a given query is divided into sub-queries and each sub-query is directed to the relevant node. This provides high query throughput and concurrency but poor parallelism and load balance. In this paper, we introduce new methods for term partitioning and then compare the results from our methods with the results from previous work with respect to load balance and query response time. http://thesai.org/Downloads/Volume2No1/Paper%203-Analyzing%20the%20Load%20Balance%20of%20Term-based%20Partitioning.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020103 eng
oai:thesai.org:10.14569/IJACSA.2011.020104 2012-07-01
A Genetic Algorithm for Solving Travelling Salesman Problem Adewole Philip Akinwale Adio Taofiki Otunbanowo Kehinde Genetic Algorithm, Generation, Mutation rate, Population, Travelling Salesman Problem International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 In this paper we present a Genetic Algorithm for solving the Travelling Salesman Problem (TSP). The Genetic Algorithm, which is a very good local search algorithm, is employed to solve the TSP by generating a preset number of random tours and then improving the population until a stop condition is satisfied, at which point the best chromosome, which is a tour, is returned as the solution. Analysis of the algorithmic parameters (Population, Mutation Rate and Cut Length) was done so as to know how to tune the algorithm for various problem instances. http://thesai.org/Downloads/Volume2No1/Paper%204-A%20Genetic%20Algorithm%20for%20Solving%20Travelling%20Salesman%20Problem.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020104 eng
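The GA scheme this abstract describes (a population of random tours, improved generation by generation, with population size and mutation rate as tunable parameters) can be sketched in a few lines of Python. The operator choices below (truncation selection, order crossover, swap mutation) are assumptions for illustration; the abstract does not specify the paper's exact operators:

```python
import random

def tour_length(tour, dist):
    # Total cycle length, returning to the start city.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def crossover(p1, p2):
    # Order crossover (OX): copy a slice from p1, fill the rest in p2's order.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def ga_tsp(dist, pop_size=50, generations=200, mutation_rate=0.05, seed=0):
    random.seed(seed)
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        survivors = pop[: pop_size // 2]           # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = crossover(p1, p2)
            if random.random() < mutation_rate:    # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda t: tour_length(t, dist))
```

Tuning `pop_size`, `generations`, and `mutation_rate` trades runtime against solution quality, which is the parameter analysis the abstract refers to.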
oai:thesai.org:10.14569/IJACSA.2011.020105 2012-07-01
Grid Approximation Based Inductive Charger Deployment Technique in Wireless Sensor Networks Fariha Tasmin Jaigirdar Mohammad Mahfuzul Islam Sikder Rezwanul Huq wireless sensor network; energy efficiency; network security; grid approximation; inductive charger. International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 Ensuring sufficient power in a sensor node is a challenging problem nowadays in providing the level of security and data processing capability demanded by the various applications running in a wireless sensor network. The size of sensor nodes and the limitations of battery technologies do not allow inclusion of high energy in a sensor. Recent technologies suggest that the deployment of inductive chargers can solve the power problem of sensor nodes by recharging the batteries of sensors in a complex and sensitive environment. This paper provides a novel grid approximation algorithm for efficient and low-cost deployment of inductive chargers so that the minimum number of chargers, along with their placement locations, can charge all the sensors of the network. The algorithm proposed in this paper is a generalized one and can also be used in various applications, including the measurement of network security strength by estimating the minimum number of malicious nodes that can destroy the communication of all the sensors. Experimental results show the effectiveness of the proposed algorithm and the impacts of its different parameters on the performance measures. http://thesai.org/Downloads/Volume2No1/Paper%205-Grid%20Approximation%20Based%20Inductive%20Charger%20Deployment%20Technique%20in%20Wireless%20Sensor%20Networks.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020105 eng
oai:thesai.org:10.14569/IJACSA.2011.020106 2012-07-01
PAV: Parallel Average Voting Algorithm for Fault-Tolerant Systems Abbas Karimi Faraneh Zarafshan Adznan b. Jantan Fault-tolerant; Voting Algorithm; Parallel Algorithm; Divide and Conquer. International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 Fault-tolerant systems are systems that can continue their operation even in the presence of faults. Redundancy, as one of the main techniques in the implementation of fault-tolerant control systems, uses voting algorithms to choose the most appropriate value among multiple redundant and possibly faulty results. The average (mean) voter is one of the most common voting methods and is suitable for decision making in highly-available and long-mission applications in which the availability and speed of the system are critical. In this paper we introduce a new generation of average voter based on parallel algorithms, called the parallel average voter. The analysis shows that this algorithm has a better time complexity, O(log n), in comparison with its sequential counterpart and is especially appropriate for applications where the size of the input space is large. http://thesai.org/Downloads/Volume2No1/Paper%206-PAV%20Parallel%20Average%20Voting%20Algorithm%20for%20Fault%20Tolerant%20Systems.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020106 eng
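The paper's exact voter is not reproduced in the abstract. As a minimal sketch of the underlying idea, assuming a plain divide-and-conquer reduction, here is how averaging redundant module outputs attains O(log n) time depth: the two recursive halves at each level are independent, so with enough processors every level of the recursion tree runs in parallel.

```python
def parallel_average(values):
    """Mean voter over redundant module outputs, structured as a
    divide-and-conquer reduction (illustrative sketch, run serially here)."""
    def reduce_sum(vals):
        if len(vals) == 1:
            return vals[0]
        mid = len(vals) // 2
        left = reduce_sum(vals[:mid])    # independent of the right half:
        right = reduce_sum(vals[mid:])   # could run on a separate processor
        return left + right
    return reduce_sum(values) / len(values)
```

A sequential sum visits the n inputs one by one (O(n) steps), while the reduction tree above has depth O(log n), which matches the complexity advantage the abstract claims for the parallel voter.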
oai:thesai.org:10.14569/IJACSA.2011.020107 2012-07-01
Solution of Electromagnetic and Velocity Fields for an Electrohydrodynamic Fluid Dynamical System Rajveer S Yaduvanshi Harish Parthasarathy Permittivity tuning, Incompressible fluid, Navier-Maxwell’s coupled equations, resonance frequency reconfigurability. International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 We studied the temporal evolution of the electromagnetic and velocity fields in an incompressible conducting fluid by means of computer simulations from the Navier-Stokes and Maxwell’s equations. We then derived the set of coupled partial differential equations for the stream function vector field and the electromagnetic field. These equations are first order difference equations in time and bring simplicity to discretization. The spatial partial derivatives are converted into partial difference equations. The fluid system of equations is thus approximated by a nonlinear state variable system. This system makes use of the Kronecker tensor product. The final system takes account of anisotropic permittivity. The conductivity and magnetic permeability of the fluid are assumed to be homogeneous and isotropic. The present work describes the characterization of a magnetohydrodynamic medium that is anisotropic in permittivity. An efficient and modified novel numerical solution using the tensor product has also been proposed. This numerical technique is potentially much faster and provides compatibility in matrix operations. Application of our characterization technique should be very useful in tuning the permittivity of liquid crystal polymer, plasma and dielectric lens antennas for obtaining wide bandwidth, resonance frequency reconfigurability and better beam control. 
http://thesai.org/Downloads/Volume2No1/Paper%207-Solution%20of%20Electromagnetic%20and%20Velocity%20Fields%20for%20an%20Electrohydrodynamic%20Fluid%20Dynamical%20System.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020107 eng
oai:thesai.org:10.14569/IJACSA.2011.020108 2012-07-01
Survey of Wireless MANET Application in Battlefield Operations Dr C Rajabhushanam Dr. A. Kathirvel MANET; routing; protocols; wireless; simulation International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 In this paper, we present a framework for performance analysis of wireless MANETs in a combat/battlefield environment. The framework uses a cross-layer design approach where four different kinds of routing protocols are compared and evaluated in the area of security operations. The resulting scenarios are then carried out in a simulation environment using the NS-2 simulator. Research efforts also focus on issues such as Quality of Service (QoS), energy efficiency, and security, which already exist in wired networks and are worsened in MANETs. This paper examines the routing protocols and their newest improvements. Classification of routing protocols into source routing and hop-by-hop routing is described in detail, and four major categories of state routing are elaborated and compared. We discuss the metrics used to evaluate these protocols and highlight the essential problems in the evaluation process itself. The results show better performance with respect to parameters such as network throughput, end-to-end delay and routing overhead when compared to a network architecture which uses a standard routing protocol. Due to the nature of node distribution, the performance measure of path reliability, which distinguishes ad hoc networks from other types of networks in battlefield conditions, is given more significance in our research work. http://thesai.org/Downloads/Volume2No1/Paper%208-Survey%20of%20Wireless%20MANET%20Application%20in%20Battlefield%20Operations.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020108 eng
oai:thesai.org:10.14569/IJACSA.2011.020109 2012-07-01
An Efficient Resource Discovery Methodology for HPGRID Systems D.Doreen Hephzibah Miriam K.S.Easwarakumar Peer-to-Peer; Grid; Hypercube; Isomorphic partitioning; Resource Discovery International Journal of Advanced Computer Science and Applications(IJACSA), 2(1), 2011 An efficient resource discovery mechanism is one of the fundamental requirements for grid computing systems, as it aids in resource management and the scheduling of applications. Resource discovery involves searching for the appropriate resource types that match the user’s application requirements. Classical approaches to Grid resource discovery are either centralized or hierarchical, and become inefficient when the scale of Grid systems increases rapidly. On the other hand, the Peer-to-Peer (P2P) paradigm has emerged as a successful model for achieving scalability in distributed systems. A Grid system using P2P technology can improve on the central control of the traditional grid and mitigate the single point of failure. In this paper, we propose a new approach based on P2P techniques for resource discovery in grids, using a Hypercubic P2P Grid (HPGRID) topology connecting the grid nodes. A scalable, fault-tolerant, self-configuring search algorithm is proposed as the Parameterized HPGRID algorithm, using an isomorphic partitioning scheme. By design, the algorithm improves the probability of reaching all the working nodes in the system, even in the presence of non-alive nodes (inaccessible, crashed or nodes loaded by heavy traffic). The scheme can adapt to the complex, heterogeneous and dynamic resources of the grid environment, and has better scalability. http://thesai.org/Downloads/Volume2No1/Paper%209-An%20Efficient%20Resource%20Discovery%20Methodology%20for%20HPGRID%20Systems.pdf The Science and Information (SAI) Organization 2011 text http://dx.doi.org/10.14569/IJACSA.2011.020109 eng