IJACSA Volume 10 Issue 4

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, provided the original work is properly cited.

View Full Issue

Paper 1: Global and Local Characterization of Rock Classification by Gabor and DCT Filters with a Color Texture Descriptor

Abstract: In the automatic classification of colored natural textures, the idea of proposing methods that reflect human perception arouses the enthusiasm of researchers in image processing and computer vision. The color space and the methods of analyzing color and texture must therefore be discriminating enough to correspond to human vision. Rock images are a typical example of natural images, and their analysis is of major importance in the rock industry. In this paper, we combine statistical descriptors (Local Binary Pattern (LBP) with a fusion of the Hue Saturation Value (HSV) and Red Green Blue (RGB) color spaces) and frequency descriptors (Gabor filter and Discrete Cosine Transform (DCT)), named respectively Gabor Adjacent Local Binary Pattern Color Space Fusion (G-ALBPCSF) and DCT Adjacent Local Binary Pattern Color Space Fusion (D-ALBPCSF), to extract visual textural and colorimetric features from direct-view images of rocks. The textural images from the two approaches are evaluated through similarity metrics such as Chi2 and histogram intersection, which we have adapted to color histograms. The results obtained highlight the discrimination of the rock classes. The proposed extraction method provides better classification results for various direct-view rock texture images, and is validated by a confusion matrix giving a low classification error rate of 0.8%.
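
As a rough illustration of the similarity metrics mentioned in the abstract, the sketch below compares two L1-normalized color histograms with the chi-squared distance and histogram intersection; the random histograms are stand-ins, not descriptors actually produced by G-ALBPCSF or D-ALBPCSF.

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    """Chi-squared distance between two normalized histograms (0 = identical)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def histogram_intersection(h1, h2):
    """Histogram intersection similarity (1.0 = identical for L1-normalized input)."""
    return np.sum(np.minimum(h1, h2))

# Hypothetical 256-bin color histograms (e.g., quantized HSV), L1-normalized.
rng = np.random.default_rng(0)
h_query = rng.random(256); h_query /= h_query.sum()
h_ref = rng.random(256); h_ref /= h_ref.sum()

print(chi2_distance(h_query, h_ref), histogram_intersection(h_query, h_ref))
```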

Author 1: J Wognin Vangah
Author 2: Sié Ouattara
Author 3: Gbélé Ouattara
Author 4: Alain Clement

Keywords: Rock; classification; G-ALBPCSF; D-ALBPCSF; LBP; Gabor; DCT; RGB; HSV; color texture

Download PDF

Paper 2: Adoption of the Internet of Things (IoT) in Agriculture and Smart Farming towards Urban Greening: A Review

Abstract: It is essential to increase the productivity of agricultural and farming processes to improve yields and cost-effectiveness with new technology such as the Internet of Things (IoT). In particular, IoT can make agricultural and farming industry processes more efficient by reducing human intervention through automation. In this study, the aim is to analyze recently developed IoT applications in the agriculture and farming industries to provide an overview of sensor data collection, technologies, and sub-verticals such as water management and crop management. In this review, data is extracted from 60 peer-reviewed scientific publications (2016-2018) with a focus on IoT sub-verticals and sensor data collected for measurements used to make accurate decisions. Our results from the reported studies show that water management is the highest sub-vertical (28.08%), followed by crop management (14.60%) and then smart farming (10.11%). From the data collection, livestock management and irrigation management resulted in the same percentage (5.61%). With regard to sensor data collection, the highest results were for the measurement of environmental temperature (24.87%) and environmental humidity (19.79%). There are also other sensor data regarding soil moisture (15.73%) and soil pH (7.61%). Research indicates that of the technologies used in IoT application development, Wi-Fi is the most frequently used (30.27%), followed by mobile technology (21.10%). As per our review, we can conclude that the agricultural sector (76.1%) is researched considerably more than the farming sector (23.8%). This study should be used as a reference for members of the agricultural industry seeking to improve and develop the use of IoT to enhance agricultural production efficiency. It also provides recommendations for future research to include IoT systems' scalability, heterogeneity aspects, IoT system architecture, data analysis methods, size or scale of the observed land or agricultural domain, IoT security and threat solutions/protocols, operational technology, data storage, cloud platforms, and power supplies.

Author 1: A. A Raneesha Madushanki
Author 2: Malka N Halgamuge
Author 3: W. A. H. Surangi Wirasagoda
Author 4: Ali Syed

Keywords: Internet of Things; IoT; agricultural; smart farming; business; sensor data; automation

Download PDF

Paper 3: e-Learning Tools on the Healthcare Professional Social Networks

Abstract: According to many studies, professional social networks are not widespread in the healthcare environment, especially among doctors. The article is devoted to two advanced digital tools that can improve the image of professional social networks and increase motivation to use them. The first tool is the inclusion of e-learning, both to increase the level of knowledge and to confirm qualification skills among professionals. The second tool is a developed test-constructor system. The article describes the solution being developed as Internet resources for mass use in professional healthcare.

Author 1: Evgeny Nikulchev
Author 2: Dmitry Ilin
Author 3: Bladimir Belov
Author 4: Pavel Kolyasnikov
Author 5: Alexander Kosenkov

Keywords: Healthcare professional social networks; advanced digital tools; e-learning; test constructor

Download PDF

Paper 4: Systematic Literature Review (SLR) of Resource Scheduling and Security in Cloud Computing

Abstract: Resource scheduling in cloud computing is a complex task due to the number and variety of resources available and the volatility of resource usage patterns, considering that the resource setting resides with the service provider. This is compounded further when security issues are also factored in. This paper provides a Systematic Literature Review (SLR) that helps to identify as much relevant prior research as possible in the area of the research topic. Also, all papers found in the search are classified into groups to establish the current situation and to identify possible existing gaps.

Author 1: Abdullah Sheikh
Author 2: Malcolm Munro
Author 3: David Budgen

Keywords: Cloud computing; security; resource scheduling; systematic literature review; SLR

Download PDF

Paper 5: Towards a Mechanism for Protecting Seller’s Interest of Cash on Delivery by using Smart Contract in Hyperledger

Abstract: In emerging economies, with the explosion of e-commerce, payment methods have increasingly enhanced security. However, the Cash-on-Delivery (COD) payment method still prevails in cash-based economies. Although COD allows consumers to be more proactive in making payments, it remains vulnerable due to the involvement of a third party (shipping companies). In this paper, we propose a payment system based on a “smart contract” implemented on top of blockchain technology to minimize risks for the parties. The platform consists of a set of rules that each party must follow, including specific delivery time and place, cost of delivery, and mortgage money, thereby forcing the parties to be responsible for their tasks in order to complete the contract. We also provide a detailed implementation to illustrate the efficiency of our model.

Author 1: Ha Xuan Son
Author 2: Minh Hoang Nguyen
Author 3: Nguyen Ngoc Phien
Author 4: Hai Trieu Le
Author 5: Quoc Nghiep Nguyen
Author 6: Van Dai Dinh
Author 7: Phu Thinh Tru
Author 8: Phuc Nguyen

Keywords: Blockchain; fintech; smart contract; customer; seller; shipper; cash on delivery; hyperledger

Download PDF

Paper 6: Dynamic Modification of Activation Function using the Backpropagation Algorithm in the Artificial Neural Networks

Abstract: The paper proposes the dynamic modification of the activation function in a learning technique, more exactly the backpropagation algorithm. The modification consists in changing the slope of the sigmoid activation function according to the increase or decrease of the error in a learning epoch. The study was done using the Waikato Environment for Knowledge Analysis (WEKA) platform, adding this feature to the Multilayer Perceptron class. The study relates the dynamic modification of the activation function to the relative gradient error; neural networks with hidden layers were not used for it.
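
To make the idea concrete, here is a minimal sketch of a sigmoid whose slope is adjusted after each training epoch depending on whether the error rose or fell. The additive update rule and step size are assumptions for illustration; the paper's exact modification inside WEKA's Multilayer Perceptron class is not reproduced here.

```python
import numpy as np

def sigmoid(x, slope=1.0):
    """Logistic activation with an adjustable slope parameter."""
    return 1.0 / (1.0 + np.exp(-slope * x))

def update_slope(slope, prev_error, curr_error, step=0.05):
    """Illustrative rule: steepen when the error drops, flatten when it rises."""
    return slope + step if curr_error < prev_error else max(slope - step, 0.1)

slope, prev_err = 1.0, None
for epoch_err in [0.42, 0.37, 0.39, 0.31]:   # hypothetical per-epoch errors
    if prev_err is not None:
        slope = update_slope(slope, prev_err, epoch_err)
    prev_err = epoch_err
print(sigmoid(0.5, slope))
```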

Author 1: Marina Adriana Mercioni
Author 2: Alexandru Tiron
Author 3: Stefan Holban

Keywords: Artificial neural networks; activation function; sigmoid function; WEKA; multilayer perceptron; instance; classifier; gradient; rate metric; performance; dynamic modification

Download PDF

Paper 7: Fault Injection and Test Approach for Behavioural Verilog Designs using the Proposed RASP-FIT Tool

Abstract: Soft-core processors and complex Field Programmable Gate Array (FPGA) designs are described in an algorithmic manner, i.e. at the behavioural abstraction level, in Hardware Description Languages (HDL). Lower abstraction levels add complexity and delays in the design cycle as well as in the fault injection approach. Therefore, fault simulation/emulation techniques are needed to develop an approach for testing designs and evaluating the dependability of FPGA designs at this abstraction level. Broadly, fault injection techniques for FPGA-based designs at the HDL code level are categorised into emulation- and simulation-based techniques. This work is an extension of our previous methodologies developed for FPGA designs written at the data-flow and gate abstraction levels under the proposed RASP-FIT tool. These methodologies include fault injection by code parsing of the SUT, a test approach for finding test vectors using dynamic and static compaction techniques, fault coverage, and the compaction ratio, directly at the code level of the design. In this paper, we describe the proposed approaches briefly, and the enhancement of a Verilog code modifier for behavioural designs is presented in detail.

Author 1: Abdul Rafay Khatri
Author 2: Ali Hayek
Author 3: Josef Börcsök

Keywords: Behavioural designs; code parsing; fault injection; test approach; Verilog HDL

Download PDF

Paper 8: An Efficient Image Haze Removal Algorithm based on New Accurate Depth and Light Estimation Algorithm

Abstract: Single image dehazing has become a challenging task for a variety of image processing and computer vision applications. Many attempts have been devised to recover faded colors and improve image contrast. Such methods, however, do not achieve maximum restoration, as images are often subject to color distortion. This paper proposes an efficient single image dehazing algorithm that offers satisfactory scene radiance restoration. The proposed method rests on the estimation of two key indices, image blur and atmospheric light, that can be employed in the Image Formation Model (IFM) to recover the scene radiance of the hazy image. More precisely, we propose an efficient depth estimation method using image blur. Most existing algorithms treat atmospheric light as a constant, which often leads to inaccurate estimations; we propose a new algorithm, “A-Estimate”, based on blur and energy to estimate the atmospheric light accurately. An adaptive transmission map is also proposed. Experimental results on real and synthesized hazy images demonstrate improved performance of the proposed method when compared to existing state-of-the-art methods.
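
For orientation, the standard IFM used in dehazing work is I(x) = J(x)t(x) + A(1 - t(x)); once the transmission map t and the atmospheric light A have been estimated (by whatever means), the scene radiance J is recovered by inverting the model. A minimal sketch with random stand-ins for the inputs:

```python
import numpy as np

def recover_radiance(I, t, A, t_min=0.1):
    """Invert I = J*t + A*(1 - t) for J, clamping t to avoid division by ~0."""
    t = np.clip(t, t_min, 1.0)[..., np.newaxis]
    return np.clip((I - A) / t + A, 0.0, 1.0)

I = np.random.rand(4, 4, 3)          # hypothetical hazy RGB image in [0, 1]
t = np.random.rand(4, 4)             # estimated transmission map
A = np.array([0.90, 0.90, 0.92])     # estimated atmospheric light per channel
J = recover_radiance(I, t, A)        # restored scene radiance
```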

Author 1: Samia Haouassi
Author 2: Wu Di
Author 3: Meryem Hamidaoui
Author 4: Tobji Rachida

Keywords: Image dehazing; Image Formation Model (IFM); depth map; transmission map; atmospheric light; image blur; image energy

Download PDF

Paper 9: Arabic Text Classification using Feature-Reduction Techniques for Detecting Violence on Social Media

Abstract: With the current increase in the number of online users, there has been a concomitant increase in the amount of data shared online. Techniques for discovering knowledge from these data can provide us with valuable information when it comes to detecting different problems, including violence. Violence is one of the significant problems humanity has faced in recent years all over the world, and this is especially a problem in Arabic countries. To address this issue, this research focuses on detecting violence-related tweets to help in solving this problem. Text mining is an important technique that can be used to find and predict information from text. In this study, a text classification model is built for detecting violence in Arabic dialects on Twitter using different feature-reduction approaches. The experiment comprises bagging, K-nearest neighbors (KNN), and Bayesian boosting using different extraction features, namely, root-based stemming, light stemming, and n-grams. In addition, the study used the following feature-reduction techniques: support vector machine (SVM), Chi-squared (CHI), the Gini index, correlation, rules, information gain (IG), deviation, symmetrical uncertainty, and the IG ratio. The experiment showed that the bagging with tri-gram approach has the highest accuracy at 86.61%, and a combination of IG with SVM from reduction features registers an accuracy of 90.59%.
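
As a loose illustration of this kind of pipeline, the sketch below chains word n-gram features, chi-squared feature reduction, and bagging in scikit-learn. The tiny corpus, the labels, the k value, and the decision-tree base learner are placeholders, not the paper's dataset or configuration.

```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in corpus; 1 = violence-related, 0 = not.
tweets = ["مثال تغريدة أولى", "مثال تغريدة ثانية", "تغريدة ثالثة", "تغريدة رابعة"]
labels = [1, 0, 1, 0]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 3))),   # uni- to tri-grams
    ("chi2", SelectKBest(chi2, k=8)),                 # feature reduction
    ("bag", BaggingClassifier(DecisionTreeClassifier(), n_estimators=25)),
])
model.fit(tweets, labels)
```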

Author 1: Hissah ALSaif
Author 2: Taghreed Alotaibi

Keywords: Violence; text mining; classification; feature-reduction techniques; Arabic; Twitter posts

Download PDF

Paper 10: Industrial Financial Forecasting using Long Short-Term Memory Recurrent Neural Networks

Abstract: This research deals with industrial financial forecasting in order to calculate the yearly expenditure of an organization. Forecasting helps in estimating future trends and provides valuable information for making industrial decisions. With growing economies, the financial world spends billions in terms of expenses. These expenditures are also defined as budgets or operational resources for a functional workplace. These expenses fluctuate rather than showing linear or constant growth, and this information, if extracted, can reshape the future in terms of effective spending of finances and give insight for future budgeting reforms. It is a challenge to grasp the changing trends with effective accuracy, and for this purpose machine learning approaches can be utilized. In this study, Long Short-Term Memory (LSTM), a variant of the Recurrent Neural Network (RNN) from the family of Artificial Neural Networks (ANN), is used for forecasting, along with the statistical tool IBM SPSS for comparative analysis. The experiments are performed on the dataset of Pakistan GDP by type of expenditure at current prices - national currency (1970-2016), produced by the Economic Statistics Branch of the United Nations Statistics Division (UNSD). Results demonstrate that the proposed model predicted the expenses with better accuracy than classical statistical tools.
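
A minimal Keras sketch of one-step-ahead forecasting with an LSTM on a scaled yearly series; the window length, layer sizes, and the random stand-in series are assumptions, not the paper's setup.

```python
import numpy as np
from tensorflow import keras

series = np.random.rand(47).astype("float32")   # stand-in for 1970-2016, scaled to [0, 1]

def make_windows(s, lag=4):
    """Slice the series into (samples, timesteps, 1) windows and next-step targets."""
    X = np.array([s[i:i + lag] for i in range(len(s) - lag)])
    return X[..., np.newaxis], s[lag:]

X, y = make_windows(series)
model = keras.Sequential([
    keras.Input((X.shape[1], 1)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, verbose=0)
next_year = model.predict(X[-1:], verbose=0)    # one-step-ahead forecast
```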

Author 1: Muhammad Mohsin Ali
Author 2: Muhammad Imran Babar
Author 3: Muhammad Hamza
Author 4: Muhammad Jehanzeb
Author 5: Saad Habib
Author 6: Muhammad Sajid Khan

Keywords: Financial forecasting; prediction; long-short term memory; recurrent neural networks; artificial neural networks; IBM SPSS

Download PDF

Paper 11: Software Artefacts Consistency Management towards Continuous Integration: A Roadmap

Abstract: Software development with DevOps practices has become popular through the collaborative intersection between development and operations teams. The notion of DevOps practices drives software artefact changes towards the continuous integration and continuous delivery (CI/CD) pipeline. Consequently, traceability management is essential to handle frequent changes during rapid software evolution. This study explores the processes and approaches for managing traceability while ensuring artefact consistency towards CI/CD in DevOps practice. We address the key notions in the traceability management process, including artefact change detection, change impact analysis, consistency management, change propagation, and visualization. We also assess the applicability of existing change impact analysis models in DevOps practice. This study identifies the conceptualization of the traceability management process, explores state-of-the-art solutions, and suggests possible research directions. It shows a lack of support for heterogeneous artefact consistency management with well-defined techniques. Most of the related models are limited in their industry-level applicability in DevOps practice, and there is inadequate tool support for managing traceability between heterogeneous artefacts. This study identifies the challenges in managing software artefact consistency and suggests possible research directions that can be applied to manage traceability in software development in DevOps practice.

Author 1: D A Meedeniya
Author 2: I. D. Rubasinghe
Author 3: I. Perera

Keywords: Consistency management; traceability; continuous integration; DevOps; comparative study

Download PDF

Paper 12: A GRASP-based Solution Construction Approach for the Multi-Vehicle Profitable Pickup and Delivery Problem

Abstract: With the advancement of e-commerce and Internet shopping, high competition between carriers has made many companies rethink their service mechanisms towards customers in order to stay competitive in the market. Therefore, companies with limited resources focus on serving only customers who provide high profits at the lowest possible cost. The Multi-Vehicle Profitable Pickup and Delivery Problem (MVPPDP) is a vehicle routing problem and a variant of the Selective Pickup and Delivery Problem (SPDP) that is considered when planning the services of such companies. The MVPPDP aims to serve only the profitable customers, where products are transported from a selection of pickup customers to the corresponding delivery customers within a given travel time limit. In this paper, we utilize the construction phase of the well-known Greedy Randomized Adaptive Search Procedure (GRASP) to build initial solutions for the MVPPDP. The performance of the proposed method is compared with two greedy construction heuristics previously used in the literature to build initial solutions for the MVPPDP. The results prove the effectiveness of the proposed method, with eight new initial solutions obtained for the problem. Our approach is especially beneficial for building a population of solutions that combines diversity and quality, which can help to obtain good solutions in the improvement phase of the problem.

Author 1: Abeer I Alhujaylan
Author 2: Manar I. Hosny

Keywords: Selective pickup and delivery problem; multi-vehicle profitable pickup and delivery problem; greedy randomized adaptive search procedure; metaheuristic algorithms

Download PDF

Paper 13: Secured Multi-Hop Clustering Protocol for Location-based Routing in VANETs

Abstract: In today’s world, with the rise in the number of vehicles and the lack of proper navigation, congestion has become a major problem. In this scenario, VANETs play a very important part in improving traffic conditions and providing proper navigation. An improved navigation system reduces congestion, thereby reducing the possibility of accidents. In this research work, we have used a position-based routing protocol, GPSR (Greedy Perimeter Stateless Routing), to effectively analyze the geographical position of the vehicles in the network and to provide updated navigation information. In this system, we have used a security mechanism to identify valid and invalid messages for secure V2V and V2I communications. This mechanism drops all invalid messages, keeping the VANET secure, and reduces the possibility of attacks on wireless communications in the VANET. NS2 simulations show that this system has better safety features and network performance compared to other hybrid schemes.

Author 1: K Sushma Eunice
Author 2: I. Juvanna

Keywords: VANET; tamper-proof device; location-based routing protocols; intelligent transportation system; GPSR algorithm; trusted authority; roadside unit; routing protocols

Download PDF

Paper 14: Compound Mapping and Filter Algorithm for Hybrid SSD Structure

Abstract: With the recent development of byte-unit non-volatile random access memory (RAM), various methods utilizing quad level cell (QLC) not-AND (NAND) flash memory with non-volatile RAM have been proposed. However, tests have shown that these hybrid structures lead to a reduction in the performance of a hybrid solid state disk (SSD) owing to issues regarding space efficiency. This study proposes a compound address method and filter algorithm suitable for the next generation of NAND flash, called hybrid storage media, where QLCs and phase-change memory (PCM) are used together. The filter-mapping algorithm includes a management method that stores data in phase-change memory or flash memory according to the next command, which is accessed when a write command that is half or less than half a page in length is received from the file system. Tests have shown that the compound mapping and filter algorithm reduces the wasted pages by more than half and the number of merge operations is also significantly decreased. This leads to a decrease in the number of delete operations and improves the overall processing speed of the hardware.

Author 1: Jin Young Kim
Author 2: Se Jin Kwon

Keywords: PRAM; hybrid architecture; QLC NAND flash memory; algorithm

Download PDF

Paper 15: Optimal Compression of Medical Images

Abstract: In today’s healthcare systems, medical images play a vital role in diagnosis. The challenges arising for hospital management systems (HMS) are to store and communicate the large volume of medical images generated by various imaging modalities. Efficient compression of medical images is required to reduce the bit rate, increase storage capacity, and speed up transmission without affecting quality. Over the past few decades, several compression standards have been proposed. In this paper, an intelligent JPEG2000 compression scheme is presented to compress medical images efficiently. Unlike traditional compression techniques, genetic programming (GP)-based quantization matrices are used to quantize the wavelet coefficients of the input image. Experimental results validate the usefulness of the proposed intelligent compression scheme.

Author 1: Rafi Ullah Habib

Keywords: Medical images; wavelet transform; JPEG2000; genetic programming; compression; quantization

Download PDF

Paper 16: Protection of Ultrasound Image Sequence: Employing Motion Vector Reversible Watermarking

Abstract: In healthcare information systems, medical data is very important for diagnosis. Most health institutions store their patients’ data on third-party servers, so its security is very important, especially since the advent of advanced multimedia and communication technology, whereby digital content can be manipulated, copied, and duplicated without leaving any trace. In this paper, a reversible watermarking technique is applied to patients’ data (an ultrasound image sequence). Traditional watermarking schemes can introduce permanent distortions that are not acceptable in medical applications; thus, a reversible watermarking technique has been used, which can not only secure the ultrasound image sequence but also restore the original sequence. For watermark embedding, the magnitudes and phase angles of the motion vectors of the image sequence are used, obtained with a Full-Search block-based motion estimation algorithm. Before applying the motion estimation algorithm and watermark embedding, histogram pre-processing is performed to avoid underflow/overflow. The experimental results show that, unlike other state-of-the-art watermarking schemes reported in recent decades, the proposed algorithm is simple and provides a much larger embedding capacity and better quality of the watermarked image sequence.

Author 1: Rafi Ullah Habib
Author 2: Fayez Al-Fayez

Keywords: Reversible watermarking; ultrasound sequence; full-search; motion vectors; side information

Download PDF

Paper 17: Extended Fuzzy Analytical Hierarchy Process Approach in Determinants of Employees’ Competencies in the Fourth Industrial Revolution

Abstract: This paper explores the education factors and ranks their impacts on employees’ competency development in Vietnam. Factors contributing to employees’ competencies in the Vietnamese context are proposed based on a literature review, validated by expert interviews. Then, the extended fuzzy analytical hierarchy process (EFAHP) approach is used to prioritize the importance of the factors affecting employees’ competency. The research findings confirm the decisive role of teachers, with the greatest weight of impact on employees’ competency, even though there has been a shift of the teacher's role to that of facilitator in Fourth Industrial Revolution education.

Author 1: Phuc Van Nguyen
Author 2: Phong Thanh Nguyen
Author 3: Quyen Le Hoang Thuy To Nguyen
Author 4: Vy Dang Bich Huynh

Keywords: Employees’ competency; fuzzy logic; Extended Fuzzy Analytical Hierarchy Process (EFAHP)

Download PDF

Paper 18: A Recurrent Neural Network and a Discrete Wavelet Transform to Predict the Saudi Stock Price Trends

Abstract: Stock markets can be characterised as complex, dynamic and chaotic environments, making the prediction of stock prices very difficult. In this research work, we attempt to predict Saudi stock price trends from earlier price history by combining a discrete wavelet transform (DWT) and a recurrent neural network (RNN). The DWT technique helps remove the noise in the data gathered from the Saudi stock market for a few chosen sample companies. A designed RNN is then trained via the Back Propagation Through Time (BPTT) method to predict the Saudi market’s closing stock prices for the next seven days for the chosen sample companies. The obtained results are then analysed and compared with those of traditional prediction algorithms such as the auto-regressive integrated moving average (ARIMA). Based on the comparison, the proposed method (DWT+RNN) predicts the day's closing price more accurately than ARIMA under the mean squared error (MSE), mean absolute error (MAE) and root mean squared error (RMSE) criteria.
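
One common way to realize the DWT denoising step is wavelet thresholding, sketched below with PyWavelets; the db4 wavelet, two-level decomposition, and universal threshold are illustrative choices rather than the paper's exact settings.

```python
import numpy as np
import pywt

def dwt_denoise(prices, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients, then reconstruct the series."""
    coeffs = pywt.wavedec(prices, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # robust noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(prices)))    # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(prices)]

closing = np.cumsum(np.random.randn(256)) + 100   # stand-in closing-price series
smoothed = dwt_denoise(closing)                   # input to the RNN/BPTT stage
```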

Author 1: Mutasem Jarrah
Author 2: Naomie Salim

Keywords: Recurrent Neural Network (RNN); Discrete Wavelet Transform (DWT); deep learning; prediction; stock market

Download PDF

Paper 19: Medical Image(s) Watermarking and its Optimization using Genetic Programming

Abstract: In this paper, a medical image watermarking technique is proposed in which intelligence is incorporated into the encoding and decoding structure. The motion vectors of the medical image sequence are used for embedding the watermark. Instead of manual selection of the candidate motion vectors, a generalized approach is used to select the most suitable motion vectors for embedding the watermark. A genetic programming (GP) module is employed to develop a function according to imperceptibility and watermarking capacity. Employing intelligence in the system improves its imperceptibility, capacity, and resistance to different attacks that can occur during communication and storage. The motion vectors are generated by applying a block-based motion estimation algorithm; in this work, the Full-Search method is used for its better performance compared to other methods. Experimental results show marked improvement in capacity and visual similarity compared to conventional approaches.

Author 1: Rafi Ullah Habib
Author 2: Hani Ali Alquhayz

Keywords: Capacity; imperceptibility; genetic programming; image sequence; watermarking

Download PDF

Paper 20: Voltage Variation Signals Source Identification and Diagnosis Method

Abstract: Power quality (PQ) problems have become an important issue, generating a bad impact on users nowadays. It is important to detect and identify the source of a PQ problem. This paper presents a voltage variation signal source identification and diagnosis method based on determining the average time-frequency representation (TFR) phase power of the impedance. The signals of focus in this study are voltage variation signals, which include voltage sag, swell and interruption. The voltage variation signals are generated from different source locations (upstream, downstream, as well as up- and downstream) according to IEEE Standard 1159, using mathematical models. The signals are first analyzed using spectrograms, which act as the feature-producing tool. Then, the average TFR power in the phase domain of each signal is calculated and tabulated. Finally, the performance of the method is evaluated using a support vector machine (SVM) and k-nearest neighbor (kNN). The results show that this method is an effective and suitable technique for identifying the source of voltage variation.
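
For a feel of the spectrogram stage, the sketch below builds a synthetic 50 Hz voltage with a sag and extracts simple average-power features with SciPy. The sampling rate, window sizes, and features are placeholders; the paper's impedance-phase power computation is not reproduced here.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 5000                                  # assumed sampling rate (Hz)
t = np.arange(0, 0.4, 1 / fs)
v = np.sin(2 * np.pi * 50 * t)             # 50 Hz supply voltage
v[1000:1500] *= 0.5                        # synthetic voltage sag

f, frames, Sxx = spectrogram(v, fs=fs, nperseg=256, noverlap=128)
avg_power = Sxx.mean(axis=1)               # average power per frequency bin
amp_track = np.sqrt(Sxx.sum(axis=0))       # coarse amplitude-vs-time feature
```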

Author 1: Weihown Tee
Author 2: Mohd Rahimi Yusoff
Author 3: Abdul Rahim Abdullah
Author 4: Muhamad Faizal Yaakub

Keywords: Power quality; voltage variation; spectrogram; source identification; average time frequency representation phase power

Download PDF

Paper 21: Reliability and Connectivity Analysis of Vehicular Ad Hoc Networks for a Highway Tunnel

Abstract: A vehicular ad-hoc network (VANET) uses the ‘mobile internet’ to facilitate communication between vehicles, with the goal of ensuring road safety and achieving secure communication. Thus, the reliability of this type of network is of paramount significance. Safety-related messages are disseminated in VANETs over the wireless medium through vehicle-to-vehicle (V2V) and vehicle-to-roadside (V2R) communications. Hence, network reliability is an essential requirement. This paper considers the effect of the vehicle transmission range R_tran and vehicle density ρ on the connectivity probability. In addition, a reliability model is specified which takes into account the minimal safe headway S_h among nearby vehicles in a highway tunnel. The reason is that, in a tunnel, the Global Positioning System (GPS), a component of the onboard unit (OBU), needs a clear line of sight for proper service; due to signal interference, GPS does not work properly there. Moreover, even in a fully connected network, there is a risk of danger between vehicles that are close to each other; the network is not safe, as accidents and collisions can happen at any time. Hence, maintaining the minimal safe headway distance in a tunnel is interesting and useful for VANETs. The obtained results show that a small difference in the minimal safe headway in a tunnel can cause a serious change in the reliability of the entire network, suggesting that the safe headway S_h cannot be ignored when designing network reliability models.

Author 1: Saajid Hussain
Author 2: Di Wu
Author 3: Wang Xin
Author 4: Sheeba Memon
Author 5: Naadiya Khuda Bux
Author 6: Arshad Saleem

Keywords: Minimal safe headway; reliability; highway tunnel; vehicular ad hoc networks; connectivity probability

Download PDF

Paper 22: Instrument Development for Measuring the Acceptance of UC&C: A Content Validity Study

Abstract: Studies on the acceptance of Unified Communications and Collaboration (UC&C) tools such as instant messaging and video conferencing have been around for some time. Adoption and acceptance of UC&C tools and services has boosted productivity and improved communications by integrating voice, video and data into one platform. UC&C also enables collaboration by allowing users to interact with each other through different media. However, the acceptance rate by individuals in developing nations has been low. It is hypothesized that the factors that contribute to acceptance are developed based on two underlying theories: diffusion of innovation and service-dominant logic. Items are constructed based on eight constructs: relative advantage, compatibility, ease of use, trialability, observability, improved service, value co-creation capacity, and coordination efficiency. In order to validate the items, content validity ratios are calculated on a set of questionnaire items. The ratios determine which items should be included in or removed from the questionnaire. The paper concludes with a discussion of the implications of the findings from the experts’ evaluation and from the content validity ratios. The new items are used in designing the survey instrument to measure the acceptance of UC&C.

Author 1: Emy Salfarina Alias
Author 2: Muriati Mukhtar
Author 3: Ruzzakiah Jenal

Keywords: Acceptance model; content validity; diffusion of innovation; service science; unified communications and collaboration

Download PDF

Paper 23: Congestion Control Techniques in WSNs: A Review

Abstract: Congestion control has great importance in wireless sensor networks (WSN), where efficient application of congestion control mechanisms can prolong the network lifetime. Thus, proper examination is needed to develop more refined ways to address congestion occurrence and resolution. When designing congestion control techniques, maximum output can be achieved through efficient utilization of the required resources within the WSN. In the last few years, several approaches have been introduced, consisting of routing protocols that provide support for congestion control, congestion prevention, and reliable data routing. In older schemes, topology resets and extensive traffic drops take place because the sink node executes the congestion avoidance. Therefore, node-level congestion avoidance, detection, prevention, and resolution mechanisms have been proposed in the past few years. Our paper provides a brief overview and performance comparison of centralized and distributed congestion control algorithms in WSN.

Author 1: Babar Nawaz
Author 2: Khalid Mahmood
Author 3: Jahangir Khan
Author 4: Mahmood ul Hassan
Author 5: Ansar Munir Shah
Author 6: Muhammad Kashif Saeed

Keywords: WSNs; congestion control; congestion preventing; reliable data

Download PDF

Paper 24: Convolutional Neural Network based for Automatic Text Summarization

Abstract: In recent times, applications for natural language processing have been formed and generated through the use of intelligent and soft computing methods that allow computer systems to practically mimic human text-processing practices such as plagiarism detection, pattern determination, and machine translation. Text summarization is the procedure of abridging writing into consolidated structures; ‘automatic text summarization’ (ATS) is when a computer system is used to create a summary. In this study, the researchers introduce a novel ATS system, CNN-ATS, a convolutional neural network for automatic text summarization using a matrix representation of the text. CNN-ATS is a deep learning system that was used to evaluate the improvements resulting from increased depth, to determine better CNN configurations, and to assess sentences and determine the most informative ones. Sentences deemed important are extracted for document summarization. The researchers investigated this novel convolutional network's depth to determine its accuracy in selecting informative sentences from each input text document. The experimental findings of the proposed method are based on convolutional neural networks using 26 different configurations, and they demonstrate that the resulting summaries have the potential to be better than other summaries. DUC 2002 served as the data source, with some of its news articles used as input in this experiment. Through this method, a new matrix representation was utilized for every sentence. The system summaries were examined using the ROUGE toolkit at 95% confidence intervals, with results extracted employing average recall, F-measure and precision from ROUGE-1, 2, and L.
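
A minimal Keras sketch of the core idea: scoring integer-encoded, padded sentences with a 1-D CNN so that the top-scored sentences can be extracted into a summary. The vocabulary size, sentence length, and layer shapes are assumptions, not one of the paper's 26 configurations.

```python
from tensorflow import keras
from tensorflow.keras import layers

VOCAB, MAXLEN = 20000, 50   # assumed vocabulary size and padded sentence length

model = keras.Sequential([
    keras.Input((MAXLEN,)),
    layers.Embedding(VOCAB, 128),
    layers.Conv1D(64, 5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # sentence "informativeness" score
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```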

Author 1: Wajdi Homaid Alquliti
Author 2: Norjihan Binti Abdul Ghani

Keywords: Automatic text summarization; extracts summarization; information retrieval; deep learning; convolutional neural network

Download PDF

Paper 25: Hospital Readmission Prediction using Machine Learning Techniques

Abstract: One of the most critical problems in healthcare is predicting the likelihood of hospital readmission in the case of chronic diseases such as diabetes, in order to allocate the necessary resources, such as beds, rooms, specialists, and medical staff, for an acceptable quality of service. Unfortunately, relatively few research studies in the literature have attempted to tackle this problem; the majority are concerned with predicting the likelihood of the diseases themselves. Numerous machine learning techniques are suitable for prediction. Nevertheless, there is also a shortage of adequate comparative studies that specify the most suitable techniques for the prediction process. Towards this goal, this paper presents a comparative study among five common techniques in the literature for predicting the likelihood of hospital readmission for diabetic patients: logistic regression (LR) analysis, multi-layer perceptron (MLP), Naïve Bayesian (NB) classifier, decision tree, and support vector machine (SVM). The comparative study is based on realistic data gathered from a number of hospitals in the United States. It revealed that SVM showed the best performance, while the NB classifier and LR analysis were the worst.
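
The comparison itself can be prototyped in a few lines of scikit-learn, as sketched below; the synthetic data stands in for the hospital records, and default hyperparameters are assumed throughout.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

# Stand-in data; the paper uses real diabetes records from US hospitals.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(max_iter=1000),
    "NB": GaussianNB(),
    "DT": DecisionTreeClassifier(),
    "SVM": SVC(),
}
for name, clf in models.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```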

Author 1: Samah Alajmani
Author 2: Hanan Elazhary

Keywords: Decision tree; hospital readmission; logistic regression; machine learning; multi-layer perceptron; Naïve Bayesian classifier; support vector machines

Download PDF

Paper 26: A Comparative Analysis of Wavelet Families for the Classification of Finger Motions

Abstract: The wavelet transform (WT) has been widely used in biomedical, rehabilitation and engineering applications. Due to the nature of WT, its performance depends mostly on the selection of the mother wavelet function. A proper mother wavelet ensures optimum performance; however, the selection of the mother wavelet is mostly empirical and varies according to the dataset. Hence, this paper aims to investigate the best mother wavelet for the discrete wavelet transform (DWT) and the wavelet packet transform (WPT) in the classification of different finger motions. In this study, twelve mother wavelets are evaluated for both DWT and WPT. The electromyography (EMG) data of 12 finger motions are acquired from an online database. Four useful features are extracted from each recorded EMG signal via DWT and WPT transformation. Afterward, a support vector machine (SVM) and linear discriminant analysis (LDA) are employed for performance evaluation. Our experimental results demonstrate Bior3.3 to be the most suitable mother wavelet for DWT, while WPT with Bior2.2 outperforms the other mother wavelets in the classification of finger motions. The results suggest that the Biorthogonal families are more suitable for accurate EMG signal classification.
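
A minimal PyWavelets sketch of comparing mother wavelets through DWT sub-band features; the mean absolute value serves here as a single stand-in feature (the paper extracts four, not listed in the abstract), and the signal is random rather than real EMG.

```python
import numpy as np
import pywt

def dwt_features(emg, wavelet, level=4):
    """Mean absolute value of each DWT sub-band: a common EMG feature."""
    coeffs = pywt.wavedec(emg, wavelet, level=level)
    return [float(np.mean(np.abs(c))) for c in coeffs]

emg = np.random.randn(2048)                    # stand-in EMG channel
for w in ["db4", "sym5", "coif3", "bior2.2", "bior3.3"]:
    print(w, dwt_features(emg, w))             # features feed SVM/LDA downstream
```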

Author 1: Jingwei Too
Author 2: Abdul Rahim Abdullah
Author 3: Norhashimah Mohd Saad

Keywords: Mother wavelet; discrete wavelet transform; wavelet packet transform; electromyography; classification

Download PDF

Paper 27: Comparative Analysis of Cow Disease Diagnosis Expert System using Bayesian Network and Dempster-Shafer Method

Abstract: Livestock is a source of animal protein containing essential acids that improve human intelligence and health. A popular livestock animal in Indonesia is the cow. Meat consumption per capita is increasing by 0.1 kg per capita per year, and the high demand for beef in Indonesia is due to the country's population growth of 1.49% per year. More than 90% of cows are reared by rural communities with little knowledge about livestock and low economic capabilities. In addition, the number of experts or veterinarians is also limited. One solution for disseminating the knowledge of experts or veterinarians is an expert system. Some methods that can be used in expert systems are the Bayesian network and the Dempster-Shafer method. The purpose of this research is to compare cow disease diagnosis using the Bayesian network and the Dempster-Shafer method, in order to determine which method is better at diagnosing cow disease. The data used covers 21 cow diseases with 77 symptoms, and each method is tested with the same 10 cases. Both methods give the same diagnosis results but with different percentages: the mean diagnosis percentage of the Dempster-Shafer method is 87.2%, while that of the Bayesian network method is 75.3%. Thus, it can be said that the Dempster-Shafer method is better at diagnosing cow disease.

Author 1: Aristoteles Aristoteles
Author 2: Kusuma Adhianto
Author 3: Rico Andrian
Author 4: Yeni Nuhricha Sari

Keywords: Expert system; Bayesian network; Dempster-Shafer; cow disease

Download PDF

Paper 28: Enhanced e-Learning Experience using Case based Reasoning Methodology

Abstract: In recent years, improvements in innovation have brought new capabilities for verifying data that will prompt essential changes in eLearning. Users can view e-learning material based on the references given to them and select the best approach to view the resources. The proposed system addresses the retrieval, reuse, revise and retain phases of CBR. For building personalized e-Learning, this work identifies different feature sets such as learning style, learning object, knowledge level, and problem list. The model is constructed using case-based reasoning along with a k-nearest neighbour method, whose role is to identify the best k value for an accurate retrieval process. New cases are further added based on the simulation of new user history, limited to a certain threshold value, so the model acquires a dynamically incremental dataset for classification. Further, time and accuracy on the dataset are compared across k-nearest neighbour, decision tree and support vector machine classifiers. Ultimately, eLearning saves time, upgrades the learning experience and supports scholarly achievement.

Author 1: Swati Shekapure
Author 2: Dipti D. Patil

Keywords: K-nearest neighbour method; eLearning; learning objects; learning style; case based reasoning

Download PDF

Paper 29: Effect of Correlating Image Threshold Values with Image Gradient Field on Damage Detection in Composite Structures

Abstract: The effect of image threshold level variation is studied and proved to be a critical factor in the damage detection and characterization of impacted composite Reaction Injection Molding (RIM) structures. The threshold variation is used as an input to both a gradient field algorithm and a segmentation algorithm. The optimum threshold for a tested composite type is chosen by correlating the resulting gradient field images and segmented images. The type and extent of damage are also analyzed using detailed pixel distributions as a function of both impact energy and threshold level variation. The demonstrated cascading-based technique shows promise for accurate testing and classification of damage in composite structures in many critical areas such as medical, aerospace and automotive applications.

Author 1: Mahmoud Zaki Iskandarani

Keywords: Gradient norm; edge detection; gray level mapping; segmentation; threshold; histogram; image processing; composites; impact damage

Download PDF

Paper 30: Images Steganography Approach Supporting Chaotic Map Technique for the Security of Online Transfer

Abstract: One of the most important issues in this domain is the security of transferred data. Data transferred online may be accessed illegally through attacks on the communication gateway between servers and users. The main aim of this study is to enhance the security level of online data transfer using two integrated methods: image steganography to hide the transferred data in image media, and a chaotic map to remap the original format of the transferred data. The integration of these two methods is effective in securing data in several formats, such as text, audio, and images. The proposed algorithm was prototyped in Java, and 20 images and text messages of usable sizes (plain data) were tested using the developed program. A simulation using a local server was run to analyze the security performance based on two factors: the plain data size and the data transfer distance. Many attack attempts were performed on the simulation test using known attack techniques, such as observing the stego-image quality. The experimental results show that about 85% of the attack attempts failed to detect the stego images, and 95% of the attacks failed to remap meaningful parts of the chaotic data. The results indicate a very good security level for the proposed methods in securing online data transfer. The contribution of this study is the effective integration of the steganography and chaotic map approaches to assure a high security level for online data transfer.
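
One common way to combine the two methods is to scramble the payload with a logistic-map keystream and then hide the result in the least significant bits of a cover image. The sketch below assumes that construction (XOR scrambling plus LSB embedding); the paper's exact remapping and embedding scheme is not spelled out in the abstract.

```python
import numpy as np

def logistic_keystream(n, x0=0.7, r=3.99):
    """Chaotic logistic map; (x0, r) act as the shared secret key."""
    out = np.empty(n)
    for i in range(n):
        x0 = r * x0 * (1 - x0)
        out[i] = x0
    return (out * 255).astype(np.uint8)

def embed(cover, message):
    """XOR the message with the keystream, then hide the bits in pixel LSBs."""
    scrambled = np.frombuffer(message, dtype=np.uint8) ^ logistic_keystream(len(message))
    bits = np.unpackbits(scrambled)
    stego = cover.flatten().copy()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
    return stego.reshape(cover.shape)

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
stego = embed(cover, b"secret transfer payload")
```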

Author 1: Yasser Mohammad Al-Sharo

Keywords: Security; steganography; chaotic map; encryption; network data

Download PDF

Paper 31: ABCVS: An Artificial Bee Colony for Generating Variable T-Way Test Sets

Abstract: To achieve acceptable quality and performance of any software product, it is crucial to assess the various software components in the application. There exist various software-testing techniques such as combinatorial testing and covering arrays. However, problems such as the t-way combinatorial explosion remain challenging in any combinatorial testing strategy, as it takes into consideration all combinations of input variables. To overcome this problem, several optimization and metaheuristic strategies have been suggested. One of the most effective optimization-algorithm-based techniques is the Artificial Bee Colony (ABC) algorithm. This paper presents a t-way generation strategy for both uniform- and variable-strength test suites applying the ABC strategy (ABCVS), to reduce the size of the test suite and subsequently enhance test suite generation. To assess both the effectiveness and performance of the presented ABCVS, several experiments were conducted on various sets of benchmarks. The results reveal that the proposed ABCVS outperforms existing strategies and demonstrates wider interaction between components compared to AI-search-based and computational-based strategies. The results also reveal the high potential of ABCVS in terms of effectiveness and performance, as observed in the majority of case studies.

Author 1: Ammar K Alazzawi
Author 2: Helmi Md Rais
Author 3: Shuib Basri

Keywords: T-way testing; variable-strength interaction; combinatorial testing; covering array; test suite generation; artificial bee colony algorithm

Download PDF

Paper 32: Improving the Performance of {0,1,3}-NAF Recoding Algorithm for Elliptic Curve Scalar Multiplication

Abstract: Although scalar multiplication is fundamental to elliptic curve cryptography (ECC), it is the most time-consuming operation. The performance of scalar multiplication depends on the performance of its scalar recoding, which can be measured in terms of the time and memory consumed, as well as its level of security. This paper focuses on the conversion of the binary scalar key representation into the {0, 1, 3}-NAF non-adjacent form. We propose an improved {0, 1, 3}-NAF lookup table and mathematical formula algorithm which improves the performance of the {0, 1, 3}-NAF algorithm. This is achieved by reducing the number of rows from 15 to 6, and by reading two (instead of three) digits to produce one. Furthermore, the improved lookup table reduces the recoding time of the algorithm by over 60%, with a significant reduction in memory consumption even as the key size increases. Specifically, the improved lookup table reduces memory consumption by as much as 75% for big keys, which shows a higher level of resilience to side-channel attacks.
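
For orientation, the classic signed-digit NAF recoding (digit set {0, 1, -1}) is sketched below; the paper's {0, 1, 3}-NAF variant uses a different digit set together with the improved 6-row lookup table described above, which is not reproduced here.

```python
def naf(k):
    """Standard non-adjacent form with digits {0, 1, -1}, least significant first."""
    digits = []
    while k > 0:
        if k & 1:
            d = 2 - (k % 4)    # 1 if k = 1 (mod 4), -1 if k = 3 (mod 4)
            k -= d
        else:
            d = 0
        digits.append(d)
        k //= 2
    return digits

# Scalar multiplication walks these digits from the most significant end,
# doubling at each step and adding/subtracting the base point on +-1 digits.
print(naf(13))   # [1, 0, -1, 0, 1], i.e. 13 = 16 - 4 + 1
```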

Author 1: Waleed K AbdulRaheem
Author 2: Sharifah Bte Md Yasin
Author 3: Nur Izura Binti Udzir
Author 4: Muhammad Rezal bin Kamel Ariffin

Keywords: Elliptic Curve Cryptosystem (ECC); scalar multiplication algorithm; {0, 1, 3}-NAF method; Non-Adjacent Form (NAF)

Download PDF

Paper 33: Novel Software-Defined Network Approach of Flexible Network Adaptive for VPN MPLS Traffic Engineering

Abstract: Multi-Protocol Label Switching VPN (MPLS VPN) is a technology for connecting multiple remote sites across an operator’s private infrastructure. MPLS VPN offers advantages that traditional solutions cannot guarantee in terms of security and quality of service, and the technology is becoming more prevalent among businesses, banks and public institutions. With this strong trend, managing the paths on which these tunnels are deployed has become a priority need for Internet service providers (ISPs). Through the principle of controller orchestration, ISPs can overcome this difficulty. Software-defined networking is a paradigm that allows, through orchestration, management of the entire network infrastructure. In this paper, we propose a new approach called FNA-TE, “Flexible Network Adaptive - Traffic Engineering”, which manages MPLS VPN tunnels to meet the QoS requirements of those with the highest priority.

Author 1: Faycal Bensalah
Author 2: Najib El Kamoun

Keywords: SDN; QoS; VPN; MPLS; Adaptive network

Download PDF

Paper 34: Big Data Strategy

Abstract: The importance of data analysis in companies grows every day, in a global market that generates large numbers of transactions. Industry 4.0 is one of the current technological trends: a set of diverse technologies whose objective is the digitalization and technological connectivity of the entire value chain of organizations. Data analysis and real-time decision making have a positive impact on efficiency. One of the technologies that support this concept is big data, which can help companies use and manage large volumes of data to support decision making. In this research project, the computational environment of the Apache Hadoop software has been analyzed to create a technological strategy that supports companies in creating a roadmap to understand and implement big data technology. As a result, a big data computer laboratory has been created at the Autonomous University of Coahuila, Mexico, to support medium-sized manufacturing companies in their data analysis strategy for decision making.

Author 1: Alicia Valdez
Author 2: Griselda Cortes
Author 3: Sergio Castaneda
Author 4: Laura Vazquez
Author 5: Angel Zarate
Author 6: Yadira Salas
Author 7: Gerardo Haces Atondo

Keywords: Technological strategy; big data; Hadoop; data analysis

Download PDF

Paper 35: Performance Analysis of Security Mechanism for Automotive Controller Area Network

Abstract: The connectivity of modern cars has led to security issues. A number of contributions have proposed the use of cryptographic algorithms to provide automotive Controller Area Network (CAN) security. However, due to CAN protocol characteristics, real-time requirements within cryptographic schemes are not guaranteed. In this work, the effects of implementing cryptographic approaches have been investigated by proposing a performance analysis methodology for cryptographic algorithms. Pending implementation of the proposed method in a real vehicle, a platform based on the STMicroelectronics STM32F407 microcontroller board has been deployed to test the proposed methodology. The experiments show that the implementation of a cryptographic algorithm has an impact on the number of clock cycles and, therefore, on real-time performance.

Author 1: Mabrouka Gmiden
Author 2: Mohamed Hedi Gmiden
Author 3: Hafedh Trabelsi

Keywords: Automotive CAN security; cryptographic algorithms; analysis methodology; real-time performances

Download PDF

Paper 36: Feature-based Sentiment Analysis for Slang Arabic Text

Abstract: The increasing number of Arab users on microblogging services who write and read in Arabic has prompted several researchers to study the posted data and discover users’ opinions and feelings to support decision making. In this paper, a sentiment analysis framework is presented for slang Arabic text, and a new dataset in the Jordanian dialect is presented. Numerous Arabic-specific features are shown along with their impact on slang Arabic tweets. The new set of features consists of lexicon, writing style, grammatical and emotional features. Several experiments are conducted to test the performance of the proposed scheme, which produces better results in comparison with others. The experiments show that the system performs well without translating the tweets into English or standard Arabic.

Author 1: Emad E Abdallah
Author 2: Sarah A. Abo-Suaileek

Keywords: Sentiment analysis; Arabic features; opinion mining; emotional features; social media

Download PDF

Paper 37: Healthcare Management using ICT and IoT based 5G

Abstract: In healthcare management, all patients need to be looked after properly with the latest technology. Although treatment facilities are available wirelessly, many treatments are still pending and delayed because the number of patients is increasing. This research focuses on two problems: the availability of treatment facilities and an efficient way of handling healthcare administration records. In healthcare management, e_Health applications focus on medical treatment and administration; these applications depend on Information and Communication Technology (ICT) and Radio Frequency Identification (RFID) systems. Using IoT based 5G and the latest technologies, this research provides an efficient method to solve these problems. In this method, ICT based on 5G networks and IoT based 5G are the major components, which include efficient management protocols for treating patients and elders through appropriate e_Health applications. Although some patients and older adults visit healthcare homes or hospitals regularly, they are never fully satisfied because they always expect better services. Some healthcare organizations treat these people as customers and maintain customer relationships. Improving the accuracy and quality of healthcare services and customer satisfaction depends on Customer Relationship Management (CRM) through evolving technologies. As a result, ICT based on RFID and other recent technologies enhances the quality of e_Health applications and healthcare services, satisfying CRM values. These enhancements, along with their profits and benefits, form the conclusions of this study of healthcare service and management.

Author 1: Vijey Thayananthan

Keywords: Information communication technology; e_Health; customer relationship management; Radio Frequency Identification (RFID); Internet of Things based fifth generation (IoT based 5G)

Download PDF

Paper 38: Towards a Conceptual Model to Evaluate usability of Digital Government Services in Malaysia

Abstract: The Malaysian government is committed to providing comprehensive digital government services, as reflected in policies and strategic plans such as the 11th Malaysia Plan 2016-2020 (RMKe-11) for digital government transformation. However, although most Malaysian government services are online, they remain inadequate, and the majority of users are unhappy with the current services. Usability is a critical aspect of the success of digital government. Thus, this research aims to develop and validate a usability conceptual model of digital government services in the Malaysian context to identify key factors that influence perceived usability, which in turn encourages usage of and satisfaction with digital government services. This research applied a quantitative-deductive approach and employed PLS-SEM analysis. Empirical results indicate that Effectiveness, Efficiency, Learnability, Satisfaction, Usefulness, and Citizen Centric are key factors of the perceived usability of digital government services. The evaluation of the proposed conceptual model yielded that three of the six factors, namely Effectiveness, Satisfaction, and Citizen Centric, have a significant positive influence on the perceived usability of digital government in the Malaysian context.

Author 1: Rini Yudesia Naswir
Author 2: Nurazean Maarop
Author 3: Mahmudul Hasan
Author 4: Salwani Daud
Author 5: Ganthan Narayana Samy
Author 6: Pritheega Magalingam

Keywords: Digital government; citizen-centric; quantitative; usability

Download PDF

Paper 39: A Hybrid of Multiple Linear Regression Clustering Model with Support Vector Machine for Colorectal Cancer Tumor Size Prediction

Abstract: This study proposes a new hybrid model of Multiple Linear Regression Clustering (MLRC) combined with a Support Vector Machine (SVM) to predict the tumor size of colorectal cancer (CRC). Three models, Multiple Linear Regression (MLR), MLRC, and the hybrid MLRC with SVM, were compared using two statistical error measures to find the best model for predicting colorectal cancer tumor size. The proposed hybrid MLRC with SVM model found two significant clusters, containing 15 and three significant variables for clusters 1 and 2, respectively. The experiments found that the proposed model is the best model, with the lowest Mean Square Error (MSE) and Root Mean Square Error (RMSE). This finding sheds light for health practitioners on the factors that contribute to colorectal cancer.
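
A hedged sketch of the hybrid idea follows: cluster the observations, fit one linear regression per cluster, and feed the cluster-wise predictions into an SVM regressor. Synthetic data stands in for the CRC dataset, and scikit-learn's KMeans replaces the paper's fuzzy c-means for brevity.

```python
# Hedged sketch of the hybrid idea: cluster patients, fit a linear model per
# cluster, then feed cluster-wise predictions to an SVM regressor.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                                 # clinical covariates (synthetic)
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)  # tumor size proxy

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
per_cluster = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
               for c in range(2)}

# Stage-2 feature: each sample's prediction from its own cluster's MLR model.
mlrc_pred = np.array([per_cluster[c].predict(x.reshape(1, -1))[0]
                      for x, c in zip(X, km.labels_)])
svm = SVR().fit(mlrc_pred.reshape(-1, 1), y)
mse = np.mean((svm.predict(mlrc_pred.reshape(-1, 1)) - y) ** 2)
print(f"in-sample MSE: {mse:.4f}")
```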

Author 1: Muhammad Ammar Shafi
Author 2: Mohd Saifullah Rusiman
Author 3: Shuhaida Ismail
Author 4: Muhamad Ghazali Kamardan

Keywords: Colorectal cancer; multiple linear regression; support vector machine; fuzzy c-means; clustering; prediction

Download PDF

Paper 40: Intrusion-Miner: A Hybrid Classifier for Intrusion Detection using Data Mining

Abstract: With the rapid growth and usage of the Internet, the number of network attacks has increased dramatically within the past few years. The challenge today is to observe these attacks efficiently for security purposes because of the value of data. Consequently, it is important to monitor and handle these attacks, and an intrusion detection system (IDS) has the diagnostic ability to handle them and secure the network. Numerous intrusion detection approaches have been presented, but their main hindrance is performance, which can be improved by increasing the detection rate as well as decreasing the false positive rate. Optimizing the performance of IDS is a serious and challenging issue that receives considerable attention from the research community. In this paper, we propose a hybrid classification approach, 'Intrusion-Miner', built with two classifier algorithms for network anomaly detection to obtain optimal results and make it possible to detect network attacks. Principal component analysis (PCA) and the Fisher Discriminant Ratio (FDR) have been implemented for feature selection and noise removal. This hybrid approach is compared with J48, BayesNet, JRip, SMO, and IBK, and the performance is evaluated using the KDD99 dataset. Experimental results reveal that the precision of the proposed approach is 96.1%, with low false positive and false negative rates compared to other state-of-the-art algorithms. The simulation results show that perceptible progress and real-time intrusion detection can be attained as we apply the suggested models to identify diverse kinds of network attacks.
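
The preprocessing stage named in the abstract can be sketched as follows: rank features by the Fisher Discriminant Ratio, project the retained features with PCA, and train a classifier on the result. The random data below merely stands in for KDD99, and the decision tree is a placeholder for the paper's hybrid classifier pair.

```python
# Sketch of FDR-based feature ranking followed by PCA noise removal,
# then a stand-in classifier. Random data substitutes for KDD99 here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 20))
y = rng.integers(0, 2, size=500)              # 0 = normal, 1 = attack

# FDR per feature: (mu0 - mu1)^2 / (var0 + var1); keep the top-k features.
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
v0, v1 = X[y == 0].var(axis=0), X[y == 1].var(axis=0)
fdr = (mu0 - mu1) ** 2 / (v0 + v1 + 1e-12)
top = np.argsort(fdr)[::-1][:10]

Z = PCA(n_components=5).fit_transform(X[:, top])   # denoised projection
clf = DecisionTreeClassifier(random_state=0).fit(Z, y)
print(f"training accuracy: {clf.score(Z, y):.3f}")
```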

Author 1: Samra Zafar
Author 2: Muhammad Kamran
Author 3: Xiaopeng Hu

Keywords: Intrusion detection system; principal component analysis; intrusion-miner; fisher discriminant ratio

Download PDF

Paper 41: Gene Optimized Deep Neural Round Robin Workflow Scheduling in Cloud

Abstract: Workflow scheduling is a key problem to be solved in the cloud to increase the quality of service. A few research works have been designed for performing workflow scheduling using different techniques, but the scheduling performance of existing techniques was not effective when considering a larger number of user tasks, and the makespan of workflow scheduling was high. In order to overcome these limitations, the Gene Optimized Deep Neural Round Robin Scheduling (GODNRRS) technique is proposed. The designed GODNRRS technique contains three layers, namely the input, hidden, and output layers, to efficiently perform workflow scheduling in the cloud. The GODNRRS technique initially receives the user tasks in the input layer and forwards them to the hidden layer. After taking the input, the technique initializes the gene population with the assistance of virtual machines in an Amazon cloud server at the first hidden layer. Next, it determines a fitness function for each virtual machine using its energy, memory, CPU time, and bandwidth capacity at the second hidden layer. Afterward, it defines a weight for each virtual machine at the third hidden layer depending on its fitness estimate. Consequently, it distributes the user tasks to optimal virtual machines according to their weight values at the fourth hidden layer in a cyclic manner. At last, the output layer renders the scheduled task results. Thus, the GODNRRS technique handles workflows in the cloud with improved scheduling efficiency and lower energy consumption and makespan. The technique is evaluated experimentally using metrics such as scheduling efficiency, makespan, and energy consumption with respect to different numbers of user tasks from the LIGO, Montage, and CyberShake real-time applications. The experimental results show that the GODNRRS technique increases the efficiency and reduces the makespan of workflow scheduling in the cloud compared to state-of-the-art works.
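
The scheduling core described above (fitness per virtual machine, weight derivation, weighted cyclic dispatch) can be illustrated with a toy sketch; all resource figures are invented, and the deep-neural layering is abstracted away.

```python
# Toy sketch of the scheduling core: score each VM with a fitness built from
# its resources, derive weights, then hand out tasks in a weighted cyclic
# (round-robin) order. All numbers are made up.
from itertools import cycle

vms = {  # energy budget, free memory (GB), CPU time share, bandwidth (Mbps)
    "vm1": dict(energy=0.9, memory=8, cpu=0.7, bandwidth=100),
    "vm2": dict(energy=0.6, memory=4, cpu=0.5, bandwidth=50),
    "vm3": dict(energy=0.8, memory=16, cpu=0.9, bandwidth=200),
}

def fitness(r):  # simple additive fitness over normalized resources
    return r["energy"] + r["memory"] / 16 + r["cpu"] + r["bandwidth"] / 200

total = sum(fitness(r) for r in vms.values())
weights = {name: fitness(r) / total for name, r in vms.items()}

# Each VM appears in the rotation proportionally to its weight.
rotation = [name for name, w in weights.items() for _ in range(round(w * 10))]
assignment = {f"task{i}": vm for i, vm in zip(range(12), cycle(rotation))}
print(assignment)
```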

Author 1: Shanmugasundaram M
Author 2: Kumar R
Author 3: Kittur H M

Keywords: Bandwidth capacity; processor time; energy; fitness function; memory; user task; virtual machine

Download PDF

Paper 42: Multi-criteria Intelligent Algorithm for Supply Chain Management

Abstract: In most societies, supply chain management and e-procurement processes are cornerstones of the economy and a primary influencer on people's lives. Providing these communities with their different commodity needs depends on a wide range of suppliers. Owing to the variety and diversity of suppliers, choosing a suitable supplier is a difficult and critical process, especially if it is performed traditionally, in which case decision making consumes time and effort before reaching the desired results. To solve the problem of choosing a suitable supplier, this research suggests an intelligent algorithm based on given determinants that are specified by the decision maker or the customer. The proposed algorithm employs a set of intelligent formulas that convert the predefined preferences into quantitative measurements. These quantitative measurements are used to differentiate between different suppliers or between a set of given offers. The experimental results show that the D3 model employed the given preferences and succeeded in selecting the most appropriate offer among the many offers presented by the available suppliers.
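
A minimal sketch of the underlying idea, converting decision-maker preferences into quantitative scores and ranking offers, is shown below; the criteria, weights, and offers are invented, and the scoring formula is a generic weighted normalization rather than the paper's exact formulas.

```python
# Minimal sketch of converting preferences into quantitative scores for
# competing offers; criteria, weights, and offers are invented.
offers = {
    "supplier_A": {"price": 90, "delivery_days": 5, "quality": 0.9},
    "supplier_B": {"price": 80, "delivery_days": 9, "quality": 0.7},
}
weights = {"price": 0.5, "delivery_days": 0.2, "quality": 0.3}
lower_is_better = {"price", "delivery_days"}

def score(offer):
    s = 0.0
    for crit, w in weights.items():
        vals = [o[crit] for o in offers.values()]
        lo, hi = min(vals), max(vals)
        norm = (offer[crit] - lo) / (hi - lo) if hi > lo else 1.0
        if crit in lower_is_better:
            norm = 1.0 - norm                 # invert cost-type criteria
        s += w * norm
    return s

best = max(offers, key=lambda name: score(offers[name]))
print({name: round(score(o), 3) for name, o in offers.items()}, "->", best)
```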

Author 1: Mahdi Jemmali
Author 2: Loai Kayed B.Melhim
Author 3: Mafawez Alharbi

Keywords: Decision making; supplier; scoring; algorithms; supply chain management

Download PDF

Paper 43: Techniques, Tools and Applications of Graph Analytic

Abstract: Graphs have acute significance because of their polytropic nature and widespread real-world big data applications, e.g., search engines, social media, knowledge discovery, network systems, etc. A major challenge is to develop efficient systems to store, process, and analyze the large graphs generated by these applications. Graph analytics is an important research area in big data graphs dealing with the efficient extraction of useful knowledge and interesting patterns from rapidly growing big data streams. The tremendously large and complex data of graph applications require specially designed graph databases with special data structures and effective features for data modeling and querying. The manipulation of large amounts of data requires effective, scalable, and distributed computational techniques for efficient graph partitioning and communication. Researchers have proposed different analytical techniques, storage structures, and processing models. This study provides insight into different graph analytical techniques and compares existing graph storage and computational technologies. This work also assesses the performance, strengths, and limitations of various graph databases and processing models.

Author 1: Faiza Ameer
Author 2: Muhammad Kashif Hanif
Author 3: Ramzan Talib
Author 4: Muhammad Umer Sarwar
Author 5: Zahid Khan
Author 6: Khawar Zulfiqar
Author 7: Ahmad Riasat

Keywords: Graph; graph analytic; big data; graph tools; analytical techniques

Download PDF

Paper 44: Dual-Cross-Polarized Antenna Decoupling for 43 GHz Planar Massive MIMO in Full Duplex Single Channel Communications

Abstract: Massive Multiple Input Multiple Output (MIMO) and Full Duplex Single Channel (FDSC) at mm-Wave are key technologies of future advanced wireless communications. Self-interference is the main problem in this technique because of the large number of antennas. This paper proposes a dual-cross-polarized configuration to reduce self-interference between antennas. We conduct computer simulations to design the antenna and to verify the self-interference effect of the designed antenna. The simulations show that the proposed design has a lower Envelope Correlation Coefficient (ECC). This result is achieved because the dual-cross-polarized technique reduces coupling between antennas. We found that the bit-error-rate (BER) performance of the proposed antenna is better than that of a single-polarized antenna, indicating that the antenna is well designed to reduce the self-interference effect between antennas.
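
For readers unfamiliar with the metric, the ECC of a two-port antenna can be computed from S-parameters using the closed form of Blanch et al. (which assumes a lossless antenna); the sample S values below are arbitrary and not from the paper's 43 GHz design.

```python
# Hedged sketch: envelope correlation coefficient (ECC) of a two-port antenna
# from S-parameters, assuming a lossless antenna (Blanch et al. closed form).
import numpy as np

S11, S21 = 0.10 * np.exp(1j * 0.3), 0.05 * np.exp(1j * 1.1)
S12, S22 = 0.05 * np.exp(1j * 1.1), 0.12 * np.exp(1j * 0.7)

num = abs(np.conj(S11) * S12 + np.conj(S21) * S22) ** 2
den = (1 - (abs(S11) ** 2 + abs(S21) ** 2)) * (1 - (abs(S12) ** 2 + abs(S22) ** 2))
ecc = num / den
print(f"ECC = {ecc:.4f}")  # low ECC indicates well-decoupled ports, i.e. low self-interference
```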

Author 1: Muhsin Muhsin
Author 2: Rina Pudji Astuti

Keywords: Massive MIMO; dual polarized; mm-Wave; coupling; self-interference; full duplex single channel

Download PDF

Paper 45: Multimodal Age-Group Recognition for Opinion Video Logs using Ensemble of Neural Networks

Abstract: With the widespread usage of smartphones and social media platforms, video logging is gaining increasing popularity, especially after the advent of YouTube in 2005 with hundreds of millions of views per day. It has attracted the interest of many people, with immense emerging applications for filmmakers, journalists, product advertisers, entrepreneurs, educators, and many others. Nowadays, people express and share their opinions online on various daily issues using different forms of content, including text, audio, images, and videos. This study presents a multimodal approach for recognizing the speaker's age group from social media videos. Several structures of Artificial Neural Networks (ANNs) are presented and evaluated using standalone modalities. Moreover, a two-stage ensemble network is proposed to combine multiple modalities. In addition, a corpus of videos has been collected and prepared for multimodal age-group recognition with a focus on Arabic language speakers. The experimental results demonstrate that combining different modalities can mitigate the limitations of unimodal recognition systems and lead to significant improvements in the results.

Author 1: Sadam Al Azani
Author 2: El-Sayed M. El-Alfy

Keywords: Multimodal recognition; opinion mining; age groups; word embedding; acoustic features; visual features; information fusion; ensemble learning; Arabic speakers

Download PDF

Paper 46: Performance Evaluation of Completed Local Ternary Pattern (CLTP) for Face Image Recognition

Abstract: Feature extraction is the most important step affecting the recognition accuracy of face recognition. Among these features, texture descriptors play an important role as local feature descriptors in many face recognition systems. Recently, many types of texture descriptors have been proposed and used for the face recognition task. The Completed Local Ternary Pattern (CLTP) is one of the texture descriptors proposed for texture image classification, and it has been tested on different image classification tasks. It was proposed to overcome the drawbacks of the Local Binary Pattern (LBP): the CLTP is more robust to noise and shows better discriminative properties. In this paper, a comprehensive study on the performance of the CLTP for the face recognition task has been carried out. The aim of this study is to investigate and evaluate the CLTP performance using eight different face datasets and to compare it with previous texture descriptors. In the experimental results, the CLTP showed good recognition rates and outperformed the other texture descriptors for this task. Several face datasets are used in this paper, such as the Georgia Tech Face, Collection Facial Images, Caltech Pedestrian Faces, JAFFE, FEI, YALE, ORL, and UMIST datasets.
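
To give a flavor of the ternary coding underlying the CLTP, the simplified sketch below computes the sign component of a Local Ternary Pattern for one 3x3 patch; the full CLTP additionally encodes magnitude and center components, which are omitted here.

```python
# Simplified illustration of the ternary coding behind CLTP: each neighbor is
# compared to the center with a tolerance t, giving codes in {-1, 0, +1} that
# are then split into "upper" and "lower" binary patterns (sign component
# only; the full CLTP also encodes a magnitude component).
import numpy as np

def ltp_codes(patch: np.ndarray, t: int = 5):
    center = patch[1, 1]
    # 8-neighborhood, clockwise from the top-left pixel
    neighbors = patch[[0, 0, 0, 1, 2, 2, 2, 1], [0, 1, 2, 2, 2, 1, 0, 0]]
    codes = np.where(neighbors > center + t, 1,
             np.where(neighbors < center - t, -1, 0))
    upper = (codes == 1).astype(int)   # binarized upper pattern
    lower = (codes == -1).astype(int)  # binarized lower pattern
    return upper, lower

patch = np.array([[52, 60, 55], [49, 54, 70], [40, 54, 48]])
print(ltp_codes(patch))
```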

Author 1: Sam Yin Yee
Author 2: Taha H. Rassem
Author 3: Mohammed Falah Mohammed
Author 4: Nasrin M. Makbol

Keywords: Face recognition; recognition accuracy; Local Binary Pattern (LBP); Completed Local Binary Pattern (CLBP); Completed Local Ternary Pattern (CLTP)

Download PDF

Paper 47: Quantitative Analysis of Healthy and Pathological Vocal Fold Vibrations using an Optical Flow based Waveform

Abstract: The objective assessment of vocal fold vibrations is important in diagnosing several vocal diseases. Given the high speed of the vibrations, high-speed videoendoscopy is commonly used to capture the vocal fold movements in video recordings. Commonly, two steps are carried out in order to automatically quantify laryngeal parameters and assess the vibrations. The first step aims to map the spatio-temporal information contained in the video recordings into a representation that facilitates the analysis of the vibrations. Numerous techniques are reported in the literature, but the majority of them require the segmentation of all the images of the video, which is a complex task. The second step aims to quantify laryngeal parameters in order to assess the vibrations. To this aim, most of the existing approaches require additional processing of the representation in order to deduce those parameters. Furthermore, for some reported representations, assessing the symmetry and periodicity of the vocal fold dynamics requires setting parameters that are specific to the representation under consideration, which makes the comparison between existing techniques difficult. To alleviate these problems, the present study investigates the use of a recently proposed representation named the optical flow based waveform in order to objectively quantify the laryngeal parameters. This waveform is retained in this study as it does not require the segmentation of all the images of the video. Furthermore, it is shown in the present work that the automatic quantification of the vibrations using this waveform can be carried out without applying any additional processing. Moreover, common laryngeal parameters are exploited; hence, no representation-specific parameters need to be defined for the automatic assessment of the vibrations. Experiments conducted on healthy and pathological phonation show the accuracy of the waveform, and it is more sensitive to pathological phonation than the state-of-the-art techniques.

Author 1: Heyfa Ammar

Keywords: Quantification; vocal fold vibrations; optical flow based waveform; pathology

Download PDF

Paper 48: A Parallel Hybrid-Testing Tool Architecture for a Dual-Programming Model

Abstract: High-Performance Computing (HPC) has recently become important in several sectors, including the scientific and manufacturing fields. The continuous growth in building more powerful supercomputers has become noticeable, and Exascale supercomputers will be feasible in the next few years. As a result, building massively parallel systems becomes even more important to keep up with the upcoming Exascale-related technologies. For building such systems, a combination of programming models is needed to increase the system's parallelism, especially dual- and tri-level programming models that increase parallelism in heterogeneous systems that include CPUs and GPUs. There are several combinations of the dual-programming model; one of them is MPI + OpenACC. This combination has several features that increase an application's parallelism on heterogeneous architectures and supports different platforms with more performance, productivity, and programmability. However, building systems with different programming models is error-prone and difficult, and such systems are also hard to test. Testing parallel applications is already a difficult task because parallel errors are hard to detect due to the non-deterministic behavior of the parallel application. Integrating more than one programming model inside the same application makes it even more difficult to test, because this integration could introduce new types of errors. Our main contribution is to identify and categorize OpenACC run-time errors and determine their causes, with a brief explanation, for the first time in the literature. We also propose a solution for detecting run-time errors in applications implemented with the dual-programming model. Our solution is based on using hybrid testing techniques to discover real and potential run-time errors. Finally, to the best of our knowledge, there is no parallel testing tool built to test applications programmed using the dual-programming model MPI + OpenACC, any tri-level programming model, or even the OpenACC programming model alone, to detect their run-time errors. Also, OpenACC errors have not been classified or identified before.

Author 1: Ahmed Mohammed Alghamdi
Author 2: Fathy Elbouraey Eassa

Keywords: Software testing; OpenACC run-time error classifications; hybrid testing tool; dual-programming model; OpenACC

Download PDF

Paper 49: Efficient Mining of Association Rules based on Clustering from Distributed Data

Abstract: Data analysis techniques need to be improved to allow the processing of large volumes of data. One of the most commonly used techniques is Association Rule Mining. These rules are used to detect facts that often occur together within a dataset. Unfortunately, existing methods generate a large number of association rules without emphasis on the relevance and utility of these rules, thereby complicating the interpretation of the results. In this paper, we propose a new approach for mining association rules with an emphasis on the ease of assimilation and exploitation of the carried knowledge. Our approach addresses these shortcomings while efficiently and intelligently minimizing the size of the rules. In fact, we propose to optimize the size of the extraction contexts by taking advantage of clustering techniques. We then extract frequent itemsets and rules in the form of meta-itemsets and meta-rules, respectively. Experiments on benchmark datasets show that our approach leads to a significant reduction in the number of generated rules, thereby speeding up the execution time.
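
The following toy sketch illustrates the general idea: group items into clusters, rewrite each transaction over cluster labels (meta-items), and mine frequent meta-itemsets, which shrinks the rule space. Transactions, clusters, and the support threshold are all invented.

```python
# Toy sketch: mine frequent "meta-itemsets" over cluster labels instead of
# raw items, which shrinks the rule space. All data below is invented.
from itertools import combinations
from collections import Counter

clusters = {"bread": "bakery", "cake": "bakery",
            "milk": "dairy", "cheese": "dairy", "apple": "fruit"}
transactions = [
    {"bread", "milk"}, {"cake", "cheese"}, {"bread", "cheese", "apple"},
    {"milk", "cheese"}, {"bread", "milk", "apple"},
]

# Replace each transaction by the set of clusters (meta-items) it touches.
meta_tx = [frozenset(clusters[i] for i in tx) for tx in transactions]

min_support = 0.4
counts = Counter()
for tx in meta_tx:
    for k in (1, 2):
        counts.update(frozenset(c) for c in combinations(sorted(tx), k))

frequent = {s: n / len(meta_tx) for s, n in counts.items()
            if n / len(meta_tx) >= min_support}
print(frequent)  # frequent meta-itemsets, from which meta-rules are derived
```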

Author 1: Marwa Bouraoui
Author 2: Amel Grissa Touzi

Keywords: Distributed data; association rules mining; clustering; meta-items; meta-rules

Download PDF

Paper 50: Academic Emotions Affected by Robot Eye Color: An Investigation of Manipulability and Individual-Adaptability

Abstract: We investigate whether academic emotions are affected by the color of a robot's eyes during lecture behaviors. In conventional human-robot interaction research on robot lecturers, the emphasis has been on robots assisting or replacing human lecturers. We expanded these ideas and examined whether robots could lecture using behaviors that are impossible for humans. Psychological research has shown that color affects emotions. Because emotion is strongly related to learning, a framework for emotion control is required. Thus, we considered whether emotions related to the learner's academic work, called "academic emotions," can be controlled by the color of a robot's illuminated eye light. In this paper, we found that the robot's eye light color affects academic emotions and that the effect can be manipulated and adapted to individuals. Furthermore, the manipulability of academic emotions by color was confirmed in a situation mimicking a real lecture.

Author 1: Kento Koike
Author 2: Yuya Tsuji
Author 3: Takahito Tomoto
Author 4: Daisuke Katagami
Author 5: Takenori Obo
Author 6: Yuta Ogai
Author 7: Junji Sone
Author 8: Yoshihisa Udagawa

Keywords: Robot lecturer; academic emotions; lecture behavior; human-robot interaction

Download PDF

Paper 51: Formal Concept Analysis based Framework for Evaluating Information System Success

Abstract: This paper proposes a methodology for evaluating information system success. It is based on two main fields: formal concept analysis and multi-criteria decision-making methods. The methodology is exploited by a framework whose main objective is to visualize the synchronization between company processes and information system indicators via process mapping and formal concept analysis. Moreover, owing to the application of multi-criteria decision-making methods, we can rank the information system among other systems for the purpose of improving system performance. In practice, we apply the steps of this framework to a Moroccan bank by choosing a combination of processes and indicators.

Author 1: Ansar Daghouri
Author 2: Khalifa Mansouri
Author 3: Mohammed Qbadou

Keywords: Formal concept analysis; process; multi criteria; indicator; evaluation

Download PDF

Paper 52: Cluster based Hybrid Approach to Task Scheduling in Cloud Environment

Abstract: Cloud computing technology enables the sharing of computer system resources among users through the Internet. Many users may request sharable resources from a cloud, and these resources must be effectively distributed among the requesting users within a short amount of time. Task scheduling is one of the ways of handling user requests effectively in a cloud environment. Many existing biologically inspired optimization techniques have been applied to task scheduling problems. This paper aims at combining clustering techniques with biologically inspired optimization algorithms to derive better results. A new hybrid methodology, KPSOW (K-means with PSO using weights), is proposed in the paper, which makes use of the strengths of both the K-means and PSO algorithms with the inclusion of a weights concept. The results show that KPSOW considerably reduces the makespan and improves the utilization of computing resources in the cloud.
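
A rough sketch of the clustering half of KPSOW is given below: K-means groups tasks by length, and clusters are dispatched to VMs through weighted least-loaded assignment. In the full method the weights would be tuned by PSO, which is omitted here; all task lengths and VM speeds are synthetic.

```python
# Rough sketch of KPSOW's clustering half; the PSO weight tuning is omitted.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
tasks = rng.uniform(10, 100, size=30)          # task lengths (synthetic MI)
speeds = np.array([10.0, 20.0, 40.0])          # VM speeds (synthetic MIPS)
weights = speeds / speeds.sum()                # weights a PSO would optimize

labels = KMeans(n_clusters=3, n_init=10, random_state=0) \
    .fit(tasks.reshape(-1, 1)).labels_

load = np.zeros(len(speeds))
for cluster in range(3):                       # schedule cluster by cluster
    for t in tasks[labels == cluster]:
        vm = np.argmin((load + t / speeds) / weights)  # weighted least-loaded VM
        load[vm] += t / speeds[vm]
print(f"makespan: {load.max():.2f}")
```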

Author 1: Y. Home Prasanna Raju
Author 2: Nagaraju Devarakonda

Keywords: Task scheduling; cloud computing; clustering; k-means; particle swarm optimization; makespan

Download PDF

Paper 53: How Volunteering Affects the Offender’s Behavior

Abstract: Agent-based modelling is widely used for presenting and evaluating social phenomena. It helps the researcher analyze and evaluate a social model and its related hypothetical theories by simulating a real situation. This research presents a model showing how the behavior of an offender is greatly influenced by people's volunteering against offending tendencies. It is observed how the offending behavior of one person urges others to commit the same criminal act or violation of norms, and how volunteering can make the offender feel ashamed of his doings and motivate others to volunteer in similar situations in the future. An agent-based model is presented and simulated to evaluate and validate the conceptualization of the presented social dilemma. The model is simulated by asking questions with a particular focus on the offending behavior of an agent. This study evaluates the simulated results from the presented model to describe the theoretical foundations of the spread of offending or criminal behavior. Moreover, it validates the role of volunteering in decreasing the offending tendencies of people, as it presents an understandable situation in which one person's offending increases the offending tendencies of the audience. The results of this research show that volunteering decreases the offending tendencies not just of the offender but also of the audience.

Author 1: Momina Shaheen
Author 2: Tayyaba Anees
Author 3: Muhammad Imran Manzoor
Author 4: Shuja Akbar
Author 5: Iqra Obaid
Author 6: Aimen Anum

Keywords: Offender’s behavior; spreading of criminal behavior; agent-based modelling; simulation; norm violation; criminology

Download PDF

Paper 54: Real Time RNA Sequence Edition with Matrix Insertion Deletion for Improved Bio Molecular Computing using Template Match Measure

Abstract: RNA sequence editing has become a challenging task in molecular computing. A number of approaches to the RNA editing problem in biomolecular computing have been discussed earlier, but they fail to achieve high performance. To improve performance, a real-time approach is presented which uses a sequence depth measure (SDM). The method receives the RNA sequence and estimates the depth measure for the different subsequences generated. Based on the SDM value, a cumulative sequence match measure (SMM) is computed to classify the sequence into the available classes. Matrix insertion and deletion are performed based on the template match measure (TMM), which is computed from the matches found in the templates available for the different classes. The experimental results show that our approach outperforms others in terms of accuracy, risk detection accuracy, time complexity, and false classification ratio, which in turn increases the performance of biomolecular computing and matrix insertion-deletion.

Author 1: Kotteeswaran C
Author 2: Khanaa V
Author 3: Rajesh A

Keywords: Bio Molecular Computing; RNA Sequence; SDM; SMM; Templates; TMM

Download PDF

Paper 55: CMMI-DEV Implementation Simplified

Abstract: With advancing technology and increasing customer requirements, software organizations seek to reduce cost and increase productivity by using standards and best practices. The Capability Maturity Model Integration (CMMI) is a software process improvement model that enhances productivity and reduces the time and cost of running projects. As a reference model, CMMI does not specify systematic steps for implementing the model practices, leaving room for organization-specific development approaches. Small organizations with low budgets, and those not seeking CMMI appraisals, cannot cope with the high price of CMMI implementation; however, they need to manage the risk of CMMI implementation under their own administration. Therefore, this paper proposes a simplified plan using the spiral model to implement CMMI up to maturity level 2. The objective is to make the implementation more straightforward and fit the CMMI specification without hiring external experts. Compared to related implementation frameworks, the proposed model is deemed competitive and applicable under typical organizational conditions.

Author 1: Maruthi Rohit Ayyagari
Author 2: Issa Atoum

Keywords: CMMI; Spiral model; Capability Maturity Model Integration; process improvement; CMMI Implementation

Download PDF

Paper 56: Algorithm for Enhancing the QoS of Video Traffic over Wireless Mesh Networks

Abstract: One of the major issues in wireless mesh networks (WMNs) which needs to be solved is the lack of a viable protocol for medium access control (MAC). In fact, the main concern is to expand the application of limited wireless resources while simultaneously retaining the quality of service (QoS) of all types of traffic, in particular the real-time variable bit rate (rt-VBR) video service. As such, this study attempts to enhance QoS with regard to packet loss, average delay, and throughput by controlling the transmitted video packets; the packet loss and average delay of video traffic can thereby be controlled. Simulation results show that Optimum Dynamic Reservation-Time Division Multiplexing Access (ODR-TDMA) achieves excellent resource utilization that improves the QoS of video packets. This study has also proven the adequacy of the proposed algorithm in minimizing packet delay and packet loss, in addition to enhancing throughput compared to previous studies.

Author 1: Abdul Nasser A Moh
Author 2: Radhwan Mohamed Abdullah
Author 3: Abedallah Zaid Abualkishik
Author 4: Borhanuddin Bin Moh. Ali
Author 5: Ali A. Alwan

Keywords: Wireless Mesh Networks (WMNs); Medium Access Control (MAC); Quality of Service (QoS); video traffic

Download PDF

Paper 57: Cybercrime in Morocco

Abstract: Cybercrime encompasses all illegal actions and acts that target cyberspaces and cause enormous economic and financial damage to organizations and individuals. A cyberspace is essentially composed of digital information as well as its storage and communication instruments/platforms. To remedy this phenomenon, attention has focused particularly on both computer security and legislation, in an area where human behavior is also decisive. Social psychology has well defined the concept of behavior and has also studied its relations with attitude in human action. This paper aims to broaden the scope of cybercrime to also discuss marginal phenomena which do not attract enough attention but could easily be converted into digital crimes once circumstances become appropriate. The main objective of this work is the study of 'human'-'digital world' interactivity in a specific geographical area, or more precisely, the study of human behavior towards digital crimes. The proposed study targets young people of a small Moroccan city that lies in the south of the country's central region and constitutes the barycenter of its economy. The study dealt specifically with a sample of young Moroccans living in the city of El Jadida, which coincidentally included individuals from other Moroccan cities, further enriching the study.

Author 1: M EL Hamzaoui
Author 2: Faycal Bensalah

Keywords: Cybercrime; cyberspace; information system; information and communication technologies; social psychology

Download PDF

Paper 58: A Comprehensive Comparative Analysis of Two Novel Underwater Routing Protocols

Abstract: The largest unmanned area of this planet is covered with water, roughly 71.9% of its total surface, and a large quantity of marine life is present in this area. For this reason, underwater research remains bounded and its benefits unexplored. With the addition of sensors and growing interest in the exploration and monitoring of marine life, Underwater Wireless Sensor Networks (UWSNs) can play an important role. A variety of routing protocols has been deployed in order to exchange information between deployed nodes. Providing stable data transmission, maximum throughput, minimum energy consumption, and minimum delay are challenging tasks in UWSNs. Two such routing protocols are Layer-by-Layer Angle-Based Flooding (L2-ABF) and the Diagonal and Vertical Routing Protocol (DVRP). To achieve stable data transmission, node density plays a role in shallow and deep water. Several parameters are employed to evaluate the output efficiency of these routing protocols; in this paper, end-to-end delay, loss of data packets during transmission, and data delivery ratio are considered the major parameters for evaluation. For this, a network simulator is used with the Aqua-Sim package. The results produced during this study guide us toward the best routing protocol for data transmission. The study finally reveals that L2-ABF performs better than DVRP in different situations, and the trade-off relationships are examined across multiple scenarios.

Author 1: Umar Draz
Author 2: Tariq Ali
Author 3: Khurshid Asghar
Author 4: Sana Yasin
Author 5: Zubair Sharif
Author 6: Qasim Abbas
Author 7: Shagufta Aman

Keywords: Data transmission; throughput; end-to-end delay; energy consumption; L2-ABF; DVRP

Download PDF

Paper 59: Extreme Learning Machine and Particle Swarm Optimization for Inflation Forecasting

Abstract: Inflation is one indicator used to measure the development of a nation. If inflation is not controlled, it will have many negative impacts on the people of a country. There are many ways to control inflation, one of which is forecasting. Forecasting is an activity to predict future events based on past data. There are various artificial intelligence methods for forecasting, one of which is the extreme learning machine (ELM). ELM has a weakness in determining its initial weights by trial and error. Therefore, the authors propose an optimization method to overcome the problem of determining the initial weights. Based on the testing carried out, the proposed method achieves an error value of 0.020202758 with a computation time of 5 seconds.
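
The mechanism can be sketched compactly: an ELM computes its output weights in closed form once the random input weights are fixed, and a small PSO searches over those input weights instead of relying on trial and error. The inflation series below is synthetic; only the mechanism is illustrated.

```python
# Sketch of an Extreme Learning Machine whose input weights are tuned by PSO.
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(size=(100, 4))                  # lagged indicators (synthetic)
y = X @ np.array([0.4, -0.2, 0.3, 0.1]) + 0.05 * rng.normal(size=100)

def elm_error(flat_w, hidden=10):
    W = flat_w.reshape(4, hidden)               # input weights set by PSO
    H = np.tanh(X @ W)                          # hidden layer activations
    beta = np.linalg.pinv(H) @ y                # output weights: closed form
    return np.sqrt(np.mean((H @ beta - y) ** 2))  # RMSE

# Minimal PSO over the flattened input-weight vector.
dim, n = 4 * 10, 15
pos = rng.uniform(-1, 1, size=(n, dim)); vel = np.zeros((n, dim))
pbest, pcost = pos.copy(), np.array([elm_error(p) for p in pos])
for _ in range(30):
    g = pbest[np.argmin(pcost)]                 # global best particle
    vel = 0.7 * vel + 1.5 * rng.random((n, dim)) * (pbest - pos) \
                    + 1.5 * rng.random((n, dim)) * (g - pos)
    pos += vel
    cost = np.array([elm_error(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
print(f"best RMSE: {pcost.min():.5f}")
```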

Author 1: Adyan Nur Alfiyatin
Author 2: Agung Mustika Rizki
Author 3: Wayan Firdaus Mahmudy
Author 4: Candra Fajri Ananda

Keywords: Extreme learning machine; particle swarm optimization; inflation; prediction

Download PDF

Paper 60: Impact Factors of IT Flexibility within Cloud Technology on Various Aspects of IT Effectiveness

Abstract: Cloud computing adoption has reached an essential inflection point; it is affecting IT and business models and strategies throughout the industries. There is a lack of empirical evidence on how the adoption of cloud technology, and specific capabilities of cloud technology in particular, affect various aspects of information technology effectiveness, such as IT helpfulness to users, user satisfaction, and IT quality of service. This paper intends to add to the body of knowledge that could be used by researchers, companies, and businesses alike to accomplish optimal outcomes, with a focus on the most useful capabilities of cloud-based services, solutions, and the components inside them. The research findings present statistical evidence confirming that certain factors of IT flexibility within cloud computing have a much stronger correlation with several aspects of information technology effectiveness than the remainder. This new awareness would improve the decision-making procedure for IT executives and IT managers when considering the adoption of cloud-computing-based services and solutions.

Author 1: Salameh A Mjlae
Author 2: Zarina Mohamad
Author 3: Wan Suryani

Keywords: Cloud Computing Adoption (CCA); IT Flexibility (ITF); IT Effectiveness (ITE)

Download PDF

Paper 61: Assistive Technologies for Bipolar Disorder: A Survey

Abstract: Bipolar disorder is a severe mental illness characterized by periodic manic and depressive episodes. The current mode of assessing the patient's bipolar state is subjective clinical diagnosis influenced by the patient's self-reporting. There are many intervention technologies available to help manage the illness, and much research has worked on objective diagnosis and state prediction. Most of the recent work is focused on sensor-based objective prediction to minimize the delay between a relapse and the patient's visit to the clinic for diagnosis and treatment. Due to the severity of the societal and economic burden caused by bipolar disorder, this research has been given great emphasis. In this paper, we start with a discussion of the global severity of the disorder and the economic and family burden it inflicts; we then describe the existing mechanisms for identifying the current state of a bipolar patient, and go on to discuss the behavioral intervention technologies available and researched to help patients manage the disorder. Next, we cover the shift in focus of current research, i.e., toward sensor-based predictive systems for patients and clinical professionals, highlighting some of the preliminary research, clinical studies, and their outcomes.

Author 1: Yumna Anwar
Author 2: Dr. Arshia Khan

Keywords: Bipolar disorder; mobile applications; electrodermal activity; heart rate variability; behavioral intervention technologies; depression; mania

Download PDF

Paper 62: Hybrid Genetic-FSM Technique for Detection of High-Volume DoS Attack

Abstract: Insecure networks are vulnerable to cyber-attacks, which may result in catastrophic damage on the local and global scale. Nevertheless, one of the tedious tasks in detecting any type of attack in a network, including DoS attacks, is determining the thresholds required to discover whether an attack is occurring or not. In this paper, a hybrid system that incorporates different heuristic techniques along with a Finite State Machine is proposed to detect and classify DoS attacks. In the proposed system, a Genetic Programming technique combined with a Genetic Algorithm is designed and implemented to represent the system core, which evolves an optimized tree-based detection model. A Hill-Climbing technique is also employed to enhance the system by providing a reference point value for evaluating the optimized model and gaining better performance. Several experiments with different configurations are conducted to test the system performance using a synthetic dataset that mimics real-world network traffic with different features and scenarios. The developed system is compared to many state-of-the-art techniques with respect to several performance metrics. Additionally, a Mann-Whitney Wilcoxon test is conducted to validate the accuracy of the proposed system. The results show that the developed system succeeds in achieving higher overall performance, and its improvements prove to be statistically significant.
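
As a hedged illustration of the threshold-search problem the heuristics address, the sketch below evolves a single traffic-rate threshold with a plain genetic algorithm over synthetic traffic; the paper's richer GP tree model, hill climbing, and FSM stages are omitted.

```python
# Simple GA evolving a traffic-rate threshold that separates attack windows
# from normal ones. The synthetic traffic stands in for the paper's dataset.
import random

random.seed(4)
normal = [random.gauss(100, 15) for _ in range(300)]   # packets/s, benign
attack = [random.gauss(400, 60) for _ in range(100)]   # packets/s, flood
data = [(r, 0) for r in normal] + [(r, 1) for r in attack]

def fitness(th):  # classification accuracy of "rate > threshold => attack"
    return sum((r > th) == bool(lab) for r, lab in data) / len(data)

pop = [random.uniform(50, 500) for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                 # elitist selection
    children = [(random.choice(parents) + random.choice(parents)) / 2
                + random.gauss(0, 5)                   # crossover + mutation
                for _ in range(10)]
    pop = parents + children
best = max(pop, key=fitness)
print(f"threshold={best:.1f}, accuracy={fitness(best):.3f}")
```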

Author 1: Mohamed Samy Nafie
Author 2: Khaled Adel
Author 3: Hassan Abounaser
Author 4: Amr Badr

Keywords: Denial of Service (DoS); Evolutionary Algorithms (EA); Finite State Machine (FSM); Genetic Algorithm (GA); Genetic Programming (GP); Hill-Climbing Search

Download PDF

Paper 63: Using Space Syntax and Information Visualization for Spatial Behavior Analysis and Simulation

Abstract: This study used space syntax to discuss user movement dynamics and crowded hot spots in a commercial area. Moreover, it developed personas according to its onsite observations, visualized user movement data, and performed a deep-learning simulation using a generative adversarial network (GAN) to simulate user movement in an urban commercial area as well as the influences such movement might engender. From a pedestrian perspective, this study examined crowd behavior in a commercial area, conducted an onsite observation of people's spatial behaviors, and simulated user movement through data-science-driven approaches. Through the analysis process, we determined the spatial differences among various roads and districts in the commercial area, and according to the user movement simulation, we identified key factors that influence pedestrian spatial behaviors and pedestrian accessibility. Moreover, we used the deformed wheel theory to investigate the spatial structure of the commercial area and the synergetic relationship between the space and pedestrians; deformed wheel theory presents the user flow differences in various places and the complexity of road distribution, thereby enabling relevant parties to develop design plans that integrate space and service provision in commercial areas. This research contributes to the interdisciplinary study of spatial behavior analysis and simulation with machine learning applications.

Author 1: Sheng Ming Wang
Author 2: Chieh-Ju Huang

Keywords: Spatial behavior; space syntax; information visualization; generative adversarial network (GAN); user movement

Download PDF

Paper 64: Content based Document Classification using Soft Cosine Measure

Abstract: Document classification is a deep-rooted issue in information retrieval and is assumed to be an imperative part of an assortment of applications for the effective management of text documents and substantial volumes of unstructured data. Automatic document classification can be defined as the content-based assignment of documents to predefined categories, which makes fetching the relevant data at the right time less demanding, as well as filtering and routing documents directly to users. To retrieve data effortlessly in minimum time, researchers around the globe are trying to build content-based classifiers, and as a consequence, an assortment of classification frameworks has been developed. Unfortunately, because of their reliance on conventional algorithms, almost all of these frameworks fail to classify documents into the proper categories. This paper proposes the Soft Cosine Measure as a method for classifying text documents based on their contents. This classification method considers the similarity of the features of texts rather than their surface-level compatibility. For example, traditional systems consider 'emperor' and 'king' as two different words, whereas the proposed method extracts the same meaning from both. Thanks to its feature extraction capability and content-based similarity measure, the proposed system achieves a classification accuracy of up to 98.60%, better than existing systems.
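
The measure itself is easy to state: where plain cosine uses the dot product, soft cosine weighs every term pair by a similarity matrix S, so related words contribute even when they never co-occur. A worked numeric sketch with an invented three-word vocabulary follows.

```python
# Worked sketch of the Soft Cosine Measure: term pairs are weighted by a
# similarity matrix S, so 'king' and 'emperor' contribute even though they
# never co-occur. Vocabulary and S values are illustrative.
import numpy as np

vocab = ["king", "emperor", "apple"]
S = np.array([[1.0, 0.9, 0.0],     # s_ij = similarity(word_i, word_j)
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

a = np.array([1.0, 0.0, 0.0])      # doc A mentions only "king"
b = np.array([0.0, 1.0, 0.0])      # doc B mentions only "emperor"

def soft_cosine(a, b, S):
    return (a @ S @ b) / (np.sqrt(a @ S @ a) * np.sqrt(b @ S @ b))

print(f"plain cosine: {a @ b:.2f}")                 # 0.00 -- no shared terms
print(f"soft cosine:  {soft_cosine(a, b, S):.2f}")  # 0.90 -- semantic overlap
```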

Author 1: Md Zahid Hasan
Author 2: Shakhawat Hossain
Author 3: Md. Arif Rizvee
Author 4: Md. Shohel Rana

Keywords: Classification; similarity; feature extraction; cosine similarity; soft cosine measure; content; document

Download PDF

Paper 65: Fuzzy Delphi Method for Evaluating HyTEE Model (Hybrid Software Change Management Tool with Test Effort Estimation)

Abstract: When changes are made to a software system during development and maintenance, it needs to be tested again, i.e., regression tested, to ensure that the changes behave as intended and have not impacted software quality. This research produces an automated tool that can help a software manager or maintainer search for coverage artifacts before and after a change request. A software quality engineer can determine the test coverage of new changes, which supports cost, effort, and schedule estimation. This study therefore examines the views and consensus of experts on the elements in the proposed model using the Fuzzy Delphi Method. Through purposive sampling, a total of 12 experts from academia and industry participated in the verification of items through the 5-point linguistic scales of the questionnaire instrument. The outcomes show that for 90% of the elements in the proposed model, consisting of change management, traceability support, test effort estimation support, regression testing support, reporting, and GUI, the threshold value (d construct) is less than 0.2 and the percentage of expert group agreement is above 75%. This shows that all the elements are needed in the HyTEE Model (Hybrid Software Change Management Tool with Test Effort Estimation) based on the consensus of experts.
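
The screening criterion reported above (d below 0.2 and expert agreement above 75%) can be sketched numerically: each 5-point rating is mapped to a triangular fuzzy number, and the vertex distance to the group average is computed. The scale mapping and ratings below are illustrative assumptions, not the study's data.

```python
# Sketch of the Fuzzy Delphi screening step: ratings become triangular fuzzy
# numbers (TFNs); an item is accepted when the vertex distance d between
# expert opinions and their average is below 0.2 and enough experts agree.
import numpy as np

SCALE = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
         4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}   # a common 5-point TFN scale

ratings = [5, 4, 5, 5, 4, 5, 4, 5, 5, 4, 5, 5]      # 12 hypothetical experts
tfns = np.array([SCALE[r] for r in ratings])
avg = tfns.mean(axis=0)

# Vertex distance between each expert's TFN and the group average:
# d = sqrt((1/3) * sum_i (m_i - n_i)^2)
d = np.sqrt(((tfns - avg) ** 2).sum(axis=1) / 3)
consensus = (d <= 0.2).mean()
print(f"mean d = {d.mean():.3f}, expert agreement = {consensus:.0%}")
print("item accepted" if d.mean() <= 0.2 and consensus >= 0.75 else "rejected")
```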

Author 1: Mazidah Mat Rejab
Author 2: Nurulhuda Firdaus Mohd Azmi
Author 3: Suriayati Chuprat

Keywords: Fuzzy Delphi Method; software traceability; test effort estimation; regression testing; software changes

Download PDF

Paper 66: Efficient MRI Segmentation and Detection of Brain Tumor using Convolutional Neural Network

Abstract: Brain tumor is one of the most life-threatening diseases at its advanced stages, so detection at early stages is crucial in treatment and for improving the life expectancy of patients. Magnetic resonance imaging (MRI) is used extensively nowadays for the detection of brain tumors, which requires segmenting huge volumes of 3D MRI images; this is very challenging if done manually. Thus, automatic segmentation of the images will significantly lessen the burden and also improve the process of diagnosing tumors. This paper presents an efficient method based on convolutional neural networks (CNN) for the automatic segmentation and detection of brain tumors using MRI images. The water cycle algorithm is applied to the CNN to obtain an optimal solution. The developed technique has an accuracy of 98.5%.

Author 1: Alpana Jijja
Author 2: Dr. Dinesh Rai

Keywords: Brain tumor; segmentation; convolutional neural network; water cycle algorithm

Download PDF

Paper 67: Smartphones-Based Crowdsourcing Approach for Installing Indoor Wi-Fi Access Points

Abstract: This study provides a new crowdsourcing-based approach to identify the most crowded places in an indoor environment. Crowdsourcing indoor localization (CSI) has been one of the most used techniques in location-based applications. However, many applications suffer from the inability to locate the most crowded locations for various purposes such as advertising. These applications usually need to perform a survey before identifying target places, which requires additional cost and is time-consuming. For example, Access Point (AP) installation could rely on an automated system to identify the best places where APs should be placed, without the need for primitive ways of determining the best locations. In this work, we present a new approach for Wi-Fi designers and advertising companies to recognize the proper positions for placing APs and advertisement activities inside buildings. The recorded data of the accelerometer sensors are analyzed and processed to detect users' steps and thereby predict the most crowded places in a building. Our experiments show promising results in terms of the most widely used metrics in the field: the accuracy of detecting users' steps reaches 95.8%, and the accuracy of detecting the crowded places is 90.4%.
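
The step-detection stage can be sketched as peak counting on the smoothed magnitude of the accelerometer signal; the walking signal below is synthesized, and the height and spacing thresholds would need tuning on real smartphone data.

```python
# Minimal sketch of step detection: peaks in the smoothed magnitude of the
# accelerometer signal are counted as steps. The signal here is synthesized.
import numpy as np
from scipy.signal import find_peaks

fs = 50                                            # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
# A ~2 Hz oscillation around gravity mimics walking, plus sensor noise.
mag = 9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.normal(size=t.size)

smooth = np.convolve(mag, np.ones(5) / 5, mode="same")    # moving average
peaks, _ = find_peaks(smooth, height=10.5, distance=int(0.3 * fs))
print(f"detected steps: {len(peaks)}")             # expect ~20 for 2 Hz * 10 s
```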

Author 1: Ahmad Abadleh
Author 2: Wala Maitah
Author 3: Hamzeh Eyal Salman
Author 4: Omar Lasassmeh
Author 5: Awni Hammouri

Keywords: Crowdsourcing; indoor localization system; accelerometer sensors; Wi-Fi access point; smartphones

Download PDF

Paper 68: Data Citation Service for Wikipedia Articles

Abstract: The citation of big scientific data is crucial not only for scientific activity but also for scientific discovery and dissemination within scientist networks. The main objective of this research is to develop a service-oriented data citation system using data mining techniques for Middle East and North Africa scientists. A novel service-oriented framework is proposed to prototype the development of the system, which includes query formalization, service discovery, service composition design, service selection, search space, and service optimization. In this research, Wikipedia scientific-related articles are connected with more than 35 petabytes of Pangaea datasets. The output of this work is a web service that takes Wikipedia article information as input and provides the possibly relevant datasets (if any) related to the article. The evaluation of this research is based on a quantitative assessment of web service quality metrics, such as the number of accesses and bandwidth utilization, which shows that the framework is robust enough to handle both big data access and its citation.

Author 1: Arif Bramantoro
Author 2: Ahmad A. Alzahrani

Keywords: Scientific dataset; web services; wikipedia; pangaea; big data

Download PDF

Paper 69: Software Abstractions for Large-Scale Deep Learning Models in Big Data Analytics

Abstract: The goal of big data analytics is to analyze datasets with high volume, velocity, and variety for large-scale business intelligence problems. These workloads are normally processed in a distributed fashion on massively parallel analytical systems. Deep learning is part of a broader family of machine learning methods based on learning representations of data. Deep learning plays a significant role in information analysis by adding value to massive amounts of unsupervised data. A core domain of research is the development of deep learning algorithms for the automatic extraction of complex data formats at a higher level of abstraction using massive volumes of data. In this paper, we present the latest research trends in the development of parallel algorithms, optimization techniques, tools, and libraries related to big data analytics and deep learning on various parallel architectures. The basic building blocks for deep learning, such as Restricted Boltzmann Machines (RBM) and Deep Belief Networks (DBN), are identified and analyzed for the parallelization of deep learning models. We propose a parallel software API based on PyTorch, the Hadoop Distributed File System (HDFS), Apache Hadoop MapReduce, and MapReduce Job (MRJob) for developing large-scale deep learning models. We obtained about a 5-30% reduction in the execution time of the deep auto-encoder model, even on a single-node Hadoop cluster. Furthermore, the complexity of code development is significantly reduced when creating multi-layer deep learning models.

Author 1: Ayaz H Khan
Author 2: Ali Mustafa Qamar
Author 3: Aneeq Yusuf
Author 4: Rehanullah Khan

Keywords: Big data; deep learning; deep auto-encoders; Restricted Boltzmann Machines (RBM)

Download PDF

Paper 70: Person Detection from Overhead View: A Survey

Abstract: In recent years, overhead-view person detection has gained importance due to its handling of the occlusion problem and its better scene coverage compared to the frontal view. In computer vision, overhead-based person detection holds significant importance in many applications, including person detection, person counting, person tracking, behavior analysis, and occlusion-free surveillance systems. This paper aims to provide a comprehensive survey of recent developments and challenges related to person detection from the top view. To the best of our knowledge, it is the first attempt to survey the different overhead person detection techniques. This paper provides an overview of state-of-the-art overhead person detection methods and guidelines for choosing the appropriate method in real-life applications. The techniques are divided into two main categories: blob-based techniques and feature-based techniques. Various detection factors such as field of view, region of interest, color space, and image resolution are also examined, along with a variety of top-view datasets.

Author 1: Misbah Ahmad
Author 2: Imran Ahmed
Author 3: Kaleem Ullah
Author 4: Iqbal khan
Author 5: Ayesha Khattak
Author 6: Awais Adnan

Keywords: Person detection; background subtraction; segmentation; blob based techniques; feature based techniques

Download PDF

Paper 71: Towards a Context-Dependent Approach for Evaluating Data Quality Cost

Abstract: Data-related expertise is a central and determining factor in the success of many organizations. Big Tech companies have developed an operational environment that extracts benefit from collected data to increase the efficiency and effectiveness of daily operations and the services offered. However, in a complex economic environment with transparent accounting and financial management, it is not possible to solve data quality issues with "dollars" without justifications and measurable indicators beforehand. The overall goal is not to improve data quality by any means, but to plan cost-effective data quality projects that benefit the organization. This knowledge is particularly relevant for organizations with little or no experience in the field of data quality assessment and improvement. Indeed, it is important that the costs and benefits associated with data quality are explicit and, above all, quantifiable for both business managers and IT analysts. Organizations must also evaluate the different scenarios related to the implementation of data quality projects; the optimal scenario must provide the best financial and business value and meet the specifications in terms of time, resources, and cost. The approach presented in this paper is evaluation-oriented: for data quality projects, it evaluates the positive impact on the organization's financial and business objectives, which can be linked to the positive value of quality improvement, and the implementation complexity, which can be coupled with the costs of quality improvement. This paper also attempts to translate implementation complexity empirically into costs expressed in monetary terms.

Author 1: Meryam Belhiah
Author 2: Bouchaïb Bounabat

Keywords: Data quality improvement project; cost of data quality; data quality assessment and improvement; cost/benefit analysis

Download PDF

Paper 72: Towards Usability Guidelines for the Design of Effective Arabic Websites

Abstract: Arabic websites constitute 1% of web content, with more than 225 million viewers and 41% Internet penetration. However, there is a lack of design guidelines related to the selection and use of appropriate font types, font sizes, and images in Arabic websites. Both text and images are vital multimedia components of websites and were thereby selected for investigation in this study. This paper performs an in-depth inspection of font and image design practices within the 73 most visited Arabic websites in Saudi Arabia, according to the Alexa Internet ranking in the first quarter of 2019. Our exhaustive analysis showed discrepancies between international design recommendations and the actual design of Arabic websites. There was considerable variation and inconsistency in the use of font types and sizes between and within the Arabic websites. Droid Arabic Kufi was used mostly for styling page titles and navigation menus, whilst Tahoma was used for styling paragraphs. The font size of the Arabic text ranged from 12 to 16 pixels, which may lead to poor readability. Images were used heavily in the Arabic websites, causing prolonged site loading times. Moreover, the images strongly reflected the dimensions of Saudi culture, especially collectivism and masculinity. Current Arabic web design practices are compared against findings from past studies of international designs, and lessons aimed at improving Arabic web design are inferred.

Author 1: Abdallah Namoun
Author 2: Ahmad B. Alkhodre

Keywords: Arabic websites; design principles; font type; font size; readability; legibility; images; graphics; site performance

Download PDF

Paper 73: Digital Certificate Exchange of Agricultural Products using SOAP Web Services

Abstract: Developing countries have continued to experience a number of challenges in managing import and export certificates for various goods. In this paper, we propose a model for digital certificate exchange in an effort to improve the security of data exchange between government organizations and the business community. With the increase in the number of information systems used in many organizations, data exchange between systems has become critical. The developed model uses SOAP web services for data exchange and RSA encryption to secure the exchanged data. In Zambia, the Ministry of Agriculture is responsible for the issuance of import and export certificates, while the Zambia Revenue Authority is responsible for ensuring that goods imported into or exported out of the country have a valid, authentic certificate. The results show that the model provides a secure and timely exchange of information between the ministries and the government agencies.
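
A hedged sketch of the RSA leg of the exchange is shown below: the sender encrypts a small certificate payload with the receiver's public key so that only the receiver can decrypt it. The key size and payload are illustrative, and real deployments typically encrypt a symmetric session key rather than the document itself.

```python
# Illustrative RSA-OAEP exchange of a small certificate payload; the payload
# and key size are invented stand-ins, not the paper's actual message format.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

authority_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = authority_key.public_key()

payload = b"cert_id=ZM-2019-00042;product=maize;status=valid"  # hypothetical
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(payload, oaep)       # sent in the SOAP body
recovered = authority_key.decrypt(ciphertext, oaep)  # decrypted at the authority
assert recovered == payload
print("certificate payload exchanged securely")
```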

Author 1: Miyanda Chilikwela
Author 2: Jackson Phiri

Keywords: Digital; certificate; Rivest, Shamir and Adleman (RSA); Simple Object Access Protocol (SOAP); web service; model

Download PDF

Paper 74: New Approach based on Model Driven Engineering for Processing Complex SPARQL Queries on Hive

Abstract: Semantic web technologies are increasingly used in different domains, and the core technology of the Semantic Web is the RDF standard. Today, the growth of RDF data calls for systems capable of handling these large volumes of data and answering queries that are very complex at the join level. Several RDF data processing systems have been proposed, but they are not dedicated to handling complex SPARQL queries. In this paper, we present a new approach based on model driven engineering for processing complex SPARQL queries using Hive, one of the big data processing tools. We evaluate our system using three datasets from the LUBM benchmark. The results of this evaluation demonstrate the performance and scalability of our approach, which also compares very favorably with existing works.
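
To make the SPARQL-on-Hive idea concrete, here is a minimal sketch (not the authors' MDE transformation) of the standard mapping: each triple pattern in a SPARQL basic graph pattern becomes one scan of a Hive table triples(s, p, o), and variables shared across patterns become join conditions. The table layout and predicate names are illustrative.

```python
def bgp_to_hiveql(patterns):
    """patterns: list of (subject, predicate, object); '?x' marks a variable."""
    selects, froms, wheres = [], [], []
    var_bindings = {}  # variable name -> first column that binds it
    for i, (s, p, o) in enumerate(patterns):
        alias = f"t{i}"
        froms.append(f"triples {alias}")
        for col, term in (("s", s), ("p", p), ("o", o)):
            ref = f"{alias}.{col}"
            if term.startswith("?"):                 # variable
                if term in var_bindings:
                    wheres.append(f"{ref} = {var_bindings[term]}")  # join
                else:
                    var_bindings[term] = ref
                    selects.append(f"{ref} AS {term[1:]}")
            else:                                    # constant
                wheres.append(f"{ref} = '{term}'")
    return (f"SELECT {', '.join(selects)} FROM {', '.join(froms)}"
            + (f" WHERE {' AND '.join(wheres)}" if wheres else ""))

# Two LUBM-style patterns sharing ?y compile to a self-join on t0.o = t1.s:
print(bgp_to_hiveql([("?x", "advisor", "?y"), ("?y", "memberOf", "?z")]))
```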

Author 1: Mouad Banane
Author 2: Abdessamad Belangour

Keywords: SPARQL; big data; model driven engineering; RDF

Download PDF

Paper 75: Improved Cryptanalysis of Provable Certificateless Generalized Signcryption

Abstract: Certificateless generalized signcryption adaptively works as a certificateless signcryption, signature, or encryption scheme within a single algorithm, which makes it suitable for storage-constrained environments. Recently, Zhou et al. proposed a novel certificateless generalized signcryption scheme and proved its ciphertext indistinguishability under adaptive chosen ciphertext attacks (IND-CCA2), as well as its existential unforgeability against chosen message attacks (EUF-CMA), under the Gap Bilinear Diffie-Hellman and Computational Diffie-Hellman assumptions in the random oracle model. In this paper, we analyze Zhou et al.'s scheme and show that, unfortunately, it is IND-CCA2 insecure in both its encryption and signcryption modes within the defined security model. We also present a practical, improved scheme that is provably secure in the random oracle model.
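
The attack itself depends on the details of Zhou et al.'s scheme, but the notion being broken is the standard IND-CCA2 game, sketched below under hypothetical Scheme/Adversary interfaces (a definition sketch, not the paper's cryptanalysis).

```python
import secrets

def ind_cca2_experiment(scheme, adversary):
    """Return True iff the adversary wins one run of the IND-CCA2 game."""
    pk, sk = scheme.keygen()
    # Phase 1: the adversary may query the decryption oracle freely.
    decrypt_oracle = lambda c: scheme.decrypt(sk, c)
    m0, m1, state = adversary.choose_messages(pk, decrypt_oracle)
    b = secrets.randbits(1)
    challenge = scheme.encrypt(pk, (m0, m1)[b])
    # Phase 2: same oracle, but the challenge ciphertext is off-limits.
    def restricted_oracle(c):
        if c == challenge:
            raise ValueError("decryption of the challenge is forbidden")
        return scheme.decrypt(sk, c)
    guess = adversary.guess(challenge, state, restricted_oracle)
    return guess == b

# A scheme is IND-CCA2 secure if every efficient adversary wins with
# probability only negligibly better than 1/2 over many runs.
```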

Author 1: Abdul Waheed
Author 2: Jawaid Iqbal
Author 3: Nizamud Din
Author 4: Shahab Ul Islam
Author 5: Arif Iqbal Umar
Author 6: Noor Ul Amin

Keywords: Digital signature; certificateless encryption; certificateless generalized signcryption; malicious-but-passive KGC; random oracle model

Download PDF

Paper 76: Efficient Energy Utilization in Cloud Fog Environment

Abstract: Cloud computing provides various kinds of services, such as storage and processing, that can be accessed on demand when required. Despite its countless benefits, it also presents issues that limit the full adoption of the cloud. The main issues faced in adopting cloud infrastructure are high latency and the lack of location awareness. To overcome these issues, the concept of fog computing was introduced to reduce the load on the cloud and improve the allocation of resources. The fog provides the same services as the cloud; its main features are location awareness, low latency, and mobility. However, the increasing use of IoT devices also increases the usage of the cloud-fog environment, and this heavy usage is drawing researchers' attention to energy consumption in the fog. In this paper, we address the problem of energy consumption in resource allocation by applying load balancing algorithms and comparing their results against energy models.
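
A minimal sketch of the kind of comparison described, assuming the widely used linear power model P(u) = P_idle + (P_max - P_idle) * u for node utilization u, with tasks spread by a simple round-robin balancer. This is not the paper's simulator; all node parameters and task loads are hypothetical.

```python
def node_power(p_idle, p_max, utilization):
    """Linear power model: idle draw plus load-proportional draw."""
    return p_idle + (p_max - p_idle) * utilization

def round_robin(tasks, num_nodes):
    """Distribute task loads (fractions of one node's capacity) cyclically."""
    loads = [0.0] * num_nodes
    for i, task in enumerate(tasks):
        loads[i % num_nodes] += task
    return loads

tasks = [0.10, 0.30, 0.20, 0.25, 0.15, 0.05]
balanced = round_robin(tasks, num_nodes=3)   # -> [0.35, 0.45, 0.25]
unbalanced = [sum(tasks), 0.0, 0.0]          # everything on one node

for name, loads in (("balanced", balanced), ("unbalanced", unbalanced)):
    total = sum(node_power(p_idle=70.0, p_max=250.0, utilization=min(u, 1.0))
                for u in loads)
    print(f"{name}: {total:.1f} W")

# Idle nodes still draw p_idle, which is why consolidation versus balancing
# is the central trade-off energy models in cloud/fog studies expose.
```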

Author 1: Babur Hayat Malik
Author 2: Muhammad Nauman Ali
Author 3: Sheraz Yousaf
Author 4: Mudassar Mehmood
Author 5: Hammad Saleem

Keywords: Energy efficiency; fog computing; cloud computing; load balancing; resources allocation

Download PDF
