The Science and Information (SAI) Organization

IJACSA Volume 10 Issue 11

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

View Full Issue

Paper 1: Anomaly Detection using Unsupervised Methods: Credit Card Fraud Case Study

Abstract: Credit card usage has increased dramatically, and with it the losses suffered by cardholders and credit card companies due to fraud. Supervised learning has been widely used to detect anomalies in credit card transaction records, based on the assumption that the pattern of a fraud depends on past transactions. Unsupervised learning, however, accounts for the fact that fraudsters may change their approach in response to customers' behaviors and patterns. In this study, three unsupervised methods are presented: an autoencoder, a one-class support vector machine, and robust Mahalanobis outlier detection. The dataset used in this study is based on real-life credit card transaction data. Because the response (fraud labels) was available, the performance of each model could be evaluated after training, and the performance of the three methods is discussed extensively in the manuscript. For the one-class SVM and the autoencoder, only normal-transaction labels were used for training; the advantage of the robust Mahalanobis method over these methods is that it requires no labels at all for training.

Author 1: Mahdi Rezapour

Keywords: Credit card fraud; anomaly detection; SVM; Mahalanobis distance; autoencoder; unsupervised techniques

PDF
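As a rough illustration of the Mahalanobis outlier-detection idea in the abstract above, the sketch below flags points whose squared Mahalanobis distance from the sample mean exceeds a chi-square cutoff. Note this uses the plain (non-robust) mean and covariance for brevity, whereas the paper uses a robust variant; the data and threshold here are made up.

```python
import numpy as np

def mahalanobis_outliers(X, threshold):
    """Flag rows of X whose squared Mahalanobis distance from the
    sample mean exceeds `threshold` (e.g. a chi-square quantile)."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = X - mu
    # squared Mahalanobis distance of each row: (x-mu)^T S^-1 (x-mu)
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return d2 > threshold

# Toy data: 200 "normal transactions" in 2-D plus one far-away point.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, size=(200, 2)), [[8.0, 8.0]]])
flags = mahalanobis_outliers(X, threshold=13.8)  # ~chi2(2) 0.999 quantile
```

A robust estimator (e.g. minimum covariance determinant) would replace the plain mean/covariance so that the outliers themselves do not distort the distance metric.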

Paper 2: On the Possibility of Implementing High-Precision Calculations in Residue Numeral System

Abstract: This article proposes a method for accelerating high-precision calculations by parallelizing the arithmetic operations of addition, subtraction and multiplication. The proposed approach exploits the advantages of the residue numeral system: the absence of carries when adding, subtracting and multiplying, and the reduction of high-precision calculations on numbers of high digit capacity to parallel, independent arithmetic operations on numbers of low digit capacity across many moduli. Because non-modular operations, such as inverse transformation into a positional numeral system, number comparison, sign identification and number rank calculation, are complex in a residue numeral system, acceleration of high-precision calculations is achievable for computational problems with a small number of non-modular operations, for example the scalar product of vectors, the discrete Fourier transform, and the iterative solution of systems of linear equations by the methods of Jacobi, Gauss-Seidel, etc. The implementation of the proposed method is demonstrated by the example of finding the scalar product of vectors.

Author 1: Otsokov Sh A
Author 2: Magomedov Sh.G

Keywords: High-precision calculations; residue numeral system; positional numeral system; number conversion; rank determination

PDF
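The carry-free, per-modulus arithmetic described above can be sketched in a few lines: each number is stored as residues modulo pairwise-coprime moduli, addition and multiplication run independently per modulus, and only the final reconstruction (a non-modular operation, via the Chinese Remainder Theorem) is serial. The moduli here are illustrative, not the paper's.

```python
from math import prod

MODULI = (251, 253, 255, 256)  # pairwise coprime; product bounds the result
M = prod(MODULI)

def to_rns(x):
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    # componentwise, carry-free, parallelizable across moduli
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """CRT reconstruction back to positional form (a non-modular step)."""
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)  # modular inverse (Python 3.8+)
    return total % M

# Scalar product of two vectors, carried out digit-parallel per modulus.
u, v = [123, 456, 789], [321, 654, 987]
acc = to_rns(0)
for a, b in zip(u, v):
    acc = rns_add(acc, rns_mul(to_rns(a), to_rns(b)))
assert from_rns(acc) == sum(a * b for a, b in zip(u, v))
```

The scalar product needs only one reconstruction at the end, which is exactly why it is among the problems the abstract lists as profiting from this scheme.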

Paper 3: A Hybrid Model of Autoregressive Integrated Moving Average and Artificial Neural Network for Load Forecasting

Abstract: Pairing statistical modeling with machine learning to exploit their complementary strengths and weaknesses is an established technique for building forecasting models that analyze both the linear and nonlinear components of a dataset to generate accurate predictions. In this paper, an autoregressive integrated moving average (ARIMA) model and artificial neural networks (ANN) were implemented as a hybrid forecasting model for a power utility's dataset in order to predict the next day's electric load consumption. The ARIMA and ANN models were developed serially; of the twelve evaluated ARIMA models, ARIMA(8,1,2) exhibited the best forecasting performance. After identifying the optimal ANN layers and input neurons, this study showed that, of the six evaluated supervised feedforward ANN models, the one employing the hyperbolic tangent activation function and the resilient propagation training algorithm likewise performed best. Using Zhang's ARIMA and ANN hybridization technique, the hybrid model delivered a Mean Absolute Percentage Error (MAPE) of 4.09%, within the 5% internationally accepted forecasting error for electric load forecasting. These findings indicate that the ARIMA statistical model and the ANN machine learning approach are a promising forecasting pair for analyzing the linear and nonlinear properties of a power utility's electric load data.

Author 1: Lemuel Clark P Velasco
Author 2: Daisy Lou L. Polestico
Author 3: Gary Paolo O. Macasieb
Author 4: Michael Bryan V. Reyes
Author 5: Felicisimo B. Vasquez Jr

Keywords: Hybrid model; autoregressive integrated moving average; electric load forecasting; Artificial Neural Network (ANN)

PDF
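Zhang's hybridization, as referenced in the abstract above, decomposes the series as y = L + N: a linear model (ARIMA) forecasts L, a neural network is trained on the ARIMA residuals, and its output is added back. A toy numeric sketch of the combination and the MAPE metric (all load values hypothetical, not from the paper):

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, the accuracy metric cited above."""
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hybrid forecast = linear (ARIMA) component + ANN forecast of the residuals.
actual         = np.array([100.0, 110.0, 105.0, 120.0])   # hypothetical load (MW)
arima_forecast = np.array([ 98.0, 112.0, 101.0, 117.0])   # linear component L_t
ann_residual   = np.array([  1.5,  -1.0,   3.0,   2.0])   # ANN forecast of e_t
hybrid         = arima_forecast + ann_residual

print(mape(actual, arima_forecast), mape(actual, hybrid))
```

In this made-up example the hybrid MAPE drops below the ARIMA-only MAPE, which is the effect the paper reports at full scale (4.09% for the hybrid).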

Paper 4: The Use of Microwave Drying Process to the Granular Materials

Abstract: The use of electrothermal technologies based on microwave energy represents an important step in the development of new, innovative solutions. Numerical modelling makes it possible to study the influence of high-frequency electromagnetic and thermal fields on dielectric materials during the drying process before building a practical installation. Thus, by the time the experimental model is developed, some of the phenomena that characterize the system are already known, eliminating a number of unknowns. This paper describes experiments conducted to gather data on production parameters in order to improve the quality of stored corn seed. The interpretation and dissemination of the results leads to the description of "recipes" for drying corn seeds. The described method is flexible and can be applied to nearly any agricultural seed in further research.

Author 1: Francisc Ioan Hathazi
Author 2: Vasile Darie Soproni
Author 3: Mircea Nicolae Arion
Author 4: Carmen Otilia Molnar
Author 5: Simina Vicas (Coman)
Author 6: Olimpia Smaranda Mintas

Keywords: Microwave energy; microwave field; experimental models; laboratory models; drying and seed treatment

PDF

Paper 5: Intuition, Accuracy, and Immersiveness Analysis of 3D Visualization Methods for Haptic Virtual Reality

Abstract: The purpose of this study is to analyze the usefulness of 3D immersive stereoscopic virtual reality technology in applications that provide tactile sensations. Diverse experiments show that haptic 3D visualization presented in a stereoscopic 3D space using headsets is more intuitive, accurate and immersive than haptic 3D visualization displayed on flat 3D displays. In particular, intuitiveness improved significantly with stereoscopic rather than flat 3D visualization. Despite the general superiority of stereoscopic 3D visualization, it is notable that precise operation performance did not improve greatly for elaborate object movements. That is, users can recognize and manipulate the position of objects more quickly in stereoscopic immersive VR environments, but precise operation does not benefit greatly from stereoscopic visualization. Notably, the degree of game immersion is remarkably augmented when stereoscopic 3D visualization is used.

Author 1: Taehoon Kim
Author 2: Chanwoo Kim
Author 3: Hyeonseok Song
Author 4: Mee Young Sung

Keywords: 3D; stereoscopic; haptic; intuition; immersiveness; recognize; simulation; virtual reality; visualization

PDF

Paper 6: Performance Evaluation of E32 Long Range Radio Frequency 915 MHz based on Internet of Things and Micro Sensors Data

Abstract: This research discusses how to build and analyze a node sensor network based on the 915 MHz Long Range (LoRa) E32 module, with a micro sensor producing three outputs: temperature (°C), air pressure (hPa), and humidity (%). The research succeeded in building a sensor node using the LoRa E32 915 MHz with a mini-type ATmega328P microcontroller and a 3.7 V, 1000 mAh battery. The receiver display uses an 8x2 LCD that shows the three sensor outputs. The analysis in this research covers the LoRa chirp signal, obtained in real time from a Tektronix spectrum analyzer, Quality of Service (QoS), the Received Signal Strength Indicator (RSSI, -dBm), and uplink and downlink data on the Internet server. The micro sensor output is displayed on the application server as a sensor data graph; the application server used in this research is ThingSpeak from MathWorks.

Author 1: Puput Dani Prasetyo Adi
Author 2: Akio Kitagawa

Keywords: Long range; microcontroller; internet of things; quality of service; micro sensor

PDF

Paper 7: e-Learning System using Mashup based e-Learning Content Collection and an Attractive Avatar in OpenSimulator

Abstract: A mashup-based e-learning content collection and delivery system with an attractive avatar in OpenSimulator is proposed for making learning processes attractive and for finding students' weak points during learning with the avatar. Through experiments with undergraduate students, it was found that the proposed e-learning system is useful for them. Students can find their weak points through the learning process, and the avatar provides questions on each student's weak subjects. The students can enjoy communicating with their avatar. Therefore, using the proposed e-learning system, students receive the most appropriate e-learning content, provided by mashup-based information retrieval, and take lessons attractively with their own avatar.

Author 1: Kohei Arai

Keywords: OpenSimulator; Mashup; e-learning; mobile learning; avatar; search engine

PDF

Paper 8: The Impact of Social Networks on Students’ Academic Achievement in Practical Programming Labs

Abstract: Internet and Communication Technology (ICT) is applied extensively in education to allow students to obtain information anytime, from anywhere. Social media is a new approach used to enhance and improve the delivery of education, and Facebook groups and YouTube channels are the most commonly used networks for delivering information. In this paper, we study the impact of applying Facebook groups and YouTube videos on students' academic achievement in computer programming labs, specifically the object-oriented programming 2 (OOP2) lab at the Hashemite University. The practical programming lab plays an important role in understanding theoretical programming concepts, which is why a programming lab was chosen as the case study. The proposed methodology embeds social media networks as a new major dimension in the teaching process of the OOP2 lab, side by side with traditional lectures and e-learning tools such as Moodle. In this research, three surveys were used: to inspect the reasons for weakness in the OOP2 lab, to have students evaluate the course learning outcomes (CLOs) before and after applying the proposed methodology, and to investigate students' opinions on using social media networks in the learning process. The results showed that working through the social media sites students use on a daily basis creates a friendly and close educational environment and enhances students' academic results.

Author 1: Ayat Al Ahmad
Author 2: Randa Obeidallah

Keywords: e-Learning; Facebook; YouTube; social network; programming lab; course learning outcome

PDF

Paper 9: Performance Evaluation of Different Short Path Algorithms to Improve Oil-Gas Pipelines

Abstract: The oil-gas pipeline is a complicated and expensive system in terms of construction, control, materials, monitoring, and maintenance, and it involves economic, social and environmental hazards. In the case study of Iraq, the pipeline system is above ground and is liable to disasters that may produce an environmental tragedy as well as the loss of life and money. Hence, this article presents a performance evaluation of different short path algorithms to improve oil-gas pipelines. The chosen algorithms were the Parallel Short Path Algorithm (PSPA), the Ant Colony Optimization (ACO) algorithm and the Genetic Algorithm (GA). The main performance metric is the cost of the pipelines. Simulation trials were performed in MATLAB for the chosen algorithms. The performance comparison showed that the lowest cost of laying oil and gas pipelines was obtained with the GA when the number of wells was between 50 and 600. Conversely, PSPA showed the best performance in terms of implementation time for all scenarios, and acceptable performance in terms of pipeline cost when the number of wells was between 50 and 600. Furthermore, PSPA showed the best cost performance for 700 and 840 wells compared to ACO and GA, while the ACO algorithm showed medium cost performance compared to PSPA and GA.

Author 1: Nabeel Naeem Almaalei
Author 2: Siti Noor Asyikin Mohd Razali

Keywords: PSPA; ACO; GA; Oil-Gas pipeline; performance; cost; short path

PDF
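For readers unfamiliar with the shortest-path framing above, the classic single-source algorithm (Dijkstra) conveys the core idea of minimizing cumulative laying cost over a network of candidate pipeline segments. This is a generic textbook sketch, not the paper's PSPA, and the well network below is invented.

```python
import heapq

def dijkstra(graph, src):
    """Minimum cumulative edge cost from `src` to every reachable node.
    `graph` maps a node to a list of (neighbour, cost) pairs."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network: wells A..D connected by candidate segments (cost units).
wells = {
    'A': [('B', 4), ('C', 1)],
    'C': [('B', 2), ('D', 5)],
    'B': [('D', 1)],
}
print(dijkstra(wells, 'A'))  # cheapest routing cost from well A to each well
```

Metaheuristics such as ACO and GA become attractive when the problem adds constraints (terrain, capacities, multiple sources) that push it beyond what plain shortest-path algorithms model.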

Paper 10: A New Model to Detect 2D Hand based on Multi-feature Skin Model

Abstract: Hand gesture recognition is one of the fastest-growing human-computer interfaces. In most vision-based gesture recognition systems, the initial phase is hand detection and separation. Because hands are involved in a wide variety of daily tasks, hand detection faces both extraordinary changes in illumination and the inherently variable appearance of the hand. To address these issues, we propose a 2D hand detection approach that combines multi-feature hand proposal generation with cascaded convolutional neural network classification (CCNN). To cope with varying luminance, we select color, Gabor, HOG and SIFT features to separate the skin and produce hand proposals. We then propose a cascaded CNN that retains deep context information between the proposals. The proposed Multi-Feature Skin-model Cascaded CNN (MFS-CCNN) method was tested on a combination of datasets, including parts of the Oxford Hands, VIVA Hand Detection, and EgoHands datasets as positive examples and image patches from ImageNet 2012 and the FDDB dataset as negative examples. The proposed technique achieves competitive results. On the sample dataset, our average accuracy is considerably inferior to DPM's; our CCNN and MFS-CCNN models achieve average accuracies of 43.55 and 51.78 percent, respectively. The average accuracy of the CCNN model on a combined test set is 9.16% higher than that of the SSD model. Still, our model is faster than a DPM-based approach in terms of runtime performance.

Author 1: Abdullah Shawan Alotaibi

Keywords: Hand detection; feature modeling; convolutional neural networks

PDF

Paper 11: The Computational Efficiency of Monte Carlo Breakage of Particles using Serial and Parallel Processing: A Comparison

Abstract: This paper presents a GPU-based parallel and a CPU-based serial Monte Carlo method for the breakage of a particle. We compare the efficiency of the graphics processing unit (GPU) and the general-purpose central processing unit (CPU) in a simulation using Monte Carlo (MC) methods to process particle breakage. Three applications are used to compare computational times, clock cycles and speedup factors, to find which platform is faster under which conditions. The architecture of the GPU is becoming increasingly programmable and offers a potential speedup for many applications compared to the modern CPU. The objective of the paper is to compare the performance of the GPU and an Intel Core i7-4790 multicore CPU. The CPU implementation was written in the C programming language, and the GPU implementation uses kernels written in Nvidia's CUDA (Compute Unified Device Architecture). The paper compares computational times, clock cycles and speedup factors for the GPU and the CPU under various simulation settings, such as the number of simulation entries (SEs), for a better understanding of GPU and CPU computational efficiency. It was found that the number of SEs directly affects the speedup factor.

Author 1: Jherna Devi
Author 2: Jagdesh Kumar

Keywords: Breakage of particles; Central Processing Unit (CPU); Graphics Processing Unit (GPU); CUDA; computational time; clock cycle; speedup factor

PDF

Paper 12: Developing Lexicon-based Algorithms and Sentiment Lexicon for Sentiment Analysis of Saudi Dialect Tweets

Abstract: The majority of studies in the sentiment analysis field, specifically on the Arabic lexicon-based approach, focus on applying preprocessing methods to the target dataset text or to textual data collected from Twitter, rather than on the lexicon itself. This study proposes a new method: we first concentrate on building a new sentiment lexicon with a reasonable number of words, and then apply adequate preprocessing to the lexicon's words in addition to the Twitter dataset. The study presents a Saudi dialect sentiment lexicon called SaudiSentiPlus, containing 7,139 words mostly generated from Saudi tweets and other dictionaries. This study also presents two lexicon-based algorithms for the Saudi dialect that handle prefix and suffix letters in order to increase the performance of the proposed lexicon. The experiment conducted to evaluate the performance of SaudiSentiPlus comprises four phases, with precision, recall, accuracy, and F-score measured in every phase. We built our test dataset from Twitter by focusing on Saudi dialect hashtags (971 thousand tweets from 162 hashtags). The results show that the accuracy of SaudiSentiPlus with the two lexicon-based algorithms reached 81%.

Author 1: Waleed Al-Ghaith

Keywords: Sentiment analysis; opinion mining; lexicon-based; Arabic text mining; Saudi Arabia

PDF
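The prefix/suffix handling described above can be illustrated with a toy lexicon-based scorer: attached affixes are stripped so inflected forms still match lexicon entries, and the tweet's polarity is the sum of matched word scores. The mini-lexicon, affix lists and scoring rule below are illustrative stand-ins, not the actual SaudiSentiPlus resources or algorithms.

```python
# Hypothetical mini-lexicon; SaudiSentiPlus itself maps words to polarities.
LEXICON = {"جميل": +1, "سيئ": -1, "ممتاز": +1}
PREFIXES = ("وال", "بال", "ال", "و", "ب", "ف")   # common attached prefixes
SUFFIXES = ("ها", "هم", "ه", "ة", "ين")          # common attached suffixes

def normalize(word):
    """Strip one prefix and one suffix (longest first) so inflected
    forms hit the lexicon; keep at least two letters of the stem."""
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= 2:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= 2:
            word = word[:-len(s)]
            break
    return word

def score(tweet):
    """Sum of lexicon polarities: >0 positive, <0 negative, 0 neutral."""
    return sum(LEXICON.get(normalize(w), 0) for w in tweet.split())

print(score("والجميل ممتاز"))  # prefixed form still matches the lexicon
```

Listing longer affixes before their substrings (e.g. "وال" before "و") matters, since the first match wins.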

Paper 13: Empirical Analysis of Object-Oriented Software Test Suite Evolution

Abstract: Software systems evolve over time; thus, the test suite must be repaired to match the changing code. Updating test cases manually is a time-consuming activity, especially for large test suites, which motivates the recent development of automatic test-repair techniques. To develop an effective automatic repair technique that reduces development effort and evolution cost, the developer should understand how test suites evolve in practice. This investigation conducts a comprehensive empirical study on eight Java systems, across many versions of these systems and their test suites, to find out how the test suite evolves and to find the relationship between changes in the program and the corresponding evolution of the test suite. The study showed that test suite size mostly increases, whereas test suite complexity remains stable. An increase (or decrease) in code size mostly increases (or decreases) test suite size; however, increases or decreases in code complexity are offset by the stable test suite complexity. Moreover, the percentage of code coverage tends to increase more often than decrease, while for mutation coverage the opposite is true.

Author 1: Nada Alsolami
Author 2: Qasem Obeidat
Author 3: Mamdouh Alenezi

Keywords: Software; test; code complexity; code coverage; test evolution

PDF

Paper 14: On Designing Bee Inspired Routing Algorithm for Device-to-Device Communication in the Internet of Things

Abstract: Device-to-device (D2D) communication is a popular research trend that provides ubiquitous information exchange on the Internet of Things. D2D communication provides data exchange without transiting through a base station, using direct communication between two devices. In such an environment, successful delivery of data to the receiver is essential. In this paper, we propose a Bee-Inspired Routing Algorithm (BIRA) for D2D communication in the IoT that exploits the multiple interfaces of a "thing" supporting different wireless standards. BIRA is an on-demand routing algorithm that simulates the bees' foraging behavior to find the optimal path between source and destination for multi-hop communication. The performance of BIRA was assessed through extensive simulations, which show that BIRA achieves a better packet delivery ratio and a lower average end-to-end delay under different traffic loads compared to the conventional AODV protocol. BIRA also achieves lower energy consumption than AODV and increases network lifetime.

Author 1: Asmaa Mohammed Almazmoomi
Author 2: Muhammad Mostafa Monowar

Keywords: Device-to-device communication; internet-of-things; Bee-Inspired Algorithm; routing protocol

PDF

Paper 15: Computerized Drug Verification System: A Panacea for Effective Drug Verification

Abstract: The Computerized Drug Verification System (CDVS) is a research work geared towards establishing means of identifying authentic drugs in Nigeria, with emphasis on identifying the manufacturing date and expiration date, using an interactive mobile app. At the point of purchase, only a few drugs offer methods of verifying their authenticity using both a Personal Identification Number (PIN) and a National Agency for Food and Drug Administration and Control (NAFDAC) number. However, the production and expiry dates of such drugs are not always known, so drugs that have long expired may still be reported as authentic, thereby endangering consumers' lives. This research addresses these challenges by providing an interactive platform for drug verification, with particular strength in its ability to incorporate the manufacturing date, the manufacturer, the expiry date and the authentication of drugs. The product of this research, "NAFDAC VERIFY", an interactive mobile and web application, can be used to verify the authenticity of drugs in Nigeria in partnership with the Mobile Authentication Service (MAS). The system, which runs both on Android mobile phones and on the web, has been tested with raw data, proving robust and achieving the set objectives.

Author 1: Oketa Christian Kelechi
Author 2: Alo Uzoma Rita
Author 3: Okemiri Henry Anayo
Author 4: Richard-Nnabu Nneka Ernestecia
Author 5: Achi Ifeanyi Isaiah
Author 6: Chinazo I. Chima
Author 7: Afolabi Idris Yinka
Author 8: Mgbanya Praise Chinenye

Keywords: NAFDAC verify; security; drug verification; information system; drug authentication

PDF

Paper 16: Empirical Analysis of Context based Content Delivery for M-Learning Scenarios using ANFIS

Abstract: The internet and mobile devices are unusually popular in today's everyday life, offering unprecedented possibilities for learning with mobility. This kind of learning, available at any point in the world, is called M-learning (mobile learning). Meeting learners' needs is central in the current scenario, in which collaborative electronic and mobile education systems are evolving rapidly. Every learner's needs differ in terms of context, content, learning style, speed of learning, preferences and places. Mobile learning lets learners learn while moving, at any time and in any place. Learning styles have evolved along with advances in technology, specifically mobile technology and mobile networks. Portable devices such as mobile phones, tablets and iPods are commonly used today, and their use in education has changed the way we learn. In an M-learning environment, the learner has access to content everywhere and at any time through mobile devices. Customization and awareness of the learner's context are important factors in providing the learner with relevant content, yet appropriate content delivery based on a learner's context is a complex process. A content delivery system is therefore needed that takes into account learners' needs such as context, style and device features. To model such a system, a neural network with fuzzy reasoning can be used to accommodate dynamically changing learning styles, contexts and smart-device characteristics, with if-then conditions forming the suggestion rules required for such a content delivery system. ANFIS, the Adaptive Neuro-Fuzzy Inference System, is a powerful tool for creating fuzzy systems with if-then rules, and can be used to model and analyze this type of context-aware, adaptive content delivery system for an M-learning environment.
In this article, the use of the ANFIS tool is demonstrated for various m-learning scenarios with different contexts. Four different contexts are constructed based on inputs given by the student learners. Using ANFIS, these four scenarios are analyzed empirically for their performance based on the RMSE of various membership functions.

Author 1: Sudhindra B Deshpande
Author 2: Shrinivas R.Mangalwede

Keywords: Personalization; adaptive learning; e-learning; distance learning; mobile learning; M-learning; ANFIS; fuzzy system; neural networks; context awareness; content adaptation; content delivery; expert system

PDF

Paper 17: Performance of LoRa Network for Environmental Monitoring System in Bidong Island Terengganu, Malaysia

Abstract: The recent wireless communication technology Low Power Wide Area Network (LPWAN) brings huge potential to monitoring systems, as the integration of sensor applications with LPWAN technology contributes to greater efficiency, higher productivity, and better quality of life in many sectors, such as healthcare, smart city applications, and monitoring systems. The project implements an LPWAN while delving into the performance of LoRa technology in the development of an environmental monitoring system on Bidong Island, in the state of Terengganu, Malaysia. An Arduino Uno microcontroller stacked with a LoRa module shield via an SPI connection, together with a few sensors, works as the end device, capturing environmental data and sending it to the gateway over long range at a very low data rate with low power consumption for remote monitoring. In this paper, we identify the spreading factor that works best in the island region by varying the spreading factor from SF6 to SF12. The results show that SF11 provides the best packet delivery ratio and the most stable Received Signal Strength Indicator (RSSI) compared to the other SFs. Moreover, we evaluate the external factors that caused packet loss and suggest ways to improve signal quality and achieve optimal signal transmission and communication performance.

Author 1: Nur Aziemah Azmi Ali
Author 2: Nurul Adilah Abdul Latiff
Author 3: Idrus Salimi Ismail

Keywords: LPWAN; LoRa; environmental monitoring; WSN

PDF
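The spreading-factor trade-off behind the SF6-SF12 sweep above follows from LoRa's modulation: a symbol is 2^SF chips at the chip rate set by the bandwidth, so each step up in SF doubles symbol time (improving range and sensitivity) while roughly halving the raw bit rate. A sketch using the standard first-order formulas (125 kHz bandwidth and 4/5 coding rate assumed; exact time-on-air also depends on preamble and payload framing):

```python
def lora_symbol_time_ms(sf, bw_hz=125_000):
    """LoRa symbol duration: 2**SF chips at the chip rate BW."""
    return (2 ** sf) / bw_hz * 1000

def lora_bitrate_bps(sf, bw_hz=125_000, cr=4 / 5):
    """Approximate raw LoRa bit rate: SF bits/symbol scaled by coding rate."""
    return sf * cr * bw_hz / (2 ** sf)

for sf in range(7, 13):
    print(f"SF{sf}: {lora_symbol_time_ms(sf):7.3f} ms/symbol, "
          f"{lora_bitrate_bps(sf):8.1f} bit/s")
```

At SF11 the symbol lasts about 16.4 ms and the raw rate falls to roughly 0.5 kbit/s, which is the longer-airtime, higher-robustness operating point the paper found to give the best packet delivery ratio on the island.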

Paper 18: Vulnerability Analysis of E-voting Application using Open Web Application Security Project (OWASP) Framework

Abstract: This paper reports on security concerns in the e-voting system used for the election of village heads. The analysis of the system and server uses two different tools to determine the accuracy of vulnerability scanning based on the OWASP framework. Scanning with the ZAP tool reported vulnerabilities at the following risk levels: one high, three medium, and eleven low. The Arachni tool reported one high-level, three medium-level, and two low-level vulnerabilities; ZAP thus provides a more complete vulnerability view than Arachni. The fatal findings in this e-voting system are XSS, which impacts clients and can be exploited by attackers to bypass security, and directory traversal, which allows attackers to access directories and execute commands outside of the web server's base directory. According to the 2018 Hiscox Cyber Readiness Report covering several countries, including the United States, Britain, Germany, Spain, and the Netherlands, attackers target the most vulnerable security holes, such as injection, broken authentication, sensitive data exposure, XXE, security misconfiguration, XSS, insecure deserialization, use of components with known vulnerabilities, and insufficient logging and monitoring. Cyberattacks can threaten the stability of a country and disturb other sectors. E-voting, as part of an electronic government system, therefore needs to be audited for security vulnerabilities that could disrupt the system.

Author 1: Sunardi
Author 2: Imam Riadi
Author 3: Pradana Ananda Raharja

Keywords: Vulnerability; e-voting; open web application security project framework; attacker

PDF

Paper 19: On the Prediction of Properties of Benzene using MP4 Method Executed on High Performance Computing with Heterogeneous Platform

Abstract: High computational complexity, high computational cost and large data volumes motivate this study of the physical and chemical properties of benzene. Given the limitations of memory, processor speed and the huge number of computational time steps, we propose a parallel implementation of the Gaussian suite of programs, particularly the module dealing with high-order Møller–Plesset perturbation theory, on a high performance computing (HPC) platform for predicting the physical and chemical properties of small to medium-sized molecules such as benzene, the subject of the present work. Besides the highly accurate geometrical parameters that MP4 simulation can offer, orbital shapes, HOMO-LUMO energy gaps and the spectral properties of the molecule are among the properties that can be predicted accurately. To achieve high performance, the program is executed in the multiple instruction, multiple data stream (MIMD) paradigm on a homogeneous processor architecture. The paper shows that the parallel Gaussian program, using the Linda software, can be executed on and is well suited to both homogeneous and heterogeneous processors. The performance evaluation is based on run time, temporal performance, effectiveness, efficiency, and speedup.

Author 1: Norma Alias
Author 2: Riadh Sahnoun
Author 3: Nur Fatin Kamila Zanalabidin

Keywords: MP4; parallel computing; homogenous platform; properties of benzene; high performance computing

PDF

Paper 20: A Framework for Traceable and Transparent Supply Chain Management for Agri-food Sector in Malaysia using Blockchain Technology

Abstract: This paper presents a framework for a traceable and transparent supply chain management (SCM) system for the agri-food sector in Malaysia using blockchain technology. Numerous researchers believe that the current SCM system has several weak points, especially when multiple enterprise resource planning (ERP) systems utilize centralized SCM; thus, data transparency and traceability are limited. This study hypothesizes that if blockchain technology correlates with the transparency and traceability of SCM, this limitation can be minimized, as blockchain technology works in a distributed manner. This research uses pepper as the agri-food domain. The research also recommends that a permissioned blockchain is a better fit than a permissionless blockchain.

Author 1: Kok Yong Chan
Author 2: Johari Abdullah
Author 3: Adnan Shahid Khan

Keywords: Supply chain; blockchain; consensus algorithm; traceability; transparency

PDF

Paper 21: A Key-Ordered Decisional Learning Parity with Noise (DLPN) Scheme for Public Key Encryption Scheme in Cloud Computing

Abstract: A variation of decisional learning parity with noise (DLPN), named the key-ordered DLPN security algorithm, is presented in this work. The proposed scheme extends DLPN to an even-odd-order scheme that depends on the probability distribution of the odd and even bits used for encryption, where the odd and even bits are the integer inputs to the key generation algorithm. Because the probability distribution of the odd and even bits is ordered by the key generation, resolving the odd and even bits addresses the DLPN attacker problem; thus, the proposed scheme offers stronger correctness and security proofs. The proposed system is evaluated against the learning parity with noise (LPN), DLPN, and RSA algorithms by measuring encryption time, public key size, and ciphertext size in bits.
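The key-ordered construction itself is specific to this paper, but the underlying LPN primitive is standard: a sample is a random bit vector together with its noisy inner product with the secret. A minimal sketch of generic LPN sample generation (not the authors' DLPN variant; the noise rate here is an illustrative choice):

```python
import random

def lpn_sample(secret, noise_rate=0.125):
    """Generate one LPN sample (a, b) with b = <a, s> XOR e.

    `secret` is a list of bits. A learner sees many such samples and must
    recover `secret` despite the Bernoulli noise bit e.
    Generic LPN illustration only, not the paper's key-ordered DLPN scheme.
    """
    n = len(secret)
    a = [random.randint(0, 1) for _ in range(n)]        # random query vector
    inner = sum(ai * si for ai, si in zip(a, secret)) % 2  # <a, s> mod 2
    e = 1 if random.random() < noise_rate else 0           # noise bit
    return a, inner ^ e

a, b = lpn_sample([1, 0, 1, 1])
print(a, b)
```

With `noise_rate=0` the sample is exact, which is what makes the noiseless problem (Gaussian elimination) easy and the noisy one hard.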

Author 1: Tarasvi Lakum
Author 2: B.Thirumala Rao

Keywords: LPN; DLPN; RSA; key-ordered; time

PDF

Paper 22: A Proposed Course Recommender Model based on Collaborative Filtering for Course Registration

Abstract: Students face issues and challenges in making decisions for course registration. Traditionally, students rely on suggestions from academic advisers prior to course registration. Therefore, students spend a considerable amount of time waiting for advisers to help them register for the right subjects. However, the number of students rises yearly, thereby increasing the responsibilities of lecturers. Moreover, academic advisers experience constraints in analysing data during consultations for course registration. Therefore, this study proposes a course recommender model based on collaborative filtering. Collaborative filtering is adopted because it provides recommendations based on students’ performance in previous subjects. A dataset from the Information & Communication Technology Centre (ICT) of the University Malaysia Pahang is used to evaluate the proposed model. The evaluation is conducted based on two experiments. The first experiment is performed by calculating the difference between actual and predicted scores to verify prediction accuracy. Results show that the average of the mean absolute error of the proposed model is 0.319, which is highly accurate. The second experiment is conducted by comparing the recommendations of the proposed model with those of experts to validate the course recommendation accuracy of the proposed model. Results of the second experiment show that the proposed model has a 91.06% accuracy rate with an error rate of 8.94%. In addition, average precision is 0.68 and recall is 0.724, which are considered accurate. Therefore, the proposed model can play a vital role in assisting students and academic advisers to recommend the right courses during registration, thereby overcoming the limitations of academic advising.
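The first experiment's accuracy metric, mean absolute error, is simply the average absolute gap between actual and predicted scores. A minimal sketch with hypothetical grade values (the paper's dataset is not public):

```python
def mean_absolute_error(actual, predicted):
    """Average absolute difference between actual and predicted scores."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical scores: actual grades vs. values predicted by a
# collaborative-filtering model for one student's past subjects.
actual    = [3.7, 3.0, 2.7, 4.0]
predicted = [3.5, 3.3, 2.7, 3.8]
print(mean_absolute_error(actual, predicted))  # ≈ 0.175
```

A lower MAE means the predicted scores track the actual ones more closely; the paper's reported average of 0.319 is on the same scale as the grades themselves.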

Author 1: Norazuwa Binti Salehudin
Author 2: Hasan Kahtan
Author 3: Mansoor Abdullateef Abdulgabber
Author 4: Hael Al-bashiri

Keywords: Course registration; recommender system; collaborative filtering; academic advisory

PDF

Paper 23: Optimization of Cúk Voltage Regulator Parameters for Better Performance and Better Efficiency

Abstract: This paper discusses harmonic distortion and voltage/current ripple minimization in a Cúk regulator based on design optimization of its parameters using a multichannel connection with uncoupled smoothing filters. The main attention is focused on the analysis and simulation of the fundamental and two-phase parallel connections of the Cúk regulator with uncoupled smoothing filters. A detailed analysis shows the benefits of uncoupled smoothing filters and their positive impact on balancing the energy compensation between the capacitors and inductors of the double-phase Cúk regulator compared to the conventional one. As a result, the dc source current does not swing from positive to negative and back as it does in the fundamental connection, so no saturation problem is caused for the regulator. In general, the multichannel parallel connection of Cúk regulators with uncoupled smoothing filters has inherent benefits such as excellent current distribution characteristics, insensitivity to component tolerance, reduction of parasitic effects, and relief in current control complexity. Specifically, by employing the double-phase connection with uncoupled smoothing filters, overall current fluctuation can be effectively reduced by more than 25% compared to the simple connection. Moreover, it is shown that the output voltage ripple of the double-phase connection is also reduced by more than 25% relative to the fundamental connection. Computer simulations using Simplorer 7 or Matlab and Excel have been performed to validate the concepts.

Author 1: Walid Emar
Author 2: Zakariya Al-omari
Author 3: Omar A. Saraereh

Keywords: Cúk regulator; smoothing filters; double-phase connection; overall current ripple

PDF

Paper 24: Traditional Learning Problems of Computing Students

Abstract: This paper reports the traditional learning problems of students of computing courses. To identify the problems, a questionnaire was framed to focus on the problems and issues faced by students while interacting in a traditional learning environment. The study tested the respondents’ attitudes with a five-point Likert scale, and the data were analyzed using the NCSS program. Traditional learning problems were summarized by computing the mean, median, mode, standard deviation, and IQR. This research highlights the problems of different computing courses, particularly problems with basic programming concepts, inability to write code, the language barrier, and confidence; besides these, it highlights various academic and non-academic problems. Reliability analysis using Cronbach’s alpha gave an encouraging reliability coefficient of 80%.

Author 1: Saima Siraj
Author 2: Akhtar Hussain Jalbani
Author 3: Muhammad Ibrahim Channa

Keywords: Traditional learning problems; computing education; statistical analysis of questionnaire

PDF

Paper 25: Parallel Platform for Supporting Stream Ciphers Over Multi-core Processors

Abstract: Designing secure and fast cryptographic primitives is one of the critical issues of the current era. Several domains, including the Internet of Things (IoT), military, and banking, require fast and secure data encryption over public channels. Most existing stream ciphers are designed to work sequentially and therefore do not utilize the available computing power. Other stream ciphers are based on complex mathematical problems, which makes them slower due to the complex computations involved. For this purpose, a novel parallel platform for enhancing the performance of stream ciphers is presented. The platform is designed to work efficiently over multi-core processors using multithreading techniques. Its architecture relies on independent components that can operate over the multiple cores available at the corresponding communication ends. Two groups of stream ciphers were considered as case studies in our experiments: the first includes stream ciphers of sequential design, while the second includes parallelizable stream ciphers. Performance tests and analysis show that the parallel platform was able to increase the encryption throughput of the selected stream ciphers dramatically. The enhancement in encryption throughput is relative to the structural design of each stream cipher: the parallelizable stream ciphers (Salsa20, DSP-128, and ECSC-128) achieved higher throughput than the sequentially designed ones.

Author 1: Sally Almanasra

Keywords: Stream ciphers; parallel computing; multithreading; cryptographic primitives; multi-core processors

PDF

Paper 26: Segmentation of Crescent Sand Dunes in High Resolution Satellite Images using a Support Vector Machine for Allometry

Abstract: The study of sand dune movement is essential for understanding and preventing desertification, and collecting data in the field is a labor intensive task, as deserts usually contain a large number of sand dunes. We propose to use computer vision and machine learning algorithms, combined with remote sensing and specifically high resolution satellite images, to collect data about the position and characteristics of moving sand dunes. We focused on the fastest moving sand dunes, called barchans, which threaten the settlements in the region of Laayoune, Morocco. We developed a process with three stages: in the first stage, we used an image processing approach with cascading Haar features to detect dune locations; in the second stage, we used a support vector machine to segment contours; and in the third stage, we used an algorithm to measure the allometric features of barchan dunes. Exploring the collected data, we found relevant correlations between dune length, width, and horn sizes, which could be used as key indicators of dune growth and progression. This study is therefore of high interest to urban planners and geologists who study sand dunes and require technical methods, based on machine learning and computer vision, to collect large amounts of data from satellite images in order to understand sand dune progression and counter desertification. Cascading Haar features provided good accuracy, and support vector machines, together with the high resolution satellite images, provided good precision for segmenting barchan dune contours, allowing the collection of morphological features that yield significant information on barchan sand dune dynamics.

Author 1: M A Azzaoui
Author 2: L. Masmoudi
Author 3: H. El Belrhiti
Author 4: I. E. Chaouki

Keywords: Image segmentation; support vector machines; high resolution satellite images; remote sensing; sand dunes; desertification

PDF

Paper 27: Dynamic Hand Gesture to Text using Leap Motion

Abstract: This paper presents a prototype for converting dynamic hand gestures to text using a device called the Leap Motion. It is a motion tracking technology that can recognise hand gestures without requiring the user to wear any external devices or capturing any images or videos. In this study, five custom dynamic hand gestures from American Sign Language were created with the Leap Motion to measure the recognition accuracy of the proposed prototype using the Geometric Template Matching, Artificial Neural Network, and Cross-Correlation algorithms. The experimental results showed that the prototype achieved recognition accuracy of more than 90% in the training phase and about 60% in the testing phase.

Author 1: Nur Aliah Nadzirah Jamaludin
Author 2: Ong Huey Fang

Keywords: Dynamic hand gesture; leap motion; American sign language; artificial neural network; cross-correlation; geometric template matching

PDF

Paper 28: Investigating Factors Affecting Knowledge Management Practices in Public Sectors

Abstract: Knowledge Management (KM) is a systematic approach to creating, sharing, using, and managing information to effectively sustain knowledge in public and private organizations alike. It helps organizations make better decisions in order to achieve their goals and increase productivity. However, many public organizations still face challenges in adopting knowledge management practices compared to private organizations, owing to a lack of awareness: they are not aware of influencing factors such as people, process, and technology. Therefore, this paper identifies the influencing factors that contribute to successful KM practices in the public sector. The study employs a quantitative approach, distributing a set of questionnaires to 83 IT practitioners in public organizations; 63 returned responses were analyzed using the Rasch Measurement Model. The findings indicate a lack of participation among staff in practicing efficient knowledge management, because they are not yet ready to accept changes to a new system and because of a lack of exposure and behavioral readiness. In addition, looking at critical success factors such as human resources (HR), there is a lack of encouragement, such as rewards and recognition, for employees who practice KM in the organization. As a result, this paper highlights the most influential factors for effective knowledge management practices in terms of people, process, and technology. We hope the results can serve as a guideline for rectifying the challenges in KM practices, especially in public organizations.

Author 1: Subashini Ganapathy
Author 2: Zulkefli Mansor
Author 3: Kamsuriah Ahmad

Keywords: Critical success factors; influencing factors; knowledge management; public sector

PDF

Paper 29: Image Steganography using Combined Nearest and Farthest Neighbors Methods

Abstract: Security is invariably a significant concern during communication. With the ease of communication comes a constant threat of intrusion. Steganography is one way to achieve security, by concealing confidential information within a more innocent looking medium such as an image, audio, or video. In this paper, a new technique is proposed that uses the relationship of a pixel with its nearest neighbor and farthest neighbor to hide secret information in that pixel. The cover image is divided into 2x2 non-overlapping blocks. According to the vulnerability of the relationship among the pixels, blocks are labelled as stable or unstable: a stable block hides ‘k’ secret bits, while an unstable block hides ‘n’ secret bits. 25 different combinations of ‘k’ and ‘n’ were examined to evaluate the performance of the proposed method, and the 2k method is applied to improve the quality of the stego image. The experimental results show that the proposed technique hides a significant number of secret bits with high PSNR, and in comparison with other existing methods it achieves a much higher visual quality.
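The abstract does not spell out the embedding rule, so as a generic illustration only, here is plain k-bit LSB substitution in a single 8-bit pixel; the paper's stable/unstable block classification and its 2k quality correction are not reproduced:

```python
def embed_bits(pixel, bits):
    """Replace the k least-significant bits of an 8-bit pixel with `bits`.

    Generic LSB substitution sketch, not the paper's neighbor-based rule.
    """
    k = len(bits)
    value = int(''.join(map(str, bits)), 2)  # secret bits as an integer
    return (pixel >> k << k) | value         # clear k LSBs, then write value

def extract_bits(pixel, k):
    """Recover the k embedded bits from a stego pixel."""
    return [int(b) for b in format(pixel & ((1 << k) - 1), f'0{k}b')]

stego = embed_bits(172, [1, 0, 1])   # hide 3 secret bits in one pixel
print(stego, extract_bits(stego, 3))
```

Each pixel changes by at most 2^k - 1 intensity levels, which is why methods hide more bits (larger n) only in "unstable" regions where the distortion is less visible.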

Author 1: Farhana Sharmin
Author 2: Muhammad Ibrahim Khan

Keywords: Steganography; cryptography; Pixel Value Difference (PVD); image processing; cover image; information security; stego image

PDF

Paper 30: Decision Support System for Employee Candidate Selection using AHP and PM Methods

Abstract: PT. Prima Grafika is a digital printing company looking for the best prospective employees. Based on research and observations, the company uses only administrative data when selecting prospective employees; processing large amounts of data and documents bearing the same applicant name slows the process down and exceeds the time limit set for finding the best employees. The solution proposed here is a web-based recruitment decision support system that combines the Analytic Hierarchy Process (AHP) method, which weights criteria through pairwise comparisons between pairs of criteria so that all criteria are covered, with the Profile Matching (PM) method for ranking. This study uses three testing methods: black-box testing of the system with 60 respondents, which showed the system can be accepted by the company; User Acceptance Testing (UAT) with 10 respondents, which produced an actual score of 779 against an ideal score of 900, or 86.1%, meaning the system as a whole is acceptable; and the DeLone and McLean model test with 10 respondents, which produced an actual score of 726 against an ideal score of 850, or 85.7%, which is very good. The ranking results were: A001 - Cantika Dewi = 4.12, A004 - Arif Yulkianto = 3.98, A002 - Eprriadi = 3.913, A003 - Rika Novriani = 3.467.
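The AHP weighting step derives priority weights from the pairwise comparison matrix; a common approximation of the principal eigenvector normalizes each column and averages across rows. A sketch with a hypothetical three-criteria matrix on Saaty's 1-9 scale (the paper's actual criteria and judgments are not given):

```python
def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix:
    normalize each column, then average each row of the normalized matrix."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical criteria: experience vs. education vs. interview score.
# matrix[i][j] = how much more important criterion i is than criterion j.
pairwise = [[1,     3,     5],
            [1 / 3, 1,     3],
            [1 / 5, 1 / 3, 1]]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # weights sum to 1
```

The resulting weights can then multiply each applicant's per-criterion scores before the Profile Matching ranking step.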

Author 1: Soleman

Keywords: Prospective employees; decision support systems; Analytic Hierarchy Process (AHP); profile matching

PDF

Paper 31: Analyzing the Impact of Forest Cover at River Bank on Flood Spread by using Predictive Analytics and Satellite Imagery

Abstract: Floods have been a recurring problem for a number of countries around the world, including Pakistan. It is believed that densely populated forests at river banks can prevent floods from spreading towards settlements and farmlands. The role of forests in flood spread has been an area of research for a while, but the role of predictive modeling in this area is yet to be investigated in detail. In this study, we used predictive analytics and satellite imagery to develop an environmental model that predicts flood spread, taking the forest cover at the river bank and the month of the year as parameters. We used satellite images of the Dera Ghazi Khan area of Pakistan from the USGS Landsat program; these images comprise a section of the Indus River and its adjoining areas, allowing us to analyze the forest cover along various sections of the river. We developed and trained our predictive model on the satellite imagery data and tested it on a separate dataset to determine the error percentage. The model showed significant promise, predicting flood spread with an average accuracy above 80%.

Author 1: Muhammad Aneeq Yusuf
Author 2: Muhammad Khalid Khan
Author 3: Tariq Mahmood
Author 4: Muhammad Umer
Author 5: Rafi Ullah Afridi

Keywords: Floods; forests; predictive modeling; satellite imagery; environmental modeling

PDF

Paper 32: A Review of Feature Selection and Sentiment Analysis Technique in Issues of Propaganda

Abstract: Propaganda is a form of communication used to influence communities, or people in general, to push forward an agenda towards a certain goal. Nowadays, different means are used to distribute propaganda, including postings on social media, illustrations, cartoons and animations, articles, and TV and radio shows. This paper focuses on election propaganda: candidates in elections use propaganda as a form of communication to channel and deliver messages through social media. Sentiment analysis (SA) is then used to identify the positive and negative elements within the propaganda itself, by analysing the related documents, social media, articles, or forums. This paper presents the various techniques used by previous researchers on issues of propaganda using SA, including feature selection to remove irrelevant features and sentiment methods to identify sentiment in documents and other sources. Feature selection plays a dominant role in sentiment analysis because textual content has high dimensionality, which can jeopardize the interpretation of SA classification. This paper also explores several SA techniques for identifying sentiment in issues of propaganda, and attempts to identify the use of swarm algorithms as a suitable feature selection method in SA for propaganda issues.

Author 1: Siti Rohaidah Ahmad
Author 2: Muhammad Zakwan Muhammad Rodzi
Author 3: Nurlaila Syafira Shapiei
Author 4: Nurhafizah Moziyana Mohd Yusop
Author 5: Suhaila Ismail

Keywords: Sentiment analysis; feature selection; swarm algorithm; propaganda

PDF

Paper 33: DesCom: Routing Decision using Estimation Time in VDTN

Abstract: VDTN was proposed as a disruption-tolerant network built on the paradigm of the delay-tolerant network. VDTN uses vehicular nodes to convey messages, as it permits sparse opportunistic network connectivity, characterized by low node density where vehicular traffic is sporadic and no end-to-end paths exist between nodes. A message bundle is directed from the sender to the receiver node based on the routing protocol’s decision, and routing protocols make decisions based on different metrics such as time to live, location, remaining buffer size, meeting probability, etc. In this paper, a routing protocol named DesCom is proposed for the vehicular delay-tolerant network under a highly suppressed and sparse environment. DesCom makes its decision based on message TTL, transmission rate, and estimation time, where the estimation time is calculated in our previous work. The protocol decides whether to direct the message to the requesting node or to search for a more suitable node to carry the data bundles. After running multiple simulations with different numbers of vehicles and comparing DesCom with other routing protocols, it is concluded that DesCom has the least buffer time, with low latency and good delivery probability.

Author 1: Adnan Ali
Author 2: Jinlong Li
Author 3: Aqsa Tanveer
Author 4: Maryam Batool
Author 5: Nimra Choudhary

Keywords: Estimation time; VDTN; routing; vehicular delay-tolerant network; ONE simulator

PDF

Paper 34: LCAHASH-1.1: A New Design of the LCAHASH System for IoT

Abstract: The present paper presents an extension of the LCAHASH system. LCAHASH is a previously developed lightweight hash algorithm based on cellular automata, created as an alternative to existing hash functions to ensure data integrity and to meet the security requirements of Internet of Things devices. Due to the limited storage and computation capabilities of these devices, the algorithms they use should be as efficient and as robust as possible. In this contribution, we propose an enhanced version of the original LCAHASH algorithm that improves its efficiency and robustness. A description of the proposed system is included, along with a security analysis and the results of a statistical battery of tests (Dieharder). These results show that the proposed system exhibits good statistical and cryptographic features.

Author 1: Anas Sadak
Author 2: Bouchra Echandouri
Author 3: Fatima Ezzahra Ziani
Author 4: Charifa Hanin
Author 5: Fouzia Omary

Keywords: Information security; hash function; cellular automata; IoT; data integrity

PDF

Paper 35: An Empirical Comparison of Machine Learning Algorithms for Classification of Software Requirements

Abstract: Intelligent software engineering has emerged in recent years to address some difficult problems in requirements engineering. Requirements are crucial for software development, and the classification of natural language user requirements into functional and non-functional requirements is a fundamental challenge, as it defines the fulfillment criteria of users’ expected needs and wants. The research in this article therefore aims to explore and compare the random forest and gradient boosting algorithms, determining through experiments their accuracy in classifying functional and non-functional requirements. Random forest and gradient boosting are ensemble machine learning algorithms that combine the decisions of several base models to improve prediction performance. Experimental results show that the gradient boosting algorithm yields better prediction performance than the random forest algorithm when classifying non-functional requirements, whereas the random forest algorithm is more accurate at classifying functional requirements.
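The ensemble idea described here, combining decisions from several base models, can be sketched with simple majority voting, which is how bagging-style ensembles such as random forest aggregate their trees (gradient boosting instead sums sequential corrections). The three "models" below are hypothetical hard-coded label lists, purely for illustration:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the labels predicted by several base models for one sample."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical weak classifiers labelling five requirements as
# functional ('F') or non-functional ('NF'); each is individually imperfect.
model_a = ['F',  'NF', 'F', 'NF', 'F']
model_b = ['F',  'F',  'F', 'NF', 'NF']
model_c = ['NF', 'NF', 'F', 'NF', 'F']

ensemble = [majority_vote(votes) for votes in zip(model_a, model_b, model_c)]
print(ensemble)  # ['F', 'NF', 'F', 'NF', 'F']
```

As long as the base models err somewhat independently, the voted prediction can be more accurate than any single model, which is the premise behind both algorithms compared in the paper.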

Author 1: Law Foong Li
Author 2: Nicholas Chia Jin-An
Author 3: Zarinah Mohd Kasirun
Author 4: Chua Yan Piaw

Keywords: Machine learning; ensemble algorithms; requirements classification; functional requirements; non-functional requirements

PDF

Paper 36: PathGazePIN: Gaze and Path-based Authentication Entry Method

Abstract: These days, smartphones are widely used, and people use them to store sensitive and private information. Password authentication is the most widely used authentication method; however, a password can be disclosed through a shoulder surfing attack, since users enter their passwords in public places and tend to make them easy to remember, which increases vulnerability to attacks. Many authentication schemes have been proposed to prevent shoulder surfing attacks, but they remain vulnerable to shoulder surfing, have accuracy problems, or lack usability. In this paper, we propose a gaze- and path-based PIN entry method, which we call PathGazePIN. It utilizes the user's eye movement to enter the password: the main idea is to allow the user to authenticate by following numbers that move along fixed paths on the screen, using two authentication interfaces, a random interface and a sorted interface. The results show that the proposed system increases security against shoulder surfing attacks as well as usability and accuracy.

Author 1: Bayan M AlBaradi
Author 2: Amani M. AlTowayan
Author 3: Maram M. AlAnazi
Author 4: Sadaf Ambreen
Author 5: Dina M. Ibrahim

Keywords: User authentication; shoulder surfing attacks; PIN; gaze-based; security; accuracy

PDF

Paper 37: Controlling High PAPR in Vehicular OFDM-MIMO using Downlink Optimization Model under DCT Transform

Abstract: The persisting challenges of the radio channel in vehicular networks entail the use of multiple antennas, known as Multiple-Input Multiple-Output (MIMO). To obtain an efficient multi-user MIMO system, the power of the radio frequency (RF) components should be optimized. Practical solutions are essential to lower the complexity of vehicular nodes and to support a robust Orthogonal Frequency Division Multiplexing (OFDM) discipline with simplified equalization at the receiver. In this paper, Zadoff-Chu Sequence (ZCS) pre-coding is employed along with the Discrete Cosine Transform (DCT) to control and optimize the high peak power, targeting transmission over multi-user MIMO downlink vehicular channels. Finally, convex optimization is utilized to guarantee minimization of the peak-to-average power ratio (PAPR). Simulation results show that the proposed model can reduce the high PAPR compared to least-squares pre-coding, while proving its effectiveness and accuracy by enhancing transmission quality over the multi-user MIMO-OFDM downlink vehicular channel.
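The PAPR metric that such pre-coding aims to minimize is the ratio of the peak instantaneous power of the time-domain OFDM symbol to its mean power. A minimal sketch using a naive O(N^2) IDFT (not the authors' ZCS/DCT chain):

```python
import cmath
import math

def papr_db(freq_symbols):
    """PAPR of one OFDM symbol: naive IDFT of the subcarrier symbols,
    then peak power over mean power, expressed in dB."""
    n = len(freq_symbols)
    time = [sum(freq_symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]                 # time-domain samples
    powers = [abs(x) ** 2 for x in time]       # instantaneous powers
    return 10 * math.log10(max(powers) * n / sum(powers))

# Worst case: identical symbols on all 8 subcarriers add coherently,
# giving PAPR = N, i.e. 10*log10(8) ≈ 9.03 dB.
print(round(papr_db([1] * 8), 2))
```

A single active subcarrier, by contrast, has a constant envelope and 0 dB PAPR; pre-coding schemes like ZCS push real multi-carrier symbols toward that constant-envelope extreme.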

Author 1: Ahmed Ali
Author 2: Esraa Eldesouky

Keywords: Zadoff-Chu Sequence; Discrete Cosine Transform (DCT); Peak-to-Average Power Ratio (PAPR); Vehicular Networks

PDF

Paper 38: The Correction of the Grammatical Case Endings Errors in Arabic Language

Abstract: Syntax plays a key role in natural language processing, but it does not always occupy an important position in applications. The main objective of this article is to correct the grammatical case ending errors produced by Arabic learners, along with certain common errors. Arabic can be considered more complex than English or French: it has no written vowels; instead, diacritic signs (vowels) are placed above or below the letters, and these diacritics are omitted in most Arabic texts, which induces both grammatical and lexical ambiguities in Arabic. The present paper describes an automatic correction of this type of error using the Stanford Parser together with an ontology containing the rules of the Arabic language. We segment the text into sentences, extract the annotations of each word with the syntactic relations produced by our parser, and then process the obtained relations with our ontology. Finally, we compare the original sentence with the corrected one in order to detect the error. The implemented system achieved a detection rate of about 94%. We conclude that the approach is clearly promising, given these results and the limited number of available Arabic grammar checkers.

Author 1: Chouaib MOUKRIM
Author 2: Abderrahim TRAGHA
Author 3: El Habib BENLAHMER

Keywords: Automatic correction; ontology; syntactic errors; case endings; natural language processing; Arabic

PDF

Paper 39: Partition Ciphering System: A Difficult Problem Based Encryption Scheme

Abstract: In this article, a new encryption scheme called the Partition Ciphering System is proposed, which adapts and processes the message according to the partition problem. The objective of this system, which can be applied standalone or as a building block in a larger system, is to achieve confidentiality and maintain a balance between ones and zeros in the output, so that attacks such as frequency cryptanalysis are avoided and good entropy is achieved. The authors first describe the partition problem together with an adapted version, then present the encryption and decryption processes. Next, a comparison with other encryption schemes is presented in terms of statistical properties (using the DIEHARDER battery), security analysis, and performance. The results show that the proposed cryptosystem is resistant to frequency analysis and exhibits good entropy in its output. Moreover, compared to the Advanced Encryption Standard, it shows random behavior and good confusion and diffusion (avalanche effect), and it displays better performance and resistance to brute force attacks on the key.
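As background, the classical partition problem on which the scheme is built asks whether a multiset of numbers can be split into two subsets of equal sum. A minimal decision sketch using the standard reachable-sums dynamic program (the paper's adapted version is not reproduced here):

```python
def can_partition(values):
    """Partition problem: can `values` be split into two subsets
    whose sums are equal? Standard reachable-sums dynamic program."""
    total = sum(values)
    if total % 2:                    # an odd total can never split evenly
        return False
    reachable = {0}                  # subset sums achievable so far
    for v in values:
        reachable |= {s + v for s in reachable}
    return total // 2 in reachable

print(can_partition([3, 1, 1, 2, 2, 1]))  # True: e.g. {3, 2} vs {1, 1, 2, 1}
print(can_partition([1, 2]))              # False
```

Balancing two halves in this way mirrors the cipher's goal of keeping the counts of ones and zeros in the output even, which is what defeats frequency analysis.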

Author 1: Ziani Fatima Ezzahra
Author 2: Omary Fouzia

Keywords: Encryption scheme; partition problem; frequency analysis; avalanche effect; confusion; diffusion; statistical properties

PDF

Paper 40: The Implementation of Business Intelligence and Analytics Integration for Organizational Performance Management: A Case Study in Public Sector

Abstract: The literature shows that several works have been conducted on the implementation of BI in performance management, but the analytical aspects were not considered. Business analytics is the activity of applying analytics to strengthen strategic and operational business activities. Performance management is important for determining organisational success, and in the public sector it has become more challenging due to the generality of public sector objectives and the different levels of stakeholders involved. Existing frameworks were built separately, which limits the implementation of Business Intelligence and Analytics as an integrated component and cannot meet current performance management needs and expectations. The objective of this study is to establish a framework that integrates elements of business intelligence, analytics, and performance management for comprehensive implementation in the public sector. The study identifies four main components of this integrated framework: Process, People, Governance, and Ability, each consisting of several key elements and sub-elements. The proposed framework is validated and implemented through a real case study conducted in one organisation in Malaysia, demonstrating its suitability and practicality in a real environment.

Author 1: Jamaiah Yahaya
Author 2: Nur Hani Zulkifli Abai
Author 3: Aziz Deraman
Author 4: Yusmadi Yah Jusoh

Keywords: Business intelligence; business analytics; organisational performance management; framework; case study

PDF

Paper 41: A Survey on Distributed Greenhouse Gases Monitoring Systems

Abstract: Monitoring air quality is a major task due to the direct impact of pollution on human health. Pollution has been further aggravated by the developments of the last decades: traffic growth, traffic noise in cities, growth of urban areas, increased energy consumption, industrialization, and economic development. Global warming and acid rain result from these factors; thus, it is essential to monitor air quality. This survey presents a set of research works and applications related to air quality monitoring, aimed at detecting, measuring, collecting and processing data aggregated from sensors, such as gas sensors for sensing the concentration of gases such as CO2 (carbon dioxide), as well as relative humidity, temperature, TVOC (Total Volatile Organic Compounds), PM (particulate matter) and noise level. Based on such processing, users are able to see the polluted areas on a map. The paper presents a state of the art of air monitoring systems, noise monitoring systems, and air pollution systems. The paper also proposes a distributed greenhouse-gas monitoring system for pollution measurement and control.

Author 1: Adela Puscasiu
Author 2: Alexandra Fanca
Author 3: Dan-Ioan Gota
Author 4: Silviu Folea
Author 5: Honoriu Valean

Keywords: Air quality; air monitoring systems; noise monitoring systems; air pollution systems

PDF

Paper 42: The Development of a Visual Output Approach for Programming via the Application of Cognitive Load Theory and Constructivism

Abstract: Programming is a skill of the future. However, decades of experience and research have indicated that the teaching and learning of programming are full of problems and challenges. As such, educators and researchers are always on the lookout for suitable approaches and paradigms that can be adopted for the teaching and learning of programming. In this article, a visual output approach is proposed as suitable, based on current millennials' affinity for graphics and visuals. The proposed VJava Module is developed via the application of two main learning theories: cognitive load theory and constructivism. There are two submodules consisting of eight chapters that cover the topics Introduction to Programming and Java, Object Using Turtle Graphics, Input and Output, Repetition Structure, Selection Structure, More Repetition Structures, Nested Loops and Arrays. To enable Java programs to produce graphical and animated outputs, the MJava library was developed and integrated into this module. The module is validated by three Java programming experts and an instructional design expert on its content, design and usability aspects.

Author 1: Marini Abu Bakar
Author 2: Muriati Mukhtar
Author 3: Fariza Khalid

Keywords: Introductory programming; CS1; novices; Java programming; learning; objects-first

PDF

Paper 43: Smart Age Detection for Social Media

Abstract: Over recent years, age detection has attracted immense attention due to its growing use in various sectors, such as government regulation, security control, and human-computer interaction. Popular biometric features such as the face and fingerprints can be modified or change over time. The ear, however, has a stable structure that does not change with time and has unique features that satisfy the requirements of a biometric trait. This research presents a detailed analysis of extracting features of the human ear by applying deep learning techniques. In particular, a Convolutional Neural Network (CNN), which uses multiple layers to extract and classify features, is applied to large datasets. The proposed methodology enlarged the dataset by collecting additional private children's data and amended the architecture of the selected neural network, achieving a high accuracy of 98.75% compared to previous studies. This research can benefit the control of social media content by detecting whether a user's age group is under or over 18.

Author 1: Manal Alghieth
Author 2: Jawaher Alhuthail
Author 3: Kholod Aldhubiay
Author 4: Rotan Alshowaye

Keywords: Age detection; deep learning; ear features; CNN; control social media

PDF

Paper 44: Automated Methodology for Optimizing Menus in Personalized Nutrition

Abstract: In a personalized nutrition management system, the central practical task is to compile an optimized menu that provides the best value over a multi-criteria set of assessments: nutrient composition, cost (economic acceptability), energy value, food intolerances, individual preferences, etc. To solve the problem, a combined optimization method is used, comprising preliminary ordering of options and controlled enumeration. The result is a diet that meets the needs of a particular person, taking into account their nutritional status, individual preferences and intolerances, and medical prescriptions. Because the outputs of dishes and recipes take discrete values, optimizing the human diet is in practice a combinatorial integer problem, and it is solved here by computer modeling and controlled enumeration of options. The effectiveness of the optimization is evaluated by external experts. The developed menu design system makes it possible to repeatedly solve the personalized menu optimization problem when incoming data change, for reasons such as changing dietary tasks, introducing new products, or changing food preferences. With this approach, the system is a personalized food model that is regularly used for rational planning and reduces labor costs (compared to "manual" compilation of a menu in a computer system) by a factor of 2-3. An additional use of such a model is the targeted design of functional product formulations, where the properties of a product are evaluated not in isolation but as part of a specific diet.
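The controlled-enumeration idea described above can be sketched in a few lines: enumerate one dish per course, keep only menus inside an energy band, and return the cheapest. The dishes, calorie values and costs below are invented for illustration and are not the authors' data.

```python
from itertools import product

# Hypothetical dishes per course: (name, kcal, cost).
starters = [("salad", 150, 2.0), ("soup", 250, 1.5)]
mains    = [("fish", 450, 5.0), ("stew", 600, 3.5)]
desserts = [("fruit", 100, 1.0), ("cake", 350, 2.0)]

def best_menu(courses, kcal_min, kcal_max):
    """Controlled enumeration: discard menus outside the energy band,
    then pick the cheapest feasible one."""
    feasible = []
    for combo in product(*courses):
        kcal = sum(d[1] for d in combo)
        if kcal_min <= kcal <= kcal_max:
            cost = sum(d[2] for d in combo)
            feasible.append((cost, kcal, [d[0] for d in combo]))
    return min(feasible) if feasible else None

menu = best_menu([starters, mains, desserts], 700, 1000)
# -> cheapest menu within 700-1000 kcal
```

A real system would add further criteria (nutrients, intolerances, preferences) as extra filters or weighted scores, and prune the enumeration by the preliminary ordering the abstract mentions.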

Author 1: Valery I Karpov
Author 2: Nikolay M. Portnov
Author 3: Igor A. Nikitin
Author 4: Yury I. Sidorenko
Author 5: Sergey M. Petrov
Author 6: Nadezhda M. Podgornova
Author 7: Mikhail Yu. Sidorenko
Author 8: Sergey V. Shterman
Author 9: Igor V. Zavalishin

Keywords: Personalized nutrition; menu optimization; nutrition management; practical nutrition

PDF

Paper 45: Node Relocation Techniques for Wireless Sensor Networks: A Short Survey

Abstract: Sensor nodes in a sensor network often operate in harsh and challenging environments, which leads to frequent node failures. Failure of sensor nodes causes partitioning of the network connectivity. For sensor network applications to remain effective, inter-sensor connectivity is vital. Some sensors are also involved in sustaining the flow of information from the sensors to otherwise unreachable end users. The network can be split into multiple disconnected blocks and cease working due to physical damage or onboard energy depletion. To deal with such scenarios, a plethora of node repositioning techniques have been proposed in the literature. This article discusses recent and up-to-date research on dynamic sensor repositioning in WSNs. It classifies sensor repositioning methods into post-deployment and on-demand repositioning, based on whether the optimization is accomplished at deployment time or while the network is operating.

Author 1: Mahmood ul Hassan
Author 2: Khalid Mahmood
Author 3: Shahzad Ali
Author 4: Amin Al Awady
Author 5: Muhammad Kashif Saeed

Keywords: Node failure; sensor node; post-deployment; on-demand relocation; internode connectivity

PDF

Paper 46: Comprehensive e-Learning System with Simulation Capabilities for Understanding of Complex Equations

Abstract: A comprehensive e-learning system with simulation capabilities for understanding complex equations is proposed. Through an experiment with the proposed system, it is found to be 14.5% more effective and comprehensible than a conventional e-learning system lacking mathematical representations of processes and simulation capabilities, although the time required for learning increased by 18.2%. In addition, the proposed system not only helps students to understand subjects and complex equations but can also be effective in increasing students' motivation to learn.

Author 1: Kohei Arai

Keywords: SCORM; e-learning; simulation; Physics subject; action script

PDF

Paper 47: Challenges in Wireless Body Area Network

Abstract: A Wireless Body Area Network (WBAN) refers to short-range wireless communication in the vicinity of, or inside, a human body. WBAN is an emerging solution to cater to the needs of local and remote health-care facilities. Medical and non-medical applications are under consideration for providing healthy and gratifying services to humanity. Because communicating data from the body is critical, WBAN faces many challenges that must be tackled for the safety of life and the benefit of the user. WBAN is also a favorite playground for attackers due to its usability in various applications. This article provides a systematic overview of the challenges in WBAN from communication and security perspectives.

Author 1: Muhammad Asam
Author 2: Tauseef Jamal
Author 3: Muhammad Adeel
Author 4: Areeb Hassan
Author 5: Shariq Aziz Butt
Author 6: Aleena Ajaz
Author 7: Maryam Gulzar

Keywords: WBAN (Wireless Body Area Network); denial of service attacks; resource management; cooperation; security

PDF

Paper 48: e-Parking: Multi-agent Smart Parking Platform for Dynamic Pricing and Reservation Sharing Service

Abstract: Parking is a key element of a sustainable urban mobility policy. It plays a fundamental role in travel planning and transport management, as the foremost vector of modal choice, but also as a potential means of freeing up public spaces. In this article we define the smart parking concept as an application of smart mobility, present a historical analysis of the evolution of the smart parking framework, and show a statistical analysis of published patent applications in this field around the world using the ORBIT database. Then, we propose a new smart parking architecture based on multi-agent features. Finally, we introduce the e-Parking system, a platform to improve the driver experience in crowded cities. It provides real-time parking prices and offers reservation and guidance services. In addition, the system assigns an optimal parking spot to a driver based on the user's requirements, which combine proximity to destination, parking cost and dwell time, while ensuring a fair sharing of public space among users and improving traffic conditions. Our approach is based on a dynamic pricing policy. The scheme is suitable for mixed-usage areas, as it considers the presence of reserved and non-reserved drivers in the same parking area.

Author 1: Bassma Jioudi
Author 2: Aroua Amari
Author 3: Fouad Moutaouakkil
Author 4: Hicham Medromi

Keywords: Smart cities; smart mobility; smart parking; dynamic pricing policy; cruising traffic

PDF

Paper 49: Semantic Micro-Services Model for Vehicle Routing using Ant Colony Optimization

Abstract: In this paper, we propose a new model of vehicle routing based on micro-services, using the principle of path selection and composition. In this model each micro-service is responsible for one road resource, a step in the driver's journey, implementing a specific objective of the road course. The micro-services are deployed in a cloud architecture in multiple instances according to a load-scaling and fault-tolerance system. Drivers' requests are sent to a proxy micro-service, with abstract structures of road courses represented by directed graphs. The proxy micro-service analyzes the request to determine the driver's profile and context in the journey, in order to select the micro-service responsible for providing information on the most appropriate road resource. The meta-heuristic optimization method for road courses used in the proposed approach is based on the ant colony algorithm, and we describe an adaptation of this method to propose to the driver, at each stage of the journey, the optimal resource to exploit.
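The ant colony principle underlying the optimization can be sketched on a toy road graph: each ant chooses the next edge with probability proportional to pheromone times the inverse of travel cost, and pheromone is evaporated and reinforced along cheaper routes. This is a generic textbook-style ACO sketch, not the paper's adapted algorithm; the graph and parameters are invented.

```python
import random

random.seed(42)

# Hypothetical road graph: node -> {successor: travel cost}.
graph = {"A": {"B": 2.0, "C": 5.0}, "B": {"D": 2.0}, "C": {"D": 1.0}, "D": {}}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def choose_next(node, alpha=1.0, beta=2.0):
    """Pick the next edge with probability ~ pheromone^alpha * (1/cost)^beta."""
    edges = list(graph[node].items())
    weights = [pheromone[(node, v)] ** alpha * (1.0 / c) ** beta for v, c in edges]
    r, acc = random.random() * sum(weights), 0.0
    for (v, _), w in zip(edges, weights):
        acc += w
        if r <= acc:
            return v
    return edges[-1][0]

def run_ants(n_ants=30, rho=0.5, q=1.0):
    """Release ants from A to D; evaporate then deposit pheromone ~ 1/cost."""
    best = None
    for _ in range(n_ants):
        path, node = ["A"], "A"
        while node != "D":
            node = choose_next(node)
            path.append(node)
        cost = sum(graph[u][v] for u, v in zip(path, path[1:]))
        for e in pheromone:
            pheromone[e] *= (1 - rho)
        for e in zip(path, path[1:]):
            pheromone[e] += q / cost
        if best is None or cost < best[0]:
            best = (cost, path)
    return best

best_cost, best_path = run_ants()
```

In the paper's setting each node of the graph would stand for a road resource served by a micro-service, and the proxy would query the colony state to rank the next resources for a given driver profile.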

Author 1: Asmaa ROUDANE
Author 2: Mohamed YOUSSFI
Author 3: Khalifa MANSOURI

Keywords: Road traffic; road traffic management systems; intelligent systems; complex systems; graphs; real-time; optimization methods; micro-services; ant colonies; vehicle routing

PDF

Paper 50: Monitoring of Rainfall Level Ombrometer Observatory (Obs) Type using Android Sharp GP2Y0A41SK0F Sensor

Abstract: Rainfall measurements are generally carried out automatically, yet few parties have conducted research using automatic rain gauge instruments. The rainfall level data obtained can be used to detect flooding earlier, reducing the impact of natural disasters. The working principle is that when rain falls, the water is collected and its height is detected. The Sharp GP2Y0A41SK0F sensor is read by an Arduino Uno through a signal-conditioning circuit; the reading is then forwarded to a water pump that empties the tube, and the data are stored. The ESP8266 Wi-Fi module sends the data to an Android device. Measurements are made when the water is full; a maximum value of 53.2 means heavy rain. After the data were obtained, the standard error of the measurement tool was determined over 7 days. The model was built using a funnel with a diameter of 14 cm and a height of 26 cm. Based on the design calculations, the developed measuring instrument is able to measure rainfall up to 26000 mm of rain height.
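For a funnel-type (observatory) gauge like the one described, rainfall depth follows from the collected volume divided by the funnel mouth area. The sketch below shows that conversion for the stated 14 cm funnel; the 100 ml sample volume is an invented example, not a figure from the paper.

```python
import math

def rainfall_mm(volume_ml: float, funnel_diameter_cm: float) -> float:
    """Rainfall depth in mm from collected volume (1 ml == 1 cm^3)
    and funnel mouth diameter in cm: depth_cm = volume / area, x10 for mm."""
    area_cm2 = math.pi * (funnel_diameter_cm / 2) ** 2
    return volume_ml / area_cm2 * 10.0

depth = rainfall_mm(100, 14)  # 100 ml caught by a 14 cm funnel -> ~6.5 mm
```

The same relation, run in reverse, lets the firmware translate the water height the Sharp infrared sensor reports into an equivalent rainfall depth.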

Author 1: Anton Yudhana
Author 2: Yunita Dwi Andriliana
Author 3: Son Ali Akbar
Author 4: Sunardi
Author 5: Subhas Mukhopadhyay
Author 6: Ismail Rakip Karas

Keywords: Arduino; infrared sensor; android; observatory; rainfall

PDF

Paper 51: Deep MRI Segmentation: A Convolutional Method Applied to Alzheimer Disease Detection

Abstract: Learning techniques are particularly needed for the detection of invisible brain diseases. Learning-based methods rely on MRI medical images to reconstruct a solution for detecting aberrant values or areas in the human brain. In this article, we present a method that automatically performs brain segmentation to detect brain damage and diagnose Alzheimer's disease (AD). In order to take advantage of the benefits of 3D while reducing complexity and computational cost, we present a 2.5D method for locating brain inflammation and detecting its classes. Our proposed system is evaluated on a set of public data. Preliminary results indicate the reliability and effectiveness of our Alzheimer's disease detection system and demonstrate that our method advances the current state of the art in Alzheimer's disease diagnosis.

Author 1: Hanane Allioui
Author 2: Mohamed Sadgal
Author 3: Aziz Elfazziki

Keywords: Computer-Assisted Diagnosis (CAD); Alzheimer's disease (AD); Image segmentation; Machine learning; Convolutional Neural Networks (CNN); Magnetic Resonance Imaging

PDF

Paper 52: Improved Adaptive Semi-Unsupervised Weighted Oversampling using Sparsity Factor for Imbalanced Datasets

Abstract: With the incredible surge in data volumes, problems associated with data analysis have become increasingly complicated. In machine learning, imbalanced data is a profound problem for data mining algorithms. It arises from the disparate nature of data in which one class with a large number of instances, the majority class, dominates the other class with only a few instances, known as the minority class. A classifier model biases towards the majority class and neglects the minority class, which may happen to be the most essential class, resulting in costly misclassification errors on the minority class in real-world scenarios. The imbalanced data problem is commonly addressed using re-sampling techniques, among which oversampling has proven more effective than undersampling. This study proposes an Improved Adaptive Semi-Unsupervised Weighted Oversampling (IA-SUWO) technique with a sparsity factor, which efficiently solves both between-class and within-class imbalance problems. Along with avoiding over-generalization and overfitting and removing noise from the data, the technique appropriately increases the number of synthetic instances in the minority sub-clusters. A comprehensive experimental setup is used to evaluate the performance of the proposed approach. The comparative analysis reveals that IA-SUWO performs better than existing baseline oversampling techniques.
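The core oversampling move, generating synthetic minority instances by interpolating between a minority sample and a nearby minority neighbour, can be sketched as follows. This is a generic SMOTE-style illustration of the idea, not the IA-SUWO algorithm itself (which adds semi-unsupervised clustering, weighting and a sparsity factor); the data points are invented.

```python
import random

random.seed(0)

def nearest_neighbor(x, others):
    """Euclidean nearest neighbour among the remaining minority samples."""
    return min(others, key=lambda y: sum((a - b) ** 2 for a, b in zip(x, y)))

def oversample(minority, n_new):
    """Create synthetic points on the segment between a minority sample
    and its nearest minority neighbour."""
    synthetic = []
    for _ in range(n_new):
        x = random.choice(minority)
        nn = nearest_neighbor(x, [m for m in minority if m != x])
        r = random.random()
        synthetic.append(tuple(a + r * (b - a) for a, b in zip(x, nn)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
new_points = oversample(minority, 5)
```

Because each synthetic point lies on a segment between two real minority samples, the new instances stay inside the minority region rather than drifting into majority territory, which is what "avoiding over-generalization" refers to.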

Author 1: Haseeb Ali
Author 2: Mohd Najib Mohd Salleh
Author 3: Kashif Hussain

Keywords: Data mining; imbalanced data; minority; majority; oversampling

PDF

Paper 53: Factors Contributing to the Success of Information Security Management Implementation

Abstract: Information Security Management (ISM) concerns shielding the integrity, confidentiality, availability, authenticity, reliability and accountability of an organisation's information from unauthorised access in order to ensure business continuity and customers' confidence. The importance of information security (IS) in today's environment deserves due attention. Recognising its importance, organisations nowadays devote wide efforts to protecting their information. They establish information security policies, processes, and procedures, and reengineer their organisational structures to align with ISM principles. Regardless of these efforts, security incidents continue to occur in many organisations. This phenomenon shows that the current implementation of ISM is still ineffective, due to unawareness of the factors contributing to the success of ISM. Thus, the objective of this paper is to identify ISM success factors and their elements through a large-scale survey. The survey involves 243 practitioners from statutory bodies and public and private organisations in Malaysia. The results of the survey indicate that top management, the IS coordinator team, the ISM team, the IS audit team, employees, third parties, IS policy, IS procedures, resource planning, competency development and awareness, risk management, business continuity management, IS audit and IT infrastructure are the factors that contribute to the success of ISM implementation. These factors shall guide practitioners in planning and refining ISM implementation in their organisations.

Author 1: Mazlina Zammani
Author 2: Rozilawati Razali
Author 3: Dalbir Singh

Keywords: Information security; information security management; success factors; key factors

PDF

Paper 54: Improving Long Short-Term Memory Predictions with Local Average of Nearest Neighbors

Abstract: The study presented in this paper aims to improve the accuracy of meteorological time series predictions made with the recurrent neural network known as Long Short-Term Memory (LSTM). To achieve this, instead of only adjusting the architecture of LSTM as seen in related work, it is proposed to adjust the LSTM results using the univariate time series imputation algorithm known as Local Average of Nearest Neighbors (LANN), and LANNc, a variation of LANN that avoids the leftward bias of the synthetic data generated by LANN. The results obtained show that both LANN and LANNc improve the accuracy of the predictions generated by LSTM, with LANN superior to LANNc. Likewise, on average the best LANN and LANNc configurations outperform the predictions reached by another recurrent neural network, the Gated Recurrent Unit (GRU).
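The basic LANN idea, filling each unknown value with the average of the nearest known values on either side, can be sketched for a univariate series as below. This is a minimal reading of the technique as described in the abstract, not the authors' exact implementation or their LANNc variant; the sample series is invented.

```python
def lann_impute(series):
    """Local Average of Nearest Neighbors sketch: each missing value (None)
    becomes the mean of the nearest known values before and after it."""
    out = list(series)
    for i, v in enumerate(series):
        if v is None:
            prev = next((series[j] for j in range(i - 1, -1, -1)
                         if series[j] is not None), None)
            nxt = next((series[j] for j in range(i + 1, len(series))
                        if series[j] is not None), None)
            known = [x for x in (prev, nxt) if x is not None]
            out[i] = sum(known) / len(known)
    return out

filled = lann_impute([10.0, None, 20.0])  # gap replaced by local average
```

In the paper's setting the "gaps" are positions in the LSTM forecast being smoothed against neighbouring values, rather than literally missing observations.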

Author 1: Anibal Flores
Author 2: Hugo Tito
Author 3: Deymor Centty

Keywords: Long Short-Term Memory; Local Average of Nearest Neighbors; univariate time series prediction; LANN; LANNc; Gated Recurrent Unit; GRU

PDF

Paper 55: Data Sanitization Framework for Computer Hard Disk Drive: A Case Study in Malaysia

Abstract: In the digital forensics field, data wiping is considered an anti-forensics technique. From another perspective, data wiping or data sanitization is used to ensure that deleted data cannot be accessed by any unauthorized person. This paper introduces a process for data sanitization of computer hard disk drives. The process was proposed and tested using commercial data sanitization tools. Multiple tests were conducted at the accredited digital forensics laboratory of CyberSecurity Malaysia. The data sanitization process was performed using the overwrite method provided by state-of-the-art data sanitization tools. Each sanitization tool offers options for the wiping (overwrite) process: either a single-pass write or a multi-pass write. Logical data checking of the hard disk sectors was performed before and after the data disposal process for proper verification, ensuring that every sector had been replaced by the sanitization bit pattern corresponding to the selected wiping technique. In conclusion, verification of data sanitization improves the process of ICT asset disposal.
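The single-pass overwrite-then-verify cycle described above can be sketched at file level (the paper's tools work at the level of raw disk sectors, which requires device access; a temporary file stands in here purely for illustration):

```python
import os
import tempfile

def wipe_file(path, pattern=b"\x00"):
    """Single-pass overwrite sketch: replace every byte of the file
    with a fixed bit pattern and force it to disk."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(pattern * size)
        f.flush()
        os.fsync(f.fileno())

def verify_wipe(path, pattern=b"\x00"):
    """Post-disposal check: every byte must now hold the pattern."""
    with open(path, "rb") as f:
        data = f.read()
    return data == pattern * len(data)

fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"sensitive data")
wipe_file(path)
ok = verify_wipe(path)
os.remove(path)
```

A multi-pass scheme simply repeats the overwrite with different patterns; the verification step, checking each sector against the final pattern, is the part the paper argues makes ICT asset disposal auditable.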

Author 1: Nooreen Ashilla Binti Yusof
Author 2: Siti Norul Huda Binti Sheikh Abdullah
Author 3: Mohamad Firham Efendy bin Md Senan
Author 4: Nor Zarina binti Zainal Abidin
Author 5: Monaliza Binti Sahri

Keywords: Data sanitization; anti-forensics; digital forensics; wiping technique

PDF

Paper 56: Render Farm for Highly Realistic Images in a Beowulf Cluster using Distributed Programming Techniques

Abstract: Nowadays, photorealistic images are demanded for the realization of scientific models, so rendering tools are used to convert three-dimensional models into highly realistic images. The problem with generating photorealistic images arises when the three-dimensional model becomes larger and more complex: the time to generate an image grows substantially due to hardware resource limitations. To address this problem, a render farm is implemented, consisting of a set of computers interconnected by a high-speed network, each of which renders a strip of the global image, with the intention of reducing the processing time of highly complex images. The research was implemented on a high-performance Beowulf cluster of the Universidad de Ciencias y Humanidades using a total of 18 computers. To demonstrate the efficiency of the render farm implementation, scalability tests were performed using a 360° equirectangular model with a total of 67 million pixels; the work achieves highly complex renderings in less time, to the benefit of the research direction.
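The strip decomposition a render farm relies on can be sketched as a simple row-range partition: each of the N nodes renders a contiguous band of the image, and the master reassembles the bands. The 4096-row height below is an invented example, not a figure from the paper.

```python
def strip_bounds(height: int, nodes: int):
    """Split an image of `height` rows into contiguous strips, one per node;
    the first `height % nodes` nodes take one extra row so all rows are covered."""
    base, extra = divmod(height, nodes)
    bounds, start = [], 0
    for i in range(nodes):
        rows = base + (1 if i < extra else 0)
        bounds.append((start, start + rows))
        start += rows
    return bounds

# e.g. a tall equirectangular render divided among the 18 cluster nodes
bounds = strip_bounds(4096, 18)
```

Because the strips are independent, the wall-clock render time ideally scales with 1/N, which is what the scalability tests on the 18-node Beowulf cluster measure against communication and reassembly overhead.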

Author 1: Enrique Lee Huamaní
Author 2: Patricia Condori
Author 3: Brian Meneses-Claudio
Author 4: Avid Roman-Gonzalez

Keywords: Distributed programming; computational parallelism; Beowulf cluster; high-efficiency computing; render farm

PDF

Paper 57: Performance Evaluation of IoT Messaging Protocol Implementation for E-Health Systems

Abstract: Nowadays, e-health and healthcare applications on the internet of things are growing rapidly. These applications range from remote monitoring of a patient's parameters at home to monitoring patients during their daily activities at work, in transportation, etc., so patients can be monitored anywhere outside hospitals and clinical settings. Using this technology, lives can be saved and the number of emergency visits to hospitals reduced. There is currently great progress and opportunity for internet of things (IoT) based e-health systems. Most IoT e-health platforms consist of three main parts: client nodes (patient or doctor), an IoT server and an IoT messaging protocol. One challenge in designing e-health systems over IoT is choosing the most suitable IoT messaging protocol for e-health applications. In this paper, an IoT remote patient and e-health monitoring system was designed for monitoring physiological medical signals of patients based on the two most popular IoT messaging protocols, MQTT and CoAP. The medical signals can include parameters such as heart rate, electrocardiograph (ECG), patient temperature, blood pressure, etc. This practical comparison between CoAP and MQTT aims to choose the protocol most suitable for e-health systems. The proposed approach was evaluated on the most significant protocol parameters, such as capability, efficiency, communication method and message delay. Practical and simulation results show the performance of the proposed e-health system over IoT for different network infrastructures with different loss percentages.

Author 1: M Zorkany
Author 2: K.Fahmy
Author 3: Ahmed Yahya

Keywords: E-health; IoT; IoT protocol; CoAP; MQTT and remote patient monitoring

PDF

Paper 58: Analysis of Password and Salt Combination Scheme To Improve Hash Algorithm Security

Abstract: In system security, hashes play an important role in keeping data secure and in managing access rights so that only those entitled have access. To increase the strength of hash algorithms, various methods are applied, one of them being the salting technique. Salt is usually attached as a prefix or postfix to the plaintext before hashing, but applying salt only as a prefix or postfix is not enough: there are many ways to recover the plaintext from the resulting ciphertext. This research discusses combination schemes other than prefix and postfix between password and salt to increase the security of hash algorithms. No system is truly secure and no algorithm is without loopholes, but this technique strengthens the security of the algorithm, buying more time if an attacker wants to break into the system. To measure the strength of each combination scheme, a tool called Hashcat is used. In this way, the best composition for applying salt to passwords can be identified.
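The prefix and postfix schemes, plus one example of an alternative combination, can be sketched with the standard library as below. The interleaving scheme shown is an illustrative possibility, not necessarily one of the paper's schemes, and the password/salt values are invented.

```python
import hashlib

def hash_prefix(password: str, salt: str) -> str:
    """Salt prepended to the password before hashing."""
    return hashlib.sha256((salt + password).encode()).hexdigest()

def hash_postfix(password: str, salt: str) -> str:
    """Salt appended to the password before hashing."""
    return hashlib.sha256((password + salt).encode()).hexdigest()

def hash_interleave(password: str, salt: str) -> str:
    """One alternative combination: interleave password and salt characters
    (hypothetical scheme for illustration)."""
    mixed = "".join(p + s for p, s in zip(password, salt))
    leftover = password[len(salt):] + salt[len(password):]
    return hashlib.sha256((mixed + leftover).encode()).hexdigest()

digests = {f("hunter2", "x9!qT3z")
           for f in (hash_prefix, hash_postfix, hash_interleave)}
```

Each scheme yields a different digest for the same password and salt, so an attacker who assumes plain prefix/postfix salting must also guess the combination rule, which is the extra time the abstract refers to.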

Author 1: Sutriman
Author 2: Bambang Sugiantoro

Keywords: Security; hash; hashing scheme; salting; password

PDF

Paper 59: Capturing Software Security Practices using CBR: Three Case Studies

Abstract: Generally, software security can be regarded as one of the most important issues in the software engineering field, since it may affect the effectiveness of a software product due to various technological vulnerabilities and menaces. Most traditional software security approaches provide security activities throughout the software development lifecycle (SDLC), from requirements to design, implementation, testing and deployment. This paper focuses on embedding security concerns in the SDLC using a bottom-up approach based on the case-based reasoning (CBR) paradigm. Thus, we study three high security-focused cases of software projects, namely "e-shop", "Mobiling" and "intranet", using a structured case study method. Then, we populate these three cases in the proposed framework as an excerpt of the case project base. Furthermore, this paper identifies the specificity of each case, discusses the completeness of the proposed framework and proposes suggestions for improvement. Finally, usage scenarios are defined to sustain the use of the proposed framework.

Author 1: Ikram Elrhaffari
Author 2: Ounsa Roudies

Keywords: CBR; project features; case base; e-shop; mobiling; intranet; mutualize; security practices; security requirements

PDF

Paper 60: An Efficient Model for Medical Data Classification using Gene Features

Abstract: To solve new issues in the medical field, novel approaches that manage relevant features using genomes are considered; the outcome of interest is analyzed using gene sub-sequences. In the implementation of the model, the MEDLINE and PubMed archives are given as inputs. A large number of MeSH terms together with gene and protein terms are utilized to characterize the patterns of a large number of medical documents from a large set of records. Standard datasets with different characteristics are used for the evaluation study, and the characteristics and inadequacies of different techniques are noted. Feature selection techniques are chosen with respect to data types and domain attributes by applying appropriate rules. Feature context extraction through named entity identification is an essential task in online medical document classification for knowledge discovery databases. Parameters are identified to compare with other models implemented on these datasets, and the results prove that the proposed method is more effective than existing models. The primary aim of the proposed ensemble learning models is to classify high-dimensional data for gene/protein-based disease prediction over substantial biomedical databases. The proposed model uses an efficient ranking algorithm to select the relevant attributes from the set of all attributes; the selected attributes are given to the classifier to improve accuracy based on the user's interest.

Author 1: Kosaraju Chaitanya
Author 2: Rachakonda Venkatesh
Author 3: Thulasi Bikku

Keywords: Classification; Hadoop framework; biomedical documents; feature selection; gene features; medical datasets

PDF

Paper 61: Mutual Authentication Security Scheme in Fog Computing

Abstract: The fog paradigm is a new and emerging technology that extends the services of cloud computing to the edge of the network. This paradigm aims to provide rich resources near edge devices and to remove deficiencies of cloud computing such as latency. However, the paradigm is distributed in nature and does not guarantee the trustworthiness and good behavior of edge devices. Thus, authentication and key exchange are significant challenges for this new paradigm, and researchers have worked on different authentication and key exchange protocols. Recently, Maged Hamada Ibrahim proposed an authentication scheme that permits a fog user to mutually authenticate with a fog server under the authority of a cloud service provider. Alongside, Amor et al. proposed an anonymous mutual authentication scheme in which the fog user and fog server authenticate each other without disclosing the user's real identity, using a public-key cryptosystem. However, we demonstrate that Ibrahim's scheme does not preserve user anonymity and is hence exposed to man-in-the-middle attacks. The scheme of Amor et al. is computationally complex, as its public-key cryptosystem has low throughput and requires large memory, which is not suitable for fog computing connecting internet of things devices that have small memory and require high throughput. Therefore, to overcome the aforementioned security problems and internet of things constraints, an improved mutual authentication security scheme based on the advanced encryption standard and hashed message authentication codes in fog computing is proposed. Our scheme provides mutual authentication between internet of things devices and fog servers. We prove that the proposed improved scheme provides secure mutual authentication using the widely accepted Burrows-Abadi-Needham (BAN) logic. In this study, the performance, security, and functionality properties are analyzed and compared with existing and related mutual authentication schemes. Our scheme provides better security, functionality, and communication and computation cost than existing schemes.
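The HMAC side of such a symmetric-key design can be sketched as a nonce-based challenge-response: each party proves knowledge of a pre-shared key by returning an HMAC over the peer's fresh nonce. This is a generic illustration of the primitive, not the paper's protocol (which also involves AES and a cloud-provisioned key hierarchy); the key below is invented.

```python
import hashlib
import hmac
import secrets

KEY = b"pre-shared-secret"  # hypothetical key provisioned by the cloud provider

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of the shared key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(device_key: bytes, server_key: bytes) -> bool:
    """Both sides issue a fresh nonce and verify the peer's HMAC over it."""
    n_server = secrets.token_bytes(16)   # server -> device challenge
    n_device = secrets.token_bytes(16)   # device -> server challenge
    device_tag = respond(device_key, n_server)
    server_tag = respond(server_key, n_device)
    server_ok = hmac.compare_digest(device_tag, respond(server_key, n_server))
    device_ok = hmac.compare_digest(server_tag, respond(device_key, n_device))
    return server_ok and device_ok
```

Fresh nonces on both sides stop replayed tags, and `hmac.compare_digest` avoids timing leaks during verification; symmetric primitives like these keep computation and memory costs within IoT constraints, which is the comparison the abstract draws against public-key schemes.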

Author 1: Gohar Rahman
Author 2: Chuah Chai Wen

Keywords: Fog computing; mutual authentication; man-in-the-middle attack; key exchange; BAN logic

PDF

Paper 62: Milk Purity Recognition Software through Image Processing

Abstract: Currently in Peru, per capita milk consumption is 87 kg per year, while the Food and Agriculture Organization of the United Nations (FAO) recommends a consumption of 120 kg per person. When the industry acquires milk from small livestock suppliers, it does not analyze the milk before buying it, so there is a high risk that the milk has been adulterated with water. In this sense, this work proposes an alternative way of preliminarily detecting the presence of water in milk using only a laser and a photograph, which greatly reduces the cost of milk analysis. Milk contains different nutrients, vitamins, and minerals that are beneficial for people, so it is important to know whether it is adulterated in order to prevent disease. This paper presents an alternative to the existing methods for the analysis of milk; the presented method applies the Matlab Classification Learner with the fine K-Nearest Neighbors (KNN) algorithm, with which a success rate of 95.4% was obtained.
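The fine-KNN decision rule mentioned in the abstract can be sketched in a few lines of pure Python. The feature vectors below are hypothetical stand-ins for features extracted from the laser photograph; the paper's actual implementation uses Matlab Classification Learner, not this code.

```python
from collections import Counter
import math

def knn_classify(train, labels, query, k=3):
    """Majority vote among the k training points nearest to `query`."""
    nearest = sorted((math.dist(x, query), y) for x, y in zip(train, labels))[:k]
    return Counter(y for _, y in nearest).most_common(1)[0][0]

# Hypothetical 2-D feature vectors (e.g., laser-spot intensity statistics).
train = [(0.90, 0.80), (0.85, 0.75), (0.20, 0.30), (0.25, 0.20)]
labels = ["pure", "pure", "adulterated", "adulterated"]
print(knn_classify(train, labels, (0.88, 0.79)))  # -> pure
```

A "fine" KNN in Classification Learner simply uses a small k (here k=3), which makes the decision boundary follow the training data closely.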

Author 1: Alvarado-Díaz Witman
Author 2: Meneses-Claudio Brian
Author 3: Roman-Gonzalez Avid

Keywords: Milk; adulterated milk; milk with water; milk analysis; image processing; classification learner

PDF

Paper 63: Time and Frequency Analysis of Heart Rate Variability Data in Heart Failure Patients

Abstract: The paper presents a mathematically based analysis of the heart rate variability of two groups of cardiology records: healthy individuals and patients diagnosed with heart failure. The main objective of the study is to perform a parametric evaluation of the cardiovascular system of the human body using time-domain and frequency-domain analysis of heart rate variability. Distinguishing between diseased and healthy individuals is an interesting challenge that contemporary researchers are working on. Cardiologic records obtained through continuous Holter monitoring (24 hours) were used to address the issues in this study. The obtained results show significantly reduced values of most of the studied parameters in the time domain and the frequency domain in patients with heart failure compared to healthy individuals. The low values of the studied parameters indicate low heart rate variability and poor overall health status. Graphical results for the two study groups, obtained by applying the modified Welch periodogram, are shown; they give a visual idea of the variability of the time series in healthy and diseased individuals. The obtained numeric and graphical results show that heart failure patients can be distinguished from healthy individuals. The applied mathematical methods for studying heart rate variability can be used as an aid in the cardiology practice of doctors.
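Two of the standard time-domain HRV parameters such a study relies on, SDNN and RMSSD, can be computed directly from a list of RR intervals. A minimal sketch follows; the RR intervals are invented for illustration, not taken from the study's Holter data.

```python
import math

def sdnn(rr):
    """SDNN: sample standard deviation of RR intervals (ms), overall variability."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr_healthy = [812, 790, 845, 820, 798, 835, 808]  # invented RR intervals (ms)
print(round(sdnn(rr_healthy), 1), round(rmssd(rr_healthy), 1))
```

Reduced SDNN and RMSSD in a patient series, relative to a healthy one, is the kind of "significantly reduced time-domain parameter" the abstract refers to.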

Author 1: Galya N Georgieva-Tsaneva

Keywords: Heart rate variability; time domain analysis; frequency domain analysis; Welch periodogram; heart failure

PDF

Paper 64: E-learning Benchmarking Adoption: A Case Study of Sur University College

Abstract: As an integral tool nowadays, e-learning presents a fresh paradigm in the fields of education and management. The effectiveness of e-learning in improving learning and teaching methods has been proven. Accordingly, countless works have been carried out to understand the adoption of e-learning, particularly the extent of its use. Furthermore, a new model of e-learning success based on the DeLone & McLean Information System Success (IS Success) model and the Diffusion of Innovation (DOI) theory, as well as the effects of e-learning on students' performance, is comprehensively highlighted. Appositely, e-learning benchmarking adoption among students is characterized and measured in this study by integrating DOI with the IS Success attributes of such adoption. In this quantitative study, data were gathered from the students enrolled in SUC. The results indicate that the variables of relative advantage, complexity, system quality, information quality, and service quality have a significant linkage to the adoption of e-learning, and that net benefit is significantly correlated with the adoption of e-learning. Regarding the methods used for examining the adoption of e-learning in particular, future studies could benefit from combining quantitative and qualitative methods.

Author 1: Saleem Issa Al Zoubi
Author 2: Ahmad Issa Alzoubi

Keywords: e-learning; DOI; IS success; net benefit; extent of usage; LMS; benchmarking; quality assurance

PDF

Paper 65: Development of Warning Device in Risk Situations for Children with Hearing Impairment at Low Cost

Abstract: Hearing impairment is the partial or total loss of hearing. There are approximately 34 million hearing-impaired children in the world. The equipment used as a means of communication to improve interaction with society is very expensive, so in this study an electronic device was built with the ability to recognize some words configured as an emergency message. This equipment will be used as a basic means of communication for hearing-impaired children at a very low price. The equipment consists of a transmitter and a receiver, which communicate over Wi-Fi 802.11 at distances between 0 m and 95 m using low-power electronic devices and recent technology such as WeMos D1 mini Lite boards. The device was tested on approximately 20 people caring for hearing-impaired children, obtaining an approval rate of approximately 74%. This is the first step in research that we plan to continue in order to reduce health gaps and improve communication for children with disabilities. Our group works for preventive health, reducing health gaps in the most vulnerable population.

Author 1: Kevin Rodriguez-Villarreal
Author 2: Zumaeta-Mori Jhon
Author 3: Alva Mantari Alicia
Author 4: Roman-Gonzalez Avid

Keywords: Hearing disability; device; low cost; algorithm; communication

PDF

Paper 66: A Deep-Learning Model for Predicting and Visualizing the Risk of Road Traffic Accidents in Saudi Arabia: A Tutorial Approach

Abstract: Around the world, road traffic accidents (RTAs) cause significant concern for decision makers and researchers in traffic safety. The diversity, rarity, and interconnectivity of historical data on the factors causing car accidents point to the need for more focused studies for analyzing, predicting, and visualizing the risk of accidents over the short and long term for preventive purposes. There are many techniques and tools applied to analyze, forecast, and visualize risk. Most RTA studies have applied linear time-series methods to forecast the risk, with limited studies applying machine-learning and deep-learning techniques, especially in Saudi Arabia. Recently, many global studies have applied long short-term memory (LSTM) networks, which can automatically learn the temporal dependence structures in challenging time-series forecasting problems. This paper presents a tutorial for designing a prototype of an interactive analytical tool based on a multivariate LSTM model for time-series data to predict future car accidents, fatalities, and injuries in the Kingdom of Saudi Arabia (KSA). The interactive tool visualizes the real data together with the predicted values, by region, in a web browser using Python. The tutorial uses the annual data of the period between 1417 (1996) and 1433 (2013), together with contributing factors such as population, gender, nationality, number of vehicles, and length of roads, to generate the input data and predict the future values of accidents, fatalities, and injuries up to the year 1452 (2030). After that, the real and predicted values are visualized regionally on an interactive map that represents the degree of risk. Finally, the paper discusses the evaluation and future utilization of the proposed prototype in the field of road safety.
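Before an LSTM can be trained on annual data like this, the series must be framed as supervised (window, next value) samples. A minimal windowing sketch follows; the counts are invented, not the KSA data, and a real multivariate input would carry extra columns per year.

```python
def make_windows(series, n_in=3):
    """Frame a time series as (input window, next value) supervised pairs,
    the form in which an LSTM is trained for one-step-ahead forecasting."""
    X, y = [], []
    for i in range(len(series) - n_in):
        X.append(series[i:i + n_in])
        y.append(series[i + n_in])
    return X, y

# Invented annual accident counts; a real input would add columns for
# population, vehicles, road length, and the other contributing factors.
counts = [100, 120, 150, 170, 200, 230]
X, y = make_windows(counts, n_in=3)
print(X[0], "->", y[0])  # -> [100, 120, 150] -> 170
```

Forecasting out to 1452 (2030) is then done recursively: each predicted value is appended to the series and becomes part of the next input window.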

Author 1: Maram Alrajhi
Author 2: Mahmoud Kamel

Keywords: LSTM for time-series forecasting; deep learning; RTA; data visualization; interactive map; Saudi Arabia

PDF

Paper 67: Optimization of Multi-Product Aggregate Production Planning using Hybrid Simulated Annealing and Adaptive Genetic Algorithm

Abstract: In aggregate production planning, company stakeholders need a long time because of the many production variables that must be considered so that production can meet consumer demand at minimal cost. The case study concerns a company that produces more than one type of product, so several variables must be considered and considerable computational time is required. A Genetic Algorithm is applied because it has the advantage of searching a large solution space, but it is often trapped in locally optimal solutions. In this study, the authors propose a new mathematical model in the form of a fitness function aimed at assessing the quality of a solution. To overcome the local optimum problem, the authors refine the approach by combining the Genetic Algorithm with Simulated Annealing, a so-called hybrid approach. The function of Simulated Annealing is to improve every solution produced by the Genetic Algorithm. The proposed hybrid method is shown to produce better solutions.
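The hybrid idea, in which Simulated Annealing refines each Genetic Algorithm offspring, can be sketched on a toy one-dimensional cost function. This is an illustrative sketch under assumed parameters, not the paper's multi-product model or fitness function.

```python
import math
import random

random.seed(42)

def cost(x):
    """Toy cost surface with local minima, standing in for the production plan's
    fitness function (lower is better)."""
    return x * x + 10 * math.cos(3 * x)

def anneal(x, temp=5.0, cooling=0.9, steps=50):
    """Simulated-annealing refinement of one GA solution."""
    best = cur = x
    for _ in range(steps):
        cand = cur + random.uniform(-0.5, 0.5)
        delta = cost(cand) - cost(cur)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        temp *= cooling
    return best

def hybrid_ga_sa(pop_size=10, gens=20):
    pop = [random.uniform(-5, 5) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]                       # elitist selection
        children = [(a + b) / 2 + random.gauss(0, 0.3)      # crossover + mutation
                    for a, b in zip(parents, reversed(parents))]
        pop = parents + [anneal(c) for c in children]       # SA refines offspring
    return min(pop, key=cost)

best = hybrid_ga_sa()
```

The SA step is what lets an offspring escape the local basin the GA dropped it into, which is exactly the role the abstract assigns to Simulated Annealing.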

Author 1: Gusti Eka Yuliastuti
Author 2: Agung Mustika Rizki
Author 3: Wayan Firdaus Mahmudy
Author 4: Ishardita Pambudi Tama

Keywords: Aggregate; genetic algorithm; hybrid; production planning; simulated annealing

PDF

Paper 68: Hybrid Topic Cluster Models for Social Healthcare Data

Abstract: Social media, and in particular microblogs, are becoming an important data source for disease surveillance, behavioral medicine, and public healthcare. Topic models are widely used in microblog analytics for analyzing and integrating the textual data within a corpus. This paper uses health tweets as microblogs and attempts health data clustering with topic models. Traditional topic models, such as Latent Semantic Indexing (LSI), Probabilistic Latent Semantic Indexing (PLSI), Latent Dirichlet Allocation (LDA), Non-negative Matrix Factorization (NMF), and integer Joint NMF (intJNMF), are used for health data clustering; however, they cannot readily assess the number of health topic clusters. Proper visualizations are essential for extracting information from the data and identifying its trends, as collections may include thousands of documents and millions of words. For visualization of topic clouds and health tendency in the document collection, we present hybrid topic models that integrate traditional topic models with VAT. The proposed hybrid topic models, viz., Visual Non-negative Matrix Factorization (VNMF), Visual Latent Dirichlet Allocation (VLDA), Visual Probabilistic Latent Semantic Indexing (VPLSI), and Visual Latent Semantic Indexing (VLSI), are promising methods for assessing health tendency and visualizing topic clusters from benchmarked and Twitter datasets. Evaluation and comparison of the hybrid topic models are presented in the experimental section to demonstrate their efficiency with different distance measures, including Euclidean distance, cosine distance, and multi-viewpoint cosine similarity.

Author 1: K Rajendra Prasad
Author 2: Moulana Mohammed
Author 3: R M Noorullah

Keywords: Multi-viewpoint based metric; traditional topic models; hybrid topic models; topic visualization; health tendency

PDF

Paper 69: Lizard Cipher for IoT Security on Constrained Devices

Abstract: Over the past decades, security has become the most challenging task in the Internet of Things. Therefore, a convenient hardware cryptographic module is required to provide accelerated cryptographic operations such as encryption. This study investigates the implementation of the Lizard cipher on three Arduino platforms to determine its performance. The study successfully implements the Lizard cipher on the Arduino platform as a constrained device, resulting in 0.98 MB of memory utilization. The execution time of the Lizard cipher is compared among Arduino variants, i.e., Arduino Mega, Arduino Nano, and Arduino Uno, with an ANOVA test. Tukey's HSD post-hoc test reveals that the execution time is significantly slower on the Arduino Mega compared to the Arduino Nano and Arduino Uno. This result will help IoT security engineers select a lightweight cipher that suits the constraints of the target device.
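The one-way ANOVA behind this comparison reduces to a ratio of between-group to within-group variance. A pure-Python sketch follows; the timings are hypothetical stand-ins, not the paper's measurements, which would normally be followed by Tukey's HSD to locate the differing pairs.

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented execution times (ms): Mega noticeably slower than Nano and Uno.
mega = [12.1, 12.3, 12.2]
nano = [9.8, 9.9, 9.7]
uno = [9.9, 10.0, 9.8]
f = anova_f([mega, nano, uno])  # a large F suggests a real group difference
```

In practice `scipy.stats.f_oneway` computes the same statistic plus its p-value; the hand-rolled version above just makes the arithmetic visible.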

Author 1: Ari Kusyanti
Author 2: Rakhmadhany Primananda
Author 3: Kalbuadi Joyo Saputro

Keywords: Lizard cipher; IoT security; Arduino; ANOVA

PDF

Paper 70: A Generic Approach for Weight Assignment to the Decision Making Parameters

Abstract: Weight assignment to the decision parameters is a crucial factor in the decision-making process. Any imprecision in assigning weights to the decision attributes may render the whole decision-making process useless and ultimately prevent decision-makers from finding an optimal solution. Therefore, the attribute weight allocation process should be flawless and rational, not just the assignment of random values to attributes without a proper analysis of each attribute's impact on the decision-making process. Unfortunately, there is no sophisticated mathematical framework for analyzing an attribute's impact on the decision-making process, and thus the weight allocation task is accomplished based on human judgment. To fill this gap, the present paper proposes a weight assignment framework that analyzes the impact of an attribute on the decision-making process and, based on that, evaluates each attribute with a justified numerical value. The proposed framework analyzes historical data to assess the importance of an attribute, organizes the decision problems in a hierarchical structure, and uses different mathematical formulas to derive explicit weights at different levels. Weights of mid- and higher-level attributes are calculated from the weights of root-level attributes. The proposed methodology has been validated with diverse data. In addition, the paper presents some potential applications of the proposed weight allocation scheme.
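The idea of deriving leaf weights from historical data and rolling them up a hierarchy can be illustrated with a toy sketch. The frequency counts and the simple proportional rule below are illustrative assumptions, not the paper's exact formulas (which involve average term frequency and cosine similarity).

```python
def leaf_weights(freqs):
    """Normalize historical frequency counts into root-level attribute weights."""
    total = sum(freqs.values())
    return {attr: f / total for attr, f in freqs.items()}

def parent_weight(children, weights):
    """A mid-level attribute inherits the summed weight of its child attributes."""
    return sum(weights[c] for c in children)

# Invented counts of how often each attribute drove past decisions.
freqs = {"price": 40, "quality": 35, "delivery": 25}
w = leaf_weights(freqs)                                 # weights sum to 1
cost_related = parent_weight(["price", "delivery"], w)  # price + delivery
```

Normalizing so that root-level weights sum to one makes the mid- and higher-level weights directly comparable across branches of the hierarchy.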

Author 1: Md Zahid Hasan
Author 2: Shakhawat Hossain
Author 3: Mohammad Shorif Uddin
Author 4: Mohammad Shahidul Islam

Keywords: Multiple attribute decision problem; average term frequency; cosine similarity; weight setup for multiple attributes; decision making

PDF

Paper 71: A Comparative Study of the Most Influential Learning Styles used in Adaptive Educational Environments

Abstract: E-learning has evolved from traditional content delivery approaches to personalized, adaptive, and learner-centered knowledge transfer. When customizing the learning experience, learning styles represent key features that cannot be neglected. A learning style designates any representative characteristic of an individual while learning, i.e., a particular way of dealing with a given learning task, the preferred media, or the learning strategies adopted in order to achieve a task. The use of learning styles in adaptive educational environments has become controversial, as there is no conclusive empirical evidence of their usefulness. The main objective of our paper is to answer the question: "Which learning style model is most appropriate for use in adaptive educational environments?"

Author 1: Othmane ZINE
Author 2: Aziz DEROUICH
Author 3: Abdennebi TALBI

Keywords: Adaptive educational environments; learning style; Felder-Silverman; personalization; learner modeling

PDF

Paper 72: Performance Analysis of Double Gate Junctionless Tunnel Field Effect Transistor: RF Stability Perspective

Abstract: This paper investigates the RF stability performance of the Double Gate Junctionless Tunnel Field Effect Transistor (DGJL-TFET). The impact of geometrical parameters, material, and bias conditions on key figures of merit (FoM) such as transconductance (gm) and gate capacitance (Cgg), and on RF parameters such as the Stern stability factor (K) and critical frequency (fk), is investigated. The analytical model provides the relation between fk and the small-signal parameters, which gives guidelines for optimizing the device parameters. The results show improvements in ON current, gm, ft, and fk for the optimized device structure. The optimized device parameters provide guidelines for operating the DGJL-TFET in RF applications.

Author 1: Veerati Raju
Author 2: Sivasankaran K

Keywords: Junctionless tunnel FET; band to band tunnelling; High-k; RF stability; critical frequency

PDF

Paper 73: A Robust Optimization Approach of SQL-to-SPARQL Query Rewriting

Abstract: In order to ensure interoperability between the semantic web and relational databases, several approaches have been developed for the SQL-to-SPARQL query transformation direction, but all of these approaches share the same weakness: they convert the input SQL query directly into its equivalent SPARQL query without any pre-processing phase that would optimize the input query submitted by users before starting the conversion process. This weakness motivated us to add a pre-treatment phase aiming to optimize the most important SQL statements, which seem to have the biggest impact on the effectiveness of the transformed queries. Our main contribution is to enrich these rewriting systems with an optimization layer that integrates a set of simplification rules for Left, Right, and Full Outer Joins in order to avoid, first, unnecessary operations during the conversion process and, second, SPARQL queries of high complexity due to the Optional patterns obtained from outer joins in this conversion context.

Author 1: Ahmed Abatal
Author 2: Mohamed Bahaj
Author 3: Soussi Nassima

Keywords: SQL-to-SPARQL; outer join optimization; query transformation; SQL simplification; query optimization layer

PDF

Paper 74: Impact of Scrum and Tactic Workflow Management System on Organization Performance (A Study on Animation Studios in Pakistan)

Abstract: Assessing the efficiency impact of a scrum management system is a significant and difficult issue for a researcher. We suggest that the effectiveness of scrum management system applications is best understood by analyzing them at the information-processing level in the animation studio, since the flow of information through the organization's processes is relatively loosely coupled, which constrains the measurement problem. Based on the results of this research, we conclude that after implementing the scrum management system, the organization becomes more efficient and effective in its production activities. Moreover, its performance improves and most problems are resolved easily, leading to better productivity and a better reputation in the market. The study also attempts to emphasize the impact of the scrum management system on the animation studio and to determine how the scrum management system helps an animation studio work effectively. The research suggests that animation studios should provide flexibility in implementing management information systems, should promote control of the company's market, and should acquire adequate software and appropriate programs through communication media in order to meet the global scrum management system business conditions in a growing and expanding market.

Author 1: Abdul Wahab Khan
Author 2: Usman Khan
Author 3: Maaz Bin Ahmad
Author 4: Farhan Shafique

Keywords: MIS (Management Information System); scrum management system; deployment; performance; animation studios

PDF

Paper 75: Atmospheric Light Estimation using Particle Swarm Optimization for Dehazing

Abstract: For the past decade, many researchers have been working towards improving the visibility of single hazy images using the haze image model. According to the haze image model, the haze-free image is restored by estimating the atmospheric light and transmission from a hazy image. The objective of this work is to improve perceptibility by decreasing the density of haze in hazy images. The research estimates the optimal value of atmospheric light by tuning the weights using a bio-inspired technique called Particle Swarm Optimization (PSO), with the objective of minimizing fog density. We selected a fitness (objective) function that incorporates statistical features which differentiate a clear image from a hazy one. The results are validated against the state of the art by measuring the fog density of the restored image using the Fog Aware Density Evaluator (FADE). The results are also validated by measuring the peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) using ground-truth images from the Foggy Road Image Database (FRIDA). This research demonstrates better results both qualitatively and quantitatively.
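A minimal PSO loop for tuning a single atmospheric-light parameter looks as follows. The quadratic `fog_density` stand-in replaces the real FADE-based objective, and the optimum at 0.85 is an arbitrary illustrative value, not a result from the paper.

```python
import random

random.seed(1)

def fog_density(a):
    """Stand-in objective: pretend the best atmospheric light is near 0.85.
    A real objective would score the restored image with FADE."""
    return (a - 0.85) ** 2

def pso(obj, lo=0.0, hi=1.0, n=15, iters=40, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm minimizing obj over the interval [lo, hi]."""
    pos = [random.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]                       # each particle's best position so far
    gbest = min(pos, key=obj)            # swarm-wide best position
    for _ in range(iters):
        for i in range(n):
            vel[i] = (w * vel[i]
                      + c1 * random.random() * (pbest[i] - pos[i])
                      + c2 * random.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i]
            if obj(pos[i]) < obj(gbest):
                gbest = pos[i]
    return gbest

best_a = pso(fog_density)  # converges close to 0.85
```

The same loop extends to a vector of weights by making `pos`, `vel`, `pbest`, and `gbest` lists of vectors; the inertia (w) and acceleration (c1, c2) constants are conventional defaults.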

Author 1: Padmini T N
Author 2: Shankar. T

Keywords: Hazy images; particle swarm optimization; dark channel prior; transmission; atmospheric light

PDF

Paper 76: Intelligent Pedagogical Model with Kinesthetic-Static Immersion based on the Neuro-Linguistic Programming Approach (NLP)

Abstract: In this paper, the authors propose a teaching/learning pedagogical model based on an approach that uses neuro-linguistic programming, educational data mining, and haptic interaction. It also uses the theory of learning styles, which are identified with data mining techniques (clustering and the Farthest First algorithm) as well as a Neuro-Linguistic Programming test. Depending on the results obtained, the teaching/learning strategies are defined and educational coaching activities are suggested, with the purpose of boosting the students' attention in the classroom and stimulating their communicative and psychomotor skills. The proposal was evaluated with a sample of students in regular basic education, to whom an instrument was administered before and after the planned teaching/learning activities were carried out. For this purpose, a multifunctional learning kit was constructed, a didactic and playful resource applicable to the student's psychomotor area. The kit contains an application and a hardware device called "Tusuna-pad 1.0", which was implemented in the Unity game engine and programmed in the C# language. The pedagogical model was validated with the participation of students of regular basic education, considering pedagogical and computational aspects, and the results were duly analyzed. Finally, conclusions and recommendations for future work are established.

Author 1: Simón Choquehuayta Palomino
Author 2: José Herrera Quispe
Author 3: Luis Alfaro
Author 4: Blas Choquehuayta Llamoca

Keywords: Haptic interaction; virtual immersion; learning styles; neuro-linguistic programming; educational data mining

PDF

Paper 77: Budgets Balancing Algorithms for the Projects Assignment

Abstract: This paper focuses on the resolution of the project assignment problem. Several heuristics are developed and proposed in this paper to serve as lower bounds for the studied problem. In a developing country, it is important to distribute projects equitably among cities in order to guarantee equality and regional development. Each project is characterized by its budget. The problem is to find an appropriate schedule that assigns all projects to the cities, maximizing the budget of the city that has the minimum budget. In this paper, six heuristics are proposed to achieve this objective. The experimental results show that the algorithm given by the heuristic P6r outperforms all the other heuristics cited in this paper.
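A simple greedy heuristic for this max-min objective, in the spirit of (but not identical to) the paper's heuristics such as P6r, assigns each project, largest budget first, to the currently poorest city:

```python
def assign_projects(budgets, n_cities):
    """Assign each project (largest budget first) to the currently poorest city,
    pushing up the minimum total budget across cities."""
    totals = [0] * n_cities
    schedule = [[] for _ in range(n_cities)]
    for b in sorted(budgets, reverse=True):
        poorest = totals.index(min(totals))
        totals[poorest] += b
        schedule[poorest].append(b)
    return totals, schedule

totals, schedule = assign_projects([9, 7, 6, 5, 4, 3], 3)
print(totals, min(totals))  # -> [12, 11, 11] 11
```

This is the classic longest-processing-time idea transplanted to budgets; such greedy schedules are exactly the kind of fast lower bound the abstract describes.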

Author 1: Mahdi Jemmali

Keywords: Heuristic; scheduling algorithms; project assignment

PDF

Paper 78: Browser Extension based Hybrid Anti-Phishing Framework using Feature Selection

Abstract: Phishing is one of the socially engineered cybersecurity attacks in which the attacker impersonates a genuine and legitimate website and sends emails with the intention of stealing sensitive personal information. The phishing websites' URLs are usually spread through emails by luring users to click on them, or by embedding a link to a fake website replicating a genuine e-commerce website inside an invoice or other document. The phishing problem is very broad, and no single solution exists to mitigate all the vulnerabilities properly; thus, multiple techniques are often combined and implemented to mitigate specific attacks. The primary objective of this paper is to propose an efficient and effective anti-phishing solution that can be implemented at the client side in the form of a browser extension and that is capable of handling real-time scenarios and zero-day attacks. The proposed approach works efficiently for any phishing-link carrier mode, as execution on clicking any link or manually entering a URL in the browser does not proceed unless the proposed framework approves that the website associated with that URL is genuine. The proposed framework can also handle DNS cache poisoning attacks even if the system's DNS cache is somehow infected. The paper first presents a comprehensive review that broadly discusses the phishing life cycle and available anti-phishing countermeasures. The proposed framework considers the pros and cons of existing methodologies and presents a robust solution by combining the best features to ensure a fast and accurate response. The effectiveness of the approach is tested on a real-time dataset consisting of live phishing and legitimate website URLs, and the framework is found to be 98.1% accurate in identifying websites correctly in very little time.

Author 1: Swati Maurya
Author 2: Harpreet Singh Saini
Author 3: Anurag Jain

Keywords: Anti-phishing; browser extension; machine learning; feature selection

PDF

Paper 79: Autonomous Navigation of Unmanned Aerial Vehicles based on Android Smartphone

Abstract: In the past few years, the adoption of drone technology across industries has increased dramatically as more businesses have started to recognize its potential, uses, and scale of global reach. This paper proposes a design solution for a smart GPS quadcopter navigation system, discusses its hardware and software implementation process, and finally analyses and reports the test results. The flight path of the quadrotor is remotely manipulated via an Android-based graphical user interface. This outdoor handheld application allows the operator to select a point of interest through the Google Maps satellite view; consequently, the quadrotor takes off, hovers, and ultimately lands at the destination location. Instructions together with coordinates are sent and received through a web server, which handles the communication between the smartphone and the quadrotor. Experimental results show reliable data communication and successful autonomous flight control with smooth and stable maneuvering.

Author 1: Talal Bonny
Author 2: Mohamed B. Abdelsalam

Keywords: UAV; drone; quadrotor; android; Arduino; GPS; GPRS; GSM

PDF

Paper 80: SQL to SPARQL Conversion for Direct RDF Querying

Abstract: With the advances in native storage means for RDF data and the associated querying capabilities using SPARQL, there is a need to let SQL users benefit from such capabilities for interoperability objectives and without any conversion of the RDF data into relational data. In this sense, this work presents SQL2SPARQL4RDF, an automatic conversion algorithm of SQL queries into SPARQL queries for querying RDF data, which extends the previously established algorithm with relevant SQL elements such as queries with INSERT, DELETE, GROUP BY, and HAVING clauses. SQL users are provided with a relational schema of their RDF data against which they can formulate their SQL queries, which are then converted into equivalent SPARQL queries with respect to the provided schema. This avoids the burden of translating instances and replicating data, thus saving loading time and guaranteeing fast execution, especially in the case of massive amounts of data. In addition, the automatic mapping framework was developed in the Java programming language and implements many new mapping functionalities. Furthermore, to test and validate the efficiency of the mapping approach, a module was added for the automatic execution and evaluation of the various obtained SPARQL queries on AllegroGraph.

Author 1: Ahmed ABATAL
Author 2: Khadija Alaoui
Author 3: Larbi Alaoui
Author 4: Mohamed Bahaj

Keywords: Resource Description Framework (RDF); Structured Query Language (SQL); Simple Protocol and RDF Query Language (SPARQL); schema mapping; query conversion; AllegroGraph

PDF

Paper 81: Instagram Shopping in Saudi Arabia: What Influences Consumer Trust and Purchase Decisions?

Abstract: The recent developments of social networking sites (SNSs), along with the increasing usage of online shopping, have led to the emergence of social commerce platforms. Social commerce (s-commerce) is the use of Web 2.0 technologies and social media to deliver e-commerce services to consumers. The Kingdom of Saudi Arabia (KSA) has been witnessing rapid growth in s-commerce usage, with Instagram being the most popular network in the region. This paper is one of the few that investigate the factors affecting consumers' trust and purchase intentions on Instagram as an s-commerce platform in Saudi Arabia. The proposed model explores a number of factors, such as Social Media Influencers (SMIs), Key Opinion Leaders (KOLs), and consumer feedback, in terms of their influence on consumers' trust and purchase decisions, in addition to the effect of Maroof, an e-service provided by the Saudi Ministry of Commerce and Investment to evaluate the reliability of online stores. Following a quantitative approach and using Partial Least Squares Structural Equation Modeling (PLS-SEM), the findings of this study reveal a positive relationship between consumers' trust and their purchase intentions. Additionally, the impact of SMIs and consumer feedback was shown to increase consumers' trust, in turn affecting the intent to buy from Instagram stores, while the effect of Maroof and KOLs was shown to directly influence consumers' purchase intentions.

Author 1: Taghreed Shaher Alotaibi
Author 2: Afnan Abdulrahman Alkhathlan
Author 3: Shaden Saad Alzeer

Keywords: Social commerce; Instagram; trust; purchase intention; Maroof; key opinion leaders; social media influencers

PDF

Paper 82: Evaluate Metadata of Sparse Matrix for SpMV on Shared Memory Architecture

Abstract: Sparse matrix operations are frequently used in scientific, engineering, and high-performance computing (HPC) applications. Among them, sparse matrix-vector multiplication (SpMV) is a popular kernel and is considered an important numerical method in science, engineering, and scientific computing. However, SpMV is a computationally expensive operation. Its performance depends on certain factors; choosing the right storage format for the sparse matrix is one of them. The data access pattern, the sparsity of the matrix data set, load balancing, sharing of the memory hierarchy, etc., are other factors that affect performance. Metadata describing the substructure of the sparse matrix, such as its shape, density, and sparsity, also affects the performance of any sparse matrix operation. The various approaches presented in the literature over the last few decades have given good results for certain types of matrix structures and do not perform as well with others; developers are thus faced with difficulty in choosing the most appropriate format. In this research, an approach is presented that evaluates the metadata of a given sparse matrix and suggests to developers the most suitable storage format to use for SpMV.
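The kind of metadata evaluation the paper describes can be sketched as follows. The thresholds in `suggest_format` are illustrative assumptions, not the paper's actual decision rules, and the matrix is given as a dense listing only for clarity.

```python
def matrix_metadata(rows):
    """Extract simple metadata (density, nonzeros per row) from a dense listing."""
    n, m = len(rows), len(rows[0])
    nnz_per_row = [sum(1 for v in r if v != 0) for r in rows]
    nnz = sum(nnz_per_row)
    return {"density": nnz / (n * m),
            "max_row_nnz": max(nnz_per_row),
            "avg_row_nnz": nnz / n}

def suggest_format(meta):
    """Map metadata to a storage format; thresholds are illustrative only."""
    if meta["density"] > 0.5:
        return "DENSE"                      # mostly full: no point compressing
    if meta["max_row_nnz"] <= 2 * meta["avg_row_nnz"]:
        return "ELL"                        # uniform rows: padded ELLPACK fits
    return "CSR"                            # irregular rows: compressed sparse row

A = [[5, 0, 0, 0],
     [0, 8, 0, 0],
     [0, 0, 3, 0],
     [6, 0, 0, 1]]
meta = matrix_metadata(A)
print(meta["density"], suggest_format(meta))  # -> 0.3125 ELL
```

The intuition behind the ELL/CSR split is that ELLPACK pads every row to the longest row's length, so it only pays off when row lengths are roughly uniform.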

Author 1: Nazmul Ahasan Maruf
Author 2: Waseem Ahmed

Keywords: Sparse matrix vector multiplication; sparse matrix metadata; sparse matrix vector multiplication parallelization; shared memory architecture; sparse matrix storage formats; high performance computing

PDF
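As a concrete illustration of the kernel this paper targets, the following is a minimal sketch of SpMV over the CSR (Compressed Sparse Row) storage format in plain Python. CSR is one common choice among the formats the paper's metadata approach would select between; the example matrix is made up for illustration.

```python
# Minimal sparse matrix-vector multiply (SpMV) in CSR format.
# CSR stores only the non-zeros: their values, their column indices,
# and row pointers marking where each row begins in the value array.

def spmv_csr(values, col_idx, row_ptr, x):
    """Compute y = A @ x for a matrix A stored in CSR."""
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        # Non-zeros of row i live in values[row_ptr[i]:row_ptr[i+1]].
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# CSR encoding of the 3x3 matrix [[1, 0, 2], [0, 3, 0], [4, 0, 5]]
values  = [1.0, 2.0, 3.0, 4.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
print(spmv_csr(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

CSR favors row-wise streaming access; formats like ELL or COO trade differently against the matrix substructure, which is exactly the choice the paper's metadata evaluation aims to automate.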

Paper 83: Survey on Domain Specific Languages Implementation Aspects

Abstract: Domain Specific Languages (DSLs) bridge the gap between the business model and the technical model. DSLs allow the technical developer to write programs using business domain notations. This leads to higher productivity and better quality than General Purpose Languages (GPLs). One of the main challenges of utilizing DSLs in the current software process is how to reduce the implementation cost and the knowledge required for building and maintaining DSLs. Language workbenches are environments that provide high-level tools for implementing different language aspects. The purpose of this paper is to provide a survey on the different aspects of implementing DSLs. The survey includes the structure, editor, semantics, and composability language aspects. Furthermore, it overviews the approaches used for each aspect and classifies the current workbenches according to these approaches.

Author 1: Eman Negm
Author 2: Soha Makady
Author 3: Akram Salah

Keywords: Domain Specific Language (DSL); language workbench; language implementation aspects; software language engineering

PDF

Paper 84: Fine-tuning Resource Allocation of Apache Spark Distributed Multinode Cluster for Faster Processing of Network-trace Data

Abstract: In the field of network security, processing and analyzing huge amounts of Packet CAPture (PCAP) data is of utmost importance for developing and monitoring the behavior of networks, and for building intrusion detection and prevention systems, firewalls, etc. In recent times, Apache Spark in combination with Hadoop Yet-Another-Resource-Negotiator (YARN) has been evolving as a generic Big Data processing platform. While processing raw network packets, timely inference about network security is a primary requirement. However, to the best of our knowledge, no prior work has presented a systematic study of fine-tuning the resources, scalability and performance of a distributed Apache Spark cluster while processing PCAP data. To obtain the best performance, various cluster parameters have been fine-tuned, which is the focus of the proposed work: the number of cluster nodes, the number of cores utilized from each node, the total number of executors run in the cluster, the amount of main memory used from each node, the executor memory overhead allotted for each node to handle garbage collection issues, etc. Through the proposed strategy, we could analyze 85 GB of data (provided by CSIR Fourth Paradigm Institute) in just 78 seconds using a 32-node (256-core) Spark cluster. This would otherwise take around 30 minutes in traditional processing systems.

Author 1: Shyamasundar L B
Author 2: V Anilkumar
Author 3: Jhansi Rani P

Keywords: Big data; packet data analysis; network security; distributed apache spark cluster; Yet Another Resource Negotiator (YARN); parameter tuning

PDF
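For orientation, the resource parameters the paper tunes map onto standard `spark-submit` flags. The following is a hedged, illustrative config fragment: the flag names are standard Spark/YARN options, but the values and the script name `pcap_analysis.py` are placeholders, not the authors' tuned configuration.

```shell
# Illustrative spark-submit resource flags of the kind the paper fine-tunes;
# values are placeholders, not the authors' reported settings.
spark-submit \
  --master yarn \
  --num-executors 32 \
  --executor-cores 8 \
  --executor-memory 24g \
  --conf spark.executor.memoryOverhead=4g \
  pcap_analysis.py
```

In practice, executor count, cores per executor, executor memory, and memory overhead interact (e.g. overhead absorbs off-heap use and garbage-collection pressure), which is why the paper searches over these parameters jointly.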

Paper 85: Rich Style Embedding for Intrinsic Plagiarism Detection

Abstract: Stylometry plays an important role in intrinsic plagiarism detection, where the goal is to identify potential plagiarism by analyzing a document for undeclared changes in writing style. The purpose of this paper is to study the interaction between syntactic structures, attention mechanisms, and contextualized word embeddings, as well as their effectiveness for plagiarism detection. Accordingly, we propose a new style embedding that combines syntactic trees with the pre-trained Multi-Task Deep Neural Network (MT-DNN). Additionally, we use attention mechanisms to sum the embeddings, experimenting with both a Bidirectional Long Short-Term Memory (BiLSTM) and a Convolutional Neural Network (CNN) with max-pooling for sentence encoding. Our model is evaluated on two sub-tasks, style change detection and style breach detection, and compared with two baseline detectors based on classic stylometric features.

Author 1: Oumaima Hourrane
Author 2: El Habib Benlahmer

Keywords: Plagiarism detection; style embedding; deep neural network; stylometry; syntactic trees

PDF
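The attention-based summation of embeddings mentioned in the abstract can be sketched minimally as a softmax-weighted pooling of token vectors. This is a generic illustration, not the paper's architecture: the 2-dimensional embeddings and the fixed scoring vector below are toy assumptions (the paper uses MT-DNN embeddings and learned attention parameters).

```python
import math

def attention_pool(embeddings, score_vec):
    """Softmax-weighted sum of token embeddings into one pooled vector.

    Each token's attention score is its dot product with score_vec;
    softmax turns scores into weights that sum to 1.
    """
    scores = [sum(e * s for e, s in zip(emb, score_vec)) for emb in embeddings]
    m = max(scores)                                # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(embeddings[0])
    return [sum(w * emb[d] for w, emb in zip(weights, embeddings))
            for d in range(dim)]

tokens = [[0.1, 0.9], [0.8, 0.2], [0.4, 0.4]]      # three toy token embeddings
pooled = attention_pool(tokens, score_vec=[1.0, 0.0])
print(pooled)  # a 2-d vector pulled toward the highest-scoring token
```

The pooled vector is a convex combination of the token embeddings, so tokens the scorer deems stylistically salient dominate the resulting sentence representation.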

Paper 86: UAV Control Architecture: Review

Abstract: Since civil Unmanned Aerial Vehicles (UAVs) are expected to perform a wide range of missions, designing an efficient control architecture for an autonomous UAV is a very challenging problem. Several contributions have been made towards implementing an autonomous UAV, and the key challenge in all of them is developing the global strategy. Robotic control approaches can be classified into six categories: deliberative, reactive, hybrid, behavior, hybrid behavior, and subsumption. In this paper, we review the existing control architectures to extract the main features required for civil UAVs. The definition, advantages and drawbacks of each architecture are highlighted to finally provide a comparative study of the mentioned control approaches.

Author 1: IDALENE Asmaa
Author 2: BOUKHDIR Khalid
Author 3: MEDROMI Hicham

Keywords: Unmanned Aerial Vehicle; control architecture; deliberative approach; reactive approach; hybrid approach; behavior approach; hybrid behavior approach; subsumption approach

PDF

Paper 87: Effective Combination of Iris-based Cancelable Biometrics and Biometric Cryptosystems

Abstract: The fuzzy commitment scheme (FCS) is one of the most effective biometric cryptosystems (BCs) that provide secure management of cryptographic keys using biometric templates. In this scheme, error correcting codes (ECCs) are first employed to encode a cryptographic key into a codeword, which is then secured by linking (committing) it with a biometric template of the same length. Unfortunately, the key length is constrained by the size of the adopted biometric template as well as the employed ECC(s). In this paper, we propose a secure iris template protection scheme that combines cancelable biometrics with the FCS in order to secure long cryptographic keys without sacrificing recognition accuracy. First, we utilize cancelable biometrics to derive revocable templates of large sizes from the most reliable bits in iris codes. Then, the FCS is applied to the obtained cancelable iris templates to secure cryptographic keys of the desired length. The revocability of cryptographic keys as well as true iris templates is guaranteed due to the hybridization of both techniques. Experimental results show that the proposed hybrid system can achieve high recognition accuracy regardless of the key size.

Author 1: Osama Ouda
Author 2: Norimichi Tsumura
Author 3: Toshiya Nakaguchi

Keywords: Biometric template protection; cancelable biometrics; biometric cryptosystems; BioEncoding; fuzzy commitment

PDF
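The commit/release steps of the fuzzy commitment scheme described above can be sketched with toy components. This is a deliberately simplified illustration: the 3x repetition code stands in for the BCH/Reed-Solomon codes used in practice, and the 12-bit template stands in for a full-length iris code.

```python
import random

def ecc_encode(key_bits):
    """Toy ECC: repeat each key bit 3 times (real schemes use BCH/RS codes)."""
    return [b for bit in key_bits for b in (bit, bit, bit)]

def ecc_decode(codeword):
    """Majority vote per 3-bit group corrects one flipped bit per group."""
    return [int(sum(codeword[i:i + 3]) >= 2) for i in range(0, len(codeword), 3)]

def commit(key_bits, template):
    """Bind the codeword to a same-length biometric template via XOR."""
    cw = ecc_encode(key_bits)
    assert len(cw) == len(template)
    return [c ^ t for c, t in zip(cw, template)]

def release(commitment, probe):
    """XOR with a fresh probe leaves codeword XOR noise; ECC removes the noise."""
    noisy_cw = [c ^ p for c, p in zip(commitment, probe)]
    return ecc_decode(noisy_cw)

key = [1, 0, 1, 1]
template = [random.randint(0, 1) for _ in range(12)]   # enrollment iris bits
com = commit(key, template)
probe = list(template); probe[5] ^= 1                   # one bit of biometric noise
print(release(com, probe) == key)                       # True
```

The key-length constraint the paper addresses is visible here: a 12-bit template with a rate-1/3 code can only carry a 4-bit key, which is why the authors first expand the template via cancelable transforms before committing.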

Paper 88: Learning Management System Personalization based on Multi-Attribute Decision Making Techniques and Intuitionistic Fuzzy Numbers

Abstract: The personalization of Learning Management Systems is a fundamental task in the current context of e-Learning and the WWW. However, there are many controversies around the criteria used to select and present the most appropriate content for each user. The most used approaches in the last decade were the identification of learning styles, the analysis of history and navigational behavior, and the classification of user profiles, without conclusive evidence to determine a method that can be adopted universally, considering the complexity of the cognitive processes involved. This paper proposes an approach based on multi-attribute decision making techniques, which allows considering and combining the criteria most effectively used in the area, according to particular contexts, as a new approach to content personalization and the selection of appropriate learning objects. The application of this approach aims to maximize the effectiveness and efficiency of the teaching process and enrich the user experience.

Author 1: Jorge Luna-Urquizo

Keywords: Learning Management Systems (LMS); e-Learning; multi-attribute decision making; learning styles; content personalization; learning objects selection

PDF
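To make the combination of techniques in the title concrete, here is a hedged textbook-style sketch of multi-attribute decision making with intuitionistic fuzzy numbers: each rating is a pair (membership mu, non-membership nu) with mu + nu <= 1, and alternatives are ranked by the weighted score S = mu - nu per criterion. The criteria names, weights, and alternatives are illustrative assumptions, not taken from the paper.

```python
# Rank learning objects by a weighted intuitionistic-fuzzy score.
# Each rating (mu, nu) gives the degree an alternative satisfies /
# fails a criterion; the classic score function is S = mu - nu.

def weighted_score(ratings, weights):
    """ratings: one (mu, nu) pair per criterion; weights sum to 1."""
    return sum(w * (mu - nu) for (mu, nu), w in zip(ratings, weights))

weights = [0.5, 0.3, 0.2]  # e.g. learning-style fit, difficulty, media type
alternatives = {
    "video_lesson":  [(0.8, 0.1), (0.6, 0.3), (0.7, 0.2)],
    "text_tutorial": [(0.5, 0.4), (0.7, 0.2), (0.4, 0.5)],
}
ranked = sorted(alternatives,
                key=lambda a: weighted_score(alternatives[a], weights),
                reverse=True)
print(ranked)  # ['video_lesson', 'text_tutorial']
```

The intuitionistic pair captures hesitation (1 - mu - nu) that a single fuzzy grade cannot, which is the motivation for using IFNs when user evidence about a criterion is incomplete.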

Paper 89: A Closer Look at Arabic Text Classification

Abstract: The world has witnessed an information explosion in the past two decades. Electronic devices are now available in many varieties, such as PCs, laptops, book readers and mobile devices, at relatively affordable prices. With this, the ubiquitous use of software applications such as social media and cloud applications, and the increasing trend towards digitalization, the amount of information on the global cloud has surged to an unprecedented level. Therefore, a dire need exists to mine this massively large amount of data and produce meaningful information. Text classification is a known and well-established data mining technique that has been widely used and reported in the literature; its methods include statistical and machine learning algorithms such as Naive Bayesian classifiers, Support Vector Machines and others. Many works have been reported on the text classification of various languages, including English, Chinese, Russian and many others. Arabic is the fifth most spoken language in the world, and there have been many works in the literature on Arabic text classification. However, and to the best of our knowledge, there is no recent work that presents a good, critical and comprehensive survey of Arabic text classification over the past two decades. The aim of this paper is to present a concise and yet comprehensive review of Arabic text classification, covering over 50 research papers from the past two decades (2000-2019). The main focus of this paper is to address the following issues: 1) the techniques reported in the literature; 2) new techniques; 3) the technique most often claimed to be efficient; 4) the datasets used and which are most popular; 5) which feature selection techniques are used; 6) popular classes/categories used; and 7) the effect of stemming techniques on classification results.

Author 1: Mohammad A R Abdeen
Author 2: Sami AlBouq
Author 3: Ahmed Elmahalawy
Author 4: Sara Shehata

Keywords: Arabic text classification; support vector machines; k-NN; Naive Bayesian; decision trees; C4.5; maximum entropy; feature selection; Arabic dataset

PDF

Paper 90: Automatic Semantic Categorization of News Headlines using Ensemble Machine Learning: A Comparative Study

Abstract: Due to the widespread availability of the Internet, there is a huge number of sources producing massive amounts of daily news. Moreover, users’ need for information has been increasing unprecedentedly, so it is critical that news is automatically classified to permit users to access the required news instantly and effectively. One of the major problems with online news sets is the categorization of the vast number of news items and articles. To solve this problem, machine learning models along with Natural Language Processing (NLP) are widely used for automatic news classification, to categorize topics of untracked news and individual opinions based on users’ prior interests. However, existing studies mostly rely on NLP but use large documents to train the prediction model, so it is hard to classify a short text without using semantics. Few studies focus on classifying news headlines using semantics. Therefore, this paper attempts to use semantics and ensemble learning to improve short-text classification. The proposed methodology starts with a preprocessing stage, then applies feature engineering using word2vec with a TF-IDF vectorizer. Afterwards, classification models were developed with different classifiers: KNN, SVM, Naïve Bayes and Gradient Boosting. The experimental results verify that Multinomial Naïve Bayes shows the best performance, with an accuracy of 90.12% and a recall of 90%.

Author 1: Raghad Bogery
Author 2: Nora Al Babtain
Author 3: Nida Aslam
Author 4: Nada Alkabour
Author 5: Yara Al Hashim
Author 6: Irfan Ullah Khan

Keywords: Natural language processing; feature engineering; word embedding; text classification; ensemble learning

PDF
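A minimal sketch of the winning pipeline shape — TF-IDF features feeding a Multinomial Naïve Bayes headline classifier — using scikit-learn. The four headlines are invented toy data, not the paper's dataset, and the word2vec component of the paper's feature engineering is omitted here for brevity.

```python
# TF-IDF + Multinomial Naive Bayes headline classifier (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

headlines = [
    "stocks rally as markets close higher",
    "central bank raises interest rates",
    "team wins championship in overtime",
    "star striker signs record transfer deal",
]
labels = ["business", "business", "sports", "sports"]

# Pipeline: vectorize headlines into TF-IDF features, then fit NB.
clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(headlines, labels)
print(clf.predict(["bank cuts rates amid market fears"]))  # ['business']
```

The short-text problem the paper raises is visible even here: words unseen at training time ("cuts", "fears") contribute nothing, which motivates adding semantic features such as word2vec embeddings.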

Paper 91: Identification of Learning Styles and Automatic Assignment of Projects in an Adaptive e-Learning Environment using Project Based Learning

Abstract: The use of the project-based learning approach is one of the emerging trends in education and adaptive e-Learning platforms. One of the main challenges in this line of research is to identify each student's learning style; in this work, the identified styles are used to assign the course projects that best suit the characteristics of each particular student, as the projects incorporate different types of learning strategies and objects. The aim is to facilitate and simplify the teaching/learning process of an adaptive e-Learning platform that uses the project-based learning approach. After a review of the literature for the theoretical foundation and establishment of the state of the art, an online module for the automatic recognition of learning styles is proposed. This module uses information on the student's interaction with the system and is based on Neural Networks and Fuzzy Logic; its results are then considered by the project selection and assignment module, which uses Case-Based Reasoning. Finally, tests were carried out and the obtained results analyzed.

Author 1: Luis Alfaro
Author 2: Erick Apaza
Author 3: Jorge Luna-Urquizo
Author 4: Claudia Rivera

Keywords: Adaptive e-Learning; project based collaborative learning; case-based reasoning; learning styles; back propagation neural networks

PDF
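The Case-Based Reasoning step mentioned in the abstract — retrieving the stored case most similar to a new student's profile — can be sketched as nearest-neighbor retrieval over learning-style vectors. The feature dimensions (visual, verbal, active, reflective), the cases, and the project names below are illustrative assumptions, not taken from the paper.

```python
import math

def similarity(a, b):
    """Inverse-distance similarity between two learning-style vectors."""
    return 1.0 / (1.0 + math.dist(a, b))

case_base = {
    # (visual, verbal, active, reflective) -> previously assigned project
    (0.9, 0.1, 0.7, 0.3): "interactive simulation project",
    (0.2, 0.8, 0.3, 0.7): "research essay project",
}

def retrieve(profile):
    """CBR retrieval: return the most similar stored case."""
    return max(case_base, key=lambda case: similarity(case, profile))

best = case_base[retrieve((0.8, 0.2, 0.6, 0.4))]
print(best)  # interactive simulation project
```

A full CBR cycle would follow retrieval with reuse, revision, and retention of the adapted case; only the retrieval step is sketched here.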

© The Science and Information (SAI) Organization Limited. All rights reserved. Registered in England and Wales. Company Number 8933205. thesai.org