System Design and Case Study Reporting on AQASYS: A Web-based Academic Quality Assurance System

The demands of modern education have evolved from a teacher-centric requirement to a learner-centric requirement. Knowledge, skill, and competence are the most sought-after attributes of a graduate. Features such as the objective focus of learning, curriculum planning, a set of high expectations, and extended opportunities for the learner after completion of education are at the center of all planning. Skill-oriented, outcome-based standardization has been infused by societal stakeholders into the modern global education system to create work-ready human capital. In this paper, a software product for academic quality assurance is presented. The software provides a generic framework to any educational institution that operates to implement known international standards of education. The software accepts the data and computes the quality parameters as per the selected standards. It has an analytical module that provides summary analytics and generates course reports in the given format automatically. The software is tested with a case study and the results are presented. The paper also presents the system design approach with a discussion of the technologies selected for the development.

Keywords—Outcome-based education; quality standards; automated software; system design; education technology; accreditation


I. INTRODUCTION
In the last decade or two, the requirements of higher education, particularly in professional education, have shifted from a "teacher-centric" approach to a "learner-centric" approach [5]. In the traditional approach, whatever the teacher knew was used as the content or curriculum of the teaching. However, with changing societal needs and the driving forces of the job market, industry, and global change, the learner needs to have knowledge, skill, and competence in the chosen areas of learning. The learner selects the domain of learning and sets objectives to be achieved at the end of the learning. This paradigm shift in teaching-and-learning has brought educational institutes to frame new curricula, with defined objectives and clearly stated goals as to "what the learner will be able to do" at the end of the learning. The standard educational institutes continuously loop around these parameters and assess the level of achievement through their operational practices. This activity is termed "Quality" [1][2] in most of the educational institutes that are signatories to internationally recognized standards organizations [5].
In this work, a web-based academic quality management system is implemented. The paper describes the system design process and reports on the results after specific case study data is input to the system. The system is called AQASYS, an Academic Quality Assurance System.
The research questions on methods of quantifying Course Learning Outcomes (CLOs), Program Learning Outcomes (PLOs), the difficulty level of the course, automated feedback assessments, and CLO/PLO attainments are addressed. An automated CLO mapping module and a text-mining-based Bloom's level recommender module are built and integrated into the system. The system generates course-specific and program-specific results, providing several summary analytic outputs of business value. The system addresses the issue of "closing the loop" to assess the committed objectives.
The system facilitates the academic quality assessment process with a web interface for the users and a reliable database at the backend. It is a secure, reliable, and always-available resource for its users, and it provides all the inherent advantages of web-based applications.

II. MOTIVATION
Most of the institutes that were visited and the faculty who were consulted shared their experience of handling "academic quality" matters in their respective institutes and departments. Based on this, the following major observations were made:
1) Each faculty member was using some kind of Excel sheet and running around for templates and methods of calculation on assessments. There was absolute last-minute chaos, confusion, and lethargy in what they were doing.
2) There was no centralized data repository to click and see the analytics of the past and explore it for future strategies.
3) There was a lack of sound, research-driven statistical and mathematical foundations uniformly applied across the departments for assessment and achievements.
4) There was plenty of "number in, number out" manipulation of data to skew the academic quality to the personal advantage of the faculty or the department. The purity of the information was affected.
5) There were irrational ways of mapping CLOs to the question paper text, rendering the whole quality exercise illusory.
6) There was very poor regularity and ease of academic data management.
In light of these observations, this paper presents a web-based academic quality assurance system that has a centralized repository of academic data, a research-driven academic foundation for all quality parameters, and, above all, ease of access with availability, confidentiality, and integrity.
The paper is organized in the following way: Section III presents a literature review and related work. Section IV lists the contributions of the paper. Section V presents in detail the software architecture used. Section VI addresses system design features. Section VII describes the development platform and the associated technologies used in the product development. Section VIII discusses the operational parameters and their formulae of computation. Section IX presents results and selected screenshots. The final section, Section X, concludes the paper.

III. RELATED WORK
In the past two decades, there have been several research papers and software products attempting to illustrate and automate the process of academic accreditation [10] in higher education. This section reviews and compares systems similar to the one presented in this paper.
The ABET Course Assessment Tool (ACAT) [4] is one of the earliest (2010) notable attempts to automate the accreditation process. It is a web-based application designed to assist in collecting data and generating standardized reports.
Course and Student Management System (CSMS) [9] is a Web-based system based on the concept of an articulation matrix that maps various course outcomes to ABET program outcomes [6]. The articulation matrix is nothing but the course assessment matrix. It is a 2D table, in which rows represent learning objectives, and columns represent class activities conducted to meet those learning objectives. The articulation matrix is filled when the course learning objectives are achieved.
Program and Courses Outcomes System (PACOS) [13] is an initiative from Hong Kong University [12]. It is an open-source web application. PACOS has various levels of access with different user privileges. A user can perform CRUD (Create, Read, Update, Delete) operations on courses, course information, and program outcomes. PACOS manages data with Excel files as inputs and displays results in the form of histograms and reports.
Program Outcome Assessment (POA) [3][11] is a web-based program developed using ASP.NET technologies to automate program assessment. It supports a complete flow from defining the program outcomes [6] to entering all course information in real time. The program performs all the evaluations on the fly and presents the output in the form of graphs, color-coded tables, and a year-to-year comparison of various program outcomes over a span of five years.
Apart from the above work, in which mainly academicians have demonstrated automated software programs for program assessment and accreditation, there are many software companies offering these services through their proprietary software products.

1) CLOSO: CLOSO [17] is an excellent, sustainable, and affordable tool for monitoring the educational process and performance. It is a product from smart-accredit.com. It allows users to plan and design assessments, generate course folders, collect and analyze feedback, and display outcome attainments. It offers a free trial version for individuals for a limited period of testing the product.
2) Contineo: 'Contineo' [18] is a pioneering software platform for the implementation and administration of academic autonomy with the guided philosophy of the Outcome-Based Education System. It mainly focuses on the accreditation standards such as the National Board of Accreditation (NBA) followed in India and the subcontinent. Service is available through contineo.in for personalized customization in the product.
3) Watermark: Watermark [19] is a cloud-based software that helps higher education institutions manage assessment, accreditation, curriculum, course evaluation, and faculty activity. It is available at https://www.watermarkinsights.com/.
4) WEAVE: Weave [19] is a cloud-based accreditation and assessment solution designed to help higher education institutions with program review, course planning, and more. The programmatic assessment functionality lets organizations analyze assessments and provides insight into student performance. It is available at https://weaveeducation.com/.
There is a plethora of other companies offering accreditation services spanning industry and academia, to list a few [19]: ARMATURE, Jetpack, Jura, SPOL, PowerDMS, Qualtrax, CompWalk, DocTract, Submittable, WizeHive, and the list grows. It is to be noted, however, that the companies listed in this paragraph are not focused on academic accreditation in particular but target commercial business enterprises in general.

IV. CONTRIBUTIONS OF THE PAPER
The Academic Quality Assurance System (AQASYS) presented in this paper differs from most of the above products in the following major ways: 1) It has an intelligent box (iBOX) feature which automatically classifies assessment questions and assigns proper CLOs as per Bloom's taxonomy.
2) It maps back the assessment questions to Program Learning Outcomes (PLOs) via the Course Learning Outcomes (CLOs).
814 | P a g e www.ijacsa.thesai.org
3) It produces the course report automatically for each course by collecting relevant grade-book information, student feedback analysis, CLO achievement statistics, and intelligent recommendations for course improvement wherever necessary.
4) It produces the summary analytics of the program for all courses of the program in a single view so that executives of the program can easily comprehend the course statistics.
5) It has in-built triggers to remind and notify the user about data entry delays so that the system can track the punctuality of the faculty in pursuing the quality matter in their academic functioning.
6) It has data archiving and data freezing facilities so that the data cannot be modified even by the data owner (faculty) after a specified due date. This helps to keep the sanctity of data and draw a line for "end of a cycle".
7) It is generic in accepting program-dependent PLOs, CLOs, any number of students, and any type of assessment, with time stamps and user footprints being recorded in the background.
8) It has a file upload facility to store quality files and course portfolios.
9) It provides data-isolation, user authentication, and ease of usage. The user need not bother about the computational details of quality standards.
10) Finally, it is a web-based product with MVC architecture, accessible on desktop, tablet, and mobile, as the design is responsive.

V. THE SOFTWARE ARCHITECTURE
Considering the existing IT environment, user dynamics, the required software quality attributes, the design structure, and the business strategy employed in most higher education institutes, a Model View Controller (MVC) framework was used in a client-server architecture (refer to Fig. 1). The open-source framework CodeIgniter [14] was selected for this purpose. It supports MVC development with PHP [7].
• Model: The Model contains only the pure application data and works as the back end. It deals with back-end operations, fetches data from the database, and sends it to the controller.
• View: A view is simply a web page, or a page fragment such as a header, footer, or sidebar. The View presents the model's data to the user and works as the front end. It displays data, captures user actions, and sends those actions to the controller.
• Controller: The controller works as an intermediary between the model and the view and controls the actions between them. It listens to events triggered by the view (or another external source) and executes the appropriate reaction to these events. Controllers decide how HTTP requests are to be handled.
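The division of responsibilities described above can be sketched in a few lines. The following is an illustrative Python stand-in (the actual product uses the CodeIgniter PHP framework; the class and method names here are hypothetical, not AQASYS code):

```python
# Minimal sketch of the MVC separation described above (illustrative
# Python; the real AQASYS uses the CodeIgniter PHP framework).

class CourseModel:
    """Model: pure application data and back-end access."""
    def __init__(self, db):
        self._db = db  # a dict standing in for the MySQL database

    def get_clos(self, course_id):
        return self._db.get(course_id, [])

class CourseView:
    """View: presents the model's data to the user."""
    def render(self, clos):
        return "\n".join(f"- {c}" for c in clos)

class CourseController:
    """Controller: mediates between view events and the model."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle_request(self, course_id):
        clos = self.model.get_clos(course_id)  # fetch via the model
        return self.view.render(clos)          # present via the view

db = {"CS101": ["CLO1: Explain OOP", "CLO2: Apply recursion"]}
controller = CourseController(CourseModel(db), CourseView())
print(controller.handle_request("CS101"))
```

In CodeIgniter the same separation is enforced by the framework's directory layout: controllers receive the HTTP request, models encapsulate database access, and views produce the HTML response.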

VI. THE SYSTEM DESIGN
A. The Use Cases
The use case diagram (Fig. 3) describes how the different users interact with AQASYS. Admin, Faculty, and Student are all types of users with different access privileges. A new user is also a type of user, who requires signup before getting access to AQASYS services. Each user can avail different services as indicated in the use case boundary.

B. The Class Diagram
The class diagram (Fig. 4) describes the structure of AQASYS. The User class is the parent class of many user types such as Student, Faculty, Admin, Guest, and NewUser. Every user is associated with a department. A department has PLOs and courses. Each course has CLOs, KPIs, and course assessments. CLOs contribute to the PLOs of the department and are associated with them. The CLOs, KPIs, and the assessments are mapped while producing the Table of Specifications (ToS) of the course (for an example, refer to Table I). In order to perform the mapping, AQASYS provides an intelligent auto-mapper class called iBOX. The iBOX is based on Bloom's taxonomy. The iBOX class is associated with the MappingTable class. Most of the operational methods are grouped in the AqasysOp class.
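The associations described above can be summarized in code. The following Python sketch mirrors the class structure of Fig. 4 (attribute names are illustrative; the real classes are implemented in PHP):

```python
# Sketch of the AQASYS class structure described above (illustrative
# Python; attribute names are assumptions, not the actual PHP code).

class User:
    """Parent class of all user types; every user has a department."""
    def __init__(self, name, department):
        self.name = name
        self.department = department

class Student(User): pass
class Faculty(User): pass
class Admin(User): pass

class Department:
    """A department has PLOs and offers courses."""
    def __init__(self, name):
        self.name = name
        self.plos = []     # program learning outcomes
        self.courses = []

class Course:
    """A course has CLOs, KPIs, and assessments, mapped in the ToS."""
    def __init__(self, code):
        self.code = code
        self.clos = []
        self.kpis = []
        self.assessments = []

cs = Department("CS")
cs.plos.append("PLO1")
course = Course("CS101")
course.clos.append("CLO1")
cs.courses.append(course)
fac = Faculty("Dr. A", cs)
print(fac.department.courses[0].clos)
```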

C. The Data Model
This section describes the selected listing of databases and the table structures within each database. A sample of the model is shown in Fig. 5. The model has a database for each department, and inside each department there are tables such as studentlist, facultylist, courses, courseallotment, plodefintions, clodefintions, kpidefinitions, tostable, etc. (refer to Fig. 6). There is a separate database for users of AQASYS. The application_users database has tables such as users, userlogs, departments, and contactform. The database on Bloom data stores incrementally growing data resulting from text processing in the AQASYS application. The Bloom data and associated algorithms add intelligence to the system through the iBOX tab.

VII. THE DEVELOPMENT PLATFORM
Choosing a development platform and technology is always an intriguing question for the developer. After conducting good research, the CodeIgniter MVC framework [14] with PHP was chosen for this application, for the reason that CodeIgniter has a light footprint, is flexible with MVC and non-MVC applications, supports the MySQL database, and its learning curve is not very steep for the developer. CodeIgniter has inbuilt security against Cross-Site Request Forgery (CSRF) and Cross-Site Scripting (XSS) attacks and provides content security. Finally, CodeIgniter 4.0.3, PHP 7.2, HTML 5, CSS, JavaScript, AJAX, Bootstrap 4, and MySQL are the combination of technologies used for the development.

VIII. THE OPERATIONS IN AQASYS
This section presents the various operations and computations involved in AQASYS. These computational formulae are adopted from the academic quality standards and customized to the institutional requirement. Refer to Table II to understand the computations of this section.

1) Percentage CLO Attainment PCA (Pure):
This calculation considers all the students who appeared for the exam and obtained some score, including failing scores if any. Since failed students are not excluded, it is named PCA (Pure). It is basically the percentage of the CLO mark attained by the students, as defined below:

PCA_j = (Σ_{i=1..N} MS_i / (N × MX)) × 100

where MS_i is the mark of each student and MX is the maximum mark allotted for that question of the assessment (CLO). N is the total number of students addressed by the jth CLO.

2) Percentage Student CLO Attainment (PSCA): This is a measure of the x% of students who attained y% of the CLO mark. As per the user requirements, the PSCA calculation excludes the records of failed students and classifies the graded students into four classes, viz. Excellent (E), Adequate (A), Medium (M), and Unsatisfactory (U). The target ranges for classifying into E, A, M, and U are dynamically adjustable with the slider scale (Fig. 7). After classifying students into E, A, M, and U, the percentage of (E+A) students is calculated. If this number is above a certain set target, then the corresponding CLO is declared "attained". The target selection is also dynamically adjustable as per the policy requirements of the quality committee. The screenshot of the slider (Fig. 7) and the specimen calculations below make the PSCA calculation clear.

Fig. 7. Criteria Setting Slider for CLO Assessments.
As per the above slider, the numbers indicate the upper limits of each range; the resulting ranges for each class are shown in Table III. Table IV presents the values of the different operational parameters for the example data of Table II with the E, A, M, and U classification. Fig. 8 presents a comparison of PSCA (Pure) and PSCA (Adjusted). In the "adjusted" value computation, the records of failed students are not considered. This PSCA (Adjusted) parameter slightly skews the results towards a PLO attainment advantage. Fig. 9 shows the final values.
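Under the definitions above, PCA and the PSCA classification can be sketched as follows. The band limits, pass mark, and target below are illustrative values only; in AQASYS the bands come from the slider (Fig. 7) and the target from the quality committee's policy:

```python
# Sketch of the PCA and PSCA computations described above.
# The E/A/M/U band limits (upper bounds), pass mark, and target
# are illustrative; AQASYS sets them via the slider and policy.

def pca(marks, max_mark):
    """PCA (Pure): percentage of the CLO mark attained, all students."""
    n = len(marks)
    return 100.0 * sum(marks) / (n * max_mark)

def classify(marks, max_mark, bands=(50, 65, 80, 100)):
    """Classify percentage scores into U/M/A/E using upper limits."""
    labels = []
    for m in marks:
        pct = 100.0 * m / max_mark
        if pct <= bands[0]:   labels.append("U")
        elif pct <= bands[1]: labels.append("M")
        elif pct <= bands[2]: labels.append("A")
        else:                 labels.append("E")
    return labels

def psca(marks, max_mark, pass_mark, target=60.0):
    """PSCA (Adjusted): % of (E+A) among passed students vs. a target."""
    passed = [m for m in marks if m >= pass_mark]  # drop failed records
    labels = classify(passed, max_mark)
    ea = 100.0 * sum(1 for lab in labels if lab in "EA") / len(labels)
    return ea, ea >= target  # CLO "attained" if above the set target

marks = [9, 8, 7, 4, 10, 6]
print(round(pca(marks, 10), 1))        # -> 73.3
print(psca(marks, 10, pass_mark=5))    # -> (80.0, True)
```

Here the student with mark 4 is excluded from PSCA as a failed record, which illustrates why PSCA (Adjusted) skews slightly towards attainment compared with PSCA (Pure).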

3) Grading:
Grade book generation is dependent on complete marks submission in all assessments as planned for the course. Incomplete submission will not produce the grade book. The mark submission is to be done at each question level of each assessment for all the registered students. Any lapse is considered incomplete data submission, and the grade book will not be generated. This strategy is adopted to collect the data from the root of the assessment source, alleviating any high-level manipulation or adjustments to alter the grades. The grade ranges used are given in Table V. AQASYS generates course-specific grade summary charts for the faculty login and grade summary charts for all the courses of the department for the administrator login. A sample grade summary screenshot is shown in Fig. 10.

4) PLO Attainments: Before calculating the PLO attainments, it is important to understand the relation between the CLOs, PLOs, and course assessments of the department. The entity-relationship diagram in Fig. 11 shows the relation between the entities: department, course, assessment, PLOs, and CLOs.
It means that a department will have many courses, a course will have many assessments. A course will have many defined CLOs. Each assessment will have many CLOs. And finally, many CLOs contribute to many PLOs.

a) PLO Attainment in the course (CPLO):
Now it is clear that the CLOs of a course contribute to one or more PLOs: there is a many-to-many (m:n) relation between the CLOs and PLOs of the course. Therefore, while calculating the PLO attainment by a course, the contributions of each CLO to the PLO in question are to be considered. The attainment in the ith PLO is given by the average of all the contributing CLOs. The general formula used is:

CPLO_i = (1/K) × Σ_{j=1..K} V(j)

where V(j) is the value of the jth CLO contributing to the ith PLO and K is the number of contributing CLOs.

b) Semester PLO Attainment (SPLO): The semester-level attainment in the ith PLO is the average of the course-level attainments over all the courses:

SPLO_i = (1/N) × Σ CPLO_i

where N is the total number of courses.
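The two averaging steps described above, CLO contributions rolled up within a course and then course-level values rolled up over the semester, can be sketched as follows (the CLO values and mapping are illustrative data):

```python
# Sketch of the PLO attainment roll-up described above: CPLO averages
# the contributing CLO values within one course; SPLO averages the
# CPLO values over the N courses. All data values are illustrative.

def cplo(clo_values, contributing):
    """PLO attainment in a course: average of contributing CLO values."""
    vals = [clo_values[c] for c in contributing]
    return sum(vals) / len(vals)

def splo(course_cplos):
    """Semester PLO attainment: average of CPLO over N courses."""
    return sum(course_cplos) / len(course_cplos)

# Course 1: CLO1 and CLO3 contribute to the PLO in question
c1 = cplo({"CLO1": 80.0, "CLO2": 60.0, "CLO3": 70.0}, ["CLO1", "CLO3"])
# Course 2: only CLO1 contributes
c2 = cplo({"CLO1": 90.0}, ["CLO1"])
print(c1, c2, splo([c1, c2]))  # -> 75.0 90.0 82.5
```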

5) Difficulty Level(DL):
This parameter is captured as the Coefficient of Difficulty elsewhere []; however, that parameter is not very convincing as it takes into account only the maximum scores. In this paper, a new parameter is defined to capture the difficulty level. The Difficulty Level (DL) is the degree of difficulty of the assessment as felt by the students (the learners). It is defined as:

DL = 1 − (Σ_{i=1..N} M_i) / (N × M_x)

where M_i is the mark scored by the ith student, M_x is the maximum mark for the question (CLO), and N is the number of students. If all the students secure the allotted maximum mark, the numerator of the fractional term equals its denominator, resulting in DL = 0 and suggesting that the question paper was easy. If the paper is extremely difficult, all the students would score zero, resulting in DL = 1. The maximum value of DL is 1 and the minimum value is 0. DL may be expressed as a percentage. The Overall Difficulty Level (ODL) is the average of all the DLs.

6) Survey collection & Analysis:
AQASYS collects different kinds of surveys, including course feedback surveys, alumni surveys, and any survey launched by the administrator. The system has two major provisions to upload the survey forms into the database:
• Through the excel sheets.
• Directly from the survey forms (example Google form).
After the due date for collection of survey data, the results are analyzed with bar charts, pie charts, etc. Fig. 15 shows an instance of such a feedback graph as generated by the system.

Fig. 15. Screenshot of a Survey Graph Generated by AQASYS.

7) CLO Mapping (CLOM): CLO mapping is the most important part of the quality assurance system. Generally, it was found that:
• During the curriculum design of most of the programs, the care that is due for defining the CLOs, is not given.
• While mapping the assessment questions to appropriate CLOs, the attitude and approach are very unscientific with respect to the educational philosophy; most of the faculty go astray here.
The underpinning point here is that if an academic institution exhibits the symptoms observed above, then the whole exercise of quality evaluation becomes just an exercise in number jargon, far removed from the real state of academics in the organization.
Keeping these observations at the center of the design, AQASYS integrates an intelligent auto-mapping module called iBOX that automatically assigns possible CLOs to the questions of an assessment questionnaire. iBOX works on text-mining algorithms and takes into account Bloom's taxonomy to recognize action verbs.

a) iBOX Design: This section presents the design basics of iBOX (Fig. 16). The goal is to first automatically identify the action verbs from the assessment questions and map them to CLOs, and later map the CLOs to the PLOs. The iBOX design exploits Bloom's taxonomy and its action verbs (refer to Table VI).
The iBOX algorithms initially extract the action verbs from the given text document; after verb extraction, the algorithms search the Bloom database for matching verbs. If a matching verb is found in the database, the algorithm categorizes and labels the document as per the scheme shown in Fig. 17; otherwise, the algorithm allows human intervention to match the sentence with the action verb and Bloom's level. The algorithm then updates the Bloom database with this newly learned action verb. Thus iBOX learns incrementally while it continues to automatically classify the CLO, PLO, and question sentences into different Bloom's levels. Fig. 16 shows the iBOX as it appears in AQASYS.
iBOX basically processes the input text files to identify the action verbs and match them to the verbs available in the Bloom database. If a proper match is found, it labels the CLO-PLO mapping automatically; otherwise, it adds the new unmatched verb to the new-verb list in the Bloom database. The new verb then becomes part of Bloom's verb list. The user is allowed to manually classify the new verb and insert it into the appropriate Bloom level list as an update. In this way, iBOX follows incremental learning.
Three text files, containing 1) the question paper, 2) the CLO definitions, and 3) the PLO definitions, are the inputs to the sentence segmentation and verb extraction module of iBOX.
This module uses the Stanford parser [16] and the WordNet database [8][15] to extract the exact verb from the sentences of the input documents. Table VII shows the decision-making process built into iBOX.
Vq  is a verb in the question paper V c  is a verb in the CLO definitions V p  is a verb in the PLO definitions The extracted verbs are searched for their presence in the verb list of Bloom database. If they are present, then each verb (Vq, Vc, Vp) is compared with the members of the other verb lists corresponding to each Bloom's level (Remember, Understanding, Apply, Analyze, Evaluate, Design). If the verb under test is the same as the word in the other group then it is denoted as W s (similar word), else if it is different, it is denoted as W d (different word). It is to be noted that each of these verbs is already verified to be present in the Bloom database.

8) Course report generation:
Utilizing all the information available about the course in the database AQASYS generates the academic course report in the required format automatically in a single click.
It intelligently inserts recommendations in the report wherever necessary. The automatic course report generation feature makes AQASYS unique among competing research in this field. Fig. 18 shows the user interface for course report generation, and Fig. 19 shows an instance of the generated course report. Fig. 20 shows an instance of grade results and Fig. 21 shows an instance of PLO attainments.

IX. THE RESULTS
1) AQASYS Performance evaluation:
In order to evaluate the performance of the application, the following application performance metrics were considered (refer to Table XII). The Apdex metric, defined as

Apdex = (Satisfied count + Tolerating count / 2) / Total samples

was considered to evaluate "user satisfaction". The count of concurrent users was observed to understand the "request rate" and the load that the application can handle. The application was found to perform quite satisfactorily. "Availability" of a web application like the one in this paper is very important from the user's perspective. The application was "pinged" at an interval of one minute regularly for a duration of 120 hours. The log results showed a successful "ping" every time. The application was highly available.
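The Apdex computation above buckets each sampled response time against a target time T: samples at or below T are "satisfied", samples up to 4T are "tolerating", and the rest are "frustrated". A short sketch with illustrative sample times (not the actual measurements of Table XII):

```python
# Sketch of the Apdex metric defined above:
# Apdex = (satisfied + tolerating / 2) / total_samples,
# with satisfied: r <= T, tolerating: T < r <= 4T (standard Apdex
# bucketing). The response times below are illustrative only.

def apdex(response_times, t):
    satisfied  = sum(1 for r in response_times if r <= t)
    tolerating = sum(1 for r in response_times if t < r <= 4 * t)
    return (satisfied + tolerating / 2) / len(response_times)

times = [0.3, 0.4, 0.9, 1.5, 5.0]  # seconds; hypothetical samples
print(apdex(times, t=0.5))          # -> 0.6
```

A score of 1.0 means every sampled request met the target; values near 0 indicate widespread user frustration.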
2) AQASYS Selected Screenshots: The following figures, Fig. 22 through Fig. 28, show various instances of AQASYS output as depicted against their names.

X. CONCLUSION
This paper presented a web-based academic quality assurance system. The unique features of the system include automatic course report generation and an intelligent Bloom's level and CLO-PLO mapping module that automatically assigns appropriate Bloom levels to the text presented to it by matching action verbs. The paper presented the content from the system designer's perspective as well as from the user's perspective. The system was launched on the institutional LAN on a pilot basis and real-time data was loaded onto the system. Different operational parameters, such as PCA, PSCA, DL, course-level PLO attainment (CPLO), semester PLO attainment (SPLO), course-wise grade details, and other summary analytics, were tested. The computed parameters were found to be as expected per the quality standards.
This system helps the faculty, the quality committee, and the administration measure the learning quality of the selected program according to the ABET and NCAAA standards. The faculty need not worry about clumsy calculation formulae and heaps of Excel sheets, or run around seeking last-minute templates to stitch the quality together. The faculty can now focus on teaching, followed by assessing and entering the scores of the students. All the calculations related to quality are carried out by the system automatically. The system provides summary charts and tabulated data outputs. As the application uses a client-server architecture, the server database archives the historical data, which is available for viewing at any point in time.
The application makes handling academic quality a seamless and easy task, with reliability, security, availability, and privileged access embedded into the system.