Assessment and Evaluation Framework with Successful Application in ABET Accreditation

The development of reliable and easy-to-deploy assessment plans is a worldwide interest of academic programs. The cultivation of a culture of assessment, or engaging in an accreditation effort, depends on the development of effective assessment frameworks. Examining an overly large variety of sources and using many different tools challenge the applicability of assessment plans and can prove to be major hurdles. In this paper, a unified framework is proposed that enables the assessment and evaluation of student outcomes at the program level, as well as the evaluation of student performance. The proposed framework identifies a set of courses to be assessed using direct tools. The tools enable measurements of attainment scores at the course learning outcome, performance indicator, and student outcome levels to create a paradigm for unified assessment. The framework was deployed within a two-cycle empirical study and led to a successful accreditation of a computer engineering program by ABET. The paper includes a thorough analysis and evaluation of the framework and its application.

Keywords: ABET, accreditation, assessment methodology, engineering education, capstone design projects


Introduction
The rapid growth in the number of higher education institutes all over the world necessitates exploring frameworks for assessment and evaluation (AE) that promote quality. Program reviews are challenged by the need to satisfy a variety of criteria as mandated by local, regional, and international quality assurance requirements [1]. Successful program reviews are achieved using assessment plans that are well-tailored to fit the characteristics of the programs and their hosting institutions. Many differences exist among academic institutions, such as size, structure, resources, culture, leadership, mission, and scope of available industries. Effective AE frameworks must fit the contextual features of the program and the institution/country, while satisfying international professional accreditation requirements [2,3].
Lately, the focus of the accreditation process has shifted to outcome-based assessment (OBA), rather than simply investigating institutional input variables such as resources, grants, and faculty-to-student ratios. The later sections present the framework, its application in accrediting a computer engineering program by ABET, and the results. A thorough analysis and evaluation is included in Section 6. Section 7 concludes the paper and sets the ground for future work.

Research Objectives
The development of a reliable and easy-to-deploy assessment plan is the main motivation for the proposed unified assessment and evaluation approach. The need for reliable assessment and evaluation is essential for the accurate identification of opportunities for improvement and for closing the assessment loop on solid, valid, and evidence-driven grounds. Moreover, the need for accurate assessment must not compromise ease of deployment; otherwise, the applicability of the plan becomes challenging and can face major hurdles. The research objectives of the current investigation comprise the following:

• The development of a unified framework that enables the assessment and evaluation of student outcomes, at the program and course levels, as well as the evaluation of student performance.
• The development of PIs that map to ABET's (a) through (k) SOs.
• The identification of a convenient-to-handle set of courses in a computer engineering program to be assessed using direct tools, with special emphasis on capstone design courses.
• The development of learning outcomes for the identified courses that map to the developed PIs.
• The formulation of a statistical framework that enables the calculation of attainment scores at the course, PI, and SO levels.
• The description of the closing-the-loop procedure that uses the developed framework to enable the identification of opportunities for improvement at the course and program levels.
• The deployment of the proposed framework within a two-cycle plan that led to a successful accreditation of a computer engineering program by ABET.
• Comparison with similar works and identification of future works.
The application of the proposed framework confirmed its effectiveness in achieving its purpose, including assessing SOs, PIs, and the learning outcomes of courses, and evaluating student performance. Moreover, the framework application successfully identified opportunities for improvement, aided closing the assessment loop, and led to a successful accreditation by ABET.

Unified Framework
The SOs are the abilities that a student should possess at the time of graduation. The proof of attainment of SOs is a verification of the students' knowledge and a confirmation of their achievement of the intended learning objectives or standards set by a program. Furthermore, SO assessment and evaluation helps to identify areas of improvement in the curriculum and in student learning. The proposed framework is built upon the concepts of OBA. In OBA, upon the completion of the learning experience, a student is expected to attain the intended outcomes. Although surveying tools, such as exit surveys, can provide useful measurements of SO attainment, the focus of the presented methodology is on assessments carried out in courses. Assessments at the course level are accurate and strongly evidenced through exams and other direct tools. The learning outcomes identified at the course level are then used as the building blocks of the framework. The proposed framework comprises the following levels:

1. Student outcomes (SOs)
2. Student outcome performance indicators (PIs)
3. Course learning outcomes (CLOs)
4. Assessment and evaluation components (AECs)

To facilitate the measurement of SOs, they are refined into a detailed set of PIs. The main motivation behind the expansion into PIs is to give a concrete meaning to every SO.
The indicators are to adhere to the good principles of development including being specific, measurable, achievable, results-focused, and time-bounded (SMART).
The CLOs are the formal statements of what students are expected to learn in a course. The attainment of CLOs in a course can be measured in a variety of ways. For example, a qualitative rubric can be developed to aid the instructor's assessment of attainment for every student. The proposed framework adopts the use of a selection of AECs within a weighted-average formula to measure the attainment of every CLO per student. An AEC can be an exam, an exam question, a quiz, a homework, a project component, etc. AECs serve as the base for grading student performance and calculating the attainment percentage score per CLO. The assessment scores per CLO enable the calculation of student percentage attainments in a course. At this point, one or more CLOs can map onto the PIs; here, the attainment scores of PIs are the weighted-average aggregation of those of the CLOs. The mappings of AECs onto CLOs and of CLOs onto PIs create a hierarchical framework that enables vertical assessment of SOs and horizontal evaluation of student performance in courses. Figure 1 depicts the hierarchy of the framework. The attainment and evaluation measurements are combined by a weighted average, given by

$S = \sum_{i=1}^{N} w_i \, p(\mathrm{AEC}_i)$,

where $S$ is the aggregate score, $p(\mathrm{AEC}_i)$ is the percentage score obtained for the $i$th component $\mathrm{AEC}_i$, and $w_i$ is the weight associated with the $i$th component, such that $\sum_{i=1}^{N} w_i = 1$, where $N$ is the total number of AECs in a course.
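The weighted-average aggregation above can be sketched in a few lines of Python. This is an illustrative sketch only; the function name, the example AEC scores, and the weights are assumptions, not values taken from the paper.

```python
# Hypothetical sketch of the weighted-average aggregation of AEC scores
# into a CLO attainment score. Scores and weights below are illustrative.

def clo_attainment(aec_scores, weights):
    """Aggregate AEC percentage scores into one CLO attainment score.

    aec_scores: percentage score (0-100) earned on each mapped AEC
    weights:    weight w_i of each AEC; the weights must sum to 1
    """
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("AEC weights must sum to 1")
    return sum(w * s for w, s in zip(weights, aec_scores))

# Example: a CLO assessed by a quiz, a midterm question, and a final question
scores = [80.0, 70.0, 90.0]   # percentage scores per AEC
weights = [0.2, 0.3, 0.5]     # relative weights, summing to 1
print(round(clo_attainment(scores, weights), 1))  # -> 82.0
```

The same function can be reused for every CLO in a course by pairing each CLO with its own list of mapped AECs and weights.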

A Case-Study on Accrediting a Computer Engineering Program by ABET

Case-Study
This paper studies the validity and the effectiveness of a framework for assessment of SOs. The framework enables a unified use of scores to obtain performance evaluations of students and attainment of CLOs, PIs, and SOs. The measurement of attainments for the PIs and SOs are taken in a set that ranges from sophomore to senior year courses and include courses that provide major design experiences including Capstone Design Projects (CDPs). The proposed framework stresses the need for triangulation of attainment scores based on complementary course characteristics.
The proposed framework provides simple and effective assessment of SOs. The simplicity comes from the limited set of courses where the assessment tools are deployed. The effectiveness is demonstrated through the successful accreditation of a computer engineering program by ABET. The study is carried out over two full assessment cycles, where each cycle is a single academic year (AY). The targeted academic years were 2013–2014 and 2014–2015.

Fig. 1. The hierarchy of the unified framework for AE. Each SO is expanded into several PIs; assessment of each PI is carried out using a careful mapping of a selected set of CLOs from a large bouquet of courses. Assessment of CLOs is carried out using a weighted average of selected AECs in a course.

Application in ABET Accreditation
For the framework application, the initial set of computer engineering SOs provided by ABET is adopted. The SOs are 11 outcomes labeled (a) to (k). The SOs are refined into a set of 29 Performance Indicators (PIs; see Table 1).

Course Learning Outcomes Level
Courses comprise a large set of CLOs (see Table 2) that can support the assessment of PIs. The reliability of assessment is preserved even with the selection of a representative set of sampled courses. The application of the proposed framework samples from a set of courses that comprises digital logic design (DLD), computer organization and architecture (COA), electrical circuits (EC) and its lab (ECL), microcontrollers and interfacing (MI) and its lab (MIL), embedded system design (ESD), signals and systems (SS), and capstone design projects (CDPs). Table 1 presents a mapping between the CLOs and PIs to enable the measurement of attainment based on the measurements at the level of CLOs. The CLOs per course are well-designed to provide clear mappings onto the indicators. For example, PI a.1 maps to the DLD course CLOs 1 and 2 with weights of 15% and 10%, respectively.
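The CLO-to-PI roll-up can be sketched as follows. Only the PI a.1 mapping (DLD CLOs 1 and 2 with weights 15% and 10%) comes from the text; the CLO attainment scores and the normalization of partial weights are assumptions made for illustration.

```python
# Illustrative sketch of rolling CLO attainment scores up to a PI score.
# Since the mapped weights (e.g. 15% and 10%) need not sum to 1, they
# are normalized here -- an assumption about how partial weights combine.

def pi_attainment(clo_scores, clo_weights):
    """Weighted average of the mapped CLO attainment scores for one PI."""
    total_w = sum(clo_weights)
    return sum(w * s for w, s in zip(clo_weights, clo_scores)) / total_w

# PI a.1 <- DLD CLO1 (weight 15%) and DLD CLO2 (weight 10%)
# The CLO scores 78.0 and 88.0 are invented example values.
print(round(pi_attainment([78.0, 88.0], [0.15, 0.10]), 1))  # -> 82.0
```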

Digital Logic Design (DLD)
(1) Apply number system conversions, typically related to binary system (2) Simplify Boolean expressions using basic theorems and properties of Boolean Algebra

Computer Organization and Architecture (COA)
(1) Demonstrate an understanding of the basic organization and architecture of modern computer systems (2) Analyze computer system performance (3) Demonstrate an understanding of how computer programs are organized, stored, and executed at the machine level (4) Write basic assembly language programs (6) Analyze an instruction-set architecture and propose a suitable datapath and control unit implementations (9) Demonstrate an understanding of the input/output mechanisms used to connect computers to their external environments

Microprocessors and Interfacing (MI)
(3) Design microcontroller applications (4) Develop microcontroller applications (5) Learn both HW and SW aspects of integrating digital devices (such as memory and I/O interfaces) into microcontroller-based systems (6) Learn the operating principles of common microcontroller peripherals such as UARTs, timers, and A/D and D/A converters (7) Use microcontrollers to implement projects

Microprocessors and Interfacing Lab (MIL)
(1) Use MPLAB to successfully write, debug, trace, and execute assembly programs (2) Write assembly language programs (3) Experiment with the available PIC microcontroller hardware to run various microcontroller applications (4) Implement input interface applications including switches, push buttons, and keypads (5) Implement output interface applications including 7-segment and LCD displays (6) Implement motor control applications including DC and stepper motor control (7) Implement time-sensitive applications using timers, interrupts, and clocks (8) Implement data conversion applications using ADCs

Electrical Circuits (EC) is a core course in almost all computer engineering programs. The proposed course has 11 CLOs; the first three are dedicated to measuring and assessing applied mathematical capabilities. The remaining CLOs follow the structure outlined in the course textbook [12]. The CLOs of the EC course map to ABET SOs with three levels of emphasis (see Table 3). An SO mapping emphasis is either high (H), medium (M), or low (L); the emphasis depends on the extent of coverage of a CLO in the course materials in relation to an SO.
In the current investigation, a variety of assessment components are employed, including Course Work (CW) that comprises seven homework sets (HW) and six in-class quizzes (QZ). In addition, assessment components include a Design Project (DP), a Midterm Exam (ME), ME Questions (MEQs), a Final Exam (FE), and FE Questions (FEQs). The evaluation of student performance is calculated using a weighted-average formula to produce the student's total percentage score in the course. The attainment of CLOs is calculated based on the scores of specific AECs. The adopted mapping of AECs onto CLOs and the assigned weights (Ws) are shown in Table 4. The proposed approach is unified in the sense that it provides attainment scores for each CLO and SO, besides evaluating the performance of students. A sample question is shown in Figure 2; the question is MEQ-6, which maps explicitly onto CLOs 1, 3, and 11, and implicitly onto CLOs 4, 6, and 7. At this point, the attainment of an SO is calculated as the average of attainments of only the CLOs mapped with high (H) emphasis. An alternate calculation of the attainment that employs a weighted-average formula based on all three emphasis levels H, M, and L is also possible. Based on the attainment percentage scores, a ranking is assigned for each CLO per the following rubric:

• Beginning: below basic achievement, percentage attainment below 65%
• Developing: basic achievement, percentage attainment between 65% and 75%
• Competent: satisfactory achievement, percentage attainment between 75% and 85%
• Accomplished: mastery of a learning outcome, percentage attainment above 85%

Table 3. Mapping of the Electric Circuits (EC) CLOs (1 through 11) to the SOs (a through k) with the level of emphasis.
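The four-level rubric above can be encoded directly. Note that the stated ranges share their endpoints, so the assignment of the exact boundary values (65%, 75%, 85%) to bands below is an assumption.

```python
# A direct encoding of the four-level ranking rubric. Boundary values are
# assigned to the higher band -- an assumption, since the stated ranges
# share endpoints.

def rank(attainment):
    """Map a percentage attainment score to its rubric rank."""
    if attainment < 65:
        return "Beginning"
    elif attainment < 75:
        return "Developing"
    elif attainment < 85:
        return "Competent"
    return "Accomplished"

print(rank(60), rank(70), rank(80), rank(90))
# -> Beginning Developing Competent Accomplished
```

The same function applies unchanged to CLO, PI, and SO scores, since the paper states that all attainment scores follow the same rubric.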

Capstone Design Project Courses
In the proposed assessment methodology, CDPs are selected as one of the main components of the set of courses of measurements. The CDP setup, assessment, and evaluation follow the structure of the tool presented in [13]. The adopted methodology is unified in the sense that it provides attainment scores for CLOs, PIs and SOs, in addition to evaluating student performance.
As per the adopted setup, CDPs are scheduled over a period of two regular semesters. The pre-requisite for the CDP is a senior design experience provided in the courses on Microprocessors, Microcontrollers and Interfacing, and Embedded System Design. The pre-requisite courses are equipped with extensive practical laboratory components. Assessment tools are used to assess the CDPs and the performance of each student; the criteria, key indicators, and percentage weight assignments are shown in Table 5. Table 6 shows the mapping among the CDP evaluation criteria, the PIs, and SOs (a) through (k). As the mappings are many-to-many, a weighted calculation is used to quantify the assessment for SOs. The assessment tool, including the complete set of analytic rubrics, is presented in detail in [13].
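The many-to-many weighted calculation can be sketched as below. The criterion names, scores, and mapping weights are illustrative placeholders, not the actual Table 5 and Table 6 entries.

```python
# Hedged sketch of the many-to-many weighted roll-up from CDP evaluation
# criteria to SOs. All names, scores, and weights are invented examples.

criterion_scores = {
    "Presentation and Oral Communication": 84.0,
    "Technical Report": 76.0,
}

# so_map[so] = list of (criterion, weight); weights per SO are normalized,
# so one criterion can contribute to several SOs (many-to-many).
so_map = {
    "g": [("Presentation and Oral Communication", 0.7),
          ("Technical Report", 0.3)],
}

def so_from_criteria(so):
    """Weighted average of the criterion scores mapped to one SO."""
    pairs = so_map[so]
    total = sum(w for _, w in pairs)
    return sum(w * criterion_scores[c] for c, w in pairs) / total

print(round(so_from_criteria("g"), 1))  # -> 81.6
```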

E. Presentation and Oral Communication (20%, Supervisor and Examination Committee Members)

Deployment and Closing the Loop
The proposed framework requires one course assessment form (CAF) for each course from the selected set of sampled courses. The CAF is focused on the calculation of attainment scores of CLOs, as shown in Table 4. The CAFs are initiated by the course instructor(s) and submitted for further review at the program level. During this cycle, proofing is done to make sure that the data, statistics, comments, and conclusions are consistent. In addition, supporting evidence is attached to the CAFs, including results, to ensure the quality of the assessment process. The review at the program level places great value on the direct assessment of learning outcomes as collected and analyzed using the course form and CDP assessments. The assessment results, collected using the forms, are combined to calculate an attainment score in percent for every PI. Opportunities for improvement are identified at the course level through the CLO attainment scores. Opportunities for improvement at the program level are identified using the PI and SO attainment scores. Indeed, other assessment tools can be used for triangulation and wider coverage; such tools include different types of surveys and recommendations from faculty, staff, and industrial and student advisory boards. Table 7 shows the CLO attainment results for the EC course as sampled over the two assessment cycles of the AYs 2013–2014 and 2014–2015. The attainment scores of PIs are calculated based on the CLO scores; PI scores are shown in Table 1. The attainment scores of SOs are calculated as the average of the PIs (see Table 8). The ranking of all attainment scores follows the same rubric as the CLOs.
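The closing-the-loop step described above can be sketched as follows: the SO attainment is the average of its PI scores, and any score below the competency level is flagged as an opportunity for improvement. The PI scores below are invented, and taking 75% (the lower bound of the Competent band) as the threshold is an assumption.

```python
# Sketch of the closing-the-loop step: SO attainment as the average of
# its PI scores, with below-threshold scores flagged as opportunities
# for improvement. PI scores and the threshold choice are illustrative.

COMPETENT = 75.0  # assumed threshold: lower bound of the Competent band

pi_scores = {"a.1": 62.0, "a.2": 78.0, "a.3": 80.0}

def so_attainment(pi_scores):
    """SO attainment as the plain average of the mapped PI scores."""
    return sum(pi_scores.values()) / len(pi_scores)

so_a = so_attainment(pi_scores)
opportunities = [pi for pi, s in pi_scores.items() if s < COMPETENT]
print(round(so_a, 2), opportunities)  # -> 73.33 ['a.1']
```

A flagged PI (here the hypothetical a.1) would then trigger the backtracking procedure described in the evaluation section: revisiting the CLOs mapped to that PI to propose course-level improvements.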

General Evaluation
Many benefits are noted for the proposed framework. The framework addresses the need to identify a limited pool of courses from different seniority levels to obtain accurate assessment results. The selected manageable set of sampled courses aims to facilitate the assessment process, including data collection, while maintaining effectiveness. The many-to-many mapping among courses, PIs, and SOs enables the triangulation of assessment results from different extensive sources of measurements. The conceptual base of the framework is refined into a clear measurement structure that measures both SO attainments and student performance. The framework can easily be adopted by other programs and disciplines without changing the statistics or the measurement structure. To close the assessment loop, evaluation can lead to the identification of opportunities for improvement at the course level, using CAFs, or at the PI and SO levels. CLO, PI, or SO scores that are below the desired competency level are considered opportunities for improvement. Identified low scores at the PI and SO levels trigger an investigative backtracking procedure to check relevant CLOs, whether within or outside the pool of sampled courses, to propose improvements. The framework was deployed within a two-cycle assessment plan. The case-study provides the opportunity for deep reasoning and analysis of the proposed framework. The application of the framework successfully led to accrediting the computer engineering program at the American University of Kuwait by ABET in August 2016, with the AY 2021–2022 as the expected year of the next evaluation [14]. The self-study report of the computer engineering program that adopted the proposed framework was featured as exemplary during the ABET Symposium 2016 [15]. The adopted CDP measurement tool, from [13], clearly identifies the evaluations at the level of the team and of individuals and enables adequate distinction for the assessment of students.
The adopted CDP tool unifies the evaluation of project and course qualities and the assessment of attainment of outcomes.

Challenges and Limitations
The aim of the developed framework is to endorse the principles of reliable, thorough, accurate, easy-to-deploy, and unified assessment and evaluation. The successful application of the framework faces several challenges, including the reliance on the commitment of assessors to design AECs, to thoroughly review the course material, and to provide concrete evidence. Data collection, result aggregation, and identification of opportunities for improvement, besides the sorting, filing, organization, and ease of access of data, require careful setup and the use of record keeping and supporting software tools.
A couple of limitations are identified for the proposed framework, and they set the ground for future work. A multi-site study can provide increased confidence in the obtained results and enable wider considerations. The duration of the case-study is limited to two assessment cycles over two academic years. The duration can be extended to provide a deeper analysis of the result trends. Indeed, the study is for a program with a specific curriculum and course setup. Similar program setups are needed to successfully adopt the framework.

Closely Related Work
A variety of frameworks have been developed to measure the attainment level of SOs. The design of curricula, course assessment forms, performance indicators, and reports for continuous improvement plays an important role in this process. Many successful case studies have been presented to exemplify these frameworks using both direct and indirect methods. Deciding on which courses should contribute to assessing a certain SO and what assessment tools are to be used is usually the first and most important step. In [16], it is argued that using course work for assessment is very time consuming for the faculty involved and for any outside assessment coordinator; consequently, only comprehensive final exams are used. The proposed exams were locally designed by a special committee and go through a continuous cycle of refinement and improvement. Many other institutions, especially in the US, use standardized tests, such as the SAT, GRE, and MFAT; although this kind of assessment establishes a unified ground for international comparisons, it sometimes lacks the ability to provide the necessary level of detail required for adequate outcomes assessment [17]. Grade averages alone can also be used to assess SOs; as illustrated in [18], only class averages are used for assessment, with a reference threshold score on a single practical assignment. Dedicated assessment tools that are independent from evaluation, such as essays, tailored assessment exams, and e-portfolios, are used in [19]; these tools are often combined with grading rubrics to measure the degree of competency required for different SOs, and they can also be aggregated with grades to better assess the curricula. Other frameworks that depend on national competitions and exams, capstones and senior design courses, among other methods for measuring SOs, have also been reported in the literature [20][21][22].

Conclusion
The development of reliable, effective-in-application, and easy-to-deploy assessment plans is a worldwide interest of academic programs. This paper studies the validity and the effectiveness of a framework for the assessment of SOs. The framework enables a unified use of scores to obtain performance evaluations of students and attainment of CLOs, PIs, and SOs. The framework is deployed within two assessment cycles, each of one AY. The framework proved to be a robust tool, and the results led to the successful identification of opportunities for improvement, closing of the assessment loop, and accreditation by ABET of a computer engineering program. The deployment results identified improvements for CLOs 2, 3, and 9 in the EC course during the AY 2013–2014 and showed a major enhancement in the attainment of CLOs 2 and 9 during the AY 2014–2015. The results highlighted the main need for improvement in attaining indicators a1, d1, e1, e2, f1, and f3, and accordingly SO (a) and SO (f), with ranks of Beginning and/or Developing. Future work includes the development of additional software tools that facilitate the deployment of the framework. Future work also includes carrying out a multi-site study over a longer duration, which can provide increased confidence in the obtained results and enable wider considerations.

…puter Engineering. He is a member of the ECE ABET Accreditation Steering Committee and the Chair of the ECE Academic and Curriculum Committee. His research interests include virtual engineering, nonlinear dynamics, hybrid signal processing, engineering physics, and chaos. He is a member of IEEE, AACC, and AIP. Jibran Yousafzai is an Assistant Professor of Electrical and Computer Engineering. He is the Chair of the ECE Project Evaluation Committee and a member of the ECE Industrial Advisory Board.
His research interests are in the broad area of signal processing, ranging from theoretical aspects of signal analysis to applications in automatic speech recognition and resolution of robustness issues, machine learning and digital audio processing. He is a member of IEEE.
Article submitted 08 June 2017. Published as resubmitted by the authors 25 July 2017.