Online Education During a Pandemic – Adaptation and Impact on Student Learning

Abstract — Universities and educational institutions worldwide had to abruptly suspend their in-person classes and offer the rest of the term in an online format. This adjustment meant that instructors had to switch their instruction format and redesign their assessment strategies to ensure good-quality education. In this work, we present the methods used in two courses for this transition and the impact on student learning. Specifically, we present data from two courses: second-year engineering mathematics and first-year object-oriented programming. The online instruction covered all the course objectives, and the online assessment environment was designed with all possible safeguards to maintain integrity. Our data from these assessments show that the measures were successful. Further, the data indicate that while the pandemic severely impacted the first-year students, the second-year students did not experience any learning issues in the transition. We also present the lessons learned for future improvement.


Introduction
The novel coronavirus spread rapidly to several countries in the early part of 2020, resulting in a global pandemic. This caused an abrupt disruption of everyday life and of the functioning of businesses and organizations. While every organization faced challenges of some kind, educational institutions worldwide were severely crippled by government-issued social distancing norms. McMaster University was in the middle of the Winter 2020 term when the pandemic was declared. The departments were given a few days' notice to stop all in-person instruction, transform the education to an online learning environment, and adjust the assessment practices to conduct online exams.
By choosing to operate the remainder of the term in a remote environment, the university gave the students an opportunity to complete the rest of the term, albeit with some challenges and inconvenience, enabling them to progress into the higher levels. It must be noted that this was the only viable option: suspending all educational activities altogether would have disrupted the students' academic cycle, delayed their graduation, and probably affected their employment offers. University closure would also have meant a shutdown of the scientific research that is critical for students to graduate.
The focus of this work is to present this transition in two courses at the W Booth School of Engineering Practice and Technology at McMaster University, evaluate the effect of this transition on student learning, and present the lessons learned for future improvement. Specifically, the two courses presented in this work are the second-year undergraduate Mathematics course and the first-year undergraduate Object-Oriented Programming course. In both courses, the last three weeks were offered in a synchronous online format. The topics covered in these three weeks are critical because they are advanced concepts that are relevant to subsequent courses in the upper years.
The first step in this transition was to consider the various pedagogical approaches and adopt the most suitable one for these courses. More precisely, several practices were reviewed, namely, active learning [1][2][3][4][5][6][7], inquiry-based learning [8,9], co-operative and small-group learning [10,11], problem-based learning [12][13][14][15][16][17][18][19], and undergraduate research-based learning [20]. After careful consideration, we determined that an active learning approach rooted in the principles of constructivist theory is best suited for these courses. In the following sections, we present the methodology that we used to deliver the material and the outcome of this strategy on student learning. Student learning is measured via performance in the assessment components of the courses. We also reflect on the challenges that one faces in such an adaptation and the lessons that can be learned from this experience to enhance future courses.

Materials and Methods
As mentioned above, an active learning strategy was applied to deliver two courses in the last three weeks of the Winter 2020 term. The first course (Course-A) covers topics related to signals and systems in the second year of an undergraduate engineering program and is taught from a mathematical point of view, stressing the formulation and solution of equations. The second course (Course-B) is on object-oriented programming (OOP), in which the concepts are taught in the C++ programming language. The students enrolling in these courses eventually major in one of the following disciplines: Automotive Engineering Technology, Automation Engineering Technology, Biotechnology, or Software Engineering Technology. In the Winter 2020 term, about 73 students took Course-A during the COVID-19 pandemic, and about 130 students took the programming course (Course-B).

Course design
Topics covered in the mathematics course include (a) continuous-time signals and systems, (b) convolution, (c) Laplace transform, (d) Fourier series and transform, and (e) discrete-time signals and systems. In Course-B, the following topics were covered: (a) file I/O, (b) pointers, (c) structures, (d) objects and classes, (e) inheritance, (f) polymorphism, and (g) UML diagrams.
In the initial schedule, all the topics in both courses were to be delivered in a face-to-face setting over a duration of 13 weeks. In both courses, two classes per week were scheduled, each for two hours. The students' performance was measured using a variety of assessments, namely, assignments, term tests (two in Course-A and one in Course-B), a group project (in Course-B), and a comprehensive final exam. In Course-A, the students were given weekly assignments to ensure continuous learning of the material. In Course-B, students participated in eight labs where they had to develop programs for several questions, assessing them on the concepts taught in the previous week. The students in Course-B also completed a group project and presented it for evaluation in the last three weeks of the course. While in-person assessments were planned for the two courses in the Winter 2020 term, two assessments (exams) in Course-A and one in Course-B were conducted in an online format due to the pandemic interruption.

Online delivery
In both courses, we continued following the class schedule during the pandemic. Classroom sessions were held via Zoom. In Course-A, lecture notes were provided in PDF format through our online learning management system (LMS). These notes include definitions, relevant explanations of the topic, some solved examples, and unsolved problems. During lectures, the notes were screen-shared, enabling the students to view the live annotations during the lecture. The annotated notes were also shared with the students at the end of the lecture. During the lecture, numerous questions were posed to the students to maintain an active learning environment. Further, unlike the regular in-class sessions, the intensity and frequency of such interactions were deliberately increased by the instructor.
In Course-B, during the pandemic shutdown, the students were required to complete their projects and submit them to a dropbox on our LMS. This was because all topics had been taught in person before the pandemic shutdown. In preparation for the project, several meetings between the instructor and the groups were arranged to continuously monitor the project's progress, during which the instructor provided feedback on the quality and progress of the project.
Thus, in both courses, depending upon the topic's complexity, several examples were demonstrated to elucidate the topic, and student understanding was evaluated by jointly solving several examples. This engagement with examples and problem-solving/program-implementation exercises enabled a healthy exchange of ideas and solutions among the students.
It must be noted that faculty members were given only a few days' notice to convert all in-person academic operations to an online format. It was a challenging task considering the time limitation and the amount of work involved in resuming academic activities. The first challenge was to find a suitable virtual platform to connect with students that fulfilled our requirements to smoothly complete the academic cycle. Many faculty members also had no exposure to any virtual platform, so the university arranged multiple training sessions for them. This presented a steep learning curve for several faculty members, while others adapted fairly easily. The next challenge was to convert some of the course content into a digital format. For the authors of this work, these challenges were not insurmountable.

Assessments
In both courses, a variety of assessments, formal and informal, are employed to evaluate the students' understanding of the concepts. While the informal assessments are not graded, the formal assessments have numeric grades associated with them. Graded assessments include assignments, labs, tests, project, and the final exam. Assignments and labs are based on weekly topics covered in lectures. Students are allowed to collaborate with their peers and seek help from the instructor and the teaching assistants. The main objective of these assessments is to determine the students' level of comprehension to calibrate the upcoming lectures for better delivery of the concepts. If the class performance is poor, the instructor could review the concepts in the next class with more examples, and/or post additional examples with solutions in the LMS to help students understand the concepts better.
The two assessments in Course-A were conducted in weeks five and ten, respectively. The first assessment focused on the concepts taught in the first four weeks, whereas the second focused on the concepts taught in weeks five through nine. Assessment-1 in Course-B was conducted in the sixth week and covered the concepts taught in the first five weeks.
In this work, for both courses, we consider two assessments administered in two offerings, namely, Winter 2020 and Winter 2019. Assessment-1 in both terms was in person and serves as a benchmark for comparing the quality of the cohorts. Assessment-2 was an in-person assessment in the Winter 2019 term and an online assessment in Winter 2020. This comparison gives us a measure of the impact of the assessment mode on student performance in the context of the pandemic.
Designing a new assessment methodology for an online format was a challenge and required some brainstorming. A major constraint was that we were not allowed to monitor the students on camera due to privacy concerns, because any such invigilation policy must be communicated to the students at the beginning of the term, giving them a choice to deregister from the course. With this in mind, we determined that each student should receive a more or less unique exam under a time constraint tight enough that there would not be sufficient time to collaborate with others. This strategy increased the workload manifold but helped maintain the integrity of the exams to a large extent.
To prevent any collaboration during the exams in the online format, we did the following. In Course-A, a question bank consisting of three pools was created for the second assessment. Each pool had four different questions from a specific topic, all at the same level of difficulty. During the exam, each student received three random questions, one from each pool. Students were required to write the solutions and submit a scanned version of their solutions to a dropbox on the LMS. The dropbox was set with time restrictions preventing further uploads once the time expired. Similarly, in the programming course (Course-B), each student was given a set of five questions from a random pool of fifteen questions. All questions required students to write a program or alter a given program. With stringent time limitations, simultaneous examination duration, and randomization of the questions, collaboration was mitigated.

Results
The main objective of this work is to determine the effect of the abrupt transition to an online format on student learning. For this, in this study, learning was measured as the performance of students in the assessments. Reference data were obtained by comparing this cohort's performance with that of the students who took the same courses in Winter 2019. The Winter 2019 cohort was taught the same topics in both courses and had the same number and type of assessments. The only variation was in weeks 10-13, in which the Winter 2019 cohort attended classes in person and appeared for in-person exams. The students' performance in the two tests in Course-A is summarized in Table 1. As seen in this table, the performance of the cohorts in the two successive years was almost identical. It must be noted that the assessments (exams) were at the same level of difficulty in both years. The format of Assessment-2 differed slightly between the two years: in Winter 2019, all students appeared for the same paper-based examination, receiving the same set of questions, whereas in the online format in Winter 2020, students were presented with random questions from a database. However, in both cohorts, the students were assessed on the same type of questions. The close agreement in the first assessment clearly indicates that the learning ability of the two cohorts is similar. The grade distribution in Assessment-1 in Course-A is shown in Fig. 1. As is evident, in both years, the distribution was also very similar. In fact, the distribution in Assessment-2 was also very close in both years.
In the overall grade distribution of the course, the number of students receiving A and B grades in the Winter 2020 term was 8% lower than in the Winter 2019 term, with a proportionate increase in the number of students receiving a C grade. This could be attributed to the overall stress introduced by the abrupt changes during the pandemic shutdown.
The students' performance in the two assessments (exams) in Course-B is summarized in Table 2. As seen in this table, the student performance in Assessment-1, which was conducted in person, is almost identical in both years. Assessment-2 was in person in 2019 and online in 2020. In this case, we observe a steep decline in the students' grades in the online format. It must be noted that both cohorts received nearly the same type of questions. As in Course-A, the 2019 assessment was in person, so a single version was given to all students, while in the online format in 2020, a question bank with similar questions was used and the LMS generated random question sets for the students. However, the level of difficulty of the questions was the same in both years.

Fig. 2. Grade distribution in Assessment-2 in Course-B
In the first assessment, the grade distribution of the two cohorts was also almost identical, with about 57% of the students in 2019 and 61% in 2020 obtaining a grade of C or better. Once again, this clearly indicates that the learning ability of the two cohorts is quite similar. However, we notice a significant difference in the scores in the second assessment: the average score of the 2020 cohort is approximately 10% lower than that of the 2019 cohort. The grade distribution in the second assessment in this course is shown in Fig. 2. As seen in this figure, there is a significant increase in the number of students receiving a failing grade in this assessment.
It is interesting to note that while the abrupt transition to an online format did not impact student performance in Course-A, it significantly affected student performance in Course-B. We attribute this to the maturity levels of the students in their planning and time-management skills. More specifically, Course-A is a second-year course. After taking several courses, the students in this course are better prepared to handle stress and optimize their performance by planning and splitting their time between several courses to strike a balance. On the other hand, the students in Course-B are first-year students who have just joined the university; this course is one of the first they take while still acclimatizing to a university setting. The pandemic introduced a severe disruption that was quite intense for these students, who are still learning to manage and optimize their time and resources to succeed in all their courses. Unfortunately, this ultimately led the students in Course-B to perform more poorly than the students in the upper levels. This is consistent with the findings of Love and Kotchen [21], who found that students tend to optimize their performance by splitting their time between several courses to strike a balance and do well in all of them. In summary, the current evidence from just two courses suggests that students in the upper years are not likely to be affected significantly in an online environment, whereas new entrants to a university setting might find adapting to the online learning environment quite challenging.

Pedagogical approach
In both courses, we introduced the concepts through a constructivist approach to learning [22][23][24]. It has been our experience that the students in the W Booth School of Engineering Practice and Technology have a natural aptitude for this form of learning [25]. This was clearly established in our earlier work [25], in which we implemented a new course in the area of computational modeling of biological systems for undergraduate students. In that work, we found that using a constructivist approach to learning resulted in a deeper level of engagement among the students. There was healthy interaction with peers as well as instructors throughout the course during the active learning sessions. More precisely, when compared with the section of students that was not exposed to this approach, several students obtained an A grade or better in the course, an improvement of more than one letter grade. Thus, the overall cognitive learning was good with the implementation of activities requiring collaboration and reflection, which enabled good mental construction of the concepts through such rich interactions. Following this approach, the classroom environment in the online setting involved ample discussion among the students and between the students and the instructor. More precisely, after a concept was introduced, students solved several problems via active participation. The interactive session that is a critical aspect of this form of learning allows for a variety of experiences that ultimately foster a strong and lasting mental construction of the concepts. Thus, by experiencing a variety of viewpoints, strategies, and techniques in such an active learning environment, cognition is initiated in students' minds by combining new information with already known facts.
The informal assessments done during the lecture sessions play a crucial role and are useful in engaging students in online classes. By not associating such assessments with grades and engaging in informal assessment via sample questions inside the classroom, we could alleviate performance anxiety, and students were more focused on learning. With this strategy, a student who does not understand a concept is more inclined to acknowledge that, ask questions, and learn the principles needed to successfully solve the problem. In Course-B, where there were quite often a variety of approaches to implement a program to get the same desired outcome, students engaged in in-depth discussions to present their viewpoints and attempted to learn the alternative implementation approaches for the same problem. This helps foster an excellent mental construction of the concepts, and students can master the principles of the subject.

Assessment challenges
Measurement of learning and progression into the next levels are dictated by the students' performance in various assessments. As mentioned earlier, a variety of assessment techniques are employed in engineering education. In this work, we used informal as well as formal assessments. The informal assessments mostly involve group discussions and collaborative problem-solving sessions inside the classroom, giving us a measure of the cohort's preparedness and understanding. The formal assessments that we undertook in these courses can be divided into collaborative and individual assessments. Collaborative assessments include assignments, labs, and projects. While some of these assessments were conducted inside the classroom, several of them, requiring more time, were done outside the class. These assessments were largely based on an element of trust, assuming that the students would make an effort to learn from their peers instead of simply copying their solutions for grades. The real challenge is in designing and administering the individual online assessments, because in an online exam environment without an invigilator, there is always a possibility of collaboration, compromising the exams' integrity. A solution that we successfully implemented to mitigate this risk is to administer the exams through the LMS: we developed a database of questions, and random question sets were administered to the students with a short time window for returning the solutions. Further, administering the exam on an online platform such as Zoom and requiring the students to keep their cameras on for the duration of the exam serves as a strong deterrent to collaboration. However, we should state that this is not a foolproof solution, and there remains a possibility of collaboration if a student is technologically adept.
However, from the results presented in this work, we did not find any evidence of collaboration since students' performance in the online environment is either the same or poorer. Thus, we can conclude that the combination of random exam sets administered in a short window of time and a remote proctoring (via camera) setting is quite effective in maintaining the integrity of the examination process.
Other options that could be considered for online examinations include (i) enlisting a proctoring company's services, which uses video conferencing and expert invigilators to ensure the integrity of an assessment, and (ii) conducting personal interviews of students via video conferencing to evaluate their learning. However, there are several challenges with an interview-based examination, such as the inefficiency of the process for large classes, recording requirements to re-evaluate a student appeal, etc.

Technical challenges
There are technical challenges in shifting from in-person to online learning. The primary challenge is the procurement of the technology needed to deliver online classes, namely tablets, cameras, microphones, headsets, etc. To facilitate equity and inclusion in learning, additional requirements include recording services, closed captioning, etc. Thus, there is a sizeable financial commitment in setting up the infrastructure to deliver the material to all students. Instead of procuring a setup for each instructor, departments could mitigate some costs by investing in shared on-campus recording stations that faculty members could use to develop content for online delivery of lectures.
The other major challenge is in the delivery of the content itself. Specifically, the successful delivery of live lectures in an online setting strongly depends upon the stability of the internet connection. Loss of internet connection during a lecture or poor connectivity would result in an ineffective and, in some instances, an unpleasant learning environment for the student. This can be addressed, to some extent, by providing the lecture notes ahead of the class via the LMS and including supplementary material such as pre-recorded key examples and concepts. Additionally, to help students who have poor connectivity, an instructor could record lectures and make them available for review outside the class. However, it should be noted that providing recordings of the lecture could result in low attendance. In our work, we provided lecture notes and supplementary content; however, we maintained an attendance registry and found that the attendance levels in our courses were quite healthy (>90%) in almost every class.

Conclusion
In this work, we present the approach that we took to transform and offer two courses in an online format in the last three weeks of the Winter 2020 term, in the wake of the pandemic. The two courses were a second-year undergraduate engineering math course (Course-A) and a first-year course in object-oriented programming (Course-B) in the Bachelor of Technology programs in the W Booth School of Engineering Practice and Technology at McMaster University. For content delivery, the Zoom platform was used, and the content itself was a combination of PowerPoint slides/OneNote with annotations. To emulate a live in-class lecture session, we incorporated principles of constructivist theory and active learning into the live online lectures. This kept the attendance at healthy levels and fostered interactions and the exchange of ideas, ultimately enabling learning. The active learning sessions also gave students a sense of comfort in knowing that their peers were available for discussions and deliberations.
Learning was measured through a variety of assessment techniques. Informal assessments that did not have any grade associated with them helped gauge student learning and enabled us to calibrate our lectures, post supplementary materials, and offer extra help to struggling students. For individual assessments with a grade component, the existing in-person assessment protocols were transformed. Specifically, we introduced timed tests via our learning management system in which students were presented with a random question set generated from a database. Students had to submit a scanned version of the answers written on paper to an online dropbox. Comparison of the student performance with the previous cohort in Winter 2019 showed the following: (i) In Course-A, students in both cohorts (2019 and 2020) showed an almost identical performance. We conclude that the format of the online assessments met the integrity objectives. Further, the practically identical performance shows that the abrupt transition due to the pandemic did not negatively impact student learning in this upper-year course. (ii) In Course-B, students in both cohorts (2019 and 2020) showed an almost identical performance in the in-person Assessment-1, indicating that the capability and quality of the two cohorts are nearly the same. However, the 2020 cohort's average in Assessment-2, taken online, was almost 10% below that of the 2019 cohort, who took the same assessment in person. We attribute this to the fact that these are first-year university students who are still learning to optimize their time between different courses; under such circumstances, the pandemic introduced a lot of pressure, and the students could not do well.
While these findings are pertinent for these courses, a firm conclusion that the pandemic impacts only the first-year students and not the upper-year students cannot be drawn without analyzing several other courses.
Going forward, the challenges in offering online courses across an entire department in the university are immense: (i) One has to invest in appropriate technological infrastructure to help faculty members create material for online content, record supplementary content, and deliver it, all the while adhering to the equity and accessibility guidelines. (ii) Diverse assessments are needed, and in situations requiring individual assessments, one might need to hire a proctoring company to conduct tests and exams. (iii) There would be a need to ensure that the quality of the learning environment is credible without issues such as disruption of classes due to internet issues, or diminished active learning components, that could diminish the quality of education.