Article

A COVID-19 Shift to Online Learning: A Comparison of Student Outcomes and Engagement for the Bacterial Unknown Identification Project

1 Department of Biological Sciences, Purdue University, West Lafayette, IN 47907, USA
2 Department of Animal Sciences, Purdue University, West Lafayette, IN 47907, USA
* Author to whom correspondence should be addressed.
Submission received: 23 May 2023 / Revised: 1 July 2023 / Accepted: 9 July 2023 / Published: 11 July 2023

Abstract

Many microbiology classes include a bacterial unknown identification project (BUIP), in which students identify an unknown microorganism. The COVID-19 pandemic forced a change in instructional methods from an in-person to an online version of this project. Our aim was to determine how the shift from in-person to online learning impacted three aspects of student engagement: student satisfaction, as measured by the withdrawal rate from the course; student enthusiasm, evaluated by student self-efficacy; and student learning, determined using non-point-based questions. To address the effectiveness of these modified versions of the BUIP, approximately five hundred students completed multiple-choice and Likert-style surveys before and after the project across two semesters: the semesters immediately prior to and during the initial phase of the COVID-19 pandemic. We found that while each semester reflected increases in perceived engagement with the material (p < 0.05), students reported feeling comfortable with the process of unknown identification irrespective of the semester. Surprisingly, learned information, as measured by post- minus pre-survey scores, was equal across semesters. Additionally, we observed fewer course withdrawals during the semester in which the project transitioned online. These data shed light on how the curriculum during the transition to online learning led to equivalent student learning but greater student engagement, and therefore point to the importance of comparing shifts in a curriculum within one course.

1. Introduction

The COVID-19 pandemic caused an unprecedented transition in teaching as schools were forced to shift to online learning. Rather than sitting in classrooms with face-to-face instruction, students were now viewing their teachers through a computer screen in the comfort of their own homes. This sudden shift from in-person to virtual classes created new difficulties in student learning. For instance, research has shown that students identify technical issues, a lack of social interaction, decreased motivation, and an absence of immediate feedback as problems associated with online instruction [1,2,3]. Furthermore, students who rely on transnational education may struggle linguistically with online education [4]. However, despite these issues, exposure to virtual learning (e-learning) or hybrid courses (combining in-person and online learning) during the pandemic still left students with a positive attitude toward online learning [5]. As such, a majority of students favor having online instruction, and educational institutions are adapting to a post-COVID-19 world by implementing e-learning or hybrid courses [4,6,7]. Thus, as online learning becomes more prevalent, instructors face several challenges, namely, how to make courses educational and interactive while maintaining student interest [6,8].
Courses that require physical experiences to enhance learning, such as laboratory courses, can be especially difficult to transition online [9,10]. A prime example is undergraduate microbiology courses that incorporate a wet-lab component. Students perform technical skills related to physically handling microorganisms. More specifically, they learn how to isolate a microorganism to develop a pure culture, how to characterize bacteria using various media and biochemical tests, and how to differentiate microorganisms based on staining techniques. Armed with this knowledge, students are then tasked with completing a commonly used microbiology assessment: the bacterial unknown identification project (BUIP) [11]. In the BUIP, students work within the laboratory, individually or in groups, to identify an unknown microorganism, ultimately submitting a written report or giving an oral presentation describing their results [12]. This project, therefore, not only allows students to reinforce the hands-on skills that they have accumulated throughout the course, but also assesses their critical thinking; thus, students take ownership of their learning [13].
While the BUIP is traditionally taught as an in-person exercise, the COVID-19 pandemic forced many instructors to adapt it for online learning. This rapid shift to a virtual format created unforeseen challenges; students could neither engage with their peers nor physically practice the biochemical tests needed to identify a microorganism. However, a few approaches to the BUIP have addressed these issues. For example, students were assigned unknown microorganisms that they needed to identify using a Microsoft PowerPoint simulation [14]. This simulation promoted an interactive experience for the students, while also reinforcing their understanding of the various biochemical tests [14]. Students’ lack of social engagement with their peers has also been addressed. One course adapted the BUIP to focus more on peer learning and collaboration through teamwork, giving students an overall positive learning experience [15].
The efforts to transition the BUIP to an online format measured student learning experiences using final grades and end-of-project surveys [15]. Other science courses undergoing emergency remote instruction have used metrics such as student withdrawal to determine student engagement [16]. Alternatively, student interest in undergraduate science majors has been determined through Likert-scale surveys that assessed student self-efficacy [17]. However, while some studies have explored how COVID-19 affected undergraduate student learning in science courses, and other studies have explored changes in learning gains across the same course, there is limited research linking these two and directly evaluating student interest and learning for the same course taught before and during the COVID-19 pandemic [18,19,20].
During COVID-19, students at Purdue University had to unexpectedly switch their in-person microbiology laboratory course to an online version. Given the increasing number of hybrid and online courses, we aimed to determine whether online instruction of an in-person lab exercise would influence student engagement, learning, and enthusiasm. We hypothesized that students conducting their BUIP virtually rather than in person would experience lower retention rates, lower learning gains, and greater enthusiasm. To investigate the differences in retention rates, we first compared student enrollment across two semesters: one in which the BUIP was conducted in person (pre-COVID-19) and one in which it was conducted online (during COVID-19). We found that a greater number of students continued the course when it was taught virtually. We also characterized student outcomes and perceptions by surveying students before and after the BUIP and observed no changes in student learning across the two semesters. However, survey results indicated greater student enjoyment of the project with online instruction as compared to in-person instruction. These findings point to the importance of analyzing multiple metrics within one course across multiple semesters in order to improve student experiences in microbiology courses and ensure positive student outcomes, especially as the availability of hybrid and online courses increases.

2. Materials and Methods

2.1. Overview

We evaluated the BUIP at a large university in the US over a two-semester period. We assessed student learning gains and students’ evaluations of their own performance. Student learning gains were measured with a pre- and post-project survey, while self-efficacy and enthusiasm were measured with a post-project survey (see Supplementary Materials).

2.2. Intended Audience

The introductory microbiology laboratory at Purdue University is a fifteen-week course designed for non-biology undergraduates whose majors include, but are not limited to, pharmacy, animal science, food science, and agriculture. Prior to enrollment, students are required to take both Introductory Biology and Introductory Chemistry. The BUIP occurs after students develop a basic understanding of microbiological skills, including staining techniques, microscopy, streak plating, aseptic transfer, and selective and differential media. Typically, between 250 and 300 students are enrolled in the course, divided into laboratory sections of 24 students each. The instructional team consists of five to six graduate teaching assistants, eight to ten undergraduate teaching assistants, and one teaching faculty member. While teaching assistants varied over the course of the two semesters, the teaching faculty member remained the same.

2.3. General Project Overview

The BUIP was conducted over five to six weeks, at the end of which a written report was due. Each week, students were required to work on their project in addition to completing regularly scheduled coursework, so they were expected to spend approximately one additional hour per week working outside of class. At the end of the project, students submitted a written three-to-five-page report that described their process for identifying the unknown organism and discussed the impact of their results. Teaching assistants graded the project using a detailed rubric.

2.4. In-Person Method

Each student pair was provided a single bacterial culture from which they were asked to streak the organism onto an LB agar plate. After the growth of their organism, students were provided the necessary materials for identification of the unknown organism, including a Gram staining kit, a microscope, one fluid thioglycolate broth tube, selective and differential media, oxidase reagent, and hydrogen peroxide. Students were given a table containing approximately twenty organisms and their physiological characteristics. Students then worked independently over the next six weeks to complete the project, working concurrently on additional coursework and attending open-laboratory periods as needed.

2.5. Online Method

For the online version of the BUIP, each student was provided a number that corresponded to a randomized table of organisms maintained by the instructional staff. Twice each week, students could request the test results for their unknown organisms from their teaching assistants via email. The teaching assistants, using a photographic database of available organisms with corresponding test results, responded to the students’ requests with the correct images of the students’ organisms. Test results included Gram staining, growth patterns in fluid thioglycolate broth, results on various selective and differential media, the presence of the cytochrome oxidase enzyme, and catalase activity as detected with hydrogen peroxide. Students were given a table containing approximately twenty organisms and their physiological characteristics. Students worked independently over five weeks to complete the project, working concurrently on additional coursework and using email to interact with their teaching assistants.

2.6. Enrollment

To compare student retention, enrollment was determined by obtaining the student roster from the registrar at three time points within the semester: within the first two weeks of class, the week of the start of the class project, and during the week of the final exam. Total student numbers were tallied at these three points to determine the percentage of students who withdrew from the course.
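The withdrawal percentages described above reduce to simple roster arithmetic; a minimal sketch, using hypothetical roster counts rather than the study’s actual enrollment data:

```python
def withdrawal_rate(start_count: int, end_count: int) -> float:
    """Percentage of students lost between two roster snapshots."""
    return 100.0 * (start_count - end_count) / start_count

# Hypothetical roster counts at the three registrar time points:
# first two weeks of class, start of the project, and finals week.
week_two, project_start, finals_week = 270, 262, 248

overall = withdrawal_rate(week_two, finals_week)           # full-semester rate
second_half = withdrawal_rate(project_start, finals_week)  # after the project begins
print(f"overall: {overall:.1f}%, second half: {second_half:.1f}%")
```

Taking a snapshot at the project’s start is what allows the second-half rate to be separated from the full-semester rate in the results below.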

2.7. Survey Design and Distribution

To assess student outcomes and perception, students completed two surveys, one the week before the start of the project, and one at the culmination of the project during the final exam (Supplemental Data S1). Students completing the in-person BUIP completed a paper version of the survey, were instructed not to write their names on the surveys, and submitted them anonymously to a locked dropbox. Students completing the online BUIP completed an online version of the survey, in which their names were not recorded and aggregate data were exported to Excel. The number of survey responses and enrollment for each dataset is included in Supplemental Table S2.
To determine learning gains, four multiple-choice questions were asked in both the pre- and post-project surveys. To determine self-efficacy, students were asked two true–false questions on the post-project survey. To characterize student enjoyment of the project, students were asked to use a Likert scale, in which students answered 5 for ‘strongly agree’ and 1 for ‘strongly disagree’, on the post-project survey.

2.8. Participant Consent and Ethical Considerations

While all students completed the BUIP as a component of the Introduction to Microbiology course, participation in the survey was voluntary, and participants were informed prior to starting the survey that all data collected were anonymous and would be used for research purposes. Students had the option to leave the survey blank or not submit it. Each student was provided with a single survey, thereby preventing multiple responses. Institutional Review Board (IRB) approval was obtained to compare the survey data across the semesters involving student enrollment and survey data collection, human subject IRB 2021-1226 (Purdue University).

2.9. Statistical Analysis

After grades were submitted, anonymized survey data were collected on paper or through an online format. Data were collated and figures were generated in Excel. Learning gains were measured using the rubric described previously [21], in which gain = (post% − pre%)/(100 − pre%); a low gain is ≤0.3, a medium gain is 0.3–0.7, and a high gain is ≥0.7. The IBM SPSS Statistics program was used to conduct binomial logistic regression, or chi-squared analysis, of pre-test, post-test, and self-evaluation data between the two surveyed semesters. p values < 0.05 were considered statistically significant.
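The normalized-gain calculation and its binning can be sketched as follows; the scores used are hypothetical examples, not the study’s data:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Normalized gain [21]: gain = (post% - pre%) / (100 - pre%)."""
    if pre_pct >= 100:
        raise ValueError("pre-test score must be below 100%")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def classify_gain(gain: float) -> str:
    """Bin a gain using the thresholds in Section 2.9."""
    if gain <= 0.3:
        return "low"
    if gain < 0.7:
        return "medium"
    return "high"

# Hypothetical example: 40% correct pre-project, 55% correct post-project.
gain = normalized_gain(40, 55)
print(round(gain, 2), classify_gain(gain))  # 0.25 low
```

Because the gain is normalized by the room left for improvement (100 − pre%), two cohorts with different pre-test scores can be compared on the same scale.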

3. Results

In this study, our aim was to determine whether students’ interest in the coursework would directly correlate with learning gains. To answer this question, we compared student survey results between two classroom setups: one setup that used the existing curriculum (in-person), and a setup in which students transitioned online for the second half of the course (online). Students were surveyed prior to and upon completion of the project.

3.1. Student Retention

Initially, we compared overall student enrollment between the two semesters to determine whether the shift in project type corresponded to withdrawal rates (Figure 1). Overall, we observed a higher withdrawal rate (8.1%) during the semester in which the project was completed in person as compared to the semester in which the project was completed online (5.4%) (Figure 1). However, given that the project occurred after the first exam (approximately halfway through the semester), we also compared withdrawal rates in the second half of the course. Interestingly, here, we observed a lower second-half withdrawal rate (2.9%) when students completed the project in person as compared to online (4.4%).

3.2. Student Outcomes

To determine whether the shift in instructional strategy had an effect on student learning, we surveyed students on four questions that covered various facets of the BUIP and the learning goals. Across the two semesters, for each of the questions, the learning gains were low (<0.3), ranging from −0.18 to 0.11 (Supplemental Table S3). However, apparent differences were observed between the two semesters that may point to instructional strategies that emphasized distinct skills.
In the final written report for the BUIP, students are expected to provide their hypothesis; therefore, we surveyed whether students could select a correct hypothesis out of four provided options. While we observed no significant difference in the students’ improvement in identifying a correct hypothesis between the two semesters, there were still notable shifts (Figure 2A). Students in the in-person semester had an 8% improvement in their ability to identify a correct hypothesis when compared to students who completed the project online, where there was no change in student responses.
During the first half of the course, which both sets of students completed in person, students are taught that microorganisms typically exist in mixed communities, and they are taught techniques involved in isolation, such as the Gram stain. We wanted to determine whether students retained this information and if they were able to understand the process of isolation and identification throughout the BUIP. As expected, the majority of the students were able to select the correct choice for these questions before the start of the BUIP. Furthermore, despite a slight shift in the responses at the end of both semesters, there was no significant difference in the students’ ability to answer questions about mixed culture and Gram stain correctly when comparing pre-project to post-project responses for the in-person and online semesters (Figure 2B).
Although this project uses biochemical-based techniques to identify unknown microorganisms, we sought to determine whether students understood the limitations of this method of identification. Whether students participated in the project in person or online, students at the end of the semester more strongly favored the response that the BUIP alone could identify a microorganism over the more correct response of sequencing the small-subunit ribosomal RNA (Figure 2D). Ultimately, there was no significant difference in students’ answers when comparing pre-project to post-project responses between the two semesters.

3.3. Student Perceptions

While evaluating student learning is important when undertaking any shift in curriculum, one goal of this study was to determine whether the transition online affected student enthusiasm for the project. Therefore, we asked students to rate statements about the project related to perceived effort, value, and emotional response. Overall, students responded similarly across both semesters. For instance, students thought that the project did not take too much time (effort), that it was worth a fair number of points (value), and that it did not make them cry (emotional response). Notably, students completing the BUIP online disagreed to a greater degree that the project took too much of their free time, averaging a Likert score of 2.4 compared to 3.7 for the in-person semester (Figure 3A). However, students perceived equal value in the project across both semesters (average Likert scores of 2.2 in person and 2.5 online). The emotional response to the project was also perceived equally across both semesters, with average Likert scores of 2.1. Concurrently, students were asked whether they found the BUIP that semester exciting, exhausting, a combination of the two, or could choose not to respond (Figure 3B). Of note, we observed significantly higher excitement for the project when it was completed online: 30% of online students were excited by the project as compared to 17% of students in the in-person semester (chi-square analysis, p < 0.05). Furthermore, a greater percentage of students chose not to answer and found the project more exhausting when it was administered in person.
As a final testament to student self-efficacy and evaluation of the project, students were asked if they were comfortable with the process of identifying an unknown microorganism, and whether the unknown project applied to their life or career. Although students who completed the online BUIP rated themselves more highly on comfort with the process and felt it was more applicable to their career, there did not appear to be major differences between the in-person and online semesters (Table 1).

4. Discussion

During the Spring 2020 semester, educational institutions were forced to adopt an e-learning education format due to the COVID-19 pandemic [22]. Consequently, microbiology laboratory courses shifted to an online learning mode, which has been shown to work well for delivering content and assessing student learning [15]. However, a call was made to determine the consequences of digital technology on student education [23]. We wanted to determine whether the COVID-19 transition to online learning affected student retention, learning, and enthusiasm. Given that the COVID-19 pandemic was unexpected, we evaluated the change in student learning and enthusiasm using a pre-existing survey in which students were asked questions reflecting learning objectives and self-efficacy. Student retention was measured by determining enrollment at three points during the semester. We observed higher levels of student enrollment during the COVID-19 transition to online instruction. We also noted that learning objectives were maintained, and that student enthusiasm increased.
The primary goal of the in-person delivery of the BUIP is to assess written communication of scientific findings and comprehensive knowledge of bacterial physiology, different media, microscopy, and experimental design and implementation. The goal in transitioning this project to online delivery was to maintain these objectives as much as possible [15]. For a subset of these learning objectives, including bacterial isolation, staining techniques, and hypothesis generation, we used pre- and post-test assessments and a final self-evaluation survey to gauge student performance. While some reports have indicated that student learning during the online transition increased [24], others have noticed a decrease in student learning [25], while yet others, including ours, noted no change [26,27]. Collectively, these differences between studies may be due to student cohort [28,29], self-reporting [30], or subject matter [31]. One key example is a meta-analysis of online and face-to-face learning curricula across multiple biology courses that indicated that the student performance varied across courses, which was associated with differences in course design and the lack of pre-test versus post-test elements in the studies observed [32].
Studies show that students attending online courses drop out at a substantially higher rate than those attending on-campus courses (25–40% and 10–20%, respectively) [33]. Although there are differences between purpose-built online courses and in-person courses that were moved online with little lead time, previous studies suggest that students’ satisfaction and the amount of time they must invest in learning influence their decisions to complete or drop out of classes [34]. We found that the students in our study had a more positive overall experience and a lower drop-out rate during the online format as compared to the in-person semester, which has been previously noted for e-learning [34]. An alternative explanation for the lower overall withdrawal rates during the online semester is increased course accessibility. While the use of an electronic bacterial unknowns simulation creates an assignment that is accessible and equitable to all students, regardless of their ability to attend in-person microbiology laboratory courses due to institutional requirements, health concerns, or other disruptions [14], online instruction generally suffers from inadequate infrastructure, limited collaboration, and poor communication [35]. Given the aggregate nature of the data in this study, future research should aim to identify causative factors that impact students’ decisions to drop out of a course, whether it is in an in-person, online, or hybrid format.
Many of the studies of microbiology education in different teaching formats, including the survey used in this paper, lack testing for technical expertise. Typically, when the BUIP is performed in a laboratory setting, students are exposed to various materials and scientific instruments. This hands-on project aids in improving students’ microbiological skills, as it provides a practical extension of the concepts taught [36], a deeper and physically rooted understanding of scientific systems [37], and holistic conceptual knowledge of the coursework [38]. Here, we show that students responded positively to the online transition of this project and reported that they had successfully learned. However, students who require transferable technical skills would have a gap in their fundamental training if unable to complement their experience with wet labs that address these skills [39,40], which was not tested in this study. One option for such courses is to be flexible in how the subject is covered, such as by implementing hybrid laboratory courses to expose students to laboratory techniques unavailable in their lab courses [15]. This type of mixed-medium course would reduce the accessibility gap and give students the opportunity to experience material from multiple platforms. Furthermore, it would support students who are interested in a mix of online and in-person hands-on laboratory activities, even if these students are strongly in favor of digital online lab activities [41].
One strategy to cope with increasing student enrollment and limited finances is the practice of mixing or reducing wet labs and introducing dry alternatives. One could argue that in the field of microbiology education, dry labs are best suited for non-microbiology-major students who will not directly handle microorganisms in their future careers. However, students who need transferable technical skills, including aseptic transfer, bacterial isolation, and microscopy, would have a gap in their foundational training if they could not supplement their experience with wet laboratories that addressed these skills [30,42]. This study supports the success of dry-lab BUIPs, where students adapted to a shift in their curriculum, as noted by the lack of significant difference in students’ learning between the two semesters. Although we studied a shift from in-person to online learning, these results are consistent with previous pre-COVID-19 studies that showed no statistically significant difference in student success in fully online courses compared with face-to-face instruction [43]. However, there is limited research on evaluating technical skills for the same course taught in person and online [35,44].

Limitations and Future Directions

The findings of this study are limited to a single-institution survey that tested specific material. Therefore, the findings might not be generalizable to other microbiology classrooms that teach different material or have a greater baseline use of online materials. Additionally, as each question was aggregated across the entire study population, we were unable to directly correlate student learning and enthusiasm. Moreover, this was a study of opportunity that used pre-existing survey questions prepared by the study investigators which had not undergone rigorous standardization, potentially limiting the strength of the findings.
A future study could determine how effectively students learned and retained their technical skills by longitudinal assessments in which the same cohort of students is either tracked in subsequent classes that require this course as a pre-requisite, or they are tested for technical skills. Other potential studies could include expanding the work here to microbiology classrooms that teach students who may be more likely to be disadvantaged by the remote approach.

5. Conclusions

To our knowledge, this is the first study to explore the effects on student learning both before and during the COVID-19 pandemic within the same classroom by using surveys to assess student learning and engagement. Here, we demonstrate that a difference in instructional design within the same course produces only marginal differences in non-grade-based student learning. Overall, the transition to an online format in this class did not change students’ learning gains; however, students’ enjoyment of the project and retention in the course increased. Thus, these results might aid in modifying microbiology curricula to meet the demands of both students and universities.

Supplementary Materials

The following supporting information can be downloaded at: https://0-www-mdpi-com.brum.beds.ac.uk/article/10.3390/educsci13070702/s1, Data S1: Student Survey; Table S1: Survey responses and enrollment; Table S2: Learning gains; Table S3: Supporting Data.

Author Contributions

Conceptualization, A.D.F.; methodology, A.D.F.; formal analysis, A.D.F., K.P. and M.A.; investigation, A.D.F. and M.A.; writing—original draft preparation, A.D.F., K.P. and M.A.; writing—review and editing, A.D.F., K.P. and M.A.; visualization, A.D.F. and M.A.; supervision, A.D.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Purdue University (protocol code 2021-1226 and approved on 09/09/2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in Supplemental Table S3.

Acknowledgments

We would like to thank Purdue University Statistical Consulting Service, in particular David B. Arthur, for their guidance on statistical analysis.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Student enrollment across two consecutive semesters. A greater percentage of students withdrew when the class was taught in person than when it transitioned online. Student enrollment was calculated by collecting the class roster at each of the three time points and calculating the percentage of students who withdrew relative to enrollment at the start of the semester. The in-person method is depicted in light grey; the online method is depicted in dark grey.
Figure 2. Changes in student learning before and after the BUIP. Post- minus pre-survey averages for multiple-choice responses indicate that students performed equally well overall across the two semesters. Four questions reflecting two of the learning goals (develop an appropriate hypothesis and apply appropriate microbiological methods) were given to students before the start of the project and at the end of the semester. Correct answers are indicated with black bars, with post- minus pre-survey values shown; incorrect answers are in various shades of grey. For all four questions, there was no significant difference between students' pre- and post-survey answers across both semesters (log-linear analysis, p > 0.05).
Figure 3. Self-reported student perceptions and engagement. Self-reported project engagement indicates that students had more positive outcomes when the class transitioned online. Student engagement with the project was assessed through either (A) a Likert-scale question reflecting the perceived difficulty of the project or (B) a multiple-choice question reflecting excitement about or exhaustion with the project. For both question types, there was a significant shift in students' engagement with the project between the two surveyed semesters (chi-square analysis, p < 0.05).
Table 1. Self-reported student efficacy. Students responded yes or no to two questions to address self-efficacy. The percentage of "yes" responses is indicated.

Post-Test "Yes" Responses (%)
                                                      In Person   Online
I am comfortable with the process of identification      87.7      92.7
The unknown project applied to my life or career         55.5      61.6
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Fricker, A.D.; Perri, K.; Abdelhaseib, M. A COVID-19 Shift to Online Learning: A Comparison of Student Outcomes and Engagement for the Bacterial Unknown Identification Project. Educ. Sci. 2023, 13, 702. https://doi.org/10.3390/educsci13070702


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
