Article

Simulated Fieldwork: A Virtual Approach to Clinical Education

Department of Occupational Therapy, Duquesne University, Pittsburgh, PA 15282, USA
*
Author to whom correspondence should be addressed.
Submission received: 25 August 2020 / Revised: 24 September 2020 / Accepted: 26 September 2020 / Published: 2 October 2020

Abstract

The purpose of this study was to investigate student satisfaction and perceived clinical reasoning and learning using a computer-based simulation platform that incorporates case-based learning principles. The simulation was used to replace a previously scheduled face-to-face clinical rotation that was cancelled due to COVID-19. A descriptive design was used to implement the Satisfaction with Simulation Experience Scale (SSES) with students (n = 27) following both a low-fidelity (paper cases) and a high-fidelity (Simucase™) simulation. A comparison of the SSES data following the paper cases and the simulation scenarios indicated statistically significant increases in Debrief and Reflection (p = 0.008) and Clinical Reasoning (p = 0.043), suggesting that students develop in-depth reflection, reasoning, and clinical abilities as they progress through their simulated experience.

1. Introduction

Due to the current landscape of our communities and educational system, faculty across the country are seeking evidence-based, practical ways to meet curricular demands. COVID-19 has affected the education system at multiple levels and has had a significant impact on students’ ability to meet requirements in professional programs, such as clinical education. The use of simulation and simulated environments is one strategy for responding to the lack of clinical training sites.

1.1. Simulation as a Model of Practice

Simulation can be described as a teaching-learning modality that replaces and strengthens real experiences with guided ones that evoke clinical reasoning and reproduce aspects of real scenarios using an interactive approach [1]. Simulation can be low or high fidelity, which refers to the “degree of realism associated with a particular simulation activity” [2] (p. 11). While the literature varies in how it categorizes fidelity, for the purposes of this model the authors define peer practice, paper-case studies, and role play as lower-fidelity simulations, whereas standardized patients, human patient simulators, and simulation labs that can mimic physiological responses are closer to the high-fidelity range. Evidence indicates that simulation provides various opportunities for learners to develop competence and confidence [3]. In addition, many clinicians, educators, and healthcare leaders believe simulation promotes patient safety and raises the quality of patient care [4]. The literature indicates that when simulation is combined with faculty engagement through debriefing, students achieve better results [5,6]. Most debriefing models follow a pre-brief, scenario, and debrief structure. The pre-brief is often used by the instructor to reflect on the experience of the learner and on the instructor’s own experience with debriefing. It is also important in establishing learning objectives and can provide context about the client or case, the experience, or key points to consider in advance of the scenario. Sawyer and colleagues [7] suggest a Gather, Analyze, and Summarize structure for the post-simulation debrief, with intentional prompts for the faculty facilitator to promote guided self-reflection. Creating a standardized format for faculty to facilitate debriefs is a known best-practice approach in simulation education [8].
In addition to the benefits of frequent and structured debriefs in fostering self-reflection and frequent opportunities for feedback, structured student assessment via a rubric is another critical component to use along with simulation [9].

1.2. Simulation in Healthcare Education

In nursing, simulation has long been used to train in such areas as physical assessment, communication, and interprofessional collaboration [10]. In fact, recent research suggests that up to 50% of clinical hours in a prelicensure RN program could be effectively replaced by simulated experiences without negative impacts on learning outcomes [11]. Similarly, in Speech Language Pathology (SLP) programs, a 2016 revision to the Standard V-B of the Certificate of Clinical Competence in Speech Language Pathology noted that clinical simulation is an acceptable alternative method of direct clinical practice time, with up to 75 simulation-hours attributable to clinical certification [12].
In allied health education, results are similar. In one study, high-fidelity ICU simulation positively impacted physical therapy students’ perceived readiness for clinical education [13]. In another study focused on interprofessional education (IPE), researchers found that simulated IPE can help improve skills such as communicating with, collaborating with, and learning from other health care professions [14,15]. In addition to these benefits, there is also the potential for simulation to meet the increasing demands of clinical education. In physical and occupational therapy curricula, there are growing burdens on clinical training hours, including limits on the number of patient encounters, the need to ensure exposure across the lifespan, and the need to maintain a focus on patient safety and outcomes. With COVID-19, these demands rose sharply, essentially eliminating opportunities within certain practice settings such as skilled nursing facilities and schools.
Clinical competence is the foundation of occupational therapy (OT) education [16], and using simulation provides students with an opportunity to interact with virtual client scenarios, practice required skills, and avoid risk of injury or harm to actual patients, who may be vulnerable. In OT, although the Accreditation Council for Occupational Therapy Education [17] has recently indicated that simulation is an acceptable method to deliver early clinical education, there are limited resources available in this area. In a large-scale randomized controlled trial, Imms and colleagues found that students can achieve learning outcomes in a 40 h simulated placement equivalent to those achieved in a 40 h traditional placement [18]. The primary objectives of this study were (1) to examine the impact of high-fidelity human patient simulations on students’ perceived levels of knowledge, confidence, and clinical reasoning; and (2) to examine student perceptions of the value of simulation as an experiential learning strategy.

2. Materials and Methods

2.1. Study Design and Sampling

The study used a descriptive design to investigate student satisfaction with a computer-based simulation platform that incorporates case-based learning principles (Simucase™), as well as students’ perception of their clinical reasoning skills. The study was embedded into a graduate-level occupational therapy course focused on clinical reasoning, which requires a level I fieldwork experience. Participants were sampled from a cohort of twenty-nine students aged 20–22 within a BS/MS program, during the third semester of their professional program.

2.2. Overview of Simulation Experience

Fieldwork education is an essential requirement within occupational therapy and is situated as progressive experiential learning, distinguished as level I and level II. Level I fieldwork is designed to “enrich didactic coursework” and provides introductory learning experiences via “directed observation and participation in selected aspects” of occupational therapy practice [17] (p. 41). Current accreditation standards indicate that level I FW can be met in a variety of ways, including simulated environments, standardized patients, faculty practice, faculty-led visits, and supervision by a fieldwork educator in a practice setting [17]. A simulated environment is further defined in [17] as “a setting that provides an experience similar to a real-world setting in order to allow clients to practice specific occupations” (p. 54). The researchers believe that simulation can be an ideal way to deliver level I fieldwork experiences, as it provides students with the opportunity to repeatedly practice skills in a low-stakes environment. Although simulation education is an acceptable method to accomplish level I fieldwork, there is limited research in this area, particularly for occupational therapy.
This research study examined student perceptions of using Simucase™ to satisfy a one-week, level I fieldwork experience. Simucase™ is a computer-based simulation platform that provides students with experiences designed to teach complete processes using video recordings of client scenarios. Currently, this technology is available for the audiology, occupational therapy, physical therapy, and speech-language pathology professions. Differing from other virtual learning resources, Simucase™ offers a comprehensive platform including simulations, part-task trainers (short scenarios focusing on one skill), and an observation video library [19]. Another benefit of Simucase™ is that it is designed to measure student skills and enhance clinical competency. These skills include honing clinical observations, interviewing clients and families, collaborating with other disciplines, administering and interpreting assessments, designing intervention plans, and implementing interventions, all of which are conventional skill sets developed during level I FW. The simulation scoring provided within Simucase™ is based upon the strength of the student’s clinical decision making. Virtual scenarios vary in age, diagnosis, and practice setting, which can expose students to more diverse experiences than may be available or accessible during traditional in-person fieldwork. For example, virtual patient simulations include patients across the lifespan (ages 2–80 years old) and within various practice settings such as community-based practice, home health, early intervention, school systems, and more. Diagnoses of the virtual patients include orthopedic, neurological, developmental, and mental health conditions. Each simulation is authentic: it is created from an actual client case and submitted by a practicing clinician.
Using the Simucase™ platform, the researchers created a robust one-week curriculum to align with the traditional learning objectives within the academic program and with best practice for simulation education. This included structured debriefing to stimulate self-reflection and a rubric to evaluate student engagement and clinical performance. Ahead of the one-week simulation, students were provided an overview of the experience (see Appendix A) and were able to self-select particular simulations to engage with, which determined the smaller debriefing sessions. The researchers created structured learning activities to be completed by the students daily. Depending on the day, students were required to submit a deliverable ahead of their scheduled debrief using the educational institution’s learning management platform, Blackboard™. Deliverables included a journal reflection, a written sample of documentation based upon a simulation, a recorded video clip in which the student demonstrates a relevant intervention, or the competency report provided by Simucase™ after activities such as the Part-Task Trainer were completed. Four faculty members facilitated the simulated fieldwork experience, which reduced the instructor-to-student ratio for debriefing sessions. Students met one to two times per day within their small groups via web conferencing to process their learning and engage in structured discussion around various aspects of each simulation. Faculty used the suggested debriefing prompts provided within the Simucase™ faculty platform to facilitate small-group discussion and feedback sessions. Following each debrief session, faculty completed the rubric in Appendix B. The daily rubrics were averaged at the end of the week, with comprehensive summative feedback on overall engagement across the entire one-week experience.

2.3. Instruments, Data Collection, and Analysis

Following Institutional Review Board (IRB) approval, data were collected through the Qualtrics platform, using the Satisfaction with Simulation Experience Scale (SSES) [20] and a final student evaluation and self-reflection questionnaire to evaluate the overall experience. The SSES is an 18-item self-rated scale that assesses student satisfaction with simulation in three areas: Debrief and Reflection, Clinical Reasoning, and Clinical Learning. The scale has well-documented reliability and validity; Williams and Dousek [21] found that the SSES has adequate internal consistency and construct validity. Using the SSES, participants rated each item on a five-point Likert scale (1 = Strongly Disagree to 5 = Strongly Agree). Data collection for the study occurred at two points. The SSES was initially administered at the start of the third semester, prior to any simulation education, asking students to reflect on their experiences with paper-case scenarios and in-class discussions thus far in the curriculum; it was administered again following the one-week simulation experience. In addition to the post-simulation SSES, students also completed a final evaluation at the end of the week (Appendix C), which stimulated self-reflection on their overall experience with the Simucase™ platform.
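To make the scoring concrete, the following is a minimal sketch of turning one student's 18-item, five-point Likert response set into the three subscale means. Note that the item-to-subscale mapping shown here is purely illustrative and is not the SSES's published mapping; only the 18-item length, the three subscale names, and the 1–5 rating range come from the description above.

```python
# Illustrative SSES-style scoring sketch. The item index ranges below are
# hypothetical placeholders, NOT the scale's actual item assignments.
SUBSCALES = {
    "Debrief and Reflection": range(0, 9),    # illustrative indices only
    "Clinical Reasoning":     range(9, 14),   # illustrative indices only
    "Clinical Learning":      range(14, 18),  # illustrative indices only
}

def score_sses(responses):
    """responses: 18 Likert ratings (1 = Strongly Disagree .. 5 = Strongly Agree).
    Returns the mean rating per subscale."""
    assert len(responses) == 18 and all(1 <= r <= 5 for r in responses)
    return {name: sum(responses[i] for i in idx) / len(idx)
            for name, idx in SUBSCALES.items()}

# A student who answers "Agree" (4) to every item scores 4.0 on each subscale.
print(score_sses([4] * 18))
```

Per-student subscale means computed this way are the paired observations that the pre/post comparison in the next subsection operates on.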
Descriptive statistics for scores on the SSES pre- and post-simulation were analyzed using the Statistical Package for the Social Sciences, version 25 [22]. A Wilcoxon signed-rank test was used due to the small sample size and the inability to assume a normal distribution; it revealed significant differences between the post-paper-case and post-simulation means. In addition to the quantitative data, to understand the full experience of students following the simulation, qualitative data were reviewed and analyzed through concept-driven coding. The qualitative data were gathered through the final evaluation (n = 29), in which students had the chance to reflect on their experience. The three authors began the qualitative analysis with initial coding of the responses, then organized the categories, and finally provided a structure for the overall supporting quotes [23]. For the purposes of this manuscript, only the most salient quotes from each category were used to describe the findings, aligned with the quantitative data below.

3. Results

A Wilcoxon signed-rank test indicated a statistically significant increase in two of the three sections of the SSES post-simulation: Debrief and Reflection (z = −2.67, p = 0.008), with a large effect size (r = 0.63); and Clinical Reasoning (z = −2.023, p = 0.043), with a large effect size (r = 0.64). Clinical Learning did not show a statistically significant difference (z = −1.826, p = 0.068, r = 0.65); however, individual statements in this section were significant. These findings suggest that students develop in-depth reflection, reasoning, and clinical abilities as they progress through their simulated experience. Supporting quotes from the final student evaluation and self-reflection questionnaire are also shared within these results. Each of these sections, plus the students’ perception of the overall experience, is elaborated on below.

3.1. Perceived Value of Debriefing and Reflection

As the literature suggests, debriefing and reflection are at the core of any simulation experience [5,6]. For these 29 students, debriefing and reflection made a significant impact on their learning and on the perceived value of the program. Table 1 presents the corresponding statements of the SSES in this section, mean scores, and the significance following the Simucase™ experience.
In addition to the SSES scores, students often commented on the debriefing process throughout the post-fieldwork survey. One student stated:
“While this fieldwork experience was different than we expected, I feel that I have learned so much from it. I felt that the daily debriefs were very helpful for me to view the cases and situations from points of view that I had not considered. It gave me a more well-rounded clinical reasoning understanding of what was presented. Hearing what my peers thought and learned along with the clinical examples from my professors, I learned more in that sense than what I would have at a site.”
Other students discussed how the debriefing specifically impacted their learning, in a way that was unique to this simulated experience. For example, one student commented:
“The debriefing sessions were really helpful for me to articulate my clinical reasoning and any questions I had for the professor. These virtual debrief sessions allowed me to have the chance to communicate what I thought about certain cases/assessments and helped me improve my communication skills.”
Similarly, another student stated, “The most helpful aspect was the discussions and debriefs as that is not an aspect that we would have gotten to do as much of in a hands-on experience.” The results of the SSES and the statements made by students indicate the impact that debriefing and reflection during simulation have on learning.

3.2. Perceived Use of Clinical Reasoning

The second focus area of the SSES, which was of particular importance to the researchers, was the perceived use of clinical reasoning. In occupational therapy, clinical reasoning is defined as, “the thinking that guides practice… [and that] cognitive activity constitutes the heart of the clinical enterprise” [24] (p. 601). In this section, as presented in Table 2, students had statistically significant growth in all statements, with particular emphasis on the opportunity to reflect on clinical ability, which supports data from the first section as well.
In the survey, students were asked to reflect on their overall learning through this experience. Consistent with the SSES data, the concept of clinical reasoning was discussed in some capacity by the majority of participants. One student discussed her use of diagnostic reasoning, an aspect of clinical reasoning often discussed in occupational therapy. She stated:
“This was my first experience working on a case that involved children with ASD. I knew the signs and symptoms of ASD [autism spectrum disorder], but every child presents differently. So being able to work with three different cases involving ASD and seeing the differences helped expand my knowledge and experience with working with ASD but different children.”
Overall, 24 out of the 29 students referenced clinical reasoning skills and growth in some way. One example from a student who captured this finding stated, “I believe [this experience] helped my clinical reasoning skills because I was able to share my thinking in response to the faculty’s questions, as well as be challenged to think from the different perspective of my peers.”

3.3. Perceived Use of Clinical Learning

In the final section of the SSES, students assessed their perceived connections to clinical learning. In this scale, clinical learning asks the students to reflect on their overall abilities and skills to apply knowledge to practice. Table 3 presents the statements associated with this sub-section following the paper case and Simucase experiences.
Again, student responses correlated with the findings of this SSES section. In terms of the overall experience, one student stated:
“I felt that this virtual fieldwork experience has enhanced my skill set and comfortability in working with different populations and diagnosis. The Simucase format of watching videos of the patient during therapy sessions, reading through patient charts, intake forms, and case histories and then applying that information by making informed decisions on their plan of care through answering of questions was very helpful.”
Another student reflected on their own self-regulated learning and efficacy, stating:
“I think this experience allowed me to be more comfortable with being wrong and trying new approaches than I would be in a facility with a new [Fieldwork Educator] and unfamiliar clients. It gave me the opportunity to experiment without any real implications if I chose something incorrectly.”

3.4. Perception of Overall Experience and Preparedness

Since this was the final level I fieldwork experience prior to the students beginning their full-time clinical placements (level II fieldwork), it was important to also understand how they perceived its value in preparing them for level II. One aspect that was continually referenced in the students’ final assessment and post-survey was confidence. One student stated:
“My confidence going into Level II fieldwork has increased knowing that I have practiced administering assessments, developing treatment plans for several different types of clients, writing SOAP notes, and discussing other considerations such as billing, ethical dilemmas, and safety issues. It helped immensely to be able to compare my experience with others and know that I was on the right track, as well as listen to advice from the professionals that are in our own department. I felt that the assignments were well-timed and appropriate to the types of things I would be practicing in a real fieldwork setting, and I’m glad I got to use Simucase, which makes you use clinical reasoning on real-life patients. I believe FWI was as good as it could get despite the circumstances!”
Another student similarly discussed their preparedness for full time clinicals, stating:
“Virtual fieldwork experience enhanced my skill set and confidence for level II fieldwork because it allowed me to use clinical reasoning through gathering data on each of the clients and using it to make ethical decisions for evaluations and interventions. I like how Simucase gave us feedback on our answers to help self-reflect on what I put and what to change. I also liked how we worked with multiple clients with different diagnoses, making it more realistic to in-person fieldwork. Finally, I liked how we were given opportunities to self-reflect throughout the week and share these reflections during the discussion.”
Students were also asked to provide honest feedback on what challenged them or did not go as anticipated in the experience. Multiple students suggested having the opportunity to process more cases, to better replicate the actual clinic setting. One student stated:
“I think one way this fieldwork could be enhanced next year, is to add in more stimulations or videos on different interventions with various populations. During fieldwork in December we are seeing multiple sessions per day which isn’t quite possible in a virtual setting but maybe adding in a way for us to get more exposure to how different sessions would run could be helpful.”
Eight of the students also stated in some way that they would like a greater variety of pediatric diagnoses. One stated, “I did learn from those cases which helped to expand my experiences, but I would have liked working with other diagnoses such as CP [cerebral palsy] or spina bifida just to name a few.”

4. Discussion

The results of this study identify students’ self-reported satisfaction related to in-depth reflection, reasoning, clinical abilities, and preparedness upon completion of the simulated fieldwork I experience. The findings in this study support the growing evidence that simulation can enhance student competence, confidence, and perceived readiness for clinical education [3,4,14].

4.1. Perceived Value of Debriefing and Reflection

Simulation provides the opportunity to fully review and discuss a case and to seek clarification in a less time-pressured setting than the clinic. The structured debriefing sessions provided time for students to discuss performance, ask questions, and reflect on each case and assignment. Faculty were able to summarize important issues and provide constructive feedback that furthered student understanding and learning. In this study, the students reported the positive impact of debriefings and reflections on their understanding and learning, which adds support to prior studies that identify debriefing as an integral part of simulation-based learning [25,26,27]. Debriefing and reflecting on a regular basis gives students constant, real-time feedback, which can differ from a traditional clinical setting, where the fast-paced environment and productivity demands limit feedback. Decreased feedback, high workload, and time constraints have been identified as barriers to learning in clinical settings [28,29,30].

4.2. Perceived Use of Clinical Reasoning

Student learning and understanding can be negatively impacted by limited exposure to varied client populations at a clinical site [29]. The simulation environment provides the opportunity to engage with a variety of cases that vary in diagnosis and age. In this study, the students’ ability to independently engage and make clinical decisions with a variety of clients was enhanced in the simulated environment. The students identified how this variety expanded their clinical reasoning beyond a particular setting, diagnosis, or age, and they were required to make clinical decisions on their own. This varied exposure and independent thinking improved clinical reasoning, as each case presented a new challenge and students exercised different diagnostic reasoning and clinical decision-making skills [26,29]. These skills are essential for success in future clinical rotations. James and Mussleman [30] found that failure in later, more advanced fieldwork experiences was often a result of a lack of clinical reasoning and problem-solving skills. Participating in group discussions, which required the students to reflect on and explain their thinking and to listen to the thoughts of others, supported student growth in clinical reasoning [26]. Students valued hearing different perspectives and reasons why different methods of intervention may be suitable for the same case [25].

4.3. Perceived Use of Clinical Learning

Students reported an overall improvement in their clinical learning after the simulated experience. Clinical learning asks the students to reflect on their overall abilities and skills to apply knowledge to practice. The simulations tested their clinical ability by challenging them to apply what they have learned to each step of the case study and make clinical decisions. The safe environment of the simulation enabled students to try new approaches and make mistakes without risk. Debriefing and reflection, working with different populations/diagnoses, and the safe environment for learning and making mistakes have been supported in the literature as enhancing the students’ perceived use of clinical learning through simulation [25,31].

4.4. Perception of Overall Experience and Preparedness

Discussion of confidence was found in the students’ final evaluations and self-reflections. Students reported that the overall simulation experience enabled them to practice clinical skills, including assessing, identifying interventions, documenting, and clinical reasoning. The simulations and learning activities designed by the researchers also engaged the students in meaningful discussion and learning around important concepts such as ethical reasoning, billing and reimbursement, and safety issues. Through practice, reflection, and discussion, students reported an increase in self-confidence as they transitioned into their level II fieldwork rotation [3,14]. Studies of preparedness for level II fieldwork and future practice identify confidence as a factor for success in both transitions [29,32].

4.5. Limitations and Future Research

The study design identified students’ perceptions of and satisfaction with a computer-based simulation platform; clinical competence was not measured through an outcome assessment. The necessary first step was to understand student perceptions of the aspects of simulation related to debriefing and reflection, clinical reasoning, clinical learning, and overall experience and preparedness. The limitations of this study stem from the purposive sample and the lack of diversity in gender, age, and race; however, the students who participated were representative of the student population in similarly sized occupational therapy educational programs [33]. The study also reports the findings of one cohort of students from a private, Catholic academic institution in the northeast and may not reflect the behavior and attitudes of occupational therapy students in different geographical areas or types of institutions. We would recommend repeating this study with a larger, more diverse group of students of different genders, ages, and races across various occupational therapy programs at different academic institutions. In addition, further exploration is needed of self-perceived clinical competence and reasoning versus actual assessment of specific outcomes for these skills.

5. Conclusions

Simulation is currently being explored as a model of practice to meet the increasing demands of clinical education within the health sciences. Through debriefing and reflection, simulation can enable students to experience clinical learning and to develop clinical reasoning, confidence, and perceived readiness for level II fieldwork clinical education. Simulation also provides a way to increase exposure to patient encounters across the lifespan and within various clinical arenas. Given the current landscape, with a shortage of fieldwork placements across the US and the impact of COVID-19 on site availability, understanding the student experience with these teaching modalities will help shape and further define the future use of simulation for level I FW. This study’s findings suggest that simulated case scenarios enable students to experience aspects of clinical learning and clinical reasoning through guided inquiry, integration of knowledge, and reflection/feedback.

Author Contributions

Conceptualization, A.M., R.M.M., & E.D.D.; methodology, E.D.D.; formal analysis, A.M.; discussion, R.M.M.; writing—original draft preparation, A.M.; writing—review and editing, R.M.M. & E.D.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to acknowledge Emily Casile and Alexandria Raymond for their assistance in manuscript preparation. In addition, we would like to thank Maura Lavelle and Wendy Brzozowski from Simucase for their support in curriculum development, manuscript preparation, and student support.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Level I Fieldwork: Simucase. Instructions: The following assignments must all be completed on the days/times assigned to align with appropriate debriefing sessions. Please review these assignments/requirements prior to the Level I FW Orientation, so that you can plot out the cases you are most interested in (when appropriate). As a reminder, expectations are that you come prepared to each debriefing session with the defined products for that assignment.
Monday — Chart Review (1–2 h)
Criteria: Using your chosen Simucase client and the simulation template as a guide, complete the chart review, focusing particularly on relevant information.
Products to bring to debrief: Part 1 of the Simulation Template, completed in full.
Schedule: Complete this task on Monday; debrief will be at 3:00 pm.

Monday — Intervention (3–5 h)
Criteria: Based upon the chart review, choose one treatment technique based on:
  • Biomechanical, Behavioral/Psychiatric, OR Neurosensorimotor treatment approach — after watching the simulation, discuss a treatment principle covered in OCCT 519, OCCT 530, OCCT 520, or OCCT 525. Examples: retrograde massage, patient transfer, PLB, etc.
  • Home Education Program (HEP) or Patient Education Session — examples: ROM/strengthening HEP, instruction on use of adaptive equipment, creation of a splint wear schedule, patient education on home safety, fall prevention, etc.
Products to bring to debrief: Part 2 of the Simulation Template, completed in full; an associated 5 min video clip demonstrating your chosen technique.
Schedule: Complete this task on Monday; debrief will be at 3:00 pm (1 h).

Tuesday — Part Task Trainer (2–3 h)
Criteria:
  • For students who were scheduled for pediatrics: complete one of the GOAL or TVPS-4 assessments under Part Task Trainer. Go through the simulation, indicate your findings through Simucase, and note your questions and overall interpretation of the assessment.
  • For students who were scheduled for an adult site: complete the Simucase CLQT+ Part Task Trainer with Julia. Go through the simulation and document your findings, questions, and overall interpretation of the assessment.
Products to bring to debrief: Your final Simucase report, with questions/comments.
Schedule: Complete this task on Tuesday morning; debrief will be at 12:00 pm (1 h).

Tuesday — Documentation (2–3 h)
Criteria: Using the client from Monday, complete a SOAP note on your chosen intervention. In addition, create a narrative discharge summary or transition note, depending on what you feel is most appropriate for the client's situation.
Products to bring to debrief: Upload the SOAP note and discharge summary or transition note to Blackboard prior to the debrief session.
Schedule: Complete this task Tuesday afternoon; debrief will be at 3:00 pm (1 h).

Wednesday — Interdisciplinary Activity (2–3 h)
Criteria: In the video library, watch "Nico Child Development Day Collaborative Assessment Part 1 & 2". Observe the OT interacting with other interdisciplinary team members and discuss the unique role of OT in the context of the interdisciplinary team in this particular case. Was any TeamSTEPPS approach observable?
Products to bring to debrief: Reflective journal.
Schedule: Complete this task on Wednesday morning; debrief will be at 12:00 pm (1 h).

Wednesday — Reimbursement/Productivity (2–3 h)
Criteria: Adult settings: use the Ed Intervention 2 (OT) case. Pediatric settings: use the Alaina Intervention case. Behavioral health settings: use the Stress Management Group Intervention case.
  1. Prior to starting the intervention, explore the reimbursement structure and processes for the setting (Ed: home care; Alaina: school setting; Sebastian/Mark: community practice). Make note of what you find in reference to the following questions:
    • How do OTs account for billable time spent with the client?
    • What documentation requirements are there for reimbursement?
    • What might be internal barriers and challenges to reimbursement?
    • What might be external barriers and challenges to reimbursement?
    • What is the accrediting agency of the setting?
  2. Go through the case. Make note of what you think the billing structure might look like; then be sure to review the PDF of billing codes and be prepared to discuss what might have been most appropriate for your interventions of choice.
  3. Productivity is a measure of output (work). Using the AOTA Toolkit (https://www.aota.org/Practice/Ethics/Tools-for-Productivity-Requirements.aspx), determine how productivity might be measured at this site. What strategies might you use to effectively meet productivity standards? Come up with a potential unethical situation related to productivity with your Simucase client/practice, and determine any potential repercussions for this situation.
Products to bring to debrief: Detailed answers to the prompt questions; be prepared to discuss these at debriefing.
Schedule: Complete this task on Wednesday afternoon; debrief will be at 3:00 pm (1 h).

Thursday — Safety
Criteria: In the various cases you observed, what patient safety measures were observed? Were there any situations where the patient was at risk? In Occupationaltherapy.com, please watch the following course: Prevention of Medical Errors (Barbara Kornblau). This course looks at practice errors in occupational therapy and how to prevent them. It reviews root-cause analysis, error reduction and prevention, patient safety, and contraindications and indications specific to occupational therapy management, including medication and side effects.
Products to bring to debrief: Reflective journal; completed quiz with any additional questions.
Schedule: Complete this task on Thursday morning; debrief will be at 12:00 pm.

Thursday — Psychosocial Impact
Criteria: In all the sessions you reviewed thus far, what psychosocial factors were observed in the clients? How might you respond differently than what was observed, or in addition to the conversations you saw occur? In Occupationaltherapy.com, choose one of the 63 mental health related topics that might be relevant to the case you followed. Complete the associated journal and be prepared to discuss your findings.
Products to bring to debrief: Reflective journal.
Schedule: Complete this task on Thursday afternoon; debrief will be at 3:00 pm.

Friday — Use of Clinical Reasoning
Criteria: Reflect over the course of your simulations this week. Using the table below, indicate specific examples of using different types of clinical reasoning during your experience with Simucase.
Products to bring to debrief: Reflective journal; post-journal reflection on personal goals and the simulation experience.
Schedule: Complete this task on Friday; debrief will be at 3:00 pm.

Friday — Student Evaluation of Level I Experience
Criteria: Complete the Satisfaction with Simulated Experience Scale post-survey on Blackboard.
Schedule: Complete this task by 8:00 pm on Friday.

Friday — Evaluation of Level I Student
Criteria: Faculty will compile overall feedback and a score related to behavior, participation, and engagement in debriefing sessions.
Products: Debriefing rubric.

Appendix B

Table A2. Rubric for Student Engagement during Debrief. Name: Rubric Key. Exemplary Engagement indicates criteria is met 90–100%. Accomplished Engagement indicates criteria is met 80–90%. Developing Engagement indicates 70–80%. Beginning Engagement is Less than 70%.
Frequency of Engagement During Debrief
  • Exemplary (4): Student initiates contributions more than once in each recitation.
  • Accomplished (3): Student initiates contribution once in each recitation.
  • Developing (2): Student initiates contribution in at least half of the recitations.
  • Beginning (1): Student does not initiate contribution and needs the instructor to solicit input.

Quality of Engagement During Debrief
  • Exemplary (4): Comments always insightful and constructive; uses appropriate terminology. Comments balanced between general impressions, opinions, and specific, thoughtful criticisms or contributions.
  • Accomplished (3): Comments mostly insightful and constructive; mostly uses appropriate terminology. Occasionally comments are too general or not relevant to the discussion.
  • Developing (2): Comments are sometimes constructive, with occasional signs of insight. Student does not use appropriate terminology; comments not always relevant to the discussion.
  • Beginning (1): Comments are uninformative, lacking in appropriate terminology. Heavy reliance on opinion and personal taste, e.g., "I agree", "I disagree", "Me too", "Yes", "No", etc.

Information Seeking
  • Exemplary (4): Assertively seeks information to plan; carefully collects useful data from observing and interacting with the case; effective use of evidence.
  • Accomplished (3): Actively seeks information to support planning; occasionally does not pursue important leads.
  • Developing (2): Makes limited efforts to seek additional information from the patient; often seems not to know what information to seek and/or pursues unrelated or outdated information.
  • Beginning (1): Is ineffective in seeking information; relies mostly on objective data; fails to collect relevant evidence.

Prioritizing Data
  • Exemplary (4): Focuses on the most relevant and important data useful for explaining the case.
  • Accomplished (3): Generally focuses on the most important data and seeks further relevant information, but may also try to attend to less pertinent data.
  • Developing (2): Makes an effort to prioritize data and focus on the most important, but also attends to less relevant or useful data.
  • Beginning (1): Has difficulty focusing and appears not to know which data are most important to the diagnosis; attempts to attend to all available data.

Being Skillful
  • Exemplary (4): Shows competency with necessary OT skills in simulation (90–100 overall Comp Rating).
  • Accomplished (3): Displays proficiency in the use of most OT skills; could improve with speed/accuracy (80–89 overall Comp Rating).
  • Developing (2): Is hesitant or ineffective in using OT skills (70–79 overall Comp Rating).
  • Beginning (1): Is unable to select and/or perform OT skills (0–69 overall Comp Rating).
Adapted from: CMU’s Eberly Center for Teaching Excellence. Retrieved from: https://www.cmu.edu/teaching/assessment/examples/courselevel-bycollege/cfa/tools/participationrubric-cfa.pdf; and Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46(11), 496–504. Comments: ___/20.

Appendix C

Virtual Level I Fieldwork Evaluation/Reflection
Each student will complete this Level I fieldwork evaluation at the conclusion of the experience. Be honest! This is for you and your faculty to continue working on your professional development. Please rate yourself as you really felt you performed. Although this fieldwork was not completed as we intended, we still want to learn about what worked in this experience and what did not. Carefully respond to the reflective questions posed at the bottom of the evaluation. Thank you.
Part 1: Professional Behaviors. Please comment on how well prepared you feel for level II fieldwork, not that you have mastered all content. In one paragraph (less than 300 words) summarize your performance.
Part 2: Professional Skills. Please comment on how well prepared you feel for level II fieldwork, not that you have mastered all content. In one paragraph (less than 300 words) summarize your performance.
General Reflection on the Experience
1. Tell us how this virtual fieldwork experience enhanced your skill set and confidence for Level II fieldwork. Please be specific with features of the experience that were helpful.
2. Tell us how this virtual fieldwork experience could be modified to enhance your skill set and build your confidence for Level II fieldwork. Please be specific with suggestions.
3. Each of you have received feedback from previous level I fieldwork educators, faculty, and your peers in various ways. You also shared a goal in OTH512 for the week through Flipgrid. Please make a statement on progress you have made in the goal areas you have set for yourself based on this overall process.

References

  1. Aebersold, M. Simulation-based learning: No longer a novelty in undergraduate education. Online J. Issues Nurs. 2018, 23, 1–13. [Google Scholar] [CrossRef]
  2. Cunningham, S.; Foote, L.; Sowder, M.; Cunningham, C. Interprofessional education and collaboration: A simulation-based learning experience focused on common and complementary skills in an acute care environment. J. Interprof. Care 2018, 32, 395–398. [Google Scholar] [CrossRef] [PubMed]
  3. Mieure, K.D.; Vincent, W.R.; Cox, M.R.; Jones, M.D. A high-fidelity simulation mannequin to introduce pharmacy students to advanced cardiovascular life support. Am. J. Pharm. Educ. 2010, 74, 1–7. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Niemeyer, M.A. Effective patient safety education for novice RNs: A systematic review. J. Nurs. Educ. Pract. 2018, 8, 103–115. [Google Scholar] [CrossRef] [Green Version]
  5. McGaghie, W.C.; Siddall, V.J.; Mazmanian, P.E.; Myers, J. Simulation in undergraduate and graduate medical education: Implications for CME. Chest 2009, 135, 62S–68S. [Google Scholar] [CrossRef]
  6. Rieber, L.P.; Boyce, M.; Assad, C. The effects of computer animation on adult learning and retrieval tasks. J. Comput. Based Instr. 1990, 17, 46–50. [Google Scholar]
  7. Sawyer, T.; Loren, D.; Halamek, L.P. Post-event debriefs during neonatal care: Why are we not doing them and how can we start? J. Perinatol. 2016, 36, 415–419. [Google Scholar] [CrossRef]
  8. Kessler, D.O.; Cheng, A.; Mullan, P.C. Debriefing in the emergency department after clinical events: A practice guide. Ann. Emerg. Med. 2014, 65, 690–698. [Google Scholar] [CrossRef]
  9. Lasater, K. Clinical judgment development: Using simulation to create an assessment rubric. J. Nurs. Educ. 2007, 46, 496–503. [Google Scholar] [CrossRef]
  10. Lavoie, P.; Clarke, S.P. Simulation in nursing education. Nurs. Manag. 2017, 47, 18–20. [Google Scholar] [CrossRef]
  11. Hayden, J.K.; Smiley, R.A.; Alexander, M.; Kardong-Edgren, S.; Jeffries, P.R. The NCSBN national simulation study: A longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. J. Nurs. Regul. 2014, 5, C1-S64. [Google Scholar] [CrossRef]
  12. Standards for the Certification of Clinical Competence in Speech Language Pathology. Available online: https://www.asha.org/certification/2014-Speech-Language-Pathology-Certification-Standards/ (accessed on 1 August 2020).
  13. Nithman, R.W.; Spiegel, J.J.; Lorello, D. Effect of high-Fidelity ICU simulation on a physical therapy Student’s perceived readiness for clinical education. J. Acute Care Phys. Ther. 2016, 7, 16–24. [Google Scholar] [CrossRef]
  14. Jacobs, R.; Beyer, E.; Carter, K. Interprofessional simulation education designed to teach occupational therapy and nursing students complex patient transfers. J. Interprof. Educ. Pract. 2017, 6, 67–70. [Google Scholar] [CrossRef]
  15. Mills, B.; Hansen, S.; Nang, C.; McDonald, H.; Lyons-Wall, P.; Hunt, J.; O’Sullivan, T. A pilot evaluation of simulation-based interprofessional education for occupational therapy, speech pathology and dietetic students: Improvements in attitudes and confidence. J. Interprof. Care 2020, 34, 472–480. [Google Scholar] [CrossRef]
  16. Polatajko, H.; Lee, L.; Bossers, A. Performance evaluation of occupational therapy students: A reliability study. Can. J. Occup. Ther. 1994, 61, 20–27. [Google Scholar] [CrossRef]
  17. 2018 Accreditation Council for Occupational Therapy Education (ACOTE®) Standards and Interpretive Guide. Available online: https://www.aota.org/~/media/Corporate/Files/EducationCareers/Accredit/StandardsReview/2018-ACOTE-Standards-Interpretive-Guide.pdf (accessed on 24 August 2020).
  18. Imms, C.; Froude, E.; Chu, E.M.Y.; Sheppard, L.; Darzins, S.; Guinea, S.; Gospodarevskaya, E.; Carter, R.; Symmons, M.A.; Penman, M.; et al. Simulated versus traditional occupational therapy placements: A randomized controlled trial. Aust. Occup. Ther. J. 2018, 65, 556–564. [Google Scholar] [CrossRef]
  19. Simucase User Guide 4.0. Available online: https://d1e47g7vecbcl4.cloudfront.net/pdf/SC_1117_UserGuide_April_2020.pdf (accessed on 24 August 2020).
  20. Levett-Jones, T.; McCoy, M.; Lapkin, S.; Noble, D.; Hoffman, K.; Roche, J.; Arthur, C.; Dempsey, J. The development and psychometric testing of the satisfaction with simulation experience scale. Nurse Educ. Today 2011, 31, 705–710. [Google Scholar] [CrossRef]
  21. Williams, B.; Dousek, S. The satisfaction with simulation experience scale (SSES): A validation study. J. Nurs. Educ. Pract. 2012, 2, 74–80. [Google Scholar] [CrossRef]
  22. IBM SPSS Statistics for Windows, Version 25.0. Available online: https://www.ibm.com/support/pages/downloading-ibm-spss-statistics-25 (accessed on 22 August 2020).
  23. Berg, B.L. Qualitative Research Methods for the Social Sciences, 5th ed.; Allyn & Bacon: Boston, MA, USA, 2004. [Google Scholar]
  24. Rogers, J.C. Eleanor Clarke Slagle lectureship-clinical reasoning: The ethics. science, and art. Am. J. Occup. Ther. 1983, 37, 601–616. [Google Scholar] [CrossRef] [Green Version]
  25. Fey, M.K.; Scrandis, D.; Daniels, A.; Haut, C. Learning through debriefing: Students’ perspectives. Clin. Simul. Nurs. 2014, 10, 249–259. [Google Scholar] [CrossRef]
  26. Forneris, S.G.; Neal, D.O.; Tiffany, J.; Keuhn, M.B.; Heidi, M.; Blazovich, L.M.; Holland, A.E.; Smerillo, M. Enhancing clinical reasoning through simulation debriefing: A multisite study. Nurs. Educ. Perspect. 2015, 36, 304–310. [Google Scholar] [CrossRef] [PubMed]
  27. Al-Mously, N.; Nabil, N.M.; Al-Babtain, S.A.; Found Abbas, M.A. Undergraduate medical students’ perceptions on the quality of feedback received during clinical rotations. Med. Teach. 2014, 36, 17–23. [Google Scholar] [CrossRef] [PubMed]
  28. Grenier, M. Facilitators and barriers to learning in occupational therapy fieldwork education: Student perspectives. Am. J. Occup. Ther. 2015, 69, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  29. Hanson, D. The perspectives of fieldwork educators regarding level II FW students. Occup. Ther. Healthc. 2011, 25, 164–177. [Google Scholar] [CrossRef]
  30. James, K.L.; Musselman, L. Commonalities in level II fieldwork failure. Occup. Ther. Healthc. 2005, 19, 67–81. [Google Scholar] [CrossRef]
  31. Palominos, E.; Levett-Jones, T.; Power, J.; Martinez-Maldonado, R. Healthcare students’ perceptions and experiences of making errors in simulation: An integrative review. Nurse Educ. Today 2019, 77, 32–39. [Google Scholar] [CrossRef]
  32. Patterson, B.; D’Amico, M. What does the evidence say about student, fieldwork educator, and new occupational therapy practitioner perceptions of successful level II fieldwork and transition to practice? A scoping review. J. Occup. Ther. Educ. 2020, 4, 1–22. [Google Scholar] [CrossRef]
  33. Academic Programs Annual Data Report: Academic Year 2017–2018. Available online: https://www.aota.org/~/media/Corporate/Files/EducationCareers/Educators/2017-2018-Annual-Data-Report.pdf (accessed on 1 August 2020).
Table 1. Perceived satisfaction with debriefing and reflection during the simulation process.
SSES Statement | Post-Paper Case | Post-Simulation | p-Value 1
The facilitator provided constructive criticism during the debriefing. | 3.70 | 4.73 | 0.00
The facilitator summarised important issues during the debriefing. | 3.93 | 4.96 | 0.00
I had the opportunity to reflect on and discuss my performance during the debriefing. | 3.27 | 4.53 | 0.00
The debriefing provided an opportunity to ask questions. | 4.13 | 4.90 | 0.02
The facilitator provided feedback that helped me to develop my clinical reasoning skills. | 3.83 | 4.83 | 0.00
Reflecting on and discussing the simulation enhanced my learning. | 3.93 | 4.83 | 0.00
The facilitator’s questions helped me to learn. | 3.93 | 4.83 | 0.00
I received feedback during the debriefing that helped me to learn. | 3.53 | 4.70 | 0.00
The facilitator made me feel comfortable and at ease during the debriefing. | 3.83 | 4.83 | 0.00
1 p < 0.05; Wilcoxon signed-rank test.
Table 2. Perceived use of clinical reasoning during the simulation process.
SSES Statement | Post-Paper Case | Post-Simulation | p-Value 1
The simulation developed my clinical reasoning skills. | 3.66 | 4.31 | 0.00
The simulation developed my clinical decision-making ability. | 3.59 | 4.31 | 0.00
The simulation enabled me to demonstrate my clinical reasoning skills. | 3.79 | 4.38 | 0.01
The simulation helped me to recognize patient deterioration early. | 3.27 | 4.21 | 0.00
This was a valuable learning experience. | 4.00 | 4.48 | 0.01
1 p < 0.05; Wilcoxon signed-rank test.
Table 3. Perceived clinical learning during the simulation process.
SSES Statement | Post-Paper Case | Post-Simulation | p-Value 1
The simulation caused me to reflect on my clinical ability. | 3.72 | 4.41 | 0.01
The simulation tested my clinical ability. | 3.89 | 4.34 | 0.05
The simulation helped me to apply what I learned from the case study. | 3.86 | 4.62 | 0.00
The simulation helped me to recognize my clinical strengths and weaknesses. | 3.52 | 4.28 | 0.00
1 p < 0.05; Wilcoxon signed-rank test.
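For readers unfamiliar with the statistic behind the p-values above: each SSES item was compared across the two conditions with a Wilcoxon signed-rank test, a non-parametric test for paired ordinal ratings. The sketch below shows the computation in pure Python using the normal approximation; the two rating vectors are illustrative placeholders, not the study's raw data (which was analyzed in SPSS).

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test (normal approximation, two-sided).

    Paired ratings; zero differences are discarded, ties in the
    absolute differences receive averaged ranks. Sketch only: assumes
    at least a handful of nonzero differences.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    abs_sorted = sorted(abs(d) for d in diffs)

    def avg_rank(v):
        # average rank of value v among the sorted absolute differences
        idxs = [i + 1 for i, x in enumerate(abs_sorted) if x == v]
        return sum(idxs) / len(idxs)

    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    w = min(w_plus, w_minus)  # test statistic
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mu) / sigma
    # two-sided p from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w, p

# Hypothetical 1-5 SSES ratings from ten students on one item
paper = [4, 3, 4, 3, 4, 3, 4, 3, 4, 3]   # post-paper-case
sim   = [5, 5, 4, 4, 5, 4, 5, 4, 5, 5]   # post-simulation
w, p = wilcoxon_signed_rank(paper, sim)
print(w, round(p, 3))  # prints: 0 0.008 — a significant increase
```

In practice one would use a statistics package (e.g., SPSS as in this study, or `scipy.stats.wilcoxon`), which also applies continuity corrections and exact p-values for small samples; the sketch only illustrates the rank-based logic.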

Share and Cite

Mattila, A.; Martin, R.M.; DeIuliis, E.D. Simulated Fieldwork: A Virtual Approach to Clinical Education. Educ. Sci. 2020, 10, 272. https://0-doi-org.brum.beds.ac.uk/10.3390/educsci10100272