Abstract
Objective. To assess the impact of an advanced cardiac life support (ACLS) simulation on pharmacy student confidence and knowledge.
Design. Third-year pharmacy students participated in a simulation experience that consisted of team roles training, high-fidelity ACLS simulations, and debriefing. Students completed a pre/postsimulation confidence and knowledge assessment.
Assessment. Overall, student knowledge assessment scores and student confidence scores improved significantly. Changes in student confidence and knowledge from baseline were not significantly correlated. However, weak but significant positive correlations were found between presimulation studying and both presimulation confidence and presimulation knowledge.
Conclusions. Overall, student confidence and knowledge assessment scores in ACLS significantly improved from baseline; however, student confidence and knowledge were not significantly correlated.
INTRODUCTION
High-fidelity advanced cardiac life support (ACLS) simulation experiences are incorporated into a variety of health care education programs, and a number of strengths have been identified. In doctor of pharmacy (PharmD) curricula and postgraduate training settings, ACLS simulations increase learner confidence in ACLS management skills.1-5 Because of the added emotional stressors, simulation produces greater anxiety during ACLS instruction but is associated with enhanced performance of ACLS skills afterwards.6 The qualitative and quantitative value of adding such stressors requires further examination, but they represent useful variables in simulation-based education.
The impact of simulation fidelity level has been studied in medical residents; test scores and perceived instructor/course quality did not differ among high-, mid-, and low-fidelity simulations.7 However, residents trained on high-fidelity mannequins performed better on the megacode than those trained on mid- or low-fidelity mannequins. Curtin and colleagues demonstrated that the use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes.8 In that study, survival of the “patient” increased by 35.6% among those who completed the computer simulation, and students felt more confident in their recommendations. In the medical literature, studies more consistently demonstrate improved knowledge and ACLS-related skills in medical students who complete high-fidelity simulations.9-11
However, the impact of a high-fidelity ACLS simulation on pharmacy student knowledge has been only minimally assessed, with variable results. Mieure and colleagues reported a median postsimulation knowledge examination score of 25% among third-year PharmD students after completion of an ACLS simulation experience that consisted of a presession lecture, a calculation exercise, and a 40-minute ACLS session using a high-fidelity simulation mannequin. However, that study assessed students with only four examination questions, and students were not assessed at baseline.2 Davis and colleagues noted significant ACLS-related knowledge improvements from baseline when comparing classroom-based lecture to high-fidelity simulation in a second-year accelerated PharmD student population.3 The greatest examination score improvement (a change of 43 absolute percentage points) occurred when the lecture was followed by simulation.3 Bingham and colleagues reported nonsignificant, numeric improvements in clinical performance for 75% of ACLS skills in simulation scenarios and in mannequin survival rates for student teams with prior ACLS simulation training within the previous 120 days vs those who had no prior training.4 Given the minimal data available and the varying results, the question of whether a high-fidelity ACLS simulation experience confers any significant knowledge benefit for pharmacy students warrants further attention.
Advanced cardiac life support simulation experience and subsequent increases in knowledge have been demonstrated with pharmacy residents. Eng and colleagues reported significant increases in pharmacy residents’ advanced resuscitation knowledge scores on written examinations administered before and after a high-fidelity simulation (scores increased from 65% to 88%, p=0.001).5 Competency evaluations of the pharmacy residents also showed a significant improvement in advanced resuscitation skills based on a comprehensive intervention checklist (64% presimulation vs 77% postsimulation, p=0.009).5 While the impact of ACLS simulations on knowledge is not clearly defined in the pharmacy student population, significant increases in health care student and postgraduate health care trainee confidence have been well-documented in similar ACLS simulation studies.1,3,5,12-15 To our knowledge, there are no published studies evaluating the impact of a high-fidelity ACLS simulation experience on PharmD students’ ACLS management confidence and knowledge.
DESIGN
The purpose of this study was to assess how a high-fidelity ACLS simulation affects PharmD students’ ACLS-related knowledge and confidence. The study also evaluated whether a correlation existed between students’ ACLS simulation-related confidence and knowledge. Another study endpoint was the correlation between students’ preparation for the simulation module and ACLS simulation-related confidence or knowledge. Based on prior studies, we anticipated that the ACLS simulation experience would be associated with an increase in student confidence; however, we were uncertain of the extent of impact of an ACLS simulation on student knowledge or the relationship between student confidence and student knowledge. This study was approved by the institutional review board as exempt research.
At the South Carolina College of Pharmacy (SCCP), simulation experiences are used in the Clinical Assessment course, a capstone laboratory experience that equips third-year students with skills for providing pharmaceutical care in inpatient and outpatient settings prior to beginning advanced pharmacy practice experiences (APPEs) in the fourth year. The didactic portion of the course is taught by faculty members and is delivered synchronously through distance education to all three regional campuses. Each campus uses affiliated simulation centers in its respective geographic region for portions of the laboratory component. Laboratory content is similar across the three regional campuses; however, content and interprofessional trainee involvement vary slightly because of logistical differences among the campuses. Each campus uses full-time faculty, adjunct faculty members, and pharmacy residents as laboratory facilitators.
The ACLS module evaluated in this study occurred in the Clinical Assessment course and consisted of a 100-minute didactic lecture and a simulation experience. The didactic lecture explained the pharmacist’s role as a member of the ACLS team as well as other team roles, defined and described ACLS terms and situations, reviewed common ACLS cardiac rhythms and treatment algorithms, and provided information about ACLS-related medication use, dosing, and preparation. The simulation consisted of a brief training on team roles and features of a high-functioning team (adapted from DeVita and colleagues16 and Baker and colleagues17), high-fidelity ACLS simulation scenarios, and a debriefing session conducted by the laboratory facilitators. The team roles taught in the didactic lecture and brief training were: team leader, airway manager/assistant, procedure physician, chest compressor, medication/equipment manager, bedside nurse, and data manager/recorder.16 Features of a high-functioning team taught in the brief training were: team leadership, mutual performance monitoring, backup behavior, adaptability, team/collective orientation, shared mental models, mutual trust, and closed-loop communication.17 For the simulation scenarios, students were divided into teams of approximately five students each. Some campuses incorporated live interprofessional team collaboration with medical students, while others were unable to do so because of logistical constraints; thus, only pharmacy student data were included in this study. The simulation scenarios, which were developed by faculty members and simulation center staff and used on all three campuses, involved unstable bradycardia or supraventricular tachycardia progressing to ventricular fibrillation, pulseless electrical activity (PEA), or asystole. Simulation center staff and course facilitators operated simulation equipment and mannequins, while SCCP faculty members and facilitators supervised logistics, clinical teaching, and debriefing sessions. The laboratory schedule allowed students to complete each scenario once.
After the scenarios were completed, the debriefing consisted of a question-and-answer and discussion session in which students shared their overall simulation experiences, because scenarios could not be recorded at all of the facilities to enable playback during debriefing. The debriefing was followed by the postsimulation knowledge and confidence assessments. The learning objectives of the ACLS module addressed all four domains of the Center for the Advancement of Pharmacy Education (CAPE) Outcomes.18 Student learners employed principles of patient-centered care by using appropriate patient assessment techniques to evaluate vital signs and cardiac rhythms. They also demonstrated problem-solving skills and acted as patient advocates as they selected and prepared appropriate medications to be administered and calculated appropriate intravenous (IV) infusion rates for selected medications. They also collaborated, communicated, and practiced professionalism as they explored the pharmacist’s role in managing ACLS as part of an interdisciplinary team and applied principles learned regarding high-functioning teams.18
The lecture portion of the ACLS module was delivered at the beginning of the simulation laboratory week. Students completed an ACLS knowledge assessment immediately before simulation activities. Baseline confidence assessments were submitted prior to the simulation as well. Following the ACLS simulation, confidence and knowledge were also assessed. The confidence and knowledge assessment questions evaluated students across eight investigator-designed ACLS competence domains similar to competence domains/instruments used in previous studies.1,2,5 All of the assessments used in the study were investigator-generated.
These competence domains, listed in Table 1, represent pharmacist skills of vital importance during ACLS scenarios: identifying arrhythmias, identifying first- and second-line pharmacotherapy options, calculating medication doses (intravenous infusion rates), determining the route of medication administration, identifying code cart contents, identifying appropriate interprofessional communication styles, and knowing the primary role of the pharmacist in ACLS scenarios. The pre/postsimulation confidence and knowledge assessment questions were mapped to these domains for comparison. While the confidence assessment questions were the same in both assessments, the knowledge assessment questions differed to prevent rote memorization or recall from being a confounding factor.
Table 1. Student Confidence and Knowledge ACLS Competence Domain Map
Students’ confidence before and after the ACLS simulation was assessed with a 5-point Likert scale (1=strongly disagree to 5=strongly agree). The cumulative confidence score across all eight domains could range from 8 to 40. Each confidence question corresponded to a knowledge question representing the same ACLS assessment domain. Each knowledge question was scored “1” if answered correctly according to the answer key and “0” if incorrect, so cumulative pre/postsimulation knowledge scores could range from 0 to 8. Students received an overall participation grade for the ACLS simulation and the presimulation confidence survey, but no course grades were specifically allocated to the prelaboratory or postlaboratory knowledge assessments or the postsimulation confidence assessment, so participation in those was encouraged but not required.
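To make the scoring scheme concrete, the minimal sketch below (written in Python purely for illustration; the study itself involved no programming of this kind) tallies one hypothetical student's responses exactly as described above: eight Likert ratings summed to a cumulative confidence score between 8 and 40, and eight dichotomously scored knowledge items summed to a score between 0 and 8.

N_DOMAINS = 8  # the eight investigator-designed ACLS competence domains

def cumulative_confidence(likert_ratings):
    """Sum eight 1-5 Likert ratings; possible range is 8-40."""
    assert len(likert_ratings) == N_DOMAINS
    assert all(1 <= rating <= 5 for rating in likert_ratings)
    return sum(likert_ratings)

def cumulative_knowledge(answers, answer_key):
    """Score each of eight knowledge items 1 if correct, 0 if not; range is 0-8."""
    assert len(answers) == len(answer_key) == N_DOMAINS
    return sum(1 if given == correct else 0 for given, correct in zip(answers, answer_key))

# Hypothetical student: rates 4 on every confidence item (score 32) and answers
# six of the eight knowledge items correctly (score 6).
print(cumulative_confidence([4] * 8))  # 32
print(cumulative_knowledge(["A", "B", "C", "D", "A", "B", "A", "A"],
                           ["A", "B", "C", "D", "A", "B", "C", "D"]))  # 6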
Primary endpoints of the study were changes in pre/postsimulation knowledge and confidence assessment scores. Secondary endpoints included an evaluation of the relationship between student confidence and ACLS knowledge, before and after the simulation, as well as the relationship between time spent studying for the simulation experience with confidence and knowledge before the experience. The correlation between changes in confidence and changes in knowledge following the simulation was also assessed. A post hoc analysis was conducted to assess between-campus differences in confidence and knowledge changes from baseline. Pre/postsimulation confidence levels were compared using the Wilcoxon signed rank test to evaluate differences in these ordinal (Likert scale) data.
A McNemar test was used to compare the differences in students’ responses on the knowledge assessments. A paired t test was used to analyze the cumulative student knowledge data, which were continuous. Pearson’s correlation analysis was used for all correlation analyses. Between-campus differences were assessed using analysis of variance (ANOVA) and the Tukey multiple comparison procedure for data with equal variances; in the case of unequal variances, Welch’s ANOVA and the Games-Howell procedure were used. Statistical analyses were performed using SAS for Windows, v9.4 (SAS Institute Inc., Cary, NC) or SPSS, v23 (IBM, Armonk, NY).
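As a rough illustration of this analysis plan, the sketch below reproduces the same families of tests in Python (scipy and statsmodels) rather than the SAS/SPSS actually used. All values shown are randomly generated placeholders, not study data, and the Games-Howell procedure is only noted in a comment because it is not part of these two libraries.

import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import mcnemar
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
n = 177  # class size; in the study slightly fewer students completed each assessment

# Placeholder cumulative scores (confidence 8-40, knowledge 0-8)
pre_conf = rng.integers(8, 41, size=n)
post_conf = np.clip(pre_conf + rng.integers(-2, 6, size=n), 8, 40)
pre_know = rng.integers(0, 9, size=n)
post_know = np.clip(pre_know + rng.integers(-1, 3, size=n), 0, 8)

# Wilcoxon signed rank test for the paired, ordinal (Likert-based) confidence scores
w_stat, w_p = stats.wilcoxon(pre_conf, post_conf)

# McNemar test for paired correct/incorrect responses on a single knowledge item
# (hypothetical 2x2 table of pre- vs postsimulation correctness counts)
item_counts = np.array([[120, 15],
                        [30, 12]])
mcnemar_result = mcnemar(item_counts, exact=True)

# Paired t test for the cumulative knowledge scores, treated as continuous
t_stat, t_p = stats.ttest_rel(pre_know, post_know)

# Pearson correlation, e.g., presimulation confidence vs presimulation knowledge
r, r_p = stats.pearsonr(pre_conf, pre_know)

# Between-campus comparison of knowledge change: one-way ANOVA with Tukey HSD when
# variances are equal (Welch's ANOVA and Games-Howell, used for unequal variances,
# are available in statsmodels' anova_oneway and third-party packages such as pingouin)
campus = np.repeat(["Columbia", "Greenville", "Charleston"], [91, 20, 66])
know_change = post_know - pre_know
f_stat, f_p = stats.f_oneway(*(know_change[campus == c] for c in np.unique(campus)))

print(w_p, mcnemar_result.pvalue, t_p, r, f_p)
print(pairwise_tukeyhsd(know_change, campus).summary())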
EVALUATION AND ASSESSMENT
One hundred seventy-seven third-year pharmacy students across all three SCCP campuses (91 students from Columbia, 20 students from Greenville, and 66 students from Charleston) were included in this study. Of those, 167 completed the presimulation confidence survey (94.4% response rate), 176 completed the postsimulation confidence survey (99.4% response rate), 166 completed both the pre- and postsimulation confidence surveys (93.8% response rate), and 176 (99.4%) completed the pre/postsimulation knowledge assessments and indicated the amount of time spent preparing for the laboratory.
As shown in Table 2, there was a significant improvement in cumulative student confidence scores in the postassessment compared to the preassessment (mean score: 34.1 post vs 32.0 pre, p<0.0001). Within each of the eight competence domains, there was also a significant improvement in student confidence (p<0.05), with the exception of arrhythmia recognition and selection of the route of medication administration, in which there were numeric, but not significant, increases (p=0.20 and p=0.06, respectively). There was a significant improvement in cumulative student knowledge assessment scores in the postassessment compared to the preassessment (mean raw score: 6.4 post vs 6.1 pre, mean percentage score: 80% post vs 76.3% pre, p=0.005).
Table 2. Changes in Student Confidence and Knowledge
The effect size of this change was estimated using Cohen’s d, which was calculated to be 0.26. There were significant improvements in the number of students providing correct answers for four out of eight competence domains on the postsimulation knowledge assessment. Student knowledge regarding identification of common code cart contents (p=0.0063), most appropriate route of administration of medications (p<0.0001), identification of the primary role of the pharmacist (p<0.0001), and identification of appropriate interprofessional communication styles in ACLS scenarios (p=0.0007) all improved significantly after the experience.
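The article does not state which formulation of Cohen's d was used for the knowledge change reported above. A common choice for paired designs, sketched below in Python with hypothetical numbers, divides the mean of the paired differences by the standard deviation of those differences.

import numpy as np

def cohens_d_paired(pre, post):
    """Effect size for paired scores: mean of the differences / SD of the differences."""
    diffs = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    return diffs.mean() / diffs.std(ddof=1)

# For reference, a mean gain of 0.3 raw points (6.1 -> 6.4 on the 8-point knowledge
# scale) with a difference SD of roughly 1.15 would give d of about 0.26, the value
# reported above. Tiny usage example with made-up scores for five students:
print(cohens_d_paired([6, 7, 5, 6, 7], [7, 6, 6, 7, 7]))  # ~0.45 for these made-up data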
There were no significant changes in the other domains, including first-line pharmacotherapy selection and dose (IV medication infusion rate) calculation (p>0.05 for both domains). In contrast to the confidence assessment, however, there was a decline in three of the individual competence domains of the student knowledge assessment after the ACLS simulation, and for two of the three domains the decline was significant. Arrhythmia recognition is one such example, with a significant decrease in the percentage of students answering correctly on the postsimulation knowledge assessment compared with the presimulation assessment (78% post vs 90% pre, p=0.0014). A similar pattern was observed for the question related to identification of second-line pharmacotherapy options (60% post vs 86% pre, p<0.0001).
In the postsurvey, students ranked the most important learning points from the simulation experience by competence domain. Rankings indicated that the most valuable lessons learned during the simulation were selection of first-line pharmacotherapy agents during ACLS scenarios, followed by cardiac arrhythmia recognition, second-line pharmacotherapy agent selection, code team communication strategies, code team roles and responsibilities, code cart contents, dosing calculations, and medication administration route selection.
Students’ presimulation confidence and knowledge were not significantly correlated (r=-0.031, p=0.70) (Table 3). Similarly, there was no significant correlation between postsimulation confidence and postsimulation knowledge (r=0.13, p=0.092) or between changes in confidence and changes in knowledge from baseline (r=-0.11, p=0.18). On the other hand, a significant, but weak, positive correlation between presimulation studying and presimulation confidence was discovered (r=0.22, p=0.0052). There was also a significant, but weak, positive correlation between presimulation studying and presimulation knowledge (r=0.22, p=0.0035).
Table 3. Correlations Between Knowledge, Confidence, and Studying
Because the ACLS simulation could not be conducted in a completely identical manner across the three campuses, a post hoc analysis was conducted to evaluate intercampus differences in the changes in confidence and knowledge from before to after the simulation. The results of this analysis are presented in Table 4. Although there was no significant difference in cumulative knowledge change between the two largest campuses (representing 89% of the study population), there was a significant difference in mean knowledge increase between the campuses with the highest and lowest baseline knowledge assessment scores (p=0.028). The only individual knowledge domain with significant intercampus differences in change from baseline was the domain addressing interprofessional communication styles; all other changes from baseline within the individual knowledge domains were similar between campuses. There was also a significant difference between the two largest campuses in cumulative confidence change across all domains from before to after the simulation. As shown in Table 4, several of the individual confidence domains contributed to this overall intercampus difference in confidence change from baseline.
Table 4. Campus-Specific Changes in Confidence and Knowledge
DISCUSSION
This study adds to the literature addressing the use of high-fidelity simulation for ACLS training with pharmacy students and demonstrates the benefit of complementing didactic ACLS instruction with high-fidelity simulation. Although published data demonstrate improved ACLS simulation-related student confidence, the data are conflicting regarding an overall improvement in knowledge.2-4 Our study shows that overall student confidence, as well as confidence in most individual competence domains, significantly improved after the ACLS simulation training. The lack of significantly improved confidence in recognizing arrhythmias and identifying the most appropriate medication administration route could reflect a lack of emphasis in these areas by the laboratory facilitators. Though the facilitators were given teaching points to emphasize during the debriefing sessions, the discussions were sometimes limited by time. Additionally, one would perhaps not expect student confidence in arrhythmia recognition to improve significantly after only one laboratory session, given how challenging rhythm identification can be even for physicians who frequently identify cardiac rhythms in their clinical work.19
The cumulative student knowledge assessment scores significantly improved after the high-fidelity simulation experience, indicating a benefit to student learning beyond that gained from the ACLS-focused didactic lecture. However, the improvement was driven by only half of the competence domains. The significant improvements in team roles, communication, and interprofessional communication are not surprising; these learning opportunities can likely be explored more thoroughly in simulation and experiential learning settings than in didactic settings. While the cost-efficacy of simulations vs APPEs has not been formally evaluated, over several years our program has paid an average of approximately $10-$15 per hour of student simulation experience, compared with the approximately $1.50-$3.00 per hour of student experience that many experiential programs pay for clinical APPEs. As more studies are conducted in this area, it will be interesting to evaluate the cost-efficacy of simulations compared with experiential education and whether the added financial investment is warranted.
Conversely, student performance was significantly worse postsimulation for arrhythmia recognition, even though the student confidence scores improved numerically, but nonsignificantly, from before to after simulation. Bingham and colleagues demonstrated a similar trend even after multiple simulations.4 The students in their study did not improve their ability to correctly identify dysrhythmias, with a nonsignificant 10% decrease in rate of recognition. Thus, the results of this study and our study may reflect the inherently difficult nature of arrhythmia identification.
There was also a nonsignificant decline in student knowledge regarding first-line pharmacotherapy selection and a significant decline in student knowledge postsimulation regarding second-line pharmacotherapy selection. Perhaps this is because identifying optimal ACLS pharmacotherapy can be complex. The lack of improvement in these areas could also be a result of student perceptions of the applicability of the simulation to their projected career path or the fact that each student did not have the opportunity to participate in each team role.
In the postsurvey, the three domains students ranked with highest average learning value (ranks 1, 2, and 3) were those in which their postsimulation knowledge actually declined. Of note, student self-confidence scores increased in all three of these domains. These factors suggest that student self-perceptions regarding learning and ability may not accurately reflect actual increases in knowledge. This is further supported by the negative, albeit nonsignificant, relationship between changes in confidence and knowledge from baseline. While this points to the potential need for improvements in student self-awareness, a single simulation experience does not appear to offer a solution. Additional studies would be required to assess the impact of simulation experiences on student self-awareness, to measure long-term retention of any acquired skills and knowledge, and to evaluate the sustained impact of training among licensed pharmacy practitioners who were previously exposed to ACLS simulation.
The lack of correlation between presimulation confidence and knowledge, postsimulation confidence and knowledge, and changes in confidence and knowledge is not surprising given the subjective nature of self-assessment. In the medical literature, students do not accurately assess their own abilities.20 Students tend to overestimate their abilities prior to educational programming, which may decrease the likelihood of detecting a positive intervention effect.21 Additionally, underreporting of real change on self-reported measures may occur because of the confounding factor of response shift bias.21 Furthermore, a lack of congruence between self-confidence and academic performance has been observed in didactic coursework as well as in clinical skills-based assessments.22,23 Confidence ratings may therefore not be a good predictor of knowledge retention. Although confidence was not associated with knowledge improvement, it remains an essential component of a pharmacist’s professional development and of the ability to apply knowledge and skills effectively.
While we anticipated that using different knowledge assessment questions before and after the ACLS simulation to assess each of the eight ACLS competence domains would be a strength of the study, it may also represent a limitation, as an unintended difference in question difficulty could have affected student performance. This may explain the two competence domains (arrhythmia recognition and second-line pharmacotherapy selection) in which the percentage of students answering the postsimulation knowledge question correctly decreased significantly compared with the presimulation assessment.
The campus with the highest baseline cumulative knowledge score actually experienced a decline in cumulative knowledge scores postsimulation, which provides further evidence that the postsimulation knowledge assessment questions may have been appreciably more difficult than the baseline questions. This difference in difficulty may also explain why we observed only a small to moderate practical difference between postsimulation and presimulation knowledge. Having students rate the difficulty of the paired knowledge assessment questions used to test the eight ACLS competence domains before and after the simulation would have provided information regarding variability in question difficulty.
Another study limitation resulted from the structure of the course: approximately half of students took the presimulation confidence assessment prior to lecture and the other half took it between lecture and the simulation, introducing the possibility that survey respondents’ confidence could have varied at baseline, based on exposure to the lecture. However, as there was no significant difference between the presimulation confidence levels of those taking the presimulation confidence assessment prior to lecture and those responding between lecture and simulation (p=0.21), it does not appear that exposure to the lecture significantly impacted presimulation confidence. Because all students completed the presimulation confidence assessment at some point prior to the simulation experience, we believe the changes in confidence occurring before and after simulation are primarily attributable to the simulation experience.
The weak but significant correlation between presimulation preparation time and both presimulation confidence and presimulation knowledge assessment performance is encouraging. A positive correlation would be expected, but we predicted the correlation to be stronger. Recall bias could have affected the accuracy of reported preparation time, or students may have overestimated it, even though the survey emphasized that accuracy was most important and that recording zero study time prior to the ACLS simulation experience was acceptable.
Although the intercampus differences in confidence and knowledge changes from baseline observed in the post hoc analysis cannot be causally linked to the unavoidable logistical and laboratory facilitator differences among the three campuses, the two may be related. For example, only Columbia campus students had the opportunity to practice drawing up medications during the simulation, which may explain why their confidence changes from baseline were larger in the route of administration selection and dose calculation competence domains than at the other campuses, which were unable to provide this experience. It is not surprising that several of the between-campus confidence differences occurred in domains related to interprofessional communication and team roles, since only the Columbia and Greenville campuses were able to include medical student participation in their simulations. Despite these intercampus differences, the persistence of an overall positive change in pharmacy student confidence and knowledge from baseline in our aggregate analysis of all three campuses supports the external validity of the study findings.
Because student confidence is a subjective parameter and student knowledge is the lowest level of Bloom’s Taxonomy of Learning,24 future directions of this study include evaluating the impact of our ACLS simulation experience on higher levels of learning, such as comprehension and application. To assess comprehension, we plan to evaluate student retention of principles learned in the ACLS simulation laboratory through repeated exposure. To assess application, we plan to evaluate student skills in ACLS simulation laboratory experiences using a skill-based rubric in the future.
SUMMARY
A high-fidelity ACLS laboratory simulation, consisting of team role training and high-fidelity ACLS simulation scenarios followed by debriefing, led to a significant improvement from baseline in overall student confidence and knowledge assessment scores. However, there appeared to be a disconnect between improvements in student confidence and student knowledge: confidence scores increased while knowledge scores decreased in three of the eight competence domains, with the decline being significant in two. Neither presimulation nor postsimulation confidence correlated significantly with performance on the corresponding knowledge assessments, and there was no significant correlation between changes in confidence and changes in knowledge from baseline. Time spent studying before the laboratory was significantly and positively correlated with presimulation confidence and knowledge, so in our sample it appeared to be a better predictor of student knowledge than confidence was. Taken together, these findings suggest that student perceptions about learning do not always reflect objectively assessed knowledge improvements and highlight the need to develop self-awareness among pharmacy students. Further studies are needed to assess the impact of ACLS simulation laboratory experiences on higher levels of learning, including knowledge retention through simulation repetition and knowledge application through skill assessment.
Received July 31, 2015. Accepted November 18, 2015. © 2016 American Association of Colleges of Pharmacy