Abstract
Objective. To examine how the intrasemester sequencing of a simulation component, delivered during an ambulatory care introductory pharmacy practice experience (IPPE), affects student performance on a series of 3 assessments administered during the second professional (P2) year.
Design. At the Jefferson College of Pharmacy (JCP), P2 student pharmacists were randomly assigned to 6 weeks of simulation activities, followed by 6 weeks on site at an ambulatory care clinic or vice versa during either the fall or spring semesters. At the end of each semester, these students completed 3 skills-based assessments: answering a series of drug information (DI) questions; conducting medication adherence counseling; and conducting a medication history. The 2 groups’ raw scores on assessment rubrics were compared.
Assessment. During academic years 2011-2012 and 2012-2013, 180 P2 student pharmacists participated in the required ambulatory care IPPE. Ninety experienced simulation first, while the other 90 experienced the clinic first. Students assessed over a 2-year time span performed similarly on each of 3 skills-based assessments, regardless of how simulation experiences were sequenced within the IPPE.
Conclusion. The lack of significant difference in student performance suggests that schools of pharmacy may have flexibility with regard to how they choose to incorporate simulation into clinical ambulatory care IPPEs.
INTRODUCTION
The Accreditation Council for Pharmacy Education (ACPE) requires that almost one-third of the doctor of pharmacy (PharmD) curriculum consist of experiential education, which is delivered in introductory pharmacy practice experiences (IPPEs), followed by advanced pharmacy practice experiences (APPEs), the latter traditionally delivered in the final year of a pharmacy program. Unlike APPEs, which have been in place for many years, IPPEs have been implemented differently by colleges and schools of pharmacy as their needs dictate and curricula allow. Regardless of how a school incorporates IPPEs into its curriculum, ACPE requires that students participate in them for a minimum of 300 hours.1
While implementation of IPPEs differs, schools of pharmacy have likely faced similar challenges. First and foremost is the challenge of developing and maintaining capacity to meet the needs of student placement. This includes accounting for the number of sites, monitoring student-to-preceptor ratios, and maintaining the proper diversity of sites to meet accreditation requirements.2-5 Along with these challenges, experiential directors must be vigilant about “preceptor burn-out,” which could be exacerbated by the need for sites and preceptors to provide students with experiences at both the IPPE and APPE levels. Identifying sites and preventing preceptor burn-out can be particularly challenging when multiple pharmacy schools are using the same preceptor pool.2 Schools of pharmacy also are tasked with fitting IPPEs into the curriculum as seamlessly as possible, which may be a significant scheduling challenge.
As of 2011, ACPE allows pharmacy schools to use structured simulation to help meet IPPE goals and objectives. Simulation is defined by ACPE as, “an activity or event replicating pharmacy practice.” In the accreditation standards, ACPE further states, “For the purpose of satisfying introductory pharmacy practice experience expectations, simulation may include use of high-fidelity manikins, medium-fidelity manikins, standardized patients, standardized colleagues, role play, and computer-based simulations. Simulation as a component of introductory pharmacy practice experiences should clearly connect the pharmacy activity or delivery of a medication to a patient (whether simulated patient, standardized patient, or virtual patient).”1
Simulation can account for up to 60 hours of the 300-hour requirement.1 It provides the opportunity for students to participate in controlled activities without the variation naturally present in real-life environments.2 This control permits a certain level of consistency to be built into an IPPE. Although ACPE endorses simulation as a viable component of IPPEs, little published evidence addresses its effectiveness as an educational component of an IPPE in pharmacy curricula.2 A recent survey assessed the status of simulation-based teaching methodologies (referring to the use of high-fidelity manikins and standardized patients) in US schools of pharmacy. Seventy-four of the 88 schools participating in the survey reported using simulation. Several of the 14 schools that reported not using simulation did report using role play (n=12), partial-task trainers (n=7), and low-fidelity manikins (n=6). Only 29.7% of the schools participating in the survey reported using simulation for IPPEs.6
Two schools of pharmacy examined the effects of participation in simulation activities on knowledge and skills. In 2010, the University of Missouri-Kansas City School of Pharmacy described incorporating 9 hours of high-fidelity simulation involving 3 acute care cases into a longitudinal clinical IPPE that students on a satellite campus were completing in the fourth year of a 5-year professional program.7 Twenty-eight students participated in the simulation activities, and 27 completed all accompanying assessments. Students completed a quiz prior to participating in each simulation session and again afterward to assess whether participation improved their knowledge. Scores on the postsimulation quizzes demonstrated significant improvement in knowledge. Students also completed a 3-month follow-up quiz, which was identical to the postsimulation quiz. Students who participated in the simulation experiences scored significantly higher on the follow-up quiz than students from the school’s main campus who did not participate in the simulation experiences.7
Additionally, California Northstate University College of Pharmacy described creating a 3-week, 60-hour simulation-based IPPE designed to teach and assess pre-APPE core domains as identified by ACPE. The activities designed for the course included a model community pharmacy, a model hospital pharmacy, standardized patients, and high-fidelity simulation. Twenty-eight students participated in the simulation IPPE, and 60 students participated in traditional IPPEs and served as the control group. Students enrolled in simulation performed significantly better on quizzes taken after each of the simulation experiences compared with those taken before. Simulation IPPE students and traditional IPPE students participated in a practical examination, and scores from the 2 groups were compared. The practical examination consisted of a checklist and assessed student performance on 10 pre-APPE core domains. Simulation IPPE students scored significantly higher on the observer checklists than students in the control arm (52% vs 44%). The authors concluded that the simulation IPPE students were better prepared for the examination, but also noted that the simulation IPPE required a great deal of time and resources.8
The Jefferson College of Pharmacy (JCP) uses simulation as a component of an ambulatory care IPPE. During the second professional (P2) year, students assigned to the required ambulatory care IPPE are randomized to a 6-week simulation curriculum followed by a 6-week in-clinic experience, or to the reverse sequence. All ambulatory care IPPE students complete 3 assessments during the last week of the semester. The aim of our work was to assess whether this intrasemester sequencing of the ambulatory care IPPE had an effect on student performance on the assessments.
DESIGN
Jefferson is a private institution offering a 4-year PharmD curriculum for students who have completed required prerequisite coursework. The JCP curriculum includes 6 IPPEs delivered over the course of the first 3 professional years (Table 1). The decision to use simulation exercises for the ambulatory care IPPE (as opposed to the other 5 IPPEs) was based on the pragmatic need to prioritize use of available ambulatory care practice sites to conduct required ambulatory care APPE rotations without causing preceptor/clinic burn-out or overburdening a site with too many students.
Required Introductory Pharmacy Practice Experiences (IPPEs) at Jefferson College of Pharmacy
The Dr. Robert and Dorothy Rector Clinical Skills and Simulation Center at Thomas Jefferson University (TJU) was available to provide simulation support for teaching and assessment activities to all of the programs at TJU. The center has dedicated faculty members and staff to implement programs, including those involving high-fidelity and low-fidelity models and standardized patients. Although college of pharmacy faculty members work closely with the staff at the center to provide simulation experiences in several other courses, these resources were not used for ambulatory care simulation because of the significant cost of using the center on a weekly basis and the need to share the capability and availability of the center with all of the professional programs on campus. Lastly, consensus among ambulatory care faculty members was that the broad definition of “simulation” provided by ACPE enabled the design of activities that could be carried out in the computer laboratory and model pharmacy that make up JCP’s Pharmacy Practice Simulation Center.
During the summer of 2011, each ambulatory care faculty member participated in a series of meetings, and consensus was established regarding the specific foundational knowledge and skills that should be practiced during simulation in an ambulatory care IPPE. The knowledge and skills identified were based on objectives outlined in the IPPE course syllabus, and ambulatory care faculty members agreed they were applicable across practice sites. All ambulatory care faculty members collaborated to develop and provide peer review for each week’s simulation activities.
From fall 2011 through spring 2013, the ambulatory care IPPE consisted of two 6-week experiences: an ambulatory care simulation experience and an on-site ambulatory care clinic experience. During the assigned semester, students were randomly assigned to experience simulation first or clinic first, and then switched to the other experience for the remaining 6 weeks (Figure 1). During the simulation, P2 students were led through a series of activities delivered by adjunct pharmacy instructors in JCP’s Pharmacy Practice Simulation Center. The adjunct instructors hired were licensed pharmacists experienced in a variety of outpatient pharmacy practice settings, including community, health-system, and ambulatory care. All adjunct instructors were supervised by a faculty member who had completed a postgraduate year 2 (PGY-2) ambulatory care pharmacy residency.
Randomization of Introductory Pharmacy Practice Experience (IPPE) students during the fall semester of the second professional year (P2).
a Second-year students switch IPPE assignment during the spring semester.
All simulation exercises were conducted individually or in small groups. Simulation activities pertained to cultural competency, “subjective, objective, assessment, and plan” (SOAP) note writing, responses to drug information questions (ie, the “curbside consult”), patient counseling, access to care (eg, navigating insurance formularies, finding medication coupons, navigating patient assistance programs), and transitions in care. A list of activities, as they relate to these topics, can be found in Table 2.
Summary of Ambulatory Care IPPE Simulation Activities
For the onsite clinic experience, students were assigned to a preceptor at an ambulatory care practice site. Under the preceptor’s supervision, students participated in pharmacist activities at the site. For both the simulation experience and clinic experience, students were evaluated with a common rubric on professionalism, development of knowledge and skills, written communication, verbal communication, organization and time management, self-directed learning and initiative, reasoning, and problem solving. The adjunct instructors and preceptors indicated whether students were below expectations, met expectations, or exceeded expectations on parameters related to these assessment areas.
During the final week of the semester, when both groups had completed both portions, all students completed 3 assessments, which are summarized in Table 3. The assessments included answering drug information (DI) questions, completing a medication history, and conducting medication adherence counseling. Using the Rector Center was considered for administering the assessments, but because the decision to incorporate simulation into the ambulatory care IPPE was made after the school budget was approved, it was deemed cost prohibitive to do so. Instead, student pharmacists in their third and fourth professional years (P3 and P4) acted as patients for IPPE students to interact with during the medication history and medication adherence assessments. The P3 and P4 students were familiar with simulating patients and were trained by the investigators a week before the assessments.
Summary of Skills-based Assessments for the Ambulatory Care Introductory Pharmacy Practice Experience (IPPE)
Training methods were modeled after the procedures used by the standardized patient trainers in the Rector Center. Training was conducted as a group and included a script read-through of the case, followed by verbally quizzing each simulation patient on case elements. Next, the simulation patients were led through a reading and discussion of the checklist items used to grade the ambulatory care IPPE students, taking time to ensure that all simulation patients understood key definitions and interpreted each checklist item the same way. The last part of the training session involved each simulation patient role-playing with the instructor (who played the role of the IPPE student) while the other simulation patients observed. After each role-play encounter, the instructor walked the group through the checklist and discussed how to grade each item based on the role-play to ensure reliability in the assessment.
The P2 students were not informed about the nature of each assessment until shortly before the assessment was scheduled to begin. At that point, a sign was placed at each assessment station that contained pertinent information. To ensure academic integrity, students were required to sign a confidentiality form meant to limit information transmitted between student groups. Students had 15 minutes to complete each assessment, and a faculty member coordinated the activity to ensure that timing was monitored. Students who finished their assessment in less than the time allotted were instructed to stay at their station. After the 15 minutes allowed for each assessment had elapsed, 5 minutes were allocated for students to transfer to the next assessment station. During this 5-minute transfer time, the simulation patients conducted their grading via a nominal (yes/no) checklist for the medication adherence and medication history stations (see Table 3 for an example of checklist items). Drug information responses were graded by the same faculty member over the 2-year time span, according to a rubric designed for each DI question (see example in Table 3).
For each of the assessments in this course, a rubric score of greater than or equal to 73% was considered passing. This score was chosen based on the JCP passing standard and was consistent with the passing score used for similar assessments throughout the curriculum during the time period of the study.
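For clarity, the arithmetic behind this passing standard can be expressed as a short sketch. The block below is not the authors' code; it simply applies the 73% cutoff to each assessment's maximum rubric points (taken from the results reported later), and the assumption that the cutoff is applied directly to the raw total without rounding is ours.

```python
# Minimal sketch (not the authors' code): applying the 73% passing standard
# to each assessment's raw rubric total. Maximum point values are taken from
# the reported results; the no-rounding assumption is illustrative.
PASSING_FRACTION = 0.73

MAX_POINTS = {
    "drug_information": 15,
    "medication_adherence": 17,
    "medication_history": 20,
}

def is_passing(assessment: str, raw_score: float) -> bool:
    """Return True if the raw rubric score meets the 73% passing standard."""
    return raw_score / MAX_POINTS[assessment] >= PASSING_FRACTION

if __name__ == "__main__":
    for name, max_pts in MAX_POINTS.items():
        print(f"{name}: need >= {PASSING_FRACTION * max_pts:.2f} of {max_pts} points")
```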
The study aimed to evaluate whether simulation and clinic sequencing had any effect on student performance on the assessments. To test the null hypothesis that there would be no difference in student performance regardless of intrasemester sequencing of the ambulatory care IPPE, we employed a retrospective, case-control design. Students had already been randomized into 2 groups based on experiencing ambulatory care IPPE simulation first or spending time in an ambulatory care clinic first.
Because all students completed the assessments at the end of the semester, the primary outcome of the evaluation was the comparison of the 2 groups’ mean scores on the assessments. Additionally, a comparison of mean scores for student cohorts in the fall and spring semesters was conducted. All end-of-semester assessment scores from academic years 2011-2012 and 2012-2013 were collected for analysis. Two-sided t tests were used to assess continuous data. A chi-square test was used to assess nominal data. Statistical analysis was performed using SPSS, v19 (IBM, Somers, NY). The study was reviewed and given exempt status by the TJU Institutional Review Board.
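As a hedged illustration of the continuous-data comparison described above, the sketch below runs a two-sided independent-samples t test in SciPy rather than SPSS; the score arrays are randomly generated placeholders (seeded for reproducibility), not the study data.

```python
# Illustrative sketch of the continuous-score comparison; SciPy stands in for
# SPSS here, and the score arrays below are placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical drug information scores (out of 15) for two groups of 90 students.
simulation_first = rng.normal(loc=9.1, scale=2.0, size=90).clip(0, 15)
clinic_first = rng.normal(loc=9.5, scale=2.0, size=90).clip(0, 15)

# Two-sided independent-samples t test on the raw rubric scores.
t_stat, p_value = stats.ttest_ind(simulation_first, clinic_first)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```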
EVALUATION AND ASSESSMENT
One hundred eighty P2 student pharmacists completed all 3 assessments, with 90 students having experienced simulation first and 90 students having experienced the ambulatory clinic first. Respective mean scores for students in the simulation-first group and clinic-first group were: 9.1 and 9.5 out of 15 possible points for the drug information assessment; 14.4 and 14.2 out of 17 possible points for the medication adherence assessment; and 16.2 and 16.4 out of 20 possible points for the medication history assessment. No significant differences were observed in any of the assessment scores between the groups (Table 4). The percentages of students achieving a passing score of ≥73% on the DI, medication adherence, and medication history assessments were 38.8%, 71.1%, and 76.6%, respectively, for the simulation-first group and 35.5%, 76.6%, and 82.2%, respectively, for the clinic-first group.
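A similar sketch can illustrate the nominal (pass/fail) comparison for the DI assessment. The counts below are reconstructed approximately from the reported percentages (38.8% and 35.5% of 90 students per group) and are shown only to demonstrate the chi-square calculation, not to reproduce the study's analysis.

```python
# Sketch of the nominal pass/fail comparison for the DI assessment. Counts are
# approximate reconstructions from the reported percentages, for illustration only.
from scipy import stats

n_per_group = 90
sim_first_pass = round(0.388 * n_per_group)     # ~35 students
clinic_first_pass = round(0.355 * n_per_group)  # ~32 students

contingency = [
    [sim_first_pass, n_per_group - sim_first_pass],        # simulation first: pass, fail
    [clinic_first_pass, n_per_group - clinic_first_pass],  # clinic first: pass, fail
]
chi2, p_value, dof, expected = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```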
Assessment Results Comparing Simulation First and Clinic First Experiences
When assessed by semester, students in the fall achieved mean scores of 8.6, 15, and 16.5 on the DI, medication adherence, and medication history assessments, respectively. Students in the spring semester achieved mean scores of 10, 13.7, and 16.3 on the DI, medication adherence, and medication history assessments, respectively. Students at the end of the spring semester performed significantly better on the DI assessment, and students at the end of the fall semester performed significantly better on the medication adherence assessment. No significant difference was observed between semesters for the medication history assessment. Assessment results, by semester, comparing simulation-first to clinic-first cohorts can be found in Table 5.
Assessment Results Comparing Simulation First and Clinic First Experiences by Semester
DISCUSSION
Students assessed over a 2-year time span performed similarly on all 3 assessments regardless of how simulation experiences were sequenced within their ambulatory care IPPE. While spring semester students performed better than fall semester students on the DI assessment, the opposite was observed for the medication adherence assessment, and no difference was observed between semesters for the medication history assessment. The lack of significant difference in student performance within semesters, combined with a lack of net difference in performance between semesters, suggests that we succeeded in fairly and pragmatically incorporating ambulatory care IPPE simulation into the curriculum.
After we decided to add simulation exercises and assessment to the ambulatory care IPPE, faculty members focused on designing simulation activities and assessments that would best use resources and funds. We hope the data presented, in the context of our curriculum, enable other schools with similar characteristics to pursue adding simulation and its assessment to their IPPE curriculum. Having flexibility regarding simulation sequencing may help institutions incorporate IPPE ambulatory care experiences and meet their experiential education requirements. While we chose to target ambulatory care, the model could be used for an IPPE focusing on a different area of practice and could also feature or integrate other types of simulation, such as high-fidelity manikins or rigorously trained standardized patients, as an institution’s financial resources allow.
As we continue to evaluate and revise this experience, a few areas could be improved or expanded. Overall, students did not perform well on the drug information assessment. First, we hypothesized that there may have been too many questions for the time allotted, as many students struggled to finish. Second, P2 students take their drug information course during the fall of the P2 year (at the same time that half of the class is completing this IPPE) and then continue to complete drug information questions during their IPPEs as well as during a spring laboratory course. Thus, this assessment may not be commensurate with their level of training.
Another potential limitation of our study was the use of P3 and P4 student pharmacists in place of true standardized patients. While using P3 and P4 student pharmacists as simulation patients might enable other schools to replicate or adapt our IPPE simulation methods, the substitution of student pharmacists for true standardized patients has not been validated. Our faculty members aimed to mitigate this limitation by training the P3 and P4 students in the same manner that true standardized patients are trained at our clinical skills and simulation center. To evaluate the quality of this simulation assessment, we would like to compare the interrater reliability of P3 and P4 student pharmacists with that of true standardized patients and faculty members. Additionally, we would like to incorporate a formal debriefing and feedback session for students taking the IPPE. Incorporating formalized debriefing and feedback components would create a more meaningful and enriching learning experience for the students.9
While our study helps augment research using simulation in IPPEs, it also raises questions requiring further study. For example, even though there was no difference in overall performance on assessments, it would be useful to know if differences exist between rubric areas (eg, communication items, professionalism). Finally, other studies examining the use of simulation in the experiential setting have assessed student perceptions of confidence, as well as preparedness and satisfaction. These are elements we may also consider formally assessing beyond a traditional course evaluation to help determine the overall efficacy of simulation exercises.7, 8
CONCLUSION
Schools of pharmacy may have difficulty finding an adequate number of ambulatory care sites/preceptors for both IPPEs and APPEs. Our data and experience support being flexible when incorporating simulation into IPPEs and may help schools of pharmacy more comfortably discuss the idea of using a hybrid simulation/practice site model for an IPPE. We encourage institutions that are “thinking outside the box” regarding their IPPE simulation and sequencing methodologies to assist in providing more insight into this advancing area.
ACKNOWLEDGMENTS
The authors would like to thank Drs. Kimberly Carter and Amy Egras, Mrs. Andrea Joseph, and Drs. Jacqueline Lucey, Gerald Meyer, Cynthia Sanoski, Jason Schafer, and Elena Umland for their support, dedication, and commitment to facilitating the incorporation of simulation into experiential education.