Abstract
Objective. To assess Doctor of Pharmacy (PharmD) students’ skills and confidence in using an evidence-based medicine (EBM) approach to answer practice-based clinical questions.
Methods. Pharmacy students’ ability to provide evidence-based answers for real-world clinical questions was assessed at two time points in the PharmD curriculum using a standard tool and trained evaluators. Pharmacy students’ confidence regarding their EBM skills was self-assessed at four points in the program, with the first survey administered before the EBM sequence and the final survey administered prior to graduation. The survey included five self-assessed skill questions and nine self-confidence questions.
Results. Two hundred twenty-four students from two graduating classes were included in the analysis. Over 97% of students received passing scores on their clinical inquiries (mean score=90.4%), confirming their competency in EBM skills. Students’ survey responses on all self-assessed skill and confidence questions improved significantly from baseline to graduation.
Conclusion. Longitudinal teaching of EBM concepts and opportunities for skills practice developed PharmD students’ ability to successfully provide evidence-based answers to authentic clinical questions. This was consistent with the confidence and self-assessed skill levels students reported on surveys. Future directions include confirming students’ use and understanding of EBM concepts after graduation.
INTRODUCTION
Having the evidence-based medicine (EBM) skills to answer clinical questions is essential for pharmacy practice.1-4 The Accreditation Standards set by the Accreditation Council for Pharmacy Education (ACPE) highlight these skills as a required curricular element in Appendix 1, under “Health Information Retrieval and Evaluation.”5 Schools and colleges of pharmacy need to ensure EBM skill competency for pharmacy students to be practice ready at graduation. Practice is needed for skill development; hence, using a threaded or longitudinal approach to learning assists students with developing EBM skills over time.6-10 However, with few exceptions,6,7 most published pharmacy literature regarding EBM instruction describes only a single course, an elective course, or a standalone training experience.
The EBM model includes three components: answering clinical questions using best-available evidence, clinical judgment, and patient preference.11,12 This model is often taught alongside the steps for practicing EBM, which include: asking the question, acquiring the best-available evidence, appraising the evidence, and applying the answer in clinical practice.13 The University of Wisconsin-Madison (UW-Madison) School of Pharmacy uses a threaded approach, teaching EBM skills over a three-year period.7 This curriculum focuses primarily on the first three steps, ie, developing an evidence-based answer to a clinical question. Other aspects of EBM, such as applying evidence to patient care and considering patient values when developing care plans, are evaluated by preceptors in the experiential curriculum. The EBM curriculum had been streamlined since the previous evaluation was conducted in 2009, thereby creating a need to reassess EBM learning outcomes. The objective of this evaluation was to assess pharmacy students’ skills and confidence in using an EBM approach to answer practice-based clinical questions.
The UW-Madison School of Pharmacy offers a four-year PharmD program. The longitudinal EBM curriculum begins during the spring semester of the second professional (P2) year with the drug literature evaluation course (Figure 1). This course introduces EBM concepts, instructs students in how to critically evaluate medical literature, and teaches them biostatistics and research methods. Active-learning techniques, including journal club discussions and audience response system check-in questions, as well as evaluation of student performance through a series of article critique papers and a final examination are also included.
Figure 1. Description of Evidence-Based Medicine Curriculum and Evaluation Plan
Abbreviations: APPE=Advanced Pharmacy Practice Experiences, IPPE=Introductory Pharmacy Practice Experiences, P2=Second year of pharmacy school, P3=Third year of pharmacy school, P4=Fourth year of pharmacy school, PICO=Patient, Intervention, Comparison, Outcomes
The EBM curriculum continues in the third professional (P3) year with the introductory pharmacy practice experience (IPPE) and in the fourth professional (P4) year in advanced pharmacy practice experience (APPE) courses, centered on an authentic drug information writing exercise called a clinical inquiry. The objectives for the clinical inquiry include: developing competency in searching current medical literature, interpreting the literature with respect to the question, and crafting an answer to a question based on the best available evidence. Clinical questions are jointly determined by the student and preceptor based on real patient cases and practice site clinical needs. Initially, students create a population, intervention, comparison, outcome (PICO) framework based on their clinical question.14 After identifying and summarizing the best available primary and secondary literature, students synthesize a succinct evidence-based answer to the clinical question.
Early in the P3 year, students attend three seminars explaining the clinical inquiry assignment. The first seminar refreshes their knowledge of the PICO framework and how to develop search strategies, while the second seminar reviews how to conduct secondary literature assessments. The third seminar reviews how to conduct primary literature assessment and provides direction on crafting evidence-based answers. After the seminars, students create a clinical question in conjunction with their IPPE preceptor and submit a PICO framework based on that question for feedback prior to writing their clinical inquiry. Finally, a single clinical inquiry is submitted for evaluation by a P3 course faculty member.
During the first APPE of their P4 year, students attend a seminar in which EBM and biostatistics concepts are reviewed, which is followed by a seminar in which the clinical inquiry assignment is reviewed. Throughout their P4 year, students write five to six clinical inquiries depending on the number of elective APPEs they complete. Again, the clinical inquiry question is generated at each experiential site.
All clinical inquiries developed by pharmacy students in the P3 and P4 years are evaluated using a rubric that was created by pharmacy faculty members. There are two major sections to the clinical inquiry evaluation: problem analysis and style of presented material. The problem analysis portion accounts for 75% of the assignment score and evaluates the appropriateness of the literature selected, depth and insight of supporting information, and the student’s evidence-based answer. The remaining 25% of the score is based on the PICO framework, strength of recommendation for the evidence-based answer using the Strength of Recommendation Taxonomy (SORT),15 and meeting seven technical writing and formatting requirements. The UW-Madison School of Pharmacy has developed a companion tool for students and preceptors to use in writing and evaluating a clinical inquiry that includes the assignment objectives, instructions for assignment completion, a grading rubric, and examples that provide additional guidance.
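As a hypothetical illustration of this weighting, a clinical inquiry earning 90% of the problem analysis points and 80% of the style points would receive an overall score of (0.75 × 90) + (0.25 × 80) = 87.5%.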
METHODS
This longitudinal evaluation assessed pharmacy students’ skills and confidence in using EBM concepts based on their clinical inquiry scores at two time points and their responses on a series of four surveys (Figure 1). Students graduating in 2017 and 2018 were followed beginning with the drug literature evaluation course in the P2 year through graduation. This project was determined not to meet the federal definition of research and full institutional review board review was not required.16
Clinical inquiry evaluations included the P3 clinical inquiry and the P4 clinical inquiries from elective rotations completed in the last semester of the APPE year. The last semester APPE clinical inquiries were selected to allow for a measurement of competency near but prior to graduation. Elective APPEs were used to increase the consistency in evaluations as those clinical inquiries were evaluated by one of six trained evaluators who remained consistent year to year. In addition to the overall assignment score (used in both IPPEs and APPEs), five subscores were available for APPE clinical inquiries: PICO, appropriate literature, supporting information, evidence-based answer, and strength of recommendation. Each subscore item could receive a maximum of five points. Possible PICO scores were 0 for not following PICO format, 3 for an incorrect PICO, and 5 for a correct PICO. Scores for using appropriate literature sources ranged from 1 for trivial references (eg, class notes) to 5 for appropriate best-available evidence. Scores on the supporting information provided ranged from 1 for incorrect or missing key information to 5 for an appropriate yet concise summary of the included literature. Scores for evidence-based answers ranged from 0 where the question was not answered to 5 where a correct conclusion was described. The strength of recommendation scores ranged from 0 where the strength of recommendation was missing to a 5 where both the strength of recommendation and the supporting rationale were accurate.
The survey included nine self-confidence questions (Table 1), which students responded to using a scale ranging from 0=having no confidence to 5=having complete confidence. There were five questions for which students self-assessed their skills, with responses based on a 0-5 scale: 0=poor skills to 5=excellent skills.
Table 1. Pharmacy Students’ Self-Assessed Level of Confidence and Skill in Providing Evidence-Based Answers to Clinical Questions
Survey participation was voluntary and students were invited to participate at four time points: prior to the drug literature evaluation course in the spring semester of the P2 year, at the beginning of the P3 year prior to writing the first clinical inquiry, at the beginning of the P4 (APPE) year, and prior to graduation (Figure 1). At each of the four time points, an email invitation with a link to the survey instrument on Qualtrics (www.qualtrics.com) was sent to students. A reminder email with a link to the survey was sent to students one week after each invitation. Extra credit was offered to students for completing each of the first three surveys but not for survey 4 as it was administered just prior to graduation.
Data Analysis
All students who responded to the baseline survey and at least one follow-up survey were included in the primary analysis on the confidence and skills related to EBM. In cases where a participant missed a survey or did not complete a section of a survey, the last observation carried forward (LOCF) method was used. In cases where one or a few responses were missing within a section, the average of the other responses in that section was used. A sensitivity analysis, including all students who responded to the survey at all four time points, was also completed.
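The analysis itself was performed in Stata (noted below); the brief Python sketch that follows, which assumes a wide-format data set with hypothetical column names (eg, conf_q1_t1 through skill_q5_t4), is intended only to illustrate the logic of the two imputation rules described above, not to reproduce the actual analysis code.

    import pandas as pd

    # Hypothetical wide-format data: one row per student, one column per
    # survey item per time point (illustrative column and file names).
    surveys = pd.read_csv("ebm_surveys.csv")

    conf_items = [f"conf_q{i}" for i in range(1, 10)]   # nine confidence items
    skill_items = [f"skill_q{i}" for i in range(1, 6)]  # five self-assessed skill items
    time_points = ["t1", "t2", "t3", "t4"]

    # Rule 1: items missing within a section at a given time point are filled
    # with the mean of the other items answered in that section.
    for t in time_points:
        for section in (conf_items, skill_items):
            cols = [f"{item}_{t}" for item in section]
            section_mean = surveys[cols].mean(axis=1, skipna=True)
            for c in cols:
                surveys[c] = surveys[c].fillna(section_mean)

    # Rule 2: last observation carried forward (LOCF) across time points for
    # students who skipped an entire survey or section.
    for item in conf_items + skill_items:
        cols = [f"{item}_{t}" for t in time_points]
        surveys[cols] = surveys[cols].ffill(axis=1)

One reasonable ordering is to apply the within-section mean first: a section that is entirely missing yields a missing section mean, remains empty after the first rule, and is then filled by LOCF.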
Descriptive statistics were used to analyze survey responses and clinical inquiry scores (ie, means and counts). For clinical inquiry analysis, the proportion of students scoring greater than 77% (B grade) and less than 70% (not passing) was calculated for both years (P3 and P4) individually and then compared to check for improvement from one year to the next. To analyze the self-confidence and skill surveys, repeated measures ANOVA was used. Cronbach alpha was used to assess the internal consistency of the self-confidence and self-assessed skills scales over all four time points. Stata, version 15.1 (StataCorp, College Station, TX) was used to complete the data analysis. A significance level of p<.05 was used to indicate statistical significance, with no adjustment made for repeated testing.
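For reference, Cronbach alpha for a scale of k items is conventionally computed as

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)

where \sigma^2_{Y_i} is the variance of the i-th item and \sigma^2_X is the variance of the summed scale score; here k is nine for the self-confidence scale and five for the self-assessed skill scale.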
RESULTS
Of the 264 students invited to participate in the surveys, 224 (85%) were included in the primary analysis and 84 (32%) were included in the sensitivity analysis. Two hundred fifty-one P3 students and 191 P4 students at the end of their APPE year completed a clinical inquiry (Table 2). Mean assignment scores (about 90%) were consistently high in both the P3 and P4 years. The P4 students, who were completing APPEs, excelled at identifying appropriate literature (mean score 4.6 out of 5) and at synthesizing evidence-based answers (mean score 4.5 out of 5). Overall, the P4 (APPE) students demonstrated competency in application of EBM skills at graduation.
Table 2. Clinical Inquiry Scores
Students’ mean scores on all nine self-confidence items significantly increased from baseline to graduation (Table 1). All items except “confidence in calculating a number needed to treat” increased by over a full step on the confidence scale (range increase 0.9 to 1.8). The mean scores on all five self-assessed skill items significantly increased from baseline to graduation (Table 1). All except self-assessed skill in searching databases for high-quality literature increased by over a full step on the skill scale (range increase 0.9 to 1.3).
The sensitivity analysis for student confidence and self-assessed EBM skills was consistent with the primary analysis. Student scores on all confidence and self-assessed skills questions significantly increased in the sensitivity analysis (p<.001). The increase in students’ confidence scores ranged from 0.8 to 1.9 and the increase in students’ self-assessed skill scores ranged from 1.2 to 1.8.
DISCUSSION
With this streamlined longitudinal curriculum, pharmacy students learned to successfully provide evidence-based answers to authentic clinical questions. Sustained and improved student confidence and self-assessed skills were also observed. Findings from this evaluation are consistent with those from an earlier assessment of EBM skills performance conducted at the same school of pharmacy, despite curriculum updates which had reduced the number of EBM activities that students were required to complete.7 This suggests that the threaded structure of the EBM curriculum may contribute more to skill development than the specific number of activities completed. This analysis adds to the literature regarding the ability of pharmacy students to apply EBM skills in clinical practice situations.
An important benefit of the clinical inquiry assignment is its authenticity. The assignment is authentic as students generated the clinical questions at experiential sites and topics were likely to have been based on patient encounters or site needs. Incorporation of practice into learning is part of Kolb’s experiential learning theory, where students give the experience personal meaning, build on what they previously learned, and experiment for themselves in applying what they learned.17 As EBM should ultimately be patient-centered,11 incorporating clinical inquiry into experiential learning allows students to apply their evidence-based answers to actual patients.18 Additionally, students appreciate assignments in which realistic clinical questions are used and they develop answers using an EBM framework.19,20
Although the EBM curriculum had been streamlined, it yielded results similar to the former curriculum in terms of pharmacy student learning.7 The first learning component that was adjusted was five hours of librarian-facilitated content in the P2 year, which consisted of learning literature acquisition skills and completing a written assignment (ie, a mini-clinical inquiry assignment). This was condensed to a short online tutorial plus a single lecture by the librarian in the drug literature evaluation course. Given the content removed in the P2 year, what was previously a single librarian-facilitated review discussion was expanded into a three-lecture sequence that reviewed the clinical inquiry assignment and skills. At the same time, the number of clinical inquiry assignments written in the P3 year was decreased from two to one. There were no EBM curricular changes in the APPE year. Thus, the current learning sequence in the pre-APPE curriculum includes two fewer lecture hours along with two fewer written and faculty-evaluated papers, yet students have demonstrated that they have acquired similar EBM skills by the APPE year.
While there can be inconsistency in the difficulty level of clinical questions posed to students, this is reflective of clinical practice. The concern about inconsistency in the difficulty level of questions was addressed by having a small number of trained evaluators grade the assignment, thereby improving interrater reliability and feedback consistency. Additionally, having trained evaluators scoring the assignment relieved preceptors of this administrative burden.
The primary limitation of this evaluation was missing survey data. The response rate on the final survey was likely low because extra credit could not be offered in the last semester of the P4 year and because students were busy with graduation. However, the LOCF approach was used in the primary analysis to minimize bias and maximize the data available for the assessment.21 Because earlier survey responses, including baseline responses, were carried forward, the measured improvements in student confidence and self-assessed EBM skills may be underestimated. Additionally, results were consistent between the primary and sensitivity analyses. The second instance of missing information was the P3 clinical inquiry subscores, which were no longer available because the campus had adopted a new learning management system.
Going forward, the rubric for the clinical inquiry assignment will be digitized. This will allow for real-time monitoring of clinical inquiry subscores as well as student-specific skills. Ideally, to confirm that students are truly practice ready, it would be beneficial to survey preceptors on how APPE students use EBM and clinical inquiries in practice, as well as to survey graduates after several years in practice to evaluate their use of and skills related to EBM.
CONCLUSION
Providing pharmacy students with longitudinal teaching of EBM concepts and opportunities to practice the skills they learned developed their ability to successfully provide evidence-based answers to authentic clinical questions prior to graduation. Future directions include confirming pharmacy graduates’ continued use and understanding of EBM concepts several years into practice.
ACKNOWLEDGMENTS
This work was supported by a grant from the Wisconsin Pharmacy Practice Research Institute. The authors would like to thank Lily Zwaska for her assistance with data entry and Mara Kieser, the UW-Madison School of Pharmacy Director of Experiential Education, for her support of this project.
Received October 14, 2019.
Accepted May 8, 2020.
© 2020 American Association of Colleges of Pharmacy