Abstract
Objective. To evaluate the use of an objective structured clinical examination (OSCE) to assess clinical competency acquired during an off-campus introductory pharmacy practice experience (IPPE).
Methods. Third-year pharmacy students completed an IPPE in transitions of care, consisting of 24 experiential contact hours at one of 17 practice sites. Students were assessed using two OSCEs, the first occurring prior to beginning the off-site IPPE (pre-experience OSCE) and the second occurring after completion of the off-site IPPE (post-experience OSCE). Each OSCE consisted of 10 stations and covered five graded competency domains. The primary outcome was the degree of change in student performance from the pre-experience OSCE to the post-experience OSCE. Secondary outcomes included changes in each graded domain, OSCE pass rate, and failure conversion rate.
Results. Of 111 students, 109 completed both the pre- and post-experience OSCEs. Significant improvements were observed in both overall score and cohort pass rate: mean overall scores improved from 80 on the pre-experience OSCE to 87 on the post-experience OSCE, and the OSCE pass rate improved from 47% to 84%.
Conclusion. Although preceptor evaluations have traditionally served as the primary summative assessment for IPPEs and APPEs, this study indicates that OSCEs may be a reliable alternative for assessing clinical competency acquired from off-site practice experiences.
- experiential education
- competency assessment
- clinical competency
- objective structured clinical examination
- introductory pharmacy practice experience
INTRODUCTION
Evaluation of student performance on introductory and advanced pharmacy practice experiences (IPPEs and APPEs) continues to be a controversial topic. In particular, preceptor ratings of student performance have been criticized for their lack of consistency and accuracy.1-3 Nevertheless, a majority of pharmacy schools continue to use preceptor evaluations as the primary source for grade determination in both IPPEs and APPEs.4
One alternative to the traditional preceptor evaluation is the objective structured clinical examination (OSCE). The OSCE was first introduced by Harden and colleagues as a novel method of assessing clinical competence for medical students and has since been adapted to doctor of pharmacy curricula, among numerous other health professional programs.5,6 The OSCE is advantageous in evaluating competency in difficult-to-assess areas, such as communication, problem solving, and sound decision-making.7,8 Compared to other assessments, the OSCE has relatively high reliability, validity, and objectivity. Considering that key drivers of preceptor rater error include leniency, deficits in preceptor knowledge and skills, and the halo effect of a student’s prior performance, the OSCE lends itself as a more objective and unbiased alternative.1
Use of the OSCE in pharmacy curricula is common, but varies greatly in structure and purpose.6 Schools of pharmacy have implemented OSCEs as formative and summative assessments for both individual courses and program-level performance, and most recently to assess students’ readiness to begin APPEs.6-7,9-12 Furthermore, because the OSCE is capable of assessing clinical competency, communication, and professionalism, there is significant potential to leverage the OSCE as the primary assessment for experiential courses.5,7,8 More specifically, some authors have indicated its potential to assess students’ performance in APPEs.13,14 Nevertheless, actual use of the OSCE to assess student learning in experiential courses has yet to be described in the pharmacy education literature. A review of other health professions yielded a few articles examining the use of the OSCE in medical clerkships, but literature examining its use in experiential programs in other professions was lacking.13,15,16
The purpose of this study was to describe and evaluate the use of an OSCE administered before and after an IPPE to assess competency acquired during the course. We hypothesized that student OSCE performance would improve following completion of the IPPE and that this performance improvement would be indicative of the knowledge and skills gained during the IPPE.
METHODS
During the third professional year (P3) at the University of North Texas System College of Pharmacy, students completed a required IPPE in transitions of care. The course consisted of P3 students spending 24 hours over a three-week period at a pharmacy practice site (hospital, ambulatory, or hospital-based community pharmacy), during which they completed transition-of-care activities (ie, medication reconciliation and patient education), identified medication-related problems, and made clinical recommendations to their preceptor. Preceptors completed a mandatory one-hour online training session and received guidance documents specific to this course that covered course structure, required activities, preceptor and student expectations, and use of an OSCE to assess student competency upon completion of the course. In addition, preceptors were required to undergo general preceptor development provided by the college on a regular basis, which included instruction on teaching strategies and providing effective feedback to students.
Two OSCEs were administered. The first OSCE was administered prior to students completing the off-site experience and was used as a formative assessment (pre-experience OSCE). Students who failed the pre-experience OSCE were required to attend a debrief session in which a faculty member discussed their areas of weakness, but they were not required to remediate. The second OSCE was administered as a summative assessment at the conclusion of the IPPE (post-experience OSCE). Students were required to pass the post-experience OSCE to receive a passing grade for the course. If a student failed the post-experience OSCE, they were allowed a single remediation attempt.
The structure, resources, and assessments were identical for both OSCEs. The only difference between the two OSCEs was case content. Each OSCE consisted of ten 15-minute stations (Table 1). Stations 0 through 6 simulated a hospital admission involving a progressive patient case that started in the emergency department and concluded after the patient was admitted to the intensive care unit (ICU). Stations 7 through 9 simulated a discharge encounter for a new patient. Skills tested throughout the OSCE, such as medication history interview skills and patient counseling, had previously been taught and assessed separately during the first year (P1) skills laboratory course.
Table 1. Roadmap for an Objective Structured Clinical Examination Used to Assess Pharmacy Students’ Knowledge and Skills in Transitions of Care Prior to and After Completing an Introductory Pharmacy Practice Experience
Cases were initially developed by IPPE course faculty members and internally validated by content experts for accuracy, completeness, and standardization, particularly in terms of difficulty. Clinical case content was derived from second year (P2) pharmacotherapy courses. Consequently, P2 pharmacotherapy course faculty members served as content experts for each case as appropriate. To maintain case confidentiality, new cases were developed for each cohort of students.
Cases progressed in compressed time and included common health care challenges, such as health literacy, limited physician accessibility, and incomplete records. Information was revealed to students progressively via a standardized paper-based medical record that included progress notes, orders, laboratory results, and previous encounters. Students interacted with three standardized clients (patients, family members, and physicians) during the OSCE. Patients and family members were portrayed by actors from our standardized patient pool. Physicians were portrayed by faculty members. Standardized client interactions were assessed by live graders who were recruited from our pharmacist grader pool. Standardized clients were provided with a script, including specific and general responses. Both standardized clients and graders received a one-hour training session immediately prior to the OSCE.
The OSCE was assessed on a 100-point scale, with five individually graded competency domains. Each domain accounted for 20 points: medication history interview skills (station 2), accuracy of the medication history (station 2), accuracy of medication reconciliation (station 4), presentation and accuracy of therapeutic recommendations to the patient’s provider (station 6), and discharge counseling skills (station 9). Interview skills and counseling skills were assessed by a concealed grader using a global rating scale in which the grader rated students’ performance as: clear pass (20), borderline pass (15), borderline fail (10), or clear fail (5). Accuracy of the medication history and medication reconciliation were assessed by the experiential course director using a standardized rubric that evaluated five components on a four-point rating scale: identification of current medications, medication order completeness, indication, penmanship, and recording of other key data such as allergies and nonprescription products. Satisfactory competency (ie, met the criteria to pass the IPPE) was defined as obtaining an overall score of 75 or higher and a minimum score of 15 out of 20 for each of the five domains. Students who did not achieve a minimum score of 15 in any domain received a grade of “fail” for the entire OSCE. Rubrics, rating scales, and thresholds were developed by the course team and internally validated with other faculty members experienced in developing and conducting OSCEs.
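For readers implementing a similar scheme, the following minimal sketch encodes the pass/fail rule described above. It is illustrative only; the domain names and data layout are assumptions, not the authors' actual grading software.

```python
# Minimal sketch of the pass/fail rule described above; the domain keys and
# this function are illustrative assumptions, not the authors' grading tool.

PASS_OVERALL = 75  # minimum overall score (out of 100)
PASS_DOMAIN = 15   # minimum score in each domain (out of 20)

def osce_result(domain_scores: dict) -> str:
    """Return 'pass' or 'fail' for one student's five 20-point domain scores."""
    overall = sum(domain_scores.values())
    if overall >= PASS_OVERALL and all(s >= PASS_DOMAIN for s in domain_scores.values()):
        return "pass"
    return "fail"

# A student scoring 90 overall still fails if any single domain falls below 15:
print(osce_result({"interview": 20, "history": 20, "reconciliation": 20,
                   "recommendations": 20, "counseling": 10}))  # -> fail
```

Note that the domain minimum makes the rule conjunctive: a strong overall score cannot compensate for a failed domain, consistent with the "fail for the entire OSCE" provision above.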
The primary outcome was change in student performance from the pre-experience OSCE to the post-experience OSCE. Secondary outcomes included change in overall pass rate, pass rate in each domain, and the failure conversion rate, which was defined as the percentage of students who failed to demonstrate competency on the pre-experience OSCE but later demonstrated competency on the post-experience OSCE. Student performance during remediation was not included in the analysis.
Student perceptions of the OSCEs were assessed using a standardized and validated institutional student course evaluation, specifically students’ responses to the statement: “The exams were representative of materials and objectives presented in the course.” Students rated this statement on a five-point scale with anchors of strongly agree (5) and strongly disagree (1). The survey, which was voluntary and de-identified, was administered to students upon completion of the course but before grades were released.
Statistical analyses were conducted using SPSS Statistics 24 (IBM, Armonk, NY). The primary outcome was analyzed using a paired t test, and secondary outcomes were analyzed using the McNemar chi-square test. Student course evaluation data were analyzed using descriptive statistics. This study was reviewed and approved by the North Texas Regional Institutional Review Board.
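Because the paired design drives both tests, the sketch below shows equivalent analyses in Python with scipy and statsmodels. The scores are synthetic and the 2x2 counts are hypothetical values chosen only to be roughly consistent with the reported rates; neither reflects the study's actual data, which were analyzed in SPSS.

```python
# Illustrative reanalysis sketch; the study used SPSS Statistics 24, so this
# Python version and all numbers in it are assumptions, not the authors' data.
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Synthetic paired scores for 109 students, centered near the reported means
rng = np.random.default_rng(42)
pre = rng.normal(80, 6, size=109)
post = pre + rng.normal(7, 4, size=109)

# Primary outcome: paired t test comparing post- vs pre-experience scores
t_stat, p_value = ttest_rel(post, pre)

# Secondary outcome: McNemar test on paired pass/fail status. Hypothetical
# counts chosen to match the reported rates (47% pre pass, 84% post pass,
# 86% failure conversion); rows = pre-experience result, cols = post result.
table = [[42, 9],   # passed pre: 42 also passed post, 9 failed post
         [50, 8]]   # failed pre: 50 converted to a pass, 8 failed again
mcnemar_result = mcnemar(table, exact=True)

# Failure conversion rate: share of pre-experience failures that later passed
conversion_rate = 50 / (50 + 8)  # ~0.86

print(f"paired t: t={t_stat:.2f}, p={p_value:.4f}")
print(f"McNemar p={mcnemar_result.pvalue:.4f}, conversion={conversion_rate:.0%}")
```

The McNemar test is appropriate here because the same students contribute both the pre- and post-experience pass/fail outcomes; only the discordant cells (fail-then-pass and pass-then-fail) inform the test.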
RESULTS
Of the 111 students enrolled in the course for the 2017-2018 academic year, 109 students completed both OSCEs. Students’ mean age was 26.9 years (SD=4.7 years) and 55% of students identified themselves as female. The primary outcome, student performance as represented by mean OSCE score, improved from 80 on the pre-experience OSCE to 87 on the post-experience OSCE (p<.001). In addition, the first-attempt OSCE pass rate increased from 47% to 84% (two-sided p<.001). Of students who failed the pre-experience OSCE, 86% went on to demonstrate competency on the post-experience OSCE (failure conversion rate).
Significant improvements in pass rate were observed for four out of five domains (Table 2), and numerical scores increased significantly for three of the five domains (Table 3). The greatest observed improvement was for presentation and accuracy of therapeutic recommendations, in which numerical scores increased from 15 to 17 and the pass rate increased from 69% to 92% (p<.001). Significant improvements were also observed in the medication reconciliation competency, with an absolute increase in pass rate of 14% (p=.001) and a numerical score increase from 16 to 18. Performance improvements were less obvious in collecting an accurate medication history, admission interview skills, and discharge counseling skills. These three areas demonstrated significant improvements in either numerical score or overall pass rate, but not both.
Table 2. First Attempt Student Success Rate for Transitions-of-Care Competency Areas Assessed by Objective Structured Clinical Examination Prior to and After Completing an Introductory Pharmacy Practice Experience
Table 3. First Attempt Mean Numerical Student Scores for Transitions-of-Care Competency Areas Assessed by Objective Structured Clinical Examination Prior to and After Completing an Introductory Pharmacy Practice Experience
In terms of student perceptions, 45 students (41%) responded to the statement “The exams were representative of materials and objectives presented in the course.” Of these 45 respondents, 25 (56%) strongly agreed, 18 (40%) agreed, and the remaining two (4%) gave neutral responses.
DISCUSSION
Using the OSCE, rather than preceptor evaluations, to assess pharmacy students’ competency acquired during an IPPE allowed the authors to detect improvements in students’ overall competency as well as in specific competency domains. The greatest magnitude of improvement was detected in domains that required critical thinking, clinical reasoning, and communication. This result is consistent with both the medical education literature and experiential learning theory in terms of competencies typically acquired from such experiences.15,17,18
Consistent with the authors’ expectations, these results highlight important advantages to using OSCE to evaluate student competency acquired from off-campus experiences. The OSCE provides a mechanism to standardize summative assessment across diverse practice sites and compare student performance across a single cohort.1 In addition, these results can inform experiential office efforts concerning course design and preceptor development. Although alternatives, such as preceptor observation checklists, have been explored in the literature, these have produced a similar pattern of grade inflation to that seen with preceptor evaluations.19
Furthermore, use of a pre-experience OSCE added significant value to the course. In addition to serving as a baseline assessment against which the final OSCE could be compared, the pre-experience OSCE served as a teaching tool. Because gaps in individual students’ knowledge and skills were identified and communicated at the beginning of the IPPE, students could focus on specific competency areas that needed further development. Additionally, the pre-experience OSCE helped set student expectations for both the off-site learning experience ahead and the final assessment for the course.
To successfully implement a similar approach at other institutions, the realism and fidelity of the OSCE should be given careful consideration. Objective structured clinical examinations in pharmacy curricula vary greatly from discrete, skill-based assessments to whole case scenarios that include standardized patients and physicians.9,19 In this study, the OSCE included a whole case scenario, standardized clients, and factors to enhance the realism of the scenario, all of which improved the validity and efficacy of the OSCE.20-24 In addition to depicting real practice, the OSCE must maintain fidelity to the off-campus experience and vice versa to avoid adversely impacting student performance. Consequently, the use of experienced and dedicated preceptors, as well as the identification of adequate practice sites, becomes crucial.
In terms of limitations, the external validity of this study is challenged by its small sample size and conduct at a single institution during a single academic year. Future studies of the OSCE to assess experiential learning should evaluate larger cohorts and, if possible, across multiple institutions. Another key limitation is the unquantified influence of noncognitive confounders, such as performance anxiety, examination familiarity, and unintended patient variance.8,25,26 For example, students may have entered the post-experience OSCE more confidently because they were familiar with the exam structure. Furthermore, differences in case content may also have influenced the variance in student performance. Although difficulty assessment was included as a key component of the case validation strategy, unintended differences in case difficulty may have remained.
Despite these limitations, there are many future directions for OSCE use in experiential courses. Based upon our findings, schools of pharmacy should consider adding an OSCE to select IPPEs and APPEs that may benefit from more robust assessment techniques than are currently used. As a reliable quantitative method for measuring improvements in clinical competencies across a cohort of off-campus learners, use of pre- and post-experiential OSCE may also be valuable in conducting educational research.
CONCLUSION
Although preceptor evaluations have traditionally served as the primary assessment for IPPEs and APPEs, this study indicates that the OSCE may be a reliable alternative for assessing clinical competency acquired from an off-site IPPE. Future studies should compare the reliability and validity of experiential assessments, including OSCEs and preceptor evaluations of the student learner. Additionally, future research is needed to quantify the impact of noncognitive confounders on OSCE performance.
- Received April 10, 2019.
- Accepted September 16, 2019.
- © 2020 American Association of Colleges of Pharmacy