Abstract
Objective: To determine the effectiveness of an individualized teaching method in a pharmacy skills laboratory.
Design: All third-year students enrolled in an Accreditation Council for Pharmacy Education (ACPE) accredited doctor of pharmacy program (n=150) received an individual formative assessment from clinical pharmacists on communication skills and clinical competency after the students counseled standardized mock glaucoma patients during a laboratory focused on alternative dosing formulations. Objective structured clinical examination (OSCE) scores for this station from the 2012 and 2013 classes were compared before and after the intervention.
Assessment: Ophthalmic OSCE station scores were higher after the individual formative feedback intervention. Students in 2013 had a mean score of 83.2 ± 8.3% compared to a mean of 74.3 ± 12.9% in 2012 for this OSCE station. The percentage of students receiving an “A” on the OSCE station increased from 8.1% to 31.3% after the intervention.
Conclusion: Individualized formative teaching methods benefited students in both their communication and clinical assessment skills. Future research should focus on wider implementation and overcoming obstacles, such as increased facilitator needs.
INTRODUCTION
Since the inception of the most recent core competencies set forth by ACPE1 and the Center for the Advancement of Pharmacy Education,2 colleges and schools of pharmacy across the nation have aimed to increase the clinical abilities of current students. One of these core competencies, improving pharmacy students’ proficiency in patient-centered care, has stimulated new teaching and examination methods. Use of problem-based learning (PBL) strategies has increased, with PBL now used in 71% of US pharmacy colleges.3 Objective structured clinical examinations (OSCEs) are also being used increasingly in many colleges and schools of pharmacy, particularly within pharmacy skills laboratories.4 The replacement of traditional laboratory examinations with OSCE assessments has been shown to benefit pharmacy students’ clinical and communication skills.5,6 One pharmacy program created a standardized patient counseling rubric for use across multiple courses in order to improve performance on an annual assessment that includes an OSCE.7 To our knowledge, however, no articles have been published describing specific pharmacy laboratory activities designed to improve performance on OSCEs.
Auburn University in Alabama has been using OSCEs in its skills laboratory since 2006. Student pharmacists on both the Auburn and Mobile campuses complete a 6-semester skills laboratory course sequence during their first 3 professional years in preparation for introductory and advanced pharmacy practice experiences. Each week of laboratory curriculum includes a 1-hour prelaboratory lecture and a 2-hour laboratory session. All students attend the prelaboratory lecture simultaneously via video conferencing. The lecture reviews pertinent pharmacotherapy and patient counseling using traditional teaching methods, while the laboratory session allows time for discussion and application of the skills in smaller groups. On the main campus there are 4 laboratory sections with approximately 32 students in each section, and the satellite campus in Mobile has 1 additional section with 24 students. Faculty, pharmacy residents, and fourth-year student pharmacists facilitate laboratory sessions in person on each campus and encourage student participation within groups of 6 to 8 students. Laboratory activities involve role playing among students and discussion of scenarios with facilitators. Limited individual performance feedback is provided during the laboratory sessions.
At the end of each semester, students are evaluated with a 6-station OSCE, during which they are expected to interact with standardized patients (SPs) on topics taught during the semester. While all students complete the same cases, separate OSCEs are conducted on each campus. For each OSCE station, students are given 3 minutes to review the case information and prepare for the patient encounter and 7 minutes to interact with the patient, for a total of 10 minutes per station. Two SPs are hired from the lay public for each case and trained to portray the patient and evaluate student performance. Each student interaction is viewed live via video, recorded for future review, and evaluated in real time by an SP. Evaluation of each station includes a standardized performance checklist, which is scored in real time by an SP viewing the station on video, and a communication rubric, which is scored immediately following the encounter by the SP interacting with the student. Students are not provided any immediate feedback during the OSCE.
While a majority of students pass the end-of-semester summative OSCEs, many struggle to achieve excellent performance. Course instructors and coordinators believe that student performance may be affected by the laboratory format itself, because students typically interact and provide feedback in groups, yet they are assessed individually on the OSCE. We hypothesized that including more individualized formative assessments in laboratory sessions would improve student performance on the corresponding summative OSCE stations. The objective of this study was to determine the effectiveness of incorporating an individualized formative assessment into 1 pharmacy skills laboratory session by comparing OSCE performance before and after the intervention.
DESIGN
The alternative dosing formulations laboratory session for third-professional year student pharmacists was chosen for the teaching intervention because the laboratory specifically focused on patient counseling for intranasal, otic, and ophthalmic preparations. This particular laboratory was also chosen because the content was not demanding for third-professional year student pharmacists and the authors were the primary content experts. The laboratory session was reformatted to allow time for each student to role play with a clinical pharmacist, who played the role of an SP. All facilitators were provided with a detailed facilitator’s guide that included the patient case information and suggestions for verbal feedback. Three to 4 clinical pharmacists served as SPs for an identical case in each of the laboratory sections. Historically, laboratory sections are facilitated by 2 fourth-professional year student pharmacists and 2 clinical pharmacists (pharmacy residents and faculty). Each student spent 10 minutes in an examination room with an SP, who presented with an ophthalmic disorder requiring prescription medication counseling. Students were provided a standardized scenario including pertinent medication information (Appendix 1). Students were given 6 minutes to provide counseling (ie, administration techniques) and to identify any prescription issues (ie, contraindications) based on the standardized scenario. Following the student’s counseling, the clinical pharmacist spent 4 minutes giving the student individual formative feedback. Students received a verbal assessment of how they conducted themselves, of their clinical knowledge and communication techniques, and of what needed improvement. Facilitators used a laboratory rubric and talking points to help formulate standardized formative assessments. The rubric and talking points were created based on the OSCE station objectives. The remaining laboratory activities were organized as they had been in previous years: groups of 6 to 8 students rotated through 3 other activities, which included patient counseling, although students role played with classmates instead of with an SP.
Students completed a 6-station OSCE, including 1 station on ophthalmic medication counseling, approximately 10 weeks after completion of the laboratory session. Disease presentation and station objectives for this station were the same as the previous year’s to allow for comparison. Neither the authors nor the clinical pharmacists conducting the intervention were involved with grading the ophthalmic OSCE station.
Students were evaluated using a standardized performance checklist and a communication rubric. The standardized performance checklist included items related to gathering information, management strategies, and appropriate follow-up. Each item on the checklist was weighted equally, and the checklist total comprised 80% of the overall station grade. Beginning in 2013, all OSCE performance checklists gained an additional item intended to capture verification of the patient’s medications and past medical history. Therefore, the standardized performance checklist for the ophthalmic OSCE totaled 16 points in 2012 and 17 points in 2013. Each year, a grade adjustment was made on this station, in which the standardized performance checklist was scored out of one fewer point when grades were reported to students. Both the raw station score prior to any adjustment and the adjusted grade were reported. The communication rubric included 8 different domains and was identical in both years. Communication scores accounted for 20% of the overall station grade.
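To illustrate how a station grade is assembled under this weighting, the following sketch computes an unadjusted and an adjusted station percentage from a checklist score and a communication score. The function name, variable names, and example values are hypothetical; only the 80/20 weighting, the 17-point checklist denominator for 2013, and the 1-point adjustment are taken from the description above.

```python
# Illustrative sketch of the station grade calculation described above.
# Names and example values are hypothetical; the 80/20 weighting,
# 17-point checklist (2013), and 1-point denominator adjustment follow
# the description in the text.

def station_grade(checklist_points: float,
                  communication_pct: float,
                  checklist_total: int = 17,
                  adjustment: int = 1) -> dict:
    """Return unadjusted and adjusted overall station percentages."""
    checklist_pct_raw = checklist_points / checklist_total
    # Adjustment: checklist scored out of one fewer point when reported;
    # capped at 100% (an assumption made for this sketch).
    checklist_pct_adj = min(checklist_points / (checklist_total - adjustment), 1.0)

    unadjusted = 0.80 * checklist_pct_raw + 0.20 * communication_pct
    adjusted = 0.80 * checklist_pct_adj + 0.20 * communication_pct
    return {"unadjusted": round(unadjusted * 100, 1),
            "adjusted": round(adjusted * 100, 1)}

# Example: 14 of 17 checklist items completed, communication rubric 87.5%
print(station_grade(14, 0.875))  # {'unadjusted': 83.4, 'adjusted': 87.5}
```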
Means of ophthalmic station scores from 2012 and 2013 were calculated for the standardized performance checklist, communication rubric, unadjusted overall station scores, adjusted overall station scores, and overall OSCE scores. The percentage of students achieving each letter grade for the station, using overall adjusted scores, was also calculated and compared between the 2 years. Finally, overall OSCE semester scores were compared between 2012 and 2013. Assessments of significant differences between 2012 and 2013 were conducted using SAS v.9.3 (SAS Institute, Cary, NC). A 2-tailed t test was used to compare means for the standardized performance checklist scores, unadjusted overall station scores, adjusted overall station scores, and overall OSCE scores. Equal variance was assessed using the folded F statistic. The communication rubric, consisting of 8 questions, was assessed categorically because only 3 different scores were earned: 75%, 87.5%, and 100%. The distributions of scores for all other measures were normal. The chi-square test was used to compare 2012 and 2013 letter grades for the communication rubric scores and adjusted overall station scores.
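The analysis above was performed in SAS v.9.3. The sketch below is a minimal Python/scipy illustration of the same kinds of tests (a folded F statistic for equality of variances, a pooled 2-tailed t test, and a chi-square test on categorical grade counts); the score arrays and grade counts are placeholders, not the study data.

```python
# Minimal illustration of the statistical tests described above, using
# Python/scipy rather than the SAS v.9.3 procedures used in the study.
# All values below are placeholders, not the study data.
import numpy as np
from scipy import stats

scores_2012 = np.array([70.0, 74.5, 68.0, 81.0, 77.5])   # placeholder scores
scores_2013 = np.array([82.0, 85.5, 79.0, 88.0, 84.5])   # placeholder scores

# Folded F statistic: ratio of the larger sample variance to the smaller.
var1, var2 = scores_2012.var(ddof=1), scores_2013.var(ddof=1)
f_stat = max(var1, var2) / min(var1, var2)
df1 = len(scores_2012) - 1 if var1 >= var2 else len(scores_2013) - 1
df2 = len(scores_2013) - 1 if var1 >= var2 else len(scores_2012) - 1
f_p = 2 * stats.f.sf(f_stat, df1, df2)  # two-sided p-value

# Pooled (equal-variance), 2-tailed, two-sample t test, as in the study.
t_stat, t_p = stats.ttest_ind(scores_2012, scores_2013, equal_var=True)

# Chi-square test comparing categorical grade distributions between years,
# e.g. counts of A/B/C/below-C grades per cohort (placeholder counts).
grade_table = np.array([[12, 60, 55, 21],    # 2012 cohort
                        [47, 70, 27,  6]])   # 2013 cohort
chi2, chi_p, dof, _ = stats.chi2_contingency(grade_table)

print(f"F={f_stat:.2f} (p={f_p:.3f}), t={t_stat:.2f} (p={t_p:.4f}), "
      f"chi2={chi2:.2f} (p={chi_p:.4f})")
```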
The Auburn University Institutional Review Board (IRB) approved this project as an expedited protocol, and all students were provided an opportunity to withdraw their data from the study.
EVALUATION AND ASSESSMENT
All third-professional year student pharmacists attended the alternative dosing formulations laboratory session and received individual formative feedback on their counseling performance in 2013. No students opted to withdraw their data from the analysis (n=150). Because the 2012 data were retrospective, all students from that year were included in the analysis (n=148).
Equal variances were found for all tests, so the pooled method of assessing variance was used. On the ophthalmic-focused OSCE station, performance improved significantly on the standardized performance checklist score (p<0.0001) and the overall unadjusted mean station score (p<0.0001) (Table 1). The adjusted station mean grade, which incorporated the 1-point grade adjustment to the standardized performance checklist in both years, increased from 74.3% to 83.2% (p<0.0001). Assessment of the communication rubric revealed significantly different grades between 2012 and 2013 (p<0.0001). Most notably, the percentage of students earning an “A” on the adjusted station grade increased nearly 4-fold, from 8.1% in 2012 to 31.3% in 2013 (Table 2).
Table 1. OSCE Ophthalmic Station Numeric Scores (Percentage Mean ± SD)
Table 2. OSCE Ophthalmic Station Numeric Grades
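As a rough illustrative check, the adjusted station comparison can be reproduced from the published summary statistics alone (means, standard deviations, and class sizes) using scipy's summary-statistics form of the pooled t test; this sketch is not the study's analysis, which was run in SAS on the individual scores.

```python
# Reproducing the adjusted station comparison from the reported summary
# statistics (83.2 ± 8.3%, n=150 in 2013 vs 74.3 ± 12.9%, n=148 in 2012).
from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=83.2, std1=8.3, nobs1=150,    # 2013 cohort
    mean2=74.3, std2=12.9, nobs2=148,   # 2012 cohort
    equal_var=True)                     # pooled method, as in the study

print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # p is well below 0.0001
```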
Overall 2013 OSCE performance was lower than performance in 2012. The mean overall OSCE grade in 2012 was 77.6 ± 7.2% compared to 75.0 ± 6.6% in 2013.
DISCUSSION
OSCEs are becoming a mainstay in many pharmacy skills laboratories as aids in training and assessing student pharmacists’ clinical skills prior to advanced pharmacy practice experiences (APPEs). Previous studies have demonstrated the importance of incorporating OSCEs;5,6 however, student outcomes continue to be lackluster. The inclusion of individualized formative assessments during a laboratory session was investigated in the hope of advancing pharmacy skills laboratory teaching methods. To our knowledge, this is the first study to examine the effect of specific pharmacy laboratory activities on OSCE performance.
We found significant improvement in all OSCE performance measures after individualized formative assessments were implemented. Students in 2013 performed well, with 96% of students earning a grade of “C” or better for their OSCE station score. Because the 2012 communication mean score was 98%, with 85% of students receiving 100%, we did not expect to see significant improvement due to a ceiling effect. However, both the overall station grades and the communication grades increased significantly. Overall OSCE performance was better in 2012 than in 2013, suggesting that any confounding variables (eg, GPA, curriculum, pharmacy work, counseling experience) did not falsely increase the 2013 ophthalmic OSCE station performance.
A limitation of this study was that individualized feedback was incorporated into only 1 laboratory session as the intervention. This limited the scope within which the teaching method could be analyzed, as only the OSCE station aligned with the alternative dosing formulations laboratory session could be used as a measure. Staffing constraints and the need for additional facilitators limited the intervention to only 1 laboratory week in 2013. The intervention required at least 6 facilitators and a minimum of 4 clinical pharmacists. Staffing multiple laboratory sessions with this number of pharmacist facilitators was not feasible and limited expansion of the teaching methodology at the time. Regarding the intervention itself, the quality of the feedback could have varied throughout the day, as 6 different clinical pharmacists served as SPs. Also, because of the laboratory reformatting, students spent more time practicing ophthalmic medication counseling than in previous years, which could have contributed to the improved performance on the station. Additionally, the OSCE data for 2013 were compared only to the immediately preceding year; because 2 classes of students were being evaluated and compared, there was the potential for differences between the 2 groups in clinical competency and class characteristics. The lower overall OSCE score in 2013 suggested that the reported increases in the 2013 scores were not inflated, but the possibility nevertheless exists. Finally, this study represented the experience of 1 skills laboratory session in a large, public college of pharmacy. Results might not be generalizable to other skills laboratory settings.
Despite the study’s limitations, the individualized formative assessment demonstrated the ability to improve students’ clinical and communication skills. Auburn University plans to continue and refine individualized formative assessments within the pharmacy skills laboratory, focusing on efficient use of clinical pharmacists as facilitators.
SUMMARY
Including an individualized formative assessment in a skills laboratory course improved performance on an OSCE station at the end of the semester. This study confirmed the importance of designing laboratory activities to match planned summative assessments. The addition of this teaching methodology in other laboratory sessions may help improve overall OSCE performance in the future. As pharmacy programs strive to include more active-learning and skills-based assessments in their curricula, further analysis of teaching methods that focus on individual student feedback is needed. Future research might focus on wider implementation of individualized formative feedback in multiple laboratory sessions and on how to overcome implementation obstacles, such as a lack of available facilitators.
Appendix 1. Scenario and Medication Information Provided to Student