Abstract
Objective. To implement a holistic assessment plan to evaluate the impact of a four-semester laboratory course series entitled Integrated Pharmacotherapy Skills on students’ readiness to begin advanced pharmacy practice experiences (APPEs) following separation of the laboratory component from the Pharmacotherapy lecture component.
Design. Faculty prospectively selected and employed a variety of course assessment methods, including student self-evaluations and preceptor evaluations during APPEs, course evaluations, and a student confidence survey, to evaluate student readiness for APPEs and ensure a quality learning experience for students.
Assessment. APPE students’ self-perceived confidence to perform skills increased after completion of the redesigned curriculum and after experiencing two APPE rotations. APPE preceptors did not report a change in student performance. Results from course evaluations suggest that separating the laboratory course from the lecture course created a positive learning experience for students.
Conclusion. Students completing the new laboratory curriculum were as prepared to begin APPE rotations as students who had completed the old curriculum. A similar multidimensional, holistic assessment plan could be used at other institutions to evaluate skills-based courses as part of continuous quality improvement.
INTRODUCTION
The 2016 Accreditation Council for Pharmacy Education Accreditation Standards emphasize the importance of schools and colleges of pharmacy conducting curricular review to ensure achievement of educational outcomes and reasonable student workload expectations.1 Curricular review allows for continual quality assurance of the design, delivery, and sequencing of learning objectives at both the course and curriculum levels. The findings from a curricular review may result in course restructuring or redesign.
The evaluation of a redesigned course should include a holistic assessment plan that is multidimensional to provide a 360-degree evaluation of the changes. Individual assessment methods should include direct measures of student learning such as performance data measuring achievement of course learning objectives, but may also include indirect measures including student perception of learning or performance.2 Other informative components of an assessment plan may include student and instructor feedback on course experience, estimates of course workload, course mapping, course coordinator self-assessment, and peer faculty review.2,3
The Standards also require schools and colleges of pharmacy to assess student readiness for advanced pharmacy practice experiences (APPEs).1 Pre-APPE Performance Domains and Abilities from the Guidance for Standards 2016 provide statements to assist in this assessment.4 Many of these statements include skills that students must learn, develop, and finally, demonstrate competency in performing. Often these pharmaceutical care skills are taught, practiced, and evaluated in a skills laboratory.5-13
The University of Wisconsin (UW) Madison School of Pharmacy is a traditional four-year doctor of pharmacy (PharmD) program. Since the inception of the PharmD program, the curriculum has required students to complete a four-semester Pharmacotherapy course sequence beginning in the fall of the second professional year (P2) and continuing through the spring of the third professional year (P3). Each Pharmacotherapy course was four credits and contained three one-hour pharmacotherapy lectures, a one-hour discussion, and a laboratory session each week lasting two to three hours, depending on the activities to be completed. The lecture component focused on the acquisition and application of knowledge of various disease states, medications used to treat those conditions, and clinical guidelines for patient care. The laboratory component focused on further application of knowledge gained in the lecture component and the skills necessary to provide pharmaceutical care to patients. During the 2014-2015 academic year, curricular changes were enacted that split the laboratory component from the larger Pharmacotherapy course to create a four-semester sequence of one-credit, standalone skills-based laboratory courses termed Integrated Pharmacotherapy Skills I-IV. Each of these courses consisted of approximately eight one-hour discussions and eight two- to three-hour laboratory sessions spaced throughout the semester. All laboratory sessions included pre-laboratory assignments and some included post-laboratory assignments. The process of the laboratory course redesign has been previously described.14 The goal of the redesigned laboratory curriculum was to enhance student readiness for APPEs through focused instruction and assessment of critical skills mentioned in the Standards.4
In the pharmacy education literature, assessment and evaluation plans have been predominantly based on indirect measures of student learning and experiences, such as student perceptions and opinions, and description of the course revisions.5-7,10,15-20 Comparison of pre- and post-revision direct measures of learning in the form of final grades, examination and assignment scores, and APPE performance have also been described for this purpose, but to a far lesser extent.15,19,20 Uniquely, this article details the design and results of an assessment plan composed of a variety of evaluation measures.
To optimize the authenticity of the assessment plan at UW-Madison, faculty members prospectively selected and employed a variety of evaluation methods, yielding both direct and indirect measures of student learning and experience, to assess the redesigned laboratory curriculum's educational outcomes related to APPE preparedness as well as its impact on student workload and the overall learning experience. This article describes how a holistic assessment plan was used to evaluate achievement of predetermined course goals and objectives following course revision.
DESIGN
The redesigned Integrated Pharmacotherapy Skills laboratory and lecture-based Pharmacotherapy courses were implemented during the 2014-2015 academic year. Each of the four laboratory courses emphasized a different skill area (patient communication, documentation, provider communication, and complex critical thinking) to ensure sequential instruction and evaluation of skill performance. Skills were introduced to students, graded formatively, and finally, evaluated during a summative performance-based assessment (PBA).
Assessment of the courses was planned for the conclusion of the academic year in spring 2015 to ensure the redesign met its overarching goals of improving student readiness for APPEs through enhanced competency in skill performance while maintaining a quality learning experience for students. Ensuring that student performance within the Pharmacotherapy lecture courses was not adversely affected by the redesigned laboratory curricula was a secondary goal. In anticipation of the course redesign, purposeful assessments and evaluations were embedded before the changes were implemented so that the impact of the redesigned laboratory curriculum on the aforementioned outcomes could be meaningfully assessed. For the purposes of this article, pre-assessments are defined as those that occurred prior to the 2014-2015 academic year, and post-assessments are those that occurred during or after. Assessments that were used before and after the changes included APPE performance, PBAs, and student performance on examinations. Several assessments were used only following implementation of the changes and included student surveys, a focus group, and student time spent on course activities. A general overview of assessments used to evaluate the redesigned laboratory curriculum is provided in Table 1.
Overview of the Assessments Used to Evaluate the Impact of Revisions
Evaluation of student performance on APPEs was compared between the student cohort completing the old laboratory curriculum (2014-2015 APPE students) and the cohort completing one year of the new curriculum (2015-2016 APPE students). The authors hypothesized that differences in APPE readiness resulting from the laboratory redesign could best be detected by analyzing student confidence to successfully perform pharmaceutical care skills and actual student performance at the beginning of the APPE academic year. For this reason, the change in mean APPE student self-evaluation scores between the two cohorts was compared at baseline (prior to the start of APPEs) and at the completion of rotations 1 and 2 (seven and 14 weeks into the academic year, respectively). The change in mean APPE preceptor evaluation scores was compared between the two cohorts following completion of rotations 1 and 2. Rotations 3 through 6 were not compared, as it was hypothesized that pharmacy practice experience gained by that point in the academic year would too greatly confound the results.
Student surveys and a focus group were used to assess the quality of the learning experience, as well as achievement of desired learning objectives for the new laboratory curriculum. P2 and P3 students evaluated the redesigned curriculum through anonymous, voluntary, electronic course evaluations at the conclusion of the spring 2015 semester (immediately prior to the start of APPEs for responding P3 students) and completed a survey at that time regarding their confidence to perform various pharmaceutical care skills. A focus group was held in June 2015, after the first year of the new curriculum, with students who had recently completed their P3 year and three weeks of their first APPE rotation. This group of APPE students, from the 2015-2016 cohort, offered a unique perspective: they had experienced one year each of the old and new laboratory curricula and therefore could compare the two and provide informed feedback. Students were purposefully selected because they were completing rotations in the same city as the UW-Madison School of Pharmacy and had provided constructive feedback to instructors in the past.
The PBAs conducted within the skills laboratory were also used to determine whether the redesigned Integrated Pharmacotherapy Skills courses improved student readiness for APPEs compared to the previous curriculum. Three PBAs were conducted both prior to and after implementation of the laboratory changes, and those three were therefore selected for comparison: the spring P2 and P3 summative PBAs and the spring P2 blood pressure and heart rate measurement PBA. Time-on-task surveys, designed to assess the appropriateness of course workload in relation to credit allocation, were administered following each laboratory session and asked students to report total time spent on activities before and after class sessions.
Student performance on objective (ie, multiple-choice) examination questions within the Pharmacotherapy lecture component was assessed. Because identical questions were administered before and after the course changes, pre-post comparisons of student performance could be made. This assessment was intended to evaluate whether splitting the lecture and laboratory components adversely affected student knowledge, comprehension, and application of Pharmacotherapy lecture material.
A variety of statistical methods were used to evaluate the data collected. Data were compiled in Microsoft Excel, version 13, and analyzed with Excel or Stata, version 14.0 (StataCorp, College Station, TX). Mann-Whitney U tests were used to compare results on APPE preceptor and student self-evaluations between the two APPE student cohorts, as well as scores on the blood pressure and heart rate measurement activities. Unpaired t tests were used to compare student PBA scores. A p value of less than .05 was considered statistically significant. Testing and Evaluation Services at UW-Madison was consulted on the analysis of course examination questions, and differences in student performance on specific questions between the years were evaluated using descriptive statistics. As this project was undertaken for programmatic evaluation, the UW-Madison Health Sciences Institutional Review Board (IRB) determined that it did not meet the federal definition of research, so IRB review was not required.
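For illustration, the two main cohort comparisons described above can be reproduced in a few lines of Python. The study itself used Excel and Stata, so the scipy calls and the sample scores below are a minimal sketch for demonstration, not the authors' actual analysis code.

```python
# Minimal sketch of the study's two main statistical comparisons,
# using hypothetical data; the original analysis was run in Excel/Stata.
from scipy import stats

# Hypothetical preceptor evaluation scores (1-5 scale) for one competency
cohort_2015 = [4, 5, 4, 3, 5, 4, 4]
cohort_2016 = [5, 4, 5, 4, 4, 5, 3]

# Mann-Whitney U test for ordinal evaluation scores between cohorts
u_stat, p_mw = stats.mannwhitneyu(cohort_2015, cohort_2016, alternative="two-sided")

# Unpaired t test for continuous PBA percentage scores
pba_pre = [90.9, 88.5, 92.0, 87.3]   # pre-revision scores (hypothetical)
pba_post = [90.2, 91.1, 89.8, 90.5]  # post-revision scores (hypothetical)
t_stat, p_t = stats.ttest_ind(pba_pre, pba_post)

alpha = 0.05  # significance threshold used in the study
print(f"Mann-Whitney p={p_mw:.3f}; t test p={p_t:.3f}; alpha={alpha}")
```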
EVALUATION AND ASSESSMENT
Student self-evaluations and preceptor evaluations of APPE students were completed using a course competencies assessment tool developed at the UW-Madison School of Pharmacy. The assessment tool evaluated student performance on 14 general course competencies (eg, communication and pharmacotherapy skills) using a five-point performance scale (1=does not know; 3=knows how; 5=does) (Tables 2, 3). Preceptors and students were instructed to choose “not applicable” if a course competency did not apply, and these data were not included in the analysis.
Student Self-evaluation of Advanced Pharmacy Practice Experience Performance
Preceptor Evaluation of Advanced Pharmacy Practice Experience Performance
Student self-evaluated and preceptor-evaluated APPE performance during the 2014-2015 academic year was compared to that in the 2015-2016 academic year. All students in the 2015 (n=132) and 2016 (n=134) cohorts completed baseline self-evaluations prior to the start of the first APPE, which served as an indicator of student self-perceived APPE readiness. If a student had adequate elective credits, he or she had the option to omit one APPE during the P4 year. Therefore, the number of students and preceptors completing evaluations differed from baseline after completion of rotation 1 (n=131 in both cohorts) and rotation 2 (n=128 and n=131, respectively).
The changes in student self-evaluation mean scores (2016 cohort mean score minus 2015 cohort mean score) at baseline and after completion of rotations 1 and 2 are presented in Table 2. The 2016 student cohort's mean scores on self-evaluations at baseline (after completion of the redesigned laboratory curriculum and prior to beginning APPEs) were higher than the 2015 cohort's mean scores on all 14 competencies. Differences in scores reached significance in all but two competencies (communication with patients and caregivers, and administrative skills). The changes in mean scores decreased after exposure to APPE rotation sites and practice settings during rotations 1 and 2. Three of the 14 competencies maintained statistical significance at the conclusion of rotation 2 (technical skills, service and attitude, and self-directed learning).
The change in preceptor-evaluated student performance mean scores (2016 cohort mean score minus 2015 cohort mean score) on the competencies at the conclusion of rotations 1 and 2 can be seen in Table 3. There were no significant changes in student mean scores on the 14 performance competencies at the conclusion of rotation 1 or rotation 2.
At the conclusion of the 2014-2015 academic year, P2 (n=134) and P3 (n=133) students who had completed two semesters of the redesigned laboratory curricula were invited to complete course evaluations and a survey regarding their confidence to perform various pharmaceutical care skills. Evaluations and surveys were administered through Qualtrics (Qualtrics, Provo, UT), and participation was anonymous and voluntary. Course instructors did not review course evaluations or completed surveys until course grades had been finalized and awarded.
Students were asked to report their perspectives on the previous academic year (fall and spring semester courses) using a five-point unipolar agreement scale (1=not at all in agreement; 3=moderately agree; 5=extremely agree). Course evaluation completion rates in the P2 and P3 class cohorts were 96% (n=128) and 79% (n=105), respectively. Results of student self-reported level of agreement with course evaluation measures are reported in Table 4. In general, P2 and P3 students reported agreement with course evaluation criteria. The majority (≥83%) of P2 and P3 students reported agreement (very much or extremely agree) that (1) they could apply information and skills learned in the course to real-life practice settings, (2) they could apply the learning in the class to their future profession, and (3) the course material was pertinent to their professional training. Students in the P2 and P3 classes also reported agreement (very much or extremely agree) that the course positively affected their problem-solving abilities (57% and 66%, respectively) and encouraged critical thinking (68% and 82%, respectively). Additionally, when P3 students were asked to compare their pre-revision P2 laboratory experience with the revised curriculum in their P3 year, 68% very much or extremely agreed that the new curriculum better prepared them with the skills and abilities necessary for APPEs.
Post-lab Revision Results of Self-reported Student Level of Agreement Measuresa
In general, students reported lower agreement scores on course evaluation questions related to the assessment, timing of feedback, and debriefing of performance-based activities. Forty-four percent of participating students in the P3 class reported that they very much or extremely agreed that debriefing occurred in a timely manner after completing PBAs, compared to 62% of the P2 class. Additionally, 48% of P2 students and 39% of P3 students reported that they very much or extremely agreed that ample feedback was provided on performance of skills prior to being tested and graded on those skills. Thirty-three percent of P2 students and 44% of P3 students very much or extremely agreed that summative PBAs accurately evaluated their ability to perform pharmaceutical care skills in real-life practice situations.
Students were asked to answer questions about how confident they felt at that moment in time to perform various pharmaceutical care skills. A five-point unipolar confidence scale was used (1=not at all confident; 3=moderately confident; 5=extremely confident). Participation in P2 and P3 classes was 96% (n=128) and 77% (n=103), respectively. Student self-reported confidence to perform various pharmaceutical care skills can be found in Table 5. Self-reported confidence to perform skills was higher in the P3 student cohort compared to the P2 student cohort in all but two pharmaceutical care skills: (1) prioritize identified drug-related problems by assessing the urgency and risk associated with each problem and (2) perform physical assessment skills.
Post-lab Revision Results of Self-reported Student Level of Confidence Measures to Perform Pharmaceutical Care Skillsa
Seventeen APPE students were invited to participate in the focus group, and seven attended (41%). Two laboratory faculty members served as facilitators of the focus group and a third took notes. A semi-structured interview was conducted with the focus group participants to explore students' perceptions of their preparedness for APPEs, advantages and disadvantages of the old vs new laboratory curriculum, and the realism and utility of learning activities related to skill development and assessment in the new laboratory curriculum. The focus group session lasted approximately 60 minutes, and audio was recorded with students' knowledge and approval. Notes taken during the live focus group and the audio recording were used to create a transcript of students' responses during the session. Transcripts were reviewed by the laboratory faculty members who facilitated the focus group session. Comments were grouped and categorized to identify overarching themes related to the structured interview questions. In general, students felt the increased realism of simulated clinical activities during laboratory sessions better prepared them for their APPEs. They preferred that more laboratory time be spent in active learning, with more individual student accountability for assignments. They desired additional formative assessments as lower-stakes graded assignments (<5% of final course grade) and more in-class time devoted to building specific clinical skill areas.
In the spring of the P2 year, all students were graded on their blood pressure and heart rate measurement techniques. Students were initially taught these skills during an introductory pharmacy practice experience (IPPE) completed during the first year of the PharmD program. Technique was reviewed with students during the spring semester of the P2 year, and students were provided several opportunities to practice. Students were formally graded using a blood pressure simulation arm, and the same evaluation criteria were used to test both cohorts. The average score on the blood pressure and heart rate measurement activity increased from 90.0% prior to the revisions (n=133) to 93.4% following the revisions (n=133), but the improvement was not significant (p=.35).
Also in the spring of the P2 year, a similar summative PBA in which students counseled a patient on an inhaler was conducted before and after the course changes. For the content and communication portions of the examination, there were 13 and 17 grading items, respectively, that were identical across the two years. Students performed similarly on both the content section (n=133 pre-revision, average score: 90.9%; n=132 post-revision, average score: 90.2%; p=.47) and the communication portion (n=133 pre- and post-revision, average scores of 93.3% and 93.7%, respectively; p=.61) of the patient interaction.
In the spring of the P3 year, students completed a summative PBA in which they conducted a simulated ambulatory care visit consisting of four stations: patient interview, communication of recommendations to the patient's provider, patient education and response to questions or concerns, and documentation of the clinic encounter in a SOAP note. Three grading rubrics were used to assess the stations: one for the provider interaction (station 2), one for the SOAP note (station 4), and a combined rubric for the patient interview and education (stations 1 and 3). Ten grading items on the rubrics were consistent between the pre- and post-revision PBA for the patient interview and education component, five items were consistent for the provider interaction component, and nine items were consistent for the SOAP note component. Following the revisions, the average performance score on the patient interview and education significantly increased from 77.9% (n=131) to 89.3% (n=132; p<.001), and the average percent score on the SOAP note significantly improved from 84.6% to 90.6% (p<.001). Although student performance on the provider interaction decreased from 79.7% to 77.7% following the revisions, this change was not significant (p=.36).
Within one week of completing a laboratory session, P2 and P3 students were required to record time spent on pre- and post-laboratory activities. Students were given ranges for time spent per week in 15-, 30-, and 60-minute intervals, from none to more than 3 hours. For calculation purposes, the midpoint of each time range was used (eg, for 1-15 minutes, 7.5 minutes was used), except for the last option of "more than 3 hours," for which 3 hours was used. Across all four revised courses, students averaged 82 minutes on pre-laboratory activities (average range of 64.7 to 91.5 minutes) and 22 minutes on post-laboratory activities (average range of 15.4 to 29.4 minutes). Course faculty members recorded student time spent in each laboratory session. Students spent an average of 96 minutes in the laboratory completing hands-on activities (average range of 69.4 to 126.9 minutes). On average, a student spent a total of 199 minutes on each laboratory (average range of 179.4 to 226.1 minutes, with both the smallest and largest amounts of time spent by students in the P2 year).
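As a brief illustration, the midpoint convention can be expressed in a few lines of Python; the intermediate range boundaries below are assumptions, since the text gives only the 1-15 minute midpoint and the "more than 3 hours" cap.

```python
# Sketch of the midpoint convention for converting reported time ranges
# to minutes; only the 1-15 minute midpoint (7.5) and the 3-hour cap
# are stated in the text, so the other ranges here are illustrative.
range_midpoints = {
    "none": 0.0,
    "1-15 min": 7.5,             # midpoint stated in the text
    "16-30 min": 22.5,           # assumed analogous midpoint
    "31-60 min": 45.0,           # assumed analogous midpoint
    "1-2 hours": 90.0,           # assumed analogous midpoint
    "more than 3 hours": 180.0,  # capped at 3 hours, per the text
}

# Hypothetical survey responses for one laboratory's pre-lab activities
responses = ["1-15 min", "16-30 min", "more than 3 hours", "31-60 min"]
avg_minutes = sum(range_midpoints[r] for r in responses) / len(responses)
print(f"Average pre-laboratory time: {avg_minutes:.1f} minutes")
```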
To assess for differences in student performance on the Pharmacotherapy objective examinations prior to and following the course redesign, question-level differences across the years within a course were evaluated. Only questions that were identical across years were used in the analysis. The percentage of students who answered each question correctly was recorded, and the percent correct was summed across all questions in a course. If a difference was seen between the two years, the percent difference was subtracted from each question in the higher-performing year to equalize the two groups. After this adjustment, a difference of 10% or greater in question performance between the two years was considered by the UW-Madison Testing and Evaluation Services to indicate a significant difference in student performance.
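A minimal sketch of this equalization procedure is shown below; the per-question values and the interpretation of the cohort-level difference as the mean per-question difference are assumptions made for illustration.

```python
# Sketch of the question-level equalization procedure, with hypothetical
# percent-correct values for matched questions in the two years.
pre = {"q1": 85.0, "q2": 72.0, "q3": 90.0}   # pre-revision percent correct
post = {"q1": 88.0, "q2": 60.0, "q3": 95.0}  # post-revision percent correct

# Cohort-level difference: mean per-question difference in percent correct
diff = (sum(post.values()) - sum(pre.values())) / len(pre)

# Subtract the cohort-level difference from each question in the
# higher-performing year to equalize the two groups
if diff > 0:
    post = {q: s - diff for q, s in post.items()}
else:
    pre = {q: s + diff for q, s in pre.items()}  # diff is negative here

# Flag questions whose adjusted difference is 10 points or greater
for q in pre:
    delta = post[q] - pre[q]
    if abs(delta) >= 10:
        print(f"{q}: significant difference ({delta:+.1f} points)")
```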
For the Pharmacotherapy II course held in the spring semester of the P2 year, there were examination results for 131 to 135 students, depending on the examination. Thirty-one examination questions were matched between the two years (18.3% of questions in 2014 and 21.7% of questions in 2015). Out of 31 items, students performed better on five questions (covering five topics) and worse on four questions (covering three topics) in 2015 than in 2014. No change in performance was seen for the remaining 22 questions.
For the Pharmacotherapy III course held in the fall semester of the P3 year, there were examination results for 129 to 133 students, depending on the examination. Thirty-three examination questions were matched between the two years (24.1% of questions in 2013 and 25.4% of questions in 2014). There were no questions on which students performed better in 2014 and four questions on which students performed worse in 2014, all of which covered one topic area. For 29 of the questions, no difference was seen. Data from the Pharmacotherapy I and IV courses were not available for assessment.
DISCUSSION
Through a holistic, multidimensional assessment plan, faculty members were able to evaluate the impact of separating the laboratory and lecture components of a comprehensive pharmacotherapy course into standalone courses, with a focus on increasing students' APPE readiness. Results from course assessments of student confidence and skill performance indicate that students completing the new laboratory curriculum were equally or better prepared to begin APPEs. Social cognitive theory suggests that students' perception of their own ability, or confidence, is a key component of performance.21,22 Individuals who are confident that they can successfully complete a task, or who have high self-efficacy, are more likely to succeed because of increased engagement and motivation.21-23 Student self-reported confidence to perform pharmaceutical care skills in the P2, P3, and APPE years was used to evaluate student learning and experience in the laboratory course and to assess APPE readiness.
Survey results regarding student confidence in performing pharmaceutical care skills during the P2 and P3 years (Table 5) can be considered along with performance results on PBAs during these same years to provide insights into student learning and experiences. As one would expect, self-reported confidence to perform pharmaceutical care skills shifted from the "not at all/slightly confident" categories in the P2 year to the "moderately and very/extremely confident" categories in the P3 year. Despite the absence of a comparator group, these results suggest that the redesigned laboratory courses, in addition to other experiences, contributed to sequential and progressive skill-building over the four-semester sequence. Given this increase in confidence, social cognitive theory suggests that students' ability to perform these skills would follow suit and increase from the P2 to the P3 year, thus enhancing APPE readiness. Although student confidence scores increased, not all P3 students rated themselves as very to extremely confident on all of the skills. However, this is not surprising given that the laboratory classroom is a simulated environment, and students' confidence is expected to continue increasing with more time and experience in actual practice. Additionally, based on preceptor performance evaluations, this level of confidence did not affect student performance on APPEs.
Objective examination results also provide insight into students’ didactic knowledge during the formal curriculum. There were limited differences in Pharmacotherapy examination scores prior to and following laboratory revisions, suggesting that division of the laboratory and lecture components had minimal impact on student performance in the lecture-based Pharmacotherapy course and on pharmacotherapy knowledge.
Within the APPE year, student self-evaluation of performance, or student-perceived confidence to perform pharmaceutical care skills (Table 2), can be considered alongside preceptor-reported student performance data (Table 3) to determine the effectiveness of the redesigned courses on student readiness for APPEs. Self-perceived confidence at baseline was higher in the large majority of competencies after the revisions; however, by the conclusion of rotation 2, increased confidence remained in only three competencies, and preceptor evaluations showed no differences between the cohorts. These results suggest that the redesigned laboratory curriculum increased student self-reported APPE readiness but neither helped nor hindered overall student performance across the first two APPE rotations. They also show that APPE experiences provide opportunities for growth above and beyond those provided through laboratory instruction.
Focus group and student survey results suggested the revised laboratory course provided a learning experience that was pertinent and applicable to actual pharmacy practice. After experiencing both the old and the new curriculum, P3 students felt the new curriculum better prepared them with the skills and abilities necessary for APPEs. While this provided encouragement for continuation of the new laboratory courses, feedback also presented opportunities for additional revisions to further improve students' learning experience in subsequent semesters. In particular, constructive feedback centered on increasing timely and formative feedback opportunities. Initially, this presented a challenge given student concerns about an already high workload for a one-credit course. However, findings from the time-on-task surveys helped to quantify and justify the appropriateness of the course workload in accordance with the university's expectations for a one-credit laboratory course, which include "2-3 hours of laboratory activities per week plus some work beyond the activities in the lab."24
The entire process of course mapping, redesign, and evaluation was very time-consuming, spanning several years. However, the benefits of this process have already been seen with more emphasis on skill development and assessment in the curriculum, without adversely affecting APPE readiness.
The assessment plan had several limitations. While many of these limitations are confounders to the assessment results specific to UW-Madison School of Pharmacy, it is the authors’ hope that other schools and colleges of pharmacy desiring to assess a skills-based course in a similar manner could use these limitations to inform development of their own prospective assessment plan.
In general, UW-Madison School of Pharmacy preceptors have historically rated baseline student APPE performance on the higher end of the course competencies assessment tool. High baseline scores make detection of an improvement in student performance difficult but are useful in assessing whether a noticeable decrease in performance has occurred. The differences in student self-evaluation scores were also small, and it is unknown whether that difference had a noticeable impact on student performance at the start of the first APPE. Other institutions should consider the magnitude of change necessary to indicate a relevant change in APPE student performance as measured on their own competency assessment tools, and what this means for the corresponding utility of this type of assessment.
The PBAs provided a comparison of skill ability prior to and following the course revisions, which directly aligned with the purpose of the revision. However, as part of the revision process itself, the rubrics used to evaluate student performance were also revised. This meant the rubrics as a whole were not identical, although components of the rubrics were consistent pre- and post-revision. Faculty members felt this provided reasonable and useful comparison points because the patient scenarios were very similar, but identical rubrics would have provided a more robust assessment. Because of the extent of the course revisions, only a few PBAs were comparable before and after the changes. Comparison of Pharmacotherapy examination scores also involved several confounding factors: the placement and order of lecture topics varied between the two years, and only a portion of the questions were consistent across both years, allowing for only a limited comparison. Other schools and colleges of pharmacy may consider assessing course revisions in steps to avoid such confounders and to ensure that robust pre- and post-revision PBA and examination assessments are possible. If too much of a course is altered at once, valid pre- to post-revision comparisons become more challenging.
There were several limitations regarding student groups. Baseline characteristics of the different cohorts (age, gender, GPA, etc) were not available and thus are not included in this article. Additionally, students could have had outside experiences that influenced the results (employment, internship, or IPPE experiences). On student surveys, third-year students reported that the new curriculum better prepared them for APPEs compared to the old curriculum they experienced during the P2 year. While this is useful feedback, it must be interpreted with some caution given that those students did not experience both versions of the same courses; instead, they compared the old P2 curriculum to the new P3 laboratory curriculum, which, while purposefully sequenced, are separate courses. Additionally, the new cohort of APPE students had completed only one year of the new laboratory curriculum. Ideally, students completing two years of the new laboratory courses would have been included. However, higher-level and more complex skills that build upon the P2 courses are taught and practiced in the P3 year and thus relate more directly to APPE readiness.
Students who attended the focus group were purposefully selected, which could have introduced bias into the feedback provided. The focus group was further limited by the lack of a neutral facilitator and the small number of students, which could have been due to the time of day the group was held and student interest in attending. Careful and purposeful selection of student cohorts for comparison and individual students for focus group inclusion is recommended for other institutions considering similar assessments of the learning experience. Specific learning outcome(s) being targeted for assessment (Table 1) can be used to inform how and which student groups are selected for the most robust assessment.
CONCLUSION
This article described the development of a holistic assessment plan to evaluate the impact of a four-semester laboratory course series on APPE readiness following its separation from the Pharmacotherapy lecture component. Future plans include continuing the laboratory as a standalone course, with internal assessments in the form of course evaluations and comparisons of the same or similar PBAs for continuous quality improvement.
Overall, the assessment plan was multidimensional and provided a holistic view of the course revisions. The prospective nature of the evaluation also contributed to its usefulness to course faculty members. A similar robust assessment plan could be used at other institutions to evaluate skills-based courses as part of continuous quality improvement.
ACKNOWLEDGMENTS
The authors would like to thank James Wollack, associate professor and director, Testing and Evaluation Services at the University of Wisconsin-Madison, for his assistance with the examination question analysis.
Received June 30, 2016.
Accepted November 18, 2016.
© 2017 American Association of Colleges of Pharmacy