Abstract
Objective. To describe the development and implementation of an innovative, comprehensive, multi-day module focused on assessing and providing feedback on student cognitive and interpersonal skill development and practice readiness after the first year (PY1) of a Doctor of Pharmacy (PharmD) curriculum.
Methods. A multi-day capstone assessment was developed to evaluate first-year students’ knowledge of course content, ability to find and apply information, and interpersonal skills, including teamwork and adaptability. The PY1 Capstone consisted of four parts. Knowledge was assessed using 130 multiple-choice items on first-year course content and 50 fill-in-the-blank items on Top 200 brand and generic drug names. The ability to find and apply information was assessed using a 45-question open-book test. Interpersonal skills were assessed using a specially designed multiple mini-interview (MMI). The final part of the assessment was a debriefing session that provided rapid-cycle feedback on capstone performance and a bridge between students’ recently completed first-year coursework and an upcoming 2-month experiential immersion.
Results. The average scores on the closed-book and open-book assessments were 75% and 68%, respectively. Most students displayed satisfactory interpersonal skills based on the MMI. Based on post-assessment survey responses, most students (>75%) viewed the assessment positively. Most students (98%) reported not studying for the assessment, indicating that the results should reflect students’ retention of knowledge and skills.
Conclusion. The capstone assesses students on knowledge and skills and provides students with feedback on areas to focus on during their early immersion. Continued work is needed to ensure the process is transparent and cost-effective.
INTRODUCTION
Emerging strategies in improvement science advocate the use of rapid-cycle testing to improve organizational decision-making and optimize outcomes.1,2 Institutions undergoing curricular change must develop organizational strategies that effectively evaluate and inform the improvement of core educational elements believed to be critical for student development. From a systems perspective, these strategies should include assessment processes that are transparent, participatory, sustainable, responsive, and contextualized to the institution.3 In other words, efforts to evaluate and improve the system (ie, curriculum) should be transparent and specific to all relevant stakeholders, including students, faculty members, preceptors, and administrators, and must be built in a participatory and sustainable way. As schools of pharmacy pursue curriculum change, we propose that this approach be used for assessing the extent to which a curriculum promotes practice readiness.
Assessment of student progress and practice readiness is an important part of student and curricular development. As such, the Accreditation Council for Pharmacy Education (ACPE) Standards and the Center for the Advancement of Pharmacy Education (CAPE) 2013 Educational Outcomes address the need for these types of assessments.4,5 The CAPE outcomes also encourage integrated assessments to ensure students are retaining, integrating, and applying their knowledge, skills, and attitudes. Formal opportunities designed to help students connect the multiple facets of their academic experiences, also called capstones, are increasingly common in health professions education. Within pharmacy education, schools have typically implemented capstones at the end of the didactic curriculum, immediately prior to advanced pharmacy practice experiences (APPEs), to identify students who are unprepared for rotations.6-12 While several schools of pharmacy report using assessments after the first year (PY1), there are no published reports of assessments designed to go beyond knowledge and skills to holistically assess students’ experiential learning readiness in the early years of the curriculum, while providing feedback for the dynamic process of curriculum development.9,13-17
In spring 2016, the UNC Eshelman School of Pharmacy designed and implemented a capstone at the end of the first year of a new Doctor of Pharmacy (PharmD) curriculum launched in fall 2015. The new PharmD curriculum focuses on developing graduates who are exemplary practitioners providing high-quality, team-based, patient-centered care; innovators who recognize the health care needs of both patients and society and who lead teams toward change to improve patient care; and lifelong learners who continually strive for positive impact.18 The core competencies of the curriculum reflect the skills, knowledge, and abilities believed to be critical for success in the practice of pharmacy, including in-depth knowledge and proficient skills in the pharmaceutical sciences and practice of pharmacy, as well as broader lifelong learning skills such as accessing and analyzing information, critical thinking and problem solving, communication, collaboration and influence, adaptability, initiative, curiosity and inquisitiveness, and professionalism and ethical behavior.18 To promote the development of these competencies, the curriculum was designed with a focus on earlier and increased immersion in experiential education and active in-class learning.
The PY1 Capstone was built to align with the school’s core competencies and PY1 course outcomes and designed to meet the school’s quality assurance needs by assessing students’ retention of knowledge from foundational courses, ability to integrate and extend their knowledge from multiple courses, and development of interpersonal skills. The capstone also responded to quality improvement needs by informing the school’s faculty about strengths and deficits in student learning in the new curriculum; informing early immersion preceptors about student knowledge and skill development in the first year; and promoting metacognition and self-directed learning by providing students with feedback on strengths as well as areas in need of self-remediation. In addition, the assessment was mapped to accreditation standards and the Pharmacy Curriculum Outcomes Assessment (PCOA) to allow for a gap analysis. The purpose of this paper is to describe the development and initial performance of the PY1 Capstone and planned next steps.
METHODS
This study was exempt from review by the University of North Carolina at Chapel Hill Institutional Review Board. The PY1 courses address core pharmacy knowledge and skill-based competencies (eg, accessing and analyzing drug information, solving pharmaceutical calculations, developing a working knowledge of medical terms, extemporaneous compounding, collaborating in team-based healthcare settings, and performing the patient-centered care process).19 These courses also foster the development of adaptability, effective communication, collaboration, initiative, and professionalism. At the end of the spring semester, immediately preceding the PY1 Capstone, the Foundations of Patient Care course includes a final objective structured clinical examination (OSCE) to summatively assess each student’s competence in the patient-centered care process (eg, medication history taking). As such, the capstone was developed to assess other aspects of early practice readiness as described below.
The PY1 Capstone was scheduled immediately following completion of PY1 course work and immediately preceding the students’ first immersion experience. Five months prior, a six-person Capstone Planning Team (CPT) was charged with developing a capstone consistent with the philosophy and guiding principles of the new PharmD curriculum. The CPT’s objective was to create a comprehensive experience to retrospectively assess student learning in the PY1 curriculum and prospectively bridge those students into their early immersion experiences. During the design and implementation process, over 40 individual faculty members, preceptors, and staff members were involved.
To inform design, 19 faculty members and preceptors were invited via email to participate in an interview to determine what knowledge and which skills were expected of students in the upcoming early immersion experiences. Eleven of the 19 consented and were interviewed individually for 15 to 30 minutes. Each interviewee was asked to “...identify no more than three major areas students should review and receive feedback on before their early immersion experience this summer.” The content and skill areas identified by interviewees were aggregated and found to be consistent with information that had been gathered from a larger body of preceptors at the institution in an earlier study.20
Based on the collected information, CPT members selected the specific areas the capstone assessments should focus on and drafted a proposal for the capstone format. The CPT design intentionally omitted any content beyond the scope of the PY1 course learning outcomes, content that would be covered during the second and third year of the PharmD curriculum (eg, pharmacotherapy), and skills-based competencies that had already been assessed extensively in PY1 courses (eg, extemporaneous compounding, vaccine administration, etc). The CPT proposed a four-part capstone format, including: a closed-book knowledge assessment; an open-book, find-and-apply assessment; an interpersonal skills assessment using the multiple mini-interview (MMI) format; and a debriefing session with the students. Subcommittees consisting of three to four individuals, including PY1 course directors, academic fellows, and faculty members with assessment expertise, developed each capstone section.
Subcommittee work was guided by the overarching capstone design goal: to develop a comprehensive experience that aligned both horizontally (ie, course content) and vertically (ie, levels of learning) with the PY1 curriculum, that was useful for providing feedback to students about personal strengths and weaknesses, and that was informative for identifying curriculum strengths and weaknesses. To ensure the content was consistent with these goals, all individual components (eg, examination items, prompts, and interview questions) were tagged to identify the specific courses in which that content was taught and the relevant PCOA content areas to enable standardized external assessment benchmarking. Additionally, all capstone questions were tagged using a reduced Bloom’s scale, with group 1 questions drawn from the knowledge and understanding domains, group 2 questions from the application and analysis domains, and group 3 questions from the creation and evaluation domains.
The closed-book knowledge section of the capstone assessed students’ retention of knowledge from their didactic coursework, with the number of questions on a given area consistent with the relative contribution (credit hours) of each PY1 course (Appendix 1). The closed-book assessment focused on major learning objectives from these courses and the lower levels of Bloom’s Taxonomy of the Cognitive Domain (group 1). Course directors provided access to questions from their examinations and quizzes, most of which were warehoused in ExamSoft (ExamSoft Worldwide, Inc, Dallas, TX) or the school’s learning management system. Questions were selected based on instructor recommendation, Bloom’s cognitive level, and question quality as determined by previous item performance analysis. Some items were modified to avoid answer recall or to improve the clarity of the item stem.
The knowledge assessment consisted of two separate sections: 50 questions assessing recall of Top 200 brand and generic drug names and 130 questions assessing knowledge of drug class, indication for use, mechanism of action, structure-activity relationships, fundamental aspects of pharmacokinetics and dosage forms, pathophysiology, medical terminology, clinical pharmacology, and pharmaceutical calculations. Questions ranged from simple definition recall and basic pharmaceutical calculations to application of knowledge in response to clinical vignettes. Students were provided four hours to complete the closed-book assessment, which had an estimated completion time of two hours.
The find-and-apply assessment was an open-book section of the capstone designed to assess higher-order cognition (eg, application, analysis, evaluation, synthesis). This section also assessed students’ ability to find, interpret, and integrate drug information; make reasonable interpretations of the information; and synthesize a response to drug information questions. Additionally, this assessment was intended to help students “bridge” classroom learning to experiential learning by posing scenarios that a preceptor might ask during early immersion (eg, drug information requests). The assessment contained questions to assess students’ metacognition, self-awareness of their current knowledge, judgment of “when to ask a preceptor for help” during a direct patient care experience, and knowledge of drug information resources.
The subcommittee charged with developing this open-book assessment included course directors for the Evidence-Based Practice and Skills Laboratory courses. Course directors submitted questions related to the goals of this section. The subcommittee integrated and refined the items. Clinical vignettes were presented, followed by several related questions that required students to find information using primary or secondary literature, cite their sources, document a second source used to verify the information, interpret the information, and formulate a reasonable response to the question.
The open-book assessment contained 44 drug information questions, and students were allowed to access any Internet resource. Five questions required students to cite the resources they used to synthesize their answers, three questions required students to rate their confidence in their answer choice, and three questions contained the answer choice, “I do not know/ask my preceptor.” Students were given four hours to complete the assessment, which had an estimated completion time of two hours.
The purpose of the interpersonal skills assessment was to assess various professional attributes believed to be critical for success in the transformed curriculum and beyond.18 It was administered as a multiple mini-interview (MMI), consistent with the MMI used in the school’s admissions process.21 The logistics of the MMI are similar to those of an OSCE, in which students rotate between timed assessment stations.22 The Capstone MMI (c-MMI) was, to the best of our knowledge, the first use of this process for program assessment purposes.
Four c-MMI stations were created to assess students’ teamwork skills, integrity, adaptability, and empathy, which are constructs aligned with evaluations completed during the admissions MMI and PY1 coursework. Faculty members and postdoctoral fellow interviewers were trained before and on the morning of the c-MMI.
At each of the four stations, facilitators allowed students two minutes to read a situational prompt posted outside of the room. At three of the c-MMI stations, one student entered the room and had six minutes with a single interviewer to respond to the prompt. At the fourth station, three students entered the room together and had 25 minutes in front of two interviewers to complete a series of team-based exercises.
The c-MMI design team created the scenarios, which were then vetted with the CPT and piloted with a small group of postdoctoral fellows. Efforts were made to ensure students had not previously encountered any of the capstone scenarios during the admissions process. Interviewers were provided with a series of probing questions to facilitate discussion with the students and provide further insight into the attribute of interest. At the end of the discussion, interviewers were given two minutes to rate the student on four constructs of interest, each measured on a 10-point scale ranging from 1 (needs improvement) to 10 (outstanding): the station’s attribute of interest (eg, teamwork); communication; critical thinking; and pharmacy appreciation, which reflected the student’s ability to articulate the importance of the attribute of interest to pharmacy practice.
Several aspects of the c-MMI differentiated it from the admissions MMI. For example, while prospective candidates do not receive MMI scores or formative feedback after the admissions MMI, the c-MMI was intentionally designed to provide students with feedback on their strengths and opportunities for improvement. As such, all interviewer ratings were collected via paper forms and then entered into ExamSoft, making it possible to provide individual student feedback on day 3 of the capstone. In addition, the c-MMI included probing questions to assess students’ “pharmacy appreciation” following one year of coursework. These questions prompted students to articulate their understanding of how each construct related to pharmacy practice.
Because the capstone was a low-stakes assessment, formal grades were not assigned to students based on their performance. However, students did receive feedback regarding their performance on the open- and closed-book assessments and were provided their individual overall percent correct (0 to 100%), their percent correct for questions tagged by course, and the class average, which illustrated where individual students performed in relation to all others in the class. Grading occurred within the 36-hour window between the last assessment and the debriefing session.
For the c-MMI, student performance was rated on a scale of 1 to 10 for internal purposes. The 10-point MMI scale was collapsed and simplified to a three-point scale for reports provided to students. Station scores of four or less were labeled “needs improvement”; scores between five and eight were labeled “satisfactory”; and scores of nine or 10 were labeled “outstanding.”
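Stated compactly, this collapse maps each raw station score s to the reported category as follows:

\[
\text{category}(s) =
\begin{cases}
\text{needs improvement}, & 1 \le s \le 4 \\
\text{satisfactory}, & 5 \le s \le 8 \\
\text{outstanding}, & 9 \le s \le 10
\end{cases}
\]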
At the end of the PY1 Capstone, a debriefing session provided each student with feedback on their performance using ExamSoft reports. Verbal feedback on the class’s collective performance was provided in a large-group debriefing presentation on day 3. The session explained how to interpret the individual performance reports and answered student questions about how to act on that information.
Following the explanation of how to interpret capstone reports and feedback, a panel discussion was held on how students should connect their PY1 Capstone performance to the upcoming eight-week immersion experience. The panel consisted of seven individuals (three fourth-year students, one second-year student, two postdoctoral fellows who were alumni of the school, and one precepting faculty member) who discussed practical issues related to working in clinical practice settings, managing expectations during the immersion, taking initiative, and receiving feedback. As requested by members of student leadership, the panel was followed by a presentation of professionalism tips designed to help students carry the ideals of professionalism from didactic instruction into the forthcoming immersion experience and to encourage them to focus on patient-centered care while remaining mindful to build professional relationships with the patient and healthcare team. Students were instructed to discuss their capstone feedback with their summer immersion preceptors to help individualize and focus their experiences.
At the end of the debriefing session, a survey was administered to collect information from students about their experiences and perceptions of each part of the capstone. The survey instrument included 21 items, rated on a four-point scale from strongly disagree to strongly agree, with an option of “I do not know/cannot answer.” The survey also included three open-text questions asking about the most useful aspects of the PY1 Capstone, suggestions for improving the capstone, and questions about the capstone that had not been answered.
The PY1 Capstone was implemented during the last week in April, one week after the spring semester final examinations. Approximately two weeks before the capstone, the school provided information and instructions to the PY1 students, including the overall capstone structure, what files to download, and what to bring to the assessment. Students were informed that the capstone was a formative, low-stakes assessment with no associated grade and encouraged not to study for it.
Although the PY1 Capstone was a new element planned on a relatively short timeline, all four parts were executed smoothly over a three-day period. The closed-book knowledge and open-book assessments were electronically administered under examination conditions in large lecture halls using ExamSoft in the afternoons of days 1 and 2. Interpersonal skills assessments were conducted with individuals and small groups during the mornings of days 1 and 2.
All quantitative data analysis was conducted in SPSS for Windows, Version 21 (IBM, 2013). The Kuder-Richardson Formula 20 (KR-20) was used to examine the reliability of items used in the closed-book and open-book examinations, with values above 0.70 considered high and values between 0.40 and 0.69 considered acceptable. Survey data were binned into two categories: disagree represents responses of strongly disagree and disagree; agree represents responses of strongly agree and agree. Continuous data are presented as means and standard deviations (SD). Frequency data are presented as number (percentage).
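For reference, the KR-20 coefficient cited throughout the Results is conventionally defined as follows (this standard formulation is supplied here for the reader; it is not reproduced in the original analysis plan):

\[
\mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i\,(1 - p_i)}{\sigma_X^{2}}\right)
\]

where k is the number of dichotomously scored items, p_i is the proportion of students answering item i correctly, and \sigma_X^2 is the variance of students’ total scores. Values approach 1 as items covary strongly relative to total-score variance.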
RESULTS
On the Top 200 brand and generic drug names assessment, the average student score was 60% (SD=24) (Table 2). On the remaining closed-book section of 130 questions, the students scored 75% on average (SD=9) (Table 2). The closed-book examination demonstrated a high level of reliability, with a KR-20 of 0.85. On average, students completed this examination in 1.8 hours with a range of 1.0 to 3.5 hours.
Table 1. Alignment of Curricular Outcomes With a Capstone for First-Year Doctor of Pharmacy Students
Table 2. Summary Statistics of the Knowledge Assessment and the Find & Apply Assessment Administered to Pharmacy Students as Part of a Capstone Completed at the End of Their First Year
For the open-book, find-and-apply assessment, the average score was 68% (SD=9), and the mean time for completion was 2.3 hours (range: 1.0-4.0 hours). This assessment showed acceptable reliability, with a KR-20 of 0.47. Three questions allowed students to select “Ask your preceptor.” Two of these were intentionally written so that they could not be answered with the information given. On these two questions, 64% of students answered incorrectly, 34% indicated they would ask their preceptor on one of the two questions, and 2% indicated they would ask their preceptor on both questions. The third question had a correct answer but also offered the “Ask your preceptor” option. Only 18% of students answered it correctly; 8% chose “Ask your preceptor,” and the remaining 74% answered incorrectly.
On the c-MMI, average ratings ranged from 6.4 (SD=1.5) for teamwork to 7.8 (SD=1.7) for integrity. On each station, most students received a rating of “satisfactory” (raw score=5-8), indicating that raters believed students performed at a level appropriate for having completed the first year of the PharmD curriculum. In the c-MMI, 109 (73.6%), 81 (54.7%), 88 (59.5%), and 90 (60.8%) students received a rating of satisfactory for teamwork, adaptability, integrity, and empathy, respectively.
All students completed the debrief survey instrument (n=147, 100% response rate). Only one student (0.7%) agreed with the statement “I studied (or attempted to study) in preparation for the capstone.” More than 98% of students reported that they did not attempt to prepare for this assessment, indicating that the capstone outcomes likely were valid representations of students’ knowledge and skill levels at the end of year one of the new PharmD curriculum. Even though students did not study and there were no individual grades attached to the assessment, more than 85% of students self-reported devoting their best efforts to the capstone assessments (Table 3). Approximately two-thirds of the students felt that each section of the capstone accurately reflected the skills and knowledge they obtained during the first year of the new curriculum. Approximately 70% of students reported they believed the feedback would be useful for them (Table 3).
Table 3. Pharmacy Student Perceptions About a Capstone Experience Completed at the End of the First Year of the PharmD Program
DISCUSSION
We piloted a formative capstone experience that assessed student knowledge and skills following the first year of a new curriculum. The capstone provided key information to students about their performance, and results were shared broadly with a wide range of internal and external stakeholders. For example, results were shared with the school’s Curriculum Transformation Steering Committee and Assessment Committee. In addition, results from the PY1 Capstone were presented at a faculty town hall meeting and at the school’s annual Educational Renaissance Symposium, a required all-day event for faculty members.
Each section of the assessment was built to provide both content and face validity. The closed-book section assessed core knowledge and borrowed questions directly from the PY1 course assessments. The open-book section assessed students’ ability to integrate content knowledge and find-and-apply information to situations that would emulate a drug information request during their forthcoming experiential education. The interpersonal skills assessment, c-MMI, aligned with admissions criteria and the core competencies of the curriculum. Each of the three capstone assessment elements demonstrated acceptable reliability scores, based on the KR-20 and the psychometric properties of the c-MMI.23
Development of the capstone required significant time (approximately four months) and human capital (approximately 45 faculty and staff members). Setting aside the cost of faculty and staff time, minimal financial resources (less than $50) were used for c-MMI supplies. Given the resources required to develop and implement the PY1 Capstone, evaluating its short- and long-term benefits will be important. The key metrics will be how well the capstone met the goal of assessing students’ preparation for experiential education and how useful it is for informing improvements in the PharmD curriculum. The full benefits of the capstone cannot yet be determined because both metrics require several years of data and are therefore beyond the scope of the current report. The immediate benefits of the capstone are clear, however, and include providing useful feedback to learners, preceptors, and faculty members.
The c-MMI is likely more cost-effective than OSCEs for the targeted assessment of specific interpersonal skills. While OSCEs are clearly useful for assessing patient-care skills, such as medication education, they generally target multiple constructs and processes in a single station, making it difficult to evaluate single constructs such as critical thinking, adaptability, and pharmacy appreciation.24 Additionally, designing separate OSCE experiences to assess all the constructs assessed in the c-MMI would be a costly endeavor.25,27 Another comparative advantage of the c-MMI is that it does not require hiring standardized patient actors, expending faculty time to develop cases, or scheduling and renting specialized OSCE rooms.
The capstone closed-book knowledge assessment was the easiest section to develop because it used preexisting questions from PY1 courses. Most of the time required to develop this section was spent editing questions for consistency (eg, ensuring there was a single correct response) and tagging questions to the appropriate categories. The knowledge assessment represents an initial effort to map our foundational courses to PCOA content areas. Because the foundational courses were designed to integrate basic biomedical, pharmaceutical, and clinical science knowledge, mapping to PCOA content areas allows us to identify where specific pharmaceutical science content is taught. The closed-book knowledge section could be replaced by an available standardized examination (eg, PCOA). However, replacement with a standardized examination might limit the timely provision of individual feedback, and available standardized assessments may not align with a first-year PharmD curriculum.
The open-book, find-and-apply section was the most challenging and time-consuming assessment to create. The goal was to integrate coursework, align activities that could bridge students to experiential learning (ie, what they might be asked to do on summer immersive experiences), and assess students’ confidence and ability to use appropriate resources or ask for assistance. The design team considered using graduate education (ie, PhD) qualifying examination models to create the find-and-apply questions. However, it was not feasible to charge small teams of faculty members from various disciplines to develop, review, and revise examination items in such a short time period. Additionally, the capstone piloted novel methods for assessing students’ ability to ask for help, which is evidence of self-awareness, one of the school’s core competencies.
The c-MMI is a new assessment approach within the curriculum that provided important information to the school concerning student development of key interpersonal skills. Although the school has used the MMI model for several years in the admissions process, using this assessment during the capstone presented some unique challenges. For example, to provide students with timely feedback about their areas of strength and opportunity on the c-MMI, the c-MMI subcommittee had to design a rapid-cycle process for converting the paper-based scores into an electronic format. Other changes planned based on lessons learned from this assessment include allowing more time for evaluating students (eg, more than two minutes); having a facilitator at the teamwork station to help student teams stay on time and on task; evaluating the instrument (eg, “Are these the right subscales?”); and investigating the feasibility of including other or additional stations to assess students. These stations could be used to assess other aspects of our core competencies, such as leading by influence, agility and adaptability, initiative and entrepreneurialism, effective oral and written communication, curiosity and imagination, and self-awareness.
In the future, critical incidents (ie, events that made a student stop and think, or that raised questions impacting personal growth) encountered during students’ first pharmacy practice experience could be used to inform constructs for the interpersonal skills assessment (ie, MMI), and cases from the integrative pharmacotherapy coursework in the second-year curriculum could be used to better align the open-book, find-and-apply cases.
Overall, several lessons were learned from this initial iteration. The time commitment and resources necessary for successful planning and implementation are substantial. Successful execution of a holistic capstone experience required extensive communication among all stakeholders to determine logistics and examination content, as well as the best way to use the examination data to improve future student learning. The pilot PY1 Capstone generated a substantial amount of data, and there was no a priori decision on vetting of the data or what role the centralized school committees should play in analyzing and reporting the data. Regardless, these data provide baseline measures for evaluating revisions to the evolving curriculum and its assessments. Changes in annual capstone performance, for example, could be monitored following a number of possible changes within the first-year curriculum, such as integrating first year courses more intentionally, altering content to promote reinforcement of key concepts, or implementing new pedagogical approaches.
Notably, future iterations of the capstone will benefit from the following points not thoroughly addressed during the pilot year: benchmarking, implications for poor or failing performance, and a more developed feedback system. Additional improvements will include extended planning time and increased communication with stakeholders. Furthermore, the UNC Eshelman School of Pharmacy is developing a PY3 Capstone to be administered immediately before the fourth-year advanced pharmacy practice experiences (APPEs). The PY3 Capstone is expected to inform further development and refinement of the PY1 Capstone, provide insight into opportunities to improve the pre-APPE didactic curriculum, and generate feedback to the students for improvement.
ACKNOWLEDGMENTS
During the time of this study, Kathryn Fuller, PharmD, was an academic fellow in Educational Innovation and Research but at the time of publication she is a clinical assistant professor at the UNC Eshelman School of Pharmacy.
Appendix 1. Breakdown of the Closed-book Knowledge Assessment Content by Course

Received August 17, 2017.
Accepted October 25, 2017.
© 2019 American Association of Colleges of Pharmacy