Abstract
Objective. To evaluate a tool designed to assess Doctor of Pharmacy (PharmD) students’ personal and professional development prior to beginning advanced pharmacy practice experiences (APPEs).
Methods. A five-item instrument, entitled the Faculty Advisor’s Assessment of the Advisee (FAAA) tool, was developed to assess and monitor pharmacy students’ progress over the three-year didactic curriculum. Question anchors were created to describe characteristics exhibited by the student that matched the categories of not engaged, beginning, emerging, or engaged. Possible FAAA composite scores ranged from 7 to 20. Using the FAAA tool, faculty advisors assessed their advisees’ values, engagement, self-awareness, professionalism, and leadership in 2017, 2018, and 2019. Individual and aggregate cohort reports were generated, and each student’s data were matched across the three years. To determine whether the FAAA showed progression in the assessed dimensions during the first, second, and third professional (P1, P2, and P3) years, a Friedman test was performed. Cronbach alpha was used to assess the reliability of the instrument.
Results. The data of 93 students were matched from the P1 through the P3 year. Median (IQR) FAAA composite scores for the P1, P2, and P3 years were 13 (12-16), 17 (15-19), and 18 (16-20), respectively. Significant differences existed for all pairwise comparisons: P1 vs P2, P2 vs P3, and P1 vs P3. The reliability of the FAAA scale was strong across all three years (winter 2017, α=0.87; winter 2018, α=0.89; winter 2019, α=0.87). All items appeared worthy of retention, as removing any single item did not meaningfully increase the scale’s reliability.
Conclusion. A five-item tool that assesses pharmacy students’ personal and professional development during the first three years of a PharmD program could be used by faculty advisors to assess students’ progress across the didactic curriculum.
INTRODUCTION
Ensuring sufficient professional development in Doctor of Pharmacy (PharmD) students prior to beginning advanced pharmacy practice experiences (APPEs) is vital. This importance is highlighted by the Center for the Advancement of Pharmacy Education (CAPE) Educational Outcomes 2013 under Domain 4: Personal and Professional Development.1 Components of this domain include measurement of student leadership, self-awareness, innovation and entrepreneurship, and professionalism. Most of the literature from pharmacy, medicine, and other health professions that describes assessment of these elements of personal and professional growth is based on students’ self-reported data gathered through surveys or self-reflection assignments.2-8 Individual pharmacy programs have published papers on their curricular efforts to foster student innovation, entrepreneurship, and leadership at individual timepoints or in specific courses.2,9,10 Approaches using individual scales for two of these elements, professionalism and self-awareness, have been published, but most relied on student self-assessment and provided limited feedback.4-7,11,12 Thus, little has been published on how to assess these characteristics globally and objectively in pharmacy education. Given the evolving nature of students’ personal and professional development over the course of a curriculum, a longitudinal, objective assessment of these attributes is needed.
Specific to pharmacy programs, several approaches have been used to assess elements of personal and professional growth. The Defining Issues Test (DIT), a standardized tool to assess professionalism, was described and incorporated by Peeters and colleagues.2 Students were asked to use the DIT in self-reflection and to submit written reflections at regularly scheduled intervals (eight intervals and topics) spanning the first through third professional (P1-P3) years of the PharmD curriculum.2 Chisholm and colleagues developed an 18-item instrument to measure a student’s professionalism via self-assessment and administered it to students in the first year and again at the end of the fourth year, near graduation.6 Other researchers have published studies describing assessment of professionalism through either evaluation of student portfolios or preceptor evaluation as an element of a program’s midpoint or final APPE assessment.2,12,13
Other health professions have used a mix of self-reflection and objective assessment of elements of personal and professional development in the didactic portions of their curricula. Harris and colleagues at Ohio State University noted the importance of assessing students early in the curriculum and throughout the program.8 In their allied health programs, including athletic training, physician assistant, occupational therapy, and physical therapy, they used the Professionalism Assessment Tool (PAT), a standardized self-assessment tool, to detect growth in these areas. The PAT evaluated personal responsibility, elements of self-awareness, and engagement. A unique attribute of the PAT was that it also asked students to assess their relationships with others. Finally, Dorsey and colleagues at the University of Minnesota developed an Academic Professional Behavior Assessment (APBA) to assess professional behavior among Doctor of Physical Therapy students.14 In their study, the researchers identified important factors to consider in assessing professionalism, such as conducting the assessment early in the program, using a consistent tool, and having faculty evaluate students in a continuous and ongoing way.
The Eugene Applebaum College of Pharmacy and Health Sciences at Wayne State University implemented a redesigned PharmD curriculum starting with P1 students in fall 2016. One goal of the renewed curriculum was to strengthen the college’s focus on developing professionalism, self-awareness, and leadership in pharmacy students during the didactic portion of the PharmD curriculum. Given the volume of literature reporting the benefit of student self-reflection and the critical role of mentoring in the development of these attributes, we evaluated how we could enhance our student advising process to help students in these areas and improve our academic and career counseling.13,15
Previously, students were only required to meet with their advisor during the P2 year and complete a standardized student reflection; any other meetings with an advisor were voluntary and initiated by the student. Based on feedback from focus groups conducted with graduating P4 students, graduating student surveys from the American Association of Colleges of Pharmacy (AACP), the updated educational standards from the Accreditation Council for Pharmacy Education (ACPE) recognizing the need for schools to provide academic advising to support student success, and input from our faculty, we formalized and mandated a longitudinal advising process that spans all four years of the renewed PharmD curriculum.1,16 We determined that the faculty advisor was the one individual who interacted with a student longitudinally; thus, faculty advisors were well positioned to provide mentoring, feedback, and longitudinal assessment of a student when given a specific process and goals.
This paper presents the development and evaluation of a faculty advisor-driven evaluation tool, the Faculty Advisor’s Assessment of the Advisee (FAAA), used to assess a student’s personal and professional development, and describes how the tool was incorporated into the advising process in our PharmD program. Our goal was to develop, implement, and evaluate the performance of the FAAA tool to determine its reliability in longitudinally assessing student development of professionalism during the PharmD program.
METHODS
As part of the curriculum redesign that our college launched in fall 2016, we revised an existing student self-assessment tool, the Faculty Advising Discussion (FAD) form, to contain questions related directly to the new curriculum and to focus more on students’ personal and professional development than in years past. The FAD form was completed by students prior to a meeting with their advisor. In the old curriculum, students were only required to complete the FAD form in the P2 year. In the redesigned curriculum, students completed the form during each semester of the P1 to P3 years, in anticipation of their meeting with their faculty advisor. The FAD form addressed semester-specific concerns and tied directly to students’ classroom and experiential learning, co-curricular engagement, and professional development. This student self-assessment provided a rich source of information for discussion between the student and faculty advisor when reviewing the student’s progress in the program and the student’s values, development, self-awareness, leadership, and professionalism.
Based on alignment of the new curriculum and the FAD forms, the Faculty Advisor’s Assessment of the Advisee (FAAA) tool was developed (Appendix 1). The intent was that faculty advisors would use the FAAA tool to evaluate their advisees during each winter semester (February to April) throughout the PharmD curriculum. The tool, developed by a subgroup of the PharmD program’s Pharmacy Assessment Committee, was constructed to enable advisors to assess the student based on behaviors observed before and during the advising meeting and on specific content discussed during the meeting.
Several resources and ideas were considered when developing the FAAA tool. One of the college’s goals was to develop a short, easy-to-use rubric that would align directly with the topics of self-awareness, professionalism, and leadership, as these were not assessed objectively elsewhere in the existing didactic curriculum. Innovation was excluded from the assessment because students were already assessed on this attribute in several courses as part of the requirements for self-directed or group projects. Based on the model of the Association of American Colleges and Universities’ VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics and other rubric design practices, the authors determined that it would be most effective to develop a horizontal, analytic-type rubric containing up to five dimensions for assessment, each with two or more levels of performance.17-19 To follow a format similar to that of our curriculum map of ability-based outcomes used in the didactic curriculum, which had three levels, we initially decided to use three anchors for all questions. Specific questions related to the dimensions of engagement, values, self-awareness, professionalism, and leadership were developed. Next, the anchors for each rubric dimension were developed into the categories of beginning, emerging, or engaged. This terminology was chosen to convey a positive message to the student about their growth and was not associated with any grade penalty or reward for a rating on the rubric.
The initial FAAA tool was piloted by three members of the Pharmacy Assessment Committee and two additional faculty advisors with student advisees before undergoing final review and revisions by the full Assessment Committee. During the fall 2016 semester, these individuals reviewed the FAAA tool and tried it on paper during their advising meetings. They then provided feedback to the subgroup of the Pharmacy Assessment Committee. Several modifications were suggested, including deleting the term “self-help” from the tool, expanding the personal leadership section to include more than employment and student organizations, and better aligning some of the descriptors under the anchors with activities addressed on the student self-evaluation in the FAD. The Assessment Committee reviewed the revised tool and suggested that a “not engaged” category be added for the three dimensions related to student engagement, alignment of values with interests pursued, and personal leadership; the resulting structure is sketched below. This change was voted on and approved by both the Assessment and Curriculum Committees. Faculty members’ initial introduction to and training in the use of the FAAA tool occurred via email, in department meetings, and during a large-group training session held at the college’s winter “all faculty” meeting in February 2017. In that session, the FAAA tool was reviewed, its relationship to the FAD form was explained, and the importance of faculty observation and assessment was emphasized. Emphasis was placed on conducting sincere evaluations and on carefully considering the anchor category descriptions linked to each dimension. Faculty were also informed that the categories on the instrument did not align with specific years in the PharmD program, eg, an element marked as emerging was not the benchmark for a P2 student simply because it was the second category listed on the form.
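For illustration, the final rubric structure can be represented as a simple data structure, as in the minimal Python sketch below. The dimension names and category levels follow the description above; the full anchor descriptions appear in Appendix 1 and are omitted here, and this representation is ours rather than part of the instrument itself.

```python
# Illustrative representation of the FAAA rubric structure (not the official
# instrument). Three dimensions include the "not engaged" level added during
# piloting; the other two begin at "beginning".
FAAA_RUBRIC = {
    "engagement":      ["not engaged", "beginning", "emerging", "engaged"],
    "values":          ["not engaged", "beginning", "emerging", "engaged"],
    "self-awareness":  ["beginning", "emerging", "engaged"],
    "professionalism": ["beginning", "emerging", "engaged"],
    "leadership":      ["not engaged", "beginning", "emerging", "engaged"],
}
```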
The FAAA was launched in late February 2017 and incorporated into the advising process. While the initial emphasis and use of the FAAA was for program evaluation, students were informed of its use in class meetings and individual advising meetings and were able to view their faculty advisor’s feedback electronically.
There were two main research objectives in evaluating the performance of the FAAA: to determine if the FAAA tool identified pharmacy student growth between the beginning (P1) and the end (P3) of the didactic curriculum, and to determine the reliability of the instrument’s performance across the curriculum and years.
The project was determined to be exempt by the Wayne State University Institutional Review Board. Individual and aggregate cohort-level reports were run from our experiential and didactic data management system, E*Value (MedHub), for each student year within the cohort (students who received advising as P1s in winter 2017, P2s in winter 2018, and P3s in winter 2019). Using Excel 2016, individual student scores were matched by the primary investigator and then de-identified prior to data transformation and analysis. Each anchor on the FAAA rubric was assigned a score (1=not engaged, 2=beginning, 3=emerging, 4=engaged), and the alphabetical character data from E*Value were transformed to numeric data using these codes. An overall FAAA composite score was then calculated for each student for each year of the curriculum; because three of the five dimensions included the not engaged level while the other two began at beginning, possible composite scores ranged from 7 to 20. All data evaluations were completed using SPSS (IBM).
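A minimal sketch of this scoring step follows, assuming the de-identified export contains the anchor categories as text in one column per dimension (the column names and export format here are hypothetical, not those of the actual E*Value report):

```python
import pandas as pd

# Anchor-to-score mapping described in the Methods.
ANCHOR_SCORES = {"not engaged": 1, "beginning": 2, "emerging": 3, "engaged": 4}

# One column per FAAA dimension; names are illustrative.
DIMENSIONS = ["engagement", "values", "self_awareness",
              "professionalism", "leadership"]

def score_faaa(df: pd.DataFrame) -> pd.DataFrame:
    """Convert anchor-category text to numeric scores and add a composite."""
    scored = df.copy()
    for dim in DIMENSIONS:
        scored[dim] = scored[dim].str.lower().map(ANCHOR_SCORES)
    # Composite ranges from 7 (three 1s plus two 2s) to 20 (five 4s), since
    # only three dimensions include the "not engaged" level.
    scored["faaa_composite"] = scored[DIMENSIONS].sum(axis=1)
    return scored
```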
To determine whether the FAAA tool showed progression in engagement, values, self-awareness, professionalism, and leadership between the P1 and P3 years, the cohort of individually matched student data from the class of 2020 was used. First, a Friedman test was used to determine whether there was a difference in overall FAAA score across program years. Next, Wilcoxon signed-rank tests were performed to determine whether differences existed between individual program years (P1 vs P2, P2 vs P3, and P1 vs P3) on each component of the FAAA measure. An alpha of .017 (Bonferroni adjustment for three comparisons) was used to indicate statistical significance for the post hoc Wilcoxon signed-rank tests. Last, Cronbach alpha was calculated to assess the reliability of the FAAA scale for each winter semester (ie, winter 2017, winter 2018, and winter 2019).
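A minimal sketch of this analysis plan, assuming matched composite-score arrays for the same students in each year; the study itself was run in SPSS, so the scipy-based version below is only an illustration (scipy does not ship a Cronbach alpha function, so a small helper is written out):

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach alpha for an (n_students, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

def compare_years(p1, p2, p3, alpha=0.017):
    """Omnibus Friedman test plus Bonferroni-adjusted pairwise follow-ups."""
    _, p_omnibus = friedmanchisquare(p1, p2, p3)
    results = {"friedman_p": p_omnibus}
    pairs = {"P1 vs P2": (p1, p2), "P2 vs P3": (p2, p3), "P1 vs P3": (p1, p3)}
    for name, (a, b) in pairs.items():
        _, p = wilcoxon(a, b)  # signed-rank test on matched scores
        results[name] = {"p": p, "significant": p < alpha}
    return results
```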
RESULTS
Data spanning the P1 through P3 years were matched for 93 of the 99 students in the cohort. The Friedman test identified a significant difference in FAAA scores across the P1 through P3 years (p<.01). Median (IQR) FAAA composite scores for the P1, P2, and P3 years were 13 (12-16), 17 (15-19), and 18 (16-20), respectively. Post hoc analysis with Wilcoxon signed-rank tests resulted in a significance level of p<.01 for all three comparisons (P1 vs P2, P2 vs P3, and P1 vs P3). A contingency table with scores on each question of the FAAA tool is shown in Table 1. Friedman tests on each individual FAAA element (the question categories in Table 1: engagement, values, self-awareness, professionalism, and leadership) also showed significant differences across years. Post hoc analysis with Wilcoxon signed-rank tests identified a significance level of p<.01 for all comparisons (P1 vs P2, P2 vs P3, P1 vs P3) within each element. To assess reliability, Cronbach alpha was measured on the five items that comprise the FAAA instrument. The reliability of the overall FAAA scale was assessed each year and was strong in winter 2017 (n=258, α=0.875), winter 2018 (n=305, α=0.887), and winter 2019 (n=300, α=0.874). All items appeared worthy of retention, as removal of any item would not have significantly increased the reliability of the instrument.
Table 1. Personal and Professional Development and Progress of Students in a Doctor of Pharmacy Program as Tracked Using the Faculty Advising Assessment of the Advisee Instrument
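For completeness, a sketch of the item-retention check reported above ("alpha if item deleted"): recompute Cronbach alpha with each of the five FAAA items removed in turn and confirm that no deletion raises the reliability. This illustrates the standard procedure under our own assumptions; it is not the SPSS output used in the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach alpha for an (n_students, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def alpha_if_item_deleted(items: np.ndarray) -> list:
    """Alpha recomputed with each item dropped; if every value is at or below
    the full-scale alpha, all items are worth retaining."""
    return [cronbach_alpha(np.delete(items, i, axis=1))
            for i in range(items.shape[1])]
```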
DISCUSSION
After implementing a new student assessment tool and clarifying and strengthening the faculty advisor’s role in ensuring students’ success, we observed an improvement in students’ personal and professional development as they progressed from their P1 through P3 years. We believe these changes are important because the FAAA tool directly measures Standard 4, which has been challenging for pharmacy programs to assess. Furthermore, this tool is unique in that it pairs longitudinal assessment by advisors with student self-assessment, and the twice-yearly advising sessions can be accomplished in the normal course of work.15 The advisor is well positioned to assess their advisee’s growth and to ensure that students make improvements based on the feedback given to them. Informal feedback from our faculty advisors has indicated that the tool is easy to use. Given the tool’s brevity, other institutions could consider incorporating it into their advising processes. The design described here, in which faculty advisors assess students’ self-awareness, leadership, and professionalism to measure pharmacy students’ growth in learning and engagement in the profession across the three years of the didactic PharmD curriculum, is more comprehensive and holistic than the approaches of other programs.
Similarities and differences exist between the FAAA tool we incorporated into our faculty advising process and the tools and processes used by other pharmacy programs to assess students’ personal and professional development. The element of student self-reflection prior to meetings is similar to other approaches that link reflections to pieces of the curriculum.2,5,7 As in the study by Peeters, our students complete self-reflections at regular intervals within the curriculum (ours is the Faculty Advising Discussion form) that are linked to didactic and cocurricular activities.2 Pokorny describes a similar self-reflection process linked to experiential education activities with feedback provided by advisors in group settings,7 while the program Hoffman describes uses self-reflection throughout the curriculum that is documented in a student portfolio system.5 Some of these programs give direct feedback based on reflections, while others only check for completion of assignments because they lack the staffing to review each submission.2,5,7 A unique attribute of our FAAA tool and faculty advising process is that students receive direct verbal feedback on the reflections in their FAD form during their meetings and are assessed by a faculty member using a standardized tool. Chisholm describes assessing students’ professionalism through self-reflection using a standardized pharmacy professionalism instrument during the P1 year and at the end of the P4 year within a month of graduation, but did not find statistically significant differences between students by year in the program or evidence of professional growth.6 By comparison, our faculty-based assessment, the FAAA tool, which was used yearly from the P1 through P3 years, showed student growth in all areas as a cohort. A primary difference between these two tools is the assessor (ie, self-assessment by students vs faculty assessment of students); on the FAAA, faculty relied on observation and analysis of the student’s work and behaviors. Finally, like other processes describing personal and professional development of students, ours showed improvement of students over a long period of time.2,12
The FAAA tool has some similarities to and differences from other health professions’ tools for assessing students’ personal and professional development. Unlike the PAT, which did not use faculty or volunteer preceptors as assessors, our process incorporates both student self-reflection and an advisor’s evaluation of student performance using the FAAA tool, providing a faculty-conducted objective assessment based on the advisor’s interactions with the student.8 The APBA tool used to assess physical therapy students is similar to our FAAA tool in that it allows faculty members to evaluate each student’s professionalism after each semester, thus giving individual feedback to each student and providing an objective measure of the student’s professionalism level throughout the program.14 However, the APBA tool was used by faculty members to evaluate students in all courses every semester, rather than by the same faculty advisor at the end of the winter semester as in our process. Along with professionalism, the APBA tool specifically evaluated a student’s problem-solving abilities, effective use of time and resources, interpersonal skills, and working relationships. Interestingly, the APBA process required students to meet with their faculty advisor if three or more substandard professional behaviors were identified; when this occurred, a professional behavior remediation plan was developed for the student in conjunction with the faculty advisor. Our process focused only on continuous growth in professional behaviors and was not associated with remediation.
This study has some limitations. One limitation was possible subjectivity in faculty advisors’ application of the FAAA rubric. The FAAA rubric is very specific and was designed to objectively assess a student’s growth in self-awareness, leadership, and professionalism longitudinally. However, only one faculty member assessed each student, so interrater reliability could not be calculated. Because faculty advisors’ perspectives and expectations of a student may vary, the way student growth and progress were evaluated may have varied across the different faculty advisors assessing different students.
Another possible limitation of this study was faculty turnover. Although rare, in some cases a student had to be reassigned to a different faculty advisor because the original advisor left the college or was on sabbatical or taking a leave of absence. The reassignment of advisors may have resulted in some variations in the evaluation and progress tracking of some students’ professional growth and maturity.
Finally, students were randomly assigned to a faculty advisor as they entered the program. Some faculty members took more initiative, made themselves more available to advisees, and were more enthusiastic about the advising process and its objectives. If either the assigned advisor or the student was not enthusiastic about the process, the engagement and progress of the student may have been delayed or even hindered because of the lack of a strong advisor-advisee relationship.
CONCLUSION
As educators, we realize the ever-increasing importance of providing consistent and substantive advising to PharmD students to improve student retention, success, and well-being. The approach described in this paper pairs an expert in the profession serving as an advisor with a structured tool for assessing the student pharmacist’s development as a professional in the field. This combination represents best practice for both student education and faculty workflow: using faculty advising to enhance mentorship, with an assessment tied to the process, allows flexibility for both students and faculty in a PharmD curriculum and helps our program meet ACPE requirements. A future consideration would be to extend the formalized process through the P4 year of the PharmD program.
ACKNOWLEDGMENTS
We would like to acknowledge Sabrina Bierstetel, PhD, from the Research Design and Analysis Unit for her participation and assistance with this project.
Appendix 1. Faculty Advisor’s Assessment of the Advisee
Received May 30, 2020. Accepted November 25, 2020.
© 2021 American Association of Colleges of Pharmacy