American Journal of Pharmaceutical Education
Brief Report

Developing and Implementing an Entrustable Professional Activity Assessment for Pharmacy Practice Experiences

Connie Smith, Roxie Stewart, Gregory Smith, H. Glenn Anderson and Scott Baggarly
American Journal of Pharmaceutical Education September 2020, 84 (9) ajpe7876; DOI: https://doi.org/10.5688/ajpe7876
All authors: University of Louisiana Monroe College of Pharmacy, Monroe, Louisiana

Abstract

Objective. To develop, implement, and validate an entrustable professional activity (EPA) assessment tool that could be used to calculate course grades for experiential students in all practice environments.

Methods. An EPA assessment tool was developed and directly mapped to 18 EPAs, and a criterion, or passing score, for each EPA was established for all practice experiences. The tool was implemented in the college’s experiential program during summer 2018, and its comparative outcomes and reliability were assessed within the core advanced pharmacy practice experiences (APPEs).

Results. The EPA assessment tool’s reliability was strong (Cronbach’s alpha=0.93), and preceptor-suggested grades and grades calculated using the tool were equivalent in 95% of completed APPEs. The nonequivalent calculated-preceptor grade pairs were evenly split between calculated grades one letter grade higher and one letter grade lower than the preceptor’s suggestion.

Conclusion. The EPA assessment tool is a reliable and valid instrument for assessing EPA achievement in the APPE year. Future work should focus on determining the longitudinal utility of the EPA tool by comparing outcomes in introductory and advanced pharmacy practice experiences.

Keywords
  • entrustable professional activities
  • experiential education
  • assessment
  • pharmacy education

INTRODUCTION

As pharmacy moves toward provider status, pharmacy educators must ensure that pharmacy graduates can competently perform activities and contribute as a member of the health care team. Stakeholders expect pharmacy graduates to demonstrate and apply requisite knowledge and skills to direct patient care. In response to anticipated stakeholder demand, a core list of entrustable professional activities (EPAs) for pharmacy graduates was developed by the American Association of Colleges of Pharmacy (AACP).1-3 These core EPAs are essential activities that all pharmacy graduates, regardless of the setting in which they intend to practice, must be able to perform without direct supervision.1

In addition to the traditional use of objectives to measure knowledge and progression toward competencies, assessment using EPAs has become necessary to determine students’ preparedness to autonomously perform pharmacist duties. Core EPAs have been shown to be valid and pertinent to pharmacy practice.4 Studies have shown that core EPAs are consistently rated as relevant to most practice settings and are important for determining a student’s readiness for practice.5,6 Students should perform these activities at certain levels of entrustment based upon the depth and maturity of their knowledge, skills, and attitudes as they progress through the professional program.2 However, there is minimal research available that describes EPA assessment in pharmacy education. Scott and colleagues found that quantifying the use of EPAs with supporting tasks within the practice manager domain was a useful method of assessment.7 Entrustable professional activities incorporated into an assessment tool for practice experiences completed by first-year student pharmacists demonstrated a significant increase in the performance of EPAs from midpoint to final evaluations, suggesting student growth.8 Studies reporting on the psychometric properties of EPAs are few; however, those that have been published report moderately strong inter-rater reliability and good content validity.9

While the body of evidence supporting use of EPAs is growing, there is a need to establish a valid approach for their assessment during practice experiences. To help meet the need for a validated instrument to assess EPAs throughout the experiential curriculum, an assessment tool was developed to determine students’ levels of entrustment during each practice experience; calculate scores and letter grades based on student EPA performance; and capture data for personal and professional development (PPD). This paper describes the development and pilot implementation of the EPA assessment within the curriculum’s core advanced pharmacy practice experiences (APPEs), which include general medicine, ambulatory care, institutional, and community settings.

METHODS

During fall 2016, the University of Louisiana Monroe (ULM) College of Pharmacy designed a new experiential assessment tool that incorporated the evaluation of EPAs.3 Tool development focused on content validity, criterion validity, and convergent/discriminant evidence.10 To ensure content validity and construct validity, 14 EPAs that had been established by the AACP Academic Affairs Committee were included in the tool.1 Four program-specific EPAs (Table 1) were also developed and included to ensure all college objectives were represented. Each EPA was listed on the assessment tool (Appendix 1), along with examples of supporting tasks, expected level of entrustment, and feedback fields. The assessment tool also contained four PPD sections (self-awareness, leadership, innovation, and professionalism), and one section for overall feedback. To determine criterion validity, preceptors were asked to provide a suggested letter grade for each student’s performance; however, the grade was hidden from the student’s view and not factored into the student’s actual grade.

Table 1.

Doctor of Pharmacy Students’ Expected Levels of Entrustment on AACP and Program Specific EPAs by Professional Year Milestones

Development and implementation of the EPA tool were led by faculty members from the Office of Experiential Education (OEE). The Pharmacy Practice Experience (PPE) Committee, the Curriculum and Assessment Committees, and all active preceptors were consulted for input throughout the process. First, the PPE Committee, which included eight faculty members and two non-faculty preceptors representing all core practice experiences, established expected levels of entrustment for each professional year milestone (Table 1) using a modified Angoff method.10-13 Expected levels of entrustment were determined for each EPA, with the minimum for APPE students being level 3 per the AACP recommendation that all students should achieve this level upon graduation.2 Second, a sample of faculty and non-faculty preceptors were polled to ensure that opportunities for students to perform the EPAs at the established levels of entrustment were achievable at their practice sites.
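The standard-setting step above can be sketched in code. The following is a minimal illustration of one common form of modified Angoff aggregation, in which each judge independently proposes an expected entrustment level and the proposals are averaged; the judge ratings, the EPA named, and the aggregation details are illustrative assumptions, not the PPE Committee's actual data or procedure.

```python
# Hypothetical modified Angoff aggregation: each judge independently proposes
# an expected entrustment level (1-5) for an EPA at a given milestone, and the
# proposals are averaged and rounded to set the expectation. All values below
# are illustrative, not the committee's actual ratings.

def expected_level(judge_ratings, minimum=None):
    """Aggregate judges' proposed entrustment levels into one expectation."""
    level = round(sum(judge_ratings) / len(judge_ratings))
    if minimum is not None:
        # Enforce a floor, e.g., the AACP-recommended level 3 for APPE students.
        level = max(level, minimum)
    return level

# Ten hypothetical judges rate one EPA for the APPE milestone,
# with the level-3 minimum enforced.
ratings = [4, 4, 3, 4, 5, 4, 3, 4, 4, 4]
print(expected_level(ratings, minimum=3))  # -> 4
```

The `minimum` parameter reflects the constraint described above: regardless of the panel's average, no APPE expectation may fall below level 3.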

Historically, ULM students have received letter grades for practice experiences; therefore, the assessment tool (Appendix 1) was required to calculate scores, which were later translated into grades. Each EPA had six options from which the preceptor could select: five levels of entrustment and a not applicable (N/A) option. Students who met or exceeded the expected performance level for a given EPA were awarded 100% of points for that EPA. Students who did not meet the expected level for an EPA were awarded a proportional percentage of points for that EPA. For example, if the expected performance level for an EPA was four, a score of three would result in the student receiving 75% of the points. If a student had no opportunity to demonstrate performance on a given EPA, the item was marked N/A. Any EPA marked as N/A did not contribute toward the student’s overall grade for the practice experience. Professionalism was either demonstrated or not demonstrated during the practice experience; a student who received a “no” rating for professionalism on the final assessment automatically failed the practice experience. Scores for the applicable EPAs and professionalism were equally weighted and combined to arrive at the final grade. The areas of self-awareness, leadership, and innovation were evaluated but not scored. Students were rated as “novice,” “approaching proficiency,” or “demonstrates proficiency” for each of these personal growth areas.
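The scoring rules above can be expressed as a short calculation. This is an illustrative sketch, not the college's actual grading software; it simplifies professionalism to a full-credit item when demonstrated and an automatic failure when not, per the description above.

```python
# Sketch of the scoring rules: each EPA earns a proportion of the expected
# entrustment level (capped at 100%), N/A items are excluded, and a "no" on
# professionalism fails the experience outright. Illustrative only.

def epa_percent(observed, expected):
    """Percent of points for one EPA; full credit at or above expectation."""
    return min(observed / expected, 1.0) * 100

def final_score(ratings, expectations, professionalism):
    """Equally weight applicable EPAs plus professionalism (100% if shown)."""
    if not professionalism:
        return 0.0  # automatic failure for the practice experience
    scored = [epa_percent(obs, exp)
              for obs, exp in zip(ratings, expectations)
              if obs is not None]  # None marks an N/A EPA
    scored.append(100.0)  # professionalism demonstrated: full, equal weight
    return sum(scored) / len(scored)

# Expected level 4 with an observed level 3 yields 75% for that EPA.
print(epa_percent(3, 4))  # -> 75.0
# Two scored EPAs (one N/A) plus demonstrated professionalism.
print(round(final_score([3, 4, None], [4, 4, 4], professionalism=True), 1))  # -> 91.7
```

The cap at 100% mirrors the rule that exceeding the expected level earns no extra credit, and dropping `None` entries mirrors the exclusion of N/A items from the overall grade.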

The proposed assessment tool was beta-tested by select faculty preceptors from various practice settings who provided input on the tool’s design and utilization. Changes were made based on this feedback, resulting in the final version of the instrument. Prior to pilot implementation in May 2018, training on EPAs, milestone expectations, definitions of levels of entrustment, and utilization of the new assessment tool was provided to preceptors (n=397). Training was administered via live seminars at professional meetings, email and telephone correspondence, or through one-on-one educational opportunities. Prior to APPE onset, students were informed about EPAs and mandatory activities that provided additional opportunities for demonstrating entrustment.

Data from May 2018 to May 2019 were collected for eight APPE blocks. All data points that were marked as N/A were treated as missing values within the data file and were replaced through regression imputation to avoid listwise deletion of cases during statistical analysis.14 Descriptive statistics were reported for student demographics and for performance on each of the EPAs. Implementation assessment was undertaken using multi-factor analysis of variance (ANOVA). The association of the dependent variable (EPA performance) with various independent variables (APPE type, APPE block, student gender, student ethnicity, student age, and grade point average [GPA] at the end of the third academic year) was analyzed. Bonferroni post hoc analyses were performed where appropriate. Comparisons of EPA assessment reliability between core practice experiences (test-retest reliability) and assessment of internal consistency were made using Cronbach’s alpha. Criterion validity and convergence/discriminant evidence were assessed by comparing preceptor-suggested grades and grades generated by the EPA assessment (chi-square and hit rate).15
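For readers unfamiliar with the internal-consistency statistic used above, Cronbach's alpha can be computed from an items-by-students score matrix with the standard formula. The data below are synthetic entrustment levels for illustration, not study data.

```python
# Illustrative Cronbach's alpha via the standard formula:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
# The score matrix is synthetic, not the study's data.
import statistics

def cronbach_alpha(items):
    """items: one inner list per EPA item, each holding per-student scores."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-student totals
    item_var = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Three EPA items scored for five students (synthetic entrustment levels).
items = [[4, 3, 5, 4, 2],
         [4, 3, 4, 4, 2],
         [5, 3, 5, 4, 3]]
print(round(cronbach_alpha(items), 2))  # -> 0.95
```

Values approaching 1 indicate that the items move together across students, which is the property the reliability analysis above examines.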

RESULTS

One hundred forty-two preceptors completed evaluations between May 2018 and May 2019: 22 general medicine preceptors (27% were faculty members), 19 ambulatory care preceptors (47% were faculty members), 36 institutional preceptors (none were faculty members), and 65 community preceptors (none were faculty members). Ninety (63%) preceptors were female. Ninety-two students completed the 428 APPE blocks assessed in this analysis. One hundred thirty-nine general medicine, 104 ambulatory care, 93 institutional, and 92 community APPE blocks were completed. During this study, 7704 data points were collected, of which 704 (9%) were recorded as N/A. No students from this cohort required remediation. Of the 92 students, 70.7% were female and the majority identified their race/ethnicity as Caucasian (71.8% Caucasian, 14.1% African American, and 14.1% other). The mean age was 25.2±2.6 years, and the mean GPA was 3.37±.40.

Multi-factorial modeling included the main effects of APPE type, APPE block, student gender, student ethnicity, student age, and GPA. The model was predictive of EPA outcomes (p<.001). All variables except ethnicity were contributory and retained in the model. A comparison of student EPA outcomes by APPE type is presented in Table 2. General medicine and ambulatory care assessments of student EPA outcomes were similar (mean difference=-.03, 95% CI=-.09 to .03; p=1.0). Community assessment scores were generally higher than institutional scores (mean difference=.12, 95% CI=.06 to .19; p<.001). Furthermore, both community and institutional APPE scores were higher than those in either ambulatory care (mean difference of .53 and .41, respectively) or general medicine APPEs (mean difference of .56 and .44, respectively). All pairwise comparisons were significant at p<.01. Post hoc analysis of rotation block sequence showed no differences among EPAs. There was a significant association (p<.05) of certain EPAs with student gender, student age, and GPA. In general, female students outperformed male students (specifically, female students scored higher on EPAs 1, 2, 3, 5, 7, 9, 10, 13, and 15). A negative correlation between student age and score was found for EPAs 3 and 9, and a positive correlation between student GPA and score was found for EPAs 2, 5, and 16.

Table 2.

Doctor of Pharmacy Students’ Performance on Entrusted Professional Activities as Measured Using a Novel Assessment Instrument

Test-retest reliability was strong (Cronbach’s α=.93, p<.001). Similarly, internal consistency was high (Cronbach’s α=.97, p<.001). Preceptor-suggested grades and EPA assessment-calculated grades were congruent in 407 of the 428 completed APPEs (hit rate=95%, p<.001). Twenty-one grade disagreements (5%) were observed. Disagreements were evenly distributed: 11 calculated grades were one letter grade higher than the preceptor’s suggestion, and 10 were one letter grade lower. No grade disagreement resulted in a student receiving a passing or failing grade when the preceptor indicated the opposite.
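The hit rate reported here is the proportion of APPEs in which the two grades matched. A minimal sketch of that check, using made-up grade pairs rather than the study's data:

```python
# Congruence ("hit rate") between tool-calculated grades and the preceptor's
# hidden suggested grades. Grade pairs are illustrative, not study data.

def hit_rate(calculated, suggested):
    """Fraction of practice experiences where the two grades agree."""
    hits = sum(c == s for c, s in zip(calculated, suggested))
    return hits / len(calculated)

calc = ["A", "A", "B", "B", "C"]
sugg = ["A", "B", "B", "B", "C"]
print(hit_rate(calc, sugg))  # -> 0.8
```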

DISCUSSION

We designed and implemented a comprehensive tool to evaluate the AACP committee-established EPAs,2 our program-specific EPAs, and the PPD soft skills. Our findings were similar to those of other studies in that scores generated from the tool across the APPE year were associated with APPE type, APPE block, GPA, student gender, and student age.16-22 Similar to the findings of Fay and colleagues, scores given by our faculty preceptors were lower than those given by non-faculty preceptors.23 In our program, faculty members precept only general medicine and ambulatory care APPEs (27% and 47% of those preceptors, respectively, were faculty members), which could explain the finding that non-faculty community and institutional preceptors tended to score students higher. Although a positive and expected correlation between GPA and EPA scores was observed, some of our results may warrant further investigation. Our findings suggested that women performed better on clinical APPEs than men, which had been previously demonstrated by Riese and colleagues and by Haist and colleagues.20,21 Student age was found to be negatively correlated with EPA outcomes, which has been seen in other areas of medical education.24-26

Strong internal consistency and test-retest reliability suggest the tool may provide a mechanism for consistent assessment of EPAs. Additionally, the findings that EPA tool outcomes were highly congruent with preceptor-suggested grades and produced no pass/fail disagreements lend strength to the validity of the tool. Factors contributing to this successful pilot implementation included the incorporation of required activities that gave students additional opportunities to demonstrate entrustment in a variety of settings. Furthermore, assessment of EPAs demonstrated consistent achievement of the expected levels of entrustment at the onset of and throughout the APPE year, which could imply students were adequately prepared in advance of APPEs. Future investigation should focus on refining the tool’s utility for assessing growth within specific APPE blocks and for determining longitudinal growth throughout pre-APPE simulation-based laboratory activities and introductory and advanced pharmacy practice experiences.27

This study has several limitations. A relatively large percentage of individual EPAs were marked as N/A, which was addressed using a valid method of score imputation.14 Also, the study analyzed a single year of APPE data from one program; therefore, the generalizability of our findings may be limited. Though our method of assessing APPEs using EPAs seems to be reliable while providing a mechanism for delivering meaningful feedback, we recognize the need for further refinement of the tool. As further guidance for best practices emerges in the area of EPA assessment, we hope to enhance our methods by using a more holistic and prospective approach to ensure the practice-readiness of our pharmacy students.28

CONCLUSION

This pilot study describes the development and implementation of an EPA assessment tool during a single APPE year. The test-retest reliability of the instrument was strong and internal consistency was high. Outcomes demonstrated high congruency with preceptor-suggested grades and no pass/fail disagreements. Though further investigation is needed for tool refinement and longitudinal application from introductory to advanced pharmacy practice experiences, this tool may be adapted by other programs in the development of EPA assessment methods.

ACKNOWLEDGMENTS

The authors thank all preceptors involved in beta-testing the evaluation tool. The authors also thank Jeffery Evans, PharmD, Seetharama Jois, PhD, Savannah Posey, PharmD, Laurel Sampognaro, PharmD, Paul Sylvester, PhD, and Jamie Terrell, PharmD, for review and feedback during the editing of this manuscript.

Appendix 1. APPE Assessment Tool Example Excerpts


  • Received October 7, 2019.
  • Accepted June 19, 2020.
  • © 2020 American Association of Colleges of Pharmacy

REFERENCES

1. Haines ST, Pittenger AL, Stolte SK, et al. Core entrustable professional activities for new pharmacy graduates. Am J Pharm Educ. 2017;81(1):Article S2. doi:10.5688/ajpe811S2
2. Haines ST, Gleason BL, Kantorovich A, et al. Report of the 2015-2016 Academic Affairs Standing Committee. Am J Pharm Educ. 2016;80(9):Article S20. doi:10.5688/ajpe809S20
3. Pittenger AL, Copeland DA, Lacroix MM, et al. Report of the 2016-17 Academic Affairs Standing Committee: entrustable professional activities implementation roadmap. Am J Pharm Educ. 2017;81(5):Article S4. doi:10.5688/ajpe815S4
4. Haines ST, Pittenger AL, Gleason BL, Medina MS, Neely S. Validation of the entrustable professional activities for new pharmacy graduates. Am J Health Syst Pharm. 2018;75(23):1922-1929. doi:10.2146/ajhp170815
5. Pittenger AL, Gleason BL, Haines ST, Neely S, Medina MS. Pharmacy student perceptions of the entrustable professional activities. Am J Pharm Educ. 2019;83(9):Article 7274. doi:10.5688/ajpe7274
6. VanLangen KM, Meny L, Bright D, Seiferlein M. Faculty perceptions of entrustable professional activities to determine pharmacy student readiness for advanced practice experiences. Am J Pharm Educ. 2019;83(10):Article 7501. doi:10.5688/ajpe7501
7. Scott DM, Naughton CA, Petry N, Friesner DL. Assessment of practice management entrustable professional activities by pharmacists in North Dakota. Am J Pharm Educ. 2019;83(10):Article 7486. doi:10.5688/ajpe7486
8. Rhodes LA, Marciniak MW, McLaughlin J, Melendez CR, Leadon KI, Pinelli NR. Exploratory analysis of entrustable professional activities as a performance measure during early pharmacy practice experiences. Am J Pharm Educ. 2019;83(2):Article 6517. doi:10.5688/ajpe6517
9. Thompson LR, Leung CG, Green B, et al. Development of an assessment for entrustable professional activity (EPA) 10: emergent patient management. West J Emerg Med. 2017;18(1):35-42. doi:10.5811/westjem.2016.10.31479
10. Eignor DR. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 1999.
11. Anderson HG Jr, Nelson AA. Reliability and credibility of progress test criteria developed by alumni, faculty, and mixed alumni-faculty judge panels. Am J Pharm Educ. 2011;75(10):Article 200. doi:10.5688/ajpe7510200
12. Morrison H, McNally H, Wylie C, McFaul P, Thompson W. The passing score in the objective structured clinical examination. Med Educ. 1996;30(5):345-348. doi:10.1111/j.1365-2923.1996.tb00845.x
13. Norcini JJ. Setting standards on educational tests. Med Educ. 2003;37(5):464-469. doi:10.1046/j.1365-2923.2003.01495.x
14. Cheema JR. Some general guidelines for choosing missing data handling methods in educational research. J Mod Appl Stat Methods. 2014;13(2):53-75. doi:10.22237/jmasm/1414814520
15. Allen MJ, Yen WM. Introduction to Measurement Theory. Waveland Press; 2002.
16. Axelson RD, Solow CM, Ferguson KJ, Cohen MB. Assessing implicit gender bias in medical student performance evaluations. Eval Health Prof. 2010;33(3):365-385. doi:10.1177/0163278710375097
17. Chan ZC, Chan YT, Lui CW, et al. Gender differences in the academic and clinical performances of undergraduate nursing students: a systematic review. Nurse Educ Today. 2014;34(3):377-388. doi:10.1016/j.nedt.2013.06.011
18. Heldenbrand SD, Dayer LE, Martin BC, et al. APPE evaluations are positively associated with MMI, pre-pharmacy GPA and pharmacy GPA. Am J Pharm Educ. 2018;82(7):Article 6326. doi:10.5688/ajpe6326
19. Jacques L, Kaljo K, Treat R, Davis J, Farez R, Lund M. Intersecting gender, evaluations, and examinations: averting gender bias in an obstetrics and gynecology clerkship in the United States. Educ Health. 2016;29(1):25-29. doi:10.4103/1357-6283.178926
20. Riese A, Rappaport L, Alverson B, Park S, Rockney RM. Clinical performance evaluations of third-year medical students and association with student and evaluator gender. Acad Med. 2017;92(6):835-840. doi:10.1097/ACM.0000000000001565
21. Haist SA, Wilson JF, Elam CL, Blue AV, Fosson SE. The effect of gender and age on medical school performance: an important interaction. Adv Health Sci Educ Theory Pract. 2000;5(3):197-205. doi:10.1023/A:1009829611335
22. Nto NJ, Obikili EN, Anyanwu GE, Agu AU, Esom EA, Ezugworie JO. Effect of age, premedical academic performance, and entry bias on students' performance in final preclinical examination at the University of Nigeria Medical School. J Exp Clin Anat. 2019;18:6-11. doi:10.4103/jeca.jeca219
23. Fay EE, Schiff MA, Mendiratta V, Benedetti TJ, Debiec K. Beyond the ivory tower: a comparison of grades across academic and community OB/GYN clerkship sites. Teach Learn Med. 2016;28(2):146-151. doi:10.1080/10401334.2016.1146603
24. Moore S, Clark C, Haught A, et al. Factors associated with academic performance in physician assistant graduate programs and national certification examination scores: a literature review. Health Prof Educ. 2019;(2):103. doi:10.1016/j.hpe.2018.06.003
25. Asprey D, Dehn R, Kreiter C. The impact of age and gender on the physician assistant national certifying examination scores and pass rates. Perspect Physician Assist Educ. 2004;15(1):38-41. doi:10.1097/01367895-200415010-00006
26. Satler C, Guimarães L, Tomaz C. Planning ability impairments in probable Alzheimer's disease patients: evidence from the Tower of London test. Dement Neuropsychol. 2017;11(2):137-144. doi:10.1590/1980-57642016dn11-020006
27. Centers for Disease Control and Prevention. Cognitive Impairment: A Call for Action, Now! https://www.cdc.gov/aging/pdf/cognitive_impairment/cogimp_poilicy_final.pdf. Accessed March 9, 2020.
28. Ten Cate O, Schwartz A, Chen HC. Assessing trainees and making entrustment decisions: on the nature and use of entrustment-supervision scales. Acad Med. 2020. doi:10.1097/ACM.0000000000003427