American Journal of Pharmaceutical Education
BRIEF

Development and Initial Evaluation of an Advanced Pharmacy Practice Experience Readiness Assessment Plan

Kali M. VanLangen, Kyle J. Schmidt, Minji Sohn, Lisa M. Meny and David R. Bright
American Journal of Pharmaceutical Education April 2023, 87 (4) ajpe9002; DOI: https://doi.org/10.5688/ajpe9002
Kali M. VanLangen, PharmD, Ferris State University College of Pharmacy, Grand Rapids, Michigan (correspondence: KaliVanLangen@ferris.edu)
Kyle J. Schmidt, PharmD, Ferris State University College of Pharmacy, Grand Rapids, Michigan
Minji Sohn, PhD, Ferris State University College of Pharmacy, Big Rapids, Michigan
Lisa M. Meny, PharmD, Ferris State University College of Pharmacy, Grand Rapids, Michigan
David R. Bright, PharmD, MBA, Ferris State University College of Pharmacy, Big Rapids, Michigan

Abstract

Objective. To describe the composition of an advanced pharmacy practice experience (APPE) readiness assessment plan (APPE-RAP) along with initial findings following retrospective application to a cohort of students.

Methods. The APPE-RAP uses existing summative assessment data within the ExamSoft platform on six skills and 12 ability-based outcomes from the pre-APPE curriculum. Thresholds were created to sort students into three readiness categories for skills and knowledge, determine overall readiness, and identify the need for curricular review. Students who completed their third professional year in spring 2021 served as the pilot cohort. The APPE-RAP was applied after the cohort progressed to APPEs to analyze the appropriateness of categorization and revise the plan before full implementation.

Results. The APPE-RAP was applied to 131 students who progressed to APPEs in spring 2021. Overall, 87.9% were APPE ready for all skills and aggregate knowledge. Two skills met criteria for curricular review. Seven students (5.3%) were categorized as red on at least one skill after one remediation attempt. Nine students (7%) were categorized as red on the aggregate evaluation of knowledge-based ability-based outcomes (ABOs). Four students (3.1%) did not pass one of their first two experiential rotations. Using a red categorization on aggregate knowledge as a risk indicator identified APPE failure with 94% specificity and a 98% negative predictive value.

Conclusion. Existing assessment data may be leveraged to identify assessment targets to help quantify APPE readiness. Further research is warranted to identify additional assessment thresholds that enhance quantification of APPE readiness as well as the impact of focused remediation on attainment of APPE readiness.

Keywords
  • advanced pharmacy practice experience readiness
  • skills-based assessment
  • knowledge-based assessments

INTRODUCTION

Ensuring pharmacy graduates are practice-ready clinicians has been a consistent focus throughout the transition to the Doctor of Pharmacy (PharmD) as the only degree leading to pharmacist licensure.1 Robust experiential education, including advanced pharmacy practice experiences (APPEs), allows students dedicated time to deepen and apply classroom knowledge in a practical way. Success requires a foundation of knowledge and skills taught during the pre-APPE curriculum. Deficiencies in knowledge and skills left unaddressed before APPEs may detract from clinical experiences designed to prepare students for practice. Predictors of APPE success have been reported, highlighting the importance of proactively identifying students with foundational deficiencies in knowledge, skills, and professionalism to mitigate APPE course failures.2,3

Consensus definitions of APPE readiness have been lacking, with limited clarity available in the literature or accreditation guidance.4-7 Early reports have also identified discordance in assessment strategies and thresholds, necessitating a data-driven approach to guide practice.4

Historically, our program has considered any student who passed all courses in the pre-APPE curriculum to be APPE ready. Few programmatic failures during APPEs and a strong pass rate on licensure examinations implied this approach was effective.8 However, successful course completion may overlook deficiencies in individual core outcomes, given the multitude of factors considered in overall course grades.9 As such, students may progress to APPEs despite lacking minimal competence in some skills and knowledge needed for APPEs.

A data-driven APPE readiness assessment plan (APPE-RAP) was developed to complement existing programmatic assessment processes by identifying students underperforming on individual ability-based outcomes (ABOs) and patient care skills. Identified deficiencies will serve to guide early intervention for students as well as curricular modification, if necessary. The purpose of this Brief is to describe the composition of the APPE-RAP along with initial findings following retrospective application to a cohort of students.

METHODS

The APPE-RAP was developed by a faculty subcommittee with membership from the college’s curriculum and assessment committees. The APPE-RAP consisted of skills and knowledge assessment components. Skills-based abilities were assessed through an APPE readiness objective structured clinical examination (OSCE) delivered at the end of the pre-APPE curriculum. Knowledge was assessed through evaluation of performance on summative examinations in the pre-APPE curriculum. Skills and summative knowledge assessments, except for one biochemistry course, were administered through ExamSoft (ExamSoft Worldwide LLC).

Six core patient care skills taught and assessed throughout the pre-APPE curriculum were included (Table 1).10 Standardized, faculty-developed rubrics were used throughout the pre-APPE curriculum, with four consistent proficiency categories (proficient, competent, novice, unacceptable) per item. Each skill was taught and assessed at least twice before the summative assessment. Readiness for each skill was determined by summative, individual performance in an APPE readiness OSCE administered over three days, with one remediation attempt if needed. Per standard practice, remediation was required if a student received any unacceptable rating on a skill's rubric.

Table 1.

Six Core Patient Care Skills and Twelve Knowledge Areas Taught to an Entering Class of Pharmacy Students and Assessed Throughout the Pre-APPE Curriculum (N=131)

The knowledge component includes 12 college-approved ABOs within Domain 1 of Standards 2016 (Table 1).7 These outcomes focus on foundational knowledge assessed exclusively within the pre-APPE curriculum. Each examination question within ExamSoft must be tagged with college-approved ABOs. All questions from pre-APPE examinations that faculty categorized as summative, including remediation attempts, were aggregated for the analysis.

Within the APPE-RAP, readiness for each skill and knowledge area was represented by three distinct categories (green, yellow, red) (Table 2). Skills categorizations aligned with predetermined passing criteria for the APPE readiness OSCE. Knowledge categorizations were created under the assumption that each student cohort represents a normal distribution of performance, with the cohort's aggregate average performance throughout the pre-APPE curriculum serving as the benchmark. A red categorization, assigned to students performing more than two standard deviations below the mean, represents significant underperformance relative to peers. Using individual ABO categorizations, students were then given an overall aggregate categorization (Table 2).
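The cohort-relative categorization can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code (analyses were performed in Excel): the red cutoff of two standard deviations below the cohort mean comes from the text, but the yellow cutoff of one standard deviation is a hypothetical placeholder, since the exact Table 2 thresholds are not reproduced here.

```python
# Sketch of the relative (cohort-normed) knowledge categorization.
# Red cutoff (mean - 2 SD) is from the text; the yellow band (mean - 1 SD)
# is an assumed placeholder for the Table 2 threshold.
from statistics import mean, stdev

def categorize_knowledge(scores):
    """Map each student's aggregate ABO score to green/yellow/red
    relative to the cohort's own distribution."""
    mu, sd = mean(scores), stdev(scores)
    categories = {}
    for student, score in enumerate(scores):
        if score < mu - 2 * sd:
            categories[student] = "red"       # significant underperformance
        elif score < mu - 1 * sd:             # assumed yellow band
            categories[student] = "yellow"
        else:
            categories[student] = "green"
    return categories
```

Because the benchmark is the cohort's own mean and standard deviation, this flags relative outliers rather than enforcing a fixed passing score, which is the design choice the authors describe.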

Table 2.

Thresholds for Determining First-Year Pharmacy Students’ Readiness Category for Skills and Knowledge

To determine whether curricular modification was needed, university-level assessment guidance was used, which suggests curricular intervention if less than 85% of the cohort meets the passing threshold. For skills, the aggregate first-attempt pass rate was reviewed.

Students in the 2018 entering class who progressed to APPEs on time were evaluated, as this was the only cohort to have completed the revised pre-APPE curriculum at the time of analysis. The APPE-RAP was applied retrospectively once all eligible students had completed two APPEs; it was not used to intervene in student progression during this pilot. The focus was to determine the appropriateness of categorization and identify opportunities for improving the APPE readiness plan.

Categorization appropriateness was determined by manual review of each student's performance on their first two assigned APPEs against their readiness categorization. Because knowledge-based data provide early intervention opportunities, we used these data to estimate predictive values, sensitivity, and specificity. Positive predictive value represents the probability that students categorized as red (not APPE ready) would fail at least one of their first two APPEs. Negative predictive value represents the probability that students categorized as yellow or green (APPE ready) would not fail at least one of their first two APPEs. At our institution, there is no required order in which APPEs must be completed; as a result, some students may have had electives as their first two APPEs, while others may have had two required direct patient care experiences.

All analyses were performed using Microsoft Excel. This quality improvement project was reviewed and approved by the Ferris State University Institutional Review Board.

RESULTS

One hundred thirty-one students completed the pre-APPE curriculum in three years and progressed to APPEs in spring 2021. Overall, 87.9% of students were categorized as green or yellow (APPE ready) for aggregate knowledge and skills.

First-attempt skills performance on the APPE readiness OSCE is shown in Table 1. Two skills did not meet the 85% first-attempt pass threshold, indicating the need for curricular review. Most students (94.7%) were categorized as green or yellow for all six skills after one remediation attempt. Fewer than three students (2.3%) received novice ratings in >50% of items assessed for each skill.

Aggregate student performance on each ABO is shown in Table 1. Evaluation of aggregate knowledge categorization identified nine students (7%) as red on multiple ABOs (five on two, one on three, two on four, and one on five). Six of the nine students classified as red (66.7%) were assigned to elective APPEs in their first two blocks, which traditionally have a low failure rate.

Four students (3.1%) did not pass one of their first two APPEs. For aggregate knowledge, one student was classified as red, while the other three were classified as yellow. A more detailed overview of these students’ APPE readiness results is shown in Table 3. Each of these failures was a required, direct patient care APPE (two ambulatory care, two adult inpatient medicine). Based on data obtained from auto-fail criteria on the standard APPE evaluation, failures were due to the following: student’s performance on four or more outcomes was novice (n=4), student failed to meet rotation-specific criteria (n=3), student’s performance was unacceptable on one or more outcomes (n=2), and student’s lack of knowledge prohibited their ability to provide patient care (n=1).

Table 3.

Individual APPE Readiness Results for Each Student Who Failed One of the First Two APPEs

A validation analysis using red categorization on aggregate knowledge as the risk indicator identified APPE failure with 25% sensitivity, 94% specificity, an 11% positive predictive value, and a 98% negative predictive value.
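The reported metrics follow directly from counts stated in the Results: 131 students, 9 categorized as red on aggregate knowledge, and 4 APPE failures, of which 1 was red. A minimal sketch of the 2x2 arithmetic (our reconstruction, not the authors' Excel analysis):

```python
# Reconstruct the 2x2 table from the Results counts.
tp = 1                    # red and failed an APPE
fn = 3                    # not red but failed
fp = 9 - tp               # red but passed
tn = 131 - tp - fn - fp   # not red and passed (119)

sensitivity = tp / (tp + fn)   # 1/4
specificity = tn / (tn + fp)   # 119/127
ppv = tp / (tp + fp)           # 1/9
npv = tn / (tn + fn)           # 119/122
print(round(sensitivity, 2), round(specificity, 2),
      round(ppv, 2), round(npv, 2))  # 0.25 0.94 0.11 0.98
```

These reproduce the reported 25% sensitivity, 94% specificity, 11% positive predictive value, and 98% negative predictive value.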

DISCUSSION

This paper describes an APPE-RAP and its retrospective application to a cohort of students. The APPE-RAP serves as a tool to guide focused remediation and to inform the college of potential needs for curricular revision. Analysis of data within ExamSoft allows feasible evaluation of student-level deficits without novel assessments (eg, mile marker examinations), potentially avoiding additional workload strain on faculty and students. Before implementing the APPE-RAP for programmatic progression decisions, our institution chose to review the data for a cohort of students who had already begun APPEs to determine the appropriateness of the initial categorizations. Because the frequency of students with more than 50% novice ratings on individual skills was low (2.3%), skills thresholds were revised to ensure these underperforming students undergo additional remediation. The revised threshold for red categorization is now a rating of ≥1 unacceptable or ≥50% of ratings at novice for any individual skill. Future use of the APPE-RAP will include enhanced student self-monitoring and/or earlier programmatic remediation interventions.

A review of first-attempt pass rates for skills revealed that a significant number of students required remediation to improve their ability to write SOAP (subjective, objective, assessment, plan) notes, which suggested a curricular deficiency and resulted in course modification. Similar modifications were discussed but not implemented for patient counseling because its assessment occurred virtually due to teaching restrictions related to the COVID-19 pandemic. Several students shared that they perceived a different level of stress when learning and practicing counseling skills via distance-based videoconferencing but being assessed in person, which may have produced spurious results.11

Knowledge-based deficits were more difficult to generalize, as student-specific deficiencies varied. These findings reflect the overall infrequency of APPE failures observed (5.4 failures per year over the last four years), but they also suggest that a simple APPE-RAP designation of red may be insufficient to capture those with an increased likelihood of failing an APPE. While knowledge thresholds remain unchanged, a threshold of multiple yellow ABOs may need to be considered, in addition to other supportive interventions, to ensure success on APPEs.2

Using red as the risk indicator yielded a low positive predictive value and a high negative predictive value. A green or yellow categorization predicted passing APPEs with good accuracy (98% negative predictive value); however, a high proportion of students categorized as red also passed APPEs (11% positive predictive value). Using the red categorization as the threshold would therefore conservatively identify students at risk of failure, and allocating early intervention resources to such students may still be prudent.

Individual APPE-RAP reports are now provided to students beginning in their second year of pharmacy school, giving them insight into their performance. These data will allow student self-monitoring and help guide academic advising efforts, potentially leading to targeted remediation prior to APPEs to improve readiness. This could prevent students categorized as red from spending a disproportionate amount of time during their APPEs reviewing or remediating content, which would limit their advancement of knowledge beyond the pre-APPE level. Our initial analysis demonstrates a way to identify not only those who are failing but also those who share similar deficiencies and may benefit from earlier assistance. Our goal is that early, focused intervention will optimize growth during APPEs.

Focusing on programmatic ABOs and essential skills aligns well with published strategies.2-6 Identifying a consistent passing threshold for item-level achievement proved difficult because of varying assessment philosophies across the college. For example, faculty may prefer to give APPE-level examination questions to second-year students, using grade adjustments to achieve a more normal distribution; however, preadjustment item-level statistics may then indicate poor performance on that ability-based outcome. Instead of an arbitrary performance threshold, the cohort's average performance was leveraged to identify relative outliers in student performance. Because more real-time tracking and dissemination to students is possible, early intervention and/or remediation options could become part of a final APPE readiness plan. Additionally, consideration has been given to adding measures of professionalism to future iterations of the APPE-RAP.2

This analysis represents the work of one institution with one cohort of students, so the results may not be generalizable to other institutions. Reliability will need to be evaluated as additional cohorts complete the pre-APPE curriculum. The timing of skills and knowledge assessments also differed, which may have weighted portions of the pre-APPE curriculum unequally in the APPE-RAP; however, use of relative performance criteria, rather than a preset threshold, to categorize readiness minimized these issues. To simplify data aggregation and build an initial plan, professionalism and early experiential data were not included in our assessment matrix, although both have previously been associated with APPE failures.6 Additionally, not all students were assigned a direct patient care APPE within the first two blocks, which may affect our results. Given the paucity of literature, our intention is for this approach to stimulate further discussion regarding the utility of APPE readiness assessment to improve both individual and aggregate student outcomes.

CONCLUSION

Existing assessment data may be leveraged to identify assessment targets to help quantify APPE readiness. Observed deficiencies can be triaged to focused remediation or curricular revisions based on objective criteria. Further research is warranted not only to identify additional assessment thresholds that enhance quantification of APPE readiness but also to describe the impact of focused remediation on attainment of APPE readiness.

ACKNOWLEDGEMENTS

Kali M. VanLangen, PharmD, BCPS, served as the lead and corresponding author on this paper. She led the development of the APPE-RAP and the preliminary literature review, facilitated the writing of the first draft, and provided critical review and approval of the final version. All authors contributed equally to the writing of the first draft and subsequent revisions.

  • Received December 17, 2021.
  • Accepted July 29, 2022.
  • © 2023 American Association of Colleges of Pharmacy

REFERENCES

1. Bright DR, Adams AJ, Black CJ, Powers MF. The mandatory residency dilemma: parallels to historical transitions in pharmacy education. Ann Pharmacother. 2010;44(11):1793-1799. doi:10.1345/aph.1P394
2. Call WB, Grice GR, Tellor KB, Armbruster AL, Spurlock AM, Berry TM. Predictors of student failure or poor performance on advanced pharmacy practice experiences. Am J Pharm Educ. 2020;84(10):Article 7890. doi:10.5688/ajpe7890
3. Meszaros K, Barnett MJ, McDonald K, et al. Progress examination for assessing students’ readiness for advanced pharmacy practice experiences. Am J Pharm Educ. 2009;73(6):1-8. doi:10.5688/aj7306109
4. VanLangen KM, Meny LM, Bright DR, et al. An initial environmental scan of APPE readiness assessment. Curr Pharm Teach Learn. 2020;12(7):771-775. doi:10.1016/j.cptl.2020.02.015
5. Guirguis E, Sourial M, Jackson J, Bonfiglio M, Nornoo A, Maarsingh H. Developing a comprehensive APPE-readiness plan with a focus on skills, attitudes, and behaviors. Curr Pharm Teach Learn. 2020;12(4):479-486. doi:10.1016/j.cptl.2019.12.035
6. Gilliam E, Nuffer W, Thompson M, Vande Griend J. Design and activity evaluation of an advanced-introductory pharmacy practice experience (aIPPE) course for assessment of student APPE readiness. Curr Pharm Teach Learn. 2017;9(4):595-604. doi:10.1016/j.cptl.2017.03.028
7. Accreditation Council for Pharmacy Education (ACPE). Accreditation standards and key elements for the professional program in pharmacy leading to the doctor of pharmacy degree (“Standards 2016”). Published February 2015. Accessed March 19, 2023. https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf
8. Accreditation Council for Pharmacy Education (ACPE). Policies and procedures for ACPE accreditation of professional degree programs. Published June 2021. Accessed March 19, 2023. https://www.acpe-accredit.org/pdf/CSPoliciesandProceduresJanuary2022.pdf
9. Cain J, Medina M, Romanelli F, Persky A. Deficiencies of traditional grading systems and recommendations for the future. Am J Pharm Educ. 2022;86(7):Article 8850. doi:10.5688/ajpe8850
10. VanLangen KM, Meny L, Bright D, Seiferlein M. Faculty perceptions of entrustable professional activities to determine pharmacy student readiness for advanced pharmacy practice experiences. Am J Pharm Educ. 2019;83(10):Article 7501. doi:10.5688/ajpe7501
11. VanLangen KM, Sahr MJ, Salvati LA, Meny LM, Bright DR, Sohn M. Viability of virtual skills-based assessments focused on communication. Am J Pharm Educ. 2021;85(7):Article 8378. doi:10.5688/ajpe8378