
Brief Report

An Objective Structured Clinical Examination to Assess Competency Acquired During an Introductory Pharmacy Practice Experience

Randy D. Martin, Nam Ngo, Homero Silva and W. Russell Coyle
American Journal of Pharmaceutical Education April 2020, 84 (4) 7625; DOI: https://doi.org/10.5688/ajpe7625
All authors: University of North Texas System College of Pharmacy, Fort Worth, Texas

Abstract

Objective. To evaluate the use of an objective structured clinical examination (OSCE) to assess clinical competency acquired during an off-campus introductory pharmacy practice experience (IPPE).

Methods. Third-year pharmacy students completed an IPPE in transitions of care, accruing 24 experiential contact hours at one of 17 practice sites. Students were assessed using two OSCEs: the first administered prior to beginning the off-site IPPE (pre-experience OSCE) and the second after its completion (post-experience OSCE). Each OSCE consisted of 10 stations and covered five graded competency domains. The primary outcome was the degree of change in student performance from the pre-experience OSCE to the post-experience OSCE. Secondary outcomes included changes in each graded domain, OSCE pass rate, and failure conversion rate.

Results. Of 111 students, 109 completed both the pre- and post-experience OSCE. Significant improvements were observed in overall score and cohort pass rate. Overall scores improved from 80 for the pre-experience OSCE to 87 for the post-experience OSCE. The OSCE pass rate also improved from 47% to 84%.

Conclusion. Although preceptor evaluations have traditionally served as the primary summative assessment for IPPE and APPE, this study indicates that OSCEs may be a reliable alternative to assess clinical competency acquired from off-site practice experiences.

Keywords
  • experiential education
  • competency assessment
  • clinical competency
  • objective structured clinical examination
  • introductory pharmacy practice experience

INTRODUCTION

Evaluation of student performance on introductory and advanced pharmacy practice experiences (IPPEs and APPEs) continues to be a controversial topic. In particular, preceptor ratings of student performance have been criticized for their lack of consistency and accuracy.1-3 Nevertheless, a majority of pharmacy schools continue to use preceptor evaluations as the primary source for grade determination in both IPPEs and APPEs.4

One alternative to the traditional preceptor evaluation is the objective structured clinical examination (OSCE). The OSCE was first introduced by Harden and colleagues as a novel method of assessing clinical competence for medical students and has since been adapted to doctor of pharmacy curricula, among numerous other health professional programs.5,6 The OSCE is advantageous in evaluating competency in difficult-to-assess areas, such as communication, problem solving, and sound decision-making.7,8 Compared with other assessments, the OSCE has relatively high reliability, validity, and objectivity. Considering that key drivers of preceptor rater error include leniency, deficits in preceptor knowledge and skills, and the halo effect of a student's prior performance, the OSCE lends itself as a more objective and unbiased alternative.1

Use of the OSCE in pharmacy curricula is common but varies greatly in structure and purpose.6 Schools of pharmacy have implemented OSCEs as formative and summative assessments for both individual courses and program-level performance, and most recently to assess students' readiness to begin APPEs.6,7,9-12 Furthermore, because the OSCE is capable of assessing clinical competency, communication, and professionalism, there is significant potential to leverage the OSCE as the primary assessment for experiential courses.5,7,8 More specifically, some authors have indicated its potential to assess students' performance in APPEs.13,14 Nevertheless, actual use of the OSCE to assess student learning in experiential courses has yet to be described in the pharmacy education literature. A review of other health professions yielded a few articles examining the use of the OSCE in medical clerkships, but literature examining its use in experiential programs in other professions was lacking.13,15,16

The purpose of this study was to describe and evaluate the use of an OSCE administered before and after an IPPE to assess competency acquired during the course. We hypothesized that student OSCE performance would improve following completion of the IPPE and that this performance improvement would be indicative of the knowledge and skills gained during the IPPE.

METHODS

During the third professional year (P3) at the University of North Texas System College of Pharmacy, students completed a required IPPE in transitions of care. The course consisted of P3 students spending 24 hours over a three-week period at a pharmacy practice site (hospital, ambulatory, or hospital-based community pharmacy), during which they completed transition-of-care activities (ie, medication reconciliation and patient education), identified medication-related problems, and made clinical recommendations to their preceptor. Preceptors were provided with a mandatory one-hour online training session as well as guidance documents specific to this course that covered course structure, required activities, preceptor and student expectations, and use of an OSCE to assess student competency upon completion of the course. In addition, preceptors were required to undergo general preceptor development by the college on a regular basis that included instruction on teaching strategies and providing effective feedback to students.

Two OSCEs were administered. The first OSCE was administered prior to students completing the off-site experience and was used as a formative assessment (pre-experience OSCE). Students who failed the pre-experience OSCE were required to attend a debrief session in which a faculty member discussed their areas of weakness but they were not required to remediate. The second OSCE was administered as a summative assessment at the conclusion of the IPPE (post-experience OSCE). Students were required to pass the post-experience OSCE to receive a passing grade for the course. If a student failed the post-experience OSCE, they were allowed a single remediation attempt.

The structure, resources, and assessments were identical for both OSCEs. The only difference between the two OSCEs was case content. Each OSCE consisted of ten 15-minute stations (Table 1). Stations 0 through 6 simulated a hospital admission involving a progressive patient case that started in the emergency department and concluded after the patient was admitted to the intensive care unit (ICU). Stations 7 through 9 simulated a discharge encounter for a new patient. Skills tested throughout the OSCE, such as medication history interview skills and patient counseling, had previously been taught and assessed separately during the first year (P1) skills laboratory course.

Table 1.

Roadmap for an Objective Structured Clinical Examination Used to Assess Pharmacy Students’ Knowledge and Skills in Transitions of Care Prior to and After Completing an Introductory Pharmacy Practice Experience

Cases were initially developed by IPPE course faculty members and internally validated by content experts for accuracy, completeness, and standardization, particularly in terms of difficulty. Clinical case content was derived from second year (P2) pharmacotherapy courses. Consequently, P2 pharmacotherapy course faculty members served as content experts for each case as appropriate. To prevent loss of confidentiality, new cases were developed for each cohort of students.

Cases progressed in compressed time and included common health care challenges, such as health literacy, limited physician accessibility, and incomplete records. Information was revealed to students progressively via a standardized paper-based medical record that included progress notes, orders, laboratory results, and previous encounters. Students interacted with three standardized clients (patients, family members, and physicians) during the OSCE. Patients and family members were portrayed by actors from our standardized patient pool. Physicians were portrayed by faculty members. Standardized client interactions were assessed by live graders who were recruited from our pharmacist grader pool. Standardized clients were provided with a script, including specific and general responses. Both standardized clients and graders received a one-hour training session immediately prior to the OSCE.

The OSCE was assessed on a 100-point scale, with five individually graded competency domains. Each domain accounted for 20 points and included: medication history interview skills (station 2), accuracy of the medication history (station 2), accuracy of medication reconciliation (station 4), presentation and accuracy of therapeutic recommendations to the patient’s provider (station 6), and discharge counseling skills (station 9). Interview skills and counseling skills were assessed by a concealed grader using a global rating scale in which the grader rated students’ performance as: clear pass (20), borderline pass (15), borderline fail (10), or clear fail (5). Accuracy of the medication history and medication reconciliation were assessed by the experiential course director using a standardized rubric that evaluated five components on a four-point rating scale, including identification of current medications, medication order completeness, indication, penmanship, and recording other key data such as allergies and nonprescription products. Satisfactory competency (ie, met the criteria to pass the IPPE) was defined as obtaining an overall score of 75 or higher and a minimum score of 15 out of 20 for each of the five domains. Students who did not achieve a minimum score of 15 in any domain received a grade of “fail” for the entire OSCE. Rubrics, rating scales, and thresholds were developed by the course team and internally validated with other faculty members experienced in developing and conducting OSCEs.
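The scoring logic above can be sketched in code. This is an illustrative sketch only, not the authors' implementation; the domain names are paraphrased from the competency descriptions, and the scores shown are hypothetical.

```python
# Pass/fail logic described above: overall score >= 75 AND >= 15 of 20
# in every one of the five graded competency domains.

DOMAINS = [
    "interview_skills",              # station 2, global rating scale
    "medication_history_accuracy",   # station 2, standardized rubric
    "medication_reconciliation",     # station 4, standardized rubric
    "therapeutic_recommendations",   # station 6, presentation to provider
    "discharge_counseling",          # station 9, global rating scale
]

PASS_OVERALL = 75   # minimum overall score (100-point scale)
PASS_DOMAIN = 15    # minimum score per 20-point domain

def osce_result(scores: dict) -> str:
    """Return 'pass' or 'fail' for one student's five domain scores."""
    overall = sum(scores[d] for d in DOMAINS)
    if overall >= PASS_OVERALL and all(scores[d] >= PASS_DOMAIN for d in DOMAINS):
        return "pass"
    return "fail"

# A hypothetical student: overall 78 clears the 75-point bar, but the
# single domain score of 10 forces a fail for the entire OSCE.
student = {
    "interview_skills": 17,
    "medication_history_accuracy": 17,
    "medication_reconciliation": 17,
    "therapeutic_recommendations": 17,
    "discharge_counseling": 10,
}
print(osce_result(student))  # fail
```

Note how the per-domain floor means a student can fail despite a passing overall score, which is exactly the "fail for the entire OSCE" rule described above.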

The primary outcome was change in student performance from the pre-experience OSCE to the post-experience OSCE. Secondary outcomes included change in overall pass rate, pass rate in each domain, and the failure conversion rate, which was defined as the percentage of students that failed to demonstrate competency in the pre-experience OSCE, but later demonstrated competency in the post-experience OSCE. Student performance during remediation was not included in the analysis.

Student perceptions of the OSCEs were assessed using a standardized and validated institutional student course evaluation, specifically students' responses to the statement: "The exams were representative of materials and objectives presented in the course." Students rated this statement on a five-point scale with anchors of strongly agree (5) and strongly disagree (1). The survey, which was voluntary and de-identified, was administered to students upon completion of the course but before grades were released.

Statistical analyses were conducted using SPSS Statistics 24 (IBM, Armonk, NY). The primary outcome was analyzed using paired t test and secondary outcomes were analyzed using McNemar chi square. Student course evaluation data were analyzed using descriptive statistics. This study was reviewed and approved by the North Texas Regional Institutional Review Board.
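The analyses named above can be sketched as follows. This is a minimal illustration of the same test choices (paired t test for the score change, McNemar chi-square for paired pass/fail outcomes, plus the failure conversion rate defined earlier), run on simulated data, not the study's data.

```python
# Illustrative sketch of the analyses described above, on simulated scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 109                              # students completing both OSCEs
pre = rng.normal(80, 6, n)           # simulated pre-experience scores
post = pre + rng.normal(7, 4, n)     # simulated within-student improvement

# Primary outcome: paired t test on pre vs post scores
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3g}")

# Secondary outcome: McNemar chi-square on paired pass/fail (threshold 75)
pre_pass, post_pass = pre >= 75, post >= 75
b = np.sum(pre_pass & ~post_pass)    # discordant: passed pre, failed post
c = np.sum(~pre_pass & post_pass)    # discordant: failed pre, passed post
chi2 = (b - c) ** 2 / (b + c)        # McNemar statistic, 1 df, no correction
p_mcnemar = stats.chi2.sf(chi2, df=1)
print(f"McNemar chi2 = {chi2:.2f}, p = {p_mcnemar:.3g}")

# Failure conversion rate: share of pre-experience failures who later passed
conversion = c / np.sum(~pre_pass)
print(f"failure conversion rate = {conversion:.0%}")
```

Both tests operate on paired observations from the same students, which is why a paired t test and McNemar (rather than independent-sample alternatives) are the appropriate choices here.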

RESULTS

Of the 111 students enrolled in the course for the 2017-2018 academic year, 109 students completed both OSCEs. Students' mean age was 26.9 years (SD=4.7 years), and 55% of students identified themselves as female. The primary outcome, student performance as represented by mean OSCE score, improved from 80 on the pre-experience OSCE to 87 on the post-experience OSCE (p<.001). In addition, the first-attempt OSCE pass rate increased from 47% to 84% (two-sided p<.001). The conversion rate of students who failed the pre-experience OSCE was 86%.

Significant improvements in pass rate were observed for four out of five domains (Table 2), and numerical scores increased significantly for three of the five domains (Table 3). The greatest observed improvement was for presentation and accuracy of therapeutic recommendations, in which numerical scores increased from 15 to 17 and the pass rate increased from 69% to 92% (p<.001). Significant improvements were also observed in the medication reconciliation competency, with an absolute increase in pass rate of 14% (p=.001) and a numerical score increase from 16 to 18. Performance improvements were less obvious in collecting an accurate medication history, admission interview skills, and discharge counseling skills. These three areas demonstrated significant improvements in either numerical score or overall pass rate, but not both.

Table 2.

First Attempt Student Success Rate for Transitions-of-Care Competency Areas Assessed by Objective Structured Clinical Examination Prior to and After Completing an Introductory Pharmacy Practice Experience

Table 3.

First Attempt Mean Numerical Student Scores for Transitions-of-Care Competency Areas Assessed by Objective Structured Clinical Examination Prior to and After Completing an Introductory Pharmacy Practice Experience

In terms of student perceptions, 41% of students responded to the statement "The exams were representative of materials and objectives presented in the course." Of these 45 respondents, 25 (56%) strongly agreed, 18 (40%) agreed, and the remaining two (4%) gave neutral responses.

DISCUSSION

Using an OSCE, rather than preceptor evaluations, to assess pharmacy students' competency acquired during an IPPE allowed the authors to detect improvements in students' overall competency as well as in specific competency domains. The greatest magnitude of improvement was detected in domains that required critical thinking, clinical reasoning, and communication. This result is consistent with both the medical education literature and experiential learning theory in terms of competencies typically acquired from such experiences.15,17,18

Consistent with the authors’ expectations, these results highlight important advantages to using OSCE to evaluate student competency acquired from off-campus experiences. The OSCE provides a mechanism to standardize summative assessment across diverse practice sites and compare student performance across a single cohort.1 In addition, these results can inform experiential office efforts concerning course design and preceptor development. Although alternatives, such as preceptor observation checklists, have been explored in the literature, these have produced a similar pattern of grade inflation to that seen with preceptor evaluations.19

Furthermore, use of a pre-experience OSCE added significant value to the course. In addition to serving as a baseline assessment to which the final OSCE could be compared, the pre-experience OSCE served as a teaching tool. By identifying and communicating gaps in individual students’ knowledge and skills at the beginning of the IPPE, the students could focus on specific competency areas that needed further development. Additionally, the pre-experience OSCE helped set student expectations for both the off-site learning experience ahead and the final assessment for the course.

To successfully implement a similar approach at other institutions, the realism and fidelity of the OSCE should be given careful consideration. Objective structured clinical examinations in pharmacy curricula vary greatly, from discrete, skill-based assessments to whole-case scenarios that include standardized patients and physicians.9,19 In this study, the OSCE included a whole-case scenario, standardized clients, and factors to enhance the realism of the scenario, all of which improved the validity and efficacy of the OSCE.20-24 In addition to depicting real practice, the OSCE must maintain fidelity to the off-campus experience, and vice versa, to avoid adversely impacting student performance. Consequently, the use of experienced and dedicated preceptors, as well as identification of adequate practice sites, becomes crucial.

In terms of limitations, the external validity of this study is challenged by its small sample size and its conduct at a single institution during a single academic year. Future studies of the OSCE to assess experiential learning should evaluate larger cohorts and, if possible, span multiple institutions. Another key limitation is the unquantified influence of noncognitive confounders, such as performance anxiety, examination familiarity, and unintended patient variance.8,25,26 For example, students may have entered the post-experience OSCE more confidently because they were familiar with the exam structure. Differences in case content may also have influenced the variance in student performance, and although difficulty assessment was a key component of the case validation strategy, unintended differences in case difficulty remain possible.

Despite these limitations, there are many future directions for OSCE use in experiential courses. Based upon our findings, schools of pharmacy should consider adding an OSCE to select IPPEs and APPEs that may benefit from more robust assessment techniques than are currently used. As a reliable quantitative method for measuring improvements in clinical competencies across a cohort of off-campus learners, use of pre- and post-experiential OSCE may also be valuable in conducting educational research.

CONCLUSION

Although preceptor evaluations have traditionally served as the primary assessment for IPPE and APPE, this study indicates OSCE may be a reliable alternative to assess clinical competency acquired from off-site IPPE. Future studies should compare reliability and validity of experiential assessments, including OSCE and preceptor evaluations of the student learner. Additionally, our study suggests that future research may be needed to quantify the impact of non-cognitive confounders upon OSCE performance.

  • Received April 10, 2019.
  • Accepted September 16, 2019.
  • © 2020 American Association of Colleges of Pharmacy

REFERENCES

1. Ried LD, Douglas CA. Towards an operational definition of clinical competency in pharmacy. Am J Pharm Educ. 2015;79(4):54.
2. Manning DH, Ference KA, Welch AC, Holt-Macey M. Development and implementation of pass/fail grading system for advanced pharmacy practice experiences. Curr Pharm Teach Learn. 2016;8:59-68.
3. Tofade T, Shepler BM, Feudo DM, et al. Grading trends and evaluation of student performance across advanced pharmacy practice experiences (APPE) in the Big Ten Academic Alliance (The GRAPPES study). Curr Pharm Teach Learn. 2018;10:1466-1473.
4. Varner LH, Radhakrishnan R, Rollins BL. Preceptor's grading scale preference for student pharmacy practice experience and assessment of the common grading scale among US schools of pharmacy. Curr Pharm Teach Learn. 2018;10:165-169.
5. Harden RM, Stevenson M, Downie WW, et al. Assessment of clinical competence using objective structured examinations. BMJ. 1975;1:447-451.
6. Sturpe DA. Objective structured clinical examinations in doctor of pharmacy programs in the United States. Am J Pharm Educ. 2010;74(8):148.
7. Shirwaikar A. Objective structured clinical examination (OSCE) in pharmacy education – a trend. Pharm Pract. 2015;13:627.
8. Kirton SB, Kravitz L. Objective structured clinical examinations (OSCEs) compared with traditional assessment methods. Am J Pharm Educ. 2011;75(6):111.
9. Ragan RE, Virtue DW, Chi SJ. An assessment program using standardized clients to determine student readiness for clinical practice. Am J Pharm Educ. 2013;77(1):14.
10. Meszaros K, Barnett MJ, McDonald K, et al. Progress examination for assessing students' readiness for advanced pharmacy practice experiences. Am J Pharm Educ. 2009;73(6):109.
11. McLaughlin JE, Jhanova J, Scolaro K, et al. Limited predictive utility of admissions scores and objective structured clinical examinations for APPE performance. Am J Pharm Educ. 2015;79(6):84.
12. Vyas D, Bhutada NS, Feng X. Patient simulation to demonstrate students' competency in core domain abilities prior to beginning advanced pharmacy practice experiences. Am J Pharm Educ. 2012;76(9):176.
13. Peeters MJ, Cox CD. Using the OSCE strategy for APPEs? Am J Pharm Educ. 2011;75(1):13.
14. Jameel A, Noor SM, Ayub S, et al. Feasibility, relevance and effectiveness of teaching and assessment of ethical status and communication skills as attributes of professionalism. J Pak Med Assoc. 2015;65(7):721-726.
15. Prislin MD, Fitzpatrick CF, Lie D, et al. Use of an objective structured clinical examination in evaluating student performance. Fam Med. 1998;30(5):338-344.
16. Brazeau C, Boyd L, Crosson J, et al. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Acad Med. 2002;77(9):932.
17. Townsend AH, McIlvenny S, Miller CJ, et al. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Med Educ. 2001;35:841-846.
18. Kolb DA, Fry R. Towards an applied theory of experiential learning. In: Cooper CL, ed. Theories of Group Processes. New York: Wiley; 1976:33-57.
19. Linedecker SJ, Barner J, Ridings-Myhra J, et al. Development of a direct observation of procedural skills rubric for fourth-year pharmacy students in ambulatory care rotations. Am J Health-Syst Pharm. 2017;74:S17-S23.
20. Epstein RM, Hundert EM. Defining and assessing professional competence. J Am Med Assoc. 2002;287(2):226-235.
21. Schuwirth LWT, van der Vleuten CPM. A plea for new psychometric models in educational assessment. Med Educ. 2006;40(4):296-300.
22. Frederiksen JR, White BY. Designing assessments for instruction and accountability: an application of validity theory to assessing scientific inquiry. Yearb Natl Soc Study Educ. 2004;103(2):74-104.
23. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38(2):199-203.
24. Frederiksen N. The real test bias: influences of testing on teaching and learning. Am Psychol. 1984;39(3):193-202.
25. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singapore. 2005;34:478-482.
26. Urteaga EM, Attridge RL, Tovar JM, et al. Evaluation of clinical and communication skills of pharmacy students and pharmacists with an objective structured clinical examination. Am J Pharm Educ. 2015;79(8):122.