Research Article: TEACHERS' TOPICS

Repeated Testing to Improve Skills in a Pharmacy Practice Laboratory Course

Kimberley Begley, Michael S. Monaghan and Yongyue Qi
American Journal of Pharmaceutical Education August 2013, 77 (6) 130; DOI: https://doi.org/10.5688/ajpe776130
All authors: School of Pharmacy and Health Professions, Creighton University, Omaha, Nebraska

Abstract

Objective. To evaluate the impact of repeated simulations and testing on the pharmacy practice skills development of third-year doctor of pharmacy (PharmD) students.

Design. A pharmacy practice skills laboratory was redesigned to reinforce skills development and enhance retention. Timed, repeated learning experiences that increased in complexity throughout the semester were used to test student knowledge, skills, and abilities.

Assessment. Over a 5-year period, scores from skills-based activities deemed essential to professional practice and repeated 4 or more times in the course were analyzed. There was a significant improvement in scores on drug utilization reviews and patient counseling simulations despite the increasing difficulty and complexity of the medication problems presented (p < 0.001). Students’ scores on prescription verification and sterile product verification also improved significantly over 3 assessments (p < 0.001), but then plateaued, with less improvement seen in performance on subsequent assessments.

Conclusion. Providing multiple opportunities for students to conduct or simulate pharmacy practice activities and then test their knowledge and skills improves students’ learning and performance.

Keywords
  • pharmacy practice laboratory
  • simulation
  • testing
  • pharmacy skills
  • assessment
  • pharmacy practice

INTRODUCTION

Pharmacy programs must use and integrate teaching and learning methods that have been shown through curricular assessments to produce graduates who become competent pharmacists.1 Learning a new skill encompasses 3 phases: the cognitive phase (identification and development of the component parts of the skill), the associative phase (linking the component parts into a smooth action), and the autonomous phase (developing the learned skill so that it becomes automatic).2,3 The teaching of a skill can be accomplished using 2 educational components: verbal instructions and demonstration, which produce a mental picture of the skill for the student; and practice with feedback, which allows the student to improve the performance of the skill.

Multiple testing not only enhances learning but also improves long-term retention.4,5 The School of Pharmacy and Health Professions at Creighton University took these ideas regarding skill development and long-term learning and used them to construct a pharmacy practice skills “learning laboratory.” This 15-week laboratory course is conducted in parallel with the classroom-based portion of the curriculum and is designed to allow students to apply classroom material. The course’s intent was to develop students’ cognitive and performance skills in the areas covered in the classroom. This longitudinal skills development experience reinforces educational outcomes through student participation in knowledge-in-use activities that provide feedback on student performance. It also provides a curricular structure for embedded assessments that may be used to demonstrate students’ achievement of educational outcomes and to generate data for programmatic assessment.

DESIGN

The Dispensing and Pharmaceutical Care Laboratory was a 1-credit-hour laboratory, aligned with a corresponding 2-credit-hour lecture, occurring in the second semester of the third year of the pharmacy curriculum. The laboratory was intended to provide students with an opportunity to enhance their skill development and apply knowledge learned in the classroom. Students enrolled in the school’s campus pathway and those enrolled in the distance pathway were both required to complete the 15-week laboratory course. An average of 163 third-year pharmacy students were enrolled in the laboratory per semester. The campus students were divided into 2 sections and met for a 3-hour session once weekly. Distance students completed the classroom content of the course synchronously with the campus students throughout the semester via classroom capture technology and completed the laboratory portion on campus during a 2-week summer session approximately 3 weeks before the initiation of advanced pharmacy practice experiences (APPEs).

In 2007, faculty members redesigned the Dispensing and Pharmaceutical Care Laboratory. Pioneering practice skill activities were conceived and developed to be stepping stones for students as they progressed toward their APPEs. Experiences were designed that used active-learning methods to advance students’ ability to think and problem solve in a self-reliant and critical way. Equally important was ensuring that students possessed the basic knowledge, skills, and abilities to independently practice pharmacy at the time of graduation. Simulated experiences were added to the laboratory that involved communication skills, critical decision making, and patient safety in the health care setting. These performance-based assessments mimicked real-world scenarios in order to enrich and heighten students’ skills development.

In the practice laboratory setting, it was vital to use teaching and learning techniques that aided in students’ assimilation of pharmacy principles, supported and expanded students’ knowledge base, and augmented practice skills and abilities. The frequent testing model (the “testing effect”) has demonstrated that repeated testing improves students’ long-term recollection better than repeated studying.6 The restructured laboratory activities implemented this type of testing model, in which students were tested, evaluated, and assessed each week.

This revised laboratory format used 2 different approaches to student examination and assessment. The first method used scripted scenarios in which faculty members or pharmacy alumni acted as faculty preceptors or standardized patients and directly interacted with students. This direct contact allowed them to more easily assess pharmacy students’ abilities to provide effective patient care and to ascertain whether students were competent to enter practice. Because of the large class size, the ideal one-on-one faculty-student interactions were sometimes not logistically possible. New methods of testing student competency were developed to address this issue. The second method of assessment used an examination software program as an efficient and practical means to assist in evaluation. Non-interactive tasks or “silent” objective structured clinical examinations were created that focused on patient safety and drug distribution (eg, sterile product verification, prescription verification). Students repeatedly checked retail and hospital prescriptions and determined whether the prescriptions were appropriate to dispense. Students then entered their answers into the examination program and were given immediate feedback.
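The examination software behind these “silent” stations is not described in the article; a minimal sketch of the grading logic it implies (1 point for a correct dispense decision, with immediate feedback) might look like the following, where the function name and the sample item are hypothetical:

```python
def grade_verification(student_dispenses, key_dispenses, explanation):
    """Score one 'silent' verification item: 1 point if the student's
    dispense/do-not-dispense decision matches the answer key, plus
    immediate feedback text for the student."""
    correct = student_dispenses == key_dispenses
    prefix = "Correct. " if correct else "Incorrect. "
    return (1 if correct else 0), prefix + explanation

# Hypothetical item: the prepared prescription should NOT be dispensed.
score, feedback = grade_verification(
    student_dispenses=False,
    key_dispenses=False,
    explanation="The label directions exceed the prescribed daily dose.",
)
```

Immediate, automated feedback of this kind is what allowed the course to replace delayed paper-based grading while preserving the testing effect the design relies on.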

From 2008 to 2012, the laboratory series followed the same configuration. The first laboratories of the semester were dedicated to reviewing such topics as drug information; non-sterile compounding; non-oral dosage forms and home diagnostics; and diabetes care, insulin pumps, and glucometers. In the last 7 to 8 sessions of the semester, the laboratory was divided into 10 workstations and students rotated from one station to the next at determined intervals (Figure 1). The intent was to structure multitasking activities in an environment of “controlled chaos” in which students had to make accurate and appropriate decisions and provide optimum patient care in a limited amount of time. All 10 stations had to be completed and the student had to demonstrate proficiency to receive a passing score. Examples of activities occurring at these stations are given in Figure 1.

Figure 1. Content and rotation order for the 10-station performance assessment process used in the pharmacy skills laboratory.

Many of these restructured sections of the laboratory curriculum focused on management of various acute and chronic diseases. Laboratory sessions that paralleled topics learned in the classroom portion of the course were designed to reinforce students’ cognitive and performance skills. These diverse, simulated experiences were repeated each week to ensure students had more than one instance to practice the skill and improve. Tasks and activities that were deemed essential to professional practice were repeated 4 or more times during the laboratory series. These activities included patient counseling, drug utilization reviews (brown bag reviews), prescription verification, and sterile product verification.

With consistent testing, evaluation, and feedback, students were expected to become more competent and confident in providing safe and effective care. Laboratory objectives focused on these 4 specific abilities and mandated that students be able to demonstrate them prior to their experiential year (eg, students shall evaluate for accuracy both outpatient prescriptions and inpatient medication orders filled by other individuals or robotic systems; given a prescription and patient data, students shall provide verbal counseling; given a prescription and patient data, students shall be able to perform a prospective drug utilization review).

The constructed experiences contained scenarios requiring students to use their knowledge, skills, and abilities to resolve patient issues and optimize therapeutic outcomes. The design of each activity was deliberate and purposeful. Scenarios were standardized to contain the same types of information. Because of the possibility of material being shared between pharmacy classes, the laboratory included different activities each week so there was minimal possibility of memorization. Students were informed of the basic layout of the 10 laboratory stations but were not given any details about the patient cases or medications they would review.

Students began the laboratory rotations with an introductory laboratory activity that served as a baseline to gauge their ability. Based on the one-on-one activities and examination software grades, faculty members could identify students who might be struggling in certain areas and ensure that attention or remediation was given to them as the semester progressed. Each week, the laboratory activities became increasingly complex. Both the number of prescriptions that students evaluated and the complexity of patient cases increased from the initial week to the final week. The 4 activities repeated 4 or more times during the course are described below.

Brown bag review. One of the 10 stations was the brown bag review. The brown bag review was a one-on-one faculty-to-student workstation. Students were given the patient’s self-care and prescription medications and asked to perform a drug utilization review. The student was given a set amount of time to analyze the contents of the bag. He/she then reported the findings to the faculty member who had a rubric to assess the student’s recommendations (available from author upon request). There were 6 drug utilization review categories and each correct answer was worth 1 point. The student had to achieve a minimum score to pass the activity. The faculty member gave the student immediate feedback and reviewed the important points of the drug utilization review process specific to that case. The brown bag activity was repeated weekly, but contained more and different medications each time (the number increased from approximately 12 medications in the initial review to 17 in the final review). If several students missed a significant concept, a similar situation could be repeated in a future brown bag activity as a means of remediation. The brown bag review was repeated 6 times during the semester. The repetitive nature of performing drug utilization review was meant to enhance students’ clinical skills and promote knowledge retention.

Patient counseling. The patient counseling session was another weekly rotation and a one-on-one faculty-to-student workstation. The student was given a prescription and a patient profile and went into a patient counseling room where a faculty member or pharmacy alumnus played the role of the patient. The student advised the patient on the new medication. The student was graded by the faculty member/alumnus, who used a rubric to assess the information presented and communication skills (available from author upon request). Each item on the rubric was worth 1 point and the student had to achieve a minimum score to pass the activity. At the end of the session, the faculty member/alumnus provided feedback to the student on his/her performance. The patient counseling scenarios were repeated each week with new patient profiles and medications on which to counsel. These sessions became more involved as the semester progressed (drug-drug interactions, therapeutic duplications, and patient allergies were added) and were designed to promote skill development and recollection.

Prescription verification. Both prescription verification and sterile product verification were repeated weekly. They were both “silent” objective structured clinical examinations. Each station in the laboratory was equipped with a computer. In the prescription verification workstations, prepared retail prescriptions with hard copies were laid out for students to check. The student was given a checklist to help him/her establish a process or method to check prescriptions (checklist available from author upon request). The student determined if the prescription contained any errors or was appropriately prepared. One point was awarded if the student correctly identified whether the product should be dispensed. The student recorded his/her answers in an assessment software program that provided immediate feedback.

Sterile product verification. Students’ knowledge and skills in sterile product verification were assessed in the same way. The student was given physician orders and the prepared parenterals were laid out in the “hood” for him/her to check. The student was given a checklist to help him/her establish an approach to check parenteral products (checklist available from author upon request). The student decided if the compounded parenteral product contained any errors or if it had been prepared correctly. One point was awarded if the student correctly identified whether the product should be dispensed. The student recorded his/her answers in the same software program used for prescription verification and received immediate feedback. In both prescription verification and sterile product verification activities, the quantity and complexity of prescriptions were increased throughout the semester.

EVALUATION AND ASSESSMENT

The 15-week laboratory course was completed by 814 third-year students from 2008 to 2012. Scores from activities repeated 4 or more times (patient counseling, brown bag reviews, prescription verification, and sterile product verification) were analyzed to determine whether students’ performance improved with repeated activities. Grading in the laboratory course involved 2 categories of assessment: knowledge-based testing and skills-based performance, according to a definitional system model.7 Each category was equally weighted, and exceptional performance in one could not compensate for poor performance in the other. To receive a passing grade, the student was required to meet or exceed the standards for each category. The grade for the knowledge-based testing category was determined by students’ performance on quizzes. For the skills-based category, students were required to meet or exceed minimum competency standards in each of the laboratory assignments and/or exercises, where points were earned based on student performance and communication skills. Each laboratory activity was graded pass or fail, and the laboratory skills grade was based on the percentage of activities passed: A = passed 90% or more of the activities; B = passed 80% or more; C = passed 75% or more. For example, if a student received an A average on the quizzes but passed less than 75% of the laboratory assignments, the student failed the course. Students had to achieve a passing grade in the course before beginning their APPEs.
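The two-category grading rule can be sketched as a small function. The thresholds come from the text; the function itself and the boolean quiz flag are hypothetical simplifications, since the article does not state how quiz averages map to letter grades:

```python
def skills_grade(met_quiz_standard, fraction_passed):
    """Letter grade under the two-category model: failing either
    category fails the course, and exceptional performance in one
    cannot compensate for poor performance in the other."""
    if not met_quiz_standard or fraction_passed < 0.75:
        return "F"  # eg, an A quiz average with <75% of activities passed
    if fraction_passed >= 0.90:
        return "A"
    if fraction_passed >= 0.80:
        return "B"
    return "C"
```

Making the two categories non-compensatory guarantees that a student cannot reach APPEs on quiz performance alone without demonstrating the practice skills.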

Statistical analysis was performed using SPSS Statistics 20 (IBM Corporation, Armonk, NY). For student performance data from the brown bag, patient counseling, and prescription verification exercises, one-way repeated measures ANOVA with both within- and between-subject factor effects was used to test whether student mean scores on trials improved over time and across years. This was followed by separate repeated measures ANOVAs with only a within-subject factor effect, conducted on all exercises for each academic year. Contrasts were made to compare the mean of each trial to that of the previous trial. A p value < 0.05 was considered significant for all tests.
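The analysis was run in SPSS; to illustrate the within-subject test, the one-way repeated measures F statistic can be computed directly from a subjects-by-trials score matrix. The sketch below uses hypothetical data; in practice the p value would then be obtained from the F distribution with the returned degrees of freedom:

```python
import numpy as np

def rm_anova_f(scores):
    """One-way repeated measures ANOVA F statistic for an
    n-subjects x k-trials score matrix: tests whether mean scores
    differ across repeated trials, partialling out subject effects."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_total = ((x - grand) ** 2).sum()
    ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_trials = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_error = ss_total - ss_subjects - ss_trials   # residual after removing subject and trial effects
    df_trials, df_error = k - 1, (n - 1) * (k - 1)
    f_stat = (ss_trials / df_trials) / (ss_error / df_error)
    return f_stat, df_trials, df_error
```

The per-year analyses in the article correspond to running such a test once per academic year on that year’s subjects-by-trials matrix.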

Student scores improved significantly both with repeated attempts (trials) within the same academic year and across academic years for the 3 exercises: drug utilization review (brown bag reviews), patient counseling, and prescription verification (Table 1). The trends of student performance scores on repeated trials across 5 years for brown bag, patient counseling, and prescription verification exercises are shown in Figure 2, Figure 3, and Figure 4, respectively. These figures illustrate repeated attempts by the same student within the academic year for each of the 5 academic years. Students performed progressively better on the brown bag exercises, despite the increasing difficulty and complexity of the medication problems (p = 0.001 for all comparisons). This was true for all years measured, 2008 through 2012 (p = 0.001). Likewise, for patient counseling exercises, student scores improved significantly with each assessment (p = 0.001 for all comparisons); this was found for all years measured, 2008 through 2012 (p = 0.001).

Table 1.

Comparison of Mean Student Scores With Repeated Attempts Within Each Year and Between Subsequent Years for Drug Utilization Reviews, Patient Counseling, and Prescription Verification Exercises

Figure 2.

Mean student scores for brown bag exercises for repeated trials within years and across years.

Figure 3.

Mean student scores for patient counseling exercises for repeated trials within years and across years.

Figure 4.

Mean student scores for retail prescription verification exercises for repeated trials within years and across years.

For the prescription verification exercises, student performance was measured for years 2009 through 2012. Scores continued to improve in all years with each repeated trial (p ≤ 0.002 for all comparisons) but plateaued, with smaller or nonsignificant improvement in performance after the third trial. This trend also held for student scores in sterile product verification.

Table 2 shows the grand means of student performance scores for each academic year and the comparisons between each year’s mean and the previous year’s mean for the exercises (except sterile product verification). Scores improved significantly from year to year for the brown bag and prescription verification exercises, but not for the patient counseling exercises, for which improvement was found only from 2009 to 2010 and from 2010 to 2011.

Table 2.

Comparison of Mean Student Scores for 3 Pharmacy Practice Skill Exercises Across Academic Years 2008-2012a

DISCUSSION

We attempted to design an educational environment that maximized learning opportunities by using multiple assessments. We collected student performance data for 5 academic years to evaluate whether we were successful in this endeavor. Over this period, we examined scores from skills-based activities repeated 4 or more times. Mean student scores continued to improve on the more cognitive practice skills (drug utilization reviews and patient counseling), even over 4 or more exercises. In contrast, student scores associated with the more technical skills of prescription verification (retail and parenteral product confirmation, or checking) plateaued after 3 activities. These data are important in 2 ways. First, even though we increased the complexity of the brown bag reviews and the patient counseling sessions, student performance continued to improve with each challenge. These findings lend credence to the idea that the more cognitive skills of pharmacy practice require deliberate practice and feedback to improve student performance, and that these activities should be performed by a pharmacist and not by other persons associated with pharmacy practice (technicians).

After increasing over the first 3 activities, prescription verification scores tended to plateau, suggesting that the majority of students had achieved some degree of “mastery” of these activities. These data suggest that these less cognitive pharmacy practice skills could be performed by pharmacy support personnel (technicians), freeing up pharmacists for more patient-centered practice. In the laboratory sessions, the number of trials for these verification activities has been reduced to allow the introduction of new experiences. To require students to complete more attempts would consume valuable laboratory time with minimal additional benefit to student learning.

There was an increasing trend in student scores with repeated attempts (trials) from academic years 2008 to 2012. We hypothesize that our continued revisions to the laboratory process accounted for these improved trends in scores. Each year, student feedback was used to enhance the laboratory experience, both in process and outcome; thus, continuous quality improvements were made with each laboratory offering. Initially, the majority of activities were one-on-one interactions between a faculty member and a student; however, 2 issues arose. The number of faculty members needed to conduct the laboratory sessions, and the tremendous associated workload, became problematic. Additionally, all rubrics and checklists were on paper, which took time to grade and delayed feedback to students on their performance by weeks. Constructing activities using examination software that graded students and provided immediate feedback solved both of these problems and may account for the continued improvement in student scores in subsequent academic years.

Some students expressed concerns on course surveys that they were being tested on activities that they had never practiced before. To address this, videos were made to demonstrate what an appropriate patient counseling session entailed. A video of a faculty member demonstrating a stepwise approach to conducting a brown bag review was also created to give students an idea of how a pharmacist should approach this activity. Similarly, students had no method or process to follow when checking retail prescriptions and sterile products, so checklists were made and distributed to students. This gave students a starting point for developing a system that worked for them, and the addition of this tool also may have improved performance over the years.

Student feedback also identified inconsistent grading among faculty members and pharmacy alumni as potentially problematic and affecting scores. In response, videos of acceptable and unacceptable patient counseling sessions were produced for faculty members, with detailed explanations of how to grade each session using the patient counseling rubric. Brown bag review videos discussing the purpose and expectations of drug utilization review were also produced and disseminated to faculty members in an attempt to improve inter-rater reliability.

CONCLUSION

Conducting repeated assessments of students and providing immediate feedback to them can create an educational environment that maximizes student learning (eg, drug utilization reviews and patient counseling) while minimizing faculty member and student workload (eg, prescription verification). Such a pharmacy skills practice environment can improve the cognitive skills and abilities necessary for students to become competent and proficient pharmacy practitioners.

  • Received February 8, 2013.
  • Accepted May 10, 2013.
  • © 2013 American Association of Colleges of Pharmacy

REFERENCES

  1. Accreditation Council for Pharmacy Education. Accreditation standards and guidelines for the professional program in pharmacy leading to the doctor of pharmacy degree. Effective February 14, 2011. https://www.acpe-accredit.org/pdf/ACPE_Revised_PharmD_Standards_Adopted_Jan152006.pdf. Accessed April 18, 2013.
  2. Fitts PM, Posner MI. Human Performance. Oxford, England: Brooks and Cole; 1967.
  3. Fischer KW. A theory of cognitive development: the control and construction of hierarchies of skills. Psychol Rev. 1980;87(6):477-531.
  4. Karpicke JD, Roediger HL. Repeated retrieval during learning is the key to long-term retention. J Mem Lang. 2007;57(2):151-162.
  5. Roediger HL, Karpicke JD. Test-enhanced learning. Psychol Sci. 2006;17(3):249-255.
  6. Larsen DP, Butler AC, Roediger HL. Repeated testing improves long-term retention relative to repeated study: a randomized controlled trial. Med Educ. 2009;43(12):1174-1181.
  7. Walvoord BE, Anderson VJ. Effective Grading: A Tool for Learning and Assessment. San Francisco, CA: Jossey-Bass Publishers; 1998:96-97.