Research Article

The Impact of Problem-Solving Feedback on Team-Based Learning Case Responses

Melissa S. Medina, Susan E. Conway, Tamra S. Davis-Maxwell and Ryan Webb
American Journal of Pharmaceutical Education November 2013, 77 (9) 189; DOI: https://doi.org/10.5688/ajpe779189
The University of Oklahoma College of Pharmacy, Oklahoma City, Oklahoma

Abstract

Objective. To determine the amount and type of feedback needed to improve pharmacy students’ problem-solving skills using team-based learning (TBL) and a problem-solving rubric.

Methods. A problem-solving rubric was developed to assess pharmacy students’ ability to prioritize, organize, and defend the best and alternative options on TBL cases. The study involved 3 groups of pharmacy students: second-year students in a cardiology class who received no feedback (control group), third-year students in an endocrinology class who received written feedback only, and third-year students in an endocrinology class who received written and verbal feedback. Students worked in groups on all TBL cases except the first and last (at the beginning and end of the course), which students completed independently to serve as a pretest and posttest.

Results. Significant improvements were seen in the ability of the third-year students who received verbal and written feedback to prioritize the information presented in the case and in their total score on the problem-solving rubric.

Conclusion. Providing pharmacy students with written and verbal explanations may help them improve their problem-solving skills overall. During verbal feedback, faculty members can provide more examples of how to improve and can field questions if needed.

Keywords
  • team-based learning
  • problem-solving
  • rubrics
  • assessment
  • feedback

INTRODUCTION

Problem-solving is a critical skill for pharmacists. Failure to recognize, resolve, and prevent drug-related problems can result in significant patient harm. Thus, teaching pharmacy students effective problem-solving skills is important. The Center for the Advancement of Pharmacy Education (CAPE) educational outcomes and Accreditation Council for Pharmacy Education (ACPE) competencies emphasize critical-thinking and problem-solving skills.1,2 The CAPE outcomes for pharmacy practice state that students should be able to formulate patient-centered pharmaceutical plans, including identification of drug-related problems, use of literature to make evidence-based decisions, and recommendations for appropriate drug therapy as part of a care plan.1 ACPE Standard 11 states that teaching and learning methods should foster the development and maturation of critical-thinking and problem-solving skills to produce graduates who become competent pharmacists. Standard 11.2 further specifies that development of critical-thinking and problem-solving skills should occur through active-learning strategies (such as case studies).2 Using cases in instruction helps students develop problem-solving skills, especially when the format emphasizes (1) exploring several possible problem solutions rather than focusing on 1 correct answer, (2) using quality evidence to support proposed solutions, and (3) reflecting on a solution’s strengths and weaknesses.3-6

These 3 areas are linked to the IDEAL problem-solving model by Bransford and Stein, which emphasizes identifying the problem, defining goals, exploring multiple strategies/solutions, anticipating outcomes and acting, and looking at the effects and learning from the experience.7 Helping students develop these problem-solving skills is important because pharmacy students’ ability to use evidence to support their solutions is often weak.8

One way to help students develop their problem-solving skills is for faculty members to provide students with feedback about their problem-solving performance. Feedback is most effective when it is objective, formative (not graded), specific, and structured, and when it allows for identification of strengths and weaknesses.9 Feedback is different from “feeding,” which focuses on praising students; evaluation, which involves grading (summative) and occurs at the end of an activity; and self-reflection, where individuals may overlook what needs improvement.9 Rubrics are written grading tools used in assessment that are useful in providing feedback because they describe the continuum of acceptable to unacceptable performance along with an associated point value in a grid.10 While rubrics are useful in evaluating performance and providing feedback, it is unclear whether they facilitate as much improvement if the assessment is provided without verbal feedback.

Team-based learning (TBL) is an established active-learning strategy that uses patient cases with emphasis on exploring multiple problem solutions, using evidence to support proposed solutions, and reflecting on a solution’s strengths and weaknesses to develop problem-solving skills.3-6,11 Team-based learning consists of 3 phases: preparation, readiness assurance, and application of course concepts. In the preparation phase, students are required to complete out-of-class readings. The readiness assurance phase is achieved through individual and team quizzes, which occur at the beginning of class to ensure that students adequately reviewed the readings. In the application phase, students practice case-based problems with plausible multiple-choice solutions. The students first evaluate and discuss the cases within small teams and then with the entire class, with feedback on team responses provided by faculty members.11 Through the TBL process, students receive feedback on their mastery of content and application of course concepts to patient case scenarios, but intentional feedback on the problem-solving approach the students used is not a required aspect of this phase. Therefore, discussion of the overall problem-solving strategy used may not occur.

Team-based learning methodologies have been adopted within many pharmacy school curricula.12-21 Literature reports on the use of TBL in pharmacy education are largely descriptive in nature. The outcomes measured have generally been course grades12-18 and student perceptions and attitudes through course evaluations and surveys.12,13,16-19 These reports validate the benefits of TBL to acquire knowledge and apply course concepts, develop teamwork skills, and be engaged in the classroom environment. More rigorous evidence is lacking on the value of TBL as a teaching method for knowledge retention beyond the course offering, problem-solving skill development, and communication skills development.

At the University of Oklahoma College of Pharmacy, TBL has been implemented within several courses. The integrated cardiology and endocrinology modules are 2 courses that have been successfully using TBL methodologies for several years.12,13 A review of the college’s curriculum found that courses using TBL provided students with an opportunity for structured development of problem-solving skills, but did not provide intentional discussion, specific feedback, or assessment specific to problem solving skills. The lack of specific problem-solving feedback may hinder the advancement of students’ problem-solving skills and limit the usefulness of TBL as a teaching strategy. Therefore, this study evaluated the effect of deliberate and structured feedback on and assessment of problem-solving skills taught within TBL courses through the use of a problem-solving rubric. The problem-solving rubric was evaluated within the 2 courses with established TBL teaching methods: the cardiology module in the spring of the second year and the endocrinology module in the fall of the third year.

The first objective of the study was to use a problem-solving rubric to compare improvement in problem-solving skills in 3 groups: students receiving no feedback (control group), students receiving written feedback only, and students receiving written and verbal feedback. The second objective was to evaluate the effect of deliberate problem-solving feedback on students’ ability to use a problem-solving rubric to prioritize, organize, and defend the best and alternative options on TBL cases. We hypothesized that the ability of students who received weekly written and verbal feedback to prioritize and defend solutions would significantly improve compared with those abilities in students who received written feedback only or no specific problem-solving feedback using a problem-solving rubric.

METHODS

A problem-solving rubric was developed to evaluate 4 areas: answer selection, answer prioritization and defense, organization of the response, and evidence (12 items worth 20 points total). These 4 parts of the rubric aligned with the “EA” of the IDEAL problem-solving method, where E is exploring multiple strategies/solutions and A is anticipating outcomes and acting.7

Second- and third-year doctor of pharmacy students at the University of Oklahoma College of Pharmacy participated in the study. Participation in the study was based on enrollment in 1 of 3 required module courses that used TBL as a teaching method. The study received IRB approval prior to enrollment.

The study included 3 separate groups. Group 1, the control group, included 108 second-year PharmD students enrolled in a required 4-credit-hour cardiology course that lasted 6 weeks. Group 1 received no written or verbal problem-solving feedback via the problem-solving rubric. Group 2 included 121 students enrolled in a required 3-credit-hour third-year endocrinology course that lasted 6 weeks. This group received weekly written feedback via the problem-solving rubric. Group 3 included 121 students enrolled in the same third-year required endocrinology course offered during a different semester. Group 3 received weekly written and verbal feedback on their problem-solving skills via the problem-solving rubric.

One unique TBL case per week related to the cardiology or endocrinology course content was used. The faculty members in the courses wrote and selected the cases after discussing the level of difficulty of the case and answers. Each case concluded with 4 multiple-choice items and instructions for students to rank their solution preference from 1 to 4, circle the letter of their most preferred answer, and explain the reason for that selection. Students were also instructed to explain why they did not select the other 3 options. The 4 answer options were all plausible; however, 1 option was the best based on the scientific literature and patient data.

The TBL sessions were conducted during regular live class sessions in 2 university classrooms on separate campuses connected by videoconference. Participants were randomly divided into teams of 5 to 7 students prior to the start of each of the 3 courses, and those teams remained intact throughout the course. The TBL sessions were conducted according to Michaelsen’s TBL structure, ie, students first completed readiness assurance tests individually and in a team and then worked on TBL patient cases in teams.11,12 The cardiology class completed 6 TBL sessions while each of the endocrinology classes completed 13 TBL sessions. Participants completed the first case on the first TBL day and the last case on the last TBL day individually to serve as a pretest and posttest, and these cases were graded by study personnel using the problem-solving rubric. Groups 2 and 3 used the same endocrinology cases for the study. Then once a week, the first case of each class’s TBL session was completed by each student team and was graded by study personnel using the problem-solving rubric. Course faculty provided verbal case debriefings about case content for all 3 groups. Group 1 received only the verbal debriefings about TBL case content. Groups 2 and 3 received the graded rubric as written problem-solving feedback each week. In addition to the written problem-solving feedback, group 3 also received verbal problem-solving feedback specific to the problem-solving rubric.

For this study, the same 2 cardiology faculty members and the same 3 endocrinology faculty members led the TBL case sessions to increase consistency in the delivery and structure of the TBL sessions. The same 3 study personnel graded all of the TBL study cases throughout the study and were trained using a case answer key and the rubric. Inter-rater reliability checks were done for 10% of the cases among the 3 graders. When grading discrepancies were found, grading criteria were discussed and resolved among the 3 graders. The grades for the study cases were not factored into students’ course grades.
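The inter-rater reliability check described above can be sketched as a simple per-item agreement computation between pairs of graders. This is a hypothetical illustration, not the study’s actual procedure or data; the function name and scores below are invented.

```python
def percent_agreement(scores_a, scores_b):
    """Fraction of rubric items on which two graders gave identical scores."""
    if len(scores_a) != len(scores_b):
        raise ValueError("graders must score the same items")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Invented item-level scores from 2 graders on one sampled case
# (12 rubric items, as in the study's rubric)
grader_1 = [1, 1, 0, 2, 2, 1, 1, 0, 1, 2, 1, 1]
grader_2 = [1, 1, 0, 2, 1, 1, 1, 0, 1, 2, 1, 1]
agreement = percent_agreement(grader_1, grader_2)  # 11 of 12 items match
```

Items with discrepant scores (here, the fifth) would then be flagged for the graders to discuss and resolve, as the study describes.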

Scores for each of the 4 parts of the rubric were tallied and recorded: part 1, answer selection (maximum score=3 points); part 2, answer prioritization and defense (maximum score=9 points); part 3, organization of the response (maximum score=2 points); part 4, evidence (maximum score=6 points); maximum total rubric score=20 points. Results were analyzed using analysis of variance and the Tukey method for multiple pairwise comparisons. The scale for parts 1 and 3 on the rubric was too small to warrant individual analysis. Instead, parts 1 and 3 were analyzed within the total score.
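The analysis described above can be sketched as follows. The per-student change scores are invented for illustration; only the method (a one-way analysis of variance followed by Tukey’s multiple pairwise comparisons) comes from the text.

```python
from scipy.stats import f_oneway, tukey_hsd

# Hypothetical per-student changes in total rubric score (posttest - pretest)
group_1 = [0.5, -0.2, 0.1, 0.4, 0.0, 0.3]   # no feedback (control)
group_2 = [0.9, 0.6, 0.8, 0.5, 0.7, 0.6]    # written feedback only
group_3 = [3.1, 3.5, 3.0, 3.6, 3.2, 3.4]    # written + verbal feedback

# One-way ANOVA tests whether any of the group means differ
f_stat, p_value = f_oneway(group_1, group_2, group_3)

# Tukey's HSD then identifies which specific pairs of groups differ
# while controlling the family-wise error rate
tukey = tukey_hsd(group_1, group_2, group_3)
```

Tukey’s method is the appropriate follow-up here because comparing 3 groups pairwise without correction would inflate the chance of a false-positive difference.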

RESULTS

Group 1 had 108 students, group 2 had 121 students, and group 3 had 121 students complete the study. There was a significant difference between groups 3 and 2 in the change in total rubric score from pretest to posttest: the mean score for group 3 increased 3.3 points while that for group 2 increased 0.7 points (Table 1). To evaluate the effect of deliberate problem-solving feedback on participants’ ability to prioritize, organize, and defend the best and alternative options using evidence, students’ scores on the 4 separate parts of the rubric were evaluated. Significant changes were noted on 2 of the 4 parts. The change from pretest to posttest in total prioritizing score (part 2 of the rubric) was significantly greater for group 3 than for groups 1 and 2; group 3’s total prioritizing score increased 1.8 points more than group 2’s. There was also a significant difference between groups 3 and 2 in the pretest-to-posttest change in total evidence score (part 4 of the rubric): group 2’s score decreased 0.7 points while group 3’s score was unchanged. No significant differences were noted in changes in scores on answer selection (part 1) or response organization (part 3).

Table 1.

Comparison of Pharmacy Students’ Pretest and Posttest Scores on a Team-Based Learning Exercise for Which Different Methods of Feedback Were Provided

DISCUSSION

Pharmacy students may benefit from both written and verbal feedback about their ability to answer, prioritize, organize, and defend their solutions to TBL case problems. The significant change in group 3’s (received verbal and written feedback) total score from pre- to posttest compared to that of group 2 (written feedback only) suggests that written and verbal explanations may help students more than written feedback alone. The written feedback may delineate faculty problem-solving expectations, such as what qualifies as evidence or what an organized response is, while the verbal feedback may allow faculty members to provide more examples of how to improve and answer specific student questions if needed.

The data on prioritizing solutions (related to the second objective and part 2 of the rubric) suggests that the written and verbal feedback explanations from faculty members helped students learn the importance of explicitly prioritizing multiple solutions. In the pretest case responses, participants commonly wrote that they would select a certain answer and did not like the other 3 options. This finding suggests that students treated the answers as right and wrong and were comfortable selecting a “correct” option even though all of the answer choices were plausible. Pretest written feedback encouraged participants to rank each choice from 1=first choice through 4=last choice.

The results for the evidence section (part 4) of the problem-solving rubric suggest that participants struggled with how to use evidence to support their solutions on both the pretest and posttest. Receiving written comments alone may have confused the participants. In fact, the posttest results showed a decrease in scores among the groups, especially in group 2, for which a significant decrease in posttest scores was seen (p=0.002). When students did not use the scientific literature or guidelines to defend their solutions, the graders would circle the missing items on the rubric in that section and provide a written reminder. On the posttest, many participants wrote “per class notes” or “per the guidelines” as their evidence statement. It is possible that the lack of a verbal explanation did not allow for specific clarification of how to use the evidence, leading to greater confusion among the students. Anecdotally, when some participants were asked why they wrote “per the guidelines,” they explained that they felt the statement offered enough evidence.

One limitation of the study was the variability in the baseline pretest grades among the 3 groups, which we expected would be equal. Upon further analysis, this difference was attributed to grading differences, with 1 grader giving consistently lower scores than the other 2 graders. This was further complicated by the graders not evaluating the same students on the pretest and posttest. Instead the cases were randomly and evenly distributed to the graders.

Furthermore, the investigators noted early in the study that the problem-solving rubric needed wording changes because graders interpreted rubric items differently. The rubric was not changed since the study was in progress and it was thought that the inter-rater reliability checks had ameliorated these differences, but they had not. This unintended consequence reveals that, although grading rubrics are useful tools when evaluating written answers, regular training and debriefing among multiple graders is essential. In addition, the percentage of answers used in inter-rater reliability checks may need to be increased, especially when the grading rubric is new and faculty members are less experienced with the tool, as was the case in this study.

Although the rubric was pilot tested, it would have been helpful to field test the rubric further to identify needed clarifications and wording changes prior to the study initiation. It was difficult and inappropriate to make changes to the rubric once the study was underway.

The use of the cardiology class with second-year students as the control group made it more difficult to control for confounding factors such as different teaching styles and a different class year. The cardiology class was chosen because its use of TBL was similar to that in the endocrinology class and because both classes were required courses.

Another limitation was the types of graders used in the study: only 1 of the graders was a content expert. This may have created a discrepancy where 1 grader focused on course content in the answers while the other 2 graders focused on problem-solving content. This is a limitation not easily addressed. Those using TBL know the grading burden associated with using TBL, especially if all of the cases are graded. Therefore, having additional graders can decrease the grading burden and increase the rate at which feedback is returned to students. However, adding multiple graders can result in grading discrepancies, even when a grading rubric and training on the rubric are used. Future studies should focus on refinement and validation of the problem-solving rubric to increase inter-rater reliability.

CONCLUSIONS

Team-based learning is designed to improve students’ problem-solving skills, but the type of feedback that best facilitates this improvement has been unclear. Providing specific verbal and written problem-solving feedback to students regarding their performance on team-based learning may be the most effective strategy. Even though this approach may be more time-consuming, written feedback alone may not provide enough guidance or clarification to students to help them improve.

ACKNOWLEDGEMENTS

The authors thank the faculty members at the University of Oklahoma College of Pharmacy who helped with the study: Drs. Ann Lloyd, Todd Marcy, Toni Ripley, Jeremy Johnson, and Holly Herring.

Footnotes

  • ↵* Author affiliation at time of study. Dr. Davis-Maxwell's current affiliation is with the Department of Marketing, Illinois State University.

  • Received March 19, 2013.
  • Accepted May 18, 2013.
  • © 2013 American Association of Colleges of Pharmacy

REFERENCES

  1. Medina MS, Plaza CM, Stowe CD, Robinson ET, DeLander G, Beck DE, Melchert RB, Supernaw RB, Roche VF, Gleason BL, Strong MN, Bain A, Meyer GE, Dong BJ, Rochon J, Johnston P. Center for the Advancement of Pharmacy Education (CAPE) 2013 educational outcomes. Am J Pharm Educ. 2013;77(8):Article 162.
  2. Accreditation Council for Pharmacy Education. Standards and guidelines for the professional program in pharmacy leading to the doctor of pharmacy degree, Guideline Version 2.0. Adopted January 23, 2011. http://www.acpe-accredit.org/standards/default.asp. Accessed February 6, 2013.
  3. McKeachie WJ, Svinicki M. Problem-based learning: teaching with cases, simulations, and games. In: McKeachie WJ, ed. McKeachie’s Teaching Tips: Strategies, Research, and Theory for College and University Teachers. 12th ed. Boston: Houghton Mifflin; 2006:222-225.
  4. Ertmer PA, Stepich DA. Case-based instruction in post-secondary education: developing students’ problem-solving expertise. ERIC Document Reproduction Service No. ED 435375; 1999.
  5. Eshach H, Bitterman H. From case-based reasoning to problem-based learning. Acad Med. 2003;78(5):491-496.
  6. Kelson ACM. Epilogue: assessment of students for proactive lifelong learning. In: Evenson DH, Hmelo CE, eds. Problem-Based Learning: A Research Perspective on Learning Interactions. Mahwah, NJ: LEA; 2000:315-345.
  7. Bransford JD, Stein BS. The IDEAL Problem Solver. 2nd ed. New York: WH Freeman & Company; 1993:19-49.
  8. Medina MS. Relationship between case question prompt format and the quality of responses. Am J Pharm Educ. 2010;74(2):Article 29.
  9. Medina MS. Providing feedback to enhance pharmacy students’ performance. Am J Health-Syst Pharm. 2007;64(24):2542-2545.
  10. Medina MS. Using rubrics to assess student performance on rotation. Am J Health-Syst Pharm. 2008;65(16):1502-1506.
  11. Michaelsen LK. Team-based learning in large classes. In: Michaelsen LK, Knight AB, Fink LD, eds. Team-Based Learning: A Transformative Use of Small Groups. Westport, CT: Praeger; 2002:157-171.
  12. Letassy NA, Fugate SE, Medina MS, Stroup JS, Britton ML. Using team-based learning in an endocrine module taught across two campuses. Am J Pharm Educ. 2008;72(5):Article 103.
  13. Conway S, Johnson J, Ripley T. Integration of team-based learning into a cardiovascular module at synchronous distant campuses. Am J Pharm Educ. 2010;74(2):Article 35.
  14. Persky AM. The impact of team-based learning on a foundational pharmacokinetics course. Am J Pharm Educ. 2012;76(2):Article 31.
  15. Grady SE. Team-based learning in pharmacotherapeutics. Am J Pharm Educ. 2011;75(7):Article 136.
  16. Persky AM, Pollack GM. A modified team-based learning physiology course. Am J Pharm Educ. 2011;75(10):Article 204.
  17. Kolluru S, Roesch DM, de la Fuente AA. A multi-instructor, team-based, active-learning exercise to integrate basic and clinical sciences content. Am J Pharm Educ. 2012;76(2):Article 33.
  18. Redwanski J. Incorporating team-based learning in a drug information course covering tertiary literature. Curr Pharm Teach Learn. 2012;4(3):202-206.
  19. Gallegos PJ, Peeters JM. A measure of teamwork perceptions for team-based learning. Curr Pharm Teach Learn. 2011;3(1):30-35.
  20. Walters DE. Team-based learning applied to a medicinal chemistry course. Med Princ Pract. 2013;22(1):2-3.
  21. Zingone MM, Franks AS, Guirguis AB, George CM, Howard-Thompson A, Heidel RE. Comparing team-based and mixed active-learning methods in an ambulatory care elective course. Am J Pharm Educ. 2010;74(9):Article 160.