Research Article

Determining Indicators of High-Quality Application Activities for Team-Based Learning

Kristin K. Janke, Robert A. Bechtol, Stephanie James, Gardner Lepp, Rebecca Moote and Peter Clapp
American Journal of Pharmaceutical Education November 2019, 83 (9) 7109; DOI: https://doi.org/10.5688/ajpe7109
Kristin K. Janke, University of Minnesota College of Pharmacy, Minneapolis, Minnesota
Robert A. Bechtol, University of Minnesota College of Pharmacy, Minneapolis, Minnesota
Stephanie James, Regis University School of Pharmacy, Denver, Colorado
Gardner Lepp, University of Minnesota College of Pharmacy, Duluth, Minnesota
Rebecca Moote, University of Texas College of Pharmacy, Austin, Texas
Peter Clapp, Regis University School of Pharmacy, Denver, Colorado

Abstract

Objective. To determine the indicators of quality for application activities in pharmacy team-based learning (TBL).

Methods. A modified Delphi process was conducted with pharmacy TBL experts. Twenty-three experts met the inclusion criteria, which included having at least four years of TBL experience, designing at least eight TBL sessions, training others to use TBL, and authoring a peer-reviewed TBL pharmacy paper. In round 1, panelists responded to five open-ended questions about their successful TBL application activities, including satisfaction with the activity and methods for creating positive student outcomes. In round 2, panelists indicated their level of agreement with the round 1 quality indicators using a four-point Likert scale. Consensus was set at 80% strongly agree/agree. In an open comment period, panelists provided suggestions to help expand the indicator descriptions. Indicators were verified against the TBL and education literature.

Results. Twenty panelists (87% of those eligible) responded in round 1 and 17 (85% participation) in round 2. Sixteen quality indicators were identified in round 1, with 14 achieving consensus in round 2. “Uses authentic pharmacy challenges or situations” (88% strongly agree/agree) and “incorporates or provides effective feedback to groups” (88% strongly agree/agree) met consensus. However, “has multiple right answers” (76% strongly agree/agree) and “incorporates elements from school-specific emphases (eg, faith, underserved)” (53% strongly agree/agree) did not reach consensus.

Conclusions. These indicators can assist faculty members in designing application activities to provide high-quality TBL exercises that promote deep thinking and engaged classroom discussion. The indicators could also guide faculty development and quality improvement efforts, such as peer review of application activities.

Keywords
  • Delphi
  • team-based learning
  • quality
  • application activities

INTRODUCTION

Team-based learning (TBL) is intended to flip the classroom, freeing up class time for application activities in which students work in teams to solve the kinds of problems they will face in the future.1 However, the design of application activities has not been a strong focus in pharmacy’s TBL literature. The use of TBL in US colleges and schools of pharmacy has been described, including factors affecting implementation2 and perceptions of faculty members on educational outcomes.2,3 Pharmacy TBL studies have focused on student knowledge attainment and progression,4-12 student perceptions,10,13-17 professionalism,18,19 teamwork skills,4,18-20 student performance on individual readiness assurance tests (iRATs) and team readiness assurance tests (tRATs),21-23 student engagement and critical-thinking abilities,24 individual variability in learning,25 and use of feedback on TBL exercises and activities.26

A primary goal in TBL is to move beyond content coverage and ensure that students have the opportunity to practice using course concepts to solve the kinds of problems they will likely face in the future.1 One of the biggest difficulties for TBL instructors is creating effective group assignments that promote deep thinking and engaged, content-focused discussion.3,27,28 In health professions education, instructors should use authentic problems that require students to make decisions like those encountered in the clinical arena.29 Understandably, some application activities meet with more success than others. High-quality application activities are not easily constructed and often require repeated rounds of revision to achieve optimal learning outcomes.

In providing guidance for creating effective application activities, Michaelsen and Sweet describe four principles called the “4S rules”: significant problem, specific choice, same problem, and simultaneous report.1 While this guidance focuses on the intent of application activities and elements of their structure and execution, instructors still face many decisions in the design of application activities. To optimize class time, more information is needed on the indicators of quality application activities. What makes a high-quality application activity that results in the learning outcomes we expect to achieve? Isolating and describing quality indicators is a logical step forward in improving this important component of TBL. To that end, the objective of this study was to determine the quality indicators for TBL application activities in pharmacy education. Once quality indicators are identified, interventions that aim to increase quality, such as training programs, checklists, and peer review, can be better examined to determine their effectiveness.

METHODS

This study used a two-round modified Delphi process to garner opinions from pharmacy TBL experts to determine the quality indicators of TBL application activities in pharmacy education. The Delphi process is a methodological technique that requests and refines the collective thoughts and opinions of a panel of experts.30 Data are collected from participants and, through subsequent rounds, are summarized and presented back to them to obtain feedback and measure their level of agreement. The overarching goal is reaching a consensus on the topic being studied.31 Specifically, in pharmacy education the process has been used for creating consensus on guiding principles32 and competencies33 for student leadership development, developing consensus on competencies for professional advocacy,34 investigating students’ perceptions of professional engagement,35 defining bullying behaviors in clinical training of pharmacy students,36 creating consensus on criteria to assess communication skills of pharmacy students,37 and defining the roles and education of veterinary pharmacists and veterinary pharmacy specialists.38 In examining quality, the Delphi technique has been used to select quality indicators in healthcare,39 develop quality criteria for patient decision support technologies,40 and create quality indicators for general practice management.41

Experts have argued that the success of a Delphi study is determined by the expertise of the panel chosen.42 Recommended criteria for expert panel selection include competency within the specialized area43 and credibility.42 To identify potential expert panelists for this study, a search of the pharmacy TBL peer-reviewed literature was conducted. Four journals were searched (the American Journal of Pharmaceutical Education, Currents in Pharmacy Teaching and Learning, Innovations in Pharmacy, and Pharmacy Education) using each of the following key search terms individually and in combination: team-based learning, team based learning (no hyphen), classroom, and education. All authors associated with the identified articles were emailed an online screening survey asking about their expertise and background in TBL. The responses to the screening survey were reviewed by members of the research team, and participants were selected to serve as the panel experts based on the following inclusion criteria: authorship of a peer-reviewed TBL pharmacy paper, at least four years of personal experience using TBL, teaching at least eight hours per year in the classroom, designing at least eight TBL sessions, and experience in training others to use TBL.
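As an illustration of how these inclusion criteria operate together, the sketch below applies them to hypothetical screening-survey responses. The field names and sample records are assumptions for demonstration only; the authors describe reviewing the actual survey responses manually.

def meets_inclusion_criteria(response: dict) -> bool:
    """Return True only if a screening-survey response satisfies every criterion."""
    return (
        response["has_peer_reviewed_tbl_paper"]        # authored a peer-reviewed TBL pharmacy paper
        and response["years_using_tbl"] >= 4           # at least four years of personal TBL experience
        and response["classroom_hours_per_year"] >= 8  # teaches at least eight hours per year
        and response["tbl_sessions_designed"] >= 8     # designed at least eight TBL sessions
        and response["has_trained_others_in_tbl"]      # experience training others to use TBL
    )

# Hypothetical screening responses; candidate "B" falls short of the four-year criterion.
candidates = [
    {"name": "A", "has_peer_reviewed_tbl_paper": True, "years_using_tbl": 6,
     "classroom_hours_per_year": 12, "tbl_sessions_designed": 10, "has_trained_others_in_tbl": True},
    {"name": "B", "has_peer_reviewed_tbl_paper": True, "years_using_tbl": 3,
     "classroom_hours_per_year": 20, "tbl_sessions_designed": 15, "has_trained_others_in_tbl": True},
]

panel = [c["name"] for c in candidates if meets_inclusion_criteria(c)]
print(panel)  # ['A']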

Agreement has not been established within the literature regarding the optimal number of experts needed for a Delphi study.30,43,44 For a homogenous group, 10 to 15 panelists has been suggested as a sufficient number.45 Generally, a panel is composed of under 50 people.46 Considering the pool of available experts, a panel size of 20 was deemed appropriate for this study. Selected experts were invited via email to participate in the Delphi process.

In round 1, panelists were given a definition of application activities based on the TBL literature and research team expertise: focused, in-class team exercises, assignments, or tasks aimed at developing higher-level thinking skills in complex situations. Panelists were then asked five open-ended questions about their most successful TBL application activities, including methods for creating positive student outcomes, satisfaction with and quality of the activity, and characteristics of strong activities. Panelist responses were reviewed, and quality indicator statements were generated from themes in the panelists’ comments. Some panelists also made comments focused on the facilitation and outcomes of application activities (rather than their design); for example, they commented on the importance of “keeping the emphasis on student-led discussion” after the team choices had been revealed. Comments focused on facilitation or outcomes were not studied further in subsequent rounds.

In round 2, a report from round 1 was returned to the panel containing the quality indicator statements (eg, the activity uses authentic pharmacy challenges or situations) and the panelist quotes used to create those summary statements (eg, “cases reflected important, practical scenarios that pharmacists commonly face” and “cases were authentic, meaning the students could see themselves doing the case in real life”). Panelists were asked to indicate their level of agreement with each of the quality indicator statements using a four-point Likert scale (ie, strongly disagree, disagree, agree, and strongly agree). In addition to rating the quality indicator statements, panelists were specifically asked to comment, particularly where they disagreed with an indicator, to aid in refining the statements. In Delphi studies, there are a variety of recommendations on consensus levels42; however, no definitive guidelines are available.30,42 Reported thresholds for consensus on agreement have generally ranged from 55% up to 100%.42 Prior to initiating round 2, consensus was defined as 80% of panelists agreeing or strongly agreeing with a quality indicator for TBL application activities.
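For readers who want to reproduce this kind of threshold check on their own rating data, the following minimal sketch (not the authors’ analysis code) applies the predefined 80% agree/strongly agree rule to a set of hypothetical four-point Likert ratings.

from collections import Counter

CONSENSUS_THRESHOLD = 0.80
POSITIVE_RATINGS = {"agree", "strongly agree"}

def reaches_consensus(ratings):
    """Return the proportion of agree/strongly agree ratings and whether it meets the 80% threshold."""
    counts = Counter(r.lower() for r in ratings)
    proportion = sum(counts[r] for r in POSITIVE_RATINGS) / len(ratings)
    return proportion, proportion >= CONSENSUS_THRESHOLD

# Hypothetical round 2 ratings from 17 panelists for a single indicator statement.
ratings = ["strongly agree"] * 9 + ["agree"] * 6 + ["disagree"] * 2
print(reaches_consensus(ratings))  # (0.882..., True) -> indicator reaches consensus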

A comment period can provide additional context and background on the panelists’ perspectives. The quality indicator statements from round 2 were sent to the panel with the instructions: “If desired, please feel free to comment on this indicator. The comments will help with the descriptors/additional detail that accompany each indicator during reporting.” To give additional direction in understanding each indicator, descriptors were created cooperatively by two investigators using direct quotes and consolidating common phrases from the panelists’ comments in round 2 and the comment period. The descriptors were then reviewed by a third investigator after consulting original response data and approved by consensus among the three investigators. After the descriptors were approved, the authors categorized the indicator statements into four groupings: structure; content; design and intent; and learning actions and results.

For all rounds and the comment period, the web-based survey program Qualtrics (Qualtrics Labs Inc., Provo, UT) was used to collect panelist responses and comments. Both the University of Minnesota and Regis University Institutional Review Boards determined that the study did not meet the regulatory definition of human subjects research.

As a means of verifying the results of the panelists’ work following completion of the Delphi process, the list of quality indicators was reviewed by two TBL practitioners and two faculty members trained and specializing in pharmacy education. Connections between the indicators and available TBL literature were identified by the TBL practitioners, along with applicable literature to aid in using the findings. Likewise, connections between the indicators and general educational concepts, theories, or philosophies were identified by the education specialists, along with applicable literature to aid in use.

RESULTS

One hundred three lead authors and coauthors were asked to complete an online screening survey regarding their expertise and background in TBL. Fifty-one authors’ responses were reviewed by the research team using the inclusion criteria. Twenty-three participants met the criteria to serve as expert panelists for the study. In round 1, 20 panelists (ie, 87% of those eligible) responded, and their responses were used to generate the quality indicator statements. In round 2, 17 panelists (85% participation) indicated their level of agreement with the quality indicators. In an open, optional comment period, 15 (75%) of the 17 panelists provided suggestions to help expand the descriptors used in the indicators.

Sixteen quality indicator statements were generated from the panelists’ responses in round 1. Of these, the panel approved 14 statements by the predefined 80% consensus in round 2 (see Table 1 and Table 2). The two indicator statements that did not meet consensus were “the activity has multiple right answers” (76% agreement) and “the activity incorporates elements from school-specific emphases (eg, faith, underserved)” (53% agreement). Ten of the indicators aligned with three general categories related to a constructivist approach: authentic learning, alignment of content with course objectives, and depth of thinking (Table 3).

Table 1. Structure and Content Quality Indicators of Application Activities for Team-Based Learning Identified by Expert Panelists Who Participated in a Delphi Study

Table 2. Design and Intent, Learning Actions, and Results Quality Indicators of Application Activities for Team-Based Learning as Identified by Expert Panelists Who Participated in a Delphi Study

Table 3. Alignment of Indicators Contributed by Team-Based Learning Experts With Higher Education Concepts

DISCUSSION

The verification process identified several points of alignment between the indicators and contemporary higher-education thinking and approaches. Constructivism suggests the active involvement of the student is an essential component of the teaching/learning process. Indicator 14, “requires involvement/engagement of all students on the team,” speaks directly to this need for involvement. In constructivist theory, each student brings to the classroom a unique set of past experiences, assumptions, skills, and knowledge, and uses these to make meaning from the new information they are exposed to in the course.47 When students are encouraged to use prior knowledge and experience as part of the learning process, they are likely to experience positive outcomes, including retention of information, motivation, persistence, creativity, and overall understanding of the course material.48-50 Kang and colleagues have encouraged the incorporation of constructivism in pharmacy education.51

An authentic learning environment is one in which students are presented with problems or cases that closely mirror a real-world environment.52 The characteristics of an authentic learning activity include real-world relevance, a high level of complexity, opportunities to collaborate and reflect with peers, and an outcome that includes a finished product as part of the assessment (as opposed to an examination being the only assessment of learning).53 In a health professions program, students may find the content more accessible if the learning mirrors what they might see in future practice. This engagement fosters deeper understanding, critical thinking, and motivation to learn more.54 Perhaps most importantly, an authentic environment is an active-learning space in which students are encouraged to contribute, as opposed to a passive environment in which memorization of information is the primary focus.

The theoretical basis for the alignment of objectives, activities, and assessments is described in the work of Wiggins and McTighe, who championed the “backward design” approach to course and curriculum design.55 This approach suggests courses and curricula should be constructed by first clearly identifying the precise learning objectives. The next step is to identify appropriate assessments that would verify students have achieved the learning objectives. Only then are learning activities developed that prepare students to succeed on those assessments. This approach is widely accepted in higher education as a robust and sound method for designing a course or curriculum. In addition, pharmacy educators have advocated for reviewing the alignment of course objectives with assessments and learning activities in order to improve assessment content validity and consistency, thereby improving the overall outcomes of the course.56,57

Rather than assignments that simply expose learners to new information, Michaelsen argues for assignments that encourage depth of thinking and require active involvement in higher-level cognitive skills.58 He states that using higher-level thinking and problem solving is the key to promoting both greater depth of understanding and retention. In addition, he cites a number of studies that demonstrate improved long-term educational impact of group work, when these types of assignments are used.58 However, these forms of thinking may be difficult to precisely articulate because they can be quite wide-ranging and nuanced. For the purposes of this paper, it is appropriate to suggest that “deep thinking” can include both critical thinking and higher-order or complex thinking.

Bloom’s Taxonomy outlines a hierarchical structure of thinking where, as students continue to expand their knowledge in a subject, they are able to apply, analyze, and evaluate information related to that subject.59 Many have suggested that the ability to critically consider all the available information (ie, analyze, evaluate) and apply it to a problem is among the most valuable skills a student can acquire.60 This is certainly true in pharmacy education, in which critical thinking is among the most valuable demonstrable skills for recent graduates.61 With critical thinking as foundational, Peeters and colleagues describe higher-order, complex thinking as involving problem-solving, clinical reasoning, and moral reasoning.62 In pharmacy, this thinking is reflected in the ability to develop and apply sound ethical and clinical judgments.63

The verification process identified several connections between the indicators defined by the expert panel and the literature on TBL practice, providing evidence of consistency between the findings and the literature. A key component of successful TBL is that students come to class having prepared to apply what they have learned (indicator 1).29 When compared to traditional lecture, TBL increases the number of students prepared for class.2 Proper student preparation has been shown to improve student participation in class and significantly improve test scores, regardless of student ability.8,64 Literature indicates that well-prepared application activities promote more intense discussion among team members (consistent with indicator 13), thereby enhancing team cohesiveness.27,28

The panel also agreed that quality applications should be challenging in difficulty (indicator 4) but able to be completed in a feasible timeframe (indicator 2). Although there is no prescribed length of time an application activity should take, Michaelsen and colleagues indicate the goal should be to promote a high level of engagement among team members and to generate enough energy that students are willing to take the time to have meaningful team discussions and defend their answers.65

The panelists agreed that the activity should follow the published rules on TBL “4S” activities (ie, significant problem, specific choice, same problem, and simultaneous report) (indicator 3) for the creation and implementation of application activities.28,29,65 Team-based learning advocates assert that these attributes will result in student accountability and will generate discussion within and among teams.66 Haidet and colleagues recognize the “4S” characteristics as a core element of TBL and state that these characteristics “foster individual and team motivation, a common frame of reference, critical thinking and conceptual depth, and energy during whole-class discussions.”67 In an article proposing TBL best practices in pharmacy education, Farland and colleagues suggest that application activities be practice-based (indicators 6 and 7) and focused on overarching concepts that are significant and potentially integrative between or within courses (indicator 10).68

Although few comments were provided for indicator 5, the panel agreed that quality application activities “incorporate or provide effective feedback to groups.” The educational research literature has documented that feedback is essential to content learning and retention.69 There are many points in the TBL process where feedback is inherent or can be incorporated. For instance, the readiness assurance process provides feedback to individuals and teams on their level of preparation. Students may receive feedback from team members rating their professionalism70 or from instructors attempting to support team leadership.71,72 However, the feedback generated as part of the application process is also important to learning. Students receive feedback from peers during and after application activities when teams simultaneously report an answer (ie, task feedback) and discuss the analysis that led to their decision (ie, process feedback). Medina and colleagues examined the use of a problem-solving rubric to provide verbal and/or written feedback on a group’s ability to prioritize, organize, and defend the best and alternative options in TBL.26 In discussing TBL facilitation, Gullo and colleagues suggest that facilitators provide adequate time for closure, commenting on difficult concepts and emphasizing take-home points, as well as highlighting student comments that corrected an inaccuracy or excelled in some manner.73 More innovation in providing immediate or post-class feedback on applications is needed, in addition to investigation of effects.

Some panelists disagreed with the “specific choice” aspect of the “4S” rules (indicator 3). Comments supported the use of open-ended questions to promote critical thinking, show the richness of clinical decision making, and move students to higher levels of Bloom’s taxonomy. The TBL literature strongly advocates for application activities that force a specific choice to create “constructive controversy” between teams.74 However, specific choice questions do not necessarily need to be written in a multiple-choice format, as several alternative methods have been described (eg, ranking, sorting, sequencing, matching).75 According to Roberson and Franchini, limiting the options from which teams can choose gives the instructor the prerogative to guide the class discussion so that “feedback on the task can be directed at specific, anticipated discoveries and realizations” that are most beneficial to the learning process.75 Panelists commented in round 2 and during the open comment period that adherence to the “4S” rules should not discourage innovation or impede creative approaches to application activities that remain consistent with the principles of TBL1 described by Michaelsen and colleagues.

The panel’s consensus on indicator statements 6 and 7 is also consistent with and expands upon the “4S” rule of providing students with a “significant problem.” In pharmacy, a significant problem includes both an authentic pharmacy challenge (ie, an important, practical, real-life scenario that a pharmacist is likely to face, such as a question on rounds) and relevant pharmacist tasks (ie, one or more things that pharmacists do, such as interpreting a drug study’s relevance for a particular patient). Designing application activities with relatable contextual details (indicator 6) that highlight and exercise pertinent pharmacist duties/tasks/responsibilities (indicator 7) that are also perceived as germane by students is challenging. During the design process for application activities, instructors may find it helpful to consult additional faculty members and/or pharmacists to identify and hone rich and engaging challenges and meaningful, relevant tasks.

The indicator, “The activity has multiple right answers,” achieved 76% agreement but narrowly missed consensus. Comments around this potential indicator of quality involved correct answers vs best answers and the plausibility of distractors. Panelists suggested that an activity with multiple viable answers is more consistent with real-life problem-solving and is more likely to encourage rich discussion. Such an activity may also help students to realize that there might not always be one “right” answer and that there may be multiple ways to approach a problem. However, activities designed with a single best answer are supported by literature on specific-choice in TBL because they compel the group to defend their position against other groups that have made comparable decisions.1 Other comments focused on the point that there is “learning value” in pursuing a best answer. Continued dialogue in this area is needed to help isolate any potential indicators and to understand the insights that may be gleaned from divergent opinions. Both approaches may have merit, assuming that both could be implemented in a way that addressed a particular learning need/goal and led to quality discussion of complexities or nuances.

In round 1, some panelists discussed the integration of specific faith-based or social justice objectives into TBL applications. However, in round 2, only 53% agreed that it is important to “incorporate elements from school-specific emphases (eg, faith, underserved).” Comments suggested that coverage of mission or specialized outcomes is desirable but may not be required in all application activities. Certainly, mandating that every application activity incorporate mission elements would be limiting. However, TBL applications might be an opportunity to help a school’s mission “come alive” for students, demonstrating the school’s commitment to its mission and providing opportunities to discuss the real-world complexities of living out those mission elements.

The results of this study might be further elaborated upon through investigation with student stakeholders. Students may have additional insights into the indicators of quality application activities from their perspective. In addition, because of the limited number of rounds, it is unclear whether items without consensus simply required more discussion and refinement or whether those indicators were, in fact, not strong indicators of quality. Given the asynchronous nature of the investigation, clarification could not be sought from the panelists on the intent of the indicators when questions arose. In particular, comments from later rounds suggested that some panelists were interpreting “indicator” as required (ie, must always be present). However, the intent was simply to emphasize that the item’s presence was likely to support (indicate) success. Clarifying this issue may have generated additional indicators for consideration in the design of application activities, allowing all panelists to brainstorm possibilities outside of the constraints of “those things required.” Future research can use these indicators to examine the quality of application activities, including the effects of training, checklists, and peer review on the quality of the application activities developed.

CONCLUSION

Fourteen quality indicator statements for application activities were identified by consensus among TBL experts. The indicators aligned well with TBL principles and higher education concepts. Pharmacy TBL experts identified quality indicators related to application activities’ structure, content, design and intent, and learning actions and results. The indicators can assist pharmacy faculty members in creating, implementing, and improving application activities to provide high-quality TBL exercises that promote deep thinking and generate engaged classroom discussion. The indicators could also be used to inform faculty development and quality improvement efforts.

ACKNOWLEDGMENTS

This work was supported by the Team-Based Learning Collaborative, which is an organization of educators who encourage and support the use of Team-Based Learning in all levels of education.

  • Received April 11, 2018.
  • Accepted February 6, 2019.
  • © 2019 American Association of Colleges of Pharmacy

REFERENCES

1. Michaelsen LK, Sweet M. The essential elements of team-based learning. New Dir Teach Learn. 2008;2008(116):7-27.
2. Allen RE, Copeland J, Franks AS, et al. Team-based learning in US colleges and schools of pharmacy. Am J Pharm Educ. 2013;77(6):Article 115.
3. Tweddell S, Clark D, Nelson M. Team-based learning in pharmacy: the faculty experience. Curr Pharm Teach Learn. 2015;8(1):7-17.
4. Pogge E. A team-based learning course on nutrition and lifestyle modification. Am J Pharm Educ. 2013;77(5):Article 103.
5. Nation LM, Rutter P. A comparison of pharmacy student attainment, progression, and perceptions using team- and problem-based learning: experiences from Wolverhampton School of Pharmacy, UK. Curr Pharm Teach Learn. 2015;7(6):884-891.
6. Bleske BE, Remington TL, Wells TD, et al. Team-based learning to improve learning outcomes in a therapeutics course sequence. Am J Pharm Educ. 2014;78(1):Article 13.
7. Bleske BE, Remington TL, Wells TD, et al. A randomized crossover comparison of team-based learning and lecture format on learning outcomes. Am J Pharm Educ. 2016;80(7):Article 120.
8. Grady SE. Team-based learning in pharmacotherapeutics. Am J Pharm Educ. 2011;75(7):Article 136.
9. Letassy NA, Fugate SE, Medina MS, Stroup JS, et al. Using team-based learning in an endocrine module for pharmacy students across two campuses. Am J Pharm Educ. 2008;72(5):Article 103.
10. Johnson JF, Bell E, Bottenberg M, et al. A multiyear analysis of team-based learning in a pharmacotherapeutics course. Am J Pharm Educ. 2014;78(7):Article 142.
11. Persky AM, Pollack GM. A modified team-based learning physiology course. Am J Pharm Educ. 2011;75(10):Article 204.
12. Redwanski J. Incorporating team-based learning in a drug information course covering tertiary literature. Curr Pharm Teach Learn. 2012;4(3):202-206.
13. Remington TL, Bleske BE, Bartholomew T, et al. Qualitative analysis of student perceptions comparing team-based learning and traditional lecture in a pharmacotherapeutics course. Am J Pharm Educ. 2017;81(3):Article 55.
14. Wright KJ, Frame TR, Hartzler ML. Student perceptions of a self-care course taught exclusively by team-based learning and utilizing Twitter. Curr Pharm Teach Learn. 2014;6(6):842-848.
15. Frame TR, Cailor SM, Gryka RJ, Chen AM, Kiersma ME, Sheppard L. Student perceptions of team-based learning vs traditional lecture-based learning. Am J Pharm Educ. 2015;79(4):Article 51.
16. Miller DM, Khalil K, Iskaros O, Van Amburgh JA. Professional and pre-professional pharmacy students’ perceptions of team based learning (TBL) at a private research-intensive university. Curr Pharm Teach Learn. 2017;9(4):666-670.
17. Zingone MM, Franks AS, Guirguis AB, George CM, Howard-Thompson A, Heidel RE. Comparing team-based and mixed active-learning methods in an ambulatory care elective course. Am J Pharm Educ. 2010;74(9):Article 160.
18. Persky AM. The impact of team-based learning on a foundational pharmacokinetics course. Am J Pharm Educ. 2012;76(2):Article 31.
19. Beatty SJ, Kelley KA, Metzger AH, Bellebaum KL, McAuley JW. Team-based learning in therapeutics workshop sessions. Am J Pharm Educ. 2009;73(6):Article 100.
20. Gallegos PJ, Peeters JM. A measure of teamwork perceptions for team-based learning. Curr Pharm Teach Learn. 2011;3(1):30-35.
21. Farland MZ, Barlow PB, Lancaster TL, Franks AS. Comparison of answer-until-correct and full-credit assessments in a team-based learning course. Am J Pharm Educ. 2015;79(2):Article 21.
22. Eiland LS, Garza KB, Hester EK, Carroll DG, Kelley KW. Student perspectives and learning outcomes with implementation of team-based learning into a videoconferenced elective. Curr Pharm Teach Learn. 2016;8(2):164-172.
23. Addo-Atuah J. Performance and perceptions of pharmacy students using team-based learning (TBL) within a global health course. Innov Pharm. 2011;2(2):Article 37.
24. Nelson M, Allison SD, McCollum M, et al. The Regis model for pharmacy education: a highly integrated curriculum delivered by team-based learning (TBL). Curr Pharm Teach Learn. 2013;5(6):555-563.
25. Persky AM, Henry T, Campbell A. An exploratory analysis of personality, attitudes, and study skills on the learning curve within a team-based learning environment. Am J Pharm Educ. 2015;79(2):Article 20.
26. Medina MS, Conway SE, Davis-Maxwell TS, Webb R. The impact of problem-solving feedback on team-based learning case responses. Am J Pharm Educ. 2013;77(9):Article 189.
27. Michaelsen LK, Bauman Knight A. Creating effective assignments. In: Michaelsen LK, Bauman Knight A, Fink LD, eds. Team-Based Learning: A Transformative Use of Small Groups. Westport, CT: Praeger Publishers; 2002:53-75.
28. Parmelee DX, Michaelsen LK. Twelve tips for doing effective team-based learning (TBL). Med Teach. 2010;32(2):118-122.
29. Parmelee D, Michaelsen LK, Cook S, Hudes PD. Team-based learning: a practical guide: AMEE guide no. 65. Med Teach. 2012;34(5):e275-e287.
30. Keeney S, Hasson F, McKenna H. Consulting the oracle: ten lessons from using the Delphi technique in nursing research. J Adv Nurs. 2006;53(2):205-212.
31. Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74(9):979-983.
32. Traynor AP, Boyle CJ, Janke KK. Guiding principles for student leadership development in the doctor of pharmacy program to assist administrators and faculty members in implementing or refining curricula. Am J Pharm Educ. 2013;77(10):Article 221.
33. Janke KK, Traynor AP, Boyle CJ. Competencies for student leadership development in doctor of pharmacy curricula to assist curriculum committees and leadership instructors. Am J Pharm Educ. 2013;77(10):Article 222.
34. Bzowyckyj AS, Janke KK. A consensus definition and core competencies for being an advocate for pharmacy. Am J Pharm Educ. 2013;77(2):Article 24.
35. Aronson BD, Janke KK, Traynor AP. Investigating student pharmacist perceptions of professional engagement using a modified Delphi process. Am J Pharm Educ. 2012;76(7):Article 125.
36. Knapp K, Shane P, Sasaki-Hill D, Yoshizuka K, Chan P, Vo T. Bullying in the clinical training of pharmacy students. Am J Pharm Educ. 2014;78(6):Article 117.
37. Mackellar A, Ashcroft DM, Bell D, James DH, Marriott J. Identifying criteria for the assessment of pharmacy students’ communication skills with patients. Am J Pharm Educ. 2007;71(3):Article 50.
38. Ceresia ML, Fasser CE, Rush JE, et al. The role and education of the veterinary pharmacist. Am J Pharm Educ. 2009;73(1):Article 16.
39. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review. PLoS One. 2011;6(6):e20476.
40. Elwyn G, O’Connor A, Stacey D, et al. Developing a quality criteria framework for patient decision aids: online international Delphi consensus process. BMJ. 2006;333(7565):417.
41. Engels Y, Campbell S, Dautzenberg M, et al. Developing a framework of, and quality indicators for, general practice management in Europe. Fam Pract. 2005;22(2):215-222.
42. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376-382.
43. Hsu C, Sandford B. The Delphi technique: making sense of consensus. Pract Assess Res Eval. 2007;12(10):1-8.
44. Williams PL, Webb C. The Delphi technique: a methodological discussion. J Adv Nurs. 1994;19(1):180-186.
45. Delbecq AL, Van de Ven AH, Gustafson DH. The Delphi technique. In: Group Techniques for Program Planning: A Guide to Nominal Groups and Delphi Process. Middleton, WI: Green Briar Press; 1986:89.
46. Witkin BR, Altschuld JW. Planning and Conducting Needs Assessments: A Practical Guide. Thousand Oaks, CA: Sage Publications, Inc.; 1995.
47. Piaget J. The Mechanisms of Perception. New York, NY: Basic Books; 1989.
48. Parcover JA, McCuen RH. Discovery approach to teaching engineering design. J Prof Issues Eng Educ Pract. 1995;121(4):236-241.
49. Johnson JL. Learning communities and special efforts in the retention of university students: what works, what doesn’t, and is the return worth the investment? J Coll Student Retent. 2001;2(3):219-238.
50. Semerci Ç, Batdi V. A meta-analysis of constructivist learning approach on learners’ academic achievements, retention and attitudes. J Educ Train Stud. 2015;3(2):171-180.
51. Kang LO, Brian S, Ricca B. Constructivism in pharmacy school. Curr Pharm Teach Learn. 2010;2(2):126-130.
52. Jonassen DH, Howland JL, Marra RM, Crismond DP. Meaningful Learning with Technology. 3rd ed. Upper Saddle River, NJ: Pearson Education; 2008.
53. Reeves TC, Herrington J, Oliver R. Authentic activities and online learning. In: Proceedings of the 25th HERDSA Annual Conference; 2002:562-567.
54. Bruner J. Toward a Theory of Instruction. New York, NY: W.W. Norton & Company; 1966.
55. Wiggins G, McTighe J. What is backward design? In: Understanding by Design. Upper Saddle River, NJ: Merrill Prentice Hall; 2001:7-19.
56. Wittstrom K, Cone C, Salazar K, Bond R, Dominguez K. Alignment of pharmacotherapy course assessments with course objectives. Am J Pharm Educ. 2010;74(5):Article 76.
57. FitzPatrick B, Hawboldt J, Doyle D, Genge T. Alignment of learning objectives and assessments in therapeutics courses to foster higher-order thinking. Am J Pharm Educ. 2015;79(1):Article 10.
58. Michaelsen L, Knight A, Fink L. Team-Based Learning: A Transformative Use of Small Groups in College Teaching. Westport, CT: Stylus Publishing; 2002.
59. Bloom B, Engelhart M, Furst E, Hill W, Krathwohl D. A Taxonomy of Educational Objectives: Handbook I: Cognitive Domain. New York, NY: David McKay Company; 1956.
60. Bransford JD, Brown AL, Cocking RR, Donovan MS, Pellegrino JW, eds. How People Learn: Brain, Mind, Experience and School. Washington, DC: National Academy Press; 2000.
61. Thompson DC, Nuffer W, Brown K. Characteristics valued by the pharmacy practice community when hiring a recently graduated pharmacist. Am J Pharm Educ. 2012;76(9):Article 170.
62. Peeters MJ, Zitko KL, Schmude KA. Development of critical thinking in pharmacy education. Innov Pharm. 2016;7(1):Article 5.
63. Peeters MJ, Vaidya VA. A mixed-methods analysis in assessing students’ professional development by applying an assessment for learning approach. Am J Pharm Educ. 2016;80(5):Article 77.
64. Artz GM, Jacobs K, Boessen CR. The whole is greater than the sum: an empirical analysis of the effect of team based learning on student achievement. NACTA J. 2016;60(4):405-411.
65. Michaelsen LK, Parmelee DX, McMahon KK. Team-Based Learning for Health Professions Education: A Guide to Using Small Groups for Improving Learning. 1st ed. Sterling, VA: Stylus Publishing; 2008.
66. Dolmans D, Michaelsen L, van Merriënboer J, van der Vleuten C. Should we choose between problem-based learning and team-based learning? No, combine the best of both worlds! Med Teach. 2015;37(4):354-359.
67. Haidet P, Levine RE, Parmelee DX, et al. Guidelines for reporting team-based learning activities in the medical and health sciences education literature. Acad Med. 2012;87(3):292-299.
68. Farland MZ, Sicat BL, Franks AS, Pater KS, Medina MS, Persky AM. Best practices for implementing team-based learning in pharmacy education. Am J Pharm Educ. 2013;77(8):Article 177.
69. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81-112.
70. Emke AR, Cheng S, Dufault C, et al. Developing professionalism via multisource feedback in team-based learning. Teach Learn Med. 2015;27(4):362-365.
71. Alizadeh M, Mirzazadeh A, Parmelee DX, et al. Leadership identity development through reflection and feedback in team-based learning medical student teams. Teach Learn Med. 2018;30(1):76-83.
72. Alizadeh M, Mirzazadeh A, Parmelee DX, et al. Uncover it, students would learn leadership from team-based learning (TBL): the effect of guided reflection and feedback. Med Teach. 2017;39(4):395-401.
73. Gullo C, Ha TC, Cook S. Twelve tips for facilitating team-based learning. Med Teach. 2015;37(9):819-824.
74. Haidet P, Levine RE, Parmelee DX, et al. Guidelines for reporting team-based learning activities in the medical and health sciences education literature. Acad Med. 2012;87(3):292-299.
75. Roberson B, Franchini B. Effective task design for the TBL classroom. J Excell Coll Teach. 2014;25(3&4):275-302.