INTRODUCTION
Interprofessional learning experiences prepare future health care professionals for enhanced team-based care of patients, with the goal of improving population health outcomes. The Interprofessional Education Collaborative’s (IPEC’s) core competencies have become the gold standard for planning, implementing, and assessing interprofessional education (IPE) activities. However, the IPEC framework is inconsistently applied across institutions. The IPEC Board of Directors decided to address this issue formally by developing an institutional assessment instrument that leverages IPEC competencies to identify institutional characteristics tied to the successful implementation of programmatic IPE. IPEC partnered with the University of Texas Health Science Center at San Antonio in this effort, and a panel of nationally recognized experts in IPE was formed to contribute to the instrument development process. In addition to demonstrating the need for such an instrument, this commentary encourages institutional IPE leaders across the United States to participate in its development.
DISCUSSION
Interprofessional Collaboration and Review of IPEC Competencies
In 1999, the Institute of Medicine (IOM) reported on deficiencies in interprofessional coordination, communication, and teamwork within the United States health care delivery system.1 Although important progress has occurred since the IOM report was published, IPE and interprofessional teamwork remain inconsistently implemented and assessed across institutions. A national infrastructure now supports a transformation in IPE. Three key organizations have emerged and collaborated over the last decade: the Interprofessional Education Collaborative (IPEC), composed of 21 professional associations representing most health-related colleges and schools in the United States;2 the Health Professions Accreditors Collaborative (HPAC), composed of 25 accrediting bodies in the health professions;3 and the National Center for Interprofessional Practice and Education (the National Center), a public-private partnership that serves as an unbiased, neutral convener and is home to the American Interprofessional Health Collaborative.4
The mission of IPEC is to promote and encourage interprofessional learning experiences that help prepare future health professionals for enhanced team-based care of patients and improved population health outcomes. The IPEC expert panel report, Core Competencies for Interprofessional Collaborative Practice, is recognized as the nation’s leading framework to guide the planning, implementation, and assessment/evaluation of IPE activities and programming. The four IPEC core competencies focus on Interprofessional Teamwork and Team-based Practices, Roles and Responsibilities for Collaborative Practice, Values/Ethics for Interprofessional Practice, and Interprofessional Communication Practices. Each of these four core competencies is supported by 8 to 11 sub-competencies. Academic administrators, educators, and researchers have applied these sub-competencies in myriad ways, including the routine practice of converting select IPEC sub-competencies into learning outcomes based on local needs and interpretations, which has created inconsistencies in their utilization. Some institutions have also superimposed categorization systems onto the IPEC framework to structure longitudinal approaches.5 Inconsistencies have appeared in research as well, particularly in the development of student learning outcome measures.6 This broad uptake and diversity of use reinforce the value and importance of the framework but also highlight challenges: because the framework is applied inconsistently, efforts to move programs of all types, across professions and institutions, in a common direction have been slowed.
The IPEC Board of Directors decided to formally address this issue through the development of an assessment instrument that would leverage IPEC competencies to identify institutional characteristics tied to successful implementation of programmatic IPE, including the capacity for students to engage meaningfully on teams in interprofessional clinical learning environments. Member institutions have expressed a need for such a tool, developed by IPEC and anchored in the IPEC framework, to assess the maturity level of institutional IPE efforts. Individual administrators, educators, and researchers have developed IPEC-based solutions tailored to local needs. However, to transform the health professions education enterprise on a larger scale, a consistent mechanism to implement IPEC competencies in a longitudinal manner is needed. To address this problem, IPEC sought to partner with the University of Texas Health Science Center at San Antonio (UT Health San Antonio) based on its commitment to transform IPE using IPEC’s competencies as a shared framework. This commitment is demonstrated through the university’s Quality Enhancement Plan, Linking Interprofessional Networks for Collaboration (LINC), which seeks to integrate IPEC-focused IPE activities into all programs’ curricula across five schools using recommendations from the HPAC-National Center guideline as a roadmap.7,8
Demonstrating the Need for and Process of Instrument Development
Development of the IPEC institutional assessment instrument began with a scoping review to identify similar instruments in the peer-reviewed literature. Project leaders developed a search strategy around the question: “In higher education for the health professions, how do institutional leaders use IPEC core competencies to assess the quality and effectiveness of IPE programs?” The following inclusion criteria were applied: set in higher education for health professions, topically relevant to the assessment of IPE at the institutional level, based on the IPEC core competencies, and published in a peer-reviewed journal. Two independent reviewers screened the title and abstract of each identified record against the inclusion criteria. Of the 469 records identified through the search, no peer-reviewed publications were found that could inform the research question. This finding further reinforced the relevance and importance of the project and the need for such an instrument.
Forming an interprofessional panel of nationally recognized experts in the field of IPE was one of the first steps in the process. Eligibility criteria included formal administrative appointment as a designated university-wide IPE leader and demonstrable IPE expertise via a peer-reviewed publication record and letters of support. Panelist selection was also guided by an explicit desire for diversity in geography and professional affiliation, as well as a commitment to spearhead future pilot testing of multiple iterations of the instrument at their institutions. An expert panel of 16 individuals was ultimately selected to join our team from a competitive pool of 52 applicants (Table 1).9
Table 1. IPEC Institutional Assessment Instrument Expert Panel
Using a modified Delphi technique, the experience and wisdom of the expert panel will be leveraged to generate a pool of testable items intended to capture the institutional characteristics, structures, and processes that collectively inform the depth to which IPEC core competencies can be achieved.10-14 A structured dialogue exploring the experts’ opinions and perspectives will yield a set of potential items for inclusion in the IPEC institutional assessment instrument. The size and composition of the expert panel were strategically selected to align with evidence-based recommendations to maximize the validity of the items produced.15,16 Once finalized, the pool of items will be administered to a convenience sample of institutional IPE leaders recruited from across the United States. Responses will be subjected to exploratory factor analysis (EFA) to generate a preliminary structure for the instrument.17 A convenience sample of 100-200 participants is proposed based on feasibility considerations and guidelines for instrument development.18
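For readers unfamiliar with EFA, the sketch below illustrates one way item responses could be factor analyzed to suggest a preliminary instrument structure. It is a minimal, illustrative example only, not the project’s analysis plan: the file name (pilot_responses.csv), the assumption that each column is a Likert-type item and each row one respondent, the Kaiser criterion for choosing the number of factors, and the oblimin rotation are all placeholder assumptions. It uses the open-source Python factor_analyzer package.

```python
# Minimal EFA sketch (hypothetical data layout; not the project's actual analysis plan).
# Assumes a CSV in which each row is one institutional IPE leader's responses and
# each column is one candidate instrument item scored on a Likert-type scale.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

responses = pd.read_csv("pilot_responses.csv")  # hypothetical file of item responses

# Check whether the item correlation matrix is suitable for factoring.
chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)
print(f"Bartlett p = {p_value:.3f}, overall KMO = {kmo_overall:.2f}")

# Fit an unrotated solution and inspect eigenvalues to choose a factor count.
fa = FactorAnalyzer(rotation=None)
fa.fit(responses)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())  # Kaiser criterion as a simple starting point

# Refit with an oblique rotation, on the assumption that IPE-related factors correlate.
fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
fa.fit(responses)
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(2))  # preliminary structure: items grouped by their loading pattern
```

In practice, the retained factor count and rotation would be guided by scree plots, parallel analysis, and interpretability rather than the single rule of thumb shown here.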
CONCLUSION
Designated IPE leaders at academic institutions throughout the United States represent the targeted end-user group for the IPEC institutional assessment instrument. We envision, for example, that chancellors/vice chancellors, presidents/vice presidents, provosts/vice provosts, and directors of university-wide IPE centers and offices will use this instrument to understand the maturity of IPE programming on their campuses. This target poses a challenge for instrument development because the pool of individuals who hold these positions is relatively small, and we therefore anticipate difficulty recruiting a sufficiently large convenience sample of designated IPE leaders. We write this commentary both to demonstrate the need for such an instrument, given the lack of published data in the literature, and to implore universities to share this paper with IPE leaders who may be willing to participate in the forthcoming validation process. Their participation will be necessary to move IPE assessment forward at the institutional level and to transform health professions education.
ACKNOWLEDGMENTS
The authors would like to acknowledge the Josiah Macy Jr. Foundation for supporting this project through Macy President’s Grant P21-01.