Abstract
Intensifying accountability pressures have increased attention to the assessment of teaching, but teaching generally represents only a portion of faculty duties. Less attention has been paid to how faculty evaluations can gather data on teaching, research, clinical work, and outreach; integrate clinical and academic contributions; and fill information gaps in strategic areas, such as technology transfer and commercialization, where universities are being pressed to do more. Online reporting systems can enable departments to gather comprehensive data on faculty activities that can be aggregated for accreditation assessments, program reviews, and strategic planning. As detailed in our case study of implementing such a system at a research university, online annual reviews can also be used to publicize faculty achievements, document departmental achievements, foster interdisciplinary and community collaborations, recognize service contributions (and disparities), and provide a comprehensive baseline for salary and budgetary investments.
INTRODUCTION
In many universities, there is no readily available and retrievable database of faculty members’ effort, accomplishments, and productivity that can be used in annual faculty evaluations and comprehensive record audits for accreditation assessments, program reviews, and strategic planning. Databases maintained by departments or divisions often are not easily merged by the university to fill major gaps in institutional records. Implementing a campus-wide online faculty reporting system has the potential to address unit and university record-keeping demands, but presents many challenges that require careful assessment of institutional resources, assumptions, needs, and aspirations. Several years ago, the University of Arizona (UA) developed and implemented an online system for annual reviews of faculty members that allows information on their activities to be accessed remotely, aggregated, analyzed, and reported for a wide range of purposes. The online system draws institutional information from other campus systems on courses, student evaluations, publications, and grants and contracts. All of this information is brought together for departmental reviews by internal groups such as administrators and peer-review committees, or (if de-identified) by external groups such as accreditation agencies. This commentary describes the development of one online system, with an emphasis on data collection for the annual faculty evaluation process, to highlight the issues and opportunities such a system presents.
Before discussing the benefits and constraints of an online reporting system for annual reviews, it is useful to outline the desired features of an effective and practical system. The system would be readily accessible regardless of input platform (ie, cross-platform functionality) or user location, and would require minimal training, with an easy-to-use interface for searches, menu navigation, data entry, and data extraction for various types of reporting. There would be an overarching organizational structure for data entry common to all units in the university, with the flexibility to allow unit-specific documentation, including free-text options if needed. The system would integrate easily with other university systems, allowing automatic, accurate, and reliable uploading of information from those sites. University- and unit-specific reports could be generated rapidly and easily. The system would be efficient and not prone to downtime or slow service during peak usage. Expert help would be readily available for questions or problems at the unit level. The system would be secure. Finally, it would require modest start-up and ongoing (eg, equipment/software updates and personnel) resource costs. Of note, some of these desired characteristics (eg, common overarching elements with unit-specific discretion) may conflict with one another and be difficult to implement in a way that satisfies all parties.
A Case Study in the Implementation of an Online Annual Review System
UA considered developing an online annual reporting system to gather information on faculty members’ contributions after the 2009 recession, to help document the social and economic impact of the university and aid strategic efforts to improve efficiency and expand impact. An informal working group spent several years researching national trends and alternative solutions. Online faculty review systems were becoming more common at the time: an informal survey of Association of American Universities institutions in 2011 found that 14 of the 25 responding institutions were not using online systems, though six were considering or had just launched them. Most of the 11 institutions that were using such a system had developed it in-house.
We spent significant time weighing our options because online annual reviews were not well established, and we needed to work with a wide range of stakeholders. As at most universities, annual reviews at UA are a distributed process shaped by departmental, college, and university constraints and priorities. They are governed by university, college, and department policies and procedures that require peer reviews and college oversight. University policy requires that all faculty members be evaluated on their individual areas of responsibility, with a focus on the percentage of effort assigned to research, teaching, and service, using a five-level scale: truly exceptional, exceeds expectations, meets expectations, needs improvement, and unsatisfactory. Departments and colleges provide more detailed criteria and procedures on what is expected and how it will be assessed, including timelines for review, the composition and role of peer-review committees, and specific benchmarks for assessment.
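For readers who think in code, the Python sketch below models the review structure just described: a percentage-based workload distribution rated on the five-level scale. It is a minimal illustration; UA Vitae’s internal data model is not published, and every name here is ours, not the vendor’s.

```python
from dataclasses import dataclass
from enum import Enum


class Rating(Enum):
    """The university's five-level annual review scale."""
    TRULY_EXCEPTIONAL = 5
    EXCEEDS_EXPECTATIONS = 4
    MEETS_EXPECTATIONS = 3
    NEEDS_IMPROVEMENT = 2
    UNSATISFACTORY = 1


@dataclass
class WorkloadDistribution:
    """Percentage of effort assigned to each area of responsibility."""
    research: float
    teaching: float
    service: float

    def __post_init__(self):
        total = self.research + self.teaching + self.service
        if abs(total - 100.0) > 1e-6:
            raise ValueError(f"effort percentages must sum to 100, got {total}")


# A hypothetical 40/40/20 appointment, rated per area of responsibility.
workload = WorkloadDistribution(research=40, teaching=40, service=20)
ratings = {
    "research": Rating.EXCEEDS_EXPECTATIONS,
    "teaching": Rating.MEETS_EXPECTATIONS,
    "service": Rating.TRULY_EXCEPTIONAL,
}
```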
After considering building on the in-house annual review systems that several UA colleges had developed, we decided to contract with a vendor, and the university launched a system in 2014. We chose a smaller company focused on faculty reporting systems because we perceived it to be more responsive to our distinct institutional needs. Our project team nonetheless learned many strategic lessons from reviewing broader national trends, including the importance of building close collaborations among our research librarians and our institutional research and business intelligence offices.
A provost-led collaboration of campus stakeholders developed the online system, known as UA Vitae. Because annual reviews are a distributed process shaped by departmental and college priorities and constraints, we worked closely with faculty, staff, and administrators across campus. To learn from the diverse needs of different departments and colleges, we worked incrementally, piloting the system with five colleges in 2013; eleven more colleges implemented it in 2014. Four of the original five colleges had homegrown online annual review systems, which let us draw on the experienced faculty user groups those colleges had developed. From the start, we tried to be responsive to varied institutional needs. For example, our initial cohort included our college of medicine, which was seeking ways to track clinical and academic activities.
Three committees facilitated the development, implementation, and ongoing monitoring and improvement of UA Vitae: a leadership team provided an overarching institutional perspective; a campus advisory committee provided formative input on colleges’ and departments’ perspectives on annual reviews; and an implementation team drew upon the experience of college and university IT specialists, administrators, researchers, and librarians in archiving faculty data and supporting its users. Additionally, each college has at least one assigned coordinator, and many colleges have departmental coordinators, to answer questions about the system. Deans, department heads, and other select administrative personnel have access to the system to facilitate faculty input, generate reports, and suggest improvements. These collaborations have been vital to the launch of our online reporting system.
To build on the distributed dynamics of annual reviews, our online reporting system has separate launching pages through which colleges provide their own information on support staff, workshops, and unique procedures. Faculty members access these homepages when they log into the secure system with a password. The college homepage lists general announcements concerning UA Vitae and information on UA Vitae workshops being held in departments and colleges. We also use the user interface to provide real-time support as faculty members complete their annual reviews. For example, screen prompts appear when a faculty member has publications or research abstracts posted to PubMed. The faculty member can review the action item and post it to their activity report, delete it if it is not applicable, or accept and modify it if it requires changes. Other data automatically uploaded to the activity report include instruction in credit-bearing courses (including individual teaching sessions in team-taught courses) and grants/contracts. These uploads not only save faculty members data-entry time but also enable departments and colleges to verify institutional information on faculty teaching, including work with residents and interns in clinical settings. Faculty members can verify these automatic uploads from other campus sources and add information on non-indexed publications, invited presentations with no published abstracts, and grants/contracts not handled through the university’s office of sponsored projects.
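As an illustration of the kind of lookup behind those PubMed prompts, the Python sketch below queries NCBI’s public E-utilities for a faculty member’s publications in a given year and queues each hit as a pending action item. This is our own sketch, not UA Vitae’s vendor-implemented integration; the author name and year are placeholders.

```python
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"


def pubmed_candidates(author: str, year: int) -> list[str]:
    """Search PubMed for publications matching an author and year, and
    return the PMIDs as candidate action items for faculty review."""
    params = {
        "db": "pubmed",
        "term": f"{author}[Author] AND {year}[PDAT]",
        "retmode": "json",
        "retmax": 100,
    }
    resp = requests.get(f"{EUTILS}/esearch.fcgi", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]


# Each hit becomes a pending prompt; nothing is posted to the activity
# report until the faculty member accepts, edits, or dismisses it.
for pmid in pubmed_candidates("Smith JA", 2017):
    print(f"pending review: https://pubmed.ncbi.nlm.nih.gov/{pmid}/")
```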
Our UA Vitae system has activity and data-input categories that are common to all colleges, such as workload distribution and general categories for teaching, research, and service. The option for more detailed activity input and categorization is particularly useful for the health profession colleges, which often have distinctive forms of experiential teaching, clinical services, and business and community collaborations that are rarely recorded in integrated ways that departments can track and aggregate. In pharmacy and other health science fields, teaching and clinical services are richly interrelated but seldom reported together in ways that document the full educational and clinical impact of departments.
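One way to reconcile unit-specific detail with shared reporting, sketched below in Python, is to map each college-specific subcategory to the common category it aggregates into. The subcategory names are hypothetical; we offer this only to illustrate the roll-up principle, not UA Vitae’s actual schema.

```python
# Shared top-level categories used by every college.
COMMON_CATEGORIES = ["teaching", "research", "service"]

# Hypothetical pharmacy-college subcategories, each mapped to the shared
# category it aggregates into for university-level reports.
PHARMACY_ROLLUP = {
    "experiential_teaching": "teaching",
    "clinical_service": "service",
    "community_collaboration": "service",
}


def roll_up(activity_counts: dict[str, int]) -> dict[str, int]:
    """Aggregate unit-specific activity counts into the shared categories."""
    totals = {category: 0 for category in COMMON_CATEGORIES}
    for category, count in activity_counts.items():
        shared = PHARMACY_ROLLUP.get(category, category)
        totals[shared] = totals.get(shared, 0) + count
    return totals


print(roll_up({"experiential_teaching": 12, "clinical_service": 30, "research": 4}))
# {'teaching': 12, 'research': 4, 'service': 30}
```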
Considerations for Launching an Annual Review System
An online reporting system can improve not only a wide range of procedures but also the efficiency of the annual review process itself. Unit administrators can give peer-review committees anywhere, anytime confidential access to faculty activity reports for annual performance reviews. Administrators can compile summary data for internal evaluations or quality improvement purposes, or more granular information for external groups such as accreditation agencies. The report-generation function can be a significant time saver when compiling summary data for multiple faculty members over several years. However, most faculty members input their data only once a year rather than updating throughout the year, so the most complete view of activity is available when annual self-evaluation forms are due.
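Such multi-year summary reports amount to simple aggregation over exported activity records. The Python sketch below, which assumes a hypothetical CSV export with department and year columns, counts activities per department per year, the kind of table an accreditation self-study might require.

```python
import csv
from collections import defaultdict


def summarize(activity_csv: str) -> dict[tuple[str, str], int]:
    """Count activity records per (department, year) from an exported
    activity report, eg, for a multi-year accreditation table."""
    counts: dict[tuple[str, str], int] = defaultdict(int)
    with open(activity_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["department"], row["year"])] += 1
    return dict(counts)


# Usage (hypothetical export file): summarize("activities_2014_2018.csv")
```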
What follows is a list of lessons that department, college, and university leaders should consider when implementing an online system. Time needs to be devoted to researching best practices at peer institutions, and deliberations on implementing such a system should include key stakeholders. Building a shared sense of the need for a clean, complete record of faculty activity is important to help university administrators and local stakeholders own the problem and accept the solution. All parties need to recognize that there will never be a perfect system and that participation is necessary to establish consistent assessments of faculty effort and aggregate data for departments. Similarly, university and local stakeholders need to be consistent when communicating the guiding reasons and intents for the system. System features such as an interface that communicates this information at the point of contact will help, but will not replace the need for university and local stakeholder alignment.
Helping colleges and departments take responsibility for addressing their distinct needs and applications is critical to integrating online review systems into the distributed process of annual reviews in ways that enlist departmental and college staff in supporting the systems. Whether implemented within colleges or across a university, online reporting systems need to respect the distributed dynamics of annual reviews, including both interdisciplinary differences and the varied roles that annual reviews play in career progression, salary increases, accreditation reviews, and program assessments and investments. However, while departmental and college customizations can provide familiarity and control, they should not categorize important activities in ways that do not roll up to shared reports, or become so complicated and time-consuming that faculty members skip them. Further, there will never be perfect data, so leaders should integrate data systems incrementally to work through the issues that will inevitably arise from combining different information sources.
At the faculty level, there are other factors to remember when implementing a new online system beyond the usual information and training exercises. Faculty members rarely see the data held about them in university systems and may be surprised by institutional records of their teaching and research, especially with complex arrangements for grant funding and team-taught classes. Also, some faculty members like to curate their own publication lists: they do not want incomplete publications preloaded into the system, but they also do not want to spend time entering publication data. From an administrative standpoint, we have found that inaccurate or incomplete grant or publication data automatically uploaded into the online system often surfaces useful information about other campus systems. For example, we have found problems with how grant-related information is entered at the original data-entry site, which has led to useful discussions and changes in how this information is handled. Similarly, inaccurate or incomplete publication data have prompted useful discussions of the limitations of publication indexing systems and of when a publication should be considered published (eg, ahead of print or early view vs assigned to a journal issue) for the purposes of annual performance evaluations.
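That last policy question can be made precise. The Python sketch below encodes two hypothetical unit policies for assigning a publication to an evaluation year: count it when it first appears online, or only once it is assigned to a journal issue. The function and dates are illustrative, not a documented UA Vitae rule.

```python
from datetime import date
from typing import Optional


def review_year(epub: Optional[date], issue: Optional[date],
                count_ahead_of_print: bool) -> Optional[int]:
    """Assign a publication to an evaluation year under one of two
    hypothetical unit policies: count it when it first appears online
    (ahead of print / early view), or only once assigned to an issue."""
    if count_ahead_of_print and epub:
        return epub.year
    if issue:
        return issue.year
    return None


# An article e-published in December 2017 but assigned to a 2018 issue
# counts toward 2017 or 2018 depending on the unit's policy.
print(review_year(date(2017, 12, 4), date(2018, 3, 1), count_ahead_of_print=False))  # 2018
print(review_year(date(2017, 12, 4), date(2018, 3, 1), count_ahead_of_print=True))   # 2017
```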
In closing, the decision to develop and implement a university-wide online evaluation system requires ongoing evaluation and refinement. The importance of buy-in from departmental and college administrators, particularly during the early stages of the development process, cannot be overstated. Without administrative buy-in at the department or college level, faculty buy-in will not follow, and it is extremely challenging to win over departments once implementation has begun. Related to faculty buy-in is the adaptability of the system to the distinct needs of departments or divisions: for example, accounting for clinical activities such as patient care and residency training by faculty in the health professions. Finally, the ready availability of local support staff to answer faculty questions is another factor likely to sustain buy-in. In our pharmacy practice department, we estimate that less than 5% of one staff person’s time is spent on questions related to the online system, and most of that time occurs in the period leading up to the annual evaluation deadline. We hope the issues raised in this paper will help others developing or improving online systems for faculty evaluation.
Received December 11, 2017. Accepted February 19, 2018.
© 2018 American Association of Colleges of Pharmacy