Abstract
Objective. To evaluate faculty and student perceptions of virtual skills-based assessments focused on communication, and to compare student performance on virtual versus in-person assessments.
Methods. In spring 2020, virtual skills-based assessments were conducted. After all assessments were completed, two 12-item questionnaires, one for students and one for the faculty members who conducted the assessment, were designed to assess perceptions of virtual skills-based assessments. The surveys were distributed via an online platform to second- and third-year (P2 and P3) pharmacy students and to faculty who had participated in a virtual skills-based assessment. Scores from the spring 2020 virtual skills-based assessment were compared to scores on the in-person skills-based assessment that took place in spring 2019.
Results. Of the 19 faculty and 279 students invited to participate, 18 (94.7%) faculty and 241 (86.4%) students responded. The majority of faculty (88.9%) and students (63.5%) perceived the virtual skills-based assessments to be effective at simulating an interaction. However, only 33.3% of faculty and 28.6% of students preferred the virtual environment. There was not a significant difference in student performance between in-person and virtual assessments for patient consultation and SOAP note skills.
Conclusion. Providing sufficient formative and summative feedback to pharmacy students is a challenge, particularly in the context of skills-based assessments. Students and faculty reported that the virtual assessment provided an opportunity for an appropriate assessment of student communication skills. However, a strong preference for using virtual skills-based assessments in the future was not observed.
INTRODUCTION
Frequent practice of communication skills in healthcare professional education is essential to prepare trainees for real-world experiences.1-3 Providing formative and summative feedback supports meaningful learning for trainees, but financial and personnel support are required to conduct these time-consuming activities.3,4 Traditionally, such assessments have been provided almost exclusively in person.
The coronavirus disease 2019 (COVID-19) pandemic made in-person instruction impossible, so Doctor of Pharmacy (PharmD) programs quickly transitioned to alternate methods of assessment, including skills-based assessments.5,6 While anecdotal evidence suggests that alternate assessment methods were effective during the pandemic, questions emerged related to the continued utility of virtual skills-based assessments as an opportunity to improve the quantity and quality of assessment throughout PharmD programs.7 Virtual skills-based assessments may offer opportunities to meet the Accreditation Council for Pharmacy Education’s expectations while eliminating evaluators’ travel logistics and some general expenses.8,9 Furthermore, virtual skills-based assessments may create additional opportunities to assess students during their final professional year at our university, when skills-based assessments traditionally have not taken place because of student placement in off-campus advanced pharmacy practice experience (APPE) settings.
The literature regarding the use of virtual skills-based assessments to evaluate patient care skills in pharmacy education is limited. Published articles describe using simulation-based software, virtual patients, and technology for distance learning.8,10-13 Other health professions have utilized online learning to a greater extent than pharmacy, but most literature on this is related to simulation or virtual reality training rather than assessment.14,15 Limited data regarding faculty and student perceptions of skills-based assessments were found in medical and pharmacy literature and few studies draw conclusions regarding faculty and student satisfaction with this means of assessment.4,16-18 The purpose of this research was to evaluate faculty and student perceptions about virtual assessments and to compare student performance on virtual skills-based assessments with that of in-person skills-based assessments focused on communication.
METHODS
A 12-item questionnaire was distributed via QuestionPro (QuestionPro, Inc) to P2 and P3 pharmacy students who had participated in a virtual skills-based assessment during March 2020. A parallel questionnaire was distributed to faculty who had evaluated these assessments in March 2020. The questionnaire was open for two weeks, and a reminder was sent after one week. Faculty and students who had not participated in the in-person skills-based assessments were excluded. The virtual environments were created in Blackboard Collaborate Ultra (Blackboard, Inc).
The virtual skills-based assessments focused on verbal and written communication skills. Physical assessment, compounding, and other hands-on skills were not assessed. The P2 pharmacy students completed two activities on different days: a patient case presentation and a 10-minute in-service presentation. The P3 pharmacy students completed two activities on the same day: a patient consultation focused on identifying and overcoming barriers, and a verbal recommendation to a provider. Additionally, the P3 pharmacy students completed a one-problem SOAP (subjective, objective, assessment, plan) note.
All of the skills had previously been assessed in an in-person environment. After completing the assessments, participants were asked to respond to seven different statements using a five-point Likert-type scale to rate their perception of the virtual skills-based assessments compared to previous in-person skills-based assessments. Respondents were then asked about their participation in previous in-person assessments, how they connected during the assessment (video and computer audio, video and telephone, computer audio only, telephone only), and any technology issues they experienced. Students were also asked to report their professional year in school, while faculty were asked which type of assessment they had participated in. The final survey item asked respondents to explain why they did or did not prefer a virtual environment for assessments over an in-person environment.
The primary study objective was to examine faculty and student perceptions of virtual compared to in-person skills-based assessments focused on communication. The secondary objectives were to compare perceptions between the two student cohorts and to compare virtual assessment scores with in-person assessment scores. To compare perceptions, strongly agree and agree responses were considered positive and combined. The percentage of positive responses for each survey question was then compared between groups using the chi-square test. To compare performance, the percentage of students achieving a score of 80% or better was compared between the two groups using the chi-square test. This threshold was selected because it reflects the expected performance of a P3 pharmacy student prior to beginning APPEs. Curricular revision precluded the same analysis for the P2 cohort, as revised assessment practices left no opportunity for direct comparison. The P3 assessments were unchanged; therefore, spring 2019 in-person scores were compared with spring 2020 virtual scores. The same rubrics were used to assess both student cohorts; however, different cases were used to mitigate the risk of academic dishonesty, per standard course practices. Because the purpose of the case was to assess communication skills, adjusted case details were not expected to substantially confound findings. A detailed comparison of the resources and assessment methods used in each environment can be found in Table 1.
Table 1. Comparison of Characteristics of In-Person vs Virtual Skills-Based Assessment Environments in Pharmacy Education
Analyses were performed using Stata/SE, version 13 (StataCorp LLC, College Station, TX). Results with p values less than .05 were considered statistically significant. Open-ended survey question responses were reviewed by study authors and grouped by common themes. The Ferris State University Institutional Review Board deemed this project exempt.
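Each of the comparisons described above reduces to a chi-square test on a 2x2 contingency table (cohort by positive/non-positive response, or cohort by passing/non-passing score). As an illustration only, the calculation can be sketched in Python; the counts below are hypothetical, chosen to approximate the pass rates reported in the Results (the paper reports percentages, not raw counts).

```python
# Illustrative sketch of the Pearson chi-square test used for the 2x2
# group comparisons described in Methods. Counts are hypothetical.

def chi_square_2x2(table):
    """Pearson chi-square statistic (no continuity correction) for a 2x2 table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Rows: cohort (in-person 2019, virtual 2020); columns: scored >=80%, scored <80%
counts = [[113, 20], [139, 3]]  # hypothetical counts approximating 84.9% vs 97.9%
chi2 = chi_square_2x2(counts)
significant = chi2 > 3.841  # critical value for df=1 at alpha=.05
```

Statistical packages such as Stata (used by the authors) perform the same test, typically adding a continuity correction for 2x2 tables, so exact statistics may differ slightly from this uncorrected sketch.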
RESULTS
Nineteen faculty and 279 students were invited to complete the survey; 18 faculty (94.7%) and 241 students (86.4%) participated. One hundred ten P2 pharmacy students (80.3%) and 131 P3 pharmacy students (92.3%) responded to the survey.
The majority of faculty (72%) evaluated only one of the three virtual skills-based assessments. Eighty-three percent of faculty connected with students using video and audio through Blackboard Collaborate Ultra, while the remaining 17% connected with students via audio only. The majority of both faculty (88.9%) and students (63.5%) perceived the virtual skills-based assessments to be effective at simulating an interaction, though significantly more faculty than students responded positively (p=.03). Most students (72.2%) perceived that the virtual skills-based assessment allowed them to demonstrate their communication skills; however, only 33.3% of faculty and 28.6% of students preferred the virtual environment over the in-person environment (p=.67). Student preference for virtual skills-based assessment was not associated with mode of connection, as the percentage preferring virtual skills-based assessments did not differ significantly between those who connected using only audio and those who used video (18% vs 30%, respectively, p=.153). Qualitative analysis of open-ended responses revealed that 65.6% of students reported having no technology issues, 17.4% reported connectivity issues, 13.3% reported audio issues, and 4.1% reported other issues related to usability of the technology. The positive responses of faculty and students regarding each questionnaire statement are compared in Table 2. The responses of P2 and P3 pharmacy students to each survey item are also compared in Table 2. There were no significant differences in responses between the two student groups. There was no difference in the percentage of P3 students scoring 80% or higher between in-person and virtual assessments for the patient consultation and SOAP note skills (Table 3). However, a difference was found for students scoring 80% or greater on the healthcare provider interaction (97.9% virtual vs 84.9% in person, p<.001).
Table 2. Comparison of Doctor of Pharmacy Student and Faculty Perceptions of Virtual Skills-Based Assessments Compared to In-Person Skills-Based Assessments
Table 3. Comparison of Third Professional Year Pharmacy Student Performance on In-Person (2019) vs Virtual (2020) Skills-Based Assessments
Qualitative feedback indicated that some faculty preferred that future skills-based assessments be conducted in a virtual environment, primarily because of the efficiency of this method, as it requires less faculty travel. Most faculty who preferred in-person assessments stated that most interactions in practice occur in person and that it is easier to assess students’ nonverbal communication skills in person. Students who strongly agreed or agreed that future assessments should be held in a virtual environment noted a preference for the virtual environment because of the flexibility it allowed, not needing to commute, and the importance of being exposed to telehealth environments. Students who preferred the in-person assessment environment expressed comments similar to those of faculty, indicating that in-person communication was taught in the classroom and that much of their future practice would be conducted in person. Thus, in-person assessments might be more applicable to the majority of future practice settings. The importance of being “forced to present in person” was also mentioned, with students explaining that being placed in “uncomfortable situations are how you improve.” Students also commented that the possibility of technology-related issues increased their anxiety.
DISCUSSION
This project describes a preliminary investigation of student and faculty perceptions of a series of virtual skills-based assessments focused on communication during the pivot to distance education at the beginning of the COVID-19 pandemic. Students did not strongly prefer that future communication assessments be held virtually, though faculty were more supportive of considering virtual assessments. Additionally, the majority of respondents in both groups strongly agreed or agreed that the virtual environment effectively simulated interactions and allowed for quality written and verbal feedback. These perceptions were similar to findings from a study conducted before the pandemic.18 Since our study was conducted at the beginning of the pandemic, similar findings would be expected. Perceptions may continue to shift as students’ and faculty members’ familiarity with video-based distance instruction improves.
Though no one could have predicted the COVID-19 pandemic, nor would anyone voluntarily ask for such a disruptive event in educational delivery and assessment, there are serendipitous learning opportunities to optimize post-pandemic educational practices. While preliminary discussions of virtual engagement in skills-based assessments occurred in prior academic years, momentum favoring in-person skills-based assessments was difficult to overcome until the pandemic forced our hand. We now know that virtual skills-based assessments generally work, and students and faculty believe appropriate assessment occurred. Technology-related issues did not significantly alter student perceptions of virtual assessment, and we anticipate that continued practice with distance education modalities will reduce the frequency of technology issues over time. Students may have hesitations about virtual assessment of their performance, but that presents an opportunity to educate students that any isolated assessment of a skill has drawbacks and there may still be strong value in constructive feedback for continued development. Familiarity with the virtual format for formative and summative assessments may also foster student acceptance of this method over time. This study did not seek to compare perceptions of evaluation accuracy between virtual and in-person skills-based assessments, so it is possible that students’ questions regarding the accuracy of assessments may not be unique to virtual assessments.
Knowing that virtual skills-based assessments are generally viable, we see value in continuing to use virtual skills-based assessments, especially given the significant reductions in travel and expenses they provide. Our institution relies on a large cohort of faculty with significant APPE teaching responsibilities with corresponding off-campus placements. In an in-person format, faculty may require multiple hours of travel time to attend these assessments. The virtual format saves substantial time and better allows for APPE and clinic site responsibilities, which may be why faculty expressed support for virtual assessments. We also see value for students in simulating telehealth encounters. There has been limited application of telehealth-focused simulations through an elective course in our program.19 Given that these initial experiences were well received and with the sizeable shift to telehealth during the pandemic, we believe it is advantageous to increase telehealth training in the pre-APPE curriculum.20
Some institutions may have facilities that offer video recording capabilities within skills laboratories, which makes recording in-person assessments and subsequent reflection possible. In settings where such technology is unavailable, dated, cumbersome, or workload intensive, simply hitting “record” within a video conferencing tool simplifies recording and distributing files. Increased emphasis on self-assessment and goal setting has taken place in recent years at our institution. While a comprehensive self-assessment recommendation is beyond the scope of this work, relevant literature should be reviewed in developing a self-assessment plan. Furthermore, the use of technology during a skills-based assessment would create an opportunity for video capture to facilitate student reflection.21
As this study represents initial findings at a single institution, limitations should be noted, including that assessments were restricted to students’ communication skills and did not include physical assessment or pharmaceutical compounding, which are commonly included in other skills-based assessments across a PharmD curriculum. Our results may be confounded by prior experiences with in-person skills-based teaching and assessment that create baseline expectations among faculty and students. These results may have been different if students had been exposed to more virtual skills-based instruction before the virtual assessment. Another limitation is that only a minimal thematic analysis was conducted, so the qualitative themes identified are anecdotal. Concern was raised about the difference in scores between virtual and in-person assessments, and while this could reflect between-class variation or a more structural difference (Table 1), our assessments were not designed to analyze reasons for this variation. These differences have also raised concerns about academic integrity. This has generated discussions of how we can focus rubrics to primarily assess communication skills rather than knowledge and clinical skills, given that information sharing among students about a case yields little advantage on communication skills-based assessments. Future research could target inter-rater reliability, consistency of evaluation, and methods to maintain case integrity. While these issues are not unique to virtual assessments, it may be possible to develop tools to improve evaluator consistency in virtual environments.
CONCLUSION
Students and faculty reported that a virtual skills-based assessment served as an appropriate means of evaluating student communication skills. However, despite feelings that the assessment was appropriate, a strong preference for future skills-based assessments to be conducted virtually was not observed. Nevertheless, use of virtual skills-based assessments could be employed to reduce faculty travel requirements, better simulate modern telehealth activities, and create opportunities for student self-assessment, and therefore merits further consideration.
- Received August 28, 2020.
- Accepted January 26, 2021.
- © 2021 American Association of Colleges of Pharmacy