Abstract
Objective. To improve pharmacy students’ ability to effectively incorporate a computer into a simulated patient encounter and to improve their awareness of barriers to, their attitudes toward, and their confidence in using a computer during such encounters.
Design. Students completed a survey that assessed their awareness of, confidence in, and attitudes towards computer use during simulated patient encounters. Students were evaluated with a rubric on their ability to incorporate a computer into a simulated patient encounter. Students were resurveyed and reevaluated after instruction.
Assessment. Students improved in their ability to effectively incorporate computer use into a simulated patient encounter. They also became more aware of barriers to such use, improved their attitudes toward it, and gained more confidence in their ability to use a computer during simulated patient encounters.
Conclusion. Instruction can improve pharmacy students’ ability to incorporate a computer into simulated patient encounters. This skill is critical to developing efficiency while maintaining rapport with patients.
INTRODUCTION
With the adoption of the American Recovery and Reinvestment Act in 2009, Congress authorized incentive payments through Medicare and Medicaid to providers who use certified electronic health records (EHRs) in a meaningful way to improve health care delivery. In 2013, 78% of office-based physicians used an EHR, up from 18% in 2001.1 Health information technology (HIT) is expected to provide integrated electronic health care with interactive exchange among patients, providers, and insurers, resulting in an increase in the overall quality, safety, and efficiency of health care delivery with fewer medical errors, increased administrative efficiency, decreased health care costs, and expanded patient access to affordable health care.2
With the computer now being a vital component for storing and retrieving patient information, health care providers must have the ability to seamlessly integrate technology into patient communication activities such as the patient interview. To maximize efficiency and balance other demands for time, practitioners will need to gather information from a patient and simultaneously document it in the patient’s electronic record. Easily accessible information may help practitioners better target interview questions, eliminating inquiries for which they already have information or allowing deeper inquiry into areas with little documentation.
Communication is critical in the development of good patient relationships and the delivery of pharmaceutical care.3 A Medline search found no literature on the impact of EHRs on the pharmacist-patient relationship. In the medical literature, however, EHRs have been reported to negatively affect physicians’ rapport with patients, and medical students express concern about integrating EHRs into patient encounters.4 One potential reason for this negative impact is that the encounter becomes organized around the EHR structure rather than the patient’s narrative and priorities. While the computer does help complete information-related tasks, it reduces attention to patient-centered aspects of communication.5 In reviewing 30 recorded patient encounters of 3 primary care physicians, Margalit and colleagues found that empathy, concern, and reassurance were lower with increased screen gaze.6 In that study, physicians spent 25%-42% of visit time gazing at the computer screen, and keyboarding was inversely related to the amount of visit dialogue contributed by the physician or the patient. Screen gaze specifically inhibited physician engagement in psychosocial question asking and emotional responsiveness. Keyboarding increased biomedical information exchange, including more questions about the therapeutic regimen, more patient education and counseling, and increased patient disclosure of medical information to the physician. A summary score reflecting overall patient-centered communication during the visit was negatively correlated with both screen gaze and keyboarding.
Morrow and colleagues4 assessed first-year medical students’ ability to demonstrate EHR-specific communication skills. Seventeen students volunteered to participate and were assigned to a control group or an intervention group. Both groups learned how to document patient histories in EHRs, but the intervention group received additional instruction using guided discovery, brief didactics, and practice role play. Students’ EHR-specific skills were assessed by a standardized patient using a checklist. The EHR-specific communication skills checklist was developed based on Ventres’ 4 thematic areas affecting EHR communication.7 In Morrow’s study, a panel of experienced clinicians developed constructs and sample behaviors for 3 of the thematic areas cited by Ventres: geographical, relational, and educational. These included adjusting the geography of the patient encounter to effectively incorporate the computer into the interaction, developing the physician-patient-EHR relationship, and using the computer to teach and enhance the quality of care. The structural theme was excluded from Morrow’s checklist as this type of change was beyond the scope of their study. Morrow and colleagues found that first-year medical students had the ability to demonstrate EHR-specific communication skills to incorporate a computer into a patient interview. However, students did not spontaneously demonstrate these skills, nor did EHR-specific skills correlate with general communication skills.
While 10 EHR-specific communication skills were assessed, students in the intervention group performed significantly better on the following 6: introducing themselves before turning to the computer, introducing the computer into the doctor/patient/computer triad, moving close enough for the patient to read the screen and constructing a triangle between doctor, patient, and computer, adjusting the screen so the patient could see it easily, showing the patient weight gain and/or vital signs, and visually sharing EHR information on the screen to include the patient rather than keeping him or her outside the computer work.
Strong communication skills and effective EHR use are consistent with expectations of the Center for the Advancement of Pharmacy Education (CAPE) 2013 Educational Outcomes and Accreditation Council for Pharmacy Education (ACPE) Standards.8,9 To thrive in a technologically advancing practice model, pharmacy students should be exposed to and offered opportunities to practice incorporating a computer into a patient interview in the didactic setting. However, a Medline search revealed no reports describing teaching methods or curricular interventions for teaching EHR-specific communication skills to pharmacy students.
Faculty members at Concordia University Wisconsin School of Pharmacy sought to improve third-year pharmacy students’ ability to effectively incorporate a computer into a simulated patient encounter. Students received instruction and education during an applied patient care laboratory course. Faculty members examined the students’ self-assessed knowledge of, skills in, and attitudes toward effectively incorporating a computer into a simulated patient encounter both before and after receiving instruction. Other objectives included improving the students’ awareness of barriers and attitudes towards computer use during patient interviews, improving student confidence in effectively using a computer during these encounters, and assessing the students’ baseline ability to incorporate a computer into a simulated patient encounter without training.
DESIGN
The instructional design change occurred in the third year of the required, 6-course applied patient care laboratory series. The third-year courses are 2-credit courses that include both lecture and laboratory components. Students integrate the provider and patient communication skills they learned and practiced in the first 2 years of the series with more complex patient cases. Students have three applicable 110-minute laboratories in the fall semester and two in the spring semester. During these laboratories, the students gather medication lists from or perform pharmaceutical care assessments (medication list, social and medical history, assessment for drug therapy problems) on simulated patients (typically third-year classmates) and then develop and deliver pharmacotherapy plans to either patients or providers. The patient interviews are video-recorded. Faculty members, residents, and fourth-year student teaching assistants grade students with a rubric containing faculty-developed criteria. Students in the first 2 years of the series work both in pairs and individually; in the third year, the students work only individually. The cases are available to the students before the laboratory session so they can prepare for their encounters. The cases are in PDF format; the course does not utilize a simulated EHR at this time. Medication list and pharmaceutical care assessment templates are available to help students organize the patient encounter; use of these worksheets is optional. In earlier semesters, students are allowed, but not required, to use a computer to electronically record information they gather from patients.
To prepare students to use computers during patient encounters on their advanced pharmacy practice experiences (APPEs), whether with an EHR or electronic worksheet, students were required to use a computer to collect patient information during one simulated patient encounter in the third-year course. Two faculty members evaluated students’ ability to effectively incorporate a computer into the simulated patient encounter using a rubric that contained several criteria Morrow and colleagues used in their EHR-specific communication skills checklist (Figure 1).4 Students were blinded to the rubric and its results and completed a survey prior to the simulated patient encounter. The survey consisted of 9 questions: 1 assessing awareness of the computer as a potential barrier, 5 assessing confidence in effectively using a computer during a simulated patient encounter, 2 assessing attitudes toward utilizing a computer during a simulated patient encounter, and 1 asking how many times students had utilized a computer to assist them in a simulated patient encounter during previous applied patient care courses. The survey tool, which used a Likert scale (Appendix 1), was sent to faculty members at Concordia University Wisconsin School of Pharmacy and to others outside the school for feedback and was revised based on their comments.
Computer Use Skills Rubric
Students then received specific instruction on how to incorporate a computer into a patient encounter during one of the 50-minute lecture time slots within the applied patient care course. The instruction included 5-10 minutes of lecture material highlighting concepts such as introducing the computer into the patient visit, how to position the computer, and the importance of maintaining eye contact with patients. The concepts were developed from behaviors studied by Morrow et al.4 The instructor also demonstrated both poor and good incorporation of a computer into a patient encounter. The students were then required to use a computer to gather patient information in a simulated patient encounter and were again evaluated by faculty members with the rubric; students remained blinded to the rubric and completed another survey prior to that simulated patient encounter. The survey consisted of the same questions students previously answered, minus the last question asking them how many times they had previously used a computer. Course and study design are illustrated in Figure 2.
Course and Study Design (SPE=simulated patient encounter)
In the 2012-2013 academic year, students were evaluated and surveyed in the fall semester (preinstruction) and spring semester (postinstruction). In the 2013-2014 academic year, students were evaluated and surveyed both before and after instruction in the fall semester because of changes in the scheduled order of laboratories. Two faculty members evaluated all students’ simulated patient encounters to maintain consistency of grading. The surveys were completed by the students during their prelaboratory preparation; it is estimated that students took 5-10 minutes to complete each survey. This study was deemed exempt by Concordia University Wisconsin’s Institutional Review Board.
The study’s participants comprised 151 third-year pharmacy students (n=69 for the 2012-2013 academic year and n=82 for the 2013-2014 academic year) who completed the APC 5 and 6 courses. The McNemar test was used to analyze the faculty members’ evaluations of student ability to use a computer during a simulated patient encounter, and the Wilcoxon signed rank test was used for the students’ self-assessment surveys. Statistics were performed with SPSS for Windows, v21 (IBM, Armonk, NY).
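For readers unfamiliar with the McNemar test, the paired pre/post comparison above can be sketched as follows. The exact form of the test needs only the two discordant counts (students whose acceptable/unacceptable rating changed between evaluations). The counts used below are hypothetical for illustration, since the paper reports only marginal totals (67 of 146 acceptable preinstruction, 107 of 146 postinstruction), not per-student transitions.

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar test p-value for paired binary outcomes.

    b = students rated unacceptable preinstruction but acceptable postinstruction
    c = students rated acceptable preinstruction but unacceptable postinstruction
    Concordant pairs (no change) do not enter the calculation.
    """
    n = b + c
    if n == 0:
        return 1.0
    # Under the null hypothesis, each discordant pair flips either way
    # with probability 0.5, so the smaller count follows Binomial(n, 0.5).
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(p, 1.0)

# Hypothetical discordant counts: 45 students improved, 5 worsened.
print(mcnemar_exact(45, 5))
```

With a large imbalance between improvements and declines, the p-value is far below 0.05, consistent with the significant pre/post difference the study reports; a balanced split (e.g., 5 and 5) would not be significant.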
EVALUATION AND ASSESSMENT
Faculty members were only able to evaluate 146 of the 151 (97%) students on their ability to effectively incorporate a computer into a simulated patient encounter because of technical difficulties during the video recordings of the encounter. Students improved this ability from preinstruction to postinstruction, with acceptable ratings rising from 46% (n=67) to 73% (n=107). A McNemar test revealed a significant difference (p<.05) between preinstruction and postinstruction ratings. Before instruction, students struggled to explain the purpose of using the computer to the simulated patient prior to the interview (n=37, 25%) and to form a work triangle between the simulated patient, the computer, and themselves (n=56, 38%). Table 1 lists the evaluation results for each component before and after instruction. Some students needed improvement on multiple criteria, as evaluated by faculty members.
Student Performance on Specific Rubric Components of Faculty Evaluations, n=146
The preinstruction survey revealed that students did not have much previous experience using a computer to assist with the simulated patient encounter during previous applied patient care courses. The majority (n=119, 79%) had never used a computer for that purpose.
The survey results before and after instruction are shown in Table 2. Analysis of preinstruction and postinstruction responses with the Wilcoxon signed rank test revealed significant differences in median responses for many questions. Students responded that they became more aware that using computers during a patient encounter could be a barrier to effective pharmacist and patient communication after receiving instruction on appropriate use of a computer (p<0.05). Students also reported becoming more confident that they could explain the use of the computer to the patient at the beginning of the interview (p<0.05), form a work triangle between themselves, the patient, and the computer (p<0.05), maintain good eye contact with the patient throughout the majority of the encounter (p<0.05), and alert the patient when they would be turning their attention to the computer for extended periods of time (p<0.05). Student responses of “I don’t know” to a particular question were not included in the analysis. The number of “I don’t know” responses ranged from 0 to 28 depending on the question, with more such responses for the skills students were less confident in performing. Students also reported changing their attitudes toward using computers during patient encounters. Students felt that establishing rapport with patients while using a computer during a patient encounter was less difficult than they originally anticipated (p<0.05). They also thought that use of a computer during the encounter did not make them talk less to their patient (p<0.05).
Student Responses to Survey Questions
DISCUSSION
Instruction, including role-playing by an instructor, did improve pharmacy students’ ability to use a computer while interviewing a simulated patient. Acceptable rubric ratings increased. These results are similar to the improvements shown by first-year medical students in the study by Morrow and colleagues.4 Students in general are familiar with computers and many use an electronic worksheet to organize their information in laboratory and to take notes during other courses. However, instruction on specific communication skills when using a computer during a patient encounter appears to be needed and helpful.
Based on students’ self-assessments, their confidence in using a computer during a simulated patient encounter also improved with instruction and practice throughout the laboratory course. Four of the 5 skill-based survey questions showed significant improvement. Students’ awareness and attitudes towards incorporating a computer into a simulated patient encounter also improved. Repetition in laboratories led to more confidence in the students’ abilities to establish relationships with simulated patients and conduct successful interviews.
These results differ from those reported in the study by Rouf and colleagues.10 Thirty-three third-year medical students completed a questionnaire related to how using an EHR impacted their learning during an ambulatory primary care clerkship. The authors found that only 64% of third-year medical students conducting electronic ambulatory encounters were satisfied or very satisfied with doctor-patient communication using the EHR. Only 24% thought the EHR improved their ability to establish rapport with patients, and only 21% believed that their patients liked that the provider used an EHR. This discrepancy could result from students in our course using an electronic worksheet created specifically for the simulated patient encounter, whereas the medical students used an EHR with which they may have been less familiar or into which entering information was more complicated.
The students’ confidence in their skills did not align with the faculty members’ grading of their incorporation of the computer into the simulated patient encounter during the preinstruction period, as shown in Tables 1 and 2. For instance, the majority of students did not use the computer in an acceptable manner preinstruction, with 25% (n=37) of students not explaining its purpose prior to starting the interview, and 38% (n=56) of students not forming a work triangle between the patient, computer, and self. In contrast, 99% (n=139) of students agreed or strongly agreed preinstruction they were confident they could explain the use of the computer to the patient at the beginning of the interview, and 82% (n=100) of students agreed or strongly agreed they were confident they could form a work triangle between the patient, computer, and themselves.
Although the results are encouraging, there are several limitations to this study. The results may not be directly applicable to practice because students used an electronic worksheet instead of an EHR. The electronic worksheet may have required more typing than would be needed in an EHR, which may have detracted from the students’ ability to effectively incorporate the computer into the encounter. Alternatively, although students were provided with worksheet templates, they also were allowed to modify these worksheets to best meet their needs and interview style. The students may have been more familiar with the content and location of information on the worksheet, which may have made it easier for them to use the computer during the encounter. Additionally, students were only asked to gather information from the patient to document on their worksheet. Students were not asked to use an EHR for information retrieval, which may be done in practice to assist with the visit.
Additionally, the preinstruction survey might have biased students by signaling how they should use a computer, as its questions detailed appropriate ways to interact. There may also have been some variation in data between the years studied because of differences in when the study was conducted. For the first year of data collection, the study spanned the fall and spring semesters, with preinstruction data being gathered in the fall, and instruction and postinstruction data collection occurring in the spring. For the second year, schedules allowed for both preinstruction and postinstruction data to be collected in the fall semester. This time-frame alteration may have affected the results. Finally, the results are not generalizable, as only 2 groups of students from one pharmacy school were assessed.
As there is little published on this topic in the pharmacy literature, several areas can be studied in the future to gain a better understanding of how pharmacy students incorporate technology into their patient communication. Future research is needed regarding student retention of these skills over several semesters and with increasingly complex patients to determine if the skills are transferable to these patient encounters. Research to assess why some students choose to incorporate the computer into their patient interviews when given the option may also help elucidate technology’s role in patient care.
Future research is also needed to assess practicing pharmacists’ use of this skill and patient implications resulting from this use. As technological abilities may vary, research is needed to identify how frequently practicing pharmacists use computers during the patient interview, whether practitioners feel comfortable with this skill, and whether pharmacists are successfully incorporating technology into their own practice. A survey could assess patients’ perspectives on pharmacist use of technology during an interview and whether quality of care is enhanced through appropriate incorporation of technology into patient communication. We encourage academic institutions to consider how pharmacy students use computers and EHRs when communicating with patients. It may be beneficial for students to learn appropriate ways to incorporate computers and EHRs into their visits with patients so they are prepared once they are practicing pharmacists.
SUMMARY
In an era of advancing technology, pharmacists must be equipped with the skills needed to effectively integrate technology into their practice, including effective incorporation of the computer into a patient interview. Faculty members at Concordia University Wisconsin School of Pharmacy sought to improve third-year pharmacy students’ ability to effectively incorporate a computer into a simulated patient encounter in an applied patient care laboratory course. This instruction was found to improve both student performance on and student self-assessed confidence in predefined criteria regarding computer usage. As students move from the laboratory setting to APPEs and eventually into practice as licensed pharmacists, effectively incorporating an EHR into their patient encounters will be critical to maintaining good rapport with patients while preserving efficiency and compliance with EHR use requirements.
ACKNOWLEDGMENT
The authors would like to thank our colleague, Michael Brown, PharmD, who provided insight and expertise that greatly assisted the research.
Appendix 1. Computer Use Survey* (Question #9 not used in postinstruction survey)
1. I am aware that using computers during a patient encounter can be a barrier to effective pharmacist and patient communication.
2. I am confident that I can introduce myself to a patient before turning my attention to the computer when conducting an interview.
3. I am confident that I can explain the use of the computer to the patient at the beginning of the interview.
4. I am confident that I can form a work triangle between myself (the pharmacist), the patient, and the computer.
5. I am confident that I can maintain good eye contact with the patient throughout the majority of the encounter.
6. I am confident that I can alert the patient when I will turn my attention to the computer for extended periods of time.
7. I think that using a computer during a patient encounter makes it difficult to establish rapport with my patient.
8. I think that using a computer during a patient encounter makes me talk less to my patient.
9. How often have you used a computer to assist in an interview of a simulated patient during an activity in any Applied Patient Care laboratory over the past 3 years?
* Questions 1-8 were evaluated on a scale of strongly disagree, disagree, agree, strongly agree.
* Question 9 was evaluated on a scale of never, one time, two times, three times, four times, five times, or more than five times.
- Received July 9, 2014.
- Accepted September 4, 2014.
- © 2015 American Association of Colleges of Pharmacy