Abstract
Objective. To determine students’ perceptions of and performance in a drug assay laboratory course after the addition of Web-based multimedia tools.
Design. Video modules and other Web-based tools to deliver instructions and emulate the laboratory setup for experiments were implemented in 2005 to improve student preparation for laboratory sessions and to eliminate the need for graduate students to present instructions live.
Assessment. Data gathered from quizzes, final examinations, and post-course surveys administered over 6 years were analyzed. Students’ scores on online quizzes after implementation of the virtual laboratories reflected improved student understanding and preparation. Students’ perception of the course improved significantly after the introduction of the tools and the new teaching model.
Conclusions. Implementation of an active-learning model in a laboratory course led to improvement in students’ educational experience and satisfaction. Additional benefits included improved resource use, student exposure to a variety of educational methods, and having a highly structured laboratory format that reduced inconsistencies in delivered instructions.
INTRODUCTION
Traditional teaching methods arguably provide a learning experience at the lowest cognitive level, focus on memorization of factual information that is likely to be of little relevance in the future, and fail to accommodate the diverse learning styles of today’s students.1,2 Studies using Dale’s Cone of Experience3 and other learning models show that passive lectures lead to the lowest rate of retention among students, especially compared to retention rates when active-learning methods are used. As a result, interest in the implementation of active-learning methods in pharmaceutical education has increased using a variety of techniques with varying levels of risk for the faculty, students, and the course structure.4 Web-based teaching is one such active-learning technique. One appeal of using Web-based teaching is the potential to individualize instruction, allowing learners to choose their own path to knowledge and to obtain instant feedback on their performance.5,6
The concept of “pedagogies of engagement” contends that education must shift from students learning about things to students engaging in the learning process and acquiring the abilities necessary to become resourceful professionals.7,8 To fully engage students in the learning process, a number of methods can be used, such as problem-based and collaborative learning9; the constructivist model, which centers on individual learners and their ability to discover or learn the information; and the collaborative and cooperative learning model, in which learners must have some prior knowledge of the topics at hand.10
Web-based and computer-assisted learning in pharmaceutical education has numerous benefits. In one study, pharmacy students viewed results to microbiology and biotechnology experiments in Web-based modules, which allowed them to come to class more prepared and with a more comprehensive understanding of important concepts relevant to the course. Advantages of this Web-based tool included more frequent student participation in class, improved ability for students to perform self-assessments, and provision of quick feedback to students, all of which assured the students and educators that the students were learning concepts as intended.11
Computer-based modules proved effective in teaching pharmacy students to identify and correct prescription errors.12 Use of computer-assisted medicinal chemistry case study modules helped students better understand how to evaluate structure activity relationship findings in relation to desired therapeutic outcomes and addressing therapeutic problems in a clinical setting.13 Another study found that using Web-based prescription simulations that depicted different scenarios relating to pharmacy practice in a hospital setting or community setting assisted the learning process for students in a practice skills laboratory course. Furthermore, the Web-based learning appealed to students and allowed them to self-pace their studies, and provided professors with the flexibility to create modules that were specific to students’ learning needs.14
Instructors used Web-based tools to transform a large pharmacokinetics course into an interactive course, and found that having students acquire knowledge outside of class was as successful as having them acquire knowledge during class as long as students were held accountable for acquisition of the knowledge.15 A comparative study of computer-mediated instruction (CMI) versus lecture-mediated instruction (LMI) in a pain management course found that CMI was at least as effective as LMI, but was more efficient and received higher scores in student satisfaction surveys.16 In contrast, pharmacy students who used a biotechnology virtual laboratory to complete a patient case did not prefer it to completing the same patient case on paper, although they still rated the virtual laboratory experience as a valuable portion of the course.17 Other forms of active learning in a laboratory setting have also proved useful and had a positive impact on student learning, confidence, and satisfaction with the laboratory.18
This study examined the effects of introducing a novel, Web-based approach to presenting instructions for a drug assay laboratory, and explored whether the new approach improved students’ perceptions of the course.
DESIGN
This study examined data collected from fall semester 2004 to fall semester 2009 relating to the laboratory portion of a drug assay course required for first-year doctor of pharmacy (PharmD) students enrolled at the University of Michigan College of Pharmacy. Course enrollment during the study varied between 72 and 84 students, so the class was divided into 2 sections, one meeting on Mondays and the other on Wednesdays.
The drug assay laboratory spanned 1 semester during which students conducted 10 experiments. The techniques and protocol for 9 of the laboratory experiments remained the same during the study; however, those for 1 experiment changed during the study period, and its data were therefore excluded. The 9 included experiments (Table 1) used the following techniques: buffer preparation, acid-base titrations, colorimetric analysis, ultraviolet spectroscopic and fluorometric analysis, high-performance liquid chromatography (HPLC) analysis using external standards, enzyme kinetics, enzyme-linked immunosorbent assays, gel electrophoresis, and HPLC analysis using internal standards. These techniques were used to analyze various groups of drugs, and students were expected to use them to identify and/or quantify various drug samples and apply that knowledge to case studies involving different drug groups.
Laboratory Experiments Used in a Drug Assay Laboratory Course
All students conducted the first experiment at the same time. For the remaining 9 experiments, the students were divided into 3 groups of 12 to 14 students each. The 9 experiments were conducted in 3 cycles, with each cycle consisting of 3 experiments. For example, during week 1 of the first laboratory cycle, group A conducted experiment 2, group B conducted experiment 3, and group C conducted experiment 4. The experiments were rotated weekly over a 3-week period, allowing for a group size small enough to accommodate the use of equipment and to ensure a low ratio of students to graduate-student instructors.
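The rotation described above can be modeled in a short script. This is an illustrative sketch only; the rotation direction is an assumption, since the article specifies just that the 3 experiments rotated among the 3 groups weekly.

```python
# Hypothetical sketch of the 3-week experiment rotation described above.
# Group names and the rotation direction are illustrative assumptions.
GROUPS = ["A", "B", "C"]

def cycle_schedule(experiments):
    """Assign each group one of 3 experiments per week over a 3-week cycle."""
    schedule = {}
    for week in range(3):
        for i, group in enumerate(GROUPS):
            # Shift the assignment by one experiment each week so every
            # group performs every experiment exactly once per cycle.
            schedule[(week + 1, group)] = experiments[(i + week) % 3]
    return schedule

# Cycle 1 covers experiments 2, 3, and 4
sched = cycle_schedule([2, 3, 4])
print(sched[(1, "A")], sched[(1, "B")], sched[(1, "C")])  # week 1: 2 3 4
```

This structure keeps each weekly group at 12 to 14 students while requiring only one set of equipment per experiment.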
For students enrolled in the course in fall 2004, a graduate student instructor gave a 20-minute introductory lecture that described the theoretical background and protocol for the experiment. During the first week of the first 3-week cycle, students ran their assigned experiment for that week immediately following the three 20-minute lectures (1 lecture for each of the 3 experiments to be conducted over the 3-week cycle). In the following 2 weeks, students were expected to come to the laboratory session ready to perform their next experiment without any further formal instructions. The graduate student instructor assigned to each experiment was available for any questions or help with the protocol, and oversaw the students as they conducted their experiments. No assessments were conducted to ensure students were prepared to run the experimental protocol.
Students were required to write 3 laboratory reports, 1 for each experiment, which were due 1 week after completing each of the 3 cycles and graded by the graduate student instructor assigned for the experiment. At the end of the semester, students were given a final practical examination on which they were asked to conduct a number of experimental tasks based on the various techniques they performed during the laboratory sessions and their performance was evaluated by the graduate student instructors.
The method of giving course instructions to students changed in 2005. Beginning with the fall 2005 class, rather than listening to lectures presented by a graduate student instructor, students were given a set of online tools that they were required to use prior to coming to the laboratory. These tools consisted of either a videotaped introductory lecture or a virtual laboratory, along with technique video clips that demonstrated the use of equipment relevant to the particular experiment. Students then completed an ungraded online quiz that assessed their knowledge of the concepts and techniques used in the experiment. In the laboratory, a graduate student instructor discussed the results of the online quiz with the class, along with any knowledge deficiencies the quiz identified. Students then conducted the assigned experiment for that week, using the Web-based tools as visual aids as needed during the laboratory session. While the same technique videos and online quizzes were used throughout the study period, the instructional tools were changed from videotaped lectures to virtual laboratories in response to student input and other factors. Thus, in fall 2005 and 2006, the term “instructional videos” in survey questions referred to videotaped instructional media; from fall 2007 to 2009, it referred to Adobe Flash computer modules modeled after virtual laboratories.
All 9 experiments (Table 1) used videotaped introductory lectures in fall 2005, while experiments 8 and 9 included the prototype of the virtual laboratories as well. In fall 2006, experiments 2, 3, 4, 5 and 6 used the videotaped introductory lectures, while experiments 1, 7, 8 and 9 used virtual laboratories. In fall 2007 and 2008, all experiments with the exception of experiment 4 used virtual laboratories. The transformation to virtual laboratories was completed in fall 2009 with the addition of experiment 4.
The final evaluation was similar to that used in the traditional instruction process and involved submission of a laboratory report 2 weeks after the completion of each experiment and a final practical examination during which each student was asked to perform experiments based on the protocols used during the semester.
The virtual laboratories consisted of 2 Adobe Flash-based computer modules: the first described the theoretical background of each experiment and included checkpoint questions to ensure students grasped the information provided, while the second described and depicted the actual experiment, complete with expected results and questions regarding each step. The second module typically started with an introduction to the equipment used, followed by some basic questions regarding safety and proper laboratory techniques. Once students answered these questions correctly, the Flash presentation proceeded into a step-by-step description of the protocol, with images, interactive simulations, and animations used to illustrate each step. As students progressed from one step to another, they were required to answer questions that tested their understanding of the protocol, followed by an explanation of the reasoning behind each answer. The virtual laboratories also illustrated the consequences of errors students might commit during the experiment, eg, inadequate incubation time of DNA vectors with restriction enzymes resulting in incomplete cuts and unexpected fragments. Finally, they occasionally provided an in-depth look inside some of the equipment used, such as HPLC or gel electrophoretic equipment.
EVALUATION AND ASSESSMENT
Students were required to take an online quiz pertaining to the experiment they would run that week prior to coming to the laboratory session. At that point, students had been exposed only to the online instructional tools (videotaped introductory lectures and virtual laboratories); thus, these quizzes were a good measure of the effectiveness of such tools. The online quizzes were graded electronically, ensuring consistency between different groups and over the length of the study. The quiz scores were used to compare performance between classes exposed to different online tools.
Students were surveyed twice: first at the end of the semester, to examine their perceptions of the laboratory immediately following the conclusion of the course, and again during the second semester of their second year, to assess their perceptions after completing most of the basic sciences courses and some of the clinical sciences courses. The end-of-semester survey instrument was composed of 9 items with responses based on a 5-point Likert scale (1 = strongly disagree; 5 = strongly agree) and 2 open-ended questions. The second-year survey instrument consisted of 7 items using the same 5-point Likert scale and 2 open-ended questions. The students’ responses were compiled, and the average score for each item was used to compare students’ perceptions between classes and as they progressed from year 1 to year 2 of pharmacy school.
Analyses were performed using SPSS 16.0 and Excel 2007 statistical programs. Numerical data means for 2 different student groups (years) were compared using 2-tailed independent sample t tests, whereas matched-pair t tests were used to compare results from the same group of students at 2 different time points. Multiple mean comparisons were performed using ANOVA for independent samples. A p value < 0.05 was deemed significant.
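The comparisons described above can be illustrated with a short script. This is a minimal sketch using scipy.stats in place of SPSS and Excel; the score arrays are hypothetical placeholders, not data from the study.

```python
# Illustrative sketch of the statistical tests described above, using
# scipy.stats rather than SPSS 16.0 / Excel 2007. All score arrays are
# hypothetical placeholders invented for demonstration.
from scipy import stats

# Hypothetical Likert-item averages for 2 independent class years
scores_2005 = [3.1, 2.8, 3.4, 3.0, 2.9]
scores_2009 = [4.0, 4.2, 3.9, 4.1, 4.3]

# Two-tailed independent-samples t test (different student groups)
t_ind, p_ind = stats.ttest_ind(scores_2005, scores_2009)

# Matched-pair t test (same students at 2 time points)
year1 = [3.2, 3.5, 2.9, 3.8, 3.1]
year2 = [3.6, 3.9, 3.4, 4.0, 3.3]
t_rel, p_rel = stats.ttest_rel(year1, year2)

# One-way ANOVA for comparing more than 2 independent groups
f_stat, p_anova = stats.f_oneway(scores_2005, scores_2009, year2)

ALPHA = 0.05  # significance threshold used in the study
print(p_ind < ALPHA, p_rel < ALPHA, p_anova < ALPHA)
```

The choice of test mirrors the study design: independent-samples tests for different cohorts, paired tests for the same cohort resurveyed, and ANOVA when more than 2 cohorts are compared at once.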
Less than 40% of the students in the fall 2005 class felt that their expectations for the course, which included a mixture of traditional and Web-based teaching methods, were met or at least partially met. In contrast, in the 2006-2009 surveys, administered after transformation to Web-based instruction, over 80% of students felt their expectations were met or partially met (p < 0.001). These findings did not vary significantly with the addition and further modification of the tools in fall 2006 and fall 2009 (Table 2).
Pharmacy Students’ Perceptions of Whether Their Expectations of a Laboratory Course Were Met
With regard to students’ familiarity with the techniques used, end-of-semester responses ranged from 4.0 to 4.3 during the study period (p < 0.05) (Table 3). Students’ average score for relevance of the course to their pharmaceutical education remained consistently around 3.0, with an increase to 3.4 in fall 2009. Students felt that the laboratory experiments generally followed the instructions in the lecture portion (range 3.6 to 3.9). Students’ perception of the variety of concepts and techniques used in drug analysis to which they had been exposed improved steadily from 4.0 in fall 2005 to 4.4 in fall 2009 (p < 0.05). Students felt that the pace of the laboratory was somewhat demanding (range 2.9 to 3.1) from fall 2005 to fall 2008. However, students taking the course in fall 2009 felt that the pace was appropriate (4.1; p < 0.0001). Students’ perception of the quality of the report submission process was consistently around 3.6, with a slight drop to 3.4 in fall 2006 and an improvement to 4.0 in 2009 (p < 0.05).
Pharmacy Students’ Perceptions of a Drug Assay Course
When students were surveyed a year after taking the fall 2005 course (the year in which Web-based instruction was introduced), the percentage of students who felt that the course met or partially met their expectations increased from 42% to 64%. Scores for this item on the second-year survey instrument increased to 82% for the class enrolled in the course in 2006, and to over 97% for the classes enrolled in 2007, 2008, and 2009 (p < 0.01) (Table 2).
With regard to students’ familiarity with the techniques, the average score on the second-year survey stayed between 3.4 and 3.7 for the earlier classes, but jumped to 4.1 and 4.2 for those enrolled in 2008 and 2009, respectively (p < 0.001) (Table 3). The average score for the relevance of the course increased over the period of the study from 2.2 to 3.6 (p < 0.001). Students felt that the laboratory experiments generally followed the instructions in the lecture portion, with scores for this item increasing from 3.0 to 3.7 for those enrolled from 2005 to 2007, and to over 3.9 for those enrolled in 2008 and 2009 (p < 0.001). Students’ perception of the variety of concepts and techniques used in drug analysis to which they were exposed showed steady improvement from 3.5 to 4.4 from 2004 to 2009 (p < 0.001). Students felt that the pace of the laboratory was somewhat demanding, with an average score of 3.0 for classes in 2004 to 2006 and a slight improvement for the class in fall 2007 (3.2). However, students taking the course in 2008 and 2009 felt that the pace was appropriate, with an average score of 4.0 (p < 0.001). Students consistently rated the report submission process an average of 3.4, with scores of those enrolled in 2008 increasing to 4.0 (p < 0.05). Data from the second-year survey for the fall 2009 class were withheld because of technical issues.
Comparing students’ perceptions of the relevance of the course to their pharmaceutical education, the average score on the second-year survey actually decreased compared with scores on the end-of-semester survey for those enrolled in fall 2005 (p < 0.05). The scores were similar for students enrolled in fall 2006 and 2007, but higher for those enrolled in fall 2008 and fall 2009, with only the 2008 scores being significantly different (p < 0.05).
Students felt that all of the forms of online instruction used in the course had great educational value (Table 4). Students consistently rated the instructional videos, technique videos, and virtual laboratories over 4.0 on a 5-point scale from 2005-2009, scoring the virtual laboratories significantly higher than the instructional videos (4.8 in 2009; p < 0.05). The perception of instructional videos improved after the college switched to the Adobe Flash model in 2007 (p < 0.005). The perception of the technique videos stayed somewhat consistent over the 5 years, with an average range of 4.2 to 4.5. In terms of educational value, ratings for use of online reporting and prelaboratory quizzes fared poorly in 2005, but ratings improved in subsequent years, reaching an average of 4.2 (p < 0.0001).
Student Perception of Online Tools Used in the Drug Assay Course
Students generally preferred the virtual laboratories to the videotaped instructional videos, as evidenced by their perceptions of both tools during fall 2005 and fall 2006, when more than half the experiments used instructional videos (p < 0.05).
When students were asked whether to keep, revise, or eliminate each tool, their answers differed. The percentage of students who wanted to keep the tools increased for the instructional videos, prelaboratory quizzes, and online reporting, and stayed consistently high for virtual laboratories and technique videos. By 2009, over 90% favored keeping virtual laboratories and online reporting, over 80% favored keeping instructional videos and technique videos, and over 75% favored keeping prelaboratory quizzes. Over the period of the study, a small percentage of students favored eliminating at least 1 of the tools, with the percentage standing at less than 3% for each tool in 2009.
To investigate the effectiveness of the virtual laboratories for each of the experiments (Table 1), online quiz scores were compared before and after introducing the tool. In 3 cases (experiments 1, 2, and 6), a significant improvement in the scores was observed after introducing the virtual laboratories (p < 0.05), while in 4 cases (experiments 3, 4, 5, and 7), there was no significant improvement in the quiz scores. The virtual laboratories for experiments 8 and 9 were used during the first year, so there was no reference point for comparison (Figure 1).
Student perception of the various online tools.
To explore the effect of adding Web-based instruction on student learning in the course, the investigators looked at the 2 parameters used to evaluate student performance in the laboratory section: laboratory reports and the final practical examination. Scores on the final practical examination improved significantly, from an average of 82.7% prior to the use of Web-based instruction, to 86.2% after introduction of some of the tools, to an average of 91.2% after the Web-based instruction approach was fully implemented (p < 0.001; ANOVA). The overall scores on laboratory reports did not change significantly after the introduction of the new teaching approach, and remained relatively constant (89%-90%) throughout the study.
DISCUSSION
Because of the perceived need for more pharmacists, pharmacy school class sizes have increased nationwide. This increase places a particularly heavy burden on courses with a laboratory component because of the need for specialized equipment and a smaller instructor-to-student ratio. In addition, maintaining high standards in science training is important if pharmacy graduates are to practice knowledgeably, responsibly, and confidently in any setting. Instrumentation and resource constraints are most difficult in the pharmaceutical analysis laboratory because of the nature of the experiments that need to be conducted and the need to provide an individual learning experience. This study provides a novel approach to overcome such problems and still provide students with a unique learning experience. Having students study experiments online prior to conducting them in the laboratory allowed more laboratory time for conducting the actual experiments, and dividing the students into smaller subgroups and rotating the order in which experiments were conducted ensured that every student received hands-on experience using the laboratory equipment. This setup eliminated the burden on graduate-student instructors to present the laboratory experiment instructions in lectures, and overcame associated problems, such as educational gaps resulting from differences in the instructions provided by the graduate student instructors.
This model can be adapted to any laboratory course in which students can be divided into groups, allowing for a lower student-to-instructor ratio and more hands-on experience even when the number of instruments available is limited. The online tools may be adapted for any experiment. Providing the instructions online standardized the instructions given throughout the semester to different groups of students performing the experiments at different times. The online tools also gave instructors opportunities to introduce concepts and scenarios that were impractical to include in the wet laboratory, including problem solving in patient cases, faulty scenarios, and clinical applications that highlight the relevance of drug assay to the profession of pharmacy.
The resources required for implementing this teaching approach are limited to those needed to create the online tools, which may include an academic technology expert to build and maintain the modules and provide technical support. Once the tools are created, students are able to access them from any computer with a high-speed Internet connection. Resources required for running the laboratory and conducting experiments are similar to those of a traditional instruction setup; however, costs may be lower because the workload for graduate student instructors is reduced. Also, theoretically, there may be lower maintenance costs associated with the instrumentation, as students come to the laboratory already knowing how to operate the equipment correctly. In colleges where the introduction of a drug assay or other laboratory is not feasible because of lack of resources, facilities, instrumentation, or other limitations, virtual laboratories and other online tools could complement medicinal chemistry or clinical sciences courses as an alternate means of exposing students to concepts involved in drug assays or other related techniques. This option may be of importance given economic realities and increasing pharmacy class sizes.
After introduction of Web-based instruction, the number of students who felt at the end of the semester that the course had not met their expectations dropped significantly from almost half the class in 2005 to less than 2% in 2008 and 2009. When students who took the course in 2007, 2008, and 2009 were surveyed a second time after completing almost 2 years of the PharmD program and being exposed to a wide variety of pharmacy courses, almost all felt the course had met their expectations. This latter observation is of major significance as it reflects students’ perceptions after being exposed to a wide variety of courses at the college and gaining a better understanding of pharmaceutical education, the role of a pharmacist today, and the accompanying expectations.
While students scored areas such as familiarity with the techniques used and variety of techniques high even before implementation of the Web-based instruction, scores on these items still improved significantly over the period of the study. The new online reporting system was well received, especially after the changes made in the last 2 years of the study. Interestingly, students felt that the pace of the laboratory course, despite the pre-class work they were required to do, was at an appropriate level, especially in 2009. While scores for the relevance of the course improved over the study period, the increase was not as significant as that seen in other areas.
The improvement in scores on most items occurred from the end of the semester survey to the second-year survey. A significant drop from year 1 to year 2 in students’ scores for the relevance of the course was observed among those enrolled in 2005, but a significant increase in the same score over the same time period was observed for those enrolled in 2008.
The online tools used in this course were well received in general, as indicated by student survey results. Virtual laboratory sessions consistently received the highest scores from students, which led to the decision to switch to that format exclusively. When addressing the educational value of these tools, students scored the virtual laboratory sessions higher than other formats, and once we switched exclusively to that tool and addressed some of the technical issues, the scores significantly improved. Student satisfaction scores with the instructional videos also increased significantly after we switched to the same Adobe Flash format used for the virtual laboratories. After the online reporting system was modified to address streamlining and technical problems, satisfaction scores for those items also improved significantly. The one tool about which students’ perceptions remained unchanged over the length of the study was the technique videos. Interestingly, students’ perception of the prelaboratory quizzes improved over the length of the study even though no changes were made to this course component. The improvement in scores could be the result of the constant optimization of the other tools, which highlighted the relevance and description of the concepts assessed in the quizzes, and of better explanations to the students regarding the benefits of the quizzes.
Because the virtual laboratories were the only instruction students received before taking the prelaboratory quizzes, changes in quiz scores after their introduction serve as an indicator of the effectiveness of these tools. The significant improvement in prelaboratory quiz scores for only 3 of the 7 experiments could be explained by the higher average scores on the other 4 quizzes in the years before the virtual laboratories were offered. Because those 4 quiz scores were already high, a further significant improvement was unlikely.
The impact of the new teaching method on student learning was unclear in some areas. Students performed better on the final practical examination after introduction of the Web-based tools, and scored even higher after all the changes were implemented, with an 8.5% improvement in scores. However, scores on laboratory reports did not significantly improve after the introduction of Web-based instruction. One explanation is that the laboratory reports are based largely on the results of hands-on experiments and may not be influenced as much by the method used to deliver instructions on conducting the experiments. On the other hand, the final practical examination relies on problem solving, application and analysis of the experimental protocols, and higher-order thinking, areas in which the study showed a positive association with Web-based instruction as an active-learning approach. The authors recognize that, despite the use of grading rubrics, the change in graduate student instructors from one year to the next may have affected the results described, as the graduate students were primarily responsible for grading both the final practical examination and the laboratory reports.
CONCLUSIONS
The introduction of Web-based tools in a drug assay laboratory course resulted in improvement in students’ perception of the course, particularly regarding its relevance to pharmacy practice, familiarity with techniques, and overall expectations. This positive perception of the course continued after completion, as students progressed deeper into their pharmacy education. The innovative tools introduced, particularly the virtual laboratory modules, were well received and students recommended that they continue to be an integral part of the course. The changes also seemed to have a positive effect on students’ understanding of the experimental protocol as demonstrated by their performance on prelaboratory quizzes and on students’ overall learning as illustrated by improvement in their performance on the final practical examination.
This study provides an alternative solution for presenting instructions to large classes, and a blueprint for incorporating active learning and problem solving into a laboratory course and streamlining laboratory sessions. The online tools and Web-based approach could be used in other non-laboratory courses as well.
- Received May 27, 2011.
- Accepted January 29, 2012.
- © 2012 American Association of Colleges of Pharmacy