American Journal of Pharmaceutical Education
Research Article: INSTRUCTIONAL DESIGN AND ASSESSMENT

Student Engagement in Pharmacology Courses Using Online Learning Tools

Abdullah Karaksha, Gary Grant, Shailendra Anoopkumar-Dukie, S. Niru Nirthanan and Andrew K. Davey
American Journal of Pharmaceutical Education August 2013, 77 (6) 125; DOI: https://doi.org/10.5688/ajpe776125
Abdullah Karaksha, Gary Grant, Shailendra Anoopkumar-Dukie, and Andrew K. Davey: School of Pharmacy, Griffith University, Queensland, Australia
S. Niru Nirthanan: School of Medical Science, Griffith University, Queensland, Australia

Abstract

Objective. To assess factors influencing student engagement with e-tools used as a learning supplement to the standard curriculum in pharmacology courses.

Design. A suite of 148 e-tools (interactive online teaching materials encompassing the basic mechanisms of action for different drug classes) was designed and implemented across 2 semesters for third-year pharmacy students.

Assessment. Student engagement and use of this new teaching strategy were assessed using a survey instrument and usage statistics for the material. Use of e-tools during semester 1 was low, a finding attributable to a majority (75%) of students either being unaware of or forgetting about the embedded e-tools and a few (20%) lacking interest in accessing additional learning materials. In contrast to semester 1, e-tool use significantly increased in semester 2 with the use of frequent reminders and announcements (p<0.001).

Conclusion. The provision of online teaching and learning resources was only effective in increasing student engagement after the implementation of a “marketing strategy” that included e-mail reminders and motivation.

Keywords
  • engagement
  • e-tools
  • pharmacology
  • online instruction
  • Internet

INTRODUCTION

Student engagement is defined as the time and energy students invest in educationally purposeful activities and the effort institutions devote to using effective educational practices.1,2 It is also the quality of effort and involvement in productive learning activities. The engagement premise has been evolving since the 1930s; one of the earliest reports, published in 1969, showed that time spent on a learning task has a positive impact on student understanding.1 Engagement is emerging as an organizing construct for institutional improvement efforts, assessment, and accountability. The concept of engagement suggests that the more students study a subject, the more they know about it, and the more students practice their learning tasks, the more deeply they understand what they are learning.1 Therefore, the more students engage in learning tasks, the more they benefit from these activities and eventually learn.3

Student engagement involves 2-way communication wherein students and institutions play central roles in creating the environment for engagement and taking advantage of engagement opportunities.1,4 This broadened perspective highlights the notion that students should be at the heart of the learning process and that institutions aiming to increase student engagement should focus squarely on enhancing individual learning and development.4 Student engagement is not a “one-size-fits-all” way of thinking. Nonetheless, student engagement as a concept “provides a practical lens for assessing and responding to the significant dynamics, constraints and opportunities facing higher education institutions. It provides key insights into what students are actually doing and a stimulus for guiding new thinking about good practice.”5

A growing body of research into student engagement over the last few years argues that student engagement with traditional classroom lectures and participation in traditional learning activities has significantly declined.6 Students are now entering higher education with a diverse range of backgrounds and skill sets that differ from traditional university entrance criteria.7 Moreover, increased student enrollment and the financial costs of higher education have raised concerns about the quality of student learning and experience.8 At the same time, education has been undergoing a paradigm shift, moving away from teaching as instruction toward student-centered learning approaches.9 Consequently, curricula have been redesigned around learning outcomes rather than content.10 The rapid expansion and innovation in technology and students’ expectation that technology will be integrated into their learning experiences11,12 have encouraged higher education institutions to incorporate technology-based teaching into their curricula to increase student engagement.13

The generation of students who have grown up with advances in new technologies has given rise to the concept of the digital native. Students who are digital natives have novel learning styles that are nonlinear and personalized to individual needs, and they are fluent in “simulation-based virtual settings.”14 These students need more than traditional teaching approaches to engage in the learning process. Technology is thought to provide many advantages for digital-native student learning. Supporters believe that it allows them to direct their own learning by providing flexible learning opportunities.11,15 The implementation of interactive online teaching materials known as e-tools is increasingly advocated because of their capacity to allow students to learn when, where, what, and how (collaborative or independent learning style) they want.16,17 E-tools are also anticipated to motivate, engage, and stimulate higher-order thinking in students.18 However, the impact of implementing e-tools on student engagement has not been extensively investigated. One study in 2011 concluded that commercial e-tools failed to increase student motivation when included as part of teaching and learning; to realize a learning benefit, the content of the e-tools should be aligned with the educational objectives of the course.18 Therefore, a suite of customized animation tools (e-tools) encompassing the basic mechanisms of action for different drug classes was designed for third-year pharmacy students at Griffith University, Gold Coast, Australia. These e-tools were used to supplement traditional face-to-face lectures in the Human Pharmacology I and II courses.

The e-tools were designed to align with the objectives of the courses to form a system that is beneficial to students.19 The design process for the in-house e-tools within the framework of a defined pedagogy and relevant teaching theories has been published.20 The aim of this project was to determine whether these e-tools increased student engagement. Many measures of student engagement have been used over time, including level of academic challenge, active and collaborative learning, student-faculty interactions,21 and student perception. Given that student perception is the most commonly used measure of engagement and has been used by the National Survey of Student Engagement, the present study used this approach by evaluating student comments and feedback.1 The authors also explored student interaction with the e-tools by analyzing when and how often they accessed the material on the course Web site using Blackboard (Blackboard Inc., Washington, DC).22

DESIGN

This study was conducted at the School of Pharmacy, Griffith University, Gold Coast campus, Australia. A suite of 83 e-tools was designed for the Human Pharmacology I course in semester 1 and 65 e-tools for the Human Pharmacology II course in semester 2 of 2012. These are both 13-week courses normally delivered by means of 3 lectures per week, supported by weekly tutorials and laboratories (2 to 4 hours per week). The e-tools covered the mechanisms of action for the majority of drug classes and were used as a supplement to the standard curriculum. Ethical approval was granted by the Griffith University Human Ethics Committee. Custom animations were sequenced in Microsoft PowerPoint 2010, and narration was added using iSpring Pro 6.1.0 (iSpring Solutions, Alexandria, VA) to produce the embedded animations, which were then converted into Adobe Flash (Adobe Systems Inc, San Jose, CA) format for ease of delivery and access through Blackboard.20 Participants could easily control the speed of the final e-tools, skip content, and move forward and backward to revisit specific concepts as needed. Each e-tool was accompanied by multiple-choice questions, which were developed to assess stated learning objectives, generated with Question Writer 3 (Professional) (Question Writer Corporation, Torrance, CA), and accessed through Blackboard. Question Writer 3 sent the results anonymously to the researcher’s designated e-mail address for evaluation.

The first set of e-tools, used during semester 1, was made available to students through the course Web site in Griffith University’s Blackboard interface. Students were informed about the e-tools during the course introductory lecture. Semester 1 assessment items included a midsemester examination and 4 online quizzes on: (1) genitourinary drugs; (2) the cardiovascular system; (3) drugs affecting blood; and (4) the central nervous system. Students were given 13 weeks to complete the online quizzes. The final examination was administered 2 weeks after the deadline for completion of the online quizzes.

EVALUATION AND ASSESSMENT

Eighty pharmacy students enrolled in the course Human Pharmacology I in semester 1. One student did not pass this course and so was not able to enroll in Human Pharmacology II the following semester.

To evaluate student baseline attributes in semester 1, a paper-based survey instrument was designed to obtain student demographic data and preference for e-tools. One month after finishing the first course, students were approached in person during a Human Pharmacology II workshop and asked to participate in the survey. The timing of the survey gave students a chance to access the e-tools during the first semester and ensured that all students would have the opportunity to participate in the survey. Student participation in the survey was voluntary and anonymous. The survey instrument was designed according to previous studies that examined student preference regarding technology21,23-25 and obtained demographic data including gender, grade-point average (GPA), frequency of attending lectures, and difficulty of following topics that cover drug mechanisms of action. It also explored student engagement and perceptions of the e-tools used during semester 1. Students were asked whether they accessed the e-tools, their reasons if they did not, and their behavior and attitude regarding the e-tools if they did. Students also specified whether they accessed the complete set of e-tools or were more interested in accessing only those for certain drug classes, as well as how frequently they accessed the e-tools (eg, daily, weekly, and/or before assessments). Additionally, students were given an opportunity to provide their perceptions, feedback, and additional comments in their responses to an open-ended question. Student preference for technology, in general, was examined by means of a 5-point Likert scale (strongly agree, agree, no comment, disagree, and strongly disagree). Student learning styles were assessed by asking students whether they remembered words and/or pictures in responding to questions related to drug mechanisms of action.

Feedback from semester 1 indicated that students who did not access the e-tools were either unaware of their existence or had forgotten about them. Therefore, in semester 2, a different strategy was followed to motivate students to engage with the e-tools. An initial announcement was made when the e-tools were uploaded to the course Web site, and 7 follow-up reminders/announcements were made through Blackboard and e-mailed to students during the semester, usually prior to assessment deadlines. Semester 2 assessment items included a midsemester examination and 4 online quizzes that were available on Blackboard for a limited period of time, as in semester 1. The 4 quizzes were on: (1) inflammation; (2) antibiotics; (3) the endocrine system; and (4) chemotherapy. The time students had to complete each quiz ranged from 12 to 21 days. A final examination was administered 6 days after the last quiz deadline.

To evaluate student engagement for semester 1 and semester 2, data from the online course Web site on Blackboard were obtained, including the number of uses for each e-tool and the times and dates of access. The data were de-identified by the course coordinators before analysis. For the survey results, several quantitative analyses were undertaken. Demographic data including gender, English as first language, and student preference for technology and learning style were compared between the students who accessed the e-tools and those who did not. To determine whether the groups significantly differed in these baseline variables, t tests and chi-square tests were used. Student performance in the long-term retention questions across the 2 groups was evaluated using t tests, and the method used to recall information when answering these questions was analyzed using the chi-square test. The survey instrument evaluated participant attitudes toward the technology using a 5-point Likert scale (strongly agree, agree, no comment, disagree, and strongly disagree). To improve sample size per group, these categories were collapsed into 3 types of responses: positive, negative, and neutral. For the course Web site data, t tests were used to compare total e-tool usage in terms of number of hits between the 2 semesters. Analysis of variance (ANOVA) was undertaken to compare the differences in e-tool usage, measured as mean hits per day, in each month during the 2 semesters. The data were analyzed using SPSS, version 20 (IBM Corp, Armonk, NY). Bar graphs showing usage trends were created in Microsoft Excel. Significance was set at p<0.05.
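The Likert-collapsing and chi-square steps described above can be sketched in Python. The response counts below are illustrative stand-ins, not the study’s raw survey data, and the helper names (`collapse`, `chi_square`) are ours rather than anything used by the authors, who ran these tests in SPSS:

```python
from collections import Counter

# Hypothetical 5-point Likert responses (illustrative only; not the study's data).
COLLAPSE = {
    "strongly agree": "positive", "agree": "positive",
    "no comment": "neutral",
    "disagree": "negative", "strongly disagree": "negative",
}

def collapse(responses):
    """Collapse 5-point Likert answers into positive/neutral/negative counts."""
    return Counter(COLLAPSE[r] for r in responses)

def chi_square(table):
    """Pearson chi-square statistic for a contingency table given as rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Group 1 = accessed the e-tools; group 2 = did not (counts are made up).
group1 = ["agree"] * 15 + ["strongly agree"] * 3 + ["no comment"] * 3 + ["disagree"] * 2
group2 = ["agree"] * 12 + ["no comment"] * 4 + ["disagree"] * 3 + ["strongly disagree"]

c1, c2 = collapse(group1), collapse(group2)
order = ["positive", "neutral", "negative"]
table = [[c1[k] for k in order], [c2[k] for k in order]]
print(table)                        # collapsed counts per group
print(round(chi_square(table), 3))  # Pearson chi-square statistic
```

In practice the statistic would be compared with the chi-square distribution on (rows − 1)(columns − 1) degrees of freedom to obtain the p value that SPSS reports directly.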

Forty-three students voluntarily participated in the survey, representing 54% of the total cohort. Of those, 23 students accessed the e-tools (group 1) while 20 did not use the e-tools (group 2) during semester 1. No significant differences were found between the 2 groups in any of the demographic comparisons (Table 1). GPA was also collected from participants through the survey. All recorded values were valid and within the normal GPA range (1.0-7.0).

Table 1.

Demographic Data and Preferences Toward Online Learning Tools Among Semester 1 Students Who Did and Did Not Use E-Tools

There was no significant difference between the 2 groups in GPA (p>0.05) or the number of students whose first language was English (p>0.05). Students were also asked to indicate whether they read through the lecture notes before attending lectures (prior lecture study), and no significant difference was seen in this variable between the groups (p>0.05). Additionally, participants were asked to rate the level of difficulty they had in understanding course content that involved drug mechanisms of action. Student responses were split among easy, neutral, and difficult, with no significant differences noted (p>0.05). Student attitude toward online learning tools was also analyzed and compared between the groups. There was a positive preference for the online learning tools regardless of whether students accessed the e-tools. However, the majority of students from both groups were either negative or neutral regarding the substitution of traditional classroom lectures with online learning tools. Student learning style was also compared between the groups. More students who used the e-tools (group 1) preferred animations to reinforce their learning, suggesting a preference for visual learning. The difference between the 2 groups was significant (p<0.05) (Table 1).

Feedback was also obtained from students who did not access the e-tools (group 2). The majority of students (n=15) either forgot about or were unaware of the e-tools. Other students indicated lack of time as their reason for not using additional learning materials. Finally, some students preferred to study textbooks, which they cited as more of a match for their learning style.

Analyzing student behaviors toward the e-tools showed that 70% of students who accessed the e-tools were inclined to view the complete suite. Those who viewed certain drug classes indicated that they accessed the e-tools for cardiac drugs, diuretics, antiarrhythmics, and anticoagulants. Students were more likely to access the e-tools before quizzes and examinations (Table 2).

Table 2.

Student Behaviors and Attitudes Toward the E-Tools

Students included comments and feedback regarding the benefit of e-tools. Two students did not find the e-tools useful because they believed that the information was either too basic or required too much time to view. The remaining comments were positive and were classified into 3 major themes. In the first theme, students appreciated that the e-tools provided a visual explanation of drug mechanisms of action (7 comments relating to this aspect). In the second theme, students perceived the e-tools as helpful in furthering their understanding of the drug mechanisms of action (4 comments). Finally, students perceived that the e-tools provided additional reinforcement of the lecture materials and that reviewing them was more interesting than repeatedly reading lecture notes (9 comments).

Data from Blackboard provided more in-depth analyses of student engagement with the e-tools. Figures 1 and 2 demonstrate student access to the e-tools during semesters 1 and 2. Students accessed the e-tools in the first semester over 3 months (April, May, and June). Access times for the e-tools were spread across the day. A similar trend was found during semester 2.

Figure 1.

Usage of E-Tools during Semester 1

Figure 2.

Usage of E-Tools during Semester 2

In semester 1, most hits occurred halfway through the semester, particularly on the day before the midsemester examination. There was a steep drop in use during the following month. Use of the e-tools increased again in the study break just prior to the final examination. Analyzing data from semester 2 shows that students started to use and view the e-tools 3 days before the midsemester examination. Usage dropped for the remainder of the term and peaked 3 days before the final examination.

The timing of student access to the e-tools in semester 1 ranged from early morning until almost midnight. Spikes in the number of hits were recorded at 10:00 am and 7:00 pm. Similarly, during the second course, students viewed the e-tools mainly between 7:00 am and 11:00 pm, with a small number of hits recorded at 2:00 am and 4:00 am. The most popular times to access the e-tools during the second course were 1:00 pm and 10:00 pm.

The results showed significantly greater e-tool use during the second course compared to the first course (p<0.05). This difference in use was most apparent in the final month of the semester, with much greater usage in the second course than the first (p<0.05) (Table 3).
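The between-semester usage comparison can be sketched with a simple two-sample test. The hit counts below are invented for illustration (the study’s actual Blackboard counts are not reproduced here, and its analyses were run in SPSS); `welch_t` is our helper name:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)  # sample variances / n
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Hypothetical mean-hits-per-day samples for the final month of each semester
# (illustrative numbers only, chosen to mimic the reported semester 2 increase).
sem1_hits = [2, 0, 1, 3, 0, 5, 2, 1, 0, 4]
sem2_hits = [9, 12, 7, 15, 10, 8, 14, 11, 6, 13]

print(round(welch_t(sem2_hits, sem1_hits), 2))  # large positive t favors semester 2
```

A t statistic this far from zero would correspond to a very small p value once compared with the t distribution at the Welch-Satterthwaite degrees of freedom, consistent with the significant difference the study reports.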

Table 3.

Comparison of E-Tool Use Between Semesters

Table 4 shows the total usage of e-tools for each drug class covered during the first course. The highest usage (99 hits) was recorded for the diuretic drugs group, which contained 9 e-tools. Only 2 hits were recorded for hypnotic drugs, which contained 5 e-tools (Table 4). During the second semester, the antibiotic drug class consisted of 18 e-tools and received 302 hits. The last drug class that was covered during the semester (antiemetics) recorded the lowest number of hits at 21 (Table 4).

Table 4.

Use of E-Tools for Each Drug Class During Semesters 1 and 2

DISCUSSION

The aim of the study was to assess student engagement with the set of 148 e-tools across a period of 2 semesters in the School of Pharmacy. Participants in the e-tool user and nonuser groups had similar demographics.

Students’ attitudes toward the application of technology to their learning and teaching were also similar across the groups. In general, students from both groups (72%) were positive toward the technology. This is an expected outcome from digital-native students, who are known to have a positive preference toward implementing technology in their learning experiences.14 It also confirms the claim that students expect technology to be integrated into their learning experiences.11,12 However, 45% (n=14) of students who had positive attitudes toward the application of online tools to learning and teaching did not use the e-tools. Surprisingly, 64% (n=9) of those students did not use the e-tools because they either forgot or did not know about them. Students’ various learning styles affect how they engage with traditional and new teaching methods.26 Therefore, student preference toward studying using text, animation, or both was assessed in the survey. The results showed that the majority of students (77%) who used the e-tools preferred to study either animation or a combination of text and animation. This finding can be linked to the qualitative comments from students who stated that they found the e-tools to be a valuable visual explanation of drug mechanisms of action.
Students with a visual learning style felt that they learned more easily with diagrams and pictures than with written text.26 This correlates with findings that generation Y students prefer to study and learn using audiovisual materials over textual information.27 This is a good illustration of the benefit of using multiple teaching methods to satisfy various student learning styles, which is in line with previous research showing that one of the advantages of e-tools is to present information to students in different ways, thereby catering to individual learning styles.28 Student feedback indicated that supplementing lectures with e-tools in the Human Pharmacology courses gave them the flexibility to choose the learning method that best suited their needs. Approximately 80% of students had either negative or neutral responses regarding the replacement of traditional lectures with online-learning tools. In a 2011 study that corroborates this finding, 70% of the 359 digital native students surveyed favored attending traditional face-to-face lectures.29 Students still consider face-to-face discussions with lecturers and peer interaction in the classroom to be critical to their learning success.30,31

While the majority of participants had positive attitudes toward technology and preferences for studying both animations and text, they had either a negative or neutral attitude regarding replacing traditional lectures with those tools. Therefore, we recommend that technology-based teaching methods be used as supplements to traditional lectures.

Despite our intention to increase student engagement, we followed a teacher-focused approach in implementing the e-tools in Human Pharmacology I. With this approach, the teacher focuses on the design of the teaching materials rather than on what the students do.32 We informed the students when the e-tools were ready and uploaded online. No further action was taken to encourage students to use them. Analyzing student reasons for not accessing the e-tools during Human Pharmacology I showed that the majority either forgot or did not know about them, with only a few students indicating that they were not interested in accessing additional learning materials. This minority indicated that the lecture notes and textbook were enough support for their learning. Another study reported similar reasons for student non-engagement with technology, including lack of interest in or desire to use technology. For the minority who did not use the e-tools, as in our study, the main reasons were lack of perceived need or relevance, lack of awareness of the existence of the technology, and lack of knowledge/understanding in relation to how to use the technologies.33

Apparently, students did not anticipate the expected benefit of using the e-tools during the first semester. Students are known to participate and engage more when they understand the importance and relevance of the learning items to the assessment tasks. Additionally, according to phenomenography theory, the learner perspective determines what is learned and when it is learned.19 Other researchers have suggested that lecturers need to explain the relevance of the learning tasks to the course to encourage the students to engage and follow a deep-learning approach; ie, to follow a student-focused approach to increase student engagement with the e-tools.19 The focus of this approach is on student learning, and the teacher’s role is to encourage students to lead their self-directed learning and to construct their own knowledge and ideas.34 This phenomenon has been explained by the expectancy-value theory of motivation, which assumes that for students to engage in any learning activity, they need to see the value of the activity toward their ultimate goal: passing the course.10 Moreover, encouraging students to be independent learners involves them taking responsibility for their learning, monitoring their progress, and seeking help appropriately.35

Students failed to properly engage with the e-tools during semester 1. The assumption that digital-native students will purposely engage with technology is still under question.36 Consequently, we decided to use frequent e-mails and announcements to remind students about the importance of using the e-tools during semester 2 and encourage them to benefit from this teaching approach, as a previous study demonstrated that students appreciate receiving announcements and e-mails about information related to their courses.37 In the announcements, we explained the expected benefit of using the e-tools and encouraged the students to use them. This significantly increased use of the e-tools in semester 2 compared with that during semester 1, suggesting more student engagement with the e-tools.

Student behavior in using the e-tools across both semesters indicates maximum use just before assessment tasks, which raises a question regarding whether accessing the tools immediately prior to examinations is a useful strategy in terms of student learning outcomes. Arguably, students seem to be either taking a surface approach to their own learning by using the e-tools as a short-term memory aid for the examination, or they are studying hard all semester and then using the e-tools as a refresher at the end. As we were not able to measure the length of time students spent watching any given e-tool, it is not possible to draw absolute conclusions regarding the depth of student engagement with the tools.

Another important discussion point is the decreased use of e-tools as the semester progressed (Table 4). This was a clear trend in semester 1, even though the later e-tools covered the more difficult drug classes. Previous research has suggested that academics should not challenge students at the start of their courses but rather focus on introducing them to the environment and challenging them in the final stages of the course.38 Therefore, the Human Pharmacology lecturers structured the courses to start with simple modules to build student knowledge, progressing to more complicated and complex modules. Students may have felt tired as the semester progressed and lacked the energy to engage with the later, more difficult topics. Another possible explanation is that students were following a strategic approach in their study, focusing on the easier topics in the hope of getting easy marks rather than spending more time on hard subjects.39 This is an expected behavior from university students, who are purposeful learners.19,40 Further investigation is needed to better understand this behavior. However, use of e-tools for the later drug classes in semester 2 was greater than that in semester 1, suggesting that frequent reminders are important to keep students engaged during the busy time of assessments and examinations.

Students’ qualitative comments indicated their positive preference toward the e-tools, with a few students giving extremely positive feedback. This finding concurs with evidence from other studies that found positive student attitudes toward the implementation of technology into their learning.21,23-25,41 Students also commented that the e-tools were an interesting additional resource for studying. These comments align with the findings of the EDUCAUSE Center for Applied Research study, in which 70% of 36,950 students found that technology makes learning the content in their courses more convenient.42

The purpose of this study was to examine student engagement with e-tools over a period of 2 semesters. What we discovered was that the addition of e-tools (or any other resource) alone did not lead to increased student engagement. A student-focused approach, as in the second semester, is needed to improve student acceptance of and engagement with the e-tools. A limitation of this study is that student performance in the final examinations in semesters 1 and 2 was not measured. Other limitations include the small sample size and the potential for nonrespondent and self-reporting bias.

CONCLUSION

Pharmacy students enrolled in a Human Pharmacology course series valued the addition of technology-based teaching strategies as a supplement to classroom teaching methods. However, providing online teaching and learning resources alone was ineffective in increasing student engagement; the resources became effective only after the implementation of a “marketing strategy” comprising e-mail reminders and encouragement.

ACKNOWLEDGEMENTS

The authors acknowledge the 2011 cohort of third-year pharmacy students for completing the study survey instrument, and the 2012 cohort for using the e-tools and participating in the study survey. The authors thank the Faculty of Griffith Health at Griffith University for providing the blended learning grant that funded this work.

  • Received January 2, 2013.
  • Accepted February 17, 2013.
  • © 2013 American Association of Colleges of Pharmacy

REFERENCES

1. Kuh GD. The national survey of student engagement: conceptual and empirical foundations. New Dir Inst Res. 2009;2009(141):5-20.
2. Kuh GD, Cruce TM, Shoup R, Kinzie J, Gonyea RM. Unmasking the effects of student engagement on first-year college grades and persistence. J Higher Educ. 2008;79(5):540-563.
3. Pace R. The Undergraduates: A Report of Their Activities and Progress in College in the 1980's. Los Angeles, CA: Center for the Study of Evaluation; 1990.
4. Coates H. Attracting, Engaging and Retaining: New Conversations About Learning. Australasian Student Engagement Report. Camberwell, Vic: Australian Council for Educational Research Ltd; 2008.
5. Radloff A, Coates H. Doing More for Learning: Enhancing Engagement and Outcomes. Australasian Survey of Student Engagement: Australasian Student Engagement Report. Camberwell: Australian Council for Educational Research (ACER); 2010.
6. Barnett R, Coate K. Engaging the Curriculum in Higher Education. England: McGraw-Hill Education; 2005.
7. Franklin T, Van Harmelen M. Web 2.0 for content for learning and teaching in higher education. 2007. http://www.jisc.ac.uk/media/documents/programmes/digitalrepositories/web2-content-learning-and-teaching.pdf. Accessed May 15, 2012.
8. Haggis T. Pedagogies for diversity: retaining critical challenge amidst fears of 'dumbing down'. Stud Higher Educ. 2006;31(5):521-535.
9. Ramsden P. Learning to Teach in Higher Education: Approaches to Learning. 2nd ed. London: RoutledgeFalmer; 2003.
10. Biggs J. Setting the stage for effective teaching. In: Teaching for Quality Learning at University: What the Student Does. 2nd ed. Buckingham, England: SRHE and Open University Press; 2003:56-73.
11. Yelland N, Tsembas S, Hall L. E learning: issues of pedagogy and practice for the information age. In: Kell P, Vialle W, Konza D, Vogl G, eds. Learning and the Learner: Exploring Learning for New Times. University of Wollongong; 2008:95-111.
12. Berman N, Fall L, Maloney C, Levine D. Computer-assisted instruction in clinical education: a roadmap to increasing CAI implementation. Adv Health Sci Educ Theory Pract. 2008;13(3):373-383.
13. Cole M. Using Wiki technology to support student engagement: lessons from the trenches. Comput Educ. 2009;52(1):141-146.
14. Prensky M. H. sapiens digital: from digital immigrants and digital natives to digital wisdom. Innovate. 2009;5(3):1-9.
15. Lewis MJ, Davies R, Jenkins D, Tait MI. A review of evaluative studies of computer-based learning in nursing education. Nurse Educ Today. 2005;25(8):586-597.
16. Grigg P, Stephens CD. Computer-assisted learning in dentistry: a view from the UK. J Dent. 1998;26(5-6):387-395.
17. Nieder GL, Borges NJ, Pearson JC. Medical student use of online lectures: exam performance, learning styles, achievement motivation and gender. J Int Assoc Med Sci Educ. 2011;21(3):222-228.
18. Charsky D, Ressler W. “Games are made for fun”: lessons on the effects of concept maps in the classroom use of computer games. Comput Educ. 2011;56(3):604-615.
19. Biggs J, Tang C. Teaching according to how students learn. In: Teaching for Quality Learning at University. 4th ed. Berkshire, England: Open University Press (McGraw-Hill Education); 2007:15-30.
20. Karaksha A, Grant G, Davey AK, Anoopkumar-Dukie S. Development and evaluation of computer-assisted learning (CAL) teaching tools compared to the conventional didactic lecture in pharmacology education. Proceedings of EDULEARN11 Conference; July 4-6, 2011; Barcelona, Spain. 2011:3580-3589.
21. Chen P-SD, Lambert AD, Guidry KR. Engaging online learners: the impact of web-based learning technology on college student engagement. Comput Educ. 2010;54(4):1222-1232.
22. Johnson RD. Gender differences in e-learning: communication, social presence, and learning outcomes. J Organ End User Comput. 2011;23(1):79-94.
23. Taplin RH, Low LH, Brown AM. Students' satisfaction and valuation of web-based lecture recording technologies. Aust J Educ Technol. 2011;27(2):175-191.
24. MacLean J, Scott K, Marshall T, Asperen P. Evaluation of an e-learning teaching resource: what is the medical student perspective? Aust N Z Assoc Health Prof Educ. 2011;13(2):53-63.
25. Euzent P, Martin T, Moskal P, Moskal P. Assessing student performance and perceptions in lecture capture vs. face-to-face course delivery. J Inf Technol Educ. 2011;10:295-307.
26. Hunt L, Eagle L, Kitchen PJ. Balancing marketing education and information technology: matching needs or needing a better match? J Mark Educ. 2004;26(1):75-88.
27. Lindquist T, Long H. How can educational technology facilitate student engagement with online primary sources? A user needs assessment. Libr Hi Tech. 2011;29(2):224-241.
28. Mogey N. So you are thinking about using learning technology. In: Stoner G, ed. Implementing Learning Technology. Edinburgh: Learning Technology Dissemination Initiative, Heriot-Watt University; 1999:21-27.
29. Thirunarayanan M, Lezcano H, McKee M, Roque G. “Digital nerds” and “digital normals:” not “digital natives” and “digital immigrants.” Int J Instr Technol Distance Learn. 2011;8(2). http://www.itdl.org/Journal/Feb_11/article03.htm. Accessed October 10, 2012.
30. Lohnes S, Kinzer C. Questioning assumptions about students’ expectations for technology in college classrooms. Innovate. 2007;5(3). http://www.innovateonline.info/index.php?view=article&id=431. Accessed October 12, 2012.
31. Garcia P, Qin J. Identifying the generation gap in higher education: where do the differences really lie? Innovate. 2007;3(4). http://www.innovateonline.info/index.php?view=article&id=379. Accessed October 12, 2012.
32. Biggs J, Tang C. Designing intended learning outcomes. In: Teaching for Quality Learning at University. 4th ed. Berkshire, England: Open University Press (McGraw-Hill Education); 2007:113-133.
33. Allin L, Turnock C, Thompson J. Enhancing teaching and learning with technology through collaborative research with students. Proceedings of the London SoTL 8th International Conference; May 13-14, 2011; London, UK. 2011:9-16.
34. Martin E, Prosser M, Trigwell K, Ramsden P, Benjamin J. What university teachers teach and how they teach it. Instr Sci. 2000;28(5):387-412.
35. Cassidy S. Learning style and student self-assessment skill. Educ Train. 2006;48(2/3):170-177.
36. Helsper E, Eynon R. Digital natives: where is the evidence? Br Educ Res J. 2010;36(3):503-520.
37. McCabe DB, Meuter ML. A student view of technology in the classroom: does it enhance the seven principles of good practice in undergraduate education? J Mark Educ. 2011;33(2):149-159.
38. Norton L. Assessing student learning. In: Fry H, Ketteridge S, Marshall S, eds. A Handbook for Teaching and Learning in Higher Education. 3rd ed. London: Routledge (Taylor & Francis Group); 2009:132-149.
39. Fry H, Ketteridge S, Marshall S. Understanding student learning. In: A Handbook for Teaching and Learning in Higher Education. 2nd ed. London: Kogan Page; 2003:9-25.
40. Meyers NM, Nulty DD. How to use (five) curriculum design principles to align authentic learning environments, assessment, students' approaches to thinking and learning outcomes. Assess Eval Higher Educ. 2009;34(5):565-577.
41. Rosenberg H, Grad H, Matear D. The effectiveness of computer-aided, self-instructional programs in dental education: a systematic review of the literature. J Dent Educ. 2003;67(5):524-532.
42. Smith SD, Caruso JB. The ECAR Study of Undergraduate Students and Information Technology, 2010. ECAR Research Study 6. Boulder, CO: EDUCAUSE Center for Applied Research; 2010.
Keywords

  • engagement
  • e-tools
  • pharmacology
  • online instruction
  • Internet