Research Article

National Trends in the Adoption of Pharmacy Curriculum Outcomes Assessment for Student Assessment and Remediation

Justine Gortney, Michael J. Rudolph, Jill M. Augustine, Julie M. Sease, Brenda Bray, Nina Pavuluri and Siu Fun Wong
American Journal of Pharmaceutical Education August 2019, 83 (6) 6796; DOI: https://doi.org/10.5688/ajpe6796
Justine Gortney, Wayne State University, Eugene Applebaum College of Pharmacy and Health Sciences, Detroit, Michigan
Michael J. Rudolph, Marshall University School of Pharmacy, Huntington, West Virginia
Jill M. Augustine, Mercer University College of Pharmacy, Atlanta, Georgia
Julie M. Sease, Presbyterian College School of Pharmacy, Clinton, South Carolina
Brenda Bray, Washington State University College of Pharmacy, Spokane, Washington
Nina Pavuluri, Lake Erie College of Osteopathic Medicine School of Pharmacy, Bradenton, Florida
Siu Fun Wong, Chapman University School of Pharmacy, Irvine, California

Abstract

Objective. To determine and describe the current uses of the Pharmacy Curriculum Outcomes Assessment (PCOA) by US schools and colleges of pharmacy.

Methods. Assessment professionals from 135 US schools and colleges of pharmacy were invited to complete a 38-item electronic survey. Survey items were designed to investigate common uses of the PCOA, cut points, and “stakes” assigned to the PCOA, identification of at-risk students, and remediation approaches.

Results. The school response rate was 68%. The most common uses of the PCOA included curricular assessment (76%), individual student performance assessment (74%), and cohort performance assessment (71%). The PCOA was most frequently administered to third-year pharmacy (P3) students. The approach for assigning “stakes” to PCOA performance varied among programs depending on the student’s professional year in the curriculum. Programs used a variety of approaches to establish the benchmark (or cut point) for PCOA performance. Remediation for at-risk students was required by fewer than 25% of programs. Remediation was most commonly required for P3 students (22%).

Conclusion. Survey results indicate wide variability between programs regarding PCOA cut points (benchmarks), stakes, and remediation approaches. In the future, it will be important for pharmacy educators to identify and study best practices for use of PCOA within student assessment and remediation plans.

Keywords
  • PCOA
  • remediation
  • student progression
  • high-stakes assessment
  • low-stakes assessment

INTRODUCTION

The Pharmacy Curriculum Outcomes Assessment (PCOA) is a validated evaluation tool developed by the National Association of Boards of Pharmacy (NABP) with its first administration in 2008. The PCOA examination blueprint is based on a national curriculum survey and is highly reflective of Appendix 1 of Standards 2016 published by the Accreditation Council for Pharmacy Education (ACPE).1 Appendix 1, entitled “Required Elements of the Didactic Doctor of Pharmacy Curriculum,” specifies the content areas that must be addressed and the learning expectations for students throughout a Doctor of Pharmacy (PharmD) curriculum. Consistent with ACPE Appendix 1, the PCOA has four major content areas: basic biomedical sciences, pharmaceutical sciences, social/behavioral/administrative pharmacy services, and clinical sciences. The examination can be administered to students at all program levels (eg, beginning of the first year or end of the third year), and students receive “scaled scores” that can be compared to the average performance of students within the same institution or a national sample of US schools and colleges of pharmacy. Additionally, the examination can be administered to the same student in multiple program years in order to document students’ individual progression in each content area over time. The revised ACPE 2007 standards stated that all “colleges should incorporate periodic, psychometrically sound, comprehensive, knowledge-based and performance-based formative and summative assessments that allow comparison and benchmarks with all accredited and peer institutions.”2 To provide the potential for a national benchmark examination, the 2016 standards (Standard 24) require that all students take the PCOA close to the end of the didactic curriculum and before beginning their advanced pharmacy practice experiences (APPEs).1

Given the requirement to administer the PCOA prior to APPEs, there has been considerable discussion within the Academy regarding how best to use PCOA data for both curricular assessment and individual student assessment. Particular discussion has focused on how to use the examination to identify at-risk students, as well as the types of stakes that should be attached to individual student performance.3,4 In a general survey of PCOA users (n=38) in 2014, 61% indicated that no “stakes” were attached to the examination for their students, while 26% indicated “low stakes” (development plan for students), 8% indicated “medium stakes” (course grade), and 3% indicated “high stakes” (impacted progression) were attached to the examination.5 Since that time, considerable interest has been shown during education sessions at national pharmacy meetings regarding the use of the PCOA as a high- or medium-stakes assessment now that the examination is required by ACPE. However, limited information is available regarding what is being done at individual schools or nationally, and a national study of how programs are using the PCOA to assess and assist students is warranted. The goal of our study was to determine how many schools and colleges of pharmacy currently assign stakes to the PCOA, how “at-risk” students are identified, and how any corresponding remediation processes are structured. “At-risk students” were those who were considered by their institution’s standards to have demonstrated deficient or poor performance on the PCOA and could subsequently be at risk for poor APPE or North American Pharmacist Licensure Examination (NAPLEX) performance.

METHODS

A survey was developed by a group of pharmacy faculty members, associate deans, and directors of assessment within an AACP Assessment Special Interest Group (SIG) research group representing six diverse pharmacy programs. Content and ideas were generated and refined through multiple phone conferences, and consensus was built prior to entry of the instrument into an electronic survey management system. The survey was designed to be administered to assessment professionals in all US schools and colleges of pharmacy to gather information pertaining to the use of the PCOA for student assessment and progressions from 2008-2009 to 2015-2016. The instrument contained 38 items examining the following: program year(s) in which the PCOA was administered to students; stakes assigned to student performance on the PCOA; determination and measurement of student minimum scores on the PCOA for remediation or progression in the PharmD program; and strategies for remediation of students who did not achieve the institution-established minimum score on the PCOA. Survey logic directed respondents to follow-up questions based on the program year(s) in which the PCOA was used. Two demographic questions were included at the end of the survey instrument: one asked the respondent to provide the length and type of PharmD program(s) at their institution, and the other asked the respondent to voluntarily provide their name and the name of their institution.

Participants targeted for our survey included all assessment professionals at US pharmacy schools. Prospective participants were identified and their contact information obtained through the Faculty and Professional Staff Roster of the American Association of Colleges of Pharmacy (AACP).6 The list of assessment professionals was reviewed by the study investigators and cross-referenced against each school’s website. When discrepancies between the two sources of information were identified, the school website information was used.

The survey instrument was administered electronically using Survey Monkey (San Mateo, CA). After the instrument was uploaded to the survey platform, it was tested by the study investigators to ensure the questions were properly displayed and responses were recorded. Any potential issues with wording and display logic were identified and addressed prior to launching the survey. Although the survey was anonymous, the link to the survey was sent by email; thus, respondents’ IP addresses were recorded by the system. Also, respondents were given the option to provide their names and the names of their institutions in the final two survey items. This information was used simply to track response rates and identify duplicate responses from the same individual or institution. The survey was initially disseminated in September 2016, and reminder emails were sent after two and four weeks.

Following data collection, responses were downloaded from Survey Monkey and imported for analysis. Data were reviewed for two types of duplicate responses: multiple responses from the same individual and multiple responses from the same institution. If a participant had submitted a duplicate survey instrument, the most complete response was retained and all other responses were deleted from the dataset. If any conflicting data existed within duplicate institutional answers, the principal investigator contacted the respondent for clarification. Responses typed into an “other” option were recoded when they aligned more appropriately with an existing survey question or response category. Items labeled “select all that apply” were evaluated individually. Lastly, descriptive statistics were compiled for each of the survey items using IBM SPSS Statistics for Windows, version 21.0 (IBM Corp, Armonk, NY). This study was submitted to the Wayne State University Institutional Review Board (IRB), which determined that the project did not require review or approval.
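The duplicate-screening and completeness rules described above can be expressed programmatically. The following is a minimal sketch in Python/pandas, not the authors’ actual SPSS workflow; the file name, column names (ip_address, institution, q1 through q38), and item labels are hypothetical placeholders.

    import pandas as pd

    # Load the exported survey responses (hypothetical file and column names).
    df = pd.read_csv("pcoa_survey_export.csv")

    # Count answered items per response so the most complete duplicate is kept.
    df["n_answered"] = df.notna().sum(axis=1)

    # For duplicate submissions from the same individual (same IP address),
    # keep only the most complete response.
    df = (df.sort_values("n_answered", ascending=False)
            .drop_duplicates(subset="ip_address", keep="first"))

    # Flag duplicate institutional responses for manual reconciliation rather
    # than dropping them automatically, mirroring the follow-up described above.
    # Institution was an optional field, so missing values are excluded.
    inst_dupes = df[df["institution"].notna()
                    & df.duplicated(subset="institution", keep=False)]
    print(f"{len(inst_dupes)} responses need institutional reconciliation")

    # Remove respondents who discontinued at or before item 7, ie, those who
    # answered none of items 8 through 38.
    early_exit = df[[f"q{i}" for i in range(8, 39)]].isna().all(axis=1)
    df = df[~early_exit]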

RESULTS

The survey was sent to 171 individuals representing 135 colleges and schools of pharmacy, and 102 responses were received. Responses were initially screened for duplicate IP addresses and programs. Of the 102 responses, one duplicate was identified by the sender’s IP address, and the second response from that individual was removed. Upon reviewing the list of institutions contained within the results, multiple responses were found for five of the schools. These duplicate institutional responses were compared to identify similarities or differences, respondents were contacted if questions existed, and the duplicates were then deleted or combined depending on the responses. Six respondents discontinued the survey at or prior to question seven, so their responses were deleted because of the limited usefulness of the data. The final dataset contained responses from 92 of the 135 programs. Assuming we were successful in eliminating all duplicate responses, this represents a 68% program response rate.

The majority (72, 78%) of institutions represented within the dataset had traditional four-year PharmD programs consisting of three years of didactic and one year of experiential coursework. Eight institutions (9%) had a “0-6” program and seven (8%) had a three-year accelerated program. Three respondents stated that their institutions had more than one pathway, and two respondents stated they had 2.5 years of didactic curriculum and 1.5 years of advanced pharmacy practice experiences (APPEs). Use of the PCOA was found to vary across institutions according to the purpose of the assessment as well as utilization in coursework. Based on the 92 responses received, the most common uses for the PCOA were curricular assessment (76%), individual student performance assessment (74%), and/or cohort performance assessment (71%). Less common uses included individual student progress assessment (20%) and “other” (15%). Seven of the respondents who chose “other” indicated that their institutions had not used PCOA results or that use of PCOA results was still being determined, and four others stated the need to collect additional years of data in order to make decisions regarding how PCOA results would be used. One respondent noted that the results were used to identify individual student strengths and weaknesses pre-APPE, another used them for pathway comparison, and one respondent suggested the results were not useful because of the low-stakes administration of the examination.

Trends in institutional PCOA use based on program year are shown in Figure 1. The vast majority of schools in our sample (92%) administered the PCOA to third-year (P3) students in 2015-2016, whereas only 14% administered it to first-year (P1) students, 16% to second-year (P2) students, and 5% to fourth-year (P4) students. Additionally, most institutions (78%) used the PCOA in only one program year, compared to 14% that used the PCOA in two program years, 7% in three, and 1% in four. With the exception of fourth-year use, which remained consistent at about five programs, PCOA use in the P1 through P3 years had steadily increased since the 2009-2010 academic year. The largest one-year increase in the number of institutions using the examination occurred between 2014-2015 (n=32) and 2015-2016 (n=85) for P3 students. Table 1 provides a frequency distribution of the number of responding institutions that required their students to take the PCOA by program year and number of students in 2015-2016. Comparing these figures with the data in Figure 1, nearly all schools that administered the PCOA in any program year required those students to take the examination.

Figure 1. Number of Institutions Administering the PCOA by Professional Year, 2008-2016

Table 1. Distribution of Class Sizes and Professional Years at Institutions Requiring Administration of the Pharmacy Curriculum Outcomes Assessment During the 2015-2016 School Year

Stakes attached to the PCOA varied based on students’ program year in the curriculum. For respondents using the examination during the P1 or P2 year in 2015-2016, roughly half (50% and 47%, respectively) administered it as a low-stakes examination (defined as “no ramifications for students regardless of performance”) and the other half (50% and 53%) administered it as medium stakes (defined as “deficient or poor performance results in the student being made to complete a remedial process, but otherwise the student progresses in the curriculum”). None of the responding institutions that used the PCOA for P1 or P2 students had assigned high stakes (defined as “deficient or poor performance results in the student being unable to progress within the curriculum”) to the examination. Figure 2 reports the percentages of pharmacy schools using each category of stakes for P3 students in 2015-2016. Although the majority of institutions used low stakes for P3 students, 16 had assigned medium or high stakes to the PCOA. Several programs that indicated “other” in response to stakes used for P3 students described stakes that did not match any of the three specified categories. Three of these respondents stated that a variety of course allotments were tied to the PCOA (points, course percentage, or skills laboratory grade). One program used P3 students’ scores on the PCOA and a third-year objective structured clinical examination (OSCE) to identify at-risk students and develop a student remediation plan. Another reported that students lost a letter grade for failing to take the assessment. Lastly, one respondent stated that progression in the curriculum was tied to the PCOA regardless of score.

Figure 2. Percentage of Programs Linking Stakes to P3 PCOA Performance

Some programs that required students to take the PCOA during the P1 through P3 years indicated they were considering changing the stakes associated with the examination for their students. Forty-six programs (55%) that required P3 students to take the PCOA in the 2015-2016 academic year responded that they were considering changing their examination stakes for future cohorts. Of those programs, the majority (36) had administered the PCOA to P3 students as low stakes in 2015-2016, one had administered the examination as high stakes, and six had administered it as a medium-stakes examination. Eight (30%) of the 27 programs that required the examination for P1 and/or P2 students were considering changing the stakes for those program years. All but one of those programs had administered the PCOA to P1 and/or P2 students as medium stakes in 2015-2016.

Standards used to determine “cut points” to identify at-risk students differed across institutions and were often undefined. In total, seven (58%) of the schools that required their P1 students to complete the PCOA had a specified cut point for P1 performance, as did 10 schools (67%) for P2 performance, 36 schools (43%) for P3 performance, and one school (100%) for P4 performance. Of those schools identifying cut points for PCOA scores of P1 or P2 students, most (71% and 60%, respectively) gave specific percentiles (eg, students scoring at or below the 20th to 25th percentile). Several programs used a specified number of standard deviations from the national mean for the total score. One school reported that if a student scored more than two standard deviations below the national average in any of the four content areas, that student would be referred to his or her advisor. Of those respondents whose institutions had defined cut points for P3 students, most used a fixed percentile (30%), the national mean (22%), the national standard deviation (8%), or a combination of these measures (14%). Those that chose to use fixed percentiles exhibited a wide range in standards, from below the 10th percentile to the 40th percentile (national). The most common percentile ranges for P3 students were the 10th to 20th percentile (n=5) and greater than the 20th to 30th percentile (n=5). Three other programs reported using the 10th, 40th, or 50th percentile.
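To make the two common cut-point approaches concrete, the sketch below flags hypothetical students using a fixed national percentile and a standard-deviation threshold. The national mean, national standard deviation, scaled scores, and the assumption of an approximately normal national distribution are all illustrative, not actual PCOA statistics.

    import pandas as pd
    from scipy import stats

    NATIONAL_MEAN = 350.0   # illustrative national statistics only
    NATIONAL_SD = 25.0

    scores = pd.DataFrame({
        "student_id": ["s1", "s2", "s3", "s4"],
        "total_scaled": [310.0, 355.0, 290.0, 372.0],
    })

    # Percentile-based cut point: flag students at or below the 20th national
    # percentile, assuming an approximately normal national distribution.
    national_pct = 100 * stats.norm.cdf(
        scores["total_scaled"], loc=NATIONAL_MEAN, scale=NATIONAL_SD)
    scores["at_risk_percentile"] = national_pct <= 20

    # Standard-deviation cut point: flag students scoring more than two
    # standard deviations below the national mean, as one school reported.
    scores["at_risk_sd"] = scores["total_scaled"] < NATIONAL_MEAN - 2 * NATIONAL_SD

    print(scores)

Under these assumptions, s3 (290, 2.4 SD below the mean) trips both flags, whereas s1 (310, roughly the 5th percentile but only 1.6 SD below the mean) trips only the percentile rule, illustrating how the choice of standard changes which students are identified.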

In total, fewer than 25% of programs that used the PCOA in 2015-2016 required remediation for students who demonstrated deficient or poor performance on the examination. Of the programs that required the PCOA be taken within a given year, 20 (24%) required remediation for P3 students, followed by seven (47%) for P2 students, six (54%) for P1 students, and one for P4 students. Several survey questions were asked to determine the characteristics of post-PCOA remediation of P3 students, including who was engaged in determining the need for and execution of the remediation, the length of the remediation, and reassessment. Twenty programs provided information on their remediation process for P3 students. Generally, at-risk students were identified by a committee, such as a progressions or assessment committee (9 schools, 45%), or by the dean or director of assessment (7 schools, 35%). The dean or director of assessment (11 schools, 55%) or a faculty committee (5 schools, 25%) also typically oversaw the remediation process. Length of the remediation differed considerably across institutions, with several institutions having a defined remediation period of two to eight weeks (n=8, 45%) and others adopting a longitudinal or self-paced model (n=5, 25%). Three other institutions (15%) developed customized remediation plans based on the students’ needs, two (10%) had a very brief period of remediation consisting of a reflection or several hours of review, and one was still in the process of defining remediation. Finally, 11 programs (55%) indicated that no reassessment was mandated as part of the remediation process, while four programs (25%) had students retake the PCOA, and three programs (15%) used a program-based examination to assess the remediation process. Similar trends were seen for the limited number of schools that used PCOA-based remediation for P1 and P2 students.

DISCUSSION

Our findings, coupled with evidence from the existing literature, demonstrate that schools and colleges that use the PCOA have incorporated their students’ performance results in a variety of ways within their programs.7-8 Consistent with prior research, our results suggest the most common purposes for administering the examination are to assess the curriculum, individual student performance, and overall cohort performance.5 The vast majority (92%) of the responding programs in our study administered the PCOA to their P3 students in 2015-2016. By comparison, Kirschenbaum and colleagues reported in 2006 that only 50% (34 of 68) of responding schools used cumulative, end-of-year examinations prior to development of the PCOA.9 More recently, Vyas and colleagues determined in 2014 that 50% of schools (52 of 105) had a cumulative assessment program, many of which incorporated use of cumulative examinations, with about 20% specifically using the PCOA.7 Relatively few responding programs in our study (less than 20%) reported assigning medium or high stakes to individual P3 student performance on the PCOA. Similarly, Kirschenbaum and colleagues reported that 20% of programs used a high-stakes year-end examination.9 A 2014 survey related to PCOA use showed that only 6% of respondents were using the examination as a high-stakes assessment.5 Taken collectively, these studies and the volume of literature on student summative evaluations in pharmacy education suggest that some programs are using the PCOA as part of a comprehensive student evaluation process, while others may be using it in place of previously developed, school-specific, year-end examinations for progressions.4,10-15

Our findings indicate that from the 2008-2009 academic year to the 2015-2016 academic year, there was a steady increase in the number of programs using the PCOA in their P1, P2 and P3 program years. Not surprisingly, the number of programs using the PCOA for P3 students increased markedly from prior years to 2015-2016 with the adoption of the new ACPE 2016 Standards. During the first two testing windows in 2016, a little more than 13,000 P3 students had taken the PCOA compared to approximately 4,000 in 2014-2015.16 Our survey estimates were consistent with these figures, showing an increase in the number of schools administering the PCOA to P3 students from 32 to 85. Our study also demonstrated that the number of institutions administering the PCOA to P1 and P2 students has increased over time, but overall remains limited (14% and 16% of our sample in 2015-2016). The number of programs that administer the PCOA to P4 students is also relatively small and has remained fairly consistent at around five schools each year.

Nearly 20% of responding programs reported using the PCOA for P3 students in a medium- to high-stakes manner. Several additional programs that identified having “other” types of stakes have used PCOA performance as part of the process for determining course grades and student progressions. The literature is limited in this area, with only one institution having published a description of using the PCOA in a high-stakes manner.4 Most institutions have used the PCOA as a low-stakes and generally formative student assessment, and whether they required it for P3 students (prior to the 2016 ACPE mandate) varied widely across programs.3,8,15,17-20 Other pharmacy programs have reported use of institution-specific competency assessments for both formative and curricular assessment purposes as well as in a high-stakes manner to determine student progression.11-13

The results from the current study indicate that the majority of US pharmacy programs have not elected to use the PCOA as a high-stakes assessment for the purpose of making progressions decisions or determinations regarding individual student readiness for APPEs. Instead, it appears most institutions are currently using the PCOA to inform students of their knowledge-based abilities and for curriculum assessment purposes.5,8,14,15,18 However, 55% of programs in our study also reported that they were considering changing the stakes associated with P3 students’ PCOA scores. This suggests that a high degree of change in the nature of PCOA use may occur over the next few years.

Other health professions, including medicine, physician assistant (PA) studies, nursing, and osteopathic medicine, use standardized assessments at different points throughout their respective programs to provide evidence of student attainment of knowledge. Similarities exist across standardized assessments in the health professions, as well as several notable differences with respect to how programs interpret and use the outcomes and integrate these examinations within the curriculum. In medical and osteopathic medical education, a portion of the examinations conducted within the curricula, including the United States Medical Licensing Examination and the Comprehensive Osteopathic Medical Licensing Examination-USA, are competency- and/or skills-based assessments, whereas the PCOA is norm-referenced and knowledge-based.21-24 Standardized examinations within physician assistant and nursing programs are more similar to the PCOA.

Most (about 90%) of the physician assistant studies programs in the United States use a standardized assessment developed by the Physician Assistant Education Association (PAEA) called the PACKRAT (Physician Assistant Knowledge Rating and Assessment Tool) to help students self-assess their clinical knowledge. Similar to the PCOA, the PACKRAT may be used either as a formative or summative assessment, and detailed performance reports with comparative statistics are provided to students and to the program.25 Administration of the PACKRAT, however, differs from the highly controlled, secure-testing environment of the PCOA in that the PACKRAT can be administered as a proctored or non-proctored examination and on site or remotely, and the testing window is adjustable. The PAEA also offers seven standardized End of Rotation examinations that focus on relevant knowledge gained during specific clinical practice experiences.26 In contrast to the PACKRAT, supervised proctoring is required for these End of Rotation examinations. The stakes for these examinations are determined by individual programs much like the PACKRAT and the PCOA in PharmD programs.26

Many nursing programs use a standardized knowledge assessment called the HESI (Health Education Systems, Inc) Exit Examination to evaluate a student’s readiness to pass the National Council Licensure Examination for Registered Nurses (NCLEX-RN). The HESI Exit questions focus on critical thinking and are representative of NCLEX examination questions.27,28 This differs from the approach adopted by the NABP in that the PCOA was developed primarily as a curriculum evaluation tool rather than a predictor for individual student success on the NAPLEX.

Research from PA studies, nursing, and pharmacy has shown student performance on within-program standardized assessments and licensing examinations to be significantly correlated. A pilot study by Massey and colleagues demonstrated that formative and summative examinations, including the PACKRAT, yielded a predicted score for the Physician Assistant National Certifying Examination (PANCE) that was correlated at an R2 of 0.75.29 The researchers concluded that performance on formative and summative evaluations throughout the curriculum could be used to identify students at risk of failing the PANCE. In a pilot study involving 72 students from three PA programs, Massey and colleagues also found that student scores on the PAEA End of Rotation subject examinations and the PANCE were strongly and positively correlated (r=0.856, p<.05).30 In nursing, the ninth HESI Exit Examination validity study concluded that the HESI Exit Examination was highly accurate (96.61%) in predicting NCLEX-RN examination success.28 Studies within pharmacy education have found significant correlations between the PCOA and NAPLEX, ranging from r=0.51 to 0.73; however, the amount of variance in the predicted NAPLEX total score explained by the PCOA was generally limited.3,14,15,17,19,20
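As a point of reference for interpreting these statistics, under a simple univariate linear prediction model the proportion of variance explained is the square of the correlation coefficient:

    R^2 = r^2, so r = 0.51 gives R^2 = 0.51^2 ≈ 0.26, and r = 0.73 gives R^2 = 0.73^2 ≈ 0.53.

In other words, even the strongest PCOA-NAPLEX correlation reported above would leave roughly half of the variance in NAPLEX scores unexplained, which is consistent with the limited predictive utility noted in those studies.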

Formal remediation processes for students who are identified by their PCOA performance as being at risk have, to date, been implemented by only a small number of pharmacy programs. Moreover, the few programs in our survey that reported requiring students who performed poorly on the PCOA to undergo remediation used a variety of processes and methods. Perceived strengths of the programs that reported having a remediation process in place for the PCOA include the use of faculty committees to make decisions regarding remediation as well as oversight by the dean or director of assessment. This may reflect both general faculty input into the process and allocation of important resources. Potential challenges for programs incorporating remediation processes may include delays for students, increased faculty and/or administrator workload, cost, and limited opportunity for reassessment given the timing of subsequent PCOA administrations and the relatively long turnaround time for results.31

Length of remediation for the PCOA varied considerably across programs, with remediation lasting less than one week, two to eight weeks, or occurring over an extended period of time (a semester or academic year) and often at the student’s own pace. Several programs also reported requiring SMART (specific, measurable, achievable, relevant, time-oriented) plans for students who were unsuccessful in their remediation attempts. While the majority of programs did not reassess students following remediation, a few required students to retake the PCOA or complete a program-based examination to gauge the success of the remediation. For the programs that had a reassessment procedure in place, it was generally overseen at the dean or director level.

Limitations of this study include a response rate of 68%, which reflects a less than complete sampling of US colleges and schools of pharmacy. Although respondents were asked to specify the type of PharmD program(s) at their institutions (eg, traditional four-year, three-year, “0-6”), data were analyzed in aggregate in part because of the small number of nontraditional programs. As a result, important differences between program types may have been overlooked. Also, the specific content covered during remediation for the responding programs that reported having a PCOA-based remediation program was not queried. The content of remediation programs as well as the timing and delivery methods for them represent an important direction for future research. Another possible limitation is that data were self-reported and provided by directors or deans of assessment. An assumption was made that those individuals were generally the most knowledgeable in the area of PCOA administration and remediation and therefore able to answer the questions presented in the survey for their respective institutions. While the length of the remediation period was recorded, the survey did not collect information regarding the proportion of that time period dedicated to remediation versus other activities such as ongoing coursework. Two additional limitations involved the cleaning and recoding of data. As noted previously, a number of individuals selecting “other” for one or more items provided a description that appeared to match one or more predefined categories. Their responses were recoded accordingly; however, the researchers may have misunderstood the respondents’ meaning. Several respondents also provided information that was somewhat unclear, so we contacted them for clarification. However, as the respondent’s name and institution were optional fields, this could not be done in all cases.

Future research should seek to determine the validity of the PCOA as a tool for identifying students in need of remediation prior to beginning APPEs. Limited reports have been published regarding its utility in predicting APPE success.4,32 Coyle and colleagues stated that after their program transitioned from a “homegrown” examination to the PCOA as a high-stakes assessment for students prior to beginning APPEs, a correlation was found between poor performance on the PCOA and lack of readiness for APPEs.4 Jefferson and Weber evaluated the correlation of pharmacotherapy course grade point averages, skills laboratories, and PCOA scores with APPE performance.32 A positive correlation existed between PCOA scores and APPE performance, but the PCOA was not found to be a comprehensive indicator on its own. If the PCOA can be established as a reliable and valid assessment for determining success on APPEs, or as part of a formula for doing so, best practices for developing benchmark scores and remediation processes should also be described in the literature.4,32

CONCLUSION

The survey data confirmed an increase in the number of programs using the PCOA since it first became available in 2008-2009, especially for testing P3 students, which likely occurred in response to ACPE Standards 2016. The findings from this study clearly demonstrate wide variability in how pharmacy schools are positioning the use of the PCOA (high vs low stakes), setting (or not setting) standards for individual student performance, and designing and implementing associated student remediation processes. Although relatively few programs use PCOA as part of student progressions decisions, more than half of those administering the PCOA as a low-stakes assessment in 2015-2016 are considering changing the stakes for future cohorts. Given the mandated use of PCOA for pre-APPE students nationally, it will be important for future research to identify and study best practices as more colleges and schools incorporate this tool within their student and programmatic assessment plans.

  • Received September 15, 2017.
  • Accepted January 19, 2018.
  • © 2019 American Association of Colleges of Pharmacy

REFERENCES

1. Accreditation Council for Pharmacy Education. Accreditation Standards and Guidelines for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree (Standards 2016). https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf. Published February 2015. Accessed July 14, 2019.
2. Accreditation Council for Pharmacy Education. Accreditation Standards and Guidelines for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree 2007. https://www.acpe-accredit.org/pdf/S2007Guidelines2.0_ChangesIdentifiedInRed.pdf. Accessed July 14, 2019.
3. Buring SM, Hein BE, Messinger NJ, Wigle PR. The PCOA effect: the Cincinnati experience. Am J Pharm Educ. 2015;79(5):Article 122.
4. Coyle EA, Jenkins TL. Using the PCOA as a high stakes exam in assessing pre-APPE readiness. Am J Pharm Educ. 2015;79(5):Article 124.
5. Gortney JS, Bray BS, Salinitri FD. Implementation and use of the Pharmacy Curriculum Outcomes Assessment at US schools of pharmacy. Am J Pharm Educ. 2015;79(9):Article 137.
6. American Association of Colleges of Pharmacy. Faculty and Professional Staff Roster 2016. http://www.aacp.org/about/membership/Pages/roster.aspx. Accessed July 1, 2016.
7. Vyas D, Halivovic J, Kim MK, Raynan MC, Rogan EL, Galal SM. Use of cumulative assessments in US schools and colleges of pharmacy. Pharmacy. 2015;3:27-38.
8. Scott DM, Bennett LL, Ferrill MJ, Brown DL. Pharmacy Curriculum Outcomes Assessment for individual student assessment and curricular evaluation. Am J Pharm Educ. 2010;74(10):Article 183.
9. Kirschenbaum HL, Brown ME, Kalis MM. Programmatic curricular outcomes assessment at colleges and schools of pharmacy in the United States and Puerto Rico. Am J Pharm Educ. 2006;70(1):Article 8.
10. Szilagyi JE. Curricular progress assessments: the mile-marker. Am J Pharm Educ. 2008;72(5):Article 101.
11. Alston GL, Love BL. Development of a reliable, valid annual skills mastery assessment examination. Am J Pharm Educ. 2010;74(5):Article 80.
12. Kelley KA, Beatty SJ, Legg JE, McAuley JW. A progress assessment to evaluate pharmacy students’ knowledge prior to beginning advanced pharmacy practice experiences. Am J Pharm Educ. 2008;72(4):Article 88.
13. Meszaros K, Barnett MJ, McDonald K, et al. Progress examination for assessing students’ readiness for advanced pharmacy practice experiences. Am J Pharm Educ. 2009;73(6):Article 109.
14. Garavalia LS, Prabhu S, Chung E, Robinson DC. An analysis of the use of Pharmacy Curriculum Outcomes Assessment (PCOA) scores within one professional program. Curr Pharm Teach Learn. 2017;9:178-184.
15. Naughton C, Friesner DL. Correlation of P3 PCOA scores with future NAPLEX scores. Curr Pharm Teach Learn. 2014;6:877-883.
16. National Association of Boards of Pharmacy. 2014-2017 PCOA Administration Highlights. https://nabp.pharmacy/wp-content/uploads/2018/03/PCOA-Highlights-2018.pdf. Accessed July 14, 2019.
17. Sousa JM, Hutchinson DJ, Lenhard LM. Pharmacy Curriculum Outcomes Assessment (PCOA) as predictors of performance on NAPLEX. Am J Pharm Educ. 2015;79(5):Article 4.
18. Waskiewicz RA. Pharmacy students’ test-taking motivation-effort on a low-stakes standardized test. Am J Pharm Educ. 2011;75(3):Article 41.
19. Gortney JS, Rudolph M, Maerten-Rivera J, et al. An examination of the relationships between PCOA and NAPLEX subtopic and total scores. Am J Pharm Educ. 2017;81(5):Article 51.
20. Ferrence JD, Welch AC. Using PCOA: experiences from two early adopter schools. Am J Pharm Educ. 2017;81(5):Article 9.
21. United States Medical Licensing Examination® (USMLE®). 2017. http://www.usmle.org/. Accessed June 7, 2017.
22. Gilliland WR, La Rochelle J, Hawkins R, et al. Changes in clinical skills education resulting from the introduction of the USMLE step 2 clinical skills (CS) examination. Med Teach. 2008;30(3):325-327.
23. Haist SA, Butler AP, Paniagua MA. Testing and evaluation: the present and future of the assessment of medical professionals. Adv Physiol Educ. 2017;41(1):149-153.
24. American Association of Colleges of Osteopathic Medicine. Comprehensive Osteopathic Medical Licensing Examination (COMLEX-USA) 2017. http://www.aacom.org/news-and-events/publications/2018-cib/board-examinations-and-licensure. Accessed August 15, 2017.
25. Physician Assistant Education Association. PACKRAT Exam. 2017. http://packratexam.org. Accessed June 9, 2017.
26. Physician Assistant Education Association. PAEA End of Rotation Exam. 2017. http://www.endofrotation.org/. Accessed June 9, 2017.
27. HESI Examiners. HESI Exam Guide. http://www.hesi-exam.com. Accessed June 9, 2017.
28. Zweighaft EL. Impact of HESI Specialty Exams: the ninth HESI Exit Exam validity study. J Prof Nurs. 2013;29(2 Suppl 1):S10-S16.
29. Massey S, Stallman J, Lee L, et al. The relationship between formative and summative examinations and PANCE scores: can the past predict the future? J Physician Assist Educ. 2011;22(1):41-45.
30. Massey S, Holmerud D, Hammond J, et al. Correlation of the Physician Assistant Education Association end of rotation examinations with the Physician Assistant National Certifying Exam. J Physician Assist Educ. 2015;26(3):144-146.
31. Mok TY, Romanelli F. Identifying best practices and utilities of the Pharmacy Curriculum Outcomes Assessment examination. Am J Pharm Educ. 2016;80(10):Article 163.
32. Jefferson CG, Weber SS. Pharmacy Curriculum Outcomes Assessment as a predictor of success on advanced pharmacy practice experiences. Am J Pharm Educ. 2016;80(5):Article 3.