Abstract
Objectives. To adapt and evaluate an instrument that measures pharmacy students' perceived psychological contract violations by schools and colleges of pharmacy.
Design. A psychological contract violations measure was developed from existing literature and the 1997 ACPE Guidelines and pilot-tested with second-year pharmacy students at 2 schools of pharmacy. A revised measure then was administered to second-year pharmacy students at 6 schools of pharmacy. Using a 5-point Likert-type scale, participants were asked to indicate the level of obligations they received compared to what was promised by the school of pharmacy.
Results. Exploratory factor analysis on the psychological contract violations measure was conducted using principal components analysis resulting in 7 factors, which led to a revised measure with 26 items. Using a sample of 339 students, the proposed 7-factor measurement model was tested using confirmatory factor analysis. In general, the results supported the hypothesized model. The final 23-item scale demonstrated both reliability and validity. Some students perceived certain aspects of the psychological contract that exists with their school of pharmacy were being violated.
Conclusion. The psychological contract violations measure may serve as a valuable tool for identifying areas in which students believe that their school or college of pharmacy has not fulfilled promised obligations.
INTRODUCTION
There is a consensus in the literature as to the need for colleges and schools of pharmacy to foster student development of professionalism.1 Tools are available to measure pharmacy student professionalism as a comprehensive construct in and of itself2,3 or its specific elements such as professional commitment4 and attitudes toward pharmaceutical care.5 However, less attention has been devoted to exploring determinants of students' professional attitudes and behaviors or the mechanisms underlying their formation.
Although developed in the employment context, the concept of psychological contracts (and their violations) may help explain students' attitudes and subsequent behaviors, such as professional conduct, commitment to their school and the profession, and the appropriate provision of pharmaceutical care. Psychological contracts “entail beliefs about what employees believe they are entitled to receive, or should receive, because they perceive that their employer conveyed promises to provide those things”6; the relationship between psychological contract violations and a number of employment outcomes (eg, job satisfaction, job turnover, organizational commitment, organizational citizenship behaviors, and work performance) has been examined. The purpose of this study was to develop and evaluate an instrument that measures students' perceived psychological contract violations by their pharmacy schools.
Psychological Contracts
The concept of a psychological contract traditionally has been applied in an employment context and refers to subjective beliefs held by employees regarding the organization's obligations to them.7 According to Rousseau: “when an individual perceives that the contributions he or she makes obligates the organization to reciprocity (or vice versa), a psychological contract emerges … it is the individual's belief in an obligation of reciprocity that constitutes the contract.”8 These perceived obligations may be implied by the organization or expressly stated. Each individual develops a unique psychological contract based upon his/her own understanding of the reciprocal obligations that exist between the employee and the organization.9
Psychological contracts share several characteristics. In addition to being subjective perceptions that differ between individuals,10 these contracts are dynamic, changing over time during the employer-employee relationship. Psychological contracts involve mutual obligations, based on implied or explicit promises, in which both parties invest in their relationship with the expectation of a positive outcome. Although psychological contracts are rarely explicitly discussed, they are important determinants of employees' behaviors and attitudes.11
Although much of the research to date has examined psychological contracts within an employment context, psychological contracts can arise in a myriad of circumstances, such as customer-firm relations and doctor-patient interactions, and in situations where there are written as well as unwritten agreements.10 Given relationships between students and their respective universities and schools, psychological contracts likely exist in educational settings. Anderson states that as consumers of education, students are not unlike consumers of other products and services.12 For example, they often seek information about course offerings and may build their expectations on the information available to them when they make course selections.12 There is evidence that students' academic expectations, contrasted with their actual experience (a psychological contract of sorts), can help identify at-risk students.13 Pharmacy students may possess psychological contracts with various individuals and entities during their professional education, and pharmacy schools have attempted several strategies to enhance student professionalism and professional commitment, which in turn help foster psychological contracts. One such example is the white coat ceremony, which in medicine has been described as creating a “psychological contract for professionalism and empathy.”14
Measuring Psychological Contract Violations
Because the breach of perceived contracts influences attitudes and behaviors of the offended party, the offender would benefit by knowing when such violations occur. In the context of employment, Robinson and Rousseau define a psychological contract violation as “the perception by an individual that his or her organization has failed to fulfill promised obligations.”15 Rousseau notes that employees do not interpret all instances of organizational noncompliance as psychological contract violations.10 Similar to the terms of the psychological contract, the perception of violations rests with the employee. How the individual interprets this breach of the contract10,16 and/or the degree to which the employee focuses on the discrepancy17 ultimately determines whether the employee believes a psychological contract violation has taken place. In an educational context, the “employer” role within a psychological contract is fulfilled by the school and the “employee” role is fulfilled by the student.
Several different approaches have been used to assess breaches of psychological contracts. Most applicable to the current study, Turnley and Feldman used a dimensional measure of the degree of psychological contract violations rather than a global measure.18 Their measure of contract violation incorporates multiple items addressing 16 different aspects of the employment relationship. Furthermore, the measure allows respondents to indicate whether each aspect of their contract was unfulfilled, fulfilled, or overfulfilled.
Literature Review Summary
The literature examining psychological contracts is extensive and many relationships with psychological contract violations have been empirically investigated. However, these studies have all been conducted in the context of an employer-employee relationship even though there appears to be some promise in studying such contracts in the context of other relationships. In addition, few studies have examined simultaneously the causes and consequences of relationships between pharmacy students and schools.19 This study attempts to fill these gaps by developing and evaluating an instrument that measures pharmacy students' perceived psychological contract violations by schools and colleges of pharmacy.
METHODS
The development of the psychological contract violations measure, as reported in this manuscript, was part of a larger study examining the effects of such violations on attitudinal outcomes in a student/pharmacy school environment. Measure development occurred in 3 stages: initial measure development; pilot measure testing and revision using exploratory factor analysis (EFA); and revised measure testing using confirmatory factor analysis (CFA). The methods used in subsequent stages were informed by the results of previous stages. (Copies of the pilot measure and the revised measure are available from the author upon request.)
Initial Measure Development
Two psychological contract violation measures designed for use in the employment context were examined for possible inclusion in the present measure.15,18 The 16-item measure developed by Turnley and Feldman18 assesses aspects of the employment relationship that represent potential organizational obligations within the psychological contract as identified by previous research.7,15,20 Those aspects include: pay, work content, opportunity for advancement, benefits, job security, training, development, feedback, and organizational and supervisory support.18 Of the original 16 items, 9 were immediately eliminated because they were not relevant to the student/school of pharmacy relationship. The 7 retained items, compiled for possible inclusion in the instrument, addressed training, career development, decision-making input, challenge, feedback, supervisory support, and organizational support.
Robinson and Rousseau's single-item measure of psychological contract violation was adapted and modified for inclusion in the current measure.15 Prior research in the area of psychological contract violations has utilized this single-item, global assessment.15,21,22 Its purpose in the present study was to provide evidence of the validity of the newly developed measure through its correlation with that measure.
Generation of additional items for inclusion in the modified measure initially began by examining the results from focus groups conducted with pharmacy students and pharmacy faculty members at The University of Mississippi. Potential items for the current measure also were identified from studies that examined the perceptions of professional students (medical and pharmacy) regarding their college or institution.15,18,19,23 Additionally, the Accreditation Council for Pharmacy Education (ACPE) 1997 accreditation standards for colleges and schools of pharmacy were consulted as a source for items.24 The resulting measure consisted of 33 items, plus the modified Robinson and Rousseau global item, prior to pilot testing.
The measure did not ask students specifically about psychological contract violations but rather asked them to indicate the amount of each item they received from their school/college of pharmacy compared to what they felt had been promised to them. Similar to Turnley and Feldman,18 a 5-point scale ranging from 1 = “Receive much more than promised” to 5 = “Receive much less than promised” was used. Thus, the higher the score, the greater the magnitude of psychological contract violation.
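For illustration, a minimal Python scoring sketch appears below. Only the two scale endpoints are quoted above, so the intermediate response anchors, the item names, and the example data are assumptions.

```python
# Minimal sketch of scoring the 5-point scale described above.
# Intermediate anchor labels, item names, and data are assumed for illustration.
import pandas as pd

scale = {
    "Receive much more than promised": 1,
    "Receive somewhat more than promised": 2,   # assumed wording
    "Receive about what was promised": 3,       # assumed wording
    "Receive somewhat less than promised": 4,   # assumed wording
    "Receive much less than promised": 5,
}

responses = pd.DataFrame({
    "item_01": ["Receive about what was promised", "Receive much less than promised"],
    "item_02": ["Receive much more than promised", "Receive somewhat less than promised"],
})

scored = responses.replace(scale)   # higher score = greater perceived violation
print(scored.mean(axis=1))          # per-respondent mean across items
```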
Pilot Measure Testing
Objectives for the pilot study included: (1) an assessment of the factor structure of the measure to facilitate its refinement and revision, (2) a preliminary examination of the instrument's reliability and validity, and (3) evaluation of the wording of each item for understanding by respondents. The pilot measure was administered to doctor of pharmacy (PharmD) students in their second year. That class year was selected because those students were thought to have formed fewer psychological contracts outside of the school of pharmacy than their final-year counterparts, had had ample time to develop psychological contracts with the institution, and had emerged from the “honeymoon” phase of the first year, yet were not focused solely on graduation.
A convenience sample of 2 schools of pharmacy, one public and one private, was chosen for the pilot test (The University of Mississippi and Samford University). Data from these 2 schools were not used in the subsequent analysis in which the revised measure was tested. Questionnaires that contained the psychological contract violations measure along with a set of demographic questions were sent to contacts at each of the 2 schools that agreed in advance to facilitate data collection at their institutions. The pilot study received exemption from the institutional review boards at both universities.
Data were analyzed using IBM SPSS (SPSS: An IBM Company, Chicago, IL, USA). EFA was conducted using principal components analysis (PCA) as the method of factor extraction. Prior to factor extraction, Bartlett's test of sphericity and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (MSA) were evaluated to determine the factorability of the data. To aid in interpretation of the factor structure, orthogonal rotation (using the varimax procedure) was employed. Researchers have many options when using EFA, and PCA followed by varimax rotation is commonly used. Rather than evaluating whether different EFA options would lead to consistent results, the fit of the measurement model was evaluated during the revised measure testing phase using confirmatory factor analysis (CFA) with a new set of data. PCA was used to identify poorly performing items, that is, items that failed to load well on any factor, cross-loaded on more than one factor, or had low communalities (ie, the amount of variance in each item accounted for by the factors). Reliabilities for the scale and subscales also were calculated.
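The Python sketch below illustrates an equivalent EFA workflow for readers who wish to replicate the approach. The analyses reported here were conducted in SPSS, so the factor_analyzer package, the file name, and the 0.40 cutoffs shown are assumptions rather than the procedure actually used.

```python
# Sketch of the pilot-phase EFA workflow; package, file name, and cutoffs are assumptions.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

items = pd.read_csv("pilot_items.csv")   # hypothetical file: one column per scale item

# Factorability checks: Bartlett's test of sphericity and the KMO measure of sampling adequacy
chi_sq, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_overall = calculate_kmo(items)
print(f"Bartlett chi-square = {chi_sq:.1f}, p = {p_value:.4f}; KMO = {kmo_overall:.2f}")

# Principal components extraction with varimax (orthogonal) rotation
efa = FactorAnalyzer(n_factors=7, method="principal", rotation="varimax")
efa.fit(items)

eigenvalues, _ = efa.get_eigenvalues()   # inspect how many eigenvalues exceed 1
loadings = pd.DataFrame(efa.loadings_, index=items.columns)
communalities = pd.Series(efa.get_communalities(), index=items.columns)

# Flag poorly performing items: low communality or no loading above a conventional cutoff
weak = communalities[communalities < 0.40].index.union(
    loadings[loadings.abs().max(axis=1) < 0.40].index
)
print("Candidate items for removal:", list(weak))
```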
Revised Measure Testing
The measure that was developed during the pilot testing phase was subjected to additional analyses using a new set of data. Prior to administering the revised measure to a new sample, subtle wording changes were made to several items (eg, removing “the” and “a” from items) and more substantial wording changes were made to 3 items (Table 1). Item 25 was changed to read, “Potential to be involved in professional organizations.” Item 26 was changed to read “Potential to participate in extracurricular activities.” Item 32 was split into 2 items, one about teaching and the other about learning. The final revised measure consisted of 26 items plus the modified Robinson and Rousseau global item.
Table 1. Orthogonal Rotation Factor Matrix for Pilot Measure Data
A data collection methodology similar to the pilot test was used to collect data to assess the revised measure. Questionnaires that contained the revised psychological contract violations measure along with a set of demographic questions were sent to contacts at 6 schools/colleges of pharmacy (mixed public and private) who agreed in advance to facilitate data collection at their institutions (Southwestern Oklahoma State University, University of Arkansas for Medical Sciences, Wingate University, West Virginia University, Duquesne University, University of Missouri – Kansas City). As with the pilot study, second-year pharmacy students were asked to complete questionnaires. Exemption or approval was received from the institutional review boards at each university.
CFA based on structural equation modeling (SEM) with Amos (SPSS: An IBM Company, Chicago, IL, USA) was used to assess convergent and discriminant validity of the revised multifactor psychological contract violations measure. Essentially, this item-level analysis on the 26-item psychological contract violations measure was performed to further purify the measure and to examine whether the 7-factor model derived in the pilot study using PCA was replicated using CFA. All items were modeled to load only on their corresponding factor and all latent variables were allowed to correlate. The hypothesized model tested for the psychological contract violations measure was derived from the pilot test of the measure, in which EFA was used.25
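A sketch of how such a measurement model can be specified is shown below using the open-source semopy package. This is an assumption for illustration (the study used Amos), and the factor and item names are hypothetical, with only 3 of the 7 factors written out.

```python
# Sketch of the CFA specification with semopy (an assumption; the authors used Amos).
# Factor and item names are hypothetical stand-ins for the 7-factor, 26-item model.
import pandas as pd
import semopy

model_desc = """
Faculty     =~ item01 + item02 + item03
Facilities  =~ item04 + item05 + item06
Involvement =~ item07 + item08 + item09
"""

data = pd.read_csv("revised_items.csv")   # hypothetical file of item responses

model = semopy.Model(model_desc)          # each item loads only on its intended factor;
model.fit(data)                           # latent factors covary under the package defaults

print(model.inspect())                    # loadings, factor covariances, and their tests
print(semopy.calc_stats(model).T)         # global fit indices such as CFI and RMSEA
```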
Convergent validity was assessed by examining whether the items loaded significantly on the intended factor.26 Modification indices were inspected for evidence of large cross-loadings. Minimal evidence of cross-loadings combined with significant t values for each indicator loading supported the convergent validity of the constructs represented by the psychological contract violations measure. In addition, convergent validity was assessed by examining the average variance extracted (AVE) for each factor; values of 0.5 or higher suggest adequate convergence.27 Discriminant validity was assessed by using a chi-square difference test to compare a model in which the correlation for each pair of factors was constrained to 1 with the unconstrained model (essentially testing whether the correlation between 2 latent variables is 1). Large changes in the chi-square statistic indicate adequate discriminant validity.26 Each test was performed in a pair-wise manner for all inter-factor correlations. Evidence of reliability was provided through an examination of Cronbach's alpha and construct scale reliabilities for each subscale; estimates of 0.7 or higher suggest good reliability.27 The comparative fit index (CFI) and the root mean square error of approximation (RMSEA) were used to evaluate model fit.28,29
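The following sketch works through these computations with made-up numbers: the average variance extracted and construct reliability for one factor's standardized loadings, and a chi-square difference test for one pair of constrained versus unconstrained models. None of the values are results from this study.

```python
# Worked sketch of the validity and reliability computations; all values are illustrative.
import numpy as np
from scipy.stats import chi2

loadings = np.array([0.72, 0.68, 0.81, 0.65])   # standardized loadings for one factor
errors = 1 - loadings**2                         # residual (error) variances

ave = np.mean(loadings**2)                       # average variance extracted; >= 0.5 desired
construct_rel = loadings.sum()**2 / (loadings.sum()**2 + errors.sum())  # >= 0.7 desired
print(f"AVE = {ave:.2f}, construct reliability = {construct_rel:.2f}")

# Chi-square difference test for discriminant validity: compare the unconstrained model
# with one in which the correlation for a pair of factors is fixed to 1.
chi2_unconstrained, df_unconstrained = 610.2, 209   # illustrative values
chi2_constrained, df_constrained = 668.9, 210
delta_chi2 = chi2_constrained - chi2_unconstrained
delta_df = df_constrained - df_unconstrained
p_value = chi2.sf(delta_chi2, delta_df)             # significant -> factors are distinct
print(f"Delta chi-square = {delta_chi2:.1f} on {delta_df} df, p = {p_value:.4f}")
```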
RESULTS
For the pilot measure testing, 208 survey instruments were administered and 175 (84.1%) usable responses were received. An additional 5 cases were excluded prior to PCA because of missing data; thus, the analysis sample consisted of 170 cases. The KMO-MSA revealed a value of 0.86 and Bartlett's test of sphericity was significant (p = 0.0005). Therefore, factor analysis was considered appropriate for the sample.30
A total of 8 items were removed based on the PCA results and after the research team reconsidered the meaning associated with the items, the potential for vagueness, and whether the items were indeed part of the psychological contract between pharmacy students and their schools. For example, 1 item dealing with the potential to select courses reflecting personal interests was removed because the schools of interest had “lockstep” curricula, leaving students little or no choice in the courses they took (for further information on item deletion, please contact the corresponding author). After removing these 8 items, PCA was again performed on the remaining 25 items.27 Seven factors with eigenvalues exceeding 1 were extracted, accounting for approximately 65% of the total variance. Communalities for each item ranged from 0.45 to 0.81. Factor loadings following varimax rotation are presented in Table 1. Based on the pattern of factor loadings, the factors were labeled “Faculty,” “Futuristic,” “Student Development,” “Course and Curricular Content,” “Learning Opportunities,” “Involvement,” and “Facilities.”
Reliability analysis revealed a reliability coefficient (Cronbach's alpha) for the total scale of 0.91 and a range of 0.59–0.84 for the 7 subscales. To provide some evidence of validity, the total scale score was correlated with the single-item measure adapted from previous research.15,21 As expected, the 2 measures were positively and significantly correlated (r = 0.654, p < 0.01).
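For readers who wish to reproduce these summary statistics on their own data, the sketch below computes Cronbach's alpha from the item responses and the correlation of the total score with the global single-item measure. The file and column names are hypothetical.

```python
# Sketch of the pilot reliability/validity checks; file and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

items = pd.read_csv("pilot_items.csv")                       # hypothetical item-level responses
global_item = pd.read_csv("pilot_global.csv")["global_violation"]

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = df.shape[1]
    item_vars = df.var(axis=0, ddof=1).sum()
    total_var = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print(f"Total-scale alpha = {cronbach_alpha(items):.2f}")

total_score = items.mean(axis=1)                             # higher = greater perceived violation
r, p = pearsonr(total_score, global_item)                    # expected positive and significant
print(f"r = {r:.3f}, p = {p:.4f}")
```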
For revised measure testing, 538 survey instruments were mailed to the 6 institutions and 364 (67.7%) responses were received. Twenty-five responses were excluded (due to missing data or students not enrolled in the second professional year), leaving an analysis sample of 339 (63.0% usable response rate). Within-school response rates ranged from 26.5% to 85.7%, with 4 of the 6 schools having response rates greater than 80%, while another school had a 58.3% response rate. The low response rate was from one of the public institutions and was attributed primarily to a miscommunication between the principal investigator and the site contact. Analyses were conducted both with and without the 22 students from this school and no substantive differences were noted. Therefore, the results presented in the remainder of the paper include all 339 usable responses. Table 2 provides a demographic breakdown of the 339 respondents used for the revised measure testing.
Table 2. Respondent Characteristics for Revised Measure Testing Sample (n = 339)
After examining the modification indices and initial factor loadings, 3 items were deleted because they showed evidence of cross-loadings or loaded poorly on the intended factor. CFA was performed on the remaining 23 items. Standardized loadings, goodness-of-fit indices, AVE, and reliability measures based on this analysis can be found in Table 3, and the items belonging to each factor can be found in Table 4. Although the RMSEA and CFI suggest only mediocre to reasonable model fit based on suggested standards,28,29 the t values for each indicator were significant and all standardized loadings were greater than 0.5, providing some evidence of convergent validity.27 Variance extracted calculations varied from 0.37 to 0.78. Reliability measures for each factor appear adequate with the possible exception of the course and curricular content factor. All chi-square differences were significant, providing evidence of discriminant validity (ie, the factors represent distinct constructs; Table 5). The “Student Development” subscale and the “Course & Curricular Content” subscale appear to be the most similar to each other. Further evidence of the validity of the revised measure is that the total scale score was positively and significantly correlated (r = 0.645, p < 0.01) with the single-item measure adapted from previous research.15,21
Table 3. Confirmatory Factor Analysis Results
Table 4. Survey Items and Their Respective Factor Structure (Revised Measure)
Table 5. Discriminant Validity Results
The items for each construct were averaged (using observed scores) to form 7 subscales, with higher values representing a greater degree of psychological contract violation for each dimension. While the average for each subscale and the total score is at or around the midpoint of the scale (ie, 3), there is some variation in the scores and most subscales had a maximum value at or near 5 (Table 6).
Table 6. Descriptive Statistics for the Subscales of the Psychological Contract Violation Measure
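A minimal sketch of forming the subscale and total scores from observed item responses follows. The factor-to-item mapping is hypothetical and abbreviated; the actual assignment appears in Table 4.

```python
# Minimal sketch of forming the subscale scores; item assignment shown is hypothetical.
import pandas as pd

items = pd.read_csv("revised_items.csv")     # hypothetical item-level responses

subscale_items = {
    "faculty":    ["item01", "item02", "item03"],
    "facilities": ["item04", "item05", "item06"],
    # ... remaining five factors omitted for brevity
}

subscales = pd.DataFrame({
    name: items[cols].mean(axis=1)           # higher = greater perceived violation
    for name, cols in subscale_items.items()
})
subscales["total"] = items.mean(axis=1)
print(subscales.describe())                  # means, SDs, and min/max as in Table 6
```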
DISCUSSION
This research makes several contributions to the literature on psychological contract violations. Prior to this work, psychological contract theory had been reserved for the employee/employer context. This study builds upon previous research in expanding the theory to include the student/school of pharmacy setting. In expanding the application of psychological contract theory, greater insight may be gained in understanding pharmacy students' attitudes and subsequent behaviors.
It is important for schools of pharmacy to measure whether students believe their psychological contracts have been violated. These perceptions have the potential to influence such variables as professional commitment, organizational commitment, and willingness to provide pharmaceutical care. Furthermore, if violations are indeed shown, the academic institution can use this instrument to identify areas in which it needs to improve. This may aid the institution in meeting its obligations to students. Once those obligations are met, students may be more likely to remain involved and committed to the profession and to contribute to the school of pharmacy upon graduation. Furthermore, students who believe the school of pharmacy fulfilled its obligations may be more likely to display positive attitudes and behaviors, such as professionalism. This research created a new instrument for measuring psychological contract violations among pharmacy students. Prior to this project, the only available psychological contract violation measures were ones that tapped the many dimensions present in the workplace.7,22 The newly developed measure taps many of the dimensions present in the student/school of pharmacy setting. Focusing on the educational environment, a final 23-item measure was developed, with items ranging from involvement in extracurricular activities to student participation on school of pharmacy committees to responsiveness to student evaluations.
The measure developed suggests that the construct of psychological contract violations has multiple dimensions, supporting the finding of Turnley and Feldman.18 Confirming this structure is important to colleges and schools of pharmacy because it demonstrates that the psychological contract that a student develops with the institution comprises many facets. Furthermore, this project provides colleges and schools of pharmacy with an instrument to assess their own students. Thus, each institution can attempt to examine the issue of whether a problem exists, and if present, the areas that need to be addressed.
There are many avenues for future research with respect to understanding perceived psychological contract violations among pharmacy students. First and foremost, additional research on the measure itself is necessary. For example, variance extracted calculations suggest some room for improvement for a couple of the factors. In addition, measures of reliability and tests of discriminant validity suggest potential problems with the Course & Curricular Content factor. Measure development is an iterative and lengthy process and it is not uncommon for further testing to lead to additional refinements.
The variation in subscale scores suggests that some respondents perceived a significant degree of psychological contract violation by their school/college of pharmacy. Given this, exploring students' responses to violations is a fruitful area of research and may help to explain students' attitudes and behaviors, such as professional conduct and commitment. In addition, if students' expectations of the perceived contract are met or managed, students may be more likely to remain involved and contribute to the school of pharmacy upon graduation. Understanding factors predictive of perceived violations also may aid in understanding this important construct.
In the present study, the measure was administered only to second-year pharmacy students. Students exhibit attitudinal differences as they progress through the professional curriculum;31,32 thus, a similar pattern may exist for students' perceptions of psychological contract violations. As students progress through the curriculum and gain more experience, they may gain a better understanding of the obligations the school owes them and what their contracts entail. Employees' perceptions of both their own obligations and their employers' obligations have been shown to increase over a 2-year period.21 Studies of different year-in-program cohorts and true longitudinal follow-up studies of students' perceived psychological contract violations may shed additional light on this area. The degree to which psychological contracts are violated may change over time among pharmacy students, and such studies could help evaluate predictors of that change. Also, professional socialization processes may change the conceptualization of psychological contract violations, a question of measurement invariance (ie, does the scale have the same meaning over time and across different groups of pharmacy students?). Thus, future research should address changes in degree of violation over time (and differences between more advanced students and students early in the educational process) and also the invariance of the scale over time and between groups of students at different levels of education and training.
The measure may need modification if administered to students in different years of the professional program. One foreseeable modification would be the inclusion of items related to advanced pharmacy practice experiences (APPEs) for fourth-year students. As such, 2 different instruments could emerge.
Limitations
This newly developed measure has demonstrated both reliability and validity in this sample. However, the sample included pharmacy students at only 6 schools of pharmacy. Though these schools (both public and private) were located in various regions of the country, the measure should be tested in a more diverse student population to assess the replicability of the measurement model.
Although the use of this measure represents an improvement over single-item scales, there are still areas for improvement. When creating a new scale, there is always the possibility that some items were incorrectly selected or that relevant items were omitted from the measure. As such, the items included to define the psychological contract violation construct may not truly represent its entire dimensionality. Though an extensive literature search was performed, the possibility exists that a body of work pertinent to this research project was overlooked. Furthermore, this study utilized the 1997 ACPE Standards, not the newer, revised standards.
Another limitation of this study was its cross-sectional nature, as all responses to the survey instrument were collected from each school of pharmacy at a single point in time. Given that the survey instruments were administered late in the spring semester (except at the 2 pilot schools), this timing in the academic calendar may have affected student responses. Also, it is not known whether the data were collected during periods when stressful academic events, such as examinations and student presentations, occurred.
This study relied on students' self-reporting of attitudinal data; thus, the potential for bias exists. One problem with collecting data using similar methods and types of scales is that the results could be attributed to individuals' tendencies to respond to similar types of measures in a similar fashion. Furthermore, ordinal or positional biases also may occur, whereby a respondent marks options located in a certain position, such as the first or last position for each question.
CONCLUSIONS
The present study expanded the traditional role of psychological contract theory from the employer/employee context to the student/school of pharmacy setting. Although further research is necessary, this study provides some evidence of the reliability and validity of the proposed measure. The findings also suggest that some pharmacy students do perceive psychological contract violations by their respective institutions, signifying the need for future research to aid in understanding the antecedents and consequences of such violations. The current psychological contract violation measure is an initial step in understanding this important construct and its measurement in the educational setting.
ACKNOWLEDGEMENTS
Financial support was provided by the McWhorter School of Pharmacy at Samford University and the Department of Pharmacy Administration at The University of Mississippi. We would also like to thank Shane Desselle, Mary Euler, Mary Ferrill, Jan Kavookjian, Virgil Van Dusen, and Donna West Strum for their assistance with this project.
Footnotes
*During the conduct of this study and part of manuscript preparation, Dr. Spies was an Assistant Professor at Samford University and Southwestern Oklahoma State University. His current affiliation is with the College of Pharmacy, University of Oklahoma Health Sciences Center.
Received October 16, 2009.
Accepted January 31, 2010.
© 2010 American Journal of Pharmaceutical Education