
DRAFT

Faculty, Survey Nonresponse, and Organizational Citizenship Behavior: A Population Profiling Study

Kiernan R. Mathews
Harvard University

Paper/Poster presented at the annual meeting of the Association for the Study of Higher Education, November 22, 2014

Abstract: This study introduces an organizational citizenship behavior (OCB) framework and the population profiling method to explore survey nonresponse among college faculty. Four OCB subscales are measured and compared among survey respondents, passive nonrespondents, and active nonrespondents under an analytical approach including chi-square, linear regression, and MANOVA.

FACULTY, SURVEY NONRESPONSE, AND ORGANIZATIONAL CITIZENSHIP BEHAVIOR: A POPULATION PROFILING STUDY

College and university leaders have come to rely increasingly on surveys of students, staff, and faculty to inform accreditation, compliance, accountability, and improvements to institutional policies and practices (Cote, Frinnell & Tompkins, 1986; Grosset, 1995; Schiltz, 1988; Peterson, Einarson, Augustine, & Vaughan, 1999; Snover, Terkla, Kim, Decarie & Brittingham, 2010). Surveys are favorite tools of higher education researchers, too; in 2010, over half of the articles published in Research in Higher Education used survey methodology to some extent (Barge & Gehlbach, 2012). As surveys have become more prevalent, however, so has nonresponse among both college populations and the general public (de Leeuw & de Heer, 2002; Jones, 1996; Kim, Gershenson, Glaser & Smith, 2011; Porter & Umbach, 2006; Porter & Whitcomb, 2005). Despite the lower rates of return, relatively few scholarly studies (in organizational science, less than 1 in 3) include serious examinations of the presence and potential impact of nonresponse bias on reported outcomes (Dooley & Lindner, 2003; Lindner, Murphy, & Briers, 2001; Rogelberg & Stanton, 2007; Werner, Praxedes, & Kim, 2007), leading one study to call survey nonresponse “a rather neglected stepchild” in organizational behavior (Spitzmüller, Glenn, Barr, Rogelberg & Daniel, 2006, p. 19).

Each survey and every data sample is in some respects unique, so response rates and the effectiveness of techniques to increase them vary depending on the context (Rogelberg & Luong, 1998). Although widely considered to be a sui generis population (Blackburn & Lawrence, 1995; Finkelstein, Seal, & Schuster, 1998; O’Meara, Terosky, & Neumann, 2008; Schuster & Finkelstein, 2006), university faculty have been relatively overlooked in studies of survey methodology. Their status, experiences, and settings differ from the usual suspects in survey nonresponse research; as a result, little is known today about the substantial proportion of college faculty who do not respond to the many organizational surveys they receive.

This study undertook an examination of faculty survey nonparticipation under an organizational citizenship behavior (OCB) framework in order to evaluate whether nonrespondent characteristics alter what we think we know about faculty. I asked:

RQ1. What is the extent of active and passive nonresponse among college faculty in an organizational survey?

RQ2. Do organizational citizenship behaviors predict survey response intentions?

RQ3. Do faculty respondents, passive nonrespondents, and active nonrespondents differ in their organizational citizenship behaviors?

The findings are intended to help researchers and practitioners to interpret their faculty survey data and to shed light on the scope of limitations to survey research in this population.

METHOD

Population profiling

This study adapts the “population profiling” method from the organizational science domain (Barr, Spitzmüller, & Stuebing, 2008; Rogelberg & Luong, 1998; Rogelberg, Luong, Sederburg & Cristol, 2000; Rogelberg et al., 2003; Rogelberg & Stanton, 2007; Spitzmüller et al., 2006; Spitzmüller et al., 2007). Population profiling relies on the theory of reasoned action, which states that intent to respond at a later date indicates the likelihood of actual response; a substantial body of literature provides evidence that prior survey refusals and the number of previously completed surveys indeed suggest the likelihood of future participation (Bradburn, 1978; Brennan & Hoek, 1992; Couper & Groves, 1996; Goyder, 1986; Stinchcombe, Jones & Sheatsley, 1981). Furthermore, subjects’ expressed intent to participate in a future survey has been found to influence, and to be correlated with, actual survey response (Rogelberg et al., 2006).

This study reverses the typical profiling sequence by administering the profile after the organizational survey’s period in the field is complete. Each faculty member’s response status (respondent or nonrespondent) is combined with his or her expressed survey participation intent to create the three categories of the “response profile” dependent variable. “Respondents” are known to have participated in a prior organizational survey and, in the profile, agree that they would participate in a similar survey in the future. Passive nonrespondents are known prior nonrespondents who agree to participate in the future. Finally, active nonrespondents are known prior nonrespondents who disagree that they would ever participate in such a survey.
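As a concrete illustration of this classification scheme, the sketch below codes the three-level response profile from two variables: a known prior-response flag and a Likert-type intent item. The DataFrame, column names, and agreement cutoff are assumptions for illustration, not the study’s instrument.

```python
# A minimal sketch of the response-profile classification described above.
# Assumes a pandas DataFrame with hypothetical columns `responded_prior`
# (1 = completed the earlier organizational survey, 0 = did not) and
# `intent` (agreement with a future-participation item on a 1-5 scale).
import pandas as pd

def classify_profile(row, agree_cutoff=4):
    """Return the three-level response profile used as the dependent variable."""
    agrees = row["intent"] >= agree_cutoff        # assumed agreement threshold
    if row["responded_prior"] == 1 and agrees:
        return "respondent"
    if row["responded_prior"] == 0 and agrees:
        return "passive nonrespondent"
    if row["responded_prior"] == 0 and not agrees:
        return "active nonrespondent"
    return "unclassified"                         # e.g., prior respondents who now refuse

faculty = pd.DataFrame({
    "responded_prior": [1, 0, 0, 1],
    "intent":          [5, 4, 1, 2],
})
faculty["profile"] = faculty.apply(classify_profile, axis=1)
print(faculty)
```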

Two large, public research universities met the criteria for selection. Each participated in an identical, multi-institution survey of faculty workplace attitudes and realized an approximately 50% response rate. Subsequently, these institutions granted me access to administer a 15-minute profiling survey during several routine, on-campus meetings of college faculty, just the “captive audience” that is required for the success of this nonresponse analysis technique.

Theoretical framework: Organizational Citizenship Behavior (OCB)

For my independent variables, I adapted a 19-item OCB inventory to the faculty setting (Podsakoff et al., 1990; Barr et al., 2008; Spitzmüller et al., 2007). Participants scored statements designed to assess intentions to engage in OCB conscientiousness, civic virtue, courtesy, and altruism.
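For readers interested in the scoring mechanics, the sketch below computes subscale scores as item means and reports Cronbach’s α for each scale (the reliabilities shown on the diagonal of Table 2). The item names, the item-to-subscale assignment, and the simulated responses are placeholders, not the adapted inventory itself.

```python
# A minimal sketch of OCB subscale scoring and reliability, assuming hypothetical
# item names (ocb01-ocb19) and an illustrative item-to-subscale assignment.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total)."""
    items = items.dropna()                        # listwise complete cases
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Illustrative mapping of the 19 items to the four subscales (assumed, not the instrument's).
subscales = {
    "conscientiousness": ["ocb01", "ocb02", "ocb03", "ocb04", "ocb05"],
    "civic_virtue":      ["ocb06", "ocb07", "ocb08", "ocb09"],
    "courtesy":          ["ocb10", "ocb11", "ocb12", "ocb13", "ocb14"],
    "altruism":          ["ocb15", "ocb16", "ocb17", "ocb18", "ocb19"],
}

rng = np.random.default_rng(0)                    # simulated 1-5 Likert responses
items = pd.DataFrame(rng.integers(1, 6, size=(200, 19)),
                     columns=[f"ocb{i:02d}" for i in range(1, 20)])

for name, cols in subscales.items():
    items[name] = items[cols].mean(axis=1)        # subscale score = mean of its items
    # With random data, alpha will be near zero; real items should cohere.
    print(name, round(cronbach_alpha(items[cols]), 2))
```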

Conscientiousness is “a pattern of going well beyond minimally-required levels of attendance, punctuality, housekeeping, conserving resources, and related matters of internal maintenance” (Organ, 1988, p. 96). Higher incidences of this behavior have been found to produce stronger survey response intentions, but not necessarily to predict survey response (Rogelberg et al., 2003; Spitzmüller et al., 2007).

Civic virtue is “responsible, constructive involvement in the political process of the organization, including not just expressing opinions but reading one’s mail, attending meetings, and keeping abreast of larger issues involving the organization” (Organ, 1988, p. 96). Spitzmüller et al. (2007) found no role for civic virtue in predicting survey intentions, but they and Youssefnia (2000) found evidence of lower levels of civic virtue among active nonrespondents.

Courtesy is a type of helping behavior involving voluntarily taking steps to prevent the creation of problems for coworkers (Podsakoff, MacKenzie, Paine, & Bachrach, 2000). Applied to survey research, an individual’s courtesy trait might lead him or her to see survey participation as a chance to diagnose and prevent institutional dysfunction, like communication problems or other organizational struggles (Spitzmüller et al., 2007). Indeed, research has found a relationship between OCB courtesy and response intentions, and active nonrespondents are less likely than respondents and passive nonrespondents to take steps toward preventing problems with others (Spitzmüller et al., 2007).

Altruism in the OCB context describes “discretionary behaviors that have the effect of helping a specific other person with an organizationally relevant task or problem” (Podsakoff et al., 1990, p. 115). In this study’s instrumentation, the intent-to-participate measure describes a specific instance in which the chief academic officer (CAO) is the originator of the request to respond. Therefore, faculty reporting higher altruistic behaviors could be responding in order to help the CAO perform her job.

FINDINGS


Altogether, 27% (n = 36) of the analytic sample qualified as active or passive nonrespondents according to established criteria (Rogelberg et al., 2003). Table 1 presents the results across response types for several characteristics of the participants. Chi-square tests of differences in group proportions yielded no statistically significant differences.
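The group-proportion comparisons behind Table 1 can be reproduced with a standard chi-square test of independence. The sketch below uses a gender-by-profile contingency table whose counts are illustrative approximations of Table 1’s percentages, not the study’s raw data.

```python
# A minimal sketch of the chi-square test of group proportions (here, gender by
# response profile). Counts are illustrative approximations of Table 1.
import numpy as np
from scipy.stats import chi2_contingency

#                respondent  passive NR  active NR
observed = np.array([[43,        11,         3],    # female
                     [55,        16,         6]])   # male

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")     # consistent with the null result above
```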

Participants’ intentions to complete a survey correlated with their past response behaviors at a significant level (Table 2). An exploratory factor analysis found that the 19 individual OCB items loaded onto the four expected scales. In total, the OCB scales accounted for 52% of the variance across the inventory. A bivariate correlation analysis confirmed the presence of statistically significant relationships between each of the four OCB constructs and survey response intentions; civic virtue behavior demonstrated the strongest correlation. Given the strength of these correlations, all OCB factors could be included in a multiple linear regression (Table 3) to test the extent to which organizational citizenship behaviors predict response intentions. Overall, the model explained 21% of the variance in survey response intention, with civic virtue being the only significant predictor.
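To make the regression step concrete, the sketch below fits an ordinary least squares model predicting a z-scored response intention from four (z-scored) OCB subscale scores. The data are simulated with hypothetical column names and are tuned only loosely to echo the reported pattern (civic virtue as the dominant predictor); they are not the study’s data.

```python
# A minimal sketch of the regression reported in Table 3: four standardized OCB
# subscale scores predicting a z-scored response-intention measure. All data are
# simulated and the column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 247
df = pd.DataFrame(rng.standard_normal((n, 4)),
                  columns=["conscientiousness", "civic_virtue", "courtesy", "altruism"])
# Outcome loosely mirroring the reported pattern (civic virtue dominant).
df["intention"] = (0.45 * df["civic_virtue"]
                   + 0.05 * df["conscientiousness"]
                   + rng.standard_normal(n))

model = smf.ols("intention ~ conscientiousness + civic_virtue + courtesy + altruism",
                data=df).fit()
# With z-scored variables, the coefficients play the role of the betas in Table 3.
print(model.summary())
```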

To answer the final research question, this study used one-way MANOVA with the three-level independent variable for response types and four dependent variables for the OCB subscales. This procedure considers the homogeneity of variance across groups and protects against inflated chances of making a Type I error (Spitzmüller et al., 2007). As long as its assumptions are met, MANOVA accounts (where ANOVA does not) for the covariance among multiple dependent variables, in this case the four OCB subscales, which correlate with one another at a statistically significant level (Huberty & Petoskey, 2000). Furthermore, MANOVA produces a canonically-derived OCB measure, useful in illustrating the differences in OCBs overall across response types (Meyers, Gamst & Guarino, 2006).
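The MANOVA itself can be run in standard statistical software; the sketch below shows the procedure in Python’s statsmodels on simulated placeholder data (hypothetical column names, random scores), so the output illustrates the mechanics rather than the study’s results.

```python
# A minimal sketch of the one-way MANOVA: the three-level response profile as the
# factor and the four OCB subscales as dependent variables. Data are simulated
# placeholders using the group sizes from Table 1.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(2)
profiles = np.repeat(["respondent", "passive_nr", "active_nr"], [98, 27, 9])
df = pd.DataFrame(rng.standard_normal((len(profiles), 4)),
                  columns=["conscientiousness", "civic_virtue", "courtesy", "altruism"])
df["profile"] = profiles

maov = MANOVA.from_formula(
    "conscientiousness + civic_virtue + courtesy + altruism ~ profile", data=df)
print(maov.mv_test())   # Wilks' lambda, Pillai's trace, etc., for the profile effect
```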


A priori tests confirmed that the criteria for MANOVA were met, and subsequently, a statistically significant MANOVA effect was obtained. The results implied that 16% of the variance in the canonically-derived OCB scores was accounted for by faculty membership in the respondent, passive nonrespondent, and active nonrespondent groups. Tests showed (Figure 1) that differences in the supervariable means between all groups met a p<.05 or better threshold of statistical significance and were most substantive between active nonrespondents and other groups. With a strong observed power, the MANOVA affirmed that, even given the limitations of analyzing small numbers of nonrespondents, this study’s planned comparisons of OCB dimensions between respondent types were warranted. Table 4 presents the similarities and differences in organizational citizenship behaviors between survey response profiles.

Accounting for the limitation of sample size, the results suggest that, compared to respondents, active nonrespondents rate themselves lower on conscientiousness and civic virtue behaviors by significant margins (Figure 2). Passive nonrespondents and respondents showed no substantive differences (at most a marginal one) on any of the four OCB subscales in the analytic sample. The standardized discriminant function coefficients in the MANOVA support this finding; that is, higher scores on conscientiousness and civic virtue account for the greatest share of differences between active nonrespondents, passive nonrespondents, and respondents.
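Because Table 4 reports pairwise contrasts with effect sizes expressed as Cohen’s d, a short sketch of that computation may be useful. The scores below are simulated from the civic-virtue means and standard deviations reported in Table 4 purely for illustration; the function and group names are assumptions, not the study’s data.

```python
# A minimal sketch of a pooled-SD Cohen's d for the pairwise contrasts in Table 4.
# Group scores are simulated from the reported civic-virtue means/SDs (illustrative only).
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation of the two groups."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(3)
respondents = rng.normal(0.18, 0.88, size=98)   # civic virtue, respondents (Table 4)
active_nr   = rng.normal(-1.09, 1.02, size=9)   # civic virtue, active nonrespondents
print(round(cohens_d(respondents, active_nr), 2))
```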


DISCUSSION & IMPLICATIONS

Although the sample was small, the proportion of active nonrespondents identified in this faculty population is comparable to findings by others (Sosdian & Sharp, 1980; Rogelberg et al., 2000; Rogelberg et al., 2003; Youssefnia, 2000), who place this group of hard-core refusers at 15% or less. The cautious analysis that followed confirmed a positive relationship between OCBs and survey response behaviors. OCBs differ across survey response behaviors and help explain faculty’s intentions to respond to organizational surveys. These findings suggest that OCB-based frameworks are appropriate for the study of survey nonresponse in faculty.

Among university faculty, civic virtue was by far the greatest predictor of intentions to complete a workplace attitudes survey. This finding differs considerably from that of Spitzmüller et al. (2007), who found civic virtue to be the weakest and courtesy to be the only significant OCB predictor of response intentions among firefighter cadets. Civic virtue outweighed other OCBs in faculty survey behaviors, too. As found by Spitzmüller et al. (2007), active nonrespondents reported that they were less likely than respondents to engage in these optional organizational activities. Where findings on other populations were mixed (Rogelberg et al., 2003; Rogelberg et al., 2006; Spitzmüller et al., 2007), my study also found that active nonrespondent faculty were less likely than respondents to engage in the conscientious behaviors that indirectly contribute to the organization. Unlike prior research (Spitzmüller et al., 2007), I found no differences between the three response profiles on helping behaviors directed toward others (altruism) or on taking steps to prevent problems with others (courtesy).

It may not be coincidental that both civic virtue and conscientiousness are weaker in active nonrespondents than in respondents. Together, these dimensions describe behaviors that are directed toward the organization (OCB-O); courtesy and altruism, on the other hand, are behaviors directed toward individuals (OCB-I) (Hoffman et al., 2007; Williams & Anderson, 1991). This two-dimensional conceptualization of OCB places the faculty experience in sharp relief: service to the institution, not necessarily to one another, is expected of faculty outside the required tasks of teaching and research (Blackburn & Lawrence, 1995). Although performing “for the good of the organization” is required in general, it is at the discretion of the individual faculty member to choose how much time and where to engage in a vast and vaguely-bounded range of assignments (Blackburn & Lawrence, 1995, p. 222).

OCB-O, then, is written in the faculty DNA; opportunities for OCB-I, on the other hand, especially toward other faculty, are limited in the “shrinking collegial atmosphere” of colleges and universities (O’Meara et al., 2008, p. 86). “Competition has replaced collegiality” in institutions made up of “superspecialists,” immersed by their training “in a language all their own,” and left to be “foreigners in their own land” (Blackburn & Lawrence, 1995, p. 3). In today’s university, the norm of service, that vessel of civic virtue and conscientious behaviors, is what drives faculty from their disciplinary “silos” into contact with one another. Only then, as an indirect consequence, might faculty find themselves engaging in the altruistic and courteous behaviors measured by this study’s OCB inventory.

Therefore, in the faculty context, perhaps more so than in populations previously studied (Rogelberg et al., 2003; Rogelberg et al., 2006; Spitzmüller et al., 2007), survey response is associated particularly with civic virtue’s expression of “responsible, constructive involvement in the political process” of the university (Organ, 1988, p. 96). It follows that nonresponse will not be randomly missing data in university surveys about the extent of faculty participation in campus governance. Survey nonparticipation by faculty who disengage from other forms of service may result in a restriction of range, and a tendency-to-the-positive effect (Taris & Schreurs, 2007), on estimates of faculty efforts in institutional service. Given this conceptual link between OCB-O and institutional service in the academy, on the one hand, and this study’s findings of a relationship between OCB-O (especially civic virtue) and survey response behaviors, on the other, colleges and universities with strong cultures of institutional service should enjoy higher response rates on their surveys of faculty. Conversely, low response rates in a department, division, or across the university may be interpreted as diagnostic indicators of faculty disengaged from institutional service.

For researchers, this study should draw new scrutiny to survey research utilizing samples supplemented by a pool of past survey completers (e.g., Hurtado et al., 2012) who, as high-OCB individuals, will introduce greater nonresponse error (Peytcheva & Groves, 2009). Because high OCBs correspond with higher job satisfaction and generally more positive attitudes about employees’ organizations, reliance on a supplemental sample of high-OCB individuals will produce population estimates that overstate the positive attitudes of faculty toward their institutions and other attitudinal measures in that instrument (Brüggen & Dholakia, 2010; Hoffman et al., 2007; Ilies, Fulmer, Spitzmüller, & Johnson, 2009; LePine et al., 2002; Rogelberg et al., 2000; Rogelberg et al., 2003; Taris & Schreurs, 2007). Therefore, studies that exaggerate the participation of faculty already inclined to be respondents will only aggravate the restriction-of-range effect that bedevils survey research.

More studies of the characteristics and motivations of faculty nonrespondents would help academic and institutional researchers in developing guidelines to encourage broader participation in faculty surveys and to make better statistical accommodations for missing data. New findings could also clarify the magnitude and direction of limitations to survey research on faculty. Given the survivorship bias suggested by this study, colleges and universities should take greater care to elicit the muted voices of faculty who consistently withdraw from opportunities to influence institutional decision making.


TABLES

Table 1. Descriptive characteristics of participants by response behavior and response profile.

                   All participants with            Analytic sample
                   known survey behaviors
                   Respondent   Nonrespondent       Respondent   Passive NR   Active NR
Total N            124          63                  98           27           9
Gender
  Female           43%          37%                 44%          41%          33%
  Male             57%          63%                 56%          59%          67%
Tenure status
  Pre-tenure       17%          14%                 17%          11%          11%
  Tenured          83%          86%                 83%          89%          89%
Rank
  Assistant        16%          14%                 16%          7%           22%
  Associate        33%          48%                 35%          48%          67%
  Full             51%          38%                 49%          41%          11%

Note: NR = Nonrespondent.

Table 2. Correlation analysis among OCB factors, survey response intentions, and survey response behaviors.

Variable                   N      1        2        3        4        5
1 OCB Conscientiousness    262    (.67)
2 OCB Civic Virtue         264    .37***   (.75)
3 OCB Courtesy             264    .52***   .35***   (.73)
4 OCB Altruism             256    .50***   .43***   .53***   (.72)
5 Response intention       269    .20***   .44***   .20***   .23***   --
6 Response behavior        187    .09      .22**    .08      -.03     .36***

Note: Cronbach’s α is reported on the diagonal for OCB composite measures. Correlations reported are Pearson’s r. Response behavior: 0 = did not respond to prior faculty survey, 1 = responded to prior faculty survey. *p<.05; **p<.01; ***p<.001.


Table 3. Multiple regression analysis for OCB subscales predicting response intentions (z-score) (n = 247).

Variable             SE B     β
Conscientiousness    .06      .03
Civic virtue         .06      .41***
Courtesy             .06      .06
Altruism             .06      .02

Note: ***p<.001, R² = .21, adjusted R² = .20.


Table 4. Standardized means, standard deviations (SD), and significance levels of differences for OCB subscales by response profile (for the analytic sample) and by survey behaviors (for all participants with a known prior faculty survey status).

Analytic sample:

                   (A) Active NR     (P) Passive NR    (R) Respondent     Significance of column differences
                   (n = 9)           (n = 27)          (n = 98)
                   Mean     SD       Mean     SD       Mean     SD        R - A       R - P      P - A
Conscientiousness  -.95     1.32     -.14     1.04     .08      .93       .037 *      .891       .209
Civic virtue       -1.09    1.02     -.22     .90      .18      .88       .003 **     .139       .099 †
Courtesy           -.29     1.05     -.13     .99      .03      .97       1.000       1.000      1.000
Altruism           -.12     .54      .09      .84      .08      .94       1.000       1.000      1.000

All participants with known survey behaviors:

                   Nonrespondent     Respondent        NR - R
                   (n = 63)          (n = 124)
                   Mean     SD       Mean     SD
Conscientiousness  -.19     1.06     .01      .97
Civic virtue       -.36     .81      .08      .97      **
Courtesy           -.14     1.00     .02      .96
Altruism           .05      .73      .00      .96

Note: NR = Nonrespondents. Effect size is Cohen’s d. † p<.10, *p<.05, **p<.01.


FIGURES

Figure 1. Central tendency and spread of the canonically derived OCB supervariable (z-score) for respondents, passive nonrespondents, and active nonrespondents.


Figure 2. Central tendency and spread of four OCB subscale estimates (z-scores) for respondents, passive nonrespondents, and active nonrespondents.


REFERENCES

Barge, S. & Gehlbach, H. (2012). Using the theory of satisficing to evaluate the quality of survey data. Research in Higher Education, 53(2), 182-200.
Barr, C., Spitzmüller, C., & Stuebing, K. (2008). Too stressed out to participate? Examining the relation between stressors and survey response behavior. Journal of Occupational Health Psychology, 13(3), 232-243.
Blackburn, R. T. & Lawrence, J. H. (1995). Faculty at work: Motivation, expectation, satisfaction. Baltimore: Johns Hopkins University Press.
Bradburn, N. M. (1978). Respondent burden. Paper presented at the American Statistical Association, Proceedings of the Survey Research Methods Section.
Brennan, M. & Hoek, J. (1992). The behavior of respondents, nonrespondents, and refusers across mail surveys. Public Opinion Quarterly, 56(4), 530-535.
Brüggen, E. & Dholakia, U. M. (2010). Determinants of participation and response effort in web panel surveys. Journal of Interactive Marketing, 24(3), 239–250.
Cote, L. S., Frinnell, R. E., & Tompkins, L. D. (1986). Increasing response rates to mail surveys: The impact of adherence to Dillman-like procedures and techniques. Review of Higher Education, 9(2), 229-242.
Couper, M. & Groves, R. (1996). Social environmental impacts on survey cooperation. Quality and Quantity, 30(2), 173-188.
de Leeuw, E. & de Heer, W. (2002). Trends in household survey nonresponse: A longitudinal and international comparison. In R. M. Groves, D. A. Dillman, J. L. Eltinge & R. J. A. Little (Eds.), Survey nonresponse, 41-54. New York: Wiley.
Dooley, L. M. & Lindner, J. R. (2003). The handling of nonresponse error. Human Resource Development Quarterly, 14(1), 99-110.
Finkelstein, M. J., Seal, R. K., & Schuster, J. H. (1998). The new academic generation: A profession in transformation. Baltimore: Johns Hopkins University Press.
Goyder, J. (1986). Surveys on surveys: Limitations and potentialities. Public Opinion Quarterly, 50(1), 27-41.
Grosset, J. M. (1995). The biasing effects of bad addresses on information gathered by mail surveys. Journal of Applied Research in the Community College, 2(2), 179-191.
Hoffman, B. J., Blair, C. A., Meriac, J. P., & Woehr, D. J. (2007). Expanding the criterion domain? A quantitative review of the OCB literature. Journal of Applied Psychology, 92(2), 555-566.
Huberty, C. J. & Petoskey, M. D. (2000). Multivariate analysis of variance and covariance. In H. Tinsley and S. Brown (Eds.), Handbook of applied multivariate statistics and mathematical modeling. New York: Academic Press.
Hurtado, S., Eagan, M. K., Pryor, J. H., Whang, H., & Tran, S. (2012). Undergraduate teaching faculty: The 2010–2011 HERI Faculty Survey. Los Angeles: Higher Education Research Institute, UCLA.
Ilies, R., Fulmer, I. S., Spitzmüller, M., & Johnson, M. (2009). Personality and citizenship behavior: The mediating role of job satisfaction. Journal of Applied Psychology, 94(4), 945-959.
Jones, J. (1996). The effects of non-response on statistical inference. Journal of Health and Social Policy, 8, 49-62.
Kim, J., Gershenson, C., Glaser, P., & Smith, T. W. (2011). The polls: Trends in surveys on surveys. Public Opinion Quarterly, 75(1), 165-191.
LePine, J. A., Erez, A., & Johnson, D. E. (2002). The nature and dimensionality of organizational citizenship behavior: A critical review and meta-analysis. Journal of Applied Psychology, 87, 52–65.
Lindner, J. R., Murphy, T. H., & Briers, G. H. (2001). Handling nonresponse in social science research. Journal of Agricultural Education, 42(4), 43-53.
Menachemi, N. (2011). Assessing response bias in a web survey at a university faculty. Evaluation & Research in Education, 24(1), 5-15.
Meyers, L. S., Gamst, G., & Guarino, A. (2006). Applied multivariate research: Design and interpretation. Thousand Oaks, CA: Sage Publishers.
O’Meara, K., Terosky, A. L., & Neumann, A. (2008). Faculty careers and work lives: A professional growth perspective. San Francisco: Jossey-Bass.
Organ, D. W. (1988). Organizational citizenship behavior: The good soldier syndrome. Lexington, MA: Lexington Books.
Peterson, M. W., Einarson, M. K., Augustine, C. H., & Vaughan, D. S. (1999). Institutional support for student assessment: Methodology and results of a national survey. Stanford, CA: National Center for Postsecondary Improvement.
Peytcheva, E. & Groves, R. M. (2009). Using variation in response rates of demographic subgroups as evidence of nonresponse bias in survey estimates. Journal of Official Statistics, 25(2), 193-201.
Podsakoff, P. M., MacKenzie, S. B., & Moorman, R. H. (1990). Transformational leader behaviors and their effects on followers’ trust in leader, satisfaction, and organizational citizenship behaviors. Leadership Quarterly, 1(2), 107-142.
Podsakoff, P. M., MacKenzie, S. B., Paine, J. B., & Bachrach, D. G. (2000). Organizational citizenship behaviors: A critical review of the theoretical and empirical literature and suggestions for future research. Journal of Management, 26, 513-563.
Porter, S. R. & Umbach, P. (2006). Student survey response rates across institutions: Why do they vary? Research in Higher Education, 47(2), 229-247.
Porter, S. R. & Whitcomb, M. E. (2005). Non-response in student surveys: The role of demographics, engagement, and personality. Research in Higher Education, 46(2), 127-152.
Rogelberg, S. G. & Luong, A. (1998). Nonresponse to mailed surveys: A review and guide. Current Directions in Psychological Science, 7(2), 60-65.
Rogelberg, S. G., Luong, A., Sederburg, M. E., & Cristol, D. S. (2000). Employee attitude surveys: Examining the attitudes of noncompliant employees. Journal of Applied Psychology, 85(2), 284-293.
Rogelberg, S. G., Conway, J. M., Sederburg, M. E., Spitzmüller, C., Aziz, S., & Knight, W. E. (2003). Profiling active and passive nonrespondents to an organizational survey. Journal of Applied Psychology, 88(6), 1104-1114.
Rogelberg, S. G., Spitzmüller, C., Little, I., & Reeve, C. L. (2006). Understanding response behavior to an online special topics organizational satisfaction survey. Personnel Psychology, 59(4), 903-923.
Rogelberg, S. G. & Stanton, J. M. (2007). Introduction: Understanding and dealing with organizational survey nonresponse. Organizational Research Methods, 10(2), 195-209.
Schiltz, M. E. (1988). Professional standards for survey research. Research in Higher Education, 28(1), 67-75.
Schuster, J. H. & Finkelstein, M. J. (2006). The American faculty: The restructuring of academic work and careers. Baltimore: Johns Hopkins University Press.
Snover, L., Terkla, D. G., Kim, H., Decarie, L., & Brittingham, B. (2010, June). NEASC Commission on Higher Education requirements for documentation of assessment of student learning and student success in the accreditation process: Four case studies. In J. Carpenter-Hubin (Chair), Charting our future in higher education. 50th Annual Forum of the Association for Institutional Research (AIR), Chicago, IL.
Sosdian, C. P. & Sharp, L. M. (1980). Nonresponse in mail surveys: Access failure or respondent resistance. Public Opinion Quarterly, 44, 396-402.
Spitzmüller, C., Glenn, D. M., Barr, C. D., Rogelberg, S. G., & Daniel, P. (2006). “If you treat me right, I reciprocate”: Examining the role of exchange in organizational survey response. Journal of Organizational Behavior, 27(1), 19-35.
Spitzmüller, C., Glenn, D. M., Sutton, M. M., Barr, C. D., & Rogelberg, S. G. (2007). Survey nonrespondents as bad soldiers: Examining the relationship between organizational citizenship and survey response behavior. International Journal of Selection & Assessment, 15(4), 449-459.
Taris, T. W. & Schreurs, P. J. G. (2007). How may nonresponse affect findings in organizational surveys? The tendency-to-the-positive effect. International Journal of Stress Management, 14(3), 249-259.
Werner, S., Praxedes, M., & Kim, H. G. (2007). The reporting of nonresponse analyses in survey research. Organizational Research Methods, 10(2), 287-295.
Williams, L. I. & Anderson, S. E. (1991). Job satisfaction and organizational commitment as predictors of organizational citizenship and in-role behaviors. Journal of Management, 17(3), 601-617.
Youssefnia, D. (2001). Examining organizational survey response through an organizational citizenship framework (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses (275927939).