Benjamin L. Messer Washington State University PAPOR Mini-Conference, Berkeley, CA June 24, 2011 Item Nonresponse in Mail, Web, and Mixed Mode Surveys: A Summary of the AAPOR Session


Page 1:

Item Nonresponse in Mail, Web, and Mixed Mode Surveys: A Summary of the AAPOR Session

Benjamin L. Messer, Washington State University
PAPOR Mini-Conference, Berkeley, CA, June 24, 2011

Page 2:

2011 AAPOR Session

• Are Measurement and Item Nonresponse Differences a Problem in Web and Mail Mixed-Mode Surveys?
  1) Millar & Dillman, “Do mail and web produce different answers? Mode differences in question response and item nonresponse rates”
  2) Smyth & Olson, “Comparing numeric and text open-end responses in mail and web surveys”
  3) Lesser, Olstad, Yang, & Newton, “Item nonresponse in web and mail response to general public surveys”
  4) Messer, Edwards, & Dillman, “Determinants of web and mail item nonresponse in address-based samples of the general public”
  5) Israel & Lamm, “Item nonresponse in a client survey of the general public”

Page 3:

Background

• Item nonresponse and measurement differences have not been thoroughly tested in web and mail surveys, particularly of the general public

• Both types of error are thought to be similar between the modes since they are self-administered and visual, although some have found that mail obtains higher error rates (i.e. lower data quality)

Page 4:

PAPER 1: Millar & Dillman

• Tested item nonresponse and measurement error in web and mail surveys

• Used data from the WSU Student Experience Survey, in which the population has near-universal web access and both postal and email addresses, in the Spring and Fall of 2009
  – Spring 2009 study: 100 items, 36 questions
  – Fall 2009 study: 76 items, 33 questions

Page 5:

Response Rates

Page 6:

Methods

• Item nonresponse: Calculated percent of respondents missing a response and compared rates with z-tests

• Measurement: Used chi-square tests to test for differences in the distribution of responses
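The item-nonresponse comparison described above can be sketched as a pooled two-proportion z-test. This is a generic illustration, not the authors' code, and the counts below are hypothetical:

```python
import math

def item_nonresponse_rate(responses):
    """Fraction of respondents missing a response, where None marks a blank item."""
    missing = sum(1 for r in responses if r is None)
    return missing / len(responses)

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic for the difference between two item nonresponse rates,
    using the pooled-proportion standard error."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 12 of 300 web respondents vs. 21 of 300 mail respondents
# left a given item blank.
z = two_proportion_z(12 / 300, 300, 21 / 300, 300)
print(round(z, 2))  # -1.61; |z| < 1.96, so no significant difference at the 5% level
```

The chi-square test on response distributions follows the same pattern, comparing the full answer distribution (not just missingness) across modes.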

Page 7:

Item Nonresponse

• No statistically significant differences in overall item nonresponse rates between web and mail modes in each experiment

Page 8:

Item Nonresponse, cont.

• Some individual items exhibited significant mode differences in item nonresponse rates:
  – Multi-item questions
  – Branching questions
  – Open-end questions

Page 9:

Measurement Differences

• Very few significant differences exist
  – Multi-item questions are most likely to exhibit measurement differences

Page 10:

PAPER 2: Smyth & Olson

• Tested web vs. mail measurement differences in open-ended questions, a question type more prone to error in self-administered, visual modes

• Used data from the 2009 Quality of Life in Changing Nebraska Survey (QLCN)
  – 45.6% total response rate

Page 11:

Methods

• Four experimental treatment groups– Mail-only, Mail+Web, Web+Mail, Web-only

• Tested two different open-end question formats
  – 7 number-box items (e.g. date of birth)
    • Measurement differences (distributions of answers to questions)
    • Item nonresponse rates (missing or not missing)
  – 2 text-box items (e.g. descriptions or narratives)
    • Item nonresponse rates (missing or not missing)
    • Quality (e.g. length, number of themes, elaboration)

Page 12:

Results

• Number box:
  – Regression analyses resulted in very few significant differences in either responses or measurement indicators between web and mail modes
  – What few differences occurred were largely due to differential participation in web and mail modes, as well as questionnaire design
• Text box:
  – Analyses show that item nonresponse rates and data quality were very similar across modes
    • However, the differences found to be significant were largely due to questionnaire design and mode

Page 13:

PAPER 3: Lesser, Olstad, Yang, & Newton

• Compared item nonresponse rates across modes and question types and compared unit response between mail, web, and telephone modes

• Used data from the 2008 & 2010 general public surveys conducted for the Oregon Department of Transportation
  – Address-based samples

Page 14:

Methods

• Four groups:
  – Telephone, Mail-only, Web+Mail, Web/Mail (choice)
• Five question types:
  – Likert, Open-end, Filtered, Tabular, & Demographic

Page 15:

Item Nonresponse Results

Item nonresponse rates (%):

Type     Mail   Web+Mail   Web/Mail
Likert   2.4    2.6        2.4
Filter   0.9    2.7        2.7
Table    4.9    5.6        5.2
Open     3.2    4.1        4.3
Demo     4.0    4.3        3.3
Overall  3.5    4.1        3.9

Page 16:

Item Nonresponse Results: 2008

Item nonresponse rates (%):

Type     Mail   Web/Mail   Telephone
Likert   1.8    1.7        0.65
Filter   1.3    1.4        0.44
Table    4.1    3.2        0.20
Open     5.5    4.3        2.55
Demo     3.7    3.6        4.05
Overall  3.0    2.6        0.87

Page 17:

Unit Response Results

[Figure: “Unit Response Rate by Mode and Year” chart; modes Telephone, Mail, Web/Mail, and WM Option; years 2006, 2008, and 2010; y-axis response rate (%), 0 to 45]

Page 18:

PAPER 4: Messer, Edwards, & Dillman

• Tested for item nonresponse differences controlling for survey mode and design, question types, and demographic characteristics

• Used data from three general public surveys with address-based samples:
  – 2007 Lewiston & Clarkston Quality of Life Survey (LCS)
    • Web+Mail, 55%; Mail-only, 66%
    • 92 items, 51 questions
  – 2008 Washington Community Survey (WCS)
    • Web+Mail, 40%; Mail-only, 50%
    • 110 items, 52 questions
  – 2009 Washington Economic Survey (WES)
    • Web+Mail, 50%; Mail-only, 62%
    • 96 items, 57 questions

Page 19:

Methods

• Survey modes: web, mail, & mail-follow-up– Survey designs: web+mail & mail-only

• Question Types– Screened, Multi-item, Open-end, Close-end

• Question Formats– Factual, Attitudinal, Behavioral

• Demographic Characteristics– Gender, age, education, income

• Item nonresponse rate: missing responses divided by total number of possible responses for each respondent
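The per-respondent rate defined above is simple arithmetic; a minimal sketch (the 92-item count echoes the LCS questionnaire, but the answer pattern is hypothetical):

```python
def respondent_item_nonresponse(answers, n_possible):
    """Missing responses divided by the total number of possible responses
    for one respondent; None marks a skipped item."""
    answered = sum(1 for a in answers if a is not None)
    return (n_possible - answered) / n_possible

# Hypothetical respondent who skipped 4 of 92 applicable items
rate = respondent_item_nonresponse([1] * 88 + [None] * 4, 92)
print(f"{rate:.1%}")  # 4.3%
```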

Page 20:

Item Nonresponse Rates by Survey Mode

[Figure: item nonresponse rates by survey mode (Mail-only, Web, Mail follow-up) for the LCS, WCS, and WES experiments; y-axis 0 to 14%]

Page 21:

Question Effects

• The same trend persists across question types and formats, with web obtaining the lowest rate, followed by mail, and then mail follow-up in all three surveys

• In regression analyses, both survey mode and question type & format (e.g. screened, multi-item) are significant predictors of item nonresponse

Page 22:

Demographic Effects

• The same trends in item nonresponse rates are also found across different demographic subgroups.

• We also found that older respondents with less education and income have higher item nonresponse rates
  – Regression analyses controlling for mode and demographics indicate this could be due to differential participation

Page 23:

Item Nonresponse Rates by Survey Design

• Web+mail and mail-only designs obtained statistically similar item nonresponse rates, ranging from 4% to 8%

• Question type and format
  – Regression analyses indicate that question format is a significant predictor of item nonresponse, controlling for design
• Demographic characteristics
  – Older respondents with lower levels of education and income tend to exhibit higher rates, controlling for design
    • Survey design was not a significant predictor

Page 24:

PAPER 5: Israel & Lamm

• Tested for item nonresponse differences controlling for survey mode, question characteristics, and respondent demographics

• Used data from the 2008, 2009, 2010 University of Florida Extension Customer Satisfaction Survey

Page 25:

Methods

• Mail-only and Web-only modes
• Had postal and email addresses of clients
• Used logistic regression and HLM statistical methods
• Demographic characteristics:
  – Gender, education, age, race, & client participation status
• Question types
  – Open-end, screened, demographic, grid, & yes/no
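The logistic-regression step named above can be sketched without external libraries by fitting missingness (1 = item skipped) on respondent covariates via gradient descent. The covariates and data here are invented toy values; the authors' actual model, and their HLM analysis, are richer than this:

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Minimal logistic regression by batch gradient descent: models the
    probability that an item is missing as a function of covariates."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted P(missing)
            err = p - yi
            gb += err
            for j in range(d):
                gw[j] += err * xi[j]
        b -= lr * gb / n
        for j in range(d):
            w[j] -= lr * gw[j] / n
    return w, b

# Toy data: covariates are [mode (1 = web), education (centered)];
# outcome is 1 if the item was left blank.
X = [[1, 2], [1, 0], [1, -1], [0, 1], [0, -2], [0, 0]]
y = [0, 0, 0, 1, 1, 0]
w, b = fit_logistic(X, y)
# In this toy data web respondents never skip, so w[0] comes out negative.
```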

Page 26:

Item Nonresponse Rates by Mode

Year   Mail    Web
2008   7.3%    5.1%
2009   7.1%    6.3%
2010   7.1%    5.5%

Page 27:

Completed Survey Rate by Mode

Year   Mail     Web
2008   28.2%    47.2%
2009   28.4%    48.2%
2010   30.0%    41.3%

Page 28:

Determinants of Item Nonresponse

• Logistic regression analysis shows that demographic characteristics of respondents have little effect on item nonresponse

• Open-end and screened question types yielded the highest item nonresponse rates

• HLM analyses found few mode or demographic effects net of question characteristics

Page 29:

Tying it all together

• Millar & Dillman found few item nonresponse rate differences between web and mail in a population with easy access to both; however, question types were found to influence data quality net of mode

• For the general public, Smyth & Olson discovered that one particular question type obtained variable data quality depending on the survey mode and the format of the question (numeric vs. text), with web obtaining slightly lower nonresponse rates and better data quality

• In multiple general public web and mail surveys, Lesser et al., Messer et al., and Israel & Lamm found that web obtained lower item nonresponse rates. However, when combined with a mail follow-up, web+mail item nonresponse rates approximate those obtained by mail alone. In addition, question characteristics and respondent demographics (Messer et al.) were found to influence item nonresponse rates. Lesser et al. also showed that telephone obtained the lowest item nonresponse rates, as could be expected.

Page 30:

Where we stand now…

• Using similar questionnaires, web may obtain slightly better data quality than mail, at least in general public surveys, but also currently obtains lower unit response rates, indicating a trade-off.

• In addition, web may also attract types of respondents that have a greater propensity to complete the questionnaire compared to mail, and complex question types/formats can produce lower data quality net of survey mode.

Page 31:

In addition….

• I also saw Danna L. Moore present “Driving Respondents to the Web: Experimental Trial of Benefit Appeals and Impacts on Survey Completion”
  – She tested the use of three different letter types
    • One basic letter with no appeals
    • One letter with an appeal to save the government money by using the web (instead of paper)
    • One letter with an appeal to help out the local community
  – Results indicated that the appeal to save money had a positive, significant effect on web response rates, compared with the other two letters
    • About 3 percentage points higher

Page 32:

Citations and Contact Info

• Millar, Morgan M. & Don A. Dillman. 2011. “Improving Response Rates to Web and Mixed-Mode Surveys.” Public Opinion Quarterly 75(2): 249-69.
  – [email protected]
• Smyth, Jolene & Kristen Olson. Unpublished manuscript. “Comparing Numeric and Text Open-End Responses in Mail & Web Surveys.”
  – [email protected]
• Lesser, Virginia, Andy Olstad, Danny Yang & Lydia Newton. Unpublished manuscript. “Item Nonresponse in Web and Mail Response to General Public Surveys.”
  – [email protected]
• Messer, Benjamin & Don A. Dillman. Forthcoming. “Surveying the General Public Over the Internet Using Address-based Sampling and Mail Contact Procedures.” Public Opinion Quarterly.
  – [email protected]
• Israel, Glenn D. & Alex J. Lamm. Unpublished manuscript. “Item Nonresponse in a Client Survey of the General Public.”
  – [email protected]