
Page 1: Unit nonresponse when collecting data

Unit nonresponse when collecting data. When and how much should we worry?

Mario Callegaro, Survey Research and Methodology Program, University of Nebraska-Lincoln

SSP Workshop March 30th 2007

Acknowledgement: the author thanks Nancy Bates of the U.S. Census Bureau for providing recent response rate trends for surveys sponsored by federal agencies.

Page 2: Unit nonresponse when collecting data


Workshop overview
- Unit nonresponse: definition
- Computing response rates (AAPOR standards)
- Trends in response rates
- Nonresponse bias
- Increasing response rates
- Measuring nonresponse bias
- OMB guidelines for U.S. Federal government-funded surveys
- Books on nonresponse

Page 3: Unit nonresponse when collecting data


Nonresponse

Unit nonresponse: the failure to obtain an interview from the selected person
1. Failure to deliver the survey request (e.g., noncontact)
2. Refusal to participate
3. Inability to participate

Item nonresponse: the failure to obtain answers to some questions from the selected respondent

Page 4: Unit nonresponse when collecting data


Computing response rates

The AAPOR (2006) standard definitions make it possible to compute response rates for:
- Face-to-face surveys
- Telephone surveys
- Mail surveys
- Internet surveys

The AAPOR standards are endorsed by many journals (e.g., POQ, APSR) and, more recently, by CMOR and OMB.

Page 5: Unit nonresponse when collecting data


RDD AAPOR disposition codes (initial sample: all phone numbers)

Known eligibility
- Eligible
  - Response: complete interview (I); partial interview (P)
  - Nonresponse: refusal (R); noncontact (NC); other non-interview (O)
- Not eligible: out of sample; fax/data line; nonworking/disconnected number; special technological circumstances; nonresidence; not-eligible respondent; quota filled

Unknown eligibility
- Unknown if housing unit (UH)
- Unknown if eligible respondent; other (UO)

Page 6: Unit nonresponse when collecting data


Final disposition codes and rates
- RR = response rate; COOP = cooperation rate; REF = refusal rate; CON = contact rate
- I = complete interview
- P = partial interview
- R = refusal and break-off
- NC = noncontact
- O = other non-interview
- UH = unknown if household/occupied housing unit
- UO = unknown, other
- e = estimated proportion of cases of unknown eligibility that are eligible
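To connect these codes to case-level data, here is a small illustrative Python sketch that tallies final dispositions into the AAPOR categories above before the rate formulas on the next slides are applied; the disposition strings are made-up labels, not official AAPOR numeric codes.

```python
from collections import Counter

# Made-up case-level outcomes; a real study would use AAPOR's numeric codes.
cases = ["complete", "refusal", "noncontact", "complete", "partial",
         "unknown_household", "fax_data_line", "refusal", "other_noninterview"]

# Map each outcome to an AAPOR category (I, P, R, NC, O, UH, UO) or ineligible.
category_of = {
    "complete": "I", "partial": "P", "refusal": "R", "noncontact": "NC",
    "other_noninterview": "O", "unknown_household": "UH", "unknown_other": "UO",
    "fax_data_line": "not_eligible", "nonworking": "not_eligible",
}

counts = Counter(category_of[c] for c in cases)
print(counts)  # e.g. Counter({'I': 2, 'R': 2, ...}) -- these counts feed the rate formulas
```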

Page 7: Unit nonresponse when collecting data


Response rates

$$\mathrm{RR2} = \frac{(I + P)}{(I + P) + (R + NC + O) + (UH + UO)}$$

$$\mathrm{RR4} = \frac{(I + P)}{(I + P) + (R + NC + O) + e\,(UH + UO)}$$

Page 8: Unit nonresponse when collecting data


Other rates

$$\mathrm{COOP2} = \frac{(I + P)}{(I + P) + R + O}$$

$$\mathrm{REF3} = \frac{R}{(I + P) + (R + NC + O)}$$

$$\mathrm{CON3} = \frac{(I + P) + R + O}{(I + P) + R + O + NC}$$
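For readers who want to compute these quantities directly, here is a minimal Python sketch that applies the AAPOR formulas above to a set of final disposition counts; the function name and the example counts are illustrative, not taken from the workshop.

```python
def aapor_rates(I, P, R, NC, O, UH, UO, e=1.0):
    """Compute selected AAPOR outcome rates from final disposition counts.

    I, P     : complete and partial interviews
    R, NC, O : refusals, noncontacts, other non-interviews (eligible cases)
    UH, UO   : cases of unknown eligibility
    e        : estimated proportion of unknown-eligibility cases that are eligible
    """
    interviews = I + P
    eligible_nonresponse = R + NC + O
    unknown = UH + UO

    rr2 = interviews / (interviews + eligible_nonresponse + unknown)
    rr4 = interviews / (interviews + eligible_nonresponse + e * unknown)
    coop2 = interviews / (interviews + R + O)
    ref3 = R / (interviews + eligible_nonresponse)
    con3 = (interviews + R + O) / (interviews + R + O + NC)
    return {"RR2": rr2, "RR4": rr4, "COOP2": coop2, "REF3": ref3, "CON3": con3}

# Illustrative counts only (not data from any survey cited in the workshop)
print(aapor_rates(I=600, P=50, R=400, NC=300, O=50, UH=200, UO=100, e=0.5))
```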

Page 9: Unit nonresponse when collecting data


Trends in response rates
- Academic surveys
- Commercial surveys
- Face-to-face government surveys

Page 10: Unit nonresponse when collecting data


Curtin et al. (2005) Telephone RR2 1979-2003 Survey of Consumer Attitudes

Page 11: Unit nonresponse when collecting data


Curtin et al. (2005) Final refusal rate 1979-2003 Survey of Consumer Attitudes

Page 12: Unit nonresponse when collecting data


Curtin et al. (2005) Noncontact rate 1979-2003 Survey of Consumer Attitudes

Page 13: Unit nonresponse when collecting data


Gallup RDD, 1997-2003 (Tortora, 2004): contact rates

[Figure: quarterly contact rates, December 1997 to September 2003; y-axis in percent, roughly 50-70%]

Page 14: Unit nonresponse when collecting data


Gallup RDD, 1997-2003 (Tortora, 2004): answering machine rates

[Figure: quarterly answering machine rates, December 1997 to September 2003; y-axis in percent, roughly 0-20%]

Page 15: Unit nonresponse when collecting data


Gallup RDD, 1997-2003 (Tortora, 2004): response rates

[Figure: quarterly response rates, December 1997 to September 2003; y-axis in percent, roughly 20-40%]

Page 16: Unit nonresponse when collecting data


National Immunization Survey, telephone, 1995-2004 (Battaglia et al., in press): CASRO response rates

[Figure: annual CASRO response rates, 1995-2004; y-axis in percent, roughly 65-85%]

Page 17: Unit nonresponse when collecting data


Telephone response rates for news media and government contractor surveys, 1996-2005 (Holbrook, Krosnick, & Pfent, in press)

Variable   Mean   SD    Min   Max   N
RR3        .30    .13   .04   .70   114
CON2       .67    .13   .33   .92   114
COOP1      .44    .15   .09   .84   114
REF2       .29    .09   .04   .55   114

Page 18: Unit nonresponse when collecting data


Face-to-face U.S. federal agency surveys, 1990-2005: first-interview nonresponse rates

Page 19: Unit nonresponse when collecting data


Face-to-face U.S. federal agency surveys, 1990-2005: first-interview refusal rates

Page 20: Unit nonresponse when collecting data


Face-to-face U.S. federal agency surveys, 1990-2005: first-interview no-one-home rates

Page 21: Unit nonresponse when collecting data


Nonresponse bias

$$\mathrm{Bias}(\bar{y}_r) = \frac{n_{nr}}{n}\,(\bar{y}_r - \bar{y}_{nr})$$

where
- $\bar{y}_r$ is the value of $y$ for the respondents (observed)
- $\bar{y}_{nr}$ is the value of $y$ for the nonrespondents (not observed)
- $n$ is the selected sample size, $n_r$ the number of respondents, and $n_{nr}$ the number of nonrespondents, so $n = n_r + n_{nr}$
- $(\bar{y}_r - \bar{y}_{nr})$ is the difference between respondents and nonrespondents
- $n_{nr}/n$ is the nonresponse rate
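A small worked example with made-up numbers may help fix ideas (these values are illustrative only):

$$n = 1000,\quad n_{nr} = 300,\quad \bar{y}_r = 0.52,\quad \bar{y}_{nr} = 0.40 \;\Rightarrow\; \mathrm{Bias}(\bar{y}_r) = \frac{300}{1000}\,(0.52 - 0.40) = 0.036,$$

i.e., the respondent mean overstates the full-sample mean by 3.6 percentage points.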

Page 22: Unit nonresponse when collecting data


Percent absolute nonresponse bias across 30 studies (Groves, 2006, p. 659)

$$100 \times \frac{\left|\,\bar{y}_r - \bar{y}_n\,\right|}{\bar{y}_n}$$

where $\bar{y}_n$ is the full-sample mean and $\bar{y}_r$ the respondent mean.

[Figure: distribution of percent absolute nonresponse bias estimates across the 30 studies]
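Continuing the made-up numbers from the previous example: the full-sample mean is $\bar{y}_n = 0.7 \times 0.52 + 0.3 \times 0.40 = 0.484$, so the percent absolute relative bias is $100 \times |0.52 - 0.484| / 0.484 \approx 7.4\%$.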

Page 23: Unit nonresponse when collecting data


Relationship between response rates and bias in exit polls (Bautista, Callegaro, Vera, & Abundis, in press)

[Figure: scatter plot of MWPE (y-axis, -1 to 1) against response rates (x-axis, 0% to 100%)]

Page 24: Unit nonresponse when collecting data


Increasing response rates (Groves et al., 2004, p. 190)

Contactability
- Length of data collection
- Number and timing of calls
- Interviewer workload
- Interviewer observations

Initial decision
- Prenotice
- Incentives
- Burden / mini-questionnaire
- Use of proxy
- Interviewer match
- Interviewer behavior
- Sponsorship

Final decision
- Mode switch
- Interviewer switch
- Two-phase sampling
- Persuasion letter

Page 25: Unit nonresponse when collecting data


Estimating nonresponse error

To estimate nonresponse error, we need an estimate of $(\bar{y}_r - \bar{y}_{nr})$.

Page 26: Unit nonresponse when collecting data


Approaches to estimating nonresponse error (depending on the data source, comparisons are possible on some/most variables or on all variables):
- Use sampling frame information
- Use linked or administrative data
- Use interviewer-supplied information
- Compare with aggregate data
- Survey nonrespondents, or a subsample of them
- Use a surrogate for nonrespondents: late respondents, converted refusals, reluctant respondents
- Use panel dropouts

Page 27: Unit nonresponse when collecting data


Office of Management and Budget (OMB) guidelines (2006) for unit nonresponse

- ICRs for surveys with expected response rates of 80 percent or higher need complete descriptions of the basis of the estimated response rate.
- ICRs for surveys with expected response rates lower than 80 percent need complete descriptions of how the expected response rate was determined, a detailed description of steps that will be taken to maximize the response rate, and a description of plans to evaluate nonresponse bias.

ICR = Information Collection Request

Page 28: Unit nonresponse when collecting data


Q71. How can agencies examine potential nonresponse bias? (I)

- Agencies should consult with professional statisticians and survey methodologists to ensure that potential nonresponse bias is addressed in the design of the study.
- At a minimum, agencies should plan to compare respondents and nonrespondents on information available from the sampling frame (see the sketch after this list).
- In addition, agencies should seek out other available external information that they may be able to match to their sampling frame.
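As one illustration of the frame-comparison idea, here is a minimal pandas sketch; the file name, the response indicator `responded`, and the frame variables `region` and `urban` are hypothetical placeholders, not anything prescribed by OMB or used in the workshop.

```python
import pandas as pd

# Hypothetical frame file: one row per sampled case, with a 0/1 response
# indicator and a few variables known for every case from the sampling frame.
frame = pd.read_csv("sample_frame_with_outcomes.csv")

frame_vars = ["region", "urban"]  # placeholders for variables available on the frame

# Compare the distribution of each frame variable for respondents vs. nonrespondents.
for var in frame_vars:
    comparison = (
        frame.groupby("responded")[var]
        .value_counts(normalize=True)
        .unstack(fill_value=0)
    )
    print(f"\nDistribution of {var} by response status (0 = nonrespondent, 1 = respondent)")
    print(comparison.round(3))
```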

Page 29: Unit nonresponse when collecting data


Q71. How can agencies examine potential nonresponse bias? (II)

- If this kind of information is not available, there are other possibilities to consider, such as mapping telephone exchanges in an RDD survey to census tracts or zip codes and then matching with aggregated data from the Census long form [see American FactFinder]; a sketch of this kind of linkage follows this list.
- Another source of information in longitudinal surveys is to compare respondents and nonrespondents on characteristics gathered at prior waves.
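A minimal sketch of the geographic-linkage idea, assuming a lookup from telephone exchange to ZIP code and a table of ZIP-level census aggregates are already available; all file and column names here are hypothetical.

```python
import pandas as pd

# Hypothetical inputs:
#   rdd_sample.csv             one row per sampled number: exchange, responded (0/1)
#   exchange_to_zip.csv        lookup table: exchange -> zip
#   zip_census_aggregates.csv  ZIP-level aggregates, e.g. median_income, pct_renters
sample = pd.read_csv("rdd_sample.csv")
exchange_zip = pd.read_csv("exchange_to_zip.csv")
zip_aggregates = pd.read_csv("zip_census_aggregates.csv")

# Attach a ZIP code to every sampled number, then attach the ZIP-level aggregates.
linked = sample.merge(exchange_zip, on="exchange", how="left").merge(
    zip_aggregates, on="zip", how="left"
)

# Compare area-level characteristics of respondents and nonrespondents.
print(linked.groupby("responded")[["median_income", "pct_renters"]].mean().round(1))
```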

Page 30: Unit nonresponse when collecting data


Q71. How can agencies examine potential nonresponse bias? (III)

- When there are no good sources of information about respondents and nonrespondents on the substantive variables of interest, agencies can also use additional follow-up procedures with an abbreviated questionnaire.
- Sometimes these follow-up studies are done by selecting a probability sample of nonrespondents.
- Agencies can also assess potential nonresponse bias by analyzing differences between respondents and initial refusals (who were later "converted").

Page 31: Unit nonresponse when collecting data


Books on nonresponse
- The silent minority: Nonrespondents in sample surveys. John Goyder, 1987.
- Interviewer approaches. Jean Morton-Williams, 1993.
- [Covers of three further books on nonresponse, published in 1998, 2002, and 2005, shown on the slide]

Page 32: Unit nonresponse when collecting data

References on nonresponse research

American Association for Public Opinion Research. (2006). Standard definitions: Final dispositions of case codes and outcome rates for surveys (4th ed.). Lenexa, KS: AAPOR. Available at: http://www.aapor.org/standards.asp

Battaglia, M. P., Khare, M., Frankel, M., Murray, M. C., Buckley, P., & Peritz, S. (in press). Response rates: How have they changed and where are they headed? In J. Lepkowski, C. Tucker, M. Brick, E. De Leeuw, L. Japec, P. J. Lavrakas, M. Link & R. Sangster (Eds.), Advances in telephone survey methodologies. Hoboken, NJ: Wiley.

Bautista, R., Callegaro, M., Vera, J. A., & Abundis, F. (in press). Studying nonresponse in Mexican exit polls. International Journal of Public Opinion Research.

Curtin, R., Presser, S., & Singer, E. (2005). Changes in telephone survey nonresponse in the past quarter century. Public Opinion Quarterly, 69, 87-98.

Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70, 646-675.

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2004). Survey methodology. Hoboken, NJ: Wiley.

Holbrook, A. L., Krosnick, J. A., & Pfent, A. (in press). The causes and consequences of response rates in surveys by the news media and government contractor survey research firms. In J. Lepkowski, C. Tucker, M. Brick, E. De Leeuw, L. Japec, P. J. Lavrakas, M. Link & R. Sangster (Eds.), Advances in telephone survey methodologies. Hoboken, NJ: Wiley.

Lynn, P. (in press). The problem of nonresponse. In E. De Leeuw, J. Hox & D. A. Dillman (Eds.), International handbook of survey methodology. Lawrence Erlbaum.

Office of Management and Budget. (2006). Questions and answers when designing surveys for information collections. Washington, DC: OMB. Available at: http://www.whitehouse.gov/omb/inforeg/pmc_survey_guidance_2006.pdf

Singer, E. (Ed.). (2006). Nonresponse bias in household surveys [Special issue]. Public Opinion Quarterly, 70(5).

Tortora, R. (2004). Response trends in a national random digit dialing telephone survey. Metodološki zvezki, 1, 21-32. Available at: http://mrvar.fdv.uni-lj.si/pub/mz/mz1.1/tortora.pdf