
Using Incentives to Reduce Nonresponse Bias in the American Housing Survey


Report Acceptance Date: September 2023 (143 pages)

Posted Date: December 05, 2023

The American Housing Survey (AHS), like many other federal surveys, has grappled with declining response rates in recent years. Concerned that declining response rates might alter the sample composition of the AHS and bias the resulting survey estimates, HUD contracted the Office of Evaluation Sciences (OES) at the U.S. General Services Administration (GSA) to design and test an incentive experiment to improve data quality. Focusing on survey nonresponse bias, exploiting the longitudinal design of the AHS, and with attention to cost-effective methods, OES developed strategies to target incentives to the households with the highest probabilities of nonresponse. OES hypothesized that providing incentives would increase AHS response rates, and that targeting incentives to specific households would also decrease survey nonresponse bias and be more cost efficient. A secondary hypothesis was that targeting incentives to households with high probabilities of nonresponse would reduce the number of contact attempts made by field representatives. Incentive amounts ($1, $3, $5, $10) were also randomly varied, allowing for additional testing of the marginal returns of monetary incentives of differing amounts, including no incentive.

OES conducted the experiment as part of the 2021 AHS. Using data from the 2019 AHS, OES identified the households with the highest probability of being nonrespondents in 2021. Potential respondent households with similar estimated risks of nonresponse were put into pairs, and each member of a pair was assigned to one of two groups: (1) a “targeted” group, whose households received an incentive if and only if they fell into the 30% with the highest estimated risk of nonresponse, and (2) a “nontargeted” group, whose households each had a 30% probability of receiving an incentive irrespective of their estimated nonresponse risk. The remaining households in the AHS did not receive an incentive and formed a third “no incentive” comparison group. The results of the experiment were mixed. Targeting did not alter the sample composition when comparing the targeted group to the nontargeted group, meaning that targeting did not reduce survey nonresponse bias. The targeted group did have higher response rates than the nontargeted group, suggesting that targeting incentives increased response rates relative to providing nontargeted incentives. There was no evidence that incentives, targeted or random, reduced the number of contact attempts by field representatives. The increase in response rates for the targeted group compared to the nontargeted group, however, was achieved without a corresponding increase in contact attempts. The final report provides additional information about the study design, methods, and results.
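The paired assignment design described above can be sketched in code. This is an illustrative simplification, not OES's actual procedure: the function name, the adjacent-rank pairing rule, and the handling of an unpaired household are assumptions for the sketch; in the real experiment the third comparison group was the remainder of the AHS sample, and risks would come from a model fit on 2019 AHS data.

```python
import random

def assign_incentives(risks, seed=0):
    """Pair households by estimated nonresponse risk and randomize
    each pair between a targeted and a nontargeted incentive arm.

    risks: dict mapping household id -> estimated nonresponse risk
           (e.g., from a model fit on prior-wave data).
    Returns: dict mapping household id -> (arm, gets_incentive).
    """
    rng = random.Random(seed)

    # Order households by risk so that adjacent households, which
    # form a pair, have similar estimated nonresponse risk.
    ordered = sorted(risks, key=risks.get, reverse=True)
    rank = {hh: r for r, hh in enumerate(ordered)}
    cutoff = int(0.30 * len(ordered))  # rank cutoff for the top 30%

    assignments = {}
    for i in range(0, len(ordered) - 1, 2):
        pair = [ordered[i], ordered[i + 1]]
        rng.shuffle(pair)  # randomize which pair member goes to each arm
        targeted, nontargeted = pair
        # Targeted arm: incentive if and only if in the top 30% by risk.
        assignments[targeted] = ("targeted", rank[targeted] < cutoff)
        # Nontargeted arm: incentive with 30% probability, risk ignored.
        assignments[nontargeted] = ("nontargeted", rng.random() < 0.30)
    if len(ordered) % 2:  # an unpaired household, if any, gets no incentive
        assignments[ordered[-1]] = ("no incentive", False)
    return assignments
```

Because pair members are adjacent in the risk ranking, the two arms have nearly identical risk distributions by construction, so a difference in who responds between arms can be attributed to the targeting rule rather than to the underlying risk mix.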

Publication Categories: Publications | AHS Data Based Reports | Quality Control

