
Cityscape: Volume 18, Number 2

Borrower Beware

Editors
Mark D. Shroder
Michelle P. Matuga

Tracking and Interviewing Family Options Study Participants

Debi McInnis
Abt Associates

Brenda Rodriguez
Abt SRBI


Evaluation Tradecraft

Evaluation Tradecraft presents short articles about the art of evaluation in housing and urban research. Through this department of Cityscape, the Office of Policy Development and Research presents developments in the art of evaluation that might not be described in detail in published evaluations. Researchers often describe what they did and what their results were, but they might not give readers a step-by-step guide for implementing their methods. This department pulls back the curtain and shows readers exactly how program evaluation is done. If you have an idea for an article of about 3,000 words on a particular evaluation method or an interesting development in the art of evaluation, please send a one-paragraph abstract to marina.l.myhre@hud.gov.


Sample retention is a challenge for any longitudinal study; some panel attrition is inevitable, and retention is especially difficult with highly mobile, low-income study participants. This article examines the participant-tracking strategy used in the Family Options Study, conducted by the U.S. Department of Housing and Urban Development. In the Family Options Study, 2,282 homeless families in 12 locations nationwide were randomly assigned to one of three housing and services interventions or to usual care, and the study measures the effects of these interventions on study participants over a three-year followup period. Followup surveys conducted 18 and 36 months after enrollment were the main source of data for measuring the effects of the study interventions. The study used a rigorous participant-tracking approach that yielded high response rates: more than 80 percent of study participants responded to the 18-month survey, and 78 percent responded to the 36-month survey. Approximately 10 percent of total evaluation costs were devoted to participant tracking. The tracking strategy combined telephone, mail, and in-person contacts, applied with varying frequency and intensity. The article examines the roles of local interviewers, participant incentives, continued participant engagement, and administrative data in the tracking strategy. Lessons from the Family Options Study point to the importance of combining methods for successful participant tracking.

