

Title: The Effects of a Targeted “Early Bird” Incentive Strategy on Response Rates, Fieldwork Effort, and Costs in a National Panel Study
Abstract Adaptive survey designs are increasingly used by survey practitioners to counteract ongoing declines in household survey response rates and to manage rising fieldwork costs. This paper reports findings from an evaluation of an early-bird incentive (EBI) experiment targeting high-effort respondents who participated in the 2019 wave of the US Panel Study of Income Dynamics. We identified a subgroup of high-effort respondents at risk of nonresponse based on their prior-wave fieldwork effort and randomized them to a treatment offering an extra time-delimited monetary incentive for completing their interview within the first month of data collection (treatment group; N = 800) or the standard study incentive (control group; N = 400). In recent waves, we have found that the costs of the protracted fieldwork needed to complete interviews with high-effort cases, in the form of interviewer contact attempts plus an increased incentive near the close of data collection, are extremely high. By incentivizing early participation and reducing the number of interviewer contact attempts and fieldwork days needed to complete the interview, our goal was to manage both nonresponse and survey costs. We found that the EBI treatment increased response rates and reduced fieldwork effort and costs compared to the control group. We review several key findings and limitations, discuss their implications, and identify next steps for future research.
Authors:
Award ID(s):
1623684, 2042875
Publication Date:
NSF-PAR ID:
10334474
Journal Name:
Journal of Survey Statistics and Methodology
ISSN:
2325-0984
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract In recent years, household surveys have expended significant effort to counter well-documented increases in direct refusals and greater difficulty contacting survey respondents. A substantial amount of fieldwork effort in panel surveys using telephone interviewing is devoted to the task of contacting the respondent to schedule the day and time of the interview. Higher fieldwork effort leads to greater costs and is associated with lower response rates. A new approach was experimentally evaluated in the 2017 wave of the Panel Study of Income Dynamics (PSID) Transition into Adulthood Supplement (TAS) that allowed a randomly selected subset of respondents to choose the day and time of their telephone interview through the use of an online appointment scheduler. TAS is a nationally representative study of US young adults aged 18–28 years embedded within the world’s longest-running panel study, the PSID. This paper experimentally evaluates the effect of offering the online appointment scheduler on fieldwork outcomes, including the number of interviewer contact attempts and interview sessions, the number of days to complete the interview, and response rates. We describe panel study members’ characteristics associated with uptake of the online scheduler and examine differences in the effectiveness of the treatment across subgroups. Finally, potential cost savings in fieldwork effort due to the online appointment scheduler are evaluated.
  2. This paper describes the association between an incentive boost and data collection outcomes across two waves of a long-running panel study. In a recent wave, with the aim of achieving response rate goals, all remaining sample members were offered a substantial incentive increase in the final weeks of data collection, despite uncertainty about potential effects on fieldwork outcomes in the following wave. The analyses examine response rates and the average number of interviewer attempts to complete the interview in the waves during and after the incentive boost, and provide an estimate of the cost of the incentives and fieldwork in the waves during and following the boost. The findings provide suggestive evidence that the use of variable incentive strategies from one wave to the next in the context of an ongoing panel study may be an effective strategy to reduce nonresponse and may yield enduring positive effects on subsequent data collection outcomes.
  3. We conducted an experiment to evaluate the effects on fieldwork outcomes and interview mode of switching from a telephone-only design to a web-first mixed-mode data collection design (self-administered web interview and interviewer-administered telephone interview). We examine whether the mixed-mode option leads to better survey outcomes, based on response rates, fieldwork outcomes, interview quality, and costs. We also examine respondent characteristics associated with completing a web interview rather than a telephone interview. Our mode experiment was conducted in the 2019 wave of the Transition into Adulthood Supplement (TAS) to the US Panel Study of Income Dynamics (PSID). TAS collects information biennially from approximately 3,000 young adults in PSID families. The shift to a mixed-mode design for TAS was aimed at reducing costs and increasing respondent cooperation. We found that for mixed-mode cases compared to telephone-only cases, response rates were higher, interviews were completed faster and with lower effort, the quality of the interview data appeared better, and fieldwork costs were lower. A clear set of respondent characteristics reflecting demographic and socioeconomic characteristics, technology availability and use, time use, and psychological health were associated with completing a web interview rather than a telephone interview.
  4. Abstract

    A prior study found that mailing prepaid incentives with $5 cash visible from outside the envelope increased the response rate to a mail survey by 4 percentage points compared to cash that was not externally visible. This “visible cash effect” suggests opportunities to improve survey response at little or no cost, but many unknowns remain. Among them: Does the visible cash effect generalize to different survey modes, respondent burdens, and cash amounts? Does it differ between fresh samples and reinterview samples? Does it affect data quality or survey costs? This article examines these questions using two linked studies in which incentive visibility was randomized in a large probability sample for the American National Election Studies. The first study used $10 incentives with invitations to a long web questionnaire (median 71 minutes, n = 17,849). Visible cash increased response rates in a fresh sample for both screener and extended interview response (by 6.7 and 4.8 percentage points, respectively). Visible cash did not increase the response rate in a reinterview sample where the baseline reinterview response rate was very high (72 percent). The second study used $5 incentives with invitations to a mail-back paper questionnaire (n = 8,000). Visible cash increased the response rate in a sample of prior nonrespondents by 4.0 percentage points (from 31.5 to 35.5), but it did not increase the response rate in a reinterview sample where the baseline reinterview rate was very high (84 percent). In the two studies, several aspects of data quality were investigated, including speeding, non-differentiation, item nonresponse, nonserious responses, noncredible responses, sample composition, and predictive validity; no adverse effects of visible cash were detected, and sample composition improved marginally. Effects on survey costs were either negligible or resulted in net savings. Accumulated evidence now shows that visible cash can increase incentives’ effectiveness in several circumstances.
  5. Background: Internships for college students can enhance their grades, skills, and employment prospects, but finding and completing an internship sometimes requires considerable resources. Consequently, before postsecondary institutions consider mandating this high-impact practice, more evidence is needed regarding the various obstacles students face as they seek an internship. Focus of Study: The purpose of this study was to document the prevalence and nature of obstacles to securing a college internship and how these factors interact in the lives of particular students. Field theory is used to highlight the ways that structural inequalities and forms of capital serve to facilitate or constrain access to an internship experience. Population: The participants in this study included students attending five postsecondary institutions—three comprehensive universities, one historically Black college and university (HBCU), and one technical college in the U.S. states of Maryland, South Carolina, and Wisconsin. Research Design: This concurrent mixed-methods study included the collection of survey (n = 1,549) and focus group and interview (n = 100) data from students who self-selected into the study. Given that this is a descriptive study, the aim was to document student experiences with obstacles to internships using varied sources of data. Data Collection and Analysis: Data were collected via an online survey (with a 26% response rate) and in-person focus groups or interviews at each campus. Data were analyzed using inductive thematic analysis, social network analysis, and logistic regression techniques and interpreted in ways that highlight the situated and critical role of capital and structure in shaping opportunity and behavior.
Findings: Among the 1,060 (69%) survey respondents who reported not having had an internship, 638 indicated that they had in fact wanted to pursue an internship but could not because of the need to work, a heavy course load, insufficient positions, and inadequate pay. The role of financial, social, and cultural capital also impacted students differentially depending on their majors, socioeconomic status, race, and geographic location, highlighting how context and enduring systemic forces—and not solely the possession of capital(s)—intersect to shape students’ abilities to pursue an internship. Conclusion: Internships are not universally accessible to all college students and instead favor students who have access to financial, social, and cultural capital while also being positioned in particular majors, geographic locations, and institutions. Before actively promoting internships for their students, colleges and universities should secure funding to support student pay and relocation costs, identify alternative forms of experiential learning for working students, and engage employers in creating more in-person and online positions for students across the disciplines.