- Journal Name: Journal of the Royal Statistical Society: Series A (Statistics in Society)
- Sponsoring Org: National Science Foundation
More Like this
Abstract: In recent years, household surveys have expended significant effort to counter well-documented increases in direct refusals and greater difficulty contacting survey respondents. In panel surveys using telephone interviewing, a substantial share of fieldwork effort is devoted to contacting the respondent to schedule the day and time of the interview. Higher fieldwork effort leads to greater costs and is associated with lower response rates. A new approach was experimentally evaluated in the 2017 wave of the Panel Study of Income Dynamics (PSID) Transition into Adulthood Supplement (TAS) that allowed a randomly selected subset of respondents to choose the day and time of their telephone interview through an online appointment scheduler. TAS is a nationally representative study of US young adults aged 18–28 years embedded within the world's longest-running panel study, the PSID. This paper experimentally evaluates the effect of offering the online appointment scheduler on fieldwork outcomes, including the number of interviewer contact attempts and interview sessions, the number of days to complete the interview, and response rates. We describe panel study members' characteristics associated with uptake of the online scheduler and examine differences in the effectiveness of the treatment across subgroups. Finally, potential cost-savings…
Response Times as an Indicator of Data Quality: Associations with Question, Interviewer, and Respondent Characteristics in a Health Survey of Diverse Respondents
Response time (RT) – the time elapsing from the beginning of question reading for a given question until the start of the next question – is a potentially important indicator of data quality that can be reliably measured for all questions in a computer-administered survey using a latent timer (i.e., triggered automatically by moving on to the next question). In interviewer-administered surveys, RTs index data quality by capturing the entire length of time spent on a question–answer sequence, including interviewer question-asking behaviors and respondent question-answering behaviors. Consequently, longer RTs may indicate longer processing or interaction on the part of the interviewer, respondent, or both. RTs are an indirect measure of data quality; they do not directly measure reliability or validity, and we do not directly observe what factors lengthen the administration time. In addition, either too long or too short RTs could signal a problem (Ehlen, Schober, and Conrad 2007). However, studies that link components of RTs (interviewers' question reading and response latencies) to interviewer and respondent behaviors that index data quality strengthen the claim that RTs indicate data quality (Bergmann and Bristle 2019; Draisma and Dijkstra 2004; Olson, Smyth, and Kirchner 2019). In general, researchers tend to consider longer…
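The RT definition above — time from the start of one question's reading to the start of the next — can be computed directly from the screen-advance timestamps a latent timer records. The following is a minimal sketch; the timestamps, question IDs, and function name are hypothetical illustrations, not part of any survey system described in the abstract.

```python
from datetime import datetime

# Hypothetical timestamps logged each time question reading begins,
# i.e., a "latent timer" triggered by advancing to the next screen.
timestamps = {
    "Q1": datetime(2023, 5, 1, 10, 0, 0),
    "Q2": datetime(2023, 5, 1, 10, 0, 42),
    "Q3": datetime(2023, 5, 1, 10, 1, 5),
}

def response_times(ts):
    """RT for question i = seconds from the start of question i
    to the start of question i+1 (the last question gets no RT)."""
    keys = list(ts)
    return {
        keys[i]: (ts[keys[i + 1]] - ts[keys[i]]).total_seconds()
        for i in range(len(keys) - 1)
    }

print(response_times(timestamps))  # {'Q1': 42.0, 'Q2': 23.0}
```

Because the timer fires on screen advance rather than on explicit start/stop events, each RT bundles interviewer reading, respondent processing, and any interaction in between — exactly why the abstract calls RTs an indirect quality measure.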
Effects of the COVID-19 crisis on survey fieldwork: Experience and lessons from two major supplements to the U.S. Panel Study of Income Dynamics
Two major supplements to the Panel Study of Income Dynamics (PSID) were in the field during the COVID-19 outbreak in the United States: the 2019 waves of the PSID Child Development Supplement (CDS-19) and the PSID Transition into Adulthood Supplement (TAS-19). Both CDS-19 and TAS-19 abruptly terminated all face-to-face fieldwork and, for TAS-19, shifted interviewers from working in a centralized call center to working from their homes. Overall, COVID-19 had a net negative effect on response rates in CDS-19 and terminated all home visits, which represented an important study component. For TAS-19, the overall effect of COVID-19 was uncertain, but negative. The costs of adapting to COVID-19 and providing paid time-off benefits to staff affected by the pandemic were high. Longitudinal surveys, such as CDS, TAS, and PSID, that span the pandemic will provide valuable information on its life course and intergenerational consequences, making ongoing data collection of vital importance.
The Effects of a Targeted "Early Bird" Incentive Strategy on Response Rates, Fieldwork Effort, and Costs in a National Panel Study
Abstract: Adaptive survey designs are increasingly used by survey practitioners to counteract ongoing declines in household survey response rates and manage rising fieldwork costs. This paper reports findings from an evaluation of an early-bird incentive (EBI) experiment targeting high-effort respondents who participated in the 2019 wave of the US Panel Study of Income Dynamics. We identified a subgroup of high-effort respondents at risk of nonresponse based on their prior-wave fieldwork effort and randomized them to a treatment offering an extra time-delimited monetary incentive for completing their interview within the first month of data collection (treatment group; N = 800) or the standard study incentive (control group; N = 400). In recent waves, we have found that the costs of the protracted fieldwork needed to complete interviews with high-effort cases, in the form of interviewer contact attempts plus an increased incentive near the close of data collection, are extremely high. By incentivizing early participation and reducing the number of interviewer contact attempts and fieldwork days to complete the interview, our goal was to manage both nonresponse and survey costs. We found that the EBI treatment increased response rates and reduced fieldwork effort and costs compared to the control group. We…
This article illustrates some effects of dynamic adaptive design in a large government survey. We present findings from the 2015 National Survey of College Graduates Adaptive Design Experiment, including results and discussion of sample representativeness, response rates, and cost. We also consider the effect of truncating data collection (examining alternative stopping rules) on these metrics. In this experiment, we monitored sample representativeness continuously and altered data collection procedures—increasing or decreasing contact effort—to improve it. Cases that were overrepresented in the achieved sample were assigned to more passive modes of data collection (web or paper) or withheld from the group of cases that received survey reminders, whereas underrepresented cases were assigned to telephone follow-ups. The findings suggest that a dynamic adaptive survey design can improve a data quality indicator (the R-indicator) without increasing cost or reducing the response rate. We also find that a dynamic adaptive survey design has the potential to reduce the length of the data collection period, control cost, and increase the timeliness of data delivery, if sample representativeness is prioritized over increasing the survey response rate.
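The R-indicator the abstract monitors has a standard closed form (Schouten, Cobben, and Bethlehem 2009): R = 1 − 2·SD(ρ̂), where ρ̂ are estimated response propensities; R = 1 means response propensities are identical across the sample (fully representative response), and lower values signal a less balanced respondent pool. A minimal sketch, assuming the sample standard deviation convention and hypothetical propensity values:

```python
import statistics

def r_indicator(propensities):
    """Representativeness indicator R = 1 - 2 * SD(response propensities).

    R ranges up to 1 (all propensities equal => perfectly
    representative response); larger spread in propensities
    lowers R. Sample SD is used here; some implementations
    use the population SD or model-based variants.
    """
    return 1.0 - 2.0 * statistics.stdev(propensities)

# Uniform propensities: maximally representative response.
print(r_indicator([0.6, 0.6, 0.6, 0.6]))        # 1.0

# Spread-out propensities: some cases far likelier to respond.
print(r_indicator([0.2, 0.8, 0.2, 0.8]) < 1.0)  # True
```

In a dynamic adaptive design like the one described, this quantity would be recomputed as fieldwork progresses, and effort shifted toward low-propensity (underrepresented) cases to push R upward.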