
Title: Switching from telephone to web‐first mixed‐mode data collection: Results from the Transition into Adulthood Supplement to the US Panel Study of Income Dynamics
We conducted an experiment to evaluate the effects on fieldwork outcomes and interview mode of switching from a telephone-only design to a web-first mixed-mode data collection design (self-administered web interview and interviewer-administered telephone interview). We examine whether the mixed-mode option leads to better survey outcomes, based on response rates, fieldwork outcomes, interview quality, and costs. We also examine respondent characteristics associated with completing a web interview rather than a telephone interview. Our mode experiment was conducted in the 2019 wave of the Transition into Adulthood Supplement (TAS) to the US Panel Study of Income Dynamics (PSID). TAS collects information biennially from approximately 3,000 young adults in PSID families. The shift to a mixed-mode design for TAS was aimed at reducing costs and increasing respondent cooperation. We found that for mixed-mode cases compared to telephone-only cases, response rates were higher, interviews were completed faster and with lower effort, the quality of the interview data appeared better, and fieldwork costs were lower. A clear set of respondent characteristics reflecting demographic and socioeconomic characteristics, technology availability and use, time use, and psychological health were associated with completing a web interview rather than a telephone interview.
Journal Name:
Journal of the Royal Statistical Society: Series A (Statistics in Society)
Sponsoring Org:
National Science Foundation
More Like this
  1. In recent years, household surveys have expended significant effort to counter well-documented increases in direct refusals and greater difficulty contacting survey respondents. A substantial amount of fieldwork effort in panel surveys using telephone interviewing is devoted to contacting the respondent to schedule the day and time of the interview. Higher fieldwork effort leads to greater costs and is associated with lower response rates. A new approach was experimentally evaluated in the 2017 wave of the Panel Study of Income Dynamics (PSID) Transition into Adulthood Supplement (TAS) that allowed a randomly selected subset of respondents to choose the day and time of their telephone interview through an online appointment scheduler. TAS is a nationally representative study of US young adults aged 18–28 years embedded within the world's longest-running panel study, the PSID. This paper experimentally evaluates the effect of offering the online appointment scheduler on fieldwork outcomes, including the number of interviewer contact attempts and interview sessions, the number of days to complete the interview, and response rates. We describe panel study members' characteristics associated with uptake of the online scheduler and examine differences in the effectiveness of the treatment across subgroups. Finally, potential cost savings in fieldwork effort due to the online appointment scheduler are evaluated.
  2. Response time (RT) – the time elapsing from the beginning of question reading for a given question until the start of the next question – is a potentially important indicator of data quality that can be reliably measured for all questions in a computer-administered survey using a latent timer (i.e., triggered automatically by moving on to the next question). In interviewer-administered surveys, RTs index data quality by capturing the entire length of time spent on a question–answer sequence, including interviewer question-asking behaviors and respondent question-answering behaviors. Consequently, longer RTs may indicate longer processing or interaction on the part of the interviewer, respondent, or both. RTs are an indirect measure of data quality; they do not directly measure reliability or validity, and we do not directly observe what factors lengthen the administration time. In addition, either too long or too short RTs could signal a problem (Ehlen, Schober, and Conrad 2007). However, studies that link components of RTs (interviewers' question reading and response latencies) to interviewer and respondent behaviors that index data quality strengthen the claim that RTs indicate data quality (Bergmann and Bristle 2019; Draisma and Dijkstra 2004; Olson, Smyth, and Kirchner 2019). In general, researchers tend to consider longer RTs as signaling processing problems for the interviewer, respondent, or both (Couper and Kreuter 2013; Olson and Smyth 2015; Yan and Olson 2013; Yan and Tourangeau 2008). Previous work demonstrates that RTs are associated with various characteristics of interviewers (where applicable), questions, and respondents in web, telephone, and face-to-face interviews (e.g., Couper and Kreuter 2013; Olson and Smyth 2015; Yan and Tourangeau 2008). We replicate and extend this research by examining how RTs are associated with various question characteristics and several established tools for evaluating questions.
We also examine whether increased interviewer experience in the study shortens RTs for questions with characteristics that impact the complexity of the interviewer's task (i.e., interviewer instructions and parenthetical phrases). We examine these relationships in the context of a sample of racially diverse respondents who answered questions about participation in medical research and their health.
  3. Two major supplements to the Panel Study of Income Dynamics (PSID) were in the field during the COVID-19 outbreak in the United States: the 2019 waves of the PSID Child Development Supplement (CDS-19) and the PSID Transition into Adulthood Supplement (TAS-19). Both CDS-19 and TAS-19 abruptly terminated all face-to-face fieldwork and, for TAS-19, shifted interviewers from working in a centralized call center to working from their homes. Overall, COVID-19 had a net negative effect on response rates in CDS-19 and terminated all home visits, which represented an important study component. For TAS-19, the overall effect of COVID-19 was uncertain but negative. The costs of adapting to COVID-19 and of providing paid time-off benefits to staff affected by the pandemic were high. Longitudinal surveys, such as CDS, TAS, and PSID, that span the pandemic will provide valuable information on its life course and intergenerational consequences, making ongoing data collection of vital importance.
  4. Adaptive survey designs are increasingly used by survey practitioners to counteract ongoing declines in household survey response rates and manage rising fieldwork costs. This paper reports findings from an evaluation of an early-bird incentive (EBI) experiment targeting high-effort respondents who participated in the 2019 wave of the US Panel Study of Income Dynamics. We identified a subgroup of high-effort respondents at risk of nonresponse based on their prior-wave fieldwork effort and randomized them to a treatment offering an extra time-delimited monetary incentive for completing their interview within the first month of data collection (treatment group; N = 800) or the standard study incentive (control group; N = 400). In recent waves, we have found that the costs of the protracted fieldwork needed to complete interviews with high-effort cases, in the form of interviewer contact attempts plus an increased incentive near the close of data collection, are extremely high. By incentivizing early participation and reducing the number of interviewer contact attempts and fieldwork days to complete the interview, our goal was to manage both nonresponse and survey costs. We found that the EBI treatment increased response rates and reduced fieldwork effort and costs compared to the control group. We review several key findings and limitations, discuss their implications, and identify the next steps for future research.
  5. This article illustrates some effects of dynamic adaptive design in a large government survey. We present findings from the 2015 National Survey of College Graduates Adaptive Design Experiment, including results and discussion of sample representativeness, response rates, and cost. We also consider the effect of truncating data collection (examining alternative stopping rules) on these metrics. In this experiment, we monitored sample representativeness continuously and altered data collection procedures, increasing or decreasing contact effort, to improve it. Cases that were overrepresented in the achieved sample were assigned to more passive modes of data collection (web or paper) or withheld from the group of cases that received survey reminders, whereas underrepresented cases were assigned to telephone follow-ups. The findings suggest that a dynamic adaptive survey design can improve a data quality indicator (R-indicators) without increasing cost or reducing response rate. We also find that a dynamic adaptive survey design has the potential to reduce the length of the data collection period, control cost, and increase timeliness of data delivery, if sample representativeness is prioritized over increasing the survey response rate.