Abstract Interviewers’ postinterview evaluations of respondents’ performance (IEPs) are paradata, used to describe the quality of the data obtained from respondents. IEPs are driven by a combination of factors, including respondents’ and interviewers’ sociodemographic characteristics and what actually transpires during the interview. However, relatively few studies examine how IEPs are associated with features of the response process, including facets of the interviewer-respondent interaction and patterns of responding that index data quality. We examine whether features of the response process—various respondents’ behaviors and response quality indicators—are associated with IEPs in a survey with a diverse set of respondents focused on barriers and facilitators to participating in medical research. We also examine whether there are differences in IEPs across respondents’ and interviewers’ sociodemographic characteristics. Our results show that both respondents’ behaviors and response quality indicators predict IEPs, indicating that IEPs reflect what transpires in the interview. In addition, interviewers appear to approach the task of evaluating respondents with differing frameworks, as evidenced by the variation in IEPs attributable to interviewers and associations between IEPs and interviewers’ gender. Further, IEPs were associated with respondents’ education and ethnoracial identity, net of respondents’ behaviors, response quality indicators, and sociodemographic characteristics of respondents and interviewers. Future research should continue to build on studies that examine the correlates of IEPs to better inform whether, when, and how to use IEPs as paradata about the quality of the data obtained.
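The share of IEP variation attributable to interviewers, mentioned in the abstract, is often summarized with an intraclass correlation. A minimal sketch using a one-way ANOVA estimator; the data and 1-5 rating scale below are hypothetical, not the study's evaluations or model:

```python
import numpy as np

def icc_oneway(scores_by_interviewer):
    """One-way ANOVA estimate of the intraclass correlation:
    the share of total rating variance attributable to interviewers."""
    groups = [np.asarray(g, dtype=float) for g in scores_by_interviewer]
    k = len(groups)                                # number of interviewers
    n_bar = np.mean([len(g) for g in groups])      # average caseload
    grand = np.mean(np.concatenate(groups))
    # Between- and within-interviewer mean squares
    msb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups) / (k - 1)
    msw = (sum(((g - g.mean()) ** 2).sum() for g in groups)
           / (sum(len(g) for g in groups) - k))
    return (msb - msw) / (msb + (n_bar - 1) * msw)

# Hypothetical 1-5 evaluations, clustered by interviewer
evals = [[4, 5, 4, 5], [2, 3, 2, 2], [3, 3, 4, 3]]
icc = icc_oneway(evals)
```

A large ICC here would indicate that interviewers apply systematically different evaluation frameworks, consistent with the abstract's finding.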
This content will become publicly available on May 9, 2026
Incorporating Process Information Into Cognitive Diagnostic Models: A Four-Component Joint Modeling Approach
Recent studies show increasing interest in using process data (e.g., response time, response actions) to enhance measurement accuracy for respondents’ latent traits. Yet, few have explored the possibility of incorporating process information into cognitive diagnostic models (CDMs). This study proposes a novel four-component joint modeling approach for CDMs that combines response action sequences (i.e., their similarity and efficiency), response time, and item responses. We employed the Markov Chain Monte Carlo method for parameter estimation and evaluated the performance of the proposed model using both an empirical study and two simulation studies. The results suggest that process data can improve respondents’ classification accuracy under varied conditions and support the interpretation of the association between process and response data.
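As a rough illustration of the joint-modeling idea (a simplified two-component sketch, not the paper's four-component parameterization or its MCMC estimator), a DINA-style response probability can be multiplied by a lognormal response-time density to score candidate attribute profiles; all parameter values below are invented for illustration:

```python
import math
from itertools import product

def dina_prob(alpha, q, guess, slip):
    """DINA response probability: success is likely only when the respondent
    possesses every attribute the item's Q-matrix row requires."""
    mastered = all(a >= qk for a, qk in zip(alpha, q))
    return 1 - slip if mastered else guess

def lognormal_pdf(t, mu, sigma):
    """Density of a lognormal response time with log-scale mean mu."""
    return math.exp(-(math.log(t) - mu) ** 2 / (2 * sigma ** 2)) / (
        t * sigma * math.sqrt(2 * math.pi))

def joint_likelihood(alpha, responses, times, Q, guess, slip, mu, sigma):
    """Joint likelihood of item responses and response times for one
    candidate attribute profile (action-sequence components omitted)."""
    like = 1.0
    for x, t, q, g, s, m in zip(responses, times, Q, guess, slip, mu):
        p = dina_prob(alpha, q, g, s)
        like *= (p if x == 1 else 1 - p) * lognormal_pdf(t, m, sigma)
    return like

# Score every attribute profile for a respondent who answered both items correctly
Q = [[1, 0], [0, 1]]  # item 1 requires attribute 1, item 2 requires attribute 2
likes = {a: joint_likelihood(a, [1, 1], [12.0, 9.0], Q,
                             [0.2, 0.2], [0.1, 0.1], [2.5, 2.2], 0.4)
         for a in product([0, 1], repeat=2)}
best = max(likes, key=likes.get)  # most plausible attribute profile
```

In a full joint model, the response-time (and action-sequence) components would depend on latent speed and behavior parameters, so process data would shift the profile posterior rather than cancel out as in this toy version.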
- Award ID(s): 1749275
- PAR ID: 10625418
- Publisher / Repository: Sage
- Date Published:
- Journal Name: Journal of Educational and Behavioral Statistics
- ISSN: 1076-9986
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract The response process of problem‐solving items contains rich information about respondents' behaviours and cognitive processes in digital tasks, but extracting that information is a major challenge. The aim of this study is to use a data‐driven approach to explore the latent states and state transitions underlying the problem‐solving process, to reflect test‐takers' behavioural patterns, and to investigate how these states and state transitions are associated with test‐takers' performance. We employed the Hidden Markov Modelling approach to identify test‐takers' hidden states during the problem‐solving process and compared the frequency of states and/or state transitions between different performance groups. We conducted comparable analyses of two problem‐solving items, focusing on the US sample collected in PIAAC 2012, and examined the correlation between those frequencies across the two items. Latent states and the transitions between them were identified and found to differ significantly by performance group. The groups with correct responses on both items were more engaged in the tasks and used efficient tools more often to solve problems, while the group with incorrect responses was more likely to use shorter action sequences and exhibit hesitant behaviours. Consistent behavioural patterns were identified across items. This study demonstrates the value of a data‐driven HMM approach for better understanding respondents' behavioural patterns and cognitive transitions underlying the observable action sequences in complex problem‐solving tasks.
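The decoding step behind this kind of HMM analysis can be sketched with a standard Viterbi algorithm; the two-state model and three-action vocabulary below are toy assumptions, not PIAAC estimates:

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Most likely hidden-state path for an observed action sequence.
    start: (S,) initial probs; trans: (S, S); emit: (S, V) over actions."""
    logd = np.log(start) + np.log(emit[:, obs[0]])  # log-prob of best path so far
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(trans)      # score of each predecessor
        back.append(scores.argmax(axis=0))          # best predecessor per state
        logd = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(logd.argmax())]
    for bp in reversed(back):                       # backtrack
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Toy two-state model: "engaged" (0) favors tool use, "hesitant" (1) favors idling
start = np.array([0.6, 0.4])
trans = np.array([[0.8, 0.2], [0.3, 0.7]])
emit  = np.array([[0.7, 0.2, 0.1],   # actions: tool, click, idle
                  [0.1, 0.2, 0.7]])
print(viterbi([0, 0, 2, 2, 2], start, trans, emit))  # → [0, 0, 1, 1, 1]
```

Fitting the transition and emission matrices from data (rather than fixing them as here) would use the Baum-Welch EM algorithm, after which state frequencies can be compared across performance groups as the abstract describes.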
-
Abstract Computer‐based interactive items have become prevalent in recent educational assessments. In such items, the entire human‐computer interactive process is recorded in a log file and is known as the response process. These data are noisy, diverse, and in a nonstandard format. Several feature extraction methods have been developed to overcome the difficulties in process data analysis. However, these methods often focus on the action sequence and ignore the time sequence in response processes. In this paper, we introduce a new feature extraction method that incorporates the information in both the action sequence and the response time sequence. The method is based on the concept of path signature from stochastic analysis. We apply the proposed method to both simulated data and real response process data from PIAAC. A prediction framework is used to show that taking time information into account provides a more comprehensive understanding of respondents' behaviors.
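A path signature truncated at level two can be computed directly from a piecewise-linear path via Chen's identity; this numpy sketch illustrates the concept only, not the paper's PIAAC feature pipeline:

```python
import numpy as np

def signature_level2(path):
    """Signature of a piecewise-linear path, truncated at level 2.
    Returns (level1, level2): level1[i] is the total increment of
    coordinate i; level2[i, j] is the iterated integral of dx_i dx_j."""
    x = np.asarray(path, dtype=float)
    dx = np.diff(x, axis=0)                     # per-step increments
    level1 = dx.sum(axis=0)
    level2 = np.zeros((x.shape[1], x.shape[1]))
    running = np.zeros(x.shape[1])              # increment accumulated so far
    for step in dx:
        # Chen's identity for appending one linear segment to the path
        level2 += np.outer(running, step) + 0.5 * np.outer(step, step)
        running += step
    return level1, level2

# For a straight-line path, level 2 is exactly half the outer product of level 1
l1, l2 = signature_level2([[0.0, 0.0], [1.0, 2.0], [2.0, 4.0]])
```

The antisymmetric part of the level-2 term is the signed (Lévy) area between coordinates, which is what lets the signature distinguish paths with identical increments but different time orderings of actions.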
-
Brenner, P. S. (Ed.) Features of the survey measurement process may affect responses from respondents in various racial, ethnic, or cultural groups in different ways. When responses from multiethnic populations are combined, such variability in responding could increase variable error or bias results. The current study examines the survey response process among Black and White respondents answering questions about trust in medical researchers and participation in medical research. Using transcriptions from telephone interviews, we code a rich set of behaviors produced by respondents that past research has shown to be associated with measurement error, including long question-answer sequences, uncodable answers, requests for repetition or clarification, affective responses, and tokens. In analysis, we test for differences between Black and White respondents in the likelihood with which behaviors occur and examine whether the behaviors vary by specific categorizations of the questions, including whether the questions are racially focused. Overall, we find that White respondents produce more behaviors that indicate cognitive processing problems for racially focused questions, which may be interpreted as demonstrating a “cultural” difference in the display of cognitive processing and interaction. Data are provided by the 2013–2014 Voices Heard Survey, a computer-assisted telephone survey designed to measure respondents’ perceptions of barriers and facilitators to participating in medical research.
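Comparing how often a coded behavior occurs across respondent groups, as in this study, often reduces to a two-proportion z-test; the counts below are hypothetical, not Voices Heard estimates:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference in behavior rates between groups,
    using the pooled-proportion standard error."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: uncodable answers on a racially focused question, by group
z, p = two_proportion_z(45, 300, 27, 300)
```

In practice the study's modeling is richer than this (behaviors vary by question type and are clustered within respondents and interviewers), so a multilevel logistic regression would be the fuller analogue.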
-
Abstract The Household Pulse Survey (HPS), released by the US Census Bureau at the start of the coronavirus pandemic, gathers timely information about the pandemic’s societal and economic impacts. The first phase of the survey was launched in April 2020 and ran for 12 weeks. To track the immediate impact of the pandemic, individual respondents during this phase were re-sampled for up to three consecutive weeks. Motivated by expected job loss during the pandemic, this work uses public-use microdata to propose unit-level, model-based estimators that incorporate longitudinal dependence at both the response and domain level. In particular, using a pseudo-likelihood, we consider a Bayesian hierarchical unit-level, model-based approach for both Gaussian and binary response data under informative sampling. To facilitate construction of these model-based estimates, we develop an efficient Gibbs sampler. An empirical simulation study compares the proposed approach to models that do not account for unit-level longitudinal correlation. Finally, using public-use HPS microdata, we provide an analysis of ‘expected job loss’ that compares design- and model-based estimators and demonstrates superior performance for the proposed model-based approaches.
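The conjugate updates inside such a Gibbs sampler can be sketched for a toy Gaussian unit-level model with known variances and no survey weights, i.e. without the pseudo-likelihood adjustment or longitudinal dependence the paper develops; everything below is simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_unit_level(y, dom, n_dom, sigma2=1.0, tau2=0.5, iters=2000, burn=500):
    """Gibbs sampler for y_ij = mu + u_j + e_ij with known variances:
    alternates conjugate normal draws of mu | u and u_j | mu."""
    mu, u = 0.0, np.zeros(n_dom)
    mu_draws, u_draws = [], []
    n = len(y)
    for it in range(iters):
        # mu | u, y ~ N(mean of residuals, sigma2 / n)  (flat prior on mu)
        resid = y - u[dom]
        mu = rng.normal(resid.mean(), np.sqrt(sigma2 / n))
        # u_j | mu, y ~ N(shrunken domain mean, 1 / precision)
        for j in range(n_dom):
            yj = y[dom == j] - mu
            prec = len(yj) / sigma2 + 1.0 / tau2
            u[j] = rng.normal(yj.sum() / sigma2 / prec, np.sqrt(1.0 / prec))
        if it >= burn:
            mu_draws.append(mu)
            u_draws.append(u.copy())
    return np.array(mu_draws), np.array(u_draws)

# Simulated data: 3 domains with effects (-1, 0, 1) around an overall mean of 2
dom = np.repeat(np.arange(3), 50)
y = 2.0 + np.array([-1.0, 0.0, 1.0])[dom] + rng.normal(0, 1, size=150)
mu_draws, u_draws = gibbs_unit_level(y, dom, n_dom=3)
```

The posterior means of mu + u_j shrink each domain's sample mean toward the overall mean; the paper's version additionally weights each unit's likelihood contribution by its survey weight and links draws across the repeated weekly samples.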
