Title: The Interview Quality Reflection Tool (IQRT): Honing the Craft of Experiential Interviews
Prior methodological literature emphasizes the importance of skill development in conducting interviews. However, in contrast to qualitative data analysis, few systematic processes exist to guide the interviewer toward reflexivity about their role in the interview situation. Here, we present the interview quality reflection tool (IQRT), a process we developed through conducting and mentoring semi-structured and unstructured interviews focused on personal lived experiences. The IQRT prompts the interviewer to transcribe each interview question and reflect on how the spoken question served to advance experiential quality in the interview. We illustrate the IQRT itself before demonstrating how we used the process to examine experiential quality in three cases from our prior research. Finally, we consider how the IQRT enables researchers to examine the interview situation as a whole, by increasing the self-awareness of the interviewer, and in its parts, by commenting on the mechanics of constructing useful questions.
Award ID(s):
2045392
PAR ID:
10586669
Author(s) / Creator(s):
Publisher / Repository:
Sage
Date Published:
Journal Name:
International Journal of Qualitative Methods
Volume:
23
ISSN:
1609-4069
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Response time (RT) – the time elapsing from the beginning of question reading for a given question until the start of the next question – is a potentially important indicator of data quality that can be reliably measured for all questions in a computer-administered survey using a latent timer (i.e., triggered automatically by moving on to the next question). In interviewer-administered surveys, RTs index data quality by capturing the entire length of time spent on a question–answer sequence, including interviewer question-asking behaviors and respondent question-answering behaviors. Consequently, longer RTs may indicate longer processing or interaction on the part of the interviewer, respondent, or both. RTs are an indirect measure of data quality; they do not directly measure reliability or validity, and we do not directly observe what factors lengthen the administration time. In addition, RTs that are either too long or too short could signal a problem (Ehlen, Schober, and Conrad 2007). However, studies that link components of RTs (interviewers’ question reading and response latencies) to interviewer and respondent behaviors that index data quality strengthen the claim that RTs indicate data quality (Bergmann and Bristle 2019; Draisma and Dijkstra 2004; Olson, Smyth, and Kirchner 2019). In general, researchers tend to consider longer RTs as signaling processing problems for the interviewer, respondent, or both (Couper and Kreuter 2013; Olson and Smyth 2015; Yan and Olson 2013; Yan and Tourangeau 2008). Previous work demonstrates that RTs are associated with various characteristics of interviewers (where applicable), questions, and respondents in web, telephone, and face-to-face interviews (e.g., Couper and Kreuter 2013; Olson and Smyth 2015; Yan and Tourangeau 2008). We replicate and extend this research by examining how RTs are associated with various question characteristics and several established tools for evaluating questions. We also examine whether increased interviewer experience in the study shortens RTs for questions with characteristics that impact the complexity of the interviewer’s task (i.e., interviewer instructions and parenthetical phrases). We examine these relationships in the context of a sample of racially diverse respondents who answered questions about participation in medical research and their health. (A minimal illustrative sketch of deriving RTs from latent-timer timestamps appears after this list.)
  2. This methods paper presents the interview quality reflection tool (IQRT) to evaluate the quality of qualitative research interviews. Qualitative researchers commonly use semi-structured interviews that rely on the interviewers’ ability to improvise in real time based on the needs of the study. Given that interviewing involves numerous tacit skills that cannot be delineated by a simple written protocol, it is necessary that researchers develop interview competencies through practice and reflection. While prior literature on interviewing has often focused on developing interview protocols, we know little about how interviewers themselves may be trained to gather high-quality data. In this paper, we focus on how the IQRT may be used to guide the self-assessment of research interviews. We discuss how interviews are used in engineering education, how we developed and applied the IQRT, and how lessons learned through using this tool might lead to improved interviewing skills through careful examination of interview structure, content, and context within the mentoring process. 
  3. We conducted an experiment to evaluate the effects of switching from a telephone-only design to a web-first mixed-mode data collection design (self-administered web interview and interviewer-administered telephone interview) on fieldwork outcomes and interview mode. We examine whether the mixed-mode option leads to better survey outcomes, based on response rates, fieldwork outcomes, interview quality, and costs. We also examine respondent characteristics associated with completing a web interview rather than a telephone interview. Our mode experiment was conducted in the 2019 wave of the Transition into Adulthood Supplement (TAS) to the US Panel Study of Income Dynamics (PSID). TAS collects information biennially from approximately 3,000 young adults in PSID families. The shift to a mixed-mode design for TAS was aimed at reducing costs and increasing respondent cooperation. We found that, for mixed-mode cases compared to telephone-only cases, response rates were higher, interviews were completed faster and with lower effort, the quality of the interview data appeared better, and fieldwork costs were lower. A clear set of respondent characteristics reflecting demographic and socioeconomic characteristics, technology availability and use, time use, and psychological health was associated with completing a web interview rather than a telephone interview.
  4. Interviewers’ postinterview evaluations of respondents’ performance (IEPs) are paradata used to describe the quality of the data obtained from respondents. IEPs are driven by a combination of factors, including respondents’ and interviewers’ sociodemographic characteristics and what actually transpires during the interview. However, relatively few studies examine how IEPs are associated with features of the response process, including facets of the interviewer-respondent interaction and patterns of responding that index data quality. We examine whether features of the response process (various respondents’ behaviors and response quality indicators) are associated with IEPs in a survey, focused on barriers and facilitators to participating in medical research, conducted with a diverse set of respondents. We also examine whether there are differences in IEPs across respondents’ and interviewers’ sociodemographic characteristics. Our results show that both respondents’ behaviors and response quality indicators predict IEPs, indicating that IEPs reflect what transpires in the interview. In addition, interviewers appear to approach the task of evaluating respondents with differing frameworks, as evidenced by the variation in IEPs attributable to interviewers and by associations between IEPs and interviewers’ gender. Further, IEPs were associated with respondents’ education and ethnoracial identity, net of respondents’ behaviors, response quality indicators, and the sociodemographic characteristics of respondents and interviewers. Future research should continue to build on studies that examine the correlates of IEPs to better inform whether, when, and how to use IEPs as paradata about the quality of the data obtained.
  5. Engineering education research relies heavily on qualitative studies that use interview-based approaches. The quality and depth of knowledge derived from these studies depend on the craft of conducting interviews, a facet often overlooked in prior work on qualitative methods. This special session aims to address this gap by guiding engineering education researchers in honing their interviewing skills for qualitative research. Participants will learn best practices for developing interview protocols, creating an accessible environment, and capturing high-quality data. Through case studies and hands-on activities, attendees will gain confidence in moderating conversations, improving data collection, and enhancing their overall skillset. This session provides an opportunity for researchers interested in qualitative research and for scholarly educators to deepen their understanding of conducting meaningful interviews. By bridging the gap between the importance of qualitative studies and the need for skilled interviewers, we aim to contribute to the advancement of engineering education research.
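As a small illustration of the response-time definition in item 1 above (our sketch, not taken from any of the cited studies): with latent-timer timestamps recorded each time the survey software advances to a new question, the RT for a question is simply the gap between its timestamp and the next question's. The field names and values below are hypothetical.

      # Minimal sketch (hypothetical data): deriving response times (RTs) from
      # latent-timer timestamps, where each timestamp marks the moment the
      # instrument advanced to that question (i.e., question reading began).
      from datetime import datetime

      question_starts = {
          "Q1": datetime(2024, 1, 15, 10, 0, 0),
          "Q2": datetime(2024, 1, 15, 10, 0, 42),
          "Q3": datetime(2024, 1, 15, 10, 1, 5),
      }

      def response_times(starts):
          # RT for a question = time from the start of its reading until the
          # start of the next question; the final question would need an
          # end-of-interview timestamp, so it is omitted here.
          ordered = sorted(starts.items(), key=lambda item: item[1])
          return {
              q: (t_next - t).total_seconds()
              for (q, t), (_, t_next) in zip(ordered, ordered[1:])
          }

      print(response_times(question_starts))  # {'Q1': 42.0, 'Q2': 23.0}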