Data visualization provides a powerful way for analysts to explore and make data-driven discoveries. However, current visual analytic tools provide only limited support for hypothesis-driven inquiry, as their built-in interactions and workflows are primarily intended for exploratory analysis. Notably, visualization tools lack capabilities that would allow users to visually and incrementally test the fit of their conceptual models and provisional hypotheses against the data. This imbalance could bias users to rely too heavily on exploratory analysis as the principal mode of inquiry, which can be detrimental to discovery. In this paper, we introduce Visual (dis)Confirmation, a tool for conducting confirmatory, hypothesis-driven analyses with visualizations. Users interact by framing hypotheses and data expectations in natural language. The system then selects conceptually relevant data features and automatically generates visualizations to validate the underlying expectations. Distinctively, the resulting visualizations also highlight places where one's mental model disagrees with the data, so as to stimulate reflection. The proposed tool represents a new class of interactive data systems capable of supporting confirmatory visual analysis and of responding more intelligently by spotlighting gaps between one's knowledge and the data. We describe the algorithmic techniques behind this workflow and demonstrate the utility of the tool through a case study.
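The sketch below illustrates this style of workflow in miniature, under stated assumptions: a keyword-matching heuristic stands in for the system's feature selection, a correlation sign check stands in for validating a simple "Y increases with X" expectation, and a residual threshold stands in for highlighting disagreement. The function names (match_columns, check_increase_expectation), the toy advertising/sales data, and the thresholds are illustrative inventions, not the tool's actual algorithm or API.

```python
# Minimal, illustrative sketch of a hypothesis-to-visualization loop.
import re
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

def match_columns(hypothesis: str, df: pd.DataFrame) -> list:
    """Pick the columns whose names appear verbatim in the stated hypothesis."""
    tokens = set(re.findall(r"[a-z_]+", hypothesis.lower()))
    return [c for c in df.columns if c.lower() in tokens]

def check_increase_expectation(df: pd.DataFrame, x: str, y: str):
    """Check a simple 'y increases with x' expectation via the correlation sign."""
    r = df[x].corr(df[y])
    return r, r > 0

# Toy data standing in for the analyst's dataset.
rng = np.random.default_rng(0)
data = pd.DataFrame({"advertising": rng.uniform(0, 100, 60)})
data["sales"] = 0.8 * data["advertising"] + rng.normal(0, 25, 60)

hypothesis = "sales increase with advertising"
x_col, y_col = match_columns(hypothesis, data)  # column order happens to give x, y here
r, supported = check_increase_expectation(data, x_col, y_col)

# Fit a trend line and flag points that run against the stated expectation
# (here: points well below the fitted trend).
trend = np.poly1d(np.polyfit(data[x_col], data[y_col], 1))
residual = data[y_col] - trend(data[x_col])
disagree = residual < -residual.std()

plt.scatter(data.loc[~disagree, x_col], data.loc[~disagree, y_col],
            label="consistent with expectation")
plt.scatter(data.loc[disagree, x_col], data.loc[disagree, y_col],
            color="crimson", label="disagrees with expectation")
xs = np.sort(data[x_col].to_numpy())
plt.plot(xs, trend(xs), color="gray")
plt.title(f"'{hypothesis}' (r = {r:.2f}, "
          f"{'supported' if supported else 'not supported'})")
plt.xlabel(x_col)
plt.ylabel(y_col)
plt.legend()
plt.show()
```

The point of the sketch is only the overall loop: a hypothesis goes in, conceptually relevant features are selected, and an annotated visualization comes out with the disconfirming evidence spotlighted.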
A Crowdsourced Study of Visual Strategies for Mitigating Confirmation Bias
Confirmation bias is a cognitive bias in which people seek and prioritize information that conforms to a pre-existing view or hypothesis; it can negatively affect the decision-making process. We investigate the manifestation and mitigation of confirmation bias, with an emphasis on the use of visualization. In a series of Amazon Mechanical Turk studies, participants selected evidence that supported or refuted a given hypothesis. We demonstrated the presence of confirmation bias and investigated five simple visual representations, using color, positional, and length encodings, for mitigating this bias. We found that, at worst, visualization had no effect on the amount of confirmation bias present and, at best, it was successful in mitigating the bias. We discuss these results in light of factors that can complicate visual debiasing in non-experts.
- Award ID(s): 1816620
- PAR ID: 10340646
- Date Published:
- Journal Name: Visual Languages and Human-Centered Computing 2022
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- The confirmation bias, unlike other decision biases, has been shown both empirically and in theory to be enhanced by deliberation. This suggests that limited attention, reduced deliberation, or limited available cognitive resources may moderate this bias. We tested this hypothesis using a validated confirmation bias task in conjunction with a protocol that randomly assigned individuals to one week of at-home sleep restriction (SR) or well-rested (WR) sleep levels. We also used a measure of cognitive reflection as an additional proxy for deliberation in our analysis. In a sample of 197 young adults, we tested the hypotheses that the confirmation bias would be stronger for WR participants and for those higher in cognitive reflection. Our results replicated previous findings, and males and females each separately displayed the confirmation bias. Regarding our deliberation hypotheses, the confirmation bias results were most precisely estimated for those who had thought relatively more about the issue of gun control. Additionally, for this subset of individuals, we found evidence that the confirmation bias was stronger for those higher in cognitive reflection and, somewhat less robustly, for participants who were (objectively) well-rested.
- Three studies (one survey, two experiments) examine cognitive biases in the professional judgments of nationally representative samples of psychologists working in legal contexts. Study 1 (N = 84) demonstrates robust evidence of the bias blind spot (Pronin, Lin, & Ross, 2002) in experts' judgments: psychologists rated their own susceptibility to bias in their professional work as lower than that of their colleagues (and of laypeople). As expected, they perceived bias-mitigating procedures as more threatening to their own domain than to outside domains, and more experience was correlated with higher perceived threat of bias-mitigating procedures. Experimental studies 2 (N = 118) and 3 (N = 128) with randomly selected psychologists reveal that psychologists overwhelmingly engage in confirmation bias (93% with one decision opportunity in study 1, and 90%, 87%, and 82% across three decision opportunities in study 2). Cognitive reflection was negatively correlated with confirmation bias. Psychologists were also susceptible to order effects: the order in which symptoms were presented affected their diagnoses, even though the same symptoms appeared in the different scenarios (in opposite orders).
- The use of cognitive heuristics often leads to fast and effective decisions. However, heuristics can also systematically and predictably lead to errors known as cognitive biases. Strategies for minimizing or mitigating these biases nevertheless remain largely non-technological (e.g., training courses). The growing use of visual analytic (VA) tools for analysis and decision making enables a new class of bias mitigation strategies. In this work, we explore the ways in which the design of visualizations (vis) may be used to mitigate cognitive biases. We derive a design space comprising eight dimensions that can be manipulated to impact a user's cognitive and analytic processes, and describe them through an example hiring scenario (a toy illustration of such a design-space configuration appears after this list). This design space can be used to guide and inform future vis systems that may integrate cognitive processes more closely.
- Visualizations of data provide a proven method for analysts to explore and make data-driven discoveries. However, current visualization tools provide only limited support for hypothesis-driven analyses and often lack capabilities that would allow users to visually test the fit of their conceptual models against the data. This imbalance could bias users to rely too heavily on exploratory visual analysis as the principal mode of inquiry, which can be detrimental to discovery. To address this gap, we propose a new paradigm for ‘concept-driven’ visual analysis. In this style of analysis, analysts share their conceptual models and hypotheses with the system. The system then uses those inputs to drive the generation of visualizations, while providing plots and interactions to explore places where models and data disagree. We discuss key characteristics and design considerations for concept-driven visualizations and report preliminary findings from a formative study.
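To make the notion of a manipulable design space (from the bias-mitigation item above) concrete, here is a toy sketch in the spirit of its hiring scenario. The VisDesignConfig class, the dimension names, and the values are hypothetical placeholders invented for illustration; they are not the eight dimensions defined in that paper.

```python
# Illustrative only: a toy encoding of a bias-mitigation design space.
from dataclasses import dataclass, asdict

@dataclass
class VisDesignConfig:
    """One point in a hypothetical design space for a hiring dashboard."""
    emphasis: str               # which attributes are visually foregrounded
    evidence_coverage: str      # show all candidates vs. a filtered subset
    interaction_friction: str   # e.g., require justification before dismissing a candidate
    provenance_display: bool    # show which evidence the user has already inspected

# Two configurations a designer might compare when trying to counter
# confirmation bias in a hiring scenario.
baseline = VisDesignConfig(
    emphasis="user-selected attributes",
    evidence_coverage="filtered subset",
    interaction_friction="none",
    provenance_display=False,
)
debiasing = VisDesignConfig(
    emphasis="balanced across attributes",
    evidence_coverage="all candidates",
    interaction_friction="justify before dismissing",
    provenance_display=True,
)

# Report which dimensions were manipulated between the two designs.
changed = {k: (v, asdict(debiasing)[k])
           for k, v in asdict(baseline).items()
           if v != asdict(debiasing)[k]}
print(changed)
```

Comparing two configurations dimension by dimension mirrors how a designer might reason about which aspects of a visualization to manipulate when targeting a particular bias.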