
Title: A Visual Analytics Approach to Combat Confirmation Bias for a Local Food Bank
Abstract:
In the fight against hunger, food banks must routinely make strategic distribution decisions under uncertain supply (donations) and demand. One of the challenges facing decision makers is that they tend to rely heavily on their prior experiences when making decisions, a phenomenon called cognitive bias. This preliminary study seeks to address cognitive bias in the decision-making process through a visual analytics approach. Using data from a local food bank, interactive dashboards were prepared as an alternative to the customary spreadsheet format. A preliminary study was conducted to evaluate the effectiveness of the dashboards, and results indicated that the dashboards reduced the amount of confirmation bias.
Authors:
; ; ;
Award ID(s):
1718672
Publication Date:
NSF-PAR ID:
10066342
Journal Name:
Advances in Human Error, Reliability, Resilience, and Performance. AHFE 2018. Advances in Intelligent Systems and Computing
Volume:
778
Page Range or eLocation-ID:
13-23
Sponsoring Org:
National Science Foundation
More Like this
  1. Food insecurity is defined as an individual's or household's inability to access, or limited access to, the safe and nutritious food that every person in the household needs for an active, healthy life. In this research, we apply visual analytics, the integration of data analytics and interactive visualization, to provide evidence-based decision-making for a local food bank to better understand the people and communities in its service area and to improve the reach and impact of the food bank. We have identified indicators of need, rates of usage, and other factors related to the general accessibility of the food bank and its programs. Interactive dashboards were developed to allow decision-makers at the food bank to combine their field knowledge with computing power to make evidence-based, informed decisions in complex hunger relief operations.

  2. Abstract. In the geosciences, recent attention has been paid to the influence of uncertainty on expert decision making. When making decisions under conditions of uncertainty, people tend to employ heuristics (rules of thumb) based on experience, relying on their prior knowledge and beliefs to intuitively guide choice. Over 50 years of decision making research in cognitive psychology demonstrates that heuristics can lead to less-than-optimal decisions, collectively referred to as biases. For example, a geologist who confidently interprets ambiguous data as representative of a category familiar from their research (e.g., strike-slip faults for an expert in extensional domains) is exhibiting the availability bias, which occurs when people make judgments based on what is most dominant or accessible in memory. Given the important social and commercial implications of many geoscience decisions, there is a need to develop effective interventions for removing or mitigating decision bias. In this paper, we summarize the key insights from decision making research about how to reduce bias and review the literature on debiasing strategies. First, we define an optimal decision, since improving decision making requires having a standard to work towards. Next, we discuss the cognitive mechanisms underlying decision biases and describe three biases that have been shown to influence geoscientists' decision making (availability bias, framing bias, anchoring bias). Finally, we review existing debiasing strategies that have applicability in the geosciences, with special attention given to those strategies that make use of information technology and artificial intelligence (AI). We present two case studies illustrating different applications of intelligent systems for the debiasing of geoscientific decision making, where debiased decision making is an emergent property of the coordinated and integrated processing of human-AI collaborative teams.
  3. Abstract. In the geosciences, recent attention has been paid to the influence of uncertainty on expert decision-making. When making decisions under conditions of uncertainty, people tend to employ heuristics (rules of thumb) based on experience, relying on their prior knowledge and beliefs to intuitively guide choice. Over 50 years of decision-making research in cognitive psychology demonstrates that heuristics can lead to less-than-optimal decisions, collectively referred to as biases. For example, the availability bias occurs when people make judgments based on what is most dominant or accessible in memory; geoscientists who have spent the past several months studying strike-slip faults will have this terrain most readily available in their mind when interpreting new seismic data. Given the important social and commercial implications of many geoscience decisions, there is a need to develop effective interventions for removing or mitigating decision bias. In this paper, we outline the key insights from decision-making research about how to reduce bias and review the literature on debiasing strategies. First, we define an optimal decision, since improving decision-making requires having a standard to work towards. Next, we discuss the cognitive mechanisms underlying decision biases and describe three biases that have been shown to influence geoscientists' decision-making (availability bias, framing bias, anchoring bias). Finally, we review existing debiasing strategies that have applicability in the geosciences, with special attention given to strategies that make use of information technology and artificial intelligence (AI). We present two case studies illustrating different applications of intelligent systems for the debiasing of geoscientific decision-making, wherein debiased decision-making is an emergent property of the coordinated and integrated processing of human–AI collaborative teams.
  4. Over the last 10 years, learning analytics have provided educators with both dashboards and tools to understand student behaviors within specific technological environments. However, there is a lack of work to support educators in making data-informed design decisions when designing a blended course and planning appropriate learning activities. In this paper, we introduce knowledge-based design analytics that uncover facets of the learning activities that are being created. A knowledge-based visualization is integrated into edCrumble, a (blended) learning design authoring tool. This new approach is explored in the context of a higher education programming course, where instructors design labs and home practice sessions with online smart learning content on a weekly basis. We performed a within-subjects user study to compare the use of the design tool both with and without visualization. We studied the differences in terms of cognitive load, controllability, confidence and ease of choice, design outcomes, and user actions within the system to compare both conditions, with the objective of evaluating the impact of using design analytics during the decision-making phase of course design. Our results indicate that the use of a knowledge-based visualization allows the teachers to reduce the cognitive load (especially in terms of mental demand) and that it facilitates the choice of the most appropriate activities without affecting the overall design time. In conclusion, the use of knowledge-based design analytics improves the overall learning design quality and helps teachers avoid committing design errors.
  5. Although many efforts are being made to provide educators with dashboards and tools to understand student behaviors within specific technological environments (learning analytics), there is a lack of work supporting educators in making data-informed design decisions when designing a blended course and planning learning activities. In this paper, we introduce concept-level design analytics, a knowledge-based visualization that uncovers facets of the learning activities being authored. The visualization is integrated into a (blended) learning design authoring tool, edCrumble. This new approach is explored in the context of a higher education programming course, where teaching assistants design labs and home practice sessions with online smart learning content on a weekly basis. We performed a within-subjects user study to compare the use of the design tool both with and without the visualization. We studied the differences in terms of cognitive load, design outcomes, and user actions within the system to compare both conditions, with the objective of evaluating the impact of using design analytics during the decision-making phase of course design.