Visual working memory is traditionally studied using abstract, meaningless stimuli. Although studies using such simplified stimuli have yielded insight into the mechanisms of visual working memory, they may also limit our ability to understand how people encode and store conceptually rich and meaningful stimuli in the real world. Recent studies have demonstrated that meaningful and familiar visual stimuli that connect to existing knowledge are better remembered than abstract colors or shapes, indicating that meaning can unlock additional working memory capacity. These findings challenge current models of visual working memory and suggest that its capacity is not fixed but depends on the type of information being remembered and, in particular, how that information connects to preexisting knowledge.
- Award ID(s): 1853630
- PAR ID: 10375766
- Date Published:
- Journal Name: eLife
- Volume: 11
- ISSN: 2050-084X
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Fitch, T.; Lamm, C.; Leder, H.; Teßmar-Raible, K. (Eds.) We make frequent decisions about how to manage our health, yet do so with information that is highly complex or received piecemeal. Causal models can provide guidance about how components of a complex system interact, yet models that provide a complete causal story may be more complex than people can reason about. Prior work has provided mixed insights into our ability to make decisions with causal models, showing that people can use them in novel domains but that they may impede decisions in familiar ones. We examine how tailoring causal information to the question at hand may aid decision making, using simple diagrams with only the relevant causal paths (Experiment 1) or those paths highlighted within a complex causal model (Experiment 2). We find that diagrams tailored to a choice improve decision accuracy over complex diagrams or prior knowledge, providing new evidence for how causal models can aid decisions.
- Research exploring how to support decision-making has often used machine learning to automate or assist human decisions. We take an alternative approach to improving decision-making, using machine learning to help stakeholders surface ways to improve and make fairer decision-making processes. We created "Deliberating with AI", a web tool that enables people to create and evaluate ML models in order to examine the strengths and shortcomings of past decision-making and deliberate on how to improve future decisions. We apply this tool in a people-selection context, having stakeholders, both decision makers (faculty) and decision subjects (students), use the tool to improve graduate school admission decisions. Through our case study, we demonstrate how the stakeholders used the web tool to create ML models that served as boundary objects for deliberation over organizational decision-making practices. We share insights from our study to inform future research on stakeholder-centered participatory AI design and technology for organizational decision-making.
- Abstract. In the geosciences, recent attention has been paid to the influence of uncertainty on expert decision-making. When making decisions under conditions of uncertainty, people tend to employ heuristics (rules of thumb) based on experience, relying on their prior knowledge and beliefs to intuitively guide choice. Over 50 years of decision-making research in cognitive psychology demonstrates that heuristics can lead to less-than-optimal decisions, collectively referred to as biases. For example, the availability bias occurs when people make judgments based on what is most dominant or accessible in memory; geoscientists who have spent the past several months studying strike-slip faults will have this terrain most readily available in their mind when interpreting new seismic data. Given the important social and commercial implications of many geoscience decisions, there is a need to develop effective interventions for removing or mitigating decision bias. In this paper, we outline the key insights from decision-making research about how to reduce bias and review the literature on debiasing strategies. First, we define an optimal decision, since improving decision-making requires having a standard to work towards. Next, we discuss the cognitive mechanisms underlying decision biases and describe three biases that have been shown to influence geoscientists' decision-making (availability bias, framing bias, anchoring bias). Finally, we review existing debiasing strategies that have applicability in the geosciences, with special attention given to strategies that make use of information technology and artificial intelligence (AI). We present two case studies illustrating different applications of intelligent systems for the debiasing of geoscientific decision-making, wherein debiased decision-making is an emergent property of the coordinated and integrated processing of human–AI collaborative teams.