Title: Data visualization literacy: Definitions, conceptual frameworks, exercises, and assessments
Abstract:
In the information age, the ability to read and construct data visualizations becomes as important as the ability to read and write text. However, while standard definitions and theoretical frameworks to teach and assess textual, mathematical, and visual literacy exist, current data visualization literacy (DVL) definitions and frameworks are not comprehensive enough to guide the design of DVL teaching and assessment. This paper introduces a data visualization literacy framework (DVL-FW) that was specifically developed to define, teach, and assess DVL. The holistic DVL-FW promotes both the reading and construction of data visualizations, a pairing analogous to that of both reading and writing in textual literacy and understanding and applying in mathematical literacy. Specifically, the DVL-FW defines a hierarchical typology of core concepts and details the process steps that are required to extract insights from data. Advancing the state of the art, the DVL-FW interlinks theoretical and procedural knowledge and showcases how both can be combined to design curricula and assessment measures for DVL. Earlier versions of the DVL-FW have been used to teach DVL to more than 8,500 residential and online students, and results from this effort have helped revise and validate the DVL-FW presented here.
Award ID(s):
1713567
PAR ID:
10101226
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Proceedings of the National Academy of Sciences of the United States of America
Volume:
116
Issue:
6
ISSN:
0027-8424
Page Range / eLocation ID:
1857-1864
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Visualization misinformation is a prevalent problem, and combating it requires understanding people's ability to read, interpret, and reason about erroneous or potentially misleading visualizations. This ability currently lacks a reliable measurement: existing visualization literacy tests focus on well-formed visualizations. We systematically develop an assessment for this ability by: (1) developing a precise definition of misleaders (decisions made in the construction of visualizations that can lead to conclusions not supported by the data); (2) constructing initial test items using a design space of misleaders and chart types; (3) trying out the provisional test on 497 participants; and (4) analyzing the test tryout results and refining the items using Item Response Theory, qualitative analysis, a wrong-due-to-misleader score, and the content validity index. Our final bank of 45 items shows high reliability, and we provide item bank usage recommendations for future tests and different use cases. Related materials are available at https://osf.io/pv67z/.
  2. We contribute an autoethnographic reflection on the complexity of defining and measuring visualization literacy (i.e., the ability to interpret and construct visualizations) to expose the tacit thoughts that often exist in between polished works and remain unreported in individual research papers. Our work is inspired by the growing number of empirical studies in visualization research that rely on visualization literacy as a basis for developing effective data representations or educational interventions. Researchers have already made various efforts to assess this construct, yet it is often hard to pinpoint either what we want to measure or what we are effectively measuring. In this autoethnography, we gather insights from 14 internal interviews with researchers who are users or designers of visualization literacy tests. We aim to identify what makes visualization literacy assessment a "wicked" problem. We further reflect on the fluidity of visualization literacy and discuss how this property may lead to misalignment between what the construct is and how measurements of it are used or designed. We also examine potential threats to measurement validity from conceptual, operational, and methodological perspectives. Based on our experiences and reflections, we propose several calls to action aimed at tackling the wicked problem of visualization literacy measurement: broadening test scopes and modalities, improving tests' ecological validity, making tests easier to use, seeking interdisciplinary collaboration, and drawing on continued dialogue about visualization literacy to expect, and become more comfortable with, its fluidity.
  3. Co-reading (when parents read aloud with their children) is an important literacy development activity for children. HCI has begun to explore how technology might support children in co-reading, but little empirical work examines how parents currently co-read, and no work examines how people with visual impairments (PWVI) co-read. PWVIs' perspectives offer unique insights into co-reading, as PWVI often read differently from their children, and (Braille) literacy holds particular cultural significance for PWVI. We observed discussions of co-reading practices in a blind parenting forum on Facebook to establish a grounded understanding of how and why PWVI co-read. We found that PWVIs' co-reading practices were highly diverse and shaped by a variety of socio-technical concerns; visual ability was less influential than other factors such as the ability to read Braille, the presence of social supports, and children's literacy. Our findings show that PWVI have valuable insights into co-reading, which could help technologies in this space better meet the needs of parents and children, with and without disabilities.
  4. Data visualization literacy is essential for K-12 students, yet existing practices emphasize interpreting pre-made visualizations rather than creating them. To address this, we developed the DPV (Domain, Purpose, Visual) framework, which guides middle school students through the visualization design process. The framework simplifies design into three stages: understanding the problem domain, specifying the communication purpose, and translating data into effective visuals. Implemented as a usage scenario in a two-week summer camp, the DPV framework enabled students to create visualizations addressing community issues. Evaluation of student artifacts, focus group interviews, and surveys demonstrated its effectiveness in enhancing students' design skills and understanding of visualization concepts. This work highlights the DPV framework's potential to foster data visualization literacy in K-12 education and broaden participation in the data visualization community.