Title: Vis Repligogy: Towards a Culture of Facilitating Replication Studies in Visualization Pedagogy and Research
In this paper, we present the Vis Repligogy framework, which enables conducting replication studies in the classroom. Replication studies are crucial to strengthening the data visualization field and ensuring its foundations are solid and its methods accurate. Although visualization researchers acknowledge the epistemological significance of replications and their necessity for establishing trust and reliability, the field has made little progress in supporting the publication of such studies and, importantly, in providing the community with methods that encourage replications. Therefore, we contribute Vis Repligogy, a novel framework for systematically incorporating replications within visualization course curricula that not only teaches students replication and evaluation methodologies but also results in executed replication studies that validate prior work. To validate the feasibility of the framework, we present case studies of two graduate data visualization courses that implemented it. These courses resulted in a total of five replication studies. Finally, we reflect on our experience implementing the Vis Repligogy framework to provide useful recommendations for future use. We envision that this framework will encourage instructors to conduct replications in their courses, help facilitate more replications in visualization pedagogy and research, and support a culture shift towards reproducible research. Supplemental materials of this paper are available at https://osf.io/ncb6d/.
Award ID(s):
2145382
PAR ID:
10608134
Publisher / Repository:
The Eurographics Association
Subject(s) / Keyword(s):
CCS Concepts: Human-centered computing → Visualization theory and methods; Visualization pedagogy
Format(s):
Medium: X
Size(s):
9 pages
Right(s):
Creative Commons Attribution 4.0 International
Sponsoring Org:
National Science Foundation
More Like this
  1. A series of failed replications and frauds have raised questions regarding self-correction in science. Metascientific activists have advocated policies that incentivize replications and make them more diagnostically potent. We argue that current debates, as well as research in science and technology studies, have paid little heed to a key dimension of replication practice. Although it sometimes serves a diagnostic function, replication is commonly motivated by a practical desire to extend research interests. The resulting replication, which we label 'integrative', is characterized by a pragmatic flexibility toward protocols. The goal is to appropriate what is useful, not to test for truth. Within many experimental cultures, however, integrative replications can be of ambiguous diagnostic power. Based on interviews with 60 members of the Board of Reviewing Editors for the journal Science, we show how the interplay between the diagnostic and integrative motives for replication differs between fields and produces different cultures of replication. We offer six theses that aim to put science and technology studies and science activism into dialog, to show why effective reforms will need to confront issues of disciplinary difference.
  2. Practicing reproducible scientific research requires access to appropriate reproducibility methodology and software, as well as open data. Strict reproducibility in complex scientific domains such as environmental science, ecology and medicine, however, is difficult if not impossible. Here, we consider replication as a relaxed but bona fide substitution for strict reproducibility and propose using 3D terrain visualization for replication in environmental science studies that propose causal relationships between one or more driver variables and one or more response variables across complex ecosystem landscapes. We base our contention of the usefulness of visualization for replication on more than ten years observing environmental science modelers who use our 3D terrain visualization software to develop, calibrate, validate, and integrate predictive models. To establish the link between replication and model validation and corroboration, we consider replication as proposed by Munafò, i.e., triangulation. We enumerate features of visualization systems that would enable such triangulation and argue that such systems would render feasible domain-specific, open visualization software for use in replicating environmental science studies. 
  3. The visualization community has seen a rise in the adoption of user studies. Empirical user studies systematically test the assumptions that we make about how visualizations can help or hinder viewers’ performance of tasks. Although the increase in user studies is encouraging, it is vital that research on human reasoning with visualizations be grounded in an understanding of how the mind functions. Previously, there were no sufficient models that illustrate the process of decision-making with visualizations. However, Padilla et al. [41] recently proposed an integrative model for decision-making with visualizations, which expands on modern theories of visualization cognition and decision-making. In this paper, we provide insights into how cognitive models can accelerate innovation, improve validity, and facilitate replication efforts, which have yet to be thoroughly discussed in the visualization community. To do this, we offer a compact overview of the cognitive science of decision-making with visualizations for the visualization community, using the Padilla et al. [41] cognitive model as a guiding framework. By detailing examples of visualization research that illustrate each component of the model, this paper offers novel insights into how visualization researchers can utilize a cognitive framework to guide their user studies. We provide practical examples of each component of the model from empirical studies of visualizations, along with visualization implications of each cognitive process, which have not been directly addressed in prior work. Finally, this work offers a case study in utilizing an understanding of human cognition to generate a novel solution to a visualization reasoning bias in the context of hurricane forecast track visualizations. 
  4. Abstract Empirical evaluations of replication have become increasingly common, but there has been no unified approach to doing so. Some evaluations conduct only a single replication study while others run several, usually across multiple laboratories. Designing such programs has largely contended with difficult issues about which experimental components are necessary for a set of studies to be considered replications. However, another important consideration is that replication studies be designed to support sufficiently sensitive analyses. For instance, if hypothesis tests are to be conducted about replication, studies should be designed to ensure these tests are well-powered; if not, it can be difficult to determine conclusively if replication attempts succeeded or failed. This paper describes methods for designing ensembles of replication studies to ensure that they are both adequately sensitive and cost-efficient. It describes two potential analyses of replication studies—hypothesis tests and variance component estimation—and approaches to obtaining optimal designs for them. Using these results, it assesses the statistical power, precision of point estimators and optimality of the design used by the Many Labs Project and finds that while it may have been sufficiently powered to detect some larger differences between studies, other designs would have been less costly and/or produced more precise estimates or higher-powered hypothesis tests. 
  5.
    Experimenter bias and expectancy effects have been well studied in the social sciences and even in human-computer interaction. They refer to nonideal study-design choices made by experimenters that can unfairly influence the outcomes of their studies. While these biases need to be considered when designing any empirical study, they can be particularly significant in the context of replication studies, which may stray from the studies being replicated in only a few admissible ways. Although there are general guidelines for making valid, unbiased choices in each of the several steps of experimental design, making such choices when conducting replication studies has not been well explored. We reviewed 16 replication studies in information visualization published in four top venues from 2008 to the present to characterize how the study designs of the replication studies differed from those of the studies they replicated. We present our characterization categories, which include the prevalence of crowdsourcing and the commonly found replication types and study-design differences. We draw guidelines based on these categories to help researchers make meaningful and unbiased decisions when designing replication studies. Our paper presents the first steps in gaining a larger understanding of this topic and contributes to the ongoing efforts of encouraging researchers to conduct and publish more replication studies in information visualization.