Visualization design studies bring together visualization researchers and domain experts to address yet-unsolved data analysis challenges stemming from the needs of the domain experts. Typically, the visualization researchers lead the design study process and the implementation of any visualization solutions. This setup leverages the visualization researchers' knowledge of methodology, design, and programming, but limited availability to synchronize with the domain experts can hamper the design process. We consider an alternative setup where the domain experts take the lead in the design study, supported by the visualization experts. In this study, the domain experts are computer architecture experts who simulate and analyze novel computer chip designs. These chips rely on a Network-on-Chip (NOC) to connect components. The experts want to understand how the chip designs perform and what aspects of the design led to that performance. To aid this analysis, we develop Vis4Mesh, a visualization system that provides spatial, temporal, and architectural context to simulated NOC behavior. Integration with an existing computer architecture visualization tool enables architects to perform deep dives into the behavior of specific architecture components. We validate Vis4Mesh through a case study and a user study with computer architecture researchers. We reflect on our design and process, discussing advantages, disadvantages, and guidance for engaging in domain expert-led design studies.
Digital Collaborator: Augmenting Task Abstraction in Visualization Design with Artificial Intelligence
In the task abstraction phase of the visualization design process, including in "design studies", a practitioner maps the observed domain goals to generalizable abstract tasks using visualization theory in order to better understand and address the users' needs. We argue that this manual task abstraction process is prone to errors due to designer biases and a lack of domain background and knowledge. Under these circumstances, a collaborator can help validate and provide sanity checks for visualization practitioners during this important task abstraction stage. However, having a human collaborator is not always feasible, and a human may be subject to the same biases and pitfalls. In this paper, we first describe the challenges associated with task abstraction. We then propose a conceptual Digital Collaborator: an artificial intelligence system that aims to help visualization practitioners by augmenting their ability to validate and reason about the output of task abstraction. We also discuss several practical challenges of designing and implementing such systems.
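To make the task abstraction step above concrete, the following is a minimal, hypothetical sketch of mapping domain goals to abstract tasks and mechanically sanity-checking that mapping, the kind of validation a Digital Collaborator could assist with. All goal strings, task labels, and the check itself are illustrative assumptions made here, not taken from the paper.

```python
# Illustrative sketch of task abstraction: mapping domain-specific goals to
# abstract visualization tasks. The goals, labels, and check are hypothetical.

# A small vocabulary of abstract tasks, loosely inspired by common
# visualization task typologies.
ABSTRACT_TASKS = {"identify", "compare", "summarize", "locate", "browse", "explore"}

# A practitioner's (manual) abstraction of observed domain goals.
task_abstraction = {
    "find the simulation run with the worst latency": "locate",
    "contrast two designs across workloads":          "compare",
    "get an overview of traffic over time":           "summarize",
    "see why a specific component stalls":            "identify",
}

def sanity_check(abstraction):
    """Flag mappings that fall outside the agreed task vocabulary.

    A digital collaborator could run far richer checks (bias, coverage,
    consistency with interview notes); this only checks the vocabulary.
    """
    return [
        f"goal '{goal}' mapped to unknown task '{task}'"
        for goal, task in abstraction.items()
        if task not in ABSTRACT_TASKS
    ]

if __name__ == "__main__":
    for warning in sanity_check(task_abstraction):
        print("WARNING:", warning)
```

The point is not this particular check but that an explicit, machine-readable abstraction gives an automated collaborator something concrete to validate and reason over.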
- Award ID(s): 1657466
- PAR ID: 10206916
- Date Published:
- Journal Name: Workshop on Artificial Intelligence for HCI: A Modern Approach (CHI 2020)
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Common pitfalls in visualization projects include a lack of data availability and the domain users' needs and focus changing too rapidly for the design process to complete. While it is often prudent to avoid such projects, we argue it can be beneficial to engage with them in some cases, as the visualization process can help refine data collection, solving a "chicken and egg" problem of having both the data and the tools to analyze it. We found this to be the case in the domain of task parallel computing, where such data and tooling are an open area of research. Despite these hurdles, we conducted a design study. Through a tightly coupled iterative design process, we built Atria, a multi-view execution graph visualization to support performance analysis. Atria simplifies the initial representation of the execution graph by aggregating nodes according to their line of code (a minimal sketch of this aggregation appears after this list). We deployed Atria on multiple platforms, some requiring design alterations. We describe how we adapted the design study methodology to the "moving target" of both the data and the domain experts' concerns, and how this movement kept both the visualization and the programming project healthy. We reflect on our process and discuss what factors allowed the project to succeed in the presence of changing data and user needs.
- Computing professionals in areas like compilers, performance analysis, and security often analyze and manipulate control flow graphs (CFGs) in their work. CFGs are directed networks that describe the possible orderings of instructions in the execution of a program. Visualizing a CFG is a common activity when developing or debugging computational approaches that use them. However, general graph drawing layouts, including the hierarchical ones frequently applied to CFGs, do not capture CFG-specific structures or tasks, so the resulting drawings may not match the needs of this audience, especially for more complicated programs. While several algorithms offer flexibility in specifying the layout, they often require expertise with graph drawing layouts and primitives that these potential users do not have. To bring domain-specific CFG drawing to this audience, we develop CFGConf, a library designed to match the abstraction level of CFG experts. CFGConf provides a JSON interface that produces drawings that can stand alone or be integrated into multi-view visualization systems (an illustrative, hypothetical specification sketch appears after this list). We developed CFGConf through an interactive design process with experts, while incorporating lessons learned from previous CFG visualization systems, a survey of CFG drawing conventions in computing systems conferences, and existing design principles for notations. We evaluate CFGConf in terms of expressiveness, usability, and notational efficiency through a user study and illustrative examples. CFG experts were able to use the library to produce domain-aware layouts and appreciated the task-aware nature of the specification.
- Side-channel attacks that leak sensitive information through a computing device's interaction with its physical environment have proven to be a severe threat to devices' security, particularly when adversaries have unfettered physical access to the device. Traditional approaches for leakage detection measure the physical properties of the device. Hence, they cannot be used during the design process and fail to provide root cause analysis. An alternative approach that is gaining traction is to automate leakage detection by modeling the device. The demand to understand the scope, benefits, and limitations of the proposed tools intensifies with the increase in the number of proposals. In this SoK, we classify approaches to automated leakage detection based on the model's source of truth. We classify the existing tools on two main parameters: whether the model includes measurements from a concrete device and the abstraction level of the device specification used for constructing the model. We survey the proposed tools to determine the current knowledge level across the domain and identify open problems. In particular, we highlight the absence of evaluation methodologies and metrics that would compare proposals' effectiveness from across the domain. We believe that our results help practitioners who want to use automated leakage detection and researchers interested in advancing the knowledge and improving automated leakage detection.
- Modern social media platforms like Twitch, YouTube, etc., embody an open space for content creation and consumption. However, an unintended consequence of such content democratization is the proliferation of toxicity and abuse that content creators get subjected to. Commercial and volunteer content moderators play an indispensable role in identifying bad actors and minimizing the scale and degree of harmful content. Moderation tasks are often laborious, complex, and even if semi-automated, they involve high-consequence human decisions that affect the safety and popular perception of the platforms. In this paper, through an interdisciplinary collaboration among researchers from social science, human-computer interaction, and visualization, we present a systematic understanding of how visual analytics can help in human-in-the-loop content moderation. We contribute a characterization of the data-driven problems and needs for proactive moderation and present a mapping between the needs and visual analytic tasks through a task abstraction framework. We discuss how the task abstraction framework can be used for transparent moderation, design interventions for moderators' well-being, and ultimately, for creating futuristic human-machine interfaces for data-driven content moderation.
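The Atria entry above mentions aggregating execution-graph nodes by their line of code. The sketch below is a minimal, hypothetical illustration of that idea; the node fields, input graph, and function are assumptions made here, not Atria's actual data model or API.

```python
# Hypothetical sketch: collapse execution-graph (task-graph) nodes that share a
# source-code line into one aggregate node, summing their runtimes.
# Node fields ("line", "time_ms") and the input graph are illustrative only.
from collections import defaultdict

# node id -> attributes; edges as (src, dst) pairs
nodes = {
    "t0": {"line": "solver.cpp:42", "time_ms": 1.2},
    "t1": {"line": "solver.cpp:42", "time_ms": 0.9},
    "t2": {"line": "solver.cpp:57", "time_ms": 3.4},
    "t3": {"line": "io.cpp:10",     "time_ms": 0.3},
}
edges = [("t0", "t2"), ("t1", "t2"), ("t2", "t3")]

def aggregate_by_line(nodes, edges):
    """Group nodes by source line and re-map edges onto the aggregates."""
    groups = defaultdict(lambda: {"count": 0, "time_ms": 0.0})
    for nid, attrs in nodes.items():
        group = groups[attrs["line"]]
        group["count"] += 1
        group["time_ms"] += attrs["time_ms"]
    # Edges between aggregates; drop self-edges created by the collapse.
    agg_edges = {
        (nodes[a]["line"], nodes[b]["line"])
        for a, b in edges
        if nodes[a]["line"] != nodes[b]["line"]
    }
    return dict(groups), sorted(agg_edges)

agg_nodes, agg_edges = aggregate_by_line(nodes, edges)
print(agg_nodes)   # e.g. {'solver.cpp:42': {'count': 2, 'time_ms': ~2.1}, ...}
print(agg_edges)   # [('solver.cpp:42', 'solver.cpp:57'), ('solver.cpp:57', 'io.cpp:10')]
```

Collapsing by source line turns a graph with one node per dynamic task into one with a node per static call site, which is typically far smaller and maps directly to the code the analyst can change.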
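The CFGConf entry above mentions a JSON interface for specifying CFG drawings. The snippet below is a purely hypothetical illustration of what a domain-level specification might look like; every field name is invented for this sketch and is not CFGConf's actual schema.

```python
# Purely illustrative: a domain-level CFG drawing specification expressed as a
# Python dict and serialized to JSON. Field names are invented for this sketch
# and are NOT CFGConf's actual schema.
import json

cfg_spec = {
    "graph": {
        "nodes": [
            {"id": "B0", "label": "entry"},
            {"id": "B1", "label": "cmp i, n"},
            {"id": "B2", "label": "body"},
            {"id": "B3", "label": "exit"},
        ],
        "edges": [
            {"src": "B0", "dst": "B1"},
            {"src": "B1", "dst": "B2", "kind": "true"},
            {"src": "B1", "dst": "B3", "kind": "false"},
            {"src": "B2", "dst": "B1", "kind": "back"},  # loop back-edge
        ],
    },
    # Domain-level intents rather than low-level graph drawing primitives.
    "layout": {"style": "hierarchical", "emphasize_loops": True},
}

print(json.dumps(cfg_spec, indent=2))
```

The design point such an interface illustrates is that users describe CFG-level intent (basic blocks, edge kinds, loop emphasis) rather than graph drawing primitives, which matches the abstraction level of CFG experts.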