Interaction is critical for data analysis and sensemaking. However, designing interactive physicalizations is challenging, as it requires cross-disciplinary knowledge in visualization, fabrication, and electronics. Interactive physicalizations are typically produced in an unstructured manner, resulting in unique solutions for a specific dataset, problem, or interaction that cannot be easily extended or adapted to new scenarios or future physicalizations. To mitigate these challenges, we introduce a computational design pipeline to 3D print network physicalizations with integrated sensing capabilities. Networks are ubiquitous, yet their complex geometry also demands significant engineering consideration to provide intuitive, effective interactions for exploration. Using our pipeline, designers can readily produce network physicalizations supporting selection, the most critical atomic operation for interaction, by touch through capacitive sensing and computational inference. Our computational design pipeline introduces a new design paradigm by integrating the form and interactivity of a physicalization into one cohesive fabrication workflow. We evaluate our approach using (i) computational evaluations, (ii) three usage scenarios focusing on general visualization tasks, and (iii) expert interviews. The design paradigm introduced by our pipeline can lower barriers to physicalization research, creation, and adoption.
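To make the selection step concrete, the following minimal sketch (our illustration, not the authors' implementation) shows how a touched node might be inferred from per-node capacitance channels: calibrate a baseline, then report the node whose reading deviates most, if it clears a threshold. The class name, channel layout, and threshold are all assumptions.

```python
# Hypothetical sketch of touch-based node selection via capacitive sensing.
# Assumes one capacitance channel per printed network node; names and
# thresholds are illustrative, not from the paper.

from dataclasses import dataclass

@dataclass
class CapacitiveSelector:
    baseline: list[float]      # per-node capacitance with no touch
    threshold: float = 5.0     # minimum deviation (arbitrary units) to count as a touch

    def infer_selection(self, readings: list[float]) -> int | None:
        """Return the index of the touched node, or None if no touch is detected."""
        deviations = [r - b for r, b in zip(readings, self.baseline)]
        best = max(range(len(deviations)), key=lambda i: deviations[i])
        return best if deviations[best] > self.threshold else None

# Usage: calibrate once, then poll the sensor each frame.
selector = CapacitiveSelector(baseline=[100.0, 102.0, 98.0, 101.0])
print(selector.infer_selection([100.5, 113.0, 98.2, 101.1]))  # -> 1 (node 1 touched)
```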
Multi-Touch Querying on Data Physicalizations in Immersive AR
Data physicalizations (3D printed terrain models, anatomical scans, or even abstract data) can naturally engage both the visual and haptic senses in ways that are difficult or impossible with traditional planar touch screens and even immersive digital displays. Yet, the rigid 3D physicalizations produced with today's most common 3D printers are fundamentally limited for data exploration and querying tasks that require dynamic input (e.g., touch sensing) and output (e.g., animation), functions that are easily handled with digital displays. We introduce a novel style of hybrid virtual + physical visualization designed specifically to support interactive data exploration tasks. Working toward a "best of both worlds" solution, our approach fuses immersive AR, physical 3D data printouts, and touch sensing through the physicalization. We demonstrate that this solution can support three of the most common spatial data querying interactions used in scientific visualization (streamline seeding, dynamic cutting planes, and world-in-miniature visualization). Finally, we present quantitative performance data and describe a first application to exploratory visualization of data from an actively studied supercomputer climate simulation, with feedback from domain scientists.
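As a concrete illustration of one of these interactions, the sketch below shows how a touch registered on the physical model could seed a streamline: the touched point, assumed already transformed into data coordinates, becomes the initial condition for fourth-order Runge-Kutta integration through the vector field. The toy circular field and step parameters are our assumptions, not the paper's implementation.

```python
# Illustrative streamline seeding: integrate a path through an analytic 2D
# vector field from a touch point on the physicalization (assumed already
# registered into data coordinates). Field and step sizes are made up here.

import numpy as np

def velocity(p: np.ndarray) -> np.ndarray:
    """Toy circular flow field; a real system would sample the simulation grid."""
    x, y = p
    return np.array([-y, x])

def trace_streamline(seed, step=0.01, n_steps=500):
    """Fourth-order Runge-Kutta integration from the touched seed point."""
    path = [np.asarray(seed, dtype=float)]
    for _ in range(n_steps):
        p = path[-1]
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * step * k1)
        k3 = velocity(p + 0.5 * step * k2)
        k4 = velocity(p + step * k3)
        path.append(p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(path)

# A touch at (1, 0) on the model seeds a streamline rendered in AR.
line = trace_streamline(seed=(1.0, 0.0))
print(line.shape)  # (501, 2)
```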
- PAR ID: 10376915
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 5
- Issue: ISS
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 20
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
We present a design-based exploration of the potential to reinterpret glyph-based visualization of scalar fields on 3D surfaces, a traditional scientific visualization technique, as a data physicalization technique. Even with the best virtual reality displays, users often struggle to correctly interpret spatial relationships in 3D datasets; thus, we are motivated to understand the extent to which traditional scientific visualization methods can translate to physical media where users may simultaneously leverage their visual systems and tactile senses to, in theory, better understand and connect with the data of interest. This pictorial traces the process of our design for a specific user study experiment: (1) inspiration, (2) exploring the data physicalization design space, (3) prototyping with 3D printing, (4) applying the techniques to different synthetic datasets. We call our most recent and compelling visual/tactile design "boxcars on potatoes", and the next step in the research is to run a user-based evaluation to elucidate how this design compares to several of the others pictured here.
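One step in such a pipeline that lends itself to a short sketch is sizing the printed glyphs: each scalar sample must be mapped to a glyph height the printer can actually resolve. The function below is a hypothetical illustration of that pass; the millimeter bounds and linear mapping are assumptions, not the pictorial's actual parameters.

```python
# Hypothetical glyph-sizing pass for a printable scalar-field physicalization:
# map each sampled scalar value to a glyph (e.g., "boxcar") height within a
# range the printer can resolve. The bounds below are illustrative.

import numpy as np

def glyph_heights(scalars: np.ndarray,
                  min_h_mm: float = 1.0,
                  max_h_mm: float = 8.0) -> np.ndarray:
    """Linearly rescale scalar samples to printable glyph heights in mm."""
    lo, hi = scalars.min(), scalars.max()
    t = (scalars - lo) / (hi - lo) if hi > lo else np.zeros_like(scalars)
    return min_h_mm + t * (max_h_mm - min_h_mm)

samples = np.array([0.2, 0.9, 0.5, 0.0, 1.3])   # scalar field sampled at glyph sites
print(glyph_heights(samples))                   # heights between 1 mm and 8 mm
```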
Despite advances in digitizing vision and hearing, touch still lacks an equivalent digital interface matching the fidelity of human perception. This gap limits the quality of digital tactile information and the realism of virtual experiences. Here, we introduce a step toward human-resolution haptics: a class of wearable tactile displays designed to match the spatial and temporal acuity of the human fingertip. Our device, VoxeLite, is a 0.1-millimeter-thick, 0.19-gram, skin-conformal array of individually addressable soft electroadhesive actuators ("nodes"). As users touch and move across surfaces, VoxeLite delivers high-resolution distributed forces via the nodes. Enabled by scalable microfabrication techniques, the display achieves actuator densities up to 110 nodes per square centimeter, produces stimuli up to 800 hertz, and remains transparent to real-world tactile input. We demonstrate its ability to render small-scale hapticons and virtual textures and transmit physical surfaces, validated through human psychophysics and biomimetic sensing. These findings position VoxeLite as a platform for human-resolution haptics in immersive interfaces, robotics, and digital touch communication.
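A rendering loop for such an array plausibly samples a virtual texture under each actuator as the finger moves; the sketch below illustrates that idea with a nearest-texel lookup over a striped heightmap. The grid layout, units, and mapping are our assumptions, not VoxeLite's actual drive scheme.

```python
# Illustrative rendering loop for a fingertip actuator array: as the finger
# moves across a virtual texture, each node's drive level is sampled from a
# heightmap under that node's skin location. Grid size and mapping are
# assumptions, not VoxeLite's actual drive scheme.

import numpy as np

def node_drive_levels(finger_xy, node_offsets_mm, texture, texel_per_mm=4.0):
    """Sample the virtual texture under each actuator node (nearest texel)."""
    levels = []
    h, w = texture.shape
    for dx, dy in node_offsets_mm:
        u = int((finger_xy[0] + dx) * texel_per_mm) % w
        v = int((finger_xy[1] + dy) * texel_per_mm) % h
        levels.append(texture[v, u])
    return np.array(levels)  # 0..1 drive levels, one per node

texture = (np.indices((64, 64)).sum(axis=0) % 8 < 4).astype(float)  # striped heightmap
offsets = [(x, y) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)]  # 3x3 node patch
print(node_drive_levels((10.0, 5.0), offsets, texture))
```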
Most mobile health apps employ data visualization to help people view their health and activity data, but these apps provide limited support for visual data exploration. Furthermore, despite its huge potential benefits, mobile visualization research in the personal data context is sparse. This work aims to empower people to easily navigate and compare their personal health data on smartphones by enabling flexible time manipulation with speech. We designed and developed Data@Hand, a mobile app that leverages the synergy of two complementary modalities: speech and touch. Through an exploratory study with 13 long-term Fitbit users, we examined how multimodal interaction helps participants explore their own health data. Participants successfully adopted multimodal interaction (i.e., speech and touch) for convenient and fluid data exploration. Based on the quantitative and qualitative findings, we discuss design implications and opportunities with multimodal interaction for better supporting visual data exploration on mobile devices.
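The speech half of such an interface hinges on mapping spoken time expressions to concrete date ranges. The toy parser below handles a few relative phrases to illustrate the idea; Data@Hand's actual grammar is far richer, and these phrases and rules are our assumptions.

```python
# Toy parser mapping a few spoken time expressions to date ranges, in the
# spirit of speech-driven time navigation; phrases and rules are assumptions.

from datetime import date, timedelta

def parse_time_phrase(phrase: str, today: date) -> tuple[date, date]:
    phrase = phrase.lower().strip()
    if phrase == "last week":
        start = today - timedelta(days=today.weekday() + 7)  # previous Monday
        return start, start + timedelta(days=6)
    if phrase == "this month":
        return today.replace(day=1), today
    if phrase == "yesterday":
        d = today - timedelta(days=1)
        return d, d
    raise ValueError(f"unrecognized phrase: {phrase!r}")

print(parse_time_phrase("last week", date(2021, 3, 10)))
# (datetime.date(2021, 3, 1), datetime.date(2021, 3, 7))
```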
While visualizations can support understanding of complex phenomena, their effectiveness may vary with the recipient's familiarity with both the phenomenon and the visualization. The current study contrasted interpretations of simulated hurricane paths by student populations from a high-frequency hurricane area versus an area with no local hurricane risk. Non-expert understanding of trajectory predictions was supported via two visualizations: common cones of uncertainty and novel dynamic ensembles. General patterns of performance were similar across the two groups. Participants from the high-risk area did show narrower decision thresholds in both the common and novel visualization formats. More variability was consistently considered possible when viewing the dynamic ensemble displays. Although participants from the high-risk area were more likely to have experienced trajectories that fell outside forecast paths, greater familiarity tended toward narrower interpretations of when evacuation was needed within the range of possible variability. The results suggest an advantage of dynamic ensembles for grasping uncertainty, even in populations familiar with hurricanes.
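To illustrate how a dynamic ensemble differs from a static cone, one can repeatedly sample perturbed storm tracks and animate them one at a time. The sketch below is a toy generator under assumed parameters, not the study's actual stimuli.

```python
# Toy generator of ensemble hurricane tracks: perturb the heading of a
# nominal path with a random walk and emit many candidate trajectories.
# Parameters are illustrative, not the study's actual stimuli.

import math
import random

def ensemble_tracks(start=(0.0, 0.0), heading_deg=45.0, steps=24,
                    step_km=30.0, sigma_deg=6.0, n_tracks=20, seed=7):
    rng = random.Random(seed)
    tracks = []
    for _ in range(n_tracks):
        x, y = start
        heading = heading_deg
        track = [(x, y)]
        for _ in range(steps):
            heading += rng.gauss(0.0, sigma_deg)   # random-walk the heading
            x += step_km * math.cos(math.radians(heading))
            y += step_km * math.sin(math.radians(heading))
            track.append((x, y))
        tracks.append(track)
    return tracks

# Drawing each sampled track briefly, one after another, yields the animated
# "dynamic ensemble" effect, as opposed to a single static cone.
tracks = ensemble_tracks()
print(len(tracks), "tracks of", len(tracks[0]), "points")
```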

