Title: MobiSweep: Exploring Spatial Design Ideation Using a Smartphone as a Hand-held Reference Plane
In this paper, we explore quick 3D shape composition during early-phase spatial design ideation. Our approach is to re-purpose a smartphone as a hand-held reference plane for creating, modifying, and manipulating 3D sweep surfaces. We implemented MobiSweep, a prototype application to explore a new design space of constrained spatial interactions that combine direct orientation control with indirect position control via well-established multi-touch gestures. MobiSweep leverages kinesthetically aware interactions for the creation of a sweep surface without explicit position tracking. The design concepts generated by users, in conjunction with their feedback, demonstrate the potential of such interactions in enabling spatial ideation.
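The core operation the abstract describes, placing a 2D profile at successive poses of a hand-held reference plane to form a sweep surface, can be sketched as follows. This is a minimal illustration under stated assumptions, not MobiSweep's actual implementation: the function name, the frame representation (position plus 3x3 rotation matrix), and the sample data are all hypothetical.

```python
import numpy as np

def sweep_surface(profile_2d, frames):
    """Place a closed 2D profile at each reference-plane frame.

    profile_2d: (P, 2) array of profile points in the plane.
    frames: sequence of (position (3,), rotation (3, 3)) pairs, e.g.
            sampled from the phone's pose as the user sweeps it.
    Returns an (F, P, 3) array of swept cross-section vertices.
    """
    # Lift the 2D profile into the plane's local XY (z = 0) once.
    local = np.column_stack([profile_2d, np.zeros(len(profile_2d))])
    rings = [local @ rotation.T + position for position, rotation in frames]
    return np.stack(rings)

# Example: sweep a unit square straight up the z-axis in five steps.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
frames = [(np.array([0.0, 0.0, z]), np.eye(3)) for z in np.linspace(0.0, 2.0, 5)]
surface = sweep_surface(square, frames)
print(surface.shape)  # (5, 4, 3)
```

Connecting each ring of vertices to the next with triangles would yield the final mesh; the interesting part in MobiSweep is that the frames come from device orientation plus touch gestures rather than from full 6-DoF tracking.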
Award ID(s):
1632154 1329979 1538868
NSF-PAR ID:
10041306
Author(s) / Creator(s):
Date Published:
Journal Name:
TEI '16: Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction
Page Range / eLocation ID:
12-20
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We present RealFusion, an interactive workflow that supports early-stage design ideation in a digital 3D medium. RealFusion is inspired by the practice of found-object art, wherein new representations are created by composing existing objects. The key motivation behind our approach is the direct creation of 3D artifacts during design ideation, in contrast to the conventional practice of employing 2D sketching. RealFusion comprises three creative states where users can (a) repurpose physical objects as modeling components, (b) modify the components to explore different forms, and (c) compose them into a meaningful 3D model. We demonstrate RealFusion using a simple interface comprising a depth sensor and a smartphone. To achieve direct and efficient manipulation of modeling elements, we also utilize mid-air interactions with the smartphone. We conduct a user study with novice designers to evaluate the creative outcomes that can be achieved using RealFusion.
  2. Inspirational stimuli are known to be effective in supporting ideation during early-stage design. However, prior work has predominantly constrained designers to text-only queries when searching for stimuli, which is inconsistent with real-world design behavior, where fluidity across modalities (e.g., visual, semantic) is standard practice. In the current work, we introduce a multi-modal search platform that retrieves inspirational stimuli in the form of 3D-model parts using text, appearance, and function-based search inputs. Computational methods leveraging a deep-learning approach are presented for designing and supporting this platform, which relies on deep neural networks trained on a large dataset of 3D-model parts. This work further presents the results of a cognitive study (n = 21) where the aforementioned search platform was used to find parts to inspire solutions to a design challenge. Participants engaged with three different search modalities: by keywords, by 3D parts, and by user-assembled 3D parts in their workspace. When searching by parts that were selected or in their workspace, participants had additional control over the similarity of appearance and function of results relative to the input. The results of this study demonstrate that the modality used impacts search behavior, such as search frequency, how retrieved search results are engaged with, and how broadly the search space is covered. Specific results link interactions with the interface to search strategies participants may have used during the task. Findings suggest that when searching for inspirational stimuli, desired results can be achieved both by direct search inputs (e.g., by keyword) and by more randomly discovered examples where a specific goal was not defined. Supporting both search processes is therefore important when designing search platforms for inspirational stimuli retrieval.
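The part-retrieval step this abstract describes can be illustrated with a generic embedding-similarity search. This is a hedged sketch, not the paper's system: the `retrieve` function, the 2-D toy embeddings, and the cosine-similarity ranking are illustrative assumptions (the paper's platform uses deep-neural-network embeddings of 3D-model parts, with separate control over appearance and function similarity).

```python
import numpy as np

def retrieve(query_vec, part_vecs, k=3):
    """Rank part embeddings by cosine similarity to a query embedding.

    query_vec: (D,) embedding of the text, appearance, or function query.
    part_vecs: (N, D) matrix of part embeddings.
    Returns the indices and scores of the top-k most similar parts.
    """
    q = query_vec / np.linalg.norm(query_vec)
    p = part_vecs / np.linalg.norm(part_vecs, axis=1, keepdims=True)
    scores = p @ q                 # cosine similarity per part
    top = np.argsort(-scores)[:k]  # highest similarity first
    return top, scores[top]

# Toy 2-D embeddings for three hypothetical parts.
parts = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
idx, scores = retrieve(np.array([1.0, 0.1]), parts, k=2)
print(idx)  # [0 2]
```

In such a platform, text, selected-part, and workspace queries would each map to a vector in the same embedding space, so one ranking function serves all three modalities.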
  3. Design thinking is an approach to educational curriculum that builds empathy, encourages ideation, and fosters active problem solving through hands-on design projects. Embedding participatory "co-design" into design thinking curriculum offers students agency in finding solutions to real-world design challenges, which may support personal empowerment. An opportunity to explore this prospect arose in the design of sounds for an accessible interactive science-education simulation in the PhET Project. Over the course of three weeks, PhET researchers engaged blind and visually impaired high-school students in a design thinking curriculum that included the co-design of sounds and auditory interactions for the Balloons and Static Electricity (BASE) sim. By the end of the curriculum, students had iterated through all aspects of design thinking and performed a quantitative evaluation of multiple sound prototypes. Furthermore, the group's mean self-efficacy rating had increased. We reflect on our curriculum and the choices we made that helped enable the students to become authentic partners in sound design.
  4. Virtual reality (VR) offers potential as a collaborative tool for both technology design and human-robot interaction. We utilized a participatory, human-centered design (HCD) methodology to develop a collaborative, asymmetric VR game to explore teens' perceptions of, and interactions with, social robots. Our paper illustrates three stages of our design process: ideation, prototyping, and usability testing with users. Through these stages we identified important design requirements for our mid-fidelity environment. We then describe findings from our pilot test of the mid-fidelity VR game with teens. Due to the unique asymmetric virtual reality design, we observed successful collaborations and interesting collaboration styles across teens. This study highlights the potential of asymmetric VR as a collaborative design tool and as an appropriate medium for successful teen-to-teen collaboration.
  5. In this paper, we introduce a creative pipeline to incorporate physiological and behavioral data from contemporary marine mammal research into data-driven animations, leveraging functionality from industry tools and custom scripts to promote scientific insights, public awareness, and conservation outcomes. Our framework can flexibly transform data describing animals’ orientation, position, heart rate, and swimming stroke rate to control the position, rotation, and behavior of 3D models, to render animations, and to drive data sonification. Additionally, we explore the challenges of unifying disparate datasets gathered by an interdisciplinary team of researchers, and outline our design process for creating meaningful data visualization tools and animations. As part of our pipeline, we clean and process raw acceleration and electrophysiological signals to expedite complex multi-stream data analysis and the identification of critical foraging and escape behaviors. We provide details about four animation projects illustrating marine mammal datasets. These animations, commissioned by scientists to achieve outreach and conservation outcomes, have successfully increased the reach and engagement of the scientific projects they describe. These impactful visualizations help scientists identify behavioral responses to disturbance, increase public awareness of human-caused disturbance, and help build momentum for targeted conservation efforts backed by scientific evidence. 
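The data-to-animation mapping described above, where telemetry such as position, heading, and heart rate drives a 3D model's transform channels and a sonification track, can be sketched generically. All names, the sample record layout, and the linear bpm-to-pitch mapping (40-180 bpm onto 220-880 Hz) are hypothetical assumptions for illustration, not the authors' actual pipeline.

```python
def to_keyframes(samples, scale=1.0):
    """Convert telemetry samples into animation/sonification keyframes.

    samples: iterable of dicts with 'time' (s), 'position' (x, y, z),
             'heading' (degrees), and 'heart_rate' (bpm).
    Position and heading drive the model's transform channels; heart
    rate is mapped linearly (40-180 bpm -> 220-880 Hz) to a pitch.
    """
    keyframes = []
    for s in samples:
        bpm = min(max(s["heart_rate"], 40), 180)  # clamp to expected range
        pitch_hz = 220.0 + (bpm - 40) / (180 - 40) * (880.0 - 220.0)
        keyframes.append({
            "time": s["time"],
            "location": tuple(scale * c for c in s["position"]),
            "rotation_z_deg": s["heading"],
            "pitch_hz": round(pitch_hz, 1),
        })
    return keyframes

# One sample: a mid-range heart rate maps to the middle of the pitch band.
kf = to_keyframes([{"time": 0.0, "position": (1, 2, 3),
                    "heading": 90.0, "heart_rate": 110}])
print(kf[0]["pitch_hz"])  # 550.0
```

In a production pipeline the resulting keyframes would be written into an animation tool's scene (e.g., as location/rotation channels on the model's armature), which is where the industry tools and custom scripts the abstract mentions come in.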