Title: Investigating Sensory Extensions as Input for Interactive Simulations
Sensory extensions enhance our awareness by transforming variations in stimuli normally undetectable by human senses into perceivable outputs. Similarly, interactive simulations for learning promote an understanding of abstract phenomena. Combining sensory extension devices with interactive simulations gives users the novel opportunity to connect their sensory experiences in the physical world to computer-simulated concepts. We explore this opportunity by designing a suite of wearable sensory extension devices that interface with a uniquely inclusive PhET simulation, Ratio and Proportion. In this simulation, two hands can be moved on-screen to various values, representing different mathematical ratios. Users explore changing hand heights to find and maintain ratios through visual and auditory feedback. Our sensory extension devices translate force, distance, sound frequency, and magnetic field strength into quantitative values in order to control individual hands in the simulation. This paper describes the design of the devices and our analysis of feedback from 23 high school-aged youth who used our designs to interact with the Ratio and Proportion simulation.
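The abstract does not give implementation details, but the core mapping it describes (translating a raw sensor reading such as force or distance into a hand value in the simulation) can be illustrated with a minimal, hypothetical sketch. The `read_distance_cm` stub, the sensing range, and the 0.0-1.0 hand-value convention below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: map a raw sensor reading to a simulated hand value.
# The sensor range and the 0.0-1.0 hand-value convention are assumptions,
# not details taken from the paper.

def read_distance_cm() -> float:
    """Stub standing in for a real distance-sensor read (e.g., ultrasonic)."""
    return 42.0  # placeholder reading in centimeters


def to_hand_value(reading: float, lo: float, hi: float) -> float:
    """Linearly normalize a raw reading into an assumed 0-1 hand range."""
    clamped = max(lo, min(hi, reading))
    return (clamped - lo) / (hi - lo)


if __name__ == "__main__":
    # Assumed usable sensing range: 10-100 cm maps onto hand values 0.0-1.0.
    hand_value = to_hand_value(read_distance_cm(), lo=10.0, hi=100.0)
    print(f"hand value sent to simulation: {hand_value:.2f}")
```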
Award ID(s):
2119303
PAR ID:
10466085
Author(s) / Creator(s):
Date Published:
Journal Name:
Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction
Page Range / eLocation ID:
1 to 7
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We investigate typing on a QWERTY keyboard rendered in virtual reality. Our system tracks users' hands in the virtual environment via a Leap Motion mounted on the front of a head-mounted display. This allows typing on an auto-correcting midair keyboard without the need for auxiliary input devices such as gloves or handheld controllers. It supports input via the index fingers of one or both hands. We compare two keyboard designs: a normal QWERTY layout and a split layout. We found users typed at around 16 words per minute using one or both index fingers on the normal layout, and about 15 words per minute using both index fingers on the split layout. Users had a corrected error rate below 2% in all cases. To explore midair typing with limited or no visual feedback, we had users type on an invisible keyboard. Users typed on this keyboard at 11 words per minute with an error rate of 3.3%, despite the keyboard providing almost no visual feedback.
  2. To improve students' class experience, the use of mobile devices has been steadily increasing. However, such use of mobile learning environments in class is mostly static in nature, limited to content delivery or multiple-choice and true/false quiz taking. In CS courses, we need learning environments where students can interact with a problem in a hands-on approach and instructors can assess their learning skills in real time with problems of varying degrees of difficulty. To facilitate such interactive problem solving and real-time assessment using mobile devices, a comprehensive backend system is necessary. This paper presents one such system, named Mobile Response System (MRS) software, associated interactive problem-solving activities, and lessons learned by using it in CS classrooms. MRS provides instructors with the opportunity for evidence-based teaching by allowing students to perform interactive exercises on their mobile devices with different learning outcomes and by providing instant feedback on their performance and mental models. MRS is easy to use, extensible, and can render interactive exercises developed by third-party developers. The student performance data show its effectiveness in increasing student understanding of difficult concepts, and the overall perception of using the software was very positive.
  3. The scale and accessibility of passive global surveillance have rapidly increased over time. This provides an opportunity to calibrate the performance of models, algorithms, and reflectance ratios between remote-sensing devices. Here, we test the sensitivity and specificity of the Eucalypt chlorophyll-a reflectance ratio (ECARR) and Eucalypt chlorophyll-b reflectance ratio (ECBRR) to remotely identify eucalypt vegetation in Queensland, Australia. We compare the reflectance ratio values from Sentinel-2 and Planet imagery across four sites of known vegetation composition. All imagery was transformed to reflectance values, and Planet imagery was additionally scaled to harmonize across Planet scenes. ECARR can identify eucalypt vegetation remotely with high sensitivity but shows low specificity and is impacted by the density of the vegetation. ECBRR reflectance ratios show similar sensitivity and specificity when identifying eucalypt vegetation, but with values an order of magnitude smaller than ECARR. We find that ECARR was better at identifying eucalypt vegetation in the Sentinel-2 imagery than in the Planet imagery. ECARR can serve as a general chlorophyll indicator but is not a specific index to identify Eucalyptus vegetation with certainty.
  4. Touch is often omitted or viewed as unnecessary in digital learning. Lack of touch feedback limits the accessibility and multimodal capacity of digital educational content. Touchscreens with vibratory, haptic feedback are prevalent, yet this kind of feedback is often under-utilized. This work provides initial investigations into the design, development, and use of vibratory feedback within multimodal, interactive, educational simulations on touchscreen devices by learners with and without visual impairments. The objective of this work is to design and evaluate different haptic paradigms that could support interaction and learning in educational simulations. We investigated the implementation of four haptic paradigms in two physics simulations. Interviews were conducted with eight learners (five sighted learners; three learners with visual impairments) on one simulation and initial results are shared. We discuss the learner outcomes of each paradigm and how they impact design and development moving forward. 
  5. Recent robot collections provide various interactive tools for users to explore and analyze their datasets. Yet, the literature lacks data on how users interact with these collections and which tools can best support their goals. This late-breaking report presents preliminary data on the utility of four interactive tools for accessing a collection of robot hands. The tools include a gallery and similarity comparison for browsing and filtering existing hands, a prediction tool for estimating user impression of hands (e.g., humanlikeness), and a recommendation tool suggesting design features (e.g., number of fingers) for achieving a target user impression rating. Data from a user study with 9 novice robotics researchers suggest the users found the tools useful for various tasks and especially appreciated the gallery and recommendation functionalities for understanding the complex relationships of the data. We discuss the results and outline future steps for developing interface design guidelines for robot collections. 