In this paper we explore how the qualities of Autonomous Sensory Meridian Response (ASMR) media - its pairing of sonic and visual design, its ability to subvert fast-paced technology for slow experiences, its production of somatic responses, and its attention to the everyday - might reveal new design possibilities for interactions with wearable technology. We recount our year-long design inquiry into the subject, which began with an interview with a "live" ASMR creator and design probes, continued through a series of first-person design exercises, and resulted in the creation of two interactive garments for attending to, noticing, and becoming enchanted with our everyday surroundings. We conclude by suggesting that these ASMR-inspired designs cultivate personal, intimate, embodied, and felt practices of attention within our everyday, mundane environments.
From a morning forest to a sunset beach: Understanding visual experiences and the roles of personal characteristics for designing relaxing digital nature
Nature experiences, especially the visual aspects of nature, have been widely used to facilitate relaxation. Fueled by digital technology, simulated visual nature experiences have gained popularity in creating healing environments that induce relaxation. However, while easily applicable, not all nature-imitating visuals lead to relaxation. How to effectively design relaxing visual nature experiences remains largely unexplored. This paper investigates how different nature qualities facilitate relaxing visual experiences and the roles that two personal characteristics (mood and nature-relatedness) play. Through an online survey and interviews, we assessed 16 nature video clips, representing eight distinctive nature qualities, and compared perceived experiences while considering the influence of personal characteristics. The results indicate four types of visual qualities (engaging, instinctive, ambient, and derivative) underlying nature-induced relaxation, and show that nature-relatedness influences the degree to which nature video clips elicit relaxation. We discuss design implications for creating personalized digital nature.
- Award ID(s):
- 2143552
- PAR ID:
- 10527488
- Publisher / Repository:
- International Journal of Human-Computer Interaction
- Date Published:
- Journal Name:
- International Journal of Human-Computer Interaction
- ISSN:
- 2180-1347
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
BACKGROUND: Facial expressions are critical for conveying emotions and facilitating social interaction. Yet, little is known about how accurately sighted individuals recognize emotions facially expressed by people with visual impairments in online communication settings.
OBJECTIVE: This study aimed to investigate sighted individuals’ ability to understand facial expressions of six basic emotions in people with visual impairments during Zoom calls. It also aimed to examine whether education on facial expressions specific to people with visual impairments would improve emotion recognition accuracy.
METHODS: Sighted participants viewed video clips of individuals with visual impairments displaying facial expressions. They then identified the emotions displayed. Next, they received an educational session on facial expressions specific to people with visual impairments, addressing unique characteristics and potential misinterpretations. After education, participants viewed another set of video clips and again identified the emotions displayed.
RESULTS: Before education, participants frequently misidentified emotions. After education, their accuracy in recognizing emotions improved significantly.
CONCLUSIONS: This study provides evidence that education on facial expressions of people with visual impairments can significantly enhance sighted individuals’ ability to accurately recognize emotions in online settings. This improved accuracy has the potential to foster more inclusive and effective online interactions between people with and without visual disabilities.
-
Recent years have witnessed an increasing interest in image-based question-answering (QA) tasks. However, due to data limitations, there has been much less work on video-based QA. In this paper, we present TVQA, a large-scale video QA dataset based on 6 popular TV shows. TVQA consists of 152,545 QA pairs from 21,793 clips, spanning over 460 hours of video. Questions are designed to be compositional in nature, requiring systems to jointly localize relevant moments within a clip, comprehend subtitle-based dialogue, and recognize relevant visual concepts. We provide analyses of this new dataset as well as several baselines and a multi-stream end-to-end trainable neural network framework for the TVQA task. The dataset is publicly available at http://tvqa.cs.unc.edu.
-
Sound Travels is a US-based, federally-funded collaboration between sound researchers, learning researchers, and educational practitioners working to understand the role of soundscapes on free-choice, out-of-school learning experiences. In this paper, members of our research team describe how we have combined approaches from acoustic ecology and visitor studies to navigate the affordances and challenges of studying sound across several complex leisure settings (a science museum, a botanic garden, a park, and a zoo). As an exploratory and transdisciplinary project, our initial work has involved significant deliberation about how to meaningfully and effectively gather data in highly variable acoustic environments, as well as what types and characteristics of sound data are most salient to understanding visitors’ experiences of sound. In addition to grappling with these technical questions, we have also worked to ensure that our research does not detract from positive visitor experiences in these spaces and that it directly engages perspectives from practitioners and visitors about cognition, affect, and culture. We will describe the logic of the methods we have used to date (stationary ambient recordings, a post-experience visitor questionnaire, and a “sound search” in which visitors record video clips), as well as our plans for further study.
-
This study investigates how individual predispositions toward Virtual Reality (VR) affect user experiences in collaborative VR environments, particularly in workplace settings. By adapting the Video Game Pursuit Scale to measure VR predisposition, we aim to establish the reliability and validity of this adapted measure in assessing how personal characteristics influence engagement and interaction in VR. Two studies, the first correlational and the second quasi-experimental, were conducted to examine the impact of environmental features, specifically the differences between static and mobile VR platforms, on participants’ perceptions of time, presence, and task motivation. The findings indicate that individual differences in VR predisposition significantly influence user experiences in virtual environments, with important implications for enhancing VR applications in training and team collaboration. This research contributes to the understanding of human–computer interaction in VR and offers valuable insights for organizations aiming to implement VR technologies effectively. The results highlight the importance of considering psychological factors in the design and deployment of VR systems, paving the way for future research in this rapidly evolving field.