The intersection of dance and artificial intelligence presents fertile ground for exploring human-machine interaction, co-creation, and embodied expression. This paper reports on a seven-month, four-phase collaboration with fifteen dancers from a university dance department, encompassing a preliminary study, a redesign of LuminAI (a co-creative AI dance partner), a contextual diary study, and a culminating public performance. Thematic analysis of responses revealed LuminAI’s impact on dancers’ perceptions, improvisational practices, and creative exploration. By blending human and AI interactions, LuminAI influenced dancers’ practices by pushing them to explore the unexpected, fostering deeper self-awareness, and enabling novel choreographic pathways. The experience reshaped their creative subprocesses, enhancing their spatial awareness, movement vocabulary, and openness to experimentation. Our contributions underscore the potential of AI not only to augment dancers’ immediate improvisational capabilities but also to catalyze broader transformations in their creative processes, paving the way for future systems that inspire and amplify human creativity.
AI Meets Holographic Pepper’s Ghost: A Co-Creative Public Dance Experience
In this demonstration, we present a holographically projected version of LuminAI, an interactive art installation that allows participants to collaborate with an AI dance partner by improvising movements together. By combining top-down and bottom-up approaches, we seek to understand embodied co-creativity in an improvisational dance setting and thereby refine the design of the modular AI agent so it can creatively collaborate with a dancer. The purpose of this demonstration is to describe the five-module agent design and to investigate how we can create an immersive experience that is design-efficient, portable, and lightweight, and that supports duo-user participation. Through this installation in a simulated black-box space, audience members and dancers engage in an immersive co-creative dance experience, inspiring discussion on the limitless applications of dance and technology in the realms of learning, training, and creativity.
- Award ID(s):
- 2123597
- PAR ID:
- 10461749
- Date Published:
- Journal Name:
- DIS '23 Companion: Companion Publication of the 2023 ACM Designing Interactive Systems Conference
- Page Range / eLocation ID:
- 274 to 278
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Engineering educators have increasingly sought strategies for integrating the arts into their curricula. The primary objective of this integration varies, but one common objective is to improve students’ creative thinking skills. In this paper, we sought to quantify changes in student creativity that resulted from participation in a mechanical engineering course targeted at integrating engineering, technology, and the arts. The course was team-taught by instructors from mechanical engineering and art. The art instructor introduced origami principles and techniques as a means for students to optimize engineering structures. Through a course project, engineering student teams interacted with art students to perform structural analysis on an origami-based art installation, which was the capstone project of the art instructor’s undergraduate origami course. Three engineering student teams extended this course project, collaborating with the art students on the final design and physical installation.
-
Generative, ML-driven interactive systems have the potential to change how people interact with computers in creative processes, turning tools into co-creators. However, it is still unclear how we might achieve effective human-AI collaboration in open-ended task domains. There are several known challenges around communication in interaction with ML-driven systems. An overlooked aspect in the design of co-creative systems is how users can be better supported in learning to collaborate with such systems. Here we reframe human-AI collaboration as a learning problem: inspired by research on team learning, we hypothesize that learning strategies that apply to human-human teams might also increase the effectiveness and quality of collaboration between humans and co-creative generative systems. In this position paper, we aim to promote team learning as a lens for designing more effective co-creative human-AI collaboration and emphasize collaboration process quality as a goal for co-creative systems. Furthermore, we outline a preliminary schematic framework for embedding team learning support in co-creative AI systems. We conclude by proposing a research agenda and posing open questions for further study on supporting people in learning to collaborate with generative AI systems.
-
The uSucceed project aims to support neurodiverse individuals in the STEM workforce by utilizing Virtual Reality (VR) to deliver a customized training curriculum in cybersecurity. This short paper delves into the design and methodology implemented by the uSucceed learning system. Preliminary usability test evaluations by neurodiverse individuals (n = 8) reveal critical insights into user experience, particularly regarding cybersickness and the usability of the uSucceed VR learning system. Usability findings revealed positive feedback on the immersive environment but highlighted issues with task navigation and inconsistent responses from the AI-driven pedagogical agent. Cybersickness levels ranged from low to moderate, with dizziness and eyestrain being the most reported symptoms. These results serve as a framework for further refinement of the curriculum and system design to enhance usability. As the project evolves, it is moving into the enhancement phase of the learning system’s development, with a focus on further advancing the context-driven AI pedagogical agent.
-
Locus is a NIME designed specifically for an interactive, immersive, high-density loudspeaker array environment. The system is based on a pointing mechanism for interacting with a sound scene comprising 128 speakers. Users can point anywhere to interact with the system; because the spatial interaction relies on motion capture, no screen is required. Instead, the system is controlled entirely via hand gestures using a glove populated with motion-tracking markers. The main purpose of this system is to offer intuitive physical interaction with perimeter-based spatial sound sources. A further goal is to minimize user-worn technology, and thereby enhance freedom of motion, by relying on environmental sensing devices such as motion capture cameras or infrared sensors. The resulting creativity-enabling technology is applicable to a broad array of scenarios, from researching the limits of human spatial hearing perception to facilitating learning and artistic performances, including dance. Below we describe our NIME design and implementation, its preliminary assessment, and a Unity-based toolkit to facilitate its broader deployment and adoption.