

Title: A Practical Method for Butterfly Motion Capture
Simulating realistic butterfly motion has long been recognized as a challenging problem in computer animation. Arguably, one of the main reasons is the difficulty of acquiring accurate flight motion of butterflies. In this paper we propose a practical yet effective, optical marker-based approach to capture and process the detailed motion of a flying butterfly. Specifically, we first capture the trajectories of the wings and thorax of a flying butterfly using optical marker-based motion tracking. After that, our method automatically fills in the positions of missing markers by exploiting the continuity and relevance of neighboring frames, and improves the quality of the captured motion via noise filtering with optimized parameter settings. Through comparisons with existing motion processing methods, we demonstrate the effectiveness of our approach in obtaining accurate flight motions of butterflies. Furthermore, we created and will release a first-of-its-kind butterfly motion capture dataset to the research community.
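The record itself contains no code, but the two processing steps described in the abstract can be sketched informally. The snippet below fills marker dropouts by interpolating across neighboring frames and then applies a zero-phase low-pass filter; the cubic-spline interpolant, the 12 Hz cutoff, the filter order, and the 240 Hz capture rate are illustrative assumptions, not the settings reported in the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import butter, filtfilt

def fill_gaps(traj, times):
    """Fill missing (NaN) marker positions by interpolating across
    the surrounding frames. traj has shape (T, 3)."""
    filled = traj.copy()
    valid = ~np.isnan(traj).any(axis=1)
    for axis in range(3):
        spline = CubicSpline(times[valid], traj[valid, axis])
        filled[~valid, axis] = spline(times[~valid])
    return filled

def smooth(traj, fs, cutoff_hz=12.0, order=4):
    """Zero-phase low-pass filtering; cutoff and order are placeholders."""
    b, a = butter(order, cutoff_hz / (0.5 * fs))
    return filtfilt(b, a, traj, axis=0)

# Toy example: one marker captured at 240 Hz with a short dropout.
fs = 240.0
t = np.arange(0.0, 1.0, 1.0 / fs)
marker = np.stack([np.sin(2 * np.pi * 8 * t),
                   np.cos(2 * np.pi * 8 * t),
                   0.1 * t], axis=1)
marker[100:110] = np.nan            # simulated missing frames
clean = smooth(fill_gaps(marker, t), fs)
```

Zero-phase filtering (filtfilt) is used here so that smoothing does not shift wingbeat timing; the paper's optimized parameter settings would replace the placeholder cutoff and order.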
Award ID(s):
2005430
NSF-PAR ID:
10463604
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the ACM SIGGRAPH Conference on Motion, Interaction and Games 2022
Page Range / eLocation ID:
1 to 9
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The long-range migration of monarch butterflies, extending over 4000 km, is not well understood. Monarchs experience varying air-density conditions during migration, reaching altitudes as high as 3000 m, where the air density is much lower than at sea level. In this study, we test the hypothesis that the aerodynamic performance of monarchs improves under reduced-density conditions by considering the fluid–structure interaction of chordwise flexible wings. A well-validated, fully coupled Navier–Stokes/structural dynamics solver was used to illustrate the interplay between wing motion, aerodynamics, and structural flexibility in forward flight. The wing density and elastic modulus were measured from real monarch wings and prescribed as inputs to the aeroelastic framework. Our results show that sufficient lift is generated to offset the butterfly weight at higher altitudes, aided by the wake-capture mechanism, a nonlinear wing–wake interaction commonly seen in hovering animals. The mean total power, defined as the sum of the aerodynamic and inertial power, decreased by 36% from sea level to the condition at 3000 m. Decreasing power with altitude, while maintaining the same equilibrium lift, suggests that the butterflies generate lift more efficiently at higher altitudes.
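The altitude figure can be made concrete with a standard-atmosphere estimate. The short sketch below is not part of the paper's Navier–Stokes/structural solver; it only shows how much the air density, and hence quasi-steady lift at fixed kinematics, drops at 3000 m.

```python
def isa_density(h_m, rho0=1.225, T0=288.15, L=0.0065, g=9.80665, R=287.058):
    """Tropospheric air density from the International Standard Atmosphere."""
    T = T0 - L * h_m
    return rho0 * (T / T0) ** (g / (R * L) - 1.0)

rho_sl, rho_3k = isa_density(0.0), isa_density(3000.0)
# Quasi-steady lift scales linearly with density (L = 0.5 * rho * V^2 * S * CL),
# so at fixed kinematics lift at 3000 m is roughly 74% of its sea-level value.
print(f"density ratio at 3000 m: {rho_3k / rho_sl:.3f}")   # ~0.742
```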
  2. Full-body motion capture is essential for the study of body movement. Video-based, markerless mocap systems are, in some cases, replacing marker-based systems, but hybrid systems are less explored. We develop methods for coregistration between 2D video and 3D marker positions when precise spatial relationships are not known a priori. We illustrate these methods on three-ball cascade juggling, in which it was not possible to use marker-based tracking of the balls, and no tracking of the hands was possible due to occlusion. Using recorded video and motion capture, we aim to transform 2D ball coordinates into 3D body space as well as recover details of hand motion. We propose four linear coregistration methods that differ in how they optimize ball-motion constraints during hold and flight phases, using an initial estimate of hand position based on arm and wrist markers. We found that minimizing the error between ball and hand estimate was globally suboptimal, distorting ball flight trajectories. The best-performing method used gravitational constraints to transform vertical coordinates and ball-hold constraints to transform lateral coordinates. This method enabled an accurate description of ball flight as well as a reconstruction of wrist movements. We discuss these findings in the broader context of video/motion capture coregistration.
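As a rough illustration of the two constraints used by the best-performing method, the sketch below estimates a vertical metres-per-pixel scale from the gravitational constraint during ball flight and a lateral scale and offset from hand-marker positions during hold phases. The function names, the assumption of a per-axis affine (linear) mapping, and the sign convention for the image's vertical axis are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def vertical_scale_from_gravity(t, v_px):
    """Gravitational constraint (illustrative): during free flight the ball's
    true vertical acceleration is -g, so a quadratic fit to its pixel
    v-coordinate yields a metres-per-pixel scale for the vertical axis."""
    c2, _, _ = np.polyfit(t, v_px, 2)   # v_px(t) ~ c2*t^2 + c1*t + c0
    return -G / (2.0 * c2)              # sign depends on whether image v points down

def lateral_map_from_holds(ball_px, hand_m):
    """Ball-hold constraint (illustrative): while the ball is held, its lateral
    pixel coordinate should coincide with the hand estimate, so fit a scale
    and offset by least squares."""
    A = np.column_stack([ball_px, np.ones_like(ball_px)])
    scale, offset = np.linalg.lstsq(A, hand_m, rcond=None)[0]
    return scale, offset
```

Keeping the vertical and lateral mappings separate mirrors the paper's finding that minimizing ball-hand error alone distorts flight trajectories.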
  3. Butterflies are not only ubiquitous around the world but are also widely admired for their elegant and peculiar flights. However, realistically modeling and simulating butterfly flights, in particular for real-time graphics and animation applications, remains an under-explored problem. In this article, we propose an efficient and practical model to simulate butterfly flights. We first model a butterfly with parametric maneuvering functions, including wing-abdomen interaction. Then, we simulate dynamic maneuvering control of the butterfly through our force-based model, which includes both the aerodynamic force and the vortex force. Through many simulation experiments and comparisons, we demonstrate that our method can efficiently simulate realistic butterfly flight motions in various real-world settings.
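The abstract does not spell out the force model, so the sketch below is only a generic, hedged illustration of a force-based update: a quasi-steady aerodynamic force on a wing element plus a separately supplied vortex force drive a semi-implicit Euler step of the thorax. The force formula, coefficients, and integrator are placeholders, not the authors' model.

```python
import numpy as np

RHO = 1.225  # air density at sea level, kg/m^3

def quasi_steady_force(v_wing, area, c_l, c_d, lift_dir, drag_dir):
    """Illustrative quasi-steady aerodynamic force on one wing element."""
    q = 0.5 * RHO * float(np.dot(v_wing, v_wing))   # dynamic pressure term
    return q * area * (c_l * lift_dir + c_d * drag_dir)

def step_body(pos, vel, forces, mass, dt):
    """Semi-implicit Euler update of the thorax under the summed forces
    (e.g. aerodynamic force, vortex force) plus gravity."""
    g = np.array([0.0, -9.81, 0.0])
    acc = g + sum(forces) / mass
    vel = vel + dt * acc
    return pos + dt * vel, vel
```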
  4. Traditional models of motor control typically operate in the domain of continuous signals such as spike rates, forces, and kinematics. However, there is growing evidence that precise spike timings encode significant information that coordinates and causally influences motor control. Some existing neural network models incorporate spike timing precision, but they neither predict motor spikes coordinated across multiple motor units nor capture sensory-driven modulation of agile locomotor control. In this paper, we propose a visual encoder and model of a sensorimotor system based on a recurrent neural network (RNN) that utilizes spike timing encoding during smooth-pursuit target tracking. We use this to predict a nearly complete, spike-resolved motor program of a hawkmoth that requires coordinated millisecond precision across 10 major flight motor units. Each motor unit innervates one muscle and utilizes both rate and timing encoding. Our model includes a motion detection mechanism inspired by the hawkmoth's compound eye, a convolutional encoder that compresses the sensory input, and a simple RNN that is sufficient to sequentially predict wingstroke-to-wingstroke modulation in millisecond-precise spike timings. The two-layer output architecture of the RNN separately predicts the occurrence and timing of each spike in the motor program. The dataset includes spikes recorded from all motor units during tethered flight in which the hawkmoth attends to a moving robotic flower, with a total of roughly 7000 wingstrokes from 16 trials on 5 hawkmoth subjects. Intra-trial and same-subject inter-trial predictions on the test data show that nearly every spike can be predicted within 2 ms of its known timing, while spike occurrence prediction accuracy is about 90%. Overall, our model can predict the precise spike timing of a nearly complete motor program for hawkmoth flight with a precision comparable to that seen in agile flying insects. Such an encoding framework that captures visually modulated precise spike timing codes and coordination can reveal how organisms process visual cues for agile movements. It can also drive the next generation of neuromorphic controllers for navigation in complex environments.
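A hedged PyTorch sketch of the architecture described above is given below: a small convolutional encoder compresses the visual input for each wingstroke, a GRU carries wingstroke-to-wingstroke state, and two linear heads separately output spike-occurrence logits and spike timings. The layer sizes, the cap of five spikes per motor unit per wingstroke, and the use of a GRU are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class SpikePredictor(nn.Module):
    """Sketch of an encoder + RNN with a two-headed output, as described in
    the abstract; all hyperparameters are illustrative assumptions."""

    def __init__(self, n_units=10, max_spikes=5, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(               # compresses one visual frame
            nn.Conv2d(1, 8, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, hidden),
        )
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.occurrence = nn.Linear(hidden, n_units * max_spikes)  # spike/no-spike logits
        self.timing = nn.Linear(hidden, n_units * max_spikes)      # timing within wingstroke

    def forward(self, frames):
        # frames: (batch, wingstrokes, 1, H, W) visual input per wingstroke
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        h, _ = self.rnn(feats)
        return self.occurrence(h), self.timing(h)

# Usage sketch: 16 wingstrokes of 64x64 motion-detector output.
model = SpikePredictor()
occ_logits, timings = model(torch.randn(2, 16, 1, 64, 64))
```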
  5. Kinematic motion analysis is widely used in healthcare, sports medicine, robotics, biomechanics, sports science, and related fields. Motion capture systems are essential for motion analysis. There are three types of motion capture systems: marker-based capture, vision-based capture, and volumetric capture. Marker-based motion capture systems can achieve fairly accurate results, but attaching markers to a body is inconvenient and time-consuming. Vision-based, marker-less motion capture systems are more desirable because of their non-intrusiveness and flexibility. Volumetric capture is a newer and more advanced marker-less motion capture approach that can reconstruct realistic, full-body, animated 3D character models. But volumetric capture has rarely been used for motion analysis because volumetric motion data presents new challenges. We propose a new method for conducting kinematic motion analysis using volumetric capture data. This method consists of a three-stage pipeline. First, the motion is captured by a volumetric capture system. Second, the volumetric capture data is processed using the Iterative Closest Point (ICP) algorithm to generate virtual markers that track the motion. Third, the motion tracking data is imported into the biomechanical analysis tool OpenSim for kinematic motion analysis. Our motion analysis method enables users to apply numerical motion analysis to the skeleton model in OpenSim while also studying the full-body, animated 3D model from different angles. It has the potential to provide more detailed and in-depth motion analysis for areas such as healthcare, sports science, and biomechanics.
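The ICP stage of this pipeline can be illustrated with a minimal sketch that aligns a set of virtual markers to each incoming frame's point cloud using nearest-neighbor correspondences and a Kabsch rigid fit; in practice this would be applied per body segment, and the resulting marker trajectories exported (e.g. as a .trc file) for OpenSim. The function names and the rigid-alignment simplification are assumptions, not the paper's exact implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_fit(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp_track(virtual_markers, frame_points, iters=20):
    """Illustrative ICP step: align the previous frame's virtual markers to the
    new frame's surface points and carry the markers forward."""
    tree = cKDTree(frame_points)
    pts = virtual_markers.copy()
    for _ in range(iters):
        _, idx = tree.query(pts)          # nearest surface point per marker
        R, t = rigid_fit(pts, frame_points[idx])
        pts = pts @ R.T + t
    return pts
```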