
This content will become publicly available on June 30, 2023

Title: A Practical Model for Realistic Butterfly Flight Simulation
Butterflies are not only ubiquitous around the world but are also widely admired for their elegant and peculiar flight. However, realistically modeling and simulating butterfly flight, in particular for real-time graphics and animation applications, remains an under-explored problem. In this article, we propose an efficient and practical model to simulate butterfly flight. We first model a butterfly with parametric maneuvering functions, including wing-abdomen interaction. Then, we simulate dynamic maneuvering control of the butterfly through our force-based model, which includes both the aerodynamic force and the vortex force. Through extensive simulation experiments and comparisons, we demonstrate that our method can efficiently simulate realistic butterfly flight motions in various real-world settings.
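The force-based control described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual model; the mass, coefficients, and functional forms below are all assumptions, chosen only to show how an aerodynamic term and a vortex term might be summed with gravity and integrated each time step:

```python
import numpy as np

# Hypothetical constants (not from the paper)
MASS = 0.0005        # butterfly mass in kg (assumed)
GRAVITY = np.array([0.0, 0.0, -9.81])

def aerodynamic_force(velocity, wing_area=0.003, rho=1.2, c_d=1.3):
    """Quasi-steady drag-like force opposing motion (simplified stand-in)."""
    speed = np.linalg.norm(velocity)
    if speed == 0.0:
        return np.zeros(3)
    return -0.5 * rho * wing_area * c_d * speed * velocity

def vortex_force(flap_phase, strength=0.008):
    """Toy lift pulse from a leading-edge vortex, peaking mid-downstroke."""
    lift = strength * max(0.0, np.sin(flap_phase)) ** 2
    return np.array([0.0, 0.0, lift])

def step(position, velocity, flap_phase, dt=1e-3):
    """Semi-implicit Euler integration of the summed forces."""
    force = aerodynamic_force(velocity) + vortex_force(flap_phase) + MASS * GRAVITY
    velocity = velocity + dt * force / MASS
    position = position + dt * velocity
    return position, velocity
```

Driving `flap_phase` with the parametric maneuvering functions would then close the loop between wing kinematics and body motion.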
Authors:
Award ID(s): 2005430
Publication Date:
NSF-PAR ID: 10359111
Journal Name: ACM Transactions on Graphics
Volume: 41
Issue: 3
Page Range or eLocation-ID: 1 to 12
ISSN: 0730-0301
Sponsoring Org: National Science Foundation
More Like this
  1. In digital agriculture, large-scale data acquisition and analysis can improve farm management by allowing growers to constantly monitor the state of a field. Deploying large autonomous robot teams to navigate and monitor cluttered environments, however, is difficult and costly. Here, we present methods that would allow us to leverage managed colonies of honey bees equipped with miniature flight recorders to monitor orchard pollination activity. Tracking honey bee flights can inform estimates of crop pollination, allowing growers to improve yield and resource allocation. Honey bees are adept at maneuvering complex environments and collectively pool information about nectar and pollen sources through thousands of daily flights. Additionally, colonies are present in orchards before and during bloom for many crops, as growers often rent hives to ensure successful pollination. We characterize existing Angle-Sensitive Pixels (ASPs) for use in flight recorders and calculate memory and resolution trade-offs. We further integrate ASP data into a colony foraging simulator and show how large numbers of flights refine system accuracy, using methods from the robotic mapping literature. Our results indicate promising potential for such agricultural monitoring, where we leverage the ability of social insects to sense the physical world, while providing data acquisition on par with explicitly engineered systems.
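The memory and resolution trade-off mentioned above comes down to back-of-the-envelope arithmetic; the flight duration, sample rate, and bit depth below are illustrative assumptions, not values from the study:

```python
def recorder_memory_bytes(flight_seconds, sample_hz, bits_per_sample):
    """Bytes needed to log one flight's angle samples (illustrative only)."""
    total_bits = flight_seconds * sample_hz * bits_per_sample
    return total_bits // 8

# e.g. a 20-minute foraging flight sampled at 10 Hz with 8-bit angle codes
memory_needed = recorder_memory_bytes(20 * 60, 10, 8)  # 12000 bytes
```

Halving the angular bit depth halves the memory footprint at the cost of coarser heading estimates, which is exactly the trade the flight-recorder design must navigate.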
  2. State-of-the-art design and testing of avionics for unmanned aircraft is an iterative process that involves many test flights, interleaved with multiple revisions of the flight management software and hardware. To significantly reduce flight-test time and software development costs, we have developed a real-time UAV Emulation Environment (uavEE) using ROS that interfaces with high-fidelity simulators to simulate the flight behavior of the aircraft. Our uavEE emulates the avionics hardware by interfacing directly with the embedded hardware used in real flight. The modularity of uavEE allows the integration of countless test scenarios and applications. Furthermore, we present an accurate data-driven approach for modeling the propulsion power of fixed-wing UAVs, which is integrated into uavEE. Finally, uavEE and the proposed UAV power model have been experimentally validated using a fixed-wing UAV testbed.
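A data-driven propulsion-power model of the kind mentioned above could, under strong simplifying assumptions, be a polynomial fit of logged power against airspeed; the sample values below are invented for illustration and are not from the uavEE experiments:

```python
import numpy as np

# Hypothetical flight-log samples: airspeed (m/s) vs. measured power (W).
airspeed = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
power = np.array([45.0, 52.0, 63.0, 78.0, 97.0])

# Fit P(v) = a*v^3 + b*v^2 + c*v + d, a common fixed-wing power form,
# by least squares over the logged samples.
coeffs = np.polyfit(airspeed, power, 3)
predict = np.poly1d(coeffs)
```

Once fitted from real logs, such a model lets the emulator report energy use for any simulated trajectory without flying it.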
  3. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
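As a toy illustration of decoding finger-wise forces from multi-channel EMG features, the following substitutes a linear ridge-regression decoder for the paper's neural network; the channel count, finger count, and synthetic training data are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 8-channel EMG feature windows -> 5 finger forces.
X = rng.normal(size=(200, 8))            # EMG envelope features per window
W_true = rng.normal(size=(8, 5))         # unknown ground-truth mapping
Y = X @ W_true + 0.01 * rng.normal(size=(200, 5))  # noisy force labels

# Closed-form ridge regression (stand-in for the learned neural decoder).
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(8), X.T @ Y)

def decode_forces(emg_window):
    """Map one EMG feature window to per-finger force estimates."""
    return emg_window @ W
```

A real decoder would replace the linear map with the paper's network and add per-user calibration, but the input/output shape of the problem is the same.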
  4. Telecystoscopy can lower the barrier to access critical urologic diagnostics for patients around the world. A major challenge for robotic control of flexible cystoscopes and intuitive teleoperation is the pose estimation of the scope tip. We propose a novel real-time camera localization method that uses video recordings from a prior cystoscopy and 3D bladder reconstruction to estimate cystoscope pose within the bladder during follow-up telecystoscopy. We map prior video frames into a low-dimensional space as a dictionary, so that a new image can be likewise mapped to efficiently retrieve its nearest neighbor among the dictionary images. The cystoscope pose is then estimated from the correspondence among the new image, its nearest dictionary image, and the prior model from 3D reconstruction. We demonstrate the performance of our methods using bladder phantoms of varying fidelity and a servo-controlled cystoscope to simulate the use case of bladder surveillance through telecystoscopy. The servo-controlled cystoscope with 3 degrees of freedom (angulation, roll, and insertion axes) was developed for collecting cystoscope videos from bladder phantoms. Cystoscope videos were acquired in a 2.5D bladder phantom (bladder-shaped cross-section plus height) with a panorama of a urothelium attached to the inner surface. Scans of the 2.5D phantom were performed in separate arc trajectories, each of which is generated by actuation on the angulation with a fixed roll and insertion length. We further included variation in moving speed, imaging distance, and the existence of bladder tumors. Cystoscope videos were also acquired in a water-filled 3D silicone bladder phantom with hand-painted vasculature. Scans of the 3D phantom were performed in separate circle trajectories, each of which is generated by actuation on the roll axis under a fixed angulation and insertion length.
These videos were used to create 3D reconstructions, dictionary sets, and test data sets for evaluating the computational efficiency and accuracy of our proposed method in comparison with a method based on global Scale-Invariant Feature Transform (SIFT) features, named SIFT-only. Our method can retrieve the nearest dictionary image for 94–100% of test frames in under 55 ms per image, whereas the SIFT-only method can only find the image match for 56–100% of test frames in 6000–40000 ms per image, depending on the size of the dictionary set and the richness of SIFT features in the images. Our method, with a speed of around 20 Hz for the retrieval stage, is a promising tool for real-time image-based scope localization in robotic cystoscopy when prior cystoscopy images are available.
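The dictionary-retrieval idea above (project frames into a low-dimensional space, then find the nearest neighbor) can be sketched as follows; the PCA-style projection and the dimensions are assumptions for illustration, not the authors' actual mapping:

```python
import numpy as np

def build_dictionary(frames, basis_dim=8):
    """Project prior frames into a low-dimensional space via an SVD basis."""
    data = np.stack([f.ravel().astype(float) for f in frames])
    mean = data.mean(axis=0)
    # SVD of the centered data gives the principal directions.
    _, _, vt = np.linalg.svd(data - mean, full_matrices=False)
    basis = vt[:basis_dim]
    codes = (data - mean) @ basis.T    # low-dimensional dictionary codes
    return mean, basis, codes

def nearest_frame(query, mean, basis, codes):
    """Return the index of the dictionary frame closest to the query image."""
    q = (query.ravel().astype(float) - mean) @ basis.T
    return int(np.argmin(np.linalg.norm(codes - q, axis=1)))
```

Because each query reduces to a small matrix product and one distance scan over short codes, retrieval stays fast even when matching full frames would be costly, which is the property the 20 Hz retrieval rate relies on.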
  5. The annual migration of monarch butterflies, Danaus plexippus, from their summer breeding grounds in North America to their overwintering sites in Mexico can span over 4000 kilometers. Little is known about the aerodynamic mechanism behind this extended flight. This study is motivated by the hypothesis that their flapping-wing flight is enhanced by fluid-structure interactions. The objective of this study is to quantify the aeroelastic performance of monarch butterfly wings and apply those values in the creation of an artificial wing, with the end goal of creating a biomimetic micro-air vehicle. A micro-CT scan, force-deflection measurements, and a finite element solver were used on real monarch butterfly wings to determine their density and elastic modulus. These structural parameters were then used to create a monarch-butterfly-inspired artificial wing. A solidification process was used to adhere 3D-printed vein structures to a membrane. The performance of the artificial butterfly wing was tested by measuring the lift at flapping frequencies between 6.3 and 14 Hz. Our results show that the elastic modulus of a real wing is 1.8 GPa along the span and 0.20 GPa along the chord, suggesting that the butterfly wing material is highly anisotropic. Real right forewings performed optimally at approximately 10 Hz, the flapping frequency of a live monarch butterfly, with a peak force of 4 mN. The artificial wing performed optimally at approximately 8 Hz, with a peak force of 5 mN.