Search for: All records

Award ID contains: 1852155


  1. A challenge in understanding locomotion in complex three-dimensional terrain with large obstacles is creating tools for controlled, systematic experiments. Recent terrain arenas allow observations at small spatiotemporal scales (∼10 body lengths or cycles). Here, we created a terrain treadmill to enable high-resolution observation of animal locomotion through large obstacles over large spatiotemporal scales. An animal moves through modular obstacles on an inner sphere, while a rigidly attached, concentric, transparent outer sphere rotates with the opposite velocity via closed-loop feedback to keep the animal on top. During sustained locomotion, a discoid cockroach moved through pillar obstacles for up to 25 min (2500 cycles) over 67 m (1500 body lengths). Over 12 trials totaling ∼1 h, the animal was maintained within a radius of 1 body length (4.5 cm) on top of the sphere 90% of the time. This high-resolution observation enables the study of diverse locomotor behaviors and quantification of animal–obstacle interaction.
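    A minimal sketch of the closed-loop idea described above (counter-rotating the outer sphere to hold the animal at the apex), assuming a simple proportional controller; the sphere radius, gain, and small-angle mapping below are illustrative assumptions, not the controller reported in the study.

      import numpy as np

      SPHERE_RADIUS_M = 0.20   # assumed outer-sphere radius
      KP = 2.0                 # assumed proportional gain on the angular error

      def sphere_command(animal_xy):
          """Return an angular-velocity command (rad/s about the x and y axes)
          that counter-rotates the sphere to drive the tracked animal back
          toward the apex.

          animal_xy : (x, y) position of the animal relative to the apex, in
                      metres, e.g. from overhead camera tracking (hypothetical).
          """
          x, y = animal_xy
          # Small-angle approximation: surface displacement maps to an angular
          # error about the orthogonal horizontal axis.
          err_about_x = y / SPHERE_RADIUS_M
          err_about_y = -x / SPHERE_RADIUS_M
          # Spin the sphere opposite to the animal's drift.
          return np.array([KP * err_about_x, KP * err_about_y])

      # Example: the animal has drifted 2 cm forward of the apex.
      print(sphere_command((0.02, 0.0)))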
  2. Current commercially available robotic minimally invasive surgery (RMIS) platforms provide no haptic feedback of tool interactions with the surgical environment. As a consequence, novice robotic surgeons must rely exclusively on visual feedback to sense their physical interactions with the surgical environment. This technical limitation can make it challenging and time-consuming to train novice surgeons to proficiency in RMIS. Extensive prior research has demonstrated that incorporating haptic feedback is effective at improving surgical training task performance. However, few studies have investigated the utility of providing multiple modalities of haptic feedback simultaneously (multi-modality haptic feedback) in this context, and these studies have presented mixed results regarding its efficacy. Furthermore, the inability to generalize and compare these mixed results has limited our ability to understand why they can vary significantly between studies. Therefore, we have developed a generalized, modular multi-modality haptic feedback and data acquisition framework leveraging the real-time data acquisition and streaming capabilities of the Robot Operating System (ROS). In our preliminary study using this system, participants completed a peg transfer task using a da Vinci robot while receiving haptic feedback of applied forces, contact accelerations, or both via custom wrist-worn haptic devices. Results highlight the capability of our system in running systematic comparisons between various single and dual-modality haptic feedback approaches.
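    A minimal sketch of how such a ROS relay node might look, assuming hypothetical topic names, message types, and gains (the abstract does not specify the framework's actual interfaces): streamed applied-force and contact-acceleration data are mapped to commands for a wrist-worn haptic device, and each modality can be enabled independently.

      #!/usr/bin/env python
      # Hypothetical multi-modality haptic relay; topic names, gains, and the
      # wrist-device command interface are assumptions for illustration only.
      import rospy
      from geometry_msgs.msg import WrenchStamped, Vector3Stamped
      from std_msgs.msg import Float32

      class HapticRelay:
          def __init__(self):
              # One output per feedback modality (force squeeze, vibrotactile).
              self.force_cmd = rospy.Publisher('/wrist_device/force_cmd', Float32, queue_size=1)
              self.vib_cmd = rospy.Publisher('/wrist_device/vibration_cmd', Float32, queue_size=1)
              rospy.Subscriber('/tool/applied_force', WrenchStamped, self.on_force)
              rospy.Subscriber('/tool/contact_accel', Vector3Stamped, self.on_accel)

          def on_force(self, msg):
              # Map tool-tissue force magnitude to a squeeze command (gain assumed).
              f = (msg.wrench.force.x**2 + msg.wrench.force.y**2 + msg.wrench.force.z**2) ** 0.5
              self.force_cmd.publish(Float32(0.5 * f))

          def on_accel(self, msg):
              # Map contact-acceleration magnitude to a vibrotactile drive level.
              a = (msg.vector.x**2 + msg.vector.y**2 + msg.vector.z**2) ** 0.5
              self.vib_cmd.publish(Float32(min(1.0, a / 50.0)))

      if __name__ == '__main__':
          rospy.init_node('haptic_relay')
          HapticRelay()
          rospy.spin()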
  3. Spectral unmixing techniques for photoacoustic images are often used to isolate signal origins (e.g., blood, contrast agents, lipids). However, these techniques often require many (e.g., 12–59) wavelength transmissions for optimal performance to exploit the optical properties of different biological chromophores. Analysis of the acoustic frequency response of photoacoustic signals has the potential to provide additional discrimination of photoacoustic signals from different materials, with the added benefit of potentially requiring only a few optical wavelength emissions. This study presents our initial results testing this hypothesis in a phantom experiment, given the task of differentiating photoacoustic signals from deoxygenated hemoglobin (Hb) and methylene blue (MB). Coherence-based beamforming, principal component analysis, and nearest neighbor classification were employed to determine ground-truth labels, perform feature extraction, and classify image contents, respectively. The mean ± one standard deviation of classification accuracy increased from 0.65 ± 0.16 to 0.88 ± 0.17 when the number of wavelength emissions increased from one to two. When using an optimal laser wavelength pair of 710–870 nm, the sensitivity and specificity of detecting MB over Hb were both 1.00. Results are highly promising for the differentiation of photoacoustic-sensitive materials, with performance comparable to that achieved with more conventional multispectral laser wavelength approaches.
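    The feature-extraction and classification steps named above (principal component analysis followed by nearest-neighbor classification) can be sketched as follows; the synthetic spectra, component count, and neighbor count are placeholders, not the study's data or parameters.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier

      # Synthetic stand-in data: each row is a per-region acoustic frequency
      # spectrum of a beamformed photoacoustic signal; labels 0 = Hb, 1 = MB.
      rng = np.random.default_rng(0)
      n_per_class, n_freq_bins = 200, 128
      hb = rng.normal(0.0, 1.0, (n_per_class, n_freq_bins))
      mb = rng.normal(0.3, 1.0, (n_per_class, n_freq_bins))
      X = np.vstack([hb, mb])
      y = np.array([0] * n_per_class + [1] * n_per_class)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

      # PCA for feature extraction, then a nearest-neighbor classifier.
      pca = PCA(n_components=10).fit(X_train)
      clf = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_train), y_train)
      print("held-out accuracy:", clf.score(pca.transform(X_test), y_test))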
  4.
  5. Minimally invasive surgeries often require complicated maneuvers and delicate hand–eye coordination and ideally would incorporate “x-ray vision” to see beyond tool tips and underneath tissues prior to making incisions. Photoacoustic imaging has the potential to offer this feature but not with ionizing x-rays. Instead, optical fibers and acoustic receivers enable photoacoustic sensing of major structures—such as blood vessels and nerves—that are otherwise hidden from view. This imaging process is initiated by transmitting laser pulses that illuminate regions of interest, causing thermal expansion and the generation of sound waves that are detectable with conventional ultrasound transducers. The recorded signals are then converted to images through the beamforming process. Photoacoustic imaging may be implemented to both target and avoid blood-rich surgical contents (and in some cases simultaneously or independently visualize optical fiber tips or metallic surgical tool tips) in order to prevent accidental injury and assist device operators during minimally invasive surgeries and interventional procedures. Novel light delivery systems, counterintuitive findings, and robotic integration methods introduced by the Photoacoustic & Ultrasonic Systems Engineering Lab are summarized in this invited Perspective, setting the foundation and rationale for the subsequent discussion of the author’s views on possible future directions for this exciting frontier known as photoacoustic-guided surgery. 
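    The beamforming step mentioned above can be illustrated with a generic delay-and-sum reconstruction; this is a conventional textbook formulation rather than the lab's coherence-based beamformer, and the array geometry and data below are invented for the example.

      import numpy as np

      def das_beamform(channel_data, element_x, fs, c, image_x, image_z):
          """Delay-and-sum reconstruction of one photoacoustic frame.

          channel_data : (n_elements, n_samples) received waveforms
          element_x    : (n_elements,) lateral element positions [m]
          fs           : sampling frequency [Hz]
          c            : speed of sound [m/s]
          image_x/z    : 1-D grids of lateral/axial pixel positions [m]
          """
          n_elem, n_samp = channel_data.shape
          image = np.zeros((len(image_z), len(image_x)))
          for iz, z in enumerate(image_z):
              for ix, x in enumerate(image_x):
                  # One-way time of flight from pixel to each element
                  # (photoacoustic sources emit directly, so no transmit delay).
                  dist = np.sqrt((element_x - x) ** 2 + z ** 2)
                  idx = np.round(dist / c * fs).astype(int)
                  valid = idx < n_samp
                  image[iz, ix] = channel_data[np.where(valid)[0], idx[valid]].sum()
          return image

      # Example with synthetic data: 64 elements, 0.3 mm pitch, 40 MHz sampling.
      fs, c = 40e6, 1540.0
      element_x = (np.arange(64) - 31.5) * 0.3e-3
      data = np.random.randn(64, 2048)
      img = das_beamform(data, element_x, fs, c,
                         image_x=np.linspace(-5e-3, 5e-3, 50),
                         image_z=np.linspace(5e-3, 25e-3, 100))
      print(img.shape)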
  6.