Abstract Sensory input across modalities is highly dynamic, continuously confronting the brain with the task of making sense of the external world. Olfaction is a key sense that many species depend on for survival, for example to locate food sources and mating partners or to avoid encountering predators. In the absence of visual cues, olfactory cues are especially useful, as they provide information over a large range of distances. Natural odours form temporally complex plumes that show rapid fluctuations in odour concentration, carrying information about the location of an odour source. This review focuses primarily on how mammals use this spatial information from olfactory cues to navigate their environment. I highlight progress made on the physical description of dynamically fluctuating odours and on behavioural paradigms for investigating odour-guided navigation, and review initial findings on the underlying neural mechanisms that allow mammals to extract spatial information from the dynamic odour landscape.
This content will become publicly available on January 10, 2026
Quantifying spectral information about source separation in multisource odour plumes
Odours released by objects in natural environments can contain information about their spatial locations. In particular, the correlation of odour concentration timeseries produced by two spatially separated sources contains information about the distance between the sources. For example, mice are able to distinguish correlated and anti-correlated odour fluctuations at frequencies up to 40 Hz, while insect olfactory receptor neurons can resolve fluctuations exceeding 100 Hz. Can this high-frequency acuity support odour source localization? Here we answer this question by quantifying the spatial information about source separation contained in the spectral constituents of correlations. We used computational fluid dynamics simulations of multisource plumes in two-dimensional chaotic flow environments to generate temporally complex, covarying odour concentration fields. By relating the correlation of these fields to the spectral decompositions of the associated odour concentration timeseries, and making simplifying assumptions about the statistics of these decompositions, we derived analytic expressions for the Fisher information about source separation contained in the spectral components of the correlations. We computed the Fisher information for a broad range of frequencies and source separations for three different source arrangements and found that high frequencies were more informative than low frequencies when sources were close relative to the sizes of the large eddies in the flow. We observed a qualitatively similar effect in an independent set of simulations with different geometry, but not for surrogate data with a similar power spectrum to our simulations but in which all frequencies were a priori equally informative. Our work suggests that the high-frequency acuity of olfactory systems may support high-resolution spatial localization of odour sources.
We also provide a model of the distribution of the spectral components of correlations that is accurate over a broad range of frequencies and source separations. More broadly, our work establishes an approach for the quantification of the spatial information in odour concentration timeseries.
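The abstract's central quantity, the correlation of two odour concentration timeseries decomposed into spectral components, can be illustrated with a short sketch. This is not the paper's analysis pipeline, only a minimal demonstration (on synthetic data, with arbitrary noise levels) that the time-domain covariance of two signals equals the sum over frequencies of the real part of their cross-spectrum, which is the decomposition the Fisher information calculation operates on.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
# Two synthetic odour concentration timeseries sharing a common
# fluctuation plus independent noise -- an illustrative stand-in for
# the CFD-generated concentration fields described in the abstract.
common = rng.standard_normal(n)
a = common + 0.5 * rng.standard_normal(n)
b = common + 0.5 * rng.standard_normal(n)
a -= a.mean()
b -= b.mean()

# Time-domain covariance of the two timeseries.
cov_time = np.dot(a, b) / n

# Spectral decomposition: by the discrete Parseval relation, the
# covariance is the sum over frequencies of the real part of the
# cross-spectrum A(f) * conj(B(f)).
A, B = np.fft.fft(a), np.fft.fft(b)
per_frequency = (A * np.conj(B)).real / n**2
cov_spec = per_frequency.sum()

print(np.allclose(cov_time, cov_spec))  # prints True
```

Each entry of `per_frequency` is one spectral constituent of the correlation; the paper's approach asks how much information each such constituent carries about source separation.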
- Award ID(s): 2014217
- PAR ID: 10591776
- Editor(s): Gurka, Roi
- Publisher / Repository: Public Library of Science
- Date Published:
- Journal Name: PLOS ONE
- Volume: 20
- Issue: 1
- ISSN: 1932-6203
- Page Range / eLocation ID: e0297754
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract The sense of smell is an essential modality for many species, in particular nocturnal and crepuscular mammals, to gather information about their environment. Olfactory cues provide information over a large range of distances, allowing behaviours ranging from simple detection and recognition of objects, to tracking trails and navigating using odour plumes from afar. In this review, we discuss the features of the natural olfactory environment and provide a brief overview of how odour information can be sampled and might be represented and processed by the mammalian olfactory system. Finally, we discuss recent behavioural approaches that address how mammals extract spatial information from the environment in three different contexts: odour trail tracking, odour plume tracking and, more generally, olfactory-guided navigation. Recent technological developments have seen the spatiotemporal aspect of mammalian olfaction gain significant attention, and we discuss both the promising aspects of rapidly developing paradigms and stimulus control technologies as well as their limitations. We conclude that, while still in its beginnings, research on the odour environment offers an entry point into understanding the mechanisms by which mammals extract information about space.
-
Flying insects exhibit remarkable capabilities in coordinating their olfactory sensory system and flapping wings during odour plume-tracking flights. While observations have indicated that their flapping wing motion can 'sniff' up the incoming plumes for a better odour sampling range, how flapping motion impacts the odour concentration field around the antennae is unknown. Here, we reconstruct the body and wing kinematics of a forwards-flying butterfly based on high-speed images. Using an in-house computational fluid dynamics solver, we simulate the unsteady flow field and odourant transport process by solving the Navier–Stokes and odourant advection-diffusion equations. Our results show that, during flapping flight, the interaction between wing leading-edge vortices and antenna vortices strengthens the circulation of antenna vortices by over two-fold compared with cases without flapping motion, leading to a significant increase in odour intensity fluctuation along the antennae. Specifically, the interaction between the wings and antennae amplifies odour intensity fluctuations on the antennae by up to 8.4-fold. This enhancement is critical in preventing odour fatigue during odour-tracking flights. Further analysis reveals that this interaction is influenced by the inter-antennal angle. Adjusting this angle allows insects to balance between resistance to odour fatigue and the breadth of odour sampling. Narrower inter-antennal angles enhance fatigue resistance, while wider angles extend the sampling range but reduce resistance. Additionally, our findings suggest that while the flexibility of the wings and the thorax's pitching motion in butterflies do influence odour fluctuation, their impact is relatively secondary to that of the wing–antenna interaction.
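For reference, the odourant advection-diffusion equation mentioned in this abstract is, in its standard form (the paper's exact formulation, including source terms and boundary conditions, may differ),

```latex
\frac{\partial c}{\partial t} + \mathbf{u}\cdot\nabla c = D\,\nabla^{2} c,
```

where $c$ is the odourant concentration, $D$ its diffusivity, and $\mathbf{u}$ the unsteady velocity field obtained from the incompressible Navier–Stokes equations,

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0,
```

so the concentration field is passively advected by the flapping-induced flow while diffusing.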
-
As augmented and virtual reality (AR/VR) technology matures, a method is desired to represent real-world persons visually and aurally in a virtual scene with high fidelity to craft an immersive and realistic user experience. Current technologies leverage camera and depth sensors to render visual representations of subjects through avatars, and microphone arrays are employed to localize and separate high-quality subject audio through beamforming. However, challenges remain in both realms. In the visual domain, avatars can only map key features (e.g., pose, expression) to a predetermined model, rendering them incapable of capturing the subjects’ full details. Alternatively, high-resolution point clouds can be utilized to represent human subjects. However, such three-dimensional data is computationally expensive to process. In the realm of audio, sound source separation requires prior knowledge of the subjects’ locations. However, it may take unacceptably long for sound source localization algorithms to provide this knowledge, which can still be error-prone, especially with moving objects. These challenges make it difficult for AR systems to produce real-time, high-fidelity representations of human subjects for applications such as AR/VR conferencing that mandate negligible system latency. We present Acuity, a real-time system capable of creating high-fidelity representations of human subjects in a virtual scene both visually and aurally. Acuity isolates subjects from high-resolution input point clouds. It reduces the processing overhead by performing background subtraction at a coarse resolution, then applying the detected bounding boxes to fine-grained point clouds. Meanwhile, Acuity leverages an audiovisual sensor fusion approach to expedite sound source separation. The estimated object location in the visual domain guides the acoustic pipeline to isolate the subjects’ voices without running sound source localization. 
Our results demonstrate that Acuity can isolate multiple subjects' high-quality point clouds with a maximum latency of 70 ms and average throughput of over 25 fps, while separating audio in less than 30 ms. We provide the source code of Acuity at: https://github.com/nesl/Acuity
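The coarse-to-fine strategy this abstract describes (background subtraction at a coarse resolution, then applying the detected bounding boxes to the fine-grained cloud) can be sketched as follows. This is a simplified illustration, not Acuity's implementation; the function name, voxel size, and margin are hypothetical.

```python
import numpy as np

def isolate_subject(points_fine, background, voxel=0.2, margin=0.1):
    """Illustrative two-stage isolation (hypothetical API, not Acuity's).

    Stage 1 does background subtraction on cheap coarse voxels; stage 2
    applies the resulting bounding box to the full-resolution cloud.
    """
    # Stage 1: coarse voxel occupancy of the current frame vs. background.
    occ = set(map(tuple, np.floor(points_fine / voxel).astype(int)))
    bg = set(map(tuple, np.floor(background / voxel).astype(int)))
    fg_voxels = np.array(sorted(occ - bg))  # voxels absent from background
    if fg_voxels.size == 0:
        return np.empty((0, 3))
    # Stage 2: axis-aligned bounding box of the foreground voxels,
    # used as a cheap crop of the fine-grained point cloud.
    lo = fg_voxels.min(axis=0) * voxel - margin
    hi = (fg_voxels.max(axis=0) + 1) * voxel + margin
    mask = np.all((points_fine >= lo) & (points_fine <= hi), axis=1)
    return points_fine[mask]
```

The design point the abstract makes is visible here: the expensive per-point work (the final mask) is a single vectorized crop, while the set operations run over far fewer coarse voxels than raw points.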
-
Macroeconomists increasingly use external sources of exogenous variation for causal inference. However, unless such external instruments (proxies) capture the underlying shock without measurement error, existing methods are silent on the importance of that shock for macroeconomic fluctuations. We show that, in a general moving average model with external instruments, variance decompositions for the instrumented shock are interval-identified, with informative bounds. Various additional restrictions guarantee point identification of both variance and historical decompositions. Unlike SVAR analysis, our methods do not require invertibility. Applied to U.S. data, they give a tight upper bound on the importance of monetary shocks for inflation dynamics.
