

Title: Wavefront profiling via correlation of GLAO open loop telemetry
Adaptive Optics (AO) systems used at ground-based observatories can be strengthened, in both design and algorithms, by a more detailed understanding of the atmosphere they seek to correct. Nowhere is this more true than on Maunakea, where a clearer profile of the atmosphere informs AO system development from the small separations of Extreme AO (ExAO) to wide-field Ground Layer AO (GLAO). Employing telemetry obtained from the 'imaka GLAO demonstrator on the University of Hawaii 2.2-meter telescope, we apply a wind profiling method that identifies turbulent layer velocities through spatio-temporal cross-correlations of multiple wavefront sensors (WFSs). We compare the derived layer velocities with nearby wind anemometer data and with meteorological model predictions of the upper wind speeds, and discuss similarities and differences. The strengths and limitations of this profiling method are evaluated through the successful recovery of simulated layers injected into real telemetry. We detail the profiler's results, including the percentage of data with viable estimates, for four characteristic 'imaka observing runs on open-loop telemetry covering both winter and summer targets. We report on how closely the recovered layers match external measurements, the confidence of these results, and the potential for future use of this technique on other multi-conjugate AO systems.
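The core of such a profiler can be sketched in a few lines: under frozen flow, a layer moving at velocity v displaces the WFS slope pattern by v·dt between frames, so the peak of the spatial cross-correlation of two slope maps tracks the layer's motion. The function below is a minimal single-layer illustration, not the 'imaka pipeline's actual code; the FFT-based circular correlation and all names are assumptions.

```python
import numpy as np

def layer_velocity(slopes_t0, slopes_t1, dt, subap_pitch):
    """Estimate a layer's wind velocity from two WFS slope maps dt seconds apart.

    slopes_t0, slopes_t1 : 2-D slope maps on the subaperture grid
    subap_pitch          : subaperture size projected onto the pupil (m)
    Returns (vy, vx) in m/s.
    """
    # FFT-based circular cross-correlation of the mean-removed slope maps
    f0 = np.fft.fft2(slopes_t0 - slopes_t0.mean())
    f1 = np.fft.fft2(slopes_t1 - slopes_t1.mean())
    xcorr = np.real(np.fft.ifft2(np.conj(f0) * f1))
    # the correlation peak sits at the pattern displacement (in subapertures)
    shape = np.array(xcorr.shape)
    shift = np.array(np.unravel_index(np.argmax(xcorr), xcorr.shape), float)
    shift[shift > shape / 2] -= shape[shift > shape / 2]  # unwrap negative shifts
    return shift * subap_pitch / dt
```

A multi-WFS profiler correlates slope maps from different sensors and delays, so each moving correlation peak corresponds to one layer; the single-pair version above shows only the velocity-recovery step.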
Award ID(s):
1910552
NSF-PAR ID:
10373997
Author(s) / Creator(s):
Editor(s):
Schmidt, Dirk; Schreiber, Laura; Vernet, Elise
Date Published:
Journal Name:
Proc. SPIE 12185, Adaptive Optics Systems VIII
Volume:
12185
Page Range / eLocation ID:
222
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Schmidt, Dirk; Schreiber, Laura; Vernet, Elise (Ed.)
    Early adaptive optics (AO) systems were designed with knowledge of a site's distribution of Fried parameter (r0) and Greenwood time delay (τ0) values. Recent systems have leveraged additional knowledge of the distribution of turbulence with altitude. We present measurements of the atmosphere above Maunakea, Hawaii, and how the temporal properties of the turbulence relate to tomographic reconstructions. We combine archival telemetry collected by 'imaka, a ground-layer AO (GLAO) system on the UH 88-inch telescope, with data from the local weather towers, weather forecasting models, and weather balloon launches, to study how frequently one can map a turbulent layer's wind vector to its altitude. Finally, we present the initial results of designing a new GLAO control system based on these results, an approach we have named "temporal tomography."
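The Greenwood time delay mentioned above is set by the Cn²-weighted 5/3-moment wind speed of the turbulence profile, which is why mapping layer wind vectors to altitude matters for control design. A minimal sketch of the standard relation; the layer values in the test are illustrative:

```python
import numpy as np

def effective_wind(cn2, v):
    """Cn^2-weighted 5/3-moment wind speed over the turbulence profile."""
    cn2, v = np.asarray(cn2, float), np.asarray(v, float)
    return (np.sum(cn2 * v ** (5 / 3)) / np.sum(cn2)) ** (3 / 5)

def greenwood_tau0(r0, cn2, v):
    """Greenwood time delay: tau_0 ~= 0.314 * r0 / v_eff (r0 in m, v in m/s)."""
    return 0.314 * r0 / effective_wind(cn2, v)
```

For a single 10 m/s layer and r0 = 15 cm this gives τ0 of roughly 4.7 ms, i.e., faster winds aloft demand proportionally faster AO loops.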
  2. Abstract

    This study examines the possibility that supercell tornado forecasts could be improved by utilizing the storm-relative helicity (SRH) in the lowest few hundred meters of the atmosphere (instead of much deeper layers). This hypothesis emerges from a growing body of literature linking the near-ground wind profile to the organization of the low-level mesocyclone and thus the probability of tornadogenesis. This study further addresses the ramifications of near-ground SRH to the skill of the significant tornado parameter (STP), which is probably the most commonly used environmental indicator for tornadic thunderstorms. Using a sample of 20 194 severe, right-moving supercells spanning a 13-yr period, sounding-derived parameters were compared using forecast verification metrics, emphasizing a high probability of detection for tornadic supercells while minimizing false alarms. This climatology reveals that the kinematic components of environmental profiles are more skillful at discriminating significantly tornadic supercells from severe, nontornadic supercells than the thermodynamic components. The effective-layer SRH has by far the greatest forecast skill among the components of the STP, as it is currently defined. However, using progressively shallower layers for the SRH calculation leads to increasing forecast skill. Replacing the effective-layer SRH with the 0–500 m AGL SRH in the formulation of STP increases the number of correctly predicted events by 8% and decreases the number of missed events and false alarms by 18%. These results provide promising evidence that forecast parameters can still be improved through increased understanding of the environmental controls on the processes that govern tornado formation.
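The sensitivity to integration depth described above is easy to see in the discrete form of the SRH integral. Below is a minimal sketch, assuming the storm motion is supplied externally (e.g., from a Bunkers-type right-mover estimate); function and variable names are illustrative, not an operational implementation:

```python
import numpy as np

def srh(z, u, v, storm_u, storm_v, depth=500.0):
    """Storm-relative helicity (m^2/s^2) over the 0-depth m AGL layer.

    z       : heights AGL (m), strictly increasing
    u, v    : wind components (m/s) at those heights
    storm_u, storm_v : storm-motion components (m/s)
    """
    # interpolate the profile onto the layer top so the sum stops at `depth`
    mask = z <= depth
    zl = np.append(z[mask], depth)
    ul = np.append(u[mask], np.interp(depth, z, u))
    vl = np.append(v[mask], np.interp(depth, z, v))
    du, dv = ul - storm_u, vl - storm_v
    # discrete SRH: sum over layers of
    # (u_top - c_u)(v_bot - c_v) - (u_bot - c_u)(v_top - c_v)
    return float(np.sum(du[1:] * dv[:-1] - du[:-1] * dv[1:]))
```

Swapping `depth=500.0` for an effective-layer top is the only change needed to reproduce the deeper-layer variants being compared.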

     
  3. Despite recent advances in both coupled fire modeling and measurement techniques to sample the fire environment, the fire–atmosphere coupling mechanisms that lead to fast-propagating wildfires remain poorly understood. This knowledge gap adversely affects fire management when wildland fires propagate unexpectedly rapidly and shift direction due to the fire's impacts on local wind conditions. In this work, we utilized observational data from the FireFlux2 prescribed burn and numerical simulations performed with the coupled fire–atmosphere model WRF-SFIRE to assess the small-scale impacts of fire on local micrometeorology under moderate wind conditions (10–12 m/s). The FireFlux2 prescribed burn provided a comprehensive observational dataset with in situ meteorological observations as well as IR measurements of fire progression. To directly quantify the effects of fire–atmosphere interactions, two WRF-SFIRE simulations were executed. One simulation was run in a two-way coupled mode, in which the heat and moisture fluxes emitted from the fire were injected into the atmosphere, and the other was performed in a one-way coupled mode, in which the atmosphere was not affected by the fire. The difference between these two simulations was used to analyze and quantify the fire's impacts on the atmospheric circulation at different sections of the fire front. The fire-released heat fluxes resulted in vertical velocities as high as 10.8 m/s at the highest measurement level (20 m above ground level), gradually diminishing toward the ground and dropping to 7.9 m/s at 5.77 m. The fire-induced horizontal winds were strongest at the lowest measurement levels (as high as 3.3 m/s), gradually decreasing to less than 1 m/s at 20 m above ground level. The analysis of the simulated flow indicates significant differences between the fire-induced circulation at the fire head and on the flanks.
The fire-induced circulation was much stronger near the fire head than at the flanks, where the fire did not produce particularly strong cross-fire flow and did not significantly change the lateral fire progression. However, at the head of the fire the fire-induced winds blowing across the front were the strongest and significantly accelerated fire progression. The two-way coupled simulation including the fire-induced winds produced 36.2% faster fire propagation than the one-way coupled run, and more realistically represented the fire progression.
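The differencing methodology above reduces to two small operations: the fire-induced circulation is the two-way field minus the one-way field, and the coupling effect on propagation is a ratio of spread rates. A minimal sketch; the function names and inputs are illustrative, not WRF-SFIRE output variables:

```python
import numpy as np

def fire_induced(field_twoway, field_oneway):
    """Fire-induced perturbation: two-way coupled run minus one-way coupled run."""
    return np.asarray(field_twoway, float) - np.asarray(field_oneway, float)

def spread_speedup_pct(ros_twoway, ros_oneway):
    """Relative increase (%) in rate of spread attributable to the coupling."""
    return (ros_twoway / ros_oneway - 1.0) * 100.0
```

Applied level by level at the tower locations, `fire_induced` isolates the fire's contribution to the measured winds from the ambient flow common to both runs.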

     
  4. Abstract. During the Chequamegon Heterogeneous Ecosystem Energy-balance Study Enabled by a High-density Extensive Array of Detectors 2019 (CHEESEHEAD19) field campaign, held in the summer of 2019 in northern Wisconsin, USA, active and passive ground-based remote sensing instruments were deployed to understand the response of the planetary boundary layer to heterogeneous land surface forcing. These instruments include radar wind profilers, microwave radiometers, atmospheric emitted radiance interferometers, ceilometers, high spectral resolution lidars, Doppler lidars, and collaborative lower-atmospheric mobile profiling systems that combine several of these instruments. In this study, these ground-based remote sensing instruments are used to estimate the height of the daytime planetary boundary layer, and their performance is compared against independent boundary layer depth estimates obtained from radiosondes launched as part of the field campaign. The impact of clouds (in particular, boundary layer clouds) on boundary layer depth estimation is also investigated. We found that while all instruments are overall able to provide reasonable boundary layer depth estimates, each shows strengths and weaknesses under certain conditions. For example, radar wind profilers perform well during cloud-free conditions, and microwave radiometers and atmospheric emitted radiance interferometers agree very well during all conditions but are limited by the smoothness of the retrieved thermodynamic profiles. The estimates from ceilometers and high spectral resolution lidars can be hindered by the presence of elevated aerosol layers or clouds, and the multi-instrument retrieval from the collaborative lower-atmospheric mobile profiling systems can be restricted to a limited height range in low-aerosol conditions.
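A common radiosonde-based reference for such comparisons is the bulk Richardson number method: the boundary layer top is taken where the profile's bulk Richardson number first exceeds a critical value, conventionally 0.25. The sketch below is that generic method, not the campaign's exact algorithm; the threshold and the interpolation are the usual conventions.

```python
import numpy as np

def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25, g=9.81):
    """Daytime PBL depth (m AGL) from a sounding via the bulk Richardson number.

    z       : heights AGL (m), increasing, z[0] near the surface
    theta_v : virtual potential temperature (K)
    u, v    : wind components (m/s)
    Returns the first height where Ri_b exceeds ri_crit, or None.
    """
    wind2 = np.maximum(u**2 + v**2, 1e-6)  # avoid divide-by-zero in calm layers
    ri = g * (theta_v - theta_v[0]) * z / (theta_v[0] * wind2)
    above = np.where(ri > ri_crit)[0]
    if above.size == 0:
        return None
    i = above[0]
    if i == 0:
        return float(z[0])
    # linear interpolation between the bracketing levels
    f = (ri_crit - ri[i - 1]) / (ri[i] - ri[i - 1])
    return float(z[i - 1] + f * (z[i] - z[i - 1]))
```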
  5. The leading difficulty in achieving the contrast necessary to directly image exoplanets and associated structures (e.g., protoplanetary disks) at wavelengths ranging from the visible to the infrared is quasi-static speckles (QSSs). QSSs are hard to distinguish from planets at the level of precision necessary to achieve high contrast. QSSs are the result of hardware aberrations that are not compensated for by the adaptive optics (AO) system; these aberrations are called non-common path aberrations (NCPAs). In 2013, Frazin showed how simultaneous millisecond telemetry from the wavefront sensor (WFS) and a science camera behind a stellar coronagraph can be used as input to a regression scheme that simultaneously and self-consistently estimates the NCPAs and the sought-after image of the planetary system (exoplanet image). When run in a closed-loop configuration, the WFS measures the corrected wavefront, called the AO residual (AOR) wavefront. The physical principle underlying the regression method is rather simple: when an image is formed at the science camera, the AOR modulates both the speckles arising from NCPAs and the planetary image. Therefore, the AOR can be used as a probe to estimate the NCPAs and the exoplanet image via regression techniques. The regression approach is made more difficult by the fact that the AOR is not exactly known, since it can be estimated only from the WFS telemetry. The simulations in the Part I paper provide results on the joint regression on NCPAs and the exoplanet image from three different methods, called the ideal, naïve, and bias-corrected estimators. The ideal estimator is not physically realizable (it is useful as a benchmark for simulation studies), but the other two are. The ideal estimator uses the true AOR values (available in simulation studies), but it treats the noise in focal-plane images via standard linearized regression.
Naïve regression uses the same regression equations as the ideal estimator, except that it substitutes the estimated values of the AOR for the true AOR values in the regression formulas, which can result in problematic biases (however, Part I provides an example in which the naïve estimator makes a useful estimate of the NCPAs). The bias-corrected estimator treats the errors in the AOR estimates, but it requires the probability distribution that governs those errors. This paper provides the regression equations for the ideal, naïve, and bias-corrected estimators, as well as a supporting technical discussion.
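The ideal/naïve/bias-corrected distinction is the classic errors-in-variables problem, which a toy scalar regression makes concrete: regressing on a noisy stand-in for the true regressor (the analog of the estimated AOR) attenuates the naïve slope, while knowledge of the error variance permits a moment correction. This is a generic sketch of the idea under simple Gaussian assumptions, not the Part I equations; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
beta_true = 2.0
x_true = rng.standard_normal(n)                       # "true AOR" (ideal case)
sigma_e = 0.5
x_meas = x_true + sigma_e * rng.standard_normal(n)    # "estimated AOR"
y = beta_true * x_true + 0.1 * rng.standard_normal(n) # focal-plane data analog

beta_ideal = (x_true @ y) / (x_true @ x_true)         # uses the true regressor
beta_naive = (x_meas @ y) / (x_meas @ x_meas)         # attenuated toward zero
# moment correction: subtract the known error variance from the denominator
beta_corrected = (x_meas @ y) / (x_meas @ x_meas - n * sigma_e**2)
```

With these parameters the naïve slope is pulled down by roughly the factor 1/(1 + σe²/σx²) = 0.8, while the corrected slope recovers the true value, mirroring why the bias-corrected estimator needs the error distribution of the AOR estimates.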

     