Search for: All records

Creators/Authors contains: "Silva, R"

  1. Abstract The paper presents the effects of storm‐time prompt penetration electric fields (PPEF) and traveling atmospheric disturbances (TADs) on the total electron content (TEC), foF2, and hmF2 in the American sector (north and south) during the geomagnetic storm of 23–24 April 2023. The data show a poleward shift of the Equatorial Ionization Anomaly (EIA) crests to 18°N and 20°S in the evening of 23 April (attributed to eastward PPEF), with the crests remaining at nearly the same latitudes after the PPEF reversed westward. The thermospheric neutral wind velocity, foF2, hmF2, and TEC variations show TADs from the northern and southern high latitudes propagating equatorward and crossing the equator after midnight on 23 April. Meridional keograms of ΔTEC show that the TAD structures in the north/south propagated with phase velocities of 470/485 m/s, wavelengths of 4,095/4,016 km, and periods of 2.42/2.30 hr, respectively. The interactions of the TADs also appear to modify the wind velocities at low latitudes. The eastward PPEF and equatorward TADs also favored the development of a clear/less distinct F3 layer in the regions north/south of the equator.
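The quoted TAD parameters are internally consistent: dividing each wavelength by its period recovers the stated phase velocity. A minimal check using only the values from the abstract (the function name is ours):

```python
# Sanity check: phase velocity v = wavelength / period, using the
# TAD parameters quoted in the abstract above.
def phase_velocity(wavelength_km: float, period_hr: float) -> float:
    """Return phase velocity in m/s for the given wavelength and period."""
    return wavelength_km * 1e3 / (period_hr * 3600.0)

# Northern TAD: 4,095 km over 2.42 hr -> ~470 m/s
v_north = phase_velocity(4095, 2.42)
# Southern TAD: 4,016 km over 2.30 hr -> ~485 m/s
v_south = phase_velocity(4016, 2.30)
print(round(v_north), round(v_south))  # 470 485
```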
  2. Abstract The fabrication of ceramic scintillators by laser sintering is briefly reviewed and current limitations are discussed. The experimental work focused on the fabrication and characterization of undoped and Pr-doped Lu3Al5O12 (LuAG). X-ray diffraction (XRD) and Raman spectroscopy were used to characterize the structure of the sintered ceramics, with the XRD results suggesting the absence of residual thermal stresses. Collectively, the Raman results suggest that the incorporation of Pr affects the structure and its dynamics. Broadening of the ceramic peaks relative to those of the single crystal revealed the presence of structural disorder. Scanning electron microscopy revealed intergrain porosity, explaining the lack of optical transparency. Energy-dispersive X-ray spectroscopy (EDX) measurements showed Pr to be homogeneously distributed. Radioluminescence measurements under X-ray excitation as a function of temperature were used to investigate intrinsic defects of the host, including anti-sites and F-type defects.
  3. Runtime systems that automate the execution of applications on distributed cyberinfrastructures need to make scheduling decisions. Researchers have proposed many scheduling algorithms, but most of them are designed based on analytical models and assumptions that may not hold in practice. The literature is thus rife with algorithms that have been evaluated only within the scope of their underlying assumptions but whose practical effectiveness is unclear. It is thus difficult for developers to decide which algorithm to implement in their runtime systems. To obviate the above difficulty, we propose an approach by which the runtime system executes, throughout application execution, simulations of this very execution. Each simulation is for a different algorithm in a scheduling algorithm portfolio, and the best algorithm is selected based on simulation results. The main objective of this work is to evaluate the feasibility and potential merit of this portfolio scheduling approach, even in the presence of simulation inaccuracy, when compared to the traditional one-algorithm approach. We perform this evaluation via a case study in the context of scientific workflows. Our main finding is that portfolio scheduling can outperform the best one-algorithm approach even in the presence of relatively large simulation inaccuracies.
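The portfolio idea described above can be sketched in a few lines: at a decision point, simulate the remaining execution once per candidate algorithm and pick the apparent best. This is an illustrative toy, not the paper's implementation; the algorithm names and the noisy simulator are our assumptions.

```python
import random

def simulate_makespan(algorithm: str, true_makespan: dict, error: float, rng) -> float:
    """Toy simulator: the true makespan distorted by a relative error."""
    return true_makespan[algorithm] * (1.0 + rng.uniform(-error, error))

def select_algorithm(portfolio, true_makespan, error, rng):
    """Run one simulation per portfolio member; return the apparent best."""
    estimates = {a: simulate_makespan(a, true_makespan, error, rng) for a in portfolio}
    return min(estimates, key=estimates.get)

rng = random.Random(0)
# Hypothetical ground-truth makespans for three illustrative algorithms.
truth = {"HEFT": 100.0, "MinMin": 120.0, "Random": 150.0}
# Even with 20% simulation inaccuracy, the truly best algorithm usually wins.
picks = [select_algorithm(truth.keys(), truth, 0.20, rng) for _ in range(1000)]
print(picks.count("HEFT") / 1000)
```

This mirrors the paper's key observation: as long as the simulation error is smaller than the gap between algorithms, the portfolio tends to select well despite inaccuracy.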
  4. IEEE Computer Society (Ed.)
    This poster presents our first steps to define a roadmap to robust science for high-throughput applications used in scientific discovery. These applications combine multiple components into increasingly complex multi-modal workflows that are often executed in concert on heterogeneous systems. The increasing complexity hinders the ability of scientists to generate robust science (i.e., ensuring performance scalability in space and time; trust in technology, people, and infrastructures; and reproducible or confirmable research). Scientists must withstand and overcome adverse conditions such as heterogeneous and unreliable architectures at all scales (including extreme scale), rigorous testing under uncertainties, unexplainable algorithms in machine learning, and black-box methods. This poster presents findings and recommendations to build a roadmap to overcome these challenges and enable robust science. The data were collected from an international community of scientists during a Virtual World Cafe in February 2021.
  5. IEEE Computer Society (Ed.)
    Scientists using the high-throughput computing (HTC) paradigm for scientific discovery rely on complex software systems and heterogeneous architectures that must deliver robust science (i.e., ensuring performance scalability in space and time; trust in technology, people, and infrastructures; and reproducible or confirmable research). Developers must overcome a variety of obstacles to pursue workflow interoperability, identify tools and libraries for robust science, port codes across different architectures, and establish trust in non-deterministic results. This poster presents recommendations to build a roadmap to overcome these challenges and enable robust science for HTC applications and workflows. The findings were collected from an international community of software developers during a Virtual World Cafe in May 2021. 
  6. The prevalence of scientific workflows with high computational demands calls for their execution on various distributed computing platforms, including large-scale leadership-class high-performance computing (HPC) clusters. To handle the deployment, monitoring, and optimization of workflow executions, many workflow systems have been developed over the past decade. There is a need for workflow benchmarks that can be used to evaluate the performance of workflow systems on current and future software stacks and hardware platforms. We present a generator of realistic workflow benchmark specifications that can be translated into benchmark code to be executed with current workflow systems. Our approach generates workflow tasks with arbitrary performance characteristics (CPU, memory, and I/O usage) and with realistic task dependency structures based on those seen in production workflows. We present experimental results that show that our approach generates benchmarks that are representative of production workflows, and conduct a case study to demonstrate the use and usefulness of our generated benchmarks to evaluate the performance of workflow systems under different configuration scenarios. 
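The generator described above can be illustrated with a minimal sketch: emit a benchmark workflow specification as a layered DAG whose tasks carry synthetic CPU, memory, and I/O demands. The field names, the layered topology, and the parameter ranges are our assumptions, not the paper's actual specification format.

```python
import random

def generate_workflow(n_layers: int = 3, width: int = 4, seed: int = 0):
    """Generate a toy layered-DAG benchmark spec with synthetic task demands."""
    rng = random.Random(seed)
    tasks, prev_layer, tid = [], [], 0
    for _ in range(n_layers):
        layer_tasks = []
        for _ in range(width):
            layer_tasks.append({
                "id": f"task_{tid}",
                "cpu_seconds": rng.uniform(1, 100),          # synthetic CPU work
                "memory_mb": rng.choice([512, 1024, 4096]),  # synthetic footprint
                "io_mb": rng.uniform(0, 500),                # synthetic I/O volume
                # each task depends on up to two tasks from the previous layer
                "parents": sorted(rng.sample(prev_layer, k=min(2, len(prev_layer)))),
            })
            tid += 1
        tasks.extend(layer_tasks)
        prev_layer = [t["id"] for t in layer_tasks]
    return tasks

wf = generate_workflow()
print(len(wf))  # 12 tasks in a 3-layer, width-4 DAG
```

A real generator, as the abstract notes, would instead draw the dependency structure from patterns observed in production workflows rather than from a fixed layered shape.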
  7. Abstract The spectral line profile of the atomic oxygen O(¹D₂→³P₂) transition near 6300 Å in the airglow has been used for more than 50 years to extract neutral wind and temperature information from the F‐region ionosphere. A new spectral model and recent samples of this airglow emission in the presence of the nearby lambda‐doubled OH Meinel (9‐3) P2(2.5) emission lines underscore earlier cautions that OH can significantly distort the OI line center position and line width observed using a single‐etalon Fabry‐Perot interferometer (FPI). The consequence of these profile distortions for the emission line width and Doppler position is a strong function of the selected etalon plate spacing. Single‐etalon Fabry‐Perot interferometers placed in the field for thermospheric measurements have widely varying etalon spacings, so systematic wind biases caused by the OH line positions differ between instruments, complicating comparisons between sites. Based on the best current determinations of the OH and O(¹D) line positions, the ideal gap for single‐etalon FPI wind measurements places the OH emissions in the wings of the O(¹D) spectral line profile. Optical systems that can accommodate prefilters with square passbands less than ∼3 Å in the optical beam can effectively block the OH contamination. When that is not possible, a method to fit for OH contamination and remove it in the spectral background of an active Fabry‐Perot system is evaluated.
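A back-of-the-envelope sketch of why the etalon gap matters: the free spectral range of an air-gapped etalon at normal incidence is FSR = λ²/(2d), so where a fixed OH wavelength offset lands within an interference order depends on the chosen gap d. The 2.5 Å offset below is a purely illustrative number, not the actual OH line position from the paper.

```python
# Free spectral range of a Fabry-Perot etalon: FSR = lam**2 / (2*d)
# (air gap, normal incidence). The OH offset value here is hypothetical.
LAMBDA_OI = 6300e-10   # O(1D) red line wavelength, meters
OH_OFFSET = 2.5e-10    # illustrative OH offset of 2.5 A, meters

def fsr(gap_m: float, lam: float = LAMBDA_OI) -> float:
    """Free spectral range in meters of wavelength for the given etalon gap."""
    return lam**2 / (2.0 * gap_m)

for gap_cm in (1.0, 1.5, 2.0):
    order = fsr(gap_cm * 1e-2)
    frac = (OH_OFFSET % order) / order  # fractional position within an order
    print(f"gap {gap_cm} cm: FSR {order * 1e10:.3f} A, OH at {frac:.2f} of an order")
```

Because the fractional position varies with the gap, instruments with different etalon spacings fold the same OH line to different apparent positions relative to the OI profile, which is why the abstract notes that wind biases differ between sites.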