Abstract On 5 April 2024, 10:23 a.m. local time, a moment magnitude 4.8 earthquake struck Tewksbury Township, New Jersey, about 65 km west of New York City. Millions of people from Virginia to Maine and beyond felt the ground shaking, resulting in the largest number (>180,000) of U.S. Geological Survey (USGS) “Did You Feel It?” reports of any earthquake. A team deployed by the Geotechnical Extreme Events Reconnaissance Association and the National Institute of Standards and Technology documented structural and nonstructural damage, including substantial damage to a historic masonry building in Lebanon, New Jersey. The USGS National Earthquake Information Center reported a focal depth of about 5 km, consistent with a lack of signal in Interferometric Synthetic Aperture Radar data. The focal mechanism solution is strike slip with a substantial thrust component. Neither of the mechanism’s nodal planes is parallel to the primary northeast trend of geologic discontinuities and mapped faults in the region, including the Ramapo fault. However, many of the relocated aftershocks, for which locations were augmented by temporary seismic deployments, form a cluster that parallels the general northeast trend of the faults. The aftershocks lie near the Tewksbury fault, north of the Ramapo fault.
-
This research investigates the effect of scaling in virtual reality to improve the reach of users with Parkinson’s disease (PD). People with PD have limited reach, often due to impaired postural stability. We investigated how virtual reality (VR) can improve reach during and after VR exposure. Participants played a VR game in which they smashed water balloons thrown at them by crossing their midsection. The distance from which the balloons were thrown increased or decreased based on success or failure. Their perception of the distance and of their hand was scaled in three counterbalanced conditions: under-scaled (scale = 0.83), not-scaled (scale = 1), and over-scaled (scale = 1.2), where the scale value is the ratio between the virtual reach that they perceive in the virtual environment (VE) and their actual reach. In each study condition, six measurements were recorded: 1. Real World Reach (pre-exposure), 2. Virtual Reality Baseline Reach, 3. Virtual Reality Not-Scaled Reach, 4. Under-Scaled Reach, 5. Over-Scaled Reach, and 6. Real World Reach (post-exposure). Our results show that scaling a person’s movement in virtual reality can help improve reach. Therefore, we recommend including a scaling factor in VR games for people with Parkinson’s disease.
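The scaling conditions above amount to remapping the rendered hand offset from the user's midsection by a constant ratio. A minimal sketch of that remapping (function name and coordinate convention are illustrative, not taken from the study):

```python
def scaled_hand_position(torso, real_hand, scale):
    """Return the rendered (virtual) hand position for a given scale factor.

    scale is the ratio of virtual reach to actual reach, e.g.
    0.83 (under-scaled), 1.0 (not-scaled), 1.2 (over-scaled).
    """
    # Scale the hand's offset from the torso, component-wise.
    return [t + scale * (h - t) for t, h in zip(torso, real_hand)]

# A 0.5 m forward reach rendered under the over-scaled condition:
print(scaled_hand_position([0.0, 1.2, 0.0], [0.0, 1.2, 0.5], 1.2))
# prints [0.0, 1.2, 0.6]
```

Under-scaling (scale < 1) makes users reach farther in the real world to achieve the same virtual reach, which is one plausible way such a manipulation could extend physical reach.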
-
ABSTRACT We present the 2023 U.S. Geological Survey time-independent earthquake rupture forecast for the conterminous United States, which gives authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes throughout the region. In addition to updating virtually all model components, a major focus has been to provide a better representation of epistemic uncertainties. For example, we have improved the representation of multifault ruptures, both in terms of allowing more and less fault connectivity than in the previous models, and in sweeping over a broader range of viable models. An unprecedented level of diagnostic information has been provided for assessing the model, and the development was overseen by a 19-member participatory review panel. Although we believe the new model embodies significant improvements and represents the best available science, we also discuss potential model limitations, including the applicability of logic tree branch weights with respect to different types of hazard and risk metrics. Future improvements are also discussed, with deformation model enhancements being particularly worthy of pursuit, as well as better representation of sampling errors in the gridded seismicity components. We also plan to add time-dependent components, and assess implications with a wider range of hazard and risk metrics.
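The logic-tree treatment of epistemic uncertainty mentioned above boils down to carrying several alternative models with credibility weights and combining their results. A simplified sketch with hypothetical branch values and weights (not numbers from the forecast):

```python
def logic_tree_mean(branch_values, weights):
    """Weighted mean over epistemic logic-tree branches.

    Each branch carries one alternative model result (e.g. an annual
    rupture rate); weights express relative credibility and sum to 1.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(v * w for v, w in zip(branch_values, weights))

# Hypothetical low/central/high branch rates (per year) and weights:
print(logic_tree_mean([0.008, 0.010, 0.014], [0.2, 0.5, 0.3]))
```

The abstract's caveat about branch-weight applicability reflects that a single set of weights may not be equally appropriate for every downstream hazard or risk metric computed from the branches.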
-
The US National Seismic Hazard Model (NSHM) was updated in 2023 for all 50 states using new science on seismicity, fault ruptures, ground motions, and probabilistic techniques to produce a standard of practice for public policy and other engineering applications (defined for return periods greater than ∼475 or less than ∼10,000 years). Changes in 2023 time-independent seismic hazard (both increases and decreases compared to previous NSHMs) are substantial because the new model considers more data and updated earthquake rupture forecasts and ground-motion components. In developing the 2023 model, we tried to apply best available or applicable science based on advice of co-authors, more than 50 reviewers, and hundreds of hazard scientists and end-users, who attended public workshops and provided technical inputs. The hazard assessment incorporates new catalogs, declustering algorithms, gridded seismicity models, magnitude-scaling equations, fault-based structural and deformation models, multi-fault earthquake rupture forecast models, semi-empirical and simulation-based ground-motion models, and site amplification models conditioned on shear-wave velocities of the upper 30 m of soil and deeper sedimentary basin structures. Seismic hazard calculations yield hazard curves at hundreds of thousands of sites, ground-motion maps, uniform-hazard response spectra, and disaggregations developed for pseudo-spectral accelerations at 21 oscillator periods and two peak parameters, Modified Mercalli Intensity, and 8 site classes required by building codes and other public policy applications. Tests show the new model is consistent with past ShakeMap intensity observations. Sensitivity and uncertainty assessments ensure resulting ground motions are compatible with known hazard information and highlight the range and causes of variability in ground motions. 
We produce several impact products including building seismic design criteria, intensity maps, planning scenarios, and engineering risk assessments showing the potential physical and social impacts. These applications provide a basis for assessing, planning, and mitigating the effects of future earthquakes.
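The return periods quoted above map to exceedance probabilities under the standard Poisson occurrence assumption, P = 1 − exp(−t/T). A short sketch of that conversion (a generic textbook relation, not code from the NSHM):

```python
import math

def exceedance_probability(return_period_yr, window_yr):
    """Probability of at least one exceedance in window_yr years,
    assuming Poisson occurrences with mean rate 1/return_period_yr."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

# The ~475-year return period is the classic "10% in 50 years" level:
print(round(exceedance_probability(475, 50), 3))   # prints 0.1
# The ~2475-year return period is the "2% in 50 years" level:
print(round(exceedance_probability(2475, 50), 3))  # prints 0.02
```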
-
Abstract Flare frequency distributions represent a key approach to addressing one of the largest problems in solar and stellar physics: determining the mechanism that counterintuitively heats coronae to temperatures that are orders of magnitude hotter than the corresponding photospheres. It is widely accepted that the magnetic field is responsible for the heating, but there are two competing mechanisms that could explain it: nanoflares or Alfvén waves. To date, neither can be directly observed. Nanoflares are, by definition, extremely small, but their aggregate energy release could represent a substantial heating mechanism, presuming they are sufficiently abundant. One way to test this presumption is via the flare frequency distribution, which describes how often flares of various energies occur. If the slope of the power law fitting the flare frequency distribution is above a critical threshold, α = 2 as established in prior literature, then there should be a sufficient abundance of nanoflares to explain coronal heating. We performed >600 case studies of solar flares, made possible by an unprecedented number of data analysts via three semesters of an undergraduate physics laboratory course. This allowed us to include two crucial, but nontrivial, analysis methods: preflare baseline subtraction and computation of the flare energy, which requires determining flare start and stop times. We aggregated the results of these analyses into a statistical study to determine that α = 1.63 ± 0.03. This is below the critical threshold, suggesting that Alfvén waves are an important driver of coronal heating.
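The power-law index α of a flare frequency distribution dN/dE ∝ E^(−α) can be estimated from a set of flare energies with the standard continuous-power-law maximum-likelihood estimator, α̂ = 1 + n / Σ ln(E_i/E_min). This sketch recovers a known slope from synthetic energies; the estimator is a generic method, not necessarily the fitting procedure the study used:

```python
import math
import random

def powerlaw_slope_mle(energies, e_min):
    """Maximum-likelihood estimate of alpha for dN/dE ∝ E**(-alpha),
    using only events with energy >= e_min:
        alpha_hat = 1 + n / sum(ln(E_i / e_min))
    """
    xs = [e for e in energies if e >= e_min]
    return 1.0 + len(xs) / sum(math.log(e / e_min) for e in xs)

# Check the estimator on synthetic flare energies drawn with a known
# slope (inverse-transform sampling for a power law above e_min = 10):
random.seed(0)
alpha_true = 1.63
sample = [10.0 * random.random() ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50_000)]
print(round(powerlaw_slope_mle(sample, 10.0), 2))  # close to 1.63
```

An α below 2 means large flares dominate the total energy budget, which is why the measured 1.63 argues against nanoflares supplying enough aggregate heating.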
-
Abstract The semiconductor tracker (SCT) is one of the tracking systems for charged particles in the ATLAS detector. It consists of 4088 silicon strip sensor modules. During Run 2 (2015–2018) the Large Hadron Collider delivered an integrated luminosity of 156 fb⁻¹ to the ATLAS experiment at a centre-of-mass proton-proton collision energy of 13 TeV. The instantaneous luminosity and pile-up conditions were far in excess of those assumed in the original design of the SCT detector. Due to improvements to the data acquisition system, the SCT operated stably throughout Run 2. It was available for 99.9% of the integrated luminosity and achieved a data-quality efficiency of 99.85%. Detailed studies have been made of the leakage current in SCT modules and the evolution of the full depletion voltage, which are used to study the impact of radiation damage to the modules.