This content will become publicly available on May 15, 2024
- Journal Name: Applied Physics Letters
- Medium: X
- Sponsoring Org: National Science Foundation
Modern short-range gravity experiments that seek to test the Newtonian inverse-square law or the weak equivalence principle of general relativity typically involve measuring minute variations in the twist angle of a torsion pendulum. Motivated by various theoretical arguments, recent efforts largely focus on measurements with test-mass separations in the sub-millimeter regime. To measure the twist, many experiments employ an optical autocollimator with a noise performance of ∼300 nrad/√Hz in the 0.1–10 mHz band, enabling a measurement uncertainty of a few nanoradians in a typical integration time. We investigated an alternative method for measuring a small twist angle through the construction of a modified Michelson interferometer. The main modification is the introduction of two additional arms that allow for improved angular alignment. A series of detectors and LabVIEW software routines were developed to determine the orientation of a mirror attached to a sinusoidally driven rotation stage that oscillated with an amplitude of 0.35 mrad and a period of 200 s. In these measurements, the resolution of the interferometer was 8.1 μrad per fringe, while its dynamic range spanned 0.962 mrad. We compare the performance of this interferometric optical system to existing autocollimator-based methods, discussing its implementation, possible advantages, and future potential, as well as disadvantages and limitations.
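As a rough numerical companion to the figures quoted above, the sketch below converts fringe counts to twist angle using the stated resolution and dynamic range. The function name and the range check are illustrative, not from the paper:

```python
# Illustrative constants taken from the abstract above.
RESOLUTION_PER_FRINGE = 8.1e-6   # rad of twist per interference fringe
DYNAMIC_RANGE = 0.962e-3         # rad, full span of the interferometer

def twist_angle(fringe_count: float) -> float:
    """Convert a (possibly fractional) fringe count to a twist angle in rad."""
    angle = fringe_count * RESOLUTION_PER_FRINGE
    if abs(angle) > DYNAMIC_RANGE:
        raise ValueError("angle outside the interferometer's dynamic range")
    return angle

# The full 0.35 mrad drive amplitude corresponds to roughly 43 fringes:
print(0.35e-3 / RESOLUTION_PER_FRINGE)  # ≈ 43.2
```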
In the modern industrial setting, demand for sensors of all types, in both quantity and quality, is increasing annually. Our research focuses in particular on thin-film nitrate sensors and seeks to provide a robust method for monitoring sensor quality while reducing the cost of production.
We are researching an image-based machine learning method that allows real-time quality assessment of every sensor in the manufacturing pipeline. This opens up the possibility of real-time production-parameter adjustments to enhance sensor performance, with the potential to significantly reduce the cost of quality control while improving sensor quality. Previous research has shown that the texture of the top layer of the sensor (the ion-selective membrane, or ISM, layer) directly correlates with sensor performance. Our method uses this correlation to train a learning-based system to predict the performance of any given sensor from a still photo of the sensor's active region, i.e., the ISM. This allows for real-time assessment of every sensor instead of random sample testing, which is costly in both time and labor and therefore does not account for every individual sensor.
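The texture-to-performance idea can be pictured with a minimal sketch. The specific descriptors below (mean, variance, mean absolute gradients) are illustrative stand-ins, not the features used in the actual study:

```python
import numpy as np

def texture_features(img: np.ndarray) -> np.ndarray:
    """Simple texture descriptors for a grayscale image of the ISM region:
    mean intensity, intensity variance, and mean absolute gradient in x and y.
    (Hypothetical feature set for illustration only.)"""
    gx, gy = np.gradient(img.astype(float))
    return np.array([img.mean(), img.var(),
                     np.abs(gx).mean(), np.abs(gy).mean()])

# Toy example: a rough surface yields larger gradient features than a flat one.
flat = np.full((32, 32), 0.5)
rough = np.random.default_rng(0).random((32, 32))
print(texture_features(flat), texture_features(rough))
```

In a learning-based pipeline, such per-image feature vectors would serve as inputs to a regressor trained against measured sensor performance.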
Sensor measurement is a crucial part of the data-collection process: each sensor is taken to a specialized lab where its performance is characterized. Because noise and error are unavoidable during measurement, we generated credibility data from the performance data to indicate the reliability of each sensor's performance signal at each sample time.
In this paper, we propose a machine-learning-based method to predict sensor performance using image features extracted from non-contact sensor images, guided by the credibility data. This eliminates the need to test every sensor as it is manufactured, which is not practical in a high-speed roll-to-roll setting, thus truly enabling a certify-as-built framework.
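One hedged way to picture "guided by the credibility data" is a weighted least-squares fit in which each training sample is weighted by the credibility of its performance measurement. This formulation is an assumption for illustration, not the paper's actual model:

```python
import numpy as np

def fit_weighted(X, y, credibility):
    """Credibility-weighted least squares (hypothetical formulation):
    samples with less reliable performance measurements get less weight."""
    w = np.sqrt(np.asarray(credibility, dtype=float))
    coef, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
    return coef

# Toy usage: two corrupted samples carry zero credibility, so the fit
# recovers the clean linear relationship y = 2x.
x = np.arange(6, dtype=float)
y = 2 * x
y[4:] += 10.0                        # corrupted performance measurements
cred = np.array([1, 1, 1, 1, 0, 0], dtype=float)
X = np.column_stack([x, np.ones_like(x)])
print(fit_weighted(X, y, cred))      # ≈ [2, 0]
```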
The post-Newtonian (PN) formalism plays an integral role in the models used to extract information from gravitational-wave data, but models that incorporate this formalism are inherently approximations. Disagreement between an approximate model and nature produces mismodeling biases in the parameters inferred from data, introducing systematic error. We carry out a proof-of-principle study of such systematic error by considering signals produced by quasi-circular, inspiraling black hole binaries through an injection and recovery campaign. In particular, we study how unknown, but calibrated, higher-order PN corrections to the gravitational-wave phase impact systematic error in recovered parameters. As a first study, we produce injected data of non-spinning binaries as detected by a current, second-generation network of ground-based observatories and recover them with models of varying PN order in the phase. We find that the truncation of higher-order (beyond 3.5PN) corrections to the phase can produce significant systematic error even at the signal-to-noise ratios of current detector networks. We propose a method to mitigate systematic error by marginalizing over our ignorance of the waveform through the inclusion of higher-order PN coefficients as new model parameters. We show that this method can greatly reduce systematic error at the cost of increased statistical error.
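For context, the PN phase referred to above is conventionally written in the stationary-phase (TaylorF2-style) form; the notation here follows standard usage rather than this paper's, and log-dependent phase terms are omitted for brevity:

```latex
\Psi(f) = 2\pi f t_c - \phi_c - \frac{\pi}{4}
        + \frac{3}{128\,\eta\, v^{5}} \sum_{k} \varphi_k\, v^{k},
\qquad v \equiv (\pi M f)^{1/3},
```

where $M$ is the total mass, $\eta$ the symmetric mass ratio, and the coefficients $\varphi_k$ through $k=7$ correspond to the known terms up to 3.5PN order. Promoting the unknown higher-order $\varphi_k$ to free model parameters is, schematically, the marginalization strategy described above.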
Quantum sensors are used for precision timekeeping, field sensing, and quantum communication. Comparisons among a distributed network of these sensors are capable of, for example, synchronizing clocks at different locations. The performance of a sensor network is limited by technical challenges as well as the inherent noise associated with the quantum states used to realize the network. For networks with only local entanglement at each node, the noise performance of the network improves at best as the square root of the number of nodes. Here, we demonstrate that nonlocal entanglement between network nodes offers better scaling with network size. A shared quantum nondemolition measurement entangles a clock network with up to four nodes. This network provides up to 4.5 dB better precision than one without nonlocal entanglement, and an 11.6 dB improvement compared to a network of sensors operating at the quantum projection noise limit. We demonstrate the generality of the approach with atomic clock and atomic interferometer protocols, in scientifically and technologically relevant configurations optimized for intrinsically differential comparisons of sensor outputs.
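The square-root-of-N baseline mentioned above can be made concrete with a toy calculation. The dB convention used here (10·log₁₀ of the variance ratio, so standard deviation falls as √N while variance falls as N) is an assumption for illustration; it describes only the classical-averaging baseline, not the entangled protocol:

```python
import math

def variance_gain_db(n_nodes: int) -> float:
    """Phase-variance improvement (in dB) from averaging n independent
    nodes: the standard deviation falls as sqrt(n), so the variance
    falls as n, giving 10*log10(n) dB under this convention."""
    return 10 * math.log10(n_nodes)

print(variance_gain_db(4))  # ≈ 6.0 dB for four independent nodes
```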
We consider a centralized detection problem in which sensors experience noisy measurements and intermittent connectivity to a centralized fusion center. The sensors may collaborate locally within predefined sensor clusters, fusing their noisy sensor data to reach a common local estimate of the detected event in each cluster. The connectivity of each sensor cluster is intermittent and depends on the sensors' available communication opportunities with the fusion center. Upon receiving the estimates from all the connected sensor clusters, the fusion center fuses the received estimates to make a final determination regarding the occurrence of the event across the deployment area. We refer to this hybrid communication scheme as a cloud-cluster architecture. We propose a method for optimizing the decision rule for each cluster and analyze the expected detection performance resulting from our hybrid scheme. Our method is tractable and addresses the high computational complexity caused by heterogeneity in the sensors' and clusters' detection quality, heterogeneity in their communication opportunities, and the nonconvexity of the loss function. Our analysis shows that clustering the sensors provides resilience to noise when the probability of sensor communication with the cloud is low. For larger clusters, a steep improvement in detection performance is possible even at low communication probability by using our cloud-cluster architecture.
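A toy Monte Carlo sketch of the cloud-cluster idea follows. All parameters and the majority-vote rule are hypothetical simplifications, not the optimized decision rule of the paper:

```python
import random

def cluster_report(event, n_sensors=5, p_flip=0.2, p_link=0.5, rng=random):
    """One round of a hypothetical cloud-cluster scheme: each sensor's
    binary decision flips the truth with probability p_flip, the cluster
    fuses locally by majority vote, and the fused estimate reaches the
    fusion center with probability p_link (else the cluster is silent)."""
    votes = [event != (rng.random() < p_flip) for _ in range(n_sensors)]
    fused = sum(votes) > n_sensors / 2
    return fused if rng.random() < p_link else None

# Delivered cluster estimates beat a lone sensor's 0.8 accuracy:
rng = random.Random(0)
reports = [cluster_report(True, rng=rng) for _ in range(10_000)]
delivered = [r for r in reports if r is not None]
print(sum(delivered) / len(delivered))  # ≈ 0.94 with these toy parameters
```

Even though roughly half the cluster reports never arrive, each delivered report is far more reliable than a single sensor's, which is the intuition behind the resilience result above.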