An 84-dB-SNDR Low-OSR Fourth-Order Noise-Shaping SAR With an FIA-Assisted EF-CRFF Structure and Noise-Mitigated Push-Pull Buffer-in-Loop Technique
- Award ID(s): 2133106
- PAR ID: 10447054
- Date Published:
- Journal Name: IEEE Journal of Solid-State Circuits
- Volume: 57
- Issue: 12
- ISSN: 0018-9200
- Page Range / eLocation ID: 3804 to 3815
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Abstract: Eccentricity has emerged as a potentially useful tool for helping to identify the origin of black hole mergers. However, eccentric templates can be computationally very expensive owing to their large number of harmonics, making statistical analyses to distinguish formation channels very challenging. We outline a method for estimating the signal-to-noise ratio (S/N) for inspiraling binaries at lower frequencies, such as those proposed for LISA and DECIGO. Our approximation can be useful more generally for any quasi-periodic source. We argue that, surprisingly, the S/N evaluated at or near the peak frequency (of the power) is well approximated by using a constant-noise curve, even though in reality the noise strain has a power-law dependence. We furthermore improve this initial estimate over our previous calculation by allowing for frequency dependence in the noise, which expands the range of eccentricity and frequency over which our approximation applies. We show how to apply this method to get an answer accurate to within a factor of 2 over almost the entire projected observable frequency range. We emphasize that this method is not a replacement for detailed signal processing. Its utility lies chiefly in identifying theoretically useful discriminators among different populations and providing fairly accurate estimates of how well they should work. The approximation can also be used to narrow down parameter ranges in a computationally economical way when events are observed. Finally, we show a distinctive way to identify events with extremely high eccentricity, where the signal is enhanced relative to naive expectations on the high-frequency end.
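The matched-filter S/N the abstract refers to has a standard form; a minimal sketch of the constant-noise approximation described there, in our own notation (the harmonic decomposition and symbols are assumptions, not the paper's):

```latex
% Matched-filter S/N summed over orbital harmonics n:
\rho^2 \;=\; 4 \sum_n \int \frac{|\tilde{h}_n(f)|^2}{S_n(f)}\, df
% Constant-noise approximation: evaluate the noise PSD at the
% peak frequency of the emitted power, f_{\mathrm{peak}},
% and pull it out of the integral:
\rho^2 \;\approx\; \frac{4}{S_n(f_{\mathrm{peak}})} \sum_n \int |\tilde{h}_n(f)|^2\, df
```

The second line is the estimate the abstract says remains surprisingly accurate even when $S_n(f)$ actually follows a power law.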
- Batch Normalization (BN) (Ioffe and Szegedy 2015) normalizes the features of an input image via the statistics of a batch of images, and hence BN introduces noise into the gradient of the training loss. Previous works indicate that this noise is important for the optimization and generalization of deep neural networks, but too much noise harms network performance. In our paper, we offer a new point of view: the self-attention mechanism can help regulate the noise by enhancing instance-specific information, yielding a better regularization effect. We therefore propose an attention-based BN called Instance Enhancement Batch Normalization (IEBN), which recalibrates the information of each channel by a simple linear transformation. IEBN is effective at regulating the batch noise and stabilizing network training to improve generalization, even in the presence of two kinds of noise attacks during training. Finally, IEBN outperforms BN with only a light parameter increment in image classification tasks across different network structures and benchmark datasets.
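A minimal NumPy sketch of the idea described above: standard batch normalization followed by a per-instance, per-channel gate computed from a simple linear transformation of each instance's own channel statistics. The gate parameters `a` and `b` and the exact statistic used are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def iebn_forward(x, gamma, beta, a, b, eps=1e-5):
    """Sketch of Instance Enhancement Batch Normalization (training mode).

    x: input of shape (N, C, H, W).
    gamma, beta: standard BN affine parameters, shape (1, C, 1, 1).
    a, b: per-channel gate parameters (the "simple linear transformation").
    """
    # Standard BN: normalize with batch statistics over N, H, W.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    y = gamma * x_hat + beta

    # Instance enhancement: each sample's own per-channel mean drives a
    # sigmoid gate, so the rescaling is instance-specific rather than
    # shared across the batch.
    inst_mean = y.mean(axis=(2, 3), keepdims=True)  # shape (N, C, 1, 1)
    gate = sigmoid(a * inst_mean + b)
    return y * gate
```

Because the gate depends on each sample's own statistics, it injects instance-specific information on top of the shared batch statistics, which is the mechanism the abstract credits with regulating BN's batch noise.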