We consider a network of distributed underwater sensors whose task is to monitor the movement of objects across an area. The sensors measure the strength of signals emitted by the objects and convey the measurements to local fusion centers. Multiple fusion centers are deployed to cover an arbitrarily large area. The fusion centers communicate with each other to reach consensus on the estimated locations of the moving objects. We introduce two efficient methods for fusing distributed partial estimates when communication delay is not negligible. We concentrate on the minimum mean squared error (MMSE) global estimator and evaluate the performance of these fusion methods in the context of multiple-object tracking via extended Kalman filtering. Numerical results show superior performance compared to the case in which delay is ignored.
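For intuition, the standard MMSE rule for combining independent, unbiased local estimates is inverse-covariance (information-form) weighting. The sketch below is this generic textbook construction, not the specific fusion methods proposed in the paper, and the numbers are purely illustrative:

```python
import numpy as np

def mmse_fuse(estimates, covariances):
    """Fuse independent, unbiased local estimates x_i with covariances P_i.

    The MMSE combination weights each estimate by its inverse covariance:
    P = (sum_i P_i^{-1})^{-1},  x = P @ sum_i (P_i^{-1} x_i).
    """
    infos = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(infos))
    x_fused = P_fused @ sum(I @ x for I, x in zip(infos, estimates))
    return x_fused, P_fused

# Two local 2-D position estimates; the second is four times more accurate.
x1, P1 = np.array([1.0, 0.0]), 4.0 * np.eye(2)
x2, P2 = np.array([3.0, 0.0]), 1.0 * np.eye(2)
x, P = mmse_fuse([x1, x2], [P1, P2])
# x == [2.6, 0.0]: pulled toward the lower-variance estimate x2
```

The fused covariance is never larger than the smallest input covariance, which is why fusing many partial estimates pays off even when individual sensors are noisy.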
Delay-Tolerant Distributed Inference in Tracking Networks
This paper discusses asynchronous distributed inference in object tracking. Unlike many studies, which assume that the delay in communication between partial estimators and the central station is negligible, our study focuses on asynchronous distributed inference in the presence of delays. We introduce an efficient data fusion method for combining the distributed estimates when communication delay is not negligible. To overcome the delay, predictions are made for the state of the system based on the most current information available from the partial estimators. Simulation results show the efficacy of the proposed methods.
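The delay-compensation idea, rolling a stale partial estimate forward to the current time before fusing it, can be sketched with a Kalman-style prediction step. The constant-velocity model and the process-noise scale below are assumptions for illustration, not the paper's model:

```python
import numpy as np

def predict_ahead(x, P, delay, q=0.1):
    """Propagate a delayed estimate forward by `delay` seconds under a
    constant-velocity model, so it refers to the current time before fusion.

    State x = [position, velocity]; q scales the process noise (assumed).
    """
    F = np.array([[1.0, delay],
                  [0.0, 1.0]])                    # state transition over `delay`
    Q = q * np.array([[delay**3 / 3, delay**2 / 2],
                      [delay**2 / 2, delay]])     # accumulated process noise
    return F @ x, F @ P @ F.T + Q                 # uncertainty grows with delay

# An estimate that is 1.5 s old by the time it reaches the fusion center:
x_old = np.array([10.0, 2.0])        # position 10 m, velocity 2 m/s
P_old = 0.5 * np.eye(2)
x_now, P_now = predict_ahead(x_old, P_old, delay=1.5)
# x_now == [13.0, 2.0]; P_now is strictly larger than P_old
```

The inflated covariance is the important part: it tells the fusion step to trust a long-delayed estimate less than a fresh one.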
- PAR ID:
- 10316503
- Editor(s):
- Paolo Spagnolo
- Date Published:
- Journal Name:
- IEEE Sensors Journal
- ISSN:
- 1558-1748
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Many applications, e.g., federated learning, require the aggregation of information across a large number of distributed nodes. In this paper, we explore efficient schemes to do this at scale, leveraging aggregation at intermediate nodes across overlay trees. Files/updates are split into chunks, which are in turn simultaneously aggregated over different trees. For a synchronous setting with homogeneous communication capabilities and deterministic link delays, we develop a delay-optimal aggregation schedule. In the asynchronous setting, where delays are stochastic but i.i.d. across links, we show that for an asynchronous implementation of the above schedule, the expected aggregation delay is near-optimal. We then consider the impact that failures in the network have on the resulting mean square error (MSE) of the estimated aggregates and how it can be controlled through the addition of redundancy, reattempts, and optimal failure-aware estimates of the desired aggregate. Based on the analysis of a natural model of failures, we show how to choose parameters to optimize the trade-off between aggregation delay and MSE. We present simulation results exhibiting the above trade-offs. We also consider a more general class of correlated failures and demonstrate via simulation the applicability of our techniques in those settings as well.
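The logarithmic-depth behaviour of tree aggregation can be illustrated with a toy pairwise schedule. This sketch ignores chunking, link delays, and failures, so it shows only the general idea, not the delay-optimal schedule developed in the paper:

```python
def tree_aggregate(values):
    """Aggregate partial sums up a binary tree: in each round, adjacent
    active nodes combine their partial sums, halving the number of active
    nodes, so the root holds the total after ceil(log2(n)) rounds."""
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        vals = [vals[i] + vals[i + 1] if i + 1 < len(vals) else vals[i]
                for i in range(0, len(vals), 2)]
        rounds += 1
    return vals[0], rounds

total, rounds = tree_aggregate([1, 2, 3, 4, 5])
# total == 15 after rounds == 3 combining rounds (ceil(log2(5)))
```

With chunked updates, several such trees run in parallel over different chunks, so the per-link load stays bounded while the depth stays logarithmic.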
-
Consensus algorithms play a central role in many distributed systems, including blockchains. The most practical consensus algorithms are based on the partial synchrony model. While partially synchronous protocols are relatively simple, they cannot maintain liveness when the message delivery latency is uncertain. Asynchronous protocols do not rely on bounded latency to maintain liveness, but they are much more difficult to understand and implement, being usually described as a composition of several layers of algorithms. Moreover, due to the FLP impossibility theorem, they provide only probabilistic liveness guarantees. These factors make the correctness of asynchronous protocols challenging to verify, and liveness bugs in these protocols have remained unnoticed for years. We introduce SureDistrib, a formal framework for specifying and verifying probabilistic safety and liveness properties of asynchronous distributed protocols. Our framework supports specifying probabilistic algorithms that depend on other probabilistic functionalities, such as binary agreement algorithms that depend on common coins. We define refinement relations for such systems and prove composition lemmas that replace the underlay functionality with an implementation, so that the composed system refines the original system with an abstract underlay. Based on our framework, we give the first mechanized proof that an asynchronous Byzantine fault-tolerant binary agreement algorithm terminates with probability 1 ("almost-sure termination").
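The almost-sure-termination argument can be illustrated with a toy, fault-free common-coin agreement loop. This is a deliberately simplified sketch, not the Byzantine fault-tolerant protocol verified in the paper: each coin round reaches unanimity, so the expected number of rounds is bounded even though no fixed round bound exists.

```python
import random

def binary_agreement(inputs, seed=0):
    """Toy common-coin binary agreement (fault-free, for intuition only).

    Each round, nodes exchange values; if one value holds a strict
    majority, everyone decides it. Otherwise all nodes adopt a shared
    common-coin flip, so every coin round produces unanimity and the loop
    terminates with probability 1 (almost-sure termination)."""
    rng = random.Random(seed)       # stands in for the common coin
    values = list(inputs)
    rounds = 0
    while True:
        rounds += 1
        for v in (0, 1):
            if values.count(v) > len(values) / 2:
                return v, rounds    # decided value, rounds used
        coin = rng.randrange(2)     # the same flip is seen by every node
        values = [coin] * len(values)

decision, rounds = binary_agreement([0, 1, 0, 1], seed=7)
# a split input falls through to the coin, then decides in round 2
```

The FLP-style obstacle shows up in the lack of a deterministic round bound: only the probability of running forever is zero, which is exactly the kind of property the framework's probabilistic liveness proofs address.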
-
Inference for high-dimensional models is challenging, as regular asymptotic theories are not applicable. This paper proposes a new framework for simultaneous estimation and inference in high-dimensional linear models. By smoothing over partial regression estimates based on a given variable selection scheme, we reduce the problem to a low-dimensional least squares estimation. The procedure, termed Selection-assisted Partial Regression and Smoothing (SPARES), utilizes data splitting along with variable selection and partial regression. We show that the SPARES estimator is asymptotically unbiased and normal, and derive its variance via a nonparametric delta method. The utility of the procedure is evaluated under various simulation scenarios and via comparisons with the de-biased LASSO estimators, a major competitor. We apply the method to analyze two genomic datasets and obtain biologically meaningful results.
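The split/select/refit pipeline can be sketched on synthetic data. The selection rule below (marginal correlation screening) and all dimensions are illustrative stand-ins, not the SPARES scheme itself, and the smoothing and variance-estimation steps are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.2]            # three truly active variables
y = X @ beta + rng.standard_normal(n)

# Data splitting: one half drives variable selection, the other half is
# held out for the low-dimensional least-squares fit.
half = n // 2
X_sel, y_sel = X[:half], y[:half]
X_fit, y_fit = X[half:], y[half:]

# Stand-in selection rule (marginal correlation screening, not the paper's
# scheme): keep variables whose sample correlation with y is large.
scores = np.abs(X_sel.T @ y_sel) / half
selected = np.flatnonzero(scores > 0.5)

# Partial regression: ordinary least squares on the held-out half,
# restricted to the selected variables, now a low-dimensional problem.
beta_hat, *_ = np.linalg.lstsq(X_fit[:, selected], y_fit, rcond=None)
```

Because selection and fitting use disjoint halves of the data, the refit coefficients are not contaminated by the selection step, which is what makes ordinary low-dimensional asymptotics usable again.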
-
Noisy matrix completion aims at estimating a low-rank matrix given only partial and corrupted entries. Despite remarkable progress in designing efficient estimation algorithms, it remains largely unclear how to assess the uncertainty of the obtained estimates and how to perform efficient statistical inference on the unknown matrix (e.g., constructing a valid and short confidence interval for an unseen entry). This paper takes a substantial step toward addressing such tasks. We develop a simple procedure to compensate for the bias of the widely used convex and nonconvex estimators. The resulting debiased estimators admit nearly precise nonasymptotic distributional characterizations, which in turn enable optimal construction of confidence intervals/regions for, say, the missing entries and the low-rank factors. Our inferential procedures do not require sample splitting, thus avoiding unnecessary loss of data efficiency. As a byproduct, we obtain a sharp characterization of the estimation accuracy of our debiased estimators in both rate and constant. Our debiased estimators are tractable algorithms that provably achieve full statistical efficiency.
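One-step debiasing of a matrix-completion estimate can be sketched on synthetic data. The spectral estimator and the single debiasing step below are a simplified stand-in in the spirit of the paper's procedure, with invented parameters, not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, p_obs, sigma = 50, 2, 0.5, 0.1       # illustrative sizes
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
M = U @ V.T                                # ground-truth rank-r matrix
mask = rng.random((n, n)) < p_obs          # Bernoulli(p_obs) observations
Y = np.where(mask, M + sigma * rng.standard_normal((n, n)), 0.0)

def rank_r(A, r):
    """Best rank-r approximation of A via a truncated SVD."""
    Uh, s, Vh = np.linalg.svd(A, full_matrices=False)
    return Uh[:, :r] @ np.diag(s[:r]) @ Vh[:r, :]

# Initial estimate: rescale observed entries by 1/p_obs, truncate the SVD.
Z = rank_r(Y / p_obs, r)

# One-step debiasing: add back the rescaled residual on the observed
# entries, then project to rank r again.
Z_debiased = rank_r(Z + mask * (Y - Z) / p_obs, r)

rel_err = np.linalg.norm(Z_debiased - M) / np.linalg.norm(M)
```

The residual correction undoes the shrinkage that the low-rank projection imposes on individual entries, which is the mechanism that makes entrywise distributional characterizations (and hence confidence intervals) possible.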