Title: Bayesian updating with adaptive, uncertainty-informed subset simulations: High-fidelity updating with multiple observations
Award ID(s): 1762918, 2000156
PAR ID: 10442299
Author(s) / Creator(s):
Date Published:
Journal Name: Reliability Engineering & System Safety
Volume: 230
Issue: C
ISSN: 0951-8320
Page Range / eLocation ID: 108901
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Many service systems provide customers with information about the system so that they can make an informed decision about whether to join. Often this information is provided in the form of an update: the information about the system is refreshed periodically in increments of size Δ. It is known that these updates can cause oscillations in the resulting dynamics; however, explicitly characterizing the size of these oscillations when they occur has remained an open problem. In this paper, we solve this open problem and show how to exactly calculate the amplitude of these oscillations via a fixed point equation. We also derive closed-form approximations via Taylor expansions of the fixed point equation and show that these approximations are very accurate, especially when Δ is large. Our analysis provides new insight for systems that use updates as a way of disseminating information to customers.
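The fixed-point characterization described above lends itself to a simple numerical treatment. The amplitude equation itself is not reproduced in this record, so the sketch below uses a toy contraction as a stand-in; the solver function and its parameters are illustrative assumptions, not the paper's method.

```python
import numpy as np

def solve_fixed_point(g, x0, tol=1e-10, max_iter=10_000):
    """Solve x = g(x) by plain fixed-point iteration.

    Hypothetical stand-in for the amplitude equation in the
    abstract; the paper's actual map is not reproduced here.
    """
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("fixed-point iteration did not converge")

# Toy example: x = cos(x) has a unique fixed point near 0.739.
amplitude = solve_fixed_point(np.cos, x0=1.0)
print(f"fixed point: {amplitude:.6f}")
```

A Taylor expansion of g around the fixed point would give the kind of closed-form approximation the abstract mentions, trading exactness for an explicit expression.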
  2. Traditionally, clustered federated learning groups clients with the same data distribution into a cluster, so that every client is uniquely associated with one data distribution and helps train a model for this distribution. We relax this hard association assumption to soft clustered federated learning, which allows every local dataset to follow a mixture of multiple source distributions. We propose FedSoft, which trains both locally personalized models and high-quality cluster models in this setting. FedSoft limits client workload by using proximal updates to require the completion of only one optimization task from a subset of clients in every communication round. We show, analytically and empirically, that FedSoft effectively exploits similarities between the source distributions to learn personalized and cluster models that perform well. 
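The proximal update that FedSoft assigns to each selected client can be sketched generically: minimize the local loss plus a quadratic penalty that keeps the iterate near the cluster model. The objective, step sizes, and helper names below are assumptions for illustration, not FedSoft's actual implementation.

```python
import numpy as np

def proximal_update(w_local, w_cluster, grad_fn, mu=0.1, lr=0.01, steps=500):
    """One proximal optimization task: minimize
        f(w) + (mu / 2) * ||w - w_cluster||^2
    by gradient descent, where grad_fn returns the gradient of the
    local loss f. Hypothetical sketch of the proximal step described
    in the abstract, not FedSoft's actual code.
    """
    w = w_local.copy()
    for _ in range(steps):
        g = grad_fn(w) + mu * (w - w_cluster)  # loss gradient + proximal pull
        w -= lr * g
    return w

# Toy usage: quadratic local loss f(w) = 0.5 * ||w - target||^2,
# whose proximal minimizer is (target + mu * w_cluster) / (1 + mu).
target = np.array([1.0, -2.0])
w_cluster = np.zeros(2)
w_new = proximal_update(np.zeros(2), w_cluster, lambda w: w - target, mu=0.5)
print(w_new)  # pulled toward target but regularized toward w_cluster
```

The quadratic penalty is what limits client workload: each selected client completes only this one regularized task per communication round rather than training to convergence.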