Title: A dynamic process reference model for sparse networks with reciprocity
Many social and other networks exhibit stable size-scaling relationships, such that quantities such as mean degree or reciprocation rates change slowly or are approximately constant as the number of vertices increases. Statistical network models built on top of simple Bernoulli baseline (or reference) measures often behave unrealistically in this respect, leading to the development of sparse reference models that preserve features such as mean degree scaling. In this paper, we generalize recent work on the micro-foundations of such reference models to the case of sparse directed graphs with non-vanishing reciprocity, providing a dynamic process interpretation of the emergence of stable macroscopic behavior.
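The scaling behavior at issue can be illustrated with a toy dyad-level sampler. This is a sketch, not the paper's model: the rates alpha and mu are illustrative. Holding the per-dyad mutual and asymmetric probabilities at mu/n and alpha/n keeps both mean degree and the reciprocated fraction of edges approximately constant as n grows, whereas an i.i.d. Bernoulli baseline of the same density would have reciprocity vanishing like 1/n.

```python
import numpy as np

def sparse_reciprocity_graph(n, alpha=4.0, mu=2.0, rng=None):
    """Sample a sparse directed graph from a dyad-independent reference
    measure.  Each unordered pair {i, j} is, independently:
      - mutual      with prob mu / n,
      - asymmetric  with prob alpha / n (direction chosen uniformly),
      - null        otherwise.
    Mean degree -> alpha/2 + mu and the reciprocated edge fraction
    -> 2*mu / (alpha + 2*mu) as n grows.  (alpha, mu are illustrative
    rate parameters, not values from the paper.)"""
    rng = np.random.default_rng(rng)
    A = np.zeros((n, n), dtype=np.int8)
    iu, ju = np.triu_indices(n, k=1)
    u = rng.random(iu.size)
    mutual = u < mu / n
    asym = (u >= mu / n) & (u < (mu + alpha) / n)
    # mutual dyads: both directed edges present
    A[iu[mutual], ju[mutual]] = 1
    A[ju[mutual], iu[mutual]] = 1
    # asymmetric dyads: flip a fair coin for direction
    flip = rng.random(int(asym.sum())) < 0.5
    ai, aj = iu[asym], ju[asym]
    A[np.where(flip, ai, aj), np.where(flip, aj, ai)] = 1
    return A

A = sparse_reciprocity_graph(2000, alpha=4.0, mu=2.0, rng=1)
edges = A.sum()
mean_degree = edges / A.shape[0]     # ~ alpha/2 + mu = 4, stable in n
recip = (A * A.T).sum() / edges      # ~ 2*mu/(alpha+2*mu) = 0.5
```

Re-running with larger n leaves both statistics essentially unchanged, which is the macroscopic stability the reference model is built to reproduce.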
Award ID(s): 1826589, 1361425, 1939237
NSF-PAR ID: 10184450
Journal Name: The Journal of Mathematical Sociology
ISSN: 0022-250X
Page Range / eLocation ID: 1 to 27
Sponsoring Org: National Science Foundation
More Like this
  1. The raindrop size distribution (DSD) is vital for applications such as quantitative precipitation estimation, understanding of microphysical processes, and validation and improvement of two-moment bulk microphysical schemes. We trace the history of DSD representation and its linkage to polarimetric radar observables, from functional forms (exponential, gamma, and generalized gamma models) to normalizations (un-normalized, and single- or double-moment scaling normalized). The four-parameter generalized gamma model is a good candidate for an optimal representation of DSD variability. Disdrometer observations from Montreal, Canada, were found to exhibit five archetypical shapes, consisting of drizzle, larger precipitation drops, and an 'S'-shaped curvature that frequently occurs between the drizzle and the larger-sized precipitation. Similar 'S'-shaped DSDs were reproduced by combining disdrometric measurements of small-sized drops from an optical array probe and of large-sized drops from a 2D video disdrometer (2DVD). A unified theory based on the double-moment scaling normalization is described. The theory assumes a multiple power-law relation among the moments; DSDs are scaling-normalized by two characteristic parameters, each expressed as a combination of any two moments. The normalized DSDs are remarkably stable, so the mean underlying shape is fitted with the generalized gamma model, from which the two 'optimized' shape parameters are obtained. The other moments of the distribution are then obtained as products of power laws of the reference moments M3 and M6 together with the two shape parameters. These reference moments can come from dual-polarimetric measurements: M6 from the attenuation-corrected reflectivity, and M3 from the attenuation-corrected differential reflectivity and the specific differential propagation phase. Thus all the moments of the distribution can be calculated, and the microphysical evolution of the DSD can be inferred. This is one of the major findings of this article.
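As a rough illustration of the moment relations described above, using a plain gamma DSD rather than the article's generalized gamma form and made-up values of N0, mu, and lam, any moment can be recovered from the two reference moments M3 and M6 once the shape parameter is fixed:

```python
import math

def gamma_dsd_moment(k, N0, mu, lam):
    """k-th moment of a gamma DSD  N(D) = N0 * D**mu * exp(-lam*D),
    integrated over 0 < D < inf:
        M_k = N0 * Gamma(mu + k + 1) / lam**(mu + k + 1).
    Illustrative simplification of the generalized gamma model."""
    return N0 * math.gamma(mu + k + 1) / lam ** (mu + k + 1)

# hypothetical DSD parameters (not fitted to any dataset)
mu, N0, lam = 3.0, 8000.0, 4.0
M3, M6 = (gamma_dsd_moment(k, N0, mu, lam) for k in (3, 6))

# recover lam and N0 from the two reference moments, assuming the
# shape mu is known (the 'optimized' shape in the text):
#   M6/M3 = Gamma(mu+7)/Gamma(mu+4) / lam**3
lam_hat = (math.gamma(mu + 7) / math.gamma(mu + 4) * M3 / M6) ** (1 / 3)
N0_hat = M3 * lam_hat ** (mu + 4) / math.gamma(mu + 4)

# any other moment now follows from the two reference moments alone:
M4_hat = gamma_dsd_moment(4, N0_hat, mu, lam_hat)
```

This mirrors the role of M3 (from differential reflectivity and specific differential phase) and M6 (from reflectivity) as the two dual-polarimetric reference moments from which the rest of the spectrum is reconstructed.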
  2. Low C-rate charge and discharge experiments, plus complementary differential-voltage or differential-capacity analysis, are among the most common battery characterization methods. Here, we adapt the multi-species, multi-reaction (MSMR) half-cell thermodynamic model to low C-rate cycling of whole-cell Li-ion batteries. MSMR models for the anode and cathode are coupled through whole-cell charge balances and cell-cycling voltage constraint equations, forming the basis for model-based estimation of MSMR half-cell parameters from whole-cell experimental data. Emergent properties of the whole cell, like slippage of the anode and cathode lithiation windows, are also computed as cells cycle and degrade. A sequential least-squares optimization scheme is used for parameter estimation from low C-rate cycling data of Samsung 18650 NMC∣C cells. Low-error fits of the open-circuit cell voltage (e.g., under 5 mV mean absolute error for charge or discharge curves) and of differential voltage curves are achieved for fresh and aged cells. We explore the features (and limitations) of using literature reference values for the MSMR half-cell thermodynamic parameters (reducing our whole-cell formulation to a one-degree-of-freedom fit) and demonstrate the benefits of expanding the degrees of freedom by letting the MSMR parameters be tailored to the cell under test, within a constrained neighborhood of the half-cell reference values. Bootstrap analysis is performed on each dataset to show the robustness of our fitting to experimental noise and data sampling over the course of 600 cell cycles. The results show which specific MSMR insertion reactions are most responsible for capacity loss in each half-cell and the collective interactions that lead to whole-cell slippage and changes in usable capacity. Open-source software is made available to easily extend this model-based analysis to other labs and battery chemistries.
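The MSMR half-cell relation underlying these fits can be sketched as follows. The gallery parameters (X_j, U0_j, omega_j) below are illustrative placeholders, not values estimated from the Samsung cells:

```python
import numpy as np

F, R, T = 96485.33, 8.314, 298.15  # C/mol, J/(mol K), K
f = F / (R * T)

def msmr_occupancy(U, X, U0, omega):
    """MSMR half-cell model: fractional lithiation at potential U (V),
    summed over insertion reactions j, each with capacity fraction X_j,
    standard potential U0_j, and non-ideality factor omega_j:
        x_j = X_j / (1 + exp(f * (U - U0_j) / omega_j)),  f = F/(R*T).
    Parameter values used below are illustrative, not fitted."""
    U = np.atleast_1d(U)[:, None]
    return (X / (1.0 + np.exp(f * (U - U0) / omega))).sum(axis=1)

# two hypothetical graphite-like insertion reactions
X = np.array([0.6, 0.4])        # capacity fractions (sum to 1)
U0 = np.array([0.12, 0.22])     # V vs Li/Li+
omega = np.array([1.0, 3.0])

U_grid = np.linspace(0.01, 0.5, 200)
x = msmr_occupancy(U_grid, X, U0, omega)   # monotone in U
```

Fitting such a sum of sigmoidal galleries to each electrode, coupled through the whole-cell charge balance, is what lets the analysis attribute capacity loss to specific insertion reactions.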
  3. Drought, coupled with rising temperatures, is an emerging threat to many forest types across the globe. At least to a degree, we expect management actions that reduce competition (e.g., thinning, prescribed fire, or both) to improve the growth of residual trees during drought. The influence of management actions and drought on individual tree growth may be measured with high precision using tree rings. Here, we summarize tree-ring-based assessments of the effectiveness of thinning and prescribed fire as drought-adaptation tools, with special consideration for how these findings might apply to dry coniferous forests in the southwestern United States. The existing literature suggests that thinning treatments generally improve individual tree growth responses to drought, though the literature specific to southwestern coniferous forests is sparse. Assessments from studies beyond the southwestern United States indicate that treatment effectiveness varies by thinning intensity, the timing of the drought relative to treatment, and individualistic species responses. Several large-scale studies appear to conflict on the specifics of how site aridity influences sensitivity to drought following thinning. The effect of prescribed fire in the absence of thinning has received much less attention in terms of subsequent drought response. There are limitations to using tree-ring data to estimate drought responses (e.g., difficulties scaling up observations to stand and landscape levels). However, tree rings describe an important dimension of drought effects for individual trees and, when coupled with additional information such as stable isotopes, aid our understanding of the key physiological mechanisms that underlie forest drought response.
  4. We consider the problem of finding nearly optimal solutions of optimization problems with random objective functions. Such problems arise widely in the theory of random graphs, theoretical computer science, and statistical physics. Two concrete problems we consider are (a) optimizing the Hamiltonian of a spherical or Ising p-spin glass model, and (b) finding a large independent set in a sparse Erdős-Rényi graph. Two families of algorithms are considered: (a) low-degree polynomials of the input, a general framework that captures methods such as approximate message passing and local algorithms on sparse graphs, among others; and (b) the Langevin dynamics algorithm, a canonical Monte Carlo analogue of the gradient descent algorithm (applicable only to the spherical p-spin glass Hamiltonian). We show that neither family of algorithms can produce nearly optimal solutions with high probability. Our proof uses the fact that both models are known to exhibit a variant of the overlap gap property (OGP) of near-optimal solutions: for both models, every two solutions whose objective values are above a certain threshold are either close to or far from each other. The crux of our proof is the stability of both algorithms: a small perturbation of the input induces a small perturbation of the output. By an interpolation argument, such a stable algorithm cannot overcome the OGP barrier. The stability of the Langevin dynamics is an immediate consequence of the well-posedness of stochastic differential equations, while the stability of low-degree polynomials is established using concepts from Gaussian and Boolean Fourier analysis, including noise sensitivity, hypercontractivity, and total influence.
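A minimal discretization of Langevin dynamics on a spherical 3-spin Hamiltonian might look like the following sketch: Euler steps of gradient-ascent drift plus Gaussian noise, re-projected onto the sphere. The normalization, temperature, and step sizes are illustrative choices, not those of the analysis in the paper.

```python
import numpy as np

def h_and_grad(J, sigma):
    """Energy H(sigma) = <J, sigma x sigma x sigma> / n for a symmetric
    3-tensor J and sigma on the sphere |sigma| = sqrt(n); for symmetric
    J the gradient is grad_i = (3/n) * sum_{j,k} J_ijk sigma_j sigma_k."""
    n = sigma.size
    Js = np.tensordot(J, sigma, axes=([2], [0]))   # (Js)_ij = sum_k J_ijk s_k
    return sigma @ Js @ sigma / n, 3.0 * (Js @ sigma) / n

def langevin_ascent(J, beta=2.0, dt=0.005, steps=400, seed=0):
    """Illustrative discretized Langevin dynamics (not the continuous-
    time process analyzed in the paper): drift up the gradient of H,
    add Brownian noise, and project back onto the sphere."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    s = rng.standard_normal(n)
    s *= np.sqrt(n) / np.linalg.norm(s)            # radius sqrt(n)
    for _ in range(steps):
        _, g = h_and_grad(J, s)
        s = s + dt * beta * g + np.sqrt(2.0 * dt) * rng.standard_normal(n)
        s *= np.sqrt(n) / np.linalg.norm(s)        # stay on the sphere
    return s

n = 30
rng = np.random.default_rng(1)
J = rng.standard_normal((n, n, n))
J = sum(J.transpose(p) for p in                    # symmetrize the couplings
        [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6.0
sigma = langevin_ascent(J)
H_final, _ = h_and_grad(J, sigma)
```

The stability property the proof relies on is visible here: perturbing J slightly perturbs every drift term slightly, so the output changes little, and an interpolation argument shows such a stable trajectory cannot jump across the overlap gap.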
  5. A major uncertainty in reconstructing historical sea surface temperature (SST) before the 1990s involves correcting for systematic offsets associated with bucket and engine-room intake temperature measurements. A recent study used a linear scaling of coastal station-based air temperatures (SATs) to infer nearby SSTs, but the physics of the coupling between SATs and SSTs generally gives rise to more complex regional air–sea temperature differences. In this study, an energy-balance model (EBM) of air–sea thermal coupling is adapted for predicting near-coast SSTs from coastal SATs. The model is shown to be more skillful than linear-scaling approaches through cross-validation analyses using instrumental records after the 1960s and CMIP6 simulations between 1880 and 2020. The improved skill primarily comes from capturing features of the air–sea heat fluxes that dominate temperature variability at high latitudes, including the damping of high-frequency wintertime SAT variability and the phase lag between SSTs and SATs. Inferred near-coast SSTs allow for intercalibrating coastal SAT and SST measurements at a variety of spatial scales. The 1900–40 mean offset between the latest SST estimates available from the Met Office (HadSST4) and SAT-inferred SSTs ranges between −1.6°C (95% confidence interval: [−1.7°C, −1.4°C]) and 1.2°C ([0.8°C, 1.6°C]) across 10° × 10° grids. When further averaged along the global coastline, HadSST4 is significantly colder than SAT-inferred SSTs, by 0.20°C ([0.07°C, 0.35°C]), over 1900–40. These results indicate that historical SATs and SSTs involve substantial inconsistencies at both regional and global scales. Major outstanding questions involve the distribution of errors between our intercalibration model and the instrumental records of SAT and SST, as well as the degree to which coastal intercalibrations are informative of global trends.
Significance Statement: To evaluate the consistency of instrumental surface temperature estimates before the 1990s, we develop a coupled energy-balance model to intercalibrate measurements of sea surface temperature (SST) and station-based air temperature (SAT) near global coasts. Our model captures geographically varying physical regimes of air–sea coupling and outperforms existing methods in inferring regional SSTs from SAT measurements. When applied to historical temperature records, the model indicates significant discrepancies between inferred and observed SSTs at both global and regional scales before the 1960s. Our findings suggest remaining data issues in historical temperature archives and opportunities for further improvements.
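The two physical features credited for the EBM's skill, damping of fast SAT variability and a phase lag in SST, can be illustrated with a one-parameter relaxation model. The relaxation time tau below is an arbitrary illustrative mixed-layer timescale, not the study's fitted parameter:

```python
import numpy as np

def ebm_sst_from_sat(T_air, tau=60.0, dt=1.0, T0=0.0):
    """Toy mixed-layer energy balance:  dT_sea/dt = (T_air - T_sea)/tau.
    SST responds to SAT as a first-order low-pass filter: fast SAT
    variability is damped, and SST lags SAT.  Forward-Euler steps of
    dt days; tau (days) is an illustrative relaxation time."""
    T_sea = np.empty_like(T_air)
    t = T0
    for k, Ta in enumerate(T_air):
        t += dt * (Ta - t) / tau
        T_sea[k] = t
    return T_sea

days = np.arange(0, 5 * 365.0)                   # five years, daily steps
sat = np.cos(2 * np.pi * days / 365.0)           # seasonal SAT cycle, amplitude 1
sst = ebm_sst_from_sat(sat, tau=60.0)

amp = sst[-365:].max()                           # damped SST amplitude (< 1)
lag = days[-365:][np.argmax(sst[-365:])] % 365   # SST phase lag behind SAT, days
```

A linear scaling of SAT cannot reproduce either the amplitude reduction or the lag, which is why a model of the air–sea coupling physics infers near-coast SSTs more skillfully.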