Title: Hypothetical Frequencies as Approximations
Hájek (Erkenntnis 70(2):211–235, 2009) argues that probabilities cannot be the limits of relative frequencies in counterfactual infinite sequences. I argue for a different understanding of these limits, drawing on Norton’s (Philos Sci 79(2):207–232, 2012) distinction between approximations (inexact descriptions of a target) and idealizations (separate models that bear analogies to the target). Then, I adapt Hájek’s arguments to this new context. These arguments provide excellent reasons not to use hypothetical frequencies as idealizations, but no reason not to use them as approximations.
Award ID(s): 2043089
PAR ID: 10464375
Author(s) / Creator(s):
Date Published:
Journal Name: Erkenntnis
ISSN: 0165-0106
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract: We propose a new approach to deriving quantitative mean field approximations for any probability measure $$P$$ on $$\mathbb{R}^{n}$$ with density proportional to $$e^{f(x)}$$, for $$f$$ strongly concave. We bound the mean field approximation for the log partition function $$\log \int e^{f(x)}dx$$ in terms of $$\sum_{i \neq j}\mathbb{E}_{Q^{*}}|\partial_{ij}f|^{2}$$, for a semi-explicit probability measure $$Q^{*}$$ characterized as the unique mean field optimizer, or equivalently as the minimizer of the relative entropy $$H(\cdot\,|\,P)$$ over product measures. This notably does not involve the metric-entropy or gradient-complexity concepts that are common in prior work on nonlinear large deviations. Three implications are discussed, in the contexts of continuous Gibbs measures on large graphs, high-dimensional Bayesian linear regression, and the construction of decentralized near-optimizers in high-dimensional stochastic control problems. Our arguments are based primarily on functional inequalities and the notion of displacement convexity from optimal transport.
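To see the objects in the abstract above in a worked case, consider a Gaussian target, where everything is in closed form. The following sketch is our own illustration (not the paper's general bound): for f(x) = −x·Ax/2 with A positive definite, both the log partition function and the mean field (best product-measure) value of the Gibbs variational objective are explicit, and their gap is nonnegative by Hadamard's inequality. The 2×2 matrix A is a hypothetical example.

```python
import math

# Hypothetical 2x2 precision matrix A = [[a11, a12], [a12, a22]],
# chosen positive definite so f(x) = -x.Ax/2 is strongly concave.
a11, a12, a22 = 2.0, 0.5, 1.0
det_A = a11 * a22 - a12 * a12
n = 2

# True log partition function: log ∫ exp(-x.Ax/2) dx
#   = (n/2) log 2π - (1/2) log det A
log_Z = 0.5 * n * math.log(2 * math.pi) - 0.5 * math.log(det_A)

# The mean field optimizer Q* is the product of N(0, 1/a_ii); its value
# of the Gibbs variational objective E_Q f + H(Q) is
#   (n/2) log 2π - (1/2) (log a11 + log a22)
mean_field = 0.5 * n * math.log(2 * math.pi) - 0.5 * (math.log(a11) + math.log(a22))

# Nonnegative because det A <= a11 * a22 (Hadamard's inequality)
gap = log_Z - mean_field
print(f"log Z = {log_Z:.4f}, mean field = {mean_field:.4f}, gap = {gap:.4f}")
```

Note that here the coupling term $$\sum_{i \neq j}|\partial_{ij}f|^{2} = 2a_{12}^{2}$$ is constant, and the gap shrinks as the off-diagonal entry shrinks, in the spirit of the bound described in the abstract.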
  2. Atomic Force Microscopy (AFM) force-distance (FD) experiments have emerged as an attractive alternative to traditional micro-rheology measurement techniques owing to their versatility of use in materials of a wide range of mechanical properties. Here, we show that the range of time-dependent behaviour which can reliably be resolved from the typical method of FD inversion (fitting constitutive FD relations to FD data) is inherently restricted by the experimental parameters: sampling frequency, experiment length, and strain rate. Specifically, we demonstrate that violating these restrictions can result in errors in the values of the parameters of the complex modulus. In the case of complex materials, such as cells, whose behaviour is not specifically understood a priori, the physical sensibility of these parameters cannot be assessed and may lead to falsely attributing a physical phenomenon to an artifact of the violation of these restrictions. We use arguments from information theory to understand the nature of these inconsistencies as well as to devise limits on the range of mechanical parameters which can be reliably obtained from FD experiments. The results further demonstrate that the nature of these restrictions depends on the domain (time or frequency) used in the inversion process, with the time domain being far more restrictive than the frequency domain. Finally, we demonstrate how to use these restrictions to better design FD experiments to target specific timescales of a material's behaviour through our analysis of a polydimethylsiloxane (PDMS) polymer sample.
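The restrictions described above can be given a back-of-envelope form. The sketch below is our own hedged illustration (not the paper's information-theoretic derivation): the band of material timescales an FD experiment can hope to resolve is bounded below by the sampling period (a Nyquist-style floor) and above by the experiment length. Both parameter values are hypothetical.

```python
# Hypothetical experimental parameters for illustration only
sampling_frequency_hz = 2000.0  # assumed sampling rate
experiment_length_s = 1.0       # assumed experiment duration

# Shortest resolvable timescale: two samples per period (Nyquist-style floor)
shortest_timescale_s = 2.0 / sampling_frequency_hz
# Longest resolvable timescale: the full record length
longest_timescale_s = experiment_length_s

print(f"resolvable timescales: ~{shortest_timescale_s:.3f} s "
      f"to ~{longest_timescale_s:.0f} s")
```

A fitted relaxation time falling outside this band would be exactly the kind of parameter whose physical sensibility, per the abstract, cannot be assessed.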
  3. Jakarta, Indonesia’s capital, is increasingly characterized by luxury real estate developments and high-profile infrastructural projects made possible by economic liberalization and finance capital. Yet these developments have contributed to Jakarta’s struggles with chronic flooding, land subsidence, and water shortages. This paper contributes an empirical study of the spatial-temporal dynamics of speculative urbanism and the associated impacts on water resources and flood events in Jakarta. I use an urban political ecology approach to analyze mainland and offshore development. First, I show how financial speculation generates flood risk and the overexploitation of water resources, producing uneven socio-spatial distributions of risk. These transformations in Jakarta’s hydroscape in turn threaten to undermine the city’s viability as a site for speculative investment. I thus show how speculative urbanism can be threatened or disrupted by nonhuman agencies. Second, I illustrate a second form of speculation, which I refer to as environmental speculation. As Jakarta’s water crisis has cast doubt on the future of the city itself as a place of habitation, the state explored an ambitious and potentially lucrative coastal defense project, while private developers have engaged in land reclamation. The turn toward offshore development illustrates how environmental speculation creates new opportunities for capital accumulation. I advance two arguments: first, in order to capture the full costs of speculative urbanism, it is imperative that urban scholars attend to its ecological dimensions. Second, an urban political ecology approach advances our understandings of speculative urbanism by illuminating its contradictions and limits. 
  4. Abstract: Radio interferometers aiming to measure the power spectrum of the redshifted 21 cm line during the Epoch of Reionization (EoR) need to achieve an unprecedented dynamic range to separate the weak signal from overwhelming foreground emissions. Calibration inaccuracies can compromise the sensitivity of these measurements to the point that a detection of the EoR is precluded. An alternative to standard analysis techniques makes use of the closure phase, which allows one to bypass antenna-based direction-independent calibration. Similarly to standard approaches, we use a delay spectrum technique to search for the EoR signal. Using 94 nights of data observed with Phase I of the Hydrogen Epoch of Reionization Array (HERA), we place approximate constraints on the 21 cm power spectrum at z = 7.7. We find at 95 per cent confidence that the 21 cm EoR brightness temperature is ≤ (372)² ‘pseudo’ mK² at 1.14 ‘pseudo’ h Mpc⁻¹, where the ‘pseudo’ emphasizes that these limits are to be interpreted as approximations to the actual distance scales and brightness temperatures. Using a fiducial EoR model, we demonstrate the feasibility of detecting the EoR with the full array. Compared to standard methods, the closure phase processing is relatively simple, thereby providing an important independent check on results derived using visibility intensities or related quantities.
  5. Amir Hashemi (Ed.)
    We present Hermite polynomial interpolation algorithms that, for a sparse univariate polynomial f with coefficients from a field, compute the polynomial from fewer points than the classical algorithms. If the interpolating polynomial f has t terms, our algorithms require argument/value triples (w^i, f(w^i), f'(w^i)) for i=0,...,t + ceiling( (t+1)/2 ) - 1, where w is randomly sampled and the probability of a correct output is determined from a degree bound for f. With f' we denote the derivative of f. Our algorithms generalize to multivariate polynomials, higher derivatives and sparsity with respect to Chebyshev polynomial bases. We have algorithms that can correct errors in the points by oversampling at a limited number of good values. If an upper bound B >= t for the number of terms is given, our algorithms use a randomly selected w and, with high probability, ceiling( t/2 ) + B triples, but then never return an incorrect output. The algorithms are based on Prony's sparse interpolation algorithm. While Prony's algorithm and its variants use fewer values, namely, 2t+1 and t+B values f(w^i), respectively, they need more arguments w^i. The situation mirrors that in algebraic error correcting codes, where the Reed-Solomon code requires fewer values than the multiplicity code, which is based on Hermite interpolation, but the Reed-Solomon code requires more distinct arguments. Our sparse Hermite interpolation algorithms can interpolate polynomials over finite fields and over the complex numbers, and from floating point data. Our Prony-based approach does not encounter the Birkhoff phenomenon of Hermite interpolation, when a gap in the derivative values causes multiple interpolants. We can interpolate from t+1 values of f and 2t-1 values of f'.
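The evaluation counts quoted in the abstract above are easy to tabulate. The sketch below (our own illustration, using only the formulas stated in the abstract) compares the sparse Hermite algorithms' triple count t + ceiling((t+1)/2) - 1 against classical Prony's 2t+1 values, separating total evaluations from distinct arguments:

```python
import math

def hermite_triples(t: int) -> int:
    # Triples (w^i, f(w^i), f'(w^i)) used by the sparse Hermite
    # interpolation algorithms: t + ceiling((t+1)/2) - 1
    return t + math.ceil((t + 1) / 2) - 1

def prony_values(t: int) -> int:
    # Values f(w^i) used by classical Prony interpolation: 2t + 1
    return 2 * t + 1

for t in (3, 10, 100):
    triples = hermite_triples(t)
    # Each triple contributes two evaluations (f and f') at one argument
    print(f"t={t}: {triples} triples = {2 * triples} evaluations at "
          f"{triples} distinct arguments, vs Prony's {prony_values(t)} "
          f"values at {prony_values(t)} distinct arguments")
```

This makes the trade-off in the abstract concrete: for t = 3 the Hermite approach needs only 4 distinct arguments w^i against Prony's 7, at the cost of more total evaluations, mirroring the Reed-Solomon versus multiplicity-code comparison.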