
Title: Why Flash Type Matters: A Statistical Analysis
Award ID(s):
Author(s) / Creator(s):
Date Published:
Journal Name:
Geophysical Research Letters
Page Range / eLocation ID:
9505 to 9512
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Distance is the most fundamental metric in spatial analysis and modeling. Planar distance and geodesic distance are the common distance measurements in current geographic information systems and geospatial analytic tools. However, there is little understanding of how to measure distance on a digital terrain surface, or of the uncertainty of that measurement. To fill this gap, this study applies a Monte‐Carlo simulation to evaluate seven surface‐adjustment methods for distance measurement on digital terrain models. Using parallel computing techniques and a memory optimization method, the processing time for the distance calculations over 6,000 simulated transects was reduced to a manageable level. The accuracy and computational efficiency of the surface‐adjustment methods were systematically compared across six study areas with varied terrain types and digital elevation models (DEMs) at different resolutions. Major findings indicate a trade‐off between measurement accuracy and computational efficiency: calculations on finer‐resolution DEMs improve measurement accuracy but increase processing times. Among the methods compared, the weighted‐average method achieves the highest accuracy and the second‐fastest processing time. Additionally, the choice of surface‐adjustment method has a greater impact on the accuracy of distance measurements in rougher terrain.
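The core idea of surface adjustment, that a path over terrain is longer than its planar projection, can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a transect sampled at equally spaced points from a DEM and simply sums segment lengths that combine horizontal spacing with elevation change:

```python
import math

def surface_distance(elevations, spacing):
    """Surface-adjusted length of a transect sampled from a DEM.

    elevations: terrain heights (metres) at equally spaced sample points
    spacing: horizontal distance between consecutive samples (metres)
    """
    total = 0.0
    for z0, z1 in zip(elevations, elevations[1:]):
        # Each segment combines the horizontal step and the elevation change.
        total += math.hypot(spacing, z1 - z0)
    return total

# A flat transect reduces to the planar distance (2 x 30 m = 60 m).
flat = surface_distance([100.0, 100.0, 100.0], 30.0)
# Rougher terrain lengthens the path, illustrating why the choice of
# adjustment method matters more as roughness increases.
rough = surface_distance([100.0, 130.0, 100.0], 30.0)
```

On flat input the function returns the planar length, while any elevation change strictly increases the result, which mirrors the paper's observation that surface adjustment matters most in rough terrain.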
  2. Welfare measures overall utility across a population, whereas malfare measures overall disutility, and the social planner’s problem can be cast either as maximizing the former or minimizing the latter. We show novel bounds on the expectations and tail probabilities of estimators of welfare, malfare, and regret of per-group (dis)utility values, where estimates are made from a finite sample drawn from each group. In particular, we consider estimating these quantities for individual functions (e.g., allocations or classifiers) with standard probabilistic bounds, and optimizing and bounding generalization error over hypothesis classes (i.e., we quantify overfitting) using Rademacher averages. We then study algorithmic fairness through the lens of sample complexity, finding that because marginalized or minority groups are often understudied, and fewer data are therefore available, the social planner is more likely to overfit to these groups, thus even models that seem fair in training can be systematically biased against such groups. We argue that this effect can be mitigated by ensuring sufficient sample sizes for each group, and our sample complexity analysis characterizes these sample sizes. Motivated by these conclusions, we present progressive sampling algorithms to efficiently optimize various fairness objectives. 
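The claim that sufficient per-group sample sizes mitigate overfitting to minority groups can be made concrete with a standard concentration bound. The sketch below is not the paper's Rademacher-average analysis; it uses the simpler two-sided Hoeffding inequality for a single fixed function, assuming utilities are bounded in [0, 1]:

```python
import math

def hoeffding_sample_size(epsilon, delta):
    """Samples needed so the empirical mean of a [0,1]-bounded utility
    is within epsilon of its expectation with probability >= 1 - delta,
    by the two-sided Hoeffding inequality: n >= ln(2/delta) / (2 eps^2)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

# The bound is independent of a group's population share: every group,
# however small, needs this many samples for the same guarantee.
n_per_group = hoeffding_sample_size(epsilon=0.05, delta=0.01)
```

Because the required `n` does not shrink with a group's size, a dataset sampled proportionally to the population systematically under-samples minority groups relative to this guarantee, which is the intuition behind the paper's per-group sample-complexity argument.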
  3. Multicellular organisms often start life as a single cell. Subsequent cell division builds the body. Each mutational event during those developmental cell divisions carries forward to all descendant cells. The overall number of mutant cells in the body follows the Luria–Delbrück process. This article first reviews the basic quantitative principles by which one can understand the likely number of mutant cells and the variation in mutational burden between individuals. A recent Fréchet distribution approximation simplifies likelihood calculations and aids intuitive understanding of the process. The second part of the article highlights consequences of somatic mutational mosaicism for understanding diseases such as cancer, neurodegeneration, and atherosclerosis.
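The Luria–Delbrück process described above is easy to simulate directly. The sketch below is an illustrative toy model, not the article's analysis: it assumes synchronous divisions from a single founding cell, a fixed per-daughter mutation probability, and faithful inheritance of mutations by all descendants:

```python
import numpy as np

def mutant_burden(generations, mu, rng):
    """One Luria-Delbruck-style realization: grow a body from a single
    cell by synchronous divisions and return the final mutant-cell count.
    Each daughter of a wild-type cell mutates independently with
    probability mu; mutant cells pass the mutation to all descendants."""
    wild, mutant = 1, 0
    for _ in range(generations):
        # Newly mutated daughters among the 2*wild wild-type daughters.
        new_mutants = rng.binomial(2 * wild, mu)
        mutant = 2 * mutant + new_mutants
        wild = 2 * wild - new_mutants
    return mutant

rng = np.random.default_rng(0)
# Most individuals carry few mutant cells, but a rare early mutation
# produces a "jackpot" individual with a huge burden: the heavy right
# tail that motivates extreme-value (Frechet) approximations.
burdens = [mutant_burden(30, 1e-8, rng) for _ in range(2000)]
```

Early mutations are rare but, when they occur, are doubled at every subsequent division, which is why the between-individual variation in burden is so much larger than for an ordinary binomial model.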