Search Results
Search for: All records (2 resources)
Author / Contributor
Ehret, Uwe (2)
Behrangi, Ali (1)
Ehsani, Mohammad Reza (1)
Gupta, Hoshin V. (1)
Knuth, Kevin H. (1)
Perdigão, Rui A. P. (1)
Roy, Tirthankar (1)
Sans-Fuentes, Maria A. (1)
Wang, Jingfeng (1)



We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) and Kernel Density (KD) methods. In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data-generating probability density function (pdf) into equal-probability-mass intervals. Whereas BC and KD each require optimal tuning of a hyperparameter whose value varies with sample size and shape of the pdf, QS only requires specification of the number of quantiles to be used. Results indicate, for the class of distributions tested, that the optimal number of quantiles is a fixed fraction of the sample size (empirically determined to be ~0.25–0.35), and that this value is relatively insensitive to distributional form or sample size. This provides a clear advantage over BC and KD, since hyperparameter tuning is not required. Further, unlike KD, there is no need to select an appropriate kernel type, so QS is applicable to pdfs of arbitrary shape, including those with discontinuous slope and/or magnitude. Bootstrapping is used to approximate the sampling-variability distribution of the resulting entropy estimate, and is shown to accurately reflect the true uncertainty. For the four distributional forms studied (Gaussian, Log-Normal, Exponential, and Bimodal Gaussian Mixture), expected estimation bias is less than 1% and uncertainty is low even for samples of as few as 100 data points; in contrast, for KD the small-sample bias can be as large as −10% and for BC as large as −50%. We speculate that estimating quantile locations, rather than bin probabilities, results in more efficient use of the information in the data to approximate the underlying shape of an unknown data-generating pdf.
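The estimator described above reduces to a short computation. The sketch below is reconstructed from the abstract only, not the authors' published code; the function name qs_entropy, the fixed quantile fraction alpha = 0.3, and the use of NumPy's quantile routine are assumptions.

import numpy as np

def qs_entropy(samples, alpha=0.3):
    """Illustrative Quantile Spacing (QS) entropy estimate, in nats.

    Approximates the data-generating pdf as piecewise-uniform between
    estimated quantiles that split the support into equal-probability-mass
    intervals, then returns the entropy of that approximation.
    (Sketch only: the name, alpha, and quantile estimator are assumptions.)
    """
    x = np.asarray(samples, dtype=float)
    n_q = max(2, int(alpha * x.size))        # number of equal-mass intervals (~0.25-0.35 of N)
    levels = np.linspace(0.0, 1.0, n_q + 1)  # probability levels 0, 1/n_q, ..., 1
    q = np.quantile(x, levels)               # estimated quantile locations
    dq = np.clip(np.diff(q), 1e-12, None)    # quantile spacings; guard against ties
    # Each interval carries mass 1/n_q over width dq[i], so its density there is
    # (1/n_q) / dq[i]; the entropy of the piecewise-uniform pdf is therefore
    # H ≈ (1/n_q) * sum_i log(n_q * dq[i]).
    return float(np.mean(np.log(n_q * dq)))

# Bootstrap sketch of the sampling-variability estimate described above:
rng = np.random.default_rng(0)
x = rng.normal(size=100)                     # 100 draws from a standard Gaussian
boot = [qs_entropy(rng.choice(x, size=x.size, replace=True)) for _ in range(500)]
print(np.mean(boot), np.std(boot))           # true value: 0.5*ln(2*pi*e) ≈ 1.419 nats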

Perdigão, Rui A. P.; Ehret, Uwe; Knuth, Kevin H.; Wang, Jingfeng (Water Resources Research)
Abstract: Entropy and Information are key concepts not only in Information Theory but also in Physics: historically in the fields of Thermodynamics, Statistical and Analytical Mechanics, and, more recently, in the field of Information Physics. In this paper we argue that Information Physics reconciles and generalizes statistical, geometric, and mechanistic views on information. We start by demonstrating how the use and interpretation of Entropy and Information coincide in Information Theory, Statistical Thermodynamics, and Analytical Mechanics, and how this can be taken advantage of when addressing Earth Science problems in general and hydrological problems in particular. In the second part we discuss how Information Physics provides ways to quantify Information and Entropy from fundamental physical principles. This extends their use to cases where the preconditions to calculate Entropy in the classical manner as an aggregate statistical measure are not met. Indeed, these preconditions are rarely met in the Earth Sciences, due either to limited observations or the far-from-equilibrium nature of evolving systems. Information Physics therefore offers new opportunities for improving the treatment of Earth Science problems.
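As a concrete illustration of how these uses of Entropy coincide (a standard identity, stated here for context rather than drawn from the paper itself), the Gibbs entropy of statistical thermodynamics and the Shannon entropy of information theory share a single functional form, differing only by Boltzmann's constant and the choice of logarithm base:

    H(p) = -\sum_i p_i \ln p_i, \qquad S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i = k_B \, H(p)

so thermodynamic entropy can be read as Shannon information about the system's microstate expressed in units of k_B (or k_B \ln 2 per bit when base-2 logarithms are used).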