Ultra-high-energy (UHE) photons are an important tool for studying the high-energy Universe. A plausible source of photons with EeV (exa-eV) energies is provided by UHE cosmic rays (UHECRs) undergoing the Greisen–Zatsepin–Kuzmin process (Greisen 1966; Zatsepin & Kuzmin 1966) or the pair-production process (Blumenthal 1970) on cosmic background radiation. In this context, EeV photons can be a probe of both UHECR mass composition and the distribution of their sources (Gelmini, Kalashev & Semikoz 2008; Hooper, Taylor & Sarkar 2011). At the same time, the possible flux of photons produced by UHE protons in the vicinity of their sources by pion photoproduction or inelastic nuclear collisions would be noticeable only for relatively nearby sources, as the attenuation length of UHE photons is smaller than that of UHE protons; see, for example, Bhattacharjee & Sigl (2000) for a review. There also exists a class of so-called top-down models of UHECR generation that efficiently produce UHE photons, for instance by the decay of heavy dark-matter particles (Berezinsky, Kachelriess & Vilenkin 1997; Kuzmin & Rubakov 1998) or by radiation from cosmic strings (Berezinsky, Blasi & Vilenkin 1998). The search for UHE photons was shown to be the most sensitive method more »
A Tight Lower Bound for Entropy Flattening
We study entropy flattening: given a circuit C_X implicitly describing an n-bit source X (namely, X is the output of C_X on a uniformly random input), construct another circuit C_Y describing a source Y such that (1) source Y is nearly flat (uniform on its support), and (2) the Shannon entropy of Y is monotonically related to that of X. The standard solution is to have C_Y evaluate C_X a total of Theta(n^2) times on independent inputs and concatenate the results (correctness follows from the asymptotic equipartition property). In this paper, we show that this is optimal among black-box constructions: any circuit C_Y for entropy flattening that repeatedly queries C_X as an oracle requires Omega(n^2) queries. Entropy flattening is a component used in the constructions of pseudorandom generators and other cryptographic primitives from one-way functions [Johan Håstad et al., 1999; John Rompel, 1990; Thomas Holenstein, 2006; Iftach Haitner et al., 2006; Iftach Haitner et al., 2009; Iftach Haitner et al., 2013; Iftach Haitner et al., 2010; Salil P. Vadhan and Colin Jia Zheng, 2012]. It is also used in reductions between problems complete for statistical zero-knowledge [Tatsuaki Okamoto, 2000; Amit Sahai and Salil P. Vadhan, 1997; Oded Goldreich et al., 1999; Vadhan, more »
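The flattening step in the abstract rests on the asymptotic equipartition property: concatenating many independent evaluations of a source makes the per-sample surprisal concentrate around the entropy. As a rough numerical illustration of that principle (a simulation of a simple Bernoulli source, not the paper's black-box circuit construction), one can check that the spread of the per-bit surprisal shrinks as the block length grows:

```python
import math
import random

def surprisal(bits, p):
    """-log2 of the probability of a bit string under i.i.d. Bernoulli(p)."""
    ones = sum(bits)
    return -(ones * math.log2(p) + (len(bits) - ones) * math.log2(1 - p))

def flatness_spread(p, t, trials=2000, seed=0):
    """Standard deviation of the per-bit surprisal over random t-bit blocks.

    By the asymptotic equipartition property, the surprisal of a block of
    t independent samples concentrates around t * H(p), so the concatenated
    source is nearly flat on its typical set; the per-bit spread shrinks
    roughly like 1/sqrt(t).
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        block = [1 if rng.random() < p else 0 for _ in range(t)]
        samples.append(surprisal(block, p) / t)
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return math.sqrt(var)

# Longer blocks are much flatter (smaller spread of per-bit surprisal):
print(flatness_spread(0.1, 10), flatness_spread(0.1, 1000))
```

The same concentration is what makes Theta(n^2) repetitions suffice in the standard construction; the paper's result is that no black-box construction can do asymptotically better.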
 Award ID(s):
 1749750
 Publication Date:
 NSFPAR ID:
 10081825
 Journal Name:
 33rd Computational Complexity Conference (CCC 2018)
 Volume:
 102
 Page Range or eLocationID:
 23:1–23:28
 ISSN:
 1868-8969
 Sponsoring Org:
 National Science Foundation
More Like this


The approximate degree of a Boolean function f is the least degree of a real polynomial that approximates f pointwise to error at most 1/3. The approximate degree of f is known to be a lower bound on the quantum query complexity of f (Beals et al., FOCS 1998 and J. ACM 2001). We find tight or nearly tight bounds on the approximate degrees and quantum query complexities of several basic functions. Specifically, we show the following. k-Distinctness: For any constant k, the approximate degree and quantum query complexity of the k-distinctness function is Ω(n^{3/4 − 1/(2k)}). This is nearly tight for large k, as Belovs (FOCS 2012) has shown that for any constant k, the approximate degree and quantum query complexity of k-distinctness is O(n^{3/4 − 1/(2^{k+2} − 4)}). Image size testing: The approximate degree and quantum query complexity of testing the size of the image of a function [n] → [n] is Ω~(n^{1/2}). This proves a conjecture of Ambainis et al. (SODA 2016), and it implies tight lower bounds on the approximate degree and quantum query complexity of the following natural problems. k-Junta testing: A tight Ω~(k^{1/2}) lower bound for k-junta testing, answering the main open question of Ambainis et al. (SODA 2016). Statistical distance from more »
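To make the definition in this abstract concrete on a tiny case (a toy grid search, not anything from the paper): the best degree-1 polynomial approximating AND on two bits achieves pointwise error 1/4 ≤ 1/3, so AND2 has approximate degree 1, while every degree-1 polynomial errs by at least 1/2 on two-bit XOR, so XOR2 has approximate degree 2. The step-0.25 grid below happens to contain the exact optima for both functions, and the matching analytic lower bounds confirm the reported values are exact.

```python
from itertools import product

def best_affine_error(f):
    """Minimum over affine polynomials p(x1, x2) = a + b*x1 + c*x2 of the
    maximum pointwise error |p(x) - f(x)| over {0,1}^2.

    Coarse grid search over coefficients; for AND and XOR the true optima
    (a, b, c) lie on multiples of 0.25, so the result is exact here.
    """
    grid = [i * 0.25 for i in range(-8, 9)]  # coefficients in [-2, 2]
    best = float("inf")
    for a, b, c in product(grid, repeat=3):
        err = max(abs(a + b * x1 + c * x2 - f(x1, x2))
                  for x1 in (0, 1) for x2 in (0, 1))
        best = min(best, err)
    return best

AND = lambda x, y: x & y
XOR = lambda x, y: x ^ y

print(best_affine_error(AND))  # 0.25 <= 1/3, so deg~(AND2) = 1
print(best_affine_error(XOR))  # 0.5  >  1/3, so deg~(XOR2) = 2
```

The XOR lower bound follows because p(0,0) + p(1,1) − p(1,0) − p(0,1) = 0 for every affine p, while the same signed sum of XOR values is −2, forcing some point to err by at least 1/2.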

Abstract Expert testimony varies in scientific quality, and jurors have a difficult time evaluating evidence quality (McAuliff et al., 2009). In the current study, we apply Fuzzy Trace Theory principles, examining whether visual and gist aids help jurors calibrate to the strength of scientific evidence. Additionally, we were interested in the role of jurors’ individual differences in scientific reasoning skills in their understanding of case evidence. Contrary to our preregistered hypotheses, there was no effect of evidence condition or gist aid on evidence understanding. However, individual differences in jurors’ numeracy skills predicted evidence understanding. Summary Poor-quality expert evidence is sometimes admitted into court (Smithburn, 2004). Jurors’ calibration to evidence strength varies widely and is not robustly understood. For instance, previous research has established that jurors lack understanding of the role of control groups, confounds, and sample sizes in scientific research (McAuliff, Kovera, & Nunez, 2009; Mill, Gray, & Mandel, 1994). Still others have found that jurors can distinguish weak from strong evidence when the evidence is presented alone, yet not when it is simultaneously presented with case details (Smith, Bull, & Holliday, 2011). This research highlights the need to present evidence to jurors in a way they can understand. Fuzzy Trace Theory purports more »

Mathematics is an important tool in engineering practice, as mathematical rules govern many designed systems (e.g., Nathan et al., 2013; Nathan et al., 2017). Investigations of structural engineers suggest that mathematical modelling is ubiquitous in their work, but the nature of the tasks they confront is not well-represented in the K-12 classroom (e.g., Gainsburg, 2006). This follows a larger literature base suggesting that school mathematics is often inauthentic and does not represent how mathematics is used in practice. At the same time, algebra is a persistent gatekeeper to careers in engineering (e.g., Harackiewicz et al., 2012; Olson & Riordan, 2012). In the present study, we interviewed 12 engineers, asking them a series of questions about how they use specific kinds of algebraic functions (e.g., linear, exponential, quadratic) in their work. The purpose of these interviews was to use the responses to create mathematical scenarios for College Algebra activities that would be personalized to community college students’ career interests. This curriculum would represent how algebra is used in practice by STEM professionals. However, our results were not what we expected. In this paper, we discuss three major themes that arose from qualitative analyses of the interviews. First, we found that engineers resoundingly more »

In a recent work (Ghazi et al., SODA 2016), the authors, with Komargodski and Kothari, initiated the study of communication with contextual uncertainty, a setup aiming to understand how efficient communication is possible when the communicating parties imperfectly share a huge context. In this setting, Alice is given a function f and an input string x, and Bob is given a function g and an input string y. The pair (x,y) comes from a known distribution mu, and f and g are guaranteed to be close under this distribution. Alice and Bob wish to compute g(x,y) with high probability. The lack of agreement between Alice and Bob on the function that is being computed captures the uncertainty in the context. The previous work showed that any problem with one-way communication complexity k in the standard model (i.e., without uncertainty; in other words, under the promise that f = g) has public-coin communication of at most O(k(1+I)) bits in the uncertain case, where I is the mutual information between x and y. Moreover, a lower bound of Omega(sqrt{I}) bits on the public-coin uncertain communication was also shown. However, an important question that was left open is related to the power that public randomness brings more »
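As background for the one-way public-coin model this abstract builds on, here is a toy illustration (the classic randomized Equality protocol with shared random parities; it is standard textbook material, not the contextual-uncertainty protocol of the paper): Alice sends a single short message, and the shared randomness is visible to both parties.

```python
import random

def one_way_equality(x, y, k=20, seed=42):
    """Toy one-way public-coin protocol for Equality on n-bit inputs.

    Public coins: k random subsets of bit positions, visible to both
    parties. Alice's single k-bit message: the parities of x on those
    subsets. Bob accepts iff y yields the same parities. If x == y, Bob
    always accepts; if x != y, each parity disagrees with probability
    1/2 over the coins, so the error probability is 2**-k.
    """
    n = len(x)
    assert len(y) == n
    rng = random.Random(seed)  # public randomness shared by Alice and Bob
    subsets = [[rng.randrange(2) for _ in range(n)] for _ in range(k)]
    # Alice's one-way message to Bob:
    message = [sum(s * b for s, b in zip(subset, x)) % 2 for subset in subsets]
    # Bob's check against his own input:
    return message == [sum(s * b for s, b in zip(subset, y)) % 2
                       for subset in subsets]

print(one_way_equality([0, 1, 1, 0], [0, 1, 1, 0]))  # True
print(one_way_equality([0, 1, 1, 0], [1, 1, 1, 0]))  # almost surely False
```

In the uncertain-context setting of the paper, the difficulty is that Alice and Bob would be running this kind of protocol for slightly different functions f and g, which is exactly the gap the O(k(1+I)) upper bound addresses.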