Estimation of Shannon and Rényi entropies of unknown discrete distributions is a fundamental problem in statistical property testing. In this paper, we give the first quantum algorithms for estimating α-Rényi entropies (Shannon entropy being the 1-Rényi entropy). In particular, we demonstrate a quadratic quantum speedup for Shannon entropy estimation and a generic quantum speedup for α-Rényi entropy estimation for all α ≥ 0, including tight bounds for the Shannon entropy, the Hartley entropy (α = 0), and the collision entropy (α = 2). We also provide quantum upper bounds for estimating the min-entropy (α = +∞) as well as the Kullback–Leibler divergence. We complement our results with quantum lower bounds on α-Rényi entropy estimation for all α ≥ 0. Our approach is inspired by the pioneering work of Bravyi, Harrow, and Hassidim (BHH) [1], but with many new technical ingredients: (1) we improve the error dependence of the BHH framework by a fine-tuned error analysis together with Montanaro’s approach to estimating the expected output of quantum subroutines [2] for α = 0, 1; (2) we develop a procedure, similar to cooling schedules in simulated annealing, for general α ≥ 0; (3) in the cases of integer α ≥ 2 and α = +∞, we reduce the entropy estimation problem to the α-distinctness and the ⌈log n⌉-distinctness problems, respectively.
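For reference, the quantities named above are all special or limiting cases of the α-Rényi entropy of a distribution p = (p_1, …, p_n); these are the standard definitions, not anything specific to the algorithms of this paper:

    H_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha},
    H_0(p) = \log\,\bigl|\{\,i : p_i > 0\,\}\bigr| \quad (\text{Hartley}),
    H_1(p) = \lim_{\alpha \to 1} H_\alpha(p) = -\sum_i p_i \log p_i \quad (\text{Shannon}),
    H_2(p) = -\log \sum_i p_i^{2} \quad (\text{collision}),
    H_\infty(p) = -\log \max_i p_i \quad (\text{min-entropy}).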
                    
                            
                            On the α-q-Mutual Information and the α-q-Capacities
                        
                    
    
Measures of information transfer corresponding to non-additive entropies have been studied intensively in recent decades. Most of this work concerns entropies belonging to the Sharma–Mittal class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of these considerations follow the same approach: mimicking one of the various, mutually equivalent definitions of the Shannon information measures, information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is treated as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the indispensable properties that a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even for very simple communication channels. This paper fills the gap by proposing two-parameter measures named the α-q-mutual information and the α-q-capacity. Beyond the standard Shannon quantities, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, the Landsberg–Vedral and the Gaussian entropies can also be accessed by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy a set of properties, stated as axioms, under which they reduce to zero for totally destructive channels and to the (maximal) input Sharma–Mittal entropy for perfect transmission, consistent with the maximum-likelihood detection error. In addition, they are non-negative and, in general, bounded above by the input and the output Sharma–Mittal entropies. Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not exhibit nonphysical behaviors such as sub-capacitance or super-capacitance, which qualifies them as appropriate measures of the Sharma–Mittal information transfer.
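For orientation, one standard parametrization of the Sharma–Mittal entropy of a distribution P = (p_1, …, p_n) is the following (the paper's own convention for the roles of α and q may differ):

    S_{\alpha,q}(P) = \frac{1}{1-q}\left[\left(\sum_i p_i^{\alpha}\right)^{\frac{1-q}{1-\alpha}} - 1\right],

which recovers the Rényi entropy of order α as q → 1, the Tsallis entropy at q = α, the Gaussian entropy as α → 1, and the Shannon entropy when both parameters tend to 1.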
        
    
- Award ID(s): 1907918
- PAR ID: 10249479
- Date Published:
- Journal Name: Entropy
- Volume: 23
- Issue: 6
- ISSN: 1099-4300
- Page Range / eLocation ID: 702
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- We explore a large class of correlation measures called the α−z Rényi mutual informations (RMIs). Unlike the commonly used notion of RMI involving linear combinations of Rényi entropies, the α−z RMIs are positive semi-definite and monotonically decreasing under local quantum operations, making them sensible measures of total (quantum and classical) correlations. This follows from their descendance from Rényi relative entropies. In addition to upper bounding connected correlation functions between subsystems, we prove the much stronger statement that for certain values of α and z, the α−z RMIs also lower bound certain connected correlation functions. We develop an easily implementable replica trick which enables us to compute the α−z RMIs in a variety of many-body systems including conformal field theories, free fermions, random tensor networks, and holography.
- We introduce asymptotic Rényi entropies as a parameterized family of invariants for random walks on groups. These invariants interpolate between various well-studied properties of the random walk, including the growth rate of the group, the Shannon entropy, and the spectral radius. They furthermore offer large deviation counterparts of the Shannon–McMillan–Breiman Theorem. We prove some basic properties of asymptotic Rényi entropies that apply to all groups, and discuss their analyticity and positivity for the free group and lamplighter groups.
- We explore asymptotically optimal bounds for deviations of distributions of independent Bernoulli random variables from the Poisson limit in terms of the Shannon relative entropy and Rényi/relative Tsallis distances (including Pearson’s χ²). This part generalizes the results obtained in Part I and removes any constraints on the parameters of the Bernoulli distributions.
- Quantum states with geometric duals are known to satisfy a stricter set of entropy inequalities than those obeyed by general quantum systems. The set of allowed entropies derived using the Ryu–Takayanagi (RT) formula defines the Holographic Entropy Cone (HEC). These inequalities are no longer satisfied once general quantum corrections are included by employing the Quantum Extremal Surface (QES) prescription. Nevertheless, the structure of the QES formula allows for a controlled study of how quantum contributions from bulk entropies interplay with HEC inequalities. In this paper, we initiate an exploration of this problem by relating bulk entropy constraints to boundary entropy inequalities. In particular, we show that requiring the bulk entropies to satisfy the HEC implies that the boundary entropies also satisfy the HEC. Further, we also show that requiring the bulk entropies to obey monogamy of mutual information (MMI) implies the boundary entropies also obey MMI (the MMI inequality is written out after this list for reference).
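For reference, monogamy of mutual information (mentioned in the last item above) is the tripartite inequality

    I(A:BC) \ge I(A:B) + I(A:C), \qquad \text{equivalently} \qquad S_{AB} + S_{BC} + S_{AC} \ge S_A + S_B + S_C + S_{ABC}.

This is the standard statement of MMI, not anything specific to that paper's bulk/boundary setup; it holds for holographic (RT) entropies but can be violated by general quantum states.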