Title: Quantum coherence, discord and correlation measures based on Tsallis relative entropy
Several ways have been proposed in the literature to define a coherence measure based on Tsallis relative entropy. One of them is defined as the distance between a state and the set of incoherent states, with the Tsallis relative entropy taken as the distance measure. Unfortunately, this measure does not satisfy the required strong monotonicity, but a modification of it that does has been proposed. We introduce three new Tsallis coherence measures, arising from a more general definition, that also satisfy strong monotonicity, and compare all five definitions with one another. Using three of the coherence measures that we discuss, one can also define a discord. Two of these have been used in the literature, and one is new. We also discuss two correlation measures based on Tsallis relative entropy. We provide explicit expressions for all three discords and both correlation measures on pure states. Lastly, we provide tight upper and lower bounds on two of the discord and correlation measures for any quantum state, with the condition for equality.
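For concreteness, the Tsallis relative entropy underlying these measures is commonly written as D_q(ρ‖σ) = (Tr[ρ^q σ^(1−q)] − 1)/(q − 1) for q ≠ 1. A minimal numerical sketch under that standard definition (the function names are ours, not the paper's):

```python
import numpy as np

def mat_power(rho, p):
    # Fractional power of a positive semidefinite matrix via eigendecomposition.
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0, None)  # guard against tiny negative eigenvalues
    return (v * w**p) @ v.conj().T

def tsallis_rel_entropy(rho, sigma, q):
    # D_q(rho || sigma) = (Tr[rho^q sigma^(1-q)] - 1) / (q - 1), valid for q != 1.
    t = np.trace(mat_power(rho, q) @ mat_power(sigma, 1 - q)).real
    return (t - 1) / (q - 1)

# Example: the pure state |+><+| versus its dephasing in the computational basis.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
dephased = np.diag(np.diag(plus))  # = I/2
print(tsallis_rel_entropy(plus, dephased, q=0.5))  # 2 - sqrt(2) ≈ 0.5858
```

In the distance-based definition mentioned above, the coherence of ρ is the minimum of this quantity over all incoherent (diagonal) states σ; the dephased state is a natural candidate but not always the minimizer.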
We present a genuine coherence measure based on a quasi-relative entropy as a difference between quasientropies of the dephased and the original states. The measure satisfies non-negativity and monotonicity under genuine incoherent operations (GIO). It is strongly monotone under GIO in two- and three-dimensions, or for pure states in any dimension, making it a genuine coherence monotone. We provide a bound on the error term in the monotonicity relation in terms of the trace distance between the original and the dephased states. Moreover, the lower bound on the coherence measure can also be calculated in terms of this trace distance.
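As an illustration of the "difference of entropies of the dephased and original states" construction described above, here is a sketch using the Tsallis entropy as the quasientropy. The specific choice of quasientropy and all names are illustrative assumptions, not the paper's exact definition:

```python
import numpy as np

def tsallis_entropy(rho, q):
    # S_q(rho) = (1 - Tr[rho^q]) / (q - 1), q != 1.
    w = np.clip(np.linalg.eigvalsh(rho), 0, None)
    return (1 - np.sum(w**q)) / (q - 1)

def coherence_gap(rho, q=2):
    # Entropy gap between the dephased state and the original state:
    # dephasing cannot decrease the entropy, so the gap is non-negative.
    dephased = np.diag(np.diag(rho))
    return tsallis_entropy(dephased, q) - tsallis_entropy(rho, q)

# Maximally coherent qubit state |+><+|: pure (S_2 = 0), dephases to I/2.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(coherence_gap(plus, q=2))  # 0.5
```

Any diagonal (incoherent) state is invariant under dephasing, so its gap is zero, matching the non-negativity property stated above.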
The measures of information transfer corresponding to non-additive entropies have been intensively studied in recent decades. The majority of this work concerns members of the Sharma–Mittal entropy class, such as the Rényi, Tsallis, Landsberg–Vedral and Gaussian entropies. All of these considerations follow the same approach, mimicking one of the various mutually equivalent definitions of the Shannon information measures: information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is treated as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties that a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even for very simple communication channels. This paper fills the gap by proposing two-parameter measures named the α-q-mutual information and the α-q-capacity. In addition to the standard Shannon quantities, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information-theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, Landsberg–Vedral and Gaussian entropies are accessed by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy the set of properties, stated as axioms, by which they reduce to zero for totally destructive channels and to the (maximal) input Sharma–Mittal entropy for perfect transmission, consistent with the maximum-likelihood detection error. In addition, they are non-negative and bounded above by the input and output Sharma–Mittal entropies in general.
Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not exhibit nonphysical behaviors such as sub-capacitance or super-capacitance, which qualifies them as appropriate measures of Sharma–Mittal information transfer.
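The Sharma–Mittal entropy family referenced above is commonly written as H_{α,q}(P) = ((Σ_i p_i^α)^((1−q)/(1−α)) − 1)/(1 − q); the Tsallis entropy is the case α = q, and the Rényi entropy is the limit q → 1. A quick sketch of this two-parameter family (function name is ours):

```python
import numpy as np

def sharma_mittal(p, alpha, q):
    # H_{alpha,q}(P) = ((sum_i p_i^alpha)^((1-q)/(1-alpha)) - 1) / (1 - q),
    # for alpha != 1 and q != 1; Tsallis at alpha = q, Renyi as q -> 1.
    p = np.asarray(p, dtype=float)
    s = np.sum(p**alpha)
    return (s**((1 - q) / (1 - alpha)) - 1) / (1 - q)

# Tsallis special case (alpha = q = 2) for a fair coin:
print(sharma_mittal([0.5, 0.5], alpha=2, q=2))  # (1 - sum p^2) / (2 - 1) = 0.5
```

Taking q numerically close to 1 recovers the Rényi value ln(Σ p_i^α)/(1 − α), which is one way to sanity-check an implementation of the family.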
Wilde, Mark M.
(Proceedings of the 2018 IEEE International Symposium on Information Theory)
The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement measures can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz–Rényi relative entropies. In this contribution, I introduce the optimized quantum f-divergence as a related generalization of quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Rényi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this approach is that there is now a single, unified approach for establishing the data processing inequality for both the Petz–Rényi and sandwiched Rényi relative entropies, for the full range of parameters for which it is known to hold.
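The sandwiched Rényi relative entropy mentioned above has the standard closed form D̃_α(ρ‖σ) = (1/(α−1)) log Tr[(σ^((1−α)/2α) ρ σ^((1−α)/2α))^α]. A minimal numerical sketch under that definition (helper names are ours):

```python
import numpy as np

def mpow(A, p):
    # Fractional power of a Hermitian PSD matrix via eigendecomposition.
    w, v = np.linalg.eigh(A)
    w = np.clip(w, 0, None)
    return (v * w**p) @ v.conj().T

def sandwiched_renyi(rho, sigma, a):
    # D~_a(rho||sigma) = log Tr[(sigma^((1-a)/2a) rho sigma^((1-a)/2a))^a] / (a-1).
    # For a > 1, sigma must be full rank (the sandwich uses a negative power).
    s = mpow(sigma, (1 - a) / (2 * a))
    inner = s @ rho @ s
    return np.log(np.trace(mpow(inner, a)).real) / (a - 1)

# Commuting example, where it reduces to the classical Renyi divergence:
rho, sigma = np.diag([0.75, 0.25]), np.eye(2) / 2
print(sandwiched_renyi(rho, sigma, a=2))  # log(1.25) ≈ 0.2231
```

For commuting states the sandwiched and Petz–Rényi quantities coincide with the classical Rényi divergence (1/(α−1)) log Σ p_i^α q_i^(1−α), which the example above verifies.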
We define a new family of similarity and distance measures on graphs, and explore their theoretical properties in comparison to conventional distance metrics. These measures are defined by the solution(s) to an optimization problem which attempts to find a map minimizing the discrepancy between two graph Laplacian exponential matrices, under norm-preserving and sparsity constraints. Variants of the distance metric are introduced to consider such optimized maps under sparsity constraints as well as fixed time-scaling between the two Laplacians. The objective function of this optimization is multimodal and has discontinuous slope, and is hence difficult for univariate optimizers to solve. We demonstrate a novel procedure for efficiently calculating these optima for two of our distance measure variants. We present numerical experiments demonstrating that (a) upper bounds of our distance metrics can be used to distinguish between lineages of related graphs; (b) our procedure finds the required optima faster, by as much as a factor of 10^3; and (c) the upper bounds satisfy the triangle inequality exactly under some assumptions and approximately under others. We also derive an upper bound for the distance between two graph products in terms of the distances between the two pairs of factors. Additionally, we present several possible applications, including the construction of infinite "graph limits" by means of Cauchy sequences of graphs related to one another by our distance measure.
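A toy version of the Laplacian-exponential comparison described above, restricted to brute-force permutation maps on tiny graphs. This is only an illustration: the paper's actual optimization ranges over a richer class of norm-preserving, sparsity-constrained maps, and all names here are ours:

```python
import numpy as np
from itertools import permutations

def laplacian(A):
    # Combinatorial graph Laplacian L = D - A for an adjacency matrix A.
    return np.diag(A.sum(axis=1)) - A

def heat_kernel(L, t=1.0):
    # exp(-t L) computed via the eigendecomposition of the symmetric Laplacian.
    w, v = np.linalg.eigh(L)
    return (v * np.exp(-t * w)) @ v.conj().T

def perm_heat_distance(A1, A2, t=1.0):
    # Minimize ||exp(-t L1) - P exp(-t L2) P^T||_F over permutation matrices P.
    # Brute force: only feasible for very small graphs.
    H1, H2 = heat_kernel(laplacian(A1), t), heat_kernel(laplacian(A2), t)
    n = len(A1)
    best = np.inf
    for p in permutations(range(n)):
        P = np.eye(n)[list(p)]
        best = min(best, np.linalg.norm(H1 - P @ H2 @ P.T))
    return best

# Path graph versus triangle on 3 nodes: structurally different, so distance > 0.
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
tri  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
print(perm_heat_distance(path, tri))
```

Identical graphs give distance zero (the identity permutation matches the kernels exactly), while any spectral difference between the Laplacians forces a strictly positive value.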
Kopfer, Eva; Streets, Jeffrey
(Symmetry, Integrability and Geometry: Methods and Applications)
We prove results relating the theory of optimal transport and generalized Ricci flow. We define an adapted cost functional for measures using a solution of the associated dilaton flow. This determines a formal notion of geodesics in the space of measures, and we show geodesic convexity of an associated entropy functional. Finally, we show monotonicity of the cost along the backwards heat flow, and use this to give a new proof of the monotonicity of the energy functional along generalized Ricci flow.
Vershynina, Anna. Quantum coherence, discord and correlation measures based on Tsallis relative entropy. Quantum Information and Computation 20 (7&8). Retrieved from https://par.nsf.gov/biblio/10142423. doi:10.26421/QIC20.7-8-2.
@article{osti_10142423,
title = {Quantum coherence, discord and correlation measures based on Tsallis relative entropy},
url = {https://par.nsf.gov/biblio/10142423},
DOI = {10.26421/QIC20.7-8-2},
abstractNote = {Several ways have been proposed in the literature to define a coherence measure based on Tsallis relative entropy. One of them is defined as the distance between a state and the set of incoherent states, with the Tsallis relative entropy taken as the distance measure. Unfortunately, this measure does not satisfy the required strong monotonicity, but a modification of it that does has been proposed. We introduce three new Tsallis coherence measures, arising from a more general definition, that also satisfy strong monotonicity, and compare all five definitions with one another. Using three of the coherence measures that we discuss, one can also define a discord. Two of these have been used in the literature, and one is new. We also discuss two correlation measures based on Tsallis relative entropy. We provide explicit expressions for all three discords and both correlation measures on pure states. Lastly, we provide tight upper and lower bounds on two of the discord and correlation measures for any quantum state, with the condition for equality.},
journal = {Quantum Information and Computation},
volume = {20},
number = {7&8},
author = {Vershynina, Anna},
}