Tsallis and Rényi Deformations Linked via a New λ-Duality

2022 
Tsallis and Rényi entropies, which are monotone transformations of each other, are deformations of the celebrated Shannon entropy. Maximization of these deformed entropies, under suitable constraints, leads to the $q$-exponential family which has applications in non-extensive statistical physics, information theory and statistics. In previous information-geometric studies, the $q$-exponential family was analyzed using classical convex duality and Bregman divergence. In this paper, we show that a generalized $\lambda$-duality, where $\lambda = 1 - q$ is to be interpreted as the constant information-geometric curvature, leads to a generalized exponential family which is essentially equivalent to the $q$-exponential family and has deep connections with Rényi entropy and optimal transport. Using this generalized convex duality and its associated logarithmic divergence, we show that our $\lambda$-exponential family satisfies properties that parallel and generalize those of the exponential family. Under our framework, the Rényi entropy and divergence arise naturally, and we give a new proof of the Tsallis/Rényi entropy maximizing property of the $q$-exponential family. We also introduce a $\lambda$-mixture family which may be regarded as the dual of the $\lambda$-exponential family, and connect it with other mixture-type families. Finally, we discuss a duality between the $\lambda$-exponential family and the $\lambda$-logarithmic divergence, and study its statistical consequences.
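The objects named in the abstract can be made concrete with standard formulas from the Tsallis literature: the deformed exponential $\exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)}$ (which recovers $e^x$ as $q \to 1$), the Tsallis entropy $S_q(p) = (1 - \sum_i p_i^q)/(q-1)$, and the Rényi entropy $H_q(p) = \log(\sum_i p_i^q)/(1-q)$. The monotone relation between the two entropies, $H_q = \log(1 + (1-q)S_q)/(1-q)$, follows directly from these definitions. A minimal numerical sketch (not code from the paper; the function names are illustrative):

```python
import numpy as np

def exp_q(x, q):
    """Deformed exponential exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)}.

    With lambda = 1 - q this is the (1 + lambda*x)^{1/lambda} form used in
    the lambda-duality; it recovers exp(x) in the limit q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    return np.where(base > 0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, q):
    """Renyi entropy H_q(p) = log(sum_i p_i^q) / (1 - q), for q != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** q)) / (1.0 - q)

p = [0.5, 0.3, 0.2]
q = 0.5  # so lambda = 1 - q = 0.5
S, H = tsallis_entropy(p, q), renyi_entropy(p, q)
# Monotone transformation linking the two deformed entropies:
# H_q = log(1 + (1-q) * S_q) / (1 - q)
print(abs(H - np.log(1.0 + (1.0 - q) * S) / (1.0 - q)) < 1e-12)
```

The identity checked at the end is exactly the monotone transformation the abstract refers to, and the `1 - q` appearing in both `exp_q` and the identity is the curvature parameter $\lambda$ of the paper's duality.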