Articles | Volume 7, issue 3/4
https://doi.org/10.5194/npg-7-211-2000
31 Dec 2000

Functional background of the Tsallis entropy: "coarse-grained" systems and "kappa" distribution functions

A. V. Milovanov and L. M. Zelenyi

Abstract. The concept of the generalized entropy is analyzed, with particular attention to the definition postulated by Tsallis [J. Stat. Phys. 52, 479 (1988)]. We show that the Tsallis entropy can be rigorously obtained as the solution of a nonlinear functional equation; this equation expresses the entropy of a complex system through the partial entropies of its subsystems and includes two principal parts. The first part is linear (additive) and leads to the conventional Boltzmann definition of entropy as the logarithm of the statistical weight of the system. The second part is multiplicative and contains all sorts of multilinear products of the partial entropies; inclusion of the multiplicative terms is shown to reproduce the generalized entropy exactly in the Tsallis sense. We speculate that the physical background for considering the multiplicative terms is the role of long-range correlations supporting "macroscopic" ordering phenomena (e.g., formation of "coarse-grained" correlated patterns). We prove that the canonical distribution corresponding to the Tsallis definition of entropy coincides with the so-called "kappa" distribution which appears in many physical realizations. This has led us to associate the origin of the "kappa" distributions with the "macroscopic" ordering ("coarse-graining") of the system. Our results indicate that applying the formalism based on the Tsallis notion of entropy may be meaningful only for systems whose statistical weights, Ω, are relatively small. (For "coarse-grained" systems, the weight Ω could be interpreted as the number of "grains".) For large Ω (i.e., Ω → ∞), the standard statistical mechanical formalism is advocated, which implies the conventional Boltzmann definition of entropy as ln Ω.
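
For reference, the two distributions named above can be sketched in their standard textbook forms; these expressions are not reproduced from the abstract itself, and the symbols k, q, κ, θ, and the normalization A_κ are the conventional ones rather than the authors' notation.

% Tsallis entropy for a discrete probability set {p_i} (Tsallis, 1988);
% k is Boltzmann's constant and q the entropic index; S_q reduces to
% the Boltzmann-Gibbs entropy -k \sum_i p_i \ln p_i in the limit q -> 1.
S_q \;=\; k \,\frac{1 - \sum_i p_i^{\,q}}{q - 1}

% A commonly used velocity-space form of the "kappa" distribution;
% theta is a characteristic thermal speed and A_kappa a normalization constant.
f_\kappa(v) \;=\; A_\kappa \left( 1 + \frac{v^2}{\kappa\,\theta^2} \right)^{-(\kappa + 1)}

The power-law tail of f_κ(v), in contrast to the exponential tail of a Maxwellian, is what the paper connects to the nonextensive (q ≠ 1) regime of the Tsallis statistics.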