Article

Useful Dual Functional of Entropic Information Measures

1 Consejo Nacional de Investigaciones Científicas y Tecnológicas (IFLP-CCT-CONICET), C.C. 727, 1900 La Plata, Argentina
2 Departamento de Física, Universidad Nacional de La Plata, 1900 La Plata, Argentina
3 Departamento de Matemática, Universidad Nacional de La Plata, 1900 La Plata, Argentina
4 Facultad de Ciencias Exactas, Universidad Nacional de La Pampa, Peru y Uruguay, Santa Rosa, 6300 La Pampa, Argentina
5 Departamento de Física, Universidad Católica del Norte, Av. Angamos 0610, 2340000 Antofagasta, Chile
* Author to whom correspondence should be addressed.
Submission received: 27 March 2020 / Revised: 17 April 2020 / Accepted: 18 April 2020 / Published: 24 April 2020
(This article belongs to the Special Issue Entropic Forces in Complex Systems)

Abstract: There are entropic functionals galore, but no simple objective measures to distinguish between them. We remedy this situation here by appeal to Born's proposal, of almost a hundred years ago, that the square modulus of any wave function, $|\psi|^2$, be regarded as a probability distribution $P$. The usefulness of employing information measures like Shannon's in this pure-state context was highlighted in [Phys. Lett. A 1993, 181, 446]. Here we apply the notion with the purpose of generating a dual functional $\mathcal{F}_{\alpha_R}: \{S_Q\} \to \mathbb{R}^+$, which maps entropic functionals onto positive real numbers. In such an endeavor, we use as standard ingredients the coherent states of the harmonic oscillator (CHO), which are unique in the sense of possessing minimum uncertainty. This use is greatly facilitated by the fact that the CHO can be given an analytic, compact, closed form, as shown in [Rev. Mex. Fis. E 2019, 65, 191]. Rewarding insights are obtained regarding the comparison between several standard entropic measures.

1. Introduction

Dual functionals map ordinary functionals onto real numbers. We are interested here in entropic functionals (EFs). There are EFs galore, but no simple objective measures to distinguish between them. We remedy this situation in this work by appealing to Born's proposal, of almost a hundred years ago, that the square modulus of any wave function, $|\psi|^2$, ought to be regarded as a probability distribution $P$.
We begin by reminding the reader that the notion of appealing to just a small number of expectation values in order to describe the main features of a physical system underlies statistical mechanics, particularly in its information-theory version, called MaxEnt by its creator, Jaynes. Indeed, theoretical developments of the last century led Jaynes to formulate his MaxEnt approach, which is known to yield the least biased description consistent with the available amount of data [1,2,3,4,5,6,7,8].
For a similar but purely quantum treatment in the style of Born, aimed at describing pure quantum states, important advances were made in References [9,10,11,12,13,14,15,16], in which a "quantum entropy functional" $S_Q$ was utilized and the MaxEnt approach profitably employed. As an aside, we mention that the MaxEnt approach has, until now, been precisely the main means of comparison for ascertaining whether a certain entropic measure is more or less apt than another at describing a given scientific phenomenon.
Returning to the pure-state entropic measure $S_Q$, with $Q = (q_1, q_2, \ldots, q_n)$: the MaxEnt methodology was demonstrated to be very useful in describing both ground and excited states of variegated many-body problems [9,10,11,12,13]. It constituted a reasonable alternative to the celebrated Gutzwiller ansatz [15] and paved the way for rather interesting semi-classical treatments [16]. It has been shown to provide many-body wave functions of better quality in several distinct scenarios, such as the Hartree-Fock [10], BCS [11], or random-phase-approximation [13] ones. One appeals there to Shannon's logarithmic ignorance measure [4] for the probability distribution $P_i$,
S[P] = -\sum_i P_i \ln(P_i), \qquad (1)
with a special choice for the probability distribution (PD)
S(\psi) = -2 \sum_i |c_i|^2 \ln|c_i| \qquad (2)
where, in an arbitrary basis $|i\rangle$,
\psi = \sum_j c_j\, |j\rangle, \qquad (3)
in self-explanatory notation.
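As a concrete illustration (ours, not drawn from the references): for an equal-weight superposition $\psi = (|1\rangle + |2\rangle)/\sqrt{2}$, Equation (2) gives

S(\psi) = -2\left[\tfrac{1}{2}\ln\tfrac{1}{\sqrt{2}} + \tfrac{1}{2}\ln\tfrac{1}{\sqrt{2}}\right] = \ln 2,

i.e., the maximal ignorance attainable with a two-term expansion.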

The Quantum Entropic Functional $S_Q$

Several important properties of the quantum entropy $S_Q$ were demonstrated in Reference [16], namely:
  • $S_Q$ is a true ignorance function, in the sense of Brillouin. For a normalized, discrete probability distribution $p_i$, for instance, Shannon's measure represents the missing information that one would need to possess in order to be in a "complete information" situation (CIS). In a CIS, just one $p_i = 1$, while the remaining ones vanish [4,5].
  • There is a unique global minimum for $S_Q$ subject to appropriate MaxEnt constraints.
  • $S_Q$ obeys an H-theorem.
  • Ground-state wave functions that maximize $S_Q$ satisfy the virial theorem and the hypervirial ones [17].
We see then that our ignorance measure [4] $S_Q$ exhibits adequate credentials to be seriously considered. The wave function (wf) we will be interested in here is that advanced in References [18,19], which compactly describes the coherent states of the harmonic oscillator (HO) in simple analytic terms, advantageously replacing the usual, cumbersome infinite sum.

2. A Recently Developed Analytic, Compact Expression for Coherent States

Reference [18] introduced, for the first time, an analytic, compact expression for coherent states, which was subsequently discussed at length in Reference [19]. The new compact expression for coherent states advantageously replaces Glauber's customary infinite expansion in terms of the harmonic oscillator eigenstates $|n\rangle$. It reads
\psi_\alpha(x) = \left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-\frac{\alpha^2}{2}}\, e^{-\frac{|\alpha|^2}{2}}\, e^{-\frac{m\omega x^2}{2\hbar}}\, e^{\sqrt{\frac{2m\omega}{\hbar}}\,\alpha x}. \qquad (4)
These $\psi_\alpha(x)$ are eigenfunctions of the annihilation operator $a$ corresponding to the one-dimensional HO. Thus,
|\alpha\rangle = \int \left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-\frac{\alpha^2}{2}}\, e^{-\frac{|\alpha|^2}{2}}\, e^{-\frac{m\omega x^2}{2\hbar}}\, e^{\sqrt{\frac{2m\omega}{\hbar}}\,\alpha x}\, |x\rangle\, dx, \qquad (5)
and
a\, |\alpha\rangle = \alpha\, |\alpha\rangle. \qquad (6)
For α = 0 we have
\psi_0(x) = \left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-\frac{m\omega x^2}{2\hbar}} = \phi_0(x), \qquad (7)
namely, the wave function (wf) of the HO ground state, which is itself a coherent state. For simplicity, in what follows we set
\frac{m\omega}{\hbar} = 1. \qquad (8)
Given a certain operator $A$, it is certainly much easier to compute $\langle\alpha|A|\alpha\rangle$ (just one integral) than an infinite number of matrix elements $\langle n|A|n\rangle$ (for $n$ phonons) and then sum over them.
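To make this concrete, here is a minimal Python sketch (our own illustration; function and variable names are ours, not the paper's) that numerically checks the closed form (4): the state is normalized and obeys the textbook relation $\langle\alpha|x|\alpha\rangle = \sqrt{2}\,\alpha_R$, in units with $m\omega/\hbar = 1$.

```python
# Sanity check of the closed-form coherent state, Eq. (4), with m*omega/hbar = 1.
import numpy as np
from scipy.integrate import quad

def psi(x, alpha):
    """Closed-form coherent state of Eq. (4) for a complex displacement alpha."""
    return (np.pi ** -0.25
            * np.exp(-alpha ** 2 / 2)
            * np.exp(-abs(alpha) ** 2 / 2)
            * np.exp(-x ** 2 / 2)
            * np.exp(np.sqrt(2) * alpha * x))

alpha = 0.8 + 0.3j
norm = quad(lambda x: abs(psi(x, alpha)) ** 2, -np.inf, np.inf)[0]
x_mean = quad(lambda x: x * abs(psi(x, alpha)) ** 2, -np.inf, np.inf)[0]
print(norm)                              # ~1.0: the state is normalized
print(x_mean, np.sqrt(2) * alpha.real)   # both ~1.131: <x> = sqrt(2) * alpha_R
```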
Our $\psi_\alpha(x)$, eigenfunctions of the annihilation operator $a$ of the one-dimensional HO, exhibit a special property that is of the essence for our present purposes: they are states of minimum Heisenberg uncertainty. Actually, this is their principal feature, to such an extent that it becomes their defining trait: a coherent state is one of minimum uncertainty (with regard to canonically conjugate variables). This translates into the fact that their associated quantal entropy $S_Q$, a measure of ignorance [4], is unique in the sense that the associated quantum ignorance is minimal.
Our central proposal emerges in this context: associate to any entropic functional $S_Q(P)$ a numerical real value. This value emerges when the input $P$ of $S_Q$ is a coherent state. The idea is viable because, as we will see, the numerical value $m$ associated with the functional is independent of the displacement factor $\alpha$ of the coherent state. $m$ is the same for any arbitrary $\alpha$ and thus uniquely characterizes the dual functional $\mathcal{F}_{\alpha_R}[S_Q]$,
\mathcal{F}_{\alpha_R}: \{S_Q\} \to \mathbb{R}^+. \qquad (9)

3. Some Different Monoparametric Ignorance Measures

Shannon's logarithmic measure (1) does not possess any parameter. Generalized entropic measures (GEMs) do; the best summary of them is, in our view, Reference [20] (and references therein). They have become quite popular in the last 30 years, being applied to variegated scientific areas of endeavor, from high-energy physics to economics. There are many GEMs [21], but we will limit ourselves in this Section to three monoparametric ones.
Let $F(x)$ be the probability density (PD) corresponding to a wave function $\psi(x)$, of the form
F(x) = \psi^*(x)\, \psi(x). \qquad (10)
Shannon's entropic measure (or ignorance measure) is (we set Boltzmann's constant $k_B = 1$)
S_S = -\int F(x)\, \ln[F(x)]\, dx. \qquad (11)
Tsallis' ignorance measure reads [20]
S_T^q = \frac{1 - \int [F(x)]^q\, dx}{q - 1}, \qquad (12)
while Rényi's one adopts the appearance [20]
S_R^q = \frac{1}{1 - q}\, \ln \int [F(x)]^q\, dx. \qquad (13)
Finally, Kaniadakis' ignorance measure is [22,23,24]
S_K^q = -\frac{1}{2q} \left( \int [F(x)]^{1+q}\, dx - \int [F(x)]^{1-q}\, dx \right). \qquad (14)

4. The Main Mathematical Tool of This Paper

The coherent-state PD is, for complex $\alpha$,
\alpha = \alpha_R + i\, \alpha_I, \qquad (15)
given by
F_\alpha(x) = \psi_\alpha^*(x)\, \psi_\alpha(x) = \pi^{-1/2}\, e^{-(x - \sqrt{2}\,\alpha_R)^2} = F_{\alpha_R}(x), \qquad (16)
and obviously depends only on the real part $\alpha_R$ of $\alpha$.
Given the probability density $F$ for our coherent state, our fundamental tool can now be formally introduced: the dual functional $\mathcal{F}$ of a given ignorance measure $S(F)$ ($S$ is a functional of $F$). In practice, to evaluate $\mathcal{F}$ we just compute the functional $S$ at $F_{\alpha_R}$,
\mathcal{F}_{\alpha_R}(S) = S(F_{\alpha_R}). \qquad (17)
We now apply it to our five ignorance measures, beginning with Shannon's $S_S$,
\mathcal{F}_{\alpha_R}(S_S) = S_S(F_{\alpha_R}) = \frac{1}{2}\,(1 + \ln \pi) \approx 1.07, \qquad (18)
which is independent of $\alpha_R$! This feature is common to all five of our measures and can be generalized to arbitrary entropic measures (see Section 4.3).
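For completeness, the one-line derivation of (18): inserting (16) into (11) and changing variables to $y = x - \sqrt{2}\,\alpha_R$,

S_S(F_{\alpha_R}) = \int_{-\infty}^{\infty} \pi^{-1/2}\, e^{-y^2} \left( \tfrac{1}{2}\ln\pi + y^2 \right) dy = \tfrac{1}{2}\ln\pi + \tfrac{1}{2},

where we used $\int \pi^{-1/2} e^{-y^2}\, dy = 1$ and $\int \pi^{-1/2}\, y^2\, e^{-y^2}\, dy = \tfrac{1}{2}$. The displacement disappears with the change of variables, the mechanism generalized in Section 4.3.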

4.1. Important Comment on the Meaning of Equation (18)

Let us consider now the specific real number associated with Shannon's measure,
N_S = \frac{1}{2}\,(1 + \ln \pi) \approx 1.07. \qquad (19)
$N_S$ is the minimum amount of ignorance displayed by Shannon's entropy. It could perhaps be thought of as a kind of information-theoretic counterpart of the uncertainty $\hbar/2$ of quantum mechanics, although the units are different in the two cases. This least $\hbar/2$ amount of uncertainty (with regard to canonically conjugate variables) is physically unavoidable, of course. The Shannon quantum entropic functional $S_Q$, instead, reflects an altogether distinct ignorance amount (IA), that pertaining to the Born probability density $|\psi(x)|^2$. Can this IA be diminished if one chooses a different entropic measure? This is a seemingly interesting question, which will be answered in the affirmative in the next Subsection. Let us make the following notion perfectly clear. A given minimum IA for an entropic functional (EF)
  • in no way makes an EF “better” or “worse” than another EF,
  • but it serves the purpose of classifying EFs, and
  • classification is the starting step of any scientific discipline [25].

4.2. Ignorance-Amount (IA) for Generalized Entropies

Our integrals over the variable $x$ always run between $-\infty$ and $+\infty$.
Tsallis' entropy is the paradigmatic example [20]. In this case we obtain a function $N_T(q)$ of $q$ rather than a pure number. $N_T(q)$ depends on the specific value of the parameter $q$, so that, after a straightforward computation, we get a real number $N_T$ for each $q$ value. This real number arises from applying the super functional $\mathcal{F}_{\alpha_R}$ to the functional $S_T^q[F_{\alpha_R}]$. Indeed,
N_T(q) = \mathcal{F}_{\alpha_R}(S_T^q) = S_T^q[F_{\alpha_R}] = \frac{1 - q^{-1/2}\, \pi^{\frac{1-q}{2}}}{q - 1}, \qquad (20)
while, in Rényi's case [20], we face the real numbers $N_R(q)$,
N_R(q) = \mathcal{F}_{\alpha_R}(S_R^q) = S_R^q[F_{\alpha_R}] = \frac{1}{2(1 - q)} \left[ \ln\pi - \ln q - q \ln\pi \right]. \qquad (21)
Finally, for $N_K(q)$ (Kaniadakis) we find [22,23,24]
N_K(q) = \mathcal{F}_{\alpha_R}(S_K^q) = S_K^q[F_{\alpha_R}] = -\frac{1}{2q} \left( \frac{\pi^{-q/2}}{\sqrt{1 + q}} - \frac{\pi^{q/2}}{\sqrt{1 - q}} \right). \qquad (22)
The values of the super functional $\mathcal{F}$ are indeed independent of $\alpha_R$ and are all functions of $\pi$ (and, for all but Shannon's, also of $q$). The $\pi$-dependence comes, of course, from integrating a Gaussian function for the coherent states. We insist on the fact that we are facing pure numbers here. No physical units are involved.
If we carefully inspect the above equations, we will appreciate that, in some cases, Shannon's IA is diminished by the generalized functionals. This will be clearly seen in the graphs displayed below.
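The closed forms (20)-(22) are easy to cross-check. The following Python sketch (ours; a sanity check, not part of the original derivations) evaluates each entropic functional by direct quadrature on $F_{\alpha_R}$ and compares the result with the formulas above; the sweep over $\alpha_R$ also previews the independence proved in Section 4.3.

```python
# Cross-check of Eqs. (20)-(22) by direct numerical integration of the
# coherent-state density F(x) = pi^(-1/2) * exp(-(x - sqrt(2)*a)^2).
import numpy as np
from scipy.integrate import quad

def F(x, a):
    return np.pi ** -0.5 * np.exp(-(x - np.sqrt(2) * a) ** 2)

def moment(p, a):
    """Integral of F(x)**p over the real line."""
    return quad(lambda x: F(x, a) ** p, -np.inf, np.inf)[0]

def N_T(q):  # Eq. (20), Tsallis
    return (1 - q ** -0.5 * np.pi ** ((1 - q) / 2)) / (q - 1)

def N_R(q):  # Eq. (21), Renyi
    return (np.log(np.pi) - np.log(q) - q * np.log(np.pi)) / (2 * (1 - q))

def N_K(q):  # Eq. (22), Kaniadakis (|q| < 1)
    return (np.pi ** (q / 2) / np.sqrt(1 - q)
            - np.pi ** (-q / 2) / np.sqrt(1 + q)) / (2 * q)

q = 0.7
for a in (0.0, 1.3, -2.1):            # the values must not depend on alpha_R
    S_T = (1 - moment(q, a)) / (q - 1)
    S_R = np.log(moment(q, a)) / (1 - q)
    S_K = (moment(1 - q, a) - moment(1 + q, a)) / (2 * q)
    print(S_T - N_T(q), S_R - N_R(q), S_K - N_K(q))   # all ~0
```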

4.3. Generalizing the $\alpha_R$-Independence to Arbitrary Entropic Measures

Let $G_Q$ be an arbitrary entropic measure that depends upon a set of parameters $Q = (q_1, q_2, \ldots, q_n)$ and involves the coherent-state probability density $F$. We have the functional $\mathcal{F}_{\alpha_R}(G_Q)$,
\mathcal{F}_{\alpha_R}(G_Q) = \int G_Q\left[ F_{\alpha_R}(x) \right] dx = \int G_Q\left[ \pi^{-1/2}\, e^{-(x - \sqrt{2}\,\alpha_R)^2} \right] dx = \int G_Q\left[ \pi^{-1/2}\, e^{-x^2} \right] dx = I_Q, \qquad (23)
and we see that the $\alpha_R$ dependence is gone, absorbed by the change of variables one makes in performing the Gaussian integrations, as above.
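The cancellation can also be confirmed symbolically, here for Shannon's case (a sketch of ours using sympy; illustrative only, not part of the paper):

```python
# Symbolic confirmation that the alpha_R dependence drops out (Shannon case).
import sympy as sp

x, a = sp.symbols('x a', real=True)
F = sp.pi ** sp.Rational(-1, 2) * sp.exp(-(x - sp.sqrt(2) * a) ** 2)
logF = sp.expand_log(sp.log(F), force=True)   # -(x - sqrt(2)*a)**2 - log(pi)/2
S = sp.integrate(-F * logF, (x, -sp.oo, sp.oo))
print(sp.simplify(S))                         # log(pi)/2 + 1/2, no 'a' left
```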

5. Results: Four Numerical Quantities Associated to Each of Our Monoparametric Ignorance Measures

These $N$ quantities are (1) $N_S$, (2) $N_T(q)$, (3) $N_R(q)$, and (4) $N_K(q)$, associated respectively with Shannon, Tsallis, Rényi, and Kaniadakis. We plot and compare them. We see that Shannon's ignorance amount can indeed be diminished by other entropic measures. Figure 1 clearly demonstrates that Shannon's ignorance amount is indeed decreased for $q > 1$ in both the Tsallis and Rényi instances. Kaniadakis' functional, instead, achieves the same feat for $q$ near zero.
In Figure 2 we compare the ignorance amounts (IA) associated with Tsallis (horizontal) and Rényi (vertical) entropic forms.
The black curve displays $N_R(q)$ (vertical axis) versus $N_T(q)$ (horizontal one). A monotonic dependence is observed, as one should expect from the associated mathematical expressions for these entropic forms. The red curve tells us that the Tsallis IA is smaller than Rényi's one for $q > 1$, and vice versa for $q < 1$.
Figure 3 makes the same comparisons as Figure 2, but now relates (black curve) the Kaniadakis (vertical) to the Rényi (horizontal) functional. Here the black curve depicts the highly non-trivial relationship between them.

6. Sharma-Mittal Biparametric Ignorance Measure

It is defined in terms of two parameters $r$ and $q$ as [26,27]
S_{SM}(q, r) = \frac{1}{1 - r} \left[ \left( \int [F(x)]^q\, dx \right)^{\frac{1-r}{1-q}} - 1 \right], \qquad (24)
so that
\mathcal{F}_{\alpha_R}(S_{SM}(q, 2q-1)) = \frac{q - \pi^{1-q}}{2q\,(q - 1)}, \qquad (25)
where we have (arbitrarily, for ease of comparison) selected $r = 2q - 1$. For $r = 2$ one has
\mathcal{F}_{\alpha_R}(S_{SM}(q, 2)) = 1 - \frac{1}{\sqrt{\pi}}\, q^{\frac{1}{2(1-q)}}, \qquad (26)
while for r = 0.5 we have
N_{SM}(q, 0.5) = \mathcal{F}_{\alpha_R}(S_{SM}(q, 0.5)) = 2\, \pi^{1/4}\, q^{\frac{1}{4(q-1)}} - 2. \qquad (27)
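As with the monoparametric measures, Equations (25)-(27) admit a quick numerical cross-check (a sketch of ours; names are not the paper's):

```python
# Numerical check of the Sharma-Mittal values (25)-(27); alpha_R = 0 suffices,
# by the alpha_R-independence of Section 4.3.
import numpy as np
from scipy.integrate import quad

def F(x):
    return np.pi ** -0.5 * np.exp(-x ** 2)

def S_SM(q, r):
    """Sharma-Mittal entropy (24) of F, by direct integration."""
    Iq = quad(lambda x: F(x) ** q, -np.inf, np.inf)[0]
    return (Iq ** ((1 - r) / (1 - q)) - 1) / (1 - r)

q = 1.6
print(S_SM(q, 2 * q - 1), (q - np.pi ** (1 - q)) / (2 * q * (q - 1)))   # Eq. (25)
print(S_SM(q, 2.0), 1 - np.pi ** -0.5 * q ** (1 / (2 * (1 - q))))       # Eq. (26)
print(S_SM(q, 0.5), 2 * np.pi ** 0.25 * q ** (1 / (4 * (q - 1))) - 2)   # Eq. (27)
```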
The following graph (Figure 4) depicts our functional in terms of the pair ( q , r ) .
The next figure (Figure 5) compares the Tsallis result to the Sharma-Mittal ( q , 2 q 1 ) one.
We appreciate the fact that the Sharma-Mittal measure exhibits a smaller ignorance amount than the Tsallis one for $q > 1$. This is to be expected, since there are two free parameters.

7. Value of Our Dual Functional When the $S_Q$-Argument Is Not a Coherent State

For the sake of completeness, we now show that the numerical value $m$ of $\mathcal{F}[S_Q]$, when we deal with $S_Q[F_1]$ (with $F_1$ the probability density of the first excited state of the HO), is larger than that for the same dual functional when the argument of $S_Q$ is a coherent state.
This should lend credibility to the statement that coherent-state information measures yield minimum values of the dual functional.
The expression for the first excited state wave function is
\phi_1(x) = 2x\, (4\pi)^{-1/4}\, e^{-x^2/2}. \qquad (28)
Then,
\mathcal{F}_{\phi_1}(S) = -\frac{2}{\sqrt{\pi}} \int x^2\, e^{-x^2}\, \ln\!\left[ \frac{2 x^2}{\sqrt{\pi}}\, e^{-x^2} \right] dx, \qquad (29)
so that (29) becomes
\mathcal{F}_{\phi_1}(S) = \frac{1}{2} \ln\pi + \ln 2 + C - \frac{1}{2} = m_1 \approx 1.34, \qquad (30)
where $C = 0.57721566490\ldots$ is Euler's constant. From (18) we see that $m_1 > m$ (coherent state).
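The value in (30) is easy to confirm numerically (our check, not in the original text):

```python
# Numerical confirmation of Eq. (30): Shannon ignorance of |phi_1|^2.
import numpy as np
from scipy.integrate import quad

def F1(x):
    """Born density of the first excited HO state, from Eq. (28)."""
    return 2.0 * x * x * np.exp(-x * x) / np.sqrt(np.pi)

def integrand(x):
    f = F1(x)
    return -f * np.log(f) if f > 0.0 else 0.0   # x^2 * ln(x^2) -> 0 at the node

m1 = 2.0 * quad(integrand, 0.0, np.inf)[0]      # the integrand is even in x
closed = 0.5 * np.log(np.pi) + np.log(2.0) + np.euler_gamma - 0.5
print(m1, closed)   # both ~1.3427, larger than the coherent-state value 1.07
```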

8. Application to a Statistical Complexity (SC) Measure

Our entropy $S_Q$ can be viewed as a measure of the uncertainty associated with the basis states in which the wave function (wf) is expanded (cf. Equation (3)). We can regard the situation as that of a probabilistic physical process described by the probability distribution $p_j = |c_j|^2$, $j = 1, \ldots, N$, with $P = (p_1, p_2, \ldots, p_N)$ a vector in a probability space. For $S_Q[P] = 0$ the situation is that prevailing immediately after performing an experiment (wf "collapse" and minimum ignorance). On the other hand, our ignorance is maximal if $S[P] = \ln N$ (uniform probability). These two extreme circumstances of (i) maximum knowledge (or "perfect order") and (ii) maximum ignorance (or maximum "randomness") are regarded by many authors [1,2,3,4,5,6,7,8,9,10,11,28,29,30,31,32,33,34,35] as "trivial" ones. These authors have conceived the idea of devising a "measure" of the "statistical complexity" (SC) contained in $P$ that would vanish in the two extreme situations described above. We will analyze here the quantum SC, of which $S_Q$ is a basic ingredient. We will apply the quantifier $C$ to the probability distribution (PD) $P = |\psi_\alpha|^2$ corresponding to coherent states. Accordingly, if $C = 0$, the PD $P$ contains only trivial information. The larger $C$, the larger the amount of "non-triviality". At this stage of our discussion an important and well-known observation emerges: not all available information measures are equally able to detect non-triviality. They are not equally "informative". This is why we will analyze the PD $P$ above with different $C$ measures, entailing distinct information measures (IMs). In turn, we study two different $C$ definitions.

8.1. Shiner-Davison-Landsberg Complexity Measure for Distinct IMs

We appeal to the simplest SC measure, $T_{SDL}$, devised by Shiner, Davison, and Landsberg (SDL) [36]. We first introduce the ratio $H = S_Q / S_Q^{max}$ between $S_Q$ and the specific maximum value $S_Q^{max}$ that $S_Q$ can attain. Then,
T_{SDL} = H\,(1 - H). \qquad (31)
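An elementary remark (ours): $T_{SDL}$ vanishes at both extremes $H = 0$ and $H = 1$, and attains its largest possible value, $1/4$, at $H = 1/2$, i.e., halfway between perfect order and maximum randomness.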
What are we looking at with this definition in our particular instance? Remember that here $P = |\psi_\alpha|^2$ corresponds to coherent states, but all our present entropic measures yield results that are independent of $\alpha$, as we have seen above. Thus $T_{SDL}^{Shannon} = 0$, not detecting any salient feature in $P$. Tsallis' measure, instead, introduces another parameter, namely $q$; correspondingly, $T_{SDL}^{Tsallis}(q)$ yields different values for different $q$ and produces a $q$-parametrized curve. We plot in Figure 6 $T_{SDL}$ versus $q \in [0, \infty)$ for Shannon's ($q = 1$), Tsallis' (red, $q \neq 1$), and Rényi's (brown, $q \neq 1$) measures $S_Q^q$. As expected, the statistical complexity $T$ vanishes at $q = 1$, as we have just explained. For the $q$-entropies it first grows and then stabilizes. The Tsallis curve displays a maximum at $q \approx 2.3$, singling out a special $q$ value of maximum complexity. What do we make of this maximum? That there are salient peculiarities in the distribution $P$ above that the Tsallis SDL measure best detects at this $q$ value. The Rényi measure's detection ability grows with $q$ at first, but eventually its non-triviality sensor stabilizes. Thus, if one is to apply $P$ in computing some physical quantity $B$, the features of $B$ had better be scrutinized via Tsallis' measure with $q = 2.3$, which would be the most "informative" one.

8.2. López Ruiz-Mancini-Calbet (LMC) Measure

The López Ruiz-Mancini-Calbet (LMC) measure is today regarded as the canonical SC measure and has been applied to multiple physics instances [28,29,30,31,32,33,34,35,37,38,39,40,41,42,43,44,45,46]. It has the following form:
T_{LMC} = S\, Q, \qquad (32)
where $Q$ is called the disequilibrium and is a distance in probability space between the current probability distribution $P$ and the uniform distribution. For continuous one-dimensional probability densities $P$ one has [1,2,3,4,5,6,7,8,9,10,11,28,29,30,31,32,33,34,35]
Q = \int P^2\, dx. \qquad (33)
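For the coherent-state density, the disequilibrium (33) is itself $\alpha_R$-independent: $Q = 1/\sqrt{2\pi} \approx 0.399$, so each LMC value is just $N(q)\,Q$. A minimal sketch of this computation (ours; we assume the generalized LMC quantifiers plotted below are built in exactly this way from the $N(q)$ of Section 5):

```python
# LMC complexity T = S * Q for the coherent-state density (alpha_R = 0 suffices).
import numpy as np
from scipy.integrate import quad

F = lambda x: np.pi ** -0.5 * np.exp(-x ** 2)
Q = quad(lambda x: F(x) ** 2, -np.inf, np.inf)[0]   # disequilibrium = 1/sqrt(2*pi)

N_S = 0.5 * (1 + np.log(np.pi))                                        # Eq. (19)
N_T = lambda q: (1 - q ** -0.5 * np.pi ** ((1 - q) / 2)) / (q - 1)     # Eq. (20)
N_R = lambda q: (np.log(np.pi) * (1 - q) - np.log(q)) / (2 * (1 - q))  # Eq. (21)

print(N_S * Q)                         # Shannon LMC point, ~0.428
for q in (2.0, 5.0, 10.0):
    print(q, N_T(q) * Q, N_R(q) * Q)   # both decrease as q grows (cf. Figure 7)
```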
We have computed $T_{LMC}$ for the probability distributions discussed above and plotted the results versus $q$ in Figure 7: Shannon (blue dot), Tsallis (green, $q \neq 1$), and Rényi (red, $q \neq 1$). Note that no complexity maximum is displayed here by any of these curves. The LMC picture is the reverse of the SDL one. $C$ is maximal for Shannon's information measure, which thus becomes the most informative one. Rényi's fares worse, and Tsallis' even more so. Moreover, in the two last cases the measures become less and less informative as $q$ grows. Let us point out here that most people regard the LMC $C$ as the canonical one, which has successfully detected phase transitions in many systems [28,29,30,31,32,33,34,37,38,39,40,41,42,43,44,45,46]. Thus, we construe our results as further evidence that LMC is better than SDL.

9. Conclusions

We have achieved in this effort a way of classifying the large number of different entropic functionals in vogue nowadays. This should be of importance in giving a semblance of order to the pandemonium of entropies galore that are used in a plethora of distinct scientific endeavors. Science always begins with a process of classification [25].
In our classification efforts we were aided by the pure-state entropy $S_Q$ advanced and utilized in References [9,10,11,12,13]. Our pure states are the coherent ones of the HO (CHO), taking advantage of their closed analytic representation advanced in References [18,19]. They are unique in the sense of possessing minimum Heisenberg uncertainty. We compute and compare diverse entropic functionals of the CHO probability densities.
Our quantum entropy $S_Q$ represents the information-theoretic ignorance pertaining to the square modulus of $\psi(x)$ when it is regarded as a probability density. As just stated, in this paper $\psi_\alpha(x)$ is an HO coherent state, and for any entropic functional $S_Q$ one encounters a displacement-$\alpha$-independent, positive real value $N(Q)$. This last fact gives sense to our central proposal, stated above, of associating to any entropic functional a numerical real value. $N(Q)$ is the same for any arbitrary $\alpha$ and thus uniquely characterizes the entropic functional $S_Q$.
These numbers $N(Q)$ provide a way of listing, and thus classifying, the plethora of entropic functionals in the extant literature. An application to statistical complexity measures (SCMs) is made, which reveals significant differences between two popular SCMs.

Author Contributions

All authors produced the paper collaboratively in equal fashion. All authors have read and agreed to the published version of the manuscript.

Funding

Research was partially supported by FONDECYT, grant 1181558, and by CONICET (Argentine Agency).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  2. Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620–630.
  3. Jaynes, E.T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957, 108, 171–190.
  4. Brillouin, L. Science and Information Theory; Academic Press: New York, NY, USA, 1956.
  5. Katz, A. Principles of Statistical Mechanics; Freeman: San Francisco, CA, USA, 1967.
  6. Balian, R. From Microphysics to Macrophysics; Springer: Berlin, Germany, 1991.
  7. Balian, R.; Alhassid, Y.; Reinhardt, H. Dissipation in many-body systems: A geometric approach based on information theory. Phys. Rep. 1986, 131, 1–146.
  8. Reinhardt, H. On the description of dissipative collective motion. Nucl. Phys. A 1984, 413, 475–488.
  9. Canosa, N.; Plastino, A.; Rossignoli, R. Ground-state wave functions and maximum entropy. Phys. Rev. A 1989, 40, 519.
  10. Canosa, N.; Rossignoli, R.; Plastino, A. Maximum entropy principle for many-body ground states. Nucl. Phys. A 1990, 512, 492–508.
  11. Canosa, N.; Rossignoli, R.; Plastino, A.; Miller, H.G. Quantal entropy, fluctuations, and the description of many-body ground states. Phys. Rev. C 1992, 45, 1162.
  12. Arrachea, L.; Canosa, N.; Plastino, A.; Portesi, M.; Rossignoli, R. Maximum-entropy approach to critical phenomena in ground states of finite systems. Phys. Rev. A 1992, 45, 7104.
  13. Canosa, N.; Plastino, A.; Rossignoli, R. Maximum-entropy-correlated ground state and the description of collective excitations. Nucl. Phys. A 1992, 550, 453–472.
  14. Arrachea, L.; Canosa, N.; Plastino, A.; Rossignoli, R. Ground state of the Hubbard model: A variational approach based on the maximum entropy principle. Phys. Lett. A 1993, 176, 353–359.
  15. Casas, M.; Plastino, A.; Puente, A.; Canosa, N.; Rossignoli, R. WKB wave functions without matching. Phys. Rev. A 1993, 47, 3530.
  16. Plastino, A.; Plastino, A.R. Maximum entropy and approximate descriptions of pure states. Phys. Lett. A 1993, 181, 446–449.
  17. Fernandez, F.M.; Castro, E.A. Hypervirial Theorems; Springer: Berlin, Germany, 1987.
  18. Ferri, G.L.; Pennini, F.; Plastino, A.; Rocca, M.C. New mathematics for the nonadditive Tsallis’ scenario. Int. J. Mod. Phys. B 2017, 31, 1750151.
  19. Plastino, A.; Rocca, M.C. Teaching strategy for introducing beginners to Coherent States. Rev. Mex. Fis. E 2019, 65, 191–194.
  20. Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: Berlin, Germany, 2009.
  21. Landsberg, P.T. Entropies galore! Braz. J. Phys. 1999, 29, 46–49.
  22. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
  23. Kaniadakis, G. Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions. Entropy 2013, 15, 3983–4010.
  24. Sparavigna, A.C. On the generalized additivity of Kaniadakis entropy. Int. J. Sci. 2015, 4, 44–48.
  25. Fara, P. Science, a Four Thousand Year History; Oxford University Press: Oxford, UK, 2009.
  26. Sharma, B.D.; Mittal, D.P. New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40.
  27. Nielsen, F.; Nock, R. A closed-form expression for the Sharma–Mittal entropy of exponential families. J. Phys. A 2012, 45, 032003.
  28. Pennini, F.; Plastino, A. Disequilibrium, thermodynamic relations, and Rényi’s entropy. Phys. Lett. A 2017, 381, 212–215.
  29. Pennini, F.; Plastino, A. Complexity and disequilibrium as telltales of superconductivity. Physica A 2018, 506, 828–834.
  30. Pennini, F.; Plastino, A. Disequilibrium, complexity, the Schottky effect, and q-entropies, in paramagnetism. Physica A 2017, 488, 85–95.
  31. Pennini, F.; Plastino, A. Statistical Complexity of the Coriolis Antipairing Effect. Entropy 2019, 21, 558.
  32. Branada, R.; Pennini, F.; Plastino, A. Statistical complexity and classical–quantum frontier. Physica A 2018, 511, 18–26.
  33. Pennini, F.; Plastino, A. Statistical quantifiers for few-fermion systems. Physica A 2018, 491, 305–312.
  34. Pennini, F.; Plastino, A. Statistical manifestation of quantum correlations via disequilibrium. Phys. Lett. A 2017, 381, 3849–3854.
  35. Anteneodo, C.; Plastino, A.R. Some features of the López-Ruiz-Mancini-Calbet (LMC) statistical measure of complexity. Phys. Lett. A 1996, 223, 348–354.
  36. Shiner, J.S.; Davison, M.; Landsberg, P.T. Simple measure for complexity. Phys. Rev. E 1999, 59, 1459.
  37. López-Ruiz, R.; Mancini, H.L.; Calbet, X. A statistical measure of complexity. Phys. Lett. A 1995, 209, 321–326.
  38. Crutchfield, J.P. The calculi of emergence: Computation, dynamics and induction. Phys. D 1994, 75, 11–54.
  39. Feldman, D.P.; Crutchfield, J.P. Measures of statistical complexity: Why? Phys. Lett. A 1998, 238, 244–252.
  40. Martin, M.T.; Plastino, A.; Rosso, O.A. Statistical Complexity and Disequilibrium. Phys. Lett. A 2003, 311, 126–132.
  41. Kowalski, A.; Martin, M.T.; Plastino, A.; Rosso, O.; Proto, A.N. Wavelet statistical complexity analysis of the classical limit. Phys. Lett. A 2003, 311, 180–191.
  42. Rudnicki, L.; Toranzo, I.V.; Sanchez-Moreno, P.; Dehesa, J.S. Monotone measures of statistical complexity. Phys. Lett. A 2016, 380, 377–380.
  43. López-Ruiz, R. A statistical measure of complexity. In Concepts and Recent Advances in Generalized Information Measures and Statistics; Kowalski, A., Rossignoli, R., Curado, E.M.C., Eds.; Bentham Science Books: New York, NY, USA, 2013; pp. 147–168.
  44. Sen, K.D. Statistical Complexity. Applications in Electronic Structure; Springer: Berlin, Germany, 2011.
  45. Mitchell, M. Complexity: A Guided Tour; Oxford University Press: Oxford, UK, 2009.
  46. Martin, M.T.; Plastino, A.; Rosso, O.A. Generalized statistical complexity measures: Geometrical and analytical properties. Physica A 2006, 369, 439–462.
Figure 1. $N(q)$ versus $q$. The dark blue dot represents Shannon's number $N_S$; the green curve corresponds to Tsallis' $N_T(q)$, the red one to Rényi's $N_R(q)$, and the blue one to Kaniadakis' $N_K(q)$.
Figure 2. Two monoparametric functions $N(q)$ versus $q$: green for $N_R(q)$ and red for $N_T(q)$. The black curve displays $N_R(q)$ versus $N_T(q)$.
Figure 3. Two monoparametric functions $N(q)$ versus $q$: green for $N_R(q)$ and red for $N_K(q)$. The black curve displays $N_R(q)$ versus $N_K(q)$.
Figure 4. Sharma-Mittal's $N_{SM}(q, r) = \mathcal{F}_{\alpha_R}(S_{SM}(q, r))$ versus $q$: purple for $r = 2q - 1$, orange for $r = 2$, and purple for $r = 0.5$.
Figure 5. $\mathcal{F}_{\alpha_R}$ is compared for (i) a monoparametric measure (Tsallis) versus (ii) a biparametric one (Sharma-Mittal); the independent variable is $q$. The green curve represents $\mathcal{F}_{\alpha_R}(S_T^q)$, while the blue one displays $\mathcal{F}_{\alpha_R}(S_{SM}(q, 2q-1))$. The black curve is different: it plots $\mathcal{F}_{\alpha_R}(S_T^q)$ versus $\mathcal{F}_{\alpha_R}(S_{SM}(q, 2q-1))$.
Figure 6. The Shiner-Davison-Landsberg complexity measure is plotted vs. $q$ for the Shannon, Tsallis, and Rényi entropic measures, as described in the text.
Figure 7. The López Ruiz-Mancini-Calbet complexity measure is plotted vs. $q$ for the Shannon, Tsallis, and Rényi entropic measures, as described in the text.
