Editors: Andres M. Kowalski, Raúl D. Rossignoli, Evaldo M. F. Curado

Concepts and Recent Advances in Generalized Information Measures and Statistics

ISBN: 978-1-60805-761-0 (Print)
ISBN: 978-1-60805-760-3 (Online)
Year of Publication: 2013
DOI: 10.2174/97816080576031130101

Introduction

Introduction: Summary of Contents

The goal of this book is to offer an updated overview of generalized information measures and statistics, covering the basic concepts as well as some recent relevant applications.

The book begins with a historical introduction describing the fascinating development of the concepts of heat and entropy. Starting from the ideas of ancient Greece, an account of the main historical breakthroughs is provided, which allows the reader to appreciate the fundamental contributions of Nicolas Sadi Carnot, Rudolf Clausius, Ludwig Boltzmann, Josiah Willard Gibbs and others. It ends with the seminal works of Claude Shannon, which led to the foundation of Information Theory, and of Edwin Jaynes, which connected the latter with Statistical Mechanics.

The second chapter is a basic tutorial on the essentials of information entropy, describing at an accessible level the concepts and quantities used in the rest of this book. The Shannon entropy and its associated measures, such as the conditional entropy, the mutual information (a measure of correlations) and the relative entropy (also known as the Kullback-Leibler divergence, a measure of the discrepancy between two probability distributions), are all presented, together with their main properties and the most important proofs. We also provide the main features of the Fisher information, which can be presented in terms of the relative entropy between two slightly displaced distributions, and the associated Cramér-Rao bound. Other topics include the definition of entropy in quantum systems, the fundamental property of concavity and a brief introduction to the maximum entropy approach and its connection with statistical mechanics. The chapter ends with the Shannon-Khinchin axioms leading to the uniqueness theorem for the Shannon entropy, together with an introduction to the concept of generalized entropies.
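
For quick reference, these quantities have the standard forms (in notation that may differ in minor details from that adopted in chapter 2):

S(X) = -\sum_i p_i \log p_i , \qquad S(X|Y) = S(X,Y) - S(Y) , \qquad I(X:Y) = S(X) + S(Y) - S(X,Y) ,

D(P\|Q) = \sum_i p_i \log (p_i/q_i) , \qquad I_F(\theta) = \int p(x|\theta) \left[\partial_\theta \ln p(x|\theta)\right]^2 dx , \qquad \mathrm{Var}(\hat{\theta}) \ge 1/I_F(\theta) .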

Chapters 3, 4 and 5 are devoted precisely to the concept of generalized entropy. Chapter 3 presents a review by Constantino Tsallis of the famous non-additive entropy Sq which bears his name, together with the associated generalized statistical mechanics and q-distributions. As described there, such a generalized framework allows for the possibility of an extensive thermodynamic entropy in strongly correlated systems where the standard additive Boltzmann-Gibbs entropy is non-extensive. The chapter includes a comprehensive list of relevant recent applications of the formalism in the most diverse fields, together with the concomitant references. It also comments on recent relevant results concerning the connection of Sq with number theory through the Riemann zeta function. Chapter 4 presents an axiomatic approach for deriving the form of a generalized entropy. After describing in full detail the four Shannon-Khinchin axioms, it considers the situation where just the first three are preserved, together with the requirement of two newly discovered scaling laws which the generalized entropy should fulfill. It is shown that this leads to a general form of entropy depending essentially on two parameters, which define entropic equivalence classes. The connection with the Shannon, Tsallis, Rényi and other entropies is described in detail, together with the associated distribution functions and some related aspects. The chapter includes an appendix containing the technical details and the proofs of four associated theorems.
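
For concreteness, the non-additive entropy of chapter 3 has the standard form

S_q = k \, \frac{1 - \sum_i p_i^q}{q - 1} , \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i = S_{BG} ,

and for two independent systems A and B it satisfies S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B)/k, which makes its non-additive character explicit.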

Chapter 5 discusses the relation between generalized entropies and the concept of majorization. The latter is a powerful and elegant mathematical theory for comparing probability distributions, which leads to a rigorous concept of mixedness and disorder. This chapter first describes the concept of majorization at an accessible level. It then considers its connection with entropy, and shows that by means of generalized entropies it is possible to express the majorization relation in terms of entropic inequalities. It also describes the majorization properties of the probability distributions determined by the maximization of general entropic forms, and the concept of mixing parameters, i.e., parameters whose increase ensures majorization. Finally, the concept of majorization in the quantum case, i.e., for density operators, is also examined. As an application, the problem of quantum entanglement detection is considered, where it is shown that majorization leads to a generalized entropic criterion for separability which is much stronger than the standard entropic criterion.

In chapter 6, the notion of distance measures for probability distributions is reviewed. An overview of the most frequently used metrics and distances, such as the Euclidean metric, Wootters's distance, the Fisher metric and the Kullback-Leibler divergence, is given, centering the analysis on the distance known as the Jensen-Shannon divergence, in both its classical and quantum versions. The application of the latter as a measure of quantum entanglement is also discussed. This chapter is related to the next two chapters, which are devoted to Statistical Measures of Complexity, because these measures depend on distances in probability space.
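
In standard notation (which may differ slightly from that of chapters 5 and 6), a distribution p is majorized by p', written p \prec p', when

\sum_{i=1}^{k} p_i^{\downarrow} \le \sum_{i=1}^{k} p_i'^{\downarrow} \quad (k = 1, \ldots, n-1), \qquad \sum_{i=1}^{n} p_i^{\downarrow} = \sum_{i=1}^{n} p_i'^{\downarrow} = 1 ,

where the arrow indicates that the probabilities are sorted in decreasing order, while the classical Jensen-Shannon divergence between two distributions P and Q is

JS(P,Q) = S\!\left(\frac{P+Q}{2}\right) - \frac{1}{2} S(P) - \frac{1}{2} S(Q) ,

with S the Shannon entropy.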

There is no universally accepted definition of complexity, nor of quantifiers of complexity. An extensive list of relevant contributions can be found in the introduction of chapter 8, as well as in the references of chapter 7. A comparative classification of various complexity measures, by Wackerbauer, Witt, Atmanspacher, Kurths and Scheingraber, can be found in reference [10] of chapter 8. We will here consider just a particular class of complexity measures based on information theory, which are essentially a combination of an entropy with a distance measure in probability space. Such measures vanish when the probability distribution implies either full certainty or complete uncertainty, being maximum at some “intermediate” distribution. This approach is precisely the one adopted in chapter 7, where Ricardo López-Ruiz, Héctor Mancini and Xabier Calbet introduce the well-known measure of complexity known by their surnames (the LMC Statistical Measure of Complexity). Its properties are discussed in full detail and some interesting applications (Gaussian and exponential distributions, and complexity in a two-level laser model) are also provided. In chapter 8 the properties of a Generalized Statistical Complexity Measure are discussed. The authors adopt the functional product form of the LMC Statistical Measure of Complexity, but consider different entropic forms and different definitions of distance between probability distributions, beyond the Shannon Entropy and Euclidean distance used in chapter 7. In particular, the use of the Jensen-Shannon divergence introduced in chapter 6, together with the Shannon Entropy (the Shannon Jensen Statistical Complexity), is analyzed in depth. Another important aspect considered in this chapter is the methodology for the proper determination of the underlying probability distribution function (PDF) associated with a given dynamical system or time series. We should also mention here the Statistical Complexity of Shiner, Davison and Landsberg (ref. [13] of chapter 8).
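
In its usual formulation, the LMC measure of chapter 7 is the product of a normalized Shannon entropy and a Euclidean disequilibrium,

C_{LMC}[P] = H[P]\, D[P] , \qquad H[P] = -\frac{1}{\log N} \sum_{i=1}^{N} p_i \log p_i , \qquad D[P] = \sum_{i=1}^{N} \left( p_i - \frac{1}{N} \right)^2 ,

while the Shannon Jensen Statistical Complexity of chapter 8 keeps the same product form but replaces the Euclidean disequilibrium by the normalized Jensen-Shannon divergence to the uniform (equilibrium) distribution P_e, i.e., C_{JS}[P] = Q_J[P, P_e]\, H[P].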

In chapters 9-15, different applications of generalized information measures are considered. Chapter 9 deals with the Fisher information, whose basic properties were introduced in chapter 2. The chapter describes its use in radial probability distributions associated with quantum states, determining the related Cramér-Rao inequalities and the explicit expressions of the Fisher information for both ground and excited states of D-dimensional hydrogenic systems. It then considers its application to some physico-chemical processes, showing that the Fisher information can be a valuable tool for detecting the transition rate and the stationary points of a chemical reaction.

Chapters 10 and 11 deal with problems in physics, while chapters 12-15 deal with applications in other fields. In particular, chapters 14 and 15 are devoted to biological applications. In chapter 10, the links between the entanglement concept (see also chapter 5) and the information entropy are analyzed, as represented by different measures like the Shannon, Rényi (see chapters 2 and 4) and Tsallis (see chapter 3) ones. In chapter 11, the authors review the difference between quantum statistical treatments and semiclassical ones, using a semiclassical Fisher information measure built up with Husimi distributions. Chapter 12 deals with the use of information theory tools for characterizing pseudo random number generators obtained from chaotic dynamical systems. The authors make use of the conjunction of the entropy and the Shannon Jensen Statistical Complexity introduced in chapter 8 to evaluate the quality of pseudo random number generators. This is done by quantifying the equiprobability of all output values and the statistical independence between consecutive outputs, by comparing a Shannon Entropy calculated with a histogram PDF and a Shannon Jensen Statistical Complexity calculated with a symbolic Bandt-Pompe PDF (see chapter 8) in an Entropy-Statistical Complexity plane. In chapter 13, the authors employ different information measures, such as the Shannon Entropy, the Fisher information measure (see chapter 2) and the Shannon Jensen Statistical Complexity (introduced in chapter 8), to analyze sedimentary data corresponding to the Holocene and so characterize changes in the dynamical behavior of ENSO (El Niño/Southern Oscillation) during this period.
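
As a rough illustration of the procedure just summarized for chapter 12 (a minimal sketch, not the authors' code; function names, the embedding dimension and the test sequence are illustrative assumptions), the Bandt-Pompe ordinal-pattern PDF and the corresponding point in the Entropy-Statistical Complexity plane can be computed along the following lines:

import math
import random
from collections import Counter
from itertools import permutations

def bandt_pompe_pdf(series, d=4):
    """Ordinal-pattern (permutation) probabilities for embedding dimension d."""
    counts = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        # the pattern is the permutation that sorts the window (its argsort)
        counts[tuple(sorted(range(d), key=lambda k: window[k]))] += 1
    total = sum(counts.values())
    # keep all d! components, including patterns that never occur
    return [counts.get(p, 0) / total for p in permutations(range(d))]

def shannon(p):
    """Shannon entropy (natural logarithm) of a discrete PDF."""
    return -sum(x * math.log(x) for x in p if x > 0)

def entropy_complexity(p):
    """Normalized entropy H and Jensen-Shannon statistical complexity C."""
    n = len(p)
    h = shannon(p) / math.log(n)                      # H in [0, 1]
    pe = [1.0 / n] * n                                # uniform (equilibrium) PDF
    mix = [(a + b) / 2 for a, b in zip(p, pe)]
    js = shannon(mix) - 0.5 * shannon(p) - 0.5 * shannon(pe)
    # normalization so that the Jensen-Shannon term also lies in [0, 1]
    q0 = -2.0 / (((n + 1.0) / n) * math.log(n + 1) - 2 * math.log(2 * n) + math.log(n))
    return h, q0 * js * h

series = [random.random() for _ in range(100000)]
print(entropy_complexity(bandt_pompe_pdf(series, d=4)))

For embedding dimension d = 4 there are 4! = 24 possible ordinal patterns, so the series should be much longer than d! for the symbolic PDF to be meaningful; a good pseudo random number generator is expected to land near the point (H ≈ 1, C ≈ 0) of the plane.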

In chapter 14, the authors present an application of wavelet-based information measures to characterize red blood cell membrane viscoelasticity. The relative energy, the Shannon Entropy and the Shannon Jensen Statistical Complexity calculated with a wavelet PDF, a technique introduced in chapter 8, together with an Entropy-Complexity plane, are used to analyze a human haematological disease.

Finally, in chapter 15 the authors apply an information theoretic approach to analyze the role of spike correlations in the neuronal code. By considering certain brain structures as communication channels, the application of Information Theory becomes feasible, allowing in particular the investigation of correlations through the pertinent mutual information. It is a nice example of the important role played by information theoretical methods in current problems of theoretical neuroscience. The chapter also includes a comprehensive list of references on the subject.

Foreword

Foreword by Nobre

The concept of entropy appears in many areas of knowledge, like thermodynamics, statistical mechanics, information theory, biology, economics, and the human sciences. From the historical point of view, it was introduced in 1865 by Rudolf Clausius through an elegant formulation of the second law of thermodynamics. A nice historical review of how the concept of entropy emerged in physics is presented in Chapter 1, written by the editors of this book. According to the second law, the total entropy of an isolated system can never decrease in thermodynamical transformations, and in the case of irreversible transformations it always increases. Since most natural processes are irreversible, entropy has been associated with the “arrow of time”. Some years later (1872), Ludwig Boltzmann wrote down an equation (known as the Boltzmann equation) to describe the evolution of the single-particle distribution of a rarefied gas. Considering this distribution, he defined a quantity H and proved the famous H-theorem, showing that H always decreases in time. Boltzmann realized that for a perfect gas in equilibrium the quantity H was related to Clausius' entropy S (apart from a minus sign and some multiplicative constants). This identification led to the definition of statistical entropy, i.e., an entropy defined in terms of a probability distribution P(x̄, t) associated with the occurrence of a given physical quantity x̄ (in the case of gases x̄ may represent the position, the velocity, or both the position and velocity, of a molecule) at a time t. Moreover, the H-theorem yielded a microscopic interpretation of the second law of thermodynamics.
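
In modern notation (and up to conventions that vary among textbooks), the quantity mentioned above is the functional

H(t) = \int f(\vec{v}, t)\, \ln f(\vec{v}, t)\, d^3v ,

for which the H-theorem states dH/dt \le 0; for the ideal gas in equilibrium, S = -k_B H up to an additive constant, with k_B the Boltzmann constant.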

The dynamical approach of Boltzmann, together with the elegant theory of statistical ensembles at equilibrium proposed by Josiah Willard Gibbs, led to the Boltzmann-Gibbs theory of statistical mechanics, which represents one of the most successful theoretical frameworks of physics. The knowledge of the equilibrium distribution associated with a given statistical ensemble allows one to calculate average values to be related with thermodynamic quantities. This theory is based on a fundamental assumption, namely, the ergodic hypothesis, which requires that the system pass through all its microstates after a sufficiently long time. Only if ergodicity holds can one replace a given time average (defined within the Boltzmann framework) by the corresponding average over a statistical ensemble (defined at equilibrium).

In Chapter 2 (also written by the editors of this book) the essentials of information theory are introduced and discussed. This theory was created in 1948 by Claude Shannon through the definition of a measure of information with a form similar to that of the quantity H introduced by Boltzmann. Consequently, from this measure of information one constructs a statistical entropy, sometimes called the Boltzmann-Gibbs-Shannon entropy. In 1957, E. T. Jaynes introduced the Maximum Entropy Principle, which allows one to obtain equilibrium distributions by maximizing the statistical entropy under given constraints. In this way, one can derive the equilibrium distributions of statistical mechanics from the entropy maximization procedure, by considering the Boltzmann-Gibbs-Shannon entropy under the constraints suitable for each statistical ensemble.
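
For instance, maximizing S = -k \sum_i p_i \ln p_i under the constraints \sum_i p_i = 1 and \sum_i p_i E_i = \langle E \rangle (the canonical ensemble) yields, via Lagrange multipliers,

p_i = \frac{e^{-\beta E_i}}{Z} , \qquad Z = \sum_i e^{-\beta E_i} ,

with the multiplier \beta fixed by the energy constraint and identified with the inverse temperature.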

Interest in a wide variety of complex systems, which are usually characterized by a large number of constituents that interact through long-range forces and may present peculiar features, like strong correlations, long-time memory, and breakdown of ergodicity, has increased in recent years. Due to these characteristics, complex systems are expected to exhibit collective behaviors very different from those of the rarefied gas considered by Boltzmann in 1872. Many experiments, as well as computational studies of complex systems, have shown properties in disagreement with the predictions of Boltzmann-Gibbs statistical mechanics, suggesting the need for a more general theory for their description. A breakthrough occurred in 1988 with the introduction of a generalized entropy Sq by Constantino Tsallis; this proposal is discussed in detail in Chapter 3. It is characterized by a real index q, such that in the limit q → 1 one recovers the Boltzmann-Gibbs-Shannon entropy. The entropy Sq, and more particularly the distribution that comes from its maximization, has been very successful in describing many situations where Boltzmann-Gibbs statistical mechanics fails.

Although the entropy Sq has so far been the most successful in describing complex systems, other generalized entropic forms have also been proposed in the literature, as discussed in Chapter 4 by S. Thurner and R. Hanel. This is what the present book is about, containing interesting chapters on history, theory, computational methods, experimental verifications, and applications. The book is addressed not only to physicists, but to researchers in a large diversity of fields, like biology, medicine, economics and the human sciences, and to all those interested in understanding the mysteries within the realm of complex systems.

Fernando Dantas Nobre
Centro Brasileiro de Pesquisas Físicas
Rio de Janeiro - RJ - Brazil

Foreword by Vucetich

In the fourth century BC, Aristotle stated that he had at his disposal an infallible method to find the truth, namely, the inductive one. And in the XVII century, Leibniz proposed the construction of a “Calculus of Ideas” that would end vain debates. Neither project succeeded, but each of them gave rise to disciplines that enriched science: logic and statistics. In the XIX century a new, powerful idea was developed, namely, that of entropy: an ever-growing physical quantity that measured the degree of decay of order in a physical system. This powerful idea not only explained quantitatively the behavior of gases, dilute solutions and chemical reactions, but also addressed the philosophical problem of decay that Aristotle attributed to earthly matter. With the introduction of entropy, thermodynamics became a model of theoretical science.

In 1948 Shannon developed a “Statistical theory of communication” taking ideas from both fields, which in turn opened new paths for research. The powerful notion of information entropy played a major part in the development of new statistical techniques, overhauled the Bayesian approach to probability and statistics, and provided powerful new techniques and approaches in several fields of science.

Later on, several generalizations of the concept of information entropy were introduced, which extended the field and shed new light on it. These generalizations are already applied in statistical problems and may find interesting applications in areas of science such as critical behavior or neuroscience.

These and related topics are treated in this book, which is a review of an old subject from a young point of view. Starting from its historical roots, the book proceeds with the mathematical foundations, generalizations, properties and applications of the powerful notion of information entropy to different branches of mathematics and natural science. As such, it gives a state-of-the-art perspective on the subject in the second decade of the XXI century.

Reader: enjoy!

Héctor Vucetich
Observatorio Astronómico
Universidad Nacional de La Plata
La Plata – Argentina

