
94A17 Information and communication, circuits -- Communication, information -- Measures of information, entropy. Online: Zentralblatt | MathSciNet

| Location | Call Number | Status | Date Due |
|---|---|---|---|
| Salle R | 04518-01 / 28 BIL (Browse Shelf) | Available | |
| Salle R | 04518-02 / 28 BIL (Browse Shelf) | Available | |

Bibliography: p. 181-185. Includes index.

This book, the first in the new series "Tracts on Probability and Mathematical Statistics'', originated in a series of lectures given by the author to one of the London Mathematical Society's conferences. It is a charming, well-written and informative monograph. The exposition is simple enough to be read without pen and paper at hand, and in the space of 180 pages the author covers a remarkably wide selection of topics, at times sacrificing deep technicalities for a cohesive account of ergodic theory and its applications and ramifications. These will perhaps interest even the specialist, despite the author's disclaimer that the book is intended for readers to whom the subject is new. In fact, the one main concession to the non-specialist is Chapter 3, on elementary conditional probability and conditional expectation.

Chapter 1 treats the ergodic theorem, giving two proofs based on the maximal ergodic theorem. There are numerous examples, including an analysis of Gauss' problem on continued fractions and an application to Diophantine approximation. Chapter 2 treats the Kolmogorov-Sinai application of Shannon's entropy to the isomorphism problem of ergodic theory. Chapter 4 treats the convergence of entropy, including the Shannon-McMillan-Breiman theorem and an application to dimension theory, while Chapter 5 applies the preceding material to the principal theorems of coding theory. (MathSciNet)
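The ergodic theorem and the Gauss map discussed in Chapter 1 can be illustrated numerically. For the Gauss map T(x) = {1/x} (the shift on continued-fraction digits), the invariant Gauss measure gives μ([0, a)) = log₂(1 + a), so the time average of the indicator of [0, 1/2) along a typical orbit should approach log₂(3/2) ≈ 0.585. The following Python sketch is illustrative only (function names, iteration count, and tolerance are assumptions, not taken from the book):

```python
import math
import random

def gauss_map(x):
    # T(x) = 1/x mod 1, the shift map on continued-fraction expansions
    return (1.0 / x) % 1.0

def time_average(x0, n, a=0.5):
    """Fraction of the first n iterates of x0 that land in [0, a)."""
    x, hits = x0, 0
    for _ in range(n):
        if x < a:
            hits += 1
        x = gauss_map(x)
        if x == 0.0:  # guard against a (measure-zero) rational orbit in floating point
            x = math.pi % 1.0
    return hits / n

random.seed(0)
empirical = time_average(random.random(), 200_000)
theoretical = math.log2(1.5)  # Gauss measure of [0, 1/2)
print(empirical, theoretical)
```

For a randomly chosen (hence almost surely "typical") starting point, the two printed numbers agree to within a couple of decimal places, which is the Birkhoff ergodic theorem in action.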
