## Ergodic Theory

*22 Apr 2013 10:41*

A measure on a mathematical space is a way of assigning weights to different
parts of the space; volume is a measure on ordinary three-dimensional Euclidean
space. Probability distributions are measures for which the measure of the
whole space is 1 (and some other restrictions). Suppose we're interested in a
dynamical system --- a transformation that maps a space into itself. The set
of points we get from applying the transformation repeatedly to a point is
called its trajectory or orbit. Some dynamical systems are "measure
preserving", meaning that the measure of a set is always the same as the
measure of the set of points which map to it. (In symbols, using *T* for
the map and *P* for the probability measure, \( P(T^{-1}(A)) = P(A) \) for
any measurable set *A*.) Some sets may be invariant: they are the same
as their images. An ergodic dynamical system is one in which, with respect to
some probability distribution, all invariant sets either have measure 0 or
measure 1. (Sometimes non-ergodic systems can be decomposed into a number of
components, each of which is separately ergodic.) The dynamics need not be
deterministic; in particular, irreducible Markov chains with finite state
spaces are ergodic processes, since they have a unique invariant distribution
over the states. (In the Markov chain case, each of the ergodic components
corresponds to an irreducible sub-space.)
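As a minimal numerical sketch of the Markov-chain case (the 3-state transition matrix below is an illustrative assumption, not from the text): the unique invariant distribution is the left eigenvector of the transition matrix with eigenvalue 1, and starting the chain from that distribution makes it a stationary, measure-preserving process.

```python
import numpy as np

# Transition matrix of a hypothetical irreducible 3-state Markov chain
# (the numbers are illustrative, not from the text).
T = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# The invariant distribution pi solves pi T = pi, i.e. it is the left
# eigenvector of T for eigenvalue 1; by Perron-Frobenius it exists, is
# unique, and is strictly positive, since the chain is irreducible.
eigvals, eigvecs = np.linalg.eig(T.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Starting the chain from pi and taking one step leaves the distribution
# unchanged: the measure-preserving property, in distributional form.
print(pi, pi @ T)
```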

Ergodicity is important because of the following theorem (due to von Neumann, and then improved substantially by Birkhoff, in the 1930s). Take any well-behaved (integrable) function on our space, pick a point in the space at random (according to the ergodic distribution), and calculate the average of the function along that point's orbit --- the time-average. Then, with probability 1, as time goes to infinity, (1) the time-average converges to a limit, and (2) that limit equals the weighted average of the function's value over all points in the space (with the weights given by the same distribution) --- the space-average. In this sense, the orbit of almost any point you please looks like the whole of the state space.

(Symbolically, write *x* for a point in the state
space, *f* for the function we're averaging, and *T* and
*P* for the map and the probability measure as before. The
space-average, \( \overline{f} = \int{f(x)P(dx)} \). The time-average
starting from *x*, \( {\langle f\rangle}_x =
\lim_{n\rightarrow\infty}{(1/n) \sum_{i=0}^{n-1}{f(T^i(x))}} \). The ergodic
theorem asserts that if *f* is integrable and *T* is ergodic
with respect to *P*, then \( {\langle f \rangle}_x \) exists,
and \( P\left\{x : {\langle f \rangle}_x = \overline{f} \right\} =
1 \). --- A similar result holds for continuous-time dynamical systems,
where we replace the summation in the time average with an integral.)
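A quick numerical illustration of the theorem, using an irrational rotation of the circle, which is ergodic with respect to Lebesgue measure (the particular *α*, starting point, and test function below are my own illustrative choices):

```python
import numpy as np

# Irrational rotation T(x) = x + alpha mod 1 on [0, 1), ergodic with
# respect to Lebesgue measure when alpha is irrational.
alpha = np.sqrt(2) - 1
f = lambda x: np.cos(2 * np.pi * x)   # space-average under Lebesgue is 0

x0 = 0.123                            # an arbitrary starting point
n = 100_000
orbit = (x0 + alpha * np.arange(n)) % 1.0   # x, T(x), T^2(x), ...
time_avg = f(orbit).mean()

print(time_avg)   # very close to the space-average, 0
```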

This is an extremely important property for statistical mechanics. In fact,
the founder of statistical mechanics, Ludwig Boltzmann, coined "ergodic" as the
name for a stronger but related property: starting from a random point in state
space, orbits will typically pass through every point in state space. It is
easy to show (with set theory) that this is impossible, so people appealed to
a weaker property which was for a time known as "quasi-ergodicity": a typical
trajectory will pass *arbitrarily close* to every point in phase space.
Finally it became clear that only the modern ergodic property is needed. To
see the relation, consider the function, call it *I*, which is 1 on a
certain set, call it *A*, and 0 elsewhere. The time-average
of *I* is the fraction of time that the orbit spends in *A*. The
space-average of *I* is the probability that a randomly picked point is
in *A*. Since the two averages are almost always equal, almost all
trajectories end up covering the state space in the same way.
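The indicator-function argument can be checked by simulation (again with an illustrative, assumed 3-state chain; its invariant probability of state 0 works out to 9/28, about 0.321):

```python
import numpy as np

# An illustrative irreducible 3-state chain; take A = {state 0}.
T = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
rng = np.random.default_rng(0)

state, visits, n = 0, 0, 100_000
for _ in range(n):
    state = rng.choice(3, p=T[state])
    visits += (state == 0)

frac_in_A = visits / n    # time-average of the indicator function of A
# The ergodic theorem says this matches P(A) = pi[0] = 9/28.
print(frac_in_A)
```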

One way of thinking about the classical ergodic theorem is that it's a version of the law of large numbers --- it tells us that a sufficiently large sample (i.e., an average over a long time) is representative of the whole population (i.e., the space average). One thing I'd like to know more about than I do is ergodic equivalents of the central limit theorem, which say how big the sampling fluctuations are, and how they're distributed. The other thing I want to know about is the rate of convergence in the ergodic theorem --- how long must I wait before my time average is within a certain margin of probable error of the space average? Here I do know a bit more of the relevant literature, from large deviations theory.

Again in symbols: Let's write \( {\langle f\rangle}_{x,n} \) for the
time-average of *f*, starting from *x*, taken over *n*
steps. Then a central-limit theorem result would say that (for example)
\( \frac{r(n)\left({\langle f\rangle}_{x,n} - \overline{f}\right)}{\sigma} \)
converges in distribution to a Gaussian with mean zero and variance one, where
\( \sigma^2 \) is the (space-averaged) variance of *f* and
*r*(*n*) is some positive, increasing function of *n*. This
would be weak convergence of the time averages to the space averages, and
*r*(*n*) would give the rate. (In the usual IID case, \( r(n) =
\sqrt{n} \).) Somewhat stronger would be a convergence in probability
result, giving us a function \( N(\varepsilon,\delta) \) such that
\( P\left\{x : \left|{\langle f\rangle}_{x,n} - \overline{f}\right| \geq
\varepsilon \right\} \leq \delta \) if \( n \geq
N(\varepsilon,\delta) \). Proving many of these results requires stronger
assumptions than proving ergodicity does --- for instance, mixing properties.
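The \( r(n) = \sqrt{n} \) scaling can be seen numerically for a simple mixing process; the sketch below uses an AR(1) chain, with parameters and Monte Carlo setup that are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_time_averages(n, reps, phi=0.5):
    """Time averages over n steps of the AR(1) process X_{t+1} = phi X_t + noise,
    for `reps` independent runs; the space-average of X is 0."""
    x = np.zeros(reps)
    total = np.zeros(reps)
    for _ in range(n):
        x = phi * x + rng.standard_normal(reps)
        total += x
    return total / n

# Fluctuations of the time-average shrink like 1/sqrt(n): quadrupling n
# should roughly halve their standard deviation.
s1 = ar1_time_averages(1_000, 2_000).std()
s2 = ar1_time_averages(4_000, 2_000).std()
print(s1 / s2)   # roughly 2, consistent with r(n) = sqrt(n)
```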

These issues are part of a more general question about how to do statistical
inference for stochastic processes, a.k.a. time-series analysis. I am
especially interested in statistical learning
theory in this setting, which is in part about ensuring that the
ergodic theorem holds *uniformly* across classes of functions. Very
strong results have recently been achieved on that front by Adams and Nobel
(links below).

Another thing I'd like to understand, but don't have time to explain here, is the Pinsker sigma-algebra.

*See also*:
Deviation Inequalities;
Dynamical Systems and Chaos;
Empirical Process Theory;
Information Theory;
Mixing and Weak Dependence of Stochastic Processes;
Nonequilibrium Statistical Mechanics;
Probability Theory;
Recurrence Times of Stochastic
Processes;
Stochastic Processes;
Symbolic Dynamics;
Time Series;
Universal Prediction Algorithms

- Recommended, synoptic:
- Peter Billingsley, Ergodic Theory and Information
- Robert M. Gray, Probability, Random Processes, and Ergodic Properties [Full-text online]
- A. I. Khinchin, Mathematical Foundations of Statistical Mechanics [Proves the von Neumann-Birkhoff ergodic theorem in detail]
- Mark Kac, Probability and Related Topics in Physical Science
- Andrzej Lasota and Michael C. Mackey, Chaos, Fractals and Noise: Stochastic Aspects of Dynamics
- Norbert Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine

- Recommended, close-up:
- Terrence M. Adams and Andrew B. Nobel [See comments under Statistical Learning with Dependent Data]
    - "Uniform convergence of Vapnik-Chervonenkis classes under ergodic sampling", Annals of Probability **38** (2010): 1345--1367, arxiv:1010.3162
    - "The Gap Dimension and Uniform Laws of Large Numbers for Ergodic Processes", arxiv:1007.2964
- Itai Benjamini, Nicolas Curien, "Ergodic Theory on Stationary Random Graphs", arxiv:1011.2526
- Richard C. Bradley, "Basic Properties of Strong Mixing Conditions. A Survey and Some Open Questions", Probability Surveys **2** (2005): 107--144, arxiv:math/0511078
- J.-R. Chazottes and R. Leplaideur, "Birkhoff averages of Poincare cycles for Axiom-A diffeomorphisms", math.DS/0312291
- Jérôme Dedecker, Paul Doukhan, Gabriel Lang, José Rafael León R., Sana Louhichi and Clémentine Prieur, Weak Dependence: With Examples and Applications [Mini-review]
- Paul Doukhan, Mixing: Properties and Examples
- E. B. Dynkin, "Sufficient statistics and extreme points", Annals of Probability **6** (1978): 705--730 ["The connection between ergodic decompositions and sufficient statistics is explored in an elegant paper by DYNKIN" --- Kallenberg, Foundations of Modern Probability, p. 577]
- Jean-Pierre Eckmann and David Ruelle, "Ergodic Theory of Chaos and Strange Attractors", Reviews of Modern Physics **57** (1985): 617--656
- Roberto Fernández and Grégory Maillard, "Chains with Complete Connections: General Theory, Uniqueness, Loss of Memory and Mixing Properties", Journal of Statistical Physics **118** (2005): 555--588
- Stefano Galatolo, Mathieu Hoyrup, and Cristóbal Rojas, "Effective symbolic dynamics, random points, statistical behavior, complexity and entropy", arxiv:0801.0209 [*All*, not almost all, Martin-Löf points are statistically typical.]
- Weihong Huang, "On the long-run average growth rate of chaotic systems", Chaos **14** (2004): 38--47 [An amusing demonstration that positive-valued ergodic processes will seem to always have a positive long-run growth rate, even though they're stationary!]
- Michael Keane and Karl Petersen, "Easy and nearly simultaneous proofs of the Ergodic Theorem and Maximal Ergodic Theorem", math.DS/0608251 [This is a *lovely* little four-page paper, and the simplest proof by far that I've seen, but they do rely rather heavily on the reader being familiar with facts about time averages, invariant functions, etc. Still, I should definitely teach this in my class.]
- Aryeh (Leo) Kontorovich, "Metric and Mixing Sufficient Conditions for Concentration of Measure", math.PR/0610427 [See weblog comments]
- Aryeh Kontorovich and Anthony Brockwell, "A Strong Law of Large Numbers for Strongly Mixing Processes", arxiv:0807.4665
- Michael C. Mackey, Time's Arrow: The Origins of Thermodynamic Behavior
- Florence Merlevède, Magda Peligrad, Sergey Utev, "Recent advances in invariance principles for stationary sequences", math.PR/0601315 = Probability Surveys **3** (2006): 1--36 [Can I just say how much I hate calling the functional central limit theorem "*the* invariance principle"?]
- Mehryar Mohri and Afshin Rostamizadeh, "Stability Bound for Stationary Phi-mixing and Beta-mixing Processes", arxiv:0811.1629 ["Stability" is the property of a learning algorithm that changing a single observation in the training set leads to only small changes in predictions on the test set. This paper shows that stable learning algorithms continue to perform well with dependent data, provided the data are either phi mixing or beta mixing.]
- Andrew Nobel and Amir Dembo, "A Note on Uniform Laws of Averages for Dependent Processes", Statistics and Probability Letters **17** (1993): 169--172 [PDF preprint via Prof. Nobel. See comments under Statistical Learning with Dependent Data]
- Donald Ornstein and Benjamin Weiss, "How Sampling Reveals a Process", Annals of Probability **18** (1990): 905--930 [Some comments under Universal Prediction.]
- Murray Rosenblatt, "A Central Limit Theorem and a Strong Mixing Condition", Proceedings of the National Academy of Sciences (USA) **42** (1956): 43--47 [The root from which much subsequent ergodic theory has sprung. PDF reprint]
- Daniil Ryabko and Boris Ryabko, "Testing Statistical Hypotheses About Ergodic Processes", arxiv:0804.0510
- Nobusumi Sagara, "Nonparametric maximum-likelihood estimation of probability measures: existence and consistency", Journal of Statistical Planning and Inference **133** (2005): 249--271 ["This paper formulates the nonparametric maximum-likelihood estimation of probability measures and generalizes the consistency result on the maximum-likelihood estimator (MLE). We drop the independent assumption on the underlying stochastic process and replace it with the assumption that the stochastic process is stationary and ergodic. The present proof employs Birkhoff's ergodic theorem and the martingale convergence theorem. The main result is applied to the parametric and nonparametric maximum-likelihood estimation of density functions." *Very* cool.]
- Paul C. Shields, The Ergodic Theory of Discrete Sample Paths [Well-written modern text, extremely strong on connections to information theory and coding. I haven't gotten through the last chapter, however.]
- Leslie Sklar, Physics and Chance: Philosophical Issues in the Foundations of Statistical Mechanics [Good discussion of ergodic results in several places]
- Marta Tyran-Kaminska, "An Invariance Principle for Maps with Polynomial Decay of Correlations", math.DS/0408185 = Communications in Mathematical Physics **260** (2005): 1--15 ["We give a general method of deriving statistical limit theorems, such as the central limit theorem and its functional version, in the setting of ergodic measure preserving transformations. This method is applicable in situations where the iterates of discrete time maps display a polynomial decay of correlations."]
- Mathukumalli Vidyasagar, A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems [Has a very nice discussion of when the uniform laws of large numbers of statistical learning theory transfer from the usual IID setting to dependent processes, becoming uniform ergodic theorems. (Sufficient conditions include things like beta-mixing, but necessary and sufficient conditions seem to still be unknown.) Mini-review]
- Benjamin Weiss, Single Orbit Dynamics
- Wei Biao Wu, "Nonlinear system theory: Another look at dependence", Proceedings of the National Academy of Sciences **102** (2005): 14150--14154 ["we introduce [new] dependence measures for stationary causal processes. Our physical and predictive dependence measures quantify the degree of dependence of outputs on inputs in physical systems. The proposed dependence measures provide a natural framework for a limit theory for stationary processes. In particular, under conditions with quite simple forms, we present limit theorems for partial sums, empirical processes, and kernel density estimates. The conditions are mild and easily verifiable because they are directly related to the data-generating mechanisms."]

- Modesty forbids me to recommend:
- chs. 5 and 22--27 of Almost None of the Theory of Stochastic Processes
- "The World's Simplest Ergodic Theorem"

- To read:
- Jon Aaronson, An Introduction to Infinite Ergodic Theory
- Jon Aaronson and Tom Meyerovitch, "Absolutely continuous, invariant measures for dissipative, ergodic transformations", math.DS/0509093 ["We show that a dissipative, ergodic measure preserving transformation of a sigma-finite, non-atomic measure space always has many non-proportional, absolutely continuous, invariant measures and is ergodic with respect to each one of these."]
- Jose M. Amigo, Matthew B. Kennel and Ljupco Kocarev, "The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems", nlin.CD/0503044
- Vitor Araujo, "Semicontinuity of entropy, existence of equilibrium states and of physical measures", math.DS/0410099
- L. Arnold, Random Dynamical Systems
- V. I. Arnol'd and A. Avez, Ergodic Problems of Classical Mechanics
- Jeremy Avigad, Philipp Gerhardy and Henry Towsner, "Local stability of ergodic averages", arxiv:0706.1512 [Computing bounds on the rate of convergence in the ergodic theorems; sounds cool. Thanks to Gustavo Lacerda for the pointer.]
- Massimiliano Badino, "The Foundational Role of Ergodic Theory", phil-sci/2277
- Dominique Bakry, Patrick Cattiaux, Arnaud Guillin, "Rate of Convergence for ergodic continuous Markov processes: Lyapunov versus Poincare", math.PR/0703355
- Viviane Baladi, Positive Transfer Operators and Decay of Correlations
- Joseph Berkovitz, Roman Frigg and Fred Kronz, "The Ergodic Hierarchy, Randomness and Hamiltonian Chaos", phil-sci/2927
- A. A. Borovkov, Ergodicity and Stability of Stochastic Processes
- Philippe Chassaing, Jean Mairesse, "A non-ergodic PCA with a unique invariant measure", arxiv:1009.0143 [Here "PCA"="probabilistic cellular automaton"]
- J.-R. Chazottes and P. Collet, "Almost-sure central limit theorems and the Erdös-Rényi law for expanding maps of the interval", Ergodic Theory and Dynamical Systems **25** (2005): 419--441
- J.-R. Chazottes, P. Collet and B. Schmitt, "Devroye Inequality for a Class of Non-Uniformly Hyperbolic Dynamical Systems", Nonlinearity **18** (2005): 2323--2340, math.DS/0412166
- J.-R. Chazottes, P. Collet and B. Schmitt, "Statistical Consequences of Devroye Inequality for Processes. Applications to a Class of Non-Uniformly Hyperbolic Dynamical Systems", Nonlinearity **18** (2005): 2341--2364, math.DS/0412167
- J.-R. Chazottes and G. Gouezel, "On almost-sure versions of classical limit theorems for dynamical systems", math.DS/0601388 [arguing in support of the idea that "whenever we can prove a limit theorem in the classical sense for a dynamical system, we can prove a suitable almost-sure version based on an empirical measure with log-average".]
- J.-R. Chazottes, G. Keller, "Pressure and Equilibrium States in Ergodic Theory", arxiv:0804.2562
- J.-R. Chazottes and F. Redig, "Testing the irreversibility of a Gibbsian process via hitting and return times", math-ph/0503071 = Nonlinearity **18** (2005): 2477--2489
- Mu-Fa Chen
    - Eigenvalues, Inequalities, and Ergodic Theory
    - "Ergodic convergence rates of Markov processes--eigenvalues, inequalities and ergodic theory", math.PR/0304367

- Xia Chen, Limit Theorems for Functionals of Ergodic Markov Chains with General State Space
- Geon Ho Choe, Computational Ergodic Theory
- Christophe Cuny, "Pointwise ergodic theorems with rate and application to limit theorems for stationary processes", arxiv:0904.0185
- Lukasz Debowski, "Variable-Length Coding of Two-sided Asymptotically Mean Stationary Measures", Journal of Theoretical Probability **23** (2009): 237--256
- J. Dedecker, F. Merlevède, M. Peligrad, "Invariance principles for linear processes. Application to isotonic regression", arxiv:0903.1951 ["maximal inequalities and study the functional central limit theorem for the partial sums of linear processes generated by dependent innovations. Due to the general weights these processes can exhibit long range dependence and the limiting distribution is a fractional Brownian motion"]
- Thierry De La Rue, "An introduction to joinings in ergodic theory", math.DS/0507429 = Discrete and Continuous Dynamical Systems **15** (2006): 121--142
- Manfred Denker, Mikhail Gordin, "On conditional central limit theorems for stationary processes", pp. 133--151 in Krishna Athreya, Mukul Majumdar, Madan Puri, and Edward Waymire (eds.), Probability, statistics and their applications: papers in honor of Rabi Bhattacharya [Free online]
- G. B. DiMasi and L. Stettner, "Ergodicity of hidden Markov models", Mathematics of Control, Signals, and Systems **17** (2005): 269--296
- Ian Domowitz and Mahmoud El-Gamal, "A Consistent Nonparametric Test of Ergodicity for Time Series with Applications", Journal of Econometrics **102** (2001): 365--398 [SSRN. Having read about 2/3 of this, I completely fail to see how they actually overcome the problem that any one sample path is always confined to a single ergodic component.]
- Tomasz Downarowicz, Entropy in Dynamical Systems
- Aryeh Dvoretzky, "Asymptotic normality for sums of dependent random variables", Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 2, 513--535
- Martin Dyer, Leslie Ann Goldberg, Mark Jerrum, Russell Martin, "Markov chain comparison", math.PR/0410331 [i.e., comparison theorems for mixing times]
- Jean-Pierre Eckmann and Itamar Procaccia, "Invariant Measures in Generic Dynamical Systems", chao-dyn/9708021 [Abstract: "Irreversible thermodynamics of simple fluids have been connected recently to the theory of dynamical systems and some interesting assumptions have been made about the nature of the associated invariant measures. We show that the tests of the validity of these assumptions are insufficient by exhibiting observables that are incorrectly sampled with the proposed invariant measures. Only observables belonging to the 'high temperature phase' of the thermodynamic formalism are insensitive to the sampling methods. We outline methods that are free of these deficiencies." I read this when it came out, but I don't think I understood it then.]
- Bernhold Fiedler (ed.), Ergodic Theory, Analysis, and Efficient Simulation of Dynamical Systems
- Nikos Frantzikinakis, Randall McCutcheon, "Ergodic Theory: Recurrence", arxiv:0705.0033
- Gary Froyland, "Statistical optimal almost-invariant sets", Physica D **200** (2005): 205--219 [Partitioning state space into *nearly* separated components.]
- Stefano Galatolo, Mathieu Hoyrup, Cristóbal Rojas
    - "Dynamics and abstract computability: computing invariant measures", arxiv:0903.2385
    - "Dynamical systems, simulation, abstract computation", arxiv:1101.0833

- Gan Shixin, "Almost sure convergence for \( \tilde{\rho} \)-mixing random variable sequences", Statistics and Probability Letters **67** (2004): 289--298
- E. Glasner and B. Weiss, "On the interplay between measurable and topological dynamics", math.DS/0408328
- Beniamin Goldys and Bohdan Maslowski, "Uniform exponential ergodicity of stochastic dissipative systems", math.PR/0111143
- Sebastien Gouezel, "Almost sure invariance principle for dynamical systems by spectral methods", Annals of Probability **38** (2010): 1639--1671
- M. Hairer, "Ergodic properties of a class of non-Markovian processes", arxiv:0708.3338
- P. Halmos, Ergodic Theory
- Bruce E. Hansen, "Strong Laws for Dependent Heterogeneous Processes", Econometric Theory **7** (1991): 213--221 [PDF reprint via Prof. Hansen]
- Nicolai T. A. Haydn, "The Central Limit Theorem for uniformly strong mixing measures", arxiv:0903.1325
- Nicolai Haydn, Y. Lacroix and Sandro Vaienti, "Hitting and return times in ergodic dynamical systems", math.DS/0410384 = Annals of Probability **33** (2005): 2043--2050
- Nicolai Haydn and Sandro Vaienti, "Fluctuations of the Metric Entropy for Mixing Measures", Stochastics and Dynamics **4** (2004): 595--627
- Michael Hochman, "Upcrossing Inequalities for Stationary Sequences and Applications to Entropy and Complexity", arxiv:math.DS/0608311 [where "complexity" = algorithmic information content]
- Bernard Host, "Convergence of multiple ergodic averages", math.DS/0606362 ["We study the mean convergence of multiple ergodic averages, that is, averages of a product of functions taken at different times."]
- Steven Kalikow and Randall McCutcheon, An Outline of Ergodic Theory
- Gerhard Keller, Equilibrium States in Ergodic Theory
- Gerhard Keller and Carlangelo Liverani, "Uniqueness of the SRB Measure for Piecewise Expanding Weakly Coupled Map Lattices in Any Dimension", Communications in Mathematical Physics **262** (2006): 33--50
- U. Krengel, Ergodic Theorems
- Anna Kuczmaszewska, "The strong law of large numbers for dependent random variables", Statistics and Probability Letters **73** (2005): 305--314
- Lin Zhengyan and Lu Chuanrong, Limit Theory for Mixing Dependent Random Variables
- Pei-Dong Liu and Min Qian, Smooth Ergodic Theory of Random Dynamical Systems
- Martial Longla, Magda Peligrad, "Some Aspects of Modeling Dependence in Copula-based Markov chains", arxiv:1107.1794
- Dasha Loukianova, Oleg Loukianov, Eva Loecherbach, "Polynomial bounds in the Ergodic Theorem for positive recurrent one-dimensional diffusions and integrability of hitting times", arxiv:0903.2405 [*non-asymptotic* deviation bounds from bounds on moments of recurrence times]
- Stefano Luzzatto
    - "Mixing and decay of correlations in non-uniformly expanding maps: a survey of recent results", math.DS/0301319
    - "Stochastic-like behaviour in nonuniformly expanding maps", math.DS/0409085

- Stefano Luzzatto, Ian Melbourne and Frederic Paccaut, "The Lorenz Attractor is Mixing", Communications in Mathematical Physics **260** (2005): 393--401
- Vincent Lynch, "Decay of correlations for non-Holder observables", math.DS/0401432
- Michael C. Mackey and Marta Tyran-Kaminska
- "Deterministic Brownian Motion: The Effects of Perturbing a Dynamical System by a Chaotic Semi-Dynamical System", cond-mat/0408330
- "Effects of Noise on Entropy Evolution", cond-mat/0501092
- "Central Limit Theorems for Non-Invertible Measure Preserving Maps", math.PR/0608637 ["a new functional central limit theorem result for non-invertible measure preserving maps that are not necessarily ergodic, using the Perron-Frobenius operator"]

- Katalin Marton and Paul C. Shields, "How many future measures can there be?", Ergodic Theory and Dynamical Systems **22** (2002): 257--280
- Jonathan C. Mattingly, "On Recent Progress for the Stochastic Navier Stokes Equations", math.PR/0409194 ["We give an overview of the ideas central to some recent developments in the ergodic theory of the stochastically forced Navier Stokes equations and other dissipative stochastic partial differential equations."]
- Ian Melbourne and Matthew Nicol
    - "Almost Sure Invariance Principle for Nonuniformly Hyperbolic Systems", Communications in Mathematical Physics **260** (2005): 131--146 = math.DS/0503693 ["We prove an almost sure invariance principle that is valid for general classes of nonuniformly expanding and nonuniformly hyperbolic dynamical systems. Discrete time systems and flows are covered by this result. ... Statistical limit laws such as the central limit theorem, the law of the iterated logarithm, and their functional versions, are immediate consequences."]
    - "A Vector-Valued Almost Sure Invariance Principle for Hyperbolic Dynamical Systems", math.DS/0606535

- D. S. Ornstein and B. Weiss, "Statistical Properties of Chaotic Systems", Bulletin of the American Mathematical Society **24** (1991): 11--116
- Goran Peskir, From Uniform Laws of Large Numbers to Uniform Ergodic Theorems
- Karl E. Petersen, Ergodic Theory
- Charles Pugh and Michael Shub, with an appendix by Alexander Starkov, "Stable Ergodicity", Bulletin of the American Mathematical Society (new series) **41** (2003): 1--41 [Link]
- Maxim Raginsky, "Joint universal lossy coding and identification of stationary mixing sources with general alphabets", arxiv:0901.1904
- M. Rosenblatt, "Central limit theorem for stationary processes", Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 2, pp. 551--561
- Gennady Samorodnitsky, "Extreme value theory, ergodic theory and the boundary between short memory and long memory for stationary stable processes", Annals of Probability **32** (2004): 1438--1468 = math.PR/0410149
- Alexander Schoenhuth, "The ergodic decomposition of asymptotically mean stationary random sources", arxiv:0804.2487
- C. E. Silva, Invitation to Ergodic Theory
- Ya. Sinai, Topics in Ergodic Theory
- Charles Stein, "A bound for the error in the normal approximation to the distribution of a sum of dependent random variables", Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 2, pp. 583--602
- Andre Toom
    - "Law of Large Numbers for Non-Local Functions of Probabilistic Cellular Automata", Journal of Statistical Physics **133** (2008): 883--897
    - "Every Continuous Operator Has an Invariant Measure", Journal of Statistical Physics **129** (2007): 555--566

- Cristina Tone, "Central limit theorems for Hilbert-space valued random fields satisfying a strong mixing condition", arxiv:1012.1842
- Marta Tyran-Kaminska, "Convergence to Lévy stable processes under strong mixing conditions", arxiv:0907.1185
- R. Vilela Mendes, "Beyond Lyapunov", arxiv:1008.2664 ["Ergodic parameters like the Lyapunov and the conditional exponents are global functions of the invariant measure, but the invariant measure itself contains more information. A more complete characterization of the dynamics by new families of ergodic parameters is discussed, as well as their relation to the dynamical R\'{e}nyi entropies and measures of self-organization"]
- Vladimir V'yugin, "On Instability of the Ergodic Limit Theorems with Respect to Small Violations of Algorithmic Randomness", arxiv:1105.4274
- Peter Walters, An Introduction to Ergodic Theory
- Wei Biao Wu, Xiaofeng Shao, "Invariance principles for fractionally integrated nonlinear processes", math.PR/0608223
- Wei Biao Wu and Michael Woodroofe, "Martingale Approximations for Sums of Stationary Processes", Annals of Probability **32** (2004): 1674--1690 = math.PR/0410160
- Ivan Werner, "Ergodic theorem for contractive Markov systems", Nonlinearity **17** (2004): 2303--2313 [PS preprint]
- Guangyu Yang, Yu Miao, "An invariance principle for the law of the iterated logarithm for additive functionals of Markov chains", math.PR/0609593
- Radu Zaharopol, Invariant Probabilities of Markov-Feller Operators and Their Supports
- Steve Zelditch, "Quantum ergodicity and mixing", quant-ph/0503026 ["an expository article for the Encyclopedia of Mathematical Physics"]

Updated 29 October 2007; thanks to "tushar" for pointing out an embarrassing think-o in the first paragraph.

*Previous versions*: 2007-10-29 9:40; 2005-11-15 16:18; first version written c. 1997 (?)