Time Series, or Statistics for Stochastic Processes and Dynamical Systems

08 Sep 2014 11:11

Rates of convergence of estimators; analogs to VC-dimension results (see Meir's paper below). Large deviation techniques. Prediction schemes. Are there universal schemes which do not demand exponentially growing volumes of data?

If you have an ergodic process, then the sample-path mean for any nice statistic you care to measure will, almost surely, converge to the distributional mean. This is even true of trajectory probabilities (i.e., if you want to know the probability of a certain finite-length trajectory, simply count how often it happens). So "sit and count" is a reliable and consistent statistical procedure. If the process mixes sufficiently quickly, the rate of convergence might even be respectable. But this doesn't say anything about the efficiency of such procedures, which is surely a consideration. And what do you do for non-ergodic processes? (Take multiple runs and hope they're telling you about different ergodic components?) Non-stationary, even?
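As a minimal sketch of "sit and count" (the chain here is a made-up two-state Markov chain, not tied to anything above): simulate one long sample path, estimate the probability of a short trajectory by counting how often it shows up in a sliding window, and check the count against the exact stationary probability.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical two-state Markov chain; ergodic, since every transition
    # probability is strictly positive.
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    # Stationary distribution: left eigenvector of P with eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi /= pi.sum()

    # One long sample path, started from the stationary distribution.
    T = 100_000
    x = np.empty(T, dtype=int)
    x[0] = rng.choice(2, p=pi)
    for t in range(1, T):
        x[t] = rng.choice(2, p=P[x[t - 1]])

    # "Sit and count": empirical frequency of the length-3 trajectory (0, 0, 1).
    word = (0, 0, 1)
    windows = np.lib.stride_tricks.sliding_window_view(x, len(word))
    empirical = np.mean(np.all(windows == word, axis=1))

    # Exact probability of that trajectory under the stationary chain.
    true_prob = pi[word[0]] * P[word[0], word[1]] * P[word[1], word[2]]

    print(f"empirical {empirical:.5f}  vs  true {true_prob:.5f}")

By the ergodic theorem the empirical frequency converges almost surely to the true probability; what the sketch does not tell you is how fast, or how wasteful the counting is compared to a procedure that exploits the Markov structure.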

I need to learn more about frequency-domain approaches; despite being raised as a physicist, I find the time domain much more natural. After all, the frequency domain is effectively just one choice of a function basis, and there are infinitely many others, which might in some sense be more appropriate to the process at hand. But that's at least in part a rationalization against having to learn more math.
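To make the "one basis among many" point concrete, a toy sketch (the series is just random noise, and the bases are the standard discrete Fourier and Haar constructions): the same eight numbers can be written in Fourier coordinates or in Haar coordinates; both are invertible changes of basis, and neither is privileged except by how well it matches the process at hand.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 8
    x = rng.standard_normal(n)  # a toy "time series"

    # Fourier basis: the (normalized) DFT is a change of coordinates into one
    # particular orthonormal basis of complex exponentials.
    F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n) / np.sqrt(n)
    coeffs_fourier = F @ x
    x_from_fourier = (F.conj().T @ coeffs_fourier).real

    # Haar basis: another orthonormal basis, localized in time rather than
    # in frequency, and just as legitimate a coordinate system.
    H = np.array([
        [1,  1,  1,  1,  1,  1,  1,  1],
        [1,  1,  1,  1, -1, -1, -1, -1],
        [1,  1, -1, -1,  0,  0,  0,  0],
        [0,  0,  0,  0,  1,  1, -1, -1],
        [1, -1,  0,  0,  0,  0,  0,  0],
        [0,  0,  1, -1,  0,  0,  0,  0],
        [0,  0,  0,  0,  1, -1,  0,  0],
        [0,  0,  0,  0,  0,  0,  1, -1],
    ], dtype=float)
    H /= np.linalg.norm(H, axis=1, keepdims=True)
    coeffs_haar = H @ x
    x_from_haar = H.T @ coeffs_haar

    # Both representations reconstruct the original series exactly.
    print(np.allclose(x, x_from_fourier), np.allclose(x, x_from_haar))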

LSE econometrics, with its "general-to-specific" modeling procedure, is very interesting, and I think possibly even related to stuff I've done, but I need to understand it much better than I do.

(This notebook really needs subdivision.)

See also: Bootstrapping; Change-Point Problems; Classifiers and Clustering for Time Series; Control Theory; Dynamical Systems; Ergodic Theory; Filtering, State Estimation and Signal Processing; Grammatical Inference; Information Theory; Machine Learning, Statistical Inference and Induction; Markov Models and Hidden Markov Models; Neural Coding; Power Law Distributions, 1/f Noise and Long-Memory Processes; Recurrence Times of Stochastic Processes (also Hitting, Waiting, and First-Passage Times); Sequential Decisions Under Uncertainty; State-Space Reconstruction; Statistical Learning Theory with Dependent Data; Statistics; Stochastic Processes; Symbolic Dynamics; Universal Prediction Algorithms
