Stochastic convergence
Stochastic convergence is a mathematical concept intended to formalize the idea that a sequence of essentially random or unpredictable events sometimes tends to settle into a pattern.
Four different varieties of stochastic convergence are noted:
- Almost sure convergence
- Convergence in probability
- Convergence in distribution
- Convergence in nth order mean
Almost sure convergence
Example 1
Consider a short-lived animal of some species. We may note the exact amount of food the animal consumes day by day. This sequence of numbers will be unpredictable in advance, but we may be quite certain that one day the number will be zero, and stay zero forever after.
Example 2
Consider a man who, starting tomorrow, tosses seven coins every morning. Each afternoon, he donates a random amount of money to a certain charity. The first morning the result is all seven tails, however, he stops permanently.
Let X1, X2, X3, ... be the day-by-day amounts the charity receives from him.
We may be almost sure that one day this amount will be zero, and stay zero forever after that.
However, when we consider any finite number of days, there is a nonzero probability that the terminating condition has not yet occurred: all seven coins come up tails with probability (1/2)^7 = 1/128 on any given morning, so after n days the chance that it has never happened is (127/128)^n, which is positive for every n.
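Example 2 can be simulated directly. The Python sketch below draws many donor streams and checks that essentially all of them have reached the permanent-zero state well within 2000 days. The uniform(1, 10) donation amounts are an invented assumption for illustration, not part of the original example.

```python
import random

def donation_stream(rng, days):
    """Simulate the donor: each morning toss 7 fair coins; donate a
    random amount each afternoon until the first all-tails morning,
    after which every donation is 0 forever."""
    amounts = []
    stopped = False
    for _ in range(days):
        if not stopped:
            # all-tails on 7 fair coins has probability (1/2)**7 = 1/128
            if all(rng.random() < 0.5 for _ in range(7)):
                stopped = True
        # assumed donation range 1..10 (arbitrary choice)
        amounts.append(0.0 if stopped else rng.uniform(1, 10))
    return amounts

rng = random.Random(0)
streams = [donation_stream(rng, 2000) for _ in range(100)]
# Fraction of simulated donors whose stream has settled at zero:
frac_stopped = sum(s[-1] == 0.0 for s in streams) / len(streams)
print(frac_stopped)
```

The stopping day is geometric with success probability 1/128, so the expected stopping day is 128; over a 2000-day horizon the chance that any given stream has not yet terminated is negligible, which is the "almost sure" part of the convergence.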
Formal definition
Let X1, X2, X3, ... be an infinite sequence of stochastic variables defined over a subset of R.
Then the actual outcomes will be an ordinary sequence of real numbers.
If the probability that this sequence will converge to a given real number a equals 1, then we say the original sequence of stochastic variables has almost sure convergence to a.
In more compact notation:
- If P(Xn → a as n → ∞) = 1 for some a, then the sequence has almost sure convergence to a.
Convergence in probability
Example 1
An absent-minded professor gets a job in an unfamiliar part of town.
The first time he walks from home, he has difficulty finding his way, and ends up several hours late.
The next few dozen times, things generally improve, although sometimes he manages to get hopelessly lost again.
As the months and years go by, he gets to know the area very well and falls into a routine that makes him more and more punctual: the probability of his being badly late on any given day shrinks toward zero, although it never quite vanishes, so it may still occasionally happen that he is very late.
Example 2
We may keep tossing a die an infinite number of times and at every toss note the average outcome so far. The exact number thus obtained after each toss will be unpredictable, but for a fair die, it will tend to get closer and closer to the arithmetic average of 1,2,3,4,5 and 6, i.e. 3.5.
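The die example can be checked numerically. The Python sketch below computes the running average after each toss; the toss count of 100,000 is an arbitrary choice, large enough for the average to settle near 3.5.

```python
import random

rng = random.Random(42)
tosses = [rng.randint(1, 6) for _ in range(100_000)]

# Running average after each toss: unpredictable early on, but it
# tends toward the expected value (1+2+3+4+5+6)/6 = 3.5.
running_avg = []
total = 0
for n, x in enumerate(tosses, start=1):
    total += x
    running_avg.append(total / n)

print(running_avg[9], running_avg[-1])
```

Early entries of `running_avg` jump around, while the final entry sits close to 3.5 — the behaviour the formal definition below makes precise.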
Formal definition
Let X1, X2, X3, ... be an infinite sequence of stochastic variables defined over a subset of R.
If there exists a real number a such that, for every ε > 0, P(|Xn − a| > ε) → 0 as n → ∞, then the sequence has convergence in probability to a.
Convergence in distribution
Example
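The source leaves this example empty. The classical central limit theorem is the standard illustration of convergence in distribution, sketched here as a suggested fill rather than the author's own example:

```latex
% If X_1, X_2, \ldots are independent, identically distributed with
% mean \mu and finite variance \sigma^2 > 0, then the standardized sums
Z_n = \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}}
% converge in distribution to a standard normal variable:
\lim_{n\to\infty} P(Z_n \le z) = \Phi(z) \qquad \text{for every real } z,
% where \Phi is the standard normal distribution function.
```

Note that the individual outcomes Z_n need not settle toward any value; only their distribution functions converge.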
Formal definition
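This section is empty in the source; the conventional textbook definition, supplied here as an assumption about what was intended:

```latex
% Let X_1, X_2, \ldots have distribution functions F_1, F_2, \ldots,
% and let X have distribution function F. The sequence converges in
% distribution to X if
\lim_{n\to\infty} F_n(x) = F(x)
% at every point x at which F is continuous.
```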
Convergence in nth order mean
Example
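This example section is empty in the source. One standard illustration, continuing the die-tossing theme of the earlier sections, is the average of n fair die tosses, which converges to 3.5 in 2nd order mean:

```latex
% Let \bar{X}_n be the average of n independent fair die tosses.
% Each toss has mean 3.5 and variance 35/12, so
E\!\left[(\bar{X}_n - 3.5)^2\right]
  = \operatorname{Var}(\bar{X}_n)
  = \frac{35/12}{n} \;\longrightarrow\; 0,
% i.e. \bar{X}_n converges to 3.5 in 2nd order mean.
```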
Formal definition
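Empty in the source; the conventional definition, with k used as the sequence index to avoid clashing with the order n, is:

```latex
% The sequence X_1, X_2, \ldots converges in nth order mean to the
% real number a if
\lim_{k\to\infty} E\!\left[\,|X_k - a|^n\,\right] = 0.
```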
Relations between the different modes of convergence
- If a stochastic sequence has almost sure convergence, then it also has convergence in probability.
- If a stochastic sequence has convergence in probability, then it also has convergence in distribution.
- If a stochastic sequence has convergence in (n+1)th order mean, then it also has convergence in nth order mean (n>0).
- If a stochastic sequence has convergence in nth order mean, then it also has convergence in probability.
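None of the implications above can be reversed in general. A standard counterexample for the first one, added here as a supplement not in the source, is the "typewriter" sequence of indicator variables:

```latex
% On the probability space [0,1] with uniform measure, let
X_1 = \mathbf{1}_{[0,1]},\quad
X_2 = \mathbf{1}_{[0,1/2]},\quad X_3 = \mathbf{1}_{[1/2,1]},\quad
X_4 = \mathbf{1}_{[0,1/4]},\quad \ldots
% Then P(|X_k| > \varepsilon) \to 0, so the sequence converges to 0 in
% probability; but every outcome \omega lies in infinitely many of the
% intervals, so X_k(\omega) fails to converge for every \omega, and the
% sequence does not converge almost surely.
```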