Tomorrow depends only on today
Can you predict next week's weather? If tomorrow's weather depends only on today's — not on the entire history — you have a Markov chain.
The Markov property
A stochastic process $(X_n)$ is a Markov chain if $P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$. The future depends on the present, not the past. This is called memorylessness.
This is a different kind of memorylessness than the Geometric or Exponential. There, a single random variable forgets its past. Here, an entire process forgets its history.
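A minimal simulation sketch of process-level memorylessness (the variable and function names are my own, and the transition probabilities are the weather numbers used later in this lesson): the next state is sampled from the current state's row alone; the history list is never consulted.

```python
import random

# Two-state weather chain: 0 = Sunny, 1 = Rainy (my own encoding).
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    return 0 if random.random() < P[state][0] else 1

random.seed(0)          # fixed seed so the run is reproducible
history = [0]           # start Sunny
for _ in range(7):
    history.append(step(history[-1]))  # depends only on history[-1]
print(history)
```

The entire past sits in `history`, but `step` only ever looks at the last entry; that is the Markov property in code.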
Transition matrices
A transition matrix $P$ has entries $P_{ij} = P(X_{n+1} = j \mid X_n = i)$. Each row sums to 1 (it's a valid probability distribution over next states).
For the weather example:

$$P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}$$

Row 1: from Sunny, 80% stay sunny, 20% become rainy. Row 2: from Rainy, 40% become sunny, 60% stay rainy.
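The weather matrix can be written out and sanity-checked in a few lines of Python (a sketch; the encoding 0 = Sunny, 1 = Rainy is my own convention):

```python
# Rows index the current state, columns the next state: 0 = Sunny, 1 = Rainy.
P = [[0.8, 0.2],   # from Sunny: 80% stay sunny, 20% turn rainy
     [0.4, 0.6]]   # from Rainy: 40% turn sunny, 60% stay rainy

# Each row must be a valid probability distribution over next states.
for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-12, f"row {i} does not sum to 1"

# Entries read directly as conditional probabilities:
print(P[0][1])  # P(Rainy tomorrow | Sunny today) -> 0.2
```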
Multi-step transitions
What's the probability of being sunny two days from now? Use matrix multiplication:
The $n$-step transition probability is $P^{(n)}_{ij} = P(X_{m+n} = j \mid X_m = i) = (P^n)_{ij}$, the $(i, j)$ entry of the $n$-th matrix power. Equivalently, by the Chapman-Kolmogorov equation: $P^{(m+n)}_{ij} = \sum_k P^{(m)}_{ik} P^{(n)}_{kj}$.
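A hand-rolled sketch of the two-day computation for the weather chain (the `matmul` helper is my own, not from any library):

```python
P = [[0.8, 0.2],   # 0 = Sunny, 1 = Rainy
     [0.4, 0.6]]

def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)  # two-step transition matrix

# P(Sunny in two days | Sunny today)
#   = 0.8*0.8 (sunny both days) + 0.2*0.4 (rainy, then back to sunny)
print(round(P2[0][0], 2))  # -> 0.72
```

The sum over the intermediate day's state is exactly the Chapman-Kolmogorov equation in miniature.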
State classification
A Markov chain is irreducible if every state can be reached from every other state. All states "communicate."
A state is aperiodic if the greatest common divisor of its possible return times is 1 (the chain can return at irregular intervals, rather than being locked into a fixed cycle). If all states are aperiodic, the chain is aperiodic.
For convergence to a stationary distribution, we need the chain to be both irreducible and aperiodic.
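Irreducibility can be sketched as a reachability search over positive-probability edges (a minimal illustration; the function names are my own, and periodicity is only shown by counterexample, not computed):

```python
def reachable(P, start):
    """States reachable from `start` via positive-probability transitions."""
    seen = {start}
    frontier = [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                frontier.append(j)
    return seen

def is_irreducible(P):
    """Irreducible: every state can reach every other state."""
    n = len(P)
    return all(len(reachable(P, i)) == n for i in range(n))

# The weather chain: every entry is positive, so all states communicate.
weather = [[0.8, 0.2],
           [0.4, 0.6]]
print(is_irreducible(weather))  # -> True

# A deterministic 2-cycle is irreducible but periodic (period 2):
# it alternates forever and never settles into an equilibrium.
cycle = [[0.0, 1.0],
         [1.0, 0.0]]
print(is_irreducible(cycle))  # -> True, yet not aperiodic
```

The second example is why irreducibility alone is not enough for convergence: the cycle chain visits every state but oscillates instead of settling.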
Summary
| Concept | Key Idea |
|---|---|
| Markov property | Future depends only on present state |
| Transition matrix | $P_{ij}$ = probability of going from state $i$ to state $j$; rows sum to 1 |
| $n$-step transitions | $P^{(n)} = P^n$ (the $n$-th matrix power) |
| Irreducible | All states communicate |
| Aperiodic | No forced cycling |
The Markov property is a modeling assumption. Real weather depends on more than today's state, but the simplification is often good enough, and it makes the math tractable.
What's next
We'll discover the stationary distribution, the long-run equilibrium that every well-behaved Markov chain converges to.