Tomorrow depends only on today

Can you predict next week's weather? If tomorrow's weather depends only on today's — not on the entire history — you have a Markov chain.

What do you think?
Today is sunny. The chance of sun→sun is 80% and rain→sun is 40%. After many days, what fraction of days will be sunny?

The Markov property

Markov Property

A stochastic process X_0, X_1, X_2, \ldots is a Markov chain if: P(X_{n+1} = j \mid X_n = i, X_{n-1}, \ldots, X_0) = P(X_{n+1} = j \mid X_n = i). The future depends on the present, not the past. This is called memorylessness.

This is a different kind of memorylessness than that of the Geometric or Exponential distributions. There, a single random variable forgets its past; here, an entire process forgets its history.

Transition matrices

Transition Matrix

A transition matrix P has entries p_{ij} = P(X_{n+1} = j \mid X_n = i). Each row sums to 1 (it's a valid probability distribution over next states).

For the weather example:

P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}

Row 1: from Sunny, 80% stay sunny, 20% become rainy. Row 2: from Rainy, 40% become sunny, 60% stay rainy.
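As a minimal sketch (in Python; the state encoding is my own, not part of the lesson), the transition matrix can be stored as nested lists, checked for validity, and used to sample tomorrow's weather from today's:

```python
import random

# Weather chain from the example: index 0 = Sunny, 1 = Rainy.
P = [[0.8, 0.2],
     [0.4, 0.6]]

# Every row must be a probability distribution over next states.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)

def step(state: int) -> int:
    """Sample the next state: stay/become Sunny with probability P[state][0]."""
    return 0 if random.random() < P[state][0] else 1
```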

Explore the chain by clicking nodes. The arrow thickness shows transition probability:

[Interactive Markov chain explorer: step through the Sun/Rain chain with transition matrix P and watch the visit counts accumulate.]
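The opening question (what fraction of days end up sunny?) can be probed empirically. This is a sketch, not part of the lesson: simulate many steps of the chain and track how often it sits in the Sunny state.

```python
import random

P = [[0.8, 0.2],   # row 0: from Sunny
     [0.4, 0.6]]   # row 1: from Rainy

random.seed(0)       # fixed seed so the run is reproducible
state, sunny = 0, 0  # start Sunny, count Sunny days
n = 100_000
for _ in range(n):
    sunny += (state == 0)
    state = 0 if random.random() < P[state][0] else 1

frac = sunny / n
print(frac)  # settles near 2/3 for this chain
```

The simulated fraction stabilizes regardless of the starting state, a hint at the long-run behavior the next lesson makes precise.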

Multi-step transitions

What's the probability of being sunny two days from now? Use matrix multiplication:

P^2 = P \times P

[Interactive walkthrough: building the two-step matrix P^2 entry by entry, starting from the transition matrix P.]
Chapman-Kolmogorov Equation

The n-step transition probability is: p_{ij}^{(n)} = (P^n)_{ij}. Equivalently: p_{ij}^{(m+n)} = \sum_k p_{ik}^{(m)} \cdot p_{kj}^{(n)}
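A minimal sketch of the n-step rule, using plain Python lists rather than a linear-algebra library (helper names are mine):

```python
def matmul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def matrix_power(P, n):
    """Compute P^n by repeated multiplication (n >= 1)."""
    result = P
    for _ in range(n - 1):
        result = matmul(result, P)
    return result

P = [[0.8, 0.2], [0.4, 0.6]]
P2 = matrix_power(P, 2)
# Two-step sun -> sun: 0.8*0.8 + 0.2*0.4 = 0.72
print(P2[0][0])
```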

State classification

Irreducible Chain

A Markov chain is irreducible if every state can be reached from every other state. All states "communicate."

Aperiodic State

A state is aperiodic if the chain can return to it at irregular intervals (not locked into a cycle). If all states are aperiodic, the chain is aperiodic.

For convergence to a stationary distribution, we need the chain to be both irreducible and aperiodic.
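Irreducibility can be checked mechanically: from every state, follow edges with positive probability and verify that all states are reachable. A sketch (the function name is mine, not from the lesson):

```python
from collections import deque

def is_irreducible(P):
    """True if every state can reach every other along edges with p > 0."""
    n = len(P)
    for start in range(n):
        seen = {start}
        frontier = deque([start])
        while frontier:                 # breadth-first search from `start`
            i = frontier.popleft()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        if len(seen) < n:               # some state is unreachable
            return False
    return True

# The weather chain is irreducible; a chain with an absorbing state is not.
print(is_irreducible([[0.8, 0.2], [0.4, 0.6]]))   # True
print(is_irreducible([[1.0, 0.0], [0.5, 0.5]]))   # False: Sunny never leaves
```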

Practice problems

A transition matrix has rows that sum to ___. (whole number)
A 3-state Markov chain has transition matrix P. What matrix gives 5-step transition probabilities? (e.g. P^3)

Summary

Markov property: the future depends only on the present state
Transition matrix P: p_{ij} = probability of going i \to j; rows sum to 1
n-step transitions: (P^n)_{ij}
Irreducible: all states communicate
Aperiodic: no forced cycling

The Markov property is a modeling assumption. Real weather depends on more than today's state, but the simplification is often good enough and it makes the math tractable.

What's next

We'll discover the stationary distribution, the long-run equilibrium that every well-behaved Markov chain converges to.