Bayes' Rule
You know that P(A|B) and P(B|A) are different. But here's the problem: you usually know one and need the other.
A doctor knows P(positive test | disease) from clinical trials. What you need is P(disease | positive test).
Can we flip the conditional around?
Deriving the formula
P(H|E) = P(E|H) × P(H) / P(E)

Where:
- P(H) is the prior: your belief before seeing evidence
- P(E|H) is the likelihood: how probable the evidence is if H is true
- P(E) is the marginal likelihood: how probable the evidence is overall
- P(H|E) is the posterior: your belief after seeing evidence
Explore the formula
Adjust the three key values and watch how they interact.
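The interactive sliders don't survive in text, but the same calculation can be sketched in Python. The 90% likelihood and 20% false-positive rate below are illustrative assumptions chosen so that a 10% prior becomes a 33.3% posterior:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' Rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    # Marginal likelihood: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Illustrative values (assumed, not from the lesson):
# prior 10%, likelihood 90%, false-positive rate 20%
print(posterior(0.10, 0.90, 0.20))  # ≈ 0.333
```

Try changing each argument independently to see how the three inputs interact.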
Visualizing base rates
Out of 100,000 people, only 100 have the disease. The test is 99% accurate. If you test positive, how many others also tested positive but are healthy?
P(Disease | Positive) = 99 / 1098 positive tests ≈ 9%
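The count can be reproduced directly with natural frequencies, assuming "99% accurate" means both 99% sensitivity and 99% specificity:

```python
population = 100_000
sick = 100
healthy = population - sick  # 99,900

true_positives = round(sick * 0.99)      # 99 sick people test positive
false_positives = round(healthy * 0.01)  # 999 healthy people test positive
total_positives = true_positives + false_positives  # 1098

print(true_positives, total_positives)   # 99 1098
print(true_positives / total_positives)  # ≈ 0.09 — about a 9% chance
```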
The medical test calculation
Here's the calculation step by step.
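The steps above can be sketched with the formula directly, using the numbers from the visualization:

```python
p_disease = 100 / 100_000     # prior: 0.001
p_pos_given_disease = 0.99    # sensitivity
p_pos_given_healthy = 0.01    # false-positive rate

# Step 1: marginal probability of a positive test
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))  # 0.00099 + 0.00999 = 0.01098

# Step 2: apply Bayes' Rule
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # ≈ 0.09 — the formula agrees with counting people
```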
The odds form
There's a more intuitive way to think about Bayes:

P(H|E) / P(¬H|E) = [P(H) / P(¬H)] × [P(E|H) / P(E|¬H)]

In words: Posterior Odds = Prior Odds × Likelihood Ratio
The likelihood ratio tells you how much the evidence should shift your belief. If evidence is 10× more likely when H is true, multiply your odds by 10.
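A minimal sketch of the odds form, using an assumed 1% prior (odds of 1:99) and a likelihood ratio of 99:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds × likelihood ratio."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    """Convert odds back to a probability: o / (1 + o)."""
    return odds / (1 + odds)

odds = posterior_odds(1 / 99, 99)  # 1:99 prior odds shifted by LR of 99
print(odds_to_prob(odds))          # ≈ 0.5 — even odds, a 50% posterior
```

Notice how much simpler the update is in odds: one multiplication, no marginal likelihood to compute.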
When intuition fails
Bayes' Rule corrects common reasoning mistakes. Here is one scenario where intuition goes wrong:
The prior matters
| Prior | Likelihood Ratio | Posterior |
| --- | --- | --- |
| 0.1% (very rare) | 99 | 9% |
| 1% (rare) | 99 | 50% |
| 10% (somewhat common) | 99 | 92% |
The same evidence, the same test accuracy, but wildly different conclusions. The prior is often more important than the test quality!
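The table can be reproduced with the odds form, holding the likelihood ratio fixed at 99 and varying only the prior:

```python
def posterior_from_prior(prior, likelihood_ratio):
    """Posterior probability via odds: convert, multiply, convert back."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

for prior in (0.001, 0.01, 0.10):
    post = posterior_from_prior(prior, 99)
    print(f"{prior:6.1%} prior -> {post:5.1%} posterior")
# Matches the table: about 9%, 50%, and 92%
```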
Why it's hard
Your brain isn't wired for Bayes' Rule. We see "99% accurate test" and think "99% chance I have it."
Evolution optimized us for pattern recognition, not statistics. When you see smoke, you think fire because P(smoke|fire) is high. But in a building with a smoke machine, P(fire|smoke) drops to near zero.
The fix: Always ask "how common is the hypothesis before I saw this evidence?"
What's next
Bayes' Rule tells you how new information changes your beliefs. But sometimes new information doesn't change anything at all.
If I tell you it's raining in Seattle, does that change the probability your coin flip lands heads? No. The events are independent.
But here's where people get confused: independence is not the same as being mutually exclusive (disjoint). In fact, if two events are mutually exclusive, they're highly dependent. Learning one happened tells you the other definitely didn't.
That's the next lesson.