The shortcut you didn't know you needed
You know X is a fair die roll, uniform on 1–6. What's the expected value of X²?
Your instinct: find the distribution of Y = X² (so Y takes values 1, 4, 9, 16, 25, 36), figure out the probabilities, then compute E[Y] from that new distribution. That works, but there's a much easier way.
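Both routes are easy to check. Here's a short sketch in plain Python (using exact fractions) that computes E[X²] for the fair die first by building the distribution of Y = X², then by the direct weighted sum over the original outcomes:

```python
from fractions import Fraction

# Fair die: each outcome 1..6 has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Route 1: build the distribution of Y = X^2 first, then take E[Y].
pmf_y = {}
for x, p in pmf.items():
    pmf_y[x * x] = pmf_y.get(x * x, Fraction(0)) + p
e_y = sum(y * p for y, p in pmf_y.items())

# Route 2 (the shortcut): sum x^2 weighted by the ORIGINAL pmf.
e_shortcut = sum(x * x * p for x, p in pmf.items())

print(e_y, e_shortcut)  # both 91/6
```

Both routes give 91/6 ≈ 15.17; the second never touches the distribution of X².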
The law of the unconscious statistician
If X is a discrete random variable and g is any function, then:

E[g(X)] = Σₓ g(x)·P(X = x)

You compute E[g(X)] using the distribution of X, not the distribution of g(X).
The name is playful — you use this rule "unconsciously" because it feels obvious. But it's not trivial: it says that to find the expected value of a transformed variable, you don't need to work out the new distribution first.
LOTUS is the bridge between knowing E[X] and computing E[g(X)] for arbitrary functions g. It saves you from the often painful step of finding the PMF of g(X).
See it in action
Here X is uniform on {−3, −2, −1, 0, 1, 2, 3} (so each P(X = x) = 1/7 ≈ 0.1429) and the transform is g(x) = x². The table shows each contribution g(x)·P(X = x), and their sum is E[g(X)] = 4 via LOTUS.
| x | P(X=x) | g(x) | g(x)·P(X=x) |
|---|---|---|---|
| -3 | 0.1429 | 9.0000 | 1.2857 |
| -2 | 0.1429 | 4.0000 | 0.5714 |
| -1 | 0.1429 | 1.0000 | 0.1429 |
| 0 | 0.1429 | 0.0000 | 0.0000 |
| 1 | 0.1429 | 1.0000 | 0.1429 |
| 2 | 0.1429 | 4.0000 | 0.5714 |
| 3 | 0.1429 | 9.0000 | 1.2857 |
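The table is two lines of code. This sketch takes X uniform on {−3, …, 3} with g(x) = x², prints each contribution, and sums them:

```python
# X uniform on {-3, ..., 3}; g(x) = x^2, as in the table above.
xs = range(-3, 4)
p = 1 / 7  # each P(X = x)

contributions = [(x, x * x * p) for x in xs]
for x, c in contributions:
    print(f"{x:3d}  {c:.4f}")

# LOTUS: E[g(X)] is just the sum of the contributions.
e_gx = sum(c for _, c in contributions)
print(f"E[g(X)] = {e_gx:.4f}")  # 4.0000
```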
Example: variance via LOTUS
Variance itself is a LOTUS application! Recall:

Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²

The second form uses LOTUS twice: once with g(x) = x² to get E[X²] = Σₓ x²·P(X = x), and once with the identity g(x) = x to get E[X].
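For the fair die from the opening example, this works out as a sketch like:

```python
from fractions import Fraction

# Fair die: each outcome 1..6 has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

e_x = sum(x * p for x, p in pmf.items())       # LOTUS with g(x) = x
e_x2 = sum(x * x * p for x, p in pmf.items())  # LOTUS with g(x) = x^2

var = e_x2 - e_x ** 2  # Var(X) = E[X^2] - (E[X])^2
print(var)  # 35/12
```

Neither step required the distribution of X² or of (X − E[X])².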
The continuous version
LOTUS extends to continuous random variables. If X is continuous with PDF f:

E[g(X)] = ∫ g(x)·f(x) dx

Same idea: weight g(x) by the density of X, then integrate.
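A quick numerical check, under an assumed setup not taken from the text: X ~ Uniform(0, 1), so f(x) = 1 on [0, 1], and g(x) = x². The exact value is ∫₀¹ x² dx = 1/3; this sketch approximates the LOTUS integral with a midpoint Riemann sum:

```python
# E[g(X)] = ∫ g(x) f(x) dx, approximated by a midpoint Riemann sum.
# Assumed setup: X ~ Uniform(0, 1), so f(x) = 1 on [0, 1]; g(x) = x^2.
def lotus_uniform01(g, n=100_000):
    dx = 1.0 / n
    return sum(g((i + 0.5) * dx) * dx for i in range(n))

approx = lotus_uniform01(lambda x: x * x)
print(approx)  # close to 1/3
```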
Why not just find g(X)'s distribution?
For simple transforms, finding the new distribution is fine. But consider:
- E[X²] when X is continuous: you'd need the distribution of X², which requires a change-of-variables formula
- E[|X|]: finding the distribution of |X| involves splitting into cases
- E[sin(X)]: the new distribution might not have a standard name
LOTUS sidesteps all of this. You stay in the original distribution's world.
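As one illustration (an assumed setup, reusing the uniform distribution from the table): E[max(X, 0)] drops straight out of LOTUS, with no case analysis on the distribution of max(X, 0) itself.

```python
from fractions import Fraction

# X uniform on {-3, ..., 3}, as in the table earlier.
pmf = {x: Fraction(1, 7) for x in range(-3, 4)}

# LOTUS with g(x) = max(x, 0): weight by the ORIGINAL pmf; no need to
# derive the piecewise distribution of max(X, 0).
e_pos_part = sum(max(x, 0) * p for x, p in pmf.items())
print(e_pos_part)  # 6/7
```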
Summary
| Concept | Formula |
|---|---|
| LOTUS (discrete) | E[g(X)] = Σₓ g(x)·P(X = x) |
| LOTUS (continuous) | E[g(X)] = ∫ g(x)·f(x) dx |
| Key benefit | No need to find the distribution of g(X) |
| Common use | Computing E[X²], variance, moment generating functions |
Whenever you need E[g(X)], go straight to LOTUS. Don't waste time deriving the distribution of g(X) unless you actually need that distribution for something else.
What's next
LOTUS computes one moment at a time. Is there a single function that encodes all moments at once? Yes — the moment generating function.