The shortcut you didn't know you needed

You know X is a fair die roll (1–6). What's the expected value of X²?

Your instinct: find the distribution of Y = X² (so Y takes values 1, 4, 9, 16, 25, 36), figure out the probabilities, then compute E[Y]. That works, but there's a much easier way.

What do you think?
To find E[X²] for a fair die, do you need to first find the PMF of Y = X²?

The law of the unconscious statistician

LOTUS (Discrete)

If X is a discrete random variable and g is any function, then:

E[g(X)] = Σₓ g(x) · P(X = x)

You compute E[g(X)] using the distribution of X, not the distribution of g(X).

The name is playful — you use this rule "unconsciously" because it feels obvious. But it's not trivial: it says that to find the expected value of a transformed variable, you don't need to work out the new distribution first.

LOTUS is the bridge between knowing E[X] and computing E[g(X)] for arbitrary functions g. It saves you from the often painful step of finding the PMF of g(X).
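As a quick sanity check, the two routes can be compared directly. Here is a minimal Python sketch (variable names are my own) that computes E[X²] for a fair die both via LOTUS and the long way, by first building the PMF of Y = X²:

```python
from fractions import Fraction
from collections import defaultdict

# Fair die: PMF of X
pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}
g = lambda x: x ** 2

# LOTUS: sum g(x) * P(X = x) directly over the distribution of X
e_lotus = sum(g(x) * p for x, p in pmf_X.items())

# The long way: first build the PMF of Y = g(X), then compute E[Y]
pmf_Y = defaultdict(Fraction)
for x, p in pmf_X.items():
    pmf_Y[g(x)] += p
e_long = sum(y * p for y, p in pmf_Y.items())

print(e_lotus, e_long)  # both 91/6
```

Both routes give 91/6; LOTUS just skips the middle step.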

See it in action

Pick a distribution for X and a transform g. The table shows each contribution g(x) · P(X = x), and their sum is E[g(X)] via LOTUS.

LOTUS Explorer
Example shown: X uniform on {−3, …, 3}, transform g(x) = x².

  x     P(X=x)    g(x)      g(x)·P(X=x)
 −3     0.1429    9.0000    1.2857
 −2     0.1429    4.0000    0.5714
 −1     0.1429    1.0000    0.1429
  0     0.1429    0.0000    0.0000
  1     0.1429    1.0000    0.1429
  2     0.1429    4.0000    0.5714
  3     0.1429    9.0000    1.2857

E[X] = 0.0000        E[g(X)] via LOTUS = 4.0000
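The explorer's numbers are easy to reproduce in a few lines of Python (a sketch; the uniform support and the transform are hard-coded to match the example above):

```python
# X uniform on {-3, ..., 3}, transform g(x) = x^2
xs = range(-3, 4)
p = 1 / 7  # each outcome equally likely

contributions = {x: (x ** 2) * p for x in xs}
e_x = sum(x * p for x in xs)           # E[X]    -> 0.0
e_gx = sum(contributions.values())     # E[g(X)] -> 4.0 via LOTUS
```

Note that E[g(X)] = 4 even though E[X] = 0: in general E[g(X)] ≠ g(E[X]).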

Example: variance via LOTUS

Variance itself is a LOTUS application! Recall:

Var(X) = E[(X − μ)²] = E[X²] − (E[X])²

Both forms are LOTUS applications: the first with g(x) = (x − μ)², the second via E[X²] = Σₓ x² · P(X = x).

Variance of a fair die via LOTUS

Step 1 — first moment (the mean):
E[X] = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5

Step 2 — second moment, via LOTUS with g(x) = x²:
E[X²] = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6 ≈ 15.167

Step 3 — combine:
Var(X) = E[X²] − (E[X])² = 91/6 − 12.25 = 35/12 ≈ 2.917

No PMF of X² needed at any point.
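The fair-die variance computation fits in a few lines; exact fractions keep the 35/12 visible (a sketch using only Python's standard library):

```python
from fractions import Fraction

# Fair die: each face has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

e_x = sum(x * p for x, p in pmf.items())         # E[X]  = 7/2
e_x2 = sum(x ** 2 * p for x, p in pmf.items())   # E[X^2] = 91/6, via LOTUS
var = e_x2 - e_x ** 2                            # 35/12

print(var, float(var))
```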
X ~ Bernoulli(0.4). What is E[X²]? (decimal, e.g. 0.42)
X is uniform on {1,2,3,4}. What is E[1/X]? (Round to 3 decimals) (decimal to 3 places, e.g. 0.456)

The continuous version

LOTUS extends to continuous random variables:

LOTUS (Continuous)

If X is continuous with PDF f(x):

E[g(X)] = ∫ g(x) · f(x) dx  (integral over all of ℝ)

Same idea: weight g(x)g(x) by the density of XX, integrate.
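One way to see the continuous version at work is a crude midpoint Riemann sum (a sketch, stdlib only; `lotus_continuous` is an illustrative helper of my own, not a library function). Here it recovers E[X²] = 1/3 for X ~ Unif(0, 1):

```python
# Midpoint Riemann sum approximating E[g(X)] = integral of g(x) * f(x) dx over [a, b]
def lotus_continuous(g, pdf, a, b, n=100_000):
    h = (b - a) / n
    mids = (a + (i + 0.5) * h for i in range(n))
    return sum(g(m) * pdf(m) * h for m in mids)

# X ~ Unif(0, 1): f(x) = 1 on [0, 1]; exact answer is 1/3
e_x2 = lotus_continuous(lambda x: x ** 2, lambda x: 1.0, 0.0, 1.0)
```

Same recipe as the discrete case: weight g(x) by the density of X, then sum (here, integrate).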

What do you think?
X ~ Unif(0,2). What is E[X³]?
Enter a decimal or whole number

Why not just find g(X)'s distribution?

For simple transforms, finding the new distribution is fine. But consider:

  • g(X) = X² when X is continuous — you'd need the distribution of X², which requires a change-of-variables formula
  • g(X) = max(X, 0) — finding the distribution involves splitting into cases
  • g(X) = e^X — the new distribution might not have a standard name

LOTUS sidesteps all of this. You stay in the original distribution's world.
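To make that concrete, here is a Python sketch where the same one-line LOTUS sum handles two of these awkward transforms for a discrete X, with no case-splitting and no new PMF (the support and helper name are my own choices for illustration):

```python
import math

# X uniform on {-2, -1, 0, 1, 2}
pmf = {x: 0.2 for x in range(-2, 3)}

def lotus(g, pmf):
    # E[g(X)] = sum of g(x) * P(X = x), straight from the distribution of X
    return sum(g(x) * p for x, p in pmf.items())

e_relu = lotus(lambda x: max(x, 0), pmf)  # (0 + 0 + 0 + 1 + 2) / 5 = 0.6
e_exp = lotus(math.exp, pmf)              # (e^-2 + e^-1 + 1 + e + e^2) / 5
```

The transform changes; the recipe never does.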

X ~ Expo(1). What is E[X²]? (Hint: Var(X) = E[X²] − (E[X])², and for Expo(1), E[X]=1, Var=1) (whole number)
X is uniform on {−2,−1,0,1,2}. What is E[|X|]? (decimal, e.g. 0.42)

Summary

Concept               Formula / takeaway
LOTUS (discrete)      E[g(X)] = Σₓ g(x) P(X = x)
LOTUS (continuous)    E[g(X)] = ∫ g(x) f(x) dx
Key benefit           No need to find the distribution of g(X)
Common use            Computing E[X²], variance, moment generating functions

Whenever you need E[g(X)], go straight to LOTUS. Don't waste time deriving the distribution of g(X) unless you actually need that distribution for something else.

What's next

LOTUS computes one moment at a time. Is there a single function that encodes all moments at once? Yes — the moment generating function.