A perfectly fair spinner

Imagine spinning a pointer that can land at any angle between 0 and 360 degrees. No region is favored over another. What's the probability it lands between 90° and 180°?

What do you think?
A spinner can land anywhere between 0° and 360°. What's P(90° ≤ X ≤ 180°)?

This "equally likely over an interval" idea is the Uniform distribution — the simplest continuous distribution.

Definition

Uniform Distribution

A random variable $X$ is uniformly distributed on $[a, b]$ if its PDF is

$$f(x) = \frac{1}{b - a}, \quad a \leq x \leq b$$

and $f(x) = 0$ otherwise. We write $X \sim \text{Unif}(a, b)$.

The PDF is a flat rectangle. Every subinterval of the same length gets the same probability.

For any subinterval $[c, d] \subseteq [a, b]$:

$$P(c \leq X \leq d) = \frac{d - c}{b - a}$$

[Interactive: Uniform Distribution — adjustable endpoints $a, b$ and query interval $[c, d]$, displaying $P(c < X < d)$, $E[X]$, $\text{Var}(X)$, and $\text{SD}(X)$]

Adjust the distribution endpoints and the query interval. The probability is just the ratio of the shaded width to the total width.

The CDF

Integrating the flat PDF gives a straight line:

$$F(x) = \begin{cases} 0 & x < a \\ \frac{x - a}{b - a} & a \leq x \leq b \\ 1 & x > b \end{cases}$$

This makes the Uniform distribution the easiest to work with: the CDF is just linear interpolation.
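The piecewise CDF above is short enough to write out directly (`unif_cdf` is a hypothetical helper name, not part of any library):

```python
def unif_cdf(x, a, b):
    """CDF of Unif(a, b): 0 below a, a linear ramp on [a, b], 1 above b."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# Linear interpolation in action for Unif(0, 10):
print(unif_cdf(-1, 0, 10))  # 0.0
print(unif_cdf(5, 0, 10))   # 0.5
print(unif_cdf(10, 0, 10))  # 1.0
```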

Expectation and variance

Uniform Moments

If $X \sim \text{Unif}(a, b)$:

$$E[X] = \frac{a + b}{2}, \qquad \text{Var}(X) = \frac{(b - a)^2}{12}$$

The expected value is the midpoint — no surprise for a symmetric distribution.

The variance depends on the square of the interval width. Wider interval → more spread → higher variance.

For Unif(0, 10), what is E[X]? (whole number)
For Unif(2, 8), what is P(3 ≤ X ≤ 5)? (decimal, e.g. 0.42)
For Unif(0, 12), what is Var(X)? (whole number)
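The moment formulas can be verified by simulation. A minimal sketch, with `unif_moments` as a made-up helper name; the simulated mean and variance of Unif(2, 8) should land near the closed-form values:

```python
import random
import statistics

def unif_moments(a, b):
    """Closed-form mean and variance of Unif(a, b)."""
    return (a + b) / 2, (b - a) ** 2 / 12

print(unif_moments(0, 12))  # (6.0, 12.0)

# Simulation agrees with the formulas for Unif(2, 8):
rng = random.Random(1)
xs = [rng.uniform(2, 8) for _ in range(200_000)]
print(statistics.mean(xs), statistics.variance(xs))  # close to (5.0, 3.0)
```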

Universality of the uniform

The Uniform distribution plays a special role in probability:

If $U \sim \text{Unif}(0,1)$ and $F$ is any CDF, then $X = F^{-1}(U)$ has CDF $F$. This inverse transform means a single Uniform random number can simulate any distribution.

This is how computers generate random variables: start with Uniform draws, transform them.

Example: generating Exponential from Uniform

If $U \sim \text{Unif}(0,1)$, then $X = -\frac{1}{\lambda} \ln(1 - U)$ follows an Exponential distribution with rate $\lambda$.
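This transform is short enough to sketch directly (`exp_inverse_transform` is a hypothetical helper, not a library function). Each Uniform draw is pushed through the inverse of the Exponential CDF $F(x) = 1 - e^{-\lambda x}$:

```python
import math
import random

def exp_inverse_transform(lam, n, seed=0):
    """n Exponential(lam) draws from Uniform(0,1) samples via
    X = -ln(1 - U) / lam, the inverse of F(x) = 1 - exp(-lam * x)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

xs = exp_inverse_transform(lam=1.0, n=200_000)
print(sum(xs) / len(xs))  # close to 1.0, the Exponential(1) mean
```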

Watch the inverse transform in action — draw uniform samples and see them transform into any target distribution:

[Interactive: Inverse Transform Method — draw Uniform(0,1) samples, apply $U \to F^{-1}(U)$, and watch the histogram converge to the true Exponential(1) density]

Key idea: a single Uniform(0,1) random number can generate a sample from any continuous distribution via $U \to F^{-1}(U)$.

Where the uniform appears

| Scenario | Model |
| --- | --- |
| Random point on a line segment | Unif(0, L) |
| Round-off error | Unif(−0.5, 0.5) |
| Random number generators | Unif(0, 1) |
| Arrival time within a fixed interval | Unif(a, b) |
What do you think?
A bus arrives at some random time between 12:00 and 12:30. You arrive at 12:10. What's the probability you have to wait more than 15 minutes?
Enter a decimal, e.g. 0.42

Simulate random arrivals at a bus stop to see the Uniform waiting time distribution emerge:

Bus Stop Wait Time
Buses arrive every T minutes on a fixed schedule. You arrive at a random time. Your wait is Uniform(0, T).
[Interactive: simulate random arrivals with adjustable $T$; compare the average wait to $E[W] = T/2$ and the sample variance to $\text{Var}(W) = T^2/12$ (e.g. 7.5 and 18.75 for $T = 15$)]
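The bus-stop setup can be simulated in a few lines. A sketch under the widget's assumptions (buses every $T = 15$ minutes, random arrival within a cycle; `simulate_waits` is a made-up helper name):

```python
import random

def simulate_waits(T, n, seed=0):
    """Arrive at a uniform random offset within a T-minute bus cycle;
    the wait for the next bus is then Uniform(0, T)."""
    rng = random.Random(seed)
    return [T - rng.uniform(0.0, T) for _ in range(n)]

T = 15
waits = simulate_waits(T, 100_000)
avg = sum(waits) / len(waits)
svar = sum((w - avg) ** 2 for w in waits) / (len(waits) - 1)
print(avg)   # close to E[W] = T/2 = 7.5
print(svar)  # close to Var(W) = T^2/12 = 18.75
```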

Summary

| Property | Formula |
| --- | --- |
| PDF | $f(x) = 1/(b-a)$ on $[a, b]$ |
| CDF | $F(x) = (x-a)/(b-a)$ on $[a, b]$ |
| Expectation | $E[X] = (a+b)/2$ |
| Variance | $\text{Var}(X) = (b-a)^2/12$ |
| Key property | Flat density = equal probability per unit length |

The Uniform distribution is the continuous version of "equally likely." Through the inverse transform, it can generate any other distribution.

What's next

From the flattest possible distribution to the most famous one in all of statistics — the Normal (Gaussian) distribution and its bell-shaped curve.