







Continuous Uniform Distribution: Equal Likelihood Over an Interval


The probabilistic experiment behind the continuous uniform distribution consists of selecting a value at random from a fixed interval, where no value within the interval is preferred over another. Any subinterval of equal length has the same probability of being selected. The randomness lies purely in position within the interval, not in frequency or accumulation of events.



The Probabilistic Experiment Behind the Continuous Uniform Distribution


The probabilistic experiment behind the continuous uniform distribution begins with the assumption that an outcome can occur anywhere within a fixed interval, and that no location inside that interval is preferred over another. The key idea is not randomness over trials, but randomness over position or value. Once the bounds of the interval are set, the experiment treats every point between them as equally plausible.

This type of experiment arises when the only information available is that a value lies somewhere between two limits, and there is no mechanism that biases one sub-interval over another. The uncertainty is purely geometric: probability corresponds to relative length, not to isolated points. Because the outcomes form a continuum, individual values have zero probability; only ranges matter.

The defining feature of this experiment is complete symmetry across the interval. If one interval is twice as long as another, it is twice as likely to contain the outcome. Nothing else distinguishes outcomes.

Like the discrete uniform distribution, the continuous uniform distribution is built on the idea of complete symmetry: no outcome is favored over another. The difference lies only in how probability is assigned: the discrete uniform spreads probability across a finite set of distinct values, while the continuous uniform spreads it evenly across an entire interval, with probability determined by length rather than individual points.

Example:

Choose a real number at random between 0 and 10, with no additional information. The chance that the number lies between 2 and 4 depends only on the interval length (2 units), not on where it sits inside the range.

Notation


X \sim U(a, b) — distribution of the random variable.

X \sim \text{Uniform}(a, b) — alternative explicit form.

U(a, b) or \text{Unif}(a, b) — used to denote the distribution itself (not the random variable).

U(0, 1) — the standard uniform distribution on the unit interval.

Note: The continuous uniform distribution is distinct from the discrete uniform distribution. The continuous version has a probability density function, while the discrete version has a probability mass function.


Parameters


a: lower bound of the interval, where a \in \mathbb{R}

b: upper bound of the interval, where b \in \mathbb{R} and b > a

The continuous uniform distribution is fully characterized by these two parameters. a and b define the endpoints of the interval where the random variable can take values. The distribution assigns equal probability density to every point in this interval, making it the simplest continuous distribution.

Probability Density Function (PDF) and Support (Range)


Probability Density Function (PDF)


The probability density function (PDF) of a continuous uniform distribution is given by:

f(x) = \begin{cases} \frac{1}{b-a} & \text{if } a \leq x \leq b \\ 0 & \text{otherwise} \end{cases}


Intuition Behind the Formula


Constant Density: The uniform distribution has a flat, rectangular shape. The probability density is the same at every point within the interval [a, b].

Parameters:
• a: The minimum value the random variable can take
• b: The maximum value the random variable can take
• The width of the interval is b - a

Support (Range of the Random Variable):
• The random variable X can take any value in the closed interval [a, b]
• Outside this interval, the probability density is zero
• The support is the finite interval [a, b]

Logic Behind the Formula:
• \frac{1}{b-a}: The constant density ensures the total area under the curve equals 1
• The height of the rectangle is \frac{1}{b-a} and the width is b - a, so area = height × width = 1
• All subintervals of the same length have equal probability
• The total area under the curve equals 1:

\int_{a}^{b} \frac{1}{b-a}\,dx = \frac{1}{b-a} \cdot (b-a) = 1


Practical Example: A bus arrives at a stop every 10 minutes. If you arrive at a random time, your wait time X follows U(0, 10). The PDF is f(x) = \frac{1}{10} = 0.1 for 0 \leq x \leq 10. Any 2-minute interval within this range has the same probability: P(3 \leq X \leq 5) = P(7 \leq X \leq 9) = 0.1 \times 2 = 0.2.
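The density and interval-probability calculations above translate directly into a short Python sketch (the function names are illustrative, not from any particular library):

```python
def uniform_pdf(x, a, b):
    """Density of Uniform(a, b): constant 1/(b-a) inside [a, b], zero outside."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def interval_prob(c, d, a, b):
    """P(c <= X <= d) for X ~ Uniform(a, b): overlap length times density."""
    lo, hi = max(c, a), min(d, b)
    return max(hi - lo, 0.0) / (b - a)

# Bus example: wait time X ~ U(0, 10)
print(uniform_pdf(4.0, 0, 10))      # 0.1
print(interval_prob(3, 5, 0, 10))   # 0.2
print(interval_prob(7, 9, 0, 10))   # 0.2
```

Note that `interval_prob` clips the query interval to the support first, so ranges partially outside [a, b] are handled correctly.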

[Figure: Continuous uniform distribution PDF, constant probability density over the interval]

Explanation

The continuous uniform distribution has constant probability density over the interval [a, b]. The probability density function is f(x) = \frac{1}{b-a} for a \leq x \leq b, and 0 otherwise. The expected value is E[X] = \frac{a+b}{2} and the variance is \text{Var}(X) = \frac{(b-a)^2}{12}. This distribution models situations where all values in an interval are equally likely, such as the position of a randomly thrown dart on a board, random arrival times within a time window, or measurement errors uniformly distributed within tolerances.


Cumulative Distribution Function (CDF)



The cumulative distribution function (CDF) gives the probability that X is less than or equal to a specific value:

F(x) = \begin{cases} 0 & \text{if } x < a \\ \frac{x-a}{b-a} & \text{if } a \leq x \leq b \\ 1 & \text{if } x > b \end{cases}


Key Properties:
• F(a) = 0 (no probability mass below the lower bound)
• F(b) = 1 (all probability mass is within [a, b])
• The CDF increases linearly from 0 to 1 within the interval [a, b]
• For any x in [a, b], F(x) represents the fraction of the interval covered from a to x

Practical Use: The CDF of the uniform distribution is particularly simple. To find P(X \leq x) when X \sim U(0, 10) and x = 3: F(3) = \frac{3-0}{10-0} = 0.3, meaning 30% of values fall below 3. For any probability calculation: P(c \leq X \leq d) = F(d) - F(c) = \frac{d-a}{b-a} - \frac{c-a}{b-a} = \frac{d-c}{b-a}.
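The piecewise CDF and the subtraction rule can be sketched in Python as follows (the function name is illustrative):

```python
def uniform_cdf(x, a, b):
    """P(X <= x) for X ~ Uniform(a, b): 0 below a, linear on [a, b], 1 above b."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# X ~ U(0, 10)
print(uniform_cdf(3, 0, 10))                          # 0.3
print(uniform_cdf(5, 0, 10) - uniform_cdf(3, 0, 10))  # P(3 <= X <= 5), approximately 0.2
```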

[Figure: Continuous uniform distribution CDF, linear increase from 0 to 1 over [a, b]]

CDF Explanation

The cumulative distribution function (CDF) of the continuous uniform distribution is F(x) = \frac{x-a}{b-a} for a \leq x \leq b, F(x) = 0 for x < a, and F(x) = 1 for x > b. The CDF shows the probability that the random variable X is less than or equal to x, i.e., P(X \leq x). For the uniform distribution, this probability increases linearly from 0 to 1 across the interval. This means that the probability of landing in the first half of the interval is exactly 0.5, and the probability increases uniformly as we move through the interval.

Expected Value (Mean)


For continuous distributions, the expected value emerges from integrating the product of each value with its density across the entire support. Applying the continuous expected value definition to the uniform distribution:

E[X] = \int_{-\infty}^{\infty} x \cdot f(x) \, dx


Formula


E[X] = \frac{a + b}{2}


Where:
• a = minimum value of the interval
• b = maximum value of the interval

Derivation and Intuition


The continuous uniform distribution has PDF f(x) = \frac{1}{b-a} for a \leq x \leq b, and 0 elsewhere. Computing the expected value:

E[X] = \int_{a}^{b} x \cdot \frac{1}{b-a} \, dx = \frac{1}{b-a} \int_{a}^{b} x \, dx


E[X] = \frac{1}{b-a} \left[\frac{x^2}{2}\right]_{a}^{b} = \frac{1}{b-a} \cdot \frac{b^2 - a^2}{2}


E[X] = \frac{1}{b-a} \cdot \frac{(b-a)(b+a)}{2} = \frac{b+a}{2}


The result E[X] = \frac{a+b}{2} is the midpoint of the interval [a, b]. This makes perfect intuitive sense: when all values in an interval are equally likely, the average value is simply the center of that interval. The distribution is perfectly symmetric, and the expected value is the balance point.

This formula is identical to that of the discrete uniform distribution, showing that whether we count discrete equally spaced values or measure continuous equally dense values, the average lands at the center.

Example


Suppose a bus arrives randomly between 2:00 PM and 2:30 PM, modeled as uniform on the interval [0, 30] minutes after 2:00 PM:

E[X] = \frac{0 + 30}{2} = 15 \text{ minutes}


The expected arrival time is 2:15 PM, exactly halfway through the 30-minute window. If you repeatedly sample random arrival times, the long-run average will converge to 15 minutes after 2:00 PM.
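The long-run convergence can be checked with a quick simulation sketch using Python's standard library:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
# Draw many random arrival times from Uniform(0, 30)
samples = [random.uniform(0, 30) for _ in range(100_000)]
sample_mean = sum(samples) / len(samples)

print(round(sample_mean, 1))  # close to the theoretical mean (0 + 30) / 2 = 15
```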

Variance and Standard Deviation


The variance of a continuous random variable quantifies the spread of values around the expected value (mean). For continuous distributions, it is calculated through integration:

\mathrm{Var}(X) = \mathbb{E}[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x) \, dx


Alternatively, using the computational formula:

\mathrm{Var}(X) = \mathbb{E}[X^2] - \mu^2


For the continuous uniform distribution, this calculation produces a simple relationship with the interval length.

Formula


\mathrm{Var}(X) = \frac{(b-a)^2}{12}


Where:
• a = minimum value of the interval
• b = maximum value of the interval

Derivation and Intuition


Starting with the computational formula, we need to calculate \mathbb{E}[X^2].

We know from the expected value section that \mu = \frac{a+b}{2}.

The continuous uniform distribution has PDF f(x) = \frac{1}{b-a} for a \leq x \leq b:

\mathbb{E}[X^2] = \int_{a}^{b} x^2 \cdot \frac{1}{b-a} \, dx = \frac{1}{b-a} \int_{a}^{b} x^2 \, dx


\mathbb{E}[X^2] = \frac{1}{b-a} \left[\frac{x^3}{3}\right]_{a}^{b} = \frac{1}{b-a} \cdot \frac{b^3 - a^3}{3}


Using the factorization b^3 - a^3 = (b-a)(b^2 + ab + a^2):

\mathbb{E}[X^2] = \frac{b^2 + ab + a^2}{3}


Applying the computational formula:

\mathrm{Var}(X) = \frac{b^2 + ab + a^2}{3} - \left(\frac{a+b}{2}\right)^2


After algebraic simplification:

\mathrm{Var}(X) = \frac{(b-a)^2}{12}


The result \mathrm{Var}(X) = \frac{(b-a)^2}{12} shows that variance depends only on the interval width (b-a). The divisor of 12 appears because probability mass is spread uniformly, not concentrated anywhere. A wider interval means greater spread; location doesn't matter, only the range size.

This formula mirrors the discrete uniform distribution, differing only in technical details between continuous and discrete cases.

Standard Deviation


\sigma = \frac{b-a}{\sqrt{12}} = \frac{b-a}{2\sqrt{3}} \approx 0.289(b-a)


Example


Suppose a bus arrives randomly between 2:00 PM and 2:30 PM, modeled as uniform on the interval [0, 30] minutes:

\mathrm{Var}(X) = \frac{(30-0)^2}{12} = \frac{900}{12} = 75 \text{ minutes}^2


\sigma = \frac{30}{\sqrt{12}} \approx 8.66 \text{ minutes}


The variance of 75 min² and standard deviation of about 8.66 minutes indicate moderate spread. While the expected arrival is 15 minutes after 2:00 PM, typical variations are around ±8.66 minutes, meaning arrivals between roughly 6 and 24 minutes are common.
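These values follow directly from the formulas; a short sketch compares the closed-form variance against a simulated one:

```python
import math
import random

a, b = 0, 30
var_theory = (b - a) ** 2 / 12       # (b-a)^2 / 12 = 75.0
sd_theory = (b - a) / math.sqrt(12)  # about 8.66

random.seed(0)
samples = [random.uniform(a, b) for _ in range(200_000)]
m = sum(samples) / len(samples)
var_sample = sum((x - m) ** 2 for x in samples) / len(samples)

print(var_theory)            # 75.0
print(round(sd_theory, 2))   # 8.66
print(round(var_sample, 1))  # close to 75
```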

Mode and Median

Mode


The mode is the value where the probability density function reaches its maximum—the peak of the distribution curve.

For the continuous uniform distribution, every value in the support has the same density:

\text{Mode} = \text{Any value in } [a, b]


Intuition: The continuous uniform PDF is constant:

f(x) = \frac{1}{b-a} \text{ for } a \leq x \leq b


Since the density function is flat across the entire interval, there is no single peak. Every point in [a, b] is equally probable in terms of density, so every value is technically a mode.

This is the continuous analog of the discrete uniform distribution, where all outcomes are equally likely.

Example:
For a bus arriving uniformly between 2:00 PM and 2:30 PM (interval [0, 30] minutes):

Mode = Any value in [0, 30]

There is no "most likely" arrival time: all times are equally probable. The density is constant at f(x) = \frac{1}{30} across the entire half-hour window.

Median


The median is the value m such that P(X \leq m) = 0.5, the point that divides the distribution's probability in half.

For the continuous uniform distribution, the median is:

\text{Median} = \frac{a + b}{2}


Derivation: Using the CDF for the uniform distribution:

F(x) = \frac{x - a}{b - a} \text{ for } a \leq x \leq b


Setting F(m) = 0.5:

\frac{m - a}{b - a} = 0.5


m - a = 0.5(b - a)


m = a + 0.5(b - a) = \frac{a + b}{2}


Intuition: Because the distribution is perfectly symmetric and uniform, the point that divides probability in half is simply the midpoint of the interval. This also equals the mean, as with all symmetric distributions.

Example:
For a bus arriving uniformly between 2:00 PM and 2:30 PM (interval [0, 30] minutes):

\text{Median} = \frac{0 + 30}{2} = 15 \text{ minutes}


Half of arrivals occur before 2:15 PM, and half occur after. This is also the mean arrival time.

Properties:
• For the continuous uniform distribution: median = mean = \frac{a+b}{2}
• The mode is not uniquely defined (all values are equally dense)
• Perfect symmetry ensures the median and mean coincide
• The median depends only on the endpoints a and b: it is their midpoint
• This matches the discrete uniform distribution, where the median is also the midpoint

Quantiles/Percentiles


A quantile is a value that divides the distribution at a specific probability threshold. The p-th quantile x_p satisfies:

P(X \leq x_p) = p


where 0 < p < 1.

Percentiles are quantiles expressed as percentages: the k-th percentile corresponds to the quantile at p = k/100. For example, the 25th percentile is the 0.25 quantile, the 50th percentile is the median, and the 75th percentile is the 0.75 quantile.

Quantiles are found by inverting the CDF: if F(x_p) = p, then x_p = F^{-1}(p).

Finding Quantiles for the Continuous Uniform Distribution


For a continuous uniform distribution on the interval [a, b], the CDF is:

F(x) = \frac{x - a}{b - a} \text{ for } a \leq x \leq b


To find the p-th quantile, we solve F(x_p) = p:

\frac{x_p - a}{b - a} = p


x_p - a = p(b - a)


x_p = a + p(b - a)


This gives the simple quantile formula:

x_p = a + p(b - a)


The p-th quantile is located a fraction p of the way from a to b along the interval. This linear relationship makes quantiles for the uniform distribution particularly intuitive.
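The inverse-CDF formula reduces to a one-line quantile function in Python (the name `uniform_quantile` is illustrative):

```python
def uniform_quantile(p, a, b):
    """p-th quantile of Uniform(a, b): invert F(x) = (x - a)/(b - a)."""
    if not 0 <= p <= 1:
        raise ValueError("p must be in [0, 1]")
    return a + p * (b - a)

# Bus example: arrival time ~ Uniform(0, 30) minutes
print(uniform_quantile(0.25, 0, 30))  # 7.5
print(uniform_quantile(0.50, 0, 30))  # 15.0
print(uniform_quantile(0.75, 0, 30))  # 22.5
```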

Common Percentiles


25th Percentile (First Quartile, Q1):

x_{0.25} = a + 0.25(b - a) = a + \frac{b - a}{4} = \frac{3a + b}{4}


Exactly 25% of values fall below this point, located one-quarter of the way from a to b.

50th Percentile (Median, Q2):

x_{0.50} = a + 0.50(b - a) = a + \frac{b - a}{2} = \frac{a + b}{2}


This is the median, located exactly at the midpoint.

75th Percentile (Third Quartile, Q3):

x_{0.75} = a + 0.75(b - a) = a + \frac{3(b - a)}{4} = \frac{a + 3b}{4}


Exactly 75% of values fall below this point, located three-quarters of the way from a to b.

Interquartile Range (IQR):

\text{IQR} = Q3 - Q1 = \frac{a + 3b}{4} - \frac{3a + b}{4} = \frac{b - a}{2}


The IQR spans exactly half the total interval width, containing the middle 50% of the distribution.

Example


For a bus arriving uniformly between 2:00 PM and 2:30 PM (interval [0, 30] minutes):

25th percentile: 0 + 0.25(30 - 0) = 7.5 minutes

25% of arrivals occur before 2:07:30 PM.

50th percentile: 0 + 0.50(30 - 0) = 15 minutes

Half of arrivals occur before 2:15 PM (the median).

75th percentile: 0 + 0.75(30 - 0) = 22.5 minutes

75% of arrivals occur before 2:22:30 PM.

IQR: 22.5 - 7.5 = 15 minutes

The middle 50% of arrivals span a 15-minute window from 2:07:30 PM to 2:22:30 PM.

Other Notable Percentiles


10th percentile: x_{0.10} = a + 0.10(b - a) (10% of values below this)

90th percentile: x_{0.90} = a + 0.90(b - a) (90% of values below this)

95th percentile: x_{0.95} = a + 0.95(b - a) (95% of values below this)

For the uniform distribution, every percentile divides the interval proportionally: the p-th percentile is always located a fraction p of the way along the interval from a to b. This makes the uniform distribution the simplest case for understanding quantiles.

Real-World Examples and Common Applications


The continuous uniform distribution models situations where any value within an interval is equally likely, representing complete uncertainty within defined bounds.

Common Applications


Random Number Generation:
• Computational random number generators produce uniform [0, 1] values
• Simulations and Monte Carlo methods
• Cryptographic applications
• Statistical sampling algorithms

Scheduling and Timing:
• Random arrival times within a window
• Bus/train arrival when schedule unknown
• Meeting start times with imprecise information
• Random delays or offsets

Physical and Geometric Problems:
• Random point selection on a line segment or interval
• Angle measurements in certain contexts
• Rounding errors in numerical computations
• Random coordinates within bounded regions

Decision Making Under Ignorance:
• Modeling complete uncertainty about a parameter
• Bayesian prior distributions (non-informative priors)
• Initial estimates before data collection

Why It Appears


The uniform distribution represents maximum entropy—when you know only the minimum and maximum possible values, the uniform distribution assumes nothing else. It's the "default" model for complete ignorance within bounds.

It also serves as the foundation for generating other distributions: by transforming uniform [0, 1] random variables using inverse CDF methods, any continuous distribution can be simulated.
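As an illustration of the inverse-CDF idea (a sketch, not any particular library's API), an exponential variable can be generated from a standard uniform draw:

```python
import math
import random

def exponential_from_uniform(rate, rng=random):
    """Inverse-CDF transform: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution."""
    u = rng.random()
    return -math.log(1.0 - u) / rate

random.seed(0)
draws = [exponential_from_uniform(2.0) for _ in range(100_000)]
print(round(sum(draws) / len(draws), 2))  # near the true mean 1/rate = 0.5
```

The same recipe works for any distribution whose inverse CDF can be evaluated: draw U ~ Uniform(0, 1) and return F^{-1}(U).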

Example Application


A digital clock displays time in whole minutes. When you glance at it, the actual time has passed some fraction into the current minute. This fraction is uniformly distributed on [0, 1) minutes, with mean 30 seconds.

For scheduling, if a bus arrives "sometime between 2:00 and 2:30" with no further information, modeling arrival time as \text{Uniform}(0, 30) minutes reflects complete uncertainty, giving an expected wait of 15 minutes.



Special Cases


The continuous uniform distribution, while mathematically simple, exhibits several special cases and edge behaviors worth understanding.

Unit Interval Uniform Distribution


When a = 0 and b = 1, we obtain the standard uniform distribution \text{Uniform}(0, 1):

f(x) = 1 \text{ for } 0 \leq x \leq 1


This is the fundamental distribution in random number generation: all other continuous distributions can be generated by transforming uniform [0, 1] random variables using the inverse CDF method.

Any uniform random variable X \sim \text{Uniform}(a, b) can be transformed to standard uniform:

U = \frac{X - a}{b - a} \sim \text{Uniform}(0, 1)
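The standardizing transform can be checked numerically with a standard-library sketch:

```python
import random

random.seed(1)
a, b = 2.0, 12.0
xs = [random.uniform(a, b) for _ in range(100_000)]
us = [(x - a) / (b - a) for x in xs]  # U = (X - a)/(b - a)

print(all(0.0 <= u <= 1.0 for u in us))  # True: standardized values lie in [0, 1]
print(round(sum(us) / len(us), 2))       # near 0.5, the mean of Uniform(0, 1)
```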


As Interval Width Approaches Zero


As b - a \to 0 (with midpoint \frac{a+b}{2} held constant), the uniform distribution degenerates to a point mass:

\lim_{b \to a} \text{Uniform}(a, b) = \delta_{\frac{a+b}{2}}


All probability concentrates at a single value. The distribution becomes deterministic: no randomness remains.

As Interval Width Increases


As b - a \to \infty, the interval grows without bound:

• The density f(x) = \frac{1}{b-a} \to 0 (probability spreads thinner)
• Variance \frac{(b-a)^2}{12} \to \infty (uncertainty grows)
• The distribution can no longer be normalized over the entire real line

Unlike the normal distribution, a uniform distribution over (-\infty, \infty) cannot exist as a proper probability distribution.

Symmetric Interval Around Zero


When a = -c and b = c for some c > 0, the distribution is symmetric around zero:

X \sim \text{Uniform}(-c, c)


Properties:
• Mean = 0
• Variance = \frac{(2c)^2}{12} = \frac{c^2}{3}
• Useful for modeling symmetric errors or deviations

Limiting Behavior and Connections


Relationship to Discrete Uniform:
The continuous uniform is the limiting case of the discrete uniform distribution as the number of possible values increases and their spacing decreases:

\lim_{n \to \infty} \text{DiscreteUniform}\{a, a + \Delta, a + 2\Delta, \ldots, b\} = \text{Uniform}(a, b)


where \Delta = \frac{b-a}{n}, with convergence in distribution as the grid becomes infinitely fine.

Maximum Entropy:
Among all continuous distributions on [a, b], the uniform distribution has maximum entropy: it represents the state of complete ignorance about where in the interval a value might fall.

Sum of Uniforms:
Unlike the normal distribution, which is closed under addition, sums of uniform random variables do not remain uniform. The sum of n independent \text{Uniform}(0, 1) variables (the Irwin-Hall distribution) approaches the normal distribution by the Central Limit Theorem.

Practical Implications


Measurement Rounding:
Digital instruments that round to the nearest unit create uniform rounding errors on [-0.5, 0.5]. As precision decreases (rounding to the nearest 10, 100, etc.), the interval widens proportionally.
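A quick sketch shows rounding errors behaving like Uniform(-0.5, 0.5), whose variance is 1/12:

```python
import random

random.seed(7)
values = [random.uniform(0, 1000) for _ in range(100_000)]
errors = [round(v) - v for v in values]  # error from rounding to the nearest integer

print(round(min(errors), 2), round(max(errors), 2))  # spans roughly -0.5 to 0.5
mean_sq = sum(e * e for e in errors) / len(errors)
print(round(mean_sq, 3))  # near Var = 1/12, about 0.083
```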

Bounded Uncertainty:
When you know a parameter lies between a and b but have no other information, the uniform prior represents minimal assumptions, favoring neither high nor low values within the range.

Properties


The continuous uniform distribution has simple mathematical properties stemming from its constant density across the support interval.

Symmetry


The uniform distribution is perfectly symmetric around its midpoint \frac{a+b}{2}:

f\left(\frac{a+b}{2} + x\right) = f\left(\frac{a+b}{2} - x\right) \text{ for } |x| \leq \frac{b-a}{2}


This symmetry implies:
• Mean = Median = \frac{a+b}{2}
• Mode is not uniquely defined (all values equally probable)
• The distribution is mirror-symmetric about its center

Skewness


\text{Skewness} = 0


Zero skewness confirms perfect symmetry. The uniform distribution has no tendency toward either tail—both sides balance exactly.

Kurtosis


\text{Kurtosis} = 1.8


\text{Excess Kurtosis} = -1.2


The uniform distribution has negative excess kurtosis (platykurtic). With kurtosis 1.8 < 3, it has lighter tails than the normal distribution. The flat shape means probability is spread evenly rather than concentrated at the center or in the tails.

Tail Behavior


The uniform distribution has no tails in the traditional sense:

f(x) = \begin{cases} \frac{1}{b-a} & \text{for } a \leq x \leq b \\ 0 & \text{otherwise} \end{cases}


Probability density is constant within [a, b] and exactly zero outside this interval. There is an abrupt cutoff at the boundaries, with no gradual decay.

This represents the opposite extreme from heavy-tailed distributions:
• No values outside [a, b] are possible
• All values within [a, b] are equally likely
• No concept of "rare extreme events"

Unique Mathematical Properties


Constant Density:
The defining property is uniform density:

f(x) = \frac{1}{b-a} \text{ for all } x \in [a, b]


This makes many calculations algebraically simple.

Maximum Entropy on Bounded Interval:
Among all continuous distributions on [a, b], the uniform distribution has the highest (differential) entropy:

H = \ln(b - a)


It represents maximum uncertainty or minimum information: complete ignorance about where in [a, b] a value falls.

Probability Proportional to Length:
For any subinterval [c, d] \subseteq [a, b]:

P(c \leq X \leq d) = \frac{d - c}{b - a}


Probability is simply the length of the subinterval divided by the total interval length.

Independence from Location:
The distribution shape depends only on the interval width b - a, not on its location. Shifting preserves uniformity: if X \sim \text{Uniform}(a, b), then

X + c \sim \text{Uniform}(a+c, b+c) \text{ for any } c


Useful Identities


Linear Transformation:
If X \sim \text{Uniform}(a, b), then:

Y = cX + d \sim \text{Uniform}(ca + d, cb + d) \text{ for } c > 0


Linear transformations preserve uniformity (with appropriate parameter adjustments).

Standardization:
Any uniform variable can be transformed to standard uniform:

U = \frac{X - a}{b - a} \sim \text{Uniform}(0, 1)


Inverse CDF Method:
The uniform CDF is:

F(x) = \frac{x - a}{b - a} \text{ for } a \leq x \leq b


with inverse:

F^{-1}(p) = a + p(b - a)


This makes generating uniform random variables and computing quantiles trivial.

Order Statistics:
For n independent \text{Uniform}(0, 1) variables, the k-th order statistic follows a \text{Beta}(k, n-k+1) distribution. In particular, the minimum and maximum have simple expected values:

\mathbb{E}[\min(X_1, \ldots, X_n)] = \frac{1}{n+1}, \qquad \mathbb{E}[\max(X_1, \ldots, X_n)] = \frac{n}{n+1}


For \text{Uniform}(a, b), rescale accordingly: the expected minimum is a + \frac{b-a}{n+1}.
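A simulation sketch agrees with the Beta(1, n) expectation for the minimum of n standard uniform draws:

```python
import random

random.seed(3)
n, trials = 5, 200_000
# Minimum of n independent Uniform(0, 1) draws, repeated many times
mins = [min(random.random() for _ in range(n)) for _ in range(trials)]
est = sum(mins) / trials

print(round(est, 3))  # theory: E[min] = 1/(n+1) = 1/6, about 0.167
```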


Not Closed Under Addition:
Unlike the normal distribution, sums of independent uniform variables do not remain uniform. Instead, the sum approaches normal by the Central Limit Theorem.