Chebyshev's inequality bounds the probability that a random variable deviates from its mean by more than a given amount, using only the mean and variance. For any random variable X with mean μ and variance σ², and any a > 0: P(|X - μ| ≥ a) ≤ σ² / a².
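As a quick sanity check, the bound can be verified empirically. The sketch below (a hypothetical illustration, using exponential data purely as an example of a finite-variance distribution) compares the empirical deviation probability against σ²/a² for a few values of a:

```python
import random

# Illustrative check of P(|X - mu| >= a) <= sigma^2 / a^2 on simulated data.
# The exponential(1) distribution (mean 1, variance 1) is an arbitrary choice;
# any distribution with finite variance works.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

mu = sum(samples) / n
var = sum((x - mu) ** 2 for x in samples) / n

for a in [1.0, 2.0, 3.0]:
    empirical = sum(abs(x - mu) >= a for x in samples) / n
    bound = var / a**2
    print(f"a={a}: empirical={empirical:.4f} <= Chebyshev bound={bound:.4f}")
```

For heavy deviations the bound is loose (the true tail of the exponential decays much faster than 1/a²), which illustrates the "general but conservative" trade-off.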
How Chebyshev's Inequality Works
Chebyshev's inequality bounds the probability of a deviation from the mean by measuring how far we are in units of standard deviation. It applies to any distribution with finite variance, which makes it extremely general but often conservative.
Setting a = kσ gives the equivalent form P(|X - μ| ≥ kσ) ≤ 1/k²: the probability of being at least k standard deviations from the mean is at most 1/k². For example, for any distribution, at least 75% of values lie within 2 standard deviations of the mean.
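The k-sigma form can be tabulated directly; this short sketch prints the guaranteed minimum coverage 1 − 1/k² for a few values of k:

```python
# Minimum fraction of any finite-variance distribution's mass guaranteed
# to lie within k standard deviations of the mean, by Chebyshev's inequality.
for k in [2, 3, 5]:
    coverage = 1 - 1 / k**2
    print(f"k={k}: at least {coverage:.1%} within {k} standard deviations")
```

Compare these guarantees (75%, ~88.9%, 96%) with the normal distribution's ~95.4% at k = 2: the distribution-free bound is much weaker, because it must hold for every distribution, including the worst case.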
Chebyshev's inequality is used in quality control, confidence interval construction, and proving the weak law of large numbers, and more generally whenever distribution-free probability bounds are needed from only the mean and variance.
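The weak-law application deserves a one-line sketch. Applying Chebyshev's inequality to the sample mean X̄ of n i.i.d. variables with variance σ² (whose variance is σ²/n) gives P(|X̄ − μ| ≥ ε) ≤ σ²/(nε²), which tends to 0 as n grows. The values of σ² and ε below are arbitrary illustrative choices:

```python
# Chebyshev bound on the sample mean: P(|X_bar - mu| >= eps) <= sigma^2 / (n * eps^2).
# As n grows the bound shrinks to 0, which proves the weak law of large numbers.
sigma2, eps = 1.0, 0.1
for n in [100, 10_000, 1_000_000]:
    bound = sigma2 / (n * eps**2)
    print(f"n={n}: P(|sample mean - mu| >= {eps}) <= {bound:.6f}")
```

Note that this argument needs only finite variance, not any assumption about the shape of the underlying distribution.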