Markov vs. Chebyshev Inequalities
These two fundamental probability inequalities differ in their requirements and in the strength of the bounds they provide:
Markov's inequality requires only:
• X is non-negative
• E[X] is known
• Bound: P(X ≥ a) ≤ E[X] / a for any a > 0 (see the check after this list)
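As a quick numerical check (a minimal sketch, not the project's own tooling), assuming Python with NumPy and an Exponential(1) sample as the example distribution:

```python
import numpy as np

# Exponential(1) is non-negative with E[X] = 1, so Markov applies.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

a = 5.0
markov_bound = x.mean() / a        # E[X] / a, roughly 0.2
empirical_tail = (x >= a).mean()   # estimate of P(X >= a); exact value is e^-5, about 0.0067

print(f"Markov bound:   {markov_bound:.4f}")
print(f"Empirical tail: {empirical_tail:.4f}")
```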
Chebyshev's inequality requires:
• E[X] = μ is known
• Var(X) = σ² is known and finite
• Bound: P(|X - μ| ≥ a) ≤ σ² / a² for any a > 0 (see the check after this list)
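The same kind of check for Chebyshev, again a sketch assuming NumPy and the same Exponential(1) example (which has μ = σ² = 1):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)  # mu = 1, sigma^2 = 1

mu, var = x.mean(), x.var()
a = 4.0
chebyshev_bound = var / a**2                  # sigma^2 / a^2, roughly 0.0625
empirical_dev = (np.abs(x - mu) >= a).mean()  # P(|X - mu| >= a); exact value is e^-5, about 0.0067

print(f"Chebyshev bound: {chebyshev_bound:.4f}")
print(f"Empirical:       {empirical_dev:.4f}")
```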
Chebyshev uses more information (the variance) and typically produces a tighter bound, because it decays like 1/a² rather than 1/a. Both bound values shrink as the threshold grows: the further into the tail you look, the smaller the guaranteed probability, and the more Chebyshev's quadratic decay pays off.
The visualization tools demonstrate both bounds across multiple distributions, showing how they compare to actual tail probabilities.
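The tools themselves are not reproduced here; the sketch below, assuming Matplotlib and SciPy, shows the kind of comparison involved for an Exponential(1) variable, applying Markov directly to P(X ≥ a) and Chebyshev via P(X ≥ a) ≤ P(|X - μ| ≥ a - μ) for a > μ:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import expon

mu, var = 1.0, 1.0               # Exponential(1): E[X] = Var(X) = 1
a = np.linspace(2.0, 10.0, 200)  # thresholds beyond the mean

markov = mu / a                  # P(X >= a) <= E[X] / a
chebyshev = var / (a - mu) ** 2  # P(X >= a) <= sigma^2 / (a - mu)^2
actual = expon.sf(a)             # exact tail: P(X >= a) = e^(-a)

plt.semilogy(a, markov, label="Markov bound")
plt.semilogy(a, chebyshev, label="Chebyshev bound")
plt.semilogy(a, actual, label="Actual tail probability")
plt.xlabel("threshold a")
plt.ylabel("probability (log scale)")
plt.legend()
plt.show()
```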