Poisson Distribution: Event Counts Over an Interval


The Poisson distribution models an experiment where events occur randomly over a continuous interval of time or space at a constant average rate. There are no discrete trials and no fixed number of opportunities; instead, the experiment observes how many events happen within a given interval. The defining assumptions are independence of events and stability of the average rate, making this distribution suitable for modeling counts of rare or spontaneous occurrences.



The Probabilistic Experiment Behind the Poisson Distribution


The Poisson distribution models the number of events occurring in a fixed interval of time or space, when events happen independently at a constant average rate. Unlike other discrete distributions, it does not rely on repeated Bernoulli trials or success–failure experiments.

The defining idea is event intensity, not trial structure. Events are assumed to occur randomly but with a stable long-run average frequency. The random variable represents the number of events that occur in any given interval of time, regardless of when they occur within it.

The Poisson distribution is especially effective for representing rare or spontaneous events, and it often arises as an approximation to the binomial distribution when the number of trials is large and the probability of success is small.


Example:

Counting the number of emails received by a support desk in one hour when emails arrive randomly but at a stable average rate. The exact timing does not matter — only the total count within the hour.

Notation Used


* $X \sim \text{Poisson}(\lambda)$ or $X \sim \mathcal{P}(\lambda)$ — distribution of the random variable.

* $\text{Poisson}(\lambda)$ — used to denote the distribution itself (not the random variable).

* $\text{P}(\lambda)$ — sometimes used informally, especially in compact notation.

* $P(X = k) = \dfrac{\lambda^k e^{-\lambda}}{k!}, \quad \text{for } k = 0, 1, 2, \ldots$ — probability mass function (PMF), where:
  * $\lambda$ — average rate of occurrence (expected number of events in the interval)
  * $k$ — number of events observed
  * $e \approx 2.71828$ — Euler's number (base of the natural logarithm)

Key properties:

* $E[X] = \lambda$ — expected value (mean)
* $\text{Var}(X) = \lambda$ — variance (equal to the mean)

Relationship to binomial distribution:

* $\text{Poisson}(\lambda) \approx \text{Binomial}(n, p)$ with $\lambda = np$, when $n$ is large and $p$ is small (rare-events approximation)
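This rare-events approximation is easy to check numerically. The sketch below compares the exact binomial PMF against the Poisson PMF with the same mean; the specific values $n = 1000$ and $p = 0.004$ are illustrative choices, not from the text.

```python
import math

def binomial_pmf(k, n, p):
    """Exact binomial probability of k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability of k events at average rate lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Rare-events regime: n large, p small, lambda = n * p = 4
n, p = 1000, 0.004
lam = n * p
for k in range(8):
    b = binomial_pmf(k, n, p)
    q = poisson_pmf(k, lam)
    print(f"k={k}: binomial={b:.5f}, poisson={q:.5f}")
```

The two columns agree to about three decimal places, which is why the Poisson distribution is a convenient stand-in when computing binomial coefficients for very large $n$ is awkward.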


See All Probability Symbols and Notations

Parameters


* $\lambda$: the average rate (mean number of events), with $\lambda > 0$

The Poisson distribution models the number of events occurring in a fixed interval of time or space, assuming events happen independently and at a constant average rate $\lambda$.

It describes counts $0, 1, 2, \ldots$, with probabilities determined by how large or small $\lambda$ is.

The single parameter $\lambda$ controls both the mean and the variance of the distribution.

Probability Mass Function (PMF) and Support (Range)


The probability mass function (PMF) of a Poisson distribution is given by:

$$P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots$$


Counting Rare Events: The Poisson distribution models the number of events occurring in a fixed interval of time or space when events occur independently at a constant average rate.

Support (Range of the Random Variable):
* The random variable $X$ can take values $0, 1, 2, 3, \ldots$ (all non-negative integers).
* $X = k$ means exactly $k$ events occur in the interval.
* The support is thus a countably infinite set.

Logic Behind the Formula:
* $\lambda^k$: the rate parameter raised to the power of the number of events
* $e^{-\lambda}$: the exponential factor that normalizes the probabilities so they sum to 1
* $k!$: accounts for the number of ways $k$ events can be ordered
* The total probability sums to 1:

$$\sum_{k=0}^{\infty} P(X = k) = \sum_{k=0}^{\infty} \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{-\lambda} \cdot e^{\lambda} = 1$$


This uses the Taylor series expansion: $e^{\lambda} = \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}$
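The normalization can be confirmed numerically. A small Python sketch (the choice $\lambda = 3$ is arbitrary) truncates the infinite sum far into the tail, where the remaining terms are negligible:

```python
import math

def poisson_pmf(k, lam):
    # PMF: lambda^k * e^(-lambda) / k!
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 3.0
# The support is infinite, but terms with k >> lambda are vanishingly
# small, so a partial sum over k = 0..50 is essentially the full sum.
total = sum(poisson_pmf(k, lam) for k in range(51))
print(total)  # ≈ 1.0
```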

Poisson Distribution

Rare events over time interval (rate λ)

Explanation

The Poisson distribution models the number of events occurring in a fixed interval when events happen at a constant average rate. The probability mass function is $P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}$, where $\lambda$ is the average rate of events. Both the expected value and the variance equal $\lambda$: $E[X] = \lambda$ and $\text{Var}(X) = \lambda$. Common applications include customer arrivals per hour, website hits per minute, radioactive decay events, and other scenarios where rare events occur independently at a constant average rate.


Cumulative Distribution Function (CDF)


The cumulative distribution function (CDF) of a Poisson distribution is given by:

$$F_X(k) = P(X \leq k) = \sum_{i=0}^{k} \frac{\lambda^i e^{-\lambda}}{i!}$$


Where:
$\lambda$ = average rate of occurrence (expected number of events, $\lambda > 0$)
$k$ = number of events observed (where $k = 0, 1, 2, 3, \ldots$)
$e$ = Euler's number (approximately 2.71828)

Intuition Behind the Formula


Definition: The CDF gives the probability of observing $k$ or fewer events in a fixed interval of time or space.

Summation of Probabilities:
We sum the individual probabilities from 0 events up to $k$ events:

$$P(X \leq k) = P(X=0) + P(X=1) + P(X=2) + \cdots + P(X=k)$$


Alternative Formulation via Incomplete Gamma Function:
The CDF can be expressed using the regularized incomplete gamma function:

$$F_X(k) = \frac{\Gamma(k+1, \lambda)}{k!} = Q(k+1, \lambda)$$


This relationship is often used in statistical software for efficient computation, especially for large values of $k$.

Complementary Probability:
For "more than $k$ events":

$$P(X > k) = 1 - F_X(k)$$


Infinite Support: Unlike finite discrete distributions, the Poisson distribution has infinite support ($k$ can be arbitrarily large), though probabilities decrease rapidly for $k \gg \lambda$.
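The CDF is straightforward to compute as a finite sum of PMF terms; a minimal sketch (the value $\lambda = 2.5$ is chosen only for illustration):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k): partial sum of the Poisson PMF from 0 to k."""
    return sum(lam**i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

lam = 2.5
print(poisson_cdf(0, lam))      # F(0) = e^{-lambda} ≈ 0.0821
print(poisson_cdf(4, lam))      # P(X <= 4)
print(1 - poisson_cdf(4, lam))  # complementary probability P(X > 4)
```

Direct summation is fine for small $k$; as noted above, library implementations typically switch to the incomplete gamma function for large $k$.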

Poisson Distribution CDF

CDF for rare events over time

CDF Explanation

The Poisson CDF is $F(k) = P(X \leq k) = \sum_{i=0}^{k} \frac{\lambda^i e^{-\lambda}}{i!}$ for $k \geq 0$. This gives the probability of observing $k$ or fewer events in the fixed interval. The CDF starts at $F(0) = e^{-\lambda}$, the probability of observing zero events. As $k$ increases, the CDF approaches 1, with the rate of convergence depending on $\lambda$. For larger values of $\lambda$, the CDF increases more gradually across a wider range of $k$ values, while smaller $\lambda$ values lead to faster convergence near $k = 0$.

Expected Value (Mean)


As explained in the general case for calculating expected value, the expected value of a discrete random variable is computed as a weighted sum where each possible value is multiplied by its probability:

$$E[X] = \sum_{x} x \cdot P(X = x)$$


For the Poisson distribution, we apply this general formula to the specific probability mass function of this distribution.

Formula


$$E[X] = \lambda$$


Where:
$\lambda$ = average rate of occurrence (expected number of events per interval)

Derivation and Intuition


Starting from the general definition and substituting the PMF $P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}$ for $k = 0, 1, 2, \ldots$:

$$E[X] = \sum_{k=0}^{\infty} k \cdot \frac{\lambda^k e^{-\lambda}}{k!}$$

The $k = 0$ term vanishes, so the sum starts from $k = 1$:

$$E[X] = \sum_{k=1}^{\infty} k \cdot \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k=1}^{\infty} \frac{k \, \lambda^k}{k!} = e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{(k-1)!}$$

Substituting $j = k - 1$:

$$E[X] = e^{-\lambda} \, \lambda \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} = e^{-\lambda} \, \lambda \cdot e^{\lambda} = \lambda$$


The result $E[X] = \lambda$ has a particularly elegant interpretation: the parameter $\lambda$ is both the rate of occurrence and the expected value. The Poisson distribution is parameterized directly by its mean.

Example


Consider phone calls arriving at a call center at an average rate of $\lambda = 12$ calls per hour:

$$E[X] = 12$$


The expected number of calls in one hour is exactly 12, which is the defining parameter of the distribution. This self-referential property makes the Poisson distribution especially natural for modeling count data.
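One way to make this concrete is to simulate hourly counts and compare the sample mean to $\lambda$. The sketch below uses Knuth's multiplication method for Poisson sampling, which is one of several standard approaches and works well for moderate rates like $\lambda = 12$; the seed and sample size are arbitrary choices.

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: multiply uniform draws until the running product
    # drops below e^{-lam}; the number of multiplications minus one
    # is a Poisson(lam) sample.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(42)
samples = [poisson_sample(12, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # should land close to lambda = 12
```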

Variance and Standard Deviation


The variance of a discrete random variable measures how spread out the values are around the expected value. It is computed as:

$$\mathrm{Var}(X) = \mathbb{E}[(X - \mu)^2] = \sum_{x} (x - \mu)^2 \, P(X = x)$$


Or using the shortcut formula:

$$\mathrm{Var}(X) = \mathbb{E}[X^2] - \mu^2$$


For the Poisson distribution, we apply this formula to derive the variance.

Formula


$$\mathrm{Var}(X) = \lambda$$


Where:
$\lambda$ = average rate of occurrence (expected number of events per interval)

Derivation and Intuition


Starting with the shortcut formula, we need to calculate $\mathbb{E}[X^2]$.

We know from the expected value section that $\mu = \lambda$.

Using the PMF $P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}$:

$$\mathbb{E}[X^2] = \sum_{k=0}^{\infty} k^2 \cdot \frac{\lambda^k e^{-\lambda}}{k!}$$


Splitting $k^2 = k(k-1) + k$ and applying the same exponential-series manipulation as in the expected value derivation gives $\mathbb{E}[X(X-1)] = \lambda^2$ and $\mathbb{E}[X] = \lambda$, so:

$$\mathbb{E}[X^2] = \lambda^2 + \lambda$$


Applying the shortcut formula:

$$\mathrm{Var}(X) = (\lambda^2 + \lambda) - \lambda^2 = \lambda$$


The result $\mathrm{Var}(X) = \lambda$ reveals a remarkable property: the Poisson distribution's variance equals its mean. This makes the Poisson distribution unique among common distributions. The single parameter $\lambda$ completely determines both the center and the spread of the distribution.

This property provides a practical test: if observed count data has variance approximately equal to its mean, the Poisson distribution may be an appropriate model.
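The intermediate identity $\mathbb{E}[X^2] = \lambda^2 + \lambda$ can be checked numerically by summing against the PMF; here $\lambda = 5$ is an arbitrary choice and the infinite sum is truncated deep in the tail:

```python
import math

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 5.0
# E[X^2] computed directly from the PMF; terms beyond k = 100
# are negligibly small for lambda = 5
ex2 = sum(k**2 * poisson_pmf(k, lam) for k in range(100))
print(ex2)            # ≈ lam^2 + lam = 30
print(ex2 - lam**2)   # variance ≈ lam = 5
```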

Standard Deviation


$$\sigma = \sqrt{\lambda}$$


Example


Consider phone calls arriving at a rate of $\lambda = 16$ calls per hour:

$$\mathrm{Var}(X) = 16$$

$$\sigma = \sqrt{16} = 4$$


The variance equals the expected value (16), and the standard deviation of 4 indicates that in most hours, the number of calls will fall within roughly 12 to 20 calls (within one standard deviation of the mean).
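The "most hours" claim can be quantified by summing the PMF over the one-standard-deviation band; a quick Python check:

```python
import math

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 16
# Probability that the hourly count lands within one standard deviation
# of the mean: 16 - 4 = 12 through 16 + 4 = 20 calls, inclusive
p_within = sum(poisson_pmf(k, lam) for k in range(12, 21))
print(round(p_within, 3))
```

The result is roughly on the order of the familiar two-thirds-plus share that one-standard-deviation intervals capture for bell-shaped distributions.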

Applications and Examples


Practical Example

Suppose a call center receives an average of $\lambda = 4$ calls per hour. The probability of receiving exactly $k = 6$ calls in a given hour is:

$$P(X = 6) = \frac{4^6 e^{-4}}{6!} = \frac{4096 \cdot 0.0183}{720} \approx 0.104$$

This means there's about a 10.4% chance of receiving exactly 6 calls in an hour.
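The arithmetic above is easy to reproduce:

```python
import math

lam, k = 4, 6
# P(X = 6) = 4^6 * e^{-4} / 6!
p = lam**k * math.exp(-lam) / math.factorial(k)
print(round(p, 4))  # ≈ 0.1042, i.e. about a 10.4% chance
```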

Note: The Poisson distribution is often used as an approximation to the binomial distribution when $n$ is large and $p$ is small, with $\lambda = np$.