Conditional Probability






Probability When a Condition Is Known


In many situations, probabilities do not stay fixed. Once new information becomes available, how we assess a situation can change. Knowing that something has already happened often reshapes how we view what can happen next.

Conditional probability captures this shift in perspective. It reflects how uncertainty is evaluated after a condition is known, when attention is restricted to a smaller set of possibilities. This idea appears naturally whenever information arrives, observations are made, or situations unfold step by step.

The rest of the page explains how this change of viewpoint works, how it is expressed formally, and how it connects to other central ideas in probability.



Conditioning as Restricting the Situation


When a condition is known, we no longer reason over all possible outcomes. The information tells us that only certain situations remain relevant, and everything outside that condition is discarded.

From this point on, probabilities are evaluated within the condition. We are not changing the event itself — we are changing the context in which it is viewed. The situation is the same, but the frame of reference is smaller.

This way of thinking explains why probabilities can change once information is known. Conditioning is not an extra rule added on top of probability; it is a shift in perspective caused by restricting attention to a specific part of what was originally possible.
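To make this restriction concrete, here is a minimal Python sketch using a fair six-sided die (an illustrative choice, not taken from the text above): the condition "the roll is even" shrinks the sample space, and an event is then counted only inside that smaller set.

```python
# Minimal sketch of conditioning as restriction, using a fair die.
# Event A: the roll is at least 4. Condition B: the roll is even.
sample_space = [1, 2, 3, 4, 5, 6]

restricted = [o for o in sample_space if o % 2 == 0]  # keep only outcomes in B
favorable = [o for o in restricted if o >= 4]         # the part of A inside B

# Probability of A evaluated within the restricted frame of reference.
print(len(favorable) / len(restricted))  # 0.666..., i.e. 2/3
```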

Formal Meaning of Conditional Probability


Conditional probability describes how likely an event is once we know that another event has occurred. It represents a reassessment of uncertainty after information is taken into account.

The key idea is that the condition is treated as given. All reasoning is carried out under the assumption that this condition is true, and probabilities are evaluated relative to that restricted situation rather than the original one.

This verbal description captures the essence of conditional probability before any symbols or formulas are introduced.

Useful Notation


Before introducing the formula, we fix the symbols used to describe conditional probability:

  • A, B — events
  • P(A), P(B) — unconditional probabilities
  • P(A \mid B) — probability of A when B is known to have occurred
  • A \cap B — the event that both A and B occur

This notation allows us to express conditioning precisely and compactly in the next section.

Conditional Probability Formula


The idea of conditioning becomes precise through a simple normalization rule. When we restrict attention to situations where B has occurred, probabilities must be rescaled so that they sum to one within that restricted context.

This leads to the formula:

P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}

The numerator represents the part of A that is compatible with the condition B.
The denominator accounts for the fact that we are now working only inside B. Note that the formula is defined only when P(B) > 0, since conditioning on an event of probability zero leaves nothing to normalize by.

This formula does not introduce a new probability law — it expresses how probabilities behave once the situation has been restricted by known information.
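As a minimal sketch of this normalization rule, the snippet below reuses the die example from earlier (A = "roll at least 4", B = "roll is even") with exact fractions, so the rescaling is visible without rounding; the helper name conditional_probability is our own, purely illustrative.

```python
from fractions import Fraction

def conditional_probability(p_a_and_b, p_b):
    """P(A | B) = P(A and B) / P(B); only defined when P(B) > 0."""
    if p_b == 0:
        raise ValueError("cannot condition on an event with P(B) = 0")
    return p_a_and_b / p_b

# Die example: A = "roll >= 4", B = "roll is even".
# A and B together: outcomes {4, 6}; B alone: outcomes {2, 4, 6}.
p_a_and_b = Fraction(2, 6)
p_b = Fraction(3, 6)

print(conditional_probability(p_a_and_b, p_b))  # 2/3, matching the restricted count
```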

Visual Representations


Conditional probability becomes clearer when viewed geometrically or sequentially.

Venn diagram view:
The condition B restricts attention to a smaller region of the sample space. The probability of A is then evaluated only within that region, as the proportion of the overlap A \cap B relative to B itself.

Tree diagram view:
In a probability tree, conditioning corresponds to moving along a branch where a condition has already occurred. Probabilities along later branches are evaluated relative to that branch, not the entire tree.

These visual perspectives reinforce the idea that conditioning is a change of viewpoint, not a change in the underlying events.
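As a rough illustration of the tree view, the sketch below walks one branch of a made-up two-stage experiment: an urn with 3 red and 2 blue balls, drawn twice without replacement. The probability on a second-stage branch is stated relative to the branch it follows, and multiplying along a path gives that path's probability.

```python
from fractions import Fraction

# First-stage branch: the first draw is red.
p_first_red = Fraction(3, 5)

# Second-stage branch, read relative to the first: 2 red balls remain among 4.
p_second_red_given_first_red = Fraction(2, 4)

# Multiplying along the path gives the probability of the whole path.
p_both_red = p_first_red * p_second_red_given_first_red

print(p_second_red_given_first_red)  # 1/2, evaluated within the branch
print(p_both_red)                    # 3/10, probability of "red then red"
```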

Examples


1. Information Changes the Probability
Suppose A is "a randomly chosen person has a university degree" and B is "the person is over 40."
The probability of A evaluated after knowing B may differ from the overall probability of A, because the condition changes the relevant group.

2. No Change Under Conditioning
If A is "tomorrow is sunny" and B is "a fair coin lands heads today," knowing B has no effect on how we evaluate A. In this case, conditioning does not change the probability, illustrating a link to independence.

3. Sequential Situations
Consider drawing two cards from a deck without replacement. Let A be "the second card is an ace" and B be "the first card is an ace."
Knowing whether B occurred changes how we evaluate A, because the situation after the first draw is different from the original one: given B, only 3 aces remain among the 51 remaining cards, so P(A \mid B) = 3/51, whereas the unconditional probability is 4/52. A simulation sketch below confirms the value.

These examples show how conditional probability reflects the impact of information on how uncertainty is assessed.
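To check Example 3 numerically, a short Monte Carlo sketch can estimate P(A \mid B) by simulation: generate many two-card draws, keep only the trials where B occurred, and measure how often A happens among them. The trial count and seed are arbitrary choices.

```python
import random

deck = ["ace"] * 4 + ["other"] * 48
random.seed(0)

trials_with_b = 0        # trials where the first card is an ace (B)
trials_with_a_and_b = 0  # trials where both cards are aces (A and B)

for _ in range(200_000):
    first, second = random.sample(deck, 2)  # two draws without replacement
    if first == "ace":
        trials_with_b += 1
        if second == "ace":
            trials_with_a_and_b += 1

# Relative frequency of A within the trials where B occurred.
print(trials_with_a_and_b / trials_with_b)  # close to 3/51 ≈ 0.0588
```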

Conditional Probability vs Independence


Conditioning usually changes probabilities, because new information restricts the situation we are considering. Once a condition is known, the frame of reference shifts, and probabilities are re-evaluated within that restricted context.

Independence is the special case where this shift does not occur. If events are independent, then knowing that one event happened provides no information about the other. In that case, conditioning leaves probabilities unchanged.

This contrast is crucial:
  • Conditional probability describes how probabilities *update* when information is known.
  • Independence describes when such an update is unnecessary.

Understanding this distinction prevents a common mistake — assuming probabilities should change just because a condition is mentioned, or assuming independence without justification.
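To see the contrast concretely, the sketch below uses a standard textbook pairing (two fair dice, with A = "the first die shows 1" and B = "the sum is 7") where conditioning happens to leave the probability unchanged.

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

in_b = [o for o in outcomes if sum(o) == 7]  # condition: the sum is 7
in_a_and_b = [o for o in in_b if o[0] == 1]  # first die shows 1, within B

p_a = Fraction(sum(1 for o in outcomes if o[0] == 1), len(outcomes))
p_a_given_b = Fraction(len(in_a_and_b), len(in_b))

print(p_a, p_a_given_b)  # both 1/6: knowing B tells us nothing about A
```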

Common Patterns Where Conditioning Appears


Conditional probability shows up naturally in many recurring situations.

One common pattern is "given that…" reasoning, where information is stated explicitly and probabilities must be evaluated under that condition. Another is filtering, where attention is restricted to cases that meet a certain criterion before any assessment is made.

Conditioning also appears in sequential processes, where earlier outcomes affect how later ones are viewed, and in classification problems, where probabilities are evaluated within specific groups or categories.

Recognizing these patterns helps identify when conditional probability is required, even if the word "given" is not explicitly used.
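As an illustration of the filtering pattern, the sketch below restricts a small, entirely made-up dataset to the records meeting a condition and then evaluates a proportion inside that subset, echoing Example 1 above.

```python
# Hypothetical records, invented purely for demonstration.
people = [
    {"age": 52, "degree": True},
    {"age": 34, "degree": False},
    {"age": 47, "degree": True},
    {"age": 29, "degree": True},
    {"age": 61, "degree": False},
]

over_40 = [p for p in people if p["age"] > 40]     # filter: condition B
with_degree = [p for p in over_40 if p["degree"]]  # event A inside B

# The proportion within the filtered group plays the role of P(A | B).
print(len(with_degree) / len(over_40))  # 2/3 for this toy data
```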

Common Mistakes


Conditional probability is often misapplied, even in simple situations.

A frequent mistake is forgetting to restrict the situation properly and continuing to reason as if all outcomes were still possible. Another common error is dividing by the wrong probability, which leads to incorrect normalization.

Confusing P(A \mid B) with P(B \mid A) is especially widespread and can completely reverse the meaning of a statement. It is also common to assume independence implicitly, treating conditioning as irrelevant without justification.

Being explicit about what is known and what space is being considered helps avoid these errors.
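The reversal mistake is easy to expose with concrete numbers. In a standard 52-card deck, take A = "the card is an ace" and B = "the card is a spade"; the two conditional probabilities differ substantially.

```python
from fractions import Fraction

p_a_and_b = Fraction(1, 52)  # the ace of spades
p_a = Fraction(4, 52)        # four aces
p_b = Fraction(13, 52)       # thirteen spades

print(p_a_and_b / p_b)  # P(A | B) = 1/13: an ace, among the spades
print(p_a_and_b / p_a)  # P(B | A) = 1/4:  a spade, among the aces
```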

Why Conditional Probability Matters


Conditional probability is the mechanism by which probability responds to information. It models learning, observation, and the updating of beliefs as new facts become known.

This idea lies at the heart of inference, decision-making, and prediction. It underpins statistical reasoning, risk assessment, and data analysis, where conclusions must be drawn in the presence of partial information.

Without conditional probability, probability theory would be unable to describe how uncertainty evolves when knowledge changes.

Connections to Other Probability Concepts


Conditional probability connects directly to many central ideas in probability.

  • Events provide the objects being conditioned on.
  • Independence describes when conditioning has no effect.
  • Total probability combines conditional probabilities across cases.
  • The chain rule builds joint probabilities from conditional ones.
  • Bayes' theorem inverts conditional probabilities to update beliefs.
  • Random variables and distributions extend conditioning to numerical outcomes.

Understanding conditional probability clarifies how these concepts fit together into a single coherent framework.
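For reference, the following standard identities show how the chain rule, total probability, and Bayes' theorem all build on the conditional probability formula, under the usual assumptions: P(A) > 0, P(B) > 0, and B_1, …, B_n forming a partition of the sample space.

```latex
% Chain rule: a joint probability built from a conditional one
P(A \cap B) = P(A \mid B)\,P(B)

% Law of total probability, for a partition B_1, \dots, B_n
P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)

% Bayes' theorem: inverting the direction of conditioning
P(B \mid A) = \frac{P(A \mid B)\,P(B)}{P(A)}
```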