Probability Cheat Sheet
The core ideas of probability, distilled into a single scannable reference for review or quick lookup.
Quick Reference
Sample Space and Events
The sample space is the set of all possible outcomes of a random experiment, while an event is any subset of the sample space. Defining these precisely is the first step in any probability analysis, since probabilities are assigned to events rather than to individual outcomes.
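As a minimal sketch, a fair six-sided die makes these definitions concrete: the sample space has six equally likely outcomes, an event is a subset, and its probability is the fraction of outcomes it contains (the set and function names here are illustrative, not standard API):

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}

# An event is any subset of the sample space, e.g. "the roll is even".
even = {outcome for outcome in sample_space if outcome % 2 == 0}

# With equally likely outcomes, P(event) = |event| / |sample space|.
def prob(event, space=sample_space):
    return Fraction(len(event & space), len(space))

p_even = prob(even)  # Fraction(1, 2)
```

Using exact `Fraction` arithmetic avoids floating-point rounding in small finite examples.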
Conditional Probability
Conditional probability measures the likelihood of an event occurring given that another event has already occurred. It is defined as $P(A|B) = \frac{P(A \cap B)}{P(B)}$, provided $P(B) > 0$. This concept is essential for updating beliefs when new information becomes available.
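The definition can be checked by direct counting in a finite sample space. This sketch conditions the event "the roll is even" on the event "the roll is at least 4" (the variable names are illustrative):

```python
from fractions import Fraction

space = {1, 2, 3, 4, 5, 6}
A = {o for o in space if o % 2 == 0}   # even roll
B = {o for o in space if o >= 4}       # roll is at least 4

def p(event):
    return Fraction(len(event), len(space))

# P(A|B) = P(A ∩ B) / P(B), defined only when P(B) > 0.
p_a_given_b = p(A & B) / p(B)  # A ∩ B = {4, 6}, so 2 of B's 3 outcomes
```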
Bayes' Theorem
Bayes' theorem provides a formula for reversing conditional probabilities: $P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$. It allows us to update a prior belief about event $A$ after observing evidence $B$, forming the foundation of Bayesian inference and decision-making under uncertainty.
Law of Large Numbers
The Law of Large Numbers states that as the number of independent, identically distributed trials increases, the sample average converges to the expected value. This theorem explains why casinos are profitable in the long run and why polling works despite individual unpredictability.
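A quick simulation makes the convergence visible: the average of many fair-die rolls approaches the expected value 3.5, while a small sample can land far from it (the seed and sample sizes are arbitrary choices for reproducibility):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Sample mean of n rolls of a fair six-sided die; E[X] = 3.5.
def sample_mean(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

small = sample_mean(10)       # may be far from 3.5
large = sample_mean(100_000)  # very close to 3.5
```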
Central Limit Theorem
The Central Limit Theorem states that the sum or average of a large number of independent random variables, regardless of their original distribution, tends toward a normal (Gaussian) distribution. This result justifies the widespread use of normal-distribution-based methods in statistics.
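The theorem can be sketched empirically: means of uniform(0, 1) draws, whose parent distribution is flat rather than bell-shaped, cluster around 0.5 with standard deviation close to the CLT prediction $\sqrt{1/(12n)}$. The sample sizes below are arbitrary:

```python
import random
import statistics

random.seed(1)

# Each observation is the mean of 50 uniform(0, 1) draws. The CLT says
# these means are approximately normal with mean 1/2 and std sqrt(1/(12*50)).
means = [statistics.fmean(random.random() for _ in range(50))
         for _ in range(2000)]

center = statistics.fmean(means)   # close to 0.5
spread = statistics.stdev(means)   # close to sqrt(1/600) ≈ 0.0408
```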
Random Variables and Distributions
A random variable is a function that assigns a numerical value to each outcome in a sample space. Its probability distribution describes the likelihood of each possible value. Distributions can be discrete (e.g., binomial, Poisson) or continuous (e.g., normal, exponential).
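A discrete distribution can be written down directly as a probability mass function. This sketch implements the binomial PMF from its closed form (the function name is illustrative):

```python
from math import comb

# PMF of a Binomial(n, p) random variable:
# P(X = k) = C(n, k) * p^k * (1 - p)^(n - k).
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# A valid distribution's probabilities sum to 1 over all possible values.
total = sum(binom_pmf(k, 10, 0.3) for k in range(11))
```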
Expected Value and Variance
The expected value (mean) of a random variable is the long-run average of its outcomes, weighted by their probabilities. Variance measures how spread out values are around the mean. Together, they summarize a distribution's center and dispersion.
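Both summaries are weighted sums over the distribution. For a fair die, the exact values are $E[X] = 7/2$ and $\mathrm{Var}(X) = 35/12$:

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum of x * P(X = x).
mean = sum(x * p for x, p in pmf.items())

# Var(X) = sum of (x - E[X])^2 * P(X = x).
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())
```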
Independence and Dependence
Two events are independent if the occurrence of one does not affect the probability of the other, meaning $P(A \cap B) = P(A) \cdot P(B)$. When events are dependent, their joint probability requires conditioning. Recognizing independence is critical for simplifying calculations and building correct models.
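The product test can be checked by counting. In a die roll, "even" is independent of "at most 4" but not of "at least 4" (the event choices are illustrative):

```python
from fractions import Fraction

space = {1, 2, 3, 4, 5, 6}

def p(event):
    return Fraction(len(event), len(space))

A = {2, 4, 6}     # even
B = {1, 2, 3, 4}  # at most 4
C = {4, 5, 6}     # at least 4

# Independence test: P(A ∩ B) == P(A) * P(B).
indep_ab = p(A & B) == p(A) * p(B)  # 1/3 == 1/2 * 2/3 → independent
indep_ac = p(A & C) == p(A) * p(C)  # 1/3 != 1/2 * 1/2 → dependent
```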
Combinatorics in Probability
Combinatorics provides counting techniques such as permutations, combinations, and the multiplication principle that are essential for computing probabilities in finite sample spaces. These methods determine how many ways events can occur, which directly feeds into probability calculations.
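Python's standard library exposes these counts directly. A sketch computing an unordered count (a poker probability) and an ordered count (seatings):

```python
from math import comb, perm
from fractions import Fraction

# Combinations: number of 5-card hands from a 52-card deck (order ignored).
hands = comb(52, 5)  # 2,598,960

# P(exactly two aces) = C(4, 2) * C(48, 3) / C(52, 5),
# by the multiplication principle applied to the two independent choices.
p_two_aces = Fraction(comb(4, 2) * comb(48, 3), hands)

# Permutations: ordered ways to seat 3 of 8 people in a row.
seatings = perm(8, 3)  # 8 * 7 * 6 = 336
```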
Bayesian vs. Frequentist Interpretation
The frequentist interpretation defines probability as the long-run relative frequency of an event in repeated experiments. The Bayesian interpretation treats probability as a degree of belief, updated with evidence via Bayes' theorem. These paradigms lead to different statistical methodologies.