Problem Statement
What is the expected number of coin tosses needed to get two consecutive heads for an unfair coin? Generalize this for 3 heads, and then for N heads.
Solution for Two Consecutive Heads (2H)
Approach: Push Recursion
We solve this using conditional expectation by conditioning on the first toss.
Let E(2H) be the expected number of tosses to get two consecutive heads.
Step 1: Set Up the Recursive Equation
We condition on what happens in the first toss:

E(2H) = (1-p)\big(1 + E(2H)\big) + p\big(1 + E(2H \mid 1H)\big)

Where:

- (1-p) is the probability of getting tails, which requires starting over: 1 + E(2H)
- p is the probability of getting heads, putting us in state “1H” (one head so far): 1 + E(2H \mid 1H)
Step 2: Analyze E(2H \mid 1H)
If we already have one head, we condition on the next toss:

E(2H \mid 1H) = p \cdot 1 + (1-p)\big(1 + E(2H)\big)

Where:

- With probability p, we get heads and complete the sequence (1 more toss)
- With probability (1-p), we get tails and must start over: 1 + E(2H)
Step 3: Solve the System of Equations

From Step 2:

E(2H \mid 1H) = p + (1-p)(1 + E(2H)) = 1 + (1-p) E(2H)

Substituting this into the equation from Step 1:

E(2H) = (1-p)(1 + E(2H)) + p\big(2 + (1-p) E(2H)\big)

Expanding and collecting the E(2H) terms:

E(2H) = 1 + p + (1 - p^2) E(2H)

p^2 E(2H) = 1 + p

E(2H) = \frac{1+p}{p^2}
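As a sanity check, the two equations from Steps 1 and 2 form a small linear system that can be solved mechanically. The sketch below (function names are hypothetical) rearranges both equations into matrix form and applies Cramer's rule with exact rational arithmetic, then compares against the closed form \frac{1+p}{p^2}:

```python
from fractions import Fraction

def expected_two_heads(p):
    """Expected tosses to see HH, by solving the Step 1 / Step 2 system.

    Unknowns: x = E(2H), y = E(2H | 1H).
        x = (1-p)(1 + x) + p(1 + y)   ->    p*x - p*y     = 1
        y = p*1 + (1-p)(1 + x)        ->   -(1-p)*x + y   = 1
    Solved with Cramer's rule on the 2x2 system A [x, y] = [1, 1].
    """
    a, b = p, -p          # first row of A
    c, d = -(1 - p), 1    # second row of A
    det = a * d - b * c   # = p - p(1-p) = p^2
    x = (1 * d - b * 1) / det   # E(2H)
    y = (a * 1 - 1 * c) / det   # E(2H | 1H)
    return x, y

p = Fraction(3, 10)
x, y = expected_two_heads(p)
assert x == (1 + p) / p**2       # matches the closed form (1+p)/p^2
assert y == 1 + (1 - p) * x      # consistent with Step 2
print(x, y)                      # 130/9 and 100/9
```

Using `Fraction` keeps the arithmetic exact, so the comparison with the closed form is an equality rather than a floating-point tolerance check.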
Numerical Example
For p = 0.3:

E(2H) = \frac{1 + 0.3}{0.3^2} = \frac{1.3}{0.09} \approx 14.44
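A quick Monte Carlo sketch (assuming p = 0.3 as above, with a fixed seed for reproducibility) gives an independent estimate that should land near 14.44:

```python
import random

def tosses_until_two_heads(p, rng):
    """Simulate one run: count tosses until two consecutive heads appear."""
    tosses, streak = 0, 0
    while streak < 2:
        tosses += 1
        streak = streak + 1 if rng.random() < p else 0
    return tosses

rng = random.Random(42)       # fixed seed for reproducibility
p, trials = 0.3, 200_000
estimate = sum(tosses_until_two_heads(p, rng) for _ in range(trials)) / trials
exact = (1 + p) / p**2        # = 1.3 / 0.09 ≈ 14.44
print(f"simulated: {estimate:.2f}, exact: {exact:.2f}")
```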
Solution for Three Consecutive Heads (3H)
Approach: Pull Recursion
For three heads, we use pull recursion. The key insight: to get 3 heads in a row, you must first get 2 heads in a row, then spend at least one more toss trying for the third.
Step 1: Set Up the Recursive Equation

E(3H) = E(2H) + p \cdot 1 + (1-p)\big(1 + E(3H)\big)

Explanation:

- E(2H): Expected tosses to get 2 heads in a row
- p \cdot 1: With probability p, the next toss is heads, completing 3H in one more toss
- (1-p)(1 + E(3H)): If we get tails after 2H, we need to start over
Step 2: Solve for E(3H)

Simplifying the right-hand side gives E(3H) = E(2H) + 1 + (1-p) E(3H), so:

p \cdot E(3H) = E(2H) + 1

Substituting E(2H) = \frac{1+p}{p^2}:

E(3H) = \frac{\frac{1+p}{p^2} + 1}{p} = \frac{1 + p + p^2}{p^3}
Numerical Example
For p = 0.3:

E(3H) = \frac{1 + 0.3 + 0.09}{0.3^3} = \frac{1.39}{0.027} \approx 51.48
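The closed forms for 2H and 3H can be checked against each other: the pull recursion rearranges to p \cdot E(3H) = E(2H) + 1, which a few lines of Python (helper names hypothetical) confirm numerically:

```python
def expected_heads_2(p):
    """E(2H) = (1 + p) / p^2, derived above."""
    return (1 + p) / p**2

def expected_heads_3(p):
    """E(3H) = (1 + p + p^2) / p^3, derived above."""
    return (1 + p + p**2) / p**3

p = 0.3
e2, e3 = expected_heads_2(p), expected_heads_3(p)
# The pull recursion rearranges to p * E(3H) = E(2H) + 1.
assert abs(p * e3 - (e2 + 1)) < 1e-9
print(round(e2, 2), round(e3, 2))  # 14.44 and 51.48
```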
Generalization for N Consecutive Heads
General Formula
For N consecutive heads, the expected number of tosses satisfies:

E(N) = \frac{E(N-1) + 1}{p}

This recursive relationship can be expanded to:

E(N) = \sum_{i=1}^{N} \frac{1}{p^i} = \frac{1 - p^N}{(1-p) \cdot p^N}
Derivation
Starting from the recursive relationship E(N) = \frac{E(N-1) + 1}{p}, we can expand:

E(N) = \frac{E(N-1)}{p} + \frac{1}{p} = \frac{E(N-2)}{p^2} + \frac{1}{p^2} + \frac{1}{p} = \cdots

Using the base case E(1) = \frac{1}{p} (expected tosses for one head), we can show by induction:

E(N) = \sum_{i=1}^{N} \frac{1}{p^i} = \frac{\sum_{i=0}^{N-1} p^i}{p^N}

Using the geometric series formula \sum_{i=0}^{N-1} p^i = \frac{1-p^N}{1-p} (for p \neq 1):

E(N) = \frac{1 - p^N}{(1-p) \cdot p^N}
Alternative Forms
The formula can be written in several equivalent forms:
- Recursive form: E(N) = \frac{E(N-1) + 1}{p}
- Sum form: E(N) = \frac{\sum_{i=0}^{N-1} p^i}{p^N}
- Closed form: E(N) = \frac{1 - p^N}{(1-p) \cdot p^N}
- Alternative closed form: E(N) = \frac{p^{-N} - 1}{1-p}
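The equivalence of the four forms is easy to verify numerically. The sketch below (function names are illustrative) evaluates each form across several values of N and p and checks that they agree:

```python
def recursive_form(N, p):
    # E(N) = (E(N-1) + 1) / p, with base case E(0) = 0
    e = 0.0
    for _ in range(N):
        e = (e + 1) / p
    return e

def sum_form(N, p):
    return sum(p**i for i in range(N)) / p**N

def closed_form(N, p):
    return (1 - p**N) / ((1 - p) * p**N)

def alt_closed_form(N, p):
    return (p**-N - 1) / (1 - p)

for p in (0.3, 0.5, 0.9):
    for N in (1, 2, 3, 5, 10):
        vals = [f(N, p) for f in (recursive_form, sum_form,
                                  closed_form, alt_closed_form)]
        # All four forms should agree up to floating-point error.
        assert max(vals) - min(vals) < 1e-6 * max(vals)

print(closed_form(2, 0.5), closed_form(3, 0.5))  # fair coin: 6.0 and 14.0
```

The fair-coin values E(2) = 6 and E(3) = 14 are a well-known special case and a handy spot check.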
Key Insights
- Push vs Pull Recursion:
  - Push recursion: Condition on what happens next
  - Pull recursion: Build up from smaller subproblems
- State-Based Thinking: Track the current state (0H, 1H, 2H, etc.) and transitions between states.
- Conditional Expectation: Use the Law of Total Expectation to break down complex expectations into simpler conditional expectations.
- Recursive Structure: Many expectation problems have recursive structure that can be exploited to find closed-form solutions.
- Geometric Series: The solution involves geometric series, which appear naturally in sequential probability problems.
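The state-based view above can itself be turned into code: the streak of heads is a Markov chain on states 0..N, and the expected number of tosses is the expected hitting time of state N from state 0. The sketch below (a minimal illustration, with hypothetical names and a simple fixed-point iteration rather than an exact linear solve) recovers the closed form:

```python
def expected_tosses_state_based(N, p, iterations=10_000):
    """Expected hitting time of state N in the run-of-heads Markov chain.

    States 0..N track the current streak of heads. From state i < N:
      heads (prob p)     -> state i + 1
      tails (prob 1 - p) -> state 0
    t[i] = expected remaining tosses from state i, with t[N] = 0.
    The hitting times satisfy t = 1 + P t on the transient states,
    solved here by fixed-point (value) iteration.
    """
    t = [0.0] * (N + 1)
    for _ in range(iterations):
        t = [1 + p * t[i + 1] + (1 - p) * t[0] for i in range(N)] + [0.0]
    return t[0]

p = 0.3
for N in (1, 2, 3):
    closed = (1 - p**N) / ((1 - p) * p**N)
    assert abs(expected_tosses_state_based(N, p) - closed) < 1e-6
```

This is the same system the push recursion solves by hand; writing it as a chain generalizes directly to other target patterns, where the transition structure is less uniform.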