Expected Consecutive Heads


Problem Statement

What is the expected number of coin tosses needed to get two consecutive heads for an unfair coin? Generalize this for 3 heads, and then for N heads.

Solution for Two Consecutive Heads (2H)

Approach: Push Recursion

We solve this using conditional expectation by conditioning on the first toss.

Let E(2H) be the expected number of tosses to get two consecutive heads.

Step 1: Set Up the Recursive Equation

We condition on what happens in the first toss:

E(2H) = (1-p)(1 + E(2H)) + p(1 + E(2H \mid 1H))

Where:

  • (1-p) is the probability of getting tails, which requires starting over: 1 + E(2H)

  • p is the probability of getting heads, putting us in state “1H” (one head so far): 1 + E(2H \mid 1H)

Step 2: Analyze E(2H \mid 1H)

If we already have one head, we condition on the next toss:

E(2H \mid 1H) = p \cdot 1 + (1-p)(1 + E(2H))

Where:

  • With probability p, we get heads and complete the sequence (1 more toss)

  • With probability (1-p), we get tails and must start over: 1 + E(2H)

Step 3: Solve the System of Equations

From Step 2:

E(2H \mid 1H) = p \cdot 1 + (1-p)(1 + E(2H))
E(2H \mid 1H) = p + (1-p) + (1-p)E(2H)
E(2H \mid 1H) = 1 + (1-p)E(2H)

Now substituting into the first equation:

E(2H) = (1-p)(1 + E(2H)) + p(1 + 1 + (1-p)E(2H))
E(2H) = (1-p) + (1-p)E(2H) + p(2 + (1-p)E(2H))
E(2H) = (1-p) + (1-p)E(2H) + 2p + p(1-p)E(2H)
E(2H) = 1-p + 2p + (1-p)E(2H) + p(1-p)E(2H)
E(2H) = 1 + p + E(2H)[(1-p) + p(1-p)]
E(2H) = 1 + p + E(2H)[1-p + p - p^2]
E(2H) = 1 + p + E(2H)[1 - p^2]
E(2H) - E(2H)[1 - p^2] = 1 + p
E(2H)[1 - (1 - p^2)] = 1 + p
E(2H) \cdot p^2 = 1 + p
E(2H) = \frac{1 + p}{p^2}

Numerical Example

For p = 0.3:

E(2H) = \frac{1 + 0.3}{0.3^2} = \frac{1.3}{0.09} \approx 14.44 \text{ tosses}
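The closed form can be sanity-checked with a quick Monte Carlo simulation (a minimal sketch; the function name, trial count, and seed are illustrative choices, not part of the derivation):

```python
import random

def sim_tosses_until_run(p, run_len, trials=100_000, seed=42):
    """Average number of tosses until `run_len` consecutive heads appear."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        tosses = streak = 0
        while streak < run_len:
            tosses += 1
            # heads extends the streak; tails resets it to zero
            streak = streak + 1 if rng.random() < p else 0
        total += tosses
    return total / trials

p = 0.3
print(sim_tosses_until_run(p, 2))   # close to (1 + p) / p**2 ≈ 14.44
print((1 + p) / p**2)
```

With 100,000 trials the estimate typically lands within a few tenths of the analytical value.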

Solution for Three Consecutive Heads (3H)

Approach: Pull Recursion

For three heads, we use pull recursion, building on the 2H result. The key insight: to get 3 consecutive heads, you must first get 2 heads in a row; the very next toss then either completes the run or forces a restart.

Step 1: Set Up the Recursive Equation

E(3H) = E(2H) + p + (1-p)(1 + E(3H))

Explanation:

  • E(2H): Expected tosses to get 2 heads in a row

  • p: Probability the next toss is heads (completing 3H)

  • (1-p)(1 + E(3H)): If we get tails after 2H, we need to start over

Step 2: Solve for E(3H)

E(3H) = E(2H) + p + (1-p) + (1-p)E(3H)
E(3H) = E(2H) + 1 + (1-p)E(3H)
E(3H) - (1-p)E(3H) = E(2H) + 1
E(3H) \cdot p = E(2H) + 1
E(3H) = \frac{E(2H) + 1}{p}

Substituting E(2H) = \frac{1+p}{p^2}:

E(3H) = \frac{\frac{1+p}{p^2} + 1}{p} = \frac{1 + p + p^2}{p^3}

Numerical Example

For p = 0.3:

E(3H) = \frac{1 + 0.3 + 0.09}{0.027} = \frac{1.39}{0.027} \approx 51.48 \text{ tosses}
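The recursive step and the closed form agree, which is easy to confirm numerically (a small sanity check, not part of the derivation):

```python
p = 0.3
e2 = (1 + p) / p**2   # E(2H) from the previous section
e3 = (e2 + 1) / p     # recursive step: E(3H) = (E(2H) + 1) / p

# matches the closed form (1 + p + p^2) / p^3
assert abs(e3 - (1 + p + p**2) / p**3) < 1e-9
print(round(e3, 2))   # 51.48
```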

Generalization for N Consecutive Heads

General Formula

For N consecutive heads, the expected number of tosses is:

E(N) = \frac{E(N-1) + 1}{p}

This recursive relationship can be expanded to:

E(N) = \frac{\sum_{i=0}^{N-1} p^i}{p^N} = \frac{1 - p^N}{(1-p) \cdot p^N} = \frac{p^{-N} - 1}{1-p}

Derivation

Starting from the recursive relationship:

E(N) = \frac{E(N-1) + 1}{p}

We can expand this:

E(N) = \frac{1}{p} \left( E(N-1) + 1 \right)
E(N) = \frac{1}{p}E(N-1) + \frac{1}{p}

Using the base case E(1) = \frac{1}{p} (expected tosses for one head), we can show by induction:

E(N) = \frac{\sum_{i=0}^{N-1} p^i}{p^N}

Using the geometric series formula \sum_{i=0}^{N-1} p^i = \frac{1-p^N}{1-p} (for p \neq 1):

E(N) = \frac{\frac{1-p^N}{1-p}}{p^N} = \frac{1-p^N}{(1-p)p^N} = \frac{p^{-N} - 1}{1-p}

Alternative Forms

The formula can be written in several equivalent forms:

  1. Recursive form: E(N) = \frac{E(N-1) + 1}{p}

  2. Sum form: E(N) = \frac{\sum_{i=0}^{N-1} p^i}{p^N}

  3. Closed form: E(N) = \frac{1 - p^N}{(1-p) \cdot p^N}

  4. Alternative closed form: E(N) = \frac{p^{-N} - 1}{1-p}
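The equivalent forms can be implemented and cross-checked against each other (a sketch; the function names are illustrative):

```python
def expected_closed(p, n):
    """Closed form: E(N) = (p^{-N} - 1) / (1 - p), for 0 < p < 1."""
    return (p**-n - 1) / (1 - p)

def expected_recursive(p, n):
    """Recursive form: E(N) = (E(N-1) + 1) / p, with base case E(0) = 0."""
    e = 0.0
    for _ in range(n):
        e = (e + 1) / p
    return e

def expected_sum(p, n):
    """Sum form: E(N) = (sum of p^i for i = 0..N-1) / p^N."""
    return sum(p**i for i in range(n)) / p**n

p = 0.3
for n in range(1, 5):
    assert abs(expected_closed(p, n) - expected_recursive(p, n)) < 1e-9
    assert abs(expected_closed(p, n) - expected_sum(p, n)) < 1e-9
print(expected_closed(p, 2), expected_closed(p, 3))  # ≈ 14.44, ≈ 51.48
```

Note that all three forms recover E(1) = 1/p, the familiar expectation for a single head.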

Key Insights

  1. Push vs Pull Recursion:
  • Push recursion: Condition on what happens next

  • Pull recursion: Build up from smaller subproblems

  2. State-Based Thinking: Track the current state (0H, 1H, 2H, etc.) and the transitions between states.

  3. Conditional Expectation: Use the Law of Total Expectation to break down complex expectations into simpler conditional expectations.

  4. Recursive Structure: Many expectation problems have a recursive structure that can be exploited to find closed-form solutions.

  5. Geometric Series: The solutions involve geometric series, which arise naturally in sequential probability problems.

A cleaner way to present all of this is as a Markov chain.

Start with the one-state case: to get a single head, we keep flipping until one appears. Let p be the probability of heads. There are infinitely many ways this can play out: heads on the first toss; one tail, then heads; two tails, then heads; and so on. Summing these cases gives a geometric series, and the expectation works out to 1/p.

Now consider two consecutive heads. Define the states:

State 0: No consecutive heads yet.
State 1: We just got one head (H).
State 2: We’ve reached our goal (HH).

First, write the expected value from the initial state. Every toss costs 1; heads moves us to State 1, while tails keeps us in State 0:

E(0) = 1 + p \cdot E(1) + (1-p) \cdot E(0)

Apply the same reasoning to State 1, where heads finishes and tails sends us back to the start:

E(1) = 1 + p \cdot 0 + (1-p) \cdot E(0)

Solving this system recovers E(0) = \frac{1+p}{p^2}, and extending the same pattern of state equations to N states reproduces the geometric-series formula above.
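The state-based view translates directly into code: each E(i) can be tracked as an affine function of E(0) and back-substituted from the goal state (a minimal sketch of the Markov-chain approach; the function name is illustrative):

```python
def expected_via_states(p, n):
    """Expected tosses to reach n consecutive heads, via the state equations
        E(i) = 1 + p * E(i+1) + (1-p) * E(0),   with E(n) = 0.
    Each E(i) is represented as a + b * E(0), working backward from E(n)."""
    a, b = 0.0, 0.0                      # E(n) = 0 + 0 * E(0)
    for _ in range(n):
        a, b = 1 + p * a, p * b + (1 - p)
    # After the loop, E(0) = a + b * E(0), so:
    return a / (1 - b)

p = 0.3
print(expected_via_states(p, 2))  # ≈ 14.44, matching (1 + p) / p**2
print(expected_via_states(p, 3))  # ≈ 51.48
```

This avoids solving a full linear system by exploiting the chain's simple structure: every state resets to State 0 on tails and advances on heads.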