Sum of Standard Normal Random Variables


Problem Statement

X and Y are standard normal random variables. What is E[X + Y]?

Solution

This problem tests the fundamental property of linearity of expectation, which applies to any random variables, not just normal ones.

Step 1: Identify the Distributions

We are given:

  • X \sim N(0, 1) (standard normal)

  • Y \sim N(0, 1) (standard normal)

Step 2: Apply Linearity of Expectation

The key property we use is linearity of expectation, which states that for any two random variables X and Y with finite means:

E[X + Y] = E[X] + E[Y]

This property holds regardless of whether X and Y are independent or not.
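As a quick numerical check of this step, the sketch below (using NumPy; the correlation construction with \rho = 0.8 is purely illustrative) builds a Y that is strongly correlated with X yet still marginally N(0, 1), and shows the sample mean of X + Y still matches the sum of the individual sample means:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Draw a standard normal X, then build a Y that is correlated with X
# (rho = 0.8) but still marginally N(0, 1).
rho = 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Linearity of expectation: E[X + Y] = E[X] + E[Y], despite the correlation.
print(np.mean(x + y))            # close to 0
print(np.mean(x) + np.mean(y))   # same value, up to floating-point error
```

Changing rho changes nothing about the two printed values agreeing, which is exactly the "no independence required" point.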

Step 3: Compute Individual Expectations

For standard normal random variables:

  • E[X] = 0 (by definition of N(0, 1))

  • E[Y] = 0 (by definition of N(0, 1))

Step 4: Calculate the Sum

E[X + Y] = E[X] + E[Y] = 0 + 0 = 0

Final Answer

E[X + Y] = 0
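The answer can also be verified symbolically. A minimal sketch with SymPy's `sympy.stats` module (variable names are illustrative):

```python
from sympy.stats import Normal, E

# Two standard normal random variables, X ~ N(0, 1) and Y ~ N(0, 1).
X = Normal('X', 0, 1)
Y = Normal('Y', 0, 1)

# Expectation of the sum, computed symbolically.
print(E(X + Y))  # prints 0
```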

Key Insights

  1. Linearity of Expectation: The expectation of a sum equals the sum of expectations, regardless of independence. This is a fundamental property that always holds.

  2. Standard Normal Mean: By definition, a standard normal distribution N(0, 1) has mean 0.

  3. No Independence Required: This result holds whether X and Y are independent or correlated. Linearity of expectation doesn’t require independence.

  4. Extension: For X \sim N(\mu_X, \sigma_X^2) and Y \sim N(\mu_Y, \sigma_Y^2), we have E[X + Y] = \mu_X + \mu_Y, regardless of their relationship.

  5. Variance Note: While E[X + Y] = E[X] + E[Y] always holds, \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) only holds when X and Y are uncorrelated (or independent).
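A short simulation illustrates the variance note above (NumPy, with an illustrative correlation \rho = 0.8): the mean of the sum stays additive, but the variance picks up the covariance term, \text{Var}(X + Y) = 1 + 1 + 2\rho = 3.6 rather than 2.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
rho = 0.8

# X ~ N(0, 1); Y is marginally N(0, 1) but has correlation rho with X.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# The mean is additive regardless of correlation...
print(np.mean(x + y))   # close to 0

# ...but Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 2 + 2*rho.
print(np.var(x + y))    # close to 3.6, not 2
```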

Applications

  • Portfolio Theory: Expected returns of portfolios are linear combinations of individual asset expected returns.

  • Signal Processing: Expected values of combined signals follow linearity.

  • Statistical Inference: Many estimators are linear combinations of observations.