
Cochran’s Theorem in Statistics: From Univariate to Multivariate Analysis

  Manoj · Monday, September 15, 2025

🔎 Introduction

Cochran’s theorem is a cornerstone in statistical theory. It explains when quadratic forms of normal variables follow a chi-square distribution and when they are independent. This provides the theoretical basis for ANOVA, regression analysis, chi-square tests, and multivariate methods like Hotelling’s \(T^2\) and MANOVA.


📘 Statement of Cochran’s Theorem (Univariate)

Let \(X = (X_1, X_2, \dots, X_n)'\), where \(X_1, X_2, \dots, X_n \sim N(0,1)\) independently. Consider the quadratic forms:

$$ Q_i = X' A_i X, \quad i=1,2,\dots,k $$ where each \(A_i\) is a symmetric matrix and:
  • \(\sum_{i=1}^k A_i = I_n\)
  • \(\mathrm{rank}(A_1) + \cdots + \mathrm{rank}(A_k) = n\)

(Under these two conditions each \(A_i\) is automatically idempotent, i.e. a projection matrix.)

Then:

  • \(Q_i \sim \chi^2_{r_i}\), where \(r_i = \mathrm{rank}(A_i)\)
  • The quadratic forms \(Q_1, Q_2, \dots, Q_k\) are independent
  • \(\sum_{i=1}^k Q_i = \sum_{j=1}^n X_j^2 \sim \chi^2_n\)
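The theorem can be checked numerically. A minimal sketch (with hypothetical matrices of my choosing): take \(A_1\) as the projection onto the mean direction and \(A_2 = I_n - A_1\), which satisfy both conditions, and verify by simulation that the resulting quadratic forms behave like independent chi-squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10

# A1 projects onto the direction of the ones vector; A2 = I - A1 is the
# residual projection. Both are symmetric idempotent, A1 + A2 = I_n,
# and rank(A1) + rank(A2) = 1 + (n - 1) = n.
ones = np.ones((n, 1))
A1 = ones @ ones.T / n
A2 = np.eye(n) - A1

r1 = np.linalg.matrix_rank(A1)   # 1
r2 = np.linalg.matrix_rank(A2)   # n - 1
assert r1 + r2 == n

# Simulate many standard-normal vectors; by Cochran's theorem,
# Q1 = X'A1X ~ chi2(1) and Q2 = X'A2X ~ chi2(n-1), independently.
X = rng.standard_normal((5000, n))
Q1 = np.einsum('ij,jk,ik->i', X, A1, X)   # row-wise quadratic forms
Q2 = np.einsum('ij,jk,ik->i', X, A2, X)

print(Q1.mean())                   # ≈ 1 (a chi-square mean equals its df)
print(Q2.mean())                   # ≈ n - 1
print(np.corrcoef(Q1, Q2)[0, 1])   # ≈ 0, consistent with independence
```

Note that \(Q_1 = n\bar{X}^2\) and \(Q_2 = (n-1)S^2\), so this is exactly the classical independence of the sample mean and sample variance.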

📊 Example: One-Way ANOVA (Univariate)

In one-way ANOVA with \(k\) groups and \(n\) total observations:

$$ SS_{Total} = SS_{Treatment} + SS_{Error} $$
  • \(SS_{Treatment}/\sigma^2 \sim \chi^2_{k-1}\) (under the null hypothesis of equal group means)
  • \(SS_{Error}/\sigma^2 \sim \chi^2_{n-k}\)
Figure 1: ANOVA decomposition by Cochran’s theorem. Treatment and Error sums of squares are independent chi-square components.

This independence allows the F-test:

$$ F = \frac{SS_{Treatment}/(k-1)}{SS_{Error}/(n-k)} \sim F_{k-1, n-k} $$
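The decomposition and F-statistic above can be computed directly. A short sketch using hypothetical simulated groups (group sizes and parameters are my choices, not from the post), cross-checked against SciPy's built-in one-way ANOVA:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Three toy groups with a common mean and variance, so H0 holds.
groups = [rng.normal(loc=5.0, scale=2.0, size=m) for m in (8, 10, 12)]
k = len(groups)
n = sum(len(g) for g in groups)
grand_mean = np.concatenate(groups).mean()

# Between-group and within-group sums of squares.
ss_treatment = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)

# F-ratio of the two independent chi-square components, scaled by their dfs.
F = (ss_treatment / (k - 1)) / (ss_error / (n - k))
p = stats.f.sf(F, k - 1, n - k)

# Cross-check against SciPy's one-way ANOVA.
F_scipy, p_scipy = stats.f_oneway(*groups)
assert np.isclose(F, F_scipy) and np.isclose(p, p_scipy)
```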

📘 Cochran’s Theorem in Multivariate Analysis

Now extend to the multivariate setting. Suppose we have \(n\) independent random vectors from a \(p\)-dimensional normal distribution:

$$ X_1, X_2, \dots, X_n \sim N_p(\mu, \Sigma) $$

With \(X\) denoting the \(n \times p\) data matrix and \(\bar{X}\) the \(p \times 1\) sample mean vector, the centered data matrix is:

$$ Z = X - \mathbf{1}\bar{X}' $$

where \(\mathbf{1}\) is the \(n \times 1\) vector of ones.

The sum of squares and cross-products (SSCP) matrix is:

$$ W = Z'Z $$

By Cochran’s theorem (multivariate version):

  • \(W \sim \text{Wishart}_p(n-1, \Sigma)\)
  • Decompositions of \(W\) (e.g., between-group and within-group) yield independent Wishart matrices
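One property of the Wishart distribution is easy to verify by simulation: \(E[W] = (n-1)\Sigma\). A minimal sketch, with a hypothetical \(\Sigma\) of my choosing:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
mu = np.zeros(p)

# Average the SSCP matrix over many replications: E[W] = (n - 1) * Sigma.
reps = 2000
W_sum = np.zeros((p, p))
for _ in range(reps):
    X = rng.multivariate_normal(mu, Sigma, size=n)   # n x p data matrix
    Z = X - X.mean(axis=0)                           # centered data: Z = X - 1 xbar'
    W_sum += Z.T @ Z                                 # SSCP matrix W = Z'Z
W_bar = W_sum / reps

print(np.round(W_bar / (n - 1), 2))   # should be close to Sigma
```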

📊 Example: MANOVA

In one-way MANOVA with \(g\) groups and \(n\) total observations:

$$ T = H + E $$
  • \(H\): Hypothesis (between-group) SSCP matrix
  • \(E\): Error (within-group) SSCP matrix

By Cochran’s theorem:

  • \(H \sim \text{Wishart}_p(g-1, \Sigma)\) (under the null hypothesis that all group mean vectors are equal)
  • \(E \sim \text{Wishart}_p(n-g, \Sigma)\)
  • \(H\) and \(E\) are independent
Figure 2: MANOVA decomposition. Total SSCP is partitioned into independent Hypothesis (H) and Error (E) Wishart components.
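The identity \(T = H + E\) holds exactly for any data set, which makes it easy to verify in code. A sketch with hypothetical simulated groups (sizes and dimension are my choices):

```python
import numpy as np

rng = np.random.default_rng(3)
p, sizes = 2, (15, 20, 25)     # hypothetical dimension and group sizes
g, n = len(sizes), sum(sizes)

# Simulate g groups with a common covariance (H0: equal mean vectors).
groups = [rng.multivariate_normal(np.zeros(p), np.eye(p), size=m) for m in sizes]
all_data = np.vstack(groups)
grand_mean = all_data.mean(axis=0)

# Within-group (E) and between-group (H) SSCP matrices.
E = sum((grp - grp.mean(axis=0)).T @ (grp - grp.mean(axis=0)) for grp in groups)
H = sum(m * np.outer(grp.mean(axis=0) - grand_mean, grp.mean(axis=0) - grand_mean)
        for m, grp in zip(sizes, groups))

# Total SSCP around the grand mean.
T = (all_data - grand_mean).T @ (all_data - grand_mean)

assert np.allclose(T, H + E)   # the MANOVA decomposition T = H + E
```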

This independence underlies multivariate test statistics such as:

  • Wilks’ Lambda
  • Pillai’s Trace
  • Hotelling–Lawley Trace
  • Roy’s Largest Root
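All four statistics are functions of the eigenvalues \(\lambda_i\) of \(E^{-1}H\). A minimal sketch, using small hypothetical \(H\) and \(E\) matrices (the values are illustrative, not from a real fit):

```python
import numpy as np

# Hypothetical H and E SSCP matrices (p = 2) from a MANOVA fit.
H = np.array([[8.0, 3.0], [3.0, 5.0]])
E = np.array([[20.0, 4.0], [4.0, 15.0]])

# Eigenvalues of E^{-1} H drive all four multivariate test statistics.
eigs = np.linalg.eigvals(np.linalg.solve(E, H)).real

wilks_lambda      = np.prod(1.0 / (1.0 + eigs))   # = det(E) / det(H + E)
pillai_trace      = np.sum(eigs / (1.0 + eigs))
hotelling_lawley  = np.sum(eigs)
roys_largest_root = eigs.max()

# Cross-check Wilks' Lambda against its determinant form.
assert np.isclose(wilks_lambda, np.linalg.det(E) / np.linalg.det(H + E))
```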

📝 Key Takeaways

  • In the univariate case, Cochran’s theorem partitions sums of squares into independent chi-square components.
  • In the multivariate case, it partitions SSCP matrices into independent Wishart components.
  • It provides the foundation for F-tests (ANOVA) and MANOVA statistics (Wilks, Pillai, Hotelling–Lawley, Roy).

Thank you for reading!
If you found this helpful, please share & subscribe for more statistics lectures. Drop your questions in the comments 👇
