
This episode explores the foundational concepts of linear regression as a tool for predictive inference and association analysis. It details the Best Linear Prediction (BLP) problem and its finite-sample counterpart, Ordinary Least Squares (OLS), covering their statistical properties, including analysis of variance, and highlighting the challenge of overfitting when the number of parameters is not small relative to the sample size. The episode further introduces sample splitting as a method for honestly evaluating predictive models and clarifies how partialling-out isolates the predictive effect of a specific regressor, as in the analysis of wage gaps. Finally, it discusses adaptive statistical inference and the behavior of OLS in high-dimensional settings where traditional assumptions may not hold.
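
To make two of these ideas concrete, here is a minimal sketch (not taken from the episode itself) of sample splitting for out-of-sample evaluation of an OLS fit, and of partialling-out (the Frisch-Waugh-Lovell decomposition) to recover the coefficient on a single regressor of interest. The data are synthetic, and the variable names (`female`, `log_wage`, `controls`) are hypothetical stand-ins for a wage-gap analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 1000, 5

# Synthetic data: log-wage depends on a binary regressor of interest
# ("female", hypothetical) and a handful of controls.
controls = rng.normal(size=(n, p))
female = rng.binomial(1, 0.5, size=n)
log_wage = -0.1 * female + controls @ rng.normal(size=p) + rng.normal(size=n)

X = np.column_stack([female, controls])

# 1) Sample splitting: fit OLS on one half of the data and measure
#    mean squared error on the held-out half, giving an honest estimate
#    of predictive performance that is not inflated by overfitting.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, log_wage, test_size=0.5, random_state=0
)
ols = LinearRegression().fit(X_tr, y_tr)
mse = np.mean((y_te - ols.predict(X_te)) ** 2)
print(f"out-of-sample MSE: {mse:.3f}")

# 2) Partialling-out (Frisch-Waugh-Lovell): residualize both the outcome
#    and the regressor of interest on the controls, then regress residual
#    on residual. The resulting slope equals the coefficient on `female`
#    in the full OLS regression.
res_y = log_wage - LinearRegression().fit(controls, log_wage).predict(controls)
res_d = female - LinearRegression().fit(controls, female).predict(controls)
coef_fwl = (res_d @ res_y) / (res_d @ res_d)
coef_full = LinearRegression().fit(X, log_wage).coef_[0]
print(f"partialled-out coefficient on female: {coef_fwl:.3f}")
print(f"full-regression coefficient on female: {coef_full:.3f}")
```

The two printed coefficients agree, which is the point of partialling-out: the predictive effect of one regressor can be read off a simple regression of residuals on residuals, a fact that becomes especially useful in the high-dimensional settings mentioned above.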