## Linear regression with random regressors, part 2

Previously I wrote about how, when linear regression is introduced and derived, it is almost always done assuming the covariates/regressors/independent variables are fixed quantities. As I wrote, in many studies such an assumption does not match reality, in that both the regressors and the outcome in the regression are realised values of random variables. I showed that the usual ordinary least squares (OLS) estimators are unbiased with random covariates, and that the usual standard error estimator, derived assuming fixed covariates, remains unbiased with random covariates. This gives us some understanding of the behaviour of these estimators in the random covariate setting.
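The unbiasedness claim is easy to check empirically. Below is a minimal simulation sketch (not from the original post; the model $Y = \beta_0 + \beta_1 X + \varepsilon$ and the parameter values are illustrative assumptions): the covariate is drawn afresh in each replication, so it is genuinely random, yet the average of the OLS estimates over many replications is very close to the true coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: true coefficients and sample sizes
beta0, beta1 = 1.0, 2.0
n, n_reps = 100, 5000

estimates = np.empty((n_reps, 2))
for r in range(n_reps):
    x = rng.normal(size=n)        # random covariate, redrawn each replication
    eps = rng.normal(size=n)      # error term, independent of x
    y = beta0 + beta1 * x + eps
    # OLS fit with an intercept column
    X = np.column_stack([np.ones(n), x])
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

# Averaging over replications approximates the expectation of the estimator;
# with unbiased OLS this should be close to (beta0, beta1) = (1.0, 2.0)
print(estimates.mean(axis=0))
```

The same experiment with a non-normal covariate distribution (e.g. uniform or binary) gives the same conclusion, since the unbiasedness argument conditions on the realised covariate values.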

Linear regression is one of the most commonly used statistical methods; it allows us to model how an outcome variable $Y$ depends on one or more predictors (sometimes called independent variables) $X_{1},X_{2},\ldots,X_{p}$. In particular, we model how the mean, or expectation, of the outcome $Y$ varies as a function of the predictors: