Linear regression with random regressors, part 2

Previously I wrote about how, when linear regression is introduced and derived, it is almost always done assuming the covariates/regressors/independent variables are fixed quantities. As I noted, in many studies this assumption does not match reality: both the regressors and the outcome in the regression are realised values of random variables. I showed that the usual ordinary least squares (OLS) estimators are unbiased with random covariates, and that the usual standard error estimator, derived assuming fixed covariates, remains unbiased with random covariates. This gives us some understanding of the behaviour of these estimators in the random covariate setting.
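A quick simulation can illustrate the unbiasedness claim in this excerpt. The sketch below is my own illustration, not code from the post: it uses a single normally distributed covariate and plain NumPy, redraws the regressor afresh on every replication (so X is genuinely random), and checks that the OLS slope estimates average out to the true value.

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 200, 2000
beta0, beta1 = 1.0, 2.0  # true intercept and slope (assumed for the illustration)

slopes = np.empty(reps)
for i in range(reps):
    # regressor drawn at random on each replication, not held fixed
    x = rng.normal(size=n)
    y = beta0 + beta1 * x + rng.normal(size=n)
    # np.polyfit returns coefficients highest degree first: [slope, intercept]
    slopes[i] = np.polyfit(x, y, 1)[0]

# the average estimated slope should be close to the true value 2.0
print(slopes.mean())
```

Averaging over both the random regressors and the random errors, the Monte Carlo mean of the slope estimates sits very close to 2.0, consistent with the unbiasedness result described above.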


Regression inference assuming predictors are fixed

Linear regression is one of the workhorses of statistical analysis, permitting us to model how the expectation of an outcome Y depends on one or more predictors (or covariates, regressors, independent variables) X. Previously I wrote about the assumptions required for validity of ordinary linear regression estimates and their inferential procedures (tests, confidence intervals) assuming (as we often do) that the residuals are normally distributed with constant variance.
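To make the fixed-predictor inference concrete, here is a minimal sketch (my own illustration, not code from the post) of the usual t-based confidence interval for an OLS slope, assuming normal errors with constant variance as the excerpt describes. The design, sample size, and parameter values are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = np.linspace(0.0, 1.0, n)  # predictor values treated as fixed
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)

# design matrix with an intercept column
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - 2)       # unbiased residual variance estimate
cov = sigma2_hat * np.linalg.inv(X.T @ X)  # usual fixed-X covariance estimate
se_slope = np.sqrt(cov[1, 1])

# t quantile for a 95% CI with n - 2 = 98 df (hardcoded to avoid a scipy dependency)
t_crit = 1.9845
ci = (beta_hat[1] - t_crit * se_slope, beta_hat[1] + t_crit * se_slope)
print(beta_hat[1], ci)
```

Everything here conditions on the observed x values: the standard error and the interval width come from the fixed-X sampling model, which is exactly the assumption the posts above go on to examine.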


Assumptions for linear regression

Linear regression is one of the most commonly used statistical methods; it allows us to model how an outcome variable Y depends on one or more predictors (sometimes called independent variables) X_{1}, X_{2}, ..., X_{p}. In particular, we model how the mean, or expectation, of the outcome Y varies as a function of the predictors:
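The excerpt breaks off at the colon, where the archive page has truncated the post. The model it is referring to is presumably the standard linear mean model:

```latex
E(Y \mid X_{1}, \ldots, X_{p}) = \beta_{0} + \beta_{1} X_{1} + \beta_{2} X_{2} + \cdots + \beta_{p} X_{p}
```

That is, the conditional expectation of Y is assumed to be a linear function of the predictors, with unknown coefficients \beta_{0}, \ldots, \beta_{p} to be estimated from the data.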
