### 15.14 REGRESSION

The `REGRESSION` procedure fits linear models to data by least-squares
estimation. The procedure is appropriate for data that satisfy the
assumptions typical of linear regression:

- The data set contains n observations of a dependent variable, say
Y_1,...,Y_n, and n observations of each of k explanatory variables.
Let X_11,...,X_1n denote the n observations of the first explanatory
variable; X_21,...,X_2n the n observations of the second; and so on,
with X_k1,...,X_kn denoting the n observations of the kth explanatory
variable.
- The dependent variable Y is related to the explanatory variables by
Y_i = b_0 + b_1 X_1i + ... + b_k X_ki + Z_i,
where b_0, b_1, ..., b_k are unknown coefficients and
Z_1,...,Z_n are independent, normally distributed noise terms with
mean zero and common variance. The noise, or error, terms are
unobserved. This relationship is called the linear model.
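The model above can be illustrated by simulating data from it. The following is a minimal sketch in Python with NumPy (not part of the `REGRESSION` procedure itself); the coefficient values, noise scale, and sample size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2                      # n observations, k explanatory variables
b = np.array([1.0, 2.0, -0.5])     # hypothetical coefficients b_0, b_1, b_2
X = rng.normal(size=(n, k))        # observations of the k explanatory variables
Z = rng.normal(scale=0.3, size=n)  # independent N(0, sigma^2) noise terms

# The linear model: Y_i = b_0 + b_1 X_1i + b_2 X_2i + Z_i
Y = b[0] + X @ b[1:] + Z
```

In practice only X and Y are observed; the coefficients b and the noise terms Z are unknown and must be estimated or inferred.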

The `REGRESSION` procedure estimates the coefficients
b_0,...,b_k and produces output relevant to inference for the
linear model.
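Least-squares estimation itself can be sketched as follows, again in Python with NumPy rather than via the procedure: an intercept column of ones is prepended to the explanatory variables, and the coefficients minimizing the sum of squared residuals are computed. The data-generating values here are hypothetical.

```python
import numpy as np

def least_squares(X, Y):
    """Estimate b_0,...,b_k by least squares; X is (n, k), Y is (n,)."""
    A = np.column_stack([np.ones(len(Y)), X])   # design matrix with intercept column
    bhat, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return bhat

# Illustrative data with true coefficients (1.0, 2.0, -0.5)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
Y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

bhat = least_squares(X, Y)   # estimates close to the true coefficients
```

With n = 200 observations and small noise variance, the estimates land near the true values; the procedure's inferential output (standard errors, tests) quantifies how near.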