**Linear Regression** is a form of [[Regression Analysis]] that models the output as a linear combination of the input values. That is, each input term is multiplied by a single coefficient.
$ Y = B_0 + B_1 \times a + B_2 \times b + \dots $
Where:
- `B_0` is the [[bias]] (intercept) term
- `B_1` is the coefficient for input `a`
- `B_2` is the coefficient for input `b`
- ... and additional inputs would each get their own coefficient
You "tune" the regression by adjusting the coefficients to minimize some **[[Model Fitting Norm]]**. Norms, in this context, are error measures such as **[[Mean Squared Error]]**.
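As a minimal sketch of what that tuning looks like, the snippet below fits the two-input model $Y = B_0 + B_1 \times a + B_2 \times b$ by gradient descent on mean squared error. The dataset, learning rate, and step count are made up for illustration; the data is generated from known coefficients so the fit should recover them.

```python
def predict(coeffs, x):
    """Evaluate Y = B0 + B1*a + B2*b for one input pair (a, b)."""
    b0, b1, b2 = coeffs
    a, b = x
    return b0 + b1 * a + b2 * b

def mse(coeffs, xs, ys):
    """Mean squared error of the model over the dataset."""
    return sum((predict(coeffs, x) - y) ** 2 for x, y in zip(xs, ys)) / len(ys)

def fit(xs, ys, lr=0.1, steps=5000):
    """Tune the coefficients by gradient descent to minimize MSE."""
    b0 = b1 = b2 = 0.0
    n = len(ys)
    for _ in range(steps):
        errs = [predict((b0, b1, b2), x) - y for x, y in zip(xs, ys)]
        # Gradient of MSE with respect to each coefficient.
        g0 = 2 * sum(errs) / n
        g1 = 2 * sum(e * x[0] for e, x in zip(errs, xs)) / n
        g2 = 2 * sum(e * x[1] for e, x in zip(errs, xs)) / n
        b0 -= lr * g0
        b1 -= lr * g1
        b2 -= lr * g2
    return b0, b1, b2

# Synthetic data drawn from Y = 1 + 2a - 3b, so fitting should
# approximately recover B0 = 1, B1 = 2, B2 = -3.
xs = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]
ys = [1 + 2 * a - 3 * b for a, b in xs]
coeffs = fit(xs, ys)
```

In practice you would use a closed-form least-squares solve or a library routine instead of hand-rolled gradient descent; the point here is just that "tuning" means nudging the coefficients downhill on the chosen norm.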
# Interpretability
One nice feature of linear regressions is the semantic meaning of the coefficients: holding the other inputs fixed, a positive coefficient means that input _contributes_ to increases in the output, and a negative coefficient means the input _detracts_ from it.
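A tiny illustration of reading coefficient signs, using made-up values `B1 = 2.0` and `B2 = -3.0`: bumping input `a` by one unit moves the prediction up by exactly `B1`, and bumping `b` moves it down by exactly `B2`.

```python
def predict(b0, b1, b2, a, b):
    """Evaluate Y = B0 + B1*a + B2*b."""
    return b0 + b1 * a + b2 * b

b0, b1, b2 = 1.0, 2.0, -3.0          # hypothetical fitted coefficients

base = predict(b0, b1, b2, a=1.0, b=1.0)
up_a = predict(b0, b1, b2, a=2.0, b=1.0)   # a increased by 1, b held fixed
up_b = predict(b0, b1, b2, a=1.0, b=2.0)   # b increased by 1, a held fixed

# up_a - base equals B1 (+2.0): the positive coefficient contributes.
# up_b - base equals B2 (-3.0): the negative coefficient detracts.
```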
# More
## Source
- Grad School
## Related
-