ORDINARY LEAST SQUARES

Ordinary Least Squares (commonly called Linear Regression) uses the method of least squares to estimate the "best fit" of a set of independent (X) variables against the dependent (Y) variable you wish to explain or predict. The primary product of regression analysis is a mathematical equation that can be used to predict values of the dependent variable, given the values of the independent variables. Also associated with regression are measures of precision and accuracy, commonly referred to as hypothesis testing. The method of Ordinary Least Squares rests on a number of statistical assumptions, such as the following (a short fitting sketch appears after the list):

(1) The model is linear in its parameters

(2) The residuals (errors) are homoscedastic (reflect constant variance)

(3) The residuals are not correlated with one another over time (no autocorrelation)

(4) The independent variables and the residuals from the model are uncorrelated

(5) The residuals are drawn from a normally distributed population
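As a concrete illustration, here is a minimal sketch of an OLS fit in Python with NumPy, solving the normal equations (X'X)b = X'y directly. The data are invented purely for demonstration; the residuals computed at the end are what the homoscedasticity, independence, and normality assumptions above are statements about.

```python
import numpy as np

# Invented data for illustration: 50 observations,
# two independent (X) variables plus an intercept.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(scale=0.5, size=n)

# Design matrix with a leading column of ones for the intercept.
X = np.column_stack([np.ones(n), x1, x2])

# Least squares estimate: solve the normal equations (X'X) b = X'y.
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values and residuals; the assumptions listed above
# are statements about these residuals.
fitted = X @ beta
residuals = y - fitted

print("estimated coefficients:", beta)
print("residual variance:", residuals.var(ddof=X.shape[1]))
```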

Problems such as multicollinearity (extreme correlation among the explanatory variables) cause difficulties in computing the least squares estimates. The presence of multicollinearity prevents the mathematical procedure from isolating and measuring the separate contribution of each independent variable to the dependent variable. One common diagnostic is the variance inflation factor (VIF), sketched below.
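As one possible diagnostic, the sketch below computes variance inflation factors by regressing each explanatory variable on the others: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from that auxiliary regression. The rule of thumb that a VIF well above 10 signals trouble, the helper name vif, and the invented data are all assumptions for illustration, not part of the original text.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (no intercept column).

    VIF_j = 1 / (1 - R_j^2), where R_j^2 is from regressing
    column j on the remaining columns plus an intercept.
    """
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        # OLS fit of column j on the other explanatory variables.
        b, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ b
        r2 = 1.0 - resid.var() / X[:, j].var()
        out[j] = 1.0 / (1.0 - r2)
    return out

# Invented data: x2 is nearly a copy of x1, so both of their VIFs are large.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.05, size=100)
x3 = rng.normal(size=100)
print(vif(np.column_stack([x1, x2, x3])))
```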

Included in the broad category of regression analysis are techniques like Poisson Regression, Two Stage Least Squares, Three Stage Least Squares, Logistic Regression, Probit Regression, Tobit Regression, Zellner Estimation, Almon Polynomial Distributed Lags, Stepwise Regression, Nonlinear Regression, Weighted Least Squares, and Factor Analysis.