Least squares

Least squares is a mathematical optimization technique which, when given a series of measured data, attempts to find a function which closely approximates the data (a "best fit"). It attempts to minimize the sum of the squares of the ordinate differences (called residuals) between points generated by the function and corresponding points in the data. A special case is least mean squares (LMS), in which measurements are processed one at a time and each triggers a gradient descent step on that single measurement's squared residual. LMS minimizes the expectation of the squared residual with the fewest operations per iteration, but it requires a large number of iterations to converge.
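As a rough sketch of the LMS idea (in Python with NumPy; the function name lms_fit, the step size mu and the data layout are illustrative assumptions, not a standard interface), each measurement triggers one gradient descent step on that measurement's squared residual:

import numpy as np

def lms_fit(X, y, mu=0.01, n_epochs=100):
    """Least mean squares: one gradient-descent step per measurement
    on that measurement's squared residual."""
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for x_i, y_i in zip(X, y):
            residual = y_i - x_i @ w   # error on this single measurement
            w += mu * residual * x_i   # descent step on residual**2 / 2
    return w

Each update costs only a few multiplications per parameter, which reflects the low per-iteration cost of LMS, but many passes over the data may be needed before w settles.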

An implicit requirement for the least squares method to work is that the errors in the measurements be randomly distributed. The Gauss-Markov theorem shows that the least squares estimators are unbiased (indeed, the best among linear unbiased estimators) without requiring the sample data to follow, for instance, a normal distribution. It is also important that the collected data be well chosen, so as to allow visibility into the variables to be solved for (for giving more weight to particular data, refer to weighted least squares).

The least squares technique is commonly used in curve fitting. Many other optimization problems can also be expressed in a least squares form, by either minimizing energy or maximizing entropy.


Formulation of the problem

Suppose that the data set consists of the points <math>(x_i, y_i)</math> with <math>i = 1, 2, \dots, n</math>. We want to find a function f such that <math>f(x_i)\approx y_i.</math>

To attain this goal, we suppose that the function f is of a particular form containing some parameters which need to be determined. For instance, suppose that it is quadratic, meaning that <math>f(x) = ax^2 + bx + c</math>, where a, b and c are not yet known. We now seek the values of a, b and c that minimize the sum of the squares of the residuals:

<math> S = \sum_{i=1}^n (y_i - f(x_i))^2. </math>

This explains the name least squares.
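To make the objective concrete, here is a minimal sketch in Python with NumPy (the data values are invented for illustration) that evaluates S for a candidate choice of a, b and c:

import numpy as np

def sum_of_squared_residuals(params, x, y):
    """S = sum_i (y_i - f(x_i))**2 for the quadratic f(x) = a*x**2 + b*x + c."""
    a, b, c = params
    residuals = y - (a * x**2 + b * x + c)
    return np.sum(residuals**2)

# Invented data; least squares seeks the (a, b, c) that minimize S.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.8, 9.1, 19.2])
print(sum_of_squared_residuals((2.0, 1.0, 1.0), x, y))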

Solving the least squares problem

In the above example, f is linear in the parameters a, b and c. The problem simplifies considerably in this case and essentially reduces to a system of linear equations. This is explained in the article on linear least squares.
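For the quadratic example, this amounts to building a matrix whose columns correspond to the parameters and calling a linear least-squares solver; a minimal sketch with NumPy (invented data) might look like:

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Row i is (x_i**2, x_i, 1); f is linear in the parameters (a, b, c).
A = np.column_stack([x**2, x, np.ones_like(x)])

# Equivalent to solving the normal equations A.T @ A @ p = A.T @ y.
(a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)
print(a, b, c)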

The problem is more difficult if f is not linear in the parameters to be determined. We then need to solve a general (unconstrained) optimization problem. Any algorithm for such problems, such as Newton's method or gradient descent, can be used. Another possibility is to apply an algorithm developed specifically for least squares problems, for instance the Gauss-Newton algorithm or the Levenberg-Marquardt algorithm.
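As an illustration of the nonlinear case, the sketch below runs a few plain Gauss-Newton iterations for a model that is nonlinear in its parameters, f(x) = a·exp(bx); the model, data and starting values are chosen purely for illustration, and a production implementation would add the damping or step control of Levenberg-Marquardt:

import numpy as np

def gauss_newton_exp_fit(x, y, a=1.0, b=1.0, n_iter=20):
    """Fit f(x) = a * exp(b * x) by plain Gauss-Newton iterations."""
    for _ in range(n_iter):
        r = y - a * np.exp(b * x)                 # residuals
        # Jacobian of the residuals with respect to (a, b).
        J = np.column_stack([-np.exp(b * x), -a * x * np.exp(b * x)])
        # Each step solves the linearized least-squares problem J @ delta = -r.
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        a, b = a + delta[0], b + delta[1]
    return a, b

x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(1.3 * x)             # invented noiseless data
print(gauss_newton_exp_fit(x, y))     # expected to approach (2.0, 1.3)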

Least squares and regression analysis

In regression analysis, one replaces the relation

<math>f(x_i)\approx y_i</math>

by

<math>f(x_i) = y_i + \varepsilon_i,</math>

where the noise term ε is a random variable with mean zero. Note that we are assuming that the <math>x</math> values are exact, and all the errors are in the <math>y</math> values. Again, we distinguish between linear regression, in which the function f is linear in the parameters to be determined (e.g., <math>f(x) = ax^2 + bx + c</math>), and nonlinear regression. As before, linear regression is much simpler than nonlinear regression. (It is tempting to think that the reason for the name linear regression is that the graph of the function f(x) = ax + b is a line. Fitting a curve <math>f(x) = ax^2 + bx + c</math>, estimating a, b and c by least squares, is an instance of linear regression because the vector of least-squares estimates of a, b and c is a linear transformation of the vector of observed values <math>y_i</math>.)

One frequently estimates the parameters (a, b and c in the above example) by least squares: one takes the values that minimize the sum S. The Gauss-Markov theorem states that the least squares estimates are optimal in a certain sense when f(x) = ax + b with a and b to be determined and the noise terms ε are independent and identically distributed (see that article for a more precise statement and for less restrictive conditions on the noise terms).
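For the straight-line case f(x) = ax + b, the minimizing values have the familiar closed form; a minimal sketch in Python with NumPy (invented data) is:

import numpy as np

def simple_linear_regression(x, y):
    """Least-squares estimates of the slope a and intercept b in f(x) = a*x + b."""
    x_mean, y_mean = x.mean(), y.mean()
    a = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean)**2)
    b = y_mean - a * x_mean
    return a, b

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
print(simple_linear_regression(x, y))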

