
Regression analysis - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/Regression_analysis)

Regression analysis
From Wikipedia, the free encyclopedia.

Regression analysis is any statistical method in which the mean of one or more random variables is predicted conditional
on other (measured) random variables. Particular forms include linear regression, logistic regression, and Poisson
regression; regression is also a central task in supervised learning. Regression analysis is the statistical view of curve
fitting: choosing the curve that best fits given data points.
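The curve-fitting view can be sketched in a few lines. This is a minimal illustration, not part of the original article; the data points and the choice of a quadratic are invented for the example.

```python
import numpy as np

# Sketch: regression as curve fitting. Given noisy data points,
# choose the polynomial that best fits them in the least-squares sense.
# The data and the degree are assumptions made for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 4.2, 8.8, 16.1])   # roughly y = x^2

# Fit a quadratic; np.polyfit returns coefficients, highest degree first.
coeffs = np.polyfit(x, y, deg=2)
print(coeffs)
```

The fitted leading coefficient comes out close to 1, reflecting the underlying y = x^2 trend in the invented data.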

Sometimes there are only two variables. One, called X, can be regarded as constant, i.e., non-random, because it can be
measured without substantial error and its values can even be chosen at will; for this reason it is called the independent
or controlled variable. The other, called Y, is a random variable known as the dependent variable, because its values
depend on X. In regression we are interested in the variation of Y with X.

Typical examples are the dependence of the blood pressure Y on the age X of a person, or the dependence of the weight Y
of certain animals on their daily ration of food X. This dependence is called the regression of Y on X.

See also: multivariate normal distribution, important publications in regression analysis.

Regression is usually posed as an optimization problem, since we are attempting to find a solution for which the error is at a
minimum. The most common error measure is least squares, which corresponds to a Gaussian likelihood of
generating the observed data given the (hidden) random variable. In a certain sense, least squares is an optimal estimator: see
the Gauss-Markov theorem.
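The least-squares objective can be made concrete with a small sketch. The data below is invented for illustration; the point is that minimizing the sum of squared errors ||Xw - y||^2 over a linear model has a direct numerical solution.

```python
import numpy as np

# Sketch: ordinary least squares for a straight line y = w0 + w1*x,
# minimizing the sum of squared errors ||Xw - y||^2.
# The data points are made up purely for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])

# lstsq solves the least-squares problem (equivalently, the
# normal equations (X^T X) w = X^T y).
w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
w0, w1 = w
print(w0, w1)
```

For this data the closed form gives an intercept of 1.04 and a slope of 1.99, which is what the solver returns.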

The optimization problem in regression is typically solved by algorithms such as the gradient descent algorithm, the
Gauss-Newton algorithm, and the Levenberg-Marquardt algorithm. Probabilistic algorithms such as RANSAC can be
used to find a good fit for a sample set, given a parametrized model of the curve function.
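As a sketch of the iterative approach, plain gradient descent on the least-squares objective looks like this. The data, step size, and iteration count are assumptions chosen so the illustration converges; real implementations use line search or the more specialized algorithms named above.

```python
import numpy as np

# Sketch: minimizing 0.5*||Xw - y||^2 by gradient descent.
# The gradient with respect to w is X^T (Xw - y).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])   # exactly y = 1 + 2x, for illustration
X = np.column_stack([np.ones_like(x), x])

w = np.zeros(2)
lr = 0.02          # step size (assumed; must be small enough to converge)
for _ in range(20000):
    grad = X.T @ (X @ w - y)
    w -= lr * grad
print(w)
```

Because the invented data lies exactly on y = 1 + 2x, the iterates converge to w = (1, 2).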

Regression can be expressed as a maximum likelihood method of estimating the parameters of a model. However, for
small amounts of data, this estimate can have high variance. Some practitioners use maximum a posteriori (MAP)
methods, which place a prior over the parameters and then choose the parameters that maximize the posterior. MAP
methods are related to Occam's Razor: there is a preference for simplicity among a family of regression models (curves)
just as there is a preference for simplicity among competing theories.
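One standard instance of the MAP idea is a Gaussian prior on the weights, which makes the MAP estimate equivalent to ridge regression. The toy data, the random seed, and the value of alpha below are assumptions for illustration.

```python
import numpy as np

# Sketch: MAP estimation with a zero-mean Gaussian prior on the weights
# is equivalent to ridge regression: minimize ||Xw - y||^2 + alpha*||w||^2,
# with closed form w = (X^T X + alpha*I)^-1 X^T y.
# alpha and the toy data are assumptions made for illustration.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)
X = np.column_stack([np.ones_like(x), x])

alpha = 1.0  # prior strength; larger alpha pulls the weights toward zero
w_map = np.linalg.solve(X.T @ X + alpha * np.eye(2), X.T @ y)
w_mle = np.linalg.lstsq(X, y, rcond=None)[0]

# The prior shrinks the estimate relative to maximum likelihood.
print(np.linalg.norm(w_map), np.linalg.norm(w_mle))
```

The shrinkage toward zero is the "preference for simplicity" mentioned above: the prior penalizes large coefficients just as Occam's Razor penalizes complex explanations.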

Contents
1 Example
2 See also
3 External links

Example
The simplest example of regression is the one-dimensional case: we are given a vector of x values and a vector of
corresponding y values, and we are attempting to find a function f such that f(xi) ≈ yi.

Let x be the vector of given x values and y the vector of given y values (the numeric data appears only as an image in the original page).

1 of 3 9/9/2005 12:13 PM

Let's assume that our solution is in the family of functions defined by a 3rd-degree Fourier expansion of the form:

f(x) = a0/2 + a1 cos(x) + b1 sin(x) + a2 cos(2x) + b2 sin(2x) + a3 cos(3x) + b3 sin(3x)

where the ai and bi are real numbers. This problem can be represented in matrix notation: each data point xi contributes a row

[1/2, cos(xi), sin(xi), cos(2xi), sin(2xi), cos(3xi), sin(3xi)]

of a design matrix X, and w = (a0, a1, b1, a2, b2, a3, b3)^T is the vector of unknown coefficients. Filling this form in with our given values yields a problem of the form Xw = y.

This problem can now be posed as an optimization problem: find the w that minimizes the sum of squared errors ||Xw − y||².

Solving this with least squares, the 3rd-degree Fourier function that fits the data best is given by:

f(x) = 4.25cos(x) − 6.13cos(2x) + 2.88cos(3x)
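The steps of this example can be sketched in code. Since the article's original data vectors are not reproduced in this copy, the sketch invents its data by sampling a function built from the reported coefficients and checks that least squares recovers them; the helper name `fourier_design` is mine, not the article's.

```python
import numpy as np

# Sketch of the example above: fit a 3rd-degree Fourier expansion
# f(x) = a0/2 + sum_{k=1..3} (a_k cos(kx) + b_k sin(kx))
# by least squares.
def fourier_design(x, degree=3):
    """Design matrix whose columns are the Fourier basis functions."""
    cols = [0.5 * np.ones_like(x)]          # a0/2 term
    for k in range(1, degree + 1):
        cols.append(np.cos(k * x))
        cols.append(np.sin(k * x))
    return np.column_stack(cols)            # shape (n, 2*degree + 1)

# Invented data: sample a function with the article's reported
# coefficients (a1=4.25, a2=-6.13, a3=2.88, all b_k = 0, a0 = 0)
# and check that least squares recovers them.
x = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
true_w = np.array([0.0, 4.25, 0.0, -6.13, 0.0, 2.88, 0.0])
X = fourier_design(x)
y = X @ true_w

# Solve Xw = y in the least-squares sense.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 2))
```

Because the invented data comes exactly from the model and the design matrix has full column rank, the recovered coefficients match the generating ones.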


See also
Artificial neural network
Data mining
Statistics


External links
Curve Expert (shareware) (http://www.ebicom.net/~dhyams/cftp.htm) fits functions to data (limited to one
dependent and one independent variable)
Online curve and surface fitting (http://zunzun.com)
TableCurve2D and TableCurve3D by Systat (http://www.systat.com) automates curve fitting
LMS applet (http://intrepid.mcs.kent.edu/~blewis/stat/lsq.html)
another choice (http://www.softintegration.com/chhtml/lang/lib/libch/numeric/CGI_Curvefit.html)
online curve-fitting textbook (http://curvefit.com/)

Retrieved from "http://en.wikipedia.org/wiki/Regression_analysis"

Categories: Statistics | Optimization

This page was last modified 11:20, 10 July 2005.


All text is available under the terms of the GNU Free Documentation License (see Copyrights for details).

