
CURVE FITTING

Part 5
Data is often given for discrete values along a
continuum.
It is often necessary to estimate the points
between the given discrete values.
It may also be necessary to simplify a
complicated relationship.

1
CURVE FITTING
Part 5
Describes techniques to fit curves (curve fitting) to
discrete data to obtain intermediate estimates.

There are two general approaches to curve fitting:
Data exhibit a significant degree of scatter. The strategy is
to derive a single curve that represents the general trend of
the data. Because any individual data point may be
incorrect, we make no effort to intersect every point.
(Least-Squares Regression)
Data is very precise. The strategy is to pass a curve or a
series of curves through each of the points.


2
Figure PT5.1
3
Non-computer methods
of curve fitting

Subjective viewpoint
of the person
sketching the curve

Significant errors can
be introduced

CURVE FITTING
Part 5
In engineering two types of applications are
encountered:
Trend analysis. Predicting values of the dependent
variable; may include extrapolation beyond the data
points or interpolation between them.
Hypothesis testing. Comparing existing
mathematical model with measured data.
Checking the adequacy of existing models.

4
Mathematical Background
Simple Statistics:
In the course of an engineering study, if several
measurements are made of a particular quantity,
additional insight can be gained by summarizing the
data in one or more well-chosen statistics that convey
as much information as possible about specific
characteristics of the data set.
These descriptive statistics are most often selected to
represent
The location of the centre of the distribution of the data,
The degree of spread of the data.
5
Arithmetic mean. The sum of the individual data
points ($y_i$) divided by the number of points ($n$):

$$\bar{y} = \frac{\sum y_i}{n}, \qquad i = 1, \ldots, n$$

Standard deviation. The most common measure of
spread for a sample:

$$S_y = \sqrt{\frac{S_t}{n-1}}, \qquad S_t = \sum (y_i - \bar{y})^2$$

$S_y$ = standard deviation about the mean
$S_t$ = total sum of the squares of the residuals between
the data points and the mean

6
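As an illustrative sketch (not part of the original slides), the mean and sample standard deviation defined above can be computed directly, with made-up sample data:

```python
def mean(y):
    # arithmetic mean: y-bar = (sum of y_i) / n
    return sum(y) / len(y)

def std_dev(y):
    # sample standard deviation: S_y = sqrt(S_t / (n - 1)),
    # where S_t = sum of (y_i - y-bar)^2 is the total sum of
    # squares of the residuals about the mean
    yb = mean(y)
    s_t = sum((yi - yb) ** 2 for yi in y)
    return (s_t / (len(y) - 1)) ** 0.5

data = [2, 4, 4, 4, 5, 5, 7, 9]   # hypothetical measurements
y_bar = mean(data)                # 5.0
s_y = std_dev(data)               # sqrt(32/7)
```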
Variance. Representation of spread by the square of
the standard deviation:

$$S_y^2 = \frac{\sum (y_i - \bar{y})^2}{n-1}$$

or, in computationally convenient form,

$$S_y^2 = \frac{\sum y_i^2 - \left(\sum y_i\right)^2 / n}{n-1}$$

Coefficient of variation. Quantifies the spread of the
data relative to its magnitude:

$$\mathrm{c.v.} = \frac{S_y}{\bar{y}} \times 100\%$$

It is the ratio of a measure of error ($S_y$) to an estimate of
the true value ($\bar{y}$).

7
The quantity $n-1$ is referred to as the degrees of freedom.
Example
PT5.1
8
Figure PT5.2

9


10
±1 SD: ~68% of the data
±2 SD: ~95% of the data
±3 SD: ~99.7% of the data
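As an illustrative sketch (not from the original slides), these coverage percentages can be checked empirically by sampling a standard normal distribution:

```python
import random

random.seed(42)  # fixed seed so the empirical check is reproducible
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# fraction of samples within ±1, ±2, ±3 standard deviations of the mean
within = {k: sum(abs(s) <= k for s in samples) / len(samples)
          for k in (1, 2, 3)}
```

The resulting fractions land close to 0.68, 0.95, and 0.997, up to sampling noise.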

Figure PT5.6
11
12
[Figure: raw data, polynomial fit, linear fit]
Least Squares Regression
Chapter 17
Linear Regression
Fitting a straight line to a set of paired
observations: $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.

$$y = a_0 + a_1 x + e$$

$a_1$: slope
$a_0$: intercept
$e$: error, or residual, between the model and
the observations
13
Least-Squares Fit of a Straight Line:

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}$$

$$a_0 = \bar{y} - a_1 \bar{x}$$
14
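As a minimal sketch (illustrative, not part of the original slides), these two formulas translate directly into code:

```python
def linear_fit(x, y):
    # slope:     a1 = (n * sum(x_i y_i) - sum(x_i) sum(y_i))
    #                 / (n * sum(x_i^2) - (sum(x_i))^2)
    # intercept: a0 = y-bar - a1 * x-bar
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n
    return a0, a1

# points lying exactly on y = 1 + 2x recover a0 = 1, a1 = 2
a0, a1 = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
```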
Fig 17.5
15
Fig 17.7a
16
Fig 17.7b
17
Fig 17.8
18
Linearization of Nonlinear
Relationships
Many non-linear relationships can be linearized
and then linear regression can be applied to
find the best fit coefficients
Exponential Function:
$y = a e^{bx}$ can be written as $\ln y = \ln a + bx$
Power Equation:
$y = a x^b$ can be written as $\log y = \log a + b \log x$
19
(Note: Remember to detransform the fitted parameters to get the original curve)
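A sketch of this idea for the exponential case (illustrative, with made-up data): regress $\ln y$ on $x$, then detransform the intercept to recover $a$:

```python
import math

def exp_fit(x, y):
    # fit y = a * exp(b*x) by linear regression of ln(y) on x;
    # the intercept is ln(a), so a is recovered by exponentiating
    ly = [math.log(yi) for yi in y]
    n = len(x)
    sx, sly = sum(x), sum(ly)
    sxy = sum(xi * li for xi, li in zip(x, ly))
    sxx = sum(xi * xi for xi in x)
    b = (n * sxy - sx * sly) / (n * sxx - sx * sx)
    ln_a = sly / n - b * sx / n
    return math.exp(ln_a), b   # detransformed a, and b

# data generated from y = 2 e^(0.5 x) recovers a = 2, b = 0.5
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2 * math.exp(0.5 * xi) for xi in xs]
a, b = exp_fit(xs, ys)
```

Note that this minimizes the squared error of the *transformed* data, which is not identical to minimizing the squared error of the original data.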
Fig 17.9
20
Example Logarithmic
21
Fig 17.10
22
Fig 17.11
23
Statistical Assumptions For Linear
Least Squares Regression
1. x has a fixed value; it is not random and is
measured without error
2. The y values are independent random
variables and all have the same variance
3. The y values for a given x must be normally
distributed

24
Polynomial Regression
Some engineering data is poorly represented
by a straight line. For these cases a curve is
better suited to fit the data. The least squares
method can readily be extended to fit the data
to higher-order polynomials (Sec. 17.2).
An alternative method is to transform
the data (e.g., taking the logarithm
of either the x or y data, as in the previous lecture).
If transformations are not possible, we can use
polynomial regression.
25
Least Squares Regression
Chapter 17
Polynomial Regression
Fitting a curved line to a set of paired observations:
$(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.

$$y = a_0 + a_1 x + a_2 x^2 + e$$

$a_0, a_1, a_2$: constants
$e$: error, or residual, between the model and the
observations

Sum of the squares of the residuals:

$$S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$$
26
Least Squares Regression
$$S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$$
27
Least Squares Regression
28
Least Squares Regression
29
Fig 17.12
30
Fig 17.15
31
Fig 17.13
32
Polynomial Regression
The least squares method can be extended to
fit a curve to the $m$th-degree polynomial:

$$y = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m + e$$

For this case the sum of the squares of the
residuals is:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right)^2$$

33
Taking the derivative of $S_r$ with respect to each
coefficient and setting it equal to zero:

$$\frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right) = 0$$

and similarly $\partial S_r / \partial a_1 = 0, \ldots, \partial S_r / \partial a_m = 0$.

These can be re-arranged to develop the
following set of normal equations.

34
Thus, determining the coefficients of a least-squares
polynomial of degree $m$ is equivalent to solving a
system of $m+1$ simultaneous linear equations, as before:

$$
\begin{aligned}
n\,a_0 + \left(\sum x_i\right) a_1 + \left(\sum x_i^2\right) a_2 + \cdots + \left(\sum x_i^m\right) a_m &= \sum y_i \\
\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 + \left(\sum x_i^3\right) a_2 + \cdots + \left(\sum x_i^{m+1}\right) a_m &= \sum x_i y_i \\
&\vdots \\
\left(\sum x_i^m\right) a_0 + \left(\sum x_i^{m+1}\right) a_1 + \cdots + \left(\sum x_i^{2m}\right) a_m &= \sum x_i^m y_i
\end{aligned}
$$

The goodness of fit is quantified by the coefficient of
determination:

$$r^2 = \frac{S_t - S_r}{S_t}$$

35
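As a sketch (illustrative, not part of the original slides), the normal equations above can be assembled from power sums and solved directly; a hand-coded Gaussian elimination is used here to keep the example self-contained:

```python
def poly_fit(x, y, m):
    # normal equations: A[k][j] = sum(x_i^(k+j)), b[k] = sum(x_i^k * y_i)
    n = m + 1
    A = [[sum(xi ** (k + j) for xi in x) for j in range(n)] for k in range(n)]
    b = [sum(xi ** k * yi for xi, yi in zip(x, y)) for k in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    a = [0.0] * n
    for k in range(n - 1, -1, -1):
        a[k] = (b[k] - sum(A[k][j] * a[j] for j in range(k + 1, n))) / A[k][k]
    return a   # coefficients a_0 ... a_m

def r_squared(x, y, a):
    # r^2 = (S_t - S_r) / S_t
    yb = sum(y) / len(y)
    s_t = sum((yi - yb) ** 2 for yi in y)
    pred = [sum(ak * xi ** k for k, ak in enumerate(a)) for xi in x]
    s_r = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    return (s_t - s_r) / s_t

# data lying exactly on y = 1 + 2x + 3x^2 recovers those coefficients
a = poly_fit([0, 1, 2, 3, 4], [1, 6, 17, 34, 57], 2)
```

In practice the normal-equations matrix becomes ill-conditioned as $m$ grows, so library routines based on orthogonal factorizations are preferred for high degrees.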
General Linear Least Squares
$$y = a_0 z_0 + a_1 z_1 + a_2 z_2 + \cdots + a_m z_m + e$$

where $z_0, z_1, \ldots, z_m$ are $m+1$ basis functions.

In matrix form: $\{Y\} = [Z]\{A\} + \{E\}$

$[Z]$ = matrix of the calculated values of the basis
functions at the measured values of the independent variable
$\{Y\}$ = observed values of the dependent variable
$\{A\}$ = unknown coefficients
$\{E\}$ = residuals

$$S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2$$

$S_r$ is minimized by taking its partial derivative with
respect to each of the coefficients and setting the
resulting equation equal to zero.

36
Multiple Linear Regression
There may exist a situation where y is a linear
function of two or more variables
37
$$y = a_0 + a_1 x_1 + a_2 x_2 + e$$

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_{1i} - a_2 x_{2i} \right)^2$$
Setting the partial derivatives of $S_r$ with respect to
each coefficient equal to zero:

$$\frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_{1i} - a_2 x_{2i} \right) = 0$$

$$\frac{\partial S_r}{\partial a_1} = -2 \sum_{i=1}^{n} x_{1i} \left( y_i - a_0 - a_1 x_{1i} - a_2 x_{2i} \right) = 0$$

$$\frac{\partial S_r}{\partial a_2} = -2 \sum_{i=1}^{n} x_{2i} \left( y_i - a_0 - a_1 x_{1i} - a_2 x_{2i} \right) = 0$$

These equations can be written in matrix form:

38
39
$$
\begin{bmatrix}
n & \sum x_{1i} & \sum x_{2i} \\
\sum x_{1i} & \sum x_{1i}^2 & \sum x_{1i} x_{2i} \\
\sum x_{2i} & \sum x_{1i} x_{2i} & \sum x_{2i}^2
\end{bmatrix}
\begin{Bmatrix} a_0 \\ a_1 \\ a_2 \end{Bmatrix}
=
\begin{Bmatrix} \sum y_i \\ \sum x_{1i} y_i \\ \sum x_{2i} y_i \end{Bmatrix}
$$

and, for $m$ independent variables,

$$
\begin{bmatrix}
n & \sum x_{1i} & \cdots & \sum x_{mi} \\
\sum x_{1i} & \sum x_{1i}^2 & \cdots & \sum x_{1i} x_{mi} \\
\vdots & \vdots & \ddots & \vdots \\
\sum x_{mi} & \sum x_{mi} x_{1i} & \cdots & \sum x_{mi}^2
\end{bmatrix}
\begin{Bmatrix} a_0 \\ a_1 \\ \vdots \\ a_m \end{Bmatrix}
=
\begin{Bmatrix} \sum y_i \\ \sum x_{1i} y_i \\ \vdots \\ \sum x_{mi} y_i \end{Bmatrix}
$$
Fig 17.14
40
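As a final sketch (illustrative, with made-up data), the two-variable case above reduces to a fixed 3×3 system, small enough to solve by Cramer's rule:

```python
def det3(M):
    # 3x3 determinant by cofactor expansion along the first row
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def multiple_linear_fit(x1, x2, y):
    # fit y = a0 + a1*x1 + a2*x2 by solving the 3x3 normal equations
    n = len(y)
    s1, s2 = sum(x1), sum(x2)
    s11 = sum(u * u for u in x1)
    s22 = sum(v * v for v in x2)
    s12 = sum(u * v for u, v in zip(x1, x2))
    A = [[n, s1, s2], [s1, s11, s12], [s2, s12, s22]]
    b = [sum(y),
         sum(u * w for u, w in zip(x1, y)),
         sum(v * w for v, w in zip(x2, y))]
    d = det3(A)
    coeffs = []
    for col in range(3):
        # Cramer's rule: replace column `col` of A with b
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        coeffs.append(det3(Ac) / d)
    return coeffs   # [a0, a1, a2]

# data generated from y = 5 + 4*x1 - 3*x2 recovers those coefficients
x1 = [0, 2, 2.5, 1, 4, 7]
x2 = [0, 1, 2, 3, 6, 2]
y = [5 + 4 * u - 3 * v for u, v in zip(x1, x2)]
a0, a1, a2 = multiple_linear_fit(x1, x2, y)
```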
