
Lecture #3 Identification

ECE 842
Ali Keyhani

Identification is the determination, on the basis of input and output data, of a process (model), within a specified class of processes (models), to which the process under test is equivalent.

[Block diagram: input u(.) and noise w(.) enter the Process, which produces output y(.).]

[Block diagram: the same u(.) and w(.) enter the Equivalent Model, which produces y(.).]

The equivalent model is a model that has captured the underlying mechanism that generated the original process.

Three-Step Modeling Procedure for process {y(.)} excited by white noise
1. Model Identification: the determination of the model orders n and m for the process.
2. Parameter Estimation: the determination of { â(k) } from { Y(j), j < k }.
3. Diagnostic Checking: checking the {w(.)} sequence to establish that the noise is indeed white.
Classical Approach to Identification and Modeling

Given Y(k): build the model over one portion of the past history, and test the model over another portion of the past history.

A 3-step procedure for modeling is:

1. Identify a candidate model.
2. Estimate the parameters of the identified model.
3. Diagnostic checking: is the noise white? If yes, done; if no, go back and identify another model.

Covariance: A partial indication of the degree to which one variable is related to another is given by the
covariance, which is the expectation of the product of the deviation of two random variables from their
means.
{Y(.)} random process Y(.)
{x(.)} random process x(.)

Cov{x(k), Y(k)} = E[(x(k) − ux)(Y(k) − uy)]
              = E[x(k)Y(k)] − ux uy
where
   ux = E[x(k)]
   uy = E[Y(k)]
Cross-Correlation Coefficient

   ρ = ( E[x(k)Y(k)] − ux uy ) / (σx σy)

Variance of x: σx² = E[(x(k) − ux)²]
Variance of Y: σy² = E[(Y(k) − uy)²]

Notes:
1) The correlation coefficient is a measure of the degree of linear dependence between x and Y.
2) If x and Y are independent, ρ = 0 (the converse is not true).
3) If Y is a linear function of x, ρ = ±1.
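The notes above can be checked numerically. The sketch below (the helper name corr_coef is an illustrative choice, not from the lecture) estimates ρ from samples and confirms that an exact linear function of x gives ρ = ±1:

```python
import random

def corr_coef(x, y):
    """Sample correlation coefficient rho between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Sample covariance: E[(x - ux)(y - uy)]
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    # Sample standard deviations
    sx = (sum((xi - mx) ** 2 for xi in x) / n) ** 0.5
    sy = (sum((yi - my) ** 2 for yi in y) / n) ** 0.5
    return cov / (sx * sy)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(1000)]
y_lin = [3.0 * xi - 2.0 for xi in x]   # Y is an exact linear function of x
print(corr_coef(x, y_lin))             # ≈ 1.0, as Note 3 predicts
```

With a negative slope (e.g. y = −2x) the same computation returns ρ ≈ −1, matching the ±1 in Note 3.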

Zero Mean Process: Given process {Y(.)}, if the mean of the process is estimated and then subtracted
from each Y(k), we obtain a zero mean process.

Ex. Given {Y1(k)}, k = 1, …, N, estimate the mean

   ûy1 = (1/N) Σ(k=1 to N) Y1(k)

and subtract it from each observation:

   Y(1) = Y1(1) − ûy1
   Y(2) = Y1(2) − ûy1
   ⋮
   Y(N) = Y1(N) − ûy1

Then {Y(k)}, k = 1, …, N, is a zero-mean process.
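The mean-subtraction step can be sketched in a few lines (the function name zero_mean is an illustrative choice, not from the notes):

```python
def zero_mean(y):
    """Subtract the sample mean from each observation, giving a zero-mean sequence."""
    mu = sum(y) / len(y)
    return [yk - mu for yk in y]

y1 = [2.0, 4.0, 6.0, 8.0]   # sample mean is 5.0
y = zero_mean(y1)
print(y)                    # [-3.0, -1.0, 1.0, 3.0]
print(sum(y))               # 0.0 — the new sequence has zero mean
```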

For zero-mean processes {x(.)} and {Y(.)}:

   Cov{x(k), Y(k)} = E[x(k)Y(k)] = γxy

γxy is called the cross covariance, or correlation, between the x and Y variables.

The basic cross-correlation (covariance) equations:

   E[x(k)Y(k+j)] = γxy(j)
   E[Y(k)x(k+j)] = γyx(j); by stationarity, γxy(j) = γyx(−j)

For j = 0:
   E[x(k)Y(k)] = γxy = γxy(0)

The sample estimates of the cross-correlation (covariance):

   γ̂xy(j) = (1/(N−j)) Σ(k=1 to N−j) x(k)Y(k+j),   j = 0, 1, …, m

   γ̂yx(j) = (1/(N−j)) Σ(k=1 to N−j) Y(k)x(k+j),   j = 0, 1, …, m
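A minimal sketch of the sample estimator γ̂xy(j), assuming zero-mean sequences stored as plain Python lists (the helper name cross_cov is hypothetical). When Y is x delayed by one step, the estimate peaks at lag j = 1:

```python
def cross_cov(x, y, j):
    """Sample estimate of gamma_xy(j) = E[x(k) Y(k+j)] for zero-mean sequences,
    using the 1/(N - j) normalization from the notes."""
    n = len(x)
    return sum(x[k] * y[k + j] for k in range(n - j)) / (n - j)

x = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
y = [0.0] + x[:-1]          # y is x delayed by one sample
print(cross_cov(x, y, 1))   # 1.0 — the one-step delay shows up at lag j = 1
```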

Sample cross-correlation coefficient of zero-mean {Y(.)} and {x(.)} processes:

   ρ̂xy(0) = Ê[x(k)Y(k)] / (σ̂x σ̂y) = γ̂xy(0) / (σ̂x σ̂y)

where

   γ̂xy(0) = Ê[x(k)Y(k)] = (1/N) Σ(k=1 to N) x(k)Y(k)

   γ̂x(0) = σ̂x² = (1/(N−1)) Σ(k=1 to N) (x(k))²

   γ̂y(0) = σ̂y² = (1/(N−1)) Σ(k=1 to N) (Y(k))²

j-period-apart cross correlation for zero-mean processes {x(.)} and {Y(.)}:

   γxy(j) = E[x(k)Y(k+j)]

Cross-correlation coeff.:

   ρxy(j) = γxy(j) / (σx σy)

Sample cross-correlation coeff.:

   ρ̂xy(j) = γ̂xy(j) / (σ̂x σ̂y)
Historically, the correlation function is defined without restriction on the mean and without normalization:

   Correlation function: R̂xy(j) = (1/(N−j)) Σ(k=1 to N−j) x(k)Y(k+j),   j = 0, 1, …, m

   R̂xy(j) = Ê[x(k)Y(k+j)]

The covariance function has the mean value removed:

   γ̂xy(j) = Ĉxy(j) = (1/(N−j)) Σ(k=1 to N−j) [x(k) − ux][Y(k+j) − uy],   j = 0, 1, …, m

In practice, the mean value has usually already been subtracted from the data.

The correlation coefficient function: ρxy(j) = γxy(j) / (σx σy), where σx = √γx(0) and σy = √γy(0).
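The distinction between the correlation function (mean left in) and the covariance function (mean removed) can be illustrated numerically; corr_fn and cov_fn are hypothetical names for the two estimators above:

```python
def corr_fn(x, y, j):
    """R_hat_xy(j): correlation function, means NOT removed."""
    n = len(x)
    return sum(x[k] * y[k + j] for k in range(n - j)) / (n - j)

def cov_fn(x, y, j):
    """C_hat_xy(j): covariance function, sample means removed first."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    return sum((x[k] - mx) * (y[k + j] - my) for k in range(n - j)) / (n - j)

x = [5.0, 5.0, 5.0, 5.0]   # constant signals: no fluctuation about the mean
y = [7.0, 7.0, 7.0, 7.0]
print(corr_fn(x, y, 0))    # 35.0 — dominated entirely by the means
print(cov_fn(x, y, 0))     # 0.0  — nothing left once the means are removed
```

This is why, in practice, the mean is subtracted before any correlation analysis: otherwise a large mean swamps the dependence structure of interest.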
j-period-apart auto-covariance for process {Y(.)} with zero mean, E[Y(k)] = 0:

   γ(j) = E[Y(k)Y(k+j)]

Autocorrelation coefficient:

   ρ(j) = γ(j) / γ0

Sample correlation coefficient:

   ρ̂(j) = Ê[Y(k)Y(k+j)] / γ̂0 = γ̂(j) / γ̂0

where

   γ̂yy(j) = Ê[Y(k)Y(k+j)] = (1/(N−j)) Σ(k=1 to N−j) Y(k)Y(k+j),   j = 0, 1, 2, …

   γ̂0 = (1/(N−1)) Σ(k=1 to N) (Y(k))²
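The sample autocorrelation coefficient can be sketched directly from the two estimators above, using the same normalizations as the notes (1/(N−j) for γ̂(j), 1/(N−1) for γ̂0); the function name autocorr is an illustrative choice:

```python
def autocorr(y, j):
    """Sample autocorrelation coefficient rho_hat(j) of a zero-mean sequence."""
    n = len(y)
    gamma_j = sum(y[k] * y[k + j] for k in range(n - j)) / (n - j)
    gamma_0 = sum(yk * yk for yk in y) / (n - 1)
    return gamma_j / gamma_0

# An alternating +1/-1 sequence: strongly anti-correlated at lag 1,
# strongly correlated at lag 2 (its period)
y = [1.0, -1.0] * 50
print(autocorr(y, 1))   # close to -1
print(autocorr(y, 2))   # close to +1
```

Note the value at lag 0 is only approximately 1 here, because the notes normalize γ̂(0) by 1/N but γ̂0 by 1/(N−1); the difference vanishes as N grows.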

Note that the j-period-apart autocorrelation is a partial indication of the degree to which the present depends on the past, i.e.:
1) The autocorrelation is a measure of the degree of linear dependence between the present and the past.
2) If Y(k) and Y(k+j) are independent, then ρyy(j) = 0.
3) RYY(0) = E[Y²(k)] = γ0, i.e., the mean-square value of the random process can always be obtained by setting j = 0.
4) RYY(j) = RYY(−j)
5) |RYY(j)| ≤ RYY(0): the largest value of the autocorrelation always occurs at j = 0.
