
Iterative Methods for Solving Linear Systems of Equations
(part of the course given for the 2nd grade at BGU, ME)

Iterative Methods
An iterative technique to solve Ax = b starts with an initial approximation x^{(0)} and generates a sequence \{x^{(k)}\}_{k \ge 0}.

First we convert the system Ax = b into an equivalent form

x = Tx + c

and generate the sequence of approximations by

x^{(k)} = T x^{(k-1)} + c, \quad k = 1, 2, 3, \dots

This procedure is similar to the fixed-point method.

The stopping criterion:

\frac{\| x^{(k)} - x^{(k-1)} \|}{\| x^{(k)} \|} < \varepsilon

Iterative Methods (Example)


E1: 10x_1 - x_2 + 2x_3 = 6
E2: -x_1 + 11x_2 - x_3 + 3x_4 = 25
E3: 2x_1 - x_2 + 10x_3 - x_4 = -11
E4: 3x_2 - x_3 + 8x_4 = 15

We rewrite the system in the x = Tx + c form:

x_1 = \frac{1}{10}x_2 - \frac{1}{5}x_3 + \frac{3}{5}

x_2 = \frac{1}{11}x_1 + \frac{1}{11}x_3 - \frac{3}{11}x_4 + \frac{25}{11}

x_3 = -\frac{1}{5}x_1 + \frac{1}{10}x_2 + \frac{1}{10}x_4 - \frac{11}{10}

x_4 = -\frac{3}{8}x_2 + \frac{1}{8}x_3 + \frac{15}{8}

Iterative Methods (Example), cont.


We start the iterations with x^{(0)} = (0, 0, 0, 0)^T:

x_1^{(1)} = \frac{1}{10}x_2^{(0)} - \frac{1}{5}x_3^{(0)} + \frac{3}{5} = 0.6000

x_2^{(1)} = \frac{1}{11}x_1^{(0)} + \frac{1}{11}x_3^{(0)} - \frac{3}{11}x_4^{(0)} + \frac{25}{11} = 2.2727

x_3^{(1)} = -\frac{1}{5}x_1^{(0)} + \frac{1}{10}x_2^{(0)} + \frac{1}{10}x_4^{(0)} - \frac{11}{10} = -1.1000

x_4^{(1)} = -\frac{3}{8}x_2^{(0)} + \frac{1}{8}x_3^{(0)} + \frac{15}{8} = 1.8750

Continuing the iterations, the results are given in the Table:
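As an illustrative sketch (not part of the slides), the loop below reproduces these Jacobi iterates for the example; they approach the exact solution x = (1, 2, -1, 1):

```python
# Jacobi iterations for the example system, written out component by component.
x1 = x2 = x3 = x4 = 0.0          # x(0) = (0, 0, 0, 0)
for k in range(1, 11):
    x1_new = x2/10 - x3/5 + 3/5
    x2_new = x1/11 + x3/11 - 3*x4/11 + 25/11
    x3_new = -x1/5 + x2/10 + x4/10 - 11/10
    x4_new = -3*x2/8 + x3/8 + 15/8
    x1, x2, x3, x4 = x1_new, x2_new, x3_new, x4_new
    print(f"k={k}: {x1:.4f} {x2:.4f} {x3:.4f} {x4:.4f}")
```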

The Jacobi Iterative Method


The method of the Example is called the Jacobi iterative method:

x_i^{(k)} = \frac{\sum_{j=1,\ j \ne i}^{n} \left( -a_{ij} x_j^{(k-1)} \right) + b_i}{a_{ii}}, \quad i = 1, 2, \dots, n

Algorithm: Jacobi Iterative Method
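A minimal sketch of the component-wise Jacobi algorithm, assuming numpy and a nonzero diagonal; the function name jacobi and the parameters tol and max_iter are illustrative:

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-6, max_iter=100):
    """Jacobi iteration: every x_i(k) uses only values from iteration k-1."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    x_old = np.asarray(x0, dtype=float)
    n = len(b)
    for k in range(1, max_iter + 1):
        x_new = np.empty(n)
        for i in range(n):
            # sum over j != i of a_ij * x_j(k-1)
            s = A[i, :] @ x_old - A[i, i] * x_old[i]
            x_new[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x_new - x_old) < tol * np.linalg.norm(x_new):
            return x_new, k
        x_old = x_new
    return x_old, max_iter

A = [[10, -1, 2, 0], [-1, 11, -1, 3], [2, -1, 10, -1], [0, 3, -1, 8]]
b = [6, 25, -11, 15]
x, k = jacobi(A, b, [0, 0, 0, 0])   # x -> approximately (1, 2, -1, 1)
```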

The Jacobi Method: x=Tx+c Form


We split A into its diagonal part D, its strictly lower triangular part -L, and its strictly upper triangular part -U:

A = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{pmatrix}, \qquad
D = \begin{pmatrix} a_{11} & 0 & \dots & 0 \\ 0 & a_{22} & \dots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & \dots & 0 & a_{nn} \end{pmatrix}

L = \begin{pmatrix} 0 & \dots & \dots & 0 \\ -a_{21} & 0 & & \vdots \\ \vdots & \ddots & \ddots & \vdots \\ -a_{n1} & \dots & -a_{n,n-1} & 0 \end{pmatrix}, \qquad
U = \begin{pmatrix} 0 & -a_{12} & \dots & -a_{1n} \\ \vdots & 0 & \ddots & -a_{2n} \\ \vdots & & \ddots & -a_{n-1,n} \\ 0 & \dots & \dots & 0 \end{pmatrix}

A = D - L - U

The Jacobi Method: x=Tx+c Form


(cont.)

With A = D - L - U, the equation Ax = b can be transformed into

(D - L - U)x = b

Dx = (L + U)x + b

x = D^{-1}(L + U)x + D^{-1}b

Finally,

T = D^{-1}(L + U), \qquad c = D^{-1}b
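A short sketch of the same splitting in matrix form, assuming numpy; jacobi_matrices is an illustrative name, not from the slides:

```python
import numpy as np

def jacobi_matrices(A, b):
    """Build T = D^{-1}(L + U) and c = D^{-1} b from the splitting A = D - L - U."""
    A = np.asarray(A, dtype=float)
    D = np.diag(np.diag(A))           # diagonal part of A
    L = -np.tril(A, k=-1)             # negative of the strictly lower part of A
    U = -np.triu(A, k=1)              # negative of the strictly upper part of A
    D_inv = np.diag(1.0 / np.diag(A))
    T = D_inv @ (L + U)
    c = D_inv @ np.asarray(b, dtype=float)
    return T, c

# One Jacobi step in matrix form: x_new = T @ x_old + c
```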

The Gauss-Seidel Iterative Method


The idea of GS is to compute x^{(k)} using the most recently calculated values. In our example:

x_1^{(k)} = \frac{1}{10}x_2^{(k-1)} - \frac{1}{5}x_3^{(k-1)} + \frac{3}{5}

x_2^{(k)} = \frac{1}{11}x_1^{(k)} + \frac{1}{11}x_3^{(k-1)} - \frac{3}{11}x_4^{(k-1)} + \frac{25}{11}

x_3^{(k)} = -\frac{1}{5}x_1^{(k)} + \frac{1}{10}x_2^{(k)} + \frac{1}{10}x_4^{(k-1)} - \frac{11}{10}

x_4^{(k)} = -\frac{3}{8}x_2^{(k)} + \frac{1}{8}x_3^{(k)} + \frac{15}{8}

Starting the iterations with x^{(0)} = (0, 0, 0, 0)^T, we obtain the results in the Table.

The Gauss-Seidel Iterative Method

x_i^{(k)} = \frac{-\sum_{j=1}^{i-1} a_{ij} x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij} x_j^{(k-1)} + b_i}{a_{ii}}, \quad i = 1, 2, \dots, n

Gauss-Seidel in x = Tx + c Form (the Fixed Point)

Ax = (D - L - U)x = b

(D - L)x = Ux + b

(D - L)x^{(k)} = Ux^{(k-1)} + b

Finally

x^{(k)} = (D - L)^{-1}U\, x^{(k-1)} + (D - L)^{-1}b, \qquad T = (D - L)^{-1}U, \quad c = (D - L)^{-1}b
Algorithm: Gauss-Seidel Iterative Method
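A minimal sketch of the component-wise Gauss-Seidel algorithm, assuming numpy; names are illustrative. The in-place update is what makes each x_i use the most recently computed values:

```python
import numpy as np

def gauss_seidel(A, b, x0, tol=1e-6, max_iter=100):
    """Gauss-Seidel: x_i(k) uses x_1..x_{i-1} from iteration k, the rest from k-1."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float).copy()
    n = len(b)
    for k in range(1, max_iter + 1):
        x_prev = x.copy()
        for i in range(n):
            s_new = A[i, :i] @ x[:i]        # already-updated components (iteration k)
            s_old = A[i, i+1:] @ x[i+1:]    # not-yet-updated components (iteration k-1)
            x[i] = (b[i] - s_new - s_old) / A[i, i]
        if np.linalg.norm(x - x_prev) < tol * np.linalg.norm(x):
            return x, k
    return x, max_iter
```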

The Successive Over-Relaxation Method (SOR)


The SOR method is devised by applying extrapolation to the GS method. The extrapolation takes the form of a weighted average between the previous iterate and the computed GS iterate, successively for each component:

x_i^{(k)} = (1 - \omega)\, x_i^{(k-1)} + \omega\, \bar{x}_i^{(k)}

where \bar{x}_i^{(k)} denotes a GS iterate and \omega is the extrapolation factor. The idea is to choose a value of \omega that will accelerate the rate of convergence:

0 < \omega < 1: under-relaxation

1 < \omega < 2: over-relaxation

SOR: Example
4x_1 + 3x_2 = 24

3x_1 + 4x_2 - x_3 = 30

-x_2 + 4x_3 = -24

Solution: x = (3, 4, -5)^T
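A minimal SOR sketch applied to this example, assuming numpy; the function name sor, the starting vector, and the choice omega = 1.25 are illustrative and not given on the slide:

```python
import numpy as np

def sor(A, b, x0, omega, tol=1e-6, max_iter=100):
    """SOR: weighted average of the previous iterate and the Gauss-Seidel update."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float).copy()
    n = len(b)
    for k in range(1, max_iter + 1):
        x_prev = x.copy()
        for i in range(n):
            # Gauss-Seidel value for component i, then relax it with weight omega
            gs = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
            x[i] = (1 - omega) * x[i] + omega * gs
        if np.linalg.norm(x - x_prev) < tol * np.linalg.norm(x):
            return x, k
    return x, max_iter

A = [[4, 3, 0], [3, 4, -1], [0, -1, 4]]
b = [24, 30, -24]
x, k = sor(A, b, [1, 1, 1], omega=1.25)   # x -> approximately (3, 4, -5)
```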
