
Introduction to Coding Theory

Rong-Jaye Chen

Outline
           

[1] Introduction
[2] Basic assumptions
[3] Correcting and detecting error patterns
[4] Information rate
[5] The effects of error correction and detection
[6] Finding the most likely codeword transmitted
[7] Some basic algebra
[8] Weight and distance
[9] Maximum likelihood decoding
[10] Reliability of MLD
[11] Error-detecting codes
[12] Error-correcting codes

[1] Introduction


Coding theory


The study of methods for efficient and accurate transfer of information, including detecting and correcting transmission errors.

Information transmission system


Information Source --(k digits)--> Transmitter (Encoder) --(n digits)--> Communication Channel (+ Noise) --(n digits)--> Receiver (Decoder) --(k digits)--> Information Sink

[2] Basic assumptions
Definitions
- Digit: 0 or 1 (a binary digit)
- Word: a sequence of digits; example: 0110101
- Binary code: a set of words; examples: 1. {00, 01, 10, 11}, 2. {0, 01, 001}
- Block code: a code having all its words of the same length; example: {00, 01, 10, 11}, whose length is 2
- Codewords: the words belonging to a given code
- |C|: the size of a code C (the number of codewords in C)

Assumptions about the channel

Binary digits from {0, 1} go into the channel and binary digits from {0, 1} come out.

1. Words are received word by word: the stream 011011001 is received as 011, 011, 001.
2. The beginning of the first word can be identified.
3. The probability of any digit being affected in transmission is the same as for every other digit.

Binary symmetric channel (BSC)

A transmitted 0 is received as 0 with probability p and as 1 with probability 1 - p; a transmitted 1 is received as 1 with probability p and as 0 with probability 1 - p.

p: reliability of the channel

Note: in many books, p denotes the crossover probability; here the crossover probability (error probability) is 1 - p.
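
As a small illustration (not part of the original slides), the Python sketch below simulates one pass through a BSC with reliability p; the function name transmit_bsc is mine.

import random

def transmit_bsc(word, p, rng=random):
    """Send a binary word (a string of '0'/'1') through a binary symmetric
    channel: each digit arrives unchanged with probability p (the reliability)
    and is flipped with probability 1 - p."""
    received = []
    for digit in word:
        if rng.random() < p:
            received.append(digit)                         # digit survives
        else:
            received.append('1' if digit == '0' else '0')  # crossover error
    return ''.join(received)

random.seed(1)
print(transmit_bsc('011011001', 0.9))  # with p = 0.9, roughly one digit in ten is flipped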





[3] Correcting and detecting error patterns


Any received word should be corrected to a codeword that requires as few changes as possible.

Examples

- C1 = {00, 01, 10, 11}: cannot detect any errors, because every received word is itself a codeword.
- C2 = {000000, 010101, 101010, 111111} (each word of C1 repeated three times): if 110101 comes out of the channel, it is corrected to 010101, the codeword requiring the fewest changes.
- C3 = {000, 011, 101, 110} (C1 with a parity-check digit appended): if 010 comes out of the channel, an error is detected (010 is not a codeword), but how should it be corrected: to 110, to 000, or to 011? Each of these codewords differs from 010 in only one position.

[4] Information rate

Definition: the information rate of a code C of length n is defined as

    (1/n) · log2 |C|

Examples

- C1: (1/2) · log2 4 = 1
- C2: (1/6) · log2 4 = 1/3
- C3: (1/3) · log2 4 = 2/3
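
A quick check of these three rates in Python (the helper name information_rate is mine, for illustration only):

from math import log2

def information_rate(code):
    """Information rate of a block code: (1/n) * log2(|C|),
    where n is the common length of the codewords."""
    n = len(next(iter(code)))
    return log2(len(code)) / n

C1 = {'00', '01', '10', '11'}
C2 = {'000000', '010101', '101010', '111111'}
C3 = {'000', '011', '101', '110'}

print(information_rate(C1))  # 1.0
print(information_rate(C2))  # 0.333...
print(information_rate(C3))  # 0.666...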


[5] The effects of error correction and detection

1. No error detection or correction

Let C = {0,1}^11 = {00000000000, ..., 11111111111} (all binary words of length 11), with reliability p = 1 - 10^-8 and transmission rate 10^7 digits/sec.

Then Pr(a word is transmitted incorrectly) = 1 - p^11 ≈ 11 × 10^-8.
So 11 × 10^-8 (wrong words per word) × 10^7/11 (words per sec) = 0.1 wrong words/sec,
i.e. 1 wrong word / 10 sec, 6 wrong words / min, 360 wrong words / hr, 8640 wrong words / day.


2. A parity-check digit is added (code length becomes 12)

Any single error can be detected (in fact any odd number of errors — 3, 5, 7, ... — is detected too).

Pr(at least 2 errors in a word) = 1 - p^12 - 12 · p^11 · (1 - p) ≈ 66 × 10^-16.
So 66 × 10^-16 × 10^7/12 ≈ 5.5 × 10^-9 wrong words/sec, i.e. about one word error every 2000 days!
The cost we pay is a slightly reduced information rate, plus retransmission after an error is detected.
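
The arithmetic in cases 1 and 2 can be reproduced with a few lines of Python (a sketch; the variable names are mine):

from math import comb

p = 1 - 1e-8                 # reliability
digits_per_sec = 1e7         # transmission rate

# Case 1: length-11 words, no detection or correction.
pr_wrong = 1 - p**11
print(pr_wrong * digits_per_sec / 11)          # ~0.1 wrong words/sec

# Case 2: length-12 words with a parity-check digit; a word can only be
# accepted wrongly if it has at least 2 errors (single errors are detected).
q = 1 - p
pr_2_or_more = sum(comb(12, k) * q**k * p**(12 - k) for k in range(2, 13))
wrong_per_sec = pr_2_or_more * digits_per_sec / 12
print(wrong_per_sec)                            # ~5.5e-9 wrong words/sec
print(1 / (wrong_per_sec * 86400))              # ~2000 days per wrong word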


3. 3-repetition code (each codeword is repeated three times)

Any single error can be corrected! The code length becomes 33 and the information rate drops to 1/3.

Task: design codes with
- reasonable information rates
- low encoding and decoding costs
- some error-correcting capability




[6] Finding the most likely codeword transmitted

BSC with:
    p : reliability
    d : number of digits transmitted incorrectly
    n : code length

The probability that, if the codeword v is sent, the word w is received (where v and w disagree in d positions) is

    φ_p(v, w) = p^(n - d) · (1 - p)^d

Example (code length n = 5):

    φ_p(v, v) = p^5
    φ_0.9(10101, 01101) = (0.9)^3 · (0.1)^2 = 0.00729
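
A direct Python transcription of the formula (the function name phi is mine):

def phi(p, v, w):
    """phi_p(v, w) = p**(n - d) * (1 - p)**d for a BSC with reliability p,
    where n = len(v) and d is the number of positions where v and w disagree."""
    assert len(v) == len(w)
    d = sum(a != b for a, b in zip(v, w))
    return p**(len(v) - d) * (1 - p)**d

print(phi(0.9, '10101', '10101'))  # p**5 = 0.59049
print(phi(0.9, '10101', '01101'))  # (0.9)**3 * (0.1)**2 = 0.00729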




Assume v is sent when w is received if

    φ_p(v, w) = max{ φ_p(u, w) : u ∈ C }

Theorem 1.6.3
Suppose we have a BSC with 1/2 < p < 1. Let v1 and v2 be codewords and w a word, each of length n. Suppose that v1 and w disagree in d1 positions, and v2 and w disagree in d2 positions. Then

    φ_p(v1, w) ≤ φ_p(v2, w)  iff  d1 ≥ d2





Example

A codeword v ∈ C = {01101, 01001, 10100, 10101} is sent over a BSC with p = 0.98, and w = 00110 is received.

    v       d (number of disagreements with w)
    01101   3
    01001   4
    10100   2   <- smallest d
    10101   3

By Theorem 1.6.3 the codeword with the smallest d is the most likely one sent, so w is decoded to 10100.
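
The same selection in code (a sketch reusing the phi idea above; nothing here is prescribed by the slides):

C = ['01101', '01001', '10100', '10101']
w = '00110'
p = 0.98

def disagreements(v, w):
    return sum(a != b for a, b in zip(v, w))

for v in C:
    d = disagreements(v, w)
    print(v, d, p**(len(v) - d) * (1 - p)**d)

# The smallest d gives the largest phi_p(v, w) (Theorem 1.6.3).
print('decode to', min(C, key=lambda v: disagreements(v, w)))  # 10100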





[7] Some basic algebra


K = {0, 1}

Addition:        0 + 0 = 0,  0 + 1 = 1,  1 + 0 = 1,  1 + 1 = 0
Multiplication:  0 · 0 = 0,  1 · 0 = 0,  0 · 1 = 0,  1 · 1 = 1

K^n: the set of all binary words of length n

Addition: words in K^n are added digit by digit, using the addition in K.
Scalar multiplication: 0 · w = 0^n and 1 · w = w, where 0^n is the zero word (n zeros).




K^n is a vector space

For all words u, v, w of length n and all scalars a, b in K:

1. v + w ∈ K^n
2. (u + v) + w = u + (v + w)
3. v + 0 = 0 + v = v
4. there is a word v' ∈ K^n with v + v' = v' + v = 0
5. v + w = w + v
6. a·v ∈ K^n
7. a·(v + w) = a·v + a·w
8. (a + b)·v = a·v + b·v
9. (a·b)·v = a·(b·v)
10. 1·v = v
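
Since addition in K^n is digit-wise addition mod 2, it is exactly bitwise XOR on the word. A minimal sketch (function names are mine); note that, by property 4, every word is its own additive inverse:

def add(v, w):
    """Digit-wise sum mod 2 (XOR) of two binary words of equal length."""
    return ''.join('0' if a == b else '1' for a, b in zip(v, w))

def scalar(a, v):
    """Scalar multiplication over K = {0, 1}: 0*v is the zero word, 1*v is v."""
    return v if a == 1 else '0' * len(v)

print(add('110101', '011011'))   # 101110
print(add('110101', '110101'))   # 000000 -> every word is its own inverse
print(scalar(0, '110101'))       # 000000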






[8] Weight and distance




Hamming weight wt(v)

The number of times the digit 1 occurs in v.
Examples: wt(110101) = 4, wt(000000) = 0

Hamming distance d(v, w)

The number of positions in which v and w disagree.
Examples: d(01011, 00111) = 2, d(10110, 10110) = 0
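
Both functions are one-liners in Python (the names wt and dist are mine); they match the examples above:

def wt(v):
    """Hamming weight: the number of 1s in the word v."""
    return v.count('1')

def dist(v, w):
    """Hamming distance: the number of positions in which v and w disagree."""
    return sum(a != b for a, b in zip(v, w))

print(wt('110101'), wt('000000'))                      # 4 0
print(dist('01011', '00111'), dist('10110', '10110'))  # 2 0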





Some facts

For all words u, v, w of length n and any digit a:

1. 0 ≤ wt(v) ≤ n
2. wt(v) = 0 iff v = 0 (the zero word)
3. 0 ≤ d(v, w) ≤ n
4. d(v, w) = 0 iff v = w
5. d(v, w) = d(w, v)
6. wt(v + w) ≤ wt(v) + wt(w)
7. d(v, w) ≤ d(v, u) + d(u, w)
8. wt(a·v) = a · wt(v)
9. d(a·v, a·w) = a · d(v, w)





[9] Maximum likelihood decoding


A source string x is encoded as a codeword v of length n; the channel adds an error pattern u, and the word w = v + u is received; the decoder (CMLD or IMLD) decodes w back to a codeword.

CMLD (Complete Maximum Likelihood Decoding)
- If exactly one codeword v in C is closest to w, decode w to v.
- If several codewords are equally close to w, arbitrarily select one of them.

IMLD (Incomplete Maximum Likelihood Decoding)
- If exactly one codeword v in C is closest to w, decode w to v.
- If several codewords are equally close to w, ask for retransmission.



Since d(v, w) = wt(v + w), the error pattern is u = v + w, and

    φ_p(v1, w) ≥ φ_p(v2, w)  iff  wt(v1 + w) ≤ wt(v2 + w)

so the most likely codeword sent is the one whose error pattern has the smallest weight.

Example: construct the IMLD table for |M| = 3, C = {0000, 1010, 0111}. In each row the error pattern of smallest weight determines the decoding; "-" marks a tie, where IMLD asks for retransmission. (Only 10 of the 16 possible received words are listed here.)

    Received w   0000 + w   1010 + w   0111 + w   Decode v
    0000         0000       1010       0111       0000
    1000         1000       0010       1111       -
    0100         0100       1110       0011       0000
    0010         0010       1000       0101       -
    0001         0001       1011       0110       0000
    1100         1100       0110       1011       -
    1010         1010       0000       1101       1010
    1001         1001       0011       1110       -
    0110         0110       1100       0001       0111
    0101         0101       1111       0010       0111
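
The table can be generated mechanically. A sketch (helper names are mine) that prints one row per possible received word, with '-' where IMLD would ask for retransmission:

from itertools import product

C = ['0000', '1010', '0111']

def add(v, w):
    return ''.join('0' if a == b else '1' for a, b in zip(v, w))

def wt(v):
    return v.count('1')

for bits in product('01', repeat=4):
    w = ''.join(bits)
    patterns = {v: add(v, w) for v in C}                 # error pattern u = v + w
    best = min(wt(u) for u in patterns.values())
    closest = [v for v, u in patterns.items() if wt(u) == best]
    decode = closest[0] if len(closest) == 1 else '-'    # tie -> retransmit
    print(w, *(patterns[v] for v in C), decode)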





[10] Reliability of MLD




The probability that, if v is sent over a BSC of reliability p, IMLD correctly concludes that v was sent:

    θ_p(C, v) = Σ_{w ∈ L(v)} φ_p(v, w)

where L(v) is the set of all words w that are closer to v than to any other codeword (the words that IMLD decodes to v).

The higher this probability, the more reliably the codeword v is decoded.
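
For a small code, θ_p(C, v) can be computed by brute force: enumerate all words of length n, keep those strictly closer to v than to any other codeword (my reading of L(v) here), and sum φ_p(v, w). A sketch with names of my own choosing:

from itertools import product

def dist(v, w):
    return sum(a != b for a, b in zip(v, w))

def phi(p, v, w):
    d = dist(v, w)
    return p**(len(v) - d) * (1 - p)**d

def theta(p, C, v):
    """Sum of phi_p(v, w) over all words w in L(v), taking L(v) to be the
    words strictly closer to v than to any other codeword of C."""
    total = 0.0
    for bits in product('01', repeat=len(v)):
        w = ''.join(bits)
        if all(dist(w, v) < dist(w, u) for u in C if u != v):
            total += phi(p, v, w)
    return total

C = ['0000', '1010', '0111']
print(theta(0.98, C, '0000'))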




[11] Error-detecting codes

A codeword v is sent and w = v + u is received, where u is the error pattern. If w = v + u is again a codeword, the error pattern u cannot be detected; if v + u is not in C, it can be detected. A code C detects the error pattern u if v + u is not a codeword for every v in C.

Example: C = {000, 111}

    Error pattern u   000 + u   111 + u   Detect?
    000               000       111       cannot detect
    100               100       011       can detect
    010               010       101       can detect
    001               001       110       can detect
    110               110       001       can detect
    101               101       010       can detect
    011               011       100       can detect
    111               111       000       cannot detect

Note u = v + w; the undetectable patterns are exactly those for which v + u lands on a codeword: 000 + 000 = 000, 000 + 111 = 111, 111 + 111 = 000.
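
A quick script to reproduce the "Detect?" column (a sketch; it just applies the rule that C detects u when v + u is never a codeword):

from itertools import product

C = {'000', '111'}

def add(v, w):
    return ''.join('0' if a == b else '1' for a, b in zip(v, w))

for bits in product('01', repeat=3):
    u = ''.join(bits)
    detected = all(add(v, u) not in C for v in C)
    print(u, 'can detect' if detected else 'cannot detect')
# Only 000 and 111 go undetected, matching the table above.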





The distance of the code C

The smallest value of d(v, w) over all pairs of distinct codewords v, w in C.

Theorem 1.11.14
A code C of distance d will at least detect all non-zero error patterns of weight less than or equal to d - 1. Moreover, there is at least one error pattern of weight d which C will not detect.

t error-detecting code
A code is a t error-detecting code if it detects all error patterns of weight at most t and does not detect at least one error pattern of weight t + 1. A code of distance d is a (d - 1) error-detecting code.





[12] Error-correcting codes

A codeword v is sent and w = v + u is received, where u is the error pattern. A code C corrects the error pattern u if, for every v in C, v + u is closer to v than to any other codeword in C.

Theorem 1.12.9
A code of distance d will correct all error patterns of weight less than or equal to ⌊(d - 1)/2⌋. Moreover, there is at least one error pattern of weight 1 + ⌊(d - 1)/2⌋ which C will not correct.

t error-correcting code
A code is a t error-correcting code if it corrects all error patterns of weight at most t and does not correct at least one error pattern of weight t + 1. A code of distance d is a ⌊(d - 1)/2⌋ error-correcting code.


Example: C = {000, 111}, distance d = 3.

    Received w   000 + w   111 + w   Decode v
    000          000*      111       000
    100          100*      011       000
    010          010*      101       000
    001          001*      110       000
    110          110       001*      111
    101          101       010*      111
    011          011       100*      111
    111          111       000*      111

    (* marks the error pattern of smaller weight in each row.)

Since ⌊(3 - 1)/2⌋ = 1, C corrects the error patterns 000, 100, 010, 001 (all patterns of weight at most 1).
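
A matching check in code (a sketch): C corrects u exactly when, for every codeword v, the word v + u is strictly closer to v than to any other codeword.

from itertools import product

C = ['000', '111']

def add(v, w):
    return ''.join('0' if a == b else '1' for a, b in zip(v, w))

def dist(v, w):
    return sum(a != b for a, b in zip(v, w))

def corrects(C, u):
    """True if, for every v in C, v + u is closer to v than to any other codeword."""
    return all(
        all(dist(add(v, u), v) < dist(add(v, u), x) for x in C if x != v)
        for v in C
    )

print([u for u in (''.join(b) for b in product('01', repeat=3)) if corrects(C, u)])
# ['000', '001', '010', '100'] -- exactly the patterns of weight <= 1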