
Instantaneous Codes

An instantaneous code is a code in which each codeword in any string of codewords can be decoded (reading from left to right) as soon as it is received.
Instantaneous codes are represented by code trees (so in the following we study something about graphs and trees).
Graph: a graph is a pair G = (V, E) where
V = {v_1, ..., v_s} is a set of vertices (nodes)
E = {e_1, ..., e_n} is a set of edges
and each edge e_i joins two vertices v_k and v_j.

Directed graph (Digraph): a digraph is a pair G = (V, E) where
V = {v_1, ..., v_s} is a set of vertices (nodes)
E = {e_1, ..., e_n} is a set of arcs (directed edges)
and each arc e_i goes from a vertex v_k to a vertex v_j (written e_i : v_k → v_j).

Tree: a tree is a special digraph with a distinguished start node called the root.
Path: a path is a sequence of arcs (edges).
Loop: a loop is a path that starts and ends at the same node.
Leaf: a leaf is a node with no outgoing arcs.
Binary tree: a binary tree is a tree in which 2 (or 0) arcs go out of each node.
(Binary) coding tree: all codewords are assigned to leaves.
Example: c1, c2, c3, c4
Note that non-instantaneous codes cannot be represented as coding trees.
Example: c5, c6
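To make the coding-tree idea concrete, here is a minimal Python sketch (my own illustration, using the hypothetical code {a: 0, b: 10, c: 11} rather than the codes c1, ..., c4 above): codewords sit at the leaves of a binary tree, and a symbol is emitted as soon as a leaf is reached while reading the bits from left to right.

    # Minimal sketch of instantaneous decoding with a binary coding tree.
    # The code {a: "0", b: "10", c: "11"} is a hypothetical example.

    def build_tree(code):
        """Build a binary coding tree (trie); codewords are assigned to leaves."""
        root = {}
        for symbol, word in code.items():
            node = root
            for bit in word:
                node = node.setdefault(bit, {})
            node["symbol"] = symbol          # mark the leaf with its source symbol
        return root

    def decode(bits, root):
        """Emit a symbol as soon as a leaf is reached (left-to-right reading)."""
        symbols, node = [], root
        for bit in bits:
            node = node[bit]
            if "symbol" in node:             # leaf reached: codeword is complete
                symbols.append(node["symbol"])
                node = root                  # restart at the root
        return "".join(symbols)

    code = {"a": "0", "b": "10", "c": "11"}
    print(decode("0101100", build_tree(code)))   # -> "abcaa"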

Kraft Inequality
It answers the question: what requirements must be met by a code in order for it to be instantaneously decodable?

The codeword lengths l_1, ..., l_n of an instantaneous code must satisfy

\sum_{i=1}^{n} r^{-l_i} \le 1

where r is the size of the code alphabet and l_i is the length of the i-th codeword.

In the case of a binary tree (i.e. the code alphabet is {0, 1}) we have

\sum_{i=1}^{n} 2^{-l_i} \le 1
Note:
(1) K.I. means: if the lengths l_1, ..., l_n satisfy the Kraft inequality, then there must exist some instantaneous code with these lengths.
(2) K.I. does not mean: any code whose codeword lengths satisfy the Kraft inequality must be instantaneous.
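As a small illustration of checking the inequality (a sketch of my own, not part of the notes), the following Python function tests whether a list of codeword lengths satisfies the Kraft inequality for a code alphabet of size r:

    from fractions import Fraction

    def satisfies_kraft(lengths, r=2):
        """Return True if sum of r^(-l_i) <= 1, using exact rational arithmetic."""
        return sum(Fraction(1, r ** l) for l in lengths) <= 1

    print(satisfies_kraft([1, 2]))       # 1/2 + 1/4 = 3/4 <= 1  -> True
    print(satisfies_kraft([1, 1, 2]))    # 1/2 + 1/2 + 1/4 > 1   -> False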
Example:
Symbol   Code   Length
A        0      1
B        01     2

\sum_{i=1}^{2} 2^{-l_i} = 2^{-1} + 2^{-2} = \frac{1}{2} + \frac{1}{4} = \frac{3}{4} \le 1

This code satisfies K.I., but it is not instantaneous (because it cannot be represented as a binary coding tree: the codeword 0 is a prefix of 01).

But by note (1) there must exist some instantaneous code with these lengths (i.e. 1 and 2), which can be:
Symbol   Code   Length
A        0      1
B        11     2

\sum_{i=1}^{2} 2^{-l_i} = 2^{-1} + 2^{-2} = \frac{3}{4} \le 1

and it can be represented as a binary coding tree with the codewords 0 and 11 at its leaves.
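To illustrate note (1), here is a sketch (my own, not from the notes) of the standard construction of a binary prefix code from lengths that are assumed to satisfy the Kraft inequality; for the lengths 1 and 2 it yields {0, 10}, which is instantaneous just like the code {0, 11} shown above.

    def prefix_code_from_lengths(lengths):
        """Build binary codewords of the given lengths (assumed to satisfy K.I.)."""
        codewords, value, prev_len = [], 0, 0
        for l in sorted(lengths):
            value <<= (l - prev_len)            # append zeros to reach length l
            codewords.append(format(value, "0{}b".format(l)))
            value += 1                          # next codeword follows immediately
            prev_len = l
        return codewords

    print(prefix_code_from_lengths([1, 2]))        # ['0', '10']
    print(prefix_code_from_lengths([1, 2, 3, 3]))  # ['0', '10', '110', '111']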

Uniquely Decodable
If any encoded string has only one possible source string producing it, then we have unique decodability.

Extension of a source code:

A source code

S = \begin{pmatrix} a_1, \dots, a_m \\ p_1, \dots, p_m \end{pmatrix}

has the nth extension

S^n = \begin{pmatrix} a_1 a_1 \cdots a_1, & a_1 a_1 \cdots a_1 a_2, & \dots, & a_m a_m \cdots a_m \\ p_1 p_1 \cdots p_1, & p_1 p_1 \cdots p_1 p_2, & \dots, & p_m p_m \cdots p_m \end{pmatrix}

For example, the 2nd extension is

S^2 = \begin{pmatrix} a_1 a_1, & a_1 a_2, & a_1 a_3, & \dots, & a_m a_m \\ p_1 p_1, & p_1 p_2, & p_1 p_3, & \dots, & p_m p_m \end{pmatrix}

Example: Consider the code

Symbol   Code
A        1
B        10
C        100
D        1000

The 2nd extension of this code is

Pair   Code       Pair   Code       Pair   Code       Pair   Code
AA     11         BA     101        CA     1001       DA     10001
AB     110        BB     1010       CB     10010      DB     100010
AC     1100       BC     10100      CC     100100     DC     1000100
AD     11000      BD     101000     CD     1001000    DD     10001000

All 16 codewords are different.
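The check above can be automated; the sketch below (my own helper names) forms the nth extension of a code by concatenation and reports whether it is non-singular. Note that this tests a single n only, whereas unique decodability requires the nth extension to be non-singular for every n.

    from itertools import product

    def nth_extension(code, n):
        """Encode every block of n source symbols by concatenating codewords."""
        return {"".join(block): "".join(code[s] for s in block)
                for block in product(code, repeat=n)}

    def extension_is_nonsingular(code, n):
        ext = nth_extension(code, n)
        return len(set(ext.values())) == len(ext)   # no two blocks share an encoding

    code = {"A": "1", "B": "10", "C": "100", "D": "1000"}
    print(nth_extension(code, 2)["CD"])        # 1001000
    print(extension_is_nonsingular(code, 2))   # True: all 16 encodings differ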


• A code is called uniquely decodable if its nth extension is non-singular for every n.
Relationship between Codes: All codes ⊃ Non-singular codes ⊃ Uniquely decodable codes (McMillan inequality) ⊃ Instantaneous codes (Kraft inequality).

McMillan inequality
The McMillan inequality extends the Kraft inequality to uniquely decodable codes, i.e. the codeword lengths of any uniquely decodable code must satisfy the Kraft inequality:

\sum_{i=1}^{n} r^{-l_i} \le 1

In the case of a binary code (i.e. the code alphabet is {0, 1}, so r = 2):

\sum_{i=1}^{n} 2^{-l_i} \le 1

Mean code length


Let S be a memoryless stationary discrete information source

S = \begin{pmatrix} a_1, a_2, \dots, a_m \\ p_1, p_2, \dots, p_m \end{pmatrix}

Let C be a uniquely decodable code


The mean code length L is given by

L = \sum_{i=1}^{m} p_i l_i = p_1 l_1 + p_2 l_2 + \dots + p_m l_m = -\sum_{i=1}^{m} p_i \log_2 q_i

where q_i = 2^{-l_i} = \left(\tfrac{1}{2}\right)^{l_i}, \quad i = 1, 2, \dots, m

From the Kraft-McMillan inequality we have q_1 + q_2 + \dots + q_m \le 1


and hence, by Gibbs' inequality,

L = -\sum_{i=1}^{m} p_i \log_2 q_i \ge -\sum_{i=1}^{m} p_i \log_2 p_i = H(S)

i.e. the mean code length of any uniquely decodable code is at least the source entropy.
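The bound can be checked numerically; the following sketch uses a hypothetical source (probabilities and codeword lengths of my own choosing) to compute the mean code length and the entropy:

    from math import log2

    def mean_length(probs, lengths):
        return sum(p * l for p, l in zip(probs, lengths))

    def entropy(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical source with codeword lengths 1, 2, 3, 3
    probs = [0.5, 0.25, 0.125, 0.125]
    lengths = [1, 2, 3, 3]
    print(mean_length(probs, lengths))   # 1.75
    print(entropy(probs))                # 1.75  (here L = H(S), the optimal case)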

Huffman Code
• With the Huffman code in the binary case, the two least probable source output symbols are joined together, resulting in a new message alphabet with one less symbol; this step is repeated until one symbol remains, and the codewords are read off the resulting tree (see the sketch after this list).
• The Huffman code is also a compact code and satisfies the following properties:
1. It has the shortest mean length among binary instantaneous codes.
2. Its code tree is optimal.
3. Its code tree is a compact code tree.
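The following is a minimal sketch of the binary Huffman procedure (my own illustration, with a hypothetical source): the two least probable groups are repeatedly merged, a bit is prepended to each codeword in the merged groups, and the loop stops when one group remains.

    import heapq
    from itertools import count

    def huffman_code(probs):
        """probs: dict symbol -> probability. Returns dict symbol -> codeword."""
        tiebreak = count()                       # avoids comparing dicts when probabilities tie
        heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, group1 = heapq.heappop(heap)  # two least probable groups
            p2, _, group2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in group1.items()}
            merged.update({s: "1" + w for s, w in group2.items()})
            heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
        return heap[0][2]

    probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
    print(huffman_code(probs))   # codeword lengths 1, 2, 3, 3; mean length 1.9 bits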
