NORMAN ABRAMSON
Associate Professor of Electrical Engineering
Stanford University
INFORMATION THEORY AND CODING
McGRAW-HILL ELECTRONIC SCIENCES SERIES
EDITORIAL BOARD
Ronald Bracewell
Colin Cherry
Willis W. Harman
Edward W. Herold
John G. Linvill
Simon Ramo
John G. Truxal
ABRAMSON Information Theory and Coding
BREMER Superconductive Devices
GILL Introduction to the Theory of Finite-state Machines
HUELSMAN Circuits, Matrices, and Linear Vector Spaces
PAPOULIS The Fourier Integral and Its Applications
STEINBERG AND LEQUEUX (TRANSLATOR R. N. BRACEWELL) Radio Astronomy

McGRAW-HILL Book Company, Inc.
New York   San Francisco   London

CONTENTS
Preface v
Glossary of Symbols and Entropy Expressions
CHAPTER 1 INTRODUCTION 1
1-1 What Information Theory Is Not 1
1-2 What Information Theory Is 3
1-3 Encoding Information 3
1-4 A Problem in Information Transmission 6
1-5 Some Questions 9

CHAPTER 2 INFORMATION AND SOURCES
2-1 The Definition of Information 11
2-2 The Zero-memory Information Source 18
2-3 Some Properties of Entropy
2-4 Extensions of a Zero-memory Source 19
2-5 The Markov Information Source 22
2-6 The Adjoint Source 27
2-7 Extensions of a Markov Source 29
2-8 The Structure of Language 33
CHAPTER 3 SOME PROPERTIES OF CODES
3-1 Introduction 45
3-2 Uniquely Decodable Codes 47
3-3 Instantaneous Codes 49
3-4 Construction of an Instantaneous Code 52
3-5 The Kraft Inequality—Statement and Discussion
3-6 The Kraft Inequality—Proof 57
3-7 McMillan's Inequality 59
3-8 Some Examples 60
CHAPTER 4 CODING INFORMATION SOURCES
4-1 Average Length of a Code 65
4-2 A Method of Encoding for Special Sources 68
4-3 Shannon's First Theorem
4-4 Shannon's First Theorem for Markov Sources 74
4-5 Coding without Extensions 75
4-6 Finding Binary Compact Codes—Huffman Codes 77
4-7 Completing the Proof 82
4-8 r-ary Compact Codes 83
4-9 Code Efficiency and Redundancy 85
CHAPTER 5 CHANNELS AND MUTUAL INFORMATION
5-1 Introduction 93
5-2 Information Channels 94
5-3 Probability Relations in a Channel 98
5-4 A Priori and A Posteriori Entropies 100
5-5 A Generalization of Shannon's First Theorem 101
5-6 Mutual Information 105
5-7 Properties of Mutual Information 107
5-8 Noiseless Channels and Deterministic Channels 111
5-9 Cascaded Channels 113
5-10 Reduced Channels and Sufficient Reductions 118
5-11 Additivity of Mutual Information 123
5-12 Mutual Information of Several Alphabets 127
5-13 Channel Capacity 131
5-14 Conditional Mutual Information 135
CHAPTER 6 RELIABLE MESSAGES THROUGH UNRELIABLE CHANNELS
6-1 Introduction 147
6-2 Error Probability and Decision Rules 149
6-3 The Fano Bound 153
6-4 Reliable Messages and Unreliable Channels 155
6-5 An Example of Coding to Correct Errors 158
6-6 Hamming Distance 163
6-7 Shannon's Second Theorem for the BSC—The First Step 165
6-8 Random Coding—The Second Step 170
6-9 Shannon's Second Theorem—Discussion 172