
Engineering Theory and Mathematics in the Early Development of

Information Theory
Lav R. Varshney
School of Electrical and Computer Engineering
Cornell University

The History of Information Theory

 Information theory was formulated in 1948 by Claude E. Shannon.
 Two social groups, communications engineering theorists and mathematical scientists, made significant contributions to information theory in the late 1940s and early 1950s.
 The socially constructed meaning of information theory held by members of each group, rather than their academic credentials, serves to distinguish the two groups.
 The relationship between mathematicians and engineering theorists in the development of information theory has been marked by mutual interaction, synecdochic of science and technology in electronics.

Information Theory as Science

 Some engineering theorists constructed information theory as a type of science that would allow measurement and characterization of various sources and channels.
 This view led to empirical and experimental characterization of information sources such as television pictures, human speech, and printed English text.
Kretzmer, Bell System Technical Journal (1952)

 It also led to characterization of various existing modulation systems, determining their information rates for comparison with the optimum (channel capacity).

Information Theory as an Ideal for Communication System Design

 Other engineering theorists viewed information theory as an ideal to work towards in the construction of communication systems.
 Notable successes came in code design, including the asymptotically optimal codes of Rice and the optimal instantaneous Huffman source code.
 Incorporation of information-theoretic ideas into actual systems was limited by complexity and latency constraints.

Rice, Bell System Technical Journal (1950)

Shannon's Initial Formulation

 "A Mathematical Theory of Communication."
 A pioneering work that was the culmination of the mathematization of communication, yet so unique that it may be considered the work of a "heroic inventor."
 A synthesis of communications engineering theory and mathematical science ideas.
 Perceived by Shannon as "a branch of mathematics, a strictly deductive system."
 The main results include a general formulation of a communication system, with ways to measure the amount of information generated by the source and the capacity of the noisy channel.
H = −∑ p(x) log p(x)

Entropy: the amount of information generated by a source.

C = W log(1 + P/N)

Channel capacity of an additive white Gaussian noise channel with bandwidth W, signal power P, and noise power N.
 Information can be sent with arbitrarily small error if the source entropy H is less than the channel capacity C.
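As an illustrative sketch (not from the poster itself), the two quantities can be computed for a concrete source and channel. The source probabilities and channel parameters below are arbitrary choices for illustration; logarithms are base 2, so entropy is in bits.

```python
import math

def entropy(probs):
    """H = -sum p(x) log2 p(x): bits of information per source symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def awgn_capacity(bandwidth_hz, signal_power, noise_power):
    """C = W log2(1 + P/N): bits per second over an AWGN channel."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Hypothetical four-symbol source and channel, chosen only for illustration.
H = entropy([0.5, 0.25, 0.125, 0.125])   # 1.75 bits per symbol
C = awgn_capacity(3000, 1.0, 0.1)        # ≈ 10378 bits per second

# The source can be sent with arbitrarily small error as long as its
# information rate (symbols/s times H) stays below the capacity C.
symbols_per_second = 1000
print(H, C, symbols_per_second * H < C)
```

At a symbol rate of 1000 symbols per second, this hypothetical source generates 1750 bits per second, comfortably below the channel's capacity, so reliable transmission is possible in principle.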
 Popular conceptions of information theory appeared in the contemporary press. [Figures omitted: Bello, Fortune (1953); Feldman, Bell Laboratories Record (1953)]
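The optimal instantaneous Huffman source code, cited above (under "Information Theory as an Ideal for Communication System Design") as a notable success of code design, can be sketched in a few lines. The alphabet and probabilities below are hypothetical, chosen so that the resulting prefix-free (instantaneous) code is exactly optimal, with shorter codewords for likelier symbols.

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a prefix-free (instantaneous) binary code from symbol probabilities."""
    tiebreak = count()  # keeps heap comparisons away from unorderable tree nodes
    heap = [(p, next(tiebreak), sym) for sym, p in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees.
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):  # internal node: 0 to the left, 1 to the right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

# Hypothetical source; dyadic probabilities make codeword lengths
# exactly equal to -log2 p(x), so the average length meets the entropy.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # e.g. "a" gets a 1-bit codeword, "c" and "d" get 3-bit codewords
```

Because the code is prefix-free, each codeword can be decoded the instant its last bit arrives, which is what "instantaneous" means here.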
Mathematicians’ Conceptions of Information Theory
 At first there was very little interest in information theory, and doubt about its importance.
 The developing relationship between information theory and algebraic coding theory
established information theory as a true mathematical discipline in the eyes of mathematicians.
 After a few years, mathematicians developed proofs more satisfactory to them, adding rigor to
what was viewed as an important yet incomplete engineer’s sketch.
Conclusions

 The meanings of information theory adopted by the various social groups during the formative
period defined research directions at the time and for decades thereafter.
 Mutual interaction between engineering theorists and mathematicians led to developments in
information theory that are rigorous, practical, and of importance in the design of electronic
communications technology.

2004 IEEE Conference on the History of Electronics, Bletchley Park, England, June 28-30, 2004
