- 102 -

GENERAL TREATMENT. PROBLEM OF CODING

by C. E. Shannon

ABSTRACT

A typical communication system consists of the following five elements:

(1) An information source. This can be considered to be represented mathematically by a suitable stochastic process which chooses one message from a set of possible messages. The rate R of producing information is measured by the entropy per symbol of the process.

(2) An encoding or transmitting element. Mathematically this amounts to a transformation applied to the message to produce the signal, i.e., the encoded message.

(3) A channel on which the signal is transmitted from transmitter to receiver. During transmission the signal may be perturbed by noise.

(4) A receiving and decoding (or demodulating) device which recovers the original message from the received signal.

(5) The destination of the information, e.g., the human ear (for telephony) or the eye (for television). The characteristics of the destination may determine the significant elements of the information to be transmitted. For example, with sound transmission, precise recovery of the phases of components is not required because of the insensitivity of the ear to this type of distortion.

The central problems to be considered are how one can measure the capacity of a channel for transmitting information; how this capacity depends on various parameters such as bandwidth, available transmitter power and type of noise; and what is the best encoding system for a given information source to utilize a channel most efficiently. Since the output of any information source can be encoded into binary digits using, statistically, R binary digits per symbol, the problem of defining a channel capacity can be reduced to the problem of determining the maximum number of binary digits that can be transmitted per second over the channel.
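As an illustration of the rate R (this example is not part of the original abstract), for a memoryless source R is just the entropy of the symbol distribution; the probabilities below are hypothetical:

```python
from math import log2

def entropy_per_symbol(probs):
    """R = -sum(p * log2 p), in binary digits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical memoryless source emitting four symbols
probs = [0.5, 0.25, 0.125, 0.125]
R = entropy_per_symbol(probs)  # 1.75 binary digits per symbol
```

With these probabilities an ideal binary encoding uses, on average, 1.75 binary digits per source symbol, which is the sense in which the abstract says the source output "can be encoded into binary digits using, statistically, R binary digits per symbol."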
When there is no noise in the channel, it is generally possible to set up a difference equation whose asymptotic solution gives essentially the number of different signals of duration T when T is large. From this, it is possible to calculate the number of binary digits that can be transmitted in time T and, consequently, the channel capacity.

In a noisy system, the problem is mathematically considerably more difficult. Nevertheless, a definite channel capacity C exists in the following sense. It is possible by proper encoding of binary digits into allowable signal functions to transmit as closely as desired to the rate C binary digits per second with arbitrarily small frequency of errors; there is no method of encoding which transmits at a larger rate. In general, the ideal rate C can only be approached by using more and more complex encoding systems and longer and longer delays at both transmitter and receiver. The channel capacity C is given by an expression involving the difference of two entropies. This expression must be maximized over all possible stochastic processes which might be used to generate signal functions. The actual numerical evaluation of C is difficult and has been carried out in only a few cases. Even when C is known, the construction of coding systems which approach the ideal rate of transmission is often unfeasible.

A simple example of a noisy channel in which the capacity and an explicit ideal code can be found is the following. Assume the elementary signals are binary digits and that the noise produces at most one error in a group of seven of these. The channel capacity can be calculated as 4/7 bits per elementary signal. A code which transmits at this rate on the average is as follows. Let a block of seven symbols be x1, x2, x3, x4, x5, x6, x7 (each x either 0 or 1). x3, x5, x6 and x7 are used as message symbols, and x1, x2 and x4 are used redundantly for checking purposes.
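The noiseless counting argument can be made concrete with a small sketch. The channel below is hypothetical and not the one discussed in the text: it allows two elementary symbols, of durations 1 and 2 time units, so the number N(T) of distinct signals of duration T satisfies the difference equation N(T) = N(T-1) + N(T-2), and the capacity is the growth exponent log2 N(T) / T:

```python
from math import log2

def count_signals(T):
    """N(T) = N(T-1) + N(T-2): distinct signals of total duration T
    built from two symbols of durations 1 and 2 (hypothetical channel)."""
    n = [1, 1]  # N(0) = 1 (empty signal), N(1) = 1
    for _ in range(2, T + 1):
        n.append(n[-1] + n[-2])
    return n[T]

# The asymptotic solution of the difference equation grows like phi**T,
# so log2(N(T)) / T approaches C = log2((1 + 5**0.5) / 2) ~ 0.694 bits
T = 400
C_est = log2(count_signals(T)) / T
```

The asymptotic value log2 of the largest root of the characteristic equation is exactly the capacity this channel's difference equation yields, which is the pattern of calculation the paragraph above describes.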
These are chosen by the following rules:

(1) x4 is chosen so that alpha = (x4 + x5 + x6 + x7) = 0 mod 2
(2) x2 is chosen so that beta = (x2 + x3 + x6 + x7) = 0 mod 2
(3) x1 is chosen so that gamma = (x1 + x3 + x5 + x7) = 0 mod 2

The binary number alpha beta gamma, calculated by these same expressions from the received signal, gives the location of the error. (If zero, there was no error.) This forms a completely self-correcting code for the assumed type of noise.

If the signal functions are capable of continuous variation we have a continuous channel. If there were no noise whatever, a continuous channel would have an infinite capacity. Physically, there is always some noise. With white Gaussian noise the capacity is given by

    C = W log (1 + P/N)                                      (1)

in which

    W = bandwidth in cycles per second
    P = available average transmitter power
    N = average noise power within the band W

The equation (1) is an exchange relation among the quantities W, P, N and C. Thus, the transmitter power can be reduced by increasing the bandwidth, retaining the same channel capacity. Conversely, a smaller bandwidth can be used at the expense of a greater signal-to-noise ratio. If, as is usually the case, the noise power increases proportionally with bandwidth, N = N0 W, we have

    C = W log (1 + P/(N0 W))                                 (2)

As W increases, C approaches the asymptotic value C_inf = (P/N0) log e.
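The checking rules above can be written out directly in code. This is a minimal sketch, with bit positions numbered x1 through x7 as in the text and alpha, beta, gamma read as the binary number alpha-beta-gamma giving the position of the error:

```python
def encode(m3, m5, m6, m7):
    """Place message bits at positions 3, 5, 6, 7 and choose x4, x2, x1
    so that each parity sum is 0 mod 2 (rules (1)-(3) in the text)."""
    x = [0] * 8                        # x[1]..x[7]; x[0] unused
    x[3], x[5], x[6], x[7] = m3, m5, m6, m7
    x[4] = (x[5] + x[6] + x[7]) % 2    # alpha = x4+x5+x6+x7 = 0 mod 2
    x[2] = (x[3] + x[6] + x[7]) % 2    # beta  = x2+x3+x6+x7 = 0 mod 2
    x[1] = (x[3] + x[5] + x[7]) % 2    # gamma = x1+x3+x5+x7 = 0 mod 2
    return x[1:]

def correct(block):
    """Recompute alpha, beta, gamma from the received block; the binary
    number alpha-beta-gamma locates the single error (0 means no error)."""
    x = [0] + list(block)
    alpha = (x[4] + x[5] + x[6] + x[7]) % 2
    beta  = (x[2] + x[3] + x[6] + x[7]) % 2
    gamma = (x[1] + x[3] + x[5] + x[7]) % 2
    pos = 4 * alpha + 2 * beta + gamma
    if pos:
        x[pos] ^= 1                    # flip the erroneous digit
    return x[1:]
```

Flipping any single one of the seven received digits is detected and corrected, which is the "completely self-correcting" property claimed for the assumed noise of at most one error per block of seven.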
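The exchange expressed by equations (1) and (2) can be checked numerically; the power and noise-density figures below are hypothetical. With N = N0 W, widening the band raises C toward the asymptote (P/N0) log e (log base 2 here, since C is in binary digits per second):

```python
from math import log2, e

def capacity(W, P, N0):
    """Equation (2): C = W log2(1 + P / (N0 * W)), binary digits/sec."""
    return W * log2(1 + P / (N0 * W))

P, N0 = 1.0, 1e-3            # hypothetical power and noise power density
C_narrow = capacity(1_000, P, N0)   # 1 kHz band: C = 1000 * log2(2)
C_wide = capacity(1e9, P, N0)       # very wide band
asymptote = (P / N0) * log2(e)      # limiting value as W grows
```

The narrow-band figure shows the exchange: at W = 1000 the signal-to-noise ratio P/N is 1, while a wider band trades a far smaller ratio for the same or greater capacity, never exceeding the asymptote.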
