
Submitted in partial fulfillment of the award of Degree of Bachelor of Technology

TELEVISION STANDARD AND COMMUNICATION SYSTEM


USED IN DOORDARSHAN
AT


DOORDARSHAN KENDRA PATNA
FROM 4 DEC TO 24 DEC 2012


ABHISHEK PRASAD (9910005003)











In-Plant Training Report

Submitted By

ABHISHEK PRASAD (9910005003)

In partial fulfillment
of
Bachelor of Technology
in
Electronics and Communication Engineering

From 4/12/2012 to 24/12/2012
Kalasalingam University

(Kalasalingam Academy of Research and Education)
Krishnankoil-626190







CONTENTS

COMPANY PROFILE

TELEVISION PRINCIPLE AND SCANNING

CAMERAS

COLOR COMPOSITE VIDEO SIGNAL

TELEVISION STUDIO

TRANSMITTER

CONCLUSION

RESOURCES










COMPANY PROFILE
DOORDARSHAN (Hindi; literally "distant vision") is an Indian public service
broadcaster, a division of Prasar Bharati. It is one of the largest broadcasting
organizations in India in terms of the infrastructure of studios and transmitters. Recently,
it has also started Digital Terrestrial Transmitters. On September 15, 2009, Doordarshan
celebrated its 50th anniversary. DD provides television, radio, online and mobile
services throughout metropolitan and regional India, as well as overseas through the
Indian Network and Radio India.
Doordarshan had a modest beginning with the experimental telecast starting in Delhi on
15 September 1959 with a small transmitter and a makeshift studio. The regular daily
transmission started in 1965 as a part of All India Radio. The television service was
extended to Bombay (now Mumbai) and Amritsar in 1972. Up until 1975, only seven
Indian cities had a television service and Doordarshan remained the sole provider of
television in India. Television services were separated from radio on 1 April 1976, and
each office of All India Radio and Doordarshan was placed under the management of
two separate Directors General in New Delhi. Finally, in 1982, Doordarshan came into
existence as a national broadcaster.
In 1982, colour TV was introduced in the Indian market with the live telecast of
the Independence Day speech by then prime minister Indira Gandhi on 15 August 1982,
followed by the 1982 Asian Games held in Delhi. Today there are more than 1,400
terrestrial transmitters and about 46 Doordarshan studios producing TV programs.
Presently, Doordarshan operates 21 channels: two all-India channels (DD National and
DD News), 11 Regional Language Satellite Channels (RLSC), four State Networks (SN),
an international channel, a sports channel (DD Sports) and two channels (Rajya Sabha
TV and DD-Lok Sabha) for live broadcast of parliamentary proceedings.



TELEVISION STANDARDS



There are three main television standards used throughout the world.
NTSC - National Television Standards Committee
Developed in the US and first used in 1954, NTSC is the oldest existing broadcast
standard. It uses 525 horizontal scan lines and a 60 Hz field rate. Only one
variant exists, known as NTSC M. It is sometimes irreverently referred to as "Never
Twice the Same Color."
SECAM - Système Électronique pour Couleur avec Mémoire
Developed in France and first used in 1967, SECAM uses 625 scan lines and a 50 Hz
field rate.
PAL - Phase Alternating Line
Developed in Germany and first used in 1967. A variant of NTSC, PAL uses a 625-line,
50 Hz display. THE TELEVISION STANDARD USED IN INDIA IS PAL.

PAL ENCODER AND DECODER:-
The gamma-corrected RGB signals are combined in the Y-matrix to form the Y signal. The
U-V matrix combines the R, B and Y signals to obtain R-Y and B-Y, which are weighted
by the factors 0.877 for R-Y and 0.493 for B-Y to prevent over-modulation on saturated
colours. This gives:
Y = 0.30R + 0.59G + 0.11B    U = 0.493(B-Y)    V = 0.877(R-Y)
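The matrixing described above can be sketched in a few lines. This is a minimal illustration using the standard PAL weighting factors (0.493 for B-Y and 0.877 for R-Y); the function name and the 0..1 signal range are chosen for the example.

```python
def rgb_to_yuv(r, g, b):
    """Convert gamma-corrected RGB (each 0..1) to PAL Y, U, V.

    Uses the standard PAL weighting factors (0.493 for B-Y, 0.877 for
    R-Y), which keep the modulated chrominance within limits on
    saturated colours.
    """
    y = 0.30 * r + 0.59 * g + 0.11 * b   # luminance (Y matrix)
    u = 0.493 * (b - y)                  # weighted B-Y colour difference
    v = 0.877 * (r - y)                  # weighted R-Y colour difference
    return y, u, v

# Pure white carries no chrominance: U and V both come out (near) zero.
y, u, v = rgb_to_yuv(1.0, 1.0, 1.0)
```

Note that any grey input (R = G = B) gives zero U and V, which is what lets a monochrome receiver ignore the chrominance entirely.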
The PAL encoder takes the primary colours as input to a matrix circuit which generates
the (R-Y) and (B-Y) signals; the matrix section also inserts a luminance delay line of
about 0.6 µs. The defining feature of the PAL encoder is the line-by-line alternation of
the phase of the V component, controlled by clock pulses from the sync pulse generator.
The (R-Y) and (B-Y) signals are passed through low-pass filters to band-limit them. The
modulation scheme is balanced quadrature (suppressed-carrier) modulation, which
generates the Chroma-V and Chroma-U signals; these are summed in an adder block.
The sync pulse generator produces the 4.43 MHz subcarrier, the 7.8 kHz ident signal,
the burst key and the sync pulses. The subcarrier and the ident signal pass through a
180-degree phase shifter whose output is further shifted by +/-45 degrees, giving 135
and 225 degrees respectively; this signal is fed to the burst generator, where the burst
key from the sync pulse generator gates it. The final output is taken from the adder
block, whose inputs are the delayed luminance signal, the burst signal, the sync signal
and the signal from the balanced quadrature modulator.
The generation of the burst is important because it provides the phase reference needed
to decode the colour information: the burst establishes the reference against which the
Chroma-U and Chroma-V phases are measured. In the absence of the burst the receiver
cannot recover the colour, the picture falls back to black and white, and all the colour
information is lost.
The PAL decoder works in the opposite sense to the encoder. The input to the encoder is
the primary colour signals, while the input to the decoder is the signal transmitted by
the encoding system. The main difference is that the encoder uses phase shifters to
generate the phase-angle differences, while the decoder uses gates to generate the phase
angles.



Spectrum of a System I television channel with PAL


TELEVISION PRINCIPLES AND SCANNING



The basic principle rests on the electromagnetic spectrum, which runs from
low-frequency radio waves through VHF TV, FM radio and UHF TV (which now
includes the new digital TV band of frequencies) all the way up to X-rays. The
visible-light portion of the spectrum consists of all the colours of the rainbow, which
combine to produce white light. The fact that white light consists of all colours of light
added together can be demonstrated with a prism.
Displaying an image
A cathode-ray tube (CRT) television displays an image by scanning a beam
of electrons across the screen in a pattern of horizontal lines known as a raster. At the
end of each line the beam returns to the start of the next line; at the end of the last line it
returns to the top of the screen. As it passes each point the intensity of the beam is
varied, varying the luminance of that point. A color television system is identical except
that an additional signal known as chrominance controls the color of the spot.




Raster scanning is shown in a slightly simplified form below.



When analog television was developed, no affordable technology for storing video
signals existed; the luminance signal had to be generated and transmitted at the same
time as it was displayed on the CRT. It is therefore essential to keep the raster
scanning in the camera (or other device for producing the signal) in
exact synchronization with the scanning in the television.
The physics of the CRT require that a finite time interval be allowed for the spot to
move back to the start of the next line (horizontal retrace) or the start of the screen
(vertical retrace). The timing of the luminance signal must allow for this.
The human eye has a characteristic called persistence of vision: quickly displaying
successive scanned images creates the illusion of smooth motion. Flickering of
the image can be partially solved using a long-persistence phosphor coating on the CRT,
so that successive images

fade slowly. However, slow phosphor has the negative side-effect of causing image
smearing and blurring when there is a large amount of rapid on-screen motion occurring.
The maximum frame rate depends on the bandwidth of the electronics and the
transmission system, and the number of horizontal scan lines in the image. A frame rate
of 25 or 30 hertz is a satisfactory compromise, while the process of interlacing two video
fields of the picture per frame is used to build the image. This process doubles the
apparent number of video fields per second and further reduces flicker and other defects
in transmission.
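The relationship between frame rate, field rate and line frequency described above is simple arithmetic, sketched here for the two common systems (the function name is illustrative; exact NTSC actually runs at 29.97 frames per second):

```python
# Line and field rates follow directly from the line count and frame rate.
def scan_rates(lines_per_frame, frames_per_sec):
    line_freq = lines_per_frame * frames_per_sec  # horizontal scan frequency, Hz
    field_rate = 2 * frames_per_sec               # interlacing: two fields per frame
    return line_freq, field_rate

print(scan_rates(625, 25))  # PAL:  (15625, 50)
print(scan_rates(525, 30))  # NTSC: (15750, 60) -- exact NTSC uses 29.97 fps
```

The doubled field rate is exactly the "apparent number of video fields per second" that interlacing buys: the eye sees 50 (or 60) refreshes per second while only 25 (or 30) full frames are transmitted.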
Receiving signals:-
The television system for each country will specify a number of television
channels within the UHF or VHF frequency ranges. A channel actually consists of two
signals: the picture information is transmitted using amplitude modulation on one
frequency, and the sound is transmitted with frequency modulation at a fixed offset
(typically 4.5 to 6 MHz) from the picture signal.
The channel frequencies chosen represent a compromise between allowing
enough bandwidth for video (and hence satisfactory picture resolution), and allowing
enough channels to be packed into the available frequency band. In practice a technique


called vestigial sideband is used to reduce the channel spacing, which would be at least
twice the video bandwidth if pure AM was used.
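The saving from vestigial sideband can be put in numbers. The figures below are illustrative, assuming a 625-line System B channel with 5 MHz video bandwidth and a 0.75 MHz retained vestige of the lower sideband:

```python
# Channel-width saving from vestigial sideband (illustrative figures,
# assumed for a 625-line System B channel).
video_bw = 5.0                    # MHz of video bandwidth
vestige  = 0.75                   # MHz of the lower sideband retained
dsb_width = 2 * video_bw          # what pure double-sideband AM would need
vsb_width = video_bw + vestige    # full upper sideband plus the vestige
print(dsb_width, vsb_width)       # 10.0 vs 5.75 MHz, before sound and guard bands
```

The vestigial scheme thus nearly halves the vision bandwidth, which is what allows more channels to be packed into the available band.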
Signal reception is invariably done via a superheterodyne receiver: the first stage is
a tuner which selects a television channel and frequency-shifts it to a fixed intermediate
frequency (IF). The IF stages then amplify the signal from the microvolt range up to
fractions of a volt.

Extracting the sound:-
At this point the IF signal consists of a video carrier wave at one frequency and the
sound carrier at a fixed offset. A demodulator recovers the video signal and sound as an
FM signal at the offset frequency (this is known as intercarrier sound).
The FM sound carrier is then demodulated, amplified, and used to drive a loudspeaker.
Until the advent of the NICAM and MTS systems, television sound transmissions were
invariably monophonic.







COLOUR COMPOSITE VIDEO SIGNAL


The video carrier is demodulated to give a composite video signal; this contains
luminance, chrominance and synchronization signals; this is identical to the video signal
format used by analog video devices such as VCRs or CCTV cameras. Note that the RF
signal modulation is inverted compared to conventional AM: the minimum video
signal level corresponds to maximum carrier amplitude, and vice versa. The carrier is
never shut off altogether; this ensures that intercarrier sound demodulation can
still occur.

Each line of the displayed image is transmitted using a signal as shown above. The same
basic format (with minor differences mainly related to timing and the encoding of color)
is used for PAL, NTSC and SECAM television systems. A monochrome signal is
identical to a color one, with the exception that the elements shown in color in the
diagram (the color burst, and the chrominance signal) are not present.
The front porch is a brief (about 1.5 microsecond) period inserted between the end of
each transmitted line of picture and the leading edge of the next line sync pulse.
Its purpose was to allow voltage levels to stabilize in older televisions, preventing
interference between picture lines. The front porch is the first component of
the horizontal blanking interval which also contains the horizontal sync pulse and
the back porch.


The back porch is the portion of each scan line between the end (rising edge) of the
horizontal sync pulse and the start of active video. It is used to restore the black-level
(300 mV) reference in analog video. In signal processing terms, it compensates for
the fall time and settling time following the sync pulse.

In color television systems such as PAL and NTSC, this period also includes
the colorburst signal. In the SECAM system it contains the reference subcarrier for each
consecutive color difference signal in order to set the zero-color reference.
In some professional systems, particularly satellite links between locations, the audio is
embedded within the back porch of the video signal, to save the cost of renting a second
channel.
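The porches and sync pulse together form the horizontal blanking budget of each line. A small sketch for a 625-line PAL line: the 1.5 µs front porch and 4.7 µs sync width come from this document, while the 5.8 µs back porch is an assumed nominal value.

```python
# Nominal horizontal timing budget for one 625-line PAL scan line.
LINE_US        = 64.0   # total line period = 1 / 15625 Hz
FRONT_PORCH_US = 1.5    # from the text above
SYNC_US        = 4.7    # horizontal sync width given later in the document
BACK_PORCH_US  = 5.8    # assumed nominal value; includes the colour burst

blanking = FRONT_PORCH_US + SYNC_US + BACK_PORCH_US
active   = LINE_US - blanking
print(round(blanking, 1), round(active, 1))  # 12.0 µs blanked, 52.0 µs of picture
```

Roughly 12 µs of every 64 µs line is thus invisible, reserved for retrace and the references carried on the back porch.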
Monochrome video signal extraction:-
The luminance component of a composite video signal varies between 0 V and
approximately 0.7 V above the "black" level. In the NTSC system, there is
a blanking signal level used during the front porch and back porch, and a black signal
level about 54 mV (7.5 IRE) above it; in PAL and SECAM the two levels are identical.
In a monochrome receiver the luminance signal is amplified to drive the control grid in
the electron gun of the CRT. This changes the intensity of the electron beam and
therefore the brightness of the spot being scanned. Brightness and contrast controls
determine the DC shift and amplification, respectively.
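The last sentence can be written down directly: brightness is a DC shift and contrast is a gain. A toy sketch (function and parameter names are illustrative):

```python
def apply_controls(luma, brightness=0.0, contrast=1.0):
    """Brightness adds a DC shift; contrast scales the signal amplitude."""
    return [contrast * v + brightness for v in luma]

# A black/grey/white ramp, brightened slightly with doubled contrast:
print(apply_controls([0.0, 0.5, 1.0], brightness=0.1, contrast=2.0))
```

In a real receiver the same two operations are applied in the analogue domain before the signal reaches the CRT's control grid.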
Color video signal extraction:-
A color signal conveys picture information for each of the red, green, and blue
components of an image.
However, these are not simply transmitted as three separate signals, because:
Such a signal would not be compatible with monochrome receivers (an important
consideration when color broadcasting was first introduced);
it would occupy three times the bandwidth of existing television, requiring a decrease in
the number of television channels available; and,
typical problems with signal transmission (such as differing received signal levels
between different colors) would produce unpleasant side effects.


Instead, the RGB signals are converted into Y, U, V form, where the Y signal represents
the overall brightness, and can be transmitted as the luminance signal. This ensures a
monochrome receiver will display a correct picture. The U and V signals are the
difference between the Y signal and the B and R signals respectively. The U signal then
represents how "blue" the color is, and the V signal how "red" it is. The advantage of
this scheme is that the U and V signals are zero when the picture has no color content.
Since the human eye is more sensitive to errors in luminance than in color, the U and V
signals can be transmitted in a relatively lossy (specifically: bandwidth-limited) way
with acceptable results. The G signal is not transmitted in the YUV system, but rather it
is recovered electronically at the receiving end.
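The electronic recovery of G mentioned above is just the Y matrix run in reverse. This sketch assumes the standard PAL weighting factors, U = 0.493(B-Y) and V = 0.877(R-Y):

```python
def recover_g(y, u, v):
    """Recover the green signal at the receiver.

    A sketch assuming the standard PAL weightings
    U = 0.493(B - Y) and V = 0.877(R - Y).
    """
    r = y + v / 0.877                         # undo the R-Y weighting
    b = y + u / 0.493                         # undo the B-Y weighting
    return (y - 0.30 * r - 0.11 * b) / 0.59   # invert the Y matrix for G
```

Because G carries the largest share of Y (0.59), recovering it this way rather than transmitting it keeps the three transmitted signals down to one full-bandwidth luminance channel plus two narrow colour-difference channels.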
The two signals (U and V) modulate both the amplitude and phase of the color carrier,
so to demodulate them it is necessary to have a reference signal against which to
compare it. For this reason, a short burst of reference signal known as the color burst is
transmitted during the back porch (re-trace period) of each scan line. A reference
oscillator in the receiver locks onto this signal (see phase-locked loop) to achieve a
phase reference, and uses its amplitude to set an AGC system to achieve an amplitude
reference.
The U and V signals are then demodulated by band-pass filtering to retrieve the color
subcarrier, mixing it with the in-phase and quadrature signals from the reference
oscillator, and low-pass filtering the results.
Synchronization:-
Synchronizing pulses added to the video signal at the end of every scan line and video
frame ensure that the sweep oscillators in the receiver remain locked in step with the
transmitted signal, so that the image can be reconstructed on the receiver screen.
A sync separator circuit detects the sync voltage levels and sorts the pulses into
horizontal and vertical sync.
Horizontal synchronization:-
The horizontal synchronization pulse (horizontal sync, or HSYNC) separates the scan
lines. The horizontal sync signal is a single short pulse which indicates the start of every
line. The rest of the scan line follows, with the signal ranging from 0.3 V (black) to 1 V
(white), until the next horizontal or vertical synchronization pulse.


The format of the horizontal sync pulse varies. In the 525-line NTSC system it is a
4.85 µs pulse at 0 V. In the 625-line PAL system the pulse is a 4.7 µs
synchronization pulse at 0 V. This is lower than the amplitude of any video signal
(blacker than black) so it can be detected by the level-sensitive "sync stripper" circuit of
the receiver.
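The level-sensitive "sync stripper" amounts to a threshold between the sync-tip level (0 V) and black (0.3 V). A toy illustration on sampled voltages (real receivers use an analogue sync separator with clamping):

```python
# A "sync stripper" in miniature: anything below the midpoint between
# sync tip (0 V) and black level (0.3 V) is treated as sync.
SYNC_THRESHOLD_V = 0.15

def strip_sync(samples):
    """Return a pulse train: True where the composite signal is sync."""
    return [s < SYNC_THRESHOLD_V for s in samples]

line = [0.0, 0.0, 0.3, 0.3, 0.7, 1.0, 0.3]   # sync tip, porch, then picture
print(strip_sync(line))  # [True, True, False, False, False, False, False]
```

Because picture content never goes below 0.3 V, the "blacker than black" sync pulses are the only samples that cross the threshold.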
Vertical synchronization:-
Vertical synchronization (Also vertical sync or VSYNC) separates the video fields. In
PAL and NTSC, the vertical sync pulse occurs within the vertical blanking interval. The
vertical sync pulses are made by prolonging the length of HSYNC pulses through almost
the entire length of the scan line.
The vertical sync signal is a series of much longer pulses, indicating the start of a new
field. The sync pulses occupy the whole of line interval of a number of lines at the
beginning and end of a scan; no picture information is transmitted during vertical
retrace. The pulse sequence is designed to allow horizontal sync to continue during
vertical retrace; it also indicates whether each field represents even or odd lines in
interlaced systems (depending on whether it begins at the start of a horizontal line, or
mid-way through).
The format of such a signal in 525-line NTSC is:
pre-equalizing pulses (6 to start scanning odd lines, 5 to start scanning even lines)
long-sync pulses (5 pulses)
post-equalizing pulses (5 to start scanning odd lines, 4 to start scanning even lines)
Each pre- or post-equalizing pulse consists of half a scan line of black signal: 2 µs at
0 V, followed by 30 µs at 0.3 V.
Each long sync pulse consists of an equalizing pulse with the timings inverted: 30 µs at
0 V, followed by 2 µs at 0.3 V.
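The pulse sequence above can be assembled as a list of half-line intervals. This sketch uses the counts and timings quoted for the odd field (2 µs low / 30 µs high for equalizing pulses, and the inverse for long sync pulses):

```python
# Half-line pulses of the 525-line vertical interval, odd field, using the
# counts and timings given above.
EQ   = (2, 30)   # (µs at 0 V, µs at 0.3 V): an equalizing pulse
LONG = (30, 2)   # a long (broad) sync pulse inverts those timings

def odd_field_interval():
    return [EQ] * 6 + [LONG] * 5 + [EQ] * 5   # pre-eq, long-sync, post-eq

pulses = odd_field_interval()
print(len(pulses))                  # 16 half-line intervals
print(sum(lo for lo, _ in pulses))  # 172 µs spent at sync level in total
```

Keeping pulses at half-line spacing is what lets the horizontal oscillator stay locked right through the vertical interval, as the text notes.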
In video production and computer graphics, changes to the image are often kept in step
with the vertical synchronization pulse to avoid visible discontinuity of the image. Since
the frame buffer of a computer graphics display imitates the dynamics of a cathode-ray
display, if it is updated with a new image while the image is being transmitted to the
display, the display shows a mishmash of both frames, producing a page
tearing artifact partway down the image.


Vertical synchronization eliminates this by timing frame buffer fills to coincide with
the vertical blanking interval, thus ensuring that only whole frames are seen on-screen.
Software such as video games and computer aided design (CAD) packages often allow
vertical synchronization as an option, because it delays the image update until the
vertical blanking interval. This produces a small penalty in latency, because the program
has to wait until the video controller has finished transmitting the image to the display
before continuing. Triple buffering reduces this latency significantly.
Two timing intervals are defined: the front porch, between the end of displayed video
and the start of the sync pulse, and the back porch, after the sync pulse and before
displayed video. These, together with the sync pulse itself, are called the horizontal
blanking (or retrace) interval and represent the time that the electron beam in the CRT is
returning to the start of the next display line.













CAMERAS


Studio Cameras
The studio television camera is the beginning of the video signal. It is here that visible
light is transformed or transduced into electrical energy. The video signal remains in the
form of electrical energy, either analog or digital, for most of the remaining process until
a picture monitor (TV set) converts the electrical signal back into visible light. The
principle parts of the studio camera are; the camera head (including lens, imaging
device, and viewfinder), the camera mount, and the studio pedestal.
The Parts of Camera
Lens: The external optics are designed to collect and focus the light onto the face of the
imaging device. The lens contains focusing, focal-length, and aperture controls. The first
two adjustments are made by the camera operator at the camera head, while the aperture
is typically controlled by the video engineer at the CCU. The cameras at KTSC-TV have
servo controls for zoom and manual controls for focus. The servo zoom control, which
provides smooth and variable-speed zooms with a little practice, is located on the right
pan handle, while the focus control is located on the left pan handle. On a properly
maintained camera and lens, focus should be set with the lens at maximum focal
length. Once set, the lens will maintain accurate focus throughout the zoom range as
long as the distance between subject and lens does not change.
Imaging Devices: The internal optics, including the beam splitter, are housed in the
camera body. KTSC-TV's Hitachi Z-One B cameras employ CCD (Charge-Coupled
Device) imaging devices and are immune to the problem of image retention and burn-in.
View Finder: The monochrome (black and white) monitor on top of the camera head is
your window on the world. And while it provides no information about the colors being
reproduced, it is an accurate display for the purpose of framing, focus and composition.
The angle of the VF is adjustable to provide optimum viewing regardless of the height of
the camera or the height of the operator. The VF has contrast and brightness controls and
should be adjusted for your particular situation. These controls do not in any way affect
the video output of the camera.


The Camera Mount


The camera is attached to a head which is in turn attached to the camera support--in our
case a tripod and dolly combination. Types of professional camera heads include cam
heads and fluid heads. Both allow for smooth pans and tilts. However, the smoothness of
these movements is determined in part by the operator's proficiency and muscular
coordination. Hours of practice are necessary before one can be fully proficient with
camera moves worthy of "on-air" service. Please be aware of the location and use of the
pan and tilt locks and tension adjustments. Never try to operate the camera head with the
locks engaged, or with the tension adjustments tightened. Whenever the operator is at
the camera, both the pan and tilt adjustments should be unlocked and loose enough so
that the camera movements can be executed smoothly and quickly according to the
director's wishes. Before the operator leaves the camera, even for a moment, the pan and
tilt should be locked securely. Please follow these directions carefully!
TV STUDIO
Main parts of T.V studio are:
Action area
Production control room (P.C.R)
Master switching room (M.S.R)
Control apparatus room(C.A.R)
Character generator(C.G)
Video tape recording(V.T.R)
Earth station(E.R)
ACTION AREA
This is the area where the artists perform the action, i.e. the program which is to be
broadcast. The main parts of the action area are as follows:
Camera head unit
Floor preparation
Audio connector boxes
Lighting drums, lighting stands and colour lights
Resonance minimizers, i.e. perforated walls and blankets
Talk-back management


Camera head unit consists of the following :-
1. Lens assembly
2. Dichroic mirror
3. Focusing arrangement
4. Servo management
5. Back flow
PRODUCTION CONTROL ROOM
The P.C.R is also known as the studio control room. The P.C.R includes:
Video monitors which display the PROGRAM, VTRs, CAMERAS, GRAPHICS
and other video sources. The monitor wall consists of a series of television sets or
computer monitors capable of displaying multiple sources.
A vision mixer, a large control panel used to select the video sources to be seen on
air and, in many cases, on any monitor on the set.
An audio mixing console and other audio equipment to add the audio to the video
signals.
Digital video effects, or DVE, for the manipulation of video sources. In newer vision
mixers the DVE is integrated into the vision mixer.
The technical director, who watches the waveforms of the video signals on the CCU
waveform monitors and vectorscope and directs the camera operators accordingly to
maintain high-quality video signals.

MASTER CONTROL ROOM


The MCR houses equipment that is too noisy or runs too hot for the production
control room. It also keeps cable lengths and installation requirements
manageable, since most high-quality wiring runs only between
devices in these rooms. This includes:
The actual circuitry and connection boxes of the vision mixer, DVE and character
generator.
Camera control units
VTRs
Patch panels for the reconfiguration of the wiring between the various pieces of
equipment.
It also controls the on-air signal. It may include controls to play back programs, switch
local or network feeds and satellite feeds, and monitor the transmitter.
CHARACTER GENERATORS
It creates the majority of the names and graphics that are to be inserted into programs.
CONTROL APPARATUS ROOM
It includes the power control room, UPS room and generator for uninterrupted
power supply.
VIDEO TAPE RECORDER
A video tape recorder is a tape recorder that can record video material, usually
on magnetic tape. VTRs originated as individual tape reels, serving as a
replacement for motion picture film stock and making recording for television
applications cheaper and quicker. An improved form encloses the tape within a
video cassette, used by the video cassette recorder (VCR).
VIDEO CASSETTE RECORDER
The video cassette recorder is a type of electro-mechanical device that uses a removable
video cassette containing magnetic tape for recording the audio and video of a
television broadcast, so that image and sound can be played back at a
convenient time. This facility afforded by a VCR machine is commonly referred to
as television program TIME SHIFTING.
EARTH STATION
An earth station is a terrestrial radio station designed for extra-planetary
telecommunication with spacecraft, or for reception of radio waves from an
astronomical radio source. Earth stations are located either on the surface of the
Earth or within Earth's atmosphere. Earth stations communicate with spacecraft by
transmitting and receiving radio waves in the super-high-frequency or
extremely-high-frequency bands. When an earth station successfully transmits radio
waves to a spacecraft, it establishes a telecommunications link.
Specialized satellite earth stations are used to communicate with satellites, chiefly
communications satellites. Other earth stations communicate with manned space
stations or unmanned space probes. An earth station that primarily receives
telemetry data, or that follows a satellite not in geostationary orbit, is called a
tracking station.
When a satellite is within an earth station's line of sight, the earth station is said to
have a view of the satellite. It is possible for a satellite to communicate with more than
one earth station at a time. A pair of earth stations is said to have a satellite in mutual
view when the stations share simultaneous, unobstructed line-of-sight contact with
the satellite.



UPLINK:-
Pertaining to satellite communications, an uplink (UL or U/L) is the portion of a
communications link used for the transmission of signals from an Earth terminal to
a satellite or to an airborne platform. An uplink is the inverse of a downlink. An
uplink or downlink is distinguished from reverse link or forward link.
Pertaining to GSM and cellular networks, the radio uplink is the transmission path
from the mobile station (cell phone) to a base station (cell site). Traffic and
signaling flows within the BSS and NSS may also be identified as uplink and
downlink.
Pertaining to computer networks, an uplink is a connection from data
communications equipment toward the network core. This is also known as
an upstream connection.

DOWNLINK:-
In the context of satellite communications, a downlink (DL) is the link from a
satellite to a ground station.
Pertaining to cellular networks, the radio downlink is the transmission path from a
cell site to the cell phone. Traffic and signalling flows within the base station
subsystem (BSS) and network switching subsystem (NSS) may also be identified
as uplink and downlink.
Pertaining to computer networks, a downlink is a connection from data
communications equipment towards data terminal equipment. This is also known
as a downstream connection.







TRANSMITTER:-



In electronics and telecommunications a transmitter or radio transmitter is
an electronic device which, with the aid of an antenna, produces radio waves. The
transmitter itself generates a radio-frequency alternating current, which is applied
to the antenna. When excited by this alternating current, the antenna radiates radio
waves. In addition to their use in broadcasting, transmitters are necessary
component parts of many electronic devices that communicate by radio, such
as cell phones, wireless computer networks, Bluetooth enabled devices, garage
door openers, two-way radios in aircraft, ships, and spacecraft, radar sets, and
navigational beacons. The term transmitter is usually limited to equipment that
generates radio waves for communication purposes; or radiolocation, such
as radar and navigational transmitters. Generators of radio waves for heating or
industrial purposes, such as microwave ovens or diathermy equipment, are not
usually called transmitters even though they often have similar circuits.
The term is popularly used more specifically to refer to a broadcast transmitter, a
transmitter used in broadcasting, as in FM radio transmitter or television
transmitter. This usage usually includes both the transmitter proper, the antenna,
and often the building it is housed in. An unrelated use of the term is in
industrial process control, where a "transmitter" is a telemetry device which


converts measurements from a sensor into a signal, and sends it, usually via wires,
to be received by some display or control device located a distance away.


Block diagram of a TV transmitter (inter carrier method).





CONCLUSION
It was a wonderful experience to be a part of the in-plant training at PRASAR
BHARATI, DOORDARSHAN KENDRA, Patna. Being a part of this training, I
was very pleased to see the technology that is able to broadcast video to
almost every home in India.
I learnt how video is captured, how it is transmitted without distortion of
the video signal, what the different ways to transmit these signals are, what a
channel is, and many other such things.
I thank my guide at DOORDARSHAN KENDRA, PATNA, Er. N.K. SINGH,
for his teaching and for demonstrating practical work along the way.



RESOURCES
www.shareslides.com
www.google.com
www.howstuffworks.com
http://www.scribd.com/doc/44694977/Modern-Communication-Systems-PART-1-TELEVISION
