
A REPORT ON PRASAR BHARATI (BROADCASTING CORPORATION OF INDIA) DOORDARSHAN KENDRA, LUCKNOW

Submitted by: SYED MOHD MEHNDI, Roll No. 1005231051, Electronics and Communication Engineering, I.E.T. Lucknow


ACKNOWLEDGEMENT

I would like to extend my heartfelt thanks and deep sense of gratitude to all those who helped me, directly or indirectly, in preparing this report. I would like to express my sincere thanks to the respected Assistant Engineer Mr. R.K. Naithani, and I also express my gratitude to Mr. P.K. Tripathi, Mr. Alok and Mrs. Suman, who helped me a great deal in understanding the various processes and concepts involved. It was a great experience working in the DD Kendra and learning from such experienced engineers with hands-on expertise in the subject; their expert guidance and suggestions helped me prepare this report. It gives me immense pleasure to convey my thanks to all the staff members who helped in completing this report. I would also like to express my thanks to my friends; I am indebted to them for their valuable support and co-operation.

SYED MOHD MEHNDI Roll no. 1005231051 I.E.T Lucknow


TABLE OF CONTENTS

1. Overview
2. TV Camera
3. Composite (CCVS) Interface
4. Camera and its Base Station
5. Vision Mixer
6. Video Tape Recorder (VTR)
7. Earth Station Simulcast
8. Transmitter


OVERVIEW
Introduction to Doordarshan Lucknow

Lucknow Doordarshan started functioning on 27th November 1975 with an interim setup at 22, Ashok Marg, Lucknow. The colour transmission service of the National Channel (transmitter only) started on 15-8-82, while regular colour transmission from the studio started in 1984 with ENG gadgets. During the Reliance Cup, an OB van came to the Kendra for outdoor telecasts, with a complement of four colour camera chains, recording equipment and a portable microwave link. In March 1989 the new studio complex started functioning. An EFP van came to DDK Lucknow in 1989 with a complement of three colour camera chains and a recording setup for outdoor telecasts. The entire recording setup of the studio and vans has been replaced with Beta-format high-band edit VCRs, which are still in use as the old recordings are on high band.

The UP Regional Service telecast, with up-linking facility from the studio (DDK Lucknow), started in January 1998 on INSAT-2B. This service was shifted to INSAT-2D(T) / ARABSAT on 14-7-98. News feeds are up-linked to Delhi occasionally from the Lucknow Earth Station. The studio programme is transmitted from the 10 kW TV transmitter installed at Hardoi Road through a studio-transmitter microwave link. Besides this, one 16-foot PDA is installed at the TV transmitter site to receive the downlink signal of the Regional Service telecast from the studio via ARABSAT on INSAT-2D(T). The site at 22, Ashok Marg, Lucknow is being utilized by the Doordarshan Training Institute (for staff training), which has one studio (12 m x 6 m) and a colour camera chain. The DTI Lucknow was inaugurated in September 1995.

In the beginning, only development programmes were telecast, but later, to enlighten viewers as per their needs and expectations, many more informative, educative and entertaining programmes were introduced from time to time. Lucknow Doordarshan has produced some of the best programmes in the country, such as "BIBI NATIYON WALI", "NEEM KA PED" and "HATIM TAI", to entertain a cross-section of society.


Technical Overview
DDK Lucknow has the following main departments, which manage the production, storage, transmission and maintenance of the two DD National channels:

1. STUDIO
2. PRODUCTION CONTROL ROOM (PCR)
3. VIDEO STORAGE AND TRANSMISSION ROOM (VTR)
4. MAIN SWITCHING ROOM (MSR)
5. DIGITAL EARTH LINK STATION
6. TRANSMITTER

Each of these departments is discussed in detail below, with due stress on its relevant engineering aspects.

The studio has:

Cameras, lights and other equipment required for production of a feed

Camera Control Unit (CCU)

It is in the studio that all aspects related to the production of a video take place. The DDK has two large studios and a small studio for news production. The PCR is where post-production activities, such as minor editing of the feed during a live programme, take place. The production manager sits in the PCR and, during the production stage, directs the cameramen and selects the angles, sound parameters and so on. It is in the PCR that all the studio lights, microphones and other aspects are controlled. The PCR has a vision mixer and an audio mixer; their working and other aspects are discussed in detail in the following pages. The phone-in console and other systems are also kept in the PCR.

The VTR is the next section, where copies of all programmes are stored. All the programmes shot by the cameras are simultaneously recorded in the VTR. The VTR also plays back all the videos as and when required. Videos of prerecorded events are queued up in the VTR and played back without a break. Videos of famous people and important events are stored in the central film pool.

The MSR houses all the central circuitry of the DDK: all the camera base units, vision mixer base units and audio processor base units are kept in the MSR. The audio chain and video chain of the MSR are explained in detail later. The monitoring and control of all activities takes place in the MSR, and it is the MSR which decides what is to go on air. The MSR also performs some additional functions like logo addition.

The next station is the earth station, which has an uplink chain, simulcast transmitters, audio processors, video processors, up converters, modulators and so on. The earth station operates in a fully digital domain. The last stage is the transmitter, which has the antenna and facilities for terrestrial transmission.


TV Camera

INTRODUCTION: A TV camera consists of three sections:
a) Camera lens and optics: to form an optical image on the face plate of a pickup device
b) A transducer or pickup device: to convert the optical image into an electrical signal
c) Electronics: to process the output of the transducer to get a CCVS signal

TYPES OF PICK-UP DEVICES

There are three types of pickup devices, based on:
a) Photo-emissive material: These materials emit electrons when light falls on them; the number of emitted electrons depends on the light. The monochrome cameras used in Doordarshan were based on this material and were called Image Orthicon cameras. These cameras were bulky and needed a lot of light, and are no longer in use at present.
b) Photo-conductive material: The conductivity of these materials changes with the amount of light falling on them. Such a material with variable conductivity is made part of an electrical circuit, and the voltage developed across it is recovered as the electrical signal. Earlier cameras based on this principle were Vidicon cameras, often used in the monochrome television chain. These cameras had serious lag and other problems relating to dark currents. Improvements in these cameras led to the development of Plumbicon and Saticon cameras.
c) Charge coupled devices: These are semiconductor devices which convert light into a charge image, which is then collected at high speed to form a signal. Most TV studios are now using CCD cameras instead of tube cameras; tube cameras have become obsolete and are not in use.

Camera sensors: CCD basics

The CCD is a solid-state device using special integrated circuit technology, hence it is often referred to as a chip camera. The complete CCD sensor or chip has at least 450,000 picture elements or pixels, each pixel being basically an isolated (insulated) photodiode. The action of the light on each pixel is to cause electrons to be released, which are held by the action of a positive voltage.

Picture Basics

A television creates a continuous series of moving pictures on the screen. This section describes in detail how pictures are created in a television; a camera works on exactly the same principle applied the other way round. A picture is "drawn" on a television or computer display screen by sweeping an electrical signal horizontally across the display, one line at a time. The amplitude of this signal versus time represents the instantaneous brightness at that physical point on the display. At the end of each line, there is a portion of the waveform (the horizontal blanking interval) that tells the scanning circuit in the display to retrace to the left edge of the display and then start scanning the next line. Starting at the top, all of the lines on the display are scanned in this way. One complete set of lines makes a picture; this is called a frame. Once the first complete picture is scanned, there is another portion of the waveform (the vertical blanking interval) that tells the scanning circuit to retrace to the top of the display and start scanning the next frame, or picture. This sequence is repeated at a fast enough rate that the displayed images are perceived to have continuous motion. This is the same principle as that behind the "flip books" that you rapidly flip through to see a moving picture, or cartoons that are drawn and rapidly displayed one picture at a time.

Interlaced versus Progressive Scans

These are two different types of scanning systems. They differ in the technique used to cover the area of the screen. Television signals and compatible displays are typically interlaced, and computer signals and compatible displays are typically progressive (non-interlaced). These two formats are incompatible with each other; one would need to be converted to the other before any common processing could be done. In interlaced scanning, each picture, referred to as a frame, is divided into two separate sub-pictures, referred to as fields; two fields make up a frame. An interlaced picture is painted on the screen in two passes: first the horizontal lines of the first field are scanned, then the beam retraces to the top of the screen and scans the horizontal lines of the second field in between the first set. In the 525-line system, field 1 consists of lines 1 through 262 1/2, and field 2 consists of lines 262 1/2 through 525.
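A simplified sketch of how one frame splits into two interlaced fields, treating odd-numbered display lines as one field and even-numbered lines as the other (real systems split at half-lines and reserve some lines for blanking; the 625-line figure is the one adopted in India, described later):

```python
# Simplified interlacing model: one frame's display lines split into two fields.
lines_per_frame = 625
frame = list(range(1, lines_per_frame + 1))
field_1 = frame[0::2]   # odd-numbered lines, painted on the first pass
field_2 = frame[1::2]   # even-numbered lines, painted in between on the second pass
print(len(field_1), len(field_2))   # 313 and 312 lines together make one frame
```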


There are many different kinds of video signals, which can be divided into either television or computer types. The format of television signals varies from country to country. In the United States and Japan, the NTSC format is used. NTSC stands for National Television Systems Committee, which is the name of the organization that developed the standard. In Europe, the PAL format is common. PAL (phase alternating line), developed after NTSC, is an improvement over NTSC. SECAM is used in France and stands for sequentiel couleur avec memoire (sequential color with memory). It should be noted that there are a total of about 15 different subformats contained within these three general formats. Each of the formats is generally not compatible with the others. Although they all utilize the same basic scanning system and represent color with a type of phase modulation, they differ in specific scanning frequencies, number of scan lines, and color modulation techniques, among other things. The various computer formats (such as VGA, XGA, and UXGA) also differ substantially, the primary difference being the scan frequencies. These differences do not cause as much concern, because most computer equipment is now designed to handle variable scan rates. This compatibility is a major advantage for computer formats, in that media and content can be interchanged on a global basis. In India we use the PAL system; it has 625 lines in each frame and uses interlaced scanning.
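A quick worked check of the basic numbers this implies for the 625-line, 25 frame-per-second PAL system (the field and line rates below are standard figures, not values quoted in the report):

```python
# Basic scanning numbers for a 625-line, interlaced, 25 frame-per-second system
lines_per_frame = 625
frames_per_second = 25
fields_per_second = frames_per_second * 2                  # two interlaced fields per frame -> 50 Hz
line_frequency_hz = lines_per_frame * frames_per_second    # 15,625 lines scanned per second
line_period_us = 1e6 / line_frequency_hz                   # 64 microseconds per scan line
print(fields_per_second, line_frequency_hz, line_period_us)
```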


There are three basic levels of baseband signal interfaces. In order of increasing quality, they are composite (or CCVS), which uses one wire pair; Y/C (or S-video), which uses two wire pairs; and component, which uses three wire pairs. Each wire pair consists of a signal and a ground. These three interfaces differ in their level of information combination (or encoding). More encoding typically degrades the quality but allows the signal to be carried on fewer wires. Component has the least amount of encoding, and composite the most.
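Of the three, the component interface carries the least-encoded signals. As a minimal illustration of how the luma and color-difference components relate to RGB, the sketch below uses the ITU-R BT.601 coefficients commonly applied to standard-definition video; the coefficients are an assumption here, since the report does not specify them:

```python
def rgb_to_ypbpr(r, g, b):
    """Convert normalised RGB (0..1) to Y, Pb, Pr using the ITU-R BT.601
    coefficients commonly used for standard-definition component video.
    Illustrative sketch only; not taken from DDK equipment documentation."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    pb = 0.564 * (b - y)                    # scaled blue colour-difference signal
    pr = 0.713 * (r - y)                    # scaled red colour-difference signal
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # pure white -> Y ~ 1, Pb ~ Pr ~ 0
```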


Composite/CCVS Interface

Composite signals are the most commonly used analog video interface. Composite video is also referred to as CCVS (composite color video signal) or CVBS (color, video, blanking and sync), i.e. the composite video baseband signal. It combines the brightness information (luma), the color information (chroma), and the synchronizing signals on just one cable. The connector is typically an RCA jack; this is the same connector as that used for standard line-level audio connections. Consider the waveform of one horizontal scan line of an all-white NTSC composite video signal.

Each line is made up of the active video portion and the horizontal blanking portion. The active video portion contains the picture brightness (luma) and color (chroma) information; the brightness information is the instantaneous amplitude at any point in time. For an all-white signal, the voltage during the active video portion would yield a bright-white picture for that horizontal scan line, whereas the horizontal blanking portion would be displayed as black and therefore not be seen on the screen. Color information is added on top of the luma signal as a sine wave, with the colors identified by a specific phase difference between it and the color-burst reference phase. The amplitude of the modulation is proportional to the amount of color (or saturation), and the phase information denotes the tint (or hue) of the color. The horizontal blanking portion contains the horizontal synchronizing pulse (sync pulse) as well as the color reference (color burst), located just after the rising edge of the sync pulse on the portion called the "back porch". It is important to note that the horizontal blanking portion of the signal is positioned in time such that it is not visible on the display screen.
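To make the line structure concrete, the sketch below lays out nominal PAL-B/G timing figures for one horizontal scan line; these are the commonly quoted nominal values, used for illustration, not numbers taken from the report:

```python
# Nominal PAL-B/G timing for one horizontal scan line (illustrative values).
line_period_us = 64.0      # duration of one horizontal scan line
active_video_us = 52.0     # visible picture content
sync_pulse_us = 4.7        # horizontal sync pulse width
front_porch_us = 1.65      # gap between active video and sync
blanking_us = line_period_us - active_video_us                  # ~12 us horizontal blanking
back_porch_us = blanking_us - sync_pulse_us - front_porch_us    # carries the colour burst
subcarrier_mhz = 4.43361875
burst_cycles = 10
burst_duration_us = burst_cycles / subcarrier_mhz               # ~2.3 us of colour burst
print(blanking_us, round(back_porch_us, 2), round(burst_duration_us, 2))
```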

Y/C Interfaces

The Y/C signal is a video signal with less encoding. Brightness (luma), which is the Y signal, and the color (chroma), the C signal, are carried on two separate sets of wires.

Component Interfaces

Component signal interfaces are the highest performance, because they have the least encoding; the signals exist in a nearly native format. They always utilize three pairs of wires, typically in either a luma (Y) and two-color-difference-signals format or a red, green, blue (RGB) format. RGB formats are almost always used in computer applications, whereas color-difference formats are generally used in television applications. The Y signal contains the brightness (luma) and synchronizing information, and the color-difference signals contain the red (R) minus Y signal and the blue (B) minus Y signal. The theory behind this combination is that each of the base R, G, and B components can be derived from these difference signals. Common variations of these signals are as follows:

Y, B-Y, R-Y: Luma and color-difference signals.
Y, Pr, Pb: Pr and Pb are scaled versions of R-Y and B-Y. Commonly found in high-end consumer equipment.
Y, Cr, Cb: Digital-signal equivalent of Y, Pr, Pb. Sometimes incorrectly used in place of Y, Pr, Pb.
Y, U, V: Not an interface standard. These are intermediate, quadrature signals used in the formation of composite and Y/C signals. Sometimes incorrectly referred to as a "component interface".

Some important terms and their meanings in this context are listed below.

Aspect Ratio
Aspect ratio is the ratio of the visible-picture width to the height. Standard television and computers have an aspect ratio of 4:3 (1.33). HDTV has aspect ratios of either 4:3 or 16:9 (1.78). Additional aspect ratios like 1.85:1 or 2.35:1 are used in cinema.


Blanking Interval
There are horizontal and vertical blanking intervals. The horizontal blanking interval is the time period allocated for retrace of the signal from the right edge of the display back to the left edge to start another scan line. The vertical blanking interval is the time period allocated for retrace of the signal from the bottom back to the top to start another field or frame. Synchronizing signals occupy a portion of the blanking interval.

Blanking Level
The blanking level is the nominal voltage of a video waveform during the horizontal and vertical blanking periods, excluding the more negative sync tips.

Chroma
The color portion of a video signal. This term is sometimes incorrectly referred to as "chrominance," which is the actual displayed color information.

Color Burst
The color burst, also commonly called the "color subcarrier," is 8 to 10 cycles of the color reference frequency. It is positioned between the rising edge of sync and the start of active video for a composite video signal.

Fields and Frames
A frame is one complete scan of a picture. In NTSC it consists of 525 horizontal scan lines. In interlaced scanning systems, a field is half of a frame; thus, two fields make a frame.

Luma
The monochrome or black-and-white portion of a video signal. This term is sometimes incorrectly called "luminance," which refers to the actual displayed brightness.

Monochrome
The luma (brightness) portion of a video signal without the color information. Monochrome, commonly known as black-and-white, predates current color television.

PAL
Phase alternating line. PAL is used to refer to systems and signals that are compatible with this specific modulation technique. It is similar to NTSC but uses subcarrier phase alternation to reduce the sensitivity to phase errors that would be displayed as color errors. Commonly used with 625-line, 50 Hz scanning systems with a subcarrier frequency of 4.43362 MHz.

Pixel

Picture element. A pixel is the smallest piece of display detail that has a unique brightness and color. In a digital image, a pixel is an individual point in the image, represented by a certain number of bits to indicate the brightness.

RGB
Stands for red, green, and blue. It is a component interface typically used in computer graphics systems.

Sync Signals/Pulses
Sync signals, also known as sync pulses, are negative-going timing pulses in video signals that are used by video-processing or display devices to synchronize the horizontal and vertical scanning of the display.

Y Cr Cb
A digital component video interface. Y is the luma (brightness) portion, and Cr and Cb are the color-difference portions of the signal.

Y/C
An analog video interface in which the chroma (color) information is carried separately from the luma (brightness) and sync information. Two wire pairs are used, denoted Y and C or Y/C. Commonly referred to as "S-video."

CAMERA AND ITS BASE STATION

The camera system in DDK Lucknow has the following main components:
i) Optical system
ii) Video system
iii) Monitor system
iv) Pulse system
v) Control system
vi) Auto setup system
vii) Power system
viii) Intercommunication system and tally system
ix) Transmission system

The camera has a head unit as well as a base unit. The head unit is located in the studio and the base unit is located in the MSR. There is also a Camera Control Unit (CCU), which is a separate unit in itself and is used to control the camera. The base station of the camera houses all the electronics related to the camera. The head unit of the camera is the part which the camera man handles in the studio. The head unit is connected to the rest of the system through a triax cable alone.


This reduces the clutter in the studio. The triax cable carries power for the camera, carries the picture signals from the camera, and also carries the communications, in RF, to and from the camera.

The head unit of the camera houses the charge coupled devices (CCDs), which take in the light from the viewing area and convert it to electrical signals. Before the light hits the CCDs in a colour camera, a dichroic prism is used to split it into the three primary colours (R, G, B), which are absorbed by three different CCDs kept at the focus of the lens system. They absorb light from each part of the scene, pixel after pixel, and for a moving picture, frame after frame. The CCDs improve the apparent limiting resolution with the help of spatial pixel shifting. There are three types of CCDs available:

Interline Transfer (IT)
Frame Transfer (FT)
Frame Interline Transfer (FIT)

The DDK Lucknow studio uses 4 IKEGAMI (HK 399W) cameras in studio 1 and an Ikegami camera and a SONY camera in studio 2. Both the Ikegami and Sony cameras use FIT-type CCDs. The Sony camera gives a digital output, whereas the Ikegami gives an analog output.

The FIT-type CCD has photodiodes, vertical transfer CCDs and horizontal transfer CCDs, all of which except the photodiodes are covered with a metallic film to prevent any exposure to light. The residual charge in the vertical transfer CCDs is swept out; if it is not swept out, smearing occurs (light leaks into the vertical transfer CCD and is seen as light above and below a bright object). The charges resulting from the light converted by the photodiodes are transferred to the vertical transfer CCDs during vertical blanking, and then transferred to the storage CCDs at high speed. This reduces smear. FIT is complex but has very little smear. Light entering the sections covered with metallic film does not cause photoelectric conversion, but light which is reflected can enter the photodiodes and may generate false signals called moiré (a patterned distortion). An optical low pass filter is used for reducing this moiré phenomenon.

ON-CHIP LENS

It is mounted on the CCD to collect light which would otherwise not contribute to photoelectric conversion. This improves CCD sensitivity. Most CCDs have on-chip lenses.


OPTICAL LOW PASS FILTER

Unlike pickup tubes, the CCD does not have a continuous surface but discrete photodiodes. By the sampling theorem, spatial frequencies higher than half the sampling frequency cause spurious signals, which appear as moiré. The optical low pass filter is used to attenuate and suppress these high spatial frequencies. A crystal filter exploiting the effect of double refraction is used for this.

SPATIAL PIXEL SHIFTING

This is a method of improving horizontal resolution in which the light-receiving elements of the G channel are shifted by half a pitch compared to those of the R and B channels.


This effectively doubles the sampling points and theoretically doubles the upper-band resolution if the luminance signal were simply Y = 0.25R + 0.50G + 0.25B. In reality, however, Y = 0.30R + 0.59G + 0.11B is required, so the resolution is not actually doubled, but a satisfactory improvement is still achieved. The interleaved sampling points also reduce moiré.
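A toy sketch of the geometric idea: offsetting the G photosites by half a pixel pitch relative to R and B doubles the number of distinct positions at which luminance is sampled along a line. All names and values here are purely illustrative:

```python
# Spatial pixel shifting, illustrated with sampling positions in units of the pixel pitch.
pitch = 1.0
n = 8
r_b_positions = [i * pitch for i in range(n)]              # R and B photosite positions
g_positions = [i * pitch + pitch / 2 for i in range(n)]    # G photosites, shifted half a pitch
luma_positions = sorted(r_b_positions + g_positions)       # twice as many distinct points
print(len(luma_positions), "luminance sampling points versus", n, "per colour channel")
```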

OVERFLOW DRAIN (OFD)

The overflow drains of the CCD are responsible for discharging excess charge when a large volume of light falls on the photodiodes. Without the OFD, the charge would overflow into adjacent pixels and a phenomenon called blooming would occur, in which the area surrounding a bright spot in the image spreads out extensively in white.


Appropriate control of the OFD allows signal charges to be discharged by force midway through the charge storage process, thus performing the same role as a shutter.

Shutter standards: the preset shutter is 1/60th of a second for NTSC, and ranges from 1/60th of a second to 1/200th of a second for PAL. CVSS, or continuously variable shutter speed, ranges from 1/30.3th to 1/57.6th and from 1/61.4th to 1/1996th of a second for NTSC, and from 1/25.4th to 1/47.6th and from 1/50.4th to 1/1953rd of a second for PAL. In particular, a 1/100th second shutter makes it possible to eliminate the flicker caused between the NTSC field rate and a 50 Hz commercial power supply.

New Super V is a technology incorporated to improve vertical resolution; it gives a vertical resolution of 480 TV lines against the normal 400 TV lines.

Video System

The camera head contains a CCD MULTI module, an A PROC-1 module, an A PROC-2 module, a HEAD D PROC module and HEAD PULSE modules. The video system of the BS/CCU contains BS MPV, BS DF PROC and BS PULSE modules. The electrical signals that have undergone photoelectric conversion in the CCD elements are transferred to the sample-and-hold circuit in the CCD MULTI module and output to the A PROC-1 module, undergo video processing in the A PROC-2, HEAD D PROC and HEAD PULSE modules, and are transmitted to the BS/CCU via the triax cable adaptor as component (Y, Cr, Cb) signals. In self-contained mode they are converted into encoded signals by the digital encoder ASIC in the HEAD D PROC module for output.


Monitoring System

The monitoring system generates various signals to be output to the VF, PF and WFM outputs. Since it is separate from the main signal path, the system can switch the R, G and B video signals or display the markers and characters required for monitoring.

Pulse System

The pulse system is installed in both the camera head and the BS/CCU. It is designed to operate both in conjunction with the CCU connected to the BS/CCU and in the self-contained mode, where the camera head operates alone; in either case the system can be operated in internal or external synchronization mode.

Control System

The camera is normally controlled through the CPUs of the HEAD MPU and BS MPU modules, which keep watch over each unit and module.

Production Control Room (PCR)

A major objective of TV programme control facilities is to maintain a smooth, continuous flow of programme material. The overall control of the programme is done in the production control room by the producer with the help of a production assistant, a CCU engineer and an engineer at the vision mixer. They have in front of them the switching panel of the vision mixer console and a stack of monitors for the individual cameras, preview monitors for the VTRs and a transmission monitor for displaying the switched output, with the aid of which the programme is edited. The PCR usually consists of equipment such as:

Camera Control Unit (CCU)
Vision Mixer (VM)
Video Tape Recorder (VTR)
Audio Mixer (AM)

Camera Control Unit (CCU)

The CCU contains controls for:

Aperture
Optical focus
Zoom of the lens system
Beam focus
Selecting gain
Colour temperature
Contours (camera details)
Gamma

Vision Mixer (VM)

A vision mixer or video switcher enables the programme producer to select the desired sources, or a combination of the sources, in order to compose the programme. The vision mixer is typically a 10x6 or 20x10 crossbar switcher, selecting any one of the 10 or 20 input sources onto 6 to 10 different output lines. The input sources include: Camera-1, Camera-2, Camera-3, Telecine-1, Telecine-2, VTR-1, VTR-2, Test Signal, etc. The vision mixer provides the following operational facilities for the editing of TV programmes.

Take - selection of any input source, or cut - switching cleanly from one source to another.
Dissolve - fading in or fading out.
Lap dissolve - dissolving from one source to another with an overlap mixing.
Superposition of two sources - keyed caption, where the selected inlay is superposed on the background picture.
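A minimal sketch of the two most basic of these operations, a cut (take) and a lap dissolve, modelling frames simply as lists of pixel values; the names and values are hypothetical and do not represent any DDK equipment:

```python
# Illustrative vision-mixer operations on frames modelled as flat lists of pixel values.
def take(source_frame):
    """Cut: the selected source goes straight to the programme output."""
    return source_frame

def lap_dissolve(frame_a, frame_b, k):
    """Mix: cross-fade from source A to source B as k goes from 0.0 to 1.0."""
    return [(1.0 - k) * a + k * b for a, b in zip(frame_a, frame_b)]

cam1 = [0.2, 0.5, 0.9]   # pretend output of Camera-1
vtr1 = [0.8, 0.1, 0.4]   # pretend output of VTR-1
print(take(cam1))
print(lap_dissolve(cam1, vtr1, 0.5))   # halfway through the dissolve
```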

Video Tape Recorder (VTR)

The standardized two-inch tape, quadruplex-head recording machines are called video tape recorders and are used for high-quality video tape recording; one-inch or half-inch helical-scan tape recorders have been used for outdoor field recording. The multipurpose studio digital video cassette recorder is designed to record, play back and edit interlaced signals (625i/525i) as well as existing DVCPRO signals (25 Mbps). Its 625/525 switching function makes it a studio video cassette recorder which can be used anywhere in the world. In addition, it incorporates digital compression technology so that the deterioration in picture quality and sound quality resulting from dubbing is significantly minimized. The compact, lightweight 4U size makes it easier to carry, even when mounted in a 19-inch rack. The settings for the unit's setup can be performed interactively while viewing the screen menus on the monitor, and editing functions include both assemble and insert editing.
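A rough back-of-the-envelope figure implied by the 25 Mbit/s DVCPRO rate quoted above (video essence only; audio and overhead are ignored, so this is indicative rather than exact):

```python
# Approximate storage consumed per hour of DVCPRO video at 25 Mbit/s.
video_rate_mbps = 25
seconds_per_hour = 3600
megabits_per_hour = video_rate_mbps * seconds_per_hour     # 90,000 Mbit per hour
gigabytes_per_hour = megabits_per_hour / 8 / 1000          # ~11.25 GB of video per hour
print(gigabytes_per_hour)
```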


DIGITAL EARTH STATION SIMULCAST

Frequency range: 5.85 GHz to 6.425 GHz for transmission, 3.625 GHz to 4.2 GHz for reception.

The digital earth station operates in the frequency range of 5.85 GHz to 6.425 GHz for transmission and 3.625 GHz to 4.2 GHz for reception of signals. The whole system operates to DVB/MPEG-2 standards. The baseband processor subsystem and baseband monitoring subsystem operate in a fully digital domain. An OFC (optical fibre cable) carries the digital baseband signal from the studio to the earth station site to minimize noise and interference. The system is controlled by a PC called the NMS PC.

The compression segment has an MPEG encoder, a digital multiplexer and a digital modulator. The monitoring and receiving segment comprises two digital receivers for receiving and decoding the programme. The output of the modulator (at 70 MHz IF) is sent to an up converter; the up-converted signal is fed to an HPA (high power amplifier), and this signal is then given to a PDA (parabolic dish antenna) for up-linking to the satellite. The uplinked signal is received again by the same PDA for monitoring purposes.

The signals between the earth station and the satellite travel along a line of sight, which means there must be a clear path from the earth station to the satellite. The uplink signal is fed from the earth station by a large PDA. The satellite is equipped with its own dish antenna, which receives the uplink signals and feeds them to a receiver. The signal is then amplified and changed to a different frequency, the downlink frequency; this is done to prevent interference between the uplink and downlink signals. The downlinked signal is then retransmitted towards earth by the satellite's transmitter. Each satellite has a transponder; a single antenna receives all signals and another transmits all signals back. A satellite transmits signals towards earth in a pattern called the satellite footprint. The footprint is strongest at the centre, and it is used to determine whether an earth station location will be suitable for reception of the desired signal.

The main parts of the DES are the antenna subsystem (including the LNA), the antenna control unit, the beacon tracking unit, the beacon tracking receiver, the up converter system, the high power amplifier and the power system. The system operates in 2+1 mode and is compliant with DVB/MPEG-2 standards. The baseband processor subsystem and baseband monitoring subsystem operate in the digital domain. An OFC carries the digital baseband signal from the studio to the earth station to minimize noise and interference.
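As a rough illustration of the frequency plan of this chain, the sketch below works from the 70 MHz modulator IF and the uplink/downlink figures quoted later for this earth station (6036.5 MHz and 3811.5 MHz), using the standard 2225 MHz C-band translation; the up-converter LO value derived at the end is purely illustrative:

```python
# Frequency bookkeeping for the C-band uplink chain described above.
if_mhz = 70.0                                   # modulator output (IF)
downlink_mhz = 3811.5                           # quoted downlink frequency
translation_mhz = 2225.0                        # standard C-band satellite translation
uplink_mhz = downlink_mhz + translation_mhz     # 6036.5 MHz, matching the quoted uplink
upconverter_lo_mhz = uplink_mhz - if_mhz        # LO a single-conversion up converter would need
print(uplink_mhz, upconverter_lo_mhz)
```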


The network management system (NMS) monitors and controls the baseband equipment, compression equipment and test instruments such as the video/audio generator and video/audio analyzer. These are provided to ensure the quality of transmission and to help in troubleshooting.

The baseband segment comprises baseband subsystems at the studio site and at the earth station site; this segment processes two video programmes. The baseband segment is monitored and controlled using a PC placed near the baseband earth station equipment, called the baseband NMS PC. The compression segment comprises MPEG encoders in a 2+1 configuration for providing redundancy; it also comprises digital multiplexers and digital modulators in a 1+1 configuration. The compression segment is monitored and controlled by the compression NMS PC. The receive and monitoring segment consists of two digital receivers for receiving and decoding the video programmes and one ASI-to-SDI decoder for decoding the transport stream, so that the video programmes can be monitored at the multiplexer output. The RF NMS PC is placed near the receive and monitoring segment, and a video/audio generator is placed in the baseband segment. For monitoring of video programmes, a professional video monitor, an LCD video monitor and an audio level monitor are provided in the baseband segment. An operator console has one 14-inch professional video monitor, a video/audio monitoring unit for quantitative monitoring of the video programmes, and a personal computer for centralized monitoring and control of the earth station subsystems.

UP CONVERTER (1+1)

The UPC will operate at any frequency within the stated transmission bandwidth, in 125 kHz stepped increments. The IF input is intended for operation within an 80 MHz bandwidth centered at 70 MHz (i.e. +/- 40 MHz). Due to its low phase noise and HF stability, the model UC6M2D5 (satellite networks) meets INTELSAT, DOMSAT, EUTELSAT and regional requirements. It can be used as a standalone up converter or in a 1:1 protection-switch configuration. The uplink frequency for Trivandrum is 6036.5 MHz and the downlink is 3811.5 MHz.

AUDIO PROCESSOR

Designed specifically for the demands of television audio, the programmable OPTIMOD-TV 8282 digital audio processor meets all requirements of the various systems in use around the world.

It is impossible to characterize the listening quality of even the simplest limiter or compressor on the basis of the usual specifications, because such specifications cannot adequately describe the crucial dynamic processes that occur under programme conditions. Therefore, the only way to meaningfully evaluate the sound of an audio processor is by subjective listening tests. Certain specifications are nevertheless presented here to assure the engineer that they are reasonable, to help plan the installation, and to help make comparisons with other processing equipment. Some of the specifications are for features that are optional. The TX's sampling rate can be synchronized with that of the audio processor or can be allowed to free-run at 32 kHz, 44.1 kHz or 48 kHz. The audio signal is sent to the digital I/O cards and analog cards separately. These cards provide the required pre-emphasis, truncation and attenuation of the digital signal before transmission.
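As a small illustration of pre-emphasis (the high-frequency boost applied before transmission), the sketch below implements a simple first-order pre-emphasis filter; the 50 microsecond time constant is the value conventionally used for terrestrial TV FM sound and is an assumption here, not a figure taken from the OPTIMOD-TV documentation:

```python
import numpy as np

def preemphasis(x, fs=48_000, tau=50e-6):
    """First-order pre-emphasis, approximating H(s) = 1 + s*tau with a simple
    backward difference. tau = 50 us is assumed (conventional TV FM sound value)."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x, prepend=x[0])        # discrete derivative of the signal
    return x + tau * fs * dx

# High frequencies are boosted: a 10 kHz tone comes out larger than it went in.
t = np.arange(0, 0.01, 1 / 48_000)
tone = np.sin(2 * np.pi * 10_000 * t)
print(np.max(np.abs(preemphasis(tone))))
```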

Transmitter Antenna

A 6.3 m diameter antenna with a simplified manual track device features ready erection, ease of maintenance and high reliability.

Antenna parameters
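As a rough, purely illustrative feel for what a reflector of this size provides, the sketch below estimates the on-axis gain of a 6.3 m dish at the roughly 6 GHz uplink band quoted earlier, assuming a typical 55% aperture efficiency (an assumed figure, not one taken from the report):

```python
import math

# Illustrative parabolic-dish gain estimate: G = efficiency * (pi * D / lambda)^2.
diameter_m = 6.3
frequency_hz = 6.0e9
efficiency = 0.55
wavelength_m = 3.0e8 / frequency_hz                        # ~0.05 m
gain_linear = efficiency * (math.pi * diameter_m / wavelength_m) ** 2
gain_dbi = 10 * math.log10(gain_linear)                    # roughly 49 dBi
print(round(gain_dbi, 1))
```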


Reflector structure

The 6.3 m diameter antenna is made up of 4 quarter segments, and each quarter is made up of 10 segments fixed on five trusses. The panels fixed to the trusses are made up of fine aluminum expanded mesh, strengthened with the help of channel sections and tee sections whose ends are fixed to the back-up structure. The trusses are composed of aluminum square tubes, and the welded back-up structure is made up of a hub and 20 trusses. The hub and trusses are constructed in such a way that they contribute to a high level of surface accuracy.

Mount structure

A simple tubular steel space frame makes up most of the mount structure. It allows rotation about the x-axis as well as the y-axis. The x-axis drive rod is connected between the top of the mount structure and the concrete foundation. The y-axis drive rod is connected between the base of the x-axis bearing mount and the reflector back-up structure, on the left-hand side as viewed from the rear of the antenna. The mount is rigidly attached to the concrete base, facing north, such that it can survive wind speeds of up to 200 km/h.

Drive mechanism

It has a telescopic pipe arrangement with a screw rod inside it, along with a manual handle. There are mechanical angle indicators along the screw rod which indicate the exact position and angle of the antenna with respect to both axes.

Material

Most of the parts of the panels and antenna structure are made of aluminum alloy, which has good corrosion resistance and yield strength.

Finish

The reflector is treated in the following order before installation:
(a) Etch primer is applied after caustic soda acid treatment
(b) Painted with white matt paint

The mount is treated with the following:
(a) A hot dip which galvanizes all steel parts
(b) Etch primer treatment
(c) White enamel paint is applied as the last coating

Fixing the feed onto the antenna

The feed is supported by a set of four pipes called a quadripod. It is fixed before the whole antenna structure is hoisted, that is, it is fixed on the ground itself before the whole antenna structure is erected. Care should be taken that the feed is at the exact focus of the reflector; a maximum tolerance of 3 mm is allowed for the separation between the actual focus and the feed position. The feed entrances and cable output ports are covered with waterproof Teflon sheet to prevent the entry of moisture into the arrangement.

