
AN INDUSTRIAL TRAINING REPORT ON

PRESENTED BY: HIMANSHU MAHAJAN GIRISH KHAJURIA RACHIT SHARMA

HISTORY:
In its early stages of development, television employed a combination of optical, mechanical and electronic technologies to capture, transmit and display a visual image. By the late 1920s, however, systems employing only optical and electronic technologies were being explored.

Doordarshan is the public television broadcaster of India and a division of Prasar Bharati, a public service broadcaster nominated by the Government of India. It is one of the largest broadcasting organisations in the world in terms of infrastructure of studios and transmitters. More than 90 percent of the population receives Doordarshan programmes through a network of about 1,400 terrestrial transmitters.

The first practical use of television was in Germany. Regular television broadcasts began in Germany in 1929, and in 1936 the Olympic Games in Berlin were broadcast to television stations in Berlin and Leipzig, where the public could view the games live.

BLOCK DIAGRAM OF T.V. STATION

PROGRAMME CONTROL ROOM (PCR)

The programme control room is one of the essential blocks of TV transmission. It can be termed the recording centre for the programmes. Live telecasts of programmes such as news and interviews also take place here. This is one of the major sections of transmission and involves a number of technical and non-technical persons. Recording takes place according to a predetermined schedule called the programme schedule.

TV STUDIO / VIDEO TAPE RECORDER / AUDIO SECTION

TELEVISION STUDIO [CAMERA SECTION]


Studio floor

The studio is the room where a programme is performed and recorded using cameras. The studio floor is an open area which contains the television cameras, microphones, lighting equipment, sets and crew.
Control room

In the control room, the programme director, assistant director, technical director, audio engineer and video engineer work.

T.V. PICTURE
A picture can be considered to contain a number of small elementary areas of light or shade, which are called picture elements.

The scene is focused on the photo-sensitive surface of the pickup device, and an optical image is formed.

The photoelectric properties of the pickup device convert the optical image into an electric charge image, depending on the light and shade of the scene.

To transmit this information, scanning is employed.

VIDEO CAMERA
A video camera is a camera used for electronic motion picture acquisition, initially developed by the television industry but now common in other applications as well.

Cameras based on solid-state image sensors such as CCDs (and later CMOS active pixel sensors) eliminated common problems with tube technologies, such as burn-in, and made digital video workflows practical.

Camera Control Unit (C.C.U.)

The camera control unit has provision to control the zoom lens action and the brightness of the camera tubes. The C.C.U. engineer has the necessary facilities to adjust parameters such as video gain and camera sensitivity. The composite video signal received over the microwave link is demodulated and processed in the usual manner by the C.C.U. engineer for transmission on the channel allocated to the station.

CAMERA TUBE

VIDEO CAMERA

The main recording room coordinates with both the studio and the audio section. Here the output of all the cameras is controlled. It consists of a video console, which is used for shot selection using the different control knobs on it.

VTR characteristics
Linear speed: the speed at which the tape runs. It decides the tape length required for a particular duration.
Tape length for a 30 min. cassette = 10.15 x 30 x 60 cm

Writing speed: the speed at which signal information is written on the tape by the head.
Writing speed = 3.14 x drum diameter (m) x drum speed (rps)

For a Betacam edit VCR: drum diameter = 74.4 mm, drum speed = 25 rps.
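The two speed formulas above can be checked with a short calculation; the 10.15 cm/s linear speed is the figure implied by the 30-minute tape-length formula in the text:

```python
import math

def writing_speed(drum_diameter_m, drum_speed_rps):
    """Head-to-tape writing speed: drum circumference x rotations per second."""
    return math.pi * drum_diameter_m * drum_speed_rps

def tape_length_cm(linear_speed_cm_s, minutes):
    """Tape consumed for a recording of the given duration."""
    return linear_speed_cm_s * minutes * 60

# Betacam edit VCR figures from the text: 74.4 mm drum at 25 rps
ws = writing_speed(0.0744, 25)       # ~5.84 m/s
# 10.15 cm/s linear speed, 30-minute cassette
length = tape_length_cm(10.15, 30)   # 18,270 cm, i.e. ~183 m of tape
```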

Video tape format

It defines the arrangement of magnetic information on the tape. It specifies:
1. Tape width
2. Number of tracks for video and audio
3. Their electrical characteristics and orientation
4. Track width

Material used for recording: ferromagnetic materials (Fe2O3 / Fe3O4) are used. These materials have the special characteristic of magnetic retentivity.

TRANSMITTER

FEATURES:
1. Type: PCN-810 AL
2. Rating: 10 kW
3. Status: VHF

CONSTRUCTION: The transmitter consists of two frames, as shown facing the front. The left frame accommodates the PA panel, while the right frame accommodates the visual last-stage power amplifiers. In addition, the plate voltage transformer, silicon rectifier and blower are installed outside the frames.

COLOUR TRANSMISSION
The three colour transmission systems are:
NTSC SYSTEM
SECAM SYSTEM
PAL SYSTEM

NTSC
NTSC, named for the National Television System Committee, is the analog television system used in most of North America, most countries in South America, and some Pacific island nations and territories. The NTSC selected 525 scan lines as a compromise among competing proposals, some of which called for between 605 and 800 lines. The standard recommended a frame rate of 30 frames per second, consisting of two interlaced fields per frame at 262.5 lines per field and 60 fields per second.
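The raster numbers quoted above are self-consistent, as a quick check shows (the 15,750 Hz nominal line frequency follows from them, though it is not stated in the text):

```python
# NTSC raster arithmetic from the figures above
lines_per_field = 262.5
fields_per_second = 60
fields_per_frame = 2    # two interlaced fields per frame

lines_per_frame = lines_per_field * fields_per_frame        # 525 scan lines
frames_per_second = fields_per_second / fields_per_frame    # 30 frames/s
line_frequency_hz = lines_per_frame * frames_per_second     # 15,750 Hz (nominal)
```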

SECAM
SECAM, short for Sequential Colour with Memory, is an analog colour television system first used in France. SECAM uses frequency modulation to encode the chrominance information on the subcarrier. The red difference signal is transmitted on one line, then the blue difference signal is transmitted on the next line.

PAL (Phase Alternating Line):

PAL is an analogue television encoding system used in broadcast television systems in many countries. The basics of PAL and the NTSC system are very similar: a quadrature amplitude modulated subcarrier carrying the chrominance information is added to the luminance video signal to form a composite video baseband signal.

The frequency of this subcarrier is 4.43361875 MHz for PAL, compared to 3.579545 MHz for NTSC. The name "Phase Alternating Line" describes the way the phase of part of the colour information on the video signal is reversed with each line, which automatically corrects phase errors in transmission by cancelling them out. The 4.43361875 MHz colour carrier frequency is the result of 283.75 colour clock cycles per line plus a 25 Hz offset to avoid interference. Since the line frequency is 15625 Hz, the colour carrier frequency calculates as follows: 4.43361875 MHz = 283.75 x 15625 Hz + 25 Hz.
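The subcarrier arithmetic can be verified directly:

```python
# PAL colour subcarrier frequency from the line frequency
line_frequency_hz = 15_625      # 625 lines x 25 frames/s
cycles_per_line = 283.75
offset_hz = 25                  # 25 Hz offset to avoid interference

subcarrier_hz = cycles_per_line * line_frequency_hz + offset_hz
# 4,433,618.75 Hz, i.e. 4.43361875 MHz
```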

VESTIGIAL SIDE BAND TRANSMISSION

As PAL is interlaced, every two fields are summed to make a complete picture frame. Luminance, Y, is derived from the red, green and blue (R'G'B') signals:
Y = 0.299R' + 0.587G' + 0.114B'
U and V are used to transmit chrominance. Each has a typical bandwidth of 1.3 MHz:
U = 0.492(B' - Y)
V = 0.877(R' - Y)
Composite PAL signal = Y + U sin(ωt) + V cos(ωt) + timing, where ω = 2πFSC. The subcarrier frequency FSC is 4.43361875 MHz (±5 Hz) for PAL-B/D/G/H/I/N.
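The weighting equations above translate directly into code; this sketch takes gamma-corrected R'G'B' values in the range 0..1:

```python
def rgb_to_yuv(r, g, b):
    """PAL luminance and colour-difference weighting of R'G'B' (each 0..1)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue colour difference
    v = 0.877 * (r - y)                     # red colour difference
    return y, u, v

# For white, the colour-difference signals vanish and only luminance remains
y, u, v = rgb_to_yuv(1.0, 1.0, 1.0)
```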

The original colour carrier is required by the colour decoder to recreate the colour difference signals. Since the carrier is not transmitted with the video information, it has to be generated locally in the receiver. So that the phase of this locally generated signal can match the transmitted information, a 10-cycle burst of colour subcarrier is added to the video signal shortly after the line sync pulse but before the picture information, during the so-called back porch. An interesting comparison can be made with the VGA signal, the most notable differences being the double horizontal sweep time and the interlace mode.

Video Formats
MPEG: The Moving Picture Experts Group (MPEG) is a working group of experts formed by ISO and IEC to set standards for audio and video compression and transmission. MPEG compression is considered asymmetric, as the encoder is more complex than the decoder. The encoder needs to be algorithmic or adaptive, whereas the decoder is 'dumb' and carries out fixed actions.

This is considered advantageous in applications such as broadcasting, where the number of expensive, complex encoders is small but the number of simple, inexpensive decoders is large. MPEG's (ISO's) approach to standardization is novel, because it is not the encoder that is standardized but the way a decoder interprets the bitstream. A decoder that can successfully interpret the bitstream is said to be compliant. The advantage of standardizing the decoder is that, over time, encoding algorithms can improve, yet compliant decoders continue to function with them.

MPEG has standardized the following compression formats and ancillary standards:

MPEG-1 (1993): Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s (ISO/IEC 11172). The first MPEG compression standard for audio and video. It was basically designed to allow moving pictures and sound to be encoded into the bit rate of a Compact Disc. It is used on Video CD and SVCD and can be used for low-quality video on DVD Video. It was used in digital satellite/cable TV services before MPEG-2 became widespread.

MPEG-2 (1995): Generic coding of moving pictures and associated audio information (ISO/IEC 13818). Transport, video and audio standards for broadcast-quality television. The MPEG-2 standard was considerably broader in scope and of wider appeal, supporting interlacing and high definition. MPEG-2 is considered important because it has been chosen as the compression scheme for over-the-air digital television (ATSC, DVB and ISDB), digital satellite TV services like Dish Network, digital cable television signals, SVCD and DVD Video. It is also used on Blu-ray Discs, though these normally use MPEG-4 Part 10 or SMPTE VC-1 for high-definition content.

MPEG-3: MPEG-3 dealt with standardizing scalable and multi-resolution compression and was intended for HDTV compression, but it was found to be redundant and was merged with MPEG-2; as a result, there is no MPEG-3 standard. MPEG-3 is not to be confused with MP3, which is MPEG-1 Audio Layer 3.

MPEG-4 (1998): Coding of audio-visual objects (ISO/IEC 14496). MPEG-4 uses further coding tools, with additional complexity, to achieve higher compression factors than MPEG-2. In addition to more efficient coding of video, MPEG-4 moves closer to computer graphics applications.

MPEG-4 includes MPEG-4 Part 2 (the Simple and Advanced Simple Profiles) and MPEG-4 AVC (MPEG-4 Part 10, or H.264). MPEG-4 AVC may be used on HD DVD and Blu-ray Discs, along with VC-1 and MPEG-2.

SATELLITE COMMUNICATION

Satellite: A satellite is a device that performs two functions at the same time: it receives information originated from one ground station (the transmitter), and it sends this information on to another ground station (the receiver). These satellites revolve around the earth in a fixed orbit. A satellite is a communication device used for large-scale broadcast and monitoring purposes, and it may be stationary or revolving in an orbit.

The following two types of satellites are generally used:
(i) Low earth satellite: satellites within an altitude of 400 km from the earth's surface are known as low earth satellites.
(ii) Geosynchronous satellite: satellites above an altitude of 22,000 miles from the earth's surface are known as geosynchronous satellites.

NEAR-EARTH
Most common orbit: near polar
Altitude: 400 km
Lifetime: less than 1 year
Major effect on lifespan: atmospheric drag
Period: 90 min
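Both the ~90-minute near-earth period and the roughly one-day geosynchronous period fall out of Kepler's third law; the gravitational parameter and earth radius below are standard values, not from the text:

```python
import math

MU_EARTH = 398_600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371.0         # km, mean earth radius

def orbital_period_min(altitude_km):
    """Circular-orbit period at a given altitude, via Kepler's third law."""
    a = R_EARTH + altitude_km                         # orbit radius, km
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

low_earth = orbital_period_min(400)       # ~92 min
geo = orbital_period_min(35_786)          # ~1436 min, about one day
```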

GEO-SYNCHRONOUS ORBIT
Three satellites in near-equatorial orbits can provide continuous global coverage, except for the poles.

GEO-SYNCHRONOUS SATCOM

BASIC BLOCK DIAGRAM
[Block diagram: signal is up-linked from the earth station to the satellite transponder, then down-linked from the transponder to receiving stations]

EARTH STATION
The earth station is an uplink centre from which signals are fed to the satellite for distribution in the specified area covered by the satellite. In TV broadcasting, the signal is up-linked from the earth station and received by many downlink centres. It is a very important part of a satellite communication system for the broadcasting of signals.

Earth station classification:
Analog earth station
Analog/digital simulcast
Digital earth station
C-band or Ku-band

DIGITAL EARTH STATION

Why Digital?

More programmes per channel/transponder, i.e. spectrum efficient
Noise-free reception
CD-quality sound and better-than-DVD-quality picture
Reduced transmission power
Interactive services like e-commerce, e-banking, tele-quiz, tele-games etc.
Automated operation of the broadcast plan
Non-availability of analog systems in the near future
The future of TV transmission: DTH, DTT and digital cable

REQUIREMENTS

Up-converters

Up-conversion is required to raise the frequency of the signal into the desired band (C-band, extended C-band or Ku-band) before transmission. The input to the up-converter is the 70 MHz output of the modulator, and the output of the up-converter is fed to the HPA. The up-conversion may be done in stages or directly in one stage. For example, the 70 MHz signal is first converted into L-band, and then the L-band signal is raised to the desired frequency band. Normally an L-band monitoring point is also provided in up-converters for monitoring purposes.
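The two-stage scheme described above can be sketched as ideal mixing; the local-oscillator frequencies here are hypothetical, chosen only so the result lands in the C-band uplink range:

```python
IF_MHZ = 70           # modulator output, as stated in the text
LO1_MHZ = 1_130       # hypothetical first LO: IF -> L-band
LO2_MHZ = 4_800       # hypothetical second LO: L-band -> C-band uplink

def upconvert(f_in_mhz, lo_mhz):
    """Ideal mixer stage: keep the sum product and assume the
    output filter rejects the difference (image) product."""
    return f_in_mhz + lo_mhz

l_band = upconvert(IF_MHZ, LO1_MHZ)     # 1200 MHz, L-band monitoring point
c_band = upconvert(l_band, LO2_MHZ)     # 6000 MHz, within the C-band uplink
```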

Antenna system
The most widely used narrow-beam antennas are reflector antennas. The shape is generally a paraboloid of revolution. For full earth coverage from a geostationary satellite, a horn antenna is used. Horns are also used as feeds for reflector antennas. In a small earth terminal, the feed horn is located at the focus, or may be offset to one side of the focus. Large earth station antennas have a sub-reflector at the focus. In the Cassegrain design, the sub-reflector is convex with a hyperboloid surface, while in the Gregorian design it is concave with an ellipsoidal surface.

MICROWAVE ANTENNAE

HIGH POWER AMPLIFIER

The high power amplifier is used for the final power amplification of the digital RF signal in C-band/Ku-band that is fed to the antenna. The important parameters of HPAs are:
Frequency range
Output power at flange
Bandwidth
Gain variation (1.0 dB max. for 40 MHz, narrow band)

Types of HPAs:
KHPA - Klystron High Power Amplifier
TWTA - Travelling Wave Tube Amplifier
SSPA - Solid State Power Amplifier

DOORDARSHAN SATELLITE SERVICES

S.No  Service      Mode  Satellite  Posn.   Transponder  D/L freq (MHz)  Sym. rate
1     DD National  SCPC  3C         74E     C-02         3778.5          6.25
2     DD Kashir    MCPC  3A         93.5E   C-02         3780.5          6.25
3     DD National  MCPC  4B         93.5E   C-01         3725            27.5
4     DD Jammu     SCPC  4B         93.5E   C-02         3774            4.25

FUTURE SCOPE
HIGH-DEFINITION VIDEO
High-definition video, or HD video, refers to any video system of higher resolution than standard-definition (SD) video, and most commonly involves display resolutions of 1,280 x 720 pixels (720p) or 1,920 x 1,080 pixels (1080i/1080p). High-definition image sources include terrestrial broadcast, direct broadcast satellite, digital cable, high-definition disc (BD), internet downloads and the latest generation of video game consoles.
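The jump in resolution is easy to quantify: 1080-line HD carries roughly five times the pixels of PAL SD (the 720 x 576 SD figure is a standard value, not from the text):

```python
def pixel_count(width, height):
    """Total pixels in one frame at the given display resolution."""
    return width * height

hd_720 = pixel_count(1280, 720)     # 921,600 pixels
hd_1080 = pixel_count(1920, 1080)   # 2,073,600 pixels
sd_pal = pixel_count(720, 576)      # 414,720 pixels, for comparison
```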

THANK YOU
