Introduction
Remote sensing is defined as the science of obtaining information about objects on the
earth's surface by analysing data received from a remote platform.
In the present context, information flows from an object to a receiver (sensor) in the
form of radiation transmitted through the atmosphere. The interaction between the
radiation and the object of interest conveys information required on the nature of the
object. In order for a sensor to collect and record energy reflected or emitted from a
target or surface, it must reside on a stable platform away from the target or surface
being observed.
Platforms
A platform is a stage on which a camera or sensor is mounted to acquire information
about a target under investigation. Based on altitude above the earth's surface,
platforms may be classified as
(1) Ground borne
(2) Air borne
(3) Space borne
Ground-based platforms
Ground-based remote sensing systems for earth resources studies are mainly used for
collecting ground truth or for laboratory simulation studies.
Air-borne platforms
Aircraft are generally used to acquire aerial photographs for photo-interpretation and
photogrammetric purposes. Scanners are tested for utility and performance from these
platforms before they are flown onboard satellite missions.
Space-borne platforms
Platforms in space are not affected by the earth's atmosphere. The closed path of a
satellite around the earth is called its orbit. These platforms move freely in their
orbits around the earth, and the entire earth, or any part of it, can be covered at
specified intervals; the coverage mainly depends on the orbit of the satellite. It is
through these space-borne platforms that we get the enormous volume of remote sensing
data.
Types of Satellite orbits
Satellite orbits are designed according to the capability and objective of the sensors
they carry. Depending on their altitude, orientation and rotation relative to the earth,
satellites can be categorized as:
(1) Geostationary (2) Polar orbiting and Sun-synchronous
Geostationary satellites
A geostationary satellite orbits the earth west to east in the equatorial plane at an
altitude of about 36,000 km, the altitude at which it makes one revolution in 24 hours,
synchronous with the earth's rotation. Such a platform therefore views the same area
continuously, giving near-hemispheric coverage over the same area day and night. Its
coverage is limited to about 70°N to 70°S latitudes, and one satellite can view one-third
of the globe (Fig 1). These satellites are mainly used for communication and
meteorological applications, e.g. the GOES, METEOSAT, INTELSAT and INSAT satellites.
Sun-synchronous satellites
This is an earth satellite orbit in which the orbital plane is near polar and the
altitude is such that the satellite passes over all places on earth having the same
latitude twice in each orbit, at the same local sun time (Fig 2). This ensures similar
illumination conditions when acquiring images over a particular area over a series of
days.
Fig 2. Sun synchronous orbit (source CCRS Website)
As the satellite orbits the Earth from pole to pole, its east-west position would not
change if the Earth did not rotate. However, as seen from the Earth, it seems that the
satellite is shifting westward because the Earth is rotating (from west to east) beneath
it. This apparent movement allows the satellite swath to cover a new area with each
pass (Fig. 3). The satellite's orbit and the rotation of the Earth work together to allow
complete coverage of the Earth's surface once the satellite has completed one full cycle
of orbits (Fig. 4). These satellites thus cover the entire globe, giving repetitive
coverage on a periodic basis. All remote sensing resource satellites may be grouped in
this category; examples include the LANDSAT, SPOT and IRS series, NOAA, SEASAT, TIROS,
HCMM, SKYLAB and the SPACE SHUTTLE.
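The altitudes and revisit periods quoted above follow from orbital mechanics. A minimal sketch using Kepler's third law for a circular orbit (the altitudes below are illustrative round figures, not taken from any one mission document):

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6378.0        # mean equatorial radius, km

def orbital_period_s(altitude_km):
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

# Geostationary altitude (~35,786 km) gives roughly one day per revolution
t_geo = orbital_period_s(35786)
print(f"geostationary: {t_geo / 3600:.2f} h")    # ~23.9 h

# A typical sun-synchronous altitude (~832 km, as for SPOT) gives ~101 min
t_leo = orbital_period_s(832)
print(f"sun-synchronous: {t_leo / 60:.1f} min")  # ~101 min
```

The low-altitude result matches the 101-minute orbital period listed for SPOT later in these notes.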
Fig 4 Complete Coverage of Earth Surface by Sun Synchronous Satellites (source CCRS
website)
Ascending pass and Descending pass: A near-polar satellite travels northward on one side
of the earth (ascending pass) and southward, towards the South Pole, on the other half of
the orbit (descending pass). The ascending pass is on the shadowed side of the earth,
while the descending pass is on the sunlit side. Optical sensors image the surface on a
descending pass, while active sensors, and sensors of emitted thermal and microwave
radiation, can also image the surface on an ascending pass.
Perigee: It is the point in the orbit where an earth satellite is closest to the earth.
Apogee: It is the point in the orbit where an earth satellite is farthest from the earth.
Remote Sensing Sensors
A sensor is a device that gathers energy (EMR or other), converts it into a signal and
presents it in a form suitable for obtaining information about the target under
investigation. Sensors may be active or passive depending on the source of energy.
Sensors used for remote sensing can be broadly classified as those operating in Optical
Infrared (OIR) region and those operating in the microwave region. OIR and microwave
sensors can further be subdivided into passive and active.
Active sensors use their own source of energy: the earth's surface is illuminated by
energy emitted by the sensor's own source, and the part reflected by the surface in the
direction of the sensor is received to gather the information. Passive sensors receive
solar electromagnetic energy reflected from the surface, or energy emitted by the surface
itself; having no source of their own, they cannot be used at night, except for thermal
sensors. Sensors (active or passive) may also be either imaging, like a camera or
scanner, which acquire images of the area, or non-imaging, like a non-scanning
radiometer or an atmospheric sounder.
Instantaneous field of view (IFOV)
It is defined as the solid angle through which a detector is sensitive to radiation
(expressed in mrad). Equivalently, it is the angular subtense, at a given instant, of the
limiting detector aperture at the second principal point of the optical system. IFOV can
thus be expressed as both an angular and a linear quantity:
IFOV = D/F radians
GRE = (D/F) x H metres
where
D = detector dimension, F = focal length, and H = flying height
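The two formulas above can be evaluated directly. A small sketch with illustrative numbers (the detector size, focal length and altitude are assumptions, chosen to resemble the LISS-3 figures given later in these notes):

```python
def ifov_rad(detector_m, focal_length_m):
    """IFOV = D / F (radians)."""
    return detector_m / focal_length_m

def gre_m(detector_m, focal_length_m, flying_height_m):
    """Ground resolution element GRE = (D / F) * H (metres)."""
    return ifov_rad(detector_m, focal_length_m) * flying_height_m

# Illustrative values: 10 um detector element, 360 mm focal length, ~817 km orbit
D, F, H = 10e-6, 0.360, 817e3
print(f"IFOV = {ifov_rad(D, F) * 1e3:.4f} mrad")  # ~0.0278 mrad
print(f"GRE  = {gre_m(D, F, H):.1f} m")           # ~22.7 m
```

The result is close to the 23.5 m resolution quoted for LISS-3, the small difference coming from the rounded values assumed here.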
Fig. 5 IFOV (angular aperture of the sensor/radiometer, in mrad)
Resolution
Resolution is defined as the ability of the system to resolve the smallest discretely
separable quantity in terms of distance (spatial), wavelength band of EMR (spectral),
time (temporal) and/or radiation quantity (radiometric).
Spatial Resolution
Spatial resolution is the projection of a detector element or slit onto the ground; in
other words, a scanner's spatial resolution is the ground segment sensed at any instant.
It is also called the ground resolution element (GRE):
Ground Resolution = H x IFOV
The spatial resolution at which data are acquired has two effects: the ability to
identify various features and the ability to quantify their extent. The former relates to
classification accuracy and the latter to the ability to make accurate measurements. One
important aspect of classification accuracy is the contribution of boundary pixels: as
the resolution improves, the number of pure interior pixels of a feature increases
relative to boundary pixels, so the boundary error is reduced.
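The shrinking contribution of boundary pixels can be illustrated numerically. An idealized sketch (a square field perfectly aligned with the pixel grid is an assumption made for simplicity):

```python
def boundary_fraction(field_size_m, pixel_m):
    """Fraction of a square field's pixels that lie on its boundary."""
    n = round(field_size_m / pixel_m)  # pixels per side (assumes grid alignment)
    total = n * n
    interior = max(n - 2, 0) ** 2      # pure pixels fully inside the field
    return (total - interior) / total

# A 900 m x 900 m field imaged at progressively finer resolutions:
for p in (90, 30, 10):
    print(f"{p:>3} m pixels: {boundary_fraction(900, p):.0%} boundary")
```

At 90 m pixels more than a third of the field's pixels are mixed boundary pixels; at 10 m only a few percent are, so both classification and area estimation improve.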
The accuracy of measurement of an area depends upon the accuracy of locating its
boundary. Since a boundary cannot be located with an accuracy better than a fraction of
a pixel, the larger the pixel size, the greater the error in the area estimate.
Images where only large features are visible are said to have coarse or low resolution. In
fine resolution images, small objects can be detected.
Spectral Resolution
Spectral reflectance and emissivity curves characterize the reflectance and/or emittance
of a feature or target over a range of wavelengths. Different classes of features and
details in an image can be distinguished by comparing their responses over distinct
wavelength ranges. Broad classes such as water and vegetation can be separated using
broad wavelength ranges (VIS, NIR), whereas specific classes such as rock types require
comparison of fine wavelength ranges to separate them. Spectral resolution therefore
describes the ability of the sensor to define fine wavelength intervals, i.e. to sample
the spatially segmented image in different spectral intervals, thereby allowing the
spectral irradiance of the image to be determined.
The selection of spectral band location primarily depends on the feature characteristics
and atmospheric absorption.
Radiometric Resolution
This is a measure of the ability of the sensor to differentiate the smallest change in
spectral reflectance/emittance between various targets. It is normally defined as the
noise-equivalent reflectance change (NEΔρ) or noise-equivalent temperature difference
(NEΔT).
The radiometric resolution depends on the saturation radiance and the number of
quantisation levels. Thus, a sensor whose saturation is set at 100% reflectance with
8-bit quantisation will have a poorer radiometric sensitivity than a sensor whose
saturation radiance is set at 20% reflectance with 7-bit digitization.
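The comparison in this paragraph can be made concrete by computing the reflectance change represented by one quantisation level. A simplified sketch assuming a linear quantizer:

```python
def reflectance_step(saturation_pct, bits):
    """Smallest distinguishable reflectance change for a linear quantizer."""
    levels = 2 ** bits
    return saturation_pct / (levels - 1)

# The two sensors from the example above:
step_a = reflectance_step(100, 8)  # saturation at 100% reflectance, 8 bits
step_b = reflectance_step(20, 7)   # saturation at 20% reflectance, 7 bits
print(f"sensor A step: {step_a:.3f}% per level")  # ~0.392%
print(f"sensor B step: {step_b:.3f}% per level")  # ~0.157%
```

Despite having fewer bits, sensor B spreads them over a narrower radiance range and so resolves a smaller reflectance change per level.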
Temporal Resolution
Temporal resolution is the frequency at which spatial and spectral data are obtained over
the same area. It is also called the repetitivity of the satellite: the capability of the
satellite to image the exact same area, at the same viewing angle, at different periods
of time. The temporal resolution of a sensor depends on a variety of factors, including
the satellite/sensor capabilities, the swath overlap and the latitude. It is an important
aspect of remote sensing when
- persistent cloud offers limited clear views of the earth's surface
- short-lived phenomena need to be imaged (floods, oil slicks etc.)
- multi-temporal comparisons are required (agricultural applications)
- the changing appearance of a feature over time can be used to distinguish it from
  near-similar features (wheat/maize)
Cameras, used for aerial photography, are the simplest and oldest sensors used for remote
sensing of the Earth's surface. Cameras are framing systems (Figure 7a), which acquire a
near-instantaneous "snapshot" of an area of the Earth's surface. Camera systems are
passive optical sensors that use a lens (or system of lenses, collectively referred to as
the optics) to form an image at the focal plane, the plane at which the image is sharply
defined.
Many electronic (as opposed to photographic) remote sensors acquire data using
scanning systems, which employ a sensor with a narrow field of view that sweeps over
the terrain to build up and produce a two-dimensional image of the surface. Scanning
systems can be used on both aircraft and satellite platforms and have essentially the
same operating principles. A scanning system used to collect data over a variety of
different wavelength ranges is called a multispectral scanner (MSS), and is the most
commonly used scanning system. There are two main modes or methods of scanning
employed to acquire multispectral image data - across-track scanning, and along-track
scanning.
Fig. 7 (a) Camera (aerial photography) - analogue recording; (b) Whiskbroom scanner -
digital recording; (c) Pushbroom scanner - digital recording
is measured independently. Each detector is designed to have its peak spectral
sensitivity in a specific wavelength band.
The electrical signals generated by each of the detectors of the MSS are amplified by the
system electronics and recorded by a multi-channel tape recorder. Usually, on board
signal conversion is used to record the data digitally for subsequent computer
processing on the ground. Subsets of the data can also be viewed in-flight on a monitor
to verify flight line coverage and to provide a real time interpretation capability of the
scene being recorded.
Along-Track Multispectral Scanning
As with across-track systems, along track or push broom scanners record multispectral
image data along a swath beneath an aircraft. Also similar is the use of the forward
motion of the aircraft to build up a two-dimensional image by recording successive scan
lines that are oriented at right angles to the flight direction. However, there is a distinct
difference between along-track and across-track systems in the manner in which each
scan line is recorded. In an along-track system there is no scanning mirror. Instead, a
linear array of detectors is used to "scan" in the direction parallel to the flight line
(Figure 7c). Linear arrays normally consist of numerous charge-coupled devices (CCDs)
positioned end to end. As illustrated in Figure 7c each detector element is dedicated to
sensing the energy in a single ground resolution cell along any given scan line. The data
for each scan line are electronically compiled by sampling each element along the array
(eliminating the need for a scanning mirror).
The size of the detectors comprising a linear array determines the size of each ground
resolution cell. Hence, CCDs are designed to be very small and a single array may
contain over 10,000 individual detectors. Each spectral band, or channel, of sensing
requires its own linear array. Normally, the arrays are located in the focal plane of the
scanner such that all scan lines are viewed by all arrays simultaneously.
Linear array systems afford a number of advantages over mirror scanning systems. First,
linear arrays provide the opportunity for each detector to have a longer dwell time, or
residence time, to measure the energy from each ground resolution cell. This enables a
much stronger signal to be recorded and a greater range in the levels of signal that can
be sensed. This leads to better spatial and radiometric resolution. In addition, the
geometric integrity of linear array systems is greater because of the fixed relationship
among detector elements recording each scan line. The geometry along each scan line
is similar to that characterizing an aerial mapping camera. Because CCDs are solid-state
microelectronics devices, they are generally smaller in size and weight and require less
power for their operation. Having no moving parts, a linear array system has higher
reliability and longer life expectancy. (Due to such advantages, CCDs are used
extensively in satellite remote sensing systems.)
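The dwell-time advantage described above can be quantified with a simple model. A sketch with illustrative values (30 m pixels, a nominal 6.8 km/s ground speed, a single detector per band for the whiskbroom case; real systems use several detectors per sweep, so this overstates the contrast somewhat):

```python
def line_time_s(pixel_size_m, ground_speed_m_s):
    """Time available per scan line: one pixel of along-track motion."""
    return pixel_size_m / ground_speed_m_s

def dwell_whiskbroom(pixel_size_m, ground_speed_m_s, pixels_per_line):
    """A single sweeping detector shares the line time across all pixels."""
    return line_time_s(pixel_size_m, ground_speed_m_s) / pixels_per_line

def dwell_pushbroom(pixel_size_m, ground_speed_m_s):
    """Each linear-array element stares at its cell for the whole line time."""
    return line_time_s(pixel_size_m, ground_speed_m_s)

wb = dwell_whiskbroom(30, 6800, 6000)  # 6000 pixels across the swath
pb = dwell_pushbroom(30, 6800)
print(f"whiskbroom dwell: {wb * 1e6:.2f} us")  # ~0.74 us
print(f"pushbroom dwell:  {pb * 1e3:.2f} ms")  # ~4.41 ms
```

Under these assumptions the pushbroom detector integrates signal for as many times longer as there are pixels in a line, which is the source of its better signal-to-noise and radiometric resolution.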
One disadvantage of pushbroom systems is the need to calibrate many more detectors.
Another current limitation of commercially available CCDs is their relatively limited
range of spectral sensitivity: charge-coupled detectors sensitive to wavelengths longer
than the near-IR are not readily available. However, detectors capable of operating at
longer wavelengths are under development.
Optical Sensors
Data products obtained from various scanner/detector/recorder combinations, in analogue
or digital form, fall in this class. Scanner systems working beyond the visible and
near-infrared range of the electromagnetic spectrum, in the thermal and microwave
(RADAR) regions, are all non-photographic systems. Such data are collected by the sensor
system on the satellite and transmitted to earth, where they are received and recorded at
a ground station.
Thermal Scanners
Many multispectral (MSS) systems sense radiation in the thermal infrared as well as the
visible and reflected-infrared portions of the spectrum. However, remote sensing of
emitted thermal energy differs from the sensing of reflected energy. Thermal sensors use
photo detectors, sensitive to the direct contact of photons on their surface, to detect
emitted thermal radiation. The detectors are cooled to temperatures close to absolute
zero in order to limit their own thermal emissions. Thermal sensors essentially measure
the surface temperature and thermal properties of targets.
Thermal imagers are typically across-track scanners that detect emitted radiation in only
the thermal portion of the spectrum. Thermal sensors employ one or more internal
temperature references for comparison with the detected radiation, so they can be
related to absolute radiant temperature. The data are generally recorded on film and/or
magnetic tape, and the temperature resolution of current sensors can reach 0.1 °C. For
analysis, an image of relative radiant temperatures is depicted in grey levels, with
warmer temperatures shown in light tones, and cooler temperatures in dark tones.
In a thermal image, the tone of an object is a function of its surface temperature and its
emissivity. Of these parameters, the surface temperature is the dominant factor for
producing tonal variations in the scene. All objects emit infrared radiation and the
amount of emitted radiation is a function of surface temperature. Hot bodies appear in
lighter tones in a thermal image and cooler bodies appear darker. The emitted radiation
is collected by a thermal scanner, which works on the principle of the optical-mechanical
scanner; cryogenically cooled detectors are employed to sense the radiation in the 8 to
14 µm wavelength region. Temperature variations of up to one degree centigrade can be
estimated from the thermal imagery.
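The relation between emitted radiance and surface temperature used by such sensors is Planck's law. A sketch showing how a radiance measured at 11 µm (inside the 8-14 µm window) can be inverted to a brightness temperature; assuming a blackbody target, i.e. emissivity 1:

```python
import math

H = 6.62607015e-34  # Planck constant, J s
C = 2.99792458e8    # speed of light, m/s
K = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody, W m^-2 sr^-1 m^-1."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1)

def brightness_temp(wavelength_m, radiance):
    """Invert Planck's law to recover temperature from measured radiance."""
    a = 2 * H * C**2 / wavelength_m**5
    return H * C / (wavelength_m * K * math.log(1 + a / radiance))

# Round trip for a 300 K surface observed at 11 um:
wl = 11e-6
L_meas = planck_radiance(wl, 300.0)
print(f"recovered T = {brightness_temp(wl, L_meas):.2f} K")  # 300.00 K
```

For a real surface the emissivity (below 1) must also be accounted for, which is why tone in a thermal image depends on both temperature and emissivity, as noted above.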
Table 1 Thermal sensors

                                     HCMM               TM
Operational period                   1978-1980          1982 to present
Orbital altitude                     620 km             705 km
Image coverage                       700 by 700 km      185 by 170 km
Acquisition time, day                1:30 p.m.          10:30 a.m.
Acquisition time, night              2:30 a.m.          9:30 p.m.
Visible and reflected IR detectors
  Number of bands                    1                  6
  Spectral range                     0.50 - 1.1 µm      0.4 - 2.35 µm
  Ground resolution cell             500 by 500 m       30 by 30 m
Thermal IR detector
  Spectral range                     10.5 - 12.5 µm     10.5 - 12.5 µm
  Ground resolution cell             600 by 600 m       120 by 120 m
Table 2: Microwave Sensors
Satellite Missions
Today more than ten Earth observation (E.O.) satellites provide imagery that can be used
in various applications. The list below also includes some failed as well as planned
missions, together with the agencies responsible for the distribution and trading of the
data internationally.
Table-3 Operational Earth Observation Satellites
EUROPE MIDDLE NORTH AMERICA ASIA
EAST
France ESA Israel USA Canada India Japan
SPOT1-86 LANDSAT5-85
S 10m 30m
A SPOT2-90 ERS1- LANDSAT6-93
T 10m 92/00
E radar
LL SPOT3- ERS2-95 EARLYBIRD-98 IKONOS1-99 RADARSAT- IRS1C-95
I 93/96 radar 1m 95 6m
T SPOT4-98 ENVISAT- LANDSAT7-99 IKONOS2-99 IRS1D-97
E 10m 2001 15m 1m 6m
S Radar
EROS A/1- QUICKBIRD-01 ORBVIEW-01 IRS P6-2003
00 2m 0.6m 1m 5.8m
SPOT5-02 EROS B/1- ORBVIEW-02 RADARSAT- CARTOSAT- ALOS-
3m+HRS10 02 1m 1m 03 1& 2 03
2.5m/80cm 2.5m
Distribution
SPOT Miscellaneo Imagesat SI-EOSAT, Earthwatch, RADARSAT NRSA- Jaxa
IMAGING us Orbimage, USGS EOSAT
After more than two decades of success, the Landsat program suffered its first
unsuccessful mission with the launch failure of Landsat-6 on October 5, 1993. The sensor
on board was the Enhanced Thematic Mapper (ETM). To provide continuity with Landsat-4
and -5, the ETM incorporated the same seven spectral bands and the same spatial
resolutions as the TM. The ETM's major improvement over the TM was the addition of an
eighth, panchromatic band operating in the 0.50 to 0.90 µm range with a spatial
resolution of 15 m. Landsat-7 was planned with two sensors: the Enhanced Thematic Mapper
Plus (ETM+) and the High Resolution Multispectral Stereo Imager (HRMSI).
Table-4 Characteristics of Landsat-1 to -8 Missions

Satellite   Launched           Decommissioned    RBV bands               MSS bands        TM bands
Landsat-1   July 23, 1972      Jan. 6, 1978      1, 2, 3 (simultaneous   4, 5, 6, 7       -
                                                 images)
Landsat-2   Jan. 22, 1975      Feb. 25, 1982     1, 2, 3 (simultaneous   4, 5, 6, 7       -
                                                 images)
Landsat-3   March 5, 1978      Mar. 31, 1983     A, B, C, D (one band,   4, 5, 6, 7, 8*   -
                                                 side by side images)
Landsat-4   July 16, 1982      -                 -                       -                1, 2, 3, 4, 5, 6, 7
Landsat-5   March 1, 1984      (same as Landsat-4)
Landsat-6   Oct. 5, 1993       LAUNCH FAILURE
Landsat-7   April 15, 1999     -                 -                       -                1, 2, 3, 4, 5, 6, 7, 8
Landsat-8   February 11, 2013  OLI bands 1-11
Table-5 Orbital characteristics of Landsat series satellites
Sensors
(i) Multispectral Scanner (MSS) used in Landsat series satellites
Multispectral scanner (Optical Mechanical Scanner) onboard Landsat series of satellites
of U.S.A. (L1, L2, L3, L4 & L5) gives line scan type imagery using an oscillating mirror to
continuously scan the earth surface perpendicular to the spacecraft velocity. Six lines
are scanned simultaneously in each of the four spectral bands for each mirror sweep.
Spacecraft motion provides the along-track progression of the scan lines. Radiation is
sensed simultaneously by an array of six detectors in each of four spectral bands from
0.5 to 1.1 µm. The detectors' outputs are sampled, encoded and formatted into a
continuous digital data stream.
(ii) Return Beam Vidicon (RBV) used in Landsat series satellites
The Return Beam Vidicon, onboard Landsat 1, 2 & 3, is a camera system which operates by
shuttering three independent cameras (two in the case of L3) simultaneously, each sensing
a different spectral band in the range of 0.48 to 0.83 µm. The ground scene viewed (185
km x 185 km) is stored on the photosensitive surface of the camera tube and, after
shuttering, the image is scanned by an electron beam to produce a video signal output. In
order to produce overlapping images along the direction of spacecraft motion, the
cameras are re-shuttered every 25 seconds.
(iii) Thematic Mapper (TM) used in Landsat series satellites
Landsat 4 & 5 carry a new payload called the Thematic Mapper (TM), with 7 spectral bands
and a ground resolution of 30 metres. This is in addition to the MSS payload, which is
identical to that carried onboard Landsat 1 & 2, and replaces the RBV payload. TM is also
an optical-mechanical scanner, similar to MSS; however, being a second-generation line
scanning sensor, it offers better performance in terms of (i) improved pointing accuracy
and stability, (ii) higher resolution, (iii) new and more numerous spectral bands, (iv)
16-day repetitive coverage, (v) high scanning efficiency using bi-directional scanning
and (vi) increased quantization levels. For achieving the bi-directional scanning, a scan
line corrector (SLC) is introduced between the telescope and the focal plane. The SLC
ensures parallel lines of scanning in the forward and reverse directions.
Table-6 Sensor characteristics of Landsat series satellites
Landsat 8
Spectral Band                          Wavelength          Resolution
Band 1 - Coastal / Aerosol             0.433 - 0.453 µm    30 m
Band 2 - Blue                          0.450 - 0.515 µm    30 m
Band 3 - Green                         0.525 - 0.600 µm    30 m
Band 4 - Red                           0.630 - 0.680 µm    30 m
Band 5 - Near Infrared                 0.845 - 0.885 µm    30 m
Band 6 - Short Wavelength Infrared     1.560 - 1.660 µm    30 m
Band 7 - Short Wavelength Infrared     2.100 - 2.300 µm    30 m
Band 8 - Panchromatic                  0.500 - 0.680 µm    15 m
Band 9 - Cirrus                        1.360 - 1.390 µm    30 m
to 0.73 µm) with the red band of these systems (0.61 to 0.68 µm). This band is used to
produce both 10 m black-and-white images and 20 m multispectral data. Another change in
SPOT-4 is the addition of a separate wide-field-of-view sensor called the Vegetation
Monitoring Instrument (VMI).
Altitude (km)                        832
Orbital period (min.)                101
Inclination (degrees)                98.7
Equatorial crossing time             10.30 AM (local sun time)
Sensors                              HRV
Temporal resolution (repetitivity)   26 days
Stereo viewing capability            5 days
Swath (km)                           60
Resolution                           20 m MLA (multispectral), 10 m PLA (panchromatic)
Sensor characteristics of SPOT series satellites

SPOT-5 (May 2002)
  HRG  Multispectral (4 bands): 0.5-0.59 µm (green) 10 m; 0.61-0.68 µm (red) 10 m;
       0.79-0.89 µm (NIR) 10 m; 1.58-1.75 µm (SWIR) 20 m; swath 60 km; revisit 26 days
  HRG  Pan (1 band): 0.61-0.68 µm; 5 m, combined to generate a 2.5-metre product;
       swath 60 km
  HRS  Pan (1 band): 0.61-0.68 µm; 10 m (resampled at every 5 m along track);
       swath 60 km
  VMI  Multispectral (4 bands): same as SPOT-4; resolution 1000 m

SPOT-4 (March 24, 1998)
  HRV  Multispectral (4 bands): 0.5-0.59 µm (green); 0.61-0.68 µm (red); 0.79-0.89 µm
       (NIR); 1.58-1.75 µm (SWIR); 20 m; swath 60 km; revisit 26 days
  HRV  Pan (1 band): 0.61-0.68 µm; 10 m; swath 60 km
  VMI  Multispectral (4 bands): 0.43-0.47 µm (blue); 0.61-0.68 µm (red); 0.78-0.89 µm
       (NIR); 1.58-1.75 µm (SWIR); resolution 1000 m

SPOT-2 & SPOT-3 (1990 & 1993)
  HRV  Multispectral (3 bands): 0.5-0.59 µm (green); 0.61-0.68 µm (red); 0.79-0.89 µm
       (NIR); 20 m; swath 60 km; revisit 26 days
  HRV  Pan (1 band): 0.51-0.73 µm; 10 m; swath 60 km

SPOT-1 (1986)
  HRV  Same as SPOT-2: Multispectral 20 m, Pan 10 m; swath 60 km; revisit 26 days
Sensors
Table-9 Orbital characteristics of IRS series satellites

Satellite Name          Launch                 Sensors            Types            No. of  Resolution     Swath          Revisit
                                                                                   Bands   (meters)       Width (km)     Time
Mangalyaan (Mars        5 November 2013 /
Orbiter Mission)        24 September 2014
SARAL                   25 February 2013       Argos, AltiKa      Altimeters
RISAT-1                 26 April 2012          SAR (C band)       Active Radar             1-50 m                        25 days
Megha-Tropiques         12 October 2011        MADRAS, SAPHIR,    Microwave
                                               ScaRaB, ROSA       radiometer
IRS P6 (Resourcesat 1)  17 Oct, 2003           LISS-III           Multispectral    4       23.5           142            24 days
                                               LISS-IV            Multispectral    3       5.8            23.9 (MX       24 days
                                                                                                          mode), 70
                                                                                                          (PAN mode)
IRS-P4 (Oceansat)       26 May, 1999           OCM                Multispectral    8       360 m          1420 km        2 days
                                               MSMR               Microwave        4       120, 80, 40    1360 km
                                                                  radiometer               and 40 km
IRS-1C/1D                                      LISS-III (SWIR)    Multispectral    1       70             148
                                               PAN                PAN              1       6              70
Sensors
(i) Linear Imaging Self Scanning (LISS) Camera used in IRS-1A & B
The Indian Remote Sensing Satellite IRS-1A, fully designed and fabricated by the Indian
Space Research Organisation (ISRO), was launched on 17th March 1988 by a Russian
launcher. It has four spectral bands in the range 0.45 to 0.86 µm (0.45 to 0.52 µm, 0.52
to 0.59 µm, 0.62 to 0.68 µm and 0.77 to 0.86 µm) in the visible and near-infrared, with
two different spatial resolutions of 72.5 m and 36.25 m from one LISS-1 camera and two
LISS-2 cameras respectively. It provides repetitive coverage every 22 days. Like the
LANDSAT/SPOT missions, which are designed for global coverage, IRS is also in a
sun-synchronous, polar orbit, at about 900 km altitude, and covers a swath of 148 km on
the ground. Like SPOT, it uses linear array (CCD) detectors.
(ii) Linear Imaging Self Scanning-3 Camera (LISS-3)
This camera is configured to provide imagery in three visible bands as well as in a
shortwave infrared band. The resolution and swath for the visible bands are 23.5 m and
142 km respectively. The detector is a 6000-element CCD-based linear array with a pixel
dimension of 10 µm by 7 µm. The detector is placed at the focus of a refractive optical
system consisting of eight lens elements, which provides a focal length of 360 mm.
The processing of the analogue output video signal is similar to that of PAN. For this
camera, 7-bit digitization is used, which gives an intensity variation of 128 levels.
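Digitized counts like these are usually converted back to physical radiance with a linear calibration. A sketch of that conversion; the calibration radiances `l_min` and `l_max` below are hypothetical values for illustration, not LISS-3 calibration constants:

```python
def dn_to_radiance(dn, bits, l_min, l_max):
    """Linear conversion of a digital number (DN) to at-sensor radiance."""
    levels = 2 ** bits
    if not 0 <= dn < levels:
        raise ValueError(f"DN must be in [0, {levels - 1}]")
    return l_min + (l_max - l_min) * dn / (levels - 1)

# 7-bit digitization gives 2**7 = 128 levels (DN 0..127).
# l_min / l_max are hypothetical calibration radiances (W m^-2 sr^-1 um^-1).
print(dn_to_radiance(0, 7, 0.0, 24.0))    # 0.0  (darkest level)
print(dn_to_radiance(127, 7, 0.0, 24.0))  # 24.0 (saturation level)
```

The same formula applies to the 10-bit (1024-level) data of later sensors mentioned below; only `bits` changes.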
Table - 12 Characteristics of LISS-3

Band 2                                        0.52-0.59 µm
Band 3                                        0.62-0.68 µm
Band 4                                        0.77-0.86 µm
Band 5                                        1.55-1.70 µm
Geometric resolution                          23.5 m for bands 2, 3, 4
                                              70.5 m for band 5
Equivalent focal length (bands 2,3,4/band 5)  347.5 mm / 301.2 mm
Swath                                         141 km for bands 2, 3, 4
                                              148 km for band 5
Radiometric resolution                        7 bits (10 bits in Resourcesat-2)
Band-to-band registration                     0.25 pixel
The PAN payload, with its capability to tilt ±26°, can view (revisit) any particular
scene once in 5 days, if required. Additionally, this provision can be used for obtaining
stereo pairs of imagery. The tilting capability is achieved by steering the camera as a
whole by the required angle, using a steering mechanism to which the PAN camera lugs are
fixed.
Table - 13 Characteristics of PAN camera

                                0.3° (along track)
Spectral band                   0.5-0.75 µm
Band 3                          0.62-0.68 µm
Band 4                          0.77-0.86 µm
Resolution                      188.3 m
Swath                           810 km
Radiometric resolution          7 bits
Band-to-band registration       0.25 pixel
The LISS-IV sensor onboard IRS-P6 operates in three spectral bands in the visible and
near-infrared (VNIR), or in PAN mode, with 5.8 metre spatial resolution.
The LISS-IV sensor can be operated in either of two modes (in IRS-P6):
- In multispectral mode (Mx), LISS-IV covers a swath of 23 km (selectable out of the 70
  km total swath) in all three bands.
- In mono mode (Mono), the full swath of 70 km is covered in any one single band,
  selectable by ground command (nominally B3, the red band).
IGFOV              5.8 m at nadir
Spectral bands     B2: 0.52-0.59 µm
                   B3: 0.62-0.68 µm
                   B4: 0.77-0.86 µm
Swath              23.9 km (multispectral mode in P6; 70 km in Resourcesat-2)
Integration time   0.877714 msec
Quantization       10 bits
No. of gains       Single gain (dynamic range obtained by sliding 7 bits out of 10 bits)
MSMR, which operates at four microwave frequencies in both vertical and horizontal
polarisations, is used to collect data on sea surface temperature, wind speed, cloud
water content and water vapour content in the atmosphere above the ocean.
Radiometric Resolution: Data is collected as 11 bits per pixel (2048 gray tones).
Timings of collecting/receiving IKONOS data and the satellite orbit characteristics vary
considerably depending on the accuracy of the product, its extent and area. The
applications of these data are boundless: in particular, they are used for large-scale
mapping, for creating precise height models, e.g. for micro-cellular radio, and for every
application requiring the utmost detail in areas that are inaccessible to aerial
photography.
Meteorological Satellites
Designed specifically to assist in weather prediction and monitoring, meteorological
satellites, or meteosats, generally incorporate sensors that have very coarse spatial
resolution compared to land oriented systems. On the other hand, meteosats afford the
advantages of global coverage at very high temporal resolution. Accordingly, meteosat
data have been shown to be useful in natural resource applications where frequent,
large area mapping is required and fine detail is not. Apart from the advantage of
depicting large areas at high temporal resolution, the coarse spatial resolution of
meteosats also greatly reduces the volume of data to be processed for a particular
application.
Numerous countries have launched various types of meteosats with a range of orbit and
sensing system designs, e.g. the NOAA series (operated by the U.S. and named after the
National Oceanic and Atmospheric Administration), which have near-polar, sun-synchronous
orbits. In contrast, the GOES and INSAT series satellites are in geostationary orbits.
India's INSAT series satellites are combined telecommunication and meteorological
satellites.
INSAT Series
INSAT satellites are basically communication satellites used for telecommunication and
broadcasting, which also carry a meteorological sensor for weather monitoring. These
satellites are used in day-to-day weather forecasting, cyclone monitoring etc. The sensor
is the Very High Resolution Radiometer (VHRR). Among this series, the most powerful
satellite was INSAT-2C, launched from French Guiana in December 1995, weighing 2070 kg,
into a geostationary orbit. This satellite heralded a new era in telecommunication by
introducing mobile phones. The details are given below.
Table-19 Orbital characteristics of INSAT series satellites

Altitude              36000 km
Nature                Geostationary
Repetitive coverage   3 hr
Sensor                VHRR
Resolution            2.75 km
Spectral bands        0.55 - 0.75 µm
                      10.5 - 12.5 µm
Megha-Tropiques
ISRO and French National Space Centre (CNES) signed a Memorandum of Understanding
(MOU) in 2004-05 for the development and implementation of Megha-Tropiques
(Megha meaning cloud in Sanskrit and Tropiques meaning tropics in French). The launch
of Megha-Tropiques is planned during the fourth quarter of 2010.
Megha-Tropiques is aimed at understanding the life cycle of convective systems and their
role in the associated energy and moisture budget of the atmosphere in the tropical
regions. The satellite will carry an imaging radiometer, the Microwave Analysis and
Detection of Rain and Atmospheric Structures (MADRAS); a six-channel humidity sounder
(SAPHIR); a four-channel Scanner for Radiation Budget measurement (SCARAB); and a GPS
Radio Occultation System (GPS-ROS).
SARAL
The Satellite for ARGOS and ALTIKA (SARAL) is a joint ISRO-CNES mission, planned to be
launched during 2011. The payload, provided by CNES, consists of the Ka-band radar
altimeter ALTIKA, operating at 35.75 GHz. A dual-frequency, total-power type microwave
radiometer (23.8 and 37 GHz) is embedded in the altimeter to correct for tropospheric
effects on the altimeter measurement. Doppler Orbitography and Radio-positioning
Integrated by Satellite (DORIS) on board enables precise determination of the orbit. A
Laser Retroreflector Array (LRA) helps calibrate the precise orbit determination system
and the altimeter system several times throughout the mission.
ASTROSAT
ASTROSAT is the first dedicated Indian astronomy satellite mission, which will enable
simultaneous multi-wavelength observations of celestial bodies and cosmic sources in
the X-ray and UV spectral bands. The scientific payloads cover the visible (3500-6000
Å), UV (1300-3000 Å), and soft and hard X-ray regimes (0.5-8 keV; 3-80 keV). The
uniqueness of ASTROSAT lies in its wide spectral coverage extending over the visible,
UV, soft X-ray and hard X-ray regions.
Remote sensing data products can be either standard or value-added/special products.
Standard products are generated after applying radiometric and geometric corrections.
Special/value-added products are generated by further processing the standard
products through mosaicking/merging/extraction/enhancement of the data.
The raw data recorded at the earth station is corrected to various levels of processing
at the data processing systems:
Level 1: Radiometrically corrected, and geometrically corrected only for
earth rotation (Browse product)
Level 2: Both radiometrically and geometrically corrected
(Standard product)
Level 3: Special processing such as merging, enhancement etc. after Level-2
corrections (Special product)
Precision product
Value-added product, e.g. vegetation index map, digital terrain model
Radiometric distortions
Non-uniform response of the detectors
Specific detector element failure
Data loss during data communication or archival/retrieval
Narrow dynamic range
Image-to-image variations
Geometric distortions
Scene related
Sensor related
Spacecraft related
Multi-image mosaicking
Map projection
Geocoded correction (true north rotation)
Data dissemination
The data are recorded on Digital Linear Tapes (DLTs), CD-ROMs or DVDs, depending on
the mission, and archived for providing data products to users as and when orders are
received.
Satellite data products are available on photographic and digital media. Photographic
products can be supplied as films or prints. Digital products can be supplied in the form
of CD-ROMs or DVDs, or can be downloaded through FTP services.
Generally, single-band data, such as PAN data or one band from a multi-spectral
sensor, is provided in black and white (B/W). Similarly, photographic colour products,
called False Colour Composites (FCC), can be provided for multi-spectral data. The
output scale for prints can vary from 1:1 M to 1:5000.
GeoTIFF - Gray scale (from IRS-1C onwards, except NOAA, AQUA and TERRA)
GeoTIFF - RGB single-band FCC or NCC (from IRS-1C onwards, except NOAA, AQUA
and TERRA)
HDF (AQUA, TERRA and OCEANSAT-2)
The digital data format document is provided along with the digital data.
PATH
An orbit is the course of motion taken by the satellite in space and the ground trace of
the orbit is called a 'Path'.
e.g. IRS-1C (source: NRSC)
In a 24 day cycle, the satellite completes 341 orbits with an orbital period of 101.35
minutes. This way, the satellite completes approximately 14 orbits per day. Though the
number of orbits and paths is the same, the designated path number in the
referencing scheme and the orbit number are not the same.
On day one (D1), the satellite covers orbit numbers 1 to 14, which as per the referencing
scheme will be path numbers 1, 318, 294, 270, 246, 222, 198, 174, 150, 126, 102, 78, 54
and 30, assuming that the cycle starts with path 1.
So orbit 1 corresponds to path 1, orbit 2 to path 318, orbit 3 to path 294, etc. Path
number 1 is assigned to the track which is at 29.7 deg West longitude. The gap
between successive paths is 1.055 deg. All subsequent orbits fall westward. Due to the
limitation of the antenna drive speed, it is difficult to track the satellite near zenith; if a
pass occurs above 86 deg elevation, the data may be lost for a few seconds. To
minimize this loss, path 1 is positioned in such a manner that the data reception
station lies exactly between two nominal paths, namely 99 and 100.
ROW
The lines joining the corresponding scene centers of different paths are parallel to the
equator and are called Rows.
Along a path, the continuous stream of data is segmented into a number of scenes,
framed in such a manner that the centre of one scene lies on the equator, which is
taken as the reference line for segmentation.
e.g. LISS-III (source NRSC)
A LISS-III scene, consisting of 6000 lines, is framed such that the centre of the scene
lies on the equator. The next scene is defined such that its centre lies exactly 5,703
lines from the equator, the centre of each subsequent scene a further 5,703 lines
northwards, and so on. This is continued up to 81 deg North latitude. The scene
centres are uniformly separated, so that the same rows of different paths fall at the
same latitude. Row number 1 falls around 81 deg North latitude, row number 41 is
near 40 deg North, and the row number of the scene lying on the equator is 75. The
Indian region is covered by row numbers 30 to 90 and path numbers 65 to 130.
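The row framing can likewise be sketched directly from the figures given above: scene
centres 5,703 lines apart, the equatorial scene numbered row 75, and row numbers
decreasing northwards. This is an illustrative sketch under those assumptions only; the
function names are not from any NRSC specification.

```python
# Sketch of the LISS-III row/scene framing described above.
# Assumptions (from the text): scene centres spaced 5703 scan lines
# apart, equator scene = row 75, row numbers decrease northwards.
LINES_PER_ROW = 5703
EQUATOR_ROW = 75

def line_offset_from_equator(row: int) -> int:
    """Scan-line offset of a scene centre from the equator
    (positive = north of the equator)."""
    return (EQUATOR_ROW - row) * LINES_PER_ROW

def row_for_offset(offset_lines: int) -> int:
    """Row number of the scene whose centre is nearest the given
    scan-line offset from the equator."""
    return EQUATOR_ROW - round(offset_lines / LINES_PER_ROW)

print(line_offset_from_equator(75))   # 0   (scene centred on the equator)
print(line_offset_from_equator(74))   # 5703 (first scene north of the equator)
print(row_for_offset(74 * 5703))      # 1   (74 rows north, near 81 deg N)
```

Note that rows 1 through 74 lie north of the equator, consistent with row 1 falling
around 81 deg North as stated above.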
Scene Definition
The camera scans the ground track line by line continuously. The satellite motion along
the track provides continuous imaging of the ground. This continuous stream of data is
segmented into convenient sizes; these segments are called scenes.
References:
1. Campbell, John B., 1996. Introduction to Remote Sensing. Taylor & Francis.
2. Curran, P.J., 1985. Principles of Remote Sensing. Longman Group Limited,
London, 282 pp.
3. Elachi, C., 1987. Introduction to the Physics and Techniques of Remote Sensing.
Wiley Series in Remote Sensing, New York, 412 pp.
4. Sabins, Floyd F. Remote Sensing: Principles and Image Interpretation.
5. Joseph, George, 1996. Imaging Sensors. Remote Sensing Reviews.
6. Lillesand, Thomas M. & Kiefer, Ralph W., 1993. Remote Sensing and Image
Interpretation, Third Edition. John Wiley.
7. Manual of Remote Sensing, Third Edition. American Society of Photogrammetry
and Remote Sensing.
8. http://www.ccrs.nrcan.gc.ca/ccrs/learn/tutorials/fundam/chapter1/chapter1_2.
9. www.planetary.brown.edu/arc/sensor.html
10. http://www.ersc.edu/resources/EOSC.html
11. www.spaceimage.com
12. www.eospso.gfc.nasa.gov
13. www.landsat.org
14. www.spotimage.fr/home
15. www.space.gc.ca
16. www.esa.int/export/esasa/ESADTOMBAMC_earth_O.html