
BASICS OF TRANSPORTATION AND GIS

UNIT-5
MISCELLANEOUS TOPICS
Visual interpretation of satellite images
Photo interpretation is complemented by the interpretation of other kinds of
imagery which differ from photography. What was once limited to photo
interpretation has now broadened into the wider spectrum of image interpretation.
The data are collected by remote sensing devices, including passive and active
systems, which employ different bands in the visible, near infrared, mid infrared and
far infrared, as well as microwave regions.
In passive remote sensing the reflected or emitted electromagnetic energy is
measured by sensors operating in different spectral bands, where the original
source of energy is the Sun.
In the active remote sensing method, on the other hand, the earth surface is
illuminated by an artificial source of energy. The emitted and reflected energy
detected by the sensors onboard is transmitted to the earth station. The data are
then processed (after various corrections) and made ready for the users.
The analysis of pictorial data can be performed using visual image interpretation
techniques. Visual image interpretation has been applied in many fields including
agriculture, archaeology, conservation, engineering, forestry, geology, geography,
meteorology, military intelligence, natural resource management, oceanography,
soil science, and urban and regional planning.
Competence has been achieved in the use of this technique, more particularly in areas
like land use planning, wasteland mapping, land evaluation, water quality monitoring,
topographic mapping and numerous other areas.
The remote sensing data products are available to the users in (a) photographic
form, such as paper prints, film negatives, and diapositives in black and white and
false colour composite (FCC) on a variety of scales, and
(b) digital form, as computer compatible tapes (CCTs).
Broadly, satellite data products can be classified into different types based on
satellite and sensor, level of preprocessing, and the media.

Precision products are those in which the radiometric and geometric corrections are
refined with the use of ground control points to achieve greater location accuracy.
Data products can be broadly classified into two types depending upon the medium:
photographic and digital.
The photographic products can be either black and white or colour. Further, they
could be either film or paper products, and in films it is possible to have either
negative or positive (diapositive) film. The sizes of photographic products can vary
depending on the enlargement required.
Different types of photographic products supplied by the National Remote Sensing
Agency (NRSA) Data Centre (NDC), Government of India, are standard B/W and FCC products.
The standard products are available in colour and in black and white, in the form of
either negatives or positives.
Paper prints, both B/W and FCC, are supplied at various scales. They are available at
1X (contact prints), 2X (two times enlarged), 4X (four times enlarged) and 5X (five
times enlarged). The scale of the product varies depending upon the enlargement.
The photographic products contain certain details annotated on the margins.
These are useful for identifying the scene, sensor, date of pass, processing level,
band combination, and so on.
Visual image interpretation is the process of identifying what we see on the images
and communicating the information obtained from these images to others for
evaluating its significance.
Levels of interpretation
The image interpretation process can involve various levels of complexity, from a
simple direct recognition of objects in the scene to the inference of site conditions.
An example of this can be a national highway or a major river on the satellite
imagery, more particularly on a false colour composite. If the interpreter has some
experience, the interpretation of these linear features, road and river, may be
straightforward.
On the other hand, the interpreter may have to make use of other image characteristics
in order to infer the appearance of objects on the image.

For example, the interpretation of IRS-1C LISS-III false colour composite imagery for
the identification of the 18 km pipeline from Patancheruvu to Amberpet in Hyderabad
city, which carries the industrial effluents, is an indirect approach.
In this case the actual pipeline cannot be seen, but the changes at the ground
surface caused by the buried pipeline are often visible on the FCC.
The appearance of a light-toned linear streak across the image, similar to that
associated with a highway, leads to the identification and mapping of the pipeline
from the FCC.
Ideally, keys provide a method of organizing the information in a consistent manner and
provide guidance about the correct identification of features or conditions on the images.
A key consists of two basic parts:
(i) A collection of annotated or captioned images (stereopairs) illustrative of the
features or conditions to be identified, and
(ii) A graphic or word description that sets forth in some systematic fashion the image
recognition characteristics of those features or conditions.
PROCESS OF IMAGE INTERPRETATION
Image interpretation or analysis is defined as the act of examining images
for the purpose of identifying objects and judging their significance.
Interpreters study the remotely sensed data and attempt, through a logical
process, to detect, identify, classify, measure, and evaluate the significance of
physical and cultural objects, their patterns and spatial relationships.
The sequence begins with the detection and identification of objects, followed later
by their measurement. The image interpretation has various aspects which have
overlapping functions. These aspects are detection, recognition and
identification, analysis, classification, deduction, and idealization.
The detection is a process of picking out an object or element from photo or
image through interpretation techniques. It may be detection of point or line
or a location, such as, agriculture field and a small settlement.
Recognition and identification is the process of classifying or trying to
distinguish an object by its characteristics or patterns which are familiar on
the image; it is also known as photo reading.

Analysis is a process of resolving or separating a set of objects or features
having a similar set of characters. In analysis, lines of separation are drawn
between a group of objects and the degree of reliability of these lines can also
be indicated.
Classification is a process of identification and grouping of objects or features
resolved by analysis.
Deduction is a process where inferences are drawn about the objects based on
direct or indirect evidence of the information or phenomena under study.
Idealization is a process of drawing ideal or standard representation from
what is actually identified and interpreted from the image or map such as a
set of symbols or colors to be adopted in waste land maps, and geomorphic
landforms.
Basic Elements of Image Interpretation
Aerial photographs and/or satellite imageries are viewed through stereoscopic
instruments to obtain three-dimensional images. There are many photogrammetric
equipments for visual interpretation and for transferring the details on to the base maps.
Image scale is also considered an important element of interpretation, as it
affects the level of useful information that can be extracted from aerial and space images.
Tone or hue refers to the relative brightness or colour of objects on an image.
Different surface objects reflect and emit different amounts of radiant energy, and
these differences are recorded as tonal or colour or density variations on the imagery.
In black and white images, objects appear in different grey tones. The figure
shows a striking pattern of light and dark toned soils where the tonal patterns vary
according to the drainage conditions of the soils.
Size refers to the spatial dimension of the object on the ground. The size of an object
is a function of the scale of the image or photo, and is also measurable. A storage
warehouse, for example, may be misinterpreted as a barn if size is not considered.
Shape refers to the physical form of an object and is also a function of the scale of the
image or photo. Size and shape are interrelated. Shape can be irregular, as for
salt-affected patches or the boundary of undulating uplands, or regular and uniform,
as for snow cover or a glacier.

In the case of stereoscopic images, the object's height also defines the shape. A
chimney can be identified very easily based on its shape. Playgrounds, paddy
fields, and designed monuments are some other examples.
Texture is defined as a repetition of a basic pattern. Texture is the frequency of
tonal change on an image. Texture in the image is due to tonal repetitions of groups
of objects which are often too small to be discernible individually, such as tree
leaves and their shadows. It creates a visual impression of surface coarseness or smoothness.
Texture is the product of the individual shape, size, pattern, shadow and tone of objects.
Texture is dependent on the scale of the image: as the scale is reduced, the texture of
any given object or area becomes progressively finer and ultimately disappears.
Texture can play an important role in distinguishing various features with similar
reflectance. For example, a rocky area is considered to have a coarse texture, whereas
beach sand with ripples is taken as fine texture.
Pattern refers to the spatial arrangement of surface features characteristic of both
natural and man-made objects. Pattern is of several types: linear (road, rail,
canal), non-linear (stream, creeks), contiguous (snow), clustered (settlements),
dispersed (forest blanks, salt-affected patches), orchards, linear cropping and so on.
The scale of a photograph and the resolution of satellite imagery are the terms used to
determine the clarity of the image. Resolution is of two types: spatial and spectral.
The former refers to the picture element or the image of the smallest area resolvable
or identifiable on the ground.
Aspect refers to the direction in which a mountain or hill slope faces, which determines
the amount of sunshine and shadow it receives. Aspect has a marked effect on vegetation,
settlements and cultivation. The geographical site and local objects often provide the
clue for identifying objects and understanding their significance. For example,
salt-affected land is located near sea shore lines, desert plains, blanks, hill slopes,
and snow or glacial mountain peaks.
Association refers to the situation of an object with respect to other, neighbouring
features. For example, canals are associated with agriculture, marshes or swamps with
flood plains and tidal flats, and limestone terrain with Karst topography.
Shadow is a function of the sun's illumination angle, the size and shape of the object,
and the sensor's viewing angle.
Karst topography consists of a number of natural pot holes, called sinkholes.
These pot holes can be identified on the image as they are a surface phenomenon.
Such surface expressions can be detected very easily on a satellite/airborne image
and provide a very good and reliable clue for the identification of limestone terrain.
INTERPRETATION CHARACTERISTICS OF DIGITAL SATELLITE IMAGE.
Visual image interpretation of satellite imagery in general, and of a False Color Composite
(FCC) in particular, for the creation of these thematic maps/layers is based on a
systematic observation and evaluation of certain key elements that are studied
stereoscopically. They are topography, drainage pattern, drainage texture and density,
erosion, image tone, vegetation, and land use.
Topography
Each landform and bedrock type has its characteristic topographic form, including
a typical size and a shape. In practice, we can observe a distinct topographic
change at the boundary between two different landforms.
At the time of interpretation of any stereo pair we can observe that the terrains are
exaggerated in height about three or four times. The larger the base- height ratio,
the greater the vertical exaggeration.
The topography of the terrain may be interpreted in terms of bold, massive, steep
hillsides; dissected, gullied slopes; Karst; and so on. The topography of
horizontally bedded sandstone can be identified as bold, massive, flat-topped hills
or very steep slopes.
Drainage Pattern and Texture
The drainage pattern and texture seen on aerial and space images are indicators of
landform and bedrock type and suggest soil characteristics and site drainage conditions.
The drainage pattern, which is a surface expression and can be discerned from the
airphoto or satellite image, is a clue/key to infer something about the subsurface
phenomena. Fig. 5.9 shows six of the most common drainage patterns that can be
observed on different types of terrains:

Dendritic drainage pattern is a well integrated pattern formed by a main stream with its
tributaries branching and rebranching freely in all directions. This type of drainage
pattern commonly occurs on relatively homogeneous materials, such as, horizontally
bedded sedimentary rocks and granite.
Rectangular drainage patterns are basically dendritic patterns modified by
structural bedrock control such that the tributaries meet the main stream at right angles.
Trellis drainage pattern consists of a number of streams having one dominant direction,
with sub-tributaries at right angles to it. It can be found in areas of folded sedimentary
rocks.
Radial drainage patterns are formed from a central area and are radiated outward from
this central area. All the sub-streams radiate away from a single point. These can be
found on an area full of volcanoes and domes.
Centripetal drainage pattern is the reverse of the radial drainage pattern. It can be
found in the areas of limestone sinkholes, volcanic craters and other depressions.
Deranged drainage pattern is a dis-ordered pattern, irregularly developed and directed
short streams, ponds, wetland areas, and glacial till areas.

The basic character of digital image data is illustrated in Fig. 6.1. Though the image
shown in (a) appears to be a continuous tone photograph, it is actually composed of a
two-dimensional array of discrete picture elements or pixels. The intensity of each pixel
corresponds to the average brightness or radiance measured electronically.
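To make this pixel structure concrete, the following minimal sketch (in Python with NumPy, which the text itself does not prescribe; the DN values are purely illustrative) represents a small digital image as a two-dimensional array and reads one pixel's brightness.

```python
import numpy as np

# A digital image is a two-dimensional array of pixels; each element holds a
# digital number (DN) representing the average brightness of that ground cell.
# The 5 x 5 DN values below are purely illustrative.
image = np.array([
    [34, 36, 35, 90, 92],
    [33, 35, 37, 91, 95],
    [32, 34, 36, 93, 94],
    [31, 33, 35, 92, 96],
    [30, 32, 34, 90, 93],
], dtype=np.uint8)

rows, cols = image.shape      # number of lines and pixels per line
dn = image[2, 3]              # DN of the pixel in row 2, column 3
print(rows, cols, dn)         # -> 5 5 93
```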

Spatial Filtering Techniques
A characteristic of remotely sensed images is a parameter called spatial
frequency, defined as the number of changes in brightness values per unit distance
for any particular part of an image.
If there are few changes in brightness value over a given area, it is termed a
low-frequency area. If the brightness values change dramatically over very
short distances, it is called a high-frequency area.
Algorithms which perform image enhancement in this way are called filters because they
suppress certain frequencies and pass (emphasize) others. Filters that pass high
frequencies, thereby emphasizing fine detail and edges, are called high-frequency (high
pass) filters, and filters that pass low frequencies are called low-frequency (low pass) filters.
Filtering is performed using convolution windows. These windows are called
mask, template, filter or kernel. In the process of filtering, the window is moved
over the input image starting from the extreme top left-hand corner of the scene.
A discrete mathematical function transforms the original input digital number into a
new digital value. The window first moves along a line; as soon as the line is
complete, it restarts at the next line, until the entire image is covered.
The mask window may be rectangular (1 x 3 or 1 x 5 pixels) or square (3 x 3,
5 x 5 or 7 x 7 pixels) in size. Each pixel of the window is given a weightage.
For low pass filters all the weights in the window will be positive, while for high
pass filters the surrounding values may be negative or zero, but the central pixel will be
positive with a higher weightage value.
In the case of a high pass filter the algebraic sum of all the weights in the window
will be equal to zero or one, depending on the kernel used.
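The following sketch illustrates the moving-window convolution described above. It is written in Python with NumPy as an assumption rather than a prescription of the text, and the particular kernels shown (a simple averaging low pass mask and a 3 x 3 high pass mask whose weights sum to zero) are only common examples.

```python
import numpy as np

def convolve3x3(image, kernel):
    """Move a 3 x 3 window across the image line by line and replace each
    interior pixel by the weighted sum of the window values."""
    img = image.astype(float)
    out = img.copy()
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = img[r - 1:r + 2, c - 1:c + 2]
            out[r, c] = np.sum(window * kernel)
    return out

# Low pass kernel: all weights positive (a simple averaging mask).
low_pass = np.ones((3, 3)) / 9.0

# High pass kernel: zero/negative surround, larger positive centre;
# these particular weights sum to zero, so uniform areas give no response.
high_pass = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]], dtype=float)
```

Applying the low pass kernel smooths the image, while the high pass kernel responds strongly wherever the brightness changes abruptly.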
Low Pass Filters
The figure shows the cross section before applying the filter and the cross
section after the application of a low pass filter. From the plot (Fig. 6.15 (a)) it is
inferred that the raw data contain random noise, whereas the other profiles convey
the information with greater clarity. This is achieved after the application of
low pass filters. There are a number of low pass filters, such as the mean filter,
the median filter, and adaptive filters, and the principle behind all these filters
is discussed in this section.
Low-pass filters are used to smooth the image. In the technique of mean filtering,
each pixel is sequentially examined and, if the pixel digital number (DN) differs
from the average brightness (DN) of its surrounding pixels by more than some
threshold (t), it is replaced by the mean of the 3 x 3 pixel window.
The window may be of any size, such as 5 x 5, 7 x 7 and so on. The larger the
window size, the more will be the computational time.

The following window illustrates the computation for one position of the moving
window. The central pixel X has no correlation with the neighbouring pixels and is
replaced by the mean value of the surrounding pixels. Eight raw data values centred
on the point are summed and averaged to produce one output value (x). Because the
output image values form a matrix that has fewer rows and columns than the input
image, the unfiltered margin corresponds to the rows and columns that cannot be reached.
Generally these missing scanlines and columns are filled with zeros in order to
keep the input and output images of the same size.
The effect of the moving average filter is to reduce the overall variability of the
image and lower its contrast. The main problem with the mean filter is that the resultant
image becomes blurred, so the edges between features become fuzzy.
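A minimal sketch of the threshold-based mean filtering just described, again assuming Python/NumPy; leaving the unreachable margin unchanged (rather than zero-filling it) is one of the two border conventions mentioned above.

```python
import numpy as np

def mean_filter(image, threshold=0.0):
    """Threshold-based 3 x 3 mean filter: a pixel is replaced by the mean of
    its eight neighbours only when its DN differs from that mean by more than
    `threshold`. Border rows/columns the window cannot reach are left as-is."""
    img = image.astype(float)
    out = img.copy()
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = img[r - 1:r + 2, c - 1:c + 2]
            mean_neighbours = (window.sum() - window[1, 1]) / 8.0
            if abs(window[1, 1] - mean_neighbours) > threshold:
                out[r, c] = mean_neighbours
    return out
```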

Median Filter
An alternative type of smoothing filter utilizes the median of the neighbourhood
rather than the mean. The median filter is generally thought to be superior to the
moving average filter for two reasons.
First, the median of a set of n numbers is always one of the data values present
in the set when n is an odd integer. Secondly, the median is less sensitive to
errors or to extreme data values.
Let us consider a set of nine pixel values in the digital image, (2, 1, 25, 7, 28, 5, 8,
30, 192), forming the 3 x 3 window neighbourhood; the median is the central value
when the data are ranked in ascending or descending order of magnitude.
In this example the ranked values are {1, 2, 5, 7, 8, 25, 28, 30, 192}, giving a
median value of 8. The mean, 33.10, would be rounded to the value of 33. The
value 33 is not present in the original data, unlike the median value 8.
Also, the mean value is larger than 8 of the observed values and may be thought to be
unduly influenced by the extreme data value, which might represent noise; such extreme
values are removed by the median filter. The median filter thus preserves edges better
than a mean filter; the figure shows the output of a median filter.
The median filtering is an alternative approach to spatial averaging. In this
filtering technique, the concept is very much similar to averaging. Here, the
abnormal or spatially uncorrelated pixel digital count (x) will be replaced by the
median within the 3 x 3 window.
This method eliminates isolated values related to noise spikes, while giving
rise to lower resolution loss than with averaging.
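The sketch below, again assuming Python/NumPy, applies a 3 x 3 median filter and reproduces the worked example above to confirm that the median (8) is a value actually present in the data, unlike the mean.

```python
import numpy as np

def median_filter3x3(image):
    """Replace each interior pixel by the median of its 3 x 3 neighbourhood."""
    img = image.astype(float)
    out = img.copy()
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r, c] = np.median(img[r - 1:r + 2, c - 1:c + 2])
    return out

# The nine neighbourhood values from the worked example above:
window = np.array([2, 1, 25, 7, 28, 5, 8, 30, 192])
print(np.median(window))          # -> 8.0, a value present in the data
print(round(window.mean(), 2))    # -> 33.11, pulled upward by the outlier
```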
Both the median and the mean filter smooth the image irrespective of the variability of
the greylevels. The smoothing methods in which the filter weights are calculated
for each window position, the calculations being based on the mean and
variance of the greylevels in the area of the image underlying the window,
are called adaptive filters.
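As one concrete illustration of an adaptive filter, the sketch below implements a Lee-type scheme in Python/NumPy in which the weight applied at each window position is derived from the local mean and variance; the noise-variance parameter is an assumption, and other adaptive formulations exist.

```python
import numpy as np

def adaptive_filter(image, noise_var=100.0):
    """Lee-type adaptive smoothing: the filter weight at each window position
    is computed from the local mean and variance, so flat areas are smoothed
    towards the local mean while high-variance (edge) areas are preserved."""
    img = image.astype(float)
    out = img.copy()
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = img[r - 1:r + 2, c - 1:c + 2]
            local_mean = window.mean()
            local_var = window.var()
            # weight tends to 1 where the local signal dominates the noise
            k = max(local_var - noise_var, 0.0) / max(local_var, 1e-9)
            out[r, c] = local_mean + k * (img[r, c] - local_mean)
    return out
```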

High Pass Filters
A simple high pass filter may be implemented by subtracting a low pass filtered
image (pixels by pixel) from the original, unprocessed image. The high frequency
component image enhances the spatial detail in the image at the expense of the
large area brightness information. High pass filtering can be performed by means
of image subtraction method or derivative based methods.
It is well known that an image can be considered to be the sum of its low and high
frequency components. The low frequency image can be subtracted from the
original, unfiltered image, leaving behind the high-frequency component. The resulting
image can then be added back to the original, effectively doubling the high frequency part.
i* = i - f·Ī + C
where: i* = filtered pixel value
i = original pixel value
Ī = the average (mean) of the window
f = a proportion varying from 0 to 1, and
C = a constant.
This method is called image subtraction method. The second method in high pass
filtering is a derivative based method and is based on mathematical concept of the
derivative.
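A minimal sketch of the image subtraction method, following the formula i* = i - f·Ī + C above; it assumes Python/NumPy, and the default values of f and C are illustrative assumptions only.

```python
import numpy as np

def high_pass_subtraction(image, f=1.0, C=128.0):
    """High pass filtering by image subtraction: i* = i - f * Ibar + C,
    where Ibar is the mean of the 3 x 3 window around each pixel."""
    img = image.astype(float)
    out = img.copy()
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            local_mean = img[r - 1:r + 2, c - 1:c + 2].mean()
            out[r, c] = img[r, c] - f * local_mean + C
    return np.clip(out, 0, 255)   # keep the result within the 8-bit DN range
```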
Filtering for Edge Enhancement
Edge or boundary between two features is very important for separating the
different features. Edge is characterized by high frequencies.
In nature, edge between two features or objects may not always be distinct.
Thus edge enhancement or edge crispening is required for better interpretation of
an image. Edge enhancement filtering technique enhances the edge only. High
frequencies signify the degree of tonal variation within a small area or within a
small spatial distance. This refers to those pixels that occur at the transition
between two categories. The edge crispening filtering technique is typically
employed to exaggerate the edges between contrasting cover types or to bring out linear
trends of geologic significance.

It is also useful in mapping lineaments and drainage pattern. This can be
accomplished by a number of filtering techniques. In this section, three of the
many filters are discussed. These three filtering techniques are discrete
convolution filtering, Laplacian edge enhancement filtering and Roberts edge
enhancement filtering.
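The sketch below shows, under the same Python/NumPy assumption, a common form of the Laplacian kernel (which can be applied with a convolution routine such as the one sketched earlier) and the Roberts cross-gradient operator; the exact kernel coefficients vary between implementations.

```python
import numpy as np

# One common form of the Laplacian kernel: a second-derivative operator that
# responds to abrupt tonal change in any direction (the weights sum to zero).
# It can be applied with a convolution routine like convolve3x3 above.
laplacian = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]], dtype=float)

def roberts_edges(image):
    """Roberts cross-gradient: differences along the two diagonals of each
    2 x 2 pixel block approximate the local gradient (edge) magnitude."""
    img = image.astype(float)
    gx = img[:-1, :-1] - img[1:, 1:]    # difference along one diagonal
    gy = img[:-1, 1:] - img[1:, :-1]    # difference along the other diagonal
    return np.sqrt(gx ** 2 + gy ** 2)
```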

INTEGRATION OF REMOTE SENSING AND GIS


Remote sensing data can be readily merged with other sources of geo-coded
information in a GIS. This permits the overlapping of several layers of
information with the remotely sensed data, and the application of a virtually
unlimited number of forms of data analysis. On the one hand, the data in a GIS
might be used to aid in image classification; on the other hand, the land cover data
generated by a classification (through VIP and DIP) might be used in subsequent
queries and manipulations of the GIS database.
Remotely sensed data are almost always processed and stored in raster data
structures. When working simultaneously with an image processing system and a raster
geographic information system, it is usually easy to move data between the two.
Typically, a single theme of information is extracted from the remotely sensed data.
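As an illustration of extracting a single theme from remotely sensed data for use in a raster GIS, the sketch below derives a hypothetical vegetation layer from red and near-infrared bands using an NDVI threshold; the band names and the threshold value are assumptions, not taken from the text.

```python
import numpy as np

def vegetation_theme(red, nir, threshold=0.3):
    """Derive one thematic raster layer from two spectral bands:
    NDVI = (NIR - Red) / (NIR + Red); cells above the threshold are coded 1
    (vegetation) and all others 0."""
    red = red.astype(float)
    nir = nir.astype(float)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-9)
    return (ndvi > threshold).astype(np.uint8)

# The 0/1 grid returned here is a single raster theme that can be written out
# (for example as a GeoTIFF) and overlaid with other layers in a raster GIS.
```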
The combination of image processing and GIS technologies is astounding. As
indicated earlier, the two are coming closer together and it is now becoming extremely
difficult to differentiate one from the other. The hardware is already almost identical,
and now the software is being integrated.
Remote Sensing and GIS Synergy
The relationship between remote sensing and GIS has received considerable
attention in literature and, indeed, remains the subject of a continuing discussion.
Remotely-sensed images can be used both as a source of spatial data within GIS
and to exploit the functionality of GIS in processing remotely sensed data.
Actual process involves accessing, manipulating, and visualizing vector, raster,
and tabular data simultaneously.
The recent focus on monitoring global environmental change using coarse spatial
resolution sensors, and the assimilation of the data that they produce into various
environmental simulation models, has deflected some of the attention away from
the traditional issues of large scale mapping, which are more closely allied to the
concerns and use of GIS.
Need for integration
Recently, it has been proved that Digital Terrain Models (DTMs) can be
developed using GIS technology.
In another example, a digital database was created for the wastelands and their
areal extent using ARC/INFO GIS. The spatial data used for this database were
derived from the analysis of IRS LISS satellite data. These data were then integrated
with GIS along with other collateral data.
With better resolution and improving software, the topographical mapping
requirements are being met by the combination of GIS and remote sensing. Remote
sensing and GIS have almost become an unavoidable source for cross-checking or
updating in digital surveying. Further, GIS software can be integrated with
Global Positioning System (GPS) information as an additional advantage.
For example, the ARC/INFO GIS software now accepts GPS data through its
geolink module. Urban and regional planning are obvious offshoots.
Environmental issues related to ground cover such as forests, water bodies, and
wastelands also serve as examples.
Along with the above application areas, more research work should be oriented
towards the development of application oriented remote sensing software that can
be easily integrated with GIS.
A recent development in this field has been integration of raster based data in
conventional vector based GIS software like ARC/INFO. Apart from raster/vector
conversion facilities in the GIS packages, vector overlays on remote sensing or
other raster data is now possible.
However, integration of GIS and remote sensing data is being carried out in
different institutions and laboratories with different degrees of manual/digital
involvement. The remote sensing discipline has developed a number of different
techniques for transforming general purpose data sets into thematic information
for many related fields for subsequent analysis using GIS.
In other instances, discrete categories of surface cover may be distinguished
through a classification algorithm. Many different decision rules may be
developed to isolate or characterize components of the earth's surface.
Often, a sequence of discriminating steps is required to provide an acceptably
accurate or precise result. For example, a simple binary decision rule may be able to
exclude water bodies from consideration, as in the sketch below.
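A minimal sketch of such a binary decision rule, assuming Python/NumPy and a hypothetical near-infrared threshold (water reflects very little in the near infrared, so a low NIR value is a common, though not universal, indicator of water):

```python
import numpy as np

def exclude_water(nir_band, water_threshold=40):
    """Binary decision rule: pixels whose near-infrared DN falls below the
    threshold are labelled water (0) and excluded; the remaining pixels (1)
    are passed on to later, more detailed classification steps."""
    return (nir_band.astype(float) >= water_threshold).astype(np.uint8)
```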
OVERVIEW OF APPLICATIONS
GIS is perhaps best considered a methodology or collection of tools which when
applied can bring great benefit.
Remote sensing and GIS can contribute a great deal to our study of patterns and
processes on the surface of the Earth and to create decision support systems. Even
though the innovation of GIS is itself quite recent, it is possible to classify GIS
applications as traditional, developing, and new.
Traditional GIS application fields include military, government, education, and
utilities. The developing GIS application fields of the mid-1990s include a whole
raft of general business uses such as banking and financial services, and transportation.
Levels of integration

Grimshaw (1994) has provided different levels at which GIS can be used within
organisations. The levels are operational, tactical, and strategic as shown in Fig. 13.2.
Operational activities are the basic day-to-day activities of many organizations. The
tactical activities are typically the domain of middle managers, and strategic activities
involve senior management.
Application of remote sensing to resource management

Introduction

The integration of remote sensing and Geographical Information Systems (GIS)


technology is increasingly finding favor for applications to the marine and coastal zone
environment. This includes estuaries.
With growing awareness and concern over the use of the marine environment, estuaries,
the focus of many potential conflicts between multiple different user groups, will
increasingly require very careful management of the resources. This will require
knowledge and understanding based upon the acquisition of 'up-to-date' data and
information.
Remotely sensed data, including aerial photography, airborne video, and satellite
imagery, offer one way to provide such data. They can be used to monitor and map
such environments, whilst GIS provides the tools to help manage and analyse the spatial
data.
The following two examples examine the application of remote sensing and GIS to an
estuary environment, helping to provide some insight into the potential of these two
technologies to provide data and to improve our knowledge and understanding about
estuary environments and processes. In particular they focus on the Ythan Estuary, an
area of some considerable scientific interest, just north of Aberdeen.

Overview of Case Study

Objective

The objective of these two studies has been to investigate the potential of integrating GIS
and remote sensing technologies to study aspects of this problem.

Study Area
The River Ythan catchment lies some 30 km (10 miles) to the north of Aberdeen,
covering an area of 690 km². 95% of this area is classified as agricultural.

Problem
In recent years concern has been expressed about the increasing concentration of
Nitrogen in the Ythan's tributaries over the period 1980-1992.

The chemicals used by farmers (e.g. in pig farming and oil seed rape cultivation) in the
area along the Ythan are considered by some to be responsible for causing an increase in
the amount and extent of weed mats in the lower part of the Ythan estuary.

Project 1

Background
Throughout the 1980s and 1990s the agricultural landscape of the UK has changed
markedly with changes in EC (European Community) policy.

One of the most striking changes to the visual landscape has been the marked increase in
the production of Winter Oil Seed Rape. This has arisen due to the subsidy of oil seed
processing, and the fact that Winter Oil Seed Rape cropping regimes fit into the crop
cycle of more traditional cereal crops.

Remote Sensing
Whilst there are more traditional and well established approaches to agricultural surveys,
these can now be performed using satellite remote sensing techniques. An operational
example is provided by the National Remote Sensing Centre (NRSC) in the UK
(Albedo, 1993).

The application of satellite remote sensing has several advantages over the more
traditional methods:

Results are not restricted by national boundaries or administrative and political bias.
They can provide information at different scales.
Temporal comparisons can easily be made between different years.
Data can be processed in near-real-time.

Method
Using the ERDAS digital image processing system, multi-temporal LANDSAT TM
imagery and a series of training sites for each different crop type in the classification
scheme, a supervised classification technique was used to produce crop maps for each of
three dates for the Bronie Burn sub-catchment (part of the Ythan watershed).
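The study itself used ERDAS for the supervised classification; purely to illustrate the principle, the sketch below implements a minimum-distance-to-means classifier in Python/NumPy, where the class names, band count and training statistics are all hypothetical.

```python
import numpy as np

def minimum_distance_classify(image_bands, training_means):
    """Supervised classification sketch: assign every pixel to the class whose
    training-site mean spectral vector is nearest in Euclidean distance.
    image_bands: array of shape (bands, rows, cols)
    training_means: dict mapping class name -> mean DN per band."""
    bands, rows, cols = image_bands.shape
    pixels = image_bands.reshape(bands, -1).T                 # (rows*cols, bands)
    classes = list(training_means)
    means = np.array([training_means[c] for c in classes])    # (classes, bands)
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    labels = dists.argmin(axis=1).reshape(rows, cols)
    return labels, classes

# Hypothetical training statistics (three bands, three crop classes):
training = {"cereal": [60, 90, 45], "oilseed rape": [70, 120, 55],
            "grass": [55, 80, 40]}
scene = np.random.randint(0, 256, size=(3, 4, 4))             # stand-in imagery
labels, classes = minimum_distance_classify(scene, training)
```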

After export of the classifications from the ERDAS system to the SMALLWORLD
GIS, the latter was used to complete the classification process, and to add additional
geographic information to improve the visual appearance and interpretability of the
imagery.

Summary statistics for total crop areas and the relative proportions of the total sub-
catchment under each crop for each year were derived and compared. In addition, simple
modelling of the relationship between nitrate inputs and agricultural data derived from
the LANDSAT imagery was undertaken.

UNIT -5
PART-A
1. Write a short note on visual interpretation?
2. Define hue or tone?
3. Define size?
4. Write a short note on shape?
5. Define texture?
6. Define pattern?
7. Define aspect?
8. Define association?
9. Write a short note on karst topography?
10. What are the different types of drainage patterns?
11. What are the different types of filtering techniques?
12. What is the need for integration of remote sensing and GIS?
13. What are the different types of filtering techniques?
14. What is the application of integration?
15. What are the levels at which the integration of remote sensing and GIS is needed?
16. State any two applications where the combination of remote sensing and GIS is used
for data analysis?
PART-B
1. Explain in brief about the visual image interpretation of images?
2. What are the steps involved in the process of image interpretation?
3. Explain in brief about the basic elements in the image interpretation?
4. What are the characteristics of image interpretation?
5. Explain in brief about the low pass, median and high pass filtering techniques?
6. Explain in brief about the process involved in the edge enhancement?
7. Explain in brief the existing synergy of remote sensing and GIS?
8. Explain with a case study the integration of GIS and remote sensing?

