
Data Fusion
S.Sivanantharajah
Lecture 09
2014/12/13

What is Data Fusion?

Data Fusion is a set of methods, tools and means using data coming from various sources of different nature, in order to increase the quality of the requested information. Mangolini (1994)


Data Fusion

Data fusion techniques combine data from multiple sensors, and related information from associated databases, to achieve improved accuracies and more specific inferences than could be achieved by using a single sensor alone.

Image Fusion

Image fusion is the combination of two or more different images to form a new image by using a certain algorithm.

Image Fusion

Most of these sensors operate in two modes: the multispectral mode and the panchromatic mode.
The panchromatic mode corresponds to observation over a broad spectral band (similar to a typical black-and-white photograph); the multispectral (color) mode corresponds to observation in a number of relatively narrower bands.
Usually the multispectral mode has a better spectral resolution than the panchromatic mode.

Quickbird Images

Quickbird satellite images are acquired in two modes:
Panchromatic image: 65 centimeter Ground Sample Distance (GSD) at nadir; black & white: 405 to 1053 nanometers
Multispectral images: 2.62 meter GSD at nadir
Blue: 430 - 545 nanometers
Green: 466 - 620 nanometers
Red: 590 - 710 nanometers
Near-IR: 715 - 918 nanometers


Why Data Fusion

Sharpening of images
Enhance features not visible in either of the single datasets alone
Detect changes using multitemporal data
Replace defective data
Increase spatial, spectral & temporal resolution
Increase accuracy
Increase reliability
Extend coverage

Image Fusion

Most satellite sensors are such that the panchromatic mode has a better spatial resolution than the multispectral mode. The better the spatial resolution, the more detailed the land-use information present in the imagery.
To combine the advantages of the spatial and spectral resolutions of two different sensors, image fusion techniques are applied.

Types of Data Fusion

Single sensor temporal (SAR images)
Multi sensor temporal (Optical & SAR images)
Single sensor spatial (IRS XS + Pan)
Single data multi sensor (ERS1 + ERS2)
Multi sensor spatial (Landsat 30 m + SPOT 10 m)

What application are the data needed for?
The selection of the sensor depends on satellite and sensor characteristics such as:
Orbit
Platform
Imaging geometry of optical & radar satellites
Spectral, spatial & temporal resolutions

Fusion Levels (Terms)

Pixel Level (Measurement Level)
Feature Level (Attribute Level)
Decision Level (Rule or Information Level)


Pixel Level

Image fusion at the pixel level means fusion at the lowest processing level, referring to the merging of measured physical parameters.
It uses raster data that is geocoded. The geocoding plays an essential role, because mis-registration causes artificial colors.

The Prerequisites of Pixel-Based Fusion

The actual procedure of fusion does not consume much time and energy. It is the preprocessing work, and the effort to bring the diverse data to a common platform, that require the greater attention and effort.

Proper Pre-Processing of the Image Data

Fusion involves the interaction of images having different spatial resolutions. These images have different pixel sizes, which creates problems while merging. Hence the image data are resampled to a common pixel spacing and map projection, as sketched below.
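A minimal sketch of the pixel-spacing step, assuming the data are already geocoded and co-registered; in practice the resampling and reprojection are done with a GIS or remote-sensing package, and the array shapes below (borrowed from the IKONOS example that follows) are only illustrative.

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical arrays standing in for real, already geocoded imagery:
ms_band = np.random.rand(212, 161).astype(np.float32)   # 4 m multispectral band
pan = np.random.rand(846, 641).astype(np.float32)       # 1 m panchromatic band

# Resample the low-resolution band to the panchromatic pixel spacing
# with bilinear interpolation (order=1).
factors = (pan.shape[0] / ms_band.shape[0], pan.shape[1] / ms_band.shape[1])
ms_resampled = zoom(ms_band, factors, order=1)

print(ms_resampled.shape)   # now matches the panchromatic grid: (846, 641)
```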

Techniques of Pixel-Based Fusion

Color space transformations (RGB, IHS)
Numerical: Brovey Transform
Statistical: PCA
Band substitution
High frequency filter

Panchromatic image: 1 m spatial resolution, IKONOS (846 by 641)
Multispectral image: 4 m spatial resolution, IKONOS (212 by 161)


Band Substitution

Geometric rectification/registration using polynomial curve fitting (both imageries)
Resampling of the low spatial resolution data to the high resolution pixel size
Since panchromatic data is a record of blue, green and red energy, it can be substituted directly for any one of those bands and displayed as three bands using R,G,B color theory (see the sketch below)
Advantage of this method: the radiometric quality of the data is not changed

Multispectral image resampled at 1 m spatial resolution (849 by 645)
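A minimal sketch of the substitution idea, assuming hypothetical pan, green and blue arrays that have already been rectified and resampled to a common 1 m grid; the names and sizes are illustrative only.

```python
import numpy as np

# Hypothetical, already co-registered 1 m arrays:
pan = np.random.rand(849, 645).astype(np.float32)     # panchromatic band
green = np.random.rand(849, 645).astype(np.float32)   # multispectral green
blue = np.random.rand(849, 645).astype(np.float32)    # multispectral blue

# Substitute the panchromatic band for the red band and display the stack
# as an ordinary R,G,B composite; no band's radiometry is altered.
composite = np.dstack([pan, green, blue])
print(composite.shape)   # (849, 645, 3)
```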

Image Fusion using Band Substitution

Intensity, Hue and Saturation (IHS)

The intensity (I) component is not associated with any color; its value varies from black (a value of 0) to white (a value of 1). In other words, for any color, intensity is a measure of the degree of whiteness: there can be a lighter shade of red and there can be a darker shade of red.
The hue (H) component represents the dominant wavelength of the color, i.e. it is this component which tells whether a color is red or green.
The saturation (S) component represents the purity of the color and thus describes the separation within a color.
For our purpose of fusion, the most important thing to remember is that this color coordinate system effectively separates the spatial and spectral components of any color: intensity is a spatial component, whereas hue and saturation are the spectral components.

Image Fusion using IHS Transformation

Procedure
1. RGB to IHS transformation: three bands of low spatial resolution data in RGB are transformed into three I, H, S components.
2. Contrast enhancement: the high spatial resolution image (panchromatic) is contrast stretched so that it has approximately the same variance and mean as the intensity image.
3. Substitution: the stretched high spatial resolution image is substituted for the intensity (I) image.
4. IHS to RGB: the modified IHS dataset is transformed back to RGB color space using an inverse IHS transformation.
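A minimal sketch of this four-step procedure, written in the additive "fast IHS" form: for a linear intensity I = (R + G + B) / 3, adding the difference (Pan' - I) to every band is equivalent to substituting Pan' for I and inverting the transform. The function and variable names are illustrative, and the inputs are assumed to be already co-registered float arrays.

```python
import numpy as np

def ihs_fusion(ms, pan):
    """IHS-style pan-sharpening of a 3-band image.

    ms  : (rows, cols, 3) float array, low-resolution RGB bands already
          resampled to the panchromatic grid
    pan : (rows, cols) float panchromatic band
    """
    # Step 1: intensity of a linear IHS transform, I = (R + G + B) / 3
    intensity = ms.mean(axis=2)
    # Step 2: stretch the pan band to the mean/variance of the intensity image
    pan_matched = (pan - pan.mean()) / pan.std() * intensity.std() + intensity.mean()
    # Steps 3-4: substitute the stretched pan for I and invert the transform,
    # done here in a single additive step
    return ms + (pan_matched - intensity)[..., None]
```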


BROVEY TRANSFORM

The Brovey transform is a formula that normalizes the multispectral bands used for an RGB display, and multiplies the result by any desired data to add the intensity or brightness component of the image.
The Brovey transform equations are given by

DN_B1(new) = DN_B1 / (DN_B1 + DN_B2 + DN_B3) x DN_high-resolution image
DN_B2(new) = DN_B2 / (DN_B1 + DN_B2 + DN_B3) x DN_high-resolution image
DN_B3(new) = DN_B3 / (DN_B1 + DN_B2 + DN_B3) x DN_high-resolution image

where B is a band.

(Image annotations: Gandhi Park, cinema halls (Prabhat and Krishna), Ghanta Ghar)
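A minimal sketch of the Brovey equations above, assuming three co-registered multispectral bands and a pan band as float arrays; the small eps term is an added safeguard against division by zero and is not part of the original formula.

```python
import numpy as np

def brovey(b1, b2, b3, pan, eps=1e-6):
    """Brovey transform: normalize each band by the band sum, then
    multiply by the high-resolution (panchromatic) DN values."""
    total = b1 + b2 + b3 + eps   # eps guards against division by zero
    return b1 / total * pan, b2 / total * pan, b3 / total * pan
```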

Merged PAN + Multispectral using IHS Transformation

Multiplicative Technique

The multiplicative equations are given by

DN_B1(new) = DN_B1 x DN_high-resolution image
DN_B2(new) = DN_B2 x DN_high-resolution image
DN_B3(new) = DN_B3 x DN_high-resolution image
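The same idea as a one-line sketch (hypothetical, co-registered float arrays); note that the product has a much larger dynamic range than the inputs, so the result is normally rescaled before display.

```python
def multiplicative(b1, b2, b3, pan):
    # Each new DN is simply the band DN multiplied by the pan DN.
    return b1 * pan, b2 * pan, b3 * pan
```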

Merged PAN + Multispectral using Brovey Transformation

Principal Component Analysis (PCA)

PCA is a statistical technique that transforms a multivariate dataset of intercorrelated variables into a dataset of new, uncorrelated linear combinations of the original variables.

Merged PAN + Multispectral using the Multiplicative technique



PRINCIPAL COMPONENT ANALYSIS (contd.)

The multispectral image data are usually strongly correlated from one band to the other and thus contain similar information. For example, Landsat MSS Bands 4 and 5 (green and red, respectively) typically have similar visual appearances, since reflectances for the same surface cover types are almost equal.
Principal component analysis is a pre-processing transformation that creates new images of uncorrelated components, with each component as a band.

PCA is useful for:
Image encoding
Image data compression
Image enhancement
Digital change detection
Multitemporal dimensionality, and
Data fusion

PRINCIPAL COMPONENT ANALYSIS

The objective of this transformation is to reduce the dimensionality (i.e. the number of bands) of the data and to compress as much of the information in the original bands as possible into fewer bands.
The "new" bands that result from this statistical procedure are called components.
Principal component images may be analysed as separate black-and-white images, or any three component images may be colour coded to form a colour composite.
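A minimal sketch of the transformation described above, computed from the band-to-band covariance matrix of a hypothetical multiband array. The closing comment notes the usual PCA-based fusion recipe (replace PC1 with the stretched panchromatic band and invert), which the slides list as a technique but do not spell out.

```python
import numpy as np

def pca_components(bands):
    """Transform a multiband image into uncorrelated component images.

    bands : (rows, cols, n_bands) float array
    Returns the component images plus the eigenvector matrix and band means,
    so the transform can be inverted after a component has been modified.
    """
    rows, cols, n = bands.shape
    flat = bands.reshape(-1, n)
    mean = flat.mean(axis=0)
    centered = flat - mean
    cov = np.cov(centered, rowvar=False)       # band-to-band covariance
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]           # PC1 = largest variance
    eigvec = eigvec[:, order]
    pcs = (centered @ eigvec).reshape(rows, cols, n)
    return pcs, eigvec, mean

# For PCA-based fusion: substitute a contrast-stretched panchromatic band
# for PC1, then invert:  fused = pcs.reshape(-1, n) @ eigvec.T + mean
```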

[Figure: original image bands (Band 1 - Band 4) and the corresponding principal component images (PC1 - PC4)]

Color composite of PC1, PC2, PC3

Examples of Image Fusion

Topographic mapping & map updating
Landuse, agriculture & forestry
Flood monitoring
Ice-snow monitoring, and
Geology
