
GEOL 1460/2461 Ramsey

Introduction to Remote Sensing Fall, 2016

Course overview; Remote sensing introduction; Color theory; Aerial photography


Week #1: 31 August 2016

I. Syllabus Review
we will go over the syllabus, schedule, and course structure/information at the
start of class
see the detailed information on the class webpage:

http://ivis.eps.pitt.edu/courses/geol1460/

II. What is remote sensing?


collection and interpretation of information about a target without being in
physical contact with it
o mostly electromagnetic (EM) radiation
o acoustic (sonar)
o examples: human eye, camera, aerial photograph, remote sensing scanners,
satellites (Landsat, ASTER)

o measuring changes in the intensity with wavelength


interpreting the physical properties of the material
spatial variations
temporal variations

physics of remote sensing and the derived information varies strongly with
wavelength

o minimal definition (more appropriate for what we do here)


remote sensing is the non-contact recording of information from the
electromagnetic spectrum by means of instruments on platforms such as
spacecraft, and the analysis of the acquired information by means of
visual and digital image processing
very specific on the wavelengths, sensor types, platforms, and analysis

o art or science (or a tool for each)??

advantages of remote sensing:
o unobtrusive (passive)
o unbiased data collection
o non single-point data
o data collected in-situ
o others?

disadvantages of remote sensing:
o not a panacea for everything!
o human-introduced errors
o emit EM radiation (active)
o uncalibrated data over time
o $$

what can Remote Sensing measure?


o x, y geographic location
o z topographic location
o vegetation health
chlorophyll content, water, % biomass, phytoplankton

o surface/sea temperature
o surface roughness
o soil moisture & evaporation
o atmosphere
chemistry, temperature, water %, wind speed, precipitation, clouds

o others
snow/ice, volcanoes, EQs, land use, ocean health

two types of remote sensing systems:


o passive: detection of energy from natural illumination or emission
example: camera, visible/near infrared instruments, thermal instruments

o active: detection of energy reflected back to the sensor after providing the
illumination
example: camera with a flash, flashlight and eye, radar, lasers

III. EM Principles:
detection: general principles here (details later in the semester)
o energy interactions:
remote sensing is only useful because we are able to detect some
property about the surface
the only way that this is possible is if the surface alters the energy in some
way upon interaction
this alteration is what we detect

five types of interactions can take place:


i. reflected
energy returned from surface with an angle of reflection equal and
opposite to incidence angle
caused by surfaces smooth relative to the incident wavelength

ii. scattered
deflection of energy in multiple directions
caused by surfaces rough relative to the incident wavelength

iii. transmitted (refracted)


energy passes through the material
caused by a change in density (velocity of the incident wave)
between two materials (index of refraction)

iv. absorbed
energy transformation (usually to longer wavelength heating)
v. emitted
release of energy from the material (it is now the source)

EM spectrum and EM waves

o waves have a constant velocity in a vacuum


o but vary in wavelength (λ) and frequency (ν) by the following equation:

ν = c/λ

o where, c = speed of light = 2.998 x 10^8 m/sec; ν = frequency (Hz or
cycles/sec)
o EM radiation is quantized into discrete packets called photons
o allows for the frequency (ν) to be related to the energy of the wave

E = hν

where, h = Planck constant = 6.626 x 10^-34 Joule seconds
because ν is inversely proportional to wavelength, shorter wavelengths
(higher frequencies) have higher energy
example: X-rays penetrate deeper into (and are more damaging to) your
body than energy from radio waves
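The inverse relationship between wavelength and photon energy can be checked numerically; a minimal Python sketch using the constants quoted above (the function name here is ours, not standard):

```python
# Photon energy from wavelength, combining nu = c/lambda and E = h*nu.
h = 6.626e-34   # Planck constant (Joule seconds)
c = 2.998e8     # speed of light (m/sec)

def photon_energy(wavelength_m):
    """Energy (Joules) of a photon with the given wavelength (meters)."""
    nu = c / wavelength_m   # frequency (Hz)
    return h * nu           # E = h*nu

# Shorter wavelength -> higher frequency -> higher energy:
e_xray  = photon_energy(1e-10)   # ~0.0001 micron X-ray
e_radio = photon_energy(10.0)    # 10 m radio wave
print(e_xray > e_radio)          # True: X-ray photons carry far more energy
```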

Wavelength Ranges
o varies from gamma rays (short wavelength) to radio waves (long wavelength)
i. gamma rays (<= 0.0001 microns)
change in the energy state of the neutrons/protons
variations in light elemental compositions

ii. X-rays (<= 0.01 microns)


photons absorbed by the inner shell of electrons

iii. ultraviolet [UV] (<= 0.4 microns)


photons emitted/absorbed by the outer shell of electrons
information on transition metals (Fe 2+, Fe 3+, Cu 2+) and chlorophyll

iv. visible (<= 0.67 microns)


similar to UV

v. near infrared [NIR] (<= 1.5 microns)


similar to UV

vi. short-wave infrared [SWIR] (<= 3.0 microns)


vibrational structure of certain minerals (OH-, CaCO3)

vii. thermal infrared [TIR] (<= 100 microns)


information on the molecules and bond strength
excellent for mineralogy
information on surface temperatures

viii. microwave (0.1 cm - 10 m)


includes TV and radio bands
radar wavelengths (discrete bands between 3-60 cm) good for
remote sensing
little information on composition, but a lot about the particle size
and surface roughness
IV. Information Interpretation
surface interaction with EM waves yields information
o both a function of the sensor doing the detection and the surface material

o function of the sensor:


the spatial resolution
depends on the altitude and the instrument characteristics

the sensitivity of the detector


the number of wavelength bands (spectral resolution)

o f(composition/texture and wavelength)


example: chemical composition, surface roughness, temperature, distance
from the sensor
will look more at this next week

Imaging Characteristics
o pixel = "picture element"
the quantized spatial resolution of the image
displayed as a square as the image is zoomed in

value is recorded as DNs (digital numbers)


for 8-bit (2^8) data this number ranges from 0-255 (gray-scale)
for 16-bit (2^16) data this number ranges from 0-65,535
the value chosen depends on what type of physical parameter you
are trying to store
radiance values may need 16-bit DNs
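The DN ranges follow directly from the bit depth: an n-bit integer stores 2^n distinct values, numbered 0 through 2^n - 1. A one-line Python check (the helper name is ours):

```python
# DN range for a given bit depth: 2**n distinct values, 0 to 2**n - 1.
def dn_range(bits):
    return 0, 2**bits - 1

print(dn_range(8))    # (0, 255)   -- 8-bit gray-scale
print(dn_range(16))   # (0, 65535) -- 16-bit, e.g. for radiance values
```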

image display
able to display only 1, 8-bit image in each of three primary colors (red,
green, blue)
known as a 24-bit monitor
the mixing of these three values produce all other colors (color theory)

o what is spatial resolution?


the size of the spatial resolution cell (pixel)
determined by two parameters:
height of the sensor above the ground
instantaneous field of view (IFOV) of the
sensor

pixel size = H x IFOV

example: H = 2 km, IFOV = 2.5 mrad;
pixel = 5 m
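The pixel-size relation can be sketched in a few lines of Python; the only subtlety is converting milliradians to radians before multiplying (function name is ours):

```python
# Pixel size = H * IFOV (sensor height times instantaneous field of view).
# IFOV must be in radians, so milliradians are converted first.
def pixel_size(height_m, ifov_mrad):
    return height_m * (ifov_mrad / 1000.0)   # mrad -> rad

# Example from the notes: H = 2 km, IFOV = 2.5 mrad
print(pixel_size(2000, 2.5))   # 5.0 (meters)
```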
Color Theory
o Important point!
when applied to image visualization, color display and mixing work differently
than commonly thought (i.e., mixing of paint)
critical to understand the difference and how a particular color is created
from the three primary colors and what that tells you about the physical
properties of the surface

contrast ratio: the human eye can only distinguish ~30 shades of gray

primary colors (or additive colors)


o red, green, blue (RGB)
o R+G+B = white; absence of all three = black
o all other colors are formed from some percentage of these three
"true color" image - RGB corresponds to the RGB wavelengths
"false color" image - RGB is used to display other wavelength regions

secondary colors (or subtractive colors)


o three secondary colors formed by the subtraction of one primary color from white
o or, looking at it another way, 2 primary colors added together
white - R (or, B+G) = cyan
white - G (or, R+B) = magenta
white - B (or, G+R) = yellow

[diagrams: primary (additive) and subtractive (secondary) color wheels]
color mixing
o one pixel in three wavelength regions may have 3 different DN
values/wavelength
o each wavelength placed in a RGB will combine to form a color
o examples:

RED GREEN BLUE FINAL


155 17 219 _____
219 155 17 _____
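One way to see how the three channels combine is to pack each DN triplet into a single 24-bit color value; a small Python sketch (the helper name is ours, and the hex string is just a compact way of writing the combined value):

```python
# Each displayed pixel combines three 8-bit DNs (red, green, blue)
# into one 24-bit color. Packing them as a hex string makes the
# mixing explicit: two hex digits per channel.
def rgb_to_hex(r, g, b):
    for dn in (r, g, b):
        assert 0 <= dn <= 255, "8-bit DNs must lie in 0-255"
    return "#{:02x}{:02x}{:02x}".format(r, g, b)

print(rgb_to_hex(155, 17, 219))   # first row of the table above
print(rgb_to_hex(219, 155, 17))   # second row
```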

color mixing (real example)


o vegetation color changes in autumn
o typical spectra of vegetation (more on spectral features in the next few
weeks)
o vegetation composed of six primary constituents
1. water
2. cellulose (carbohydrate polymer)
3. lignin (woody plants)
4. nitrogen
5. chlorophyll (two types, A & B)
6. anthocyanin
pigment that is responsible for the coloration of flowers and autumn
leaves

o vegetation health (drying out in autumn)


lower water and chlorophyll, increased anthocyanin
results in increase in brightness in the VIS red
decrease in brightness in the NIR
fairly constant in the green
o energy returned (percent reflectance) in the red leaf spectrum
function of the wavelength region
different for the human eye (white circles) than a multispectral instrument
like ASTER (green bars)

V. Cameras/Aerial Photography
photon detectors
o examples: film, vidicons, charge-coupled devices (CCDs)
o absorption of a photon breaks an electron free from its binding atom
o this change in energy state can be measured electrically
o different materials for different wavelength regions
examples: Ag-halide (film), Si (VIS), KBr (SWIR - TIR), HgCdTe (TIR)

o just one part of a spectrometer:


i. fore optics (primary and secondary mirrors/telescope)
ii. beam splitter and detector
iii. electronics
iv. storage

framing camera
o what is film?
light-sensitive emulsion material embedded with silver-halide crystals
coarseness of these crystals determines the resolving power of the film
(speed)
photochemical reaction of photon liberating electrons --> creating silver
atoms
developing uses chemicals to convert exposed Ag-halide atoms into silver
all unexposed grains are removed to leave clear areas
exposed regions remain and are dark (brightest parts of the scene are
the darkest in the developed film) --> negative image
printing on paper --> positive image

"negative" color image "positive" color image


o 2-D image acquired instantly
positives: high spatial resolution, low costs, large amount of data captured
negatives: limited spectral range, non-digital, high geometric distortion @
edges

o ground resolution = ability to resolve ground features (expressed as the
number of line pairs per m)

Rg = (Rs f)/H
where, Rs = system resolution (line pairs per mm); f = focal length of
camera (mm); H = camera height above ground (m)
whereas, the width of an individually-resolved line pair = 1/Rg

scale = f/H
commonly written as 1:20,000
1 mm on the photograph = 20,000 mm (20 m) on the ground
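Both relations can be sketched together in Python; the input values below are hypothetical (not from the notes), and the function names are ours:

```python
# Ground resolution Rg = (Rs * f) / H and photo scale = f / H.
# Units follow the notes: Rs in line pairs/mm, f in mm, H in m,
# giving Rg in line pairs per meter on the ground.
def ground_resolution(rs_lp_per_mm, f_mm, h_m):
    return (rs_lp_per_mm * f_mm) / h_m

def photo_scale(f_mm, h_m):
    """Scale denominator N: 1:N means 1 mm on film = N mm on the ground."""
    return (h_m * 1000.0) / f_mm   # H converted to mm before dividing

# Hypothetical values: 40 lp/mm film, 150 mm lens, 3000 m altitude
rg = ground_resolution(40, 150, 3000)   # 2.0 line pairs per meter
print(rg, 1 / rg)                       # resolved line pair is 0.5 m wide
print(photo_scale(150, 3000))           # 20000.0 -> scale 1:20,000
```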

o relief displacement
geometric distortion at image edges giving the effect that taller objects are
leaning away from the optical center of the photo distortion amount is
related to:

1. vertical height of
the object

2. distance from the


principal point

3. inversely
proportional to the
camera height

h = (H d) / r
where, h = actual height of the object (m); H = camera height above
ground (m); r = distance from image center to the top of the object; d
= relief displacement (d and r measured in the same image units)
removal of large-scale relief displacement produces an orthophotograph
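The height formula is a one-liner; a Python sketch with hypothetical photo measurements (the function name and numbers are ours, not from the notes):

```python
# Object height from relief displacement: h = (H * d) / r.
# d and r are measured on the photo in the same units, so they cancel
# and h comes out in the units of the camera height H.
def object_height(camera_height_m, displacement, radial_distance):
    return camera_height_m * displacement / radial_distance

# Hypothetical measurements: H = 1500 m, d = 2 mm, r = 60 mm on the photo
print(object_height(1500, 2.0, 60.0))   # 50.0 meters
```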

stereo-pairs = successive overlapping air photos


because each photograph images each point on the ground from a
slightly different angle, the offsets can be used to reproduce the
vertical dimension
known as a DEM (digital elevation model)
DEMs are what are used to produce the USGS topographic maps
low sun angle: images taken generally early morning, late afternoon, or at
high latitudes, where the sun is < 15° above the horizon
produces pronounced shadows if object is perpendicular to sun
excellent for interpretation of subtle topographic features

high sun angle:


what benefits do you see?
