
Remote Sensing of the

Environment
GEO 444/544
Department of Geosciences
Oregon State University

What is remote sensing?


the acquisition and measurement of
data/information on some property(ies) of a
phenomenon, object, or material by a recording
device not in physical, intimate contact with the
feature(s) under surveillance; techniques involve
amassing knowledge pertinent to environments by
measuring force fields, electromagnetic radiation, or
acoustic energy employing cameras, radiometers and
scanners, lasers, radio frequency receivers, radar
systems, sonar, thermal devices, seismographs,
magnetometers, gravimeters, scintillometers, and
other instruments.
Source: NASA tutorial on remote sensing
http://rst.gsfc.nasa.gov/Intro/nicktutor_I-1.html

Shorter definition
Remote sensing is the collection of
information about an object or system
without coming into direct physical
contact with it.

Why do we do remote sensing?

Unobtrusive
Automated
Useful for extreme conditions
Offers excellent spatial and temporal
coverage
Provides real time or near-real time
observations
Often cost-effective relative to ground data
collection
Augments in situ measurement systems
Extends our senses

How are measurements made?


Ground-based
Airborne
Satellite

Remote Sensing Systems


Active Sensor - illuminates the subject from
an artificial energy source
Passive Sensor - uses natural radiation from
the Sun or Earth
Imaging Sensor - creates a picture by
scanning across a linear array of detectors
while the array moves through space
Non-imaging Sensor - measures at discrete
locations; or uses a non-photonic approach

REMOTE SENSING ELECTROMAGNETIC
RADIATION DATA TYPES

Visible, infrared, thermal, and microwave are most common

Wave Model of EMR (pg. 39)

Electric (E) and magnetic (B) fields are orthogonal to each other
The direction of each field is perpendicular to the other and to
the direction of wave propagation.

Electromagnetic Waves (pp. 39-40)


Described by:
Wavelength
Frequency
Amplitude

1 Cycle = one wavelength (λ)

Frequency (ν) = # cycles/second

Frequency vs. Wavelength (pg. 39)

λ = distance of separation between two
successive wave peaks
ν = number of wave peaks passing in a given
time
c = speed of light in a vacuum = 3.0 × 10^8 m s^-1
The product of wavelength and frequency is a
constant:
c = λν, or ν = c/λ
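The c = λν relation is easy to check numerically; a minimal Python sketch (the 0.5 μm wavelength is just an illustrative value):

```python
# Convert a wavelength to frequency using c = lambda * nu.
C = 3.0e8  # speed of light in a vacuum, m/s


def frequency_hz(wavelength_m):
    """Frequency (cycles/second) for a wavelength given in meters."""
    return C / wavelength_m


# Green visible light: lambda = 0.5 micrometers = 0.5e-6 m
nu = frequency_hz(0.5e-6)
print(nu)  # 6.0 x 10^14 Hz
```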

Physical Principles of Remote Sensing:
Electromagnetic Radiation
(Chapter 2 and a small part of Chapter 8)

Outline for Lecture 2


Properties of electromagnetic radiation
The electromagnetic spectrum
Planck's, Stefan-Boltzmann, and Wien's
blackbody laws
Graybodies and emissivity
Radiant temperature vs. kinetic
temperature

What is Energy?
Typical Definition: Energy is the capacity to do work =
Mechanical Energy (textbook pp. 37-38,47)
From The Feynman Lectures on Physics: Feynman
tells us that "in physics today, we have no
knowledge of what energy is." He goes on to say
that we know how to calculate its value for a great
variety of situations, but beyond that it is just an
abstract thing which has only one really important
property: if we add up all the values before
something happens and then add them up after it
happens, the two values will be exactly the
same. (We must be sure to include every object
affected.) This is the law of conservation of energy.


Electromagnetic Radiation (pg. 40)

EMR is the source for most types of remote sensing

Passive - reflected or re-emitted solar radiation

Active - the sensor emits the radiation
and detects the return

For terrestrial remote sensing, the most important
source is the Sun
Reflected solar energy is used: 0.3 - 2.5 μm

The Earth is also an energy source
> 6 μm for Earth-emitted energy

The Electromagnetic (EM) Spectrum (pg. 42, plate 2.2)

More about Solar Radiation (pg. 43)


Name of Spectral Region | Wavelength Range, μm | Percent of Total Energy
Gamma and X-rays   | < 0.01     | Negligible
Far Ultraviolet    | 0.01 - 0.2 | 0.02
Middle Ultraviolet | 0.2 - 0.3  | 1.95
Near Ultraviolet   | 0.3 - 0.4  | 5.32
Visible            | 0.4 - 0.7  | 43.5
Near Infrared      | 0.7 - 1.5  | 36.8
Middle Infrared    | 1.5 - 5.6  | 12.0
Thermal Infrared   | 5.6 - 1000 | 0.41
Microwave          | > 1000     | Negligible
Radio Waves        | > 1000     | Negligible

More about Solar Radiation

At 5900 K, the energy (radiant exitance) from
the Sun is enormous: 6.4 × 10^7 W m^-2
This energy is reduced to an irradiance of
~1370 W m^-2 at the top of the Earth's
atmosphere because of the Earth-Sun distance
(1/r^2 falloff)
The Earth, at 288 K, has a radiant exitance of
only 390 W m^-2
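These figures can be reproduced with the Stefan-Boltzmann law (M = σT^4, introduced below); a quick Python check. Note that the quoted solar value of ~6.4 × 10^7 W m^-2 corresponds to a temperature near 5800 K, which is the solar temperature used in the Wien's-law example later in the deck:

```python
# Radiant exitance M = sigma * T^4 (Stefan-Boltzmann law).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4


def radiant_exitance(t_kelvin):
    """Total power emitted per unit area (W m^-2) by a blackbody."""
    return SIGMA * t_kelvin ** 4


print(radiant_exitance(5800))  # ~6.4e7 W m^-2 (Sun's surface)
print(radiant_exitance(288))   # ~390 W m^-2 (Earth)
```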

Photon Energy (pg. 43)

When considering the particle form of energy,
we call it a photon
The radiant energy of a photon, Q, is
proportional to its frequency, ν:
Q = hν
ν = c/λ
Q = hc/λ
Radiant energy Q is measured in Joules (J)
h = the Planck constant = 6.626 × 10^-34 Joule-seconds (J s)
c = speed of light = 3.0 × 10^8 meters per second (m s^-1)

Thus, the shorter the wavelength, the higher
the energy of a photon.
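Q = hc/λ in a few lines of Python; the two wavelengths are illustrative (a blue photon vs. a near-infrared photon at twice the wavelength):

```python
# Photon energy Q = h * c / lambda: shorter wavelength -> higher energy.
H = 6.626e-34  # Planck constant, J s
C = 3.0e8      # speed of light, m/s


def photon_energy_joules(wavelength_m):
    """Energy of a single photon (J) for a wavelength in meters."""
    return H * C / wavelength_m


blue = photon_energy_joules(0.4e-6)  # ~5.0e-19 J
nir = photon_energy_joules(0.8e-6)   # ~2.5e-19 J
print(blue > nir)  # True: the blue photon carries twice the energy
```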

Blackbody Radiation (pg. 39)


All objects whose temperatures are above
absolute zero (0 K, or -273.15 °C) emit radiation at
all wavelengths.
A blackbody is one that is a perfect absorber of
all EMR that falls on it and a perfect emitter of
thermal EMR at a certain temperature that would
appear black when cold.
No real object is a blackbody radiator, but objects
like the Sun come close to being blackbody
radiators.

Blackbody Equations (pg. 41)

Planck's Law:
L(λ,T) = 2hc^2 / [λ^5 (e^(hc/λkT) - 1)]

Stefan-Boltzmann Law:
M = σT^4

Wien's displacement equation:
λ_max = 2898 μm K / T

c - speed of light              - 3.0 × 10^8 m s^-1
h - Planck constant             - 6.63 × 10^-34 J s
k - Boltzmann constant          - 1.38 × 10^-23 J K^-1
σ - Stefan-Boltzmann constant   - 5.67 × 10^-8 W m^-2 K^-4

Planck's Law
What do the equations mean?
Planck's Law (1900) gives the radiance (L) of a
blackbody (W m^-2 sr^-1) at a given temperature at
any wavelength (λ).

What is a Watt??
An International System unit of power equal to
one joule of energy per second.

What is a steradian (sr)??


The solid angle subtended at
the center of a sphere of
radius r by a portion of the
surface of the sphere having
an area equal to r2.


Planck's Law (continued)

Planck's law describes how heat energy
is transformed into radiant energy
According to Planck's law, an object will
emit radiation at all wavelengths, but not
equally
This is the basic law for radiation
measurements in all parts of the EM
spectrum
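A minimal sketch evaluating Planck's law, using the spectral-radiance form L(λ,T) = 2hc²/[λ⁵(e^(hc/λkT) − 1)] and the constants from the equations slide; the comparison wavelengths are illustrative:

```python
import math

# Spectral radiance of a blackbody, L(lambda, T) = 2hc^2 / (lambda^5 (e^(hc/(lambda k T)) - 1)),
# in W m^-2 sr^-1 per meter of wavelength.
H = 6.63e-34  # Planck constant, J s
C = 3.0e8     # speed of light, m/s
K = 1.38e-23  # Boltzmann constant, J K^-1


def planck_radiance(wavelength_m, t_kelvin):
    numerator = 2.0 * H * C ** 2
    denominator = wavelength_m ** 5 * (
        math.exp(H * C / (wavelength_m * K * t_kelvin)) - 1.0)
    return numerator / denominator


# A 5800 K blackbody (Sun-like) radiates far more at 0.5 um (visible)
# than at 10 um (thermal IR); a 288 K blackbody (Earth-like) does the reverse.
print(planck_radiance(0.5e-6, 5800) > planck_radiance(10e-6, 5800))  # True
print(planck_radiance(10e-6, 288) > planck_radiance(0.5e-6, 288))    # True
```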

Stefan-Boltzmann Law
What do the equations mean?
The Stefan-Boltzmann Law (1884) describes the total
amount of energy being radiated by a blackbody at
different temperatures (radiant exitance in W m^-2).

Blackbody Radiation (pg. 39)

Stefan-Boltzmann Law:
Radiant exitance (M) is proportional to the fourth
power of an object's temperature in Kelvin:
M = σT^4

This assumes a blackbody, which is a perfect
emitter and a perfect absorber.
Radiant exitance is the power of electromagnetic
radiation leaving a surface per unit area,
measured in Watts per square meter (W m^-2)

Wien's Displacement Law

What do the equations mean?
Wien's Displacement Law (1893) gives the
wavelength of maximum blackbody radiation (in
μm).

What do the curves mean??


Temperature Color Spectrum

At room temperature,
blackbodies emit mostly infrared
wavelengths, but as the
temperature increases past 798
K, blackbodies start to emit
visible wavelengths, appearing
red, orange, yellow, white, and
blue with increasing temperature.
By the time an object is white, it
is emitting substantial ultraviolet
radiation.

Examples using
Wien's Displacement Law (pg. 41)
T_sun = 5800 K
Peak of Sun's radiation =
2898 μm K / 5800 K = 0.5 μm
T_earth = 288 K
Peak of Earth's radiation =
2898 μm K / 288 K = 10 μm
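The same two worked examples in Python:

```python
# Wien's displacement law: wavelength (um) of peak blackbody emission.
WIEN_CONSTANT = 2898.0  # um K


def peak_wavelength_um(t_kelvin):
    return WIEN_CONSTANT / t_kelvin


print(peak_wavelength_um(5800))  # ~0.5 um: the Sun peaks in the visible
print(peak_wavelength_um(288))   # ~10 um: the Earth peaks in the thermal IR
```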

Blackbody at 6000 K vs. Solar Radiation (pp. 51-52)
[Figure: blackbody emittance curve at 6000 K compared with the solar
radiation curve leaving the surface of the Sun and solar radiation at sea level]

Graybody Radiation (pp. 255-256)

Real objects are not perfect blackbody
absorbers/emitters
They reflect part of the incident radiation
Emissivity, ε, is the ratio of graybody
exitance to blackbody exitance

Emissivity (pg. 255)


Describes the actual energy absorption
and emission properties of real objects
(graybodies)
Is wavelength dependent
Emissivity establishes the radiant
temperature Trad of an object

Radiant Temperature vs. Kinetic
Temperature (pg. 259)
Two objects can have the same kinetic temperature
but different radiant temperatures

Object           | Emissivity (ε) | Kinetic Temperature | Radiant Temperature
Blackbody        | 1.0            | 300 K               | 300 K
Water, distilled | 0.99           | 300                 | 299.2
Basalt, rough    | 0.95           | 300                 | 296.2
Basalt, smooth   | 0.92           | 300                 | 293.8
Obsidian         | 0.86           | 300                 | 288.9
Mirror           | 0.02           | 300                 | 112.8
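The radiant temperatures above follow from combining the Stefan-Boltzmann law with emissivity: ε σ T_kin^4 = σ T_rad^4, so T_rad = ε^(1/4) T_kin. A quick check against the table:

```python
# Radiant temperature from kinetic temperature and emissivity:
# eps * sigma * T_kin^4 = sigma * T_rad^4  ->  T_rad = eps**0.25 * T_kin
def radiant_temperature(emissivity, t_kinetic):
    return emissivity ** 0.25 * t_kinetic


for name, eps in [("Water, distilled", 0.99), ("Basalt, rough", 0.95),
                  ("Obsidian", 0.86), ("Mirror", 0.02)]:
    print(name, round(radiant_temperature(eps, 300.0), 1))
# Water 299.2, rough basalt 296.2, obsidian 288.9, mirror 112.8 K,
# matching the table values
```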


Electromagnetic Radiation:
Interactions in the Atmosphere
and with Matter
(Chapter 2)

Outline for Lecture 3


EMR interaction with the atmosphere
Atmospheric attenuation
scattering (Rayleigh, Mie, Non-selective)
absorption

Atmospheric windows
EMR interaction with matter

Blackbody at 6000 K vs. Solar Radiation (pp. 51-52)
[Figure: blackbody emittance curve at 6000 K compared with solar
radiation leaving the surface of the Sun and solar radiation at sea level]

Atmospheric Effects (pp. 48-51)


EMR is attenuated by its passage
through the atmosphere
Attenuation = scattering + absorption
Scattering is the redirection of radiation
by reflection and refraction
Attenuation is wavelength dependent


Rayleigh Scattering
(pp. 49-50)
Scattering by molecules
and particles whose
diameters are << λ
Primarily due to oxygen
and nitrogen molecules
Scattering intensity is
proportional to λ^-4
Responsible for the blue sky

A Clear Blue Sky

Blue radiation (λ = 0.46 μm)
Red radiation (λ = 0.66 μm)
(0.66/0.46)^4 ≈ 4.24
Blue is scattered ~4× more than red radiation
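The λ^-4 dependence makes this ratio a one-liner:

```python
# Rayleigh scattering intensity goes as lambda^-4, so the relative
# scattering of a short vs. a long wavelength is (long/short)**4.
def rayleigh_ratio(lambda_short_um, lambda_long_um):
    return (lambda_long_um / lambda_short_um) ** 4


print(round(rayleigh_ratio(0.46, 0.66), 2))  # 4.24: blue scattered ~4x more than red
```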

Earth: atmosphere and
Rayleigh scattering, giving a
blue horizon

Moon: no atmosphere and
no Rayleigh scattering,
giving a black horizon


Mie Scattering (pg. 50)

Spherical particles that have a mean diameter 0.1 to 10
times the incident wavelength
Examples for visible light: water vapor, smoke particles, fine
dust
Scattering intensity is proportional to λ^-4 to λ^0 (depending on
particle diameter)
Mie scattering is concentrated in the forward direction

A clear atmosphere has both Rayleigh and Mie
scattering; their combined influence is between λ^-0.7
and λ^-2

Rayleigh and Mie Scattering

An informal measurement of sky
brightness and color saturation.
The selected points approach the
sun direction and avoid obvious
clouds.
(from http://hyperphysics.phy-astr.gsu.edu)

Red Sky at Night


At sunset, solar radiation must traverse a longer path
through the atmosphere. Viewing a setting sun, the
energy reaching the observer is largely depleted of
blue radiation, leaving mostly red wavelengths
(Rayleigh). Dust and smoke add further scattering with
a wavelength dependence that increases the red-sky
effect (Mie).


Non-selective Scattering (pp. 50-51)

Aerosol particles much larger than the wavelength
(> 10×)
Examples: water droplets, ice crystals, volcanic ash,
smog
Independent of wavelength: intensity proportional to λ^0

Atmospheric Absorption (pp.51-52)


Absorption is the process whereby radiant energy
is absorbed by atmospheric constituents and
converted to thermal energy
Atmospheric absorbers are primarily:
water vapor and water droplets (H2O)
carbon dioxide (CO2)
oxygen (O2)
ozone (O3)
dust and soot

Blackbody at 6000 K vs. Solar Radiation (pp. 51-52)
[Figure: blackbody emittance curve at 6000 K compared with solar
radiation leaving the surface of the Sun and solar radiation at sea level]

Absorption Bands (pg. 51)


An absorption band is a portion of the EM
spectrum within which radiant energy is
absorbed by substances such as water (H2O),
carbon dioxide (CO2), oxygen (O2), ozone
(O3), nitrous oxide (N2O), dust, soot, etc.

Atmospheric Windows (pp. 51-52)

Regions in the EM spectrum where energy can
be transmitted with little attenuation:
0.3-0.7 μm - UV and visible light from the Sun
3-5 μm - emitted thermal energy from Earth
8-11 μm - emitted thermal energy from Earth
1 mm-1 m - radar and microwave energy

(pg.52)

Electromagnetic Radiation
Interactions with Matter (pp. 53-54)
Conservation of energy: radiation at a given
wavelength is either
reflected (ρ) -- the property of the surface or medium is
called the reflectance or albedo (0-1), or percent
reflectance (0-100%)
absorbed (α) -- the property is absorptance (0-1)
transmitted (τ) -- the property is transmittance (0-1)

Electromagnetic Radiation
Interactions with Matter (pp. 53-54)
Radiation Budget Equation:

Φ_i = ρΦ_i + αΦ_i + τΦ_i
Φ_i is the total incident radiant flux in Watts
τ is the hemispherical transmittance
ρ is the hemispherical reflectance
α is the hemispherical absorptance

(pp. 53-54)
[Figure: incident radiation split into reflectance (ρ), absorptance (α,
with re-emission), and transmittance (τ, with refraction)]

Reflectance + Absorptance + Transmittance = 1
(ρ + α + τ = 1)

For the Earth's surface, we ignore Transmittance,
so Reflectance + Absorptance = 1, or:
ρ = 1 - α
α = 1 - ρ

Electromagnetic Radiation at the
Earth's Surface
(Chapter 2 and a bit of Chapter 1)

Outline for Lecture 4


Specular vs. Diffuse Reflectance
Directional vs. Hemispherical Reflectance
Following the energy

EMR at the Earth's Surface


Recall that radiant energy can be transmitted,
absorbed, or reflected
For remote sensing purposes, energy is not
transmitted through the Earth
Reflectance depends on the composition and
surface characteristics of materials -- it also
depends on the way that the material is
illuminated

Reflection (pp. 51-53)


diffuse reflection (rough surface relative to
wavelength of EMR)
specular reflection (smooth surface relative to
wavelength of EMR)
Diffuse

Specular

(pg.53)

Sunglint and wildfires in


the Carpentaria Bay,
Queensland, Australia

Sunglint on the water


(specular reflection)

Reflectance Terminology (pp.54-55)


Irradiance can be either direct-only (directional)
or direct + diffuse (hemispherical)
Measurement can be either over a small solid
angle (directional) or over a very, very wide
solid angle (hemispherical)

(pg. 21)

Reflectance Terminology (pp.20-21)


Directional-Directional Reflectance
a.k.a.: Bidirectional Reflectance
this is what is measured by satellites at the top of
the atmosphere

Hemispherical-Directional Reflectance
this is what is measured by instruments at the
surface of the Earth

Sunglint from solar
collectors in the
Mojave Desert

Measured using
NASA's Multiangle
Imaging
SpectroRadiometer
(MISR) instrument

Reflectance Terminology (pp.51,53)


A Lambertian surface is one where the bidirectional
reflectance factor does not vary with angle (i.e. the
bidirectional reflectance factor is the same in all
directions). It is a perfectly diffuse (isotropic)
scattering medium
A specular surface is one where the bidirectional
reflectance factor is entirely concentrated in the
forward direction at the same zenith angle as the
incident irradiance (like a perfect mirror)
All natural surfaces are somewhere between these

(pg.56)
The concept of radiant flux
density for an area on
the surface of the Earth.

Irradiance is a measure
of the amount of radiant
flux incident upon a
surface per unit area of
the surface measured in
watts m-2.

Exitance is a measure of
the amount of radiant flux
leaving a surface per unit
area of the surface
measured in watts m-2.

(pg. 57)

The concept of
radiance (Wm-2 sr-1)
leaving a specific
projected source
area on the ground,
in a specific
direction, and within
a specific solid
angle measured in
steradians (sr).

(pg.57)

The cosine effect


cosine=adjacent/hypotenuse

Steradian (Solid Angle)

The steradian (sr) is the unit of solid angle
It is defined as: Ω = A/r^2
[Figure: sphere of radius r with a spherical cap of area A
subtending the solid angle at the center]

Spectral Reflectance

spectral absorptance

Spectral Reflectance Curves (pg.55)


Measured spectral reflectance values
across a portion of the EMR spectrum are
often displayed for different ground features
on Spectral Reflectance Curves.
Wavelength regions where there are large
differences in reflectance among features
are good places for remote sensing to
distinguish among the features.

(pg. 55)

(pg. 59)

Path from Sun-to-Earth (pp. 58-59)

E_0 = Top-of-Atmosphere solar irradiance (W m^-2)
It passes through the atmosphere and is attenuated
according to the atmospheric transmittance, T_θ, at a
particular solar zenith angle
It reaches the ground and is reduced by a factor of
cos θ, depending on the angle of incidence (solar zenith
angle + slope)
Thus, for a specified wavelength, the irradiance at the
ground surface is:

E_g = E_0 T_θ cos θ
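A numeric sketch of E_g = E_0 T_θ cos θ; the transmittance and incidence angle below are purely illustrative values, not from any real scene:

```python
import math

# Irradiance reaching the ground: E_g = E_0 * T_theta * cos(theta),
# where T_theta is the atmospheric transmittance along the solar path
# and theta is the angle of incidence.
def ground_irradiance(e0_wm2, transmittance, incidence_deg):
    return e0_wm2 * transmittance * math.cos(math.radians(incidence_deg))


# TOA solar irradiance ~1370 W m^-2, with an assumed 80% transmittance
# and a 30-degree angle of incidence:
print(round(ground_irradiance(1370.0, 0.8, 30.0), 1))  # ~949 W m^-2
```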

(pp.57,59)

Path 1 contains spectral solar
irradiance (E_0) that was
attenuated very little before
illuminating the terrain within the
IFOV. Notice in this case that we
are interested in the solar
irradiance from a specific solar
zenith angle (θ_0) and that the
amount of irradiance reaching the
terrain is a function of the
atmospheric transmittance at this
angle (T_θ0). If all of the irradiance
makes it to the ground, then the
atmospheric transmittance (T_θ0)
equals one. If none of the
irradiance makes it to the ground,
then the atmospheric
transmittance is zero.

(pp.57-59)

Path 2 contains spectral diffuse
sky irradiance (E_d) that never
even reaches the Earth's surface
(the target study area) because of
scattering in the atmosphere.
Unfortunately, such energy is often
scattered directly into the IFOV of
the sensor system. As previously
discussed, Rayleigh scattering of
blue light contributes much to this
diffuse sky irradiance. That is why
the blue band image produced by
a remote sensor system is often
much brighter than any of the
other bands: it contains much
unwanted diffuse sky irradiance
that was inadvertently scattered
into the IFOV of the sensor
system. Therefore, if possible, we
want to minimize its effects. Green
(2003) refers to this quantity as the
upward reflectance of the
atmosphere (E_du).

(pp.58-59)

Path 3 contains energy
from the Sun that has
undergone some
Rayleigh, Mie, and/or
non-selective scattering
and perhaps some
absorption and re-emission
before illuminating the
study area. Thus, its
spectral composition and
polarization may be
somewhat different from
the energy that reaches
the ground from path 1.
Green (2003) refers to this
quantity as the downward
reflectance of the
atmosphere (E_dd).

(pp.58-59)

Path 4 contains radiation
that was reflected or
scattered by nearby terrain
covered by snow,
concrete, soil, water,
and/or vegetation into the
IFOV of the sensor
system. The energy does
not actually illuminate the
study area of interest.
Therefore, if possible, we
would like to minimize its
effects.

Path 2 and Path 4
combine to produce what
is commonly referred to as
Path Radiance, L_p.

(pp.58-59)

Path 5 is energy that was


also reflected from nearby
terrain into the
atmosphere, but then
scattered or reflected onto
the study area.

(pp. 58-59)
The total radiance reaching the
sensor is:

L_S = (ρ/π) T_θv (E_0 T_θ0 cos θ_0 + E_d) + L_p

This may be summarized as:

L_S = L_T + L_p

L_S = radiance at the sensor
L_T = radiance from the target
L_p = path radiance
Remember that the scattered and transmitted radiances are
spectrally dependent (that is, they vary as a function of wavelength)
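A numeric sketch of the L_S = L_T + L_p budget. The target-radiance form used here, (ρ/π) T_v (E_0 T_0 cos θ_0 + E_d), assumes a Lambertian target, and every input value below is illustrative rather than from any real sensor:

```python
import math

# At-sensor radiance L_S = L_T + L_p, with the target term modeled as
# L_T = (rho/pi) * T_v * (E_0 * T_0 * cos(theta_0) + E_d)
# for an assumed Lambertian surface. All values are illustrative only.
def target_radiance(rho, t_view, e0, t_sun, theta0_deg, e_diffuse):
    return (rho / math.pi) * t_view * (
        e0 * t_sun * math.cos(math.radians(theta0_deg)) + e_diffuse)


L_T = target_radiance(rho=0.3, t_view=0.9, e0=1370.0, t_sun=0.8,
                      theta0_deg=30.0, e_diffuse=100.0)
L_p = 12.0  # assumed path radiance scattered into the view, W m^-2 sr^-1
L_S = L_T + L_p
print(round(L_S, 1))  # ~102 W m^-2 sr^-1
```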

Why we care about this


reflectance of the surface tells us about the
nature of the material (composition,
roughness, etc.)
atmospheric effects obscure our ability to
measure surface properties
surrounding materials can also add some
confusion about the nature of the surface

Atmospheric Correction

a) Image containing substantial haze prior to atmospheric correction. b)


Image after atmospheric correction using ATCOR (Courtesy Leica
Geosystems and DLR, the German Aerospace Centre).

Landsat TM Bands 4,3,2

Landsat TM Bands 3,2,1

TOA Reflectance
Surface Reflectance


Imaging Principles
(parts of Chapters 1,4,5 & 7)

Outline for Lecture 5

Photographic camera/film systems


Digital electro-optical systems
Raster image creation
Image spatial resolution

Photographic Camera/Film Systems (pp.95-98)


Camera Components

Aerial Photography
Camera in Aircraft

Aerial Photograph Film and Paper (pp.110-115)


A non-digital aerial photograph is acquired
instantaneously as a film negative that is made into a
paper positive print

Aerial Photograph Film


Films are coated with photo-sensitive silver
halide emulsions to record an image (pg. 112)
Fine grain

Coarse grain

Spatial resolution of film depends in part on the size of the


silver halide crystals

Different Types of Photographic Films (pg.111)

Film is always passive; it uses reflected sunlight and
only works in the range of 0.4 - 1.0 μm

Different Types of Photographic Films (pg. 111)

Panchromatic: 0.4-0.7 μm

Black & White IR: 0.7-0.9 μm

Different Types of Photographic Films (pg. 111)

Color: 0.4-0.7 μm

Color IR: 0.5-0.9 μm

Electro-optical vs. Photographic Systems


Both can be used to create images
photography is non-scanning
electro-optical systems can be either scanning or non-scanning

Both record interactions with radiation


films coated with photo-sensitive silver halide emulsions to
record an image
electro-optical systems detect radiation through an electrical
system

EMR detection capabilities are quite different


film is always passive, uses reflected sunlight, and only works in
the range of 0.4 - 1.0 μm
digital systems can be active or passive, can work in all
parts of EM spectrum, high spectral resolution is possible,
can be non-imaging system (e.g. laser altimeter)

Basic Types of Imaging Systems (pg. 196)

Images Made from Digital Numbers (DN)
for each Image Band

at-sensor radiance -> imaging optics -> detectors -> electronics -> DN

The DN that is recorded is proportional to the
radiance at the sensor
(Note that the text uses the term Brightness Value (BV), but the
more appropriate and commonly used term is DN)

Digital Raster Image Format (pg. 195)

Image Spatial Resolution (pp. 15-16)


A measure of the smallest angular or linear
separation between two objects that can be
resolved by the sensor (pg. 15)
Resolving power is the ability to perceive two
adjacent objects as being distinct; it depends on:
size
distance
shape
color
contrast with background
sensor characteristics

Instantaneous Field Of View (IFOV) (pg.3)


Instantaneous field of
view (IFOV) is the
angular field of view of
the sensor, independent
of height
IFOV is a relative
measure because it is an
angle, not a length
It can be measured in
radians or degrees

sensor

β = IFOV, the angular field
of view (measured in
degrees or radians)

GIFOV
Ground-projected instantaneous field of view
(GIFOV) depends on satellite height (H) and
the IFOV; the ground footprint is a circle of diameter GIFOV.

(1/2) GIFOV / H = tan(β/2)
GIFOV = 2 H tan(β/2)

H = satellite altitude
β = IFOV, the angular field
of view (measured in
degrees or radians)
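GIFOV = 2 H tan(β/2) in Python; the altitude and IFOV values are illustrative (Landsat-like numbers, assumed here to show a ~30 m footprint):

```python
import math

# Ground-projected IFOV: GIFOV = 2 * H * tan(beta / 2),
# where H is the sensor altitude and beta is the IFOV angle in radians.
def gifov_m(altitude_m, ifov_rad):
    return 2.0 * altitude_m * math.tan(ifov_rad / 2.0)


# An assumed 42.6 microradian IFOV from 705 km altitude gives
# roughly a 30 m ground footprint.
print(round(gifov_m(705e3, 42.6e-6), 1))  # ~30 m
```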

PIXEL
SPATIAL
RESOLUTION
(pp. 16-17)

Imaging Principles (cont.)


Spectral and
Temporal Resolution
(Chapters 1 & 7)

Outline for Lecture 6


Radiometric resolution
Quantization
Gain

Signal strength and signal/noise ratio


Spectral Resolution
Spectral Bands: number and width
Temporal Resolution
Multi-temporal imagery

Radiometric Resolution (pg. 18)


Number of digital values (gray levels) that a
sensor can use to express variability of signal
(brightness) within the data
Determines the information content of the
image
The more digital values, the more detail can
be expressed

Radiometric Resolution (pg. 18)


Determined by the number of bits
within which the digital information is
encoded
2^1 = 2 levels (0,1)
2^2 = 4 levels (0,1,2,3)
2^8 = 256 levels (0-255)
2^12 = 4096 levels (0-4095)
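The bits-to-levels mapping as a quick Python check:

```python
# Radiometric resolution: n bits encode 2**n gray levels (DN 0 .. 2**n - 1).
def gray_levels(bits):
    return 2 ** bits


for bits in (1, 2, 8, 12):
    print(bits, "bits ->", gray_levels(bits),
          "levels, DN range 0 -", gray_levels(bits) - 1)
```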

8 bit radiometric resolution vs. 2 bit radiometric resolution [images]

[Figure: sensor response vs. scene brightness (dark to bright), showing
the dark current signal, the ideal linear response (offset for clarity),
the actual sensor response, saturation at high brightness, and the
dynamic range between dark current and saturation]

Signal Strength
Need enough photons incident on the
detector to record a strong signal
Signal strength depends on
Energy flux from the surface
Altitude of the sensor
Location of the spectral bands (e.g. visible, NIR,
thermal, etc.)
Spectral bandwidth of the detector
IFOV
Dwell time (more on this next week)

Signal-to-Noise Ratio (SNR)


A sensor responds to both signal strength and
electronic errors from various sensor
components (noise)
SNR = signal-to-noise ratio = signal / noise

signal = the actual energy reaching the detector
noise = random error in the measurement (all
systematic noise has been removed)
To be effective, a sensor must have a high SNR

Signal is the mean value of n DN values
from a uniform ground feature (mean DN)

Noise = sqrt( Σ_{i=1..n} (DN_i - mean DN)^2 / (n - 1) )

Digital Numbers (DN)

Signal (mean DN) is the mean of all DN values
n is the number of DN values
Noise is the sample standard deviation of the n DN
values

Example SNR calculation
A uniform material of
known reflectance (50%)

DNs from a remote
sensing detector:
200, 201, 199, 203, 201, 200, 202

Signal (mean) ≈ 200.9; Noise (std. dev.) ≈ 1.35; SNR ≈ 149

This means that you cannot reliably
detect a difference in reflectance
smaller than about 1/3 of 1% reflectance
(50% / 149) for this example
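The SNR for the slide's DN values, computed with Python's `statistics` module:

```python
import statistics

# SNR = mean DN / sample standard deviation of DN,
# using the DN values from the example above.
dns = [200, 201, 199, 203, 201, 200, 202]
signal = statistics.mean(dns)
noise = statistics.stdev(dns)  # sample std. dev. (n - 1 in the denominator)
snr = signal / noise
print(round(signal, 1), round(noise, 2), round(snr))  # 200.9 1.35 149
```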

Noise Equivalent Reflectance

Noise Equivalent Radiance
Noise Equivalent Temperature
A measure of the smallest magnitude of
signal, or of a change in signal, that can be
detected
Can be expressed in terms of reflectance (ρ),
radiance (L), or temperature (T)

Spectral Resolution (pg. 14)


The width and number of spectral intervals in
the electromagnetic spectrum to which a
remote sensing instrument is sensitive
Allows characterization based on geophysical
parameters (chemistry, mineralogy, etc.)

Spectral Resolution (pg. 14)


Determined by:
the number of spectral bands
the width of each band
Described by the full-width at half-maximum (FWHM)
of the spectral response function (SRF) of each band

(pg. 15)

[Figure: spectral response functions and FWHM for different spectral
bands of the Advanced Land Imager (ALI) sensor on the NASA EO-1 satellite]

SPECTRAL BANDS (pp.14-15)



Multiple Bands (pg. 15)

Landsat Thematic Mapper (TM and ETM+) Bands (pp. 207-208,211)

Surface components with very distinct


spectral differences can be resolved using
broad wavelength ranges

Hyperspectral Remote Sensing


(pg. 16 and Color Plate 7-10)

Hundreds
of bands

Airborne Visible Infrared Imaging


Spectrometer (AVIRIS) bands (pp. 240-241)

Subtle differences require finer spectral resolution (pg. 239)

vegetation spectral signatures from Jasper Ridge

Temporal Resolution (pp.17-18)


The frequency of data acquisition over
an area
Temporal resolution depends on:
the orbital parameters of the satellite
latitude of the target
swath width of the sensor
pointing ability of the sensor

MULTI-TEMPORAL IMAGERY (pg. 18)


Multi-temporal imagery is important for
infrequent observational opportunities (e.g.,
when clouds often obscure the surface)
short-lived phenomenon (floods, oil spills,
etc.)
rapid-response (fires, hurricanes)
detecting changing properties of a feature to
distinguish it from otherwise similar features

TEMPORAL RESOLUTION (pp. 17-18)


Examples of sensor temporal resolution:
SPOT - 26 days (1, 4-5 days with pointing)
Landsat - 16 days
MODIS - 16 day repeat, 1-2 day coverage
AVHRR - 9 day repeat, daily coverage
GOES - 30 minutes

More about this when we discuss orbits (see chapter 7)

SPATIAL AND TEMPORAL RESOLUTION (Color Plate 1-1 and pg. 19)

Orbits and Sensors


Multispectral Sensors
(Chapter 7)

Outline for Lecture 7


Orbits: geostationary, polar
Imaging Systems (some review):
Cross-track scanners (whiskbroom sensor)
Pushbroom sensors

Swaths and pixels (some review):


Field of View (swath width)
Pixel size

Multispectral sensors
Landsat

Satellite Orbits
Orbital parameters can be tuned to produce
particular, useful orbits
Geostationary (Geo)
Sun synchronous (Polar, Low Earth Orbit,
LEO)

Geostationary Orbit (pg. 212)


Geostationary orbit is stationary with respect to a
location on the earth
Circular orbit around the equator (orbital inclination =
zero)
Placed in high orbit (22,240 mi or 35,790 km) to
match the angular velocity of Earth (24 hour
rotational period)
Low spatial resolution

Geostationary Orbits (pp. 213-215)


Weather satellites (GOES,
METEOSAT)
Constant monitoring
Communication satellites

Geostationary Orbits (pp. 213-215)


Large spatial coverage
each satellite can provide useful coverage for about 25-30%
of the Earth's surface
useful coverage extends only to the mid-latitudes, no more
than about 55° N and S.

Sun-synchronous (Polar) Orbit


Low Earth Orbit (LEO) are typically about 700 km
altitude
Satellite orbits in a fixed plane and the Earth rotates
beneath it
Satellite crosses the equator at the same time each
day
near-polar orbit is very common
Orbital inclination is retrograde (typically ~98o)
Circular orbits have period of about 98-102 minutes

Landsat
Sun-synchronous
Orbit (pg. 200)
Inclination
Angle

Animation of Geosynchronous
and Sun-synchronous orbits

Uses of Sun-Synchronous Orbits


Local equatorial crossing time depends on
nature of application (low sun angle vs. high
sun angle needs)
Provides global coverage effective for Earth
monitoring
LEO provides higher spatial resolution than
geosynchronous satellites

Terra satellite orbits for Oct. 18, 2011

See http://www.ssec.wisc.edu/datacenter/terra/

Terra satellite
overpasses for
Oct. 18, 2011
over North
America
note the
ascending and
descending
orbits (see the
time stamps)

See http://www.ssec.wisc.edu/datacenter/terra/

http://www.gearthblog.com/satellites.html
There are thousands of satellites orbiting the Earth
See this website for real time satellite tracking!

Getting the Data to the Ground

Getting the Data to the Ground


On-board recording and pre-processing
Direct telemetry to ground stations
receive data transmissions from satellites
transmit commands to satellites (pointing,
turning maneuvers, software updates)

Indirect transmission through Tracking


and Data Relay Satellite System
(TDRSS)

Imaging Systems (pp. 194-197)


Cross-track scanning systems
whiskbroom

Along-track (non-scanning) system


pushbroom

Linear Array Whiskbroom (pg. 196)


Single detector or a linear
array of detectors
Back and forth motion of the
scanner creates the orbital
swath
Image is built up by movement
of satellite along its orbital track
Produces a wide field-of-view
Pixel resolution varies with
scan angle

Field of View (FOV)


FOV is the swath width of the instrument
It is the width of an orbital track projected on
the Earth surface
Depends on the across track scan angle of
the sensor (for whiskbroom) or the width of
the linear detector array (for a pushbroom
sensor)

Calculating the Field of View (FOV)


FOV = 2 H tan(θ)
H = satellite altitude, θ = scan half-angle
Example:
SeaWiFS satellite altitude = 705 km
Scan angle = 58.3°
FOV = 2 × 705 km × tan(58.3°)
= 1410 km × 1.619
= 2282 km

FOV
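The swath calculation above can be reproduced directly. A small sketch of the FOV formula (the function name and units are illustrative):

```python
import math

def swath_width_km(altitude_km, scan_half_angle_deg):
    """Ground swath of a scanner: FOV = 2 * H * tan(theta)."""
    return 2 * altitude_km * math.tan(math.radians(scan_half_angle_deg))

# SeaWiFS example from the slide: H = 705 km, theta = 58.3 degrees
fov = swath_width_km(705, 58.3)   # ~2282 km
```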

Linear Array Pushbroom (pg. 196)


Linear array of detectors (aligned
cross-track)
radiance passes through a lens and
onto a line of detectors

Image is built up by movement of the


satellite along its orbital track (no
scanning mirror)
Multiple linear arrays are used for
multi-spectral remote sensing
dispersion element splits light into
different wavelengths and onto
individual detectors

Dwell Time
The amount of time a scanner has to collect
photons from a ground resolution cell
Depends on:

satellite speed
width of scan line
time per scan line
time per pixel

Whiskbroom scanners have much shorter


dwell time than do pushbroom scanners
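The dwell-time contrast can be sketched with a toy calculation. The line time and pixel count below are assumed, illustrative numbers, not values from any particular sensor:

```python
def whiskbroom_dwell_us(line_time_s, pixels_per_line):
    """Whiskbroom: one detector (or small array) sweeps the whole line,
    so each ground cell gets only line_time / n_pixels of integration."""
    return line_time_s / pixels_per_line * 1e6

def pushbroom_dwell_us(line_time_s):
    """Pushbroom: every detector in the linear array stares at its own
    ground cell for the full line time."""
    return line_time_s * 1e6

# Illustrative numbers (assumed): 10 ms per scan line, 3000 cross-track pixels
w = whiskbroom_dwell_us(0.010, 3000)   # a few microseconds
p = pushbroom_dwell_us(0.010)          # the full 10,000 microseconds
```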

Whiskbroom vs. Pushbroom


Can have a wide swath
width (but not always)
Complex mechanical
system
Simple optical system
Shorter dwell time
Pixel distortion at edge
of swath
Example: MODIS,
AVHRR, SeaWiFS

Relatively narrow swath


width
Simple mechanical
system
Complex optical system
Longer dwell time
No pixel distortion
Example: MISR

The LANDSAT Program (pp. 197-211)

Pg. 198
History of
the Landsat
series
Currently,
Landsat 5
and
Landsat 7
(ETM+)
are in orbit

Landsat MSS
1972-present
(pg. 199)

Landsat MSS Bands and their Uses


Band 4 (Green: 0.5 - 0.6 μm)
water features (large penetration depths)
sensitivity to turbidity (suspended sediments)
sensitivity to atmospheric haze (lack of tonal contrast)

Band 5 (Red: 0.6 - 0.7 μm)


chlorophyll absorption region
good contrast between vegetated and non-veg. areas
haze penetration better than Band 4

Band 6 (NIR1: 0.7 - 0.8 μm) and Band 7 (NIR2: 0.8 - 1.1 μm)


similar for most surface features
good contrast between land and water (water is strong
absorber in near IR)
both bands excellent haze penetration
Band 7 good for discrimination of snow and ice

(pg. 203)

Landsat MSS Images of Mount St. Helens

September 15, 1973

May 22, 1983

August 31, 1988

Landsat 1,2,3
Attitude-control (pg. 200)
subsystem

Solar array

Wideband recorder
electronics

Attitude
measurement
sensor
Data
collection
antenna

Return Beam
Vidicon (RBV)
cameras (3)

Multispectral
Scanner (MSS)

Landsat Sun-synchronous Orbits (pg. 200)

Retrograde orbit with a


99° inclination angle

(pg. 211)

Landsat Thematic Mapper (TM)


(pg. 204)

High-gain
antenna

Global positioning
system antenna
Attitude control
module
Propulsion
module

Power
module

Multispectral
Scanner (MSS)

Thematic
Mapper
(TM)

Solar
array
panel

Landsat Thematic Mapper Bands and their Uses (pg. 205)


Band 1 (Blue: 0.45 - 0.52 μm)
good water penetration
differentiating soil and rock
surfaces from vegetation
smoke plumes
most sensitive to atmospheric
haze

Band 2 (Green: 0.52 - 0.60 μm)


water turbidity differences
sediment and pollution plumes
discrimination of broad classes of
vegetation

Band 3 (Red: 0.63 - 0.69 μm)


strong chlorophyll absorption (veg.
vs. soil)
urban vs. rural areas

Landsat Thematic Mapper Bands and their Uses (pg. 205)


Band 4 (NIR1: 0.76 - 0.90 μm)
different vegetation varieties and
conditions
dry vs. moist soil
coastal wetland, swamps, flooded
areas

Band 5 (NIR2: 1.55 - 1.75 μm)


leaf-tissue water content
soil moisture
snow vs cloud discrimination

Band 6 (Thermal: 10.4 - 12.5 μm)


heat mapping applications (coarse
resolution)
radiant surface temperature range: −100°C to +150°C

Band 7 (NIR3: 2.08 - 2.35 μm)

absorption band by hydrous minerals


(clay, mica)
lithologic mapping (clay zones)

Differences in spectral
reflectance provide the
basis for mapping snow,
vegetation, soil, etc.
Landsat TM Band 4
(0.76 - 0.90 μm) and
Band 5 (1.55 - 1.75 μm)
show strong
differences in
reflectance of snow
and ice

(pg. 211)

Landsat 7 Enhanced Thematic


Mapper (ETM+)
1999-present
15 m panchromatic band
on-board calibration

Landsat ETM+ Scan Line Corrector Problem (since May 2003)

(pg. 207)

Multispectral and Hyperspectral


Remote Sensing

Lecture Outline
More Examples of multispectral sensors

Landsat (continued)
AVHRR
MODIS
ASTER

Multispectral vs. Hyperspectral Sensors


Examples of hyperspectral sensors
AVIRIS
Hyperion


Advanced Very High Resolution Radiometer


(AVHRR) (pp. 215-217)
Very High Temporal Resolution
AVHRR on NOAA-18 Satellite

AVHRR Sensor (pg. 216)


5 bands designed for hydrological,
oceanographic, and meteorological studies
On-board NOAA satellites, 1981 to present
NOAA-18 satellite
Coarse spatial resolution (1.1 km at nadir)
Daily Global coverage (FOV = 2400 km)
Near-polar orbit (98.9° inclination)
10-bit quantization radiometric resolution

Band 1: Daytime cloud,


snow, ice, and vegetation
mapping
Band 2: Delineation of
land/water; snow, ice, and
vegetation mapping
Band 3: Monitoring of hot
targets (volcanoes, fires),
nighttime clouds mapping
Band 4: Day and night
cloud and surface-temperature mapping

(pg. 217)

Band 5: Cloud and surface


temperature, day and night
cloud mapping, atmospheric
correction

AVHRR Swath

AVHRR Swath

AVHRR Swaths are Pieced Together into Global Coverage

Moderate-resolution Imaging Spectroradiometer


(MODIS) (pp. 242-244)

On-board NASA's Terra satellite since 1999 and on-board


Aqua satellite since 2002

Moderate-resolution Imaging Spectroradiometer


(MODIS) (pp. 242-244)
Spectral Range

Spatial Coverage
Spatial Resolution

Radiometric Resolution
Temporal Resolution

0.4 - 14.4 μm
20 bands in Vis/NIR
16 bands in mid-IR and Thermal
±55°, 2330 km swath
250 m (2 bands)
500 m (5 bands)
1000 m (29 bands)
12-bit
1-2 days, 4 days at equator

MODIS Global Daily Composite Image (pg. 244)

Advanced Spaceborne Thermal Emission


and Reflection Radiometer (ASTER)
(pp.231-232)

On-board Terra satellite

Advanced Spaceborne Thermal Emission


and Reflection Radiometer (ASTER)
(pp.231-232)
On-board Terra satellite
Consists of 3 different subsystems:
Visible/Near-infrared, Shortwave IR, Thermal IR

Pointable
VNIR has stereo capability (for DEM creation)
Swath width of only 60km
8-bit quantization

Advanced Spaceborne Thermal Emission and


Reflection Radiometer (ASTER) (pg. 231)
VNIR 0.52 - 0.86 μm
SWIR 1.6 - 2.5 μm
TIR 8 - 12 μm
Spatial Resolution 15 m (VNIR : 3 bands)
30 m (SWIR: 6 bands)
90 m (TIR: 5 bands)
Spectral Range

(pg. 232)

Pearl Harbor, Oahu, HI


ASTER SWIR image 15 x 15 m
(single band image)

Pearl Harbor, Oahu, HI


ASTER VNIR image 15 x 15 m
(RGB=1,4,3)

(pg. 233)

Three different thermal bands: 10, 12, 14

Multispectral vs. Hyperspectral


Multispectral
separated spectral bands
wider bands
coarse representation of the
spectral signature
not able to discern small
differences between reflectance
spectra
smaller image size
fewer problems with calibration
multi-decadal history of continuous
image acquisition

Hyperspectral
no spectral gaps
narrow bands (~10 nm)
complete representation of the
spectral signature
ability to detect subtle spectral
features
large image size
radiometric and spectral calibration
require large effort
only one sensor in orbit (research
only)

Hyperspectral Bands (pg. 240)

Published Bandwidths
Defined by Full Width
at Half Maximum
(Pg. 15)


AVIRIS
(pp. 240-241)

Color
Plate
11-2

Similar to
Color
Plate 14-1

Hyperion: Imaging Spectrometer on EO-1

Hyperion: Imaging Spectrometer on EO-1


Spectral Range
Spatial Resolution
Radiometric Resolution
Swath width

0.43 - 2.4 μm
220 bands
30 m
12-bit quantization
7.6 km (pushbroom)

Hyperion Images with Different Band Combinations

30, 21, 15

50, 23, 16

197, 107, 18

115, 35, 23

Multi-angular and Thermal


Remote Sensing

Lecture Outline
Multi-angular remote sensing concept
MISR
Forward and backward scattering
Surface roughness

Emissivity concepts revisited


Spectral emissivity
Kirchhoff's Law
Diurnal energy cycles

Thermal remote sensing imagers


ASTER
FLIR

Principles of Multi-angular
Remote Sensing (pp.232-234)
Reflectance from natural surfaces is
anisotropic
the angular pattern of reflectance is described by
the bidirectional reflectance distribution function

Measured reflectance depends on solar and


viewing geometry
solar zenith angle
viewing zenith angle
relative azimuth between sun and sensor

Multi-angle Imaging
SpectroRadiometer (MISR)
(pg. 234)

Onboard NASA's Terra satellite

9 view angles at Earth surface:


nadir to 70.5° forward and aft
4 bands at each view angle:
446, 558, 672, 866 nm
Continuous pole-to-pole coverage
on orbit dayside
360-km swath
9 day coverage at equator
2 day coverage at poles
275 m - 1.1 km spatial sampling
7 minutes to observe each scene
at all 9 angles
14-bit quantization
On-board radiometric calibration

Automated co-registration
of MISR imagery

SPACE OBLIQUE MERCATOR


PROJECTION

(Note: MISR view angles are measured


relative to Earth surface, not the sensor)
Physical MISR instrument

Image grid

9 angles x 4 bands
Earth's surface

36 non-registered images
Virtual MISR instrument

36 co-registered images

9 angles x 4 bands
WGS84 ellipsoid

SOM grid

(pg. 21)

Changes in scene
brightness with angle
Oblique view looking at
forward scatter

MISR flight direction

(pg. 21)

Changes in scene
brightness with angle
Less oblique view looking
at backscatter

MISR flight direction

Multi-angle Imaging
Spectroradiometer
(MISR) Onboard Terra
Table 1. List of bands and camera names for the MISR image.

Image Band Number | MISR Camera Name | Spectral Band | Camera Viewing Angle (degrees)
 1 | Df | Red     | +70.5
 2 | Cf | Red     | +60.0
 3 | Bf | Red     | +45.6
 4 | Af | Red     | +26.1
 5 | An | Blue    |  0
 6 | An | Green   |  0
 7 | An | Red     |  0
 8 | An | Near-IR |  0
 9 | Aa | Red     | -26.1
10 | Ba | Red     | -45.6
11 | Ca | Red     | -60.0
12 | Da | Red     | -70.5

A,B,C and D are camera names


a is aft (negative viewing
angle)
f is forward (positive viewing
angle)
n is nadir (zero viewing angle)
Jensen, 2007

Visualizing surface
textures with MISR
nadir view
blue band

multi-spectral
compositing
using An
cameras

nadir view
green band

nadir view
red band

Hudson and
James Bays
24 February 2000

380 km

Visualizing surface
textures
forward scatter
red band

nadir
red band

backscatter
red band

stratocumulus
cloud

pack ice
(rough)

multi-angular
compositing
using Df, An, Da
cameras

Hudson and
James Bays
24 February 2000

fast ice
(smooth)

Angular reflectance signatures from MISR

backward scattering

forward scattering

Thermal Remote Sensing


(Chapter 8)

Sources of surface temperature


gain and loss (pp. 249-250)
Absorbed shortwave energy (emitted from Sun) (heat gain)
Longwave emitted energy from earth (heat loss)
Anthropogenic heating
urban areas
power plants

Geothermal
volcanoes
hot springs

Thermal basics, revisited (pg. 257)

Recall that radiant


temperature is not equal to
kinetic temperature
(except for a blackbody)
Emissivity is the ratio of
radiant emittance relative
to that of a blackbody
Emissivity is a spectral
quantity (it varies with
wavelength)

Thermal Basics, revisited


Planck's Blackbody Equation:
describes the spectral distribution of emitted energy as a function of
temperature

Stefan-Boltzmann Law:
shows that emitted radiation increases as a function of the 4th
power of the temperature

Wien's Law:
the wavelength of peak blackbody emission is inversely
proportional to temperature
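These laws are easy to evaluate numerically. A sketch using standard constants (assumed values, not given on the slides):

```python
# Wien's displacement law and the Stefan-Boltzmann law.
# Standard constants (assumed values):
WIEN_B = 2897.8     # Wien displacement constant, um * K
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength_um(T):
    """Wavelength of maximum blackbody emission (Wien's law)."""
    return WIEN_B / T

def total_emittance(T):
    """Total emitted power per unit area (Stefan-Boltzmann law)."""
    return SIGMA * T**4

# Sun (~6000 K) peaks near 0.48 um; Earth (~300 K) near 9.66 um,
# matching the dominant wavelengths quoted on the next slide
sun_peak = peak_wavelength_um(6000)
earth_peak = peak_wavelength_um(300)
```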

Radiant
Intensity
of the Sun

Jensen 2007

The Sun approximates a 6,000 K
blackbody with a dominant
wavelength of 0.48 μm (green
light). Earth approximates a 300
K blackbody with a dominant
wavelength of 9.66 μm. The
6,000 K Sun produces 41% of its
energy in the visible region from
0.4 - 0.7 μm (blue, green, and
red light). The other 59% of the
energy is in wavelengths shorter
than blue light (<0.4 μm) and
longer than red light (>0.7 μm).
Eyes are only sensitive to light
in the 0.4 to 0.7 μm range. Remote
sensor detectors can be made
sensitive to energy in the non-visible regions of the spectrum.

Atmospheric Windows for Thermal Remote Sensing (pg. 52)

3-4 μm, 8-9 μm, 10.5-12.5 μm

Jensen 2007

a) The absorption of the Sun's incident
electromagnetic energy in the region
from 0.1 to 30 μm by various
atmospheric gases. The first four graphs
depict the absorption characteristics of
N2O, O2 and O3, CO2, and H2O, while
the final graphic depicts the cumulative
result of all these constituents being in
the atmosphere at one time. The
atmosphere essentially closes down
in certain portions of the spectrum
while atmospheric windows exist in
other regions that transmit incident
energy effectively to the ground. It is
within these windows that remote
sensing systems must function.
b) The combined effects of atmospheric
absorption, scattering, and reflectance
reduce the amount of solar irradiance
reaching the Earth's surface at sea
level.

Kirchhoff's Law (pp. 257-258)


1 = reflectance + emittance
emittance = 1 - reflectance
This means that (at a particular
wavelength) low reflectance also
means high emittance
And the opposite: good reflectors are
poor emitters

Emittance spectra of
various minerals
Spectral Emittance
Signatures

Similar to spectral reflectance


signatures, emittance spectra can
be exploited to map and
characterize surface materials

Diurnal Temperature and Energy Cycles (pg. 275)

Diurnal Temperature and Energy Cycles


For Different Ground Materials (pg. 275)

Thermal Imagers
Detectors in the thermal region need to be cooled
Changes in detector temperature are a source of noise
(increases the NEΔT)
Less energy so pixel size is relatively large
Sensors with thermal bands:

ATLAS
Landsat TM
AVHRR
MODIS
ASTER
FLIR (Forward Looking Infrared -- airborne sensor)

(pg. 277)
ATLAS
Daytime
TIR Images

ATLAS
Predawn
TIR Images

ASTER Image of
Saline Valley, CA

RGB composite
image using VNIR
bands (3, 2, 1)
(vegetation is red)

ASTER Image of
Saline Valley, CA
RGB composite
image using SWIR
bands (4, 6, 8)
(show reflectance
differences of
clays, carbonate
and sulfate
minerals)

ASTER Image of
Saline Valley, CA
RGB composite
image using TIR
bands (13, 12, 10)
(quartz is red,
carbonates are
green, mafic
volcanic rocks are
purple)

Forward Looking InfraRed (FLIR) (pg. 272)

FLIR
Sensor

Monochrome FLIR image


Mount St. Helens eruption

Hot lava in crater dome

Color-enhanced FLIR image


Helicopter-mounted in-stream temperature monitoring
Provides info on spatial variability

Side channel with cooler water temperatures on Cavitt Creek.


(Temperature scale in °C).
Courtesy Anthony Olegario

Passive Microwave Remote


Sensing
(Chapter 9, pp. 330-332)

Outline for Lecture 10


Passive Microwave Remote Sensing

Rayleigh-Jeans Law vs. Planck's Law


Brightness vs. Kinetic temperature
Dielectric constant
Sensors and applications

Passive Microwave Remote Sensing


(pg. 330)
Microwave region: 0.15-30 cm (1-200 GHz)
Uses the same imaging principles as
thermal remote sensing
Multi-wavelength (or frequency)/multi-polarization sensing
Weak energy source so need large IFOV
and wide spectral bands

Microwave Brightness Temperature


(pg. 330)
Microwave radiometers measure the emitted
spectral radiance received (L) but record data as a
matrix of Brightness Temperature values that are
linearly related to the spectral radiance received.
The Rayleigh-Jeans Law is an approximation to
Planck's blackbody radiation law that provides a
simple linear relationship between measured
spectral radiance and Brightness Temperature at
long wavelengths

Rayleigh-Jeans Law (1905)

L_λ = 2kcT_kin / λ^4

Planck's Law vs. Rayleigh-Jeans Law


The Rayleigh-Jeans Law
described long wavelength
radiation well, but failed
to describe short
wavelength radiation

microwave

Planck's Law vs. Rayleigh-Jeans Law

Planck's Law

L_λ = (2hc^2 / λ^5) × 1 / (e^x − 1),   where x = hc / (λkT_kin)

c  speed of light        3.0 × 10^8 m s^-1
h  Planck constant       6.63 × 10^-34 J s
k  Boltzmann constant    1.38 × 10^-23 J K^-1

Rayleigh-Jeans Equation for Real Materials

L_λ = 2kcεT_kin / λ^4

k is the Boltzmann constant, c is the speed of light, ε is
emissivity, T_kin is kinetic temperature
This approximation only holds for λ >> λ_max
(e.g. λ > 2.57 mm @ 300 K)
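The quality of the Rayleigh-Jeans approximation can be checked numerically against Planck's law. A sketch with standard physical constants (assumed values):

```python
import math

# Standard constants (assumed values):
C = 3.0e8        # speed of light, m/s
H = 6.626e-34    # Planck constant, J s
K = 1.381e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, T):
    """Planck's law: spectral radiance of a blackbody."""
    x = H * C / (wavelength_m * K * T)
    return (2 * H * C**2 / wavelength_m**5) / (math.exp(x) - 1)

def rayleigh_jeans_radiance(wavelength_m, T):
    """Rayleigh-Jeans approximation: L = 2 k c T / wavelength^4."""
    return 2 * K * C * T / wavelength_m**4

# At a microwave wavelength (1.55 cm) and 300 K the two laws agree to
# well under 1%; at a visible wavelength the approximation fails badly.
ratio_mw = rayleigh_jeans_radiance(0.0155, 300) / planck_radiance(0.0155, 300)
```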


Brightness Temperature

The Brightness Temperature T_b of a real
material typically is defined as

T_b = L_λ λ^4 / (2kc)

Rayleigh-Jeans and Brightness Temperature


for Real Materials

L_λ = 2kcεT_kin / λ^4

T_b = L_λ λ^4 / (2kc)

T_b = εT_kin
k is the Boltzmann constant, c is the speed of light, ε is
emissivity, T_kin is kinetic temperature
This approximation only holds for λ >> λ_max
(e.g. λ > 2.57 mm @ 300 K)

Brightness temperature at long wavelengths


is linearly related to kinetic temperature
through the emissivity of the material

Thus, passive microwave brightness


temperatures can be used to monitor
temperature as well as properties related to
emissivity
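The round trip from kinetic temperature to radiance and back to brightness temperature can be sketched as follows (the wavelength, emissivity, and temperature values are illustrative):

```python
# Standard constants (assumed values):
C = 3.0e8        # speed of light, m/s
K = 1.381e-23    # Boltzmann constant, J/K

def rj_radiance(wavelength_m, emissivity, T_kin):
    """Rayleigh-Jeans radiance of a real (gray) material."""
    return 2 * K * C * emissivity * T_kin / wavelength_m**4

def brightness_temperature(radiance, wavelength_m):
    """Invert Rayleigh-Jeans: Tb = L * wavelength^4 / (2 k c)."""
    return radiance * wavelength_m**4 / (2 * K * C)

# Round trip: Tb should equal emissivity * Tkin
L = rj_radiance(0.0155, 0.9, 290.0)
Tb = brightness_temperature(L, 0.0155)   # 0.9 * 290 = 261 K
```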

Microwave Radiometers
Advanced Microwave Sounding Unit (AMSU) 1978-present
Scanning Multichannel Microwave Radiometer (SMMR)
1981- 1987
Special Sensor Microwave/Imager (SSM/I) 1987-present
Tropical Rainfall Measuring Mission (TRMM) 1997-present
Advanced Microwave Scanning Radiometer (AMSR-E)
2002-present

Passive Microwave Radiometry


Passive microwave sensors use an antenna
(horn) to detect photons at microwave
frequencies which are then converted to
voltages in a circuit
Scanning microwave radiometers
mechanical rotation of mirror focuses
microwave energy onto horns

1 GHz = 30 cm wavelength
10 GHz = 3 cm
100 GHz = 3 mm

Passive Microwave Applications

Soil moisture
Snow water equivalent
Sea/lake ice extent, concentration and type
Sea surface temperature
Atmospheric water vapor
Surface wind speed
only over the oceans
Cloud liquid water
Rainfall rate

Monitoring Temperatures with


Passive Microwave
Sea surface temperature

Land surface temperature

Passive Microwave Sensing of Land


Surface Emissivity Differences
Microwave emissivity is a function of the
dielectric constant. Technically, this is the ratio
of the amount of electrical energy stored in a
material by an applied voltage, relative to that
stored in a vacuum.
Most earth materials have a dielectric constant in
the range of 1 to 4 (air=1, veg=3, ice=3.2)
Dielectric constant of liquid water is 80
Thus, moisture content affects brightness
temperature
Surface roughness also influences emissivity

Soil Moisture from AMSR-E


Dry soil has low emissivity
Wet soil has high emissivity
Soil moisture value represents an average
over the entire pixel
Represents top ~1 cm of soil
Vegetation is a problem
Salinity, urban areas, etc. also influence the
measurement

Soil moisture
and the 2004
flood in Texas
and Oklahoma

Snow Emissivity Example


Dry
Snow
dry snow

(2)

Soil

snow water equivalent


Wet
Snow
(1)

Soil

(3)

Soil

Wet snow is a strong


absorber/emitter

SSM/I
Northern
Hemisphere
snow water
equivalent
(mm of water)

Atmospheric Effects
At frequencies less than 50 GHz, there's
little effect of clouds and fog on brightness
temperature (it "sees" through clouds)
Thus, PM can be used to monitor the land
surface under cloudy conditions
In atmospheric absorption bands, PM is
used to map water vapor, rain rates, clouds

Atmospheric Mapping

Mapping
global water
vapor
85 GHz

Passive Microwave Sensing of Rain


Over the ocean:
Microwave emissivity of rain (liquid water) is about 0.9
Emissivity of the ocean is much lower (0.5)
Changes in emissivity (as seen by the measured
brightness temperature) provide an estimate of surface
rain rate

Over the land surface:


Microwave scattering by frozen hydrometeors is used
as a measure of rain rate
Physical or empirical models relate the scattering
signature to surface rain rates

Rainfall from
passive
microwave
sensors:
Accumulated
precipitation from
the Tropical
Rainfall
Measuring
Mission (TRMM)
Similar to SSM/I

Passive Microwave Remote Sensing from Space


Advantages

Disadvantages

Penetration through non-precipitating clouds


Radiance is linearly related to
temperature (i.e. the retrieval is
nearly linear)
Highly stable instrument
calibration
Global coverage and wide
swath

Larger fields of view (10-50 km) compared to


VIS/IR sensors
Variable emissivity over
land
Polar orbiting satellites
provide discontinuous
temporal coverage at low
latitudes (need to create
weekly composites)

2006 melt on the


Greenland ice sheet
150% increase in melt
intensity compared with
an 18-year average
Data are from the SSM/I
passive microwave sensor
onboard the Defense
Meteorological Satellite
Program (DMSP)
Courtesy of Marco
Tedesco, NASA Goddard
Space Flight Center

Light Detection and Ranging


(LIDAR) Remote Sensing
(Chapter 10 : pp. 335-354)

Lecture 11 Outline
LIDAR Principles
System Components
Types of returns
LIDAR accuracy
Digital Surface Model creation
Bare earth model creation
LIDAR Intensity Data
LIDAR applications

LIDAR (Light Detection and Ranging) is


an optical remote sensing technology that
measures properties of scattered light to
find the range and/or other information
about a distant target. The prevalent
method to determine distance to an object
or surface is to use laser pulses. The range
to an object is determined by measuring the
time delay between transmission of a pulse
and detection of the reflected signal.

LIDAR Principles (pp.336-337)


A GPS and an Inertial
Measurement Unit (IMU)
constantly record the
airplanes exact position and
its roll, pitch, and yaw.
As the airplane moves
forward, a rotating mirror
directs across-track returns
from transmitted pulses of
laser light to a receiver. More
than 100,000 pulses/sec.
(pulse repetition frequency)
are emitted.

LIDAR Principles (pp.336-337)


The Range (R) is the
distance between the
LIDAR sensor and the
ground. It is a function of the
laser pulse travel time t from
the transmitter to the
receiver and the speed of
light c:

R = (1/2) t c
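The range equation is a one-liner in code; a sketch (the 10 μs travel time is an illustrative value):

```python
C = 3.0e8   # speed of light, m/s

def lidar_range_m(travel_time_s):
    """R = (1/2) t c -- the factor of 1/2 accounts for the pulse
    traveling out to the target and back to the receiver."""
    return 0.5 * travel_time_s * C

# A return detected 10 microseconds after transmission implies
# a range of 1500 m
r = lidar_range_m(10e-6)
```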

LIDAR Principles (pp.336-337)


The instantaneous laser
footprint FP_inst is the
diameter of a laser beam
pulse at a given instant in
time. It is a function of the
flying height h, the scan
angle at the instant θ_inst,
and the laser beam
divergence γ in radians:

FP_inst = hγ / cos^2(θ_inst)

LIDAR Principles (pp.336-337)

FP_inst = hγ / cos^2(θ_inst)

= (1000 m × 0.001 rad) / cos^2(15°)
= 1 m / 0.933
= 1.07 m
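The worked footprint example can be reproduced in code; a sketch of the footprint formula (the function name and argument units are illustrative):

```python
import math

def laser_footprint_m(h_m, divergence_rad, scan_angle_deg):
    """Instantaneous laser footprint diameter:
    FP = h * gamma / cos^2(theta)."""
    return h_m * divergence_rad / math.cos(math.radians(scan_angle_deg)) ** 2

# Worked example from the slide: h = 1000 m, gamma = 0.001 rad,
# theta = 15 degrees  ->  about 1.07 m
fp = laser_footprint_m(1000, 0.001, 15)
```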

NASA Airborne LiDAR: Laser Vegetation


Imaging Sensor (LVIS)

LIDAR Principles (pp.337-338)


Returns from laser pulses are detected by the
receiver, and the (x,y,z) ground location of each return
(called a masspoint) is computed from the range,
scan angle, aircraft GPS position, and aircraft pitch,
roll, and yaw information from the IMU.

LIDAR Principles (pp.337-338)


There can be more than one return from each pulse,
and masspoints may be returns from the ground, or
from objects above the ground such as trees, poles,
or powerlines. The set of masspoints is called a data
cloud.

Multiple returns
from one pulse

Multiple LIDAR Returns


(pg.339)
Locations of 1st, 2nd, and last
returns are computed, labeled as
such, and stored in a data file for
each instantaneous laser footprint:
NAD 83 geodetic latitude,
longitude, ellipsoid elevation based
on WGS 84 ellipsoid
NAD 83 UTM or State Plane
easting, northing, ellipsoid
elevation

LIDAR Vertical Accuracy

(pp.349-351)

Geospatial Accuracy Standard

(pg.349)

Based on the Root Mean Squared Error (RMSE) measure:

RMSE = sqrt( (1/n) Σ_{i=1..n} Δ_i^2 )

Δ_i is the difference between the horizontal or
vertical masspoint values from the LIDAR
system and higher-accuracy surveyed
values for the same set of n masspoint
locations, often measured with survey-grade
GPS receivers since they measure
ellipsoidal elevations as does LIDAR

Geospatial Accuracy Standard

(pg.349)

Vertical Root Mean Squared Error (RMSE) measure for
masspoint locations:

RMSE = sqrt( (1/n) Σ_{i=1..n} (elev_GPS,i − elev_LIDAR,i)^2 )

Typical RMSE vertical error values


are between 2 and 10 cm, a very
small error for many applications
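The vertical RMSE computation can be sketched as follows; the check-point elevations below are hypothetical, illustrative values:

```python
import math

def vertical_rmse(gps_elev, lidar_elev):
    """RMSE of LIDAR masspoint elevations against surveyed GPS values."""
    n = len(gps_elev)
    return math.sqrt(sum((g - l) ** 2 for g, l in zip(gps_elev, lidar_elev)) / n)

# Hypothetical check-point elevations in meters (illustrative only)
gps   = [120.03, 118.51, 121.40, 119.87]
lidar = [120.08, 118.46, 121.47, 119.80]
err = vertical_rmse(gps, lidar)   # a few centimeters
```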

Digital Surface Model Creation from LIDAR


Returns (pg.341)
A Triangulated Irregular Network
(TIN) can be created from 1st,
intermediate, or last returns. The
set of 3D triangles is made as
compact overall as possible,
often using a method called
Delaunay Triangulation

y
x

Digital Surface Model Creation from LIDAR


Returns (pp. 340-342)
LIDAR
Masspoint
Elevation (zi)
Grid point to
masspoint
distance (di)

DSM Grid point


interpolation by
Inverse Distance
Weighting (IDW)
of neighboring
masspoints

interpolated grid-point elevation = 359

Digital Surface Model Creation from LIDAR


Returns (pp. 340-342)
The number of neighboring masspoints depends
on the search radius surrounding each grid point
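IDW interpolation of one DSM grid point from its neighboring masspoints can be sketched as follows. The masspoint coordinates are hypothetical, and inverse-distance-squared weighting is one common choice:

```python
def idw_elevation(grid_xy, masspoints, power=2):
    """Inverse-distance-weighted elevation at one DSM grid point.
    masspoints is a list of (x, y, z) LIDAR returns found inside
    the search radius around the grid point."""
    gx, gy = grid_xy
    num = den = 0.0
    for x, y, z in masspoints:
        d2 = (x - gx) ** 2 + (y - gy) ** 2
        if d2 == 0.0:
            return z                  # grid point coincides with a return
        w = 1.0 / d2 ** (power / 2)   # weight = 1 / d^power
        num += w * z
        den += w
    return num / den

# Hypothetical neighboring masspoints (x, y, z in meters)
pts = [(1.0, 0.0, 358.0), (0.0, 2.0, 360.0), (-1.5, -0.5, 359.0)]
z_hat = idw_elevation((0.0, 0.0), pts)   # lies between 358 and 360
```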

Digital Surface Model Creation from LIDAR


Returns (pp. 340-342)
The End Result
DSM grid points

LIDAR masspoints

The Digital Surface Model


is often displayed with relief
shading and layer tinting

Bare Earth Digital Terrain Models (DTMs)


from Digital Surface Models (pp. 341-343)
The majority of LIDAR
returns used to make a
DSM often are from
vegetation and cultural
features, not bare ground.

How do we eliminate these


natural and cultural
features from the DSM to
create a DTM of the bare
ground?

Bare Earth Digital Terrain Models (DTMs)


from Digital Surface Models (pp. 341-343)
The basic idea is to
study the 1st and last
returns and figure out
what parts of the DSM
are vegetation and
cultural features. The
DSM grid cells
associated with these
features are then
reassigned the
elevations of their bare
earth neighbors to
make a DTM.

Bare Earth Digital Terrain Models (DTMs)


from Digital Surface Models (pp. 341-343)
The first step is to use
an automated filter that
passes a moving window
over the 1st and last
return DSM grid cells
and identifies cells that
likely are returns from
vegetation and cultural
features.

Bare Earth Digital Terrain Models (DTMs)


from Digital Surface Models (pp. 343-344)
The second step is for a
human analyst to
visually examine the
masspoints marked as
vegetation or cultural
features and edit these
points, deleting some
and adding others.

Bare Earth Digital Terrain Models (DTMs)


from Digital Surface Models (pp. 341-343)
The result is a DTM
like this.

However, water flow and


similar LIDAR applications
usually require that buildings
be left in the DTM, as in this
example.

LIDAR Intensity Data (pp. 343-346)


LIDAR Intensity is the
maximum voltage from the
pulses received for each
instantaneous laser
footprint
Intensity depends on:
Pulse bandwidth
Range to feature
Scan angle
Atmospheric dispersion of
pulse
Automatic gain control
Reflectivity of the surface
at the wavelength of the
laser

LIDAR Intensity Data (pp. 343-346)


Appearance of ground features on
LIDAR Intensity images:
Trees appear dark (a) because of
scattering of light in tree canopies
Grassy and bare earth areas (b and c)
absorb and scatter light relatively little, so
they appear lighter
Many rooftop materials strongly
backscatter light, so they appear very light
on the image

LIDAR Applications (pp. 351-352)


Creating 3D maps of buildings in urban areas

LIDAR Applications (pp. 351-352)


OSU's Reser Stadium in 3D Perspective!

Mt. Rainier Debris Flow Project:


Airborne LiDAR, Fall 2007 & 2008
Landsat TM satellite imagery 1982-2009
NAIP aerial photo, 2005
Ground-based surveys 2008

Mt. Rainier and debris flow initiation sites

Intensity image of Inter


Glacier and gullies on
Mt. Rainier, Washington

Inter Glacier

LIDAR Applications (pp. 351-352)


Tree heights can be measured using the laser returns;
multiple laser returns penetrate the canopy,
painting the surfaces of vegetation very thoroughly.
Laser Returns:
Bin 075, 5.00 meter
cross section
Bin shown as 0.5 meter
first return grid

82 meters

All Material Protected by Watershed Sciences, Inc.

Vegetation Structure from Lidar Waveform

LIDAR Applications (pp. 351-352)


Volume Calculations:
Mount St. Helens Dome Growth over 20 Day Period - 2004
Mount St. Helens Dome Growth Over 20 Day Period - 2004

LIDAR Applications (pp. 351-352)


Power Transmission Line Surveys

September 24

September 30

October 14

Active Microwave Remote


Sensing

Outline for Lecture 12


How radar works
Radar nomenclature and parameters
Radar resolution
--ground and slant range
range resolution (cross-track)
azimuth resolution (along-track)

Image foreshortening, layover, and shadows

Passive and Active Systems (pg. 291)


Passive remote sensing systems record
electromagnetic energy that is reflected or emitted
from the Earth's surface and atmosphere
Active sensors create their own electromagnetic energy
that 1) is transmitted from the sensor toward the
terrain, 2) interacts with the terrain producing a
backscatter of energy, and 3) is recorded by the
remote sensors receiver.

Radar = Radio Detection and Ranging


(pp. 294-295)
Radar system components

Radar: How it Works (pp. 291-292)


A directed beam of
microwave pulses is
transmitted from an antenna
The energy interacts with the
terrain and is scattered
The backscattered
microwave energy is
measured by the antenna
Radar determines the
direction and distance of the
target from the instrument as
well as the backscattering
properties of the target

Commonly Used Radar Frequencies


(pg. 294)

Radar Parameters (pp. 296-299)


Azimuth Direction
direction of travel of aircraft or orbital track of satellite

Range direction
direction of radar illumination, usually perpendicular to
azimuth direction

Depression angle
angle between horizontal plane and microwave pulse (near
range depression angle > far range depression angle)

Incident angle
angle between microwave pulse and a line perpendicular to
the local surface slope

Polarization
linearly polarized microwave energy emitted/received by the
sensor (HH, VV, HV, VH)

Radar Nomenclature
Nadir
azimuth flight direction
look direction
range (near and far)
depression angle (γ)
incident angle (θ)
altitude above-ground-level, H
polarization

Pg. 297

Pg. 297

NEAR
RANGE

FAR
RANGE

RADAR logic
15m across-track
resolution

Ground

Separate returns from objects 1 and 2,
which are more than 15 m apart on the ground

Merged returns from objects 3 and 4,
which are less than 15 m apart on the ground

Pg. 302

Slant Range vs. Ground Range (pg. 301)

Slant Range Distortion

Slant Range Image

(pp. 300-301)
Flightline

Near
rangeB2

A4

Far B4
range

A2

B1

A3

B3

A1

Ground Range

Ground Range Image

Geometric distortion that occurs in the direction of the radar beam


The scale is shortened in the near range but not much in the far range.
This systematic geometric distortion can be corrected.

Radar Pulse Length (pg. 301)


Pulse Length (P.L.) = τ × c
c = speed of light (3 × 10⁸ m/sec)
τ = duration of transmission

If τ = 1 microsecond = 10⁻⁶ sec:
P.L. = 3 × 10⁸ m/sec × 10⁻⁶ sec
= 300 meters

If τ = 0.4 microsecond = 0.4 × 10⁻⁶ sec:
P.L. = 3 × 10⁸ m/sec × 0.4 × 10⁻⁶ sec
= 120 meters

Range (Cross-track) Resolution (pg. 301)


Range resolution depends on pulse length (duration of
transmission τ × speed of light c) and depression angle γ:

Rr = (τ × c) / (2 cos γ)

A short pulse length gives finer range resolution

The far range has better resolution than the near range
(smaller depression angle, larger cos γ)

But, when pulse length is shortened, so is the total


amount of energy illuminating the target

Range Resolution example (pp. 302-303)

Pulse length: τ × c = 0.1 × 10⁻⁶ sec × 3 × 10⁸ m/sec = 30 m

Rr = (τ × c) / (2 cos γ)
= 30 m / (2 cos 40°)
= 30 m / (2 × 0.766)
= 19.58 m

Range Resolution example (pp. 302-303)

Pulse length: τ × c = 0.1 × 10⁻⁶ sec × 3 × 10⁸ m/sec = 30 m

Rr = (τ × c) / (2 cos γ)
= 30 m / (2 cos 65°)
= 30 m / (2 × 0.423)
= 35.5 m
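The two worked examples above can be checked with a short helper (a sketch; the function and variable names are mine, not from the text):

```python
import math

def range_resolution(tau_sec, depression_deg, c=3e8):
    """Cross-track (range) resolution: Rr = (tau * c) / (2 * cos(gamma))."""
    return (tau_sec * c) / (2 * math.cos(math.radians(depression_deg)))

# A 0.1-microsecond pulse, so tau * c = 30 m
far = range_resolution(0.1e-6, 40)   # far range (gamma = 40 deg): ~19.6 m
near = range_resolution(0.1e-6, 65)  # near range (gamma = 65 deg): ~35.5 m
```

Note that the far range (smaller depression angle) comes out with the finer resolution, matching the bullet above.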

Azimuthal (Along-track) Resolution


(pg. 303)
For real aperture radar,
Ra is proportional to the wavelength and the beam width
But shorter wavelengths aren't necessarily useful for
environmental remote sensing (poor atmospheric and vegetation
penetration)

Beam width is inversely proportional to antenna length (L):

Ra = S × λ / L

S = slant range distance [m]
λ = wavelength [m]
L = antenna length [m]

Azimuthal Resolution (pp. 303-304)


(L = 500 cm)

X-band (λ = 3 cm)

Ra = S × λ / L

Two different slant-range distances:

the tanks can be resolved in the near range: Ra = (20 km × 3 cm) / 500 cm = 120 m
but not in the far range: Ra = (40 km × 3 cm) / 500 cm = 240 m
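The tank example above can be reproduced directly (a sketch; names are mine):

```python
def azimuth_resolution(slant_range_cm, wavelength_cm, antenna_cm):
    """Real-aperture along-track resolution: Ra = S * lambda / L."""
    return slant_range_cm * wavelength_cm / antenna_cm

# X-band (3 cm wavelength) with a 500 cm antenna, as in the example above
near_m = azimuth_resolution(20e5, 3, 500) / 100  # 20 km slant range: 120 m
far_m = azimuth_resolution(40e5, 3, 500) / 100   # 40 km slant range: 240 m
```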

Image Foreshortening (pg. 306)

Slopes that are facing the radar appear


compressed (and bright) in the resulting image

Image Foreshortening (pg. 307)


The higher the
incident angle,
the less the
foreshortening.

Foreshortening
of a conical
cinder cone, as
seen on an aerial
photograph.

Airborne Radar

Spaceborne Radar

Spaceborne radar can image over a wide swath


without having as wide a range of incident angles as
for airborne radar.

Image Layover (pg. 306)


Layover
occurs when
the incidence
angle (θ) is
smaller than
the foreslope
(α₊),
i.e., θ < α₊.
This distortion
cannot be
corrected!
See the Figure 9-13 caption for further
explanation of this diagram.

Image Layover (pg. 308)


Extensive layover
in this image of the
San Gabriel
Mountains just
north of Pasadena,
California.

Radar Shadowing (pp.306-307)


A Radar Shadow
occurs when the
backslope angle (α₋)
is steeper than the
depression angle (γ).
Radar Shadows
cannot be taken out
of the image!
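The layover and shadow conditions described above can be expressed as a small check (a sketch with hypothetical names; it assumes the flat-terrain relation γ = 90° − θ between depression and incidence angles):

```python
def radar_distortions(incidence_deg, foreslope_deg, backslope_deg):
    """Layover when the incidence angle (theta) is smaller than the
    foreslope; shadow when the backslope is steeper than the
    depression angle (gamma = 90 - theta over a flat datum)."""
    depression_deg = 90.0 - incidence_deg
    effects = []
    if incidence_deg < foreslope_deg:
        effects.append("layover")
    if backslope_deg > depression_deg:
        effects.append("shadow")
    return effects or ["none"]
```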

Radar Shadowing can be Useful for


Interpreting Geomorphological Features

Radar (cont.) and InSAR

Outline
Radar backscatter

surface roughness
Polarization
dielectric constant
vegetation

Radar wavelengths and cloud penetration


Interferometric Synthetic Aperture Radar
(InSAR)

Radar Backscatter (pg. 312)


Power received =
Power per unit area at target
× Effective scattering area of the target
× Spreading loss of reradiated signal
× Effective receiving area of antenna

Radar Backscatter Coefficient (pg. 313)


The efficiency of the terrain in reflecting the radar pulse is termed the
radar cross-section, σ
The radar cross-section per unit area (A) is called the radar
backscatter coefficient (σ°) and is computed as:

σ° = σ / A

The radar backscatter coefficient determines the percentage of
electromagnetic energy reflected back to the radar from within
a radar pixel
This is similar to the reflectance in optical remote sensing

Radar Backscattering
Depends on the properties of the target:
roughness
dielectric constant

Depends on characteristics of the radar:


depression angle
frequency/wavelength
polarization

Basic Types of Radar Backscattering


(pg. 314)

Smooth-surface criterion: h < λ / (8 sin γ)
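The h < λ / (8 sin γ) threshold above can be evaluated numerically (a sketch; a surface whose height variation falls below this value acts as a smooth, specular reflector at that wavelength and depression angle):

```python
import math

def smooth_height_limit_cm(wavelength_cm, depression_deg):
    """Height variation below lambda / (8 sin gamma) -> radar-smooth surface."""
    return wavelength_cm / (8 * math.sin(math.radians(depression_deg)))

limit = smooth_height_limit_cm(3.0, 45)  # X-band, 45 deg depression: ~0.53 cm
```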

Surface
Roughness in
RADAR Imagery
(pg. 314)

EMR Polarization (pg. 298)


Vertical Polarization (V)

Horizontal Polarization (H)


Direction of each field is perpendicular to the direction
of wave propagation.

Polarization (pg. 298)


Polarized light can be either
vertically polarized (E is perpendicular to the plane of
incidence)
horizontally polarized (E is parallel to the plane of incidence)

Solar radiation is unpolarized (random) but can


become polarized by reflection, scattering, etc.
Lasers and radars produce polarized radiation

VV and HH
Radar
Polarization
(pg. 299)

Radar
Polarization (pg.
300)

a.
Ka - band, HH polarization

look direction

b.
Ka-band, HV polarization

Cinder cone and basalt lava


flow in north-central Arizona.
Strong return in the HH
polarized image and weak
HV polarization indicates
that the lava is not
depolarizing the radar pulse
(it is composed of large
blocks with smooth faces)

Radar and the Dielectric Constant


(pg. 316)
Dielectric constant is the ratio of the capacitance of a
material (its ability to store electrical energy) to
that of a vacuum. Also known as the relative
permittivity
Dielectric constant depends on the type of material
as well as its moisture state
it is analogous to the refractive index of the material
it is primarily a function of moisture content
also depends on chemical properties such as salinity

Dielectric Constant (pg. 316)

Dielectric Constant of liquid water is 80; dry soil is 2-4.

Response of A Pine Forest Stand to X-, C- and L-band


Microwave Energy (pg. 320)
Surface
Scattering

Surface
Scattering

Volume
Scattering
Volume
Scattering

Surface
Scattering

Cloud Penetration (pp. 320-321)


C-, L-, and P-band radars are considered all-weather
X-band radar does not penetrate heavy
precipitation
Only Ka band (0.8 - 1.1 cm) has some cloud
mapping capability

Heavy Downpour

SIR-C/X-SAR
Images of a
Portion of
Rondonia, Brazil,
Obtained on
April 10, 1994
(pg. 321)

Interferometric Synthetic Aperture Radar: InSAR


Radar Interferometry
is based on recording
both the amplitude
and the phase of the
backscattered
microwave pulse
when it interacted
with the terrain

Phase Shift between two Microwaves

constructive interference

destructive interference

Waves of
microwave energy
can have
constructive
interference or
destructive
interference
depending on how
they match up

Pg. 328

With SAR interferometry, radar images of the


same area are acquired at two different times
(multiple pass) or from two side-by-side
synthetic aperture radars (single pass)
The interference pattern between the two phase
images gives information on height
It's analogous to stereo-photogrammetry except
that it uses the phase information

Shuttle Radar Topography Mission (SRTM) example


Single Pass
Radar
Interferometry

Shuttle Radar Topography Mission


(SRTM) (pp. 328-329)
Used C- and X-band interferometry to produce the first near-global
topographic map of the Earth. February 2000 (10-day mission)
Space Shuttle Endeavour: 2 radar antennas, 60 m apart
Covered 80% of Earth's surface (60° N to 54° S)

Pp. 328-329
Baseline

Height from Shuttle


to Ground Feature

Angle from
Horizontal to
Baseline

Geometric Relationship
Between Two SAR
Systems Used for
Interferometry to Extract
Topographic Information,
if the precise height above
mean sea level can be
found for one of the radars

h / r1 = cos(θ)

h = r1 cos(θ)

Increasing the Precision


Range differences produce meter-scale precision but phase
differences (phase shifts) produce millimeter-scale precision
The phase shift between the two images corresponds to
relative difference in range between the two radars

~10 cm of uplift produces ~3 fringes of deformation

Here, the red rings


show where the two
waves reinforce each
other and blue rings
show where they
cancel out (note: color
assignments are
subjective)
Concentric color bands
are called fringes

Each fringe represents a distance

change (displacement) of λ/2.
This is in the line-of-sight
direction of the radar.

To quantify the total


movement or height
change, just count the
fringes

Interferometric SAR (InSAR)


Phase shift is a measure of the radar line-of-sight
component of displacement of the ground surface
point in relation to the spacecraft from the time of one
image to the other
Half a wavelength of line-of-sight displacement (2.8
cm for ERS-1/2) produces a phase shift of 360°
Thus, the fringe pattern is a contour map of the line-of-sight displacements with a contour interval of 2.8 cm
Line-of-sight displacement resolution is 1.3 mm
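Counting fringes converts directly to line-of-sight displacement, since each fringe is λ/2 of range change (a sketch using the ERS C-band wavelength of 5.66 cm implied by the 2.8 cm figure above):

```python
def los_displacement_cm(n_fringes, wavelength_cm=5.66):
    """Each interferogram fringe represents lambda / 2 of line-of-sight
    range change (2.83 cm for the ERS-1/2 C-band wavelength)."""
    return n_fringes * wavelength_cm / 2.0

d = los_displacement_cm(3)  # ~3 fringes -> ~8.5 cm along the line of sight
```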

Digital Elevation Models can be created from Fringe Data

Counting the Fringes gives the


Displacement

Interferogram of Santa
Clara Valley shows
patterns of subsidence and
some uplift occurring over
7 months in 1997, from
January 4 to August 2.
A central subsidence zone,
(red) the result of seasonal
ground-water pumping, is
fully recoverable.

Remote Sensing of Vegetation

Outline
Physical basis for remote sensing of
vegetation
Visible and Near-IR vegetation reflectance
Vegetation indices
Mid-IR reflectance and thermal emittance as
indicators of water stress
Multiangular sensing of vegetation
Phenological cycles

Vegetation and Photosynthesis


About 70% of the Earth's land surface is covered by
vegetation with perennial or seasonal photosynthetic
activity

Physical Basis for Remote


Sensing of Vegetation

Photosynthesis
Pigmentation
Leaf structure
Plant water content
Canopy structure
Phenological cycles

Photosynthesis and Spectral


Characteristics
(pp. 355-361: read for basic ideas, not details)
Energy-storage in plants, powered by light
absorption by leaves
Leaf structures have adapted to perform
photosynthesis, hence their interaction with
electromagnetic energy has a direct impact
on their spectral characteristics

Visible, Near IR and Middle IR Interactions


(pp. 357-361)

Pg. 360
lack of
absorption

Absorption Spectra of
Chlorophyll a and b,
β-carotene, Phycoerythrin,
and Phycocyanin Pigments
Chlorophyll a peak absorption is at
0.43 and 0.66 µm.
Chlorophyll b peak absorption is at
0.45 and 0.65 µm.
Optimum chlorophyll absorption
windows are:
0.45-0.52 µm and 0.63-0.69 µm

Pg. 358
Cross-section through a
hypothetical and real leaf
revealing the major
structural components
that determine the
spectral reflectance
of vegetation

Near IR Interactions within the Spongy


Mesophyll (pp. 361-362)
High leaf reflectance in the NIR results from
scattering/reflectance from the spongy
mesophyll
This layer is composed of cells and air
spaces (lots of scattering interfaces)

Reflectance, Transmittance, and Absorption Characteristics of Big Bluestem Grass

Pg. 362

Multiple Scattering in the Plant Canopy


(pp. 362-363)

Vegetation Indices (pg. 382)


A vegetation index is a simple mathematical
formula: a ratio
Used to estimate the likelihood that
vegetation was actively growing at the time of
data acquisition
Widely used over several decades
New, more sensitive vegetation indices have
been developed

Vegetation Indices (pg. 382)



Make use of the red vs. NIR


reflectance differences for
green vegetation
Veg. indices are associated
with canopy characteristics
such as biomass, leaf area
index and percentage of
vegetation cover

Normalized Difference Vegetation Index (NDVI)


(pp. 385-386)

NDVI = (ρNIR - ρred) / (ρNIR + ρred)

ρred = Reflectance in red channel
ρNIR = Reflectance in NIR channel
Healthy, dense vegetation has high NDVI
Stressed or sparse vegetation produces lower
NDVI
Bare rock and soil have NDVI near zero
Snow produces negative values of NDVI
Clouds produce low to negative values of NDVI
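A minimal per-pixel NDVI computation from red and NIR reflectance (a sketch; the sample reflectance values are illustrative, not from the text):

```python
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - red) / (NIR + red), element-wise."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red)

# dense vegetation (low red, high NIR) vs. bare soil (red close to NIR)
vals = ndvi([0.05, 0.20], [0.50, 0.22])
```

The first pixel comes out near 0.8 (healthy vegetation), the second near zero (bare ground), matching the interpretation rules above.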

NDVI as an indicator of drought:

Cautions about NDVI (pg. 386)


Saturates over dense vegetation
Less information than original data
Any factor that unevenly influences the red and NIR
reflectance will influence the NDVI
such as atmospheric path radiance, soil wetness

Pixel-scale values may not represent plant-scale


processes
Derivatives of NDVI (FAPAR, LAI) are not physical
quantities and should be used with caution

Other Vegetation Indices (pp. 387-391)


Soil-adjusted Vegetation Index (SAVI)
Soil and Atmospherically-Resistant
Vegetation Index (SARVI)
Moisture Stress Index (MSI)
Global Monitoring Environmental Index
(GEMI)
Enhanced Vegetation Index (EVI)

Enhanced Vegetation Index (EVI)


(pg. 391)
Compensates for atmospheric and soil effects

EVI = G (ρNIR - ρred) / (ρNIR + C1 ρred - C2 ρblue + L)

ρred = Reflectance in red channel
ρNIR = Reflectance in NIR channel
ρblue = Reflectance in blue channel
C1 = Atmospheric resistance red correction coefficient (C1 = 6)
C2 = Atmospheric resistance blue correction coefficient (C2 = 7.5)
L = Canopy background brightness correction factor (L = 1)
G = Gain factor (G = 2.5)

EVI vs NDVI (pg. 391)


The EVI is a modified NDVI with a soil adjustment factor, L, and
two coefficients, C1 and C2 which are used to correct for
atmospheric scattering
The coefficients, C1 , C2 , and L, are empirically determined (from
observations using MODIS data)
The EVI has improved sensitivity to high biomass regions and
improved vegetation monitoring through a de-coupling of the
canopy background signal and a reduction in atmospheric
influences (Huete and Justice, 1999).
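The EVI calculation with the MODIS coefficients listed above (C1 = 6, C2 = 7.5, L = 1, G = 2.5) can be sketched as:

```python
import numpy as np

def evi(blue, red, nir, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """EVI = G * (NIR - red) / (NIR + C1*red - C2*blue + L)."""
    blue, red, nir = (np.asarray(b, dtype=float) for b in (blue, red, nir))
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

v = evi(0.04, 0.05, 0.50)  # illustrative reflectances, not from the text
```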

Middle IR Interactions with Water in


the Spongy Mesophyll (pp. 364-365)
Plant water content absorbs middle IR
radiation
Middle IR plant reflectance increases as leaf
moisture content decreases
Middle IR reflectance can be used to monitor
plant water stress

Reflectance response of a single Magnolia leaf


(Magnolia grandiflora) to decreased relative water content
(pg. 366)

Thermal Emission and Plant Water Stress


Measures of thermal emission can be used to derive surface
temperature for a crop
As water transpires from a plant, its leaves are cooled
If a plant is stressed, transpiration is reduced and leaf
temperature increases

red=warmer
blue=cooler

Thermal IR image showing plots of irrigated cotton

Aquatic Plants
Immersed aquatic plants absorb solar energy
and emit thermal radiation (warmer than
surrounding water)
This can be detected in thermal imagery

water hyacinth plumes in


Lake Victoria

Angular Reflectance Properties of Vegetation


(pp. 367-368)

Vegetation reflects light unevenly, in different


directions (anisotropic reflectance)
Depends on:
leaf shape
canopy height
vegetation density

Described by Bidirectional Reflectance


Distribution Function (BRDF)

Surface heterogeneity effects on bidirectional reflectance

Manitoba and Saskatchewan, 17 April 2001

Nadir false-color composite: RGB = near-IR, red, green

Surface heterogeneity effects on bidirectional reflectance

farmland
forest

Manitoba and Saskatchewan, 17 April 2001

Multi-angle red band composite: RGB = backscatter nadir, fwd scatter

Retrieved surface BRF, red band

Bidirectional reflectances of surface vegetation

back

fore

Farmland =
homogeneous land
surface cover
Limb-brightening

Farmland with light snow


Albedo = 0.18, NDVI = 0.13

Snowy forest
Albedo = 0.18, NDVI = 0.24

MISR View angle, deg


Manitoba and Saskatchewan, 17 April 2001

Snowy forest =
heterogeneous,
sparse coverage
Limb-darkening

Phenological Cycles (pp. 373-375)


Temporal characteristics of vegetation growth
Depends on:
plant available water: rainfall/irrigation
land surface temperature
vegetation type (evergreen vs. deciduous)

Crop cycles (depends on planting/harvesting cycle)


Deciduous cycles (depends on seasonality of rainfall
and temperature)

pp. 378-382
Phenological cycles
of San Joaquin and
Imperial Valley,
California crops and
Landsat Multispectral
Scanner images of
one field during a
growing season

Color plate 11-4

Remote Sensing of Soils


(part of Chapter 14)

Outline
Soils: Properties that are detectable from remote
sensing using optical and radar sensors

mineral composition
soil moisture
organic matter content
texture and roughness
salinity

Soil Definition (pp. 508-509)


Soil is a mixture of
inorganic mineral
particles and
organic matter of
varying size and
composition.
~50% particles, ~50%
air + water
Soils are divided
vertically into
Horizons

Factors Influencing Soil Reflectance (pp. 512-517)

mineral composition
soil moisture
organic matter content
texture and roughness
salinity

Soils: Mineral Composition (pg. 512)


Affects the visible, NIR, and thermal portions
of the reflectance spectrum
Increasing reflectance from visible to NIR
Iron and clays are detectable

Visible to Near IR Reflectance from


Loam Soil with and without Iron Oxide
(pg. 516)

Soil Iron Oxide Content Map made


from AVIRIS Imagery

Iron content in the Santa Monica mountains mapped using AVIRIS

Soils: Organic Matter Content (pg. 515)


Organic Matter affects soil color, heat
capacity, water holding capacity, nutrient
exchange, structure, and erodability
Dark color generally associated with high
Organic Matter
Landsat TM band 5 and 7 (mid-Infrared)
reflectance has a negative correlation with
Organic Matter

Visible to Near IR Reflectance from Soils


with 0% to 100% Organic Matter (pg. 515)

Soil Organic Matter Content Map


made from AVIRIS Imagery

Organic matter content in the Santa Monica mountains mapped


using AVIRIS

Soil Texture (pp. 509-510)


Texture is defined by %sand, %silt, %clay
a. Soil Science Society of America and U.S. Department of Agriculture
soil particle size scale:
clay < 0.002 mm; silt 0.002-0.05 mm; sand 0.05-2 mm (v. fine, fine,
medium, coarse, v. coarse, with breaks at 0.05, 0.1, 0.25, 0.5, 1,
and 2 mm); gravel 2-76.2 mm
(For scale, a typical grain of sand is 0.15 mm in diameter.)

b. MIT and British Standards Institute:
clay < 0.002 mm; silt 0.002-0.06 mm (fine, medium, coarse, with breaks
at 0.006 and 0.02 mm); sand 0.06-2 mm (fine, medium, coarse, with
breaks at 0.2 and 0.6 mm); gravel and stones > 2 mm

c. International Society of Soil Science:
clay < 0.002 mm; silt 0.002-0.02 mm; sand 0.02-2 mm (fine < 0.2 mm,
coarse 0.2-2 mm); gravel > 2 mm

Soil Texture Classes (pp. 510-511)


Proportion of sand, silt, and clay
in a soil (or soil horizon) usually
is calculated as % weight for
each type of particle
These percentages are divided
into different soil texture classes
on the texture triangle (axes: % sand, % silt, % clay),
whose classes include sand, loamy sand, sandy loam,
loam, silt loam, silt, sandy clay loam, clay loam,
silty clay loam, sandy clay, silty clay, and clay
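Assigning a texture class from sand/silt/clay percentages is a lookup on the triangle. The sketch below covers only three unambiguous corner classes; the full USDA triangle has 12 classes, and the thresholds here are deliberately simplified:

```python
def texture_class(sand, silt, clay):
    """Simplified corner-case lookup; percentages must sum to 100."""
    assert abs(sand + silt + clay - 100) < 1e-6
    if clay >= 60:
        return "clay"
    if silt >= 80 and clay < 12:
        return "silt"
    if sand >= 90:
        return "sand"
    return "intermediate (consult the full texture triangle)"
```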

Soil Texture and Moisture (pg. 513)


Basic dry soil spectrum: reflectance increases with increasing
wavelength (0.5-2.5 µm) for both fine-texture (silt) and
coarse-texture (sand) dry soils

Soil Moisture (pg. 513)


The finer the soil texture,
the greater the soil's
ability to maintain a high
moisture content
The greater the soil
moisture content, the
greater the sunlight
absorbed by liquid water
and the less the energy
reflected to the sensor

(Diagram: a. dry soil — incident energy, specular reflectance,
interstitial air space; b. wet soil — incident energy, specular and
volume reflectance from soil water)

Soil Moisture and Texture (pg. 514)


a. Sand spectra (0.5-2.5 µm): reflectance decreases as moisture
content increases from 0-4% to 5-12% to 22-32%

b. Clay spectra show the same trend between 2-6% and 35-40% moisture,
with deeper water absorption bands

Since clayey soil holds


water more tightly than
sandy soil, reflectance
decreases in water
absorption bands will be
more prominent in clay
soils than sand soils
given the same amount
of time since the last
precipitation or watering

Soil Moisture Difference Detection from


RADAR (pg. 322 and pp. 514-515)
Higher dielectric constants
(more moisture) yield higher
RADAR backscatter.
SAR estimates of soil
moisture are also influenced
by:

Surface roughness
Vegetation
Salinity
Mixed pixels

Melfort, Saskatchewan, Canada,


ERS-1: Rainfall fell on the area in
the lower half of the image but not
on the upper half.

Soil Moisture from Thermal Sensors


(pp. 274-275)
Diurnal Temperature Cycle
Water has a higher
thermal capacity than
soil and rock.
Moist soils will change
in temperature more
slowly than dry soils
due to evaporation and
thermal inertia

Soil Moisture from Thermal Sensors


Daedalus thermal
image (night time). If
we had a daytime
image to compare it to,
we could see the
amount of change in
temperature and make
inferences on the soil
moisture content (less
change = more
moisture).

Saline Soils (pg. 515)


Occur through saltwater intrusion into
groundwater, release of salt-bearing minerals via
rising water tables, and evaporation of salt-bearing
irrigation water.
Responsible for severe impacts on vegetation

Soil Salinity Mapping from Landsat Imagery


Digital Elevation Model
Yellow = high
Blue + Pink = low
Green = intermediate

Landsat TM image with


salinity changes mapped
Green = saline in 1977
Light blue = saline in 1988
Dark blue = saline in 1997
Pink = predicted for next
decade

Swelling Clay Soils

Smectite clays (e.g. bentonite) can absorb more than their weight in
water, have >100% increases in volume
Responsible for massive damage to structures throughout U.S.

increasing
swelling
potential

Bentonite clay
absorption bands
for detecting
increasing soil
swelling potential

Bentonite clay exposure


near Denver, CO

Colorado Front Range


Geologic Map
Pierre shale contains
high concentrations of
bentonite clays
AVIRIS data used to
map bentonite
exposures
(Chabrillat and Goetz,
1999)

Geologic Remote Sensing

Outline
InSAR Applications
Geomorphology
Patterns as an indication of processes

Geologic Remote Sensing


Surface mineralogy
Geologic structure
Tectonic landforms

Intermap X-band Star 3i Orthorectified Image of


Bachelor Mountain, CA and Derived Digital Elevation Model (pg. 331)

Intermap X-band Star 3i Orthorectified


Image of Bachelor Mountain, CA and
Derived Digital Elevation Model

You need to have a geodetic baseline to assign elevations to a surface

Shuttle Imaging Radar-C Mission, Long Valley, California


L-band
HH

topographic
map

multi-pass
interferogra
m

C-band
image
draped
over topo
map

InSAR and Ground Movement


(pp. 329-330)

If you have exact repeat overpass then you


can use phase shifts from one time to another
to indicate ground movement
This has been used to map glacier velocities,
movement along faults, volcanic uplift

Three Sisters
Volcanic Uplift

Three Sisters Volcanic Uplift

Uplift

Lambert Glacier, Antarctica


Interferometric
Synthetic
Aperture
Radar (InSAR)
uses phase
difference between
two images to
determine glacier
speed and direction

Image courtesy CSA, NASA, Ohio State Univ., JPL, ASF

Geomorphology from Remote Sensing


(pp. 529-530)

Landforms: erosion and deposition by


various processes
Eolian (wind)
Igneous (volcanic)
Karstic (limestone erosion)
Fluvial (rivers)
Shoreline (ocean and rivers)
Glacial
Extraterrestrial (meteor impacts)

Landforms: Patterns and Processes


(pp. 529-530)

Landforms are typically identified by shape,


not spectral properties
Shadows and height differences help to
identify features

Igneous Landforms (pp. 530-534)


Stratovolcano

ASTER image of
Mt. Taranaki,
New Zealand

Igneous Landforms (pp. 530-534)


Shield Volcanos

Landsat image mosaic


of big island of Hawaii
Kilauea Caldera

Basalt Lava
Flows

Karst Landforms (pg. 549)


Karst topography develops by chemical dissolution of limestone rocks
Creates caves, sinkholes, solution valleys, towers, pitted landscapes

photo of karstic towers in Guangxi,


province China

Landsat TM image of Guangxi


province, China (karst areas are dark)

Fluvial Landforms (pp. 540-549)

Radar image of dendritic river network in east-central Colombia

Glacial Landforms: Alpine Glaciation


(pp. 556-561)

ASTER image of receding Alpine glaciers in Bhutan


(note lakes forming at glacier termini)

Remote Sensing of Surface Mineralogy


(pg. 518)

Rocks are assemblages of minerals that have


interlocking grains or are bound together by
various types of cement (usually silica or calcium
carbonate)
Most rock surfaces consist of several types of
minerals
Depending on vegetation and soil cover, it may be
possible to differentiate between rock types using
remote sensing techniques

Surface Mineralogy (pp. 518-520)


Discrimination of minerals based on:
visible, near-infrared reflectance
thermal emittance

Remember Kirchhoff's Law:

Emittance = 1 - Reflectance

Pg. 520
Spectra of three minerals derived from NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

Absorption features useful for


identifying kaolinite

Spectra of three minerals

derived from NASA's
Airborne Visible/Infrared
Imaging Spectrometer
(AVIRIS) and as measured
using a laboratory
spectroradiometer.
Spectra are vertically
offset for clarity.

Color Plate 14-1


Mineral maps of Cuprite, NV

Mineral maps of
Cuprite, NV, derived
from low altitude (3.9
km AGL) and high
altitude (20 km AGL)
AVIRIS data obtained
on October 11 and
June 18, 1998

Thermal IR and longer wavelength emittance spectra for various minerals

Thermal IR and longer


wavelength emittance
spectra for various
minerals
Spectral Emittance
Signatures
Similar to spectral reflectance
signatures, spectral emittance
signatures can be exploited to
map surface materials

ASTER Image of Saline Valley, CA

ASTER Image of Saline


Valley, CA
RGB composite image
using TIR bands (13, 12,
10)
(quartz is red,
carbonates are green,
mafic volcanic rocks are
purple)

Stratigraphy
Appalachian Mountains

Differences in
reflectance can be
used to discriminate
between rock units

Landsat MSS image

Anti-Atlas Mountains, Northern Africa

Yellow: limestone
Orange: sandstone
Green: gypsum
Dark blue: granite

ASTER SWIR color composite

Geologic Structure (pp. 522-524)


Landsat TM
band 2 image of
fractures in the
bedrock of the
Canadian shield

Landsat TM color composite of


Zagros Mtns., Iran

Landsat TM color composite of Zagros Mtns., Iran

ASTER 3-2-1 RGB composite


draped over ASTER-derived
DEM (2x vertical exaggeration)

Anticlines and Synclines (pg. 522)

Tectonic Landforms (pp. 522-524,539-540)


Landsat TM image
of Nevada basin and
range region

Hector Mine Earthquake

Hector Mine Earthquake


Oct. 16, 1999
Southern California
M 7.1
Horizontal and vertical
displacement
Right side subsided about
50-60 cm
Left side uplifted about 60-70 cm

Tectonic Landforms (pp. 522-524,539-540)


Landsat MSS Image
of Los Angeles and
vicinity

San Andreas
strike-slip fault

Using Lidar Images to Map Faults


In some
places, it's
easy to see
where the
active faults
are on Lidar
images.
30 km

Using Lidar Images to Map Faults


Bainbridge
Island

Seattle

Tacoma
30 km

In other
places, it
is not.

What are the Relevant Differences?

                      SF Bay area    Puget Lowland
slip rate             3 cm/yr        4 mm/yr
average tree height   10 ft          100 ft
age of landscape      ~10⁶ years     18,000 years

age × slip rate = feature size

18,000 yr × 1 mm/yr = 18 m
10⁶ yr × 1 mm/yr = 1 km

In the Puget Lowland, to see a fault with the same slip
rate as in the SF Bay area, we have to look more
closely.
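The feature-size arithmetic above is simple enough to sketch:

```python
def scarp_size_m(age_yr, slip_rate_mm_per_yr):
    """Feature size = landscape age x slip rate (mm converted to m)."""
    return age_yr * slip_rate_mm_per_yr / 1000.0

puget = scarp_size_m(18_000, 1)        # 18 m scarps in the Puget Lowland
bay_area = scarp_size_m(1_000_000, 1)  # 1 km of relief in the SF Bay area
```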

Lidar Remote Sensing of Faults in the Puget


Sound Area
Scanning Lidar flown over Seattle-Tacoma area
Lidar images processed to remove vegetation
returns
Digital Elevation Model produced
Surface topography reveals geologic structure

12-ft DEM from LIDAR

High-resolution LIDAR
Topography

15 km West of Seattle
Toe Jam Hill
fault scarp

Waterman Point
scarp
beach uplifted
during 900 AD
earthquake

Landslides

Southern Bainbridge
Island

Remote Sensing of the


Hydrosphere
(Chapter 12)

Outline for Lecture 18


Oceans

Sea surface temperature


Sea surface height
Ocean color
Winds

Surface freshwater (lakes, rivers)


Changing water levels
Flood plain mapping

Snow and Ice


Snow covered area, snow water equivalent
Glaciers

The Hydrologic Cycle (pg. 409)

Icecaps and Glaciers: 2.2%
Rivers and Lakes: 0.2%
Groundwater: 0.6%

74% of the Earth
is covered by
oceans and surface
freshwater, 97%
of that water being
in the oceans and the
remaining 3% in
freshwater, as
shown on the
diagram to the left.

Ocean Remote Sensing

Sea surface temperature


Sea surface height
Ocean color
Wind and waves

Sea Surface Temperature (SST)


(pp. 426-427)
Thermal sensors (e.g. AVHRR, MODIS):
good spatial resolution and accuracy
long heritage (>20 years)
obscured by clouds, so incomplete spatial coverage
atmospheric corrections are required
measure SST to a water depth of ~10 µm

Passive microwave sensors (e.g. TRMM, SSM/I):
lower spatial resolution and accuracy than thermal
good spatial coverage
long heritage (>20 years)
clouds are transparent
relatively insensitive to atmospheric effects
measure SST to a water depth of ~1 mm
sensitive to surface roughness (waves)
sensitive to heavy precipitation

Differences in Spatial Coverage


(similar to Color Plates 12-4 and 12-5)
AVHRR

TRMM
Microwave
Imager

(white pixels are areas of no data)

Higher Resolution SST Maps made from AVHRR and MODIS data
(pg. 427 and Color Plate 12-4)

AVHRR image of Gulf Stream


(5 km gridding)

MODIS image of
Gulf Stream
(1 km gridding)

Sea Surface Height


Sea surface height changes with wind
patterns, ocean currents, ocean temperature
(thermal expansion with warming)
Radar altimeter used to measure changes in
sea surface height (timing of radar pulse
gives distance between sensor and surface)

Sea Surface Height from Radar


Altimeters
TOPEX/POSEIDON

launched in 1992 and still operating


measures SSH between 66° N and 66° S latitude
C and Ku band altimeters
Sea surface height accurate to 4.2 cm
data available for free from http://podaac.jpl.nasa.gov

Jason-1 (similar to TOPEX/POSEIDON)


launched in December 2001
some data are now available for free from
http://podaac.jpl.nasa.gov

Comparison of Sea Level and SST Anomalies from TOPEX/Poseidon


and AVHRR Pathfinder data averaged over a week

Proposed New Mission: Surface Water Ocean Topography (SWOT)


Wide swath altimeter:
Ka-band InSAR with 2
swaths, 60 km each
Similar to SRTM
Also, a radar altimeter
(Jason class)
50-m spatial resolution
for rivers/lakes, land
1-km spatial resolution
for ocean

Ocean Color (pp. 418-420)


Ocean color is a measure of biological activity
phytoplankton (produce chlorophyll)
dissolved and particulate matter

Used to measure:

biological productivity
marine optical properties
interaction of winds and currents with ocean biology
human influences on the marine environment

Remote sensing of ocean color uses multispectral


data

Ocean Color Sensors (pg. 421)


Coastal Zone Color Scanner (CZCS)
October 1978 - June 1986
1 km and 4 km gridded data
data products available for free from

http://oceancolor.gsfc.nasa.gov/CZCS/
Sea-viewing Wide Field-of-view Sensor (SeaWiFS)
since September 1997
1 km and 4 km gridded data
data products available for free for scientific use from
http://oceancolor.gsfc.nasa.gov/SeaWiFS/ANNOUNCEMENTS/getting_data.html

Moderate Resolution Imaging Spectroradiometer


(MODIS)
since February 2000
1 km gridded data
data products available for free from
http://mirador.gsfc.nasa.gov/cgibin/mirador/presentNavigation.pl?tree=project

MODIS
chlorophyll
data

lt. blue=good data


dk.blue=questionable
purple=cloud
red=bad data

Revised MODIS
chlorophyll data

(pg. 421)

MODIS
Quality
Assurance
data

Ocean Winds
Ocean wind measurements are needed for
understanding and predicting severe weather patterns
Winds modulate energy transfer between the
atmosphere and ocean
Waves are the expression of momentum transfer from
atmosphere to ocean
Surface roughness can be monitored using Synthetic
Aperture Radar (SAR)
Scatterometry is a new method of measuring
backscattered return from a SAR

How Scatterometry Works


Microwave radar signal is transmitted to
ocean surface and the return signal is
measured at different azimuth angles
Radar pulses are scattered by waves
Rougher surface = more scattering

The backscattered power is a function of roughness
and orientation
Near-surface wind speed is computed from
radar backscatter
Orientation of wind ripples indicate wind
direction
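The retrieval chain above (transmit, measure backscatter at several azimuths, invert for wind) can be sketched numerically. The geophysical model function below is a toy, not a real one such as NSCAT-2; the coefficients a, gamma, and b are invented for illustration. It keeps the two real ingredients: backscatter grows with wind speed, and is modulated by the angle between the wind and the radar look direction.

```python
import math

# Toy geophysical model function (GMF). Coefficients are invented;
# real retrievals use empirically tuned GMF tables.
def sigma0(wind_speed, azimuth_deg, a=0.01, gamma=1.6, b=0.3):
    """Backscatter (linear units) vs. wind speed (m/s) and the angle
    between wind direction and radar look (deg). The cos(2*phi) term
    models the observed upwind/crosswind modulation."""
    phi = math.radians(azimuth_deg)
    return a * wind_speed**gamma * (1.0 + b * math.cos(2.0 * phi))

def retrieve_speed(measured_sigma0, azimuth_deg, lo=0.1, hi=50.0, tol=1e-6):
    """Invert the toy GMF for wind speed by bisection; sigma0 is
    monotonically increasing in speed at fixed azimuth."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sigma0(mid, azimuth_deg) < measured_sigma0:
            lo = mid   # modeled return too weak: wind guess too low
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A 10 m/s wind viewed 30 degrees off the wind direction:
s0 = sigma0(10.0, 30.0)
print(round(retrieve_speed(s0, 30.0), 3))  # recovers ~10.0
```

In practice direction is resolved by fitting the azimuth modulation across the multiple look angles, which is why scatterometers view each cell from several azimuths.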

Ocean Wind Remote Sensors


NASA Scatterometer (NSCAT)
Sept 1996 - June 1997
Ku-band
Gridded to 50-km resolution

QuikSCAT SeaWinds scatterometer
June 1999 - Nov 2009
Ku-band
Gridded to 12.5 and 25 km for wind vectors
measures winds of 3-20 m/s, with 2 m/s accuracy

Data from NSCAT and QuikSCAT are available for
free from http://podaac.jpl.nasa.gov

Hurricane
Katrina from
QuikScat

Ocean Wave Height Remote Sensors


Radar Altimeters measure sea surface height
Examples of satellite radar altimeters:
TOPEX/Poseidon
ERS-1 and ERS-2
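The altimeter principle is simple enough to sketch: range to the sea surface comes from the two-way travel time of the radar pulse, and subtracting range from the precisely known orbit altitude gives sea surface height. The numbers below are illustrative (a TOPEX/Poseidon-like ~1336 km orbit); real missions also apply atmospheric, tidal, and sea-state corrections, omitted here.

```python
C = 299_792_458.0  # speed of light, m/s

def sea_surface_height(two_way_delay_s, orbit_altitude_m):
    """Radar altimetry sketch: range = c * t / 2, then
    SSH = orbit altitude - range, both relative to the same
    reference ellipsoid. Corrections omitted for clarity."""
    range_m = C * two_way_delay_s / 2.0
    return orbit_altitude_m - range_m

# Echo delay corresponding to a 1,335,980 m range from a 1,336,000 m orbit:
delay = 2 * 1_335_980.0 / C
print(sea_surface_height(delay, 1_336_000.0))  # ~20 m above the reference
```

The demanding part in practice is timing: at the speed of light, 1 cm of height is about 67 picoseconds of delay, which is why altimeter missions emphasize precise orbit determination and clocks.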

TOPEX/Poseidon data show waves developing in Pacific

Two days later, a surfer at Mavericks (Half
Moon Bay, California) rides the huge waves

ERS-2 SAR image of waves the next day.


SAR measures wave direction and wave
height

Sonar Mapping of Seafloor Topography (pp. 424-425)


Acoustic mapping of the
seafloor involves active sensing
using sound waves
Distance from ship to ocean
bottom is inferred from travel
time
Distances must be corrected for
water temperature

Multibeam Sonar (pg. 425)


Multibeam sonar creates a wider
swath
Depth and return intensity can be
recorded
Limited to deeper water

Shallow-water Data Gap (pp. 425-426)

Problems:
DEM ends at the beach
Navigational hazards limit shallow range of sonar data

Potential Solutions:
Airborne LIDAR
Satellite Imagery

Filling the Shallow Water Data Gap using Ikonos


Satellite blue and green band imagery
Advantage:
High spatial resolution imagery
Issues:
Sea state and environmental conditions
Surf zone
Clouds

Ikonos Image Processing: Principles of Water Depth Determination
Longer wavelengths of spectral
energy attenuate faster as they
pass through water

Establish the relationship between the attenuation
rates of the blue and green spectral bands as a
function of depth
Use this correlation to derive depth
Analysis must be image specific
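One common way to operationalize the blue/green relationship is a log-ratio: because green attenuates faster than blue, the ratio of log-transformed reflectances varies with depth. The sketch below assumes that approach; the coefficients m1 and m0 are invented here and, as the slide says, must be calibrated per image against known depths (e.g., sonar soundings).

```python
import math

# Band-ratio depth retrieval sketch. m1 (scale), m0 (offset), and the
# fixed constant n are invented placeholders; in practice m1 and m0
# are fit per image against pixels of known depth.
def depth_from_ratio(r_blue, r_green, m1=50.0, m0=40.0, n=1000.0):
    """Estimated depth from blue and green water-leaving reflectances.
    As depth increases, green is absorbed faster than blue, so the
    log-ratio (and hence the estimate) increases."""
    return m1 * math.log(n * r_blue) / math.log(n * r_green) - m0

# Deeper water: green drops faster than blue, so the ratio grows.
shallow = depth_from_ratio(0.12, 0.10)
deep = depth_from_ratio(0.05, 0.02)
print(shallow < deep)  # True
```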

Filling the Shallow Water Data Gap

Remote Sensing of Lakes (pp. 413-414)


Monitoring changes in surface water extent
Lake Chad

Water level
fluctuations in Lake
Chad, West-Central Africa

Landsat MSS, 1973


The reduction in water
surface area can be
determined by overlaying
the two images
AVHRR, 1997
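The overlay operation described above can be sketched with toy data: classify each pixel of two co-registered images as water or not, then sum and difference the masks. The 3x3 grids below are invented, not actual Lake Chad pixels, and the 1 km² pixel area assumes AVHRR-like resolution.

```python
# Binary water masks (1 = water, 0 = land) from two co-registered
# images of the same area; values here are toy data for illustration.
water_1973 = [[1, 1, 1],
              [1, 1, 1],
              [0, 1, 1]]
water_1997 = [[0, 0, 1],
              [0, 1, 1],
              [0, 0, 1]]

pixel_area_km2 = 1.0  # assumes ~1 km pixels (AVHRR-like)

# Water surface area in each year, and the change between them.
area_1973 = sum(map(sum, water_1973)) * pixel_area_km2
area_1997 = sum(map(sum, water_1997)) * pixel_area_km2
print(area_1973, area_1997)   # 8.0 4.0
print(area_1973 - area_1997)  # 4.0 km^2 of water surface lost
```

The same mask-differencing also shows *where* the shoreline retreated, not just how much area was lost, which is the point of overlaying rather than simply counting pixels.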

Remote Sensing of Rivers


Flooding
extent of surface waters
floodplain mapping

Mekong Delta
flooding using
multi-temporal
MODIS data
Mapping the
extent of surface
waters for large
flood events
allows improved
mapping of
floodplains

Southeast Asian Flooding in summer 2000
NASA global flood
monitoring system operated
through the Dartmouth
Flood Observatory
Data collected from
MODIS 8-day composites
Used to locate hardest hit
areas for United Nations
relief efforts

Tropical Rainfall Measuring Mission (TRMM) maps
areas of heavy precipitation between 50°N and 50°S.
Near real time capability (daily maps)
(pp. 430-431)

Flooding in southern Texas

Snow cover Mapping (pp. 435-437)


Passive microwave
25 km grid scale
weekly data available since October 1978
daily data available since January 2000

AVHRR
1 km grid scale
weekly data available since October 1966

MODIS
500m grid scale
data available since February 2000

All data available for free from National Snow and Ice
Data Center http://nsidc.org/data

Snow covered area from MODIS
daily and 8-day
composite global snow
cover products
500 m spatial
resolution
available for free from
http://nsidc.org/data/m
od10_l2.html

March 25, 2003 Colorado Blizzard
(blue=snow, white=cloud)

MODIS Snow Standard Product, Colorado, 15 Feb 2002

snow
cloud
no snow

25% of actual snow covered area is incorrectly classified as cloud

MODIS fractional snow cover, Colorado, 15 Feb 2002

% snow cover
90-100%
70-90%
50-70%
30-50%
10-30%
0-10%

Vegetation
obscures snow
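The MODIS snow products shown here are built on the Normalized Difference Snow Index (NDSI): snow is bright in visible green but dark in shortwave infrared, so snow pixels have high NDSI. A minimal sketch with invented reflectance values; the 0.4 threshold is the one historically used by the standard binary product, while fractional products map NDSI to percent cover.

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index from green and shortwave-
    infrared reflectances. Snow: high green, low SWIR -> NDSI near 1.
    Most other surfaces (and the dark vegetation that obscures snow)
    give low values."""
    return (green - swir) / (green + swir)

print(ndsi(0.70, 0.10) > 0.4)  # snow-covered pixel: True
print(ndsi(0.25, 0.20) > 0.4)  # snow-free ground: False
```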

Glaciers
Global Land Ice Monitoring from Space (GLIMS)
Uses ASTER data to map
glacier extent
snow line
glacier topography

Synthetic Aperture Radar


Used to map glacier zones
Interferometry used to map glacier velocity

ASTER
image of
Patagonian
glacier in
Chile

Ice Velocity
map for the
Lambert glacier
in Antarctica
made by Radar
Interferometry

Remote Sensing of
Urban Landscapes and
Natural Hazards
(Chapter 13)

Outline
Resolution considerations
Sensors
Applications:
urban growth
demography
disaster monitoring and mitigation
epidemiology

Urban-Suburban Land Use (pp. 443-444)


Urban and suburban expansion
almost 1/2 the Earth's population lives in cities
rapid expansion of urban centers and their
peripheries
impacts on land cover, societal structure of the
cities, population distribution, land use
characteristics
interconnectivity of cities at large scales

Urban Remote Sensing (pp. 444-445)


High spatial resolution data are needed
High temporal and spectral resolution are
typically not a significant requirement for most
applications
Ancillary data typically used (census data) with
imagery
Can measure things such as urban extent, housing
density, structure type, urban vegetation cover,
air quality, change detection

Temporal and Spatial Resolution Requirements (Figure 13-2 and Table 13-1)
Requirements vary depending on applications
short term (event-scale, sub-annual) vs. long
term (interannual)
high spatial resolution (< 1m) vs. medium
spatial resolution (15-30m)

High Spatial Resolution Sensors (pg. 446)

QuickBird (65cm B/W, 2.4m multispectral)
IKONOS (1m B/W, 4m multispectral)
SPOT (2.5m - 20m multispectral)
ASTER (15-30m multispectral)
Landsat ETM+ (15m B/W, 30m multispectral)

Delineation of Urban Areas (pp. 452-456)


Difficult to do because urban areas are
diverse and complex
Boundaries between urban and suburban
are not always clear
Lack of a consistent definition of what is
urban
administrative boundaries
population density, etc.

Baltimore, MD: Well-developed city center


Diffuse boundary
between urban and
natural environment

Landsat TM multispectral image

Las Vegas, NV:


Indistinct city center
Distinct boundary
between urban and
natural
environments

Landsat TM multispectral image

Riyadh, Saudi
Arabia:
Intermediate
case

Riyadh,
Saudi Arabia,
1972-2000 urban
growth

Demographic/Socioeconomic Patterns
Census data lack spatial details and are
infrequently updated (not globally available)
Remote sensing is useful for monitoring urban
growth in developing countries
Need ancillary data plus repeat temporal
coverage from remote sensing
Important to integrate physical and
socioeconomic variables

Example
Pozzi and Small (2002) produced a study of
relationship between population density
(from US census) and vegetation cover (from
Landsat TM)

NYC Population Density

(source: US Census)

NYC Vegetation Fraction

(source: Landsat TM)

Linear inverse correlation between population and vegetation fraction
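The kind of relationship Pozzi and Small examined can be sketched as an ordinary least-squares line fit of population density against vegetation fraction. The five data points below are invented for illustration, not taken from their study; the point is only that a negative slope expresses the inverse correlation.

```python
# Toy data: vegetation fraction (from imagery) vs. population
# density (from census), invented to mimic an inverse relationship.
veg_fraction = [0.05, 0.15, 0.30, 0.50, 0.70]
pop_density  = [90.0, 70.0, 45.0, 25.0,  8.0]

# Ordinary least-squares fit: pop_density ~ slope * veg_fraction + intercept
n = len(veg_fraction)
mean_x = sum(veg_fraction) / n
mean_y = sum(pop_density) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(veg_fraction, pop_density)) / \
        sum((x - mean_x) ** 2 for x in veg_fraction)
intercept = mean_y - slope * mean_x

print(slope < 0)  # True: denser population goes with less vegetation
```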

Urban Heat Island Monitoring (pg. 498)


Project ATLANTA (Atlanta Land-use Analysis:
Temperature and Air-quality)
Uses remote sensing to observe, measure, and
monitor impacts of rapid urban growth

ATLAS Thermal Images of Atlanta, Georgia

Atlanta - Daytime Image

Atlanta - Nighttime Image

New York City Surface Temperature Map


made from Landsat TM Thermal Band Image

City Lights Imagery (pg. 463)


Uses visible band of the Operational Linescan
System (on board the DMSP satellite)
Useful for making global inventories of human
settlements
Spatial resolution of 1km
Relationships between city lights and
socioeconomic variables such as population
density, economic activity, electric power
consumption, etc.

Earth Lights from Operational Linescan System (pg. 463)

Remote Sensing of Air Pollution

MODIS image showing polluted air like a gray veil over the region southwest of
Chicago on September 9, 2002.

Measurements of Pollution in the Troposphere (MOPITT)

Carbon monoxide plumes from China


22 km spatial resolution, 640 km FOV

Natural Hazard Remote Sensing (pg. 500)

Volcanic eruptions
Tornados
Hurricanes
Oil spills
Earthquakes
War/terrorism
Floods

Mt. St. Helens, Washington, May 18, 1980


Ash Plume As Seen By GOES-West

ASTER image of Maryland Tornado Path (pg. 498)


AVHRR image
of Hurricane
Floyd
September
1999

Flooding in New Orleans


caused by Hurricane
Katrina
Quickbird satellite image showing
flooding that resulted from the
failure of the levee near the 17th
Street Canal

New Orleans on March 04, 2004, for comparison purposes

Mapping Flooding in New Orleans

Lidar-derived water depths superimposed over a
high-resolution SPOT image

Building Damage in
Biloxi, Mississippi caused
by Hurricane Katrina

QuickBird image, taken on August 31,
shows extensive damage in the blocks
nearest the shore. Within two city blocks,
two floating casinos (the Island of Capri
Casino and the Grand Casino Biloxi)
have disappeared or floated inland

RADARSAT Image of Oil Spill

pp. 500-501

Bam, Iran
Earthquake
destruction
IKONOS image from
12/27/2003

pp. 500-501

Old City Bam


Earthquake
destruction
IKONOS image from
12/27/2003

Images of
Terrorism
IKONOS image of
World Trade Center
Sept. 12, 2001
1 m spatial
resolution

Wartime Remote Sensing Imagery


Quickbird
image of
explosion in
Baghdad
suburb
70 cm
resolution

Epidemiology
The cholera bacterium (Vibrio cholerae) attaches to zooplankton
(copepods) and phytoplankton. Plankton plumes emanating
from the Ganges are being monitored

SST and plankton can be monitored in the Bay of Bengal to track this

Hantavirus (carried by mice) correlates to
changes in precipitation (El Niño) and vegetation
cover, especially grasses
NDVI can be used to track these changes
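NDVI itself is a one-line index computed from red and near-infrared reflectances; healthy vegetation reflects strongly in the NIR and absorbs red, so higher values mean denser green cover. The reflectance values below are illustrative.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectances. Ranges from -1 to 1; higher values indicate
    denser, greener vegetation (e.g., the grasses that support mouse
    populations)."""
    return (nir - red) / (nir + red)

print(round(ndvi(0.45, 0.05), 2))  # dense grass cover: 0.8
print(round(ndvi(0.12, 0.10), 2))  # sparse or bare ground: 0.09
```

Tracking NDVI through time over the same region is what links the vegetation response to El Niño rainfall anomalies.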

Townshend et al. found that Ebola outbreaks
corresponded to changes in land use and seasonal
climate patterns

Cholera and SST

Landsat TM map
of land cover near
Kikwit, Zaire
(location where
Ebola outbreaks
were first
reported in 1995)
pink=cleared
areas
green=jungle
