TRANSCRIPT
Remote Sensing & Image Processing
Fundamentals
Mohamed M. Mostafa, GIS Dept. Manager
Information Technology, [email protected]
Remote Sensing Fundamentals
What is Remote Sensing?
“Remote sensing is the science (and to some extent,
art) of acquiring information about the Earth's
surface without actually being in contact with it.
This is done by sensing and recording reflected or
emitted energy and processing, analyzing, and
applying that information."
The Remote Sensing Process
1. Energy Source or Illumination (A)
2. Radiation and the Atmosphere (B)
3. Interaction with the Target (C)
4. Recording of Energy by the Sensor (D)
5. Transmission, Reception, and
Processing (E)
6. Interpretation and Analysis (F)
7. Application (G)
Advantages of Satellite Imagery
– Data coverage is greatly improved
– Homogeneity of data
– Satellite remote sensing data are spatially continuous
– The data from satellite remote sensing are commonly in a form suitable for computer processing
– The frequency of data collection is greatly improved
– The time base of a single pass of a satellite is very restricted
– Satellite remote sensing provides a low-cost means of data collection
Electromagnetic Radiation
Electromagnetic radiation consists of an electrical field (E), which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electrical field. Both fields travel at the speed of light (c).
Wavelength and frequency
The wavelength (λ) is the length of one wave cycle, which can be measured as the distance between successive wave crests. Frequency (ν) refers to the number of cycles of a wave passing a fixed point per unit of time. Wavelength and frequency are related by the following formula:

c = λν, where c is the speed of light (about 3 × 10⁸ m/s), λ is the wavelength (m), and ν is the frequency (cycles per second, Hz).
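A quick numerical check of this relationship (a minimal Python sketch; the example wavelengths are illustrative, not from the slides):

```python
# c = lambda * nu: frequency follows directly from wavelength.
C = 2.998e8  # speed of light, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Return the frequency (Hz) of radiation with the given wavelength (m)."""
    return C / wavelength_m

# Illustrative values: red light (~0.7 um) and a C-band radar wavelength (~5.6 cm).
print(frequency_hz(0.7e-6))  # ~4.3e14 Hz
print(frequency_hz(0.056))   # ~5.4e9 Hz
```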
The ultraviolet (UV) portion of the spectrum has the shortest wavelengths that are practical for remote sensing. Some Earth surface materials, primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV radiation.
The Electromagnetic Spectrum
The light which our eyes (our "remote sensors") can detect is part of the visible spectrum. The visible wavelengths cover a range from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is violet.
Interactions with the Atmosphere
Particles and gases in the
atmosphere can affect the incoming
light and radiation. These effects are
caused by the mechanisms of
scattering and absorption
Scattering
• Rayleigh scattering: particles < wavelength of the radiation; scatters shorter wavelengths; e.g. dust or nitrogen and oxygen molecules; explains the blue sky.
• Mie scattering: particles ≈ wavelength of the radiation; scatters longer wavelengths; e.g. pollen, smoke and water vapour.
• Nonselective scattering: particles > wavelength of the radiation; all wavelengths are scattered about equally; e.g. white fog (blue + green + red light = white light).
Absorption
• Ozone: absorbs the harmful (to most living things) ultraviolet radiation from the sun.
• Carbon dioxide: absorbs radiation strongly in the far-infrared portion of the spectrum; responsible for the greenhouse effect.
• Water vapour: absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22 µm and 1 m).
Atmospheric windows: those areas of the spectrum which are not severely influenced by atmospheric absorption and thus are useful to remote sensors.
Radiation Target Interactions
Leaves: a chemical compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. Leaves appear "greenest" to us in the summer, when chlorophyll content is at its maximum. In autumn there is less chlorophyll in the leaves, so there is less absorption and proportionately more reflection of the red wavelengths, making the leaves appear red or yellow (yellow is a combination of red and green wavelengths). Strong near-IR reflectance indicates a healthy leaf.
Water
Water typically looks blue or blue-
green due to stronger reflectance
at these shorter wavelengths, and
darker if viewed at red or near
infrared wavelengths.
If there is suspended sediment
present in the upper layers of the
water body, then this will allow
better reflectivity and a brighter
appearance of the water.
Passive vs. Active Sensing
Passive sensors:
• Remote sensing systems which measure energy that is naturally available.
• For reflected energy, this is only possible during the time when the sun is illuminating the Earth.
Active sensors:
• Provide their own energy source for illumination.
• Can operate at any time, regardless of the time of day or season.
• Some examples of active sensors are a laser fluorosensor and a synthetic aperture radar (SAR).
Characteristics of Images
• An image is subdivided into small, equal-sized and shaped areas called pixels; each one is represented by a numeric value or digital number (DN).
• Sensors electronically record the energy as an array of numbers in digital format right from the start.
Sensor platforms
Satellite Characteristics
Geostationary orbits: satellites at very high altitudes which view the same portion of the Earth's surface at all times. At altitudes of approximately 36,000 km, they revolve at speeds which match the rotation of the Earth.
• ex. weather and communications satellites
Near-polar orbits: many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. Many of these satellite orbits are also sun-synchronous: they pass over each area of the world at a constant local time of day, which ensures consistent solar illumination conditions for imaging.
Spatial Resolution
It is important to distinguish between pixel size and spatial resolution - they are not interchangeable. If a sensor has a spatial resolution of 20 meters and an image from that sensor is displayed at full resolution, each pixel represents an area of 20m x 20m on the ground. In this case the pixel size and resolution are the same. However, it is possible to display an image with a pixel size different than the resolution.
Spectral Resolution
describes the ability of a sensor to
define fine wavelength intervals.
•To separate finer wavelength
ranges we would require a sensor
with higher spectral resolution
• Black-and-white film records wavelengths across the visible spectrum without distinguishing between them individually, so its spectral resolution is fairly coarse.
Radiometric Resolution
• The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. Thus, if a sensor used 8 bits to record the data, there would be 2^8 = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2^4 = 16 values ranging from 0 to 15 would be available.
• By comparing a 2-bit image with an 8-bit image we can see that there is a large difference in the level of detail depending on their radiometric resolutions
2-bit image 8-bit image
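To make the arithmetic above concrete, here is a minimal sketch (assuming NumPy; the image values are synthetic) of how the number of brightness levels follows from the bit depth and how an 8-bit image can be requantized to a coarser radiometric resolution for comparison:

```python
import numpy as np

def brightness_levels(bits: int) -> int:
    """Number of digital values available for a given radiometric resolution."""
    return 2 ** bits  # e.g. 8 bits -> 256 levels (0-255), 4 bits -> 16 levels

def requantize(dn_8bit: np.ndarray, bits: int) -> np.ndarray:
    """Requantize an 8-bit image to a coarser radiometric resolution.

    Integer-dividing each DN by the step size collapses nearby brightness
    values into the same level, which is why a 2-bit rendering shows far
    less detail than the 8-bit original.
    """
    step = 256 // brightness_levels(bits)
    return (dn_8bit // step).astype(np.uint8)

# Illustrative 8-bit image with random DNs.
img = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
print(brightness_levels(8), brightness_levels(4))  # 256 16
print(requantize(img, 2))                          # values limited to 0-3
```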
Temporal Resolution
The time for a satellite to revisit the same point.
• is important when: – persistent clouds offer limited clear
views of the Earth's surface (often in the tropics)
– short-lived phenomena (floods, oil slicks, etc.) need to be imaged
– multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the next)
– the changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize)
MSS systems vs. photographic systems
• Spectral range: extended (MSS) vs. restricted (photographic)
• Spectral resolution: higher (MSS) vs. lower (photographic)
• Band acquisition: MSS acquires all spectral bands simultaneously through the same optical system; photographic systems use separate lens systems to acquire each spectral band
• Recording: MSS data are recorded electronically; photographic systems use a photochemical process to record the detected radiation
• Processing: MSS data can be processed immediately in a computer; photographic processing takes a long time
• Delivery: MSS data are transmitted electronically to receiving stations on the ground; photographs are processed on the ground after they have been taken
Image Interpretation and Processing
Photo or image interpretation is to extract qualitative information about photographed objects, either by human visual analysis and/or by computer techniques.
Analyzing images could be for single or stereo images, analog (using optical or mechanical components) or analytical (mathematical modeling).
An orthophoto is prepared from a pair of overlapping images to remove the perspective aspect. Superimposing contours over an orthophoto results in a topographic map that shows both the planimetric features of the terrain and the shape/elevation of the ground.
Thematic maps are produced by delineating particular themes, as we do in GIS spatial data layers.
Images Visual Interpretation
Interpretation elements: tone (color), shape, size, pattern, association, shadow, texture.
Some Satellite Systems
Sensor TypesAbbreviation Name Description
Alt RADAR Altimeter Microwave pulse used to measure altitude from satellite to Earth's surface, useful for determining the ocean's height
PAN Panchromatic Provides grayscale images, wavelength region 400 - 700 nm
RIR Red-Near-Infrared Has at minimum a red (680 nm) and a near-infrared (~800 nm) channel, for vegetation monitoring with NDVI (see the sketch after this table)
Red Red Has at least one channel near 680 nm
SAR-C C-Band Synthetic Aperture RADAR
SAR systems use multiple looks along the flight direction to improve the resolution by using the RADAR return Doppler shift at each position. RADAR return strength is dependent on the surface roughness relative to the wavelength and the electrical characteristics (dielectric constant, which is the electrical energy that is stored, absorbed, and conducted by the surface). C-Band wavelengths are about 3-8 cm. See the Jet Propulsion Laboratory SAR site for more information about SAR.
SAR-K K-Band SAR K-band wavelengths are about 1-2 cm
SAR-L L-Band SAR L-band wavelengths are about 15-30 cm
SAR-P P-Band SAR P-band wavelengths are about 30-100 cm
SAR-S S-Band SAR S-band wavelengths are about 7-15 cm
SAR-X X-Band SAR X-band wavelengths are about 2-4 cm
SAR-Q Q-Band SAR Q-band wavelengths are about 1 cm
SAR-W W-Band SAR W-band wavelengths are about 0.3 cm
SWIR Short-Wave Infrared At least one channel in the range 1300-3000 nm
TIR Thermal Infrared At least one channel in the range 3000+ nm
VIS Visible Multiple channels in the visible portion of the electromagnetic spectrum (400-700 nm)
VNIR Visible-Near-Infrared Multiple channels in the visible through near-infrared portion of the electromagnetic spectrum (400-1300 nm)
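The RIR entry above mentions vegetation monitoring with NDVI. As a minimal illustration (assuming NumPy; the band arrays are made-up DNs, not real sensor data), NDVI is simply (NIR - Red) / (NIR + Red):

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects strongly in the near infrared and absorbs
    red light, so NDVI approaches +1 over dense vegetation and is near
    zero (or negative) over bare soil and water.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)  # avoid divide-by-zero
    return out

# Illustrative DN arrays for a red (~680 nm) and a near-IR (~800 nm) channel.
red_band = np.array([[30, 60], [90, 20]], dtype=np.uint8)
nir_band = np.array([[120, 70], [95, 20]], dtype=np.uint8)
print(ndvi(red_band, nir_band))
```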
Some satellite missions and instruments (spatial resolution in meters at nadir, listed in the order PAN / VNIR / SWIR / TIR or SAR band; swath in km):
Mission (revisit) | Country | Years | Instrument | Spatial resolution (m) | Swath (km)
IRS-P6 (5-24 day cycle) | India | 2003 | LISS-4 | 6 / 6 / 6 / 6 | 24 - 70
IRS-P6 | India | 2003 | AWiFS | 60 / 60 / 60 | 740
ADEOS-2 | Japan | 2002-2003 | GLI | 250 / 250 / 1000 | 1600
Aqua (EOS PM-1) (2 day) | USA | 2002 | MODIS | 250, 500, 1000 / 500, 1000 / 1000 | 2300
ENVISAT-1 | ESA | 2002 | AATSR | 1000 / 1000 / 1000 | 512
ENVISAT-1 | ESA | 2002 | ASAR | 30 / C-band | 100
Landsats 1-4 | USA | 1972-92 | MSS | 80 | 185
Landsats 4-5 | USA | 1982- | TM | 30 / 30 / 120 | 185
GOES-10 (0.02 day) | USA | 1997 | Imager | 1000, 4000 | hemisphere
GOES-10 | USA | 1997 | Sounder | 10000 / 10000 / 10000 | hemisphere
Radarsat-2 (4-6 days) | Canada | 2004 | SAR | 3 - 100 / C-band | 10 - 500
NOAA 6-11, 15 | USA | 1978-2001 | AVHRR | 1.1 km / 1.1 km / 1.1 km | 3000
Quickbird-2 (1-5 days) | USA | 2001 | Quickbird | 1 (PAN) / 4 (VNIR) | 22
SPOT-1, 3 | France | 1986-1996 | 2×HRV | 10 (PAN) / 20 (VNIR) | 60
Weather Satellites/Sensors GOES
– Geostationary Operational Environmental
Satellite
– GOES-1 (launched 1975), GOES-7 (launched 1987), GOES-8 (launched 1994)
– 10-bit to 13-bit radiometric resolution
– 36,000 km above the equator
GOES Bands
Band | Description | Wavelength (µm) | Spatial resolution
1 | visible | 0.52 - 0.72 | 1 km
2 | shortwave IR | 3.78 - 4.03 | 4 km
3 | upper-level water vapor | 6.47 - 7.02 | 4 km
4 | longwave IR | 7.2 - 10.2 | 4 km
5 | longwave IR | 10.2 - 11.2 | 4 km
NOAA AVHRR
– National Oceanic and Atmospheric Administration (NOAA)
– Advanced Very High Resolution Radiometer (AVHRR)
-Sun-synchronous, near-polar orbits (830-870 km)
above the Earth
-AVHRR sensor detects radiation in the visible,
near and mid infrared, and thermal infrared
portions of the electromagnetic spectrum
(source: NOAA)
NOAA AVHRR Bands
Band | Wavelength (µm) | Spatial resolution | Application
1 (red) | 0.58 - 0.68 | 1.1 km | cloud, snow, and ice monitoring
2 (near IR) | 0.725 - 1.1 | 1.1 km | water, vegetation, and agriculture surveys
3 (mid IR) | 3.55 - 3.93 | 1.1 km | sea surface temperature, volcanoes, and forest fire activity
4 (thermal IR) | 10.3 - 11.3 | 1.1 km | sea surface temperature, soil moisture
5 (thermal IR) | 11.5 - 12.5 | 1.1 km | sea surface temperature, soil moisture
Sensor Specifications: AVHRR/3 (NOAA-15 through 18)
Band | Wavelength Region (µm) | Resolution (km)
1 | 0.58-0.68 (red) | 1.1
2 | 0.73-0.98 (near-IR) | 1.1
3a | 1.58-1.63 (mid-IR) | 1.1
3b | 3.54-3.87 (high-temp TIR) | 1.1
4 | 10.3-11.3 (TIR) | 1.1
5 | 11.5-12.4 (TIR) | 1.1
Land Observation Satellites: Landsat
– 700 km altitude and have revisit periods of 16 days
– a combination of sensors with spectral bands tailored
to Earth observation;
– functional spatial resolution
– good areal coverage (swath width and revisit period).
– near-polar, sun-synchronous orbits
Landsat Sensors
Thematic Mapper (TM) vs. Multispectral Scanner (MSS)
• TM: carried starting on Landsat 4
• Bands: TM has 7 bands; MSS has four spectral bands
• Spatial resolution: TM 30 m for all bands except the thermal infrared band (120 m); MSS 60 x 80 m
• Radiometric resolution: TM 8 bits; MSS 6 bits
Sensor Specifications: Landsat MSS
Band | Wavelength Region (µm) | Resolution (m)
1 | 0.5-0.6 (green) | 80
2 | 0.6-0.7 (red) | 80
3 | 0.7-0.8 (red - near-IR) | 80
4 | 0.8-1.0 (near-IR) | 80
TM Bands
Band | Wavelength (µm) | Application
1 blue (30 m) | 0.45 - 0.515 | soil/vegetation discrimination; bathymetry/coastal mapping; cultural/urban feature identification
2 green (30 m) | 0.525 - 0.605 | green vegetation mapping (measures reflectance peak); cultural/urban feature identification
3 red (30 m) | 0.63 - 0.69 | vegetated vs. non-vegetated and plant species discrimination (plant chlorophyll absorption); cultural/urban feature identification
4 near IR (30 m) | 0.75 - 0.90 | identification of plant/vegetation types, health, and biomass content; water body delineation; soil moisture
5 shortwave IR | 1.55 - 1.75 | sensitive to moisture in soil and vegetation; discriminating snow- and cloud-covered areas
6 thermal IR | 10.4 - 12.5 | vegetation stress and soil moisture discrimination related to thermal radiation; thermal mapping (urban, water)
7 shortwave IR | 2.08 - 2.35 | discrimination of mineral and rock types; sensitive to vegetation moisture content
Pan (Enhanced Thematic Mapper, ETM+) | 0.52 - 0.90 | urban planning applications
SPOT
• Système Pour l'Observation de la Terre, France, with support from Sweden and Belgium
• SPOT-1 was launched in 1986
• Sun-synchronous, near-polar orbits at altitudes around 830 km
• Orbit repetition every 26 days
• Equator crossing times around 10:30 AM local solar time
• The first satellite to use along-track, or pushbroom, scanning technology
• Carries twin High Resolution Visible (HRV) imaging systems
• Each HRV sensor consists of four linear arrays of detectors: one 6000-element array for the panchromatic mode, recording at a spatial resolution of 10 m, and one 3000-element array for each of the three multispectral bands, recording at 20 m spatial resolution
• The swath width for both modes is 60 km
SPOT Bands
Band | Wavelength (µm)
Panchromatic | 0.51 - 0.73 (blue-green-red)
Multispectral Band 1 | 0.50 - 0.59 (green)
Multispectral Band 2 | 0.61 - 0.68 (red)
Multispectral Band 3 | 0.79 - 0.89 (near infrared)
Applications requiring frequent
monitoring (agriculture, forestry)
are well served by the SPOT sensors.
The acquisition of stereoscopic imagery
from SPOT has played
an important role in mapping
applications and in the
derivation of topographic information
(Digital Elevation Models - DEMs) from
satellite data.
Sensor Specifications: SPOT
HRV-IR (SPOT-5)
Band | Wavelength Region (µm) | Resolution (m)
1 | 0.50-0.59 (green) | 10
2 | 0.61-0.68 (red) | 10
3 | 0.78-0.89 (near-IR) | 10
4 | 1.58-1.75 (mid-IR) | 20
PAN | 0.51-0.73 (PAN) | 5

Vegetation (SPOT-4 and -5)
Band | Wavelength Region (µm) | Resolution (m)
1 | 0.43-0.47 (blue) | 1000
2 | 0.61-0.68 (red) | 1000
3 | 0.78-0.89 (near-IR) | 1000
4 | 1.58-1.75 (mid-IR) | 1000
Sensor Specifications: Quickbird
Band Wavelength Region (µm) Resolution (m)
1 0.45-0.52 (blue) 4
2 0.52-0.60 (green) 4
3 0.63-0.69 (red) 4
4 0.76-0.89 (near-IR) 4
PAN 0.45-0.90 (PAN) 1
IRS
• The Indian Remote Sensing (IRS) satellite series
• Combines features from both the Landsat MSS/TM sensors and the SPOT HRV sensor
• The third satellite in the series, IRS-1C, launched in December 1995
• Has three sensors:
– a single-channel panchromatic (PAN) high-resolution camera,
– a medium-resolution four-channel Linear Imaging Self-scanning Sensor (LISS-III),
– and a coarse-resolution two-channel Wide Field Sensor (WiFS)
• 5-day revisit
IRS Bands
Sensor | Band | Wavelength (µm) | Spatial resolution | Swath width | Revisit period
PAN | panchromatic | 0.5 - 0.75 | 5.8 m | 70 km | 24 d
LISS-III | green | 0.52 - 0.59 | 23 m | 142 km | 24 d
LISS-III | red | 0.62 - 0.68 | 23 m | 142 km | 24 d
LISS-III | near IR | 0.77 - 0.86 | 23 m | 142 km | 24 d
LISS-III | shortwave IR | 1.55 - 1.70 | 70 m | 148 km | 24 d
WiFS | red | 0.62 - 0.68 | 188 m | 774 km | 5 d
WiFS | near IR | 0.77 - 0.86 | 188 m | 774 km | 5 d
Marine Observation Satellites/Sensors: Nimbus satellite
– sun-synchronous, near-polar orbit at an altitude of 955 km
– launched in 1978; ceased operation in 1986
– the repeat cycle of the satellite allowed for global coverage every six days
– carried the Coastal Zone Colour Scanner (CZCS) sensor
– the CZCS sensor consisted of six spectral bands in the visible, near-IR, and thermal portions of the spectrum, each collecting data at a spatial resolution of 825 m at nadir over a 1566 km swath width
Nimbus Satellite
The primary objective of this sensor was
to observe ocean colour and temperature,
particularly in coastal zones, with
sufficient spatial and spectral resolution to
detect pollutants in the upper levels of the
ocean and to determine the nature of
materials suspended in the water column
MOS
– MOS-1 was launched by Japan in February 1987 and was followed by its successor, MOS-1b, in February 1990
– These satellites carry three different sensors: a four-channel Multispectral Electronic Self-Scanning Radiometer (MESSR), a four-channel Visible and Thermal Infrared Radiometer (VTIR), and a two-channel Microwave Scanning Radiometer (MSR) operating in the microwave portion of the spectrum.
Data Reception, Transmission
There are three main options for transmitting data acquired by satellites to the surface. The data can be directly transmitted to Earth if a Ground Receiving Station (GRS) is in the line of sight of the satellite (A). If this is not the case, the data can be recorded on board the satellite (B) for transmission to a GRS at a later time. Data can also be relayed to the GRS through the Tracking and Data Relay Satellite System (TDRSS) (C), which consists of a series of communications satellites in geosynchronous orbit. The data are transmitted from one satellite to another until they reach the appropriate GRS .
Microwave Remote Sensing
Microwave spectrumLonger wavelength microwave radiation can penetrate through cloud cover, haze, dust, and all but the heaviest rainfall as the longer wavelengths are not susceptible to atmospheric scattering which affects shorter optical wavelengths. This property allows detection of microwave energy under almost all weather and environmental conditions so that data can be collected at any time.
Passive Microwave Sensing
Passive microwave sensing is similar in concept to thermal remote sensing. All objects emit microwave energy of some magnitude, but the amounts are generally very small.
The microwave energy recorded by a passive sensor can be emitted by the atmosphere (1), reflected from the surface (2), emitted from the surface (3), or transmitted from the subsurface (4).
Because the wavelengths are so long, the energy available is quite small compared to optical wavelengths. Thus, the fields of view must be large to detect enough energy to record a signal. Most passive microwave sensors are therefore characterized by low spatial resolution.
Active Microwave Sensing
Active sensors provide their own source of microwave radiation to illuminate the target. Active microwave sensors are generally divided into two distinct categories: imaging and non-imaging. The most common form of imaging active microwave sensor is RADAR. RADAR is an acronym for RAdio Detection And Ranging, which essentially characterizes the function and operation of a radar sensor. The sensor transmits a microwave (radio) signal towards the target and detects the backscattered portion of the signal. The strength of the backscattered signal is measured to discriminate between different targets, and the time delay between the transmitted and reflected signals determines the distance (or range) to the target.
Non-imaging microwave sensors include altimeters and scatterometers.
Radar Basics
A radar consists fundamentally of a transmitter, a receiver, an antenna, and an electronics system to process and record the data. The transmitter generates successive short bursts (or pulses) of microwave energy (A) at regular intervals which are focused by the antenna into a beam (B). The radar beam illuminates the surface obliquely at a right angle to the motion of the platform. The antenna receives a portion of the transmitted energy reflected (or backscattered) from various objects within the illuminated beam (C). By measuring the time delay between the transmission of a pulse and the reception of the backscattered "echo" from different targets, their distance from the radar, and thus their location, can be determined. As the sensor platform moves forward, recording and processing of the backscattered signals builds up a two-dimensional image of the surface.
Radar Image
Speckle appears as a grainy "salt and pepper" texture in an image. This is caused by random constructive and destructive interference from the multiple scattering returns that occur within each resolution cell.
Operations performed on images
• image compression to reduce size
• image filtering and enhancement of brightness and contrast, and noise removal, to improve visualization and interpretation
• image geometric correction to remove distortions due to:
– the perspective of the sensor optics, film deformations, lens distortion
– the motion of the scanning system
– the motion and (in)stability of the platform
– the platform altitude, attitude, and velocity
– the terrain relief and atmospheric refraction
– the curvature and rotation of the Earth
• image rectification and resampling transformation to remove tilt effects and interpolate new values for the pixel grid from the original raw image
• image projection and georeferencing to relate objects in the image to their earth location and to a definite coordinate system, and image feature extraction for the purpose of map production, based on image spectral/spatial/temporal pattern recognition
Aerial Imagery Perspective Example
Geometric Distortion
Due to :
– the perspective of the sensor optics,
– the motion of the scanning system,
– the motion and (in)stability of the platform,
– the platform altitude, attitude, and velocity,
– the terrain relief, and
– the curvature and rotation of the Earth.
Geometric Correction
It is usually necessary to preprocess remotely sensed data and remove geometric distortion so that individual picture elements (pixels) are in their proper planimetric (x, y) map locations. This allows remote sensing-derived information to be related to other thematic information in geographic information systems (GIS) or spatial decision support systems (SDSS). Geometrically corrected imagery can be used to extract accurate distance, polygon area, and direction (bearing) information.
Image to Map Rectification
Image-to-map rectification is the process by which the geometry of an image is made planimetric. Whenever accurate area, direction, and distance measurements are required, image-to-map geometric rectification should be performed. It may not, however, remove all the distortion caused by topographic relief displacement in images. The image-to-map rectification process normally involves selecting GCP image pixel coordinates (row and column) with their map coordinate counterparts (e.g., meters northing and easting in a Universal Transverse Mercator map projection).
Image to Image Registration
Image-to-image registration is the translation and rotation alignment process by which two images of like geometry and of the same geographic area are positioned coincident with respect to one another so that corresponding elements of the same ground area appear in the same place on the registered images. This type of geometric correction is used when it is not necessary to have each pixel assigned a unique x, y coordinate in a map projection. For example, we might want to make a cursory examination of two images obtained on different dates to see if any change has taken place.
Geometric registration process
involves identifying the image coordinates (i.e. row,
column) of several clearly discernible points, called
ground control points (or GCPs), in the distorted
image (A - A1 to A4), and matching them to their
true positions in ground coordinates (e.g. latitude,
longitude). The true ground coordinates are typically
measured from a map (B - B1 to B4), either in paper
or digital format .
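The slides do not prescribe a particular transformation, but a common first-order approach is to fit an affine transform from the GCP image coordinates to the GCP map coordinates by least squares. A minimal sketch (assuming NumPy; the GCP values are invented for illustration):

```python
import numpy as np

def fit_affine(image_xy: np.ndarray, ground_xy: np.ndarray) -> np.ndarray:
    """Fit a first-order (affine) transform from image (col, row) to map (E, N).

    Solves ground = [col, row, 1] @ A in a least-squares sense; at least
    three non-collinear GCPs are required, and more improve the fit.
    """
    ones = np.ones((image_xy.shape[0], 1))
    design = np.hstack([image_xy, ones])               # rows of [col, row, 1]
    coeffs, *_ = np.linalg.lstsq(design, ground_xy, rcond=None)
    return coeffs                                      # shape (3, 2)

# Illustrative GCPs: pixel (col, row) vs. UTM (easting, northing) positions.
gcp_pixels = np.array([[10, 15], [480, 22], [470, 500], [25, 505]], float)
gcp_map = np.array([[500100, 4200400], [509500, 4200260],
                    [509300, 4190700], [500400, 4190800]], float)

A = fit_affine(gcp_pixels, gcp_map)
# Apply the fitted transform to any pixel coordinate.
pixel = np.array([250, 250, 1.0])
print(pixel @ A)  # estimated (easting, northing)
```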
Image structure
Digital image radiometry spans from 1 bit to 8 bits per value; a pixel can hold multiple values: 3 values for colored images (R, G, B), where black is (0, 0, 0); 4 to 40 values for multispectral images; and over 200 values for hyperspectral images.
Image size and storage
• A 9 x 9 inch aerial photo scanned at 8 bits with a 0.05 mm pixel size = 20.9 MB. If scanned at a 0.0125 mm pixel size it will be 334.4 MB. If it is a colored RGB image, the size will be 3 times larger.
Image size reduction can exploit:
• Spatial (neighbor) redundancy
• Spectral redundancy
• Temporal redundancy (as in image frame sequences and videos)
• Allowing some minor changes with lossy compression
• Prediction coding (instead of the 8-bit values 145, 148, 140, ..., store 145 followed by the differences 3, -8, ..., which fit in fewer bits)
• Image pyramids to reduce the representation with scale
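A small sketch reproducing the size arithmetic above and the prediction-coding idea (assuming NumPy; the DN sequence is the example from the text):

```python
import numpy as np

def scanned_size_mb(side_inches: float, pixel_mm: float, bands: int = 1) -> float:
    """Approximate file size of a scanned photo at 8 bits (1 byte) per band."""
    side_mm = side_inches * 25.4
    pixels_per_side = side_mm / pixel_mm
    return (pixels_per_side ** 2) * bands / 1e6  # bytes -> MB

print(scanned_size_mb(9, 0.05))             # ~20.9 MB
print(scanned_size_mb(9, 0.0125))           # ~334.4 MB
print(scanned_size_mb(9, 0.0125, bands=3))  # ~3x larger for an RGB scan

# Prediction (delta) coding: store the first DN, then differences between
# neighbours, which are usually small and need fewer bits.
dns = np.array([145, 148, 140], dtype=np.int16)
deltas = np.concatenate(([dns[0]], np.diff(dns)))  # [145, 3, -8]
restored = np.cumsum(deltas)                        # back to [145, 148, 140]
print(deltas, restored)
```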
Image Analysis
MORRO BAY, CALIFORNIAThe chosen subscene lies along the central California coast about half way between San Francisco and Los Angeles, in the county of San Luis Obispo. This subscene was extracted from Landsat 5 Thematic Mapper scene 5026-31810 (Path 043; Row 035) acquired on November 19, 1984. The actual data set used is part of the Education sampler offered to users by the Earth Observation Satellite (Eosat) Corp., 4300 Forbes Blvd., Lanham, Maryland 20706 (the company has now merged and is known as SpaceImaging-Eosat)
The image is an aerial oblique photograph taken in mid afternoon (note shadows) on January 25, 1988 looking east at the northern part of Morro Bay towards the hills in the background. The town of Morro Bay is situated in the center of the photo.
Thematic Mapper Bands
Band No. | Wavelength Interval (µm) | Spectral Response | Resolution (m)
1 | 0.45 - 0.52 | Blue-Green | 30
2 | 0.52 - 0.60 | Green | 30
3 | 0.63 - 0.69 | Red | 30
4 | 0.76 - 0.90 | Near IR | 30
5 | 1.55 - 1.75 | Mid-IR | 30
6 | 10.40 - 12.50 | Thermal IR | 120
7 | 2.08 - 2.35 | Mid-IR | 30
Thematic Mapper Bands
Let us start our exercise in image interpretation. In as much as this could be your first big effort at this, we suggest you now become familiar with this Landsat subscene as it is rendered in all 7 TM bands.
(Figure panels: Band 1, Band 2, Band 3; Band 4, Band 5, Band 7; Band 1, Band 6.)
Feature Identification Map
• Point a suggests actual variations in sediment load which are distributed within the more sheltered Morro Bay, especially at b, which is located just beyond the point of entry of a small river.
• The darker patches (as at c) that interrupt this bar are coastal vegetation that tends to have low reflectances in all bands.
Feature Identification Map(cont.)
• Slopes (as at d) whose faces incline toward the sun, i.e., are more likely to be at high angles to solar rays, will reflect a high percentage of light back towards the sensor on the spacecraft. These surfaces will appear brighter (lighter tones). Slopes (as at e and f) that lie away from the sun appear darker; if the slopes are steep, and/or the sun angle is lower, the slopes tend to be progressively darker.
• At higher elevations vegetation is greener and denser, as at (g).
Feature Identification Map(cont.)
• Bedrock of granite is exposed in places within the scene; the best example is at (h). At (i), long thin lines of dark tones correspond to vegetation, mostly trees and bushes, that cluster along the banks of narrow, probably intermittent, streams in the hills. Other, similar tones occur elsewhere (as at j). Points l, m, and n represent lowlands. Point (o) lies at the end of the small river (p) that courses across the terrain in the valley through which Highway 1 passes.
True Color View
• In practice, various color mapping algorithms are used to facilitate visual interpretation of an image, while analytical treatment usually works with the original DN (digital number) values of the pixels.
• The ocean appears too dark in the TM composite. This is due to the generally low brightness of water in all of these bands.
TM Band 1 = blue, TM Band 2 = green, TM Band 3 = red
False Color View
• Streets and other highways appear in blue tones, which is expected because they are especially light-toned in band 2 (assigned here to blue).
• Red expresses the vegetation.
• Medium grayish-browns are found mainly along the sun-facing slopes.
• The ocean and the bay are deep blue.
• The breakers are presented in mottled blue and white patterns.
TM Band 4 = red, TM Band 3 = green, TM Band 2 = blue
Other Color Combinations
TM Band 6 = red, TM Band 7 = green, TM Band 5 = blue
TM Band 1 = red, TM Band 7 = green, TM Band 4 = blue
IP Overview and Introduction
• Image restoration:
– errors during scanning, transmission, recording
– noise (filters)
– geometric corrections
– atmospheric scattering (haze)
• Image enhancements (change the original image):
– contrast enhancement (histogram stretching)
– IHS transformation
– edge enhancement
– mosaics and stereo images
• Information extraction:
– multispectral classification
– PCA
– ratio images
– change detection images
• Applications
Image Quality Assessment
Many remote sensing datasets contain high- quality, accurate data. Unfortunately, sometimes error (or noise) is introduced into the remote sensor data by:
• the environment (e.g., atmospheric scattering), • random or systematic malfunction of the remote
sensing system (e.g., an uncalibrated detector creates striping), or
• improper airborne or ground processing of the remote sensor data prior to actual data analysis (e.g., inaccurate analog-to-digital conversion).
Image Quality Assessment
• Therefore, the person responsible for analyzing the digital remote sensor data should first assess its quality and statistical characteristics. This is normally accomplished by:
• looking at the frequency of occurrence of individual brightness values in the image displayed in a histogram
• viewing on a computer monitor individual pixel brightness values at specific locations or within a geographic area,
• computing univariate descriptive statistics to determine if there are unusual anomalies in the image data, and
• computing multivariate statistics to determine the amount of between-band correlation (e.g., to identify redundancy).
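A minimal sketch of these checks (assuming NumPy; the two bands are synthetic stand-ins for real sensor data):

```python
import numpy as np

# Illustrative two-band image; real data would come from the sensor file.
rng = np.random.default_rng(0)
band1 = rng.integers(0, 256, size=(100, 100))
band2 = (0.8 * band1 + rng.normal(0, 10, band1.shape)).clip(0, 255)

# Frequency of occurrence of individual brightness values (histogram).
hist, _ = np.histogram(band1, bins=256, range=(0, 256))
print(hist.argmax(), hist.max())  # most frequent DN and its count

# Univariate descriptive statistics to spot anomalies.
print(band1.min(), band1.max(), band1.mean(), np.median(band1), band1.std())

# Between-band correlation: values near 1 indicate redundant bands.
print(np.corrcoef(band1.ravel(), band2.ravel())[0, 1])
```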
Remote Sensing Metadata
• Metadata is "data or information about data". Most quality digital image processing systems read, collect, and store metadata about a particular image or sub-image. It is important that the image analyst have access to this metadata information. In the most fundamental instance, metadata might include:
• the file name, date of last modification, level of quantization (e.g., 8-bit), number of rows and columns, number of bands, univariate statistics (minimum, maximum, mean, median, mode, standard deviation), perhaps some multivariate statistics, geo-referencing performed (if any), and pixel size.
Radiometric Correction
– Radiometric corrections are not as simple as geometric corrections and are more sensor- and scene-dependent.
– Multiple detectors in the sensors have slight physical variations which mean that there will be variations in the radiance recorded by different detectors even if they are looking at the same object.
– Software modules are used to perform the radiometric correction.
Noise in Images (figure examples: a dropped line and striping)
Spatial Filtering
• Spatial enhancement modifies pixel values based on the values of surrounding pixels.
• Filtering is the process of averaging small sets of pixels across an image.
• A convolution kernel is a matrix of numbers that is used to average the value of each pixel with the values of surrounding pixels in a particular way.
Spatial Filtering (cont.)
Convolution filtering is a common mathematical method of implementing spatial filters. In this, each pixel value is replaced by a weighted average over a square area (matrix) centered on that pixel. Square sizes typically are 3 x 3, 5 x 5, or 9 x 9 pixels, but other values are acceptable.
Spatial Filtering (cont.)
2 8 6 6 62 8 6 6 62 2 8 6 62 2 2 8 6
2 2 2 2 8
-1 -1 -1-1 16 -1-1 -1 -1
2 8 6 6 62 8 6 6 62 2 11 6 62 2 2 8 6
2 2 2 2 8
Data
Kernel
Output
Spatial Filtering (cont.)
To compute the output value for this pixel, each value in the kernel is multiplied by the image pixel value that corresponds to it, these products are summed, and the total is divided by the sum of the values in the kernel, as shown:
output = int( [(-1×8) + (-1×6) + (-1×6) + (-1×2) + (16×8) + (-1×6) + (-1×2) + (-1×2) + (-1×8)] / [-1 + -1 + -1 + -1 + 16 + -1 + -1 + -1 + -1] ) = int( (128 - 40) / (16 - 8) ) = int(88 / 8) = int(11) = 11
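The same computation can be written as a short script (assuming NumPy; this reproduces the worked example above for the centre pixel only, not a full image convolution):

```python
import numpy as np

data = np.array([[2, 8, 6, 6, 6],
                 [2, 8, 6, 6, 6],
                 [2, 2, 8, 6, 6],
                 [2, 2, 2, 8, 6],
                 [2, 2, 2, 2, 8]])

kernel = np.array([[-1, -1, -1],
                   [-1, 16, -1],
                   [-1, -1, -1]])

def convolve_pixel(image, kernel, row, col):
    """Kernel-weighted sum over the 3x3 neighbourhood, normalised by the
    sum of the kernel values, then truncated to an integer (as in the text)."""
    window = image[row - 1:row + 2, col - 1:col + 2]
    return int((window * kernel).sum() / kernel.sum())

print(convolve_pixel(data, kernel, 2, 2))  # -> 11, matching the worked example
```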
Spatial Filtering (cont.)
• The kernel used in this example is a high-frequency kernel, in which relatively higher values become higher and relatively lower values become lower, thus increasing the spatial frequency of the image.
• High-frequency kernels serve as edge enhancers, since they bring out the edges between homogeneous groups of pixels.
• Zero-sum kernels serve as edge detectors, which highlight edges and eliminate other features.
• Low-frequency kernels make pixel values more homogeneous.
Spatial Filtering (cont.)
A lowpass (mean) filter product tends to generalize the image.
An edge-enhancement filter highlights abrupt discontinuities.
A zero-sum filter: details in the water are largely lost, and much of the image is flat.
Image Enhancement
Now that we are familiar with the individual TM bands and color composites showing our study image, we need to investigate two of the most common image processing routines applied to improving scene quality. The first, stretching, has already been used on all the TM images we have looked at so far to improve their quality for your inspection; different stretching options are described next. The other, filtering, will be evaluated shortly.
Contrast Stretching
The process of reassigning a range of values to another range, usually according to a linear function. Contrast stretching is often used in displaying continuous raster layers, since the range of data file values is usually much narrower than the range of brightness values on the display device.
Contrast Stretching (cont.)
• To illustrate contrast stretching (also called auto scaling) we will apply the function to Band 3.
Most of the values, however, lie between DNs of 9 and 65 (there are values up to 255 in the original scene but these are few in number).
Contrast Stretching (cont.)
We can perform a simple linear stretch so that 9 goes to 5 and 65 to 255, with all values in between stretched proportionately. This produces the expanded histogram shown here, and next to it, the resulting new image.
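A minimal sketch of the linear stretch just described, mapping DN 9 to 5 and DN 65 to 255 (assuming NumPy; the sample DNs are illustrative):

```python
import numpy as np

def linear_stretch(dn, in_min=9, in_max=65, out_min=5, out_max=255):
    """Linearly map DNs in [in_min, in_max] onto the display range.

    Values below/above the input range are clipped, and everything in
    between is stretched proportionately.
    """
    dn = np.clip(dn.astype(np.float64), in_min, in_max)
    scaled = (dn - in_min) / (in_max - in_min) * (out_max - out_min) + out_min
    return scaled.astype(np.uint8)

band3 = np.array([9, 20, 37, 65, 200], dtype=np.uint8)
print(linear_stretch(band3))  # [  5  54 130 255 255]
```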
Haze removal (haze is due to atmospheric scattering)
(Figure: scatter plot of digital numbers for band 7, shortwave IR, against digital numbers for band 4, near IR, showing an offset along the band 4 axis.)
The lack of low DNs in band 4 is caused by illumination from light scattered by the atmosphere; the digital numbers for band 7 are essentially free of atmospheric errors.
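The offset visible in the plot is commonly removed by dark-object subtraction: subtract the smallest DN in the affected band from every pixel. A minimal sketch (assuming NumPy; the sample DNs are illustrative, and this is only one simple way to approximate a haze correction):

```python
import numpy as np

def dark_object_subtraction(band: np.ndarray) -> np.ndarray:
    """Subtract the band's minimum DN (the atmospheric 'offset').

    Haze from atmospheric scattering adds a roughly constant brightness,
    so even the darkest targets (e.g. deep water in the near IR) have
    DNs above zero; removing that offset approximates a haze correction.
    """
    offset = int(band.min())
    return (band.astype(np.int32) - offset).clip(0, 255).astype(np.uint8)

band4 = np.array([[18, 22, 60], [19, 45, 120]], dtype=np.uint8)
print(dark_object_subtraction(band4))  # darkest pixel pulled down to 0
```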
Unsupervised Classification
In an unsupervised classification, the objective is to group multi-band spectral response patterns into clusters that are statistically separable. Thus, a small range of digital numbers (DNs) for, say, 3 bands can define one cluster that is set apart from a specified range combination for another cluster (and so forth). Separation will depend on the parameters chosen to differentiate the clusters.
Unsupervised Classification (cont.)
This can be visualized with the aid of this diagram, taken from Sabins, Remote Sensing: Principles and Interpretation, 2nd ed., for four classes: A = Agriculture; D = Desert; M = Mountains; W = Water.
Unsupervised Classification (cont.)
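The slides do not name a specific clustering algorithm; k-means is one commonly used choice for grouping multi-band spectral response patterns. A minimal sketch (assuming NumPy; the pixel vectors are random stand-ins for real band data):

```python
import numpy as np

def kmeans(pixels: np.ndarray, k: int, iters: int = 20, seed: int = 0) -> np.ndarray:
    """Very small k-means clusterer for multi-band pixel vectors.

    pixels: (n_pixels, n_bands) array of DNs; returns a cluster label per pixel.
    Statistically separable spectral response patterns end up in different clusters.
    """
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each pixel to its nearest cluster centre, then update centres.
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean(axis=0)
    return labels

# Illustrative 3-band pixel vectors (e.g. TM bands 2, 3, 4) grouped into 4 clusters.
pixels = np.random.default_rng(1).integers(0, 256, size=(500, 3)).astype(float)
print(np.bincount(kmeans(pixels, k=4)))  # pixels per cluster
```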
Supervised Classification
Supervised classification is much more effective in terms of accuracy in mapping meaningful classes, but its validity depends largely on the knowledge and skills of the image specialist. The strategy is simple: conventional (real and familiar) or otherwise meaningful classes are recognized in the scene from prior knowledge, such as personal experience with the region in question, or by identification using thematic maps or actual on-site visits. This allows one to choose and set up discrete classes (thus supervising the selection) to which identifying category names are then assigned.
Supervised Classification (cont.)
Training sites, areas representing each known land cover category that appear fairly homogeneous on the image (as determined by similarity in tone or color within shapes delineating the category), are located and circumscribed by polygonal boundaries drawn (using the computer mouse) on the image display.
Classification now proceeds by statistical processing in which every pixel is compared with the various spectral signatures and assigned to the class whose signature comes closest (a few pixels in a scene do not match and remain unclassified; these may belong to a class not recognized or defined).
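The decision rule is not specified in the slides; a simple example is the minimum-distance-to-means rule, which assigns each pixel to the class whose mean training signature is closest. A minimal sketch (assuming NumPy; the signatures and pixels are invented, and the unclassified-pixel threshold mentioned above is omitted for brevity):

```python
import numpy as np

def classify_min_distance(pixels: np.ndarray, signatures: dict) -> list:
    """Assign every pixel to the class whose mean spectral signature is closest.

    pixels: (n_pixels, n_bands); signatures: {class_name: mean DN vector}.
    This is a minimum-distance-to-means rule, one of several decision rules
    that can be applied once training-site signatures have been collected.
    """
    names = list(signatures)
    means = np.array([signatures[n] for n in names], dtype=float)
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return [names[i] for i in dists.argmin(axis=1)]

# Illustrative mean signatures (DNs in three bands) derived from training sites.
signatures = {"water": [15, 10, 8], "vegetation": [30, 25, 90], "urban": [80, 85, 70]}
pixels = np.array([[14, 12, 9], [35, 28, 95], [78, 80, 66]], dtype=float)
print(classify_min_distance(pixels, signatures))  # ['water', 'vegetation', 'urban']
```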
Supervised Classification(cont.)
For the first attempt at a supervised classification, 13 classes have been chosen. The outlines of their training sites are traced on the true color (bands 1, 2, 3) composite, and here are the spectral signatures.
Supervised Classification (cont.)
And finally we can get a map!