
What is Remote Sensing?

Remote sensing is used to obtain information about objects. Data is collected with an instrument and then analysed. The instrument used is
not in direct contact with the object. The platforms used are located "at a distance" from the earth's surface (for example, aircraft and
satellites). These platforms carry sensors to observe and study the earth, its land surface, the oceans, the atmosphere and the earth's dynamics
from space (ESA Eduspace).

This tutorial teaches the basics of remote sensing:

Physical basics: the electromagnetic spectrum, atmospheric influences and spectral reflectance properties

Satellite systems with different sensors and orbits, followed by an overview of exemplary earth observation

Geometric, spectral, radiometric and temporal resolutions

Visual image interpretation of satellite images

Image processing and enhancement techniques

Classification techniques like unsupervised and supervised classification


What are the blue circles?

Source: infres.enst.fr

Is it alien-made? What are the blue circles?

Answer (think first!):

The blue circles represent irrigation fields in the desert of Libya near a city (in the upper left of the image). They
appear blue because the satellite image is a false colour composite.

1. Physical Basics
The electromagnetic spectrum
Light and radiation are only some forms of electromagnetic energy. The human eye can only see the part of the
electromagnetic spectrum that contains the spectral colours; but our skin can also sense temperature differences.
Electromagnetic radiation is one form of energy propagation. It is measured as wave radiation and characterised by
frequency or wavelength. The radiation propagates with the velocity of light (more about frequency, wavelength and the
velocity of light can be found in the tutorial Understanding Spectra from the Earth).
The electromagnetic spectrum can be described in terms of frequency (given in hertz) or wavelength (given in micrometres,
millimetres or metres), see figure below.
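The two descriptions are linked by c = λ · ν, where c is the velocity of light. A minimal sketch (the function name and the example wavelength are our own illustration, not from the tutorial):

```python
C = 3.0e8  # velocity of light in m/s

def wavelength_to_frequency(wavelength_m):
    """Return the frequency in Hz for a wavelength given in metres (c = lambda * nu)."""
    return C / wavelength_m

# Green light at 0.55 micrometres:
f = wavelength_to_frequency(0.55e-6)
print(f"{f:.2e} Hz")  # 5.45e+14 Hz
```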

The electromagnetic spectrum is divided into several sections, starting with a very short wavelength and high frequency range,
i.e. x-rays (around 0.01 μm). Ultraviolet radiation with wavelengths of around 0.1 μm (1 μm = 10⁻⁶ m) follows. Visible
light, the part visible to the human eye, spans from 0.38 μm to 0.78 μm and ranges from the
colours violet, blue and green through yellow and orange to red. Beyond this spectral part, one finds the infrared wavelengths, followed
by even longer wavelengths such as microwaves and radio waves. Infrared wavelengths are divided into near infrared, mid
infrared and thermal infrared (more information can be found in the tutorial Understanding Spectra from the Earth).
Sources of electromagnetic radiation are the sun, the earth with its infrared radiation, and also active satellite sensors.

The electromagnetic spectrum and atmospheric transmittance.

Source: Albertz 2007 with modifications

1. Physical Basics
Radiation principles (1/2)
Radiation principles are important for understanding thermal radiation, which is emitted by any object depending on
its temperature and also its material properties. The efficiencies of absorption and emission of radiation are material properties
which need consideration here, and their dependence on each other is given by Kirchhoff's law.
The temperature dependence of emitted radiation follows the Stefan-Boltzmann law. Radiation is emitted in the form of
electromagnetic waves, their intensity being a function of the wavelength. The emission maximum of thermal radiation is
explained by Wien's displacement law, while the shape of the entire emission spectrum is given by Planck's law.
Kirchhoff's law
The emission efficiency ε denotes the efficiency of an object to emit thermal radiation, a quantity varying between 0 (no
emission at all) and 1 (highest possible emission). The absorption efficiency α of an object denotes its efficiency to absorb
incident radiation. It is defined as
α = absorbed radiation / incident radiation
and varies between 0 and 1, whereby 1 corresponds to total absorption and 0 to total reflection or transmission.
Kirchhoff's law, formulated in 1859, states:
the efficiencies of absorption and emission of an object are the same. Therefore, objects which absorb all incident radiation
(α = 1) have the highest thermal emission efficiency (ε = 1). They are denoted as black body emitters, where the
term black indicates that there is no reflected radiation. Objects which absorb only a fraction of the incident radiation (α < 1)
are denoted as grey body emitters.
Absorption and emission efficiencies are wavelength dependent in the case of a selective emitter. High or low absorption
efficiencies of an object in different spectral ranges go along with high or low emission efficiencies in the same spectral
ranges. Therefore, a generalised Kirchhoff's law can be written as:

ε(λ) = α(λ)
The Earth in visible light (left) and in the thermal infrared (right) seen by Meteosat in 2004.
Source: Beckel 2007
Stefan-Boltzmann law
This law, found experimentally by Josef Stefan in 1879 and derived theoretically by Ludwig Boltzmann in 1884, explains the
temperature dependence of the intensity of thermal radiation emitted by an object. It increases strongly with increasing
absolute temperature T, given in kelvin (K). The radiant exitance M, which is the radiative power emitted by the surface of an
object and given in units of W/m², is:
M = σT⁴
with the Stefan-Boltzmann constant σ = 5.7·10⁻⁸ W/(m²K⁴). For example, increasing the absolute temperature of an object by a
factor of two results in a 16 times stronger emission of thermal radiation.
This temperature change is also connected with a changing emission spectrum, which is explained by Planck's law (see the
following page).
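The factor-of-16 example can be checked numerically. A minimal sketch, using the more precise value σ = 5.67·10⁻⁸ W/(m²K⁴):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant in W/(m^2 K^4)

def radiant_exitance(T):
    """Radiant exitance M = sigma * T^4 of a black body at absolute temperature T (K)."""
    return SIGMA * T ** 4

# Doubling the absolute temperature increases emission by a factor of 2^4 = 16:
ratio = radiant_exitance(600.0) / radiant_exitance(300.0)
print(round(ratio))  # 16
```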

1. Physical Basics
Radiation principles (2/2)
The Planck radiation law
In 1900, Max Planck introduced the concept of quantised properties of light, i.e., photons. This became the starting point of
quantum theory. In the framework of classical physics and on the basis of electromagnetic waves, it had not been possible to
understand the physics of thermal radiation. Using the concept of photons, Planck derived an equation which describes the
intensity and spectral shape of the thermal radiation of an ideal black body emitter. Omitting some physical constants, it reads:
M_λ ∝ λ⁻⁵ / {exp(hc/λkT) − 1}
where λ is the emission wavelength and T is the absolute temperature. Here h = 6.63·10⁻³⁴ W s² is Planck's constant, c the velocity
of light, and k = 1.38·10⁻²³ W s/K Boltzmann's constant.
Obviously, emission spectra of ideal black bodies depend on their temperature only, and material properties do not interfere.
Planck's law therefore holds for solids, liquids and gases if their absorbance is α = 1 (i.e., they absorb all incident radiation).
Examples of Planck curves are shown in the graph on the right. Emission spectra of grey body emitters can be calculated
using their spectral emission efficiency, i.e., M_λ,grey = ε(λ) · M_λ,black.
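The proportionality becomes an absolute spectral exitance when the omitted constants are restored, M_λ = 2πhc²λ⁻⁵ / {exp(hc/λkT) − 1}. A minimal sketch of both the black body and the grey body case (function names are our own):

```python
import math

H = 6.63e-34   # Planck's constant in W s^2
C = 3.0e8      # velocity of light in m/s
K = 1.38e-23   # Boltzmann's constant in W s / K

def planck_exitance(wavelength, T):
    """Spectral radiant exitance of a black body (W per m^2 per m of wavelength)."""
    a = 2 * math.pi * H * C ** 2 / wavelength ** 5
    b = math.exp(H * C / (wavelength * K * T)) - 1.0
    return a / b

def grey_body_exitance(wavelength, T, epsilon):
    """A grey body scales the black body spectrum by its emission efficiency epsilon."""
    return epsilon * planck_exitance(wavelength, T)

# The 300 K Earth emits far more strongly at 10 um than at 1 um:
print(planck_exitance(10e-6, 300.0) > planck_exitance(1e-6, 300.0))  # True
```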
Wien's displacement law
Already in 1893, Wilhelm Wien derived an equation which allows one to calculate the wavelength of maximum intensity λ_max of a
black body emission spectrum as a function of the temperature T:
λ_max · T = const.
where the value of the constant is 0.29 cm K. Hence, high temperatures correspond to maxima at short wavelengths and vice versa.

Emission as a function of wavelengths of objects having different absolute temperatures.

Source: ESA Eduspace with modifications

The sun, with its high surface temperature of about 6,000 K, emits visible light with a maximum at
around λ_max = 0.5 μm. The Earth, with an ambient temperature of 300 K, emits predominantly in the mid infrared with a
maximum around 10 μm; this spectral range is called the thermal infrared.
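Both of these peak wavelengths follow directly from Wien's displacement law. A short sketch (2.9·10⁻³ m K is the standard value of the constant):

```python
WIEN_B = 2.9e-3  # Wien's displacement constant in m K (~0.29 cm K)

def peak_wavelength(T):
    """Wavelength of maximum black body emission (m) at absolute temperature T (K)."""
    return WIEN_B / T

print(round(peak_wavelength(6000.0) * 1e6, 2))  # 0.48 micrometres: the sun, visible light
print(round(peak_wavelength(300.0) * 1e6, 2))   # 9.67 micrometres: the earth, thermal infrared
```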

1. Physical Basics
Atmospheric influences
All radiation is influenced by the atmosphere in various ways. The sun's radiation is scattered, reflected or absorbed by
particles in the atmosphere, as is the radiation reflected by the earth. Clouds are the worst interference for radiation and make it
impossible for passive satellite sensors to measure the Earth's surface.
Atmospheric influences are wavelength dependent. In the range of visible light, where the sun emits its highest intensities,
atmospheric transmittance is highest (see figure). At longer wavelengths, transmittance is reduced to narrow
bands. These include the optical windows in the thermal infrared, where the Earth's surface emits radiation. In the range of
microwaves the atmosphere is nearly fully transparent, but the sun's and earth's radiation are low; therefore, this range is used by
active radar systems. Wavelengths shorter than the ultraviolet are almost totally absorbed by the atmosphere and are
therefore less relevant for remote sensing. Remote sensing concentrates on the transmissive ranges, the so-called
atmospheric windows.
Atmospheric scattering denotes the diffusion of radiation by particles in the atmosphere.
Rayleigh scattering is a diffuse scattering caused by tiny particles and molecules (like nitrogen or oxygen) smaller in
diameter than the wavelength of the interacting radiation. Short wavelengths of the sunlight are more intensely scattered
than radiation at longer wavelengths.

Why is the sky blue?

The shorter (blue) wavelengths of sunlight are scattered more strongly than other visible wavelengths. For this reason
the sky appears blue. At sunset the sunlight has to travel a greater distance through the atmosphere, and therefore part of its
blue light is lost due to scattering. Hence, the sky at sunset appears red.

Rayleigh scattering also causes "haze" and contrast reductions in images. In colour photographs it leads to a bluish-grey cast
over the image.
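The "shorter wavelengths scatter more strongly" statement can be quantified: Rayleigh scattering intensity is proportional to λ⁻⁴. A small illustrative sketch (the λ⁻⁴ dependence is standard physics; the example wavelengths are our own choices):

```python
def rayleigh_ratio(l_short, l_long):
    """Relative Rayleigh scattering intensity of two wavelengths (proportional to 1/lambda^4)."""
    return (l_long / l_short) ** 4

# Blue light (0.45 um) versus red light (0.65 um):
print(round(rayleigh_ratio(0.45e-6, 0.65e-6), 1))  # 4.4, i.e. blue scatters ~4x more strongly
```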
Mie scattering is caused by particles in the atmosphere which are larger in diameter than the considered radiation
wavelengths. Examples are water droplets in clouds, ice crystals and aerosols (sea salt, dust, biological material, sulphate, nitrate
etc. from vaporisation, fires including forest fires, volcanic eruptions and industry). This scattering is less wavelength selective than
Rayleigh scattering, which explains the white colour of clouds and the grey appearance of dust.

The spectrum of electromagnetic waves and the transmittance of a clear cloud-free atmosphere.
Source: Albertz (2007) with modifications

In contrast to scattering, absorption means an effective loss of radiative energy and is mostly caused by water vapour,
carbon dioxide and ozone. The absorption of all gases strongly depends on the wavelength and determines the atmospheric
windows, i.e. the range of non-blocked spectral regions.
Two major aspects must be considered for any remote sensing task: the primary sources of electromagnetic
radiation (sun and earth) and the atmospheric windows. This information is needed to select the spectral sensitivity of the
sensors used to detect and record radiation (see figure).

1. Physical Basics
Interaction of radiation with the earth's surface
Electromagnetic radiation incident on the surface of a body is partly reflected, partly absorbed and partly transmitted, depending
on the radiation wavelength and the material and surface conditions of the body. The distinctness of different bodies allows us to
distinguish between them on a satellite image.
Besides the angle of incidence, it is primarily the surface roughness which determines the way an object reflects
radiation. One can distinguish different types of reflectance:

Specular: flat surfaces reflect like a mirror (where the angle of reflection equals the angle of incidence)

Diffuse (or Lambertian): rough surfaces reflect uniformly in all directions

Specular and diffuse reflectance.

Most earth surfaces are neither perfectly specular nor diffuse and lie somewhere between the two extremes.
The type of reflectance depends on the surface's roughness and the wavelength of incident radiation reaching the
surface. Wavelengths which are smaller than the surface height variations lead to a diffuse reflectance.
Diffuse reflectances of earth surfaces are very important in remote sensing because only diffuse reflections contain spectral
information on the "colour" of the reflecting surface. Specular reflections do not (Lillesand, Kiefer 2004).
Supplement: Applications of different bands or wavelength ranges

Application of Different Spectral Ranges

Bands

Blue (0.45 - 0.52 μm): Detection of water bodies, coastal waters, distinction between vegetation and soil,
deciduous and coniferous forest (different forest types), identification of different
vegetation and land use classes, buildings etc., measuring water pollution and
plankton (which permit conclusions about the abundance of fish).

Green (0.52 - 0.60 μm): Detection of vegetation and its vitality as well as different land use classes.

Red (0.63 - 0.69 μm): Absorption by the chlorophyll of plants for detection of different plant species, soil
types and mineral contents; use in geological applications.

Near IR (0.76 - 0.90 μm): Mapping of biomass and vitality of vegetation, distinction between different
vegetation species, detection of soil moisture.

Mid IR (1.55 - 1.75 μm): Detection of vegetation and soil moisture, separation of clouds, snow and ice.

Thermal IR (10.4 - 12.5 μm): Detection of surface temperatures for city and terrain climatic analysis, registration
of vegetation damage and soil moisture, pedological and geological analysis.

Mid IR (2.08 - 2.35 μm): Distinction of different minerals for geological applications and registration
of moisture patterns like soil and vegetation moisture.

Band combinations

Red/Blue: Detection of iron oxide in soils and of soil and vegetation moisture.

Red/Near IR: Creation of vegetation indices, e.g. NDVI (Normalized Difference Vegetation Index);
registration of vegetation vitality and density as well as biomass analysis.

Mid IR/Near IR: Detection of clay minerals in soils, distinction between fallow land and sealed areas.

Mid IR/Mid IR: Detection of minerals containing iron in soils, distinction between fallow and sealed areas.

C Band (5.6 cm): Detection of sea surface roughness (find out more in the tutorial about Marine
Pollution); can provide information about wind speed and direction, important for
monitoring the formation of storms or flood events. Also used in climate
modelling and in route planning for airlines and shipping companies.

1. Physical Basics
Spectral Reflectance Properties
Remote sensing is based on the measurement of reflected or emitted radiation from different bodies. Objects
having different surface features reflect or absorb the sun's radiation in different ways. The reflectance properties of an object
depend on the particular material and its physical and chemical state (e.g. moisture), the surface roughness as well as the
geometric circumstances (e.g. incidence angle of the sunlight). The most important surface features
are colour, structure and surface texture.
These differences make it possible to identify different earth surface features or materials by analysing their spectral
reflectance patterns or spectral signatures. These signatures can be visualised in so-called spectral reflectance
curves as a function of wavelength. The figure on the right shows typical spectral reflectance curves of three basic
types of Earth features: green vegetation, dry bare soil and clear water.
The spectral reflectance curve of healthy green vegetation has a significant minimum of reflectance in the visible
portion of the electromagnetic spectrum resulting from the pigments in plant leaves. Reflectance increases dramatically in the
near infrared. Stressed vegetation can also be detected because stressed vegetation has a significantly lower reflectance in
the infrared.
The Role of Chlorophyll
More about spectral signatures of vegetation can be found in the tutorial Remote Sensing and GIS in Agriculture.

The spectral reflectance curve of bare soil is considerably less variable. The reflectance curve is affected by moisture
content, soil texture, surface roughness, presence of iron oxide and organic matter. These factors are less dominant than the
absorbance features observed in vegetation reflectance spectra.

Spectral signatures of soil, vegetation and water, and spectral bands of LANDSAT 7.
Source: Siegmund, Menz 2005 with modifications

The water curve is characterised by high absorption in the near infrared range and beyond. Because of this
absorption property, water bodies as well as features containing water can easily be detected, located and delineated with
remote sensing data. Turbid water has a higher reflectance in the visible region than clear water. This is also true for waters
containing high chlorophyll concentrations. These reflectance patterns are used to detect algae colonies as well as
contaminations such as oil spills or industrial waste water (more about different reflections in water can be found in the
tutorial Ocean Colour in the Coastal Zone).

2. Satellites
Satellite orbits
One possibility to distinguish between earth observation satellite systems is to look at their orbits. Generally there are
two major orbits, called sun synchronous (or polar) and geostationary orbits.
Geostationary orbits are located about 36,000 km above the earth. At this altitude a satellite needs exactly 24 hours to
orbit the earth, the same time the earth takes to perform a complete revolution around its axis. The satellites "hover"
directly above the equator, so they seem to be stationary in the sky when seen from the earth. Hence, the
satellites always "see" the same section of the earth's surface and atmosphere.
For this reason, time lapse images can be produced visualising changes on the land surface or cloud movements. Due to the
high altitude of the satellite orbit, the geometric resolution is very low. The smallest element that can be distinguished is
about 1 km² in size.
METEOSAT is one example of a geostationary satellite. METEOSAT can take an image every 30 minutes. This high
temporal resolution is a significant advantage when monitoring clouds and weather. Normally these geostationary satellites
are used for weather monitoring and prediction as well as telecommunication and television broadcasting.
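The 36,000 km altitude follows from Kepler's third law, r = (GM·T²/4π²)^(1/3). A sketch using standard values for the earth's gravitational parameter and radius (these constants are not given in the tutorial):

```python
import math

GM = 3.986e14      # Earth's gravitational parameter in m^3/s^2
R_EARTH = 6.378e6  # Earth's equatorial radius in m

def orbit_radius(period_s):
    """Orbit radius from Kepler's third law: r = (GM * T^2 / (4 pi^2))^(1/3)."""
    return (GM * period_s ** 2 / (4 * math.pi ** 2)) ** (1 / 3)

# One sidereal day (~86164 s) gives the geostationary altitude above the equator:
altitude_km = (orbit_radius(86164.0) - R_EARTH) / 1000.0
print(round(altitude_km))  # roughly 35800 km, i.e. about 36,000 km
```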

Geostationary orbit.

The other major orbit of earth observation satellites is the polar or sun synchronous orbit. Satellites in this orbit
provide medium to high resolution images of the whole earth, which are mostly used for environmental monitoring. They
orbit at altitudes of 300 to 1,400 km above the earth. With every satellite orbit, which takes about 90 minutes, the earth
rotates a bit further, so that the satellite "watches" a different section of the earth in narrow bands. Days
or weeks later, the satellite again orbits above the same section. Hence, the temporal resolution of these satellites is limited
compared to geostationary satellites.
As the satellites pass both polar regions with an inclination near 90° (the angle between orbit and equatorial plane), their orbits are
called polar orbits. The term sun synchronous means that the section monitored by the satellites is always illuminated by the
sun in the same way: the satellites fly over a particular section always at the same local time. Recording conditions
stay constant and scenes from different time periods can be easily compared (Albertz 2007, Löffler et al. 2005).
The US LANDSAT series is a well known example of a polar orbiting satellite.

Polar satellite orbit.

2. Satellites
Active and passive satellite sensors
Another possibility to distinguish between earth observation satellites is to compare the sensors used. In general, there
are passive sensors which measure the reflected sunlight or thermal radiation, and active sensors which make use of their
own source of radiation.
Active sensors (for example radar and laser scanners) emit artificial radiation to monitor the earth's surface or
atmospheric features. Radars are imaging instruments, while radar altimeters and scatterometers are non-imaging.
Radar is the abbreviation of Radio Detection and Ranging, a method for the detection and ranging of earth surface
features. Radar satellites use short pulses of electromagnetic radiation in the microwave spectral range; therefore they do not
depend on daylight and are hardly affected by clouds, dust, fog, wind and bad weather conditions. They measure the radar
pulses reflected from the ground, analyse the signal intensity in order to retrieve information on the structure of the earth's
surface, and detect the elapsed time between pulse emission and return. The results can be used to measure distances.
Depending on the satellite mission, different operations and procedures are used to process the signals into viable images.
Advantages and disadvantages of active sensors:

Advantages:

Weather independent: artificial microwave radiation can penetrate clouds, light rain and fog.

Sunlight independent: can be operated day and night.

Radar penetrates vegetation and soil: can gain information about surface layers from mm to m depth.

Can give information about the moisture content of the soil layer.

Various applications: oceanography, hydrology, geology, glaciology, agriculture and forestry.

Disadvantages:

The pulse power is mostly low and can be influenced or interfered with by other radiation sources.

Radar signals contain no spectral information.

Complicated analysis, cost-intensive.

ERS-2 is an example of a European satellite and was launched in 1995 from Kourou, French Guiana, with an ARIANE launch
vehicle. ERS-2 revolves on a polar orbit at an altitude of 785 km with a speed of 7.5 km per second, monitoring a band 100
km wide.
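Radar ranging itself is simple geometry: the pulse travels to the target and back, so the distance is d = c·Δt/2. A minimal sketch (the 5.3 ms delay is an invented example chosen to roughly match the ERS-2 orbit altitude):

```python
C = 3.0e8  # velocity of light in m/s

def radar_range(delay_s):
    """Distance to a target from the round-trip delay of a radar pulse: d = c * t / 2."""
    return C * delay_s / 2.0

# A pulse returning after 5.3 ms corresponds to a target about 795 km away:
print(round(radar_range(5.3e-3) / 1000.0))  # 795 (km)
```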

Active satellite sensor.

Passive satellite sensor.

Passive sensors detect sunlight reflected from the earth and thermal radiation, in the visible and infrared parts of the
electromagnetic spectrum. They do not emit their own radiation, but receive natural light and thermal radiation from the
earth's surface. Most passive sensors use a scanner for imaging, e.g. LANDSAT. Equipped with spectrometers, they
measure signals in several spectral bands simultaneously, resulting in so-called multispectral images which allow numerous
interpretations (Albertz 2007, Löffler et al. 2005).

2. Satellites
Technical data of exemplary earth observation satellites (orbit altitude, repeat cycle, selected sensors and their spatial resolution):

Envisat - 800 km, 35 days
ASAR (Advanced Synthetic Aperture Radar): 30 m, 150 m, 1000 m
MERIS (Medium Resolution Imaging Spectrometer): ocean 1040 m x 1200 m; land & coast 260 m x 300 m
AATSR (Advanced Along Track Scanning Radiometer)
MWR (Microwave Radiometer): 20 km
MIPAS (Michelson Interferometer for Passive Atmospheric Sounding): vertical 3 km, horizontal 3 km x 30 km
SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography): limb vertical 3 x 132 km, nadir horizontal 32 x 215 km

ERS-2 - 780 km, 24 days
GOME (Global Ozone Monitoring Experiment): 40 km²; 40 x 320 km²
ATSR (Along Track Scanning Radiometer): 1 km²
RA (Radar Altimeter): 16-20 km
SAR (Synthetic Aperture Radar): 30 m
MWR (Microwave Radiometer): 20 km

GOES 12 - geostationary orbit, images every 30 min
Spatial resolution between 1 km² and 8 km²

IKONOS - 681 km, 3 days
Spatial resolution 82 cm (panchromatic), 3.2 m (multispectral)

IRS-P6 (RESOURCESAT-1) - 817 km, 5 days
LISS IV (Linear Imaging Self-Scanning Sensor): 5.8 m

Landsat 7 - 705 km, 16 days
ETM+ (Enhanced Thematic Mapper Plus): 30 m / 60 m

Meteosat-8 (MSG) - geostationary orbit, images every 15 min
SEVIRI (Spinning Enhanced Visible and Infrared Imager): 3 km²

NOAA - 870 km, 0.5 days
AVHRR (Advanced Very High Resolution Radiometer): 1 km²
HIRS (High Resolution Infrared Radiation Sounder): 20 km

QuickBird - 450 km, 2-3 days
Spatial resolution 61 cm (panchromatic), 2.4 m (multispectral)

RADARSAT-1 - 798 km, 24 days
SAR (Synthetic Aperture Radar): 8-100 m

Spot 5 - 822 km, 26 days
Spatial resolution 2.5, 5, 10 and 20 m
For more information about the various satellites click on the images or links in the first column.


Source: ESA.
In March 2002, the European Space Agency launched Envisat, an advanced polar-orbiting earth observation satellite. It provides
measurements of our atmosphere, oceans, land, and ice. The mission ended in April 2012, following the unexpected loss of contact
with the satellite.
Continuous and coherent global and regional data sets are needed by the scientific and user community in order to better understand climatic
processes and to improve climate models (ESA).
Official web page: http://envisat.esa.int/

Source: ESA.
ERS-2 was launched on April 21, 1995, on an Ariane 4 from ESA's Guiana Space Centre near Kourou, French Guiana. Largely identical to ERS-1 (launched in 1991), it was equipped with additional instruments and included improvements to existing instruments.
ERS-2 carries a comprehensive payload including a Synthetic Aperture Radar (SAR) and a radar altimeter for studying sea surface
temperatures and winds, as well as a sensor to conduct research of atmospheric ozone.
Official web page: http://earth.esa.int/ers/

Source: NASA.
The GOES 12 or M satellite is one of the key providers for U.S. weather monitoring and forecast operations and crucial to NOAA's National
Weather Service operations and modernization program.
Official web page: http://goes.gsfc.nasa.gov/

Source: satimagingcorp.com.
The IKONOS Satellite is a high-resolution satellite operated by GeoEye. Its applications include both urban and rural mapping of natural
resources and of disasters, tax mapping, agriculture and forestry analysis, mining, engineering, construction, and general changes. Its high
resolution data makes an integral contribution to homeland security, coastal monitoring and facilitates 3D terrain analysis.
IKONOS web page: http://www.satimagingcorp.com/satellite-sensors/ikonos.html

Source: ISRO.

IRS-P6 (RESOURCESAT-1) is the most advanced remote sensing satellite built by ISRO (the Indian Space Research Organisation). The tenth
satellite built by ISRO in the IRS series, IRS-P6 is intended not only to continue the remote sensing data services, but also to vastly enhance
the data quality, e.g. with higher spatial resolution.
Official web page: http://www.isro.gov.in/


Source: satimagingcorp.com.
The Landsat Program is a series of earth-observing satellite missions jointly managed by NASA and the U.S. Geological Survey. Since 1972,
Landsat satellites have collected information about earth from space.
The government-owned Landsat 7 was successfully launched on April 15, 1999. The earth observing instrument on Landsat 7, the Enhanced
Thematic Mapper Plus (ETM+), replicates the capabilities of the highly successful Thematic Mapper instruments on Landsats 4 and 5.
Official web page: http://landsat.gsfc.nasa.gov/


Source: NASA.

EUMETSAT operates a fleet of meteorological satellites and their related ground systems to deliver reliable and cost-efficient data, images
and products. These in turn fulfil requirements for weather and climate monitoring, primarily of the national meteorological services in the Member and Cooperating States.
Meteosat Second Generation (MSG) is a significantly enhanced follow-up system to the previous generation of Meteosat. The first MSG
satellite launched was Meteosat-8 in 2002. A second satellite followed in December 2005.
Official web page: http://www.eumetsat.int/Home/Main/Satellites/index.htm?l=en


Source: NASA.
NOAA-N is the latest polar-orbiting satellite developed by NASA for the National Oceanic and Atmospheric Administration (NOAA). NOAA-N will
collect information about earth's atmosphere and environment to improve weather prediction and climate research across the globe.
Severe weather is monitored and reported to the National Weather Service, which broadcasts the findings to the global community. By providing
early warnings, the effects of catastrophic weather events can be minimized.
Official web page: http://www.nasa.gov/mission_pages/noaa-n/


Source: satimagingcorp.com.

QuickBird is a high resolution satellite owned and operated by DigitalGlobe. The satellite is an excellent source for environmental data with
respect to the analysis of changes in land usage, agricultural and forest climates. QuickBird's imaging capabilities can be applied to a host of
industries, including oil and gas exploration & production (E&P), engineering and construction and environmental studies.
QuickBird web page: http://www.satimagingcorp.com/satellite-sensors/quickbird.html


RADARSAT-1 is a sophisticated earth observation satellite developed by Canada to monitor environmental changes and the planet's natural
resources. Launched in November 1995, RADARSAT-1 provides Canada and the world with an operational radar satellite system capable of timely
delivery of large amounts of data. Equipped with a powerful synthetic aperture radar (SAR) instrument, it acquires images of the earth day or
night, in all weather conditions and through cloud cover, smoke and haze.
Official web page: http://www.asc-csa.gc.ca/eng/satellites/radarsat1/
Spot 5

Source: CNES.
SPOT 5 is the fifth satellite in the SPOT series, placed into orbit by an Ariane launcher. Compared to its predecessors, SPOT-5 offers greatly
enhanced capabilities which provide additional cost-effective imaging solutions. The coverage offered by SPOT-5 is a key asset for applications
such as medium-scale mapping, urban and rural planning, oil and gas exploration, and natural disaster management.
Official web page: http://spot5.cnes.fr/

3. Resolution
Display of remote sensing data
Satellite images are not photographs but pictorial presentations of measured data. Satellite systems measure
electromagnetic radiation in different "areas" or bands of the electromagnetic spectrum (e.g. in the visible or infrared range).
Which "Areas" or Bands Are Detected?

Six out of eight bands of the LANDSAT 7 satellite (in grey) and three spectral reflection curves
Own illustration, Source: Siegmund, Menz 2005.
Separating the whole electromagnetic spectrum into different spectral bands has the advantage that these bands can be combined in
various ways, gaining more information compared to purely panchromatic images (with only one band).

In each band, grey values are assigned according to the intensity of the electromagnetic radiation and saved digitally
as pixels. By assigning the three fundamental colours (red, green and blue) to three different bands, satellite image
composites are produced.
As an example, the US satellite LANDSAT 7 possesses 8 bands. With a band combination of 3, 2, 1 an apparently natural
image, or true colour satellite image, is produced.

True colour satellite image of the region of Karlsruhe, Germany, acquired with LANDSAT in 1999.
Source: Landsat
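The band-to-channel assignment described above can be sketched in a few lines of Python (the toy 2 x 2 grey values are invented purely for illustration):

```python
def compose_rgb(red_band, green_band, blue_band):
    """Assign three spectral bands to the red, green and blue channels of each pixel."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red_band, green_band, blue_band)
    ]

# LANDSAT 7 bands 3, 2, 1 (red, green, blue channels) give a true colour composite;
# a false colour composite such as 543 simply assigns other bands to the channels.
band3 = [[120, 80], [90, 60]]   # toy 2x2 grey values standing in for band 3
band2 = [[100, 70], [85, 55]]   # band 2
band1 = [[60, 40], [50, 30]]    # band 1
print(compose_rgb(band3, band2, band1)[0][0])  # (120, 100, 60)
```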

More features or earth surface structures can be detected through other bands or band combinations. These include water
bodies, different vegetation species and land use classes. Earth surface temperatures can be detected with thermal bands as
well. These other band combinations are so-called false colour images. They show land cover types in "false" colours compared to
what we see with our eyes (see the LANDSAT false colour image examples below).

False colour satellite image of LANDSAT 7 of the Rhein-Neckar region, Germany (band combination 543).
Source: Landsat
Besides assigning the three fundamental colours to different bands, mathematical operations applied to the raw data
can provide more information. Artificial bands can be derived, e.g. vegetation indices showing plant vitality expressed
through the NDVI (Normalized Difference Vegetation Index).

3. Resolution
Spatial resolution
Satellite sensors store information about objects as a grid. Digital data is collected from the covered area in the form of
individual image points, so-called pixels. A pixel is the smallest area unit in a digital image.
The size of a pixel depends on the sensor type and determines the resolution of the image. The resolution is
measured as the edge length of a pixel. The higher the resolution and the finer the grid, the more detail is
recognizable on the earth's surface.
The resolutions of today's satellite systems vary from a few centimetres (for example for military use) to kilometres.

Low resolution: larger than 30 m

Medium resolution: 2 - 30 m

High resolution: under 2 m


Low resolution of Western Europe - METEOSAT (1km).

Source: Copyright 2005 EUMETSAT

Different satellites are designed and launched based on their intended use and orbit.

A lower resolution usually coincides with a higher repetition rate, meaning that the satellite observes the same
area within a short interval (METEOSAT 8, for example, every 15 minutes). A "coarse" resolution is used to record large or
global areas for climate-related enquiries, for example the radiation budget of the earth, and for weather
monitoring. Additional applications include the observation of land use, the oceans, their ice cover and
surface temperatures.

Satellites with medium resolution such as LANDSAT 7 are used for the global observation of land surfaces.
Tropical rainforests and their deforestation have been observed by the LANDSAT satellites for more than 30 years.

High resolution data is mainly used for smaller areas of the earth's surface, and has only recently become
available commercially and privately. Satellites such as IKONOS or QuickBird deliver data for topographic and
thematic mapping, for example of land use or vegetation, or as planning resources for cities and large projects.
Data can also be "ordered" in advance: because the satellite sensors can be tilted, repeat intervals are reduced
and the desired areas can be monitored earlier (Albertz 2007, Löffler et al. 2005).

Spatial resolution.
Source: Satellite Imaging Corporation
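The effect of pixel size on recognizable detail can be simulated by degrading a fine image to a coarser grid through block averaging. A sketch using numpy; the 4x4 toy image and the factor are illustrative:

```python
import numpy as np

def coarsen(image, factor):
    """Degrade spatial resolution by averaging non-overlapping pixel blocks.

    A 30 m image coarsened with factor=2 simulates a 60 m sensor: each
    output pixel is the mean of a 2x2 block of input pixels.
    """
    h, w = image.shape
    h, w = h - h % factor, w - w % factor   # trim to a multiple of the factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

fine = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 "30 m" image
coarse = coarsen(fine, 2)                         # 2x2 "60 m" image
print(coarse)
```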

3. Resolution
Spectral resolution
Spectral resolution is defined through the number of spectral bands and their width. Their purpose is to capture the
differences in the reflection characteristics of different surfaces.
While the human eye only recognizes the visible spectrum of light, a satellite can, depending on the type, depict radiation
differently in many spectral areas. The majority of passive earth observation satellites have between three and eight bands
and are therefore called multispectral as for example the American LANDSAT and the French SPOT.

LANDSAT and SPOT spectral resolution.

The higher the spectral resolution, the narrower is the wavelength range for a specific band, and therefore, the more bands
there are. With a higher spectral resolution single objects can be perceived better and spectrally distinguished.
Visible light: In the area of visible light passive satellite sensors are as sensitive as the human eye. Satellites "see" about the
same as a person would see when looking at the earth from an altitude of about 1,000 km. The satellites only capture what is
being lit by the sun.
Infrared sensors measure radiation in the near, middle, and far (thermal) infrared. The data can be converted to
temperatures of the land and ocean surface under cloud-free conditions, and to the temperature at the top of clouds
under cloudy conditions.
Panchromatic sensors detect broadband light across the entire visible range; signal intensities are displayed as grey levels,
i.e., black and white imagery.
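As an illustration, the band ranges of a multispectral sensor can be held in a small lookup table. The values below are the commonly published approximate ranges for LANDSAT 7's ETM+ sensor and are meant as a sketch, not an authoritative specification:

```python
# Approximate ETM+ band edges in micrometres (illustrative values).
ETM_BANDS = {
    1: (0.45, 0.52, "blue"),
    2: (0.52, 0.60, "green"),
    3: (0.63, 0.69, "red"),
    4: (0.77, 0.90, "near infrared"),
    5: (1.55, 1.75, "shortwave infrared"),
    6: (10.40, 12.50, "thermal infrared"),
    7: (2.08, 2.35, "shortwave infrared"),
    8: (0.52, 0.90, "panchromatic"),
}

def band_for(wavelength_um):
    """Return the bands whose wavelength range contains the given value."""
    return [b for b, (lo, hi, _) in ETM_BANDS.items() if lo <= wavelength_um <= hi]

# Green light (0.55 um) falls into band 2 and the broad panchromatic band 8.
print(band_for(0.55))
```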

3. Resolution
Radiometric resolution
The radiometric resolution specifies how well differences in brightness in an image can be distinguished; it is measured
by the number of grey value levels. The maximum number of values is defined by the number of bits (binary
digits). An 8 bit representation allows 256 grey values, a 16 bit representation (ERS satellites) 65,536 grey values.
The finer or the higher the radiometric resolution is, the better small differences in reflected or emitted radiation can be
measured, and the larger the volume of measured data will be (compare with the image on the right).
The advantage of a higher radiometric resolution is rather small - when comparing LANDSAT-MSS (6 bits) and TM (8 bits) the
improvement is in the order of 2-3%.
Radiometric resolution depends on the wavelengths and the type of the spectrometer:

LANDSAT-MSS (from LANDSAT 1-3): 6 bits (64 grey values)

IRS-LISS I-III: 7 bits (128 grey values)

LANDSAT-TM (from LANDSAT 4-5) & SPOT-HRV: 8 bits (256 grey values)

LANDSAT-ETM & ETM+ (from LANDSAT 6-7): 9 bits (only 8 bits are transmitted)

IRS-LISS IV: 10 bits (only 7 bits are transmitted)

IKONOS & QuickBird: 11 bits.

Image examples: grey scales with 4, 16 and 256 values

Grey scale value 4.

Source: ESA
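The number of grey values follows directly from the bit depth: 2 raised to the number of bits. A small check of the figures listed above:

```python
def grey_levels(bits):
    """Number of representable grey values for a given radiometric bit depth."""
    return 2 ** bits

# The sensors listed above, from coarsest to finest quantisation:
for sensor, bits in [("LANDSAT-MSS", 6), ("IRS-LISS I-III", 7),
                     ("LANDSAT-TM / SPOT-HRV", 8), ("IKONOS / QuickBird", 11)]:
    print(f"{sensor}: {bits} bits -> {grey_levels(bits)} grey values")
```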

3. Resolution
Temporal resolution
The temporal resolution is given as the time interval between two identical flights over the same area, also called the
repetition rate. It is determined by the altitude and orbit of the satellite as well as its sensor characteristics (viewing
angle). The repetition rate of earth observing satellites is typically 14-26 days (IKONOS: 14 days, LANDSAT 7:
16 days, SPOT: 26 days). Meteorological satellites such as METEOSAT 8, with 15 minutes, have much shorter repetition rates.
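From the repeat cycles above, a rough number of imaging opportunities per year can be estimated. A sketch; in practice cloud cover further reduces the number of usable optical scenes:

```python
def scenes_per_year(repeat_days):
    """Rough count of flights over the same area per year for a repeat cycle."""
    return 365 // repeat_days

for satellite, days in [("IKONOS", 14), ("LANDSAT 7", 16), ("SPOT", 26)]:
    print(satellite, scenes_per_year(days))
```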

Weather satellite images of Meteosat 7 every half an hour.


Las Vegas over time in 1973, 2000 and 2006.

Source: UNEP

The effective temporal resolution is reduced by clouds for sensors that detect visible or infrared radiation, which does not
penetrate clouds. Since many areas of the earth are often covered by clouds, these areas cannot be properly
depicted when a satellite passes over them.
Images of an area taken at different times (monthly, yearly, and per decade) can be used for multitemporal analysis. They
allow analysis of the following, just to give a few examples:

seasonal changes of vegetation,

the expansion of cities over decades, or

documentation of forest clearance in the tropical rainforest etc.

Landsat orbiting around the Earth.

4. Visual Image Interpretation

Virtually everyone relies on the visual perception of their environment. This experience is also used to interpret images
(in 2D) and three-dimensional structures and specimens.
The visual interpretation of satellite images is a complex process. It includes grasping the meaning of the image content, but
also goes beyond what can be seen in the image in order to recognise spatial and landscape patterns. This process can be roughly
divided into two levels:


The recognition of objects such as streets, fields, rivers, etc. The quality of recognition depends on the expertise in
image interpretation and visual perception.


The actual interpretation, in which conclusions about situations, processes, etc. are drawn from the previously
recognized objects. Subject-specific knowledge and expertise are crucial here.

Interpretation Factors
Essential interpretation factors in recognising objects are: the brightness of an area and the differences in brightness
between areas; the saturation of grey or colour; the form of objects (lines, contour, outline); the image
composition; the size of objects (as represented by the image scale); the texture of surfaces (the structure of an area);
shading or hard shadow; the relative situation of objects (context); and object patterns (irrigation networks, vegetation
patterns, settlement patterns, traffic patterns) (Albertz 2007).
The first step, the recognition of objects and structures, relates to the saying: "I can only recognize in an image what I
already know." Hence, previous knowledge and experience play a very large role in the interpretation process, as only
through subject-specific knowledge can connections be made between the key underlying processes.
Both steps, recognition and interpretation, do not "mechanically" follow one another, but rather run through a repetitive
process, where both steps heavily rely on one another (Albertz 2007).

Schematic Presentation of the Interpretation Process.

Source: Albertz 2007 with modifications

Western Europe in a Satellite Image.

Source: NASA

The Practice of Image Interpretation

Acquisition of documents: Satellite images, maps, etc.

Pre-interpretation: gross distribution, apportionment of the area, etc.

Partial land pre-investigation: Recognition of regional particularities

Detail interpretation: Core of the work: areas will be individually considered, objects will be recognised and
compared to maps. Objects that are easily identifiable are addressed first.

Land Examination / Field Comparison: a method to double check uncertain interpretation results

Depiction of the results: through maps, map-like sketches, thematic mapping, etc.

Interpretation Key
Interpretation keys help with the recognition and interpretation of objects and structures.

Key selection: make a selection of image examples available to visually compare the patterns of the images, and to
recognise similarities.

Elimination keys: a systematic decision tree of object descriptions, offering two or more options at each step.

These keys ensure high objectivity of the interpretation results. However, they only address a specific question,
since "the spatial structures and connections seen in a specific natural or cultural area can rarely be transferred and, if so,
almost always only with limitations" (Löffler et al. 2005).

5. Image Processing
Image processing is a process which makes an image interpretable for a specific use. There are many methods, but only the
most common will be presented here.
Geometric Correction
The geometric correction of image data is an important prerequisite that must be performed before images are used in
geographic information systems (GIS) and other image processing programs. To process the data together with other data or maps in a
GIS, all of the data must share the same reference system. A geometric correction, also called geo-referencing, is a
procedure in which the content of an image is assigned a spatial coordinate system (for example, geographical latitude and
longitude).

Schematic Depiction of Geometric Correction.

In geo-referencing, corresponding image points and control points (pass points) with known coordinates have to be found.
Control points are usually determined with a GPS receiver in the field or taken from maps. Distinctive features such as street
crossings or bridges over water can be identified and their coordinates noted. These points are then matched to the identical
image points of the not yet geo-referenced satellite image. From these correspondences the projection can be computed, with
the help of various additional procedures.
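The matching of control points to image points can be sketched as fitting an affine transform by least squares. The control points below are hypothetical; real geo-referencing also supports higher-order polynomial and projective models:

```python
import numpy as np

def fit_affine(pixel_xy, map_xy):
    """Least-squares affine transform from image to map coordinates.

    pixel_xy, map_xy: (N, 2) arrays of matched control points (N >= 3).
    Returns a 2x3 matrix A so that map = A @ [x, y, 1].
    """
    px = np.asarray(pixel_xy, dtype=float)
    design = np.hstack([px, np.ones((px.shape[0], 1))])          # N x 3
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(map_xy, float),
                                 rcond=None)
    return coeffs.T                                               # 2 x 3

def to_map(A, x, y):
    """Apply the fitted transform to one image point."""
    return A @ np.array([x, y, 1.0])

# Hypothetical control points: image pixels matched to map coordinates
# (a 30 m grid with north-up orientation).
pixels = [(0, 0), (100, 0), (0, 100), (100, 100)]
coords = [(500000, 5400000), (503000, 5400000),
          (500000, 5397000), (503000, 5397000)]
A = fit_affine(pixels, coords)
print(to_map(A, 50, 50))   # centre pixel maps to the centre of the area
```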

Radiometric Correction
System corrections are important when technical defects and deficiencies of the sensor or the data transfer systems lead
to errors in the image data. Causes can be detector failure and/or power failure of individual detectors.
In scanners such as Landsat TM and MSS, where several parallel scan rows (6 or 15) record the same spectral band, a failure
of individual scan rows can occur. These errors always appear at the same intervals and create a characteristic striping
(banding) in the image.
Characteristic striping in a satellite image caused by the failure of scan rows.

Source: Naumann 2008
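A very simple repair of failed scan rows replaces each bad row with the average of its neighbours. This is only a sketch; operational destriping models the gain and offset of each detector:

```python
import numpy as np

def repair_dropped_rows(image, bad_rows):
    """Fill failed scan rows with the mean of the rows above and below."""
    fixed = image.astype(float).copy()
    for r in bad_rows:
        above = fixed[r - 1] if r > 0 else fixed[r + 1]
        below = fixed[r + 1] if r < fixed.shape[0] - 1 else fixed[r - 1]
        fixed[r] = (above + below) / 2.0
    return fixed

img = np.array([[10, 10, 10],
                [ 0,  0,  0],    # a dropped scan row recorded as zeros
                [20, 20, 20]], dtype=float)
print(repair_dropped_rows(img, [1]))
```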

5. Image Processing
Image enhancement
Why do we enhance satellite images? Different methods of image enhancement are used to prepare the "raw data" so that
the actual analysis of images will be easier, faster and more reliable. The choice of method is dependent on the objective
of the analysis. Two processes are presented below:
Histogram Stretches
In digital image processing, the statistics of an image are portrayed in a greyscale histogram (the frequency distribution of grey values).

Histogram of the satellite image on the right side before and after the stretch.
The form of a histogram describes the contrast range of a satellite image and permits conclusions about its homogeneity.
For example, a grey value distribution with a single extreme maximum indicates low contrast, while a broadly stretched
maximum indicates homogeneity in the image together with a larger contrast range.
A histogram stretch is a method of processing the individual grey values in an image to enhance its contrast. Contrast
stretching can be applied in many different ways; the input data are stretched over the entire range of 0-255.
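A linear histogram stretch can be sketched in a few lines. The input grey values below are illustrative:

```python
import numpy as np

def linear_stretch(band, out_min=0, out_max=255):
    """Linear contrast stretch of a single band to the full 0-255 range."""
    band = band.astype(float)
    lo, hi = band.min(), band.max()
    if hi == lo:                       # constant image: nothing to stretch
        return np.full_like(band, out_min, dtype=np.uint8)
    scaled = (band - lo) / (hi - lo) * (out_max - out_min) + out_min
    return np.round(scaled).astype(np.uint8)

# Grey values squeezed into 60-120 gain contrast after the stretch.
raw = np.array([60, 80, 100, 120])
print(linear_stretch(raw))
```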
So-called filter operations change image structures by calculating grey value relations among neighbouring pixels.
The filters use coefficient matrices, which cut a small window out of the original image centred on an individual image
point; the filter matrix then "runs" over the entire image.
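Such a filter operation can be sketched as follows. The 3x3 mean (low-pass) kernel is one example; edge-detecting kernels work the same way, and edge pixels are left unfiltered here for simplicity:

```python
import numpy as np

def filter3x3(image, kernel):
    """Run a 3x3 coefficient matrix over the image (edges left unfiltered)."""
    out = image.astype(float).copy()
    k = np.asarray(kernel, dtype=float)
    for r in range(1, image.shape[0] - 1):
        for c in range(1, image.shape[1] - 1):
            window = image[r - 1:r + 2, c - 1:c + 2]
            out[r, c] = np.sum(window * k)
    return out

# A mean filter smooths out an isolated bright pixel.
mean_kernel = np.full((3, 3), 1.0 / 9.0)
img = np.zeros((3, 3))
img[1, 1] = 9.0
print(filter3x3(img, mean_kernel))
```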


Part of the Rhein-Neckar-Kreis, original data before the stretch (see the histogram at the left).
Source: Landsat

The filter command box in the IDRISI image processing software.

6. Classification
Why do we classify data and satellite images? When a satellite image goes through the process of classification, a land
use classification map is one possible outcome. This map can be more successfully "read" and interpreted with its limited
number of classes for uses such as planning.
Individual elements (pixels) of a satellite image including its values (reflection values and greyscale values) can be referenced
to a specific number of classes (for example, land use classes) for a specific classification.

The classification process

The classification process begins after the acquisition of suitable data. The first step is a visual interpretation of the satellite
image (see figure to the right). After a series of image processing and enhancement operations (such as
radiometric and geometric correction, stretches and filter algorithms), the actual classification begins. This process is
divided into different parts; a decision needs to be made whether an unsupervised and/or a supervised classification will be applied.
Classification methods

Unsupervised classification: purely statistical analysis (cluster analysis) of multispectral data from an area (without
reference areas)

Supervised classification: each object class is assigned reference areas, called training areas. These
areas clearly belong to a specific class and support the statistical classification.

The Classification Process.

Changed after: Naumann 2008
Example of a classification from the SEOS tutorial Land Use and Land Use Change: Land Cover Classification of Tenerife,
1978 and 2002

Land Cover Classification of Tenerife in 1978 and 2002

Tenerife 1978

Satellite image of Tenerife in 1978.

Source: Landsat

Tenerife 2002

Satellite image of Tenerife in 2002.

Source: Landsat

6. Classification
Unsupervised classification

"The unsupervised classification uses the statistical distribution of the pixels within feature spaces exclusively to differentiate
between classes."
What is a feature space?
A feature space and its dimensions are defined by the number of captured bands (spectral ranges). The satellite LANDSAT
7, for example, has six channels, so every captured pixel (image point) from this satellite has six grey values; each pixel
thus has a characteristic vector in a six-dimensional feature space. At most three dimensions can be
displayed, and normally two channels (spectral ranges) are depicted in a graph (see below). Through its
characteristic vector (its attribute values), every pixel occupies a specific place in this "space".

Feature space of channels 3 and 4 of a LANDSAT scene.

Compilations or so called pixel clusters within the feature space represent specific classes. The closer two pixels are to each
other in the feature space, the more similar they are and the greater the possibility that they belong to the same class.
There are two procedures for the classification or grouping of pixels:

Beginning with individual pixels, the most similar pixels will be grouped together step by step. In the end, all of
the pixels will be grouped into classes.

Beginning with all pixels: A certain number of classes may be the objective. This procedure is called clustering.

A good and reliable classifier recognises significant differences between the characteristic vectors, which can then be assigned
to the individual classes. These differences and similarities usually lie in more dimensions than the three that can be displayed.
An unsupervised classification is often used as preparation, i.e. as the first step towards a supervised classification. At this
stage it can be checked whether the data can be separated into the number of classes needed.
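The clustering procedure described above can be sketched with a minimal k-means implementation. The two-band pixel values are illustrative:

```python
import numpy as np

def kmeans(pixels, k, iters=20, seed=0):
    """Minimal k-means clustering of pixel vectors in feature space.

    pixels: (N, bands) array; returns (labels, cluster centres).
    """
    rng = np.random.default_rng(seed)
    centres = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to the nearest cluster centre.
        d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pixels[labels == j].mean(axis=0)
    return labels, centres

# Two obvious clusters in a 2-band feature space (e.g. red vs. near infrared).
pix = np.array([[0.10, 0.10], [0.12, 0.09], [0.80, 0.90], [0.82, 0.88]])
labels, centres = kmeans(pix, k=2)
print(labels)
```

The number of classes k is chosen by the analyst; whether the clusters correspond to meaningful land cover classes must be checked afterwards, which is why unsupervised classification often precedes a supervised one.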

6. Classification
Supervised classification

In supervised classification, in contrast to unsupervised classification, reference classes are used as additional information.
This reliably determines which classes result from the classification. The following steps are the most common:

Definition of the land use and land cover classes (spectral classes such as coniferous forest, deciduous forest,
water, agriculture etc.)

Classification of suitable training areas (reference areas for each class)

Execution of the true classification with the help of a suitable classification algorithm

Verification, evaluation, and inspection of the results.

Satellite Image from the Karlsruhe Region and Classification

Source: LANDSAT and LUBW with modifications

Training Areas
The statistical classification is "trained" with so-called training areas. The areas are selected from the examined region (for
example from maps or aerial images) and mapped in a site survey. Exemplary regions are defined for each class
(for example land use classes such as coniferous forest, water areas, etc.) and made available as references for the classification.
Important points when capturing training areas:
The size of the captured area is important when capturing training areas. It depends on the spatial resolution of the
satellite image. For example, a LANDSAT satellite image to be classified has a spatial
resolution of 30 m (one pixel represents 30 m x 30 m). The test area should then be at least 90 m x 90 m in order to be
certain that at least one entire pixel is fully "covered". The chosen training areas should be homogeneous and spread across
the satellite image to be classified, and enough areas should be captured for each class. Often ten
times as many pixels are digitised as there are bands available (e.g. LANDSAT: 70 pixels per land cover class).
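The two rules of thumb above can be written out as small helper functions. This is a sketch of the stated guidelines, not a fixed standard:

```python
def min_training_area_m(pixel_size_m, blocks=3):
    """Rule of thumb: a training area at least 3x3 pixels on a side, so that
    at least one pure pixel falls fully inside (90 m for 30 m pixels)."""
    return pixel_size_m * blocks

def min_training_pixels(n_bands, factor=10):
    """Rule of thumb from the text: about ten times as many training pixels
    per class as there are spectral bands."""
    return n_bands * factor

# For a LANDSAT image with 30 m pixels and 7 bands:
print(min_training_area_m(30), min_training_pixels(7))
```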

Maximum likelihood classification

The actual classification of the satellite image takes place with the help of classification algorithms, such as
maximum likelihood, minimum distance, parallelepiped (box) classification, or hierarchical classification.
The most common is the maximum likelihood classification. It is based on a probability density function: the
classifier estimates the probability with which a specific pixel belongs to a specific class. Larger deviations from the class
centre are tolerated where a pixel has no competing class nearby, smaller ones where such competition exists.
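The principle can be sketched with per-class Gaussian statistics estimated from training pixels. The two-band training data below are synthetic and hypothetical:

```python
import numpy as np

def ml_classify(pixel, classes):
    """Assign a pixel to the class with the highest Gaussian log-likelihood.

    classes: dict mapping class name -> (N, bands) array of training pixels.
    A sketch of maximum likelihood classification with per-class mean and
    covariance estimated from the training areas.
    """
    best, best_ll = None, -np.inf
    for name, train in classes.items():
        mean = train.mean(axis=0)
        cov = np.cov(train, rowvar=False)
        diff = pixel - mean
        # Log-likelihood of a multivariate normal (constant term dropped).
        ll = -0.5 * (np.log(np.linalg.det(cov))
                     + diff @ np.linalg.inv(cov) @ diff)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical 2-band training statistics for two land cover classes.
rng = np.random.default_rng(1)
classes = {
    "water":  rng.normal([0.05, 0.02], 0.01, size=(50, 2)),
    "forest": rng.normal([0.04, 0.45], 0.03, size=(50, 2)),
}
print(ml_classify(np.array([0.05, 0.40]), classes))
```

Because the covariance enters the likelihood, a class with a wide spread "tolerates" larger deviations from its centre than a tightly clustered one, which is exactly the behaviour described above.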

Task: How should the stars (red and turquoise) be classified here?

Maximum likelihood classification principle.

Source: Naumann 2008
Evaluation and inspection of the results
To verify the results, the probability of a pixel belonging to its assigned class, and the difference to the probability of it
belonging to the next most likely class, are calculated. The results are provided in the form of a confusion
matrix for the training areas, in order to show their suitability.

The Electromagnetic Spectrum
Download Worksheet 'The electromagnetic spectrum' for use in class. Find here the HTML version of the worksheet.

The worksheet "The electromagnetic spectrum" teaches the students the particularities of the electromagnetic spectrum. It
introduces the area of remote sensing and conveys the basics of the subject.

Didactical comment:
The students have to complete the worksheet by filling in the missing technical terms of the electromagnetic spectrum in the
figure and have to fill in a table with the missing words which are given in the question.

Solution of worksheet 'The electromagnetic spectrum':

Use the following words to describe the different spectral areas of the electromagnetic spectrum and to complete the
accompanying table:
Near-IR - Detection of surface temperatures, soil moisture - Visible - Microwave - Reflected solar radiation - Mid-IR - Very high
- Marginal - Ultraviolet - Detection of vegetation and soil moisture, geological applications, ocean currents - Visible light -
Marginal - Oil films on water, ozone concentration - Few - Microwave - Thermal IR - Emitted thermal radiation - Near-IR -
Reflected solar radiation - Detection of vegetation, water bodies and soils for land cover and land use mapping - Ultraviolet -
Thermal IR - Emitted & reflected radiation - Mid-IR.

The electromagnetic spectrum

Source: Albertz 2007, with modifications












(Completed solution table: each spectral region - ultraviolet, visible (blue & green), near-IR, mid-IR, thermal IR, microwave - matched with its radiation type (reflected solar radiation, emitted thermal radiation, emitted & reflected radiation), its atmospheric transmission (very high, marginal, few) and typical applications such as oil films on water, ozone concentration, detection of vegetation, water bodies and soils for land cover and land use mapping, surface temperatures, soil moisture and land shifting.)

Satellite Orbits
Download Worksheet 'Satellites' for use in class. Find here the HTML version of the worksheet.

An earth observing satellite is an artificial flying object orbiting a planet (e.g. the earth) for scientific, commercial or
military purposes. There are two different orbit types: geostationary and near polar orbits.

Didactical comment:

This worksheet asks the students to get familiar with the different satellites. The exercise teaches the students the
differences between the two orbits, their applications as well as different types of satellites.

First of all, it is important for the students to understand the distinction between possible applications of geostationary
and near polar satellites, but also to establish a relationship between the satellites and everyday life.

Solution of Worksheet 'Satellites':


Label both orbits in the image above.

The earth and important orbits.


Explain why a satellite can fly around the earth without using energy.

As the satellites are in orbit outside the atmosphere there is no air resistance and therefore, according to the
law of inertia, the speed of the satellite is constant resulting in a stable orbit around the earth for many years.

In a geostationary orbit at a distance of 36.000 km, the orbiting time is 24 hours corresponding to the earth's
rotation time. At this altitude a satellite above the equator will appear stationary in relation to the earth.


After leaving a stable orbit, a satellite loses speed. Non-functioning satellites end up littering the orbits as space debris.

The satellites in the 'red' orbit are used for several civilian services. List some satellites and their uses
and explain how they can affect your quality of life, both positively and negatively.

Example satellites in geostationary orbits are Meteosat, GOES, GMS, GOMS, KALPANA, and INSAT.


Their positive and negative effects on our life and quality of life: weather prediction in the news and satellite
broadcasting are part of everyday life; on the other hand, old and broken satellites are littering the orbits and
endanger functioning satellites.

Why are the sun-synchronous, near polar satellites shown in the 'blue' orbit important for earth observation?




These satellites have multiple uses, e.g. disaster monitoring, mapping, digital elevation model building,
agricultural mapping, traffic control etc.

Changes can also be compared easily because the satellites record an area always at the same local time
(they are sun-synchronous).


They give an overview of (large) areas and have been recording the earth and sending data to the ground for
the last 30 years. They help to monitor changes on the earth's surface,
covering everything from urban planning to vegetation monitoring etc., depending on the spatial resolution.

Assign the following satellites to their correct orbits by writing the satellite names next to the correct
orbit: Landsat, KALPANA, IRS, Meteosat, INSAT, Radarsat, SPOT, GMS, QuickBird, GOES, ERS, GOMS,
Envisat, Ikonos, NOAA.

A Question of Resolution

Shanghai in different satellite images: left: recorded by MERIS on Envisat in 2003; middle: Landsat TM, 1989; right: recorded by
Ikonos in May 2000.
Source: ESA & Beckel 2007

A Question of Resolution
Download Worksheet 'A Question of resolution' for use in class. Find here the HTML version of the worksheet.

Spatial resolution is an indication of how a satellite sensor can record spatial details such as lakes, houses, cars or persons.
This resolution is essential and must be selected according to the respective application and size of the analysed area.
Monitoring weather conditions requires satellite images with a large area view but only a low spatial resolution. Monitoring urban
planning requires very high resolution, but only for a smaller area.

Didactical commentary:

To complete this worksheet, the students should find pros and cons for the use of different spatial resolutions and
then choose the right spatial resolutions with respect to the particular applications given in the table.

In addition, the students are asked to search the Internet to find the primary applications for the satellites Landsat,
SPOT and QuickBird.

Solution to the Worksheet 'A Question of resolution':


Different resolutions for different applications. a. Complete the table which demonstrates how different
resolutions are suitable for different applications. Use the following scale: + (best suited for purpose), o
(moderately suited for purpose), - (poorly suited).



(Solution table: the applications traffic control, regional environmental mapping, topographic map up-dates, agricultural
mapping, tree census, urban planning and weather monitoring are each rated +, o or - for low, medium to high, and very
high resolution imagery.)

b. Construct a table to show the advantages and disadvantages of low, medium to high, and very high
resolution images.

Low resolution - advantages: high temporal resolution, suitable for weather forecasting, a lot of data (good time
series); disadvantages: can only detect large objects, cannot see the polar regions, not suitable for urban planning.

Medium to high resolution - advantages: can monitor regional changes etc.

Very high resolution - advantages: can detect even small objects; useful for urban planning, construction,
defence, farming, etc.; disadvantages: very expensive, data only available for the last few years.

The choice of satellite system is usually a compromise between cost and spatial resolution. Which satellite
system's resolution would you need to...
a. monitor the vegetation of a region?

For this purpose one would use medium to high resolution images, which are cheaper than very high
resolution data and cover larger areas.

Landsat data provides widely spread and easily obtainable images. Images from different dates are still on
record. These images have been available for the last 30 years.

b. check the construction progress of an airport?


For monitoring the construction of an airport, very high resolution in the metre range is required.

Satellites such as Ikonos, QuickBird or SPOT can provide such data.

c. draw the topography of a region?


For this purpose low resolution data covering an entire region is needed.

Satellites such as NOAA satellites or ASTER can provide such information.


Find main tasks and uses of the satellites Landsat, SPOT and QuickBird on the Internet.

Landsat: Monitoring of the earth surface, vegetation, environmental monitoring and land cover classification etc.

SPOT: Mapping, security and defence, farming and forestry, fisheries, land management and planning, geology and
risk mitigation.

QuickBird: Various applications, e.g. urban planning, natural disaster management, agriculture, forestry.

Visual Interpretation of Satellite Images

Satellite image mosaic of Europe, viewed by SPOT Vegetation in 2002.

Source: Beckel 2007

Visual Interpretation of Satellite Images

Download the Worksheet 'Visual Interpretation of Satellite Images' for use in class. Find here the HTML version of the worksheet.

A true colour satellite image is a combination of different bands of the visible light spectrum resulting in an apparently natural
image, similar to what an airplane passenger would see when looking down from a plane.
The satellite image mosaic (a mosaic is an adapted composition of various satellite images) in the worksheet depicts Europe
and was recorded by the Vegetation sensor onboard a SPOT satellite in 2002.

Didactical comment:

The worksheet 'Visual Interpretation of Satellite Images' will teach the students how to read and interpret a true
colour satellite image. The image mosaic of Europe should be described and the different colours assigned
to various land cover types.

Students are supposed to analyse during which season of the year the images were recorded.

At the end, the results should be documented on transparencies.

Solution to the worksheet 'Visual Interpretation of Satellite Images'


Describe the satellite image mosaic of Europe. What different patterns and surface structures (regional
landscapes) can you identify? You may use additional references such as topographic, geological, soil, or
regional planning maps.


Refer to an ESA school atlas, page 52/53 or every other school atlas with a topographic map of Europe to
verify your results.

The main colours in the image (green, yellow, brown) represent changes in land cover. How are they
linked with land cover types like forests, pastures etc. and where can you identify large settlement areas?

Image colours/patterns



Land cover types

Black and dark blue areas

Water bodies, ocean

White areas

Snow cover, glaciers

Red areas

Urban areas

Light green areas

Pasture and agricultural land

Medium and dark green areas

Deciduous and coniferous forests

Red to brown areas

Rocks and mountain areas without vegetation

Brown to yellow areas

Vegetation free areas, deserts

Turquoise areas

Salt lakes (in Tunisia, Algeria and Turkey)

A mosaic is composed of a number of images taken from various satellites. In which season (spring,
summer, fall, winter) do you think the images were taken? Explain your decision.

The images were taken between July and September 2002.

The missing snow cover over land and only some small snow covered areas in the Alps, Norway and in Iceland
(glaciers) speak for summer.

Large areas over Europe and North Africa are very green due to rich vegetation, especially in Northern Europe.

Take a transparency film and cover the image with it; use paper clips to secure the transparency. Label
distinctive points and trace the coastline.
a. Draw the country borders of the European countries with the help of an atlas. Discuss in class why
some boundaries follow natural surface features and others do not.

Satellite image mosaic of Europe, viewed by SPOT Vegetation in 2002.

Source: Beckel 2007
b. Delineate important regional landscapes and label them with the help of an atlas.

Refer to an ESA school atlas, page 52/53 or every other school atlas with a topographic map of Europe to
verify your results.

Image Processing
Georectifying - Enhancement and Mosaicing

Composition of MERIS scenes of Europe.

Source: Beckel 2007

Image Processing
Georectifying - Enhancement and Mosaicing
Download the Worksheet 'Image Processing' for use in class. Find here the HTML version of the worksheet.

To produce meaningful and valuable satellite images as we know them from everyday life, or an image mosaic like in
worksheet 4 (about Visual Image Interpretation of Europe), the raw satellite data has to be processed, enhanced and
mosaiced together. Moreover images have to be georectified to compare the data with other sources. In this process a
relationship between the image pixels and the position of the corresponding points on the earth's surface is established.

Didactical comment:

The worksheet 'Image Processing' illustrates that satellite image data has to be processed, enhanced and run through
several work stages before it becomes useful.

The students should understand that interferences such as clouds or sun reflections must be eliminated by blending
different images in order to get a 'clear' view of the surface.

The students should describe and reconstruct the different image processing steps.
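The blending step mentioned above can be sketched as a per-pixel composite. Because clouds are bright in optical imagery, keeping the darker of two co-registered scenes tends to retain the cloud-free observation; this minimum-value compositing is a simplification of real processing chains, which use explicit cloud masks. The arrays below are invented values:

```python
import numpy as np

# Minimum-value compositing: clouds are bright in optical imagery, so for
# each pixel we keep the darker of two co-registered scenes, which is more
# likely to be a cloud-free ground observation. Toy 2x3 scenes, 255 = cloud.

scene_a = np.array([[ 80, 255,  90],
                    [255, 100, 110]], dtype=np.uint8)
scene_b = np.array([[ 85,  70, 255],
                    [ 95, 255, 105]], dtype=np.uint8)

composite = np.minimum(scene_a, scene_b)
print(composite)  # every cloud pixel (255) is replaced by the other scene's value
```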

Solution to the worksheet 'Image Processing'




Explain why we enhance data.


Uncorrected satellite images are affected by atmospheric and weather conditions.

Enhancements such as geometric and radiometric corrections make it easier to extract information visually and
apply it to various problems.

An atmospheric correction incorporates additional information from other sources about interferences in the
atmosphere between the satellite and the earth's surface, such as aerosols. With this additional
information the interferences can be removed.
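One very simple radiometric correction that needs no external data is dark-object subtraction: haze adds a roughly constant offset to a band, and the darkest pixel (ideally deep water or shadow) approximates that offset. A sketch with invented values, not real satellite data:

```python
import numpy as np

# Dark-object subtraction: atmospheric haze adds a roughly constant offset
# to every pixel in a band, so the darkest pixel (ideally deep water or
# shadow) approximates that offset and can be subtracted out.
# The array below is a made-up 3x3 "band".

band = np.array([[62, 118, 130],
                 [60,  95, 140],
                 [61, 100, 125]], dtype=np.int32)

haze = band.min()                         # darkest pixel ~ atmospheric offset (60)
corrected = np.clip(band - haze, 0, None) # subtract it, keep values non-negative
print(corrected)
```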

Speculate how the satellite scenes in the image above were manipulated. Which type of image processing
did they go through?

The images were first radiometrically and spectrally corrected: atmospheric influences (haze) were reduced and
the colours were optimised.

Next, the images were geometrically corrected. This establishes a correlation between the image pixels and the
positions of the corresponding points on the earth's surface, so that images acquired on different dates can
be combined into a mosaic.

Describe and interpret variations in colour between the different scenes.


The ocean is represented in different colours in the satellite images, ranging from medium blue to black.

Some images in the area of former Yugoslavia show a darker green tone compared with the neighbouring scenes.

Images from North Africa show completely different colours due to deserts, geological formations, etc.

a. What could the white spots over Iceland represent?


They represent clouds over the island.

MERIS is an optical sensor which, unlike active radar systems, cannot penetrate clouds or rain; the clouds are
therefore visible in white.

b. What might be the white areas in the Southern Mediterranean Sea?


These white areas are reflections of the sun in the Mediterranean Sea.

Speculate how the satellite scenes in the image above were manipulated. Which type of image processing
did they go through?

The satellite scenes have to be spectrally adapted to the neighbouring images to obtain an evenly distributed
colour impression across the mosaic.

Missing images have to be fitted into the mosaic.
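Spectral adaptation of neighbouring scenes is often done by histogram matching: the pixel values of one scene are remapped so that their cumulative distribution follows that of a reference scene. A minimal NumPy sketch on toy one-dimensional "bands" (the values are invented):

```python
import numpy as np

# Histogram matching: remap a scene's pixel values so their cumulative
# distribution follows that of a reference scene, giving neighbouring
# mosaic tiles a consistent tone. Toy 1-D arrays stand in for real bands.

def match_histogram(source, reference):
    src_vals, src_idx, src_counts = np.unique(source.ravel(),
                                              return_inverse=True,
                                              return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source value, pick the reference value at the same CDF level.
    matched_vals = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched_vals[src_idx].reshape(source.shape)

src = np.array([10.0, 10.0, 20.0, 30.0])     # dark tile
ref = np.array([100.0, 110.0, 120.0, 130.0]) # brighter reference tile
print(match_histogram(src, ref))
```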

Image Classification - Compare a Satellite Image with a Classification

Düsseldorf in a satellite image.

Source: www.flaechennutzung.nrw.de

Düsseldorf in a classified image.

Source: www.flaechennutzung.nrw.de

Image Classification - Compare a Satellite Image with a Classification

Download the Worksheet 'Image Classification' for use in class. Find here the HTML version of the worksheet.
Satellite images are often classified for further analysis and interpretation. During this process, the colour and
structure information of the image data is clustered and categorised, so that each pixel and area can be assigned to a
specific land cover class or type.
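The clustering step can be illustrated with a tiny k-means, the core of unsupervised classification: pixels are grouped around class centres purely by their values. The sketch below works on a single toy band with invented values; real classifiers cluster multi-band spectra per pixel.

```python
import numpy as np

# Tiny 1-D k-means as a sketch of unsupervised classification:
# pixels are assigned to the nearest class centre, then the centres
# are recomputed as cluster means, until the grouping stabilises.

def kmeans(pixels, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centres = pixels[rng.choice(pixels.size, size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest centre.
        labels = np.argmin(np.abs(pixels[:, None] - centres[None, :]), axis=1)
        # Move each centre to the mean of its assigned pixels.
        for c in range(k):
            if np.any(labels == c):
                centres[c] = pixels[labels == c].mean()
    return labels, centres

pixels = np.array([10.0, 12.0, 11.0, 200.0, 198.0, 205.0])  # dark vs bright
labels, centres = kmeans(pixels, k=2)
print(labels)  # the three dark and three bright pixels form two classes
```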
Didactical comment:

The worksheet 'Image Classification' helps the students understand classification procedures with the help of the satellite
image example of the area of Düsseldorf, Germany. The students should compile a legend by assigning the correct
colour to the different land cover classes in the given table. In addition, the students have to estimate the area
percentage of each legend class.

In the second step, the students work out advantages and possible applications of true colour satellite images and
classified images.

Solution to the worksheet 'Image Classification'


Please compile the legend by assigning the right colour to the 10 land cover classes. The following colours are
given: yellow - light green - medium red - blue - lime green - dark red - grey - light red - dark green - medium green

Estimate the area percentage of each land cover class (be sure not to exceed 100% in total).

Legend class                                       Colour
High degree of sealing (> 80%)                     Dark red
Medium degree of sealing (40 - 80%)                Medium red
Low degree of sealing (< 40%)                      Light red
Waste deposit sites, gravel-pits, building sites   Grey
Agricultural land                                  Yellow
Grassland and pasture                              Lime green
Deciduous forest                                   Light green
Mixed forest                                       Medium green
Coniferous forest                                  Dark green
Water bodies                                       Blue

The area percentages are the students' own estimates and must add up to 100% in total.


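Once a classified raster exists, the area percentage of each class need not be estimated by eye: counting the pixels per class and dividing by the total pixel count gives it directly. A sketch with invented class codes on a toy raster:

```python
import numpy as np

# Count pixels per land cover class in a (toy) classified raster and
# convert the counts to area percentages. Class codes are illustrative.

classified = np.array([[1, 1, 2],
                       [2, 2, 3],
                       [3, 3, 3]])  # e.g. 1 = water, 2 = forest, 3 = sealed

classes, counts = np.unique(classified, return_counts=True)
percentages = 100.0 * counts / classified.size
for cls, pct in zip(classes, percentages):
    print(f"class {cls}: {pct:.1f} %")  # percentages sum to 100
```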
Name reasons why a classification is useful.


Image classification categorises all pixels in an image into land cover classes or themes.

Classified images are more valuable for regional planners, researchers etc.

They can be used for inventory monitoring and other land use applications.

True colour satellite image or classified image? Name advantages and disadvantages of true colour and
classified satellite images and where they are most useful.

True colour satellite images

+ the natural colours can provide information about the actual state of vegetation and seasonal changes
+ general overview of an area
- high information density, no distinction between important and unimportant information
- a clear assignment of colour to land cover type is not possible

Classified satellite images

+ only the important land use classes or themes are visible in a classified image
+ each pixel can be assigned to a land cover class by its colour
+ (infra)structures and cities are immediately distinguishable
+ very useful for regional/environmental planning and monitoring
+ land cover inventory
+ change detection
- classified images do not distinguish between seasons
- these images are generalised and lack detail