Principles of Remote Sensing
DSMHT-213
Ariful Islam
Department of Disaster Science and Management
Faculty of Earth and Environmental Science
University of Dhaka
What is Remote Sensing?
"Remote sensing is the science (and to some extent, art) of acquiring information
about the Earth's surface without actually being in contact with it. This is done by
sensing and recording reflected or emitted energy and processing, analyzing, and
applying that information."
Remote sensing is the means of observing objects and their physical, optical and
geometrical properties without touching them. The information is gathered through
electromagnetic waves, or light, of different wavelengths. The light falls on the
object and scatters back to the sensor, which is sensitive to the intensity of the
radiation. The sensor is photosensitive, and as soon as the light reflects back to it
a voltage is created which produces a signal that is saved in a storage device.
In much of remote sensing, the process involves an interaction between incident
radiation and the targets of interest. This is exemplified by the use of imaging systems
where the following seven elements are involved. Note, however, that remote sensing
also involves the sensing of emitted energy and the use of non-imaging sensors.
1. Energy Source or Illumination (A) – the first requirement for remote sensing is to
have an energy source which illuminates or provides electromagnetic energy to the
target of interest.
2. Radiation and the Atmosphere (B) – as the energy travels from its source to the
target, it will come in contact with and interact with the atmosphere it passes through.
This interaction may take place a second time as the energy travels from the target to
the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target
through the atmosphere, it interacts with the target depending on the properties of
both the target and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or
emitted from the target, we require a sensor (remote - not in contact with the target)
to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor
has to be transmitted, often in electronic form, to a receiving and processing station
where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or
digitally or electronically, to extract information about the target which was
illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when
we apply the information we have been able to extract from the imagery about the
target in order to better understand it, reveal some new information, or assist in
solving a particular problem.
The Electromagnetic Spectrum
The electromagnetic spectrum can be divided into several wavelength (frequency)
regions, among which only a narrow band from about 400 to 700 nm is visible to the
human eye. Note that there is no sharp boundary between these regions. The
boundaries shown in the above figures are approximate and there are overlaps
between two adjacent regions.
Wavelength units: 1 mm = 1000 µm; 1 µm = 1000 nm.
 Radio Waves: 10 cm to 10 km wavelength. Short radio waves are reflected by the
ionosphere.
 Microwaves: 1 mm to 1 m wavelength. The microwaves are further divided into
different frequency (wavelength) bands (1 GHz = 10⁹ Hz):
o P band: 0.3 - 1 GHz (30 - 100 cm)
o L band: 1 - 2 GHz (15 - 30 cm)
o S band: 2 - 4 GHz (7.5 - 15 cm)
o C band: 4 - 8 GHz (3.8 - 7.5 cm)
o X band: 8 - 12.5 GHz (2.4 - 3.8 cm)
o Ku band: 12.5 - 18 GHz (1.7 - 2.4 cm)
o K band: 18 - 26.5 GHz (1.1 - 1.7 cm)
o Ka band: 26.5 - 40 GHz (0.75 - 1.1 cm)
 Infrared: 0.7 to 300 µm wavelength. This region is further divided into the following
bands:
 Near Infrared (NIR): 0.7 to 1.5 µm.
 Short Wavelength Infrared (SWIR): 1.5 to 3 µm.
 Mid Wavelength Infrared (MWIR): 3 to 8 µm.
 Long Wavelength Infrared (LWIR): 8 to 15 µm.
 Far Infrared (FIR): longer than 15 µm.
The NIR and SWIR are also known as the Reflected Infrared, referring to the main
infrared component of the solar radiation reflected from the earth's surface. The
MWIR and LWIR are the Thermal Infrared.
 Visible Light: Blue, green, and red are the primary colours or wavelengths of the
visible spectrum. They are defined as such because no single primary colour can
be created from the other two, but all other colours can be formed by combining
blue, green, and red in various proportions. This narrow band of electromagnetic
radiation extends from about 400 nm (violet) to about 700 nm (red). The various
colour components of the visible spectrum fall roughly within the following
wavelength regions:
o Red: 610 - 700 nm
o Orange: 590 - 610 nm
o Yellow: 570 - 590 nm
o Green: 500 - 570 nm
o Blue: 450 - 500 nm
o Indigo: 430 - 450 nm
o Violet: 400 - 430 nm
 Ultraviolet: 3 to 400 nm. The ultraviolet or UV portion of the spectrum has the
shortest wavelengths which are practical for remote sensing. This radiation is
just beyond the violet portion of the visible wavelengths.
 X-Rays and Gamma Rays
Description of Electromagnetic Waves
Electromagnetic waves are energy transported through space in the form of periodic
disturbances of electric and magnetic fields.
The waves consist of electric and magnetic field vectors. They are sinusoidal in
nature and are described mathematically by the following expression:
y = A sin(kx − ωt),

where k = 2π/λ and ω = 2π/T; here k, λ, ω and T are the wave number, wavelength,
angular frequency and time period, respectively. The frequency is f = 1/T.
All electromagnetic waves travel through space at the same speed, c = 2.99792458 ×
10⁸ m/s, commonly known as the speed of light. An electromagnetic wave is
characterized by a frequency and a wavelength. These two quantities are related to
the speed of light by the equation

c = f × λ
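As a quick numerical check of this relation, the following minimal Python sketch converts between frequency and wavelength; the example band values are illustrative assumptions, not taken from the text:

```python
# Illustrative use of c = f * lambda; the example frequencies are assumptions.
C = 2.99792458e8  # speed of light in vacuum, m/s

def wavelength_m(frequency_hz):
    """Wavelength (m) corresponding to a given frequency (Hz)."""
    return C / frequency_hz

def frequency_hz(wavelength):
    """Frequency (Hz) corresponding to a given wavelength (m)."""
    return C / wavelength

print(wavelength_m(5.3e9) * 100)   # a 5.3 GHz C-band radar -> ~5.7 cm
print(frequency_hz(700e-9))        # red light at 700 nm -> ~4.3e14 Hz
```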
The frequency (and hence, the wavelength) of an electromagnetic wave depends on
its source. There is a wide range of frequency encountered in our physical world,
ranging from the low frequency of the electric waves generated by the power
transmission lines to the very high frequency of the gamma rays originating from the
atomic nuclei. This wide frequency range of electromagnetic waves constitutes the
Electromagnetic Spectrum.
Blackbody Radiation
All material bodies emit radiation. The intensity and wavelength of the radiation
depend on the temperature of the body. If the emissivity of a body is 1, it is
known as a blackbody. In reality there is no perfect blackbody, but many objects
may be considered very close to a blackbody.
Planck's law describes the electromagnetic radiation emitted by a black body in
thermal equilibrium at a definite temperature. The law is named after Max Planck,
who originally proposed it in 1900. It is a pioneering result of modern physics and
quantum theory.
The spectral radiance of a body, B_ν(T), describes the amount of energy it gives off
as radiation at different frequencies. It is measured in terms of the power emitted
per unit area of the body and per unit solid angle for a frequency ν. Planck showed
that the spectral radiance of a body at absolute temperature T is given by

B_ν(T) = (2hν³ / c²) · 1 / (exp(hν / k_B T) − 1),

where k_B is the Boltzmann constant (1.3806488 × 10⁻²³ J/K), h is the Planck constant
(6.62606957 × 10⁻³⁴ J s) and c is the speed of light in the medium, whether material
or vacuum. The spectral radiance can also be measured per unit wavelength instead
of per unit frequency. In this case, it is given by

B_λ(T) = (2hc² / λ⁵) · 1 / (exp(hc / λk_B T) − 1).
The peak wavelength of the emission is inversely proportional to its temperature in
Kelvin.
Wien's Law is the equation to use to solve for this:

λ_max = 0.0029 / T metres, where T is in Kelvin.

For the earth's surface at T = 303 K, the wavelength of maximum emission is about
9.6 µm.
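Both laws can be verified numerically. The minimal Python sketch below uses the constants quoted above; the 5800 K value for the sun is a common textbook assumption, not from the text:

```python
import math

H = 6.62606957e-34   # Planck constant, J s
KB = 1.3806488e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck_radiance_wavelength(lam, t):
    """Spectral radiance B_lambda(T), in W sr^-1 m^-3, per Planck's law."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * t))

def wien_peak_m(t):
    """Wavelength of maximum emission: lambda_max = 0.0029 / T."""
    return 0.0029 / t

print(wien_peak_m(303) * 1e6)    # ~9.6 um: peak of the earth's surface emission
print(wien_peak_m(5800) * 1e9)   # ~500 nm: the sun peaks in the visible
print(planck_radiance_wavelength(9.6e-6, 303))  # radiance near the terrestrial peak
```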
Atmospheric Interaction of Electromagnetic Waves/ Solar Radiation
The earth has a gaseous envelope around its solid and liquid core. Electromagnetic
radiation passing through the atmosphere interacts with atmospheric molecules,
causing scattering. Before radiation used for remote sensing reaches the
Earth's surface it has to travel through some distance of the Earth's atmosphere.
Particles and gases in the atmosphere can affect the incoming light and radiation.
These effects are caused by the mechanisms of scattering and absorption.
 Top of the Atmosphere
o Solar radiation arrives essentially unchanged by passage through space
o Interacts with gas molecules, water droplets and aerosols in the
atmosphere
 Absorption
o Radiant energy converted into internal energy
 Ceases to exist as radiant energy
 Heats atmosphere directly
o Very little absorption in visible wavelengths
 Ozone absorbs most ultraviolet
 CO2, water vapor and water droplets absorb near infrared
 Transmission
o Sunlight passing through atmosphere without being altered
 Light arriving at surface directly from the Sun
 Makes the disk of the Sun visible
 Creates patterns of strong light and shadows
 Reflection
o Interaction that changes direction of light
o Reflection from surfaces
 Radiation redirected away from surface without being absorbed
 Reflected light makes objects visible
 Wavelength-dependent reflection sensed as color differences
o Specular reflection
 Coherent beam reflected - images preserved
 Mirrors, shiny metal and smooth water surfaces
 Scattering
o Light reflected in all directions - images not preserved
o Fogged mirror (water droplets) scatter at different angles, so can't
see anything in mirror
o Comparable effect with fog on glass scattering transmitted light
o Atmospheric molecules and aerosols scatter incoming solar radiation
o Some forward scattered and continues toward surface at different
angle
o Some back scattered out to space taking energy out of earth-
atmosphere system
Figure 3: Scattering of solar radiation in the atmosphere in all directions,
producing diffuse light (left); scattering by clouds (right panel).
 Diffuse radiation
o Some light reaches surface after being scattered
 Appears to originate from location of last scattering event
 Illuminates sky outside of direct solar beam
 No atmosphere on moon, so sky is black except for direct light
source
o Amounts of direct and diffuse light reaching surface vary depending on
atmospheric conditions
 Clear, dry atmosphere - large amount of direct, little diffuse
 High contrast between dark shadows and brightly
illuminated surfaces
 Sky appears deep blue
o Cloud cover - no direct, all diffuse
 Light coming from all directions of the sky, so only shadows
directly underneath things
 Disk of sun not visible
 Rayleigh Scattering
o Scattering by objects smaller than wavelengths of visible light
 Primarily atmospheric gas molecules
o Preferentially scatters shorter wavelengths
 Violet and blue scattered most, red least.
o Makes sky appear blue from earth and earth appear blue from space
o Makes red sunsets
 All wavelengths shorter than red scattered during long passage
through atmosphere
 Best sunsets occur with clear sky except for medium or high
clouds
 Reflect the red light against dark sky background
 Mie Scattering
o Scattering by objects larger than wavelengths of visible light
(water droplets and aerosols)
o Scatters all wavelengths equally - white light
o Makes clouds appear white
o Aerosols near surface often make skies appear hazy white, especially
near horizon
Energy of light
Photons
Light is a wave, but it is also treated as a particle (corpuscle) in quantum
mechanics, and the energy of an electromagnetic wave is quantized, i.e. it can only
exist in discrete amounts. The basic unit of energy for an electromagnetic wave is
called a photon. The energy E of a photon is proportional to the wave frequency f,
E = h f
where the constant of proportionality h is Planck's constant,
h = 6.626 × 10⁻³⁴ J s.
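For example, a photon of green light (~550 nm, an assumed value) carries an energy of about 3.6 × 10⁻¹⁹ J, as this minimal sketch shows:

```python
H = 6.626e-34      # Planck constant, J s
C = 2.99792458e8   # speed of light, m/s

def photon_energy_j(frequency_hz):
    """Photon energy E = h f, in joules."""
    return H * frequency_hz

f_green = C / 550e-9             # frequency of ~550 nm green light
print(photon_energy_j(f_green))  # ~3.6e-19 J per photon
```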
Attenuation of solar radiation while passing through the atmosphere
Solar constant = 1.361 kW/m²
As the solar radiation passes through the atmosphere, it is attenuated until it
reaches the surface of the earth. The incident radiation is then given by

I_inc(λ) = I_top(λ) exp(−kz), or I_inc(λ) = τ(λ) I_top(λ), where τ(λ) = exp(−kz).

Here, k is the coefficient of attenuation and τ(λ) is known as the transmittance.
The transmittance of the atmosphere is therefore

τ(λ) = I_inc(λ) / I_top(λ).

There is strong variation of atmospheric transmittance with λ. Figure 4 shows the
atmospheric transmittance across the E-M spectrum.
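A minimal sketch of these relations in Python; the attenuation coefficient and path length are assumed demonstration values, not from the text:

```python
import math

def transmittance(k, z):
    """tau = exp(-k z), for attenuation coefficient k (1/m) and path length z (m)."""
    return math.exp(-k * z)

def surface_irradiance(i_top, k, z):
    """I_inc = tau * I_top: irradiance remaining after an attenuating path."""
    return i_top * transmittance(k, z)

# I_top taken from the solar constant; k and z are assumed values.
print(transmittance(2e-5, 10_000.0))               # tau ~ 0.82 over a 10 km path
print(surface_irradiance(1361.0, 2e-5, 10_000.0))  # ~1115 W/m^2 at the surface
```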
Figure: The transmittance of E-M radiation through the atmosphere (upper panel), and
blackbody radiation of the sun and of the earth's surface (lower panel).
From the figure (upper panel) it appears that solar radiation has high
transmittance in the near UV (λ = 0.2-0.4 µm), visible (λ = 0.4-0.7 µm) and near
infrared (NIR, λ = 0.7-1.5 µm) regions.
Atmospheric window
Atmospheric windows are parts of the electromagnetic spectrum that can be
transmitted through the atmosphere with relatively little interference. The
atmosphere acts as a window for spectral radiation having high transmittance, and
such ranges of the spectrum are termed atmospheric windows. The figure shows
that the following ranges of radiation are atmospheric windows:
I. UV: 0.2-0.4 µm- (Solar Radiation Passive)
II. Visible: 0.4-0.7 µm (Solar Radiation-Passive)
III. NIR: 0.7-1.5 µm (Solar Radiation-Passive)
IV. Far Infrared (Thermal radiation): 8.0-15.0 µm (Blackbody radiation from earth’s
surface and lower atmosphere-Passive)
V. Passive microwave (around 50 GHz): radiation from the earth's surface.
VI. Microwaves (centimetre to metre): passive radiation from the earth, and active
radiation from Synthetic Aperture Radar (SAR), which uses direct microwave
beams to illuminate objects on earth.
Absorption Bands and Atmospheric Windows
Some types of electromagnetic radiation easily pass through the atmosphere, while
other types do not. The ability of the atmosphere to allow radiation to pass through it
is referred to as its transmissivity, and varies with the wavelength/type of the
radiation. The gases that comprise our atmosphere absorb radiation in certain
wavelengths while allowing radiation with differing wavelengths to pass through.
The areas of the EM spectrum that are absorbed by atmospheric gases such as water
vapor, carbon dioxide, and ozone are known as absorption bands. In the figure,
absorption bands are represented by a low transmission value that is associated with
a specific range of wavelengths.
In contrast to the absorption bands, there are areas of the electromagnetic spectrum
where the atmosphere is transparent (little or no absorption of radiation) to specific
wavelengths. These wavelength bands are known as atmospheric "windows" since
they allow the radiation to easily pass through the atmosphere to Earth's surface.
Most remote sensing instruments on aircraft or space-based platforms operate in one
or more of these windows by making their measurements with detectors tuned to
specific frequencies (wavelengths) that pass through the atmosphere. When a remote
sensing instrument has a line-of-sight with an object that is reflecting sunlight or
emitting heat, the instrument collects and records the radiant energy. While most
remote sensing systems are designed to collect reflected radiation, some sensors,
especially those on meteorological satellites, directly measure absorption
phenomena, such as those associated with carbon dioxide (CO2) and other gases. The
atmosphere is nearly opaque to EM radiation in part of the mid-IR and all of the far-IR
regions. In the microwave region, by contrast, most of this radiation moves through
unimpeded, so radar waves reach the surface (although weather radars are able to
detect clouds and precipitation because they are tuned to observe backscattered
radiation from liquid and ice particles).
The figure above shows the percentage of light transmitted at various wavelengths
from the near ultraviolet to the far infrared, and the sources of atmospheric opacity
are also given. You can see that there is plenty of atmospheric transmission of
radiation at 0.5 microns, 2.5 microns, and 3.5 microns, but in contrast there is a great
deal of atmospheric absorption at 2.0, 3.0, and about 7.0 microns. Both passive and
active remote sensing technologies do best if they operate within the atmospheric
windows.
Microwave Remote Sensing
Electromagnetic radiation in the microwave wavelength region is used in remote
sensing to provide useful information about the Earth's atmosphere, land and ocean.
A microwave radiometer is a passive device which records the natural microwave
emission from the earth. It can be used to measure the total water content of the
atmosphere within its field of view.
A radar altimeter sends out pulses of microwave signals and records the signal
scattered back from the earth's surface. The height of the surface can be measured
from the time delay of the return signals.
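The height measurement reduces to the two-way travel time of the pulse, R = c·t/2. A minimal sketch; the delay value is an assumed example corresponding to roughly an 800 km orbit:

```python
C = 2.99792458e8  # speed of light, m/s

def range_to_surface_m(two_way_delay_s):
    """Range from the two-way pulse travel time: R = c * t / 2."""
    return C * two_way_delay_s / 2.0

print(range_to_surface_m(5.336e-3) / 1000)  # ~800 km for a 5.336 ms round trip
```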
A wind scatterometer can be used to measure wind speed and direction over the
ocean surface. It sends out pulses of microwaves along several directions and records
the magnitude of the signals backscattered from the ocean surface. The magnitude of
the backscattered signal is related to the ocean surface roughness, which in turn is
dependent on the sea surface wind condition, and hence the wind speed and direction
can be derived. Synthetic aperture radars, described below, are carried on airborne
or spaceborne platforms to generate high resolution images of the earth's surface
using microwave energy.
Synthetic Aperture Radar (SAR)
In synthetic aperture radar (SAR) imaging, microwave pulses are transmitted by an
antenna towards the earth surface. The microwave energy scattered back to the
spacecraft is measured. The SAR makes use of the radar principle to form an image by
utilising the time delay of the backscattered signals.
A radar pulse is transmitted from
the antenna to the ground
The radar pulse is scattered by the
ground targets back to the antenna.
In real aperture radar imaging, the ground resolution is limited by the size of the
microwave beam sent out from the antenna. Finer details on the ground can be
resolved by using a narrower beam. The beam width is inversely proportional to the
size of the antenna, i.e. the longer the antenna, the narrower the beam.
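This relationship can be quantified with the standard diffraction approximation, beam width ≈ λ / L, so the ground footprint at range R is roughly R·λ / L. A minimal sketch with ERS-like numbers; the slant range is an assumed value:

```python
def beam_width_rad(wavelength_m, antenna_length_m):
    """Approximate angular beam width: theta ~ lambda / L."""
    return wavelength_m / antenna_length_m

def footprint_m(wavelength_m, antenna_length_m, range_m):
    """Ground footprint of the beam at range R: ~ R * lambda / L."""
    return range_m * beam_width_rad(wavelength_m, antenna_length_m)

# C-band (~5.7 cm) from a 10 m antenna at an assumed 850 km slant range
print(footprint_m(0.057, 10.0, 850e3) / 1000)  # ~4.8 km real-aperture footprint
```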
The microwave beam sent out by
the antenna illuminates an area
on the ground (known as the
antenna's "footprint"). In radar
imaging, the recorded signal
strength depends on the
microwave energy backscattered
from the ground targets inside
this footprint. Increasing the
length of the antenna will
decrease the width of the
footprint.
It is not feasible for a spacecraft to carry the very long antenna which would be
required for high resolution imaging of the earth's surface. To overcome this
limitation, SAR capitalises on the motion of the spacecraft to emulate a large
antenna (about 4 km for the ERS SAR) from the small antenna (10 m on the ERS
satellite) it actually carries on board.
Imaging geometry for a typical strip-mapping synthetic aperture radar imaging system.
The antenna's footprint sweeps out a strip parallel to the direction of the satellite's
ground track.
Interaction between Microwaves and Earth's Surface
When microwaves strike a surface, the proportion of energy scattered back to the
sensor depends on many factors:
 Physical factors such as the dielectric constant of the surface materials which
also depends strongly on the moisture content;
 Geometric factors such as surface roughness, slopes, orientation of the objects
relative to the radar beam direction;
 The types of landcover (soil, vegetation or man-made objects).
 Microwave frequency, polarisation and incident angle.
All-Weather Imaging
Due to the cloud-penetrating property of microwaves, SAR is able to acquire "cloud-
free" images in all weather. This is especially useful in tropical regions, which are
frequently under cloud cover throughout the year. Being an active remote sensing
device, it is also capable of night-time operation.
Basic mechanism of Remote Sensing and sensor design
The basic mechanism of remote sensing is that outgoing radiation from the object
passes through the atmosphere to the sensor onboard the remote sensing platform.
The platform is the aircraft, satellite or, in special cases, balloon that carries
the sensor. The sensor has a field of view, on which the resolution of the
measurement depends.
In designing remote sensing sensors, the electromagnetic wavelengths are chosen so
that they belong to atmospheric window regions of the E-M spectrum. This radiation
may be solar radiation reflected from the object, thermal radiation emitted by the
object, or an active beam generated by a Synthetic Aperture Radar (SAR) and
scattered by the object. The radiation scattered or emitted by the object is sensed
by suitable sensors onboard the platforms. The energy received by the sensor is
then converted to a digital signal or a physical value representing the properties
of the object on earth. The field of view determines how much area of the earth is
seen by the sensor in providing a single value of information. The sensor scans the
earth from left to right of the satellite's path over a width equal to the field of
view, so strips of scanned information are received as the satellite moves. The
point on the ground directly beneath the satellite is called the nadir, and the
path it traces on the surface is its ground track. The scanned strips are referred
to as rows, and the individual observations within each strip as columns.
Figure: The Field of view, strips and viewed area / resolution
Figure: Illumination of the object and reflection from the object to the sensor; and
data recording and downloading to the ground station.
The following aspects are important for remote sensing:
 Satellite or Aircraft
 Sensors: sensors use reflected solar radiation, the earth's infrared radiation, or
an active beam projected from the platform for illumination and scattered back to
the sensors.
 Multispectral observation. This means the optical system on board the platform
looks at the object to capture radiation in different, narrower spectral bands.
There are devices or filters which separate the chosen bands.
 For SAR technology, multi-spectral, multi-polarization and multi-angle remote
sensing observations are made.
 The information is gathered in a known format in the storage system within the
platform and then beamed down to the ground station.
 After reception at the ground station, data are pre-processed, corrected for
geometric error, and geo-referenced in a coordinate system.
 Then image interpretation, image processing and classification are carried out.
 Applications are manifold: land-use mapping, disaster monitoring, flood and
cyclone affected area mapping, etc. are performed through man-machine
interaction.
Remote Sensing Platforms
Remote sensing of the earth is realized with sensors on board the following platforms:
 Aircrafts with special arrangements of making high resolution photographs on
photographic films using optical analog camera or digital camera or sensors.
The image quality depends on the atmospheric conditions. The images obtained
through this imaging technique are known as aerial photograph. The
atmosphere should be cloud or fog free for getting high quality photographs.
The process of analysis and interpretation of aerial photographs is known as
photogrammetry.
 Satellites are used as platforms with imaging instruments/sensors onboard. The
satellites are of two types depending on the orbit. Satellites placed over the
equator at a height of around 36,000 km from the earth's surface have the same
orbital period as the earth's rotation and therefore appear stationary with respect
to the earth; these are known as geostationary satellites.
 Satellites having lower orbital heights of around 500-800 km are known as polar
orbiting satellites. The orbital heights may be as low as 300 km, in which case
they are known as low orbiting satellites.
 Balloons may also be used
Airborne Remote Sensing
In airborne remote sensing, downward or sideward looking sensors are mounted on
an aircraft to obtain images of the earth's surface. An advantage of airborne remote
sensing, compared to satellite remote sensing, is the capability of offering very
high spatial resolution images (20 cm or less). The disadvantages are low coverage
area and high cost per unit area of ground coverage. It is not cost-effective to map a
large area using an airborne remote sensing system. Airborne remote sensing missions
are often carried out as one-time operations, whereas earth observation satellites
offer the possibility of continuous monitoring of the earth.
Analog aerial photography, videography, and digital photography are commonly used
in airborne remote sensing. Synthetic Aperture Radar imaging is also carried out on
airborne platforms.
Analog photography is capable of providing high spatial resolution. The interpretation
of analog aerial photographs is usually done visually by experienced analysts. The
photographs may be digitized using a scanning device for computer-assisted analysis.
Digital photography permits real-time transmission of the remotely sensed data to a
ground station for immediate analysis. The digital images can be analysed and
interpreted with the aid of a computer.
A high resolution aerial photograph over a forested
area. The canopy of each individual tree can be
clearly seen. This type of very high resolution
imagery is useful in identification of tree types and
in assessing the conditions of the trees.
Another example of a high resolution aerial
photograph over a residential area.
Spaceborne Remote Sensing
Eg. IKONOS 2; SPOT 1, 2, 4; EROS A1; OrbView 2 (SeaStar); NOAA 12, 14, 16; TERRA;
RADARSAT 1; ERS 1, 2.
The receiving ground station at CRISP receives data from these satellites. In
spaceborne remote sensing, sensors are mounted on-board a spacecraft (space
shuttle or satellite) orbiting the earth. At present, there are several remote sensing
satellites providing imagery for research and operational applications. Spaceborne
remote sensing provides the following advantages:
 Large area coverage;
 Frequent and repetitive coverage of an area of interest;
 Quantitative measurement of ground features using radiometrically
calibrated sensors;
 Semiautomated computerised processing and analysis;
 Relatively lower cost per unit area of coverage.
Satellite imagery has a generally lower resolution compared to aerial photography.
However, very high resolution imagery (up to 1-m resolution) is now commercially
available to civilian users following the successful launch of the IKONOS-2
satellite on 24 September 1999.
Airborne versus Spaceborne Radars
Like other remote sensing systems, an imaging radar sensor may be carried on either
an airborne or spaceborne platform. Depending on the use of the prospective imagery,
there are trade-offs between the two types of platforms. Regardless of the platform
used, a significant advantage of using a Synthetic Aperture Radar (SAR) is that the
spatial resolution is independent of platform altitude. Thus, fine resolution can be
achieved from both airborne and spaceborne platforms.
Although spatial resolution is independent of altitude, viewing geometry and swath
coverage can be greatly affected by altitude variations. At aircraft operating altitudes,
an airborne radar must image over a wide range of incidence angles, perhaps as much
as 60 or 70 degrees, in order to achieve relatively wide swaths (let's say 50 to 70 km).
As we have learned in the preceding sections, incidence angle (or look angle) has a
significant effect on the backscatter from surface features and on their appearance on
an image. Image characteristics such as foreshortening, layover, and shadowing will
be subject to wide variations, across a large incidence angle range. Spaceborne radars
are able to avoid some of these imaging geometry problems since they operate at
altitudes up to one hundred times higher than airborne radars.
At altitudes of several hundred kilometres, spaceborne radars can image comparable
swath widths, but over a much narrower range of incidence angles, typically ranging
from five to 15 degrees. This provides for more uniform illumination and reduces
undesirable imaging variations across the swath due to viewing geometry.
Although airborne radar systems may be more susceptible to imaging geometry
problems, they are flexible in their capability to collect data from different look angles
and look directions. By optimizing the geometry for the particular terrain being
imaged, or by acquiring imagery from more than one look direction, some of these
effects may be reduced. Additionally, an airborne radar is able to collect data
anywhere and at any time (as long as weather and flying conditions are acceptable!).
A spaceborne radar does not have this degree of flexibility, as its viewing geometry
and data acquisition schedule is controlled by the pattern of its orbit. However,
satellite radars do have the advantage of being able to collect imagery more quickly
over a larger area than an airborne radar, and provide consistent viewing geometry.
The frequency of coverage may not be as often as that possible with an airborne
platform, but depending on the orbit parameters, the viewing geometry flexibility, and
the geographic area of interest, a spaceborne radar may have a revisit period as short
as one day.
As with any aircraft, an airborne radar will be susceptible to variations in velocity and
other motions of the aircraft as well as to environmental (weather) conditions. In order
to avoid image artifacts or geometric positioning errors due to random variations in
the motion of the aircraft, the radar system must use sophisticated
navigation/positioning equipment and advanced image processing to compensate for
these variations. Generally, this will be able to correct for all but the most severe
variations in motion, such as significant air turbulence. Spaceborne radars are not
affected by motion of this type. Indeed, the geometry of their orbits is usually very
stable and their positions can be accurately calculated. However, geometric correction
of imagery from spaceborne platforms must take into account other factors, such as
the rotation and curvature of the Earth, to achieve proper geometric positioning of
features on the surface.
Satellite Orbits
A satellite follows a generally elliptical orbit around the earth. The time taken to
complete one revolution of the orbit is called the orbital period. The satellite traces
out a path on the earth surface, called its ground track, as it moves across the sky. As
the earth below is rotating, the satellite traces out a different path on the ground in
each subsequent cycle. Remote sensing satellites are often launched into special
orbits such that the satellite repeats its path after a fixed time interval. This time
interval is called the repeat cycle of the satellite.
There are two different orbits for Satellites depending on the height of the satellites:
Polar orbiting satellites
The polar orbiting satellites are put at a height of around 300-1000 km from the
surface of the earth and have a fixed orbit. The satellites move from south to north
or north to south, and cover the full earth as the earth rotates from west to east.
The satellite passes from south to north and, after travelling 180 degrees around
its orbit, is found to pass from north to south. If the satellite passes from south
to north it is known as an ascending pass, while if it passes from north to south it
is known as a descending pass. A satellite with an orbital height of 800-1000 km
takes about 100 minutes for a complete revolution. The orbital period increases
with height; thus the low orbiting satellites have high orbital speed and a shorter
period.
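The quoted period follows from Kepler's third law for a circular orbit, T = 2π√(a³/μ). A minimal check in Python, using standard values for Earth's gravitational parameter and mean radius:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

def orbital_period_min(altitude_m):
    """Circular-orbit period from Kepler's third law: T = 2 pi sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a**3 / MU) / 60.0

print(orbital_period_min(800e3))   # ~101 minutes
print(orbital_period_min(1000e3))  # ~105 minutes
```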
If the satellite crosses the equator at the same local time of day on each pass, its
orbit is known as sun-synchronous.
Geostationary Satellites
 The geostationary orbits are such that the satellite moves in the same direction
as the earth's rotation with the same angular speed; as a result, the satellite
appears to remain stationary with respect to the earth and views the same region of
the earth at regular intervals of typically 10-30 minutes. Thus, satellites in
geostationary orbits are used for environmental monitoring and also for
telecommunication purposes.
 A geostationary satellite is launched in such a way that it follows an orbit
parallel to the equator and travels in the same direction as the earth's rotation
with the same period of 24 hours. Thus, it appears stationary with respect to
the earth surface. A satellite following a geostationary orbit always views the
same area on the earth. Thus, it is possible to monitor continuously any location
within the field of view of its sensor. A large area of the earth can also be
covered by the satellite.
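The geostationary altitude itself follows from the same law by fixing the period at one (sidereal) day and solving for the orbit radius; a minimal sketch:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EQUATOR = 6.378e6   # Earth's equatorial radius, m
T = 86164.1           # one sidereal day, s

# Kepler's third law solved for the semi-major axis: a = (mu T^2 / 4 pi^2)^(1/3)
a = (MU * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
print((a - R_EQUATOR) / 1000)  # ~35,790 km altitude above the equator
```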
Geostationary Orbit: The satellite appears stationary with respect to the
Earth's surface.
Remote Sensing Satellites
Several remote sensing satellites are currently available, providing imagery suitable
for various types of applications. Each of these satellite-sensor platforms is
characterised by the wavelength bands employed in image acquisition, the spatial
resolution of the sensor, the coverage area and the temporal coverage, i.e. how
frequently a given location on the earth's surface can be imaged by the imaging
system.
In terms of the spatial resolution, the satellite imaging systems can be classified into:
 Low resolution systems (approx. 1 km or more)
 Medium resolution systems (approx. 100 m to 1 km)
 High resolution systems (approx. 5 m to 100 m)
 Very high resolution systems (approx. 5 m or less)
In terms of the spectral regions used in data acquisition, the satellite imaging systems
can be classified into:
 Optical imaging systems (include visible, near infrared, and shortwave infrared
systems)
 Thermal imaging systems
 Synthetic aperture radar (SAR) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral
bands used:
 Monospectral or panchromatic (single wavelength band, "black-and-white",
grey-scale image) systems
 Multispectral (several spectral bands) systems
 Superspectral (tens of spectral bands) systems
 Hyperspectral (hundreds of spectral bands) systems
Synthetic aperture radar imaging systems can be classified according to the
combination of frequency bands and polarization modes used in data acquisition, e.g.:
 Single frequency (L-band, or C-band, or X-band)
 Multiple frequency (Combination of two or more frequency bands)
 Single polarization (VV, or HH, or HV)
 Multiple polarization (Combination of two or more polarization modes)
Types of Satellites and Sensors
Optical/Infrared Remote Sensing Satellites
Low Resolution
 Geostationary Satellites
 Polar Orbiting Meteorological Satellites
o NOAA-AVHRR
o DMSP-OLS
 Orbview2-SeaWiFS
 SPOT4-Vegetation
 ADEOS-OCTS
Medium Resolution
 TERRA-MODIS
 ENVISAT-MERIS
 ADEOS2-GLI
High Resolution
 LANDSAT
 SPOT1,2,4
 MOS
 EO1
 IRS
 RESURS
Very High Resolution
 IKONOS2
 EROS-A1
 Quickbird2
 Orbview3
 SPOT5
Ground Truthing
Ground truthing refers to checking the accuracy of remotely sensed or
mathematically calculated data against data actually measured in the field.
In order to "anchor" the satellite measurements, we need to compare them to
something we know. One way to do this is by what we call "ground truth", which is
one part of the calibration process. This is where a person on the ground (or
sometimes in an airplane) makes a measurement of the same thing the satellite is
trying to measure, at the same time the satellite is measuring it. The two answers are
then compared to help evaluate how well the satellite instrument is performing.
Usually we believe the ground truth more than the satellite, because we have more
experience making measurements on the ground and sometimes we can see what we
are measuring with the naked eye.
There are a number of ways to take ground truth measurements. The first is what we
call a "field campaign". This is where several scientists and technicians take lots of
equipment and set it up somewhere for a short but intense period of measurement.
Often they go someplace rather simple, like the middle of the Great Plains in the
United States, or an island or ship in the middle of the ocean, or an ice shelf at one of
the poles. We get a lot of information from field campaigns, but they are expensive
and only run for a short time.
Another source of ground truth is the on-going work of the National Weather Service.
They have a record of weather conditions stretching back for over 100 years.
Observations are made at regular intervals at offices around the country. These
provide a nice record but are not necessarily taken at the same time a satellite passes
over the spot. As clouds are very changeable, things can change completely in even a
few minutes.
Another option for ground truth is S'COOL. Students at schools around the world can
be involved by making an observation within a few minutes of the time that a satellite
views their area.
Image Classification
The intent of the classification process is to categorize all pixels in a digital image into
one of several land cover classes, or "themes". This categorized data may then be used
to produce thematic maps of the land cover present in an image. Normally,
multispectral data are used to perform the classification and, indeed, the spectral
pattern present within the data for each pixel is used as the numerical basis for
categorization (Lillesand and Kiefer, 1994). The objective of image classification is to
identify and portray, as a unique gray level (or color), the features occurring in an
image in terms of the object or type of land cover these features actually represent on
the ground. Image classification is perhaps the most important part of digital image
analysis. It is very nice to have a "pretty picture" or an image showing a multitude
of colors illustrating various features of the underlying terrain, but it is quite
useless unless we know what the colors mean.
Digital image classification techniques group pixels to represent land cover features.
Land cover could be forested, urban, agricultural and other types of features. There
are three main image classification techniques.
Image Classification Techniques in Remote Sensing:
 Unsupervised image classification
 Supervised image classification
 Object-based image analysis
Pixels are the smallest unit represented in an image. Image classification uses the
reflectance statistics for individual pixels. Unsupervised and supervised image
classification techniques are the two most common approaches. However, object-
based classification has been breaking more ground as of late.
What is the difference between supervised and unsupervised
classification?
Unsupervised Classification
Fig: Unsupervised Classification Example
Pixels are grouped based on the reflectance properties of pixels. These groupings are
called “clusters”. The user identifies the number of clusters to generate and which
bands to use. With this information, the image classification software generates
clusters. There are different image clustering algorithms such as K-means and
ISODATA.
The user manually identifies each cluster with land cover classes. It’s often the case
that multiple clusters represent a single land cover class. The user merges clusters into
a land cover type. The unsupervised classification image classification technique is
commonly used when no sample sites exist.
Unsupervised Classification Steps:
 Generate clusters
 Assign classes
Fig: Unsupervised Classification Diagram
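A minimal sketch of the clustering step using K-means (Python with NumPy and scikit-learn); the image array and cluster count here are assumptions for illustration, and assigning land cover names to clusters is still done by the user afterwards:

```python
import numpy as np
from sklearn.cluster import KMeans

def unsupervised_classify(image, n_clusters=5):
    """Cluster the pixels of a (rows, cols, bands) image by their band values."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands)          # one row of band values per pixel
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(pixels)
    return labels.reshape(rows, cols)          # a cluster id for every pixel

demo = np.random.rand(100, 100, 3)             # synthetic 3-band stand-in image
clusters = unsupervised_classify(demo, n_clusters=4)
```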
Supervised Classification
Fig: Supervised Classification Example: IKONOS
The user selects representative samples for each land cover class in the digital image.
These sample land cover classes are called “training sites”. The image classification
software uses the training sites to identify the land cover classes in the entire image.
The classification of land cover is based on the spectral signatures defined in the
training set. The digital image classification software assigns each pixel to the
class it most resembles in the training set. Common supervised classification
algorithms are maximum likelihood and minimum-distance classification.
Supervised Classification Steps:
 Select training areas
 Generate signature file
 Classify
Fig: Supervised Classification Diagram
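A minimal sketch of the minimum-distance variant in pure NumPy; the class-mean signatures below are hypothetical stand-ins for statistics derived from real training sites:

```python
import numpy as np

def minimum_distance_classify(image, class_means):
    """Assign each pixel of a (rows, cols, bands) image to the nearest class mean."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, 1, bands)
    dists = np.linalg.norm(pixels - class_means[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(rows, cols)

# hypothetical 3-band mean signatures for water, vegetation and bare soil
means = np.array([[0.05, 0.08, 0.03],
                  [0.04, 0.10, 0.40],
                  [0.20, 0.25, 0.30]])
labels = minimum_distance_classify(np.random.rand(50, 50, 3), means)
```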
Object-Based (or Object-Oriented) Image Analysis Classification
Fig: Object-based Classification
Traditional pixel-based processing generates square classified pixels. Object-based
image classification is very different in that it generates objects of different shape and
scale. This process is called multi-resolution segmentation.
Multiresolution segmentation produces homogeneous image objects by grouping
pixels. Objects are generated at different scales in an image simultaneously. These
objects are more meaningful than those of traditional pixel-based segmentation
because they can be classified based on texture, context and geometry.
Object-based image analysis supports the use of multiple bands for multiresolution
segmentation and classification. For example, infrared, elevation or existing shape files
can simultaneously be used to classify image objects. Multiple layers can have context
with each other. This context comes in the form of neighborhood relationships,
proximity and distance between layers.
Nearest neighbor (NN) classification is similar to supervised classification. After multi-
resolution segmentation, the user identifies sample sites for each land cover class. The
statistics to classify image objects are defined. The object based image analysis
software then classifies objects based on their resemblance to the training sites and
the statistics defined.
Fig: Object-Based Classification Diagram
Object-Based Nearest Neighbor Classification Steps:
 Perform multiresolution segmentation
 Select training areas
 Define statistics
 Classify
Pixels
A digital image consists of a two-dimensional array of individual picture elements
called pixels, arranged in columns and rows. Each pixel represents an area on the
Earth's surface. A pixel has an intensity value and a location address in the two-
dimensional image.
The intensity value represents the measured physical quantity such as the solar
radiance in a given wavelength band reflected from the ground, emitted infrared
radiation or backscattered radar intensity. This value is normally the average value for
the whole ground area covered by the pixel.
The intensity of a pixel is recorded as a digital number. Due to the finite storage
capacity, a digital number is stored with a finite number of bits (binary digits).
The number of bits determines the radiometric resolution of the image. For example,
an 8-bit digital number ranges from 0 to 255 (i.e. 2⁸ − 1), while an 11-bit digital
number ranges from 0 to 2047 (i.e. 2¹¹ − 1). The detected intensity value needs to
be scaled and quantized to fit within this range of values. In a radiometrically
calibrated image, the actual intensity value can be derived from the pixel's digital
number.
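The scaling and quantization step can be sketched as follows (NumPy); the radiance range is an assumed calibration, not from the text:

```python
import numpy as np

def to_digital_number(radiance, r_min, r_max, bits=8):
    """Scale physical radiance into an n-bit digital number, 0 .. 2**bits - 1."""
    levels = 2 ** bits - 1
    scaled = (radiance - r_min) / (r_max - r_min)
    return np.clip(np.round(scaled * levels), 0, levels).astype(np.uint16)

radiance = np.array([0.0, 37.5, 75.0, 150.0])            # assumed radiance values
print(to_digital_number(radiance, 0.0, 150.0))           # [  0  64 128 255]
print(to_digital_number(radiance, 0.0, 150.0, bits=11))  # [   0  512 1024 2047]
```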
The address of a pixel is denoted by its row and column coordinates in the two-
dimensional image. There is a one-to-one correspondence between the column-row
address of a pixel and the geographical coordinates (e.g. Longitude, latitude) of the
imaged location. In order to be useful, the exact geographical location of each pixel on
the ground must be derivable from its row and column indices, given the imaging
geometry and the satellite orbit parameters.
Digital number
In remote sensing systems, a digital number is a variable assigned to a pixel,
usually in the form of a binary integer in the range 0-255 (i.e. a byte): the range
of energies examined by the system is broken into 256 bins. A single pixel may have
several digital number variables corresponding to the different bands recorded.
Digital Image
A digital image is a two-dimensional array of pixels. Each pixel has an intensity value
(represented by a digital number) and a location address (referenced by its row and
column numbers).
Analog Images
Data collected in hard-copy pictorial records are termed photographs. Photographs
are recorded two-dimensionally on photosensitive emulsions that are either reflective
(such as in a print) or transmissive (such as a slide). In remote sensing, the term
photograph is reserved exclusively for images that were detected as well as recorded
on film. The more generic term image is used for any pictorial representation of image
data. Because the term image relates to any pictorial product, all photographs are
images, but not all images are photographs.
Image noise
Image noise is random (not present in the object imaged) variation of brightness or
color information in images, and is usually an aspect of electronic noise. It can be
produced by the sensor and circuitry of a scanner or digital camera. Image noise can
also originate in film grain and in the unavoidable shot noise of an ideal photon
detector. Image noise is an undesirable by-product of image capture that adds
spurious and extraneous information.
The original meaning of "noise" was and remains "unwanted signal"; unwanted
electrical fluctuations in signals received by AM radios caused audible acoustic noise
("static"). By analogy unwanted electrical fluctuations themselves came to be known
as "noise". Image noise is, of course, inaudible.
The magnitude of image noise can range from almost imperceptible specks on a digital
photograph taken in good light, to optical and radio astronomical images that are
almost entirely noise, from which a small amount of information can be derived by
sophisticated processing (a noise level that would be totally unacceptable in a
photograph since it would be impossible to determine even what the subject was).
Multilayer Image
Several types of measurement may be made of the ground area covered by a single
pixel. Each type of measurement forms an image which carries some specific
information about the area. By "stacking" these images from the same area together,
a multilayer image is formed. Each component image is a layer in the multilayer
image.
Multilayer images can also be formed by combining images obtained from different
sensors, and other subsidiary data. For example, a multilayer image may consist of
three layers from a SPOT multispectral image, a layer of ERS synthetic aperture
radar image, and perhaps a layer consisting of the digital elevation map of the area
being studied.
fig: An illustration of a multilayer image consisting of five component layers.
Multispectral Image
A multispectral image consists of a few image layers, each of which represents an
image acquired in a particular wavelength band. For example, the SPOT HRV sensor
operating in the multispectral mode detects radiation in three wavelength bands:
the green (500 - 590 nm), red (610 - 680 nm) and near infrared (790 - 890 nm) bands.
A single SPOT multispectral scene consists of three intensity images in the three
wavelength bands. In this case, each pixel of the scene has three intensity values
corresponding to the three bands.
A multispectral IKONOS image consists of four bands: Blue, Green, Red and Near
Infrared, while a landsat TM multispectral image consists of seven bands: blue, green,
red, near-IR bands, two SWIR bands, and a thermal IR band.
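In code, such a scene is naturally held as a three-dimensional array; a minimal sketch with synthetic stand-in bands:

```python
import numpy as np

# three single-band intensity images of the same scene (synthetic stand-ins)
green = np.random.rand(200, 200)
red   = np.random.rand(200, 200)
nir   = np.random.rand(200, 200)

scene = np.stack([green, red, nir], axis=-1)  # shape (rows, cols, bands)
print(scene.shape)      # (200, 200, 3)
print(scene[120, 45])   # the three intensity values of a single pixel
```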
Superspectral Image
The more recent satellite sensors are capable of acquiring images at many more
wavelength bands. For example, the MODIS sensor on-board the NASA's TERRA
satellite consists of 36 spectral bands, covering the wavelength regions ranging from
the visible, near infrared, short-wave infrared to the thermal infrared. The bands have
narrower bandwidths, enabling the finer spectral characteristics of the targets to be
captured by the sensor. The term "superspectral" has been coined to describe such
sensors.
Hyperspectral Image
A hyperspectral image consists of about a hundred or more contiguous spectral bands.
The characteristic spectrum of the target pixel is acquired in a hyperspectral image.
The precise spectral information contained in a hyperspectral image enables better
characterisation and identification of targets. Hyperspectral images have potential
applications in such fields as precision agriculture (e.g. monitoring the types, health,
moisture status and maturity of crops), coastal management (e.g. monitoring of
phytoplanktons, pollution, bathymetry changes).
Currently, hyperspectral imagery is not commercially available from satellites. There
are experimental satellite sensors that acquire hyperspectral imagery for scientific
investigation (e.g. NASA's Hyperion sensor on-board the EO1 satellite, and the
CHRIS sensor onboard ESA's PROBA satellite).
Panchromatic image
A single band image generally displayed as shades of gray. Panchromatic emulsion is
a type of black-and-white photographic emulsion that is sensitive to
all wavelengths of visible light. Panchromatic film is a normal black-and-white film
which you can buy in the store and is what your grandparents probably used. It is called
panchromatic because it is sensitive to light of all visible colors (electromagnetic
wavelengths approximately 0.4 µm to 0.7 µm). Objects of different colors are
distinguishable in panchromatic film, with grayscale tones closely approximating the
brightness of the photograph's subject. Panchromatic film is typically not as sensitive
to subtle variation in green light as some other colors, making it difficult to
differentiate between similar vegetation types. To reduce atmospheric haze, a yellow
filter is normally used. For these images the filters believed to have been used were a
K2 (Wratten No. 8) and #12 deep amber (minus blue member of the cyan-magenta-
yellow set of subtractive primaries).
NDVI (Normalized Difference Vegetation Index)
The normalized difference vegetation index (NDVI) is a simple graphical indicator that
can be used to analyze remote sensing measurements, typically but not necessarily
from a space platform, and assess whether the target being observed contains live
green vegetation or not.
Remote sensing phenology studies use data gathered by satellite sensors that measure
wavelengths of light absorbed and reflected by green plants. Certain pigments in plant
leaves strongly absorb wavelengths of visible (red) light. The leaves themselves
strongly reflect wavelengths of near-infrared light, which is invisible to human eyes.
As a plant canopy changes from early spring growth to late-season maturity and
senescence, these reflectance properties also change.
Many sensors carried aboard satellites measure red and near-infrared light waves
reflected by land surfaces. Using mathematical formulas (algorithms), scientists
transform raw satellite data about these light waves into vegetation indices. A
vegetation index is an indicator that describes the greenness — the relative density
and health of vegetation — for each picture element, or pixel, in a satellite image.
Although there are several vegetation indices, one of the most widely used is the
Normalized Difference Vegetation Index (NDVI). NDVI values range from +1.0 to -1.0.
Areas of barren rock, sand, or snow usually show very low NDVI values (for example,
0.1 or less). Sparse vegetation such as shrubs and grasslands or senescing crops may
result in moderate NDVI values (approximately 0.2 to 0.5). High NDVI values
(approximately 0.6 to 0.9) correspond to dense vegetation such as that found in
temperate and tropical forests or crops at their peak growth stage.
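NDVI is computed as (NIR − Red) / (NIR + Red). A minimal NumPy sketch; the reflectance values below are illustrative assumptions:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), bounded between -1 and +1."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / denom)  # guard against 0/0

nir_band = np.array([[0.50, 0.40], [0.10, 0.30]])  # assumed NIR reflectance
red_band = np.array([[0.08, 0.10], [0.09, 0.25]])  # assumed red reflectance
print(ndvi(nir_band, red_band))  # high over dense vegetation, low over bare ground
```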
By transforming raw satellite data into NDVI values, researchers can create images and
other products that give a rough measure of vegetation type, amount, and condition
on land surfaces around the world. NDVI is especially useful for continental- to global-
scale vegetation monitoring because it can compensate for changing illumination
conditions, surface slope, and viewing angle. That said, NDVI does tend to saturate
over dense vegetation and is sensitive to underlying soil color.
NDVI values can be averaged over time to establish "normal" growing conditions in a
region for a given time of year. Further analysis can then characterize the health of
vegetation in that place relative to the norm. When analyzed through time, NDVI can
reveal where vegetation is thriving and where it is under stress, as well as changes in
vegetation due to human activities such as deforestation, natural disturbances such as
wild fires, or changes in plants' phenological stage.
UN-SPIDER
In its resolution 61/110 of 14 December 2006 the United Nations General Assembly
agreed to establish the "United Nations Platform for Space-based Information for
Disaster Management and Emergency Response - UN-SPIDER" as a new United
Nations programme, with the following mission statement: "Ensure that all countries
and international and regional organizations have access to and develop the capacity
to use all types of space-based information to support the full disaster management
cycle".
Whereas there have been a number of initiatives in recent years that have contributed
to making space technologies available for humanitarian and emergency response,
UN-SPIDER is the first to focus on the need to ensure access to and use of such
solutions during all phases of the disaster management cycle, including the risk
reduction phase, which will significantly contribute to reducing the loss of lives and
property.
The UN-SPIDER programme is achieving this by being a gateway to space information
for disaster management support, by serving as a bridge to connect the disaster
management and space communities and by being a facilitator of capacity-building
and institutional strengthening, in particular for developing countries.
UN-SPIDER is being implemented as an open network of providers of space-based
solutions to support disaster management activities. Besides Vienna (where UNOOSA
is located), the programme also has offices in Bonn, Germany and in Beijing, China.
Additionally, a network of Regional Support Offices multiplies the work of UN-SPIDER
in the respective regions.
Field of view
The field of view (also field of vision, abbreviated FOV) is the extent of the observable
world that is seen at any given moment. In the case of optical instruments or
sensors, it is a solid angle through which a detector is sensitive to electromagnetic
radiation.
IFOV (INSTANTANEOUS FIELD OF VIEW)
A measure of the spatial resolution of a remote sensing imaging system. Defined as
the angle subtended by a single detector element on the axis of the optical system.
IFOV has the following attributes:
 Solid angle through which a detector is sensitive to radiation.
 The IFOV and the distance from the target determines the spatial resolution. A
low altitude imaging instrument will have a higher spatial resolution than a
higher altitude instrument with the same IFOV.
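For small angles, the ground-projected resolution is simply IFOV × altitude. A minimal sketch; the IFOV value is of the order used by the Landsat MSS, quoted here purely as an illustration:

```python
def ground_resolution_m(ifov_rad, altitude_m):
    """Ground-projected size of one detector element at nadir: ~ IFOV * H."""
    return ifov_rad * altitude_m

# an IFOV of 0.086 mrad viewed from 900 km gives a pixel of roughly 77 m
print(ground_resolution_m(0.086e-3, 900e3))
```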
Swath width
Swath width refers to the strip of the Earth's surface from which geographic data are
collected by a moving vehicle such as a satellite, aircraft or ship in the course of
swath mapping. The longitudinal extent of the swath is defined by the motion of the
vehicle with respect to the surface, whereas the swath width is measured
perpendicular to the longitudinal extent of the swath. Note that swath width is also
cited in the context of sonar specifications. It is measured and represented in
metres or kilometres.
Fig: swath
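For a nadir-pointing scanner, the swath width follows directly from the total field of view and the platform altitude. A minimal flat-Earth sketch; the FOV and altitude figures are illustrative assumptions.

```python
import math

def swath_width_km(fov_deg, altitude_km):
    """Swath width for a nadir-pointing scanner over a flat Earth:
    swath = 2 * altitude * tan(FOV / 2). Earth curvature, which
    widens the swath slightly for wide FOVs, is ignored here."""
    return 2 * altitude_km * math.tan(math.radians(fov_deg) / 2)

# Illustrative: a 15-degree total FOV from 705 km altitude gives a
# swath of the same order as the familiar 185 km Landsat swath.
print(round(swath_width_km(15.0, 705), 1))  # ~185.6 km
```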
Nadir
The nadir is the direction pointing directly below a particular location; that is, it is one
of two vertical directions at a specified location, orthogonal to a horizontal flat surface
there. Since the concept of being below is itself somewhat vague, scientists define the
nadir in more rigorous terms. Specifically, in astronomy, geophysics and related
sciences (e.g., meteorology), the nadir at a given point is the local vertical direction
pointing in the direction of the force of gravity at that location. The direction opposite
of the nadir is the zenith.
Nadir also refers to the downward-facing viewing geometry of an
orbiting satellite, such as is employed during remote sensing of the atmosphere, as
well as when an astronaut faces the Earth while performing a spacewalk.
The word is also used figuratively to mean the lowest point of a person's spirits, or the
quality of an activity or profession.
Fig: This diagram depicts a satellite observing backscattered sunlight in the nadir
viewing geometry.
The term nadir can also be used to represent the lowest point reached by a celestial
body during its apparent orbit around a given point of observation. This can be used
to describe the location of the Sun, but it is only technically accurate for one latitude at
a time and only possible at low latitudes (between the tropics). The sun is said to be at the nadir at a
location when it is at the zenith at the location's antipode.
In oncology, the term nadir is used to represent the lowest level of a blood cell count
while a patient is undergoing chemotherapy. The neutropenic nadir after
chemotherapy typically lasts 7–10 days.
Fig: Diagram showing the relationship between the zenith, the nadir, and different types of horizon.
The nadir is opposite the zenith.
Image Processing and Analysis
Many image processing and analysis techniques have been developed to aid the
interpretation of remote sensing images and to extract as much information as
possible from the images. The choice of specific techniques or algorithms to use
depends on the goals of each individual project. In this section, we will examine some
procedures commonly used in analysing/interpreting remote sensing images.
Pre-Processing
Prior to data analysis, initial processing on the raw data is usually carried out to correct
for any distortion due to the characteristics of the imaging system and imaging
conditions. Depending on the user's requirement, some standard correction
procedures may be carried out by the ground station operators before the data is
delivered to the end-user. These procedures include radiometric correction to correct
for uneven sensor response over the whole image and geometric correction to correct
for geometric distortion due to Earth's rotation and other imaging conditions (such as
oblique viewing). The image may also be transformed to conform to a specific map
projection system. Furthermore, if accurate geographical location of an area on the
image needs to be known, ground control points (GCPs) are used to register the image
to a precise map (geo-referencing), as in the sketch below.
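The simplest form of GCP-based registration is a least-squares affine transform from image coordinates to map coordinates. A minimal numpy sketch; the pixel and map coordinates are hypothetical, and real geo-referencing often uses higher-order polynomials or rigorous sensor models.

```python
import numpy as np

def fit_affine(pixel_xy, map_xy):
    """Least-squares affine transform from image (col, row) to map (E, N).

    Solves X = a*x + b*y + c and Y = d*x + e*y + f from ground control
    points. At least 3 non-collinear GCPs are required; more GCPs give
    a least-squares fit whose residuals indicate registration quality.
    """
    px = np.asarray(pixel_xy, float)
    mp = np.asarray(map_xy, float)
    G = np.column_stack([px, np.ones(len(px))])      # design matrix
    coeffs, *_ = np.linalg.lstsq(G, mp, rcond=None)  # 3x2 parameter matrix
    return coeffs

# Hypothetical GCPs: pixel coordinates vs. UTM-like map coordinates
pix = [(10, 10), (500, 15), (20, 480), (510, 470)]
utm = [(300100, 600950), (305000, 600900), (300200, 596250), (305100, 596200)]
A = fit_affine(pix, utm)
print(np.array([100, 100, 1]) @ A)   # map position of pixel (100, 100)
```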
Image Enhancement
Enhancements are used to make it easier for visual interpretation and
understanding of imagery. The advantage of digital imagery is that it allows us to
manipulate the digital pixel values in an image. Although radiometric corrections
for illumination, atmospheric influences, and sensor characteristics may be done
prior to distribution of data to the user, the image may still not be optimized for
visual interpretation. Remote sensing devices, particularly those operated from
satellite platforms, must be designed to cope with levels of target/background
energy which are typical of all conditions likely to be encountered in routine use.
With large variations in spectral response from a diverse range of targets (e.g.
forest, deserts, snowfields, water, etc.) no generic radiometric correction could
optimally account for and display the optimum brightness range and contrast for all
targets. Thus, for each application and each image, a custom adjustment of the
range and distribution of brightness values is usually necessary.
Multispectral SPOT image of the same area shown in a previous section, but acquired
at a later date. Radiometric and geometric corrections have been done. The image
has also been transformed to conform to a certain map projection (UTM projection).
This image is displayed without any further enhancement.
In the above unenhanced image, a bluish tint can be seen all over the image, producing
a hazy appearance. This haze is due to scattering of sunlight by the atmosphere
into the field of view of the sensor. This effect also degrades the contrast
between different land covers.
It is useful to examine the image histograms before performing any image
enhancement. The x-axis of the histogram is the range of the available digital numbers,
i.e. 0 to 255. The y-axis is the number of pixels in the image having a given digital
number. The histograms of the three bands of this image are shown in the following
figures.
Histogram of the XS3 (near infrared) band (displayed in red). Histogram of the XS2
(red) band (displayed in green). Histogram of the XS1 (green) band (displayed in blue).
Note that the minimum digital number for each band is not zero. Each histogram is
shifted to the right by a certain amount. This shift is due to the atmospheric scattering
component adding to the actual radiation reflected from the ground. The shift is
particularly large for the XS1 band compared to the other two bands due to the higher
contribution from Rayleigh scattering at the shorter wavelength.
The maximum digital number of each band is also not 255. The sensor's gain factor has
been adjusted to anticipate any possibility of encountering a very bright object. Hence,
most of the pixels in the image have digital numbers well below the maximum value
of 255.
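The histogram offset just described is the basis of the classic dark-object subtraction haze correction: the darkest pixels are assumed to owe essentially all their signal to atmospheric path radiance, and that value is subtracted band by band. A minimal sketch under that assumption; the simulated DN values are illustrative.

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.1):
    """Crude haze correction: treat the darkest pixels' DN as pure
    atmospheric path radiance and subtract it from the whole band.

    Using a low percentile rather than the absolute minimum makes the
    estimate robust to the odd noisy pixel. This is a first-order fix
    only; it ignores wavelength-dependent transmittance effects."""
    offset = np.percentile(band, percentile)
    return np.clip(band - offset, 0, None)

# Simulated band: true signal in DN 20..119 plus a haze offset of 35
rng = np.random.default_rng(0)
band = rng.integers(20, 120, size=(100, 100)) + 35
corrected = dark_object_subtraction(band)
print(band.min(), corrected.min())   # offset removed: 55 -> 0
```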
The image can be enhanced by a simple linear grey-level stretching. In this method, a
lower threshold value is chosen so that all pixel values below this threshold are mapped
to zero. An upper threshold value is also chosen so that all pixel values above this
threshold are mapped to 255. All other pixel values are linearly interpolated to lie
between 0 and 255. The lower and upper thresholds are usually chosen to be values
close to the minimum and maximum pixel values of the image. The Grey-Level
Transformation Table is shown in the following graph.
Grey-Level Transformation Table for performing linear grey level stretching
of the three bands of the image. Red line: XS3 band; Green line: XS2 band;
Blue line: XS1 band.
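A minimal numpy sketch of the linear stretch just described. The fallback to the band's own min/max, and the suggestion of percentile-based thresholds, are common practical choices rather than prescriptions from the text.

```python
import numpy as np

def linear_stretch(band, lower=None, upper=None):
    """Linear grey-level stretch to the full 0-255 range.

    Pixels below `lower` map to 0, pixels above `upper` map to 255,
    and values in between are linearly interpolated. If thresholds
    are not given, the band's own min/max are used; values near the
    histogram ends (e.g. the 2nd and 98th percentiles) are a common
    practical choice."""
    band = band.astype(np.float64)
    lo = band.min() if lower is None else lower
    hi = band.max() if upper is None else upper
    stretched = (band - lo) / (hi - lo) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

# Example: a band occupying only DN 55..180 is expanded to 0..255
band = np.linspace(55, 180, 256).reshape(16, 16)
out = linear_stretch(band)
print(out.min(), out.max())   # 0 255
```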
The result of applying the linear stretch is shown in the following image. Note that
the hazy appearance has generally been removed, except for some parts near the
top of the image. The contrast between different features has been improved.
Multispectral SPOT image after enhancement by a simple linear grey-level stretching.
Passive Remote Sensing
Remote sensing systems which measure energy that is naturally available are
called passive sensors. Passive sensors can only be used to detect energy when
the naturally occurring energy is available.
Every day the Earth’s surface is hit by incoming electromagnetic (EM) radiation
from the sun. This EM radiation is known as incident energy (Ei).
There are three fundamental energy interactions with incident energy:
1) reflected energy (Er)
2) absorbed energy (Ea)
3) transmitted energy (Et)
Incident Energy Formula:
Ei = Er + Ea + Et
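As a quick numeric illustration of the balance (the figures are arbitrary):

```python
# A worked instance of Ei = Er + Ea + Et (all terms in the same units):
Ei = 100.0          # incident energy
Ea = 30.0           # absorbed
Et = 20.0           # transmitted
Er = Ei - Ea - Et   # reflected, by conservation of energy
print(Er, Er / Ei)  # 50.0 reflected, i.e. 50% reflectance
```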
Active Remote Sensing
The sensor emits radiation which is directed toward the target to be
investigated. The radiation reflected from that target is detected and
measured by the sensor.
Active remote sensing can be done at any time of the day.
Active remote Sensors create their own electromagnetic energy that
• is transmitted from the sensor toward the terrain
• interacts with the terrain producing a backscatter of energy
• is recorded by the remote sensor’s receiver.
In active remote sensing, radiation-emitting devices are used to produce
electromagnetic waves that ‘artificially’ illuminate the Earth's surface, much as
one would turn on a light to see when it is too dark.
Active vs. passive sensors:
 Energy source: active sensors use an artificial energy source; passive sensors use a
natural energy source.
 Operating time: active sensing works at any time of day; passive sensing of reflected
sunlight works only during the day.
 Wavelengths: active sensors transmit at the wavelengths the task requires; passive
sensors measure visible, infrared, ultraviolet and other solar radiation reflected from
objects.
 Cost: active systems are generally high cost; passive systems are low cost.
 Example imagery: LiDAR (active); Landsat (passive).
Image resolution
Digital images are made up of thousands of pixels (blocks of colour), and the number
of pixels in the image will determine how high the image's resolution is.
The image's resolution is the number of pixels divided by the size at which it is viewed:
if the image is 720 pixels wide by 720 pixels tall and it is viewed at 10 inches
square, it has a resolution of 72 pixels per inch (the inch-based unit is a convention
inherited from the printing industry).
Resolution refers to the number of pixels in an image. Resolution is sometimes
identified by the width and height of the image as well as the total number of pixels in
the image. For example, an image that is 2048 pixels wide and 1536 pixels high
(2048 × 1536) contains 2048 × 1536 = 3,145,728 pixels (about 3.1 megapixels).
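These two calculations are trivial but worth making explicit; a small sketch reproducing the numbers above:

```python
def pixels_per_inch(width_px, display_in):
    """Display resolution in pixels per inch (PPI)."""
    return width_px / display_in

def megapixels(width_px, height_px):
    """Total pixel count in megapixels."""
    return width_px * height_px / 1e6

print(pixels_per_inch(720, 10))   # 72.0 PPI, the example above
print(megapixels(2048, 1536))     # ~3.1 MP (3,145,728 pixels)
```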
Advantage of high resolution image
Sub-meter (VHR) resolution satellites provide users a smaller ground sampling
distance (GSD) than other imaging satellites, making them more suitable for reliable
Earth observation, site monitoring, object identification and many other tasks
requiring precision data.
Whether natural or man-made, objects and landscapes may be examined and
interpreted using very high spatial resolution satellite imagery. Among its many
advantages, VHR images enable identification of specific models of vehicles, vessels
and airplanes, as well as structural changes, consequences of natural disasters and
monitoring of operations – whether of industrial facilities or farm livestock. This
versatility makes EROS imagery a powerful resource for civilian and military needs.
Sub-half-meter imagery allows smaller features to be detected and identified more
accurately. This aids automated land use/land cover mapping and improves change
monitoring by revealing smaller changes. Examples: mapping, natural resource
management, urban planning, emergency response.
Location-based Services
With enhanced resolution capabilities, companies in the Location Based Services (LBS)
industry will be able to provide their customers and portal users with crisper, clearer
base map images, allowing for better recognition of features.
Example Applications:
> Automobile and personal navigation mapping
> Fleet management
> Nautical navigation
> Online portals
Energy
A wide range of energy-specific industries will be able to take advantage of our higher
resolution capabilities. From more accurate monitoring of pipeline assets or
construction areas to higher levels of detail and accuracy in volumetric calculations,
our enhanced imagery completeness will help you do your job more efficiently and
effectively.
Example Applications:
> Oil & Gas facility and resource monitoring
> Drilling and resource exploration
> Emergency management and response
> Industry production information gathering
Disadvantage of low-resolution images
Conversely, low-resolution imagery has a larger ground sampling distance, so small
features cannot be reliably detected or identified, change monitoring is coarser, and
maps derived from it are less accurate.
Real-time data
Real-time data denotes information that is delivered immediately after collection.
There is no delay in the timeliness of the information provided. Real-time data is often
used for navigation or tracking. Some uses of this term confuse it with the term
dynamic data.
Solid angle
In geometry, a solid angle (symbol: Ω) is the two-dimensional angle in three-
dimensional space that an object subtends at a point. It is a measure of how large the
object appears to an observer looking from that point. In the International System of
Units (SI), a solid angle is expressed in a dimensionless unit called a steradian (symbol:
sr).
A small object nearby may subtend the same solid angle as a larger object farther
away. For example, although the Moon is much smaller than the Sun, it is also much
closer to Earth. Indeed, as viewed from any point on Earth, both objects have
approximately the same solid angle as well as apparent size. This is evident during a
solar eclipse.
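For a circular object of angular radius θ, the solid angle of the subtended cone is Ω = 2π(1 − cos θ). A short sketch using approximate mean angular radii for the Sun and Moon (both vary with orbital distance, which the numbers here ignore):

```python
import math

def cone_solid_angle(half_angle_deg):
    """Solid angle (steradians) of a cone: omega = 2*pi*(1 - cos(theta))."""
    theta = math.radians(half_angle_deg)
    return 2 * math.pi * (1 - math.cos(theta))

# Approximate mean angular radii as seen from Earth; the variation
# with distance is why eclipses can be total or annular.
print(f"{cone_solid_angle(0.267):.2e} sr  (Sun)")   # ~6.8e-05 sr
print(f"{cone_solid_angle(0.259):.2e} sr  (Moon)")  # ~6.4e-05 sr
```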
Look Angle
The elevation and azimuth at which a particular satellite is predicted to be found at a
specified time.
In radar terminology, the look angle is the angle between the vertical plane passing
through the radar antenna and the line between the antenna and the object.
Look angles are required so that the earth station antenna points at, or “looks at,” the
satellite directly. From a location on earth, the two look angles needed are azimuth and
elevation. Since XTAR satellites are geostationary, these look angles are constant for
fixed ground antennas.
The look angle calculator determines the azimuth and elevation angles for XTAR
satellites from the input earth station latitude (°N) and longitude (°E). The geometry
behind such a tool is sketched below.
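The calculation can be sketched with standard spherical-Earth geostationary geometry. This is a simplified illustration, not XTAR's implementation; the station and satellite coordinates in the example are illustrative.

```python
import math

R_E, R_GEO = 6378.137, 42164.0   # km: Earth radius, geostationary orbit radius

def look_angles(sta_lat, sta_lon, sat_lon):
    """Azimuth/elevation (degrees) from a ground station to a
    geostationary satellite, assuming a spherical Earth and a satellite
    exactly over the equator. Good to a fraction of a degree, which is
    enough for pointing a dish but not for precision work."""
    phi = math.radians(sta_lat)
    dL = math.radians(sat_lon - sta_lon)
    beta = math.acos(math.cos(phi) * math.cos(dL))      # central angle
    el = math.degrees(math.atan2(math.cos(beta) - R_E / R_GEO,
                                 math.sin(beta)))        # below 0: not visible
    az = (180.0 - math.degrees(math.atan2(math.tan(dL),
                                          math.sin(phi)))) % 360.0
    return az, el

# Illustrative: a station at 40N, 4W looking at a satellite at 29E
# (the XTAR-EUR slot mentioned below)
az, el = look_angles(40.0, -4.0, 29.0)
print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg")  # ~134.7, ~32.7
```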
Look Angle Calculator
To use for XTAR satellites, make the following
selections and entries:
 Select either:
o 29E XTAR-EUR
o 30W HISPASAT 1C (aka XTAR-LANT
aka SpainSat)
 Select your location by either interactively clicking on the map or by entering
your address into the dropdown. Alternatively, your latitude (°N) and
longitude (°E) can be entered into this dropdown in decimal degrees
(separated with a comma; do not use parentheses)
 The resulting azimuth and elevation look angles appear interactively at the
base of the map.
 The magnetic Azimuth is also given for compass reference.
 The LNB skew is not used for XTAR since our satellites use circular
polarization.
 The Options box for “show obstacle” can be checked to display the maximum
height, h and minimum distance, d that an obstacle can have without blocking
the line of sight from your location to the satellite.
Spectral Signature In Remote Sensing
Features on the Earth reflect, absorb, transmit, and emit electromagnetic energy from the
sun. Special digital sensors have been developed to measure all types of electromagnetic
energy as it interacts with objects in all of the ways listed above. The ability of sensors to
measure these interactions allows us to use remote sensing to measure features and changes
on the Earth and in our atmosphere. A measurement of energy commonly used in remote
sensing of the Earth is reflected energy (e.g., visible light, near-infrared, etc.) coming from
land and water surfaces. The amount of energy reflected from these surfaces is usually
expressed as a percentage of the amount of energy striking the objects. Reflectance is 100%
if all of the light striking an object bounces off and is detected by the sensor. If none of the
light returns from the surface, reflectance is said to be 0%. In most cases, the reflectance
value of each object for each area of the electromagnetic spectrum is somewhere between
these two extremes. Across any range of wavelengths, the percent reflectance values for
landscape features such as water, sand, roads, forests, etc. can be plotted and compared.
Such plots are called “spectral response curves” or “spectral signatures.” Differences among
spectral signatures are used to help classify remotely sensed images into classes of landscape
features since the spectral signatures of like features have similar shapes. The figure below
shows differences in the spectral response curves for healthy versus stressed sugar beet
plants.
The more detailed the spectral information recorded by a sensor, the more information
can be extracted from the spectral signatures. Hyperspectral sensors have much more
detailed signatures than multispectral sensors and thus provide the ability to detect more
subtle differences in aquatic and terrestrial features.
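One common way differences among spectral signatures drive classification is the spectral angle mapper, which compares the shape of a pixel's signature with a library of reference signatures. A minimal sketch; the four-band library values are hypothetical reflectances, not measured data.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between two spectral signatures; a smaller angle
    means more similar shape, regardless of overall brightness."""
    cos_t = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def classify(pixel, library):
    """Assign the class whose reference signature is nearest in angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Hypothetical 4-band reflectance signatures (green, red, NIR, SWIR):
library = {
    "water":      np.array([0.06, 0.04, 0.02, 0.01]),
    "vegetation": np.array([0.10, 0.06, 0.50, 0.25]),
    "bare soil":  np.array([0.15, 0.20, 0.28, 0.32]),
}
print(classify(np.array([0.09, 0.05, 0.45, 0.22]), library))  # vegetation
```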

Contenu connexe

Tendances

Image interpretation keys & image resolution
Image interpretation keys & image resolutionImage interpretation keys & image resolution
Image interpretation keys & image resolutionPramoda Raj
 
Basics of Remote Sensing
Basics of Remote SensingBasics of Remote Sensing
Basics of Remote SensingAkash Tikhe
 
History of remote sensing
History of remote sensingHistory of remote sensing
History of remote sensingSumant Diwakar
 
Remote sensing - Sensors, Platforms and Satellite orbits
Remote sensing - Sensors, Platforms and Satellite orbitsRemote sensing - Sensors, Platforms and Satellite orbits
Remote sensing - Sensors, Platforms and Satellite orbitsAjay Singh Lodhi
 
Sensors for remote sensing
Sensors for remote sensingSensors for remote sensing
Sensors for remote sensingMohsin Siddique
 
Remote sensing principles-spectral signature-spectural range
Remote sensing principles-spectral signature-spectural rangeRemote sensing principles-spectral signature-spectural range
Remote sensing principles-spectral signature-spectural rangeMohsin Siddique
 
Remote Sensing PPT
Remote Sensing PPTRemote Sensing PPT
Remote Sensing PPTAmal Murali
 
Remote Sensing Platforms and Sensors
Remote Sensing Platforms and SensorsRemote Sensing Platforms and Sensors
Remote Sensing Platforms and SensorsUday kumar Devalla
 
Fundamentals of Remote Sensing
Fundamentals of Remote SensingFundamentals of Remote Sensing
Fundamentals of Remote SensingShah Naseer
 
Thermal Remote Sensing
Thermal Remote SensingThermal Remote Sensing
Thermal Remote SensingRohit Kumar
 
Spectral signatures
Spectral signaturesSpectral signatures
Spectral signaturesKunal Rathod
 
Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Shankar Gangaju
 
Types of aerial photographs
Types of aerial photographsTypes of aerial photographs
Types of aerial photographsHARITHANAIR15
 
Remote sensing ppt
Remote sensing pptRemote sensing ppt
Remote sensing pptcoolmridul92
 
Remote sensing
Remote sensingRemote sensing
Remote sensingKU Leuven
 
Spectral reflectance curve of dead stressed vegetation
Spectral reflectance curve of dead stressed vegetationSpectral reflectance curve of dead stressed vegetation
Spectral reflectance curve of dead stressed vegetationJunaid Ijaz
 
Thermal remote sensing and its applications
Thermal remote sensing and its applicationsThermal remote sensing and its applications
Thermal remote sensing and its applicationschandan00781
 
Types of scanners
Types of scannersTypes of scanners
Types of scannersPramoda Raj
 

Tendances (20)

Remote sensing
Remote sensingRemote sensing
Remote sensing
 
Image interpretation keys & image resolution
Image interpretation keys & image resolutionImage interpretation keys & image resolution
Image interpretation keys & image resolution
 
Basics of Remote Sensing
Basics of Remote SensingBasics of Remote Sensing
Basics of Remote Sensing
 
History of remote sensing
History of remote sensingHistory of remote sensing
History of remote sensing
 
Remote sensing - Sensors, Platforms and Satellite orbits
Remote sensing - Sensors, Platforms and Satellite orbitsRemote sensing - Sensors, Platforms and Satellite orbits
Remote sensing - Sensors, Platforms and Satellite orbits
 
Sensors for remote sensing
Sensors for remote sensingSensors for remote sensing
Sensors for remote sensing
 
Remote sensing principles-spectral signature-spectural range
Remote sensing principles-spectral signature-spectural rangeRemote sensing principles-spectral signature-spectural range
Remote sensing principles-spectral signature-spectural range
 
Remote Sensing PPT
Remote Sensing PPTRemote Sensing PPT
Remote Sensing PPT
 
Remote Sensing Platforms and Sensors
Remote Sensing Platforms and SensorsRemote Sensing Platforms and Sensors
Remote Sensing Platforms and Sensors
 
Fundamentals of Remote Sensing
Fundamentals of Remote SensingFundamentals of Remote Sensing
Fundamentals of Remote Sensing
 
Thermal Remote Sensing
Thermal Remote SensingThermal Remote Sensing
Thermal Remote Sensing
 
Spectral signatures
Spectral signaturesSpectral signatures
Spectral signatures
 
Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)
 
Types of aerial photographs
Types of aerial photographsTypes of aerial photographs
Types of aerial photographs
 
Remote sensing ppt
Remote sensing pptRemote sensing ppt
Remote sensing ppt
 
Remote sensing
Remote sensingRemote sensing
Remote sensing
 
Spectral reflectance curve of dead stressed vegetation
Spectral reflectance curve of dead stressed vegetationSpectral reflectance curve of dead stressed vegetation
Spectral reflectance curve of dead stressed vegetation
 
Thermal remote sensing and its applications
Thermal remote sensing and its applicationsThermal remote sensing and its applications
Thermal remote sensing and its applications
 
Digital Elevation Model (DEM)
Digital Elevation Model (DEM)Digital Elevation Model (DEM)
Digital Elevation Model (DEM)
 
Types of scanners
Types of scannersTypes of scanners
Types of scanners
 

Similaire à Principles of Remote Sensing

Role of electromagnetic Radiation in Remote Sensing
Role of electromagnetic Radiation in  Remote SensingRole of electromagnetic Radiation in  Remote Sensing
Role of electromagnetic Radiation in Remote SensingNzar Braim
 
Remote sensing
Remote sensingRemote sensing
Remote sensingGokul Saud
 
Introduction to remote sensing pt 1
Introduction to remote sensing pt 1Introduction to remote sensing pt 1
Introduction to remote sensing pt 1TafadzwaMarione
 
Introduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanIntroduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanRichmanWankie
 
detail information of advance total station and remote sensing
detail information of advance total station and remote sensingdetail information of advance total station and remote sensing
detail information of advance total station and remote sensingshaikh imran
 
Electromagnetic spectrum_lecture 1.pptx
Electromagnetic spectrum_lecture 1.pptxElectromagnetic spectrum_lecture 1.pptx
Electromagnetic spectrum_lecture 1.pptxmichelle894942
 
Concept of Remote sensing
Concept of Remote sensingConcept of Remote sensing
Concept of Remote sensingCivilhod15
 
F:\Electromagnetic Spectrum
F:\Electromagnetic SpectrumF:\Electromagnetic Spectrum
F:\Electromagnetic Spectrumjc_elprofe
 
.remote sensing.Ece 402 unit-2
.remote sensing.Ece 402 unit-2.remote sensing.Ece 402 unit-2
.remote sensing.Ece 402 unit-2Vandana Verma
 
Spectroscopy assign
Spectroscopy assignSpectroscopy assign
Spectroscopy assignKanwal Asif
 
Rs and gis lect 3-6.pdf
Rs and gis lect 3-6.pdf Rs and gis lect 3-6.pdf
Rs and gis lect 3-6.pdf AniruddhMane2
 
Surveying ii ajith sir class1
Surveying ii ajith sir class1Surveying ii ajith sir class1
Surveying ii ajith sir class1SHAMJITH KM
 
Wireless mobile charging using microwaves full report
Wireless mobile charging using microwaves full reportWireless mobile charging using microwaves full report
Wireless mobile charging using microwaves full reportAyswarya Ayshu P R
 
Basic of Remote Sensing
Basic of Remote SensingBasic of Remote Sensing
Basic of Remote Sensinggueste5cfed
 

Similaire à Principles of Remote Sensing (20)

Role of electromagnetic Radiation in Remote Sensing
Role of electromagnetic Radiation in  Remote SensingRole of electromagnetic Radiation in  Remote Sensing
Role of electromagnetic Radiation in Remote Sensing
 
Remote sensing
Remote sensingRemote sensing
Remote sensing
 
Chapter 2 RS.pptx
Chapter 2 RS.pptxChapter 2 RS.pptx
Chapter 2 RS.pptx
 
Introduction to remote sensing pt 1
Introduction to remote sensing pt 1Introduction to remote sensing pt 1
Introduction to remote sensing pt 1
 
2 Intro RS.pdf
2 Intro RS.pdf2 Intro RS.pdf
2 Intro RS.pdf
 
Introduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanIntroduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie Richman
 
L3 emr
L3 emrL3 emr
L3 emr
 
detail information of advance total station and remote sensing
detail information of advance total station and remote sensingdetail information of advance total station and remote sensing
detail information of advance total station and remote sensing
 
Electromagnetic spectrum_lecture 1.pptx
Electromagnetic spectrum_lecture 1.pptxElectromagnetic spectrum_lecture 1.pptx
Electromagnetic spectrum_lecture 1.pptx
 
Concept of Remote sensing
Concept of Remote sensingConcept of Remote sensing
Concept of Remote sensing
 
F:\Electromagnetic Spectrum
F:\Electromagnetic SpectrumF:\Electromagnetic Spectrum
F:\Electromagnetic Spectrum
 
.remote sensing.Ece 402 unit-2
.remote sensing.Ece 402 unit-2.remote sensing.Ece 402 unit-2
.remote sensing.Ece 402 unit-2
 
Spectroscopy assign
Spectroscopy assignSpectroscopy assign
Spectroscopy assign
 
remote sensing-converted.pptx
remote sensing-converted.pptxremote sensing-converted.pptx
remote sensing-converted.pptx
 
Rs and gis lect 3-6.pdf
Rs and gis lect 3-6.pdf Rs and gis lect 3-6.pdf
Rs and gis lect 3-6.pdf
 
Surveying ii ajith sir class1
Surveying ii ajith sir class1Surveying ii ajith sir class1
Surveying ii ajith sir class1
 
Wireless mobile charging using microwaves full report
Wireless mobile charging using microwaves full reportWireless mobile charging using microwaves full report
Wireless mobile charging using microwaves full report
 
Remote Sensing
Remote SensingRemote Sensing
Remote Sensing
 
Basic of Remote Sensing
Basic of Remote SensingBasic of Remote Sensing
Basic of Remote Sensing
 
Light
LightLight
Light
 

Dernier

2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17Celine George
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701bronxfugly43
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.pptRamjanShidvankar
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsMebane Rash
 
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-IIFood Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-IIShubhangi Sonawane
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...Poonam Aher Patil
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural ResourcesEnergy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural ResourcesShubhangi Sonawane
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin ClassesCeline George
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docxPoojaSen20
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 

Dernier (20)

2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan Fellows
 
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-IIFood Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural ResourcesEnergy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 

Principles of Remote Sensing

  • 1. আরিফু ল ইসলাম 1 Principles of Remote Sensing DSMHT-213 Ariful Islam Department of Disaster Science and Management Faculty of Earth and Environmental Science University of Dhaka
  • 2. আরিফু ল ইসলাম 2 What is Remote Sensing? "Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information." Remote sensing is the means of observing objects and their physical, optical and geometrical properties without touching these objects. The information is gathered through electromagnetic waves or lights of different wave length. The light falls on the object and scatters back to the sensor which is sensitive to the intensity of radiation. The sensor is photo sensitive and as soon as the light reflects back to the sensor a voltage is created which produces a signal that is saved in a storage device. In much of remote sensing, the process involves an interaction between incident radiation and the targets of interest. This is exemplified by the use of imaging systems where the following seven elements are involved. Note, however that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors. 1. Energy Source or Illumination (A) – the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.
  • 3. আরিফু ল ইসলাম 3 2. Radiation and the Atmosphere (B) – as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor. 3. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation. 4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation. 5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital). 6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated. 7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem. The Electromagnetic Spectrum
  • 4. আরিফু ল ইসলাম 4 The electromagnetic spectrum can be divided into several wavelength (frequency) regions, among which only a narrow band from about 400 to 700 nm is visible to the human eyes. Note that there is no sharp boundary between these regions. The boundaries shown in the above figures are approximate and there are overlaps between two adjacent regions. Wavelength units: 1 mm = 1000 µm; 1 µm = 1000 nm.  Radio Waves: 10 cm to 10 km wavelength. The short radio wave is reflected from ionosphere  Microwaves: 1 mm to 1 m wavelength. The microwaves are further divided into different frequency (wavelength) bands: (1 GHz = 109 Hz) o P band: 0.3 - 1 GHz (30 - 100 cm) o L band: 1 - 2 GHz (15 - 30 cm) o S band: 2 - 4 GHz (7.5 - 15 cm) o C band: 4 - 8 GHz (3.8 - 7.5 cm) o X band: 8 - 12.5 GHz (2.4 - 3.8 cm) o Ku band: 12.5 - 18 GHz (1.7 - 2.4 cm) o K band: 18 - 26.5 GHz (1.1 - 1.7 cm) o Ka band: 26.5 - 40 GHz (0.75 - 1.1 cm)  Infrared: 0.7 to 300 µm wavelength. This region is further divided into the following bands:  Near Infrared (NIR): 0.7 to 1.5 µm.  Short Wavelength Infrared (SWIR): 1.5 to 3 µm.  Mid Wavelength Infrared (MWIR): 3 to 8 µm.  Long Wanelength Infrared (LWIR): 8 to 15 µm.  Far Infrared (FIR): longer than 15 µm. The NIR and SWIR are also known as the Reflected Infrared, referring to the main infrared component of the solar radiation reflected from the earth's surface. The MWIR and LWIR are the Thermal Infrared.  Visible Light: Blue, green, and red are the primary colours or wavelengths of the visible spectrum. They are defined as such because no single primary colour can be created from the other two, but all other colours can be formed by combining blue, green, and red in various proportions. This narrow band of electromagnetic radiation extends from about 400 nm (violet) to about 700 nm (red). The various colour components of the visible spectrum fall roughly within the following wavelength regions:
  • 5. আরিফু ল ইসলাম 5 o Red: 610 - 700 nm o Orange: 590 - 610 nm o Yellow: 570 - 590 nm o Green: 500 - 570 nm o Blue: 450 - 500 nm o Indigo: 430 - 450 nm o Violet: 400 - 430 nm  Ultraviolet: 3 to 400 nm . Ultraviolet or UV portion of the spectrum has the shortest wavelengths which are practical for remote sensing. This radiation is just beyond the violet portion of the visible wavelengths  X-Rays and Gamma Rays Description of Electromagnetic Waves Electromagnetic waves are energy transported through space in the form of periodic disturbances of electric and magnetic fields. The waves are constitutes by Electric and magnetic vectors. The waves are sinusoidal in nature and is describes mathematically by the following expression. )sin( tkxAy  , where  2 k and T   2  and Tk ,,,  are wave number, wave length, angular frequency and T is the time period. The frequency, f= T 1 All electromagnetic waves travel through space at the same speed, c = 2.99792458 x 108 m/s, commonly known as the speed of light. An electromagnetic wave is characterized by a frequency and a wavelength. These two quantities are related to the speed of light by the equation, c = f x 
  • 6. আরিফু ল ইসলাম 6 The frequency (and hence, the wavelength) of an electromagnetic wave depends on its source. There is a wide range of frequency encountered in our physical world, ranging from the low frequency of the electric waves generated by the power transmission lines to the very high frequency of the gamma rays originating from the atomic nuclei. This wide frequency range of electromagnetic waves constitutes the Electromagnetic Spectrum. Blackbody Radiation All material bodies emit radiation. The intensity and wave length of the radiation depends on the temperature of the body. If the emissivity of the body of the body is 1 then it is known as blackbody. But in reality there is no perfect blackbody, but most of the objects may be considered very close to blackbody. Planck's law describes the electromagnetic radiation emitted by a black body in thermal equilibrium at a definite temperature. The law is named after Max Planck, who originally proposed it in 1900. It is a pioneering result of modern physics and quantum theory. The spectral radiance of a body, , describes the amount of energy it gives off as radiation in different frequencies. It is measured in terms of the power emitted per unit area of the body and per unit solid angle for a frequency μ. Planck showed that the spectral radiance of a body at absolute temperature T is given by Where kB the Boltzmann constant (1.3806488×10−23 J/K.), h the Planck constant (6.62606957×10−34 Js) and c the speed of light in the medium, whether material or vacuum.[1][2][3] The spectral radiance can also be measured per unit wavelength instead of per unit frequency. In this case, it is given by The peak wavelength of the emission is inversely proportional to its temperature in Kelvin.
  • 7. আরিফু ল ইসলাম 7 Wien's Law is the equation to use to solve for this: T 0029.0 max  Meter, where T is in Kelvin For earth’s surface with T=303; the wave length with maximum energy for T= 303 K is 9.5 µm. Atmospheric Interaction of Electromagnetic Waves/ Solar Radiation The earth has a gaseous envelop around its solid and liquid core. The electromagnetic radiations while passing through the atmosphere interacts with atmospheric molecules causing scattering. Before radiation used for remote sensing reaches the Earth's surface it has to travel through some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.  Top of the Atmosphere o Solar radiation arrives essentially unchanged by passage through space o Interacts with gas molecules, water droplets and aerosols in the atmosphere  Absorption o Radiant energy converted into internal energy  Ceases to exist as radiant energy  Heats atmosphere directly o Very little absorption in visible wavelengths  Ozone absorbs most ultraviolet  CO2, water vapor and water droplets absorb near infrared  Transmission o Sunlight passing through atmosphere without being altered  Light arriving at surface directly from the Sun  Makes the disk of the Sun visible  Creates patterns of strong light and shadows  Reflection o Interaction that changes direction of light o Reflection from surfaces  Radiation redirected away from surface without being absorbed  Reflected light makes objects visible  Wavelength-dependent reflection sensed as color differences
  • 8. আরিফু ল ইসলাম 8 o Specular reflection  Coherent beam reflected - images preserved  Mirrors, shiny metal and smooth water surfaces  Scattering o Light reflected in all directions - images not preserved o Fogged mirror (water droplets) scatter at different angles, so can't see anything in mirror o Comparable effect with fog on glass scattering transmitted light o Atmospheric molecules and aerosols scatter incoming solar radiation o Some forward scattered and continues toward surface at different angle o Some back scattered out to space taking energy out of earth- atmosphere system Figure-3 Scattering of solar radiation in the atmosphere in all direction and thus the diffused light is produced (left). The right panel is scattering by clouds  Diffuse radiation o Some light reaches surface after being scattered  Appears to originate from location of last scattering event  Illuminates sky outside of direct solar beam  No atmosphere on moon, so sky is black except for direct light source o Amounts of direct and diffuse light reaching surface vary depending on atmospheric conditions  Clear, dry atmosphere - large amount of direct, little diffuse  High contrast between dark shadows and brightly illuminated surfaces  Sky appears deep blue
  • 9. আরিফু ল ইসলাম 9 o Cloud cover - no direct, all diffuse  Light coming from all directions of the sky, so only shadows directly underneath things  Disk of sun not visible  Rayleigh Scattering o Scattering by objects smaller than wavelengths of visible light  Primarily atmospheric gas molecules o Preferentially scatters shorter wavelengths  Violet and blue scattered most, red least. o Makes sky appear blue from earth and earth appear blue from space o Makes red sunsets  All wavelengths shorter than red scattered during long passage through atmosphere  Best sunsets occur with clear sky except for medium or high clouds  Reflect the red light against dark sky background  Mie Scattering  Scattering by objects larger than wavelengths of visible light (Water droplets and aerosols) o Scatters all wavelengths equally - white light o Makes clouds appear white o Aerosols near surface often make skies appear hazy white, especially near horizon Energy of light Photons Light is wave, but is also considered as particle/corpuscle in quantum Mechanics and the energy of an electromagnetic wave is considered to be quantized, i.e. it can only exist in discrete amount. The basic unit of energy for an electromagnetic wave is called a photon. The energy E of a photon is proportional to the wave frequency f, E = h f Where, the constant of proportionality h is the Planck's Constant, h = 6.626 x 10-34 J s.
  • 10. আরিফু ল ইসলাম 1 0 Attenuation of solar radiation while passing thorough the atmosphere Solar Constant=1.361 KW/m2 As the solar radiation passes through the atmosphere, the radiation is attenuated till it reaches the surface of the earth. Then the incident radiation is by given by, )exp()()( zkII topinc   Or )()()(  topinc II  , where )exp()( zk  Here, k is the coefficient of attenuation and )( is known as transmittance. The transmittance of the atmosphere radiations is also given as as )( )( )(    top inc I I  , i.e. There are strong variation of atmospheric transmittance with  . The figure 4 shows the atmospheric transmittance for the E-M spectrum. Figure: The transmittance of E-M radiation through atmosphere (upper panel) and Blackbody radiation of sun and that from the earth’s surface (lower panel). From the figure (upper panel) it appears that the solar radiations have high transmittance in the Near UV (λ=0.2-0.4 µm), visible (λ=0.4-0.7 µm) and Near Infrared (NIR) λ= (0.7-1.5 µm).
  • 11. আরিফু ল ইসলাম 1 1 Atmospheric window Parts of the electromagnetic spectrum that can be transmitted through the atmosphere with relatively little interference. The atmosphere is considered as window for the spectral radiation having high transmittance through the atmosphere and such ranges of spectrum are termed as atmospheric windows. The figure shows that the following rages of radiations are atmospheric window: I. UV: 0.2-0.4 µm- (Solar Radiation Passive) II. Visible: 0.4-0.7 µm (Solar Radiation-Passive) III. NIR: 0.7-1.5 µm (Solar Radiation-Passive) IV. Far Infrared (Thermal radiation): 8.0-15.0 µm (Blackbody radiation from earth’s surface and lower atmosphere-Passive) V. Passive microwave (50 hz) Radiation from earth’s surface. VI. Microwaves: centimeter to meter: Passive radiation from earth and Active radiation from Synthetic Aperture RADAR (SAR) that used direct Microwave beams for illuminating the object on earth. Absorption Bands and Atmospheric Windows Some types of electromagnetic radiation easily pass through the atmosphere, while other types do not. The ability of the atmosphere to allow radiation to pass through it is referred to as its transmissivity, and varies with the wavelength/type of the radiation. The gases that comprise our atmosphere absorb radiation in certain wavelengths while allowing radiation with differing wavelengths to pass through. The areas of the EM spectrum that are absorbed by atmospheric gases such as water vapor, carbon dioxide, and ozone are known as absorption bands. In the figure, absorption bands are represented by a low transmission value that is associated with a specific range of wavelengths. In contrast to the absorption bands, there are areas of the electromagnetic spectrum where the atmosphere is transparent (little or no absorption of radiation) to specific wavelengths. These wavelength bands are known as atmospheric "windows" since they allow the radiation to easily pass through the atmosphere to Earth's surface. Most remote sensing instruments on aircraft or space-based platforms operate in one or more of these windows by making their measurements with detectors tuned to specific frequencies (wavelengths) that pass through the atmosphere. When a remote sensing instrument has a line-of-sight with an object that is reflecting sunlight or emitting heat, the instrument collects and records the radiant energy.While most remote sensing systems are designed to collect reflected radiation, some sensors,
  • 12. আরিফু ল ইসলাম 1 2 especially those on meteorological satellites, directly measure absorption phenomena, such as those associated with carbon dioxide (CO2) and other gases. The atmosphere is nearly opaque to EM radiation in part of the mid-IR and all of the far-IR regions. In the microwave region, by contrast, most of this radiation moves through unimpeded, so radar waves reach the surface (although weather radars are able to detect clouds and precipitation because they are tuned to observe backscattered radiation from liquid and ice particles). The figure above shows the percentage of light transmitted at various wavelengths from the near ultraviolet to the far infrared, and the sources of atmospheric opacity are also given. You can see that there is plenty of atmospheric transmission of radiation at 0.5 microns, 2.5 microns, and 3.5 microns, but in contrast there is a great deal of atmospheric absorption at 2.0, 3.0, and about 7.0 microns. Both passive and active remote sensing technologies do best if they operate within the atmospheric windows.
  • 13. আরিফু ল ইসলাম 1 3 Microwave Remote Sensing Electromagnetic radiation in the microwave wavelength region is used in remote sensing to provide useful information about the Earth's atmosphere, land and ocean. A microwave radiometer is a passive device which records the natural microwave emission from the earth. It can be used to measure the total water content of the atmosphere within its field of view. A radar altimeter sends out pulses of microwave signals and record the signal scattered back from the earth surface. The height of the surface can be measured from the time delay of the return signals. A wind scatterometer can be used to measure wind speed and direction over the ocean surface. It sends out pulses of microwaves along several directions and records the magnitude of the signals backscattered from the ocean surface. The magnitude of the backscattered signal is related to the ocean surface roughness, which in turns is dependent on the sea surface wind condition, and hence the wind speed and direction can be derived. orne platforms to generate high resolution images of the earth surface using microwave energy.
  • 14. আরিফু ল ইসলাম 1 4 Synthetic Aperture Radar (SAR) In synthetic aperture radar (SAR) imaging, microwave pulses are transmitted by an antenna towards the earth surface. The microwave energy scattered back to the spacecraft is measured. The SAR makes use of the radar principle to form an image by utilising the time delay of the backscattered signals. A radar pulse is transmitted from the antenna to the ground The radar pulse is scattered by the ground targets back to the antenna. In real aperture radar imaging, the ground resolution is limited by the size of the microwave beam sent out from the antenna. Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna, i.e. the longer the antenna, the narrower the beam.
  • 15. আরিফু ল ইসলাম 1 5 The microwave beam sent out by the antenna illuminates an area on the ground (known as the antenna's "footprint"). In radar imaging, the recorded signal strength depends on the microwave energy backscattered from the ground targets inside this footprint. Increasing the length of the antenna will decrease the width of the footprint. It is not feasible for a spacecraft to carry a very long antenna which is required for high resolution imaging of the earth surface. To overcome this limitation, SAR capitalises on the motion of the space craft to emulate a large antenna (about 4 km for the ERS SAR) from the small antenna (10 m on the ERS satellite) it actually carries on board. Imaging geometry for a typical strip-mapping synthetic aperture radar imaging system. The antenna's footprint sweeps out a strip parallel to the direction of the satellite's ground track.
  • 16. আরিফু ল ইসলাম 1 6 Interaction between Microwaves and Earth's Surface When microwaves strike a surface, the proportion of energy scattered back to the sensor depends on many factors:  Physical factors such as the dielectric constant of the surface materials which also depends strongly on the moisture content;  Geometric factors such as surface roughness, slopes, orientation of the objects relative to the radar beam direction;  The types of landcover (soil, vegetation or man-made objects).  Microwave frequency, polarisation and incident angle. All-Weather Imaging Due to the cloud penetrating property of microwave, SAR is able to acquire "cloud- free" images in all weather. This is especially useful in the tropical regions which are frequently under cloud covers throughout the year. Being an active remote sensing device, it is also capable of night-time operation. Basic mechanism of Remote Sensing and sensor design Basic mechanism of remote sensing is that the outgoing radiation from the object passes through the atmosphere up to the sensor onboard the remote sensing platform. The platform is the aircraft or satellite or in special case balloons which carry the sensor. The sensor has a field of view on which the resolution of the measurement depends. In designing the remote sensing sensors the electromagnetic waves / radiations are so chosen that they belong to atmospheric window regions of the E-M spectrum. This radiation may be solar radiation reflected from the object, or radiated by the object due to thermal radiation or active beam generated by the Synthetic Aperture Radar (SAR) and scattered by the object. This radiation scattered or emitted by the object is sensed by the suitable sensors onboard the platforms. Then, the energy received by the sensor is converted to digital signal or a physical value representing the properties of the object on earth. The field of view determines how much area of the earth is seen by the sensor providing single value of information. The sensor is able to scan the
• 17. আরিফু ল ইসলাম 17 earth from left to right across the path of the satellite over a width equal to the field of view. Thus strips of scanned information are received by the sensor as the satellite moves. The track on the ground directly beneath the satellite's path is called the nadir track. The scanned strips form the rows of the image, and the individual observations within each strip form the columns. Figure: The field of view, strips and viewed area / resolution. Figure: Illumination of the object and reflection from the object to the sensor; and data recording and downloading to the ground station.
• 18. আরিফু ল ইসলাম 18 The following aspects are important for remote sensing:  Satellite or aircraft platforms  Sensors: sensors use reflected solar radiation, the earth's emitted infrared radiation, or an active beam projected from the platform for illumination and scattered back to the sensor  Multispectral observation: the optical system on board the platform looks at the object to capture radiation in several narrower spectral bands, separated by dedicated devices or filters  For SAR technology, multi-frequency, multi-polarization and multi-angle observations are made  The information is gathered in a known format in the storage system within the platform and then beamed down to the ground station  After reception at the ground station, data are pre-processed, corrected for geometric error and geo-referenced to a coordinate system  Then image interpretation, image processing and classification are carried out  Applications are manifold: land-use mapping, disaster monitoring, flood- and cyclone-affected area mapping, etc., performed through man-machine interaction. Remote Sensing Platforms Remote sensing of the earth is realised with sensors on board the following platforms:  Aircraft with special arrangements for making high resolution photographs on photographic film using optical analog cameras, digital cameras or sensors. The image quality depends on the atmospheric conditions. The images obtained
• 19. আরিফু ল ইসলাম 19 through this imaging technique are known as aerial photographs. The atmosphere should be cloud- and fog-free for obtaining high quality photographs. The process of analysis and interpretation of aerial photographs is known as photogrammetry.  Satellites are used as platforms with imaging instruments/sensors on board. Satellites are of two types depending on their orbits. Satellites placed over the equator at a height of around 36,000 km from the earth's surface have the same orbital period as the earth's rotation and therefore appear stationary with respect to the earth; these are known as geostationary satellites.  Satellites with lower orbital heights of around 500-800 km are known as polar orbiting satellites. Orbital heights may be as low as about 300 km; such satellites are known as low orbiting satellites.  Balloons may also be used. Airborne Remote Sensing In airborne remote sensing, downward or sideward looking sensors are mounted on an aircraft to obtain images of the earth's surface. An advantage of airborne remote sensing, compared to satellite remote sensing, is the capability of offering very high spatial resolution images (20 cm or less). The disadvantages are low coverage area and high cost per unit area of ground coverage. It is not cost-effective to map a large area using an airborne remote sensing system. Airborne remote sensing missions are often carried out as one-time operations, whereas earth observation satellites offer the possibility of continuous monitoring of the earth.
• 20. আরিফু ল ইসলাম 20 Analog aerial photography, videography, and digital photography are commonly used in airborne remote sensing. Synthetic Aperture Radar imaging is also carried out on airborne platforms. Analog photography is capable of providing high spatial resolution. The interpretation of analog aerial photographs is usually done visually by experienced analysts. The photographs may be digitized using a scanning device for computer-assisted analysis. Digital photography permits real-time transmission of the remotely sensed data to a ground station for immediate analysis. The digital images can be analysed and interpreted with the aid of a computer. Fig: A high resolution aerial photograph over a forested area. The canopy of each individual tree can be clearly seen. This type of very high resolution imagery is useful in identification of tree types and in assessing the conditions of the trees. Fig: Another example of a high resolution aerial photograph, over a residential area. Spaceborne Remote Sensing E.g. IKONOS 2; SPOT 1, 2, 4; EROS A1; OrbView 2 (SeaStar); NOAA 12, 14, 16; TERRA; RADARSAT 1; ERS 1, 2.
• 21. আরিফু ল ইসলাম 21 The receiving ground station at CRISP receives data from these satellites. In spaceborne remote sensing, sensors are mounted on board a spacecraft (space shuttle or satellite) orbiting the earth. At present, there are several remote sensing satellites providing imagery for research and operational applications. Spaceborne remote sensing provides the following advantages:  Large area coverage;  Frequent and repetitive coverage of an area of interest;  Quantitative measurement of ground features using radiometrically calibrated sensors;  Semiautomated computerised processing and analysis;  Relatively lower cost per unit area of coverage. Satellite imagery has a generally lower resolution compared to aerial photography. However, very high resolution imagery (up to 1-m resolution) is now commercially available to civilian users following the successful launch of the IKONOS-2 satellite on 24 September 1999. Airborne versus Spaceborne Radars Like other remote sensing systems, an imaging radar sensor may be carried on either an airborne or spaceborne platform. Depending on the use of the prospective imagery, there are trade-offs between the two types of platforms. Regardless of the platform used, a significant advantage of using a Synthetic Aperture Radar (SAR) is that the spatial resolution is independent of platform altitude. Thus, fine resolution can be achieved from both airborne and spaceborne platforms. Although spatial resolution is independent of altitude, viewing geometry and swath coverage can be greatly affected by altitude variations. At aircraft operating altitudes, an airborne radar must image over a wide range of incidence angles, perhaps as much as 60 or 70 degrees, in order to achieve relatively wide swaths (let's say 50 to 70 km). As we have learned in the preceding sections, incidence angle (or look angle) has a significant effect on the backscatter from surface features and on their appearance on an image. Image characteristics such as foreshortening, layover, and shadowing will
• 22. আরিফু ল ইসলাম 22 be subject to wide variations across a large incidence angle range. Spaceborne radars are able to avoid some of these imaging geometry problems since they operate at altitudes up to one hundred times higher than airborne radars. At altitudes of several hundred kilometres, spaceborne radars can image comparable swath widths, but over a much narrower range of incidence angles, typically ranging from five to 15 degrees. This provides for more uniform illumination and reduces undesirable imaging variations across the swath due to viewing geometry. Although airborne radar systems may be more susceptible to imaging geometry problems, they are flexible in their capability to collect data from different look angles and look directions. By optimizing the geometry for the particular terrain being imaged, or by acquiring imagery from more than one look direction, some of these effects may be reduced. Additionally, an airborne radar is able to collect data anywhere and at any time (as long as weather and flying conditions are acceptable!). A spaceborne radar does not have this degree of flexibility, as its viewing geometry and data acquisition schedule are controlled by the pattern of its orbit. However, satellite radars do have the advantage of being able to collect imagery more quickly over a larger area than an airborne radar, and provide consistent viewing geometry. The frequency of coverage may not be as often as that possible with an airborne platform, but depending on the orbit parameters, the viewing geometry flexibility, and the geographic area of interest, a spaceborne radar may have a revisit period as short as one day. As with any aircraft, an airborne radar will be susceptible to variations in velocity and other motions of the aircraft as well as to environmental (weather) conditions. In order to avoid image artifacts or geometric positioning errors due to random variations in the motion of the aircraft, the radar system must use sophisticated navigation/positioning equipment and advanced image processing to compensate for these variations. Generally, this will be able to correct for all but the most severe variations in motion, such as significant air turbulence. Spaceborne radars are not affected by motion of this type. Indeed, the geometry of their orbits is usually very stable and their positions can be accurately calculated. However, geometric correction
• 23. আরিফু ল ইসলাম 23 of imagery from spaceborne platforms must take into account other factors, such as the rotation and curvature of the Earth, to achieve proper geometric positioning of features on the surface. Satellite Orbits A satellite follows a generally elliptical orbit around the earth. The time taken to complete one revolution of the orbit is called the orbital period. The satellite traces out a path on the earth surface, called its ground track, as it moves across the sky. As the earth below is rotating, the satellite traces out a different path on the ground in each subsequent cycle. Remote sensing satellites are often launched into special orbits such that the satellite repeats its path after a fixed time interval. This time interval is called the repeat cycle of the satellite. There are two different orbits for satellites depending on the height of the satellite: Polar orbiting satellites The polar orbiting satellites are placed at heights of around 300-1000 km above the surface of the earth, in a fixed orbital plane. The satellites move from south to north or north to south, and cover the full earth as the earth rotates from west to east beneath them. The satellite passes from south to north and, on the opposite side of its orbit, passes from north to south. A pass from south to north is known as an ascending pass, while a pass from north to south is known as a descending pass. A satellite with an orbital height of 800-1000 km takes about 100 minutes for a complete revolution. The orbital period increases with the height. Thus low orbiting satellites have higher orbital speeds and shorter periods.
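The periods quoted above follow from Kepler's third law for a circular orbit, T = 2π√(a³/μ). A minimal sketch, using standard values for the Earth's gravitational parameter and mean radius:

import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m

def orbital_period_minutes(altitude_km):
    # Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu).
    a = R_EARTH + altitude_km * 1e3   # orbit radius (semi-major axis)
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

for h in (300, 800, 1000, 35786):
    print(f"{h:6d} km -> {orbital_period_minutes(h):7.1f} min")
# ~90 min at 300 km, ~101 min at 800 km, ~105 min at 1000 km, and
# ~1436 min (one sidereal day) at the geostationary height.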
• 24. আরিফু ল ইসলাম 24 If the satellite crosses the equator at the same local solar time on each pass, its orbit is known as sun-synchronous. Geostationary Satellites  The geostationary orbits are such that the satellite moves in the same direction as the earth's rotation with the same angular speed; as a result, the satellite appears to remain stationary with respect to the earth and can view the same region of the earth at intervals of typically 10-30 minutes. Thus, satellites in geostationary orbits are used for environmental monitoring and also for telecommunication purposes.  A geostationary satellite is launched in such a way that it follows an orbit parallel to the equator and travels in the same direction as the earth's rotation with the same period of about 24 hours. Thus, it appears stationary with respect to the earth's surface. A satellite following a geostationary orbit always views the same area on the earth, so it is possible to monitor continuously any location within the field of view of its sensor. A large area of the earth can also be covered by the satellite. Geostationary Orbit: The satellite appears stationary with respect to the Earth's surface. Remote Sensing Satellites Several remote sensing satellites are currently available, providing imagery suitable for various types of applications. Each of these satellite-sensor platforms is
• 25. আরিফু ল ইসলাম 25 characterised by the wavelength bands employed in image acquisition, the spatial resolution of the sensor, the coverage area and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the imaging system. In terms of the spatial resolution, the satellite imaging systems can be classified into:  Low resolution systems (approx. 1 km or more)  Medium resolution systems (approx. 100 m to 1 km)  High resolution systems (approx. 5 m to 100 m)  Very high resolution systems (approx. 5 m or less) In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:  Optical imaging systems (include visible, near infrared, and shortwave infrared systems)  Thermal imaging systems  Synthetic aperture radar (SAR) imaging systems Optical/thermal imaging systems can be classified according to the number of spectral bands used:  Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems  Multispectral (several spectral bands) systems  Superspectral (tens of spectral bands) systems  Hyperspectral (hundreds of spectral bands) systems Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands and polarization modes used in data acquisition, e.g.:  Single frequency (L-band, or C-band, or X-band)  Multiple frequency (Combination of two or more frequency bands)  Single polarization (VV, or HH, or HV)  Multiple polarization (Combination of two or more polarization modes)
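As a small illustration, the spatial-resolution classes above translate directly into a lookup function. The sensor names and nominal resolutions in the example are illustrative assumptions for demonstration only:

def resolution_class(resolution_m: float) -> str:
    # Map a sensor's spatial resolution (metres) to the classes listed above.
    if resolution_m >= 1000:
        return "Low resolution (approx. 1 km or more)"
    if resolution_m >= 100:
        return "Medium resolution (approx. 100 m to 1 km)"
    if resolution_m >= 5:
        return "High resolution (approx. 5 m to 100 m)"
    return "Very high resolution (approx. 5 m or less)"

# A few sensors with nominal resolutions (values assumed for the sketch):
for name, res in [("NOAA-AVHRR", 1100), ("TERRA-MODIS", 250),
                  ("LANDSAT TM", 30), ("IKONOS-2 pan", 1)]:
    print(f"{name:14s} {res:6.0f} m -> {resolution_class(res)}")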
  • 26. আরিফু ল ইসলাম 2 6 Types of Satellites and Sensors Optical/Infrared Remote Sensing Satellites Low Resolution  Geostationary Satellites  Polar Orbiting Meteorological Satellites o NOAA-AVHRR o DMSP-OLS  Orbview2-SeaWiFS  SPOT4-Vegetation  ADEOS-OCTS Medium Resolution  TERRA-MODIS  ENVISAT-MERIS  ADEOS2-GLI High Resolution  LANDSAT  SPOT1,2,4  MOS  EO1  IRS  RESURS Very High Resolution  IKONOS2  EROS-A1  Quickbird2  Orbview3  SPOT5
• 27. আরিফু ল ইসলাম 27 Ground Truthing [Accuracy] The accuracy of remotely sensed or mathematically calculated data based on data actually measured in the field. In order to "anchor" the satellite measurements, we need to compare them to something we know. One way to do this is by what we call "ground truth", which is one part of the calibration process. This is where a person on the ground (or sometimes in an airplane) makes a measurement of the same thing the satellite is trying to measure, at the same time the satellite is measuring it. The two answers are then compared to help evaluate how well the satellite instrument is performing. Usually we believe the ground truth more than the satellite, because we have more experience making measurements on the ground and sometimes we can see what we are measuring with the naked eye. There are a number of ways to take ground truth measurements. The first is what we call a "field campaign". This is where several scientists and technicians take lots of equipment and set it up somewhere for a short but intense period of measurement. Often they go someplace rather simple, like the middle of the Great Plains in the United States, or an island or ship in the middle of the ocean, or an ice shelf at one of the poles. We get a lot of information from field campaigns, but they are expensive and only run for a short time. Another source of ground truth is the on-going work of the National Weather Service. They have a record of weather conditions stretching back for over 100 years. Observations are made at regular intervals at offices around the country. These provide a nice record but are not necessarily taken at the same time a satellite passes over the spot. As clouds are very changeable, things can change completely in even a few minutes. Another option for ground truth is S'COOL. Students at schools around the world can be involved by making an observation within a few minutes of the time that a satellite views their area. Image Classification The intent of the classification process is to categorize all pixels in a digital image into one of several land cover classes, or "themes". This categorized data may then be used to produce thematic maps of the land cover present in an image. Normally, multispectral data are used to perform the classification and, indeed, the spectral pattern present within the data for each pixel is used as the numerical basis for
• 28. আরিফু ল ইসলাম 28 categorization (Lillesand and Kiefer, 1994). The objective of image classification is to identify and portray, as a unique gray level (or color), the features occurring in an image in terms of the object or type of land cover these features actually represent on the ground. Image classification is perhaps the most important part of digital image analysis. It is very nice to have a "pretty picture" or an image showing a multitude of colors illustrating various features of the underlying terrain, but it is quite useless unless we know what the colors mean. Digital image classification techniques group pixels to represent land cover features. Land cover could be forested, urban, agricultural or other types of features. There are three main image classification techniques. Image Classification Techniques in Remote Sensing:  Unsupervised image classification  Supervised image classification  Object-based image analysis Pixels are the smallest unit represented in an image. Image classification uses the reflectance statistics for individual pixels. Unsupervised and supervised image classification techniques are the two most common approaches. However, object-based classification has been breaking more ground as of late. What is the difference between supervised and unsupervised classification? Unsupervised Classification Fig: Unsupervised Classification Example
• 29. আরিফু ল ইসলাম 29 Pixels are grouped based on their reflectance properties. These groupings are called "clusters". The user identifies the number of clusters to generate and which bands to use. With this information, the image classification software generates clusters. There are different image clustering algorithms, such as K-means and ISODATA (a code sketch of this workflow follows the classification overview below). The user manually identifies each cluster with land cover classes. It's often the case that multiple clusters represent a single land cover class; the user then merges those clusters into one land cover type. The unsupervised image classification technique is commonly used when no sample sites exist. Unsupervised Classification Steps:  Generate clusters  Assign classes Fig: Unsupervised Classification Diagram Supervised Classification Fig: Supervised Classification Example: IKONOS
• 30. আরিফু ল ইসলাম 30 The user selects representative samples for each land cover class in the digital image. These sample land cover classes are called "training sites". The image classification software uses the training sites to identify the land cover classes in the entire image. The classification of land cover is based on the spectral signatures defined in the training set. The digital image classification software assigns each pixel to the class whose training signature it most resembles. The common supervised classification algorithms are maximum likelihood and minimum-distance classification. Supervised Classification Steps:  Select training areas  Generate signature file  Classify Fig: Supervised Classification Diagram Object-Based (or Object-Oriented) Image Analysis Classification Fig: Object-based Classification
• 31. আরিফু ল ইসলাম 31 Traditional pixel-based processing generates square classified pixels. Object-based image classification is very different in that it generates objects of different shape and scale. This process is called multi-resolution segmentation. Multiresolution segmentation produces homogeneous image objects by grouping pixels. Objects are generated at different scales in an image simultaneously. These objects are more meaningful than traditional pixel-based units because they can be classified based on texture, context and geometry. Object-based image analysis supports the use of multiple bands for multiresolution segmentation and classification. For example, infrared, elevation or existing shapefiles can simultaneously be used to classify image objects. Multiple layers can have context with each other. This context comes in the form of neighborhood relationships, proximity and distance between layers. Nearest neighbor (NN) classification is similar to supervised classification. After multi-resolution segmentation, the user identifies sample sites for each land cover class. The statistics used to classify image objects are defined. The object-based image analysis software then classifies objects based on their resemblance to the training sites and the statistics defined. Fig: Object-Based Classification Diagram Object-Based Nearest Neighbor Classification Steps:  Perform multiresolution segmentation  Select training areas  Define statistics  Classify
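A minimal sketch of the unsupervised (K-means) workflow described above, assuming scikit-learn and NumPy are available. The image here is synthetic random data standing in for real band values, and the cluster-to-class assignments are the analyst's judgement, illustrated with arbitrary labels:

import numpy as np
from sklearn.cluster import KMeans

# Synthetic 100x100 image with 3 spectral bands (stand-in for real data).
rng = np.random.default_rng(0)
image = rng.random((100, 100, 3))

# Step 1: generate clusters from the per-pixel band values.
pixels = image.reshape(-1, 3)                    # one row per pixel
kmeans = KMeans(n_clusters=5, n_init=10).fit(pixels)
cluster_map = kmeans.labels_.reshape(100, 100)

# Step 2: the analyst assigns a land cover class to each cluster, possibly
# merging several clusters into one class (labels here are arbitrary).
cluster_to_class = {0: "water", 1: "forest", 2: "urban",
                    3: "cropland", 4: "water"}
classified = np.vectorize(cluster_to_class.get)(cluster_map)
print(classified[0, :5])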
• 33. আরিফু ল ইসলাম 33 Pixels A digital image comprises a two-dimensional array of individual picture elements called pixels, arranged in columns and rows. Each pixel represents an area on the Earth's surface. A pixel has an intensity value and a location address in the two-dimensional image. The intensity value represents the measured physical quantity such as the solar radiance in a given wavelength band reflected from the ground, emitted infrared radiation or backscattered radar intensity. This value is normally the average value for the whole ground area covered by the pixel. The intensity of a pixel is recorded as a digital number. Due to the finite storage capacity, a digital number is stored with a finite number of bits (binary digits). The number of bits determines the radiometric resolution of the image. For example, an 8-bit digital number ranges from 0 to 255 (i.e. 2⁸ − 1), while an 11-bit digital number ranges from 0 to 2047 (i.e. 2¹¹ − 1). The detected intensity value needs to be scaled and quantized to fit within this range of values. In a radiometrically calibrated image, the actual intensity value can be derived from the pixel's digital number. The address of a pixel is denoted by its row and column coordinates in the two-dimensional image. There is a one-to-one correspondence between the column-row address of a pixel and the geographical coordinates (e.g. longitude, latitude) of the imaged location. In order to be useful, the exact geographical location of each pixel on the ground must be derivable from its row and column indices, given the imaging geometry and the satellite orbit parameters. Digital Number In remote sensing systems, a digital number is a variable assigned to a pixel, usually in the form of a binary integer in the range 0–255 (i.e. a byte). The range of energies examined in a remote sensing system is broken into 256 bins. A single pixel may have several digital number variables corresponding to the different bands recorded. Digital Image A digital image is a two-dimensional array of pixels. Each pixel has an intensity value (represented by a digital number) and a location address (referenced by its row and column numbers).
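A short sketch of the scaling and quantization step described above: a measured radiance is linearly mapped into the DN range set by the bit depth. The radiance values and calibration limits are assumed for illustration:

import numpy as np

def dn_range(bits: int) -> int:
    # Highest digital number representable with the given bit depth.
    return 2**bits - 1            # 8 bits -> 255, 11 bits -> 2047

def quantize(radiance, r_min, r_max, bits=8):
    # Linearly scale radiance values into [0, 2^bits - 1] digital numbers.
    dn_max = dn_range(bits)
    scaled = (np.asarray(radiance, dtype=float) - r_min) / (r_max - r_min)
    return np.clip(np.round(scaled * dn_max), 0, dn_max).astype(np.uint16)

# Illustrative radiances (units and calibration limits assumed):
radiance = [12.0, 47.5, 96.3]
print(quantize(radiance, r_min=0.0, r_max=100.0, bits=8))   # [ 31 121 246]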
• 34. আরিফু ল ইসলাম 34 Analog Images Data collected in hard-copy pictorial records are termed photographs. Photographs are recorded two-dimensionally on photosensitive emulsions that are either reflective (such as in a print) or transmissive (such as a slide). In remote sensing, the term photograph is reserved exclusively for images that were detected as well as recorded on film. The more generic term image is used for any pictorial representation of image data. Because the term image relates to any pictorial product, all photographs are images, but not all images are photographs. Image noise Image noise is random variation of brightness or color information in images (variation not present in the object imaged), and is usually an aspect of electronic noise. It can be produced by the sensor and circuitry of a scanner or digital camera. Image noise can also originate in film grain and in the unavoidable shot noise of an ideal photon detector. Image noise is an undesirable by-product of image capture that adds spurious and extraneous information. The original meaning of "noise" was and remains "unwanted signal"; unwanted electrical fluctuations in signals received by AM radios caused audible acoustic noise ("static"). By analogy, unwanted electrical fluctuations themselves came to be known as "noise". Image noise is, of course, inaudible. The magnitude of image noise can range from almost imperceptible specks on a digital photograph taken in good light, to optical and radio astronomical images that are almost entirely noise, from which a small amount of information can be derived by sophisticated processing (a noise level that would be totally unacceptable in a photograph since it would be impossible to determine even what the subject was). Multilayer Image Several types of measurement may be made from the ground area covered by a single pixel. Each type of measurement forms an image which carries some specific information about the area. By "stacking" these images from the same area together, a multilayer image is formed. Each component image is a layer in the multilayer image. Multilayer images can also be formed by combining images obtained from different sensors, and other subsidiary data. For example, a multilayer image may consist of
• 35. আরিফু ল ইসলাম 35 three layers from a SPOT multispectral image, a layer of an ERS synthetic aperture radar image, and perhaps a layer consisting of the digital elevation map of the area being studied. Fig: An illustration of a multilayer image consisting of five component layers. Multispectral Image A multispectral image consists of a few image layers, each of which represents an image acquired in a particular wavelength band. For example, the SPOT HRV sensor operating in the multispectral mode detects radiation in three wavelength bands: the green (500 - 590 nm), red (610 - 680 nm) and near infrared (790 - 890 nm) bands. A single SPOT multispectral scene consists of three intensity images in the three wavelength bands. In this case, each pixel of the scene has three intensity values corresponding to the three bands. A multispectral IKONOS image consists of four bands: blue, green, red and near infrared, while a Landsat TM multispectral image consists of seven bands: blue, green, red and near-IR bands, two SWIR bands, and a thermal IR band. Superspectral Image The more recent satellite sensors are capable of acquiring images in many more wavelength bands. For example, the MODIS sensor on board NASA's TERRA satellite has 36 spectral bands, covering wavelength regions from the visible and near infrared through the short-wave infrared to the thermal infrared. The bands have narrower bandwidths, enabling the finer spectral characteristics of the targets to be captured by the sensor. The term "superspectral" has been coined to describe such sensors.
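Stacking band layers into a multilayer array is straightforward with NumPy; a minimal sketch with synthetic single-band layers standing in for real data:

import numpy as np

# Three synthetic single-band layers (e.g. green, red, near-infrared),
# each a 2-D array of digital numbers -- stand-ins for real band data.
rows, cols = 4, 5
green = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)
red   = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)
nir   = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)

# Stacking the layers yields a multilayer image: one spectral vector per pixel.
multilayer = np.dstack([green, red, nir])    # shape (rows, cols, 3)
print(multilayer.shape)
print(multilayer[0, 0])   # the three band values of the top-left pixel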
• 36. আরিফু ল ইসলাম 36 Hyperspectral Image A hyperspectral image consists of about a hundred or more contiguous spectral bands. The characteristic spectrum of the target pixel is acquired in a hyperspectral image. The precise spectral information contained in a hyperspectral image enables better characterisation and identification of targets. Hyperspectral images have potential applications in such fields as precision agriculture (e.g. monitoring the types, health, moisture status and maturity of crops) and coastal management (e.g. monitoring of phytoplankton, pollution and bathymetry changes). Currently, hyperspectral imagery is not commercially available from satellites. There are experimental satellite sensors that acquire hyperspectral imagery for scientific investigation (e.g. NASA's Hyperion sensor on board the EO1 satellite, and the CHRIS sensor on board ESA's PROBA satellite). Panchromatic image A single band image generally displayed as shades of gray. Panchromatic emulsion is a type of black-and-white photographic emulsion that is sensitive to all wavelengths of visible light. Panchromatic film is a normal black-and-white film which you can buy in the store and is what your grandparents probably used. It is called panchromatic because it is sensitive to light of all visible colors (electromagnetic wavelengths approximately 0.4 µm to 0.7 µm). Objects of different colors are distinguishable in panchromatic film, with grayscale tones closely approximating the brightness of the photograph's subject. Panchromatic film is typically not as sensitive to subtle variation in green light as in some other colors, making it difficult to differentiate between similar vegetation types. To reduce atmospheric haze, a yellow filter is normally used. Filters commonly used for such images include a K2 (Wratten No. 8) and a #12 deep amber (the minus-blue member of the cyan-magenta-yellow set of subtractive primaries). NDVI (Normalized Difference Vegetation Index) The normalized difference vegetation index (NDVI) is a simple graphical indicator that can be used to analyze remote sensing measurements, typically but not necessarily from a space platform, and assess whether the target being observed contains live green vegetation or not.
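NDVI is computed per pixel as NDVI = (NIR - Red) / (NIR + Red), using the red and near-infrared reflectances. A minimal sketch, with illustrative reflectance values assumed for three surface types:

import numpy as np

def ndvi(nir, red, eps=1e-9):
    # NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)   # eps guards against 0/0

# Illustrative reflectances (assumed): bare soil, sparse grass, dense forest.
red_band = np.array([0.30, 0.12, 0.04])
nir_band = np.array([0.35, 0.25, 0.50])
print(ndvi(nir_band, red_band).round(2))     # [0.08 0.35 0.85]

The three outputs match the ranges described in the following paragraphs: near zero for bare surfaces, moderate for sparse vegetation, and high for dense vegetation.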
• 37. আরিফু ল ইসলাম 37 Remote sensing phenology studies use data gathered by satellite sensors that measure wavelengths of light absorbed and reflected by green plants. Certain pigments in plant leaves strongly absorb wavelengths of visible (red) light. The leaves themselves strongly reflect wavelengths of near-infrared light, which is invisible to human eyes. As a plant canopy changes from early spring growth to late-season maturity and senescence, these reflectance properties also change. Many sensors carried aboard satellites measure red and near-infrared light waves reflected by land surfaces. Using mathematical formulas (algorithms), scientists transform raw satellite data about these light waves into vegetation indices. A vegetation index is an indicator that describes the greenness — the relative density and health of vegetation — for each picture element, or pixel, in a satellite image. Although there are several vegetation indices, one of the most widely used is the Normalized Difference Vegetation Index (NDVI). NDVI values range from -1.0 to +1.0. Areas of barren rock, sand, or snow usually show very low NDVI values (for example, 0.1 or less). Sparse vegetation such as shrubs and grasslands or senescing crops may result in moderate NDVI values (approximately 0.2 to 0.5). High NDVI values (approximately 0.6 to 0.9) correspond to dense vegetation such as that found in temperate and tropical forests or crops at their peak growth stage. By transforming raw satellite data into NDVI values, researchers can create images and other products that give a rough measure of vegetation type, amount, and condition on land surfaces around the world. NDVI is especially useful for continental- to global-scale vegetation monitoring because it can compensate for changing illumination conditions, surface slope, and viewing angle. That said, NDVI does tend to saturate over dense vegetation and is sensitive to underlying soil color. NDVI values can be averaged over time to establish "normal" growing conditions in a region for a given time of year. Further analysis can then characterize the health of vegetation in that place relative to the norm. When analyzed through time, NDVI can reveal where vegetation is thriving and where it is under stress, as well as changes in vegetation due to human activities such as deforestation, natural disturbances such as wildfires, or changes in plants' phenological stage. UN-SPIDER In its resolution 61/110 of 14 December 2006 the United Nations General Assembly agreed to establish the "United Nations Platform for Space-based Information for Disaster Management and Emergency Response - UN-SPIDER" as a new United
• 38. আরিফু ল ইসলাম 38 Nations programme, with the following mission statement: "Ensure that all countries and international and regional organizations have access to and develop the capacity to use all types of space-based information to support the full disaster management cycle". Whereas there have been a number of initiatives in recent years that have contributed to making space technologies available for humanitarian and emergency response, UN-SPIDER is the first to focus on the need to ensure access to and use of such solutions during all phases of the disaster management cycle, including the risk reduction phase, which will significantly contribute to reducing the loss of lives and property. The UN-SPIDER programme is achieving this by being a gateway to space information for disaster management support, by serving as a bridge to connect the disaster management and space communities and by being a facilitator of capacity-building and institutional strengthening, in particular for developing countries. UN-SPIDER is being implemented as an open network of providers of space-based solutions to support disaster management activities. Besides Vienna (where UNOOSA is located), the programme also has offices in Bonn, Germany and in Beijing, China. Additionally, a network of Regional Support Offices multiplies the work of UN-SPIDER in the respective regions. Field of view The field of view (also field of vision, abbreviated FOV) is the extent of the observable world that is seen at any given moment. In the case of optical instruments or sensors, it is the solid angle through which a detector is sensitive to electromagnetic radiation. IFOV (INSTANTANEOUS FIELD OF VIEW) A measure of the spatial resolution of a remote sensing imaging system. Defined as the angle subtended by a single detector element on the axis of the optical system. IFOV has the following attributes:
• 39. আরিফু ল ইসলাম 39  Solid angle through which a detector is sensitive to radiation.  The IFOV and the distance from the target determine the spatial resolution: a low altitude imaging instrument will have a higher spatial resolution than a higher altitude instrument with the same IFOV. For example, an IFOV of 0.1 mrad viewed from an altitude of 800 km corresponds to a ground resolution cell of about 80 m (ground resolution ≈ altitude × IFOV). Swath width Swath width refers to the strip of the Earth's surface from which geographic data are collected by a moving vehicle such as a satellite, aircraft or ship in the course of swath mapping. The longitudinal extent of the swath is defined by the motion of the vehicle with respect to the surface, whereas the swath width is measured perpendicularly to the longitudinal extent of the swath. Note that swath width is also cited in the context of sonar specifications. It is measured and represented in metres or kilometres. Fig: Swath Nadir The nadir is the direction pointing directly below a particular location; that is, it is one of two vertical directions at a specified location, orthogonal to a horizontal flat surface there. Since the concept of being below is itself somewhat vague, scientists define the nadir in more rigorous terms. Specifically, in astronomy, geophysics and related sciences (e.g., meteorology), the nadir at a given point is the local vertical direction pointing in the direction of the force of gravity at that location. The direction opposite of the nadir is the zenith. Nadir also refers to the downward-facing viewing geometry of an orbiting satellite, such as is employed during remote sensing of the atmosphere, as well as when an astronaut faces the Earth while performing a spacewalk.
• 40. আরিফু ল ইসলাম 40 The word is also used figuratively to mean the lowest point of a person's spirits, or the quality of an activity or profession. Fig: This diagram depicts a satellite observing backscattered sunlight in the nadir viewing geometry. The term nadir can also be used to represent the lowest point reached by a celestial body during its apparent orbit around a given point of observation. This can be used to describe the location of the Sun, but it is only technically accurate for one latitude at a time, and only possible at low latitudes. The sun is said to be at the nadir at a location when it is at the zenith at the location's antipode. In oncology, the term nadir is used to represent the lowest level of a blood cell count while a patient is undergoing chemotherapy; the neutropenic nadir after chemotherapy typically occurs within 7–10 days of treatment. Fig: Diagram showing the relationship between the zenith, the nadir, and different types of horizon. The nadir is opposite the zenith.
• 41. আরিফু ল ইসলাম 41 Image Processing and Analysis Many image processing and analysis techniques have been developed to aid the interpretation of remote sensing images and to extract as much information as possible from the images. The choice of specific techniques or algorithms to use depends on the goals of each individual project. In this section, we will examine some procedures commonly used in analysing/interpreting remote sensing images. Pre-Processing Prior to data analysis, initial processing on the raw data is usually carried out to correct for any distortion due to the characteristics of the imaging system and imaging conditions. Depending on the user's requirement, some standard correction procedures may be carried out by the ground station operators before the data is delivered to the end-user. These procedures include radiometric correction to correct for uneven sensor response over the whole image and geometric correction to correct for geometric distortion due to Earth's rotation and other imaging conditions (such as oblique viewing). The image may also be transformed to conform to a specific map projection system. Furthermore, if the accurate geographical location of an area on the image needs to be known, ground control points (GCPs) are used to register the image to a precise map (geo-referencing). Image Enhancement Enhancements are used to make imagery easier to interpret and understand visually. The advantage of digital imagery is that it allows us to manipulate the digital pixel values in an image. Although radiometric corrections for illumination, atmospheric influences, and sensor characteristics may be done prior to distribution of data to the user, the image may still not be optimized for visual interpretation. Remote sensing devices, particularly those operated from satellite platforms, must be designed to cope with levels of target/background energy which are typical of all conditions likely to be encountered in routine use. With large variations in spectral response from a diverse range of targets (e.g. forest, deserts, snowfields, water, etc.) no generic radiometric correction could optimally account for and display the optimum brightness range and contrast for all targets. Thus, for each application and each image, a custom adjustment of the range and distribution of brightness values is usually necessary.
• 42. আরিফু ল ইসলাম 42 Multispectral SPOT image of the same area shown in a previous section, but acquired at a later date. Radiometric and geometric corrections have been done. The image has also been transformed to conform to a certain map projection (UTM projection). This image is displayed without any further enhancement. In the above unenhanced image, a bluish tint can be seen all over the image, producing a hazy appearance. This hazy appearance is due to scattering of sunlight by the atmosphere into the field of view of the sensor. This effect also degrades the contrast between different landcovers. It is useful to examine the image histograms before performing any image enhancement. The x-axis of the histogram is the range of the available digital numbers, i.e. 0 to 255. The y-axis is the number of pixels in the image having a given digital number. The histograms of the three bands of this image are shown in the following figures.
• 43. আরিফু ল ইসলাম 43 Histogram of the XS3 (near infrared) band (displayed in red). Histogram of the XS2 (red) band (displayed in green). Histogram of the XS1 (green) band (displayed in blue). Note that the minimum digital number for each band is not zero. Each histogram is shifted to the right by a certain amount. This shift is due to the atmospheric scattering component adding to the actual radiation reflected from the ground. The shift is particularly large for the XS1 band compared to the other two bands due to the higher contribution from Rayleigh scattering at the shorter wavelength. The maximum digital number of each band is also not 255. The sensor's gain factor has been adjusted to anticipate any possibility of encountering a very bright object. Hence, most of the pixels in the image have digital numbers well below the maximum value of 255. The image can be enhanced by a simple linear grey-level stretching. In this method, a lower threshold value is chosen so that all pixel values below this threshold are mapped to zero. An upper threshold value is also chosen so that all pixel values above this threshold are mapped to 255. All other pixel values are linearly interpolated to lie between 0 and 255. The lower and upper thresholds are usually chosen to be values close to the minimum and maximum pixel values of the image. The Grey-Level Transformation Table is shown in the following graph. Grey-Level Transformation Table for performing linear grey-level stretching of the three bands of the image. Red line: XS3 band; Green line: XS2 band; Blue line: XS1 band.
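A sketch of this linear stretch in NumPy. Choosing the thresholds from the 2nd and 98th percentiles is one common convention, assumed here rather than taken from the text:

import numpy as np

def linear_stretch(band, lower=None, upper=None):
    # Map DNs below `lower` to 0, above `upper` to 255, and linearly
    # interpolate everything in between -- the stretch described above.
    band = np.asarray(band, dtype=float)
    if lower is None:
        lower = np.percentile(band, 2)    # a common default, not a rule
    if upper is None:
        upper = np.percentile(band, 98)
    stretched = (band - lower) / (upper - lower) * 255.0
    return np.clip(np.round(stretched), 0, 255).astype(np.uint8)

# Example: a band whose DNs occupy only 40..180 of the available 0..255.
band = np.random.randint(40, 181, size=(100, 100))
out = linear_stretch(band)
print(band.min(), band.max(), "->", out.min(), out.max())  # full 0..255 range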
• 44. আরিফু ল ইসলাম 44 The result of applying the linear stretch is shown in the following image. Note that the hazy appearance has generally been removed, except for some parts near the top of the image. The contrast between different features has been improved. Multispectral SPOT image after enhancement by a simple linear grey-level stretching. Passive Remote Sensing Remote sensing systems which measure energy that is naturally available are called passive sensors. Passive sensors can only be used to detect energy when the naturally occurring energy is available. Every day the Earth's surface is hit by incoming electromagnetic (EM) radiation from the sun. This EM radiation is known as incident energy (Ei). There are three fundamental energy interactions with incident energy: 1) reflected energy (Er) 2) absorbed energy (Ea) 3) transmitted energy (Et) Incident Energy Formula: Ei = Er + Ea + Et
• 45. আরিফু ল ইসলাম 45 Active Remote Sensing The sensor emits radiation which is directed toward the target to be investigated. The radiation reflected from that target is detected and measured by the sensor. Active remote sensing can be carried out at any time of the day. Active remote sensors create their own electromagnetic energy that • is transmitted from the sensor toward the terrain • interacts with the terrain producing a backscatter of energy • is recorded by the remote sensor's receiver. Active remote sensing, that is, using radiation-emitting devices or substances to produce electromagnetic waves that 'artificially' illuminate the Earth's surface: as in the act of 'seeing', one would turn on a light when it is too dark to see.

Active | Passive
Uses an artificial energy source | Uses a natural energy source (the sun)
Can operate at any time of the day | Works only during the day, when the sun is present
Wavelengths can be chosen to suit the application | Sensors measure visible, infrared, ultraviolet and other solar radiation reflected from the object
High cost | Low cost
Example: LIDAR image | Example: Landsat image
• 46. আরিফু ল ইসলাম 46 Image resolution Digital images are made up of thousands of pixels (blocks of colour), and the number of pixels in the image determines how high the image's resolution is. The image's resolution is the number of pixels divided by the size at which it is viewed - so if the image is 720 pixels wide by 720 pixels tall, and it is being viewed at 10 inches square, it has a resolution of 72 pixels per inch (the inch-based unit is a legacy of print conventions that predate metrication). Resolution refers to the number of pixels in an image. Resolution is sometimes identified by the width and height of the image as well as the total number of pixels in the image. For example, an image that is 2048 pixels wide and 1536 pixels high (2048 × 1536) contains 2048 × 1536 = 3,145,728 pixels (or 3.1 megapixels). Advantage of high resolution images Sub-meter (VHR) resolution satellites provide users a smaller ground sampling distance (GSD) than other imaging satellites, making them more suitable for reliable Earth observation, site monitoring, object identification and many other tasks requiring precision data. Whether natural or man-made, objects and landscapes may be examined and interpreted using very high spatial resolution satellite imagery. Among its many advantages, VHR imagery enables identification of specific models of vehicles, vessels and airplanes, as well as structural changes, consequences of natural disasters and monitoring of operations - whether of industrial facilities or farm livestock. This versatility makes EROS imagery a powerful resource for civilian and military needs. Sub-half-meter imagery provides more accurate feature detection and identification, allowing you to detect smaller features and identify them more accurately. This will aid you in more accurate and automated land use/land cover mapping, and provide improved change monitoring by identifying smaller changes. Examples: mapping, natural resource management, urban planning, emergency response. Location-based Services With enhanced resolution capabilities, companies in the Location Based Services (LBS) industry will be able to provide their customers and portal users with crisper, clearer base map images, allowing for better recognition of features.
• 47. আরিফু ল ইসলাম 47 Example Applications: > Automobile and personal navigation mapping > Fleet management > Nautical navigation > Online portals Energy A wide range of energy-specific industries will be able to take advantage of our higher resolution capabilities. From more accurate monitoring of pipeline assets or construction areas to higher levels of detail and accuracy in volumetric calculations, our enhanced imagery completeness will help you do your job more efficiently and effectively. Example Applications: > Oil & Gas facility and resource monitoring > Drilling and resource exploration > Emergency management and response > Industry production information gathering Disadvantage of low resolution images Conversely, low resolution imagery cannot resolve features smaller than its ground sampling distance: small objects cannot be detected or identified, and subtle changes go unnoticed, which limits detailed mapping and change monitoring. Real-time data Real-time data denotes information that is delivered immediately after collection. There is no delay in the timeliness of the information provided. Real-time data is often used for navigation or tracking. Some uses of this term confuse it with the term dynamic data.
• 48. আরিফু ল ইসলাম 48 Solid angle In geometry, a solid angle (symbol: Ω) is the two-dimensional angle in three-dimensional space that an object subtends at a point. It is a measure of how large the object appears to an observer looking from that point. In the International System of Units (SI), a solid angle is expressed in a dimensionless unit called a steradian (symbol: sr). Formally, the solid angle subtended by an area A on a sphere of radius r is Ω = A / r². A small object nearby may subtend the same solid angle as a larger object farther away. For example, although the Moon is much smaller than the Sun, it is also much closer to Earth. Indeed, as viewed from any point on Earth, both objects have approximately the same solid angle as well as apparent size. This is evident during a solar eclipse. Look Angle (1) The elevation and azimuth at which a particular satellite is predicted to be found at a specified time. (2) In radar terminology, the angle between the vertical plane passing through the radar antenna and the line between the antenna and the object. Look angles are required such that the earth station antenna points, or "looks at", the satellite directly. From a location on earth, the two look angles that are needed are azimuth and elevation. Since XTAR satellites are geostationary, these look angles are constant for fixed ground antennae. The look angle calculator determines the azimuth and elevation angles for XTAR satellites based on the input earth station latitude (°N) and longitude (°E) geographical locations. Look Angle Calculator To use for XTAR satellites, make the following selections and entries:  Select either: o 29E XTAR-EUR o 30W HISPASAT 1C (aka XTAR-LANT aka SpainSat)
• 49. আরিফু ল ইসলাম 49  Select your location either by interactively clicking on the map or by entering your address into the dropdown. Alternatively, your latitude (°N) and longitude (°E) can be entered into this dropdown in decimal degrees (separated by a comma, without parentheses)  The resulting azimuth and elevation look angles appear interactively at the base of the map.  The magnetic azimuth is also given for compass reference.  The LNB skew is not used for XTAR since our satellites use circular polarization.  The Options box for "show obstacle" can be checked to display the maximum height h and minimum distance d that an obstacle can have without blocking the line of sight from your location to the satellite.
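For a fixed station and a geostationary satellite, the azimuth and elevation that such a calculator returns can be approximated with standard spherical-earth formulas. A first-order sketch, ignoring station altitude and the earth's oblateness; the example coordinates are assumed, not taken from the text:

import math

R_EARTH = 6378.137      # equatorial radius, km
R_GEO = 42164.0         # geostationary orbit radius, km

def look_angles(lat_deg, lon_deg, sat_lon_deg):
    # Elevation and azimuth (degrees clockwise from true north) to a
    # geostationary satellite, assuming a spherical earth.
    lat = math.radians(lat_deg)
    dlon = math.radians(sat_lon_deg - lon_deg)
    # Central angle between the station and the sub-satellite point.
    cos_gamma = math.cos(lat) * math.cos(dlon)
    gamma = math.acos(cos_gamma)
    elevation = math.degrees(math.atan2(cos_gamma - R_EARTH / R_GEO,
                                        math.sin(gamma)))
    # Great-circle bearing from the station to the sub-satellite point.
    azimuth = math.degrees(math.atan2(math.sin(dlon),
                                      -math.sin(lat) * math.cos(dlon))) % 360
    return elevation, azimuth

# Example: a station at 40 N, 4 W pointing to a satellite at 29 E.
el, az = look_angles(40.0, -4.0, 29.0)
print(f"elevation {el:.1f} deg, azimuth {az:.1f} deg")   # ~32.7 deg, ~134.7 deg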
• 50. আরিফু ল ইসলাম 50 Spectral Signature In Remote Sensing Features on the Earth reflect, absorb, transmit, and emit electromagnetic energy from the sun. Special digital sensors have been developed to measure all types of electromagnetic energy as it interacts with objects in all of the ways listed above. The ability of sensors to measure these interactions allows us to use remote sensing to measure features and changes on the Earth and in our atmosphere. A measurement of energy commonly used in remote sensing of the Earth is reflected energy (e.g., visible light, near-infrared, etc.) coming from land and water surfaces. The amount of energy reflected from these surfaces is usually expressed as a percentage of the amount of energy striking the objects. Reflectance is 100% if all of the light striking an object bounces off and is detected by the sensor. If none of the light returns from the surface, reflectance is said to be 0%. In most cases, the reflectance value of each object for each area of the electromagnetic spectrum is somewhere between these two extremes. Across any range of wavelengths, the percent reflectance values for landscape features such as water, sand, roads, forests, etc. can be plotted and compared. Such plots are called "spectral response curves" or "spectral signatures." Differences among spectral signatures are used to help classify remotely sensed images into classes of landscape features, since the spectral signatures of like features have similar shapes. The figure below shows differences in the spectral response curves for healthy versus stressed sugar beet plants. The more detailed the spectral information recorded by a sensor, the more information can be extracted from the spectral signatures. Hyperspectral sensors have much more detailed signatures than multispectral sensors and thus provide the ability to detect more subtle differences in aquatic and terrestrial features.
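As a closing illustration of how signature differences drive classification, the sketch below assigns a pixel's spectrum to the reference signature nearest in Euclidean distance, i.e. the minimum-distance idea mentioned in the supervised classification section. The signature values are hypothetical, not measured data:

import numpy as np

# Hypothetical mean reflectance signatures (%) in four bands
# (green, red, NIR, SWIR) -- illustrative values only.
signatures = {
    "water":  np.array([ 6,  4,  2,  1]),
    "soil":   np.array([18, 22, 28, 32]),
    "forest": np.array([ 8,  5, 45, 20]),
}

def classify_spectrum(spectrum):
    # Assign the class whose signature is nearest in Euclidean distance.
    spectrum = np.asarray(spectrum, dtype=float)
    return min(signatures,
               key=lambda c: np.linalg.norm(signatures[c] - spectrum))

print(classify_spectrum([7, 6, 40, 18]))   # -> 'forest'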