An Introduction to Remote Sensing

This chapter is an attempt to make sense of Remote Sensing and to reach a basic understanding that makes future lectures and supporting reading a bit easier to follow. As a visual learner, I have opted for a visual representation of many concepts and processes, as that works for me (feel free to look up the underlying theory).

So let's understand a few basics…

How does this work?

What are these bands?

Is this word a satellite or a sensor?

Terms

The images below schematically depict a few of these terms.

Wave interactions, SOURCE: Obaid and Al-Rahim (2020)

Wavelength and Energy, SOURCE: NASA Science

Propagation of the electric and magnetic fields associated with an electromagnetic wave. SOURCE: emagtech

Zenith, Nadir, Meridian, Astronomical Horizontal and True Horizon. SOURCE: trekview

Schematic depicting the solar zenith angle, solar altitude angle and solar azimuth angle. SOURCE: thesolarlabs and Zhang et al. (2021)

Hyperspectral Data Cubes, SOURCE: trekview

Remote Sensing Process

"Remote sensing combines science and technology to acquire information about an object, area, or phenomenon by measuring reflected and emitted radiation using a device that's without direct physical contact (typically from satellite or aircraft)" geospatialuk

Remote Sensing process, SOURCE: Fundamentals of Remote Sensing

The following table explains the process represented in the above image.

Code | Interaction Process
A Energy Source or illumination

Energy form:

  • Electromagnetic radiation (EMR)

Energy source, which

  • illuminates the target of interest
  • provides electromagnetic energy to the target of interest
  • or the sensed energy is emitted by the target itself
B Radiation and the Atmosphere

Energy and Atmosphere

Interactions: 2

Energy travelling pattern:

  • Source > atmosphere (contact and interact) > target
  • Target > atmosphere (contact and interact) > sensor
C Interaction with the Target

Energy and Target Interaction dependencies:

  • Target properties
  • Radiation properties
D Recording of energy by the Sensor

Target energy output:

  • scattering
  • emission

Sensor location: Remote (not in contact with the target)

Sensor role:

  • Collection of EMR/ energy
  • Record of EMR/ energy
E Transmission, Reception and processing

Sensor role:

  • Transmission of energy recorded (electronic form)

Receiving and processing station role:

  • Receiving the transmitted energy record
  • Data processing
  • Output: target image (hardcopy and/or digital)
F Interpretation and Analysis

Processed target Image:

  • Interpretation
  • Analysis
G Application

Interpretation application

  • To better understand the target
  • Reveal some new information
  • Assist in solving a particular problem

Electromagnetic Waves (EMW)

  • Composition of oscillating electric and magnetic fields (Maxwell's equations)
  • EM radiation transmission: can transmit in vacuum
    • frequency, time period and wavelength= dependent on the producing source
    • velocity= dependent on medium in which it is travelling
    • high frequency of propagation= increase in accuracy
  • light 🌞 = visible part = colours (corresponding to different wavelengths of light), e.g. a rainbow 🌈
  • Sound 🔊 = invisible waves
    • travels through molecules present in air (molecules bumping into each other)

Diagram of the Electromagnetic Spectrum. SOURCE: NASA Science

  • Longer wavelength = lower 🔽 energy = lower 🔽 frequency

  • Shorter wavelength = higher 🔼 energy = higher 🔼 frequency
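A tiny Python sketch (not from any of the sources above; the wavelengths are just illustrative picks) of the relations behind those two bullets, c = λ·ν and E = h·ν: the longer the wavelength, the lower the frequency and the photon energy.

```python
# Minimal sketch: wavelength vs frequency vs photon energy.
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light in vacuum, m/s

for name, wavelength_m in [("blue light", 450e-9),
                           ("red light", 650e-9),
                           ("thermal IR", 10e-6),
                           ("C-band microwave", 0.056)]:
    nu = c / wavelength_m    # frequency in Hz
    energy_j = h * nu        # photon energy in J
    print(f"{name:18s} lambda={wavelength_m:.3e} m  nu={nu:.3e} Hz  E={energy_j:.3e} J")
```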

Visible light in Electromagnetic Spectrum. SOURCE: QGEO

  • Wonder how BIG these waves are??

    The image below indicates EMW from radio up to gamma.

    • Radio waves = very long (from the size of a building to the size of a coin)
    • Gamma rays = very small (smaller than atomic nuclei)

Scale of the Electromagnetic Spectrum. SOURCE: NASA Science

I see COLOUR? 👀

reflection of EMW. SOURCE: DAI

REFLECTED (vegetation) = green light + infrared light; our eyes only see the green part, so vegetation appears GREEN

  • Geospatial data acquisition (majority) = sensing in the visible and infrared range
  • UV portion covers the shortest wavelengths of practical use for earth observation
    • Application: some properties of minerals
    • familiar application: UV rays used to detect forged bank notes
  • Microwaves = longer wavelengths
    • Application: information on surface roughness and moisture content of soils
  • Green = 0.54 μm (solar radiation maximum intensity)
  • Beyond red = IR (infrared)
    • IR familiar application: night-vision security cameras

Infrared

  • Near-infrared (NIR)
  • Mid-IR (SWIR- short-wave infrared)
  • Thermal Infrared (TIR)

NIR SWIR TIR

To understand this better, let's take the example of a deciduous forest.

NIR energy:

  • Deciduous trees appear bright on photographic film that is sensitive to infrared
  • They reflect more near-infrared (NIR) energy
  • Healthy vegetation = high reflectance in the NIR range
  • Damaged vegetation = low reflectance in the NIR range (see the sketch after this list)
  • Does not cause the sensation of 'heat'
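As a small aside: the "healthy = bright in NIR, dark in red" behaviour is exactly what the widely used NDVI index, (NIR − Red)/(NIR + Red), captures. NDVI is not mentioned in the notes above, so treat this as an illustrative sketch with made-up reflectance values.

```python
import numpy as np

# Illustrative only: NDVI turns the NIR-vs-red contrast into a single number.
# Reflectance values below are invented for the example.
red = np.array([0.05, 0.08, 0.20])   # healthy leaf, stressed leaf, bare soil
nir = np.array([0.50, 0.30, 0.25])

ndvi = (nir - red) / (nir + red)
print(ndvi)   # higher values -> denser / healthier vegetation
```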

MID-IR aka SWIR

  • Application: monitor surface features at night
  • Does not cause the sensation of 'heat'

Thermal Infrared (TIR)

  • IR with wavelengths longer than 3 μm
  • Causes the sensation of 'heat'
  • Thermal emission of the Earth's surface (at about 300 K)
    • peak wavelength of 10 μm (see the sketch after this list)
  • Human thermal detectors
    • wavelength range 7 to 14 μm
  • NOAA's thermal scanner (National Oceanic and Atmospheric Administration, a US scientific and regulatory agency)
    • detects thermal IR radiation in the range 3.5 to 12.5 μm

    • Application: environmental problems, analysing the mineral composition of rocks, the condition of vegetation, etc.
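The 300 K / ~10 μm pairing quoted above follows from Wien's displacement law, λ_max ≈ b/T with b ≈ 2898 μm·K. A quick sketch (the ~6000 K Sun value is the figure used later in the atmosphere section):

```python
# Wien's displacement law: lambda_max = b / T.
# Reproduces the numbers quoted above: ~10 um for the 300 K Earth surface and
# ~0.5 um (green) for a ~6000 K Sun-like source.
B_UM_K = 2898.0  # Wien constant in micrometre * kelvin

for name, temperature_k in [("Earth surface", 300.0), ("Sun (approx.)", 6000.0)]:
    peak_um = B_UM_K / temperature_k
    print(f"{name}: peak emission at about {peak_um:.2f} um")
```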

Vegetation spectral signature. Vegetation has low reflectance in the visible region and high reflectance in the near infrared. SOURCE: SEOS

Cellular leaf structure and its interaction with electromagnetic energy. Most visible light is absorbed, while almost half of the near infrared energy is reflected. SOURCE: SEOS

Dust storm across the Red Sea: this LST image shows the dust as a cool blue streak over the hot and arid landscape between Egypt and Saudi Arabia on 13 May, 2005. SOURCE: visible earth

Application of different spectral ranges. SOURCE: SEOS

Spectral Reflectance Properties

So how do we see these characteristics of earth surface features or materials??

Spectral Reflectance Properties!! By analysing spectral reflectance patterns or spectral signatures.

  • Signatures can be visualised in spectral reflectance curves

  • Curves = reflectance as a function of wavelength

    • Eg: the image below for water, soil and vegetation (a small sketch follows the figure)

Spectral signatures of soil, vegetation and water, and spectral bands of LANDSAT 7. SOURCE: SEOS
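A rough sketch of how such spectral signatures could be handled in code: each material is just an array of reflectance values per wavelength, and an unknown pixel spectrum can be compared against the stored curves. The numbers below are invented, illustrative shapes only, not the measured curves from the figure.

```python
import numpy as np

# Toy spectral signatures (reflectance vs wavelength in micrometres).
wavelengths = np.array([0.45, 0.55, 0.65, 0.85, 1.65])   # blue, green, red, NIR, SWIR
signatures = {
    "water":      np.array([0.08, 0.06, 0.04, 0.01, 0.00]),
    "bare soil":  np.array([0.10, 0.15, 0.20, 0.28, 0.35]),
    "vegetation": np.array([0.05, 0.12, 0.06, 0.50, 0.25]),
}

# A pixel spectrum can be labelled by finding the closest stored signature.
pixel = np.array([0.06, 0.11, 0.07, 0.45, 0.22])
best = min(signatures, key=lambda k: np.sum((signatures[k] - pixel) ** 2))
print("closest signature:", best)   # -> vegetation
```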

EMW Interaction

In the sections above we looked at what EMW is and a few of its applications. In this segment we shall look at how it interacts as it travels.

Energy interactions in the atmosphere and at the surface

Energy interactions in the atmosphere and at the surface. SOURCE: Tempfli et al. (2009a)

Atmosphere

Absorption and Transmission

So let us look into the first two: absorption and transmission.

A schematic representation of the atmospheric transmission (0 to 22 µm wavelength range). SOURCE: Tempfli et al. (2009b)

  • Many wavelengths are not useful for RS of the Earth's surface (none of the corresponding energy can penetrate the atmosphere)
  • Spectrum portions outside the main absorption ranges = useful range
  • Useful range = Atmospheric Transmission Windows
    • ONE window from 0.4 to 2 μm (visible, NIR, SWIR); remote sensors operating in this range are often referred to as optical sensors
    • THREE windows in the TIR range: two narrow windows around 3 and 5 μm, and a third, relatively broader one from approximately 8 to 14 μm
  • Longer wavelengths = strong absorption
    • Range 22 μm to 1 mm
    • Very low energy transmission
  • Microwave range = transparent
    • Range = beyond 1 mm
  • Measured outside the atmosphere: solar radiation resembles black-body radiation at 6000 K
  • Measured at the Earth's surface:
    • the spectral distribution of solar energy is very ragged
  • Graph above:
    • dips = atmospheric absorption by different gases
    • as the energy travels towards the Earth's surface, its intensity is reduced by these absorbing gases
Scattering

  • Occurrence: when radiation passes from one medium (air, water, etc.) to another
  • Light >> medium >> part of the light is absorbed by the medium >> the rest is scattered
  • Intensity of scattered light = depends on particle size + wavelength
  • High scattering = short wavelength + high frequency (wavier = higher chance of collision with particles); see the sketch after the factor list below
  • Low scattering = longer wavelength + low frequency (less wavy / straighter in comparison = lower chance of collision with particles)

Factors:

  • Wavelength of the radiation
  • Amount of particles and gases
  • Distance the radiant energy travels through the atmosphere
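For Rayleigh scattering specifically, the usual rule of thumb (background knowledge, not stated in the notes above) is that scattering strength goes roughly as 1/wavelength⁴, which is why blue is scattered so much more than red:

```python
# Rayleigh scattering strength varies roughly as 1 / wavelength**4.
blue_nm, red_nm = 450.0, 650.0
ratio = (red_nm / blue_nm) ** 4
print(f"blue light is scattered about {ratio:.1f}x more strongly than red light")
```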

Rayleigh Scattering. SOURCE: Tempfli et al. (2009a)

Rayleigh scattering causes us to see a blue sky during the day and a red sky at sunset. SOURCE: Tempfli et al. (2009c)

Mie Scattering. Source: apollo

Nonselective scattering. SOURCE:

Rayleigh Scattering | Mie Scattering | Nonselective Scattering

Rayleigh Scattering

  • Particle size: very small compared to the wavelength (atmospheric particle size < incoming radiation wavelength), roughly 0.0001 μm to 0.001 μm
  • Particles: small specks of dust, or nitrogen and oxygen molecules
  • Scattering result: shorter wavelengths of energy are scattered much more than longer wavelengths (blue is scattered more)
  • Location (dominant scattering mechanism): upper atmosphere
  • Effect on RS:
    • disturbs RS in the visible spectral range from high altitudes
    • distortion of the spectral characteristics of the reflected light
    • shorter wavelengths are overestimated
    • blueness of colour photos taken from high altitudes
    • diminishes the "crispness" of photos > reduces their interpretability
    • negative effect on digital classification using data from multispectral sensors
  • Example: blue sky during the day, red sky at sunset

Mie Scattering

  • Particle size: about the same as the wavelength of the radiation (atmospheric particle size ~ incoming radiation wavelength), roughly 0.01 μm to 1.0 μm
  • Particles: dust, pollen, smoke and water vapour (larger particles)
  • Scattering result: affects longer wavelengths more than Rayleigh scattering does
  • Location: lower portions of the atmosphere (dominates when cloud conditions are overcast)
  • Effect on RS:
    • influences the spectral range from the near-UV up to the mid-IR range
    • greater effect on radiation of longer wavelengths than Rayleigh scattering
  • Example: dominates under overcast, cloudy conditions

Nonselective Scattering

  • Particle size: much larger than the radiation wavelength (atmospheric particle size > incoming radiation wavelength), roughly 10 μm to 100 μm
  • Particles: water droplets and larger dust particles
  • Scattering result: independent of wavelength within the optical range; white light is scattered in all directions
  • Effect on RS: clouds have a limiting effect on RS, e.g. cloud cover
  • Example: clouds appear as white bodies
    • a cloud consists of water droplets
    • since water droplets scatter light of every wavelength equally, a cloud appears white
During the day

  • solar radiation travels the shortest distance through the atmosphere
  • output: blue / clear sky

During sunrise and sunset

  • solar radiation travels a longer distance through the atmosphere

  • all the radiation of shorter wavelengths is scattered away

  • only the longer wavelengths reach the Earth's surface; output: orange or red sky

  • No particles = no scattering = black sky

Earth's Surface

  • The proportion of reflected, absorbed and transmitted energy varies with wavelength and material type.
  • Reflection depends on: the surface roughness relative to the wavelength of the incident radiation, and the angle of incidence.

Specular Reflection

  • Mirror-like reflection
  • Surface = smooth
  • Reflection = single direction (mostly all energy)
  • Angle of reflection = angle of incidence
  • Eg: water surface, glasshouse roof
  • Output = hot spot (very bright spot in an image)
  • Specular reflections do not contain spectral information on the "colour" of the reflecting surface

Diffuse Reflection

  • Surface = rough
  • Reflection = uniform in all directions
  • Diffuse reflections contain spectral information on the "colour" of the reflecting surface.

Spectral signatures of different Earth features within the visible light spectrum. SOURCE: earthdata

Reflectance Characteristics

Vegetation

  • Depending factors:
    • properties of the leaves
    • orientation and structure of the leaf canopy
  • Leaf properties that matter:
    • leaf pigmentation
    • leaf thickness
    • composition (cell structure)
    • amount of water in the leaf tissue
  • Application (optical remote sensing):
    • information about the type of plant
    • plant health condition
  • Reflects: highest in the NIR range
  • Incident energy reflected: may reflect up to 50%
  • Energy absorbed: absorption bands at 1.45 and 1.95 μm
  • Examples:
    • dry leaves = higher reflectance in the SWIR, while NIR reflectance may decrease
    • healthy vegetation = reflection of the blue and red components of incident light is comparatively low, because these portions are absorbed by the plant

Bare Soil

  • Depending factors:
    • soil colour
    • moisture content
    • presence of carbonates
    • iron oxide content
  • Reflects: across the visible and SWIR range
  • Incident energy reflected: up to 30–40%
  • Energy absorbed: the curve has a convex shape over the range 0.5 to 1.3 μm, with dips at 1.45 μm and 1.95 μm

Water

  • Lower reflectance than vegetation and soil
  • Reflects: only a little in the NIR
  • Incident energy reflected: at most 10%
  • Energy absorbed: almost all energy beyond 1.2 μm
  • Examples:
    • turbid (silt-loaded) water has the highest reflectance
    • water containing plants shows a pronounced reflectance peak for green light (chlorophyll)

BRDF The Bidirectional Reflectance Distribution Function: It gives the reflectance of a target as a function of illumination geometry and viewing geometry. It depends on:

  • Wavelength
  • Determined by the structural and optical properties of the surface,
    • shadow-casting, multiple scattering, mutual shadowing, transmission, reflection, absorption, and emission by surface elements, facet orientation distribution, and facet density.

Bidirectional Reflectance Distribution Function Cause. SOURCE: umb

more: umb

Sensors

Remote sensor

  • Device that detects EM energy
  • Quantifies + records the EM energy in an analogue or digital way

Active and Passive sensors. SOURCE: Tempfli et al. (2009d)

Common Remote-Sensing Platform and Sensor Combinations. SOURCE: Lechner, Foody, and Boyd (2020)

Image: Common Remote-Sensing Platform and Sensor Combinations and Remote-Sensing Data. (Left) Platforms and most commonly utilized sensors for specific platforms. (Right) True-color digital aerial photography and false color with NIR sensing (top), LiDAR point cloud of vegetation near a river (middle), and SAR data for two polarizations from Sentinel 1 (bottom).

Active | Passive

  • Active: use their own source of energy. Passive: do not have their own source of energy.
  • Active: independent of solar radiation. Passive: rely on the Sun / solar energy, the Earth's heat emission, or other naturally occurring radiation.
  • Active: mostly work in the microwave region of the EMR spectrum
    • can penetrate clouds
    • not affected by rain and snow
  • Passive: record electromagnetic energy that is
    • reflected (e.g., blue, green, red, and infrared light)
    • emitted (e.g., thermal infrared radiation from the Earth)
  • Active: can choose any radiation from the EM spectrum. Passive: detect reflected or emitted radiation from the ultraviolet to the thermal infrared.
  • Active: an all-weather, day-night system. Passive: depend on good weather conditions.
  • Active: use both transmitter and receiver units to produce imagery. Passive: relatively simple both mechanically and electrically.
  • Active: require high energy levels. Passive: do not have high power requirements.
  • Active: relatively independent of atmospheric scattering.

RADAR signal

  • does not detect colour information
  • does not detect temperature information
  • penetrates vegetation and soil
  • can detect roughness, slope and electrical conductivity of objects
  • Information about: surface layer, soil moisture content
  • Application: oceanography, hydrology, geology, glaciology, agriculture and forestry services

  • Passive: in wavebands where natural emittance or reflectance levels are low, high detector sensitivities and wide radiation-collection apertures are necessary to obtain a reasonable signal level; most passive sensors are therefore relatively wide-band systems.
  • Analysis can be complicated and cost-intensive.
  • Active types: laser altimeter, scanning electron microscopes, LiDAR, radar, GPS, x-ray, sonar, infrared and seismic (these exist in both active and passive forms). Passive devices, e.g.: spectrometer, radiometer, spectroradiometer, hyperspectral radiometer, imaging radiometer, sounder, accelerometer.
  • Platforms: mounted on a satellite, an airplane, a boat, a building top or a submarine; also UAVs/drones.
  • Active applications: cartography, resource exploration, atmospheric and chemical measurements. Passive applications: agriculture and forestry, marine sciences and rescue missions, weather forecasting, etc.

Radar:

  • radio signals
  • feature: antenna emitting impulses
  • active remote sensing > energy flow > obstacle > scatters back > sensor (to some degree)
  • target distance = derived from the signal travel time; backscatter amount = information about the target
  • A radar Technique: Synthetic aperture radar (SAR)

Lidar:

  • light
  • active remote sensing > transmits a light impulse > receives the reflected pulse
  • target distance= multiplying the time by the speed of light

Laser Altimeter:

  • Measures elevation
  • Uses Lidar

SOURCE: brainkart, ltb, eos

more on sensors: earthdata

Active Sensors

Spatial resolution of radar data ~ ratio of the sensor wavelength to the length of the sensor's antenna

  • Meaning: for a given wavelength,
  • longer antenna = higher spatial resolution (see the rough sketch below)
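A rough numerical sketch of that proportionality for a real-aperture radar, assuming azimuth resolution ≈ slant range × wavelength / antenna length (the slant-range factor and all the numbers are assumptions added here for illustration, not values from the notes):

```python
# Why a longer antenna gives finer resolution, and why SAR synthesizes one.
slant_range_m = 700e3      # hypothetical spaceborne geometry
wavelength_m = 0.24        # L-band, ~24 cm (as quoted for NISAR below)

for antenna_m in (10.0, 100.0, 1000.0):
    resolution_m = slant_range_m * wavelength_m / antenna_m
    print(f"antenna {antenna_m:7.1f} m -> azimuth resolution ~ {resolution_m:9.1f} m")
```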

SAR

  • Synthetic aperture radar (SAR)
  • sensor produces its own energy
  • active data collection
  • can see through night, any weather condition
  • wavelength: microwaves
  • A series of shorter antennas is combined to simulate a larger antenna
    • Result: higher resolution

Geometry of observations used to form the synthetic aperture for target P at along-track position x = 0. Credit: NASA SAR Handbook

SOURCE: earthdata

  • Radar sensors utilize longer wavelengths at the centimetre to meter scale
    • Helps: ability to see through clouds
  • Produces: fine-resolution images from a resolution-limited radar system
    • radar movement: straight line
    • carrier: an airplane or a satellite such as NISAR
    • NISAR
      • provide multiple polarization modes across its two radar bands
      • 24 cm wavelength L-SAR
      • 10 cm wavelength S-SAR.
  • How?
    • transmits microwave signals > Earth's surface > receives the backscattered signals > the sensor builds an image from the returned echoes (like a bat flying in a cave)
  • How is the resulting imagery built?
    • from the strength and time delay of the returned signal
    • which depend primarily on
      • the roughness and electrical conducting properties of the observed surface
      • its distance from the orbiting radar
  • SAR uses different wavelengths = bands
    • Bands: X, C, L, P
    • an important feature to consider
    • the wavelength determines
      • how far a signal can penetrate into the medium
      • how the signal interacts with the surface
      • to understand this, look at the X and L bands below

The electromagnetic spectrum with microwave bands inset. SOURCE: earthdata

SAR: band, frequency,wavelength and typical application. SOURCE: earthdata

Sensitivity of SAR measurements to forest structure and penetration into the canopy at different wavelengths used for airborne or spaceborne remote sensing observations of the land surface. SOURCE: earthdata

  • Polarization: orientation of the plane in which the transmitted electromagnetic wave oscillates
    • SAR = typically transmits linearly polarized signals
    • Signal polarization can be precisely controlled on both transmit and receive
    • Horizontal polarization = H
    • Vertical polarization = V
    • The polarization of the signals carries information about the structure of the imaged surface

SAR Polarizations. SOURCE: ASF

SAR signals are transmitted and received. SOURCE: ASF

SAR-Polarization. SOURCE: sorabatake

Scattering type | Typical surface | Transmitted + received polarization
Rough surface scattering | bare soil or water | VV
Volume scattering | leaves and branches in a forest canopy | VH or HV (cross-polarization)
Double-bounce scattering | buildings, tree trunks, or inundated vegetation | HH

Scattering Strength

Strong scattering in HH indicates a predominance of double-bounce scattering (e.g., stemmy vegetation, manmade structures), while strong VV relates to rough surface scattering (e.g., bare ground, water), and spatial variations in dual polarization indicate the distribution of volume scatterers (e.g., vegetation and high-penetration soil types such as sand or other dry porous soils). Credit: NASA SAR Handbook. SOURCE: earthdata

Polarimetry Decomposition. SOURCE: sorabatake

An example of polarimetry imagery from the airborne UAVSAR instrument obtained over Rosamond, California. Horizontal and vertical polarized signals were transmitted and the resulting backscatter signals received, resulting in three channels of imagery. When these separate images are colorized and overlaid, details and differences in surface features can been readily discerned. SOURCE: nisar

SAR Scatter Process. SOURCE: sorabatake

  • Interferometry (InSAR) – an analysis method
  • Uses: the phase information recorded by the sensor
  • Measures: distance from the sensor to the target
  • Measurement = very accurate
  • To measure changes in land surface topography??
    • Observations = at least 2 of the same target
    • Sensor data = distance + additional geometric information
  • Application:
    • identify areas of deformation from events like volcanic eruptions and earthquakes
  • SAR data limitations:
    • Tedious preprocessing steps (applying the orbit file, radiometric calibration, de-bursting, multilooking, speckle filtering, and terrain correction) earthdata
  • Sensors: earthdata
  • SAR distortion: alaska.edu
  • SAR Hand book: ntrs
  • Interferometry: nisar
  • SweepSAR: nisar

Passive Sensors

Passive sensors

Example: LANDSAT 8

Landsat 8 Bands. SOURCE: gisrsstudy

So now we have an idea of how sensors work. But how do SENSORS STORE INFORMATION???

  • The data received is translated => a Digital Image (which can be displayed on a digital screen)
  • What is this digital image???

NOISE

in remote sensing

  • REMOTE SENSING > any disturbance in a frequency band
  • DATA QUALITY > any irregular, sporadic, or random oscillation in the transmission signal
  • TELECOMMUNICATIONS > random or repetitive events that interfere with communication

Digital Image

A 2-dimensional array of pixels

Pixel. SOURCE: crisp

Digital format of an image. SOURCE: natural-resources

An example of a remote sensing image showing pixels and digital numbers; the arrow shows the progression in the level of detail of information which can be extracted from the images. Source: Phiri and Morgenroth (2017)

Bands

  • a band is a set of data file values
  • FOR?? a specific portion of the electromagnetic spectrum (reflected light or emitted heat)
  • Bands: red, green, blue, near-infrared, thermal infrared (we covered these above)
  • bands > GIS > bands = layers; additional layers can be created and added to the image file

Multispectral image:

  • Each band of a multispectral image can be displayed:
    • One band at a time as a grey scale image OR
    • Combination of three bands at a time as a color composite image

PIXEL

  • The tiny squares 🟥🟧 - look at the image above (just think of your phone's camera image)
  • Unit of image = pixel (Picture Elements)
  • Represent the relative reflected light energy recorded for that part of the image
  • Each pixel= square area on an image
  • Square area of an image = measure of the sensor's ability
  • Ability of what??
    • To see objects of different sizes
    • Each pixel has intensity value and Location address
    • Intensity value - represented as? Digital Number
    • Location address - row and column numbers
    • So how does the computer that receives it know which part of the image should be dark and which light?
      • Binary numbers!!
      • 0s and 1s = an on-off switch _(because computers like it that way!! we don't want them to be fussy; it's an inclusive approach)_
      • How does this happen??
        • converting the decimal system => binary system
        • 00 = 0
        • 01 = 1
        • 10 = 2
        • 11 = 3
    • Okay, so assigning is done… but now is 00 the darkest or 11??? (see the image below: binary = dark and light pixels)
      • 00 = darkest = black = ⚫
      • 11 = brightest = white = 💡
    • The issue: as in the image above, this would create an image with high CONTRAST… SO?
    • The spacecraft uses a string of 8-bit data
      • 8 bit = 8 binary digits = 00000000
      • Think of it as creating a gradient: adding more white to your black gradually until your black becomes white… get it? Or adding more and more milk to your coffee until you no longer taste the coffee.
      • Range in binary: 00000000 to 11111111
      • In the decimal system: 0 to 255
      • So on the spacecraft, which is the darkest?? It's the same, just with more digits.
        • 00000000 = darkest ⚫
        • 11111111 = brightest 💡
    • SO? What next???
      • This entire set of binary numbers (digital numbers between 0 and 255) is sent back from the spacecraft….
  • Eg: Landsat 7 🛰️
    • Sensor ✓: Enhanced Thematic Mapper Plus (ETM+)
    • Bands: 8-band whiskbroom scanning radiometer
    • Maximum resolution: 15 meters
    • Pixel (each):
      • area: 15 m x 15 m
      • pixel1 + pixel2 + pixel3 + … + pixel n = area
      • Example: false-colour image > count the number of green pixels = vegetation area (see the sketch after this list)
        • ⬆️ resolution = ⬆️ pixel resolution
    • Sensor with ⬆️ resolution = detects SMALL objects 👁️

Landsat 7 Bands, wavelength and Resolution. SOURCE: gisrsstudy

Summary of this section; the image below is just indicative.

Digital Image formation

SO next question, what format is it stored in??

Data Format

data format

Image data format. SOURCE: kangwon

other sources: gisgeography., e-education.psu, sar.kangwon

Resolutions

By now let us just accept that every word in remote sensing branches out and even has aerial roots like a banyan tree. So just accept it, it's a long journey.

Definition of the spatial resolution of a passive sensor system. SOURCE: geo-informatie

  • Measure of sensor ability to distinguish between signals
  • Information in RS= dependent on Resolution
  • role: how data from one sensor can be used
  • helps understand what type of data is required for a given study
  • 4 types of resolution for data set

4 RESOLUTIONS

Resolution SOURCE: earthdata

Spatial Resolution: size of the smallest object that can be detected in an image
Spectral Resolution: number of bands + the wavelength width of each band
Radiometric Resolution: sensitivity of a remote sensor to variations in reflectance levels
Temporal Resolution: how often a remote sensing platform can provide coverage of an area

Spatial Resolution

size of the smallest object that can be detected in an image

  • size of smallest object
  • unit of image= pixel
  • 1 m spatial resolution = each image pixel represents an area of 1 sq m (1 m x 1 m)

Resolution. SOURCE: ecoursesonline

Spatial Resolution. SOURCE: crisp

Spatial Resolution
Source: eos

High Resolution

  • Use: extreme precision
  • Pixel size: less than 1 meter, and 1-5 meters per pixel
  • Applications:
    • detection of crop diseases
    • detection of pests in precision agriculture
    • identification of erosive soil processes
    • detection of field borders and field mapping
    • livestock observation and management
    • deforestation detection and forestry management
    • detection and mitigation of local anomalies
    • 3D city modeling

Medium Resolution

  • Use: where extreme precision is not required
  • Pixel size: 5-30 meters per pixel
  • Applications:
    • crop health and growth monitoring
    • moisture and nutrient content monitoring
    • vegetation density monitoring
    • pest and disease detection
    • estimation of biodiversity loss in forest lands
    • identification of natural anomalies on a large scale
    • monitoring water bodies
    • urban expansion analysis

Low Resolution

  • Use: where precision is not required
  • Pixel size: 30-250 meters per pixel
  • Applications:
    • crop growth modeling
    • predicting yields
    • trend mapping
    • large-scale anomaly detection
    • monitoring infrastructure changes on a large scale

Spectral Resolution

Specification of LISS-III and LISS-IV sensors of IRS-P6. SOURCE: ecoursesonline

Number of bands + wavelength width of each band

  • BAND= narrow portion of the electromagnetic spectrum
  • narrow wavelength width = HIGHER spectral resolution
  • Multi-spectral imagery
  • measure: several wavelength bands (example: visible green or NIR)
  • Multi-spectral sensors: Landsat, Quickbird and Spot satellites
  • Hyperspectral imagery:
  • measures: energy in narrower and more numerous bands
  • more sensitive to variations in energy wavelengths = greater potential
  • Eg: used to check crop stress

Radiometric Resolution

Amount of information in each pixel!!

  • sensitivity of a remote sensor to variations in the reflectance levels
  • Information form? BIT = energy recorded
  • ⬆️ higher radiometric resolution = ⬆️ more sensitive = ⬆️ more precise picture (of a specific portion of the electromagnetic spectrum)
  • Sensitive how? - by detecting small differences in reflectance values
  • Small differences how? - ⬆️ more values stored = ⬆️ ability to detect the slightest differences in energy
  • How is the information recorded/stored?
    • in bits, as an exponent of 2 (see the sketch below)
    • eg: 8 bit = 2^8 resolution = 256 potential digital values (refer to the example above)
    • Application: water quality - distinguishing between subtle differences in ocean colour
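A one-liner sketch of the 2^bits relationship mentioned above (the bit depths in the loop are chosen arbitrarily for illustration):

```python
# Number of distinguishable grey levels for a given radiometric resolution: 2 ** bits.
for bits in (1, 2, 8, 12, 16):
    print(f"{bits:2d}-bit data -> {2 ** bits:6d} possible digital values "
          f"(0 to {2 ** bits - 1})")
```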

Advances in remote sensing technology have significantly improved satellite imagery. Among the advances are improvements in radiometric resolution, or how sensitive an instrument is to small differences in electromagnetic energy. Sensors with high radiometric resolution can distinguish greater detail and variation in light. Credit: NASA Earth Observatory images by Joshua Stevens, using Landsat data from the U.S. Geological Survey. SOURCE: earthdata

Temporal Resolution

Time for a satellite to complete an orbit and revisit the same observation area

  • how often a remote sensing platform can provide coverage of an area
  • Geo-stationary satellites = provide continuous sensing
  • eg cameras mounted on Airplanes = provide data for applications requiring more frequent sensing
  • Remote sensors located: fields or attached to agricultural equipment >> most frequent temporal resolution
  • normal orbiting satellites = provide data each time they pass over an area
  • Issues: cloud cover
  • eg: Polar orbiting satellites: temporal resolution = 1-16 days
    • MODIS sensor: resolution of 1-2 days- daily changes
    • OLI: narrow swath width: resolution of 16 days= bi-monthly changes= high spatial resolution
    • high spatial resolution = narrower swath = more time between revisits = lower temporal resolution

Multilayer Images

  • Of a certain area / one particular area on the ground
    • multiple measurements
    • each measurement forms an image (carries specific information)
    • STACKING these images = MULTILAYER IMAGE
  • MULTILAYER image formation (combination of):
    • the same sensor
    • different sensors
    • subsidiary data
    • Combination eg:
      • 3 layers = SPOT
      • 1 layer = ERS (synthetic aperture radar)
      • 1 layer = digital elevation map
    • For healthy vegetation: infrared goes into the red band, red into the green band, and green into the blue band. The blue band is omitted as it reflects too much light and is not useful. RESULT IMAGE: vegetation would show as red (if it is healthy); see the sketch below.
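A minimal sketch of that false-colour stacking, assuming three tiny made-up reflectance bands: NIR goes into the red display channel, red into green, green into blue, so healthy vegetation comes out red.

```python
import numpy as np

# Made-up 2x2 reflectance bands, scaled to 8-bit for display.
green = np.array([[0.10, 0.12], [0.11, 0.09]])
red   = np.array([[0.06, 0.07], [0.20, 0.05]])
nir   = np.array([[0.50, 0.48], [0.25, 0.52]])

composite = np.dstack([nir, red, green])            # (rows, cols, 3) display stack
composite_8bit = (composite * 255).astype(np.uint8)
print(composite_8bit[0, 0])   # a healthy-vegetation pixel is dominated by the red channel
```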

Multispectral Image

  • few layers
  • image from a particular wavelength band
  • Eg:
    • Sensors:
      • SPOT HRV
        • three wavelength bands
          • green (500 - 590 nm)
          • red (610 - 680 nm)
          • near infrared (790 - 890 nm)
        • meaning: each pixel = three intensity values, corresponding to the 3 bands
      • IKONOS
        • 4 bands: Blue, Green, Red and Near Infrared
      • LANDSAT
        • 7 Bands: blue, green, red, near-IR bands, two SWIR bands, and a thermal IR band

Superspectral Image

  • more wavelength bands
  • Sensor:
    • MODIS
      • 36 spectral bands: visible, near infrared, short-wave infrared to the thermal infrared

Hyperspectral Image processing

  • Based on taking a fraction of the electromagnetic spectrum> breaking it into numerous bands for theoretical analysis and computations.
  • Why do it?
    • One can detect and identify objects more precisely compared to using only three bands information provided by a RGB camera.
  • not commercially available, only scientific investigation
  • 100+ contiguous spectral bands.
  • characteristic spectrum of the target pixel is acquired in a hyperspectral image
  • precise spectral information
  • enables better characterisation
  • better identification of targets
  • Application:
    • precision agriculture (crop: health, moisture status and maturity)
    • coastal management ( phytoplanktons, pollution, bathymetry changes)
  • EO-1 satellite
  • Sensor:
    • CHRIS: on ESA's PROBA satellite
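A small sketch of how a hyperspectral data cube can be handled, assuming a (rows, cols, bands) NumPy array filled with random numbers just to show the indexing: one index pulls out a pixel's full spectrum, another pulls out a single band as an image.

```python
import numpy as np

# A hyperspectral image held as a (rows, cols, bands) "data cube".
rows, cols, bands = 100, 100, 120          # 100+ contiguous bands, as noted above
cube = np.random.rand(rows, cols, bands).astype(np.float32)

pixel_spectrum = cube[42, 17, :]           # characteristic spectrum of one target pixel
band_image = cube[:, :, 60]                # one spectral band viewed as a 2D image
print(pixel_spectrum.shape, band_image.shape)   # (120,), (100, 100)
```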

Hyperspectral Data Cube. SOURCE: mathworks

Hyperspectral Data Cube Processing. SOURCE: mathworks

Spectral Unmixing. SOURCE: mathworks

Application

  1. The above segments explain the extent of Remote Sensing applications.
  2. Applications of the extracted data (just a few):
    • Forest fires
    • Pollution
    • Volcanic eruption
    • Flooding
    • Weather
    • Crop growth
    • Carbon sinks
    • Forest covers
    • City Growth
    • Ocean floor

Reflection

  1. How to approach the subject initially to understand the concepts? --- overwhelming resources, so stick to three good resources (a combination of books or webpages) and don't deviate.
  2. The entire process is about measuring and understanding reflected and emitted radiation.
  3. It is essential to understand how EM waves and frequency work. It is as if EMW has a life of its own: everything it interacts with has some effect.
  4. Sensors are carried on board a satellite (active and passive).
  5. Multiple pixels make a band, where information is stored in bits. A higher bit depth means more information, leading to better detection of features.
  6. Each band holds a set of information. Multiple bands with different information are stacked on each other to form an image (which can be a combination of various satellites).
  7. It is interesting to see how much is observed and accomplished through remote sensing.
  8. It is essential to understand which scale of data is required and for what use / what size of target is to be resolved, based on which the specific satellite data is to be acquired. E.g. if I need data to count the number of cars parked in a parking lot, I won't use Landsat imagery as it is (30 m x 30 m). Solution: Maxar or planet.com (3 m x 3 m).
  9. Whatever is to be detected through remote sensing needs to be twice the size of your pixel.
  10. Cadence: Landsat 16 days (40-year time period), Sentinel-2 10 days (equator) and 2-3 days (mid-latitudes), with a (10 m x 10 m) pixel == it is amazing that so many questions can be resolved in a short duration of time.

References

Lechner, Alex, Giles Foody, and Doreen Boyd. 2020. "Applications in Remote Sensing to Forest Ecology and Management." One Earth 2 (May): 405–12. https://doi.org/10.1016/j.oneear.2020.05.001.
Obaid, Fadhil, and Ali Al-Rahim. 2020. "Imaging of 2D Seismic Data Using Time Migration of Ajeel Oilfield, Central of Iraq." PhD thesis. https://doi.org/10.13140/RG.2.2.21505.28003.
Phiri, Darius, and Justin Morgenroth. 2017. "Developments in Landsat Land Cover Classification Methods: A Review." Remote Sensing 9 (September). https://doi.org/10.3390/rs9090967.
Tempfli, Klaus, G. C. Huurneman, Wim Bakker, L. L. F. Janssen, W. F. Feringa, Ambro Gieske, K. A. Grabmaier, et al. 2009a. "Principles of Remote Sensing: An Introductory Textbook." In, 56–85.
———, et al. 2009b. "Principles of Remote Sensing: An Introductory Textbook." In, 56–85.
———, et al. 2009c. "Principles of Remote Sensing: An Introductory Textbook." In, 56–85.
———, et al. 2009d. "Principles of Remote Sensing: An Introductory Textbook." In, 56–85.
Zhang, Yichao, Lakitha Wijeratne, Shawhin Talebi, and David Lary. 2021. "Machine Learning for Light Sensor Calibration." Sensors 21 (September): 6259. https://doi.org/10.3390/s21186259.