
Atmospheric Correction

Atmospheric correction is the process of removing the influence of the atmosphere from remotely sensed images so that the data accurately represent the true reflectance of the Earth's surface.

When a satellite sensor captures an image, the radiation reaching the sensor is affected by gases, water vapor, aerosols, and dust in the atmosphere. These factors scatter and absorb light, changing the brightness and color of the features seen in the image.

Although these atmospheric effects are part of the recorded signal, they can distort surface reflectance values, especially when images are compared across different dates or sensors. Therefore, corrections are necessary to make data consistent and physically meaningful.


🔹 Why Do We Need Atmospheric Correction?

  1. To retrieve true surface reflectance – It separates the surface signal from atmospheric influence.

  2. To ensure comparability – Enables comparing images from different times, seasons, or sensors.

  3. To improve visual quality – Removes haze and increases image contrast.

  4. For accurate quantitative analysis – Essential for calculating vegetation, water, or urban indices (e.g., NDVI, NDWI).

  5. For change detection and mosaicking – Ensures that images have uniform brightness and color.

  6. For ground validation – Required when comparing satellite data with field reflectance measurements.


🔹 Atmospheric Effects on Satellite Images

  1. Scattering – Occurs when particles or gas molecules redirect light.

    • Rayleigh scattering: caused by gas molecules much smaller than the wavelength of light (affects short, blue wavelengths most).

    • Mie scattering: caused by particles comparable in size to the wavelength, such as dust and smoke (affects longer wavelengths than Rayleigh scattering does).

    • Non-selective scattering: caused by large water droplets (affects all wavelengths equally).

  2. Absorption – Certain gases (like ozone, carbon dioxide, and water vapor) absorb specific wavelengths, reducing the energy reaching the sensor.

  3. Path Radiance / Haze – Scattered light that reaches the sensor without reflecting from the ground. It adds a bright veil over the image, especially in blue bands, and reduces contrast.

  4. Transmittance – The fraction of light that successfully travels through the atmosphere from the Sun to the surface and back to the sensor.


🔹 Key Concepts and Terminologies

  • Radiance – The total light energy received by the sensor.

  • Reflectance – The fraction of incident light reflected by a surface (what we want to retrieve).

  • Path Radiance – Unwanted light scattered into the sensor's line of sight, causing haze.

  • Transmittance – The efficiency of the atmosphere in letting light pass through.

  • Aerosols – Tiny particles that scatter and absorb radiation; a major source of atmospheric distortion.

  • Haze – The visual result of atmospheric scattering; reduces image clarity.

  • Calibration – The conversion of raw digital numbers (DNs) to physical units such as radiance or reflectance.
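The calibration step listed above can be sketched in a few lines of Python. The gain, offset, and ESUN values below are purely illustrative; real values come from the sensor's metadata file.

```python
import math

def dn_to_radiance(dn, gain, offset):
    """Convert a raw digital number (DN) to at-sensor radiance."""
    return gain * dn + offset

def radiance_to_toa_reflectance(radiance, esun, sun_elev_deg, d=1.0):
    """Convert at-sensor radiance to top-of-atmosphere (TOA) reflectance.
    esun: mean solar exoatmospheric irradiance for the band;
    sun_elev_deg: sun elevation angle from the scene metadata;
    d: Earth-Sun distance in astronomical units."""
    theta_s = math.radians(90.0 - sun_elev_deg)  # solar zenith angle
    return (math.pi * radiance * d ** 2) / (esun * math.cos(theta_s))

# Illustrative numbers only; real gain/offset/ESUN come from metadata.
rad = dn_to_radiance(120, gain=0.037, offset=3.2)
rho = radiance_to_toa_reflectance(rad, esun=1536.0, sun_elev_deg=60.0)
```

Note that this yields top-of-atmosphere reflectance; the correction methods below go one step further and estimate surface reflectance.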

🔹 Common Atmospheric Correction Methods

Atmospheric correction can be performed using image-based or model-based methods.

1. Image-Based Methods

These rely only on the image itself and do not require external atmospheric data.

a) Histogram Minimum / Dark Pixel Subtraction

  • Assumes that some pixels (deep water, shadows, dark rocks) should have nearly zero reflectance.

  • The minimum DN value in each band is treated as atmospheric haze.

  • That value is subtracted from all pixels in the band.

  • Simple and fast, but can be inaccurate if no truly dark object exists.
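The dark-pixel idea can be sketched with NumPy. Using a very low percentile rather than the absolute minimum is a common practical variation (an assumption here, not a fixed rule) that makes the haze estimate robust to a few noisy pixels.

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.01):
    """Treat the darkest pixels in a band as pure atmospheric path
    radiance and subtract that value from every pixel in the band."""
    haze = np.percentile(band, percentile)  # near-minimum DN value
    corrected = band.astype(np.float64) - haze
    return np.clip(corrected, 0, None)  # values cannot go negative

# Synthetic band: true signal plus a uniform haze offset of 20 DN.
rng = np.random.default_rng(0)
band = rng.integers(0, 200, size=(100, 100)) + 20
corrected = dark_object_subtraction(band)
```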

b) Regression Method

  • Plots pixel values from a short wavelength band (affected by scattering) against a long wavelength band (less affected).

  • The intercept of the line indicates atmospheric path radiance.

  • That offset is subtracted from the image.

  • Works well for homogeneous areas but depends on proper band selection.
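A minimal sketch of the regression method: over dark surfaces, a haze-free long-wavelength band and a haze-affected short-wavelength band should both pass through the origin, so the intercept of the fitted line estimates the path-radiance offset. The synthetic bands below are made up for illustration.

```python
import numpy as np

def regression_haze_offset(short_band, long_band):
    """Regress a haze-affected short-wavelength band (e.g. blue) against
    a long-wavelength band that is nearly haze-free (e.g. SWIR); the
    intercept estimates atmospheric path radiance in the short band."""
    slope, intercept = np.polyfit(long_band.ravel(), short_band.ravel(), 1)
    return intercept

# Synthetic example: blue = 0.5 * swir + 15 DN of haze, plus noise.
rng = np.random.default_rng(1)
swir = rng.uniform(0, 100, size=1000)
blue = 0.5 * swir + 15 + rng.normal(0, 1, size=1000)
offset = regression_haze_offset(blue, swir)
corrected_blue = blue - offset
```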

c) Empirical Line Method (ELM)

  • Uses ground reference reflectance measurements (from field spectrometer or known targets).

  • Establishes a direct relationship between sensor radiance and true surface reflectance.

  • Most accurate among empirical methods if ground data are available.

  • Commonly used for airborne or hyperspectral imagery.
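The empirical line fit is an ordinary least-squares line relating sensor radiance to measured surface reflectance. The two calibration targets below (a dark and a bright tarp) are hypothetical values, not real field data.

```python
def empirical_line_fit(radiances, reflectances):
    """Fit reflectance = gain * radiance + offset from ground targets
    whose true reflectance was measured with a field spectrometer.
    Ordinary least squares, written out with stdlib arithmetic."""
    n = len(radiances)
    mean_x = sum(radiances) / n
    mean_y = sum(reflectances) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(radiances, reflectances))
    var = sum((x - mean_x) ** 2 for x in radiances)
    gain = cov / var
    offset = mean_y - gain * mean_x
    return gain, offset

# Two hypothetical targets: dark (radiance 12 -> reflectance 0.04)
# and bright (radiance 95 -> reflectance 0.55).
gain, offset = empirical_line_fit([12.0, 95.0], [0.04, 0.55])
surface_reflectance = gain * 50.0 + offset  # apply to any pixel radiance
```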


2. Model-Based (Radiative Transfer) Methods

These methods use physical models of atmospheric behavior and require information about the atmospheric conditions during image capture.

Key Models:

  • LOWTRAN 7 – Early model for visible to thermal IR regions.

  • MODTRAN 4 – Advanced model for a wide spectral range.

  • 6S (Second Simulation of the Satellite Signal in the Solar Spectrum) – Widely used open-source model.

  • ATCOR (Atmospheric and Topographic Correction) – Commercial software used in ERDAS Imagine.

  • FLAASH (Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes) – For hyperspectral and multispectral data.

  • ATREM (Atmospheric REMoval) – For hyperspectral imagery.

Inputs Required:

  • Scene location (latitude and longitude)

  • Date and time of image capture

  • Sensor altitude and scene elevation

  • Atmospheric model (e.g., tropical, mid-latitude summer)

  • Visibility or aerosol optical depth

  • Water vapor and ozone concentration

These models simulate how light interacts with the atmosphere and remove its effect to retrieve surface reflectance.
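A widely used simplified form of the radiative-transfer inversion can be sketched as below. The path radiance and transmittances are exactly the quantities a model such as 6S or MODTRAN would supply for the scene conditions; the numbers in the example are purely illustrative.

```python
import math

def surface_reflectance(l_sat, l_path, esun, sun_zenith_deg,
                        t_up, t_down, e_down=0.0, d=1.0):
    """Simplified radiative-transfer inversion:
    rho = pi * (L_sat - L_path) * d^2
          / (t_up * (esun * cos(theta_s) * t_down + e_down))
    t_up / t_down: atmospheric transmittance on the upward and downward
    paths; e_down: diffuse downwelling irradiance at the surface."""
    theta_s = math.radians(sun_zenith_deg)
    return (math.pi * (l_sat - l_path) * d ** 2) / (
        t_up * (esun * math.cos(theta_s) * t_down + e_down))

# Illustrative inputs; a real workflow takes these from a model run.
rho = surface_reflectance(l_sat=80.0, l_path=12.0, esun=1536.0,
                          sun_zenith_deg=30.0, t_up=0.9, t_down=0.85)
```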


🔹 Additional Step: Cloud Masking

Before atmospheric correction, clouds and their shadows must be identified and masked out, since they distort spectral values.
This step uses cloud detection algorithms (e.g., Fmask, QA bands) to remove cloudy pixels from analysis.
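QA-band masking is typically a matter of testing bit flags. The bit positions below follow the Landsat Collection 2 QA_PIXEL convention (bit 3 = cloud, bit 4 = cloud shadow) as an assumption; always verify against the documentation of your specific product.

```python
import numpy as np

# Assumed bit layout (Landsat Collection 2 QA_PIXEL style);
# check your product's documentation before relying on these.
CLOUD_BIT = 3
SHADOW_BIT = 4

def cloud_mask(qa_band):
    """Return a boolean mask that is True for clear pixels."""
    cloud = (qa_band >> CLOUD_BIT) & 1
    shadow = (qa_band >> SHADOW_BIT) & 1
    return (cloud == 0) & (shadow == 0)

# Tiny synthetic QA band: one cloudy and one shadowed pixel.
qa = np.array([[0, 1 << CLOUD_BIT],
               [1 << SHADOW_BIT, 0]])
clear = cloud_mask(qa)
```

The resulting mask is then used to exclude flagged pixels from the correction and from any downstream analysis.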


🔹 When Is Atmospheric Correction Necessary?

Required When:

  • Comparing multiple scenes (multi-temporal analysis)

  • Performing change detection studies

  • Creating mosaics of multiple images

  • Calculating accurate surface reflectance or biophysical parameters

Not Always Necessary When:

  • Working with a single scene for visual interpretation

  • Using ratio-based indices (e.g., NDVI), which minimize atmospheric effects
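As a quick illustration of why ratio-based indices are more forgiving: NDVI is computed as (NIR − Red) / (NIR + Red), so multiplicative effects that scale both bands (such as transmittance losses) partially cancel, although additive haze still biases the result. A minimal sketch with made-up reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red)

# Made-up reflectances: dense vegetation vs. bare soil.
nir = np.array([0.40, 0.35])
red = np.array([0.08, 0.30])
values = ndvi(nir, red)  # vegetation -> high NDVI; soil -> near zero
```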



🔹 Summary of Correction Methods

  • Dark Pixel Subtraction – Image-based; no atmospheric data required; low–medium accuracy; quick correction for simple projects.

  • Histogram Minimum – Image-based; no atmospheric data required; low–medium accuracy; basic haze removal.

  • Regression Method – Image-based; no atmospheric data required; medium accuracy; scenes containing dark objects.

  • Empirical Line Method – Image-based; requires ground reflectance measurements; high accuracy; airborne or field-calibrated data.

  • Radiative Transfer Models (e.g., ATCOR, MODTRAN, 6S) – Model-based; require atmospheric data; very high accuracy; professional quantitative studies.


Atmospheric correction is a critical preprocessing step in remote sensing.
It ensures that image brightness truly represents the Earth's surface rather than the atmosphere above it.
Choosing the right method depends on your data availability, required accuracy, and application type — from simple visual enhancement to advanced quantitative analysis.
