RADIOMETRIC CORRECTION

 


Radiometric correction is the process of removing sensor and environmental errors from satellite images so that the measured brightness values (Digital Numbers or DNs) truly represent the Earth's surface reflectance or radiance.

In other words, it corrects for sensor defects, illumination differences, and atmospheric effects.


1. Detector Response Calibration

Satellite sensors use multiple detectors to scan the Earth's surface. Sometimes, each detector responds slightly differently, causing distortions in the image. Calibration adjusts all detectors to respond uniformly.

This includes:

(a) De-Striping

  • Problem: Sometimes images show light and dark vertical or horizontal stripes (banding).

    • Caused by one or more detectors drifting away from their normal calibration — they record higher or lower values than others.

    • Common in early Landsat MSS data.

  • Effect: Every nth line (e.g., every 6th line in Landsat MSS) appears consistently brighter or darker than its neighbors.

  • Solution (De-Striping):

    • Compare the histograms (mean and standard deviation) of the scan lines recorded by each detector (e.g., lines 1, 7, 13, … versus lines 2, 8, 14, …).

    • Adjust the detector's response to match neighboring detectors.

    • Methods:

      • Histogram equalization and normalization

      • Fourier transformation (removes periodic striping patterns)
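The histogram-matching approach above can be sketched in a few lines of NumPy. This is an illustrative sketch, not a standard routine: the function name `destripe`, the assumption of 6 detectors recording interleaved scan lines (as in Landsat MSS), and the choice of whole-image statistics as the matching target are all assumptions made for the example.

```python
import numpy as np

def destripe(img, n_detectors=6):
    """De-striping sketch: normalise the scan lines of each detector so
    their mean and standard deviation match the whole-image statistics.
    Assumes detector d recorded lines d, d+n, d+2n, ... (Landsat MSS style)."""
    out = img.astype(float).copy()
    target_mean, target_std = out.mean(), out.std()
    for d in range(n_detectors):
        lines = out[d::n_detectors, :]          # every n-th scan line
        m, s = lines.mean(), lines.std()
        # shift/scale this detector's lines onto the common target
        out[d::n_detectors, :] = (lines - m) / s * target_std + target_mean
    return out
```

Shifting each detector's lines to a common mean and standard deviation removes the systematic banding while leaving within-line detail untouched.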


(b) Missing Scan Line Removal

  • Problem: Sometimes a detector stops working or becomes temporarily saturated, creating blank lines or missing data in the image.

  • Solution:

    • Replace missing lines with estimated pixel values based on the lines above and below using interpolation techniques.

    • Example: Landsat 7 ETM+ imagery acquired after the Scan Line Corrector (SLC) failed in 2003, leaving wedge-shaped gaps of missing scan lines.
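A minimal interpolation sketch (the `fill_missing_lines` helper is hypothetical; it assumes each missing line is isolated, with valid lines directly above and below):

```python
import numpy as np

def fill_missing_lines(img, missing_rows):
    """Replace each missing scan line with the average of the lines
    immediately above and below it (simple linear interpolation).
    Assumes missing rows are isolated and not at the image edge."""
    out = img.astype(float).copy()
    for r in sorted(missing_rows):
        out[r, :] = (out[r - 1, :] + out[r + 1, :]) / 2.0
    return out
```

Real products (e.g., SLC-off gap filling) use more elaborate methods, such as filling gaps from other acquisition dates, but the principle of estimating a lost line from its neighbors is the same.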


(c) Random Noise Removal

  • Problem: Some pixels show random bright or dark spots known as "salt-and-pepper noise" or "snowy noise."

    • Caused by random electronic interference or transmission errors.

  • Solution:

    • Spatial filtering: Replace noisy pixels with average values from neighboring pixels.

    • Convolution filtering: Smooths image by using a moving filter (kernel) to reduce random pixel variation.
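Salt-and-pepper noise is usually handled with a median rather than a mean, because the median ignores extreme outliers instead of averaging them in. A minimal 3×3 median filter sketch in NumPy (in practice one would typically use a library routine such as `scipy.ndimage.median_filter`):

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter for salt-and-pepper noise.
    Edges are handled by reflective padding."""
    p = np.pad(img, 1, mode='reflect')
    h, w = img.shape
    # stack the 9 shifted views of the padded image, take the per-pixel median
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)
```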


(d) Vignetting Removal

  • Problem: In images taken with lenses, the corners often appear darker than the center — this is vignetting.

  • Cause: Uneven illumination across the sensor array or lens curvature.

  • Solution:

    • Use sensor calibration data that describes how brightness varies from center to edges.

    • Apply Fourier Transform or other normalization methods to equalize brightness.
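Where a calibration (flat-field) image describing the centre-to-edge brightness fall-off is available, vignetting can be removed by simple division. A sketch, assuming such a `flat_field` array exists for the sensor:

```python
import numpy as np

def devignette(img, flat_field):
    """Flat-field vignetting correction: divide the image by the
    calibration image, normalised so the brightest (centre) gain is 1."""
    gain = flat_field / flat_field.max()
    return img / gain
```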


2. Sun Angle and Topographic Correction

(a) Sun Angle Correction

  • The sun's position changes with time of day and season, affecting image brightness.

  • Higher solar angle (summer) → more direct sunlight → brighter image.

  • Lower solar angle (winter) → less sunlight → darker image.

  • Correction Method:

    • Adjust each pixel's brightness (DN) by dividing it by the sine of the solar elevation angle:

      DN_corrected = DN_original / sin(solar elevation angle)

    • Solar elevation data is given in the image metadata or header file.
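The correction itself is a one-liner; the illustrative function below assumes the solar elevation angle (in degrees) has already been read from the metadata file:

```python
import numpy as np

def sun_angle_correct(dn, solar_elevation_deg):
    """Normalise brightness for solar elevation:
    DN_corrected = DN / sin(solar elevation angle)."""
    return dn / np.sin(np.radians(solar_elevation_deg))
```

A winter scene with a 30° solar elevation is brightened by a factor of 1/sin(30°) = 2, making it comparable to a summer scene of the same area.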


(b) Topographic Correction

  • Problem: In hilly or mountainous areas, slopes facing the sun appear brighter, while those facing away appear darker due to uneven solar illumination.

  • Cause:

    • Slope and aspect of terrain

    • Shadowing effects

    • Bidirectional Reflectance Distribution Function (BRDF) differences

  • Solution: Adjust radiance based on slope orientation and sun angle using models such as:

    Minnaert Correction:

      L_n = L · cos(e) / (cos^k(i) · cos^k(e))

    Where:

    • L_n: normalized radiance

    • L: measured radiance

    • e: exitance angle, usually approximated by the terrain slope angle (from a DEM)

    • i: solar incidence angle (between the sun direction and the local surface normal)

    • k: Minnaert constant (depends on land cover and illumination conditions; k = 1 reduces the formula to a simple cosine correction)

    This correction helps produce uniform brightness across slopes.
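A sketch of one common form of the Minnaert normalisation, L_n = L·cos(e) / (cos^k(i)·cos^k(e)). In practice the cosines of the incidence and exitance angles come from a DEM and the sun position in the scene metadata, and k is estimated per land-cover class; the default k = 0.5 here is purely an assumed example value.

```python
import numpy as np

def minnaert_correct(L, cos_i, cos_e, k=0.5):
    """Minnaert topographic normalisation.
    L     : measured radiance
    cos_i : cosine of the solar incidence angle (from DEM + sun position)
    cos_e : cosine of the exitance (slope) angle
    k     : Minnaert constant (land-cover dependent; 0.5 is illustrative)."""
    return L * cos_e / (cos_i ** k * cos_e ** k)
```

With k = 1 and a flat pixel (cos_e = 1) this collapses to the plain cosine correction L / cos(i), so the Minnaert constant can be read as a dial between no correction (k = 0 on flat terrain) and full cosine correction.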


3. Atmospheric Correction

  • Problem: Before reaching the sensor, sunlight interacts with the atmosphere, where gases, dust, and water vapor scatter and absorb radiation.

    • Causes haze, color distortion, and lower contrast in the image.

  • Goal: Remove the effects of atmosphere to obtain true surface reflectance.

  • Methods:

    • Dark Object Subtraction (DOS): Assumes that dark pixels (like water) should have near-zero reflectance; subtracts atmospheric haze values.

    • Radiative Transfer Models: e.g., 6S, MODTRAN, FLAASH, or QUAC to simulate atmospheric scattering and absorption effects accurately.
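Dark Object Subtraction is simple enough to sketch directly. The use of a low percentile to locate the dark object is an illustrative assumption; implementations differ in how they pick the haze value (e.g., histogram inspection or a fixed dark-water mask).

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.1):
    """DOS sketch: treat the darkest pixels (deep water, shadow) as pure
    atmospheric haze and subtract that value from the whole band,
    clipping the result at zero."""
    haze = np.percentile(band, percentile)
    return np.clip(band - haze, 0, None)
```

DOS removes only the additive haze (path radiance) component; multiplicative effects such as atmospheric transmittance require the radiative transfer models listed above.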


| Type of Correction | Problem Fixed | Example of Error | Common Methods |
|---|---|---|---|
| Detector calibration | Uneven sensor response | Striping, noise | Histogram matching, Fourier transform |
| Missing line removal | Lost data lines | Landsat 7 SLC failure | Interpolation |
| Random noise removal | Salt-and-pepper noise | Bright/dark spots | Spatial/convolution filtering |
| Vignetting removal | Dark corners | Lens-based images | Fourier normalization |
| Sun angle correction | Seasonal/diurnal illumination | Winter images darker | Divide by sin(solar elevation) |
| Topographic correction | Slope illumination differences | Bright/dark slopes | Minnaert correction |
| Atmospheric correction | Scattering, absorption | Hazy images | DOS, FLAASH, MODTRAN |

