Supervised Classification


Supervised classification is a digital image classification method where the analyst guides the classification process by defining classes of interest and providing representative training samples.
The classifier uses these training samples to learn the spectral signatures of each class and then assigns every pixel in the image to the most appropriate class.

This method relies heavily on prior knowledge of the study area.

How Supervised Classification Works

✔ Step 1: Define Information Classes

These are real-world land-cover classes such as:

  • water

  • forest

  • agriculture

  • urban

  • barren land

✔ Step 2: Select Training Areas

Training areas (also called regions of interest, or ROIs) are chosen on the image where the analyst is confident about the land-cover type.

✔ Step 3: Extract Spectral Signatures

The classifier calculates:

  • mean

  • variance

  • covariance

  • pixel distribution

for each class across different spectral bands.
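The statistics above can be sketched in a few lines of NumPy. The band values and the "water" class below are hypothetical illustrations, not real reflectance data:

```python
import numpy as np

# Hypothetical training samples for one class ("water"):
# rows are pixels, columns are spectral bands (e.g., red and near-infrared).
water_pixels = np.array([
    [0.05, 0.02],
    [0.06, 0.03],
    [0.04, 0.02],
    [0.05, 0.04],
])

# Per-class signature statistics used by the decision rules later on.
mean = water_pixels.mean(axis=0)                  # class mean per band
variance = water_pixels.var(axis=0, ddof=1)       # sample variance per band
covariance = np.cov(water_pixels, rowvar=False)   # band-to-band covariance matrix

print(mean)              # class mean vector, one value per band
print(covariance.shape)  # (n_bands, n_bands)
```

In practice these statistics are computed per class from all training pixels, across all bands used in the classification.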

✔ Step 4: Apply Decision Rules

The classification algorithm uses statistical rules to assign each pixel to a class.

✔ Step 5: Produce Classified Output

The final output is a thematic map showing land-cover classes.

When to Use Supervised Classification

Use supervised classification when:

  • You have prior knowledge of the landscape.

  • Ground truth or ancillary data (GPS points, survey data) are available.

  • You can identify distinct, homogeneous training sites for each class.

  • The objective is to extract specific land-cover categories.

Information Class vs Spectral Class

Understanding the difference between these two is essential:

Information Class

  • Defined by the analyst based on real-world concepts.

  • Examples: village, river, wetland, cropland.

  • Represents semantic categories used for mapping and interpretation.

Spectral Class

  • Group of pixels that are spectrally similar, based on reflectance values.

  • Identified statistically by the software.

  • May not always match real-world categories exactly.

📌 Mapping involves matching spectral classes to information classes.

Supervised Training

Supervised training involves:

  • Manually selecting representative pixel samples

  • Ensuring the samples capture the full spectral variability of each class
    (e.g., different shades of vegetation or soil types)

  • Evaluating spectral signatures using

    • histograms

    • scatter plots

    • spectral profiles

    • separability indices (e.g., Jeffries–Matusita)
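As a sketch of the last point, the Jeffries–Matusita (JM) distance between two Gaussian class signatures can be computed from their means and covariances via the Bhattacharyya distance. The function name and the example signatures are illustrative, assuming normally distributed classes:

```python
import numpy as np

def jeffries_matusita(mean1, cov1, mean2, cov2):
    """JM separability between two Gaussian class signatures (ranges 0 to 2)."""
    mean_diff = mean1 - mean2
    cov_avg = (cov1 + cov2) / 2.0
    # Bhattacharyya distance between the two class distributions.
    term1 = 0.125 * mean_diff @ np.linalg.inv(cov_avg) @ mean_diff
    term2 = 0.5 * np.log(np.linalg.det(cov_avg)
                         / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    b = term1 + term2
    return 2.0 * (1.0 - np.exp(-b))

# Identical signatures give JM ~ 0; well-separated classes approach 2.
m1, c1 = np.array([0.1, 0.2]), np.eye(2) * 0.01
m2, c2 = np.array([1.1, 1.2]), np.eye(2) * 0.01
print(jeffries_matusita(m1, c1, m1, c1))  # ~0 (same class)
print(jeffries_matusita(m1, c1, m2, c2))  # close to 2 (highly separable)
```

A JM value near 2 indicates the two training classes are spectrally well separated; values much below 2 suggest the training samples should be revised.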

✔ Characteristics

  • Analyst-controlled

  • Knowledge-driven

  • Often more accurate

  • Requires skill in selecting high-quality training data

Classification Decision Rules (Supervised)

Decision rules determine how the classifier decides which class a pixel belongs to.

They fall into two broad groups:

Parametric Decision Rules

Parametric classifiers assume pixel values follow a normal (Gaussian) distribution.

These rules rely on statistical measures such as:

  • class mean

  • variance

  • covariance

  • probability density functions

Minimum Distance Classifier

  • Computes the Euclidean (or sometimes Mahalanobis) distance between each pixel and each class mean.

  • Assigns pixel to the closest class mean.

  • Simple and fast but may misclassify overlapping classes.
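A minimal sketch of the Euclidean version, using hypothetical two-band class means:

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel to the class with the nearest (Euclidean) mean."""
    # Distance matrix of shape (n_pixels, n_classes).
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Hypothetical 2-band class means: water, forest, urban.
means = np.array([
    [0.05, 0.02],   # class 0: water
    [0.04, 0.45],   # class 1: forest
    [0.25, 0.25],   # class 2: urban
])
pixels = np.array([[0.06, 0.03], [0.05, 0.40]])
print(minimum_distance_classify(pixels, means))  # [0 1]
```

Because only the mean is used, two classes with similar means but very different spreads can overlap and be confused, which is the weakness noted above.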

Maximum Likelihood Classifier (MLC)

  • Most widely used supervised classifier.

  • Considers:

    • class mean

    • variance

    • covariance

    • overall probability distribution

  • Assigns pixel to the class with the highest likelihood of belonging.

  • Requires good training data; performs best when classes are normally distributed.
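The MLC decision rule can be sketched as a Gaussian log-likelihood comparison. This is a simplified illustration assuming equal class priors, with made-up means and covariances:

```python
import numpy as np

def max_likelihood_classify(pixel, means, covs):
    """Assign the pixel to the class maximizing the Gaussian log-likelihood."""
    scores = []
    for mean, cov in zip(means, covs):
        diff = pixel - mean
        # Discriminant g(x) = -0.5*ln|C| - 0.5*(x - m)^T C^-1 (x - m)
        # (equal priors assumed; constant terms dropped).
        g = (-0.5 * np.log(np.linalg.det(cov))
             - 0.5 * diff @ np.linalg.inv(cov) @ diff)
        scores.append(g)
    return int(np.argmax(scores))

# Hypothetical 2-band signatures for two classes.
means = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
covs = [np.eye(2) * 0.1, np.eye(2) * 0.1]
print(max_likelihood_classify(np.array([0.9, 1.1]), means, covs))  # 1
```

Unlike the minimum distance rule, the covariance term lets MLC account for each class's spread and band correlations, which is why it usually performs better when the Gaussian assumption holds.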

Nonparametric Decision Rules

Nonparametric classifiers do not assume any specific statistical distribution; they are useful when pixel distributions are irregular.

Parallelepiped Classifier

  • Creates "boxes" using min–max values for each band.

  • A pixel is assigned to a class if its values fall within the box.

  • Fast, but may leave pixels:

    • unclassified (if no box contains the pixel)

    • ambiguously classified (if pixel falls in more than one box)
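The box test above reduces to a simple range check per band. A minimal sketch, with hypothetical min-max boxes for two classes:

```python
import numpy as np

def parallelepiped_classify(pixel, boxes):
    """Return indices of every class whose min-max box contains the pixel.

    An empty list means unclassified; more than one entry means the
    pixel falls in overlapping boxes (ambiguous).
    """
    hits = []
    for i, (lo, hi) in enumerate(boxes):
        if np.all(pixel >= lo) and np.all(pixel <= hi):
            hits.append(i)
    return hits

# Hypothetical 2-band boxes: (per-band minimum, per-band maximum).
boxes = [
    (np.array([0.00, 0.00]), np.array([0.10, 0.10])),  # class 0: water
    (np.array([0.05, 0.30]), np.array([0.20, 0.60])),  # class 1: forest
]
print(parallelepiped_classify(np.array([0.06, 0.05]), boxes))  # [0]
print(parallelepiped_classify(np.array([0.50, 0.50]), boxes))  # [] -> unclassified
```

The example shows both failure modes directly: a pixel outside every box stays unclassified, and overlapping boxes would return multiple indices.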

Feature Space Classifier

  • Plots pixel values in a multi-dimensional feature space.

  • Uses polygons in the feature space to define classes.

  • More flexible and accurate than parallelepiped.

  • Good for visually evaluating class separability.
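The polygon test behind a feature space classifier is a point-in-polygon check over two chosen bands. A minimal two-band sketch using the standard ray-casting test; the polygon vertices are hypothetical, as if digitized around a "forest" cluster in (red, NIR) space:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is feature-space point (x, y) inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending right from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical polygon around a "forest" cluster in (red, NIR) feature space.
forest_poly = [(0.02, 0.3), (0.1, 0.3), (0.1, 0.6), (0.02, 0.6)]
print(point_in_polygon(0.05, 0.45, forest_poly))  # True
print(point_in_polygon(0.50, 0.45, forest_poly))  # False
```

Because the analyst draws the polygons directly on a scatter plot, irregular cluster shapes can be captured that a rectangular parallelepiped box would miss.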


