
Grey level thresholding. Level slicing. Contrast stretching.



Image enhancement


Lillesand and Kiefer (1994) explain that the goal of image enhancement procedures is to improve the visual interpretability of an image by increasing the apparent distinction between the features in the scene. The objective is to create a "new" image from the original image in order to increase the amount of information that can be visually interpreted from the data.


Enhancement operations are normally applied to image data after the appropriate restoration procedures have been performed. Noise removal, in particular, is an important precursor to most enhancements. In this study, typical image enhancement techniques are as follows:


Grey level thresholding


Grey level thresholding is a simple lookup table that partitions the grey levels in an image into two categories: those below a user-selected threshold and those above it. Thresholding is one of many methods for creating a binary mask for an image. Such masks are used to restrict subsequent processing to a particular region within the image.


This procedure is used to segment an input image into two classes: one for pixels having values below an analyst-defined grey level and one for those above this value (Lillesand and Kiefer, 1994).
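A minimal sketch of this idea is shown below, assuming the image is available as an 8-bit greyscale NumPy array; the function name and the threshold value of 128 are illustrative choices, not part of the cited procedure.

```python
import numpy as np

def grey_level_threshold(image: np.ndarray, threshold: int) -> np.ndarray:
    """Partition an 8-bit greyscale image into a binary mask.

    Pixels at or above the analyst-defined threshold become 255,
    pixels below it become 0.
    """
    return np.where(image >= threshold, 255, 0).astype(np.uint8)

# Example: an analyst-defined threshold of 128 on a small synthetic image.
image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
mask = grey_level_threshold(image, threshold=128)
print(mask)
```

The resulting 0/255 mask can then be used to restrict later processing to the region of interest, as described above.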


Level slicing


Level slicing is an enhancement technique whereby the Digital Numbers (DNs) distributed along the x-axis of an image histogram are divided into a series of analyst-specified intervals or "slices". All of the DNs falling within a given interval in the input image are then displayed at a single DN in the output image (Lillesand and Kiefer, 1994).
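The following is a rough sketch of level slicing, assuming an 8-bit NumPy image and analyst-specified interval edges; displaying each slice at its interval midpoint is one possible convention, not the only one.

```python
import numpy as np

def level_slice(image: np.ndarray, boundaries) -> np.ndarray:
    """Map every DN falling within an analyst-specified interval to a single output DN.

    `boundaries` lists the interval edges, e.g. [0, 64, 128, 192, 256];
    here each slice is displayed at the midpoint of its interval.
    """
    edges = np.asarray(boundaries)
    # Index of the slice each pixel falls into.
    slice_index = np.digitize(image, edges[1:-1], right=False)
    # Output DN for each slice: the interval midpoint.
    midpoints = ((edges[:-1] + edges[1:]) // 2).astype(np.uint8)
    return midpoints[slice_index]

# Example: four slices of equal width over the 0-255 range.
image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
print(level_slice(image, [0, 64, 128, 192, 256]))
```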


Contrast stretching


Most satellite and airborne sensors are designed to accommodate a wide range of illumination conditions, from poorly lit arctic regions to highly reflective desert regions. Because of this, the pixel values in the majority of digital scenes occupy a relatively small portion of the possible range of image values. If the pixel values are displayed in their original form, only a small range of grey values is used, resulting in a low-contrast display on which similar features may be indistinguishable.


A contrast stretch enhancement expands the range of pixel values so that they are displayed over a fuller range of grey values (PCI, 1997).


Image display and recording devices typically operate over a range of 256 grey levels (the maximum number representable in 8-bit computer encoding). In the case of an 8-bit image, the idea is to expand the narrow range of brightness values typically present in the input image over a wider range of grey values. The result is an output image that accentuates the contrast between features of interest to the image analyst (Lillesand and Kiefer, 1994).

The grey level or grey value indicates the brightness of a pixel. The minimum grey level is 0. The maximum grey level depends on the digitisation depth of the image. For an 8-bit-deep image it is 255. In a binary image a pixel can only take on either the value 0 or the value 255.
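As a concrete illustration of how a narrow range of DNs can be expanded to the full 0-255 display range, here is a minimal sketch of a linear (min-max) contrast stretch, assuming an 8-bit NumPy image; other stretch types (e.g. percentile or histogram-based stretches) exist but are not shown here.

```python
import numpy as np

def linear_contrast_stretch(image: np.ndarray) -> np.ndarray:
    """Linearly expand the image's DN range to the full 0-255 display range."""
    lo, hi = float(image.min()), float(image.max())
    if hi == lo:                      # constant image: nothing to stretch
        return np.zeros_like(image, dtype=np.uint8)
    stretched = (image.astype(np.float64) - lo) / (hi - lo) * 255.0
    return stretched.astype(np.uint8)

# Example: a low-contrast scene whose DNs occupy only 90-140 of the 0-255 range.
image = np.random.randint(90, 141, size=(4, 4), dtype=np.uint8)
print(linear_contrast_stretch(image))
```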


