
Great Bahama Bank


When oceanographer Serge Andréfouet first saw a satellite image of the Great Bahama Bank, he knew the colors and contours were special. He passed the unique image to a colleague, who submitted it to NASA's Earth Observatory (EO) as an Image of the Day in 2002 (top image). Nearly eighteen years later, the image is still much appreciated. In fact, it beat out more recent satellite imagery to win EO's Tournament Earth 2020.

"There are many nice seagrass and sand patterns worldwide, but none like this anywhere on Earth," said Andréfouet, who is now studying reefs at the Institute for Marine Research & Observation in Indonesia. "I am not surprised it is still a favorite, especially for people who see it for the first time." He said the image has been featured over the years on numerous websites, in books, and even at rave parties.

The varying colors and curves remind us of graceful strokes on a painting, but the features were sculpted by geologic processes and ocean creatures. The Great Bahama Bank was dry land during past ice ages, but it slowly submerged as sea levels rose. Today, the bank is covered by water, though it can be as shallow as two meters (seven feet) deep in places. The bank itself is composed of white carbonate sand and limestone, mainly from the skeletal fragments of corals. The Florida peninsula was built from similar deposits.

Andréfouet's image (top) shows a small section of the bank as it appeared on January 17, 2001. It was acquired by the Enhanced Thematic Mapper Plus (ETM+) on the Landsat 7 satellite, using bands 1, 2, and 3. At the time, the instrument's blue channel (band 1) distinguished shallow-water features better than the sensors on previous satellite missions.
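For readers curious how such a composite is put together, here is a minimal sketch, not the Earth Observatory's actual processing chain, of assembling a natural-color image from Landsat 7 ETM+ band 1 (blue), band 2 (green), and band 3 (red). The file names are hypothetical placeholders.

```python
# Sketch: natural-color composite from three ETM+ band GeoTIFFs.
# Assumes rasterio, numpy, and matplotlib are installed; file names are made up.
import numpy as np
import rasterio
import matplotlib.pyplot as plt

def read_band(path):
    """Read a single-band GeoTIFF as a float array."""
    with rasterio.open(path) as src:
        return src.read(1).astype(np.float32)

def stretch(band, low=2, high=98):
    """Percentile contrast stretch to the 0-1 range for display."""
    lo, hi = np.percentile(band, (low, high))
    return np.clip((band - lo) / (hi - lo), 0, 1)

blue  = read_band("LE07_B1.TIF")   # band 1: blue, useful over shallow water
green = read_band("LE07_B2.TIF")   # band 2: green
red   = read_band("LE07_B3.TIF")   # band 3: red

# Stack red, green, blue into an RGB image and save it.
rgb = np.dstack([stretch(red), stretch(green), stretch(blue)])
plt.imshow(rgb)
plt.axis("off")
plt.savefig("bahama_bank_rgb.png", dpi=300, bbox_inches="tight")
```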

The wave-shaped ripples in the images are sand on the seafloor. The curves follow the slopes of underwater dunes, which were probably shaped by a fairly strong current near the sea bottom. Sand and seagrass are present in different quantities and at different depths, which gives the image a range of blues and greens. The area appeared largely the same when Landsat 8 passed over on February 15, 2020.

The shallow bank quickly drops off into a deep, dark region known as the "Tongue of the Ocean." Plunging to about 2,000 meters (6,500 feet), the Tongue of the Ocean is home to more than 160 fish and coral species. It lies adjacent to Andros Island, the largest island in the Bahamas and home to one of the largest fringing reefs in the world. The image above was acquired on April 4, 2020, by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite.

At the time of the 2001 image, researchers did not have a good understanding of the location and distribution of reef systems across the world. Global maps of coral reefs had not changed much since the 19th Century. So researchers turned to satellites for a better view. Andréfouet's image was collected as part of the NASA-funded Millennium Coral Reef Mapping Project, which aimed to image and map coral reefs worldwide. The project gathered more than 1,700 images with Landsat 7, the first Landsat to take images over coastal waters and the open ocean.

Today, many satellites and research programs continue to map and monitor coral reef systems, and marine scientists have a better idea of where the reefs are and how they are faring. Researchers now use reef images and maps in tandem with sea surface temperature data to identify areas vulnerable to coral bleaching.
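As a rough illustration of that idea, and not a method described in this article, thermal stress on reefs is often summarized with a "degree heating weeks" style metric, which accumulates how far sea surface temperatures exceed a site's maximum monthly mean over roughly the previous 12 weeks. The sketch below uses synthetic numbers.

```python
# Hedged illustration of a degree-heating-weeks (DHW) style calculation.
# Values and thresholds are simplified; data here are synthetic.
import numpy as np

def degree_heating_weeks(sst_daily, mmm, window_days=84):
    """Accumulate thermal stress from daily SSTs (deg C) over ~12 weeks.

    sst_daily : 1-D array of daily sea surface temperatures
    mmm       : maximum monthly mean climatology for the site
    """
    sst_daily = np.asarray(sst_daily, dtype=float)
    hotspots = np.clip(sst_daily - mmm, 0.0, None)   # daily exceedance above MMM
    hotspots[hotspots < 1.0] = 0.0                   # only count stress of at least 1 deg C
    recent = hotspots[-window_days:]                 # most recent ~12 weeks
    return recent.sum() / 7.0                        # degree-days -> degree-weeks

# Synthetic example: 120 days near a 29.0 deg C baseline with a 3-week warm spike.
sst = np.full(120, 29.0)
sst[80:101] += 1.5
print(f"DHW = {degree_heating_weeks(sst, mmm=29.0):.1f}")  # ~4.5 degree-weeks
```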

NASA Earth Observatory images by Joshua Stevens, using Landsat data from the U.S. Geological Survey and MODIS data from NASA EOSDIS/LANCE and GIBS/Worldview. 2002 imagery courtesy Serge Andréfouet, University of South Florida. Story by Kasha Patel.


#Landsat #NASA #USGS #Earth



Vineesh V
Assistant Professor of Geography,
Directorate of Education,
Government of Kerala.
https://g.page/vineeshvc
