
Taking Temperatures from ISS

When remote sensing scientists observe Earth, they often look for heat signatures. Fires, volcanoes, ice, water, and even sunlit or shaded landscapes emit and reflect heat and light—energy—in ways that make them stand out from their surroundings. NASA scientists recently used a new sensor to read some of those signatures more clearly.
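As a rough, back-of-the-envelope illustration (added here, not part of the original story), the physics behind those heat signatures can be sketched with Wien's displacement law and the Stefan-Boltzmann law: hotter targets such as flaming fires peak at shorter infrared wavelengths and radiate far more power per unit area than the roughly 300 K background, which is what lets a thermal imager pick them out.

```python
# Illustrative sketch (not from the article): why hot features stand out
# in thermal imagery. Wien's law gives the peak emission wavelength and
# the Stefan-Boltzmann law the total emitted power per unit area.

WIEN_B = 2897.77          # Wien's displacement constant, micrometer-kelvin
SIGMA = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength_um(temp_k: float) -> float:
    return WIEN_B / temp_k

def emitted_power_w_m2(temp_k: float) -> float:
    return SIGMA * temp_k**4

for label, t in [("Earth surface (~300 K)", 300.0), ("Flaming fire (~1000 K)", 1000.0)]:
    print(f"{label}: peak ~{peak_wavelength_um(t):.1f} um, "
          f"emits ~{emitted_power_w_m2(t):.0f} W/m^2")

# A ~1000 K fire peaks near 2.9 um and radiates roughly 120x more power per
# unit area than ~300 K terrain (which peaks near 9.7 um), so it stands out
# sharply against its surroundings in thermal bands.
```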

Through nearly a year of testing on the International Space Station (ISS), the experimental Compact Thermal Imager (CTI) collected more than 15 million images of Earth, and the results were compelling. Researchers were impressed by the breadth and quality of the imagery CTI collected in 10 months on the ISS, particularly of fires.

For instance, CTI captured several images of the unusually severe fires in Australia that burned for four months in 2019–20. With its 80-meter (260-foot) per pixel resolution, CTI was able to detect the shape and location of fire fronts and how far they were from settled areas—information that is critically important to first responders.

For the past two decades, scientists have generally relied upon coarse resolution (375–1000 m) thermal data from the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) sensors to monitor fire activity from above. During its flight test, CTI made observations of fires with 20 times more detail than VIIRS and 190 times more detail than MODIS.
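As a rough illustration (added here, not from the article), those "times more detail" factors are consistent with ratios of pixel footprint areas. The sketch below assumes a 375 m VIIRS fire pixel; the MODIS figure depends on whether its thermal pixel is taken as 1 km at nadir or closer to the roughly 1.1 km sometimes quoted for its effective footprint.

```python
# Illustrative sketch (assumptions noted above): compare pixel footprint
# areas to see roughly where the "times more detail" factors come from.

def detail_factor(coarse_pixel_m: float, fine_pixel_m: float = 80.0) -> float:
    """Ratio of pixel footprint areas (coarse sensor vs. CTI's ~80 m pixels)."""
    return (coarse_pixel_m / fine_pixel_m) ** 2

print(f"VIIRS (375 m)  vs CTI: {detail_factor(375):.0f}x")    # ~22x
print(f"MODIS (1000 m) vs CTI: {detail_factor(1000):.0f}x")   # ~156x
print(f"MODIS (1100 m) vs CTI: {detail_factor(1100):.0f}x")   # ~189x
```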

The images above highlight the difference. Both images show CTI's view of large fires burning in the Gondwana Rainforests of New South Wales on November 1, 2019. The right image also includes the VIIRS fire detections (red diamonds) of the same area that day. The data were overlaid on a natural-color image acquired by the Operational Land Imager (OLI) on Landsat 8.

The image below, acquired by the European Space Agency's Sentinel-2 spacecraft on November 1, shows a more detailed view of one of the fire clusters, along with the CTI data.

"CTI's deployment on the space station was primarily a test of how well the hardware would perform in space. It was not initially designed as a science mission," explained Doug Morton, chief of the Biospheric Sciences Laboratory at NASA's Goddard Space Flight Center. "Nonetheless, CTI data proved scientifically useful as we monitored several high-profile fire outbreaks this past summer."

One aspect of CTI's mission that was of particular interest to Morton was the timing of the images. MODIS and VIIRS fly in sun-synchronous polar orbits and observe a given area at roughly the same local time each day (about 10:30 a.m. and 1:30 p.m.). Imagers on the ISS offer more variety and less consistency in timing: the station's orbit is not sun-synchronous, so its overpass times, lighting, and viewing angles shift from pass to pass and from place to place.
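A back-of-the-envelope sketch (an illustration added here, not part of the original story) of why those overpass times differ: Earth's oblateness (the J2 term) makes an orbit plane precess. Sun-synchronous satellites are inclined so that this precession matches the Sun's apparent motion, locking in a fixed local crossing time, while the ISS at about 51.6° inclination precesses westward several degrees per day, so its local overpass time drifts by tens of minutes per day and cycles through the full day in roughly two months. The altitudes and inclinations below are assumed typical values.

```python
import math

# Illustrative sketch (assumed values, not from the article): J2 nodal
# precession explains why ISS overpass local times drift while
# sun-synchronous satellites revisit at fixed local times.

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
RE = 6.378137e6          # Earth's equatorial radius, m
J2 = 1.08263e-3          # Earth's oblateness coefficient

def nodal_precession_deg_per_day(altitude_km: float, inclination_deg: float) -> float:
    """Drift rate of the orbit plane due to Earth's oblateness (circular orbit)."""
    a = RE + altitude_km * 1e3                      # semi-major axis, m
    n = math.sqrt(MU / a**3)                        # mean motion, rad/s
    rate = -1.5 * J2 * n * (RE / a)**2 * math.cos(math.radians(inclination_deg))
    return math.degrees(rate) * 86400               # deg/day

sun_rate = 360.0 / 365.2422                         # Sun's mean motion, ~0.986 deg/day

iss = nodal_precession_deg_per_day(420, 51.6)       # ~ -5.0 deg/day
sso = nodal_precession_deg_per_day(830, 98.7)       # ~ +0.98 deg/day (sun-synchronous)

# One degree of node-vs-Sun offset corresponds to ~4 minutes of local time.
print(f"ISS overpass time drift:        {(iss - sun_rate) * 4:+.0f} min/day")   # ~ -24
print(f"Sun-synchronous overpass drift: {(sso - sun_rate) * 4:+.2f} min/day")   # ~ 0
```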

"We ended up getting these amazing images of fires at times of the day when we don't usually get them," said Morton. Fire researchers are eager to have more views of fires around dawn and dusk, which are sometimes missed by MODIS and VIIRS. "It was a reminder of how much critical science we could do if we had a whole fleet of sensors like CTI giving us such detailed measurements multiple times a day."

CTI was designed at NASA's Goddard Space Flight Center and installed on the ISS in 2019 as part of the Robotic Refueling Mission 3. It used an advanced detector called a strained layer superlattice (SLS), an improved version of the detector technology that is part of the Thermal Infrared Sensor (TIRS) of Landsat 8 and 9.

"The new SLS technology operates at a much warmer temperature with greater sensitivity and has a broader spectral response than the TIRS technology, resulting in a smaller and less costly instrument to design and build," said Murzy Jhabvala, principal investigator for CTI. "SLS has proved itself. This technology is now a viable candidate for the future Landsat 10 and a variety of other lunar, planetary, and asteroid missions."

NASA Earth Observatory images by Lauren Dauphin, using Landsat data from the U.S. Geological Survey, VIIRS data from NASA EOSDIS/LANCE and GIBS/Worldview and the Suomi National Polar-orbiting Partnership, topographic data from the Shuttle Radar Topography Mission (SRTM), and modified Copernicus Sentinel data (2018) processed by the European Space Agency. CTI data courtesy of the CTI team at NASA's Goddard Space Flight Center. The sensor was developed with QmagiQ and funded by the Earth Science Technology Office (ESTO). Story by Adam Voiland.

#Landsat #NASA #USGS #Earth



Vineesh V
Assistant Professor of Geography,
Directorate of Education,
Government of Kerala.
https://g.page/vineeshvc
