FireWatch is a specialized geospatial analysis tool built on top of the IBM/NASA Prithvi-EO-2.0-300M foundation model. It detects and quantifies wildfire damage by comparing pre- and post-event satellite imagery, using a hybrid approach that combines classical spectral indices with deep-learning embeddings.
```mermaid
graph TD
    Input[Pre & Post Satellite Images] --> Processor[Data Pre-processing]
    Processor --> AI[Prithvi AI & Spectral Engine]
    AI --> Logic[Hybrid Damage Detection]
    Logic --> Mask[Damage Mask]
    Logic --> Overlay[Pink & Purple Overlay]
    Logic --> Boxed[Red Boxed Detection]
    Logic --> Export[GeoTIFF Metadata Export]
```
- Hybrid Detection Engine: Combines dNBR (differenced Normalized Burn Ratio) with Prithvi-EO-2.0 spatial embeddings for high-precision burn mapping.
- Zero-Shot Capability: Leverages the power of foundation models to detect damage without needing specific training for every fire event.
- GeoTIFF Native: Full support for the 6-band HLS (Harmonized Landsat and Sentinel-2) format, preserving geospatial metadata throughout the pipeline.
- Enhanced Visualization:
- Natural Color RGB previews.
- New! Pink/Purple high-contrast damage overlays for better visibility on satellite backgrounds.
- New! Red bounding boxes (squares) for automated target identification.
- Spatial Export: Results can be downloaded as georeferenced TIFs for immediate use in GIS software like QGIS or ArcGIS.
- CPU-Optimized: Engineered to run efficiently on standard CPU environments using `terratorch` and `torch` optimizations.
The following imagery demonstrates the end-to-end detection process on a real satellite image pair (unknown area).
| Pre-Event | Post-Event |
|---|---|
| ![]() | ![]() |
| Satellite view before the fire | Satellite view after the fire (burn scars) |
| Change Mask | Pink/Purple Overlay | Red Boxed Detection |
|---|---|---|
| ![]() | ![]() | ![]() |
| Raw AI Detection (Binary Mask) | Enhanced Composite Overlay | Automated Cluster Identification |
FireWatch generates several distinct visual and data products for comprehensive analysis:
**Change Mask.** A binary representation of the detected areas. This is the pure mathematical output of the hybrid model, highlighting exactly which pixels have undergone the significant spectral and contextual shifts typical of fire damage.
**Pink/Purple Overlay.** A high-contrast visualization designed for human interpretation. By using a pink/purple spectrum on top of the original imagery, it provides superior visibility compared to standard red masks, especially over dark, charred forest backgrounds.
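The overlay idea described above can be sketched as a simple alpha blend with NumPy. This is an illustrative sketch, not the FireWatch implementation: the function name `apply_damage_overlay`, the specific pink RGB value, and the default opacity are all assumptions.

```python
import numpy as np

def apply_damage_overlay(rgb, mask, alpha=0.6):
    """Blend a pink/purple tint over masked pixels of an RGB uint8 image.

    rgb:   (H, W, 3) uint8 natural-color image
    mask:  (H, W) boolean damage mask
    alpha: overlay opacity (0 = invisible, 1 = solid color)

    The pink value (255, 20, 147) is an illustrative choice; the actual
    FireWatch palette may differ.
    """
    overlay = rgb.astype(np.float32)
    pink = np.array([255.0, 20.0, 147.0])
    # Blend only where the damage mask is True; untouched pixels keep
    # the original natural-color values.
    overlay[mask] = (1 - alpha) * overlay[mask] + alpha * pink
    return overlay.clip(0, 255).astype(np.uint8)

# Minimal usage on synthetic data: a 4x4 black image with a 2x2 mask.
img = np.zeros((4, 4, 3), dtype=np.uint8)
m = np.zeros((4, 4), dtype=bool)
m[1:3, 1:3] = True
out = apply_damage_overlay(img, m, alpha=1.0)
```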
**Red Boxed Detection.** Integrated automated target identification. Using connected-component analysis, the tool draws sharp red squares (bounding boxes) around every distinct damage cluster. This is ideal for rapid assessment and prioritizing area-based responses.
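The connected-component step can be sketched with `scipy.ndimage`. This is a minimal illustration of the technique, not the project's code: the function name `damage_bounding_boxes` and the `min_area` filter are assumptions.

```python
import numpy as np
from scipy import ndimage

def damage_bounding_boxes(mask, min_area=1):
    """Return (row_min, row_max, col_min, col_max) for each connected
    damage cluster in a boolean mask, skipping boxes below min_area."""
    labels, n_clusters = ndimage.label(mask)  # 4-connectivity by default
    boxes = []
    for sl in ndimage.find_objects(labels):
        rows, cols = sl
        if (rows.stop - rows.start) * (cols.stop - cols.start) >= min_area:
            boxes.append((rows.start, rows.stop - 1,
                          cols.start, cols.stop - 1))
    return boxes

# Two separate clusters in a 6x6 mask.
mask = np.zeros((6, 6), dtype=bool)
mask[0:2, 0:2] = True   # cluster 1 (top-left)
mask[4:6, 3:6] = True   # cluster 2 (bottom-right)
boxes = damage_bounding_boxes(mask)
```

Each returned tuple can then be drawn as a red rectangle on the RGB preview.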
- Python 3.10+
- UV Package Manager (recommended for speed and reliability)
1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/Changedetection-prithivi2.0.git
   cd Changedetection-prithivi2.0
   ```

2. Install dependencies:

   ```bash
   uv sync
   ```

3. Download the model weights: The application uses the Prithvi-EO-2.0-300M foundation model. The app will attempt to download the weights automatically on first run, but we recommend downloading them manually via Git LFS if you have a slow connection:
   - Link: IBM/NASA Prithvi-EO-2.0-300M (Hugging Face)
   - Place the weights in a `./weights` folder in the project root if you wish to use them offline.

4. Run the application:

   ```bash
   uv run python app.py
   ```
FireWatch includes a public satellite data example to help you test the detection pipeline immediately:
1. Start the server using `uv run python app.py`.
2. Open `http://localhost:5000` in your browser.
3. Upload the following files from the project directory:
   - Pre-Event: `demo_data/real/pre.tif`
   - Post-Event: `demo_data/real/post.tif`
4. Set the Threshold to `0.5` for best clarity on this dataset.
5. Click "Detect Changes" and wait for the AI to process the results.
For optimal results, input GeoTIFFs should follow the HLS (Harmonized Landsat and Sentinel-2) band specification:
| Band Index | Band Name | HLS Band Code | Purpose |
|---|---|---|---|
| 1 | Blue | B02 | Natural Color / Atmospheric Check |
| 2 | Green | B03 | Natural Color / Vegetation |
| 3 | Red | B04 | Natural Color / Soil |
| 4 | NIR | B8A | Vegetation Health (NBR calculation) |
| 5 | SWIR 1 | B11 | Moisture / Burn Detection (NBR calculation) |
| 6 | SWIR 2 | B12 | Burn Scars / Soil Composition |
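Given a raster stacked in the band order above, NBR uses bands 4 (NIR) and 6 (SWIR 2). A minimal sketch, assuming the image is already loaded as a `(6, H, W)` reflectance array (the band-name constants and the epsilon guard are illustrative):

```python
import numpy as np

# 0-indexed band positions per the HLS table above.
BLUE, GREEN, RED, NIR, SWIR1, SWIR2 = range(6)

def nbr(hls):
    """Normalized Burn Ratio for a (6, H, W) HLS reflectance stack.

    NBR = (NIR - SWIR2) / (NIR + SWIR2); the small epsilon avoids
    division by zero over nodata pixels.
    """
    nir = hls[NIR].astype(np.float64)
    swir2 = hls[SWIR2].astype(np.float64)
    return (nir - swir2) / (nir + swir2 + 1e-10)

# Healthy vegetation reflects strongly in NIR and weakly in SWIR2,
# so NBR comes out high (close to +1).
pixel = np.zeros((6, 1, 1))
pixel[NIR] = 0.5
pixel[SWIR2] = 0.1
value = nbr(pixel)[0, 0]
```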
Important
Both "Pre" and "Post" images must be spatially aligned (same CRS, bounds, and dimensions).
FireWatch doesn't just look for "change"—it looks for fire damage. The detection pipeline uses a weighted fusion:
- Spectral Evidence (70%): Calculates `dNBR = NBR_pre - NBR_post`, where `NBR = (NIR - SWIR2) / (NIR + SWIR2)`. dNBR is the gold standard for identifying charred organic matter.
- Contextual Evidence (30%): Uses the Prithvi ViT (Vision Transformer) to extract 1024-dimensional embeddings, then computes the cosine distance between pre- and post-event features to capture the shape and context of land-cover change.
This reduces false positives caused by clouds, shadows, or seasonal vegetation changes.
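The 70/30 fusion described above can be sketched as follows. This is an assumption-laden illustration, not the FireWatch source: the function name, the dNBR normalization range `[0, 1.3]` (a typical burn-severity scale), and the per-pixel embedding layout are all hypothetical.

```python
import numpy as np

def fused_damage_score(nbr_pre, nbr_post, emb_pre, emb_post,
                       w_spectral=0.7, w_context=0.3):
    """Weighted fusion of spectral and contextual change evidence.

    nbr_pre, nbr_post: (H, W) NBR maps
    emb_pre, emb_post: (H, W, D) per-location embeddings (e.g. D=1024)

    Clipping dNBR to [0, 1.3] before normalizing is an assumption based
    on common dNBR severity scales, not taken from the FireWatch code.
    """
    dnbr = nbr_pre - nbr_post
    spectral = np.clip(dnbr, 0.0, 1.3) / 1.3  # normalize to [0, 1]

    # Cosine distance between temporal embeddings at each location:
    # 0 means identical context, larger values mean stronger change.
    num = np.sum(emb_pre * emb_post, axis=-1)
    den = (np.linalg.norm(emb_pre, axis=-1)
           * np.linalg.norm(emb_post, axis=-1) + 1e-10)
    context = 1.0 - num / den

    return w_spectral * spectral + w_context * np.clip(context, 0.0, 1.0)

# Identical embeddings (no contextual change) with dNBR = 0.65:
# the score reduces to the spectral term alone, 0.7 * 0.5 = 0.35.
npre = np.full((1, 1), 0.65)
npost = np.zeros((1, 1))
e = np.ones((1, 1, 4))
score = fused_damage_score(npre, npost, e, e)
```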
Since FireWatch is designed for the NASA HLS standard, you can download more public imagery from these official sources:
- NASA Earthdata Search: Access HLS L30 (Landsat) and HLS S30 (Sentinel) global datasets.
- USGS EarthExplorer: High-resolution Landsat 8-9 products compatible with HLS.
- Microsoft Planetary Computer: Browse and stream HLS Collections using Python/STAC.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details. Built using Prithvi-EO-2.0 models by IBM Research and NASA.
Built with ❤️ for Earth Observation by tushar365.




