# Sea-Undistort: A Synthetic Dataset for Restoring Through-Water Images in Airborne Bathymetric Mapping
Sea-Undistort is a synthetic dataset created using the open-source 3D graphics platform Blender. The dataset comprises 1200 image pairs, each consisting of 512×512 pixel RGB renderings of shallow underwater scenes. Every pair includes a “non-distorted” image, representing minimal surface and water-column distortions, and a corresponding “distorted” version that incorporates realistic optical phenomena such as sun glint, wave-induced deformations, turbidity, and light scattering. These effects are procedurally generated to replicate the diverse challenges encountered in through-water imaging for bathymetry. The scenes are designed with randomized combinations of typical shallow-water seabed types, including rocky outcrops, sandy flats, gravel beds, and seagrass patches, capturing a wide range of textures, reflectance patterns, and radiometric conditions. Refraction is accurately modeled in both the distorted and non-distorted images to maintain geometric consistency with real underwater imaging physics.
In addition, camera settings are uniformly sampled within specific ranges to ensure diverse imaging conditions. Sensor characteristics include a physical width of 36 mm and effective pixel widths of 4000 or 5472 pixels. Focal lengths of 20 mm and 24 mm are simulated, with only the central 512×512 pixels rendered. Camera altitude ranges from 30 m to 200 m, resulting in a ground sampling distance (GSD) between 0.014 m and 0.063 m. Average depths range from –0.5 m to –8 m, with a maximum tilt angle of 5°. Sun elevation angles between 25° and 70°, along with varying atmospheric parameters (e.g., air, dust), are used to simulate different illumination conditions. Generated images are accompanied by a JSON file containing this metadata per image.
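Under a standard pinhole-camera model, the quoted GSD range follows from the sensor and flight parameters listed above. The helper below is a minimal sketch (the function name is ours, not part of the released code) that reproduces the approximate endpoints of the 0.014 m–0.063 m range:

```python
def ground_sampling_distance(altitude_m, focal_length_m,
                             sensor_width_m=0.036, image_width_px=4000):
    """Pinhole-camera GSD at nadir: ground footprint of a single pixel."""
    pixel_pitch_m = sensor_width_m / image_width_px
    return altitude_m * pixel_pitch_m / focal_length_m

# Lowest altitude, 20 mm lens, 4000 px sensor:
print(round(ground_sampling_distance(30, 0.020, image_width_px=4000), 4))   # ~0.0135 m
# Highest altitude, 20 mm lens, 5472 px sensor:
print(round(ground_sampling_distance(200, 0.020, image_width_px=5472), 4))  # ~0.0658 m
```

These values roughly bracket the 0.014 m–0.063 m GSD range stated above; the exact dataset values also depend on the sampled focal length and camera tilt.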
Sea-Undistort is designed to support supervised training of deep learning models for through-water image enhancement and correction, enabling generalization to real-world conditions where undistorted ground truth is otherwise unobtainable.
DOI of Dataset Repository
This repository contains the code of the paper: M. Kromer, P. Agrafiotis, and B. Demir, "Sea-Undistort: A Dataset for Through-Water Image Restoration in High Resolution Airborne Bathymetric Mapping", submitted to IEEE GRSL.
If you find this repository useful, please consider giving a star ⭐.
If you use the code in this repository or the dataset, please cite:
Kromer, M., Agrafiotis, P., & Demir, B. (2025). Sea-Undistort: A dataset for through-water image restoration in high resolution airborne bathymetric mapping. arXiv. https://arxiv.org/abs/2508.07760
```bibtex
@misc{kromer2025seaundistortdatasetthroughwaterimage,
  title={Sea-Undistort: A Dataset for Through-Water Image Restoration in High Resolution Airborne Bathymetric Mapping},
  author={Maximilian Kromer and Panagiotis Agrafiotis and Begüm Demir},
  year={2025},
  eprint={2508.07760},
  archivePrefix={arXiv},
  primaryClass={eess.IV},
  url={https://arxiv.org/abs/2508.07760},
}
```
For downloading the dataset and a detailed explanation of it, please visit the MagicBathy Project website at https://www.magicbathy.eu/Sea-Undistort.html
The folder structure should be as follows:
```
┗ 📂 Sea-Undistort/
  ┣ 📜 render_0000_ground.png
  ┣ 📜 render_0000_no_sunglint.png
  ┣ 📜 render_0000_no_waves.png
  ┣ 📜 render_0000.png
  ┣ 📜 render_0001_ground.png
  ┣ 📜 render_0001_no_sunglint.png
  ┣ 📜 render_0001_no_waves.png
  ┣ 📜 render_0001.png
  ┣ 📜 ...
  ┣ 📜 render_1199_ground.png
  ┣ 📜 render_1199_no_sunglint.png
  ┣ 📜 render_1199_no_waves.png
  ┣ 📜 render_1199.png
  ┣ 📜 scene_settings.json
  ┗ 📜 LICENSE_and_info.txt
```
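The flat naming scheme makes it straightforward to pair each distorted render with its targets, e.g. when building a dataloader. The snippet below is a minimal sketch (the function name and grouping logic are ours, not part of the released code) that groups files by render index based on the `render_XXXX[_variant].png` pattern shown above:

```python
import re
from collections import defaultdict

def pair_renders(filenames):
    """Group Sea-Undistort files by render index.

    Returns {index: {"distorted": ..., "ground": ..., "no_sunglint": ..., "no_waves": ...}},
    following the naming scheme render_XXXX[_variant].png.
    """
    pattern = re.compile(r"render_(\d{4})(?:_(ground|no_sunglint|no_waves))?\.png")
    pairs = defaultdict(dict)
    for name in filenames:
        match = pattern.fullmatch(name)
        if match is None:
            continue  # skip scene_settings.json, LICENSE_and_info.txt, etc.
        index, variant = match.groups()
        pairs[index][variant or "distorted"] = name
    return dict(pairs)

files = ["render_0000.png", "render_0000_ground.png", "scene_settings.json"]
print(pair_renders(files)["0000"]["ground"])  # render_0000_ground.png
```

In a training pipeline, the `"distorted"` entry would serve as the network input and `"ground"` as the supervision target; the `no_sunglint` and `no_waves` variants allow ablations on individual distortion effects.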
We provide code and model weights for the following deep learning models that have been pre-trained on Sea-Undistort for through-water image restoration:
| Model Names | Pre-Trained PyTorch Models |
|---|---|
| NDR-Restore | NDR-Restore.zip |
| ResShift | ResShift.zip |
| ResShift+EF | ResShift+EF.zip |
To achieve the results presented in the paper, use the parameters and the specific train-evaluation splits provided in the dataset.
- For NDR-Restore inference and training, see `NDR-Instructions.md`.
- For ResShift and ResShift+EF inference and training, see `ResShift-Instructions.md`.
- We do not include the full model code for NDR-Restore or ResShift. The original codebases are publicly available:
- NDR-Restore repository: https://github.com/mdyao/NDR-Restore
- ResShift repository: https://github.com/zsyOAOA/ResShift
- This repository includes only our modified files, configuration files, pretrained weights links, and instructions on how to integrate them into the original projects.
- We provide `req.txt` and `environment.yml` at the repository root to create a Python environment compatible with both models. These files reflect the environment we used for our experiments.
- For further questions about implementation details, please consult the original repositories linked above, open an issue in this repository, or email: m.kromer@tu-berlin.de
- We gratefully acknowledge the authors of the external baselines:
- NDR-Restore: https://github.com/mdyao/NDR-Restore
- ResShift: https://github.com/zsyOAOA/ResShift
Example imagery from the Agia Napa area, in order of appearance from left to right: (1) original patches, restorations using (2) NDR-Restore, (3) ResShift, and (4) ResShift+EF.
[Maximilian Kromer](https://github.com/MaximilianKromer) and [Panagiotis Agrafiotis](https://www.user.tu-berlin.de/pagraf/)
Feel free to give feedback by sending an email to: agrafiotis@tu-berlin.de
This work is part of the MagicBathy project funded by the European Union’s HORIZON Europe research and innovation programme under the Marie Skłodowska-Curie GA 101063294. Work has been carried out at the Remote Sensing Image Analysis group. For more information about the project, visit https://www.magicbathy.eu/.