# opensr-test
---
license: mit
task_categories:
  - image-to-image
language:
  - en
tags:
  - earth
  - remote sensing
  - super-resolution
  - Sentinel-2
  - sentinel-2
pretty_name: opensr_test
size_categories:
  - 10K<n<100K
---


A comprehensive benchmark for real-world Sentinel-2 imagery super-resolution


- GitHub: https://github.com/ESAOpenSR/opensr-test
- Documentation: https://esaopensr.github.io/opensr-test
- PyPI: https://pypi.org/project/opensr-test/
- Paper: https://www.techrxiv.org/users/760184/articles/735467-a-comprehensive-benchmark-for-optical-remote-sensing-image-super-resolution


## Overview

Super-resolution (SR) aims to improve the ground sampling distance of satellite imagery. However, two problems are common in the literature. First, most models are tested on synthetic data, raising doubts about their real-world applicability and performance. Second, traditional evaluation metrics such as PSNR, LPIPS, and SSIM were not designed to assess SR performance. These metrics fall short, especially under changes in luminance or spatial misalignments, scenarios frequently encountered in the real world.

To address these challenges, `opensr-test` provides a fair approach to SR benchmarking. We provide three datasets carefully crafted to minimize spatial and spectral misalignment. In addition, `opensr-test` assesses SR algorithm performance across three independent groups of metrics that measure consistency, synthesis, and correctness.
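The consistency group, for instance, asks whether an SR image can be degraded back to something close to the original LR input. Below is a minimal NumPy sketch of that idea; it is illustrative only and not the metric actually implemented in `opensr-test`, and it assumes square single-band images whose sides are exact multiples of the scale factor:

```python
import numpy as np

def consistency_l1(sr: np.ndarray, lr: np.ndarray, scale: int = 4) -> float:
    """Block-average the SR image back onto the LR grid, then return the
    mean absolute difference against the original LR input. Low values
    mean the SR output stayed consistent with the sensor measurement."""
    h, w = lr.shape
    degraded = sr.reshape(h, scale, w, scale).mean(axis=(1, 3))
    return float(np.abs(degraded - lr).mean())

# A perfectly consistent SR result: each LR pixel replicated into a 4x4 block.
lr = np.arange(16, dtype=np.float64).reshape(4, 4)
sr = np.kron(lr, np.ones((4, 4)))
print(consistency_l1(sr, lr))  # 0.0
```

Block-averaging stands in here for the sensor's degradation model; a real benchmark would also account for the sensor's point spread function and radiometric differences.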


## Datasets

The `opensr-test` package provides three datasets for benchmarking SR models, carefully crafted to minimize spatial and spectral misalignment. See our Hugging Face repository for more details about the datasets: https://huggingface.co/datasets/isp-uv-es/opensr-test

### NAIP (X4 scale factor)

The National Agriculture Imagery Program (NAIP) dataset is a high-resolution aerial imagery dataset covering the continental United States. It consists of 2.5 m NAIP imagery captured in the visible and near-infrared spectrum (RGBNIR), together with all Sentinel-2 L1C and L2A bands. The dataset focuses on crop fields, forests, and bare soil areas.

```python
import opensr_test

naip = opensr_test.load("naip")
```


### SPOT (X4 scale factor)

The SPOT imagery was obtained from the WorldStrat dataset. It consists of 2.5 m SPOT imagery captured in the visible and near-infrared spectrum (RGBNIR), together with all Sentinel-2 L1C and L2A bands. The dataset focuses on urban areas, crop fields, and bare soil areas.

```python
import opensr_test

spot = opensr_test.load("spot")
```


### Venµs (X2 scale factor)

The Venµs images were obtained from the Sen2Venµs dataset. It consists of 5 m Venµs imagery captured in the visible and near-infrared spectrum (RGBNIR), together with all Sentinel-2 L1C and L2A bands. The dataset focuses on crop fields, forests, urban areas, and bare soil areas.

```python
import opensr_test

venus = opensr_test.load("venus")
```
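As a quick sanity check for an X2 setting like this one, any useful SR model should beat a trivial pixel-replication baseline when compared against the HR reference. The sketch below uses synthetic stand-in arrays rather than the actual dataset, so the shapes and values are assumptions for illustration only:

```python
import numpy as np

def nearest_neighbor_sr(lr: np.ndarray, scale: int = 2) -> np.ndarray:
    """Trivial 'super-resolution': replicate every LR pixel into a
    scale x scale block. Real SR models should beat this baseline."""
    return np.kron(lr, np.ones((scale, scale), dtype=lr.dtype))

# Synthetic stand-ins for an HR patch and its coarser LR counterpart.
rng = np.random.default_rng(0)
hr = rng.random((8, 8))
lr = hr.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # simulate the coarser sensor

baseline = nearest_neighbor_sr(lr, scale=2)
rmse = float(np.sqrt(((baseline - hr) ** 2).mean()))
print(f"baseline RMSE vs HR: {rmse:.4f}")
```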


## Deeper understanding

Explore the API section of the documentation for more details on customizing your benchmark experiments.


## Citation

If you use `opensr-test` in your research, please cite our paper:

```bibtex
@article{aybar2024comprehensive,
  title={A Comprehensive Benchmark for Optical Remote Sensing Image Super-Resolution},
  author={Aybar, Cesar and Montero, David and Donike, Simon and Kalaitzis, Freddie and G{\'o}mez-Chova, Luis},
  journal={Authorea Preprints},
  year={2024},
  publisher={Authorea}
}
```

## Acknowledgements

This work was made with the support of the European Space Agency (ESA) under the project “Explainable AI: application to trustworthy super-resolution (OpenSR)”. Cesar Aybar acknowledges support by the National Council of Science, Technology, and Technological Innovation (CONCYTEC, Peru) through the “PROYECTOS DE INVESTIGACIÓN BÁSICA – 2023-01” program with contract number PE501083135-2023-PROCIENCIA. Luis Gómez-Chova acknowledges support from the Spanish Ministry of Science and Innovation (project PID2019-109026RB-I00 funded by MCIN/AEI/10.13039/501100011033).