
Sentinel-2 Super-Resolution Models from 10m and 20m to 2.5m

This is a collection of super-resolution (SR) models for Sentinel-2 imagery, trained on the Sen2NAIPv2 dataset to super-resolve the 10m and 20m Sentinel-2 bands to 2.5m resolution. We provide three model modalities:

  • *_SR: Non-reference super-resolution model. The model takes the RGBN 10m Sentinel-2 bands as input and super-resolves them to 2.5m resolution.

  • *_F2: Reference super-resolution model. The model takes the Red Edge and SWIR (RSWIR) 20m Sentinel-2 bands as input and super-resolves them to 10m resolution. It requires the RGBN 10m bands as reference.

  • *_F4: Reference super-resolution model. The model takes the RSWIR 20m Sentinel-2 bands as input and super-resolves them to 2.5m resolution. It requires the RGBN 2.5m bands as reference (e.g., the output of an *_SR model).


The diagram illustrates the inference procedure for generating a complete 2.5-meter Sentinel-2 data cube. It highlights the specific role of each model in the multi-step resolution enhancement process.
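As a minimal sketch of this multi-step setup, the loop below downloads and loads one light model per modality using the same mlstac workflow shown in the examples that follow. The CNN_Light_F2 URL is assumed to follow the same naming pattern as the other models listed on this card.

import mlstac

BASE = "https://huggingface.co/tacofoundation/supers2/resolve/main"

# One light model per modality; see the Model Details table below.
models = {}
for name in ["CNN_Light_SR", "CNN_Light_F2", "CNN_Light_F4"]:
    mlstac.download(f"{BASE}/{name}/mlm.json", f"models2/{name}")  # fetch model files
    models[name] = mlstac.load(f"models2/{name}").compiled_model(device="cpu")  # ready for inference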

Example Usage

| Model | Description | Run Link |
| --- | --- | --- |
| Run Simple Model | A lightweight SR model optimized for fast runs. | Open In Colab |
| Run Best Model | Our most accurate model for high-quality super-resolution. | Open In Colab |
| Run Anywhere on Earth | A flexible implementation allowing inference on any location worldwide. | Open In Colab |

From 10m RGBN S2 to 2.5m

Dependencies for the simple model:

pip install torch supers2 mlstac safetensors

Dependencies for the best model:

pip install torch supers2 mlstac safetensors einops timm mamba-ssm

import torch
import mlstac
import matplotlib.pyplot as plt  # used by plt.show() below


# Download model
# file="https://huggingface.co/tacofoundation/supers2/resolve/main/Mamba_Medium_SR/mlm.json" ## best model
file="https://huggingface.co/tacofoundation/supers2/resolve/main/CNN_Light_SR/mlm.json"
output_dir="models2/CNN_Light_SR"
mlstac.download(file, output_dir)

# Create a mlstac object
mlstac_object = mlstac.load(output_dir)
device = "cpu" # "cpu"

# Load model
#srmodel = mlstac_object.trainable_model() # for fine-tuning
srmodel = mlstac_object.compiled_model(device=device) # for benchmarking

# Load Demo Data
lr, hr = mlstac_object.example_data()

# Inference
sr = srmodel(lr.to(device))

# Plot
#fig, ax = plt.subplots(1, 3, figsize=(15, 5))
#ax[0].imshow(lr[0, 0:3].permute(1, 2, 0)*3)
#ax[0].set_title("Low Resolution")
#ax[1].imshow(hr[0, 0:3].permute(1, 2, 0)*3)
#ax[1].set_title("High Resolution")
#ax[2].imshow(sr[0, 0:3].permute(1, 2, 0)*3)
#ax[2].set_title("Super Resolution")
#plt.show()


# Fast plot
fig = mlstac_object.display_results()
plt.show()
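As a quick sanity check, you can compare the model output with a plain bicubic upsampling of the same input. This is a minimal sketch assuming lr is a (B, C, H, W) float tensor and that the SR model upsamples by a factor of 4 (10m to 2.5m).

import torch.nn.functional as F

# Bicubic baseline on the same 2.5m grid (x4 upsampling of the 10m input)
baseline = F.interpolate(lr, scale_factor=4, mode="bicubic", align_corners=False)

# The SR output should have the same spatial size as the baseline
print("lr:", tuple(lr.shape), "baseline:", tuple(baseline.shape), "sr:", tuple(sr.shape))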


From 10m and 20m S2 to 2.5m

import mlstac
import matplotlib.pyplot as plt

# Download model
# file="https://huggingface.co/tacofoundation/supers2/resolve/main/Swin_Medium_F4/mlm.json" ## best model
file="https://huggingface.co/tacofoundation/supers2/resolve/main/CNN_Light_F4/mlm.json"
output_dir="models2/CNN_Light_F4"
mlstac.download(file, output_dir)

# Create a mlstac object
mlstac_object = mlstac.load(output_dir)
device = "cpu" # "cpu"

# Load model
#srmodel = mlstac_object.trainable_model() # for fine-tuning
srmodel = mlstac_object.compiled_model(device=device) # for benchmarking

# Load Demo Data
lr = mlstac_object.example_data()

# Inference
sr = srmodel(lr.to(device))

# Fast plot
fig = mlstac_object.display_results()
plt.show()

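From 20m S2 to 10m

The F2 models (20m to 10m) can be run the same way. The snippet below mirrors the F4 example and is a sketch only: it assumes the CNN_Light_F2 URL follows the same naming pattern as the other models and that its mlstac object also provides demo data.

import mlstac
import matplotlib.pyplot as plt

# Download the light F2 model (20m RSWIR to 10m, with RGBN 10m as reference)
file="https://huggingface.co/tacofoundation/supers2/resolve/main/CNN_Light_F2/mlm.json"
output_dir="models2/CNN_Light_F2"
mlstac.download(file, output_dir)

# Create a mlstac object and load the compiled model
mlstac_object = mlstac.load(output_dir)
srmodel = mlstac_object.compiled_model(device="cpu")

# Demo data and inference, as in the F4 example
lr = mlstac_object.example_data()
sr = srmodel(lr.to("cpu"))

# Fast plot
fig = mlstac_object.display_results()
plt.show()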

Model Details

We provide the following models:

| Model Name | Modality | Description | Input Resolution | Output Resolution | Trainable Parameters (M) | Total Mult-Adds (GFLOPs) |
| --- | --- | --- | --- | --- | --- | --- |
| CNN_Light_SR | SR | Super-resolves 10m Sentinel-2 bands to 2.5m resolution | 10m | 2.5m | 0.4 | 7.8 |
| CNN_Light_F2 | F2 | Super-resolves 20m Sentinel-2 bands to 10m resolution | 20m | 10m | 0.4 | 7.8 |
| CNN_Light_F4 | F4 | Super-resolves 20m Sentinel-2 bands to 2.5m resolution | 20m | 2.5m | 0.4 | 7.8 |
| Mamba_Medium_SR | SR | Super-resolves 10m Sentinel-2 bands to 2.5m resolution | 10m | 2.5m | 13.8 | 69.8 |
| Swin_Large_F2 | F2 | Super-resolves 20m Sentinel-2 bands to 10m resolution | 20m | 10m | 32.6 | 71.5 |
| Swin_Medium_F4 | F4 | Super-resolves 20m Sentinel-2 bands to 2.5m resolution | 20m | 2.5m | 13.2 | 35.4 |
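
To verify the trainable parameter counts above for a downloaded model, a minimal sketch (assuming trainable_model() returns a standard torch.nn.Module) is:

# Count trainable parameters of the loaded model
srmodel = mlstac_object.trainable_model()
n_params = sum(p.numel() for p in srmodel.parameters() if p.requires_grad)
print(f"Trainable parameters: {n_params / 1e6:.1f} M")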