SSL4EO-S12: A Large-Scale Multi-Modal, Multi-Temporal Dataset for Self-Supervised Learning in Earth Observation

11/13/2022
by Yi Wang, et al.

Self-supervised pre-training has the potential to produce expressive representations without human annotation. Most pre-training in Earth observation (EO) is based on ImageNet or on medium-sized, labeled remote sensing (RS) datasets. We share an unlabeled RS dataset, SSL4EO-S12 (Self-Supervised Learning for Earth Observation - Sentinel-1/2), which assembles a large-scale, global, multimodal, and multi-seasonal corpus of satellite imagery from the ESA Sentinel-1 and Sentinel-2 missions. For EO applications, we demonstrate that SSL4EO-S12 supports successful self-supervised pre-training with a set of methods: MoCo-v2, DINO, MAE, and data2vec. The resulting models achieve downstream performance close to, or surpassing, that of supervised learning, and pre-training on SSL4EO-S12 outperforms pre-training on existing datasets. We make the dataset, related source code, and pre-trained models openly available at https://github.com/zhu-xlab/SSL4EO-S12.
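To make the contrastive pre-training idea concrete: methods like MoCo-v2 train a network so that two augmented views of the same image patch (for SSL4EO-S12, e.g. the same location at different seasons) embed close together, while embeddings of other patches are pushed apart, via the InfoNCE loss. The sketch below is a minimal, self-contained NumPy illustration of that loss on toy embeddings, not the authors' implementation; the dimensions, temperature, and toy "seasonal view" construction are illustrative assumptions.

```python
import numpy as np

def info_nce(q, k_pos, k_neg, tau=0.2):
    """InfoNCE loss on L2-normalized embeddings (MoCo-v2-style sketch).

    q: (d,) query embedding, k_pos: (d,) positive key,
    k_neg: (n, d) negative keys, tau: temperature.
    """
    q = q / np.linalg.norm(q)
    k_pos = k_pos / np.linalg.norm(k_pos)
    k_neg = k_neg / np.linalg.norm(k_neg, axis=1, keepdims=True)
    # Positive similarity first, then all negative similarities.
    logits = np.concatenate([[q @ k_pos], k_neg @ q]) / tau
    logits -= logits.max()  # numerical stability before softmax
    p = np.exp(logits) / np.exp(logits).sum()
    return -np.log(p[0])    # cross-entropy with the positive at index 0

rng = np.random.default_rng(0)
d, n = 128, 16
anchor = rng.normal(size=d)
q = anchor + 0.1 * rng.normal(size=d)      # one "seasonal" view of a patch
k_pos = anchor + 0.1 * rng.normal(size=d)  # another view of the same patch
k_neg = rng.normal(size=(n, d))            # embeddings of other locations
loss = info_nce(q, k_pos, k_neg)
```

Because the two views share the same underlying patch, their cosine similarity is high and the loss is small; swapping in an unrelated embedding as the "positive" makes the loss much larger, which is exactly the signal that drives the encoder during pre-training.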


Related research

Seasonal Contrast: Unsupervised Pre-Training from Uncurated Remote Sensing Data (03/30/2021)
Remote sensing and automatic earth monitoring are key to solve global-sc...

Embedding Earth: Self-supervised contrastive pre-training for dense land cover classification (03/11/2022)
In training machine learning models for land cover semantic segmentation...

Self-supervised pre-training enhances change detection in Sentinel-2 imagery (01/20/2021)
While annotated images for change detection using satellite imagery are ...

rPPG-MAE: Self-supervised Pre-training with Masked Autoencoders for Remote Physiological Measurement (06/04/2023)
Remote photoplethysmography (rPPG) is an important technique for perceiv...

The color out of space: learning self-supervised representations for Earth Observation imagery (06/22/2020)
The recent growth in the number of satellite images fosters the developm...

CMID: A Unified Self-Supervised Learning Framework for Remote Sensing Image Understanding (04/19/2023)
Self-supervised learning (SSL) has gained widespread attention in the re...

Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels (03/25/2021)
The success of learning with noisy labels (LNL) methods relies heavily o...
