Cross Modal Distillation for Flood Extent Mapping

02/16/2023
by Shubhika Garg et al.

The increasing intensity and frequency of floods are among the many consequences of our changing climate. In this work, we explore ML techniques that improve the flood detection module of an operational early flood warning system. Our method exploits an unlabelled dataset of paired multi-spectral and Synthetic Aperture Radar (SAR) imagery to reduce the labelling requirements of a purely supervised learning approach. Prior works have used unlabelled data by creating weak labels from it. However, our experiments showed that such a model still ends up learning the mistakes in those weak labels. Motivated by knowledge distillation and semi-supervised learning, we use a teacher to train a student with the help of a small hand-labelled dataset and a large unlabelled dataset. Unlike the conventional self-distillation setup, we propose a cross-modal distillation framework that transfers supervision from a teacher trained on the richer modality (multi-spectral images) to a student model trained on SAR imagery. The trained models are then tested on the Sen1Floods11 dataset. Our model outperforms the Sen1Floods11 baseline model trained on weakly labelled SAR imagery by an absolute margin of 6.53
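The training objective described above can be sketched as a combined loss: standard cross-entropy on the small hand-labelled set, plus a distillation term that pulls the student's predictions on unlabelled SAR images towards the teacher's softened predictions on the paired multi-spectral images. The snippet below is a minimal numpy illustration of that idea, not the authors' implementation; the `temperature` and `alpha` weighting are standard knowledge-distillation conventions assumed here, and the logits arrays stand in for per-pixel model outputs.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax over class logits
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy(probs, labels):
    # mean negative log-likelihood of the true class
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def kl_divergence(p, q):
    # mean KL(p || q) between teacher and student distributions
    return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1))

def cross_modal_distillation_loss(student_logits_labelled, labels,
                                  student_logits_unlabelled,
                                  teacher_logits_unlabelled,
                                  temperature=2.0, alpha=0.5):
    """Supervised loss on the small labelled SAR set, plus a distillation
    term matching the SAR student to the multi-spectral teacher's softened
    predictions on the large unlabelled paired set."""
    sup = cross_entropy(softmax(student_logits_labelled), labels)
    soft_teacher = softmax(teacher_logits_unlabelled / temperature)
    soft_student = softmax(student_logits_unlabelled / temperature)
    # scale by T^2 so gradient magnitudes stay comparable across temperatures
    distill = kl_divergence(soft_teacher, soft_student) * temperature**2
    return alpha * sup + (1 - alpha) * distill
```

When the student already matches the teacher on the unlabelled pairs, the distillation term vanishes and only the supervised loss on the hand-labelled set remains, which is what drives the student towards the teacher's (cleaner) supervision rather than towards weak labels.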


