MIMA: MAPPER-Induced Manifold Alignment for Semi-Supervised Fusion of Optical Image and Polarimetric SAR Data

06/13/2019
by Jingliang Hu, et al.

Multi-modal data fusion has recently shown promise in remote sensing classification tasks. Optical data and radar data, two important yet intrinsically different data sources, are attracting growing attention for potential fusion. Machine-learning-based methodologies are widely known to yield excellent performance; however, they rely on large training sets, which are very expensive to obtain in remote sensing. Semi-supervised manifold alignment (SSMA), a multi-modal data fusion algorithm, was designed to amplify the impact of an existing training set by linking labeled data to unlabeled data via unsupervised techniques. In this paper, we explore the potential of SSMA for fusing optical data and polarimetric SAR data, which are multi-sensory data sources. Furthermore, we propose MAPPER-induced manifold alignment (MIMA) for semi-supervised fusion of multi-sensory data sources. The proposed method unites SSMA with MAPPER, an algorithm developed in the emerging field of topological data analysis (TDA). To the best of our knowledge, this is the first time that SSMA has been applied to the fusion of optical and SAR data, and also the first time that TDA has been applied in remote sensing. Conventional SSMA derives a topological structure using k-nearest neighbors (kNN), whereas MIMA employs MAPPER, which incorporates domain knowledge and derives a topological structure through spectral clustering in a data-driven fashion. Experimental results on data fusion for land cover/land use classification and local climate zone classification suggest the superior performance of MIMA.
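As a rough illustration of the kind of topological structure MAPPER builds, the sketch below covers the range of a 1-D filter function with overlapping intervals, partitions each pre-image with spectral clustering, and connects clusters that share samples. The function name `mapper_graph` and all parameter choices are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal MAPPER sketch: cover a 1-D filter function with overlapping
# intervals, cluster each pre-image with spectral clustering, and connect
# clusters that share samples. Parameters here are illustrative only.
import numpy as np
from sklearn.cluster import SpectralClustering

def mapper_graph(X, filter_values, n_intervals=10, overlap=0.3, n_clusters=2):
    """Return MAPPER nodes (sets of sample indices) and edges (shared samples)."""
    lo, hi = filter_values.min(), filter_values.max()
    length = (hi - lo) / n_intervals          # interval length of the cover
    step = length * (1.0 - overlap)           # shift between consecutive intervals

    nodes = []                                # each node is a set of sample indices
    start = lo
    while start < hi:
        # pre-image of the current interval of the filter function
        idx = np.where((filter_values >= start) &
                       (filter_values <= start + length))[0]
        if len(idx) > n_clusters:
            labels = SpectralClustering(
                n_clusters=n_clusters, affinity="nearest_neighbors",
                n_neighbors=min(10, len(idx) - 1), random_state=0
            ).fit_predict(X[idx])
            for c in np.unique(labels):
                nodes.append(set(idx[labels == c]))
        elif len(idx) > 0:
            nodes.append(set(idx))
        start += step

    # connect two clusters whenever their pre-images share samples
    edges = [(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
             if nodes[i] & nodes[j]]
    return nodes, edges
```

In contrast, conventional SSMA would connect each sample directly to its k nearest neighbors; the MAPPER graph instead groups samples by the cover of the filter function before clustering, which is where the data-driven structure described in the abstract comes from.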


