Large-Scale Optimal Transport via Adversarial Training with Cycle-Consistency
Recent advances in large-scale optimal transport have greatly extended its application scenarios in machine learning. However, existing methods either do not explicitly learn the transport map or do not support general cost functions. In this paper, we propose an end-to-end approach to large-scale optimal transport that directly learns the transport map and is compatible with general cost functions. It models the transport map via stochastic neural networks and enforces the constraints on the marginal distributions via adversarial training. The proposed framework can be further extended to learn a Monge map or an optimal bijection by adopting cycle-consistency constraints. We verify the effectiveness of the proposed method and demonstrate its superior performance against existing methods on large-scale real-world applications, including domain adaptation, image-to-image translation, and color transfer.
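The training objective the abstract describes combines a transport cost on the learned map with an adversarial penalty that matches the target marginal, plus an optional cycle-consistency term for the Monge/bijection extension. The sketch below illustrates that decomposition only; the function names, the quadratic cost, and the toy linear maps are assumptions for illustration, not the paper's implementation, which uses stochastic neural networks for the map and a trained discriminator for the marginal penalty.

```python
import numpy as np

def quadratic_cost(x, y):
    # One choice of general cost c(x, y); here the squared Euclidean cost.
    return np.sum((x - y) ** 2, axis=1)

def ot_objective(x, T, marginal_penalty, lam=1.0):
    """Transport cost E[c(x, T(x))] plus a weighted penalty that, in the
    full method, an adversarially trained discriminator supplies to force
    the pushforward T(x) to match the target marginal distribution."""
    y = T(x)
    return np.mean(quadratic_cost(x, y)) + lam * marginal_penalty(y)

def cycle_penalty(x, T, T_inv):
    # Cycle-consistency: T_inv(T(x)) should recover x (bijection case).
    return np.mean(quadratic_cost(x, T_inv(T(x))))

# Toy example: linear "maps" standing in for the stochastic networks.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 2))
T = lambda x: 2.0 * x + 1.0
T_inv = lambda y: (y - 1.0) / 2.0

print(ot_objective(x, T, lambda y: 0.0))  # transport cost of the toy map
print(cycle_penalty(x, T, T_inv))         # near zero: T_inv inverts T
```

With an exact inverse the cycle penalty vanishes up to floating-point error; during training, minimizing this term alongside the adversarial objective is what pushes the learned stochastic map toward a deterministic Monge map.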