Model Fusion via Optimal Transport

10/12/2019
by Sidak Pal Singh, et al.

Combining different models is a widely used paradigm in machine learning applications. While the most common approach is to form an ensemble of models and average their individual predictions, this is often infeasible under resource constraints, since memory and computation costs grow linearly with the number of models. We present a layer-wise model fusion procedure for neural networks that uses optimal transport to (soft-)align neurons across the models before averaging their associated parameters. We discuss two main algorithms for fusing neural networks in this "one-shot" manner, without requiring any retraining. Finally, we illustrate on CIFAR10 and MNIST how this significantly outperforms vanilla averaging on convolutional networks such as VGG11 and on multi-layer perceptrons, and for transfer tasks even surpasses the performance of both original models.
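The core idea, aligning corresponding neurons before averaging their weights, can be sketched in a few lines. The snippet below is an illustrative assumption rather than the authors' implementation: it performs a hard permutation alignment via the Hungarian algorithm, which is the special case of optimal transport with uniform marginals and a one-to-one matching (the paper's soft alignment would use a general transport plan instead).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def fuse_layer(w_a, w_b):
    """Fuse one layer of two models (rows = neurons' incoming weights).

    Illustrative sketch: align model B's neurons to model A's via a
    hard optimal-transport (permutation) matching, then average.
    """
    # cost[i, j] = squared Euclidean distance between neuron i of A
    # and neuron j of B
    cost = ((w_a[:, None, :] - w_b[None, :, :]) ** 2).sum(axis=-1)
    # Hungarian algorithm: minimum-cost one-to-one assignment
    _, cols = linear_sum_assignment(cost)
    # average A's weights with B's permuted (aligned) weights
    return 0.5 * (w_a + w_b[cols])
```

If model B is model A with its neurons permuted, the matching recovers the permutation and the fused layer coincides with A, which is exactly the failure mode that vanilla (unaligned) averaging gets wrong.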
