Accelerating Diffusion-based Combinatorial Optimization Solvers by Progressive Distillation

08/12/2023
by   Junwei Huang, et al.

Graph-based diffusion models have shown promising results in generating high-quality solutions to NP-complete (NPC) combinatorial optimization (CO) problems. However, these models are often inefficient at inference due to the iterative nature of the denoising diffusion process. This paper proposes using progressive distillation to speed up inference by taking fewer denoising steps (e.g., forecasting two steps ahead within a single step). Our experimental results show that the progressively distilled model can perform inference 16 times faster with only a 0.019% degradation in solution quality.
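The "two steps in one" idea can be sketched numerically. The snippet below (NumPy; a minimal illustration assuming a deterministic DDIM-style sampler and a cumulative-alpha noise schedule `alphas` indexed by timestep, following the general progressive-distillation formulation rather than the authors' graph-model code; all names are illustrative) computes the clean-signal target a student model should predict so that one student step from timestep `t` lands where the teacher lands after two steps:

```python
import numpy as np

def ddim_step(x, s, s_next, x0_hat, alphas):
    """One deterministic DDIM-style update from timestep s to s_next,
    given the model's clean-signal estimate x0_hat."""
    eps_hat = (x - np.sqrt(alphas[s]) * x0_hat) / np.sqrt(1.0 - alphas[s])
    return np.sqrt(alphas[s_next]) * x0_hat + np.sqrt(1.0 - alphas[s_next]) * eps_hat

def distillation_target(x_t, t, teacher, alphas):
    """Return the x0-target the student should predict so that ONE student
    step from t matches TWO teacher steps (t -> t-1 -> t-2)."""
    # Teacher: two deterministic denoising steps.
    x_mid = ddim_step(x_t, t, t - 1, teacher(x_t, t), alphas)
    x_prev2 = ddim_step(x_mid, t - 1, t - 2, teacher(x_mid, t - 1), alphas)
    # Invert the one-step update t -> t-2 to recover the implied clean signal.
    c = np.sqrt(1.0 - alphas[t - 2]) / np.sqrt(1.0 - alphas[t])
    return (x_prev2 - c * x_t) / (np.sqrt(alphas[t - 2]) - c * np.sqrt(alphas[t]))
```

Training the distilled student then amounts to regressing its prediction at `(x_t, t)` onto `distillation_target(x_t, t, teacher, alphas)`; repeating the procedure with the student as the new teacher halves the sampling budget each round, which is how a large speedup (such as the 16x reported above) accumulates.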


research · 02/16/2023
DIFUSCO: Graph-based Diffusion Solvers for Combinatorial Optimization
Neural network-based Combinatorial Optimization (CO) methods have shown ...

research · 02/01/2022
Progressive Distillation for Fast Sampling of Diffusion Models
Diffusion models have recently shown great promise for generative modeli...

research · 06/17/2022
Diffusion models as plug-and-play priors
We consider the problem of inferring high-dimensional data 𝐱 in a model ...

research · 07/13/2022
ProDiff: Progressive Fast Diffusion Model For High-Quality Text-to-Speech
Denoising diffusion probabilistic models (DDPMs) have recently achieved ...

research · 12/24/2021
An Efficient Combinatorial Optimization Model Using Learning-to-Rank Distillation
Recently, deep reinforcement learning (RL) has proven its feasibility in...

research · 02/18/2023
Generative Models Based on Diffusion Mechanisms (Modelos Generativos basados en Mecanismos de Difusión)
Diffusion-based generative models are a design framework that allows gen...

research · 06/13/2013
Second Order Swarm Intelligence
An artificial Ant Colony System (ACS) algorithm to solve general-purpose...
