Emulating Quantum Dynamics with Neural Networks via Knowledge Distillation

03/19/2022
by Yu Yao et al.

High-fidelity quantum dynamics emulators can be used to predict the time evolution of complex physical systems. Here, we introduce an efficient training framework for constructing machine-learning-based emulators. Our approach is based on the idea of knowledge distillation and uses elements of curriculum learning. It works by constructing a set of simple but physically rich training examples (a curriculum), from which the emulator learns the general rules governing the time evolution of a quantum system (knowledge distillation). The goal is not only to obtain high-quality predictions, but also to examine how the emulator learns the physics of the underlying problem. This allows us to discover new facts about the physical system, detect symmetries, and measure the relative importance of the contributing physical processes. We illustrate this approach by training an artificial neural network to predict the time evolution of quantum wave packets propagating through a potential landscape. We focus on how the emulator learns the rules of quantum dynamics from the curriculum of simple training examples and to what extent it can generalize the acquired knowledge to more challenging cases.
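The abstract does not specify how the curriculum of training examples is built, so the following is a minimal sketch of one plausible setup: a 1D time-dependent Schrödinger equation in natural units (hbar = m = 1), solved with a split-step Fourier propagator, with Gaussian wave packets scattering off a square barrier. All names (gaussian_packet, evolve, make_example), the barrier shape, and the parameter ranges are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

# Spatial grid for a 1D system in natural units (hbar = m = 1).
# These grid parameters are illustrative, not taken from the paper.
N = 256
L = 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)  # angular wavenumbers

def gaussian_packet(x0, k0, sigma):
    """Normalized Gaussian wave packet centered at x0 with mean momentum k0."""
    psi = np.exp(-(x - x0) ** 2 / (4 * sigma ** 2) + 1j * k0 * x)
    return psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

def evolve(psi, V, dt, steps):
    """Split-step Fourier (Strang splitting) propagator for
    i d(psi)/dt = (-0.5 d^2/dx^2 + V) psi."""
    half_potential = np.exp(-0.5j * V * dt)  # half step in the potential
    kinetic = np.exp(-0.5j * (k ** 2) * dt)  # full step in the kinetic term
    for _ in range(steps):
        psi = half_potential * psi
        psi = np.fft.ifft(kinetic * np.fft.fft(psi))
        psi = half_potential * psi
    return psi

rng = np.random.default_rng(0)

def make_example(barrier_height, dt=0.002, steps=50):
    """One (input, target) pair: a wave packet and its state after a short evolution."""
    psi0 = gaussian_packet(x0=rng.uniform(-12.0, -6.0),
                           k0=rng.uniform(1.0, 4.0),
                           sigma=rng.uniform(0.5, 1.5))
    V = barrier_height * (np.abs(x) < 1.0).astype(float)  # square barrier at the origin
    return psi0, evolve(psi0, V, dt, steps)

# Curriculum: free propagation first, then progressively taller scattering barriers.
curriculum = [make_example(h) for h in (0.0, 0.0, 2.0, 5.0, 10.0)]
```

A neural network trained on such (psi(0), psi(t)) pairs plays the role of the student, distilling the numerical solver's knowledge of the propagation rules; ordering the examples from free propagation to increasingly strong barriers mirrors the curriculum aspect described in the abstract.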
