Lowering the T-depth of Quantum Circuits by Reducing the Multiplicative Depth of Logic Networks

06/06/2020
by Thomas Häner et al.

The multiplicative depth of a logic network over the gate basis {∧, ⊕, ¬} is the largest number of ∧ gates on any path from a primary input to a primary output in the network. We describe a dynamic programming based logic synthesis algorithm to reduce the multiplicative depth in logic networks. It makes use of cut enumeration, tree balancing, and exclusive sum-of-products (ESOP) representations. Our algorithm has applications to cryptography and quantum computing, as a reduction in the multiplicative depth directly translates to a lower T-depth of the corresponding quantum circuit. Our experimental results show improvements in T-depth over state-of-the-art methods and over several hand-optimized quantum circuits for instances of AES, SHA, and floating-point arithmetic.
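As a small illustration of the quantities involved (not the paper's algorithm), the sketch below computes the multiplicative depth of a network over {∧, ⊕, ¬} and shows the effect of tree balancing: a left-associated chain of n-input AND has multiplicative depth n−1, while a balanced tree has depth ⌈log₂ n⌉. The node encoding and function names are illustrative assumptions, not from the paper.

```python
# Illustrative sketch only. Nodes are tuples (op, children) with
# op in {"AND", "XOR", "NOT", "IN"}; "IN" marks a primary input.

def mult_depth(node):
    """Multiplicative depth: max number of AND gates on any
    path from a primary input to this node."""
    op, children = node
    if op == "IN":
        return 0
    d = max(mult_depth(c) for c in children)
    return d + (1 if op == "AND" else 0)  # only AND gates count

def and_chain(inputs):
    """Left-associated AND chain: depth n - 1 for n inputs."""
    node = inputs[0]
    for x in inputs[1:]:
        node = ("AND", [node, x])
    return node

def and_balanced(inputs):
    """Balanced AND tree: depth ceil(log2 n) for n inputs."""
    if len(inputs) == 1:
        return inputs[0]
    mid = len(inputs) // 2
    return ("AND", [and_balanced(inputs[:mid]),
                    and_balanced(inputs[mid:])])

xs = [("IN", []) for _ in range(8)]
print(mult_depth(and_chain(xs)))     # 7
print(mult_depth(and_balanced(xs)))  # 3
```

Since each ∧ gate maps to a Toffoli (and hence a T-stage) in the corresponding quantum circuit, rebalancing alone already lowers T-depth; the paper's dynamic-programming approach generalizes this over enumerated cuts and ESOP forms.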


