Generalizable Cross-Graph Embedding for GNN-based Congestion Prediction

11/10/2021
by Amur Ghose, et al.

With continued technology node scaling, an accurate prediction model at early design stages can significantly shorten the design cycle. In particular, during logic synthesis, predicting cell congestion caused by improper logic combinations can reduce the burden of subsequent physical implementation. There have been attempts to apply Graph Neural Network (GNN) techniques to congestion prediction at the logic synthesis stage. However, because the core idea of GNNs rests on the message-passing framework, they require informative cell features to achieve reasonable performance, and such features are impractical to obtain at the early logic synthesis stage. To address this limitation, we propose a framework that directly learns embeddings for a given netlist to enhance the quality of the node features. Popular random-walk-based embedding methods such as Node2vec, LINE, and DeepWalk suffer from cross-graph alignment issues and poor generalization to unseen netlist graphs, yielding inferior performance and incurring significant runtime. In our framework, we introduce a superior alternative that obtains node embeddings which generalize across netlist graphs using matrix factorization methods. We also propose an efficient mini-batch training scheme at the sub-graph level that enables parallel training and satisfies the memory restrictions of large-scale netlists. We present results using open-source EDA tools, the DREAMPlace and OpenROAD frameworks, on a variety of openly available circuits. By combining the learned netlist embeddings with GNNs, our method improves prediction performance, generalizes to new circuit lines, and is efficient to train, potentially saving over 90% of runtime.

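As a rough illustration of the matrix-factorization idea outlined in the abstract, the sketch below builds spectral-style node embeddings from a symmetrically normalized netlist adjacency via truncated SVD and uses them as input features for a downstream GNN. The function name, normalization, embedding rank, and toy graph are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: netlist node embeddings via matrix factorization
# (truncated SVD of a symmetrically normalized adjacency), intended as
# input features for a congestion-prediction GNN. Names and choices here
# are assumptions for illustration, not the authors' exact method.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

def netlist_embeddings(adj: sp.csr_matrix, k: int = 32) -> np.ndarray:
    """Embed a netlist graph by factorizing its normalized adjacency.

    adj : sparse cell-to-cell adjacency derived from the netlist
    k   : embedding dimension (rank of the factorization)
    """
    # Symmetric normalization D^{-1/2} A D^{-1/2}; degrees are clipped to
    # avoid division by zero for isolated cells.
    deg = np.asarray(adj.sum(axis=1)).ravel()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(np.maximum(deg, 1.0)))
    a_norm = d_inv_sqrt @ adj @ d_inv_sqrt

    # Truncated SVD keeps only the top-k singular pairs. Because the
    # factorization depends only on graph structure (no stochastically
    # learned per-node identities), the same procedure can be applied to
    # unseen netlists.
    u, s, _ = svds(a_norm.astype(np.float64), k=k)
    # Scale left singular vectors by the square roots of the singular
    # values, as in standard spectral embeddings.
    return u * np.sqrt(s)

# Toy 4-cell netlist; real netlists would be partitioned and processed in
# sub-graph mini-batches to respect memory limits during training.
rows = np.array([0, 1, 1, 2, 2, 3])
cols = np.array([1, 0, 2, 1, 3, 2])
toy_adj = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(4, 4))
features = netlist_embeddings(toy_adj, k=2)  # shape (4, 2), fed to the GNN
```

In practice, as the abstract notes, large-scale netlists would be handled at the sub-graph level with mini-batch training so that the embedding and GNN stages fit in memory and can be parallelized.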

Related research

07/23/2022 · The prediction of the quality of results in Logic Synthesis using Transformer and Graph Neural Networks
In the logic synthesis stage, structure transformations in the synthesis...

11/30/2022 · Towards Training GNNs using Explanation Directed Message Passing
With the increasing use of Graph Neural Networks (GNNs) in critical real...

09/19/2022 · Revisiting Embeddings for Graph Neural Networks
Current graph representation learning techniques use Graph Neural Networ...

10/27/2021 · VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization
Most state-of-the-art Graph Neural Networks (GNNs) can be defined as a f...

01/18/2022 · High-Level Synthesis Performance Prediction using GNNs: Benchmarking, Modeling, and Advancing
Agile hardware development requires fast and accurate circuit quality ev...

08/07/2023 · Imbalanced Large Graph Learning Framework for FPGA Logic Elements Packing Prediction
Packing is a required step in a typical FPGA CAD flow. It has high impac...

08/01/2023 · Variational Label-Correlation Enhancement for Congestion Prediction
The physical design process of large-scale designs is a time-consuming t...
