BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection

03/29/2023
by Sihao Hu, et al.

As various forms of fraud proliferate on Ethereum, it is imperative to safeguard against these malicious activities and protect susceptible users from being victimized. Current studies rely solely on graph-based fraud detection approaches, which we argue are ill-suited to the highly repetitive, skew-distributed, and heterogeneous nature of Ethereum transactions. To address these challenges, we propose BERT4ETH, a universal pre-trained Transformer encoder that serves as an account representation extractor for detecting various fraud behaviors on Ethereum. BERT4ETH leverages the strong modeling capability of the Transformer to capture the dynamic sequential patterns inherent in Ethereum transactions, and addresses the challenges of pre-training a BERT model for Ethereum with three practical and effective strategies: repetitiveness reduction, skew alleviation, and heterogeneity modeling. Our empirical evaluation demonstrates that BERT4ETH significantly outperforms state-of-the-art methods on both phishing account detection and de-anonymization tasks. The code for BERT4ETH is available at: https://github.com/git-disl/BERT4ETH.
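The abstract does not spell out how the pre-training strategies are realized, but the overall recipe — treating an account's ordered counterparty addresses as a token sequence and pre-training with BERT-style masked prediction — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `build_sequence` encodes one plausible reading of "repetitiveness reduction" (collapsing consecutive repeats of the same counterparty), and `mask_for_mlm` is standard masked-language-model masking; both function names are hypothetical.

```python
import random

def build_sequence(counterparties):
    """Collapse consecutive repeats of the same counterparty address.
    A hypothetical interpretation of 'repetitiveness reduction': highly
    repetitive transactions add little signal, so adjacent duplicates
    are merged into a single token."""
    seq = []
    for addr in counterparties:
        if not seq or seq[-1] != addr:
            seq.append(addr)
    return seq

def mask_for_mlm(seq, mask_ratio=0.15, mask_token="[MASK]", seed=0):
    """BERT-style masked-language-model objective: hide a fraction of
    tokens; the Transformer is trained to recover them from context.
    Returns the masked sequence and per-position labels (None where
    no prediction is required)."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in seq:
        if rng.random() < mask_ratio:
            masked.append(mask_token)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

# Usage: an account that repeatedly transacts with 0xA yields a
# compact sequence, which is then masked for pre-training.
history = ["0xA", "0xA", "0xA", "0xB", "0xC", "0xC", "0xA"]
tokens = build_sequence(history)           # ["0xA", "0xB", "0xC", "0xA"]
masked, labels = mask_for_mlm(tokens, mask_ratio=0.3, seed=1)
```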


