Identity Inference on Blockchain using Graph Neural Network

04/14/2021 · Jie Shen, et al.

The anonymity of blockchain has accelerated the growth of illegal activities and criminal behaviors on cryptocurrency platforms. Although decentralization is one of the typical characteristics of blockchain, effective regulation is urgently needed to detect these illegal behaviors and ensure the safety and stability of user transactions. Identity inference, which aims to make a preliminary inference about account identity, plays a significant role in blockchain security. As a common tool, graph mining techniques can effectively represent the interactive information between accounts and be used for identity inference. However, existing methods cannot balance scalability and end-to-end architecture, resulting in high computational consumption and weak feature representation. In this paper, we present a novel approach that analyzes user behavior from the perspective of the transaction subgraph, which naturally transforms the identity inference task into a graph classification pattern and effectively avoids computation on the large-scale graph. Furthermore, we propose a generic end-to-end graph neural network model, named I²BGNN, which accepts subgraphs as input and learns a function mapping the transaction subgraph pattern to account identity, achieving de-anonymization. Extensive experiments on the EOSG and ETHG datasets demonstrate that the proposed method achieves state-of-the-art performance in identity inference.


1 Introduction

As a distributed database technology, blockchain achieves decentralization, encryption, and tamper-proofing. Benefiting from its anonymity, the past few years have witnessed the growing prevalence of cryptocurrencies. As of the first quarter of 2021, there are more than 8,700 cryptocurrencies with a total market capitalization of 1,721 billion dollars (https://coinmarketcap.com/, https://www.feixiaohao.com/). People only need to create a pseudonymous account (synonymous with address in this paper), and they can carry out transactions at almost no cost. However, as transaction volume has surged, the blockchain systems of cryptocurrencies have also become a hotbed of illegal and criminal behavior, such as various scams [1, 2, 3] (Ponzi schemes, mining scams, scam wallets, fraudulent exchanges, etc.), money laundering [4, 5], abusing bot accounts [6], and vulnerability attacks [7].

As an open technique, blockchain provides public and tamper-proof transaction records, which creates the conditions for data mining and analysis. Recently, a growing body of research has analyzed transaction patterns and account behavior on blockchain systems, mostly by means of graph modeling: market evolution analysis via the on-chain transaction graph [8, 9, 10, 11, 12], transaction pattern recognition via graph topology and motifs [13, 14], and detection of abnormal users or transactions via graph embedding or graph neural networks [15, 16], among others. Among these tasks, identity inference, which can be regarded as a de-anonymization process, is particularly important in blockchain data mining. Generally, identity inference aims to make a preliminary inference about an account's identity by capturing the characteristics of its transaction patterns. For this task, existing research mainly concentrates on manual feature engineering, including transaction features [17], graph features [18], and external features [14]. These features mostly encode intuitive information and share the same drawback of weak representation ability for classification. Further, several methods based on random walks [19] and graph motifs [20] capture higher-order network features that are more expressive. With the development of graph deep learning, the graph convolution network (GCN) has attracted considerable attention and has gradually been applied to identity inference, achieving outstanding results [21, 22].

After reviewing the above methods, we identify two conflicting issues: scalability and end-to-end architecture. On the one hand, real-world transaction data on blockchain systems are generally extremely large. Although methods based on feature engineering, especially manual features, show good scalability because feature extraction is independent of model training, they cannot achieve an end-to-end architecture. An end-to-end architecture reduces the reliance on expert knowledge, which is the core of feature engineering, and optimizes the target task as a whole rather than in separate stages. On the other hand, although graph convolution networks are typically trained end-to-end, most of them scale poorly, because training is usually performed on the whole transaction graph, whose loading and computation are impractical.

Motivated by the subgraph perspective [15], we propose a framework that reconciles scalability and an end-to-end solution for identity inference. Benefiting from previous work, we collect two kinds of on-chain transaction data, from Ethereum and EOSIO, to infer “phisher” and “bot” accounts, respectively. Firstly, we extract the transaction subgraph of each labeled account via a sampling mechanism; in this way, each account is transformed into an independent transaction subgraph. The sampling mechanism constrains the scale of the transaction subgraph, which effectively reduces resource consumption. Secondly, we propose an end-to-end model for Identity Inference on Blockchain using Graph Neural Networks, named I²BGNN.

The rest of the paper is organized as follows. In Sec. 2, we introduce the related work on identity inference. In Sec. 3, we describe the details of our framework, including subgraph extraction and the architecture of I²BGNN. Sec. 4 presents the experiment settings and the comparison of experimental results with discussion. Finally, we conclude the paper in Sec. 5.

2 Related Work

Identity inference, which aims to detect abnormal and illegal accounts, has become an effective means for platforms to monitor accounts and for users to measure transaction risks. For identity inference on blockchain, related work concentrates on manual features, graph embedding, graph neural networks, and other approaches.

2.0.1 Manual Feature

Manual feature construction is a kind of feature engineering that relies heavily on the experience of experts; normally, the more expert experience involved, the more reliable the feature vectors are. Lin et al. [23] designed various features based on transaction timestamps to express the transaction history of accounts and constructed a classifier against abnormal Bitcoin addresses. Li et al. [17] considered three kinds of features: basic account features, topological features related to transaction patterns, and temporal features captured from the distributions of transaction timestamps. In addition to transaction information, Huang et al. [6] also considered the calling information of smart contracts to expand the feature space, and finally realized the identification of bot accounts in EOSIO.

2.0.2 Graph Embedding

Graph embedding aims to learn low-dimensional node representations that capture the graph structure and drive downstream graph mining tasks, such as node classification, to identify illicit accounts. To date, a series of methods based on DeepWalk (DW) [24] have been used to detect such accounts. Yuan et al. [16] used the Node2Vec algorithm, a variant of DW, to extract the latent features of accounts and classified phishers with a Support Vector Machine (SVM). Wu et al. [19] redesigned the walking strategy using transaction volume, timestamps, and multi-edge features to make their embedding framework more suitable for this task. Subsequently, Yuan et al. [15] extracted the subgraphs of each target account and embedded their transaction topology into feature vectors via an embedding method named Graph2Vec [25]. Besides, they introduced the line graph [26] to further enhance the network structure embedding. Chen et al. [27] also adopted the subgraph mechanism, obtained embeddings in an unsupervised way with a graph convolution layer combined with a graph auto-encoder, and classified phishers with LightGBM [28].

2.0.3 Graph Neural Network

This part mainly concerns graph neural networks with end-to-end architectures. In [22], the whole transaction graph was sliced into small graphs by timestamp, which reduced the computational complexity and memory consumption and thus alleviated the scalability problem; a graph convolution network was then used for inductive learning to realize account identity inference. Tam et al. [21] used a transaction-neighbor sampling mechanism similar to subgraph extraction. They characterized edges by embedding the temporal features of the transaction time series and incorporated them into the graph convolution network.

2.0.4 Others

Besides the aforementioned methods, there are other frameworks for identity inference. Phetsouvanh et al. [29] proposed a graph mining technique to detect suspicious Bitcoin flows and accounts by analyzing the path lengths and confluence accounts of directed subgraphs. Zhang et al. [30] introduced the concept of meta-paths from heterogeneous networks and constructed multi-constrained meta-paths based on time, attribution, and topology, which is an effective way to capture behavior pattern features in a complex network.

3 Method

In this section, we first define the identity inference problem on blockchain and then present the details of the subgraph extraction used to construct the graph classification dataset. Finally, we review the use of graph neural networks (GNNs) for learning node and graph representations and present the details of the proposed I²BGNN model for identity inference.

3.1 Problem Definition

From the perspective of graph mining, identity inference can be regarded as a node classification task. For node classification, the blockchain data would be modeled as a user network with millions of nodes, which results in unaffordable time and memory consumption for most practical algorithms. Inspired by the core idea of “neighborhood aggregation” in graph neural networks, we transform the node classification problem into a graph classification task in return for lower time and memory consumption.

Given a set of labeled accounts $\mathcal{T}$, we can extract the transaction subgraph centered on each target account. Specifically, we extract the transaction subgraph of account $t_i \in \mathcal{T}$ as $g_i = (V_i, E_i^{v}, E_i^{f}, C_i, y_i)$, where $V_i$ represents the set of accounts in this subgraph, $E_i^{v}$ and $E_i^{f}$ represent the directed edge sets that carry transaction volume and transaction frequency information respectively, $C_i$ represents the smart contract calling information, and $y_i$ is the label of subgraph $g_i$. Note that we assign the label of an account to the transaction subgraph centered on it, and thereby transform the node classification problem into a graph classification task: learning a mapping $f: g_i \rightarrow y_i$. The final goal is to learn the transaction patterns of the subgraphs and classify the centered accounts (e.g., as phishing or non-phishing) via graph neural networks. The workflow of our framework is shown in Fig. 1, and the details of subgraph extraction and classifier design are introduced in Subsections 3.2 and 3.3, respectively.

Figure 1: The schematic depiction of our framework. The complete workflow proceeds as follows: a) modeling the transaction network; b) sampling the labeled accounts; c) extracting the subgraphs centered on target accounts; d) training and evaluating using GNNs.

3.2 Subgraph Extraction

For each account in $\mathcal{T}$, we first check the number of its transaction partners (i.e., neighbor nodes). Here, an upper limit on the neighbor size, denoted as $K$, is set to control the scale of the transaction subgraph. If the neighbor size of an account is less than this threshold, all neighbors and all transactions between them are extracted. Otherwise, we calculate and sort the total transaction volume between the target account and each of its neighbors, and select the top-$K$ neighbors. We assume that the larger the transaction volume, the stronger the correlation between the two accounts. The same extraction mechanism is also used to sample the $h$-hop transaction neighbors from the $(h-1)$-hop neighbors. Therefore, the scale of the one-order / two-order subgraph of a target account will normally not exceed $K+1$ / $K^2+K+1$ nodes.
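As a minimal illustrative sketch (not the implementation used in this paper), the volume-ranked top-K sampling described above could be written as follows; the networkx graph type, the `volume` edge attribute, the function names, and the default value of K are assumptions.

```python
import networkx as nx

def total_volume(G, u, v):
    """Total transaction volume between u and v, summed over both directions."""
    vol = 0.0
    if G.has_edge(u, v):
        vol += G[u][v].get("volume", 0.0)
    if G.has_edge(v, u):
        vol += G[v][u].get("volume", 0.0)
    return vol

def extract_subgraph(G, target, k=20, hops=2):
    """Top-K neighbor sampling: keep at most k partners per node,
    ranked by total transaction volume, out to `hops` hops."""
    kept = {target}
    frontier = {target}
    for _ in range(hops):
        next_frontier = set()
        for node in frontier:
            partners = (set(G.successors(node)) | set(G.predecessors(node))) - kept
            if len(partners) > k:
                partners = set(sorted(partners,
                                      key=lambda p: total_volume(G, node, p),
                                      reverse=True)[:k])
            next_frontier |= partners
        kept |= next_frontier
        frontier = next_frontier
    # The induced subgraph keeps all transactions among the sampled accounts.
    return G.subgraph(kept).copy()

# Usage (hypothetical): sub_g = extract_subgraph(tx_graph, "0xabc...", k=20, hops=2)
```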

The above process constructs the account (node) set $V_i$, the transaction (edge) volume set $E_i^{v}$, and the transaction (edge) frequency set $E_i^{f}$. Next, we briefly introduce the two datasets, named ETHG and EOSG, and construct feature matrices for them.

  • ETHG It is from Xblock (http://xblock.pro/), a blockchain data platform for academic research. It provides an account list containing 1,660 phisher accounts and 1,700 non-phisher accounts together with their 2-hop transaction records in Ethereum. Based on this, we filter the Contract Accounts (CAs), whose calls are treated as the contract calling features of the Externally Owned Accounts (EOAs). According to the block span of the transaction records, we collect all CAs from block 0 to block 10,000,000, filter them by calling amount, and finally retain the top 14,885. After that, we construct the contract calling (cc) feature matrix $X^{cc}$, in which each EOA has a 14,885-dimensional vector representing its calls to those CAs.

  • EOSG It is collected by [6], which integrates and models the on-chain data of EOSIO: the Enhanced Money Flow Graph (EMFG), which contains the transactions between accounts including timestamps and volumes; the Enhanced Account Creation Graph (EACG), which contains the account creation tree; the Enhanced Contract Invocation Graph (ECIG), which contains smart contract calling data; and a list of labeled accounts containing 229,907 normal accounts and 63,863 bot-like accounts. Similarly, we extract the subgraphs and contract calling features from the EMFG and ECIG respectively, and construct the contract calling (cc) feature matrix $X^{cc}$. Further, we consider the account name restriction mechanism of EOSIO and add three kinds of node labels to expand the features, since the types of neighbors can also express the transaction pattern of an account. The three labels are the general account, which consists of 12 characters; the auction account, which has fewer than 12 characters and does not contain the character '.'; and the sub-account of an auction account, which combines '.' with the auction account name as the suffix. In addition, neighbor extraction stops at system accounts, whose names begin with 'EOSIO.', because the behavior pattern of the current center account has little to do with transactions between more distant accounts and system accounts. We construct the node label (nl) feature matrix $X^{nl}$.

In summary, the feature matrix is $X = X^{cc}$ for ETHG and $X = X^{cc} \| X^{nl}$ for EOSG, where $\|$ is the concatenation operation.
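As a toy illustration of this concatenation (the array shapes below are placeholders, not the real feature dimensions):

```python
import numpy as np

# Placeholder shapes for illustration only.
num_accounts, d_cc, d_nl = 5, 8, 3
X_cc = np.zeros((num_accounts, d_cc))   # contract calling (cc) features
X_nl = np.zeros((num_accounts, d_nl))   # EOSIO name-based node labels (nl)

# EOSG node features: X = X_cc || X_nl, concatenated along the feature axis.
X = np.concatenate([X_cc, X_nl], axis=1)
print(X.shape)   # (5, 11)
```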

3.3 Graph Neural Networks

By viewing the account interactions as graph data (i.e., a transaction graph), recent deep learning methods for graph-structured data, such as graph neural networks (GNNs) [31, 32, 33], can be utilized to learn transaction pattern representations that are fed to downstream machine learning models for phishing account detection. In this section, we present the details of employing GNNs to obtain transaction pattern representations.

GNNs learn the representations of nodes by leveraging both the graph structure and the node/edge features. This is done by a neighborhood aggregation function that iteratively takes the representations of all neighbors together with the graph structure as input and outputs the aggregated representation of the target node. The aggregation function can be defined using a graph convolution layer [31], a graph attention layer [32], or any general message passing layer [33]. Formally, a graph convolution network (GCN) model follows the rule below to aggregate the features of neighbors:

$H^{(l+1)} = \sigma\left(\hat{A} H^{(l)} W^{(l)}\right)$   (1)

where $H^{(l)}$ is the matrix containing the $d_l$-dimensional representations of the nodes at the $l$-th layer, $W^{(l)}$ is the trainable weight matrix of that layer, $\sigma(\cdot)$ is the activation function (typically ReLU), and $\hat{A}$ is a symmetric normalization of the adjacency matrix $A$, defined as:

$\hat{A} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}$   (2)

where $\tilde{A} = A + I$ is the adjacency matrix of the graph with self-connections added, and $\tilde{D}$ is the diagonal degree matrix with $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$. After $L$ layers of computation, the node representations $H^{(L)}$ are able to capture the information within their $L$-hop neighborhoods.
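A minimal NumPy sketch of the propagation rule in Eqs. (1) and (2), with toy shapes; this is only a numerical illustration, not the model implementation used later.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU( D̃^{-1/2} (A + I) D̃^{-1/2} H W )."""
    A_tilde = A + np.eye(A.shape[0])            # add self-connections
    d = A_tilde.sum(axis=1)                     # degrees of the augmented graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D̃^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalization, Eq. (2)
    return np.maximum(A_hat @ H @ W, 0.0)       # Eq. (1) with ReLU activation

# Toy example: 4 nodes, 6-dimensional input features, 8 hidden units.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)       # symmetric adjacency matrix
H0 = rng.random((4, 6))                         # initial node representations
W0 = rng.random((6, 8))                         # layer weights
print(gcn_layer(A, H0, W0).shape)               # -> (4, 8)
```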

Generally, the GCN model is used to learn node representations in semi-supervised node classification. A 2-layer GCN model with a softmax function can be formulated as:

$Z = \mathrm{softmax}\left(\hat{A}\,\mathrm{ReLU}\left(\hat{A} X W^{(0)}\right) W^{(1)}\right)$   (3)

where $Z$ is the prediction probability distribution and its column dimension is the number of node label classes. $W^{(0)}$ and $W^{(1)}$ are the input-to-hidden and hidden-to-output weight matrices, respectively.

Figure 2: The architecture of the I²BGNN model. a) The adjacency matrix and the node feature matrix obtained from subgraph extraction are fed into the graph convolution network; b) the max-pooling layer compresses the aggregated node representations to obtain the whole-graph representation; c) the graph representation is used to predict the subgraph (account) label.

3.4 I²BGNN

We now present the details of the proposed I²BGNN for identity inference on blockchain. For graph classification, a pooling operation aggregates the node representations from the final iteration to obtain the representation of the whole graph. By stacking a pooling layer and a fully-connected layer after the 2-layer GCN, the basic graph classification model for identity inference can be constructed as follows:

$\hat{y} = \mathrm{softmax}\left(\mathrm{MaxPool}\left(\hat{A}\,\mathrm{ReLU}\left(\hat{A} X W^{(0)}\right) W^{(1)}\right) W^{(2)}\right)$   (4)

where $\mathrm{MaxPool}$ takes the element-wise maximum over the node dimension to produce the whole graph's representation, and $W^{(2)}$ is the weight matrix of the fully-connected output layer. Note that we use max pooling to obtain the whole graph's representation. The model architecture of I²BGNN is shown in Fig. 2. In a transaction subgraph, each node represents an account and each directed edge represents a transaction flow carrying transaction volume and frequency information. For the input layer of the GCN, we first initialize the node representations with their attributes in the transaction subgraph. Specifically, the node attributes include the contract calling information (cc) and the distinctive node labels (nl), as mentioned in Sec. 3.2, and we initialize the node representations as $H^{(0)} = X$.
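The following PyTorch Geometric sketch shows how a model of this shape (two GCN layers, global max pooling, and a fully-connected output layer, with the hidden size and dropout from Sec. 4.3) could be assembled. It is an illustrative re-implementation under those assumptions, not the authors' released code; the class name and the `edge_weight` argument are placeholders.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_max_pool

class I2BGNNSketch(torch.nn.Module):
    """Illustrative I²BGNN-style classifier: 2-layer GCN + max pooling + FC."""

    def __init__(self, in_dim, hidden_dim=128, num_classes=2, dropout=0.3):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.fc = torch.nn.Linear(hidden_dim, num_classes)
        self.dropout = dropout

    def forward(self, x, edge_index, edge_weight, batch):
        # Two rounds of neighborhood aggregation, Eq. (1); edge weights are
        # assumed to come from transaction volume or frequency.
        h = F.relu(self.conv1(x, edge_index, edge_weight))
        h = F.dropout(h, p=self.dropout, training=self.training)
        h = F.relu(self.conv2(h, edge_index, edge_weight))
        # Max pooling over the nodes of each subgraph -> whole-graph representation.
        g = global_max_pool(h, batch)
        # Fully-connected layer maps the graph representation to identity logits.
        return self.fc(g)
```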

Dataset   |G|    avg.|V|   avg.|E_d|   avg.|E_u|   d       C   Label bias
ETHG      3266   80        239         222         14885   2   0.99
EOSG      2000   260       4250        3212        1216    2   1
Table 1: Dataset properties. |G| is the number of subgraphs in the dataset, avg.|V| is the average number of nodes per graph, avg.|E_d| is the average number of edges per directed graph, avg.|E_u| is the average number of edges per undirected graph transformed from the corresponding directed graph, d is the dimension of the node features, and C is the number of label classes.

4 Experiment

4.1 Dataset

For the EOSG dataset, we filter the labeled account list in terms of subgraph size and obtain over 20,000 available accounts. Then, 1,000 accounts per label are selected randomly for the follow-up experiments. The detailed dataset properties are given in Table 1. Finally, each dataset is split into training and testing sets with a proportion of 1:1, and the split is repeated 3 times using different random seeds. We report the average accuracy across all trials.

4.2 Baseline

Since we implement identity inference as a graph classification task, we compare our framework with several state-of-the-art graph classification algorithms, including SF [34], Graph2vec [25], Netlsd [35], and FGSD [36]. The first two are graph embedding methods and the last two are graph kernel methods. Graph2vec extends document embedding methods to graph classification and learns a distributed representation of the whole graph via document embedding neural networks. SF performs graph classification by spectral decomposition of the graph Laplacian, i.e., it relies on the spectral features of the graph. Netlsd performs graph classification by extracting compact graph signatures that inherit the formal properties of the Laplacian spectrum. FGSD calculates the Moore-Penrose spectrum of the normalized Laplacian and uses a histogram of the spectral features of this spectrum to represent the whole graph.

4.3 Experiment setting

During subgraph extraction, the direction of an edge in the subgraph is determined by the transaction flow. However, during the experiments, we find that the symmetric adjacency matrix of a subgraph usually outperforms the directed adjacency matrix. Therefore, we transform the directed adjacency matrix into a symmetric one by adding its transpose to itself. Other settings for the models and datasets are as follows:
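Concretely, the symmetrization amounts to the following (the weights here are toy values, not real transaction data):

```python
import numpy as np

# Hypothetical directed adjacency matrix of a 3-account subgraph,
# weighted by transaction volume.
A_dir = np.array([[0.0, 2.5, 0.0],
                  [0.0, 0.0, 1.0],
                  [4.0, 0.0, 0.0]])

# Add the transpose so each undirected edge carries the volume of both directions.
A_sym = A_dir + A_dir.T
```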

4.3.1 Method settings

For all four baseline methods, we set the embedding dimension to 128 and use default settings for the other parameters. Further, we implement graph classification with the following machine learning classifiers: a Support Vector Machine (SVM) with a radial basis kernel, a k-Nearest Neighbors classifier (KNN), and a Random Forest classifier (RF). As for I²BGNN, we apply two GCN layers with output dimensions both equal to 128, and set the maximum number of epochs to 50, the batch size to 30, and the dropout rate to 0.3.
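A hypothetical training loop with these settings might look as follows; the Adam optimizer, the learning rate, and the `edge_weight` attribute name are assumptions not stated in the paper.

```python
import torch
import torch.nn.functional as F
from torch_geometric.loader import DataLoader

def train(model, train_set, epochs=50, batch_size=30, lr=1e-3):
    """Train an I²BGNN-style model on a list of torch_geometric.data.Data
    subgraphs (each assumed to store x, edge_index, edge_weight, and y)."""
    loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for batch in loader:
            optimizer.zero_grad()
            logits = model(batch.x, batch.edge_index, batch.edge_weight, batch.batch)
            loss = F.cross_entropy(logits, batch.y)
            loss.backward()
            optimizer.step()
    return model
```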

Method      Classifier    EOSG                               ETHG
                          F1       Precision   Recall        F1       Precision   Recall
Graph2vec   SVM           0.8223   0.8132      0.8317        0.6487   0.7564      0.5678
            KNN           0.6171   0.9820      0.4499        0.5705   0.5709      0.5701
            RF            0.7637   0.8155      0.7180        0.6104   0.7355      0.5216
SF          SVM           0.9428   0.9222      0.9643        0.6287   0.6405      0.6173
            KNN           0.9089   0.9081      0.9098        0.6238   0.6401      0.6083
            RF            0.9333   0.9166      0.9507        0.6908   0.7056      0.6766
Netlsd      SVM           0.8730   0.8574      0.8891        0.7067   0.6869      0.7276
            KNN           0.8406   0.8417      0.8396        0.6774   0.6884      0.6667
            RF            0.8845   0.8625      0.9077        0.6702   0.6782      0.6623
FGSD        SVM           0.9617   0.9534      0.9701        0.7206   0.6810      0.7650
            KNN           0.9469   0.9404      0.9534        0.7161   0.6750      0.7625
            RF            0.9578   0.9579      0.9578        0.7372   0.7448      0.7297
I²BGNN-v                  0.9940   0.9894      0.9986        0.8587   0.8190      0.9024
I²BGNN-t                  0.9950   0.9917      0.9983        0.8600   0.8697      0.8505
Table 2: Results of identity inference. The top-2 best results are highlighted in bold.

4.3.2 Metric settings

Both datasets have two classes, so we evaluate the binary classification results with precision, recall, and F1-score.

4.4 Result and discussion

4.4.1 Inference Performance

Table 2 reports the performance comparison between I²BGNN and the baselines, from which we can observe that I²BGNN significantly outperforms the other methods on both datasets. Specifically, compared with the baselines, our I²BGNN achieves an average improvement of 12% / 19% in terms of F1 on EOSG / ETHG. This may be due to the strong expressive ability of the graph convolution layers and the effectiveness of the features constructed from the contract calling information. In addition, the graph embedding and kernel methods based on spectral analysis perform significantly better than Graph2vec. Their advantage is also reflected in runtime efficiency during the experiments.

Furthermore, we investigate the influence of neighborhood depth and data division on the experimental results under various settings.

4.4.2 The influence of neighborhood depth

Normally, a subgraph containing 3-hop neighbors will have a large scale, which makes feature learning difficult. To further analyze the influence of different depths of subgraph extraction, we extract 1-hop and 2-hop neighbors to construct the subgraphs for each target account. Table 3 shows the properties of the 1-order and 2-order subgraphs for the two datasets, and Table 4 reports the performance comparison between 1-order and 2-order subgraphs using I²BGNN. For EOSG, I²BGNN with 1-order subgraphs performs slightly better than with 2-order subgraphs. The 1-order transaction subgraph already contains sufficient and effective characteristics of transaction behavior, while the larger scale of the 2-order subgraph leads to information redundancy. For ETHG, the situation is the opposite: I²BGNN with 2-order subgraphs outperforms that with 1-order subgraphs. Here, the 1-order subgraph contains sparse transaction information, which is not conducive to inference, while the denser interactions in the 2-order subgraph facilitate behavior analysis and identity inference.

Dataset   Subgraph   avg.|V|   avg.|E_d|   avg.|E_u|
EOSG      1-order    17        65          48
          2-order    260       4250        3212
ETHG      1-order    10        13          12
          2-order    80        239         222
Table 3: The properties of 1-hop and 2-hop subgraphs.
Dataset   Method      1-order   2-order
EOSG      I²BGNN-v    0.9960    0.9940
          I²BGNN-t    0.9980    0.9950
ETHG      I²BGNN-v    0.8356    0.8587
          I²BGNN-t    0.8366    0.8600
Table 4: Results of I²BGNN with different neighborhood depths.
Figure 3: Experimental results of different division ratios of the dataset on EOSG (left) and ETHG (right).

4.4.3 The influence of data split

Next, we analyze the sensitivity of the models to different data split ratios. Specifically, we vary the ratio of the training set to the testing set. Fig. 3 reports the inference results (F1) of the different models with various proportions of training data. For all ratios, our I²BGNN holds the best performance compared with the other graph classification models. In addition, as the amount of training data increases, the performance of all models naturally improves.

5 Conclusion

Traditional graph mining methods for identity inference are stuck in a dilemma: it is difficult to integrate scalability and an end-to-end architecture into one model. In this work, we balance scalability and end-to-end architecture in our model design. Specifically, we propose to learn the transaction subgraph centered on each target account and transform the identity inference task on blockchain into a graph classification task, resulting in a great reduction in resource consumption. Moreover, we design an end-to-end model, I²BGNN, which is capable of learning an effective graph representation. Finally, we conduct extensive experiments on two real blockchain datasets (EOSG and ETHG) to demonstrate the effectiveness of the proposed I²BGNN. The experimental results show that the transaction patterns hidden in subgraphs can indeed reveal account behavior, and our I²BGNN achieves outstanding performance in identity inference.

5.0.1 Acknowledgements.

The authors would like to thank all the members in the IVSN Research Group, Zhejiang University of Technology for the valuable discussions about the ideas and technical details presented in this paper. This work was partially supported by the National Key R&D Program of China under Grant No. 2020YFB1006104, by the National Natural Science Foundation of China under Grant No. 61973273, by the Zhejiang Provincial Natural Science Foundation of China under Grant No. LR19F030001, by the Ministry of Public Security’s Research Project “Research and Demonstration Application of Key Technologies of Criminal Social Network Model”, and by the Special Scientific Research Fund of Basic Public Welfare Profession of Zhejiang Province under Grant LGF20F020016.

References

  • [1] Marie Vasek and Tyler Moore. There’s no free lunch, even using bitcoin: Tracking the popularity and profits of virtual currency scams. In International conference on financial cryptography and data security, pages 44–61. Springer, 2015.
  • [2] Jiajing Wu, Dan Lin, Zibin Zheng, and Qi Yuan. T-edge: Temporal weighted multidigraph embedding for ethereum transaction network analysis. arXiv preprint arXiv:1905.08038, 2019.
  • [3] Weili Chen, Zibin Zheng, Jiahui Cui, Edith Ngai, Peilin Zheng, and Yuren Zhou. Detecting ponzi schemes on ethereum: Towards healthier blockchain technology. In Proceedings of the 2018 World Wide Web Conference, pages 1409–1418, 2018.
  • [4] Danton Bryans. Bitcoin and money laundering: mining for an effective solution. Ind. LJ, 89:441, 2014.
  • [5] Yaya Fanusie and Tom Robinson. Bitcoin laundering: an analysis of illicit flows into digital currency services. Center on Sanctions and Illicit Finance memorandum, January, 2018.
  • [6] Yuheng Huang, Haoyu Wang, Lei Wu, Gareth Tyson, Xiapu Luo, Run Zhang, Xuanzhe Liu, Gang Huang, and Xuxian Jiang. Understanding (mis) behavior on the eosio blockchain. Proceedings of the ACM on Measurement and Analysis of Computing Systems, 4(2):1–28, 2020.
  • [7] Damiano Di Francesco Maesa, Andrea Marino, and Laura Ricci. Detecting artificial behaviours in the bitcoin users graph. Online Social Networks and Media, 3:63–74, 2017.
  • [8] Dániel Kondor, Márton Pósfai, István Csabai, and Gábor Vattay. Do the rich get richer? an empirical analysis of the bitcoin transaction network. PloS one, 9(2):e86197, 2014.
  • [9] Israa Alqassem, Iyad Rahwan, and Davor Svetinovic. The anti-social system properties: Bitcoin network data analysis. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 50(1):21–31, 2018.
  • [10] Paolo Tasca, Adam Hayes, and Shaowen Liu. The evolution of the bitcoin economy: extracting and analyzing the network of payment relationships. The Journal of Risk Finance, 2018.
  • [11] Qianlan Bai, Chao Zhang, Yuedong Xu, Xiaowei Chen, and Xin Wang. Evolution of ethereum: a temporal graph perspective. arXiv preprint arXiv:2001.05251, 2020.
  • [12] Stefano Ferretti and Gabriele D’Angelo. On the ethereum blockchain structure: A complex networks theory perspective. Concurrency and Computation: Practice and Experience, 32(12):e5493, 2020.
  • [13] Butian Huang, Zhenguang Liu, Jianhai Chen, Anan Liu, Qi Liu, and Qinming He. Behavior pattern clustering in blockchain networks. Multimedia Tools and Applications, 76(19):20099–20110, 2017.
  • [14] Stephen Ranshous, Cliff A Joslyn, Sean Kreyling, Kathleen Nowak, Nagiza F Samatova, Curtis L West, and Samuel Winters. Exchange pattern mining in the bitcoin transaction directed hypergraph. In International Conference on Financial Cryptography and Data Security, pages 248–263. Springer, 2017.
  • [15] Zihao Yuan, Qi Yuan, and Jiajing Wu. Phishing detection on ethereum via learning representation of transaction subgraphs. In International Conference on Blockchain and Trustworthy Systems, pages 178–191. Springer, 2020.
  • [16] Qi Yuan, Baoying Huang, Jie Zhang, Jiajing Wu, Haonan Zhang, and Xi Zhang. Detecting phishing scams on ethereum based on transaction records. In 2020 IEEE International Symposium on Circuits and Systems (ISCAS), pages 1–5. IEEE, 2020.
  • [17] Yang Li, Yue Cai, Hao Tian, Gengsheng Xue, and Zibin Zheng. Identifying illicit addresses in bitcoin network. In International Conference on Blockchain and Trustworthy Systems, pages 99–111. Springer, 2020.
  • [18] Thai Pham and Steven Lee. Anomaly detection in the bitcoin system-a network perspective. arXiv preprint arXiv:1611.03942, 2016.
  • [19] Jiajing Wu, Qi Yuan, Dan Lin, Wei You, Weili Chen, Chuan Chen, and Zibin Zheng. Who are the phishers? phishing scam detection on ethereum via network embedding. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2020.
  • [20] Jiajing Wu, Jieli Liu, Weili Chen, Huawei Huang, Zibin Zheng, and Yan Zhang. Detecting mixing services via mining bitcoin transaction network with hybrid motifs. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2021.
  • [21] Da Sun Handason Tam, Wing Cheong Lau, Bin Hu, Qiu Fang Ying, Dah Ming Chiu, and Hong Liu. Identifying illicit accounts in large scale e-payment networks–a graph representation learning approach. arXiv preprint arXiv:1906.05546, 2019.
  • [22] Mark Weber, Giacomo Domeniconi, Jie Chen, Daniel Karl I Weidele, Claudio Bellei, Tom Robinson, and Charles E Leiserson. Anti-money laundering in bitcoin: Experimenting with graph convolutional networks for financial forensics. arXiv preprint arXiv:1908.02591, 2019.
  • [23] Yu-Jing Lin, Po-Wei Wu, Cheng-Han Hsu, I-Ping Tu, and Shih-wei Liao. An evaluation of bitcoin address classification based on transaction history summarization. In 2019 IEEE International Conference on Blockchain and Cryptocurrency (ICBC), pages 302–310. IEEE, 2019.
  • [24] Bryan Perozzi, Rami Al-Rfou, and Steven Skiena. Deepwalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 701–710, 2014.
  • [25] Annamalai Narayanan, Mahinthan Chandramohan, Rajasekar Venkatesan, Lihui Chen, Yang Liu, and Shantanu Jaiswal. graph2vec: Learning distributed representations of graphs. arXiv preprint arXiv:1707.05005, 2017.
  • [26] Qi Xuan, Jinhuan Wang, Minghao Zhao, Junkun Yuan, Chenbo Fu, Zhongyuan Ruan, and Guanrong Chen. Subgraph networks with application to structural feature space expansion. IEEE Transactions on Knowledge and Data Engineering, 2019.
  • [27] Liang Chen, Jiaying Peng, Yang Liu, Jintang Li, Fenfang Xie, and Zibin Zheng. Phishing scams detection in ethereum transaction network. ACM Transactions on Internet Technology (TOIT), 21(1):1–16, 2020.
  • [28] Guolin Ke, Qi Meng, Thomas Finley, Taifeng Wang, Wei Chen, Weidong Ma, Qiwei Ye, and Tie-Yan Liu. Lightgbm: A highly efficient gradient boosting decision tree. Advances in Neural Information Processing Systems, 30:3146–3154, 2017.
  • [29] Silivanxay Phetsouvanh, Frédérique Oggier, and Anwitaman Datta. Egret: Extortion graph exploration techniques in the bitcoin network. In 2018 IEEE International conference on data mining workshops (ICDMW), pages 244–251. IEEE, 2018.
  • [30] Rui Zhang, Guifa Zhang, Lan Liu, Chen Wang, and Shaohua Wan. Anomaly detection in bitcoin information networks with multi-constrained meta path. Journal of Systems Architecture, 110:101829, 2020.
  • [31] Thomas N Kipf and Max Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
  • [32] Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, and Yoshua Bengio. Graph Attention Networks. International Conference on Learning Representations, 2018. accepted as poster.
  • [33] Justin Gilmer, Samuel S Schoenholz, Patrick F Riley, Oriol Vinyals, and George E Dahl. Neural message passing for quantum chemistry. In International Conference on Machine Learning, pages 1263–1272. PMLR, 2017.
  • [34] Nathan de Lara and Edouard Pineau. A simple baseline algorithm for graph classification. arXiv preprint arXiv:1810.09155, 2018.
  • [35] Anton Tsitsulin, Davide Mottin, Panagiotis Karras, Alexander Bronstein, and Emmanuel Müller. Netlsd: hearing the shape of a graph. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pages 2347–2356, 2018.
  • [36] Saurabh Verma and Zhi-Li Zhang. Hunt for the unique, stable, sparse and fast feature learning on graphs. In NIPS, pages 88–98, 2017.