With the increasing development of e-commerce and online services, perso...
Pseudo Labeling is a technique used to improve the performance of semi-s...
Attention-based neural networks, such as Transformers, have become ubiqu...
Responding with multi-modal content has been recognized as an essential ...
Recent self-supervised pre-training methods on Heterogeneous Information...
Structured pruning techniques have achieved great compression performanc...
To create a large amount of training labels for machine learning models ...
Despite the impressive progress of general face detection, the tuning of...
Over-smoothing is a challenging problem, which degrades the performance ...
Responding with images has been recognized as an important capability for...
Creating labeled training sets has become one of the major roadblocks in...
Graph Neural Networks (GNNs) have shown advantages in various graph-base...
Recent Weak Supervision (WS) approaches have had widespread success in e...
To alleviate data sparsity and cold-start problems of traditional recomm...
Architecture performance predictors have been widely used in neural arch...
Pre-trained language models like BERT achieve superior performance in v...
With the success of Neural Architecture Search (NAS), weight sharing, as...
Large-scale pre-trained models have attracted extensive attention in the...
Graph Convolutional Network (GCN) has achieved extraordinary success in ...
BERT is a cutting-edge language representation model pre-trained by a la...
One of the most popular paradigms of applying large, pre-trained NLP mod...
With the success of deep neural networks, Neural Architecture Search (NA...
Learning text representation is crucial for text classification and othe...
Graph Convolutional Network (GCN) has attracted intensive interests rece...