GROWN+UP: A Graph Representation Of a Webpage Network Utilizing Pre-training

08/03/2022
by Benedict Yeoh, et al.

Large pre-trained neural networks are ubiquitous and critical to the success of many downstream tasks in natural language processing and computer vision. In web information retrieval, by contrast, there is a stark lack of similarly flexible and powerful pre-trained models that can properly parse webpages. Consequently, we believe that common machine learning tasks like content extraction and information mining from webpages have low-hanging gains that remain untapped. We aim to close this gap by introducing a task-agnostic deep graph neural network feature extractor that can ingest webpage structures, pre-train self-supervised on massive unlabeled data, and fine-tune effectively to arbitrary tasks on webpages. Finally, we show that our pre-trained model achieves state-of-the-art results on multiple datasets across two very different benchmarks, webpage boilerplate removal and genre classification, lending support to its potential application in diverse downstream tasks.
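To make the idea of "ingesting webpage structures" concrete, the sketch below converts an HTML document into a simple graph (tag nodes plus parent-child edges), the kind of input a graph neural network feature extractor could consume. This is a hypothetical illustration using only Python's standard-library `html.parser`, not the paper's actual pipeline; the class name `DOMGraphBuilder` and the node/edge representation are assumptions for this example.

```python
from html.parser import HTMLParser

class DOMGraphBuilder(HTMLParser):
    """Builds a toy graph (node list + parent-child edge list) from HTML.

    Hypothetical sketch of webpage-structure input for a GNN;
    not the GROWN+UP authors' actual preprocessing.
    """
    def __init__(self):
        super().__init__()
        self.nodes = []   # tag name for each node id
        self.edges = []   # (parent_id, child_id) pairs
        self._stack = []  # ids of currently open elements

    def handle_starttag(self, tag, attrs):
        node_id = len(self.nodes)
        self.nodes.append(tag)
        if self._stack:  # link to the enclosing element, if any
            self.edges.append((self._stack[-1], node_id))
        self._stack.append(node_id)

    def handle_endtag(self, tag):
        if self._stack:
            self._stack.pop()

builder = DOMGraphBuilder()
builder.feed("<html><body><div><p>hello</p></div></body></html>")
print(builder.nodes)  # ['html', 'body', 'div', 'p']
print(builder.edges)  # [(0, 1), (1, 2), (2, 3)]
```

In a real system the node list would carry richer features (tag type, text statistics, visual attributes) and the edge list would feed a GNN library's graph constructor, but the structural mapping from DOM tree to graph is the core idea.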


