POLARIS: A Geographic Pre-trained Model and its Applications in Baidu Maps

03/17/2022
by Jizhou Huang, et al.

Pre-trained models (PTMs) have become a fundamental backbone for downstream tasks in natural language processing and computer vision. Although applying generic PTMs to geo-related tasks at Baidu Maps yielded initial gains, performance plateaued over time, largely because generic PTMs lack readily available geographic knowledge. To address this problem, in this paper we present POLARIS, a geographic pre-trained model designed and developed to improve geo-related tasks at Baidu Maps. POLARIS learns a universal geography-language representation by pre-training on large-scale data generated from a heterogeneous graph that contains abundant geographic knowledge. Extensive quantitative and qualitative experiments on large-scale real-world datasets demonstrate the superiority and effectiveness of POLARIS. POLARIS has been deployed in production at Baidu Maps since April 2021 and significantly improves the performance of a wide range of downstream tasks, demonstrating that it can serve as a fundamental backbone for geo-related tasks.
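The abstract does not detail the data-generation pipeline, but one common way to turn a heterogeneous graph into language-model pre-training data is to sample relation-aware random walks and serialize them as token sequences. The toy graph, node names, and `random_walk` helper below are illustrative assumptions for that general technique, not POLARIS's actual method:

```python
import random

# Hypothetical toy heterogeneous geographic graph: nodes are POIs and
# regions; edges carry relation types such as "located_in" or "nearby".
# The real POLARIS graph schema is not specified in this abstract.
GRAPH = {
    "POI:Baidu Technology Park": [
        ("located_in", "REGION:Haidian District"),
        ("nearby", "POI:Xierqi Subway Station"),
    ],
    "POI:Xierqi Subway Station": [
        ("located_in", "REGION:Haidian District"),
    ],
    "REGION:Haidian District": [
        ("part_of", "REGION:Beijing"),
    ],
    "REGION:Beijing": [],
}

def random_walk(start: str, num_hops: int) -> list[str]:
    """Sample a relation-aware walk, interleaving node names and edge
    types so the walk reads as a geography-language token sequence."""
    sequence = [start]
    node = start
    for _ in range(num_hops):
        edges = GRAPH.get(node, [])
        if not edges:
            break
        relation, neighbor = random.choice(edges)
        sequence.extend([relation, neighbor])
        node = neighbor
    return sequence

if __name__ == "__main__":
    random.seed(0)
    # Each serialized walk becomes one pre-training example, e.g. for
    # masked language modeling over geography-language text.
    for _ in range(3):
        print(" ".join(random_walk("POI:Baidu Technology Park", 3)))
```

Under this sketch's assumptions, each walk yields a sequence like "POI:Baidu Technology Park located_in REGION:Haidian District part_of REGION:Beijing", which a text encoder can consume directly alongside ordinary language data.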

