TourBERT: A pretrained language model for the tourism industry

01/19/2022
by Veronika Arefieva, et al.

Bidirectional Encoder Representations from Transformers (BERT) is currently one of the most important state-of-the-art models for natural language processing. However, it has also been shown that pretraining BERT on a domain-specific corpus improves its performance on domain-specific tasks. In this paper, we present TourBERT, a pretrained language model for tourism. We describe how TourBERT was developed and evaluated. The evaluations show that TourBERT outperforms BERT on all tourism-specific tasks.
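The underlying technique, continuing BERT's masked-language-model pretraining on an in-domain corpus, can be sketched with the Hugging Face Transformers library. This is a minimal illustration under assumptions, not the authors' actual setup: the corpus file (tourism_corpus.txt), the base checkpoint, and all hyperparameters are placeholders.

```python
# Minimal sketch of domain-adaptive masked-language-model pretraining.
# Corpus file, checkpoint, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from a general-purpose BERT checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical plain-text corpus of tourism documents, one per line.
dataset = load_dataset("text", data_files={"train": "tourism_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens, as in standard BERT pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="tourbert-like",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("tourbert-like")
```

After training, the saved checkpoint can be loaded with AutoModel.from_pretrained and fine-tuned on downstream tourism-specific tasks in the usual way.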


Related research

03/05/2020  What the [MASK]? Making Sense of Language-Specific BERT Models
Recently, Natural Language Processing (NLP) has witnessed an impressive ...

03/09/2022  Pretrained Domain-Specific Language Model for General Information Retrieval Tasks in the AEC Domain
As an essential task for the architecture, engineering, and construction...

09/10/2021  IndoBERTweet: A Pretrained Language Model for Indonesian Twitter with Effective Domain-Specific Vocabulary Initialization
We present IndoBERTweet, the first large-scale pretrained model for Indo...

08/13/2019  Domain Adaptive Training BERT for Response Selection
We focus on multi-turn response selection in a retrieval-based dialog sy...

12/10/2020  Towards Neural Programming Interfaces
It is notoriously difficult to control the behavior of artificial neural...

08/21/2022  A Syntax Aware BERT for Identifying Well-Formed Queries in a Curriculum Framework
A well formed query is defined as a query which is formulated in the man...

02/17/2020  A Financial Service Chatbot based on Deep Bidirectional Transformers
We develop a chatbot using Deep Bidirectional Transformer models (BERT) ...
