Fine-tune BERT for E-commerce Non-Default Search Ranking
The quality of non-default rankings on e-commerce platforms, such as ranking by ascending item price or descending historical sales volume, often suffers from acute relevance problems, since irrelevant items are much more easily exposed at the top of the results. In this work, we propose a two-stage ranking scheme that first recalls a wide range of candidate items through refined query/title keyword matching, and then classifies the recalled items using BERT-Large fine-tuned on human-labeled data. We also implemented parallel prediction on multiple GPU hosts and a custom C++ TensorFlow tokenization op. In this data challenge, our model won 1st place in the supervised phase (based on overall F1 score) and 2nd place in the final phase (based on average per-query F1 score).
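The two-stage scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and data layout are hypothetical, and the stage-2 classifier (which in the paper is a fine-tuned BERT-Large) is replaced here by a simple keyword-coverage stub so the example is self-contained.

```python
def recall_candidates(query, items, min_overlap=1):
    """Stage 1: recall items whose title shares keywords with the query."""
    q_terms = set(query.lower().split())
    recalled = []
    for item in items:
        overlap = q_terms & set(item["title"].lower().split())
        if len(overlap) >= min_overlap:
            recalled.append(item)
    return recalled

def classify_relevant(query, item):
    """Stage 2 stand-in: the paper scores each (query, title) pair with a
    fine-tuned BERT-Large; here we use keyword coverage as a placeholder."""
    q_terms = set(query.lower().split())
    t_terms = set(item["title"].lower().split())
    return len(q_terms & t_terms) / len(q_terms) >= 0.5

def rank_non_default(query, items, key="price"):
    """Drop items judged irrelevant, then apply the non-default sort key
    (e.g. ascending price) only to the surviving candidates."""
    recalled = recall_candidates(query, items)
    relevant = [it for it in recalled if classify_relevant(query, it)]
    return sorted(relevant, key=lambda it: it[key])
```

For example, with a query `"running shoes"` over a small catalog, an unrelated `"phone case"` item is filtered out before the price sort, which is exactly the failure mode (irrelevant items floating to the top of a cheap-first ranking) the paper targets.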