StarSpace: Embed All The Things!

09/12/2017
by Ledell Wu, et al.

We present StarSpace, a general-purpose neural embedding model that can solve a wide variety of problems: labeling tasks such as text classification, ranking tasks such as information retrieval/web search, collaborative filtering-based or content-based recommendation, embedding of multi-relational graphs, and learning word, sentence or document level embeddings. In each case the model works by embedding those entities comprised of discrete features and comparing them against each other -- learning similarities dependent on the task. Empirical results on a number of tasks show that StarSpace is highly competitive with existing methods, whilst also being generally applicable to new cases where those methods are not.
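The abstract's core idea — represent each entity as a bag of discrete features, embed it as the sum of its feature embeddings, and train by comparing entities with a similarity function — can be illustrated with a minimal sketch. This is an assumed, simplified rendition in NumPy (a dot-product similarity and a margin ranking loss over sampled negatives), not the official C++ implementation; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM, MARGIN, LR = 100, 10, 1.0, 0.05
# One embedding vector per discrete feature (word, item id, label, ...).
E = rng.normal(scale=0.1, size=(VOCAB, DIM))

def embed(features):
    """Entity embedding = sum of its feature embeddings."""
    return E[list(features)].sum(axis=0)

def similarity(a, b):
    """Dot-product similarity between two entities."""
    return embed(a) @ embed(b)

def train_step(lhs, pos, neg):
    """One margin-ranking update: push sim(lhs, pos) above
    sim(lhs, neg) by at least MARGIN (hinge loss)."""
    loss = MARGIN - similarity(lhs, pos) + similarity(lhs, neg)
    if loss <= 0:
        return 0.0
    l, p, n = embed(lhs), embed(pos), embed(neg)
    # Gradient descent on the hinge loss, per feature embedding.
    for f in lhs:
        E[f] += LR * (p - n)
    for f in pos:
        E[f] += LR * l
    for f in neg:
        E[f] -= LR * l
    return loss

# Toy usage: features 1 and 2 should end up closer to 3 than to 4.
lhs, pos, neg = [1, 2], [3], [4]
for _ in range(50):
    train_step(lhs, pos, neg)
```

Because entities on both sides of the comparison live in the same feature-embedding table, the same machinery covers classification (document vs. label), retrieval (query vs. document), and recommendation (user vs. item), which is the generality the abstract claims.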


Related research

- Latent Collaborative Retrieval (06/18/2012). Retrieval tasks typically require a ranking of items given a query. Coll...
- Latent Structured Ranking (10/16/2012). Many latent (factorized) models have been proposed for recommendation ta...
- Document Network Projection in Pretrained Word Embedding Space (01/16/2020). We present Regularized Linear Embedding (RLE), a novel method that proje...
- Learning Consumer and Producer Embeddings for User-Generated Content Recommendation (09/25/2018). User-Generated Content (UGC) is at the core of web applications where us...
- Global Textual Relation Embedding for Relational Understanding (06/03/2019). Pre-trained embeddings such as word embeddings and sentence embeddings a...
- Sentence-level Privacy for Document Embeddings (05/10/2022). User language data can contain highly sensitive personal content. As suc...
- Multi-Field Models in Neural Recipe Ranking – An Early Exploratory Study (05/12/2021). Explicitly modelling field interactions and correlations in complex docu...
