Visual Interest Prediction with Attentive Multi-Task Transfer Learning

05/26/2020
by Deepanway Ghosal, et al.

Visual interest and affect prediction is an interesting area of research in computer vision. In this paper, we propose a neural network model based on transfer learning and an attention mechanism to predict the visual interest and affective dimensions of digital photos. Learning the multi-dimensional affects is addressed through a multi-task learning framework. Through various experiments we show the effectiveness of the proposed approach, and evaluation of our model on the benchmark dataset shows a large improvement over current state-of-the-art systems.
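The abstract does not specify the backbone network, the number of affective dimensions, or the exact attention formulation, so the following is only a minimal sketch of the general idea under assumed choices: an ImageNet-pretrained ResNet-50 as the transferred feature extractor, self-attention over its spatial feature map, and one regression head per affective dimension trained with a summed per-task loss. The class name AttentiveMultiTaskNet and all hyperparameters are hypothetical, not the authors' implementation.

```python
# Sketch only: transfer learning + attention + multi-task regression heads.
# Assumed choices (not from the paper): ResNet-50 backbone, multi-head
# self-attention over the 7x7 spatial grid, 3 affective dimensions, MSE losses.
import torch
import torch.nn as nn
import torchvision.models as models


class AttentiveMultiTaskNet(nn.Module):
    def __init__(self, num_tasks=3, feat_dim=2048):
        super().__init__()
        # Transfer learning: reuse an ImageNet-pretrained backbone.
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        # Drop the average-pool and classifier layers to keep the spatial map.
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        # Self-attention over the spatial feature tokens.
        self.attn = nn.MultiheadAttention(embed_dim=feat_dim, num_heads=8,
                                          batch_first=True)
        # One small regression head per affective dimension (multi-task).
        self.heads = nn.ModuleList([nn.Linear(feat_dim, 1)
                                    for _ in range(num_tasks)])

    def forward(self, x):
        fmap = self.features(x)                    # (B, 2048, 7, 7)
        tokens = fmap.flatten(2).transpose(1, 2)   # (B, 49, 2048)
        attended, _ = self.attn(tokens, tokens, tokens)
        pooled = attended.mean(dim=1)              # (B, 2048)
        # Concatenate per-task predictions into (B, num_tasks).
        return torch.cat([head(pooled) for head in self.heads], dim=1)


# Multi-task training signal: sum of per-dimension regression losses.
model = AttentiveMultiTaskNet(num_tasks=3)
preds = model(torch.randn(2, 3, 224, 224))
targets = torch.rand(2, 3)
loss = sum(nn.functional.mse_loss(preds[:, i], targets[:, i])
           for i in range(3))
```

In this kind of setup the shared backbone and attention layer are updated by the gradients of all task losses, while each head specializes in one affective dimension; the actual weighting of the per-task losses is a design choice not stated in the abstract.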


Related research

09/13/2023 - Learning from Auxiliary Sources in Argumentative Revision Classification
We develop models to classify desirable reasoning revisions in argumenta...

07/05/2019 - Attentive Multi-Task Deep Reinforcement Learning
Sharing knowledge between tasks is vital for efficient learning in a mul...

07/03/2017 - Discriminatory Transfer
We observe standard transfer learning can improve prediction accuracies ...

06/25/2021 - Generative Modeling for Multi-task Visual Learning
Generative modeling has recently shown great promise in computer vision,...

05/05/2018 - Transfer Learning of Artist Group Factors to Musical Genre Classification
The automated recognition of music genres from audio information is a ch...

06/22/2023 - Multi-Task Learning with Loop Specific Attention for CDR Structure Prediction
The Complementarity Determining Region (CDR) structure prediction of loo...

07/29/2021 - Ranking Micro-Influencers: a Novel Multi-Task Learning and Interpretable Framework
With the rise in use of social media to promote branded products, the de...
