Deep Multiple Instance Learning with Distance-Aware Self-Attention

05/17/2023
by   Georg Wölflein, et al.

Traditional supervised learning tasks require a label for every instance in the training set, but in many real-world applications, labels are only available for collections (bags) of instances. This problem setting, known as multiple instance learning (MIL), is particularly relevant in the medical domain, where high-resolution images are split into smaller patches, but labels apply to the image as a whole. Recent MIL models are able to capture correspondences between patches by employing self-attention, allowing them to weigh each patch differently based on all other patches in the bag. However, these approaches still do not consider the relative spatial relationships between patches within the larger image, which is especially important in computational pathology. To this end, we introduce a novel MIL model with distance-aware self-attention (DAS-MIL), which explicitly takes into account relative spatial information when modelling the interactions between patches. Unlike existing relative position representations for self-attention, which are discrete, our approach introduces continuous distance-dependent terms into the computation of the attention weights, and is the first to apply relative position representations in the context of MIL. We evaluate our model on a custom MNIST-based MIL dataset that requires the consideration of relative spatial information, as well as on CAMELYON16, a publicly available cancer metastasis detection dataset, where we achieve a test AUROC score of 0.91. On both datasets, our model outperforms existing MIL approaches that employ absolute positional encodings, as well as existing relative position representation schemes applied to MIL. Our code is available at https://anonymous.4open.science/r/das-mil.
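To make the core idea concrete, the sketch below shows self-attention over a bag of patch embeddings in which a continuous, distance-dependent term is added to the attention logits before the softmax. This is an illustrative assumption, not the paper's exact formulation: the function names, the single learnable scale `alpha`, and the specific penalty form (a scaled Euclidean distance between patch centres) are hypothetical stand-ins for the distance-dependent terms the abstract describes.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def distance_aware_attention(X, coords, Wq, Wk, Wv, alpha=1.0):
    """Single-head self-attention over a bag of patch embeddings.

    X      : (n, d) patch embeddings (one bag of n instances)
    coords : (n, 2) spatial coordinates of each patch centre
    alpha  : scale of the distance penalty (hypothetical; in a real
             model this would be a learnable parameter)

    A continuous penalty proportional to the pairwise Euclidean
    distance between patches is subtracted from the attention logits,
    so nearby patches attend to each other more strongly. This is a
    simplified stand-in for DAS-MIL's distance-dependent terms.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    logits = (Q @ K.T) / np.sqrt(d_k)

    # Pairwise Euclidean distances between patch centres: (n, n)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)

    # Continuous distance-dependent term added to the logits
    logits = logits - alpha * dist

    A = softmax(logits, axis=-1)   # (n, n) attention weights
    return A @ V, A
```

With a large `alpha`, attention collapses onto spatially close patches; with `alpha = 0`, the layer reduces to standard (position-agnostic) self-attention, which is the degenerate case the paper argues against for pathology images.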
