The Total Variation on Hypergraphs - Learning on Hypergraphs Revisited

12/18/2013
by Matthias Hein, et al.

Hypergraphs allow one to encode higher-order relationships in data and are thus a very flexible modeling tool. Current learning methods are based either on approximations of the hypergraph via graphs or on tensor methods that are applicable only under special conditions. In this paper, we present a new learning framework on hypergraphs which fully uses the hypergraph structure. The key element is a family of regularization functionals based on the total variation on hypergraphs.
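As a rough illustration of the kind of functional involved, here is a minimal sketch of a hypergraph total variation, assuming the common max–min form, in which each hyperedge contributes its weight times the spread of the vertex function over that edge (the function name and data layout are illustrative, not the paper's API):

```python
def hypergraph_total_variation(f, hyperedges, weights=None):
    """Sketch: TV(f) = sum over hyperedges e of w_e * (max_{i in e} f[i] - min_{i in e} f[i]).

    f          -- list of real values, one per vertex
    hyperedges -- list of hyperedges, each a list of vertex indices
    weights    -- optional list of hyperedge weights (defaults to 1.0 each)
    """
    if weights is None:
        weights = [1.0] * len(hyperedges)
    tv = 0.0
    for e, w in zip(hyperedges, weights):
        vals = [f[i] for i in e]
        # A hyperedge is "cut" in proportion to how much f varies across it.
        tv += w * (max(vals) - min(vals))
    return tv

# Example: 5 vertices, two hyperedges; f is constant on neither edge.
f = [0.0, 0.0, 1.0, 1.0, 0.5]
edges = [[0, 1, 2], [2, 3, 4]]
print(hypergraph_total_variation(f, edges))  # prints 1.5
```

Note that this functional is zero exactly when f is constant on every hyperedge, which is what makes it useful as a regularizer: it penalizes labelings that split a hyperedge.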


Related research

- Weighted structure tensor total variation for image denoising (06/18/2023): Based on the variational framework of the image denoising problem, we in...
- Regularized Recovery by Multi-order Partial Hypergraph Total Variation (02/19/2021): Capturing complex high-order interactions among data is an important tas...
- Incorporating Total Variation Regularization in the design of an intelligent Query by Humming system (02/09/2023): A Query-By-Humming (QBH) system constitutes a particular case of music i...
- Prediction bounds for (higher order) total variation regularized least squares (04/24/2019): We establish oracle inequalities for the least squares estimator f̂ with...
- Higher Order Imprecise Probabilities and Statistical Testing (07/09/2021): We generalize standard credal set models for imprecise probabilities to ...
- Nonlocal Adaptive Direction-Guided Structure Tensor Total Variation For Image Recovery (08/28/2020): A common strategy in variational image recovery is utilizing the nonloca...
- Probabilistic Segmentation via Total Variation Regularization (11/16/2015): We present a convex approach to probabilistic segmentation and modeling ...
