On the inverse Potts functional for single-image super-resolution problems

08/19/2020, by Pasquale Cascarano et al.

We consider a variational model for single-image super-resolution based on the assumption that the gradient of the target image is sparse. To promote jump sparsity, we use isotropic and anisotropic ℓ^0 inverse Potts gradient regularisation terms combined with a quadratic data fidelity, similarly to what was studied in [1] for general signal recovery problems. For the numerical realisation of the model, we consider a convergent ADMM algorithm. Differently from [1], [2], where approximate graph cuts and dynamic programming techniques were used to solve the non-convex substeps in the case of multivariate data, the proposed splitting allows their solutions to be computed explicitly by means of hard-thresholding and standard conjugate-gradient solvers. We quantitatively compare our results with several convex, non-convex and deep-learning-based approaches on several synthetic and real-world datasets. Our numerical results show that combining super-resolution with gradient sparsity is particularly helpful for object detection and labelling tasks (such as QR scanning and land-cover classification), for which our results are shown to improve the classification precision of standard clustering algorithms and state-of-the-art deep architectures [3].
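To illustrate the kind of splitting described in the abstract, below is a minimal Python/NumPy sketch of an ADMM iteration for a model of the form min_u ½‖Au − f‖² + λ‖∇u‖_0, where the z-subproblem is the ℓ^0 proximal map (component-wise or group-wise hard-thresholding for the anisotropic and isotropic penalties, respectively) and the u-subproblem is a linear system solved by conjugate gradients. The operator names (A, At for a generic blur-plus-downsampling operator and its adjoint), the parameters, and the boundary handling are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def grad(u):
    """Forward finite differences with Neumann boundary; returns shape (2, H, W)."""
    gy = np.zeros_like(u); gy[:-1, :] = u[1:, :] - u[:-1, :]
    gx = np.zeros_like(u); gx[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.stack([gy, gx])

def div(p):
    """Discrete divergence, the negative adjoint of grad above
    (assumes the boundary entries produced by grad are zero)."""
    gy, gx = p
    dy = np.empty_like(gy); dy[0] = gy[0]; dy[1:] = gy[1:] - gy[:-1]
    dx = np.empty_like(gx); dx[:, 0] = gx[:, 0]; dx[:, 1:] = gx[:, 1:] - gx[:, :-1]
    return dy + dx

def hard_threshold(v, lam, rho, isotropic=True):
    """l0 proximal map: keep a gradient entry only if its magnitude exceeds
    sqrt(2*lam/rho); the isotropic variant thresholds the per-pixel gradient norm."""
    tau = np.sqrt(2.0 * lam / rho)
    if isotropic:
        mag = np.sqrt((v ** 2).sum(axis=0, keepdims=True))
        return np.where(mag > tau, v, 0.0)
    return np.where(np.abs(v) > tau, v, 0.0)

def admm_l0_sr(f, A, At, shape, lam=0.05, rho=1.0, iters=100, isotropic=True):
    """ADMM sketch for min_u 0.5*||A u - f||^2 + lam*||grad u||_0.
    A / At: assumed blur-plus-downsampling operator and its adjoint on flat arrays."""
    H, W = shape
    u = At(f.ravel()).reshape(H, W)   # adjoint back-projection as initial guess
    z = grad(u)
    w = np.zeros_like(z)              # scaled dual variable
    for _ in range(iters):
        # u-step: quadratic subproblem, normal equations solved by conjugate gradients
        rhs = At(f.ravel()) - rho * div(z - w).ravel()
        mv = lambda x: At(A(x)) - rho * div(grad(x.reshape(H, W))).ravel()
        K = LinearOperator((H * W, H * W), matvec=mv, dtype=np.float64)
        u = cg(K, rhs, x0=u.ravel(), maxiter=50)[0].reshape(H, W)
        # z-step: closed-form l0 prox (hard-thresholding of the shifted gradient)
        z = hard_threshold(grad(u) + w, lam, rho, isotropic)
        # dual update
        w += grad(u) - z
    return u
```

The point of the sketch is that, in contrast with graph-cut or dynamic-programming substeps, both ADMM subproblems admit standard off-the-shelf solutions: a hard-thresholding formula and a conjugate-gradient solve.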
