A preconditioned descent algorithm for a class of optimization problems involving the p(x)-Laplacian operator

05/22/2022
by   Sergio Gonzalez-Andrade, et al.

In this paper, we are concerned with a class of optimization problems involving the p(x)-Laplacian operator, which arise in imaging and signal analysis. We study the well-posedness of this class of problems in an amalgam space, assuming that the variable exponent p(x) is a log-Hölder continuous function. Further, we propose a preconditioned descent algorithm for the numerical solution of the problem, based on a "frozen exponent" approach in a finite-dimensional space. Finally, we carry out several numerical experiments to show the advantages of our method. Specifically, we study two detailed examples whose motivation lies in a possible extension of the proposed technique to image processing.
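For context, a representative model problem in this class (a standard variable-exponent formulation, sketched here as an assumption rather than quoted from the paper) minimizes the energy

    J(u) = \int_\Omega \frac{1}{p(x)} \, |\nabla u(x)|^{p(x)} \, dx + \int_\Omega f(x)\, u(x) \, dx,

whose Euler-Lagrange equation involves the p(x)-Laplacian operator

    \Delta_{p(x)} u := \operatorname{div}\!\left( |\nabla u|^{p(x)-2} \, \nabla u \right).

Under such a formulation, a "frozen exponent" discretization would typically evaluate p(x) at fixed points of the finite-dimensional space (e.g., element-wise), so that each local contribution has the structure of a constant-exponent p-Laplacian; the precise functional, discretization, and preconditioner used are specified in the full text.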
