PlumX Metrics

Stochastic Gradient Descent for Linear Inverse Problems in Variable Exponent Lebesgue Spaces

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN: 1611-3349, Vol: 14009 LNCS, Pages: 457-470
2023
  • Citations: 1
  • Usage: 0
  • Captures: 0
  • Mentions: 0
  • Social Media: 0

Conference Paper Description

We consider a stochastic gradient descent (SGD) algorithm for solving linear inverse problems (e.g., CT image reconstruction) in the Banach space framework of variable exponent Lebesgue spaces ℓ^(p_n)(ℝ). Such non-standard spaces have recently been shown to provide an appropriate functional framework for enforcing pixel-adaptive regularisation in signal and image processing applications. Compared to its use in Hilbert settings, however, the application of SGD in the Banach setting of ℓ^(p_n)(ℝ) is not straightforward, due, in particular, to the lack of a closed-form expression for, and the non-separability of, the underlying norm. In this manuscript, we show that SGD iterations can effectively be performed using the associated modular function. Numerical validation on both simulated and real CT data shows significant improvements over SGD solutions in both Hilbert and other Banach settings, in particular when non-Gaussian or mixed noise is observed in the data.
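The abstract does not spell out the iteration, but the key point is that the modular of ℓ^(p_n) is separable (a componentwise sum), so its gradient and the inverse of that gradient have closed forms even when the norm does not. The sketch below is only an illustration of how such a modular-driven, Kaczmarz-type dual update might look; the modular definition, the row sampling, the step-size rule, and the exponent map p_n > 1 are all assumptions made here for the example, not the authors' exact scheme.

```python
import numpy as np

def modular_grad(x, p):
    """Gradient of the modular rho(x) = sum_n |x_n|^{p_n} / p_n,
    i.e. (grad rho(x))_n = |x_n|^{p_n - 1} sign(x_n). Separable, unlike the norm."""
    return np.sign(x) * np.abs(x) ** (p - 1.0)

def modular_grad_inv(xs, p):
    """Componentwise inverse of modular_grad (well defined for p_n > 1)."""
    return np.sign(xs) * np.abs(xs) ** (1.0 / (p - 1.0))

def sgd_modular(A, y, p, n_iter=10000, step=1.0, seed=0):
    """Kaczmarz-type stochastic gradient sketch for A x = y in ell^{(p_n)}:
    a gradient step on one randomly sampled data term is taken on a dual
    variable, and the primal iterate is recovered through the modular."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    xs = modular_grad(x, p)              # dual iterate, starts at 0
    for _ in range(n_iter):
        i = rng.integers(m)              # sample one row / projection
        a, yi = A[i], y[i]
        resid = a @ x - yi
        mu = step / (np.dot(a, a) + 1e-12)
        xs = xs - mu * resid * a         # stochastic gradient step on the dual variable
        x = modular_grad_inv(xs, p)      # map back componentwise via the modular
    return x

# Hypothetical toy usage: sparse signal, pixel-adaptive exponents p_n in (1, 2]
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 100))
x_true = np.zeros(100); x_true[:10] = 1.0
y = A @ x_true
p = np.full(100, 1.3); p[:10] = 2.0      # smaller p_n where sparsity is expected
x_rec = sgd_modular(A, y, p, n_iter=20000)
print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

With p_n = 2 everywhere the dual and primal iterates coincide and the loop reduces to standard randomized Kaczmarz; varying p_n componentwise is what makes the regularisation pixel-adaptive in this toy setting.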
