Stochastic Convergence Analysis for Tikhonov Regularization with Sparsity Constraints

Daniel Gerth* and Ronny Ramlau

In recent years, regularization methods based on the minimization of Tikhonov-type functionals \begin{equation}\tag{$\star$} \|Ax-y^\delta\|^2+\alpha\Phi(x) \end{equation} with a bounded linear operator $A$ and a sparsity-promoting penalty term $\Phi$ have been discussed widely in the literature. Convergence of the solution has been analysed under the assumption of a deterministic error bound \begin{equation}\tag{$\diamond$} \|y-y^\delta\|\leq\delta \end{equation} between the measured data $y^\delta$ and the true data $y$. In this talk, an explicit stochastic error model is considered instead of $(\diamond)$: namely, the case of a normally distributed error with zero mean and variance $\sigma^2$ in each component of the measured data. In particular, this means that arbitrarily large errors are allowed, albeit with low probability. The functional $(\star)$ is derived from the stochastic model using Bayes' formula. Deterministic results are lifted to this setting using the Ky Fan metric for the convergence analysis. After a general convergence theorem is given, Besov space penalties are considered; for this case, a parameter choice rule is presented which immediately yields convergence rates in the Ky Fan metric with respect to the error parameter $\sigma$. The theoretical results are illustrated by one- and two-dimensional numerical examples.
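
For orientation, a minimal sketch of the Bayesian derivation, assuming i.i.d. Gaussian noise with variance $\sigma^2$ and a sparsity prior $\pi(x)\propto\exp(-\lambda\Phi(x))$ with an illustrative scaling parameter $\lambda$: Bayes' formula gives the posterior density \begin{equation*} p(x\mid y^\delta)\;\propto\;\exp\Bigl(-\tfrac{1}{2\sigma^2}\|Ax-y^\delta\|^2\Bigr)\exp\bigl(-\lambda\Phi(x)\bigr), \end{equation*} and its maximizer (the MAP estimate) minimizes $(\star)$ with $\alpha=2\sigma^2\lambda$. The Ky Fan metric of two random variables $\xi$ and $\eta$ with values in a metric space $(X,d)$ is $\rho_K(\xi,\eta)=\inf\{\varepsilon>0:\mathbb{P}\bigl(d(\xi,\eta)>\varepsilon\bigr)<\varepsilon\}$; convergence in this metric is equivalent to convergence in probability.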

Mathematics Subject Classification: 65J20

Keywords: Tikhonov regularization; sparsity; Bayesian inversion

Minisymposium: New Trends in Regularization Theory and Methods for Geomathematical Problems