# Total Generalized Variation Regularization for Image Reconstruction

Kristian Bredies^{*}

We study the total generalized variation (TGV) of order $k$ for symmetric tensor fields as a regularizer for variational image reconstruction problems. This functional extends the notion of total variation in the scalar case as well as the Radon norm of the symmetrized gradient in the vector case. In particular, it is aware of geometric information such as edges and of discontinuities in higher-order derivatives. The associated Banach spaces are shown to coincide, in terms of the strong topology and for any order $k$, with the space of symmetric tensor fields of bounded deformation. The functional-analytic properties of the latter allow us to prove, for instance, well-posedness of TGV-regularized variational imaging problems. We also discuss strict TGV-topologies, which turn out not to be equivalent to the strict TV-topology.

Furthermore, computational methods for the minimization of TGV-regularized optimization problems are presented. They are based on rewriting the suitably discretized objective functional as a convex-concave saddle-point problem and applying a primal-dual iteration. The TGV-regularization approach as well as the proposed algorithms are applied to image reconstruction problems such as undersampled magnetic resonance imaging, reconstruction of noisy diffusion tensor imaging data, and denoising of dual-energy computed tomography images. Finally, numerical experiments confirm the high reconstruction quality as well as the efficiency of TGV-based methods.
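As a minimal illustration of the saddle-point rewriting and primal-dual iteration mentioned above (not the discretization used in the paper itself), second-order TGV denoising of a 1-D signal can be sketched as follows. The function name `tgv2_denoise_1d`, the step sizes, and the parameters `a1`, `a0` are illustrative assumptions.

```python
import numpy as np

def tgv2_denoise_1d(f, a1=0.15, a0=0.3, n_iter=500):
    """Sketch: minimize 0.5*||u - f||^2 + a1*|Du - w|_1 + a0*|Dw|_1
    over (u, w) via a Chambolle-Pock-type primal-dual iteration,
    with D the 1-D forward-difference operator."""
    n = f.size
    u, w = f.copy(), np.zeros(n - 1)          # primal variables
    ub, wb = u.copy(), w.copy()               # extrapolated ("bar") variables
    p, q = np.zeros(n - 1), np.zeros(n - 2)   # dual variables
    tau = sigma = 1.0 / 3.0                   # tau*sigma*||K||^2 <= 1 since ||K||^2 <= 8

    for _ in range(n_iter):
        # dual ascent, then projection onto l_inf balls (duals of the l1 norms)
        p = np.clip(p + sigma * (np.diff(ub) - wb), -a1, a1)
        q = np.clip(q + sigma * np.diff(wb), -a0, a0)
        # primal descent; the prox of 0.5*||. - f||^2 averages with f
        # (note D^T p = -np.diff(p, prepend=0, append=0) with Neumann boundary)
        u_new = (u + tau * np.diff(p, prepend=0.0, append=0.0) + tau * f) / (1.0 + tau)
        w_new = w + tau * (p + np.diff(q, prepend=0.0, append=0.0))
        # over-relaxation step of the primal-dual scheme
        ub, wb = 2.0 * u_new - u, 2.0 * w_new - w
        u, w = u_new, w_new
    return u, w
```

For a noisy piecewise-affine signal, the returned pair `(u, w)` attains a lower objective value than the trivial candidate `(f, 0)`, reflecting the removal of noise while the affine structure (captured by the auxiliary field `w`) is preserved.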

Mathematics Subject Classification: 68U10, 65K10, 46B10

Keywords: Total generalized variation; variational methods; image reconstruction; magnetic resonance imaging

Minisymposium: High Resolution Imaging with Geometric Priors