# Model Selection with Piecewise-Smooth Gauges

Gabriel Peyré^{*}

In this talk, we investigate in a unified way the structural properties of a large class of convex regularizers for linear inverse problems. We consider regularization with convex positively 1-homogeneous functionals (so-called gauges) that obey a weak decomposability property. Weak decomposability promotes solutions of the inverse problem conforming to some notion of simplicity/low complexity, in the sense that they live on a low-dimensional subspace. This family of priors encompasses many special instances routinely used in regularized inverse problems, such as the $\ell^1$ norm, the $\ell^1-\ell^2$ norm (group sparsity), the nuclear norm, and the $\ell^\infty$ norm. The weak decomposability requirement is flexible enough to cope with analysis-type priors that include a pre-composition with a linear operator, such as the total variation and polyhedral gauges. Weak decomposability is also stable under summation of regularizers, thus enabling us to handle mixed regularizations.

The first part of the talk is dedicated to assessing the theoretical recovery performance of this class of regularizers. We provide sufficient conditions under which the deviation of the recovered solution from the true underlying object can be provably controlled as a function of the noise level. More precisely, we establish two main results. The first ensures that the solution to the inverse problem is unique and lives on the same low-dimensional subspace as the true vector to recover, provided that the minimal signal-to-noise ratio is large enough. This extends previous results well known for the $\ell^1$ norm [1], the analysis $\ell^1$ semi-norm [2], and the nuclear norm [3] to the general class of weakly decomposable gauges. In the second result, we establish $\ell^2$ stability by showing that the $\ell^2$ distance between the recovered and true vectors is within a factor of the noise level, thus extending results that hold for coercive convex positively 1-homogeneous functionals [4].
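As a purely illustrative sketch (not part of the talk; all parameter choices are hypothetical), the first recovery result can be seen in action for the simplest member of the family, the $\ell^1$ gauge: solving a Lasso problem with iterative soft-thresholding recovers both the true sparse vector and the low-dimensional subspace (its support) from noisy measurements, provided the noise is small enough relative to the signal.

```python
import numpy as np

# Illustrative example: l1-regularized recovery of a sparse vector.
# All dimensions, noise levels and the regularization parameter lam
# are hypothetical choices for demonstration only.
rng = np.random.default_rng(0)
n, p, s = 60, 100, 5                       # measurements, dimension, sparsity
A = rng.standard_normal((n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[:s] = 1.0                               # true vector, supported on a 5-dim subspace
y = A @ x0 + 0.01 * rng.standard_normal(n)  # noisy observations

def ista(A, y, lam, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - A.T @ (A @ x - y) / L      # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # prox of lam*||.||_1
    return x

x_hat = ista(A, y, lam=0.05)
# In the well-conditioned, low-noise regime, the largest entries of x_hat
# sit exactly on the support of x0 (the true low-dimensional subspace).
```

The soft-thresholding step is the proximal operator of the $\ell^1$ gauge; for other weakly decomposable gauges (group $\ell^1-\ell^2$, nuclear norm) the same scheme applies with the corresponding proximal map.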
The second part of the talk is dedicated to analyzing the local sensitivity of any regularized solution to perturbations of the observations and, as a by-product, its implications for constructing unbiased risk estimators and parameter selection procedures. We give a precise local parameterization and establish local differentiability of any regularized solution as a function of the observations. We also give an expression for the Jacobian of a solution mapping with respect to the observations. Using tools from o-minimal geometry, we prove that the set of non-differentiability points has zero Lebesgue measure, hence showing that the expression for the Jacobian is valid Lebesgue almost everywhere. These results are achieved without requiring that the forward-model linear operator be injective, and they extend and unify previous results known for analysis $\ell^1$ sparsity; see [5] and the references therein. When the noise in the observations is Gaussian, these results allow us to derive expressions for the Generalized Stein Unbiased Risk Estimator (GSURE) [6]. In turn, this provides a principled way to automatically select the hyperparameters involved, such as the regularization parameter.

This is joint work with S. Vaiter, C. Deledalle, M. Golbabaee, C. Dossal and J. Fadili.

\begin{thebibliography}{99}
\bibitem{fuchs} J.~J. Fuchs, {\em On sparse representations in arbitrary redundant bases}, IEEE Transactions on Information Theory, 50(6):1341-1344, 2004.
\bibitem{VPD} S. Vaiter, G. Peyré, C. Dossal, J. Fadili, {\em Robust Sparse Analysis Regularization}, to appear in IEEE Transactions on Information Theory, 2013.
\bibitem{bach} F. Bach, {\em Consistency of trace norm minimization}, Journal of Machine Learning Research, 9:1019-1048, 2008.
\bibitem{grasmair} M. Grasmair, {\em Linear convergence rates for Tikhonov regularization with positively homogeneous functionals}, Inverse Problems, 27(7):075014, 2011.
\bibitem{vaiter} S. Vaiter, C. Deledalle, G. Peyré, J. Fadili, and C. Dossal, {\em Local behavior of sparse analysis regularization: Applications to risk estimation}, to appear in Applied and Computational Harmonic Analysis, 2013.
\bibitem{eldar} Y. C. Eldar, {\em Generalized SURE for exponential families: Applications to regularization}, IEEE Transactions on Signal Processing, 57(2):471-481, 2009.
\end{thebibliography}
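To convey the flavor of SURE-based parameter selection, the following hypothetical sketch (a simplification, not the GSURE construction of the talk) uses the classical Stein Unbiased Risk Estimate for soft-thresholding denoising, where the divergence term is simply the number of nonzero entries of the estimate; minimizing SURE over the threshold picks the regularization parameter without access to the true signal.

```python
import numpy as np

# Hypothetical illustration of SURE-driven parameter selection for
# soft-thresholding denoising of y = x0 + sigma*w, w ~ N(0, I):
#   SURE(lam) = ||x_hat - y||^2 - n*sigma^2 + 2*sigma^2 * #{i : x_hat_i != 0}
# is an unbiased estimate of the risk E||x_hat - x0||^2.
# Signal shape, sigma and the grid of lambdas are illustrative choices.
rng = np.random.default_rng(1)
n, sigma = 1000, 0.5
x0 = np.zeros(n)
x0[:50] = 3.0                              # sparse ground-truth signal
y = x0 + sigma * rng.standard_normal(n)

def soft(y, lam):
    """Soft-thresholding, the prox of lam*||.||_1."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def sure(y, lam, sigma):
    """Stein Unbiased Risk Estimate for the soft-thresholding estimator."""
    x_hat = soft(y, lam)
    df = np.count_nonzero(x_hat)           # divergence = number of kept entries
    return np.sum((x_hat - y) ** 2) - n * sigma ** 2 + 2 * sigma ** 2 * df

lams = np.linspace(0.1, 3.0, 30)
lam_star = lams[int(np.argmin([sure(y, l, sigma) for l in lams]))]
x_hat = soft(y, lam_star)                  # denoised estimate at the SURE-optimal lam
```

The divergence term here is computable in closed form; the results of the talk are what justify an almost-everywhere-valid Jacobian (hence a divergence) for the much broader class of weakly decomposable regularizers and non-injective forward operators.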

Mathematics Subject Classification: 62H35, 94A08

Keywords: Inverse problems; convex optimization; image processing; risk estimation

Minisymposium: Noise Estimation, Model Selection and Bilevel Optimization