We are interested in solving various image restoration problems
by constrained convex models. In particular, we deal with the minimization of seminorms $\|L \cdot\|$ on $\R^n$ under
the constraint of a bounded $I$-divergence $D(b,H \cdot)$.
The $I$-divergence, also known as the Kullback-Leibler divergence, appears
in many models in imaging science, in particular when dealing with Poisson data.
Typically, $H$ represents, e.g., a linear blur operator and $L$ is some discrete derivative operator.
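In formulas, the constrained model can be sketched as follows (writing $\tau > 0$ for the prescribed bound on the divergence, and using the usual definition of the $I$-divergence for nonnegative vectors):
\[
  \min_{x} \|Lx\| \quad \text{subject to} \quad D(b, Hx) \le \tau,
  \qquad
  D(b,v) := \sum_i b_i \log \frac{b_i}{v_i} - b_i + v_i .
\]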
Our preference for the constrained approach over the corresponding penalized version
is based on the fact that the $I$-divergence of data corrupted, e.g., by Poisson noise
or multiplicative Gamma noise can be estimated by statistical methods.
Our minimization technique rests upon relations between constrained and penalized convex problems
and resembles the idea of Morozov's discrepancy principle.
More precisely, we propose first-order primal-dual algorithms
which reduce the problem to the
solution of certain proximal minimization problems in each iteration step.
The most interesting of these proximal minimization problems
is an $I$-divergence constrained least squares problem.
We solve this problem by connecting it to the corresponding $I$-divergence penalized least squares problem
with an appropriately chosen regularization parameter.
In this way, our algorithm produces not only a sequence of vectors
which converges to a minimizer of the constrained problem,
but also a sequence of parameters which converges to a regularization parameter
for which the penalized problem has the same solution as the constrained one.
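This connection between the constrained and the penalized proximal problem can be illustrated by a minimal sketch. It is not the authors' exact scheme, but a generic log-scale bisection on the regularization parameter $\lambda$: the penalized least squares problem $\min_{x>0}\, \tfrac12\|x-z\|^2 + \lambda D(b,x)$ decouples componentwise and has a closed-form positive root, and the attained divergence $D(b, x(\lambda))$ decreases as $\lambda$ grows. All function names and tolerances below are hypothetical.

```python
import numpy as np

def idiv(b, v):
    """I-divergence D(b, v) = sum_i b_i log(b_i / v_i) - b_i + v_i (b_i > 0 assumed)."""
    return float(np.sum(b * np.log(b / v) - b + v))

def prox_penalized(z, b, lam):
    """Componentwise minimizer of 0.5*(x - z)^2 + lam*(b*log(b/x) - b + x) over x > 0.

    Setting the derivative (x - z) + lam*(1 - b/x) to zero gives the quadratic
    x^2 + (lam - z)*x - lam*b = 0, whose positive root is taken below.
    """
    t = z - lam
    return 0.5 * (t + np.sqrt(t * t + 4.0 * lam * b))

def prox_constrained(z, b, tau, iters=80):
    """Sketch: solve min 0.5*||x - z||^2 s.t. D(b, x) <= tau by bisecting on lam,
    using that D(b, prox_penalized(z, b, lam)) is monotonically decreasing in lam."""
    lo, hi = 1e-12, 1e12
    for _ in range(iters):
        lam = np.sqrt(lo * hi)  # bisection in log scale
        if idiv(b, prox_penalized(z, b, lam)) > tau:
            lo = lam            # constraint still violated: increase lam
        else:
            hi = lam            # feasible: try a smaller lam
    lam = np.sqrt(lo * hi)
    return prox_penalized(z, b, lam), lam
```

The sketch returns both the minimizer and the matching regularization parameter, mirroring the statement above that the algorithm yields a parameter for which penalized and constrained solutions coincide.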
Finally, we deal with Anscombe constrained problems.
We demonstrate the performance of our algorithms for different image restoration examples.
This is joint work with R. Chan (Chinese University of Hong Kong), J.-C. Pesquet (Université Paris-Est, France)
and R. Ciak, T. Teuber, B. Shafei (University of Kaiserslautern).