1 |
Regularization properties of the discrepancy principle for Tikhonov regularization in Banach spaces
Anzengruber, Stephan W.; Hofmann, Bernd; Mathé, Peter. 11 December 2012.
The stable solution of ill-posed non-linear operator equations in Banach space requires regularization. One important approach is based on Tikhonov regularization, in which case a one-parameter family of regularized solutions is obtained. It is crucial to choose the parameter appropriately. Here, a variant of the discrepancy principle is analyzed. In many cases such a parameter choice exhibits the feature, called the regularization property below, that the chosen parameter tends to zero as the noise tends to zero, but more slowly than the noise level. We show this regularization property under two natural assumptions: first, exact penalization must be excluded, and second, the discrepancy principle must stop after a finite number of iterations. We conclude this study with a discussion of some consequences for convergence rates obtained by the discrepancy principle under a variational inequality, a recent tool for the analysis of inverse problems.
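The abstract concerns Tikhonov regularization with an a posteriori parameter choice by the discrepancy principle. As an illustration only, the following Python sketch shows the idea in the classical finite-dimensional, quadratic (Hilbert-space) setting rather than the paper's nonlinear Banach-space setting: the parameter alpha is reduced geometrically until the residual falls below tau times the noise level, so the search stops after a finite number of steps. All names and parameter values (tikhonov, discrepancy_principle, tau, q, the toy operator) are illustrative assumptions, not taken from the paper.

# Minimal sketch, assuming a linear operator A on R^n and quadratic penalty.
import numpy as np

def tikhonov(A, y, alpha):
    """Tikhonov solution x_alpha = argmin ||Ax - y||^2 + alpha * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def discrepancy_principle(A, y_delta, delta, tau=1.5, alpha0=1.0, q=0.5, max_iter=100):
    """Shrink alpha geometrically until the residual drops below tau * delta,
    mirroring the finite-termination requirement discussed in the abstract."""
    alpha = alpha0
    for _ in range(max_iter):
        x_alpha = tikhonov(A, y_delta, alpha)
        if np.linalg.norm(A @ x_alpha - y_delta) <= tau * delta:
            return x_alpha, alpha
        alpha *= q
    return x_alpha, alpha  # fallback if the criterion was never met

# Toy ill-posed example: discretized integration operator with noisy data.
n = 200
A = np.tril(np.ones((n, n))) / n                  # smoothing, ill-conditioned operator
x_true = np.sin(np.linspace(0, np.pi, n))
y_exact = A @ x_true
delta = 1e-3
rng = np.random.default_rng(0)
noise = rng.standard_normal(n)
y_delta = y_exact + delta * noise / np.linalg.norm(noise)   # noise level exactly delta

x_rec, alpha_chosen = discrepancy_principle(A, y_delta, delta)
print(f"chosen alpha = {alpha_chosen:.2e}, "
      f"relative error = {np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true):.3f}")

In this sketch the chosen alpha shrinks as delta does, but the stopping test keeps it bounded away from zero relative to the noise, which is the behaviour the abstract calls the regularization property.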
|
2 |
The impact of a curious type of smoothness conditions on convergence rates in l1-regularization
Bot, Radu Ioan; Hofmann, Bernd. 31 January 2013.
Tikhonov-type regularization of linear and nonlinear ill-posed problems in abstract spaces under sparsity constraints has gained considerable attention in recent years. Since, under weak assumptions, all regularized solutions are sparse if the l1-norm is used as the penalty term, l1-regularization has been studied by numerous authors, although the non-reflexivity of the Banach space l1 and the fact that this penalty functional is not strictly convex lead to serious difficulties. We consider the case in which the sparsity assumption is narrowly missed, meaning that the solutions may have infinitely many nonzero but fast-decaying components. For that case we formulate and prove convergence rate results for the l1-regularization of nonlinear operator equations. In this context, we outline the cases of Hölder rates and of exponentially decaying solution components.
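To make the setting concrete, here is a small Python sketch, again for a linear operator only, of l1-penalized Tikhonov regularization solved by iterative soft thresholding (ISTA), applied to a solution whose components are not exactly sparse but decay exponentially, mimicking the "narrowly missed" sparsity case. The function names, the decay rate, and the choice of lam are illustrative assumptions, not taken from the paper, which treats nonlinear operator equations.

# Minimal sketch, assuming a linear operator and solving
# min ||Ax - y||^2 + lam * ||x||_1 by proximal gradient (ISTA) steps.
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft shrinkage, the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Iterative soft thresholding for the l1-penalized least-squares problem."""
    L = np.linalg.norm(A, 2) ** 2          # spectral norm squared gives a safe step size 1/L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of (1/2)||Ax - y||^2
        x = soft_threshold(x - grad / L, lam / (2 * L))
    return x

# "Narrowly missed" sparsity: nonzero but fast-decaying components x_k ~ exp(-k/10).
rng = np.random.default_rng(1)
m, n = 80, 120
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.exp(-np.arange(n) / 10.0)
delta = 1e-3
y_delta = A @ x_true + delta * rng.standard_normal(m)

x_rec = ista(A, y_delta, lam=1e-3)
print(f"nonzeros in x_rec: {np.count_nonzero(x_rec)}, "
      f"relative error: {np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true):.3f}")

The regularized solution is exactly sparse even though the true solution is not, which is the phenomenon the abstract exploits when deriving convergence rates for fast-decaying components.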
|