
A Scaled Gradient Descent Method for Unconstrained Optimization Problems With A Priori Estimation of the Minimum Value

D'Alves, Curtis. January 2017.
This research proposes a novel method of improving gradient descent, aiming to be competitive with applications of the conjugate gradient method while reducing computation per iteration. Iterative methods for unconstrained optimization have found widespread application in digital signal processing for large inverse problems, such as the use of conjugate gradient for parallel image reconstruction in MR imaging. In these problems, very good estimates of the minimum value of the objective function can be obtained by estimating the noise variance in the signal or by using additional measurements. The proposed method uses an estimate of the minimum to develop a scaling for gradient descent at each iteration, avoiding the need for a computationally expensive line search. A sufficient condition for convergence is provided with proof, along with an analysis of convergence rates for problems of varying condition number.

The method is compared against the gradient descent and conjugate gradient methods. The result is a method with a computationally inexpensive scaling factor that converges linearly for well-conditioned problems. Tested on difficult non-linear problems against gradient descent, the method proves unsuccessful unless augmented with a line search; with that augmentation, however, it still outperforms gradient descent in iteration count. The method is also benchmarked against conjugate gradient for linear problems, where it achieves similar convergence for well-conditioned problems even without a line search.

Thesis / Master of Science (MSc)
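The abstract does not give the thesis's exact scaling formula, but a classical instance of the same idea is the Polyak step size, which uses a known (or estimated) minimum value f* to scale each gradient step: alpha_k = (f(x_k) - f*) / ||grad f(x_k)||^2. The following is a minimal sketch in Python, assuming this Polyak-style scaling as a stand-in for the thesis's scaling factor; function names and the quadratic test problem are illustrative, not taken from the thesis.

```python
import numpy as np

def scaled_gradient_descent(f, grad_f, x0, f_min, tol=1e-8, max_iter=1000):
    """Gradient descent scaled by an a priori estimate f_min of the minimum.

    Uses the Polyak step size alpha_k = (f(x_k) - f_min) / ||grad f(x_k)||^2,
    a classical scaling of this kind; the thesis's own scaling may differ.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        gnorm2 = g @ g
        if gnorm2 < tol**2:              # gradient (near) zero: converged
            break
        alpha = (f(x) - f_min) / gnorm2  # scaling from the minimum estimate
        x = x - alpha * g                # no line search needed
    return x

# Illustrative example: a well-conditioned quadratic f(x) = 0.5 x^T A x
# whose minimum value is known to be 0 at the origin.
A = np.diag([1.0, 2.0, 3.0])
f = lambda x: 0.5 * (x @ A @ x)
grad_f = lambda x: A @ x
x_star = scaled_gradient_descent(f, grad_f, x0=np.ones(3), f_min=0.0)
print(x_star)  # approaches the origin
```

Because the minimum estimate f_min supplies the step scale directly, each iteration costs one gradient and one function evaluation, with no line search; this matches the abstract's claim of reduced computation per iteration relative to line-search-based methods.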
