Incremental subgradient method for nondifferentiable convex optimization (Método subgradiente incremental para otimização convexa não diferenciável)

Adona, Vando Antônio, 18 December 2014
Funding: Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico (CNPq)

We consider an optimization problem whose objective function is the sum of convex functions that are not necessarily differentiable. We study a subgradient method that executes the iterations incrementally, selecting each component function sequentially and processing its subgradient step individually. We analyze different alternatives for choosing the step length, highlighting the convergence properties in each case. We also analyze the incremental model in other methods, considering proximal iterations and combinations of subgradient and proximal iterations. This incremental approach has been very successful when the number of component functions is large.
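The incremental scheme described in the abstract can be sketched as follows. This is a minimal, illustrative example, not code from the thesis: the function names, the example data, and the 1/(k+1) diminishing step rule are assumptions chosen for the sketch. Each outer iteration sweeps through the component functions one at a time, taking a subgradient step for each component individually rather than for the full sum.

```python
def incremental_subgradient(subgrads, x0, steps):
    """Incremental subgradient method (sketch): for each step length,
    sweep through the component functions sequentially, taking a
    subgradient step for each component individually."""
    x = x0
    for alpha in steps:
        for g in subgrads:          # process one component at a time
            x = x - alpha * g(x)
    return x

# Example: f(x) = sum_i |x - a_i|, a nondifferentiable sum of convex
# components; a subgradient of |x - a| is sign(x - a), and any median
# of the a_i minimizes f (here x = 3).
a = [1.0, 2.0, 3.0, 4.0, 10.0]
subgrads = [lambda x, ai=ai: (x > ai) - (x < ai) for ai in a]  # sign(x - ai)
steps = [1.0 / (k + 1) for k in range(200)]  # diminishing step lengths
x_star = incremental_subgradient(subgrads, 0.0, steps)  # approaches 3.0
```

A diminishing step length such as 1/(k+1) is one of the standard choices whose convergence properties the thesis compares; with a constant step length the iterates instead settle into a neighborhood of the solution whose size scales with the step.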
