1 |
Derivative Free Optimization Methods: Application in Stirrer Configuration and Data Clustering
Akteke, Basak, 01 July 2005
Recent developments show that derivative free methods are in high demand among researchers for solving optimization problems in various practical contexts.
Although well-known optimization methods that employ derivative information can be very efficient, a derivative free method is more suitable in cases
where the objective function is nondifferentiable, or where the derivative information is
not available or not reliable. Derivative Free Optimization (DFO) was developed
for solving small-dimensional problems (fewer than 100 variables) in which
the evaluation of the objective function is relatively expensive and the derivatives
of the objective function are not available. Problems of this nature arise more
and more in modern physical, chemical and econometric measurements
and in engineering applications, where computer simulation is employed for the
evaluation of the objective functions.
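The abstract does not reproduce the thesis's actual algorithm; as a generic illustration of the idea it describes (optimizing using only function values, never derivatives), the following is a minimal sketch of a classical compass (pattern) search:

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal derivative-free compass (pattern) search.

    Polls f along +/- each coordinate direction and accepts the first
    improving point; shrinks the step when no poll point improves.
    Uses only function evaluations, never derivative information.
    """
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:           # accept the first improving poll point
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5               # no improvement: refine the mesh
            if step < tol:
                break
    return x, fx

# Example: minimize a smooth quadratic without touching its gradient
x_min, f_min = compass_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                              [0.0, 0.0])
# x_min converges to [1.0, -2.0], f_min to 0.0
```

Methods of this family are attractive precisely in the setting the abstract describes: each evaluation of f may be an expensive simulation, and no gradient is available to guide the search.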
In this thesis, we give an example of the implementation of DFO in an approach
for optimizing stirrer configurations, comprising a parametrized grid generator,
a flow solver, and DFO. A derivative free method, i.e., DFO, is preferred because
the gradient of the objective function with respect to the stirrer's design variables is not directly available. This nonlinear objective function is obtained
from the flow field computed by the flow solver. We present and interpret numerical results
of this implementation. Moreover, we contribute a survey of DFO research directions,
a distinction among them, and an analysis and discussion of these.
We also describe a derivative free algorithm used within a clustering algorithm, in
combination with non-smooth optimization techniques, to reveal the effectiveness
of derivative free methods in computations. This algorithm is applied to
data sets from various sources in public life and medicine. We compare
various methods and their practical backgrounds, and conclude with a summary
and outlook. This work may serve as a preparation for possible future research.
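The clustering objective mentioned above is inherently non-smooth: assigning each point to its nearest center introduces a `min`, which is nondifferentiable where assignments switch. The thesis's own algorithm is not given here; as a hypothetical illustration of attacking such an objective with a standard derivative-free method, one can feed it to Nelder-Mead (toy data and starting centers are made up for the example):

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1-D data: two well-separated groups (hypothetical, for illustration only)
data = np.array([0.0, 0.2, 0.4, 9.8, 10.0, 10.2])

def cluster_objective(centers):
    """Non-smooth k-clustering objective: each data point is charged the
    squared distance to its nearest center (the min is nondifferentiable)."""
    c = np.asarray(centers)
    return float(np.sum(np.min((data[:, None] - c[None, :]) ** 2, axis=1)))

# Nelder-Mead is a classical derivative-free simplex method; it needs
# no gradient, which suits the nondifferentiable objective above.
res = minimize(cluster_objective, x0=[1.0, 8.0], method="Nelder-Mead")
centers = np.sort(res.x)
# centers end up near the two group means, roughly [0.2, 10.0]
```

This is only a sketch of the general approach; the thesis combines its derivative free algorithm with dedicated non-smooth optimization techniques rather than a plain simplex search.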
|
2 |
Otimização sem derivadas: sobre a construção e a qualidade de modelos quadráticos na solução de problemas irrestritos / Derivative-free optimization: on the construction and quality of quadratic models for unconstrained optimization problems
Nascimento, Ivan Xavier Moura do, 1989-, 25 August 2018
Advisor: Sandra Augusta Santos / Dissertation (Master's) - Universidade Estadual de Campinas, Instituto de Matemática Estatística e Computação Científica / Previous issue date: 2014 / Abstract: Trust-region methods are a class of iterative algorithms widely applied to nonlinear unconstrained optimization problems for which derivatives of the objective function are unavailable or inaccurate. One of the classical approaches involves the optimization of a polynomial model for the objective function, built at each iteration and based on a sample set. In a recent work, Scheinberg and Toint [SIAM Journal on Optimization, 20 (6) (2010), pp. 3512-3532] proved that, despite being essential for convergence results, the improvement of the geometry (poisedness) of the sample set might occur only in the final stage of the algorithm. Based on these ideas and incorporating them into a theoretical algorithmic framework, the authors investigate analytically an interesting self-correcting geometry mechanism of the interpolating set, which becomes evident at unsuccessful iterations. Global convergence for the new algorithm is then proved as a consequence of this self-correcting property. In this work we study the positioning of the sample points within interpolation-based methods that rely on quadratic models and investigate the computational performance of the theoretical algorithm proposed by Scheinberg and Toint, whose parameters are based upon either choices from previous works or numerical experiments / Master's / Applied Mathematics / Master in Applied Mathematics
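The quadratic models and poisedness discussed in this abstract can be made concrete with a small sketch (not taken from the dissertation): in two variables, a full quadratic has six coefficients, so it is determined by interpolation on six sample points, and the interpolation system is nonsingular exactly when that sample set is poised.

```python
import numpy as np

def quadratic_model_2d(points, values):
    """Interpolate a full quadratic
        m(x, y) = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
    through 6 sample points.

    The 6x6 Vandermonde-like system is nonsingular exactly when the
    sample set is poised for quadratic interpolation; for a badly
    positioned (non-poised) set, np.linalg.solve raises LinAlgError.
    """
    M = np.array([[1.0, x, y, x * x, x * y, y * y] for x, y in points])
    return np.linalg.solve(M, np.asarray(values, dtype=float))

# A poised sample set around the origin
pts = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1), (1, 1)]

# Sampling a known quadratic: the model must recover it exactly
f = lambda x, y: 3 + 2 * x - y + 0.5 * x * x + x * y + 2 * y * y
coef = quadratic_model_2d(pts, [f(x, y) for x, y in pts])
# coef recovers [3, 2, -1, 0.5, 1, 2] up to rounding
```

In a model-based trust-region method, a model like this is minimized over the trust region in place of the expensive objective; the self-correcting mechanism studied by Scheinberg and Toint concerns how the replacement of sample points at unsuccessful iterations keeps such systems well conditioned.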
|