11

Algoritmo de otimização bayesiano com detecção de comunidades / Bayesian optimization algorithm with community detection

Crocomo, Márcio Kassouf 02 October 2012 (has links)
Estimation of Distribution Algorithms (EDAs) form a research front in Evolutionary Computation that has shown promising results in dealing with complex, large-scale problems. In this context, the Bayesian Optimization Algorithm (BOA) stands out: it uses a multivariate probabilistic model (represented by a Bayesian network) to generate new solutions at each iteration. Building on BOA and on the investigation of community detection algorithms (used to improve the multivariate models constructed), two new algorithms are proposed, named CD-BOA and StrOp, and both are shown to have significant advantages over BOA. CD-BOA proves more flexible than BOA, being more robust to variations in the input-parameter values, which makes it easier to handle a greater diversity of real-world problems. Unlike CD-BOA and BOA, StrOp shows that community detection on a Bayesian network can model decomposable problems more adequately, restructuring them into simpler subproblems that can be solved by a greedy search; this yields a solution to the original problem that is optimal for perfectly decomposable problems and a fair approximation otherwise. A new resampling technique for EDAs (named REDA) is also proposed; it produces more representative probabilistic models, significantly improving the performance of CD-BOA and StrOp. Overall, for the cases tested, CD-BOA and StrOp are shown to require less running time than BOA, a result supported both experimentally and by analysis of the algorithms' computational complexity. The main features of these algorithms are evaluated on different problems, thus mapping their contributions to the field of Evolutionary Computation.
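The abstract describes the generic EDA loop that BOA and the proposed variants follow: fit a probabilistic model to the best solutions, then sample new candidates from it. The sketch below is a hypothetical, simplified illustration, not code from the thesis; a univariate marginal model stands in for the Bayesian network that BOA would learn, and CD-BOA/StrOp would additionally apply community detection to that network.

```python
# Minimal sketch of a generic EDA loop (hedged illustration, not the thesis code).
# A real BOA learns a Bayesian network over the selected individuals; here a
# univariate marginal model stands in for that step.
import numpy as np

def eda(fitness, n_vars, pop_size=100, n_select=50, generations=50, rng=None):
    rng = rng or np.random.default_rng(0)
    pop = rng.integers(0, 2, size=(pop_size, n_vars))            # random binary population
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        best = pop[np.argsort(scores)[-n_select:]]                # select the fittest individuals
        model = best.mean(axis=0).clip(0.05, 0.95)                # estimate a distribution (marginals)
        pop = (rng.random((pop_size, n_vars)) < model).astype(int)  # sample new solutions from the model
    return pop[np.argmax([fitness(ind) for ind in pop])]

# Example: OneMax, a perfectly decomposable toy problem
print(eda(lambda x: x.sum(), n_vars=20))
```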
12

Pushing the limits of spectroscopic imaging using novel low-rank based reconstruction algorithm

Bhattacharya, Ipshita 01 May 2017 (has links)
Non-invasively resolving the spatial distribution of tissue metabolites provides a diagnostic window into in-vivo metabolism, making magnetic resonance spectroscopic imaging (MRSI) a very useful application. The tissue concentrations of various metabolites reveal disease state and pseudo-progression of tumors, and biochemical changes manifest much earlier than the structural changes captured by standard magnetic resonance imaging (MRI). However, MRSI has not achieved its potential due to several technical challenges specific to it: many technical advances in the field of MRI do not translate to MRSI, and its limitations include long scan times, poor spatial resolution, and extremely low signal-to-noise ratio (SNR). In the last few decades, research in MRSI has focused on advanced data acquisition and reconstruction methods, yet these cannot achieve high resolution within a feasible scan time; moreover, several artifacts further limit spatial resolution, not to mention the starved SNR. Existing methods cannot deal with these limitations, which considerably impacts applications of MRSI. In this thesis we revisit these problems and introduce data acquisition and reconstruction techniques to address several such challenges. In the first part of the thesis we introduce a variable-density spiral acquisition technique that achieves high SNR for the metabolites of interest while reducing truncation artifacts. Along with that, we develop a novel compartmentalized reconstruction framework to recover high-resolution data from lipid-unsuppressed acquisitions. Avoiding lipid suppression not only reduces scan time and improves reliability but also preserves SNR, which is otherwise reduced even further by existing lipid-suppression methods. The proposed algorithm exploits the idea that the lipid and metabolite compartments reside in low-dimensional subspaces, and orthogonality priors are used to reduce the overlap between these subspaces. We also look at spectral artifacts such as Nyquist ghosting, a common problem with spectral interleaving. In echo-planar spectroscopic imaging (EPSI), one of the most popular MRSI techniques, maintaining both spatial and spectral resolution requires interleaving, and scanner inconsistencies give rise to spurious peaks that make quantification inefficient. In this thesis a novel structural low-rank prior is used to suppress these artifacts, denoise the spectra, and achieve high-resolution EPSI data. Finally, we look at accelerating multi-dimensional spectroscopy. Resolving spectra in two dimensions helps study overlapping spectra and provides more insight, but the scan time increases with the added dimension. We developed an algorithm that accelerates this method by recovering the data from undersampled measurements, and we demonstrate its performance in two applications: 2D infrared spectroscopy and 2D MR spectroscopy. The aim of the thesis is to address these challenges in MRSI from a signal-processing perspective and to achieve higher-resolution data in practical scan times, ultimately helping MRSI reach its potential.
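The subspace and low-rank priors mentioned above build on a standard operation: approximating a space-time data matrix by one of low rank. The snippet below is a hedged, generic illustration of that operation (a truncated SVD of a Casorati matrix), not the thesis's reconstruction algorithm, which additionally enforces compartment orthogonality and structured low-rank constraints.

```python
# Hedged illustration (not the thesis code): low-rank denoising of a space-time
# MRSI data block via a truncated SVD of its Casorati matrix.
import numpy as np

def lowrank_denoise(data, rank):
    """data: complex or real array of shape (n_voxels, n_timepoints)."""
    U, s, Vh = np.linalg.svd(data, full_matrices=False)
    s[rank:] = 0.0                      # keep only the leading subspace
    return (U * s) @ Vh                 # best rank-`rank` approximation (Eckart-Young)

# Toy example: a rank-3 signal buried in noise
rng = np.random.default_rng(0)
clean = rng.standard_normal((256, 3)) @ rng.standard_normal((3, 512))
noisy = clean + 0.5 * rng.standard_normal((256, 512))
denoised = lowrank_denoise(noisy, rank=3)
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))  # True
```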
13

Algoritmo guloso / Greedy algorithm

MORAIS, Camila Mendonça 19 December 2014 (has links)
This research studies the Greedy Algorithm, a type of optimization algorithm, and some of its applications, in order to develop a didactic sequence to be applied with secondary-level students. In the study, the construction and logic of the algorithm were related to graphs and trees, concepts that were previously studied and analyzed as prerequisites for understanding the algorithm's properties and characteristics. First, we summarized how Graph Theory came about; then we presented some general concepts about graphs, such as their definition, properties, classifications, and traversals. Next, we defined trees, a special type of graph, and studied some of their fundamental theorems needed for understanding the algorithm, as well as some encoding methods such as the Prüfer code. Finally, we defined the Greedy Algorithm, in particular Kruskal's algorithm, using a practical setting to exemplify its application. After the theoretical groundwork, we developed a didactic sequence to be applied over five classes. In this didactic sequence, activities involving graphs and trees were progressively carried out, with contextualized questions as exercises, so that in the last class of the sequence the Greedy Algorithm could be defined and studied and the students would be able to use it to analyze a project, which served as the final instrument of evaluation. The didactic sequence aims to stimulate the students' logical reasoning and to introduce these concepts into the secondary-level school curriculum.
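Kruskal's algorithm, the greedy algorithm highlighted above, repeatedly adds the cheapest edge that does not create a cycle until a minimum spanning tree is formed. The sketch below is a hedged, generic implementation for illustration, not material from the dissertation; the example graph and weights are made up.

```python
# Hedged sketch: Kruskal's greedy algorithm for a minimum spanning tree,
# using a union-find structure to reject edges that would close a cycle.
def kruskal(n_vertices, edges):
    """edges: list of (weight, u, v); returns the MST as a list of (u, v, weight)."""
    parent = list(range(n_vertices))

    def find(x):                       # find the representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges):      # greedy choice: always take the cheapest edge
        ru, rv = find(u), find(v)
        if ru != rv:                   # accept only if it joins two components
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

# Example: 4 vertices with illustrative weighted edges
print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))
```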
15

Optimal dimensional synthesis of planar parallel manipulators with respect to workspaces.

Hay, Alexander Morrison 04 May 2005 (has links)
Please read the abstract in the section 00front of this document / Thesis (PhD(Mechanical Engineering))--University of Pretoria, 2006. / Mechanical and Aeronautical Engineering / unrestricted
16

Multi-Objective Optimization of Plug-In HEV Powertrain Using Modified Particle Swarm Optimization

Parkar, Omkar 05 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Increasing awareness of environmental conservation is leading the automotive industry toward the adoption of alternatively fueled vehicles. Electric, fuel-cell, and hybrid-electric vehicles are the focus of this research area, with efficient utilization of the vehicle powertrain as the first step. Energy and Power Management System control strategies play a vital role in improving the efficiency of any hybrid propulsion system; however, these control strategies are sensitive to the dynamics of the powertrain components used in the given system. A kinematic mathematical model of a Plug-in Hybrid Electric Vehicle (PHEV) has been developed in this study and is further optimized by determining the power management strategy that minimizes fuel consumption and NOx emissions while executing a set drive cycle. A multi-objective optimization using a weighted-sum formulation is needed in order to observe the trade-off between the optimized objectives, and a Particle Swarm Optimization (PSO) algorithm has been used in this research to determine the trade-off curve between fuel and NOx. In these optimizations, the control signal, consisting of the engine speed and the reference battery SOC trajectory for a 2-hour cycle, is the controllable decision parameter supplied directly by the optimizer. Each element of the control signal was split into 50 distinct points representing the full 2 hours, giving slightly less than 2.5 minutes per point, with the values used in the model interpolated between the points at each time step. With the control signal consisting of two distinct 50-element time-variant signals, engine speed and SOC trajectory, a multidimensional problem was formulated for the optimizer. Novel approaches to balancing optimizer exploration and convergence, as well as seeding techniques, are suggested to solve the optimal control problem. Each optimization involved individual runs at five different weight levels, with the resulting cost populations compiled together and visualized through a Pareto front. The results of the simulations and optimization are presented, covering the performance of individual components of the PHEV powertrain as well as the optimized PMS strategy to follow for a given drive cycle, and the trade-offs observed in the multi-objective optimization are discussed.
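The weighted-sum formulation described above scalarizes the two objectives so a single-objective optimizer can be reused at several weight levels. The sketch below is a hedged illustration of that scheme using a minimal PSO with toy quadratic objectives standing in for the fuel and NOx costs; it is not the thesis's PHEV model or its modified PSO.

```python
# Hedged sketch: weighted-sum scalarization of two objectives optimized with a
# minimal particle swarm, swept over several weight levels to trace a Pareto front.
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, rng=None):
    rng = rng or np.random.default_rng(0)
    x = rng.uniform(-1, 1, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([cost(p) for p in x])
    gbest = pbest[pcost.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)  # velocity update
        x = x + v
        c = np.array([cost(p) for p in x])
        improved = c < pcost
        pbest[improved], pcost[improved] = x[improved], c[improved]
        gbest = pbest[pcost.argmin()]
    return gbest

f1 = lambda u: float(np.sum((u - 0.5) ** 2))      # toy "fuel" objective
f2 = lambda u: float(np.sum((u + 0.5) ** 2))      # toy "NOx" objective
front = []
for w in (0.1, 0.3, 0.5, 0.7, 0.9):               # five weight levels, as in the study
    u = pso(lambda c: w * f1(c) + (1 - w) * f2(c), dim=4)
    front.append((f1(u), f2(u)))                  # points approximating the Pareto front
print(front)
```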
17

Modeling, Estimation and Benchmarking of Lithium Ion Electric Bicycle Battery

Wang, Weizhong January 2016 (has links)
As a conventional transportation modality, bicycles have been gradually electrified to meet the desire for convenient and green commuting patterns, especially in developed urban areas. The electric bicycle battery pack and its management system are core elements that determine key performance metrics such as electric range and output power. With respect to electric bicycle applications, focused research on the battery, its management system, and performance has received less attention compared to other energy storage applications. In this thesis, a well-developed conversion kit produced by BionX is studied. A data collecting system is first installed to record both mechanical and electrical data, such as speed, power, and voltage; this enables defining two standard riding cycles at different riding conditions. Two benchmarking tests are performed to investigate the battery life in pure electric mode and at different threshold levels of optimal assistance. A novel quadratic-programming-based fitting algorithm is derived and applied in both time- and frequency-domain parameter identification tests. The proposed algorithm is able to fit single or multiple pulses by applying a masking vector. A sensitivity study and experimental results show the high robustness and fast computation time of the approach compared to existing and commonly used methods, such as fmincon. A comparison between hybrid pulse power characterization (HPPC) and electrochemical impedance spectroscopy (EIS) tests is performed in terms of the extracted internal resistance. A second-order RC battery model is developed using parameters extracted from the HPPC tests. The model is validated against experimental riding cycles and used to generate the reference SOC profiles employed in an SOC estimation study. Four estimation strategies, including the extended Kalman Filter (EKF), Sigma-point Kalman Filter (SPKF), Cubature Kalman Filter (CKF), and joint extended Kalman Filter (JEKF), are compared systematically in terms of accuracy, robustness, and computational complexity. / Thesis / Master of Applied Science (MASc)
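The second-order RC equivalent-circuit model mentioned above propagates the SOC and two polarization voltages forward in time; the SOC estimators then correct this prediction with voltage measurements. The sketch below is a hedged, generic discrete-time update of such a model, with placeholder parameter values and open-circuit-voltage curve rather than the identified BionX pack parameters.

```python
# Hedged sketch: one discrete-time step of a generic second-order RC battery model.
# Parameter values are illustrative placeholders, not the thesis's identified values.
import numpy as np

def rc2_step(soc, v1, v2, current, dt, capacity_As=36000.0,
             r0=0.05, r1=0.02, c1=2000.0, r2=0.04, c2=20000.0,
             ocv=lambda s: 3.4 + 0.8 * s):
    """current > 0 means discharge; v1, v2 are the RC-branch polarization voltages."""
    soc = soc - current * dt / capacity_As                 # coulomb counting
    a1, a2 = np.exp(-dt / (r1 * c1)), np.exp(-dt / (r2 * c2))
    v1 = a1 * v1 + r1 * (1 - a1) * current                 # exact ZOH update of RC branch 1
    v2 = a2 * v2 + r2 * (1 - a2) * current                 # exact ZOH update of RC branch 2
    v_term = ocv(soc) - v1 - v2 - r0 * current             # terminal voltage
    return soc, v1, v2, v_term

# Simulate a 10 A discharge pulse for 60 s at 1 s resolution
soc, v1, v2 = 0.9, 0.0, 0.0
for _ in range(60):
    soc, v1, v2, vt = rc2_step(soc, v1, v2, current=10.0, dt=1.0)
print(round(soc, 4), round(vt, 3))
```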
18

Investigation of a Simulated Annealing Cooling Schedule Used to Optimize the Estimation of the Fiber Diameter Distribution in a Peripheral Nerve Trunk

Vigeh, Arya 01 May 2011 (has links) (PDF)
In previous studies it was determined that the fiber diameter distribution in a peripheral nerve could be estimated by a simulation technique known as group delay. These results could be further improved using a combinatorial optimization algorithm called simulated annealing. This paper explores the structure and behavior of simulated annealing for the application of optimizing the group delay estimated fiber diameter distribution. Specifically, a set of parameters known as the cooling schedule is investigated to determine its effectiveness in the optimization process. Simulated annealing is a technique for finding the global minimum (or maximum) of a cost function which may have many local minima. The set of parameters which comprise the cooling schedule dictate the rate at which simulated annealing reaches its final solution. Converging too quickly can result in sub-optimal solutions while taking too long to determine a solution can result in an unnecessarily large computational effort that would be impractical in a real-world setting. The goal of this study is to minimize the computational effort of simulated annealing without sacrificing its effectiveness at minimizing the cost function. The cost function for this application is an error value computed as the difference in the maximum compound evoked potentials between an empirically-determined template distribution of fiber diameters and an optimized set of fiber diameters. The resulting information will be useful when developing the group delay estimation and subsequent simulated annealing optimization in an experimental laboratory setting.
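As described above, the cooling schedule controls how quickly the acceptance temperature decays. The sketch below is a hedged, generic simulated annealing loop with a geometric cooling schedule (initial temperature t0, decay factor alpha, iterations per temperature); a 1-D toy function stands in for the study's group-delay-based cost.

```python
# Hedged sketch: simulated annealing with a geometric cooling schedule.
import math, random

def simulated_annealing(cost, neighbor, x0, t0=10.0, alpha=0.95, iters_per_temp=50, t_min=1e-3):
    x, best = x0, x0
    t = t0
    random.seed(0)
    while t > t_min:                                      # cool until the temperature floor
        for _ in range(iters_per_temp):
            cand = neighbor(x)
            delta = cost(cand) - cost(x)
            if delta < 0 or random.random() < math.exp(-delta / t):  # Metropolis acceptance
                x = cand
                if cost(x) < cost(best):
                    best = x
        t *= alpha                                        # geometric cooling step
    return best

# Toy example: minimize a 1-D function with many local minima
cost = lambda x: x * x + 10 * math.sin(3 * x)
neighbor = lambda x: x + random.uniform(-0.5, 0.5)
print(round(simulated_annealing(cost, neighbor, x0=5.0), 3))
```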
19

A Hybrid Genetic Algorithm for Reinforced Concrete Flat Slab.

Sahab, M.G., Ashour, Ashraf, Toropov, V.V. 28 July 2009 (has links)
This paper presents a two-stage hybrid optimization algorithm based on a modified genetic algorithm. In the first stage, a global search is carried out over the design search space using a modified GA. The proposed modifications to the basic GA include dynamically changing the population size throughout the GA process and using different forms of the penalty function for constraint handling. In the second stage, a local search based on the genetic algorithm solution is executed using a discretized form of the Hooke and Jeeves method. The hybrid algorithm and the modifications to the basic genetic algorithm are examined on the design optimization of reinforced concrete flat slab buildings. The objective function is the total cost of the structure, including the cost of concrete, formwork, reinforcement, and foundation excavation. The constraints are defined according to the British Standard BS8110 for reinforced concrete structures. Comparative studies are presented to examine the effect of different genetic algorithm parameters on the optimized flat slab building. It has been shown that the proposed hybrid algorithm can improve genetic algorithm solutions at the expense of more function evaluations.
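Penalty functions, as used for constraint handling above, fold constraint violations into the objective so that infeasible designs lose out during selection. The sketch below is a hedged, generic illustration of that idea, with a toy cost and constraint in place of the BS8110 design checks and cost model used in the paper; the penalty weight could be varied during the run, as the modified GA described above does with its penalty forms.

```python
# Hedged sketch: penalty-function constraint handling for a GA fitness evaluation.
import numpy as np

def penalized_cost(design, cost, constraints, penalty_weight):
    """constraints: list of functions g(design) that are <= 0 when satisfied."""
    violation = sum(max(0.0, g(design)) for g in constraints)   # total constraint violation
    return cost(design) + penalty_weight * violation ** 2       # inflate cost of infeasible designs

# Toy example: minimize "material cost" x0 + x1 subject to x0 * x1 >= 4
cost = lambda d: d[0] + d[1]
constraints = [lambda d: 4.0 - d[0] * d[1]]            # g <= 0 means feasible
design = np.array([1.0, 1.0])                          # infeasible candidate
print(penalized_cost(design, cost, constraints, penalty_weight=10.0))  # 2 + 10 * 3**2 = 92
```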
20

Design and characterization of advanced diffractive devices for imaging and spectroscopy

Zhu, Yilin 18 January 2024 (has links)
Due to the ever-increasing demands of highly integrated optical devices in imaging, spectroscopy, communications, and so on, there is a compelling need to design and characterize novel compact photonic components. Traditional approaches, however, typically result in large footprints and sizable optical thicknesses. Moreover, they offer few degrees of freedom (DOF), hampering on-demand functionalities, on-chip integration, and scalability. This thesis addresses the design and development of ultracompact diffractive devices for imaging and spectroscopy, utilizing advanced machine learning techniques and optimization algorithms. I first present the inverse design of ultracompact dual-focusing lenses and broadband focusing spectrometers based on adaptive diffractive optical networks (a-DONs), which combine optical diffraction physics and deep learning capabilities for the inverse design of multi-layered diffractive devices. I designed two-layer diffractive devices that can selectively focus incident radiation over well-separated spectral bands at desired distances, and also optimized a-DON-based focusing spectrometers with engineered angular dispersion for desired bandwidth and nanometer spectral resolution. Furthermore, I introduced a new a-DON-based approach for engineering diffractive devices with arbitrary k-space, which yields improved imaging performance compared to contour-PSF approaches to lens-less computational imaging. Moreover, my method enables control of sparsity and isotropic k-space in pixelated screens of dielectric scatterers that are compatible with large-scale photolithographic fabrication techniques. Finally, by combining adjoint optimization with the rigorous generalized Mie theory, I developed and characterized functionalized compact devices, which I call "photonic patches," consisting of ~100 dielectric nanocylinders that achieve predefined functionalities such as beam steering, Fresnel zone focusing, and local density of states (LDOS) enhancement. My method enables the inverse design of ultracompact focusing spectrometers for on-chip planar integration. Leveraging multiple scattering of light in disordered random media, I additionally demonstrated a novel approach to on-chip spectroscopy driven by high-throughput multifractal (i.e., multiscale) media, achieving sub-nanometer spectral resolution within a 50×50 µm² footprint.
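Diffractive optical networks of the kind used above are trained by differentiating through a free-space propagation model between diffractive layers. The sketch below is a hedged illustration of that forward model (angular-spectrum propagation through one phase layer); the grid, wavelength, pitch, and distance are illustrative values, and this is not the thesis's a-DON implementation.

```python
# Hedged sketch: free-space propagation via the angular spectrum method, the
# forward model that diffractive optical networks differentiate through.
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a square 2-D complex field a distance z (units consistent with wavelength, dx)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                                  # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)                           # transfer function, evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a plane wave through a random phase mask (one "layer")
rng = np.random.default_rng(0)
phase_mask = np.exp(1j * rng.uniform(0, 2 * np.pi, (128, 128)))
out = angular_spectrum_propagate(np.ones((128, 128)) * phase_mask,
                                 wavelength=0.633e-6, dx=2e-6, z=1e-3)
print(out.shape)
```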
