51

Modelos computacionais para simulações de tomografia por impedância elétrica e sua aplicação no problema de determinação da fração de ejeção cardíaca

Ribeiro, Marcos Henrique Fonseca 03 October 2016 (has links)
Electrical Impedance Tomography (EIT) is a technique in which images are constructed from measurements of the electrical potential at points on the external boundary of a given domain, caused by the injection of an electrical current into that domain. By knowing or estimating the electrical conductivity of regions inside the domain, geometric aspects of its composition can be inferred. Works in the literature apply this technique to imaging the human thorax, with the objective of estimating the geometry of the cardiac cavities of a specific patient. The final goal of this work, within that context, is to propose a methodology for estimating the Cardiac Ejection Fraction, or simply Ejection Fraction (EF), which is the percentage of the blood volume expelled from the ventricles at the end of a heartbeat cycle. This work builds on previous works that model the problem described above as an inverse problem, an optimization problem in which the aim is to minimize the difference between measured electrical potentials and values obtained through simulation with computational models. This evolution occurs at two levels.
At the first level, an improvement is made over pre-existing optimization techniques for the solution of the inverse problem in its two-dimensional formulation: a metaheuristic is proposed that guides search methods toward more accurate values. This metaheuristic is presented in sequential and parallel versions, and computational results of the tests performed at this level are reported. At the second level, the approaches found in the literature, which for the specific application of determining the EF had so far been limited to two-dimensional models, are remodelled in three dimensions. The whole problem is revisited under a new modelling proposal, which includes the creation of three-dimensional geometric models for the regions of interest. The main contribution at this second level is a parameterization scheme for the polygon meshes that model the heart ventricles, which yields a compact representation of those meshes and, at the same time, reduces the computational cost of the optimization method through a drastic reduction in the number of problem variables. Finally, a preliminary study of the sensitivity of the technique to noise in the input data is also performed.
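The abstract describes an inverse problem: minimize the difference between measured and simulated electrical potentials. A minimal sketch of that formulation, assuming a toy 1D resistor chain in place of the thesis's 2D/3D finite-element models (the names `forward` and `true_r`, the injected current, and the learning rate are all illustrative assumptions, not values from the thesis):

```python
import numpy as np

I = 1.0  # injected current (arbitrary units)

def forward(r):
    # boundary potentials along a 1D resistor chain: v_k = I * sum_{i<=k} r_i
    return I * np.cumsum(r)

true_r = np.array([0.5, 2.0, 1.0, 0.25])  # hypothetical internal resistivities
v_meas = forward(true_r)                  # stands in for measured potentials

def loss(r):
    return np.sum((forward(r) - v_meas) ** 2)

r = np.ones_like(true_r)  # initial guess
for _ in range(3000):
    resid = forward(r) - v_meas
    grad = 2 * I * np.cumsum(resid[::-1])[::-1]  # analytic least-squares gradient
    r -= 0.05 * grad

sigma = 1.0 / r  # recovered conductivities
```

Because this toy forward model is linear in the resistivities, the least-squares objective is convex and plain gradient descent recovers them exactly; the thesis's models are nonlinear in the conductivities, which is why metaheuristic optimization is brought in.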
52

VEHICLE ROUTING PROBLEM WITH OCCASIONAL DRIVERS FOR E-COMMERCE LAST-MILE DELIVERY: A METAHEURISTIC APPROACH

MATHEUS OLIVEIRA MEIRIM 25 September 2023 (has links)
In recent years, e-commerce has become widespread, and product delivery logistics is a crucial pillar for this market to maintain a high level of service and remain advantageous for consumers who choose to buy online. The present work studies the last-mile vehicle routing problem for e-commerce deliveries and applies an Iterated Local Search (ILS) metaheuristic to optimize the routing of parcels in a Brazilian e-commerce company. With the objective of finding the lowest-cost routes for the deliveries, this study proposes an extension of the Vehicle Routing Problem with Occasional Drivers (VRPOD) that considers a heterogeneous fleet and occasional drivers handling multiple deliveries. The method is applied to data provided by an e-commerce company, properly anonymized so that neither the company nor its clients can be identified, respecting ethical principles. A total of 121 instances are used, ranging from one vertex in the smallest to 344 in the largest. The results of the proposed model are presented in two scenarios: the first considers routing without occasional drivers; the second considers the availability of occasional drivers for some routes. Both scenarios are compared with the routes generated by the router currently used in the company. Preliminary results indicate that without occasional drivers the proposed ILS obtains better solutions in 53.72 percent of the instances, and that when occasional drivers are incorporated into the routes, improvements occur in 76.03 percent of the instances. The use of occasional drivers also provides a 10.30 percent reduction in the average routing cost.
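Iterated Local Search, the metaheuristic applied above, alternates a local search with a perturbation of the incumbent solution. A minimal sketch on a toy routing instance (random points, 2-opt local search, double-bridge perturbation; the instance, operators, and iteration counts are illustrative choices, not the thesis's implementation):

```python
import random, math

random.seed(1)
# hypothetical delivery points (index 0 plays the depot)
pts = [(random.random(), random.random()) for _ in range(15)]

def dist(i, j):
    return math.dist(pts[i], pts[j])

def tour_len(t):
    return sum(dist(t[k], t[(k + 1) % len(t)]) for k in range(len(t)))

def two_opt(t):
    # local search: reverse segments while any reversal shortens the tour
    improved = True
    while improved:
        improved = False
        for i in range(1, len(t) - 1):
            for j in range(i + 1, len(t)):
                cand = t[:i] + t[i:j][::-1] + t[j:]
                if tour_len(cand) < tour_len(t):
                    t, improved = cand, True
    return t

def double_bridge(t):
    # ILS perturbation: cut the tour into four pieces and reconnect them
    a, b, c = sorted(random.sample(range(1, len(t)), 3))
    return t[:a] + t[c:] + t[b:c] + t[a:b]

best = two_opt(list(range(len(pts))))
for _ in range(50):  # the iterated local search loop
    cand = two_opt(double_bridge(best))
    if tour_len(cand) < tour_len(best):
        best = cand
```

The double-bridge move is the classic ILS perturbation for routing: it is large enough to escape 2-opt local optima but preserves most of the tour structure.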
53

Multiscale Methods in Image Modelling and Image Processing

Alexander, Simon January 2005 (has links)
The field of modelling and processing of 'images' has fairly recently become important, even crucial, to areas of science, medicine, and engineering. The inevitable explosion of imaging modalities and approaches stemming from this fact has become a rich source of mathematical applications.

'Imaging' is quite broad, and suffers somewhat from this broadness. The general question of 'what is an image?' or perhaps 'what is a natural image?' turns out to be difficult to address. To make real headway one may need to strongly constrain the class of images being considered, as will be done in part of this thesis. On the other hand, there are general principles that can guide research in many areas. One such principle considered here is the assertion that (classes of) images have multiscale relationships, whether at the pixel level, between features, or in other variants. There are both practical reasons (in terms of computational complexity) and more philosophical ones (mimicking the human visual system, for example) that suggest looking at such methods. Looking at scaling relationships may also have the advantage of opening a problem up to many mathematical tools.

This thesis will detail two investigations into multiscale relationships, in quite different areas. One will involve Iterated Function Systems (IFS), and the other a stochastic approach to reconstruction of binary images (binary phase descriptions of porous media). The use of IFS in this context, often called 'fractal image coding', has primarily been viewed as an image compression technique. We will revisit this approach, proposing it as a more general tool. Some study of the implications of that idea will be presented, along with applications suggested by the results. In the area of reconstruction of binary porous media, a novel multiscale, hierarchical annealing approach is proposed and investigated.
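An Iterated Function System can be sampled with the classic "chaos game": repeatedly apply a randomly chosen contraction, and the iterates settle onto the system's attractor. A self-contained sketch for the three-map Sierpinski IFS (a standard textbook example, not code from the thesis):

```python
import random

# IFS for the Sierpinski triangle: three contractions w_i(x) = (x + v_i) / 2,
# each mapping the unit square halfway toward one vertex
vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

random.seed(0)
x, y = 0.5, 0.5
points = []
for n in range(20000):
    vx, vy = random.choice(vertices)
    x, y = (x + vx) / 2, (y + vy) / 2  # apply a randomly chosen contraction
    if n > 100:                        # discard the transient before the attractor
        points.append((x, y))
```

Fractal image coding inverts this picture: instead of sampling a known IFS, it searches for contractions whose attractor approximates a given image (the collage theorem bounds the approximation error).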
54

Random Function Iterations for Stochastic Feasibility Problems

Hermer, Neal 24 January 2019 (has links)
No description available.
55

Ismailova, Rita 01 September 2012 (has links) (PDF)
The subject of this thesis is the study of cryptographic hash functions that utilize block ciphers as underlying chain functions. It is mainly concerned with the analysis of three hash algorithms: Whirlpool, Grøstl, and Grindahl. All of these hash functions have underlying block ciphers that are modified versions of the Advanced Encryption Standard, and we investigate the behavior of these block ciphers under the integral attack. Statistical tests, such as the avalanche test and the collision test, are the regular tools for examining hash function security. In this work, we inspect the statistical behavior of the three hash functions and search for collisions. Although it is very difficult to obtain collisions for the actual algorithms, we find some collisions under slight modifications of the original constructions. The ease or difficulty of finding a collision for a modified version also indicates the relative importance of the specific hash function component that is missing in that modified version.
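The avalanche test mentioned above flips single input bits and checks that, on average, about half of the output bits change. A sketch using SHA-256 from the standard library as a stand-in for Whirlpool, Grøstl, or Grindahl (the message and trial count are arbitrary choices):

```python
import hashlib
import random

def hamming(a: bytes, b: bytes) -> int:
    # number of differing bits between two equal-length digests
    return bin(int.from_bytes(a, "big") ^ int.from_bytes(b, "big")).count("1")

def avalanche(msg: bytes, trials: int = 200) -> float:
    random.seed(42)
    h0 = hashlib.sha256(msg).digest()
    total = 0
    for _ in range(trials):
        i = random.randrange(len(msg) * 8)   # pick a random input bit
        flipped = bytearray(msg)
        flipped[i // 8] ^= 1 << (i % 8)      # flip exactly that bit
        total += hamming(h0, hashlib.sha256(bytes(flipped)).digest())
    return total / trials  # ideally close to 128 of the 256 output bits

score = avalanche(b"the avalanche criterion in a nutshell")
```

A score far from half the digest length would signal that input differences propagate poorly through the chain function, which is exactly the kind of structural weakness the statistical tests probe for.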
56

On the Logic of Theory Change : Extending the AGM Model

Fermé, Eduardo January 2011 (has links)
This thesis consists of six articles and a comprehensive summary. • The purpose of the summary is to introduce the AGM theory of belief change and to exemplify the diversity and significance of the research that has been inspired by the AGM article in the last 25 years. The research areas associated with AGM are divided into three parts: criticisms, where we discuss some of the more common criticisms of AGM; extensions, where the most common extensions and variations of AGM are presented; and applications, where we provide an overview of applications and connections with other areas of research. • Article I elaborates on the connection between partial meet contractions [AGM85] and kernel contractions [Han94a] in belief change theory. Although both functions are equivalent for belief sets, they are not equivalent for belief bases. A way to define incision functions (used in kernel contractions) from selection functions (used in partial meet contractions) and vice versa is presented. It is explained under which conditions there are exact correspondences between selection and incision functions, so that the same contraction operations can be obtained using either of them. • Article II proposes an axiomatic characterization for ensconcement-based contraction functions, the belief base functions proposed by Williams, and relates them to other kinds of base contraction functions. • Article III adapts the Fermé and Hansson model of Shielded Contraction [FH01], as well as Hansson et al.'s Credibility-Limited Revision [HFCF01], to belief bases, joining two of the many variations of the AGM model [AGM85]: those in which knowledge is represented through belief bases instead of logical theories, and those in which the object of the epistemic change does not get priority over the existing information, as is the case in the AGM model.
• Article IV introduces revision by comparison, a refined method for changing beliefs by specifying constraints on the relative plausibility of propositions. Like the earlier belief revision models, the proposed method is qualitative, in the sense that no numbers are needed to specify the posterior plausibility of the new information. The method uses reference beliefs to determine the degree of entrenchment of the newly accepted piece of information. Two kinds of semantics for this idea are proposed and a logical characterization of the new model is given. • Article V focuses on the extension of AGM that allows a belief base to be changed by a set of sentences instead of a single sentence. In [FH94], Fuhrmann and Hansson presented an axiomatic characterization for Multiple Contraction and a construction based on AGM Partial Meet Contraction. This essay proposes another way to construct functions for their model: Multiple Kernel Contraction, a modification of Kernel Contraction, proposed by Hansson [Han94a] to construct classical AGM contractions and belief base contractions. • Article VI relates the AGM model to the DFT model proposed by Carlos Alchourrón [Alc93]. Alchourrón devoted his last years to the analysis of the notion of defeasible conditionalization. His definition of the defeasible conditional is given in terms of a strict implication operator and a modal operator f, which is interpreted as a revision function at the language level. This essay points out that this underlying revision function is more general than AGM revision. In addition, a complete characterization of that more general kind of revision is given, which permits unifying the revision models given by other authors.
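Kernel contraction [Han94a], discussed in Article I, removes from each minimal alpha-implying subset of the base (each "kernel") a formula chosen by an incision function. A toy propositional sketch, assuming formulas are written as Python boolean expressions over a fixed set of atoms (a miniature illustration of the idea, not the article's constructions):

```python
from itertools import combinations, product

ATOMS = ["p", "q", "r"]

def satisfying(formulas):
    # truth assignments over ATOMS that satisfy every formula in the set
    out = []
    for vals in product([False, True], repeat=len(ATOMS)):
        env = dict(zip(ATOMS, vals))
        if all(eval(f, {}, env) for f in formulas):
            out.append(env)
    return out

def entails(base, alpha):
    return all(eval(alpha, {}, m) for m in satisfying(base))

def kernels(base, alpha):
    # minimal subsets of `base` that entail `alpha`
    found = []
    for r in range(1, len(base) + 1):
        for sub in combinations(sorted(base), r):
            if entails(sub, alpha) and not any(set(k) <= set(sub) for k in found):
                found.append(sub)
    return found

def kernel_contraction(base, alpha, incision=min):
    # cut, from each kernel, the formula chosen by the incision function
    cut = {incision(k) for k in kernels(base, alpha)}
    return set(base) - cut

B = {"p", "q", "(not p) or q"}   # toy belief base; "(not p) or q" plays p -> q
B_minus_q = kernel_contraction(B, "q")
```

Contracting B by q cuts q itself plus one formula from the kernel {p, (not p) or q}, so the result no longer entails q — the success postulate in miniature.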
58

Iterated tabu search and its modifications for the travelling salesman problem

Eimontienė, Ieva 16 August 2007 (has links)
In this work, one of the heuristic algorithms, the iterated tabu search (ITS), and its modifications are discussed. The proposed modifications rely on certain solution mutation (reordering) procedures, such as inversions and insertions, which improve the quality of the solutions obtained. The work is organized as follows: first, some basic definitions and preliminaries are given; then, the iterated tabu search algorithm and its mutation-based variants are considered in more detail. The ITS algorithm and its modifications were tested on travelling salesman problem instances from the TSP library TSPLIB, and the results of these experiments are presented; they confirm the superiority of the proposed modifications over other ITS variants. The work is completed with conclusions.
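Iterated tabu search wraps a tabu-search core in an outer loop that mutates the incumbent tour (here by inversion, one of the mutation procedures the thesis considers) and restarts. A compact sketch on a random toy instance (instance size, tabu tenure, and iteration counts are arbitrary choices, not the thesis's settings):

```python
import random, math

random.seed(7)
pts = [(random.random(), random.random()) for _ in range(10)]  # toy TSP instance

def length(t):
    return sum(math.dist(pts[t[i]], pts[t[(i + 1) % len(t)]]) for i in range(len(t)))

def tabu_search(tour, iters=80, tenure=7):
    best, tabu = tour[:], {}
    for step in range(iters):
        cand_best, move_best = None, None
        for i in range(1, len(tour)):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                # skip tabu moves unless they beat the best tour (aspiration)
                if tabu.get((i, j), 0) > step and length(cand) >= length(best):
                    continue
                if cand_best is None or length(cand) < length(cand_best):
                    cand_best, move_best = cand, (i, j)
        tour = cand_best                 # accept the best admissible neighbor
        tabu[move_best] = step + tenure  # forbid undoing this move for a while
        if length(tour) < length(best):
            best = tour[:]
    return best

best = tabu_search(list(range(len(pts))))
for _ in range(8):  # the iterated layer: mutate the incumbent and re-run
    i, j = sorted(random.sample(range(1, len(pts)), 2))
    cand = tabu_search(best[:i] + best[i:j][::-1] + best[j:])  # inversion mutation
    if length(cand) < length(best):
        best = cand
```

Unlike plain local search, the tabu core may accept worsening moves, so the tabu list is what prevents it from cycling back to tours it just left.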
59

A Search for Maximal Diversity Amongst Paired Prisoner's Dilemma Strategies

von Keitz, Michael 21 December 2011 (has links)
Previous research has identified linear boundaries within a normalized unit square for specific paired strategies within the iterated prisoner's dilemma schema. In this work, general methods of capturing linear boundaries are developed and demonstrated on a wider variety of paired strategies. The method is also tested using an alternate scoring method. An application of Burnside's Lemma simplifies the number of neighbourhood configurations to be considered. In addition, Shannon entropy is used as a means of evaluating diversity of agents evolved with different payoff matrices, by which one might locate a game that is as balanced as possible.
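The iterated prisoner's dilemma underlying this work pits memoryful strategies against each other over repeated rounds of the standard payoff matrix. A minimal sketch (strategies here see only the opponent's previous move; the payoff values 3/5/1/0 are the conventional choice, not necessarily the matrices evolved in the thesis):

```python
# strategies map the opponent's previous move ("C"/"D", or None on round one)
def tit_for_tat(prev):   return "C" if prev is None else prev
def always_defect(prev): return "D"
def always_coop(prev):   return "C"

# standard prisoner's dilemma payoffs: (player 1 score, player 2 score)
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(s1, s2, rounds=100):
    prev1 = prev2 = None  # opponent's last move, as seen by each player
    score1 = score2 = 0
    for _ in range(rounds):
        m1, m2 = s1(prev1), s2(prev2)
        p1, p2 = PAYOFF[(m1, m2)]
        score1, score2 = score1 + p1, score2 + p2
        prev1, prev2 = m2, m1
    return score1, score2
```

Against an unconditional defector, tit-for-tat loses only the first round; against itself it sustains mutual cooperation for the whole match.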
60

State Complexity of Tree Automata

PIAO, XIAOXUE 04 January 2012 (has links)
Modern applications of XML use automata operating on unranked trees. A common definition of tree automata operating on unranked trees uses a set of vertical states that define the bottom-up computation, and the transitions on vertical states are determined by so called horizontal languages recognized by finite automata on strings. The bottom-up computation of an unranked tree automaton may be either deterministic or nondeterministic, and further variants arise depending on whether the horizontal string languages defining the transitions are represented by DFAs or NFAs. There is also an alternative syntactic definition of determinism introduced by Cristau et al. It is known that a deterministic tree automaton with the smallest total number of states does not need to be unique nor have the smallest possible number of vertical states. We consider the question by how much we can reduce the total number of states by introducing additional vertical states. We give an upper bound for the state trade-off for deterministic tree automata where the horizontal languages are defined by DFAs, and a lower bound construction that, for variable sized alphabets, is close to the upper bound. We establish upper and lower bounds for the state complexity of conversions between different types of deterministic and nondeterministic unranked tree automata. The bounds are, usually, tight for the numbers of vertical states. Because a minimal deterministic unranked tree automaton need not be unique, establishing lower bounds for the number of horizontal states, that is, the combined size of DFAs used to define the horizontal languages, is challenging. Based on existing lower bound results for unambiguous finite automata we develop a lower bound criterion for the number of horizontal states. We consider the state complexity of operations on regular unranked tree languages. The concatenation of trees can be defined either as a sequential or a parallel operation. 
Furthermore, there are two essentially different ways to iterate sequential concatenation. We establish tight state complexity bounds for concatenation-like operations. In particular, for sequential concatenation and bottom-up iterated concatenation the bounds differ by an order of magnitude from the corresponding state complexity bounds for regular string languages. / Thesis (Ph.D., Computing) -- Queen's University, 2012
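An unranked tree automaton of the kind described above pairs vertical states with horizontal languages over strings of child states. A sketch in which the horizontal languages are regular expressions (this toy automaton accepts exactly the all-`a` trees; the encoding is an illustrative assumption, not the thesis's formalism):

```python
import re

# Deterministic unranked tree automaton: each node label maps to a list of
# (horizontal language over vertical states, resulting vertical state) pairs.
# For determinism, the horizontal languages under one label must not overlap.
TRANSITIONS = {
    "a": [(re.compile(r"1*"), "1")],  # an a-node is accepted if all children are
    "b": [],                          # b-nodes have no transition: reject
}

def run(tree):
    # tree = (label, [subtrees]); returns the vertical state, or None (reject)
    label, children = tree
    child_states = [run(c) for c in children]
    if None in child_states:
        return None
    word = "".join(child_states)      # the bottom-up "horizontal" word
    for horiz, state in TRANSITIONS.get(label, []):
        if horiz.fullmatch(word):
            return state
    return None

def accepts(tree):
    return run(tree) == "1"

t_good = ("a", [("a", []), ("a", [("a", [])])])
t_bad  = ("a", [("b", [])])
```

The state complexity questions the thesis studies concern exactly the two size measures visible here: the number of vertical states and the combined size of the string automata (here, regexes) defining the horizontal languages.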
