101

Minimização de funções decomponíveis em curvas em U definidas sobre cadeias de posets -- algoritmos e aplicações / Minimization of decomposable in U-shaped curves functions defined on poset chains -- algorithms and applications

Marcelo da Silva Reis 28 November 2012 (has links)
O problema de seleção de características, no contexto de Reconhecimento de Padrões, consiste na escolha de um subconjunto X de um conjunto S de características, de tal forma que X seja "ótimo" dentro de algum critério. Supondo a escolha de uma função custo c apropriada, o problema de seleção de características é reduzido a um problema de busca que utiliza c para avaliar os subconjuntos de S e assim detectar um subconjunto de características ótimo. Todavia, o problema de seleção de características é NP-difícil. Na literatura existem diversos algoritmos e heurísticas propostos para abordar este problema; porém, quase nenhuma dessas técnicas explora o fato que existem funções custo cujos valores são estimados a partir de uma amostra e que descrevem uma "curva em U" nas cadeias do reticulado Booleano (P(S),<=), um fenômeno bem conhecido em Reconhecimento de Padrões: conforme aumenta-se o número de características consideradas, há uma queda no custo do subconjunto avaliado, até o ponto em que a limitação no número de amostras faz com que seguir adicionando características passe a aumentar o custo, devido ao aumento no erro de estimação. Em 2010, Ris e colegas propuseram um novo algoritmo para resolver esse caso particular do problema de seleção de características, que aproveita o fato de que o espaço de busca pode ser organizado como um reticulado Booleano, assim como a estrutura de curvas em U das cadeias do reticulado, para encontrar um subconjunto ótimo. Neste trabalho estudamos a estrutura do problema de minimização de funções custo cujas cadeias são decomponíveis em curvas em U (problema U-curve), provando que o mesmo é NP-difícil. Mostramos que o algoritmo de Ris e colegas possui um erro que o torna de fato sub-ótimo, e propusemos uma versão corrigida e melhorada do mesmo, o algoritmo U-Curve-Search (UCS). Apresentamos também duas variações do algoritmo UCS que controlam o espaço de busca de forma mais sistemática. Introduzimos dois novos algoritmos branch-and-bound para abordar o problema, chamados U-Curve-Branch-and-Bound (UBB) e Poset-Forest-Search (PFS). Para todos os algoritmos apresentados nesta tese, fornecemos análise de complexidade de tempo e, para alguns deles, também prova de corretude. Implementamos todos os algoritmos apresentados utilizando o arcabouço featsel, também desenvolvido neste trabalho; realizamos experimentos ótimos e sub-ótimos com instâncias de dados reais e simulados e analisamos os resultados obtidos. Por fim, propusemos um relaxamento do problema U-curve que modela alguns tipos de projeto de classificadores; também provamos que os algoritmos UCS, UBB e PFS resolvem esta versão generalizada do problema. / The feature selection problem, in the context of Pattern Recognition, consists in the choice of a subset X of a set S of features, such that X is "optimal" under some criterion. If we assume the choice of a proper cost function c, then the feature selection problem is reduced to a search problem, which uses c to evaluate the subsets of S, therefore finding an optimal feature subset. However, the feature selection problem is NP-hard.
Although there are a myriad of algorithms and heuristics to tackle this problem in the literature, almost none of those techniques explores the fact that there are cost functions whose values are estimated from a sample and describe a "U-shaped curve" in the chains of the Boolean lattice (P(S),<=), a well-known phenomenon in Pattern Recognition: for a fixed number of samples, increasing the number of considered features may have two consequences: if the available sample is enough for a good estimation, then a reduction of the estimation error should occur; otherwise, the lack of data induces an increase of the estimation error. In 2010, Ris et al. proposed a new algorithm to solve this particular case of the feature selection problem: their algorithm takes into account the fact that the search space may be organized as a Boolean lattice, as well as that the chains of this lattice describe a U-shaped curve, to find an optimal feature subset. In this work, we studied the structure of the minimization problem of cost functions whose chains are decomposable in U-shaped curves (the U-curve problem), and proved that this problem is actually NP-hard. We showed that the algorithm introduced by Ris et al. has an error that leads to suboptimal solutions, and proposed a corrected and improved version, the U-Curve-Search (UCS) algorithm. Moreover, to manage the search space in a more systematic way, we also presented two modifications of the UCS algorithm. We introduced two new branch-and-bound algorithms to tackle the U-curve problem, namely U-Curve-Branch-and-Bound (UBB) and Poset-Forest-Search (PFS). For each algorithm presented in this thesis, we provided a time complexity analysis and, for some of them, also a proof of correctness. We implemented each algorithm in the featsel framework, which was also developed in this work; we performed optimal and suboptimal experiments with instances from real and simulated data, and analyzed the results. Finally, we proposed a generalization of the U-curve problem that models some kinds of classifier design; we proved the correctness of the UCS, UBB, and PFS algorithms for this generalized version of the U-curve problem.
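To illustrate the structural property these algorithms exploit (a cost that first decreases and then increases along any chain of nested feature subsets), the sketch below defines a toy cost function with that shape and a naive walk that minimizes it along a single chain. It is only an illustration of the U-curve assumption, not the UCS, UBB, or PFS algorithms of the thesis; the penalty form and feature names are invented.

```python
S = ["f1", "f2", "f3", "f4", "f5"]

def cost(subset, n_samples=30):
    """Toy U-shaped cost: a 'structural error' term that drops as features are
    added, plus an estimation-error penalty that grows with subset size for a
    fixed sample. Purely illustrative, not a real classifier error estimator."""
    k = len(subset)
    structural_error = 1.0 / (1 + k)
    estimation_error = k ** 2 / n_samples
    return structural_error + estimation_error

def minimize_along_chain(chain):
    """Walk a chain of nested subsets (smallest first) and stop as soon as the
    cost increases; this early stop is valid only under the U-shape assumption."""
    best, best_cost = chain[0], cost(chain[0])
    for subset in chain[1:]:
        c = cost(subset)
        if c > best_cost:      # past the bottom of the U on this chain
            break
        best, best_cost = subset, c
    return best, best_cost

# One maximal chain of the Boolean lattice (P(S), <=): {} ⊂ {f1} ⊂ {f1,f2} ⊂ ...
chain = [tuple(S[:k]) for k in range(len(S) + 1)]
print(minimize_along_chain(chain))   # (('f1', 'f2'), ~0.467)
```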
102

Estratégias de resolução para o problema de job-shop flexível / Solution approaches for flexible job-shop scheduling problem

Wellington Donizeti Previero 16 September 2016 (has links)
Nesta tese apresentamos duas estratégias para resolver o problema de job-shop flexível com o objetivo de minimizar o makespan. A primeira estratégia utiliza um algoritmo branch and cut (B&C) e a segunda, abordagens matheuristics. O algoritmo B&C utiliza novas classes de inequações válidas, originalmente formuladas para o problema de job-shop e estendidas para o problema em questão. Para que as inequações válidas sejam eficientes, o modelo proposto por Birgin et al. (2014) (A MILP model for an extended version of the flexible job shop problem. Optimization Letters, Springer, v. 8, n. 4, 1417-1431) é reformulado (MILP-2). A segunda estratégia utiliza as matheuristics local branching e diversification, refining and tight-refining. Os experimentos computacionais mostraram que a inclusão dos planos de corte melhora a relaxação do modelo MILP-2 e a qualidade das soluções. O algoritmo B&C reduziu o gap e o número de nós explorados para uma grande quantidade de instâncias. As abordagens matheuristics tiveram um excelente desempenho. Do total de 59 instâncias analisadas, somente em 3 problemas a resolução do modelo MILP-1 obteve melhores resultados do que as abordagens matheuristics. / This thesis proposes two approaches to solve the flexible job-shop scheduling problem to minimize the makespan. The first strategy uses a branch and cut (B&C) algorithm and the second is based on matheuristics. The B&C algorithm uses new classes of valid inequalities, originally formulated for job-shop scheduling problems and extended to the problem at hand. For these valid inequalities to be effective, the precedence-variable-based model proposed by Birgin et al. (2014) (A MILP model for an extended version of the flexible job shop problem. Optimization Letters, Springer, v. 8, n. 4, 1417-1431) is reformulated (MILP-2). The second approach uses the matheuristics local branching and diversification, refining and tight-refining. The computational experiments showed that the inclusion of cutting planes tightened the linear programming relaxations and improved the quality of the solutions. The B&C algorithm reduced the gap value and the number of explored nodes in a large number of instances. The matheuristic approaches had an excellent performance. Of the 59 instances analyzed, MILP-1 solved with Gurobi produced better results than the matheuristic approaches in only 3 problems.
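As a small illustration of the problem being modeled (not of the MILP-2 formulation, the cuts, or the matheuristics themselves), the sketch below brute-forces a tiny flexible job-shop instance: every operation is assigned to one of its eligible machines and every precedence-respecting dispatch order is tried, so the minimum over all resulting semi-active list schedules is the optimal makespan. The instance data and names are invented.

```python
from itertools import permutations, product

# Tiny invented instance: each job is an ordered list of operations;
# each operation maps its eligible machines to processing times.
jobs = {
    "J1": [{"M1": 3, "M2": 5}, {"M2": 2, "M3": 4}],
    "J2": [{"M1": 4, "M3": 3}, {"M1": 2, "M2": 3}],
}
ops = [(j, i) for j, seq in jobs.items() for i in range(len(seq))]

def list_schedule(order, assign):
    """Semi-active list schedule: dispatch operations in 'order', starting each
    at the max of its machine's and its job's release times; return the makespan."""
    machine_free, job_free, makespan = {}, {}, 0
    for j, i in order:
        m = assign[(j, i)]
        start = max(machine_free.get(m, 0), job_free.get(j, 0))
        finish = start + jobs[j][i][m]
        machine_free[m], job_free[j] = finish, finish
        makespan = max(makespan, finish)
    return makespan

best = float("inf")
for order in permutations(ops):
    # keep only dispatch orders that respect the operation sequence within each job
    if any(order.index((j, i)) > order.index((j, i + 1))
           for j, seq in jobs.items() for i in range(len(seq) - 1)):
        continue
    for choice in product(*(jobs[j][i] for j, i in ops)):
        best = min(best, list_schedule(order, dict(zip(ops, choice))))

print("optimal makespan:", best)   # 5 for this toy instance
```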
103

Automatická detekce srdečních patologií pomocí vysokofrekvenčních složek komplexu QRS / Automatic detection of heart pathologies using high-frequency components of QRS complex

Daňová, Ľudmila January 2020 (has links)
The aim of this thesis is to analyse high-frequency ECG to detect some heart diseases. This is performed by averaging selected QRS complexes for each lead of the signal; these are then filtered in the range 500-1000 Hz. The envelope of the filtered signal is then computed and its peaks are detected. Based on the mutual positions of these peaks, it is possible to determine what kind of signal is being dealt with.
104

Automatická detekce srdečních patologií pomocí vysokofrekvenčních složek komplexu QRS / Automatic detection of heart pathologies using high-frequency components of QRS complex

Daňová, Ľudmila January 2021 (has links)
The aim of this thesis is to analyse high-frequency ECG to detect some heart diseases. This is performed by averaging selected QRS complexes for each lead of the signal; these are then filtered in the range 500-1000 Hz. The envelope of the filtered signal is then computed and its peaks are detected. Based on the mutual positions of these peaks, it is possible to determine what kind of signal is being dealt with.
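A minimal sketch of the signal chain described above, assuming an averaged ECG lead sampled well above 2 kHz (a hypothetical 5 kHz here, so the 500-1000 Hz band exists); the band-pass filter, Hilbert-transform envelope, and peak picking use standard SciPy calls and stand in for, rather than reproduce, the thesis implementation. Random noise is used as placeholder data.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, find_peaks

fs = 5000                      # assumed sampling rate [Hz]; must exceed 2 kHz
t = np.arange(0, 1.0, 1 / fs)
# Placeholder "averaged QRS complex": noise standing in for real ECG data.
ecg_lead = np.random.default_rng(0).normal(size=t.size)

# 1) Band-pass the averaged beat to the 500-1000 Hz high-frequency band.
sos = butter(4, [500, 1000], btype="bandpass", fs=fs, output="sos")
hf_qrs = sosfiltfilt(sos, ecg_lead)

# 2) Amplitude envelope via the Hilbert transform.
envelope = np.abs(hilbert(hf_qrs))

# 3) Detect envelope peaks; their mutual positions feed the classification step.
peaks, _ = find_peaks(envelope, height=0.5 * envelope.max(),
                      distance=int(0.01 * fs))
print("peak positions [ms]:", (peaks / fs * 1000).round(1))
```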
105

Branch Guided Metrics for Functional and Gate-level Testing

Acharya, Vineeth Vadiraj 31 March 2015 (has links)
With the increasing complexity of modern day processors and systems-on-chip (SoCs), designers invest a lot of time and resources into testing and validating these designs. To reduce the time-to-market and cost, the techniques used to validate these designs have to constantly improve. Since most of the design activity has moved to the register transfer level (RTL), test methodologies at the RTL have been gaining momentum. We present a novel framework for functional test generation at the RTL. A popular software-based metric for measuring the effectiveness of an RTL test suite is branch coverage, but exercising hard-to-reach branches is still a challenge and requires a good understanding of the design semantics. The proposed framework uses static analysis to extract certain semantics of the circuit and uses several data structures to model these semantics. Using these data structures, we assist the branch-guided search to exercise these hard-to-reach branches. Since the correlation between high branch coverage and detecting defects and bugs is not clear, we present a new metric at the RTL which augments RTL branch coverage with state values. Vectors which score higher on the new metric achieve higher branch and state coverage, and can therefore be applied at different levels of abstraction, such as post-silicon validation. Experimental results show that use of the new metric in our test generation framework can achieve a high level of branch and fault coverage for several benchmark circuits, while reducing the length of the vector sequence. This work was supported in part by the NSF grant 1016675. / Master of Science
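A rough sketch of the general idea of augmenting branch coverage with state information when ranking test vectors; the scoring formula, weighting, and trace format below are invented for illustration and are not the metric defined in the thesis.

```python
def coverage_score(trace, all_branches, weight=0.5):
    """Hypothetical vector-ranking metric: RTL branch coverage augmented with
    the diversity of design states visited during simulation.
    'trace' is a list of (branch_id, state) pairs observed for one test vector."""
    branches_hit = {b for b, _ in trace}
    states_seen = {s for _, s in trace}
    branch_cov = len(branches_hit) / len(all_branches)
    state_diversity = len(states_seen) / max(len(trace), 1)
    return branch_cov + weight * state_diversity

# Example: a short simulation trace over a design with 6 RTL branches.
trace = [("b1", "IDLE"), ("b2", "LOAD"), ("b2", "EXEC"), ("b5", "EXEC")]
print(coverage_score(trace, all_branches={"b1", "b2", "b3", "b4", "b5", "b6"}))
```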
106

Symbolische Interpretation Technischer Zeichnungen / Symbolic Interpretation of Technical Drawings

Bringmann, Oliver 19 January 2003 (has links) (PDF)
Scanned and vectorized technical drawings are automatically migrated into a high-quality data structure using a network of models. The models describe the contents of the drawings hierarchically and declaratively. Models for individual components of the drawings can be developed pairwise independently, which makes even very complex drawing classes such as electrical distribution networks or building plans accessible. The models are used by the new, so-called Y-algorithm: hypotheses about the interpretation of local drawing contents are generated hierarchically. Conflicts that arise when competing models are applied are recorded. Using this notion of conflict, consistent interpretations of a complete drawing can be defined abstractly and determined during the analysis of a concrete drawing. A probability-based quality measure evaluates each of these alternative global interpretations. Finding an interpretation that is optimal with respect to this measure is an NP-hard problem. A branch-and-bound algorithm provides an adequate solution.
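The following sketch conveys the flavor of that last step only: choosing a conflict-free set of local hypotheses that maximizes a probability-based score by branch and bound. The hypotheses, conflicts, and scoring are invented, and the code is neither the Y-algorithm nor the thesis implementation.

```python
# Hypothetical local interpretations (symbol -> confidence) and pairwise
# conflicts, standing in for competing model matches on one drawing region.
hypotheses = {"wall_A": 0.9, "door_A": 0.8, "window_A": 0.6, "pipe_B": 0.7}
conflicts = {frozenset(("door_A", "window_A")), frozenset(("wall_A", "pipe_B"))}

def consistent(selected, h):
    """A hypothesis may join an interpretation only if it conflicts with none of it."""
    return all(frozenset((h, s)) not in conflicts for s in selected)

def branch_and_bound(remaining, selected=frozenset(), value=0.0,
                     best=(frozenset(), 0.0)):
    """Maximize the summed confidence of a conflict-free interpretation.
    Prune when even accepting every remaining hypothesis cannot beat the incumbent."""
    if value > best[1]:
        best = (selected, value)
    if not remaining or value + sum(hypotheses[h] for h in remaining) <= best[1]:
        return best                          # bound: no improvement possible
    h, rest = remaining[0], remaining[1:]
    if consistent(selected, h):              # branch 1: accept h
        best = branch_and_bound(rest, selected | {h}, value + hypotheses[h], best)
    return branch_and_bound(rest, selected, value, best)   # branch 2: reject h

items = sorted(hypotheses, key=hypotheses.get, reverse=True)
print(branch_and_bound(items))   # best conflict-free interpretation and its score
```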
107

Pruning Deciduous Shade Trees

Davison, Elisabeth, DeGomez, Tom 04 1900 (has links)
Revised; Originally published: 1999 / 6 pp. / The pruning principles discussed in this publication have proven to provide the best possible outcomes, including tree longevity and safety. Although trees may live for years following improper pruning, their life span and safety may be severely reduced. We encourage proper pruning so that the trees we care for may bring us pleasure for many years.
108

Post-Main Sequence Habitability for Outer Solar System Moons / Habitability in the future Outer Solar System

Sparrman, Viktor January 2022 (has links)
The search for extra-terrestrial life is guided by the classification of promising candidate worlds. In this classification, the habitable zone acts as a measure of the perceived habitability of a circumstellar body. Habitable zone definitions vary between a conservative and an optimistic limit. As the Sun progresses through stages of stellar evolution, previously uninhabitable outer moons may receive sufficient heating for liquid water to exist on their surfaces. To evaluate the possibility of life on these moons, the time inside the habitable zone is calculated and compared to estimates of the time it took for life to develop on Earth. For these calculations the stellar evolution models of PARSEC and Dartmouth are employed. A class of moons is discovered whose time inside the habitable zone is longest during the horizontal branch evolutionary phase (fueled by helium burning in the core). Since the horizontal branch luminosity is near constant, this class is of particular interest because it is less dependent on a stabilizing climate mechanism that regulates atmospheric composition to counteract luminosity changes. Ultimately, it is found that regardless of moon, stellar evolution model, and habitable zone definition, no post-main sequence time inside the habitable zone is as long as the time it took for life to arise on Earth. / Research presentation
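As a back-of-the-envelope illustration of that calculation (not the PARSEC or Dartmouth pipeline used in the thesis), the habitable zone boundaries can be scaled with the square root of the stellar luminosity and intersected with a luminosity track; the flux limits and the toy post-main-sequence track below are invented placeholders.

```python
import numpy as np

# Illustrative habitable-zone flux limits in units of the present solar constant
# (in the spirit of commonly used conservative limits; treat them as assumptions).
S_INNER, S_OUTER = 1.1, 0.36

def hz_bounds(L):
    """Inner/outer habitable-zone distance [AU] for luminosity L [L_sun]:
    d = sqrt(L / S_eff), from the inverse-square law."""
    return np.sqrt(L / S_INNER), np.sqrt(L / S_OUTER)

def time_in_hz(times_gyr, lum_lsun, a_au):
    """Total time [Gyr] a body at fixed orbital distance a_au spends inside the
    habitable zone, for a uniformly sampled luminosity track."""
    inner, outer = hz_bounds(np.asarray(lum_lsun))
    inside = (a_au >= inner) & (a_au <= outer)
    dt = times_gyr[1] - times_gyr[0]
    return inside.sum() * dt

# Toy post-main-sequence track: luminosity climbing from 1 to ~2000 L_sun.
t = np.linspace(10.0, 12.5, 500)                          # stellar age [Gyr]
L = 10 ** np.interp(t, [10.0, 12.0, 12.5], [0.0, 2.0, 3.3])
print("Saturn-distance orbit (9.5 AU):", round(time_in_hz(t, L, 9.5), 2), "Gyr")
```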
109

FIXED ORDER BRANCH AND BOUND METHODS FOR MIXED-INTEGER PROGRAMMING.

SINGHAL, JAYA ASTHANA. January 1982 (has links)
The aim of this dissertation is to present an algorithm for mixed integer programs which, when started with a good heuristic solution, can find improved solutions and reduce the error estimate as quickly as possible. This is achieved by using two ideas: a fixed order branch-and-bound method with selective expansion of subproblems, and the sieve strategy, which uses stronger-than-optimal bounds. The fixed order branch-and-bound method with selective expansion of subproblems is effective in reducing the error estimate quickly, whereas the sieve strategy is effective both in reducing the error estimate and in finding improved solutions quickly. Computational experience is reported.
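To illustrate the sieve idea in isolation (pruning with a threshold stronger than the incumbent, so the search terminates faster at the price of a bounded error), the sketch below applies it to a small 0-1 knapsack branch and bound. The problem choice, bounding rule, and fixed ratio order are illustrative assumptions, not the dissertation's mixed-integer method.

```python
def branch_and_bound_sieve(values, weights, capacity, delta=0.0):
    """0-1 knapsack (maximization) by depth-first branch and bound over a fixed
    item order. The 'sieve' prunes any node whose relaxation bound does not
    exceed the incumbent by more than delta; with delta > 0 the search finishes
    sooner and the returned value is guaranteed to be within delta of optimal.
    Illustrative only; not the fixed-order scheme of the dissertation."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(k, value, room):
        # Fractional (LP) relaxation over the items not yet decided.
        for i in order[k:]:
            if weights[i] <= room:
                room -= weights[i]
                value += values[i]
            else:
                return value + values[i] * room / weights[i]
        return value

    best = 0
    stack = [(0, 0, capacity)]           # (position in fixed order, value, room)
    while stack:
        k, value, room = stack.pop()
        best = max(best, value)
        if k == n or bound(k, value, room) <= best + delta:
            continue                     # sieve cut: bound not convincingly better
        i = order[k]
        stack.append((k + 1, value, room))                 # exclude item i
        if weights[i] <= room:                             # include item i
            stack.append((k + 1, value + values[i], room - weights[i]))
    return best

print(branch_and_bound_sieve([60, 100, 120], [10, 20, 30], 50, delta=0.0))  # 220
```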
110

Mathematical Programming Algorithms for Reliable Routing and Robust Evacuation Problems

Andreas, April Kramer January 2006 (has links)
Most traditional routing problems assume perfect operability of all arcs and nodes. However, when independent arc failure probabilities exist, a secondary objective must be present to retain some measure of expected functionality. We first briefly consider the reliability-constrained single-path problem, where we look for the lowest-cost path that meets a reliability side constraint. This analysis enables us to then examine the reliability-constrained two-path problem, which seeks to establish two minimum-cost paths between a source and destination node wherein at least one path must remain fully operable with some threshold probability. We consider the case in which both paths must be arc-disjoint and the case in which arcs can be shared between the paths. We prove both problems to be NP-hard. We examine strategies for solving the resulting nonlinear integer program, including pruning, coefficient tightening, lifting, and branch-and-bound partitioning schemes. Next, we consider the reliable h-path routing problem, which seeks a minimum-cost set of h ≥ 2 arc-independent paths between a source and destination node, such that the probability that at least one path remains operational is sufficiently large. Our prior arc-based models and algorithms tailored for the case in which h = 2 do not extend well to the general h-path problem. Thus, we propose two alternative integer programming formulations for the h-path problem in which the variables correspond to origin-destination paths. We propose two branch-and-price-and-cut algorithms for solving these new formulations, and provide computational results to demonstrate the efficiency of these algorithms. Finally, we examine the robust design of an evacuation tree, in which evacuation is subject to capacity restrictions on arcs. Given a discrete set of disaster scenarios with varying network populations, arc capacities, transit times, and time-dependent penalty functions, we seek to establish an optimal a priori evacuation tree that minimizes the expected evacuation penalty. The solution strategy is based on Benders decomposition, and we provide efficient methods for obtaining primal and dual sub-problem solutions. We analyze techniques for strengthening the master problem formulation, thus reducing the number of master problem solutions required for the algorithm's convergence.
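For the arc-disjoint two-path case, the reliability side constraint has a simple closed form under independent arc failures: each path survives with the product of its arc operability probabilities, and at least one of two disjoint paths survives with probability 1 - (1 - r1)(1 - r2). The sketch below evaluates that check for a hypothetical pair of paths; it illustrates the constraint only, not the thesis's integer programming formulations or algorithms.

```python
from math import prod

def path_reliability(path_arcs, arc_rel):
    """Probability that every arc of the path stays operable (independent arcs)."""
    return prod(arc_rel[a] for a in path_arcs)

def two_path_reliability(path1, path2, arc_rel):
    """Probability that at least one of two ARC-DISJOINT paths remains operable."""
    r1 = path_reliability(path1, arc_rel)
    r2 = path_reliability(path2, arc_rel)
    return 1 - (1 - r1) * (1 - r2)

# Hypothetical arcs with operability probabilities on a source-to-sink network.
arc_rel = {"sa": 0.95, "at": 0.90, "sb": 0.85, "bt": 0.92}
print(two_path_reliability(["sa", "at"], ["sb", "bt"], arc_rel))   # ≈ 0.968
```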
