531

Electromagnetic Modelling for the Estimation of Wood Parameters

Sjödén, Therese January 2008 (has links)
Spiral grain in trees causes trouble to the wood industry, since boards sawn from trees with large grain angle have severe problems with form stability. Measuring the grain angle under the bark enables optimisation of the refining process. The main objective of this thesis is to study the potential of estimating the grain angle using microwaves. To do this, electromagnetic modelling and sensitivity analysis are combined.

The dielectric properties of wood differ along and perpendicular to the wood fibres. This anisotropy is central to the estimation of the grain angle by means of microwaves. To estimate the grain angle, measurements are used together with electromagnetic modelling of the scattering from plane surfaces and cylinders. Measurement set-ups are proposed to determine the material parameters, such as the grain angle, for plane boards and cylindrical logs. For cylindrical logs, both near-field and far-field measurements are investigated. In general, methods for determining material parameters exhibit large errors in the presence of noise. In this case, acceptable error levels are achieved by using few material parameters in the model: the grain angle and two dielectric parameters, characterising the electrical properties parallel and perpendicular to the fibres.

From the case with plane boards, it is concluded that the anisotropy of wood can be exploited to estimate the grain angle from the reflected electromagnetic field. This property then forms the basis of the proposed methods for estimating the grain angle in cylindrical logs. The proposed methods require no a priori knowledge of the moisture content or temperature of the wood. Furthermore, since the anisotropy persists in frozen wood, the methods remain valid for temperatures below zero degrees Celsius.

For the case with cylindrical logs, sensitivity analysis is applied to both the near-field and the far-field methods, to analyse the parameter dependence of the measurement model and the errors introduced by noise. In this sensitivity analysis, the Cramér-Rao bound is used, giving the best possible variance for estimating the parameters. The levels of the error bounds are high, indicating a difficult estimation problem. However, the feasibility of accurate estimation improves with higher signal-to-noise ratios, repeated measurements, and better antenna gain. The sensitivity analysis is also useful as an analytical tool for understanding the difficulties and remedies related to the method used for determining material parameters, as well as a practical aid in the design of a measurement set-up.

According to the thesis, grain angle estimation is possible with microwaves. The proposed methods are fast and suitable for further development for in-field use in the forest or in saw mills.
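For reference, the Cramér-Rao bound invoked above has the standard form below; this is the general statement, not the thesis's specific measurement model. For an unbiased estimator of the parameter vector (here, the grain angle plus the two dielectric parameters):

```latex
% Cramér-Rao bound: the covariance of any unbiased estimator \hat{\theta}
% is bounded below by the inverse Fisher information of the model p(x;\theta).
\operatorname{Cov}(\hat{\theta}) \succeq I(\theta)^{-1},
\qquad
[I(\theta)]_{ij}
  = -\,\mathbb{E}\!\left[ \frac{\partial^{2} \ln p(x;\theta)}
                               {\partial\theta_{i}\,\partial\theta_{j}} \right].
```

A high diagonal entry of the bound for the grain-angle component signals a hard estimation problem; repeated independent measurements and higher signal-to-noise ratios scale the Fisher information up and the bound down, consistent with the remedies listed in the abstract.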
532

Auswirkungen der Deletion membranständiger Dehydrogenasen auf Gluconobacter oxydans DSM 7145 / Impacts of the deletion of membrane-bound dehydrogenases on Gluconobacter oxydans DSM 7145

Voss, Jörn 02 July 2009 (has links)
No description available.
533

Market Integration Analysis and Time-series Econometrics: Conceptual Insights from Markov-switching Models / Marktintegrationsanalyse und Zeitreihenökonometrie: Begriffseinblicke aus den Markov-Switching Modellen

Abunyuwah, Isaac 31 January 2008 (has links)
No description available.
534

Monitoring von Membranen und membrangebundenen Dehydrogenasen in Essigsäurebakterien / Monitoring of membranes and membrane-bound dehydrogenases in acetic acid bacteria

Kokoschka, Sebastian 21 October 2013 (has links)
No description available.
535

Algebraic Soft- and Hard-Decision Decoding of Generalized Reed–Solomon and Cyclic Codes

Zeh, Alexander 02 September 2013 (has links) (PDF)
Two challenges in algebraic coding theory are addressed in this thesis. The first is the efficient hard- and soft-decision decoding of generalized Reed–Solomon codes over finite fields in the Hamming metric. The motivation for solving this problem, open for more than 50 years, was renewed by Guruswami and Sudan's discovery, at the end of the 20th century, of a polynomial-time interpolation-based algorithm that decodes up to the Johnson radius. The first algebraic decoding methods for generalized Reed–Solomon codes relied on a key equation, that is, a polynomial description of the decoding problem. Reformulating the interpolation-based approach in terms of key equations is a central theme of this thesis. This contribution covers several aspects of key equations for hard-decision decoding as well as for the soft-decision variant of the Guruswami–Sudan algorithm for generalized Reed–Solomon codes. An efficient decoding algorithm is proposed for all these variants. The second topic of this thesis is the formulation of, and decoding up to, certain lower bounds on the minimum distance of cyclic linear block codes. The main feature is the embedding of a given cyclic code into a (generalized) cyclic product code. We therefore give a detailed description of cyclic product codes and generalized cyclic product codes. We prove several lower bounds on the minimum distance of linear cyclic codes that improve or generalize known bounds, and give quadratic-time error/erasure decoding algorithms up to these bounds.
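For context, the Johnson radius mentioned above has a standard closed form for a code of length n and minimum distance d; this is a general coding-theory fact, not a result specific to this thesis:

```latex
% Johnson radius: list decoding up to \tau_J errors is guaranteed,
% which exceeds the classical half-minimum-distance radius d/2.
\tau_J \;=\; n - \sqrt{n\,(n-d)} \;>\; \frac{d}{2},
\qquad 0 < d < n .
```

For a generalized Reed–Solomon code with d = n − k + 1 this becomes n − √(n(k−1)), the radius attained by the Guruswami–Sudan interpolation algorithm.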
536

“Here in Paraguay we have to sacrifice so much to get anything”: Perceptions of Health and Healthcare Services among Subsistence Farmers in Paraguay

Flanagan, Sarah 17 September 2012 (has links)
In this Master's thesis in Public Issues Anthropology I examine perceptions of health and healthcare services within a small rural subsistence farming community in south-western Paraguay from a political ecology of health perspective. Qualitative research data were collected from May to September of 2010 in Lindo Manantial, a subsistence farming village, and Piribebuy, the closest town to Lindo Manantial and the location of the nearest health centre, the Piribebuy Centro de Salud. The primary goals of this research project were to gain an ethnographic understanding of current local health perspectives and concerns, as well as of the local frameworks for health provision in Piribebuy. I argue that the introduction of culturally competent healthcare services could greatly improve individual and community health status and outcomes in Lindo Manantial and other similar rural subsistence farming communities in Paraguay. / Social Sciences and Humanities Research Council of Canada (SSHRC)
537

Approches multicritères pour le traitement des débris spatiaux / Multicriteria approaches for the treatment of space debris

Madakat, D. 16 June 2014 (has links) (PDF)
Space debris poses a threat to the exploration and exploitation of space. Its quantity keeps increasing and will continue to grow even if all space activity stops, raising the probability of a collision with an active satellite. Removing debris is the only way to protect these satellites. Since the number of debris objects is very high, the most dangerous ones must first be identified. In the first part of the thesis, we developed a multicriteria approach to rank debris by removal priority. Debris in the highest-priority class will be the target of a space debris removal mission. The planning of such a mission is studied in the second part of the thesis. The plan must minimize two criteria: the cost of the mission and the time needed to process all the debris. The shuttle moves from one orbit to another, processes the debris one by one, and then returns to its initial orbit. Since a transfer between two debris orbits can be carried out in many ways, each corresponding to a possible trade-off between transfer duration and cost, and since these costs and durations depend on the departure and arrival times on the orbits, the set of feasible solutions is defined on a directed dynamic multigraph. A tour in such a graph defines a possible mission scenario. The task is to find the set of non-dominated tours in this multigraph, which amounts to solving a bi-objective, time-dependent traveling salesman problem. We developed a branch-and-bound algorithm to compute the set of these tours. The algorithm is optimized at two levels: the number of possible transfers between two orbits is limited by avoiding cost computations for transfers that would turn out to be dominated, and dominance rules are used to prune branches of the search tree that cannot lead to efficient solutions. Experimental results illustrate the effectiveness of the procedure.
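The pruning device at the heart of the abstract, dominance between (cost, duration) trade-offs, can be illustrated with a minimal Python sketch; the names and data here are illustrative, not drawn from the thesis:

```python
from typing import List, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """True if a = (cost, duration) is no worse than b on both criteria
    and strictly better on at least one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(solutions: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Keep only the non-dominated (cost, duration) pairs."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions)]

# Candidate orbit-to-orbit transfers as (cost, duration) trade-offs.
transfers = [(10.0, 5.0), (8.0, 7.0), (12.0, 4.0), (9.0, 6.0), (11.0, 6.0)]
print(pareto_front(transfers))  # (11.0, 6.0) is dominated by (9.0, 6.0) and dropped
```

In the branch-and-bound setting, any partial tour whose (cost, duration) pair is dominated by an already-found complete tour can be cut without losing efficient solutions.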
538

Enhance the understanding of whole-genome evolution by designing, accelerating and parallelizing phylogenetic algorithms

Yin, Zhaoming 22 May 2014 (has links)
The advent of new technology has increased the speed and reduced the cost of sequencing biological data. Making biological sense of this genomic data is a big challenge for algorithm design as well as for the high-performance computing community. There are many open problems in bioinformatics, such as how new functional genes arise, why genes are organized into chromosomes, how species are connected through the evolutionary tree of life, or why gene arrangements are subject to change. Phylogenetic analyses have become essential to research on the evolutionary tree of life. They can help us track the history of species and the relationships between different genes or genomes through millions of years. One of the fundamentals of phylogenetic construction is the computation of distances between genomes. Since rearrangement events exhibit complicated combinatorial patterns, distance computation remains a hot topic that belongs as much to mathematics as to biology. The distance computation is especially hard when the two input genomes contain unequal gene content (with insertions/deletions and duplications). In this thesis, we discuss our contributions to distance estimation for unequal gene order data. The problem of finding the median of three genomes is the key step in building the most parsimonious phylogenetic trees from genome rearrangement data. For genomes with unequal content, to the best of our knowledge, no existing algorithm can find the median. In this thesis, we contribute to median computation in two respects. 1) On the algorithm engineering side, we harness the power of streaming graph analytics to implement an exact DCJ median algorithm that runs as fast as the heuristic algorithm and can help construct a better phylogenetic tree. 2) On the algorithmic side, we theoretically formulate the problem of finding the median of genomes with unequal gene content, which leads to the design and implementation of an efficient Lin-Kernighan-heuristic-based median algorithm. Inferring the phylogeny (evolutionary history) of a set of given species is the ultimate goal once the distance and median models are chosen. For more than a decade, biologists and computer scientists have studied how to infer phylogenies from measurements of genome rearrangement events using gene order data. While evolution is not an inherently parsimonious process, maximum parsimony (MP) phylogenetic analysis has been widely applied to phylogeny inference to study the evolutionary patterns of genome rearrangements. Two problems generally arise in MP phylogenetic analysis based on genome rearrangement: one is, given a set of modern genomes, how to compute the topology of the corresponding phylogenetic tree; the other is, given the topology of a model tree, how to infer the gene orders of the ancestral species. Assembling an MP phylogenetic tree constructor involves multiple NP-hard problems which, unfortunately, are organized one on top of another: to solve one NP-hard problem, we need to solve multiple NP-hard sub-problems. For phylogenetic tree construction with unequal content genomes as input, there are three layers of NP-hard problems. In this thesis, we discuss our contributions to the design and implementation of the software package DCJUC (Phylogeny Inference using the DCJ model to cope with Unequal Content Genomes), which helps achieve both of these goals.
Aside from the biological problems, another issue concerns the use of parallel computing to accelerate algorithms on huge data sets, such as high-resolution gene order data. For one thing, all of the methods for tackling these phylogenetic problems are based on branch-and-bound algorithms, which are quite irregular and unfriendly to parallel computing. To parallelize these algorithms, we need to improve the efficiency of localized memory access and use load-balancing methods to make sure that each thread can realize its full potential. For another, a revolution is taking place in computing with the availability of commodity graphics processors such as Nvidia GPUs and of many-core machines such as the Cray XMT or the Intel Xeon Phi coprocessor with 60 cores. These architectures provide a new way to achieve high performance at much lower cost. However, these machines are not easily programmed, and scientific computing is hard to tune well on them. We explore the potential of these architectures to help us accelerate branch-and-bound-based phylogenetic algorithms.
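As a toy companion to the genome-distance discussion above, the sketch below computes the classical breakpoint distance between two gene orders. This is a deliberately simpler measure than the DCJ distance used in the thesis, and the function names are ours:

```python
def breakpoint_distance(g1, g2):
    """Count adjacencies of gene order g1 that are absent from g2
    (unsigned breakpoint distance; assumes equal gene content)."""
    adj2 = {frozenset(p) for p in zip(g2, g2[1:])}
    return sum(1 for p in zip(g1, g1[1:]) if frozenset(p) not in adj2)

# Two orderings of the same five genes, differing by one inversion.
print(breakpoint_distance([1, 2, 3, 4, 5], [1, 3, 2, 4, 5]))  # -> 2
```

Distances such as this one feed the median problem: a median of three genomes is a genome minimizing the sum of its pairwise distances to the three inputs, which is the NP-hard kernel of the parsimony-based tree construction described above.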
539

Aspects technologiques et économiques de la qualité de service dans les alliances de fournisseurs de services / Technological and economic aspects of quality of service in service provider alliances

AMIGO, Maria Isabel 12 July 2013 (has links) (PDF)
Providing end-to-end quality-assured services implies many challenges that go beyond technical ones, involving economic and even cultural or political issues. In this thesis we first focus on a technical problem and then take a more holistic view of the whole problem, considering at the same time Network Service Providers (NSPs), stakeholders, and buyers' behaviour and satisfaction. One of the most important problems when deploying interdomain path selection with Quality of Service (QoS) requirements is being able to base the computations on metrics that hold for a long period of time. Our proposal for solving that problem is to compute bounds on the metrics, taking into account the uncertainty in the traffic demands. We then move to an NSP-alliance scenario, where we propose a complete framework for selling interdomain quality-assured services and subsequently distributing revenues. At the end of the thesis we adopt a more holistic approach and consider the interactions with the monitoring plane and the buyers' behaviour. We propose a simple pricing scheme and study it in detail, in order to use QoS monitoring information as feedback to the business plane, with the ultimate objective of improving the seller's revenue.
540

New Techniques for Building Timing-Predictable Embedded Systems

Guan, Nan January 2013 (has links)
Embedded systems are becoming ubiquitous in our daily life. Due to their close interaction with the physical world, embedded systems are typically subject to timing constraints. At design time, it must be ensured that the run-time behaviors of such systems satisfy the pre-specified timing constraints under any circumstances. In this thesis, we develop techniques to address the timing analysis problems brought by the increasing complexity of the underlying hardware and software at different levels of abstraction in embedded systems design. On the program level, we develop quantitative analysis techniques to predict cache hit/miss behaviors for tight WCET estimation, and study two commonly used replacement policies, MRU and FIFO, which cannot be analyzed adequately using the state-of-the-art qualitative cache analysis method. Our quantitative approach greatly improves the precision of WCET estimation and discloses interesting predictability properties of these replacement policies, which are concealed in the qualitative analysis framework. On the component level, we address the challenges raised by multi-core computing. Several fundamental problems in multiprocessor scheduling are investigated. In global scheduling, we propose an analysis method to rule out a large portion of impossible system behaviors for better analysis precision, and establish conditions to guarantee the bounded responsiveness of computing tasks. In partitioned scheduling, we close a long-standing open problem by generalizing Liu and Layland's famous utilization bound from uniprocessor real-time scheduling to multiprocessor systems. We also propose using cache partitioning in multi-core systems to avoid contention on shared caches, and solve the underlying schedulability analysis problem. On the system level, we present techniques to improve the Real-Time Calculus (RTC) analysis framework in both efficiency and precision. First, we developed Finitary Real-Time Calculus to solve the scalability problem of the original RTC caused by period explosion. The key idea is to maintain and operate on only a limited prefix of each curve that is relevant to the final results during the whole analysis procedure. We further improve the analysis precision of EDF components in RTC by precisely bounding the response time of each computation request.
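For reference, the uniprocessor utilization bound that the abstract generalizes is Liu and Layland's classical result. Below is a minimal sketch of the standard sufficient test for rate-monotonic scheduling (a textbook fact, not the thesis's multiprocessor generalization; the function name is ours):

```python
def liu_layland_schedulable(tasks):
    """Sufficient rate-monotonic schedulability test on a uniprocessor.
    tasks is a list of (wcet, period) pairs; returns True if the total
    utilization is within Liu and Layland's bound n * (2^(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

# Example: three periodic tasks given as (WCET, period).
print(liu_layland_schedulable([(1, 4), (1, 5), (2, 10)]))  # 0.65 <= 0.7798 -> True
```

The test is sufficient but not necessary: task sets exceeding the bound may still be schedulable, which is one reason exact analyses and, on multiprocessors, the partitioning results described above are needed.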
