11

Ensuring Serializable Executions with Snapshot Isolation DBMS

Alomari, Mohammad January 2009 (has links)
Doctor of Philosophy (PhD) / Snapshot Isolation (SI) is a multiversion concurrency control mechanism that has been implemented by open-source and commercial database systems such as PostgreSQL and Oracle. The main feature of SI is that a read operation does not block a write operation and vice versa, which allows a higher degree of concurrency than traditional two-phase locking. SI prevents many anomalies that appear at other isolation levels, but it can still produce non-serializable executions in which database integrity constraints are violated. Several techniques have been proposed to ensure serializable execution on engines running SI; these techniques modify the applications by introducing conflicting SQL statements. With each of these techniques, however, the DBA has to make a difficult choice among the possible transactions to modify. This thesis helps DBAs choose between these techniques and choices by showing how the choices affect system performance. It also proposes a novel technique called 'External Lock Manager' (ELM), which introduces conflicts in a separate lock-manager object so that every execution is serializable. We build a prototype system for ELM and run experiments to demonstrate the robustness of the new technique compared to the previous ones. The experiments show that modifying the application code has a high performance impact for some choices of transactions, which makes it very hard for DBAs to choose wisely. ELM, in contrast, has peak performance similar to SI no matter which transactions are chosen for modification. We therefore say that ELM is a robust technique for ensuring serializable execution.
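
The abstract names the ELM approach without detail; the sketch below is only an illustration of the general idea (the class and the conflict keys are invented, and Alomari's actual design will differ): transaction pairs that could form a non-serializable SI pattern take the same lock in a manager outside the DBMS, so one blocks until the other finishes and the dangerous interleaving cannot arise.

    import threading

    class ExternalLockManager:
        """Hypothetical sketch of the ELM idea: transactions that could form
        a non-serializable SI pattern acquire the same external lock, so one
        of them waits and the dangerous interleaving is ruled out."""

        def __init__(self):
            self._locks = {}               # conflict key -> threading.Lock
            self._guard = threading.Lock()

        def acquire(self, conflict_key):
            with self._guard:
                lock = self._locks.setdefault(conflict_key, threading.Lock())
            lock.acquire()                 # blocks while a conflicting txn holds it

        def release(self, conflict_key):
            self._locks[conflict_key].release()

    # Usage sketch: wrap the two transactions chosen for modification so they
    # contend on the same key before touching the database:
    #   elm.acquire("constraint:accounts"); run_txn(); elm.release("constraint:accounts")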
13

Contribution à l'étude expérimentale des écoulements à surface libre : application à l'interaction de sillages et à l'écoulement dans un sluice artisanal / Contribution to the experimental study of free-surface flows: application to wake interaction and to the flow in an artisanal sluice box

Ramanakoto, Toky Nandrasana 28 October 2014 (has links)
This thesis, carried out under joint supervision, spans two areas of study: the wake interference of bluff (non-profiled) obstacles (a single cylinder, or two cylinders in tandem) and the recirculation zones inside an artisanal sluice box (a gold-sorting device). The work contributes to the study of free-surface flows by combining several experimental methods: visualization of the near-wake topology with an embedded CCD camera, coupled with PIV measurements of the velocity fields and measurements of the hydrodynamic forces. A cylinder in uniformly accelerated or decelerated motion is characterized by the drag and lift forces, the envelopes of their maxima, the length of the recirculation zone, and the local Strouhal number; the latter approaches a value of 0.4 near the free surface, and the breaking of the accompanying wave is observable between two peaks of the lift force. Interference between two structures is simulated with two cylinders in tandem; in a symmetric configuration, the proximity parameter B tends to increase the length of the recirculation zone. The same methods and experimental devices are then applied to the turbulent hydrodynamic flow inside the sluice. Field tests on gold-mining sites in Madagascar provided information on the optimum sorting parameters, which served as the starting point for laboratory work on a full-scale model. PIV acquisition followed by multivariate statistical processing (snapshot POD) yielded an experimental model of the flow composed of four distinct zones. The experimental study was complemented by a numerical simulation with ANSYS 14.5, which showed that bringing the riffles closer together reduces the region favorable to the deposition of heavy minerals.
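
For reference, the Strouhal number quoted above is the standard non-dimensional shedding frequency (a textbook definition, not specific to the thesis):

    St = \frac{f\,D}{U} \approx 0.4 \quad\Longrightarrow\quad f \approx \frac{0.4\,U}{D}

where f is the vortex-shedding frequency, D the cylinder diameter and U the incident velocity. A value of 0.4 near the free surface is roughly twice the classical St ≈ 0.2 of an unbounded circular-cylinder wake at subcritical Reynolds numbers.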
14

Rychlé MRI metody / Fast MRI methods

Kořínek, Radim January 2010 (has links)
This thesis compares rapid and conventional methods used in MRI (Magnetic Resonance Imaging). It describes imaging methods such as EPI (Echo Planar Imaging), ultra-fast GRE, FSE (Fast Spin Echo), snapshot-FLASH, and FISP (Fast Imaging with Steady Precession). The experimental part of the thesis deals with the rapid FSE method: in particular, an algorithm for the proper compilation of the data produced by FSE is explained and assembled, which allows images from the FSE method to be evaluated. The method is examined in detail (in terms of the impact of its parameters) and compared with conventional methods. Finally, the individual images are evaluated and the best parameters for the FSE method are identified.
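
In a typical FSE reconstruction, the "proper compilation of data" step amounts to reordering each excitation's echo train onto the right phase-encoding lines of k-space before the inverse Fourier transform. The thesis does not spell out its algorithm here, so the interleaved scheme below is an assumed textbook variant, with invented names:

    import numpy as np

    def assemble_fse_kspace(echoes, n_shots, etl):
        """Hypothetical sketch: reorder FSE echo trains into k-space.

        echoes -- complex array of shape (n_shots, etl, n_readout), where
                  echoes[s, e] is the e-th echo of excitation (shot) s.
        Returns k-space of shape (n_shots * etl, n_readout): echo e of shot s
        fills phase-encode line e * n_shots + s (interleaved ordering).
        """
        n_readout = echoes.shape[-1]
        kspace = np.zeros((n_shots * etl, n_readout), dtype=complex)
        for s in range(n_shots):
            for e in range(etl):
                kspace[e * n_shots + s, :] = echoes[s, e, :]
        return kspace

    # Example: 16 shots with an echo-train length of 8 fill 128 lines;
    # the image is then a 2-D inverse FFT of the assembled k-space.
    k = assemble_fse_kspace(np.zeros((16, 8, 256), dtype=complex), 16, 8)
    image = np.abs(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(k))))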
15

Gargamel : accroître les performances des DBMS en parallélisant les transactions en écriture / Gargamel : boosting DBMS performance by parallelising write transactions

Cincilla, Pierpaolo 15 September 2014 (has links)
Databases often scale poorly in distributed configurations, mainly because of resource contention and the cost of concurrency control. One alternative is to centralize writes so as to avoid conflicts, but this performs well only for read-intensive workloads; another is to weaken transactional properties, which complicates the work of application developers. Our solution, Gargamel, spreads non-conflicting update transactions across different database replicas while preserving strong transactional guarantees. In effect, Gargamel partitions the database dynamically according to the update workload. Each database replica runs sequentially, at full bandwidth; synchronisation between replicas remains minimal. Evaluations with our prototype show that Gargamel improves both response time and load by an order of magnitude when contention is high (heavily loaded systems with bounded resources), and that the slow-down is otherwise negligible.
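
As an illustration of the routing idea (a minimal sketch under assumed names; Gargamel's actual conflict classifier and dependency tracking are not detailed in the abstract): each incoming update transaction is compared against the write-sets already queued on each replica, conflicting work is serialized on the same replica, and non-conflicting work is spread across replicas so it runs in parallel without locking.

    class ConflictScheduler:
        """Hypothetical sketch of Gargamel-style write routing."""

        def __init__(self, n_replicas):
            self.queues = [[] for _ in range(n_replicas)]  # queued write-sets

        def route(self, write_set):
            # Transactions that conflict with queued work must serialize
            # behind it, so send them to the replica holding the conflict.
            for i, queue in enumerate(self.queues):
                if any(write_set & ws for ws in queue):
                    self.queues[i].append(write_set)
                    return i
            # Non-conflicting transactions go to the least-loaded replica,
            # so independent writes proceed in parallel on different replicas.
            i = min(range(len(self.queues)), key=lambda j: len(self.queues[j]))
            self.queues[i].append(write_set)
            return i

    sched = ConflictScheduler(n_replicas=2)
    sched.route({"acct:1"})             # -> replica 0 (least loaded)
    sched.route({"acct:2"})             # disjoint -> replica 1
    sched.route({"acct:1", "acct:3"})   # conflicts with replica 0's queue -> replica 0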
16

Snapshot Spectral Domain Optical Coherence Tomography

Valdez, Ashley January 2016 (has links)
Optical coherence tomography systems are used to image the retina in 3D, helping ophthalmologists diagnose ocular disease. These systems yield large data sets that are often labor-intensive to analyze and require significant expertise to draw conclusions from, especially when used over time to monitor disease progression. Spectral Domain Optical Coherence Tomography (SD-OCT) instantly acquires depth profiles at a single location with a broadband source, but such systems require mechanical scanning to generate two- or three-dimensional images. Instead of mechanical scanning, a 3 x 3 beamlet array was used to permit multiple depth measurements on the retina with a single snapshot. As a proof-of-concept prototype, this multi-channel system was designed, assembled, and tested with a 1 x 2 beamlet lens array in place of the 3 x 3 array. The source was a superluminescent diode centered at 840 nm with a 45 nm bandwidth; the theoretical axial resolution was 6.92 µm and the depth of focus was 3.45 mm. Glass samples of thickness ranging from 0.18 mm to 1.14 mm were measured with the system to validate that correct depth profiles can be acquired for each channel. The results demonstrated that the prototype system performed as expected and is ready to be modified for in vivo use.
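
For context, the quoted theoretical axial resolution follows from the standard coherence-length formula for a Gaussian source spectrum (a textbook relation, not something introduced by the thesis):

    \Delta z = \frac{2\ln 2}{\pi}\,\frac{\lambda_0^2}{\Delta\lambda} = 0.441 \times \frac{(840\ \text{nm})^2}{45\ \text{nm}} \approx 6.92\ \mu\text{m}

which reproduces the 6.92 µm figure stated above for the 840 nm source with 45 nm bandwidth.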
17

SATELLITE GROUND OPERATIONS AUTOMATION – LESSONS LEARNED AND FUTURE APPROACHES

Catena, John, Frank, Lou, Saylor, Rick, Weikel, Craig October 2001 (has links)
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Reducing spacecraft ground system operations costs is a major goal in all missions. The Fast Auroral Snapshot (FAST) flight operations team at the NASA Goddard Space Flight Center developed in-house scripts and procedures to automate the monitoring of critical spacecraft functions. The initial staffing profile of 16x7 was reduced first to 8x5 and then to "lights out". Operations functions became an offline review of system performance and the generation of future science plans for subsequent upload to the spacecraft. The lessons learned will be applied to the challenging Triana mission, where 24x7 contact with the spacecraft will be necessary.
18

Computational and Design Methods for Advanced Imaging

Birch, Gabriel C. January 2012 (has links)
This dissertation merges the optical-design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and it attempts to expand the solution space available to the optical designer. The dissertation is divided into two parts: the first discusses a new active-illumination depth-sensing modality, while the second discusses a passive-illumination system called plenoptic, or lightfield, imaging. The new depth-sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage of this method is that the illumination and imaging axes can be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full-system ray traces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays that compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.
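
To make the "Zernike compression" idea concrete (an assumed formulation, not necessarily Birch's actual method; the function and parameters below are invented for illustration): each lenslet subimage can be projected onto a few low-order Zernike polynomials and stored as coefficients instead of pixels.

    import numpy as np

    def zernike_compress(subimage, n_modes=4):
        """Hypothetical sketch: summarize a square lenslet subimage by the
        least-squares coefficients of its first Zernike modes (piston, tip,
        tilt, defocus) over the inscribed unit disk."""
        n = subimage.shape[0]
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        rho, theta = np.hypot(x, y), np.arctan2(y, x)
        disk = rho <= 1.0
        basis = np.stack([
            np.ones_like(rho),              # Z1: piston
            2 * rho * np.cos(theta),        # Z2: tip
            2 * rho * np.sin(theta),        # Z3: tilt
            np.sqrt(3) * (2 * rho**2 - 1),  # Z4: defocus (Noll normalization)
        ])[:n_modes]
        A = basis[:, disk].T                # (disk samples) x (modes)
        coeffs, *_ = np.linalg.lstsq(A, subimage[disk], rcond=None)
        return coeffs                       # n_modes numbers per n*n-pixel tile

    # Example: a 32x32 lenslet tile compresses to 4 coefficients.
    print(zernike_compress(np.random.rand(32, 32)))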
19

Genetická determinace a dědičnost kraniofaciálních znaků na základě vybraných lokusů DNA / Genetic determination and heredity of craniofacial traits based on specific DNA loci

Králíková, Kristýna January 2018 (has links)
Introduction: The genetic determination of the human face is clearly visible in family members; the resemblance between monozygotic twins, who are genetically identical, is especially remarkable. So far, the possibilities for reliably predicting the complex morphology of facial traits from genome analysis, and for capturing the variability of human facial morphology through genotype variability, are very limited. The complete genetic basis of the physiological variability of craniofacial traits remains largely unknown. This master's thesis was created as a pilot study of the joint project of the Laboratory of 3D Imaging and Analytical Methods and the Laboratory of Molecular Anthropology at the Department of Anthropology and Human Genetics. Material and Methods: The specimen collection is composed of DNA samples from 30 families (29 with 4 members, 1 with 5 members) who fulfilled the required criteria. Nine single-nucleotide polymorphisms were chosen on the basis of the available information: eight are linked to normal facial variability, and one was chosen for the assumed function of the gene in which it is located. Two genotyping methods were used: the RFLP method, employing restriction endonucleases, and the SNaPshot method. Morphological data were provided by the...
20

Enhancing Data Processing on Clouds with Hadoop/HBase

Zhang, Chen January 2011 (has links)
In the current information age, large amounts of data are being generated and accumulated rapidly in various industrial and scientific domains. This imposes important demands on data processing capabilities that can extract meaningful and valuable information from the large amount of data in a timely manner. Hadoop, the open-source implementation of Google's data processing framework (MapReduce, Google File System and BigTable), is becoming increasingly popular and is being used to solve data processing problems in various application scenarios. However, having been designed for handling very large data sets that can be divided easily into parts to be processed independently with limited inter-task communication, Hadoop lacks applicability to wider usage scenarios. As a result, many projects are under way to enhance Hadoop for different application needs, such as data warehousing, machine learning, and data mining. This thesis is one such research effort. Its goal is to design novel tools and techniques that extend and enhance the large-scale data processing capability of Hadoop/HBase on clouds, and to evaluate their effectiveness in performance tests on prototype implementations. Two main research contributions are described. The first is a light-weight computational workflow system called "CloudWF" for Hadoop. The second is a client library called "HBaseSI" supporting transactional snapshot isolation (SI) in HBase, Hadoop's database component. CloudWF addresses the problem of automating the execution of scientific workflows composed of both MapReduce and legacy applications on clouds with Hadoop/HBase. It is the first computational workflow system built directly on Hadoop/HBase, and it uses novel methods for workflow directed-acyclic-graph decomposition, for storing and querying dependencies in HBase sparse tables, for transparent file staging, and for decentralized workflow execution management that relies on the MapReduce framework for task scheduling and fault tolerance. HBaseSI addresses the problem of maintaining strong transactional data consistency in HBase tables, and is the first SI mechanism developed for HBase. It uses novel methods for handling distributed transactional management autonomously at individual clients. These methods greatly simplify the design of HBaseSI and can be generalized to other column-oriented stores with architectures similar to HBase's. As a result of its simple design, HBaseSI adds low overhead to HBase performance and directly inherits many desirable properties of HBase. HBaseSI is non-intrusive to existing HBase installations and user data, and is designed to work with large clouds, both in data size and in number of nodes.
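
As background on what a client-side SI layer such as HBaseSI must provide (a generic snapshot-isolation sketch, not HBaseSI's actual algorithm or API): each transaction reads the versions current at its start timestamp and may commit only if no concurrently committed transaction wrote an overlapping key.

    class SnapshotStore:
        """Toy multiversion store illustrating generic client-side snapshot
        isolation; HBaseSI's real protocol (timestamps and commit records
        kept in HBase tables) is described in the thesis, not here."""

        def __init__(self):
            self.versions = {}     # key -> list of (commit_ts, value), ascending
            self.commit_log = []   # list of (commit_ts, write-set)
            self.clock = 0         # logical timestamp oracle

        def begin(self):
            self.clock += 1
            return {"start_ts": self.clock, "writes": {}}

        def read(self, txn, key):
            if key in txn["writes"]:                    # read your own writes
                return txn["writes"][key]
            for ts, value in reversed(self.versions.get(key, [])):
                if ts <= txn["start_ts"]:               # newest version in snapshot
                    return value
            return None

        def write(self, txn, key, value):
            txn["writes"][key] = value                  # buffered until commit

        def commit(self, txn):
            # First-committer-wins: abort on a write-write conflict with any
            # transaction that committed after this one took its snapshot.
            for ts, write_set in self.commit_log:
                if ts > txn["start_ts"] and write_set & txn["writes"].keys():
                    return False                        # conflict -> abort
            self.clock += 1
            for key, value in txn["writes"].items():
                self.versions.setdefault(key, []).append((self.clock, value))
            self.commit_log.append((self.clock, set(txn["writes"])))
            return True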
