  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Volumetric PIV and OH PLIF imaging in the far field of nonpremixed jet flames

Gamba, Mirko 03 September 2009 (has links)
Cinematographic stereoscopic PIV, combined with Taylor's frozen flow hypothesis, is used to generate three-dimensional (3D) quasi-instantaneous pseudo-volumes of the three-component (3C) velocity field in the far field of turbulent nonpremixed jet flames at jet exit Reynolds number Reδ in the range 8,000-15,300. The effect of heat release, however, lowers the local (i.e., based on local properties) Reynolds number to the range 1,500-2,500. The 3D data enable computation of all nine components of the velocity gradient tensor ∇u from which the major 3D kinematic quantities, such as strain rate, vorticity, dissipation and dilatation, are computed. The volumetric PIV is combined with simultaneously acquired 10 Hz OH planar laser-induced fluorescence (PLIF). A single plane of the OH distribution is imaged on the center-plane of the volume and provides an approximate planar representation of the instantaneous reaction zone. The pseudo-volumes are reconstructed from temporally and spatially resolved kilohertz-rate 3C velocity field measurements on an end-view plane (perpendicular to the jet flame axis) invoking Taylor's hypothesis. The interpretation of the measurements is therefore twofold: the measurements provide a time-series representation of all nine velocity gradients on a single end-view plane or, after volumetric reconstruction, they offer a volumetric representation, albeit approximate, of the spatial structure of the flow. The combined datasets enable investigation of the fine-scale spatial structure of turbulence, the effect of the reaction zone on these structures and the relationship between the jet kinematics and the reaction zone. Emphasis is placed on the energy dissipation field and on the presence and role of dilatation. Statistics of the components of the velocity gradient tensor and its derived quantities show that these jet flames exhibit strong similarities to incompressible turbulent flows, such as in the distribution of the principal strain rates and strain-vorticity alignment. However, the velocity-gradient statistics show that these jet flames do not exhibit small-scale isotropy but instead show a strong preference for high-magnitude radial gradients, which are attributed to regions of strong shear induced by the reaction zone. The pseudo-volumes reveal that the intense-vorticity field is organized in two major classes of structures: tube-like away from the reaction zone (the classical worms observed in incompressible turbulence) and sheet-like in the vicinity of the local reaction zone. Sheet-like structures are, however, the dominant ones. Moreover, unlike incompressible turbulence where sheet-like dissipative structures enfold, but do not coincide with, clusters of tube-like vortical structures, it is observed that the sheet-like intense-vorticity structures tend to closely correspond to sheet-like structures of high dissipation. These features are believed to be primarily due to the stabilizing effect of heat release on these relatively low local-Reynolds-number jet flames. It is further observed that regions of both positive and negative dilatation are present and tend to be associated with the oxidizer and fuel sides of the OH zones, respectively. These dilatation features are mostly organized in small-scale, short-lived blobby structures that are believed to be mainly due to convection of regions of varying density rather than to instantaneous heat release rate.
A model of the dilatation field developed by previous researchers using a flamelet approximation of the reaction zone was used to provide insights into the observed features of the dilatation field. Measurements in an unsteady laminar nonpremixed jet flame where dilatation is expected to be absent support the simplified model and indicate that the observed structure of dilatation is not just a result of residual noise in the measurements, although resolution effects might mask some of the features of the dilatation field. The field of kinetic energy dissipation is further investigated by decomposing the instantaneous dissipation field into the solenoidal, dilatational and inhomogeneous components. Analysis of the current measurements reveals that the effect of dilatation on dissipation is minimal at all times (it contributes only about 5-10% of the mean kinetic energy dissipation). Most of the mean dissipation arises from the solenoidal component. On average, the inhomogeneous component is nearly zero, although instantaneously it can be the dominant component. Two mechanisms are believed to be important for energy dissipation. Near the reaction zone, where the stabilizing effect of heat release generates layers of laminar-like shear and hence high vorticity, solenoidal dissipation (which is proportional to the enstrophy) dominates. In the rest of the flow, the inhomogeneous component dominates in regions subjected to complex systems of nested vortical structures, where the mutual interaction of interwoven vortical structures in intervening regions generates intense dissipation. / text
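For readers who want a concrete sense of the post-processing described above, the following is a minimal sketch (not the thesis' actual pipeline) of how the nine velocity gradients, vorticity, dilatation and a solenoidal/dilatational/inhomogeneous split of the dissipation might be computed from a reconstructed 3C velocity volume; the uniform grid spacing, axis ordering, viscosity value and the exact form of the decomposition are assumptions.

```python
import numpy as np

def velocity_gradient_kinematics(u, v, w, dx, mu=1.0):
    """Kinematic quantities from a 3C velocity field on a uniform 3D grid
    (illustrative sketch; boundary handling and units are assumed)."""
    # All nine components of the velocity gradient tensor du_i/dx_j,
    # assuming axes 0, 1, 2 correspond to x, y, z.
    dudx, dudy, dudz = np.gradient(u, dx)
    dvdx, dvdy, dvdz = np.gradient(v, dx)
    dwdx, dwdy, dwdz = np.gradient(w, dx)

    # Vorticity vector and enstrophy
    wx = dwdy - dvdz
    wy = dudz - dwdx
    wz = dvdx - dudy
    enstrophy = wx**2 + wy**2 + wz**2

    # Dilatation (velocity divergence) -- identically zero in incompressible flow
    theta = dudx + dvdy + dwdz

    # Symmetric strain-rate tensor S_ij and its contraction S_ij S_ij
    S = np.array([[dudx, 0.5*(dudy + dvdx), 0.5*(dudz + dwdx)],
                  [0.5*(dvdx + dudy), dvdy, 0.5*(dvdz + dwdy)],
                  [0.5*(dwdx + dudz), 0.5*(dwdy + dvdz), dwdz]])
    SS = np.einsum('ij...,ij...->...', S, S)

    # Viscous dissipation (Stokes' hypothesis) and one common pointwise split:
    # solenoidal part proportional to enstrophy, dilatational part proportional
    # to theta^2, inhomogeneous part taken as the remainder.
    eps = 2.0*mu*SS - (2.0/3.0)*mu*theta**2
    eps_sol = mu*enstrophy
    eps_dil = (4.0/3.0)*mu*theta**2
    eps_inh = eps - eps_sol - eps_dil
    return dict(enstrophy=enstrophy, dilatation=theta, dissipation=eps,
                solenoidal=eps_sol, dilatational=eps_dil, inhomogeneous=eps_inh)
```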
52

Growth and titration of Newcastle disease and infectious bronchitis viruses in tissue culture

Durand, Donald Paul. January 1957 (has links)
Call number: LD2668 .T4 1957 D87 / Master of Science
53

Volumetric imaging across spatiotemporal scales in biology with fluorescence microscopy

Sims, Ruth Rebecca January 2019 (has links)
Quantitative three dimensional maps of cellular structure, activity and function provide the key to answering many prevalent questions in modern biological research. Fluorescence microscopy has emerged as an indispensable tool in generating such maps, but common techniques are limited by fundamental physical constraints which render them incapable of simultaneously achieving high spatial and temporal resolution. This thesis will describe the development of novel microscopy techniques and complementary computational tools capable of addressing some of the aforementioned limitations of fluorescence microscopy and further outline their application in providing novel biological insights. The first section details the design of a light sheet microscope capable of high-throughput imaging of cleared, macroscopic samples with cellular resolution. In light sheet microscopy, the combination of spatially confined illumination with widefield detection enables multi-megapixel acquisition in a single camera exposure. The corresponding increase in acquisition speed enables systems level biological studies to be performed. The ability of this microscope to perform rapid, high-resolution imaging of intact samples is demonstrated by its application in a project which established a niche and hierarchy for stem cells in the adult nervous system. Light sheet microscopy achieves fast volumetric imaging rates, but the two dimensional nature of each measurement results in an inevitable lag between acquisition of the initial and final planes. The second section of this thesis describes the development and optimization of a light field microscope which captures volumetric information in a snapshot. Light field microscopy is a computational technique and images are reconstructed from raw data. Both the fidelity of computed volumes and the efficiency of the algorithms are strongly dependent on the quality of the rectification. A highly accurate, automated procedure is presented in this section. Light field reconstruction techniques are investigated and compared and the results are used to inform the re-design of the microscope. The new optical configuration is demonstrated to minimize the long-object problem. In the final section of the thesis, the spatial resolution limits of light field microscopy are explored using a combination of simulations and experiments. It is shown that light field microscopy is capable of localizing point sources over a large depth of field with high axial and lateral precision. Notably, this work paves the way towards frame rate limited super resolution localization microscopy with a depth of field larger than the thickness of a typical mammalian cell.
54

Theoretical and experimental concepts to increase the performance of structured illumination microscopy

Ströhl, Florian January 2018 (has links)
The aim of the work described in this thesis is to improve the understanding, implementation, and overall capabilities of structured illumination microscopy (SIM). SIM is a superresolution technique that excels in gentle live-cell volumetric imaging tasks. Many modalities of SIM were developed over the last decade that tailored SIM into the versatile and powerful technique that it is today. Nevertheless, the field of SIM continues to evolve and there is plenty of room for novel concepts. Specifically, in this thesis, a generalised framework for a theoretical description of SIM variants is introduced, the constraints of optical components for a flexible SIM system are discussed and the set-up is realised, the important aspect of deconvolution in SIM is highlighted and further developed, and finally novel SIM modalities are introduced that improve its time-resolution, gentleness, and volumetric imaging capabilities. Based on the generalised theory, the computational steps for the extraction of superresolution information from SIM raw data are outlined and the essential concept of spatial frequency un-mixing is explained for standard SIM as well as for multifocal SIM. Multifocal SIM hereby acts as a parallelised confocal as well as widefield technique and thus serves as a link between the two modalities. Using this novel scheme, deconvolution methods for SIM are then further developed to allow a holistic reconstruction procedure. Deconvolution is of great importance in the SIM reconstruction process, and hence rigorous derivations of advanced deconvolution methods are provided and further developed to enable generalised ‘multi-image’ Richardson-Lucy deconvolution in SIM, called joint Richardson-Lucy deconvolution (jRL). This approach is demonstrated to robustly produce optically sectioned multifocal SIM images and, through the incorporation of a 3D imaging model, also volumetric standard SIM images within the jRL framework. For standard SIM, this approach enabled a doubling of acquisition speed, because constrained jRL made it possible to recover superresolved images from a reduced number of raw frames. The method is validated in silico and in vitro. For the study of yet faster-moving samples, deconvolution microscopy is found to be the method of choice. To enable optical sectioning, a key feature of SIM, in deconvolution microscopy, a new modality of optical sectioning microscopy is introduced that can be implemented as a single-shot technique. Via polarised excitation and detection in orthogonal directions, in conjunction with structured illumination, the theoretical framework is rigorously derived and validated.
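As an illustration of the kind of multi-image update involved, here is a minimal joint Richardson-Lucy sketch for SIM-like data; the forward model (a detection PSF applied to the pattern-modulated object), the per-frame inputs and the simple averaged update are assumptions for illustration, not the thesis' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def joint_richardson_lucy(frames, psfs, patterns, n_iter=25, eps=1e-12):
    """Joint (multi-image) Richardson-Lucy sketch: frames[k] is the k-th raw
    image, psfs[k] its detection PSF and patterns[k] the corresponding
    illumination pattern; assumed forward model d_k = psf_k * (pattern_k . x),
    where '*' denotes convolution."""
    # Flat, positive initial estimate of the object
    x = np.full(frames[0].shape, float(np.mean(frames[0])))
    for _ in range(n_iter):
        update = np.zeros_like(x)
        for d, p, s in zip(frames, psfs, patterns):
            blurred = fftconvolve(s * x, p, mode='same')        # H_k x
            ratio = d / (blurred + eps)                          # data / prediction
            # Adjoint of the forward model: correlate with the PSF, re-weight by the pattern
            update += s * fftconvolve(ratio, p[::-1, ::-1], mode='same')
        x *= update / len(frames)                                # averaged multiplicative update
    return x
```

A full implementation would normalise by the summed pattern-weighted sensitivity rather than a plain average, but the multiplicative structure of the update is the same.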
55

Volumetric capnography in the diagnosis and the therapeutic monitoring of pulmonary embolism in the emergency department

Verschuren, Franck 07 December 2005 (has links)
CO2 and its influence on environmental and ecological processes currently focus considerable media attention. In the medical field, expired CO2 measurement with capnography has gained acceptance for all patients needing clinical monitoring and supervision, and recent research is showing the promise of expired CO2 as a diagnostic tool or as a means of therapeutic monitoring. In this context, measurement of expired CO2 as a function of the expired volume, called volumetric capnography, has theoretically better performance than traditional time-based capnography. When expired CO2 data are combined with arterial CO2 sampling, the clinician obtains breath-by-breath curves that provide bedside knowledge of the patient's pulmonary ventilation and perfusion status. Pulmonary embolism is a particular application of volumetric capnography. This frequent and challenging disease is characterized by impaired relationships between pulmonary ventilation and perfusion, ranging from deadspace to shunt. Volumetric capnography deserves careful attention in this area, since its combination with other clinical or biological signs could become part of a diagnostic procedure, either for detecting the disease when capnographic parameters are clearly impaired, or for ruling out the diagnosis when volumetric capnography analysis is normal. Likewise, monitoring the efficacy of thrombolytic therapy in massive pulmonary embolism is another application of particular interest for expired CO2 measurement. Physicians working in the Emergency Department demand high-performing devices to improve patient care. Such devices are particularly suited to daily practice if they can be used at the bedside and if they are non-invasive, safe, efficient, feasible, and applicable to non-intubated patients. Volumetric capnography, which seems to meet those requirements, will certainly deserve growing attention and interest in the future as a direct application of pulmonary pathophysiology. Even if volumetric capnography is still at the frontier between clinical research and clinical practice, let us hope that the studies presented in this thesis will improve the clinical acceptance of this attractive technology.
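As a small illustration of how volumetric capnography data can be combined with an arterial CO2 sample, the sketch below computes the Enghoff-modified Bohr dead-space fraction from a single expired CO2-volume curve; the synthetic capnogram and numerical values are assumptions, and this is not the monitoring procedure studied in the thesis.

```python
import numpy as np

def enghoff_dead_space_fraction(volume_ml, pco2_mmHg, paco2_mmHg):
    """Estimate the physiological dead-space fraction VD/VT from one
    volumetric capnogram (expired CO2 vs expired volume) plus an arterial
    PaCO2 sample, using the Enghoff modification of the Bohr equation."""
    # Mixed expired CO2 tension: volume-weighted mean over the breath
    pe_co2 = np.trapz(pco2_mmHg, volume_ml) / (volume_ml[-1] - volume_ml[0])
    return (paco2_mmHg - pe_co2) / paco2_mmHg

# Example: a crude synthetic capnogram for one 500 ml breath
vol = np.linspace(0.0, 500.0, 100)          # expired volume, ml
pco2 = 38.0 * (1.0 - np.exp(-vol / 120.0))  # CO2 rising towards an alveolar plateau
print(enghoff_dead_space_fraction(vol, pco2, paco2_mmHg=40.0))
```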
56

Estimation of iron-55 volumetric contamination via surrogates produced during Z-machine operations

Flores-McLaughlin, John 2008 August 1900 (has links)
Analysis of the radiation produced by Z-machine nuclear experiments at Sandia National Laboratory and the materials irradiated indicates that the majority of produced radionuclides can easily be detected. One significant exception is volumetric contamination of stainless steel by iron-55. Detecting iron-55 in Z-machine components presents a particular problem due to its low abundance and the low-energy (5.9 keV) x-ray it emits. The nuclide is often below the minimum detectable activity (MDA) threshold and resolution criteria of many standard radiation detection devices. Liquid scintillation has proven useful in determining iron-55 presence in loose contamination at concentrations below that of regulatory guidelines, but determination of volumetric iron-55 contamination remains a significant challenge. Due to this difficulty, an alternate method of detection is needed. The use of radioactive surrogates correlating to iron-55 production is proposed in order to establish an estimate of iron-55 abundance. The primary interaction pathways and interaction probabilities for all likely radionuclide production in the Z-machine were tabulated and radionuclides with production pathways matching those of iron-55 production were noted. For purposes of nuclide identification and adequate detection, abundant gamma emitters with half-lives on the order of days were selected for use as surrogates. Interaction probabilities for iron-55 production were compared with those of each chosen surrogate. Weighting factors were developed to account for the differences in the interaction probabilities over the range of the known energy spectra produced on the device. The selection process resulted in cobalt-55, cobalt-57 and chromium-51 as optimal surrogates for iron-55 detection in both deuterium- and non-deuterium-loaded interactions. A decay-corrected correlation of the surrogates (chromium-51, cobalt-57 and cobalt-55) to iron-55 for deuterium- and non-deuterium-loaded Z-machine driven reactions was derived. The weighting factors presented here are estimates based on rough comparisons of cross-section graphs. Future work may refine these weighting factors through analysis that considers factors such as energy spectrum criteria.
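To illustrate the surrogate idea, the sketch below decay-corrects a surrogate's measured activity back to shot time and scales it by a weighting factor to estimate the iron-55 activity; the half-life table uses standard published values, while the weighting factor and example numbers are placeholders rather than values derived in the thesis.

```python
import numpy as np

# Half-lives in days (standard published values)
HALF_LIFE_DAYS = {"Fe-55": 1002.7, "Cr-51": 27.7, "Co-57": 271.7, "Co-55": 0.73}

def decay_correct(activity, nuclide, elapsed_days):
    """Correct a measured activity back to the time of production."""
    lam = np.log(2.0) / HALF_LIFE_DAYS[nuclide]
    return activity * np.exp(lam * elapsed_days)

def estimate_fe55(surrogate_activity, surrogate, elapsed_days, weighting_factor):
    """Estimate the Fe-55 activity at production time from a surrogate's measured
    activity and a weighting factor relating the two production probabilities
    (illustration only; the factor here is a placeholder)."""
    a0_surrogate = decay_correct(surrogate_activity, surrogate, elapsed_days)
    return weighting_factor * a0_surrogate

# e.g. 120 Bq of Cr-51 measured 10 days after the shot, with an assumed factor of 2.5
print(estimate_fe55(120.0, "Cr-51", 10.0, weighting_factor=2.5))
```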
57

Integration and quantification of uncertainty of volumetric and material balance analyses using a Bayesian framework

Ogele, Chile 01 November 2005 (has links)
Estimating original hydrocarbons in place (OHIP) in a reservoir is fundamentally important to estimating reserves and potential profitability. Quantifying the uncertainties in OHIP estimates can improve reservoir development and investment decision-making for individual reservoirs and can lead to improved portfolio performance. Two traditional methods for estimating OHIP are volumetric and material balance methods. Probabilistic estimates of OHIP are commonly generated prior to significant production from a reservoir by combining volumetric analysis with Monte Carlo methods. Material balance is routinely used to analyze reservoir performance and estimate OHIP. Although material balance has uncertainties due to errors in pressure and other parameters, probabilistic estimates are seldom done. In this thesis I use a Bayesian formulation to integrate volumetric and material balance analyses and to quantify uncertainty in the combined OHIP estimates. Specifically, I apply Bayes' rule to the Havlena and Odeh material balance equation to estimate original oil in place, N, and relative gas-cap size, m, for a gas-cap drive oil reservoir. The thesis considers uncertainty and correlation in the volumetric estimates of N and m (reflected in the prior probability distribution), as well as uncertainty in the pressure data (reflected in the likelihood distribution). Approximation of the covariance of the posterior distribution allows quantification of uncertainty in the estimates of N and m resulting from the combined volumetric and material balance analyses. Several example applications to illustrate the value of this integrated approach are presented. Material balance data reduce the uncertainty in the volumetric estimate, and the volumetric data reduce the considerable non-uniqueness of the material balance solution, resulting in more accurate OHIP estimates than from the separate analyses. One of the advantages over reservoir simulation is that, with the smaller number of parameters in this approach, we can easily sample the entire posterior distribution, resulting in more complete quantification of uncertainty. The approach can also detect underestimation of uncertainty in either volumetric data or material balance data, indicated by insufficient overlap of the prior and likelihood distributions. When this occurs, the volumetric and material balance analyses should be revisited and the uncertainties of each reevaluated.
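A minimal sketch of the kind of Bayesian combination described above: a Gaussian volumetric prior on (N, m) is multiplied by a Gaussian likelihood built on the Havlena-Odeh relation F = N(Eo + m Eg), and the posterior is evaluated on a grid. The Gaussian error model, the grid evaluation and the variable names are assumptions for illustration, not the thesis' exact formulation.

```python
import numpy as np

def posterior_N_m(F, Eo, Eg, sigma_F, N_prior, m_prior, cov_prior, N_grid, m_grid):
    """Grid evaluation of the posterior of OOIP N and gas-cap ratio m for a
    gas-cap drive reservoir, combining a Gaussian volumetric prior with a
    Gaussian likelihood based on the Havlena-Odeh model F = N*(Eo + m*Eg),
    where F, Eo, Eg are arrays with one entry per pressure survey."""
    inv_cov = np.linalg.inv(cov_prior)
    log_post = np.zeros((len(N_grid), len(m_grid)))
    for i, N in enumerate(N_grid):
        for j, m in enumerate(m_grid):
            resid = F - N * (Eo + m * Eg)               # misfit to pressure-derived data
            loglik = -0.5 * np.sum((resid / sigma_F) ** 2)
            d = np.array([N - N_prior, m - m_prior])
            logprior = -0.5 * d @ inv_cov @ d           # volumetric prior on (N, m)
            log_post[i, j] = loglik + logprior
    post = np.exp(log_post - log_post.max())            # subtract max for numerical stability
    return post / post.sum()
```

Because only two parameters are involved, the full posterior surface can be inspected directly, which mirrors the point made above about sampling the entire posterior distribution.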
58

The volumetric determination of vanadium and chromium in special alloy steels. Ceric sulfate as a volumetric oxidizing agent ...

Young, Philena Anne, January 1928 (has links)
Thesis (Ph. D.)--University of Michigan, 1928.
59

A volumetric mesh-free deformation method for surgical simulation in virtual environments

Wang, Shuang. January 2009 (has links)
Thesis (M.S.)--University of Delaware, 2009. / Principal faculty advisors: Kenneth E. Barner and Karl V. Steiner, Dept. of Electrical & Computer Engineering. Includes bibliographical references.
60

A series of simple basic indicators and its application to some very strongly acid systems ...

Deyrup, Alden Johnson, January 1900 (has links)
Thesis (Ph. D.)--Columbia University, 1932. / Vita. Includes bibliographical references (p. 61-62).
