111 |
Partial volume correction for absolute quantification of in vivo proton MRS. Dong, Shih-Shan, 20 March 2008 (has links)
Magnetic resonance spectroscopy is now in widespread use and, combined with various spectral analysis tools, can provide metabolite concentrations. Metabolites have a strong influence on human physiology, yet their concentrations vary only slightly, so the analytic method used to determine the absolute concentrations of metabolites plays an important role in this research area.
In this thesis we present an analysis tool that segments white matter, gray matter, and cerebrospinal fluid using region growing in the spatial domain, with manual interaction provided for handling exceptional cases. We then use this tool to analyze voxels containing different percentages of white matter and gray matter, quantified by LCModel with its default parameters, and to correct for the partial volume effect. The results show that the proposed tool significantly improves the accuracy of absolute concentration quantification.
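As a hedged illustration of the partial volume correction described above, the sketch below rescales a measured concentration by the tissue fraction of the MRS voxel, assuming the metabolite signal comes only from white and gray matter and that CSF contributes none; the function name and the simple CSF-only model are illustrative and do not reproduce LCModel's water-scaling or relaxation corrections.

```python
def correct_for_partial_volume(c_measured, f_gm, f_wm, f_csf, tol=1e-6):
    """Rescale a metabolite concentration to the tissue-only volume of the
    MRS voxel. Assumes the segmented fractions (gray matter, white matter,
    cerebrospinal fluid) sum to one and that CSF contains no metabolite."""
    if abs(f_gm + f_wm + f_csf - 1.0) > tol:
        raise ValueError("tissue fractions must sum to 1")
    tissue_fraction = f_gm + f_wm
    return c_measured / tissue_fraction

# Example: a voxel that is 10% CSF inflates the corrected value by ~11%.
print(correct_for_partial_volume(10.0, 0.55, 0.35, 0.10))  # 11.11...
```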
|
112 |
Flou et quantification dans les images numériques [Blur and quantization in digital images]. Ladjal, Saïd, 22 March 2005 (has links) (PDF)
The first part of the thesis introduces an image dequantization method that improves the statistical properties of the gradient field. We apply our method to the detection of meaningful segments developed by Agnès Desolneux. The second part presents a method for estimating blur in natural images. We take advantage of the morphological scale space to obtain as precise an estimate as possible of the amount of local blur, without any knowledge of the blur kernel.
|
113 |
Cross-scale model validation with aleatory and epistemic uncertainty. Blumer, Joel David, 08 June 2015 (has links)
Nearly every decision must be made with a degree of uncertainty regarding the outcome. Decision making based on modeling and simulation predictions needs to incorporate and aggregate uncertain evidence. To validate multiscale simulation models, it may be necessary to consider evidence collected at a length scale that is different from the one at which a model predicts. In addition, traditional methods of uncertainty analysis do not distinguish between two types of uncertainty: uncertainty due to inherently random inputs, and uncertainty due to lack of information about the inputs. This thesis examines and applies a Bayesian approach for model parameter validation that uses generalized interval probability to separate these two types of uncertainty. A generalized interval Bayes’ rule (GIBR) is used to combine the evidence and update belief in the validity of parameters. The sensitivity of completeness and soundness for interval range estimation in GIBR is investigated. Several approaches to represent complete ignorance of probabilities’ values are tested. The result from the GIBR method is verified using Monte Carlo simulations. The method is first applied to validate the parameter set for a molecular dynamics simulation of defect formation due to radiation. Evidence is supplied by the comparison with physical experiments. Because the simulation includes variables whose effects are not directly observable, an expanded form of GIBR is implemented to incorporate the uncertainty associated with measurement in belief update. In a second example, the proposed method is applied to combining the evidence from two models of crystal plasticity at different length scales.
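For intuition only, the sketch below computes posterior bounds for a binary hypothesis from interval-valued prior and likelihoods by evaluating Bayes' rule at the appropriate endpoints. It uses ordinary interval bounds, not the generalized (Kaucher) interval arithmetic underlying GIBR, and the monotonicity argument it relies on holds only for this two-hypothesis case.

```python
def interval_posterior(prior_h, like_h, like_not_h):
    """Posterior probability bounds for hypothesis H given interval-valued
    prior P(H) and likelihoods P(D|H), P(D|~H). The posterior is monotone
    increasing in P(H) and P(D|H) and decreasing in P(D|~H), so the bounds
    are attained at the interval endpoints."""
    p_lo, p_hi = prior_h
    lh_lo, lh_hi = like_h
    ln_lo, ln_hi = like_not_h
    post_lo = lh_lo * p_lo / (lh_lo * p_lo + ln_hi * (1.0 - p_lo))
    post_hi = lh_hi * p_hi / (lh_hi * p_hi + ln_lo * (1.0 - p_hi))
    return post_lo, post_hi

print(interval_posterior((0.4, 0.6), (0.7, 0.9), (0.2, 0.4)))
```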
|
114 |
A quantitative, model-driven approach to technology selection and development through epistemic uncertainty reduction. Gatian, Katherine N., 02 April 2015 (has links)
When aggressive aircraft performance goals are set, the integration of new, advanced technologies into next-generation aircraft concepts is required to bridge the gap between current capabilities and required capabilities. A large number of technologies can be pursued, and only a subset may practically be selected to reach the chosen objectives. Additionally, the appropriate numerical and physical experimentation must be identified to further develop the selected technologies. These decisions must be made under a large amount of uncertainty because developing technologies introduce phenomena that have not been previously characterized. Traditionally, technology selection decisions are made based on deterministic performance assessments that do not capture the uncertainty of the technology impacts. Model-driven environments and new, advanced uncertainty quantification techniques provide the ability to characterize technology impact uncertainties and pinpoint how they drive system performance, which aids technology selection decisions. Moreover, the probabilistic assessments can be used to plan experimentation that facilitates uncertainty reduction by targeting uncertainty sources with large performance impacts.
The thesis formulates and implements a process that allows for risk-informed decision making throughout technology development. It focuses on quantifying technology readiness risk and performance risk by synthesizing quantitative, probabilistic performance information with qualitative readiness assessments. The Quantitative Uncertainty Modeling, Management, and Mitigation (QuantUM3) methodology was tested through an environmentally motivated aircraft design case study based upon NASA's Environmentally Responsible Aviation (ERA) technology development program. A physics-based aircraft design environment was created that provides quantitative system-level performance assessments and was employed to model the technology impacts as probability distributions, facilitating the development of the overall process required to enable risk-informed technology and experimentation decisions. The outcome of the experimental efforts was a detailed outline of the entire methodology and a confirmation that the methodology enables risk-informed technology development decisions with respect to both readiness risk and performance risk. Furthermore, a new process for communicating technology readiness through morphological analysis was created, as well as an experiment design process that uses the readiness information and quantitative uncertainty analysis to simultaneously increase readiness and decrease technology performance uncertainty.
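As a hedged sketch of the kind of probabilistic technology-impact assessment the abstract describes (not the QuantUM3 methodology or the ERA design environment itself), the example below propagates two hypothetical technology impact factors, modeled as triangular distributions, through a placeholder system response and reports the probability of meeting a performance goal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical technology impacts as triangular (min, mode, max) fractions.
drag_reduction = rng.triangular(0.01, 0.03, 0.06, n)
weight_reduction = rng.triangular(0.00, 0.02, 0.05, n)

# Placeholder system-level response: fractional change in fuel burn.
fuel_burn_delta = -(0.7 * drag_reduction + 0.5 * weight_reduction)

goal = -0.04  # e.g. at least a 4% fuel-burn reduction
print("P(goal met) =", np.mean(fuel_burn_delta <= goal))
```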
|
115 |
Reliability methods in dynamic system analysis. Munoz, Brad Ernest, 26 April 2013 (has links)
Standard techniques for analyzing a system's response with uncertain system parameters or inputs are generally importance sampling methods. Sampling methods require a large number of simulation runs before the system output statistics can be analyzed. As model fidelity increases, sampling techniques become computationally infeasible, and reliability methods have gained popularity as an analysis approach that requires significantly fewer simulation runs. Reliability analysis is an analytic technique that finds a particular point in the design space which can accurately be related to the probability of system failure. However, application to dynamic systems has remained limited.
In this thesis a First Order Reliability Method (FORM) is used to determine the failure probability of a dynamic system due to system and input uncertainties. A pendulum-cart system is used as a case study to demonstrate FORM on a dynamic system. Three failure modes are discussed, corresponding to the maximum pendulum angle, the maximum system velocity, and a combined requirement that neither the maximum pendulum angle nor the maximum system velocity is exceeded. An explicit formulation is generated from the implicit formulation using a response surface methodology, and FORM is performed using the explicit estimate. Although the analysis converges with minimal simulation computations, attempts to verify the FORM results illuminate current limitations of the methodology. The results of this initial study conclude that, currently, sampling techniques are necessary to verify FORM results, which restricts the potential applications of the FORM methodology. Suggested future work focuses on result verification without the use of importance sampling, which would allow reliability methods to have widespread applicability.
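A minimal FORM sketch follows, with a placeholder limit-state function standing in for the pendulum-cart failure modes: the most probable point is found as the closest point to the origin of standard normal space lying on the failure surface, the reliability index is its distance from the origin, and the first-order failure probability follows from the standard normal CDF. The limit state and starting point are illustrative assumptions, not the thesis's model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    """Placeholder limit state in standard normal space: failure when g <= 0
    (e.g. a maximum pendulum angle being exceeded)."""
    return 3.0 - u[0] - 0.2 * u[1] ** 2

# Most probable point: minimize ||u||^2 subject to g(u) = 0.
res = minimize(lambda u: float(np.dot(u, u)), x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": g})
beta = np.sqrt(res.fun)        # reliability index
pf = norm.cdf(-beta)           # first-order estimate of failure probability
print(beta, pf)
```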
|
116 |
Parametric uncertainty and sensitivity methods for reacting flows. Braman, Kalen Elvin, 09 July 2014 (has links)
A Bayesian framework for quantification of uncertainties has been used to quantify the uncertainty introduced by chemistry models. This framework adopts a probabilistic view to describe the state of knowledge of the chemistry model parameters and simulation results. Given experimental data, this method updates the model parameters' values and uncertainties and propagates that parametric uncertainty into simulations. This study focuses on syngas, a mixture of H2 and CO in various ratios that is the product of coal gasification. Coal gasification promises to reduce emissions by replacing the burning of coal with the less polluting burning of syngas. Despite the simplicity of syngas chemistry models, they nonetheless fail to accurately predict burning rates at high pressure. Three syngas models have been calibrated using laminar flame speed measurements. After calibration, the resulting uncertainty in the parameters is propagated forward into the simulation of laminar flame speeds. The model evidence is then used to compare candidate models.
Sensitivity studies, in addition to Bayesian methods, can be used to assess chemistry models. Sensitivity studies provide a measure of how responsive target quantities of interest (QoIs) are to changes in the parameters. The adjoint equations have been derived for laminar, incompressible, variable-density reacting flow and applied to hydrogen flame simulations. From the adjoint solution, the sensitivity of the QoI to the chemistry model parameters has been calculated. The results identify the parameters to which flame tip temperature and NOx emission are most sensitive. Such information can guide the design of new experiments by pointing out the critical chemistry model parameters.
Finally, a broader goal for chemistry model development is set through the adjoint methodology. A new quantity, termed field sensitivity, is introduced to guide chemistry model development. Field sensitivity describes how information about perturbations in flowfields propagates to specified QoIs. The field sensitivity, shown mathematically to be equivalent to finding the adjoint of the primal governing equations, is obtained for laminar hydrogen flame simulations using three different chemistry models. Results show that even when the primal solutions of the three mechanisms are close, the field sensitivity can vary.
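As a toy illustration of the Bayesian calibration step (not the syngas mechanisms or flame solver used in the thesis), the sketch below updates a single hypothetical rate-constant multiplier against one laminar flame speed measurement with Gaussian noise, using a flat prior and a grid evaluation of the posterior.

```python
import numpy as np

def flame_speed_surrogate(k):
    """Placeholder surrogate for a 1-D premixed flame solve (cm/s)."""
    return 35.0 * k ** 0.8

s_obs, sigma = 38.0, 2.0                     # hypothetical measurement
k_grid = np.linspace(0.5, 2.0, 400)          # rate-constant multiplier
prior = np.ones_like(k_grid)                 # flat prior
likelihood = np.exp(-0.5 * ((flame_speed_surrogate(k_grid) - s_obs) / sigma) ** 2)
posterior = prior * likelihood
posterior /= np.trapz(posterior, k_grid)     # normalize the density
print(k_grid[np.argmax(posterior)])          # posterior mode of the multiplier
```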
|
117 |
From Memory to Mastery: Accounting for Control in America, 1750-1880. Rosenthal, Caitlin Clare, January 2012 (has links)
From Memory to Mastery charts the development of commercial numeracy and accounting in America and the English-speaking Atlantic world between 1750 and 1880. Over this period, accounting evolved from a system of recordkeeping into a multifaceted instrument of control and analysis—from an aid to memory to an instrument of mastery. The traditional story of modern management begins in the factories of England and New England, extending only much later to the American South. This dissertation draws on textbooks and manuscript account books to argue that southern and West Indian plantations also influenced the development of bookkeeping. Scientific planters adopted sophisticated accounting practices, foreshadowing the rise of scientific management in the late nineteenth century. Their sophistication was not just incidental to the use of forced labor. Rather, the control of planters over their slaves made data easier to collect and more profitable to use. New methods were, in a sense, a byproduct of bondage. By contrast, the mobility of labor in the North made detailed recordkeeping necessary for keeping track of wages but relatively futile for detailed benchmarks and comparisons. Early northern factories distinguished themselves not by analyzing productivity but by mediating between firms and the market. They developed hybrid practices that bridged management hierarchies and market exchange. Commercial colleges educated clerical workers, accountants, and bookkeepers, providing the staff for a revolution in the organization of information. Though the rise of accounting helped planters and manufacturers to organize and control their expanding workforces, numeracy was not always class-biased. Textbooks and common schools spread numerical knowledge across a wide range of people, enabling them to turn the language of accounts to their own purposes. Account books reflect the power of their keepers, but bookkeeping is also a creative language that can be used by all kinds of people. This study bridges history and economics, blending qualitative and quantitative methods. The dissertation closes with a statistical analysis of accounting practices among Massachusetts corporations in the 1870s. Both these data and close readings of account books elsewhere in the dissertation suggest that practices were incredibly diverse but also fundamental to firm survival. Keeping accounts was a creative, narrative process that helped eighteenth- and nineteenth-century Americans to navigate the increasingly complex world around them.
|
118 |
Real Time PCR Protocol Development for Rapid and Low Cost Quantification of Baculovirus and for Monitoring Progression of Infection. George, Steve, January 2010 (has links)
The work presented in this thesis aims to further the understanding and implementation of the Baculovirus Expression Vector System (BEVS) for varied uses such as protein production and viral vector production. To this end, three projects have been presented, two of which deal with methods to quantify baculovirus titres and the last deals with tracking baculovirus transcripts in infected insect cells.
The first project examined assumption-free analysis as a method for analyzing Real Time PCR data, with the aim of enabling direct comparison of baculovirus titres between samples without the need for a traditional standard curve. It concluded that assumption-free analysis was well suited to this purpose and that the fold differences in baculovirus titre between samples obtained with this method corresponded to real differences in sample titres.
The second project aimed to develop a cheap and reliable method for sample preparation for Real Time PCR which would remove the need for the use of commercially available extraction kits. Samples were subjected to various combinations of Triton X-100 at different concentrations and different numbers of freeze/thaw cycles in order to determine the combination which would provide the best baculovirus genome exposure. One of these combinations was found to be at least as good as commercially available kits in reliably extracting baculovirus DNA and providing baculovirus titres that are at least as accurate.
The third project was a preliminary study examining the effects of multiplicity of infection on the levels of the baculovirus Gp-64 transcript in insect cell culture. The study concludes that at high multiplicities of infection, there appears to be no further increase in baculovirus transcripts when the multiplicity of infection is raised. This study provided familiarity with tracking transcript levels, and the principles and techniques demonstrated here will form the basis for a more exhaustive future study of the same subject.
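The sketch below illustrates the standard-curve-free idea behind the first project: fit the log-linear (exponential) phase of each amplification curve to estimate the per-cycle efficiency and back-extrapolated starting signal, then take the ratio of starting signals as the fold difference in titre. It is a generic LinRegPCR-style calculation under those assumptions, not the exact assumption-free analysis used in the thesis.

```python
import numpy as np

def efficiency_and_n0(cycles, fluorescence, window):
    """Fit log10(fluorescence) vs cycle over the exponential-phase window.
    Returns the per-cycle amplification efficiency and the back-extrapolated
    starting signal N0 (proportional to the initial template amount)."""
    c = np.asarray(cycles)[window]
    logf = np.log10(np.asarray(fluorescence)[window])
    slope, intercept = np.polyfit(c, logf, 1)
    return 10.0 ** slope, 10.0 ** intercept

# Fold difference in baculovirus titre between two samples:
# _, n0_a = efficiency_and_n0(cycles, fluor_a, slice(12, 18))
# _, n0_b = efficiency_and_n0(cycles, fluor_b, slice(12, 18))
# fold = n0_a / n0_b
```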
|
119 |
Coronary Artery Calcium Quantification in Contrast-enhanced Computed Tomography Angiography. Dhungel, Abinashi, 18 December 2013 (has links)
Coronary arteries are the blood vessels supplying oxygen-rich blood to the heart muscles. Coronary artery calcium (CAC), which is the total amount of calcium deposited in these arteries, indicates the presence or the future risk of coronary artery diseases. Quantification of CAC is done by using computed tomography (CT) scan which uses attenuation of x-ray by different tissues in the body to generate three-dimensional images. Calcium can be easily spotted in the CT images because of its higher opacity to x-ray compared to that of the surrounding tissue. However, the arteries cannot be identified easily in the CT images. Therefore, a second scan is done after injecting a patient with an x-ray opaque dye known as contrast material which makes different chambers of the heart and the coronary arteries visible in the CT scan. This procedure is known as computed tomography angiography (CTA) and is performed to assess the morphology of the arteries in order to rule out any blockage in the arteries.
The CT scan done without contrast material (non-contrast-enhanced CT) could be eliminated if calcium could be quantified accurately from the CTA images. However, identification of calcium in CTA images is difficult because of the proximity of the calcium to the contrast material and their overlapping intensity ranges. In this dissertation we first compare calcium quantification using a state-of-the-art non-contrast-enhanced CT method against conventional methods and suggest optimal quantification parameters. We then develop methods to accurately quantify calcium from the CTA images. These methods include novel algorithms for extracting the centerline of an artery, adaptively calculating the calcium threshold based on the intensity of contrast along the artery, calculating the amount of calcium in the mixed intensity range, and segmenting the artery and the outer wall. The accuracy of calcium quantification from CTA using our methods is higher than that of non-contrast-enhanced CT, potentially eliminating the need for the non-contrast-enhanced CT scan. The implications are that both the total time required for the CT procedure and the patient's exposure to x-ray radiation are reduced.
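As a hedged sketch of the adaptive-thresholding idea only (the algorithms for centerline extraction, mixed-intensity handling, and wall segmentation are not reproduced), the function below sets the calcium threshold in a CTA cross-section from the local contrast-enhanced lumen statistics, never dropping below the conventional 130 HU used in non-contrast scoring; the offset factor k is an assumed parameter.

```python
import numpy as np

def adaptive_calcium_threshold(lumen_hu, k=3.0, floor_hu=130.0):
    """Threshold (in Hounsfield units) above which voxels in this artery
    cross-section are counted as calcium. The threshold tracks the local
    contrast intensity so that bright lumen is not mistaken for calcium."""
    mu = float(np.mean(lumen_hu))
    sd = float(np.std(lumen_hu))
    return max(floor_hu, mu + k * sd)

print(adaptive_calcium_threshold([320, 345, 310, 335, 328]))
```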
|
120 |
ANTIBODY-BASED DETECTION AND QUANTIFICATION OF PECTOBACTERIUM CAROTOVORUM SSP. CAROTOVORUM. Bassoriello, Melissa Maria Ivana, 28 October 2010 (has links)
Pectobacterium carotovorum ssp. carotovorum (Pcc) is implicated in the destruction of ornamental plants in greenhouse recirculating systems. PCR-based detection and quantification of Pcc requires expensive instrumentation and knowledgeable users. This thesis describes the production of polyclonal antibodies and a single-domain antibody fragment (VHH) against Pcc lipopolysaccharide (LPS), and the development of user-friendly diagnostic assays for detection and quantification of the pathogen. Polyclonal ELISAs against heat-killed (HK) Pcc (limit of detection (LOD) = 81 CFU/ml; limit of quantitation (LOQ) = 216 CFU/ml) and Pcc LPS (LOD = 23 ng/ml; LOQ = 76 ng/ml) were developed. A preliminary user-friendly dipstick assay was also developed (≥ 10^5 CFU/ml). A phage display library was constructed (6.0 x 10^5 clones/ml), yielding one unique anti-Pcc LPS VHH. Using the Pcc LPS-specific VHH to produce affordable, user-friendly diagnostic assays is feasible since antibody fragments can be produced on a large scale through expression in Escherichia coli or Pichia pastoris. / Flowers Canada, Canada-Ontario Research and Development (CORD) Program, Canada Research Chairs (CRC) Program, NSERC/NRC
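For illustration, the sketch below computes detection and quantitation limits from blank replicates and the slope of an ELISA calibration curve using the common 3.3σ/S and 10σ/S convention; the thesis may have estimated its LOD and LOQ differently, so the values quoted above should not be expected to fall out of this exact formula. The blank absorbances and slope in the example are hypothetical.

```python
import numpy as np

def lod_loq(blank_signals, calibration_slope):
    """Limit of detection and limit of quantitation from the standard
    deviation of blank measurements and the calibration-curve slope
    (signal per CFU/ml or per ng/ml)."""
    sd_blank = np.std(blank_signals, ddof=1)
    lod = 3.3 * sd_blank / calibration_slope
    loq = 10.0 * sd_blank / calibration_slope
    return lod, loq

print(lod_loq([0.051, 0.048, 0.055, 0.050, 0.053], 2.5e-4))
```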
|