11

Improving the Modeling Framework for DCE-MRI Data in Hepatic Function Evaluation

Mossberg, Anneli January 2013
Background
Mathematical modeling combined with prior knowledge of the pharmacokinetics of the liver-specific contrast agent Gd-EOB-DTPA has the potential to extract more information from Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) data than previously possible. The ultimate goal of this work is to create a liver model that describes DCE-MRI data well enough to be used as a diagnostic tool in liver function evaluation. This goal has not yet been fully reached, and work remains to be done in this area. In this thesis, an existing liver model is implemented in the software Wolfram SystemModeler (WSM), the corresponding modeling framework is further developed to better handle the temporally irregular sampling of DCE-MRI data, and an attempt is made to determine an optimal sampling design in terms of when and how often to collect images. In addition to these original goals, the work done during this project revealed two further issues. Firstly, new standard deviation (SD) estimation methods for non-averaged DCE-MRI data were required in order to evaluate the models statistically. Secondly, the original model's poor description of the early dynamics of the system led to the creation of an additional liver model in an attempt to capture the bolus effect.
Results
The model was successfully implemented in WSM, after which regional optimization was implemented as a way to handle clustered data. Tests on the available data showed no substantial difference in optimization outcome, but since the analyses were performed on only three patient data sets, this is not enough to disregard the method. To determine optimal sampling times, the determinant of the inverse Fisher Information Matrix was minimized, which revealed that frequent sampling is most important during the initial phase (~50-300 s post injection) and at the very end (~1500-1800 s). Three new ways of estimating the SD were proposed; of these, a spatio-temporal SD was deemed most reasonable under the current circumstances. If a better initial fit is achieved, a further method that estimates the variance as an optimization parameter might be implemented. As a result of the new standard deviation, the model failed to be statistically accepted during optimizations. The additional model created to include the bolus effect, and thereby fit the initial-phase data better, was also rejected.
Conclusions
The value of regional optimization is uncertain at this time, and additional tests on a large number of patient data sets are needed to determine it. The Fisher Information Matrix will be of great use in determining when and how often to sample once the model achieves an acceptable fit in both the early and the late phase of the system. Although the indication that it is important to sample densely in the early phase is rather intuitive, given the poor model fit in that region, the analyses also revealed that the final observations have a relatively high impact on the model prediction error, which was not previously known. Hence, an important measure of how suitable a sampling design is, in terms of the resulting model accuracy, has been suggested. The original model was rejected due to its inability to fit the data during the early phase. This poor initial fit could not be improved sufficiently by modelling the bolus effect, so the new implementation of the model was also rejected. Recommendations are made in this thesis that might assist the further development of the liver model so that it can describe the true physiology and behaviour of the system in all phases. These include, but are not limited to, the addition of an extra blood plasma compartment, a more thorough modelling of the spleen's uptake of the contrast agent, and a separation of certain differing signals that are currently averaged.
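The D-optimality criterion used above, minimizing the determinant of the inverse Fisher Information Matrix over candidate sampling times, can be illustrated in a few lines. The sketch below is a generic stand-in, not the thesis's liver model: the bi-exponential response and its parameter values are hypothetical, and the candidate times merely mimic a 0-1800 s acquisition window.

```python
# Illustrative sketch of D-optimal sampling-time selection via the Fisher
# Information Matrix. The model is a generic bi-exponential stand-in, NOT
# the liver model from the thesis; all parameter values are hypothetical.
import numpy as np

def model(t, theta):
    # Hypothetical two-compartment-style response: theta = (A, k1, k2).
    A, k1, k2 = theta
    return A * (np.exp(-k1 * t) - np.exp(-k2 * t))

def sensitivities(t, theta, eps=1e-6):
    # Finite-difference sensitivity matrix S[i, j] = d y(t_i) / d theta_j.
    S = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        d = np.zeros(len(theta))
        d[j] = eps * max(1.0, abs(theta[j]))
        S[:, j] = (model(t, theta + d) - model(t, theta - d)) / (2 * d[j])
    return S

def log_det_fim(t, theta, sigma=1.0):
    S = sensitivities(np.asarray(t, float), theta)
    fim = S.T @ S / sigma**2          # FIM for i.i.d. Gaussian noise
    sign, logdet = np.linalg.slogdet(fim)
    return logdet if sign > 0 else -np.inf

def greedy_d_optimal(candidates, theta, n_points):
    # Greedily add the candidate time that most increases det(FIM),
    # i.e. most decreases det(FIM^{-1}) -- the criterion in the thesis.
    chosen = []
    for _ in range(n_points):
        best = max(candidates, key=lambda c: log_det_fim(chosen + [c], theta))
        chosen.append(best)
    return sorted(chosen)

theta0 = np.array([1.0, 1 / 200.0, 1 / 40.0])      # hypothetical values
times = greedy_d_optimal(list(range(10, 1801, 10)), theta0, n_points=8)
print(times)   # informative early and late times tend to be selected
```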
12

Some choices of moments of refinable function and applications

Zhanlav, Tugal 31 August 2006
We propose a recursive formula for the moments of a scaling function and a sum rule. It is shown that some quadrature formulae have a higher degree of accuracy under the proposed moment condition. On this basis we obtain higher-accuracy formulae for the wavelet expansion coefficients that are needed to start the fast wavelet transform, and we estimate the convergence rate of wavelet approximation and of the sampling of smooth functions. We also present a direct algorithm for solving the refinement equation.
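The abstract does not reproduce the recursive formula itself; the sketch below shows the classical recursion for moments of a scaling function in terms of its refinement mask, which is the standard form such a formula takes. The mask normalization sum(c_n) = 2 is assumed, and the Daubechies D4 mask is used only as an example.

```python
# Sketch of the classical moment recursion for a refinable (scaling) function
# phi(x) = sum_n c_n phi(2x - n) with sum_n c_n = 2 and integral(phi) = 1.
from math import comb, sqrt

def scaling_moments(c, K):
    # M_k = int x^k phi(x) dx, obtained from the refinement equation as
    #   M_k = 1/(2*(2^k - 1)) * sum_{j<k} C(k,j) * m_{k-j} * M_j,
    # where m_p = sum_n c_n * n^p are the discrete mask moments.
    m = lambda p: sum(cn * n**p for n, cn in enumerate(c))
    M = [1.0]                          # M_0 = 1 (unit integral)
    for k in range(1, K + 1):
        s = sum(comb(k, j) * m(k - j) * M[j] for j in range(k))
        M.append(s / (2 * (2**k - 1)))
    return M

# Daubechies D4 mask, normalized so the coefficients sum to 2:
s3 = sqrt(3.0)
d4 = [(1 + s3) / 4, (3 + s3) / 4, (3 - s3) / 4, (1 - s3) / 4]
print(scaling_moments(d4, 4))   # first moment is (3 - sqrt(3))/2 ~ 0.634
```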
13

Computer-aided applications in process plant safety

An, Hong January 2010
Process plants, which produce chemical products through pre-designed processes, are fundamental to the chemical engineering industry. The safety of hazardous processing plants is of paramount importance, as an accident could cause major damage to property and/or injury to people. HAZID is a computer system that helps designers and operators of process plants to identify potential design and operation problems in a given process plant design. However, several issues need to be addressed before such a system will be accepted for common use. Because HAZID is a model-based system with a library of equipment models, this research project considers how to improve its usability and acceptability by developing tools to test the developed models, so that users can gain confidence in HAZID's output. The research also investigates the development of computer-aided safety applications and how they can be integrated to extend HAZID to support different kinds of safety-related reasoning tasks. Three computer-aided tools and one reasoning system have been developed in this project. The first is the Model Test Bed, which tests the correctness of the models that have been built. The second is the Safe Isolation Tool, which defines the isolation boundary and identifies potential hazards for isolation work. The third is an Instrument Checker, which lists all instruments and their connections with process items in a plant, so that engineers can consider whether each instrument and its loop provide safeguards to the equipment during the hazard identification procedure. The fourth is a cause-effect analysis system that automatically generates cause-effect tables for control engineers to assess the safety design of a plant's control: the table shows process events and the corresponding process responses designed by the control engineer. The thesis gives a full description of these four tools and how they are integrated into the HAZID system to perform control safety analysis and hazard identification in process plants.
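As an illustration of the kind of cause-effect table such a system generates, a minimal sketch follows. The instrument tags, causes, and effects below are invented for illustration and do not come from the thesis.

```python
# Minimal illustrative cause-and-effect table: rows are process events
# (causes), columns are designed plant responses (effects). All tags and
# events here are hypothetical.
causes = ["LT-101 low-low level", "PT-203 high pressure", "TT-305 high temp"]
effects = ["Close XV-11", "Trip pump P-01", "Open relief PV-22"]

# An 'X' marks that the control design links this cause to this effect.
links = {("LT-101 low-low level", "Trip pump P-01"),
         ("PT-203 high pressure", "Close XV-11"),
         ("PT-203 high pressure", "Open relief PV-22"),
         ("TT-305 high temp", "Trip pump P-01")}

w = max(len(c) for c in causes)
print(" " * w, *[e[:14].ljust(14) for e in effects])
for c in causes:
    row = ["X".center(14) if (c, e) in links else " " * 14 for e in effects]
    print(c.ljust(w), *row)
```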
14

Process Control in High-Noise Environments Using A Limited Number Of Measurements

Barajas, Leandro G. January 2003
The topic of this dissertation is the derivation, development, and evaluation of novel hybrid algorithms for process control that use a limited number of measurements and are suitable for operation in the presence of large amounts of process noise. As an initial step, affine and neural-network statistical process models are developed in order to simulate the steady-state system behavior. Such models are vitally important in the evaluation, testing, and improvement of all other process controllers referred to in this work. Fuzzy logic controller rules are then assimilated into a mathematical characterization of a model that includes the modes and mode-transition rules defining a hybrid hierarchical process control. The main processing entity in this framework is a closed-loop control algorithm that performs a global and then a local optimization in order to asymptotically reach minimum bias error, while requiring a minimum number of iterations to promptly reach a desired operational window. The results of this research are applied to yield optimization of surface mount technology manufacturing lines. This work achieves a practical degree of control over solder-paste volume deposition in the Stencil Printing Process (SPP). Results show that it is possible to change the operating point of the process by modifying certain machine parameters, and even to compensate for the difference in height due to a change in print direction.
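The global-then-local optimization pattern described here can be sketched generically. The noisy objective below is a synthetic stand-in for the stencil-printing process, not the dissertation's actual models or algorithms; averaging repeated measurements is one simple way to cope with the heavy process noise.

```python
# Sketch of a global-then-local optimization under process noise: a coarse
# random search over the operating window, then derivative-free local
# refinement from the best coarse point. Objective is synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def process_bias(x):
    # Hypothetical squared bias between deposited and target paste volume,
    # corrupted by process noise.
    target = np.array([1.2, -0.7])
    return np.sum((x - target) ** 2) + 0.05 * rng.standard_normal()

def averaged(x, reps=10):
    # Average repeated noisy measurements to estimate the true bias.
    return np.mean([process_bias(x) for _ in range(reps)])

# Stage 1: global -- coarse random sample of the operating window.
candidates = rng.uniform(-3, 3, size=(200, 2))
x0 = candidates[int(np.argmin([averaged(c) for c in candidates]))]

# Stage 2: local -- derivative-free refinement from the best coarse point.
res = minimize(averaged, x0, method="Nelder-Mead")
print(x0, res.x)   # coarse estimate, then refined operating point
```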
15

Rigorous Polynomial Approximations and Applications

Joldes, Mioara Maria 26 September 2011
For purposes of evaluation and manipulation, mathematical functions f are commonly replaced by approximating polynomials p. Examples include floating-point implementations of elementary functions, quadrature, and the solution of ordinary differential equations (ODEs). A wide range of numerical methods exists for these tasks. We consider the application of such methods in the context of rigorous computing, where guarantees are required on the accuracy of the result with respect to both truncation and rounding errors. A rigorous polynomial approximation (RPA) for a function f defined over an interval [a,b] is a pair (P, Delta), where P is a polynomial and Delta is an interval such that f(x)-P(x) belongs to Delta for all x in [a,b]. In this work we analyse and bring forth several ways of obtaining RPAs for univariate functions. Firstly, we analyse and refine an existing approach based on Taylor expansions. Secondly, we replace these with better approximations such as minimax polynomials, truncated Chebyshev series, or Chebyshev interpolation polynomials. Several applications are presented: one on the implementation of standard functions in mathematical libraries (libm), another on the computation of truncated Chebyshev series expansions of solutions of linear ODEs with polynomial coefficients, and finally an automatic process for function evaluation with guaranteed accuracy on reconfigurable hardware.
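A toy instance of the (P, Delta) definition above: a Taylor-based RPA for exp, the coarse construction that the thesis refines with Chebyshev-based approximants. Note that floating-point rounding in evaluating P is ignored here, whereas a full rigorous implementation would have to enclose it as well.

```python
# Toy rigorous polynomial approximation (RPA): a pair (P, Delta) with
# f(x) - P(x) in Delta for all x in [a, b]. Here f = exp, P is its Taylor
# polynomial at the interval midpoint, and Delta comes from the Lagrange
# remainder. Rounding error in evaluating P is NOT tracked here.
from math import exp, factorial

def rpa_exp(a, b, n):
    c = (a + b) / 2.0
    coeffs = [exp(c) / factorial(k) for k in range(n + 1)]  # powers of (x - c)
    r = max(b - c, c - a)                      # max |x - c| on [a, b]
    # Since |exp^{(n+1)}| <= exp(b) on [a, b], the Lagrange remainder gives
    #   |f(x) - P(x)| <= exp(b) * r^{n+1} / (n+1)!
    bound = exp(b) * r ** (n + 1) / factorial(n + 1)
    return coeffs, (-bound, bound)             # the pair (P, Delta)

coeffs, delta = rpa_exp(0.0, 1.0, 8)
# Spot-check (not a proof): the enclosure holds at a sample point.
x, c = 0.9, 0.5
p = sum(ck * (x - c) ** k for k, ck in enumerate(coeffs))
assert delta[0] <= exp(x) - p <= delta[1]
print(delta)
```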
