  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

A Constraint Satisfaction Approach for Enclosing Solutions to Initial Value Problems for Parametric Ordinary Differential Equations

Janssen, Micha, 26 October 2001
This work considers initial value problems (IVPs) for ordinary differential equations (ODEs) in which some of the data is uncertain and given by intervals, as is the case in many areas of science and engineering. Interval methods provide a way to approach these problems, but they raise fundamental challenges in obtaining high accuracy at low computation cost. This work introduces a constraint satisfaction approach to these problems, which enhances traditional interval methods with a pruning step based on a global relaxation of the ODE. The relaxation uses Hermite interpolation polynomials and enclosures of their error terms to approximate the ODE. Our work also shows how to find an evaluation time for the relaxation that minimizes its local error. Theoretical and experimental results show that the approach produces significant improvements in accuracy over the best interval methods for the same computation costs. The results also indicate that the new algorithm should be significantly faster when the ODE contains many operations.
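The accuracy challenge the abstract alludes to shows up even in the simplest setting. The sketch below (illustrative only, not the thesis's algorithm) propagates an interval initial value through one explicit Euler step of x' = -x; naive interval evaluation already over-widens the enclosure, which is exactly the kind of overestimation that accuracy-focused techniques such as the pruning step aim to reduce.

```python
# Minimal interval-arithmetic sketch (illustrative only, not the thesis's
# algorithm): propagate an interval initial value through one explicit
# Euler step of x' = -x and watch the enclosure widen.

class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def scale(self, c):
        # multiply by a real scalar c (a negative c flips the endpoints)
        a, b = c * self.lo, c * self.hi
        return Interval(min(a, b), max(a, b))

def euler_step(x, h):
    # x' = -x  =>  x_{k+1} = x_k + h * (-x_k)
    return x + x.scale(-h)

x0 = Interval(0.9, 1.1)       # uncertain initial value
x1 = euler_step(x0, 0.1)      # [0.79, 1.01]: contains every true value

# The exact image 0.9 * [0.9, 1.1] = [0.81, 0.99] has width 0.18, but the
# naive interval evaluation yields width 0.22 -- the dependency problem
# that motivates sharper enclosure methods.
```

The enclosure is guaranteed (it contains every solution consistent with the interval data), but it is not tight; the thesis's contribution is precisely about tightening such enclosures without raising the computation cost.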

Evaluation of error and reproducibility of qPCR for absolute quantification of DNA

Cicero, Michael Carmen, 24 September 2015
Absolute quantitative PCR (qPCR) is a method that determines the concentration of DNA in a sample. Accurate and reproducible quantification is required during forensic DNA processing, since the results determine the volume of sample used during STR genotyping. If too little DNA is utilized, allelic dropout can occur; if too much DNA is used, an increase in the number of artifacts can result. In either case, sub-optimal DNA input masses can lead to misinterpretation of the evidentiary profile by increasing the probability of drop-in and/or drop-out. Generally, the qPCR method used during forensic DNA processing employs a set of standards, which are run with the questioned samples and used to generate a standard curve. These data are then used to establish a linear equation that is subsequently utilized to estimate the concentration of DNA in the unknown sample. However, standard curves have been shown to be prone to systematic and random error effects that impact the accuracy of the concentration estimate. This study examines two alternative methods to determine the DNA concentration of unknown samples and compares them to the currently accepted protocol of running new dilutions/standards with every assay. The two alternative methods are: 1) using a validated standard curve, and 2) using linear regression of efficiency. To examine the feasibility of using these two methods for forensic purposes, two samples were quantified by qPCR in quadruplicate over the course of three years, and concentrations were calculated using all three methods. The effects that time, kit lot, and instrument calibration had on the concentrations were examined for both total human and Y-DNA. Specifically, the methods were compared by examining variances in concentration over the three-year period and contrasting these results with the variances obtained within runs. The method which resulted in the smallest changes in concentration over time was regarded as the most stable.
Results show that of the three methods, the use of a validated curve resulted in the least variation in DNA concentration between multiple runs. Further, the factor that had the largest impact on concentration variance was the calibration of the instrument. Based on these results, recommendations are provided.
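The standard-curve procedure described above can be sketched as follows. The concentrations and Cq values are invented for illustration and are not data from the study; the mechanics (fit Cq against log10 of known concentrations, then invert the line for an unknown) follow the standard approach.

```python
# Hedged sketch of absolute quantification from a qPCR standard curve.
# Standards and Cq values below are illustrative, not data from the study.
import math

# Standards: (known concentration in ng/uL, measured Cq)
standards = [(50.0, 24.1), (5.0, 27.4), (0.5, 30.8), (0.05, 34.2)]

# Least-squares fit of Cq = m * log10(conc) + b
xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
b = ybar - m * xbar

# Amplification efficiency implied by the slope: E = 10**(-1/m) - 1
# (E close to 1.0 means near-perfect doubling each cycle)
efficiency = 10 ** (-1.0 / m) - 1.0

def quantify(cq):
    """Invert the standard curve to estimate an unknown's concentration."""
    return 10 ** ((cq - b) / m)
```

Running new standards with every assay refits `m` and `b` each time; the "validated curve" alternative the study evaluates amounts to freezing a well-characterized `m` and `b` and reusing them across runs.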

Suspension forces on a tri-axle air suspended semi-trailer

Kat, Cor-Jacques, 24 June 2009
The aim of this study is to investigate the use of multi-body vehicle simulation models to predict the suspension forces acting on the chassis of the vehicle, in order to perform durability analyses. Traditionally, the durability of vehicles is evaluated with proving ground tests. This implies that a physical prototype of the vehicle is required before its durability can be evaluated. If we were able to evaluate the durability of the vehicle without any physical part or a full prototype available, great cost and time savings could be gained. These possible gains have led to the use of computer aided engineering (CAE) tools. These tools have supplemented proving ground durability tests by using historical measured data and/or predicted data from vehicle simulation models as input to the durability analyses, i.e. finite element analyses (FEA). The usefulness of the historical test data is limited, and many of the vehicle simulation models used to predict the input data have not been validated. In this study a validated mathematical model of a 40-ton flat-bed tri-axle semi-trailer, able to predict the suspension forces, is created. The validation of the full vehicle model includes correlations for displacements, velocities, accelerations and forces of various vehicle parameters. A validated mathematical model of the air springs, which includes mass transfer and flow effects for use in full vehicle dynamic simulations, is also developed. The results obtained indicate that the air spring model, integrated into the full vehicle model, is able to give relatively accurate predictions of displacements, velocities, accelerations and forces of various vehicle parameters, over a discrete road event and over a rough road. / Dissertation (MEng)--University of Pretoria, 2009. / Mechanical and Aeronautical Engineering / Unrestricted
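A common textbook idealization of an air spring is the polytropic gas model sketched below. All parameter values are assumed for illustration, and the thesis's validated model additionally includes the mass transfer and flow effects that this sketch omits.

```python
# Hedged sketch of a simple polytropic air-spring force model (a common
# textbook idealization; the thesis's validated model also accounts for
# mass transfer and flow effects, omitted here). Parameter values are
# assumptions, not data from the study.

def air_spring_force(x, p0=5e5, v0=0.01, area=0.02, n=1.38, p_atm=1.013e5):
    """Force [N] for a piston-type air spring compressed by x [m].

    p0    : absolute initial gas pressure [Pa]   (assumed value)
    v0    : initial gas volume [m^3]             (assumed value)
    area  : effective piston area [m^2]          (assumed value)
    n     : polytropic exponent (1 = isothermal, ~1.4 = adiabatic)
    p_atm : ambient pressure [Pa]
    """
    v = v0 - area * x              # compressed gas volume
    p = p0 * (v0 / v) ** n         # polytropic process: p * V^n = const
    return (p - p_atm) * area      # net force on the piston

# The model is nonlinear: stiffness rises as the spring compresses.
f0 = air_spring_force(0.0)
f1 = air_spring_force(0.02)
```

The nonlinearity is the reason a static spring rate is insufficient for durability work: the force history fed into an FEA depends on how pressure evolves over the actual road input.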

Rigorous Simulation : Its Theory and Applications

Duracz, Adam, January 2016
Designing cyber-physical systems is hard. Physical testing can be slow, expensive and dangerous. Furthermore, computational components make testing all possible behavior infeasible. Model-based design mitigates these issues by making it possible to iterate over a design much faster. Traditional simulation tools can produce useful results, but those results are approximations that make it impossible to distinguish a useful simulation from one dominated by numerical error. Verification tools require skills in formal specification and an a priori understanding of the particular dynamical system being studied. This thesis presents rigorous simulation, an approach to simulation that uses validated numerics to produce results that quantify and bound all approximation errors accumulated during simulation. This makes it possible for the user to objectively and reliably distinguish accurate simulations from ones that do not provide enough information to be useful. Explicitly quantifying the error in the output has the side effect of yielding a tool for dealing with inputs that come with quantified uncertainty. We formalize the approach as an operational semantics for a core subset of the domain-specific language Acumen. The operational semantics is extended to a larger subset through a translation. Preliminary results toward proving the soundness of the operational semantics with respect to a denotational semantics are presented. A modeling environment with a rigorous simulator based on the operational semantics is described. The implementation is portable, and its source code is freely available. The accuracy of the simulator on different kinds of systems is explored through a set of benchmark models that exercise different aspects of a rigorous simulator. A case study from the automotive domain is used to evaluate the applicability of the simulator and its modeling language. In the case study, the simulator is used to compute rigorous bounds on the output of a model.
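The core idea of rigorous simulation, carrying an explicit error bound alongside the numerical state so the result is a guaranteed enclosure rather than a bare approximation, can be sketched in a few lines. This is illustrative only; Acumen's rigorous simulator is far more sophisticated.

```python
# Hedged sketch of rigorous simulation's core idea: widen the computed
# state by a bound on each step's truncation error, so the final result
# is a guaranteed enclosure of the true solution. (Illustrative only.)

def rigorous_euler(x0, h, steps):
    """Enclose the solution of x' = -x, x(0) = x0 > 0.

    Returns (lo, hi) bounding x(h * steps). Each Euler step commits a
    local truncation error of at most h**2 / 2 * max|x''| = h**2 / 2 * x_k
    (since x'' = x and the solution stays in (0, x0]); we widen the
    enclosure by that bound instead of silently discarding it.
    """
    lo = hi = x0
    for _ in range(steps):
        err = h * h / 2 * hi       # bound on this step's truncation error
        lo = lo * (1 - h) - err
        hi = hi * (1 - h) + err
    return lo, hi

lo, hi = rigorous_euler(1.0, 0.01, 100)   # encloses exp(-1) ~ 0.36788
```

A plain Euler run would report only the point value 0.99**100 ~ 0.36603 with no indication of its distance from the truth; the enclosure makes the accumulated error visible and bounded.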

Modal analysis of pedestrian-induced torsional vibrations based on validated FE models

Chamoun, Simon; Trabulsi, Marwan, January 2017
Finite element (FE) models serve as the basis of many different types of analysis, such as dynamic analysis. Hence, obtaining FE models that represent the actual behaviour of real structures with great accuracy is of great importance. However, more often than not, there are differences between FE models and the structures being modelled, which can depend on numerous factors. These factors include uncertainties in material behaviour, geometrical properties, and boundary and continuity conditions. Model validation is therefore an important aspect of obtaining FE models that represent reality to some degree. Furthermore, model verification is also important for verifying theoretical models other than FE models, in fields such as fatigue, fracture and dynamic analysis. In this thesis, two pedestrian steel bridges, the Kallhäll bridge and the Smista bridge, have been modelled in FE software based on engineering drawings and validated against experimental results with regard to their natural frequencies. Furthermore, a model accounting for pedestrian-induced torsional vibrations, based on modal analysis and termed the 3D SDOF model, has been developed in MATLAB. This model has been verified against the previously mentioned FE models. The aim of this thesis is hence twofold: the first part is to develop three-dimensional FE models of two pedestrian bridges and validate them against measured natural frequencies; the second part is to further develop a model for analysing the effect of pedestrian-induced torsional vibrations and to investigate whether the model captures the actual dynamic response of such loading. The results showed that the natural frequencies for the first bending and torsional modes from the FE models corresponded well to the measured ones, with the largest difference of 5 % obtained for the natural frequency of the first bending mode of the Smista bridge.
Furthermore, the 3D SDOF model was able to capture the dynamic response of torsional vibrations with an overall difference of less than 2 % in comparison to the FE models. The model can be improved by further studying the pedestrian-structure interaction as well as the effect of using approximate functions to describe the mode shapes. / Finite element (FE) models form the basis of many types of analysis, for example dynamic analysis. It is therefore of great importance that FE models represent the actual behaviour of real structures with high accuracy. Often, however, there are differences between FE models and the real structures being modelled. These differences can depend on a range of factors, such as uncertainties in material behaviour, geometrical properties, and support and boundary conditions. Model validation is therefore an important aspect of obtaining FE models that represent reality to varying degrees. Beyond model validation, model verification is also important, not only for FE models but also for other theoretical models in fields such as fatigue, fracture and dynamic analysis. In this work, two pedestrian bridges, the Kallhäll and Smista bridges, have been modelled in an FE program based on engineering drawings and validated against experimental results with respect to their natural frequencies. Furthermore, a model accounting for pedestrian-induced torsional vibrations, based on modal analysis and termed the 3D SDOF model, has been developed in MATLAB. The model has also been verified against the previously mentioned FE models. The aim of this work is thus divided into two parts: the first is to develop three-dimensional FE models of two pedestrian bridges and validate them against measured natural frequencies; the second is to develop a model for analysing the effect of pedestrian-induced torsional vibrations and to investigate whether the model captures the dynamic response. The results showed that the natural frequencies of the first bending and torsional modes from the FE models corresponded to the measured frequencies, with the largest relative difference of 5 % for the first bending mode of the Smista bridge. The results further showed that the developed 3D SDOF model was able to capture the dynamic response of torsional vibrations with a difference of less than 2 % compared with the FE models. The model can be improved by further studying the interaction between pedestrians and the bridge and by studying the effect of using approximate functions to describe the mode shapes.
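The building block of an SDOF modal model is the reduction of each vibration mode to a single-degree-of-freedom oscillator with a modal mass, stiffness and damping ratio. The sketch below uses invented parameters, not the thesis's bridge data, to show the two quantities such a model delivers: the natural frequency and the steady-state response to a harmonic (e.g. pacing-rate) load.

```python
# Hedged sketch of the SDOF building block behind a modal pedestrian-
# vibration model. All parameter values are illustrative assumptions.
import math

def natural_frequency(m, k):
    """Undamped natural frequency [Hz] of an SDOF oscillator."""
    return math.sqrt(k / m) / (2 * math.pi)

def steady_state_amplitude(m, k, zeta, f0, f_load):
    """Steady-state displacement amplitude [m] under a harmonic load
    of amplitude f0 [N] at frequency f_load [Hz]."""
    wn = math.sqrt(k / m)            # natural circular frequency
    r = (2 * math.pi * f_load) / wn  # frequency ratio
    return (f0 / k) / math.sqrt((1 - r * r) ** 2 + (2 * zeta * r) ** 2)

# Example: modal mass 40e3 kg, modal stiffness 5e6 N/m, 0.5 % damping.
fn = natural_frequency(40e3, 5e6)             # ~1.78 Hz, in the pacing range
x_res = steady_state_amplitude(40e3, 5e6, 0.005, 500.0, fn)  # at resonance
```

At resonance the amplitude collapses to (f0/k)/(2*zeta), which is why lightly damped footbridges with natural frequencies near walking rates are the critical case for this kind of analysis.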

The discriminatory ability of analytical quality control test methods : a comparison of test results from different international monographs of quinine sulfate tablets / Chantal Britz

Britz, Chantal, January 2013
Malaria is a parasitic disease claiming one million lives worldwide annually. Unfortunately, malaria-endemic countries in need of good-quality medicines are also overwhelmed with counterfeit or substandard medicine. This results in treatment inefficacy, resistance towards treatment, and death. Counterfeit or substandard quinine sulfate tablets are known to have infiltrated the market; fortunately, the treatment efficacy of quinine sulfate has not yet been significantly impaired by resistance, but immediate action is required to prevent it from becoming obsolete. Validated analytical methods with justified specifications are effective in controlling the quality of medicines and in minimising the effect of poor-quality medicines. Pharmacopoeias specify analytical quality control procedures and accompanying specifications to standardise acceptable levels of product quality. Understandably, the monographs of different pharmacopoeias are developed by different independent laboratories, and therefore their respective test procedures and specifications for the same finished pharmaceutical product (FPP) may differ. Institutions such as the Pharmacopoeial Discussion Group (PDG) aim to harmonise pharmacopoeias in order to synchronise final outcomes. This study evaluated the relevance of differences in analytical procedures, results and specifications for quinine sulfate tablets set by the United States Pharmacopoeia (USP), British Pharmacopoeia (BP) and International Pharmacopoeia (Ph.Int.), with the aim of ensuring that these different methods all provide similar final outcomes and are effective in evaluating the quality of quinine sulfate tablets. Four quinine sulfate tablet products were obtained from different manufacturers and were subjected to the tests of all three pharmacopoeias (BP, USP and Ph.Int.).
The results from identification, assay and related-substance testing showed that the outcomes were the same between the pharmacopoeias despite their differences in techniques, procedures and specifications. The assay, identification and related-substances methods and specifications set by each respective monograph were deemed appropriate to evaluate the quality of quinine sulfate tablets. Even with differences in methodology, quantitative techniques and specifications, the USP and BP dissolution methods for quinine sulfate tablets shared the same final outcome at the first stage of dissolution, whereas none of the products achieved a compliant outcome using the Ph.Int. dissolution method. Possible reasons for the poor dissolution (when using the Ph.Int. method) were identified and investigated. Investigation into the solubility of quinine sulfate found the Ph.Int. dissolution method conditions to be too stringent, as the solubility of quinine sulfate in phosphate buffer pH 6.8 (the dissolution medium specified by the Ph.Int.) was found to be much lower than in acidic media (as prescribed by the BP and USP dissolution methods). Several adapted dissolution methods (called developmental studies) were investigated as potential alternatives to the Ph.Int. dissolution method. The developmental studies investigated an alternative dissolution medium, agitation rates (50 rpm, 75 rpm, 100 rpm) and medium volumes (500 ml, 750 ml, 900 ml and 1000 ml). Developmental study 6 was proposed as an alternative dissolution method. It stipulates the use of the same medium as the original Ph.Int. method, as this was deemed the medium of choice for its discriminatory ability. To address the impaired solubility of quinine sulfate in phosphate buffer, the medium volume and agitation rate were increased (relative to the original method) to 900 ml and 100 rpm respectively. The same analytical quantitation technique (UV-Vis spectroscopy) is proposed for Developmental study 6.
The newly proposed method provided final outcomes comparable to those of the USP and BP, while having more discriminatory power. / MSc (Pharmaceutics), North-West University, Potchefstroom Campus, 2014
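The "first stage of dissolution" mentioned above refers to staged acceptance testing. The sketch below illustrates the common first-stage (S1) rule in the style of the USP's staged dissolution criteria: every one of six units must release at least Q + 5 percentage points of label claim. The Q value and the data are illustrative, not results from the study.

```python
# Hedged sketch of a first-stage (S1) dissolution acceptance check, in the
# style of pharmacopoeial staged criteria: at stage 1, each of six units
# must release at least Q + 5 percentage points of label claim.
# Q and the batch data below are illustrative, not from the study.

def passes_stage1(released_pct, q=75.0):
    """released_pct: % of label claim dissolved for each of 6 units."""
    assert len(released_pct) == 6
    return all(r >= q + 5.0 for r in released_pct)

batch_a = [86, 91, 88, 84, 90, 87]   # every unit >= 80 -> passes S1
batch_b = [86, 91, 78, 84, 90, 87]   # one unit below 80 -> proceeds to S2
```

A batch failing S1 is not rejected outright; it proceeds to later stages with more units and averaged criteria, which is why the study could compare methods at "the first stage of dissolution" specifically.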

Aspects of interval analysis applied to initial-value problems for ordinary differential equations and hyperbolic partial differential equations

Anguelov, Roumen, 09 1900
Interval analysis is an essential tool in the construction of validated numerical solutions of initial value problems (IVPs) for ordinary (ODE) and partial (PDE) differential equations. A validated solution typically consists of guaranteed lower and upper bounds for the exact solution, or for the set of exact solutions in the case of uncertain data; i.e., it is an interval function (enclosure) containing all solutions of the problem. IVP for ODE: The central point of discussion is the wrapping effect. A new concept of a wrapping function is introduced and applied in studying this effect. It is proved that the wrapping function is the limit of the enclosures produced by any method of a certain type (propagate-and-wrap type). The wrapping effect can then be quantified as the difference between the wrapping function and the optimal interval enclosure of the solution set (or some norm of it). Problems with no wrapping effect are characterized as problems for which the wrapping function equals the optimal interval enclosure. A sufficient condition for no wrapping effect is that there exists a linear transformation, preserving the intervals, which reduces the right-hand side of the system of ODEs to a quasi-isotone function. This condition is also necessary for linear problems and "near" necessary in the general case. Hyperbolic PDE: The initial value problem with periodic boundary conditions for the wave equation is considered. It is proved that under certain conditions the problem is an operator equation with an operator of monotone type. Using the established monotone properties, an interval (validated) method for numerical solution of the problem is proposed. The solution is obtained step by step in the time dimension as a Fourier series in the space variable and a polynomial in the time variable. The numerical implementation involves computations in Fourier and Taylor functoids.
Propagation of discontinuous waves is a serious problem when a Fourier series is used (the Gibbs phenomenon, etc.). We propose the combined use of periodic splines and Fourier series for representing discontinuous functions, and a method for propagating discontinuous waves. The numerical implementation involves computations in a Fourier hyper functoid. / Mathematical Sciences / D. Phil. (Mathematics)
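The wrapping effect discussed above is easy to demonstrate concretely: enclosing the image of a box under a rotation in axis-aligned boxes overestimates it, and the overestimation compounds over repeated steps. The sketch below is illustrative only and is not the thesis's wrapping-function construction.

```python
# Hedged sketch of the wrapping effect: wrapping the image of a box under
# a rotation in an axis-aligned box overestimates it, and repeating the
# propagate-and-wrap step compounds the overestimation. (Illustrative
# only, not the thesis's wrapping-function construction.)
import math

def rotate_box(lo_x, hi_x, lo_y, hi_y, theta):
    """Smallest axis-aligned box containing the rotated input box."""
    corners = [(x, y) for x in (lo_x, hi_x) for y in (lo_y, hi_y)]
    rotated = [(x * math.cos(theta) - y * math.sin(theta),
                x * math.sin(theta) + y * math.cos(theta))
               for x, y in corners]
    xs = [p[0] for p in rotated]
    ys = [p[1] for p in rotated]
    return min(xs), max(xs), min(ys), max(ys)

# Rotating the unit box by 45 degrees eight times returns every point to
# its starting place, yet the wrapped enclosure has grown by (sqrt(2))**8.
box = (-0.5, 0.5, -0.5, 0.5)
for _ in range(8):
    box = rotate_box(*box, math.pi / 4)
width = box[1] - box[0]   # 16.0: pure wrapping overestimation
```

The true solution set after the full cycle is still the unit box of width 1; everything beyond that in the final enclosure is wrapping overestimation, which is exactly the quantity the thesis's wrapping function is built to characterize.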

Predictors of outcome for severely emotionally disturbed children in treatment

Luiker, Henry George, January 2008
Doctor of Philosophy (PhD) / Despite general agreement that severely emotionally disturbed children and adolescents are an "at risk" group, and that ongoing evaluation of, and research into, the effectiveness of services provided for them is important, very little outcome evaluation actually takes place. The absence of well-conducted and appropriately interpreted studies is particularly notable for day or residential treatment programs, which cater for the most severely emotionally disturbed youths. This thesis outlines the main areas of conceptual, pragmatic and methodological confusion and neglect which impede progress in research in this area. It argues for plurality of data-analytic strategies and research designs. It then critically reviews the reported findings about the effectiveness of day and residential treatment in specialist facilities, and the predictors of good outcomes for this treatment type. This review confirms that there is very little to guide practice. Having argued for the legitimacy of its methods and the necessity of addressing basic questions, the thesis reports the results of a naturalistic study based on data accumulated during a decade-long evaluative research program at Arndell Child and Adolescent Unit, Sydney. The study addresses the question of which child, family and treatment variables predict outcome for 159 children and adolescents treated at this facility from 1990 to 1999. Statistically significant results with large effect sizes were obtained. Among the most disturbed subgroup of forty-three children, (a) psychodynamic milieu-based treatment was shown to be more effective than the "empirically validated" cognitive-behavioural treatment which superseded it in 1996, and (b) children from step-families showed better outcomes than those from other family structures.
Furthermore, it was found for the study sample as a whole that severe school-based problem behaviours were associated with a limited trajectory of improvement in home-based problem behaviour. These results are discussed with regard to implications for treatment, research methodology, policy and further studies.
