901 |
Bayesian inference about outputs of computationally expensive algorithms with uncertainty on the inputs. Haylock, Richard George Edward. January 1997.
In the field of radiation protection, complex computationally expensive algorithms are used to predict radiation doses to organs in the human body from exposure to internally deposited radionuclides. These algorithms contain many inputs, the true values of which are uncertain. Current methods for assessing the effects of the input uncertainties on the output of the algorithms are based on Monte Carlo analyses, i.e. sampling from subjective prior distributions that represent the uncertainty on each input, evaluating the output of the model and calculating sample statistics. For complex computationally expensive algorithms it is often not possible to obtain a large enough sample for a meaningful uncertainty analysis. This thesis presents an alternative general theory for uncertainty analysis, based on the use of stochastic process models in a Bayesian context. The measures provided by the Monte Carlo analysis are obtained, plus extra, more informative measures, using a far smaller sample. The theory is developed first in a general form and then specifically for algorithms whose input uncertainty can be characterised by independent normal distributions. The Monte Carlo and Bayesian methodologies are then compared using two practical examples. The first example is based on a simple model developed to calculate doses due to radioactive iodine. This model has two normally distributed uncertain parameters and, due to its simplicity, an independent measurement of the true uncertainty on the output is available for comparison. This exercise appears to show that the Bayesian methodology is superior in this simple case. The purpose of the second example is to determine whether the methodology is practical in a 'real-life' situation and to compare it with a Monte Carlo analysis. A model for calculating doses due to plutonium contamination is used. This model is computationally expensive and has fourteen uncertain inputs. The Bayesian analysis compared favourably with the Monte Carlo analysis, indicating that it has the potential to provide more accurate uncertainty analyses for the parameters of computationally expensive algorithms.
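The contrast the abstract draws, between brute-force Monte Carlo and a Bayesian analysis built on a stochastic process model, can be sketched in a few lines. The sketch below uses a Gaussian process regressor as the emulator and a cheap stand-in function in place of the expensive dose algorithm; the function, the priors, the design size and the kernel are illustrative assumptions, not the formulation used in the thesis.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def dose_model(x):
    # Stand-in for the expensive dose algorithm (two uncertain inputs).
    return np.exp(-x[:, 0]) * np.sin(3 * x[:, 1]) + x[:, 1] ** 2

def sample_inputs(n):
    # Subjective priors on the two inputs: independent normal distributions.
    return rng.normal(loc=[0.5, 1.0], scale=[0.2, 0.3], size=(n, 2))

# Brute-force Monte Carlo: many runs of the (notionally expensive) model.
x_mc = sample_inputs(10_000)
y_mc = dose_model(x_mc)
print("Monte Carlo mean/var:", y_mc.mean(), y_mc.var())

# Emulator approach: fit a Gaussian process to a small design of runs,
# then propagate the input uncertainty through the cheap emulator instead.
x_design = sample_inputs(30)            # only 30 runs of the expensive code
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(x_design, dose_model(x_design))

y_emu = gp.predict(sample_inputs(10_000))   # cheap emulator predictions
print("Emulator mean/var:  ", y_emu.mean(), y_emu.var())
```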
|
902 |
A framework for choosing the best model in mathematical modelling and simulation. Brooks, Roger John. January 1996.
No description available.
|
903 |
The application of deconvolution in well test analysis. Roumboutsos, Athena. January 1988.
No description available.
|
904 |
Forecasting monthly air passenger flows from Sweden: Evaluating forecast performance using the Airline model as benchmark. Robertson, Fredrik; Wallin, Max. January 2014.
In this paper, two different models for forecasting the number of monthly departing passengers from Sweden to any international destination are developed and compared. The Swedish transport agency produces forecasts on a yearly basis, with net export the only explanatory variable controlled for in its latest report. More in-depth studies have shown the relevance of controlling for variables such as unemployment rate, oil price and exchange rates. Because of the high seasonality in passenger flows, these forecasts are based on monthly or quarterly data. This paper shows that a seasonal autoregressive integrated moving average model with exogenous inputs outperforms the benchmark model forecast in seven out of nine months. Thus, controlling for the oil price, the SEK/EUR exchange rate and the occurrence of Easter reduces the mean absolute percentage error of the forecasts from 3.27% to 2.83% on Swedish data.
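A minimal sketch of the comparison described above, fitting the Airline benchmark (SARIMA(0,1,1)(0,1,1)12 on logged counts) against a SARIMAX model with exogenous regressors and scoring both by mean absolute percentage error, might look as follows. The data frame, the column names, the model orders and the nine-month horizon are illustrative assumptions rather than the paper's exact specification.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_and_forecast(df, horizon=9):
    """df is assumed to hold monthly observations with columns 'passengers',
    'oil_price', 'sek_eur' and an Easter dummy 'easter' (illustrative names)."""
    train, test = df.iloc[:-horizon], df.iloc[-horizon:]

    # Benchmark: the Airline model, SARIMA(0,1,1)(0,1,1)12 on log passengers.
    airline = SARIMAX(np.log(train["passengers"]),
                      order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    f_air = np.exp(airline.forecast(horizon))

    # SARIMAX with exogenous inputs: oil price, SEK/EUR rate, Easter dummy.
    exog_cols = ["oil_price", "sek_eur", "easter"]
    sarimax = SARIMAX(np.log(train["passengers"]), exog=train[exog_cols],
                      order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    f_sx = np.exp(sarimax.forecast(horizon, exog=test[exog_cols]))

    # Mean absolute percentage error of each forecast against the held-out months.
    mape = lambda y, f: float(np.mean(np.abs((y - f) / y)) * 100)
    return mape(test["passengers"], f_air), mape(test["passengers"], f_sx)
```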
|
905 |
A structural model of strategic alignment between information systems and business strategies. Wong, Hon Shu. January 2002.
No description available.
|
906 |
The characterisation of a thin film UV contactor and its application to the treatment of contaminated cutting oils. Peppiatt, Christopher J. January 1997.
The characteristics and applications of a novel design of a thin film photocontactor based on the principle of irradiating a 'water bell' with ultraviolet (UV) light are considered in this work. Measurements of UV doses received by the liquid films in single passes were made using both actinometric and bioassay-based methods. The chemical actinometer employed was potassium ferrioxalate (K3Fe(C2O4)3) and the microorganisms used in the bioassay were Pseudomonas stutzeri (mRG) and a repair-deficient strain of Escherichia coli (NCIMB 11190). Good agreement was obtained between the doses measured using actinometry and the E. coli-based bioassay. At higher doses, good agreement was also obtained for the dose estimates made using actinometry and the Ps. stutzeri bioassay. In addition, a hydrodynamic water bell model, previously developed in the literature, was combined with a UV intensity model to predict UV doses with generally good results. Microbially contaminated metal working fluids were identified as a suitable medium for disinfection using the thin film contactor because they are not treatable using conventional UV contactors, and because the systems employed in industry vary widely in scale. Batches of contaminated emulsion ranging in volume from 200 to 1000 L were successfully disinfected. Representative members of the microbial population were isolated, and their changing status throughout treatment recorded. Against expectations, the population showed no capacity for the post-irradiation repair of UV-induced damage. A simplified disinfection model was established in order to model the treatment of batches of contaminated metal working fluids. Preliminary predictions made using a combination of experimental data for the population as a whole and that for individual species, coupled with that generated using the hydrodynamic bell model, gave encouraging results.
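A simplified disinfection model of the kind mentioned above is commonly taken to be first order in UV dose. The sketch below illustrates that idea only; the rate constant is an illustrative placeholder, not a value fitted in the thesis, and per-pass doses would in practice come from actinometry, bioassay or the combined water-bell and intensity model.

```python
import numpy as np

def surviving_fraction(dose_mj_cm2, k=0.5):
    """First-order UV inactivation: N/N0 = exp(-k * dose).
    k (cm^2/mJ) is organism-specific; the default here is purely illustrative."""
    return np.exp(-k * np.asarray(dose_mj_cm2))

def dose_for_log_reduction(log10_reduction, k=0.5):
    # Dose required for a target log10 reduction under first-order kinetics.
    return np.log(10.0 ** log10_reduction) / k

# Example: doses needed for 1-, 3- and 5-log reductions with the assumed k.
for target in (1, 3, 5):
    d = dose_for_log_reduction(target)
    print(f"{target}-log reduction needs ~{d:.1f} mJ/cm2 "
          f"(survival {surviving_fraction(d):.1e})")
```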
|
907 |
Economic scheduling in electric power systems: a mathematical model for the U.A.E. Al-Gobaisi, Darwish M. K. F. January 1988.
No description available.
|
908 |
Predicting the Evolution of Influenza A. Sandie, Reatha. 02 April 2012.
Vaccination against the Influenza A virus (IAV) is an important and critical task for much of the population, as IAV causes yearly epidemics and can cause even deadlier pandemics. Designing the vaccine requires an understanding of the current major circulating strains of Influenza, as well as an understanding of how those strains could change over time to become either less harmful or more deadly, or simply die out completely. An error in the prediction process can lead to a non-immunized population at risk of epidemics, or even a pandemic. Presented here is a posterior predictive approach to generating emerging influenza strains based on a realistic genomic model that incorporates natural features of viral evolution such as selection and recombination. Also introduced is a sequence sampling scheme that relieves the computational burden of the posterior predictive analysis by clustering sequences based on their pairwise similarity. Finally, the impact of “evolutionary accidents”, taking the form of bursts of evolution and/or recombination, on the predictive power of the procedure is tested. An analysis of the impact of these bursts is carried out in a retrospective study that focuses on the unexpected emergence of a new H3N2 strain in the 2007-08 influenza season. Measuring the R2 values of both pairwise and patristic distances, the model reaches a predictive power of ∼40%, but is not able to simulate the emergence of the target Brisbane/10/2007 sequence with high probability. The inclusion of “evolutionary accidents” improved the algorithm’s ability to predict HA sequences, but the predictive power for the NA gene remained low.
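The sequence sampling scheme is described only at a high level, but the general idea of clustering aligned sequences by pairwise similarity and carrying one representative per cluster into the expensive posterior predictive simulation can be sketched as follows. The Hamming-distance metric, the average-linkage clustering and the threshold are illustrative assumptions, not the thesis's actual algorithm.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def hamming_matrix(seqs):
    """Pairwise Hamming distances between aligned sequences of equal length."""
    arr = np.array([list(s) for s in seqs])
    d = np.zeros((len(seqs), len(seqs)))
    for i in range(len(seqs)):
        d[i] = (arr != arr[i]).mean(axis=1)
    return d

def representative_sample(seqs, threshold=0.02):
    """Cluster sequences by pairwise similarity and keep one per cluster,
    reducing the number of sequences carried into the expensive analysis."""
    d = hamming_matrix(seqs)
    z = linkage(squareform(d, checks=False), method="average")
    labels = fcluster(z, t=threshold, criterion="distance")
    keep = {}
    for idx, lab in enumerate(labels):
        keep.setdefault(lab, idx)     # first member acts as the representative
    return [seqs[i] for i in sorted(keep.values())]

# Toy example: five short aligned sequences collapse to two representatives.
seqs = ["ACGTACGT", "ACGTACGA", "TTGTACGT", "ACGTACGT", "TTGTACGA"]
print(representative_sample(seqs, threshold=0.2))
```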
|
909 |
Model-Based Methodology for Building Confidence in a Dynamic Measuring System. Reese, Isaac Mark. 03 October 2013.
This thesis examines the special case in which a newly developed dynamic measurement system must be characterized when an accepted standard qualification procedure does not yet exist. In order to characterize this type of system, both physical experimentation and computational simulation methods will be used to build trust in this measurement system. This process of establishing credibility will be presented in the form of a proposed methodology.
This proposed methodology will utilize verification and validation methods that apply within the simulation community as the foundation for this multi-faceted approach. The methodology will establish the relationships between four key elements: physical experimentation, conceptual modeling, computational simulations, and data processing. The combination of these activities will provide a comprehensive characterization study of the system.
In order to illustrate the methodology, a case study was performed on a dynamic force measurement system owned by Sandia National Laboratories. This system was designed to measure the force required to pull a specimen to failure in tension at a user-input velocity. The results of the case study found that there was a significant measurement error occurring as the pull event involved large break loads and high velocities. 100 pull events were recorded using an experimental test assembly. The highest load conditions discovered a force measurement error of over 100%. Using computational simulations, this measurement error was reduced to less than 10%. These simulations were designed to account for the inertial effects that skew the piezoelectric load cells. This thesis displays the raw data and the corrected data for five different pull settings. The simulations designed using the methodology significantly reduced the error in all five pull settings.
In addition to the force analysis, the simulations provide insight into the complete system performance. This includes the analysis of the maximum system velocity as well as the analysis of several proposed design changes. The findings suggest that the dynamic measurement system has a maximum velocity of 28 fps, and that this maximum velocity is unaffected by the track length or the mass of the moving carriage.
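The inertial correction alluded to above amounts to subtracting the moving hardware's m·a contribution from the load-cell signal. The sketch below illustrates only that correction; the mass, acceleration history and load profile are illustrative placeholders, not values from the Sandia system.

```python
import numpy as np

def correct_measured_force(f_measured, accel, moving_mass_kg):
    """Remove the inertial contribution of the moving hardware from a
    piezoelectric load-cell signal: F_specimen ~= F_measured - m * a.

    f_measured     : measured force history (N)
    accel          : acceleration of the moving fixture (m/s^2)
    moving_mass_kg : effective mass between load cell and specimen (kg)
    """
    return np.asarray(f_measured) - moving_mass_kg * np.asarray(accel)

# Illustrative example: 2 kg of fixturing accelerating at 500 m/s^2 adds
# about 1 kN of apparent force to the cell reading during the pull.
t = np.linspace(0, 0.01, 200)
true_force = 5e3 * np.sin(np.pi * t / 0.01)      # specimen load (N)
accel = 500 * np.ones_like(t)                    # fixture acceleration (m/s^2)
measured = true_force + 2.0 * accel              # what the load cell reports

# The corrected signal recovers the specimen load (residual is ~0 here).
residual = correct_measured_force(measured, accel, 2.0) - true_force
print("max residual after correction:", np.max(np.abs(residual)), "N")
```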
|
910 |
Interpolation theorems for many-sorted infinitary languages. Sharkey, Robert John. January 1972.
No description available.
|