41

Global-Scale Modelling of the Land-Surface Water Balance : Development and Analysis of WASMOD-M / Global modellering av landområdenas vattenbalans : Utveckling och analys av WASMOD-M

Widén-Nilsson, Elin January 2007 (has links)
Water is essential for all life on earth. Global population increase and climate change are projected to increase water stress, which is already very high in many areas of the world. The differences between the largest and smallest global runoff estimates exceed the highest continental runoff estimates. These differences, which are caused by different modelling and measurement techniques together with large natural variability, need to be further addressed. This thesis focuses on global water balance models that calculate global runoff, evaporation and water storage from precipitation and other climate data. A new global water balance model, WASMOD-M, was developed. Even when tuned against the volume error alone, it reproduced within-year runoff patterns reasonably well, but the volume error was not enough to confine the model parameter space. The parameter space and the simulated hydrograph could be better confined with, e.g., the Nash criterion. Calibration against snow-cover data confined the snow parameters better, although some equifinality persisted. Thus, even the simple WASMOD-M showed signs of being overparameterised. A simple regionalisation procedure that utilised only spatial proximity contributed to a global runoff estimate in line with earlier estimates. The need for better specification of global runoff estimates was highlighted. Global modellers depend on global data-sets that can be of low quality in many areas. Major sources of uncertainty are precipitation and river regulation. A new routing method that utilises high-resolution flow-network information in low-resolution calculations was developed and shown to perform well across all spatial scales, while standard linear reservoir routing decreased in performance with decreasing resolution. This algorithm, called aggregated time-delay-histogram routing, is intended for inclusion in WASMOD-M.
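The aggregated time-delay-histogram routing named above can be sketched in miniature: the travel times of the high-resolution cells inside one coarse cell are aggregated into a normalised histogram of delays, and the coarse cell's runoff series is then convolved with that histogram. This is a minimal illustration of the idea under invented numbers, not the WASMOD-M implementation.

```python
import numpy as np

def delay_histogram(travel_times, n_bins):
    # Aggregate high-resolution cell travel times (in whole time
    # steps) into a normalised histogram of delays for one coarse cell.
    hist = np.bincount(travel_times, minlength=n_bins).astype(float)
    return hist / hist.sum()

def route(runoff, hist):
    # Convolve the coarse-cell runoff series with the delay histogram;
    # each unit of runoff arrives at the outlet spread over the
    # travel-time delays of the underlying fine cells.
    return np.convolve(runoff, hist)[: len(runoff)]

# Travel times (in model time steps) of six fine cells in one coarse cell
times = np.array([0, 1, 1, 2, 2, 2])
hist = delay_histogram(times, n_bins=4)
runoff = np.array([6.0, 0.0, 0.0, 0.0])
print(route(runoff, hist))
```

Because the histogram is built once from the high-resolution flow network, the coarse-resolution model keeps sub-grid timing information at negligible runtime cost.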
42

A multi-fidelity analysis selection method using a constrained discrete optimization formulation

Stults, Ian Collier 17 August 2009 (has links)
The purpose of this research is to develop a method for selecting the fidelity of contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies; when it is considered, it is treated in only a limited fashion, which calls into question the validity of selections based on those results. Neglecting model uncertainty can lead to costly redesigns of concepts later in the design process, or even to program cancellation. If, instead, one not only quantified the model uncertainty of the tools being used but also used this information to select the tools for each contributing analysis, studies could be conducted more efficiently and trust in the results could be quantified. Existing methods for doing this are generally not rigorous or traceable, and in many cases the improvement and the additional time spent on enhanced calculations are washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method that minimizes the time spent conducting computer simulations while meeting accuracy and concept-resolution requirements for the results.
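The selection problem described in this abstract can be posed, in a deliberately simplified form, as a constrained discrete optimization: pick one fidelity level per contributing analysis so that total runtime is minimized subject to an error budget. The analysis names, runtimes and error figures below are illustrative assumptions, not values from the thesis, and the brute-force enumeration merely stands in for whatever solver the method actually employs.

```python
from itertools import product

# Hypothetical fidelity options per contributing analysis:
# (runtime in hours, contributed error). All numbers are invented.
options = {
    "aero":       [(1, 0.20), (4, 0.08), (20, 0.02)],
    "structure":  [(1, 0.15), (6, 0.05)],
    "propulsion": [(2, 0.10), (10, 0.03)],
}

def select_fidelities(options, error_budget):
    # Enumerate every combination of fidelity levels and keep the
    # cheapest one whose summed error stays within the budget.
    best, best_time = None, float("inf")
    names = list(options)
    for combo in product(*(options[n] for n in names)):
        time = sum(t for t, _ in combo)
        err = sum(e for _, e in combo)
        if err <= error_budget and time < best_time:
            best, best_time = dict(zip(names, combo)), time
    return best, best_time

choice, hours = select_fidelities(options, error_budget=0.30)
print(hours, choice)
```

With only a handful of discrete levels per analysis, exhaustive enumeration is tractable; larger problems would need the kind of structured formulation the thesis develops.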
43

Revisiting the Effects of IMF Programs on Poverty and Inequality

Oberdabernig, Doris Anita 06 1900 (has links) (PDF)
Investigating how lending programs of the International Monetary Fund (IMF) affect poverty and inequality, we explicitly address model uncertainty. We control for endogenous selection into IMF programs using data on 86 low- and middle-income countries for the 1982-2009 period and analyze program effects on various poverty and inequality measures. The results rely on averaging over 90 specifications of treatment effect models and indicate adverse short-run effects of IMF agreements on poverty and inequality for the whole sample, while for a 2000-2009 subsample the results are reversed. There is evidence that significant short-run effects might disappear in the long run. (author's abstract)
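Averaging an effect estimate over many model specifications, as the abstract describes for 90 treatment-effect models, can be sketched with a toy example: fit the same outcome equation under every subset of controls and average the coefficient on the treatment dummy. The data below are synthetic and the plain OLS specifications are only a stand-in for the treatment-effect models used in the paper.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n = 200
controls = rng.normal(size=(n, 3))
treat = (rng.random(n) < 0.5).astype(float)
# Synthetic outcome with a true treatment effect of -1.0
y = -1.0 * treat + controls @ [0.5, -0.3, 0.2] + rng.normal(size=n)

effects = []
for k in range(4):
    for subset in combinations(range(3), k):
        # One "specification" = one choice of control variables
        X = np.column_stack([np.ones(n), treat] + [controls[:, j] for j in subset])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        effects.append(beta[1])  # coefficient on the treatment dummy

avg_effect = float(np.mean(effects))
print(len(effects), round(avg_effect, 2))
```

Averaging over specifications acknowledges that no single control set is known to be correct, which is the spirit of the model-uncertainty treatment in the paper.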
44

Integrating remotely sensed data into forest resource inventories / The impact of model and variable selection on estimates of precision

Mundhenk, Philip Henrich 26 May 2014 (has links)
The last twenty years have shown that integrating airborne laser scanning (Light Detection and Ranging; LiDAR) into forest resource assessments can help increase the precision of estimates. To make this possible, field data must be combined with LiDAR data, and various modelling techniques can describe this relationship statistically. While the choice of method usually has only a minor influence on point estimates, it yields different estimates of precision. This study investigated the influence of different modelling techniques and of variable selection on the precision of estimates, with a focus on LiDAR applications in forest inventories. The variable selection methods considered were the Akaike information criterion (AIC), the corrected Akaike information criterion (AICc), and the Bayesian (or Schwarz) information criterion. In addition, variables were selected based on the condition number and the variance inflation factor. Further methods considered in this study were ridge regression, the least absolute shrinkage and selection operator (lasso), and the random forest algorithm. The stepwise variable selection methods were examined under both model-assisted and model-based inference; the remaining methods were examined under model-assisted inference only. In an extensive simulation study, the influence of the modelling method and of the variable selection approach on the precision of estimates of population parameters (above-ground biomass in megagrams per hectare) was determined, using five different populations: three artificial populations were simulated, and two further populations were based on forest inventory data collected in Canada and Norway.
Canonical vine copulas were used to generate synthetic populations from these forest inventory data. Simple random samples were drawn repeatedly from the populations, and for each sample the mean and the precision of the mean estimate were estimated. While only one variance estimator was examined for the model-based approach, three different estimators were examined for the model-assisted approach. The results of the simulation study showed that naively applying stepwise variable selection methods generally leads to an overestimation of precision in LiDAR-assisted forest inventories. This bias in the estimated precision mattered mainly for small samples (n = 40 and n = 50); for larger samples (n = 400), the overestimation of precision was negligible. Ridge regression, the lasso, and the random forest algorithm showed good results in terms of coverage rates and empirical standard errors. From the results of this study it can be concluded that these latter methods should be considered in future LiDAR-assisted forest inventories.
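The core of the simulation design, repeated simple random samples drawn from a population with the mean and its precision estimated for each sample, can be sketched as a coverage experiment. The synthetic gamma population and the plain design-based variance estimator below are illustrative stand-ins for the inventory populations and model-assisted estimators of the study.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic right-skewed stand-in population (e.g. biomass in Mg/ha)
population = rng.gamma(shape=2.0, scale=50.0, size=50_000)
true_mean = population.mean()

def coverage(n, reps=1000):
    # Draw repeated simple random samples, build a 95% normal-theory
    # confidence interval for the mean from each, and report how often
    # the interval covers the true population mean.
    hits = 0
    for _ in range(reps):
        sample = rng.choice(population, size=n)
        m = sample.mean()
        se = sample.std(ddof=1) / np.sqrt(n)
        hits += (m - 1.96 * se) <= true_mean <= (m + 1.96 * se)
    return hits / reps

print(coverage(40), coverage(400))
```

Coverage rates below the nominal 95% signal an over-optimistic variance estimator, which is exactly the symptom the thesis reports for stepwise variable selection at small sample sizes.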
45

Contribution to the modelling of aircraft tyre-road interaction

Kiébré, Rimyalegdo 10 December 2010 (has links) (PDF)
This thesis is part of the French national project MACAO (Modélisation Avancée de Composants Aéronautiques et Outils associés). Carried out in collaboration with Messier-Dowty (a landing gear manufacturer), the thesis contributes to a better understanding of the current literature in the field of aircraft tyre-road interaction modelling and thereby helps in making an optimal choice of model for a specific application. The objective has been to propose models for representing tyre behaviour on the ground with respect to the different types of aircraft runs, with a preference for physically oriented models. To this end, a literature survey of previous research in tyre modelling for steady-state responses is first carried out. Then, based on the main factors that play an important role in tyre modelling, a classification of the physical and semi-empirical models, which are also investigated, is proposed. Based on this classification, the study requirements and the constraints of the measurement data, an a priori choice of suitable models is studied. A further investigation of tyre deformation at pure lateral slip is then carried out. It allows a physical description of the mechanism that generates the longitudinal component of the tyre force at pure lateral slip; this force is referred to as the induced longitudinal force. By taking this force into consideration, it has been possible to explain why the self-aligning moment can drop to zero before the tyre reaches full sliding at pure lateral slip. Finally, sensitivity analysis is proposed as a means of determining the parameters that have the most influence on the model output and are thus responsible for the output uncertainty.
46

Avaliação de incertezas em modelo de dano com aplicação a prismas de alvenaria sob compressão / Evaluation of model uncertainties of a damage model with application in masonry prisms under compression

Luiz Aquino Gonçalves Júnior 27 August 2008 (has links)
A norma brasileira de cálculo de alvenaria é baseada no método de tensões admissíveis e passa por revisão para ser escrita no método dos estados limites. A confiabilidade estrutural é um ramo da engenharia que mede segurança das estruturas, sendo muitas vezes empregada para calibrar fatores de segurança. Para medir a confiabilidade de uma estrutura deve-se conhecer as incertezas que envolvem o problema. A incerteza de modelo estima a tendência do modelo (que pode eventualmente ser eliminada) e a variância do modelo (uma medida da sua variabilidade). O presente trabalho propõe um método de cálculo da incerteza de um modelo numérico de um prisma formado por três unidades de concreto sujeito à compressão. O estudo numérico é feito em elementos finitos com análise não-linear baseada em dano. A incerteza é avaliada através de variáveis de projeto: tensão máxima, deformação na tensão máxima e módulo de elasticidade. São aplicados métodos probabilísticos para comparar resultados numéricos e ensaios experimentais disponíveis na literatura. Confronta-se a probabilidade de falha resultante de resistências corrigidas, sem correção e obtidas experimentalmente. Conclui-se que a incerteza de modelo é importante para quantificar a medida de segurança e deve ser levada em conta na análise da confiabilidade de uma estrutura. O procedimento também é útil para qualificar e comparar modelos de cálculo, com aplicações em alvenaria ou quaisquer outros tipos de estruturas. / The Brazilian masonry code is based on the allowable stress method and is currently under revision to be rewritten in the partial safety factor format. Structural reliability is a branch of engineering which allows quantitative evaluation of the safety of structures, and is often used to calibrate safety factors. To measure structural safety, it is necessary to know the uncertainties present in the problem. Model error variables estimate the bias of the model (which can eventually be eliminated) and the variance of the model (a measure of the model's variability). The present work proposes a method for evaluating the modeling uncertainty of the resistance of a prism made of three concrete units subject to compression. The numerical study is based on the finite element method and nonlinear analysis with damage mechanics. The uncertainty is evaluated through design variables: the maximum stress, the strain at maximum stress, and the elasticity modulus of the prism. Probabilistic methods are used to compare numerical results with experimental results taken from the literature. The probabilities of failure based on corrected, uncorrected, and experimentally obtained resistances are compared. It is concluded that model uncertainty is important for quantifying safety and must be taken into account in structural reliability analysis. The procedure is also useful for qualifying and comparing different models, with application to masonry or other kinds of structures.
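The model error variable described above, a bias that can be corrected plus a residual variability, can be sketched as the ratio of experimental to predicted resistance. The resistances below are invented numbers for illustration, not the thesis data.

```python
import numpy as np

# Hypothetical experimental vs. numerical prism resistances (MPa);
# the values are illustrative, not from the thesis.
experimental = np.array([21.4, 19.8, 22.1, 20.5, 18.9])
predicted    = np.array([20.0, 20.0, 21.0, 21.0, 19.5])

model_error = experimental / predicted   # model error variable
bias = model_error.mean()                # systematic tendency (correctable)
cov = model_error.std(ddof=1) / bias     # remaining model variability

corrected = predicted * bias             # bias-corrected predictions
print(round(bias, 3), round(cov, 3))
```

The bias can be removed by rescaling the predictions, while the coefficient of variation feeds into the reliability analysis as an irreducible model uncertainty.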
47

Modelling of conditional variance and uncertainty using industrial process data

Juutilainen, I. (Ilmari) 14 November 2006 (has links)
Abstract This thesis presents methods for modelling the conditional variance and the uncertainty of prediction at a query point on the basis of industrial process data. The introductory part of the thesis provides an extensive background of the examined methods and a summary of the results; the results are presented in detail in the original papers. The application presented in the thesis is the modelling of the mean and variance of the mechanical properties of steel plates, both of which depend on many process variables. A method for predicting the probability of rejection in a qualification test is presented and implemented in a tool developed for the planning of strength margins. The developed tool has been successfully utilised in the planning of mechanical properties in a steel plate mill. Methods for modelling the dependence of conditional variance on input variables are reviewed and their suitability for large industrial data sets is examined. In a comparative study, neural network modelling of the mean and dispersion performed best by a narrow margin. A method is presented for evaluating the uncertainty of regression-type prediction at a query point on the basis of the predicted conditional variance, the model variance, and the effect of uncertainty about explanatory variables at early process stages. A method for measuring the uncertainty of prediction on the basis of the density of the data around the query point is also proposed, and the proposed distance measure is utilised in comparing the generalisation ability of models. The generalisation properties of the most important regression learning methods are studied; the results indicate that local methods and quadratic regression have poor interpolation capability compared with multi-layer perceptrons and Gaussian kernel support vector regression. Finally, the possibility of adaptively modelling a time-varying conditional variance function is explored: two methods for adaptive modelling of the variance function are proposed, and the background of the developed adaptive variance modelling methods is presented.
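Predicting the probability of rejection in a qualification test from a predicted conditional mean and variance reduces, under a Gaussian assumption, to a normal tail probability. The sketch below uses that textbook reduction with invented strength values; it is not taken from the steel plate mill data.

```python
from statistics import NormalDist

def rejection_probability(pred_mean, pred_sd, spec_limit):
    # Probability that a test result falls below the specification
    # limit, assuming a Gaussian distribution with the predicted
    # conditional mean and standard deviation at the query point.
    return NormalDist(pred_mean, pred_sd).cdf(spec_limit)

# Illustrative numbers: predicted yield strength of 380 MPa with a
# conditional standard deviation of 15 MPa, against a 355 MPa limit.
p = rejection_probability(380.0, 15.0, 355.0)
print(round(p, 4))
```

Because the conditional variance varies with the process variables, two plates with the same predicted mean can have very different rejection probabilities, which is why the variance model matters for planning strength margins.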
48

Predicting Glass Sponge (Porifera, Hexactinellida) Distributions in the North Pacific Ocean and Spatially Quantifying Model Uncertainty

Davidson, Fiona 07 January 2020 (has links)
Predictions of species' ranges from distribution modeling are often used to inform marine management and conservation efforts, but few studies justify the model selected or quantify the uncertainty of the model predictions in a spatial manner. This thesis employs a multi-model, multi-area SDM analysis to develop higher certainty in the predictions where similarities exist across models and areas. Partial dependence plots and variable importance rankings were shown to be useful in lending further certainty to the results. The modeling indicated that glass sponges (Hexactinellida) are most likely to occur within the North Pacific Ocean where alkalinity is greater than 2.2 μmol l⁻¹ and dissolved oxygen is lower than 2 ml l⁻¹. Silicate was also found to be an important environmental predictor. All areas except Hecate Strait indicated that a high probability of glass sponge presence coincided with silicate values of 150 μmol l⁻¹ and over, although lower values in Hecate Strait confirmed that sponges can exist in areas with silicate values as low as 40 μmol l⁻¹. Three methods of showing the spatial uncertainty of model predictions were presented: the standard error (SE) of a binomial GLM, the standard deviation of predictions made from 200 bootstrapped GLM models, and the standard deviation of eight commonly used SDM algorithms. Certain areas with few input data points or extreme ranges of predictor variables were highlighted by these methods as having high uncertainty. Such areas should be treated cautiously regardless of the overall accuracy of the model as indicated by accuracy metrics (AUC, TSS), and could be targeted for future data collection. The uncertainty estimates produced by the multi-model SE differed from those of the GLM SE and the bootstrapped GLM. Uncertainty was lowest where the models predicted a low probability of presence and highest where they predicted a high probability of presence but disagreed slightly, indicating high confidence in where the models predicted the sponges would not occur.
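The second uncertainty method, the standard deviation of predictions from 200 bootstrapped GLMs, can be sketched as follows. The synthetic predictors and presence/absence records below stand in for the oceanographic layers and sponge observations used in the thesis, and scikit-learn's logistic regression stands in for the binomial GLM.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Synthetic stand-ins for two environmental predictors and for the
# presence/absence records (e.g. silicate and dissolved oxygen).
X = rng.normal(size=(300, 2))
y = (rng.random(300) < 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))).astype(int)
grid = rng.normal(size=(50, 2))          # prediction locations

preds = []
for _ in range(200):                     # 200 bootstrap refits, as in the thesis
    idx = rng.integers(0, len(y), len(y))  # resample the records with replacement
    model = LogisticRegression().fit(X[idx], y[idx])
    preds.append(model.predict_proba(grid)[:, 1])

# Per-location spread of the bootstrap predictions = spatial uncertainty
uncertainty = np.std(preds, axis=0)
print(uncertainty.shape)
```

Mapping `uncertainty` back onto the prediction grid gives the spatial uncertainty surface; locations far from the training data typically show the largest spread.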
49

Adaptive Feedback Regulator for Powered Lower-Limb Exoskeleton under Model Uncertainty

Thakkar, Kirtankumar J. January 2021 (has links)
No description available.
50

Safety formats for non-linear finite element analyses of reinforced concrete beams loaded to shear failure

Ekesiöö, Anton, Ekhamre, Andreas January 2018 (has links)
Several different methods exist for introducing a level of safety when performing non-linear finite element analysis of a structure. These methods, called safety formats, estimate safety by different means and formulas, which are discussed further in this thesis. The aim of this master thesis is to evaluate a model uncertainty factor for one safety format, the estimation of coefficient of variation (ECOV) method, since it is proposed for inclusion in the next version of Eurocode. The ECOV method is also compared with the most common and widely used safety format, the partial factor (PF) method. The first part of the thesis presents the different safety formats more thoroughly, followed by a theoretical part that provides deeper background on the finite element method and non-linear finite element analysis, together with beam theory explaining the shear mechanisms in different beam types. The study was conducted on six beams in total: three deep beams, previously tested in a laboratory in the 1970s, and three slender beams, previously tested in a laboratory in the 1990s. All beams failed in shear in the experimental tests. A detailed description of the beams is given in the thesis. The simulations of the beams were all performed in the FEM programme ATENA 2D to obtain close resemblance to the experimental tests. The results of the simulations show that the ECOV method generally gave a higher capacity than the PF method. For the slender beams, both methods gave rather high design capacities, with a mean of about 82% of the experimental capacity; for the deep beams, both methods gave low design capacities, with a mean of around 46% of the experimental capacity. The results regarding the model uncertainty factor indicate that its mean value should be around 1.06 for slender beams and around 1.25 for deep beams.
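The ECOV method referred to above is commonly formulated (following Červenka) by estimating the coefficient of variation of resistance from two analyses, one run with mean and one with characteristic material parameters, assuming a lognormal resistance distribution. The sketch below uses that general formulation with invented capacities and the EN 1990 sensitivity product α_R·β ≈ 3.04; it is not taken from the thesis.

```python
import math

def ecov_design_resistance(R_mean, R_char, alpha_beta=3.04):
    # ECOV: estimate the coefficient of variation of resistance from
    # two NLFEA capacities (mean vs. characteristic material input),
    # assuming a lognormal resistance distribution, then derive a
    # global safety factor and the design resistance.
    V_R = math.log(R_mean / R_char) / 1.65
    gamma_R = math.exp(alpha_beta * V_R)   # global safety factor
    return R_mean / gamma_R, V_R

# Illustrative capacities (kN) from two hypothetical analyses
R_d, V_R = ecov_design_resistance(R_mean=500.0, R_char=430.0)
print(round(R_d, 1), round(V_R, 3))
```

Only two non-linear analyses are needed, which is the method's practical appeal over full probabilistic simulation; a model uncertainty factor such as the 1.06 or 1.25 reported above would further divide the design resistance.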
