181

Climate impacts on hydrometric variables in the Mackenzie River Basin

Yip, Queenie 25 January 2008 (has links)
The research described in this thesis examines how the hydrologic cycle is affected by climate change in the Mackenzie River Basin (MRB) in northern Canada. The study focuses on five hydro-meteorological variables: runoff, evapotranspiration, storage, temperature, and precipitation. Two climate input data sets were used: Environment Canada gridded observed data and the European Centre for Medium-Range Weather Forecasts (ECMWF) re-analysis climate data (ERA-40). For both data sets, runoff and evapotranspiration were modelled using the WATFLOOD hydrological model for the period 1961 to 2002 on a 20 by 20 km grid. Trends were assessed on a monthly and annual basis using the Mann-Kendall non-parametric trend test. The hydrologic cycle in the MRB appears to be strongly influenced by climate change. The results reveal a general pattern of warming temperatures and increasing precipitation and evapotranspiration. Overall decreases in runoff and storage were detected in the Environment Canada data set, while increases in both were detected in the ECMWF data set. The trends in runoff and evapotranspiration reflected changes in both precipitation and temperature. The spatial pattern of changes in runoff followed the pattern of change in precipitation very closely in most months; in March and October, however, the effect of temperature changes was much more noticeable than that of precipitation changes. The change in the spatial distribution of evapotranspiration, on the other hand, better matched the pattern of temperature changes, yet its seasonal pattern followed that of precipitation more closely. The sensitivity of annual runoff to changes in climate was also estimated using a non-parametric estimator. Among the most important findings are: 1) runoff was more sensitive to precipitation than to temperature; 2) runoff was positively correlated with precipitation and evapotranspiration; 3) runoff was negatively correlated with temperature, implying that any increase in glacier melt runoff caused by rising temperatures was offset by evapotranspiration losses within the basin; 4) soil moisture storage may play an important role in the runoff and evapotranspiration processes; and 5) the sensitivity of mean annual runoff to changes in precipitation and evapotranspiration is typically lower along the Rocky Mountain chain, higher in the central zone of the Interior Plain, and highly varied in the Canadian Shield region of the basin. Correlation analysis suggested that agreement between the two data sets is very weak at the grid-cell level. However, the seasonal and spatial patterns of trends were broadly consistent between the two data sets, suggesting that the data are more reliable for identifying hydrological change at a regional scale than at the grid-cell level.
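For readers unfamiliar with the trend test used above, the following is a minimal sketch of the Mann-Kendall statistic and its normal approximation. It ignores tie corrections, which real hydrometric series may require, and the series shown is synthetic, not thesis data.

```python
import numpy as np
from math import erfc, sqrt

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (no tie correction).

    Returns the S statistic and an approximate p-value from the
    normal approximation, reasonable for n >~ 10.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = erfc(abs(z) / sqrt(2))  # two-sided p-value
    return s, p

# Example: a noisy warming trend over 42 "years" (1961-2002).
rng = np.random.default_rng(0)
series = 0.02 * np.arange(42) + rng.normal(0, 0.3, 42)
print(mann_kendall(series))
```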
182

Optimal Portfolio Execution Strategies: Uncertainty and Robustness

Moazeni, Somayeh 25 October 2011 (has links)
Optimal investment decisions often rely on assumptions about the models and their associated parameter values. Therefore, it is essential to assess the suitability of these assumptions and to understand the sensitivity of outcomes when they are altered. More importantly, appropriate approaches should be developed to achieve a robust decision. In this thesis, we carry out a sensitivity analysis on parameter values as well as model specification of an important problem in portfolio management, namely the optimal portfolio execution problem. We then propose more robust solution techniques and models to achieve greater reliability in the performance of an optimal execution strategy. The optimal portfolio execution problem yields an execution strategy to liquidate large blocks of assets over a given execution horizon to minimize the mean of the execution cost and risk in execution. For large-volume trades, a major component of the execution cost comes from price impact. The optimal execution strategy then depends on the market price dynamics, the execution price model, the price impact model, as well as the choice of the risk measure. In this study, first, the sensitivity of the optimal execution strategy to estimation errors in the price impact parameters is analyzed, when a deterministic strategy is sought to minimize the mean and variance of the execution cost. An upper bound on the size of change in the solution is provided, which indicates the contributing factors to the sensitivity of an optimal execution strategy. Our results show that the optimal execution strategy and the efficient frontier may be quite sensitive to perturbations in the price impact parameters. Motivated by our sensitivity results, a regularized robust optimization approach is devised for when the price impact parameters belong to some uncertainty set. We first illustrate that classical robust optimization might be unstable to variation in the uncertainty set. To achieve greater stability, the proposed approach imposes a regularization constraint on the uncertainty set before it is used in the minimax optimization formulation. Improvement in the stability of the robust solution is discussed and some implications of the regularization on the robust solution are studied. The sensitivity of the optimal execution strategy to market price dynamics is then investigated. We provide arguments that jump diffusion models using compound Poisson processes naturally model the uncertain price impact of other large trades. Using stochastic dynamic programming, we derive analytical solutions for minimizing the expected execution cost under jump diffusion models and compare them with the optimal execution strategies obtained from a diffusion process. A jump diffusion model for the market price dynamics suggests the use of Conditional Value-at-Risk (CVaR) as the risk measure. Using Monte Carlo simulations, a smoothing technique, and a parametric representation of a stochastic strategy, we investigate an approach to minimize the mean and CVaR of the execution cost. The devised approach can further handle constraints using a smoothed exact penalty function.
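For the mean-variance, deterministic-strategy case analyzed first, the classical Almgren-Chriss solution gives a closed-form liquidation schedule under linear temporary impact. The sketch below uses the continuous-time approximation with illustrative parameter values; it is not the thesis's perturbed or robust formulation.

```python
import numpy as np

def ac_trajectory(X, T, N, sigma, eta, lam):
    """Almgren-Chriss-style optimal liquidation holdings.

    X: shares to liquidate, T: horizon, N: trading intervals,
    sigma: price volatility, eta: temporary impact coefficient,
    lam: risk aversion weighting variance against expected cost.
    """
    kappa = np.sqrt(lam * sigma**2 / eta)  # urgency parameter
    t = np.linspace(0, T, N + 1)
    # Holdings decay from X to 0 along a sinh profile.
    return X * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

# Higher risk aversion (lam) front-loads the schedule.
print(ac_trajectory(X=1e6, T=1.0, N=10, sigma=0.3, eta=1e-6, lam=1e-6))
```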
183

Life Cycle Costing in the evaluation process of new production lines / Livscykelkostnad i utvecklingsprocessen av nya produktionslinor

Ludvigsson, Rebecka January 2010 (has links)
The purpose of this thesis is to develop a Life Cycle Cost (LCC) model that can be used for investment decisions, budgeting, and comparison of alternatives. An evaluation of existing models concluded that there was a need for a model that is easy to use and understand while remaining economically and technically sophisticated. Theoretical and empirical information was gathered in accordance with this purpose and formed the basis of the model, which highlights operating, energy, and maintenance costs. A case study to test the model was carried out at Swedwood International AB, a part of IKEA. Swedwood currently works with payback calculations, which can lead to wrong decisions when viewed over the whole economic life of an investment. The developed LCC model was tested on different techniques for applying an edge to a substrate. The result is that the user obtains a clear and structured overview of all costs of an investment over its economic life. A final investment decision demands further tests and evaluations, for example technical tests and MCDM. Further research on the LCC model could investigate whether it lacks any critical aspects that should be included. A recommendation for Swedwood is to follow up the centrally developed standards for collecting data at the factories, so that all sites gather data in the same way, which would facilitate the investigation of new techniques and comparison between investment options.
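At its core, an LCC comparison discounts each recurring cost activity (operating, energy, maintenance) over the economic life and adds the initial investment; a minimal sketch with purely hypothetical figures:

```python
def life_cycle_cost(investment, annual_costs, discount_rate):
    """Present value of an investment's life cycle cost.

    annual_costs: total operating + energy + maintenance cost per
    year, one entry per year of the economic life.
    """
    pv = investment
    for year, cost in enumerate(annual_costs, start=1):
        pv += cost / (1 + discount_rate) ** year
    return pv

# Compare two edge-application techniques over a 10-year economic
# life (all figures hypothetical, for illustration only).
option_a = life_cycle_cost(500_000, [60_000] * 10, 0.08)
option_b = life_cycle_cost(350_000, [90_000] * 10, 0.08)
print(option_a, option_b)  # lower present value wins, all else equal
```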
184

Managing Uncertainty in Engineering Design Using Imprecise Probabilities and Principles of Information Economics

Aughenbaugh, Jason Matthew 22 June 2006 (has links)
The engineering design community recognizes that an essential part of the design process is decision making. Because decisions are generally made under uncertainty, engineers need appropriate methods for modeling and managing uncertainty. Two important characteristics of uncertainty in the context of engineering design are imprecision and irreducible uncertainty. In order to model both of these characteristics, it is valuable to use probabilities that are, most generally, imprecise and subjective. These imprecise probabilities generalize traditional, precise probabilities; when the available information is extensive, imprecise probabilities reduce to precise probabilities. An approach for comparing the practical value of different uncertainty models is developed. The approach examines the value of a model using the principles of information economics: value equals benefits minus costs. The benefits of a model are measured in terms of the quality of the product that results from the design process. Costs are measured not only in terms of direct design costs, but also in terms of the costs of creating and using the model. Using this approach, the practical value of an uncertainty model that explicitly recognizes both imprecision and irreducible uncertainty is demonstrated in the context of a high-risk engineering design example in which the decision-maker has few statistical samples to support the decision. It is also shown that a particular imprecise probability model called probability bounds analysis generalizes sensitivity analysis, a process for identifying whether a particular decision is robust given the decision-maker's lack of complete information. An approach for bounding the value of future statistical data samples while collecting information to support design decisions is developed, and specific policies for making decisions in the presence of imprecise information are examined in the context of engineering.
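To illustrate how imprecise probabilities generalize precise ones, the toy sketch below propagates an interval-valued probability through an expected-value calculation. The numbers are hypothetical, and this is not the thesis's probability bounds analysis implementation.

```python
def expectation_bounds(outcomes, p_low, p_high):
    """Bounds on the expected value of a two-outcome design decision
    when the probability of the adverse outcome is imprecise, known
    only to lie in [p_low, p_high].

    outcomes: (adverse_value, nominal_value)
    """
    adverse, nominal = outcomes
    e = lambda p: p * adverse + (1 - p) * nominal
    # Expectation is linear in p, so the extremes sit at the endpoints.
    return min(e(p_low), e(p_high)), max(e(p_low), e(p_high))

# Few samples -> wide probability interval -> wide decision bounds.
lo, hi = expectation_bounds(outcomes=(-1e6, 2e5), p_low=0.01, p_high=0.15)
print(lo, hi)  # if the bounds straddle the threshold, gather more data
```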
185

Analyzing Non-Unique Parameters in a Cat Spinal Cord Motoneuron Model

Sowd, Matthew Michael 05 July 2006 (has links)
When modeling a neuron, modelers often focus on the values of parameters that produce a desired output. However, if these parameters are not unique, there could be a number of parameter sets that produce the same output. Thus, even though the values of the various maximum conductances, half-activation voltages, and so on differ, as a set they can produce the same spike height, firing rates, and so forth. To examine whether or not parameter sets are unique, a 3-compartment motoneuron model was created that has 15 target outputs and 59 parameters. Using parameter searches, over one hundred parameter sets were created for this model that produced the same output (within tolerances). Parameter values vary between parameter sets, indicating that they are not unique. In addition, some parameters are more tightly constrained than others. Principal component analysis is used to examine the dimensionality of the input and output spaces. However, neurons are more than static output generators. For example, a variety of neuromodulatory influences are known to shift parameter values to alter neuronal output. Thus the question arises as to whether this non-uniqueness extends from model outputs to the model's sensitivities to its parameters. In this work, the non-unique parameter sets are further analyzed using sensitivity analyses and output correlations, showing that these values vary significantly between parameter sets. Therefore, each of these models will react differently to parameter variation. This work concludes that parameter sets are non-unique and differ in their sensitivities and output correlations. The ramifications of this are discussed for both modelers and neuroscientists.
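As a sketch of the dimensionality analysis described, the following applies PCA (via SVD) to a matrix of parameter sets; the random matrix merely stands in for the thesis's one hundred-plus solution sets over 59 parameters.

```python
import numpy as np

# Hypothetical stand-in for the thesis's data: 100 parameter sets
# (rows) x 59 parameters (columns) that all met the 15 output targets.
rng = np.random.default_rng(1)
params = rng.normal(size=(100, 59))

# PCA via SVD of the centered, standardized parameter matrix.
z = (params - params.mean(axis=0)) / params.std(axis=0)
_, s, _ = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.cumsum(explained)[:10])  # effective dimensionality of the set
```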
186

Positive Analysis on the Stock Size of Argentine Shortfin Squid, Illex Argentinus in Southwest Atlantic

Wu, Pei-jung 08 July 2011 (has links)
This thesis is based on the Gordon-Schaefer model and assesses the Argentine shortfin squid stock using FAO data for the Southwest Atlantic from 1983 to 2009. First, the equilibrium levels of the open-access fishery and of the dynamically optimized fishery are estimated and compared. The annual stock size of the Argentine shortfin squid is then estimated and compared with the equilibrium levels of the two fishery models. The result is that the Argentine shortfin squid stock in the Southwest Atlantic is currently in no danger of extinction. In addition, the future stock size is simulated under managed and unmanaged scenarios. The result is that dynamic-optimization management would allow sustainable development of the Argentine shortfin squid fishery, making this model a good basis for management. Finally, based on the Southwest Atlantic catch of Argentine shortfin squid and catch fluctuations derived from the literature, a sensitivity analysis is performed.
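The Gordon-Schaefer logic underlying the assessment combines logistic surplus production with harvest proportional to effort and stock. The sketch below computes the open-access equilibrium stock (where rent is dissipated) and simulates a fixed-effort policy, using purely hypothetical parameter values.

```python
def schaefer_step(x, effort, r, K, q, dt=1.0):
    """One year of logistic surplus production less harvest q*E*x."""
    return x + dt * (r * x * (1 - x / K) - q * effort * x)

# Hypothetical bioeconomic parameters, for illustration only.
r, K, q = 0.8, 1.0e6, 5e-7   # growth rate, carrying capacity, catchability
p, c = 1000.0, 150.0         # price per tonne, cost per unit effort

x_open_access = c / (p * q)  # rent dissipates: profit per unit effort = 0
x_msy = K / 2                # stock giving maximum sustainable yield
print(x_open_access, x_msy)

# Simulate the stock under a fixed-effort policy.
x = K
for year in range(25):
    x = schaefer_step(x, effort=4e5, r=r, K=K, q=q)
print(x)
```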
187

Function-based Design Tools for Analyzing the Behavior and Sensitivity of Complex Systems During Conceptual Design

Hutcheson, Ryan S. 16 January 2010 (has links)
Complex engineering systems involve large numbers of functional elements, each of which can exhibit complex behavior itself. Ensuring the ability of such systems to meet the customer's needs and requirements requires modeling their behavior. Behavioral modeling allows a quantitative assessment of the ability of a system to meet specific requirements. However, modeling the behavior of complex systems is difficult due to the complexity of the elements involved and, more importantly, the complexity of these elements' interactions. In prior work, formal functional modeling techniques have been applied as a means of performing a qualitative decomposition of systems to ensure that needs and requirements are addressed by the functional elements of the system. Extending this functional decomposition to a quantitative representation of the behavior of a system represents a significant opportunity to improve the design process of complex systems. To this end, a functionality-based behavioral modeling framework is proposed along with a sensitivity analysis method to support the design process of complex systems. These design tools have been implemented in a computational framework and have been used to model the behavior of various engineering systems to demonstrate their maturity, application, and effectiveness. The most significant result is a multi-fidelity model of a hybrid internal combustion-electric racecar powertrain that enabled a comprehensive quantitative study of longitudinal vehicle performance during various stages of the design process. This model was developed using the functionality-based framework and allowed a thorough exploration of the design space at various levels of fidelity. The functionality-based sensitivity analysis implemented alongside the behavioral modeling approach provides measures similar to those of a variance-based approach at the computational burden of a local approach. The use of a functional decomposition in both the behavioral modeling and the sensitivity analysis significantly contributes to the flexibility of the models and their application in current and future design efforts. This contribution was demonstrated in the application of the model to the 2009 Texas A&M Formula Hybrid powertrain design.
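The local end of the trade-off mentioned above can be illustrated with normalized finite-difference sensitivities around a design point. This is a generic sketch with a toy stand-in model, not the thesis's functionality-based method.

```python
import numpy as np

def local_sensitivities(model, x0, rel_step=1e-3):
    """Normalized local sensitivities d(output)/d(param) * param/output,
    estimated by central finite differences around the design point x0.
    Assumes nonzero parameters and a scalar-valued model.
    """
    x0 = np.asarray(x0, dtype=float)
    y0 = model(x0)
    sens = np.empty_like(x0)
    for i in range(len(x0)):
        h = rel_step * x0[i]
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        sens[i] = (model(xp) - model(xm)) / (2 * h) * x0[i] / y0
    return sens

# Toy stand-in for a behavioral model: a performance metric as a
# function of mass, power, and drag coefficient (hypothetical).
model = lambda x: x[0] ** 0.9 / x[1] * (1 + 0.05 * x[2])
print(local_sensitivities(model, [1200.0, 80.0, 0.35]))
```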
188

A one-group parametric sensitivity analysis for the graphite isotope ratio method and other related techniques using ORIGEN 2.2

Chesson, Kristin Elaine 02 June 2009 (has links)
Several methods have been developed previously for estimating cumulative energy production and plutonium production from graphite-moderated reactors. The Graphite Isotope Ratio Method (GIRM) is one well-known technique. This method is based on the measurement of trace isotopes in the reactor's graphite matrix to determine the change in their isotopic ratios due to burnup. These measurements are then coupled with reactor calculations to determine the total plutonium and energy production of the reactor. To facilitate sensitivity analysis of these methods, a one-group cross section and fission product yield library for the fuel and graphite activation products has been developed for Magnox-style reactors. This library is intended for use in the ORIGEN computer code, which calculates the buildup, decay, and processing of radioactive materials. The library was developed using a fuel cell model in Monteburns. This model consisted of a single fuel rod comprising natural uranium metal fuel, magnesium cladding, carbon dioxide coolant, and Grade A United Kingdom (UK) graphite. Using this library, a complete sensitivity analysis can be performed for GIRM and other techniques. The sensitivity analysis conducted in this study assessed various input parameters, including 235U and 238U cross section values, the aluminum alloy concentration in the fuel, and the initial concentrations of trace elements in the graphite moderator. The results of the analysis yield insight into the GIRM method and the isotopic ratios it uses, as well as the level of uncertainty that may be found in the system results.
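The idea behind GIRM-style indicators can be sketched with one-group depletion of a trace impurity: the surviving fraction of a burnable isotope, relative to a stable reference, falls exponentially with cumulative fluence. The cross section and fluence values below are illustrative only, not values from the library developed in the thesis.

```python
import numpy as np

# One-group activation of a trace impurity in graphite: the ratio of
# a burnable indicator isotope to a stable reference tracks fluence.
sigma_a = 500e-24                    # absorption cross section, cm^2 (illustrative)
fluence = np.linspace(0, 5e21, 6)    # cumulative neutron fluence, n/cm^2

ratio = np.exp(-sigma_a * fluence)   # N(t)/N0 for the depleting isotope
print(ratio)  # inverting this curve infers fluence, hence energy production
```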
189

Calibration Of The Finite Element Model Of A Long Span Cantilever Through Truss Bridge Using Artificial Neural Networks

Yucel, Omer Burak 01 September 2008 (has links)
In recent years, Artificial Neural Networks (ANNs) have become widely popular tools in various disciplines of engineering, including civil engineering. In this thesis, a multi-layer perceptron network with back-propagation training is utilized in the calibration of the finite element model of a long-span cantilever through-truss, the Commodore Barry Bridge (CBB). The essence of calibration lies in comparing and correlating the structural response of an analytical model with experimental results as closely as possible. Since the CBB is a very large structure with complex structural mechanisms, formulating mathematical expressions that represent the relation between the dynamics of the structure and its structural parameters is very complicated. Furthermore, when the errors in the structural model and the noise in the experimental data are taken into account, a calibration study becomes even more tedious. At this point, ANNs are useful tools, since they have the capability of learning from noisy data and the ability to approximate functions. In this study, sensitivity analyses are first conducted to observe how the dynamic properties of the bridge vary with changes in its structural parameters. In the second part, the inverse relation between sensitive structural parameters and the modal frequencies of the CBB is approximated by training a neural network. This successfully trained network is then fed with experimental frequencies to acquire the as-is structural parameters, and model updating is achieved accordingly.
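A minimal sketch of the inverse-mapping step, using a toy forward map in place of the finite element model and scikit-learn's MLPRegressor in place of the thesis's network:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in for the thesis's training data: modal
# frequencies computed by the FE model for sampled parameters.
rng = np.random.default_rng(2)
params = rng.uniform(0.8, 1.2, size=(500, 6))   # stiffness multipliers
freqs = np.sqrt(params) @ rng.uniform(0.5, 2.0, size=(6, 10))  # toy forward map

# Train the inverse map: modal frequencies -> structural parameters.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
net.fit(freqs, params)

# Feed measured frequencies to recover as-is parameters
# (here, a training row stands in for the experimental data).
print(net.predict(freqs[:1]))
```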
190

Identification Of Low Order Vehicle Handling Models From Multibody Vehicle Dynamics Models

Saglam, Ferhat 01 January 2010 (has links)
Vehicle handling models are commonly used in the design and analysis of vehicle dynamics. Especially with the advances in vehicle control systems, the need for accurate yet simple vehicle handling models has increased. These models have parameters, some of which are known or easily obtainable, while others are unknown or difficult to obtain. The unknown parameters are obtained by system identification, the study of building models from experimental data. In this thesis, identification of vehicle handling models is based on data obtained from the simulation of a complex vehicle dynamics model in ADAMS representing the real vehicle, and a general methodology has been developed. The identified vehicle handling models are the linear bicycle model and vehicle roll models with different tire models. How the sensitivity of the model outputs to the model parameters changes with steering-input frequency has been examined by sensitivity analysis in order to design the test input. To show that the unknown parameters of the model can be identified uniquely, a structural identifiability analysis has been performed. The unknown parameters have been estimated, and the handling models thereby identified, by using mathematical optimization methods to minimize the difference between the data obtained from the simulation of the ADAMS vehicle model and the data obtained from the simulation of the simple handling models. The estimation task has been performed using the MATLAB Simulink Parameter Estimation Toolbox. Model validation has shown that the identified handling models represent the vehicle system successfully.
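For reference, the linear bicycle (single-track) model identified in the thesis can be written as a two-state linear system in lateral velocity and yaw rate. The sketch below builds its state-space matrices with hypothetical parameter values of the kind being estimated.

```python
import numpy as np

def bicycle_model(m, Iz, lf, lr, Cf, Cr, u):
    """Linear single-track ("bicycle") handling model.

    States: [v, r] = lateral velocity and yaw rate; input: front
    steer angle. m: mass, Iz: yaw inertia, lf/lr: axle distances to
    CG, Cf/Cr: cornering stiffnesses, u: forward speed.
    """
    A = np.array([
        [-(Cf + Cr) / (m * u), (Cr * lr - Cf * lf) / (m * u) - u],
        [(Cr * lr - Cf * lf) / (Iz * u), -(Cf * lf**2 + Cr * lr**2) / (Iz * u)],
    ])
    B = np.array([Cf / m, Cf * lf / Iz])
    return A, B

# Hypothetical parameter values of the kind estimated in the thesis.
A, B = bicycle_model(m=1500, Iz=2500, lf=1.2, lr=1.5, Cf=80e3, Cr=80e3, u=20.0)
xdot = A @ np.array([0.1, 0.05]) + B * 0.02  # response to a 0.02 rad steer
print(xdot)
```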
