1

Evaluation of Adaptation Options to Flood Risk in a Probabilistic Framework

Kheradmand, Saeideh 13 December 2021 (has links)
Probabilistic risk assessment (PRA) has been used in various engineering and technological fields to help regulatory agencies and decision-makers assess and reduce the risks inherent in complex systems. PRA allows decision-makers to make risk-informed choices rather than simply relying on traditional deterministic flood analyses (e.g., a Probable Maximum Flood) and therefore supports good engineering design practice. The type and quantity of available data are often a key factor in determining the best methodology at an early stage of a PRA. However, implementing PRA is challenging because probability distributions must be derived to describe the variable states. Flood protection is one of the rare fields in civil engineering where probability is extensively used to describe uncertainty and where the concept of failure risk is explicitly part of the design. The concept of return period is taught in civil engineering classes throughout the world, and most cities in the developed world have developed flood risk maps showing the limits of the 50-year or 100-year flood. While this approach is useful, it has several limitations:
• It is based on a single flow value, while all flow ranges contribute to the risk;
• It is not linked to the actual economic damage of floods;
• So far, flood risk maps account only for river water levels, even though intense rainfall has been shown to cause significant property damage in West Africa.
This study aimed to explore the possibility of developing and implementing a probabilistic flood risk estimation framework in which all flow ranges are accounted for: 1) the probability of flood occurrence and the probabilistic distribution of hydraulic parameters, and 2) the probability of damages are spatially calculated so that decision-makers can take optimal adaptation decisions (e.g., flood protection dike design, recommendations for new buildings, etc.). The study explored the challenges of inferring the probability distributions of different physical flood parameters in a context of sparse data, of linking those parameters to flood damages, and finally of translating the estimated risk into decisions. The effect of the choice of a one-dimensional (1-D) or two-dimensional (2-D) hydraulic model on the estimated flood risk, and ultimately on the adaptation decisions, was investigated. A first case study on the city of Niamey (Niger, West Africa) was performed using readily available data and 1-D and 2-D HEC-RAS (Hydrologic Engineering Center-River Analysis System) models. Adaptation options to flood risk in the Niamey area were examined by looking at two main variables: a) building material (CAS: informal constructions, a mixture of sun-dried clay and straw, also known as Banco; BAN: mud walls; DUR: concrete walls; and SDU: mud walls covered by mortar); and b) dike height, within a scenario-based framework where numerical modelling was undertaken to quantify the inundated area. The 1-D and 2-D HEC-RAS hydraulic models were tested on a 160 km reach of the Niger River. Using the numerical modelling, water levels within the inundated areas were identified, and the extent of residential areas as well as the exposed assets (polygons and building material) associated with each scenario were evaluated. A total of 1000 probabilistic flood maps were generated and used in the estimation of the total loss.
The benefits and costs of different adaptation options were then compared for the residential land-use class in order to produce flood risk maps for the city of Niamey. The results show the individual as well as the combined impact of the two abovementioned variables on the flood risk estimate for the Niamey region. Dike heights ranged from 180.5 m to 184 m at a 0.5 m interval, and each building material type was varied from 0% to 100% of the building stock. The results give decision-makers and regulators a quantitative tool for choosing the best preventive measures to alleviate the adverse impacts of floods. Also, because detailed information on the exposed infrastructure elements in the study area was lacking, a feasible yet fast and precise method of extracting buildings from high-resolution aerial images was investigated using an Artificial Intelligence (AI) method, Deep Learning (DL). The applied deep learning method showed promising results, with a high accuracy rate for the area of interest, and successfully identified the two introduced classes of Building and Background (non-building). The findings contend that although the proposed structural adaptation options, as an approach that resists the environment, are applicable to the area of interest and considered technically feasible, non-structural measures, which have a long-term risk-mitigation effect, should also be taken into consideration, especially for highly hazard-prone areas. The results of this study would significantly help in estimating losses to buildings from the yearly floods in the region of interest, Niamey, Niger. However, since the buildings are made of various materials, an accurate building database is of great importance in assessing the expected level of damage in the inundated areas, especially for critical buildings (hospitals, schools, research labs, etc.).
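(A hedged illustration of the core idea, not taken from the thesis: once damage has been estimated per flow scenario, a framework that accounts for all flow ranges can approximate expected annual damage by integrating damage over the annual exceedance probability. The return periods and damage values below are hypothetical.)

```python
# Sketch: expected annual damage (EAD) from a set of modelled flood scenarios.
# All numbers are illustrative, not results from the thesis.
import numpy as np

return_periods = np.array([2, 5, 10, 25, 50, 100, 500])   # years (hypothetical)
damages = np.array([0.0, 0.4, 1.1, 2.6, 4.0, 5.8, 9.5])   # loss per event, M$ (hypothetical)

aep = 1.0 / return_periods          # annual exceedance probability of each flow
order = np.argsort(aep)             # integrate from small to large probability
# EAD = integral of damage over exceedance probability (trapezoidal rule),
# so every flow range contributes, not just the 100-year flood.
ead = np.trapz(damages[order], aep[order])
print(f"Expected annual damage ~ {ead:.2f} M$/yr")
```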
2

Attributable Risk Estimation in Matched Case-Control Studies

Nuamah, Isaac 07 1900 (has links)
This project discusses some of the methodologies developed over the years to estimate the attributable risk among exposed persons and the attributable risk in the entire population (also called the etiologic fraction). It provides a general framework for estimating the attributable risk among the exposed (denoted lambda_e). By making use of the recent observation that the two measures of attributable risk can be linked through the prevalence of the risk factor among the cases (denoted V_x), an estimate of the population attributable risk (denoted lambda) for matched case-control studies is determined. Using the methodology developed by Kuritz and Landis (1987), this project provides explicit formulas for estimating the attributable risk among the exposed and the population attributable risk, together with their large-sample variances. This is done both for designs in which exactly R controls are matched to each case and for a variable number of controls per case. The methodologies are illustrated with data from case-control studies reported in the literature. Asymptotic relative efficiencies of different matching designs, computed in terms of the costs of gathering cases and controls, are presented, together with some recommendations on which design is considered optimal. / Thesis / Master of Science (MSc)
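(An illustrative sketch, not from the thesis: for 1:1 matched pairs the exposure odds ratio is estimated from the discordant pairs, and the two attributable risk measures are linked through the exposure prevalence among cases, exactly the V_x relation described above. The pair counts are hypothetical.)

```python
# Sketch: attributable risk from a 1:1 matched case-control study.
# For matched pairs, the odds ratio is estimated from discordant pairs:
#   psi = (case exposed, control unexposed) / (case unexposed, control exposed).
# Attributable risk among the exposed: lambda_e = 1 - 1/psi.
# Population attributable risk: lambda = V_x * lambda_e, where V_x is the
# prevalence of exposure among the cases. Counts below are hypothetical.

n_case_exp_ctrl_unexp = 40   # discordant pairs: case exposed only
n_case_unexp_ctrl_exp = 16   # discordant pairs: control exposed only
n_cases_exposed = 55
n_cases_total = 100

psi = n_case_exp_ctrl_unexp / n_case_unexp_ctrl_exp
lambda_e = 1.0 - 1.0 / psi                     # AR among the exposed
v_x = n_cases_exposed / n_cases_total          # exposure prevalence in cases
lambda_pop = v_x * lambda_e                    # population attributable risk

print(f"odds ratio ~ {psi:.2f}, lambda_e ~ {lambda_e:.2f}, lambda ~ {lambda_pop:.2f}")
```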
3

Effective dose estimation and risk assessment in patients treated with iodine 131I using Monte Carlo simulation

Zdraveska Kochovska Marina 05 December 2014 (has links)
<p>The most frequently used radiopharmaceutical for treatment of thyroid diseases such as Thyroid Cancer and Hyperthyroidism is radioactive iodine 131I. It has a very high success rate in treatment of patients with thyroid diseases and also it has been proven to be safe and relatively inexpensive treatment modality. Whenever radiation is used in the treatment of the benign or cancer diseases dosimetry is essential. The main aim of this study is to perform external measurements of dose rate after administered activity and to simulate dosimetry of internal organs and risk assessment using MCNP 4b code, either for thyroid cancer and Hyperthyroid patients. To search safety optimization in this kind of therapy, it was recognized the necessity of more accurate knowledge of dose levels received by stomach and other organs. Of great importance is to know the effective dose that will be reached in gastric and other surrounding organs such as liver, lung, bladder etc. Additional aim was to provide information to be used in the improvement of radiation therapy, radiation safety practices and improvement of the fundamentals of radiation protection as defined by ICRP: justification, optimization and application of dose limits. The significance of this research is that the doses to internal organs can be determined and it worth to mention that such internal dosimetry calculation has been performed&nbsp;&nbsp; rare in the field of nuclear medicine. In accordance with the calculations carried out during this study and reference available in the literature that the therapy with radioiodine will be improved at the Institute of pathophysiology and nuclear medicine. Designed quality programs will be useful also for regulatory and accreditation bodies in the process of accreditation and radiation protection strategy. This work is divided into interconnected chapters. Chapter 2 to 4 contains the literature review and theoretical background of the study. Chapter 3 covers material and methods and aspect of Monte Carlo transport code focused on MCNP 4b code. Chapter 4 provides the results and discussion of the findings. Conclusions and recommendations are discussed in Chapter 5. Chapter 6 contains references used in this work.</p> / <p>Cilj terapije sa radiaoktivnim jodom 131I kod pacijenata koji boluju od nekih tipova tiroidnih carcinoma i hipertiroidizma je isporuka doze i apsorpcija doze u tiroidnoj žlezdi. Terapija sa radioaktivnim jodom sprovodi se u obliku rastvora Na131I (natrijum jodida) u tečnoj formi ili aplicira se u formi kapsule. Efektivna doza je rezultat apsorbovane doze u tiroidnom tkivu, ali i ostali unutra&scaron;nji organi prime izvesnu dozu. Kapsule koje sadrže natrijum jodid ostaju u stomaku oko 15 minuta pre nego &scaron;to započne apsorpcija, vreme dovoljno dugo za rizično izlaganje. Ova činjenica je jedan od ciljeva doktorske teze, odrediti efektivnu dozu u stomaku i nekoliko unutra&scaron;njih okolnih organa modelovanje transporta i interakcije gama zračenja i beta čestica emitovanih iz radionuklida 131I je kori&scaron;ćen Monte Karlo kod (MCNP4b). Radiojod je modelovan kao tačkasti izvor na dnu stomaka. Proračunavana je apsorbovana energija po jedinici transformacije u stomaku i okolnim organima. Ekvivalentna doza u tim organima je izračunata da bi se odredila efektivna doza primenom odgovarajućih težinskih faktora. 
Dobijeni rezultati imaju značaja za za&scaron;titu od zračenja, ali su važni i za ustanovljavanje novih kalibracionih procedura kao deo sigurnosne kontrole i kontrole kvaliteta u proizvodnji i kontroli radiofarmaceutika kao i procedure administriranja radiofarmaceutika i primene bolnčikih puteva. Smatramo da če rezultati ovog istraživanja pobolj&scaron;ati bezbednosnu kulturu u na&scaron;em sistemu zdravstvene za&scaron;tite kao i u državnim organima koji kreiraju i donose regulative.</p>
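(A toy illustration only, not the thesis's MCNP model: the sketch Monte Carlo-samples isotropic photons from a point source and tallies those absorbed inside a spherical "organ", using a single made-up attenuation coefficient; the geometry and constants are hypothetical simplifications, with only the 364 keV gamma energy of 131I taken from standard decay data.)

```python
# Toy Monte Carlo sketch of energy deposition from a point gamma source.
# A crude stand-in for an MCNP-style calculation; numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_photons = 200_000
e_gamma = 0.364          # MeV, principal gamma energy of I-131
mu = 0.11                # 1/cm, made-up total attenuation coefficient in tissue
organ_center = np.array([0.0, 0.0, 8.0])   # cm from the source (hypothetical)
organ_radius = 3.0                          # cm

# Isotropic emission directions
u = rng.normal(size=(n_photons, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)

# Free path length sampled from an exponential distribution
path = rng.exponential(1.0 / mu, size=n_photons)
absorption_point = u * path[:, None]

# Tally photons absorbed inside the spherical "organ" (crude energy deposition)
inside = np.linalg.norm(absorption_point - organ_center, axis=1) < organ_radius
deposited = inside.sum() * e_gamma
print(f"absorbed fraction ~ {inside.mean():.4f}, "
      f"energy deposited ~ {deposited:.1f} MeV per {n_photons} photons")
```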
4

Three essays on stock market risk estimation and aggregation

Chen, Hai Feng 27 March 2012 (has links)
This dissertation consists of three essays. In the first essay, I estimate a high-dimensional covariance matrix of returns for 88 individual stocks from the S&P 100 index, using daily return data for 1995-2005. This study applies the two-step estimator of the dynamic conditional correlation multivariate GARCH model proposed by Engle (2002b) and Engle and Sheppard (2001), as well as variations of this model. It is the first study to estimate variances and covariances of returns using a large number of individual stocks (e.g., Engle and Sheppard (2001) use data on various aggregate sub-indexes of stocks), which avoids errors in the estimation of GARCH models caused by contemporaneous aggregation of stocks (e.g., Nijman and Sentana 1996; Komunjer 2001). Second, this is the first multivariate GARCH study to adopt a systematic general-to-specific approach to the specification of lagged returns in the mean equation. Various alternatives to simple GARCH are considered in the step-one univariate estimation, and the econometric results favour an asymmetric EGARCH extension of Engle and Sheppard's model. In essay two, I aggregate the variance-covariance matrix of return risk (estimated using DCC-MVGARCH in essay one) into an aggregate index of return risk. This measure of risk is compared with the standard approach of measuring risk from a simple univariate GARCH model of aggregate returns; in principle, the standard approach implies estimation errors due to contemporaneous aggregation of stocks. The two measures are compared in terms of correlation and economic value: the measures are not perfectly correlated, and the economic value of the improved estimate of risk as calculated here is substantial. Essay three has three parts. The major part is an empirical study of the aggregate risk-return tradeoff for U.S. stocks using daily data. Recent research indicates that past risk-return studies suffer from inadequate sample size, which suggests using daily rather than monthly data. Modeling dynamics/lags is critical in daily models, and apparently this is the first such study to model lags correctly using a general-to-specific approach. This is also the first risk-return study to apply Wu tests for possible problems of endogeneity/measurement error in the risk variable. Results indicate a statistically significant positive relation between expected returns and risk, as predicted by capital asset pricing models. Development of the Wu test leads naturally to a model relating the aggregate risk of returns to economic variables from the risk-return study. This is the first such model to include lags in variables based on a general-to-specific methodology and to include covariances of such variables. I also derive coefficient links between such models and risk-return models, so in theory these models are more closely related than has been realized in the past literature. Empirical results for the daily model are consistent with theory and indicate that the economic and financial variables explain a substantial part of the variation in the daily risk of returns. The first section of this essay also investigates, at a theoretical and empirical level, several alternative index-number approaches for aggregating multivariate risk over stocks. The empirical results indicate that these indexes are highly correlated for this data set, so only the simplest indexes are used in the remainder of the essay.
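(A hedged sketch of the two-step DCC idea, assuming the third-party `arch` package for the univariate fits; the DCC parameters a and b are fixed at illustrative values rather than estimated by quasi-maximum likelihood as in the essay, and `dcc_correlations` is a hypothetical helper, not the dissertation's code.)

```python
# Sketch: two-step DCC-style conditional correlations for a few stocks.
# Step 1: univariate GARCH(1,1) per series (arch package); step 2: DCC
# recursion on the standardized residuals.
import numpy as np
import pandas as pd
from arch import arch_model

def dcc_correlations(returns: pd.DataFrame, a: float = 0.03, b: float = 0.95):
    # Step 1: standardized residuals from univariate GARCH(1,1) fits
    std_resid = {}
    for col in returns:
        res = arch_model(returns[col], vol="Garch", p=1, q=1).fit(disp="off")
        std_resid[col] = res.resid / res.conditional_volatility
    eps = pd.DataFrame(std_resid).dropna().to_numpy()

    # Step 2: DCC recursion Q_t = (1-a-b)*Qbar + a*e_{t-1}e_{t-1}' + b*Q_{t-1}
    q_bar = np.corrcoef(eps, rowvar=False)   # unconditional correlation
    q = q_bar.copy()
    corrs = []
    for e in eps:
        d = np.sqrt(np.diag(q))
        corrs.append(q / np.outer(d, d))     # R_t from Q_t (info up to t-1)
        q = (1 - a - b) * q_bar + a * np.outer(e, e) + b * q  # update to Q_{t+1}
    return np.array(corrs)
```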
5

BAYESIAN SEMIPARAMETRIC GENERALIZATIONS OF LINEAR MODELS USING POLYA TREES

Schoergendorfer, Angela 01 January 2011 (has links)
In a Bayesian framework, prior distributions on a space of nonparametric continuous distributions may be defined using Polya trees. This dissertation addresses statistical problems for which the Polya tree idea can be utilized to provide efficient and practical methodological solutions. One problem considered is the estimation of risks, odds ratios, or other similar measures derived by specifying a threshold for an observed continuous variable. It has previously been shown that fitting a linear model to the continuous outcome under the assumption of a logistic error distribution leads to more efficient odds ratio estimates. We show that deviations from the assumption of logistic error can result in substantial bias in odds ratio estimates. A one-step approximation to the Savage-Dickey ratio is presented as a Bayesian test of the distributional assumptions in the traditional logistic regression model. The approximation utilizes least-squares estimates in place of a full Bayesian Markov chain simulation, and the equivalence of inferences based on the two implementations is shown. A framework for flexible, semiparametric estimation of risks is proposed for the case in which the assumption of logistic error is rejected. A second application deals with regression scenarios in which residuals are correlated and their distribution evolves over an ordinal covariate such as time. In the context of prediction, such complex error distributions need to be modeled carefully and flexibly. The proposed model introduces dependent but separate Polya tree priors for each time point, thus pooling information across time points to model gradual changes in distributional shapes. Theoretical properties of the proposed model are outlined, and its potential predictive advantages are demonstrated in simulated scenarios and real data.
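(An illustrative sketch of the linear-model-with-logistic-errors idea referenced above: if y = b0 + b1*x + s*e with e standard logistic, then P(y > c | x) follows a logistic regression with slope b1/s, so the odds ratio per unit of x is exp(b1/s). Simulated data; all parameter values are hypothetical, and for simplicity the scale s is treated as known.)

```python
# Sketch: odds ratio for exceeding a threshold, estimated by dichotomizing
# the outcome vs. derived from a linear fit to the continuous outcome.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, b0, b1, s, c = 2000, 1.0, 0.8, 1.5, 2.0   # hypothetical values
x = rng.normal(size=n)
y = b0 + b1 * x + s * rng.logistic(size=n)   # linear model, logistic errors

# Dichotomize at threshold c and fit ordinary logistic regression
z = (y > c).astype(float)
logit = sm.Logit(z, sm.add_constant(x)).fit(disp=0)
or_dichotomized = np.exp(logit.params[1])

# Under logistic errors the log-odds slope is exactly b1/s, so a linear fit
# to the full continuous outcome uses all of y, not just the exceedances.
ols = sm.OLS(y, sm.add_constant(x)).fit()
or_from_linear = np.exp(ols.params[1] / s)   # s assumed known for simplicity

print(f"OR (dichotomized logit) ~ {or_dichotomized:.2f}, "
      f"OR (linear fit / s) ~ {or_from_linear:.2f}, true = {np.exp(b1 / s):.2f}")
```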
6

Estimating Pedestrian Crashes at Urban Signalized Intersections

Kennedy, Jason Forrest 07 January 2009 (has links)
Crash prediction models estimate the number of crashes from a set of explanatory variables. The highway safety community has used such modeling techniques to predict vehicle-to-vehicle crashes for decades. Specifically, generalized linear models (GLMs) are commonly used because they can model non-linear count data such as motor vehicle crash counts; regression models such as the Poisson, zero-inflated Poisson (ZIP), and negative binomial are commonly used to model crashes. Until recently, however, very little research had been conducted on crash prediction modeling for pedestrian-motor vehicle crashes. This thesis considers several candidate crash prediction models using a variety of explanatory variables and regression functions. Its goal is to develop a pedestrian crash prediction model that contributes to the field of pedestrian safety prediction research. Additionally, the thesis contributes to the work done by the Federal Highway Administration to estimate pedestrian exposure in urban areas. The results of the crash prediction analyses indicate that the pedestrian-vehicle crash model is similar to models from previous work. An analysis of two pedestrian volume estimation methods indicates that a scaling technique produces volume estimates highly correlated with observed volumes. The ratio of crash and exposure estimates yields a crash rate estimate that is useful for traffic engineers and transportation policy makers in evaluating pedestrian safety at signalized intersections in an urban environment. / Master of Science
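(A minimal sketch of the kind of GLM named in the abstract, fit with statsmodels on synthetic data; the exposure variables and coefficients are hypothetical, not the thesis's model.)

```python
# Sketch: negative binomial crash prediction model for intersections.
# Synthetic data; in practice AADT and pedestrian volumes come from counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
log_aadt = rng.normal(9.5, 0.5, n)        # log vehicle volume (hypothetical)
log_ped = rng.normal(6.0, 0.8, n)         # log pedestrian volume (hypothetical)
mu = np.exp(-11.0 + 0.9 * log_aadt + 0.4 * log_ped)   # expected crash count

# Negative binomial counts via a gamma-Poisson mixture, overdispersion alpha
alpha = 0.5
lam = rng.gamma(shape=1 / alpha, scale=mu * alpha)
crashes = rng.poisson(lam)

X = sm.add_constant(np.column_stack([log_aadt, log_ped]))
nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()
print(nb.summary())                        # coefficients near (-11.0, 0.9, 0.4)
```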
7

Posouzení rizik stroje dle ČSN 12100:2011 / Risk assessment of the machine according to ČSN 12100:2011

Steklý, Jakub January 2018 (has links)
This thesis deals with the safety of an engraving machine. Its main contribution is a detailed risk assessment carried out according to the valid standards, together with a detailed design of measures to meet all safety requirements. Part of the thesis introduces the machine and its current state and outlines the risk analysis procedure; another part carries out the risk assessment itself. The conclusion includes the documentation and the EC certificate of conformity.
8

A risk-transaction cost trade-off model for index tracking

Singh, Alex January 2014 (has links)
This master's thesis considers and evaluates a few different risk models for stock portfolios, including the ordinary sample covariance matrix, factor models, and an approach inspired by random matrix theory. The risk models are evaluated by simulating minimum-variance portfolios and employing cross-validation. The Bloomberg+ transaction cost model is investigated and used to optimize portfolios of stocks with respect to a trade-off between the active risk of the portfolio and transaction costs. Further, a few different simulations are performed while using the optimizer to rebalance long-only portfolios. The optimization problem is solved using an active-set algorithm. A couple of approaches are shown that may be used to visually choose a value for the risk-aversion parameter λ in the objective function of the optimization problem. The thesis concludes that there is a practical difference between the different risk models evaluated: the ordinary sample covariance matrix does not perform as well as the other models. It also shows that more frequent rebalancing is preferable to less frequent rebalancing. Further, the thesis demonstrates a peculiar behavior of the optimization problem: the optimizer does not rebalance all the way to 0 in simulations, even if enough time is provided, unless this is explicitly required by the constraints.
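(A hedged sketch of the kind of objective described above: active risk plus a linear transaction-cost penalty, solved here with scipy's general-purpose solver on made-up data rather than the Bloomberg+ cost model or an active-set algorithm.)

```python
# Sketch: long-only rebalancing with a risk / transaction-cost trade-off.
# Objective: lambda * (w - b)' Sigma (w - b) + tc' |w - w0|  (illustrative).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 5
A = rng.normal(size=(n, n))
sigma = A @ A.T / n + 0.01 * np.eye(n)   # made-up covariance matrix
b = np.full(n, 1.0 / n)                  # benchmark weights (index tracking)
w0 = rng.dirichlet(np.ones(n))           # current portfolio
tc = np.full(n, 0.002)                   # linear cost per unit traded (hypothetical)
lam = 10.0                               # risk-aversion parameter

def objective(w):
    active = w - b
    return lam * active @ sigma @ active + tc @ np.abs(w - w0)

cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}   # fully invested
res = minimize(objective, w0, bounds=[(0, 1)] * n, constraints=cons)
print("rebalanced weights:", np.round(res.x, 4))
```

With a small enough trading cost relative to λ, the solution moves toward the benchmark; with a large cost it stays near w0, which mirrors the trade-off the thesis studies.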
9

A estimativa do risco na constituição da PDD. / The risk estimation for the allowance for doubtful accounts.

Vicente, Ernesto Fernando Rodrigues 15 May 2001 (has links)
This study reviews the main models for credit risk evaluation and for provisioning for losses with customers, and concludes by proposing the adoption of a statistical model intended to measure the risk associated with financing and lending to customers, with the consequent impact on the measurement of assets. Without aiming to exhaust the subject, the following steps were taken in developing the theme up to the final proposal. The introduction justifies the theme, states the research question associated with it, and describes the challenges accounting faces in measuring assets. Regarding risk management, the types of risk in general are described, credit risk in particular is detailed, and credit concession models are evaluated. On the constitution of the Allowance for Doubtful Accounts, the main authors in accounting and finance were researched, and similar propositions were found, which can be summarized in four provisioning models for losses with doubtful accounts: 1. Write-off; 2. Percentage over sales; 3. Percentage over receivables; 4. Aging of the portfolio. Next, the correlations between insolvency prediction models and credit losses are analyzed, where it is possible to verify that insolvency models are useful for credit concession but are little used for estimating the probable loss with doubtful accounts. On 21 December 1999, the Central Bank of Brazil (Banco Central do Brasil) issued Resolution 2,682, which urges financial institutions to change their methodologies for provisioning for losses with doubtful accounts. The Central Bank, however, does not indicate which model to use, leaving the development of the models to each institution. Using the Central Bank regulation as a reference, and seeking a scientific basis for the constitution of the allowance, a model for its constitution is proposed, tested, and evaluated, both for conformity with the Central Bank rules and for management purposes. To this end, a statistical model was developed by applying logistic regression to 202 customers of a financial institution, from which it was possible to conclude that the use of the model in the constitution of the Allowance for Doubtful Accounts can bring benefits in measuring the real value of investments in receivables.
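(A minimal sketch of the statistical idea, assuming scikit-learn; the variables and data are invented, the loss-given-default is implicitly taken as 100%, and the thesis's actual model specification is not reproduced here.)

```python
# Sketch: logistic regression default probabilities used to size the
# allowance for doubtful accounts. Synthetic portfolio; all fields invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 202
days_late = rng.integers(0, 120, n)          # payment delay (hypothetical)
leverage = rng.uniform(0.0, 1.0, n)          # customer leverage (hypothetical)
X = np.column_stack([days_late, leverage])
# Synthetic default labels correlated with the features
p_true = 1 / (1 + np.exp(-(-4.0 + 0.04 * days_late + 2.0 * leverage)))
default = rng.binomial(1, p_true)

model = LogisticRegression().fit(X, default)
pd_hat = model.predict_proba(X)[:, 1]        # estimated default probability

exposure = rng.uniform(1_000, 50_000, n)     # receivable per customer
allowance = float(np.sum(pd_hat * exposure)) # expected loss = sum(PD * exposure)
print(f"estimated allowance ~ {allowance:,.0f} over total "
      f"{exposure.sum():,.0f} in receivables")
```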
