1 |
Comparison of various methods of mitigating overpressure-induced release events involving ammonia refrigeration using quantitative risk analysis (QRA). Hodges, Tyler. January 1900
Master of Science / Department of Mechanical Engineering / Donald L. Fenton / This project determined the effectiveness of different methods of mitigating the effects of an ammonia release through a pressure relief device in an ammonia refrigeration system. Several methods were considered, and five were selected for further study: discharge into a tank of standing water, discharge into the atmosphere, discharge into a flare, discharge into a wet scrubber, and an emergency pressure control system. Discharge into a tank of standing water is the most common method in use today, but several people in the ammonia refrigeration industry have questioned its reliability. The methods were compared using a quantitative risk analysis, combining the failure rates of each system with ammonia dispersion modeling and the monetized health effects of a system's failure to contain an ammonia release.
It was determined that release height had a greater influence on the downwind cost impact than any other variable, including weather conditions and release from multiple sources. Discharge into a tank of standing water had the lowest failure rate, while the flare system was the most effective in terms of relative overall release consequence cost. The emergency pressure control system is now required by code, and any of the other mitigation systems would be very effective when used in conjunction with it.
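A minimal sketch of the comparison the abstract describes: each mitigation option is characterized by a probability of failing on demand and a monetized downwind consequence if it fails, and the two are combined with a relief-demand frequency into an expected annual loss. All option names and numbers below are illustrative placeholders, not values from the study.

```python
# Illustrative only: combine a mitigation system's probability of failing on
# demand with the monetized consequence of the resulting ammonia release.
# All numbers are made-up placeholders, not results from the thesis.

relief_demand_rate = 0.1  # assumed relief-device demands per year

mitigation_options = {
    # option: (probability of failure on demand, monetized consequence in $ if it fails)
    "water tank":              (0.01, 2.0e6),
    "atmospheric discharge":   (1.00, 1.5e6),  # no mitigation of the release itself
    "flare":                   (0.05, 0.2e6),
    "wet scrubber":            (0.05, 0.4e6),
    "emergency pressure ctrl": (0.10, 1.5e6),
}

for name, (pfd, consequence) in mitigation_options.items():
    annual_risk = relief_demand_rate * pfd * consequence  # expected $ loss per year
    print(f"{name:25s} expected annual loss ~ ${annual_risk:,.0f}")
```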
|
2 |
The MaRiQ model: A quantitative approach to risk management. Carlsson, Elin; Mattsson, Moa. January 2019
In recent years, cyber attacks and data fraud have become major issues for companies, businesses and nation states alike. The need for more accurate and reliable risk management models is therefore substantial. Today, cybersecurity risk management is often carried out on a qualitative basis, where risks are evaluated against a predefined set of categories such as low, medium or high. This thesis aims to challenge that practice by presenting a model that assesses risks quantitatively, hence the name MaRiQ (Manage Risks Quantitatively). MaRiQ was developed based on collected requirements and contemporary literature on quantitative risk management. The model consists of a clearly defined flowchart and a supporting tool created in Excel. To generate scientifically validated results, MaRiQ makes use of a number of statistical techniques and mathematical functions, such as Monte Carlo simulations and probability distributions. To evaluate whether the developed model really was an improvement over current qualitative processes, we conducted a workshop at the end of the project. The organization that tested MaRiQ found the model useful and felt that it fulfilled most of their needs. Our results indicate that risk management within cybersecurity can and should be performed using more quantitative approaches than is praxis today. Even though there are several potential developments to be made, MaRiQ demonstrates the possible advantages of transitioning from qualitative to quantitative risk management processes.
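The following is a minimal sketch of the general technique MaRiQ builds on (Monte Carlo aggregation of event probabilities and impact distributions), not the thesis's Excel tool; the risks and figures are invented.

```python
# Minimal Monte Carlo aggregation of a small cyber-risk register.
# Each risk has an annual probability of occurring and a lognormal impact
# distribution; the figures are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
risks = [
    # (annual probability, median impact in $, sigma of log-impact)
    (0.30, 50_000, 1.0),    # e.g. phishing-led data breach
    (0.10, 400_000, 0.8),   # e.g. ransomware outage
    (0.05, 1_000_000, 0.6), # e.g. large-scale data fraud
]

n_sims = 100_000
total_loss = np.zeros(n_sims)
for p, median, sigma in risks:
    occurs = rng.random(n_sims) < p
    impact = rng.lognormal(mean=np.log(median), sigma=sigma, size=n_sims)
    total_loss += occurs * impact

print(f"mean annual loss     : {total_loss.mean():,.0f}")
print(f"95th percentile loss : {np.percentile(total_loss, 95):,.0f}")
print(f"P(loss > 1M)         : {(total_loss > 1_000_000).mean():.2%}")
```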
|
3 |
Development of a computer-aided fault tree synthesis methodology for quantitative risk analysis in the chemical process industry. Wang, Yanjun. 17 February 2005
There has been growing public concern regarding the threat to people and the environment from industrial activities, and thus more rigorous regulations. The investigation of almost all major accidents shows that those tragedies could have been avoided with effective risk analysis and safety management programs. High-quality risk analysis is absolutely necessary for sustainable development.
As a powerful and systematic tool, fault tree analysis (FTA) has been adapted to the particular needs of chemical process quantitative risk analysis (CPQRA) and has found wide application. However, the application of FTA in the chemical process industry (CPI) is limited. One major barrier is the manual synthesis of fault trees: it requires a thorough understanding of the process and is vulnerable to individual subjectivity, so the quality of FTA can be highly variable. The availability of a computer-based FTA methodology would greatly benefit the CPI. The primary objective of this research is to develop a computer-aided fault tree synthesis methodology for CPQRA. The central idea is to capture the cause-and-effect logic around each item of equipment directly into mini fault trees. Special fault tree models have been developed to handle special features. Fault trees created by this method are expected to be concise. A prototype computer program is provided to illustrate the methodology. Ideally, FTA can be standardized through a computer package that reads information contained in process block diagrams and provides automatic aids to assist engineers in generating and analyzing fault trees.
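As a rough illustration of the mini-fault-tree idea, the sketch below encodes the cause-and-effect logic around two equipment items as AND/OR gates over basic-event probabilities, assuming independent events; the tree structure and numbers are invented and do not represent the dissertation's synthesis algorithm.

```python
# Toy fault-tree evaluation: basic-event probabilities propagated through
# AND/OR gates, assuming independent events. Illustrative structure only.
from functools import reduce

def AND(*probs):
    # probability that all inputs occur
    return reduce(lambda a, b: a * b, probs)

def OR(*probs):
    # probability that at least one input occurs (independent events)
    return 1 - reduce(lambda a, b: a * (1 - b), probs, 1.0)

# Mini fault tree around a pump: "no flow from pump"
no_flow = OR(
    AND(0.02, 0.05),  # both pumps fail (main, standby)
    0.001,            # suction valve left closed
    0.0005,           # power supply failure
)

# Mini fault tree around a vessel: "vessel overpressure"
overpressure = AND(
    no_flow,          # loss of cooling flow, propagated from the upstream mini tree
    0.01,             # relief valve fails to open on demand
)

print(f"P(no flow)      = {no_flow:.4e}")
print(f"P(overpressure) = {overpressure:.4e}")
```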
Another important issue with regard to QRA is the large uncertainty associated with available failure rate data. In the CPI, the ranges of observed failure rates can be quite wide. Traditional reliability studies using point values of failure rates may therefore lead to misleading conclusions. This dissertation discusses the uncertainty in failure rate data and proposes a procedure for dealing with data uncertainty when determining the safety integrity level (SIL) of a safety instrumented system (SIS). Effort must be directed at obtaining more accurate values for those data that actually affect the SIL estimate. The procedure guides process hazard analysts toward a more accurate SIL estimation and avoids misleading results due to data uncertainty.
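A hedged sketch of how failure-rate uncertainty can be propagated into a SIL estimate for a simple single-channel safety instrumented function, using the common low-demand approximation PFDavg ≈ λ_DU·TI/2 and the IEC 61508/61511 SIL bands; the lognormal spread on the failure rate is an assumed placeholder, and the dissertation's own procedure may differ.

```python
# Propagate failure-rate uncertainty into a SIL estimate for a 1oo1 function.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Dangerous undetected failure rate [per hour]: median 2e-6 with an assumed
# spread factor of ~3 to represent the wide ranges seen in generic databases.
lam_du = rng.lognormal(mean=np.log(2e-6), sigma=np.log(3), size=n)
proof_test_interval = 8760.0  # hours (annual proof test)

pfd_avg = lam_du * proof_test_interval / 2.0  # low-demand approximation

def sil(pfd):
    # IEC 61508/61511 low-demand PFDavg bands (anything below 1e-5 also reported as 4 here)
    if pfd < 1e-4: return 4
    if pfd < 1e-3: return 3
    if pfd < 1e-2: return 2
    if pfd < 1e-1: return 1
    return 0

sils = np.array([sil(p) for p in pfd_avg])
for level in range(5):
    frac = (sils == level).mean()
    if frac:
        print(f"SIL {level}: {frac:.1%} of samples")
print(f"point estimate using the median rate only: SIL {sil(2e-6 * proof_test_interval / 2)}")
```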
|
4 |
Hierarchical Bayesian Benchmark Dose Analysis. Fang, Qijun. January 2014
An important objective in statistical risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to hierarchical Bayesian modeling and credible limits for building BMDLs is far less developed, however, and for the few existing forms of Bayesian BMDs, informative prior information is seldom incorporated. Here, a new method is developed using reparameterized quantal-response models that explicitly describe the BMD as a target parameter. This potentially improves BMD/BMDL estimation by combining elicited prior belief with the observed data in the Bayesian hierarchy. The large variety of candidate quantal-response models available for applying these methods, however, leads to questions of model adequacy and uncertainty. To address this issue, the Bayesian estimation technique is further enhanced by applying Bayesian model averaging to produce point estimates and (lower) credible bounds. Implementation is facilitated via a Monte Carlo-based adaptive Metropolis (AM) algorithm to approximate the posterior distribution. Performance of the method is evaluated via a simulation study. An example from carcinogenicity testing illustrates the calculations.
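Below is a minimal sketch of the reparameterization idea for one candidate model, a quantal-linear dose-response in which the BMD appears explicitly as a parameter, fitted with a plain random-walk Metropolis sampler rather than the adaptive Metropolis and model averaging used in the dissertation; the data, priors and tuning constants are invented.

```python
# Reparameterized quantal-linear model: P(d) = g + (1-g)*(1 - exp(-b*d)),
# with the slope b expressed through the BMD so the BMD is sampled directly.
import numpy as np

rng = np.random.default_rng(0)
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n    = np.array([50, 50, 50, 50, 50])   # animals per dose group (illustrative)
y    = np.array([2, 4, 8, 15, 30])      # responders (illustrative)
BMR  = 0.10                             # benchmark response (extra risk)

def log_post(logit_g, log_bmd):
    g, bmd = 1 / (1 + np.exp(-logit_g)), np.exp(log_bmd)
    b = -np.log(1 - BMR) / bmd          # reparameterization: BMD -> slope
    p = g + (1 - g) * (1 - np.exp(-b * dose))
    loglik = np.sum(y * np.log(p) + (n - y) * np.log(1 - p))
    # assumed weakly informative priors: Beta(1,2) on g (with logit Jacobian), N(0, 2^2) on log(BMD)
    logprior = np.log(g) + 2 * np.log(1 - g) - 0.5 * (log_bmd / 2.0) ** 2
    return loglik + logprior

theta = np.array([-2.0, 0.0])           # start: g ~ 0.12, BMD ~ 1
bmd_draws, lp = [], log_post(*theta)
for _ in range(20_000):
    prop = theta + rng.normal(scale=[0.3, 0.2])
    lp_prop = log_post(*prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    bmd_draws.append(np.exp(theta[1]))

bmd_draws = np.array(bmd_draws[5000:])  # discard burn-in
print(f"posterior median BMD : {np.median(bmd_draws):.2f}")
print(f"BMDL (5th percentile): {np.percentile(bmd_draws, 5):.2f}")
```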
|
5 |
Evaluation of quantitative assessment extensions to a qualitative risk analysis method / Utvärdering av kvantitativa bedömningsutvidgningar till en kvalitativ riskanalysmetod. Svensson, Louise. January 2017
The usage of information systems (IS) within organizations has become crucial. Information is one of the most vulnerable resources within an enterprise. Information can be exposed, tampered with or made inaccessible, so that its integrity, confidentiality or availability is affected. The ability to manage risks is therefore a central issue in enterprises today. In order to manage risks, the risks need to be identified and further evaluated. All kinds of threats with the potential to negatively affect the confidentiality, integrity or availability of the organization need to be reviewed. The process of identifying and estimating risks and possible measures is called risk analysis. There are two main categories of risk analysis: qualitative and quantitative. A quantitative method involves interpreting numbers from data and is based on objective inputs. A qualitative method involves interpreting subjective inputs such as brainstorming and interviews. A common approach is to apply a qualitative method, but much criticism has been raised against using subjective inputs to assess risks. Secure State is a consulting company with specialist expertise in the field of information security. They help their customers build trust in their systems and processes, enabling their customers' businesses to operate with consideration for information security. One service offered by Secure State is risk analysis, which they currently perform qualitatively. Given the criticism of a qualitative approach to assessing risks, this study developed a quantitative risk analysis method for Secure State. According to participants in a risk analysis where the developed method was used, the quantitative risk analysis method improved the risk assessment. Since risks and their effects are decomposed into smaller components in the proposed method, interpretations of risks and their meaning are less likely to differ during assessments. The common understanding of a risk therefore increases, which improves the quality of the risk evaluation. Furthermore, the usage of statistical data increases in the developed quantitative risk analysis method. Additionally, the quantitative method handles the fact that all data used is imperfect: the data is used to describe a future that has not yet happened.
|
6 |
Validity and validation of safety-related quantitative risk analysis: A review. Goerlandt, Floris; Khakzad, Nima; Reniers, Genserik. 11 November 2020
Quantitative risk analysis (QRA) is widely applied in several industries as a tool to improve safety, as part of design, licensing or operational processes. Nevertheless, there is much less academic research on the validity and validation of QRA, despite their importance both for the science of risk analysis and for its practical implications for decision-making and improving system safety. In light of this, this paper presents a review focusing on the validity and validation of QRA in a safety context. Theoretical, methodological and empirical contributions in the scientific literature are reviewed, focusing on three questions. Which theoretical views on the validity and validation of QRA can be found? Which features of QRA are useful for validating a particular QRA, and which frameworks have been proposed to this effect? What kinds of claims are made about QRA, and what evidence is available that QRA is valid for the stated purposes? A discussion follows the review, focusing on the available evidence for the validity of QRA and the effectiveness of validation methods.
|
7 |
Improving Project Management With Simulation And Completion Distribution Functions. Cates, Grant. 01 January 2004
Despite the critical importance of timely project completion, the management practices in place today remain inadequate for addressing the persistent problem of late projects. Uncertainty has been identified as a contributing factor: it resides in activity duration estimates, unplanned upsetting events, and the potential unavailability of critical resources. This research developed a comprehensive simulation-based methodology for conducting quantitative project completion-time risk assessments. The methodology enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function for their project. Discrete event simulation is used to determine a project's completion distribution function. The project simulation is populated with both deterministic and stochastic elements. Deterministic inputs include planned activities and resource requirements. Stochastic inputs include activity duration growth distributions, probabilities for unplanned upsetting events, and other dynamic constraints upon project activities; they are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Multiple replications of the simulation are run to create the completion distribution function. The methodology was demonstrated to be effective for the ongoing project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. Project stakeholders participated in determining and managing completion distribution functions. The first result was improved awareness of project completion risk. Second, mitigation options were analyzed to improve project completion performance and reduce total project cost.
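A minimal Monte Carlo sketch of a completion distribution function for a toy serial project, with triangular duration growth and a chance of unplanned upsets per activity; the network, distributions and probabilities are invented and stand in for the dissertation's full discrete-event simulation populated from historical data.

```python
# Monte Carlo completion-time distribution for a toy serial project.
import numpy as np

rng = np.random.default_rng(42)

# (planned duration in days, pessimistic growth factor) - assumed values
activities = [(30, 1.6), (45, 1.4), (20, 2.0), (60, 1.3)]
p_upset, upset_delay = 0.15, (10, 40)  # 15% chance of a 10-40 day upset per activity

n_sims = 50_000
totals = np.zeros(n_sims)
for planned, growth in activities:
    # triangular growth: best = most likely = planned, worst = planned * growth
    durations = rng.triangular(planned, planned, planned * growth, size=n_sims)
    upsets = (rng.random(n_sims) < p_upset) * rng.uniform(*upset_delay, size=n_sims)
    totals += durations + upsets

deadline = 200  # planned completion, days (assumed)
print(f"median completion       : {np.median(totals):.0f} days")
print(f"90th percentile         : {np.percentile(totals, 90):.0f} days")
print(f"P(late vs {deadline}-day plan): {(totals > deadline).mean():.1%}")
```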
|
8 |
Estudo sobre a modelagem da dispersão atmosférica de gases densos decorrente de liberações acidentais em análise quantitativa de risco / Study on the dense gas atmospheric dispersion from accidental releases in quantitative risk analysis. Salazar, Márcio Piovezan. 02 June 2016
Society's growing perception of the hazards inherent in industrial facilities that handle large inventories of hazardous substances has made quantitative risk analysis an increasingly important tool in the complex discussion of the feasibility of such facilities, with the aim of promoting adequate land use in urban areas and preventing so-called major accidents. To arrive at the risk expression for a given industrial facility, a set of techniques and mathematical models must be applied, among them atmospheric dispersion models, which are used to estimate the off-site area affected by accidental releases that form clouds of chemical substances in the atmosphere. Because of the complexity inherent in the atmospheric dispersion process itself, especially for so-called dense (heavier-than-air) gases, there is a wide variety of models that can be applied within the scope of risk analysis, which naturally leads users to question how sensitive the final results are to the type of modeling adopted. This work therefore studies the atmospheric dispersion of dense clouds formed in accidental releases, identifies the main ways of modeling this process and, finally, presents a case study demonstrating that different dispersion models commonly employed in the risk analysis of industrial facilities can produce variations in the estimated risk of the same facility and can therefore influence risk-based decisions.
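As an illustration of how the choice of dispersion model enters the consequence step, the sketch below evaluates the ground-level centreline concentration from a passive (neutrally buoyant) Gaussian plume, the kind of baseline against which dense-gas models such as SLAB- or DEGADIS-type codes are typically contrasted; the release data and the approximate Briggs-type dispersion coefficients are assumptions for illustration, not inputs from the case study.

```python
# Ground-level centreline concentration for a continuous elevated release,
# using the passive Gaussian plume model. Purely illustrative: a dense-gas
# release would require a heavy-gas model, which is the point of the comparison.
import numpy as np

Q = 2.0  # source strength, kg/s (assumed)
u = 4.0  # wind speed, m/s (assumed)
H = 3.0  # effective release height, m (assumed)

def sigmas_class_D(x):
    # Approximate Briggs open-country coefficients for Pasquill stability class D
    sy = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sz = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    return sy, sz

def centreline_conc(x):
    sy, sz = sigmas_class_D(x)
    # y = 0, z = 0 (ground level), with ground reflection term
    return (Q / (2 * np.pi * u * sy * sz)) * 2 * np.exp(-H**2 / (2 * sz**2))

for x in (100, 200, 500, 1000):
    print(f"x = {x:5d} m : C ~ {centreline_conc(x) * 1e3:.2f} g/m^3")
```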
|
9 |
Risk Measures Constituting Risk Metrics for Decision Making in the Chemical Process Industry. Prem, Katherine. December 2010
Catastrophic incidents in the process industry leave a marked legacy, resulting in staggering economic and societal losses for the company, the government and society. The work described herein proposes a novel approach to help predict and prevent potential catastrophes and to understand the stakes at risk for better risk-informed decision making.
The methodology includes societal impact as a risk measure along with the monetization of tangible asset damage. Predicting incidents through leading metrics is pivotal to improving plant processes and to individual and societal safety in the vicinity of the plant (portfolio). From this study it can be concluded that a comprehensive judgment of all the risks and losses should entail analysis of the overall results of all possible incident scenarios. Value-at-Risk (VaR) is most suitable as an overall measure for many scenarios and for a large number of portfolio assets. FN-curves and F$-curves can be correlated, which is very beneficial for understanding the trends of historical incidents in the U.S. chemical process industry.
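A small sketch of how a VaR figure can be read off simulated annual losses for a portfolio of incident scenarios; the scenario frequencies and loss distributions are invented, not the dissertation's data.

```python
# Value-at-Risk from simulated annual incident losses for a small portfolio.
import numpy as np

rng = np.random.default_rng(3)
n_years = 100_000

scenarios = [
    # (annual frequency, median loss in $, sigma of log-loss) - assumed figures
    (0.05, 5e6, 1.0),    # major fire
    (0.01, 50e6, 0.8),   # vapour cloud explosion
    (0.20, 0.5e6, 0.7),  # small toxic release
]

annual_loss = np.zeros(n_years)
for freq, median, sigma in scenarios:
    counts = rng.poisson(freq, size=n_years)
    for i in np.nonzero(counts)[0]:
        annual_loss[i] += rng.lognormal(np.log(median), sigma, size=counts[i]).sum()

print(f"expected annual loss : ${annual_loss.mean():,.0f}")
print(f"99% VaR              : ${np.percentile(annual_loss, 99):,.0f}")
```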
Analyzing historical databases can provide valuable information on incident occurrences and their consequences as lagging metrics (or lagging indicators) for the mitigation of portfolio risks. From this study it can be concluded that there is a strong statistical relationship between the different consequence tiers of the safety pyramid, and that Heinrich's safety pyramid is comparable to data mined from the HSEES database. Furthermore, any chemical plant operation is robust only when a strategic balance is struck between optimal plant operations and maintaining health, safety and a sustainable environment.
The balance emerges from choosing the best option amidst several conflicting parameters. Strategies for normative decision making should be utilized for making choices under uncertainty. Hence, decision theory is used here to lay the framework for choosing the optimum portfolio option among several competing portfolios. To understand the strategic interactions of the different contributing representative sets that play a key role in determining the most preferred action for optimum production and safety, concepts of game theory are utilized, and a framework is provided as a novel application to the chemical process industry.
|
10 |
Risk Assessment Of Petroleum Transportation Pipeline In Some Turkish Oil Fields. Ogutcu, Gokcen. 01 June 2004
In this thesis, a quantitative risk assessment study of several oil field transportation lines belonging to a private oil production company located in Southeast Turkey has been conducted. To achieve this goal, primary risk drivers were first identified. Then a relative ranking of all pipeline segments was conducted. The quantitative risk assessment was based on Monte Carlo simulations and a relative scoring index approach. In these simulations, the frequency of occurrence of pipeline failures for the different oil field pipeline systems was used. Consequences of failures were also based on historical data gathered from the same oil fields. Results of corrosion rate calculations in the oil and water pipeline systems were also reported.
The most significant failure causes were identified as corrosion, third-party damage, mechanical failure, operational failure, weather effects and sabotage. It was suggested that, in order to reduce the corrosion rate, thin metal sheets should be inserted in the pipelines. Aluminum sheets (anodes) should be used to reduce the corrosion rate in the water pipeline system. The required number of anodes was calculated as 266 for the BE field water pipeline (anode life of 1.28 years), 959 for the KA water pipeline system (anode life of 3.2 years) and 992 for the KW water pipelines (anode life of approximately 2 years). Furthermore, high-risk pipeline segments were identified for further assessment. As a result of the Monte Carlo simulations, the highest risk was observed in return lines, followed by flow lines, water lines and trunk lines. The riskiest field was field BE, for which the risk value in trunk lines was the highest, followed by flow lines. Field SA was the second riskiest region for flow lines, followed by the KU region. Field KA was the fourth riskiest. Prioritization of maintenance activities was suggested, and areas of missing or incomplete data were identified.
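A hedged sketch of the kind of Monte Carlo relative ranking the abstract mentions: per-segment failure counts and consequences are sampled, and segments are ordered by mean simulated annual loss; all figures are illustrative, not data from the studied fields.

```python
# Monte Carlo relative risk ranking of pipeline segment types (illustrative).
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

segments = {
    # name: (mean failures per km-year, length in km, (min, max) consequence $ per failure)
    "return line": (0.020, 12, (50_000, 400_000)),
    "flow line":   (0.015, 30, (30_000, 250_000)),
    "water line":  (0.010, 25, (10_000, 120_000)),
    "trunk line":  (0.004, 60, (80_000, 600_000)),
}

results = {}
for name, (rate, length, (c_min, c_max)) in segments.items():
    failures = rng.poisson(rate * length, size=n)        # failures per simulated year
    consequence = rng.uniform(c_min, c_max, size=n)      # cost per failure that year
    results[name] = (failures * consequence).mean()      # expected annual loss

for name, risk in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} expected annual loss ~ ${risk:,.0f}")
```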
|