11

The integration of Dow's Fire and Explosion Index into process design and optimization to achieve an inherently safer design

Suardin, Jaffee Arizon 30 October 2006 (has links)
The integration of safety parameters into process design and optimization is essential, yet there is no previous work on integrating the Fire and Explosion Index (F&EI) into design and optimization. This research proposes a procedure for integrating safety into the design and optimization framework by using a safety parameter as an optimization constraint. The safety measure used is Dow's Fire and Explosion Index, which is usually calculated manually; this research automates the F&EI calculation. The ability to calculate the F&EI, to determine loss control credit factors and business interruption, and to perform process unit risk analysis are unique features of this F&EI program. In addition to the F&EI calculation, the program provides descriptions of each penalty item, chemical/material databases, the flexibility to submit known chemical/material data to the databases, and material factor calculations. Moreover, the sensitivity analyses are automated by generating charts and expressions of F&EI as a function of material inventory and pressure. These expressions are the focal point of integrating F&EI into the process design and optimization framework. The proposed procedure is verified by applying it to a reactor-distillation column system; the final result is an economically optimum and inherently safer design for the reactor and distillation column system.
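The abstract gives neither the fitted F&EI expressions nor the optimization model, so the following is only a minimal sketch of the general idea: an economic objective optimized subject to an F&EI surrogate expressed in material inventory and pressure. The surrogate coefficients, the toy profit model, and the F&EI limit are illustrative assumptions, not values from the thesis.

```python
# Illustrative sketch: economic optimization of a process unit with a
# Dow F&EI value used as an inequality constraint. The F&EI surrogate
# below (linear in inventory and pressure) and all coefficients are
# hypothetical placeholders, not the correlations developed in the thesis.
import numpy as np
from scipy.optimize import minimize

FEI_LIMIT = 128.0  # assumed upper bound on the acceptable F&EI value

def fei_surrogate(x):
    """Hypothetical F&EI expression fitted as a function of
    material inventory (kg) and operating pressure (bar)."""
    inventory, pressure = x
    return 40.0 + 0.004 * inventory + 0.9 * pressure

def negative_profit(x):
    """Toy economic objective: larger inventory and pressure raise
    throughput (revenue) but also capital/operating cost."""
    inventory, pressure = x
    revenue = 120.0 * np.sqrt(inventory) + 30.0 * pressure
    cost = 0.02 * inventory + 0.5 * pressure**2
    return -(revenue - cost)

constraints = [{"type": "ineq", "fun": lambda x: FEI_LIMIT - fei_surrogate(x)}]
bounds = [(1_000.0, 50_000.0), (1.0, 40.0)]  # inventory, pressure ranges

result = minimize(negative_profit, x0=[10_000.0, 10.0], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print("inventory, pressure:", result.x, " F&EI:", fei_surrogate(result.x))
```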
12

Measurement and prediction of aerosol formation for the safe utilization of industrial fluids

Krishna, Kiran 30 September 2004 (has links)
Mist or aerosol explosions present a serious hazard to process industries. Heat transfer fluids are widely used in the chemical process industry, are flammable above their flash points, and can cause aerosol explosions. Though the possibility of aerosol explosions has been widely documented, knowledge about their explosive potential is limited. Studying the formation of such aerosols by emulating leaks in process equipment will help define a source term for aerosol dispersions and aid in characterizing their explosion hazards. Analysis of the problem of aerosol explosions reveals three major steps: source term calculations, dispersion modeling, and explosion analysis. The explosion analysis, consisting of ignition and combustion, is largely affected by the droplet size distribution of the dispersed aerosol, which in turn is a function of the droplet size distribution of the aerosol formed from the leak. Existing methods of dealing with the problem of aerosol explosions are limited to enhancing the dispersion to prevent flammable concentrations and the use of explosion suppression mechanisms. Insufficient data and theory on the flammability limits of aerosols render such methods speculative at best. Preventing the formation of an aerosol when a leak occurs would provide an inherently safer solution to the problem. The research involves the non-intrusive measurement of heat transfer fluid aerosol sprays using a Malvern Diffraction Particle Analyzer. The aerosol is generated by plain orifice atomization to simulate the formation and dispersion of heat transfer fluid aerosols through leaks in process equipment. Predictive correlations relating aerosol droplet sizes to bulk liquid pressures, temperatures, thermal and fluid properties, leak sizes, and ambient conditions are presented. These correlations will be used to predict the conditions under which leaks will result in the formation of aerosols and will ultimately help in estimating the explosion hazards of heat transfer fluid aerosols. Heat transfer fluid selection can be based on liquids that are less likely to form aerosols. Design criteria can also incorporate the data to arrive at operating conditions that are less likely to produce aerosols. The goal is to provide information that will reduce the hazards of aerosol explosions, thereby improving safety in process industries.
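The predictive correlations themselves are not reproduced in the abstract. As a rough illustration of the kind of relationship involved, a dimensionless power-law fit of Sauter mean diameter (SMD) to Weber and Reynolds numbers for a plain-orifice leak could look like the sketch below; the prefactor and exponents are placeholders, not the fitted values from this work.

```python
# Illustrative sketch of a droplet-size correlation for a plain-orifice leak.
# The power-law form SMD/d0 = C * We**a * Re**b is a common way to fit
# atomization data; C, a and b here are placeholders, not the thesis results.
import math

def sauter_mean_diameter(d_orifice, delta_p, rho_liq, mu_liq, sigma,
                         C=10.0, a=-0.5, b=-0.1):
    """Estimate SMD (m) from leak size and bulk fluid properties.

    d_orifice : orifice (leak) diameter, m
    delta_p   : pressure drop across the leak, Pa
    rho_liq   : liquid density, kg/m^3
    mu_liq    : liquid dynamic viscosity, Pa.s
    sigma     : surface tension, N/m
    """
    u = math.sqrt(2.0 * delta_p / rho_liq)       # jet velocity from Bernoulli
    weber = rho_liq * u**2 * d_orifice / sigma   # inertia vs. surface tension
    reynolds = rho_liq * u * d_orifice / mu_liq  # inertia vs. viscosity
    return C * weber**a * reynolds**b * d_orifice

# Example: hot heat-transfer-fluid leak through a 1 mm hole at 10 bar
smd = sauter_mean_diameter(1e-3, 10e5, 850.0, 1e-3, 0.025)
print(f"estimated SMD: {smd * 1e6:.1f} micrometres")
```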
13

Stochastic Programming Approaches for the Placement of Gas Detectors in Process Facilities

Legg, Sean W 16 December 2013 (has links)
The release of flammable and toxic chemicals in petrochemical facilities is a major concern when designing modern process safety systems. While proper selection of the necessary types of gas detectors is important, appropriate placement of these detectors is required in order to have a well-functioning gas detection system. However, the uncertainty in leak locations, gas composition, process and weather conditions, and process geometries must all be considered when attempting to determine the appropriate number and placement of the gas detectors. Because traditional approaches are typically based on heuristics, there is a need to develop more rigorous optimization-based approaches to this problem. This work presents several mixed-integer programming formulations to address this need. First, a general mixed-integer linear programming problem is presented. This formulation takes advantage of precomputed computational fluid dynamics (CFD) simulations to determine a gas detector placement that minimizes the expected detection time across all scenarios. An extension to this formulation is added that considers the overall coverage in a facility in order to improve the detector placement when not enough scenarios are available. Additionally, a formulation considering the Conditional Value-at-Risk (CVaR) is presented. This formulation provides some control over the shape of the tail of the distribution, not only minimizing the expected detection time across all scenarios but also improving the tail behavior. In addition to improved formulations, procedures are introduced to determine confidence in the placement generated and to determine whether enough scenarios have been used in determining the gas detector placement. First, a procedure is introduced to analyze the performance of the proposed gas detector placement in the face of "unforeseen" scenarios, or scenarios that were not necessarily included in the original formulation. Additionally, a procedure is presented for determining a confidence interval on the optimality gap between a placement generated from a sample of scenarios and its estimated performance over the entire uncertainty space. Finally, a method is proposed for determining whether enough scenarios have been used and how much additional benefit is expected by adding more scenarios to the optimization. Results are presented for each of the formulations and methods using three data sets from an actual process facility. The use of an off-the-shelf EPA toolkit for the placement of detectors in municipal water networks, known as TEVA-SPOT, is also explored. Because this toolkit was not designed for placing gas detectors, some adaptation of the files is necessary, and the procedure for doing so is presented.
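The data sets and full formulations are not given in the abstract; the following is a minimal sketch, written with PuLP, of the kind of scenario-based placement model described above (a p-median-style MILP minimizing expected detection time). The candidate locations, scenario detection times, probabilities, and detector budget are invented; in practice the detection times would come from CFD dispersion simulations.

```python
# Minimal sketch of a scenario-based detector-placement MILP using PuLP.
# Detection times t[i][j] would normally come from CFD dispersion simulations;
# the tiny synthetic data here is purely illustrative.
import pulp

scenarios = [0, 1, 2]                 # leak/weather scenarios, equal probability
locations = ["A", "B", "C", "D"]      # candidate detector locations
DUMMY = "undetected"                  # fallback with a large penalty time
p_max = 2                             # number of detectors that can be installed

# t[i][j]: time (s) until scenario i's gas cloud reaches a detector at j
t = {0: {"A": 10, "B": 40, "C": 90, "D": 35, DUMMY: 600},
     1: {"A": 80, "B": 15, "C": 25, "D": 70, DUMMY: 600},
     2: {"A": 50, "B": 60, "C": 20, "D": 30, DUMMY: 600}}
prob = {i: 1.0 / len(scenarios) for i in scenarios}

model = pulp.LpProblem("detector_placement", pulp.LpMinimize)
place = pulp.LpVariable.dicts("place", locations, cat="Binary")
cover = pulp.LpVariable.dicts(
    "cover", [(i, j) for i in scenarios for j in locations + [DUMMY]], cat="Binary")

# objective: expected detection time over all scenarios
model += pulp.lpSum(prob[i] * t[i][j] * cover[(i, j)]
                    for i in scenarios for j in locations + [DUMMY])
# each scenario is detected by exactly one placed detector, or left undetected
for i in scenarios:
    model += pulp.lpSum(cover[(i, j)] for j in locations + [DUMMY]) == 1
    for j in locations:
        model += cover[(i, j)] <= place[j]
model += pulp.lpSum(place[j] for j in locations) <= p_max

model.solve(pulp.PULP_CBC_CMD(msg=False))
print("detectors placed at:", [j for j in locations if place[j].value() == 1])
```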
14

Combining qualitative and quantitative reasoning to support hazard identification by computer

McCoy, Stephen Alexander January 1999 (has links)
This thesis investigates the proposition that use must be made of quantitative information to control the reporting of hazard scenarios in automatically generated HAZOP reports. HAZOP is a successful and widely accepted technique for identification of process hazards. However, it requires an expensive commitment of time and personnel near the end of a project. Use of a HAZOP emulation tool before conventional HAZOP could speed up the examination of routine hazards, or identify deficiencies in the design of a plant. Qualitative models of process equipment can efficiently model fault propagation in chemical plants. However, purely qualitative models lack the representational power to model many constraints in real plants, resulting in indiscriminate reporting of failure scenarios. In the AutoHAZID computer program, qualitative reasoning is used to emulate HAZOP. Signed directed graph (SDG) models of equipment are used to build a graph model of the plant. This graph is searched to find links between faults and consequences, which are reported as hazardous scenarios associated with process variable deviations. However, factors not represented in the SDG, such as the fluids in the plant, often affect the feasibility of scenarios. Support for the qualitative model system, in the form of quantitative judgements to assess the feasibility of certain hazards, was investigated and is reported here. This thesis also describes the novel "Fluid Modelling System" (FMS) which now provides this quantitative support mechanism in AutoHAZID. The FMS allows the attachment of conditions to SDG arcs. Fault paths are validated by testing the conditions along their arcs, and infeasible scenarios are removed. In the FMS, numerical limits on process variable deviations have been used to assess the sufficiency of a given fault to cause any linked consequence. In a number of case studies, use of the FMS in AutoHAZID has improved the focus of the automatically generated HAZOP results. This thesis describes qualitative model-based methods for identifying process hazards by computer, in particular AutoHAZID. It identifies a range of problems where the purely qualitative approach is inadequate and demonstrates how such problems can be tackled by selective use of quantitative information about the plant or the fluids in it. The conclusion is that quantitative knowledge is required to support the qualitative reasoning in hazard identification by computer.
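The thesis's actual SDG models and Fluid Modelling System are not specified in the abstract; the sketch below only illustrates the underlying idea of searching fault-to-consequence paths in a signed directed graph and pruning paths whose arc conditions fail for the actual fluid data. The plant fragment, deviations, and conditions are invented examples, not material from AutoHAZID.

```python
# Minimal sketch of SDG-based fault propagation with quantitative arc
# conditions: fault-to-consequence paths are found by graph traversal and
# discarded when a condition attached to an arc fails for the supplied
# fluid/process data. All node names, conditions and data are invented.

# graph: source deviation -> list of (target deviation, condition)
# a condition is a predicate over plant/fluid data, or None if always feasible
sdg = {
    "pump_seal_failure": [("leak_to_atmosphere", None)],
    "leak_to_atmosphere": [
        ("flammable_cloud", lambda d: d["fluid_temp_C"] > d["flash_point_C"]),
        ("toxic_exposure",  lambda d: d["toxic"]),
    ],
    "flammable_cloud": [("fire_or_explosion", None)],
}

def find_feasible_paths(graph, fault, consequence, data, path=None):
    """Depth-first search returning fault->consequence paths whose
    arc conditions all hold for the supplied fluid/process data."""
    path = (path or []) + [fault]
    if fault == consequence:
        return [path]
    paths = []
    for target, condition in graph.get(fault, []):
        if target in path:                       # avoid cycles
            continue
        if condition is not None and not condition(data):
            continue                             # infeasible arc: prune scenario
        paths += find_feasible_paths(graph, target, consequence, data, path)
    return paths

fluid_data = {"fluid_temp_C": 150.0, "flash_point_C": 110.0, "toxic": False}
for p in find_feasible_paths(sdg, "pump_seal_failure", "fire_or_explosion", fluid_data):
    print(" -> ".join(p))
```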
15

A paradigm shift in Natech risk management : Development of a framework for evaluating the performance of industry and enhancing territorial resilience / Natech リスクマネジメントのパラダイムシフト : 石油化学コンビナートの防災性能評価と地域のレジリエンスの向上のためのフレームワークの開発

SUAREZ, PABA MARIA CAMILA 24 September 2019 (has links)
Kyoto University / 0048 / New system, doctoral degree by coursework / Doctor of Engineering / Kō No. 22056 / Eng. Doc. No. 4637 / Shinsei||Kō||1723 (University Library) / Department of Urban Management, Graduate School of Engineering, Kyoto University / (Chief examiner) Professor Ana Maria CRUZ, Professor Tadashi Yamada, Associate Professor Kakuya Matsushima / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
16

Kamufláž / Camouflage

Ošlejšková, Petra January 2012 (has links)
The master's thesis works with concepts such as the intimacy of home, safety, and system. From 16 January 2012 to 12 March 2012 the author secretly entered her parents' apartment and, through subtle interventions, disrupted the routine of their home. She covertly observed the behaviour and the rules by which her parents operate (she does not live with them). The interventions provoked a specific form of interaction between the author and her family.
17

Fuzzy evidence theory and Bayesian networks for process systems risk analysis

Yazdi, M., Kabir, Sohag 21 October 2019 (has links)
Yes / Quantitative risk assessment (QRA) approaches systematically evaluate the likelihood, impacts, and risk of adverse events. QRA using fault tree analysis (FTA) is based on the assumptions that failure events have crisp probabilities and that they are statistically independent. The crisp probabilities of the events are often absent, which leads to data uncertainty, while the independence assumption leads to model uncertainty. Experts' knowledge can be utilized to obtain unknown failure data; however, this process itself is subject to different issues such as imprecision, incompleteness, and lack of consensus. For this reason, to minimize the overall uncertainty in QRA, in addition to addressing the uncertainties in the knowledge, it is equally important to combine the opinions of multiple experts and update prior beliefs based on new evidence. In this article, a novel methodology is proposed for QRA by combining fuzzy set theory and evidence theory with Bayesian networks to describe the uncertainties, aggregate experts' opinions, and update prior probabilities when new evidence becomes available. Additionally, sensitivity analysis is performed to identify the most critical events in the FTA. The effectiveness of the proposed approach has been demonstrated via application to a practical system. / The research of Sohag Kabir was partly funded by the DEIS project (Grant Agreement 732242).
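The article's full methodology is not reproduced here; as a minimal sketch of two of its ingredients, the snippet below aggregates triangular fuzzy expert estimates into a crisp prior and then performs a simple Bayesian update once evidence is observed. The expert numbers, weights, and likelihoods are invented, and the evidence-theory aggregation and network structure of the actual paper are not shown.

```python
# Sketch: aggregate triangular fuzzy expert estimates of a basic event's
# failure probability, defuzzify to a crisp prior, then update the prior
# with a simple Bayesian step when an alarm (evidence) is observed.
# All numbers below are invented for illustration.

def aggregate_triangular(opinions, weights):
    """Weighted average of triangular fuzzy numbers (a, m, b)."""
    total = sum(weights)
    return tuple(sum(w * o[k] for o, w in zip(opinions, weights)) / total
                 for k in range(3))

def defuzzify_centroid(tfn):
    """Centroid of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# three experts' fuzzy estimates of P(valve fails to close), with weights
opinions = [(0.01, 0.02, 0.05), (0.02, 0.03, 0.06), (0.005, 0.015, 0.03)]
weights = [0.5, 0.3, 0.2]
prior = defuzzify_centroid(aggregate_triangular(opinions, weights))

# Bayesian update: P(fail | alarm) = P(alarm | fail) P(fail) / P(alarm)
p_alarm_given_fail, p_alarm_given_ok = 0.95, 0.10
p_alarm = p_alarm_given_fail * prior + p_alarm_given_ok * (1.0 - prior)
posterior = p_alarm_given_fail * prior / p_alarm

print(f"aggregated prior: {prior:.4f}, posterior given alarm: {posterior:.4f}")
```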
18

Uncertainty handling in fault tree based risk assessment: State of the art and future perspectives

Yazdi, M., Kabir, Sohag, Walker, M. 18 October 2019 (has links)
Yes / Risk assessment methods have been widely used in various industries, and they play a significant role in improving the safety performance of systems. However, the outcomes of risk assessment approaches are subject to uncertainty and ambiguity due to the complexity and variability of system behaviour, scarcity of quantitative data about different system parameters, and human involvement in the analysis, operation, and decision-making processes. The implications for improving system safety are slowly being recognised; however, research on uncertainty handling during both qualitative and quantitative risk assessment procedures is a growing field. This paper presents a review of the state of the art in this field, focusing on uncertainty handling in fault tree analysis (FTA) based risk assessment. Theoretical contributions, aleatory uncertainty, epistemic uncertainty, and integration of both epistemic and aleatory uncertainty handling in the scientific and technical literature are carefully reviewed. The emphasis is on highlighting how assessors can handle uncertainty based on the available evidence as an input to FTA.
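As one concrete example of the kind of quantitative uncertainty handling such reviews survey (not a method prescribed by this paper), uncertain basic-event probabilities can be propagated through a fault tree by Monte Carlo sampling, yielding a distribution rather than a point value for the top event. The tree structure and lognormal parameters below are illustrative only.

```python
# Monte Carlo propagation of uncertain basic-event probabilities through a
# small fault tree: TOP = (pump OR valve) AND sensor, events assumed independent.
# The lognormal parameters expressing the uncertainty are invented.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# uncertainty on each basic-event probability expressed as a lognormal
p_pump   = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=N)
p_valve  = rng.lognormal(mean=np.log(5e-4), sigma=0.7, size=N)
p_sensor = rng.lognormal(mean=np.log(2e-3), sigma=0.4, size=N)

p_or = 1.0 - (1.0 - p_pump) * (1.0 - p_valve)   # OR gate
p_top = p_or * p_sensor                          # AND gate

print(f"top event mean: {p_top.mean():.2e}")
print(f"5th-95th percentile: {np.percentile(p_top, 5):.2e} - {np.percentile(p_top, 95):.2e}")
```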
20

Reliability Analysis of Process Systems Using Intuitionistic Fuzzy Set Theory

Yazdi, M., Kabir, Sohag, Kumar, M., Ghafir, Ibrahim, Islam, F. 13 February 2023 (has links)
Yes / In different engineering processes, the reliability of systems is increasingly evaluated to ensure that the safety-critical process systems will operate within their expected operational boundary for a certain mission time without failure. Different methodologies used for reliability analysis of process systems include Failure Mode and Effect Analysis (FMEA), Fault Tree Analysis (FTA), and Bayesian Networks (BN). Although these approaches have their own procedures for evaluating system reliability, they rely on exact failure data of systems’ components for reliability evaluation. Nevertheless, obtaining exact failure data for complex systems can be difficult due to the complex behaviour of their components, and the unavailability of precise and adequate information about such components. To tackle the data uncertainty issue, this chapter proposes a framework by combining intuitionistic fuzzy set theory and expert elicitation that enables the reliability assessment of process systems using FTA. Moreover, to model the statistical dependencies between events, we use the BN for robust probabilistic inference about system reliability under different uncertainties. The efficiency of the framework is demonstrated through application to a real-world system and comparison of the results of analysis produced by the existing approaches. / The full text will be available at the end of the publisher's embargo, 9th April 2025
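A minimal sketch of one building block of such an approach is given below: experts' intuitionistic fuzzy judgements (membership, non-membership) about a basic event are aggregated with the intuitionistic fuzzy weighted averaging (IFWA) operator and reduced to a score. The judgements and weights are invented, and the chapter's own mapping from the aggregated judgement to a failure probability for the FTA/BN stage is not shown.

```python
# Sketch: aggregate experts' intuitionistic fuzzy judgements about a basic
# event with the IFWA operator, then compute hesitation and a simple score.
# The judgements and expert weights are invented for illustration.
import math

def ifwa(judgements, weights):
    """IFWA aggregation of (membership, non-membership) pairs."""
    mu = 1.0 - math.prod((1.0 - m) ** w for (m, _), w in zip(judgements, weights))
    nu = math.prod(n ** w for (_, n), w in zip(judgements, weights))
    return mu, nu

# three experts judge "component fails on demand" as (membership, non-membership)
judgements = [(0.60, 0.30), (0.55, 0.35), (0.70, 0.20)]
weights = [0.4, 0.35, 0.25]          # expert weights summing to 1

mu, nu = ifwa(judgements, weights)
hesitation = 1.0 - mu - nu           # degree of indeterminacy
score = mu - nu                      # simple score function for ranking events
print(f"aggregated: mu={mu:.3f}, nu={nu:.3f}, hesitation={hesitation:.3f}, score={score:.3f}")
```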
