About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Consequence analysis of aqueous ammonia spills using an improved liquid pool evaporation model

Raghunathan, Vijay 17 February 2005
Source term modeling is the key feature in predicting the consequences of releases of hazardous fluids. Aqueous ammonia serves as a reducing medium and is replacing anhydrous ammonia in most selective catalytic reduction (SCR) units. The newly developed model can estimate the vaporization rate and the net mass evaporating into the air from a multicomponent, non-ideal chemical spill. The work is divided into two parts. First, a generic, dynamic source term model was developed that can handle multicomponent non-ideal mixtures. The applicability of this improved pool model was then checked for aqueous ammonia spills to aid in their offsite consequence analysis. The behavior of the released chemical depends on its inherent properties, the ambient conditions, and the spill scenario. The different heat transfer mechanisms associated with the pool depend strongly on the temperature of the liquid pool system at different times. The model accounts for all the temperature gradients within the contained pool and hence helps establish better estimation techniques for source terms of chemical mixtures. This work yields more accurate and reliable liquid evaporation rates, which are the critical input for dispersion modeling studies.
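As an editorial illustration of the kind of calculation such a multicomponent pool model rests on, the sketch below estimates mass-transfer-limited evaporation with a modified Raoult's law for non-ideality. All property values, activity coefficients, and the mass-transfer coefficient are placeholder assumptions, not the thesis's calibrated model.

```python
# A minimal sketch of a mass-transfer-limited evaporation estimate for a
# multicomponent, non-ideal liquid pool. Placeholder values throughout.

R = 8.314  # J/(mol K)

def pool_evaporation_rates(x, gamma, p_sat, molar_mass, k_m, area, T):
    """Per-component evaporation rate (kg/s) from a liquid pool.

    x          : liquid-phase mole fractions
    gamma      : activity coefficients (1.0 would mean an ideal mixture)
    p_sat      : pure-component vapour pressures at T, Pa
    molar_mass : kg/mol
    k_m        : gas-side mass-transfer coefficient, m/s
    area       : pool area, m^2
    T          : pool temperature, K
    """
    rates = []
    for xi, gi, pi, Mi in zip(x, gamma, p_sat, molar_mass):
        partial_pressure = gi * xi * pi                 # Pa, modified Raoult's law
        molar_flux = k_m * partial_pressure / (R * T)   # mol/(m^2 s)
        rates.append(molar_flux * Mi * area)            # kg/s
    return rates

# Illustrative numbers for ~25 wt% aqueous ammonia at 288 K on a 20 m^2 pool.
rates = pool_evaporation_rates(
    x=[0.26, 0.74],             # NH3, H2O mole fractions (approximate)
    gamma=[0.9, 1.0],           # placeholder activity coefficients
    p_sat=[7.3e5, 1.7e3],       # Pa at 288 K (approximate handbook values)
    molar_mass=[0.017, 0.018],  # kg/mol
    k_m=0.005,                  # m/s, placeholder wind-dependent coefficient
    area=20.0, T=288.15)
print(dict(zip(["NH3", "H2O"], rates)))
```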
2

Studying the Risk Management Model of Petrochemical Enterprises by Risk Base Inspection System

Chen, Kuo-Liang 24 August 2011
Redirecting the inspection plan to place emphasis on high-risk equipment items is not the only objective when implementing Risk Based Inspection (RBI). Rather, it is far more fruitful if company staff are educated to be capable of identifying potential risks and are willing to put that capability into practice by eliminating these potential threats to the enterprise. Since its release, the API-580 methodology has seen growing acceptance and has become a popular approach to maintaining the mechanical integrity of pressure equipment and piping. In addition to the U.S.A., many other countries, including European nations and Japan, have assimilated the same risk concepts into regulations that require plant operators to aim for practical performance in equipment management, not merely the minimum obligations required by the government. Such a risk-based concept is not only embodied in regulations; when used in close conjunction with plant maintenance and inspection, it becomes a powerful tool for determining optimal inspection intervals for pressure equipment. For the equipment management system to perform effectively, fundamental tasks such as identifying failure mechanisms and assessing the effectiveness of inspection methods are key to a successful RBI program. Some might criticize RBI as a conservative, less aggressive approach in that, rather than opting for more aggressive managerial methods, it recommends focusing on the whole life cycle of plant equipment. Keywords: API-581, RBI, business risk, business administration, quantitative analysis, consequence analysis, risk based inspection
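For readers unfamiliar with RBI, the toy sketch below shows the risk-matrix logic behind inspection-interval planning: probability-of-failure and consequence-of-failure categories combine into a risk rank that drives the interval. The category boundaries and intervals are invented for illustration and are not the API RP 580/581 tables.

```python
# A toy risk-matrix sketch for RBI-style inspection planning.
# Thresholds and intervals below are invented placeholders.

POF_LEVELS = ["1", "2", "3", "4", "5"]   # 1 = lowest likelihood of failure
COF_LEVELS = ["A", "B", "C", "D", "E"]   # A = lowest consequence of failure

def risk_rank(pof, cof):
    """Map a (probability, consequence) pair to a qualitative risk level."""
    score = POF_LEVELS.index(pof) + COF_LEVELS.index(cof)
    if score >= 7:
        return "high"
    if score >= 4:
        return "medium-high"
    if score >= 2:
        return "medium"
    return "low"

INSPECTION_INTERVAL_YEARS = {  # placeholder policy, not an API-581 table
    "high": 2, "medium-high": 4, "medium": 6, "low": 10,
}

for item, pof, cof in [("reactor shell", "4", "E"), ("utility piping", "2", "B")]:
    level = risk_rank(pof, cof)
    print(f"{item}: risk={level}, inspect every {INSPECTION_INTERVAL_YEARS[level]} y")
```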
3

Modification of the Priority Risk Index: Adapting to Emergency Management Accreditation Program Standards for Institutes of Higher Learning Hazard Mitigation Plans

Harris, Joseph B., Bartlett, Geoffrey, Joyner, T. A., Hart, Matthew, Tollefson, William 01 March 2021
The Priority Risk Index is increasingly used as a methodology for quantifying jurisdictional risk for hazard mitigation planning purposes, and it can evolve to meet specific community needs. The index incorporates probability, impact, spatial extent, warning time, and duration when assessing each hazard, but it does not explicitly integrate a vulnerability and consequence analysis into its final scoring. To address this gap, a new index was developed: the Enhanced Priority Risk Index (EPRI). The new index adds a sixth category, vulnerability, calculated from a vulnerability and consequence analysis of the impacts on seven sectors identified in Standard 4.1.2 of the Emergency Management Accreditation Program (EMAP). To obtain a vulnerability score, impacts are ranked by sector from low (1) to very high (4), and a weighting factor is then applied to each sector. The vulnerability score is added to the EPRI and provides risk levels based on the number of exploitable weaknesses and countermeasures identified within a specific jurisdiction. The vulnerability score and resulting EPRI are scalable and can be applied across jurisdictions, providing a transferable methodology that improves the hazard identification and risk assessment process and provides an approach for meeting EMAP accreditation standards.
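A minimal sketch of the scoring arithmetic the abstract outlines follows: each of the seven EMAP sectors gets an impact rank from 1 to 4, a per-sector weight is applied, and the weighted sum joins the five original PRI categories. The sector weights and PRI category weights below are invented placeholders, not the published EPRI values.

```python
# A hedged sketch of EPRI-style scoring. All weights are placeholders.

SECTOR_WEIGHTS = {  # placeholder weights summing to 1.0
    "public": 0.20, "responders": 0.15, "continuity_of_operations": 0.15,
    "property": 0.15, "environment": 0.10, "economy": 0.15,
    "public_confidence": 0.10,
}

def vulnerability_score(impact_ranks):
    """Weighted vulnerability score from per-sector impact ranks (1-4)."""
    assert set(impact_ranks) == set(SECTOR_WEIGHTS), "rank every sector"
    return sum(SECTOR_WEIGHTS[s] * r for s, r in impact_ranks.items())

def enhanced_priority_risk_index(pri_categories, impact_ranks):
    """EPRI = weighted PRI categories + the new vulnerability category."""
    pri_weights = {  # placeholder category weights
        "probability": 0.30, "impact": 0.25, "spatial_extent": 0.15,
        "warning_time": 0.15, "duration": 0.15,
    }
    pri = sum(pri_weights[c] * v for c, v in pri_categories.items())
    return pri + vulnerability_score(impact_ranks)

# Example hazard: a flood scenario with hypothetical rankings.
epri = enhanced_priority_risk_index(
    {"probability": 4, "impact": 3, "spatial_extent": 2,
     "warning_time": 3, "duration": 2},
    {"public": 3, "responders": 2, "continuity_of_operations": 2,
     "property": 4, "environment": 3, "economy": 3, "public_confidence": 2})
print(f"EPRI score: {epri:.2f}")
```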
4

Comparative Effectiveness Research and Cost-consequence Analysis of Albuterol and Levalbuterol in Patients with Chronic Obstructive Pulmonary Disease

Zhang, Yanjun 11 September 2015
No description available.
5

An Approach for Landslide Risk Assessment by Using Geographic Information Systems (GIS) and Remote Sensing (RS)

Erener, Arzu 01 December 2009
This study aims to develop a Geographic Information Systems (GIS)- and Remote Sensing (RS)-based systematic quantitative landslide risk assessment methodology for regional and local scales. Each component of risk, i.e., hazard assessment, vulnerability, and consequence analysis, is quantitatively assessed at both scales. The developed landslide risk assessment methodology is tested in the Kumluca watershed, which covers an area of 330 km² in Bartin province of the Western Black Sea Region, Turkey. GIS and RS techniques are used to create landslide factor maps; to obtain susceptibility, hazard, elements-at-risk, and risk maps; and to compare the resulting maps. The study evaluates the effect of the mapping unit and the mapping method on susceptibility mapping and, consequently, on the risk map. Susceptibility maps are obtained using two different mapping units, namely slope-unit-based and grid-based units. When analyzing the effect of the susceptibility mapping method, the study extends Logistic Regression (LR) and Artificial Neural Networks (ANN) by implementing Geographically Weighted Logistic Regression (GWR) and spatial regression (SR) techniques for landslide susceptibility assessment. In addition to the spatial probability of occurrence of damaging events, landslide hazard calculation requires determining the temporal probability. Precipitation triggers the majority of landslides in the study region. Critical rainfall thresholds were estimated from daily and antecedent rainfall and landslide occurrence dates using three different approaches: time series, Gumbel distribution, and intensity-duration curves. Different procedures are adopted to obtain element-at-risk and vulnerability values for the local- and regional-scale analyses. For the regional-scale analysis, the elements at risk were obtained from existing digital cadastral databases, and vulnerabilities were obtained using generalization approaches. At the local scale, the elements at risk are extracted automatically from high-resolution remote sensing images by the developed algorithms. Risk maps are found to be more similar for slope-unit-based mapping units than for grid-based mapping units.
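As an illustration of the baseline susceptibility method the thesis extends, the sketch below fits a grid-based logistic regression on synthetic factor data and reclassifies the predicted probabilities into susceptibility zones. The factor names and data are placeholders for the real GIS/RS layers of the study area.

```python
# A minimal sketch of grid-based landslide susceptibility mapping with
# logistic regression. Synthetic data stands in for real factor maps.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cells = 1000  # grid cells in the study area

# Synthetic stand-ins for per-cell factor maps (slope, aspect, NDVI, ...).
X = np.column_stack([
    rng.uniform(0, 45, n_cells),    # slope, degrees
    rng.uniform(0, 360, n_cells),   # aspect, degrees
    rng.uniform(-1, 1, n_cells),    # NDVI
])
# Synthetic landslide inventory: steeper, barer cells fail more often.
p_true = 1 / (1 + np.exp(-(0.12 * X[:, 0] - 2.0 * X[:, 2] - 3.0)))
y = rng.binomial(1, p_true)

model = LogisticRegression().fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]  # per-cell probability

# Reclassify into the usual qualitative susceptibility zones.
zones = np.digitize(susceptibility, bins=[0.2, 0.4, 0.6, 0.8])
labels = ["very low", "low", "moderate", "high", "very high"]
print({labels[z]: int((zones == z).sum()) for z in range(5)})
```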
6

Ny teknik och gamla drömmar : En konsekvensprövning av relationen mellan människan och en artificiell intelligens [New Technology and Old Dreams: A Consequence Analysis of the Relation between the Human Being and an Artificial Intelligence]

Eriksson, Nils January 2021
The purpose of this thesis is to examine how a theology regarding artificial intelligence could best be formulated with regard to its consequences for a Christian view of human nature. This purpose is pursued through a comparative study of consequences derived from four different perspectives on the emergence of AI and the theoretical implications of its relation to mankind. As the premise for what is considered a desired or an undesired consequence, the minimization of human suffering is used as a condition for the possibility of living a good life. In conducting the analysis, Leslie Stevenson's theory of humanity in relation to God is used to interpret the Christian view of human nature, and a broad general theory of AI based on Cornel du Toit's definition is applied. My assessment of the examined consequences ends in a proposal for a constructive Christian theology that argues for the necessity of placing a high value on the human capacity for vulnerability, because it enables invaluable human qualities such as empathy and compassion, qualities that should also lead the way and be modeled into a concern for all of creation, whether it is considered natural or artificial.
7

Model-Based Cost-Consequence Analysis of Postoperative Troponin T Screening in Patients Undergoing Noncardiac Surgery

Lurati, Buse AL Giovanna 10 1900
Introduction: Globally, more than 200 million patients undergo major noncardiac surgery each year, and more than 10 million of them are exposed to postoperative myocardial ischemia, a condition strongly associated with 30-day mortality. The majority of these events go undetected without postoperative troponin screening. Methods: We conducted a model-based cost-consequence analysis comparing postoperative Troponin T screening with standard care in patients undergoing noncardiac surgery. In a first model, we evaluated the incremental number of detected perioperative myocardial infarctions (PMIs) and the incremental costs. A second model assessed the effect of the screening and consequent treatment on 1-year survival and the related cost. Model inputs were based on the Vascular events In Non-cardiac Surgery patIents cOhort evaluatioN (VISION) Study, a large international cohort. We ran probabilistic sensitivity analyses with 5,000 iterations and conducted extensive additional sensitivity analyses.

Results: The cost to avoid missing an event amounted to CAD$ 5,184 for PMI and CAD$ 2,983 for isolated Troponin T elevation. The cost-effectiveness of postoperative troponin screening was higher in patient subgroups at higher risk for PMI, e.g., patients undergoing urgent surgery. The incremental cost at 1 year of postoperative PMI screening with four Troponin T measurements was CAD$ 169.20 per screened patient. The cost to prevent a death at 1 year amounted to CAD$ 96,314; however, there was relevant model uncertainty associated with the efficacy of the treatment in the 1-year model.

Conclusion: Based on the estimated incremental cost per health gain, the implementation of postoperative Troponin T screening after noncardiac surgery seems appealing, particularly in patients at high risk for perioperative myocardial infarction. However, decision-makers will have to consider it in terms of opportunity costs, i.e., in relation to the cost-effectiveness of other potential programs within the broader health care context. Master of Science (MSc)
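As a rough illustration of the cost-consequence arithmetic, the sketch below divides incremental screening cost by incremental events detected. The cohort size and detection count are invented so the arithmetic is visible; only the CAD$ 169.20 per-patient figure comes from the abstract.

```python
# A back-of-the-envelope sketch of a cost-per-outcome ratio.
# Event counts are hypothetical; the per-patient cost is from the abstract.

def cost_per_outcome(incremental_cost, incremental_outcomes):
    """Cost to gain one extra outcome (e.g., one extra PMI detected)."""
    if incremental_outcomes <= 0:
        raise ValueError("screening must detect more events than usual care")
    return incremental_cost / incremental_outcomes

cohort = 10_000                    # hypothetical number of screened patients
screening_cost = 169.20 * cohort   # CAD$, per-patient cost from the abstract
extra_pmi_detected = 326           # hypothetical; yields ~CAD$ 5,190 per PMI

print(f"CAD$ {cost_per_outcome(screening_cost, extra_pmi_detected):,.0f} "
      "per additionally detected PMI")
```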
8

Zvýšení efektivity kontroly ramene tankovací nádrže [Increasing the Efficiency of the Inspection of a Fuel Tank Arm]

Sobotková, Kateřina January 2019
The diploma thesis presents a new design of an inspection procedure for a plastic tank component. Its theoretical part deals with the potential sources of errors and uncertainties arising from the measurement itself, covers the characteristics of available coordinate measuring machines (CMMs), and includes an analysis of methods for assessing the acceptability of the measurement plan. The practical part systematically analyses the current state and proposes a new solution based on a program created for a coordinate measuring machine. A comparison of both variants is presented as the output of the thesis.
