21 |
The Use of Risk Analysis Techniques to Determine the Probability of Producing Non-Compliant Drinking Water: Focusing on Dual Media Rapid Gravity Filtration. McAllister, Lawrence Brett, January 2006
The main goal of a drinking water treatment plant is to provide safe drinking water for its consumers. Historically, this was accomplished by monitoring the influent and effluent water quality to ensure that the water quality met a set of guidelines and regulations. However, as the limitations of relying on compliance monitoring become more evident, water utilities and drinking water treatment plants are beginning to utilize risk management frameworks to help provide safe drinking water and to mitigate potential risks. Applying a risk management framework requires an evaluation of potential risks. This systematic evaluation can be performed using risk analysis methods.
The overall goal of this research is to analyze and evaluate risk analysis methodologies that are used in a variety of engineering fields, select two risk analysis methods, and use them to evaluate the probability of producing non-compliant drinking water from a rapid gravity filtration unit with respect to turbidity.
The risk analysis methodologies used in this research were consequence frequency assessment and computer modelling combined with probabilistic risk analysis. Both methodologies were able to determine the probability of producing non-compliant water from a rapid gravity filtration unit with respect to turbidity; however, the two methodologies produced different numerical results. The consequence frequency assessment was found to be easier to implement, but it could only be applied to one parameter at a time. Computer modelling combined with probabilistic risk analysis allowed multiple parameters to be included, which provided a more comprehensive understanding of the filtration unit.
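As a rough illustration of the probabilistic approach described above, the sketch below estimates the probability of exceeding a turbidity limit by Monte Carlo sampling; the lognormal effluent-turbidity model, its parameters, and the 0.3 NTU limit are assumptions chosen for illustration, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed effluent turbidity model (illustrative, not from the thesis):
# a lognormal distribution standing in for filter effluent variability.
MEDIAN_NTU = 0.12   # assumed median effluent turbidity
SIGMA_LOG = 0.6     # assumed log-space standard deviation
LIMIT_NTU = 0.3     # assumed regulatory turbidity limit

n_trials = 100_000
turbidity = rng.lognormal(mean=np.log(MEDIAN_NTU), sigma=SIGMA_LOG, size=n_trials)

# Probability of producing non-compliant water = fraction of samples above the limit
p_noncompliant = np.mean(turbidity > LIMIT_NTU)
print(f"Estimated probability of non-compliance: {p_noncompliant:.4f}")
```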
The primary conclusion from this research is that the risk analysis methods, as described in this thesis, are not suitable for direct use on a rapid gravity filtration unit without further modification. Furthermore, although the risk analysis methods provided some guidance, they should only be used as part of a complete risk management process.
|
23 |
An assessment of geospatial technologies as used for wildland fire suppression. Iqbāl, Jāvid, 04 June 2010
Wildland fire fighting is complex due to climatic variation, risk and uncertainty, and the proximity of human and resource values. Information about fire environments, resource availability and logistics, fire behavior, and values at risk are important issues fire managers must consider in allocating scarce resources. Improved information therefore has value in reducing risk, costs, and damages. Geospatial technologies, which include remote sensing tools, global positioning systems (GPS), geographic information systems (GIS), and various maps, are widely used in wildland fire management. My research evaluates geospatial tools in three different ways: their role in risk reduction, their effect on wildland fire costs and damages, and wildland fire managers' perceived costs and benefits.
A theoretical model was developed to analyze the role of geospatial tools in reducing risk. Risk-averse fire managers were found to use more geospatial technologies than those who did not incorporate risk in their decision making, creating value for these technologies. A simultaneous equation system of fires was estimated using the two-stage and three-stage least squares methods to examine the impact of geospatial tools on fire size, cost, and damages. The effect of geospatial technology on fire size was significant in the Full Response Zone. Fire size was positively related to drought and duff moisture codes. Damages and suppression costs were not significantly affected by the use of digitized maps. The survey of wildland fire managers revealed that geospatial tools are useful in integrating information and provide more clarity, flexibility, and accuracy in decision-making. The survey also showed that geospatial tools are most commonly used when multiple fires are burning at the same time and threatening high resource values. Overall, the findings from this research indicated that risk-averse fire managers use geospatial tools more intensively; that maps play a significant role in reducing fire size in the Full Response Zone; and that fire managers' view of these technologies as more economically efficient in the Full Response Zone makes a case for more investment in developing and employing them on fires. Record keeping and data collection, as well as understanding the human element in terms of risk aversion, will be important for future studies and for adopting new technology and allocating resources efficiently.
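As a hedged illustration of the two-stage least squares idea mentioned above, the sketch below runs a manual 2SLS regression on simulated data; the variables, instrument, and coefficients are invented stand-ins for the fire-level data and are not the specification estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical variables: z = instrument (e.g., pre-season GIS budget),
# x = geospatial-tool intensity (endogenous), y = outcome such as log fire size.
z = rng.normal(size=n)
u = rng.normal(size=n)                 # common shock that makes x endogenous
x = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = -0.6 * x + u + rng.normal(size=n)  # true effect of x on y is -0.6

def add_const(a):
    return np.column_stack([np.ones(len(a)), a])

# Stage 1: project the endogenous regressor on the instrument(s)
Z = add_const(z)
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Stage 2: regress the outcome on the fitted values from stage 1
X_hat = add_const(x_hat)
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]
print("2SLS estimate of the effect of tool use on fire size:", beta_2sls[1])
```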
|
24 |
Investigation of the distribution and risk factors associated with Mycobacterium avium subspecies paratuberculosis in cow-calf herds in Canada. Douma, Dale Peter, 14 April 2011
This thesis summarizes an investigation of Mycobacterium avium subspecies paratuberculosis (Map) as a pathogen within the cow-calf industry in Canada. The specific objectives of this project were to describe the distribution of this pathogen in this industry provincially, as well as at the individual farm level in wildlife species and in the environment. Secondary objectives were to identify on-farm management risk factors associated with this disease and to examine potential options for herd-level diagnostic capabilities. Nationally, 0.8% (95% CI = 0.4-1.1%) of the cows in the cow-calf industry were seropositive for Map, with 11.7% (95% CI = 7.0-16.5%) of the herds sampled having a minimum of one positive test result and 4.5% (95% CI = 1.4-7.5%) of the herds having a minimum of two positive test results. The true cow prevalence was estimated as 1.8% (95% CI = 0.4-3.1%). No Map was detected in any of the non-ruminant wildlife species sampled on cow-calf operations, suggesting that these species were not of primary concern when dealing with the management of this disease. In a study not focussed on a cow-calf operation, Map was detected in one cluster of trapped coyote samples in a region with cow-calf production. The prevalence of Map infection in this cluster of coyotes was calculated to be 9.1% (CI: 5.7-12.5%). The prevalence of infection in coyotes across all sites, ignoring the effect of clustering, was calculated to be 3.7% (CI: 2.3-5.1%). The use of a commercial colostrum replacement on farm (Odds Ratio = 3.96; 95% CI = 1.10-14.23, p = 0.035) and the presence of wild deer interacting with the cattle (Odds Ratio = 14.32; 95% CI = 1.13-181.90, p = 0.040) were positively associated with a herd being infected with paratuberculosis. The use of rotational grazing practices was protective (Odds Ratio = 0.20; 95% CI = 0.04-0.93, p = 0.039). It was possible to detect environmental contamination with Map on cow-calf farms using bacterial culture with PCR for confirmation. No water samples were positive for Map; however, 6.2% of the non-water environmental samples were positive. The environmental sampling protocol had a herd sensitivity of 29.6%. This finding led to a simulation modelling study to evaluate how various testing methods would compare in the broader population of cow-calf herds. The final mean risk of selecting a herd infected with Map that was not identified as positive via the herd screen test strategy was 12.9%, 9.8%, 9.6%, and 6.1% for no herd screen test, environmental sampling, ELISA serology, and pooled fecal culture strategies, respectively.
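The step from the 0.8% apparent seroprevalence to the 1.8% estimated true prevalence reflects an adjustment for imperfect test accuracy. The sketch below shows one common way to make that adjustment (the Rogan-Gladen estimator); the sensitivity and specificity values are hypothetical placeholders, not the ELISA characteristics used in the thesis.

```python
def rogan_gladen_true_prevalence(apparent_prev, sensitivity, specificity):
    """Adjust an apparent (test-based) prevalence for imperfect test sensitivity/specificity."""
    return (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)

# Hypothetical ELISA characteristics (not the values from the thesis)
ASSUMED_SE = 0.45   # serology for Map is typically insensitive
ASSUMED_SP = 1.00   # assumed near-perfect specificity

apparent = 0.008    # 0.8% seropositive cows, as reported in the abstract
true_prev = rogan_gladen_true_prevalence(apparent, ASSUMED_SE, ASSUMED_SP)
print(f"Adjusted true prevalence: {true_prev:.2%}")
```

With these placeholder test characteristics the adjusted estimate happens to land near the 1.8% reported above, but the actual test characteristics used in the thesis may differ.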
|
25 |
Profit-Based Unit Commitment and Risk Analysis. Gow, Hong-Jey, 27 July 2010
For power market participants, deregulation has brought competition and more trading opportunities to the power industry. In the electricity market, a bidding model is adopted instead of the cost model, and GenCos try to maximize profit under the bidding model according to power demand. Electricity becomes a commodity whose price varies with power demand, bidding strategy, and the grid. GenCos perform unit commitment in a price-volatile environment to reach maximal profit. In a deregulated environment, an Independent System Operator (ISO) is often responsible for the electricity auction and for secure power scheduling. ISO operation may involve many kinds of risks, including price volatility risk, bidding risk, and congestion risk. For some markets, it is very important how GenCos determine the optimal unit commitment schedule while considering risk management; good risk analysis helps a GenCo maximize profit and pursue sustainable development. In this study, price forecasting is developed to provide information for power producers to develop bidding strategies that maximize profit. A Profit-Based Unit Commitment (PBUC) model was also derived, and an Enhanced Immune Algorithm (EIA) was developed to solve the PBUC problem. Finally, the Value-at-Risk (VaR) of GenCos is found at a preset confidence level. Simulation results provide a risk management rule for finding an optimal risk control strategy that maximizes profit and raises a GenCo's competitiveness against other players.
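As a hedged sketch of the Value-at-Risk idea mentioned above, the code below computes VaR from a set of simulated profit scenarios; the lognormal price model, unit parameters, and 95% confidence level are illustrative assumptions, not the market data or EIA schedules used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative single-unit, single-period profit simulation (not the PBUC model
# from the thesis): profit = (price - marginal cost) * dispatched output.
MARGINAL_COST = 35.0    # assumed $/MWh
OUTPUT_MW = 200.0       # assumed dispatched output in MWh for the period
prices = rng.lognormal(mean=np.log(40.0), sigma=0.35, size=10_000)  # assumed price model

profits = (prices - MARGINAL_COST) * OUTPUT_MW

# Value-at-Risk at a 95% confidence level: the loss threshold exceeded
# in only 5% of scenarios (expressed here relative to zero profit).
confidence = 0.95
var_95 = -np.percentile(profits, 100 * (1 - confidence))
print(f"95% VaR of simulated profit: {var_95:,.0f} $")
```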
|
26 |
Technical, economic and risk analysis of multilateral wells. Arcos Rueda, Dulce Maria, 15 May 2009
The oil and gas industry, more than at any time in the past, is highly affected by technological advancements, new products, drilling and completion techniques, capital expenditures (CAPEX), operating expenditures (OPEX), risk/uncertainty, and geopolitics. Therefore, decision making in the upstream business requires a thorough understanding of the factors and conditions affecting each project in order to systematically analyze, evaluate, and select the best choice among all possible alternatives.
The objective of this study is to develop a methodology to assist engineers in the decision-making process of maximizing access to reserves. The process encompasses technical, economic, and risk analysis of the various alternatives for completing a well (vertical, horizontal, or multilateral), using a well performance model for the technical evaluation and a deterministic analysis for the economic and risk assessment.
In the technical analysis of the decision-making process, the flow rate for a defined reservoir is estimated under a pseudo-steady-state flow regime assumption. The economic analysis starts from the flow rate data, assuming a certain pressure decline. A financial cash flow (FCF) is generated to measure the economic worth of the investment proposals. A deterministic decision tree is then used to represent the risks inherent in the geological uncertainty, reservoir engineering, drilling, and completion of a particular well. The net present value (NPV) is utilized as the base economic indicator. By selecting the type of well that maximizes the expected monetary value (EMV) in the decision tree, we can make the best decision based on a thorough understanding of the prospect.
The method introduced in this study emphasizes the importance of a multi-discipline approach to the drilling, completion, and operation of multilateral wells.
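To make the NPV and EMV calculations concrete, the sketch below evaluates a tiny two-branch decision for a hypothetical well; the cash flows, success probability, and 10% discount rate are invented for illustration and are not the well data or decision tree from the thesis.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first) at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

DISCOUNT_RATE = 0.10  # assumed

# Hypothetical multilateral-well outcomes (values in dollars, invented):
npv_success = npv([-12e6, 7e6, 6e6, 5e6, 4e6], DISCOUNT_RATE)
npv_failure = npv([-12e6, 1e6, 1e6, 0.5e6, 0.5e6], DISCOUNT_RATE)

# Expected monetary value = probability-weighted NPV over the branch outcomes.
P_SUCCESS = 0.7  # assumed geological/completion success probability
emv_multilateral = P_SUCCESS * npv_success + (1 - P_SUCCESS) * npv_failure
print(f"EMV of the multilateral alternative: {emv_multilateral / 1e6:.2f} MM$")
```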
|
27 |
Resampling confidence regions and test procedures for second degree stochastic efficiency with respect to a function. Schumann, Keith Daniel, 30 October 2006
It is often desirable to compare risky investments in the context of economic decision theory. Expected utility analyses are means by which stochastic alternatives can be ranked by re-weighting the probability mass using a decision-making agent's utility function. By maximizing expected utility, an agent seeks to balance expected returns with the inherent risk in each investment alternative. This can be accomplished by ranking prospects based on the certainty equivalent associated with each alternative.
In instances where only a small sample of observed data is available to estimate the underlying distributions of the risky options, reliable inferences are difficult to make. In this process of comparing alternatives, when estimating explicit probability forms or nonparametric densities, the variance of the estimate, in this case the certainty equivalent, is often ignored. Resampling methods allow the dispersion of a statistic to be estimated when no parametric assumptions are made about the underlying distribution. An objective of this dissertation is to utilize these methods to estimate confidence regions for the sample certainty equivalents of the alternatives over a subset of the parameter space of the utility function.
A second goal of this research is to formalize a testing procedure for preference ranking with respect to utility. This is largely based on Meyer's work (1977b) developing stochastic dominance with respect to a function and on the more specific testing procedures outlined by Eubank et al. (1993). Within this objective, the asymptotic distribution of the test statistic associated with the hypothesis of preference of one risky outcome over another, given a subset of the utility function parameter space, is explored.
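The sketch below illustrates the basic bootstrap idea for a certainty equivalent under a negative-exponential (constant absolute risk aversion) utility; the simulated return sample, risk-aversion coefficient, and percentile-interval construction are illustrative assumptions rather than the procedures developed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

def certainty_equivalent(returns, r):
    """CE under negative-exponential utility u(x) = -exp(-r*x): CE = -ln(E[exp(-r*x)]) / r."""
    return -np.log(np.mean(np.exp(-r * returns))) / r

RISK_AVERSION = 0.05                                # assumed absolute risk-aversion coefficient
sample = rng.normal(loc=12.0, scale=8.0, size=40)   # hypothetical small sample of returns

# Nonparametric bootstrap: resample with replacement, recompute the CE each time.
boot_ce = np.array([
    certainty_equivalent(rng.choice(sample, size=len(sample), replace=True), RISK_AVERSION)
    for _ in range(2000)
])

lo, hi = np.percentile(boot_ce, [2.5, 97.5])
print(f"Sample CE: {certainty_equivalent(sample, RISK_AVERSION):.2f}")
print(f"Bootstrap 95% percentile interval: ({lo:.2f}, {hi:.2f})")
```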
|
28 |
Quantitative transportation risk analysis based on available data/databases: decision support tools for hazardous materials transportation. Qiao, Yuanhua, 17 September 2007
Historical evidence has shown that incidents due to hazardous materials (HazMat) releases during transportation can lead to severe consequences. The public and agencies such as the Department of Transportation (DOT) show increasing concern about the hazards associated with HazMat transportation. Many hazards may be identified and controlled or eliminated through the use of risk analysis. Transportation Risk Analysis (TRA) is a powerful tool in a HazMat transportation decision support system: it helps in choosing among alternate routes by providing information on the risks associated with each route, and in selecting appropriate risk reduction alternatives by demonstrating the effectiveness of the various alternatives. Some methodologies have been developed to assess transportation risk; however, most of them are hard for decision or policy makers to employ directly. One major barrier is the lack of a match between available data/database analysis and the numerical methodologies for TRA. In this work, methodologies to assess transportation risk are developed based on the availability of data or databases, and the match between available data/databases and numerical TRA methodologies is pursued. Each risk component, including frequency, release scenario, and consequence, is assessed from the available data/databases, and the risk is measured by numerical algorithms step by step across the transportation network. Based on the TRA results, decisions on HazMat transportation can be made appropriately and reasonably. The combination of recent interest in expanding or building new facilities to receive liquefied natural gas (LNG) carriers, along with increased awareness and concern about potential terrorist action, has raised questions about the potential consequences of incidents involving LNG transportation. One of those consequences, rapid phase transition (RPT), is studied in this dissertation. Incidents and experiments involving LNG-water RPT and theoretical analyses of the RPT mechanism are reviewed. Some other consequences, such as pool spread and vapor cloud dispersion, are analyzed with the Federal Energy Regulatory Commission (FERC) model.
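As a hedged illustration of how segment-level risk components can be combined into a route-level risk measure, the sketch below multiplies per-kilometre incident frequency, conditional release probability, and consequence along hypothetical route segments; the segment data and the simple additive formulation are illustrative assumptions, not the algorithms or databases used in the dissertation.

```python
# Hypothetical route segments: (length_km, incidents_per_km_per_trip,
# P(release | incident), expected persons affected per release). All values invented.
ROUTE_A = [
    (40.0, 2.0e-7, 0.08, 50.0),   # urban segment: low speed, high population exposure
    (120.0, 1.2e-7, 0.10, 5.0),   # rural highway segment
]
ROUTE_B = [
    (180.0, 1.0e-7, 0.10, 3.0),   # longer bypass avoiding the urban area
]

def route_risk(segments):
    """Expected societal consequence per trip, summed over route segments."""
    return sum(length * freq * p_release * consequence
               for length, freq, p_release, consequence in segments)

for name, route in [("Route A", ROUTE_A), ("Route B", ROUTE_B)]:
    print(f"{name}: expected consequence per trip = {route_risk(route):.2e}")
```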
|
29 |
Accessing a web based business system through a smartphone, a risk analysis. Nilsson, Anton, January 2015
This thesis project has been performed at (and for) a company named Strödata. The purpose of the project has been to perform a risk analysis of Strödata's web based business system, and specifically to analyze how access to the business system through smartphones would affect the risks posed to the system. This has been done to help decide whether smartphone access should be enabled. A proof-of-concept web application suited for use on a smartphone has also been developed to grant access to a limited part of the business system. The method used to perform the risk analysis has been CORAS, as presented by Braber et al. in [1]. CORAS is a risk analysis method designed specifically with IT systems in mind, and it is divided into seven steps. The new web application is an ASP.NET MVC3 site that uses JavaScript, jQuery and Ajax-JSON. The risk analysis showed, among other things, that the benefits of enabling smartphone access to the business system are larger than the risks it introduces. Smartphone access also opens up many new possibilities to implement interesting new features or improve old ones. The risk analysis also showed that there are risks to the system that need to be dealt with. For these risks, treatments were identified to lessen their probabilities and/or their consequences should they occur. Some treatments were completely successful in eliminating the risks they treat; others were not. However, the treatments that were not completely successful did reduce the risks far enough that they should perhaps be re-evaluated as acceptable or unacceptable. The conclusions that can be drawn from this thesis project are that although enabling smartphone access to the business system introduces new risks, the access also reduces certain risks. How costly the new risks are and how much the access reduces risks varies from company to company and from system to system. For Strödata, the reduction of certain risks was large enough to outweigh the new risks that would be introduced. Regarding the possibility of implementing smartphone access to the business system, it is feasible using more modern technologies, methods, and frameworks, such as those mentioned above.
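To make the treatment-evaluation step concrete, the sketch below scores a few risks on a simple likelihood-times-consequence scale and checks whether a treatment brings the residual risk below an acceptance threshold; the risk names, scores, and threshold are hypothetical and are not taken from the CORAS analysis in the thesis.

```python
# Hypothetical risks: (name, (likelihood 1-5, consequence 1-5) before treatment,
#                            (likelihood 1-5, consequence 1-5) after treatment).
RISKS = [
    ("Stolen smartphone exposes session", (4, 4), (2, 3)),
    ("Credentials sniffed on open Wi-Fi", (3, 4), (1, 4)),
    ("Injection via mobile endpoint", (2, 5), (2, 5)),   # assumed: no effective treatment
]
ACCEPTANCE_THRESHOLD = 8  # assumed: a risk level of 8 or lower is acceptable

for name, before, after in RISKS:
    level_before = before[0] * before[1]
    level_after = after[0] * after[1]
    verdict = "acceptable" if level_after <= ACCEPTANCE_THRESHOLD else "needs further treatment"
    print(f"{name}: {level_before} -> {level_after} ({verdict})")
```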
|
30 |
Risk Analysis of the applied RFID system: Project Stolpen. Grunzke, Richard, January 2007
This thesis is a risk analysis of an RFID system for a logistical application. The system works as follows: around Karlstad in Sweden there are three new weighing machines for lorries. The load weight is measured both for the police, to check for overloading, and for logistical purposes such as issuing invoices and optimising the supply chain. The lorries do not have to stop to be weighed; they only have to drive slowly over the weighing machine, so the loss of time is minimal. The lorries are identified via RFID tags, so every time a lorry is driven over the weighing machine, its identification number and the measured weight are logged and sent to a database. In the future it is planned to store the weight on the tag itself. The task is to analyse the RFID communication and the transmission to the database. The thesis contains several parts. First, RFID in general and how RFID is used in the application scenario are described. The next sections cover the security and privacy requirements and the risks in detail. Then possible solutions are outlined and concrete suggestions are presented. Finally, a conclusion is drawn, which shows that the application has a low level of security.
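A minimal sketch of the weigh-event data flow described above, logging a tag identifier and measured weight to a database; the table layout, field names, and the use of SQLite are assumptions for illustration, not the actual system design analysed in the thesis.

```python
import sqlite3
from datetime import datetime, timezone

# In-memory database standing in for the real back end (assumed schema).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE weigh_events (
        tag_id      TEXT NOT NULL,   -- RFID tag identification number
        weight_kg   REAL NOT NULL,   -- measured load weight
        station     TEXT NOT NULL,   -- which of the weighing machines
        recorded_at TEXT NOT NULL    -- UTC timestamp of the reading
    )
""")

def log_weigh_event(tag_id, weight_kg, station):
    """Store one drive-over reading: tag ID plus measured weight."""
    conn.execute(
        "INSERT INTO weigh_events VALUES (?, ?, ?, ?)",
        (tag_id, weight_kg, station, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

log_weigh_event("E200-3412-0123-4567", 38450.0, "Karlstad-North")
print(conn.execute("SELECT * FROM weigh_events").fetchall())
```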
|