251 |
Quantifying the financial and level of service implications of network variable uncertainty in infrastructure management (September 2015)
Existing standards and guidelines support the effective management of infrastructure through infrastructure asset management (IAM) planning. However, few if any of these standards explicitly address the financial implications of the uncertainty that underlies the risk associated with service provision. Without credibly quantifying the potential implications of this network variable uncertainty (e.g., an extreme weather event that affects the performance and costs of many segments within the study network, or the introduction of a new technology that changes the network cost estimates), infrastructure management systems may regularly and significantly over- or under-estimate the financial resources required to provide services; financial projections may therefore contain a systematic bias. It was hypothesized that a model could be developed to quantify and communicate the financial implications of network variable uncertainty within the IAM context.
A model was developed to demonstrate how network variable uncertainty could be included in financial planning for infrastructure networks. The model was able to: (1) be applied to various types of infrastructure networks, (2) incorporate network variable uncertainty, (3) compare alternatives and scenarios, and (4) support effective communication of results. The outputs of the model were the average network annual worth (AW) and network present worth (PW). These outputs, along with tornado plots, risk curves, level of service dashboards, and existing budget levels, were used to communicate the impacts of network variable uncertainty on the financial projections. The model was developed using Excel tools linked to DPL software to apply probabilistic methods. The Life Cycle Cost (LCC) portion of the model was successfully verified against an existing infrastructure costing tool, the Land and Infrastructure Resiliency Assessment (LIRA) tool developed by the Agri-Environmental Services Branch of Agriculture and Agri-Food Canada. The impact of network variable uncertainty was also quantified in terms of the levels of service provided by the organization.
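As a sketch of the probabilistic evaluation described above, the following hypothetical Python example simulates the network annual worth under a single network-wide event. All figures here (segment count, cost distributions, event probability, cost multiplier, discount rate) are invented for illustration and are not taken from the thesis or the DPL/Excel model.

```python
import random

def network_annual_worth(segment_costs, rate=0.04, life=25):
    """Annualize the network life-cycle cost: AW = PW * (A/P) factor."""
    pw = sum(segment_costs)
    a_p = rate * (1 + rate) ** life / ((1 + rate) ** life - 1)
    return pw * a_p

def simulate_aw(n_trials=10_000, p_event=0.3, cost_multiplier=1.4):
    """Monte Carlo over one network variable (e.g. an extreme rainfall
    event that raises costs on every segment at once)."""
    random.seed(1)
    results = []
    for _ in range(n_trials):
        # hypothetical 12-segment network with uncertain per-segment LCC
        base = [random.gauss(100_000, 15_000) for _ in range(12)]
        if random.random() < p_event:  # the event hits the whole network
            base = [c * cost_multiplier for c in base]
        results.append(network_annual_worth(base))
    return sum(results) / n_trials

expected_aw = simulate_aw()
```

Because the event is applied network-wide rather than independently per segment, it shifts the whole AW distribution, which is exactly the kind of systematic effect that per-segment uncertainty models miss.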
The developed model was first applied to a hypothetical twelve-segment road network for illustrative purposes. Four decisions or events representing network variable uncertainty were considered for this network: (1) the decision to implement a new technology, (2) a change in standards, (3) an increase in material costs, and (4) the occurrence of an extreme rainfall event. The hypothetical network illustrated that if the defined decisions or events occurred, the expected network AW would increase by 41%. The impacts of these decisions or events on the hypothetical network's levels of service were also considered. The measured levels of service included the network financial sustainability indicator (the network's current budget divided by the network annual worth, expressed as a percentage) and the frequency of blading of the roads.
The model was next applied to a case study of the Town of Shellbrook sanitary main network. The Town has a large quantity of aging mains that were constructed in the 1960s and are expected to require renewal in the near term. The network variable uncertainty for the case study resulted from the potential decision to implement a new trenchless technology for the renewal of sanitary mains. The new technology was expected to decrease renewal costs; however, there was uncertainty as to what percentage of the sanitary mains would be found suitable for it. Using the model, it was determined that if the decision were made to implement the new technology, the network AW would be expected to decrease by 17%. The levels of service used for the Shellbrook case study were the network financial sustainability indicator (annual budget / network AW) and the meeting of standards set by regulating bodies. The network financial sustainability indicator was sensitive to the decision to implement the trenchless technology, while the meeting of regulatory standards was not. If the new technology were implemented, the network financial sustainability indicator would be expected to increase from 28% to 34%.
The model was finally applied to a case study of the RM of Wilton gravel road network. The network variable uncertainty for this case study resulted from a potential increase in gravel material costs, represented as the magnitude of the annual increase in gravel costs. Given the event of increasing gravel costs, the expected network AW would increase by 14%. The level of service indicators used for the RM of Wilton case study were the network financial sustainability indicator and the frequency of blading. The network financial sustainability indicator was sensitive to the event of increasing gravel costs, while the frequency of blading was not directly impacted (although it may be indirectly impacted). If gravel costs were to increase, the network financial sustainability indicator would be expected to decrease from 59% to 52%.
This research confirmed the hypothesis: a model could be developed that quantifies and communicates the financial implications and level of service impacts of network variable uncertainty for IAM planning. It also illustrated and quantified that, by planning IAM without accounting for network variable uncertainty, such as (1) changing technology, (2) changing standards, (3) increasing material costs, and (4) extreme weather events, managers may introduce a systematic bias into long term planning. Network variable uncertainty can significantly impact the projected expenditures required for the long term provision of services. Infrastructure managers and decision makers need to manage infrastructure in a sustainable way over the long term in the face of uncertainty, and to do so they need information regarding the impacts of network variable uncertainty on both LCCs and levels of service in order to make fully informed decisions.
252 |
Breaking Uncertainties for Product Offerings: A Holistic Framework of Uncertainty Management for Planning, Designing and Developing PSS (Product/Service System). Ashok Kumar, Allan; Chau Trinh, Giang (January 2011)
In the last decade, PSS (Product/Service System) emerged as an effective new business model that helps manufacturers significantly increase productivity and customer satisfaction whilst minimizing environmental impact. PSS contributes to an innovative transaction trend in which, rather than just providing physical products, industrial companies focus on integrated service offers and the fulfillment of customer needs. However, to implement PSS successfully, manufacturers must overcome many challenges and uncertainties. Uncertainties in the PSS planning phase relate to market, environment or company analysis; in the design and development stages, reliability, product/service integration and supplier coordination are considered potential sources of uncertainty. Uncertainty is defined as a "state of deficiency of information related to a future event" (Sakao et al., 2009). Risks derived from the negative side of uncertainties may reduce the efficiency of the model or even cause the implementation process to fail; if an uncertainty is resolved in a favorable way, it can instead be seen as a potential business opportunity for PSS companies. While many companies already have their own uncertainty management initiatives, others rely only on long experience to treat uncertainties. Numerous companies are therefore seeking a comprehensive uncertainty management framework applicable in most circumstances. To fulfill this need, this thesis aimed to develop a holistic framework for managing risks that occur in the PSS planning, design and development stages. Based on previous PSS research and the empirical data collected, the dissertation first identified the critical uncertainty factors and the potential business opportunities that can be exploited from them.
In addition, the research investigated PSS product quality thresholds and producers' perceptions of the reliability of their products before constructing a general uncertainty management framework. The management process, based on an Active Risk Management philosophy and comprising Risk Management Planning, Risk Identification, Risk Assessment and Prioritization, Risk Quantification, Risk Response Planning, and Risk Tracking and Control, is introduced as a guideline to help PSS companies treat uncertainties effectively in PSS planning, design and development.
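The assessment-and-prioritization step of such a framework is commonly reduced to ranking risks by exposure (probability times impact). A minimal illustrative sketch follows; the risk names and scores are invented for illustration and are not from the thesis.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # likelihood of occurrence, 0..1
    impact: float       # severity score, 1 (minor) .. 5 (severe)

    @property
    def exposure(self) -> float:
        """Simple exposure score used to rank risks for response planning."""
        return self.probability * self.impact

# Hypothetical uncertainties from PSS planning, design and development
risks = [
    Risk("inaccurate market analysis", 0.4, 4),
    Risk("product/service integration failure", 0.2, 5),
    Risk("supplier coordination delays", 0.5, 2),
]

# Highest-exposure risks get response plans first
prioritized = sorted(risks, key=lambda r: r.exposure, reverse=True)
```

A real framework would follow this ranking with quantification, response planning and tracking, but the ranking itself is what links identification to the later stages.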
253 |
Incorporating sensor uncertainty in robot map building using fuzzy boundary representation. Tovar, Alejandro (17 April 2014)
A map allows autonomous mobile robots to traverse an environment safely and efficiently by supporting path planning, navigation and localization. Maps are generated from sensor data; however, sensor uncertainties affect the mapping process and thus the performance of path planning, navigation and localization. This thesis proposes to incorporate sensor uncertainty information into the robot's environmental map using a Fuzzy Boundary Representation (B-rep). A fuzzy B-rep map is generated by first converting measured range data into scan polygons, then combining the scan polygons into the resultant robot B-rep map by a union operation, and finally fuzzifying the B-rep map by sweeping a sensor uncertainty membership function along the generated B-rep map. A map of the fifth floor of the E1 building is generated using the proposed method to demonstrate the reduction in computational and memory load of fuzzy B-rep mapping compared with conventional grid-based mapping methods.
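As an illustration of the fuzzification idea (the actual membership function used in the thesis is not specified here), a simple triangular membership function can express how strongly a point belongs to a measured boundary, with the spread reflecting sensor uncertainty:

```python
def triangular_membership(x: float, center: float, spread: float) -> float:
    """Degree (0..1) to which position x belongs to the fuzzy boundary
    centered at a measured range reading; spread is the assumed sensor
    uncertainty. Membership falls linearly to 0 at distance `spread`."""
    if spread <= 0:
        return float(x == center)
    return max(0.0, 1.0 - abs(x - center) / spread)

# A wall measured at 2.0 m with an assumed +/-0.1 m sensor uncertainty:
# full membership at the reading, none beyond the uncertainty band
at_reading = triangular_membership(2.0, 2.0, 0.1)
outside_band = triangular_membership(2.2, 2.0, 0.1)
```

Sweeping such a function along each B-rep boundary segment yields a fuzzy boundary whose width encodes sensor uncertainty, instead of storing per-cell occupancy probabilities as a grid map would.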
254 |
Statistical approach toward designing expert system. Hu, Zhiji (January 1988)
Inference under uncertainty plays a crucial role in expert systems and receives growing attention from artificial intelligence experts, statisticians, and psychologists. Finding satisfactory new ways to model inference under uncertainty will require combining the efforts of researchers from these different areas. Deep insight into this crucial problem is expected not only to have an enormous impact on the development of AI and expert systems, but also to move classical areas such as statistics into a new stage. This research paper gives a precise synopsis of present work in the field and explores the mechanics of statistical inference in new depth by combining the efforts of computer scientists, statisticians, and psychologists. One important part of the paper is a comparison of different paradigms, including the difference between statistical and logical views. The paper considers the special care that must be taken when combining various methods, and gives examples and counterexamples to illustrate the validity of individual models of human behavior. Finally, a new framework for dealing with uncertainty is proposed, and future trends in uncertainty management are projected. / Department of Mathematical Sciences
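One of the statistical paradigms compared in such work is Bayesian updating of a hypothesis given evidence. A minimal sketch with invented numbers (the rule and probabilities below are hypothetical, not from the paper):

```python
def posterior(prior: float, likelihood: float, likelihood_not: float) -> float:
    """Bayes' rule: P(H|E) from P(H), P(E|H) and P(E|not H)."""
    num = likelihood * prior
    return num / (num + likelihood_not * (1.0 - prior))

# Hypothetical diagnostic rule: the evidence is 9x more likely under H,
# but H is rare (prior 0.1), so the posterior only reaches 0.5
p = posterior(prior=0.1, likelihood=0.9, likelihood_not=0.1)
```

The example illustrates the statistical view's core discipline: the strength of evidence (the likelihood ratio) is weighed against the prior, rather than a rule firing with a fixed certainty factor as in early logical approaches.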
255 |
Affine Arithmetic Based Methods for Power Systems Analysis Considering Intermittent Sources of Power. Munoz Guerrero, Juan Carlos (January 2013)
Intermittent power sources such as wind and solar are increasingly penetrating electrical grids, driven mainly by global warming concerns and government policies. These intermittent and non-dispatchable sources of power affect the operation and control of the power system because of the uncertainties associated with their output power. Depending on the penetration level of intermittent sources of power, the electric grid may experience considerable changes in power flows and in the synchronizing torques associated with system stability, because of the variability of the power injections, among several other factors. Thus, adequate and efficient techniques are required to properly analyze system stability under such uncertainties.
A variety of methods are available in the literature to perform power flow, transient, and voltage stability analyses considering uncertainties associated with electrical parameters. Some of these methods are computationally inefficient and require assumptions regarding the probability density functions (pdfs) of the uncertain variables that may be unrealistic in some cases. Thus, this thesis proposes computationally efficient Affine Arithmetic (AA)-based approaches for voltage and transient stability assessment of power systems, considering uncertainties associated with power injections due to intermittent sources of power. In the proposed AA-based methods, the estimated output power of the intermittent sources and its associated uncertainty are modeled as intervals, without any need for assumptions regarding pdfs. This is a desirable characteristic when dealing with intermittent sources of power, since the pdfs of the output power depend on the planning horizon and prediction method, among several other factors. The proposed AA-based approaches take into account the correlations among variables, thus avoiding the error explosions attributed to other self-validated techniques such as Interval Arithmetic (IA).
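The key property of affine arithmetic, tracking correlations through shared noise symbols, can be sketched in a few lines. This is a toy affine form supporting only addition and subtraction, not the full AA machinery used in the thesis:

```python
class Affine:
    """Minimal affine form: center + sum(coef_i * eps_i), eps_i in [-1, 1]."""

    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})  # noise symbol -> coefficient

    def __add__(self, other):
        terms = dict(self.terms)
        for k, v in other.terms.items():
            terms[k] = terms.get(k, 0.0) + v
        return Affine(self.center + other.center, terms)

    def __sub__(self, other):
        neg = Affine(-other.center, {k: -v for k, v in other.terms.items()})
        return self + neg

    def interval(self):
        """Enclosing interval: radius is the sum of |coefficients|."""
        r = sum(abs(v) for v in self.terms.values())
        return (self.center - r, self.center + r)

# x = 10 +/- 2, carried by noise symbol "e1"
x = Affine(10.0, {"e1": 2.0})
difference = (x - x).interval()
```

Because `x - x` shares the same noise symbol, the coefficients cancel and the result collapses to exactly (0.0, 0.0), whereas plain interval arithmetic would report [-4, 4]; that is the error explosion the thesis attributes to IA.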
256 |
The Utility of Using Multiple Conceptual Models for the Design of Groundwater Remediation Systems. Sheffield, Philip (January 2014)
The design of pump and treat systems for groundwater remediation is often aided by numerical groundwater modelling. Model predictions are uncertain, with this uncertainty resulting from unknown parameter values, model structure and future system forcings. Researchers have begun to suggest that uncertainty in groundwater model predictions is largely dominated by structural/conceptual model uncertainty, and that multiple conceptual models should be developed in order to characterize this uncertainty. As regulatory bodies begin to endorse the more expensive multiple conceptual model approach, it is useful to assess whether a multiple model approach provides a significant improvement over a conventional single model approach, supplemented with a factor of safety, for pump and treat system design. To investigate this question, a case study located in Tacoma, Washington, provided by Conestoga-Rovers & Associates (CRA), was used.
Twelve conceptual models were developed to represent conceptual model uncertainty at the Tacoma, Washington site, and a pump and treat system was optimally designed for each conceptual model. Each design was tested across all 12 conceptual models with no factor of safety, a factor of safety of 1.5, and a factor of safety of 2. Adding a factor of safety of 1.5 decreased the risk of containment failure to 15 percent, compared to 21 percent with no factor of safety. Increasing the factor of safety from 1.5 to 2 further reduced the risk of containment failure to 9 percent, indicating that the application of a factor of safety reduces the risk of design failure at a cost directly proportional to the value of the factor of safety.
To provide a relatively independent assessment of the factor of safety approach, a single "best" model developed by CRA was compared against the multiple model approach. With a factor of safety of 1.5 or greater, adequate capture was demonstrated across all 12 conceptual models. This demonstrated that, in this case, using the single "best" model developed by CRA with a factor of safety would have been a reasonable surrogate for a multiple model approach. This is of practical importance to engineers, as it demonstrates that a conventional single model approach may be sufficient; however, it is essential that the model used is a good model. Furthermore, a multiple model approach will likely be an excessive burden in cases such as pump and treat system design, where the cost of failure is low because the system can be adjusted during operation to respond to new data. This may not be the case for remedial systems with high capital costs, such as permeable reactive barriers, which cannot be easily adjusted.
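The logic of testing one design against many conceptual models, with and without a factor of safety, can be sketched as follows. The required capture rates, design rate and failure criterion below are invented for illustration and do not come from the Tacoma case study:

```python
def containment_fails(design_rate: float, required_rate: float) -> bool:
    """A design fails under a conceptual model whose hydrogeology demands a
    higher pumping rate than the design provides (hypothetical criterion)."""
    return design_rate < required_rate

# Hypothetical required capture rates (L/s) under 12 conceptual models
required = [8.1, 9.4, 7.2, 10.8, 8.8, 9.9, 7.5, 11.6, 8.3, 9.1, 10.2, 8.6]
base_design = 9.0  # rate optimized under a single "best" model

def failure_risk(factor_of_safety: float) -> float:
    """Fraction of conceptual models under which the scaled design fails."""
    design = base_design * factor_of_safety
    failures = sum(containment_fails(design, r) for r in required)
    return failures / len(required)

risk_no_fos = failure_risk(1.0)
risk_fos_15 = failure_risk(1.5)
```

Scaling the design by the factor of safety trades pumping cost (roughly proportional to the rate) against the fraction of plausible conceptualizations under which containment fails, which mirrors the trade-off quantified in the study.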
257 |
Climate change impact assessment and uncertainty analysis of the hydrology of a northern, data-sparse catchment using multiple hydrological models. Bohrn, Steven (17 December 2012)
The objective of this research was to determine the impact of climate change on the Churchill River basin and to analyze the uncertainty related to this impact.
Three hydrological models, calibrated to approximately equivalent levels of efficiency, were used to determine this impact: WATFLOOD™, a semi-physically based, distributed model; HBV-EC, a semi-distributed, conceptual model; and HMETS, a lumped, conceptual model. These models achieved Nash-Sutcliffe calibration values ranging from 0.51 to 0.71.
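The Nash-Sutcliffe efficiency used to report calibration quality compares model error to the variance of the observations; 1 is a perfect fit and values at or below 0 mean the model is no better than the observed mean. A sketch with invented flow values:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Hypothetical observed and simulated daily flows (m^3/s)
obs = [120.0, 340.0, 510.0, 280.0, 150.0]
sim = [130.0, 300.0, 480.0, 300.0, 170.0]
nse = nash_sutcliffe(obs, sim)
```

Calibrating three structurally different models to comparable NSE values, as done here, is what makes the later comparison of their climate change projections a fair test of structural uncertainty.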
Climate change simulations indicated that, on average, a small increase in flow is predicted for the 2050s and a slight decrease for the 2080s. Each hydrological model predicted earlier freshets and a shift in the timing of low flow events.
Uncertainty analysis indicated that the chief contributor of uncertainty was the selection of GCM, followed by the choice of hydrological model, with less significant sources of uncertainty being the parameterization of the hydrological model and the selection of emissions scenario.
258 |
The development and application of a normative framework for considering uncertainty and variability in economic evaluation. Coyle, Douglas (January 2004)
The focus of this thesis is the development and application of a normative framework for handling both variability and uncertainty when making decisions using economic evaluation. The framework builds on recent work that takes an intuitive Bayesian approach to handling uncertainty, and adds a similar approach for handling variability. The technique of stratified cost effectiveness analysis is introduced as an innovative, intuitive and theoretically sound basis for considering variability with respect to cost effectiveness. The technique requires the identification of patient strata such that there are differences between strata but individual strata are relatively homogeneous. For handling uncertainty, the normative framework requires a twofold approach. First, the cost effectiveness of therapies within each patient stratum must be assessed using probabilistic analysis. Secondly, techniques for estimating the expected value of perfect information (EVPI) should be applied to determine an efficient research plan for the disease of interest. For the latter, a new technique for estimating EVPI based on quadrature is described, which is both accurate and allows simpler calculation of the expected value of sample information. In addition, the unit normal loss integral method, previously ignored as a method of estimating EVPPI, is shown to be appropriate in specific circumstances. The normative framework is applied to decisions relating to the public funding of the treatment of osteoporosis in the province of Ontario. The optimal limited use criteria would be to fund treatment with alendronate for women aged 75 years and over with a previous fracture and 77 years and over with no previous fracture. An efficient research plan would fund a randomised controlled trial comparing etidronate to no therapy, with a sample size of 640. Certain other research studies are of lesser value.
Subsequent to the analysis contained in this thesis, the province of Ontario revised its limited use criteria to be broadly in line with the conclusions of this analysis. Thus, the application of the framework to this area demonstrates both its feasibility and acceptability. The normative framework developed in this thesis provides an optimal solution for decision makers in terms of handling uncertainty and variability in economic evaluation. Further research refining methods for estimating the value of information and considering other forms of uncertainty within models will enhance the framework.
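The Monte Carlo logic behind EVPI, the difference between deciding with perfect information and deciding on current expectations, can be sketched as follows. The net benefit distribution and two-strategy setup are invented for illustration, not taken from the osteoporosis analysis:

```python
import random

def evpi(n=20_000, seed=7):
    """EVPI = E[best decision given each outcome] - best decision on average.
    Hypothetical two-strategy example: the incremental net benefit of
    treatment vs no treatment is uncertain; no treatment has net benefit 0."""
    random.seed(seed)
    draws = [random.gauss(50.0, 400.0) for _ in range(n)]  # incremental NB
    ev_current = max(0.0, sum(draws) / n)             # decide now, then learn
    ev_perfect = sum(max(0.0, d) for d in draws) / n  # learn, then decide
    return ev_perfect - ev_current

value_of_information = evpi()
```

A positive EVPI bounds what further research (such as the etidronate trial above) could be worth; sample-size methods like the expected value of sample information refine that bound for a specific study design.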
259 |
Static critical properties of the pure and diluted Heisenberg or Ising models. Davies, Mathew Raymond (January 1982)
Real space renormalisation group scaling techniques are used to investigate the static critical behaviour of the pure and dilute, classical, anisotropic Heisenberg model. Transfer matrix methods are employed to obtain asymptotically exact expressions for the correlation lengths and susceptibilities of the one-dimensional system. The resulting scaling relationships are combined with an approximate bond moving scheme to treat pure and dilute models in higher dimensionalities. Detailed discussions are given of the dependence of the correlation lengths and susceptibilities on temperature, anisotropy and concentration, and of the dependence of the critical temperature on anisotropy and concentration. Particular emphasis is given to the weakly anisotropic system near the percolation threshold, and comparisons are made between the results of the present analysis and those of neutron-scattering experiments on dilute quasi-two- and three-dimensional systems.
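As an example of the kind of asymptotically exact one-dimensional transfer-matrix result referred to above: for the zero-field 1D Ising chain the transfer-matrix eigenvalues are 2cosh(K) and 2sinh(K) with K = J/(k_B T), giving a correlation length xi = 1/ln(coth K). A short numerical sketch (the Ising limit only, not the anisotropic Heisenberg case treated in the thesis):

```python
import math

def correlation_length_1d_ising(k: float) -> float:
    """xi = 1 / ln(lambda_max / lambda_min) for the zero-field 1D Ising
    chain, where lambda_max/lambda_min = cosh(K)/sinh(K) = coth(K)."""
    return 1.0 / math.log(1.0 / math.tanh(k))

# xi grows without bound as T -> 0 (K -> infinity): no finite-temperature
# transition in one dimension, only a T = 0 critical point
xi_low_t = correlation_length_1d_ising(2.0)
xi_high_t = correlation_length_1d_ising(1.0)
```

Such exact one-dimensional expressions are what the bond-moving scheme rescales to reach higher dimensionalities.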
260 |
Management of Uncertainties in Publish/Subscribe System. Liu, Haifeng (18 February 2010)
In the publish/subscribe paradigm, information providers disseminate publications to all consumers who have expressed interest by registering subscriptions. This paradigm has found widespread application, ranging from selective information dissemination to network management. However, existing publish/subscribe systems cannot capture the uncertainty inherent in the information in either subscriptions or publications. In many situations, the large number of data sources exhibit various kinds of uncertainty. Examples of imprecision include: exact knowledge to specify either subscriptions or publications is not available; the match between a subscription and a publication with uncertain data is approximate; and the constraints used to define a match are not only content-based, but also take semantic information into consideration. These kinds of uncertainty have not received much attention in the context of publish/subscribe systems.
In this thesis, we propose new publish/subscribe models to express uncertainties and semantics in publications and subscriptions, along with the matching semantics for each model. We also develop efficient filtering algorithms for our models so that they can be applied to process the rapidly increasing volume of information on the Internet. A thorough experimental evaluation demonstrates that the proposed systems scale to large numbers of subscribers and high publishing rates.
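One simple way to express uncertainty in publications is to attach a discrete probability distribution to each attribute and compute a match probability against each subscription. This is an illustrative sketch with invented attributes and a made-up delivery threshold, not the models developed in the thesis:

```python
def match_probability(subscription, publication):
    """Probability that an uncertain publication satisfies a subscription.
    subscription: attribute -> predicate over values.
    publication: attribute -> {value: probability} (assumed discrete
    uncertainty model; attributes are assumed independent)."""
    p = 1.0
    for attr, predicate in subscription.items():
        dist = publication.get(attr, {})
        p *= sum(prob for value, prob in dist.items() if predicate(value))
    return p

# Subscriber wants temperature > 30; the published reading is uncertain
sub = {"temperature": lambda t: t > 30}
pub = {"temperature": {29: 0.2, 31: 0.5, 33: 0.3}}

p = match_probability(sub, pub)
deliver = p >= 0.7  # hypothetical per-subscription delivery threshold
```

Under exact (certain) matching this publication would either match or not; the probabilistic semantics instead lets the broker deliver when the match probability clears a subscriber-chosen threshold.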