351

Denoising and contrast constancy.

McIlhagga, William H. January 2004 (has links)
Contrast constancy is the ability to perceive object contrast independent of size or spatial frequency, even though these affect both retinal contrast and detectability. Like other perceptual constancies, it is evidence that the visual system infers the stable properties of objects from the changing properties of retinal images. Here it is shown that perceived contrast is based on an optimal thresholding estimator of object contrast that is identical to the VisuShrink estimator used in wavelet denoising.
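For orientation, the sketch below shows the standard VisuShrink recipe referred to above: estimate the noise level from the finest-scale wavelet coefficients and soft-threshold all detail coefficients at the universal threshold sigma*sqrt(2 ln N). It is a minimal illustration using the PyWavelets package with an invented test signal, not code or data from the thesis.

```python
# Minimal VisuShrink-style wavelet denoising sketch (illustrative only, not from the thesis).
# Assumes the PyWavelets package; the signal, wavelet choice and noise level are hypothetical.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 1024
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * 5 * t)                # hypothetical clean signal
noisy = clean + 0.3 * rng.standard_normal(n)     # additive Gaussian noise

coeffs = pywt.wavedec(noisy, "db8", level=5)     # multilevel wavelet decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate from finest details
lam = sigma * np.sqrt(2 * np.log(n))             # VisuShrink "universal" threshold

denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]  # soft-threshold detail bands
]
denoised = pywt.waverec(denoised_coeffs, "db8")[:n]
print(f"threshold = {lam:.4f}")
```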
352

Ökologische Bewertung von zentralen und dezentralen Abwasserentsorgungssystemen

Schubert, Rebecca 09 May 2014 (has links) (PDF)
Because of their long service lives and their dependence on connected capacity, centralized wastewater disposal systems are inflexible with respect to changing conditions such as demographic change, climate change, and declining drinking water consumption. Small on-site wastewater treatment plants are therefore frequently discussed as an alternative, above all when new, remotely located plots are developed, since the new sewer network to the central treatment plant may entail higher costs. To decide between the two options, primarily economic instruments have been used so far; ecological assessment methods have not yet been applied as a decision-making instrument. In this thesis, a life cycle assessment is carried out as an example. It can, however, only be used for decision-making if its results are robust and reliable, so possible uncertainties in the life cycle assessment must be addressed head-on. From these issues follows the central research interest: the ecological assessment of decentralized wastewater disposal systems in comparison with centralized systems, taking into account the dependence on the length of the sewer network and the number of connected inhabitants, with particular attention to data uncertainty in the life cycle assessment. The life cycle assessment method applied follows DIN EN ISO 14040 (2009) and DIN EN ISO 14044 (2006). The literature review shows that little literature on small wastewater treatment plants is available for compiling a life cycle assessment, so a questionnaire was sent to manufacturers of small wastewater treatment plants. Based on the collected data, a life cycle assessment is compiled for an SBR plant and a rotating biological contactor plant; scenario analysis is used to capture the spread of the data. The data for the centralized plant, in contrast, come from the EcoInvent database for wastewater disposal (DOKA, G. (2007)), for which further scenarios are created. The functional unit is four population equivalents. After compiling the life cycle inventory, the "CML 2 baseline 2000" method is used for impact assessment in SimaPro 7.1. A break-even analysis examines the dependence of the wastewater disposal systems on the length of the sewer network and the number of connected inhabitants. Data uncertainty is integrated by means of a sensitivity analysis and a Monte Carlo analysis. The analyses show that installing a small wastewater treatment plant is more worthwhile the farther the municipal treatment plant is from the household and the smaller the number of connected inhabitants. An SBR plant is preferable to a rotating biological contactor plant. The methods used to handle uncertainty show that life cycle assessments are suitable as a decision-making instrument; methods for avoiding or incorporating uncertainty must, however, be applied far more consistently in life cycle assessments of wastewater disposal systems. / Next to municipal sewage treatment plants, decentralized small wastewater treatment plants can be used for sewage treatment. Given their long working life and their dependence on user numbers, central sewage systems are inflexible with respect to changing overall conditions such as demography, climate, or declining water consumption. Small wastewater treatment plants in rural areas can be an alternative, especially for new building plots where an expensive new sewer grid would otherwise be required.
Economic methodologies help to choose between these facilities, whereas ecological assessments are neglected. This thesis focuses on a life cycle assessment (LCA), but an LCA can only be used for decision-making if its results are robust and reliable. Hence it is necessary to analyze the uncertainty factors of life cycle assessments. The following main objective arises: the ecological assessment of small wastewater treatment plants in comparison to a municipal sewage treatment plant, as a function of the length of the sewer grid and the number of inhabitants connected to the central sewer system, while explicitly dealing with data uncertainty in the LCA. A simplified LCA study, based on DIN EN ISO 14040 (2006) and DIN EN ISO 14044 (2006), is applied to capture this objective, which means in detail the analysis of environmental effects during the production, operation, and disposal phases. As the literature review has shown, available data on small wastewater treatment plants is nearly non-existent. Hence, a survey among the producers of small wastewater treatment plants complements the sparse literature. Based on the questionnaire data, an LCA for a sequencing batch reactor (SBR) plant and a rotating biological contactor plant is carried out. The scenario analysis captures the spread of the available data adequately. In contrast, the data for the central sewage treatment plant comes from the EcoInvent database for wastewater treatment (DOKA, G. (2007)) and is used for further scenarios. The functional unit consists of four population equivalents. After collecting all data for the life cycle inventory analysis, the impact assessment is performed by applying the "CML 2 baseline 2000" method implemented in SimaPro 7.1. By means of a break-even analysis, the dependence on the length of the sewer grid and the number of inhabitants connected to it is analyzed. To explicitly incorporate uncertainty in the LCA, a sensitivity analysis and a Monte Carlo analysis are used. The most important result is that the greater the distance between the household and the central sewage treatment plant, and the lower the number of connected inhabitants, the more preferable the small wastewater treatment plant becomes. Furthermore, SBR plants are generally preferred over rotating biological contactor plants. Concerning the treatment of uncertainty factors, the analysis of the different methods confirms that LCA is suitable as a decision-making instrument. Further research is required on preventing uncertainty factors or integrating them into LCAs of sewerage systems.
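To make the break-even idea concrete, here is a minimal sketch under invented assumptions: the centralized option's impact score grows with the required sewer length while the decentralized option's score is fixed, and the break-even length is where the two curves cross. The functions, impact values, and scaling are hypothetical placeholders, not the thesis's model or results.

```python
# Hypothetical break-even sketch for decentralized vs. centralized wastewater treatment.
# Impact values are invented placeholders, not results from the thesis.
import numpy as np

def centralized_impact(sewer_length_m, inhabitants, base=120.0, per_m=0.8):
    """Impact score per functional unit: plant burden plus a sewer-length term."""
    return base + per_m * sewer_length_m / max(inhabitants, 1)

def decentralized_impact(inhabitants, base=300.0):
    """Impact score for a small on-site plant; no sewer network to the central plant."""
    return base / max(inhabitants, 1) * 4  # scaled to 4 population equivalents

lengths = np.linspace(0, 2000, 401)        # candidate sewer lengths in metres
inhabitants = 4                            # four population equivalents, as in the study
central = np.array([centralized_impact(L, inhabitants) for L in lengths])
decentral = decentralized_impact(inhabitants)

crossover = lengths[np.argmax(central >= decentral)] if (central >= decentral).any() else None
print(f"break-even sewer length (hypothetical): {crossover} m")
```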
353

Conquering Variability for Robust and Low Power Designs

Sun, Jin January 2011 (has links)
As device feature sizes shrink to the nano-scale, continuous technology scaling has led to a large increase in parameter variability during the semiconductor manufacturing process. According to the source of uncertainty, parameter variations can be classified into three categories: process variations, environmental variations, and temporal variations. All these variation sources exert significant influence on circuit performance and make it more challenging to characterize parameter variability and achieve robust, low-power designs. The scope of this dissertation is conquering parameter variability and successfully designing efficient yet robust integrated circuit (IC) systems. Previous experience has indicated that this issue must be tackled at every design stage of IC chips. In this dissertation, we propose several robust techniques for accurate variability characterization and efficient performance prediction under parameter variations. At the pre-silicon verification stage, a robust yield prediction scheme under limited descriptions of parameter uncertainties, a robust circuit performance prediction methodology based on the importance of uncertainties, and a robust gate sizing framework based on an ElasticR estimation model have been developed. These techniques provide possible solutions to achieve both prediction accuracy and computational efficiency in the early design stage. At the on-line validation stage, a dynamic workload balancing framework and an on-line self-tuning design methodology have been proposed for application-specific multi-core systems under variability-induced aging effects. These on-line validation techniques help alleviate device performance degradation due to parameter variations and extend device lifetime.
354

A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

Akram, Muhammad Farooq 28 March 2012 (has links)
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for the quantification of epistemic uncertainty on a large-scale problem, a combined cycle power generation system, was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and an improvement in the predicted system performance.
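As an illustration of the evidence-theory machinery named above, the following sketch applies Dempster's rule of combination to two hypothetical expert opinions expressed as basic probability masses on performance-impact intervals. The focal intervals, masses, and the fuel-burn interpretation are assumptions for the example, not data from the dissertation.

```python
# Minimal Dempster-Shafer combination sketch for two expert opinions (illustrative only).
# Focal elements are performance-impact intervals; masses are hypothetical.

def intersect(a, b):
    """Intersection of two closed intervals, or None if they are disjoint."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def dempster_combine(m1, m2):
    """Combine two basic probability assignments over interval focal elements."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            c = intersect(a, b)
            if c is None:
                conflict += wa * wb          # mass assigned to the empty set
            else:
                combined[c] = combined.get(c, 0.0) + wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: the experts' evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical expert elicitations: intervals of % fuel-burn improvement, masses summing to 1.
expert_a = {(2.0, 5.0): 0.6, (4.0, 8.0): 0.4}
expert_b = {(3.0, 6.0): 0.7, (1.0, 4.0): 0.3}
print(dempster_combine(expert_a, expert_b))
```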
355

An optimal framework of investment strategy in brownfields redevelopment by integrating site-specific hydrogeological and financial uncertainties

Yu, Soonyoung January 2009 (has links)
Brownfields redevelopment has been encouraged by governments and the real estate market because of its economic, social and environmental benefits. However, uncertainties in contaminated land redevelopment may cause massive investment risk and need to be managed so that contaminated land redevelopment is facilitated. This study was designed to address hydrogeological as well as economic uncertainty in a hypothetical contaminated land redevelopment project and to manage the risk from these uncertainties through the integration of the hydrogeological and economic uncertainties. Hydrogeological uncertainty is derived from incomplete site information, including aquifer heterogeneity, and must be assessed with scientific expertise, given the short history of redevelopment projects and their unique hydrogeological characteristics. Hydrogeological uncertainty has not yet been incorporated in one framework with the economic uncertainty that has been relatively well observed in financial markets. Two cases of Non-Aqueous Phase Liquid (NAPL) contamination were simulated using a physically based hydrogeological model to address hydrogeological uncertainty: one concerns the effect of an ethanol spill on a light NAPL (LNAPL) contaminated area in the vadose zone, and the other concerns the vapour phase intrusion of volatile organic compounds, in particular Trichloroethylene (TCE), a dense NAPL (DNAPL), into indoor air through a variably saturated heterogeneous aquifer. The first simulation replicated experimental observations in the laboratory, such as the depression of the capillary fringe and the remobilization of the NAPL pool, which collected in a reduced area at higher saturations than observed prior to the ethanol injection. However, the data gap between the model and the experiment, in particular regarding chemical properties, caused uncertainty in the model simulation. The second NAPL simulation was performed for a hypothetical scenario in which new dwellings in a redeveloped area carry the potential risk of vapour phase intrusion from a subsurface source into indoor air because remediation or foundation design might fail. The simulation results indicated that aquifer heterogeneity appeared to be the most significant factor controlling the indoor air exposure risk from a TCE source in the saturated zone. The exposure risk was then quantified using Monte Carlo simulations with 50 statistically equivalent heterogeneous aquifer permeability fields. The quantified risk (probability) represents the hydrogeological uncertainty in the scenario and provides information on the loss occurrence intensity of redevelopment failure. The probability of failure (loss occurrence intensity) was integrated with the cost of failure (loss magnitude) to evaluate the risk capital in the hypothetical brownfields redevelopment project. The term “risk capital” is adopted from the financial literature and denotes the capital that can be lost in a high-risk investment. The cost of failure involves economic uncertainty and can be defined based on a developer’s financial agreement with new dwellers to prevent litigation in the case of certain events, such as an environmental event in which indoor air concentrations of pollutants exceed regulatory limits during periodic inspections. The developer makes such a financial agreement with new dwellers because the new dwellings have been constructed on the basis of flawed site information, and municipalities may require it as a condition of land use planning approval.
An agreement was presumed under which the developer would repurchase the affected houses from the new dwellers immediately if indoor air contamination exceeded the regulatory limit. Furthermore, the developer would remediate any remaining contamination, demolish the affected houses and build new houses if they were worth investing in. Under this assumed financial plan, stochastic housing prices, a stochastic inflation rate and a stochastic interest rate were considered as the sources of uncertainty in the cost of failure, and information on these stochastic variables was obtained from the financial market owing to its long history of observations. This research reviewed appropriate risk capital valuation methods that hydrogeologists can apply straightforwardly to their projects by integrating the probability of failure (hydrogeological uncertainty) and the cost of failure (economic uncertainty). The risk capital is essentially the probability of failure times the cost of failure, with a safety loading added to compensate investors for hydrogeological and financial uncertainty. Fair market prices of the risk capital were valued using financial mathematics and actuarial premium calculations, and each method has a specific safety loading term to reflect investors’ level of risk aversion. The risk capital results indicated that the price of the risk capital was much more sensitive to hydrogeological uncertainty than to financial uncertainty. Developers can manage the risk capital by saving a contingency fee for future events or by paying an insurance premium, given that the price of this risk capital is the price of a contingent claim that pays out upon failure in remediation or in foundation design, and is equivalent to an environmental insurance premium if an insurance company exists to indemnify the developer against the liability. The optimal framework of investment strategy in brownfields redevelopment can be built by linking the steps of addressing and integrating the uncertainties and valuing the risk capital that arises from them. This framework involves balancing the costs associated with each step while maximizing the net profit from land redevelopment. The optimal investment strategy, such as whether or when to remediate or redevelop and to what degree, is the one for which the future price of the land, minus time and material costs as well as the contingency fee or insurance premium, maximizes the net profit.
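The central quantity described above, risk capital as probability of failure times cost of failure plus a safety loading, can be sketched roughly as follows. The failure rate per permeability realization, the repurchase cost model, and the loading factor are all hypothetical placeholders used only to show how the hydrogeological and financial pieces might be combined.

```python
# Hypothetical risk-capital sketch: probability of failure from Monte Carlo realizations,
# cost of failure from a simple repurchase model, plus a standard-deviation safety loading.
# All numbers are placeholders, not results from the thesis.
import numpy as np

rng = np.random.default_rng(42)

# Failure indicator for each of 50 equally likely heterogeneous permeability fields:
# 1 if the simulated indoor-air TCE concentration exceeded the regulatory limit (hypothetical).
failures = rng.random(50) < 0.12
p_fail = failures.mean()

# Cost of failure per event: repurchase of affected houses under stochastic house prices (hypothetical).
house_price = rng.lognormal(mean=np.log(350_000), sigma=0.15, size=10_000)
cost_fail = house_price * 3                      # assume three affected houses per failure event

loss = p_fail * cost_fail                        # per-scenario loss = probability x cost
safety_loading = 1.0                             # risk-aversion multiplier on the loss std. dev.
risk_capital = loss.mean() + safety_loading * loss.std()

print(f"P(failure) = {p_fail:.2f}, risk capital (hypothetical) = ${risk_capital:,.0f}")
```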
356

Three essays on fair division and decision making under uncertainty

Xue, Jingyi 16 September 2013 (has links)
The first chapter is based on a paper with Jin Li on fair division. It was recently discovered that on the domain of Leontief preferences, Hurwicz (1972)'s classic impossibility result does not hold; that is, one can find efficient, strategy-proof and individually rational rules to divide resources among agents. Here we consider the problem of dividing l divisible goods among n agents with generalized Leontief preferences. We propose and characterize the class of generalized egalitarian rules, which satisfy efficiency, group strategy-proofness, anonymity, resource monotonicity, population monotonicity, envy-freeness and consistency. On the Leontief domain, our rules generalize the egalitarian-equivalent rules with reference bundles. We also extend our rules to agent-specific and endowment-specific egalitarian rules. The former is a larger class of rules satisfying all the previous properties except anonymity and envy-freeness. The latter is a class of efficient, group strategy-proof, anonymous and individually rational rules when the resources are assumed to be privately owned. The second and third chapters are based on two of my working papers on decision making under uncertainty. In the second chapter, I study the wealth effect under uncertainty --- how the wealth level impacts a decision maker's degree of uncertainty aversion. I axiomatize a class of preferences displaying decreasing absolute uncertainty aversion, which allows a decision maker to be more willing to engage in uncertainty-bearing behavior as he becomes wealthier. Three equivalent preference representations are obtained. The first is a variation on the constraint criterion of Hansen and Sargent (2001). The other two respectively generalize Gilboa and Schmeidler (1989)'s maxmin criterion and Maccheroni, Marinacci and Rustichini (2006)'s variational representation. This class, when restricted to preferences exhibiting constant absolute uncertainty aversion, is exactly Maccheroni, Marinacci and Rustichini (2006)'s variational preferences. Thus, the results further enable us to establish relationships among the representations for several important classes within variational preferences. The three representations provide different decision rules to rationalize the same class of preferences. The three decision rules correspond to three ways proposed in the literature to identify a decision maker's perception about uncertainty and his attitude toward uncertainty. However, I give examples to show that these identifications conflict with each other. This means that there is much freedom in eliciting two unobservable and subjective factors, namely one's perception about and attitude toward uncertainty, from choice behavior alone. This exactly motivates the work in Chapter 3. In the third chapter, I introduce confidence orders in addition to preference orders. Axioms are imposed on both orders to reveal a decision maker's perception about uncertainty and to characterize the following decision rule. A decision maker evaluates an act based on his aspiration and his confidence in this aspiration. Each act corresponds to a trade-off line between the two criteria: the more he aspires, the less his confidence in achieving the aspiration level. The decision maker ranks an act by the optimal combination of aspiration and confidence on its trade-off line according to an aggregating preference of his over the two-criterion plane.
The aggregating preference indicates his uncertainty attitude, while his perception about uncertainty is summarized by a generalized second-order belief over the prior space, and this belief is revealed by his confidence order.
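For reference, the two ambiguity criteria named above are commonly written in the following standard forms (textbook notation, assumed here rather than taken from the thesis), where C is a set of priors, u a utility index, Delta the set of all priors, and c an ambiguity-index (cost) function:

```latex
% Gilboa-Schmeidler maxmin expected utility over a set of priors C:
V(f) = \min_{p \in C} \int u(f)\, \mathrm{d}p
% Maccheroni-Marinacci-Rustichini variational representation with cost function c:
V(f) = \min_{p \in \Delta} \left\{ \int u(f)\, \mathrm{d}p + c(p) \right\}
```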
357

Bayesian calibration of building energy models for energy retrofit decision-making under uncertainty

Heo, Yeonsook 10 November 2011 (has links)
Retrofitting of existing buildings is essential to reach reduction targets in energy consumption and greenhouse gas emission. In the current practice of a retrofit decision process, professionals perform energy audits, and construct dynamic simulation models to benchmark the performance of existing buildings and predict the effect of retrofit interventions. In order to enhance the reliability of simulation models, they typically calibrate simulation models based on monitored energy use data. The calibration techniques used for this purpose are manual and expert-driven. The current practice has major drawbacks: (1) the modeling and calibration methods do not scale to large portfolio of buildings due to their high costs and heavy reliance on expertise, and (2) the resulting deterministic models do not provide insight into underperforming risks associated with each retrofit intervention. This thesis has developed a new retrofit analysis framework that is suitable for large-scale analysis and risk-conscious decision-making. The framework is based on the use of normative models and Bayesian calibration techniques. Normative models are light-weight quasi-steady state energy models that can scale up to large sets of buildings, i.e. to city and regional scale. In addition, they do not require modeling expertise since they follow a set of modeling rules that produce a standard measure for energy performance. The normative models are calibrated under a Bayesian approach such that the resulting calibrated models quantify uncertainties in the energy outcomes of a building. Bayesian calibration models can also incorporate additional uncertainties associated with retrofit interventions to generate probability distributions of retrofit performance. Probabilistic outputs can be straightforwardly translated into a measure that quantifies underperforming risks of retrofit interventions and thus enable decision making relative to the decision-makers' rational objectives and risk attitude. This thesis demonstrates the feasibility of the new framework on retrofit applications by verifying the following two hypotheses: (1) normative models supported by Bayesian calibration have sufficient model fidelity to adequately support retrofit decisions, and (2) they can support risk-conscious decision-making by explicitly quantifying risks associated with retrofit options. The first and second hypotheses are examined through case studies that compare outcomes from the calibrated normative model with those from a similarly calibrated transient simulation model and compare decisions derived by the proposed framework with those derived by standard practices respectively. The new framework will enable cost-effective retrofit analysis at urban scale with explicit management of uncertainties.
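As a toy illustration of the calibration step described above, the sketch below fits one uncertain parameter of a deliberately simple quasi-steady energy model to monthly monitored energy use with a random-walk Metropolis sampler. The model form, prior, noise level, and data are hypothetical stand-ins, not the normative models or the calibration setup of the thesis.

```python
# Toy Bayesian calibration sketch: infer an uncertain parameter of a simple quasi-steady
# energy model from monthly monitored energy use (all values hypothetical).
import numpy as np

rng = np.random.default_rng(7)

def energy_model(theta, hdd):
    """Hypothetical normative-style model: monthly energy = base load + theta * heating degree days."""
    return 50.0 + theta * hdd

hdd = np.array([400, 350, 300, 200, 100, 50, 40, 60, 150, 250, 330, 390], dtype=float)
observed = energy_model(0.8, hdd) + rng.normal(0, 20, size=hdd.size)   # synthetic "monitored" data

def log_posterior(theta, sigma=20.0):
    if theta <= 0:
        return -np.inf                                   # flat prior on theta > 0
    resid = observed - energy_model(theta, hdd)
    return -0.5 * np.sum((resid / sigma) ** 2)           # Gaussian likelihood, known noise

# Random-walk Metropolis sampler.
samples, theta, lp = [], 1.0, log_posterior(1.0)
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5_000:])                         # discard burn-in
print(f"posterior mean {post.mean():.3f}, 95% interval "
      f"[{np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f}]")
```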
359

Eliminating Design Alternatives under Interval-Based Uncertainty

Rekuc, Steven Joseph 19 July 2005 (has links)
Typically, design is approached as a sequence of decisions in which designers select what they believe to be the best alternative in each decision. While this approach can be used to arrive at a final solution quickly, it is unlikely to result in the most-preferred solution. The reason for this is that all the decisions in the design process are coupled. To determine the most preferred alternative in the current decision, the designer would need to know the outcomes of all future decisions, information that is currently unavailable or indeterminate. Since the designer cannot select a single alternative because of this indeterminate (interval-based) uncertainty, a set-based design approach is introduced. The approach is motivated by the engineering practices at Toyota and is based on the structure of the Branch and Bound Algorithm. Instead of selecting a single design alternative that is perceived as being the most preferred at the time of the decision, the proposed set-based design approach eliminates dominated design alternatives: rather than selecting the best, eliminate the worst. Starting from a large initial design space, the approach sequentially reduces the set of non-dominated design alternatives until no further reduction is possible: the remaining set cannot be rationally differentiated based on the available information. A single alternative is then selected from the remaining set of non-dominated designs. In this thesis, the focus is on the elimination step of the set-based design method: A criterion for rational elimination under interval-based uncertainty is derived. To be efficient, the criterion takes into account shared uncertainty (uncertainty shared between design alternatives). In taking this uncertainty into account, one is able to eliminate significantly more design alternatives, improving the efficiency of the set-based design approach. Additionally, the criterion uses a detailed reference design to allow more elimination of inferior design sets without evaluating each alternative in that set. The effectiveness of this elimination is demonstrated in two examples: a beam design and a gearbox design.
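One way to picture the elimination step is interval dominance: when each alternative's performance is known only as an interval, an alternative can be eliminated once its best possible (upper-bound) value is still worse than the worst possible (lower-bound) value of some other alternative. The sketch below shows only this basic criterion with invented intervals; it does not model the shared-uncertainty refinement or the reference-design device developed in the thesis.

```python
# Basic set-based elimination under interval uncertainty (interval dominance only).
# The shared-uncertainty and reference-design refinements from the thesis are not modeled here.

def eliminate_dominated(alternatives):
    """Keep alternatives whose upper bound is not below some other alternative's lower bound
    (performance to be maximized, given as (lower, upper) intervals)."""
    survivors = {}
    for name, (lo, hi) in alternatives.items():
        dominated = any(
            other_lo > hi
            for other, (other_lo, _) in alternatives.items()
            if other != name
        )
        if not dominated:
            survivors[name] = (lo, hi)
    return survivors

# Hypothetical beam-design alternatives with interval-valued performance scores.
designs = {"A": (3.0, 6.0), "B": (5.5, 7.0), "C": (1.0, 2.5), "D": (4.0, 8.0)}
print(eliminate_dominated(designs))   # C is eliminated: its best case (2.5) < B's worst case (5.5)
```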
360

A Robust Design Method for Model and Propagated Uncertainty

Choi, Hae-Jin 04 November 2005 (has links)
One of the important factors to be considered in designing an engineering system is uncertainty, which emanates from natural randomness, limited data, or limited knowledge of systems. In this study, a robust design methodology is established in order to design multifunctional materials, employing multi-time and length scale analyses. The Robust Concept Exploration Method with Error Margin Index (RCEM-EMI) is proposed for design incorporating non-deterministic system behavior. The Inductive Design Exploration Method (IDEM) is proposed to facilitate distributed, robust decision-making under propagated uncertainty in a series of multiscale analyses or simulations. These methods are verified in the context of Design of Multifunctional Energetic Structural Materials (MESM). The MESM is being developed to replace the large amount of steel reinforcement in a missile penetrator, for light weight, high energy release, and sound structural integrity. In this example, the methods facilitate the following state-of-the-art design capabilities: robust MESM design under (a) random microstructure changes and (b) propagated uncertainty in a multiscale analysis chain. The methods are designed to facilitate effective and efficient materials design; however, they are general enough to be applicable to any complex engineering systems design that incorporates computationally intensive simulations or expensive experiments, non-deterministic models, accumulated uncertainty in multidisciplinary analyses, and distributed, collaborative decision-making.
