1 |
Steering drift and wheel movement during braking: parameter sensitivity studies. Klaps, J., Day, Andrew J. January 2003 (has links)
In spite of the many significant improvements in car chassis design over the past two decades, steering drift during braking, where the driver must apply a corrective steering torque in order to maintain course, can still be experienced under certain conditions while driving. In the past, such drift, or 'pull', would have been attributed to side-to-side braking torque variation [1], but modern automotive friction brakes and friction materials are now able to provide braking torque with such high levels of consistency that side-to-side braking torque variation is no longer regarded as a cause of steering drift during braking. Consequently, other influences must be considered. This paper is the first of two papers to report on an experimental investigation into braking-related steering drift in motor vehicles. Parameters that might influence steering drift during braking include suspension compliance and steering offset, and these have been investigated to establish the sensitivity of steering drift to such parameters. The results indicate how wheel movement arising from compliance in the front suspension and steering system of a passenger car during braking can be responsible for steering drift during braking. Braking causes changes in wheel alignment which in turn affect the toe steer characteristics of each wheel and therefore the straight-line stability during braking. It is concluded that a robust design of suspension is possible in which side-to-side variation in toe steer is not affected by changes in suspension geometry during braking, and that the magnitude of these changes and the relationships between the braking forces and the suspension geometry and compliance require further investigation, which will be presented in the second paper of the two.
|
2 |
Optimal Design of District Energy Systems: a Multi-Objective Approach. Wang, Cong January 2016 (has links)
The aim of this thesis is to develop a holistic approach to the optimal design of energy systems for building clusters or districts. The emerging Albano university campus, which is planned to be a vivid example of sustainable urban development, is used as a case study through collaboration with the property owners, Akademiska Hus and Svenska Bostäder. The design addresses aspects of energy performance, environmental performance, economic performance, and exergy performance of the energy system. A multi-objective optimization approach is applied to minimize objectives such as non-renewable primary energy consumption, greenhouse gas emissions, life cycle cost, and net exergy deficit. These objectives reflect both practical requirements and research interest. The optimization results are presented in the form of Pareto fronts, through which decision-makers can understand the options and limitations more clearly and ultimately make better and more informed decisions. Sensitivity analyses show that solutions can be sensitive to certain system parameters. To overcome this, a robust design optimization method is also developed and employed to find robust optimal solutions, which are less sensitive to the variation of system parameters. The influence of different preferences for objectives on the selection of optimal solutions is examined. Energy components of the selected solutions under different preference scenarios are analyzed, which illustrates the advantages and disadvantages of certain energy conversion technologies in the pursuit of various objectives. As optimal solutions depend on the system parameters, a parametric analysis is also conducted to investigate how the composition of optimal solutions varies with changes in certain parameters. By virtue of the Rational Exergy Management Model (REMM), the planned buildings on the Albano campus are further compared to the existing buildings on the KTH campus, based on energy and exergy analysis.
Four proposed alternative energy supply scenarios as well as the present case are analyzed. REMM shows that the proposed scenarios have better levels of match between supply and demand of exergy and result in lower avoidable CO2 emissions, which promise cleaner energy structures.
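The Pareto fronts mentioned in this abstract are sets of non-dominated solutions. For minimization objectives, a minimal sketch of non-dominated filtering might look like the following; the objective values are made up for illustration and are not data from the thesis:

```python
import numpy as np

# Each row of F is one candidate design's objective vector, e.g.
# (primary energy, CO2 emissions). Values are invented for illustration.
F = np.array([[1.0, 4.0],
              [2.0, 2.0],
              [3.0, 3.0],   # dominated by (2, 2)
              [4.0, 1.0]])

def pareto_mask(F):
    """Boolean mask of non-dominated rows, assuming all objectives are minimized."""
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            # j dominates i: no worse in every objective, strictly better in one
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                mask[i] = False
                break
    return mask

front = F[pareto_mask(F)]   # the Pareto front of this candidate set
```

Decision-makers then choose among the rows of `front` according to their preferences, which is the trade-off reading of a Pareto front described above.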
|
3 |
A stochastic expansion-based approach for design under uncertainty. Walter, Miguel 12 February 2013 (has links)
An approach for robust design based on stochastic expansions is investigated. The research consists of two parts: 1) stochastic expansions for uncertainty propagation, and 2) adaptive sampling for Pareto front approximation. For the first part, a strategy based on the generalized polynomial chaos (gPC) expansion method is developed. Second, in order to alleviate the computational cost of approximating the Pareto front, two strategies based on adaptive sampling for multi-objective problems are presented. The first is based on the two aforementioned methods, whereas the second additionally considers two levels of fidelity of the uncertainty propagation method.
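The gPC idea behind the first part can be sketched for a single standard-normal input: project the response onto probabilists' Hermite polynomials via Gauss-Hermite quadrature, then read the mean and variance directly off the coefficients. The response function and truncation order below are illustrative assumptions, not the thesis's test cases:

```python
import numpy as np
from math import factorial

def gpc_moments(f, order=4, nquad=20):
    """Mean and variance of f(X), X ~ N(0,1), via a Hermite gPC expansion."""
    # Gauss-Hermite nodes/weights in the probabilists' convention
    nodes, weights = np.polynomial.hermite_e.hermegauss(nquad)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize against the N(0,1) density
    # Project f onto the Hermite basis: c_k = E[f(X) He_k(X)] / k!
    coeffs = []
    for k in range(order + 1):
        He_k = np.polynomial.hermite_e.hermeval(nodes, [0.0] * k + [1.0])
        coeffs.append(np.sum(weights * f(nodes) * He_k) / factorial(k))
    mean = coeffs[0]
    # Orthogonality of the He_k gives the variance as a sum over coefficients
    variance = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, variance

# For f(x) = x^2 with x ~ N(0,1), the analytic mean is 1 and the variance is 2.
mean, var = gpc_moments(lambda x: x**2)
```

Once the coefficients are in hand, the moments come for free, which is what makes stochastic expansions attractive for uncertainty propagation compared to brute-force sampling.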
|
4 |
Probabilistic Robust Design For Dynamic Systems Using Metamodelling. Seecharan, Turuna Saraswati January 2007 (has links)
Designers use simulations to observe the behaviour of a system and to make design decisions to improve dynamic performance. However, for complex dynamic systems, these simulations are often time-consuming and, for robust design purposes, numerous simulations are required as a range of design variables is investigated. Furthermore, the optimum set is desired to meet specifications at particular instances in time. In this thesis, the dynamic response of a system is broken into discrete time instances and recorded into a matrix. Each column of this matrix corresponds to a discrete time instance and each row corresponds to the response at a particular design variable set. Singular Value Decomposition (SVD) is then used to separate this matrix into two matrices: one that consists of information in parameter-space and the other containing information in time-space. Metamodels are then used to efficiently and accurately calculate the response at some arbitrary set of design variables at any time. This efficiency is especially useful in Monte Carlo simulation where the responses are required at a very large sample of design variable sets. This work is then extended where the normalized sensitivities along with the first and second moments of the response are required at specific times. Later, the procedure of calculating the metamodel at specific times and how this metamodel is used in parameter design or integrated design for finding the optimum parameters given specifications at specific time steps is shown. In conclusion, this research shows that SVD and metamodelling can be used to apply probabilistic robust design tools where specifications at certain times are required for the optimum performance of a system.
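The SVD-plus-metamodel procedure described above can be sketched on a hypothetical first-order system response; the system, sample sizes, truncation rank, and polynomial metamodel below are illustrative assumptions, not the thesis's cases:

```python
import numpy as np

# Hypothetical responses of a first-order system y(t) = 1 - exp(-t/tau)
# for a range of design-variable values tau, at discrete time instances t.
taus = np.linspace(0.5, 2.0, 12)    # rows: design-variable samples
t = np.linspace(0.0, 5.0, 50)       # columns: discrete time instances
R = 1.0 - np.exp(-t[None, :] / taus[:, None])

# SVD separates parameter-space (U) from time-space (Vt) information.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
r = 3                               # truncation rank
A = U[:, :r] * s[:r]                # parameter-space coefficients per design

# Metamodel: fit each retained coefficient as a polynomial in tau, then
# predict the full time response at an untried design point.
models = [np.polyfit(taus, A[:, j], deg=4) for j in range(r)]
tau_new = 1.3
a_new = np.array([np.polyval(m, tau_new) for m in models])
y_pred = a_new @ Vt[:r, :]
y_true = 1.0 - np.exp(-t / tau_new)
```

In a Monte Carlo setting, `y_pred` replaces a full simulation at each sampled design-variable set, which is the efficiency gain the abstract refers to.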
|
6 |
Robust Design of Electronic Ballasts for Fluorescent Lamps. Cheng, Hung-Wei 06 June 2001 (has links)
A robust design utilizing a consecutive orthogonal arrays algorithm is proposed for designing electronic ballasts for fluorescent lamps. With this design method, the variation in lamp power can be kept below 10% under different operating conditions. In the manipulation of the consecutive orthogonal arrays, component values of the ballast circuit and the DC-link voltage are used as controllable variables for the inner orthogonal arrays, while manufacturer, ambient temperature, hours of use, and variation in the DC-link voltage are treated as uncontrollable variables for the outer orthogonal arrays. The average effects of the output power for each controllable variable are calculated from simulation results and serve as indexes to find a better combination of circuit parameters. With consecutive orthogonal arrays, the target values of the circuit parameters are approached step by step. In addition, the effect of the DC-link voltage on the lamp power can be understood from the uncontrollable variables of the outer orthogonal arrays. The proposed design tool is applied to the design of an electronic ballast for a 40 W fluorescent lamp. The test results show that the designed electronic ballast can be adopted for lamps from different manufacturers, with different hours of use, and over a wide range of ambient temperatures.
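The inner/outer-array idea can be sketched with a toy stand-in for the ballast simulation: a small factorial of controllable settings crossed with a small factorial of noise settings, choosing the controllable combination with the least output variability. The power model below is invented for illustration; it is not the ballast circuit model of the thesis:

```python
import itertools
import numpy as np

# Invented stand-in for the ballast simulation: lamp power as a function
# of controllable circuit settings (x1, x2) and noise factors (n1, n2).
def lamp_power(x1, x2, n1, n2):
    return 40.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * n1 * x2 + 0.5 * n2

inner = list(itertools.product([0, 1], repeat=2))    # controllable-factor levels
outer = list(itertools.product([-1, 1], repeat=2))   # noise-factor levels

# Evaluate each inner-array run across the whole outer array: the mean is
# the nominal lamp power, the standard deviation its spread under
# uncontrollable variation.
results = []
for x1, x2 in inner:
    p = np.array([lamp_power(x1, x2, n1, n2) for n1, n2 in outer])
    results.append((x1, x2, p.mean(), p.std()))

# Pick the controllable setting with the least output variability.
best = min(results, key=lambda row: row[3])
```

In this toy model the noise term `0.8 * n1 * x2` vanishes at `x2 = 0`, so the robust choice sets `x2` to 0; repeating the exercise with refined levels around the winner mirrors the step-by-step approach described above.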
|
7 |
Robust A-optimal designs for mixture experiments in Scheffé models. Chou, Chao-Jin 28 July 2003 (has links)
A mixture experiment is an experiment in which the q ingredients are nonnegative and constrained to lie on the (q-1)-dimensional probability simplex. In this work, we investigate robust A-optimal designs for mixture experiments with uncertainty between the linear and quadratic models considered by Scheffé (1958). Chan (2000) presents a review of optimal designs, including A-optimal designs, for each of Scheffé's linear and quadratic models. We use these results to find robust A-optimal designs for the linear and quadratic models under some robust A-criteria. It is shown that, under the two types of robust A-criteria defined here, a convex combination of the individual A-optimal designs for the linear and quadratic models, respectively, is robust A-optimal. Finally, we compare the efficiencies of these optimal designs with respect to different A-criteria.
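As a sketch of the A-criterion itself: for an approximate design with information matrix M, the A-value is trace(M^{-1}), and smaller is better. For the Scheffé linear model in q = 3 ingredients, the design putting weight 1/3 on each vertex of the simplex gives an A-value of 9; the evaluation below is a generic illustration of the criterion, not a design from the thesis:

```python
import numpy as np

# Scheffé linear model in q = 3 ingredients: the regressors are f(x) = (x1, x2, x3).
# Candidate design: weight 1/3 on each vertex (pure blend) of the simplex.
pts = np.eye(3)
w = np.full(3, 1.0 / 3.0)

# Information matrix M = sum_i w_i f(x_i) f(x_i)^T; here M = I/3.
M = sum(wi * np.outer(p, p) for wi, p in zip(w, pts))
A_value = np.trace(np.linalg.inv(M))   # A-criterion: trace of M^{-1}
```

A robust A-criterion of the kind discussed above weighs such A-values across the competing linear and quadratic models rather than committing to one model.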
|
8 |
Robust design of control charts for autocorrelated processes with model uncertainty. Lee, Hyun Cheol 01 November 2005 (has links)
Statistical process control (SPC) procedures suitable for autocorrelated processes have been extensively investigated in recent years. The most popular method is the residual-based control chart. To implement this method, a time series model of the process, usually an autoregressive moving average (ARMA) model, is required. However, the model must be estimated from data in practice, and the resulting ARMA modeling errors are unavoidable. Residual-based control charts are known to be sensitive to ARMA modeling errors and often suffer from inflated false alarm rates. As an alternative, control charts can be applied directly to the autocorrelated data with widened control limits, where the widening is determined by the autocorrelation function of the process. This alternative method, however, is also not free from the effects of modeling errors, because it too relies on an accurate process model to be effective.
To compare the robustness of these two kinds of methods to ARMA modeling errors in control charting autocorrelated data, this dissertation investigates the sensitivity analytically. Two robust design procedures for residual-based control charts are then developed from the results of the sensitivity analysis. The first approach uses the worst-case (maximum) variance of the chart statistic to guarantee the initial specification of the control chart. The second uses the expected variance of the chart statistic. The resulting control limits are widened by an amount that depends on the variance of the chart statistic (maximum or expected) as a function of, among other things, the parameter estimation error covariances.
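A minimal sketch of the two charting approaches for an AR(1) process follows; the parameters are illustrative, and the dissertation's central complication, that the model parameters must be estimated and carry estimation error, is deliberately ignored here by treating them as known:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.7, 1.0               # assumed (known) AR(1) parameters
n = 2000
x = np.zeros(n)
for t in range(1, n):               # simulate an in-control AR(1) process
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

# Residual-based chart: one-step-ahead residuals are i.i.d. if the model
# is exact, so classical +/- 3 sigma Shewhart limits apply.
resid = x[1:] - phi * x[:-1]
resid_alarms = np.abs(resid) > 3.0 * sigma

# Direct chart on the autocorrelated data: limits widened to +/- 3 times
# the stationary standard deviation sigma_x = sigma / sqrt(1 - phi^2).
sigma_x = sigma / np.sqrt(1.0 - phi**2)
direct_alarms = np.abs(x) > 3.0 * sigma_x
```

Both charts hold their nominal false alarm rate only because `phi` is exact here; replacing it with an estimate inflates the variance of the chart statistic, which is precisely the effect the robust widening procedures above are designed to absorb.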
|
9 |
Robust manufacturing system design using Petri nets and Bayesian methods. Sharda, Bikram 10 October 2008 (has links)
Manufacturing system design decisions are costly and involve significant
investment in terms of allocation of resources. These decisions are complex, due to
uncertainties related to uncontrollable factors such as processing times and part
demands. Designers often need to find a robust manufacturing system design that meets
certain objectives under these uncertainties. Failure to find a robust design can lead to
expensive consequences in terms of lost sales and high production costs. In order to find
a robust design configuration, designers need accurate methods to model various
uncertainties and efficient ways to search for feasible configurations.
The dissertation work uses a multi-objective Genetic Algorithm (GA) and Petri net
based modeling framework for a robust manufacturing system design. The Petri nets are
coupled with Bayesian Model Averaging (BMA) to capture uncertainties associated with
uncontrollable factors. BMA provides a unified framework to capture model, parameter
and stochastic uncertainties associated with representation of various manufacturing
activities. The BMA based approach overcomes limitations associated with uncertainty representation using classical methods presented in literature. Petri net based modeling is
used to capture interactions among various subsystems, operation precedence and to
identify bottleneck or conflicting situations. When coupled with Bayesian methods, Petri
nets provide an accurate assessment of manufacturing system dynamics and performance in
the presence of uncertainties. A multi-objective Genetic Algorithm (GA) is used to search the
space of manufacturing system designs, allowing designers to consider multiple objectives. The
dissertation work provides algorithms for integrating Bayesian methods with Petri nets.
Two manufacturing system design examples are presented to demonstrate the proposed
approach. The results obtained using Bayesian methods are compared with classical
methods and the effect of choosing different types of priors is evaluated.
In summary, the dissertation provides a new, integrated Petri net based modeling
framework coupled with BMA based approach for modeling and performance analysis
of manufacturing system designs. The dissertation work allows designers to obtain
accurate performance estimates of design configurations by considering model,
parameter and stochastic uncertainties associated with representation of uncontrollable
factors. The multi-objective GA coupled with Petri nets provides a flexible and time-saving
approach for searching and evaluating alternative manufacturing system designs.
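The token-game semantics underlying Petri net models of manufacturing flow can be sketched with a hypothetical two-machine line; the places, transitions, and part counts below are invented for illustration and are unrelated to the dissertation's examples:

```python
import numpy as np

# Hypothetical two-machine line as a Petri net: places hold tokens (parts),
# and a transition may fire when every input place holds enough tokens.
places = ["raw", "m1_busy", "buffer", "m2_busy", "done"]
# Incidence matrices: rows = transitions, columns = places.
pre = np.array([[1, 0, 0, 0, 0],    # start part on machine 1
                [0, 1, 0, 0, 0],    # finish machine 1 -> buffer
                [0, 0, 1, 0, 0],    # start part on machine 2
                [0, 0, 0, 1, 0]])   # finish machine 2 -> done
post = np.array([[0, 1, 0, 0, 0],
                 [0, 0, 1, 0, 0],
                 [0, 0, 0, 1, 0],
                 [0, 0, 0, 0, 1]])

def enabled(marking):
    """Indices of transitions whose input places are sufficiently marked."""
    return [i for i in range(len(pre)) if np.all(marking >= pre[i])]

def fire(marking, t):
    """Fire transition t: consume input tokens, produce output tokens."""
    return marking - pre[t] + post[t]

m = np.array([3, 0, 0, 0, 0])       # initial marking: three raw parts
while enabled(m):
    m = fire(m, enabled(m)[0])      # fire the first enabled transition
```

Replacing this fixed firing policy with timed, stochastic transitions (and Bayesian-averaged duration models, as in the dissertation) turns the same token game into a performance simulation.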
|
10 |
Multiobjective Optimization Algorithm Benchmarking and Design Under Parameter Uncertainty. Lalonde, Nicolas 13 August 2009 (has links)
This research aims to improve our understanding of multiobjective optimization, by comparing the performance of five multiobjective optimization algorithms, and by proposing a new formulation to consider input uncertainty in multiobjective optimization problems. Four deterministic multiobjective optimization algorithms and one probabilistic algorithm were compared: the Weighted Sum, the Adaptive Weighted Sum, the Normal Constraint, the Normal Boundary Intersection methods, and the Nondominated Sorting Genetic Algorithm-II (NSGA-II).
The algorithms were compared using six test problems, which included a wide range of optimization problem types (bounded vs. unbounded, constrained vs. unconstrained). Performance metrics used for quantitative comparison were the total run (CPU) time, number of function evaluations, variance in solution distribution, and numbers of dominated and non-optimal solutions. Graphical representations of the resulting Pareto fronts were also presented.
No single method outperformed the others for all performance metrics, and the two different classes of algorithms were effective for different types of problems. NSGA-II did not effectively solve problems involving unbounded design variables or equality constraints. On the other hand, the deterministic algorithms could not solve a problem with a non-continuous objective function.
In the second phase of this research, design under uncertainty was considered in multiobjective optimization. The effects of input uncertainty on a Pareto front were quantitatively investigated by developing a multiobjective robust optimization framework. Two possible effects on a Pareto front were identified: a shift away from the Utopia point, and a shrinking of the Pareto curve. A set of Pareto fronts was obtained in which the optimum solutions have different levels of insensitivity or robustness.
Four test problems were used to examine the Pareto front change. Increasing the insensitivity requirement of the objective function with regard to input variations moved the Pareto front away from the Utopia point or reduced the length of the Pareto front. These changes were quantified, and the effects of changing robustness requirements were discussed. The approach would provide designers with not only the choice of optimal solutions on a Pareto front in traditional multiobjective optimization, but also an additional choice of a suitable Pareto front according to the acceptable level of performance variation. / Thesis (Master, Mechanical and Materials Engineering) -- Queen's University, 2009-08-10 21:59:13.795
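As a sketch of the simplest of the compared deterministic methods, the Weighted Sum method recovers points on a convex Pareto front by sweeping a weight between the two objectives; the objective pair below is a standard textbook example, not one of the thesis's six test problems:

```python
import numpy as np

# Two convex objectives of one design variable; the weighted-sum method
# traces the Pareto front by sweeping the weight w over [0, 1].
f1 = lambda x: x**2
f2 = lambda x: (x - 2.0)**2

xs = np.linspace(-1.0, 3.0, 4001)   # dense grid standing in for an optimizer
front = []
for w in np.linspace(0.0, 1.0, 21):
    scalarized = w * f1(xs) + (1.0 - w) * f2(xs)
    x_star = xs[np.argmin(scalarized)]
    front.append((f1(x_star), f2(x_star)))
```

On non-convex fronts a uniform weight sweep misses or clusters solutions, which is one reason the benchmarking above also includes the Adaptive Weighted Sum, Normal Constraint, Normal Boundary Intersection, and NSGA-II methods.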
|