171 |
An approach to integrating numerical and response surface models for robust design of production systems
Kini, Satish D., 30 March 2004 (has links)
No description available.
|
172 |
Development of Approximations for HSCT Wing Bending Material Weight using Response Surface Methodology
Balabanov, Vladimir Olegovich, 01 October 1997 (has links)
A procedure for generating a customized weight function for wing bending material weight of a High Speed Civil Transport (HSCT) is described. The weight function is based on HSCT configuration parameters. A response surface methodology is used to fit a quadratic polynomial to data gathered from a large number of structural optimizations. To reduce the time of performing a large number of structural optimizations, coarse-grained parallelization with a master-slave processor assignment on an Intel Paragon computer is used. The results of the structural optimization are noisy. Noise reduction in the structural optimization results is discussed. It is shown that the response surface filters out this noise. A statistical design of experiments technique is used to minimize the number of required structural optimizations and to maintain accuracy. Simple analysis techniques are used to find regions of the design space where reasonable HSCT designs could occur, thus customizing the weight function to the design requirements of the HSCT, while the response surface itself is created employing detailed analysis methods. Analysis of variance is used to reduce the number of polynomial terms in the response surface model function. Linear and constant corrections based on a small number of high fidelity results are employed to improve the accuracy of the response surface model. Configuration optimization of the HSCT employing a customized weight function is compared to the configuration optimization of the HSCT with a general weight function. / Ph. D.
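The quadratic response-surface fit described in this abstract can be illustrated with a minimal sketch. This is not the dissertation's code: the two coded configuration variables, the synthetic noisy "optimization" results, and the coefficient-threshold pruning (standing in for the ANOVA-based term reduction) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design variables standing in for HSCT configuration parameters,
# coded to [-1, 1].
n = 60
X = rng.uniform(-1.0, 1.0, size=(n, 2))

# Surrogate "structural optimization" results: a smooth trend plus noise,
# mimicking the noisy optima the response surface is meant to filter.
y = (10.0 + 3.0 * X[:, 0] - 2.0 * X[:, 1]
     + 1.5 * X[:, 0] ** 2 + 0.2 * X[:, 0] * X[:, 1]
     + rng.normal(scale=0.3, size=n))

def quad_terms(X):
    """Full quadratic model terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

A = quad_terms(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Crude stand-in for ANOVA term reduction: drop terms whose coefficients are
# small relative to the residual scale, then refit the reduced model.
resid_scale = np.std(y - A @ coef)
keep = np.abs(coef) > 0.5 * resid_scale
keep[0] = True  # always keep the intercept
coef_reduced, *_ = np.linalg.lstsq(A[:, keep], y, rcond=None)

print("full model coefficients   :", np.round(coef, 3))
print("retained terms (mask)     :", keep)
print("reduced model coefficients:", np.round(coef_reduced, 3))
```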
|
173 |
Fractional Catalytic Pyrolysis Technology for the Production of Upgraded Bio-oil using FCC Catalyst
Mante, Nii Ofei Daku, 06 January 2012 (links)
Catalytic pyrolysis technology is one of the thermochemical platforms used to produce high-quality bio-oil and chemicals from biomass feedstocks. In the catalytic pyrolysis process, the biomass is rapidly heated under an inert atmosphere in the presence of an acid catalyst or zeolite to promote deoxygenation and cracking of the primary vapors into hydrocarbons and small oxygenates. This dissertation examines the utilization of conventional fluid catalytic cracking (FCC) catalyst in the fractional catalytic pyrolysis of hybrid poplar wood. The influence of Y-zeolite content, steam treatment, addition of ZSM-5 additive, process conditions (temperature, weight hourly space velocity (WHSV), and vapor residence time), and recycling of the non-condensable gases (NCG) on the product distribution and the quality of the bio-oil was investigated.
The first part of the study demonstrates the influence of the catalytic properties of the FCC catalyst on the product distribution and the quality of the bio-oil. It was found that FCC catalyst with higher Y-zeolite content produces a higher coke yield and a lower organic liquid fraction (OLF). Conversely, FCC catalyst with lower Y-zeolite content results in a lower coke yield and a higher OLF. The results showed that higher Y-zeolite content extensively cracks dehydrated products from cellulose decomposition and demethoxylates phenolic compounds from lignin degradation. The Y-zeolite promoted both deoxygenation and coke-forming reactions due to its high catalytic activity and large pore size. Higher Y-zeolite content improved the quality of the bio-oil with respect to higher heating value (HHV), pH, density, and viscosity. Steam treatment at 732 °C and 788 °C decreased the total BET surface area of the FCC catalyst. The findings suggest that steam treatment reduces the coking tendency of the FCC catalyst and enhances the yield of the OLF. Analysis of the bio-oils showed that the steamed FCC catalyst produces bio-oil with lower viscosity and density. Gas chromatography and 13C-NMR spectrometry suggest that steam treatment affects the catalyst selectivity in the formation of CO, CO2, H2, CH4, C2-C5 hydrocarbons, and aromatic hydrocarbons. The addition of ZSM-5 additive to the FCC catalyst was found to alter the characteristics and functionality of the catalytic medium. The product slate showed a decrease in coke yield and an increase in OLF with increasing ZSM-5 additive level. The FCC/ZSM-5 additive hybrid catalysts produced bio-oils with relatively lower viscosity and higher pH. The formation of CO2, CH4, and H2 decreased whilst C5 and aromatic hydrocarbons increased with increasing ZSM-5 additive level.
The second part of the work assesses the effect of operating conditions on the catalytic pyrolysis process. The response surface methodology study showed reaction temperature to be the most influential statistically significant independent variable for char/coke yield, concentration of non-condensable gases, carbon content, oxygen content, pH, and viscosity of the bio-oils. The WHSV was the most important statistically significant independent variable affecting the yields of organic liquid and water. Adequate and statistically significant models were generated for the prediction of the responses, with the exception of viscosity. Recycling of the NCG in the process was found to potentially increase the liquid yield and decrease the char/coke yield. The experiments with the model fluidizing gases showed that CO/N2, CO2/N2, CO/CO2/N2, and H2/N2 increase the liquid yield and that CO2/N2 decreases the char/coke yield. The results showed that recycling of NCG increases the higher heating value and the pH of the bio-oil and decreases its viscosity and density. The concept of recycling the NCG in the catalytic cracking of biomass vapors with FCC catalyst improved the overall process. The evaluation of the reactivity of conventional FCC catalyst towards bio-based molecules provides essential direction for FCC catalyst formulation and design for the production of high-quality bio-oils from catalytic pyrolysis of biomass. / Ph. D.
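As an illustration of the kind of design that underlies such a response surface methodology study, the sketch below generates a face-centered central composite design for two process factors. The factor names and ranges are placeholders, not the levels used in the dissertation; a quadratic model fit to responses at these points (as in the quadratic-fit sketch under entry 172) then yields the significance and adequacy statistics discussed above.

```python
import itertools
import numpy as np

# Hypothetical factor ranges; the actual levels used in the study are not
# given here, so these numbers are placeholders for illustration only.
ranges = {
    "temperature_C": (400.0, 600.0),   # assumed reaction temperature window
    "WHSV_per_h": (1.0, 5.0),          # assumed weight hourly space velocity
}

def face_centered_ccd(n_factors):
    """Coded (-1, 0, +1) points of a face-centered central composite design."""
    corners = list(itertools.product([-1.0, 1.0], repeat=n_factors))
    axials = []
    for i in range(n_factors):
        for level in (-1.0, 1.0):
            pt = [0.0] * n_factors
            pt[i] = level
            axials.append(tuple(pt))
    center = [tuple([0.0] * n_factors)]
    return np.array(corners + axials + center)

coded = face_centered_ccd(len(ranges))

# Map coded levels to physical units: -1 -> low bound, +1 -> high bound.
lo = np.array([v[0] for v in ranges.values()])
hi = np.array([v[1] for v in ranges.values()])
physical = lo + (coded + 1.0) / 2.0 * (hi - lo)

for c, p in zip(coded, physical):
    print(c, "->", dict(zip(ranges, np.round(p, 2))))
```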
|
174 |
Statistical Experimental Design Framework for Cognitive Radio
Amanna, Ashwin Earl, 30 April 2012 (links)
This dissertation presents an empirical approach to identifying decisions for adapting cognitive radio parameters with no a priori knowledge of the environment. Cognitively inspired radios attempt to combine observed metrics of system performance with artificial intelligence decision-making algorithms. Current architectures trend towards hybrid combinations of heuristics, such as genetic algorithms (GA), and experiential methods, such as case-based reasoning (CBR). A weakness of the GA is its reliance on limited mathematical models for estimating bit error rate, packet error rate, throughput, and signal-to-noise ratio. The CBR approach is similarly limited by its dependency on past experiences. Both methods can suffer in environments not previously encountered. In contrast, statistical methods identify performance estimation models by exercising defined experimental designs. This represents an experiential decision-making process formed in the present rather than the past. There are three core contributions from this empirical framework: 1) it enables a new approach to decision making based on empirical estimation models of system performance, 2) it provides a systematic method for initializing cognitive engine configuration parameters, and 3) it facilitates deeper understanding of system behavior by quantifying parameter significance and interaction effects. Ultimately, this understanding enables simplification of system models by identifying insignificant parameters. This dissertation defines an abstract framework that enables application of statistical approaches to cognitive radio systems regardless of platform or application space. Specifically, it applies factorial design of experiments and response surface methodology (RSM) to an over-the-air wireless radio link. Results are compared to a benchmark GA cognitive engine. The framework is then used for identifying software-defined radio initialization settings. Taguchi designs, a related statistical method, are implemented to identify initialization settings of a GA. / Ph. D.
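A minimal factorial-design sketch of the idea of estimating parameter significance and interaction effects empirically is shown below. The radio parameters, the placeholder throughput "measurement", and the effect sizes are hypothetical; only the 2^k factorial effect-estimation pattern is the point.

```python
import itertools
import numpy as np

# Hypothetical radio knobs in coded units; the dissertation's actual factor
# set and ranges are not reproduced here.
factors = ["tx_power", "mod_order", "payload_len"]

# Full 2^3 factorial in coded (-1, +1) levels.
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

def measure_throughput(run):
    """Placeholder for an over-the-air measurement of link throughput."""
    p, m, l = run
    rng = np.random.default_rng(abs(hash(tuple(run))) % (2**32))
    return 5.0 + 1.2 * p + 0.8 * m + 0.2 * l - 0.3 * p * m + rng.normal(scale=0.1)

y = np.array([measure_throughput(run) for run in design])

# Main effects: average response at +1 minus average at -1 for each factor.
for j, name in enumerate(factors):
    effect = y[design[:, j] > 0].mean() - y[design[:, j] < 0].mean()
    print(f"main effect of {name}: {effect:+.2f}")

# Two-factor interaction between tx_power and mod_order.
inter = design[:, 0] * design[:, 1]
print(f"tx_power x mod_order interaction: "
      f"{y[inter > 0].mean() - y[inter < 0].mean():+.2f}")
```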
|
175 |
ANÁLISIS DE LA DEFORMACIÓN EN LA INYECCIÓN DE TERMOPLÁSTICOS BAJO VARIABLES DE FORMA DE LA PIEZA MEDIANTE RED NEURONAL Y SUPERFICIES RESPUESTA [Analysis of deformation in thermoplastic injection molding under part-shape variables using a neural network and response surfaces]
Gámez Martínez, Juan Luis, 03 September 2014 (links)
Most consumer products contain parts made by thermoplastic injection molding, which underscores the importance of this forming process relative to other plastics-processing methods. Minimizing costs to stay competitive, together with eliminating or reducing defects in injected parts, has been the main motivation for controlling the process by optimizing the variables involved. Numerous studies have therefore sought the relationships between the process variables and the profitability, aesthetics, and defects of injected parts. Modeling these relationships with mathematical algorithms, in order to optimize the results obtained and predict the final state of the injected parts, has been the objective of most of these studies.
One effect intrinsic to injection molding is part warpage. This deformation arises from several factors involved in the overall design of the process: differences in shrinkage, differences in cooling, part corners, molecular orientation, and other conditioning factors that have been studied in countless articles. This work studies the deformation in terms of the dimensional characteristics of the part, with the aim of identifying and optimizing the input conditions, in this case the part dimensions, by observing and modeling the output variable, the deformation.
The questions addressed are: How does the deformation vary as the part dimensions are modified? Which dimensions of the part under study minimize the negative effects of warpage? Can the deformation of a part be predicted from its dimensions alone?
The following study attempts to answer all of these questions. / Gámez Martínez, JL. (2014). ANÁLISIS DE LA DEFORMACIÓN EN LA INYECCIÓN DE TERMOPLÁSTICOS BAJO VARIABLES DE FORMA DE LA PIEZA MEDIANTE RED NEURONAL Y SUPERFICIES RESPUESTA [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/39350
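A rough sketch of a neural-network surrogate of the kind discussed, assuming hypothetical part dimensions and a synthetic warpage response (the thesis's geometry, data, and network are not reproduced here):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical part dimensions (mm): length, width, wall thickness.
n = 200
dims = np.column_stack([
    rng.uniform(80, 120, n),   # length
    rng.uniform(40, 60, n),    # width
    rng.uniform(1.0, 3.0, n),  # wall thickness
])

# Synthetic warpage (mm): thin, long parts warp more in this toy model.
warpage = (0.02 * dims[:, 0] / dims[:, 2] + 0.005 * dims[:, 1]
           + rng.normal(scale=0.05, size=n))

# Standardize inputs before training the surrogate network.
mean, std = dims.mean(axis=0), dims.std(axis=0)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit((dims - mean) / std, warpage)

# Query the surrogate for a new candidate geometry.
candidate = np.array([[100.0, 50.0, 1.5]])
pred = model.predict((candidate - mean) / std)
print(f"predicted warpage for {candidate[0]}: {pred[0]:.3f} mm")
```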
|
176 |
Advances in Aero-Propulsive Modeling for Fixed-Wing and eVTOL Aircraft Using Experimental Data
Simmons, Benjamin Mason, 09 July 2023 (links)
Small unmanned aircraft and electric vertical takeoff and landing (eVTOL) aircraft have recently emerged as vehicles able to perform new missions and stimulate future air transportation methods. This dissertation presents several system identification research advancements for these modern aircraft configurations enabling accurate mathematical model development for flight dynamics simulations based on wind-tunnel and flight-test data. The first part of the dissertation focuses on advances in flight-test system identification methods using small, fixed-wing, remotely-piloted, electric, propeller-driven aircraft. A generalized approach for flight dynamics model development for small fixed-wing aircraft from flight data is described and is followed by presentation of novel flight-test system identification applications, including: aero-propulsive model development for propeller aircraft and nonlinear dynamic model identification without mass properties. The second part of the dissertation builds on established fixed-wing and rotary-wing aircraft system identification methods to develop modeling strategies for transitioning, distributed propulsion, eVTOL aircraft. Novel wind-tunnel experiment designs and aero-propulsive modeling approaches are developed using a subscale, tandem tilt-wing, eVTOL aircraft, leveraging design of experiments and response surface methodology techniques. Additionally, a method applying orthogonal phase-optimized multisine input excitations to aircraft control effectors in wind-tunnel testing is developed to improve test efficiency and identified model utility. Finally, the culmination of this dissertation is synthesis of the techniques described throughout the document to form a flight-test system identification approach for eVTOL aircraft that is demonstrated using a high-fidelity flight dynamics simulation. The research findings highlighted throughout the dissertation constitute substantial progress in efficient empirical aircraft modeling strategies that are applicable to many current and future aeronautical vehicles enabling accurate flight simulation development, which can subsequently be used to foster advancement in many other pertinent technology areas. / Doctor of Philosophy / Small, electric-powered airplanes flown without an onboard pilot, as well as novel electric aircraft configurations with many propellers that operate at a wide range of speeds, referred to as electric vertical takeoff and landing (eVTOL) aircraft, have recently emerged as aeronautical vehicles able to perform new tasks for future airborne transportation methods. This dissertation presents several mathematical modeling research advancements for these modern aircraft that foster accurate description and prediction of their motion in flight. The mathematical models are developed from data collected in wind-tunnel tests that force air over a vehicle to simulate the aerodynamic forces in flight, as well as from data collected while flying the aircraft. The first part of the dissertation focuses on advances in mathematical modeling approaches using flight data collected from small traditional airplane configurations that are controlled by a pilot operating the vehicle from the ground. 
A generalized approach for mathematical model development for small airplanes from flight data is described and is followed by presentation of novel modeling applications, including: characterization of the coupled airframe and propulsion aerodynamics and model development when vehicle mass properties are not known. The second part of the dissertation builds on established airplane, helicopter, and multirotor mathematical modeling methods to develop strategies for characterization of the flight motion of eVTOL aircraft. Innovative data collection and modeling approaches using wind-tunnel testing are developed and applied to a subscale eVTOL aircraft with two tilting wings. Statistically rigorous experimentation strategies are employed to allow the effects of many individual controls and their interactions to be simultaneously distinguished while also allowing expeditious test execution and enhancement of the mathematical model prediction capability. Finally, techniques highlighted throughout the dissertation are combined to form a mathematical modeling approach for eVTOL aircraft using flight data, which is demonstrated using a realistic flight simulation. The research findings described throughout the dissertation constitute substantial progress in efficient aircraft modeling strategies that are applicable to many current and future vehicles enabling accurate flight simulator development, which can subsequently be used for many research applications.
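One ingredient mentioned above, the multisine control-effector excitation, can be sketched as follows. The period, sample rate, harmonic set, and the use of Schroeder phases are assumptions standing in for the phase optimization actually used; assigning disjoint harmonic sets to different effectors is what makes such inputs mutually orthogonal.

```python
import numpy as np

# Multisine excitation sketch: a sum of harmonically related sinusoids with
# Schroeder phases, a simple substitute for the phase optimization used to
# minimize the relative peak factor of the real test inputs.
T = 10.0                 # assumed excitation period, s
fs = 200.0               # assumed sample rate, Hz
t = np.arange(0.0, T, 1.0 / fs)

harmonics = np.arange(1, 21)                    # harmonic indices of 1/T
amplitudes = np.ones_like(harmonics, dtype=float)

# Schroeder phases for a (near) flat power spectrum.
K = len(harmonics)
phases = -np.pi * harmonics * (harmonics - 1) / K

u = np.zeros_like(t)
for a, k, phi in zip(amplitudes, harmonics, phases):
    u += a * np.cos(2.0 * np.pi * k * t / T + phi)
u /= np.max(np.abs(u))   # normalize to unit amplitude for the effector

# Relative peak factor: lower values pack more excitation energy into a
# given input amplitude range.
rpf = (u.max() - u.min()) / (2.0 * np.sqrt(2.0) * np.std(u))
print(f"relative peak factor: {rpf:.2f}")
```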
|
177 |
Reliability Assessment and Probabilistic Optimization in Structural Design
Mansour, Rami, January 2016
Research in the field of reliability-based design is mainly focused on two sub-areas: the computation of the probability of failure and its integration in the reliability-based design optimization (RBDO) loop. Four papers are presented in this work, representing a contribution to both sub-areas. In the first paper, a new Second Order Reliability Method (SORM) is presented. As opposed to the most commonly used SORMs, the presented approach is not limited to a hyper-parabolic approximation of the performance function at the Most Probable Point (MPP) of failure. Instead, a full quadratic fit is used, leading to a better approximation of the real performance function and therefore more accurate values of the probability of failure. The second paper focuses on the integration of the expression for the probability of failure for a general quadratic function, presented in the first paper, into RBDO. One important feature of the proposed approach is that it does not involve locating the MPP. In the third paper, the expressions for the probability of failure based on general quadratic limit-state functions presented in the first paper are applied to the special case of a hyper-parabola. The expression is reformulated and simplified so that the probability of failure is only a function of three statistical measures: the Cornell reliability index, the skewness, and the kurtosis of the hyper-parabola. These statistical measures are functions of the First-Order Reliability Index and the curvatures at the MPP. In the last paper, an approximate and efficient reliability method is proposed. Focus is on computational efficiency as well as intuitiveness for practicing engineers, especially regarding probabilistic fatigue problems where volume methods are used. The number of function evaluations needed to compute the probability of failure of the design under different types of uncertainties is known a priori to be 3n+2 in the proposed method, where n is the number of stochastic design variables. / QC 20160317
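For orientation, a crude Monte Carlo estimate of the probability of failure for a quadratic limit-state function in standard normal space can be compared with a first-order, FORM-style value. The limit state below is an assumption for illustration, not one of the papers' examples, and the comparison ignores the curvature corrections that SORM provides.

```python
import math
import numpy as np

# Illustrative quadratic limit-state function in standard normal space
# (g <= 0 means failure). This particular g is an assumption, not an
# example from the papers.
def g(u):
    return 3.0 - u[:, 0] - 0.5 * u[:, 1] + 0.1 * u[:, 0] ** 2

rng = np.random.default_rng(42)
n = 1_000_000
u = rng.standard_normal((n, 2))

pf_mc = np.mean(g(u) <= 0.0)

# First-order comparison: distance from the origin to the linearized limit
# state gives a rough FORM-style estimate Phi(-beta).
beta_linear = 3.0 / math.hypot(1.0, 0.5)   # ignores the quadratic term
pf_form = 0.5 * math.erfc(beta_linear / math.sqrt(2.0))

print(f"Monte Carlo Pf : {pf_mc:.2e}")
print(f"linearized Pf  : {pf_form:.2e}")
```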
|
178 |
以機器學習方法估計電腦實驗之目標區域 / Estimation of Target Regions in Computer Experiments: A Machine Learning Approach
Lin, Chia Li (林家立), date unknown (links)
Computer experiments are an important tool for exploring the relationships between the output responses and the input parameters of complex systems; their key characteristic is that each run is very costly in time and computation. In computer experiments, researchers are usually most concerned with fitting the response surface and optimizing the output response (e.g., finding maxima or minima). Motivated by a real parallel and distributed processing system, this thesis is concerned with how to locate a localized target region of the system response. A crucial property of this target region is that the response surface is discontinuous between its interior and exterior, so the traditional response surface methodology cannot be applied. This thesis proposes a new and effective method for estimating target regions of different types of computer experiments, incorporating the concepts of sequential uniform design and classification model building; simulation results also show that the proposed method is accurate and efficient. / Computer experiments have been an important tool for exploring the relationships between the input factors and the output responses. Their important feature is that conducting an experiment is usually time consuming and computationally expensive. In general, researchers are more interested in finding an adequate model for the response surface and the related output optimization problems over the entire input space. Motivated by a real-life parallel and distributed system, here we focus on finding a localized "target region" for the computer experiment. The experiment here has an important characteristic: the response surface is not continuous over the target region of interest. Thus, the traditional response surface methodology (RSM) cannot be directly applied. In this thesis, a novel and efficient methodology for estimating this type of target region of computer experiments is proposed. The method incorporates the concept of sequential uniform design (UD) and the development of classification techniques based on support vector machines (SVM). Computer simulation shows that the proposed method can efficiently and precisely estimate target regions of computer experiments with different shapes.
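A toy version of the sequential-design-plus-classification idea, with an assumed discontinuous simulator, a simple Latin hypercube standing in for the uniform design, and an off-the-shelf SVM classifier (not the thesis's actual designs or tuning):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)

def simulator(x):
    """Toy computer experiment: the response jumps inside a disc, so the
    surface is discontinuous across the target-region boundary."""
    inside = (x[:, 0] - 0.6) ** 2 + (x[:, 1] - 0.4) ** 2 < 0.05
    return np.where(inside, 10.0, 1.0) + 0.1 * rng.standard_normal(len(x))

def latin_hypercube(n, d, rng):
    """Simple Latin hypercube sample on [0, 1]^d (stand-in for a uniform design)."""
    cut = (np.arange(n)[:, None] + rng.uniform(size=(n, d))) / n
    for j in range(d):
        cut[:, j] = cut[rng.permutation(n), j]
    return cut

# Stage 1: space-filling batch, label points by whether they hit the target.
X = latin_hypercube(40, 2, rng)
y = (simulator(X) > 5.0).astype(int)

# Stage 2: refine near the estimated boundary with a few more batches.
for _ in range(3):
    clf = SVC(kernel="rbf", C=10.0).fit(X, y)
    cand = latin_hypercube(200, 2, rng)
    # Pick candidates closest to the decision boundary (smallest |margin|).
    margin = np.abs(clf.decision_function(cand))
    new = cand[np.argsort(margin)[:15]]
    X = np.vstack([X, new])
    y = np.append(y, (simulator(new) > 5.0).astype(int))

clf = SVC(kernel="rbf", C=10.0).fit(X, y)
grid = latin_hypercube(2000, 2, rng)
est_area = np.mean(clf.predict(grid) == 1)
print(f"estimated target-region area fraction: {est_area:.3f} (true ~ {np.pi * 0.05:.3f})")
```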
|
179 |
A multi-configuration approach to reliability based structural integrity assessment for ultimate strength
Kolios, Athanasios Ioannis, January 2010 (links)
Structural reliability treats uncertainties in structural design systematically, evaluating the levels of safety and serviceability of structures. Over the past decades it has been established as a valuable design tool for describing the performance of structures, and it now underlies most modern design standards, which aim to achieve uniform behaviour within a class of structures. Several methods have been proposed for the estimation of structural reliability, both deterministic (FORM and SORM) and stochastic (e.g., Monte Carlo simulation) in nature. Offshore structures must resist complicated and, in most cases, combined environmental phenomena of greatly uncertain magnitude (e.g., wind, wave, current, and operational loads). Failure mechanisms of structural systems and components are expressed through limit state functions, which distinguish a failure region of operation from a safe one. For a jacket offshore structure, which comprises multiple tubular members interconnected in a three-dimensional truss configuration, the limit state function should link the actual load or load combination acting on it locally to the response of each structural member. Cont/d.
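For context, a minimal FORM sketch for a single member-level limit state is shown below, using the Hasofer-Lind/Rackwitz-Fiessler iteration to locate the most probable point. The resistance/load-effect limit state and its parameters are assumptions for illustration, not the thesis's multi-configuration assessment.

```python
import math
import numpy as np

# Illustrative limit state for a single member: resistance R minus load effect S,
# with R ~ N(mu_R, sd_R) and S ~ N(mu_S, sd_S). Numbers are placeholders only.
mu = np.array([420.0, 300.0])   # [mu_R, mu_S]
sd = np.array([35.0, 45.0])     # [sd_R, sd_S]

def g_standard(u):
    """Limit state g = R - S mapped to standard normal space u."""
    x = mu + sd * u
    return x[0] - x[1]

def grad(f, u, h=1e-6):
    """Central-difference gradient."""
    return np.array([(f(u + h * e) - f(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

# Hasofer-Lind / Rackwitz-Fiessler iteration to the most probable point (MPP).
u = np.zeros(2)
for _ in range(50):
    gval, gg = g_standard(u), grad(g_standard, u)
    u_new = (gg @ u - gval) / (gg @ gg) * gg
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(f"reliability index beta = {beta:.3f}, FORM Pf = {pf:.2e}")
```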
|
180 |
Modélisation statistique pour données fonctionnelles : approches non-asymptotiques et méthodes adaptatives / Statistical modeling for functional data: non-asymptotic approaches and adaptive methods
Roche, Angelina, 07 July 2014 (links)
The main purpose of this thesis is to develop adaptive estimators in statistics for functional data. In the first part, we consider the functional linear model and define a dimension-selection criterion for projection estimators built on fixed or data-driven bases. The resulting estimators satisfy an oracle-type inequality and reach the minimax rate of convergence for the prediction-error risk. For estimators defined on a collection of random models, perturbation-theory tools are used to control the random projectors non-asymptotically. From a numerical point of view, this dimension-selection method is faster and more stable than the usual cross-validation methods. In the second part, we propose a bandwidth-selection criterion, inspired by the work of Goldenshluger and Lepski, for kernel estimators of the conditional cumulative distribution function when the covariate is functional. The risk of the resulting estimator is bounded above non-asymptotically. Lower bounds are proved, establishing that our estimator reaches the minimax convergence rate, up to a logarithmic loss. In the last part, we propose an extension of response surface methodology, widely used in industry, to the functional framework. This work is motivated by an application to nuclear safety. / The main purpose of this thesis is to develop adaptive estimators for functional data. In the first part, we focus on the functional linear model and we propose a dimension selection device for projection estimators defined on both fixed and data-driven bases. The prediction error of the resulting estimators satisfies an oracle-type inequality and reaches the minimax rate of convergence. For the estimator defined on a data-driven approximation space, tools of perturbation theory are used to solve the problems related to the random nature of the collection of models. From a numerical point of view, this method of dimension selection is faster and more stable than the usual methods of cross-validation. In the second part, we consider the problem of bandwidth selection for kernel estimators of the conditional cumulative distribution function when the covariate is functional. The method is inspired by the work of Goldenshluger and Lepski. The risk of the estimator is non-asymptotically upper-bounded. We also prove lower bounds and establish that our estimator reaches the minimax convergence rate, up to an extra logarithmic term. In the last part, we propose an extension to a functional context of the response surface methodology, widely used in industry. This work is motivated by an application to nuclear safety.
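A highly simplified sketch of dimension selection for a projection estimator in a functional linear model, using a fixed Fourier basis and a penalized empirical-risk criterion; the basis, penalty constant, and simulated data are assumptions and do not reproduce the thesis's criterion or its oracle guarantees.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy functional linear model Y = <beta, X> + eps with curves X on a grid.
t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
n, m_max = 300, 15

def fourier_basis(m, t):
    """First m Fourier basis functions on [0, 1]."""
    cols = [np.ones_like(t)]
    for j in range(1, m):
        k = (j + 1) // 2
        fn = np.sin if j % 2 == 1 else np.cos
        cols.append(np.sqrt(2.0) * fn(2.0 * np.pi * k * t))
    return np.column_stack(cols)

B = fourier_basis(m_max, t)                        # grid points x m_max
true_beta = B[:, :4] @ np.array([1.0, 0.8, -0.5, 0.3])

# Simulate smooth random curves and scalar responses.
scores = rng.normal(size=(n, m_max)) / np.arange(1, m_max + 1)
X = scores @ B.T
Y = (X * true_beta).sum(axis=1) * dt + rng.normal(scale=0.1, size=n)

# Projection estimator for each candidate dimension m, scored by a penalized
# least-squares criterion (empirical risk + kappa * m / n).
kappa = 2.0 * np.var(Y)
best = None
for m in range(1, m_max + 1):
    Z = X @ B[:, :m] * dt                          # projection scores, n x m
    coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    risk = np.mean((Y - Z @ coef) ** 2) + kappa * m / n
    if best is None or risk < best[0]:
        best = (risk, m, coef)

print(f"selected dimension m = {best[1]}")
```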
|