11 |
Develop a Robust Design Framework by Integrating Statistical Concepts and using Finite Element Analysis. Veluchamy, Bharatharun, January 2024.
In the constantly changing field of engineering and design, achieving solutions that are both resilient and optimized is crucial. This thesis introduces a robust design methodology (RDM) that integrates statistical concepts and finite element analysis (FEA) to enhance product reliability and durability. Altair HyperStudy is used for design exploration, examining how changes in geometry and material properties influence an IKEA wall-mounted shelving unit. The objectives are to efficiently parameterize geometric shapes, compare different design of experiments (DOE) methodologies, identify the key input variables affecting the outputs, and generate a response surface model from training data. The parameterization of geometric shapes is achieved with the morphing tool in Altair HyperMesh, while the design exploration is conducted using a full factorial design and fractional factorial designs of resolutions V, IV, and III. A space-filling modified extensible lattice sequence (MELS) design is employed to cover the entire design space, providing input for the response surface methodology (RSM). Finally, response surface models are developed for all design variables and their quality of fit is assessed. The methodology is used to explore the design space and determine which parameters influence the system's output response, ensuring these parameters are considered during the design phase.
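A minimal sketch of the two design-exploration ideas named above, in Python with NumPy: a 2^(5-1) half-fraction built from the defining relation I = ABCDE (resolution V), and a least-squares response surface fitted to its runs. The factor count and the response function stand in for the FEA outputs and are assumptions for illustration, not HyperStudy's implementation.

```python
# Sketch: two-level half-fraction (resolution V via I = ABCDE) plus a
# least-squares response surface fit. The response is a hypothetical
# stand-in for an FEA output (e.g., maximum displacement).
import itertools
import numpy as np

factors = 5
full = np.array(list(itertools.product([-1, 1], repeat=factors)))  # 2^5 = 32 runs

# Half-fraction 2^(5-1): keep runs where A*B*C*D*E = +1,
# i.e. the defining relation I = ABCDE (resolution V).
frac = full[full.prod(axis=1) == 1]                                # 16 runs

def response(x, rng=np.random.default_rng(0)):
    """Hypothetical FEA output; factors d and e are left inactive,
    as is common in a screening setting."""
    a, b, c, d, e = x
    return 3.0 + 1.5*a - 2.0*b + 0.8*a*b + 0.3*c + rng.normal(0, 0.1)

y = np.array([response(x) for x in frac])

# Response surface: intercept + 5 main effects + 10 two-factor interactions
# (exactly estimable in 16 runs at resolution V).
pairs = list(itertools.combinations(range(factors), 2))
X = np.column_stack([np.ones(len(frac)), frac] +
                    [frac[:, i] * frac[:, j] for i, j in pairs])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated effects:", np.round(beta, 2))
```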
|
12 |
Agile Prototyping: A combination of different approaches into one main process. Abu Baker, Mohamed, January 2009.
Software prototyping is considered one of the most important tools software engineers use today to understand the customer's requirements and to develop software products that are efficient, reliable, and economically acceptable. Software engineers can choose any of the available prototyping approaches, based on the software they intend to develop and how fast they would like to proceed during development. Generally speaking, all prototyping approaches aim to help the engineers understand the customer's true needs, examine different software solutions, quality aspects, and verification activities that might affect the quality of the software under development, and avoid potential development risks. A combination of several prototyping approaches and brainstorming techniques, which fulfilled the aim of the knowledge extraction approach, resulted in a prototyping approach in which the engineers develop one, and only one, throwaway prototype to extract more knowledge than expected, improving the quality of the software under development by spending more time studying it from different points of view. The knowledge extraction approach was then applied to the developed prototyping approach, treating the developed model itself as a software prototype in order to gain more knowledge from it. This activity produced several points of view and improvements that were implemented in the developed model, and as a result Agile Prototyping (AP) was developed. AP integrates further development approaches into the first prototyping model, such as agile methods, documentation, software configuration management, and fractional factorial design. The main aim of developing one, and only one, prototype, helping the engineers gain more knowledge while reducing development effort, time, and cost, was accomplished; developing software products of satisfying quality is still done by developing an evolutionary prototype and building throwaway prototypes on top of it.
|
13 |
Analysis of the Influence of Non-machining Process Parameters on Product Quality by Experimental Design and Statistical Analysis. Yurtseven, Saygin, 01 September 2003.
This thesis illustrates an analysis of the influence of non-machining processes on product quality by experimental design and statistical analysis. As the object of the analysis, dishwasher production in the Arcelik dishwasher plant is examined. Sheet metal forming processes constitute the greatest portion of dishwasher production cost, and using Pareto analysis, four pieces among twenty-six are selected for investigation: the U sheet, L sheet, inner door, and side panel of the dishwasher. The production processes of these pieces are defined with the help of flow diagrams. Brainstorming and cause-and-effect diagrams are used to determine which non-machining process parameters can cause pieces to be scrapped. These parameters are used as control factors in the experimental design. Taguchi's L16(2^15) orthogonal array, the same array with an S/N transformation, and a 2^(8-4) fractional factorial design are applied. With repetitions and confirmation experiments, the effective parameters are identified and their optimum levels are defined, yielding improvements in scrap quantity and production quality.
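For concreteness, a sketch of the kind of fractional design named above: a 2^(8-4) fraction built from one standard choice of generators (the thesis' actual generators are not given here), plus Taguchi's smaller-the-better S/N transformation, which suits a scrap-quantity response. The sample data are placeholders.

```python
# Sketch: 2^(8-4) fractional factorial from standard generators, and the
# Taguchi smaller-the-better S/N ratio. Generator choice is an assumption.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=4)))  # A, B, C, D
A, B, C, D = base.T
design = np.column_stack([A, B, C, D,
                          A * B * C,    # E = ABC
                          A * B * D,    # F = ABD
                          A * C * D,    # G = ACD
                          B * C * D])   # H = BCD
print(design.shape)  # (16, 8): 16 runs accommodate 8 two-level factors

def sn_smaller_the_better(reps):
    """Taguchi S/N ratio for a 'smaller is better' response (e.g., scrap)."""
    reps = np.asarray(reps, dtype=float)
    return -10.0 * np.log10(np.mean(reps**2))

print(sn_smaller_the_better([2.1, 1.8, 2.4]))  # replicated measurements
```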
|
14 |
Some Contributions to Design Theory and Applications. Mandal, Abhyuday, 13 June 2005.
The thesis focuses on the development of statistical theory in experimental design, with applications in global optimization. It consists of four parts. In the first part, a criterion of design efficiency under model uncertainty is studied with reference to possibly nonregular fractions of general factorials. The results are followed by a numerical study, and the findings are compared with those based on other design criteria.
In the second part, optimal designs are identified using Bayesian methods. This work is linked with response surface methodology, where the first step is factor screening, followed by response surface exploration using different experimental plans. A Bayesian analysis approach is used that aims to achieve both goals with a single experimental design. In addition, a Bayesian design criterion is used, based on the priors of the analysis approach, creating an integrated design and analysis framework. To distinguish between competing models, the HD criterion is used, which is based on the pairwise Hellinger distance between predictive densities.
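As an illustration of the pairwise distance underlying the HD criterion, the sketch below computes the Hellinger distance between two normal predictive densities using the known closed form; treating the predictive densities as Gaussian is an assumption made here for simplicity.

```python
# Sketch: Hellinger distance between two Gaussian predictive densities,
# using the closed form for N(mu1, s1^2) vs N(mu2, s2^2).
import numpy as np

def hellinger_normal(mu1, s1, mu2, s2):
    """Hellinger distance between N(mu1, s1^2) and N(mu2, s2^2)."""
    h2 = 1.0 - np.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2)) * \
         np.exp(-0.25 * (mu1 - mu2)**2 / (s1**2 + s2**2))
    return np.sqrt(h2)

# Two competing models' predictions at the same candidate design point:
print(hellinger_normal(1.0, 0.5, 2.0, 0.7))   # larger value = easier to tell apart
```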
Mixed-level fractional factorial designs are commonly used in practice, but their aliasing relations have not been studied in full rigor. These designs take the form of a product array. The third part discusses the aliasing patterns of mixed-level factorial designs.
In the fourth part, design of experiments ideas are used to introduce a new global optimization technique called SELC (Sequential Elimination of Level Combinations), which is motivated by genetic algorithms but finds the optimum faster. The two key features of the SELC algorithm, the forbidden array and weighted mutation, enhance the performance of the search procedure. The method is illustrated on the optimization of three functions, one of which is from Shekel's family, and on a real example of compound optimization.
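A loose sketch of how those two features could interact in a search loop, under our own simplifying assumptions: the objective, population size, and mutation weighting below are invented for illustration and do not reproduce the published SELC algorithm.

```python
# Sketch: a search loop with a forbidden array (worst level combinations are
# never revisited) and effect-weighted mutation. Not the published SELC.
import numpy as np

rng = np.random.default_rng(1)
levels, n_factors = 3, 4

def objective(x):                       # hypothetical function to maximize
    return -np.sum((x - 1)**2)

pop = rng.integers(0, levels, size=(10, n_factors))
forbidden = set()                       # eliminated level combinations

for gen in range(20):
    scores = np.array([objective(x) for x in pop])
    forbidden.add(tuple(pop[scores.argmin()]))       # eliminate the worst run
    elite = pop[scores.argsort()[::-1][:5]]          # keep the best half
    # Weighted mutation: factors with larger apparent effect mutate less.
    effects = np.nan_to_num(np.abs(np.corrcoef(pop.T, scores)[-1, :-1]))
    p_mut = 0.5 * (1 - effects / (effects.max() + 1e-9))
    children = elite[rng.integers(0, len(elite), 10)].copy()
    for child in children:               # rows are views; edits stick
        mask = rng.random(n_factors) < p_mut
        child[mask] = rng.integers(0, levels, mask.sum())
    pop = np.array([c for c in children if tuple(c) not in forbidden]
                   or list(elite))       # fall back to elite if all forbidden
print("best found:", pop[np.argmax([objective(x) for x in pop])])
```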
|
15 |
A multivariate approach to QSAR. Hellberg, Sven, January 1986.
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: (1) quantifying chemical structure in terms of latent variables expressing analogy, (2) designing test series of compounds, (3) measuring biological activity, and (4) constructing a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to correspond closely to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers, and (d) four different families of peptides. / Dissertation (summary), Umeå University, 1986, comprising 8 papers.
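A brief sketch of the projection step on stand-in data: PCA compresses a descriptor matrix into latent variables, and PLS relates the descriptors to activity directly. The random matrix below merely substitutes for real chemical descriptors.

```python
# Sketch: PCA to extract latent "principal properties" from descriptors,
# PLS to build the structure-activity model. Data are random stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 12))                          # 30 compounds x 12 descriptors
y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.1, 30)   # biological activity

scores = PCA(n_components=3).fit_transform(X)   # latent variables per compound
pls = PLSRegression(n_components=2).fit(X, y)   # structure-activity model
print("PLS R^2:", round(pls.score(X, y), 3))
```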
|
17 |
Energy-efficient Benchmarking for Energy-efficient Software. Pukhkaiev, Dmytro, 14 January 2016.
With the continuous growth of computing systems, the energy efficiency of their processes becomes ever more important. Different configurations, implying different energy efficiency of the system, can be used to perform a process. A configuration denotes a choice among different hardware and software settings (e.g., CPU frequency, number of threads, the concrete algorithm). Identifying the most energy-efficient configuration requires benchmarking all configurations, but this benchmarking is itself time- and energy-consuming. This thesis explores (a) the effect of dynamic voltage and frequency scaling (DVFS) in combination with dynamic concurrency throttling (DCT) on the energy consumption of (de)compression, DBMS query execution, encryption/decryption, and sorting; and (b) a generic approach to reduce the benchmarking effort needed to determine the optimal configuration. Our findings show that using optimal configurations saves on (weighted) average 15.14% of energy compared to the default configuration. Moreover, we propose a generic heuristic (fractional factorial design) that combines data mining (adaptive instance selection) with machine learning techniques (multiple linear regression) to decrease benchmarking effort by building a regression model on the smallest feasible subset of benchmarked configurations. Our approach reduces the energy consumed by benchmarking by 63.9% while impairing the energy efficiency of the computational process by only 1.88 percentage points, due to using a near-optimal rather than the optimal configuration.
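The idea of trading a full benchmark sweep for a model fitted on a subset can be sketched as follows; the configuration space, energy function, and subset rule are assumptions for illustration, not the thesis' adaptive instance selection.

```python
# Sketch: benchmark a subset of configurations, fit a multiple linear
# regression, and pick a near-optimal configuration from the model.
import itertools
import numpy as np

freqs   = [1.2, 1.8, 2.4, 3.0]           # CPU frequency, GHz
threads = [1, 2, 4, 8]                   # degree of concurrency
configs = np.array(list(itertools.product(freqs, threads)))

def measure_energy(cfg):                 # stand-in for a real benchmark run
    f, t = cfg
    noise = np.random.default_rng(int(f * 10 + t)).normal(0, 0.5)
    return 40 + 8*f + 30/t + 2*f*t + noise

subset = configs[::3]                    # benchmark only every third config
y = np.array([measure_energy(c) for c in subset])
X = np.column_stack([np.ones(len(subset)), subset, subset.prod(axis=1)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

Xall = np.column_stack([np.ones(len(configs)), configs, configs.prod(axis=1)])
best = configs[np.argmin(Xall @ beta)]   # predicted most energy-efficient
print("near-optimal configuration:", best)
```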
|
18 |
Understanding the relationship of lumber yield and cutting bill requirements: a statistical approach. Buehlmann, Urs, 13 October 1998.
Secondary hardwood products manufacturers have placed heavy emphasis on lumber yield improvements in recent years, with attention focused more on lumber grade and cutting technology than on cutting bill design. However, understanding the underlying physical phenomena relating cutting bill requirements to yield is essential for improving lumber yield in rough mills. This understanding could also be helpful in constructing a novel lumber yield estimation model.
The purpose of this study was to advance the understanding of the phenomena relating cutting bill requirements and yield. The scientific knowledge gained was used to describe and quantify the effect of part length, width, and quantity on yield. Based on this knowledge, a statistics based approach to the lumber yield estimation problem was undertaken. Rip-first rough mill simulation techniques and statistical methods were used to attain the study's goals.
To facilitate the statistical analysis of the relationship of cutting bill requirements and lumber yield, a theoretical concept, called cutting bill part groups, was developed. Part groups are a standardized way to describe cutting bill requirements. All parts required by a cutting bill are clustered within 20 individual groups according to their size. Each group's midpoint is the representative part size for all parts falling within an individual group. These groups are made such that the error from clustering is minimized. This concept allowed a decrease in the number of possible factors to account for in the analysis of the cutting bill requirements - lumber yield relationship. Validation of the concept revealed that the average error due to clustering parts is 1.82 percent absolute yield.
An orthogonal 2^(20-11) fractional factorial design of resolution V was then used to determine the contribution of different part sizes to lumber yield. All 20 part sizes and 113 of a total of 190 unique secondary interactions were found to be significant (α = 0.05) in explaining the variability in yield observed. Parameter estimates of the part sizes and the secondary interactions were then used to specify the average yield contribution of each variable. Parts 17.50 inches in length and 2.50 inches in width were found to contribute the most to higher yield. The positive effect on yield due to parts smaller than 17.50 by 2.50 inches is less pronounced because their quantity is relatively small in an average cutting bill. Parts of size 72.50 by 4.25 inches, on the other hand, had the most negative influence on high yield. However, as further analysis showed, not only the individual parts required by a cutting bill but also their interactions determine yield. By adding a sufficiently large number of smaller parts to a cutting bill that requires large parts to be cut, high levels of yield can be achieved.
A novel yield estimation model using linear least squares techniques was derived from the data of the fractional factorial design. This model estimates expected yield from the part quantities required by a standardized cutting bill. The final model contains all 20 part groups and their 190 unique secondary interactions. The adjusted R² for this model was 0.94. The model estimated 450 of the 512 standardized cutting bills used for its derivation to within one percent absolute yield. Standardized cutting bills whose yield levels differ by more than two percent can thus be classified correctly in 88 percent of cases. Standardized cutting bills whose part quantities went beyond the established framework, i.e., the settings used for the data derivation, were estimated with an average error of 2.19 percent absolute yield. Despite this error, the model ranked the cutting bills by yield level quite accurately. Cutting bills from actual rough mill operations, which were well beyond the framework of the model, showed an average estimation error of 7.62 percent; nonetheless, the model classified four out of five cutting bills correctly with respect to the ranking of the yield level achieved. The least squares estimation model is thus a helpful tool for ranking cutting bills by expected yield. Overall, the model performs well for standardized cutting bills, but more work is needed to make it generally applicable to cutting bills whose requirements are beyond the framework established in this study. / Ph.D.
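The form of such a model, 20 main-effect terms plus all 190 pairwise interactions fitted by least squares and judged by adjusted R², can be sketched as follows; the simulated quantities and yields are placeholders for the rough-mill simulation data.

```python
# Sketch: least-squares yield model over 20 part-group quantities and their
# 190 pairwise interactions, with adjusted R^2 as the fit measure.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_groups = 512, 20
Q = rng.integers(0, 3, size=(n_runs, n_groups)).astype(float)  # part quantities

pairs = list(itertools.combinations(range(n_groups), 2))       # 190 pairs
X = np.column_stack([np.ones(n_runs), Q] +
                    [Q[:, i] * Q[:, j] for i, j in pairs])     # 211 columns
true_beta = rng.normal(0, 1, X.shape[1])
y = X @ true_beta + rng.normal(0, 2, n_runs)                   # simulated yield

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
p = X.shape[1] - 1                                             # predictors
r2 = 1 - resid.var() / y.var()
adj_r2 = 1 - (1 - r2) * (n_runs - 1) / (n_runs - p - 1)
print("adjusted R^2:", round(adj_r2, 3))
```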
|
19 |
Modern design of experiments methods for screening and experimentations with mixture and qualitative variables. Chantarat, Navara, 06 November 2003.
No description available.
|
20 |
Desenvolvimento de uma metodologia experimental para obtenção e caracterização de formulações de compostos de borracha EPDM / Development of an experimental methodology for obtaining and characterizing EPDM rubber compound formulations. Palaoro, Denilso, 24 February 2015.
The rubber industry plays a significant role in many areas of the economy, such as the automotive, construction, footwear, and hospital sectors. Rubber products are produced from complex mixtures of different raw materials of both natural and synthetic origin. From an industrial point of view, a major difficulty is developing a formulation that meets the requirements of a particular product. This work develops an experimental methodology for obtaining EPDM rubber compounds, using experimental design techniques coupled with computational numerical optimization. A 3^(3-1) fractional factorial design (three levels, three factors) was used to plan and analyze the experiments, with the contents (weight fractions) of calcium carbonate, paraffinic oil, and vulcanization accelerator as factors. Twelve properties were measured in total (six on original samples, three on heat-aged samples, and three processing properties). Statistical analyses yielded regression models for the properties and for the cost of the formulation, and a computer program was developed to minimize the cost function subject to constraints on the properties. The results showed that optimized EPDM compound formulations could be obtained at low cost, for example from US$ 2.02/kg to US$ 2.43/kg, for use in hoses and cushion mounts across different manufacturing processes such as compression, transfer, or injection moulding. Selected compositions were analyzed by FTIR, SEM, and TGA with regard to their chemical and structural characteristics. Compositions with low vulcanization accelerator contents tend to form cross-links containing about 4 to 7 sulphur atoms between the carbon chains, which can impair the mechanical properties of both original and aged cured samples; with higher accelerator contents, better properties are achieved, probably owing to cross-links with fewer sulphur atoms between the polymer chains. The EPDM compounds studied may be used in cushion mounts and hoses that must withstand hot-air environments. The study thus provides an experimental methodology for the research and development of rubber compounds with increased efficiency and reliability, taking the cost of the material into account.
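A sketch of the optimization step described above: minimizing a fitted cost model over the three factor levels subject to a property constraint, here with SciPy's SLSQP solver. The coefficient values, bounds, and units are invented placeholders, not the thesis' fitted models.

```python
# Sketch: cost minimization subject to a property-regression constraint.
# All coefficients and bounds are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import minimize

# x = (calcium carbonate, paraffinic oil, accelerator) contents, say in phr.
def cost(x):                       # hypothetical cost model, US$/kg
    caco3, oil, accel = x
    return 2.6 - 0.004*caco3 - 0.003*oil + 0.05*accel

def tensile(x):                    # hypothetical property regression, MPa
    caco3, oil, accel = x
    return 12.0 - 0.03*caco3 - 0.02*oil + 1.5*accel

res = minimize(cost, x0=[80, 40, 1.5],
               bounds=[(40, 120), (20, 60), (0.5, 3.0)],
               constraints=[{"type": "ineq",            # tensile >= 9.0 MPa
                             "fun": lambda x: tensile(x) - 9.0}],
               method="SLSQP")
print("optimal levels:", np.round(res.x, 2), "cost:", round(res.fun, 2))
```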
|