  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Avaliação da influência de alguns fatores nas propriedades mecânicas de misturas asfálticas densas, à luz da técnica de planejamento e análise de experimentos fatoriais fracionários assimétricos / Influence evaluation of some factors in the mechanical properties of binder mixtures using design and analysis of asymmetric fractional factorial experiments technique

Jisela Aparecida Santanna Greco 27 May 2004 (has links)
Trata-se de uma investigação sobre a influência de alguns fatores no comportamento mecânico de misturas asfálticas densas quanto à estabilidade e à flexibilidade. Foram testados três tipos de ligantes, asfalto convencional, modificado com 4,5% de SBS e modificado com 20% de borracha reciclada de pneu; duas distribuições granulométricas do agregado, centros das faixas B e C do DNER (1997); quatro teores de ligante, escolhidos com base nos valores de volumes de vazios e espessuras de película almejados; três condições de envelhecimento a longo prazo, mistura não envelhecida, envelhecida em estufa ventilada a 85ºC por 5 dias e envelhecida por exposição ao tempo por 4 meses; e duas condições de envelhecimento a curto prazo, mistura não envelhecida e envelhecida em estufa ventilada a 135ºC por 4 horas. A técnica de planejamento e análise de experimentos fatoriais fracionários assimétricos foi utilizada para a consideração simultânea dos fatores citados. O comportamento mecânico das misturas foi avaliado através dos ensaios de resistência à tração, módulo de resiliência e fluência por compressão uniaxial estática e dinâmica. A análise de variância dos resultados permitiu a identificação dos fatores com influência significativa nas respostas dos ensaios. O modo como cada fator interferiu nas propriedades apresentadas pelas misturas foi estabelecido através da construção de modelos estatísticos de comportamento. Os resultados mostraram que a adição de modificadores ao asfalto melhora a resistência das misturas à fadiga e à deformação permanente. Os processos de envelhecimento aumentaram os módulos de resiliência das misturas mas diminuíram sua capacidade de recuperação elástica, o que significa queda de resistência à fadiga. Por outro lado, a resistência a deformações permanentes das misturas, inclusive daquelas compostas por asfaltos modificados, aumentou com o envelhecimento. 
/ This work deals with the influence of some factors on the mechanical behavior of dense asphalt mixtures with respect to stability and flexibility. Three types of binder were tested: a conventional asphalt, one modified with 4.5% SBS, and one modified with 20% recycled tire rubber. Two aggregate gradations were tested, the centers of the DNER (1997) B and C gradation bands. Four binder contents were chosen based on target air voids and film thickness. Three long-term aging conditions were tested: not aged, aged in a forced-draft oven for 5 days at 85°C, and aged by outdoor weather exposure for 4 months. Two short-term aging conditions were tested: not aged and aged in a forced-draft oven for 4 hours at 135°C. The technique of design and analysis of asymmetric fractional factorial experiments was used for the simultaneous analysis of these factors. The mechanical behavior of the mixtures was evaluated with indirect tensile strength, resilient modulus, and static and dynamic uniaxial creep tests. Analysis of variance identified the factors with significant influence on the test responses, and statistical behavior models established how each factor affected the mixture properties. The results showed that modified binders improve the mixtures' resistance to fatigue and to permanent deformation. The aging processes increased the resilient modulus of the mixtures but decreased their capacity for elastic recovery, resulting in a loss of fatigue resistance. On the other hand, the permanent deformation resistance of the aged mixtures, including those with modified binders, increased.
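The factor structure described in this abstract (three binders, two gradations, four binder contents, three long-term and two short-term aging conditions) can be enumerated to see why a fractional design is attractive. The sketch below only counts the full mixed-level 3×2×4×3×2 factorial and illustrates the run-count reduction a fraction would give; the factor labels are illustrative, and the actual asymmetric fraction requires the orthogonality constructions developed in the thesis, which are not reproduced here.

```python
from itertools import product

# Factor levels taken from the abstract (labels are illustrative)
factors = {
    "binder": ["conventional", "SBS 4.5%", "rubber 20%"],
    "gradation": ["DNER B", "DNER C"],
    "binder_content": [1, 2, 3, 4],          # four contents (placeholders)
    "long_term_aging": ["none", "oven 85C/5d", "weather 4mo"],
    "short_term_aging": ["none", "oven 135C/4h"],
}

# Full asymmetric (mixed-level) factorial: every combination of levels
full = list(product(*factors.values()))
print(len(full))        # 3*2*4*3*2 = 144 runs

# A half fraction would need only 72 runs; choosing WHICH 72 so that
# main effects remain estimable is exactly the design problem the
# thesis addresses with asymmetric fractional factorial techniques.
print(len(full) // 2)   # 72
```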
22

Agile Prototyping : A combination of different approaches into one main process

Abu Baker, Mohamed January 2009 (has links)
<p>Software prototyping is considered one of the most important tools used by software engineers nowadays to understand the customer's requirements and develop software products that are efficient, reliable, and economically acceptable. Software engineers can choose any of the available prototyping approaches, based on the software they intend to develop and how fast they would like to go during development. Generally speaking, all prototyping approaches aim to help engineers understand the customer's true needs and examine different software solutions, quality aspects, verification activities, etc., that might affect the quality of the software under development, as well as avoid any potential development risks. A combination of several prototyping approaches and brainstorming techniques, which fulfilled the aim of the knowledge extraction approach, resulted in a prototyping approach in which engineers develop one and only one throwaway prototype to extract more knowledge than expected, improving the quality of the software under development by spending more time studying it from different points of view. The knowledge extraction approach was then applied to the developed prototyping approach, treating the developed model itself as a software prototype in order to gain more knowledge from it. This activity resulted in several points of view and improvements that were implemented in the developed model; as a result, Agile Prototyping (AP) was developed.
AP integrated further development approaches into the first prototyping model, such as agile methods, documentation, software configuration management, and fractional factorial design. The main aim of developing one, and only one, prototype, helping engineers gain more knowledge while reducing the effort, time, and cost of development, was accomplished; developing software products of satisfying quality is still done by developing an evolutionary prototype and building throwaway prototypes on top of it.</p>
23

Analysis Of The Influence Of Non-machining Process Parameters On Product Quality By Experimental Design And Statistical Analysis

Yurtseven, Saygin 01 September 2003 (has links) (PDF)
This thesis illustrates analysis of the influence of non-machining processes on product quality by experimental design and statistical analysis. For this purpose, dishwasher production in the Arcelik Dishwasher plant is examined. Sheet metal forming processes constitute the greatest portion of dishwasher production cost, and using the Pareto analysis technique, four pieces among twenty-six are selected for investigation: the U Sheet, L Sheet, Inner Door, and Side Panel of the dishwasher. With the help of flow diagrams, the production process of each piece is defined. Brainstorming and cause-and-effect diagrams are used to determine which non-machining process parameters can cause pieces to be scrapped. These parameters are used as control factors in experimental design. Taguchi's L16(2^15) orthogonal array, the same array with S/N transformation, and a 2^(8-4) fractional factorial design are used. With repetitions and confirmation experiments, the effective parameters are determined and their optimum levels are defined for improvements in scrap quantity and production quality.
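A 2^(8-4) fractional factorial like the one mentioned in this abstract can be built from a full 2^4 in four basic factors plus four generator columns. The sketch below uses one common resolution IV generator set (E=ABC, F=ABD, G=ACD, H=BCD — an assumption, since the thesis does not state its generators) and constructs the 16-run design in ±1 coding.

```python
from itertools import product

# Full 2^4 factorial in the basic factors A, B, C, D (±1 coding)
runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    # Generator columns (one standard resolution IV set; assumed here)
    e, f, g, h = a * b * c, a * b * d, a * c * d, b * c * d
    runs.append((a, b, c, d, e, f, g, h))

print(len(runs))  # 16 runs instead of the 2^8 = 256 of the full factorial

# Every column is balanced and any two columns are orthogonal, which is
# what keeps the eight main effects estimable free of one another.
for j in range(8):
    assert sum(r[j] for r in runs) == 0
for i in range(8):
    for j in range(i + 1, 8):
        assert sum(r[i] * r[j] for r in runs) == 0
```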
24

Some Contributions to Design Theory and Applications

Mandal, Abhyuday 13 June 2005 (has links)
The thesis focuses on the development of statistical theory in experimental design with applications in global optimization. It consists of four parts. In the first part, a criterion of design efficiency, under model uncertainty, is studied with reference to possibly nonregular fractions of general factorials. The results are followed by a numerical study and the findings are compared with those based on other design criteria. In the second part, optimal designs are identified using Bayesian methods. This work is linked with response surface methodology, where the first step is to perform factor screening, followed by response surface exploration using different experiment plans. A Bayesian analysis approach is used that aims to achieve both goals with one experiment design. In addition, we use a Bayesian design criterion based on the priors for the analysis approach. This creates an integrated design and analysis framework. To distinguish between competing models, the HD criterion is used, which is based on the pairwise Hellinger distance between predictive densities. Mixed-level fractional factorial designs are commonly used in practice, but their aliasing relations have not been studied in full rigor. These designs take the form of a product array. Aliasing patterns of mixed-level factorial designs are discussed in the third part. In the fourth part, design of experiment ideas are used to introduce a new global optimization technique called SELC (Sequential Elimination of Level Combinations), which is motivated by genetic algorithms but finds the optimum faster. The two key features of the SELC algorithm, namely the forbidden array and weighted mutation, enhance the performance of the search procedure. Illustration is given with the optimization of three functions, one of which is from Shekel's family. A real example on compound optimization is also given.
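The HD criterion mentioned in this abstract is based on the Hellinger distance between predictive densities; for two normal densities the distance has a closed form. The sketch below computes it for a pair of normals — the specific priors and predictive densities of the thesis are not reproduced, and the means and variances here are placeholders.

```python
import math

def hellinger_normal(mu1, s1, mu2, s2):
    """Squared Hellinger distance between N(mu1, s1^2) and N(mu2, s2^2)."""
    # Bhattacharyya coefficient for two normal densities (closed form)
    bc = math.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * \
         math.exp(-((mu1 - mu2) ** 2) / (4 * (s1**2 + s2**2)))
    return 1.0 - bc  # in [0, 1]; 0 iff the two densities coincide

print(hellinger_normal(0.0, 1.0, 0.0, 1.0))  # 0.0 for identical densities
print(hellinger_normal(0.0, 1.0, 2.0, 1.0))  # grows as the means separate
```

A larger pairwise distance means the two candidate models are easier to tell apart from data, which is why the criterion favors designs that maximize it.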
25

A multivariate approach to QSAR

Hellberg, Sven January 1986 (has links)
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: 1. quantifying chemical structure in terms of latent variables expressing analogy, 2. designing test series of compounds, 3. measuring biological activity, and 4. constructing a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to closely correspond to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers and (d) four different families of peptides. / <p>Diss. (summary) Umeå: Umeå University, 1986, comprising 8 papers</p>
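PCA, as used in this abstract, projects a compound-by-descriptor matrix onto a few latent variables that capture the dominant correlated variation. A minimal numpy sketch follows; the descriptor values are made up purely for illustration.

```python
import numpy as np

# Toy descriptor matrix: 6 compounds x 4 chemical descriptors (made-up numbers)
X = np.array([
    [1.2, 0.7, 3.1, 0.2],
    [1.0, 0.9, 2.8, 0.3],
    [2.1, 1.8, 1.0, 1.1],
    [2.3, 1.6, 0.9, 1.2],
    [0.5, 0.4, 3.5, 0.1],
    [2.0, 1.9, 1.1, 1.0],
])

# Center and scale each descriptor, then take the SVD; the right singular
# vectors are the principal axes (the latent variables).
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                  # compound coordinates in latent space
explained = s**2 / np.sum(s**2)     # variance share per component

print(explained.round(3))  # correlated descriptors collapse onto few components
```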
26

Desempenhos dos fatoriais fracionados em estimar efeitos principais na presença de interações duplas / Performance of fractional factorial to estimate main effects in the presence of double interactions

Damasceno, Luiz Carlos Medeiros 25 July 2011 (has links)
This study evaluated, through data simulation, the performance of fractional factorial designs of resolutions III, IV and V, the design of Cotter (1979), and the UFPV method in estimating the main effects of the studied factors on a response, in the presence or absence of two-factor interactions among them. For this, the main effects of three, four, five and ten factors at two levels were estimated. In the three-factor study, main effects were estimated by the factorial and by the UFPV method. For four, five and ten factors, main effects were estimated by the fractional factorials, the UFPV method and the design of Cotter (1979). In all cases, 100 simulations were performed. To evaluate the performance of the proposed designs, the estimated main effects were submitted, for each situation separately, to Student's t test for a mean at 5% probability with 100 repetitions, under the hypothesis H0: effect = the parameter defined for each main effect separately. The results led to the conclusion that, in the presence of two-factor interactions, the resolution III fractional factorial and the UFPV method should not be used, whereas the fractional factorials of resolutions IV and V and the design of Cotter (1979) were satisfactory for studying three, four, five and ten factors, with or without two-factor interactions.
/ O presente trabalho avaliou, por meio da simulação de dados, os desempenhos dos fatoriais fracionados de resoluções III, IV e V, do delineamento de Cotter (1979) e do método UFPV, para estimar os efeitos principais dos fatores estudados sobre uma resposta, na presença ou não de apenas interações duplas entre eles. Para tanto, foram estimados os efeitos principais de três, quatro, cinco e dez fatores com dois níveis. No estudo de três fatores, foram estimados os efeitos principais pelo fatorial e pelo UFPV. Para estudar os quatro fatores, os efeitos principais foram estimados pelo fatorial fracionado, pelo UFPV e pelo delineamento de Cotter (1979). Para cinco fatores foram estimados os efeitos principais pelos fatoriais fracionados, pelo UFPV e pelo delineamento de Cotter (1979). Para o estudo dos dez fatores, os efeitos principais foram estimados pelos fatoriais fracionados, pelo UFPV e pelo delineamento de Cotter (1979). Em todos os casos foram realizadas 100 simulações. Para avaliar o desempenho dos delineamentos propostos, foram estimados os efeitos principais e aplicado, para cada situação separadamente, o teste t de Student para uma média a 5% de probabilidade com 100 repetições, de acordo com a seguinte hipótese: Ho: ep = parâmetro definido para cada efeito principal, separadamente. Após a análise dos resultados, concluiu-se que sob a presença das interações duplas, o fatorial fracionado de resolução III e o método UFPV não devem ser utilizados. Já os fatoriais fracionados de resoluções IV e V e o delineamento de Cotter (1979) foram satisfatórios para estudarem três, quatro, cinco e dez fatores, sem ou com a presença das interações duplas.
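The conclusion of this record — that a resolution III fraction is unsafe when two-factor interactions are present — can be seen in a tiny deterministic sketch: in a 2^(3-1) design with generator C = AB, a true A×B interaction is aliased with, and biases, the estimated main effect of C. The model coefficients below are made up and no noise is added.

```python
from itertools import product

# 2^(3-1) resolution III design: C is confounded with AB by construction
design = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]

# True model: C has NO effect at all, but a real A*B interaction exists
def y(a, b, c):
    return 10.0 + 2.0 * a + 3.0 * b + 1.5 * a * b

# Estimated main effect of C = mean(y | C=+1) - mean(y | C=-1)
hi = [y(*run) for run in design if run[2] == 1]
lo = [y(*run) for run in design if run[2] == -1]
effect_C = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effect_C)  # 3.0 = twice the interaction coefficient, though C does nothing
```

In a resolution IV or V design (or the full 2^3 factorial) the same computation would return 0 for C, which is the behavior the simulations in this thesis verify at larger scale.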
28

Energy-efficient Benchmarking for Energy-efficient Software

Pukhkaiev, Dmytro 14 January 2016 (has links)
With respect to the continuous growth of computing systems, the energy efficiency of their processes becomes ever more important. Different configurations, implying different energy efficiency of the system, could be used to perform a process. A configuration denotes the choice among different hardware and software settings (e.g., CPU frequency, number of threads, the concrete algorithm, etc.). Identifying the most energy-efficient configuration requires benchmarking all configurations; however, this benchmarking is itself time- and energy-consuming. This thesis explores (a) the effect of dynamic voltage and frequency scaling (DVFS) in combination with dynamic concurrency throttling (DCT) on the energy consumption of (de)compression, DBMS query execution, encryption/decryption and sorting; and (b) a generic approach to reduce the benchmarking effort needed to determine the optimal configuration. Our findings show that using optimal configurations can save, on weighted average, 15.14% of energy compared to the default configuration. Moreover, we propose a generic heuristic (fractional factorial design) that utilizes data mining (adaptive instance selection) together with machine learning techniques (multiple linear regression) to decrease benchmarking effort by building a regression model based on the smallest feasible subset of the benchmarked configurations. Our approach reduces the energy consumption required for benchmarking by 63.9% while impairing the energy efficiency of the computational process by only 1.88 percentage points, due to using a near-optimal rather than the optimal configuration.
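The heuristic described in this abstract fits a multiple linear regression to a small subset of benchmarked configurations and uses the model to predict the rest. The toy sketch below fits energy as a linear function of two configuration knobs with numpy least squares; the coefficients and configuration values are invented, and the real approach additionally uses fractional factorial selection and adaptive instance selection, which are not shown.

```python
import numpy as np

# Invented "benchmark" subset: energy as a linear function of CPU frequency
# (GHz) and thread count -- a noiseless stand-in for real measurements.
freq = np.array([1.2, 1.2, 2.0, 2.0, 2.8, 2.8])
threads = np.array([2, 8, 2, 8, 2, 8])
energy = 50.0 + 12.0 * freq - 1.5 * threads   # toy ground truth

# Design matrix with an intercept column; solve ordinary least squares
X = np.column_stack([np.ones_like(freq), freq, threads])
coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
print(coef.round(6))  # recovers [50, 12, -1.5] on noiseless data

# Predict the energy of a configuration that was never benchmarked
pred = float(coef @ [1.0, 2.4, 4])
```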
29

Understanding the relationship of lumber yield and cutting bill requirements: a statistical approach

Buehlmann, Urs 13 October 1998 (has links)
Secondary hardwood products manufacturers have been placing heavy emphasis on lumber yield improvements in recent years. More attention has been on lumber grade and cutting technology than on cutting bill design. However, understanding the underlying physical phenomena of cutting bill requirements and yield is essential to improve lumber yield in rough mills. This understanding could also be helpful in constructing a novel lumber yield estimation model. The purpose of this study was to advance the understanding of the phenomena relating cutting bill requirements and yield. The scientific knowledge gained was used to describe and quantify the effect of part length, width, and quantity on yield. Based on this knowledge, a statistics-based approach to the lumber yield estimation problem was undertaken. Rip-first rough mill simulation techniques and statistical methods were used to attain the study's goals. To facilitate the statistical analysis of the relationship of cutting bill requirements and lumber yield, a theoretical concept, called cutting bill part groups, was developed. Part groups are a standardized way to describe cutting bill requirements. All parts required by a cutting bill are clustered within 20 individual groups according to their size. Each group's midpoint is the representative part size for all parts falling within an individual group. These groups are made such that the error from clustering is minimized. This concept allowed a decrease in the number of possible factors to account for in the analysis of the cutting bill requirements - lumber yield relationship. Validation of the concept revealed that the average error due to clustering parts is 1.82 percent absolute yield. An orthogonal 2^(20-11) fractional factorial design of resolution V was then used to determine the contribution of different part sizes to lumber yield.
All 20 part sizes and 113 of a total of 190 unique secondary interactions were found to be significant (α = 0.05) in explaining the variability in yield observed. Parameter estimates of the part sizes and the secondary interactions were then used to specify the average yield contribution of each variable. Parts with size 17.50 inches in length and 2.50 inches in width were found to contribute the most to higher yield. The positive effect on yield due to parts smaller than 17.50 by 2.50 inches is less pronounced because their quantity is relatively small in an average cutting bill. Parts with size 72.50 by 4.25 inches, on the other hand, had the most negative influence on high yield. However, as further analysis showed, not only the individual parts required by a cutting bill, but also their interaction determines yield. By adding a sufficiently large number of smaller parts to a cutting bill that requires large parts to be cut, high levels of yield can be achieved. A novel yield estimation model using linear least squares techniques was derived based on the data from the fractional factorial design. This model estimates expected yield based on part quantities required by a standardized cutting bill. The final model contained all 20 part groups and their 190 unique secondary interactions. The adjusted R² for this model was found to be 0.94. The model estimated 450 of the 512 standardized cutting bills used for its derivation to within one percent absolute yield. Standardized cutting bills whose yield levels differ by more than two percent can thus be classified correctly in 88 percent of the cases. Standardized cutting bills whose part quantities were tested beyond the established framework, i.e. the settings used for the data derivation, were estimated with an average error of 2.19 percent absolute yield. Despite the error observed, the model ranked the cutting bills as to their yield level quite accurately.
However, cutting bills from actual rough mill operations, which were well beyond the framework of the model, were found to have an average estimation error of 7.62 percent. Nonetheless, the model classified four out of five cutting bills correctly as to their ranking of the yield level achieved. The least squares estimation model thus is a helpful tool in ranking cutting bills for their expected yield level. Overall, the model performs well for standardized cutting bills, but more work is needed to make the model generally applicable for cutting bills whose requirements are beyond the framework established in this study. / Ph. D.
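The "190 unique secondary interactions" among the 20 part groups in this record is simply the number of unordered pairs, C(20, 2); a short check, with the pair list doubling as the set of two-way interaction terms the yield model would use:

```python
from itertools import combinations

n_groups = 20
pairs = list(combinations(range(n_groups), 2))
print(len(pairs))  # C(20, 2) = 190 two-way interaction terms

# Together with the 20 main-effect terms this gives the 210 predictors
# (plus an intercept) of the least-squares yield model described above.
print(n_groups + len(pairs))  # 210
```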
30

Modern design of experiments methods for screening and experimentations with mixture and qualitative variables

Chantarat, Navara 06 November 2003 (has links)
No description available.
