1

Simultaneous model building and validation with uniform designs of experiments

Wood, Alastair S., Campean, Felician, Narayanan, A., Toropov, V.V. January 2007 (has links)
2

Parametric study of a dog clutch used in a transfer case for trucks

Eriksson, Fredrik, Kuttikkal, Joseph Linu, Mehari, Amanuel January 2013 (has links)
Trucks with a four-wheel-drive option normally run in rear-wheel drive, with the front wheels rotating freely. In extremely tough driving conditions, there is a high risk of the rear wheels slipping or the truck getting stuck in mud. When the driver then engages the four-wheel-drive option, the difference in rotational speed between the dog clutch parts creates a risk of the clutch slipping off or bouncing back. After studying the gear geometry and a few key parameters, the team arrived at a new design whose performance was found satisfactory when simulated in MSC ADAMS.
3

Análise e propagação de incertezas associadas à dispersão atmosférica dos gases da unidade SNOX® / Analysis and propagation of uncertainties associated with the atmospheric dispersion of gases from the SNOX® unit

MELO, Rony Glauco de 18 September 2015 (has links)
The improvement of technologies that make production processes friendlier to society and the environment is a constant goal of major industries, whether for marketing reasons or legal obligations. Owing to the compositional nature of its main feedstock, the oil refining industry produces effluents posing a wide range of hazards, which must be eliminated or reduced to acceptable levels. In this context, the SNOX® atmospheric-emissions abatement unit treats effluents and produces H2SO4, adding commercial value to the process; however, these same effluents can drive various corrosive processes that may lead to leaks of the unit's gases, most of which are harmful. This work developed a steady-state simulation of the SNOX® process in Hysys® to calculate the concentrations of the circulating gases, and evaluated, probabilistically, their atmospheric dispersion (through the SLAB model) in the presence of uncertainty in several variables. For the probabilistic assessment, Quasi-Monte Carlo (Latin hypercube) techniques were used to: define the relevant uncertainties and rank them through variance-decomposition sensitivity analysis; compute the ideal sample size for representing the uncertainties at a 90% confidence level; and present the results as families of probability distribution curves giving the probabilities of certain adverse effects of the gases present in the SNOX® process. The results showed that, under the unit's operating conditions and for the type of consequence considered (gas intoxication), the discharge coefficient, discharge flow rate, wind speed, and orifice diameter are the relevant variables, and the uncertainties associated with them propagate to the final concentrations obtained with the SLAB model, so the results are best represented as cumulative probability distribution curves.
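The Latin hypercube step the abstract describes can be sketched in a few lines. The leak-scenario inputs and their ranges below are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Stratified LHS: one jittered sample per equal-probability
    stratum in each dimension, with strata paired at random."""
    rng = np.random.default_rng(rng)
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):  # shuffle each column independently
        u[:, d] = u[rng.permutation(n_samples), d]
    return u

# map unit-cube samples onto two hypothetical leak-scenario inputs
u = latin_hypercube(100, 2, rng=42)
discharge_coeff = 0.6 + 0.2 * u[:, 0]    # assumed uniform on [0.6, 0.8]
orifice_diam_mm = 5.0 + 15.0 * u[:, 1]   # assumed uniform on [5, 20] mm
```

Each sampled input set is then run through the dispersion model, and the resulting concentration samples are summarised as cumulative distribution curves, as the abstract describes.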
4

Abordagem estocástica de máquinas rotativas utilizando os métodos hipercubo latino e caos polinomial / Stochastic analysis of rotating machines by using the latin hypercube and polynomial chaos methods

Queiroz, Layane Rodrigues de Souza 15 September 2017 (has links)
Mechanical systems are subject to uncertainties arising from imprecise data and from the dynamic nature of the problem. Different methods have been used to deal with uncertainty propagation, such as Latin hypercube sampling and polynomial chaos. Latin hypercube sampling obtains the response of the random process by sampling points of the process domain according to a probability distribution. In turn, the polynomial chaos expansion separates the stochastic and deterministic components of the random response using orthogonal polynomials matched to the probability distributions of the random variables representing the uncertainties. In this work, the Latin hypercube and polynomial chaos methods are applied to the quantification of uncertainties. Simple mechanical systems were considered first, to validate the methodology, and then the effects of uncertainties on a rotor supported by hydrodynamic bearings were studied.
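As a minimal illustration of the polynomial chaos idea described above (not the thesis's implementation), the coefficients of a one-dimensional expansion in probabilists' Hermite polynomials, for a response in a standard normal variable, can be estimated by Monte Carlo projection:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def pce_coeffs(model, order=4, n_mc=20000, seed=0):
    """Estimate coefficients of model(xi) in probabilists' Hermite
    polynomials He_k, with xi ~ N(0, 1), by Monte Carlo projection."""
    xi = np.random.default_rng(seed).standard_normal(n_mc)
    y = model(xi)
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0
        Hk = hermeval(xi, basis)                            # He_k at the samples
        coeffs.append((y * Hk).mean() / math.factorial(k))  # E[He_k^2] = k!
    return np.array(coeffs)

# for y = xi^2 the exact expansion is y = 1*He_0 + 0*He_1 + 1*He_2
c = pce_coeffs(lambda x: x ** 2)
```

The orthogonality of the He_k under the standard normal weight is what lets each coefficient be read off as a single expectation.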
5

Seismic Vulnerability Assessment of a Shallow Two-Story Underground RC Box Structure

Huh, Jungwon, Tran, Quang, Haldar, Achintya, Park, Innjoon, Ahn, Jin-Hee 18 July 2017 (has links)
Tunnels, culverts, and subway stations are main parts of an integrated infrastructure system. Most are constructed by the cut-and-cover method at shallow depths (mainly less than 30 m) in soil deposits, where large-scale seismic ground deformation can occur owing to the lower stiffness and strength of the soil. Therefore, transverse racking deformation (one of the major seismic ground deformations) due to soil shear deformation should be included in the seismic design of underground structures, using cost- and time-efficient methods that achieve robustness of design and are easily understood by engineers. This paper aims to develop a simplified but comprehensive approach to vulnerability assessment, in the form of fragility curves, for a shallow two-story reinforced concrete underground box structure constructed in highly weathered soil. In addition, results for different numbers of earthquakes per peak ground acceleration (PGA) are compared to determine an effective and appropriate number for cost- and time-benefit analysis. The ground response acceleration method for buried structures (GRAMBS) is used to analyze the behavior of the structure subjected to transverse seismic loading under quasi-static conditions. Furthermore, the damage states that indicate the exceedance level of the structural strength capacity are described by the results of nonlinear static (pushover) analyses. The Latin hypercube sampling technique is employed to consider the uncertainties associated with the material properties and concrete cover owing to variation in construction conditions. Finally, a large number of artificial ground shakings satisfying the design spectrum are generated in order to develop seismic fragility curves based on the defined damage states. It is worth noting that 20 or more ground motions per PGA is a reasonable number for a structural analysis that produces satisfactory fragility curves.
6

Contributions to Optimal Experimental Design and Strategic Subdata Selection for Big Data

January 2020 (has links)
abstract: In this dissertation two research questions in the field of applied experimental design were explored. First, methods for augmenting the three-level screening designs called Definitive Screening Designs (DSDs) were investigated. Second, schemes for strategic subdata selection for nonparametric predictive modeling with big data were developed. Under sparsity, the structure of DSDs can allow for the screening and optimization of a system in one step, but in non-sparse situations estimation of second-order models requires augmentation of the DSD. In this work, augmentation strategies for DSDs were considered, given the assumption that the correct form of the model for the response of interest is quadratic. Series of augmented designs were constructed and explored, and power calculations, model-robustness criteria, model-discrimination criteria, and simulation study results were used to identify the number of augmented runs necessary for (1) effectively identifying active model effects, and (2) precisely predicting a response of interest. When the goal is identification of active effects, it is shown that supersaturated designs are sufficient; when the goal is prediction, it is shown that little is gained by augmenting beyond the design that is saturated for the full quadratic model. Surprisingly, augmentation strategies based on the I-optimality criterion do not lead to better predictions than strategies based on the D-optimality criterion. Computational limitations can render standard statistical methods infeasible in the face of massive datasets, necessitating subsampling strategies. In the big data context, the primary objective is often prediction but the correct form of the model for the response of interest is likely unknown. Here, two new methods of subdata selection were proposed. The first is based on clustering, the second is based on space-filling designs, and both are free from model assumptions. 
The performance of the proposed methods was explored visually via low-dimensional simulated examples; via real data applications; and via large simulation studies. In all cases the proposed methods were compared to existing, widely used subdata selection methods. The conditions under which the proposed methods provide advantages over standard subdata selection strategies were identified. / Dissertation/Thesis / Doctoral Dissertation Statistics 2020
7

Sensitivity analysis using the Latin Hypercube-OAT Method for the Conservational Channel Evaluation and Pollutant Transport System (CONCEPTS) Model

Celik, Kubra 09 December 2016 (has links)
Streambank erosion is a major problem and a major known source of sediment in impaired streams. In the United States, stream deterioration is mainly due to excess sediment. Many models have been developed to predict streambank erosion and sediment transport in streams. The focus of this study was determining the most sensitive soil-specific parameters of the CONCEPTS model for Goodwin Creek, MS. The Latin Hypercube One-factor-At-a-Time (LH-OAT) method was used to perform the sensitivity analysis on soil-specific parameters in CONCEPTS. Overall, the results demonstrate that the erodibility and critical shear stress parameters should be determined carefully and realistically in order to estimate streambank erosion and sediment transport rates more accurately. The sensitivity analysis also shows that suction angle and cohesion have minimal effects on the results; assuming values for them within a plausible range, or safely ignoring them, should not cause a large variation in CONCEPTS results.
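The LH-OAT idea can be sketched roughly as follows; this is not the CONCEPTS implementation, and the toy model and perturbation fraction are assumptions for illustration:

```python
import numpy as np

def lh_oat_sensitivity(model, n_points, n_params, frac=0.05, rng=None):
    """LH-OAT sketch: at each Latin-hypercube point, perturb one
    parameter at a time and average the relative partial effects."""
    rng = np.random.default_rng(rng)
    perms = np.argsort(rng.random((n_points, n_params)), axis=0)
    base = (perms + rng.random((n_points, n_params))) / n_points
    effects = np.zeros(n_params)
    for x in base:
        y0 = model(x)
        for j in range(n_params):
            xp = x.copy()
            xp[j] *= 1.0 + frac  # one-factor-at-a-time perturbation
            effects[j] += abs((model(xp) - y0) / (y0 + 1e-12)) / frac
    return effects / n_points  # larger value = more sensitive parameter

# toy model: sensitivity should rank parameter 0 first, then 1, then 2
scores = lh_oat_sensitivity(lambda x: 10 * x[0] + x[1] + 0.1 * x[2], 50, 3, rng=1)
```

Averaging the one-at-a-time effects over Latin-hypercube base points is what makes the method global rather than local.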
8

Evaluating Parameter Uncertainty in Transportation Demand Models

Gray, Natalie Mae 12 June 2023 (has links)
The inherent uncertainty in travel forecasting models (arising from errors in input data, parameter estimation, or model formulation) is receiving increasing attention from the scholarly and practicing community. In this research, we investigate the variance in forecasted traffic volumes resulting from varying the mode and destination choice parameters in an advanced trip-based travel demand model. Using Latin hypercube sampling to construct several hundred combinations of parameters across the plausible parameter space, we introduce substantial changes to mode and destination choice logsums and probabilities. However, the aggregate effect of these changes on forecasted traffic volumes is small, with a variance of approximately 1 percent on high-volume facilities. Thus, parameter uncertainty does not appear to be a significant factor in forecasting traffic volumes with transportation demand models.
9

Application of Permutation Genetic Algorithm for Sequential Model Building–Model Validation Design of Experiments

Kianifar, Mohammed R., Campean, Felician, Wood, Alastair S. 08 1900 (has links)
The work presented in this paper is motivated by a complex multivariate engineering problem associated with engine mapping experiments, which require efficient Design of Experiments (DoE) strategies to minimise expensive testing. The paper describes the development and evaluation of a Permutation Genetic Algorithm (PermGA) to support an exploration-based sequential DoE strategy for complex real-life engineering problems. A known PermGA was implemented to generate uniform OLH DoEs, and substantially extended to support generation of Model Building–Model Validation (MB-MV) sequences by generating optimal infill sets of test points as OLH DoEs that preserve good space-filling and projection properties for the merged MB + MV test plan. The algorithm was further extended to address issues with non-orthogonal design spaces, a common problem in engineering applications. The effectiveness of the PermGA algorithm for the MB-MV OLH DoE sequence was evaluated on a theoretical benchmark problem based on the Six-Hump-Camel-Back (SHCB) function, as well as on the Gasoline Direct Injection (GDI) engine steady-state mapping problem that motivated this research. The case studies show that the algorithm is effective at delivering quasi-orthogonal space-filling DoEs with good properties even after several MB-MV iterations, while the improvement in model adequacy and accuracy can be monitored by the engineering analyst. The practical importance of this work, demonstrated through the engine case study, is that a significant reduction in the effort and cost of testing can be achieved. / The research work presented in this paper was funded by the UK Technology Strategy Board (TSB) through the Carbon Reduction through Engine Optimization (CREO) project.
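The flavour of a permutation GA for space-filling Latin hypercubes can be shown in miniature. This two-dimensional sketch with a maximin criterion and swap mutation is an assumption-laden toy, not the paper's PermGA:

```python
import numpy as np

def permga_olh(n, pop=30, gens=150, rng=None):
    """Miniature PermGA sketch (2-D, maximin criterion): evolve the
    second-column permutation of an n-point Latin hypercube so the
    minimum pairwise distance between design points is maximised."""
    rng = np.random.default_rng(rng)

    def min_dist(perm):
        pts = np.column_stack([np.arange(n), perm]) / (n - 1)
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        return d[np.triu_indices(n, 1)].min()

    popl = [rng.permutation(n) for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=min_dist, reverse=True)
        popl = popl[: pop // 2]          # keep the fitter half
        children = []
        for p in popl:
            c = p.copy()
            i, j = rng.integers(n, size=2)
            c[i], c[j] = c[j], c[i]      # swap mutation keeps it a permutation
            children.append(c)
        popl += children
    return max(popl, key=min_dist)
```

Working directly on permutations guarantees every candidate remains a valid Latin hypercube, which is the appeal of the permutation encoding.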
10

Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification

Taheri, Mehdi 01 July 2016 (has links)
The application of stochastic modeling for learning the behavior of multibody dynamics models is investigated. The stochastic modeling technique is also known as Kriging or the random function approach. Post-processing data from a simulation run is used to train the stochastic model, which estimates the relationship between model inputs, such as suspension relative displacement and velocity, and the output, for example the sum of suspension forces. The computational efficiency of multibody dynamics (MBD) models can be improved by replacing their computationally intensive subsystems with stochastic predictions. The stochastic modeling technique is able to learn the behavior of a physical system and integrate that behavior into MBD models, resulting in improved real-time simulation and reduced computational effort in models with repeated substructures (for example, a train with a large number of rail vehicles). Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, various sampling plans are investigated, and a space-filling Latin hypercube sampling plan based on the traveling salesman problem (TSP) is suggested for efficiently representing the entire parameter space. The simulation results confirm the expected increase in modeling efficiency, although further research is needed to improve the accuracy of the predictions. The prediction accuracy is expected to improve with a sampling strategy that considers the discrete nature of the training data and uses infill criteria that account for the shape of the output function and detect sample regions with high prediction errors. It is recommended that future efforts quantify the computational efficiency of the proposed approach by overcoming the inefficiencies associated with transferring data between multiple software packages, which proved to be a limiting factor in this study.
These limitations can be overcome by using the user subroutine functionality of SIMPACK and adding the stochastic modeling technique to its force library. / Ph. D.
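A bare-bones version of the Kriging (random-function) idea described above can be written as a zero-mean Gaussian-process interpolator. The Gaussian kernel, length scale, and toy response standing in for suspension-force data are all assumptions for demonstration:

```python
import numpy as np

def fit_kriging(X, y, length=0.15, nugget=1e-9):
    """Minimal Kriging sketch: a zero-mean Gaussian process with a
    Gaussian kernel, fitted to inputs X (n x d) and responses y."""
    def kernel(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * length ** 2))
    K = kernel(X, X) + nugget * np.eye(len(X))  # nugget stabilises the solve
    alpha = np.linalg.solve(K, y)
    return lambda Xq: kernel(np.atleast_2d(Xq), X) @ alpha

# centred Latin-hypercube training plan and a toy "suspension force"
rng = np.random.default_rng(0)
X = (np.argsort(rng.random((40, 2)), axis=0) + 0.5) / 40
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
predict = fit_kriging(X, y)
```

Once fitted, the predictor is a cheap matrix-vector product, which is what makes it attractive as a drop-in replacement for expensive MBD subsystems.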
