21

A METAMODEL FOR CONFIGURING COLLABORATIVE VIRTUAL WORKSPACES: APPLICATION IN DISASTER MANAGEMENT OF OIL AND GAS OFFSHORE STRUCTURES

ENIO EMANUEL RAMOS RUSSO 04 September 2006
Many companies have been creating virtual teams that bring together geographically dispersed workers with complementary skills, increasing the demand for CSCW (Computer Supported Cooperative Work) applications. To facilitate the development of a wide range of these collaborative applications, a general architecture is needed that adapts flexibly to different situations, tasks, and settings. This work investigates how a distributed workspace environment can support disaster management involving distributed collaborative technical teams. We first identify the requirements for the distributed workspace from the stakeholders involved in a disaster and analyse the commercial emergency systems available. We then elaborate a multi-perspective metamodel to support configuring this collaborative virtual workspace. Finally, a prototype for oil and gas offshore structures disaster management is derived from the metamodel, and an HLA (High Level Architecture) compliant implementation of this prototype is developed as a proof-of-concept of the metamodel.
22

Aeroengine aeroacoustics: a meta-model approach

Cuenca, Rafael Gigena 20 June 2017
Since the last decade, the aeronautical authorities of the ICAO member countries have been gradually tightening the restrictions on external aircraft noise levels, especially in the vicinity of airports. New aero-engines therefore need quieter designs, making engine noise prediction techniques increasingly important. Unlike semi-analytical techniques, which have evolved considerably over recent decades, semi-empirical techniques still rest on methods and data dating back to the 1970s, such as those developed in the ANOPP project. An aeroacoustic test rig for a rotor/stator assembly was built at the Aeronautical Engineering Department of the São Carlos School of Engineering, making it possible to develop a methodology for generating a semi-empirical technique from new methods and data. The rig can vary the rotation speed and the rotor/stator spacing and control the mass flow rate; 71 configurations were evaluated, measured with a 14-microphone wall antenna. The broadband noise spectrum is modeled as pink noise and the tonal noise with an exponential behavior, resulting in five parameters: the broadband level, its linear decay, and its form factor, plus the level of the first tone and the exponential decay of its harmonics. A Kriging surface regression is used to approximate the five parameters from the experimental variables, and the study showed that tip Mach number and RSS (rotor/stator spacing) are the main variables defining the noise, as in the ANOPP project. A prediction model is thus defined for the rotor/stator assembly studied on the rig, allowing the spectrum to be predicted at conditions not tested; analysis of the model also yielded a tool for interpreting the results. Three cross-validation techniques were applied to the model (leave-one-out, Monte Carlo, and repeated k-folds), showing that the model predicts the overall spectrum level with a mean error of 2.35 dB and a standard deviation of 0.91 dB.
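As a rough illustration of the validation step described above, the sketch below runs leave-one-out cross-validation of a Kriging (Gaussian-process) fit of one spectral parameter against the rig variables. The data are synthetic stand-ins, and scikit-learn's GaussianProcessRegressor replaces whatever Kriging implementation the thesis used; only the overall procedure matches the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
# Synthetic stand-in for the 71 rig configurations: [tip Mach number, RSS]
X = rng.uniform([0.2, 1.0], [0.6, 3.0], size=(71, 2))
# Synthetic broadband level in dB: rises with tip Mach, falls with spacing
y = 80.0 + 40.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(0.0, 0.5, 71)

errors = []
for i in range(len(X)):
    mask = np.arange(len(X)) != i              # hold out configuration i
    gp = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF(length_scale=[0.1, 0.5]),
        normalize_y=True,
    )
    gp.fit(X[mask], y[mask])
    errors.append(abs(gp.predict(X[i:i + 1])[0] - y[i]))

print(f"LOO mean abs error: {np.mean(errors):.2f} dB (std {np.std(errors):.2f})")
```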
23

Multiscale modeling of multimaterial systems using a Kriging based approach

Sen, Oishik 01 December 2016
This work presents a framework for multiscale modeling of multimaterial flows using surrogate modeling techniques, in the particular context of shocks interacting with clusters of particles. Scales are bridged by using ensembles of resolved mesoscale computations of shocked particle-laden flows; the information from the mesoscale models is "lifted" by constructing metamodels of the closure terms. The thesis analyzes several issues pertaining to surrogate-based multiscale modeling frameworks. First, to create surrogate models, the effectiveness of several metamodeling techniques is evaluated, viz. the Polynomial Stochastic Collocation method, the Adaptive Stochastic Collocation method, a Radial Basis Function Neural Network, a Kriging method, and a Dynamic Kriging (DKG) method. The rate of convergence of the error when these are used to reconstruct hypersurfaces of known functions is studied. For a sufficiently large number of training points, Stochastic Collocation methods generally converge faster than the other metamodeling techniques, while the DKG method converges faster when the number of input points is fewer than 100 in a two-dimensional parameter space. Because the input points correspond to computationally expensive micro/mesoscale computations, the DKG is favored for bridging scales in a multiscale solver. Next, closure laws for drag are constructed in the form of surrogate models derived from resolved mesoscale computations of shock-particle interactions. The mesoscale computations are performed to calculate the drag force on a cluster of particles for different values of Mach number and particle volume fraction. Two Kriging-based methods, viz. the Dynamic Kriging method (DKG) and the Modified Bayesian Kriging method (MBKG), are evaluated for their ability to construct surrogate models with sparse data, i.e., using the fewest mesoscale simulations. It is shown that, unlike the DKG method, the MBKG method converges monotonically even with noisy input data and is therefore more suitable for surrogate-model construction from numerical experiments. In macroscale models of shock-particle interactions, Subgrid Particle Reynolds' Stress Equivalent (SPARSE) terms arise because of velocity fluctuations due to fluid-particle interaction at the subgrid/meso scales. Mesoscale computations are performed to calculate the SPARSE terms and the kinetic energy of the fluctuations for different values of Mach number and particle volume fraction, and closure laws for the SPARSE terms are constructed using the MBKG method. It is found that the directions normal and parallel to that of shock propagation are the principal directions of the SPARSE tensor. It is also found that the kinetic energy of the fluctuations is independent of the particle volume fraction and is 12-15% of the incoming shock kinetic energy at higher Mach numbers. Finally, the thesis addresses the cost of performing large ensembles of resolved mesoscale computations for constructing surrogates. Variable-fidelity techniques are used to construct an initial surrogate from ensembles of coarse-grid, relatively inexpensive computations, while the use of resolved high-fidelity simulations is limited to correcting the initial surrogate. Different variable-fidelity techniques, viz. the Space Mapping method, RBFs, and the MBKG method, are evaluated based on their ability to correct the initial surrogate. It is found that the MBKG method uses the fewest resolved mesoscale computations to correct the low-fidelity metamodel: instead of using 56 high-fidelity computations to obtain a surrogate, the MBKG method constructs surrogates from only 15 resolved computations, resulting in a drastic reduction of computational cost.
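The variable-fidelity step lends itself to a compact illustration. The sketch below is not the thesis's MBKG method; it shows the simpler additive-correction idea under invented toy models: fit a surrogate to many cheap low-fidelity samples, then model the discrepancy to a few high-fidelity samples (15 versus 56, echoing the counts above) and add the two.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_lo(x):  # stand-in for a coarse-grid drag-coefficient computation
    return 4.0 * x[:, 0] + 2.0 * x[:, 1]

def f_hi(x):  # stand-in for a resolved mesoscale computation
    return f_lo(x) + 0.5 * np.sin(3.0 * x[:, 0]) * x[:, 1]

rng = np.random.default_rng(1)
X_lo = rng.uniform(size=(56, 2))   # many cheap samples: [Mach, volume fraction]
X_hi = rng.uniform(size=(15, 2))   # few resolved samples, as in the abstract

surr_lo = GaussianProcessRegressor(kernel=RBF(0.3)).fit(X_lo, f_lo(X_lo))
# Correct the low-fidelity surrogate with a model of the discrepancy
delta = GaussianProcessRegressor(kernel=RBF(0.3)).fit(
    X_hi, f_hi(X_hi) - surr_lo.predict(X_hi))

def drag_closure(x):
    """Corrected surrogate as queried by a macroscale solver."""
    return surr_lo.predict(x) + delta.predict(x)

x_test = rng.uniform(size=(5, 2))
print(np.abs(drag_closure(x_test) - f_hi(x_test)))   # small residual error
```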
24

Finite Element based Parametric Studies of a Truck Cab subjected to the Swedish Pendulum Test

Engström, Henrik, Raine, Jens January 2007
Scania has a policy of attaining a high crashworthiness standard, and their trucks have to conform to Swedish cab safety standards. The main objective of this thesis is to clarify which parameter variations, present during the second part of the Swedish cab crashworthiness test on a Scania R-series cab, have a significant effect on the intrusion response. An LS-DYNA FE-model of the test case is analysed, where parameter variations are introduced through the probabilistic analysis tool LS-OPT. Examples of analysed variations are the sheet thickness and material variations, such as the stress-strain curves of the structural components; variations in the test setup, such as the pendulum velocity and angle of approach at impact, are also taken into account. The effect of including the component forming in the analysis is investigated, where the variations of the material parameters are implemented prior to the forming. An additional objective is to analyse the influence of simulation- and model-dependent variations and weigh their respective effect on intrusion against the physical variations stated above. A submodel is created out of the necessity to speed up the simulations, since the numerous parameter variations yield a large number of different designs, resulting in multiple analyses. Important structural component sensitivities are taken from the results and should be used as a pointer to where attention should be focused when trying to increase the robustness of the cab. The results also show that the placement of the pendulum in the y direction (sideways, seen from the driver's perspective) is the most significant physical parameter variation during the Swedish pendulum test. It is concluded that, to achieve a fair comparison of structural performance from repeated crash testing, this pendulum variation must be kept to a minimum. Simulation- and model-dependent parameters in general showed large effects on the intrusion; it is concluded that further investigations of individual simulation- or model-dependent parameters should be performed to establish which description to use. Mapping material effects from the forming simulation into the crash model gave a slightly stiffer response compared to the mean pre-stretch approximations currently used by Scania. This is still a significant result, however, considering that Scania's approximations also included bake-hardening effects from the painting process.
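A hedged sketch of this kind of parametric study, using open tools rather than LS-OPT/LS-DYNA: sample the parameter scatter with a Latin hypercube, evaluate a mock intrusion model, and rank standardized regression coefficients as crude sensitivities. All bounds and the response function are invented for illustration; the dominant pendulum-placement term simply mirrors the thesis's finding.

```python
import numpy as np
from scipy.stats import qmc

names = ["sheet_thickness", "yield_scale", "pendulum_y", "impact_velocity"]
lower = np.array([0.95, 0.90, -0.05, 0.98])   # assumed normalized variation bounds
upper = np.array([1.05, 1.10,  0.05, 1.02])

sample = qmc.LatinHypercube(d=4, seed=0).random(n=64)
X = qmc.scale(sample, lower, upper)           # 64 designs to "analyse"

def intrusion(x):  # stand-in for one crash-simulation submodel run (mm)
    return 100 - 30 * (x[0] - 1) - 10 * (x[1] - 1) + 400 * x[2] + 50 * (x[3] - 1)

y = np.array([intrusion(x) for x in X])

# Standardized regression coefficients as a crude sensitivity ranking
Xn = (X - X.mean(0)) / X.std(0)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(Xn)), Xn]), y, rcond=None)
for name, c in sorted(zip(names, coef[1:]), key=lambda t: -abs(t[1])):
    print(f"{name:16s} {c:+.2f}")
```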
25

Robust design using sequential computer experiments

Gupta, Abhishek 30 September 2004
Modern engineering design tends to use computer simulations such as Finite Element Analysis (FEA) to replace physical experiments when evaluating a quality response, e.g., the stress level in a phone packaging process. The use of computer models has certain advantages over running physical experiments: it is cost-effective, it is easy to try out different design alternatives, and it can have a greater impact on product design. However, due to the complexity of FEA codes, it can be computationally expensive to calculate the quality response function over a large number of combinations of design and environmental factors. Traditional experimental design and response surface methodology, which were developed for physical experiments in the presence of random errors, are not very effective in dealing with deterministic FEA simulation outputs. In this thesis, we utilize a spatial statistical method (the Kriging model) for analyzing deterministic computer-simulation-based experiments, and we devise a sequential strategy that allows us to explore the whole response surface in an efficient way. The overall number of computer experiments is substantially reduced compared with traditional response surface methodology. The proposed methodology is illustrated using an electronic packaging example.
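The sequential strategy can be sketched in a few lines: instead of committing to one large design up front, fit a Kriging model to a small initial design and repeatedly add the candidate point where the predictive uncertainty is largest. The response function below is a toy stand-in for a deterministic FEA run, and the maximum-variance criterion is one simple choice among several.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def response(x):  # stand-in for a deterministic FEA stress evaluation
    return np.sin(6 * x) + 0.3 * x

X = np.linspace(0, 1, 4)[:, None]           # small initial design
y = response(X).ravel()
candidates = np.linspace(0, 1, 201)[:, None]

for _ in range(8):                          # sequential refinement budget
    gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[[np.argmax(std)]]    # run where the model is least sure
    X = np.vstack([X, x_new])
    y = np.append(y, response(x_new).ravel())

print(f"Final design size: {len(X)} runs")
```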
26

Simulation-driven design: Motives, Means, and Opportunities

Sellgren, Ulf January 1999
Efficiency and innovative problem solving are contradictory requirements for product development (PD), and both requirements must be satisfied in companies that strive to remain or to become competitive. Efficiency is strongly related to "doing things right", whereas innovative problem solving and creativity are focused on "doing the right things". Engineering design, which is a sub-process within PD, can be viewed as problem solving or a decision-making process. New technologies in computer science and new software tools open the way to new approaches for the solution of mechanical problems. Product data management (PDM) technology and tools can enable concurrent engineering (CE) by managing the formal product data, the relations between the individual data objects, and their relation to the PD process. Many engineering activities deal with the relation between behavior and shape. Modern CAD systems are highly productive tools for concept embodiment and detailing. The finite element (FE) method is a general tool used to study the physical behavior of objects with arbitrary shapes. Since modern CAD technology enables design modification and change, it can support the innovative dimension of engineering as well as the verification of physical properties and behavior. Concepts and detailed solutions have traditionally been evaluated and verified with physical testing. Numerical modeling and simulation is in many cases a far more time-efficient method than testing to verify the properties of an artifact. Numerical modeling can also support the innovative dimension of problem solving by enabling parameter studies and observations of real and synthetic behavior. Simulation-driven design is defined as a design process where decisions related to the behavior and performance of the artifact are significantly supported by computer-based product modeling and simulation. A framework for product modeling, based on a modern CAD system with fully integrated FE modeling and simulation functionality, provides the engineer with tools capable of supporting a number of engineering steps in all life-cycle phases of a product. Such a conceptual framework, based on a moderately coupled approach to integrating commercial PDM, CAD, and FE software, is presented. An object model and a supporting modular modeling methodology are also presented. Two industrial cases are used to illustrate the possibilities and some of the opportunities given by simulation-driven design with the presented methodology and framework.
28

Evolving Software Systems for Self-Adaptation

Amoui Kalareh, Mehdi 23 April 2012
There is a strong synergy between the concepts of evolution and adaptation in software engineering: software adaptation refers both to the current software being adapted and to the evolution process that leads to the new adapted software. Evolutionary changes for the purpose of adaptation are usually made at development or compile time and are meant to handle predictable situations in the form of software change requests. On the other hand, software may also change and adapt itself based on changes in its environment. Such adaptive changes are usually dynamic and are suitable for dealing with unpredictable or temporary changes in the software's operating environment. A promising solution for software adaptation is to develop self-adaptive software systems that can manage changes dynamically at runtime in a rapid and reliable way. One of the main advantages of self-adaptive software is its ability to manage the complexity that stems from highly dynamic and nondeterministic operating environments. If a self-adaptive software system has been engineered and used properly, it can greatly improve the cost-effectiveness of software change through its lifespan. In practice, however, many of the existing approaches towards self-adaptive software are rather expensive and may increase the overall system complexity, as well as subsequent future maintenance costs. This means that in many cases self-adaptive software is not a good solution, because its development and maintenance costs do not pay off. The situation is even worse in the case of making current (legacy) systems adaptive. Several factors have an impact on the cost-effectiveness and usability of self-adaptive software; the main objective of this thesis is to make a software system adaptive in a cost-effective way, while keeping the target adaptive software generic, usable, and evolvable, so as to support future changes. To effectively engineer and use self-adaptive software systems, this thesis proposes a new conceptual model for identifying and specifying problem spaces in the context of self-adaptive software systems. Based on the foundations of this conceptual model, it proposes a model-centric approach for engineering self-adaptive software by designing a generic adaptation framework and a supporting evolution process. This approach is particularly tailored to facilitate and simplify the process of evolving and adapting current (legacy) software towards runtime adaptivity. The conducted case studies reveal the applicability and effectiveness of this approach in bringing self-adaptive behaviour into non-adaptive applications that essentially demand adaptive behaviour in order to remain sustainable.
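As a generic illustration (not the thesis's model-centric framework), a self-adaptive system is often organized as a runtime feedback loop: monitor the environment, analyze the observation against a goal, plan a change, and execute it on the running system. The sketch below hard-codes a toy latency sensor and a single tuning knob; all names and thresholds are invented.

```python
import random

class AdaptiveService:
    def __init__(self):
        self.workers = 2                          # the adaptable knob

    def observed_latency_ms(self):                # Monitor (mock sensor)
        return random.gauss(200 / self.workers, 10)

    def plan(self, latency, target_ms=60):        # Analyze + Plan
        if latency > target_ms and self.workers < 8:
            return self.workers + 1               # scale up
        if latency < target_ms / 2 and self.workers > 1:
            return self.workers - 1               # scale down
        return self.workers

service = AdaptiveService()
for step in range(5):                             # the runtime loop
    latency = service.observed_latency_ms()
    service.workers = service.plan(latency)       # Execute
    print(f"step {step}: latency={latency:.0f} ms, workers={service.workers}")
```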
29

A Robust Design Method for Model and Propagated Uncertainty

Choi, Hae-Jin 04 November 2005
One of the important factors to be considered in designing an engineering system is uncertainty, which emanates from natural randomness, limited data, or limited knowledge of systems. In this study, a robust design methodology is established for designing multifunctional materials, employing multiple time- and length-scale analyses. The Robust Concept Exploration Method with Error Margin Index (RCEM-EMI) is proposed for design incorporating non-deterministic system behavior. The Inductive Design Exploration Method (IDEM) is proposed to facilitate distributed, robust decision-making under propagated uncertainty in a series of multiscale analyses or simulations. These methods are verified in the context of Design of Multifunctional Energetic Structural Materials (MESM). The MESM is being developed to replace the large amount of steel reinforcement in a missile penetrator, for light weight, high energy release, and sound structural integrity. In this example, the methods facilitate the following state-of-the-art design capabilities: robust MESM design under (a) random microstructure changes and (b) propagated uncertainty in a multiscale analysis chain. The methods are designed to facilitate effective and efficient materials design; however, they generalize to the design of any complex engineering system that involves computationally intensive simulations or expensive experiments, non-deterministic models, accumulated uncertainty in multidisciplinary analyses, and distributed, collaborative decision-making.
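A minimal sketch of the propagated-uncertainty problem that IDEM addresses, under invented toy models: input scatter at the microscale is pushed through a chain of analyses by sampling, and the spread of the system-level response is what a robust design must accommodate. This shows only the forward propagation, not IDEM's inductive (top-down) exploration of feasible design regions.

```python
import numpy as np

rng = np.random.default_rng(2)

def micro_model(porosity):        # stand-in microscale analysis
    return 10.0 * (1 - porosity) ** 2          # e.g. effective stiffness

def macro_model(stiffness):       # stand-in macroscale analysis
    return 3.0 * np.sqrt(stiffness)            # e.g. structural response

porosity = rng.normal(0.15, 0.02, 10_000)      # assumed input scatter
response = macro_model(micro_model(porosity))  # propagate through the chain

print(f"response: mean={response.mean():.3f}, std={response.std():.3f}")
# A robust design seeks inputs whose whole response interval stays in spec,
# e.g. mean +/- 3*std inside the allowed range, rather than just a best mean.
```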
30

A Framework For Developing Conceptual Models Of The Mission Space For Simulation Systems

Karagoz, N. Alpay 01 June 2008
The simulation world defines conceptual modeling as a tool that provides a clear understanding of the target domain or problem. Although there are some approaches offering useful insights on conceptual modeling in the simulation development lifecycle, they do not provide adequate guidance on how to develop a conceptual model. This thesis presents a framework for developing conceptual models for simulation systems, based on the idea that modelers will develop conceptual models more effectively by following a defined conceptual modeling method, using a domain-specific notation and a tool. The conceptual model development method is defined in a step-by-step manner, with explanations of the notation and tool provided where required. A multiple-case study involving two cases is conducted to evaluate the applicability of the method for conceptual modeling and to validate the expected benefits.
