21

Design and Optimisation Methods for Structures produced by means of Additive Layer Manufacturing processes / Conception et optimisation des structures obtenues par Additive Layer Manufacturing

Costa, Giulio 22 October 2018 (has links)
The recent development of Additive Layer Manufacturing (ALM) technologies has opened up new opportunities in terms of design. Complicated shapes and topologies, resulting from dedicated optimisation processes or from the designer's decisions, are nowadays attainable. Generally, a Topology Optimisation (TO) step is considered when dealing with ALM structures, and today this task is facilitated by commercial software packages such as Altair OptiStruct or Simulia TOSCA. Nevertheless, the freedom granted by ALM is only apparent, and there are still major issues hindering a full and widespread exploitation of this technology.

The first important shortcoming comes from the integration of the result of a TO calculation into a suitable CAD environment. The optimised geometry is available only in a discretised form, i.e. in terms of the Finite Elements (FE) retained in the computational domain at the end of the TO analysis. The boundary of the optimised geometry is therefore not described by a geometrical entity, so the resulting topology is not compatible with the CAD software that constitutes the natural environment for the designer. A time-consuming CAD-reconstruction phase is needed, and the designer is obliged to make a considerable number of arbitrary decisions; consequently, the resulting CAD-compatible topology often no longer meets the optimisation constraints.

The second major restriction is related to ALM-specific technological requirements, which should be integrated directly within the optimisation problem formulation rather than afterwards: treating ALM specificity only as a post-treatment of the TO task would imply such deep modifications of the component that the optimised configuration would be completely overturned.

This PhD thesis proposes a general methodology to overcome the aforementioned drawbacks. An innovative TO algorithm has been developed: it aims at providing a topology description based on purely geometric, intrinsically CAD-compliant entities. In this framework, NURBS and B-Spline geometric entities are naturally considered, and FE analyses are used only to evaluate the physical responses of the problem at hand. In particular, a NURBS/B-Spline geometric entity of dimension D+1 is used to solve the TO problem of dimension D: the (D+1)-th coordinate of the NURBS/B-Spline entity is related to a pseudo-density field that is applied to the generic element stiffness matrix according to the classical penalisation scheme employed in density-based TO methods.

The effectiveness of this approach has been tested on 2D and 3D benchmarks taken from the literature. The use of NURBS entities in the TO formulation significantly speeds up the CAD-reconstruction phase for 2D structures and exhibits great potential for 3D TO problems. Further, it is proven that geometrical constraints, such as minimum and maximum length scales, can be effectively and consistently handled by means of the proposed approach. Moreover, special geometric constraints not available in commercial tools, e.g. on the local curvature radius of the boundary of the solid phase, can be formulated thanks to the NURBS description as well. The robustness of the proposed methodology has been tested by taking into account other mechanical quantities of outstanding interest in engineering, such as buckling loads and natural frequencies.

Finally, in spite of the intrinsically CAD-compliant nature of the NURBS-based TO algorithm, some support tools have been developed in order to perform curve and surface fitting in a very general framework. The automatic curve fitting has been completely developed, and an original algorithm is proposed for choosing the best values of the NURBS curve parameters, both discrete and continuous. The fundamentals of the method are also discussed for the more complicated surface-fitting problem, and ideas and suggestions for further research are provided.
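The D+1 construction described in this abstract can be illustrated with a small sketch: a tensor-product B-Spline surface whose control values act as the design variables, evaluated to a pseudo-density that penalises the element stiffness in the usual density-based (SIMP-like) way. This is a minimal illustration of the general idea, not the author's implementation; the function names and parameter values are made up for the example.

```python
import numpy as np

def bspline_basis(i, p, t, knots):
    # Cox-de Boor recursion for the i-th B-Spline basis function of degree p
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (t - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, t, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

def pseudo_density(u, v, ctrl, p=2):
    # tensor-product B-Spline surface rho(u, v) = sum_ij N_i(u) N_j(v) c_ij;
    # the control values c_ij play the role of the topological design variables
    n = ctrl.shape[0]  # assumes a square control grid for brevity
    knots = np.concatenate([np.zeros(p), np.linspace(0.0, 1.0, n - p + 1), np.ones(p)])
    Nu = np.array([bspline_basis(i, p, u, knots) for i in range(n)])
    Nv = np.array([bspline_basis(j, p, v, knots) for j in range(n)])
    return float(Nu @ ctrl @ Nv)

def penalised_stiffness(rho, E0=1.0, pen=3.0, Emin=1e-9):
    # classical density-based penalisation of the element Young's modulus
    return Emin + rho ** pen * (E0 - Emin)

# 4x4 grid of control values, all at 0.5: the surface is 0.5 everywhere
ctrl = np.full((4, 4), 0.5)
print(pseudo_density(0.3, 0.7, ctrl), penalised_stiffness(0.5))
```

Because the density field is carried by a CAD-native entity, the optimised boundary (e.g. a level set of this surface) is available in parametric form rather than only as a finite-element mesh, which is the point of the approach.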
22

Simulation-Based Design Under Uncertainty for Compliant Microelectromechanical Systems

Wittwer, Jonathan W. 11 March 2005 (has links)
The high cost of experimentation and product development in the field of microelectromechanical systems (MEMS) has led to a greater emphasis on simulation-based design for increasing first-pass design success and reliability. The use of compliant or flexible mechanisms can help eliminate friction, wear, and backlash, but compliant MEMS are sensitive to variations in material properties and geometry. This dissertation proposes approaches for design-stage uncertainty analysis, model validation, and robust optimization of nonlinear compliant MEMS to account for critical process uncertainties, including residual stress, layer thicknesses, edge bias, and material stiffness. Methods for simulating and mitigating the effects of non-idealities such as joint clearances, semi-rigid supports, non-ideal loading, and asymmetry are also presented. The approaches are demonstrated and experimentally validated using bistable micromechanisms and thermal microactuators as examples.
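A basic building block of the design-stage uncertainty analysis described here is Monte Carlo propagation of process variations through a compliance model. The sketch below is a hypothetical example, not taken from the dissertation: it propagates assumed spreads in modulus, layer thickness, and beam width (edge bias) through a textbook cantilever stiffness formula.

```python
import numpy as np

rng = np.random.default_rng(0)

def cantilever_stiffness(E, w, t, L):
    # end-loaded cantilever: k = 3 E I / L^3, with I = w t^3 / 12
    return 3.0 * E * (w * t ** 3 / 12.0) / L ** 3

# illustrative MEMS-scale values and spreads (assumptions, not measured data)
n = 100_000
E = rng.normal(160e9, 8e9, n)       # polysilicon modulus, ~5% spread
t = rng.normal(2.0e-6, 0.1e-6, n)   # layer thickness
w = rng.normal(20e-6, 0.5e-6, n)    # beam width after edge bias
L = 200e-6                          # beam length, assumed deterministic

k = cantilever_stiffness(E, w, t, L)
print(f"mean k = {k.mean():.3f} N/m, cv = {k.std() / k.mean():.1%}")
```

Note how the cubic dependence on thickness makes a 5% thickness spread contribute roughly three times its share to the stiffness variability, which is exactly why such process parameters are called out as critical in the abstract.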
23

Finite Element based Parametric Studies of a Truck Cab subjected to the Swedish Pendulum Test

Engström, Henrik, Raine, Jens January 2007 (has links)
Scania has a policy to attain a high crashworthiness standard, and their trucks have to conform to Swedish cab safety standards. The main objective of this thesis is to clarify which parameter variations, present during the second part of the Swedish cab crashworthiness test on a Scania R-series cab, have a significant effect on the intrusion response. An LS-DYNA FE model of the test case is analysed, with parameter variations introduced through the probabilistic analysis tool LS-OPT.

Examples of analysed variations are the sheet thickness and material variations, such as the stress-strain curves of the structural components, but variations in the test setup, such as the pendulum velocity and angle of approach at impact, are also taken into account. The effect of including the component forming in the analysis is investigated, with the variations on the material parameters implemented prior to the forming. An additional objective is to analyse the influence of simulation- and model-dependent variations and weigh their respective effects on intrusion against the physical variations stated above.

A submodel is created out of the necessity to speed up the simulations, since the numerous parameter variations yield a large number of different designs, resulting in multiple analyses.

Important structural component sensitivities are taken from the results and should be used as a pointer to where to focus attention when trying to increase the robustness of the cab. The results also show that the placement of the pendulum in the y direction (sideways, seen from the driver's perspective) is the most significant physical parameter variation during the Swedish pendulum test. It is concluded that, to achieve a fair comparison of structural performance from repeated crash testing, this pendulum variation must be kept to a minimum.

Simulation- and model-dependent parameters in general showed large effects on the intrusion. It is concluded that further investigations on individual simulation- or model-dependent parameters should be performed to establish which description to use.

Mapping material effects from the forming simulation into the crash model gave a slightly stiffer response compared to the mean pre-stretch approximations currently used by Scania. This is nevertheless a significant result, considering that Scania's approximations also included bake-hardening effects from the painting process.
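A parametric sensitivity study of this kind can be sketched with a cheap surrogate in place of the FE model: sample the parameters, evaluate the response, and rank the influences via standardized regression coefficients. The response function and all numbers below are invented for illustration; they are not Scania's data, and this stand-in for LS-OPT's stochastic analysis is deliberately simple.

```python
import numpy as np

rng = np.random.default_rng(1)

def intrusion(thickness, yield_stress, pendulum_y):
    # hypothetical surrogate for the FE intrusion response [mm]; coefficients
    # are invented, chosen so that the pendulum y placement dominates
    return (250.0 - 40.0 * thickness - 0.05 * yield_stress
            + 300.0 * pendulum_y + rng.normal(0.0, 2.0))

n = 500
X = np.column_stack([
    rng.normal(1.5, 0.05, n),    # sheet thickness [mm]
    rng.normal(350.0, 20.0, n),  # yield stress [MPa]
    rng.normal(0.0, 0.03, n),    # pendulum y placement [m]
])
y = np.array([intrusion(*row) for row in X])

# standardized regression coefficients rank the parameter influences
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), y, rcond=None)
print(dict(zip(["thickness", "yield_stress", "pendulum_y"], np.round(beta[1:], 2))))
```

Standardization puts all coefficients on the same scale, so their magnitudes can be compared directly, which mirrors how a probabilistic tool reports which input variation matters most for the intrusion.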
24

Estimating population parameters of the Louisiana black bear in the Upper Atchafalaya River Basin

Lowe, Carrie Lynne 01 May 2011 (has links)
In 1992, the Louisiana black bear (Ursus americanus luteolus) was granted threatened status under the Endangered Species Act, primarily because of extensive habitat loss and fragmentation. Currently, the Louisiana black bear is restricted to 3 relatively small, disjunct breeding subpopulations located in the Tensas River Basin of northeast Louisiana, the upper Atchafalaya River Basin (ARB) of south-central Louisiana, and coastal Louisiana. The 1995 Recovery Plan mandates research to determine the viability of the remaining subpopulations. I conducted a capture-mark-recapture study during 2007–2009 to estimate population parameters for the ARB bear subpopulation by collecting hair samples (n = 2,977) from 115 barbed-wire hair traps during eight 1-week periods each summer. DNA was extracted from those hair samples, and microsatellite genotypes were used to identify individuals. I analyzed encounter histories using the Huggins full heterogeneity estimator in a robust design framework in Program MARK. I compared candidate models incorporating heterogeneity, behavior, and time effects on capture using information-theoretic methods. I directly estimated apparent survival, temporary emigration, probability of capture and recapture, and probability of belonging to 1 of 2 mixtures; population abundance was a derived parameter. Apparent survival was 0.91 (SE = 0.06) and did not vary by gender or year. There was some evidence of temporary emigration for males only (0.10, 95% CI = 0.001–0.900). I modeled capture probabilities with a 2-mixture distribution for both males and females. Overall mean weekly capture probability was 0.12 (SE = 0.03) for males and 0.25 (SE = 0.04) for females. Recapture rates indicated a positive behavioral response to capture. Model-averaged mean annual abundance was 56 (SE = 4.5, 95% CI = 49–68). I calculated population density using spatially explicit maximum-likelihood methods; model-averaged density was 0.15 bears/km2 (SE = 0.03).
My results updated previous abundance estimates for the ARB bear subpopulation and will be used in a population viability analysis to determine if recovery criteria for the Louisiana black bear have been met.
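The thesis fits multi-year robust-design encounter histories with the Huggins estimator in Program MARK; as a much simpler illustration of the underlying mark-recapture logic, a two-occasion Chapman (bias-corrected Lincoln-Petersen) abundance estimate can be computed directly. The counts below are hypothetical, not the bear data.

```python
def chapman_estimate(n1, n2, m2):
    # Chapman's bias-corrected Lincoln-Petersen estimator:
    #   n1 = animals marked on occasion 1, n2 = animals caught on occasion 2,
    #   m2 = marked animals among the n2
    N_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return N_hat, var ** 0.5

# hypothetical counts: 40 bears marked, 45 detected later, 30 of them marked
N, se = chapman_estimate(40, 45, 30)
print(f"N_hat = {N:.1f} (SE = {se:.1f})")
```

The Huggins robust-design models used in the thesis generalize this idea: they condition on capture, allow individual heterogeneity and behavioral responses, and treat abundance as a derived parameter rather than a direct count ratio.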
25

Robust design using sequential computer experiments

Gupta, Abhishek 30 September 2004 (has links)
Modern engineering design tends to use computer simulations, such as Finite Element Analysis (FEA), in place of physical experiments when evaluating a quality response, e.g., the stress level in a phone packaging process. The use of computer models has certain advantages over running physical experiments: it is cost-effective, it is easy to try out different design alternatives, and it can have greater impact on product design. However, due to the complexity of FEA codes, it can be computationally expensive to evaluate the quality response function over a large number of combinations of design and environmental factors. Traditional experimental design and response surface methodology, which were developed for physical experiments in the presence of random errors, are not very effective for dealing with deterministic FEA simulation outputs. In this thesis, we utilize a spatial statistical method (the Kriging model) for analyzing deterministic computer simulation-based experiments. Subsequently, we devise a sequential strategy that allows us to explore the whole response surface in an efficient way. The overall number of computer experiments is remarkably reduced compared with traditional response surface methodology. The proposed methodology is illustrated using an electronic packaging example.
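The Kriging-plus-sequential-sampling strategy can be sketched as follows: fit a Gaussian-process (Kriging) interpolator to the deterministic simulator outputs, then repeatedly add the point of maximum predictive variance. This is a generic illustration with a toy 1D function standing in for the FEA response; it is not the thesis's algorithm or code, and the kernel length scale is an assumed value.

```python
import numpy as np

def kernel(a, b, ls=0.2):
    # squared-exponential correlation, the usual choice in Kriging
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def fit_predict(x, y, xq, nugget=1e-8):
    # zero-mean Gaussian-process (Kriging) interpolation, unit prior variance;
    # the tiny nugget only stabilizes the solve for deterministic data
    K = kernel(x, x) + nugget * np.eye(len(x))
    Ks = kernel(xq, x)
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, np.maximum(var, 0.0)

f = lambda x: np.sin(6.0 * x)          # toy stand-in for an expensive FEA response
x = np.array([0.0, 0.5, 1.0])
y = f(x)
xq = np.linspace(0.0, 1.0, 201)

for _ in range(4):                      # sequential infill at maximum predictive variance
    _, var = fit_predict(x, y, xq)
    x_new = xq[np.argmax(var)]
    x = np.append(x, x_new)
    y = np.append(y, f(x_new))

mean, _ = fit_predict(x, y, xq)
print("points used:", len(x), " max abs error:", float(np.abs(mean - f(xq)).max()))
```

Because each new run lands where the model is least certain, the surface is covered with far fewer simulator calls than a fixed factorial design would need, which is the efficiency argument made in the abstract.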
26

Transceiver Design for Multiple Antenna Communication Systems with Imperfect Channel State Information

Zhang, Xi January 2008 (has links)
Wireless communication links with multiple antennas at both the transmitter and the receiver sides, so-called multiple-input multiple-output (MIMO) systems, are attracting much interest since they can significantly increase the capacity of band-limited wireless channels to meet the requirements of future high-data-rate wireless communications. The treatment of channel state information (CSI) is critical in the design of MIMO systems: accurate CSI at the transmitter is often not available or may require high feedback rates, especially in multi-user scenarios. Herein, we consider the robust design of linear transceivers with imperfect CSI either at the transmitter or at both sides of the link. The framework considers the design problem where the imperfect CSI consists of a channel mean and a channel covariance matrix or, equivalently, a channel estimate and an estimation error covariance matrix. For single-user systems, the proposed robust transceiver designs are based on a general cost function of the average mean square errors. Under different CSI conditions, our robust designs exhibit a similar structure to the transceiver designs for perfect CSI, but with a different equivalent channel and/or noise covariance matrix. Utilizing majorization theory, the robust linear transceiver design can be readily solved by convex optimization approaches in practice. For multi-user systems, we consider both the communication link from the users to the access point (up-link) and the reverse link from the access point to the users (down-link). For the up-link channel, it is possible to optimally design robust linear transceivers minimizing the average sum of the mean square errors of all the users' data streams. Our robust linear transceivers are designed either by reformulating the optimization problem as a semidefinite program or by extending the design of a single-user system in an iterative manner.
Under certain channel conditions, we show that the up-link design problem can even be solved partly in a distributed fashion. For the down-link channel, a system with one receive antenna per user is considered. A robust system design is obtained by reducing the feedback load from all users, allowing only a few selected users to feed back accurate CSI to the access point. We study the properties of four typical user selection algorithms in conjunction with beamforming that guarantees certain signal-to-interference-plus-noise ratio (SINR) requirements under transmit power minimization. Specifically, we show that norm-based user selection is asymptotically optimal in the number of transmit antennas and close to optimal in the number of users. Rooted in the practical significance of this result, a simpler down-link system design with reduced feedback requirements is proposed.
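The "equivalent noise covariance" structure mentioned for the robust designs can be illustrated with a simple receive-side example: when the channel estimate has i.i.d. entry-wise error variance, that error covariance enters the MMSE filter as extra noise. The thesis's transceiver designs are far more general; the sketch below, with made-up dimensions and variances, only demonstrates the qualitative effect of accounting for CSI error versus ignoring it.

```python
import numpy as np

rng = np.random.default_rng(2)

def robust_mmse_equalizer(H_hat, sigma2_n, sigma2_e, Nt):
    # channel-error covariance acts as extra noise: with Nt transmit streams
    # and per-entry error variance sigma2_e, Nt*sigma2_e adds to the noise level
    Nr = H_hat.shape[0]
    R = H_hat @ H_hat.conj().T + (Nt * sigma2_e + sigma2_n) * np.eye(Nr)
    return H_hat.conj().T @ np.linalg.inv(R)

Nt, Nr = 2, 4
sigma2_n, sigma2_e = 0.01, 0.05
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

trials, err_rob, err_naive = 2000, 0.0, 0.0
for _ in range(trials):
    E = np.sqrt(sigma2_e / 2) * (rng.standard_normal((Nr, Nt))
                                 + 1j * rng.standard_normal((Nr, Nt)))
    H_hat = H - E                              # imperfect CSI at the receiver
    x = np.sign(rng.standard_normal(Nt)) + 0j  # BPSK symbols, unit power
    n = np.sqrt(sigma2_n / 2) * (rng.standard_normal(Nr)
                                 + 1j * rng.standard_normal(Nr))
    y = H @ x + n
    W_rob = robust_mmse_equalizer(H_hat, sigma2_n, sigma2_e, Nt)
    W_nai = robust_mmse_equalizer(H_hat, sigma2_n, 0.0, Nt)  # ignores CSI error
    err_rob += np.linalg.norm(W_rob @ y - x) ** 2
    err_naive += np.linalg.norm(W_nai @ y - x) ** 2

print("avg MSE  robust:", err_rob / trials, " naive:", err_naive / trials)
```

The robust filter deliberately shrinks the inversion according to the estimation-error level, trading a small bias for much less amplification of the channel mismatch, which is the same mechanism the thesis exploits in its transceiver optimizations.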
28

A Robust Design Method for Model and Propagated Uncertainty

Choi, Hae-Jin 04 November 2005 (has links)
One of the important factors to be considered in designing an engineering system is uncertainty, which emanates from natural randomness, limited data, or limited knowledge of systems. In this study, a robust design methodology is established for designing multifunctional materials, employing multiple time- and length-scale analyses. The Robust Concept Exploration Method with Error Margin Index (RCEM-EMI) is proposed for design incorporating non-deterministic system behavior. The Inductive Design Exploration Method (IDEM) is proposed to facilitate distributed, robust decision-making under propagated uncertainty in a series of multiscale analyses or simulations. These methods are verified in the context of Design of Multifunctional Energetic Structural Materials (MESM). The MESM is being developed to replace the large amount of steel reinforcement in a missile penetrator, for light weight, high energy release, and sound structural integrity. In this example, the methods facilitate the following state-of-the-art design capabilities: robust MESM design under (a) random microstructure changes and (b) propagated uncertainty in a multiscale analysis chain. The methods are designed to facilitate effective and efficient materials design; however, they generalize to the design of any complex engineering system that involves computationally intensive simulations or expensive experiments, non-deterministic models, accumulated uncertainty in multidisciplinary analyses, and distributed, collaborative decision-making.
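The core of robust design with an error margin, as opposed to plain optimization, is that variability is penalised alongside the nominal response. A minimal sketch (not RCEM-EMI itself, whose indices are more elaborate) compares two hypothetical designs under a mean-plus-k-sigma criterion, where lower scores are better; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# two candidate designs: B has a worse nominal response but is far less
# sensitive to the noisy input (hypothetical responses, not from the thesis)
def design_A(x):
    return 1.00 + 2.0 * x   # better nominal value, steep sensitivity

def design_B(x):
    return 1.10 + 0.2 * x   # slightly worse nominal value, nearly flat

def robust_score(f, k=3.0, n=50_000):
    # mean + k*std: penalise designs whose response spreads under uncertainty
    y = f(rng.normal(0.0, 0.1, n))
    return y.mean() + k * y.std()

print(robust_score(design_A), robust_score(design_B))
```

Plain minimization of the nominal response would pick design A, while the margin-aware criterion picks design B; error-margin indices formalize exactly this preference for solutions that stay feasible when the inputs vary.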
30

Uncertainty management in the design of multiscale systems

Sinha, Ayan 07 April 2011 (has links)
In this thesis, a framework is laid for holistic uncertainty management for simulation-based design of multiscale systems. The work is founded on uncertainty management for microstructure-mediated design (MMD) of material and product, which is a representative example of a system spanning multiple length and time scales, i.e., a multiscale system. The characteristics and challenges of uncertainty management for multiscale systems are introduced in the context of integrated material and product design. This integrated approach results in different kinds of uncertainty, i.e., natural uncertainty (NU), model parameter uncertainty (MPU), model structure uncertainty (MSU) and propagated uncertainty (PU). We use the Inductive Design Exploration Method (IDEM) to reach feasible sets of robust solutions against MPU, NU and PU. MMD of material and product is performed for an autonomous underwater vehicle (AUV) employing in-situ metal matrix composites, using IDEM to identify robust ranged solution sets. The multiscale system results in decision nodes for MSU consideration at hierarchical levels, termed multilevel design. The effectiveness of using game theory to model strategic interaction between the different levels, so as to facilitate decision making for mitigating MSU in multilevel design, is illustrated using the compromise Decision Support Problem (cDSP) technique. Information economics is identified as a research gap for holistic uncertainty management in simulation-based multiscale systems, i.e., for addressing the reduction or mitigation of uncertainty considering both the current design decision and the scope for further simulation model refinement in order to reach better robust solutions. This necessitates the development of an improvement potential (IP) metric, based on the value of information, which quantifies the scope for improving a designer's decision-making ability against modeled uncertainty (MPU) in the simulation models of a multilevel design problem.
To address this research gap, the integration of robust design (using IDEM), information economics (using the IP metric) and game-theoretic constructs (using the cDSP) is proposed. Metamodeling techniques and the expected value of information are critically reviewed to facilitate efficient integration. Robust design using IDEM and the cDSP are integrated to improve MMD of material and product and to address all four types of uncertainty simultaneously. Further, IDEM, the cDSP and the IP metric are integrated to assist system-level designers in allocating resources for simulation model refinement in order to satisfy performance and robustness requirements. The approach for managing MPU, MSU, NU and PU while mitigating MPU is presented using the MMD of material and product. The approach can be utilized by system-level designers for managing all four types of uncertainty and reducing model parameter uncertainty in any multiscale system.
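The value-of-information idea behind an improvement potential metric can be illustrated with the classical expected value of perfect information (EVPI): the gap between deciding after the uncertainty is resolved and deciding now. The two designs and payoff functions below are invented for the example; the IP metric in the thesis is a refinement of this notion tailored to simulation model refinement.

```python
import numpy as np

rng = np.random.default_rng(4)
u = rng.uniform(0.0, 1.0, 100_000)   # uncertain model parameter, prior U(0, 1)

# payoffs of two hypothetical designs as functions of the uncertain parameter
payoffs = np.column_stack([10.0 - 8.0 * u,   # design 0: high payoff, u-sensitive
                           6.0 - 2.0 * u])   # design 1: lower payoff, u-robust

ev_prior = payoffs.mean(axis=0).max()   # commit to the best design now
ev_perfect = payoffs.max(axis=1).mean() # choose per-sample after learning u
evpi = ev_perfect - ev_prior            # ceiling on the value of refining the model
print(f"EVPI = {evpi:.3f}")
```

If refining a simulation model costs more than this gap, the refinement cannot pay off, which is the kind of resource-allocation judgment the integrated IDEM/cDSP/IP approach is meant to support.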
