  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Anisotropic Quadrilateral Mesh Optimization

Ferguson, Joseph Timothy Charles 12 August 2016 (has links)
In order to determine the validity and the quality of meshes, mesh optimization methods have been formulated with quality measures. The basic idea of mesh optimization is to relocate the vertices to obtain a valid mesh (untangling), to improve the mesh quality (smoothing), or both. We present a new algebraic way of calculating quality measures on quadrilateral meshes, based on triangular meshes in 2D, as well as new optimization methods for simultaneous untangling and smoothing of severely deformed meshes. An innovative anisotropic diffusion method is introduced to account for inner-boundary deformation movements in quadrilateral meshes in 2D.
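The triangle-based quality idea can be illustrated with a small sketch (this is an illustration, not the thesis's exact algebraic measure): split each quadrilateral into its four corner triangles, score each triangle with the standard mean-ratio metric, and keep the worst score, so an inverted corner immediately flags a tangled element.

```python
import numpy as np

def tri_quality(a, b, c):
    # Mean-ratio shape quality: 1 for an equilateral triangle,
    # <= 0 for a degenerate or inverted (negative-area) triangle.
    ab, ac, bc = b - a, c - a, c - b
    area2 = ab[0] * ac[1] - ab[1] * ac[0]     # twice the signed area
    denom = ab @ ab + ac @ ac + bc @ bc       # sum of squared edge lengths
    return 2.0 * np.sqrt(3.0) * area2 / denom if denom > 0 else 0.0

def quad_quality(p0, p1, p2, p3):
    # Score a quad (counter-clockwise corners) by its four corner
    # triangles and keep the worst; a value <= 0 flags a tangled element.
    p = [p0, p1, p2, p3]
    return min(tri_quality(p[i - 1], p[i], p[(i + 1) % 4]) for i in range(4))
```

A unit square scores sqrt(3)/2 under this convention (its corner triangles are right isoceles, not equilateral), while a bowtie quad scores negative.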
2

Reduced order modeling techniques for mesh movement as applied to fluid structure interactions

Bogaers, Alfred Edward Jules 11 August 2010 (has links)
In this thesis, the method of Proper Orthogonal Decomposition (POD) is implemented to construct approximate, reduced order models (ROM) of mesh movement methods. Three mesh movement algorithms are implemented and comparatively evaluated, namely radial basis function interpolation, mesh optimization and elastic deformation. POD models of the mesh movement algorithms are constructed using a series of system observations, or snapshots, of a given mesh for a set of boundary deformations. The scalar expansion coefficients for the POD basis modes are computed in three different ways: through coefficient optimization, Galerkin projection of the governing set of equations, and coefficient interpolation. It is found that using only coefficient interpolation yields mesh movement models that accurately approximate the full-order mesh movement, with CPU cost savings in excess of 99%. We further introduce a novel training procedure whereby the POD models are generated in a fully automated fashion. The technology is applicable to any mesh movement method and enables potential reductions of up to four orders of magnitude in mesh movement related costs. The proposed model can be implemented, without having to pre-train the POD model, in any fluid-structure interaction code with an existing mesh movement scheme. Copyright / Dissertation (MEng)--University of Pretoria, 2010. / Mechanical and Aeronautical Engineering / unrestricted
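The coefficient-interpolation variant described above can be sketched in a few lines. This is a toy stand-in: a synthetic rank-2 snapshot matrix replaces a real mesh movement solver, and names like `predict_mesh_movement` are illustrative, not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the full-order model: each snapshot column is
# the mesh displacement field for one boundary-deformation parameter.
params = np.linspace(0.0, 1.0, 11)
spatial_modes = rng.standard_normal((200, 2))          # 200 mesh DOFs
snapshots = spatial_modes @ np.vstack([np.sin(np.pi * params), params ** 2])

# POD basis from the thin SVD of the snapshot matrix.
U, sv, _ = np.linalg.svd(snapshots, full_matrices=False)
r = int(np.sum(sv > 1e-10 * sv[0]))                    # retained modes
Phi = U[:, :r]

# Coefficient interpolation: project snapshots onto the basis, then
# interpolate each modal coefficient over the parameter space.
coeffs = Phi.T @ snapshots                             # r x n_snapshots

def predict_mesh_movement(p):
    c = np.array([np.interp(p, params, coeffs[k]) for k in range(r)])
    return Phi @ c                                     # approximate field
```

The prediction step costs only a few interpolations and one small matrix-vector product, which is where the large CPU savings over re-running the full mesh movement solver come from.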
3

Computational Fluid Dynamics Unstructured Mesh Optimization for the Siemens 4th Generation DLE Burner

Koren, Dejan January 2015 (has links)
Every computational fluid dynamics engineer deals with a never-ending story: limited computer resources. In computational fluid dynamics there is practically never enough computer power. Limited computer resources lead to long calculation times, which result in high costs, and one of the main reasons is that a large number of elements is needed in a computational mesh in order to obtain accurate and reliable results. Although established meshing approaches exist for the Siemens 4th generation DLE burner, mesh dependency has not been fully evaluated yet. The main goal of this work is therefore to better optimize accuracy versus cell count for this particular burner, intended for simulation of air/gas mixing where eddy-viscosity based turbulence models are employed. The Ansys Fluent solver was used for all simulations in this work. To save computation time, a 30° sector model of the burner was created and validated for the mesh convergence study. No steady-state solutions were found for this case, therefore time-dependent simulations with time-statistics sampling were employed. The mesh convergence study has shown that a coarse computational mesh in the air casing of the burner does not affect flow conditions downstream, where the air/gas mixing process takes place, and that a major part of the combustion chamber is highly mesh independent. A large reduction of cell count in those two parts is therefore allowed. On the other hand, the RPL (Rich Pilot Lean) and the pilot burner turned out to be highly dependent on mesh density and need a significantly more refined mesh than has been used so far with the established meshing approaches. The mesh optimization has finally shown that air/gas mixing results at least as accurate may be obtained with a 3x smaller cell count. Furthermore, it has been shown that significantly more accurate results may be obtained with a 60% smaller cell count than with the established meshing approaches. A short mesh study of the Siemens 3rd generation DLE burner in the ignition stage of operation was also performed in this work. This brief study has shown that the established meshing approach for air/gas mixing purposes is sufficient for use with the Ansys Fluent solver, while certain differences were discovered when comparing the results obtained with Ansys Fluent against those obtained with the Ansys CFX solver. Differences between the Fluent and CFX solvers were briefly discussed in this work, as identical simulation set-ups in the two solvers produced slightly different results. Furthermore, the obtained results suggest that the Fluent solver is less mesh dependent than the CFX solver for this particular case.
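The abstract does not give the formulas behind its mesh convergence study, but studies of this kind are commonly quantified with Richardson extrapolation over three systematically refined meshes. A generic sketch (assumed technique, not taken from the thesis):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
    # Observed order of convergence p from a monitored quantity computed
    # on three meshes refined by a constant ratio r:
    #   p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) \
        / math.log(refinement_ratio)

def richardson_extrapolate(f_medium, f_fine, refinement_ratio, p):
    # Estimate of the mesh-independent value from the two finest meshes.
    return f_fine + (f_fine - f_medium) / (refinement_ratio ** p - 1.0)
```

For a quantity behaving like f(h) = 1 + 0.5 h² on meshes with h = 0.4, 0.2, 0.1, this recovers p = 2 and extrapolates to the exact value 1.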
4

Size Function Based Mesh Relaxation

Howlett, John David 18 March 2005 (has links) (PDF)
This thesis addresses the problem of relaxing a finite element mesh to more closely match a size function. The main contributions include new methods for performing size function based mesh relaxation, as well as an algorithm for measuring the performance of size function based mesh relaxation methods.
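A one-dimensional sketch of the idea (an illustration, not the thesis's method): interior nodes are relaxed so that each local edge length becomes proportional to a target size function evaluated at the edge midpoint.

```python
import numpy as np

def relax_1d(x, size, n_iters=200):
    # Gauss-Seidel relaxation of interior nodes: each node is placed so
    # that its left and right gaps, each divided by the target size at
    # the gap midpoint, balance out. Endpoints stay fixed.
    x = x.copy()
    for _ in range(n_iters):
        for i in range(1, len(x) - 1):
            hl = size(0.5 * (x[i - 1] + x[i]))   # target size on the left
            hr = size(0.5 * (x[i] + x[i + 1]))   # target size on the right
            # Solve (x - x[i-1]) / hl == (x[i+1] - x) / hr for x.
            x[i] = (x[i - 1] * hr + x[i + 1] * hl) / (hl + hr)
    return x
```

With a linearly growing size function, an initially uniform node distribution relaxes toward graded spacing that tracks the size function.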
5

Contributions to Mean Shift filtering and segmentation: Application to MRI ischemic data

Li, Ting 04 April 2012 (has links) (PDF)
Medical studies increasingly use multi-modality imaging, producing multidimensional data that bring additional information but are also challenging to process and interpret. As an example, for predicting salvageable tissue, ischemic studies that combine multiple MRI modalities (DWI, PWI) produce more conclusive results than studies using a single modality. However, the multi-modality approach necessitates more advanced algorithms to perform otherwise regular image processing tasks such as filtering, segmentation and clustering. A robust method for addressing the problems associated with processing multi-modality imaging data is Mean Shift, which is based on feature space analysis and non-parametric kernel density estimation and can be used for multi-dimensional filtering, segmentation and clustering. In this thesis, we sought to optimize the Mean Shift process by analyzing the factors that influence it and optimizing its parameters. We examine the effect of noise in processing the feature space and how Mean Shift can be tuned for optimal de-noising and reduced blurring. The large success of Mean Shift is mainly due to the intuitive tuning of its bandwidth parameters, which describe the scale at which features are analyzed. Based on univariate Plug-In (PI) bandwidth selectors for kernel density estimation, we propose a bandwidth matrix estimation method based on multivariate PI for Mean Shift filtering. We study the interest of using diagonal and full bandwidth matrices through experiments on synthesized and natural images. We also propose a new, automatic, volume-based segmentation framework combining Mean Shift filtering, Region Growing segmentation and Probability Map optimization. The framework was developed using synthesized MRI images as test data and yielded a perfect segmentation, with DICE similarity values reaching the highest possible value of 1. Testing was then extended to real MRI data obtained from animals and patients, with the aim of predicting the evolution of the ischemic penumbra several days following the onset of ischemia using only information obtained from the very first scan. The results obtained are an average DICE of 0.8 for the animal MRI scans and 0.53 for the patient MRI scans; the reference images for both cases were manually segmented by a team of expert medical staff. In addition, the most relevant combination of MRI modalities is determined.
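The Mean Shift iteration the thesis builds on can be sketched with a flat kernel. This toy version uses a single scalar bandwidth; the thesis's contribution is precisely to replace such a scalar with Plug-In-estimated diagonal or full bandwidth matrices.

```python
import numpy as np

def mean_shift(X, bandwidth, n_iters=50):
    # Flat-kernel mean shift: each point repeatedly moves to the mean of
    # all samples within `bandwidth` of it, converging to a density mode.
    modes = X.copy()
    for _ in range(n_iters):
        for i, m in enumerate(modes):
            d = np.linalg.norm(X - m, axis=1)
            modes[i] = X[d < bandwidth].mean(axis=0)
    return modes
```

Points belonging to the same cluster converge to (nearly) the same mode, which is how the procedure serves simultaneously as a filter and a clustering step.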
6

Methods for increased computational efficiency of multibody simulations

Epple, Alexander 08 August 2008 (has links)
This thesis is concerned with the efficient numerical simulation of finite element based flexible multibody systems. Scaling operations are systematically applied to the governing index-3 differential algebraic equations in order to solve the problem of ill conditioning for small time step sizes. The importance of augmented Lagrangian terms is demonstrated. The use of fast sparse solvers is justified for the solution of the linearized equations of motion resulting in significant savings of computational costs. Three time stepping schemes for the integration of the governing equations of flexible multibody systems are discussed in detail. These schemes are the two-stage Radau IIA scheme, the energy decaying scheme, and the generalized-α method. Their formulations are adapted to the specific structure of the governing equations of flexible multibody systems. The efficiency of the time integration schemes is comprehensively evaluated on a series of test problems. Formulations for structural and constraint elements are reviewed and the problem of interpolation of finite rotations in geometrically exact structural elements is revisited. This results in the development of a new improved interpolation algorithm, which preserves the objectivity of the strain field and guarantees stable simulations in the presence of arbitrarily large rotations. Finally, strategies for the spatial discretization of beams in the presence of steep variations in cross-sectional properties are developed. These strategies reduce the number of degrees of freedom needed to accurately analyze beams with discontinuous properties, resulting in improved computational efficiency.
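The generalized-α method mentioned above can be sketched for a single-DOF linear problem M d'' + C d' + K d = f(t). This is the textbook Chung-Hulbert formulation, not the thesis's adaptation to index-3 flexible multibody DAEs.

```python
def generalized_alpha_step(M, C, K, f, d, v, a, t, h, rho_inf=0.9):
    # One step of the generalized-alpha scheme; rho_inf in [0, 1] sets
    # the high-frequency numerical dissipation (1 = no dissipation).
    am = (2.0 * rho_inf - 1.0) / (rho_inf + 1.0)
    af = rho_inf / (rho_inf + 1.0)
    gamma = 0.5 - am + af
    beta = 0.25 * (1.0 - am + af) ** 2
    # Newmark predictors (contribution of the still-unknown a_new split off).
    d_pred = d + h * v + h * h * (0.5 - beta) * a
    v_pred = v + h * (1.0 - gamma) * a
    t_mid = t + (1.0 - af) * h
    # Enforce equilibrium at the shifted midpoints:
    #   M a_{n+1-am} + C v_{n+1-af} + K d_{n+1-af} = f(t_{n+1-af})
    lhs = (1.0 - am) * M + (1.0 - af) * gamma * h * C \
        + (1.0 - af) * beta * h * h * K
    rhs = (f(t_mid) - am * M * a
           - C * ((1.0 - af) * v_pred + af * v)
           - K * ((1.0 - af) * d_pred + af * d))
    a_new = rhs / lhs            # scalar problem; use a linear solve for systems
    d_new = d_pred + beta * h * h * a_new
    v_new = v_pred + gamma * h * a_new
    return d_new, v_new, a_new
```

For an undamped unit oscillator (M = K = 1, C = 0) the scheme tracks cos(t) closely over a full period while damping only poorly resolved high frequencies, which is the property that makes it attractive for stiff multibody systems.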
7

Development of probabilistic graphical models to analyze and remesh 2-manifold triangular meshes

Vidal, Vincent 09 December 2011 (has links)
The work in this thesis concerns structural analysis of 2-manifold triangular meshes, and their processing towards quality enhancement (remeshing) or simplification. In existing work, the repositioning of mesh vertices necessary for remeshing is either done locally or globally, but in the latter case without local control on the introduced geometrical error. Therefore, current results are either not globally optimal or introduce unwanted geometrical error. Other promising remeshing and approximation techniques are based on a decomposition into simple geometrical primitives (planes, cylinders, spheres etc.), but they generally fail to find the best decomposition, i.e. the one which jointly optimizes the residual geometrical error as well as the number and type of selected simple primitives. To tackle the weaknesses of existing remeshing approaches, we propose a method based on a global model, namely a probabilistic graphical model integrating soft constraints based on geometry (approximation error), mesh quality and the number of mesh vertices. In the same manner, for segmentation purposes and in order to improve algorithms delivering decompositions into simple primitives, a probabilistic graphical modeling has been chosen.
The graphical models used in this work are Markov Random Fields, which allow an optimal configuration to be found through global minimization of an objective function. We have made three contributions in this thesis concerning 2-manifold triangular meshes: (i) a statistically robust method for feature edge extraction applicable to mechanical objects, (ii) an algorithm for segmentation into regions approximated by simple primitives, robust to outliers and to noise in the vertex positions, and (iii) an algorithm for mesh optimization which jointly optimizes triangle quality, vertex valence quality, the number of vertices, and the geometrical fidelity to the initial surface.
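A minimal flavor of MRF energy minimization, here Iterated Conditional Modes on a 1D chain with a Potts smoothness term (the thesis's models, soft constraints, and minimization are far more sophisticated; this only shows the data-term-plus-smoothness structure):

```python
import numpy as np

def icm_chain(data_cost, smooth_weight, n_iters=20):
    # Iterated Conditional Modes on a chain MRF: at each site, greedily
    # pick the label minimizing data cost plus a Potts penalty paid for
    # each neighbor carrying a different label.
    n_sites, n_labels = data_cost.shape
    labels = data_cost.argmin(axis=1)          # unary-only initialization
    for _ in range(n_iters):
        for i in range(n_sites):
            costs = data_cost[i].copy()
            for lab in range(n_labels):
                if i > 0 and labels[i - 1] != lab:
                    costs[lab] += smooth_weight
                if i < n_sites - 1 and labels[i + 1] != lab:
                    costs[lab] += smooth_weight
            labels[i] = costs.argmin()
    return labels
```

On a noisy binary step signal, the smoothness term flips isolated outlier labels back to those of their neighbors, which is the same mechanism that regularizes vertex repositioning or primitive assignment in a mesh-scale MRF.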
