131 |
Generalized Statistical Tolerance Analysis and Three Dimensional Model for Manufacturing Tolerance Transfer in Manufacturing Process Planning January 2011 (has links)
abstract: Manufacturing tolerance charts are the tool most commonly used today for manufacturing tolerance transfer, but they are limited to one dimension. Some research has addressed three-dimensional geometric tolerances, but it remains largely theoretical and is not yet ready for operator-level use. In this research, a new three-dimensional model for tolerance transfer in manufacturing process planning is presented that is user friendly in the sense that it is built upon Coordinate Measuring Machine (CMM) readings, which are readily available in most manufacturing facilities. The model can handle datum reference changes between non-orthogonal datums (squeezed datums), non-linearly oriented datums (twisted datums), etc. A graph-theoretic approach based on ACIS, C++ and MFC is laid out to facilitate its implementation and the automation of the model. A new approach to determining dimensions and tolerances for the manufacturing process plan is also presented. Second, a new statistical model for statistical tolerance analysis is presented, based on the joint probability distribution of trivariate normally distributed variables. 4-D probability maps have been developed in which the probability value of a point in space is represented by the size and color of its marker. Points inside the part map represent the pass percentage for manufactured parts. The effect of refinement with form and orientation tolerances is highlighted by comparing the resulting change in pass percentage with the pass percentage for size tolerance alone. Delaunay triangulation and ray-tracing algorithms have been used to automate the identification of points inside and outside the part map. Proof-of-concept software has been implemented to demonstrate the model and to determine pass percentages for various cases. The model is further extended to assemblies by employing convolution algorithms on two trivariate statistical distributions to arrive at the statistical distribution of the assembly. A map generated by applying Minkowski sum techniques to the individual part maps is superimposed on the probability point cloud resulting from the convolution, and Delaunay triangulation and ray-tracing algorithms are employed to determine the assembleability percentages for the assembly. / Dissertation/Thesis / Ph.D. Mechanical Engineering 2011
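To make the pass-percentage idea concrete, the following sketch (not the author's implementation) draws Monte Carlo samples from an assumed trivariate normal deviation model and counts how many fall inside a deliberately simplified, convex part map; scipy's Delaunay.find_simplex stands in for the ray-tracing inside/outside test, which is only valid here because the illustrative map is convex, and all numbers are placeholders.

```python
# Hedged sketch of a pass-percentage computation: Monte Carlo samples from a
# trivariate normal deviation model are tested against a convex, illustrative
# tolerance-zone "part map". All statistics and map corners are made up.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# Assumed process statistics: mean and covariance of three coupled deviations.
mean = np.array([0.0, 0.0, 0.0])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.01]])
samples = rng.multivariate_normal(mean, cov, size=100_000)

# Illustrative convex part map: a box of corner points in deviation space.
# A real part map need not be convex, in which case a ray-tracing test
# (as in the thesis) is required instead of Delaunay.find_simplex.
corners = np.array([[x, y, z] for x in (-0.5, 0.5)
                              for y in (-0.6, 0.6)
                              for z in (-0.2, 0.2)])
part_map = Delaunay(corners)

inside = part_map.find_simplex(samples) >= 0   # -1 means outside the map
pass_percentage = 100.0 * inside.mean()
print(f"estimated pass percentage: {pass_percentage:.2f}%")
```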
|
132 |
Mapas auto-organizáveis com estrutura variante do tempo para reconstrução de superfícies / Self-organizing maps with time-varying structure for surface reconstruction RÊGO, Renata Lucia Mendonça Ernesto do 11 March 2013 (has links)
Manifold learning aims to recover information about an unknown manifold M from a set of points L sampled on M. In this context, subcomplexes of the Delaunay triangulation have been used to build a faithful approximation of M from L. In particular, the restricted Delaunay complex has been proven to be a good approximation, both topologically and geometrically, of planar curves and of surfaces in 3D space, provided the available sample is sufficiently dense (Amenta and Bern, 1998). Since then, it has been used by several surface reconstruction methods (Amenta et al., 2001; Boissonnat and Oudot, 2006; Dey and Giesen, 2001; Dey and Goswami, 2003, 2006).
Competitive Hebbian Learning (CHL) (Martinetz and Schulten, 1994) is a simple and elegant method for learning the topology of a manifold from sampled points, and it has been widely used by Self-Organizing Map variants with topology-learning capabilities. Martinetz and Schulten (1994) proved that CHL produces a subset of the Delaunay triangulation. Unfortunately, CHL can only produce graphs and therefore cannot be employed directly to produce triangle meshes.
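A minimal sketch of the CHL edge rule, assuming the unit (codebook) positions are already given; the SOM training, edge aging and the heuristics added in the thesis are omitted.

```python
# Minimal Competitive Hebbian Learning sketch: connect the two units closest
# to each sample. Unit positions are assumed to be given (e.g., by a SOM).
import numpy as np

def chl_edges(units: np.ndarray, samples: np.ndarray) -> set[tuple[int, int]]:
    edges: set[tuple[int, int]] = set()
    for x in samples:
        d = np.linalg.norm(units - x, axis=1)        # distance to every unit
        first, second = np.argsort(d)[:2]            # two nearest units
        edges.add((int(min(first, second)), int(max(first, second))))
    return edges

rng = np.random.default_rng(1)
samples = rng.random((2000, 2))                      # points sampled on a 2-D "manifold"
units = rng.random((50, 2))                          # fixed codebook vectors
print(len(chl_edges(units, samples)), "edges induced by CHL")
```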
The results of Martinetz and Schulten (1994) gave rise to related work in computational geometry. In particular, De Silva and Carlsson (2004) introduced the witness complex, which can be regarded as an approximation of the restricted Delaunay triangulation. The witness complex generalizes the topology-preserving graph produced by CHL, i.e., it is a simplicial complex rather than a graph. De Silva and Carlsson (2004) also presented relaxed definitions of Delaunay centers and witnesses, and Boissonnat et al. (2011) showed that, under certain conditions, the relaxed Delaunay complex is equivalent to the restricted Delaunay complex.
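The witness idea can be sketched as a triangle analogue of the CHL rule above; this is only an illustration of the weak witness test, not the thesis' algorithm, and the relaxation parameter of De Silva and Carlsson is not modelled.

```python
# Sketch of a weak-witness test for triangles: a landmark triple {a, b, c} is
# accepted when some data point has exactly these three landmarks as its
# nearest neighbours; this is the triangle analogue of CHL's edge rule.
import numpy as np

def witness_triangles(landmarks: np.ndarray, samples: np.ndarray) -> set[tuple[int, int, int]]:
    triangles: set[tuple[int, int, int]] = set()
    for x in samples:
        d = np.linalg.norm(landmarks - x, axis=1)
        a, b, c = sorted(np.argsort(d)[:3])          # three nearest landmarks
        triangles.add((int(a), int(b), int(c)))
    return triangles

rng = np.random.default_rng(4)
samples = rng.random((5000, 3))                      # points on/near a surface
landmarks = samples[rng.choice(len(samples), 80, replace=False)]
print(len(witness_triangles(landmarks, samples)), "witnessed triangles")
```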
In this context, we investigate the ability of Self-Organizing Maps with time-varying structure to solve the surface reconstruction problem. We then develop learning-based algorithms for surface reconstruction from unstructured point clouds, consisting of Self-Organizing Maps that combine learning methods for selecting the mesh vertices with topology-learning methods for generating simplicial complexes. The topology-learning methods introduced in this thesis are essentially CHL variants inspired by the witness complex and the relaxed Delaunay complex, with additional heuristics to handle problems observed in practical situations. Further advantages of using Self-Organizing Maps for surface reconstruction are the ability to handle noisy data and to produce meshes at different resolutions.
The experimental results show that the proposed solutions are able to produce meshes that are good approximations of the target surfaces. These meshes were evaluated according to different metrics: Hausdorff distance, neighborhood (valence) distribution, polygon regularity, and minimum angle. The results were compared with other surface reconstruction methods to point out the advantages and disadvantages of the proposed solutions. In most cases the proposed solutions yielded better results with respect to the metrics considered. The experiments also indicate that the proposed solutions are suitable for reconstructing manifolds in higher dimensions.
|
133 |
Applicering av en 2D dungeon algoritm i en 3D rymd : Hur bra presterar TinyKeeps dungeon algoritm i tre dimensioner? / Application of a 2D dungeon algorithm in 3D space : How well does TinyKeep's dungeon algorithm perform in three dimensions? Birgersson, Emil January 2021 (has links)
Procedural content generation in games refers to the algorithmic, procedural creation of digital content in order to automate and reduce the amount of work for designers and artists. One area in which procedural content generation is used in games is dungeon generation. The goal of this work was to explore the 2D algorithm used for dungeon generation in the game TinyKeep and to see how that algorithm would perform if it were modified for generation in a 3D space. The modified algorithm was evaluated based on the attributes variation, adjustability, reliability, and time efficiency. The results showed that it is possible to use the TinyKeep algorithm in a 3D space with acceptable performance. At worst, the modified algorithm showed some weakness in generation time for a larger empty space with a larger number of rooms in a dungeon. For future work it would be interesting to split parts of the modified algorithm across separate threads.
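The corridor-selection step of a TinyKeep-style generator can be sketched as follows (an illustration, not the thesis' modified algorithm): room centres are connected through a Delaunay triangulation and reduced to a minimum spanning tree; room separation, loop edges and corridor carving are left out, and all sizes are placeholders.

```python
# Sketch of the corridor-selection step of a TinyKeep-style dungeon generator,
# lifted to 3D: room centres are triangulated with Delaunay and the edge graph
# is reduced to a minimum spanning tree that guarantees connectivity.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(7)
centres = rng.uniform(0, 100, size=(30, 3))          # placeholder 3-D room centres

tri = Delaunay(centres)
n = len(centres)
graph = lil_matrix((n, n))
for simplex in tri.simplices:                        # tetrahedra in 3-D
    for i in range(4):
        for j in range(i + 1, 4):
            a, b = simplex[i], simplex[j]
            graph[a, b] = np.linalg.norm(centres[a] - centres[b])

mst = minimum_spanning_tree(graph)
corridors = list(zip(*mst.nonzero()))                # room index pairs to connect
print(len(corridors), "corridors connect", n, "rooms")
```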
|
134 |
Knihovna pro práci s tetraedrální sítí / Tetrahedral Mesh Processing Library Hromádka, David January 2013 (has links)
Many architectural, medical and engineering applications need spatial support for various numerical computations (e.g., FEM simulations). Tetrahedral meshes are one of the promising spatial representations for them. In this thesis, several possibilities for an effective tetrahedral mesh representation for mesh generation and processing are described. A library for mesh processing is proposed that is characterized by a memory-efficient representation of the mesh while preserving the ability to apply topological and geometric algorithms effectively. The library is implemented in C++ using templates. The time and space complexity of typical mesh operations is compared with the CGAL library, and according to the measurements the proposed library has lower memory requirements than CGAL.
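The kind of compact, array-based storage argued for can be illustrated in Python (the library itself is C++ templates, and the names below are illustrative, not its API or CGAL's):

```python
# Illustrative structure-of-arrays tetrahedral mesh: flat index arrays instead
# of per-element objects. Not the API of the described C++ library or of CGAL.
import numpy as np

class TetMesh:
    def __init__(self, vertices, tets):
        self.vertices = np.asarray(vertices, dtype=np.float64)   # (nv, 3) coordinates
        self.tets = np.asarray(tets, dtype=np.int32)             # (nt, 4) vertex indices

    def volume(self) -> np.ndarray:
        """Signed volume of every tetrahedron, computed vectorized."""
        a, b, c, d = (self.vertices[self.tets[:, i]] for i in range(4))
        return np.einsum('ij,ij->i', np.cross(b - a, c - a), d - a) / 6.0

# A single reference tetrahedron as a smoke test.
mesh = TetMesh(vertices=[[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]],
               tets=[[0, 1, 2, 3]])
print(mesh.volume())    # -> [0.16666667]
```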
|
135 |
Ohodnocení okolí bodů v obraze / Parametrization of Image Point Neighborhood Zamazal, Zdeněk January 2011 (has links)
This master's thesis focuses on the parametrization of image point neighborhoods. Several methods for point localization and point descriptors are described and summarized, and the Gabor filter is described in detail. The practical part of the thesis is chiefly concerned with a particle filter tracking system in which the weight of each particle is determined by the Gabor filter.
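A sketch of the 2-D Gabor kernel such a tracker might use to score the image patch around each particle; the parameter values are illustrative and the particle filter loop itself is omitted.

```python
# Sketch of a 2-D Gabor kernel used to score image patches around particle
# positions; parameter values are placeholders, not the thesis' settings.
import numpy as np

def gabor_kernel(size=21, wavelength=6.0, theta=0.0, sigma=4.0, gamma=0.5):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# A particle's weight could then be, e.g., the filter response on its patch:
kernel = gabor_kernel(theta=np.pi / 4)
patch = np.random.default_rng(3).random(kernel.shape)   # stand-in image patch
weight = float(np.abs((kernel * patch).sum()))
```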
|
136 |
Automatische Generierung und Visualisierung einer triangulierten Oberfläche eines 3-D-Objektes aus digitalisierten parallelen Schnittdaten / Automatic generation and visualization of a triangulated surface of a 3-D object from digitized parallel slice data Haller, Christian 28 February 2002 (has links)
This thesis presents an extension of Boissonnat's method with which the surface of a 3-D object can be reconstructed automatically from digitized parallel slice data. The method guarantees that the surface consists of a consistent and admissible triangulation.
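A greatly simplified illustration of joining two parallel slices with triangles, assuming index-aligned contours with equal point counts; the actual method (an extension of Boissonnat's Delaunay-based approach) handles general contours and branching, which this toy version does not.

```python
# Toy contour-stitching sketch: two parallel polygonal slices with the same
# number of index-aligned points are joined by a strip of triangles.
import numpy as np

def stitch_slices(lower: np.ndarray, upper: np.ndarray) -> list[tuple[int, int, int]]:
    """Return triangles as index triples into np.vstack([lower, upper])."""
    n = len(lower)
    tris = []
    for i in range(n):
        j = (i + 1) % n
        tris.append((i, j, n + i))           # lower edge plus one upper vertex
        tris.append((j, n + j, n + i))       # upper edge plus one lower vertex
    return tris

theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
lower = np.c_[np.cos(theta), np.sin(theta), np.zeros_like(theta)]
upper = np.c_[np.cos(theta), np.sin(theta), np.ones_like(theta)]
print(len(stitch_slices(lower, upper)), "triangles between two slices")
```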
|
137 |
Interactive Glyph Placement for Tensor Fields: Tracking Lines in Higher Order Tensor Fields Hlawitschka, Mario, Scheuermann, Gerik, Hamann, Bernd 04 February 2019 (has links)
Visualization of glyphs has a long history in medical imaging but gains much more power when the glyphs are properly placed to fill the screen. Glyph packing is often performed via an iterative approach to improve the location of glyphs. We present an alternative implementation of glyph packing based on a Delaunay triangulation to speed up the clustering process and reduce costs for neighborhood searches. Our approach does not require a re-computation of acceleration structures when a plane is moved through a volume, which can be done interactively. We provide two methods for initial placement of glyphs to improve the convergence of our algorithm for glyphs larger and glyphs smaller than the data set's voxel size. The main contribution of this paper is a novel approach to glyph packing that supports simpler parameterization and can be used easily for highly efficient interactive data exploration, in contrast to previous methods.
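A sketch of using Delaunay neighbourhoods for the iterative packing step; the force model, glyph sizes and stopping rule below are placeholders rather than the paper's implementation.

```python
# Sketch of Delaunay-based neighbourhood lookups for iterative glyph packing:
# each relaxation sweep pushes overlapping neighbouring glyph centres apart,
# with neighbours taken from the triangulation instead of a radius search.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(5)
centres = rng.random((200, 2))
radius = 0.05                                        # assumed uniform glyph size

for _ in range(25):                                  # a few relaxation sweeps
    tri = Delaunay(centres)
    indptr, indices = tri.vertex_neighbor_vertices
    forces = np.zeros_like(centres)
    for i in range(len(centres)):
        for j in indices[indptr[i]:indptr[i + 1]]:   # Delaunay neighbours of i
            delta = centres[i] - centres[j]
            dist = np.linalg.norm(delta) + 1e-12
            overlap = 2 * radius - dist
            if overlap > 0:                          # push overlapping glyphs apart
                forces[i] += 0.5 * overlap * delta / dist
    centres = np.clip(centres + forces, 0.0, 1.0)
```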
|
138 |
Advances in aircraft design: multiobjective optimization and a markup language Deshpande, Shubhangi Govind 23 January 2014 (has links)
Today's modern aerospace systems exhibit strong interdisciplinary coupling and require a multidisciplinary, collaborative approach. Analysis methods that were once considered feasible only for advanced and detailed design are now available and even practical at the conceptual design stage. This changing philosophy for conducting conceptual design poses additional challenges beyond those encountered in a low fidelity design of aircraft. This thesis takes some steps towards bridging the gaps in existing technologies and advancing the state-of-the-art in aircraft design.
The first part of the thesis proposes a new Pareto front approximation method for multiobjective optimization problems. The method employs a hybrid optimization approach using two derivative-free direct search techniques and is intended for solving black-box, simulation-based multiobjective optimization problems with possibly nonsmooth functions, where the analytical form of the objectives is not known and/or the evaluation of the objective function(s) is very expensive (very common in multidisciplinary design optimization). A new adaptive weighting scheme is proposed to convert the multiobjective optimization problem into a single-objective one. Results show that the method achieves an arbitrarily close approximation to the Pareto front with a good collection of well-distributed nondominated points.
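A hedged sketch of the scalarization idea on a toy bi-objective problem; the thesis' adaptive weighting scheme and derivative-free direct search solvers are not reproduced, and a plain bounded scalar minimizer stands in for them.

```python
# Sketch of weighted-sum scalarization for a bi-objective problem: each weight
# vector turns the problem into a single-objective one, and the nondominated
# solutions approximate the Pareto front. Fixed (non-adaptive) weights only.
import numpy as np
from scipy.optimize import minimize_scalar

f1 = lambda x: x**2                  # toy objectives on the interval [0, 2]
f2 = lambda x: (x - 2)**2

front = []
for w in np.linspace(0.01, 0.99, 30):
    scalarized = lambda x, w=w: w * f1(x) + (1 - w) * f2(x)
    res = minimize_scalar(scalarized, bounds=(0.0, 2.0), method='bounded')
    front.append((f1(res.x), f2(res.x)))

# Keep only nondominated points.
pareto = [p for p in front
          if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in front)]
print(len(pareto), "nondominated points approximate the Pareto front")
```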
The second part deals with the interdisciplinary data communication issues involved in a collaborative multidisciplinary aircraft design environment. Efficient transfer, sharing, and manipulation of design and analysis data in a collaborative environment demand a formal, structured representation of data. XML, a W3C recommendation, is one such standard, with a number of powerful capabilities that alleviate interoperability issues. A compact, generic, and comprehensive XML schema for an aircraft design markup language (ADML) is proposed here to provide a common language for data communication and to improve efficiency and productivity within a multidisciplinary, collaborative environment. An important feature of the proposed schema is its very expressive and efficient low-level schemata. As a proof of concept the schema is used to encode an entire Convair B58. As the complexity of models and the number of disciplines increase, the reduction in effort to exchange data models and analysis results in ADML also increases. / Ph. D.
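Purely as an illustration of the kind of structured exchange such a schema enables, the snippet below builds a small XML fragment; the element and attribute names are hypothetical and do not reflect the actual ADML schema.

```python
# Illustrative only: builds an ADML-like XML fragment with hypothetical
# element and attribute names (not the real ADML schema).
import xml.etree.ElementTree as ET

aircraft = ET.Element('aircraft', name='example_concept')
wing = ET.SubElement(aircraft, 'wing')
ET.SubElement(wing, 'span', unit='m').text = '17.3'
ET.SubElement(wing, 'aspectRatio').text = '7.8'
analysis = ET.SubElement(aircraft, 'analysis', discipline='aerodynamics')
ET.SubElement(analysis, 'result', name='L_over_D').text = '16.2'

print(ET.tostring(aircraft, encoding='unicode'))
```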
|
139 |
Generování a optimalizace meshů / Mesh generation and optimization Mokriš, Dominik January 2012 (has links)
This thesis is devoted to the problem of finding a suitable geometrical description of the domain for the Finite Element Method (FEM). We present the most important methods used in the generation and improvement of unstructured triangular meshes (grids) for two-dimensional FEM. Possible measures of mesh quality are discussed with respect to their use in linear Lagrange FEM. The relationship between mesh geometry (especially the angles of individual triangles), discretization error and stiffness matrix condition number is examined. Two methods of mesh improvement, based on Centroidal Voronoi Tessellations (CVT) and Optimal Delaunay Triangulations (ODT), are discussed in detail, and some results on the convergence of CVT-based methods are reviewed. Some aspects of these methods, e.g. the relation between the density of boundary points and interior mesh vertices and the treatment of boundary triangles, are reconsidered in a new way. We have implemented these two methods and we discuss possible improvements and new algorithms. The geometrically very interesting idea of a recent alternative to FEM, Isogeometric Analysis (IGA), is outlined and demonstrated on a simple example. Several numerical tests are made in order to compare the accuracy of solutions of isotropic PDEs obtained by FEM on a bad mesh, a mesh improved...
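A Monte Carlo sketch of one ingredient, Lloyd iterations toward a Centroidal Voronoi Tessellation of the unit square; boundary handling, density functions and the ODT variant discussed in the thesis are not shown.

```python
# Sketch of Monte Carlo Lloyd iterations toward a CVT of the unit square:
# each generator is repeatedly moved to the centroid of the sample points
# that are nearest to it (an approximation of its Voronoi region).
import numpy as np

rng = np.random.default_rng(2)
generators = rng.random((40, 2))                     # initial interior vertices
samples = rng.random((20000, 2))                     # uniform density assumed

for _ in range(50):                                  # Lloyd iterations
    d = np.linalg.norm(samples[:, None, :] - generators[None, :, :], axis=2)
    nearest = d.argmin(axis=1)                       # nearest generator per sample
    for k in range(len(generators)):
        members = samples[nearest == k]
        if len(members):
            generators[k] = members.mean(axis=0)     # move to region centroid
```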
|
140 |
Squelettes et graphes de Voronoï 2D et 3D / 2D and 3D skeletons and Voronoï graphs Attali, Dominique 13 October 1995 (has links) (PDF)
Our work concerns the study, computation, and simplification of skeletons of 2D and 3D objects. The skeleton of an object is a thin figure, centered within the shape, that summarizes its appearance. It is useful for shape description and recognition, quantification, matching, and so on. We first survey the different techniques for computing the skeleton. The vast majority of them operate on binary images with tools from discrete geometry. Recently, however, a new family of methods, called continuous methods, has emerged: the skeleton is approximated using the Voronoi graph of a sampling of the boundary and is computed with the tools of computational geometry. Our interest focuses on this new approach and the problems attached to it. To begin with, we propose a formulation of the continuous methods in terms of the skeleton of a finite union of spheres. Indeed, we show that the skeleton of a finite union of spheres can be constructed exactly from very simple elements such as line segments in 2D and polygons in 3D. Building the skeleton requires interpolating, with triangular facets, a set of points located on the boundary of an object; we propose a method based on the Delaunay graph and prove its convergence in 2D. Finally, skeleton simplification methods are presented. They select the branches corresponding to significant bulges of the shape and lead, in 3D, to either surface skeletons or curve-like skeletons according to the user's needs. We conclude by describing an application that validates our approach and illustrates it on biological data.
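The continuous, Voronoi-based skeleton approximation described above can be sketched as follows; an ellipse is used so the inside test stays trivial, and the exact union-of-spheres construction and the pruning methods of the thesis are not reproduced.

```python
# Sketch of a continuous skeleton approximation: sample the boundary of a
# shape, compute the Voronoi diagram of the samples, and keep the Voronoi
# vertices that fall inside the shape (they approximate the medial axis).
import numpy as np
from scipy.spatial import Voronoi

a, b = 2.0, 1.0                                      # ellipse semi-axes
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
boundary = np.c_[a * np.cos(theta), b * np.sin(theta)]

vor = Voronoi(boundary)
inside = (vor.vertices[:, 0] / a) ** 2 + (vor.vertices[:, 1] / b) ** 2 < 1.0
skeleton_points = vor.vertices[inside]
print(len(skeleton_points), "interior Voronoi vertices approximate the skeleton")
```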
|