11

Integrated Multi-Scale Modeling Framework for Simulating Failure Response of Fiber Reinforced Composites

Ahmadian Ahmadabad, Hossein 28 August 2019 (has links)
No description available.
12

Validation of a Mesh Generation Strategy for Predicting Ice Accretion on Wings

Bassou, Rania 09 December 2016 (has links) (PDF)
Researchers have been developing techniques to predict in-flight icing in order to determine aircraft behavior under different icing conditions. A key component of these techniques is the mesh generation strategy. Automated meshing facilitates numerical simulation of ice accretion on realistic aircraft configurations by deforming the surface and volume meshes in response to the evolving ice shape. The objective of this research is to validate an ice accretion strategy for wings using a previously developed meshing strategy, and to investigate the effect of varying numerical parameters on the predicted ice shape. Using this framework, results are simulated for rime and glaze ice accretions on a rectangular planform wing with a constant GLC-305 airfoil section. The number of time steps is shown to have a significant effect on the ice shape, depending on the icing time and conditions. Decreasing the height smoothing parameters generally improves the accuracy of the predicted ice shape.
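As a rough illustration of the multi-step accretion approach this abstract describes, the following Python sketch grows a toy surface along its normals over several time steps and applies a simple height-smoothing pass; the collection-efficiency model, parameter values, and function names are hypothetical and are not taken from the thesis.

```python
import numpy as np

def accrete_ice(surface_pts, normals, icing_time, n_steps, smoothing=0.5):
    """Toy multi-step accretion: grow the surface along its normals each step,
    then smooth the accumulated ice height (illustrative only)."""
    pts = surface_pts.copy()
    dt = icing_time / n_steps
    for _ in range(n_steps):
        # Hypothetical local collection efficiency: thickest ice near the leading edge.
        beta = np.exp(-10.0 * np.abs(pts[:, 1]))
        growth = beta * dt * 1e-4              # ice thickness added this step [m]
        # Smooth the height increment to mimic a height-smoothing parameter.
        kernel = np.array([smoothing / 2, 1 - smoothing, smoothing / 2])
        growth = np.convolve(growth, kernel, mode="same")
        pts = pts + normals * growth[:, None]  # deform the surface mesh
    return pts

# Toy blunt leading edge sampled by 101 surface points.
theta = np.linspace(-np.pi / 2, np.pi / 2, 101)
surface = np.column_stack([-0.05 * np.cos(theta), 0.05 * np.sin(theta)])
normals = np.column_stack([-np.cos(theta), np.sin(theta)])
iced = accrete_ice(surface, normals, icing_time=300.0, n_steps=10)
print("max ice thickness [m]:", np.abs(iced - surface).max())
```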
13

All Hexahedral Meshing of Multiple Source, Multiple Target, Multiple Axis Geometries Via Automatic Grafting and Sweeping

Earp, Matthew N. 18 March 2005 (has links) (PDF)
The development of algorithms for the automatic creation of finite element meshes composed entirely of hexahedra (all-hex) is an active area of research. All-hex meshes are desirable because they offer high accuracy with a low node count. Sweeping is one of the most widely used algorithms for generating all-hex meshes. A limitation of sweeping, however, is that it can currently be applied only to prismatic or extruded geometry types. This thesis develops a method to combine sweeping with another algorithm known as "Grafting". Grafting adjusts the mesh on one volume to conform to a second volume. In this manner it is useful for meshing multi-axis geometry, in that a single axis can be meshed with sweeping and then secondary axes can be grafted on. By creating an algorithm that performs these processes automatically, the range of geometry that can be automatically meshed with these methods is greatly increased. This new algorithm is called Graft-Sweeping. With the combination of sweeping and Grafting, geometry that contains multiple source surfaces, multiple target surfaces, and multiple sweep axes can be meshed. The results of this algorithm on several example geometries are given, showing the strengths and weaknesses of the technique. From the results it can be seen that the Graft-Sweep algorithm can produce a finite element mesh in about half the time of manual Grafting and sweeping operations, with similar mesh quality. When compared to sweeping alone, Graft-Sweeping is several times faster, but the quality is usually reduced. A second area of research for this thesis is to determine when Grafting can be used to enhance the meshing process. It is shown that the best results are obtained when Grafting is used on structured meshes and the mesh size is considerably smaller than the size of the feature being grafted.
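The following Python sketch only illustrates the high-level organization of a graft-sweep pass as described above (sweep the primary axis, then graft and sweep each secondary branch); the classes, counts, and attachment handling are hypothetical bookkeeping, not the thesis's meshing code.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SweptVolume:
    """A prismatic volume meshed by sweeping an nx-by-ny source grid through nz layers."""
    name: str
    nx: int
    ny: int
    nz: int

    def hex_count(self) -> int:
        return self.nx * self.ny * self.nz

def graft_sweep(primary: SweptVolume, branches: List[Tuple[SweptVolume, str]]) -> int:
    """Toy Graft-Sweep driver: sweep the primary axis first, then 'graft' each
    secondary branch onto a side of the primary mesh and sweep it along its own axis."""
    total = primary.hex_count()
    print(f"swept primary '{primary.name}': {primary.hex_count()} hexes")
    for branch, side in branches:
        # Grafting would locally imprint the branch's source surface onto `side`
        # of the primary mesh before the branch is swept (not modeled here).
        print(f"grafted '{branch.name}' onto {side}, swept {branch.hex_count()} hexes")
        total += branch.hex_count()
    return total

main_axis = SweptVolume("main", nx=10, ny=10, nz=40)
branch_a = SweptVolume("branch_a", nx=4, ny=4, nz=12)
print("total hexes:", graft_sweep(main_axis, [(branch_a, "+x side")]))
```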
14

A Selective Approach to Hexahedral Refinement of Unstructured Conformal Meshes

Parrish, Michael Hubbard 13 July 2007 (has links) (PDF)
Hexahedral refinement increases the density of an all-hexahedral mesh in a specified region, improving numerical accuracy. Previous research relied solely on sheet refinement theory, which made the implementation computationally expensive and unable to handle multiply-connected transition elements and self-intersecting hexahedral sheets effectively. The Selective Approach method is a new procedure that combines two diverse methodologies to create an efficient and robust algorithm able to handle the problems stated above. These two refinement methods are: 1) element-by-element refinement and 2) directional refinement. In element-by-element refinement, the three inherent directions of a hexahedron are refined in one step using one of seven templates. Because of its computational superiority over directional refinement, but its inability to handle multiply-connected transition elements, element-by-element refinement is used in all areas of the specified region except regions local to multiply-connected transition elements. The directional refinement scheme refines the three inherent directions of a hexahedron separately, on a hexahedron-by-hexahedron basis. This differs from sheet refinement, which refines hexahedra using hexahedral sheets. Directional refinement is able to correctly handle multiply-connected transition elements. A ranking system and propagation scheme allow directional refinement to work within the confines of the Selective Approach algorithm.
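A minimal sketch of the selective dispatch idea described above, in Python: hexahedra near multiply-connected transition elements are routed to directional refinement while the rest use element-by-element templates. The element counts and data structures are illustrative assumptions, not the thesis's actual templates.

```python
def refine_element_by_element(hex_ids):
    """All three inherent directions refined in one step: each hex becomes 3x3x3 = 27."""
    return {h: 27 for h in hex_ids}

def refine_directionally(hex_ids, directions=("i", "j", "k")):
    """Each inherent direction refined separately (hex by hex), tripling per direction."""
    counts = {h: 1 for h in hex_ids}
    for _ in directions:
        counts = {h: c * 3 for h, c in counts.items()}
    return counts

def selective_refine(region, near_multiply_connected):
    """Toy dispatcher: use the cheaper element-by-element templates everywhere except
    in the neighborhood of multiply-connected transition elements."""
    directional = {h for h in region if h in near_multiply_connected}
    templated = set(region) - directional
    result = refine_element_by_element(templated)
    result.update(refine_directionally(directional))
    return result

region = range(100)            # hexes flagged for refinement
tricky = {5, 6, 42}            # hexes adjacent to multiply-connected transition elements
new_counts = selective_refine(region, tricky)
print(sum(new_counts.values()), "hexes after refinement")
```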
15

Lidar In Coastal Storm Surge Modeling: Modeling Linear Raised Features

Coggin, David 01 January 2008 (has links)
A method for extracting linear raised features from laser-scanned altimetry (LiDAR) datasets is presented. The objective is to automate the method so that elements in a coastal storm surge simulation finite element mesh might have their edges aligned along vertical terrain features. Terrain features of interest are those that are high and long enough to form a hydrodynamic impediment while being narrow enough that they might be straddled, and thus not modeled, if element edges are not purposely aligned. These features are commonly raised roadbeds but may also arise from other manmade alterations to the terrain or from natural terrain features. The implementation uses the TauDEM watershed delineation software included in the MapWindow open source Geographic Information System to initially extract watershed boundaries. The watershed boundaries are then examined computationally to determine which sections warrant inclusion in the storm surge mesh. Introductory work towards applying image analysis techniques as an alternate means of vertical feature extraction is presented as well. Vertical feature lines extracted from a LiDAR dataset for Manatee County, Florida are included in a limited storm surge finite element mesh for the county and Tampa Bay. Storm surge simulations using the ADCIRC-2DDI model with two meshes, one which includes linear raised features as element edges and one which does not, verify the usefulness of the method.
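The screening step described above, deciding which watershed-boundary sections are high and long enough to keep as mesh edges, might look roughly like the following Python sketch; the thresholds, data layout, and function name are assumptions for illustration only.

```python
import numpy as np

def keep_raised_segments(boundary_xy, elevation, ground, min_height=0.5,
                         min_length=200.0):
    """Toy screen for watershed-boundary sections worth including as mesh edges:
    keep contiguous runs of vertices that stand at least `min_height` above the
    surrounding ground and span at least `min_length` along the boundary."""
    raised = (elevation - ground) >= min_height
    segments, start = [], None
    for i, flag in enumerate(raised):
        if flag and start is None:
            start = i
        if (not flag or i == len(raised) - 1) and start is not None:
            end = i if not flag else i + 1
            length = np.sum(np.linalg.norm(np.diff(boundary_xy[start:end], axis=0), axis=1))
            if length >= min_length:
                segments.append((start, end))
            start = None
    return segments

# Toy boundary: 1 km of vertices every 10 m, with a 300 m raised roadbed in the middle.
xy = np.column_stack([np.arange(0, 1000.0, 10.0), np.zeros(100)])
ground = np.zeros(100)
elev = ground.copy()
elev[40:70] += 1.2                      # the raised feature
print(keep_raised_segments(xy, elev, ground))
```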
16

Development of Finite Element Modeling Mesh Generation and Analysis Software for Light Wood Frame Houses

Pathak, Rakesh 03 February 2005 (has links)
This thesis presents the development of an automatic mesh generator, named WoodFrameMesh, written in object-oriented C++. The program is capable of generating complete finite element models of wooden houses incorporating frames, linear links, springs, nodal loads, and restraints at the desired locations. The finite element mesh generated by the program may be triangular or quadrilateral. The triangular mesh can be generated over any arbitrary domain with multiple openings and line constraints. The program implements the advancing front method for triangulation as discussed by Lee and Hobbs; the implementation differs in its use of object-oriented concepts and its extensive use of the C++ Standard Template Library (STL). Quadrilateral mesh generation is limited to simple quadrilateral domains with no openings or constraint lines, using a simple structured technique. Automating the modeling process considerably reduces the time spent manually generating complete finite element models of wooden houses. Overall, the use of object-oriented design has facilitated the code development and has provided a platform for further additions. The program relies on the STL because it provides dynamic data structures and algorithms for storage, searching, sorting, and so on; efficiency is improved by using these built-in features instead of developing new code. Analysis of the finite element models generated by the automatic mesh generator is performed using SAP 2000 and WoodFrameSolver. WoodFrameSolver is a finite element analysis engine for WoodFrameMesh, which was developed at Virginia Tech by a group of graduate students (including the author) and professors as a separate project. A chapter discussing the WoodFrameSolver architecture, its extensibility features, and its verification is also presented in this thesis. The solver performance and accuracy are similar to those of SAP 2000, which was chosen as the benchmark for testing the analysis results. / Master of Science
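As a small illustration of the structured quadrilateral technique mentioned above (not the WoodFrameMesh C++ implementation), the following Python sketch meshes a simple four-sided domain by bilinear interpolation of its corners:

```python
import numpy as np

def structured_quad_mesh(corners, nx, ny):
    """Toy structured quad mesher for a simple four-sided domain: nodes are placed
    by bilinear (transfinite) interpolation of the corner coordinates, and elements
    are listed as counter-clockwise node quadruples (a sketch, not WoodFrameMesh code)."""
    p00, p10, p11, p01 = [np.asarray(c, dtype=float) for c in corners]
    u = np.linspace(0.0, 1.0, nx + 1)
    v = np.linspace(0.0, 1.0, ny + 1)
    nodes = np.array([(1-ui)*(1-vj)*p00 + ui*(1-vj)*p10 + ui*vj*p11 + (1-ui)*vj*p01
                      for vj in v for ui in u])
    quads = [(j*(nx+1)+i, j*(nx+1)+i+1, (j+1)*(nx+1)+i+1, (j+1)*(nx+1)+i)
             for j in range(ny) for i in range(nx)]
    return nodes, quads

nodes, quads = structured_quad_mesh([(0, 0), (4, 0.5), (4.5, 3), (0, 2.5)], nx=8, ny=4)
print(len(nodes), "nodes,", len(quads), "quad elements")
```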
17

Scalable Algorithms for Delaunay Mesh Generation

Slatton, Andrew G. January 2014 (has links)
No description available.
18

A Multidimensional Discontinuous Galerkin Modeling Framework for Overland Flow and Channel Routing

West, Dustin Wayne 19 May 2015 (has links)
No description available.
19

Aerosol Transport Simulations in Indoor and Outdoor Environments using Computational Fluid Dynamics (CFD)

Landázuri, Andrea Carolina January 2016 (has links)
This dissertation focuses on aerosol transport modeling in occupational environments and at mining sites in Arizona using computational fluid dynamics (CFD). Human exposure impacts in both environments are explored with emphasis on turbulence, wind speed, wind direction, and particle size. The final emissions simulations involved digitizing available elevation contour plots of one of the mining sites to account for realistic topographical features; the resulting digital elevation map (DEM) was imported into COMSOL MULTIPHYSICS® for subsequent turbulence and particle simulations. Simulation results that include realistic topography show considerable deviations in wind direction. Inter-element correlations of size-resolved metal and metalloid concentration data, collected with a Micro-Orifice Uniform Deposit Impactor (MOUDI) under given wind speeds and directions, provided guidance on groups of metals that coexist throughout mining activities. The pairs Fe-Mg, Cr-Fe, Al-Sc, Sc-Fe, and Mg-Al are strongly correlated for unrestricted wind directions and speeds, suggesting that the source may be of soil origin (e.g., ore and tailings), while groups of elements containing Cu in the coarse fraction may come from mechanical mining activities and saltation. In addition, MOUDI data at low wind speeds (<2 m/s) and at night showed strong correlations for 1-micrometer particles among the groups Sc-Be-Mg, Cr-Al, Cu-Mn, Cd-Pb-Be, Cd-Cr, Cu-Pb, Pb-Cd, and As-Cd-Pb. The As-Cd-Pb group correlates strongly across almost all particle size ranges. When restricted low wind speeds were imposed, more groups of elements became evident, which may be explained by the fact that at lower speeds particles are more likely to settle. Linking these results with the CFD simulations and Pb-isotope analysis leads to the conclusion that the elements associated with Pb in the fine fraction come from the ore that is subsequently processed at the smelter site, whereas the elements associated with Pb in the coarse fraction have a different origin. The CFD simulation results not only provide realistic and quantifiable information on potential deleterious effects, but also show that CFD represents an important contribution to dispersion modeling studies: CFD can be used as a source apportionment tool to identify areas that affect specific sampling points and susceptible regions under certain meteorological conditions, and these conclusions can be supported with inter-element correlation matrices and lead isotope analysis, which is especially valuable given the limited access to the mining sites. Additional results show that grid adaptation is a powerful tool that refines specific regions requiring detail and therefore better resolves the flow, yields a higher number of locations with monotonic convergence than the manual grids, and requires the least computational effort. The CFD simulations used the k-epsilon turbulence model with the aid of the computer-aided engineering packages ANSYS® and COMSOL MULTIPHYSICS®. The success of aerosol transport simulations depends on a good simulation of the turbulent flow, so considerable attention was placed on investigating and choosing the best models in terms of convergence, grid independence, and computational effort.
This dissertation also includes preliminary studies of transient discrete-phase, Eulerian, and species transport modeling, the importance of particle saltation, information on CFD methods, and strategies for future research directions.
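The inter-element correlation analysis described above can be pictured with a short sketch: build a table of size-resolved element concentrations, compute a Pearson correlation matrix, and flag strongly correlated pairs. The data below are synthetic and the threshold is an assumption; this is not the dissertation's analysis code.

```python
import numpy as np
import pandas as pd

# Toy size-resolved concentrations (one row per MOUDI sample, one column per element).
# Real data would be grouped by size fraction, wind speed, and direction before correlating.
rng = np.random.default_rng(0)
soil = rng.lognormal(mean=1.0, sigma=0.5, size=50)          # shared "soil" driver
data = pd.DataFrame({
    "Fe": 2.0 * soil + rng.normal(0, 0.2, 50),
    "Mg": 1.5 * soil + rng.normal(0, 0.2, 50),
    "Al": 1.2 * soil + rng.normal(0, 0.2, 50),
    "Cu": rng.lognormal(0.5, 0.4, 50),                        # independent source
})
corr = data.corr(method="pearson")
# Flag strongly correlated element pairs, as done when grouping Fe-Mg, Mg-Al, etc.
strong = [(a, b, round(corr.loc[a, b], 2))
          for a in corr.index for b in corr.columns if a < b and corr.loc[a, b] > 0.8]
print(strong)
```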
20

An improved incremental/decremental Delaunay mesh-generation strategy for image representation

EL Marzouki, Badr Eddine 16 December 2016 (has links)
Two highly effective content-adaptive methods for generating Delaunay mesh models of images, known as IID1 and IID2, are proposed. The methods repeatedly alternate between mesh simplification and refinement, based on the incremental/decremental mesh-generation framework of Adams, which has several free parameters. The effect of different choices of the framework's free parameters is studied, and the results are used to derive two mesh-generation methods that differ in computational complexity. The higher-complexity IID2 method generates mesh models of superior reconstruction quality, while the lower-complexity IID1 method trades mesh quality in return for a decrease in computational cost. Some of the contributions of our work include the recommendation of a better choice for the growth-schedule parameter of the framework, as well as the use of Floyd-Steinberg error diffusion for the initial-mesh selection. As part of our work, we evaluated the performance of the proposed methods using a data set of 50 images varying in type (e.g., photographic, computer-generated, and medical), size, and bit depth, with multiple target mesh densities ranging from 0.125% to 4%. The experimental results show that our proposed methods perform extremely well, yielding high-quality image approximations in terms of peak signal-to-noise ratio (PSNR) and subjective visual quality, at an equivalent or lower computational cost compared to other well-known approaches such as the ID1, ID2, and IDDT methods of Adams, and the greedy point removal (GPR) scheme of Demaret and Iske. More specifically, the IID2 method outperforms the GPR scheme in terms of mesh quality by 0.2-1.0 dB with a 62-93% decrease in computational cost. Furthermore, the IID2 method yields meshes of similar quality to the ID2 method at a computational cost that is lower by 9-41%. The IID1 method provides improvements in mesh quality in 93% of the test cases by margins of 0.04-1.31 dB compared to the IDDT scheme, while having a similar complexity. Moreover, reductions in execution time of 4-59% are achieved compared to the ID1 method in 86% of the test cases. / Graduate / 0544, 0984, 0537 / marzouki@uvic.ca
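Floyd-Steinberg error diffusion, which the abstract credits for the initial-mesh selection, can be sketched as follows: a nonnegative detail map is scaled so it sums to the target point count, and the quantization error is diffused so that roughly that many pixels are selected as initial mesh points. The detail measure and density used here are illustrative assumptions, not the IID1/IID2 settings.

```python
import numpy as np

def floyd_steinberg_select(detail, n_points):
    """Toy initial-mesh selection by Floyd-Steinberg error diffusion: scale a
    nonnegative detail map so it sums to the target point count, then diffuse the
    quantization error so roughly n_points pixels are switched on."""
    h, w = detail.shape
    err = (detail * (n_points / detail.sum())).astype(float).copy()
    selected = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            old = err[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            selected[y, x] = bool(new)
            e = old - new
            if x + 1 < w:               err[y, x + 1]     += e * 7 / 16
            if y + 1 < h and x > 0:     err[y + 1, x - 1] += e * 3 / 16
            if y + 1 < h:               err[y + 1, x]     += e * 5 / 16
            if y + 1 < h and x + 1 < w: err[y + 1, x + 1] += e * 1 / 16
    return np.argwhere(selected)        # (row, col) sample points for the initial mesh

# Toy detail map: Laplacian magnitude of a synthetic image.
img = np.fromfunction(lambda y, x: np.sin(x / 8.0) * np.cos(y / 11.0), (128, 128))
lap = np.abs(4 * img - np.roll(img, 1, 0) - np.roll(img, -1, 0)
             - np.roll(img, 1, 1) - np.roll(img, -1, 1))
pts = floyd_steinberg_select(lap + 1e-6, n_points=int(0.02 * img.size))  # 2% density
print(len(pts), "initial mesh points selected")
```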
