61

Design of a 5 Degree of Freedom Kinematic Stage for the Dual Crystal Backlighter Imager Diagnostic

Nguyen, Nicholas 01 June 2020 (has links)
The National Ignition Facility (NIF) is home to the world’s most energetic laser. The facility is one of the leading centers for inertial confinement fusion (ICF) experiments aimed at researching and understanding sustainable fusion energy. To fully document and understand the physics occurring during experiments, precise diagnostics are used for a wide range of purposes. One diagnostic, the crystal backlighter imager (CBI), allows X-ray imaging of the target at late stages of its implosion. The aim of this project was to increase the current capabilities of the CBI diagnostic with the addition of a second crystal. This thesis focuses on the design development of the 5-degree-of-freedom precision stages used to align each of the crystals. The motivations for the addition of a second crystal are covered in the introduction. A ray-tracing model was generated to explore the required range of travel for both crystals, as well as to explore potential effects of transitioning to a two-crystal system. The requirements of the precision stage are outlined based on the flaws of the current stage and the areas where improvement is desired. A dynamic analysis was performed on modified supporting hardware for CBI to determine areas of interest in redesigning components for the two-crystal system. Further research was performed on commercial and published methods used to design precision optomechanical stages. Finally, the design development is documented, outlining the considered options, modifications to the existing system, and the proposed design solution. The proposed design meets the project requirements set at the beginning of design development.
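As a hedged illustration of what such a ray-tracing calculation involves (this is not the thesis' model), the short sketch below evaluates the Bragg condition and the meridional focusing equation for a bent-crystal imager; the crystal 2d spacing, photon energy, bend radius, and object distance are placeholder values.

```python
import numpy as np

# Rough sketch (not the thesis' ray-tracing model): Bragg angle and meridional
# image distance for a spherically bent crystal imager. The 2d spacing, photon
# energy, bend radius, and object distance below are illustrative placeholders.
HC = 12.398  # keV * Angstrom, converts photon energy to wavelength

def bragg_angle(energy_keV, two_d_angstrom, order=1):
    """Bragg angle (rad) from n * lambda = 2d * sin(theta)."""
    s = order * (HC / energy_keV) / two_d_angstrom
    if s > 1.0:
        raise ValueError("no Bragg reflection at this energy/spacing")
    return np.arcsin(s)

def meridional_image_distance(R, theta, p):
    """Bent-crystal focusing in the meridional plane: 1/p + 1/q = 2/(R sin(theta))."""
    return 1.0 / (2.0 / (R * np.sin(theta)) - 1.0 / p)

theta = bragg_angle(energy_keV=11.6, two_d_angstrom=2.75)      # placeholder crystal/energy
q = meridional_image_distance(R=500.0, theta=theta, p=300.0)   # mm, placeholders
print(f"Bragg angle: {np.degrees(theta):.2f} deg, image distance: {q:.1f} mm")
```

Sweeping the source and crystal positions in a model like this is one way to bound the required range of travel for an alignment stage.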
62

DESIGN AND PROCESS OF 3D-PRINTED PARTS USING COMPOSITE THEORY

Garcia, Jordan 01 January 2019 (has links)
3D printing is a revolutionary manufacturing method that allows the production of engineering parts almost directly from modeling software on a computer. With 3D printing technology, future manufacturing could become vastly more efficient. However, the procedures used in 3D printing differ substantially among printers and from those used in conventional manufacturing. In this thesis, the mechanical properties of engineering products fabricated by 3D printing were comprehensively evaluated and then compared with those made by conventional manufacturing. Three open-source 3D printers, i.e., the Flash Forge Dreamer, the Tevo Tornado, and the Prusa, were used to fabricate identical parts out of the same material (acrylonitrile butadiene styrene). The parts were printed at various positions on the printer platforms and then tested in bending. Results indicate that there are substantial differences in mechanical response among parts from different 3D printers. Specimens from the Prusa printer exhibit the best elastic properties, while specimens from the Flash Forge printer exhibit the greatest post-yield response. There are also noticeable variations in mechanical properties among parts fabricated by the same printer: depending on the position at which a part is placed on the printer platform, its properties can vary greatly. For comparison, identical parts were fabricated using a conventional manufacturing method, i.e., compression molding. Results show that compression-molded parts exhibit more robust and more homogeneous properties than those from 3D printing. During 3D printing, the machine code (e.g., the G-code) provides the processing instructions (the x, y, and z coordinates and the linear movements) to the printer head to construct the physical part. Oftentimes the default processing instructions used by commercial 3D printers do not yield optimal mechanical properties in the parts. In the second part of this thesis, the orientation-dependent properties of 3D-printed parts were examined. Multi-layered composite theory was used to design the directions of printing so that the properties of 3D-printed objects can be optimized. Such a method can potentially be used to design and optimize the 3D printing of complex engineering products. In the last part of this thesis, the printing process of an actual automobile A-pillar structure was designed and optimized. Finite element software (ANSYS) was used to design and optimize the filament orientations of the A-pillar. Actual parts from the proposed designs were fabricated using a 3D printer and then tested. Consistent results were observed between the computational designs and the experimental testing. It is recommended that the filament orientations in 3D printing be "designed" or "tailored" using laminate composite theory. The method would allow 3D printers to produce parts with optimal microstructure and mechanical properties to better satisfy specific needs.
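As a sketch of how laminate composite theory can guide filament orientation (the ply properties and layups below are assumed placeholders, not measurements from this thesis), classical lamination theory gives the effective in-plane modulus of a printed stack:

```python
import numpy as np

# Sketch of classical lamination theory (CLT) applied to printed layers: each
# layer is treated as an orthotropic ply, rotated into the part axes, and the
# in-plane stiffness matrix A is summed over the stack. The printed-ABS ply
# properties below are rough placeholders, not measured values.
E1, E2, G12, nu12 = 2.2e9, 1.6e9, 0.8e9, 0.35   # Pa, assumed ply properties
nu21 = nu12 * E2 / E1
den = 1 - nu12 * nu21
Q = np.array([[E1 / den, nu12 * E2 / den, 0],
              [nu12 * E2 / den, E2 / den, 0],
              [0, 0, G12]])

def qbar(theta_deg):
    """Ply stiffness rotated from the filament axes into the part axes."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    T = np.array([[c * c, s * s, 2 * s * c],
                  [s * s, c * c, -2 * s * c],
                  [-s * c, s * c, c * c - s * s]])
    R = np.diag([1.0, 1.0, 2.0])     # Reuter matrix for engineering shear strain
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

def effective_Ex(layup_deg, ply_t=0.2e-3):
    """In-plane modulus along x for a symmetric stack of equal-thickness plies."""
    A = sum(qbar(th) * ply_t for th in layup_deg)
    h = ply_t * len(layup_deg)
    a = np.linalg.inv(A)
    return 1.0 / (h * a[0, 0])

for layup in ([0, 0, 0, 0], [0, 90, 90, 0], [45, -45, -45, 45]):
    print(layup, f"-> Ex ~ {effective_Ex(layup) / 1e9:.2f} GPa")
```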
63

Computer solution to inverse problems of elliptic form: ∇²U(x,y) = g(a,U,x,y)

Jeter, Frederick Alvin 01 January 1971 (has links)
One important aspect of our present age of monolithic high-speed computers is the computer's capability to solve complex problems hitherto impossible to tackle due to their complexity. This paper explains how to use a digital computer to solve a specific type of problem: to find the inverse solution for a in the elliptic equation ∇²U(x,y) = g(a,U,x,y), with appropriate boundary conditions. This equation is very useful in the electronics field. The knowns are the complete set of boundary values of U(x,y) and a set of observations taken at internal points of U(x,y). Given this information, plus the specific form of the governing equation, we can solve for the unknown a. Once the computer program has been written using the techniques of quasilinearization, Newton's convergence method, discrete invariant imbedding, and sensitivity functions, data from the computer results are analyzed for proper convergence. These data show that there are definite limits to the usefulness and capability of the technique. One result of this study is the observation that, for the technique to function properly, the observations taken on U(x,y) must be placed in the most efficient locations and geometry within the region of greatest effectiveness. Another result concerns the number of observation points used: too few give insufficient information for proper program functioning, and too many tend to saturate the effectiveness of the observations. Thus this paper has two objectives: first, to develop the technique, and second, to analyze the results from realizing the technique through the use of a computer.
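As a modern, simplified illustration of the inverse problem's structure (not the quasilinearization and invariant-imbedding procedure developed in the paper), the sketch below solves a linear instance, ∇²U = aU + f, on a finite-difference grid and recovers the scalar a from a few interior observations by least squares; the source term, grid size, and observation locations are arbitrary choices for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy inverse problem: recover the scalar a in laplacian(U) = a*U + f(x, y)
# on the unit square with zero Dirichlet boundary values, given a few interior
# observations of U. This replaces the paper's technique with a direct solve
# plus a 1-D minimization, purely to show the structure of the problem.
n = 20                                   # interior grid points per side
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(np.pi * X) * np.sin(np.pi * Y)   # assumed source term

def laplacian_matrix(n, h):
    """Standard 5-point finite-difference Laplacian on an n x n interior grid."""
    I = np.eye(n)
    T = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    return (np.kron(I, T) + np.kron(T, I)) / h**2

L = laplacian_matrix(n, h)

def forward(a):
    """Solve (L - a*I) u = f for the field U given the parameter a."""
    u = np.linalg.solve(L - a * np.eye(n * n), f.ravel())
    return u.reshape(n, n)

# Synthetic "observations" at a few interior points for a known true a
a_true = 3.0
obs_idx = [(4, 4), (4, 15), (10, 10), (15, 4), (15, 15)]
U_true = forward(a_true)
obs = np.array([U_true[i, j] for i, j in obs_idx])

def misfit(a):
    U = forward(a)
    return np.sum((np.array([U[i, j] for i, j in obs_idx]) - obs) ** 2)

result = minimize_scalar(misfit, bounds=(0.0, 10.0), method="bounded")
print(f"recovered a = {result.x:.4f} (true value {a_true})")
```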
64

Finite Element Analysis of a Femur to Deconstruct the Design Paradox of Bone Curvature

Jade, Sameer 01 January 2012 (has links) (PDF)
The femur is the longest limb bone found in humans. Almost all the long limb bones found in terrestrial mammals, including the femur studied herein, have been observed to be loaded in bending and to be curved longitudinally. The curvature in these long bones increases the bending stress developed in the bone, potentially reducing the bone's load-carrying capacity, i.e., its mechanical strength. Therefore, bone curvature poses a paradox in terms of the mechanical function of long limb bones. The aim of this study is to investigate and explain the role of longitudinal bone curvature in the design of long bones. In particular, it has been hypothesized that the curvature of long bones results in a trade-off between the bone's mechanical strength and its bending predictability. This thesis employs finite element analysis of human femora to address this issue. Simplified human femora with different curvatures were modeled and analyzed using ANSYS Workbench finite element analysis software, and the results were compared across the different curvatures, including a straight bone, to examine how bone curvature affects the bending predictability and load-carrying capacity of bones. Results were post-processed to yield probability density functions (PDFs) for the circumferential location of maximum equivalent stress at various bone curvatures, as a measure of the bending predictability of the bones. To validate the findings from the geometrically simplified ANSYS Workbench femur models, a femur model digitally reconstructed from a CT scan of a real human femur was employed. For this model, finite element analysis was performed in the FEA tool Strand7, executing multiple simulations for different load cases. The results from the CT-scanned femur model and those from the CAD femur model were then compared. We found general agreement in trends but some quantitative differences, most likely due to the geometric differences between the digitally reconstructed femur model and the simplified CAD models. As postulated by others, our results support the hypothesis that bone curvature represents a trade-off between bone strength and bending predictability: bone curvature increases bending predictability at the expense of load-carrying capacity.
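To make the strength-penalty side of the paradox concrete, here is a back-of-the-envelope sketch (not the thesis' finite element model): an initially curved tubular shaft under axial load carries an extra bending moment equal to the load times the midshaft offset, so peak stress grows with curvature. The load and cross-section dimensions are assumed placeholders.

```python
import numpy as np

# Toy estimate of combined axial + bending stress at the midshaft of an
# initially curved, tubular "bone" under an axial load P. A longitudinal
# curvature offsets the shaft centroid by e from the load line, adding a
# bending moment M = P*e. All values below are rough placeholders.
P = 2000.0                 # axial load, N (assumed)
r_o, r_i = 0.013, 0.007    # outer/inner radii of the tubular cross-section, m
A = np.pi * (r_o**2 - r_i**2)
I = np.pi / 4 * (r_o**4 - r_i**4)

for e in (0.0, 0.002, 0.005, 0.010):        # midshaft offset (curvature), m
    sigma = P / A + P * e * r_o / I         # peak combined stress on the concave side, Pa
    print(f"offset {e*1000:4.1f} mm -> peak stress {sigma/1e6:6.2f} MPa")
```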
65

Towards Accessible, Usable Knowledge Frameworks in Engineering

Mcpherson, Jeffrey 01 January 2014 (has links) (PDF)
A substantial amount of research has been done in the field of engineering knowledge management, where countless ontologies have been developed for various applications within the engineering community. However, despite the success shown in these research efforts, the techniques have not been adopted by industry. This research aims to uncover the reasons for the slow adoption of engineering knowledge frameworks, namely ontologies, in industry. Two projects are covered in this thesis. The first is the development of a cross-domain ontology for the Biomesh Project, which spans the fields of mechanical engineering, biology, and anthropology. The biology community is known for its embrace of ontologies and has made their use quite popular through the creation of the Gene Ontology. That ontology spawned the establishment of the Open Biological and Biomedical Ontologies (OBO) Foundry, a consortium which approves and curates ontologies in the biology field. No such consortium exists in the field of engineering. This project demonstrates the usefulness of curated reference ontologies. Ontological knowledge bases from four different domains were imported and integrated to connect previously disparate information. A case study with data from the Biomesh Project demonstrates cross-domain queries and inferences that were not possible before the creation of this ontology. In the second part of this thesis we investigate the usability of current ontology tools. Protégé, the most popular ontology editing tool, is compared to OntoWiki, a semantic wiki. The comparison uses proven techniques from the field of human-computer interaction to uncover usability problems and point out areas where each system excels. A group of 16 subjects completed a set of tasks in each system and gave feedback based on their experience. It is shown that while OntoWiki offers users a satisfying interface, it falls short in some areas that can easily be improved. Protégé provides users with adequate functionality, but it is not intended for novice users.
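As a toy illustration of the kind of cross-domain query a merged ontology enables (all classes and properties below are invented for this sketch and are not drawn from the Biomesh ontology), a small RDF graph links an engineering model to a biological classification and is queried across both domains with rdflib:

```python
from rdflib import Graph, Namespace, RDF, RDFS

# Toy cross-domain ontology sketch: every class and property here is invented
# for illustration; it is not the Biomesh ontology. An engineering resource is
# linked to a biological one so a single query can span both domains.
ENG = Namespace("http://example.org/engineering#")
BIO = Namespace("http://example.org/biology#")

g = Graph()
g.bind("eng", ENG)
g.bind("bio", BIO)

# Engineering domain: a finite element model and what it represents
g.add((ENG.FemurMesh, RDF.type, ENG.FiniteElementModel))
g.add((ENG.FemurMesh, ENG.modelsStructure, BIO.Femur))
# Biology domain: anatomical classification
g.add((BIO.Femur, RDF.type, BIO.LongBone))
g.add((BIO.LongBone, RDFS.subClassOf, BIO.Bone))

# Cross-domain query: which finite element models represent some kind of bone?
q = """
SELECT ?model ?structure WHERE {
    ?model a eng:FiniteElementModel ;
           eng:modelsStructure ?structure .
    ?structure a ?cls .
    ?cls rdfs:subClassOf* bio:Bone .
}
"""
for model, structure in g.query(q, initNs={"eng": ENG, "bio": BIO, "rdfs": RDFS}):
    print(model, "->", structure)
```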
66

Application of Finite Element Method in Protein Normal Mode Analysis

Hsu, Chiung-fang 01 January 2013 (has links) (PDF)
This study proposed a finite element procedure for protein normal mode analysis (NMA). The finite element model adopted the protein solvent-excluded surface to generate a homogeneous and isotropic volume. A simplified triangular approximation of the coarse molecular surface was generated from the original surface model using a Gaussian-based blurring technique. Similar to the widely adopted elastic network model, the finite element model holds a major advantage over standard all-atom normal mode analysis: the computationally expensive energy-minimization step, which may distort the initial protein structure, is eliminated. This modification significantly increases the efficiency of normal mode analysis. In addition, by capturing the protein's shape properties, the finite element model preserves the capability of normal mode analysis to describe low-frequency, high-collectivity molecular motions. Encouraging results from six protein models in this study support the capability of the finite element model for protein normal mode analysis.
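Whether based on an elastic network, all atoms, or finite elements, NMA ultimately reduces to the generalized eigenproblem K φ = ω² M φ for a stiffness matrix K and mass matrix M. The sketch below solves it for a toy spring-mass chain; the matrices are placeholders, not ones assembled from a protein surface mesh.

```python
import numpy as np
from scipy.linalg import eigh

# Core eigenproblem of normal mode analysis, K phi = omega^2 M phi, solved for
# a toy 1-D chain of point masses connected by springs (a stand-in for an FE
# or elastic-network stiffness/mass pair; stiffness and mass are placeholders).
n = 8                      # number of nodes in the toy chain
k, m = 1.0, 1.0            # spring stiffness and nodal mass (placeholders)

K = np.zeros((n, n))
for i in range(n - 1):     # assemble stiffness from n-1 springs
    K[i, i] += k;  K[i + 1, i + 1] += k
    K[i, i + 1] -= k;  K[i + 1, i] -= k
M = m * np.eye(n)

omega2, modes = eigh(K, M)                  # generalized symmetric eigenproblem
freqs = np.sqrt(np.clip(omega2, 0, None))
print("lowest frequencies:", np.round(freqs[:4], 4))   # first mode ~0 (rigid-body translation)
```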
67

Lattice Boltzmann-based Sharp-interface schemes for conjugate heat and mass transfer and diffuse-interface schemes for Dendritic growth modeling

Wang, Nanqiao 13 May 2022 (has links) (PDF)
Analyses of heat and mass transfer between different materials and phases are essential in numerous fundamental scientific problems and practical engineering applications, such as thermal and chemical transport in porous media, the design of heat exchangers, dendritic growth during solidification, and thermal/mechanical analysis of additive manufacturing processes. In numerical simulation, interface treatments can be divided into sharp-interface schemes and diffuse-interface schemes according to the morphological features of the interface. This work focuses on the following subjects through computational studies: (1) critical evaluation of the various sharp-interface schemes in the literature for conjugate heat and mass transfer modeling with the lattice Boltzmann method (LBM), (2) development of a novel sharp-interface scheme in the LBM for conjugate heat and mass transfer between materials/phases with very high transport property ratios, and (3) development of a new diffuse-interface phase-field/lattice Boltzmann method (PFM/LBM) for dendritic growth and solidification modeling. For the comparison of previous sharp-interface schemes in the LBM, the numerical accuracy and convergence orders are scrutinized with representative test cases involving both straight and curved geometries. The proposed sharp-interface scheme is validated against both published results in the literature and in-house experimental measurements of the effective thermal conductivity (ETC) of porous lattice structures. Furthermore, analytical correlations for the normalized ETC are proposed for various material pairs over the entire range of porosity based on the detailed LBM simulations; in addition, a modified correlation is provided for the SS420-air and SS316L-air material pairs in the high-porosity range for specific applications. The present PFM/LBM model has several improved features compared to those in the literature and is capable of modeling dendritic growth with fully coupled melt flow and thermosolutal convection-diffusion. The applicability and accuracy of the PFM/LBM model are verified with numerical tests including isothermal, iso-solutal, and thermosolutal convection-diffusion problems in both 2D and 3D. Furthermore, the effects of natural convection on the growth of multiple crystals are numerically investigated.
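For orientation, the sketch below is a minimal single-material D2Q5 BGK lattice Boltzmann solver for pure diffusion of a temperature field, the basic building block that conjugate-transfer schemes extend with interface conditions between two materials; the grid size, relaxation time, and initial condition are arbitrary placeholders.

```python
import numpy as np

# Minimal single-material D2Q5 BGK lattice Boltzmann sketch for pure diffusion
# of a temperature field. Grid size, relaxation time, and the initial hot patch
# are arbitrary placeholders; no interface treatment is included here.
nx, ny, steps = 64, 64, 500
tau = 0.8                                   # relaxation time (sets diffusivity)
w = np.array([1/3, 1/6, 1/6, 1/6, 1/6])     # D2Q5 weights
cx = np.array([0, 1, -1, 0, 0])
cy = np.array([0, 0, 0, 1, -1])

T = np.zeros((nx, ny))
T[nx // 4: 3 * nx // 4, ny // 4: 3 * ny // 4] = 1.0   # hot square patch
f = w[:, None, None] * T[None, :, :]                  # initialize at equilibrium

for _ in range(steps):
    T = f.sum(axis=0)                                 # macroscopic temperature
    feq = w[:, None, None] * T[None, :, :]
    f += (feq - f) / tau                              # BGK collision
    for q in range(5):                                # streaming (periodic domain)
        f[q] = np.roll(np.roll(f[q], cx[q], axis=0), cy[q], axis=1)

alpha = (tau - 0.5) / 3.0   # thermal diffusivity in lattice units (c_s^2 = 1/3)
print(f"lattice diffusivity ~ {alpha:.3f}, mean T = {T.mean():.4f}")
```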
68

Molecular dynamics of high temperature hydrogen attack

Bodden Connor, Mike Travis 09 December 2022 (has links) (PDF)
High temperature hydrogen attack (HTHA) is a damage mechanism that only affects carbon steel and low-alloy materials. Most of the data regarding HTHA are experimentally driven. Even though this approach has been successful, there is still much that the oil and gas industry does not understand about HTHA: regions that were considered safe (below the Nelson curves) have experienced catastrophic failures. Our research consists of performing Molecular Dynamics (MD) and Nudged Elastic Band (NEB) calculations of HTHA to better understand the atomistic behavior of this damage mechanism.
69

Surrogate model-based design optimization of a mobile deployable structure for overpressure load and vehicular impact mitigation

Tellkamp, Daniela F 09 December 2022 (has links) (PDF)
Artificial Neural Network (ANN) ensemble and Response Surface Method (RSM) surrogate models were generated from Finite Element (FE) simulations to predict the overpressure load and vehicle impact response of a novel rapidly deployable protective structure. A Non-dominated Sorting Genetic Algorithm-II (NSGA-II) was used in conjunction with the surrogate models to determine the structure-topology input-variable configurations best suited to produce the optimal balance of minimum mass, minimum rotation angle, minimum displacement, and maximum total length of the deployable structure. The structure was designed to retract into a container, to be lightweight to facilitate transportation, and to adapt to varying terrain slopes. This research demonstrates that, in comparison to the RSM, ANN ensembles can be used more accurately and efficiently to identify optimal design solutions for multi-objective design problems when two surrogate models from the same method, corresponding to separate FE models, are used simultaneously in an NSGA-II.
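A minimal sketch of the two ingredients, an ANN-ensemble surrogate and a non-dominated (Pareto) filter over its predictions, is given below on invented toy data; the actual study couples such surrogates to NSGA-II, which is replaced here by a simple random candidate sweep.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy sketch (invented data, not the FE results of the thesis): (1) a bootstrap
# ensemble of small neural-network surrogates, and (2) a non-dominated (Pareto)
# filter over surrogate predictions for two competing objectives.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))                             # 3 toy design variables
y_mass = X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.02, 200)      # objective 1 (minimize)
y_rot = (1 - X[:, 0]) ** 2 + X[:, 2] + rng.normal(0, 0.02, 200)  # objective 2 (minimize)

def train_ensemble(X, y, n_members=5):
    """Bootstrap an ensemble of MLP surrogates; predictions are averaged."""
    members = []
    for seed in range(n_members):
        idx = rng.integers(0, len(X), len(X))
        m = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                         random_state=seed).fit(X[idx], y[idx])
        members.append(m)
    return members

def predict(members, X):
    return np.mean([m.predict(X) for m in members], axis=0)

ens_mass = train_ensemble(X, y_mass)
ens_rot = train_ensemble(X, y_rot)

cand = rng.uniform(0, 1, size=(1000, 3))             # random candidate designs
F = np.column_stack([predict(ens_mass, cand), predict(ens_rot, cand)])

def pareto_front(F):
    """Indices of candidates not dominated when minimizing all objectives."""
    keep = []
    for i, f in enumerate(F):
        dominated = np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(F)
print(f"{len(front)} non-dominated candidate designs out of {len(cand)}")
```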
70

Modeling and Numerical Investigation of Hot Gas Defrost on a Finned Tube Evaporator Using Computational Fluid Dynamics

Ha, Oai The 01 November 2010 (has links) (PDF)
Defrosting in the refrigeration industry is used to remove the frost layer that accumulates on evaporators after a period of running time, and is one way to improve the energy efficiency of refrigeration systems. There are many studies of the defrosting process, but none of them use computational fluid dynamics (CFD) simulation. The purpose of this thesis is (1) to develop a defrost model using the commercial CFD solver FLUENT to numerically simulate the melting of frost coupled with the heat and mass transfer taking place during defrosting, and (2) to investigate the thermal response of the evaporator and the defrost time for different hot gas temperatures and frost densities. A 3D geometry of a finned-tube evaporator was developed and meshed using Gambit 2.4.6, while the numerical computations were conducted using FLUENT 12.1. The solidification and melting model is used to simulate the melting of frost, and the Volume of Fluid (VOF) model is used to render the surface between the frost and the melted frost during defrosting. A user-defined function in the C programming language was written to model the frost evaporation and sublimation taking place on the free surface between frost and air. The model was run under different hot gas temperatures and frost densities, and the results were analyzed to show the effects of these parameters on defrost time, input energy, and the energy stored in the metal mass of the evaporator. The analyses demonstrate that an optimal hot gas temperature can be identified so that the defrosting process takes place in the shortest possible melting time and with the lowest possible input energy.
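The solidification-and-melting approach referred to above is, at its core, an enthalpy method: track total enthalpy, recover temperature and liquid fraction from it, and diffuse heat. The sketch below shows a one-dimensional version for a frost slab heated from one side; the material properties and boundary temperatures are assumed placeholders, and it is not the FLUENT/VOF model used in the thesis.

```python
import numpy as np

# 1-D enthalpy-method sketch of a melting frost slab heated from the left wall.
# Temperature and liquid fraction are recovered from the enthalpy field, and
# heat is diffused explicitly. All properties below are rough placeholders.
nx, L, dt, steps = 100, 0.01, 0.005, 4000
dx = L / nx
rho, cp, k_cond, Lf = 500.0, 2000.0, 0.5, 334000.0   # assumed frost properties
Tm, T_hot, T0 = 0.0, 20.0, -10.0                     # melt, wall, initial temps (deg C)

H_sol = rho * cp * Tm          # enthalpy at which melting starts
H_liq = H_sol + rho * Lf       # enthalpy at which melting completes
H = rho * cp * T0 * np.ones(nx)

def temp_and_fraction(H):
    """Recover temperature and liquid fraction from volumetric enthalpy."""
    T = np.where(H < H_sol, H / (rho * cp), Tm)
    T = np.where(H > H_liq, Tm + (H - H_liq) / (rho * cp), T)
    fl = np.clip((H - H_sol) / (rho * Lf), 0.0, 1.0)
    return T, fl

for _ in range(steps):
    T, fl = temp_and_fraction(H)
    Tb = np.concatenate(([T_hot], T, [T[-1]]))       # hot left wall, insulated right end
    H += dt * k_cond * (Tb[2:] - 2 * Tb[1:-1] + Tb[:-2]) / dx**2

T, fl = temp_and_fraction(H)
print(f"melt front position ~ {np.sum(fl > 0.5) * dx * 1000:.2f} mm after {steps * dt:.0f} s")
```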
