11

Testing Accuracy and Convergence of GPUSPH for Free-Surface Flows

Rooney, Erin Ann, August 2011
The effect of vegetation on the dissipation of waves is important in understanding vegetation's role in protecting coastal communities during extreme events such as hurricanes and tsunamis. Numerical modeling makes it possible to study the flow through vegetation fields, but it is important to understand the flow dynamics around a single piece of vegetation, and to validate the numerical model used, before the dynamics of an entire vegetated patch can be modeled and understood. This project validated GPUSPH, a Lagrangian mesh-free numerical model, by determining the optimal characteristics for accurate simulations of flow through a flume with and without an obstruction. The validation of GPUSPH and the determination of optimal characteristics were accomplished by varying the model particle spacing, the inclusion of a sub-particle scale (SPS) turbulence term in the conservation of momentum equation, and the kernel weighting function for two test cases. The model particle spacing sets the initial distance between the moving grid points, known as particles, in the system. The SPS turbulence term is intended to account for turbulence generated at the sub-particle scale between the particles. The kernel weighting functions used are the quadratic kernel and the cubic spline kernel; these kernels determine how much influence surrounding particles have on the flow characteristics of an individual particle. The numerical results of these tests were compared with experimental results to draw conclusions about the accuracy of the simulations. Based on comparisons with experimental velocities and forces, the optimal particle spacing was found to occur when the number of particles was in the high 100,000s for single-precision calculations, the mid-range of capability for the hardware used in this project. The sub-particle scale turbulence term was only necessary when there was large-scale turbulence in the system, and produced less accurate results when no large-scale turbulence was present. There was no definitive conclusion regarding the best kernel weighting function, because neither kernel produced consistently more accurate results than the other. Based on these conclusions, GPUSPH was shown to be a viable option for modeling free-surface flows under suitable conditions for the particle spacing and the inclusion of the sub-particle scale turbulence term.
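The two kernels named in the abstract are standard SPH weighting functions. Below is a minimal Python sketch of common 3D forms of both (the exact coefficients used inside GPUSPH may differ; this is illustrative, not the project's implementation):

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Monaghan's cubic spline kernel in 3D (support radius 2h)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)          # 3D normalization constant
    w = np.where(q <= 1.0,
                 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def quadratic_kernel(r, h):
    """A quadratic SPH kernel in 3D (support radius 2h)."""
    q = r / h
    sigma = 15.0 / (16.0 * np.pi * h**3)  # 3D normalization constant
    return np.where(q <= 2.0, sigma * (0.25 * q**2 - q + 1.0), 0.0)

# Example: a particle's density is a kernel-weighted sum over neighbors.
h = 0.01                                   # smoothing length (metres)
r = np.array([0.0, 0.005, 0.012, 0.019])   # distances to nearby particles
mass = 1e-6                                # uniform particle mass (kg)
density = mass * cubic_spline_kernel(r, h).sum()
```

The choice of kernel trades smoothness of the weighting against cost: both vanish beyond 2h, so only particles inside that support radius influence one another.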
12

Empirical Likelihood Confidence Intervals for ROC Curves with Missing Data

An, Yueheng, 25 April 2011
The receiver operating characteristic (ROC) curve is widely used to evaluate the diagnostic performance of a test, in other words, the accuracy with which a test discriminates normal cases from diseased cases. In biomedical studies we often encounter missing data, to which regular inference procedures cannot be applied directly. In this thesis, random hot deck imputation is used to obtain a 'complete' sample, and empirical likelihood (EL) confidence intervals are then constructed for ROC curves. The empirical log-likelihood ratio statistic is derived, and its asymptotic distribution is proved to be a weighted chi-square distribution. The results of a simulation study show that the EL confidence intervals perform well in terms of coverage probability and average length for various sample sizes and response rates.
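Random hot deck imputation, as used above, fills each missing value with one drawn at random from the observed values of the same variable. A minimal sketch under that reading (variable names are illustrative, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def random_hot_deck(values):
    """Replace NaNs by values drawn uniformly (with replacement)
    from the observed (non-missing) entries of the same array."""
    values = np.asarray(values, dtype=float)
    missing = np.isnan(values)
    donors = values[~missing]              # the observed 'donor pool'
    imputed = values.copy()
    imputed[missing] = rng.choice(donors, size=missing.sum())
    return imputed

# Example: diseased-group test scores with two missing responses.
scores = np.array([1.2, np.nan, 0.7, 2.3, np.nan, 1.8])
complete = random_hot_deck(scores)
```

The EL confidence intervals are then built on the imputed 'complete' sample; the weighted chi-square limit accounts for the extra variability that imputation introduces.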
13

Modelling Probability Distributions from Data and its Influence on Simulation

Hörmann, Wolfgang; Bayar, Onur, January 2000
Generating random variates as a generalisation of a given sample is an important task for stochastic simulations. The three main methods suggested in the literature are: fitting a standard distribution, constructing an empirical distribution that approximates the cumulative distribution function, and generating variates from the kernel density estimate of the data. The last method is practically unknown in the simulation literature, although it is as simple as the other two. A comparison of the theoretical performance of the methods and the results of three small simulation studies show that a variance-corrected version of kernel density estimation performs best and should be used for generating variates directly from a sample. (Author's abstract. Series: Preprint Series, Department of Applied Statistics and Data Processing.)
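Generating variates from a kernel density estimate amounts to a smoothed bootstrap: resample a data point and add kernel noise. The variance correction rescales so the output variance matches the sample variance rather than being inflated by the bandwidth. A sketch assuming a Gaussian kernel with Silverman's rule-of-thumb bandwidth (an assumption; the paper compares several variants):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def kde_variates(sample, n, h=None):
    """Generate n variates from a Gaussian kernel density estimate of
    `sample`, variance-corrected so Var(output) matches the sample."""
    x = np.asarray(sample, dtype=float)
    m = x.size
    s = x.std(ddof=1)
    if h is None:                          # Silverman's rule-of-thumb bandwidth
        h = 1.06 * s * m ** (-1 / 5)
    xbar = x.mean()
    picks = rng.choice(x, size=n)          # smoothed bootstrap resample
    noise = rng.standard_normal(n)         # Gaussian kernel noise
    # Without the denominator, Var = s**2 + h**2; rescaling restores s**2.
    return xbar + (picks - xbar + h * noise) / np.sqrt(1 + h**2 / s**2)

data = rng.exponential(scale=2.0, size=50)
variates = kde_variates(data, n=1000)
```

The appeal of the method is exactly what the abstract claims: it is no harder to implement than fitting a standard distribution, yet it adapts to the shape of the data.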
14

Empirical Likelihood Confidence Intervals for the Difference of Two Quantiles with Right Censoring

Yau, Crystal Cho Ying, 21 November 2008
In this thesis, we study two independent samples under right censoring. Using a smoothed empirical likelihood method, we investigate the difference of quantiles between the two samples and construct pointwise confidence intervals for it. The empirical log-likelihood ratio is proposed, and its asymptotic limit is shown to be a chi-squared distribution. In simulation studies, we compare the empirical likelihood method with the normal approximation method in terms of coverage accuracy and average length of the confidence intervals, and conclude that the empirical likelihood method performs better. Finally, real clinical trial data and numerical examples are presented to illustrate the efficacy of the method.
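Under right censoring, a sample quantile is typically read off the Kaplan-Meier survival estimate. The following numpy sketch shows the point estimate that such confidence intervals are centered on; it assumes no tied event times and does not reproduce the smoothed empirical likelihood machinery itself:

```python
import numpy as np

def km_quantile(times, events, p):
    """p-th quantile from the Kaplan-Meier estimator:
    smallest time t with estimated survival S(t) <= 1 - p.
    Assumes no tied times (a simplification of the usual estimator)."""
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(events)[order]
    n = len(t)
    at_risk = n - np.arange(n)             # subjects still at risk
    factors = np.where(d == 1, 1 - 1 / at_risk, 1.0)
    surv = np.cumprod(factors)             # Kaplan-Meier survival curve
    idx = np.searchsorted(-surv, -(1 - p)) # first index with surv <= 1 - p
    return t[idx] if idx < n else np.inf

# Difference of medians between two right-censored samples
# (events: 1 = observed failure, 0 = censored); data are illustrative.
t1 = [3, 5, 6, 8, 12, 15]; e1 = [1, 1, 0, 1, 1, 0]
t2 = [2, 4, 5, 7, 9, 11];  e2 = [1, 0, 1, 1, 1, 1]
diff = km_quantile(t1, e1, 0.5) - km_quantile(t2, e2, 0.5)
```

The thesis's contribution is the interval around this difference: the smoothed empirical log-likelihood ratio is calibrated against its chi-squared limit instead of relying on a normal approximation.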
15

Physically-based baking animation using smoothed particle hydrodynamics for non-Newtonian fluids

Rodriguez-Arenas, Omar Isidro Unknown Date
No description available.
16

A GPU Accelerated Smoothed Particle Hydrodynamics Capability For Houdini

Sanford, Mathew, August 2012
Fluid simulations are computationally intensive and therefore time consuming and expensive. In the field of visual effects, it is imperative that artists be able to move efficiently through iterations of a simulation to converge quickly on the desired result. One common fluid simulation technique is the Smoothed Particle Hydrodynamics (SPH) method, which is highly parallelizable. I have implemented a method to integrate a Graphics Processing Unit (GPU) accelerated SPH capability into the 3D software package Houdini, which helps increase the speed with which artists move through these iterations. This approach is extensible, allowing future acceleration of the algorithm with new SPH techniques. Emphasis is placed on the infrastructure design so that it can also serve as a guideline both for GPU programming and for integrating custom code with Houdini.
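SPH is "highly parallelizable" because each particle's update depends only on its neighbors, so the per-particle work maps naturally onto one GPU thread per particle. This numpy sketch stands in for that per-particle density pass (illustrative only; it is not the thesis's Houdini plugin, and positions and masses are assumed):

```python
import numpy as np

def sph_density(pos, mass, h):
    """Brute-force SPH density: rho_i = sum_j m_j * W(|r_i - r_j|, h),
    cubic spline kernel in 3D. A real GPU version replaces the
    all-pairs distance with a spatial hash so each thread visits
    only nearby particles."""
    diff = pos[:, None, :] - pos[None, :, :]     # pairwise offsets
    r = np.linalg.norm(diff, axis=-1)            # pairwise distances
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    w = np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return (mass[None, :] * sigma * w).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 0.1, size=(256, 3))       # 256 particles in a box
mass = np.full(256, 1e-6)
rho = sph_density(pos, mass, h=0.01)
```

Each row of the pairwise computation is independent, which is exactly the structure a GPU exploits.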
17

A new smooth particle hydrodynamics scheme for 3D free surface flows

Ferrari, Angela, January 2009
Also published as dissertation: University of Stuttgart.
18

Parallel object oriented simulation with Lagrangian particle methods

Fleissner, Florian, January 2009
Also published as dissertation: University of Stuttgart, 2009.
19

Analyse de complexité d'enveloppes convexes aléatoires / Complexity analysis of random convex hulls

Thomasse, Rémy, 18 December 2015
In this thesis, we give new results on the expected size of convex hulls of points chosen in a convex body. This size is known when the points are chosen uniformly (and independently) in a convex polytope or in a sufficiently "smooth" convex body, and also when the points are chosen independently according to a centered Gaussian distribution. In the first part of this thesis, we introduce a technique that gives new results when the points are chosen arbitrarily in a convex body and then noised by random perturbations. This kind of analysis, called smoothed analysis, was initially developed by Spielman and Teng in their study of the simplex algorithm. For an arbitrary set of points in a ball, we obtain lower and upper bounds on this smoothed complexity, in the case of uniform perturbations in a ball (in arbitrary dimension) and in the case of Gaussian perturbations in dimension 2. The asymptotic behavior of the expected size of the convex hull of uniformly random points in a convex body is polynomial for a "smooth" body and polylogarithmic for a polytope. In the second part, we construct a convex body such that the expected size of the convex hull of points chosen uniformly in that body oscillates between these two behaviors as the number of points increases. In the last part, we present an algorithm to efficiently generate a random convex hull of points chosen uniformly and independently in a disk, without explicitly generating all the points, and we compute its average time and space complexity. The algorithm has been implemented in C++ and integrated in the CGAL library.
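For intuition, the quantity studied here, the expected number of hull vertices, is easy to estimate by direct simulation. The naive sketch below uses scipy and generates every point, unlike the thesis's CGAL algorithm, which is designed to avoid exactly that:

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(7)

def mean_hull_size(n_points, trials=50):
    """Monte Carlo estimate of the expected number of vertices of the
    convex hull of n_points uniform points in the unit disk."""
    sizes = []
    for _ in range(trials):
        theta = rng.uniform(0, 2 * np.pi, n_points)
        r = np.sqrt(rng.uniform(0, 1, n_points))  # sqrt for uniform area density
        pts = np.column_stack((r * np.cos(theta), r * np.sin(theta)))
        sizes.append(len(ConvexHull(pts).vertices))
    return np.mean(sizes)

# For a disk (a "smooth" body) the expectation grows polynomially,
# like n**(1/3) in the plane; for a polygon it would grow as log n.
for n in (100, 1000, 10000):
    print(n, mean_hull_size(n))
```

The contrast between the n^(1/3) growth for the disk and the logarithmic growth for polygons is precisely the gap in which the thesis constructs its oscillating convex body.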
20

Laser micromachining of coronary stents for medical applications

Muhammad, Noorhafiza Binti, January 2012
This PhD thesis reports an investigation into medical coronary stent cutting using three different types of lasers and the associated physical phenomena. The study is motivated by a gap in the current knowledge of stent cutting identified in an extensive literature review. Although lasers are widely used for stent cutting, the laser technology employed is in general still based on traditional millisecond pulsed Nd:YAG lasers. Recent studies have demonstrated the use of fibre lasers and picosecond and femtosecond lasers for stent cutting, but these have been preliminary studies. To further understand the role of these newer laser types, stent cutting with all three was investigated in this project.

The first investigation examined a new cutting method, water-assisted pulsed (millisecond) fibre laser cutting of stainless steel 316L tubes, to explore the advantages of the presence of water compared with the dry cutting condition. Significant improvements were observed in the presence of water: narrower kerf width, lower surface roughness, less dross attachment, absence of backwall damage, and a smaller heat-affected zone (HAZ). This technique is now fully commercialised by Swisstec, an industrial project partner that manufactures stent cutting machines.

The second investigation used a picosecond laser (6 ps pulse duration in the UV wavelength range) for cutting nickel-titanium alloy (nitinol) and platinum-iridium alloy. The main achievement of this study was obtaining a dross-free cut as well as a clean backwall, which may eliminate the need for extensive post-processing. Picosecond laser cutting of stents is investigated and reported here for the first time.

The third investigation used a femtosecond laser at 100 fs pulse duration for cutting nickel-titanium alloy tubes. It was found that dry cutting degraded the cut quality through debris and recast formation. To improve this, a water-assisted cutting technique was undertaken, for the first time, by submerging the workpiece in a thin layer of water for comparison with the dry cutting condition.

The final part of the thesis presents a three-dimensional numerical model of the laser micromachining process using smoothed particle hydrodynamics (SPH). The model was used to provide a better understanding of the interaction between a static laser beam and the material, including the penetration depth achieved, phase changes, melt ejection velocity, and recast and spatter formation. Importantly, the model also simulated the wet machining condition, capturing the role of water in removing the melt ejected during the process, which avoids backwall damage. Results from the millisecond pulsed fibre laser were used for validation. The conclusions reached in this project and recommendations for future work are included. The work has resulted in the publication of 3 journal papers and 2 additional journal paper submissions.
