  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
181

Parallel Mesh Adaptation and Graph Analysis Using Graphics Processing Units

McGuiness, Timothy P 01 January 2011 (has links) (PDF)
In the field of Computational Fluid Dynamics, several types of mesh adaptation strategies are used to enhance a mesh's quality, thereby improving simulation speed and accuracy. Mesh smoothing (r-refinement) is a simple and effective technique, where nodes are repositioned to increase or decrease local mesh resolution. Mesh partitioning divides a mesh into sections for use on distributed-memory parallel machines. As a more abstract form of modeling, graph theory can be used to simulate many real-world problems and has applications in computer science, sociology, engineering, and transportation, to name a few. One of the more important graph analysis tasks involves moving through the graph to evaluate and calculate nodal connectivity. The basic structures of meshes and graphs are the same, as both rely heavily on connectivity information representing the relationships between constituent nodes and edges. This research examines the parallelization of these algorithms using commodity graphics hardware, a low-cost tool readily available to the computing community. It considers not only the benefits of the fine-grained parallelism of an individual graphics processor, but also the use of the Message Passing Interface (MPI) on large-scale GPU-based supercomputers.
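The r-refinement idea described above can be sketched with a simple Laplacian smoother, in which each free node is pulled toward the centroid of its mesh neighbors. This is a serial NumPy illustration of the general technique, not the author's GPU implementation; the function and variable names are hypothetical.

```python
import numpy as np

def laplacian_smooth(coords, neighbors, n_iters=10, alpha=0.5):
    """One common r-refinement scheme: move each free node toward the
    centroid of its connected neighbors. `neighbors` maps a free node's
    index to the indices of its adjacent nodes (the mesh connectivity).
    `alpha` controls how far each node moves per iteration."""
    coords = np.asarray(coords, dtype=float).copy()
    for _ in range(n_iters):
        new = coords.copy()
        for i, nbrs in neighbors.items():
            centroid = coords[list(nbrs)].mean(axis=0)
            new[i] = (1 - alpha) * coords[i] + alpha * centroid
        coords = new
    return coords

# Tiny example: one perturbed interior node between four fixed corners.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.7, 0.8]])
adj = {4: [0, 1, 2, 3]}          # only the interior node is free to move
smoothed = laplacian_smooth(pts, adj, n_iters=50)
# The interior node relaxes toward the centroid of its neighbors, (0.5, 0.5).
```

On a GPU, the per-node update is the natural unit of fine-grained parallelism, since every node's new position depends only on the previous iteration's coordinates.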
182

Validation of Criteria Used to Predict Warfarin Dosing Decisions

Thomas, Nicole 13 May 2004 (has links) (PDF)
People at risk for blood clots are often treated with anticoagulants; warfarin is one such anticoagulant. A dose's effect is measured by comparing the time it takes blood to clot against a control time, reported as an INR value. Previous anticoagulant studies have addressed agreement between fingerstick point-of-care (POC) devices and the standard laboratory; however, these studies rely on mathematical formulas as criteria for clinical evaluations (i.e., clinical evaluation vs. precision and bias). Fourteen such criteria were found in the literature. There is little consistency among these criteria for assessing clinical agreement; furthermore, whether they are reasonable estimates of clinical decision-making is unknown and has yet to be validated. One previous study compared actual clinical agreement by having two physicians indicate a dosing decision based on patient history and INR values. This analysis attempts to justify previously used mathematical criteria for clinical agreement. Generalized additive models with smoothing spline estimates were calculated for each of the 14 criteria and compared to the smoothing spline estimate for the method using actual physician decisions (considered the "gold standard"). The area between each criterion's spline and the gold-standard spline served as the comparison, with bootstrapping used for statistical inference. Although some of the criteria methods performed better than others, none matched the gold standard. This stresses the need for clinical assessment of devices.
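The comparison strategy this abstract describes (area between a criterion's fitted curve and the gold-standard curve, with bootstrap inference) can be sketched as follows. This is an illustration on synthetic numbers: the thesis fits generalized additive models with smoothing splines, while this sketch substitutes a low-degree polynomial smoother to stay dependency-free, and all data and names are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_fit(x, y, deg=3):
    """Stand-in smoother: a low-degree polynomial fit (the thesis uses
    smoothing-spline estimates inside generalized additive models)."""
    return np.poly1d(np.polyfit(x, y, deg))

def area_between(x, y_crit, y_gold, n_boot=300):
    """Point estimate and bootstrap percentile interval for the area
    between a criterion's fitted curve and the gold-standard curve."""
    grid = np.linspace(x.min(), x.max(), 200)

    def area(xs, yc, yg):
        diff = np.abs(smooth_fit(xs, yc)(grid) - smooth_fit(xs, yg)(grid))
        return float(np.sum((diff[1:] + diff[:-1]) / 2 * np.diff(grid)))

    est = area(x, y_crit, y_gold)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, x.size, x.size)   # resample cases with replacement
        boots.append(area(x[idx], y_crit[idx], y_gold[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return est, (lo, hi)

# Synthetic illustration (hypothetical numbers, not the thesis data):
x = np.sort(rng.uniform(1.0, 5.0, 120))          # INR values
gold = 1 / (1 + np.exp(-(x - 2.5)))              # physician dose-change rate
crit = 1 / (1 + np.exp(-(x - 3.0)))              # a shifted criterion's rate
est, ci = area_between(x, crit, gold)
```

A criterion whose spline coincides with the gold-standard spline would give an area near zero; the bootstrap interval indicates whether the observed discrepancy is distinguishable from resampling noise.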
183

Analysis Using Smoothing Via Penalized Splines as Implemented in LME() in R

Howell, John R. 16 February 2007 (has links) (PDF)
Spline smoothers as implemented in common mixed model software provide a familiar framework for estimating semi-parametric and non-parametric models. Following a review of literature on splines and mixed models, details for implementing mixed model splines are presented. The examples use an experiment in the health sciences to demonstrate how to use mixed models to generate the smoothers. The first example takes a simple one-group case, while the second example fits an expanded model using three groups simultaneously. The second example also demonstrates how to fit confidence bands to the three-group model. The examples use mixed model software as implemented in lme() in R. Following the examples a discussion of the method is presented.
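The mixed-model representation of a penalized spline that lme() exploits can also be written out directly as a penalized least-squares solve. The sketch below is in Python rather than R, uses a truncated-line basis, and treats the smoothing parameter as fixed (lme() would estimate it as a variance ratio); all names and numbers are illustrative.

```python
import numpy as np

def pspline_fit(x, y, knots, lam):
    """Penalized spline in its mixed-model form: fixed effects
    (intercept, slope) plus truncated-line random effects at the knots.
    `lam` plays the role of the variance ratio that a mixed-model fit
    such as lme() would estimate; only the knot coefficients are
    penalized, never the fixed part."""
    X = np.column_stack([np.ones_like(x), x])             # fixed effects
    Z = np.maximum(x[:, None] - knots[None, :], 0.0)      # (x - k)+ basis
    C = np.hstack([X, Z])
    D = np.diag([0.0, 0.0] + [lam] * len(knots))          # ridge on u only
    coef = np.linalg.solve(C.T @ C + D, C.T @ y)          # penalized LS
    return C @ coef

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
fit = pspline_fit(x, y, knots=np.linspace(0.05, 0.95, 15), lam=1.0)
```

Writing the smoother this way makes the connection in the abstract concrete: the knot coefficients are exactly the "random effects" a mixed model shrinks toward zero, which is why generic mixed-model software can produce the smoother.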
184

Smoothing Parameter Selection In Nonparametric Functional Estimation

Amezziane, Mohamed 01 January 2004 (has links)
This study develops new techniques for obtaining fully data-driven choices of the smoothing parameter in functional estimation, within the confines of minimal assumptions. The focus is the estimation of the distribution function, the density function, and their multivariate extensions, along with some of their functionals, such as the location and the integrated squared derivatives.
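One classical, fully data-driven selector of the kind this abstract targets is least-squares cross-validation for a Gaussian kernel density estimate. A minimal sketch, assuming a grid search over candidate bandwidths; this is one standard method, not necessarily the one developed in the dissertation.

```python
import numpy as np

def lscv_bandwidth(data, grid):
    """Least-squares cross-validation: choose the bandwidth minimizing
    an unbiased estimate of the integrated squared error of a Gaussian
    kernel density estimate."""
    n = data.size

    def phi(u):  # standard normal density
        return np.exp(-0.5 * u * u) / np.sqrt(2 * np.pi)

    def score(h):
        d = (data[:, None] - data[None, :]) / h
        # Integral of fhat^2: the Gaussian kernel convolved with itself
        # is a N(0, 2) density, hence the sqrt(2) rescaling.
        term1 = phi(d / np.sqrt(2)).sum() / (np.sqrt(2) * n * n * h)
        # Leave-one-out cross term (drop the n diagonal entries phi(0)).
        loo = (phi(d).sum() - n * phi(0.0)) / (n * (n - 1) * h)
        return term1 - 2 * loo

    scores = [score(h) for h in grid]
    return grid[int(np.argmin(scores))]

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 300)
h_star = lscv_bandwidth(x, np.linspace(0.05, 1.5, 60))
```

The same leave-one-out construction extends to distribution-function and multivariate estimation, which is what makes cross-validation a natural baseline in this literature.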
185

Earnings management within the "Big Six" teams: Does earnings management occur among the "Big Six" teams, has its use increased since the introduction of the FFP regulations, and which factors influence the use of earnings management?

Fransson, Erik, Rångeby, Oskar January 2023 (has links)
Football clubs do not operate like ordinary companies: rather than striving to generate large financial profits, they pursue success on the pitch. That pursuit of sporting success has, however, driven clubs into financial difficulties, which led to the introduction of the Financial Fair Play (FFP) regulations to keep clubs from going bankrupt. The FFP regulations force football clubs to generate positive results in order to avoid sanctions. Within the English Premier League, however, six clubs stand apart from the rest. These clubs, known as the "Big Six", have historically dominated the league and have deep-pocketed owners who are not worried about financial difficulties. Previous research suggests that one way to serve two conflicting interests (for football clubs, financial profitability and sporting success) is earnings management. This thesis examines whether earnings management occurs within the "Big Six" teams and whether its use has changed since the introduction of the FFP regulations. It also examines which other factors influence earnings management. The study uses two proxies for earnings management: income smoothing and accrual management. The results show that earnings management does occur within the "Big Six" teams and that its use has increased since the introduction of the FFP regulations. The results also support the hypotheses that a negative operating cash flow and strong revenue growth have a significant effect on discretionary accruals and, by extension, on earnings management within the "Big Six" teams.
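As a rough illustration of the income-smoothing proxy idea, an Eckel-style index compares the variability of income changes to that of revenue changes; values below 1 are consistent with income being artificially smoother than the underlying business. The figures below are invented, and the thesis's exact operationalization may differ.

```python
import numpy as np

def eckel_index(net_income, revenue):
    """Eckel-style smoothing index: CV of income changes divided by CV
    of revenue changes. A value below 1 suggests reported income varies
    less than revenue does, consistent with income smoothing."""
    def cv(series):
        d = np.diff(series)
        return np.std(d, ddof=1) / abs(np.mean(d))
    return cv(net_income) / cv(revenue)

# Hypothetical club: revenue swings widely, reported income barely moves.
revenue = np.array([350.0, 420.0, 300.0, 480.0, 510.0])   # in millions
income = np.array([10.0, 11.0, 10.5, 11.5, 12.0])
idx = eckel_index(income, revenue)   # < 1 is consistent with smoothing
```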
186

Three Essays on Hedge Fund Fee Structure, Return Smoothing and Gross Performance

Feng, Shuang 01 September 2011 (has links)
Hedge funds feature a special compensation structure compared to traditional investments. Previous studies mainly focus on the provisions and incentive structure of hedge fund contracts, such as 2/20, hurdle rates, and high-water marks. The first essay develops an algorithm to empirically estimate the monthly fees, fund flows, and gross asset values of individual hedge funds. We find that the management fee is a major component of the dollar amount of hedge fund total fees, and that fund flow is more important than net returns in determining the change in fund size, especially when a fund is shrinking. We also find that the best-paid hedge funds concentrate in the largest hedge fund quintile. Large funds tend to perform better, earn more, and rely less on the management fee for their managers' compensation. Further, we find that fund flow is an important determinant of hedge fund managerial incentives. Together with the "visible" hands of hedge fund management, i.e., the provisions of hedge fund incentive contracts, the "invisible" hands of fund flows enable investors to effectively influence hedge fund managerial compensation and incentives. The second essay studies the relation between return smoothing and the managerial incentives of hedge funds. We use gross returns to estimate both unconditional and conditional return smoothing models. While unconditional return smoothing is a proxy for illiquidity, conditional return smoothing is related to intentional return smoothing and may be used as a first screen for hedge fund fraud. We find that return smoothing is significantly underestimated using net returns, especially for graveyard funds. We also find that managerial incentives are positively associated with both types of return smoothing. While managers of more illiquid funds tend to earn more incentive fees, funds featuring conditional return smoothing under-perform other funds and do not earn more incentive fees on average.
Finally, we find that failed hedge funds feature more illiquidity and conditional return smoothing. The third essay explores the difference between gross-of-fee and net-of-fee hedge fund performance by investigating the differences in distribution, factor exposures, and alphas between gross and net returns. We find that gross returns are distributed significantly differently from net returns. Gross-of-fee alphas are higher than net-of-fee alphas by about 4% per year on average. We also find a positive relation between hedge fund performance and fund size, fund flows, and managerial incentives, which holds for both gross-of-fee and net-of-fee performance. Our findings suggest that the gross-of-fee performance of hedge funds should be examined separately from the net-of-fee performance, which may give a clearer picture of the risk structure and performance of hedge fund portfolios.
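Return-smoothing models of the kind the second essay estimates commonly follow the moving-average reporting structure of Getmansky, Lo, and Makarov, in which observed returns are a weighted average of current and lagged economic returns. A minimal sketch of that structure and its inversion, with invented weights and simulated returns; the essay's own estimation procedure may differ.

```python
import numpy as np

def smooth_returns(true_r, theta):
    """Reporting model: observed return at t is a moving average of the
    current and lagged true returns, with weights theta summing to one.
    Lags before the sample start are treated as zero."""
    theta = np.asarray(theta)
    k = len(theta) - 1
    padded = np.concatenate([np.zeros(k), true_r])
    return np.array([padded[t:t + k + 1][::-1] @ theta
                     for t in range(len(true_r))])

def unsmooth_returns(obs_r, theta):
    """Invert the moving-average filter to recover the underlying
    (economic) return series, given known smoothing weights."""
    theta = np.asarray(theta)
    k = len(theta) - 1
    rec = np.zeros_like(obs_r)
    for t in range(len(obs_r)):
        lagged = sum(theta[j] * rec[t - j] for j in range(1, k + 1) if t - j >= 0)
        rec[t] = (obs_r[t] - lagged) / theta[0]
    return rec

rng = np.random.default_rng(3)
r = rng.normal(0.01, 0.03, 60)     # simulated "true" monthly returns
theta = [0.5, 0.3, 0.2]            # hypothetical smoothing weights
obs = smooth_returns(r, theta)
rec = unsmooth_returns(obs, theta)
# Reported (smoothed) returns show lower volatility than the true series,
# which is why smoothing understates risk when only net returns are used.
```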
187

An Assessment of the Nonparametric Approach for Evaluating the Fit of Item Response Models

Liang, Tie 01 February 2010 (has links)
As item response theory (IRT) has developed and become widely applied, investigating the fit of a parametric model has become an important part of the measurement process when implementing IRT. The usefulness and success of IRT applications rely heavily on the extent to which the model reflects the data, so it is necessary to evaluate model-data fit by gathering sufficient evidence before any model application. Promising methods for detecting model misfit in IRT are lacking. In addition, commonly used fit statistics are often unsatisfactory: they frequently lack desirable statistical properties and offer no means of examining the magnitude of misfit (e.g., via graphical inspection). In this dissertation, a newly proposed nonparametric approach, RISE, was studied thoroughly and comprehensively. Specifically, the purposes of this study are to (a) examine the promising fit procedure RISE, (b) compare the statistical properties of RISE with those of commonly used goodness-of-fit procedures, and (c) investigate how RISE may be used to examine the consequences of model misfit. To reach these goals, both a simulation study and an empirical study were conducted. In the simulation study, four factors that may influence the performance of a fit statistic were varied: ability distribution, sample size, test length, and model. The results demonstrated that RISE outperformed G2 and S-X2 in that it controlled Type I error rates and provided adequate power under all conditions. In the empirical study, the three fit statistics were applied to an empirical data set and the misfitting items were flagged. RISE and S-X2 detected reasonable numbers of misfitting items, while G2 detected almost all items when the sample size was large. To further demonstrate an advantage of RISE, the residual plot for each misfitting item was shown. Compared to G2 and S-X2, RISE gave a much clearer picture of the location and magnitude of misfit for each misfitting item. Beyond statistical properties and graphical displays, the score distribution and test characteristic curve (TCC) were investigated as consequences of model misfit. The results indicated that, for the given data, there was no practical consequence for classification before and after replacement of the misfitting items detected by the three fit statistics.
188

Ensemble Kalman Filtering (EnKF) with One-Step-Ahead Smoothing: Application to Challenging Ocean Data Assimilation Problems

Raboudi, Naila Mohammed Fathi 20 September 2022 (has links)
Predicting and characterizing the state of the ocean is needed for various scientific, industrial, social, management, and recreational activities. Despite tremendous progress in ocean modeling and simulation capabilities, ocean models still suffer from different sources of uncertainty. To obtain accurate ocean state predictions, data assimilation (DA) is widely used to constrain ocean model outputs with available observations. Ensemble Kalman filtering (EnKF) is a sequential DA approach that represents the distribution of the system state through an ensemble of ocean state samples. Several factors may limit the performance of an EnKF in realistic ocean applications, particularly the use of small ensembles and poorly known model error statistics, and, to a lesser extent, strongly nonlinear variations, abrupt regime changes, and unsatisfied underlying assumptions such as the commonly used white-observation-noise assumption. The objective of this PhD thesis is to develop, implement, and test efficient ensemble filtering schemes that enhance the performance of EnKFs in such challenging settings. We resort to the one-step-ahead (OSA) smoothing formulation of the Bayesian filtering problem to introduce EnKFs involving a new update step with future observations (smoothing) between two successive analyses, thereby conditioning the ensemble sampling on more information. We show that this approach enhances EnKF performance by providing improved ensemble background statistics, and we showcase it with realistic ocean DA and forecasting applications, namely a storm surge EnKF forecasting system and the Red Sea ensemble DA and forecasting system.
We then derive new EnKF-based schemes that account for time-correlated observation errors, enabling efficient DA in large-dimensional problems where the observation-error statistics are correlated in time, and further propose a new approach for estimating the parameters of the observation-error time-correlation model online, concurrently with the state. We also exploit the OSA-smoothing formulation to propose a new joint EnKF with OSA smoothing that mitigates the reported inconsistencies in the joint EnKF update, for efficient DA in one-way-coupled systems.
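The analysis step of a stochastic EnKF of the kind discussed above can be sketched in a few lines: each ensemble member is updated with a perturbed observation using the ensemble-estimated Kalman gain. This is a generic textbook sketch, not the thesis's OSA-smoothing scheme, and all names and numbers are illustrative.

```python
import numpy as np

def enkf_update(ensemble, H, y, R, rng):
    """Stochastic EnKF analysis step. `ensemble` is (n_state, n_members),
    H maps state to observation space, R is the observation-error
    covariance. Each member is updated against a perturbed observation
    so the posterior ensemble spread is statistically consistent."""
    m = ensemble.shape[1]
    Xm = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - Xm                               # ensemble anomalies
    P = A @ A.T / (m - 1)                           # sample background covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=m).T              # perturbed observations
    return ensemble + K @ (Y - H @ ensemble)

rng = np.random.default_rng(4)
truth = np.array([1.0, -2.0])
H = np.eye(2)
R = 0.1 * np.eye(2)
ens = rng.normal(0, 1, (2, 200)) + truth[:, None] + 1.5   # biased prior ensemble
y = truth + rng.multivariate_normal(np.zeros(2), R)
post = enkf_update(ens, H, y, R, rng)
# The posterior ensemble mean moves from the biased prior toward the observation.
```

An OSA-smoothing variant would insert an additional smoothing update with the next observation between two such analyses, reusing the same gain machinery on the lagged state.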
189

Advanced Smoothed Finite Element Modeling for Fracture Mechanics Analyses

Bhowmick, Sauradeep 28 June 2021 (has links)
No description available.
190

New Procedures for Data Mining and Measurement Error Models with Medical Imaging Applications

Wang, Xiaofeng 15 July 2005 (has links)
No description available.
