  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Parameter parsimony, model selection, and smooth density estimation

Atilgan, Taskin. January 1900 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1983. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 242-248).
32

Metrische Regressoren in exponentiellen Glättungsmodellen [Metric regressors in exponential smoothing models]

Bell, Michael. January 2003 (has links)
Thesis (doctoral)--Kath. Universität, Eichstätt, 2003.
33

Bayesian surface smoothing under anisotropy

Chakravarty, Subhashish. January 2007 (has links)
Thesis (Ph. D.)--University of Iowa, 2007. / Supervisors: George Woodworth, Matthew Bognar. Includes bibliographical references (leaves 72-73).
34

Bayesian semiparametric spatial and joint spatio-temporal modeling

White, Gentry, January 2006 (has links)
Thesis (Ph.D.)--University of Missouri-Columbia, 2006. / The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file viewed on (May 2, 2007) Vita. Includes bibliographical references.
35

Multi-view hockey tracking with trajectory smoothing and camera selection

Wu, Lan 11 1900 (has links)
We address the problem of multi-view multi-target tracking using multiple stationary cameras in the application of hockey tracking and test the approach with data from two cameras. The system is based on the previous work by Okuma et al. [50]. We replace AdaBoost detection with blob detection in both image coordinate systems after background subtraction. The sets of blob-detection results are then mapped to the rink coordinate system using a homography transformation. These observations are further merged into a final detection result, which is incorporated into the particle filter. In addition, we extend the particle filter to use multiple observation models, each corresponding to a view. An observation likelihood and a reference color model are also maintained for each player in each view and are updated only when the player is not occluded in that view. As a result of the expanded coverage and multiple perspectives of multi-view tracking, a target occluded in one view can still be tracked as long as it is visible from another view. The multi-view tracking data are further processed by trajectory smoothing using a maximum a posteriori smoother. Finally, automatic camera selection is performed using a hidden Markov model to create personalized video programs. / Faculty of Science / Department of Computer Science / Graduate
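The homography step in this abstract, mapping blob centroids from a camera's image plane to a common rink plane, can be sketched as follows. This is a minimal illustration, not the thesis code: the point values and matrices are made up, and only the plane-to-plane mapping itself is shown.

```python
import numpy as np

def map_to_rink(points_xy, H):
    """Map image-plane detections to rink coordinates via a homography.

    points_xy : (N, 2) array of blob centroids in image coordinates.
    H         : (3, 3) homography matrix from image plane to rink plane.
    Returns an (N, 2) array of rink-plane coordinates.
    """
    # Promote to homogeneous coordinates, apply H, then divide out the scale.
    pts = np.hstack([points_xy, np.ones((len(points_xy), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Illustrative detections; an identity homography leaves them unchanged.
pts = np.array([[100.0, 200.0], [50.0, 75.0]])
rink_pts = map_to_rink(pts, np.eye(3))
```

In the multi-camera setting described above, each view would carry its own calibrated `H`, and the mapped detection sets would then be merged in the shared rink frame.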
36

A new deterministic Ensemble Kalman Filter with one-step-ahead smoothing for storm surge forecasting

Raboudi, Naila Mohammed Fathi 11 1900 (has links)
The Ensemble Kalman Filter (EnKF) is a popular data assimilation method for state-parameter estimation. Following a sequential assimilation strategy, it breaks the problem into alternating cycles of forecast and analysis steps. In the forecast step, the dynamical model integrates a stochastic sample approximating the state analysis distribution (the analysis ensemble) to obtain a forecast ensemble. In the analysis step, the forecast ensemble is updated with the incoming observation using a Kalman-like correction, and the result is used for the next forecast step. In realistic large-scale applications, EnKFs are implemented with limited ensembles and often poorly known model error statistics, leading to a crude approximation of the forecast covariance. This strongly limits the filter performance. Recently, a new EnKF was proposed in [1] following a one-step-ahead smoothing strategy (EnKF-OSA), which involves an OSA smoothing of the state between two successive analyses. At each time step, EnKF-OSA exploits the observation twice. The incoming observation is first used to smooth the ensemble at the previous time step. The resulting smoothed ensemble is then integrated forward to compute a "pseudo forecast" ensemble, which is again updated with the same observation. The idea of constraining the state with future observations is to add more information to the estimation process in order to mitigate the sub-optimal character of EnKF-like methods. The second EnKF-OSA "forecast" is computed from the smoothed ensemble and should therefore provide an improved background. In this work, we propose a deterministic variant of EnKF-OSA, based on the Singular Evolutive Interpolated Kalman (SEIK) filter. The motivation is to avoid the observation perturbations of the EnKF in order to improve the scheme's behavior when assimilating large data sets with small ensembles.
The new SEIK-OSA scheme is implemented and its efficiency is demonstrated by performing assimilation experiments with the highly nonlinear Lorenz model and a realistic setting of the Advanced Circulation (ADCIRC) model configured for storm surge forecasting in the Gulf of Mexico during Hurricane Ike.
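For context, the analysis step of a generic stochastic EnKF, including the observation perturbations that a deterministic variant such as the one proposed above is designed to avoid, can be sketched as follows. This is a textbook EnKF update, not the SEIK-OSA scheme from the thesis, and all dimensions and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(Xf, y, H, R):
    """One stochastic EnKF analysis step (generic; not the SEIK-OSA scheme).

    Xf : (n, N) forecast ensemble (n state dims, N members).
    y  : (m,) observation vector.
    H  : (m, n) linear observation operator.
    R  : (m, m) observation-error covariance.
    Returns the (n, N) analysis ensemble.
    """
    n, N = Xf.shape
    A = Xf - Xf.mean(axis=1, keepdims=True)          # ensemble anomalies
    Pf = A @ A.T / (N - 1)                           # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturbed observations: the stochastic step a deterministic filter avoids.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T
    return Xf + K @ (Y - H @ Xf)
```

With an accurate, low-noise observation of the full state, the analysis ensemble mean is pulled close to the observation, which is the behavior the Kalman-like correction described above provides.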
37

Anisotropic Quadrilateral Mesh Optimization

Ferguson, Joseph Timothy Charles 12 August 2016 (has links)
To determine the validity and quality of meshes, mesh optimization methods have been formulated with quality measures. The basic idea of mesh optimization is to relocate the vertices to obtain a valid mesh (untangling), to improve mesh quality (smoothing), or both. We present a new algebraic way of calculating quality measures on quadrilateral meshes, based on triangular meshes in 2D, as well as new optimization methods for simultaneous untangling and smoothing of severely deformed meshes. An anisotropic diffusion method is introduced to account for inner-boundary deformation movements in 2D quadrilateral meshes.
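A corner-triangle quality measure of the kind described above can be sketched as follows. This is a generic algebraic measure for 2D quads (1 for a square corner, approaching 0 as a corner degenerates, negative when the element is inverted), not necessarily the specific measure developed in the thesis.

```python
import numpy as np

def corner_quality(quad, i):
    """Algebraic quality of corner i of a 2D quad via its corner triangle.

    Uses q = 2*cross(e1, e2) / (|e1|^2 + |e2|^2), where e1 and e2 are the two
    edges meeting at the corner: 1 for a square corner, 0 for a degenerate
    corner, negative if the corner is inverted (tangled).
    """
    q = np.asarray(quad, dtype=float)
    e1 = q[(i + 1) % 4] - q[i]
    e2 = q[(i - 1) % 4] - q[i]
    cross = e1[0] * e2[1] - e1[1] * e2[0]            # twice the signed area
    return 2.0 * cross / (e1 @ e1 + e2 @ e2)

def quad_quality(quad):
    """Overall element quality: the minimum over the four corner triangles.

    A non-positive value flags an invalid (tangled) element, which is what an
    untangling pass would target before smoothing improves the positive values.
    """
    return min(corner_quality(quad, i) for i in range(4))

unit_square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```

An optimizer would then relocate vertices to push the minimum corner quality above zero (untangling) and onward toward 1 (smoothing).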
38

Is Strong Corporate Governance Associated with Informative Income Smoothing?

Faello, Joseph Peter 12 May 2012 (has links)
This study examines the links between corporate governance, income smoothing, and informativeness in financial reporting. Firms’ strong corporate governance is measured by variables employed in other studies – the presence of a financial expert serving on the audit committee; whether the audit committee consists entirely of independent directors; whether the members of the audit committee meet at least four times annually; and the percentage of outsiders serving on the board of directors. Income smoothing is measured by the Albrecht-Richardson (AR) and Tucker-Zarowin (TZ) income smoothing measures. The AR measure encompasses four definitions of earnings that include accrual and cash-based transactions. The TZ measure includes only accrual-based transactions. The degree of informativeness is measured by association with two opposing ends of the spectrum. On the one hand, firms that are the most informative are predicted to have a greater association between earnings and one-period-ahead operating cash flows. Prior researchers have similarly defined the information content of earnings in terms of the ability to predict cash flows. On the other hand, the existence of a regulatory violation clearly indicates firms’ lack of informativeness (i.e., deceptiveness) in financial reporting. The results do not show a strong relationship between strong corporate governance and degree of income smoothing. First, results for the link between income smoothing and informativeness show only a strong, positive association between accrual-based income smoothing (i.e., the TZ measure) and informativeness. Second, results for the links between deceptiveness, corporate governance, and income smoothing are weak. The corporate governance variables show no significant association with deceptiveness; a negative relationship between corporate governance and deceptiveness was predicted. For the link between income smoothing and deceptiveness, only the AR measures show the predicted negative relationship.
The TZ measure shows no significant association with deceptiveness. Taken together, the results of this study provide unique insights into the links between corporate governance, income smoothing, and informativeness in financial reporting. The results confirm the informativeness of accrual accounting, but do not resolve the debate of whether corporate governance measures impact the quality of financial reporting.
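The Tucker-Zarowin-style measure referenced above is commonly computed as the correlation between changes in discretionary accruals and changes in pre-discretionary income, with a more negative correlation indicating more income smoothing. A simplified single-series sketch (the numbers are illustrative, not the study's firm panel):

```python
import numpy as np

def tz_smoothing(disc_accruals, pre_disc_income):
    """Tucker-Zarowin-style smoothing proxy for one firm's time series.

    Correlation between changes in discretionary accruals and changes in
    pre-discretionary income; values near -1 indicate heavy smoothing
    (accruals offset income shocks), values near 0 indicate little smoothing.
    """
    d_da = np.diff(np.asarray(disc_accruals, dtype=float))
    d_pdi = np.diff(np.asarray(pre_disc_income, dtype=float))
    return np.corrcoef(d_da, d_pdi)[0, 1]

# Hypothetical firm whose accrual changes exactly offset income changes.
da = [0.02, 0.05, 0.01, 0.04]
pdi = [0.10, 0.07, 0.11, 0.08]
```

In the study's panel setting, firms would be ranked on this correlation (estimated per firm over a rolling window) rather than scored from a single short series.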
39

A Monte Carlo Investigation of Smoothing Methods for Error Density Estimation in Functional Data Analysis with an Illustrative Application to a Chemometric Data Set

Thompson, John R.J. 06 1900 (has links)
Functional data analysis is a field in statistics that analyzes data which are dependent on time or space and from which inference can be conducted. Functional data analysis methods can estimate residuals from functional regression models that in turn require robust univariate density estimators for error density estimation. The accurate estimation of the error density from the residuals allows evaluation of the performance of functional regression estimation. Kernel density estimation using maximum likelihood cross-validation and Bayesian bandwidth selection techniques with a Gaussian kernel are reproduced and compared to least-squares cross-validation and plug-in bandwidth selection methods with an Epanechnikov kernel. For simulated data, Bayesian bandwidth selection methods for kernel density estimation are shown to give the minimum mean expected square error for estimating the error density, but are computationally inefficient and may not be adequately robust for real data. The (bounded) Epanechnikov kernel function is shown to give similar results as the Gaussian kernel function for error density estimation after functional regression. When the functional regression model is applied to a chemometric data set, the local least-squares cross-validation method, used to select the bandwidth for the functional regression estimator, is shown to give a significantly smaller mean square predicted error than that obtained with Bayesian methods. / Thesis / Master of Science (MSc)
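The Epanechnikov kernel density estimator compared in this abstract can be sketched as follows. The bandwidth here uses a Silverman-style rule of thumb rescaled for the Epanechnikov kernel rather than the cross-validation or Bayesian selectors studied in the thesis, and the residuals are simulated stand-ins, not regression output.

```python
import numpy as np

def epanechnikov_kde(x_grid, data, h):
    """Epanechnikov kernel density estimate evaluated on x_grid.

    The kernel K(u) = 0.75*(1 - u^2) on |u| <= 1 has bounded support,
    unlike the Gaussian kernel it is compared against in the abstract.
    """
    u = (x_grid[:, None] - data[None, :]) / h
    k = 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)
    return k.mean(axis=1) / h

rng = np.random.default_rng(1)
resid = rng.normal(0.0, 1.0, 500)       # stand-in for regression residuals
# Rule-of-thumb bandwidth (Silverman-style constant rescaled for the
# Epanechnikov kernel); the thesis instead studies data-driven selectors.
h = 2.34 * resid.std(ddof=1) * len(resid) ** (-0.2)
grid = np.linspace(-4.0, 4.0, 81)
dens = epanechnikov_kde(grid, resid, h)
```

Swapping the bandwidth line for a least-squares cross-validation or Bayesian selector reproduces the comparison the thesis actually runs; the estimator itself is unchanged.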
40

Generation of simulated ultrasound images using a Gaussian smoothing function

Li, Jian-Cheng January 1995 (has links)
No description available.
