101

Some statistical aspects of LULU smoothers

Jankowitz, Maria Dorothea 12 1900
Thesis (PhD (Statistics and Actuarial Science))--University of Stellenbosch, 2007.

The smoothing of time series plays a very important role in various practical applications. Estimating the signal and removing the noise is the main goal of smoothing. Traditionally, linear smoothers were used, but nonlinear smoothers have become more popular over the years. Within the family of nonlinear smoothers, the class of median smoothers, based on order statistics, is the most popular. A new class of nonlinear smoothers, called LULU smoothers, was developed using the minimum and maximum selectors. These smoothers have very attractive mathematical properties. In this thesis their statistical properties are investigated and compared to those of the class of median smoothers. Smoothing, together with related concepts, is discussed in general. Thereafter, the class of median smoothers from the literature is discussed. The class of LULU smoothers is defined, their properties are explained and new contributions are made. The compound LULU smoother is introduced and its property of variation decomposition is discussed. The probability distributions of some LULU smoothers with independent data are derived. LULU smoothers and median smoothers are compared according to the properties of monotonicity, idempotency, co-idempotency, stability, edge preservation, output distributions and variation decomposition. A comparison is made of their respective abilities for signal recovery by means of simulations. The success of the smoothers in recovering the signal is measured by the integrated mean square error and the regression coefficient calculated from the least squares regression of the smoothed sequence on the signal. Finally, LULU smoothers are applied to practical problems.
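As a rough illustration of the minimum and maximum selectors behind these smoothers, the sketch below implements the basic L_n and U_n operators in the form given by Rohwer; the edge-padding boundary treatment is an assumption for the sketch, not a detail taken from the thesis.

```python
import numpy as np

def running_min(x, w):
    """Minimum over each window x[j:j+w] (valid positions only)."""
    return np.array([x[j:j + w].min() for j in range(len(x) - w + 1)])

def running_max(x, w):
    """Maximum over each window x[j:j+w] (valid positions only)."""
    return np.array([x[j:j + w].max() for j in range(len(x) - w + 1)])

def L(x, n):
    """L_n: maximum of windowed minima -- removes upward pulses of width <= n."""
    x = np.pad(np.asarray(x, dtype=float), n, mode='edge')  # boundary treatment: an assumption
    return running_max(running_min(x, n + 1), n + 1)

def U(x, n):
    """U_n: minimum of windowed maxima -- removes downward pulses of width <= n."""
    x = np.pad(np.asarray(x, dtype=float), n, mode='edge')
    return running_min(running_max(x, n + 1), n + 1)

# An isolated upward spike of width 1 is removed by L_1;
# the compositions U(L(.)) and L(U(.)) are the compound smoothers.
x = np.array([0., 0., 0., 5., 0., 0., 0.])
print(L(x, 1))        # -> all zeros: spike removed
print(U(L(x, 1), 1))  # compound LULU smoother output
```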
102

Ensuring Safe Exploitation of Wind Turbine Kinetic Energy : An Invariance Kernel Formulation

Rawn, Barry Gordon 21 April 2010
This thesis investigates the computation of invariance kernels for planar nonlinear systems with one input, with application to wind turbine stability. Given a known bound on the absolute value of the input variations (possibly around a fixed non-zero value), it is of interest to determine whether the system's state can be guaranteed to stay within a desired region K of the state space irrespective of the input variations. The collection of all initial conditions for which trajectories never exit K, irrespective of input variations, is called the invariance kernel. This thesis develops theory to characterize the boundary of the invariance kernel and an algorithm to compute that boundary exactly. The algorithm is applied to two simplified wind turbine systems that tap the kinetic energy of the turbine to support the frequency of the grid: one provides power smoothing, and the other provides inertial response. For these models, limits on speed and torque specify a desired region of operation K in the state space, while the wind is represented as a bounded input. The theory developed in the thesis makes it possible to define a measure called the wind disturbance margin, which quantifies the largest range of wind variations under which the specified type of grid support may be provided, and hence how the exploitation of kinetic energy reduces a turbine's tolerance to wind disturbances. As an example, the improvement in power smoothing and inertial response made available by the increased speed range of a full converter-interfaced turbine is quantified.
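The thesis computes the kernel boundary exactly; purely for contrast, the sketch below shows the naive sampling test one might use to probe whether a single initial condition survives bounded input variations. The dynamics f, the region test in_K, and all numerical settings are placeholders, not values from the thesis.

```python
import numpy as np

def survives(f, x0, in_K, u_min, u_max, T=20.0, dt=0.01, n_inputs=200, seed=0):
    """Heuristic membership test for the invariance kernel: simulate
    x' = f(x, u) under many randomly sampled piecewise-constant inputs
    u(t) in [u_min, u_max] and report whether the state ever leaves K.
    A True result is only evidence, not a guarantee."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    for _ in range(n_inputs):
        x = np.array(x0, dtype=float)
        u = rng.uniform(u_min, u_max)
        for k in range(steps):
            if k % 50 == 0:                    # re-draw the input every 50 steps
                u = rng.uniform(u_min, u_max)
            x = x + dt * np.asarray(f(x, u))   # forward Euler step
            if not in_K(x):
                return False                   # trajectory exits K: x0 is outside the kernel
    return True

# Hypothetical planar example: a forced oscillator kept inside a speed/torque box.
f = lambda x, u: np.array([x[1], -x[0] - 0.5 * x[1] + u])
in_K = lambda x: abs(x[0]) <= 2.0 and abs(x[1]) <= 2.0
print(survives(f, [0.1, 0.0], in_K, u_min=-0.2, u_max=0.2))
```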
103

Reconstruction of foliations from directional information

Yeh, Shu-Ying January 2007
In many areas of science, especially geophysics, geography and meteorology, the data are often directions or axes rather than scalars or unrestricted vectors. Directional statistics considers data which are mainly unit vectors lying in two- or three-dimensional space (R² or R³). One way in which directional data arise is as normals to foliations. A (codimension-1) foliation of R^d is a system of non-intersecting (d-1)-dimensional surfaces filling out the whole of R^d. At each point z of R^d, any given codimension-1 foliation determines a unit vector v normal to the surface through z. The problem considered here is that of reconstructing the foliation from observations (z_i, v_i), i = 1, ..., n. One way of doing this is rather similar to fitting smooth splines to data: the reconstructed foliation has to be as close to the data as possible, while the foliation itself is not too rough, and a tradeoff parameter is introduced to control the balance between smoothness and closeness. The approach used in this thesis is to take the surfaces to be surfaces of constant value of a suitable real-valued function h on R^d. The problem of reconstructing a foliation is translated into the language of Schwartz distributions, and a deep result in the theory of distributions is used to give the appropriate general form of the fitted function h. The model parameters are estimated by a simplified Newton method. Under appropriate distributional assumptions on v_1, ..., v_n, confidence regions for the true normals are developed and estimates of concentration are given.
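In symbols, a spline-type fit of this kind minimizes a penalized criterion of roughly the following form; this display is an illustrative reconstruction from the abstract's description, not the thesis's exact functional.

```latex
\hat{h} = \arg\min_{h}\;
\sum_{i=1}^{n}\left\| v_i - \frac{\nabla h(z_i)}{\|\nabla h(z_i)\|} \right\|^{2}
\;+\; \lambda\, J(h)
```

Here J(h) is a roughness penalty on h, and λ > 0 is the tradeoff parameter controlling the balance between closeness to the observed normals and smoothness of the foliation.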
104

An Analysis of Smoothing of Proved Oil and Gas Reserve Quantities and an Analysis of Bias and Variability in Revisions of Previous Estimates of Proved Oil and Gas Reserve Quantities

Campbell, Alan D. 08 1900
The purpose of this study is to determine whether oil and gas producing companies smooth their ending reserve quantities. Smoothing is defined as a reduction in variance in the trend of ending reserve quantities over time compared to the trend of ending reserve quantities less the hypothesized smoothing variable over time. This study focuses on the two variables that are most susceptible to manipulation: revisions of previous estimates and additions. It also examines whether revisions are positively or negatively biased, and the variability of the revisions. The sample consists of 70 companies chosen from Oil & Gas Reserve Disclosures: 1980-1984 Survey of 400 Public Companies by Arthur Andersen and Company. For each company, ending reserve quantities for the years 1978-1984 were regressed over time, and the standard deviation of the estimate (SDE) was calculated. Then the ending reserve quantities less the hypothesized smoothing variable were regressed over time, and the SDE was calculated. A linear model and a semi-logarithmic model were used. A smoothing ratio (SR) was determined by dividing the SDE of reserves less the hypothesized smoothing variable by the SDE of ending reserve quantities; an SR greater than one indicates smoothing, and an SR less than one indicates that smoothing did not occur. The mean percentage revision and a t-test were used to test for positive or negative bias in the revisions, and the mean absolute percentage revision was used to assess the relative variability of revisions. The number of companies classified as smoothers of oil reserves was statistically significant for the semi-logarithmic model but not for the linear model. Under both models the number of companies classified as smoothers of gas reserves was statistically significant. Few companies had mean percentage revisions significantly different from zero, and the majority had mean absolute revisions of under ten percent.
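The smoothing ratio for the linear model translates directly into code. The sketch below follows the abstract's definition (SDE as the residual standard error of a regression on time); the function and variable names are my own choices.

```python
import numpy as np

def sde(y, t):
    """Standard deviation of the estimate: residual standard error
    from the least-squares regression of y on time t."""
    t = np.asarray(t, dtype=float)
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    resid = y - X @ beta
    return np.sqrt(resid @ resid / (len(y) - 2))   # n - 2 degrees of freedom

def smoothing_ratio(reserves, smoothing_var, years):
    """SR = SDE(reserves - hypothesized smoothing variable) / SDE(reserves).
    SR > 1 indicates smoothing; SR < 1 indicates it did not occur."""
    reserves = np.asarray(reserves, dtype=float)
    smoothing_var = np.asarray(smoothing_var, dtype=float)
    return sde(reserves - smoothing_var, years) / sde(reserves, years)
```

For the semi-logarithmic model, the same ratio would be computed after taking logarithms of the quantities.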
105

Computerized Analysis of Radiograph Images of Embedded Objects as Applied to Bone Location and Mineral Content Measurement

Buckner, Richard L. 08 1900
This investigation dealt with locating and measuring the x-ray absorption of radiographic images. The methods developed provide fast, accurate, minicomputer-controlled analysis of embedded objects. A PDP/8 computer system was interfaced with a Joyce Loebl 3CS Microdensitometer and a Leeds & Northrup Recorder. The proposed algorithms for bone location and data smoothing run on a twelve-bit minicomputer. Designs of a software control program and an operational procedure are presented. The filter made wedge and limb scans monotonic from minima to maxima; it was tested for various convolution intervals, as was the ability to resmooth the same data in multiple passes. An interval size of fifteen works well in one pass.
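The abstract does not name the filter; read as a plain moving-average convolution over the stated interval of fifteen, one smoothing pass would look like the sketch below. The filter choice is an assumption.

```python
import numpy as np

def smooth_scan(scan, interval=15):
    """One smoothing pass over a densitometer scan: a moving average with
    the interval size of fifteen reported to work well in one pass.
    (The thesis's exact filter is not specified; this is an assumption.)"""
    kernel = np.ones(interval) / interval
    return np.convolve(np.asarray(scan, dtype=float), kernel, mode='same')
```

Resmoothing in multiple passes, as tested in the thesis, would simply apply this function repeatedly to its own output.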
106

Particle tracking using the unscented Kalman filter in high energy physics experiments

Akhtar, Jahanzeb January 2015
The extended Kalman filter (EKF) has a long history in the field of non-linear tracking. More recently, statistically-based estimators have emerged that avoid the need for a deterministic linearisation process. The unscented Kalman filter (UKF) is one such technique, and it has been shown to outperform an EKF implementation on some non-linear systems in terms of both accuracy and robustness. In this thesis, the UKF is applied to a high energy physics particle tracking problem where the EKF is currently used. The effects of measurement redundancy are investigated to determine improvements in the accuracy of particle track reconstruction, and the relationship between measurement redundancy and relative observability is investigated through experimental and theoretical analysis. Smoothing (backward filtering) in high energy physics experiments is implemented using the Rauch-Tung-Striebel (RTS) smoother with the EKF; however, in unscented Kalman filter algorithms the Jacobian matrices required by the RTS method are not available. The unscented Rauch-Tung-Striebel (URTS) smoother addresses this problem by avoiding the use of Jacobian matrices, but it is not efficient for large-dimensional systems such as high energy physics experiments. A technique is therefore implemented in the RTS smoother to make it suitable for the UKF; the method is named the Jacobian Equivalent Rauch-Tung-Striebel (JE-RTS) smoother, and its implementation is quite straightforward when the UKF is used as the estimator.
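For readers unfamiliar with how the UKF sidesteps Jacobians, the sketch below shows the unscented transform at its core: deterministic sigma points are propagated through the nonlinearity and re-averaged. The scaling parameters are conventional Merwe defaults, not values from this thesis.

```python
import numpy as np

def unscented_transform(f, mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear function f via sigma
    points, avoiding the Jacobians an EKF linearisation would need."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)              # matrix square root
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])   # 2n + 1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))       # mean weights
    wc = wm.copy()                                       # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    ys = np.array([f(s) for s in sigmas])                # propagate each point
    y_mean = wm @ ys
    d = ys - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Toy usage: push a Gaussian through a mild nonlinearity.
m, P = np.array([1.0, 0.5]), np.eye(2) * 0.1
print(unscented_transform(lambda x: np.array([np.sin(x[0]), x[1]**2]), m, P))
```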
107

Payout policy in family firms : A study on payout levels and dividend smoothing in Sweden

Bolin, Patrick, Widerberg, Carl January 2019
This study investigates payouts in Swedish family firms by focusing on both the level and the speed of adjustment of dividends. In addition, the use of dual-class shares in family firms is examined to further identify potential drivers of payout differences between family-controlled companies and non-family firms. Agency theory and previous studies suggest that controlling families use high and stable payouts to mitigate minority shareholders' concerns about being expropriated. We find that family firms in Sweden do not differ from non-family firms in their payouts. This could be read as an indication of expropriation, if minority shareholders ought to be compensated for higher agency costs, but it could equally mean that family control does not worsen agency conflicts between majority and minority shareholders. Rather, other ownership structures, such as the use of dual-class shares to gain control in excess of ownership, appear to be associated with higher levels of payouts. Nor do family firms smooth their dividends more than non-family firms; instead, they adjust towards their target dividend at a higher pace.
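The abstract does not state its estimating equation; dividend smoothing and the speed of adjustment are conventionally measured with Lintner's (1956) partial-adjustment model, sketched here for orientation rather than as the thesis's exact specification.

```latex
\Delta D_{i,t} = \alpha + \gamma \left( D^{*}_{i,t} - D_{i,t-1} \right) + \varepsilon_{i,t},
\qquad D^{*}_{i,t} = \tau\, E_{i,t}
```

Here D* is the target dividend, E is earnings, τ is the target payout ratio, and γ is the speed of adjustment: a lower γ means smoother dividends, while a γ near one means dividends track the target closely, as the Swedish family firms in this study appear to do.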
108

A comparison between São Paulo's live cattle price forecasting models

Lanzetta, Vitor Bianchi 23 February 2018
This study compared the predictive performance of neural network models and exponential smoothing, using daily data on live cattle futures prices (BM&FBOVESPA) from January 2010 to December 2015. The results show that relatively complex models such as neural networks do not necessarily perform better than simpler ones, that the relative ranking changes with the fit measure and/or forecast horizon, and that combining several models brings advantages.
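As a reference point for the simpler side of that comparison, a minimal simple-exponential-smoothing pass is sketched below; the thesis may use richer variants (with trend or seasonal terms), so treat this as the base case only.

```python
import numpy as np

def exp_smooth(y, alpha):
    """Simple exponential smoothing: s_t = alpha * y_t + (1 - alpha) * s_{t-1}.
    The one-step-ahead forecast is the last smoothed value."""
    y = np.asarray(y, dtype=float)
    s = np.empty(len(y))
    s[0] = y[0]
    for t in range(1, len(y)):
        s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
    return s

prices = [148.2, 149.0, 147.5, 150.1, 151.3]   # illustrative values only
print(exp_smooth(prices, alpha=0.3)[-1])       # forecast for the next day
```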
109

AUC estimation under various survival models

Unknown Date
In medical science, the receiver operating characteristic (ROC) curve is a graphical representation used to evaluate the accuracy of a medical diagnostic test at any cut-off point, and the area under the ROC curve (AUC) is an overall performance measure for a diagnostic test. There are two parts to this dissertation. In the first part, we study the properties of bi-Exponentiated Weibull models. First, we derive a general moment formula for single Exponentiated Weibull models. We then derive the precise formula of the AUC and study its maximum likelihood estimation (MLE), and finally obtain the asymptotic distribution of the estimated AUC. Simulation studies are used to check the performance of the MLE of the AUC under moderate sample sizes. The second part of the dissertation studies the estimation of the AUC under the crossing model, which extends the AUC formula in Gonen and Heller (2007).

by Fazhe Chang. Thesis (Ph.D.)--Florida Atlantic University, 2012.
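For orientation, the quantity being estimated rests on the standard probabilistic identity below; the dissertation's Exponentiated Weibull formulas specialize this general display.

```latex
\mathrm{AUC} = P(Y > X) = \int_{-\infty}^{\infty} S_Y(x)\, f_X(x)\, dx
```

Here X and Y are the test scores of a healthy and a diseased subject, f_X is the density of X, and S_Y is the survival function of Y; an AUC of 0.5 corresponds to a useless test and 1 to a perfect one.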
110

Tetrahedral Meshes in Biomedical Applications: Generation, Boundary Recovery and Quality Enhancements

Ghadyani, Hamid R 30 March 2009
Mesh generation is a fundamental precursor to finite element implementations for the solution of partial differential equations in engineering and science. This dissertation advances the field in three distinct but coupled areas. First, a robust and fast three-dimensional mesh generator for arbitrarily shaped geometries was developed. It deploys nodes throughout the domain based upon user-specified mesh density requirements, and it is integer- and pixel-based, which eliminates round-off errors, substantial memory requirements and CPU-intensive calculations. Second, linked to the mesh generation system, but fully detachable from it, is a physical boundary recovery routine. Frequently, the original boundary topology is required for specific boundary condition applications or multiple material constraints; historically, this boundary preservation was not available. An algorithm was developed, refined and optimized that recovers the original boundaries, internal and external, with fidelity. Finally, a node repositioning algorithm was developed that maximizes the minimum solid angle of tetrahedral meshes. The highly coveted 2D Delaunay property of maximizing the minimum interior angle of a triangle mesh does not extend to its 3D counterpart, maximizing the minimum solid angle of a tetrahedron mesh. As a consequence, 3D Delaunay meshes contain unacceptable sliver tetrahedral elements, albeit composed of four high-quality triangle sides. These compromised elements are virtually unavoidable and can foil an otherwise intact mesh. The numerical optimization routine developed takes any preexisting tetrahedral mesh and repositions the nodes, without changing the mesh topology, so that the minimum solid angle of the tetrahedra is maximized. The overall quality enhancement of the volume mesh might be small, depending upon the initial mesh; however, highly distorted elements that create ill-conditioned global matrices and foil a finite element solver are improved significantly.
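The quality measure being maximized has a closed form: the solid angle at each tetrahedron vertex. A sketch using the Van Oosterom-Strackee formula follows; the helper names are hypothetical, and this is only the evaluation of the measure, not the thesis's repositioning optimizer.

```python
import numpy as np

def solid_angle(p, a, b, c):
    """Solid angle (steradians) subtended at vertex p of tetrahedron
    (p, a, b, c), via the Van Oosterom-Strackee formula:
    tan(omega/2) = |u.(v x w)| / (uvw + (u.v)w + (u.w)v + (v.w)u)."""
    u, v, w = a - p, b - p, c - p
    lu, lv, lw = np.linalg.norm(u), np.linalg.norm(v), np.linalg.norm(w)
    num = abs(np.dot(u, np.cross(v, w)))
    den = lu * lv * lw + np.dot(u, v) * lw + np.dot(u, w) * lv + np.dot(v, w) * lu
    return 2.0 * np.arctan2(num, den)

def min_solid_angle(tet):
    """Quality of a tetrahedron: the smallest of its four vertex solid angles.
    Sliver elements score near zero even when all four faces look healthy."""
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in tet)
    return min(solid_angle(p0, p1, p2, p3), solid_angle(p1, p0, p2, p3),
               solid_angle(p2, p0, p1, p3), solid_angle(p3, p0, p1, p2))

# Regular tetrahedron: each vertex solid angle is about 0.5513 sr.
reg = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
print(min_solid_angle(reg))
```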
