141

Regression analysis with longitudinal measurements

Ryu, Duchwan 29 August 2005 (has links)
Bayesian approaches to regression analysis for longitudinal measurements are considered. The history of measurements from a subject may convey characteristics of that subject, so in a regression analysis with longitudinal measurements these characteristics can serve as covariates, in addition to any other covariates. Longitudinal measurements may also induce complicated within-subject covariance structures, which should be modeled properly. When the covariates are unobservable characteristics of each subject, Bayesian parametric and nonparametric regressions are considered. Although the covariates cannot be observed directly, they can be estimated from the longitudinal measurements; the measurement error problem is then inevitable, and a classical measurement error model is established. In the Bayesian framework, the regression function, as well as all unobservable covariates and nuisance parameters, is estimated. Because multiple covariates are involved, a generalized additive model is adopted, and the Bayesian backfitting algorithm is applied to each component of the additive model. For a binary response, a logistic regression is proposed in which the link function is estimated by Bayesian parametric and nonparametric regressions; introducing latent variables for the link function makes the computation fast. In the next part, subjects are assumed not to be observed at prespecified time points. Furthermore, the time of the next measurement from a subject is allowed to depend on that subject's previous measurement history. For these outcome-dependent follow-up times, various modeling options and the associated analyses are examined to investigate how outcome-dependent follow-up times affect estimation, within the frameworks of Bayesian parametric and nonparametric regressions. Correlation structures of the outcomes are based on different correlation coefficients for different subjects. First, regression models are constructed by assuming a Poisson process for the follow-up times. To interpret the subject-specific random effects, more flexible models are considered by introducing a latent variable for the subject-specific random effect and a survival distribution for the follow-up times. The performance of each model is evaluated using Bayesian model assessments.
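As a rough illustration of the backfitting idea named above, the sketch below fits a two-component additive model with a simple kernel smoother; it is a generic Gaussian-response version with made-up data, not the thesis's Bayesian implementation.

```python
import numpy as np

def kernel_smooth(x, r, bandwidth=0.3):
    """Nadaraya-Watson smoother of partial residuals r against covariate x."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ r) / w.sum(axis=1)

def backfit(X, y, n_iter=50):
    """Fit y ~ alpha + sum_j f_j(X[:, j]) by backfitting with a kernel smoother."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=1) + f[:, j]   # residual excluding f_j
            f[:, j] = kernel_smooth(X[:, j], partial)
            f[:, j] -= f[:, j].mean()                        # identifiability constraint
    return alpha, f

# Synthetic illustration (assumed data, not from the thesis)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
alpha, f = backfit(X, y)
print(alpha, np.corrcoef(f[:, 0], np.sin(3 * X[:, 0]))[0, 1])
```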
142

Statistical analysis and modeling: cancer, clinical trials, environment and epidemiology.

Vovoras, Dimitrios 01 January 2011 (has links)
This thesis is structured in four parts. Vector smoothing methods are used to study environmental data, in particular records of extreme precipitation; the models utilized belong to the vector generalized additive class. In the statistical analysis of observational studies, the identification of and adjustment for prognostic factors is an important component of the analysis; employing flexible statistical methods, namely generalized additive models, to identify and characterize the effect of potential prognostic factors in a clinical trial presents an alternative to the traditional linear statistical model. The classes of models for which the methodology gives generalized additive extensions include grouped survival data from the Surveillance, Epidemiology, and End Results (SEER) database of tumors of the brain and central nervous system; piecewise linear functions of the covariates are employed to characterize the survival experienced by the population. Finally, both descriptive and analytical methods are utilized to study incidence rates and tumor sizes associated with the disease.
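As a rough illustration of piecewise linear covariate effects, the sketch below builds a hinge (truncated linear) basis and fits it by ordinary least squares; the knots and data are illustrative assumptions, not the thesis's survival models.

```python
import numpy as np

def hinge_basis(x, knots):
    """Design matrix for a continuous piecewise linear function of x:
    intercept, linear term, and one hinge max(x - k, 0) per knot."""
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

# Illustrative data: a covariate effect whose slope changes at x = 0
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-2, 2, 300))
y = np.where(x < 0, 0.5 * x, 2.0 * x) + 0.2 * rng.standard_normal(x.size)

B = hinge_basis(x, knots=[-1.0, 0.0, 1.0])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # ordinary least squares fit
fitted = B @ coef
print(coef)
```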
143

Video stabilization and rectification for handheld cameras

Jia, Chao 26 June 2014 (has links)
Video data has increased dramatically in recent years due to the prevalence of handheld cameras. Such videos, however, are usually shakier than videos shot by tripod-mounted cameras or cameras with mechanical stabilizers. In addition, most handheld cameras use CMOS sensors. In a CMOS sensor camera, different rows in a frame are read/reset sequentially from top to bottom. When there is fast relative motion between the scene and the camera, a frame can be distorted because each row is captured under a different 3D-to-2D projection. This kind of distortion is known as the rolling shutter effect. Digital video stabilization and rolling shutter rectification seek to remove the unwanted frame-to-frame jitter and the rolling shutter effect in order to generate visually stable and pleasant videos. In general, we need to (1) estimate the camera motion, (2) regenerate the camera motion, and (3) synthesize new frames. This dissertation aims at improving the first two steps of video stabilization and rolling shutter rectification. It has been shown that the inertial sensors in handheld devices can provide more accurate and robust motion estimation than vision-based methods. This dissertation proposes an online camera-gyroscope calibration method for sensor fusion while a user is capturing video. The proposed method uses an implicit extended Kalman filter and is based on multiple-view geometry in a rolling shutter camera model. It is able to estimate the needed calibration parameters online under all kinds of camera motion. Given the camera motion estimated from inertial sensors after the proposed calibration method, this dissertation first proposes an offline motion smoothing algorithm based on a 3D rotational camera motion model. The offline motion smoothing is formulated as a geodesic-convex regression problem on the manifold of rotation matrix sequences. The formulated problem is solved by an efficient two-metric projection algorithm on the manifold. The geodesic-distance-based smoothness metric better exploits the manifold structure of sequences of rotation matrices. This dissertation then proposes two online motion smoothing algorithms that are also based on a 3D rotational camera motion model. The first algorithm extends IIR filtering from Euclidean space to the nonlinear manifold of 3D rotation matrices. The second algorithm uses unscented Kalman filtering on a constant angular velocity model. Both the offline and online motion smoothing algorithms are constrained to guarantee that no black borders intrude into the stabilized frames.
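As a rough illustration of extending IIR filtering to the rotation manifold, the sketch below applies a first-order low-pass filter by stepping along the geodesic toward each raw rotation; the smoothing factor and trajectory are made-up assumptions, not the dissertation's algorithm or data.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def smooth_rotations(raw, alpha=0.1):
    """First-order IIR smoothing on SO(3): move the current smoothed rotation a
    fraction alpha along the geodesic toward each new raw rotation."""
    smoothed = [raw[0]]
    for r in raw[1:]:
        delta = smoothed[-1].inv() * r                    # relative rotation (error)
        step = R.from_rotvec(alpha * delta.as_rotvec())   # scale it along the geodesic
        smoothed.append(smoothed[-1] * step)
    return smoothed

# Illustrative shaky trajectory: slow pan plus jitter about the y-axis
rng = np.random.default_rng(2)
angles = np.cumsum(np.full(100, 0.01)) + 0.05 * rng.standard_normal(100)
raw = [R.from_euler("y", a) for a in angles]
smoothed = smooth_rotations(raw)
print(smoothed[-1].as_euler("xyz"))
```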
144

Topology optimization for additive manufacturing of customized meso-structures using homogenization and parametric smoothing functions

Sundararajan, Vikram Gopalakrishnan 16 February 2011 (has links)
Topology optimization tools are useful for distributing material in a geometric domain to match targets for mass, displacement, structural stiffness, and other characteristics as closely as possible. Topology optimization tools are especially applicable to additive manufacturing applications, which provide nearly unlimited freedom for customizing the internal and external architecture of a part. Existing topology optimization tools, however, do not take full advantage of the capabilities of additive manufacturing. Prominent tools use micro- or meso-scale voids or artificial materials to parameterize the topology optimization problem, but they use filters, penalization functions, and other schemes to force convergence to regions of fully dense (solid) material and fully void (open) space in the final structure as a means of accommodating conventional manufacturing processes. Since additive manufacturing processes are capable of fabricating intermediate densities (e.g., via porous mesostructures), significant performance advantages could be achieved by preserving and exploiting those features during the topology optimization process. Towards this goal, a topology optimization tool has been created by combining homogenization with parametric smoothing functions. Rectangular mesoscale voids are used to represent the material topology, and homogenization is used to analyze their effective properties. B-spline-based parametric smoothing functions are used to control the size of the voids throughout the design domain, thereby smoothing the topology and reducing the number of required design variables relative to homogenization-based approaches. Resulting designs are fabricated with selective laser sintering technology, and their geometric and elastic properties are evaluated experimentally.
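As a rough illustration of a parametric smoothing function, the sketch below uses a clamped cubic B-spline whose few control values act as design variables defining a smooth void-size field over a normalized 1D domain; the names, knot layout, and bounds are illustrative assumptions, not the thesis's formulation.

```python
import numpy as np
from scipy.interpolate import BSpline

# Design variables: control values for the void-size field (assumed, 6 of them)
control = np.array([0.1, 0.4, 0.8, 0.8, 0.3, 0.1])

# Clamped cubic B-spline over the normalized design domain [0, 1]
degree = 3
knots = np.concatenate(([0.0] * (degree + 1),
                        np.linspace(0, 1, len(control) - degree + 1)[1:-1],
                        [1.0] * (degree + 1)))
void_size = BSpline(knots, control, degree)

# Evaluate the smooth void-size field at element centers and clip to bounds
x = np.linspace(0.0, 1.0, 50, endpoint=False)
sizes = np.clip(void_size(x), 0.05, 0.95)   # keep voids within assumed manufacturable limits
print(sizes.round(3))
```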
145

Next Generation Computer Controlled Optical Surfacing

Kim, Dae Wook January 2009 (has links)
Precision optics can be accurately fabricated by computer controlled optical surfacing (CCOS), which uses well characterized polishing tools driven by numerically controlled machines. The CCOS process is optimized to vary the dwell time of the tool on the workpiece according to the desired removal and the calibrated tool influence function (TIF), which is the shape of the wear function produced by the tool. This study investigates four major topics to improve current CCOS processes and provides new solutions and approaches for the next generation of CCOS processes. The first topic is the development of a tool for fabricating highly aspheric optics. Both the TIF stability and the surface finish rely on the tool maintaining intimate contact with the workpiece. Rigid tools smooth the surface, but do not maintain intimate contact for aspheric surfaces. Flexible tools conform to the surface, but lack smoothing. A rigid conformal (RC) lap using a visco-elastic non-Newtonian medium was developed. It conforms to the aspheric shape, yet maintains stability to provide natural smoothing. The second topic is a smoothing model for the RC lap. The smoothing naturally removes mid-to-high frequency errors while a large tool runs over the workpiece to remove low frequency errors efficiently. The CCOS process convergence rate can be significantly improved by predicting the smoothing effects. A parametric smoothing model was introduced and verified. The third topic is establishing a TIF model to represent measured TIFs. While the linear Preston's model works for most cases, non-linear removal behavior as the tool overhangs the workpiece edge introduces a difficulty in modeling. A parametric model for the edge TIFs was introduced and demonstrated. Various TIFs based on the model are provided as a library. The last topic is an enhanced process optimization technique. A non-sequential optimization technique using multiple TIFs was developed. Operating a CCOS run with a small and well characterized TIF achieves excellent performance, but takes a long time. Sequential polishing runs using large and small tools can reduce this polishing time. The non-sequential approach performs the dwell time optimizations for all of the CCOS runs simultaneously. The actual runs will be sequential, but the optimization is comprehensive.
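As a rough illustration of the dwell-time idea, the sketch below predicts material removal as the convolution of a tool influence function with a dwell-time map along a 1D scan; the Gaussian TIF, grid, and dwell values are illustrative assumptions, not calibrated data.

```python
import numpy as np

# Assumed Gaussian-shaped tool influence function (removal rate per unit dwell time)
x = np.linspace(-1, 1, 41)
tif = 0.5 * np.exp(-0.5 * (x / 0.3) ** 2)

# Assumed dwell-time map along a 1D scan across the workpiece
dwell = np.ones(200)
dwell[80:120] = 3.0          # dwell longer where more material must be removed

# Predicted removal profile: convolution of the TIF with the dwell-time map
removal = np.convolve(dwell, tif, mode="same")
print(removal.max(), removal.min())
```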
146

Stochastic claims reserving in non-life insurance : Bootstrap and smoothing models

Björkwall, Susanna January 2011 (has links)
In practice there is a long tradition of actuaries calculating reserve estimates with deterministic methods, without explicit reference to a stochastic model. For instance, the chain-ladder was originally a deterministic reserving method. Moreover, actuaries often make ad hoc adjustments to the methods, for example smoothing of the chain-ladder development factors, in order to fit the data set under analysis. However, stochastic models are needed in order to assess the variability of the claims reserve. The standard statistical approach would be to first specify a model, then find an estimate of the outstanding claims under that model, typically by maximum likelihood, and finally use the model to find the precision of the estimate. As a compromise between this approach and the actuary's way of working without reference to a model, the objective in this research area has often been to first construct a model and a method that reproduce the actuary's estimate and then use this model to assess the uncertainty of the estimate. A drawback of this approach is that the suggested models have been constructed to give a measure of the precision of the reserve estimate without the possibility of changing the estimate itself. The starting point of this thesis is the inconsistency between the deterministic approaches used in practice and the stochastic ones suggested in the literature. On the one hand, the purpose of Paper I is to develop a bootstrap technique which easily enables the actuary to use development factor methods other than the pure chain-ladder, relying on as few model assumptions as possible. This bootstrap technique is then extended and applied to the separation method in Paper II. On the other hand, the purpose of Paper III is to create a stochastic framework which imitates the ad hoc deterministic smoothing of chain-ladder development factors that is frequently used in practice.
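As a rough illustration of the chain-ladder method the thesis starts from, the sketch below computes development factors from a cumulative run-off triangle and fills in the outstanding claims; the triangle is made-up illustrative data.

```python
import numpy as np

# Cumulative claims run-off triangle (rows: accident years, columns: development years).
# NaN marks future, not-yet-observed cells. Values are illustrative only.
C = np.array([
    [1000., 1800., 2100., 2200.],
    [1100., 2000., 2400., np.nan],
    [1200., 2300., np.nan, np.nan],
    [1300., np.nan, np.nan, np.nan],
])

n = C.shape[1]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(C[:, j + 1])                 # accident years observed in both columns
    factors.append(C[rows, j + 1].sum() / C[rows, j].sum())

# Fill the lower triangle by applying the development factors column by column
est = C.copy()
for j in range(n - 1):
    missing = np.isnan(est[:, j + 1])
    est[missing, j + 1] = est[missing, j] * factors[j]

latest_diagonal = C[np.arange(4), [3, 2, 1, 0]].sum()
reserve = est[:, -1].sum() - latest_diagonal       # outstanding claims estimate
print(factors, reserve)
```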
147

ECG Noise Filtering Using Online Model-Based Bayesian Filtering Techniques

Su, Aron Wei-Hsiang January 2013 (has links)
The electrocardiogram (ECG) is a time-varying electrical signal that reflects the electrical activity of the heart. It is obtained by a non-invasive technique known as surface electromyography (EMG), used widely in hospitals. There are many clinical contexts in which ECGs are used, such as medical diagnosis, physiological therapy and arrhythmia monitoring. In medical diagnosis, medical conditions are interpreted by examining information and features in ECGs. Physiological therapy involves the control of some aspect of the physiological effort of a patient, such as the use of a pacemaker to regulate the beating of the heart. Arrhythmia monitoring involves observing and detecting life-threatening conditions, such as myocardial infarction (heart attack), in a patient. ECG signals are usually corrupted with various types of unwanted interference, such as muscle artifacts, electrode artifacts, power line noise and respiration interference, and are distorted in such a way that it can be difficult to perform medical diagnosis, physiological therapy or arrhythmia monitoring. Consequently, signal processing on ECGs is required to remove noise and interference for successful clinical applications. Existing signal processing techniques can remove some of the noise in an ECG signal, but are typically inadequate for extracting the weak ECG components contaminated with background noise and for retaining various subtle features in the ECG. For example, noise from the EMG usually overlaps the fundamental ECG cardiac components in the frequency domain, in the range of 0.01 Hz to 100 Hz. Simple filters are inadequate to remove noise which overlaps with the ECG cardiac components. Sameni et al. have proposed a Bayesian filtering framework to resolve these problems, and it gives results which are clearly superior to those obtained by applying conventional signal processing methods to the ECG. However, a drawback of this Bayesian filtering framework is that it must run offline, which is not desirable for clinical applications such as arrhythmia monitoring and physiological therapy, both of which require online operation in near real-time. To resolve this problem, this thesis proposes a dynamical model which permits the Bayesian filtering framework to function online. The framework with the proposed dynamical model loses less than 4% in performance compared to the previous (offline) version of the framework. The proposed dynamical model is based on theory from fixed-lag smoothing.
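As a rough illustration of fixed-lag smoothing, the sketch below runs a scalar Kalman filter on a random-walk model and follows each update with a short backward (RTS) sweep, emitting estimates delayed by a fixed lag; the model, noise levels, and signal are illustrative assumptions, not the thesis's ECG dynamical model.

```python
import numpy as np

def fixed_lag_smoother(measurements, lag=5, q=1e-3, r=0.05):
    """Online fixed-lag smoothing for a scalar random-walk state observed in noise.
    Each new measurement triggers a Kalman filter update followed by a short
    backward (RTS) sweep over the last `lag` steps, emitting the estimate that
    is delayed by `lag` samples."""
    m, P = 0.0, 1.0                        # filtered mean and variance
    window = []                            # (filtered mean, filtered variance) buffer
    out = []
    for z in measurements:
        m_pred, P_pred = m, P + q          # predict (random-walk model)
        K = P_pred / (P_pred + r)          # Kalman gain
        m = m_pred + K * (z - m_pred)      # measurement update
        P = (1.0 - K) * P_pred
        window = (window + [(m, P)])[-(lag + 1):]
        # Backward RTS sweep across the window to smooth its oldest state
        ms, Ps = window[-1]
        for m_f, P_f in reversed(window[:-1]):
            G = P_f / (P_f + q)            # smoother gain (F = 1 for a random walk)
            ms = m_f + G * (ms - m_f)
            Ps = P_f + G * (Ps - (P_f + q)) * G
        out.append(ms)                     # smoothed estimate, delayed by `lag` samples
    return np.array(out)

# Illustrative noisy baseline drift signal (not real ECG data)
rng = np.random.default_rng(3)
t = np.linspace(0, 2, 400)
clean = 0.3 * np.sin(2 * np.pi * t)
noisy = clean + 0.05 * rng.standard_normal(t.size)
smoothed = fixed_lag_smoother(noisy)
print(np.abs(smoothed[5:] - clean[:-5]).mean())   # align for the 5-sample delay
```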
148

Faktorgraph-basierte Sensordatenfusion zur Anwendung auf einem Quadrocopter / Factor Graph Based Sensor Fusion for a Quadrotor UAV

Lange, Sven 13 December 2013 (has links) (PDF)
Sensor data fusion is a ubiquitous task in mobile robotics and beyond. This thesis questions the approach typically used for sensor data fusion in robotics, solves the problem with novel algorithms based on a factor graph, and compares them against a corresponding extended Kalman filter implementation. The focus is the technical and algorithmic sensor concept for navigating a flying robot indoors. Extensive experiments show the gain in quality achieved with the new variant of sensor data fusion, but also its limitations, including examples in which both variants deliver nearly identical results. In addition to experiments based on a simulation close to the real hardware, the approach is also evaluated on real hardware data.
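As a rough illustration of the factor-graph formulation, the sketch below stacks a few linear factors (odometry between 1D poses plus two absolute measurements) into a single weighted least-squares problem; the graph and values are made-up assumptions, far simpler than the thesis's quadrotor system.

```python
import numpy as np

# Three 1D poses x0, x1, x2 connected by odometry factors, plus absolute
# position factors on x0 and x2. Each factor contributes one residual row.
factors = [
    # (jacobian row over [x0, x1, x2], measurement, noise std-dev)
    (np.array([1.0, 0.0, 0.0]), 0.0, 0.1),    # absolute: x0 ≈ 0.0
    (np.array([-1.0, 1.0, 0.0]), 1.0, 0.05),  # odometry: x1 - x0 ≈ 1.0
    (np.array([0.0, -1.0, 1.0]), 1.0, 0.05),  # odometry: x2 - x1 ≈ 1.0
    (np.array([0.0, 0.0, 1.0]), 2.1, 0.1),    # absolute: x2 ≈ 2.1
]

J = np.vstack([row / sigma for row, _, sigma in factors])    # whitened Jacobian
z = np.array([meas / sigma for _, meas, sigma in factors])   # whitened measurements

# All factors here are linear, so one least-squares solve gives the estimate;
# with nonlinear factors this solve would sit inside a Gauss-Newton loop.
x, *_ = np.linalg.lstsq(J, z, rcond=None)
print(x)   # estimated poses [x0, x1, x2]
```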
149

Robust techniques for regression models with minimal assumptions / M.M. van der Westhuizen

Van der Westhuizen, Magdelena Marianna January 2011 (has links)
Good quality management decisions often rely on the evaluation and interpretation of data. One of the most popular ways to investigate possible relationships in a given data set is to fit models to the data. Regression models are often employed to assist with decision making. In addition to decision making, regression models can also be used for optimization and prediction. The success of a regression model, however, relies heavily on the assumptions made by the model builder. The model may also be influenced by the presence of outliers; a more robust model, which is not as easily affected by outliers, is necessary for making more accurate interpretations of the data. In this research study, robust techniques for regression models with minimal assumptions are explored. Mathematical programming techniques such as linear programming, mixed integer linear programming, and piecewise linear regression are used to formulate a nonlinear regression model. Outlier detection and smoothing techniques are included to address the robustness of the model and to improve predictive accuracy. The performance of the model is tested by applying it to a variety of data sets and comparing the results to those of other models. The results of the empirical experiments are also presented in this study. / Thesis (M.Sc. (Computer Science))--North-West University, Potchefstroom Campus, 2011.
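As a rough illustration of such a mathematical-programming formulation, the sketch below casts least-absolute-deviation regression as a linear program, which is less sensitive to outliers than ordinary least squares; the variable names and data are illustrative assumptions, not the thesis's exact model.

```python
import numpy as np
from scipy.optimize import linprog

def lad_regression(X, y):
    """Least-absolute-deviation fit via LP:
    minimize sum(e_plus + e_minus)  s.t.  X @ beta + e_plus - e_minus = y,  e >= 0."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])        # cost only on residual parts
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])             # X beta + e+ - e- = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)      # beta free, residual parts >= 0
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Illustrative data with a gross outlier
rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 40)
y = 2.0 + 0.5 * x + 0.2 * rng.standard_normal(40)
y[0] += 20.0                                   # outlier
X = np.column_stack([np.ones_like(x), x])
print(lad_regression(X, y))                    # robust intercept and slope
```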
