131

Geolocation by Light using Target Tracking / Målföljning med ljusmätningar

Envall, Linus January 2013 (has links)
In order to understand the migration patterns of migrating birds, it is necessary to know when and where they migrate. Many of these birds are very small and thus cannot carry heavy sensors; hence it is necessary to be able to perform positioning using a very small sensor. One way to do this is to use a light-intensity sensor: since sunrise and sunset times are known given the time and position on the earth, and light intensity increases as the sun rises, it is possible to determine the global position from light intensity. Data sets from several calibration sensors, mainly from different locations in Sweden, have been examined in different ways in order to get an understanding of the measurements and what affects them. In order to perform positioning, it is necessary to know the solar elevation angle, which can be computed if the time and position are known, as is the case for the calibration sensors. This has been utilized to identify a mapping from measured light intensity to solar elevation angle, which is used to compute pseudo-measurements for target tracking, described below. In this thesis, positioning is performed using methods from the field of target tracking, both causally (filtering) and non-causally (smoothing). Certain problems arise: firstly, the measured light intensity can be attenuated by weather conditions such as cloudiness, which is modelled as a time-varying offset; secondly, the sensor can be shadowed, causing outliers in the data. Furthermore, birds are not always in a migratory state; they oftentimes stay in one place. The latter two phenomena are modelled using an Interacting Multiple Model (IMM) filter, where they are represented as discrete states corresponding to different models.
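The pseudo-measurement idea rests on the geometry linking time, position, and solar elevation. A minimal sketch of that geometry, using a textbook declination approximation (the function and approximation are assumptions of this sketch, not the thesis's calibration-derived mapping):

```python
import numpy as np

def solar_elevation(lat_deg, lon_deg, day_of_year, utc_hour):
    """Approximate solar elevation angle in degrees for a given position
    and time, using a textbook declination formula (accurate to roughly
    a degree -- enough to show how elevation links light to position)."""
    lat = np.radians(lat_deg)
    # Approximate solar declination for this day of the year.
    decl = np.radians(-23.44) * np.cos(2.0 * np.pi / 365.0 * (day_of_year + 10))
    # Local solar time and hour angle (the sun moves 15 degrees per hour).
    solar_time = utc_hour + lon_deg / 15.0
    hour_angle = np.radians(15.0 * (solar_time - 12.0))
    sin_elev = (np.sin(lat) * np.sin(decl)
                + np.cos(lat) * np.cos(decl) * np.cos(hour_angle))
    return np.degrees(np.arcsin(sin_elev))

# E.g. mid-June around noon UTC near Stockholm:
# solar_elevation(59.3, 18.1, 170, 11.0) -> roughly 54 degrees
```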
132

ECG Noise Filtering Using Online Model-Based Bayesian Filtering Techniques

Su, Aron Wei-Hsiang January 2013 (has links)
The electrocardiogram (ECG) is a time-varying electrical signal that reflects the electrical activity of the heart. It is obtained by a non-invasive technique using surface electrodes and is used widely in hospitals. There are many clinical contexts in which ECGs are used, such as medical diagnosis, physiological therapy and arrhythmia monitoring. In medical diagnosis, medical conditions are interpreted by examining information and features in ECGs. Physiological therapy involves the control of some aspect of the physiological effort of a patient, such as the use of a pacemaker to regulate the beating of the heart. Arrhythmia monitoring involves observing and detecting life-threatening conditions, such as myocardial infarction (heart attack), in a patient. ECG signals are usually corrupted with various types of unwanted interference, such as muscle artifacts, electrode artifacts, power line noise and respiration interference, and are distorted in such a way that it can be difficult to perform medical diagnosis, physiological therapy or arrhythmia monitoring. Consequently, signal processing on ECGs is required to remove noise and interference for successful clinical applications. Existing signal processing techniques can remove some of the noise in an ECG signal, but are typically inadequate for extracting weak ECG components contaminated with background noise and for retaining various subtle features of the ECG. For example, noise from the electromyogram (EMG) usually overlaps the fundamental ECG cardiac components in the frequency domain, in the range of 0.01 Hz to 100 Hz, and simple filters are inadequate to remove noise which overlaps with the ECG cardiac components. Sameni et al. have proposed a Bayesian filtering framework to resolve these problems, which gives results clearly superior to those obtained by applying conventional signal processing methods to ECG. However, a drawback of this Bayesian filtering framework is that it must run offline, which is undesirable for clinical applications such as arrhythmia monitoring and physiological therapy, both of which require online operation in near real-time. To resolve this problem, in this thesis we propose a dynamical model which permits the Bayesian filtering framework to function online. The framework with the proposed dynamical model has less than 4% loss in performance compared to the previous (offline) version of the framework. The proposed dynamical model is based on theory from fixed-lag smoothing.
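Fixed-lag smoothing can be run online by augmenting the state with its recent history, at the cost of a small output delay. A minimal sketch on a toy scalar random walk, not the ECG dynamical model of the thesis:

```python
import numpy as np

def fixed_lag_smoother(y, q, r, lag):
    """Fixed-lag Kalman smoother for a scalar random walk
    x_k = x_{k-1} + w,  y_k = x_k + v,  via state augmentation:
    the state stacks [x_k, x_{k-1}, ..., x_{k-lag}], so the filtered
    estimate of the last component is the lag-L smoothed value --
    computed online, delayed by `lag` samples."""
    n = lag + 1
    F = np.eye(n, k=-1)              # shift past states down one slot
    F[0, 0] = 1.0                    # random-walk propagation of x_k
    H = np.zeros((1, n)); H[0, 0] = 1.0
    Q = np.zeros((n, n)); Q[0, 0] = q
    x = np.zeros(n); P = np.eye(n)
    out = []
    for yk in y:
        x = F @ x; P = F @ P @ F.T + Q        # predict
        S = H @ P @ H.T + r
        K = (P @ H.T) / S                     # Kalman gain
        x = x + (K * (yk - H @ x)).ravel()    # update
        P = P - K @ H @ P
        out.append(x[-1])                     # smoothed estimate of x_{k-lag}
    return np.array(out)
```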
133

An Optimization Based Approach to Visual Odometry Using Infrared Images

Nilsson, Emil January 2010 (has links)
The goal of this work has been to improve the accuracy of a pre-existing algorithm for vehicle pose estimation, which uses intrinsic measurements of vehicle motion and measurements derived from far-infrared images. Estimating the pose of a vehicle from images from an on-board camera and intrinsic measurements of vehicle motion is a problem of simultaneous localization and mapping (SLAM), and it can be solved using the extended Kalman filter (EKF). The EKF is a causal filter, so if the pose estimation problem is to be solved off-line, acausal methods are expected to increase estimation accuracy significantly. In this work the EKF has been compared with an acausal method for solving the SLAM problem called smoothing and mapping (SAM), an optimization-based method that minimizes process and measurement noise. Analyses of how improvements in the vehicle motion model, using a number of different model extensions, affect the accuracy of pose estimates have also been performed.
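The contrast between the causal EKF and optimization-based SAM can be seen in one dimension: SAM estimates the whole trajectory at once by minimizing weighted process and measurement residuals, so every estimate uses all measurements. A sketch under that simplification (all names are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def smooth_and_map_1d(y, u, sigma_w, sigma_v):
    """Toy 1-D analogue of smoothing-and-mapping: jointly estimate the
    state trajectory by minimizing weighted process residuals
    (x_k - x_{k-1} - u_k) and measurement residuals (y_k - x_k).
    `u` holds odometry increments (u[0] is unused)."""
    def residuals(x):
        proc = (x[1:] - x[:-1] - u[1:]) / sigma_w   # process-noise terms
        meas = (y - x) / sigma_v                    # measurement-noise terms
        return np.concatenate([proc, meas])

    x0 = np.asarray(y, dtype=float)                 # initialize at measurements
    return least_squares(residuals, x0).x
```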
134

R&D Capitalization and The Income Smoothing Hypothesis – A study of Swedish listed Companies

Fuentes, Karen, Persson, Annelie January 2011 (has links)
This paper examines whether Swedish listed firms use research and development (R&D) accounting as a tool for income smoothing (hypothesis 1). One controversial accounting issue concerning R&D is that R&D capitalization could be influenced by earnings-management purposes, owing to a subjective accounting treatment. We also examine whether firms' degree of fluctuation in return on assets (ROA) has an effect on income smoothing behavior (hypothesis 2). Finally, we investigate whether the level of flexibility allowed in R&D accounting under the different accounting standards BFN R1, RR 15 and IAS 38 has an effect on income smoothing behavior (hypothesis 3). We study the accounts of 21 firms for the years 1998-2000, 52 firms for 2002-2004 and 59 firms for 2007-2009. Using multiple regression analysis, we find that the income smoothing hypothesis is supported in the second period (2002-2004). The regression analysis also indicates that firms with a low change in ROA tend to capitalize more R&D when they are less profitable than in the prior year. Our results further imply that the level of flexibility in the different accounting standards does not have an effect on income smoothing behavior, so hypothesis 3 is not supported.
135

Analysing stochastic call demand with time varying parameters

Li, Song 25 November 2005 (has links)
In spite of increasingly sophisticated workforce management tools, a significant gap remains between the goal of effective staffing and the ability to predict the stochastic demand of inbound calls. We have investigated the hypothesized nonhomogeneous Poisson process model of modem pool callers of the University community. In our case, we tested whether the arrivals could be approximated by a piecewise constant rate over short intervals. For each of 1- and 10-minute intervals, based on the close relationship between the Poisson process and the exponential distribution, the test results did not show any sign of a homogeneous Poisson process. We have examined the hypothesis of a nonhomogeneous Poisson process by a transformed statistic, and quantitative and graphical goodness-of-fit tests have confirmed a nonhomogeneous Poisson process. Further analysis of the intensity function revealed that a linear-rate intensity was woefully inadequate for predicting time-varying arrivals. For the sinusoidal rate model, difficulty arose in setting the period parameter. Spline models, as an alternative to parametric modelling, gave more control over the balance between data fitting and smoothness, which was appealing for our analysis of the call arrival process.
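The interval-wise check described here can be sketched simply: within each short interval, a homogeneous Poisson process implies i.i.d. exponential inter-arrival times. A rough illustration (the thesis's transformed statistics are more careful than this naive version):

```python
import numpy as np
from scipy import stats

def piecewise_exponential_test(arrival_times, interval_len=60.0):
    """Check the piecewise-constant-rate hypothesis: within each short
    interval a homogeneous Poisson process has i.i.d. exponential
    inter-arrival times, tested here with a Kolmogorov-Smirnov test.
    (Estimating the rate from the same data biases the test slightly.)"""
    t = np.sort(np.asarray(arrival_times, dtype=float))
    pvalues = []
    for lo in np.arange(t[0], t[-1], interval_len):
        seg = t[(t >= lo) & (t < lo + interval_len)]
        gaps = np.diff(seg)
        if len(gaps) < 10:                      # too few events to test
            continue
        # KS test of the gaps against Exp(scale = mean gap).
        pvalues.append(stats.kstest(gaps, "expon",
                                    args=(0.0, gaps.mean())).pvalue)
    return np.array(pvalues)
```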
136

Bayesian classification and survival analysis with curve predictors

Wang, Xiaohui 15 May 2009 (has links)
We propose classification models for binary and multicategory data where the predictor is a random function. The functional predictor could be irregularly and sparsely sampled, or characterized by high dimension and sharp localized changes. In the former case, we employ Bayesian modeling utilizing a flexible spline basis, which is widely used for functional regression. In the latter case, we use Bayesian modeling with wavelet basis functions, which have nice approximation properties over a large class of function spaces and can accommodate the variety of functional forms observed in real-life applications. We develop a unified hierarchical model which accommodates both the adaptive spline- or wavelet-based function estimation model and the logistic classification model. These two models are coupled to borrow strength from each other in this unified hierarchical framework. The use of Gibbs sampling with conjugate priors for posterior inference makes the method computationally feasible. We compare the performance of the proposed models with naive models as well as existing alternatives by analyzing simulated and real data. We also propose a Bayesian unified hierarchical model based on a proportional hazards model and a generalized linear model for survival analysis with irregular longitudinal covariates. This relatively simple joint model has two advantages. One is that using a spline basis simplifies the parameterization while capturing a flexible non-linear pattern of the function. The other is that the joint modeling framework allows sharing of information between the regression on functional predictors and the proportional hazards modeling of survival data, improving the efficiency of estimation. The method can be used not only with one functional predictor but also with multiple functional predictors. Our methods are applied to real data sets and compared with a parameterized regression method.
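The spline-basis route can be illustrated with a simple two-stage stand-in: expand each curve in a B-spline basis, then classify on the coefficients. The thesis instead couples function estimation and classification in one Gibbs-sampled hierarchy; the sketch below (function names are mine) shows only the basis-expansion step:

```python
import numpy as np
from scipy.interpolate import BSpline
from sklearn.linear_model import LogisticRegression

def spline_features(tgrid, curves, n_basis=10, degree=3):
    """Project each observed curve (rows of `curves`, sampled on `tgrid`)
    onto a clamped B-spline basis by least squares; the basis
    coefficients become a low-dimensional feature vector per curve."""
    inner = np.linspace(tgrid.min(), tgrid.max(), n_basis - degree + 1)
    knots = np.r_[[tgrid.min()] * degree, inner, [tgrid.max()] * degree]
    # Design matrix: evaluate every basis function on the grid.
    B = BSpline(knots, np.eye(n_basis), degree)(tgrid)
    coefs, *_ = np.linalg.lstsq(B, curves.T, rcond=None)
    return coefs.T                              # (n_curves, n_basis)

# Frequentist stand-in for the Gibbs-sampled hierarchical model:
# clf = LogisticRegression().fit(spline_features(t, X), labels)
```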
137

An Additive Bivariate Hierarchical Model for Functional Data and Related Computations

Redd, Andrew Middleton August 2010 (has links)
The work presented in this dissertation centers on the theme of regression and computation methodology. Functional data is an important class of longitudinal data, and principal component analysis is an important approach to regression with this type of data. Here we present an additive hierarchical bivariate functional data model employing principal components to identify random effects. This additive model extends the univariate functional principal component model. These models are implemented in the pfda package for R. To fit the curves from this class of models, orthogonalized spline bases are used to reduce the dimensionality of the fit while retaining flexibility. Methods for handling spline basis functions in a purely analytical manner, including the orthogonalizing process and the computation of the penalty matrices used to fit the principal component models, are presented. These methods are implemented in the R package orthogonalsplinebasis. The projects discussed involve complicated coding for the implementations in R. To facilitate this, I created the NppToR utility to add R functionality to the popular Windows code editor Notepad++. A brief overview of the use of the utility is also included.
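A common shortcut for spline penalty matrices is the discrete P-spline difference penalty sketched below; the orthogonalsplinebasis package instead derives the exact integral penalty analytically, which this sketch does not attempt:

```python
import numpy as np

def difference_penalty(n_basis, order=2):
    """Discrete P-spline penalty: D is the order-th difference operator
    on the coefficient vector, and D.T @ D penalizes roughness of the
    fitted curve (an approximation to the integrated squared
    second-derivative penalty)."""
    D = np.diff(np.eye(n_basis), n=order, axis=0)
    return D.T @ D

def fit_penalized(B, y, lam, order=2):
    """Penalized least squares on a design matrix B of basis functions:
    solve (B'B + lam * P) c = B'y for the coefficients c."""
    P = difference_penalty(B.shape[1], order)
    return np.linalg.solve(B.T @ B + lam * P, B.T @ y)
```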
138

Improving Tool Paths for Impellers

Kuo, Hsin-Hung 02 September 2004 (has links)
Impellers are important components in the aerospace, energy technology, and precision machine industries. Given the required accuracy and structural integrity, impellers are typically manufactured by cutting. Due to their complex geometries and the high degree of interference in machining, multi-axis machines are required to produce impellers. The objective of this thesis is to improve 5-axis tool paths for the surface quality of impellers, by smoothing point-cutting tool paths expressed as linear segments and B-splines, and by using flank milling technologies with linear-segment and B-spline tool paths. Experimental results show that the surface quality of impeller blades can be improved by point cutting with smoothed tool paths and by flank milling. Moreover, the required milling time can be reduced by 18 percent and 13 percent based on smoothed linear tool paths and smoothed B-spline tool paths, respectively.
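Smoothing a point-cutting path with a B-spline can be sketched with a standard parametric spline fit over cutter locations only; real 5-axis planning must also handle tool orientation and interference checking, which this sketch ignores:

```python
import numpy as np
from scipy.interpolate import splprep, splev

def smooth_tool_path(points, smoothing=0.01, n_out=500):
    """Fit a smoothing cubic B-spline through a 3-D polyline of cutter
    locations and resample it densely. Larger `smoothing` trades
    positional fidelity for a smoother path (steadier axis motion)."""
    x, y, z = np.asarray(points, dtype=float).T
    tck, u = splprep([x, y, z], s=smoothing * len(x), k=3)
    u_new = np.linspace(0.0, 1.0, n_out)
    return np.column_stack(splev(u_new, tck))   # (n_out, 3) smoothed path
```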
139

Regression analysis with longitudinal measurements

Ryu, Duchwan 29 August 2005 (has links)
Bayesian approaches to regression analysis for longitudinal measurements are considered. The history of measurements from a subject may convey characteristics of the subject; hence, in a regression analysis with longitudinal measurements, the characteristics of each subject can serve as covariates, in addition to possible other covariates. The longitudinal measurements may also lead to complicated covariance structures within each subject, and these should be modeled properly. When the covariates are unobservable characteristics of each subject, Bayesian parametric and nonparametric regressions have been considered. Although the covariates are not observable directly, they can be estimated by virtue of the longitudinal measurements; in this case the measurement error problem is inevitable, and a classical measurement error model is established. In the Bayesian framework, the regression function, as well as all the unobservable covariates and nuisance parameters, is estimated. As multiple covariates are involved, a generalized additive model is adopted, and the Bayesian backfitting algorithm is utilized for each component of the additive model. For binary responses, logistic regression is proposed, where the link function is estimated by Bayesian parametric and nonparametric regressions; for the link function, the introduction of latent variables makes the computation fast. In the next part, each subject is assumed to be observed at time points that are not prespecified; furthermore, the time of the next measurement from a subject is supposed to depend on the subject's previous measurement history. For these outcome-dependent follow-up times, various modeling options and the associated analyses have been examined to investigate how outcome-dependent follow-up times affect estimation, within the frameworks of Bayesian parametric and nonparametric regression. Correlation structures of outcomes are based on different correlation coefficients for different subjects. First, regression models have been constructed by assuming a Poisson process for the follow-up times. To interpret the subject-specific random effects, more flexible models are considered by introducing a latent variable for the subject-specific random effect and a survival distribution for the follow-up times. The performance of each model has been evaluated utilizing Bayesian model assessments.
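Classical backfitting, of which the Bayesian version replaces each component fit with a draw from its conditional posterior, cycles through the additive components, fitting each to the partial residuals of the others. A minimal sketch with a naive running-mean smoother (illustrative only):

```python
import numpy as np

def backfit(X, y, smoother, n_iter=20):
    """Classical backfitting for an additive model
    y = alpha + sum_j f_j(x_j) + noise: cycle through components,
    fitting each f_j to the partial residuals of all the others.
    `smoother(x, r)` returns fitted values of any 1-D smoother."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=1) + f[:, j]
            f[:, j] = smoother(X[:, j], partial)
            f[:, j] -= f[:, j].mean()        # identifiability: center each f_j
    return alpha, f

def running_mean(x, r, k=15):
    """Naive 1-D smoother: average residuals over a window of k
    neighbours in the sorted order of x."""
    order = np.argsort(x)
    rs = r[order]
    out = np.empty_like(r)
    for i in range(len(x)):
        lo, hi = max(0, i - k // 2), min(len(x), i + k // 2 + 1)
        out[order[i]] = rs[lo:hi].mean()
    return out
```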
140

Statistical analysis and modeling: cancer, clinical trials, environment and epidemiology.

Vovoras, Dimitrios 01 January 2011 (has links)
The current thesis is structured in four parts. Vector smoothing methods are used to study environmental data, in particular records of extreme precipitation; the models utilized belong to the vector generalized additive class. In the statistical analysis of observational studies, the identification of and adjustment for prognostic factors is an important component of the analysis; employing flexible statistical methods, namely generalized additive models, to identify and characterize the effect of potential prognostic factors in a clinical trial presents an alternative to the traditional linear statistical model. The classes of models for which the methodology gives generalized additive extensions include models for grouped survival data on tumors of the brain and the central nervous system from the Surveillance, Epidemiology, and End Results (SEER) database; we employ piecewise linear functions of the covariates to characterize the survival experienced by the population. Finally, both descriptive and analytical methods are utilized to study incidence rates and tumor sizes associated with the disease.
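Piecewise linear covariate effects of the kind mentioned here are commonly built from a hinge basis; a small sketch of one standard construction (not necessarily the thesis's):

```python
import numpy as np

def piecewise_linear_basis(x, knots):
    """Hinge basis for a continuous piecewise-linear effect: columns are
    x and (x - knot)_+ for each knot, so an ordinary linear model on
    these columns fits a broken-line function of the covariate."""
    x = np.asarray(x, dtype=float)
    cols = [x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

# e.g. coef, *_ = np.linalg.lstsq(piecewise_linear_basis(age, [40, 60]),
#                                 y, rcond=None)
```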
