  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
931

Method for Improving the Efficiency of Image Super-Resolution Algorithms Based on Kalman Filters

Dobson, William Keith 01 December 2009 (has links)
The Kalman filter has many applications in control and signal processing but may also be used to reconstruct a higher-resolution image from a sequence of lower-resolution images (or frames). If the sequence of low-resolution frames is recorded by a moving camera or sensor, and the motion can be accurately modeled, then the Kalman filter may be used to update pixels within a higher-resolution frame to achieve a more detailed result. This thesis outlines current methods of implementing this algorithm on a scene of interest and introduces possible improvements to the method's speed and efficiency through block operations on the low-resolution frames. The effects of noise on camera motion and of various blur models are examined using experimental data to illustrate the differences between the methods discussed.
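The per-pixel correction at the heart of this approach can be sketched with a scalar Kalman measurement update: each registered low-resolution observation pulls the high-resolution pixel estimate toward it and shrinks the estimate's variance. This is a minimal illustration under invented noise values, not the thesis's block-based method.

```python
import numpy as np

def kalman_pixel_update(x, P, z, R):
    """One scalar Kalman measurement update for a pixel estimate.

    x: current high-resolution pixel estimate
    P: variance of that estimate
    z: new observation from a registered low-resolution frame
    R: observation noise variance
    """
    K = P / (P + R)          # Kalman gain
    x_new = x + K * (z - x)  # correct the estimate toward the observation
    P_new = (1 - K) * P      # variance shrinks after each update
    return x_new, P_new

# Fuse a sequence of noisy observations of one pixel (true value 0.8).
rng = np.random.default_rng(0)
x, P = 0.5, 1.0              # rough prior
for _ in range(50):
    z = 0.8 + rng.normal(0, 0.1)
    x, P = kalman_pixel_update(x, P, z, R=0.01)
```

Running the same update over every pixel of a high-resolution grid, frame by frame, is what makes the method incremental; the block operations discussed in the thesis amortize this work over groups of pixels.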
932

Advanced Statistical Methodologies in Determining the Observation Time to Discriminate Viruses Using FTIR

Luo, Shan 13 July 2009 (has links)
Fourier transform infrared (FTIR) spectroscopy, an electromagnetic-radiation method for detecting specific cellular molecular structures, can be used to discriminate different types of cells. The objective is to find the minimum time (choosing among 2, 4 and 6 hours) to record FTIR readings such that different viruses can be discriminated. A new method is adopted for the datasets. Briefly, inner differences are created as the control group, and the Wilcoxon signed-rank test is used as the first variable-selection procedure to prepare for the discrimination stage. In the second stage we propose either the partial least squares (PLS) method or simply taking significant differences as the discriminator. Finally, k-fold cross-validation is used to estimate the shrinkage of the goodness measures, such as sensitivity, specificity and area under the ROC curve (AUC). Six hours is clearly sufficient for discriminating mock from HSV-1 and Coxsackie viruses; adenovirus is an exception.
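The first-stage screening can be illustrated with a Wilcoxon signed-rank test applied per wavenumber to simulated paired differences. The data here are invented for illustration, not the thesis's FTIR measurements.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
# Hypothetical paired differences at 100 wavenumbers: the first 10
# wavenumbers carry a real virus-vs-mock shift, the rest are pure noise.
n_pairs, n_vars = 30, 100
diffs = rng.normal(0, 1, (n_pairs, n_vars))
diffs[:, :10] += 2.0  # informative wavenumbers

# Stage 1: Wilcoxon signed-rank test per wavenumber; keep small p-values
# as candidate variables for the second (discrimination) stage.
pvals = np.array([wilcoxon(diffs[:, j]).pvalue for j in range(n_vars)])
selected = np.where(pvals < 0.01)[0]
```

The nonparametric test makes no normality assumption on the spectral differences, which matters because FTIR absorbances at individual wavenumbers need not be Gaussian.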
933

Data-driven estimation for Aalen's additive risk model

Boruvka, Audrey 02 August 2007 (has links)
The proportional hazards model developed by Cox (1972) is by far the most widely used method for regression analysis of censored survival data. Application of the Cox model to more general event history data has become possible through extensions using counting process theory (e.g., Andersen and Borgan (1985), Therneau and Grambsch (2000)). With its development based entirely on counting processes, Aalen's additive risk model offers a flexible, nonparametric alternative. Ordinary least squares, weighted least squares and ridge regression have been proposed in the literature as estimation schemes for Aalen's model (Aalen (1989), Huffer and McKeague (1991), Aalen et al. (2004)). This thesis develops data-driven parameter selection criteria for the weighted least squares and ridge estimators. Using simulated survival data, these new methods are evaluated against existing approaches. A survey of the literature on the additive risk model and a demonstration of its application to real data sets are also provided. / Thesis (Master, Mathematics & Statistics) -- Queen's University, 2007-07-18
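The flavor of data-driven penalty selection can be sketched as k-fold cross-validation over a grid of ridge penalties. This toy uses an ordinary least-squares setting, not Aalen's counting-process estimator; the function names and the penalty grid are invented for illustration.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge estimate: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_ridge(X, y, lams, k=5, seed=0):
    """Pick the ridge penalty minimizing k-fold prediction error."""
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    errs = []
    for lam in lams:
        err = 0.0
        for f in folds:
            train = np.setdiff1d(idx, f)
            beta = ridge_fit(X[train], y[train], lam)
            err += np.sum((y[f] - X[f] @ beta) ** 2)
        errs.append(err / n)
    return lams[int(np.argmin(errs))]

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 10))
beta_true = np.zeros(10)
beta_true[:3] = 1.0
y = X @ beta_true + rng.normal(0, 0.5, 80)
best_lam = cv_ridge(X, y, lams=[0.01, 0.1, 1.0, 10.0, 100.0])
```

The same held-out-prediction-error logic underlies data-driven selection of the ridge and weighting parameters in the survival setting, with the loss adapted to censored event-history data.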
934

Novel 3D Back Reconstruction using Stereo Digital Cameras

Kumar, Anish Unknown Date
No description available.
935

Harmonization of SACU Trade Policies in the Tourism & Hospitality Service Sectors.

Masuku, Gabriel Mthokozisi Sifiso. January 2009 (has links)
The general objective of the proposed research is to do a needs analysis for the tourism and hospitality industries of South Africa, Botswana, Namibia, Lesotho and Swaziland. This will be followed by an alignment of these industries with the provisions of the General Agreement on Trade in Services, commonly known as GATS, so that a Tourism and Hospitality Services Charter may be moulded for uniform use throughout SACU. The specific objectives of the research are: to analyze impact assessment reports and studies conducted on the tourism and hospitality industries of all five SACU member states, with the aim of harmonizing standards, costs and border procedures; to recognize SACU member states' schedules of GATS commitments, especially in the service sectors being investigated, by improving market access, and to recommend minimal infrastructural development levels to be attained in support of such sectors; to make recommendations to harness the challenges faced by the said industries into a working document; to calibrate a uniformity of trade standards in these sectors to be used by the SACU membership; and to ensure that the template is flexible enough for SACU to easily adopt and use in ongoing bilateral negotiations, for example.
936

Special and differential treatment for trade in agriculture :does it answer the quest for development in African countries?

Fantu Farris Mulleta January 2009 (has links)
The research paper seeks to investigate the possible ways in which African countries can maximise their benefit from the existing special and differential treatment clauses for trade in agriculture, and then make recommendations as to what should be the potential bargaining position of African countries with regard to future trade negotiations on agricultural trade.
937

Second-order Least Squares Estimation in Generalized Linear Mixed Models

Li, He 06 April 2011 (has links)
Maximum likelihood is a ubiquitous method for estimating generalized linear mixed models (GLMMs). However, the method entails computational difficulties and relies on the normality assumption for random effects. We propose a second-order least squares (SLS) estimator based on the first two marginal moments of the response variables. The proposed estimator is computationally feasible and requires fewer distributional assumptions than the maximum likelihood estimator. To overcome the numerical difficulty of minimizing an objective function that involves multiple integrals, a simulation-based SLS estimator is proposed. We show that the SLS estimators are consistent and asymptotically normally distributed under fairly general conditions in the GLMM framework. Missing data are almost inevitable in longitudinal studies, and problems arise if the missing data mechanism is related to the response process. This thesis extends the proposed estimators to handle response data missing at random by either adopting the inverse probability weighting method or applying the multiple imputation approach. In practice, some covariates are not directly observed but are measured with error. It is well known that simply substituting a proxy variable for the unobserved covariate in the model will generally lead to biased and inconsistent estimates. We propose the instrumental variable method for consistent estimation of GLMMs with covariate measurement error. The proposed approach does not need any parametric assumption on the distribution of the unknown covariates, which makes it less restrictive than methods that rely on either a parametric distribution of the covariates or on estimating that distribution from extra information. In the presence of outliers, there is a concern that the SLS estimators may be vulnerable because they use second-order moments. We investigate the robustness of the SLS estimators using their influence functions, and show that the proposed estimators have a bounded influence function and a redescending property, so they are robust to outliers. The finite sample performance and properties of the SLS estimators are studied and compared with other popular estimators through simulation studies and real-world data examples.
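The idea of matching the first two marginal moments can be sketched in the simplest possible setting, an i.i.d. normal sample, where the SLS criterion penalizes deviations of y and y² from their model moments E[y] = mu and E[y²] = mu² + sigma². This is a toy illustration of the moment-matching principle only, not the GLMM estimator developed in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def sls_objective(theta, y):
    """Second-order least squares criterion: squared deviations of the
    data from the first two model moments, summed over observations."""
    mu, log_s2 = theta
    s2 = np.exp(log_s2)               # keep the variance positive
    r1 = y - mu                       # first-moment residual
    r2 = y**2 - (mu**2 + s2)          # second-moment residual
    return np.sum(r1**2 + r2**2)

rng = np.random.default_rng(3)
y = rng.normal(2.0, 1.5, size=500)    # true mu = 2.0, sigma^2 = 2.25

res = minimize(sls_objective, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
mu_hat, s2_hat = res.x[0], np.exp(res.x[1])
```

Only moments, not a full likelihood, enter the criterion, which is why no normality assumption on the random effects is needed in the GLMM version; in that setting the moments involve integrals over the random effects, motivating the simulation-based variant.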
938

Variable Selection and Function Estimation Using Penalized Methods

Xu, Ganggang December 2011 (has links)
Penalized methods are becoming more and more popular in statistical research. This dissertation covers two major applications of penalized methods: variable selection and nonparametric function estimation. Infinite variance autoregressive models are important for modeling heavy-tailed time series. We use a penalty method to conduct model selection for autoregressive models whose innovations are in the domain of attraction of a stable law indexed by alpha in (0, 2). We show that by combining the least absolute deviation loss function and the adaptive lasso penalty, we can consistently identify the true model; at the same time, the resulting coefficient estimator converges at a rate of n^(-1/alpha). The proposed approach gives a unified variable selection procedure for both finite and infinite variance autoregressive models. While automatic smoothing parameter selection for nonparametric function estimation has been extensively researched for independent data, it is much less developed for clustered and longitudinal data. Although leave-subject-out cross-validation (CV) has been widely used, its theoretical properties are unknown and its minimization is computationally expensive, especially when there are multiple smoothing parameters. Focusing on penalized modeling methods, we show that leave-subject-out CV is optimal in that its minimization is asymptotically equivalent to the minimization of the true loss function. We develop an efficient Newton-type algorithm to compute the smoothing parameters that minimize the CV criterion. Furthermore, we derive a simplification of the leave-subject-out CV that leads to a more efficient algorithm for selecting the smoothing parameters; the simplified criterion is asymptotically equivalent to the unsimplified one and thus enjoys the same optimality property. This CV criterion also provides a completely data-driven approach to selecting a working covariance structure using generalized estimating equations in longitudinal data analysis. Our results are applicable to additive, linear varying-coefficient, and nonlinear models with data from exponential families.
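The adaptive-lasso component can be sketched in an ordinary finite-variance linear model: OLS-based weights rescale the columns so a plain lasso penalizes small coefficients more heavily, which is what enables consistent model selection. This toy uses a least-squares loss rather than the least absolute deviation loss of the dissertation, and the coordinate-descent implementation is a generic sketch.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain lasso via cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = np.sum(X**2, axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_ss[j]
    return beta

def adaptive_lasso(X, y, lam):
    """Adaptive lasso: penalty weights 1/|OLS estimate|, so large true
    coefficients are penalized less than noise coefficients."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / np.maximum(np.abs(beta_ols), 1e-8)
    Xw = X / w                         # column-rescaling trick
    beta_w = lasso_cd(Xw, y, lam)
    return beta_w / w                  # map back to the original scale

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 8))
beta_true = np.array([2.0, -1.5, 0, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.normal(0, 0.5, 200)
beta_hat = adaptive_lasso(X, y, lam=0.1)
```

In the heavy-tailed autoregressive setting of the dissertation, the squared-error loss inside `lasso_cd` would be replaced by the least absolute deviation loss, which is what secures the n^(-1/alpha) rate for infinite-variance innovations.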
939

A multivariate approach to QSAR

Hellberg, Sven January 1986 (has links)
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: 1. to quantify chemical structure in terms of latent variables expressing analogy, 2. to design test series of compounds, 3. to measure biological activity, and 4. to construct a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to closely correspond to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers and (d) four different families of peptides. / Diss. (summary) Umeå: Umeå University, 1986, comprising 8 papers
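The latent-variable step (subproblem 1) can be sketched as PCA on a hypothetical descriptor matrix: when a few underlying chemical factors drive many descriptors, the leading principal components recover nearly all of the structural variation. The descriptor values below are made up for illustration.

```python
import numpy as np

# Hypothetical descriptor matrix: 20 compounds x 6 physicochemical
# descriptors, generated from two underlying chemical factors plus noise.
rng = np.random.default_rng(5)
latent = rng.normal(size=(20, 2))           # two latent chemical factors
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.05 * rng.normal(size=(20, 6))

# PCA via SVD of the column-centered matrix: principal component scores
# serve as the "latent variables expressing analogy" for each compound.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                              # compound scores on each PC
explained = s**2 / np.sum(s**2)             # variance explained per PC
```

PLS follows the same projection idea but chooses the latent directions to maximize covariance with the biological activity rather than variance of the descriptors alone.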
940

Design and implementation of sensor fusion for the towed synthetic aperture sonar

Meng, Rui Daniel January 2007 (has links)
For synthetic aperture imaging, position and orientation deviation is of great concern: unknown motions of a synthetic aperture sonar (SAS) can blur the reconstructed images and degrade image quality considerably. Given the high sensitivity of synthetic aperture imaging to sonar deviation, this research aims to provide a complete navigation solution for a free-towed SAS, covering everything from the design and construction of the navigation card through to data postprocessing that produces position, velocity, and attitude information for the sonar. The designed navigation card uses low-cost Micro-Electro-Mechanical-Systems (MEMS) Magnetic, Angular Rate, and Gravity (MARG) sensors, comprising three angular rate gyroscopes, three dual-axis accelerometers, and a triaxial magnetic hybrid. These MARG sensors are mounted orthogonally on a standard 180mm Eurocard PCB to monitor the motion of the sonar in six degrees of freedom. Sensor calibration algorithms are presented for each individual sensor according to its characteristics to precisely determine sensor parameters; the nonlinear least squares method and a two-step estimator are used in particular for the calibration of accelerometers and magnetometers. A quaternion-based extended Kalman filter, built on a total state space model, fuses the calibrated navigation data; in the model, frame transformations are described using quaternions instead of other attitude representations. Simulation and experimental results are presented in this thesis to verify the capability of the sensor fusion strategy.
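The attitude-propagation step inside such a quaternion-based filter can be sketched as first-order integration of body angular rates, with renormalization to keep the quaternion on the unit sphere. This is a minimal sketch of the kinematics only, not the full extended Kalman filter or MARG fusion described in the thesis; the rate and time step are invented.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def propagate(q, omega, dt):
    """Advance the attitude quaternion by body rate omega (rad/s) over dt,
    using q_dot = 0.5 * q (x) [0, omega] and a first-order Euler step."""
    dq = 0.5 * quat_mult(q, np.array([0.0, *omega]))
    q = q + dq * dt
    return q / np.linalg.norm(q)      # renormalize to unit length

# Rotate at 90 deg/s about the body z-axis for 1 s in small steps.
q = np.array([1.0, 0.0, 0.0, 0.0])    # identity attitude
omega = np.array([0.0, 0.0, np.pi / 2])
for _ in range(1000):
    q = propagate(q, omega, dt=0.001)
```

After one second the quaternion should represent a 90-degree rotation about z, i.e. approximately (cos 45°, 0, 0, sin 45°); in the filter itself this propagation forms the time-update, with the MARG measurements supplying the corrections.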
