111. An algorithm for automatic crystal identification in pixelated scintillation detectors using thin plate splines and Gaussian mixture models
Schellenberg, Graham (19 January 2016)
Positron emission tomography (PET) is a non-invasive imaging technique that uses positron-emitting radiopharmaceuticals (PERs) to characterize biological processes in tissues of interest. A PET scanner is usually composed of multiple scintillation crystal detectors placed in a ring so as to capture coincident photons from a positron annihilation. These detectors require a crystal lookup table (CLUT) to map the detector response to the crystal of interaction. These CLUTs must be accurate, lest events be mapped to the wrong crystal of interaction, degrading the final image quality. This work describes an automated algorithm for CLUT generation built around Gaussian mixture models (GMM) with thin plate splines (TPS). The algorithm was tested on flood image data collected from 16 detectors. The method maintained at least 99.8% accuracy across all tests. It is considerably faster than manual techniques and can be adapted to different detector configurations. / February 2016
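As a rough illustration of the clustering idea described above, the sketch below fits a Gaussian mixture to a simulated flood image and uses the component assignments as a crystal lookup table. The 4x4 crystal array, spread, and event counts are invented for the example and are not from the thesis; the thin-plate-spline regularization step is omitted.

```python
# Hypothetical flood-image data: a 4x4 crystal array, with detected events
# scattered around each crystal centre. All parameters are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
centres = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
events = np.vstack([c + 0.08 * rng.standard_normal((200, 2)) for c in centres])

# One Gaussian component per crystal; the fitted component assignments act
# as the crystal lookup table (CLUT), mapping event positions to crystals.
gmm = GaussianMixture(n_components=16, random_state=0).fit(events)
labels = gmm.predict(events)

print(len(np.unique(labels)))  # every crystal recovered as its own component
```

With well-separated crystals the mixture recovers one component per crystal; in real flood images the components overlap, which is where the spline-based regularization of the thesis comes in.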
112. Spatio-Temporal Modeling of Anatomic Motion for Radiation Therapy
Zachariah, Elizabeth (1 January 2015)
In radiation therapy, it is imperative to deliver high doses of radiation to the tumor while sparing healthy tissue. Respiratory motion is the most significant source of error during treatment, so it is essential to model respiratory motion accurately for precise and effective radiation delivery. Many approaches exist to account for respiratory motion, such as controlled breath hold and respiratory gating, and while they have been relatively successful, they still present many drawbacks. Research has therefore expanded to tumor tracking.
The overall goal of 4D-CT is to predict tumor motion in real time, and this work attempts to move in that direction. The following work addresses both the temporal and the spatial aspects of four-dimensional CT reconstruction. The aims are to (1) estimate the temporal parameters of 4D models for anatomy deformation using a novel neural network approach and (2) use intelligently chosen non-uniform, non-separable splines to improve the spatial resolution of the deformation models in image registration.
113. Real-time acquisition of human gestures for interacting with virtual environments
Vatavu, Radu-Daniel (18 March 2008)
We address in this thesis the problem of gesture recognition, with specific focus on providing a flexible model for movement trajectories as well as on estimating the variation in execution that is inherently present when performing gestures. Gestures are captured in a computer vision scenario which approaches the specifics of interactive surfaces. We propose a flexible model for gesture commands based on a spline representation which is enhanced with elastic properties, in a direct analogy with the theory of elasticity from classical physics. The model is further used for gesture recognition in the context of supervised learning.
In order to address the problem of variability in execution, we propose a model that measures objectively and quantitatively the local tendencies that users introduce in their executions. We use this model to address a problem that is considered hard by the community: automatic segmentation of continuous motion trajectories and scale-invariant identification of gesture commands. We also show the usefulness of our model for performing ergonomic analysis on gesture dictionaries.
114. Expectile regression
Ondřej, Josef (January 2015)
In this thesis we present an alternative to quantiles, known as expectiles. We first define the notion of an expectile of the distribution of a random variable and then show some of its basic properties, such as linearity and the monotonic behavior of the τ-th expectile e_τ in τ. Let (Y, X), Y ∈ R, X ∈ R^p, be a random vector. We define the conditional expectile of Y given X = x, denoted e_τ(Y | X = x). We introduce the expectile regression model e_τ(Y | X = x) = x^T β_τ, where β_τ ∈ R^p, and examine the asymptotic behavior of the estimate of the regression coefficients β_τ and ways to compute it. Further, we introduce semiparametric expectile regression, which generalizes the previous case and adds restrictions on the estimate of the regression coefficients that enforce desired properties such as smoothness of the fitted curves. We illustrate the use of the theoretical results on mechanographic data, which describe the dependence of the power and force of a jump on the age of children and adolescents aged 6 to 18. Keywords: expectiles, expectile regression, quantiles, penalized B-splines
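The defining property of expectiles can be sketched as an asymmetric least-squares problem: the τ-th expectile minimizes a squared loss whose weights differ above and below the fitted value. Below is a minimal fixed-point iteration for a scalar expectile with invented data, not the thesis's mechanographic application.

```python
# Minimal sketch: the tau-th expectile as the fixed point of an
# asymmetrically weighted mean. Data values are illustrative.
import numpy as np

def expectile(y, tau, n_iter=100):
    """Scalar tau-expectile via asymmetric least squares."""
    m = y.mean()
    for _ in range(n_iter):
        w = np.where(y < m, 1.0 - tau, tau)   # asymmetric weights
        m = np.sum(w * y) / np.sum(w)         # weighted-mean update
    return m

y = np.array([1.0, 2.0, 3.0, 10.0])
print(expectile(y, 0.5))  # tau = 0.5 recovers the ordinary mean, 4.0
```

The monotonicity property mentioned in the abstract is easy to check numerically: raising τ moves the expectile upward, e.g. `expectile(y, 0.9)` exceeds `expectile(y, 0.5)` here.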
115. Defining and predicting fast-selling clothing options
Jesperson, Sara (January 2019)
This thesis aims to find a definition of fast-selling clothing options and a way to predict them using only a few weeks of sales data as input. The data used for this project contain daily sales and intake quantities for seasonal options with sale start in 2016-2018, provided by the department store chain Åhléns. A definition is found that describes fast-selling clothing options as those having sold a certain percentage of their intake after a fixed number of days. An alternative definition based on cluster affiliation is proven less effective. Two predictive models are tested, the first a probabilistic classifier and the second a k-nearest neighbor classifier using the Euclidean distance. The probabilistic model is divided into three steps: transformation, clustering, and classification. The time series are transformed with B-splines to reduce dimensionality, where each time series is represented by a vector with its length and B-spline coefficients. As a tool to improve the quality of the predictions, the B-spline vectors are clustered with a Gaussian mixture model where every cluster is assigned one of the two labels, fast-selling or ordinary, thus dividing the clusters into disjoint sets: one containing fast-selling clusters and the other containing ordinary clusters. Lastly, the time series to be predicted are assumed to be Laplace distributed around a B-spline, and using the probability distributions provided by the clustering, the posterior probability for each class is used to classify the new observations. In the transformation step, the number of knots for the B-splines is evaluated with cross-validation, and the Gaussian mixture models from the clustering step are evaluated with the Bayesian information criterion, BIC. The predictive performance of both classifiers is evaluated with accuracy, precision, and recall.
The probabilistic model outperforms the k-nearest neighbor model with considerably higher values of accuracy, precision, and recall. The performance of each model is improved by using more data to make the predictions, most prominently with the probabilistic model.
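The transformation step of the probabilistic model can be sketched with SciPy: a daily sales series is compressed to a short vector of B-spline coefficients before clustering. The series shape, season length, and knot placement below are invented for illustration.

```python
# Sketch: represent a 60-day sales series by its least-squares cubic
# B-spline coefficients. The data and knots are illustrative only.
import numpy as np
from scipy.interpolate import make_lsq_spline

days = np.arange(60, dtype=float)
sales = np.exp(-days / 15.0) + 0.01 * np.sin(days)  # a fast-decaying toy series

# Boundary knots repeated k+1 times plus a few interior knots; the
# coefficient vector is the low-dimensional feature used downstream.
knots = np.concatenate(([days[0]] * 4, [15.0, 30.0, 45.0], [days[-1]] * 4))
spline = make_lsq_spline(days, sales, knots, k=3)

print(spline.c.shape[0])  # 60 daily observations reduced to 7 coefficients
```

In the thesis these coefficient vectors are then clustered with a Gaussian mixture model and new series are classified by posterior probability; only the dimensionality-reduction step is shown here.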
116. On the Short-Time Fourier Transform and Gabor Frames generated by B-splines
Fredriksson, Henrik (January 2012)
In this thesis we study the short-time Fourier transform. The short-time Fourier transform of a function f(x) is obtained by restricting the function to a short time segment and taking the Fourier transform of this restriction. This method gives local information about f in both time and frequency simultaneously. To get a smooth frequency localization one wants to use a smooth window, which means that the windows will overlap. The continuous short-time Fourier transform is not appropriate for practical purposes, so we want a discrete representation of f. Using Gabor theory, we can write a function f as a linear combination of time and frequency shifts of a fixed window function g with integer parameters a, b > 0. We show that if the window function g has compact support, then g generates a Gabor frame G(g, a, b). We also show that for such a g there exists a dual frame such that both G(g, a, b) and its dual frame have compact support and decay fast in the Fourier domain. Based on [2], we show that B-splines generate a pair of Gabor frames.
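A compactly supported B-spline window of the kind studied here can be built by convolving box functions (a degree-n B-spline is the (n+1)-fold convolution of a box), after which each short-time spectrum is just the FFT of a windowed segment. The window length, degree, and test tone below are arbitrary choices for the sketch, not values from the thesis.

```python
import numpy as np

def bspline_window(length, degree):
    """Discrete B-spline window: (degree + 1)-fold convolution of a box car."""
    box = np.ones(length // (degree + 1))
    w = box
    for _ in range(degree):
        w = np.convolve(w, box)
    return w / w.max()

win = bspline_window(64, 2)                  # quadratic B-spline window
t = np.arange(256)
signal = np.cos(2 * np.pi * 20 * t / 256)    # a pure tone

# One column of the short-time Fourier transform: window, then FFT.
spectrum = np.abs(np.fft.rfft(signal[:len(win)] * win))
peak = int(np.argmax(spectrum))
print(peak)  # the peak bin sits near the tone's frequency within the window
```

Because the window is a convolution of symmetric boxes it is itself symmetric, and its repeated-convolution structure is what gives the fast Fourier-domain decay discussed in the abstract.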
117. Bayesian surface smoothing under anisotropy
Chakravarty, Subhashish (1 January 2007)
Bayesian surface smoothing using splines usually proceeds by choosing the smoothness parameter through data-driven methods like generalized cross validation. In this methodology, the knots of the splines are assumed to lie at the data locations. When anisotropy is present in the data, modeling is done via parametric functions.
In the present thesis, we have proposed a non-parametric approach to Bayesian surface smoothing in the presence of anisotropy. We use eigenfunctions generated by thin-plate splines as our basis functions. Using eigenfunctions does away with having to place knots arbitrarily, as is done customarily. The smoothing parameter, the anisotropy matrix, and other parameters are simultaneously updated by a Reversible Jump Markov Chain Monte Carlo (RJMCMC) sampler. Unique in our implementation is model selection, which is again done concurrently with the parameter updates.
Since the posterior distribution of the coefficients of the basis functions for any given model order is available in closed form, we are able to simplify the sampling algorithm in the model selection step. This also helps us in isolating the parameters which influence the model selection step.
We investigate the relationship between the number of basis functions used in the model and the smoothness parameter, and find that a delicate balance exists between the two: higher values of the smoothness parameter correspond to a larger number of basis functions being selected.
Use of a non-parametric approach to Bayesian surface smoothing provides more modeling flexibility. We are not constrained by a parametric shape of the covariance, as earlier methods were. A Bayesian approach also allows us to include results from any previous analysis of the same data as prior information, and to evaluate pointwise estimates of the variability of the fitted surface. We believe that our research also poses many questions for future research.
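For readers who want a feel for thin-plate-spline smoothing itself (without the anisotropy, eigenfunction basis, or RJMCMC model selection of the thesis), SciPy's `RBFInterpolator` exposes a thin-plate-spline kernel with a scalar smoothing parameter; the surface and noise below are a toy example.

```python
# Toy surface smoothing with a thin-plate-spline kernel. All data are
# simulated; the smoothing value is an arbitrary illustrative choice.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, size=(200, 2))
z = np.sin(2 * np.pi * pts[:, 0]) + 0.1 * rng.standard_normal(200)

# `smoothing` plays the role of the smoothness parameter discussed above:
# 0 interpolates the noisy data exactly, larger values give flatter fits.
surf = RBFInterpolator(pts, z, kernel="thin_plate_spline", smoothing=1e-3)

value = float(surf(np.array([[0.25, 0.5]]))[0])
print(round(value, 2))  # close to sin(pi / 2) = 1 despite the noise
```

In the thesis this single smoothing parameter is not hand-picked but sampled jointly with the anisotropy matrix and the model order.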
118. Two-level lognormal frailty model and competing risks model with missing cause of failure
Tang, Xiongwen (1 May 2012)
In clustered survival data, unobservable cluster effects may exert powerful influences on the outcomes and thus induce correlation among subjects within the same cluster. The ordinary partial likelihood approach does not account for this dependence. Frailty models, as an extension of Cox regression, incorporate multiplicative random effects, called frailties, into the hazard model and have become a very popular way to account for dependence within clusters. We study in particular the two-level nested lognormal frailty model and propose an estimation approach based on the complete-data likelihood with the frailty terms integrated out. We adopt B-splines to model the baseline hazards and adaptive Gauss-Hermite quadrature to approximate the integrals efficiently. Furthermore, in finding the maximum likelihood estimators, Gauss-Seidel and BFGS methods are used instead of the Newton-Raphson iterative algorithm to improve the stability and efficiency of the estimation procedure.

We also study competing risks models with missing cause of failure in the context of Cox proportional hazards models. For competing risks data, there exists more than one cause of failure, and each observed failure is exclusively linked to one cause. Conceptually, the causes are interpreted as competing risks before the failure is observed. Competing risks models are constructed from a proportional hazards model specified for each cause of failure, which can be estimated using the partial likelihood approach. However, the ordinary partial likelihood is not applicable when the cause of failure may be missing. We propose a weighted partial likelihood approach based on complete-case data, where the weights are the inverses of the selection probabilities and the selection probability is estimated by a logistic regression model. The asymptotic properties of the regression coefficient estimators are investigated using counting process and martingale theory.
We further develop a doubly robust approach based on the full data to improve efficiency as well as robustness.
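The inverse-probability-weighting idea can be shown on a much simpler problem than the weighted partial likelihood: below, a logistic model for the probability that an observation is complete supplies weights that de-bias a complete-case mean. The data-generating mechanism is invented for the sketch.

```python
# Toy inverse-probability weighting: weight complete cases by the inverse
# of their estimated selection probability. Simulated data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.standard_normal(n)

# Selection mechanism: the probability of observing the cause rises with x,
# so a naive complete-case mean of x is biased upward.
p_obs = 1.0 / (1.0 + np.exp(-(0.5 + x)))
observed = rng.uniform(size=n) < p_obs

# Estimate the selection probabilities with logistic regression, then
# weight the complete cases by their inverses.
sel = LogisticRegression().fit(x.reshape(-1, 1), observed.astype(int))
w = 1.0 / sel.predict_proba(x[observed].reshape(-1, 1))[:, 1]

naive = x[observed].mean()
ipw = np.average(x[observed], weights=w)
print(abs(ipw - x.mean()) < abs(naive - x.mean()))  # weighting reduces bias
```

In the thesis the same weights enter the partial likelihood score rather than a simple average, and the doubly robust extension protects against misspecifying the selection model.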
119. Prediction models for rotor/stator interaction in an aircraft engine
Legrand, Mathias (25 March 2005)
This work was carried out within a partnership between SNECMA (Société Nationale d'Etude et de Construction de Moteurs d'Avion), the Structures and Simulations team of the GeM laboratory at Ecole Centrale de Nantes, and the Vibrations and Acoustics Department of the University of Michigan, with the goal of developing prediction models for rotor/stator interaction in an aircraft engine. To remain competitive, an engine manufacturer has one main objective: increasing the efficiency of an aircraft engine in compliance with current standards. Minimizing the clearance between the blade tips of the rotating components and the facing casing is one answer to this constraint. Unfortunately, reducing this distance appreciably increases the possibility of contact between the two parts. Since the consequences of such a phenomenon can be dramatic for the engine, the design office must devise structures capable of withstanding these dynamic loads: it is therefore essential to understand the origin and action of these contact forces in order to optimally reduce the clearance between the two parts without compromising passenger safety. In this study, we focus in particular on the interaction between two nodal-diameter vibration modes, characteristic of axisymmetric structures, during which the structures touch lightly and may reach potentially dangerous steady-state regimes. To this end, three models of increasing complexity are proposed and analyzed, together with two complementary solution methods developed in the frequency and time domains. These three models predict a critical rotation speed above which, under certain conditions that remain difficult to determine, the vibration amplitudes become large.
They have also led to a better understanding of the modal interaction phenomenon and its physical characteristics.
120. Surface modeling with spline functions
Tazeroualti, Mahammed (26 February 1993)
This work is divided into three distinct parts. In the first part, we introduce a Gauss-Seidel-type algorithm for the minimization of symmetric positive semi-definite functionals. The convergence of this algorithm is proved. As an application, we give two surface smoothing methods. These methods are based on the idea of reducing a two-dimensional smoothing problem to the solution of a sequence of easily solved one-dimensional problems, using the spline inf-convolution operation. In the second part, we introduce a new method for the design of a progressive lens. The lens is represented by a sufficiently regular surface, on which we impose conditions on its principal curvatures in certain zones (far-vision and near-vision zones) and conditions on its principal curvature directions in other zones (nasal and temporal zones). The surface is written as a tensor product of B-splines of degree four. To compute it, we must minimize a non-quadratic operator. This minimization is carried out by an iterative procedure whose fast convergence has been tested numerically.
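The Gauss-Seidel scheme of the first part can be sketched in miniature: minimizing a quadratic functional (1/2)x^T A x - b^T x, with A symmetric positive definite, by cyclically minimizing over one coordinate at a time. The 3x3 system below is invented for illustration; the thesis applies the idea to spline smoothing functionals, not to this toy matrix.

```python
# Gauss-Seidel as coordinate-wise minimisation of a quadratic functional.
# The SPD matrix and right-hand side are illustrative only.
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
for _ in range(100):                 # sweeps over the coordinates
    for i in range(3):               # exact minimisation in one coordinate
        x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]

print(np.allclose(A @ x, b))         # converges to the minimiser A x = b
```

Each inner step solves the one-dimensional minimization exactly, which mirrors the reduction of a two-dimensional smoothing problem to a sequence of one-dimensional ones described in the abstract.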