11

DCOBS: forecasting the term structure of interest rates with dynamic constrained smoothing B-Splines / DCOBS: previsão da estrutura a termo de taxa de juros com B-Splines restritas, suavizadas e dinâmicas

Eduardo Phillipe Mineo 01 December 2017 (has links)
The Nelson-Siegel framework published by Diebold and Li a decade ago created an important benchmark and originated several works in the literature on forecasting the term structure of interest rates. For instance, the Arbitrage-Free Nelson-Siegel framework improved predictive performance by imposing no-arbitrage conditions on the Nelson-Siegel framework. However, these frameworks were built on top of a parametric curve model that may fit more sensitive term-structure shapes poorly, affecting forecast results. We propose DCOBS with no-arbitrage restrictions to forecast the term structure. It is built on top of the nonparametric constrained smoothing B-splines yield curve model. This curve model has been shown to be an optimal compromise between financial integrity and fidelity to yield curve shapes. Even though it may produce more volatile forward curves than parametric models, those curves are still more accurate than the ones obtained from Nelson-Siegel frameworks. Software was developed with a complete implementation of the yield curve fitting techniques discussed in this dissertation. DCOBS has been evaluated on ten years of Brazilian government bond data and has shown good consistency with the stylized facts of yield curves. The results of DCOBS are promising, especially in short-term forecasting, showing greater stability and lower root mean square errors than Arbitrage-Free Nelson-Siegel.
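The constrained smoothing B-spline fit at the core of DCOBS is not reproduced in full here, but its unconstrained starting point — a penalized cubic B-spline fit to observed yields — can be sketched with SciPy. The maturities, yields, and smoothing parameter below are made-up illustrative values, and the no-arbitrage shape constraints of the actual method are omitted:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical yield observations (% p.a.) at maturities in years --
# illustrative values only, not real Brazilian bond data.
maturities = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])
yields = np.array([13.2, 13.0, 12.6, 12.1, 11.9, 11.8, 11.9, 12.0])

# Cubic smoothing B-spline; s trades off fidelity against roughness.
# DCOBS additionally imposes shape (no-arbitrage) constraints, which
# a plain UnivariateSpline does not support.
curve = UnivariateSpline(maturities, yields, k=3, s=0.05)

# Evaluate the fitted curve on a fine maturity grid.
grid = np.linspace(0.25, 10.0, 40)
fitted = curve(grid)
```

The smoothing parameter `s` bounds the residual sum of squares, so smaller values track the data more closely at the cost of rougher forward curves.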
13

A Smooth Finite Element Method Via Triangular B-Splines

Khatri, Vikash 02 1900 (has links) (PDF)
A triangular B-spline (DMS-spline) based finite element method (TBS-FEM) is proposed, along with possible enrichment through discontinuous Galerkin, continuous-discontinuous Galerkin finite element (CDGFE) and stabilization techniques. The developed schemes are also numerically explored, to a limited extent, for weak discretizations of a few second-order partial differential equations (PDEs) of interest in solid mechanics. The functional approximation employed here has both the affine invariance and the convex hull properties. In contrast to the Lagrangian basis functions used with the conventional finite element method, basis functions derived from n-th order triangular B-splines possess global C^(n-1) continuity (n ≥ 1). This is usually not possible with standard finite element formulations. Thus, though constructed within a mesh-based framework, the basis functions are globally smooth (even across element boundaries). Since these globally smooth basis functions are used in modeling the response, one can expect a reduction in the number of elements in the discretization, which in turn reduces the number of degrees of freedom and consequently the computational cost. In the present work, which aims at laying out the basic foundation of the method, we consider only linear triangular B-splines. The resulting formulation thus provides only C^0-continuous approximation functions for the targeted variables. This leads to a straightforward implementation without a digression into the issue of knot selection, whose resolution is required for implementing the method with higher-order triangular B-splines. Since we consider only n = 1, the formulation also makes use of the discontinuous Galerkin method, which weakly enforces the continuity of first derivatives through stabilizing terms on the interior boundaries. Stabilization enhances numerical stability without sacrificing accuracy by suitably changing the weak formulation.
Weighted residual terms, involving a mesh-dependent stabilization parameter, are added to the variational equation. The advantage of the resulting scheme over a more traditional mixed approach and least-squares finite elements is that the introduction of additional unknowns and the related difficulties can be avoided. To assess the numerical performance of the method, we consider Navier's equations of elasticity, especially the nearly incompressible case (i.e. approaching the limit where volumetric locking occurs). Limited comparisons with results from finite element techniques based on constant-strain triangles help bring out the advantages of the proposed scheme to an extent.
14

Reconstruction en tomographie dynamique par approche inverse sans compensation de mouvement / Reconstruction in dynamic tomography by an inverse approach without motion compensation

Momey, Fabien 20 June 2013 (has links)
Computerized tomography (CT) aims at the retrieval of 3-D information from a set of projections acquired at different angles around the object of interest (OOI). One of its most common applications, and the framework of this Ph.D. thesis, is X-ray CT medical imaging. Reconstruction can be severely impaired by the patient's breathing motion and cardiac beating. This is a major challenge in radiotherapy, where precise localization of the tumor is a prerequisite for irradiating cancer cells while preserving the surrounding healthy tissues. The field of methods dealing with the reconstruction of a dynamic sequence of the OOI is called dynamic CT.
Some state-of-the-art methods increase the number of projections, allowing an independent reconstruction of several phases of the time-sampled sequence. Other methods use motion compensation in the reconstruction, estimating the explicit motion beforehand, on a previous data set, through a deformation model. Our work takes a different path: it uses dynamic reconstruction, based on inverse problems theory, without any additional information or explicit knowledge of the motion. The dynamic sequence is reconstructed from a single data set, assuming only the continuity and periodicity of the motion. The inverse problem is treated rigorously as the minimization of a data-fidelity term combined with a regularization. One of the most original features of this Ph.D. thesis, typical of dynamic CT, is the elaboration of a reconstruction method for very sparse data, using total variation (TV) as a very efficient regularization term. We also implement a new, rigorously defined and computationally efficient tomographic projector, based on separable B-spline functions, which outperforms the usual reconstruction quality in a data-sparsity context. This projector is then inserted into a coherent dynamic reconstruction scheme, applying an efficient spatio-temporal TV regularization. Our method thus exploits only the current data, in an optimal way; moreover, its implementation is rather straightforward. We first demonstrate the strength of our approach on 2-D+t reconstructions from numerically simulated dynamic data. The practical feasibility of our method is then established on 2-D and 3-D+t reconstructions from "real" physical data, acquired on a mechanical phantom and on a patient.
15

Expektilová regrese / Expectile regression

Ondřej, Josef January 2015 (has links)
In this thesis we present an alternative to quantiles, known as expectiles. We first define the notion of the expectile of the distribution of a random variable and then show some of its basic properties, such as linearity and the monotonic behavior of the τ-th expectile e_τ in τ. Let (Y, X), Y ∈ R, X ∈ R^p, be a random vector. We define the conditional expectile of Y given X = x, denoted e_τ(Y | X = x). We introduce the expectile regression model e_τ(Y | X = x) = x⊤β_τ, where β_τ ∈ R^p, and we examine the asymptotic behavior of the estimator of the regression coefficients β_τ and ways to compute it. Further, we introduce semiparametric expectile regression, which generalizes the previous case and adds restrictions on the estimate of the regression coefficients that enforce desired properties such as smoothness of the fitted curves. We illustrate the use of the theoretical results on mechanographic data, which describe the dependence of the power and force of a jump on the age of children and adolescents aged between 6 and 18. Keywords: expectiles, expectile regression, quantiles, penalized B-splines
16

Defining and predicting fast-selling clothing options

Jesperson, Sara January 2019 (has links)
This thesis aims to find a definition of fast-selling clothing options and a way to predict them using only a few weeks of sales data as input. The data used for this project contain daily sales and intake quantity for seasonal options, with sales starting in 2016-2018, provided by the department store chain Åhléns. A definition is found that describes fast-selling clothing options as those having sold a certain percentage of their intake after a fixed number of days. An alternative definition based on cluster affiliation proves less effective. Two predictive models are tested, the first a probabilistic classifier and the second a k-nearest neighbor classifier using the Euclidean distance. The probabilistic model is divided into three steps: transformation, clustering, and classification. The time series are transformed with B-splines to reduce dimensionality, where each time series is represented by a vector with its length and B-spline coefficients. As a tool to improve the quality of the predictions, the B-spline vectors are clustered with a Gaussian mixture model where every cluster is assigned one of the two labels fast-selling or ordinary, thus dividing the clusters into disjoint sets: one containing fast-selling clusters and the other containing ordinary clusters. Lastly, the time series to be predicted are assumed to be Laplace distributed around a B-spline and, using the probability distributions provided by the clustering, the posterior probability for each class is used to classify the new observations. In the transformation step, the number of knots for the B-splines is evaluated with cross-validation, and the Gaussian mixture models from the clustering step are evaluated with the Bayesian information criterion, BIC. The predictive performance of both classifiers is evaluated with accuracy, precision, and recall.
The probabilistic model outperforms the k-nearest neighbor model with considerably higher values of accuracy, precision, and recall. The performance of each model is improved by using more data to make the predictions, most prominently with the probabilistic model.
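The transformation and baseline-classifier steps above can be sketched with NumPy/SciPy: each curve is summarized by its least-squares B-spline coefficients, and a plain k-nearest-neighbor vote classifies a new curve. The sales curves, knot counts, and labels below are made-up illustrative values, and this loosely follows only the simpler of the two models (the kNN baseline), not the Gaussian-mixture classifier:

```python
import numpy as np
from scipy.interpolate import splrep

def bspline_features(series, n_interior_knots=4, degree=3):
    """Represent a curve sampled on [0, 1] by its least-squares
    B-spline coefficients -- a low-dimensional feature vector."""
    x = np.linspace(0.0, 1.0, len(series))
    t = np.linspace(0.0, 1.0, n_interior_knots + 2)[1:-1]  # interior knots
    _, c, _ = splrep(x, series, t=t, k=degree)
    return c[: n_interior_knots + degree + 1]  # drop splrep's zero padding

def knn_predict(train_X, train_y, query, k=3):
    """Plain k-nearest-neighbor majority vote under Euclidean distance."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical normalized cumulative sales curves: "fast" options sell early.
x = np.linspace(0.0, 1.0, 30)
fast = [1.0 - np.exp(-6.0 * x) + 0.01 * i for i in range(3)]
slow = [x ** 1.5 + 0.01 * i for i in range(3)]
train_X = np.array([bspline_features(s) for s in fast + slow])
train_y = np.array(["fast"] * 3 + ["ordinary"] * 3)

label = knn_predict(train_X, train_y, bspline_features(1.0 - np.exp(-5.0 * x)))
```

Working in coefficient space rather than on the raw series is what keeps the distance computation cheap once the curves have been fitted.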
17

On the Short-Time Fourier Transform and Gabor Frames generated by B-splines

Fredriksson, Henrik January 2012 (has links)
In this thesis we study the short-time Fourier transform. The short-time Fourier transform of a function f(x) is obtained by restricting the function to a short time segment and taking the Fourier transform of this restriction. This method gives local information about f in both time and frequency simultaneously. To get a smooth frequency localization one wants to use a smooth window, which means that the windows will overlap. The continuous short-time Fourier transform is not appropriate for practical purposes, so we want a discrete representation of f. Using Gabor theory, we can write a function f as a linear combination of time and frequency shifts of a fixed window function g with lattice parameters a, b > 0. We show that if the window function g has compact support, then g generates a Gabor frame G(g, a, b). We also show that for such a g there exists a dual frame such that both G(g, a, b) and its dual frame have compact support and decay fast in the Fourier domain. Based on [2], we show that B-splines generate a pair of Gabor frames.
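The discrete short-time Fourier transform with a compactly supported window can be sketched in a few lines. The hop size and Hann window below are illustrative choices, not the thesis's Gabor-frame construction:

```python
import numpy as np

def stft(signal, window, hop):
    """Short-time Fourier transform: slide a compactly supported
    window along the signal and take the FFT of each windowed segment."""
    n = len(window)
    starts = range(0, len(signal) - n + 1, hop)
    return np.array([np.fft.fft(signal[s:s + n] * window) for s in starts])

# A pure tone at 8 cycles per 64-sample window should concentrate
# its energy in frequency bin 8 of every frame.
t = np.arange(256)
tone = np.cos(2 * np.pi * 8 * t / 64)
frames = stft(tone, np.hanning(64), hop=32)
peak_bin = int(np.argmax(np.abs(frames[0][:32])))
```

The overlapping hop (half the window length here) is what the text alludes to: a smooth window must overlap its neighbors for the segments to cover the signal without gaps in frequency localization.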
18

Two-level lognormal frailty model and competing risks model with missing cause of failure

Tang, Xiongwen 01 May 2012 (has links)
In clustered survival data, unobservable cluster effects may exert powerful influences on the outcomes and thus induce correlation among subjects within the same cluster. The ordinary partial likelihood approach does not account for this dependence. Frailty models, as an extension to Cox regression, incorporate multiplicative random effects, called frailties, into the hazard model and have become a very popular way to account for the dependence within clusters. We particularly study the two-level nested lognormal frailty model and propose an estimation approach based on the complete data likelihood with frailty terms integrated out. We adopt B-splines to model the baseline hazards and adaptive Gauss-Hermite quadrature to approximate the integrals efficiently. Furthermore, in finding the maximum likelihood estimators, instead of the Newton-Raphson iterative algorithm, Gauss-Seidel and BFGS methods are used to improve the stability and efficiency of the estimation procedure. We also study competing risks models with missing cause of failure in the context of Cox proportional hazards models. For competing risks data, there exists more than one cause of failure and each observed failure is exclusively linked to one cause. Conceptually, the causes are interpreted as competing risks before the failure is observed. Competing risks models are constructed based on the proportional hazards model specified for each cause of failure respectively, which can be estimated using partial likelihood approach. However, the ordinary partial likelihood is not applicable when the cause of failure could be missing for some reason. We propose a weighted partial likelihood approach based on complete-case data, where weights are computed as the inverse of selection probability and the selection probability is estimated by a logistic regression model. The asymptotic properties of the regression coefficient estimators are investigated by applying counting process and martingale theory. 
We further develop a doubly robust approach based on the full data to improve efficiency as well as robustness.
19

Modélisation de surfaces à l'aide de fonctions splines / Surface modeling using spline functions

Tazeroualti, Mahammed 26 February 1993 (has links) (PDF)
This work consists of three distinct parts. In the first part, we introduce a Gauss-Seidel-type algorithm for the minimization of symmetric positive semi-definite functionals, and prove its convergence. As an application, we give two surface-smoothing methods. These methods are based on the idea of reducing a two-dimensional smoothing problem to the solution of a sequence of easily solved one-dimensional problems, using the spline inf-convolution operation. In the second part, we introduce a new method for the design of a progressive lens. The lens is represented by a sufficiently regular surface, on which we impose conditions on its principal curvatures in certain zones (far-vision and near-vision zones) and conditions on its principal curvature directions in other zones (nasal and temporal zones). The surface is written as a tensor product of B-splines of degree four. To compute it, we are led to minimize a non-quadratic operator. This minimization is carried out by an iterative procedure whose fast convergence has been verified numerically.
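In the quadratic case, a Gauss-Seidel-type minimization of a symmetric positive (semi-)definite functional J(x) = ½xᵀAx − bᵀx reduces to coordinate-wise updates, equivalent to the classical Gauss-Seidel iteration for Ax = b. A minimal sketch with a small made-up SPD system (not the thesis's smoothing functional):

```python
import numpy as np

def gauss_seidel(A, b, x0=None, iters=200):
    """Coordinate-wise minimization of J(x) = x.T @ A @ x / 2 - b @ x
    for symmetric positive definite A. Minimizing J over coordinate i
    with the others fixed gives the classical Gauss-Seidel update."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(iters):
        for i in range(n):
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Small illustrative SPD system: solution is (1/11, 7/11).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = gauss_seidel(A, b)
```

For SPD matrices the iteration is guaranteed to converge, which is the quadratic analogue of the convergence result proved in the first part.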
20

Modélisation mathématique des propriétés de mélanges : B-splines et optimisation avec conditions de forme / Mathematical modeling of mixture properties: B-splines and optimization with shape constraints

Odeh, Nabih 19 March 1990 (has links) (PDF)
Separation, in the petroleum domain, consists of arranging or moving bodies or classes of species into different regions so as to make it easier to evaluate their properties or to produce other mixtures. The work presented here is a contribution to the numerical solution of certain problems encountered in the study of petroleum separation. The problems studied in this work concern two different types of mixtures: complex mixtures and simple mixtures. The study carried out on various mixture-management problems led to the development of an interactive, visual software package as well as several experimentation programs; this allows better numerical and graphical exploitation.
