About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Desigualdade regional de renda e migrações : mobilidade intergeracional educacional e intrageracional de renda no Brasil

Netto Junior, José Luis da Silva January 2008 (has links)
This thesis analyzes the relationships between educational variables and income inequality in Brazil and their repercussions for intergenerational educational mobility and intragenerational income mobility. The specific goal is to verify how these two kinds of mobility differ across regions and between migrants and non-migrants. The results suggest a positive, non-linear relationship between income inequality and human capital inequality. In areas where human capital inequality is higher, the influence of parents in the lowest educational strata is strong compared with regions where educational inequality is lower; in general, in the poorer regions and states, less qualified parents have greater influence over their children's educational trajectory. In parallel, the region whose states show the highest indicators of educational inequality also presents the lowest income mobility among the regions analysed: income mobility is higher in the Center and Southeast regions and lower in the Northeast. Migrant parents with low schooling have less influence over their children's education than their counterparts in the areas of origin. Finally, migrants show higher income mobility than the populations of their areas of origin, which suggests positive selectivity among them.
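Intergenerational educational mobility of the kind compared above is commonly summarised by a parent-child transition matrix. As a hedged illustration (the data, the four-level education coding, and the immobility index below are invented for the sketch and are not from the thesis), such a matrix can be estimated from paired observations:

```python
import numpy as np

# Hypothetical parent-child education levels (0=none, 1=primary,
# 2=secondary, 3=higher); values are illustrative, not thesis data.
parent = np.array([0, 0, 0, 1, 1, 2, 2, 3, 3, 0, 1, 2])
child = np.array([0, 1, 0, 1, 2, 2, 3, 3, 3, 0, 2, 3])

K = 4
counts = np.zeros((K, K))
for p, c in zip(parent, child):
    counts[p, c] += 1

# Row-normalise: P[i, j] = Pr(child reaches level j | parent at level i).
P = counts / counts.sum(axis=1, keepdims=True)

# A simple immobility index: average diagonal mass (persistence).
immobility = np.trace(P) / K
```

Comparing the diagonal mass of matrices estimated separately by region, or for migrants versus non-migrants, is one simple way to operationalise the contrasts the abstract describes.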
12

Periodic and Non-Periodic Filter Structures in Lasers / Periodiska och icke-periodisk filterstrukturer i lasrar

Enge, Leo January 2020 (has links)
Communication using fiber optics is an integral part of modern societies, and one of its most important components is the grating filter of a laser. In this report we introduce both the periodic and the non-periodic grating filter and discuss how resonance can arise in these structures. We then provide an exact method for calculating the spectrum of these grating filters and study three methods for approximating it. The first is the Fourier approximation, which is very simple. For the filters studied, it reproduces the fundamental form of the spectrum correctly, though not the details. The second method computes the spectrum exactly at a few values and then interpolates with splines; it gives satisfactory results for the types of gratings analysed. Finally, a perturbation method is derived for the periodic grating filter, together with an outline of how it can be extended to the non-periodic case. For the filters studied, the results of this method are very promising. The perturbation method may also give a deeper understanding of how a filter works, so we conclude that it would be of interest to study it further, while all the methods considered can be useful for computing the spectrum, depending on the required precision.
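The second approximation strategy above — evaluate the spectrum exactly at a few frequencies, then interpolate with a cubic spline — can be sketched as follows. The Lorentzian stand-in for the filter spectrum is an assumption made for illustration only; the thesis computes the exact spectrum from the grating structure itself:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Stand-in for the exact (expensive) filter spectrum: a single
# Lorentzian resonance. Illustrative assumption, not the thesis model.
def exact_spectrum(w):
    return 1.0 / (1.0 + ((w - 0.5) / 0.05) ** 2)

# Few exact evaluations on a coarse grid, then spline interpolation.
w_coarse = np.linspace(0.0, 1.0, 41)
spline = CubicSpline(w_coarse, exact_spectrum(w_coarse))

# Check the interpolant against the exact spectrum on a fine grid.
w_fine = np.linspace(0.0, 1.0, 2001)
err = np.max(np.abs(spline(w_fine) - exact_spectrum(w_fine)))
```

The appeal of the method is that the expensive exact computation is needed only at the coarse grid points, while the spline supplies the dense spectrum in between.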
13

Semiparametric Bayesian Approach using Weighted Dirichlet Process Mixture For Finance Statistical Models

Sun, Peng 07 March 2016 (has links)
The Dirichlet process mixture (DPM) has been widely used as a flexible prior in the nonparametric Bayesian literature, and the weighted Dirichlet process mixture (WDPM) can be viewed as an extension of the DPM that relaxes model distribution assumptions. However, the WDPM requires specifying weight functions and can incur extra computational burden. In this dissertation, we develop more efficient and flexible WDPM approaches under three research topics. The first is semiparametric cubic spline regression, where we adopt a nonparametric prior for the error terms in order to automatically handle heterogeneity of measurement errors or an unknown mixture distribution. The second provides an innovative way to construct the weight function and illustrates some desirable properties and the computational efficiency of this weight under a semiparametric stochastic volatility (SV) model. The last develops a WDPM approach for the Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) model (as an alternative to the SV model) and proposes a new model evaluation approach for GARCH that produces easier-to-interpret results than the canonical marginal likelihood approach. In the first topic, the response variable is modeled as the sum of three parts: a linear function of covariates that enter the model parametrically; an additive nonparametric part, in which covariates whose relationships to the response are unclear enter the model nonparametrically using Lancaster and Šalkauskas bases; and error terms whose means and variances are assumed to follow nonparametric priors. We therefore call our model dual-semiparametric regression, since nonparametric ideas are used both for the mean and for the error terms. Instead of assuming that all error terms follow the same prior, as in the DPM, our WDPM provides multiple candidate priors for each observation to select from with certain probabilities.
These probabilities (or weights) are modeled from relevant predictive covariates using a Gaussian kernel. We propose several different WDPMs using weights that depend on distances in the covariates. We provide efficient Markov chain Monte Carlo (MCMC) algorithms and compare our WDPMs to the parametric model and the DPM model in terms of Bayes factors in simulation and empirical studies. In the second topic, we propose an innovative way to construct the weight function for the WDPM and apply it to the SV model. The SV model is adopted for time series data where the constant variance assumption is violated; one essential issue is to specify the distribution of the conditional return. We assume a WDPM prior for the conditional return and propose a new way to model the weights. Our approach has several advantages, including computational efficiency compared to weights constructed with a Gaussian kernel. We list six properties of the proposed weight function and provide proofs of them. Because of the additional Metropolis-Hastings steps introduced by the WDPM prior, we find conditions that ensure uniform geometric ergodicity of the transition kernel in our MCMC. Due to the existence of zero values in asset price data, our SV model is semiparametric: we employ the WDPM prior for non-zero values and a parametric prior for zero values. In the third topic, we develop a WDPM approach for GARCH-type models and compare different weight functions, including the innovative method proposed in the second topic. The GARCH model can be viewed as an alternative to SV for analyzing daily stock price data where the constant variance assumption does not hold. While the response variable of our SV models is the transformed log return (based on a log-square transformation), GARCH models the log return itself. This means that, theoretically speaking, we can predict stock returns using GARCH models, whereas this is not feasible with SV models, because they ignore the sign of log returns and provide predictive densities only for the squared log return. Motivated by this property, we propose a new model evaluation approach called back-testing return (BTR) specifically for GARCH. The BTR approach produces model evaluation results that are easier to interpret than the marginal likelihood, and it is straightforward to draw conclusions about model profitability from it. Since the BTR approach applies only to GARCH, we also illustrate how to properly calculate the marginal likelihood for comparisons between GARCH and SV. Based on our MCMC algorithms and model evaluation approaches, we conducted a large number of model fits to compare models in both simulation and empirical studies. / Ph. D.
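The Gaussian-kernel weight construction mentioned for the first topic — each observation receives a probability vector over the candidate priors, driven by its covariate's distance to a set of reference points — can be sketched as follows. The function name, reference points, and bandwidth are illustrative assumptions, not the dissertation's actual specification:

```python
import numpy as np

# Sketch: weights selecting among J candidate priors for an observation
# with covariate x, via a Gaussian kernel around reference points
# `centers` with bandwidth h. All names and values are hypothetical.
def wdpm_weights(x, centers, h=1.0):
    k = np.exp(-0.5 * ((x - centers) / h) ** 2)
    return k / k.sum()  # normalise into a probability vector

centers = np.array([-2.0, 0.0, 2.0])
w = wdpm_weights(0.5, centers)
```

An observation near a reference point concentrates its weight on the corresponding prior, which is the mechanism that lets the mixture adapt locally to the covariates.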
14

Trajectory generation and data fusion for control-oriented advanced driver assistance systems / Génération de trajectoires et fusion de données pour des systèmes de commande d'aide à la conduite avancés

Daniel, Jérémie 01 December 2010 (has links)
Since the origin of the automobile at the end of the 19th century, traffic flow has constantly increased and, unfortunately, so has the number of road accidents. Research studies, such as a report by the World Health Organization, show alarming figures for the injuries and fatalities due to these accidents. To reduce them, one solution lies in the development of Advanced Driver Assistance Systems (ADAS), whose purpose is to help the driver in the driving task. This research topic has been very dynamic and productive during the last two decades: systems such as the Anti-lock Braking System (ABS), Electronic Stability Program (ESP), Adaptive Cruise Control (ACC), Parking Manoeuvre Assistant (PMA) and Dynamic Bending Light (DBL) are already on the market, and their benefits are recognized by most drivers. This first generation of ADAS is usually designed to perform a specific task within the controller/vehicle/environment framework, and thus requires only microscopic information, i.e. sensors giving local information about one element of the vehicle or its environment. By contrast, and on the principle that collaboration between these assistance systems is more effective than running them in parallel, the next ADAS generation will have to take a global approach, considering more information and constraints about the vehicle, the driver, and the environment. As these systems are designed to perform more complex tasks, they need a global view of the road context and the vehicle configuration. For example, longitudinal control requires information about the road configuration (straight line, bend, etc.) and about the possible presence of other road users (cars, trucks, etc.) to determine the best reference speed. [...]
15

Impact Angle Constrained Guidance Using Cubic Splines

Dhabale, Ashwin January 2015 (has links) (PDF)
In this thesis the cubic spline guidance law and its variants are derived. A detailed analysis is carried out to find the initial conditions for successful interception, and the results are applied to three-dimensional guidance design and to waypoint-following problems. The basic cubic spline guidance law is derived for intercepting a stationary target at a desired impact angle in a surface-to-surface engagement scenario. The guidance law is obtained by an inverse method from a trajectory based on a cubic spline curve. To overcome the drawbacks of the basic cubic spline guidance law, it is modified by introducing an additional parameter. This modification has the interesting feature that the guidance command can be obtained from a single cubic spline polynomial even for impact angles greater than π/2, while substantially improving guidance performance in terms of lateral acceleration demand and trajectory length. To impart robustness in the presence of uncertainties and acceleration saturation, an explicit guidance expression is also derived. A comprehensive capturability study of the proposed guidance law is carried out. Capturability is defined in terms of the set of all feasible initial conditions for successful interception; this set is derived analytically, and its dependence on factors such as the initial engagement geometry and the interceptor capability is established. The basic cubic spline guidance law and its variants are also derived for a three-dimensional scenario. The novelty of the present work lies in the particular representation of the three-dimensional cubic spline curve and in the adoption of the analytical results available for the two-dimensional cubic spline guidance law. This enables selection of the boundary condition at launch for a given terminal boundary condition, and avoids the singularities associated with inverse-method-based guidance laws. To establish the feasibility of the guidance laws in the real world, the rigid-body dynamics of the interceptor is represented by a six-degrees-of-freedom model; elementary autopilots are also designed using a simplified model. Successful interception of the target in the presence of the rigid-body dynamics demonstrates the practical applicability of the cubic spline based guidance laws. Finally, the theory developed in the first part of the thesis is applied to the waypoint-following problem. A smooth path is designed for the transition of the vehicle velocity from the incoming to the outgoing direction. The approach is similar to a Dubins path, as it comprises line–cubic spline–line segments. An important feature of the method is that the cubic spline segments are fitted so that the path curvature is bounded by a pre-specified value, and the acceleration demand for following the smooth path rises gradually to its maximum and then decreases, which is advantageous from a practical point of view. All results are verified with numerical simulations, which are included in the thesis. The proposed cubic spline guidance law is conceptually simple, does not use linearised kinematic equations, is independent of time-to-go estimates, and is computationally inexpensive.
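The inverse-method idea — fix a trajectory shape by boundary conditions, then derive the command from it — can be sketched for the planar case. This is a hedged illustration: parameterising the path as a cubic in downrange distance and the specific launch and impact angles below are assumptions for the sketch, not the thesis's exact formulation:

```python
import numpy as np

# Sketch (assumed formulation): a planar cubic trajectory
# y(x) = a x^3 + b x^2 + c x + d from launch at (0, 0) with
# flight-path angle gamma0, hitting a stationary target at (xf, 0)
# with impact angle gamma_f.
def cubic_trajectory(xf, gamma0, gamma_f):
    # Boundary conditions: y(0)=0, y'(0)=tan(gamma0),
    #                      y(xf)=0, y'(xf)=tan(gamma_f).
    A = np.array([
        [0.0, 0.0, 0.0, 1.0],
        [0.0, 0.0, 1.0, 0.0],
        [xf**3, xf**2, xf, 1.0],
        [3 * xf**2, 2 * xf, 1.0, 0.0],
    ])
    rhs = np.array([0.0, np.tan(gamma0), 0.0, np.tan(gamma_f)])
    return np.linalg.solve(A, rhs)  # coefficients [a, b, c, d]

coef = cubic_trajectory(1000.0, np.deg2rad(30), np.deg2rad(-60))
a, b, c, d = coef
```

Once the curve is fixed, curvature along it dictates the lateral acceleration demand, which is where the analysis of feasible initial conditions enters.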
16

Mésothéliome : étiologie professionnelle à partir d’enquêtes cas-témoins françaises / Mesothelioma : occupational etiology from French case-control studies

Lacourt, Aude 03 December 2010 (has links)
Pleural mesothelioma is considered highly specific to asbestos exposure, yet some aspects of the etiology of this disease have not been well characterized. The objectives of this study were: i) to estimate the dose-response relationship between occupational exposure to asbestos fibres and the occurrence of pleural mesothelioma according to different temporal exposure indicators; ii) to study the effect of occupational exposure to mineral wool and to alveolar dust of free crystalline silica on the risk of pleural mesothelioma; and iii) to identify occupations and industries at risk of pleural mesothelioma from data collected over a twenty-year period. Cases came from a previous French case-control study conducted between 1987 and 1993 and from the French National Mesothelioma Surveillance Program between 1998 and 2006 (1,199 men). Population controls were frequency-matched on year of birth and sex (2,378 men). Occupational exposure to asbestos, mineral wool and silica was assessed with job-exposure matrices. Dose-response relationships were estimated with logistic models, and their shape was obtained using restricted cubic spline functions. While the asbestos dose-response relationship is well confirmed (particularly at low doses), this study provides new results on time-effect relationships (the role of time since last exposure, and the effect of age at first exposure). It also opens new perspectives on the role of co-exposures (mineral wool) and identifies new at-risk activities, such as motor vehicle mechanics.
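The restricted cubic spline device used to estimate the shape of the dose-response curve can be sketched as follows. The knot placement and exposure values are invented for illustration, and Harrell's parameterisation of the basis is assumed; the resulting design columns would enter a logistic model alongside the other covariates:

```python
import numpy as np

# Restricted cubic spline basis (Harrell's parameterisation): cubic
# between knots, constrained to be linear beyond the boundary knots.
def rcs_basis(x, knots):
    x = np.asarray(x, dtype=float)
    t = np.asarray(knots, dtype=float)
    k = len(t)

    def pos3(u):  # truncated cube (u)_+^3
        return np.where(u > 0.0, u, 0.0) ** 3

    cols = [x]  # linear term
    for j in range(k - 2):
        term = (pos3(x - t[j])
                - pos3(x - t[k - 2]) * (t[k - 1] - t[j]) / (t[k - 1] - t[k - 2])
                + pos3(x - t[k - 1]) * (t[k - 2] - t[j]) / (t[k - 1] - t[k - 2]))
        cols.append(term)
    return np.column_stack(cols)

# Hypothetical cumulative-exposure knots (fibres/mL-years); illustrative.
knots = np.array([0.5, 2.0, 5.0, 10.0])
B = rcs_basis(np.linspace(0.0, 15.0, 31), knots)
```

The linear-tail constraint is what makes the fitted dose-response curve stable at the extremes of exposure, where data are sparse.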
17

Modelos parcialmente lineares com erros simétricos autoregressivos de primeira ordem / Symmetric partially linear models with first-order autoregressive errors.

Relvas, Carlos Eduardo Martins 19 April 2013 (has links)
In this master's dissertation we present symmetric partially linear models with AR(1) errors, which generalize normal partially linear models to autocorrelated errors following an AR(1) structure and a symmetric distribution instead of the normal. Among the symmetric distributions, we can consider distributions with heavier tails than the normal, controlling the kurtosis and down-weighting outlying observations in the estimation process. Parameter estimation is carried out by a penalized likelihood criterion, using the score functions and the expected Fisher information matrix, all of which are derived in this work. The effective degrees of freedom and asymptotic results are also presented, as well as diagnostic procedures, highlighting the normal curvature of local influence under different perturbation schemes and residual analysis. An application to real data is given as illustration.
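The AR(1) error structure at the heart of these models can be illustrated with a quasi-differencing sketch. The data-generating values and the assumption that the autocorrelation parameter rho is known are simplifications for illustration only; the dissertation estimates all parameters jointly by penalized likelihood:

```python
import numpy as np

# Simulate a linear model y = beta * x + e with AR(1) errors
# e_t = rho * e_{t-1} + u_t. All values are illustrative assumptions.
rng = np.random.default_rng(0)
n, rho, beta = 500, 0.6, 2.0
x = rng.normal(size=n)
u = rng.normal(scale=0.5, size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]
y = beta * x + e

# Quasi-difference (Cochrane-Orcutt style): y*_t = y_t - rho * y_{t-1},
# likewise for x; the transformed model has (approximately) white errors.
ys = y[1:] - rho * y[:-1]
xs = x[1:] - rho * x[:-1]
beta_hat = (xs @ ys) / (xs @ xs)  # OLS on the transformed data
```

In the dissertation's setting, the same whitening idea is embedded in the likelihood rather than applied as a two-step transform, and the error distribution is symmetric rather than necessarily normal.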
19

[en] NON-PARAMETRIC ESTIMATIONS OF INTEREST RATE CURVES: MODEL SELECTION CRITERION, PERFORMANCE DETERMINANT FACTORS AND BID-ASK SPREAD / [pt] ESTIMAÇÕES NÃO PARAMÉTRICAS DE CURVAS DE JUROS: CRITÉRIO DE SELEÇÃO DE MODELO, FATORES DETERMINANTES DE DESEMPENHO E BID-ASK SPREAD

ANDRE MONTEIRO DALMEIDA MONTEIRO 11 June 2002 (has links)
This thesis investigates interest rate curve estimation from a non-parametric standpoint. The text is divided into two parts. The first addresses the criterion used to select the best-performing method for interpolating the Brazilian interest rate curve in a given sample. A selection criterion is proposed that measures out-of-sample performance through leave-k-out cross-validation resampling applied to the sample curves, where 1 ≤ k ≤ K and K is a function of the number of contracts observed on each curve. Particularities of the problem substantially reduce the required computational effort, making the criterion feasible. The sample is daily, from January 1997 to February 2001. The proposed criterion selected the natural cubic spline, used as a perfect-fitting method, as the best performer. Considering trading precision, this spline proved unbiased; quantitative analysis of its performance nevertheless identified heteroskedasticity in the out-of-sample errors. From a specification of the conditional variance of these errors and some assumptions, a confidence interval scheme is proposed for interest rates estimated by the natural cubic spline employed as a perfect-fitting method. A backtest suggests the proposed scheme is consistent, accommodating the assumptions and approximations involved. The second part estimates the US curve built from dollar-Libor fixed-for-floating interest rate swap contracts using the Support Vector Machine (SVM), part of Statistical Learning Theory. SVM research has achieved important theoretical advances, although implementations on real regression problems remain scarce. The SVM has attractive features for yield curve modelling: it can incorporate, directly in the estimation, a priori information about the shape of the curve and about rate formation and liquidity aspects of each contract from which the curve is built, the latter quantified by each contract's bid-ask spread (BAS). The basic SVM formulation is modified to accommodate different BAS values without losing its properties. Special attention is given to extracting a priori information from the typical curve shape for SVM parameter selection. The sample is daily, from March 1997 to April 2001. The out-of-sample performance of several SVM specifications was compared with that of other estimation methods; the SVM was the method that best controlled the trade-off between bias and variance of the out-of-sample errors.
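The perfect-fitting natural cubic spline of the first part can be sketched as follows; the maturities and rates below are invented for illustration and are not sample data from the thesis:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical observed points on a yield curve (maturity in years,
# rate in % p.a.); values are illustrative only.
maturities = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0])
rates = np.array([16.9, 17.2, 17.8, 18.5, 18.9, 19.2])

# Perfect fit: the spline passes through every observed point, and the
# "natural" condition sets the second derivative to zero at both ends.
curve = CubicSpline(maturities, rates, bc_type='natural')

# Interpolated rate at an unobserved maturity, e.g. 18 months.
r_18m = curve(1.5)
```

Because the fit is exact at the observed contracts, all model error shows up between the knots, which is what the proposed leave-k-out criterion and the confidence interval scheme evaluate.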
