  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Méthodes bayésiennes semi-paramétriques d'extraction et de sélection de variables dans le cadre de la dendroclimatologie / Semi-parametric Bayesian Methods for variables extraction and selection in a dendroclimatological context

Guin, Ophélie 14 April 2011 (has links)
As stated by the Intergovernmental Panel on Climate Change (IPCC), reconstructing past climate is important for placing current climate change in context. Many researchers have therefore developed procedures to reconstruct past temperatures or precipitation from indirect climatic indicators. These methods are generally based on statistical arguments, but estimating the uncertainties associated with the reconstructions remains an active research field in both statistics and climate science. The main goal of this thesis is to propose and study novel statistical methods that allow a precise estimation of uncertainty when reconstructing climate from tree-ring measurements. Such reconstructions generally proceed in two steps. First, a hidden environmental variable, common to a collection of tree-ring measurement series and assumed to be climatic, must be inferred. Second, this extracted signal must be related to the relevant climatic variables. For both steps, we work within a semi-parametric Bayesian framework that reduces the number of assumptions and allows prior information from the practitioner to be included. For the extraction of the common signal, we propose a hierarchical model that can capture both the high and low frequencies contained in tree rings, which was difficult with previous dendroclimatological methods. For the second step, we develop a Bayesian Generalized Additive Model (GAM) to explore potential links between the extracted signal and climatic variables. This allows non-linear relationships among variables to be modelled and differs strongly from classical dendrochronological methods. From a statistical perspective, a new selection scheme for Bayesian GAMs is also proposed and studied. In each case, the new methods are compared with those traditionally used by dendrochronologists, to clarify what they can contribute.
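The two-step structure described above, extracting a common signal and then relating it to climate through an additive model, can be illustrated with a deliberately simplified sketch. The backfitting loop below fits a non-Bayesian additive model y ≈ a + f1(x1) + f2(x2) using crude bin-average smoothers on synthetic data; every name and number is illustrative, and this stands in for the thesis's semi-parametric Bayesian GAM only at the level of the general idea (smooth, possibly non-linear component functions fitted to partial residuals).

```python
import random

random.seed(0)

def bin_smooth(x, r, nbins=10):
    """Crude bin-average smoother: fitted value at each x is its bin's mean residual."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / nbins or 1.0
    sums, counts = [0.0] * nbins, [0] * nbins
    idx = [min(int((xi - lo) / width), nbins - 1) for xi in x]
    for i, ri in zip(idx, r):
        sums[i] += ri
        counts[i] += 1
    means = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    return [means[i] for i in idx]

def backfit(x1, x2, y, iters=20):
    """Fit y ~ a + f1(x1) + f2(x2) by iterated smoothing of partial residuals."""
    n = len(y)
    a = sum(y) / n
    f1, f2 = [0.0] * n, [0.0] * n
    for _ in range(iters):
        r1 = [y[i] - a - f2[i] for i in range(n)]
        f1 = bin_smooth(x1, r1)
        m1 = sum(f1) / n
        f1 = [v - m1 for v in f1]  # centre each component for identifiability
        r2 = [y[i] - a - f1[i] for i in range(n)]
        f2 = bin_smooth(x2, r2)
        m2 = sum(f2) / n
        f2 = [v - m2 for v in f2]
    return a, f1, f2

# Synthetic data with one non-linear and one linear component (purely illustrative).
n = 400
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 1) for _ in range(n)]
y = [2.0 + 8 * (x1[i] - 0.5) ** 2 + 1.5 * x2[i] + random.gauss(0, 0.1)
     for i in range(n)]

a, f1, f2 = backfit(x1, x2, y)
fit = [a + f1[i] + f2[i] for i in range(n)]
sse = sum((y[i] - fit[i]) ** 2 for i in range(n))
tss = sum((yi - sum(y) / n) ** 2 for yi in y)
print("explained variance:", round(1 - sse / tss, 2))
```

A full GAM would replace the bin averages with penalized splines, and the Bayesian version in the thesis additionally places priors on the smooth components so that uncertainty propagates into the reconstruction.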
72

Reliable Prediction Intervals and Bayesian Estimation for Demand Rates of Slow-Moving Inventory

Lindsey, Matthew Douglas 08 1900 (has links)
The application of multisource feedback (MSF) increased dramatically and became widespread globally over the past two decades, but there has been little conceptual work on self-other agreement, and few empirical studies have investigated self-other agreement in other cultural settings. This study developed a new conceptual framework of self-other agreement and used three samples to illustrate how national culture affects self-other agreement. The three samples comprised 428 participants from China, 818 participants from the US, and 871 participants from globally dispersed teams (GDTs). An EQS procedure was used to examine whether the covariance matrices were equal across samples, and a polynomial regression procedure was used to examine whether the relationships between self-other agreement and performance differed across cultures. The results indicated that MSF could be applied to China and GDTs, but the pattern of relationships between self-other agreement and performance differed across samples, suggesting that the results found in the U.S. sample were the exception rather than the rule. Demographics also affected self-other agreement disparately across perspectives and cultures, indicating that self-concept is susceptible to cultural influences. The proposed framework received only partial support but shows promise for guiding future studies. This study contributes to the literature by: (a) developing a new framework of self-other agreement that can be used to study various contextual factors; (b) examining the relationship between self-other agreement and performance in three vastly different samples; (c) providing important insights about consensus between raters and self-other agreement; and (d) offering practical guidelines on how to apply MSF to other cultures more effectively.
73

Computing optimal and realised monetary policy rules for Brazil: a Markov-switching DSGE approach

Paranhos, Lívia Silva January 2017 (has links)
The evolution of the Brazilian economy during the first years of this century is examined through the lens of a micro-founded small open economy model that allows for changes in the behaviour of the Central Bank of Brazil, in the nominal price rigidity, and in the volatility of structural shocks. Although the results are not conclusive about the presence of regime changes over the analysed sample, we find evidence of shifts in the monetary policy stance, moving from a Dove to a Hawk regime in 2003, as well as evidence of more volatile external shocks during periods of uncertainty. We then move beyond empirical estimation and derive optimal monetary policy rules for Brazil. It is possible to find an optimal rule that stabilizes inflation, output and the exchange rate whilst keeping interest rates stable. Finally, the model offers interesting insights into the volatility dynamics of certain macroeconomic variables: a more stable currency implies a more volatile interest rate, and vice versa; and tighter control over interest rates and/or exchange rates appears to produce output and inflation instability.
74

Application of the Fusion Model for Cognitive Diagnostic Assessment with Non-diagnostic Algebra-Geometry Readiness Test Data

Fay, Robert H. 06 July 2018 (has links)
This study retrofitted a Diagnostic Classification Model (DCM) known as the Fusion model onto non-diagnostic test data from the University of Chicago School Mathematics Project (UCSMP) Algebra and Geometry Readiness post-test used with Transition Mathematics (Third Edition, Field-Trial Version). The test contained 24 multiple-choice middle school math items and was originally given to 95 advanced 6th-grade and 293 7th-grade students. Using these test answers, the study attempted to show that applying cognitive diagnostic analysis to items not constructed for that purpose can still yield highly predictive multidimensional cognitive attribute profiles for each test taker. These profiles delineate whether a given test taker is a master or non-master of each attribute measured by the test, allowing detailed diagnostic feedback to be given to both the test takers and their teachers. The full version of the non-compensatory Fusion model, implemented in the Arpeggio software package, was used to estimate test-taker profiles on each of the four cognitive attributes found to be intrinsic to the items on this test, because it handles both slips and guesses by test takers and accounts for residual skills not defined by the four attributes and 24 items in the Q-matrix.
The attributes, one or more of which is needed to correctly answer an item, were defined as: Skills, procedures that students should master with fluency (e.g., multiplying positive and negative numbers); Properties, the principles underlying the mathematical concepts being studied, such as recognizing and using the Repeated-Addition Property of Multiplication; Uses, applications of mathematics in real situations, ranging from routine "word problems" to the development and use of mathematical models (e.g., finding unknowns in real situations involving multiplication); and Representations, pictures, graphs, or objects that illustrate concepts. A Q-matrix was developed from the ratings of four content experts, with the attributes needed to answer each item clearly delineated. This Q-matrix was validated by applying the Fusion model in Arpeggio to the data: the resulting profiles showed which attributes each test taker had and had not mastered. Masters of the attributes required by an item outscored non-masters by a proportion-correct difference of .44 on average. Regression analysis produced an R-squared of .89 for the prediction of total test scores from the attribute mastery probabilities obtained from the Fusion model with the final Q-matrix. Limitations of the study are discussed, along with reasons for its significance.
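The non-compensatory logic that the Fusion model softens with slip and guess parameters can be sketched in its deterministic core: a Q-matrix says which attributes an item requires, and a mastery profile answers an item correctly only if it includes all of them. The toy Q-matrix below is hypothetical, not the study's; it shows only how a profile and a Q-matrix combine into an ideal response pattern.

```python
# Hypothetical 4-item Q-matrix over the four attribute types named above
# (Skills, Properties, Uses, Representations); 1 = attribute required.
Q = [
    [1, 0, 0, 0],   # item 1 needs Skills only
    [1, 1, 0, 0],   # item 2 needs Skills and Properties
    [0, 0, 1, 0],   # item 3 needs Uses
    [0, 1, 0, 1],   # item 4 needs Properties and Representations
]

def ideal_response(mastery, q_row):
    """Non-compensatory rule: correct only if ALL required attributes are mastered."""
    return int(all(m >= q for m, q in zip(mastery, q_row)))

def expected_score(mastery, Q):
    """Ideal total score for a mastery profile, absent slips and guesses."""
    return sum(ideal_response(mastery, row) for row in Q)

master_all = [1, 1, 1, 1]
skills_only = [1, 0, 0, 0]
print(expected_score(master_all, Q))   # all items answerable
print(expected_score(skills_only, Q))  # only item 1 answerable
```

The Fusion model replaces this 0/1 rule with item-level probabilities, which is what lets it tolerate slips, guesses, and skills outside the Q-matrix.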
75

Reachable sets analysis in the cooperative control of pursuer vehicles.

Chung, Chern Ferng, Mechanical & Manufacturing Engineering, Faculty of Engineering, UNSW January 2008 (has links)
This thesis is concerned with the Pursuit-and-Evasion (PE) problem, in which the pursuer aims to minimize the time to capture the evader while the evader tries to prevent capture. The evader has two advantages: higher manoeuvrability, and the pursuer's uncertainty about the evader's state. Cooperation among multiple pursuer vehicles can thus be used to overcome the evader's advantages. The focus here is on the formulation and development of frameworks and algorithms for cooperation amongst pursuers, aiming at feasible implementation on real, autonomous vehicles. The thesis is split into two parts. Part I considers the problem of capturing an evader of higher manoeuvrability in a deterministic PE game. The approach employs Forward Reachable Set (FRS) analysis in the pursuers' control. The analysis considers the coverage of the evader's FRS, the set of states reachable at a future time, by the pursuers' FRSs, and assumes that the chance of capturing the evader depends on the degree of coverage. Taking the union of multiple pursuers' FRSs intuitively yields greater coverage of the evader's FRS, and this forms the mechanism of cooperation. A framework for cooperative control based on FRS coverage, or FRS-based control, is proposed, and two control algorithms were developed within it. Part II additionally introduces evader state uncertainty due to noise and the limited field of view of the pursuers' sensors. The result is a search-and-capture (SAC) problem, and a hybrid architecture, combining multi-sensor estimation using the particle filter with FRS-based control, is proposed to accomplish the SAC task. The two control algorithms of Part I were tested in simulations against an optimal guidance algorithm; the results show that both yield better performance in terms of capture time and miss distance.
The results in Part II demonstrate the effectiveness of the hybrid architecture for the SAC task. The proposed frameworks and algorithms provide insights for the development of effective and more efficient control of pursuer vehicles and can be useful in practical applications such as defence systems and civil law enforcement.
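The coverage mechanism of Part I can be sketched for the simplest kinematics: if a vehicle moves at speed at most v, its forward reachable set at horizon T is a disc of radius vT, and the pursuers' degree of coverage of the evader's FRS can be estimated by Monte Carlo sampling. The positions, speeds, and disc model below are illustrative assumptions, not the thesis's vehicle models.

```python
import math
import random

random.seed(1)

def frs_coverage(evader, v_e, pursuers, v_p, T, n_samples=20000):
    """Monte Carlo estimate of the fraction of the evader's FRS (a disc of
    radius v_e*T) covered by the union of the pursuers' FRS discs (radius v_p*T)."""
    ex, ey = evader
    r_e, r_p = v_e * T, v_p * T
    hits = 0
    for _ in range(n_samples):
        # Uniform sample inside the evader's reachable disc.
        ang = random.uniform(0, 2 * math.pi)
        rad = r_e * math.sqrt(random.random())
        px, py = ex + rad * math.cos(ang), ey + rad * math.sin(ang)
        if any(math.hypot(px - qx, py - qy) <= r_p for qx, qy in pursuers):
            hits += 1
    return hits / n_samples

# A faster evader versus one, then two, flanking pursuers: the union of the
# pursuers' FRSs covers more of the evader's FRS, the mechanism of cooperation.
evader = (0.0, 0.0)
one = frs_coverage(evader, v_e=2.0, pursuers=[(1.5, 0.0)], v_p=1.0, T=1.0)
two = frs_coverage(evader, v_e=2.0,
                   pursuers=[(1.5, 0.0), (-1.5, 0.0)], v_p=1.0, T=1.0)
print(round(one, 2), round(two, 2))
```

FRS-based control would then choose pursuer headings that maximize this coverage fraction at each decision step.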
76

Particle Filtering for Track Before Detect Applications

Torstensson, Johan, Trieb, Mikael January 2005 (has links)
Integrated tracking and detection based on unthresholded measurements, also referred to as track-before-detect (TBD), is a hard nonlinear and non-Gaussian dynamical estimation and detection problem. However, it is a technique that enables the user to track and detect targets that would be extremely hard to track and detect, if possible at all, with "classical" methods. TBD makes it possible to detect and track weak, stealthy or dim targets in noise and clutter, and particle filters have proven very useful in the implementation of TBD algorithms. This Master's thesis has investigated the use of particle filters on radar measurements in a TBD approach. The work was divided into two major problems: a time-efficient implementation, and new functional features such as estimating the radar cross section (RCS) and the extension of the target. The latter is of great importance when the resolution of the radar is such that specific features of the target can be distinguished. Results are illustrated by means of realistic examples.
77

Dynamic Demand for New and Used Durable Goods without Physical Depreciation

Ishihara, Masakazu 31 August 2011 (has links)
This thesis studies the interaction between new and used durable goods that do not physically depreciate. In product categories such as CDs/DVDs and video games, competition from used goods markets has been viewed as a serious problem by producers. These products depreciate negligibly in physical terms, but owners' consumption values can depreciate quickly due to satiation. Consequently, used goods that are almost identical to new goods may become available immediately after a new product's release. However, the existence of used goods markets also provides consumers with a selling opportunity. If consumers are forward-looking and account for the future resale value of a product in their buying decision, used goods markets could increase the sales of new goods. Whether used goods markets are harmful or beneficial to new-goods producers is therefore an empirical question. To tackle this question, I extend the previous literature in three ways. First, I assemble a new data set from the Japanese video game market. This unique data set includes not only the sales and prices of new and used goods, but also the resale value of used copies, the quantity of used copies retailers purchased from consumers, and retailers' inventory levels of used copies. Second, I develop a structural model of forward-looking consumers that incorporates (i) new and used goods buying decisions, (ii) used goods selling decisions, (iii) consumer expectations about future prices of new and used goods as well as resale values of used goods, and (iv) the depreciation of both owners' and potential buyers' consumption values. Third, I develop a new Bayesian estimation method for the model. In particular, the method alleviates the computational burden of estimating non-stationary discrete-choice dynamic programming models with continuous state variables that evolve stochastically over time.
The estimation results suggest that consumers in the Japanese video game market are forward-looking and that the substitutability between new and used video games is quite low. Using the estimates, I quantify the impact of eliminating the used video game market on new-game revenues and find that eliminating it could reduce the revenue of a new game.
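The forward-looking selling decision can be illustrated with a toy finite-horizon dynamic program: the owner's consumption value decays through satiation while the resale price declines, and backward induction gives the value of continuing to own versus selling now. All numbers here are hypothetical, and the model is far simpler than the thesis's (deterministic, no buying stage, no price expectations).

```python
# Hypothetical weekly values: consumption value v[t] decays with satiation,
# resale price p[t] declines over time; delta is the discount factor.
T = 8
delta = 0.95
v = [10 * 0.5 ** t for t in range(T)]   # value of playing in week t
p = [30 * 0.8 ** t for t in range(T)]   # resale price in week t

# W[t] = value of still owning the game at week t (backward induction).
W = [0.0] * (T + 1)
for t in reversed(range(T)):
    keep = v[t] + delta * W[t + 1]  # play this week, keep the option to sell later
    sell = p[t]                     # sell now
    W[t] = max(keep, sell)

# First week in which selling beats keeping.
sell_week = next(t for t in range(T) if p[t] >= v[t] + delta * W[t + 1])
print(sell_week, round(W[0], 1))
```

With these numbers the owner plays one week and then sells: the value of owning at week 0 exceeds the week-0 resale price precisely because the resale option is folded into the buying-stage value, which is the channel through which used markets can raise new-good sales.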
79

A Comparative Study of the Particle Filter and the Ensemble Kalman Filter

Datta Gupta, Syamantak January 2009 (has links)
Non-linear Bayesian estimation, that is, estimation of the state of a non-linear stochastic system from a set of indirect noisy measurements, is a problem encountered in several fields of science. The particle filter and the ensemble Kalman filter are both used to obtain sub-optimal solutions of Bayesian inference problems, particularly for high-dimensional non-Gaussian and non-linear models. Both are essentially Monte Carlo techniques that compute their results from a set of estimated trajectories of the variable to be monitored. It has been shown that in a linear, Gaussian setting, the solutions obtained from both filters converge to the optimal solution given by the Kalman filter. Given the similarity between the two filters, it is of interest to explore how they compare in basic methodology and construction. In this work, we take up a specific problem of Bayesian inference in a restricted framework and analytically compare the results obtained from the particle filter and the ensemble Kalman filter. We show that for the chosen model, under certain assumptions, the two filters become methodologically analogous as the sample size goes to infinity.
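The shared Monte Carlo skeleton of the two filters is easiest to see in a scalar linear-Gaussian model, where the exact Kalman filter is available as a benchmark. The sketch below implements a bootstrap particle filter and compares its error with the Kalman filter's; an ensemble Kalman filter would differ by updating each ensemble member with a Kalman-style correction instead of the weighting-and-resampling step. The model parameters are illustrative.

```python
import math
import random

random.seed(42)

# Scalar linear-Gaussian model: x_t = a*x_{t-1} + w_t,  y_t = x_t + v_t.
a, q, r = 0.9, 0.5, 1.0   # dynamics coefficient, process var, measurement var

def simulate(T):
    """Generate a true state trajectory and its noisy measurements."""
    x, xs, ys = 0.0, [], []
    for _ in range(T):
        x = a * x + random.gauss(0, math.sqrt(q))
        xs.append(x)
        ys.append(x + random.gauss(0, math.sqrt(r)))
    return xs, ys

def particle_filter(ys, n=500):
    """Bootstrap particle filter: propagate, weight by likelihood, resample."""
    parts = [0.0] * n
    means = []
    for y in ys:
        parts = [a * p + random.gauss(0, math.sqrt(q)) for p in parts]
        w = [math.exp(-0.5 * (y - p) ** 2 / r) for p in parts]
        s = sum(w) or 1.0
        w = [wi / s for wi in w]
        means.append(sum(wi * pi for wi, pi in zip(w, parts)))
        parts = random.choices(parts, weights=w, k=n)  # multinomial resampling
    return means

def kalman_filter(ys):
    """Exact optimal filter for this linear-Gaussian model."""
    m, P, means = 0.0, 1.0, []
    for y in ys:
        m, P = a * m, a * a * P + q            # predict
        K = P / (P + r)                        # Kalman gain
        m, P = m + K * (y - m), (1 - K) * P    # update
        means.append(m)
    return means

xs, ys = simulate(100)
pf, kf = particle_filter(ys), kalman_filter(ys)
rmse = lambda est: math.sqrt(sum((e - x) ** 2 for e, x in zip(est, xs)) / len(xs))
print(round(rmse(pf), 2), round(rmse(kf), 2))  # the two errors should be close
```

Consistent with the convergence result cited above, the particle filter's error approaches the Kalman filter's as the number of particles grows.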
