About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Statistická korekce denních srážkových úhrnů z klimatických modelů / Statistical correction of daily precipitation sums from climate models

Hnilica, Jan January 2016 (has links)
Climate change prediction and the evaluation of its impacts currently represent one of the key challenges for the scientific community. Regional climate models (RCMs) have become established as a main source of data for climate change assessment studies. Nevertheless, RCM outputs suffer from systematic errors, caused primarily by their low spatial resolution, and cannot be used directly without some form of bias correction. Bias correction is an active topic in climatology, and several correction methods have been developed, ranging from the simple additive method to more advanced approaches (e.g. quantile mapping). Despite this progress, however, bias correction methods face several difficulties, which introduce another source of uncertainty into climate change impact assessment studies. This thesis focuses on two problematic points connected with the bias correction of daily precipitation data. The first is non-stationarity between the calibration and application periods; new correction methods are developed that show increased resistance to non-stationary conditions. The second problem is the correction of the dependence (i.e. correlation and covariance) structure of multivariate precipitation data; a new procedure is proposed that corrects the complete dependence structure of the model data. All newly introduced methods are validated on measured and RCM-simulated data, and the validation demonstrates their applicability.
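Quantile mapping, named above as one of the more advanced correction approaches, can be sketched in a few lines. This is a generic empirical illustration, not the thesis's own method; the function name and the choice of 99 evenly spaced quantiles are illustrative assumptions.

```python
import numpy as np

def quantile_map(model_cal, obs_cal, model_new, n_q=99):
    """Empirical quantile mapping: transform model values so that their
    distribution over the calibration period matches the observed one."""
    q = np.linspace(0.01, 0.99, n_q)
    mq = np.quantile(model_cal, q)   # model quantiles (calibration period)
    oq = np.quantile(obs_cal, q)     # observed quantiles (calibration period)
    # each new model value is passed through the quantile transfer function;
    # np.interp clips values outside the calibration range to the endpoints
    return np.interp(model_new, mq, oq)
```

For precipitation, a wet-day threshold is in practice typically handled separately, and the transfer function is assumed to remain valid outside the calibration period, which is exactly the non-stationarity problem the thesis targets.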
2

Diagnostika kovariancí chyb předběžného pole ve spojeném systému globální a regionální asimilace dat / Diagnostics of background error covariances in a connected global and regional data assimilation system

Bučánek, Antonín January 2018 (has links)
The thesis deals with the preparation of initial conditions for numerical weather prediction in high-resolution limited-area models. It focuses on the problem of preserving the large-scale part of the global driving model analysis, which cannot be determined in sufficient quality in limited-area models. For this purpose, the so-called BlendVar scheme is used. The scheme consists of the application of the Digital Filter (DF) Blending method, which transfers the large-scale part of the driving model analysis to the limited-area model, and of the three-dimensional variational method (3D-Var) at high resolution. The thesis focuses on the appropriate specification of background errors, which is one of the key components of 3D-Var. Different approaches to modeling background errors are examined, including the possibility of taking into account their flow-dependent character. The approaches are also evaluated from the point of view of practical implementation. A study of the evolution of background errors during DF Blending and BlendVar assimilation cycles leads to a new proposal for the preparation of a background error covariance matrix suitable for the BlendVar assimilation scheme. The use of the new background error covariance matrix gives the required property...
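For readers unfamiliar with 3D-Var: it minimizes the standard variational cost function (generic textbook notation), in which the background error covariance matrix B studied in the thesis weighs the departure from the background state:

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)^{\top}\mathbf{R}^{-1}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)
```

where x_b is the background state (in the BlendVar scheme, the result of DF Blending), y the observations, H the observation operator, and R the observation error covariance matrix.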
3

Portfolio Value at Risk and Expected Shortfall using High-frequency data / Portfólio Value at Risk a Expected Shortfall s použitím vysoko frekvenčních dat

Zváč, Marek January 2015 (has links)
The main objective of this thesis is to investigate whether multivariate models using high-frequency data provide significantly more accurate forecasts of Value at Risk and Expected Shortfall than multivariate models using only daily data. The objective is very topical, since the Basel Committee announced in 2013 that it is going to change the risk measure used for the calculation of capital requirements from Value at Risk to Expected Shortfall. Further improvement in the accuracy of both risk measures can also be achieved by incorporating high-frequency data, which are increasingly available thanks to significant technological progress. We therefore employ the parsimonious Heterogeneous Autoregression and its asymmetric version, which use high-frequency data to model the realized covariance matrix. The well-established DCC-GARCH and EWMA are chosen as benchmark models. Value at Risk (VaR) and Expected Shortfall (ES) are computed by parametric and semi-parametric methods and by Monte Carlo simulation. The loss distributions are represented by the multivariate Gaussian, Student t, multivariate distributions simulated by copula functions, and multivariate filtered historical simulation. The univariate loss distributions used are the Generalized Pareto Distribution from EVT, the empirical distribution, and standard parametric distributions. The main finding is that the Heterogeneous Autoregression model using high-frequency data delivered superior, or at least equal, accuracy of VaR forecasts compared to the benchmark models based on daily data. Finally, the backtesting of ES remains very challenging, and the applied Tests I and II did not provide a credible validation of the forecasts.
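The Heterogeneous Autoregression (HAR) is, in its basic univariate form, a regression of tomorrow's realized variance on daily, weekly (5-day), and monthly (22-day) averages. A minimal plain-OLS sketch, illustrative only; the thesis works with the full realized covariance matrix and an asymmetric extension:

```python
import numpy as np

def har_forecast(rv):
    """One-step HAR forecast of realized variance from a daily RV series.
    Regressors: previous day's RV, 5-day mean, and 22-day mean."""
    rv = np.asarray(rv, dtype=float)
    n = len(rv)
    d = rv[21:n-1]                                             # daily lag
    w = np.array([rv[i-4:i+1].mean() for i in range(21, n-1)])  # weekly mean
    m = np.array([rv[i-21:i+1].mean() for i in range(21, n-1)]) # monthly mean
    X = np.column_stack([np.ones_like(d), d, w, m])
    y = rv[22:]                                                # next-day targets
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)               # OLS fit
    x_last = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean()])
    return float(x_last @ beta)
```

A VaR forecast then follows by plugging the forecast variance into an assumed loss distribution, e.g. Gaussian or Student t, as in the parametric approach described above.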
4

Ensemblový Kalmanův filtr na prostorech velké a nekonečné dimenze / Ensemble Kalman filter on high and infinite dimensional spaces

Kasanický, Ivan January 2017 (has links)
Title: Ensemble Kalman filter on high and infinite dimensional spaces
Author: Mgr. Ivan Kasanický
Department: Department of Probability and Mathematical Statistics
Supervisor: doc. RNDr. Daniel Hlubinka, Ph.D., Department of Probability and Mathematical Statistics
Consultant: prof. RNDr. Jan Mandel, CSc., Department of Mathematical and Statistical Sciences, University of Colorado Denver

Abstract: The ensemble Kalman filter (EnKF) is a recursive filter used in data assimilation to produce sequential estimates of the states of a hidden dynamical system. The evolution of the system is usually governed by a set of differential equations, so one concrete state of the system is, in fact, an element of an infinite dimensional space. In the presented thesis we show that the EnKF is well defined on an infinite dimensional separable Hilbert space if the data noise is a weak random variable with a covariance bounded from below. We also show that this condition is sufficient for 3D-Var and Bayesian filtering to be well posed. Additionally, we extend the already known fact that the EnKF converges to the Kalman filter in finite dimension, and prove that a similar statement holds even in infinite dimension. The EnKF suffers from a low-rank approximation of the state covariance, so covariance localization is required in...
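To make the low-rank issue concrete: with N ensemble members, the sample covariance used by the EnKF has rank at most N − 1. A minimal stochastic (perturbed-observations) analysis step in the finite dimensional textbook formulation, not the thesis's infinite dimensional setting; names and shapes are illustrative:

```python
import numpy as np

def enkf_update(ens, y, H, R, rng):
    """One stochastic EnKF analysis step with perturbed observations.
    ens: (n_state, n_members); H: (n_obs, n_state); R: (n_obs, n_obs)."""
    n_state, N = ens.shape
    A = ens - ens.mean(axis=1, keepdims=True)         # ensemble anomalies
    Pf = A @ A.T / (N - 1)                            # sample covariance, rank <= N-1
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    # each member assimilates its own perturbed copy of the observation
    Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, N).T
    return ens + K @ (Y - H @ ens)
```

When n_state greatly exceeds N, spurious long-range correlations appear in Pf, which is why covariance localization, mentioned at the end of the abstract, is needed.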
5

Využití nekonvenčních pozorování v asimilaci dat do numerického předpovědního modelu počasí ve vysokém rozlišení spojení se studiem pomalého podprostoru řešení modelu / Non-conventional data assimilation in high resolution numerical weather prediction model with study of the slow manifold of the model

Benáček, Patrik January 2019 (has links)
Satellite instruments currently provide the largest source of information to today's data assimilation (DA) systems for numerical weather prediction (NWP). With the development of high-resolution models, the efficient use of observations at high density is essential to improve small-scale information in the weather forecast. However, a large amount of satellite radiances has to be removed from DA by horizontal data thinning due to uncorrelated observation error assumptions. Moreover, satellite radiances include systematic errors (biases) that may be even larger than the observation signal itself, and these must be properly removed prior to DA. Although the Variational Bias Correction (VarBC) scheme is widely used by global NWP centers, there are still open questions regarding its use in Limited-Area Models (LAMs). This thesis aims to tackle the observation error difficulties in assimilating polar satellite radiances in the meso-scale ALADIN system. Firstly, we evaluate spatial and inter-channel error correlations to enhance the positive effect of data thinning. Secondly, we study satellite radiance bias characteristics with the key aspects of the VarBC in LAMs, and we compare different VarBC configurations with regard to forecast performance. This work is a step towards improving the...
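In the VarBC scheme discussed above, the observation operator is augmented with a linear bias model whose coefficients are estimated inside the variational analysis itself (standard formulation; the p_i are bias predictors such as layer thicknesses or scan angle):

```latex
\tilde{\mathcal{H}}(\mathbf{x}, \boldsymbol{\beta})
  = \mathcal{H}(\mathbf{x}) + \sum_{i=1}^{N_p} \beta_i\, p_i(\mathbf{x})
```

and the cost function is extended with a background term \(\tfrac{1}{2}(\boldsymbol{\beta}-\boldsymbol{\beta}_b)^{\top}\mathbf{B}_{\beta}^{-1}(\boldsymbol{\beta}-\boldsymbol{\beta}_b)\) for the bias coefficients, so the bias estimate is updated adaptively with each analysis.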
6

Statistická analýza intervalových dat / Statistical analysis of interval data

Troshkov, Kirill January 2011 (has links)
Traditional statistical analysis starts with computing basic statistical characteristics such as the population mean E, population variance V, covariance, and correlation. In computing these characteristics, it is usually assumed that the corresponding data values are known exactly. In real life there are many situations in which more complete information can be conveyed by describing a set of statistical units in terms of interval data. For example, daily temperatures registered as minimum and maximum values offer a more realistic view of variations in weather conditions than simple average values. In environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as the mean, variance, and correlation with other measurements. Another example is given by financial series: the minimum and maximum transaction prices recorded daily for a set of stocks represent more relevant information for experts when evaluating the stocks' tendency and volatility within the same day. We must therefore modify the existing statistical algorithms to process such interval data. In this work we analyze algorithms and their modifications for computing various statistics under...
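To make "statistics under interval data" concrete: for the sample mean the exact range is easy, because the mean is monotone in each datum, so its extremes are attained at the all-lower and all-upper endpoints. A sketch (for the variance the problem is much harder, in general NP-hard, which is why modified algorithms are needed):

```python
def interval_mean(intervals):
    """Exact range of the sample mean when each datum is known only
    as an interval (lo, hi): attained at all-lower / all-upper endpoints."""
    los = [lo for lo, _ in intervals]
    his = [hi for _, hi in intervals]
    n = len(intervals)
    return sum(los) / n, sum(his) / n
```

For example, daily temperature intervals [(12, 18), (10, 16), (14, 20)] give a sample mean anywhere between 12 and 18.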
7

Sovereign credit risk drivers in a spatial perspective.

Záhlava, Josef January 2018 (has links)
This thesis analyses what drives sovereign credit risk when contagion is controlled for. CDS spreads are used as a measure of credit risk, and bond yields are used to estimate the interconnectedness of the examined countries. The main contribution lies in the use of high-frequency data and a robust wavelet-based estimator in addition to a spatial econometric model. The aim of this thesis is to test for the presence of contagion and to evaluate which fundamentals are decisive for the market perception of sovereign credit risk. Another goal is to evaluate the possibility of a structural break caused by the Greek debt restructuring. The results show that the restructuring did bring change. Contagion is present during the post-crisis period and diminishes as the economies recover. Similarly, fundamentals are of higher importance in the post-crisis period compared with the following period.

JEL Classification: C22, C31, C33, G01, G32, G33
Keywords: spatial econometrics, CDS spreads, sovereign credit risk, financial contagion, realised covariance
Author's e-mail: josef.zahlava@gmail.com
Supervisor's e-mail: petr.gapko@seznam.cz
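The spatial econometric approach referenced above typically takes, in its standard spatial-lag (SAR) form (generic notation; here the weight matrix W would be built from the bond-yield interconnectedness the abstract describes):

```latex
\mathbf{y} = \rho\, \mathbf{W}\mathbf{y} + \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},
\qquad \boldsymbol{\varepsilon} \sim N(\mathbf{0}, \sigma^{2}\mathbf{I})
```

where y collects the CDS spreads, X the fundamentals, and a significant spatial parameter ρ would indicate cross-country spillover, i.e. contagion.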
8

Hodnocení míry mentální zátěže za použití mozkové konektivity / Classification of mental workload using brain connectivity measure

Doležalová, Radka January 2015 (has links)
This thesis deals with the use of EEG data to compute brain connectivity and to build a classifier of mental workload. The theoretical background of EEG is described first, followed by a discussion of several methods for determining brain connectivity. Classification features were computed from data recorded during an experiment that manipulated mental workload at two levels. The thesis describes the course of the experiment, the preprocessing and reduction of the recorded data, the extraction of features from the recorded EEG using several connectivity measures (correlation function, covariance, coherence, and a phase synchrony measure), and the subsequent automatic classification by three approaches (distance from a class-mean template, nearest-neighbour classification, and discriminant analysis). The results achieved are described and discussed in detail. The best result (accuracy of 60.64%) was obtained using a covariance matrix computed from the data of 4 electrodes in different brain regions (EEG beta band) with a classifier based on a linear discriminant function.
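One of the connectivity measures listed above, coherence, can be computed per frequency band with SciPy. A minimal sketch on synthetic data; the 256 Hz sampling rate and the 13–30 Hz beta-band limits are illustrative assumptions, not the experiment's actual parameters:

```python
import numpy as np
from scipy.signal import coherence

fs = 256.0                          # assumed EEG sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# two channels sharing a 20 Hz (beta band) component plus independent noise
shared = np.sin(2 * np.pi * 20 * t)
ch1 = shared + rng.normal(0, 0.5, t.size)
ch2 = shared + rng.normal(0, 0.5, t.size)
f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=512)
beta = Cxy[(f >= 13) & (f <= 30)].mean()   # mean coherence in the beta band
```

A scalar like `beta`, computed per electrode pair and per band, is the kind of quantity that can then feed a classifier such as the linear discriminant mentioned in the abstract.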
