1

Modelling complex network dynamics: a statistical physics approach

Leung, Chi-chung. January 2006
Thesis (M. Phil.)--University of Hong Kong, 2007. / Title proper from title frame. Also available in printed format.
2

The nonadditive generalization of Klimontovich's S-theorem for open systems and Boltzmann's orthodes

Bagci, Gokhan Baris. Kobe, Donald Holm, January 2008
Thesis (Ph. D.)--University of North Texas, August, 2008. / Title from title page display. Includes bibliographical references.
3

An evaluation of regional stream sediment data by advanced statistical procedures

Matysek, Paul Frank January 1985
This study was directed towards the development of rigorous, systematic, computer-assisted statistical procedures for interpreting the quantitative and qualitative data commonly encountered in practical exploration-oriented surveys. A suite of data analysis tools was developed to evaluate the quality of geochemical data sets, to investigate the value and utilization of categorical field data, and to recognize and rank anomalous samples. Data from regional stream sediment surveys undertaken by the British Columbia Ministry of Energy, Mines and Petroleum Resources in southern British Columbia were examined as a case history.

A procedure based on a statistical analysis of field-site duplicates was developed to evaluate the quality of regional geochemical silt data. The technique determines: (1) whether differences in metal concentrations between sample sites reflect a real trend related to geological and geochemical features, and not merely a consequence of sampling and analytical error; and (2) absolute precision estimates at any particular accumulation across a metal's concentration range. Results for the metals Zn, Cu, Ni, Co, Fe and Mn indicated that combined variability due to local and procedural error averaged less than 5% of the total error, and that precision estimates at the 95th percentile concentration value averaged less than 6.0%. The results indicate the duplicates behave more like splits of individual samples (analytical duplicates) than like separate field-site duplicates. This type of systematic approach provides a basis for interpreting geochemical trends within the survey area, while simultaneously allowing evaluation of the method of sampling and laboratory analysis.

A procedure utilizing Duncan's Multiple Range Test examined the relationships between metal concentrations and class-interval and categorical observations of the drainage catchment, sample site and sediment sample. Results show that many field observations can be systematically related to the metal content of drainage sediments. Some elements are more susceptible than others to environmental factors, and some factors influence few or many elements. For example, in sediments derived from granites there are significant relationships between bank type and the concentrations of eight elements (Zn, Cu, Ni, Pb, Co, Fe, Mn and Hg). In contrast, the texture of these sediments, using estimates of fines content as an index, did not significantly affect the concentration of any of the elements studied. In general, results indicate that groups of environmental factors acting collectively are more important than any single factor in determining background metal contents of drainage sediments.

A procedure utilizing both a graphical and a multiple regression approach was developed to identify and characterize anomalous samples. The procedure determines multivariate models based on background metal values, which describe very general geochemical relations of no interest for prospecting purposes. These models are then applied to sample subsets selected on the basis of factors known to strongly influence geochemical results. Individual samples are characterized after comparison with the relevant threshold levels and background multielement models. One hundred and fifteen anomalous samples for zinc, from seven provenance groups draining 1259 sample sites, were identified and characterized by this procedure. Forty-three of these samples had zinc concentrations greater than the calculated provenance threshold, while 72 were identified solely because their individual metal associations differed significantly from their provenance multivariate background model. The method provides a means to reduce the effects of background variations while simultaneously identifying and characterizing anomalous samples.

The data analysis tools described here allow extraction of useful information from regional geochemical data and, as a result, provide an effective means of defining problems of geological interest that warrant further investigation. / Science, Faculty of / Earth, Ocean and Atmospheric Sciences, Department of / Graduate
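A compact way to picture the anomaly-screening stage this abstract describes: fit a background regression model within each provenance group, then flag samples that exceed the group threshold or depart strongly from the model. The sketch below is a hypothetical reconstruction, not the author's code; the cutoffs (95th-percentile threshold, two residual standard deviations) and all variable names are assumptions.

```python
# Hypothetical sketch of the two-stage anomaly screen: an OLS background
# model per provenance group, plus a provenance threshold. Cutoffs and
# names are assumptions, not taken from the thesis.
import numpy as np

def flag_anomalies(zn, predictors, provenance, resid_sd=2.0, pct=95):
    """Boolean mask of anomalous samples for one element (e.g. Zn)."""
    anomalous = np.zeros(len(zn), dtype=bool)
    for group in np.unique(provenance):
        idx = provenance == group
        X = np.column_stack([np.ones(idx.sum()), predictors[idx]])
        # Background model fitted to the group's samples by least squares.
        beta, *_ = np.linalg.lstsq(X, zn[idx], rcond=None)
        resid = zn[idx] - X @ beta
        threshold = np.percentile(zn[idx], pct)
        # Anomalous if above the provenance threshold, or if the sample
        # departs strongly from the multivariate background model.
        anomalous[idx] = (zn[idx] > threshold) | (np.abs(resid) > resid_sd * resid.std())
    return anomalous
```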
4

Statistical inference on some long memory volatility models

Li, Muyi, 李木易. January 2011
published_or_final_version / Statistics and Actuarial Science / Doctoral / Doctor of Philosophy
5

Maximum-likelihood-based confidence regions and hypothesis tests for selected statistical models

Riggs, Kent Edward. Young, Dean M. January 2006
Thesis (Ph.D.)--Baylor University, 2006. / Includes bibliographical references (p. 168-171).
6

Financial Independence and Economic Policy in Costa Rica

Calvo, Minor Vargas January 1978
The main objective of the dissertation is to formulate a system for financial analysis and planning in the Costa Rican economy. First, a tentative structure within which a model of the financial system can be organized is outlined, and the corresponding statistical tables are constructed for the period 1961-1975. It is expected that such tables will be periodically updated and incorporated into a series of publications which describe and analyze financial developments in Costa Rica. The study then analyzes the evolution of the Costa Rican financial system based on the generated data and a set of well-known financial indicators. Finally, alternative linear financial models are formulated and estimated; they are the basis for an analysis of financial interdependence, i.e., the process through which sectoral investment and/or saving decisions generate indirect financial consequences in the rest of the economy.

Some interesting findings of the study, related to the analysis of financial developments during the period 1961-1975, are the observed rapid growth of the relative size of the Costa Rican financial super-structure, coupled with an accelerated process of monetization of the economy. In addition, the importance of the foreign sector in the financial activities of the country shows a considerable increase. The empirical evidence that emerges from estimating alternative linear financial models suggests a considerable degree of financial interdependence, with the government sector taking a predominant role in the generation of financial multiplier effects. It also indicates that defining sectoral preferences over liability holdings as a behavioral assumption in the underlying model explains the pattern of sectoral financial behavior in the Costa Rican economy more effectively than the alternative assumption that preferences are defined over asset holdings. / Thesis / Doctor of Philosophy (PhD)
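For orientation, multiplier effects in a linear financial model of the form x = Cx + d (sectoral positions driven by intersectoral linkages plus exogenous injections) are conventionally read off the inverse (I - C)^{-1}. The sketch below illustrates only that computation; the coefficient matrix is invented for demonstration and is not estimated from the thesis data.

```python
# Illustrative multiplier computation for a linear model x = C x + d.
# The coefficient matrix is made up for demonstration purposes.
import numpy as np

C = np.array([[0.10, 0.25, 0.05],   # government
              [0.20, 0.05, 0.15],   # private
              [0.05, 0.10, 0.02]])  # foreign

# Total (direct + indirect) effects: the Leontief-style inverse (I - C)^-1.
multipliers = np.linalg.inv(np.eye(3) - C)
print(multipliers)  # column j = sectoral response to a unit injection in sector j
```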
7

Parametric inference for time series based upon goodness-of-fit

胡寶璇, Woo, Pao-sun. January 2001
published_or_final_version / Statistics and Actuarial Science / Master / Master of Philosophy
8

Maximum likelihood estimation and forecasting for GARCH, Markov switching, and locally stationary wavelet processes

Xie, Yingfu, January 2007 (PDF)
Diss. (summary) Umeå: Sveriges lantbruksuniv., 2007. / Together with 5 papers.
9

The estimation and inference of complex models

Zhou, Min 24 August 2017
In this thesis, we investigate estimation and inference problems for complex models. We emphasize two major categories: generalized linear models and time series models. For generalized linear models, we consider the fundamental problem of sure screening for interaction terms in ultra-high dimensional feature space; for time series models, we consider the important model assumption of the Markov property.

The first part of this thesis addresses the significant interaction pursuit problem for ultra-high dimensional models with two-way interaction effects. We propose a simple sure screening procedure (SSI) to detect significant interactions between the explanatory variables and the response variable in high or ultra-high dimensional generalized linear regression models. Sure screening is a simple but powerful tool for the first step of feature or variable selection with ultra-high dimensional data. We establish the sure screening properties of the proposed method theoretically. Furthermore, we show that the proposed method can control the false discovery rate at a reasonable level, so that regularized variable selection methods can then be applied to obtain more accurate feature selection in subsequent model selection procedures. Moreover, for computational efficiency, we suggest a much more efficient algorithm, discretized SSI (DSSI), to realize the proposed sure screening method in practice. We investigate the properties of both algorithms, SSI and DSSI, in simulation studies and apply them to real data analyses for illustration.

The second part concerns testing the Markov property in time series processes. The Markov assumption plays an extremely important role in time series analysis and is a fundamental assumption in economic and financial models; however, little existing research has focused on how to test it for time series processes. We therefore propose a new procedure to test whether a beta-mixing time series possesses the Markov property. Our test is based on the Conditional Distance Covariance (CDCov). We investigate the theoretical properties of the proposed method: the asymptotic distribution of the test statistic under the null hypothesis is obtained, and the power of the test under local alternative hypotheses is studied. Simulation studies demonstrate the finite sample performance of our test.
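The abstract does not spell out the SSI algorithm, but the generic idea of marginal sure screening for two-way interactions can be sketched as follows. This is a simplified stand-in (absolute marginal correlation as the utility, a fixed retention size), not the thesis's exact procedure.

```python
# Simplified marginal sure screening for two-way interactions: rank the
# candidate products x_j * x_k by a marginal association measure with the
# response and retain the top d pairs. Utility and cutoff are assumptions.
import numpy as np
from itertools import combinations

def screen_interactions(X, y, top_d=20):
    n, p = X.shape
    yc = y - y.mean()
    scores = []
    for j, k in combinations(range(p), 2):
        z = X[:, j] * X[:, k]
        z = (z - z.mean()) / (z.std() + 1e-12)    # standardize the product term
        scores.append((abs(z @ yc) / n, (j, k)))  # marginal utility
    scores.sort(reverse=True)
    return [pair for _, pair in scores[:top_d]]   # retained interaction pairs
```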
10

Characterization and mitigation of radiation damage on the Gaia Astrometric Field

Brown, Scott William January 2011
In November 2012, the European Space Agency (ESA) is planning to launch Gaia, a mission designed to measure with microarcsecond accuracy the astrometric properties of over a billion stars. Microarcsecond astrometry requires extremely accurate positional measurements of individual stellar transits on the focal plane, which can be disrupted by radiation-induced Charge Transfer Inefficiency (CTI). Gaia will suffer radiation damage that impacts its science performance, which has led to a series of Radiation Campaigns (RCs) carried out by industry to investigate these issues. The goal of this thesis is to rigorously assess these campaigns and to establish how CTI can be handled in the data processing.

We begin in Chapter 1 by giving an overview of astrometry and photometry, introducing the concept of stellar parallax, and establishing why observing from space is paramount for performing global, absolute astrometry. As demonstrated by Hipparcos, the concept is sound. After reviewing the Gaia payload and discussing how astrometric and photometric parameters are determined in practice, we introduce the issue of radiation-induced CTI and how it may be dealt with.

The on-board mitigating strategies are investigated in detail in Chapter 2. Here we analyse the effects of radiation damage as a function of magnitude, with and without a diffuse optical background, charge injection and the use of gates, and also uncover a number of calibration issues. Some of these issues are expected to be removed during flight testing; others will have to be dealt with as part of the data processing, e.g. CCD stitches and the charge injection tail.

In Chapter 3 we turn to the physical properties of a Gaia CCD. Using data from RC2 we probe the density of traps (i.e. damaged sites) in each pixel and, for the first time, measure the Full Well Capacity of the Supplementary Buried Channel, a part of every Gaia pixel that constrains the passage of faint signals away from the bulk of traps throughout the rest of the pixel.

The Data Processing and Analysis Consortium (DPAC) is currently adopting a 'forward modelling' approach to calibrate radiation damage in the data processing. This incorporates a Charge Distortion Model (CDM), which is investigated in Chapter 4. We find that although the CDM performs well, there are a number of degeneracies in the model parameters, which may be probed further with better experimental data and a more realistic model.

Another way of assessing the performance of a CDM is explored in Chapter 5. Using a Monte Carlo approach we test how well the CDM can extract accurate image parameters. It is found that the CDM must be highly robust to achieve a moderate degree of accuracy, and that the fitting is limited by assigning finite window sizes to the image shapes.

Finally, in Chapter 6 we summarise our findings on the campaign analyses, the on-board mitigating strategies, and how well we are currently able to handle radiation damage in the data processing.
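To give a feel for the charge trailing that CTI produces (this is an invented toy, not DPAC's Charge Distortion Model), the sketch below captures a fixed fraction of each pixel's charge into traps and re-emits it exponentially into trailing pixels; both parameters are assumptions.

```python
# Toy CTI model: each transfer captures a fraction of the signal into
# traps, which release charge exponentially into trailing pixels.
# Both parameters are invented for illustration.
import numpy as np

def apply_cti(signal, capture_frac=0.02, release_tau=3.0):
    trapped = 0.0
    out = np.empty_like(signal, dtype=float)
    for i, s in enumerate(signal):
        released = trapped * (1.0 - np.exp(-1.0 / release_tau))  # emission
        captured = s * capture_frac                              # capture
        trapped += captured - released
        out[i] = s - captured + released
    return out

# A delta-function signal acquires a trailing tail after damage:
print(apply_cti(np.array([0.0, 0.0, 100.0, 0.0, 0.0, 0.0, 0.0])))
```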
