131

Parameter parsimony, model selection, and smooth density estimation

Atilgan, Taskin. January 1900 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1983. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 242-248).
132

Multivariate analysis in vibration modal parameter identification

Zhou, Wenliang, January 2006 (has links)
Thesis (Ph. D.)--University of Rhode Island, 2006. / Includes bibliographical references (leaves 108-112).
133

Small anomalous mass detection from airborne gradiometry

Dumrongchai, Puttipol, January 2007 (has links)
Thesis (Ph. D.)--Ohio State University, 2007. / Title from first page of PDF file. Includes bibliographical references (p. 226-232).
134

GPS determination of diurnal and semidiurnal variations in earth rotation parameters and the geocenter

Nam, Young-sun, January 1999 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 1999. / Vita. Includes bibliographical references (leaves 135-153). Available also in a digital version from Dissertation Abstracts.
135

Likelihood inference for parametric models of dispersal

Jones, Mary Beatrix. January 2000 (has links)
Thesis (Ph. D.)--University of Washington, 2000. / Vita. Includes bibliographical references (p. 142-147).
136

A spatio-temporal individual-based network framework for West Nile virus in the USA: parameter estimation and spreading pattern selection using approximate Bayesian computation

Moon, Sifat Afroj January 1900 (has links)
Master of Science / Department of Electrical and Computer Engineering / Caterina M. Scoglio / West Nile virus (WNV), a mosquito-borne arbovirus, entered the USA through New York City in 1999 and spread across the contiguous USA within three years while transitioning from epidemic outbreaks to endemic transmission. The virus is transmitted by vector-competent mosquitoes and maintained in avian populations. The spatial distribution of WNV is mainly determined by the movement of resident and migratory avian populations. We developed an individual-level heterogeneous network framework across the USA with the goal of understanding the long-range spatial distribution of WNV. To this end, we proposed three distance dispersal kernel models: 1) exponential, representing short-range dispersal; 2) power-law, representing long-range dispersal in all directions; and 3) power-law biased by flyway direction, representing long-range dispersal only along established migratory routes. To select the appropriate dispersal kernel we used human case data and adopted a model selection framework based on approximate Bayesian computation with sequential Monte Carlo sampling (ABC-SMC). From the estimated parameters, we find that the kernel biased by flyway direction fits the WNV human case data best, supporting the hypothesis that long-range WNV transmission occurs mainly along the migratory bird flyways. Through extensive simulation from 2014 to 2016, we proposed and tested hypothetical mitigation strategies and found that mosquito population reduction in the infected states and neighboring states is potentially cost-effective.
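The three candidate kernels can be sketched as simple distance-weight functions. A minimal illustration in Python follows; the functional forms and the directional weighting factor are assumptions for illustration, not the thesis's exact parameterization:

```python
import math

def exponential_kernel(d, beta):
    """Short-range dispersal: weight decays exponentially with distance d."""
    return math.exp(-d / beta)

def power_law_kernel(d, alpha):
    """Long-range dispersal in all directions: heavy-tailed decay with distance."""
    return d ** (-alpha)

def flyway_biased_kernel(d, alpha, angle, flyway_angle, kappa):
    """Power-law decay weighted toward a migratory flyway direction.
    The von-Mises-style directional factor here is an illustrative assumption."""
    directional = math.exp(kappa * math.cos(angle - flyway_angle))
    return d ** (-alpha) * directional
```

At the same distance, the power law retains far more weight than the exponential, which is what allows continent-scale jumps, and the directional factor concentrates those jumps along a flyway.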
137

Lumped parameter thermal modelling for UK domestic buildings based on measured operational data

Dimitriou, Vanda January 2016 (has links)
The development and use of thermal models is an integral part of the design process for existing buildings due for refurbishment. Energy predictions for existing buildings are often based on models that assume thermal property values for the building construction elements. However, once built, the actual thermal properties may differ significantly from their estimated values. Possible reasons include thermal bridging, material distortion and moisture content, sub-standard construction on-site and unavailability of construction details. These uncertainties can be reduced if the modelling process also makes use of operational measurements, such as fuel use and internal temperatures, recorded in the building during operation. To make use of operational data, performance-based models can be used. Performance-based models rely on measured data for the development of the model's architecture and for informing the estimation of the model parameters that would otherwise be based on the modeller's assumptions about the building's characteristics. One solution to the challenge of using performance-based models for existing buildings is the Lumped Parameter modelling approach. The Lumped Parameter technique is often used for performance-based modelling of existing buildings because it requires only moderate knowledge of the building's physical properties and limited operational data for model training. This thesis investigates the potential of performance-based modelling techniques for existing UK domestic buildings, based on the Lumped Parameter thermal modelling technique, and the use of measured operational data to inform the model structure and parameters. Operational data have been collected in 20 homes as part of the REFIT project, an EPSRC-funded research project on Smart Meters and Smart Homes (REFIT, 2016).
This thesis explores 11 houses from the REFIT dataset, in particular the temperature, gas and electricity measurements from the participating households, and develops whole-house and sub-system performance-based models using the Lumped Parameter technique. The suitability of simple performance-based Lumped Parameter models for representing typical UK domestic buildings using mainstream operational data, such as temperature and gas consumption measurements, is explored, and the adequacy of the operational data as measured is assessed. High correlations (>0.9) between whole-house average indoor temperatures and individual room air temperature measurements show that averages are adequate for representing the main rooms of the houses, whereas representing the house's main rooms individually in the same model can prove challenging. A similar result is observed for whole-house radiator representation versus individual radiators. The relationships between the operational data are explored to inform the model structure and to identify collinearity and multicollinearity in the measurements. For whole-house modelling, when constraints are placed on the parameter values during calibration to the measured data, the resulting model parameters can be realistic and good agreement with the measured data can be achieved (on average, an RMSE of 1.03 for air temperature). The parameters most strongly affecting the mean internal air temperature are the external envelope resistance Re, the resistance of the non-inertia elements (e.g. windows and doors), the window area for solar gains, the boiler efficiency and the infiltration rate. The indoor air and internal element heat capacitance had the greatest impact on the swing in internal air temperature (a 75% decrease in the capacitance value resulted in a 190.70% increase in the standard deviation value on average across the 11 houses).
The building envelope heat capacitance and the envelope node positioning were the two parameters with the least impact on the model goodness of fit (a 75% decrease in capacitance and a value of 0.9 in envelope node positioning resulted in a 2.57% and a 6.68% increase respectively in the RMSE on average across all 11 houses). Finally, representing the heating system with the Lumped Parameter model showed that whole-house gas consumption data at the meter level, consisting of gas used for space heating as well as for other purposes, is inadequate to drive the heating system model. A temperature threshold (e.g. 1 °C) indicating model overprediction can be used to remove the time-stamps of mixed-use gas consumption from the model calibration. The heating system model can then be used to quantify gas consumption for space heating and non-space-heating uses. In the 11 houses under study, 82.96% of the total gas consumption served space heating, with 17.04% serving other non-space-heating purposes.
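The lumped-parameter idea can be illustrated with the simplest possible case: a single thermal resistance and capacitance (1R1C) stepped forward with explicit Euler. The function name and the parameter values below are illustrative assumptions, not the thesis's calibrated models:

```python
def simulate_1r1c(t_init, t_ext, q_heat, R, C, dt=3600.0):
    """One-node lumped-parameter model: C * dT/dt = (T_ext - T)/R + Q.

    t_init : initial indoor temperature (deg C)
    t_ext  : external temperature per time step (deg C)
    q_heat : heat input per time step (W)
    R      : envelope resistance (K/W);  C : heat capacitance (J/K)
    """
    temps = [t_init]
    t = t_init
    for te, q in zip(t_ext, q_heat):
        t += ((te - t) / R + q) * dt / C   # explicit Euler update
        temps.append(t)
    return temps

# Free cooling over 48 hours: 20 degC indoors, 10 degC outside, no heating.
temps = simulate_1r1c(20.0, [10.0] * 48, [0.0] * 48, R=0.005, C=1.0e7)
```

Calibration then adjusts R and C (and, in richer models, the extra nodes) so the simulated series matches the measured indoor temperatures, which is essentially what the constrained whole-house fitting described above does.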
138

Parameter identification in civil structures.

Adriane Costa 18 May 2006 (has links)
The parameter identification problem consists of determining parameter values that minimize the difference between measured and calculated values of certain quantities. Naturally, this identification is performed for parameters whose values carry a reasonable degree of uncertainty. This work presents the main mathematical concepts and fundamentals involved in the subject, develops a parameter identification procedure with a solid mathematical basis, and applies this procedure to problems of practical engineering interest. The Hudvudsta tunnel and the Machadinho dam are studied, identifying parameters related to the loads or to the physical properties of the materials, using hierarchical models to represent the structures. The main identification criteria are used to define the objective functions, and Newton-type methods are used to minimize them.
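The core identification loop, minimizing the misfit between measured and calculated values with a Newton-type method, can be sketched for the one-parameter case. The model and data below are hypothetical, chosen only to show the mechanics of a Gauss-Newton iteration:

```python
import math

def gauss_newton_1p(xs, ys, model, dmodel, p0, iters=20):
    """Gauss-Newton for one parameter p minimizing sum((y - model(x, p))**2)."""
    p = p0
    for _ in range(iters):
        r = [y - model(x, p) for x, y in zip(xs, ys)]   # residuals
        J = [dmodel(x, p) for x in xs]                  # d(model)/dp
        num = sum(j * ri for j, ri in zip(J, r))
        den = sum(j * j for j in J)
        p += num / den                                  # (J^T J)^-1 J^T r
    return p

# Hypothetical model: identify decay rate k in u(x) = 1 - exp(-k*x)
model = lambda x, k: 1.0 - math.exp(-k * x)
dmodel = lambda x, k: x * math.exp(-k * x)
xs = [0.5, 1.0, 1.5, 2.0, 2.5]
ys = [model(x, 0.8) for x in xs]        # synthetic "measurements", true k = 0.8
k_hat = gauss_newton_1p(xs, ys, model, dmodel, p0=0.3)
```

With noise-free synthetic data the iteration recovers the true parameter; with real measurements the same loop converges to the value that best reconciles model and data, which is the back-analysis idea applied to the tunnel and dam.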
139

Off-line quality control by robust parameter design

Min, Jun Young January 1900 (has links)
Master of Science / Department of Statistics / Shie-Shien Yang / There has been considerable debate over robust parameter design, and many approaches suited to it have been proposed. In this report, I illustrate and present Taguchi's robust parameter design, the response surface approach and semiparametric robust parameter design. Considerable attention is given to the semiparametric approach, a technique introduced by Pickle, Robinson, Birch and Anderson-Cook (2006). The method combines parametric and nonparametric techniques to improve the estimates of both the mean and the variance of the response.
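Taguchi's approach summarizes each control-factor setting with a signal-to-noise (SN) ratio computed over replicate runs. A minimal sketch using the standard smaller-the-better form, with hypothetical replicate data:

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi SN ratio for a smaller-the-better response: -10*log10(mean(y^2)).
    Higher SN indicates a response that is both small and stable across noise runs."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical replicate measurements at two control-factor settings:
setting_a = [2.1, 2.0, 2.2]   # small and consistent
setting_b = [1.0, 3.5, 2.8]   # similar magnitude on average, but noisy
```

The setting with the higher SN ratio is preferred; the response-surface and semiparametric approaches instead model the mean and variance of the response directly rather than collapsing them into a single ratio.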
140

Estimation of parameters and tests for parameter changes in fractional Gaussian noise

Robbertse, Johannes Lodewickes 29 July 2013 (has links)
D.Phil. (Mathematical Statistics) / Fractional Brownian motion and its increment process, fractional Gaussian noise, are synonymous with the concept of long-range dependence. A strictly stationary time series is said to exhibit long-range dependence, or long memory, if its autocorrelations decrease to zero as a power of the lag but their sum over all lags is not absolutely convergent. This phenomenon has been observed in numerous scientific areas such as hydrology, Ethernet traffic data, stock returns and exchange rates, to name just a few. The extent of long-memory dependence is characterized by the value of the so-called Hurst exponent or Hurst coefficient H. Approximate normality and unbiasedness of the maximum likelihood estimate of H hold reasonably well for sample sizes as small as 20 if the mean and scale parameters are known. We show in a Monte Carlo study that if the latter two parameters are unknown, the bias and variance of the maximum likelihood estimate of H both increase substantially. We also show that the bias can be reduced by using a jackknife or parametric bootstrap procedure. However, in very large samples, maximum likelihood estimation becomes problematic because of the large dimension of the covariance matrix that must be inverted. We consider an approach for estimating the Hurst exponent by taking first-order differences of fractional Gaussian noise. We find that this differenced process has short memory and that, consequently, we may assume approximate independence between the estimates of the Hurst exponents in disjoint blocks of data. We split the data into a number of contiguous blocks, each containing a relatively small number of observations. Computation of the likelihood function in a block then presents no computational problem. We form a pseudo-likelihood function consisting of the product of the likelihood functions in each of the blocks and provide a formula for the standard error of the resulting estimator of H.
This formula is shown in a Monte Carlo study to provide a good approximation to the true standard error. Application of the methodology is illustrated on two data sets. The long-memory property of a time series is primarily characterized by H. In general, such series are exceptionally long, so it is natural to ask whether H remains constant over the full extent of the time series. We propose a number of tests of the hypothesis that H remains constant against the alternative of a change in one or more values of H. Formulas are given to enable calculation of asymptotic p-values. We also propose a permutational procedure for evaluating exact p-values. The proposed tests are applied to three sets of data.
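The long-memory behaviour that H controls can be checked numerically from the autocovariance of unit-variance fractional Gaussian noise, gamma(k) = 0.5*(|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H)), a standard closed form. A short sketch, with H = 0.8 chosen only for illustration:

```python
def fgn_autocov(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k."""
    k = abs(k)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + abs(k - 1) ** (2 * H))

H = 0.8
# For H > 1/2 the autocovariance decays like H*(2H-1)*k^(2H-2), a power of
# the lag, so the autocorrelations are not absolutely summable: long memory.
gamma_1000 = fgn_autocov(1000, H)
tail_rate = gamma_1000 / (1000 ** (2 * H - 2))   # approaches H*(2H-1) = 0.48
```

First-differencing the series removes this slowly decaying tail, which is what justifies treating disjoint blocks as approximately independent in the pseudo-likelihood construction.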
