51 |
Small anomalous mass detection from airborne gradiometry / Dumrongchai, Puttipol, January 2007 (has links)
Thesis (Ph. D.)--Ohio State University, 2007. / Title from first page of PDF file. Includes bibliographical references (p. 226-232).
|
52 |
GPS determination of diurnal and semidiurnal variations in earth rotation parameters and the geocenter / Nam, Young-sun, January 1999 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 1999. / Vita. Includes bibliographical references (leaves 135-153). Available also in a digital version from Dissertation Abstracts.
|
53 |
Likelihood inference for parametric models of dispersal / Jones, Mary Beatrix, January 2000 (has links)
Thesis (Ph. D.)--University of Washington, 2000. / Vita. Includes bibliographical references (p. 142-147).
|
54 |
A spatio-temporal individual-based network framework for West Nile virus in the USA: parameter estimation and spreading pattern selection using approximate Bayesian computation / Moon, Sifat Afroj, January 1900 (has links)
Master of Science / Department of Electrical and Computer Engineering / Caterina M. Scoglio / West Nile virus (WNV), a mosquito-borne arbovirus, entered the USA through New York City in 1999 and spread to the contiguous USA within three years while transitioning from epidemic outbreaks to endemic transmission. The virus is transmitted by vector-competent mosquitoes and maintained in avian populations. The spatial distribution of WNV is determined mainly by the movement of resident and migratory avian populations. We developed an individual-level heterogeneous network framework across the USA with the goal of understanding the long-range spatial distribution of WNV. To this end, we proposed three distance dispersal kernel models: 1) exponential (short-range dispersal), 2) power-law (long-range dispersal in all directions), and 3) power-law biased by flyway direction (long-range dispersal only along established migratory routes). To select the appropriate dispersal kernel, we used human case data and adopted a model selection framework based on approximate Bayesian computation with sequential Monte Carlo sampling (ABC-SMC). From the estimated parameters, we find that the power-law kernel biased by flyway direction best fits the WNV human case data, supporting the hypothesis that long-range WNV transmission occurs mainly along the migratory bird flyways. Through extensive simulation from 2014 to 2016, we proposed and tested hypothetical mitigation strategies and found that mosquito population reduction in the infected states and neighboring states is potentially cost-effective.
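For readers who want a concrete picture of the three kernels compared above, the following Python sketch gives one plausible parameterization. The functional forms and the parameter names (beta, alpha, kappa, flyway_theta) are illustrative assumptions for this listing, not the authors' exact formulation.

```python
# Hypothetical forms for the three dispersal kernels described in the abstract.
# Parameter names and the von Mises-style flyway bias are assumptions.
import numpy as np

def exponential_kernel(d, beta):
    """Short-range dispersal: link weight decays exponentially with distance d (km)."""
    return np.exp(-d / beta)

def power_law_kernel(d, alpha):
    """Long-range dispersal in all directions: heavy-tailed decay with distance."""
    return (1.0 + d) ** (-alpha)

def flyway_biased_kernel(d, theta, alpha, kappa, flyway_theta=np.pi / 2):
    """Long-range dispersal up-weighted along the migratory flyway axis.
    theta is the bearing from source to target node; the exponential term
    favours links aligned with flyway_theta."""
    directional_weight = np.exp(kappa * np.cos(theta - flyway_theta))
    return (1.0 + d) ** (-alpha) * directional_weight

# Relative weight of a 50 km link vs a 500 km link under each kernel
d = np.array([50.0, 500.0])
theta = np.array([np.pi / 2, np.pi / 2])   # both links aligned with the flyway
for name, w in [("exponential",   exponential_kernel(d, beta=100.0)),
                ("power law",     power_law_kernel(d, alpha=2.0)),
                ("flyway-biased", flyway_biased_kernel(d, theta, alpha=2.0, kappa=1.0))]:
    print(name, w / w.sum())
```

In an ABC-SMC model selection run, each kernel would define a candidate network model whose simulated case counts are compared against the observed human case data.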
|
55 |
Estimation of parameters and tests for parameter changes in fractional Gaussian noise / Robbertse, Johannes Lodewickes, 29 July 2013 (has links)
D.Phil. (Mathematical Statistics) / Fractional Brownian motion and its increment process, fractional Gaussian noise, are synonymous with the concept of long range dependence. A strictly stationary time series is said to exhibit long range dependence or long memory if its autocorrelations decrease to zero as a power of the lag, but their sum over all lags is not absolutely convergent. This phenomenon has been observed in numerous scientific areas such as hydrology, ethernet traffic data, stock returns and exchange rates, to name just a few. The extent of long memory dependence is characterized by the value of the so-called Hurst exponent or Hurst coefficient H. Approximate normality and unbiasedness of the maximum likelihood estimate of H hold reasonably well for sample sizes as small as 20 if the mean and scale parameters are known. We show in a Monte Carlo study that if the latter two parameters are unknown, the bias and variance of the maximum likelihood estimate of H both increase substantially. We also show that the bias can be reduced by using a jackknife or parametric bootstrap procedure. However, in very large samples, maximum likelihood estimation becomes problematic because of the large dimension of the covariance matrix that must be inverted. We consider an approach for estimating the Hurst exponent by taking first order differences of fractional Gaussian noise. We find that this differenced process has short memory and that, consequently, we may assume approximate independence between the estimates of the Hurst exponents in disjoint blocks of data. We split the data into a number of contiguous blocks, each containing a relatively small number of observations. Computation of the likelihood function in a block then presents no computational problem. We form a pseudo-likelihood function consisting of the product of the likelihood functions in each of the blocks and provide a formula for the standard error of the resulting estimator of H. This formula is shown in a Monte Carlo study to provide a good approximation to the true standard error. Application of the methodology is illustrated in two data sets. The long memory property of a time series is primarily characterized by H. In general, such series are exceptionally long; it is therefore natural to enquire whether or not H remains constant over the full extent of the time series. We propose a number of tests for the hypothesis that H remains constant, against an alternative of a change in one or more values of H. Formulas are given to enable calculation of asymptotic p-values. We also propose a permutational procedure for evaluating exact p-values. The proposed tests are applied to three sets of data.
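A minimal sketch of the blockwise pseudo-likelihood idea described above is given below. It assumes the exact Gaussian likelihood of unit-variance fractional Gaussian noise in each demeaned block, with the scale parameter profiled out; the block size, optimizer, and per-block demeaning are arbitrary choices for illustration, not the procedure of the thesis.

```python
# Sketch: blockwise (pseudo-likelihood) maximum-likelihood estimation of the
# Hurst exponent H for fractional Gaussian noise. Details are assumptions.
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import minimize_scalar

def fgn_autocov(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k."""
    k = np.abs(k)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

def block_neg_loglik(H, x):
    """Exact Gaussian negative log-likelihood of one demeaned block,
    with the scale parameter sigma^2 profiled out."""
    n = len(x)
    R = toeplitz(fgn_autocov(np.arange(n), H))
    _, logdet = np.linalg.slogdet(R)
    quad = x @ np.linalg.solve(R, x)
    return 0.5 * (logdet + n * np.log(quad / n))

def estimate_H(series, block_size=200):
    """Maximize the pseudo log-likelihood formed over contiguous blocks."""
    blocks = [series[i:i + block_size]
              for i in range(0, len(series) - block_size + 1, block_size)]
    blocks = [b - b.mean() for b in blocks]
    total = lambda H: sum(block_neg_loglik(H, b) for b in blocks)
    return minimize_scalar(total, bounds=(0.01, 0.99), method="bounded").x

# Sanity check: white noise is fractional Gaussian noise with H = 0.5
rng = np.random.default_rng(1)
print(round(estimate_H(rng.standard_normal(2000)), 2))   # expect a value near 0.5
```

Because each block is short, the Toeplitz covariance matrix stays small enough to factor directly, which is exactly the computational point made in the abstract.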
|
56 |
Regularized Numerical Algorithms For Stable Parameter Estimation In Epidemiology And Implications For Forecasting / DeCamp, Linda, 08 August 2017 (has links)
When an emerging outbreak occurs, stable parameter estimation and reliable projections of future incidence using limited (early) data can play an important role in the optimal allocation of resources and in the development of effective public health intervention programs. However, the inverse parameter identification problem is ill-posed and cannot be solved with classical tools of computational mathematics. In this dissertation, various regularization methods are employed to incorporate stability in parameter estimation algorithms. The recovered parameters are then used to generate future incidence curves as well as estimates of the carrying capacity of the epidemic and the turning point of the outbreak.
For the nonlinear generalized Richards model of disease progression, we develop a novel iteratively regularized Gauss-Newton-type algorithm to reconstruct major characteristics of an emerging infection. This problem-oriented numerical scheme takes full advantage of a priori information available for our specific application in order to stabilize the iterative process. Another important aspect of our research is a reliable estimation of time-dependent transmission rate in a compartmental SEIR disease model. To that end, the ODE-constrained minimization problem is reduced to a linear Volterra integral equation of the first kind, and a combination of regularizing filters is employed to approximate the unknown transmission parameter in a stable manner. To justify our theoretical findings, extensive numerical experiments have been conducted with both synthetic and real data for various infectious diseases.
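The reduction to a first-kind Volterra equation can be made concrete with a generic regularizing-filter solver. The sketch below discretizes the equation with a rectangle rule and applies a Tikhonov spectral filter to the SVD of the resulting lower-triangular matrix; the kernel, data, and regularization parameter are synthetic test quantities, not the SEIR-derived kernel or the particular filter combination used in the dissertation.

```python
# Illustrative sketch: a first-kind Volterra equation
#     integral_0^t K(t, s) x(s) ds = f(t)
# is discretized with the rectangle rule and solved with a Tikhonov spectral
# filter applied to the SVD. All quantities below are synthetic test data.
import numpy as np

def volterra_matrix(K, t):
    """Rectangle-rule discretization of the Volterra operator (lower triangular)."""
    h = t[1] - t[0]
    return np.tril(np.array([[K(ti, sj) for sj in t] for ti in t])) * h

def tikhonov_filter_solve(A, f, lam):
    """Apply the Tikhonov spectral filter s/(s^2 + lam) to the SVD of A."""
    U, s, Vt = np.linalg.svd(A)
    return Vt.T @ ((s / (s ** 2 + lam)) * (U.T @ f))

# Synthetic test: recover x(t) = sin(t) from noisy data generated by the model
t = np.linspace(0.0, 5.0, 200)
x_true = np.sin(t)
A = volterra_matrix(lambda ti, sj: np.exp(-(ti - sj)), t)
f_noisy = A @ x_true + 1e-3 * np.random.default_rng(0).normal(size=len(t))
x_hat = tikhonov_filter_solve(A, f_noisy, lam=1e-4)
print("max reconstruction error:", float(np.max(np.abs(x_hat - x_true))))
```

Without the filter (lam = 0), the small singular values of the discretized operator amplify the measurement noise; controlling that amplification is the ill-posedness that the regularization methods in the dissertation address.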
|
57 |
ON-LINE PARAMETER ESTIMATION AND ADAPTIVE CONTROL OF PERMANENT MAGNET SYNCHRONOUS MACHINES / Underwood, Samuel J., 17 May 2006 (has links)
No description available.
|
58 |
Method for Evaluating Changing Blood Perfusion / Sheng, Baoyi, 21 December 2023 (has links)
This thesis provides insight into methods for estimating blood perfusion, emphasizing the need for accurate modeling in dynamic physiological environments. It critically examines conventional error function solutions used in steady-state or gradually changing blood flow scenarios, revealing their shortcomings in reflecting more rapid changes in blood perfusion. To address this limitation, the study introduces a novel prediction model based on the finite-difference method (FDM), specifically designed to produce accurate results under varying blood perfusion conditions. A comparative analysis shows that the FDM-based model agrees with traditional error function methods under constant blood perfusion, establishing its validity under both steady and dynamic blood flow conditions. In addition, the study examines whether analytical solutions exist that are suitable for changing perfusion conditions; three alternative analytical estimation methods were explored, and each proved insufficiently responsive to sudden changes in blood perfusion. Based on the respective strengths and weaknesses of the error function and FDM approaches, a combination of the two methods was developed. Utilizing the simplicity and efficiency of the error function, contact resistance, core temperature, and the initial blood perfusion are first estimated from the beginning of the data record. Subsequent blood perfusion values are then predicted using the FDM, which responds effectively to changing perfusion. / Master of Science / Blood perfusion, the process of blood flowing through our body's tissues, is crucial for our health. Monitoring it is like monitoring traffic flow on roads, and this is especially important during rapid changes, such as during exercise or medical treatments. Traditional methods for estimating blood perfusion, akin to older traffic monitoring techniques, struggle to keep up with these rapid changes. This research introduces a new approach, based on a method often found in engineering and physics, the finite-difference method (FDM), to create more accurate models of blood flow under various conditions. This study puts the new method to the test against the old standards. We discover that while both are effective under steady conditions, the FDM shines when blood flow changes quickly. We also examined three other methods, but they, too, fell short in these fast-changing scenarios. This work is more than just numbers and models; it is about potentially transforming how we understand and manage health. By combining the simplicity of traditional methods for initial blood flow estimates with the dynamic capabilities of the FDM, we are paving the way for more precise medical diagnostics and treatments.
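As a concrete illustration of the kind of forward model involved, the sketch below advances a one-dimensional Pennes-type bioheat equation with a perfusion term by explicit finite-difference steps. The use of the Pennes equation, the tissue properties, the geometry, and the boundary conditions are placeholder assumptions for this listing, not values or choices taken from the thesis.

```python
# Sketch: explicit (FTCS) finite-difference update for a 1-D Pennes-type
# bioheat equation,  rho*c dT/dt = k d2T/dx2 + w_b*rho_b*c_b*(T_a - T),
# showing how a time-varying perfusion w_b(t) enters the forward model.
# All property values and boundary conditions are placeholder assumptions.
import numpy as np

def step_temperature(T, w_b, dx, dt, k=0.5, rho_c=3.6e6,
                     rho_b_c_b=4.0e6, T_arterial=37.0):
    """Advance the tissue temperature field by one explicit time step."""
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
    T_new = T.copy()                       # end nodes held fixed (Dirichlet)
    T_new[1:-1] += dt / rho_c * (k * lap + w_b * rho_b_c_b * (T_arterial - T[1:-1]))
    return T_new

# Forward simulation with a step change in perfusion at t = 60 s
dx, dt = 1e-3, 0.05                        # 1 mm grid, 50 ms step (stable for these values)
T = np.full(41, 37.0)
T[0] = 20.0                                # probe side cooled to 20 C, far side at core temperature
for n in range(int(120.0 / dt)):
    w_b = 0.5e-3 if n * dt < 60.0 else 2.0e-3   # perfusion rate (1/s) jumps at t = 60 s
    T = step_temperature(T, w_b, dx, dt)
print("temperature at the node next to the probe after 120 s:", round(float(T[1]), 2), "C")
```

In an inverse setting, a forward model like this would be run repeatedly while adjusting w_b until the simulated probe-side temperatures match the measurements, which is the role the FDM plays in the combined estimation scheme described above.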
|
59 |
Poisson Approximation to Image Sensor Noise / Jin, Xiaodan, January 2010 (has links)
No description available.
|
60 |
Maximization of propylene in an industrial FCC unit / John, Yakubu M., Patel, Rajnikant, Mujtaba, Iqbal, 15 May 2018 (has links)
Yes / The FCC riser cracks gas oil into useful fuels such as gasoline and diesel, along with lighter products such as ethylene and propylene, which are major building blocks for polyethylene and polypropylene production. The production objective of the riser is usually the maximization of gasoline and diesel, but it can also be to maximize propylene. The optimization and parameter estimation of a six-lumped catalytic cracking reaction of gas oil in the FCC unit are carried out to maximize the yield of propylene, using an optimization framework developed in gPROMS software 5.0 and treating the mass flow rates and temperatures of catalyst and gas oil as decision variables. Optimal values of 290.8 kg/s for the catalyst mass flow rate and 53.4 kg/s for the gas oil mass flow rate were obtained, maximizing the propylene yield at 8.95 wt%. Compared with the base-case simulation value of 4.59 wt%, the maximized propylene yield represents an increase of 95%.
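The structure of this optimization can be sketched outside gPROMS. The snippet below recasts the decision variables (catalyst and gas oil mass flow rates and temperatures) as a bound-constrained maximization in Python; the quadratic response surface is a made-up surrogate standing in for the six-lump riser model, and the bounds and coefficients are illustrative only.

```python
# Schematic sketch of the optimization set-up, with a made-up quadratic
# surrogate in place of the six-lump FCC riser model. Only the structure
# (decision variables, bounds, maximize propylene yield) mirrors the study.
import numpy as np
from scipy.optimize import minimize

def propylene_yield_surrogate(x):
    """Placeholder response surface: wt% propylene vs. catalyst flow (kg/s),
    gas oil flow (kg/s), catalyst temperature (K) and feed temperature (K)."""
    m_cat, m_oil, T_cat, T_oil = x
    cto = m_cat / m_oil                                  # catalyst-to-oil ratio
    return (9.0 - 0.8 * (cto - 5.5) ** 2
                - 1e-4 * (T_cat - 950.0) ** 2
                - 1e-4 * (T_oil - 630.0) ** 2)

x0 = np.array([250.0, 55.0, 940.0, 620.0])               # initial guess
bounds = [(150.0, 350.0), (30.0, 80.0), (900.0, 1000.0), (550.0, 700.0)]
res = minimize(lambda x: -propylene_yield_surrogate(x), x0,
               bounds=bounds, method="L-BFGS-B")
print("decision variables:", np.round(res.x, 1))
print("surrogate propylene yield: %.2f wt%%" % (-res.fun))
```

In the actual study the same role is played by the gPROMS optimization framework acting on the full riser model rather than a surrogate.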
|