1

Autoregresní modely typu NIAR(1) / Near integrated AR(1) models

Onderko, Martin January 2015 (has links)
This thesis first reviews the basic theory of stochastic processes, both to make the text more self-contained and to introduce key concepts. The AR(1) autoregressive model is then defined within the framework of basic linear time series models, and least squares estimation of its parameter is introduced, together with the classical limit theory for this estimator. Models whose parameter depends on the number of observations are introduced next, leading to the definition of NIAR(1) (near-integrated AR(1)) models, and the classical limit theory for the least squares estimator is extended to this setting. A more general class of models is also introduced, and the corresponding properties of the AR(1) model are derived from the knowledge acquired. The thesis further treats the bootstrap in NIAR(1) models. The theoretical part is complemented by a practical part consisting of numerical studies.
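The least squares estimator at the heart of this setting is simple to state. Below is a minimal sketch (not taken from the thesis) of simulating a near-integrated AR(1) series with parameter rho_n = 1 + c/n and computing the least squares estimate; all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, n):
    """Simulate an AR(1) process X_t = rho * X_{t-1} + e_t with N(0,1) noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def lse_ar1(x):
    """Least squares estimate of the AR(1) parameter:
    rho_hat = sum(X_t * X_{t-1}) / sum(X_{t-1}^2)."""
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

# Near-integrated setting: rho_n = 1 + c/n with c < 0, so rho_n -> 1 as n grows.
n, c = 500, -5.0
x = simulate_ar1(1 + c / n, n)
print(lse_ar1(x))
```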
2

Spatial econometrics: models, methods and applications

Tao, Ji, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Document formatted into pages; contains x, 140 p. Includes bibliographical references (p. 137-140). Available online via OhioLINK's ETD Center.
3

AIC Under the Framework of Least Squares Estimation

Banks, H. T., Joyner, Michele L. 01 December 2017 (has links)
In this note we explain the use of the Akaike Information Criterion and its related model comparison indices (usually derived for maximum likelihood estimator inverse problem formulations) in the context of least squares (ordinary, weighted, iterative weighted or “generalized”, etc.) based inverse problem formulations. The ideas are illustrated with several examples of interest in biology.
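Under Gaussian errors the maximum likelihood and ordinary least squares fits coincide, and the AIC reduces (up to an additive constant that cancels when comparing models) to a function of the residual sum of squares. A minimal sketch of that computation, assuming an ordinary least squares formulation with design matrix X:

```python
import numpy as np

def aic_least_squares(y, X):
    """AIC for an ordinary least squares fit under Gaussian errors:
    AIC = n * ln(RSS / n) + 2k, where k counts the regression
    coefficients plus one for the error variance."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1
    return n * np.log(rss / n) + 2 * k
```

Models with lower AIC are preferred; the relative differences, not the absolute values, carry the information.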
4

Frequency Response and Coherence Function Estimation Methods

Patwardhan, Rohit S. 04 November 2020 (has links)
No description available.
5

Dimensionality Reduction of Hyperspectral Imagery Using Random Projections

Menon, Vineetha 09 December 2016 (has links)
Hyperspectral imagery is often associated with high storage and transmission costs. Dimensionality reduction aims to reduce the time and space complexity of hyperspectral imagery by projecting data into a low-dimensional space such that all the important information in the data is preserved. Dimensionality-reduction methods based on transforms are widely used and give a data-dependent representation that is unfortunately costly to compute. Recently, there has been a growing interest in data-independent representations for dimensionality reduction; of particular prominence are random projections which are attractive due to their computational efficiency and simplicity of implementation. This dissertation concentrates on exploring the realm of computationally fast and efficient random projections by considering projections based on a random Hadamard matrix. These Hadamard-based projections are offered as an alternative to more widely used random projections based on dense Gaussian matrices. Such Hadamard matrices are then coupled with a fast singular value decomposition in order to implement a two-stage dimensionality reduction that marries the computational benefits of the data-independent random projection to the structure-capturing capability of the data-dependent singular value transform. Finally, random projections are applied in conjunction with nonnegative least squares to provide a computationally lightweight methodology for the well-known spectral-unmixing problem. Overall, it is seen that random projections offer a computationally efficient framework for dimensionality reduction that permits hyperspectral-analysis tasks such as unmixing and classification to be conducted in a lower-dimensional space without sacrificing analysis performance while reducing computational costs significantly.
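As a hedged illustration of the two kinds of projection being compared, the sketch below reduces synthetic "hyperspectral" pixels with a dense Gaussian matrix and with a subsampled randomized Hadamard transform, one common Hadamard-based scheme; it is not necessarily the dissertation's exact construction, and all sizes are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)

# Illustrative sizes: 10000 pixels, 256 bands (a power of two, as the
# Hadamard construction requires), target dimension 32.
n_pixels, n_bands, k = 10_000, 256, 32
data = rng.random((n_pixels, n_bands))  # stand-in for hyperspectral pixels

# Dense Gaussian random projection: the widely used baseline.
G = rng.standard_normal((n_bands, k)) / np.sqrt(k)
reduced_gauss = data @ G

# Subsampled randomized Hadamard transform: random sign flips, a
# normalized Hadamard transform, then a random subset of k columns.
signs = rng.choice([-1.0, 1.0], size=n_bands)
H = hadamard(n_bands) / np.sqrt(n_bands)
cols = rng.choice(n_bands, size=k, replace=False)
reduced_hadamard = (data * signs) @ H[:, cols] * np.sqrt(n_bands / k)

print(reduced_gauss.shape, reduced_hadamard.shape)  # both (10000, 32)
```

The appeal of the Hadamard-based scheme is that the transform is structured, so it can be applied in O(n log n) time with a fast Walsh-Hadamard transform rather than the dense matrix multiply shown here.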
6

Identification Of Periodic Autoregressive Moving Average Models

Akgun, Burcin 01 September 2003 (has links) (PDF)
In this thesis, identification of the periodically varying orders of univariate periodic autoregressive moving-average (PARMA) processes is studied. The varying orders are identified by generalizing the well-known Box-Jenkins techniques to a season-wise setting. Only the identification of pure periodic moving-average (PMA) and pure periodic autoregressive (PAR) models is considered. For PARMA model identification, the periodic autocorrelation function (PeACF) and the periodic partial autocorrelation function (PePACF), which play the same roles as their ARMA counterparts, are employed. For parameter estimation, which is considered only to refine model identification, the conditional least squares estimation (LSE) method is used, which is applicable to PAR models. Estimation becomes complicated and may give unsatisfactory results when a moving-average (MA) component is present in the model. To overcome this difficulty, seasons following PMA processes are modeled as PAR processes of reasonable order so that LSE can still be employed. Diagnostic checking, through the residuals of the fitted model, is also performed, with its rationale and methods stated. The last part of the study demonstrates the identification techniques on two seasonal hydrologic time series consisting of average monthly streamflows. For this purpose, computer programs were developed specifically for PARMA model identification.
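As a rough illustration of conditional least squares in the periodic setting, the sketch below fits a PAR model with all seasonal orders fixed at 1; the thesis identifies periodically varying orders, which this sketch does not attempt. A period of 12 corresponds to monthly data such as the streamflow series analyzed:

```python
import numpy as np

def fit_par1(x, period):
    """Conditional least squares for a PAR(1) model: for each season s,
    regress X_t on X_{t-1} over all time points t falling in season s.
    Returns one autoregressive coefficient per season."""
    phi = np.zeros(period)
    t = np.arange(len(x))
    for s in range(period):
        mask = (t % period == s) & (t >= 1)
        y, ylag = x[mask], x[np.flatnonzero(mask) - 1]
        phi[s] = np.sum(y * ylag) / np.sum(ylag ** 2)
    return phi

# Usage on a placeholder monthly series:
rng = np.random.default_rng(0)
print(fit_par1(rng.standard_normal(600), period=12))
```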
7

Estimation In The Simple Linear Regression Model With One-fold Nested Error

Ulgen, Burcin Emre 01 June 2005 (has links) (PDF)
In this thesis, estimation in the simple linear regression model with one-fold nested error is studied. To estimate the fixed effect parameters, generalized least squares and maximum likelihood estimation procedures are reviewed. Moreover, the Minimum Norm Quadratic Estimator (MINQE), Almost Unbiased Estimator (AUE) and Restricted Maximum Likelihood Estimator (REML) of the variance of primary units are derived. Confidence intervals for the fixed effect parameters and the variance components are also studied. Finally, the aforementioned estimation techniques and confidence intervals are applied to real-life data, and the results are presented.
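For reference, a minimal sketch of the generalized least squares step, assuming the error covariance matrix V is known; in practice V must be built from estimated variance components such as those produced by the estimators named above:

```python
import numpy as np

def gls(X, y, V):
    """Generalized least squares: beta_hat = (X' V^-1 X)^-1 X' V^-1 y,
    where V is the (assumed known) error covariance matrix. In the
    one-fold nested error model V is block diagonal, with
    sigma_u^2 * J + sigma_e^2 * I on each primary-unit block."""
    Vinv = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
```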
8

Mean preservation in censored regression using preliminary nonparametric smoothing

Heuchenne, Cédric 18 August 2005 (has links)
In this thesis, we consider the problem of estimating the regression function in location-scale regression models. This model assumes that the random vector (X,Y) satisfies Y = m(X) + s(X)e, where m(.) is an unknown location function (e.g. conditional mean, median, truncated mean, ...), s(.) is an unknown scale function, and e is independent of X. The response Y is subject to random right censoring, and the covariate X is completely observed. In the first part of the thesis, we assume that m(x) = E(Y|X=x) follows a polynomial model. A new estimation procedure for the unknown regression parameters is proposed, which extends the classical least squares procedure to censored data. The proposed method is inspired by the method of Buckley and James (1979), but is, unlike the latter method, a non-iterative procedure due to nonparametric preliminary estimation. The asymptotic normality of the estimators is established. Simulations are carried out for both methods and they show that the proposed estimators have usually smaller variance and smaller mean squared error than the Buckley-James estimators. For the second part, suppose that m(.) = E(Y|.) belongs to some parametric class of regression functions. A new estimation procedure for the true, unknown vector of parameters is proposed, that extends the classical least squares procedure for nonlinear regression to the case where the response is subject to censoring. The proposed technique uses new "synthetic" data points that are constructed by using a nonparametric relation between Y and X. The consistency and asymptotic normality of the proposed estimator are established, and the estimator is compared via simulations with an estimator proposed by Stute in 1999. In the third part, we study the nonparametric estimation of the regression function m(.). It is well known that the completely nonparametric estimator of the conditional distribution F(.|x) of Y given X=x suffers from inconsistency problems in the right tail (Beran, 1981), and hence the location function m(x) cannot be estimated consistently in a completely nonparametric way, whenever m(x) involves the right tail of F(.|x) (like e.g. for the conditional mean). We propose two alternative estimators of m(x), that do not share the above inconsistency problems. The idea is to make use of the assumed location-scale model, in order to improve the estimation of F(.|x), especially in the right tail. We obtain the asymptotic properties of the two proposed estimators of m(x). Simulations show that the proposed estimators outperform the completely nonparametric estimator in many cases.
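As a hedged sketch of least squares under censoring, the code below implements Kaplan-Meier-weighted least squares in the spirit of the Stute (1999) estimator used as a comparison method; it is not the synthetic-data procedure proposed in the thesis:

```python
import numpy as np

def km_weights(z, delta):
    """Kaplan-Meier jump weights for observed times z with censoring
    indicators delta (1 = uncensored). Returns the sort order by
    observed time and the weight attached to each sorted observation."""
    n = len(z)
    order = np.argsort(z)
    d = delta[order].astype(float)
    i = np.arange(1, n + 1)
    # cumulative product of ((n - j) / (n - j + 1))^delta_j over j < i
    surv = np.concatenate(([1.0], np.cumprod(((n - i) / (n - i + 1)) ** d)[:-1]))
    return order, d / (n - i + 1) * surv

def weighted_ls(X, y, w):
    """Weighted least squares with per-observation weights w."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta
```

Censored observations receive weight zero; uncensored ones are up-weighted to compensate for the censoring, which is what makes the plain least squares fit consistent again.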
9

Implementation Of Software Gps Receiver

Gunaydin, Ezgi 01 July 2005 (has links) (PDF)
A software GPS receiver (SGR) implements the functionality of a GPS receiver in software. It has several advantages over its hardware counterparts: improvements in receiver architecture, as well as in the GPS system structure, can be adapted easily, and interaction with nearby sensors can be coordinated easily. In this thesis, an SGR is presented from a practical point of view. Major components of the SGR, along with some alternative algorithms, are implemented in the Matlab environment. The SGR implementation is considered in two main sections, namely a signal processing section and a navigation section. The signal processing section is driven by raw GPS signal samples obtained from the GPS front-end of a NordNav R-25 instrument. The conventional and the block adjustment of synchronizing signal (BAAS) processing methods are implemented and their performances are compared in terms of speed and outputs. The signal processing section outputs raw GPS measurements and navigation data bits. Since the output data length is insufficient in our case, the navigation section input is fed from an Ashtech GPS receiver for a moving platform and a Trimble GPS receiver for a stationary platform. Satellite position computation, pseudorange corrections, the Kalman filter and least squares estimation (LSE) are implemented in the navigation section. The Kalman filter and LSE methods are compared in terms of positioning accuracy for a moving as well as a stationary platform, and the results are compared with commercial GPS outputs. This comparison shows that the software navigation section is equivalent to the commercial GPS in terms of positioning accuracy.
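The LSE positioning step can be illustrated compactly. Below is a minimal Gauss-Newton sketch (not the thesis code) that solves the pseudorange equations for receiver position and clock bias, assuming satellite positions have already been computed and pseudorange corrections applied; at least four satellites are required:

```python
import numpy as np

def solve_position(sat_pos, pseudoranges, iters=10):
    """Iterative (Gauss-Newton) least squares solution of the GPS
    pseudorange equations rho_i = ||s_i - p|| + b for receiver
    position p and clock bias b (in metres). Each iteration
    linearizes about the current estimate and solves in the
    least squares sense."""
    x = np.zeros(4)                      # [px, py, pz, clock bias]
    for _ in range(iters):
        diff = sat_pos - x[:3]           # (n_sats, 3)
        ranges = np.linalg.norm(diff, axis=1)
        predicted = ranges + x[3]
        # Jacobian: unit line-of-sight vectors (negated) and a ones
        # column for the clock-bias term.
        H = np.hstack([-diff / ranges[:, None], np.ones((len(ranges), 1))])
        dx, *_ = np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)
        x += dx
    return x[:3], x[3]
```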
10

Understanding the relationship of lumber yield and cutting bill requirements: a statistical approach

Buehlmann, Urs 13 October 1998 (has links)
Secondary hardwood products manufacturers have been placing heavy emphasis on lumber yield improvements in recent years. More attention has been on lumber grade and cutting technology rather than cutting bill design. However, understanding the underlying physical phenomena of cutting bill requirements and yield is essential to improve lumber yield in rough mills. This understanding could also be helpful in constructing a novel lumber yield estimation model. The purpose of this study was to advance the understanding of the phenomena relating cutting bill requirements and yield. The scientific knowledge gained was used to describe and quantify the effect of part length, width, and quantity on yield. Based on this knowledge, a statistics-based approach to the lumber yield estimation problem was undertaken. Rip-first rough mill simulation techniques and statistical methods were used to attain the study's goals. To facilitate the statistical analysis of the relationship of cutting bill requirements and lumber yield, a theoretical concept, called cutting bill part groups, was developed. Part groups are a standardized way to describe cutting bill requirements. All parts required by a cutting bill are clustered within 20 individual groups according to their size. Each group's midpoint is the representative part size for all parts falling within an individual group. These groups are made such that the error from clustering is minimized. This concept allowed a decrease in the number of possible factors to account for in the analysis of the cutting bill requirements - lumber yield relationship. Validation of the concept revealed that the average error due to clustering parts is 1.82 percent absolute yield. An orthogonal, 2^(20-11) fractional factorial design of resolution V was then used to determine the contribution of different part sizes to lumber yield. All 20 part sizes and 113 of a total of 190 unique secondary interactions were found to be significant (α = 0.05) in explaining the variability in yield observed. Parameter estimates of the part sizes and the secondary interactions were then used to specify the average yield contribution of each variable. Parts with size 17.50 inches in length and 2.50 inches in width were found to contribute the most to higher yield. The positive effect on yield due to parts smaller than 17.50 by 2.50 inches is less pronounced because their quantity is relatively small in an average cutting bill. Parts with size 72.50 by 4.25 inches, on the other hand, had the most negative influence on high yield. However, as further analysis showed, not only the individual parts required by a cutting bill, but also their interaction determines yield. By adding a sufficiently large number of smaller parts to a cutting bill that requires large parts to be cut, high levels of yield can be achieved. A novel yield estimation model using linear least squares techniques was derived based on the data from the fractional factorial design. This model estimates expected yield based on part quantities required by a standardized cutting bill. The final model contained all 20 part groups and their 190 unique secondary interactions. The adjusted R² for this model was found to be 0.94. The model estimated 450 of the 512 standardized cutting bills used for its derivation to within one percent absolute yield. Standardized cutting bills, whose yield level differs by more than two percent can thus be classified correctly in 88 percent of the cases.
Standardized cutting bills whose part quantities were tested beyond the established framework, i.e. the settings used for the data derivation, were estimated with an average error of 2.19 percent absolute yield. Despite the error observed, the model ranked the cutting bills as to their yield level quite accurately. However, cutting bills from actual rough mill operations, which were well beyond the framework of the model, were found to have an average estimation error of 7.62 percent. Nonetheless, the model classified four out of five cutting bills correctly as to their ranking of the yield level achieved. The least squares estimation model thus is a helpful tool in ranking cutting bills for their expected yield level. Overall, the model performs well for standardized cutting bills, but more work is needed to make the model generally applicable for cutting bills whose requirements are beyond the framework established in this study. / Ph. D.
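As a hedged sketch of the model structure described above, the code below builds a design matrix with an intercept, the 20 part-group quantities, and their 190 unique pairwise interactions, then fits it by linear least squares; the data here are random placeholders, not the study's simulated yields:

```python
import numpy as np
from itertools import combinations

def yield_design_matrix(Q):
    """Regression design for the yield model sketched above: an
    intercept, the 20 part-group quantities, and all 190 unique
    pairwise interactions. Q has shape (n_cutting_bills, 20)."""
    inter = np.column_stack([Q[:, i] * Q[:, j]
                             for i, j in combinations(range(Q.shape[1]), 2)])
    return np.column_stack([np.ones(len(Q)), Q, inter])

# Hypothetical fit: Q holds part quantities for 512 standardized
# cutting bills, y their yields (placeholders here).
rng = np.random.default_rng(0)
Q = rng.integers(0, 50, size=(512, 20)).astype(float)
y = rng.random(512) * 100
beta, *_ = np.linalg.lstsq(yield_design_matrix(Q), y, rcond=None)
print(beta.shape)  # (211,) = 1 + 20 + 190 coefficients
```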
