191.
CONTINUOUS TIME MARKOVIAN SEQUENTIAL CONTROL PROCESSES. Unknown date.
Source: Dissertation Abstracts International, Volume: 29-03, Section: B, page: 1201. Thesis (Ph.D.)--The Florida State University, 1968.
192.
RANDOMIZATION TESTS FOR THE MULTIVARIATE TWO-SAMPLE AND PAIRED-SAMPLE PROBLEMS AND THE PROBLEM OF INDEPENDENCE. Unknown date.
Source: Dissertation Abstracts International, Volume: 31-09, Section: B, page: 5701. Thesis (Ph.D.)--The Florida State University, 1970.
193.
A study of Hougaard distributions, Hougaard processes and their applications / Fook Chong, Stéphanie M. C. January 1992.
No description available.
194.
Likelihoods : a survey / Rahme, Elham H. January 1994.
No description available.
195.
A case of scientific fraud? : a statistical approach / Bohossian, Nora. January 2006.
No description available.
196.
Doubly censored prevalent cohort survival data / Guo, Hui, 1974-. January 2006.
No description available.
197.
The use of markers to enhance time-to-event analysis / MacKenzie, Todd. January 1997.
No description available.
198.
Neural Networks for Time Series Forecasting: Practical Implications of Theoretical Results / Thielbar, Melinda F. Unknown date.
Research on using autoregressive neural networks to forecast nonlinear time series has produced mixed results. While neural networks have been established as universal approximators, large-scale studies comparing neural network forecasts with simpler models have rarely shown better performance for the neural network model. In the best cases, the neural network models prevail only after careful tuning.

We examine the simplest case of an autoregressive neural network, where the current value Yt depends on a function of one lag, with one shortcut connection and one hidden unit. We find that even when data are generated from the autoregressive neural network, the location of the series' attraction point often leads to data that exhibit little nonlinear behavior. We use these results as a guide in extending traditional theory on nonlinear time series models. Our added theory is used to select parameter values for a simulation and to generate starting values for training a neural network. Forecasting performance for the different estimation methods is compared. We find that even for our relatively simple neural network, where we know the correct number of hidden units, estimating the parameters is a nontrivial task, and forecasts should be approached with caution.

The one-lag, one-hidden-unit model is then applied to a time series from an experiment in engineering. We find that the methods developed in this paper work well for these data and show promise in applications where measurements are taken often using a computerized setup.
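The model class described in this abstract can be sketched directly: a linear "shortcut" AR(1) term plus one tanh hidden unit driven by the same lag. The sketch below simulates such a series; all parameter names and values are illustrative assumptions, not taken from the dissertation.

```python
import math
import random

def arnn_step(y_prev, phi0, phi1, w, a0, a1, sigma, rng):
    """One step of a one-lag autoregressive neural network with a single
    hidden unit and a linear shortcut connection (parameters illustrative)."""
    linear = phi0 + phi1 * y_prev             # shortcut: ordinary AR(1) part
    hidden = w * math.tanh(a0 + a1 * y_prev)  # the single hidden unit
    return linear + hidden + rng.gauss(0.0, sigma)

def simulate(n, **params):
    """Generate n values from the AR-NN, starting at zero."""
    rng = random.Random(params.pop("seed", 0))
    y = [0.0]
    for _ in range(n - 1):
        y.append(arnn_step(y[-1], rng=rng, **params))
    return y

series = simulate(200, phi0=0.1, phi1=0.5, w=2.0, a0=0.0, a1=1.5, sigma=0.2)
```

Because tanh saturates, a series like this can spend most of its time near one attraction point, where the hidden unit contributes an almost constant offset, which is one way to see the abstract's point that simulated AR-NN data may exhibit little visible nonlinearity.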
199.
Robust Variable Selection / Schumann, David Heinz. 20 April 2009.
The prevalence of extreme outliers in many regression data sets has led to the development of robust methods that can handle these observations. While much attention has been placed on the problem of estimating regression coefficients in the presence of outliers, few methods address variable selection. We develop and study robust versions of the forward selection algorithm, one of the most popular standard variable selection techniques. Specifically we modify the VAMS procedure, a version of forward selection tuned to control the false selection rate, to simultaneously select variables and eliminate outliers. In an alternative approach, robust versions of the forward selection algorithm are developed using the robust forward addition sequence associated with the generalized score statistic. Combining the robust forward addition sequence with robust versions of BIC and the VAMS procedure, a final model is obtained. Monte Carlo simulation compares these robust methods to current robust methods like the LSA and LAD-LASSO. Further simulation investigates the relationship between the breakdown point of the estimation methods central to each procedure and the breakdown point of the final variable selection method.
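The idea of a robust forward addition sequence can be illustrated with a toy version: at each step, score every unselected column by a robust fit to the current residuals and add the best one. The sketch below uses Theil-Sen simple regressions and a median-absolute-residual score as a deliberately simplified stand-in for the generalized score statistic the dissertation actually uses; all function names are ours.

```python
import statistics
from itertools import combinations

def theil_sen(x, y):
    """Robust simple-regression fit: median of pairwise slopes,
    then a median intercept."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2) if x[j] != x[i]]
    b = statistics.median(slopes)
    a = statistics.median(yi - b * xi for xi, yi in zip(x, y))
    return a, b

def robust_forward_sequence(X, y, k):
    """Greedy robust forward addition: repeatedly add the column whose
    Theil-Sen fit to the current residuals has the smallest median
    absolute residual (a toy substitute for the generalized score
    statistic)."""
    resid = list(y)
    selected = []
    candidates = set(range(len(X[0])))
    for _ in range(k):
        best = None
        for j in candidates:
            xj = [row[j] for row in X]
            a, b = theil_sen(xj, resid)
            score = statistics.median(abs(r - (a + b * xv))
                                      for xv, r in zip(xj, resid))
            if best is None or score < best[0]:
                best = (score, j, a, b)
        _, j, a, b = best
        selected.append(j)
        candidates.discard(j)
        xj = [row[j] for row in X]
        resid = [r - (a + b * xv) for xv, r in zip(xj, resid)]
    return selected
```

Because the fit and the score are both median-based, a few gross outliers in y move neither very far, which is the property that lets such a sequence keep its ordering where least-squares forward selection would not.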
200.
Fast FSR Methods for Second-Order Linear Regression Models / Crews, Hugh Bates. 13 May 2008.
Many variable selection techniques have been developed that focus on first-order linear regression models. In some applications, such as modeling response surfaces, fitting second-order terms can improve predictive accuracy. However, the number of spurious interactions can be large, leading to poor results with many methods. We focus on forward selection, describing algorithms that use the natural hierarchy existing in second-order linear regression models to limit spurious interactions. We then develop stopping rules by extending False Selection Rate methodology to these algorithms. In addition, we describe alternative estimation methods for fitting regression models, including the LASSO, CART, and MARS. We also propose a general method for controlling multiple-group false selection rates, which we apply to second-order linear regression models. By estimating a separate entry level for first-order and second-order terms, we obtain equal contributions to the false selection rate from each group. We compare the methods via Monte Carlo simulation and apply them to optimizing response surface experimental designs.
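The hierarchy constraint this abstract describes is essentially a rule for which terms forward selection may consider at each step: a square or interaction becomes eligible only once its parent main effects are in the model. A minimal sketch of that eligibility rule (under a strong-hierarchy assumption; the encoding of terms as ints and pairs is our own) is:

```python
from itertools import combinations

def eligible_terms(selected, p):
    """Candidate terms available to forward selection under strong
    hierarchy, for p predictors. Main effects are ints 0..p-1; the
    square of X_i is (i, i); the interaction of X_i and X_j is (i, j).
    A second-order term is eligible only when all its parent main
    effects are already selected."""
    mains = {t for t in selected if isinstance(t, int)}
    cands = [i for i in range(p) if i not in mains]  # remaining main effects
    # squares whose parent main effect is in the model
    cands += [(i, i) for i in sorted(mains) if (i, i) not in selected]
    # interactions whose two parent main effects are both in the model
    cands += [(i, j) for i, j in combinations(sorted(mains), 2)
              if (i, j) not in selected]
    return cands
```

For example, with main effects 0 and 2 already selected out of p = 4 predictors, the eligible set contains the remaining mains 1 and 3, the squares (0, 0) and (2, 2), and the interaction (0, 2), but not (1, 2), whose parent 1 is absent. Restricting the candidate pool this way is what keeps the number of spurious interactions in check, and estimating a separate entry level per group (first-order vs. second-order) is then a matter of applying the stopping rule to each group's candidates separately.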