About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.

Generation of the steady state for Markov chains using regenerative simulation.

January 1993
by Yuk-ka Chung. Thesis (M.Phil.)--Chinese University of Hong Kong, 1993. Includes bibliographical references (leaves 73-74).

Contents:
Chapter 1  Introduction  p.1
Chapter 2  Regenerative Simulation  p.5
  2.1  Discrete-time, discrete state space Markov chain  p.5
  2.2  Discrete-time, continuous state space Markov chain  p.8
Chapter 3  Estimation  p.14
  3.1  Ratio estimators  p.14
  3.2  General method for generation of steady states from the estimated stationary distribution  p.17
  3.3  Bootstrap method  p.22
  3.4  A new approach: the scoring method  p.26
    3.4.1  G(0) method  p.29
    3.4.2  G(1) method  p.31
Chapter 4  Bias of the Scoring Sampling Algorithm  p.34
  4.1  General form  p.34
  4.2  Bias of the G(0) estimator  p.36
  4.3  Bias of the G(1) estimator  p.43
  4.4  Estimation of bounds for bias: stopping criterion for simulation  p.51
Chapter 5  Simulation Study  p.54
Chapter 6  Discussion  p.70
References  p.73
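The regenerative method this thesis builds on can be sketched briefly: simulate the chain, split the output into independent cycles at returns to a fixed regeneration state, and estimate the steady-state mean with the classical ratio estimator E[cycle reward] / E[cycle length]. A minimal sketch (the function name and the example chain are illustrative, not from the thesis):

```python
import random

def steady_state_ratio(P, start, n_cycles, reward, seed=1):
    """Regenerative estimate of the steady-state mean of `reward` for a
    discrete-time Markov chain with transition matrix P; regenerations
    are returns to `start`.  Illustrative sketch, not the thesis code."""
    rng = random.Random(seed)
    cycle_rewards, cycle_lengths = [], []
    for _ in range(n_cycles):
        state, total, length = start, 0.0, 0
        while True:
            total += reward(state)
            length += 1
            # Sample the next state from row P[state] by inversion.
            u, cum, nxt = rng.random(), 0.0, len(P) - 1
            for j, pj in enumerate(P[state]):
                cum += pj
                if u < cum:
                    nxt = j
                    break
            state = nxt
            if state == start:      # back at the regeneration state
                break
        cycle_rewards.append(total)
        cycle_lengths.append(length)
    # Classical ratio estimator: E[cycle reward] / E[cycle length].
    return sum(cycle_rewards) / sum(cycle_lengths)

# Two-state chain whose stationary distribution is (0.6, 0.4); estimating
# the indicator of state 1 should therefore give roughly 0.4.
P = [[0.8, 0.2], [0.3, 0.7]]
est = steady_state_ratio(P, start=0, n_cycles=5000,
                         reward=lambda s: float(s == 1))
```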

Simulation results of a sequential fixed-width confidence interval for a function of parameters

Paik, Chang Soo. January 2010
Photocopy of typescript. Digitized by Kansas Correctional Industries.

Center-based cluster analysis using inter-point distances.

January 2009
Law, Shu Kei. Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. Includes bibliographical references (leaves 39-40). Abstract also in Chinese.

Contents:
Chapter 1  Introduction  p.1
  1.1  Basic concept of clustering  p.1
  1.2  Main problems  p.2
  1.3  Review  p.3
  1.4  Newly proposed method  p.7
  1.5  Summary  p.7
Chapter 2  k-means clustering  p.9
  2.1  Algorithm of k-means clustering  p.9
  2.2  Selecting k in k-means clustering  p.11
  2.3  Disadvantages of k-means clustering  p.12
Chapter 3  Methodology and Algorithm  p.14
  3.1  Methodology and Algorithm  p.14
  3.2  Illustrative Example  p.20
Chapter 4  Simulation Study  p.25
  4.1  Simulation Plan  p.25
  4.2  Simulation Details  p.27
  4.3  Simulation Result  p.30
  4.4  Summary  p.34
Chapter 5  Conclusion and Further research  p.36
Bibliography  p.38
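Chapter 2 of this thesis reviews k-means clustering, against which the new inter-point-distance method is compared. A minimal sketch of the standard Lloyd's algorithm it refers to (naive first-k initialization and toy 2-D data, purely illustrative; this is not the thesis's proposed method):

```python
def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm on 2-D points.  Naive initialization
    (first k points); illustrative only."""
    centers = list(points[:k])
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            d2 = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            clusters[d2.index(min(d2))].append(p)
        # Update step: move each center to its cluster's mean.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

pts = [(0.1, 0.0), (0.0, 0.2), (0.2, 0.1),   # points near (0.1, 0.1)
       (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]   # points near (5.0, 5.0)
centers = sorted(kmeans(pts, k=2))
```

One of the disadvantages the thesis lists is visible even here: the result depends on the initialization, and k must be chosen in advance.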

Effectiveness of using two and three-parameter distributions in place of "best-fit distributions" in discrete event simulation models of production lines

Sharma, Akash. 12 December 2003
This study presents the results of using common two- or three-parameter "default" distributions in place of "best-fit" distributions in simulations of serial production lines with finite buffers and blocking. The default distributions used in place of the best-fit distribution are chosen to be non-negative and unbounded, and to match either the first two or the first three moments of the collected data. In addition, the selected default distributions must be commonly available in simulation software packages, or easily constructed from distributions that are. The lognormal is used as the two-parameter distribution to match the first two moments of the data; the two-level hyper-exponential and the three-parameter lognormal are used as three-parameter distributions to match the first three moments. To test the use of these distributions in simulations, production lines are separated into two major classes: automated and manual. In automated systems the workstations have fixed processing times, random times between failures, and random repair times; in manual systems the workstations are reliable but have random processing times. Results for both classes of lines show that the difference in throughput between simulations using best-fit distributions and the two-parameter lognormal is small in some cases and can be reduced in others by matching the first three moments of the data. The study also identifies scenarios that lead to larger throughput differences when a two-parameter default distribution is used. Graduation date: 2004.
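The two-moment matching step described in the abstract has a closed form for the lognormal: given a target mean m and variance v, sigma² = ln(1 + v/m²) and mu = ln(m) − sigma²/2. A small sketch (function name and example values are illustrative, not from the thesis):

```python
import math

def lognormal_from_moments(mean, var):
    """Return (mu, sigma) of a lognormal whose first two moments match
    the given sample mean and variance -- the two-parameter 'default
    distribution' idea described in the abstract."""
    sigma2 = math.log(1.0 + var / mean ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

# Fit to illustrative data moments, then check the round trip:
mu, sigma = lognormal_from_moments(mean=10.0, var=4.0)
implied_mean = math.exp(mu + sigma ** 2 / 2)
implied_var = (math.exp(sigma ** 2) - 1) * math.exp(2 * mu + sigma ** 2)
```

Matching a third moment (as with the hyper-exponential or three-parameter lognormal) generally requires solving a small nonlinear system rather than a closed form.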

A computer simulation model of seasonal transpiration in Douglas-fir based on a model of stomatal resistance /

Reed, Kenneth Lee. January 1972
Thesis (Ph. D.)--Oregon State University, 1972. / Typescript (photocopy). Includes bibliographical references. Also available on the World Wide Web.

Truncation rules in simulation analysis : effect of batch size, time scale and input distribution on the application of Schriber's rule

Baxter, Lori K. 04 June 1990
The objective of many simulations is to study the steady-state behavior of a nonterminating system. Because such systems are complex, their initial conditions are often atypical: simulators commonly start with the system empty and idle, then truncate (delete) some of the initial observations to reduce the initialization bias. This paper studies the application of Schriber's truncation rule to a queueing model and the effects of parameter selection. Schriber's rule requires the simulator to select a batch size, a number of batches, and a measure of precision, and it assumes the output is a time series of discrete observations. Previous studies of Schriber's rule have not considered the effect of variation in the time scale (the time between observations). The performance measures for comparison are the mean squared error and the half-length of the confidence interval. The results indicate that the time scale and batch size are significant parameters, while the number of batches has little effect on the output; a change in the distribution of service time did not alter the results. In addition, multiple replicates, rather than a single run, should be used in establishing the truncation point, and the simulator should choose the time scale of the output series and the batch size carefully. Graduation date: 1991.
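The rule studied here batches the output series and truncates once recent batch means have stabilized. The sketch below is a simplified paraphrase of that idea (parameter names, the max-minus-min tolerance test, and the example series are illustrative, not Schriber's exact formulation):

```python
def schriber_truncation(series, batch_size, n_batches, tol):
    """Simplified Schriber-style truncation: scan consecutive windows of
    `n_batches` batch means and truncate at the first window whose means
    all lie within `tol` of each other.  Illustrative sketch only."""
    means = [sum(series[i:i + batch_size]) / batch_size
             for i in range(0, len(series) - batch_size + 1, batch_size)]
    for w in range(len(means) - n_batches + 1):
        window = means[w:w + n_batches]
        if max(window) - min(window) <= tol:
            return w * batch_size   # truncation point, in observations
    return None                     # rule never satisfied; run longer

# Output with an initialization bias decaying toward a steady state of 1.0:
out = [1.0 + 2.0 * (0.9 ** t) for t in range(200)]
cut = schriber_truncation(out, batch_size=10, n_batches=3, tol=0.05)
```

On this decaying series the rule truncates the first 30 observations, after which the batch means have settled near the steady-state value.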

A Computer-based corporate modeling system.

Zant, Robert Franklin. January 1972
Thesis--University of Florida. Description based on print version record. Typescript. Vita. Bibliography: leaves 144-145.

Huygens probe entry, descent, and landing trajectory reconstruction using the Program to Optimize Simulated Trajectories II

Striepe, Scott A. (Scott Allen), 1965-. 29 August 2008
The objectives of this research were to develop a reconstruction capability using the Program to Optimize Simulated Trajectories II (POST2), apply this capability to reconstruct the Huygens Titan probe entry, descent, and landing (EDL) trajectory, evaluate the newly developed POST2 reconstruction module, analyze the reconstructed trajectory, and assess the pre-flight simulation models used for Huygens EDL simulation. An extended Kalman filter (EKF) module was developed and integrated into POST2 to enable trajectory reconstruction (especially when using POST2-based mission specific simulations). Several validation cases, ranging from a single, constant parameter estimate to multivariable estimation cases similar to an actual mission flight, were executed to test the POST2 reconstruction module. Trajectory reconstruction of the Huygens entry probe at Titan was accomplished using accelerometer measurements taken during flight to adjust an estimated state (e.g., position, velocity, parachute drag, wind velocity, etc.) in a POST2-based simulation developed to support EDL analyses and design prior to entry. Although the main emphasis of the trajectory reconstruction was to evaluate models used in the NASA pre-entry trajectory simulation, the resulting reconstructed trajectory was also assessed to provide an independent evaluation of the ESA result. 
Major findings from this analysis include: altitude profiles agree well with other NASA and ESA results but not with the radar data, although a scale factor of about 0.93 would bring the radar measurements into agreement with these results; the entry capsule aerodynamics predictions (axial component only) were well within the 3-sigma bounds established pre-flight for most of the entry when compared to reconstructed values; a main parachute drag 9% to 19% above the ESA model was determined from the reconstructed trajectory; and, based on the tilt sensor and accelerometer data, the probe was tilted about 10 degrees during the drogue parachute phase.
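The reconstruction idea described above, propagating the state with accelerometer measurements and then correcting it with other observations, can be illustrated with a toy linear Kalman filter. The POST2 module is an extended Kalman filter over a full EDL state; everything below, including the 1-D state, the noise settings, and the noiseless test data, is an illustrative simplification:

```python
def kf_step(x, P, a, z, dt, q, r):
    """One predict/update cycle of a linear Kalman filter with state
    x = [position, velocity], accelerometer reading `a` as the control
    input, and a noisy position measurement z.  Toy linear analogue of
    an EKF reconstruction step; all settings are illustrative."""
    # Predict with the measured acceleration.
    pos = x[0] + x[1] * dt + 0.5 * a * dt * dt
    vel = x[1] + a * dt
    # Covariance propagation for F = [[1, dt], [0, 1]], plus process noise q.
    p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # Update with the position measurement (H = [1, 0]).
    s = p00 + r
    k0, k1 = p00 / s, p10 / s
    y = z - pos                      # measurement residual
    x_new = [pos + k0 * y, vel + k1 * y]
    P_new = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x_new, P_new

# Reconstruct a constant-acceleration descent from noiseless data:
dt, a_true = 0.1, -1.5
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for step in range(1, 101):
    t = step * dt
    z = 0.5 * a_true * t * t         # true position at time t
    x, P = kf_step(x, P, a_true, z, dt, q=1e-4, r=1e-2)
```

With perfect accelerometer and position data the filter tracks the true trajectory exactly; in a real reconstruction the residuals between prediction and measurement are what expose model errors such as the parachute drag and radar scale factor discussed above.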

Interstage stock control for series production lines with variable operation times

龐維宗 (Pong, Wai-chung). January 1985
Published or final version. Industrial Engineering. Master of Philosophy.

Open pit mine dispatching: a simulation study

Williamson, Gary Beyers, 1945-. January 1972
No description available.
