51

An Empirical Study on Stock Market Timing with Technical Trading Rules

Chao, Yung-Yu 10 July 2002 (has links)
In recent years, it has been shown that the movements of financial assets are non-linear and exhibit trends within a given period. Increasing evidence that technical trading rules can detect non-linearity in financial time series has renewed interest in technical analysis. This study evaluates the market timing ability of moving average trading rules in twelve developed and emerging equity markets from January 1990 through March 2002. We use traditional tests, bootstrap p-value tests, the Cumby-Modest market timing ability test, and simulated stock trades to evaluate market timing ability. The overall results indicate that the moving average trading rules have predictive ability with respect to market indices in the Asian emerging stock markets. These findings may provide investors with important asset allocation information.
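As a rough illustration of the kind of rule evaluated in studies like this one, the sketch below generates long/flat signals from a short/long moving-average crossover on a daily price series. The window lengths (1, 50) and the synthetic price data are placeholders, not the parameters or data used in the thesis.

```python
import numpy as np
import pandas as pd

def ma_crossover_signals(prices: pd.Series, short: int = 1, long: int = 50) -> pd.Series:
    """Return +1 when the short moving average is above the long one, else -1 (0 before warm-up)."""
    short_ma = prices.rolling(short).mean()
    long_ma = prices.rolling(long).mean()
    return np.sign(short_ma - long_ma).fillna(0.0)

# Hypothetical usage with a synthetic random-walk price series, not real index data.
rng = np.random.default_rng(0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000))))
signals = ma_crossover_signals(prices)
# Strategy return: hold yesterday's signal over today's price change.
strategy_returns = signals.shift(1) * prices.pct_change()
print(strategy_returns.mean(), strategy_returns.std())
```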
52

INTEGRATED DECISION MAKING FOR PLANNING AND CONTROL OF DISTRIBUTED MANUFACTURING ENTERPRISES USING DYNAMIC-DATA-DRIVEN ADAPTIVE MULTI-SCALE SIMULATIONS (DDDAMS)

Celik, Nurcin January 2010 (has links)
Discrete-event simulation has become one of the most widely used analysis tools for large-scale, complex, and dynamic systems such as supply chains, as it can take randomness into account and accommodate very detailed models. However, major challenges arise in simulating such systems, especially when they are used to support short-term decisions (e.g., the operational, maintenance, and scheduling decisions considered in this research). First, a detailed simulation requires significant amounts of computation time. Second, given the enormous amount of dynamically changing data in the system, information needs to be updated wisely in the model to prevent unnecessary use of computing and networking resources. Third, there is a lack of methods allowing dynamic data updates during simulation execution. Overall, in a simulation-based planning and control framework, timely monitoring, analysis, and control are important so as not to disrupt a dynamically changing system. To meet this temporal requirement and address the above challenges, a Dynamic-Data-Driven Adaptive Multi-Scale Simulation (DDDAMS) paradigm is proposed to adaptively adjust the fidelity of a simulation model against available computational resources by incorporating dynamic data into the executing model, which then steers the measurement process for selective data updates. To the best of our knowledge, the proposed DDDAMS methodology is one of the first efforts to present a coherent integrated decision-making framework for timely planning and control of distributed manufacturing enterprises. To this end, a comprehensive system architecture and methodologies are first proposed, with components including 1) a real-time DDDAM-Simulation, 2) grid computing modules, 3) a Web Service communication server, 4) a database, 5) various sensors, and 6) the real system. Four algorithms are then developed and embedded into a real-time simulator to enable its DDDAMS capabilities: abnormality detection, fidelity selection, fidelity assignment, and prediction and task generation. As part of the developed algorithms, improvements are made to the resampling techniques for sequential Bayesian inferencing, and their performance is benchmarked in terms of resampling quality and computational efficiency. Grid computing and Web Services are used for computational resource management and interoperable communication among distributed software components, respectively. A prototype of the proposed DDDAM-Simulation was successfully implemented for preventive maintenance scheduling and part routing scheduling in a semiconductor manufacturing supply chain, with quite promising results.
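The abstract mentions improved resampling techniques for sequential Bayesian inference without detailing them. As a point of reference only, below is a minimal sketch of standard systematic resampling, one of the common baselines such work is typically benchmarked against; it is not the dissertation's improved method.

```python
import numpy as np

def systematic_resample(weights: np.ndarray, rng=np.random.default_rng()) -> np.ndarray:
    """Return indices of resampled particles using systematic resampling.

    weights: normalized importance weights (must sum to 1).
    """
    n = len(weights)
    # One uniform draw, then evenly spaced positions across [0, 1).
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)

# Hypothetical usage: resample 5 particles whose weights have degenerated.
w = np.array([0.05, 0.05, 0.6, 0.25, 0.05])
print(systematic_resample(w))
```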
53

Monte Carlo Based Threat Assessment: An In-Depth Analysis

Danielsson, Simon January 2007 (has links)
This thesis presents improvements and extensions of a previously presented threat assessment algorithm. The algorithm uses Monte Carlo simulation to find threats in a road scene. It is shown that improved results are obtained by using a wider sampling distribution and applying only the most likely samples from the Monte Carlo simulation to the threat assessment. With this method, the simulated vehicles choose more realistic paths and more complex traffic situations are handled adequately. An improvement of the dynamic model is also suggested, which improves the realism of the Monte Carlo simulations. Using the new dynamic model, fewer false positives and more valid threats are detected. A systematic, optimisation-based method for choosing parameters in a stochastic space is suggested; applying it to the parameters that represent human behaviour in the threat assessment algorithm yields more realistic trajectories. A new definition of obstacles in a road scene is suggested, dividing them into two groups, hard and soft obstacles, together with a change to the resampling step of the Monte Carlo simulation that makes use of them.
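As a hedged illustration of keeping only the most likely Monte Carlo samples for the threat decision (the thesis's vehicle models, sampling distributions, and obstacle definitions are not reproduced here), the sketch below draws random trajectories, weights them by a likelihood, and evaluates the threat only over the highest-weight fraction. The callbacks and the toy 1-D usage are hypothetical.

```python
import numpy as np

def threat_from_top_samples(simulate, likelihood, is_collision,
                            n_samples=1000, keep_fraction=0.2,
                            rng=np.random.default_rng()):
    """Estimate a threat level using only the most likely simulated trajectories.

    simulate(rng)    -> one sampled trajectory (hypothetical callback)
    likelihood(tr)   -> how plausible the trajectory is
    is_collision(tr) -> True if the trajectory ends in a collision
    """
    trajectories = [simulate(rng) for _ in range(n_samples)]
    weights = np.array([likelihood(t) for t in trajectories])
    keep = np.argsort(weights)[-int(keep_fraction * n_samples):]  # most likely samples only
    # Threat = fraction of the retained, plausible trajectories that collide.
    return float(np.mean([is_collision(trajectories[i]) for i in keep]))

# Toy usage with 1-D "trajectories": final positions near an obstacle at x = 5.
sim = lambda rng: rng.normal(0.0, 3.0)
lik = lambda x: np.exp(-0.5 * x ** 2)   # plausible trajectories stay near 0
col = lambda x: abs(x - 5.0) < 0.5
print(threat_from_top_samples(sim, lik, col))
```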
54

Evaluating Variance of the Model Credibility Index

Xiao, Yan 30 November 2007 (has links)
The model credibility index is defined as the sample size at which the power of rejection equals 0.5. It applies goodness-of-fit testing ideas and uses a single-number summary statistic as an assessment tool when the working model is false. Estimating the model credibility index involves a bootstrap resampling technique. To assess the consistency of the estimator of the model credibility index, we instead study the variance of the power achieved at a fixed sample size. An improved subsampling method is proposed to obtain an unbiased estimator of the variance of the power. We present two examples to illustrate the mechanics of building the model credibility index and estimating its error in model selection: a two-way independence model tested with the Pearson chi-square test, and a multi-dimensional logistic regression model tested with the likelihood ratio test.
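Following the definition quoted above (the sample size at which rejection power reaches 0.5), here is a rough Monte Carlo sketch of how such an index could be approximated for a chi-square independence test: estimate power on a grid of sample sizes and report the first size where it crosses 0.5. The data-generating function, grid, and replication counts are placeholders, not the thesis's bootstrap-based construction.

```python
import numpy as np
from scipy import stats

def power_at_n(generate_table, n, alpha=0.05, n_rep=500, rng=np.random.default_rng()):
    """Monte Carlo estimate of the power of the Pearson chi-square test at sample size n."""
    rejections = 0
    for _ in range(n_rep):
        table = generate_table(n, rng)              # hypothetical data generator
        _, p, _, _ = stats.chi2_contingency(table)
        rejections += p < alpha
    return rejections / n_rep

def credibility_index(generate_table, sample_sizes):
    """Smallest sample size on the grid whose estimated power reaches 0.5."""
    for n in sample_sizes:
        if power_at_n(generate_table, n) >= 0.5:
            return n
    return None

# Toy generator: a 2x2 table with a slight dependence between rows and columns.
def gen(n, rng):
    probs = np.array([[0.27, 0.23], [0.23, 0.27]]).ravel()
    return rng.multinomial(n, probs).reshape(2, 2)

print(credibility_index(gen, [100, 200, 400, 800, 1600]))
```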
55

Development of a Multi-body Statistical Shape Model of the Wrist

Semechko, Anton 21 December 2011 (has links)
With the continually growing availability of high-performance computing resources, finite element methods (FEM) are becoming increasingly efficient and practical research tools. In the domain of computational biomechanics, FEMs have been successfully applied to investigate biomedical problems including impact and fracture mechanics of bone, load transmission through the joints, feasibility of joint replacements, and many others. The present research study was concerned with the development of a detailed, anatomically accurate, finite element model of the human hand and wrist. As a first step in this direction, we used a publicly available database of wrist bone anatomy and carpal kinematics to construct a multi-body statistical shape model (SSM) of the wrist. The resulting model provides an efficient parameterization of anatomical variations of the entire training set and can thus overcome the major shortcoming of conventional biomechanical models associated with limited generalization ability. The main contributions of this work are: 1) a robust method for constructing a multi-body SSM of the wrist from surface meshes; 2) a novel technique for resampling closed genus-0 meshes to produce high-quality triangulations suitable for finite element simulations. Additionally, all techniques developed in the course of this study could be directly applied to create an equivalent model of the tarsus.
56

Data Assimilation for Agent-Based Simulation of Smart Environment

Wang, Minghao 18 December 2014 (has links)
Agent-based simulation of smart environments is used to study people's movement in support of applications such as energy utilization, HVAC control, and egress strategies in emergency situations. Traditionally, agent-based simulations are not dynamic-data-driven: they run offline and do not assimilate real sensor data about the environment. As more and more buildings are equipped with various sensors, it becomes possible to use real-time sensor data to inform the simulation. To incorporate real sensor data into the simulation, we introduce data assimilation, whose goal is to infer the system state from incomplete, ambiguous, and uncertain sensor data using a computer model. A typical data assimilation framework consists of a computer model, a set of sensors, and a melding scheme. The purpose of this dissertation is to develop a data assimilation framework for agent-based simulation of smart environments. With the developed framework, we demonstrate a building occupancy estimation application that focuses on position estimation. We build an agent-based model to simulate the occupants' movements in the building and use this model in the data assimilation framework. The melding scheme we use to incorporate sensor data into the model is the particle filter algorithm, a set of statistical methods that compute the posterior distribution of the underlying system using a set of samples. It has the benefit of making no assumption about the target distribution and not requiring the target system to be written in analytic form. To overcome the high-dimensional state space problem as the number of agents increases, we develop a new resampling method, called component set resampling, and evaluate its effectiveness in data assimilation. We also developed a graph-based model for simulating building occupancy, which will be used to carry out building occupancy estimation with extremely large numbers of agents in the future.
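To make the melding step concrete, here is a minimal sketch of a generic bootstrap particle filter update that assimilates one noisy occupancy observation. The motion and sensor models are stand-ins, and it uses plain multinomial resampling rather than the component set resampling proposed in the dissertation.

```python
import numpy as np

def assimilate_step(particles, observation, move, sensor_likelihood,
                    rng=np.random.default_rng()):
    """One predict-weight-resample cycle of a bootstrap particle filter.

    particles                 : array of shape (n_particles, state_dim)
    observation               : the latest sensor reading
    move(p, rng)              : hypothetical agent/motion model, returns propagated states
    sensor_likelihood(obs, p) : likelihood of the observation for each particle
    """
    # Predict: propagate every particle through the agent-based motion model.
    particles = move(particles, rng)
    # Weight: score each particle against the incoming sensor data.
    weights = sensor_likelihood(observation, particles)
    weights = weights / weights.sum()
    # Resample: multinomial resampling (the simplest scheme, used here as a stand-in).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Toy usage: 1-D occupant positions, noisy position-like observation of 2.0.
p = np.random.default_rng(1).normal(0.0, 1.0, size=(500, 1))
step = assimilate_step(p, 2.0,
                       move=lambda x, rng: x + rng.normal(0, 0.3, x.shape),
                       sensor_likelihood=lambda obs, x: np.exp(-0.5 * (x[:, 0] - obs) ** 2))
print(step.mean())
```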
57

Parallel Hardware for Sampling Based Nonlinear Filters in FPGAs

Kota Rajasekhar, Rakesh January 2014 (has links)
Particle filters are a class of sequential Monte Carlo methods commonly used to estimate unknowns of time-varying signals in real time, especially when dealing with nonlinearity and non-Gaussianity in BOT applications. This thesis performs one such estimation task: tracking a person using the road information available from an IR surveillance video. A parallel custom hardware design is implemented in an Altera Cyclone IV E FPGA device using a SIRF-type particle filter. The implementation accounts for how the algorithmic aspects of this sampling-based filter relate to the possibilities and constraints of a hardware implementation. Using a 100 MHz clock frequency, the synthesised hardware design can process almost 50 Mparticles/s. The implementation thus tracks the target, defined by a 5-dimensional state variable, using the noisy measurements available from the sensor.
58

Statistical analysis and simulation methods related to load-sharing models.

Rydén, Patrik January 2000 (has links)
We consider the problem of estimating the reliability of bundles constructed of several fibres, given a particular kind of censored data. The bundles consist of several fibres which have their own independent identically distributed failure stresses (i.e. the forces that destroy the fibres). The force applied to a bundle is distributed between the fibres in the bundle, according to a load-sharing model. A bundle with these properties is an example of a load-sharing system. Ropes constructed of twisted threads, composite materials constructed of parallel carbon fibres, and suspension cables constructed of steel wires are all examples of load-sharing systems. In particular, we consider bundles where load-sharing is described by either the Equal load-sharing model or the more general Local load-sharing model. In order to estimate the cumulative distribution function of failure stresses of bundles, we need some observed data. This data is obtained either by testing bundles or by testing individual fibres. In this thesis, we develop several theoretical testing methods for both fibres and bundles, and related methods of statistical inference. Non-parametric and parametric estimators of the cumulative distribution functions of failure stresses of fibres and bundles are obtained from different kinds of observed data. It is proved that most of these estimators are consistent, and that some are strongly consistent estimators. We show that resampling, in this case random sampling with replacement from statistically independent portions of data, can be used to assess the accuracy of these estimators. Several numerical examples illustrate the behavior of the obtained estimators. These examples suggest that the obtained estimators usually perform well when the number of observations is moderate.
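As a rough sketch of the non-parametric ingredients mentioned here (not the load-sharing estimators themselves), the code below computes the empirical CDF of fibre failure stresses and uses random sampling with replacement to get a bootstrap standard error of the estimated CDF at a given stress level. The Weibull-like data are synthetic placeholders.

```python
import numpy as np

def ecdf(sample, x):
    """Empirical CDF of `sample` evaluated at stress level x."""
    return np.mean(np.asarray(sample) <= x)

def bootstrap_se(sample, x, n_boot=2000, rng=np.random.default_rng()):
    """Bootstrap standard error of the empirical CDF at x (sampling with replacement)."""
    sample = np.asarray(sample)
    estimates = [ecdf(rng.choice(sample, size=len(sample), replace=True), x)
                 for _ in range(n_boot)]
    return float(np.std(estimates, ddof=1))

# Synthetic fibre failure stresses, purely illustrative.
rng = np.random.default_rng(0)
stresses = rng.weibull(5.0, size=200) * 3.0
print(ecdf(stresses, 2.5), bootstrap_se(stresses, 2.5))
```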
59

Modelling and resampling based multiple testing with applications to genetics

Huang, Yifan. January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Document formatted into pages; contains xii, 97 p.; also includes graphics. Includes bibliographical references (p. 94-97). Available online via OhioLINK's ETD Center
60

Probabilistic and statistical problems related to long-range dependence

Bai, Shuyang 11 August 2016 (has links)
The thesis is made up of a number of studies involving long-range dependence (LRD), that is, a slow power-law decay in the temporal correlation of stochastic models. Such a phenomenon has been frequently observed in practice. The models with LRD often yield non-standard probabilistic and statistical results. The thesis includes in particular the following topics:

Multivariate limit theorems. We consider a vector made of stationary sequences, some components of which have LRD, while the others do not. We show that the joint scaling limits of the vector exhibit an asymptotic independence property.

Non-central limit theorems. We introduce new classes of stationary models with LRD through Volterra-type nonlinear filters of white noise. The scaling limits of the sum lead to a rich class of non-Gaussian stochastic processes defined by multiple stochastic integrals.

Limit theorems for quadratic forms. We consider continuous-time quadratic forms involving continuous-time linear processes with LRD. We show that the scaling limit of such quadratic forms depends on both the strength of LRD and the decaying rate of the quadratic coefficient.

Behavior of the generalized Rosenblatt process. The generalized Rosenblatt process arises from scaling limits under LRD. We study the behavior of this process as its two critical parameters approach the boundaries of the defining region.

Inference using self-normalization and resampling. We introduce a procedure called "self-normalized block sampling" for the inference of the mean of stationary time series. It provides a unified approach to time series with or without LRD, as well as with or without heavy tails. The asymptotic validity of the procedure is established.
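The self-normalized block sampling procedure itself is not spelled out in the abstract. As a simple point of comparison only, here is a sketch of plain (non-self-normalized) non-overlapping block subsampling of the sample mean, which illustrates the block idea such procedures build on; the block length and the AR(1) data are placeholders.

```python
import numpy as np

def block_means(x, block_len):
    """Means of consecutive non-overlapping blocks of length block_len."""
    x = np.asarray(x)
    n_blocks = len(x) // block_len
    return x[:n_blocks * block_len].reshape(n_blocks, block_len).mean(axis=1)

# Toy usage: an AR(1) series as a stand-in for a dependent time series.
rng = np.random.default_rng(0)
n, phi = 10_000, 0.7
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

bm = block_means(x, block_len=200)
# Dispersion of block means gives a rough sense of the uncertainty of the overall mean.
print(x.mean(), bm.std(ddof=1) / np.sqrt(len(bm)))
```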
