61 |
EVALUATION OF SOLID STATE ACCELEROMETER SENSOR FOR EFFECTIVE POSITION ESTIMATION. Lele, Meenal Anand, 22 November 2010.
Inertial sensors such as gyroscopes and accelerometers exhibit both systematic and random measurement errors. Furthermore, the double-integration method accumulates position error due to inherent accelerometer bias drift. The primary objective of this research was to evaluate the ADXL335 acceleration sensor for improved position estimation using an acceleration bias drift error model. In addition, measurement data were recorded with a four-point rotation test to investigate the error characteristics. The fitted model was validated using nonlinear regression analysis. The secondary objective was to examine the effect of bias drift and scale factor errors by introducing the error model into a Kalman filter smoothing algorithm. The study showed that the accelerometer may be used for short-distance mobile robot position estimation. This research should also help establish a generalized test procedure for evaluating accelerometers in terms of sensitivity, accuracy, and data reliability.
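The bias drift problem this abstract describes can be illustrated with a minimal double-integration sketch (our own toy example, not the thesis's error model; the constant-bias value and the 100 Hz sampling rate are assumptions):

```python
import numpy as np

def integrate_position(accel, dt, bias=0.0):
    """Double-integrate acceleration samples into position.

    Subtracting an estimated constant bias before integrating shows why
    even a tiny uncorrected bias grows quadratically in the position
    estimate, which is the drift problem an error model must address.
    """
    a = np.asarray(accel, dtype=float) - bias
    velocity = np.cumsum(a) * dt            # first integration
    position = np.cumsum(velocity) * dt     # second integration
    return position

# A 0.01 m/s^2 constant bias, sampled at 100 Hz for 10 s:
dt, n = 0.01, 1000
measured = np.zeros(n) + 0.01               # stationary sensor, biased output
drift = integrate_position(measured, dt)[-1]                  # ~0.5 m of false travel
corrected = integrate_position(measured, dt, bias=0.01)[-1]   # 0.0
```

Even a 0.01 m/s² uncorrected bias grows into roughly half a metre of position error after ten seconds, which is why a bias drift model matters for short-distance robot positioning.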
|
62 |
Application of the Ensemble Kalman Filter to Estimate Fracture Parameters in Unconventional Horizontal Wells by Downhole Temperature Measurements. Gonzales, Sergio Eduardo, 16 December 2013.
The increase in energy demand throughout the world has forced the oil industry to develop and expand current technologies to optimize well productivity. Distributed temperature sensing has become a current and fairly inexpensive way to monitor performance in hydraulically fractured wells in real time with the aid of fiber optics. However, no applications have yet been attempted to describe or estimate fracture parameters using distributed temperature sensing as the observation parameter. The Ensemble Kalman Filter, a recursive filter, has proved to be an effective tool for inverse problems that determine the parameters of nonlinear models. Although large amounts of data are acquired for the estimation, the Ensemble Kalman Filter keeps the operation time manageable by using only "snapshots" of the ensembles collected from various simulations, with the estimate updated continuously and calibrated against a reference model.
A reservoir model that measures temperature throughout the wellbore is constructed in ECLIPSE. This model is a hybrid representation of what distributed temperature sensing measures in real time throughout the wellbore. Reservoir and fracture parameters are selected with properties and values similar to those of an unconventional well. However, certain parameters, such as fracture width, are adjusted to reduce the computation time significantly.
A sensitivity study is performed for all the reservoir and fracture parameters to understand which parameters require more or less data for the Ensemble Kalman Filter to arrive at an acceptable estimate. Two fracture parameters are selected, based on their low sensitivity and their importance in fracture design, for Ensemble Kalman Filter estimation in various simulations.
Fracture permeability has very low sensitivity; nevertheless, the Ensemble Kalman Filter arrives at an acceptable estimate. Similarly, fracture half-length, with medium sensitivity, arrives at an acceptable estimate in about the same number of integration steps. The true effectiveness of the Ensemble Kalman Filter shows when both parameters are estimated jointly and still arrive at an acceptable estimate without being computationally expensive. The effectiveness of the Ensemble Kalman Filter is directly tied to the quantity of data acquired: the more data available to run simulations, the better and faster the filter performs.
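The "snapshot"-based update described above is, at its core, the stochastic EnKF analysis step. A generic sketch follows (ours, not the thesis's ECLIPSE-coupled implementation; the perturbed-observation variant and the diagonal observation-noise covariance are assumptions):

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_noise_std, rng):
    """One stochastic EnKF analysis step.

    ensemble: (n_members, n_state) array of state/parameter vectors.
    obs_operator: maps one state vector to its predicted observations
    (in the thesis's setting, a reservoir simulation producing
    wellbore temperatures). Observations are perturbed per member so
    the analysis ensemble keeps the correct posterior spread.
    """
    n = ensemble.shape[0]
    predicted = np.array([obs_operator(m) for m in ensemble])   # (n, n_obs)
    X = ensemble - ensemble.mean(axis=0)                        # state anomalies
    Y = predicted - predicted.mean(axis=0)                      # obs anomalies
    R = obs_noise_std ** 2 * np.eye(predicted.shape[1])
    # Kalman gain: P_xy (P_yy + R)^-1, with covariances from anomalies
    K = X.T @ Y @ np.linalg.inv(Y.T @ Y + (n - 1) * R)
    perturbed = observations + rng.normal(0.0, obs_noise_std,
                                          size=(n, len(observations)))
    return ensemble + (perturbed - predicted) @ K.T
```

With an accurate, low-noise observation, the analysis ensemble mean moves close to the observed value, which is the calibration-against-a-reference behavior the abstract describes.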
|
63 |
New Algorithms in Rigid-Body Registration and Estimation of Registration Accuracy. Hedjazi Moghari, Mehdi, 28 September 2008.
Rigid-body registration is an important research area with major applications in computer-assisted and image-guided surgery. In these surgeries, the relationship between preoperative and intraoperative images of a patient must often be established. This relationship is computed through a registration process, which finds a set of transformation parameters that maps fiducial points measured on the patient's anatomy to a preoperative model. Due to point measurement error introduced by medical measurement instruments, the estimated registration parameters are imperfect, which reduces the accuracy of the performed registrations. Medical measurement instruments often perturb the points collected from the patient's anatomy with heterogeneous noise. If the noise characteristics are known, they can be incorporated into the registration algorithm to estimate the registration parameters and their variances more reliably and accurately.
Current techniques employed in rigid-body registration are primarily based on the well-known Iterative Closest Point (ICP) algorithm. Such techniques are susceptible to noise in the data sets and are also very sensitive to initial alignment errors. Moreover, the literature offers no analytical method for estimating the accuracy of a performed registration in the presence of heterogeneous noise.
In an effort to alleviate these problems, we propose and validate several novel registration techniques based on the Unscented Kalman Filter (UKF), a filter generally employed to analyze nonlinear systems corrupted by additive heterogeneous Gaussian noise. First, we propose a new registration algorithm that fits two data sets in the presence of arbitrary Gaussian noise when the corresponding points between the two data sets are assumed known. Next, we extend this algorithm to perform surface-based registration, where point correspondences are not available but the data sets are roughly aligned. A solution to the multi-body point- and surface-based registration problem is then proposed based on the UKF algorithm.
The outputs of the proposed UKF registration algorithms are then used to estimate the accuracy of the performed registration. For the first time, derivations are presented that estimate the distribution of registration error at a target in the presence of arbitrary Gaussian noise. / Thesis (Ph.D., Electrical & Computer Engineering) -- Queen's University, 2008-09-28.
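The UKF algorithms above all rest on the unscented transform, which propagates a mean and covariance through a nonlinear mapping via deterministically chosen sigma points. A minimal sketch follows (generic, with the standard scaling parameters; not the thesis's registration-specific formulation):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f.

    Builds the 2n+1 sigma points, pushes each through f, and recombines
    them with the standard UKF weights to recover the transformed mean
    and covariance.
    """
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)     # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # (2n+1, n) points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))      # mean weights
    wc = wm.copy()                                      # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    y = np.array([f(s) for s in sigma])
    mean_y = wm @ y
    dy = y - mean_y
    cov_y = (wc[:, None] * dy).T @ dy
    return mean_y, cov_y
```

In a UKF registration, f would play the role of the mapping from the rigid-body transformation parameters to the transformed fiducial coordinates; for a linear f the transform is exact, which makes a handy sanity check.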
|
64 |
On-Line Optimization for a Batch-Fed Copolymerization Reactor with Partial State Measurement. Okorafo, Onyinye, 06 October 2009.
Polymerization processes require adequate monitoring to ensure that the final product meets specification. Various on-line measuring techniques have been developed and implemented to track polymer properties in reactors. For most processes, however, on-line measurement cannot be implemented. In other situations, certain polymer properties or states may not be measurable and must therefore be estimated. This work deals with improving an on-line optimization technique and demonstrating its effectiveness through sensitivity analysis. In addition, state estimation is used to reconstruct states that are unavailable for measurement in a styrene and butyl methacrylate batch-fed solution free-radical copolymerization process subject to on-line optimization. A hybrid extended Kalman filter is used to observe the nonlinear dynamic system, which is subject to real-time dynamic optimization. With very limited measurement information, the states of the system were reconstructed, and additional unobservable quantities were reconstructed using the process model and the estimated states. / Thesis (Master, Chemical Engineering) -- Queen's University, 2009-09-28.
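The filter described above alternates model prediction with measurement updates. A minimal discrete EKF cycle can be sketched as follows (our generic illustration; the hybrid filter in the thesis integrates the continuous reactor model between samples, which we collapse here into an already-discretized prediction function f):

```python
import numpy as np

def ekf_step(x, P, f, F_jac, h, H_jac, Q, R, z):
    """One predict/update cycle of a discrete extended Kalman filter.

    f, h: nonlinear state-transition and measurement functions.
    F_jac, H_jac: their Jacobians, evaluated at the current estimate.
    Q, R: process and measurement noise covariances; z: measurement.
    """
    # Predict: propagate the estimate and linearize the dynamics
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update: blend the prediction with the measurement
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For a scalar linear system with equal prior and measurement variances, the update lands halfway between prediction and measurement and halves the variance, a useful check that the gain algebra is right.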
|
65 |
Adaptive flood forecasting using weather radar data. Tomlin, Christopher Michael, January 1999.
No description available.
|
66 |
Testing a Coupled Global-limited-area Data Assimilation System Using Observations from the 2004 Pacific Typhoon Season. Holt, Christina, August 2011.
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs.
We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited area analysis/forecast system. This is the first time, to our knowledge, that such a system is used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific.
The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP Operational GFS analyses from 2004. These benchmark analyses were both obtained with the Spectral Statistical Interpolation, NCEP's operational data assimilation system in 2004. The GFS operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set, and the errors are calculated for the position and intensity of the TCs.
The global component of the ensemble-based system improves the position analysis over the NCEP Reanalysis but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves the position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality across all analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecasts predicted position better beyond 48 h.
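The position errors verified against the Best Track data are naturally great-circle distances between analyzed and observed storm centres. A small sketch (haversine formula; the choice of metric and the Earth radius are our assumptions, not necessarily those used in the thesis):

```python
import math

def track_error_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between an analyzed TC position
    (lat1, lon1) and a best-track position (lat2, lon2), via the
    haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2.0 * radius_km * math.asin(math.sqrt(a))
```

One degree of latitude corresponds to roughly 111 km, so a quick spot check of the function against that figure catches most unit mistakes.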
|
67 |
Essays on the credit default swap market. Wang, Peipei, Banking & Finance, Australian School of Business, UNSW, January 2009.
The focus of this dissertation is the European credit default swap (CDS) market; CDSs are the most popular credit derivative products. Three issues are discussed. The first, covered in Chapter 2, is the investigation of non-diversifiable jump risk in iTraxx sector indices, based on a multivariate model that explicitly admits discrete common jumps for an index and its components. Our empirical research shows that both the iTraxx Non-Financials and their components experience jumps during the sample period, which means that the jump risks in the iTraxx sector index are not diversifiable. The second issue, covered in Chapter 3, is the component structure of credit default swap spreads and its determinants. We first extract a transitory component and a persistent component from two different maturities of the Markit iTraxx index and then regress these components against proxies for several commonly used explanatory variables. Our results show that these explanatory variables have significant but differing impacts on the extracted components, indicating that a two-factor formulation may be needed to model CDS options. The last issue, covered in Chapters 4, 5, and 6, is the investigation of the linkage between the credit default swap market and the equity market within the European area. We calibrate CDS options with the Heston model to obtain implied volatility in the CDS market, which allows us to investigate both the characteristics of CDS implied volatility and the relationship between the two markets, not only at the level of daily changes but also in the second moment. Our analysis shows that the stock market weakly leads the CDS market in daily changes, but for implied volatility the stock market clearly leads the CDS market. A VECM analysis shows that only the stock market contributes to price discovery. For sub-investment-grade entities, the interactions between the implied volatility of the CDS market and the implied volatility of the stock market are stronger, especially during the recent credit crunch. All these results have important implications for the construction of portfolios containing credit-sensitive instruments.
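The lead-lag findings in daily changes can be illustrated with a simple cross-correlation at positive and negative lags (a generic sketch, not the dissertation's VECM machinery; the series and variable names are ours):

```python
import numpy as np

def lead_lag_corr(x, y, max_lag):
    """Correlation of x_t with y_{t+k} for k in [-max_lag, max_lag].

    A peak at positive k suggests that x leads y by k periods, the
    kind of evidence behind 'the stock market leads the CDS market'.
    """
    out = {}
    n = len(x)
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            a, b = x[:n - k] if k > 0 else x, y[k:]
        else:
            a, b = x[-k:], y[:n + k]
        out[k] = float(np.corrcoef(a, b)[0, 1])
    return out
```

Cross-correlation only measures lead-lag in daily changes; attributing price discovery to one market, as the dissertation does, additionally requires the cointegration structure that a VECM provides.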
|
68 |
Robuste Schätzung und Sensorfusion zur Navigation von wiederverwendbaren Raumtransportern (Robust Estimation and Sensor Fusion for the Navigation of Reusable Space Transporters). Schlotterer, Markus, January 2007.
Also: doctoral dissertation, University of Bremen, 2007.
|
69 |
Untersuchungen zur kooperativen Fahrzeuglokalisierung in dezentralen Sensornetzen (Investigations into Cooperative Vehicle Localization in Decentralized Sensor Networks). Obst, Marcus; Richter, Eric, January 2009.
Diploma thesis, Chemnitz University of Technology, 2009.
|
70 |
Mehrkanalige Geräuschreduktion bei Sprachsignalen mittels Kalman-Filter (Multichannel Noise Reduction for Speech Signals Using Kalman Filters). Kaps, Alexander Michael, January 2008.
Also: doctoral dissertation, Technische Universität Darmstadt.
|