311

Performance Analysis of Parametric Spectral Estimators

Völcker, Björn January 2002 (has links)
No description available.
312

Topics on fractional Brownian motion and regular variation for stochastic processes

Hult, Henrik January 2003 (has links)
The first part of this thesis studies tail probabilities for elliptical distributions and probabilities of extreme events for multivariate stochastic processes. It is assumed that the tails of the probability distributions satisfy a regular variation condition. This means, roughly speaking, that there is a non-negligible probability for very large or extreme outcomes to occur. Such models are useful in applications including insurance, finance and telecommunications networks. It is shown how regular variation of the marginals, or the increments, of a stochastic process implies regular variation of functionals of the process. Moreover, the associated tail behavior in terms of a limit measure is derived.

The second part of the thesis studies problems related to parameter estimation in stochastic models with long memory. Emphasis is on the estimation of the drift parameter in some stochastic differential equations driven by the fractional Brownian motion or, more generally, Volterra-type processes. Observing the process continuously, the maximum likelihood estimator is derived using a Girsanov transformation. In the case of discrete observations the study is carried out for the particular case of the fractional Ornstein-Uhlenbeck process. For this model Whittle's approach is applied to derive an estimator for all unknown parameters.
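The regular variation condition in the first part has a standard nonparametric diagnostic: the Hill estimator of the tail index. The sketch below is illustrative only and is not taken from the thesis; the sample size, the choice of k, and the Pareto test data are assumptions.

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the tail index alpha from the k largest observations.

    Under regular variation, P(X > x) ~ x^{-alpha} L(x) for large x, and
    1/alpha is estimated by the mean log-spacing of the upper order statistics.
    """
    x = np.sort(data)[::-1]               # descending order statistics
    logs = np.log(x[:k]) - np.log(x[k])   # log-spacings against the (k+1)-th
    return 1.0 / logs.mean()              # estimate of alpha

# Illustration on synthetic Pareto data with true tail index alpha = 2
rng = np.random.default_rng(0)
sample = rng.pareto(2.0, size=10_000) + 1.0   # classical Pareto on [1, inf)
print(hill_estimator(sample, k=500))          # should be close to 2
```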
313

Towards Individualized Drug Dosage - General Methods and Case Studies

Fransson, Martin January 2007 (has links)
Progress in individualized drug treatment is of increasing importance, promising to avoid much human suffering and reduce medical treatment costs for society. The strategy is to maximize the therapeutic effects and minimize the negative side effects of a drug on an individual or group basis. To reach this goal, interactions between the human body and different drugs must be further clarified, for instance by using mathematical models. Whether clinical studies or laboratory experiments are used as primary sources of information greatly influences the possibilities of obtaining data. This must be considered both prior to and during model development, and different strategies must be used. The character of the data may also restrict the level of complexity for the models, thus limiting their usage as tools for individualized treatment.

In this thesis work two case studies have been made, each with the aim to develop a model for a specific human-drug interaction. The first case study concerns treatment of inflammatory bowel disease with thiopurines, whereas the second is about treatment of ovarian cancer with paclitaxel. Although both case studies make use of similar amounts of experimental data, model development depends considerably on prior knowledge about the systems, the character of the data and the choice of modelling tools. All these factors are presented for each of the case studies along with current results. Further, a system for classifying different but related models is also proposed, with the intention that an increased understanding will contribute to advancement in individualized drug dosage. / Report code: LiU-Tek-Lic-2007:41.
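For orientation, a minimal one-compartment pharmacokinetic sketch of the kind of human-drug model the thesis discusses is given below. It is a generic textbook model, not either of the thesis case studies; the dose, volume, and elimination rate are hypothetical.

```python
import numpy as np
from scipy.integrate import odeint

def one_compartment(c, t, ke):
    """dC/dt for first-order elimination from a single compartment."""
    return -ke * c

# Hypothetical individual parameters (not values from the thesis)
dose_mg, volume_l, ke_per_h = 100.0, 40.0, 0.2
c0 = dose_mg / volume_l                    # concentration after an IV bolus
t = np.linspace(0, 24, 49)                 # hours
conc = odeint(one_compartment, c0, t, args=(ke_per_h,)).ravel()

# Individualized dosing in this spirit would fit ke (and volume) per patient
# from sparse concentration measurements, e.g. by least squares.
```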
314

Chest Observer for Crash Safety Enhancement

Blåberg, Christian January 2008 (has links)
Feedback control of chest acceleration or chest deflection is believed to be a good way of minimizing the risk of injury. In order to implement such a controller in a car, an observer estimating these responses is needed. The objective of the study was to develop a model of the dummy's chest capable of estimating the chest acceleration and the chest deflection during frontal crashes in real time. The sensor data come from a car accelerometer and the spindle rotation sensor of the belt; the data were collected from dummies during crash tests. The study accomplished these aims using a simple linear model of the chest built from masses, springs and dampers. The parameters of the model have been estimated through system identification. Two types of black-box models have also been studied, one ARX model and one state-space model. The models have been tested and validated against data coming from different crash setups. The results show that all of the studied models can be used to estimate the dummy responses, the physical grey-box model and the black-box state-space model in particular.
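A minimal sketch of the grey-box idea follows: a mass-spring-damper chest model with a discrete Kalman observer. The parameter values, the choice of sled acceleration as input, and the deflection-proxy measurement are assumptions for illustration, not the identified model from the thesis.

```python
import numpy as np

# Illustrative mass-spring-damper chest model (m, k, c are NOT the
# identified thesis values). State x = [deflection, deflection rate],
# input u = measured car acceleration, measurement y = a belt-derived
# deflection proxy.
m, k, c, dt = 30.0, 5.0e4, 1.0e3, 1e-3
A = np.array([[0.0, 1.0], [-k / m, -c / m]])
B = np.array([[0.0], [1.0]])
F = np.eye(2) + dt * A               # Euler discretization of the dynamics
G = dt * B
H = np.array([[1.0, 0.0]])           # observe deflection only

Q, R = 1e-6 * np.eye(2), np.array([[1e-4]])

def kalman_step(x, P, u, y):
    """One predict/correct cycle of a discrete Kalman observer."""
    x = F @ x + G * u                    # predict with the model
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (y - H @ x)              # correct with the measurement
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Chest acceleration can then be read off the model: xddot = A[1] @ x + u.
```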
315

Ill-posedness of parameter estimation in jump diffusion processes

Düvelmeyer, Dana, Hofmann, Bernd 25 August 2004 (has links) (PDF)
In this paper, we consider as an inverse problem the simultaneous estimation of the five parameters of a jump diffusion process from return observations of a price trajectory. We show that some ill-posedness phenomena occur in the parameter estimation problem, because the forward operator fails to be injective and small perturbations in the data may lead to large changes in the solution. We illustrate the instability effect by a numerical case study. To overcome the difficulty coming from ill-posedness we use a multi-parameter regularization approach that finds a trade-off between a least-squares approach based on empirical densities and a fitting of semi-invariants. In this context, a fixed point iteration is proposed that provides good results for the example under consideration in the case study.
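For orientation, the five parameters of a Merton-type jump diffusion are commonly the drift, the diffusion volatility, the jump intensity, and the jump-size mean and standard deviation. A simulation sketch of such return data follows; the parameter values are arbitrary and the snippet implements none of the paper's regularization.

```python
import numpy as np

rng = np.random.default_rng(1)

def merton_returns(n, dt, mu, sigma, lam, jump_mean, jump_std):
    """Log-returns of a Merton jump diffusion over n steps of length dt."""
    diffusion = ((mu - 0.5 * sigma**2) * dt
                 + sigma * np.sqrt(dt) * rng.standard_normal(n))
    n_jumps = rng.poisson(lam * dt, size=n)   # Poisson jump counts per step
    jumps = (jump_mean * n_jumps
             + jump_std * np.sqrt(n_jumps) * rng.standard_normal(n))
    return diffusion + jumps

r = merton_returns(10_000, 1 / 250, mu=0.05, sigma=0.2,
                   lam=5.0, jump_mean=-0.02, jump_std=0.05)

# Two nearby parameter sets can produce near-identical return densities,
# which is the non-injectivity / instability the paper regularizes against.
```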
316

Parameter estimation in a generalized bivariate Ornstein-Uhlenbeck model

Krämer, Romy, Richter, Matthias, Hofmann, Bernd 07 October 2005 (has links) (PDF)
In this paper, we consider the inverse problem of calibrating a generalization of the bivariate Ornstein-Uhlenbeck model introduced by Lo and Wang. Even though the generalized Black-Scholes option pricing formula still holds, option prices change in comparison to the classical Black-Scholes model. The time-dependent volatility function and the other (real-valued) parameters in the model are calibrated simultaneously from option price data and from some empirical moments of the logarithmic returns. This gives an ill-posed inverse problem, which requires a regularization approach. Applying the theory of Engl, Hanke and Neubauer concerning Tikhonov regularization we show convergence of the regularized solution to the true data and study the form of source conditions which ensure convergence rates.
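A generic Tikhonov step on a toy ill-conditioned operator, in the spirit of the regularization theory the paper applies. The Vandermonde forward operator here is an assumption for illustration, not the calibration operator of the model.

```python
import numpy as np

def tikhonov(A, b, alpha):
    """Tikhonov-regularized least squares: argmin ||Ax - b||^2 + alpha ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Ill-conditioned toy forward operator and noisy data
A = np.vander(np.linspace(0, 1, 50), 12, increasing=True)
x_true = np.ones(12)
b = A @ x_true + 1e-4 * np.random.default_rng(2).standard_normal(50)

x_naive = np.linalg.lstsq(A, b, rcond=None)[0]   # amplifies the noise
x_reg = tikhonov(A, b, alpha=1e-6)               # stable, slightly biased
```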
317

Modeling Direct Runoff Hydrographs with the Surge Function

Voytenko, Denis 01 January 2011 (has links)
A surge function is a mathematical function of the form $f(x) = a x^p e^{-bx}$. We simplify the surge function by holding p constant at 1 and investigate the simplified form as a potential model to represent the full peak of a stream discharge hydrograph. The previously studied Weibull and gamma distributions are included for comparison. We develop an analysis algorithm which produces the best-fit parameters for every peak for each model function, and we process the data with a MATLAB script that uses spectral analysis to filter year-long, 15-minute, stream-discharge data sets. The filtering is necessary to locate the concave-upward inflection points used to separate the data set into its constituent, individual peaks. The Levenberg-Marquardt algorithm is used to iteratively estimate the unknown parameters for each version of the modeled peak by minimizing the sum of squares of residuals. The results allow goodness-of-fit comparisons between the three model functions, as well as a comparison of peaks at the same gage through the year of record. Application of these methods to five rivers from three distinct hydrologic regions shows that the simple surge function is a special case of the gamma distribution, which is known to be useful as a modeling function for a full-peak hydrograph. The study also confirms that the Weibull distribution produces good fits to 15-minute hydrograph data.
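A compact sketch of the core fitting step: scipy's curve_fit, whose default unconstrained solver is Levenberg-Marquardt, applied to the simplified (p = 1) surge function on a synthetic peak. The data, noise level, and initial guess are assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def surge(t, a, b):
    """Simplified surge function f(t) = a * t * exp(-b * t), i.e. p fixed at 1."""
    return a * t * np.exp(-b * t)

# Synthetic single hydrograph peak (hypothetical discharge at 15-min steps)
t = np.arange(0, 96) * 0.25                  # hours
q = surge(t, a=120.0, b=0.4) + np.random.default_rng(3).normal(0, 1.0, t.size)

# With no bounds, curve_fit uses Levenberg-Marquardt, matching the
# estimation approach described in the abstract.
(a_hat, b_hat), _ = curve_fit(surge, t, q, p0=(100.0, 0.5))
```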
318

Data assimilation for parameter estimation in coastal ocean hydrodynamics modeling

Mayo, Talea Lashea 25 February 2014 (has links)
Coastal ocean models are used for a vast array of applications. These applications include modeling tidal and coastal flows, waves, and extreme events, such as tsunamis and hurricane storm surges. Tidal and coastal flows are the primary application of this work as they play a critical role in many practical research areas such as contaminant transport, navigation through intracoastal waterways, development of coastal structures (e.g. bridges, docks, and breakwaters), commercial fishing, and planning and execution of military operations in marine environments, in addition to recreational aquatic activities. Coastal ocean models are used to determine tidal amplitudes, time intervals between low and high tide, and the extent of the ebb and flow of tidal waters, often at specific locations of interest. However, modeling tidal flows can be quite complex, as factors such as the configuration of the coastline, water depth, ocean floor topography, and hydrographic and meteorological impacts can have significant effects and must all be considered.

Water levels and currents in the coastal ocean can be modeled by solving the shallow water equations. The shallow water equations contain many parameters, and the accurate estimation of both tides and storm surge is dependent on the accuracy of their specification. Of particular importance are the parameters used to define the bottom stress in the domain of interest [50]. These parameters are often heterogeneous across the seabed of the domain. Their values cannot be measured directly and relevant data can be expensive and difficult to obtain. The parameter values must often be inferred and the estimates are often inaccurate, or contain a high degree of uncertainty [28]. In addition, as is the case with many numerical models, coastal ocean models have various other sources of uncertainty, including the approximate physics, numerical discretization, and uncertain boundary and initial conditions. Quantifying and reducing these uncertainties is critical to providing more reliable and robust storm surge predictions. It is also important to reduce the resulting error in the forecast of the model state as much as possible.

The accuracy of coastal ocean models can be improved using data assimilation methods. In general, statistical data assimilation methods are used to estimate the state of a model given both the original model output and observed data. A major advantage of statistical data assimilation methods is that they can often be implemented non-intrusively, making them relatively straightforward to implement. They also provide estimates of the uncertainty in the predicted model state. Unfortunately, with the exception of the estimation of initial conditions, they do not contribute to the information contained in the model. The model error that results from uncertain parameters is reduced, but information about the parameters in particular remains unknown. Thus, the other commonly used approach to reducing model error is parameter estimation. Historically, model parameters such as the bottom stress terms have been estimated using variational methods. Variational methods formulate a cost functional that penalizes the difference between the modeled and observed state, and then minimize this functional over the unknown parameters. Though variational methods are an effective approach to solving inverse problems, they can be computationally intensive and difficult to code as they generally require the development of an adjoint model. They also are not formulated to estimate parameters in real time, e.g. as a hurricane approaches landfall.

The goal of this research is to estimate parameters defining the bottom stress terms using statistical data assimilation methods. In this work, we use a novel approach to estimate the bottom stress terms in the shallow water equations, which we solve numerically using the Advanced Circulation (ADCIRC) model. In this model, a modified form of the 2-D shallow water equations is discretized in space by a continuous Galerkin finite element method, and in time by finite differencing. We use the Manning's n formulation to represent the bottom stress terms in the model, and estimate various fields of Manning's n coefficients by assimilating synthetic water elevation data using a square root Kalman filter. We estimate three types of fields defined on both an idealized inlet and a more realistic spatial domain. For the first field, a Manning's n coefficient is given a constant value over the entire domain. For the second, we let the Manning's n coefficient take two distinct values, letting one define the bottom stress in the deeper water of the domain and the other define the bottom stress in the shallower region. And finally, because bottom stress terms are generally spatially varying parameters, we consider the third field as a realization of a stochastic process. We represent a realization of the process using a Karhunen-Loève expansion, and then seek to estimate the coefficients of the expansion. We perform several observation system simulation experiments, and find that we are able to accurately estimate the bottom stress terms in most of our test cases. Additionally, we are able to improve forecasts of the model state in every instance. The results of this study show that statistical data assimilation is a promising approach to parameter estimation.
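The joint state-parameter update at the heart of such filters can be sketched compactly. The snippet below uses a perturbed-observation ensemble Kalman update rather than the square-root filter of the dissertation, and the gauge/friction setup is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def enkf_parameter_update(states, params, obs, obs_std):
    """Joint ensemble Kalman update of model states and an uncertain parameter.

    `states` (n_ens,) are ensemble water levels at a gauge, `params` (n_ens,)
    the corresponding friction coefficients; only the state is observed.
    """
    z = np.vstack([states, params])               # augmented ensemble (2, n_ens)
    Zp = z - z.mean(axis=1, keepdims=True)        # anomalies
    Hp = states - states.mean()                   # observed-variable anomalies
    n = len(states)
    cov_zy = (Zp @ Hp) / (n - 1)                  # cross-covariance (2,)
    var_y = Hp @ Hp / (n - 1) + obs_std**2        # innovation variance
    gain = cov_zy / var_y                         # Kalman gain (2,)
    perturbed = obs + obs_std * rng.standard_normal(n)
    z_new = z + np.outer(gain, perturbed - states)
    return z_new[0], z_new[1]                     # updated states, params

# Toy usage: levels depend on friction (a stand-in for the forward model),
# so the state-parameter cross-covariance lets the observation inform params.
params = 0.03 + 0.005 * rng.standard_normal(100)   # Manning's-n-like ensemble
states = 1.0 - 4.0 * (params - 0.03) + 0.01 * rng.standard_normal(100)
states, params = enkf_parameter_update(states, params, obs=1.05, obs_std=0.02)
```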
319

Channel, spectrum, and waveform awareness in OFDM-based cognitive radio systems

Yücek, Tevfik 01 January 2007 (has links)
The radio spectrum is becoming increasingly congested every day with emerging technologies and the increasing number of wireless devices. Considering the limited bandwidth availability, accommodating the demand for higher capacity and data rates is a challenging task, requiring innovative technologies that can offer new ways of exploiting the available radio spectrum. Cognitive radio arises as a tempting solution to the spectral crowding problem by introducing the notion of opportunistic spectrum usage. Because of its attractive features, orthogonal frequency division multiplexing (OFDM) has been successfully used in numerous wireless standards and technologies. We believe that OFDM will play an important role in realizing the cognitive radio concept as well by providing a proven, scalable, and adaptive technology for the air interface.

The goal of this dissertation is to identify and address some of the challenges that arise from the introduction of cognitive radio. Specifically, we propose methods for obtaining awareness about channel, spectrum, and waveform in OFDM-based cognitive radio systems. Parameter estimation for enabling adaptation, spectrum sensing, and OFDM system identification are the three main topics discussed. The OFDM technique is investigated as a candidate for cognitive radio systems. Cognitive radio features and requirements are discussed in detail, and OFDM's ability to satisfy these requirements is explained. In addition, we identify the challenges that arise from employing OFDM technology in cognitive radio.

Algorithms for estimating various channel-related parameters are presented. These parameters are vital for enabling adaptive system design, which is a key requirement for cognitive radio. We develop methods for estimating root-mean-square (RMS) delay spread, Doppler spread, and noise variance. The spectrum opportunity and spectrum sensing concepts are re-evaluated by considering different dimensions of the spectrum, known as the multi-dimensional spectrum space. The spectrum sensing problem in a multi-dimensional space is addressed by developing a new sensing algorithm termed partial match filtering (PMF). Cognitive radios are expected to recognize different wireless networks and to have the capability of communicating with them. Algorithms for identification of multi-carrier transmissions are developed. Within the same work, methods for blindly detecting the transmission parameters of an OFDM-based system are developed. Blind detection is also very helpful in reducing system signaling overhead in the case of adaptive transmission, where transmission parameters are changed depending on the environmental characteristics or spectrum availability.
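As one concrete instance of the sensing problem discussed above, a basic energy detector is sketched below. It is a textbook detector under a Gaussian approximation, not the partial match filtering algorithm proposed in the dissertation; the sample count and false-alarm target are assumptions.

```python
import numpy as np
from scipy.stats import norm

def energy_detect(samples, noise_var, pfa=0.01):
    """Declare a sub-band occupied if the energy statistic exceeds a
    threshold chosen for a target false-alarm probability (Gaussian
    approximation to the chi-square distribution of the statistic)."""
    n = len(samples)
    stat = np.sum(samples ** 2) / noise_var       # ~ chi2_n under noise only
    thresh = n + np.sqrt(2 * n) * norm.isf(pfa)   # approximate quantile
    return stat > thresh

# Noise-only samples: the detector should fire about pfa of the time
rng = np.random.default_rng(5)
print(energy_detect(rng.standard_normal(1024), noise_var=1.0))
```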
320

Systematic optimization and experimental validation of simulated moving bed chromatography systems for ternary separations and equilibrium limited reactions

Agrawal, Gaurav 21 September 2015 (has links)
Simulated Moving Bed (SMB) chromatography is a separation process in which components are separated due to their varying affinity towards the stationary phase. Over the past decade, many modifications have been proposed in SMB chromatography in order to effectively separate a binary mixture. However, the separation of multi-component mixtures using SMB is still one of the major challenges. Although many different strategies have been proposed, previous studies have rarely performed comprehensive investigations for finding the best ternary separation strategy from various possible alternatives. Furthermore, the concept of combining reaction with SMB has been proposed in the past for driving equilibrium-limited reactions to completion by separating the products from the reaction zone. However, the design of such systems is still challenging due to the complex dynamics of simultaneous reaction and adsorption.

The first objective of the study is to find the best ternary separation strategy among various alternative SMB designs. The performance of several ternary SMB operating schemes proposed in the literature is compared in terms of the optimal productivity obtained and the amount of solvent consumed. A multi-objective optimization problem is formulated which maximizes the SMB productivity and the purity of the intermediate eluting component at the same time. Furthermore, the concept of optimizing a superstructure formulation is proposed, where numerous SMB operating schemes can be incorporated into a single formulation. This superstructure approach has the potential to find more advantageous operating schemes than those existing in the literature.

The second objective of the study is to demonstrate the Generalized Full Cycle (GFC) operation experimentally for the first time, and to compare its performance to the JO process. A Semba Octave™ chromatography system is used as an experimental SMB unit to implement the optimal operating schemes. In addition, a simultaneous optimization and model correction (SOMC) scheme is used to resolve the model mismatch in a systematic way. We also show a systematic comparison of the JO and GFC operations by presenting a Pareto plot of the productivity achieved against the desired purity of the intermediate eluting component experimentally.

The third objective of the study is to develop a simulated moving bed reactor (SMBR) process for an industrial-scale application, and to demonstrate the potential of the ModiCon operation for improving the performance of the SMBR compared to the conventional operating strategy. A novel industrial application involving the esterification of acetic acid and 1-methoxy-2-propanol is considered to produce propylene glycol methyl ether acetate (PMA) as the product. A multi-objective optimization study is presented to find the best reactive separation strategy for the production of the PMA product. We also present a Pareto plot that compares the ModiCon operation, which allows periodic changes of the feed composition, with the conventional operating strategy, in terms of the optimal production rate of PMA that can be achieved against the desired conversion of acetic acid.
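The Pareto comparisons described above (productivity against purity, production rate against conversion) reduce to extracting non-dominated points; a small generic helper on hypothetical candidate data is sketched below.

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated subset when all objectives are maximized.

    `points` is (n, m), e.g. columns = (productivity, purity) of candidate
    SMB operating schemes; a point is dropped if another point is at least
    as good in every objective and strictly better in at least one.
    """
    points = np.asarray(points, dtype=float)
    keep = [i for i, p in enumerate(points)
            if not np.any(np.all(points >= p, axis=1) & np.any(points > p, axis=1))]
    return points[keep]

# Hypothetical (productivity, purity) candidates
candidates = [(1.0, 0.90), (0.8, 0.99), (1.2, 0.85), (0.9, 0.95), (0.7, 0.92)]
print(pareto_front(candidates))   # the dominated point (0.7, 0.92) is dropped
```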
