1 |
Saddlepoint Approximation for Calculating Performance of Spectrum-Sliced WDM Systems
Teotia, Seemant, 06 August 1999
Spectrum slicing is a novel technique for implementing wavelength-division multiplexing (WDM). While conventional WDM systems employ laser diodes operating at discrete wavelengths as carriers for the different data channels to be multiplexed, spectrum-sliced systems use spectral slices of a broadband noise source for the different data channels, which makes them economically attractive.
In spectrum-sliced WDM systems with an optical preamplifier receiver, there is an optimum m = BoT (Bo = optical channel bandwidth, T = bit duration) that minimizes the average number of photons per bit (Np) required at the receiver for a given error probability (Pe). Both the optimum m and the minimum Np increase as interchannel interference increases. This has previously been analyzed using the Gaussian approximation, or by assuming that the signals at the decision point are chi-square distributed. Although the chi-square distribution is valid when there is no interference, it is not valid in the presence of interference, since the interference from the neighboring channel has a smaller bandwidth than the signal. In this thesis, a different method, the Saddlepoint Approximation, is used to analyze this problem. Whereas an exact analysis requires determining the probability density function (pdf) of the received signal, the saddlepoint method uses moment generating functions (MGFs), which have a much simpler form and do not require the convolution operations that the pdfs require.
The saddlepoint method is validated by comparing its results with the chi-square analysis for the case of no interchannel interference when a rectangular filter is used. The effect of non-rectangular spectra on receiver sensitivity is also investigated using the Saddlepoint Approximation. After verifying its validity, the method is applied to interchannel interference caused by filter overlap. It is shown that for small filter overlap, use of an equivalent chi-square distribution is valid, but as the overlap grows, the performance approaches that calculated using the Gaussian distribution. It is also shown that there is an optimum filter overlap that maximizes the total system throughput when total bandwidth is constrained. At this optimum, the total system throughput is 135 Gbit/s for a total system bandwidth of 4.4 THz (35 nm) at a bit error rate (BER) of 10^-9. / Master of Science
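The MGF-based approximation this abstract describes can be sketched for the no-interference validation case, where the decision statistic is chi-square distributed. The degrees of freedom and evaluation point below are illustrative, not the thesis's system values: given the cumulant generating function K(s) = log MGF(s), the saddlepoint density is exp(K(ŝ) - ŝx) / sqrt(2π K''(ŝ)), where ŝ solves K'(ŝ) = x.

```python
import math

k = 4  # degrees of freedom of the chi-square target (illustrative)

def K(s):  return -(k / 2) * math.log(1 - 2 * s)   # CGF of chi-square, s < 1/2
def K2(s): return 2 * k / (1 - 2 * s) ** 2          # K''(s)

def saddlepoint_pdf(x):
    # The saddlepoint equation K'(s) = k / (1 - 2s) = x solves in closed form
    s_hat = (1 - k / x) / 2
    return math.exp(K(s_hat) - s_hat * x) / math.sqrt(2 * math.pi * K2(s_hat))

def chi2_pdf(x):
    # Exact chi-square density for comparison
    return x ** (k / 2 - 1) * math.exp(-x / 2) / (2 ** (k / 2) * math.gamma(k / 2))

x = 3.0
print(saddlepoint_pdf(x), chi2_pdf(x))  # agree to within a few percent
```

For the chi-square family the saddlepoint density is exact up to a constant of proportionality, which is one reason this case makes a good validation benchmark before moving to spectra where no closed-form pdf exists.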
|
2 |
Approaches to the multivariate random variables associated with stochastic processes
Yu, Jihnhee, 15 November 2004
Stochastic compartment models are widely used to model processes in biological populations. The residence time is especially useful for describing the system dynamics in these models. Direct calculation of the residence-time distribution for stochastic multi-compartment models is very complicated even for relatively simple models, and is often impossible. This dissertation presents an analytical method to obtain the moment generating function for stochastic multi-compartment models and to describe the distribution of the residence times, especially for systems with nonexponential lifetime distributions.
A common method for obtaining moments of the residence time uses the coefficient matrix; however, it is limited in obtaining high-order moments and moments for combined compartments in a system.
In this dissertation, we first derive the bivariate moment generating function of the residence time distribution for stochastic two-compartment models with general lifetimes. It yields moments of any order and also enables us to approximate the density of the residence time using the saddlepoint approximation. The approximation method is applied to various situations, including the bivariate distribution of residence times in two-compartment models and approximations based on the truncated moment generating function.
Special attention is given to the distribution of the residence time for multi-compartment semi-Markov models. The cofactor rule and the analytic approach to the two-compartment model facilitate the derivation of the moment generating function. Properties of the embedded Markov chain are also used to extend the application of the approach.
This approach provides a complete specification of the residence time distribution based on the moment generating function and thus provides an easier calculation of high-order moments than the approach using the coefficient matrix.
Applications to drug kinetics demonstrate the simplicity and usefulness of this approach.
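The idea of reading moments of any order off an MGF can be sketched with a hypothetical two-compartment catenary model with exponential lifetimes (a toy stand-in for the general semi-Markov machinery above): a particle stays Exp(lam1) in compartment 1, then Exp(lam2) in compartment 2, so the total residence time T = T1 + T2 has MGF M(s) = lam1/(lam1 - s) * lam2/(lam2 - s), and E[T^n] = M^(n)(0).

```python
import math

lam1, lam2 = 2.0, 5.0  # illustrative exit rates for the two compartments

def M(s):
    # MGF of the total residence time of the catenary model, s < min(lam1, lam2)
    return (lam1 / (lam1 - s)) * (lam2 / (lam2 - s))

def moment(n, h=1e-3):
    # n-th raw moment E[T^n] = M^(n)(0), by an n-point central difference
    return sum((-1) ** k * math.comb(n, k) * M((n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

mean = moment(1)           # exact value: 1/lam1 + 1/lam2 = 0.7
var = moment(2) - mean ** 2  # exact value: 1/lam1**2 + 1/lam2**2 = 0.29
print(mean, var)
```

In practice one would differentiate the MGF symbolically rather than numerically, but the sketch shows why the MGF route sidesteps the coefficient-matrix limitation: any order of moment is one more derivative, with no extra matrix algebra.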
|
3 |
Revisiting Empirical Bayes Methods and Applications to Special Types of Data
Duan, Xiuwen, 29 June 2021
Empirical Bayes methods have been around for a long time and have a wide range of applications. These methods provide a way in which historical data can be aggregated to provide estimates of the posterior mean. This thesis revisits some of the empirical Bayes methods and develops new applications. We first look at a linear empirical Bayes estimator and apply it to ranking and symbolic data. Next, we consider Tweedie's formula and show how it can be applied to analyze a microarray dataset. The application of the formula is simplified with the Pearson system of distributions. Saddlepoint approximations enable us to generalize several results in this direction. The results show that the proposed methods perform well in applications to real data sets.
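Tweedie's formula can be sketched in its simplest normal-means form (a conjugate toy setup assumed for illustration; the thesis works with empirical marginals via the Pearson system): if z | theta ~ N(theta, sigma^2), then E[theta | z] = z + sigma^2 * l'(z), where l(z) = log f(z) is the log marginal density of z. With a N(0, A) prior the marginal is N(0, A + sigma^2), so the formula should recover the classical shrinkage estimator z * A / (A + sigma^2).

```python
import math

A, sigma2 = 4.0, 1.0  # illustrative prior variance and noise variance

def log_marginal(z):
    # Marginal density of z is N(0, A + sigma2) in this conjugate example;
    # in an empirical Bayes analysis this would be estimated from the data.
    v = A + sigma2
    return -0.5 * math.log(2 * math.pi * v) - z * z / (2 * v)

def tweedie(z, h=1e-5):
    # Tweedie's formula: posterior mean = z + sigma^2 * d/dz log f(z)
    l_prime = (log_marginal(z + h) - log_marginal(z - h)) / (2 * h)
    return z + sigma2 * l_prime

z = 2.5
print(tweedie(z))  # matches the exact posterior mean z * A / (A + sigma2) = 2.0
```

The appeal in the microarray setting is that only the marginal of the observed scores is needed, which can be estimated once from all genes at the same time.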
|
4 |
A Multidimensional Convolutional Bootstrapping Method for the Analysis of Degradation Data
Clark, Jared M., 18 April 2022
While Monte Carlo methods for bootstrapping are typically easy to implement, they can be quite time-intensive. This work extends an established convolutional method of bootstrapping to cases where convolutions in two or more dimensions are required. The convolutional method relies on efficient computational tools rather than Monte Carlo simulation, which can greatly reduce the computation time. The proposed method is particularly well suited to the analysis of degradation data when the data are not collected on time intervals of equal length. The convolutional bootstrapping method is typically much faster than the Monte Carlo bootstrap and produces exact results in some simple cases. Even in more complicated applications, where exact results are not feasible, mathematical bounds can be placed on the resulting distribution. With these benefits, the convolutional approach is a useful alternative to the traditional Monte Carlo bootstrap.
|
5 |
Novel computational methods for stochastic design optimization of high-dimensional complex systems
Ren, Xuchun, 01 January 2015
The primary objective of this study is to develop new computational methods for robust design optimization (RDO) and reliability-based design optimization (RBDO) of high-dimensional, complex engineering systems. Four major research directions, all anchored in polynomial dimensional decomposition (PDD), have been defined to meet the objective. They involve: (1) development of new sensitivity analysis methods for RDO and RBDO; (2) development of novel optimization methods for solving RDO problems; (3) development of novel optimization methods for solving RBDO problems; and (4) development of a novel scheme and formulation to solve stochastic design optimization problems with both distributional and structural design parameters.
The major achievements are as follows. First, three new computational methods were developed for calculating design sensitivities of statistical moments and reliability of high-dimensional complex systems subject to random inputs. The first method represents a novel integration of PDD of a multivariate stochastic response function and score functions, leading to analytical expressions of design sensitivities of the first two moments. The second and third methods, relevant to probability distribution or reliability analysis, exploit two distinct combinations built on PDD: the PDD-SPA method, entailing the saddlepoint approximation (SPA) and score functions; and the PDD-MCS method, utilizing the embedded Monte Carlo simulation (MCS) of the PDD approximation and score functions. For all three methods, the statistical moments or failure probabilities and their design sensitivities are determined concurrently from a single stochastic analysis or simulation. Second, four new methods were developed for RDO of complex engineering systems. The methods involve PDD of a high-dimensional stochastic response for statistical moment analysis, a novel integration of PDD and score functions for calculating the second-moment sensitivities with respect to the design variables, and standard gradient-based optimization algorithms. The methods, depending on how statistical moment and sensitivity analyses are dovetailed with an optimization algorithm, encompass direct, single-step, sequential, and multi-point single-step design processes. Third, two new methods were developed for RBDO of complex engineering systems.
The methods involve an adaptive-sparse polynomial dimensional decomposition (AS-PDD) of a high-dimensional stochastic response for reliability analysis, a novel integration of AS-PDD and score functions for calculating the sensitivities of the failure probability with respect to design variables, and standard gradient-based optimization algorithms, resulting in a multi-point, single-step design process. The two methods, depending on how the failure probability and its design sensitivities are evaluated, exploit two distinct combinations built on AS-PDD: the AS-PDD-SPA method, entailing SPA and score functions; and the AS-PDD-MCS method, utilizing the embedded MCS of the AS-PDD approximation and score functions. In addition, a new method, named the augmented PDD method, was developed for RDO and RBDO subject to mixed design variables, comprising both distributional and structural design variables. The method comprises a new augmented PDD of a high-dimensional stochastic response for statistical moment and reliability analyses; an integration of the augmented PDD, score functions, and finite-difference approximation for calculating the sensitivities of the first two moments and the failure probability with respect to distributional and structural design variables; and standard gradient-based optimization algorithms, leading to a multi-point, single-step design process.
The innovative formulations of statistical moment and reliability analysis, design sensitivity analysis, and optimization algorithms have achieved not only highly accurate but also computationally efficient design solutions. Therefore, these new methods are capable of performing industrial-scale design optimization with numerous design variables.
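The SPA ingredient of the PDD-SPA and AS-PDD-SPA methods can be sketched with the Lugannani-Rice tail formula for a failure probability P(X > x). The example below applies it to an Erlang response whose CGF is known in closed form, an illustrative stand-in for a PDD-approximated response; the PDD machinery itself is not reproduced here.

```python
import math

n, lam = 3, 1.0  # illustrative Erlang(n, lam) response

def K(s):  return n * math.log(lam / (lam - s))  # CGF, valid for s < lam
def K2(s): return n / (lam - s) ** 2             # K''(s)

def tail_spa(x):
    # Lugannani-Rice: P(X > x) ~ 1 - Phi(w) + phi(w) * (1/u - 1/w)
    s_hat = lam - n / x                           # solves K'(s) = n/(lam - s) = x
    w = math.copysign(math.sqrt(2 * (s_hat * x - K(s_hat))), s_hat)
    u = s_hat * math.sqrt(K2(s_hat))
    phi = math.exp(-w * w / 2) / math.sqrt(2 * math.pi)
    Phi_bar = 0.5 * math.erfc(w / math.sqrt(2))   # 1 - Phi(w)
    return Phi_bar + phi * (1 / u - 1 / w)

def tail_exact(x):
    # Exact Erlang survival function for comparison
    return math.exp(-lam * x) * sum((lam * x) ** k / math.factorial(k)
                                    for k in range(n))

x = 6.0
print(tail_spa(x), tail_exact(x))  # agree to well under one percent
```

The accuracy at a single saddlepoint solve, with no sampling, is what makes SPA attractive for the small failure probabilities targeted in reliability-based design.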
|