541
Real-time Transmission Over Internet. Gao, Qi. January 2004.
With the expansion of the Internet, real-time transmission over the Internet is becoming a promising new application. Successful real-time communication over IP networks requires reasonably reliable, low-delay, low-loss data transport. Since the Internet is an asynchronous packet-switching network, high load and the lack of delivery guarantees make real-time communication such as voice and video over IP challenging to realize. This thesis work comprises two parts within real-time voice and video communication: network simulation and measurement on the real Internet. In the network simulation, I investigate how much the network must be "overprovisioned" in order to reach a certain quality of service. In the experiments on the real Internet, I simulate real-time transmission with UDP packets along two different traffic routes and analyze the quality of service obtained in each case. The overall contribution of this work is: to create scenarios that clarify the concept of overprovisioning and how it affects the quality of service, and to develop a mechanism for measuring the quality of service that the current best-effort network provides for real-time traffic.
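The measurement mechanism described above reduces to computing loss, delay and jitter from per-packet timestamps. A minimal sketch with hypothetical packet records (sequence numbers and send/receive times are invented for illustration), using the smoothed interarrival-jitter estimator of RFC 3550:

```python
def qos_stats(sent, received):
    """Compute loss rate, mean one-way delay, and RFC 3550-style jitter.

    sent:     {seq: send_time}  for every transmitted UDP packet
    received: {seq: recv_time}  for every packet that actually arrived
    """
    loss_rate = 1.0 - len(received) / len(sent)
    delays = [received[s] - sent[s] for s in sorted(received)]
    mean_delay = sum(delays) / len(delays)
    jitter = 0.0
    for prev, cur in zip(delays, delays[1:]):
        # exponential smoothing with gain 1/16, as in RFC 3550
        jitter += (abs(cur - prev) - jitter) / 16.0
    return loss_rate, mean_delay, jitter

# toy trace: 5 packets sent, packet 3 lost in transit
sent = {1: 0.00, 2: 0.02, 3: 0.04, 4: 0.06, 5: 0.08}
received = {1: 0.05, 2: 0.08, 4: 0.12, 5: 0.13}
loss, delay, jit = qos_stats(sent, received)
```

In a real measurement the timestamps would come from the UDP sender and receiver clocks; clock synchronization issues are ignored in this sketch.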
542
The transformed rejection method for generating Poisson random variables. Hörmann, Wolfgang. January 1992.
The transformed rejection method, a combination of the inversion and the rejection methods that is used to generate non-uniform random numbers from a variety of continuous distributions, can be applied to discrete distributions as well. For the Poisson distribution, a short and simple algorithm is obtained which is well suited for large values of the Poisson parameter $\mu$, even when $\mu$ varies from call to call. The average number of uniform deviates required is lower than for any of the known uniformly fast algorithms. Timings for a C implementation show that the algorithm needs only half the code but is, for $\mu$ not too small, at least as fast as the current state-of-the-art algorithms. (Author's abstract.) Series: Preprint Series, Department of Applied Statistics and Data Processing.
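Hörmann's transformed-rejection algorithm itself is compact but easy to get subtly wrong from memory, so as context it is worth contrasting with the classical multiplication method: exact, but costing O($\mu$) uniforms per call, which is precisely the cost that uniformly fast methods avoid. A sketch of that baseline (not the thesis algorithm):

```python
import math
import random

def poisson_multiplication(mu, rng=random):
    """Exact Poisson(mu) variate via the product-of-uniforms method.

    Counts how many i.i.d. Uniform(0,1) factors keep the running product
    above exp(-mu); the expected number of uniforms grows linearly in mu,
    unlike transformed rejection, whose cost is uniformly bounded.
    """
    limit = math.exp(-mu)
    k, prod = 0, rng.random()
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k

rng = random.Random(42)
samples = [poisson_multiplication(4.0, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)   # should be close to mu = 4
```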
543
A novel approach to restoration of Poissonian images. Shaked, Elad. 09 February 2010.
The problem of reconstruction of digital images from their degraded measurements is regarded as a problem of central importance in various fields of engineering and imaging sciences. In such cases, the degradation is typically caused by the resolution limitations of an imaging device in use and/or by the destructive influence of measurement noise. Specifically, when the noise obeys a Poisson probability law, standard approaches to the problem of image reconstruction are based on using fixed-point algorithms which follow the methodology proposed by Richardson and Lucy in the beginning of the 1970s. The practice of using such methods, however, shows that their convergence properties tend to deteriorate at relatively high noise levels (which typically takes place in so-called low-count settings). This work introduces a novel method for de-noising and/or de-blurring of digital images that have been corrupted by Poisson noise. The proposed method is derived using the framework of MAP estimation, under the assumption that the image of interest can be sparsely represented in the domain of a properly designed linear transform. Consequently, a shrinkage-based iterative procedure is proposed, which guarantees the maximization of an associated maximum-a-posteriori criterion. It is shown in a series of both computer-simulated and real-life experiments that the proposed method outperforms a number of existing alternatives in terms of stability, precision, and computational efficiency.
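The Richardson-Lucy fixed-point scheme that the thesis takes as its point of departure is short enough to sketch. A minimal 1-D version with a hypothetical blur kernel and noiseless data (numpy assumed available; this is the classical baseline, not the shrinkage method proposed in the thesis):

```python
import numpy as np

def richardson_lucy(y, h, n_iter=200, eps=1e-12):
    """Classical Richardson-Lucy fixed-point iteration for Poisson data.

    y: observed (blurred) signal;  h: known point-spread function.
    Multiplicative update x <- x * conv(y / conv(x, h), flip(h)),
    which preserves non-negativity of the estimate.
    """
    x = np.full_like(y, y.mean(), dtype=float)   # flat initial guess
    h_flip = h[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(x, h, mode="same")
        ratio = y / (blurred + eps)              # eps guards division by 0
        x = x * np.convolve(ratio, h_flip, mode="same")
    return x

# toy example: a two-spike signal blurred by a small box kernel
truth = np.zeros(32); truth[10] = 5.0; truth[20] = 3.0
h = np.array([0.25, 0.5, 0.25])
y = np.convolve(truth, h, mode="same")
x_hat = richardson_lucy(y, h)
```

The deteriorating convergence at high noise levels mentioned in the abstract shows up when Poisson noise is added to `y`: the multiplicative update then amplifies noise in low-count bins.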
544
Comparisons of methods for generating conditional Poisson samples and Sampford samples. Grafström, Anton. January 2005.
Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared, with the focus on the efficiency of the methods. Efficiency is investigated by simulation in different sampling situations. The comparison is of interest because new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance-rejection methods that use the efficient Pareto sampling method; they are found to be very efficient and useful in all situations. The list-sequential methods for both CP-sampling and Sampford sampling are also found to be efficient, especially if many samples are to be generated.
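The acceptance-rejection idea behind CP-sampling can be illustrated without the Pareto machinery: a conditional Poisson sample is simply a Poisson (independent Bernoulli) sample conditioned on the realized size, so repeated draws with rejection give exact CP samples. A sketch with hypothetical inclusion probabilities (this plain rejective scheme, not the Bondesson-Traat-Lundqvist method):

```python
import random

def cp_sample(p, n, rng=random):
    """Conditional Poisson sample of fixed size n by rejective sampling.

    p: list of Bernoulli inclusion probabilities (the Poisson design);
    draw independent inclusions and accept only samples of size n.
    """
    while True:
        s = [i for i, pi in enumerate(p) if rng.random() < pi]
        if len(s) == n:
            return s

rng = random.Random(1)
p = [0.2, 0.5, 0.5, 0.8]
sample = cp_sample(p, 2, rng)
```

The expected number of rejections grows as the Poisson sample size concentrates away from n, which is why the more elaborate methods compared in the thesis matter in practice.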
545
The investigation of mechanical properties of ZrCu/Zr/ZrCu amorphous-crystalline nanolaminates with inclined interface by molecular statics simulation. Feng, Yu-ting. 23 July 2012.
In this study, the mechanical properties of Cu-Zr binary bulk metallic glasses (BMG) were investigated at the nano-scale. Stable amorphous structures and the corresponding energies of BMG structures are obtained from density functional theory (DFT) calculations and used as reference data. The study combines the force-matching (FM) method and the basin-hopping (BH) method into a new method for fitting the Cu-Zr tight-binding (TB) potential parameters. With the fitted TB parameters, the bulk modulus, shear modulus, Young's modulus and Poisson's ratio of the Cu46Zr54, Cu50Zr50 and Cu64Zr36 structures are calculated. In addition, the compression of the BMG materials is simulated by molecular statics, and the resulting stress and strain are used to investigate the deformation mechanism of CuZr/Zr/CuZr nanolaminates with interfaces inclined at 0 and 45 degrees.
Finally, we examine the shear-band angle during deformation at different strains, the shear transformation zones (STZs), and the forces caused by atomic slip in the thin-film metallic glass (TFMG) layers.
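The isotropic elastic constants listed above are linked by standard identities, so any two determine the rest. A small sketch with illustrative values of K and G (not the fitted Cu-Zr results):

```python
def elastic_constants(K, G):
    """Young's modulus E and Poisson's ratio nu from bulk modulus K and
    shear modulus G (isotropic linear elasticity):
        E  = 9KG / (3K + G)
        nu = (3K - 2G) / (2(3K + G))
    """
    E = 9.0 * K * G / (3.0 * K + G)
    nu = (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))
    return E, nu

# illustrative moduli in GPa, chosen only to demonstrate the formulas
E, nu = elastic_constants(K=100.0, G=40.0)
```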
546
Spectral (h-p) Element Methods Approach To The Solution Of Poisson And Helmholtz Equations Using Matlab. Maral, Tugrul. 01 December 2006.
A spectral element solver program using MATLAB is written for the solution of the Poisson and Helmholtz equations. Spectral element methods combine the accuracy of spectral (p-type, high-order) methods with the geometric flexibility of low-order h-type finite elements. Rectangular elements are used to solve the Poisson and Helmholtz equations with Dirichlet and Neumann boundary conditions, both homogeneous and nonhomogeneous; Robin (mixed) boundary conditions are also implemented. The Poisson equation is additionally solved by discretising the domain with curvilinear quadrilateral elements, so that the accuracy of the isoparametric quadrilateral and rectangular element stiffness and mass matrices is tested. Quadrilateral elements are used to obtain the stream function of inviscid flow around a cylinder, and nonhomogeneous Neumann boundary conditions are imposed on the quadrilateral element stiffness matrix to solve for the velocity potential.
547
Probabilistic Seismic Hazard Analysis: A Sensitivity Study With Respect To Different Models. Yilmaz Ozturk, Nazan. 01 February 2008.
Due to the randomness inherent in the occurrence of earthquakes with respect to time, space and magnitude, as well as various other sources of uncertainty, seismic hazard assessment should be carried out in a probabilistic manner.
The basic steps of probabilistic seismic hazard analysis are the delineation of seismic sources, assessment of the earthquake occurrence characteristics of each source, selection of an appropriate ground motion attenuation relationship, and identification of the site characteristics. Seismic sources can be modeled as area and line sources; seismicity that cannot be associated with any major seismic source can be treated as a background source in which the seismicity is assumed to be uniform or spatially smoothed. Exponentially distributed magnitude and characteristic earthquake models are often used to describe the magnitude recurrence relationship, while Poisson and renewal models are used to model the occurrence of earthquakes in the time domain.
In this study, the sensitivity of seismic hazard results to the models associated with the different assumptions mentioned above is investigated. The effects of the various sources of uncertainty involved in the probabilistic seismic hazard analysis methodology on the results are investigated for a number of sites at different distances from a single fault. Two case studies are carried out to examine the influence of the different assumptions on the final results based on real data, and to illustrate the implementation of the methodology for a large region (e.g. a country) and a smaller region (e.g. a province).
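Under the Poisson occurrence model mentioned above, the probability of at least one exceedance of a given ground-motion level during an exposure time t is 1 - exp(-lambda*t). A short sketch recovering the familiar ~475-year return period from the "10% probability of exceedance in 50 years" design criterion:

```python
import math

def rate_from_exceedance(p_exceed, t_years):
    """Annual exceedance rate lambda under a Poisson occurrence model:
    P(at least one event in t years) = 1 - exp(-lambda * t) = p_exceed."""
    return -math.log(1.0 - p_exceed) / t_years

lam = rate_from_exceedance(0.10, 50.0)
return_period = 1.0 / lam    # roughly 475 years
```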
548
Dynamic Switching Times For Season And Single Tickets In Sports And Entertainment With Time Dependent Demand Rates. Pakyardim, Yusuf Kenan. 01 August 2011.
The most important market segmentation in the sports and entertainment industry is between customers who buy bundle (season) tickets and those who buy single tickets. A common selling practice is to start the selling season with bundle ticket sales and switch to selling single tickets later on. The aims of this practice are to increase the number of customers who buy bundles, to create a fund before the season starts, and to increase the load factor of games with low demand. In this thesis, we investigate the effect of time-dependent demand on dynamic switching times, and the potential revenue gain over the case where the demand rate is assumed constant over time.
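The trade-off behind a switching time can be made concrete with a deterministic fluid approximation: bundles sell on [0, tau], singles on [tau, T], and the best tau balances the two revenue streams. A toy sketch with hypothetical linear demand rates and prices (invented for illustration, not the model or data of the thesis):

```python
def revenue(tau, T=1.0, p_bundle=100.0, p_single=15.0, steps=1000):
    """Fluid revenue when bundles sell on [0, tau] and singles on [tau, T].

    Hypothetical time-dependent rates: bundle demand decays over the
    season while single-ticket demand grows; revenue is integrated by
    midpoint quadrature.
    """
    dt = T / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        if t < tau:
            total += p_bundle * (8.0 * (1.0 - t / T)) * dt   # bundle rate
        else:
            total += p_single * (40.0 * t / T) * dt          # single rate
    return total

# brute-force search over a grid of candidate switching times
best_tau = max((k / 100.0 for k in range(101)), key=revenue)
```

With these rates the optimum sits where the marginal revenues cross (p_bundle * rate_bundle(t) = p_single * rate_single(t), here near t = 0.57), illustrating why time-dependent demand shifts the switch relative to the constant-rate case.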
549
An essay on the housing price jump risk and the catastrophe risk for the property insurance company. Chang, Chia-Chien. 29 September 2008.
This dissertation comprises two topics. For the first topic, the housing price jump risk, we use an EM gradient algorithm to estimate the parameters of a jump diffusion model and test whether US monthly housing prices exhibited jump risk from 1986 to 2006. Then, in order to obtain a viable pricing framework for mortgage insurance contracts, we use the jump diffusion process of Merton (1976) to model the dynamics of the housing price. Using this model, we investigate the impact of price jump risk on the valuation of the mortgage insurance premium through the jump intensity, the abnormal volatility of the jump size, and the normal volatility. Empirical results indicate that the abnormal volatility of the jump size has the most significant impact on the mortgage insurance premium.
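Merton's (1976) dynamics combine a diffusion with lognormally distributed jumps. A minimal Euler simulation of one housing-price path, with illustrative parameters (not the estimated ones) and the standard shortcut of at most one jump per small time step:

```python
import math
import random

def merton_path(s0, mu, sigma, lam, jump_mu, jump_sigma, T, n, rng):
    """One path of Merton's jump diffusion:
    between jumps dS/S = mu dt + sigma dW; jumps arrive with Poisson
    intensity lam and multiply S by J with log J ~ N(jump_mu, jump_sigma^2).
    """
    dt = T / n
    s, path = s0, [s0]
    for _ in range(n):
        increment = ((mu - 0.5 * sigma ** 2) * dt
                     + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        if rng.random() < lam * dt:          # Bernoulli approximation of dN
            increment += rng.gauss(jump_mu, jump_sigma)
        s *= math.exp(increment)
        path.append(s)
    return path

rng = random.Random(7)
path = merton_path(s0=100.0, mu=0.03, sigma=0.08, lam=0.5,
                   jump_mu=-0.05, jump_sigma=0.10, T=5.0, n=600, rng=rng)
```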
For the second topic, the catastrophe risk, we argue that the assumption that catastrophe claims arrive according to a Poisson process is inadequate for catastrophic events, since that process has constant intensity. We propose a Markov-modulated Poisson process (MMPP) to model the arrival of catastrophic events. Under this process the underlying state is governed by a homogeneous Markov chain, and it generalizes Cummins and Geman (1993, 1995), Chang, Chang, and Yu (1996), Geman and Yor (1997) and Vaugirard (2003a, 2003b). We apply a Markov jump diffusion model to derive pricing formulas for catastrophe insurance products, including catastrophe futures call options, catastrophe PCS call spreads and catastrophe bonds. Using the PCS index and the annual number of hurricane events from 1950 to 2004, we compare the quality of fit of the Markov-modulated Poisson process and the Poisson process. We conclude that the MMPP fits the arrival rate of hurricane events better than the Poisson process and the Wiener process when pricing the three insurance products. Hence, if different climate states have significantly different arrival intensities in the real economy, using a plain jump diffusion model to evaluate CAT insurance products could cause significant mispricing.
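A two-state MMPP of the kind proposed above can be simulated directly: a hidden Markov chain switches the arrival intensity between, say, a calm and an active climate state. A sketch with illustrative transition probabilities and intensities (invented for demonstration, not fitted to the hurricane data):

```python
import math
import random

def mmpp_counts(n_years, lam, P, rng):
    """Annual event counts from a Markov-modulated Poisson process.

    lam: per-state Poisson intensities, e.g. (calm, active);
    P:   row-stochastic transition matrix of the hidden Markov chain.
    """
    def poisson(mu):
        # product-of-uniforms Poisson generator (exact, O(mu) per call)
        limit, k, prod = math.exp(-mu), 0, rng.random()
        while prod > limit:
            prod *= rng.random()
            k += 1
        return k

    state, counts, states = 0, [], []
    for _ in range(n_years):
        counts.append(poisson(lam[state]))
        states.append(state)
        state = 0 if rng.random() < P[state][0] else 1
    return counts, states

rng = random.Random(3)
counts, states = mmpp_counts(55, lam=(4.0, 9.0),
                             P=[[0.9, 0.1], [0.3, 0.7]], rng=rng)
```

Unlike a plain Poisson model, the yearly counts produced this way are overdispersed (variance exceeds the mean), which is the feature the thesis exploits when fitting hurricane arrivals.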
550
The Fourier-finite-element method with Nitsche-mortaring. Heinrich, Bernd; Jung, Beate. 01 September 2006.
The paper deals with a combination of the Fourier-finite-element method with the Nitsche-finite-element method (as a mortar method). The approach is applied to the Dirichlet problem of the Poisson equation in three-dimensional axisymmetric domains $\widehat\Omega$ with non-axisymmetric data. The approximating Fourier method yields a splitting of the 3D-problem into 2D-problems. For solving the 2D-problems on the meridian plane $\Omega_a$, the Nitsche-finite-element method with non-matching meshes is applied. Some important properties of the approximation scheme are derived, and the rate of convergence in some $H^1$-like norm is proved to be of the type ${\mathcal O}(h+N^{-1})$ ($h$: mesh size on $\Omega_a$, $N$: length of the Fourier sum) in the case of a regular solution of the boundary value problem. Finally, some numerical results are presented.
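The Fourier splitting can be stated briefly: expanding the data and the solution in an angular Fourier series decouples the 3D Poisson problem into one 2D problem per mode on the meridian plane. A sketch of the standard decomposition, with notation chosen here (the paper's own conventions may differ):

```latex
u(r,z,\varphi) \;=\; \sum_{k\in\mathbb{Z}} u_k(r,z)\, e^{ik\varphi},
\qquad
-\frac{1}{r}\,\partial_r\!\bigl(r\,\partial_r u_k\bigr)
\;-\; \partial_z^2 u_k \;+\; \frac{k^2}{r^2}\, u_k \;=\; f_k
\quad \text{in } \Omega_a .
```

Truncating the sum at $|k| \le N$ gives the finitely many 2D problems that the Nitsche-mortar finite element method then solves on non-matching meshes, which is the source of the $N^{-1}$ term in the quoted convergence rate.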