About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

Some Contributions to Distribution Theory and Applications

Selvitella, Alessandro 11 1900 (has links)
In this thesis, we present some new results in distribution theory for both discrete and continuous random variables, together with their motivating applications. We start with some results about the Multivariate Gaussian Distribution and its characterization as a maximizer of the Strichartz Estimates. Then, we present some characterizations of discrete and continuous distributions through ideas coming from optimal transportation. After this, we pass to Simpson's Paradox and see that it is ubiquitous and appears in Quantum Mechanics as well. We conclude with a group of results about discrete and continuous distributions invariant under symmetries, in particular invariant under the groups $A_1$, an elliptical version of $O(n)$ and $\mathbb{T}^n$. All the results proved in this thesis are motivated by applications in different research areas, and the applications are thoroughly discussed. We have tried to keep each chapter self-contained, recalling results from other chapters when needed. The following is a more precise summary of the results discussed in each chapter.

In chapter \ref{chapter 2}, we discuss a variational characterization of the Multivariate Normal distribution (MVN) as a maximizer of the Strichartz Estimates. Strichartz Estimates are a fundamental tool in the proof of well-posedness results for dispersive PDEs. Unlike the characterization of the MVN distribution as a maximizer of the entropy functional, the characterization as a maximizer of the Strichartz Estimates does not require the constraint of fixed variance. In this chapter, we compute the precise optimal constant for the whole range of Strichartz admissible exponents, discuss the connection of this problem to Restriction Theorems in Fourier analysis, and give some statistical properties of the family of Gaussian Distributions which maximize the Strichartz Estimates, such as Fisher Information, Index of Dispersion and Stochastic Ordering. We conclude this chapter by presenting an optimization algorithm to compute the maximizers numerically.

Chapter \ref{chapter 3} is devoted to the characterization of distributions by means of techniques from Optimal Transportation and the Monge-Amp\`{e}re equation. We emphasize methods for statistical inference on distributions that do not possess good regularity, decay or integrability properties, for example distributions that do not admit a finite expected value, such as the Cauchy distribution. The main tool used here is a modified version of the characteristic function (a particular case of the Fourier Transform). An important motivation for developing these tools comes from Big Data analysis, and in particular the Consensus Monte Carlo Algorithm.

In chapter \ref{chapter 4}, we study \emph{Simpson's Paradox}. \emph{Simpson's Paradox} is the phenomenon, appearing in some datasets, in which subgroups that share a common trend (say, all negative) show the reverse trend (say, positive) when they are aggregated. Even though this issue has an elementary mathematical explanation, its statistical implications are deep. Basic examples appear in arithmetic, geometry, linear algebra, statistics, game theory and sociology (e.g. gender bias in graduate school admissions), among other areas. In our new results, we prove the occurrence of \emph{Simpson's Paradox} in Quantum Mechanics. In particular, we prove that \emph{Simpson's Paradox} occurs for solutions of the \emph{Quantum Harmonic Oscillator}, both in the stationary and in the non-stationary case. We prove that the phenomenon is not isolated and that it appears (asymptotically) in the context of the \emph{Nonlinear Schr\"{o}dinger Equation} as well. The likelihood of \emph{Simpson's Paradox} in Quantum Mechanics and its physical implications are also discussed.

Chapter \ref{chapter 5} contains some new results about distributions with symmetries. We first discuss a result on symmetric order statistics, proving that the symmetry of any of the order statistics is equivalent to the symmetry of the underlying distribution. Then, we characterize elliptical distributions through group invariance and give some of their properties. Finally, we study geometric probability distributions on the torus with applications to molecular biology. In particular, we introduce a new family of distributions generated through stereographic projection, give several of their properties and compare them with the von Mises distribution and its multivariate extensions. / Thesis / Doctor of Philosophy (PhD)
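The aggregation reversal described above is easy to reproduce numerically. The following is a minimal illustrative sketch (not taken from the thesis): two subgroups each show a negative least-squares trend, yet the pooled data show a positive one.

```python
import numpy as np

def slope(x, y):
    # Least-squares slope of y on x.
    return np.polyfit(x, y, 1)[0]

rng = np.random.default_rng(0)

# Two subgroups, each with a negative within-group trend ...
x1 = rng.uniform(0, 1, 200)
y1 = 4.0 - 1.0 * x1 + rng.normal(0, 0.1, 200)
x2 = rng.uniform(2, 3, 200)
y2 = 8.0 - 1.0 * x2 + rng.normal(0, 0.1, 200)

print(slope(x1, y1))   # ~ -1 (negative)
print(slope(x2, y2))   # ~ -1 (negative)

# ... but the pooled data show a positive trend, because the second
# subgroup sits at both larger x and larger y.
x = np.concatenate([x1, x2])
y = np.concatenate([y1, y2])
print(slope(x, y))     # positive: the reversal
```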
152

A Study of Value-at-Risk Estimation Using Mixture Models

阮建豐 Unknown Date (has links)
Value at Risk (VaR) is usually computed under the assumption that asset returns are normally distributed, but this assumption is inconsistent with the actual distribution of asset returns: many researchers have found that empirical return distributions are fat-tailed, meaning that extreme events occur far more often than the normal assumption implies. VaR computed under the normal assumption therefore measures true losses poorly. To address this problem, this thesis models the return distribution with the historical simulation method, the variance-covariance method and a mixture normal model, and estimates VaR at a given confidence level; the parameters of the mixture normal model are estimated by quasi-Bayesian maximum likelihood estimation and the EM algorithm. Three evaluation methods (back testing, forward testing and a binomial test) are then used to compare the three VaR estimation approaches. The empirical results show that: 1. The return distribution has a pronounced fat tail at the 1% left-tail critical probability. 2. The mixture normal distribution captures this 1% left-tail fat tail more accurately than the other two methods. 3. The kurtosis of the mixture normal model is close to that of the actual return distribution, confirming that the mixture normal model can capture leptokurtosis. Keywords: Value at Risk (VaR), fat tails, historical simulation method, variance-covariance method, mixture normal model, quasi-Bayesian MLE, EM algorithm, back testing, forward testing, leptokurtosis
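As a rough illustration of the estimation pipeline described in the abstract (not the thesis's own code), the sketch below fits a two-component normal mixture to synthetic returns with a plain EM algorithm and reads the 1% VaR off the fitted mixture. The quasi-Bayesian modification and the back/forward tests are omitted, and all parameter values are assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def fit_mixture_em(r, n_iter=200):
    """EM for a two-component univariate normal mixture (plain MLE;
    the thesis adds a quasi-Bayesian modification not sketched here)."""
    w = np.array([0.9, 0.1])
    mu = np.array([r.mean(), r.mean()])
    sd = np.array([r.std(), 3 * r.std()])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each return.
        dens = w * norm.pdf(r[:, None], mu, sd)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: reweighted mixing weights, means and variances.
        nk = resp.sum(axis=0)
        w = nk / len(r)
        mu = (resp * r[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

def mixture_var(w, mu, sd, alpha=0.01):
    """VaR at level alpha: the alpha-quantile of the fitted mixture, sign-flipped."""
    cdf = lambda x: (w * norm.cdf(x, mu, sd)).sum() - alpha
    return -brentq(cdf, -1.0, 1.0)

# Synthetic fat-tailed returns: mostly calm days plus occasional large moves.
rng = np.random.default_rng(1)
r = np.where(rng.uniform(size=2000) < 0.95,
             rng.normal(0.0, 0.01, 2000),
             rng.normal(0.0, 0.04, 2000))

w, mu, sd = fit_mixture_em(r)
print("1% VaR (mixture)      :", mixture_var(w, mu, sd))
print("1% VaR (single normal):", -norm.ppf(0.01, r.mean(), r.std()))
```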
153

On families of distributions with a unimodal hazard rate function: applications in reliability and survival analysis

Saaidia, Noureddine 24 June 2013 (has links)
In reliability and survival analysis, distributions that have a unimodal or $\cap$-shaped hazard rate function are not numerous; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponential Weibull and power generalized Weibull distributions. In this thesis, we develop modified Chi-squared tests for these distributions and give a comparative study, supported by simulations, between the inverse Gaussian distribution and the others. We also construct an AFT model based on the inverse Gaussian distribution, as well as redundant systems based on distributions having a unimodal hazard rate function.
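A quick numerical check of the unimodal (cap-shaped) hazard rate mentioned above, using SciPy's inverse Gaussian distribution; the shape parameter is an arbitrary illustrative choice, not one taken from the thesis.

```python
import numpy as np
from scipy.stats import invgauss

# Hazard rate h(t) = f(t) / S(t) for an inverse Gaussian distribution.
t = np.linspace(0.01, 5, 500)
dist = invgauss(mu=0.5)          # illustrative shape parameter
hazard = dist.pdf(t) / dist.sf(t)

# The hazard rises to a single maximum and then decreases,
# i.e. it is unimodal, as the abstract states.
peak = t[np.argmax(hazard)]
print(f"hazard peaks near t = {peak:.2f}")
print("increasing before the peak:", np.all(np.diff(hazard[t < peak]) > 0))
print("decreasing after the peak :", np.all(np.diff(hazard[t > peak]) < 0))
```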
154

Minimization of Noise and Vibration Related to Driveline Imbalance using Robust Design Processes

Al-Shubailat, Omar 17 August 2013 (has links)
Variation in vehicle noise, vibration and harshness (NVH) response can be caused by variability in design (e.g. tolerances), materials, manufacturing, or other sources of variation. Such variation in the vehicle response causes a higher percentage of produced vehicles to have NVH levels that are out of specification, leading to a higher number of warranty claims and a loss of customer satisfaction, both of which are costly. Measures must be taken to ensure fewer warranty claims and higher levels of customer satisfaction. As a result, original equipment manufacturers (OEMs) have incorporated design for variation into the design process to secure an acceptable (within-specification) response. The focus here is on the aspects of design variation that should be considered in the design process of drivelines. Variation due to imbalance in rotating components can be unavoidable or costly to control. Some of the major vehicle components that are known to have imbalance and that traditionally cause NVH issues include the crankshaft, the drivetrain components (transmission, driveline, half shafts, etc.), and the wheels. The purpose is to assess NVH as a result of driveline imbalance variation and to develop a tool that helps design a system that is more robust to such variation.
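The kind of variation study described above can be sketched with a simple Monte Carlo simulation: propagate tolerance-driven imbalance variation through a one-degree-of-freedom rotor model and look at the spread of the response amplitude. The model and all numbers below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def response_amplitude(m_u, e, omega, M=50.0, k=1.0e6, c=200.0):
    """Steady-state amplitude of a 1-DOF rotor driven by an imbalance
    force F = m_u * e * omega^2 (all parameter values are illustrative)."""
    F = m_u * e * omega**2
    return F / np.sqrt((k - M * omega**2) ** 2 + (c * omega) ** 2)

rng = np.random.default_rng(2)
n = 10_000
omega = 2 * np.pi * 30.0                 # 30 Hz shaft speed

# Imbalance mass and eccentricity vary with manufacturing tolerance.
m_u = rng.normal(0.02, 0.004, n)         # kg
e = rng.normal(0.001, 0.0002, n)         # m

amp = response_amplitude(m_u, e, omega)
print("mean amplitude [m]  :", amp.mean())
print("99th percentile [m] :", np.percentile(amp, 99))
# A robust-design loop would repeat this for candidate designs (k, c, M)
# and keep the one whose 99th percentile stays within the NVH spec.
```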
155

Highway Development Decision-Making Under Uncertainty: Analysis, Critique and Advancement

El-Khatib, Mayar January 2010 (has links)
While decision-making under uncertainty is a major universal problem, its implications in the field of transportation systems are especially significant: where the benefits of right decisions are tremendous, the consequences of wrong ones are potentially disastrous. In the realm of highway systems, decisions related to the highway configuration (number of lanes, right of way, etc.) need to incorporate both traffic demand and land price uncertainties. In the literature, these uncertainties have generally been modeled using the Geometric Brownian Motion (GBM) process, which has been used extensively in modeling many other real-life phenomena. But few scholars, including those who used the GBM in highway configuration decisions, have offered any rigorous justification for the use of this model. This thesis attempts to offer a detailed analysis of various aspects of transportation systems in relation to decision-making. It reveals some general insights as well as a new concept that extends the notion of opportunity cost to situations where wrong decisions could be made. Arguing that the GBM model is deficient, it also introduces a new formulation that utilizes a large and flexible parametric family of jump models (i.e., Lévy processes). To validate this claim, data related to traffic demand and land prices were collected and analyzed; their distributions, heavy-tailed and asymmetric, do not match the GBM model well. As a remedy, this research used the Merton, Kou, and normal inverse Gaussian Lévy processes as possible alternatives. Though the final decisions prove insensitive to the choice among these models, mathematically they improve the precision of the uncertainty models and of the decision-making process. This furthers the quest for optimality in highway projects and beyond.
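The heavy-tail argument against GBM can be illustrated with a short simulation contrasting Gaussian log-increments with those of a Merton jump-diffusion, one of the Lévy alternatives named above. The parameters are illustrative assumptions rather than values estimated from the traffic or land-price data.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(3)
n, dt = 100_000, 1.0 / 252
sigma = 0.2

# GBM log-increments: plain Gaussian, zero excess kurtosis.
gbm = sigma * np.sqrt(dt) * rng.normal(size=n)

# Merton jump-diffusion log-increments: Gaussian diffusion plus a
# compound Poisson sum of Gaussian jumps (illustrative parameters).
lam, jump_mu, jump_sd = 3.0, -0.02, 0.08
n_jumps = rng.poisson(lam * dt, size=n)
jumps = rng.normal(jump_mu * n_jumps,
                   jump_sd * np.sqrt(np.maximum(n_jumps, 1)) * (n_jumps > 0))
merton = gbm + jumps

print("excess kurtosis, GBM   :", kurtosis(gbm))     # close to 0
print("excess kurtosis, Merton:", kurtosis(merton))  # clearly positive: heavy tails
```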
157

ON-MACHINE MEASUREMENT OF WORKPIECE FORM ERRORS IN ULTRAPRECISION MACHINING

Gomersall, Fiona January 2016 (has links)
Ultraprecision single point diamond turning is required to produce parts with sub-nanometer surface roughness and sub-micrometer surface profile tolerances. These parts have applications in the optics industry, where tight form accuracy is required while achieving high surface finish quality. Generally, parts can be polished to achieve the desired finish, but the form accuracy can easily be lost in the process, rendering the part unusable. Currently, most mid-to-low spatial frequency surface finish errors are inspected offline. This is done by physically removing the workpiece from the machining fixture and mounting the part in a laser interferometer. This action itself introduces errors through minute differences between the support conditions of the over-constrained part on the machine and the mounting conditions used for part measurement. Once the part is removed, the fixture-induced stresses and the part's internal residual stresses relax and change the shape of the generally thin parts machined in these applications, so offline inspection provides an erroneous description of the performance of the machine. This research explores the use of a single high-resolution capacitance sensor to quickly and qualitatively measure the low-to-mid spatial frequencies on the workpiece surface while it is mounted in a fixture on a standard ultraprecision single point diamond turning machine after a standard facing operation. Initial testing shows a strong qualitative correlation between surface profiles measured on a standard offline system and on this online measuring system. Despite environmental effects and the effects of the machine on the measurement system, the capacitive system, with some modifications and awareness of its measurement method, is a viable option for measuring mid-to-low spatial frequencies on a workpiece surface mounted on an ultraprecision machine, with a resolution of 1 nm, an error band of ±5 nm and a 20 kHz bandwidth. / Thesis / Master of Applied Science (MASc)
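A simplified sketch of the signal-processing idea behind separating mid-to-low spatial frequency form error from roughness and noise in a profile trace. The synthetic profile, cutoff frequency and amplitudes below are illustrative assumptions, not the thesis's measurement chain.

```python
import numpy as np

# Synthetic radial profile trace: low-frequency form error plus
# high-frequency roughness plus sensor noise (illustrative values only).
n, length = 4096, 25.0e-3                             # samples, trace length [m]
x = np.linspace(0.0, length, n)
form = 80e-9 * np.sin(2 * np.pi * x / length)         # low spatial frequency
rough = 5e-9 * np.sin(2 * np.pi * x / 20e-6)          # high spatial frequency
noise = np.random.default_rng(4).normal(0, 2e-9, n)
profile = form + rough + noise

# Keep only spatial frequencies below an (assumed) cutoff to isolate the
# mid-to-low frequency content the capacitance probe is meant to capture.
cutoff = 2.0e3                                        # cycles per metre
freq = np.fft.rfftfreq(n, d=length / n)
spec = np.fft.rfft(profile)
spec[freq > cutoff] = 0.0
form_estimate = np.fft.irfft(spec, n)

print("peak-to-valley of estimated form error [nm]:",
      1e9 * (form_estimate.max() - form_estimate.min()))
```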
