1 |
Density functions with extremal antiperiodic eigenvalues and related topics. Kung, Shing-Yuan, 22 January 2005 (has links)
In this thesis, we prove two theorems. First, let ρ0 be
a minimizing (or maximizing) density function for the first
antiperiodic eigenvalue λ1' in E[h,H,M]; then ρ0 = h·χ(a,b) + H·χ[0,π]\(a,b) (or ρ0 = H·χ(a,b) + h·χ[0,π]\(a,b)) a.e. Second, we prove that min λ1' = min μ1 = min ν1, where μ1 and ν1 are the first Dirichlet and second Neumann eigenvalues, respectively. Furthermore, we determine the jump point x0 of ρ0 and the corresponding eigenvalue λ1', assuming that ρ0 is symmetric about π/2. We derive the nonlinear equations for this jump point x0 and for λ1', then use Mathematica to solve them numerically.
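For readers who want to experiment, a minimal numerical sketch (not from the thesis; it assumes the standard Sturm-Liouville form -y'' = λ ρ(x) y on [0, π] with antiperiodic conditions y(π) = -y(0), y'(π) = -y'(0)) approximates the first antiperiodic eigenvalue by finite differences:

```python
import numpy as np

def first_antiperiodic_eigenvalue(rho, n=400):
    """Smallest eigenvalue of -y'' = lam * rho(x) * y on [0, pi] with
    antiperiodic conditions y(pi) = -y(0), y'(pi) = -y'(0), using a
    central-difference discretization on n grid points."""
    h = np.pi / n
    x = np.arange(n) * h
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    # antiperiodicity flips the sign of the wrap-around corner entries
    A[0, n - 1] = 1.0
    A[n - 1, 0] = 1.0
    A /= h * h
    # generalized problem A y = lam * diag(rho) y
    lam = np.linalg.eigvals(np.diag(1.0 / rho(x)) @ A)
    return float(lam.real.min())

# constant density: the exact first antiperiodic eigenvalue is 1
lam1 = first_antiperiodic_eigenvalue(lambda x: np.ones_like(x))
```

For the constant density ρ ≡ 1 the discrete estimate approaches the exact value λ1' = 1 as the grid is refined; a step density taking the values h and H can be passed in the same way.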
|
2 |
Generalised beta type II distributions - emanating from a sequential process. Adamski, Karien, January 2013 (has links)
This study focuses on the development of a generalised multivariate beta type II distribution
as well as the noncentral and bimatrix counterparts with positive domain. These
models emanate from a sequential quality monitoring procedure with the normal and
multivariate normal distributions as the underlying process distributions. Three different
scenarios are considered, namely:
1. The variance is monitored from a normal process and the mean remains unchanged;
2. The above-mentioned scenario but the known mean also encounters a sustained
shift;
3. The covariance structure of a multivariate normal distribution is monitored with
the known mean vector unchanged.
The statistics originating from these scenarios are constructed
from different dependent chi-squared or Wishart ratios. Exact expressions are derived for
the probability density functions of these statistics. These new distributions contribute
to the statistical discipline in that they can serve as alternatives to existing probability
models, and can be used to assess the performance of the quality monitoring
procedure. / Thesis (PhD)--University of Pretoria, 2013. / gm2014 / Statistics / unrestricted
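As background for the beta type II family (an independent-ratio sketch under textbook assumptions, not the dependent construction developed in the thesis): the ratio of two independent gamma variates, equivalently scaled chi-squared variates, follows a beta type II (beta prime) law, which a short simulation confirms through its mean a/(b-1):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, n = 3.0, 5.0, 200_000

# ratio of independent gamma variates: X/Y ~ beta type II (beta prime)
x = rng.gamma(shape=a, size=n)
y = rng.gamma(shape=b, size=n)
t = x / y

emp_mean = float(t.mean())
theo_mean = a / (b - 1.0)   # beta type II mean, defined for b > 1
```

The scale parameter cancels in the ratio, which is why chi-squared ratios (chi-squared(k) being gamma with shape k/2 and scale 2) lead to the same family.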
|
3 |
Analysis of Droplet Impact on a Liquid Pool. Radhika Arvind Bhopatkar (9012413), 25 June 2020 (has links)
Secondary atomization is very important in applications such as IC-engine and aircraft-engine performance, agricultural sprays, and inkjet printing, to name a few. In the case of IC engines and aircraft engines, a good understanding of the modes of secondary atomization and the resultant drop sizes can help improve fuel injection and hence the efficiency of the engine. Similarly, appropriate secondary atomization yields the desired agro-spray quality, ink usage, and print quality, which optimizes the use of chemicals and ink and avoids harmful effects on the environment.

One of the most common causes of secondary atomization in spray applications is drop impact on a solid or liquid surface. Understanding the impact of a drop on a liquid film is particularly important, because even when drops are injected onto a solid surface, drops injected later strike a thin liquid film that forms on the solid base through the accumulation of previously injected drops. Drop impact on a liquid film with non-dimensional thickness (h/d) ranging from 0.1 to 1 has been analyzed thoroughly before (Cossali et al., 2004; Vander Waal et al., 2006; Moreira et al., 2010); however, analysis of drop impact on a liquid film with non-dimensional thickness greater than 1 is still at a rudimentary stage. This work focuses on determining the probability density functions of the secondary drop sizes produced by drop impact on a liquid film while varying the h/d ratio beyond 1. The experimental set-up used to study drop impact includes a droplet generator and a digital in-line holography (DIH) system, as described in Yao et al. (2017). The DIH set-up includes a CW laser, a spatial filter, a beam expander, and a collimator, as adapted from Guildenbecher et al. (2016). The impact Weber number (We) is varied by adjusting the syringe height and hence the height of drop impact. Three fluids (DI water, ethanol, and glycerol) are tested to examine the effect of viscosity on the resultant drop sizes. Results are plotted with respect to viscosity, impact We, and non-dimensional film thickness, since the fragmentation of drops is directly associated with these parameters. Results indicate that the majority of secondary droplets lie in the size range of 25 µm to 50 µm. It is also observed that the tendency toward secondary atomization by crown splashing increases with increasing We and decreases with increasing Ohnesorge number (Oh).
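The two dimensionless groups driving these results, the impact Weber number We = ρv²d/σ and the Ohnesorge number Oh = μ/√(ρσd), are simple to compute. The sketch below uses standard property values for DI water at room temperature, with an assumed (illustrative) drop diameter and impact speed rather than values from the experiments:

```python
import math

def weber(rho, v, d, sigma):
    """Impact Weber number: drop inertia relative to surface tension."""
    return rho * v * v * d / sigma

def ohnesorge(mu, rho, sigma, d):
    """Ohnesorge number: viscosity relative to inertia and surface tension."""
    return mu / math.sqrt(rho * sigma * d)

# DI water at room temperature; drop size and impact speed are illustrative
we = weber(rho=998.0, v=2.0, d=2.5e-3, sigma=72.8e-3)
oh = ohnesorge(mu=1.0e-3, rho=998.0, sigma=72.8e-3, d=2.5e-3)
```

For a 2.5 mm water drop at 2 m/s this gives We ≈ 137 and Oh ≈ 0.002, i.e., an inertia-dominated impact with weak viscous damping.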
|
4 |
Generalised density function estimation using moments and the characteristic function. Esterhuizen, Gerhard, 03 1900 (has links)
139 leaves printed single pages, preliminary pages i-xi and numbered pages 1-127. Includes bibliography and a list of figures and tables. Digitized at 600 dpi grayscale to pdf format (OCR), using a Bizhub 250 Konica Minolta Scanner. / Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2003. / ENGLISH ABSTRACT: Probability density functions (PDFs) and cumulative distribution functions (CDFs)
play a central role in statistical pattern recognition and verification systems. They allow
observations that do not occur according to deterministic rules to be quantified and modelled.
An example of such observations would be the voice patterns of a person that is
used as input to a biometric security device.
In order to model such non-deterministic observations, a density function estimator
is employed to estimate a PDF or CDF from sample data. Although numerous density
function estimation techniques exist, all the techniques can be classified into one of two
groups, parametric and non-parametric, each with its own characteristic advantages and
disadvantages.
In this research, we introduce a novel approach to density function estimation that
attempts to combine some of the advantages of both the parametric and non-parametric
estimators. This is done by considering density estimation using an abstract approach in
which the density function is modelled entirely in terms of its moments or characteristic
function. New density function estimation techniques are first developed in theory, after
which a number of practical density function estimators are presented.
Experiments are performed in which the performance of the new estimators is compared
to that of two established estimators, namely the Parzen estimator and the Gaussian mixture
model (GMM). The comparison is performed in terms of the accuracy, computational requirements
and ease of use of the estimators, and it is found that the new estimators do
combine some of the advantages of the established estimators without the corresponding
disadvantages. / AFRIKAANSE OPSOMMING (translated): Probability density functions (PDFs) and cumulative distribution functions (CDFs)
play a central role in statistical pattern recognition and verification systems. They make it
possible to quantify and model non-deterministic observations. The voice patterns
of a speaker given as input to a biometric security system are an
example of such an observation.
To model such observations, a density function estimator is used to
estimate the PDF or CDF from data samples. Although numerous density function
estimators exist, they can all be placed in one of two categories, parametric and
non-parametric, each with its own characteristic advantages and disadvantages.
This work proposes a new approach to density function estimation that attempts to combine the advantages
of both the parametric and the non-parametric techniques. This
is done by approaching density function estimation from an abstract viewpoint in which
the density function is modelled exclusively in terms of its moments and characteristic
function. New methods are first investigated and developed in theory, after which practical
techniques are presented. These estimators are able to estimate a wide variety of density functions
and are not designed merely to represent certain families of density functions
optimally.
Experiments were performed comparing the performance of the new techniques with two established
techniques, namely the Parzen estimator and the Gaussian mixture model (GMM).
Performance is measured in terms of accuracy, required computational
power and ease of use. It is found that the new estimators
do combine advantages of the established estimators without the accompanying disadvantages.
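A toy version of the characteristic-function route (an assumed illustration, not one of the estimators developed in this thesis) inverts the empirical characteristic function with a hard frequency cutoff; for standard-normal samples the estimate at the origin should approach 1/√(2π) ≈ 0.3989:

```python
import numpy as np

def ecf_density(samples, xs, t_max=4.0, n_t=401):
    """Estimate a density by Fourier-inverting the empirical characteristic
    function phi_hat(t) = mean(exp(i*t*X)), truncated at |t| <= t_max."""
    t = np.linspace(-t_max, t_max, n_t)
    phi_hat = np.exp(1j * np.outer(t, samples)).mean(axis=1)
    # f(x) ~ (1/2pi) * integral exp(-i*t*x) * phi_hat(t) dt; plain Riemann
    # sum, since the integrand is negligible at the cutoff in this example
    dt = t[1] - t[0]
    vals = (np.exp(-1j * np.outer(xs, t)) * phi_hat).sum(axis=1) * dt
    return vals.real / (2.0 * np.pi)

rng = np.random.default_rng(1)
data = rng.standard_normal(5_000)
f0 = ecf_density(data, np.array([0.0]))[0]  # N(0,1) density at 0 is ~0.3989
```

The cutoff t_max plays the role of a smoothing parameter, much like the bandwidth of a Parzen window.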
|
5 |
Estudo de modelos estatisticos utilizados na caracterização de tecidos por ultra-som / A study of statistical models used for ultrasonic tissue characterization. Vivas, Gustavo de Castro, 08 April 2006 (has links)
Orientadores: Eduardo Tavares Costa, Ricardo Grossi Dantas / Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica e de Computação
Previous issue date: 2006 / Resumo (translated): Medical diagnosis by ultrasound has become widespread and is now a reference in many clinical examinations, most notably B-mode imaging, which can represent the anatomy of tissues and organs non-invasively, in real time and without ionizing radiation. However, speckle, an artifact inherent to systems that use coherent sources, such as ultrasound systems, degrades image quality and can considerably reduce the physician's ability to detect lesions. Ultrasonic tissue characterization aims to extract clinically relevant information about the real characteristics of the biological structure under investigation that cannot easily be perceived by visual inspection. In this work, a comparative study was carried out of the main statistical distribution models found in the literature and adopted for ultrasonic tissue characterization. Probability density functions that best represented the brightness pattern in a given region of an image were used. The results indicated the versatility of the Compound (K-Nakagami) distribution in modelling the different scattering conditions found in tissues, making it a strong candidate for ultrasonic tissue characterization. However, using the concept of equivalent scatterers, it could be shown that the statistical approach used does not provide conclusive quantitative parameters about the structure investigated, but rather a joint contribution of several factors, among them the density and the amplitude distribution of the acoustic scatterers / Abstract: Ultrasound medical diagnosis has been widely used and has become a reference in many clinical examinations, especially B-mode imaging, capable of representing tissue and organ anatomy without ionizing radiation in a non-invasive way and in real time.
However, speckle, an inherent artifact of systems that use coherent sources, such as ultrasound systems, degrades image quality, leading to subjective and possibly misleading diagnostics. Ultrasonic tissue characterization aims to extract clinically relevant information about the characteristics of the biological structure under investigation that cannot easily be obtained by visual inspection. In this dissertation, a comparative study was carried out of the most popular statistical distribution models found in the literature and commonly adopted in ultrasonic tissue characterization. Probability density functions that best represented the brightness pattern of a given region of an ultrasound image were used. The results indicated the versatility of the Compound (K-Nakagami) distribution in modeling different scattering conditions of tissues, making it a good model for use in ultrasonic tissue characterization. However, using the concept of equivalent scatterers, it could be shown that the statistical approach does not supply conclusive quantitative parameters of the structure under investigation, but rather a joint contribution of many factors such as the density and amplitude distribution of the acoustic scatterers / Master's / Biomedical Engineering / Master in Electrical Engineering
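As a small illustration of the modelling idea (a generic sketch, not code from the dissertation): the Nakagami shape parameter m that underlies the Compound (K-Nakagami) model is commonly estimated from envelope samples by the inverse-normalized-variance moment estimator; m ≈ 1 corresponds to fully developed (Rayleigh) speckle, m < 1 to pre-Rayleigh and m > 1 to post-Rayleigh scattering:

```python
import numpy as np

def nakagami_m(envelope):
    """Inverse-normalized-variance moment estimator of the Nakagami
    shape parameter m from envelope samples: m = E[x^2]^2 / Var(x^2)."""
    p = np.asarray(envelope) ** 2   # intensity
    return float(p.mean() ** 2 / p.var())

rng = np.random.default_rng(2)
env = rng.rayleigh(scale=1.0, size=200_000)  # fully developed speckle
m_hat = nakagami_m(env)
```

Applied to pixel patches of a B-mode image, this single parameter already separates scattering regimes, which is the kind of summary the statistical models above formalize.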
|
6 |
Interactions in multi-robot systems. Diaz-Mercado, Yancy J., 27 May 2016 (has links)
The objective of this research is to develop a framework for multi-robot coordination and control with emphasis on human-swarm and inter-agent interactions. We focus on two problems: in the first we address how to enable a single human operator to externally influence large teams of robots. By directly imposing density functions on the environment, the user is able to abstract away the size of the swarm and manipulate it as a whole, e.g., to achieve specified geometric configurations, or to maneuver it around. In order to pursue this approach, contributions are made to the problem of coverage of time-varying density functions. In the second problem, we address the characterization of inter-agent interactions and enforcement of desired interaction patterns in a provably safe (i.e., collision-free) manner, e.g., for achieving rich motion patterns in a shared space, or for mixing of sensor information. We use elements of the braid group, which allows us to symbolically characterize classes of interaction patterns. We further construct a new specification language that allows us to provide rich, temporally-layered specifications to the multi-robot mixing framework, and present algorithms that significantly reduce the search space of specification-satisfying symbols with exactness guarantees. We also synthesize provably safe controllers that generate and track trajectories to satisfy these symbolic inputs. These controllers allow us to find bounds on the number of safe interactions that can be achieved in a given bounded domain.
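A minimal one-dimensional sketch of the density-imposition idea (illustrative only; the thesis handles time-varying densities, planar domains, and actual robot dynamics): each robot repeatedly moves to the density-weighted centroid of its Voronoi cell (Lloyd's algorithm), so the team concentrates wherever the operator places mass:

```python
import numpy as np

def lloyd_step(pos, xs, phi):
    """One Lloyd iteration for coverage of a density phi sampled on the
    grid xs: each robot moves to the phi-weighted centroid of its
    Voronoi cell (an interval in this 1-D sketch)."""
    pos = np.sort(pos)
    mids = (pos[:-1] + pos[1:]) / 2.0
    edges = np.concatenate(([xs[0]], mids, [xs[-1]]))
    new = pos.copy()
    for i in range(len(pos)):
        in_cell = (xs >= edges[i]) & (xs <= edges[i + 1])
        w = phi[in_cell]
        if w.sum() > 0.0:
            new[i] = (xs[in_cell] * w).sum() / w.sum()
    return new

xs = np.linspace(0.0, 1.0, 2001)
# operator-imposed density: concentrate the swarm around x = 0.7
phi = np.exp(-0.5 * ((xs - 0.7) / 0.05) ** 2)
pos = np.linspace(0.05, 0.35, 5)   # robots start far from the density peak
for _ in range(100):
    pos = lloyd_step(pos, xs, phi)
```

After the iterations the five robots sit spread across the density peak near 0.7; redrawing phi over time is exactly the handle the operator uses to steer the swarm.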
|
7 |
Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time. Bunger, R. C. (Robert Charles), 08 1900 (has links)
In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means.
The density functions allow stock market participants trading index options and futures contracts on the S & P 100 Stock Index to determine probabilities of success or failure of trades involving price movements of certain magnitudes in given lengths of time.
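The two-part density described above has the shape of a "split" normal: two half-normal pieces with a common mode and different spreads, scaled by one constant C = √(2/π)/(σ1+σ2) so the whole curve integrates to one. A sketch with made-up parameter values (not the fitted values from the study):

```python
import math

def split_normal_pdf(x, mu, s1, s2):
    """Two-piece normal: half-normal spread s1 below the mode mu and
    s2 above it, joined continuously with one normalizing constant."""
    c = math.sqrt(2.0 / math.pi) / (s1 + s2)
    s = s1 if x < mu else s2
    return c * math.exp(-((x - mu) ** 2) / (2.0 * s * s))

# crude check that the curve integrates to ~1 (rectangle rule, wide grid)
mu, s1, s2 = 0.0, 0.8, 1.5    # illustrative: fatter upper tail
dx = 1e-3
total = sum(split_normal_pdf(mu - 10.0 + i * dx, mu, s1, s2)
            for i in range(20001)) * dx
```

Unequal σ1 and σ2 capture the asymmetry between down-moves and up-moves that a single normal cannot.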
|
8 |
Petit périple aux confins du modèle standard avec HERA (A little journey to the edges of the Standard Model with HERA). Sauvan, E., 30 October 2009 (has links) (PDF)
-
|
9 |
Quark Distributions and Charged Higgs Boson Production : Studies of Proton Structure and New Physics. Alwall, Johan, January 2005 (has links)
The Standard Model describes all elementary particles known today, but at larger energies it will have to be complemented with new particles and interactions. To be able to distinguish new physics at proton colliders such as the LHC at CERN, it is essential to have an appropriate description of the colliding protons and their interactions. The study of the proton is important also in itself, to get a better understanding of the non-perturbative aspects of the strong interaction.

In papers I-IV of this thesis, a model for the non-perturbative dynamics of quarks and gluons is developed, based on quantum fluctuations in hadrons. The parton distributions of the proton are given by momentum fluctuations, with sea quark distributions generated by fluctuations into baryon-meson pairs. This model can reproduce proton structure function data, as well as measured asymmetries between up and down valence quark distributions and between the anti-up and anti-down sea. It provides an intrinsic charm quark component as indicated by data. It also predicts an asymmetry in the strange sea of the proton, which can explain the NuTeV anomaly first attributed to new physics beyond the Standard Model.

Charged Higgs bosons are predicted by several theories for new physics, including Supersymmetry. At proton colliders, the predicted dominant production mechanism is in association with top and bottom quarks. In papers V-VII, different contributions to this production are studied, and an algorithm is developed for combining the two dominant processes gb -> tH+/- and gg -> tbH+/-. The algorithm gives a smooth transition from small to large transverse momenta of the b-quark, which is important when the b-quark is observed. It also gives arguments for the choice of factorisation scale in the process.
|