591 |
Estatística e a teoria de conjuntos fuzzy / Statistics and fuzzy set theory. González Campos, José Alejandro, 1979-. 03 June 2015.
Advisors: Víctor Hugo Lachos Dávila, Alexandre Galvão Patriota / Doctoral thesis, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica
Previous issue date: 2015 / Abstract: Fuzzy set theory was introduced by Zadeh in 1965 and has since spread widely, reaching many fields of science. This work distinguishes three dimensions: fuzzy set theory in its pure form, connections between statistics and fuzzy set theory (interpretation and visualization), and statistics applied to fuzzy data. The elementary definitions of fuzzy set theory, such as fuzzy number, core, and normal fuzzy set, are presented to make the thesis self-contained. Within the first dimension, a new ordering of LR-type fuzzy numbers is defined, characterized by its computational simplicity, in which the usual order on the real numbers appears as a particular case when real numbers are regarded as fuzzy numbers. This proposal overcomes many of the limitations of other ordering proposals, such as indetermination and indefiniteness, and makes it possible to obtain statistical tools such as the median and measures of variability. In the second dimension, a new tool is presented for interpreting confidence regions after the sample has been observed: a membership function is defined that represents the parameter space in a fuzzy way for each confidence region, and a new way of visualizing an infinite sequence of confidence regions is also presented. Finally, in the third dimension, a generalization of the Kaplan-Meier estimator is studied for the case in which lifetimes are fuzzy numbers, opening a line of research based on its asymptotic properties; a typical example from survival analysis is used. This thesis presents the elementary theoretical foundations for starting a new line of research, attending to our human nature and attempting to escape from Platonic assumptions / Doctorate / Statistics / Doctor in Statistics
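To make the LR-type notions above concrete, the sketch below implements triangular fuzzy numbers (a simple LR-type family) with their membership functions and a purely illustrative ranking key that reduces to the usual order on the reals for crisp numbers. The ranking rule, class names, and sample values are assumptions for illustration only; they are not the ordering proposed in the thesis.

```python
class TriangularFuzzyNumber:
    """Triangular fuzzy number (a, m, b): support [a, b], core {m}."""

    def __init__(self, a, m, b):
        assert a <= m <= b
        self.a, self.m, self.b = a, m, b

    def membership(self, x):
        # Piecewise-linear membership: 0 outside [a, b], 1 at the core m.
        x = float(x)
        if x < self.a or x > self.b:
            return 0.0
        if x == self.m:
            return 1.0
        if x < self.m:
            return (x - self.a) / (self.m - self.a)
        return (self.b - x) / (self.b - self.m)

def order_key(tfn):
    # Illustrative ranking key (an assumption, not the thesis's order):
    # core first, then support midpoint, then spread.  For crisp numbers
    # (a = m = b) it reduces to the usual order on the reals.
    return (tfn.m, (tfn.a + tfn.b) / 2.0, tfn.b - tfn.a)

# A "fuzzy median" obtained by sorting a sample of fuzzy numbers under the key.
sample = [TriangularFuzzyNumber(1, 2, 4),
          TriangularFuzzyNumber(0, 3, 5),
          TriangularFuzzyNumber(2, 2, 2)]   # the crisp number 2
sample.sort(key=order_key)
median = sample[len(sample) // 2]
print(median.a, median.m, median.b, median.membership(2.5))
```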
|
592 |
Optimal Strategies for Stopping Near the Top of a Sequence. Islas Anguiano, Jose Angel. 12 1900.
In Chapter 1 the classical secretary problem is introduced. Chapters 2 and 3 are variations of this problem. Chapter 2 discusses the problem of maximizing the probability of stopping with one of the two highest values in a Bernoulli random walk with arbitrary parameter p and finite time horizon n. The optimal strategy (continue or stop) depends on a sequence of threshold values (critical probabilities) which has an oscillating pattern. Several properties of this sequence were proved by Dr. Allaart, and further properties have recently been proved. In Chapter 3, a gambler observes a finite sequence of continuous random variables. After observing a value he must decide whether to stop or to continue taking observations. He can play two different games: A) win at the maximum, or B) win within a proportion of the maximum. In the first section the sequence to be observed is independent. It is shown that for each n > 1, the optimal win probability in game A is bounded below by (1-1/n)^{n-1}. This is accomplished by reducing the problem to that of choosing the maximum of a special sequence of two-valued random variables and applying the sum-the-odds theorem of Bruss (2000). Second, the sequence is assumed to be i.i.d. The best lower bounds are provided for the winning probabilities in game B given any continuous distribution. These bounds are the optimal win probabilities of a game A that was examined by Gilbert and Mosteller (1966).
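The sum-the-odds theorem of Bruss (2000) cited above admits a short computational sketch. The code below implements the odds algorithm for stopping on the last success of independent indicator events and applies it to the classical secretary setting, where the k-th observation is a relative record with probability 1/k; the (1-1/n)^{n-1} figure is printed only for comparison with the lower bound quoted in the abstract, and the choice n = 20 is an arbitrary illustration.

```python
import math

def odds_algorithm(p):
    """Bruss's sum-the-odds rule: stop at the first success at or after the
    index where the backward sum of the odds p_k / (1 - p_k) reaches 1.
    Returns that index (0-based) and the probability of stopping on the last
    success.  Assumes p[k] < 1 from the threshold onward (true below)."""
    n = len(p)
    total, s = 0.0, n
    for k in range(n - 1, -1, -1):
        qk = 1.0 - p[k]
        total += p[k] / qk if qk > 0 else float("inf")
        s = k
        if total >= 1.0:
            break
    q_tail = [1.0 - pk for pk in p[s:]]
    r_tail = [pk / qk for pk, qk in zip(p[s:], q_tail)]
    return s, math.prod(q_tail) * sum(r_tail)

# Classical secretary problem: the relative-record indicators are independent
# with success probabilities 1/k, so the odds algorithm applies directly.
n = 20
p = [1.0 / k for k in range(1, n + 1)]
s, w = odds_algorithm(p)
print(f"skip the first {s} observations, then stop at the next record; "
      f"win probability {w:.4f}; (1 - 1/n)^(n-1) = {(1 - 1/n) ** (n - 1):.4f}")
```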
|
593 |
Random matrix theory in machine learning / Slumpmatristeori i maskininlärning. Leopold, Lina. January 2023.
In this thesis, we review some applications of random matrix theory in machine learning and theoretical deep learning. More specifically, we review data modelling in the regime of numerous, high-dimensional data; a method for estimating distances between covariance matrices in that regime; and an asymptotic analysis of a simple neural network model in the limit where the number of neurons is large and the data are both numerous and high-dimensional. We also review some recent research in which random matrix models and methods have been applied to Hessian matrices of neural networks, with interesting results. As becomes apparent, random matrix theory is a useful tool for various machine learning applications and a fruitful field of mathematics to explore, in particular in the context of theoretical deep learning.
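A standard entry point to the "numerous and high-dimensional data" regime mentioned above is the Marchenko-Pastur law for sample covariance matrices. The sketch below is an illustration, not material from the thesis: it draws i.i.d. data with identity population covariance, computes the eigenvalues of the sample covariance matrix, and compares their histogram with the Marchenko-Pastur density for the aspect ratio c = p/n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Both the sample size n and the dimension p are large, with c = p/n < 1.
n, p = 4000, 1000
c = p / n

X = rng.standard_normal((n, p))      # i.i.d. entries, identity population covariance
S = X.T @ X / n                      # sample covariance matrix
eigvals = np.linalg.eigvalsh(S)

def marchenko_pastur_density(x, c):
    """Marchenko-Pastur density for unit variance and aspect ratio c <= 1."""
    lam_minus, lam_plus = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
    x = np.asarray(x, dtype=float)
    dens = np.zeros_like(x)
    inside = (x > lam_minus) & (x < lam_plus)
    dens[inside] = (np.sqrt((lam_plus - x[inside]) * (x[inside] - lam_minus))
                    / (2 * np.pi * c * x[inside]))
    return dens

# Empirical spectrum vs. the limiting density, evaluated at histogram bin centers.
hist, edges = np.histogram(eigvals, bins=50, density=True)
centers = (edges[:-1] + edges[1:]) / 2
gap = np.max(np.abs(hist - marchenko_pastur_density(centers, c)))
print(f"spectrum roughly in [{eigvals.min():.3f}, {eigvals.max():.3f}]; "
      f"max deviation from the MP density over bins: {gap:.3f}")
```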
|
594 |
Contributions to the theory of unequal probability sampling. Lundquist, Anders. January 2009.
This thesis consists of five papers related to the theory of unequal probability sampling from a finite population. Generally, it is assumed that we wish to make model-assisted inference, i.e. the inclusion probability for each unit in the population is prescribed before the sample is selected. The sample is then selected using some random mechanism, the sampling design. Mostly, the thesis is focused on three particular unequal probability sampling designs: the conditional Poisson (CP) design, the Sampford design, and the Pareto design. They have different advantages and drawbacks: the CP design is a maximum entropy design, but it is difficult to determine sampling parameters which yield prescribed inclusion probabilities; the Sampford design yields prescribed inclusion probabilities but may be hard to sample from; and the Pareto design makes sample selection very easy, but it is very difficult to determine sampling parameters which yield prescribed inclusion probabilities. These three designs are compared probabilistically and found to be close to each other under certain conditions; in particular, the Sampford and Pareto designs are probabilistically close to each other. Some effort is devoted to analytically adjusting the CP and Pareto designs so that they yield inclusion probabilities close to the prescribed ones. The results of the adjustments are in general very good, and some iterative procedures are suggested to improve them even further. Further, balanced unequal probability sampling is considered. In this kind of sampling, samples are given a positive probability of selection only if they satisfy some balancing conditions, given by information from auxiliary variables. Most of the attention is devoted to a slightly less general but practically important case. Also in this case the inclusion probabilities are prescribed in advance, making the choice of sampling parameters important. A complication which arises when choosing sampling parameters is that certain probability distributions need to be calculated, and exact calculation turns out to be practically impossible except for very small cases. It is proposed that Markov chain Monte Carlo (MCMC) methods be used to obtain approximations to the relevant probability distributions, and also for sample selection. In general, MCMC methods for sample selection do not occur very frequently in the sampling literature today, making this a fairly novel idea.
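The "very easy sample selection" of the Pareto design mentioned above can be sketched in a few lines: with target inclusion probabilities p_i summing to the sample size n, draw independent uniforms U_i, form the ranking variables Q_i = [U_i/(1-U_i)] / [p_i/(1-p_i)], and take the n units with the smallest Q_i. The Monte Carlo check at the end illustrates the known drawback noted in the abstract, namely that realized inclusion probabilities only approximately match the prescribed ones; the population and size variable below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def pareto_sample(target_pi, n):
    """Pareto pi-ps design: select the n units with the smallest ranking
    variables Q_i = [U_i / (1 - U_i)] / [p_i / (1 - p_i)]."""
    p = np.asarray(target_pi, dtype=float)      # must sum to n, all < 1
    u = rng.uniform(size=p.size)
    q = (u / (1.0 - u)) / (p / (1.0 - p))
    return np.argsort(q)[:n]

# Small illustrative population with inclusion probabilities proportional
# to an auxiliary size variable x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
n = 3
target_pi = n * x / x.sum()

# Monte Carlo check: empirical inclusion frequencies vs. the prescribed ones.
reps, counts = 20000, np.zeros(x.size)
for _ in range(reps):
    counts[pareto_sample(target_pi, n)] += 1
print("prescribed:", np.round(target_pi, 3))
print("realized:  ", np.round(counts / reps, 3))
```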
|
595 |
Correlations and linkages in credit risk : an investigation of the credit default swap market during the turmoil. Wu, Weiou. January 2013.
This thesis investigates correlations and linkages in credit risk, which exist widely across all sectors of the financial markets. The main body of the thesis is constructed around four empirical chapters. I start by extending two main issues addressed by earlier empirical studies of credit derivatives markets: the determinants of CDS spreads and the relationship between CDS spreads and bond yield spreads, with a special focus on the effect of the subprime crisis. Having observed that a linear relationship cannot fully explain the variation in CDS spreads, the third empirical chapter investigates the dependence structure between CDS spread changes and market variables using a nonlinear copula method. The last chapter investigates the relationship between the CDS spread and another credit spread, the TED spread, for which an MVGARCH model and twelve copulas, including three time-varying copulas, are set forth. The results of this thesis greatly enhance our understanding of the effect of the subprime crisis on the credit default swap market, and on this basis a set of practical suggestions is made to policy makers and market participants.
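The nonlinear copula step described above typically starts from rank-based pseudo-observations, so that the dependence structure is separated from the marginal distributions. The sketch below uses simulated series as hypothetical stand-ins for CDS spread changes and a market variable (the thesis uses real market data, not reproduced here): it builds pseudo-observations, estimates Kendall's tau, and calibrates a Clayton copula through the standard moment relation theta = 2*tau/(1-tau), whose lower-tail dependence 2^(-1/theta) summarizes joint downside risk.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated stand-ins with positive dependence; real data would go here.
n = 1000
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
z = rng.standard_normal((n, 2)) @ np.linalg.cholesky(corr).T
cds_changes, market_var = z[:, 0], z[:, 1]

def pseudo_obs(x):
    """Map a margin to (0, 1) by normalized ranks, removing its marginal shape."""
    return stats.rankdata(x) / (len(x) + 1)

u, v = pseudo_obs(cds_changes), pseudo_obs(market_var)

# Kendall's tau is rank-based, so it measures the copula's dependence directly.
tau, _ = stats.kendalltau(u, v)
theta = 2 * tau / (1 - tau)            # Clayton parameter; valid for tau > 0
lower_tail = 2 ** (-1 / theta)         # Clayton lower-tail dependence
print(f"Kendall tau = {tau:.3f}, Clayton theta = {theta:.3f}, "
      f"lower-tail dependence = {lower_tail:.3f}")
```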
|
596 |
A psychographic study of the students market of Hong Kong. Tong, Kam-shing, 湯錦成. January 1981.
Published or final version / Business Administration / Master / Master of Business Administration
|
597 |
Nelson-type Limits for α-Stable Lévy Processes. Al-Talibi, Haidar. January 2010.
Brownian motion has met growing interest in mathematics, physics and particularly in finance since it was introduced at the beginning of the twentieth century. Stochastic processes generalizing Brownian motion have influenced many research fields theoretically and practically, and, along with more refined techniques in measure theory and functional analysis, more stochastic processes were constructed and studied. Lévy processes, with Brownian motion as a special case, have been of major interest in recent decades. In addition, Lévy processes include a number of other important processes as special cases, such as Poisson processes and subordinators, and they are also related to stable processes. In this thesis we generalize a result by S. Chandrasekhar [2] and Edward Nelson, who gave a detailed proof of this result in his book in 1967 [12]. In Nelson's first result, standard Ornstein-Uhlenbeck processes are studied; physically this describes free particles performing a random and irregular movement in water caused by collisions with the water molecules. In a further step he introduces a nonlinear drift in the position variable, i.e. he studies the case when these particles are exposed to an external field of force in physical terms. In this work, we aim to generalize the result of Edward Nelson to the case of α-stable Lévy processes. In other words, we replace the driving noise of a standard Ornstein-Uhlenbeck process by an α-stable Lévy noise and introduce a scaling parameter uniformly in front of all vector fields in the cotangent space, even in front of the noise; this corresponds to time being sent to infinity. With Chandrasekhar's and Nelson's choice of the diffusion constant, the stationary state of the velocity process (which is approached as time tends to infinity) is the Boltzmann distribution of statistical mechanics. The scaling limits we obtain in the absence and presence of a nonlinear drift term, using the scaling property of the characteristic functions and time change, can be extended to types of processes other than α-stable Lévy processes. In future work, we will consider generalizing this one-dimensional result to Euclidean space of arbitrary finite dimension. A challenging task is to consider the geodesic flow on the cotangent bundle of a Riemannian manifold with scaled drift and scaled Lévy noise. Geometrically, the Ornstein-Uhlenbeck process is defined on the tangent bundle of the real line and the driving Lévy noise is defined on the cotangent space.
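The object being scaled above, an Ornstein-Uhlenbeck-type velocity process driven by α-stable Lévy noise, can be simulated directly. The Euler-Maruyama sketch below only illustrates dV_t = -β V_t dt + σ dL_t with L a symmetric α-stable Lévy process; the friction, noise scale, and step-size values are assumptions, and the thesis's scaling-limit argument itself is analytic, not simulation-based.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(3)

def ou_alpha_stable(alpha=1.7, friction=1.0, sigma=0.5, v0=0.0, T=10.0, dt=1e-3):
    """Euler-Maruyama scheme for dV_t = -friction * V_t dt + sigma dL_t,
    where L is a symmetric alpha-stable Levy process.  Increments of L over a
    step dt are alpha-stable with scale dt ** (1 / alpha)."""
    n_steps = int(T / dt)
    v = np.empty(n_steps + 1)
    v[0] = v0
    dL = levy_stable.rvs(alpha, 0.0, scale=dt ** (1.0 / alpha),
                         size=n_steps, random_state=rng)
    for k in range(n_steps):
        v[k + 1] = v[k] - friction * v[k] * dt + sigma * dL[k]
    return v

v = ou_alpha_stable()
# For alpha < 2 the path has jumps and heavy tails, unlike the Gaussian
# (Brownian-driven) Ornstein-Uhlenbeck case recovered at alpha = 2.
print(f"simulated {v.size - 1} steps; extreme values "
      f"{v.min():.2f} / {v.max():.2f}, median {np.median(v):.3f}")
```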
|
598 |
A min/max algorithm for cubic splines over k-partitions. Unknown Date.
The focus of this thesis is to statistically model violent crime rates against population over the years 1960-2009 for the United States. This question is of interest because population trends differ across individual states. We propose a method which employs cubic spline regression modeling. First, we introduce a minimum/maximum algorithm that identifies potential knots. Then we employ least squares estimation to find regression coefficients for the cubic spline model with the knots chosen by the minimum/maximum algorithm. We then use best-subsets regression to aid in model selection, choosing the model that minimizes the Bayesian information criterion. Finally, we present the adjusted R-squared as a measure of the overall goodness of fit of the selected model. Among the fifty states and Washington D.C., 42 out of 51 showed an adjusted R-squared greater than 90%. We also present an overall model for the United States, and we show additional applications of our algorithm to data that exhibit a nonlinear association. It is hoped that our method can serve as a unified model for violent crime rates over future years. / by Eric David Golinko. / Thesis (M.S.)--Florida Atlantic University, 2012. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2012. Mode of access: World Wide Web.
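The pipeline described above (min/max knot candidates, least squares cubic spline fits, best-subsets search under BIC) can be sketched as follows. The synthetic data, the candidate-selection rule, and the cap on the number of knots are assumptions for illustration; the thesis's actual algorithm and crime-rate data are not reproduced here.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)

# Synthetic yearly series standing in for a state's violent crime rate.
t = np.arange(1960, 2010, dtype=float)
s = t - t[0]                         # centered time axis for numerical stability
y = 300 + 1.6 * s * np.sin(s / 8.0) + rng.normal(0, 15, s.size)

def minmax_knots(y):
    """Indices of interior local minima/maxima: the candidate knots."""
    d = np.diff(y)
    return [i for i in range(1, len(y) - 1) if d[i - 1] * d[i] < 0]

def spline_design(s, knots):
    """Truncated power basis for a cubic spline with the given knots."""
    cols = [np.ones_like(s), s, s ** 2, s ** 3]
    cols += [np.clip(s - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

def bic(y, yhat, n_params):
    n = len(y)
    return n * np.log(np.sum((y - yhat) ** 2) / n) + n_params * np.log(n)

candidates = [s[i] for i in minmax_knots(y)]
best_score, best_knots = np.inf, ()
for r in range(min(3, len(candidates)) + 1):     # best-subsets search over knots
    for knots in combinations(candidates, r):
        X = spline_design(s, knots)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        score = bic(y, X @ beta, X.shape[1])
        if score < best_score:
            best_score, best_knots = score, knots
print(f"selected knots (years): {tuple(t[0] + k for k in best_knots)}, "
      f"BIC = {best_score:.1f}")
```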
|
599 |
Detection of multiple change-points in hazard models. Unknown Date.
Change-point detection in the hazard rate function is an important research topic in survival analysis. In this dissertation, we first review existing methods for single change-point detection in the piecewise exponential hazard model. Then we consider the problem of estimating the change point in the presence of right censoring and long-term survivors, while using the Kaplan-Meier estimator for the susceptible proportion; the maximum likelihood estimators are shown to be consistent. Taking one step further, we propose counting-process-based and least-squares-based change-point detection algorithms, and consistency results are obtained for the single change-point case. We then consider the detection of multiple change-points in the presence of long-term survivors via maximum-likelihood-based and counting-process-based methods. Last but not least, we use weighted-least-squares-based and counting-process-based methods for the detection of multiple change-points with long-term survivors and covariates. For multiple change-point detection, simulation studies show good performance of our estimators under various parameter settings for both methods. All methods are applied to real data analyses. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
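A minimal version of the single change-point problem in the piecewise exponential hazard model reviewed above can be sketched by profile maximum likelihood under right censoring: for each candidate change point, the hazard on each side is estimated as events divided by time at risk, and the candidate maximizing the profile log-likelihood is selected. The simulation below omits long-term survivors and covariates, and the hazard levels, censoring mechanism, and search grid are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(5)

def profile_loglik(times, events, tau):
    """Profile log-likelihood of a piecewise exponential hazard with one
    change point at tau, under right censoring (events[i] = 1 if observed)."""
    exposure_before = np.minimum(times, tau).sum()
    exposure_after = np.clip(times - tau, 0.0, None).sum()
    d_before = np.sum(events * (times <= tau))
    d_after = np.sum(events * (times > tau))
    ll = 0.0
    for d, exposure in ((d_before, exposure_before), (d_after, exposure_after)):
        if d > 0 and exposure > 0:
            lam = d / exposure           # profile MLE of the hazard on this piece
            ll += d * np.log(lam) - lam * exposure
    return ll

# Simulated lifetimes: hazard 0.2 before the true change point 5.0, then 1.0,
# with independent uniform censoring on [0, 15] (inverse-CDF sampling).
n, true_tau, lam1, lam2 = 2000, 5.0, 0.2, 1.0
u = rng.uniform(size=n)
lifetimes = np.where(u > np.exp(-lam1 * true_tau),
                     -np.log(u) / lam1,
                     true_tau + (-np.log(u) - lam1 * true_tau) / lam2)
censoring = rng.uniform(0.0, 15.0, size=n)
times = np.minimum(lifetimes, censoring)
events = (lifetimes <= censoring).astype(float)

grid = np.linspace(1.0, 10.0, 181)
ll = np.array([profile_loglik(times, events, tau) for tau in grid])
print(f"estimated change point: {grid[ll.argmax()]:.2f} (true value {true_tau})")
```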
|
600 |
Analysis of dependence structure between the Rand/U.S. Dollar exchange rate and the gold/platinum prices. Malandala, Kajingulu. 04 1900.
Copula functions are a flexible tool for modelling the dependence structure between variables; their joint and marginal distributions are not constrained by assumptions of normality. This study examines the dependence structure between gold and platinum prices and the ZAR/USD exchange rate using copulas. The study found that the marginal distributions follow ARMA(1,1)-EGARCH(1,1) and ARMA(1,1)-APARCH(1,1) models under different error distributions, including normal, Student-t and skew Student-t errors. It used the Normal, Student-t, Gumbel, rotated Gumbel, Clayton, rotated Clayton, Plackett, Joe-Clayton and time-varying Normal copulas to analyse the dependence structure between the returns of gold and platinum prices and the Rand/USD exchange rate. The results showed evidence of strong positive dependence between the returns of gold and platinum prices and returns on the Rand/USD exchange rate, for both constant and time-varying copulas. The results also showed co-movement of the exchange rate with gold and platinum prices during periods of rising or declining gold and platinum prices. These results imply that fluctuations in gold and platinum prices generate Rand/USD exchange rate volatility. / Statistics / M. Sc. (Statistics)
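The time-varying copulas mentioned above are designed to track dependence that drifts over time. As a model-free illustration of the same idea (not the study's ARMA-EGARCH/APARCH-plus-copula estimation, and using simulated returns rather than the actual gold, platinum and ZAR/USD series), the sketch below computes Kendall's tau over rolling windows of two return series; a roughly constant sequence of taus points toward a constant copula, while systematic drift motivates a time-varying specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Simulated stand-ins for daily returns of gold and the ZAR/USD exchange rate,
# built from a common factor so that they co-move.
n = 1500
common = rng.standard_normal(n)
gold_ret = 0.6 * common + 0.8 * rng.standard_normal(n)
zar_usd_ret = 0.6 * common + 0.8 * rng.standard_normal(n)

# Rolling-window Kendall's tau: rank-based, so it does not depend on the
# marginal distributions of the two return series.
window, step = 250, 125
taus = []
for start in range(0, n - window + 1, step):
    tau, _ = stats.kendalltau(gold_ret[start:start + window],
                              zar_usd_ret[start:start + window])
    taus.append(round(tau, 3))
print("rolling Kendall's tau:", taus)
```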
|