131 |
Bayesian analysis of errors-in-variables in generalized linear models. Tang, Pui-kuen (鄧沛權). January 1992 (has links)
published_or_final_version / Statistics / Doctoral / Doctor of Philosophy
|
132 |
Development of high performance implantable cardioverter defibrillator based on statistical analysis of electrocardiography. Kwan, Siu-ki (關兆奇). January 2007 (has links)
published_or_final_version / abstract / Electrical and Electronic Engineering / Doctoral / Doctor of Philosophy
|
133 |
Bayesian carrier frequency offset estimation in orthogonal frequency division multiplexing systems. Cai, Kun (蔡琨). January 2009 (has links)
published_or_final_version / Electrical and Electronic Engineering / Master / Master of Philosophy
|
134 |
Online auction price prediction: a Bayesian updating framework based on the feedback history. Yang, Boye (扬博野). January 2009 (has links)
published_or_final_version / Business / Master / Master of Philosophy
|
135 |
Decision theory to support evacuation in advance of catastrophic disaster including modular influence diagrams and spatial data analysis. Kailiponi, Paul. January 2012 (has links)
Catastrophic disaster represents a vital issue in emergency management for many countries in the European Union (EU) and around the world. Given the damage to human lives that different hazards represent, evacuation operations can be the only option available to emergency managers to mitigate the loss of life from catastrophic disaster. However, due to the amount of time needed to effectively evacuate a large area, the decision to evacuate must occur when there is a relatively low probability of the event. An explicit understanding of the evacuation decision can lead to better organisational preparedness in advance of catastrophic disaster events. This research represents work performed with 159 emergency experts and professionals across ten countries. The goal of this research was to create decision-making aids for evacuations in advance of a variety of catastrophic disaster scenarios. Traditional Decision Theory (DT) provides a rational approach to decision-making that emphasizes the optimization of subjective preferences combined with uncertainty. Within evacuation decision-making, DT and its respective outputs are appealing; however, the analytical process can be difficult due to the lack of observed data from catastrophic events to support quantitative assessments and the relative infrequency of evacuation operations. This research explored the traditional use of DT applied to catastrophic evacuation scenarios. Theoretical contributions to DT and emergency management include: 1) identification of evacuation decision criteria, 2) inter-model analysis between decision structures called Influence Diagrams (IDs), 3) complete application of quantitative decision analysis to support evacuation decision-making and 4) multi-criteria analysis for evacuation vulnerability using spatial data.
Important contributions from this work include: 1) an analysis of evacuation criteria for a variety of catastrophic disaster scenarios; 2) inter-model analysis of evacuation scenarios (flooding, nuclear dispersion and terrorist attack) to identify common probabilistic structures to support multi-hazard strategy planning; 3) quantitative decision models to support evacuation strategies, identify key uncertainties and support policy analysis; 4) a process to use spatial data to support multi-criteria evacuation vulnerability analysis; and 5) an organisational self-assessment for evacuation decision-making and spatial data use based on findings across all participating countries.
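The asymmetry described in the abstract — evacuation must be ordered even at relatively low event probabilities because the potential loss is so large — can be sketched as a minimal expected-cost comparison at the decision node of an influence diagram. The cost figures below are hypothetical illustrations, not values from the thesis:

```python
def best_action(p_disaster, evac_cost, loss_if_hit):
    """Decision node of a minimal influence diagram: evacuate (certain
    cost) vs. wait (expected loss if the hazard materialises).
    The break-even probability is evac_cost / loss_if_hit."""
    expected_wait_cost = p_disaster * loss_if_hit
    return "evacuate" if expected_wait_cost > evac_cost else "wait"

# With a 10-unit evacuation cost and a 1000-unit loss, the break-even
# probability is only 1%, so even unlikely events justify evacuation.
print(best_action(0.02, evac_cost=10, loss_if_hit=1000))   # evacuate
print(best_action(0.005, evac_cost=10, loss_if_hit=1000))  # wait
```

The low break-even probability is exactly why the abstract stresses deciding before the event is likely.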
|
136 |
The effectiveness of hedge fund strategies and managers’ skills during market crises: a fuzzy, non-parametric and Bayesian analysis. 05 November 2012 (has links)
Ph.D. / This thesis investigates the persistence of hedge fund managers’ skills, the optimality of the strategies they use to consistently outperform the market during periods of boom and/or recession, and the market risk encountered thereby. We consider a data set of monthly investment strategy indices published by the Hedge Fund Research group. The data set spans from January 1995 to June 2010. We divide this sample period into four overlapping sub-sample periods that contain different economic market trends. We define a skilled manager as a manager who can outperform the market consistently during two consecutive sub-sample periods. To investigate the presence of managerial skills among hedge fund managers we first distinguish between outperformance, selectivity and market timing skills. We thereafter employ three different econometric models: frequentist, Bayesian and fuzzy regression, in order to estimate outperformance, selectivity and market timing skills using both linear and quadratic CAPM. Persistence in performance is tested in three different fashions: contingency table, chi-square test and cross-sectional auto-regression technique. The results obtained with the first two probabilistic methods (frequentist and Bayesian) show that fund managers have skills to outperform the market during the period of positive economic growth (i.e. between sub-sample period 1 and sub-sample period 3). This market outperformance is due to both selectivity skill (during sub-sample period 2 and sub-sample period 3) and market timing skill (during sub-sample period 1 and sub-sample period 2).
These results contradict the EMH and suggest that the “market is not always efficient”: it is possible to earn abnormal rates of return. However, the results obtained with the uncertainty fuzzy credibility method show that despite the presence of a few fund managers who possess selectivity skills during the bull market period (sub-sample period 2 and sub-sample period 3) and market timing skills during the recovery period (sub-sample period 3 and sub-sample period 4), there is no evidence of overall market outperformance during the entire sample period. Therefore the fuzzy credibility results support the appeal of the EMH, according to which no economic agent can earn a risk-adjusted abnormal rate of return. The difference in findings obtained with the probabilistic methods (frequentist and Bayesian) and the uncertainty method (fuzzy credibility theory) is primarily due to the way uncertainty is modelled in the hedge fund universe in particular and in financial markets in general. Probability differs fundamentally from uncertainty: probability assumes that the total number of states of the economy is known, whereas uncertainty assumes that the total number of states of the economy is unknown. Furthermore, probabilistic methods rely on the assumption that asset returns are normally distributed and that transaction costs are negligible.
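The quadratic CAPM referred to above is commonly estimated as a Treynor–Mazuy-style regression, in which a positive intercept indicates selectivity skill and a positive quadratic coefficient indicates market-timing skill. A minimal OLS sketch on synthetic data (the return series and coefficient values below are invented for illustration, not taken from the thesis):

```python
import numpy as np

def timing_regression(fund_excess, market_excess):
    """OLS fit of the quadratic CAPM
        r_fund = alpha + beta * r_mkt + gamma * r_mkt**2 + eps,
    where alpha > 0 suggests selectivity skill and gamma > 0 suggests
    market timing (exposure raised when the market rises)."""
    X = np.column_stack([np.ones_like(market_excess),
                         market_excess,
                         market_excess ** 2])
    coef, *_ = np.linalg.lstsq(X, fund_excess, rcond=None)
    return coef  # alpha, beta, gamma

# Synthetic monthly excess returns with known skill parameters
rng = np.random.default_rng(0)
r_mkt = rng.normal(0.005, 0.04, 240)
r_fund = 0.002 + 0.8 * r_mkt + 1.5 * r_mkt**2 + rng.normal(0, 0.001, 240)
alpha, beta, gamma = timing_regression(r_fund, r_mkt)
```

On this synthetic sample the regression recovers the positive alpha and gamma that the thesis interprets as selectivity and timing skill respectively.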
|
137 |
Generalizing the number of states in Bayesian belief propagation, as applied to portfolio management. Kruger, Jan Walters. January 1996 (has links)
A research report submitted to the Faculty of Science, University of the
Witwatersrand, Johannesburg, in partial fulfillment of the requirements for the
degree of Master of Science. / This research report describes the use of Pearl's algorithm in Bayesian belief
networks to induce a belief network from a database. With a solid grounding in
probability theory, the Pearl algorithm allows belief updating by propagating
likelihoods of leaf nodes (variables) and the prior probabilities.
The Pearl algorithm was originally developed for binary variables and a
generalization to more states is investigated.
The data used to test this new method, in a Portfolio Management context, are the
Return and various attributes of companies listed on the Johannesburg Stock
Exchange ( JSE ).
The results of this model are then compared to those of a linear regression model. The
Bayesian method is found to perform better than the linear regression approach. / Andrew Chakane 2018
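The single-step update at the heart of Pearl's propagation — posterior proportional to the prior times the likelihood message from a child node — extends directly from binary to multi-state variables, which is the generalization this report investigates. A minimal sketch with a three-state node (the probability tables are invented for illustration):

```python
import numpy as np

def update_belief(prior, cpt, evidence_state):
    """Single-step Pearl update for a node X with any number of states:
    posterior(x) is proportional to prior(x) * P(evidence | x). The
    likelihood vector (lambda message) is the column of the child's
    conditional probability table selected by the observed child state."""
    likelihood = cpt[:, evidence_state]
    posterior = prior * likelihood
    return posterior / posterior.sum()

# Three-state node, e.g. Return = low / medium / high
prior = np.array([0.5, 0.3, 0.2])
cpt = np.array([[0.7, 0.2, 0.1],   # P(child state | X = low)
                [0.2, 0.6, 0.2],   # P(child state | X = medium)
                [0.1, 0.2, 0.7]])  # P(child state | X = high)
posterior = update_belief(prior, cpt, evidence_state=2)
```

Observing the child in its third state shifts the belief toward X = high, exactly as the binary version would but with no restriction on the number of states.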
|
138 |
Various Approaches on Parameter Estimation in Mixture and Non-Mixture Cure Models. Unknown Date (has links)
Analyzing lifetime data with long-term survivors is an important topic in medical applications. Cure models are usually used to analyze survival data with a proportion of cured subjects or long-term survivors. In order to include the proportion of cured subjects, mixture and non-mixture cure models are considered. In this dissertation, we utilize both maximum likelihood and Bayesian methods to estimate model parameters. Simulation studies are carried out to verify the finite sample performance of the estimation methods. Real data analyses are reported to illustrate the goodness-of-fit via Fréchet, Weibull and Exponentiated Exponential susceptible distributions. Among the three parametric susceptible distributions, Fréchet is the most promising. Next, we extend the non-mixture cure model to include a change point in a covariate for right censored data. The smoothed likelihood approach is used to address the problem of a log-likelihood function which is not differentiable with respect to the change point. The simulation study is based on the non-mixture change point cure model with an exponential distribution for the susceptible subjects. The simulation results revealed a convincing performance of the proposed method of estimation. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2018. / FAU Electronic Theses and Dissertations Collection
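The mixture cure model above writes the population survival function as S(t) = π + (1 − π)S_u(t), where π is the cured fraction and S_u the susceptible survival function. A minimal maximum-likelihood sketch with an exponential susceptible distribution — deliberately simpler than the Fréchet/Weibull/Exponentiated Exponential families used in the dissertation, with simulated data invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, delta):
    """Negative log-likelihood of a mixture cure model with exponential
    susceptible distribution, S(t) = pi + (1 - pi) * exp(-rate * t).
    delta = 1 for observed events, 0 for right-censored observations."""
    logit_pi, log_rate = params                 # unconstrained parameters
    pi = 1.0 / (1.0 + np.exp(-logit_pi))        # cure fraction in (0, 1)
    rate = np.exp(log_rate)
    event = np.log1p(-pi) + np.log(rate) - rate * t       # density term
    censor = np.log(pi + (1.0 - pi) * np.exp(-rate * t))  # survival term
    return -np.sum(delta * event + (1.0 - delta) * censor)

# Simulate: 30% cured, susceptible times Exp(1), censoring at t = 5
rng = np.random.default_rng(1)
n = 2000
cured = rng.random(n) < 0.3
latent = rng.exponential(1.0, n)
t = np.where(cured, 5.0, np.minimum(latent, 5.0))
delta = ((~cured) & (latent < 5.0)).astype(float)
res = minimize(neg_loglik, x0=[0.0, 0.0], args=(t, delta),
               method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
rate_hat = np.exp(res.x[1])
```

The logit/log reparametrization keeps the optimizer unconstrained while guaranteeing a valid cure fraction and positive rate.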
|
139 |
Bayesian approach to an exponential hazard regression model with a change point. Unknown Date (has links)
This thesis contains two parts. The first part derives the Bayesian estimator of
the parameters in a piecewise exponential Cox proportional hazard regression model,
with one unknown change point for a right censored survival data. The second part
surveys the applications of change point problems to various types of data, such as
long-term survival data, longitudinal data and time series data. Furthermore, the
proposed method is then used to analyse a real survival data set. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
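For a fixed, known change point, the piecewise exponential hazard model has closed-form maximum-likelihood rate estimates: each rate is the number of events in its segment divided by the total time at risk in that segment. A minimal sketch (simulation parameters are invented for illustration; the thesis treats the change point as unknown and estimates the model in a Bayesian framework):

```python
import numpy as np

def piecewise_exp_mle(t, delta, tau):
    """Closed-form MLEs of the two rates in a piecewise exponential model
    with known change point tau (hazard = lam1 before tau, lam2 after):
    each rate = events in the segment / total time at risk in it."""
    exposure1 = np.minimum(t, tau)           # time at risk before tau
    exposure2 = np.maximum(t - tau, 0.0)     # time at risk after tau
    d1 = np.sum((delta == 1) & (t < tau))
    d2 = np.sum((delta == 1) & (t >= tau))
    return d1 / exposure1.sum(), d2 / exposure2.sum()

# Simulate by inverting the cumulative hazard: tau=2, lam1=0.5, lam2=2.0
rng = np.random.default_rng(2)
e = rng.exponential(1.0, 5000)               # cumulative-hazard draws
tau, lam1, lam2 = 2.0, 0.5, 2.0
t = np.where(e < lam1 * tau, e / lam1, tau + (e - lam1 * tau) / lam2)
delta = np.ones_like(t)                      # no censoring, for brevity
lam1_hat, lam2_hat = piecewise_exp_mle(t, delta, tau)
```

Profiling this likelihood over a grid of candidate tau values is one simple frequentist route to the unknown change point that the thesis instead handles with a posterior distribution.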
|
140 |
FBST seqüencial / Sequential FBST. Arruda, Marcelo Leme de. 04 June 2012 (has links)
FBST (Full Bayesian Significance Test) is a tool developed by Pereira and Stern (1999) as a Bayesian alternative to tests of precise hypotheses. Since its introduction, the FBST has proved to be a very useful tool for solving problems for which no frequentist solutions existed. This test, however, requires that the sample be collected in a single batch, after which the posterior distribution of the parameters is obtained and the evidence measure computed. Motivated by this aspect, analytic and computational approaches are presented for extending the FBST to the sequential decision context (DeGroot, 2004). An algorithm for executing the Sequential FBST is presented and analyzed, as well as the source code of a software implementation based on that algorithm.
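The evidence measure computed by the FBST for a sharp hypothesis θ = θ₀ is 1 minus the posterior mass of the tangential set {θ : p(θ|x) > p(θ₀|x)}. A minimal numerical sketch for a Beta posterior (the binomial example is invented for illustration; the thesis itself concerns the sequential extension, not this basic computation):

```python
import numpy as np
from scipy.stats import beta

def fbst_evidence(a, b, theta0, grid_size=100001):
    """FBST e-value for the sharp hypothesis theta = theta0 under a
    Beta(a, b) posterior: ev = 1 - (posterior mass of the tangential set
    {theta : posterior density > density at theta0}), via a grid sum."""
    theta = np.linspace(0.0, 1.0, grid_size)
    dens = beta.pdf(theta, a, b)
    d0 = beta.pdf(theta0, a, b)
    tangential = dens > d0
    step = theta[1] - theta[0]
    return 1.0 - np.sum(dens[tangential]) * step

# 60 successes in 100 Bernoulli trials, uniform prior -> Beta(61, 41)
ev = fbst_evidence(61, 41, theta0=0.5)       # small: evidence against 0.5
ev_mode = fbst_evidence(61, 41, theta0=0.6)  # ~1: theta0 at posterior mode
```

A small e-value means the hypothesized point sits in a low-density region of the posterior, so the data weigh against it; at the posterior mode the tangential set is empty and the e-value is maximal.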
|