51

ARIMA forecasts of the number of beneficiaries of social security grants in South Africa

Luruli, Fululedzani Lucy 12 1900
The main objective of the thesis was to investigate the feasibility of accurately and precisely forecasting the number of both national and provincial beneficiaries of social security grants in South Africa, using simple autoregressive integrated moving average (ARIMA) models. The series of the monthly number of beneficiaries of the old age, child support, foster care and disability grants from April 2004 to March 2010 were used to achieve the objectives of the thesis. The conclusions from analysing the series were that: (1) ARIMA models for forecasting are province and grant-type specific; (2) for some grants, national forecasts obtained by aggregating provincial ARIMA forecasts are more accurate and precise than those obtained by ARIMA modelling of the national series; and (3) for some grants, forecasts obtained by modelling the latest half of the series were more accurate and precise than those obtained from modelling the full series. / Mathematical Sciences / M.Sc. (Statistics)
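As a rough illustration of the kind of model this abstract describes, the sketch below fits a single ARIMA model to a simulated monthly beneficiary series with statsmodels and produces 12-month-ahead point forecasts and prediction intervals. The data, the (1, 1, 1) order and the horizon are illustrative assumptions, not values taken from the thesis, which selects orders per grant type and province.

```python
# A minimal sketch, assuming a single simulated monthly series; the thesis
# selects (p, d, q) orders separately for each grant type and province.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
idx = pd.date_range("2004-04-01", "2010-03-01", freq="MS")        # Apr 2004 to Mar 2010
beneficiaries = pd.Series(
    1_000_000 + np.cumsum(rng.normal(5_000, 2_000, len(idx))), index=idx)

fit = ARIMA(beneficiaries, order=(1, 1, 1)).fit()                  # illustrative order
forecast = fit.get_forecast(steps=12)                              # 12 months ahead
print(forecast.predicted_mean.round(0))                            # accuracy: point forecasts
print(forecast.conf_int().round(0))                                # precision: intervals
```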
52

Understanding patterns of aggregation in count data

Sebatjane, Phuti 06 1900
The term aggregation refers to overdispersion and both are used interchangeably in this thesis. In addressing the problem of the prevalence of infectious parasite species faced by most rural livestock farmers, we model the distribution of faecal egg counts of 15 parasite species (13 internal parasites and 2 ticks) common in sheep and goats. Aggregation and excess zeroes are addressed through the use of generalised linear models. The abundance of each species was modelled using six different distributions: the Poisson, negative binomial (NB), zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), zero-altered Poisson (ZAP) and zero-altered negative binomial (ZANB), and their fit was later compared. Excess-zero models (ZIP, ZINB, ZAP and ZANB) were found to be a better fit than standard count models (Poisson and negative binomial) in all 15 cases. We further investigated how the distributional assumption affects aggregation and zero inflation. Aggregation and zero inflation (measured by the dispersion parameter k and the zero-inflation probability) were found to vary greatly with the distributional assumption; this in turn changed the fixed-effects structure. Serial autocorrelation between adjacent observations was later taken into account by fitting observation-driven time series models to the data. Simultaneously taking into account autocorrelation, overdispersion and zero inflation proved to be successful, as zero-inflated autoregressive models performed better than zero-inflated models in most cases. Apart from its contribution to scientific knowledge, predictability of parasite burden will help farmers with effective disease management interventions. Researchers confronted with the task of analysing count data with excess zeroes can use the findings of this illustrative study as a guideline, irrespective of their research discipline. Statistical methods from model selection and quantifying of zero inflation through to accounting for serial autocorrelation are described and illustrated. / Statistics / M.Sc. (Statistics)
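For readers unfamiliar with the model comparison described above, the sketch below fits Poisson, negative binomial, zero-inflated Poisson and zero-inflated negative binomial models to simulated zero-heavy counts with statsmodels and ranks them by AIC. The data and covariate are hypothetical, and the zero-altered (hurdle) variants used in the thesis are omitted for brevity.

```python
# A minimal sketch, not the thesis code: simulated zero-heavy counts with one
# hypothetical covariate, compared across four count distributions by AIC.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (
    ZeroInflatedPoisson, ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(1)
n = 500
X = sm.add_constant(rng.normal(size=n))                  # intercept + one covariate
counts = rng.negative_binomial(1, 0.3, size=n) * rng.binomial(1, 0.6, size=n)

fits = {
    "Poisson": sm.Poisson(counts, X).fit(disp=0),
    "NegBin": sm.NegativeBinomial(counts, X).fit(disp=0),
    "ZIP": ZeroInflatedPoisson(counts, X).fit(disp=0),
    "ZINB": ZeroInflatedNegativeBinomialP(counts, X).fit(disp=0),
}
for name, res in sorted(fits.items(), key=lambda kv: kv[1].aic):
    print(f"{name:7s} AIC = {res.aic:.1f}")              # smaller AIC = better fit
```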
54

MARGINAL LIKELIHOOD INFERENCE FOR FRAILTY AND MIXTURE CURE FRAILTY MODELS UNDER BIRNBAUM-SAUNDERS AND GENERALIZED BIRNBAUM-SAUNDERS DISTRIBUTIONS

Liu, Kai January 2018
Survival analytic methods help to analyze lifetime data arising from medical and reliability experiments. The popular proportional hazards model, proposed by Cox (1972), is widely used in survival analysis to study the effect of risk factors on lifetimes. An important assumption in regression type analysis is that all relative risk factors should be included in the model. However, not all relative risk factors are observed due to measurement difficulty, inaccessibility, cost considerations, and so on. These unobservable risk factors can be modelled by the so-called frailty model, originally introduced by Vaupel et al. (1979). Furthermore, the frailty model is also applicable to clustered data. Clustered data possess the feature that observations within the same cluster share similar conditions and environment, which are sometimes difficult to observe. For example, patients from the same family share similar genetics, and patients treated in the same hospital share the same group of professionals and the same environmental conditions. These factors are indeed hard to quantify or measure. In addition, this type of similarity introduces correlation among subjects within clusters. In this thesis, a semi-parametric frailty model is proposed to address the aforementioned issues. The baseline hazard function is approximated by a piecewise constant function and the frailty distribution is assumed to be a Birnbaum-Saunders distribution. Due to the advancement in modern medical sciences, many diseases are curable, which in turn leads to the need to incorporate a cure proportion in the survival model. The frailty model discussed here is further extended to a mixture cure rate frailty model by integrating the frailty model into the mixture cure rate model proposed originally by Boag (1949) and Berkson and Gage (1952). By linking the covariates to the cure proportion through a logistic/logit link function and linking observable covariates and unobservable covariates to the lifetime of the uncured population through the frailty model, we obtain a flexible model to study the effect of risk factors on lifetimes. The mixture cure frailty model can be reduced to a mixture cure model if the effect of the frailty term is negligible (i.e., the variance of the frailty distribution is close to 0). On the other hand, it also reduces to the usual frailty model if the cure proportion is 0. Therefore, we can use a likelihood ratio test to test whether the reduced model is adequate to model the given data. We assume the baseline hazard to be that of a Weibull distribution, since the Weibull distribution possesses an increasing, constant or decreasing hazard rate, and the frailty distribution to be a Birnbaum-Saunders distribution. Díaz-García and Leiva-Sánchez (2005) proposed a new family of life distributions, called the generalized Birnbaum-Saunders distribution, which includes the Birnbaum-Saunders distribution as a special case. It allows for various degrees of kurtosis and skewness, and also permits unimodality as well as bimodality. Therefore, integration of a generalized Birnbaum-Saunders distribution as the frailty distribution in the mixture cure frailty model results in a very flexible model. For this general model, parameter estimation is carried out using a marginal likelihood approach. One of the difficulties in the parameter estimation is that the likelihood function is intractable.
Current computational technology enables us to develop a numerical method through Monte Carlo simulation: the likelihood function is approximated by the Monte Carlo method, and the maximum likelihood estimates and standard errors of the model parameters are then obtained numerically by maximizing this approximate likelihood function. An EM algorithm is also developed for the Birnbaum-Saunders mixture cure frailty model. The performance of this estimation method is then assessed by simulation studies for each proposed model. Model discrimination is also performed between the Birnbaum-Saunders frailty model and the generalized Birnbaum-Saunders mixture cure frailty model. Some illustrative real life examples are presented to illustrate the models and inferential methods developed here. / Thesis / Doctor of Science (PhD)
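A minimal sketch of the Monte Carlo idea mentioned above follows: the intractable marginal likelihood of one cluster is approximated by averaging the conditional likelihood over simulated frailty draws. A gamma frailty and a Weibull baseline are used purely for illustration; the thesis itself works with Birnbaum-Saunders and generalized Birnbaum-Saunders frailties.

```python
# A minimal sketch, assuming a gamma frailty (mean 1, variance frailty_var) and
# a Weibull baseline hazard; the cluster marginal log-likelihood is the
# log-mean of conditional likelihoods over simulated frailty draws.
import numpy as np

rng = np.random.default_rng(0)

def cluster_marginal_loglik(times, events, shape, scale, frailty_var, draws=2000):
    """log L_i = log E_z[ prod_j (z h0(t_j))^d_j exp(-z H0(t_j)) ], approximated by MC."""
    t = np.asarray(times, dtype=float)
    d = np.asarray(events, dtype=float)
    z = rng.gamma(1.0 / frailty_var, frailty_var, size=draws)     # frailty draws, E[z] = 1
    H0 = (t / scale) ** shape                                     # Weibull cumulative hazard
    h0 = (shape / scale) * (t / scale) ** (shape - 1)             # Weibull hazard
    cond = (d * np.log(z[:, None] * h0) - z[:, None] * H0).sum(axis=1)
    m = cond.max()
    return m + np.log(np.exp(cond - m).mean())                    # stable log-mean-exp

print(cluster_marginal_loglik(times=[2.0, 3.5, 1.2], events=[1, 0, 1],
                              shape=1.3, scale=4.0, frailty_var=0.5))
```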
55

CURE RATE AND DESTRUCTIVE CURE RATE MODELS UNDER PROPORTIONAL ODDS LIFETIME DISTRIBUTIONS

FENG, TIAN January 2019
Cure rate models, introduced by Boag (1949), are very commonly used when modelling lifetime data involving long-term survivors. Applications of cure rate models can be seen in biomedical science, industrial reliability, finance, manufacturing, demography and criminology. In this thesis, cure rate models are discussed under a competing cause scenario, with the assumption of proportional odds (PO) lifetime distributions for the susceptibles, and statistical inferential methods are then developed based on right-censored data. In Chapter 2, a flexible cure rate model is discussed by assuming that the number of competing causes for the event of interest follows the Conway-Maxwell (COM) Poisson distribution, and that the lifetimes of the non-cured or susceptible individuals can be described by a PO model. This provides a natural extension of the work of Gu et al. (2011), who had considered a geometric number of competing causes. Under right censoring, maximum likelihood estimators (MLEs) are obtained by the use of the expectation-maximization (EM) algorithm. An extensive Monte Carlo simulation study is carried out for various scenarios, and model discrimination between some well-known cure models, such as the geometric, Poisson and Bernoulli, is also examined. Goodness-of-fit and model diagnostics are also discussed. A cutaneous melanoma dataset example is used to illustrate the models as well as the inferential methods. Next, in Chapter 3, the destructive cure rate models, introduced by Rodrigues et al. (2011), are discussed under the PO assumption. Here, the initial number of competing causes is modelled by a weighted Poisson distribution, with special focus on the exponentially weighted Poisson, length-biased Poisson and negative binomial distributions. Then, a damage distribution is introduced for the number of initial causes which do not get destroyed. An EM-type algorithm for computing the MLEs is developed. An extensive simulation study is carried out for various scenarios, and model discrimination between the three weighted Poisson distributions is also examined. All the models and methods of estimation are evaluated through a simulation study. A cutaneous melanoma dataset example is used to illustrate the models as well as the inferential methods. In Chapter 4, frailty cure rate models are discussed under a gamma frailty, wherein the initial number of competing causes is described by a Conway-Maxwell (COM) Poisson distribution and the lifetimes of non-cured individuals are described by a PO model. The detailed steps of the EM algorithm are then developed for this model and an extensive simulation study is carried out to evaluate the performance of the proposed model and the estimation method. A cutaneous melanoma dataset as well as simulated data are used for illustrative purposes. Finally, Chapter 5 outlines the work carried out in the thesis and also suggests some problems of further research interest. / Thesis / Doctor of Philosophy (PhD)
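As a hedged illustration of the mixture cure structure referred to above, the sketch below evaluates the population survival function S_pop(t|x) = pi(x) + (1 - pi(x)) S_u(t|x), with a logit link for the cure probability and a log-logistic proportional odds survival for the susceptibles. All coefficients are hypothetical, and the competing-cause (COM-Poisson) layer of the thesis is not modelled here.

```python
# A minimal sketch, assuming a logit link for the cure probability and a
# log-logistic proportional odds survival for susceptibles; all parameter
# values are hypothetical.
import numpy as np

def cure_prob(x, b0=-0.5, b1=1.0):
    """pi(x): cure proportion under a logit link."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))

def susceptible_survival_po(t, x, alpha=1.5, lam=0.8, beta=0.6):
    """S_u(t|x): proportional odds model, failure odds scaled by exp(beta * x)."""
    baseline_odds = (t / lam) ** alpha            # baseline odds of failure by t
    odds = baseline_odds * np.exp(beta * x)       # PO covariate effect
    return 1.0 / (1.0 + odds)

def population_survival(t, x):
    """S_pop(t|x) = pi(x) + (1 - pi(x)) * S_u(t|x)."""
    p = cure_prob(x)
    return p + (1.0 - p) * susceptible_survival_po(t, x)

print(population_survival(np.array([0.5, 1.0, 2.0]), x=1.0))
```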
56

Inference for Gamma Frailty Models based on One-shot Device Data

Yu, Chenxi January 2024
A device that is accompanied by an irreversible chemical reaction or physical destruction and can no longer function properly after performing its intended function is referred to as a one-shot device. One-shot device test data differ from typical data obtained by measuring lifetimes in standard life-tests. Due to the very nature of one-shot devices, actual lifetimes of one-shot devices under test cannot be observed, and they are either left- or right-censored. In addition, a one-shot device often has multiple components that could cause the failure of the device. The components are coupled together in the manufacturing process or assembly, resulting in the failure modes possessing latent heterogeneity and dependence. Frailty models enable us to describe the influence of common but unobservable covariates on the hazard function as a random effect in a model and also provide an easily understandable interpretation. In this thesis, we develop some inferential results for one-shot device testing data under a gamma frailty model. We first develop an efficient expectation-maximization (EM) algorithm for determining the maximum likelihood estimates of the parameters of a gamma frailty model with exponential lifetime distributions for components, based on one-shot device test data with multiple failure modes, wherein the data are obtained from a constant-stress accelerated life-test. The maximum likelihood estimate of the mean lifetime of $k$-out-of-$M$ structured one-shot devices under normal operating conditions is also presented. In addition, the asymptotic variance–covariance matrix of the maximum likelihood estimates is derived, which is then used to construct asymptotic confidence intervals for the model parameters. The performance of the proposed inferential methods is finally evaluated through Monte Carlo simulations and then illustrated with a numerical example. A gamma frailty model with Weibull baseline hazards is considered next for fitting one-shot device testing data. The Weibull baseline hazards enable us to analyze time-varying failure rates more accurately, allowing for a deeper understanding of the dynamic nature of a system's reliability. We develop an EM algorithm for estimating the model parameters utilizing the complete likelihood function. A detailed simulation study compares the performance of the Weibull baseline hazard model with that of the exponential baseline hazard model. The introduction of shape parameters in the component's lifetime distribution within the Weibull baseline hazard model offers enhanced flexibility in model fitting. Finally, Bayesian inference is developed for the gamma frailty model with exponential baseline hazard for one-shot device testing data. We introduce the Bayesian estimation procedure using the Markov chain Monte Carlo (MCMC) technique for estimating the model parameters as well as for developing credible intervals for those parameters. The performance of the proposed method is evaluated in a simulation study. Model comparison between the independence model and the frailty model is made using a Bayesian model selection criterion. / Thesis / Candidate in Philosophy
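The sketch below illustrates one building block of such models under simplifying assumptions: for a series (M-out-of-M) device whose exponential components share a gamma frailty with mean 1 and variance theta, the frailty integrates out in closed form through the gamma Laplace transform. The rates and theta are illustrative, not estimates from the thesis.

```python
# A minimal sketch, assuming a series (M-out-of-M) device with exponential
# component lifetimes sharing a gamma frailty of mean 1 and variance theta;
# the frailty is integrated out via the gamma Laplace transform.
import numpy as np

def device_survival(t, rates, theta):
    """S(t) = E_z[exp(-z * t * sum(rates))] = (1 + theta * t * sum(rates)) ** (-1 / theta)."""
    total = t * np.sum(rates)
    return (1.0 + theta * total) ** (-1.0 / theta)

# Probability that a one-shot device is still functional at inspection time t = 2.0
print(device_survival(2.0, rates=[0.05, 0.02, 0.08], theta=0.4))
```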
57

Bayesian modelling of ultra high-frequency financial data

Shahtahmassebi, Golnaz January 2011
The availability of ultra high-frequency (UHF) data on transactions has revolutionised data processing and statistical modelling techniques in finance. The unique characteristics of such data, e.g. the discrete structure of price changes, unequally spaced time intervals and multiple transactions, have introduced new theoretical and computational challenges. In this study, we develop a Bayesian framework for modelling integer-valued variables to capture the fundamental properties of price change. We propose the application of the zero-inflated Poisson difference (ZPD) distribution for modelling UHF data and assess the effect of covariates on the behaviour of price change. For this purpose, we present two modelling schemes; the first one is based on the analysis of the data after the market closes for the day and is referred to as off-line data processing. In this case, the Bayesian interpretation and analysis are undertaken using Markov chain Monte Carlo methods. The second modelling scheme introduces the dynamic ZPD model, which is implemented through Sequential Monte Carlo methods (also known as particle filters). This procedure enables us to update our inference from data as new transactions take place and is known as online data processing. We apply our models to a set of FTSE100 index changes. Based on the probability integral transform, modified for the case of integer-valued random variables, we show that our models are capable of explaining well the observed distribution of price change. We then apply the deviance information criterion and introduce its sequential version for the purpose of model comparison for off-line and online modelling, respectively. Moreover, in order to add more flexibility to the tails of the ZPD distribution, we introduce the zero-inflated generalised Poisson difference distribution and outline its possible application for modelling UHF data.
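As a small illustration of the distribution named above, the sketch below evaluates a zero-inflated Poisson difference (ZPD) probability mass function as a point mass at zero mixed with a Skellam (difference of two Poissons) distribution, using scipy. The parameter values are illustrative only.

```python
# A minimal sketch of a zero-inflated Poisson difference pmf; pi_zero, mu1 and
# mu2 are illustrative values, not estimates from the study.
import numpy as np
from scipy.stats import skellam

def zpd_pmf(k, pi_zero, mu1, mu2):
    """P(K = k) = pi_zero * 1{k = 0} + (1 - pi_zero) * Skellam(k; mu1, mu2)."""
    k = np.asarray(k)
    base = skellam.pmf(k, mu1, mu2)
    return np.where(k == 0, pi_zero + (1 - pi_zero) * base, (1 - pi_zero) * base)

ticks = np.arange(-3, 4)                     # price change in ticks
print(dict(zip(ticks.tolist(), np.round(zpd_pmf(ticks, 0.4, 1.2, 1.1), 4))))
```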
58

Identification des grands utilisateurs de soins de santé chez les patients souffrant de la douleur chronique non cancéreuse et suivis en soins de première ligne / Identification of heavy health care users among patients with chronic non-cancer pain followed in primary care

Antaky, Elie 03 1900
Contexte: La douleur chronique non cancéreuse (DCNC) génère des retombées économiques et sociétales importantes. L’identification des patients à risque élevé d’être de grands utilisateurs de soins de santé pourrait être d’une grande utilité; en améliorant leur prise en charge, il serait éventuellement possible de réduire leurs coûts de soins de santé. Objectif: Identifier les facteurs prédictifs bio-psycho-sociaux des grands utilisateurs de soins de santé chez les patients souffrant de DCNC et suivis en soins de première ligne. Méthodologie: Des patients souffrant d’une DCNC modérée à sévère depuis au moins six mois et bénéficiant d’une ordonnance valide d’un analgésique par un médecin de famille ont été recrutés dans des pharmacies communautaires du territoire du Réseau universitaire intégré de santé (RUIS) de l’Université de Montréal entre mai 2009 et janvier 2010. Ce dernier est composé des six régions suivantes : Mauricie et centre du Québec, Laval, Montréal, Laurentides, Lanaudière et Montérégie. Les caractéristiques bio-psycho-sociales des participants ont été documentées à l’aide d’un questionnaire écrit et d’une entrevue téléphonique au moment du recrutement. Les coûts directs de santé ont été estimés à partir des soins et des services de santé reçus au cours de l’année précédant et suivant le recrutement et identifiés à partir de la base de données de la Régie d’Assurance maladie du Québec, RAMQ (assureur public de la province du Québec). Ces coûts incluaient ceux des hospitalisations reliées à la douleur, des visites à l’urgence, des soins ambulatoires et de la médication prescrite pour le traitement de la douleur et la gestion des effets secondaires des analgésiques. Les grands utilisateurs des soins de santé ont été définis comme étant ceux faisant partie du quartile le plus élevé de coûts directs annuels en soins de santé dans l’année suivant le recrutement. Des modèles de régression logistique multivariés et le critère d’information d’Akaike ont permis d’identifier les facteurs prédictifs des coûts directs élevés en soins de santé. Résultats: Le coût direct annuel médian en soins de santé chez les grands utilisateurs de soins de santé (63 patients) était de 7 627 CAD et de 1 554 CAD pour les utilisateurs réguliers (188 patients). Le modèle prédictif final du risque d’être un grand utilisateur de soins de santé incluait la douleur localisée au niveau des membres inférieurs (OR = 3,03; 95% CI: 1,20 - 7,65), la réduction de la capacité fonctionnelle liée à la douleur (OR = 1,24; 95% CI: 1,03 - 1,48) et les coûts directs en soins de santé dans l’année précédente (OR = 17,67; 95% CI: 7,90 - 39,48). Les variables «sexe», «comorbidité», «dépression» et «attitude envers la guérison médicale» étaient également retenues dans le modèle prédictif final. Conclusion: Les patients souffrant d’une DCNC au niveau des membres inférieurs et présentant une détérioration de la capacité fonctionnelle liée à la douleur comptent parmi ceux les plus susceptibles d’être de grands utilisateurs de soins et de services. Le coût direct en soins de santé dans l’année précédente était également un facteur prédictif important. Améliorer la prise en charge chez cette catégorie de patients pourrait influencer favorablement leur état de santé et par conséquent les coûts assumés par le système de santé. / Background: Chronic non-cancer pain (CNCP) has major social and economic impacts.
Identifying patients at risk of being heavy health care users could be very useful; by improving their care, direct health care costs could eventually be reduced. Purpose: To identify bio-psycho-social factors predicting the risk of being a heavy health care user among primary care CNCP patients. Methods: Patients reporting moderate to severe CNCP for at least 6 months with an active analgesic prescription from a primary care physician were recruited in community pharmacies on the territory of the Réseau universitaire intégré de santé (RUIS) of the Université de Montréal between May 2009 and January 2010. The latter comprises six areas: Mauricie and centre du Québec, Laval, Montreal, the Laurentians, Lanaudière and Montérégie. Upon recruitment, their bio-psycho-social characteristics were documented through self-administered and telephone questionnaires. The direct health costs were estimated for the health care services provided to patients in the year preceding and following recruitment using the database of the Régie d’Assurance maladie du Québec, RAMQ (Quebec province public health care insurance). These costs took into account the pain-related hospitalizations, emergency room visits, ambulatory care, and medication prescribed for pain treatment and drug side effects. Heavy health care users were defined as those in the highest annual direct health care costs quartile in the year following recruitment. Logistic multivariate regression models using the Akaike information criterion were developed in order to identify the predictors of heavy health care use. Results: The median annual direct health care cost incurred by heavy health care users (n = 63) was CAD 7,627, compared to CAD 1,554 for the standard health care users (n = 188). The final predictive model of the risk of being a heavy health care user included pain located in the lower body (odds ratio (OR) = 3.03; 95% CI: 1.20 - 7.65), pain-related disability (OR = 1.24; 95% CI: 1.03 - 1.48), and health care costs in the previous year (OR = 17.67; 95% CI: 7.90 - 39.48). Other retained variables were sex, comorbidity, depression level, and patients’ attitudes towards medical pain cure. Conclusion: Patients suffering from CNCP in the lower body and having a greater impact of pain on their daily functioning were more likely to be heavy health care and services users. The previous year’s annual direct cost was also a significant predictor. Improving pain management in these patients may improve their health and eventually reduce their costs to the health care system.
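A minimal sketch of the modelling approach summarised above follows: logistic regression models of the risk of being a heavy health care user are compared by AIC. The data are simulated and the variable names are hypothetical stand-ins for the predictors mentioned in the abstract.

```python
# A minimal sketch with simulated data; variable names mirror the predictors
# in the abstract but the coefficients and cohort are entirely hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "lower_body_pain": rng.binomial(1, 0.5, n),
    "pain_disability": rng.normal(5, 2, n),
    "prior_year_cost": rng.gamma(2, 800, n),
})
eta = (-3 + 1.1 * df["lower_body_pain"] + 0.2 * df["pain_disability"]
       + 0.0004 * df["prior_year_cost"])
df["heavy_user"] = rng.binomial(1, 1 / (1 + np.exp(-eta)))

candidates = [
    "heavy_user ~ prior_year_cost",
    "heavy_user ~ lower_body_pain + pain_disability + prior_year_cost",
]
fits = {f: smf.logit(f, data=df).fit(disp=0) for f in candidates}
for formula, res in sorted(fits.items(), key=lambda kv: kv[1].aic):
    print(round(res.aic, 1), formula)                    # lower AIC preferred
```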
59

Mensuração da biomassa e construção de modelos para construção de equações de biomassa / Biomass measurement and model selection for biomass equations

Vismara, Edgar de Souza 07 May 2009
O interesse pela quantificação da biomassa florestal vem crescendo muito nos últimos anos, sendo este crescimento relacionado diretamente ao potencial que as florestas têm em acumular carbono atmosférico na sua biomassa. A biomassa florestal pode ser acessada diretamente, por meio de inventário, ou através de modelos empíricos de predição. A construção de modelos de predição de biomassa envolve a mensuração das variáveis e o ajuste e seleção de modelos estatísticos. A partir de uma amostra destrutiva de 200 indivíduos de dez essências florestais distintas advindos da região de Linhares, ES, foram construídos modelos de predição empíricos de biomassa aérea visando futuro uso em projetos de reflorestamento. O processo de construção dos modelos consistiu de uma análise das técnicas de obtenção dos dados e de ajuste dos modelos, bem como de uma análise dos processos de seleção destes a partir do critério de informação de Akaike (AIC). No processo de obtenção dos dados foram testadas a técnica volumétrica e a técnica gravimétrica, a partir da coleta de cinco discos de madeira por árvore, em posições distintas no lenho. Na técnica gravimétrica, estudou-se diferentes técnicas de composição do teor de umidade dos discos para determinação da biomassa, concluindo-se como a melhor a que utiliza a média aritmética dos discos da base, meio e topo. Na técnica volumétrica, estudou-se diferentes técnicas de composição da densidade do tronco com base nas densidades básicas dos discos, concluindo-se que, em termos de densidade do tronco, a média aritmética das densidades básicas dos cinco discos se mostrou como melhor técnica. Entretanto, quando se multiplica a densidade do tronco pelo volume deste para obtenção da biomassa, a utilização da densidade básica do disco do meio se mostrou superior a todas as técnicas. A utilização de uma densidade básica média da espécie para determinação da biomassa, via técnica volumétrica, se apresentou como uma abordagem inferior a qualquer técnica que utiliza informação da densidade do tronco das árvores individualmente. Por fim, sete modelos de predição de biomassa aérea de árvores considerando seus diferentes compartimentos foram ajustados, a partir das funções de Spurr e Schumacher-Hall, com e sem a inclusão da altura como variável preditora. Destes modelos, quatro eram gaussianos e três eram lognormais. Estes mesmos sete modelos foram ajustados incluindo a medida de penetração como variável preditora, totalizando quatorze modelos testados. O modelo de Schumacher-Hall se mostrou, de maneira geral, superior ao modelo de Spurr. A altura só se mostrou efetiva na explicação da biomassa das árvores quando em conjunto com a medida de penetração. Os modelos selecionados foram do grupo que incluiu a medida de penetração no lenho como variável preditora e, exceto o modelo de predição da biomassa de folhas, todos se mostraram adequados para aplicação na predição da biomassa aérea em áreas de reflorestamento. / Forest biomass measurement implies a destructive procedure; thus forest inventories and biomass surveys apply indirect procedures for the determination of the biomass of the different components of the forest (wood, branches, leaves, roots, etc.). The usual approach consists in taking a destructive sample for the measurement of tree attributes, and an empirical relationship is established between the biomass and other attributes that can be directly measured on standing trees, e.g., stem diameter and tree height.
The biomass determination of felled trees can be achieved by two techniques: the gravimetric technique, which weighs the components in the field and takes a sample for the determination of water content in the laboratory; and the volumetric technique, which determines the volume of the component in the field and takes a sample for the determination of the wood specific gravity (wood basic density) in the laboratory. The gravimetric technique applies to all components of the trees, while the volumetric technique is usually restricted to the stem and large branches. In this study, these two techniques are studied in a sample of 200 trees of 10 different species from the region of Linhares, ES. In each tree, 5 cross-sections of the stem were taken to investigate the best procedure for the determination of water content in the gravimetric technique and for the determination of the wood specific gravity in the volumetric technique. Also, the Akaike Information Criterion (AIC) was used to compare different statistical models for the prediction of tree biomass. For the stem water content determination, the best procedure was the arithmetic mean of the water content from the cross-sections at the base, middle and top of the stem. In the determination of wood specific gravity, the best procedure was the arithmetic mean of all five cross-section discs of the stem; however, for the determination of the biomass, i.e., the product of stem volume and wood specific gravity, the best procedure was the use of the middle stem cross-section disc wood specific gravity. The use of an average wood specific gravity by species showed worse results than any procedure that used information on wood specific gravity at the individual tree level. Seven models, as variations of the Spurr and Schumacher-Hall volume equation models, were tested for the different tree components: wood (stem and large branches), small branches, leaves and total biomass. In general, Schumacher-Hall models were better than Spurr-based models, and models that included only diameter (DBH) information performed better than models with diameter and height measurements. When a measure of penetration in the wood, as a surrogate of wood density, was added to the models, the models with the three variables (diameter, height and penetration) became the best models.
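As a hedged illustration of the model selection described above, the sketch below fits log-linear Spurr and Schumacher-Hall biomass equations to simulated tree data and ranks them by AIC. The data-generating values are arbitrary and the penetration covariate used in the thesis is omitted.

```python
# A minimal sketch on synthetic data: log-linear Spurr and Schumacher-Hall
# biomass equations compared by AIC; all coefficients are arbitrary.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
dbh = rng.uniform(5, 50, n)                          # diameter at breast height (cm)
height = 1.3 + 0.8 * dbh ** 0.7 + rng.normal(0, 1, n)
biomass = np.exp(-2.0 + 2.3 * np.log(dbh) + 0.6 * np.log(height)
                 + rng.normal(0, 0.2, n))            # synthetic biomass (kg)
df = pd.DataFrame({"B": biomass, "D": dbh, "H": height})
df["D2H"] = df["D"] ** 2 * df["H"]                   # Spurr combined variable

models = {
    "Spurr (log form)": "np.log(B) ~ np.log(D2H)",
    "Schumacher-Hall": "np.log(B) ~ np.log(D) + np.log(H)",
}
for name, formula in models.items():
    res = smf.ols(formula, data=df).fit()
    print(name, "AIC =", round(res.aic, 1))          # same response, so AICs are comparable
```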
60

Μελέτη της σχέσης μεταξύ δείκτη εμπιστοσύνης του καταναλωτή και χρηματιστηριακών αποδόσεων στα ευρωπαϊκά χρηματιστήρια / A study of the relationship between the consumer confidence index and stock market returns in European stock exchanges

Πάκου, Αντωνία 07 January 2009
Στην παρούσα εργασία μελετούμε τη σχέση μεταξύ χρηματιστηριακών αποδόσεων και δείκτη εμπιστοσύνης στις 27 χώρες-μέλη της ΕΕ για τα έτη 1985-2006. Βρήκαμε ότι για το μεγαλύτερο μέρος των χωρών της ΕΕ εμφανίζεται θετική συσχέτιση μεταξύ αποδόσεων και δείκτη εμπιστοσύνης του καταναλωτή στον βραχυχρόνιο ορίζοντα. Οι μεταβολές και στους δύο δείκτες τείνουν να κινούνται παράλληλα στην ίδια περίοδο, με εξαίρεση την πλειοψηφία των νεοεισελθέντων χωρών. Στον μακροπρόθεσμο ορίζοντα, βρήκαμε ότι για τις περισσότερες χώρες ο συντελεστής γίνεται σχεδόν μηδενικός. Για το μεγαλύτερο μέρος των χωρών της ΕΕ υφίσταται σχέση αιτιότητας μεταξύ των μεταβλητών, με τις αποδόσεις να προκαλούν κατά Granger τον δείκτη εμπιστοσύνης του καταναλωτή και τον δείκτη οικονομικής εμπιστοσύνης, αλλά το αντίστροφο δεν ισχύει. Αμφίδρομη σχέση αιτιότητας μεταξύ αποδόσεων και εμπιστοσύνης των καταναλωτών παρατηρείται μόνο για την Γαλλία οριακά, ενώ για την ΕΕ βρήκαμε ότι υπάρχει αμφίδρομη σχέση αιτιότητας μεταξύ αποδόσεων και δείκτη οικονομικής εμπιστοσύνης. / This paper studies the relationship between stock market returns and confidence indices for the 27 EU member countries over the years 1985-2006. We found that for the majority of the EU countries a positive correlation exists between stock returns and the confidence indicators (consumer confidence indicator and economic sentiment indicator) in the short horizon. The changes in these indexes tend to move in the same direction contemporaneously and in the short horizon (of 1 month), with the new EU members being an exception. The correlation becomes almost zero in the long horizon. For most of the EU countries there is causality between the variables: stock returns in general Granger-cause the Consumer Confidence Index and the Economic Sentiment Indicator, but not vice versa. We also found that there is a feedback causality relationship between stock returns and confidence for France (marginally) and for the EU as a whole.
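As an illustration of the causality analysis described above, the sketch below runs Granger-causality tests on simulated monthly series in which returns lead confidence. The data are synthetic and the lag length is an arbitrary choice, not the specification used in the study.

```python
# A minimal sketch on synthetic monthly data where lagged returns drive the
# confidence index; the first column of the frame is the variable being
# predicted, the second the candidate cause.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(4)
n = 264                                                  # roughly 22 years of months
returns = rng.normal(0, 1, n)
confidence = np.zeros(n)
for t in range(1, n):
    confidence[t] = 0.7 * confidence[t - 1] + 0.3 * returns[t - 1] + rng.normal(0, 0.5)

data = pd.DataFrame({"confidence": confidence, "returns": returns})
# H0: lagged 'returns' do not help predict 'confidence'
grangercausalitytests(data[["confidence", "returns"]], maxlag=3)
```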
