101

Três ensaios sobre política monetária e crédito

Barbi, Fernando Carvalhaes 08 April 2014 (has links)
The first essay, 'Determinants of Credit Expansion in Brazil', analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of factors ranging from macroeconomic stability and the abundant liquidity in international financial markets before 2008 to a set of deliberate decisions taken by President Lula to expand credit, boost consumption and gain political support from the lower social strata. Our main conclusions are that credit expansion relied on the reduction of the monetary policy rate, that international financial markets are an important source of funds, and that payroll-guaranteed credit and investment-grade status positively affected credit supply. We were not able to confirm the importance of financial inclusion efforts. The importance of financial-sector soundness indicators for credit conditions should not be underestimated. These results raise questions about the sustainability of this expansion process and about financial stability in the future.

The second essay, 'Public Credit, Monetary Policy and Financial Stability', discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers collapse. It was later transformed into a driver of economic growth as well as a regulatory device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates and may induce financial instability. Another effect is that it prevents the development of market-based credit solutions. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool.

The third essay, 'Bayesian Forecasting of Interest Rates: Do Priors Matter?', discusses the choice of priors when forecasting short-term interest rates. Central banks that commit to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way, by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, rendering the monetary policy response non-linear. Moreover, there is no guarantee that coefficients remain stable over time. Central banks may also switch to a dual-target regime that considers deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore have to consider a non-linear model with time-varying parameters. This paper uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective we focus on an augmented version of the Taylor rule that includes the real exchange rate, the credit-to-GDP ratio and the net-public-debt-to-GDP ratio; we also take an 'atheoretic' approach based on the expectations theory of the term structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy, yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can easily be extended to other key macroeconomic indicators.
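For reference, the Taylor rule underlying the third essay can be stated compactly, together with a sketch of the augmented specification it describes; the coefficient symbols and the linear form of the augmentation are illustrative assumptions, not the thesis's exact specification.

```latex
% Classic Taylor (1993) rule: the policy rate responds to inflation
% and to the output gap.
i_t = r^* + \pi_t + \alpha\,(\pi_t - \pi^*) + \beta\,(y_t - y_t^*)

% Augmented rule in the spirit of the third essay (illustrative):
i_t = r^* + \pi_t + \alpha\,(\pi_t - \pi^*) + \beta\,(y_t - y_t^*)
      + \gamma\,\mathrm{RER}_t
      + \delta\,(\mathrm{Credit}/\mathrm{GDP})_t
      + \theta\,(\mathrm{NetDebt}/\mathrm{GDP})_t + \varepsilon_t
```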
102

Vliv zaměstnání studenta na akademické výsledky: meta-analýza / The Impact of Student Employment on Educational Outcomes: A Meta-Analysis

Kroupová, Kateřina January 2021 (has links)
Despite the extensive body of empirical research, the discussion on whether student employment impedes or improves educational outcomes has not been resolved. Using meta-analytic methods, we conduct a quantitative review of 861 effect estimates collected from 69 studies describing the relationship between student work experience and academic performance. After outlining the theoretical mechanisms and methodological challenges of estimating the effect, we test whether publication bias permeates the literature concerning the educational implications of student employment. We find that researchers report negative estimates more often than they should. However, this negative publication bias is not present in the subset of studies controlling for the endogeneity of students' decision to take up employment. Furthermore, after correcting for the negative publication bias, we find that the student employment-education relationship is close to zero. Additionally, we examine the heterogeneity of the estimates using Bayesian Model Averaging. Our analysis suggests that employment intensity and controlling for students' permanent characteristics are the most important factors in explaining the heterogeneity. In particular, working long hours results in systematically more negative effect estimates than not working at...
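The abstract does not spell out the publication-bias test, but the workhorse in this literature is the funnel-asymmetry / precision-effect (FAT-PET) regression of reported estimates on their standard errors; a minimal sketch on simulated data (all variable names and numbers are illustrative, not taken from the thesis):

```python
import numpy as np
import statsmodels.api as sm

# FAT-PET: estimate_i = beta0 + beta1 * SE_i + error_i.
# beta1 != 0 signals funnel asymmetry (publication bias);
# beta0 estimates the bias-corrected underlying effect.
rng = np.random.default_rng(0)
se = rng.uniform(0.01, 0.5, size=200)          # reported standard errors
effect = -0.02 - 0.8 * se + rng.normal(0, se)  # simulated selective reporting

fatpet = sm.OLS(effect, sm.add_constant(se)).fit(cov_type="HC1")
print(fatpet.params)  # [corrected effect beta0, bias term beta1]
```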
103

Probabilistic Ensemble-based Streamflow Forecasting Framework

Darbandsari, Pedram January 2021 (has links)
Streamflow forecasting is a fundamental component of various water resources management systems, ranging from flood control and mitigation to long-term planning of irrigation and hydropower systems. In the context of floods, a probabilistic forecasting system is required for proper and effective decision-making. Therefore, the primary goal of this research is the development of an advanced ensemble-based streamflow forecasting framework to better quantify the predictive uncertainty and generate enhanced probabilistic forecasts. This research started by comprehensively evaluating the performance of various lumped conceptual models in data-poor watersheds and comparing various Bayesian Model Averaging (BMA) modifications for probabilistic streamflow simulation. Then, using the concept of BMA, two novel probabilistic post-processing approaches were developed to enhance streamflow forecasting performance. The combination of entropy theory and the BMA method leads to an entropy-based Bayesian Model Averaging (En-BMA) approach for enhanced probabilistic streamflow and precipitation forecasting. Also, the integration of the Hydrologic Uncertainty Processor (HUP) and BMA methods is proposed for probabilistic post-processing of multi-model streamflow forecasts. Results indicated that the MACHBV and GR4J models are highly competent in simulating hydrological processes within data-scarce watersheds; however, the presence of lower-skill hydrologic models is still beneficial for ensemble-based streamflow forecasting. The comprehensive verification of the BMA approach in terms of streamflow predictions identified the merits of implementing some of the previously recommended modifications and showed the importance of possessing a mutually exclusive and collectively exhaustive ensemble. By targeting the remaining limitation of the BMA approach, the proposed En-BMA method can improve probabilistic streamflow forecasting, especially under high-flow conditions. Also, the proposed HUP-BMA approach takes advantage of both the HUP and BMA methods to better quantify the hydrologic uncertainty. Moreover, the applicability of the modified En-BMA as a more robust post-processing approach for precipitation forecasting, compared to BMA, has been demonstrated. / Thesis / Doctor of Philosophy (PhD) / Possessing a reliable streamflow forecasting framework is of special importance in various fields of operational water resources management, non-structural flood mitigation in particular. Accurate and reliable streamflow forecasts lead to the best possible advance flood-control decisions, which can significantly reduce the consequent loss of life and property. The main objective of this research is to develop an enhanced ensemble-based probabilistic streamflow forecasting approach through proper quantification of predictive uncertainty using an ensemble of streamflow forecasts. The key contributions are: (1) implementing multiple diverse forecasts with full coverage of future possibilities in the Bayesian ensemble-based forecasting method to produce more accurate and reliable forecasts; and (2) developing an ensemble-based Bayesian post-processing approach that enhances hydrologic uncertainty quantification by taking advantage of multiple forecasts and the initial flow observation. The findings of this study are expected to benefit streamflow forecasting, flood control and mitigation, and water resources management and planning.
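The BMA machinery referred to above treats the forecast density as a weighted mixture of the member models' predictive densities; a minimal sketch with Gaussian members (in practice the weights and spreads are fitted by EM on a calibration period; every number below is invented for illustration):

```python
import numpy as np

# BMA predictive density: p(q) = sum_k w_k * N(q; f_k, s_k^2),
# where f_k is member k's forecast and w_k its posterior weight.
def bma_pdf(q_grid, forecasts, weights, sigmas):
    pdf = np.zeros_like(q_grid, dtype=float)
    for f, w, s in zip(forecasts, weights, sigmas):
        pdf += w * np.exp(-0.5 * ((q_grid - f) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return pdf

forecasts = np.array([120.0, 135.0, 128.0])  # member forecasts (m^3/s)
weights   = np.array([0.5, 0.2, 0.3])        # posterior model weights (sum to 1)
sigmas    = np.array([8.0, 12.0, 9.0])       # member predictive spreads

mean = weights @ forecasts                            # BMA predictive mean
var  = weights @ (sigmas**2 + (forecasts - mean)**2)  # mixture variance
grid = np.linspace(80.0, 180.0, 500)
pdf  = bma_pdf(grid, forecasts, weights, sigmas)      # full predictive density
```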
104

Evaluation psychometrischer Methoden zur Verbesserung der Diagnostik der Zwangsstörung

Schulze, Daniel 16 February 2022 (has links)
Obsessive-compulsive disorder (OCD) is characterized by heterogeneous symptoms. As with many clinical phenomena, this heterogeneity poses challenges to psychological assessment. The Yale-Brown Obsessive Compulsive Scale (Y-BOCS) has developed into the gold standard over the past three decades. Using a large sample of patients suffering from OCD, we worked towards three goals in order to advance psychometric methods. First, we evaluated and improved the measurement models associated with OCD symptoms by means of an exhaustive overhaul using Bayesian measurement modelling. Second, we tested whether the Y-BOCS's measurement properties are stable across time and other features relevant to clinical research. In a broad analysis of measurement invariance (MI), we found only a few instances where MI did not hold. Under such circumstances, group comparisons may be biased and conclusions misleading. For such situations, we thirdly derived and applied a procedure extending partial MI modelling. In partial MI models, group comparisons remain valid even if MI holds only for a few items. We developed a model-averaging approach that appropriately reflects the uncertainty stemming from the choice of items for partial MI models. The developed Bayesian procedure makes decisions made during the analysis fully transparent and thus open to discussion. The presented studies form a research program that advances psychometric analyses in clinical assessment and increases the validity of the assessment of OCD by means of the Y-BOCS. Clinical trials require such sound measurements in order to provide trustworthy conclusions. Furthermore, we discuss other recent advances in the field of psychometrics and their usefulness for clinical research as a whole and for the understanding of OCD specifically.
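One simple way to picture the model-averaging step (only a stand-in for the fully Bayesian procedure developed in the thesis) is to weight candidate partial-MI models by approximate posterior probabilities derived from their BICs; the model names and BIC values below are hypothetical:

```python
import numpy as np

# Approximate posterior model probabilities from BICs,
# assuming equal prior odds: w_m is proportional to exp(-0.5*(BIC_m - min BIC)).
def bic_weights(bics):
    d = np.asarray(bics, dtype=float)
    w = np.exp(-0.5 * (d - d.min()))
    return w / w.sum()

# Hypothetical partial-MI candidates, each freeing a different item:
bics = {"full MI": 10480.2, "free item 3": 10471.9, "free item 7": 10474.5}
for name, w in zip(bics, bic_weights(list(bics.values()))):
    print(f"{name}: {w:.2f}")
# Group-difference estimates would then be averaged with these weights,
# propagating the uncertainty about which items violate invariance.
```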
105

Pojednání o empirické finanční ekonomii / Essays in Empirical Financial Economics

Žigraiová, Diana January 2018 (has links)
This dissertation is composed of four essays that empirically investigate three topics in financial economics: financial stress and its leading indicators, the relationship between bank competition and financial stability, and the link between management board composition and bank risk. In the first essay we examine which variables have predictive power for financial stress in 25 OECD countries, using a recently constructed financial stress index (FSI). We find that panel models can hardly explain FSI dynamics. Although better results are achieved in country models, our findings suggest that financial stress is hard to predict out-of-sample despite the reasonably good in-sample performance of the models. The second essay develops an early warning framework for assessing systemic risks and predicting systemic events over two horizons of different length on a panel of 14 countries. We build a financial stress index to identify the starting dates of systemic financial crises and select crisis-leading indicators in a two-step approach: we find relevant prediction horizons for each indicator and employ Bayesian model averaging to identify the most useful predictors. We find superior performance of the long-horizon model for the Czech Republic. The theoretical literature gives conflicting predictions on how bank...
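As an illustration of the index-construction step, a financial stress index is commonly built by standardizing sub-indices and averaging them; the components, weights and episode-dating rule below are illustrative assumptions, not the construction used in the essays:

```python
import numpy as np
import pandas as pd

# Standardize stress components and average them into a single index.
def stress_index(components: pd.DataFrame, weights=None) -> pd.Series:
    z = (components - components.mean()) / components.std()
    w = np.full(z.shape[1], 1 / z.shape[1]) if weights is None else np.asarray(weights)
    return z @ (w / w.sum())

rng = np.random.default_rng(4)
idx = pd.date_range("2000-01-01", periods=120, freq="MS")  # 10 years, monthly
comp = pd.DataFrame(rng.normal(size=(120, 3)), index=idx,
                    columns=["bank_beta", "fx_pressure", "stock_volatility"])
fsi = stress_index(comp)
# A simple rule for dating stress episodes: index above its 90th percentile.
episodes = fsi[fsi > fsi.quantile(0.9)]
```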
106

Macroeconometrics with high-dimensional data

Zeugner, Stefan 12 September 2012 (has links)
CHAPTER 1: The default g-priors predominant in Bayesian Model Averaging tend to over-concentrate posterior mass on a tiny set of models - a feature we denote the 'supermodel effect'. To address it, we propose a 'hyper-g' prior specification, whose data-dependent shrinkage adapts posterior model distributions to data quality. We demonstrate the asymptotic consistency of the hyper-g prior and its interpretation as a goodness-of-fit indicator. Moreover, we highlight the similarities between hyper-g and 'Empirical Bayes' priors, and introduce closed-form expressions essential to computational feasibility. The robustness of the hyper-g prior is demonstrated via simulation analysis and by comparing four vintages of economic growth data.

CHAPTER 2: Ciccone and Jarocinski (2010) show that inference in Bayesian Model Averaging (BMA) can be highly sensitive to small data perturbations. In particular, they demonstrate that the importance attributed to potential growth determinants varies tremendously over different revisions of international income data. They conclude that 'agnostic' priors appear too sensitive for this strand of growth empirics. In response, we show that the instability found owes much to a specific BMA set-up: first, comparing the same countries over data revisions improves robustness; second, much of the remaining variation can be reduced by applying an equally 'agnostic' but flexible prior.

CHAPTER 3: This chapter explores the link between the leverage of the US financial sector, of households and of non-financial businesses, and real activity. We document that leverage is negatively correlated with the future growth of real activity, and positively linked to the conditional volatility of future real activity and of equity returns. The joint information in sectoral leverage series is more relevant for predicting future real activity than the information contained in any individual leverage series. Using in-sample regressions and out-of-sample forecasts, we show that the predictive power of leverage is roughly comparable to that of macro and financial predictors commonly used by forecasters. Leverage information would not have allowed one to predict the 'Great Recession' of 2008-2009 any better than conventional macro/financial predictors.

CHAPTER 4: Model averaging has proven popular for inference with many potential predictors in small samples. However, it is frequently criticized for a lack of robustness with respect to prediction and inference. This chapter explores the reasons for such robustness problems and proposes to address them by transforming the subset of potential 'control' predictors into principal components in suitable datasets. A simulation analysis shows that this approach yields robustness advantages over both standard model averaging and principal-component-augmented regression. Moreover, we devise a prior framework that extends model averaging to uncertainty over the set of principal components and show that it offers considerable improvements in the robustness of estimates and inference about the importance of covariates. Finally, we empirically benchmark our approach against popular model averaging and PC-based techniques in evaluating financial indicators as alternatives to established macroeconomic predictors of real economic activity. / Doctorat en Sciences économiques et de gestion
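The 'supermodel effect' of Chapter 1 is easiest to see by computing posterior model probabilities under Zellner's g-prior with a fixed g; a minimal numpy sketch using the standard closed form of Liang et al. (2008) on simulated data (the fixed unit-information g below is an assumption; the hyper-g prior instead places a hyperprior on g):

```python
import numpy as np
from itertools import combinations

# Log marginal likelihood under Zellner's g-prior, up to a constant
# shared by all models (Liang et al., 2008):
# log p(y|M) = ((n-1-k)/2) log(1+g) - ((n-1)/2) log(1 + g(1 - R^2)).
def log_ml(y, X, g):
    n, k = X.shape
    Xc = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 4))
y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=80)

g = 80.0  # unit-information prior, g = n
models = [m for r in range(1, 5) for m in combinations(range(4), r)]
lml = np.array([log_ml(y, X[:, list(m)], g) for m in models])
post = np.exp(lml - lml.max())
post /= post.sum()
print(models[int(post.argmax())], round(float(post.max()), 3))
# With a large fixed g, posterior mass piles onto few models; hyper-g
# shrinkage adapts this concentration to data quality.
```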
107

Evaluation économique des aires marines protégées : apports méthodologiques et applications aux îles Kuriat (Tunisie) / Economic valuation of marine protected areas : methodological perspectives and empirical applications to Kuriat Islands (Tunisia)

Mbarek, Marouene 16 December 2016 (has links)
The protection of marine natural resources is a major challenge for policy makers. The recent development of marine protected areas (MPAs) contributes to these preservation goals. MPAs aim to conserve marine and coastal ecosystems while promoting human activities. The complexity of these objectives makes them difficult to reach. The purpose of this work is to conduct an ex-ante analysis of a proposed MPA in the Kuriat Islands (Tunisia). This analysis is an aid to decision-makers for better governance, integrating the actors involved (fishermen, visitors, boaters) in the management process. To do this, we apply the contingent valuation method (CVM) to samples of fishermen and visitors to the Kuriat Islands. We address the treatment of selection and sampling bias and the uncertainty about the specification of econometric models in the implementation of the CVM. We use the HeckitBMA model, a combination of the Heckman (1979) model and Bayesian inference, to calculate fishermen's willingness to accept. We also use the zero-inflated ordered probit (ZIOP) model, which combines a binary probit with an ordered probit, to calculate visitors' willingness to pay after correcting the sample by multiple imputation. Our results show that the groups of actors are distinguished by their activity and economic situation, which leads them to have different perceptions. This allows policy-makers to design a compensation scheme to indemnify the actors who have been harmed.
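The selection-correction ingredient of HeckitBMA is the classic Heckman (1979) two-step estimator; a self-contained sketch on simulated data (the data-generating process and all variable names are invented for illustration, and statsmodels has no built-in Heckman routine, so the two steps are coded by hand):

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 500
z = rng.normal(size=(n, 2))                      # selection covariates
u, e = rng.multivariate_normal([0, 0], [[1, .6], [.6, 1]], n).T
selected = (0.5 + z @ np.array([1.0, -0.8]) + u) > 0   # e.g. answered the survey
wta = 2.0 + 1.5 * z[:, 0] + e                    # latent willingness to accept

# Step 1: probit for selection, then the inverse Mills ratio.
probit = sm.Probit(selected.astype(float), sm.add_constant(z)).fit(disp=0)
xb = probit.fittedvalues                         # linear index z'gamma
mills = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected subsample, augmented with the Mills ratio.
X2 = sm.add_constant(np.column_stack([z[selected, 0], mills[selected]]))
step2 = sm.OLS(wta[selected], X2).fit()
print(step2.params)  # [intercept, slope, rho*sigma coefficient on Mills ratio]
```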
108

Fusion pour la séparation de sources audio / Fusion for audio source separation

Jaureguiberry, Xabier 16 June 2015 (has links)
Underdetermined blind source separation is a complex mathematical problem that can be satisfactorily solved for some practical applications, provided that the right separation method has been selected and carefully tuned. In order to automate this selection process, we propose in this thesis to resort to the principle of fusion, which has been widely used in the related field of classification yet is still only marginally exploited in source separation. Fusion consists in combining several methods to solve a given problem instead of selecting a single one. To do so, we introduce a general fusion framework in which a source estimate is expressed as a linear combination of estimates of this same source given by different separation algorithms, each source estimate being weighted by a fusion coefficient. For a given task, fusion coefficients can then be learned on a representative training dataset by minimizing a cost function related to the separation objective. To go further, we also propose two ways to adapt the fusion coefficients to the mixture to be separated. The first expresses the fusion of several non-negative matrix factorization (NMF) models in a Bayesian fashion similar to Bayesian model averaging. The second learns time-varying fusion coefficients with deep neural networks. All proposed methods have been evaluated on two distinct corpora, one dedicated to speech enhancement and the other to singing-voice extraction. Experimental results show that fusion always outperforms simple selection in all considered cases, with the best results obtained by adaptive time-varying fusion with neural networks.
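In its simplest time-invariant form, the fusion framework reduces to fitting non-negative combination weights on training data by least squares; a toy sketch (the thesis's actual cost functions, its Bayesian NMF variant and its neural time-varying variant are all richer than this):

```python
import numpy as np
from scipy.optimize import nnls

# Fuse K source estimates s_k into s_hat = sum_k alpha_k * s_k, learning
# non-negative alphas against the true source on a training set.
def learn_fusion_weights(estimates, reference):
    """estimates: (K, T) separated signals; reference: (T,) true source."""
    A = np.vstack(estimates).T       # T x K design matrix
    alpha, _ = nnls(A, reference)    # min ||A @ alpha - reference||, alpha >= 0
    return alpha

rng = np.random.default_rng(3)
truth = rng.normal(size=10_000)      # training-set source signal
estimates = [truth + rng.normal(0, s, 10_000) for s in (0.3, 0.6, 1.0)]

alpha = learn_fusion_weights(estimates, truth)
fused = np.vstack(estimates).T @ alpha
# The fused error is typically below that of any single estimate.
print(alpha, np.mean((fused - truth) ** 2))
```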
109

Structure learning of Bayesian networks via data perturbation / Aprendizagem estrutural de Redes Bayesianas via perturbação de dados

Gross, Tadeu Junior 29 November 2018 (has links)
Structure learning of Bayesian networks (BNs) is an NP-hard problem, and the use of sub-optimal strategies is essential in domains involving many variables. One such strategy is to generate multiple approximate structures and then reduce the ensemble to a representative structure. It is possible to use the occurrence frequency (over the ensemble of structures) as the criterion for accepting a dominant directed edge between two nodes and thus obtain a single structure. In this doctoral research, an analogy with an adapted one-dimensional random walk was used to deduce analytically an appropriate decision threshold for this occurrence frequency. The closed-form expression obtained was validated across benchmark datasets, using the Matthews correlation coefficient as the performance metric. In experiments on a recent medical dataset, the BN resulting from the analytical cutoff frequency captured the expected associations among nodes and also achieved better prediction performance than BNs learned with thresholds neighbouring the computed one.
In the literature, the feature counted across the perturbed structures has been undirected edges rather than directed edges (arcs), as in this thesis. This modified strategy was also applied to a dataset of elderly subjects to identify potential relationships between variables of medical interest, but using a threshold higher than the one predicted by the proposed formula; such prudence is due to the possible social implications of the findings. The motivation behind this application is that, although the proportion of elderly individuals in the population has increased substantially in the last few decades, the risk factors that should be managed in advance to ensure a natural process of mental decline due to ageing remain unknown. In the learned structural model, the probabilistic dependence mechanism between two variables of medical interest was investigated graphically: the suspected risk factor known as metabolic syndrome and the indicator of mental decline referred to as cognitive impairment. This investigation employed the concept known in the context of BNs as d-separation. The study revealed that the dependence between metabolic syndrome and the cognitive variables indeed exists and depends on both body mass index and age.
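The perturbation-and-count loop at the heart of the approach can be sketched as follows; the structure 'learner' here is a deliberately naive correlation-based stand-in, and the 0.5 cutoff is a placeholder for the threshold the thesis derives analytically from its random-walk argument:

```python
import numpy as np
from collections import Counter

# Learn one structure per bootstrap resample, count how often each
# directed edge (arc) occurs, and keep arcs above a decision threshold.
def arc_frequencies(data, learn_structure, n_resamples=100, seed=0):
    rng = np.random.default_rng(seed)
    counts = Counter()
    n = data.shape[0]
    for _ in range(n_resamples):
        boot = data[rng.integers(0, n, size=n)]   # resample rows with replacement
        for arc in learn_structure(boot):         # arc = (parent, child)
            counts[arc] += 1
    return {arc: c / n_resamples for arc, c in counts.items()}

def toy_learner(d):
    # Naive stand-in: propose i -> j when |corr| is strong, oriented by index.
    c = np.corrcoef(d.T)
    p = c.shape[0]
    return [(i, j) for i in range(p) for j in range(i + 1, p) if abs(c[i, j]) > 0.3]

X = np.random.default_rng(1).normal(size=(200, 4))
X[:, 1] += 0.9 * X[:, 0]                          # induce a 0 -> 1 dependence
freqs = arc_frequencies(X, toy_learner)
threshold = 0.5                                    # placeholder for the closed form
print({a: f for a, f in freqs.items() if f >= threshold})
```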
