441

Caractérisation du rôle des oscillations à haute fréquence dans les réseaux épileptiques / Characterization of the role of high-frequency oscillations in epileptic networks

Roehri, Nicolas 16 January 2018
Touchant plus de 50 millions de personnes dans le monde, l’épilepsie est un problème majeur de santé publique. Un tiers des patients souffrent d’épilepsie pharmaco-résistante. Une chirurgie visant à enlever la région cérébrale à l’origine des crises – la zone épileptogène – est considérée comme l’option de référence pour rendre ces patients libres de crises. Le taux d’échec chirurgical non négligeable a poussé à la recherche d’autres marqueurs. Les oscillations à haute fréquence (HFOs) constituent un marqueur potentiel. Une HFO est une brève oscillation entre 80 et 500 Hz, durant au moins 4 périodes, enregistrée en EEG intracérébrale. Du fait de leur caractère très bref, le marquage visuel de ces petites oscillations est fastidieux et chronophage. Il semble impératif de trouver un moyen de détecter automatiquement ces oscillations pour étudier les HFOs sur des cohortes de patients. Aucun détecteur automatique existant ne fait cependant l’unanimité. Durant cette thèse, nous avons développé un nouveau moyen de visualiser les HFOs grâce à une normalisation originale de la transformée en ondelettes pour ensuite mieux les détecter automatiquement. Puis, nous avons mis en place une stratégie pour caractériser et valider des détecteurs. Enfin, nous avons appliqué le nouveau détecteur à une cohorte de patients pour déterminer la fiabilité des HFOs et des pointes épileptiques – le marqueur standard – dans la prédiction de la zone épileptogène. La conclusion de cette thèse est que les HFOs ne sont pas meilleures que les pointes épileptiques pour prédire la zone épileptogène, mais que combiner ces deux marqueurs permet d’obtenir un marqueur plus robuste. / Epilepsy is a major health problem, affecting 50 million people worldwide. One third of patients are resistant to medication. Surgical removal of the brain areas generating the seizures – the epileptogenic zone – is considered the standard option for making these patients seizure free. The non-negligible rate of surgical failure has led researchers to seek other electrophysiological markers. One putative marker is high-frequency oscillations (HFOs). An HFO is a brief oscillation between 80 and 500 Hz, lasting at least 4 periods, recorded in intracerebral EEG. Due to their short-lasting nature, visual marking of these small oscillations is tedious and time-consuming. Automatic detection therefore seems an imperative step for studying HFOs in patient cohorts. There is, however, no general agreement on existing detectors. In this thesis, we developed a new way of representing HFOs thanks to a novel normalisation of the wavelet transform and used this representation as a basis for detecting HFOs automatically. We then designed a strategy to properly characterise and validate automated detectors. Finally, using the validated detector, we characterised in a cohort of patients the reliability of HFOs and epileptic spikes – the standard marker – as predictors of the epileptogenic zone. The conclusion of this thesis is that HFOs are not better than epileptic spikes at predicting the epileptogenic zone, but combining the two leads to a more robust biomarker.
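The detection constraints stated in this abstract (an 80–500 Hz band, at least 4 oscillation periods) can be made concrete with a generic envelope-threshold detector. The sketch below is not the thesis's wavelet-normalisation method, only a minimal Python illustration; the filter order, threshold rule and function name are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_hfo_candidates(x, fs, band=(80.0, 500.0), n_sd=3.0, min_cycles=4):
    """Flag candidate HFO events: intervals where the band-limited envelope
    stays above a global threshold for at least `min_cycles` periods of the
    band's geometric centre frequency. `fs` must exceed 1 kHz."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xf = filtfilt(b, a, x)                     # zero-phase band-pass, 80-500 Hz
    env = np.abs(hilbert(xf))                  # instantaneous amplitude
    thr = np.median(env) + n_sd * env.std()    # simple threshold (assumption)
    min_len = int(min_cycles * fs / np.sqrt(band[0] * band[1]))
    events, start = [], None
    for i, above in enumerate(env > thr):
        if above and start is None:
            start = i
        elif not above and start is not None:
            if i - start >= min_len:           # lasts >= 4 oscillation periods
                events.append((start / fs, i / fs))
            start = None
    if start is not None and len(env) - start >= min_len:
        events.append((start / fs, len(env) / fs))
    return events                              # list of (onset_s, offset_s)
```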
442

Auslegung, Entwicklung und Inbetriebnahme eines longitudinalen und transversalen Feedbacksystems zur Dämpfung gekoppelter Teilchenpaket-Instabilitäten im BESSY-II-Speicherring

Knuth, Thomas 25 July 2000
Das Auftreten kohärenter Schwingungen gekoppelter Teilchenpakete führt in modernen Elektronenspeicherringanlagen zu einer wesentlichen Beeinträchtigung ihrer Leistungsfähigkeit als Synchrotronstrahlungsquelle. Diese Instabilitäten können in longitudinaler und transversaler Richtung auftreten und führen neben der Verminderung der Brillanz des Synchrotronlichts im ungünstigsten Fall zum Strahlverlust. Das wirkungsvollste Instrument zur Beherrschung der Instabilitäten ist ein Rückkoppelsystem (Feedbacksystem), welches die angeregten Schwingungsamplituden detektiert und dämpft. Diese Arbeit beschäftigt sich mit der Auslegung, der Entwicklung und der Inbetriebnahme zweier von ihrem Aufbau her völlig unterschiedlicher Systeme zur Korrektur von longitudinalen Synchrotronschwingungen und transversalen Betatronschwingungen. Dabei schließt das transversale Feedbacksystem sowohl die horizontale als auch die vertikale Strahlachse ein. Beide Systeme beschränken sich nicht auf die Dämpfung bestimmter Schwingungsmoden, sondern sind so ausgelegt worden, daß alle Teilchenpakete unabhängig voneinander stabilisiert werden. Innerhalb von zwei Jahren konnten alle relevanten Komponenten entwickelt, gebaut und in Betrieb genommen werden. Im Rahmen dieser Arbeit wird auf den Aufbau und die Funktionsweise wichtiger Systembausteine eingegangen und der Prozeß der Inbetriebnahme der Feedbacksysteme erläutert. Meßresultate belegen die Effizienz beider Systeme, die im Nutzerbetrieb kontinuierlich zur Dämpfung von Instabilitäten bei Strömen bis zu 220 mA eingesetzt werden. Der durch die Inbetriebnahme der Rückkoppelsysteme gewonnene Nutzen für die Experimentatoren konnte im Rahmen dieser Arbeit nachgewiesen werden. Damit verfügt BESSY-II über leistungsfähige Feedbacksysteme, die kohärente Schwingungen aller Phasenraumkoordinaten dämpfen und damit die Anforderungen an die Quellgröße einer Synchrotronstrahlungsquelle der 3. Generation gewährleisten. / The appearance of coherent coupled-bunch oscillations in modern electron storage rings significantly reduces their performance as synchrotron light sources. These instabilities occur in the longitudinal as well as the transverse direction and lead to a reduction of brilliance and, in the worst case, to beam loss. The most effective tool for controlling the instabilities is a feedback system, which detects and damps the excited oscillation amplitudes. This thesis describes the design, development and commissioning of two structurally completely different systems for damping longitudinal synchrotron oscillations and transverse betatron oscillations. The transverse feedback system covers the horizontal as well as the vertical beam axis. Both systems are not restricted to damping certain modes of oscillation, but have been designed to stabilize all bunches independently of one another. All components were designed, built and commissioned within two years. Within the scope of this thesis, the design and functionality of important system components are explained and the commissioning process of the feedback systems is described. Measurement results demonstrate the efficiency of both systems, which are used continuously during user operation to damp instabilities at currents of up to 220 mA. The benefit gained by the experimenters from the commissioning of the feedback systems could also be demonstrated within this work. Consequently, BESSY II possesses powerful feedback systems which damp coherent oscillations in all phase-space coordinates and thereby guarantee the source-size requirements of a 3rd-generation synchrotron light source.
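The principle of such a feedback loop can be seen in miniature: sample each bunch's displacement and apply a kick proportional to its velocity (a 90°-phase-shifted copy of the position signal), which adds a damping term to the oscillation. The Python toy below uses purely illustrative numbers, not BESSY II parameters.

```python
import numpy as np

# Toy bunch model: x'' + w0^2 * x = kick, with feedback kick = -g * x'.
# The -g*x' term mimics the 90-degree phase-shifted correction signal of a
# bunch-by-bunch feedback; the amplitude then decays roughly as exp(-g*t/2).
w0 = 2 * np.pi * 8.5e3          # oscillation frequency in rad/s (hypothetical)
g = 2 * np.pi * 200.0           # feedback damping rate in 1/s (hypothetical)
dt = 1.0e-6                     # sampling period, ~one turn (hypothetical)
x, v = 1.0, 0.0                 # initially excited oscillation
trace = []
for _ in range(5000):
    kick = -g * v                       # phase-shifted feedback kick
    v += (-(w0 ** 2) * x + kick) * dt   # semi-implicit Euler step (stable)
    x += v * dt
    trace.append(x)
```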
443

Combinação de projeções de volatilidade baseadas em medidas de risco para dados em alta frequência / Volatility forecast combination using risk measures based on high frequency data

Araújo, Alcides Carlos de 29 April 2016
Operações em alta frequência demonstraram crescimento nos últimos anos; em decorrência disso, surgiu a necessidade de estudar o mercado de ações brasileiro no contexto dos dados em alta frequência. Os estimadores da volatilidade dos preços de ações utilizando dados de negociações em alta frequência são os principais objetos de estudo. Conforme Aldridge (2010) e Vuorenmaa (2013), o HFT é definido como a rápida realocação de capital, feita de modo que as transações possam ocorrer em milésimos de segundo, por uso de algoritmos complexos que gerenciam envio de ordens, análise dos dados obtidos e tomada das melhores decisões de compra e venda. A principal fonte de informações para análise do HFT são os dados tick by tick, conhecidos como dados em alta frequência. Uma métrica oriunda da análise de dados em alta frequência e utilizada para gestão de riscos é a Volatilidade Percebida. Conforme Andersen et al. (2003), Pong et al. (2004), Koopman et al. (2005) e Corsi (2009), há um consenso na área de finanças de que as projeções da volatilidade utilizando essa métrica de risco são mais eficientes do que a estimativa da volatilidade por meio de modelos GARCH. Na gestão financeira, a projeção da volatilidade é uma ferramenta fundamental para provisionar reservas para possíveis perdas; devido à existência de vários métodos de projeção da volatilidade, torna-se necessário selecionar um modelo ou combinar diversas projeções. O principal desafio para combinar projeções é a escolha dos pesos: as diversas pesquisas da área têm foco no desenvolvimento de métodos para escolhê-los visando minimizar os erros de previsão. A literatura existente carece, no entanto, de uma proposição de método que considere o problema de eventual projeção de volatilidade abaixo do esperado. Buscando preencher essa lacuna, o objetivo principal desta tese é propor uma combinação dos estimadores da volatilidade dos preços de ações utilizando dados de negociações em alta frequência para o mercado brasileiro. Como principal ponto de inovação, propõe-se aqui, de forma inédita, a utilização de função baseada no Lower Partial Moment (LPM) para estimativa dos pesos da combinação das projeções. Ainda que a métrica LPM seja bastante conhecida na literatura, sua utilização para combinação de projeções ainda não foi analisada. Este trabalho apresenta contribuições ao estudo de combinações de projeções realizadas pelos modelos HAR, MIDAS, ARFIMA e Nearest Neighbor, além de propor dois novos métodos de combinação – denominados LPMFE (Lower Partial Moment Forecast Error) e DLPMFE (Discounted LPMFE). Os métodos demonstraram resultados promissores em casos cuja pretensão seja evitar perdas acima do esperado sem provisionamento excessivo do ponto de vista orçamentário. / High Frequency Trading (HFT) has grown significantly in recent years, which raises the need for research on high-frequency data in the Brazilian stock market. The volatility estimators of asset prices using high-frequency data are the main objects of study. According to Aldridge (2010) and Vuorenmaa (2013), HFT is defined as the fast reallocation of trading capital, such that transactions occur within milliseconds, driven by complex algorithms designed to optimize order submission, data analysis, and buy/sell decisions. The principal information source for HFT analysis is tick-by-tick data, known as high-frequency data. Realized Volatility is a risk measure derived from high-frequency data analysis and used for risk management. According to Andersen et al. (2003), Pong et al. (2004), Koopman et al. (2005) and Corsi (2009), there is a consensus in the finance field that volatility forecasts using this risk measure produce better results than estimating volatility with GARCH models. Volatility forecasting is a key issue in financial management for provisioning capital resources against possible losses. However, because there are several volatility forecasting methods, one must choose a specific model or combine the forecasts. The main challenge in combining forecasts is the choice of weights; much research in the field has focused on developing weighting methods that minimize forecast errors. Nevertheless, the literature lacks a method that considers the risk of forecasting volatility below its realized level. Aiming to fill this gap, the main goal of the thesis is to propose a combination of asset-price volatility forecasts using high-frequency data for the Brazilian stock market. As its main innovation, the thesis proposes, in an unprecedented way, the use of a function based on the Lower Partial Moment (LPM) to estimate the weights for the combination of volatility forecasts. Although the LPM measure is well known in the literature, its use for forecast combination has not yet been studied. The thesis contributes to the literature by studying combinations of forecasts from the HAR, MIDAS, ARFIMA and Nearest Neighbor models, and by proposing two new combination methods, referred to as LPMFE (Lower Partial Moment Forecast Error) and DLPMFE (Discounted LPMFE). The methods show promising results when the aim is to avoid losses above the expected level without causing excess provisioning in the budget.
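The abstract does not spell out the LPMFE weighting formula, so the following Python sketch only illustrates the underlying idea under one plausible reading: penalise each model by the lower partial moment of its under-predictions of realized volatility and weight models by the inverse of that penalty. The function name and defaults are hypothetical.

```python
import numpy as np

def lpm_weights(realized, forecasts, order=2):
    """Combination weights from the Lower Partial Moment of forecast errors.
    Only under-predictions (realized > forecast) are penalised, matching the
    asymmetric concern of avoiding forecasts below realized volatility.
    `forecasts` has shape (n_models, n_obs)."""
    shortfall = np.maximum(realized[None, :] - forecasts, 0.0)
    lpm = np.mean(shortfall ** order, axis=1)            # one LPM per model
    inv = 1.0 / np.maximum(lpm, np.finfo(float).eps)     # avoid divide-by-zero
    return inv / inv.sum()                               # weights sum to one

# Combined forecast for the next period:
# w = lpm_weights(realized, forecasts)
# combined = w @ forecasts
```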
444

Estudo de três estratégias de ventilação artificial protetora: alta freqüência, baixa freqüência e baixa freqüência associada à insuflação de gás traqueal, em modelo experimental de SARA / Comparing three protective mechanical ventilation strategies, HFOV, low-frequency-ventilation, and low-frequency-ventilation with TGI, in an ARDS experimental model

Volpe, Márcia Souza 09 February 2007
Introdução: Um dos principais objetivos na SARA é encontrar a melhor estratégia protetora de ventilação mecânica que minimize o stress pulmonar e otimize as trocas gasosas. Teoricamente, estas duas metas podem ser obtidas simultaneamente, evitando-se a hiperdistensão e o colapso cíclico de unidades alveolares instáveis. Numa tentativa de radicalizar a minimização da hiperdistensão e da pressão motriz inspiratória, duas estratégias podem ser propostas: o uso da ventilação de alta freqüência oscilatória (HFOV) e o uso da insuflação intra-traqueal de gás (TGI), esta última associada à hipercapnia permissiva e baixas freqüências respiratórias. Objetivo: identificar qual (quais) entre as três estratégias de ventilação mecânica, HFOV, TGI e ventilação protetora de baixa freqüência (VP: volume corrente ~6 mL/kg), foi (foram) a(s) mais protetora(s) em um modelo de SARA em coelhos, durante seis horas de ventilação mecânica. Material e métodos: Os animais (n = 45) foram submetidos a repetidas lavagens pulmonares até uma PaO2 < 100 mmHg. Imediatamente após a injúria pulmonar, foi obtida uma curva P/V para cálculo do trabalho inspiratório e da energia dissipada durante a insuflação pulmonar. Em seguida, os animais foram randomizados em um dos três grupos: HFOV, VP ou TGI. O PEEP ou a PMEAN ideais foram obtidos através de uma curva PEEP/PaO2 (ou PMEAN/PaO2), precedida por uma manobra de recrutamento. Os animais dos grupos VP e TGI foram inicialmente ventilados em PCV com um delta de pressão = 8 cmH2O e freqüência = 60 resp/min. A única diferença inicial entre os dois foi que o grupo TGI possuía um fluxo traqueal contínuo = 1 L/min. Os animais do grupo HFOV foram inicialmente ventilados com uma amplitude de pressão = 45 cmH2O e freqüência = 10 Hz. Todos os animais foram ventilados com uma FiO2 = 1.0. Os deltas de pressão (ou pressão motriz) nos grupos VP e TGI foram reajustados para manter uma PaCO2 = 90-110 mmHg, enquanto no HFOV a amplitude de pressão foi reajustada para manter uma PaCO2 = 45-55 mmHg. No final do experimento, outra curva P/V foi obtida. Amostras do LBA e de sangue foram coletadas antes e após o período de ventilação para determinar os níveis de IL-8. Amostras do pulmão esquerdo foram processadas para análise histológica e para cálculo da relação peso-úmido/peso-seco. Resultados: Não foi observada diferença na PaO2 entre os grupos. A PaCO2 foi significantemente menor no grupo HFOV (59 ± 3 mmHg) quando comparado aos grupos VP (99 ± 4 mmHg) e TGI (80 ± 3 mmHg). O volume corrente foi significantemente menor nos grupos TGI e HFOV quando comparado ao grupo VP. Logo após a lesão pulmonar, todos os grupos necessitaram de trabalhos similares para a insuflação pulmonar, mas o grupo VP foi o único que não apresentou melhora (diminuição) deste trabalho ao longo das 6 horas; quanto ao trabalho expiratório, a estratégia VP foi a única que apresentou aumento ao longo das 6 horas (P<0,001). Os grupos TGI e HFOV também apresentaram maiores concentrações de polimorfonucleares no tecido pulmonar (P=0,008) e tendências a favorecer um maior índice superfície/volume (P=0,14), maior gradiente IL-8 (diferença entre IL-8 no LBA e plasma - P=0,08) e menor relação peso-úmido/peso-seco (P=0,17) ao final das 6 horas de ventilação. Discussão: O menor trabalho requerido na insuflação pulmonar depois de 6 horas de ventilação refletiu uma redução nas pressões críticas de abertura e, provavelmente, uma melhora do edema pulmonar e do sistema surfactante nas estratégias HFOV e TGI.
O aumento do trabalho expiratório no grupo VP sugere, inclusive, uma deterioração na qualidade do surfactante neste grupo. Nos grupos TGI e HFOV, a maior concentração de polimorfonucleares no tecido pulmonar e a tendência a apresentar maior gradiente de IL-8 poderiam ser interpretados como indicativos de uma melhor integridade da membrana alvéolo-capilar, resultando na menor liberação de mediadores compartimentalizados no interior dos alvéolos. Além de necessitar volumes correntes mais altos, a estratégia VP necessitou de pressões inspiratórias progressivamente mais altas durante as seis horas de protocolo, devido a reajustes freqüentes, necessários à manutenção das trocas gasosas. Conclusão: Uma redução mais radical das pressões motrizes demonstrou efeitos benéficos num modelo de lesão pulmonar aguda experimental, mesmo quando associada a uma estratégia que já prioriza o recrutamento pulmonar ótimo. O TGI mostrou ser uma alternativa viável à HFOV, apresentando algumas vantagens práticas de implementação e em termos de previsibilidade de resposta nas trocas gasosas. / Introduction: One of the major goals in ARDS is to find the best protective mechanical ventilation strategy, one which minimizes lung stress and optimizes gas exchange. Theoretically, these two goals can be accomplished simultaneously by avoiding alveolar overdistension and cyclic collapse of unstable alveolar units. Pushing further the rationale of this strategy, two new strategies have been proposed: high-frequency oscillatory ventilation (HFOV) and intra-tracheal gas insufflation (TGI) associated with permissive hypercapnia and conventional frequencies. Objective: To determine which of the three protective modalities of mechanical ventilation, HFOV, low-frequency protective ventilation (LFV), or LFV associated with tracheal gas insufflation (TGI), was the most protective strategy in an ARDS rabbit model during six hours of mechanical ventilation. Material and methods: The animals (n = 45) were submitted to repeated saline lavage until PaO2 < 100 mmHg. Immediately after lung injury, a P/V curve was obtained to calculate inspiratory/expiratory work and the energy dissipated during lung inflation. Thereafter, the animals were randomized into one of three groups: LFV, HFOV or TGI. The optimal PEEP or PMEAN was obtained from a PEEP/PaO2 (or PMEAN/PaO2) curve, which was preceded by a recruiting maneuver. The animals of the LFV and TGI groups were initially ventilated in PCV with driving pressure = 8 cmH2O and frequency = 60 breaths/min. The only initial difference between these two arms was that the TGI group had a continuous tracheal flow = 1 L/min. The animals in the HFOV group were initially ventilated with an oscillatory pressure amplitude = 45 cmH2O and frequency = 10 Hz. All animals were ventilated with FiO2 = 1.0. Driving pressure was then adjusted in the LFV and TGI groups to maintain PaCO2 = 90-110 mmHg, while in HFOV the pressure amplitude was adjusted to maintain PaCO2 = 45-55 mmHg. At the end of the experiment, after 6 hours of ventilation, another P/V curve was obtained. BAL and blood samples were drawn before and after the period of ventilation to determine IL-8 levels. The left lung was processed for histological analysis and for the wet-weight/dry-weight (ww/dw) ratio. Results: We observed no differences in PaO2 among the groups. PaCO2 was significantly lower in the HFOV group (59 ± 3 mmHg) when compared with the LFV (99 ± 4 mmHg) and TGI (80 ± 3 mmHg) groups. Tidal volume was significantly lower in the TGI and HFOV groups than in the LFV group.
Soon after injury, all groups required similar energy for lung inflation (inspiratory work), but the LFV group was the only one not presenting any improvement in this parameter after 6 hours (P<0.001). Concerning the expiratory work, the LFV strategy was the only one presenting an increase over the 6 hours (P<0.001). The TGI and HFOV groups showed the highest polymorphonuclear cell concentration in lung tissue (P=0.008) and trends towards a higher surface/volume index (P=0.14), a higher IL-8 gradient (difference between IL-8 in BAL and plasma; P=0.08) and a lower ww/dw ratio (P=0.17) at the end of 6 hours of ventilation. Discussion: The lower energy for lung inflation after six hours of ventilation reflected the reduction of opening pressures and better surfactant function during ventilation under the TGI and HFOV strategies. The increase in expiratory work during the LFV strategy further suggests that surfactant quality deteriorated under this strategy. In the TGI and HFOV groups, the higher concentration of polymorphonuclear cells and the trend towards a higher IL-8 gradient between the lung and blood may suggest better integrity of the alveolar-capillary membrane, leading to less release of compartmentalized mediators within the alveolar space. Besides requiring higher tidal volumes, the LFV strategy required progressively higher inspiratory pressures over the hours, due to the frequent adjustments of tidal volumes or pressures necessary to meet gas-exchange requirements. Conclusion: An aggressive reduction of tidal volume and driving pressures was beneficial during protective strategies, even when optimization of lung recruitment was already in place. The TGI strategy proved to be an attractive alternative to HFOV, presenting some advantages in terms of implementation and predictability of response.
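The work-of-inflation quantity compared across groups is the area under the inflation limb of the quasi-static P/V curve, W = ∫ P dV. A minimal numerical sketch (the curve shape and numbers below are hypothetical, not the study's data):

```python
import numpy as np

# Inspiratory work from a quasi-static P/V curve: W = integral of P dV,
# evaluated here with a trapezoid sum over a toy inflation limb.
volume = np.linspace(0.0, 30.0, 31)             # inflation limb volume, mL
pressure = 25.0 * (1 - np.exp(-volume / 12.0))  # toy pressure curve, cmH2O
work_cmH2O_mL = np.sum(0.5 * (pressure[1:] + pressure[:-1]) * np.diff(volume))
work_joules = work_cmH2O_mL * 98.0665e-6        # 1 cmH2O*mL = 98.0665e-6 J
```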
445

High-frequency trading e eficiência informacional: uma análise empírica do mercado de capitais brasileiro no período  2007-2015 / High-frequency trading and informational efficiency: an empirical analysis of Brazilian capital markets from 2007 to 2015

Tadiello, Guilherme 24 October 2016
Operações de alta frequência ganharam destaque nos últimos anos, tanto no mercado nacional quanto no internacional, e têm atraído a atenção de reguladores, pesquisadores e da mídia. Assim, surgiu a necessidade de estudar o mercado de capitais brasileiro no contexto dos dados em alta frequência. Este estudo preocupa-se em analisar os efeitos dos avanços tecnológicos e das novas formas de negociação na qualidade do mercado. Tais pontos são caracterizados pelo HFT. Gomber e Haferkorn (2013) explicam que HFT é um subgrupo das negociações com algoritmos. Os investidores HFTs são caracterizados por negociarem com seu próprio capital, manterem posições por espaços curtos de tempo, pelo alto volume de negociação e por atualizarem as ordens com frequência. A revisão da literatura permitiu delinear o termo e identificar as estratégias adotadas, os impactos positivos e negativos na qualidade de mercado, os riscos advindos da prática e as medidas adotadas ou propostas para mitigar esses riscos. A contribuição decorrente das negociações em alta frequência foi analisada empiricamente com ênfase na questão da eficiência informacional do mercado nacional. Para isso, foram utilizados dados intradiários do índice Bovespa, com frequências de observação a partir de 1 minuto. Aplicações do teste de sequência para aleatoriedade e do teste de razão de variância de Lo e Mackinlay (1988) evidenciaram um aumento na eficiência do mercado ao longo do período analisado, entre 2007 e 2015, para a frequência de observações de 1 minuto. Foi encontrada relação entre esse ganho em eficiência e o aumento da participação do HFT no mercado. Também foi constatado que o mercado se mostra menos eficiente quando a frequência de observação aumenta e que os ganhos em eficiência são mais acentuados para frequências maiores. Os últimos resultados fortalecem a percepção de que a melhora na eficiência está relacionada diretamente à atuação dos HFTs no mercado, haja vista a característica destes de explorarem ineficiências de preço em frações de segundos. Descreveu-se assim o mercado de capitais nessa era de alta frequência e os impactos do HFT na eficiência de mercado. Tais pontos podem ser colocados como contribuições práticas deste estudo. / High-frequency trading has gained notoriety in recent years and attracted increasing attention among policymakers, researchers and the media. This brought about the need for research on high-frequency data in the Brazilian capital market. This study aims to investigate the effects of technological advancements and new forms of trading, especially HFT, on market quality. Gomber and Haferkorn (2013, p. 97) define HFT as a subset of algorithmic trading "characterized by short holding periods of trading positions, high trading volume, frequent order updates and proprietary trading". The literature review made it possible to define the term and identify strategies, positive and negative impacts on market quality, risks, and ways to mitigate these risks. The contribution arising from HFT was analyzed empirically with an emphasis on price efficiency in the domestic market, using intraday Bovespa index data at different frequencies. Run tests and Lo and MacKinlay (1988) variance ratio tests showed increasing efficiency over the period between 2007 and 2015 for observations at the 1-minute frequency. A relationship between this gain in price efficiency and the growth of HFT market share was found.
It was also found that the market is less efficient when higher frequencies are analyzed, and that the efficiency gains are more pronounced at higher frequencies. These last results strengthen the perception that the efficiency gains are directly related to high-frequency trading, given its characteristic of exploiting price inefficiencies that last fractions of a second. The capital market in this high-frequency era and the impacts of HFT on market efficiency were thus described in this study.
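The Lo and MacKinlay (1988) test cited here compares the variance of q-period returns with q times the one-period variance; VR(q) ≈ 1 under a random walk, so departures signal predictability. A minimal Python version of the homoskedastic statistic (the study may use the heteroskedasticity-robust variant):

```python
import numpy as np
from scipy.stats import norm

def variance_ratio_test(log_prices, q):
    """Lo-MacKinlay (1988) variance ratio test under homoskedasticity.
    Returns VR(q), the z-statistic and the two-sided p-value."""
    p = np.asarray(log_prices, dtype=float)
    n = len(p) - 1                           # number of one-period returns
    r = np.diff(p)
    mu = (p[-1] - p[0]) / n
    var1 = np.sum((r - mu) ** 2) / (n - 1)
    rq = p[q:] - p[:-q]                      # overlapping q-period returns
    m = q * (n - q + 1) * (1 - q / n)        # overlap bias correction
    varq = np.sum((rq - q * mu) ** 2) / m
    vr = varq / (q * var1)
    se = np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * n))
    z = (vr - 1.0) / se
    return vr, z, 2 * (1 - norm.cdf(abs(z)))
```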
446

Approche multi-proxys de la réponse des plages sableuses ouvertes aux événements de tempêtes, en incluant les phases de récupération / Study of open sandy beaches responses to storms including recovery periods.

Biausque, Mélanie 06 December 2018
Cette thèse présente une étude de la dynamique des plages sableuses ouvertes dominées par la houle, au travers d’une base de données originale couvrant une période de 29 mois et composée de 150 levés DGPS sur 750 m de linéaire côtier, donnant accès à la morphodynamique du site de Biscarrosse à différentes échelles de temps. Dans un premier temps, l’analyse du jeu de données à l’échelle des événements (tempêtes et successions de tempêtes appelées clusters) nous a permis de montrer que la réponse des plages sableuses aux clusters ne résulte pas de la somme des impacts induits par chaque tempête d’un cluster. Ainsi, l’effet cumulé des clusters, rapporté sur d’autres sites dans la littérature, n’est ici pas vérifié. L’impact de l’enchaînement des tempêtes a également été étudié : il en résulte que, lors d’un cluster, un changement des conditions hydrodynamiques, à savoir une augmentation des hauteurs de vagues et/ou du niveau d’eau, est nécessaire pour que la tempête suivante ait un impact érosif significatif sur le système. Dans un second temps, nous avons étudié la dynamique saisonnière du système plage/dune, que ce soit la saison hivernale ou estivale, dans le but de mettre en relief les principaux processus impliqués à cette échelle. Nos travaux montrent que la réponse hivernale de la plage ne dépend pas uniquement des conditions énergétiques et du profil pré-hivernal de la plage, mais également du séquençage des événements, comme lors d’un cluster. Ils confirment également la nécessité de prendre en compte de nombreux paramètres dans l’étude de la dynamique hivernale des littoraux sableux : les conditions hydrodynamiques, le séquençage des évènements érosifs mais également reconstructifs, en particulier le ré-engraissement post-évènement, les transports sédimentaires cross-shore et longshore, ainsi que la position de la barre interne et des courants d’arrachement. La saison estivale est, quant à elle, marquée par la reconstruction de la berme. Elle semble être liée à la fois aux conditions hydrodynamiques et aux caractéristiques des barres sableuses. L’étude de deux étés et deux hivers successifs a ainsi permis d’identifier les interactions entre les saisons, l’impact de la saison hivernale sur l’estivale et l’influence de la dynamique événementielle sur la dynamique saisonnière. Elle a aussi permis de mettre en relief l’impact de l’urbanisme et des stratégies d’aménagement dans la réponse du système, à différentes échelles de temps. / This thesis presents a study of a wave-dominated open sandy beach, based on an original dataset covering 29 months and composed of 150 DGPS surveys recorded along 750 m of sandy shore, giving access to the morphodynamics of Biscarrosse beach at different timescales. First, event-scale analysis showed that the sandy beach response to clusters is not the sum of the impacts generated on the system by each storm of a cluster. Thus, the cumulative effect of clusters described in the literature for other sites is not verified here. Storm sequencing was also studied: during a cluster, a change in hydrodynamic conditions (a rise in water level and/or wave height) is necessary for the following storm to have a significant erosive impact on the system. Second, we studied the seasonal-scale dynamics of the beach/dune system (winter and summer seasons) to highlight the dominant processes involved at this timescale.
Beach response to winter seasons depends not only on hydrodynamic conditions and the previous beach profile, but also on erosion/recovery event sequencing, post-storm recovery, cross-shore and longshore sediment transport, the inner-bar characteristics and rip current positions. Summer seasons are here defined by berm reconstruction. Recovery periods are linked both to hydrodynamic conditions and to bar characteristics (e.g. position and shape). The study of successive winters and summers allowed us to identify interactions between seasons and the influence of short-scale dynamics on the seasonal one. It also emphasizes the impact of urbanism and coastal management strategies on the system’s response at different timescales.
447

Essays on nonparametric estimation of asset pricing models

Dalderop, Jeroen Wilhelmus Paulus January 2018
This thesis studies the use of nonparametric econometric methods to reconcile the empirical behaviour of financial asset prices with theoretical valuation models. The confrontation of economic theory with asset price data requires various functional form assumptions about the preferences and beliefs of investors. Nonparametric methods provide a flexible class of models that can prevent misspecification of agents’ utility functions or the distribution of asset returns. Evidence for potential nonlinearity is seen in the presence of non-Gaussian distributions and excessive volatility of stock returns, or non-monotonic stochastic discount factors in option prices. More robust model specifications are therefore likely to contribute to risk management and return predictability, and lend credibility to economists’ assertions. Each of the chapters in this thesis relaxes certain functional form assumptions that seem most important for understanding certain asset price data. Chapter 1 focuses on the state-price density in option prices, which confounds the nonlinearity in both the preferences and the beliefs of investors. To understand both sources of nonlinearity in equity prices, Chapter 2 introduces a semiparametric generalization of the standard representative-agent consumption-based asset pricing model. Chapter 3 returns to option prices to understand the relative importance of changes in the distribution of returns and in the shape of the pricing kernel. More specifically, Chapter 1 studies the use of noisy high-frequency data to estimate the time-varying state-price density implicit in European option prices. A dynamic kernel estimator of the conditional pricing function and its derivatives is proposed that can be used for model-free risk measurement. Infill asymptotic theory is derived that applies when the pricing function is either smoothly varying or driven by diffusive state variables. Trading times and moneyness levels are modelled by marked point processes to capture intraday trading patterns. A simulation study investigates the performance of the estimator using an iterated plug-in bandwidth in various scenarios. Empirical results using S&P 500 E-mini European option quotes find significant time-variation at intraday frequencies. An application to delta- and minimum-variance hedging further illustrates the use of the estimator. Chapter 2 proposes a semiparametric asset pricing model to measure how consumption and dividend policies depend on unobserved state variables, such as economic uncertainty and risk aversion. Under a flexible specification of the stochastic discount factor, the state variables are recovered from cross-sections of asset prices and volatility proxies, and the shape of the policy functions is identified from the pricing functions. The model leads to closed-form price-dividend ratios under polynomial approximations of the unknown functions and affine state variable dynamics. In the empirical application, uncertainty and risk aversion are separately identified from size-sorted stock portfolios, exploiting the heterogeneous impact of uncertainty on dividend policy across small and large firms. I find an asymmetric and convex response in consumption (-) and dividend growth (+) to uncertainty shocks, which, together with moderate uncertainty aversion, can generate large leverage effects and divergence between macroeconomic and stock market volatility.
Chapter 3 studies the nonparametric identification and estimation of projected pricing kernels implicit in the pricing of options, the underlying asset, and a risk-free bond. The sieve minimum-distance estimator based on conditional moment restrictions avoids the need to compute ratios of estimated risk-neutral and physical densities, and leads to stable estimates even in regions with low probability mass. The conditional empirical likelihood (CEL) variant of the estimator is used to extract implied densities that satisfy the pricing restrictions while incorporating the forward-looking information from option prices. Moreover, I introduce density combinations in the CEL framework to measure the relative importance of changes in the physical return distribution and in the pricing kernel. The nonlinear dynamic pricing kernels can be used to understand return predictability, and provide model-free quantities that can be compared against those implied by structural asset pricing models.
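For context, the state-price density studied in Chapter 1 relates to call prices through the Breeden-Litzenberger identity f(K) = e^{rT} ∂²C(K)/∂K². A finite-difference Python sketch on a uniform strike grid (an assumption; the chapter itself develops a dynamic kernel estimator):

```python
import numpy as np

def state_price_density(strikes, calls, r, T):
    """Breeden-Litzenberger: f(K) = exp(r*T) * d2C/dK2, approximated with
    central second differences; assumes a uniform, noise-free strike grid."""
    K = np.asarray(strikes, dtype=float)
    C = np.asarray(calls, dtype=float)
    dK = K[1] - K[0]
    d2C = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dK ** 2
    return K[1:-1], np.exp(r * T) * d2C     # interior strikes and density
```

With noisy quotes, the second difference amplifies noise, which is one motivation for the smoothing estimators the chapter develops.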
448

Achados timpanométricos em neonatos: medidas e interpretações / Tympanometry in neonates: Measures and Interpretations

Silva, Kilza de Arruda Lyra e 23 August 2005
Introduction. Early identification and diagnosis of hearing impairment in newborns aim at establishing adequate amplification and intervention as early as possible, in order to reduce negative consequences for the language, individual and social development of the child. Tympanometry is part of the test battery for the diagnosis of hearing loss and is used to differentiate between conductive and sensorineural hearing losses. Before six months of age, the results obtained using a 226 Hz probe tone can be misleading. Therefore, many studies have assessed the use of high-frequency probe tones of 678 and 1000 Hz, aiming at a more valid procedure. Goal. The goal of the present study was to describe and discuss interpretations and measurements obtained in tympanometry of normal-hearing neonates, using probe tones of 226, 678 and 1000 Hz. The following aspects were described: tympanometric curve type, Peak Compensated Static Acoustic Admittance (Ytm), Tympanometric Width (TW), Tympanometric Peak Pressure (PPT) and Equivalent Ear Canal Volume (Vea). Method. All subjects had normal otoacoustic emissions and no risk factors for hearing impairment. The curves were obtained in a quiet room using a GSI 33 II middle ear analyzer with probe tones of 226, 678 and 1000 Hz. All babies were calm or sleeping during the test. Results. 110 neonates were tested with the three probe tones; therefore, 660 curves were obtained. Age range was 6 to 30 days (58 boys and 52 girls). When the 226 Hz probe tone was used, a single-peak curve was observed in 105 (47.7%) ears and a double peak was found in 115 (52.3%) ears. Results with the 678 Hz probe tone revealed 56 (25.4%) single-peak curves, 16 (7.3%) inverted (I) and 148 (67.3%) asymmetric (AS) curves. Results with the 1000 Hz probe tone showed 156 (70.9%) single-peak tympanograms, 62 (28.2%) asymmetric and 2 (0.9%) inverted. Among the quantitative measurements analyzed, Vea showed a significant ear effect with the 1000 Hz probe tone and significant gender differences with the 226 and 1000 Hz probe tones. Ytm was also significantly different by gender with the 1000 Hz probe tone, being larger for the boys. For all the other variables, no significant difference was found for ear or gender. When the curves were analyzed using the protocol proposed by Sutton et al (2002), 208 (94.5%) ears were considered normal and 12 (5.5%) abnormal with the 678 Hz probe tone. For the 1000 Hz probe tone, 217 (98.6%) ears were considered normal, and just 3 (1.4%) of the tympanograms were classified as abnormal. Conclusion. The tympanometric findings in this study were similar to those described in the literature, with a prevalence of single-peaked curves for the 1000 Hz probe tone and similar numbers of single- and double-peaked curves with the 226 Hz probe tone. The quantitative measurements were, in general, in agreement with the literature reviewed. Interpretation of the results with the 1000 Hz probe tone using the protocol suggested by Sutton et al (2002) was the method that classified the greatest percentage of tested ears as normal, suggesting that it can be very useful when neonates are evaluated. Further research with this protocol is suggested. / Introdução.
A identificação e a caracterização precoce da perda auditiva em neonatos visam estabelecer condições para uma intervenção adequada, tão cedo quanto possível, a fim de reduzir as conseqüências negativas no desenvolvimento pessoal e social da criança. A timpanometria faz parte da bateria de testes do diagnóstico da perda auditiva e é utilizada para avaliação da orelha média, para diferenciar perdas condutivas de neurossensoriais. A timpanometria realizada em neonatos com menos de seis meses, quando executada com tom sonda de baixa freqüência (226 Hz), pode gerar dúvidas, pois nesse tipo de sonda, neonatos com otite média podem revelar timpanograma aparentemente normal. Com isso, tem-se investigado o uso de tom sonda de alta freqüência (678 e 1000 Hz) em busca de resultados mais confiáveis. Objetivo. Descrever e analisar interpretações de características e medidas obtidas na timpanometria de neonatos ouvintes com sonda de tom prova de 226, 678 e 1000 Hz. São descritos os seguintes aspectos do timpanograma: características da curva timpanométrica, Admitância Acústica Estática de Pico Compensado na Altura da Membrana Timpânica (Ymt), Largura Timpanométrica (LT), Pressão do Pico Timpanométrico (PPT) e Volume Equivalente do Meato Acústico Externo (Vea). Metodologia. Os sujeitos analisados passaram por uma triagem que incluiu anamnese e teste de emissões otoacústicas. Para a realização das timpanometrias foi utilizado o analisador de orelha média GSI-33-II, com tons sonda de 226, 678 e 1000 Hz, em sala silenciosa e com a criança em estado tranqüilo. Resultados. Foram obtidos timpanogramas de 110 neonatos ouvintes com 6 a 30 dias de idade (58 meninos e 52 meninas), perfazendo um total de 660 timpanogramas. No tom sonda de 226 Hz, o tipo de curva pico único (A) apareceu em 105 (47,7%) orelhas e o tipo pico duplo (PD) em 115 (52,3%) orelhas. Os resultados na freqüência de 678 Hz indicaram 56 (25,5%) ocorrências de curva tipo A, 16 (7,3%) do tipo invertida (I) e 148 (67,3%) curvas do tipo assimétrica (AS). Na sonda de 1000 Hz foram registradas 156 (70,9%) curvas do tipo A, 62 (28,2%) do tipo AS e 2 (0,9%) do tipo I. Dentre as variáveis quantitativas analisadas, apenas o Vea apresentou efeito de significância por orelha na sonda de tom prova de 1000 Hz. O Vea apresentou efeito de significância em relação ao gênero nas freqüências de 226 e 1000 Hz. A Ymt, também, apresentou efeito de significância por gênero, na sonda de 1000 Hz, sendo maior nos meninos. Nas demais variáveis não foi encontrado efeito de significância nem por orelha e nem por gênero. Quando interpretados de acordo com o protocolo recomendado por Sutton et al (2002), obteve-se, em 678 Hz, 208 (94,5%) orelhas com resultado normal, enquanto 12 (5,5%) foram interpretadas como anormais. Na sonda de tom prova de 1000 Hz, 217 (98,6%) das orelhas foram normais, e apenas 3 (1,4%) dos timpanogramas foram classificados como anormais. Conclusão. Os achados timpanométricos, tanto em 226 Hz quanto em 1000 Hz, foram compatíveis com os resultados presentes na literatura, que descrevem alta ocorrência de curvas do tipo A em sonda de 1000 Hz e equilíbrio entre os tipos de curva A e PD em sonda de 226 Hz. Os dados registrados para as medidas quantitativas, também, estiveram de acordo com o indicado na literatura.
A interpretação das curvas timpanométricas com sonda de 1000 Hz utilizando o protocolo proposto por Sutton et al (2002) foi a que possibilitou a classificação de normal na maior porcentagem das orelhas avaliadas, sugerindo que este pode ser um método de grande utilidade na avaliação de bebês. Recomenda-se que pesquisas futuras com esse protocolo sejam realizadas.
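The quantitative measures reported here (Ytm, TW/LT, peak pressure) can be extracted from a recorded pressure-admittance curve; a Python sketch assuming positive-tail baseline compensation and a single-peak curve (conventions differ across devices and protocols):

```python
import numpy as np

def tympanogram_measures(pressure_daPa, admittance_mmho):
    """Peak-compensated static admittance (Ytm), tympanometric peak pressure
    and tympanometric width (pressure interval at half the compensated peak),
    assuming a single-peak curve and positive-tail compensation."""
    p = np.asarray(pressure_daPa, dtype=float)
    y = np.asarray(admittance_mmho, dtype=float)
    yc = y - y[np.argmax(p)]        # subtract admittance at the +pressure tail
    i_peak = int(np.argmax(yc))
    ytm, peak_pressure = yc[i_peak], p[i_peak]
    above = np.where(yc >= ytm / 2.0)[0]     # half-height region (assumed contiguous)
    width = abs(p[above[0]] - p[above[-1]])  # tympanometric width in daPa
    return ytm, peak_pressure, width
```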
449

Ensaios em cópulas e finanças empíricas

Silva, Fernando Augusto Boeira Sabino da January 2017
Nesta tese discutimos abordagens que utilizam cópulas para descrever dependências entre instrumentos financeiros e avaliamos a performance destes métodos. Muitas crises financeiras aconteceram desde o final da década de 90, incluindo a crise asiática (1997), a crise da dívida da Rússia (1998), a crise da bolha da internet (2000), as crises após o 9/11 (2001) e a guerra do Iraque (2003), a crise do subprime ou crise financeira global (2007-08), e a crise da dívida soberana europeia (2009). Todas estas crises levaram a uma perda maciça de riqueza financeira e a um aumento da volatilidade observada, e enfatizaram a importância de uma política macroprudencial mais robusta. Em outras palavras, perturbações financeiras tornam os processos econômicos altamente não-lineares, levando os principais bancos centrais a tomarem medidas contrárias para conter a angústia financeira. Devido aos complexos padrões de dependência dos mercados financeiros, uma abordagem multivariada em grandes dimensões para a análise da dependência caudal é seguramente mais perspicaz do que assumir retornos com distribuição normal multivariada. Dada a sua flexibilidade, as cópulas são capazes de modelar melhor as regularidades empiricamente verificadas que são normalmente atribuídas a retornos financeiros multivariados: (1) volatilidade condicional assimétrica, com maior volatilidade para grandes retornos negativos e menor volatilidade para retornos positivos (HAFNER, 1998); (2) assimetria condicional (AIT-SAHALIA; BRANDT, 2001; CHEN; HONG; STEIN, 2001; PATTON, 2001); (3) excesso de curtose (TAUCHEN, 2001; ANDREOU; PITTIS; SPANOS, 2001); e (4) dependência temporal não linear (CONT, 2001; CAMPBELL; LO; MACKINLAY, 1997). A principal contribuição dos ensaios é avaliar se abordagens mais sofisticadas do que o método da distância e o tradicional modelo de Markowitz podem tirar proveito de quaisquer anomalias/fricções de mercado. Os ensaios são uma tentativa de fornecer uma análise adequada destas questões usando conjuntos de dados abrangentes e de longo prazo. Empiricamente, demonstramos que as abordagens baseadas em cópulas são úteis em todos os ensaios, mostrando-se benéficas para modelar dependências em diferentes cenários, avaliando as medidas de risco caudais mais adequadamente e gerando rentabilidade superior à dos benchmarks utilizados. / In this thesis we discuss copula-based approaches to describe statistical dependencies within financial instruments and evaluate their performance. Many financial crises have occurred since the late 1990s, including the Asian crisis (1997), the Russian national debt crisis (1998), the dot-com bubble crisis (2000), the crises after 9-11 (2001) and the Iraq war (2003), the subprime mortgage crisis or global financial crisis (2007-08), and the European sovereign debt crisis (2009). All of these crises led to a massive loss of financial wealth and an increase in observed volatility, and have emphasized the importance of a more robust macro-prudential policy. In other words, financial disruptions make economic processes highly nonlinear, leading the major central banks to take counter-measures in order to contain financial distress. The methods for modeling uncertainty and evaluating market risk on financial markets have come under greater scrutiny after the global financial crisis. Due to the complex dependence patterns of financial markets, a high-dimensional multivariate approach to tail dependence analysis is surely more insightful than assuming multivariate normal returns.
Given their flexibility, copulas are able to better model the empirically verified regularities normally attributed to multivariate financial returns: (1) asymmetric conditional volatility, with higher volatility for large negative returns and smaller volatility for positive returns (HAFNER, 1998); (2) conditional skewness (AIT-SAHALIA; BRANDT, 2001; CHEN; HONG; STEIN, 2001; PATTON, 2001); (3) excess kurtosis (TAUCHEN, 2001; ANDREOU; PITTIS; SPANOS, 2001); and (4) nonlinear temporal dependence (CONT, 2001; CAMPBELL; LO; MACKINLAY, 1997). The principal contribution of the essays is to assess whether more sophisticated approaches than the distance method and the plain Markowitz model can take advantage of any market anomalies/frictions. The essays are one attempt to provide a proper analysis of these issues using long-term, comprehensive datasets. We empirically show that copula-based approaches are useful in all essays, proving beneficial for modeling dependencies in different scenarios, assessing downside risk measures more adequately and yielding higher profitability than the benchmarks.
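As a concrete illustration of the copula workflow these essays build on (a generic sketch, not the thesis's own models), the Python below maps returns to pseudo-observations by ranks, fits a Gaussian copula correlation, and computes an empirical lower-tail dependence proxy:

```python
import numpy as np
from scipy.stats import norm, rankdata

def fit_gaussian_copula(returns):
    """Fit a Gaussian copula to an (n_obs, n_assets) return matrix: rank each
    margin into pseudo-observations, map through the normal quantile and
    estimate the correlation matrix of the resulting normal scores."""
    u = np.apply_along_axis(lambda c: rankdata(c) / (len(c) + 1.0), 0, returns)
    return np.corrcoef(norm.ppf(u), rowvar=False)

def lower_tail_dependence(u1, u2, q=0.05):
    """Empirical proxy for lower-tail dependence: P(U2 <= q | U1 <= q).
    A Gaussian copula implies this vanishes as q -> 0, which is why
    heavier-tailed copulas are often preferred for crisis periods."""
    u1, u2 = np.asarray(u1), np.asarray(u2)
    return float(np.mean(u2[u1 <= q] <= q))
```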
450

Comment la Technologie Façonne les Marchés Financiers : l’Exemple du Marché des Changes / How Technology Shapes Financial Markets : the Perspective of the Foreign Exchange Market

Lafarguette, Romain 03 May 2017
Cette thèse de doctorat est composée de trois chapitres traitant de l’impact des innovations technologiques sur les marchés financiers, prenant comme cas d’étude le marché des changes. Le premier chapitre analyse l’impact des innovations technologiques sur la géographie du marché des changes. Il utilise la connexion des pays au réseau sous-marin des câbles à fibre optique comme mesure de choc technologique exogène. Les estimations montrent que l’introduction des câbles à fibre optique a contribué à concentrer la répartition des activités de trading dans quelques grandes places financières au détriment de toutes les autres. Le deuxième chapitre s’intéresse à l’impact de la technologie sur la réaction des marchés des changes à de nouvelles informations macroéconomiques et financières. Il estime que le développement des technologies de l’information et de la communication permet de réduire la volatilité sur les marchés des changes de façon significative. Enfin, le troisième chapitre montre que le trading à grande vitesse contribue à atténuer les réactions de marché aux chocs macroéconomiques exogènes. Une explication possible, qui s’appuie sur un modèle théorique, est que le trading à grande vitesse augmente la dispersion des cotations de change, qui en retour accroît le temps nécessaire pour les traders pour traiter l’information contenue dans les cotations, rendant de fait le marché moins réactif à de nouvelles informations macroéconomiques et financières. Cette thèse de doctorat propose une nouvelle façon de penser et de mesurer l’impact du progrès technologique sur les marchés financiers. La première contribution est d’utiliser le réseau sous-marin des câbles à fibre optique comme choc technologique exogène et de mesurer son impact sur la géographie des marchés des changes et la volatilité. La seconde contribution est de montrer le lien entre trading à grande vitesse, dispersion des cotations et efficience des marchés, en utilisant l’entropie des cotations comme mesure du temps nécessaire pour traiter l’information contenue dans les prix et en comprendre l’impact sur l’efficience de marché. / This PhD dissertation is a collection of three essays on how technology has been shaping financial markets, using the foreign exchange market as a case study. The first chapter investigates the impact of technological innovations on the geography of the foreign exchange market. It uses the connection of countries to submarine fiber-optic cables as a proxy for exogenous technological change. The estimates of this chapter suggest that technology contributes to concentrating foreign exchange trading in a handful of financial centers. The second chapter studies the impact of technology on the reaction of foreign exchange markets to macroeconomic announcements. It shows that the development of Information and Communication Technologies dampens foreign exchange market volatility. Finally, the third chapter shows that fast trading dampens market reaction to new macroeconomic information. One possible explanation, based on a theoretical model, is that fast traders increase the dispersion in exchange rate quotes, which in turn increases the time traders need to process the information contained in quotes, thereby dampening the market’s reaction to macro news. This PhD dissertation provides a new way to measure and conceptualize technological progress with regard to financial markets.
The first contribution is to treat the network of submarine fiber-optic cables as an exogenous technological shock in order to investigate the impact of technology on the geography of foreign exchange trading and on volatility. The second contribution is to show that patterns in the distribution of quotes matter in the context of fast trading. The concept of entropy in exchange rate quotes is used to characterize how fast information diffuses on financial markets and thereby to assess the implications of fast trading for market efficiency.
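The entropy-of-quotes measure can be illustrated with a plain Shannon entropy over binned quote changes; a hypothetical Python sketch (the dissertation's exact estimator may differ):

```python
import numpy as np

def quote_entropy(quotes, n_bins=50):
    """Shannon entropy (in bits) of the empirical distribution of quote
    changes; higher entropy = more dispersed quotes, read here as a proxy
    for how long traders need to extract the information they carry."""
    dq = np.diff(np.asarray(quotes, dtype=float))
    counts, _ = np.histogram(dq, bins=n_bins)
    prob = counts[counts > 0] / counts.sum()
    return float(-np.sum(prob * np.log2(prob)))
```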
