91

On ultra-wideband over fiber transmission systems employing semiconductor optical amplifiers / Etude de systèmes de transmission à bande ultra large sur fibre utilisant des amplificateurs optiques à semiconducteurs

Taki, Haidar 25 September 2017 (has links)
La technologie Ultra WideBand (UWB) sur fibre est une solution prometteuse pour répondre aux enjeux des futurs réseaux de communication WLAN/WPAN. Les caractéristiques de la fibre, incluant son énorme bande passante, offrent la possibilité d'une bonne qualité de service à longue portée. La propagation sans-fil UWB doit être réalisée sous des contraintes de densité spectrale de puissance particulières, imposées par l'autorité de régulation (FCC pour les Etats-Unis). La nouveauté de notre travail provient de l'exploitation des avantages d'un amplificateur optique à semi-conducteurs (SOA) afin d'obtenir une extension de portée à un coût et une complexité limités. Cependant, les effets non linéaires et le bruit d'émission spontanée amplifiée (ASE), intrinsèques à ce type de composant, sont susceptibles de dégrader la performance du système. La réduction de ces effets indésirables a donc été d'une importance centrale dans cette étude. Les non-linéarités du SOA ont été compensées en appliquant une solution de pré-distorsion analogique des formes d'ondes électriques. Un traitement basé sur phaser a également été proposé pour réduire simultanément l'influence de l'ASE et linéariser les caractéristiques du SOA, grâce à des opérations de chirping réparties entre l'émetteur et le récepteur. Avec la transmission Impulse Radio, en raison des propriétés temporelles des formats de modulation, des raies spectrales apparaissent, ce qui peut violer la limite FCC ou réduire l'efficacité énergétique. Une nouvelle technique de randomisation de formes d'ondes a été étudiée, qui s'est révélée efficace pour supprimer ces pics spectraux. Les trois approches ont montré un grand potentiel avec les formats On Off Keying et Pulse Position Modulation, à longue portée optique. Les performances d'une modulation différentielle Chaos Shift Keying ont finalement été examinées; une probabilité d'erreur inférieure a été obtenue expérimentalement en comparaison avec d'autres modulations non cohérentes. / Ultra WideBand (UWB) over fiber is a promising technology for meeting the demands of future wireless local-area networks (WLANs) and wireless personal-area networks (WPANs). Thanks to the enormous bandwidth and the characteristics of the fiber, a high communication quality may be established at long reach. UWB wireless propagation must be achieved under special power and spectral constraints fixed by the regulatory bodies (e.g., the US Federal Communications Commission). The novelty of our work originates from exploiting the benefits of a Semiconductor Optical Amplifier (SOA) so as to obtain a reach extension at limited cost and complexity. However, the inherent nonlinear effects and Amplified Spontaneous Emission (ASE) noise associated with such a device may degrade the system performance. Overcoming these impairments has been of central importance in this study. SOA nonlinearities have been mitigated by applying analog pre-distortion in the electrical domain. Phaser-based processing was also proposed to simultaneously reduce the ASE influence and linearize the SOA characteristics, thanks to up/down chirping performed on the transmitter/receiver sides. With Impulse Radio UWB transmission, due to the time properties of the modulation patterns, discrete lines arise in the corresponding spectrum, which may violate the FCC limit or reduce the power efficiency. A new shape randomization technique has been investigated, which proved effective in suppressing these spectral spikes.
The three approaches have shown great potential with On Off Keying and Pulse Position Modulation formats at long optical reach. The performance of Differential Chaos Shift Keying was finally examined in the over-fiber system; a lower error probability was experimentally achieved in comparison with other non-coherent modulations.
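As a rough, hedged illustration of the spectral-line problem described above (not taken from the thesis, and with an arbitrary pulse shape, repetition frequency and sampling rate), the Python sketch below builds a periodic on-off-keyed impulse train, applies one simple form of waveform randomization (random pulse polarity), and compares the strength of the discrete line at a multiple of the pulse repetition frequency before and after randomization.

```python
# Illustrative sketch (not the thesis technique as such): discrete spectral
# lines produced by a periodic OOK impulse train, and their suppression by
# randomizing the pulse polarity. All parameters are arbitrary example values.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 25e9               # sampling rate, 25 GS/s
prf = 100e6             # pulse repetition frequency, 100 MHz
n_pulses = 2000
spp = int(fs / prf)     # samples per pulse slot

# Gaussian monocycle as the elementary UWB pulse
t = (np.arange(spp) - spp / 2) / fs
tau = 50e-12
mono = -t / tau**2 * np.exp(-t**2 / (2 * tau**2))
mono /= np.abs(mono).max()

bits = rng.integers(0, 2, n_pulses)            # OOK data
signs = rng.choice([-1.0, 1.0], n_pulses)      # random polarity per pulse
fixed = np.concatenate([b * mono for b in bits])
randomized = np.concatenate([s * b * mono for s, b in zip(signs, bits)])

f, p_fixed = welch(fixed, fs=fs, nperseg=8 * spp)
_, p_rand = welch(randomized, fs=fs, nperseg=8 * spp)

def line_contrast(p):
    """PSD on a PRF harmonic (4.0 GHz) relative to the PSD halfway between lines."""
    on = p[np.argmin(np.abs(f - 4.00e9))]
    off = p[np.argmin(np.abs(f - 4.05e9))]
    return on / off

print("line contrast, fixed polarity :", round(line_contrast(p_fixed), 1))
print("line contrast, random polarity:", round(line_contrast(p_rand), 1))
```

With a fixed pulse shape the contrast is large (a discrete line sits on the continuous spectrum), while after polarity randomization it drops toward 1, which is the effect the abstract attributes to waveform randomization.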
92

Detecção de padrões espaciais na distribuição dos pacientes portadores de doença genética com deficiência física da Associação de Assistência à Criança Deficiente (AACD) de Pernambuco

CAMPOS, Ana Clara Paixão 02 February 2013 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / Knowing the spatial pattern of patients with genetic diseases and physical disabilities treated at the AACD of Pernambuco is of great importance, because it makes it possible to guide basic care for these individuals and to provide a solid basis for planning public health policies. However, there are few references in the literature using the tools of spatial analysis in the context of physical disability. This dissertation was structured in the form of two papers. In the first paper the global Moran's I index was used to characterize the spatial pattern of the rate of patients with genetic diseases and physical disabilities treated at the AACD of Pernambuco, and the results were compared with those obtained with a randomization test. Both approaches found a clustered spatial pattern for this rate when the rates of the four municipalities closest to each location were taken into account. In the second paper we evaluated the performance of the global Moran's I index and of the Mantel test with Spearman correlation, both using randomization to assess statistical significance, regarding their ability to detect a spatial pattern for the rate of patients with genetic diseases and physical disabilities treated at the AACD of Pernambuco. The results showed that the global Moran's I index is the more satisfactory method for detecting the spatial pattern, since it uses neighborhood information in its calculations and provides greater control of the rejection rates of the null hypothesis under study. / Conhecer o padrão espacial dos portadores de doença genética com deficiência física em tratamento na AACD de Pernambuco é de grande importância, pois torna possível orientar a assistência aos cuidados básicos desses indivíduos e fornecer base sólida para o planejamento de políticas públicas de saúde. Entretanto, existem poucas referências na literatura utilizando o instrumental da análise espacial no contexto da deficiência física. A presente dissertação foi estruturada na forma de dois artigos científicos. No primeiro artigo o Índice I global de Moran foi utilizado para caracterizar o padrão espacial da taxa de pacientes portadores de doença genética com deficiência física em tratamento na AACD de Pernambuco e os resultados encontrados foram comparados com os obtidos através do teste de aleatorização. Em ambas as metodologias constatou-se a existência de padrão espacial agregado para a taxa de pacientes portadores de doença genética com deficiência física em tratamento na AACD de Pernambuco quando se levou em consideração as taxas dos 4 municípios mais próximos de cada localidade.
No segundo artigo foi avaliado o desempenho do índice I global de Moran e do teste de Mantel com correlação de Spearman, ambos utilizando aleatorização para avaliar a significância da estatística, no que tange à capacidade de detectar padrão espacial para a taxa de pacientes portadores de doenças genéticas com deficiência física em tratamento na AACD de Pernambuco. Os resultados indicaram que o índice I global de Moran mostrou-se uma metodologia mais satisfatória para detectar o padrão espacial, uma vez que utiliza em seus cálculos as informações da vizinhança, além de proporcionar maior controle das taxas de rejeição da hipótese nula em estudo.
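A minimal sketch of the first paper's core computation, on simulated data rather than the AACD records (the coordinates and rates below are invented): the global Moran's I with a row-standardized 4-nearest-neighbor weight matrix, with significance assessed by a randomization (permutation) test.

```python
# Minimal sketch on simulated data: global Moran's I with a k = 4
# nearest-neighbor weight matrix and a randomization (permutation) test.
import numpy as np

rng = np.random.default_rng(1)
n = 100
coords = rng.uniform(0, 10, size=(n, 2))   # hypothetical municipality centroids
rate = rng.poisson(5, n).astype(float)     # hypothetical patient rates

# Row-standardized weights: w_ij = 1/4 for the 4 nearest neighbors of i
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
np.fill_diagonal(d, np.inf)
W = np.zeros((n, n))
for i in range(n):
    W[i, np.argsort(d[i])[:4]] = 0.25

def morans_i(x):
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

i_obs = morans_i(rate)

# Randomization test: permute the rates over the locations
perms = np.array([morans_i(rng.permutation(rate)) for _ in range(999)])
p_value = (1 + np.sum(perms >= i_obs)) / (999 + 1)
print(f"Moran's I = {i_obs:.3f}, one-sided permutation p = {p_value:.3f}")
```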
93

Cache Prediction and Execution Time Analysis on Real-Time MPSoC

Neikter, Carl-Fredrik January 2008 (has links)
Real-time systems do not only require that the logical operations are correct. Equally important is that the specified time constraints are always met. This has been studied successfully before for mono-processor systems. However, as the hardware in the systems gets more complex, the previous approaches become invalid. For example, multi-processor systems-on-chip (MPSoC) are becoming more and more common, and together with a shared memory, the bus access time is unpredictable in nature. This has recently been resolved, but a safe and not too pessimistic cache analysis approach for MPSoC has not been investigated before. This thesis has resulted in the design and implementation of algorithms for cache analysis on real-time MPSoC with a shared communication infrastructure. An additional advantage is that the algorithms include improvements compared to previous approaches for mono-processor systems. The verification of these algorithms has been performed with the help of data flow analysis theory. Furthermore, it is not known how different types of cache-miss characteristics of a task influence the worst-case execution time on MPSoC. Therefore, a program that generates randomized tasks, according to different parameters, has been constructed. The parameters can, for example, influence the complexity of the control flow graph and the average distance between the cache misses.
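The thesis' task generator is not reproduced here; the sketch below only illustrates the idea of a randomized task generator whose control-flow complexity and average distance between cache misses are controlled by parameters. The block counts, access counts and the geometric miss model are assumptions made for the example.

```python
# Illustrative sketch (assumed model, not the thesis tool): generate a random
# task as a control flow graph whose cache-miss pattern is governed by two
# parameters: the branching factor and the average distance between misses.
import random

def generate_task(n_blocks=50, branching=2, avg_miss_distance=8, seed=0):
    rng = random.Random(seed)
    cfg = {}      # basic block id -> list of successor block ids
    misses = {}   # basic block id -> simulated number of cache misses
    for b in range(n_blocks):
        n_succ = rng.randint(1, branching)
        cfg[b] = [rng.randrange(n_blocks) for _ in range(n_succ)]
        # Each memory access misses with probability 1/avg_miss_distance,
        # giving a geometric inter-miss distance with the requested mean.
        accesses = rng.randint(5, 20)
        misses[b] = sum(rng.random() < 1.0 / avg_miss_distance
                        for _ in range(accesses))
    return cfg, misses

cfg, misses = generate_task(branching=3, avg_miss_distance=12)
print("successors of block 0:", cfg[0])
print("total simulated cache misses:", sum(misses.values()))
```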
94

Epidemiological applications of quantitative serum NMR metabolomics: causal inference from observational studies

Wang, Q. (Qin) 10 March 2017 (has links)
Abstract Cardiovascular diseases are the leading cause of death worldwide, and type 2 diabetes is reaching global epidemic proportions. Epidemiological studies have identified numerous risk factors and pharmacotherapies in relation to these cardiometabolic diseases. However, the detailed molecular mechanisms of these risk factors and drug therapies generally remain incompletely understood. Elucidating the underlying molecular effects would be essential for a better understanding of disease pathogenesis and for discovering new therapeutic targets. Quantitative serum metabolomics, which allows for simultaneous quantification of multiple circulating metabolic measures, provides a hypothesis-free approach to systematically inspect the metabolic changes in response to endogenous and exogenous stimuli. Metabolomics thus presents a valuable tool to study the detailed molecular effects of disease risk factors and drug therapies. However, most current metabolomics studies are small and cross-sectional, and the causal relations of the risk factors to the metabolic measures are generally unclear, which limits their public health impact. The present thesis serves as a proof of concept to illustrate that well-designed observational studies can be used to infer causality. With the exemplars of assessing the molecular effects of two risk factors (body mass index and sex hormone-binding globulin) and two drug therapies (statins and oral contraceptives), the thesis demonstrates that improved causal inference can be achieved in observational studies via the combination of multiple study designs, including cross-sectional, longitudinal and Mendelian randomization analyses. This robust study design approach, together with metabolomics data, can also be extended to study the molecular effects of other risk factors and drug therapies. An improved molecular understanding of a wide range of risk factors and therapies will in turn support a better understanding of disease pathogenesis. / Tiivistelmä Sydän- ja verisuonitaudit ovat johtava kuolinsyy maailmassa ja tyypin 2 diabetes on saavuttamassa globaalin epidemian mittasuhteet. Epidemiologiset tutkimukset ovat löytäneet useita riskitekijöitä ja lääkehoitoja edellä mainituille yleisille taudeille. Tyypin 2 diabetekseen ja sydän- ja verisuonitauteihin liittyvät yksityiskohtaiset molekylaariset mekanismit ymmärretään kuitenkin puutteellisesti. Molekylaaristen yksityiskohtien tarkempi ymmärtäminen olisi siten erittäin merkittävää sekä tautiprosessien ymmärtämiseksi että lääkehoitojen kehittämiseksi. Seerumin kvantitatiivinen metabolomiikka mahdollistaa useiden metabolisten suureiden samanaikaisen määrittämisen verenkierrosta ja tarjoaa siten hypoteesittoman lähestymistavan sekä sisäisten että ulkoisten ärsykkeiden aiheuttamien metabolisten muutosten systemaattiseen tutkimukseen. Metabolomiikka on siten arvokas työkalu yksityiskohtaisten molekylaaristen mekanismien tutkimuksessa, olipa kyseessä taudin riskitekijät tai lääkehoito. Metabolomiikkatutkimuksia on kuitenkin pääasiassa tehty pienissä poikittaistutkimuksissa ja riskitekijöihin liittyvien metabolisten suureiden syy- ja seuraussuhteet ovat yleisesti epäselviä, josta johtuen metabolisten suureiden kansanterveydellinen sovellettavuus on ollut heikkoa. Tämä väitöskirja esittelee tutkimuskonseptin hyvin suunniteltujen havaintotutkimuksien soveltamiseksi syy- ja seuraussuhteiden arvioinnissa.
Työ sisältää esimerkit kahden riskitekijän (painoindeksi ja sukupuolihormoneja sitova globuliini) ja kahden lääkehoidon (statiinit ja ehkäisypillerit) molekylaaristen vaikutusten kausaalisista tutkimuksista. Tulokset havainnollistavat, että kausaalisten johtopäätösten luotettavuutta voidaan parantaa yhdistämällä useita tutkimusasetelmia, kuten poikittais- ja pitkittäistutkimuksia sekä Mendelististä satunnaistamista. Esitettyjä luotettavia tutkimusasetelmia, yhdessä metabolomiikkadatan kanssa, voidaan laajentaa muiden riskitekijöiden ja lääkehoitojen molekylaaristen vaikutusten tutkimuksiin. Parantunut molekyylitason ymmärrys useista riskitekijöistä ja lääkehoidoista johtaa myös parempaan tautiprosessien ymmärtämiseen.
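As a hedged illustration of the Mendelian randomization component of such a design (fully simulated summary statistics, not the thesis data), the sketch below combines per-variant Wald ratios into an inverse-variance-weighted causal estimate.

```python
# Sketch with simulated summary statistics: inverse-variance-weighted (IVW)
# Mendelian randomization, combining per-SNP Wald ratios beta_out / beta_exp.
import numpy as np

rng = np.random.default_rng(2)
n_snps = 20
true_effect = 0.3                               # assumed causal effect

beta_exp = rng.normal(0.1, 0.03, n_snps)        # SNP -> exposure associations
beta_out = true_effect * beta_exp + rng.normal(0, 0.01, n_snps)
se_out = np.full(n_snps, 0.01)                  # SEs of SNP -> outcome associations

wald = beta_out / beta_exp                      # per-SNP causal estimates
weights = (beta_exp / se_out) ** 2              # first-order IVW weights
ivw = np.sum(wald * weights) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))
print(f"IVW estimate: {ivw:.3f} (SE {ivw_se:.3f}), simulated truth {true_effect}")
```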
95

Statistical Designs for Network A/B Testing

Pokhilko, Victoria V 01 January 2019 (has links)
A/B testing refers to the statistical procedure of experimental design and analysis to compare two treatments, A and B, applied to different testing subjects. It is widely used by technology companies such as Facebook, LinkedIn, and Netflix to compare different algorithms, web designs, and other online products and services. The subjects participating in these online A/B testing experiments are users who are connected in different scales of social networks. Two connected subjects are similar in terms of their social behaviors, education and financial background, and other demographic aspects. Hence, it is only natural to assume that their reactions to online products and services are related to their network adjacency. In this research, we propose to use the conditional autoregressive (CAR) model to represent the network structure and to include the network effects in the estimation and inference of the treatment effect. The following statistical designs are presented: a D-optimal design for network A/B testing, a re-randomization experimental design approach for network A/B testing, and a covariate-assisted Bayesian sequential design for network A/B testing. The effectiveness of the proposed methods is shown through numerical results with synthetic networks and real social networks.
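A minimal sketch of the re-randomization idea on a synthetic network (the balance covariate, acceptance threshold and network model are assumptions, not the dissertation's design): keep drawing treatment assignments until the two arms are balanced on a network covariate, here node degree.

```python
# Sketch: re-randomization for network A/B testing on a synthetic network.
# Assignments are redrawn until the arms are balanced on node degree.
import numpy as np

rng = np.random.default_rng(3)
n = 200
A = (rng.random((n, n)) < 0.05).astype(int)   # Erdos-Renyi stand-in network
A = np.triu(A, 1)
A = A + A.T                                   # symmetric adjacency, no self-loops
degree = A.sum(axis=1)

def rerandomize(threshold=0.05, max_draws=10_000):
    for _ in range(max_draws):
        treat = rng.permutation(np.repeat([0, 1], n // 2))
        # standardized difference in mean degree between the two arms
        diff = abs(degree[treat == 1].mean() - degree[treat == 0].mean()) / degree.std()
        if diff < threshold:
            return treat, diff
    raise RuntimeError("no acceptable assignment found")

treat, diff = rerandomize()
print(f"accepted assignment, standardized degree imbalance = {diff:.4f}")
```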
96

La génétique humaine pour l'étude de cibles pharmacologiques

Legault, Marc-André 03 1900 (has links)
En étudiant les variations génétiques au sein d'une population, il est possible d'identifier des polymorphismes génétiques qui confèrent une protection naturelle contre la maladie. Si l'on parvient à comprendre le mécanisme moléculaire qui sous-tend cette protection, par exemple en reliant la variation génétique à la perturbation d'une protéine bien précise, il pourrait être possible de développer des thérapies pharmacologiques qui agissent sur la même cible biologique. Cette relation entre les médicaments et les variations génétiques est une des prémisses centrales de la validation génétique de cibles pharmacologiques qui est un facteur de réussite dans le développement de médicaments. Dans cette thèse, nous utiliserons un modèle génétique pour prédire les effets bénéfiques et indésirables de l'ivabradine, un médicament utilisé afin de réduire la fréquence cardiaque. L'ivabradine est un inhibiteur du canal ionique potassium/sodium hyperpolarization-activated cyclic nucleotide-gated channel 4, encodé par le gène HCN4, dont les bénéfices sont hétérogènes chez différentes populations de patients. Ce médicament est efficace pour le traitement de l'angine et de l'insuffisance cardiaque, mais s'est avéré inefficace en prévention secondaire chez des patients coronariens stables sans dysfonction systolique. La caractérisation des effets de l'ivabradine s'est échelonnée sur une période de 6 ans et trois grands essais de phase III ont été menés. Nous étudierons la possibilité d'avoir prédit ou accéléré ce processus à l'aide de modèles génétiques et nous contrasterons les effets spécifiques à l'ivabradine des effets généraux de la réduction de la fréquence cardiaque par une approche de randomisation mendélienne. Deuxièmement, une approche génétique sera utilisée pour évaluer l'effet de l'inhibition de la cholesteryl ester transfer protein (CETP), une enzyme responsable du transfert des cholestérols estérifiés et des triglycérides entre différentes lipoprotéines ainsi qu'une cible pharmacologique largement étudiée pour le traitement de la maladie coronarienne. Les études génétiques prédisent un bénéfice à l'inhibition de CETP, mais les essais randomisés ont eu des résultats hétérogènes et décevants. Nous utiliserons un modèle génétique d'inhibition de la CETP pour identifier des variables qui peuvent moduler l'effet de l'inhibition de la CETP sur des biomarqueurs et la maladie ischémique. Les biomarqueurs pris en compte comprennent les taux de cholestérol à lipoprotéines de basse et haute densité, mais aussi la capacité du plasma à absorber le cholestérol, une mesure fonctionnelle importante et sous-étudiée. Le sexe et l'indice de masse corporelle se sont avérés être deux variables qui modifient fortement les effets d'une réduction génétiquement prédite de la concentration de CETP sur les paramètres étudiés. Notre modèle prédit un bénéfice plus important de l'inhibition de la CETP pour les femmes et les individus ayant un indice de masse corporelle normal sur le profil lipidique, mais nous n'avons pas pu démontrer une modulation de l'effet sur la maladie ischémique. Cette étude reste importante sur le plan méthodologique, car elle soulève la possibilité d'utiliser des modèles génétiques de cibles pharmacologiques pour prédire l'hétérogénéité dans la réponse au médicament, une lacune des essais randomisés classiques. Enfin, nous avons adopté une approche centrée sur les gènes pour caractériser l'effet de 19 114 protéines humaines sur 1 210 phénotypes de la UK Biobank.
Les résultats de cette étude sont accessibles au public (https://exphewas.statgen.org/) et constituent une ressource précieuse pour cerner rapidement les conséquences phénotypiques associées à un locus. Dans le contexte de validation de cibles pharmacologiques, cette plate-forme web peut aider à rapidement identifier les problèmes de sécurité potentiels ou à découvrir des possibilités de repositionnement du médicament. Un exemple d'utilisation de cette plate-forme est présenté où nous identifions le gène de la myotiline comme un nouvel acteur potentiel dans la pathogénèse de la fibrillation auriculaire. / Using population-level data, it is possible to identify genetic polymorphisms that confer natural protection against disease. If the molecular mechanism underlying this protection can be understood, for example by linking variants to the disruption of a particular protein, it may be possible to develop drugs that act on the same biological target. This link between drugs and variants is a central premise of genetic drug target validation. In this work, a genetic model is used to predict the beneficial and adverse effects of ivabradine, a drug used to lower heart rate. Ivabradine is an inhibitor of the ion channel potassium/sodium hyperpolarization-activated cyclic nucleotide-gated channel 4, encoded by the HCN4 gene, with heterogeneous benefits in different patient populations. This drug is effective in the treatment of angina and heart failure, but it proved ineffective in patients with stable coronary artery disease without systolic dysfunction. Characterization of the effect of ivabradine occurred over a 6-year period, and three large phase III trials were conducted. We will investigate whether this process could have been streamlined using genetic models and contrast the ivabradine-specific effect with the general effect of heart rate reduction using a Mendelian randomization approach. Second, a genetic approach is used to study the effect of inhibiting the cholesteryl ester transfer protein (CETP), an enzyme responsible for the transfer of cholesteryl esters and triglycerides between different lipoproteins and a widely studied drug target for the treatment of coronary artery disease. Genetic studies predict a benefit of CETP inhibition, but randomized trials yielded heterogeneous and disappointing results. We will use a genetic model of CETP inhibition to identify variables that may modulate the effect of CETP inhibition on biomarkers and ischemic disease. The biomarkers we considered included low- and high-density lipoprotein cholesterol levels but also the plasma cholesterol efflux capacity, an important and understudied functional measure of high-density lipoproteins. Sex and body mass index strongly modulated the effect of a genetically predicted lower CETP concentration on the lipid profile. Our model predicts a greater benefit of CETP inhibition on the lipid profile in women and in individuals with a normal body mass index, but these observations did not translate into changes in the effect on cardiovascular outcomes. This study remains methodologically important because it demonstrates the possibility of using genetic models of drug targets to predict heterogeneity in drug response, a shortcoming of conventional randomized trials. Finally, we adopted a gene-centric approach to characterize the effect of 19,114 human protein-coding genes on 1,210 UK Biobank phenotypes.
The results of this study are publicly available (https://exphewas.statgen.org/) and provide a valuable resource to rapidly screen the phenotypic consequences associated with a gene. In the context of drug target validation, this platform can help quickly identify potential safety issues or discover drug repurposing opportunities. An example of the use of this platform is presented where we identify the myotilin gene as a potential atrial fibrillation gene.
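A hedged sketch of the kind of effect-modification analysis described for CETP, on fully simulated data with hypothetical effect sizes: regress a lipid biomarker on a genetic score proxying lower CETP concentration, sex, and their interaction.

```python
# Fully simulated sketch (hypothetical effect sizes): does the association of a
# genetic score proxying lower CETP concentration with a lipid biomarker differ
# by sex? Estimated with an ordinary least squares interaction model.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
score = rng.normal(size=n)            # genetic score for lower predicted CETP
female = rng.integers(0, 2, n)        # 1 = female, 0 = male
# Assumed: 0.15 SD lipid change per SD of score in men, 0.25 SD in women
lipid = 0.15 * score + 0.10 * score * female + rng.normal(size=n)

X = np.column_stack([np.ones(n), score, female, score * female])
beta, *_ = np.linalg.lstsq(X, lipid, rcond=None)
print("score main effect (men)    :", round(beta[1], 3))
print("score x female interaction :", round(beta[3], 3))
```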
97

Optimizing Body Mass Index Targets Using Genetics and Biomarkers

Khan, Irfan January 2021 (has links)
Introduction/Background: Guidelines from the World Health Organization currently recommend targeting a body mass index (BMI) between 18.5 and 24.9 kg/m2, based on the lowest risk of mortality observed in epidemiological studies. However, these recommendations are based on population observations and do not take into account potential inter-individual differences. We hypothesized that genetic and non-genetic differences in adiposity, anthropometric, and metabolic measures result in inter-individual variation in the optimal BMI. Methods: Genetic variants associated with BMI as well as related adiposity, anthropometric, and metabolic phenotypes (e.g. triglycerides (TG)) were combined into polygenic risk scores (PRS), cumulative risk scores derived from the weighted contributions of each variant. 387,692 participants in the UK Biobank were split by quantiles of PRS or of clinical biomarkers such as C-reactive protein (CRP) and alanine aminotransferase (ALT). The BMI linked with the lowest risk of all-cause and cause-specific mortality outcomes (“nadir value”) was then compared across quantiles (“Cox meta-regression model”). Our results were replicated using the non-linear Mendelian randomization (NLMR) model to assess causality. Results: The nadir value for the BMI–all-cause mortality relationship differed across percentiles of the BMI PRS, suggesting inter-individual variation in optimal BMI based on genetics (p = 0.005). There was a difference of 1.90 kg/m2 in predicted optimal BMI between individuals in the top and bottom 5th BMI PRS percentile. Individuals with above- and below-median TG (p = 1.29 × 10-4), CRP (p = 7.92 × 10-5), and ALT (p = 2.70 × 10-8) levels differed in the nadir for this relationship. There was no difference in the computed nadir between the Cox meta-regression and NLMR models (p = 0.102). Conclusions: The impact of BMI on mortality is heterogeneous due to individual differences in genetics and clinical biomarker levels. Although we cannot confirm that our results are causal, genetics and clinical biomarkers have potential use for making more tailored BMI recommendations for patients. / Thesis / Master of Science (MSc) / The World Health Organization (WHO) recommends targeting a body mass index (BMI) between 18.5 and 24.9 kg/m2 for optimal health. However, this recommendation does not take into account individual differences in genetics or biology. Our project aimed to determine whether the optimal BMI, or the BMI associated with the lowest risk of mortality, varies due to genetic or biological variation. Analyses were conducted across 387,692 individuals. We divided participants into groups according to genetic risk for obesity or clinical biomarker profile. Our results show that the optimal BMI varies according to genetic or biomarker profile. WHO recommendations do not account for this variation, as the optimal BMI can fall under the normal (18.5 to 24.9 kg/m2) or overweight (25.0 to 29.0 kg/m2) WHO BMI categories depending on individual genetic or biomarker profile. Thus, there is potential for using genetic and/or biomarker profiles to make more precise BMI recommendations for patients.
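A small simulation (not the UK Biobank analysis; the distributions, effect sizes and strata are assumed) showing how a nadir BMI can be estimated within strata of a polygenic risk score by fitting a quadratic to the empirical log-risk and taking -b/(2a).

```python
# Simulated sketch: estimate the BMI with the lowest mortality risk ("nadir")
# within polygenic-risk-score strata via a quadratic fit to binned log-risk.
import numpy as np

rng = np.random.default_rng(5)

def simulate_stratum(nadir_bmi, n=50_000):
    bmi = rng.normal(27, 4, n)
    log_odds = -4.0 + 0.01 * (bmi - nadir_bmi) ** 2      # U-shaped risk
    died = rng.random(n) < 1 / (1 + np.exp(-log_odds))
    return bmi, died

for label, true_nadir in [("bottom 5% BMI PRS", 23.0), ("top 5% BMI PRS", 24.9)]:
    bmi, died = simulate_stratum(true_nadir)
    # crude quadratic fit of empirical log-risk in 1-unit BMI bins
    lows = np.arange(18, 37)
    centers = lows + 0.5
    risk = np.array([died[(bmi >= lo) & (bmi < lo + 1)].mean() for lo in lows])
    a, b, c = np.polyfit(centers, np.log(risk), 2)
    print(f"{label}: estimated nadir BMI = {-b / (2 * a):.1f} "
          f"(simulated truth {true_nadir})")
```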
98

Détection de l’invalidité et estimation d’un effet causal en présence d’instruments invalides dans un contexte de randomisation mendélienne

Boucher-Roy, David 08 1900 (has links)
La randomisation mendélienne est une méthode d’instrumentation utilisant des instruments de nature génétique afin d’estimer, via par exemple la régression des moindres carrés en deux étapes, une relation de causalité entre un facteur d’exposition et une réponse lorsque celle-ci est confondue par une ou plusieurs variables de confusion non mesurées. La randomisation mendélienne est en mesure de gérer le biais de confusion à condition que les instruments utilisés soient valides, c’est-à-dire qu’ils respectent trois hypothèses clés. On peut généralement se convaincre que deux des trois hypothèses sont satisfaites alors qu’un phénomène génétique, la pléiotropie, peut parfois rendre la troisième hypothèse invalide. En présence d’invalidité, l’estimation de l’effet causal de l’exposition sur la réponse peut être sévèrement biaisée. Afin d’évaluer la potentielle présence d’invalidité lorsqu’un seul instrument est utilisé, Glymour et al. (2012) ont proposé une méthode qu’on dénomme ici l’approche de la différence simple qui utilise le signe de la différence entre l’estimateur des moindres carrés ordinaires de la réponse sur l’exposition et l’estimateur des moindres carrés en deux étapes calculé à partir de l’instrument pour juger de l’invalidité de l’instrument. Ce mémoire introduit trois méthodes qui s’inspirent de cette approche, mais qui sont applicables à la randomisation mendélienne à instruments multiples. D’abord, on introduit l’approche de la différence globale, une simple généralisation de l’approche de la différence simple au cas des instruments multiples qui a comme objectif de détecter si un ou plusieurs instruments utilisés sont invalides. Ensuite, on introduit les approches des différences individuelles et des différences groupées, deux méthodes qui généralisent les outils de détection de l’invalidité de l’approche de la différence simple afin d’identifier des instruments potentiellement problématiques et proposent une nouvelle estimation de l’effet causal de l’exposition sur la réponse. L’évaluation des méthodes passe par une étude théorique de l’impact de l’invalidité sur la convergence des estimateurs des moindres carrés ordinaires et des moindres carrés en deux étapes et une simulation qui compare la précision des estimateurs résultant des différentes méthodes et leur capacité à détecter l’invalidité des instruments. / Mendelian randomization is an instrumentation method that uses genetic instruments to estimate, via two-stage least squares regression for example, a causal relationship between an exposure and an outcome when the relationship is confounded by one or more unmeasured confounders. Mendelian randomization can handle confounding bias provided that the instruments are valid, i.e., that they meet three key assumptions. While two of the three assumptions can usually be satisfied, the third assumption is often invalidated by a genetic phenomenon called pleiotropy. In the presence of invalid instruments, the estimate of the causal effect of exposure on the outcome may be severely biased. To assess the potential presence of an invalid instrument in single-instrument studies, Glymour et al. (2012) proposed a method, hereinafter referred to as the simple difference approach, which uses the sign of the difference between the ordinary least squares estimator of the outcome on the exposure and the two-stage least squares estimator calculated using the instrument. Based on this approach, we introduce three methods applicable to Mendelian randomization with multiple instruments. 
The first method is the global difference approach and corresponds to a simple generalization of the simple difference approach to the case of multiple instruments that aims to detect whether one or more instruments are invalid. Next, we introduce the individual differences and the grouped differences approaches, two methods that generalize the simple difference approach to identify potentially invalid instruments and provide new estimates of the causal effect of the exposure on the outcome. The methods are evaluated using a theoretical investigation of the impact that invalid instruments have on the convergence of the ordinary least squares and two-stage least squares estimators as well as with a simulation study that compares the accuracy of the respective estimators and the ability of the corresponding methods to detect invalid instruments.
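A minimal simulated example of the simple difference approach under positive confounding (arbitrary effect sizes, one genetic instrument): compare the ordinary least squares estimate with the two-stage least squares (Wald ratio) estimate, with and without direct pleiotropy of the instrument on the outcome.

```python
# Simulated sketch: under positive confounding, OLS should exceed 2SLS for a
# valid instrument; a 2SLS estimate larger than OLS flags possible invalidity.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
g = rng.binomial(2, 0.3, n)              # genetic instrument (0/1/2 alleles)
u = rng.normal(size=n)                   # unmeasured confounder
x = 0.3 * g + u + rng.normal(size=n)     # exposure
beta = 0.2                               # simulated causal effect of x on y

def estimates(pleiotropy):
    y = beta * x + u + pleiotropy * g + rng.normal(size=n)
    ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    tsls = np.cov(g, y)[0, 1] / np.cov(g, x)[0, 1]     # single-instrument 2SLS
    return ols, tsls

for label, pl in [("valid instrument", 0.0), ("direct pleiotropy", 0.2)]:
    ols, tsls = estimates(pl)
    print(f"{label}: OLS = {ols:.2f}, 2SLS = {tsls:.2f}, OLS - 2SLS = {ols - tsls:+.2f}")
```

With a valid instrument the difference keeps the sign expected under positive confounding; strong pleiotropy flips it, which is the signal the difference-based approaches exploit.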
99

Study of neural correlates of attention in mice with spectro-spatio-temporal approaches / En studie om neurala korrelater av uppmärksamhet hos möss med spektro-spatio-temporala tillvägagångssätt

Ortiz, Cantin January 2018 (has links)
While signatures of attention can be observed in widespread areas within and outside of cortex, the control of attention is thought to be regulated by higher cognitive brain areas, such as the prefrontal cortex. In their recent study in mice, Kim et al. showed that successful allocation of attention is characterized by increased spiking of a specific type of inhibitory interneurons, the parvalbumin neurons, and higher oscillatory activity in the gamma band in the local prefrontal network. It was recently demonstrated that encoding of working memory in prefrontal areas is linked to bursts of gamma oscillations, a discontinuous network process characterized by short periods of intense power in the gamma band. The relationship between attention and working memory is unclear, and it is possible that these two cognitive processes share encoding principles. To address this gap, the electrophysiological data collected in the Carlén Lab have been analyzed with advanced spatio-temporal approaches. In particular, we have analyzed bursting gamma activity in the medial prefrontal cortex during attentional processing and investigated the similarities to the gamma bursting observed during working memory. Gamma-band bursts during attention were reliably detected with several methods. We characterized several features of the bursts, including their occurrence, duration and amplitude. The neuronal firing rates during and outside of bursts were also computed. We investigated the correlation between the different criteria characterizing the gamma bursts and successful vs. failed allocation of attention. Control data were generated to discuss the obtained results. The aim of the study was to explore the hypothesis that the medial prefrontal cortex encodes attention through gamma bursts, which could reveal some similarities and differences in the coding of central cognitive processes. No clear difference was found in the characterization between successful and failed allocation of attention. In addition, results were very similar in the control set and the original data. No underlying mechanism could be identified from this analysis. Therefore, as the bursts occurring in the gamma band in the prefrontal cortex (PFC) were not discriminative with respect to the different tested conditions, they do not seem to encode information related to attention. / Även fast flera olika hjärnområdens aktivitet kan korreleras med uppmärksamhet, anses kontrollen av uppmärksamhet regleras av högre kognitiva hjärnområden, såsom främre hjärnbarken. I en nyligen publicerad artikel studerade Kim et al. hjärnaktiviteten hos möss och kunde visa att en framgångsrik uppmärksamhet kännetecknas av en ökad aktivitet av en specifik typ av inhiberande nervceller, parvalbumin celler, och högre oscillerande aktivitet i gammafrekvens i främre hjärnbarkens lokala nätverk. Det har nyligen visats att kodning av arbetsminne i främre hjärnbarken är kopplat till utbrott av gamma-oscillationer, en diskontinuerlig nätverksprocess som kännetecknas av korta perioder av intensiva oscillationer av det lokala nätverket i gammafrekvens. Relationen mellan uppmärksamhet och arbetsminne är oklar, och det är möjligt att dessa två kognitiva processer delar kodningsprinciper. För att minska detta gap av kunskap har den elektrofysiologiska datan som samlats in i Carlén Lab analyserats med avancerade spatio-temporala tillvägagångssätt.
I synnerhet har vi analyserat utbrott i gammaaktivitet i främre hjärnbarken under uppmärksamhet och undersökt likheterna med gamma-utbrott observerade under arbetsminne. Gamma-bandutbrott under uppmärksamhet påvisades på ett tillförlitligt sätt med flera metoder. Vi har karaktäriserat flera funktioner hos utbrotten, inklusive förekomsten, varaktigheten och amplituden. De enskilda cellernas aktivitet undersöktes även under och utanför utbrotten av gamma-oscillationer. Vi undersökte sambandet mellan de olika kriterier som karakteriserar gamma-utbrott under framgångsrik mot misslyckad allokering av uppmärksamhet. Kontrolldata genererades för att diskutera de erhållna resultaten. Syftet med studien var att utforska hypotesen att den främre hjärnbarken kodar uppmärksamhet genom gamma-utbrott, vilket kan avslöja vissa likheter och skillnader i kodning av centrala kognitiva processer. Ingen klar skillnad hittades i karaktäriseringen mellan framgångsrik och misslyckad allokering av uppmärksamhet. Dessutom var resultaten mycket likartade i kontrolluppsättningen och den ursprungliga datan. Ingen underliggande mekanism kunde identifieras ur denna analys. Eftersom de utbrott som uppstod i gamma-bandet i främre hjärnbarken inte var unika med hänsyn till de olika testade förhållandena, tycks de därför inte koda information relaterad till uppmärksamhet.
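An illustrative sketch of gamma-burst detection on a synthetic local field potential (the thresholds, band limits and minimum-duration criterion are assumptions, not the thesis settings): band-pass the signal at 30-80 Hz, take the Hilbert envelope, and report the occurrence, duration and peak amplitude of supra-threshold periods.

```python
# Synthetic sketch: detect gamma bursts as periods where the 30-80 Hz Hilbert
# envelope stays above mean + 2 SD for at least ~3 gamma cycles.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0
rng = np.random.default_rng(7)
t = np.arange(0, 10, 1 / fs)
lfp = rng.normal(0, 1, t.size)                   # background noise
for start in (2.0, 5.5, 8.0):                    # three synthetic 60 Hz bursts
    idx = (t >= start) & (t < start + 0.15)
    lfp[idx] += 3 * np.sin(2 * np.pi * 60 * t[idx])

b, a = butter(4, [30, 80], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, lfp)))
above = envelope > envelope.mean() + 2 * envelope.std()

# group consecutive supra-threshold samples into bursts
# (assumes the trace starts and ends below threshold)
edges = np.flatnonzero(np.diff(above.astype(int)))
for s, e in zip(edges[::2] + 1, edges[1::2] + 1):
    dur = (e - s) / fs
    if dur >= 3 / 60:                            # at least ~3 gamma cycles
        print(f"burst at {t[s]:.2f} s, duration {dur * 1e3:.0f} ms, "
              f"peak amplitude {envelope[s:e].max():.2f}")
```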
100

Vérification de la pléiotropie en randomisation mendélienne : évaluation méthodologique et application à l'estimation de l'effet causal de l'adiposité sur la pression artérielle

Mbutiwi, Fiston Ikwa Ndol 07 1900 (has links)
Introduction La randomisation mendélienne (RM) est une approche de plus en plus populaire dans les études observationnelles qui utilise des variants génétiques (habituellement des polymorphismes mononucléotidiques ou single-nucleotide polymorphisms, SNPs) associés à une exposition (hypothèse 1 ou pertinence) comme instruments pour estimer l’effet causal de cette exposition sur une issue, en assumant l’absence de confusion entre l’instrument et l’issue (hypothèse 2 ou indépendance) et l’absence d’un effet de l’instrument sur l’issue en dehors de l’exposition (hypothèse 3 ou restriction d’exclusion). Cependant, la validité des résultats de la RM est menacée par la pléiotropie, phénomène biologique par lequel un SNP affecte distinctement l’exposition et l’issue, qui est l’une des principales causes de violation de la restriction d’exclusion. Cette thèse examine certains défis méthodologiques pratiques de la RM relatifs à la vérification de la restriction d’exclusion et à la validité des résultats à travers trois principaux objectifs : 1) cartographier comment les chercheurs en RM préviennent, détectent et/ou contrôlent, et discutent des violations potentielles de la restriction d'exclusion dues notamment à la pléiotropie ; 2) évaluer la performance de la méthode basée sur la confusion positive, qui compare les estimés ponctuels de l’effet de l’exposition sur l’issue obtenus par la RM et par la régression conventionnelle, dans la détection des instruments invalides dans plusieurs contextes pratiques d’études de RM ; et 3) examiner l’impact des méthodes courantes de gestion de la médication antihypertensive dans les études de RM modélisant la pression artérielle (PA) sur l'estimation de l’effet causal et la détection des violations potentielles de la restriction d'exclusion. Méthodes Pour l’objectif 1, une revue de littérature de 128 études de RM ayant utilisé au moins un SNP sur le gène FTO (fat mass and obesity-associated) comme instrument de l’indice de masse corporelle (IMC) a été réalisée. La façon dont les auteurs préviennent, évaluent ou contrôlent, et discutent des violations potentielles de la restriction d’exclusion dues notamment à la pléiotropie a été examinée. Pour l’objectif 2, une étude de simulation statistique considérant des contextes d’études de RM utilisant comme instrument un SNP ou un score de risque génétique (genetic risk score, GRS), une issue continue ou binaire, dans des scénarios évaluant l’impact de la taille de l’échantillon et du type de pléiotropie (indirect ou direct), a été réalisée. La performance de la méthode basée sur la confusion positive a été définie comme le pourcentage de jeux de données simulés dans lesquels la méthode détectait des instruments invalides. Pour l’objectif 3, une étude de RM de l’association entre l’IMC et la PA systolique (PAS) a été réalisée. Les méthodes de gestion de la médication antihypertensive examinées étaient : (i) pas de correction, (ii) inclure la médication dans les modèles comme une covariable d’ajustement, (iii) exclure de l’analyse les sujets traités aux antihypertenseurs, (iv) ajouter une valeur constante de 15 mm Hg aux valeurs mesurées de la PAS chez les sujets traités aux antihypertenseurs, et (v) utiliser comme issue un indicateur binaire de l'hypertension. Résultats Il existe une pléthore de méthodes utilisées dans les études de RM dont certaines peuvent être sous-optimales à prévenir, détecter ou contrôler le biais dû à l’inclusion des SNPs pléiotropiques. 
Les simulations statistiques montrent qu'en RM utilisant un SNP comme instrument, la méthode basée sur la confusion positive est performante à détecter l'invalidité de l'instrument lorsque la pléiotropie est directe plutôt qu'indirecte, indépendamment de l'issue, mais la performance de la méthode s'améliore avec l'augmentation de la taille de l'échantillon. En revanche, la méthode est moins performante à détecter l'invalidité lorsque l'instrument est un GRS, mais sa performance augmente avec la proportion des SNPs invalides inclus dans le GRS. Enfin, les estimations de la RM varient énormément selon la stratégie de gestion de la médication antihypertensive choisie, contrairement à la détection des violations de la restriction d'exclusion qui n'en est pas affectée. Conclusion Cette thèse met de l'avant certaines difficultés méthodologiques dans les applications de la RM et l'importance de la triangulation de plusieurs méthodes dans la vérification des hypothèses de RM. Le champ de la RM est en plein essor, et de nouvelles méthodes sont souvent proposées. Il devient important non seulement de les évaluer, mais aussi d'en détailler l'utilisation et les hypothèses sous-jacentes pour une utilisation optimale en complément aux méthodes existantes. / Introduction Mendelian randomization (MR) is an increasingly popular technique in observational studies that uses genetic variants (usually single-nucleotide polymorphisms, SNPs) associated with an exposure (Assumption 1 or relevance) as instruments to estimate the causal effect of that exposure on an outcome, assuming no confounding between the instrument and the outcome (Assumption 2 or independence) and no effect of the instrument on the outcome outside of its association with the exposure (Assumption 3 or exclusion restriction). However, the validity of MR results is challenged by pleiotropy, the biological phenomenon whereby a SNP distinctly affects the exposure and the outcome, which is one of the leading causes of violation of the exclusion restriction assumption. This thesis examines some practical MR methodological challenges related to the assessment of the exclusion restriction and the validity of MR results through three main objectives: 1) to examine how MR researchers prevent, detect, and/or control for, and discuss potential violations of the exclusion restriction due especially to pleiotropy; 2) to evaluate the performance of the leveraging positive confounding (LPC) method, which compares the MR and conventional point estimates, in detecting invalid instruments in several practical MR settings; and 3) to examine the impact of commonly used methods of accounting for antihypertensive medication in MR studies modeling blood pressure (BP) on the estimation of the causal effect and the detection of potential violations of the exclusion restriction. Methods For Objective 1, a literature review of 128 MR studies that used at least one SNP in the fat mass and obesity-associated (FTO) gene as an instrument for body mass index (BMI) was conducted to examine how the authors prevent, detect, or control for, and discuss potential violations of the exclusion restriction, especially due to pleiotropy. For Objective 2, a simulation study was performed considering MR settings that use a single SNP or a genetic risk score (GRS) as the instrument and a continuous or binary outcome, in scenarios evaluating the impact of sample size and type of pleiotropy (indirect vs. direct).
The performance of the LPC method was assessed as the percentage of simulated datasets in which the LPC method detected invalid instruments. For Objective 3, an MR study of the association between BMI and systolic BP (SBP) was performed. The methods for accounting for antihypertensive medication examined were: (i) no adjustment, (ii) include medication in the models as an adjustment covariate, (iii) exclude from the analysis subjects treated with antihypertensive medication, (iv) add a constant value of 15 mm Hg to the measured values of SBP in subjects using antihypertensive medication, and (v) use as outcome a binary indicator of hypertension. Results There exists a plethora of methods used in MR studies, some of which may be suboptimal for preventing, detecting, or controlling for bias due to the inclusion of pleiotropic SNPs. Statistical simulations show that in MR using a single SNP as an instrument, the LPC method performs better at detecting invalidity of the instrument when the pleiotropy is direct rather than indirect, regardless of the outcome, although the performance of the method improves with increasing sample size. In contrast, the method performs less well in detecting invalidity when the instrument is a GRS, but its performance increases with the proportion of invalid SNPs included in the GRS. Finally, MR estimates change greatly depending on the chosen strategy for accounting for antihypertensive medication, in contrast to the detection of exclusion restriction violations, which is not affected. Conclusion The present thesis highlights some of the methodological challenges in MR applications and the importance of triangulating multiple methods when assessing the MR assumptions. The MR field is booming, and new methods are often proposed. Therefore, it is important to evaluate these methods as well as to detail their application and underlying assumptions for optimal use as a complement to existing methods.
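A hedged sketch (simulated cohort, assumed 15 mm Hg treatment effect, arbitrary effect sizes) of why the medication-handling strategy matters: the genetically predicted effect of BMI on SBP, estimated with a single Wald ratio, is computed with no correction, with treated subjects excluded, and with 15 mm Hg added back to treated subjects' measurements.

```python
# Simulated sketch: three ways of handling antihypertensive medication and
# their impact on a Wald-ratio estimate of the BMI effect on systolic BP.
import numpy as np

rng = np.random.default_rng(8)
n = 200_000
grs = rng.normal(size=n)                        # standardized BMI genetic score
bmi = 27 + 2.0 * grs + rng.normal(0, 3, n)
sbp_true = 110 + 0.8 * bmi + rng.normal(0, 12, n)
treated = rng.random(n) < 1 / (1 + np.exp(-(sbp_true - 150) / 5))
sbp_obs = sbp_true - 15 * treated               # medication lowers measured SBP

def wald(g, x, y):
    return np.cov(g, y)[0, 1] / np.cov(g, x)[0, 1]

keep = ~treated
print("simulated true effect     : 0.80 mm Hg per kg/m2")
print("no correction             :", round(wald(grs, bmi, sbp_obs), 2))
print("exclude treated subjects  :", round(wald(grs[keep], bmi[keep], sbp_obs[keep]), 2))
print("add 15 mm Hg when treated :", round(wald(grs, bmi, sbp_obs + 15 * treated), 2))
```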
