  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Bayesian tests for marginal homogeneity in contingency tables

Carvalho, Helton Graziadei de, 06 August 2015
Tests of hypotheses about the marginal proportions of a contingency table play a fundamental role, for instance, in the investigation of behaviour (or opinion) change. However, most texts in the literature are concerned with procedures for independent populations, such as the test of homogeneity of proportions. Some works explore hypothesis tests for dependent proportions, for example the McNemar test for 2 x 2 contingency tables.
The generalization of the McNemar test to k x k contingency tables, called the marginal homogeneity test, usually requires asymptotic approximations under the classical approach. Nevertheless, for small sample sizes or sparse tables, such methods may produce imprecise results. In this work, we review classical and Bayesian measures of evidence commonly applied to compare two marginal proportions. We then develop the Full Bayesian Significance Test (FBST) to investigate marginal homogeneity in two-way and multidimensional contingency tables. The FBST is based on a measure of evidence, called the e-value, which does not depend on asymptotic results, does not violate the likelihood principle, and satisfies several logical properties expected of hypothesis tests. Consequently, the FBST approach to testing marginal homogeneity overcomes several limitations usually faced by other procedures.
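As an illustration of the classical procedure this abstract contrasts the FBST with, here is a minimal sketch of the McNemar test for a 2 x 2 table, in both its chi-square approximation and its exact binomial form; the function names are our own and only the Python standard library is used:

```python
import math

def mcnemar_asymptotic(b, c):
    """Classical McNemar test on the discordant counts b and c:
    statistic (b - c)^2 / (b + c), p-value from the chi-square(1)
    survival function, which equals erfc(sqrt(stat / 2))."""
    stat = (b - c) ** 2 / (b + c)
    return stat, math.erfc(math.sqrt(stat / 2))

def mcnemar_exact(b, c):
    """Exact two-sided version: under marginal homogeneity the smaller
    discordant count follows Binomial(b + c, 1/2)."""
    n, k = b + c, min(b, c)
    tail = sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

For b = 10, c = 2 the asymptotic p-value is about 0.021 while the exact one is about 0.039, illustrating how the approximation can mislead in small samples — exactly the situation the abstract highlights.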

Multi-omics Data Integration for Identifying Disease Specific Biological Pathways

Lu, Yingzhou, 05 June 2018
Pathway analysis is an important task for gaining novel insights into the molecular architecture of many complex diseases. With the advancement of new sequencing technologies, a large amount of quantitative gene expression data has been continuously acquired. Newly emerging omics data sets, such as proteomics, have facilitated the investigation of disease-relevant pathways. Although much previous work has explored single omics data, little has been reported on multi-omics data integration, mainly due to methodological and technological limitations. While a single omics data set can provide useful information about the underlying biological processes, multi-omics integration offers a much more comprehensive view of the cause-effect processes responsible for diseases and their subtypes. This project investigates the combination of miRNA-seq, proteomics, and RNA-seq data on seven types of muscular dystrophy and a control group. These unique multi-omics data sets provide the opportunity to identify disease-specific and highly relevant biological pathways. We first perform the t-test and the OVEPUG test separately to define the differentially expressed genes in the protein and mRNA data sets. miRNA also plays a significant role in muscle development by regulating its target genes in the mRNA data set. To exploit the relationship between miRNA and gene expression, we consult the commonly used target library TargetScan to collect all paired miRNA-mRNA and miRNA-protein co-expression pairs. Next, using statistical analyses such as Pearson's correlation coefficient and the t-test, we measure the biologically expected correlation of each gene with its upstream miRNAs and identify the miRNA-mRNA and miRNA-protein pairs showing negative correlation.
Furthermore, we identify and assess the most relevant disease-specific pathways by feeding the differentially expressed genes and the negatively correlated genes into gene-set libraries, and further characterize these prioritized marker subsets using IPA (Ingenuity Pathway Analysis) and KEGG. We then use Fisher's method to combine the p-values derived from the separate gene sets into a joint significance test of common pathway relevance. In conclusion, we find all negatively correlated miRNA-mRNA and miRNA-protein pairs and identify several pathophysiological pathways related to muscular dystrophy through gene set enrichment analysis. This multi-omics data integration study and the subsequent pathway identification shed new light on the pathophysiological processes of muscular dystrophy, improving our understanding of the molecular pathophysiology of muscle disorders and, in the long term, supporting disease prevention and treatment. / Master of Science
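The p-value combination step mentioned above (Fisher's method) can be sketched in a few lines of standard-library Python; this is an illustration of the general technique, not the project's actual pipeline:

```python
import math

def fisher_combine(pvalues):
    """Fisher's method: X = -2 * sum(log p_i) follows a chi-square
    distribution with 2k degrees of freedom under the joint null.
    For even degrees of freedom the survival function has the closed
    form exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!."""
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    k = len(pvalues)
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))
```

Combining a single p-value returns it unchanged, and two moderately small p-values (e.g. 0.1 and 0.1) combine into a noticeably smaller joint p-value, which is why the method suits joint pathway-relevance testing.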

Deformation monitoring of Leksandsbron using geodetic methods

Olhans, Linnéa, January 2018
Structures move for various reasons, and these movements lead to deformations. To detect deformations, the changes must be monitored on a regular basis, and monitoring can be carried out in different ways. When doing so, a control network of good-quality reference points should be available. The control network provides the basis for the deformation measurement and lets the instrument determine its position relative to the reference points before the structure itself is measured. In 2014 the consultancy Sweco, commissioned by the municipality of Leksand, carried out a deformation measurement of Leksandsbron, an arch bridge from 1925 in Leksand, Dalarna. The intention was to investigate the condition of the bridge, but no conclusions could be drawn from that measurement. The purpose of this study is to propose how Sweco's control network could be developed. The densification was performed through a simulation in the SBG Geo software, in which a number of Sweco's known control points were selected and new points were placed graphically with respect to quality, reliability, geometry, and cost in three scenarios. In theory a control network can be designed in many ways, but in practice, especially for bridge measurements, the surroundings impose limitations. Network adjustment of the proposals then clarified which network was the best reference for surveying the bridge.
The densification showed that the most suitable proposal was also the one best adapted to the bridge's surroundings, terrain, and sight lines. That proposal was then used for the survey of Leksandsbron, with Sweco's deformation measurement as reference. The survey of the control network and the bridge was performed with a Trimble S7 total station. Through network adjustment, the coordinates and their uncertainties were computed from the observations, and the differences were compared against Sweco's points to check whether any deformation had occurred. The results show that two of Sweco's points and some of the bridge points have been subject to deformation. The deviations of the points were also analyzed with a Student's t significance test at the 95% confidence level: a point whose deviation falls within the interval is considered not to have moved, while a point whose deviation falls outside it is considered to have moved. The significance test showed that the points flagged as deformed in the network adjustment were also significant in the t-test, confirming that those points have moved.

Surveys under informative sampling using the FBST

Azerêdo, Daniel Mendes, 28 May 2013
Pfeffermann, Krieger and Rinott (1998) introduced a framework for modeling sampling processes that can be used to assess whether a sampling process is informative. In this setting, sample selection probabilities are approximated by a polynomial function of the outcome and auxiliary variables. Within this framework, our main purpose is to investigate the application of the Full Bayesian Significance Test (FBST), introduced by Pereira and Stern (1999), as a tool for testing sampling ignorability, that is, for detecting a significant relation between the sample selection probabilities and the outcome variable. The performance of this statistical modeling framework is assessed in simulation experiments.
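The polynomial approximation of the selection probabilities can be sketched as a least-squares fit of the inclusion probabilities on the outcome. This is a toy illustration of the idea, not the authors' estimator or the FBST itself:

```python
import numpy as np

def fit_selection_polynomial(y, pi, degree=2):
    """Least-squares fit of the sample selection probabilities pi as a
    polynomial in the outcome y. If the coefficients of all positive
    powers of y are (near) zero, the selection probabilities carry no
    information about y, i.e. the design looks ignorable."""
    return np.polyfit(y, pi, degree)  # coefficients, highest power first
```

With constant selection probabilities the fitted polynomial degenerates to its intercept, which is the "ignorable sampling" situation the significance test is meant to detect.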

How to beat the Baltic market: An investigation of the P/E effect and the small firm effect on the Baltic stock market between the years 2000-2014

Hallberg, Oscar; Arklid, Filip, January 2015
The question many investors ask is whether it is possible to beat the market and earn money by being active on the stock market. In efficient markets this should not be possible, but several studies have produced strategies that prove the opposite. Certain market movements cannot be explained by the arguments of the traditional efficient market hypothesis; in standard finance theory such movements are called anomalies. Two well-known anomalies are the P/E effect and the small firm effect. The P/E effect means that portfolios of low P/E stocks attain higher average risk-adjusted returns than portfolios of high P/E stocks. Similarly, the small firm effect means that companies with small market capitalization earn higher returns than those with large market capitalization. Even though these anomalies were discovered in the US, they occur on other markets as well. However, most studies of them have focused on developed markets. This study therefore focuses on an emerging market, specifically the Baltic market. The problem we aim to answer is whether it is possible to attain abnormal returns on the Baltic stock market by using the P/E effect or the small firm effect. We also investigate which of the two anomalies makes the better investment strategy. In doing so, we have been able to examine whether the Baltic market is efficient. The study covers all listed firms (both active and dead) with available data on Nasdaq OMX Baltic between 2000 and 2014. There are two samples, a P/E sample and a market capitalization sample. The firms in each sample are ranked and grouped into portfolios, which are then tested for significant evidence of the P/E effect and the small firm effect.
The results show that the Baltic market is not completely efficient, since statistical support was found for the small firm effect. This implies that it is possible to attain abnormal returns on the Baltic market by investing in small-capitalization stocks. However, the tests showed no significant evidence of the P/E effect. For this reason, under the assumptions made, we recommend the small firm effect as an investment strategy on the Baltic stock market.
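The ranking-and-grouping step behind both anomaly tests can be sketched as follows; the record fields (`pe`, `ret`) and the equal-sized quintile split are our own illustrative assumptions, not the thesis's exact portfolio construction:

```python
def portfolio_sort(stocks, key, n_groups=5):
    """Rank stocks by `key` (e.g. trailing P/E or market cap), split
    the ranking into n_groups equal-sized portfolios, and return the
    mean return of each portfolio, lowest-ranked group first."""
    ranked = sorted(stocks, key=lambda s: s[key])
    size = len(ranked) // n_groups
    groups = [ranked[i * size:(i + 1) * size] for i in range(n_groups)]
    return [sum(s["ret"] for s in g) / len(g) for g in groups]
```

Comparing the mean (risk-adjusted) return of the bottom portfolio with that of the top portfolio, and testing whether the spread is significant, is the essence of both the P/E-effect and small-firm-effect tests.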

Objective Bayesian and frequentist inference for the probability of success

Pires, Rubiane Maria, 10 February 2009
This study considers two discrete distributions based on Bernoulli trials: the Binomial and the Negative Binomial. We explore credibility and confidence intervals for the probability of success of each distribution. The main goal is to analyze, in both the classical and the Bayesian context, their coverage probability and average length across the parameter space. We also consider the Bayesian point estimators and the maximum likelihood estimator, confirming their consistency through simulation and computing their bias and mean squared error. Objective Bayesian inference is applied through the noninformative Bayes-Laplace prior, the Haldane prior, the reference prior, and the least favorable prior. Analyzing these priors in the context of minimax decision theory, we verified that the least favorable prior recovers every other prior considered as a particular case under quadratic loss, and coincides with the Bayes-Laplace prior under weighted quadratic loss for the Binomial model, a result we have not found in the literature. For the Negative Binomial model we used the noninformative Bayes-Laplace and Jeffreys priors. Our findings show, through the coverage probability and average length of the Bayesian intervals and through the point estimates, that objective Bayesian inference has good frequentist properties for the probability of success of the Binomial and Negative Binomial models. The last part of the study discusses correlated proportions in matched pairs of Bernoulli events (2 x 2 tables), with the goal of obtaining more information about the measures considered for testing the occurrence of correlated proportions.
To this end, the Trinomial model and the partial likelihood function were used from both the frequentist and the Bayesian points of view. The Full Bayesian Significance Test (FBST) was applied to real data sets and was shown to be sensitive to parameterization; this comparison was not possible for the frequentist approach, since distinct methods must be applied to the Trinomial model and to the partial likelihood function.
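The coverage-probability analysis described above can be illustrated with a small Monte Carlo sketch. For simplicity it evaluates the classical Wald interval rather than one of the thesis's Bayesian intervals; the Wald interval's poor coverage near the boundary of the parameter space is a textbook example of why such coverage studies matter:

```python
import math
import random

def wald_coverage(p, n, trials=5000, z=1.96, seed=42):
    """Monte Carlo estimate of the coverage probability of the Wald
    interval p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n) for the
    success probability of a Binomial(n, p) sample."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = sum(rng.random() < p for _ in range(n))
        p_hat = x / n
        half = z * math.sqrt(p_hat * (1 - p_hat) / n)
        hits += p_hat - half <= p <= p_hat + half
    return hits / trials
```

For p = 0.5 and n = 50 the estimated coverage is close to the nominal 95%, while for p = 0.05 and n = 20 it collapses far below it — the kind of behaviour that a coverage study across the parameter space is designed to reveal.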

Spectrum sensing for half and full-duplex interweave cognitive radio systems

Nasser, Abbass, 17 January 2017
Due to the increasing demand for wireless communication services and the limitation of spectrum resources, Cognitive Radio (CR) was initially proposed to address spectrum scarcity. CR divides communication transceivers into two categories: Primary Users (PU) and Secondary Users (SU). A PU has the legal right to use a spectrum band, while an SU is an opportunistic user that may transmit on that band whenever it is vacant, in order to avoid any interference with the PU's signal. Hence the detection of PU activity becomes a main priority for CR systems. Spectrum sensing is the part of the CR system that monitors PU activity.
Spectrum sensing plays an essential role in the mechanism of CR operation: it provides the CR with the available channels so it can access them and, on the other hand, protects occupied channels from interference by SU transmissions. Spectrum sensing has gained a lot of attention over the last decade, and numerous algorithms have been proposed to perform it. Concerning the reliability of its performance, several challenges have been addressed, such as low Signal to Noise Ratio (SNR), Noise Uncertainty (NU), and the spectrum sensing duration. This dissertation addresses these challenges and proposes some solutions. New detectors based on cyclo-stationary feature detection and on the Power Spectral Density (PSD) of the PU signal are presented. A Canonical Correlation Significance Test (CCST) algorithm is proposed to perform cyclo-stationary detection. The CCST can detect the presence of common cyclic features among delayed versions of the received signal, revealing the presence of a cyclo-stationary signal in the received mixture. Another detection method, based on the cumulative PSD, is also proposed. Assuming the noise is white (its PSD is flat), the cumulative PSD approaches a straight line; this shape becomes non-linear when a telecommunication signal is present in the received mixture, so examining the shape of the cumulative PSD can diagnose the channel status.
Full-Duplex Cognitive Radio (FD-CR) is also studied in this manuscript, where several challenges are analyzed and new contributions proposed. FD operation allows the CR to avoid the silence period during spectrum sensing. In a classical CR system, the SU stops transmitting during spectrum sensing so as not to degrade the detection reliability. In FD-CR, the SU can cancel the reflection of its transmitted signal while performing spectrum sensing at the same time. Due to hardware limitations, the residual self-interference cannot be completely cancelled, which strongly affects the credibility of spectrum sensing. To reduce the residual power, a new SU receiver architecture is worked out to mitigate hardware imperfections (such as the phase noise and the non-linear distortion of the receiver's Low-Noise Amplifier). The new architecture proves its robustness by ensuring reliable detection and enhancing the throughput of the SU.
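The cumulative-PSD idea above can be sketched numerically: for white noise the normalized cumulative PSD grows linearly with frequency, while a telecommunication signal bends it. This is a toy sketch of the principle, not the dissertation's detector, and the choice of decision threshold is left open:

```python
import numpy as np

def cumulative_psd_deviation(x):
    """Return the maximum absolute deviation of the normalized
    cumulative PSD of the samples x from the straight line joining
    its endpoints. Small values suggest white noise only; large
    values suggest a signal is present."""
    psd = np.abs(np.fft.rfft(x)) ** 2        # periodogram estimate
    cum = np.cumsum(psd)
    cum = cum / cum[-1]                      # normalize to end at 1
    line = np.linspace(cum[0], 1.0, len(cum))
    return float(np.max(np.abs(cum - line)))
```

A detector would compare this statistic to a threshold calibrated for a target false-alarm rate; for pure white Gaussian noise the deviation stays small, whereas a sinusoidal tone concentrates the spectral mass and produces a deviation close to 1.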
