151 |
Modelling the proximal source of intercepted exotic insects
Guichard, Sylvain January 2009 (has links)
Biological invasions are major threats to any nation's economy and biodiversity. To detect new biological incursions, biosecurity agencies deploy pheromone sentinel traps for targeted species at high-risk sites such as airports, seaports and transitional facilities; a good example is the gypsy moth surveillance program in New Zealand. Following the detection of an incursion by an unwanted organism, ground-based searches to locate the source can be very expensive, but they are essential to identify the introduction pathway and to increase the chances of successfully eradicating the organism. In such circumstances, the possibility of better targeting the search for the source of the incursion using a modelling approach is worthy of investigation. A stochastic mechanistic model to hindcast moth flight from a recapture location back to the release location was developed, based on insect behaviour in response to wind and pheromones. The model comprises two main processes: 1) downwind dispersal, assumed to result from an appetitive behaviour, as indicated by an analysis of a previous mark-release-recapture experiment on painted apple moth (Teia anartoides, Walker), and 2) anemotactic dispersal, inspired by pheromone anemotaxis theory but up-scaled from a fine-scale behavioural model to a 2 m scale. A genetic algorithm was used to fit some model parameters. A specialised fitness function was developed so that the genetic algorithm could identify parameters yielding models that reflected both the spread and the density patterns in the trapping data; this function allowed the stochastic model results to be compared with the inherently stochastic trapping data. The resulting individual-based model simulates the spatio-temporal dispersal pattern of painted apple moth recorded during a previous mark-release-recapture experiment.
While the proposed model is shown to have limitations with respect to accuracy and precision, it is also demonstrated to greatly improve biosecurity incursion response capability by targeting search effort for the proximal source of an incursion more efficiently.
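As a toy illustration of the hindcasting idea, the sketch below forward-simulates wind-driven dispersal and then recovers the release point by subtracting the expected wind drift from the centroid of the trap recaptures. All numbers (wind vector, flight duration, spread) are invented for illustration and are not the thesis's model, which is behavioural and far richer.

```python
import random
import math

def simulate_dispersal(release, wind, hours, step_sd, n_moths, seed=1):
    """Forward-simulate moth positions as mean wind drift plus isotropic
    random spread each hour (a toy stand-in for the stochastic model)."""
    rng = random.Random(seed)
    positions = []
    for _ in range(n_moths):
        x, y = release
        for _ in range(hours):
            x += wind[0] + rng.gauss(0.0, step_sd)
            y += wind[1] + rng.gauss(0.0, step_sd)
        positions.append((x, y))
    return positions

def hindcast_source(trap_positions, wind, hours):
    """Hindcast the release point by subtracting the expected drift
    from the centroid of the recapture locations."""
    cx = sum(p[0] for p in trap_positions) / len(trap_positions)
    cy = sum(p[1] for p in trap_positions) / len(trap_positions)
    return (cx - wind[0] * hours, cy - wind[1] * hours)

release = (0.0, 0.0)
wind = (30.0, 10.0)          # illustrative mean hourly wind drift (m)
traps = simulate_dispersal(release, wind, hours=5, step_sd=40.0, n_moths=500)
estimate = hindcast_source(traps, wind, hours=5)
error = math.hypot(estimate[0] - release[0], estimate[1] - release[1])
print(round(error, 1))  # small compared with the ~160 m total drift
```

With many recaptures the centroid averages out the random spread, which is why even this crude hindcast lands within a few metres of the release point.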
|
152 |
Συμβολή στη στατιστική συμπερασματολογία για τις κατανομές γάμα και αντίστροφη κανονική με χρήση της εμπειρικής ροπογεννήτριας συνάρτησης / Contribution to statistical inference for the Gamma distributions and the Inverse Gaussian distributions using the empirical moment generating function
Καλλιώρας, Αθανάσιος Γ. 01 September 2008 (has links)
The subject of the present dissertation is the investigation of procedures of statistical inference for fitting and testing the gamma and inverse Gaussian distributions on data with positive skewness. These distributions are widely used in reliability analysis and lifetime models, as well as in other applications.
We begin by describing alternative methods of statistical inference for the two- and three-parameter families of gamma and inverse Gaussian distributions. We then examine methods for estimating the parameters of the two-parameter gamma distribution using the empirical moment generating function. Estimation procedures, such as the method of mixed moments and the method of generalized least squares, are applied and compared with maximum likelihood through Monte Carlo simulations. We also investigate goodness of fit tests for the two-parameter gamma distribution, including the classical tests and a test based on the empirical moment generating function. Using Monte Carlo simulations, we compare the actual significance level of the tests and their power against skewed-to-the-right alternatives. We apply goodness of fit tests of gamma distributions to real-life data examined earlier by other researchers. For the three-parameter gamma distribution we apply only the test based on the empirical moment generating function, since no classical tests based on the empirical distribution function are known.
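To illustrate the kind of estimator the abstract describes, here is a minimal sketch, not the thesis's exact mixed-moments or generalized-least-squares estimator, that fits a two-parameter gamma distribution by matching the sample mean and the empirical moment generating function at a single point t:

```python
import random
import math

def empirical_mgf(data, t):
    """Empirical MGF: M_n(t) = (1/n) * sum of exp(t * X_i)."""
    return sum(math.exp(t * x) for x in data) / len(data)

def fit_gamma_emgf(data, t=0.1):
    """Estimate (shape k, scale theta) of a gamma distribution by matching
    the sample mean and the empirical MGF at one point t. For gamma,
    M(t) = (1 - theta*t)^(-k) with k = mean/theta, and the model MGF is
    increasing in theta on (0, 1/t), so bisection applies. (A simplified
    estimator in the spirit of the abstract, chosen for brevity.)"""
    m = sum(data) / len(data)
    target = empirical_mgf(data, t)

    def model_mgf(theta):
        return (1.0 - theta * t) ** (-m / theta)

    lo, hi = 1e-6, 0.999 / t
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if model_mgf(mid) < target:
            lo = mid
        else:
            hi = mid
    theta = 0.5 * (lo + hi)
    return m / theta, theta

rng = random.Random(7)
sample = [rng.gammavariate(2.0, 3.0) for _ in range(20000)]
k_hat, theta_hat = fit_gamma_emgf(sample)
print(round(k_hat, 2), round(theta_hat, 2))  # near the true (2.0, 3.0)
```

The choice of the evaluation point t matters in practice (too large and the empirical MGF becomes unstable), which is one reason the thesis compares several estimation schemes by simulation.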
Finally, we estimate quantiles of the inverse Gaussian distribution. We start by estimating quantiles for the three-parameter distribution and then apply two procedures that estimate quantiles for the two-parameter distribution. The quantile estimates for each family of distributions rely on two procedures for the intermediate estimation of the distribution parameters. The procedures are compared with respect to the normalized mean square error and the relative bias using simulations.
|
153 |
Une famille de distributions symétriques et leptocurtiques représentée par la différence de deux variables aléatoires gamma / A family of symmetric, leptokurtic distributions represented by the difference of two gamma random variables
Augustyniak, Maciej January 2008 (has links)
Thesis digitized by the Division de la gestion de documents et des archives de l'Université de Montréal.
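The family in the title can be sketched in a few lines: the difference of two i.i.d. gamma variables is symmetric by construction, and its excess kurtosis works out to 3/k (positive), so the family is leptokurtic. A quick Monte Carlo check, with illustrative parameters:

```python
import random

def sample_gamma_difference(k, theta, n, seed=3):
    """Draw from the symmetric leptokurtic family: D = X - Y with
    X, Y i.i.d. Gamma(shape=k, scale=theta)."""
    rng = random.Random(seed)
    return [rng.gammavariate(k, theta) - rng.gammavariate(k, theta)
            for _ in range(n)]

def skewness_and_excess_kurtosis(xs):
    """Sample skewness and excess kurtosis from central moments."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

d = sample_gamma_difference(k=1.5, theta=2.0, n=50000)
skew, exkurt = skewness_and_excess_kurtosis(d)
# Skewness is ~0 by symmetry; the theoretical excess kurtosis is
# 3/k = 2 here, so the tails are heavier than Gaussian.
print(round(skew, 2), round(exkurt, 2))
```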
|
154 |
Vibration Signal Features for the Quantification of Prosthetic Loosening in Total Hip Arthroplasties
Stevenson, Nathan January 2003 (has links)
This project attempts to quantify the integrity of the fixation of total hip arthroplasties (THAs) by observing vibration signal features. The aim of this thesis is, therefore, to find the signal differences between firm and loose prostheses. These differences are expressed in different transformed domains, with the expectation that a certain domain will provide superior results. Once the signal differences have been determined, they are examined for their ability to quantify the looseness. Initially, a new definition of progressive femoral component loosening was created, based on the concept of mechanical fit and involving four general conditions. In order of increasing looseness, the conditions (with their equivalent engineering associations) are: firm (adherence), firm (interference), micro-loose (transition) and macro-loose (clearance). These conditions were then used to aid the development and evaluation of a simple mathematical model based on an ordinary differential equation. Several parameters well suited to quantification, such as gap displacement, cement/interface stiffness and apparent mass, were then identified from the model. In addition, the development of this model provided a solution to the problem of unifying early and late loosening raised in the literature by Li et al. in 1995 and 1996; this unification permits early (micro-loose) and late (macro-loose) loosening to be quantified, if necessary, with the same parameter. The quantification problem was posed as a detection problem by utilising a varying-amplitude input, and a set of detection techniques was developed to detect a critical value, in this case a force.
The detection techniques include deviation measures of the instantaneous frequency of the impulse response of the system (accuracy of 100%), linearity of the system's response to Gaussian input (total accuracy of 97.9% over all realisations) and linearity of the observed resonant frequency with respect to displacement magnitude (accuracy of 100%). Note that, as these techniques were developed with the model in mind, their simulated performance was correspondingly high. The critical value found by the detector was then fed into the model and a quantified output was calculated. The quantification techniques using the critical-value approach include ramped-amplitude-input resonant analysis (experimental accuracy of 94%) and ramped-amplitude-input stochastic analysis (experimental accuracy of 90%); these analyse the response of the system to a ramping-amplitude input force in the time-frequency domain and through its short-time statistical moments, respectively. In addition, other mechanically sound forms of analysis were applied to the output of the nonlinear model with the aim of quantifying the looseness, or integrity of fixation, of the THA. The cement/interface stiffness and apparent mass techniques, inspired by the work of Chung et al. in 1979, assess the integrity of fixation of the THA by tracking the mechanical behaviour of its components, using the frequency and magnitude of the raw transducer data. This approach was developed from the theory of Chung et al. but with a differing perspective, and provides accuracies of 82% in experimentation and 71% in simulation for the apparent mass and interface stiffness techniques, respectively.
These techniques do not quantify all forms of clinical loosening, as clinical loosening can exist in many different forms, but they do quantify mechanical loosening, or the mechanical functionality of the femoral component, through related parameters that track the reduction in mechanical mass and stiffness and the amount of rattle generated by a select gap between the bone/cement or prosthesis/cement interfaces. This form of mechanical loosening is currently extremely difficult to detect using radiographs. It is envisaged that a vibration test would be used in conjunction with radiographs to provide a more complete picture of the integrity of fixation of the THA.
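The mechanical intuition behind the stiffness-tracking techniques can be sketched with the undamped single-degree-of-freedom relation f = sqrt(k/m)/(2π): as the effective cement/interface stiffness drops with loosening, the resonant frequency falls. The stiffness and mass values below are purely illustrative, not measurements from the thesis:

```python
import math

def resonant_frequency(stiffness, mass):
    """Undamped natural frequency f = sqrt(k/m) / (2*pi) of a
    single-degree-of-freedom mass-spring model (Hz)."""
    return math.sqrt(stiffness / mass) / (2.0 * math.pi)

# Hypothetical values: loosening modelled as a drop in effective
# interface stiffness (N/m) at roughly constant apparent mass (kg).
mass = 0.8
for label, k in [("firm", 4.0e6), ("micro-loose", 2.5e6), ("macro-loose", 1.0e6)]:
    print(label, round(resonant_frequency(k, mass)), "Hz")
```

Tracking the drift of such a resonance (together with the apparent mass inferred from response magnitude) is the kind of signature the abstract's techniques exploit.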
|
155 |
Distribuição generalizada de chuvas máximas no Estado do Paraná / Local and regional frequency analysis by LH-moments and generalized distributions
Pansera, Wagner Alessandro 07 December 2013 (has links)
The purpose of hydrologic frequency analysis is to relate the magnitude of events to their frequency of occurrence through a probability distribution. Generalized probability distributions can be used in the study of extreme hydrological events: the generalized extreme value, generalized logistic and generalized Pareto distributions. There are several methodologies for estimating probability distribution parameters; however, L-moments are often used because of their computational convenience. The reliability of quantiles with a high return period can be increased by LH-moments, or higher-order L-moments. L-moments have been widely studied, but the literature on LH-moments is limited, so further research in this area is needed. Therefore, in this study, LH-moments were examined under two approaches commonly used in hydrology: (i) local frequency analysis (LFA) and (ii) regional frequency analysis (RFA). A database of 227 rainfall stations (annual daily maxima) in Paraná State, covering 1976 to 2006, was assembled. The LFA was subdivided into two steps: (i) Monte Carlo simulations and (ii) application of the results to the database. The main result of the Monte Carlo simulations was that LH-moments make the 0.99 and 0.995 quantiles less biased. The simulations also supported the creation of an algorithm to perform LFA with the generalized distributions; the algorithm was applied to the database and fitted all 227 series studied. In the RFA, the 227 stations were divided into 11 groups and regional growth curves were obtained, from which local quantiles were derived. The difference between local quantiles obtained via RFA and those obtained via LFA was quantified; the differences can be approximately 33 mm for return periods of 100 years.
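As a sketch of the L-moment machinery underlying the study (LH-moments generalize it by giving extra weight to the larger order statistics), the snippet below computes the first two sample L-moments from the usual probability-weighted moments and checks them on synthetic annual maxima drawn from an exponential distribution, whose L-CV is exactly 0.5. The sample is simulated, not the Paraná data:

```python
import random

def sample_l_moments(xs):
    """First two sample L-moments (lambda1, lambda2) via unbiased
    probability-weighted moments: b0 = mean,
    b1 = (1/n) * sum_i ((i-1)/(n-1)) * x_(i), lambda2 = 2*b1 - b0."""
    x = sorted(xs)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # i = rank - 1
    return b0, 2.0 * b1 - b0

rng = random.Random(11)
data = [rng.expovariate(1.0 / 40.0) for _ in range(20000)]  # mean 40 mm
l1, l2 = sample_l_moments(data)
# For an exponential distribution lambda1 = mu and lambda2 = mu/2,
# so the L-CV lambda2/lambda1 should be close to 0.5.
print(round(l1, 1), round(l2 / l1, 2))
```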
|
156 |
Modélisation stochastique pour l'analyse d'images texturées : approches Bayésiennes pour la caractérisation dans le domaine des transformées / Stochastic modelling for textured image analysis: Bayesian approaches to characterization in the transform domain
Lasmar, Nour-Eddine 07 December 2012 (has links)
In this thesis we study the statistical modelling of textured images using multi-scale and multi-orientation representations. Motivated by results in neuroscience likening the human perception mechanism to a selective spatial-frequency scheme, we propose to characterize textures by probabilistic models of subband coefficients. Our first contribution is a set of probabilistic models that account for the leptokurtic nature and possible asymmetry of the marginal distributions associated with textured content. To model the marginal statistics of the subbands analytically, we introduce the asymmetric generalized Gaussian model. We then propose two families of multivariate models to capture the dependencies between subband coefficients. The first family comprises the spherically invariant processes, which we show are pertinently characterized by a Weibull distribution; the second is the family of copula-based multivariate models. After determining the copula that captures the dependence structure of the texture, we propose a multivariate extension of the asymmetric generalized Gaussian distribution via the Gaussian copula. All proposed models are compared quantitatively in terms of goodness of fit using univariate and multivariate statistical tests. Finally, we validate the performance of our models experimentally through texture-based image retrieval. To this end, we derive closed-form probabilistic metrics measuring the similarity between the introduced models, which we consider a third contribution of this work. A comparative study confronts the proposed probabilistic models with the state of the art.
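The copula idea in the abstract, separating the dependence structure from the marginal models, can be sketched by sampling a Gaussian copula and attaching arbitrary margins. Exponential margins are used below only for brevity; the thesis couples asymmetric generalized Gaussian margins:

```python
import random
import math
from statistics import NormalDist

def gaussian_copula_pairs(rho, n, seed=6):
    """Sample pairs with a Gaussian copula (correlation rho) and standard
    exponential margins: correlate two normals, map through the normal
    CDF to uniforms (the copula sample), then through the inverse
    exponential CDF to impose the margins."""
    rng = random.Random(seed)
    nd = NormalDist()
    out = []
    for _ in range(n):
        g1, g2 = rng.gauss(0, 1), rng.gauss(0, 1)
        z1, z2 = g1, rho * g1 + math.sqrt(1.0 - rho * rho) * g2
        u1, u2 = nd.cdf(z1), nd.cdf(z2)          # point on [0,1]^2
        out.append((-math.log(1.0 - u1), -math.log(1.0 - u2)))
    return out

pairs = gaussian_copula_pairs(0.8, 20000)
xs, ys = zip(*pairs)
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
r = cov / (sx * sy)
print(round(r, 2))  # strong positive dependence survives the margins
```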
|
157 |
On validation of parametric models applied in survival analysis and reliability / Sur la validation des modèles paramétriques appliqués en analyse de survie et fiabilité
Tahir, Muhammad-Ramzan 02 July 2012 (has links)
It is of increasing importance in survival analysis and reliability to select a suitable basic model for further analysis of the data. A small deviation in the basic model can cause serious problems in the final results, and the presence of censoring and accelerated stresses makes this task more difficult. Chi-square-type goodness of fit tests are most commonly used for model selection, and many modifications of them have been proposed by various researchers. The first aim of the thesis is to present a goodness of fit test for a wide range of parametric models (shape-scale families) commonly used in survival analysis, social sciences, engineering, public health and demography, in the presence of right censoring. We give the explicit quadratic forms of the test statistic (the NRR test) for various models and apply the test to real data. We develop a computer program in the R language for all models. A separate section is dedicated to the test in demography. We focus on the Birnbaum-Saunders (BS) distribution for the goodness of fit test for the parametric AFT model and for the analysis of redundant systems. The other purpose of the thesis is the analysis of redundant systems. To ensure high reliability of the main components of a system, standby units are used: the main component is automatically replaced by the standby unit if it fails. The standby unit can be in a warm, hot, or cold state. We give the procedure for one main and (n-1) standby units placed in the hot state, and a detailed analysis of one main and one standby unit using the BS parametric family. We use Sedyakin's physical principle and the accelerated failure time model approach for the analysis of the redundant system. This approach differs from the traditional ones in the literature but introduces difficulties in the calculations. We calculate the reliability of the system in terms of the distribution function (unreliability function) and give an asymptotic confidence interval.
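The hot-standby analysis can be illustrated with a small Monte Carlo: with one main unit and one hot standby, the standby ages like the main unit, so the system survives to time t while at least one of the two independent lifetimes exceeds t. Exponential lifetimes are used below purely to keep the check against the closed form short; the thesis works with the Birnbaum-Saunders family:

```python
import random
import math

def system_reliability_mc(t, rate, n=100000, seed=5):
    """Monte Carlo reliability at time t of one main unit plus one hot
    standby with independent exponential(rate) lifetimes: the system is
    up while max(T1, T2) > t."""
    rng = random.Random(seed)
    alive = sum(
        1 for _ in range(n)
        if max(rng.expovariate(rate), rng.expovariate(rate)) > t
    )
    return alive / n

rate, t = 0.01, 120.0
mc = system_reliability_mc(t, rate)
exact = 1.0 - (1.0 - math.exp(-rate * t)) ** 2   # R_sys(t) = 1 - F(t)^2
print(round(mc, 3), round(exact, 3))
```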
|
158 |
Tests d'ajustement pour des processus stochastiques dans le cas de l'hypothèse nulle paramétrique / On goodness-of-fit tests with parametric hypotheses for some stochastic processes
Ben Abdeddaiem, Maroua 11 May 2016 (has links)
This work is devoted to the construction of goodness-of-fit (GoF) tests for stochastic processes observed in continuous time. As observation models, we take "small noise" and ergodic diffusion processes and an inhomogeneous Poisson process. Under the null hypothesis, we treat the case where each model depends on an unknown one-dimensional parameter, and we consider the minimum distance estimator for this parameter. Our goal is to propose "asymptotically distribution free" (ADF) GoF tests of asymptotic size α ∈ (0,1) for the parametric null hypotheses of the considered models. Indeed, we show that the limit of each studied statistic depends on neither the model nor the unknown parameter, so the tests based on these statistics are ADF. The main purpose of this work is to construct a special linear transformation. In particular, we solve a Fredholm equation of the second kind with a degenerate kernel; its solution gives the desired linear transformation. We then show that applying this transformation to the basic statistics yields statistics with a common limit, the integral of the squared Wiener process. The latter is "distribution free" because it depends on neither the model nor the unknown parameter. We therefore construct ADF GoF tests based on this linear transformation for the diffusion ("small noise" and ergodic) and inhomogeneous Poisson processes.
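The "distribution free" limit mentioned in the abstract, the integral of the squared Wiener process over [0,1], is easy to approximate by simulation, which is how one would tabulate critical values for such ADF tests. A minimal sketch using a random-walk approximation:

```python
import random

def integral_squared_wiener(n_steps, rng):
    """One draw of int_0^1 W(t)^2 dt via a Riemann sum over a Gaussian
    random-walk approximation of the Wiener process."""
    dt = 1.0 / n_steps
    w, total = 0.0, 0.0
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)
        total += w * w * dt
    return total

rng = random.Random(2)
draws = [integral_squared_wiener(200, rng) for _ in range(5000)]
mean = sum(draws) / len(draws)
# The limit law has E[int W^2] = int_0^1 t dt = 1/2 (and variance 1/3);
# its model-independence is what makes a single critical-value table
# serve all the models at once.
print(round(mean, 2))
```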
|
159 |
Spectrum Sensing Techniques For Cognitive Radio Applications
Sanjeev, G 01 1900 (has links) (PDF)
Cognitive Radio (CR) has received tremendous research attention over the past decade, in both academia and industry, as it is envisioned as a promising solution to the problem of spectrum scarcity. A CR is a device that senses the spectrum for occupancy by licensed users (also called primary users) and transmits its data only when the spectrum is sensed to be available. For efficient utilization of the spectrum while also guaranteeing adequate protection to the licensed user from harmful interference, the CR should be able to sense the spectrum for primary occupancy quickly as well as accurately. This makes Spectrum Sensing (SS) one of the key tasks of a CR, where the goal is to test whether the primary user is inactive (the null or noise-only hypothesis) or not (the alternate or signal-present hypothesis). Computational simplicity, robustness to uncertainties in the knowledge of various noise, signal and fading parameters, and the ability to handle interference or other sources of non-Gaussian noise are some of the desirable features of an SS unit in a CR.
In many practical applications, CR devices can exploit known structure in the primary signal. In the IEEE 802.22 CR standard, the primary signal is a wideband signal, but with a strong narrowband pilot component. In other applications, such as military communications and Bluetooth, the primary signal uses a Frequency Hopping (FH) transmission. These applications can significantly benefit from detection schemes tailored to the corresponding primary signals. This thesis develops novel detection schemes and rigorous performance analyses for these primary signals in the presence of fading. For example, in the case of wideband primary signals with a strong narrowband pilot, this thesis answers the further question of whether to use the entire wideband for signal detection, or to filter out the pilot signal and use narrowband detection. The question is interesting because the fading characteristics of wideband and narrowband signals are fundamentally different, and it is therefore not obvious which detection scheme will perform better in practical fading environments.
At the other end of the gamut of SS algorithms, when the CR has no knowledge of the structure or statistics of the primary signal, and when the noise variance is known, Energy Detection (ED) is known to be optimal for SS. However, the performance of the ED is not robust to uncertainties in the noise statistics or to different possible primary signal models. In this case, a natural way to pose the SS problem is as a Goodness-of-Fit Test (GoFT), where the idea is to either accept or reject the noise-only hypothesis. This thesis designs and studies the performance of GoFTs when the noise statistics may be non-Gaussian with heavy tails. The techniques are also extended to the cooperative SS scenario, where multiple CR nodes record observations using multiple antennas and perform decentralized detection.
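A minimal version of the energy detector discussed above, with the threshold set from the noise-only distribution via a Gaussian approximation to the chi-square statistic (a textbook sketch, not the thesis's formulation; the sample size and SNR are illustrative):

```python
import random
import math
from statistics import NormalDist

def energy_detector(samples, noise_var, pfa):
    """Declare the band occupied when the average energy of the samples
    exceeds a threshold chosen for false-alarm probability pfa under
    noise only (Gaussian approximation to the scaled chi-square)."""
    n = len(samples)
    z = NormalDist().inv_cdf(1.0 - pfa)
    threshold = noise_var * (1.0 + z * math.sqrt(2.0 / n))
    return sum(s * s for s in samples) / n > threshold

rng = random.Random(9)
N, trials = 500, 2000
# False-alarm rate under noise only should sit near the design pfa = 0.01.
fa = sum(
    energy_detector([rng.gauss(0.0, 1.0) for _ in range(N)], 1.0, 0.01)
    for _ in range(trials)
) / trials
# Detection rate with a weak primary present (received power 0.2, ~-7 dB SNR).
det = sum(
    energy_detector([rng.gauss(0.0, math.sqrt(1.2)) for _ in range(N)], 1.0, 0.01)
    for _ in range(trials)
) / trials
print(round(fa, 3), round(det, 2))
```

The fragility the abstract points to is visible here: the threshold is built from `noise_var`, so any mismatch between the assumed and true noise variance shifts both error rates, which motivates the GoFT formulations.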
In this thesis, we study all the issues listed above by considering both single and multiple CR nodes, and evaluating their performance in terms of (a) probability of detection error, (b) sensing-throughput trade-off, and (c) probability of rejecting the null hypothesis. We propose various SS strategies, compare their performance against existing techniques, and discuss their relative advantages and performance trade-offs. The main contributions of this thesis are as follows:
The question of whether to use pilot-based narrowband sensing or wideband sensing is answered using a novel, analytically tractable metric proposed in this thesis called the error exponent with a confidence level.
Under a Bayesian framework, obtaining closed form expressions for the optimal detection threshold is difficult. Near-optimal detection thresholds are obtained for most of the commonly encountered fading models.
For an FH primary, using the Fast Fourier Transform (FFT) Averaging Ratio (FAR) algorithm, the sensing-throughput trade-off is derived in closed form.
A GoFT technique based on the statistics of the number of zero-crossings in the observations is proposed, which is robust to uncertainties in the noise statistics, and outperforms existing GoFT-based SS techniques.
A multi-dimensional GoFT based on stochastic distances is studied, which provides better performance compared to some of the existing techniques. A special case, a test based on the Kullback-Leibler distance, is shown to be robust to some uncertainties in the noise process.
All of the theoretical results are validated using Monte Carlo simulations. In the case of FH SS, an implementation of SS using the FAR algorithm on a commercial off-the-shelf platform is presented, and the performance recorded using the hardware corroborates the theoretical and simulation-based results well. The results in this thesis thus provide a bouquet of SS algorithms that could be useful under different CR SS scenarios.
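The zero-crossing idea from the contributions list can be sketched as follows: under i.i.d. noise with a symmetric zero-median distribution, successive samples change sign with probability 1/2, so the crossing count is Binomial(n-1, 1/2) and a standardized count serves as a test statistic. This simplified version (not the exact statistic developed in the thesis) already separates noise from a narrowband primary:

```python
import random
import math

def crossing_zscore(xs):
    """Z-score of the zero-crossing count against its Binomial(n-1, 1/2)
    null distribution; large |z| rejects the noise-only hypothesis."""
    n = len(xs)
    crossings = sum(1 for a, b in zip(xs, xs[1:]) if a * b < 0)
    return (crossings - (n - 1) / 2.0) / math.sqrt((n - 1) / 4.0)

rng = random.Random(4)
noise = [rng.gauss(0.0, 1.0) for _ in range(2000)]
# A slow sinusoidal primary buried in noise flips sign far less often,
# pushing the z-score strongly negative.
signal = [2.0 * math.sin(0.05 * i) + rng.gauss(0.0, 1.0) for i in range(2000)]
z_noise, z_signal = crossing_zscore(noise), crossing_zscore(signal)
print(abs(z_noise) < 3.5, z_signal < -3.5)
```

Note that the statistic needs no estimate of the noise power at all, which is the source of its robustness to noise-variance uncertainty.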
|
160 |
Développements théoriques et empiriques des tests lisses d'ajustement des modèles ARMA vectoriels / Theoretical and empirical developments of smooth goodness-of-fit tests for vector ARMA models
Desrosiers, Gabriel 12 1900 (has links)
When validating time series models, the distribution of the observations represents a potentially important assumption. The approach advocated in this Master's thesis uses smooth goodness-of-fit test statistics. This research provides theoretical and empirical developments of smooth goodness-of-fit tests for vector autoregressive moving average (VARMA) models. In previous work, Ducharme and Lafaye de Micheaux (2004) developed smooth goodness-of-fit tests designed for the residuals of univariate ARMA models. Later, Tagne Tatsinkou (2016) generalized the work to vector ARMA (VARMA) models, which prove potentially useful in real applications. Structured parameterizations, which are considerations specific to the multivariate case, are discussed.
The work of Tagne Tatsinkou (2016) is completed from a theoretical standpoint, and additional simulation studies are also considered. The new smooth tests are based on families of orthogonal polynomials; special attention is given to the Legendre and Hermite families. The major theoretical contribution of this work is a complete proof that the test statistic is invariant under affine linear transformations when the Hermite family is adopted. The results of Tagne Tatsinkou (2016) represent an important first step, but they were incomplete with respect to the use of the model residuals.
The proposed tests are based on a family of densities under alternative hypotheses of order k. A data-driven method for choosing the maximal order, based on the results of Ledwina (1994), is discussed. This automatic selection is also implemented in our simulation studies.
Our simulation studies include bivariate models and a trivariate model. The level study shows the good performance of the smooth tests. In the power study, several competitors were considered; the smooth tests were found to display interesting power properties when the data come from VARMA models with innovations in the class of contaminated normal distributions.
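The univariate building block of a smooth test can be sketched with orthonormal shifted Legendre polynomials: under the null of uniformity the statistic is asymptotically chi-square with k degrees of freedom. This is only the classical Neyman-type ingredient, not the VARMA-residual construction developed in the thesis:

```python
import random
import math

def smooth_test_stat(us, k=2):
    """Neyman-type smooth test statistic of order k for uniformity on
    [0,1], built from orthonormal shifted Legendre polynomials:
    T_k = sum_j ( (1/sqrt(n)) * sum_i phi_j(u_i) )^2, asymptotically
    chi-square(k) under H0."""
    phis = [
        lambda u: math.sqrt(3.0) * (2.0 * u - 1.0),
        lambda u: math.sqrt(5.0) * (6.0 * u * u - 6.0 * u + 1.0),
    ]
    n = len(us)
    return sum(
        (sum(phi(u) for u in us) / math.sqrt(n)) ** 2
        for phi in phis[:k]
    )

rng = random.Random(8)
uniform = [rng.random() for _ in range(3000)]
skewed = [rng.random() ** 2 for _ in range(3000)]   # clearly not uniform
t0, t1 = smooth_test_stat(uniform), smooth_test_stat(skewed)
print(round(t0, 1), round(t1))
```

In the ARMA/VARMA setting the same statistic is applied to probability-integral-transformed residuals, and the data-driven order selection of Ledwina (1994) replaces the fixed k used here.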
|