51

Scaling up of peatland methane emission hotspots from small to large scales

Mohammed, Abdulwasey January 2015 (has links)
Methane is an important greenhouse gas that is relatively long-lived in the atmosphere, and wetlands are a major natural source of atmospheric methane. Methane emissions from wetlands vary across both space and time at scales ranging from metres to continents, and a comprehensive accounting of wetland methane efflux is critical for quantifying the atmospheric methane balance. Major uncertainties arise when measuring and modelling its physical and biological determinants, including water table depth, microtopography, soil temperature, the distribution of aerenchymatous vegetation, and the distribution of mosses. Further complications arise from the nonlinear interaction between flux and its drivers in a highly heterogeneous wetland landscape. A possible route to quantifying wetland methane efflux at multiple spatial scales ('upscaling') is repeated observation by remote sensing, which acquires information about the land surface across time, space, and spectra. These scaling issues must be resolved to advance our understanding of the contribution of peatlands to the global atmospheric methane budget. In this thesis, data from multiple aircraft- and satellite-based remote sensing platforms were used to characterise the fine-scale spatial heterogeneity of a peatland in southwestern Scotland, with the aim of developing techniques for upscaling methane efflux across spatial scales. Seasonal expansion and contraction of lakes and pools was simulated from the LiDAR data, giving an indication of corresponding increases or decreases in methane emissions. Concepts from information theory, applied to the different data sets, revealed the relative loss of some peatland surface features and the relative gain of others; they find a natural application in reducing bias in multi-scale spatial classification and in quantifying the length scales at which surface features important for methane fluxes are lost. Wavelet analysis demonstrated that fine-scale heterogeneity and the pattern of the peatland surface are preserved up to a certain length scale. Variogram techniques were also tested to determine sample size, range, and orientation in the data set. All of the above has implications for estimating the methane budget of the peatland landscape and could reduce bias in the overall flux estimates. The methods used can also be applied to contrasting sites.
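To make the variogram step concrete, the sketch below is an editorial illustration (not code from the thesis); the smoothed-noise `surface` array is a synthetic stand-in for gridded LiDAR elevation. It computes an isotropic empirical semivariogram whose range indicates the length scale at which surface heterogeneity decorrelates.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def empirical_variogram(z, max_lag):
        # Semivariogram along the row axis: gamma(h) = 0.5 * E[(z(x) - z(x + h))^2].
        lags = np.arange(1, max_lag + 1)
        gammas = []
        for h in lags:
            diffs = z[:, h:] - z[:, :-h]        # all pixel pairs separated by lag h
            gammas.append(0.5 * np.mean(diffs ** 2))
        return lags, np.asarray(gammas)

    # Hypothetical stand-in for gridded LiDAR elevation: smoothed noise with a
    # correlation length of roughly 15 pixels.
    rng = np.random.default_rng(0)
    surface = uniform_filter(rng.normal(size=(200, 200)), size=15)
    lags, gamma = empirical_variogram(surface, max_lag=50)
    # The lag at which gamma levels off (the variogram "range") estimates the length
    # scale beyond which surface features decorrelate; repeating the calculation
    # along the other axis, or along diagonals, probes anisotropy (orientation).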
52

Detection of influential explanatory variables

盧惟真 Unknown Date (has links)
In a linear model with several explanatory variables, if adding or deleting certain explanatory variables has a large effect on the estimates, the estimated distributions, or the posterior distributions of the other parameters, a warning should be issued and further analysis carried out. Beyond the Cook's distance and AP statistic proposed by Schall and Dunne (1990) for detecting such influential explanatory variables, this thesis proposes a method based on the symmetric Kullback-Leibler divergence, using the difference between the estimated distributions of the parameters before and after an explanatory variable is added as an index of that variable's influence. From a Bayesian point of view, the degree of difference between the posterior distributions of the parameters before and after a variable is added is likewise used to detect influential explanatory variables. The thesis also explores the relationship between the symmetric Kullback-Leibler divergence and collinearity among the explanatory variables.
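For reference (an editorial addition, not part of the abstract): assuming the before/after estimated distributions of a parameter are univariate normal, p_0 = N(mu_0, sigma_0^2) and p_1 = N(mu_1, sigma_1^2), the symmetric Kullback-Leibler divergence used as the influence index has the closed form

    J(p_0, p_1) = \mathrm{KL}(p_0 \,\|\, p_1) + \mathrm{KL}(p_1 \,\|\, p_0)
                = \frac{\sigma_0^2 + (\mu_0 - \mu_1)^2}{2\sigma_1^2}
                + \frac{\sigma_1^2 + (\mu_0 - \mu_1)^2}{2\sigma_0^2} - 1,

    \text{since}\quad
    \mathrm{KL}\bigl(\mathcal{N}(\mu_0,\sigma_0^2) \,\|\, \mathcal{N}(\mu_1,\sigma_1^2)\bigr)
        = \ln\frac{\sigma_1}{\sigma_0}
        + \frac{\sigma_0^2 + (\mu_0 - \mu_1)^2}{2\sigma_1^2} - \frac{1}{2}.

A large J when a covariate is added flags that covariate as influential; the same quantity can be computed between posterior distributions for the Bayesian variant described above.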
53

Selection of semi-parametric models

Liquet, Benoit 11 December 2002 (has links) (PDF)
This thesis develops model selection methods for applications in biostatistics, particularly in the medical field. In the first part, we propose a method and a program for correcting the significance level of a test when several codings of an explanatory variable are tried. This work is carried out in the framework of logistic regression and applied to data on the relationship between cholesterol and dementia. The second part of the thesis is devoted to the development of a general information criterion for selecting an estimator from a family of semi-parametric estimators. The proposed criterion is based on a bootstrap estimate of the Kullback-Leibler information. We then apply this criterion to modelling the effect of asbestos on the risk of mesothelioma and compare this approach with the Birgé-Massart selection method. Finally, the third part presents a selection criterion in the presence of incomplete data. The proposed criterion is an extension of the one developed in the second part; built on the expectation of the observed log-likelihood, it can in particular be used to select the smoothing parameter in smooth estimation of the hazard function and to choose between stratified models and proportional hazards models. We applied this method, in particular, to modelling the effect of sex and education level on the risk of dementia.
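As a loose illustration of the general idea (a sketch under stated assumptions, not the criterion developed in the thesis): candidate parametric families can be refitted on bootstrap resamples and scored by their average out-of-resample log-likelihood, which estimates the expected log-likelihood, i.e. the Kullback-Leibler information up to an additive constant. All names and the sample below are hypothetical.

    import numpy as np
    from scipy import stats

    def bootstrap_kl_criterion(x, candidate_fits, n_boot=200, seed=0):
        # Score each candidate family by its average out-of-resample log-likelihood;
        # higher means smaller estimated Kullback-Leibler loss.
        rng = np.random.default_rng(seed)
        n = len(x)
        scores = {name: 0.0 for name in candidate_fits}
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                     # bootstrap resample
            oob = np.setdiff1d(np.arange(n), idx)           # points left out
            if oob.size == 0:
                continue
            for name, fit in candidate_fits.items():
                scores[name] += fit(x[idx]).logpdf(x[oob]).mean() / n_boot
        return max(scores, key=scores.get), scores

    # Hypothetical comparison of two families on a skewed sample.
    rng = np.random.default_rng(1)
    sample = rng.gamma(shape=2.0, scale=3.0, size=300)
    candidates = {
        "gamma": lambda s: stats.gamma(*stats.gamma.fit(s, floc=0)),
        "normal": lambda s: stats.norm(*stats.norm.fit(s)),
    }
    best, all_scores = bootstrap_kl_criterion(sample, candidates)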
54

Representation Of Covariance Matrices In Track Fusion Problems

Gunay, Melih 01 November 2007 (has links) (PDF)
The covariance matrix plays a critical role in target tracking algorithms for multi-sensor track fusion systems. This matrix expresses the uncertainty of the state estimates obtained from different sensors, so many subproblems of track fusion use it to obtain more accurate results, and it must therefore be interchanged between the nodes of the multi-sensor tracking system. This thesis mainly deals with the analysis of approximations of the covariance matrix that can best represent it, so that it can be transmitted effectively to the demanding site. The Kullback-Leibler (KL) distance is exploited to derive some of the representations for the Gaussian case. Comparison of these representations is another objective of this work; the comparison is based on the fusion performance of the representations, measured for a two-radar track fusion system.
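As an illustration of how a KL distance can score covariance representations (an editorial sketch with hypothetical numbers, not the thesis' derivation), the snippet below evaluates the KL divergence between two zero-mean Gaussians, here a full track covariance and its diagonal approximation.

    import numpy as np

    def gaussian_kl(p_cov, q_cov):
        # KL( N(0, P) || N(0, Q) ) = 0.5 * ( trace(Q^-1 P) - d + ln(det Q / det P) ).
        d = p_cov.shape[0]
        q_inv = np.linalg.inv(q_cov)
        _, logdet_p = np.linalg.slogdet(p_cov)
        _, logdet_q = np.linalg.slogdet(q_cov)
        return 0.5 * (np.trace(q_inv @ p_cov) - d + logdet_q - logdet_p)

    # Hypothetical track covariance and a diagonal approximation of it (fewer numbers
    # to transmit between fusion nodes); the KL value scores the information lost by
    # dropping the correlations.
    full = np.array([[4.0, 1.5, 0.3],
                     [1.5, 2.0, 0.4],
                     [0.3, 0.4, 1.0]])
    approx = np.diag(np.diag(full))
    loss = gaussian_kl(full, approx)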
55

Contributions to the analysis of neuronal and biomedical signals

Santos, Vítor Lopes dos 03 March 2011 (has links)
Following the new tendency toward interdisciplinarity in modern science, a new field called neuroengineering has come to light in recent decades; after 2000, scientific journals and conferences on this theme have been created around the world. The present work comprises three subareas related to neuroengineering and electrical engineering, as well as biomedical engineering: neural stimulation, theoretical and computational neuroscience, and neuronal signal processing. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds: using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heart beats was developed, which does not rely on a database for training and is not specialized for specific pathologies; it is based on wavelet decomposition and normality measures of random variables. Taken together, the results presented in these three fields of knowledge represent contributions to neural and biomedical engineering.
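To make the Marcenko-Pastur step concrete, the sketch below is an editorial illustration with synthetic data (not the thesis code): eigenvalues of the neuron-by-neuron correlation matrix that exceed the Marcenko-Pastur upper bound are taken as signatures of cell assemblies.

    import numpy as np

    def assembly_patterns(spike_counts):
        # spike_counts: (n_neurons, n_bins) binned spike counts; assumes every
        # neuron fires at least once so that z-scoring is well defined.
        n_neurons, n_bins = spike_counts.shape
        z = spike_counts - spike_counts.mean(axis=1, keepdims=True)
        z /= spike_counts.std(axis=1, keepdims=True)
        corr = (z @ z.T) / n_bins                      # neuron-by-neuron correlation
        eigvals, eigvecs = np.linalg.eigh(corr)
        # Marcenko-Pastur upper bound for the correlation matrix of independent series:
        upper = (1.0 + np.sqrt(n_neurons / n_bins)) ** 2
        keep = eigvals > upper                         # eigenvalues signalling assemblies
        # Projecting the z-scored counts onto each retained eigenvector tracks that
        # assembly's activity bin by bin.
        return eigvals[keep], eigvecs[:, keep]

    # Hypothetical data: 50 neurons, 10000 bins, with 10 neurons sharing co-activation.
    rng = np.random.default_rng(0)
    counts = rng.poisson(1.0, size=(50, 10000)).astype(float)
    counts[:10] += rng.poisson(2.0, size=(1, 10000))   # a planted assembly
    vals, patterns = assembly_patterns(counts)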
56

Quantification of operational risk

Aaltonen, Alex 17 October 2012 (has links)
Measuring operational risk is necessary because it affects the value and survival of companies, and controlling this risk is a central focus of researchers, financial-sector professionals, regulators, and bank supervisors. For this study, we explored four variants of the loss distribution approach for quantifying operational risk. A set of operational losses spanning two years at a major Brazilian bank was used to apply and test the approach in its four variants. The variant based purely on empirical distributions was found to be the most appropriate for measuring operational risk and calculating economic capital from the available data. For the variant that quantifies operational risk by fitting theoretical distributions to the losses, the Johnson curves proved particularly flexible and readily implemented; they were fitted both to the distribution of operational losses and to the sample distribution of economic capital amounts. Knowing the capital distribution gives a notion of the accuracy of the economic capital calculation and prepares the way for future theoretical studies of operational VaR. Rather than calculating a single capital amount, we determined the distribution of economic capital amounts and compared two methods for establishing such distributions for the bank. Our study also demonstrated the possibility of justifying verification points in internal audit procedures on the basis of operational risk data, modelling, and management. Based on these findings, we concluded by setting out recommendations for bank supervision and regulation.
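The sketch below illustrates, with made-up numbers, what an empirical-distribution variant of the loss distribution approach can look like (an editorial example, not the calculations performed in the thesis): annual loss counts and individual loss amounts are resampled from observed data and operational VaR is read off a high quantile of the simulated annual aggregate loss.

    import numpy as np

    def empirical_lda_var(loss_amounts, losses_per_year, quantile=0.999,
                          n_sims=50_000, seed=0):
        # Loss distribution approach with purely empirical inputs: frequency is
        # resampled from the observed yearly counts, severity from the observed
        # amounts; VaR is a quantile of the simulated annual aggregate loss.
        rng = np.random.default_rng(seed)
        annual = np.empty(n_sims)
        for i in range(n_sims):
            n = rng.choice(losses_per_year)                 # simulated loss count
            annual[i] = rng.choice(loss_amounts, size=n).sum()
        return np.quantile(annual, quantile)

    # Hypothetical two-year sample: 120 and 145 losses with roughly log-normal amounts.
    rng = np.random.default_rng(1)
    amounts = rng.lognormal(mean=8.0, sigma=1.2, size=265)
    var_999 = empirical_lda_var(amounts, losses_per_year=[120, 145])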
57

Text-Based Information Retrieval Using Relevance Feedback

Krishnan, Sharenya January 2011 (has links)
Europeana, a freely accessible digital library founded by the European Commission in 2008, was created to make Europe's cultural and scientific heritage available to the public, with the goal of delivering semantically enriched digital content with multilingual access. Even though the amount of content grew, the project increasingly faced the problem of retrieving information held in unstructured form. To complement the Europeana portal services, ASSETS (Advanced Search Service and Enhanced Technological Solutions) was introduced, with services that seek to improve the usability and accessibility of Europeana. My contribution is to study different text-based information retrieval models and their relevance feedback techniques, and to implement one simple model. The thesis gives a detailed overview of the information retrieval process, together with an implementation of the chosen relevance feedback strategy, which generates automatic query expansion. It concludes with an analysis of the results obtained using relevance feedback, a discussion of the implemented model, and an assessment of the model's future use, both as a continuation of my work and within ASSETS.
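A common way to turn relevance feedback into automatic query expansion is Rocchio's method; the sketch below is an editorial illustration with a hypothetical five-term vocabulary, not the model implemented in the thesis.

    import numpy as np

    def rocchio(query_vec, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
        # Move the query toward the centroid of documents judged relevant and away
        # from the centroid of those judged non-relevant (classic Rocchio feedback).
        rel = np.mean(relevant, axis=0) if len(relevant) else 0.0
        nonrel = np.mean(nonrelevant, axis=0) if len(nonrelevant) else 0.0
        updated = alpha * np.asarray(query_vec) + beta * rel - gamma * nonrel
        return np.clip(updated, 0.0, None)   # negative term weights are usually dropped

    # Hypothetical TF-IDF-like vectors; components that were zero in the original
    # query but are large after the update suggest terms for query expansion.
    q = np.array([1.0, 0.0, 0.0, 1.0, 0.0])
    rel_docs = np.array([[0.9, 0.8, 0.0, 0.7, 0.0],
                         [0.8, 0.6, 0.1, 0.9, 0.0]])
    nonrel_docs = np.array([[0.0, 0.1, 0.9, 0.0, 0.8]])
    expanded = rocchio(q, rel_docs, nonrel_docs)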
58

Quality strategy and method for transmission: application to image

Xie, Xinwen 10 January 2019 (has links)
This thesis focuses on image quality strategies in wireless communication systems and on the design of new quality evaluation metrics. First, a new reduced-reference image quality metric based on a statistical model in the complex wavelet domain is proposed. The magnitude and relative phase of the Dual-Tree Complex Wavelet Transform coefficients are modelled with probability density functions, and the fitted parameters serve as the reduced-reference features transmitted to the receiver. A Generalized Regression Neural Network is then used to construct the mapping between the reduced-reference features and the objective quality score. Second, using the new metric, a new decoding strategy is proposed for a realistic wireless transmission system, improving the quality of experience (QoE) while ensuring the quality of service (QoS). For this purpose, a new image database was built and subjective quality evaluation tests were carried out to collect viewers' preferences when selecting images decoded with different configurations, and a classifier based on a support vector machine or K-nearest neighbours is used to select the best decoding configuration automatically. Finally, an improved metric combining global and local image features is proposed to better account for the specific properties of the distortion and for user preference, and it is shown to further improve the decoding strategy. The experimental results validate the effectiveness of the proposed image quality metrics and image transmission strategies.
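For illustration of the mapping step (an editorial sketch with made-up feature vectors and scores, not the trained model of the thesis): a Generalized Regression Neural Network is essentially Nadaraya-Watson kernel regression, predicting a quality score as a Gaussian-weighted average of training scores.

    import numpy as np

    def grnn_predict(train_feats, train_scores, x, sigma=0.5):
        # GRNN prediction: Gaussian-kernel-weighted average of the training scores,
        # with weights driven by distance in reduced-reference feature space.
        d2 = np.sum((train_feats - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return float(np.dot(w, train_scores) / w.sum())

    # Hypothetical 3-dimensional reduced-reference features and MOS-like scores.
    feats = np.array([[0.2, 1.1, 0.5],
                      [0.8, 0.9, 0.4],
                      [1.5, 0.3, 0.9]])
    scores = np.array([4.2, 3.1, 1.8])
    predicted_quality = grnn_predict(feats, scores, np.array([0.5, 1.0, 0.45]))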
59

Approximations and Applications of Nonlinear Filters

Bröcker, Jochen 30 January 2003 (has links)
No description available.
60

Fuzzy segmentation of textures and videos

Santos, Tiago Souza dos 17 August 2012 (has links)
The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content, and this subdivision can also be applied to videos, where the objects appear across the frames that compose them. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, for which colour information alone is not a good descriptor. Fuzzy segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element of an image a grade of membership (between 0 and 1) for each object. This work presents a modification of the fuzzy segmentation algorithm proposed by Carvalho [Carvalho et al. 2005], with the purpose of improving its temporal and spatial complexity. The algorithm was adapted to segment colour videos, treating them as 3D volumes; either a conventional colour model or a hybrid model obtained through a method for choosing the best channels was used. The fuzzy segmentation algorithm was also applied to texture segmentation, using affinity functions adapted to the texture of each object. Two types of affinity functions were used: one based on the normal (Gaussian) probability distribution and the other on the skew divergence, a variation of the Kullback-Leibler divergence that measures the difference between two probability distributions. Finally, the algorithm was tested on several videos and on texture mosaics composed of images from the Brodatz album.
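For reference, the skew divergence mentioned above is commonly defined as KL(p || alpha*q + (1 - alpha)*p); the sketch below is an editorial illustration with hypothetical histograms, not the thesis implementation.

    import numpy as np

    def skew_divergence(p, q, alpha=0.99, eps=1e-12):
        # SD_alpha(p, q) = KL(p || alpha*q + (1 - alpha)*p). Mixing a little of p
        # into q keeps the value finite when q has empty histogram bins, which is
        # convenient for affinities built from local texture histograms.
        p = np.asarray(p, dtype=float); p = p / p.sum()
        q = np.asarray(q, dtype=float); q = q / q.sum()
        m = alpha * q + (1.0 - alpha) * p
        mask = p > 0
        return float(np.sum(p[mask] * np.log((p[mask] + eps) / (m[mask] + eps))))

    # Hypothetical use: compare a pixel-neighbourhood histogram with an object's
    # reference texture histogram; a small divergence means high affinity.
    neighbourhood = np.array([3, 10, 25, 40, 22, 0, 0, 0], dtype=float)
    object_model = np.array([5, 12, 30, 35, 15, 2, 1, 0], dtype=float)
    d = skew_divergence(neighbourhood, object_model)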
