91 |
Otimização topológica de estruturas contínuas considerando incertezas / Topology optimization of continuum structures considering uncertainties. Silva, Gustavo Assis da. 22 February 2016 (has links)
Previous issue date: 2016-02-22 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This work addresses the topology optimization of continuum structures under uncertainties in the material properties associated with stiffness. The perturbation approach is used for uncertainty quantification and the midpoint method is used for the random field discretization, where a decorrelation technique is used to reduce the computational effort. The finite element method is used for the domain discretization and the SIMP approach is used for the material parameterization. Two problems are analyzed: compliance minimization with a volume constraint and volume minimization with local stress constraints. The first problem is solved by using an optimality criteria method and the second by using the augmented Lagrangian method together with a gradient-based minimization method proposed in this work. The qp approach is used to avoid the singularity phenomenon in the problem with local stress constraints. Although this approach can be used to consider uncertainty in any material property associated with stiffness, the examples in this work consider uncertainty only in Young's modulus. Different correlation lengths are considered to verify their influence on the optimum topologies. It is shown that, in both problems analyzed, the optimum topology becomes more distinct from the deterministic topology as the correlation length is reduced. / Este trabalho aborda o uso da otimização topológica de estruturas contínuas sob incertezas nas propriedades do material associadas à rigidez. O método de perturbação é utilizado para a quantificação de incertezas e o método do ponto médio é utilizado para a discretização do campo aleatório, onde uma abordagem de desacoplamento é utilizada para reduzir o custo computacional. O método dos elementos finitos é utilizado para a discretização do domínio e o modelo SIMP é utilizado na parametrização material.
Dois problemas são analisados: o problema de minimização de flexibilidade com restrição de volume e o problema de minimização de volume com restrição local de tensão. O primeiro problema é solucionado utilizando-se um método de critério de ótimo e o segundo problema utilizando-se o método do Lagrangiano aumentado juntamente com um método de minimização baseado em gradiente proposto neste trabalho. Considerando-se o problema com restrição local de tensão, utilizou-se a relaxação qp para evitar o fenômeno de singularidade. Embora esta abordagem possa ser utilizada considerando-se incerteza em qualquer propriedade do material associada à rigidez, os exemplos ilustrados no trabalho apresentam incerteza apenas no módulo de elasticidade. Diferentes tamanhos de correlação são considerados de forma a verificar a sua influência na topologia ótima. Verifica-se que a topologia obtida, em ambos os problemas apresentados, torna-se mais distinta da topologia determinística com a redução do tamanho de correlação.
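The SIMP parameterization and the optimality-criteria update described in this abstract can be sketched in a few lines. The following Python fragment is an illustration, not code from the thesis: the penalization exponent, move limit and the heuristic fixed-point update are generic textbook choices, and the finite element analysis that would supply the sensitivities is omitted.

```python
import numpy as np

def simp_stiffness(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation: element Young's modulus as a function of density rho."""
    return Emin + rho**p * (E0 - Emin)

def oc_update(rho, dc, dv, volfrac, move=0.2):
    """One optimality-criteria update: bisection on the Lagrange multiplier of
    the volume constraint. dc = dC/drho (<= 0), dv = dV/drho (> 0)."""
    l1, l2 = 1e-9, 1e9
    while (l2 - l1) / (l1 + l2) > 1e-4:
        lmid = 0.5 * (l1 + l2)
        # heuristic fixed-point factor, clamped by move limits and [0, 1] bounds
        rho_new = rho * np.sqrt(-dc / (dv * lmid))
        rho_new = np.clip(rho_new, np.maximum(rho - move, 0.0),
                          np.minimum(rho + move, 1.0))
        if rho_new.mean() > volfrac:
            l1 = lmid
        else:
            l2 = lmid
    return rho_new
```

In a full loop, `dc` would come from an FE compliance analysis of the current density field; here it is left as an input.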
|
92 |
Probabilistic modelling of unsaturated slope stability accounting for heterogeneity. Arnold, Patrick. January 2017 (has links)
The performance and safety assessment of geo-structures is strongly affected by uncertainty; that is, both due to a subjective lack of knowledge and due to objectively present and irreducible unknowns. Due to uncertainty in the non-linear variation of the matric-suction-induced effective stress as a function of the transient soil-atmosphere boundary conditions, the unsaturated state of the subsoil is generally not accounted for in a deterministic slope stability assessment. Probability theory, which accounts for uncertainties quantitatively rather than using "cautious estimates" of loads and resistances, may help to partly bridge the gap between unsaturated soil mechanics and engineering practice. This research investigates the effect of uncertainty in soil property values on the stability of unsaturated soil slopes. Two 2D Finite Element (FE) programs have been developed and implemented into a parallelised Reliability-Based Design (RBD) framework, which allows for the assessment of the failure probability, failure consequence and parameter sensitivity, rather than a deterministic factor of safety. Utilising the Random Finite Element Method (RFEM) within a Monte Carlo framework, multivariate cross-correlated random property fields have been mapped onto the FE mesh to assess the effect of isotropic and anisotropic moderate heterogeneity on the transient slope response, and thus performance. The framework has been applied to a generic slope subjected to different rainfall scenarios. The performance was found to be sensitive to the uncertainty in the effective shear strength parameters, as well as in the parameters governing the unsaturated soil behaviour. The failure probability was found to increase most during prolonged rainfall events with a low precipitation rate. Nevertheless, accounting for the unsaturated state resulted in a higher slope reliability than when suction effects were not considered.
In a heterogeneous deposit failure is attracted to local zones of low shear strength, which, for an unsaturated soil, are a function of both the spatial variability of soil property values, as well as of the soil-water dynamics, leading to a significant increase in the failure probability near the end of the main rainfall event.
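The RFEM idea of mapping correlated random property fields into a Monte Carlo estimate of failure probability can be illustrated with a deliberately simplified stand-in. Everything below is hypothetical: a 1D lognormal strength field with exponential autocorrelation replaces the 2D FE slope model, and the limit state is a toy threshold on the weakest zone rather than a finite element analysis.

```python
import numpy as np

def correlated_lognormal_field(n, theta, mean, cov, rng):
    """1D stationary lognormal field with exponential autocorrelation length theta."""
    x = np.arange(n, dtype=float)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / theta)  # correlation matrix
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))
    sig_ln = np.sqrt(np.log(1.0 + cov**2))                # lognormal parameters
    mu_ln = np.log(mean) - 0.5 * sig_ln**2
    return np.exp(mu_ln + sig_ln * (L @ rng.standard_normal(n)))

def failure_probability(n_sim=2000, theta=5.0, mean_su=40.0, cov=0.3,
                        demand=30.0, n_elem=50, seed=0):
    """Monte Carlo estimate of P[min field strength < demand], a stand-in for
    a full random-finite-element slope analysis."""
    rng = np.random.default_rng(seed)
    fails = 0
    for _ in range(n_sim):
        su = correlated_lognormal_field(n_elem, theta, mean_su, cov, rng)
        # toy limit state: the slope 'fails' if the weakest zone drops below demand
        fails += su.min() < demand
    return fails / n_sim
```

Shorter correlation lengths produce more independent weak zones, so the toy failure probability rises as theta falls, mirroring the qualitative finding that heterogeneity attracts failure to local weak zones.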
|
93 |
Evaluation de la variabilité spatiale des paramètres géotechniques du sol à partir de mesures géophysiques : application à la plaine alluviale de Nahr-Beyrouth (Liban) / Evaluation of the spatial variability of geotechnical parameters of soil from geophysical measurements : application to the alluvial plain of Nahr Beirut (Lebanon). Salloum, Nancy. 30 April 2015 (has links)
La variabilité spatiale des formations géologiques rend difficile la détermination des paramètres géotechniques nécessaires pour l'évaluation des aléas naturels (sismiques et gravitaires). Les méthodes d'imagerie géophysique, non-destructives et rapides, sont de plus en plus utilisées pour la reconnaissance de telles structures hétérogènes. Une campagne d'essais géophysiques et géotechniques a été réalisée dans la plaine alluviale de Beyrouth (Liban), ville soumise à un fort risque sismique, en vue de caractériser la variabilité des couches alluviales. Les résultats combinés ont permis de caractériser la structure 3D du site et de détecter la présence d'une couche d'argile molle, peu profonde et d'épaisseur variable. Cette couche de faible compacité, qui a rendu complexe l'interprétation des courbes de dispersion des ondes de surface, a une influence importante sur la réponse sismique du site. Les incertitudes reliées à la variabilité spatiale des propriétés géotechniques (N60) et géophysiques (Vs, ρ) ont pu être quantifiées au sein des couches rencontrées et les fonctions de distribution de ces paramètres ont été déterminées dans chaque couche à partir de ces essais, à la fois verticalement et horizontalement. Les valeurs de distance d'autocorrélation verticale (Vs, N60) et horizontale (ρ) obtenues ainsi que les valeurs de coefficient de variation se situent dans la gamme de valeurs trouvées dans la littérature. La réponse dynamique (amplification spectrale) de la plaine alluviale de Beyrouth a été simulée avec des modèles probabilistes unidimensionnels, et l'effet des trois paramètres statistiques (loi d'autocorrélation, distance d'autocorrélation et coefficient de variation) décrivant les variabilités des propriétés élastiques du sol (Vs) a été quantifié.
Pour obtenir des réponses sismiques réalistes, un critère de sélection des profils Vs générés de façon probabiliste a été introduit afin de ne retenir que les profils compatibles (dans une gamme d'incertitude) avec la courbe de dispersion établie. Les modélisations probabilistes ont montré des différences significatives par rapport aux modélisations déterministes. Le principal paramètre probabiliste contrôlant l'amplification spectrale est le coefficient de variation, suivi de la distance d'autocorrélation, alors que le type de loi a peu d'influence. Enfin, nous avons vu que l'activité humaine peut avoir une influence significative sur l'application des méthodes géophysiques en site urbain. La compréhension d'un milieu complexe dans ce contexte nécessite de combiner toutes les méthodes géotechniques et géophysiques d'investigation afin d'obtenir un modèle robuste 2D/3D de la structure du sol / The spatial variability of geological formations makes it difficult to determine the geotechnical parameters necessary for the evaluation of natural hazards (seismic and gravitational). Geophysical imaging methods, which are non-destructive and fast, are increasingly used for the recognition of such heterogeneous sub-surface structures. Geophysical and geotechnical tests were carried out in the alluvial plain of Beirut (Lebanon), a city with high seismic risk, to characterize the variability of the alluvial layers. Analyses of these tests were used to characterize the 3D structure of the site and to detect the presence of a shallow soft clay layer of variable thickness. This layer of low compactness, which made the interpretation of the dispersion curves of surface waves complex, could be of prime importance for the seismic response of the site. Using all the collected data, the uncertainties related to the spatial variability of the geotechnical (N60) and geophysical (Vs, ρ) properties of the soil were quantified in the layers encountered, and the distribution functions of these parameters were determined in each layer, in both the vertical and horizontal directions. The autocorrelation distances in the vertical (Vs, N60) and horizontal (ρ) directions and the coefficients of variation are within the range of values found in the literature. The dynamic response (spectral amplification) of the alluvial plain of Beirut was modeled with one-dimensional probabilistic models, and we quantified the effect of the three statistical parameters (autocorrelation function, autocorrelation distance and coefficient of variation) describing the variability of the elastic properties of the soil (Vs). To obtain realistic seismic responses, we proposed a selection criterion for the probabilistically generated Vs profiles in order to retain only the profiles compatible (within a range of uncertainty) with the obtained dispersion curve. Probabilistic modeling showed significant differences from deterministic modeling. The main factor controlling the probabilistic spectral amplification is the coefficient of variation (COVVs), followed by the autocorrelation distance, while the type of autocorrelation function has little influence. Finally, human activity was also found to have a significant influence on the application of geophysical prospecting at this urban site. This case illustrates the need to combine investigation methods in order to understand the geophysical measurements in a complex medium and to reach a robust 2D/3D model.
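A small sketch of how an autocorrelation distance can be estimated from a measured profile (for example a Vs or N60 log). This is a generic illustration, not the procedure used in the thesis: the exponential autocorrelation model and the grid-search fit are assumptions.

```python
import numpy as np

def sample_autocorrelation(v, max_lag):
    """Sample autocorrelation of a demeaned 1D profile up to max_lag."""
    v = v - v.mean()
    var = np.dot(v, v) / len(v)
    return np.array([np.dot(v[:len(v) - k], v[k:]) / (len(v) * var)
                     for k in range(max_lag + 1)])

def fit_autocorrelation_distance(rho, dz=1.0):
    """Least-squares fit of an exponential model exp(-h/a) to the sample
    autocorrelation; returns the autocorrelation distance a."""
    lags = np.arange(len(rho)) * dz
    a_grid = np.linspace(0.1, 50.0, 500)
    errs = [np.sum((np.exp(-lags / a) - rho) ** 2) for a in a_grid]
    return a_grid[int(np.argmin(errs))]
```

The coefficient of variation of the same profile is simply `v.std() / v.mean()`, which together with the fitted distance gives the two statistics the abstract compares against literature ranges.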
|
94 |
Développement d'un modèle statistique non stationnaire et régional pour les précipitations extrêmes simulées par un modèle numérique de climat / A non-stationary and regional statistical model for the precipitation extremes simulated by a climate model. Jalbert, Jonathan. 30 October 2015 (has links)
Les inondations constituent le risque naturel prédominant dans le monde et les dégâts qu'elles causent sont les plus importants parmi les catastrophes naturelles. Les précipitations extrêmes sont un des principaux facteurs expliquant les inondations. En raison des changements climatiques, l'occurrence et l'intensité de ces dernières risquent fort probablement de s'accroître. Par conséquent, le risque d'inondation pourrait vraisemblablement s'intensifier. Les impacts de l'évolution des précipitations extrêmes sont désormais un enjeu important pour la sécurité du public et pour la pérennité des infrastructures. Les stratégies de gestion du risque d'inondation dans le climat futur sont essentiellement basées sur les simulations provenant des modèles numériques de climat. Un modèle numérique de climat procure notamment une série chronologique des précipitations pour chacun des points de grille composant son domaine spatial de simulation. Les séries chronologiques simulées peuvent être journalières ou infra-journalières et elles s'étendent sur toute la période de simulation, typiquement entre 1961 et 2100. La continuité spatiale des processus physiques simulés induit une cohérence spatiale parmi les séries chronologiques. Autrement dit, les séries chronologiques provenant de points de grille avoisinants partagent souvent des caractéristiques semblables. De façon générale, la théorie des valeurs extrêmes est appliquée à ces séries chronologiques simulées pour estimer les quantiles correspondants à un certain niveau de risque. La plupart du temps, la variance d'estimation est considérable en raison du nombre limité de précipitations extrêmes disponibles et celle-ci peut jouer un rôle déterminant dans l'élaboration des stratégies de gestion du risque. Par conséquent, un modèle statistique permettant d'estimer de façon précise les quantiles de précipitations extrêmes simulées par un modèle numérique de climat a été développé dans cette thèse.
Le modèle développé est spécialement adapté aux données générées par un modèle de climat. En particulier, il exploite l'information contenue dans les séries journalières continues pour améliorer l'estimation des quantiles non stationnaires et ce, sans effectuer d'hypothèse contraignante sur la nature de la non-stationnarité. Le modèle exploite également l'information contenue dans la cohérence spatiale des précipitations extrêmes. Celle-ci est modélisée par un modèle hiérarchique bayésien où les lois a priori des paramètres sont des processus spatiaux, en l'occurrence des champs de Markov gaussiens. L'application du modèle développé à une simulation générée par le Modèle régional canadien du climat a permis de réduire considérablement la variance d'estimation des quantiles en Amérique du Nord. / Precipitation extremes play a major role in flooding events, and both their occurrence and their intensity are expected to increase. It is therefore important to anticipate the impacts of such an increase to ensure public safety and infrastructure sustainability. Since climate models are the only tools providing quantitative projections of precipitation, flood risk management for the future climate may be based on their simulations. Most of the time, extreme value theory is used to estimate extreme precipitation from a climate simulation, for example through T-year return levels. The variance of these estimates is generally large, notably because the maxima series are short. Such variance could have a significant impact on flood risk management. It is therefore relevant to reduce the estimation variance of simulated return levels. For this purpose, the aim of this thesis is to develop a non-stationary and regional statistical model, especially suited to climate-model output, for estimating precipitation extremes. At first, the non-stationarity is removed by a preprocessing approach.
Thereafter, the spatial correlation is modeled by a Bayesian hierarchical model including an intrinsic Gaussian Markov random field. The model has been used to estimate the 100-year return levels over North America from a simulation by the Canadian Regional Climate Model. The results show a large estimation variance reduction when using the regional model.
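As a rough illustration of return-level estimation from a series of block maxima, here is the Gumbel case (the shape-zero member of the GEV family), fitted by the method of moments rather than by the hierarchical Bayesian model the thesis develops. All numbers are invented.

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(maxima):
    """Method-of-moments fit of a Gumbel (GEV with shape 0) distribution
    to a series of block maxima."""
    scale = np.std(maxima, ddof=1) * np.sqrt(6.0) / np.pi
    loc = np.mean(maxima) - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """T-year return level: the quantile of non-exceedance probability 1 - 1/T."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))
```

The large sampling variance of such return levels, which the regional model is designed to reduce, can be seen by repeating the fit on bootstrap resamples of the maxima.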
|
95 |
Padrões estruturados e campo aleatório em redes complexas. Doria, Felipe França. January 2016 (has links)
Este trabalho foca no estudo de duas redes complexas. A primeira é um modelo de Ising com campo aleatório. Este modelo segue uma distribuição de campo gaussiana e bimodal. Uma técnica de conectividade finita foi utilizada para resolvê-lo, assim como um método de Monte Carlo foi aplicado para verificar os resultados. Há uma indicação em nossos resultados de que para a distribuição gaussiana a transição de fase é sempre de segunda ordem. Para as distribuições bimodais há um ponto tricrítico, dependente do valor da conectividade . Abaixo de um certo mínimo de , só existe transição de segunda ordem. A segunda é uma rede neural atratora métrica. Mais precisamente, estudamos a capacidade deste modelo para armazenar os padrões estruturados. Em particular, os padrões escolhidos foram retirados de impressões digitais, que apresentam algumas características locais. Os resultados mostram que quanto menor a atividade de padrões de impressões digitais, maior a relação de carga e a qualidade de recuperação. Uma teoria também foi desenvolvida como uma função de cinco parâmetros: a relação de carga, a conectividade, o grau de densidade da rede, a relação de aleatoriedade e a correlação do padrão espacial. / This work focuses on the study of two complex networks. The first is a random field Ising model. This model follows Gaussian and bimodal distributions for the random field. A finite-connectivity technique was used to solve it, and a Monte Carlo method was applied to verify our results. There is an indication in our results that for the Gaussian distribution the phase transition is always second-order. For the bimodal distribution there is a tricritical point that depends on the value of the connectivity . Below a certain minimum , there is only a second-order transition. The second is a metric attractor neural network. More precisely, we study the ability of this model to learn structured patterns.
In particular, the chosen patterns were taken from fingerprints, which present some local features. Our results show that the lower the fingerprint pattern activity, the higher the load ratio and the retrieval quality. A theoretical framework was also developed as a function of five parameters: the load ratio, the connectivity, the density degree of the network, the randomness ratio and the spatial pattern correlation.
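A minimal Metropolis simulation of a random-field Ising model on a sparse random graph illustrates the bimodal-field setting. The graph size, field strength and temperatures below are invented, and the method is plain Monte Carlo, not the finite-connectivity (cavity) technique used in the work.

```python
import numpy as np

def random_graph(n, mean_degree, rng):
    """Erdos-Renyi adjacency lists with the given mean degree."""
    p = mean_degree / n
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def metropolis_rfim(adj, h, T, sweeps, rng):
    """Metropolis sampling of a random-field Ising model on a graph; returns
    the magnetization. adj: neighbor lists, h: local random fields,
    T: temperature. Energy: H = -sum_<ij> s_i s_j - sum_i h_i s_i."""
    n = len(adj)
    s = np.ones(n)                       # ordered start
    for _ in range(sweeps):
        for i in rng.integers(0, n, n):  # one sweep = n attempted flips
            dE = 2.0 * s[i] * (sum(s[j] for j in adj[i]) + h[i])
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i] = -s[i]
    return s.mean()
```

At low temperature the weak bimodal field leaves the ferromagnetic order largely intact, while at high temperature the magnetization decays toward zero, the qualitative behaviour behind the phase diagram discussed in the abstract.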
|
96 |
Statistical modeling and processing of high frequency ultrasound images : application to dermatologic oncology / Modélisation et traitement statistiques d’images d’ultrasons de haute fréquence. Application à l’oncologie dermatologique. Pereyra, Marcelo. 04 July 2012 (has links)
Cette thèse étudie le traitement statistique des images d’ultrasons de haute fréquence, avec application à l’exploration in-vivo de la peau humaine et l’évaluation non invasive de lésions. Des méthodes Bayésiennes sont considérées pour la segmentation d’images échographiques de la peau. On y établit que les ultrasons rétrodiffusés par la peau convergent vers un processus aléatoire complexe de type Levy-Flight, avec des statistiques non Gaussiennes alpha-stables. L’enveloppe du signal suit une distribution Rayleigh généralisée à queue lourde. A partir de ces résultats, il est proposé de modéliser l’image ultrason de multiples tissus comme un mélange spatialement cohérent de lois Rayleigh à queues lourdes. La cohérence spatiale inhérente aux tissus biologiques est modélisée par un champ aléatoire de Potts-Markov pour représenter la dépendance locale entre les composantes du mélange. Un algorithme Bayésien original combiné à une méthode Monte Carlo par chaine de Markov (MCMC) est proposé pour conjointement estimer les paramètres du modèle et classifier chaque voxel dans un tissu. L’approche proposée est appliquée avec succès à la segmentation de tumeurs de la peau in-vivo dans des images d’ultrasons de haute fréquence en 2D et 3D. Cette méthode est ensuite étendue en incluant l’estimation du paramètre B de régularisation du champ de Potts dans la chaine MCMC. Les méthodes MCMC classiques ne sont pas directement applicables à ce problème car la vraisemblance du champ de Potts ne peut pas être évaluée. Ce problème difficile est traité en adoptant un algorithme Metropolis-Hastings “sans vraisemblance” fondé sur la statistique suffisante du Potts. La méthode de segmentation non supervisée, ainsi développée, est appliquée avec succès à des images échographiques 3D. Finalement, le problème du calcul de la borne de Cramer-Rao (CRB) du paramètre B est étudié. Cette borne dépend des dérivées de la constante de normalisation du modèle de Potts, dont le calcul est infaisable. 
Ce problème est résolu en proposant un algorithme Monte Carlo original, qui est appliqué avec succès au calcul de la borne CRB des modèles d’Ising et de Potts. / This thesis studies statistical image processing of high frequency ultrasound imaging, with application to in-vivo exploration of human skin and noninvasive lesion assessment. More precisely, Bayesian methods are considered in order to perform tissue segmentation in ultrasound images of skin. It is established that ultrasound signals backscattered from skin tissues converge to a complex Levy Flight random process with non-Gaussian α-stable statistics. The envelope signal follows a generalized (heavy-tailed) Rayleigh distribution. Based on these results, it is proposed to model the distribution of multiple-tissue ultrasound images as a spatially coherent finite mixture of heavy-tailed Rayleigh distributions. Spatial coherence inherent to biological tissues is modeled by a Potts Markov random field. An original Bayesian algorithm combined with a Markov chain Monte Carlo method is then proposed to jointly estimate the mixture parameters and a label vector associating each voxel with a tissue. The proposed method is successfully applied to the segmentation of in-vivo skin tumors in high frequency 2D and 3D ultrasound images. This method is subsequently extended by including the estimation of the Potts regularization parameter B within the Markov chain Monte Carlo (MCMC) algorithm. Standard MCMC methods cannot be applied to this problem because the likelihood of B is intractable. This difficulty is addressed by using a likelihood-free Metropolis-Hastings algorithm based on the sufficient statistic of the Potts model. The resulting unsupervised segmentation method is successfully applied to three-dimensional ultrasound images. Finally, the problem of computing the Cramer-Rao bound (CRB) of B is studied. The CRB depends on the derivatives of the intractable normalizing constant of the Potts model.
This is resolved by proposing an original Monte Carlo algorithm, which is successfully applied to compute the CRB of the Ising and Potts models.
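The coupling of a per-tissue likelihood with a Potts-Markov smoothing prior can be illustrated with a much simpler inference scheme than the MCMC sampler of the thesis: iterated conditional modes (ICM) with ordinary (not heavy-tailed) Rayleigh likelihoods. All parameter values below are invented for illustration.

```python
import numpy as np

def rayleigh_loglik(x, sigma):
    """Log-density of a Rayleigh distribution with scale sigma (x > 0)."""
    return np.log(x / sigma**2) - x**2 / (2 * sigma**2)

def icm_potts_segmentation(img, sigmas, beta=1.0, n_iter=10):
    """Iterated conditional modes for a Rayleigh mixture with a Potts prior:
    each pixel takes the class maximising its data log-likelihood plus
    beta times the number of agreeing 4-neighbours."""
    H, W = img.shape
    ll = np.stack([rayleigh_loglik(img, s) for s in sigmas])  # (K, H, W)
    labels = ll.argmax(axis=0)           # initial maximum-likelihood labels
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best, best_score = labels[i, j], -np.inf
                for k in range(len(sigmas)):
                    agree = sum(labels[i + di, j + dj] == k
                                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                                if 0 <= i + di < H and 0 <= j + dj < W)
                    score = ll[k, i, j] + beta * agree
                    if score > best_score:
                        best, best_score = k, score
                labels[i, j] = best
    return labels
```

ICM finds a local mode and estimates nothing about B or the mixture parameters; the thesis's MCMC sampler does the full joint estimation, but the role of the Potts term, rewarding labels that agree with their neighbours, is the same.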
|
97 |
Sur l'existence de champs browniens fractionnaires indexés par des variétés / On the existence of fractional brownian fields indexed by manifolds. Venet, Nil. 19 July 2016 (has links)
Cette thèse porte sur l'existence de champs browniens fractionnaires indexés par des variétés riemanniennes. Ces objets héritent des propriétés qui font le succès du mouvement brownien fractionnaire classique (H-autosimilarité des trajectoires ajustable, accroissements stationnaires), mais autorisent à considérer des applications où les données sont portées par un espace qui peut par exemple être courbé ou troué. L'existence de ces champs n'est assurée que lorsque la quantité 2H est inférieure à l'indice fractionnaire de la variété, qui n'est connu que dans un petit nombre d'exemples. Dans un premier temps nous donnons une condition nécessaire pour l'existence de champ brownien fractionnaire. Dans le cas du champ brownien (correspondant à H=1/2) indexé par des variétés qui ont des géodésiques fermées minimales, cette condition s'avère très contraignante : nous donnons des résultats de non-existence dans ce cadre, et montrons notamment qu'il n'existe pas de champ brownien indexé par une variété compacte non simplement connexe. La condition nécessaire donne également une preuve courte d'un fait attendu qui est la non-dégénérescence du champ brownien indexé par les espaces hyperboliques réels. Dans un second temps nous montrons que l'indice fractionnaire du cylindre est nul, ce qui constitue un exemple totalement dégénéré. Nous en déduisons que l'indice fractionnaire d'un espace métrique n'est pas continu par rapport à la convergence de Gromov-Hausdorff. Nous généralisons ce résultat sur le cylindre à un produit cartésien qui possède une géodésique fermée minimale, et donnons une majoration de l'indice fractionnaire de surfaces asymptotiquement proches du cylindre au voisinage d'une géodésique fermée minimale. / The aim of the thesis is the study of the existence of fractional Brownian fields indexed by Riemannian manifolds. 
Those fields inherit key properties of the classical fractional Brownian motion (sample paths with self-similarity of adjustable parameter H, stationary increments), while allowing applications with data indexed by a space which can be, for example, curved or with a hole. The existence of those fields is only ensured when the quantity 2H is less than or equal to the fractional index of the manifold, which is known only in a few cases. In a first part we give a necessary condition for the fractional Brownian field to exist. In the case of the Brownian field (corresponding to H=1/2) indexed by a manifold with minimal closed geodesics, this condition happens to be very restrictive. We give several nonexistence results in this situation. In particular, we show that there exists no Brownian field indexed by a non-simply connected compact manifold. Our necessary condition also gives a short proof of an expected result: we prove the nondegeneracy of fractional Brownian fields indexed by the real hyperbolic spaces. In a second part we show that the fractional index of the cylinder is zero, which gives a totally degenerate case. We deduce from this result that the fractional index of a metric space is not continuous with respect to Gromov-Hausdorff convergence. We generalise this result about the cylinder to a Cartesian product with a closed minimal geodesic. Furthermore, we give a bound on the fractional index of surfaces asymptotically close to the cylinder in the neighbourhood of a closed minimal geodesic.
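The existence question can be probed numerically: a fractional Brownian field indexed by a metric space exists exactly when the associated covariance kernel is positive semidefinite on every finite set of points. The sketch below, an illustration rather than anything from the thesis, builds that kernel from a distance matrix and tests positive semidefiniteness; the circle helper lets one experiment with a closed geodesic.

```python
import numpy as np

def fbm_covariance(dist, origin_idx, H):
    """Fractional Brownian covariance from a distance matrix:
    R(x, y) = (d(o,x)^2H + d(o,y)^2H - d(x,y)^2H) / 2."""
    d_o = dist[origin_idx]
    return 0.5 * (d_o[:, None]**(2 * H) + d_o[None, :]**(2 * H) - dist**(2 * H))

def is_psd(K, tol=1e-8):
    """True if the smallest eigenvalue is non-negative up to a tolerance."""
    return np.linalg.eigvalsh(K).min() >= -tol

def circle_distance_matrix(n):
    """Geodesic (arc-length) distances between n equispaced points on the
    unit circle, a manifold with a closed geodesic."""
    t = 2 * np.pi * np.arange(n) / n
    d = np.abs(t[:, None] - t[None, :])
    return np.minimum(d, 2 * np.pi - d)
```

On the real line the kernel is positive semidefinite for every H in (0, 1], so `is_psd` returns True there; probing `circle_distance_matrix` at various H gives a numerical feel for how a closed geodesic obstructs existence.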
|
99 |
Metodologia para extração de conteúdo de monitores e TVs. Farias, Felipe de Souza. 09 June 2016 (has links)
Previous issue date: 2016-06-09 / CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico / In this work we present a TV content extraction methodology using a model
based on a Markov random field (MRF). We present two major contributions. For
the first contribution, we modified a method for quadrangular object detection in
color images, by means of adapting edge and rectangle detection techniques to detect
and select a sole rectangular object with features of a TV or monitor screen.
Besides this contribution, we concieved two databases consisted of, respectively, 504
and 600 TV and monitor photos, acquired under different sizes, different illumination
conditions and different distance between camera and device, as well as reference
images with the content presented in the devices in the acquisition moment. The
methodology’s performance was evaluated in the context of detection and evaluation
of monitor content, using the databases concieved in this work. For comparison, we
used existing methods for detecting rectangular objects in the same context of the
proposed methodology. The experiments demonstrate that the methodology’s performance
is greatly influenced by the complexity of the content and by the image background.
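The "select a sole rectangular object" step described above can be sketched as a scoring pass over candidate quadrilaterals already produced by an edge/rectangle detector. This is a hypothetical illustration, not the thesis's code: the 16:9 and 4:3 aspect-ratio priors, the penalty weight, and the function names are all assumptions.

```python
# Hypothetical selection stage: given candidate rectangles from an
# edge/rectangle detector, keep the single one that looks most like
# a TV or monitor screen. Weights and ratio priors are illustrative.

DISPLAY_RATIOS = (16 / 9, 4 / 3)  # assumed common screen aspect ratios

def screen_score(rect):
    """rect = (x, y, w, h), the axis-aligned bounding box of a candidate."""
    _, _, w, h = rect
    if h == 0:
        return 0.0
    ratio = w / h
    # Relative distance to the closest common display aspect ratio.
    ratio_err = min(abs(ratio - r) / r for r in DISPLAY_RATIOS)
    area = w * h
    # Larger, more screen-shaped candidates score higher.
    return area / (1.0 + 5.0 * ratio_err)

def select_screen(candidates):
    """Return the single most screen-like rectangle, mirroring the
    method's choice of exactly one quadrilateral per image."""
    return max(candidates, key=screen_score)

candidates = [(0, 0, 30, 40),     # portrait region: unlikely a screen
              (10, 10, 160, 90),  # 16:9 region
              (5, 5, 100, 100)]   # square region
print(select_screen(candidates))  # picks the 16:9 candidate
```

A real pipeline would feed this stage from an actual detector; the sketch only shows how a "screen-likeness" criterion can disambiguate among several rectangular candidates.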
|
100 |
Modélisation probabiliste et inférence par l'algorithme Belief Propagation / Probabilistic Modelling and Inference using the Belief Propagation Algorithm
Martin, Victorin 23 May 2013 (has links)
In this work, we focus on the design and estimation - from partial observations - of graphical models of real-valued random variables. These models should be suited to a non-standard regression problem where the identity of the observed variables (and therefore of the variables to predict) changes from one instance to the next. The nature of the problem and of the available data leads us to model the network as a Markov random field, a choice consistent with Jaynes' maximum entropy principle. For the prediction task, we turn to the Belief Propagation algorithm - in its classical or Gaussian flavor - whose simplicity and efficiency make it usable on large-scale networks. After providing a new result on the local stability of the algorithm's fixed points, we propose an approach based on a latent Ising model, where dependencies between real-valued variables are encoded through a network of binary variables. To this end, we propose a definition of these binary variables using the cumulative distribution functions of the associated real-valued variables. For the prediction task, it is necessary to modify the Belief Propagation algorithm in order to impose Bayesian-like constraints on the marginal distributions of the binary variables. Estimation of the model parameters can easily be performed using only pairwise observations. In fact, this approach is a way to solve the regression problem by working on quantiles. Furthermore, we propose a greedy algorithm for estimating both the structure and the parameters of a Gauss-Markov random field, based on the Iterative Proportional Scaling procedure.
At each iteration, the algorithm yields a new model whose likelihood, or an approximation of it in the case of partial observations, is higher than that of the previous model. Because of its local perturbation principle, this algorithm allows us to impose spectral constraints, increasing the compatibility of the resulting models with the Gaussian Belief Propagation algorithm. The performance of all approaches is empirically illustrated on synthetic data.
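The Gaussian flavor of Belief Propagation mentioned above can be sketched in a few lines for the scalar case. This is a minimal illustration under assumed conventions (it is not the thesis's implementation): the model is p(x) ∝ exp(-x'Ax/2 + b'x) with precision matrix A, and on a tree-structured A, such as the chain below, the marginal means it returns are exact.

```python
# Minimal scalar Gaussian Belief Propagation for p(x) ∝ exp(-x'Ax/2 + b'x).
# Messages on edge (i, j) carry a precision P and a potential h; on a
# tree-structured precision matrix A the resulting means are exact.

def gabp_means(A, b, iters=50):
    n = len(b)
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and A[i][j] != 0.0]
    P = {e: 0.0 for e in edges}  # message precisions
    h = {e: 0.0 for e in edges}  # message potentials
    for _ in range(iters):
        for (i, j) in edges:
            # Aggregate messages arriving at i, excluding the one from j.
            Pi = A[i][i] + sum(P[(k, i)] for k in range(n)
                               if (k, i) in P and k != j)
            hi = b[i] + sum(h[(k, i)] for k in range(n)
                            if (k, i) in P and k != j)
            P[(i, j)] = -A[i][j] ** 2 / Pi
            h[(i, j)] = -A[i][j] * hi / Pi
    # Marginal mean of x_i: (b_i + sum of potentials) / (A_ii + sum of precisions).
    means = []
    for i in range(n):
        Pi = A[i][i] + sum(P[(k, i)] for k in range(n) if (k, i) in P)
        hi = b[i] + sum(h[(k, i)] for k in range(n) if (k, i) in P)
        means.append(hi / Pi)
    return means

# Tridiagonal precision matrix: the graph is a chain, hence a tree.
A = [[2.0, -1.0, 0.0],
     [-1.0, 2.0, -1.0],
     [0.0, -1.0, 2.0]]
b = [1.0, 0.0, 1.0]
print(gabp_means(A, b))  # close to the exact solution A^{-1} b = [1, 1, 1]
```

On loopy graphs the same message updates may or may not converge, which is where stability results on the algorithm's fixed points, such as the one summarized above, become relevant.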
|