421 |
A decompositional investigation of 3D face recognition. Cook, James Allen. January 2007 (has links)
Automated Face Recognition is the process of determining a subject's identity from digital imagery of their face without user intervention. The term in fact encompasses two distinct tasks: Face Verification is the process of verifying a subject's claimed identity, while Face Identification involves selecting the most likely identity from a database of subjects. This dissertation focuses on the task of Face Verification, which has a myriad of applications in security ranging from border control to personal banking. Recently the use of 3D facial imagery has found favour in the research community due to its inherent robustness to the pose and illumination variations which plague the 2D modality. The field of 3D face recognition is, however, yet to fully mature and there remain many unanswered research questions particular to the modality. The relative expense and specialty of 3D acquisition devices also means that the availability of databases of 3D face imagery lags significantly behind that of standard 2D face images. Human recognition of faces is rooted in an inherently 2D visual system and much is known regarding the use of 2D image information in the recognition of individuals. The corresponding knowledge of how discriminative information is distributed in the 3D modality is much less well defined. This dissertation addresses these issues through the use of decompositional techniques. Decomposition alleviates the problems associated with dimensionality explosion and the Small Sample Size (SSS) problem, and spatial decomposition is a technique which has been widely used in face recognition. The application of decomposition in the frequency domain, however, has not received the same attention in the literature. The use of decomposition techniques allows a mapping of the regions (both spatial and frequency) which contain the discriminative information that enables recognition. In this dissertation these techniques are covered in significant detail, both in terms of practical issues in the respective domains and in terms of the underlying distributions which they expose. Significant discussion is given to the manner in which the inherent information of the human face is manifested in the 2D and 3D domains and how these two modalities inter-relate. This investigation is also extended to cover the manner in which the decomposition techniques presented can be recombined into a single decision. Two new methods for learning the weighting functions for both the sum and product rules are presented, along with extensive testing against established methods. Knowledge acquired from these examinations is then used to create a combined technique termed Log-Gabor Templates. The proposed technique utilises both the spatial and frequency domains to extract superior performance to either in isolation. Experimentation demonstrates that the spatial and frequency domain decompositions are complementary and can be combined to give improved performance and robustness.
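To make the frequency-domain side of this concrete, the sketch below builds the radial log-Gabor transfer function commonly used for such decompositions and shows a toy weighted sum/product fusion of two matching-score arrays. It is a minimal illustration under assumed parameter values; it is not the learned weighting functions or the Log-Gabor Templates of the dissertation.

```python
import numpy as np

def log_gabor_radial(shape, f0=0.1, sigma_ratio=0.55):
    """Radial log-Gabor transfer function on an FFT frequency grid.
    f0 (centre frequency, cycles/pixel) and sigma_ratio = sigma/f0
    (bandwidth) are illustrative defaults."""
    rows, cols = shape
    fy = np.fft.fftfreq(rows).reshape(-1, 1)
    fx = np.fft.fftfreq(cols).reshape(1, -1)
    radius = np.sqrt(fx ** 2 + fy ** 2)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC term
    g = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    g[0, 0] = 0.0                            # log-Gabor filters have no DC response
    return g

def filter_image(img, f0=0.1, sigma_ratio=0.55):
    """Apply the radial log-Gabor filter to a 2D image in the frequency domain."""
    G = log_gabor_radial(img.shape, f0, sigma_ratio)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * G))

def fuse_scores(scores_a, scores_b, w=0.5):
    """Toy weighted sum-rule and product-rule fusion of two score arrays."""
    return w * scores_a + (1 - w) * scores_b, scores_a ** w * scores_b ** (1 - w)
```

In practice the weight w would be learned from training data, which is the role of the two weighting methods described in the abstract.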
|
422 |
Análise de sobrevivência do tomateiro a Phytophthora infestans / Analysis of the survival of the tomato plant to Phytophthora infestans. Araujo, Maria Nilsa Martins de. 05 September 2008 (has links)
Previous issue date: 2008-09-05 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Late blight (requeima), caused by Phytophthora infestans, is an aggressive and highly destructive disease that can limit or even prevent the economic cultivation of the tomato plant under conditions of high humidity and low temperatures. In view of the problems late blight can cause to tomato crops, this work aimed to: 1) fit models to describe the progress of the disease and form groups of tomato accessions with similar curves; 2) estimate the number of days to reach 5% disease severity by means of inverse regression; 3) fit survival curves by means of the Kaplan-Meier estimator for the accession groups and compare them by means of the Logrank test; 4) fit survival curves by means of probabilistic models and compare these curves with the Kaplan-Meier non-parametric technique. Using real data on tomato late blight, it was possible to fit the exponential model (Y = y0 exp(rX)) to describe the disease's progress. The means of the parameter estimates were submitted to cluster analysis using the centroid method, generating 10 accession groups. The time to reach 5% disease severity was calculated via inverse regression. Non-parametric techniques were used to estimate the survival function by means of the Kaplan-Meier estimator and to compare the survival curves by the Logrank test. The survival function was also fitted using the Exponential, Weibull and Log-normal probabilistic models, which were compared by means of the likelihood ratio test (LRT), considering the generalized Gamma model as a general case of these models. The methodology applied allowed fitting the exponential model to describe the progress of tomato late blight and grouping the accessions studied into 10 groups. The accession BGH-6 showed slower disease progress than the others, characterizing its higher resistance to the disease. Inverse regression allowed estimating the time until the occurrence of 5% severity of tomato late blight. The Kaplan-Meier non-parametric technique allowed estimating the survival curves of the tomato accessions belonging to groups 1, 2, 4, 6 and 8. Using the Logrank test, it could be concluded that most pairwise comparisons were significant (p<0.05), except for the comparisons of groups 2x4, 4x8 and 6x8. The use of the Exponential, Weibull and Log-normal probabilistic models allowed estimating the survival curves of groups 2, 4, 6 and 8, except that for group 4 the Weibull model was not adequate. Comparing the probabilistic models with the non-parametric technique, the curves of
the probabilistic models of groups 2 and 4 presented satisfactory results, compared to the curve estimated by Kaplan-Meier. / A requeima causada por Phytophthora infestans caracteriza-se por ser uma doença agressiva e de grande impacto destrutivo, podendo limitar ou até mesmo impedir o cultivo econômico do tomateiro sob condições de alta umidade e baixas temperaturas. Diante dos problemas que a requeima pode provocar às lavouras de tomate, este trabalho teve por objetivos: 1) ajustar modelos para descrever o progresso da doença e formar grupos de acessos de tomateiro com curvas semelhantes; 2) estimar dados referentes ao número de dias até atingir 5% de severidade da doença, por meio de regressão inversa; 3) ajustar curvas de sobrevivência por meio do estimador de Kaplan-Meier para grupos de acessos e compará-las mediante o uso do teste Logrank; 4) ajustar curvas de sobrevivência por meio de modelos probabilísticos e compará-las com a técnica não-paramétrica de Kaplan-Meier. Utilizando dados reais sobre a requeima do tomateiro, foi possível ajustar o modelo exponencial (Y = y0 exp (rX)) para descrever o progresso da doença. As médias das estimativas dos parâmetros foram submetidas à análise de agrupamento pelo método Centróide, o que gerou 10 grupos de acessos, sendo o tempo até a incidência de 5% da doença calculado via regressão inversa. Foram utilizadas técnicas não-paramétricas para estimar a função de sobrevivência por meio
do estimador de Kaplan-Meier e para comparar as curvas de sobrevivência pelo teste Logrank. Foi também ajustada a função de sobrevivência, empregando-se os modelos probabilísticos Exponencial, Weibull e Log-normal, os quais foram comparados por meio do Teste da Razão da Verossimilhança (TRV), considerando-se o modelo Gama generalizado por ser caso geral para esses modelos. A metodologia utilizada permitiu ajustar o modelo Exponencial para descrever o progresso da requeima do tomateiro e agrupar os acessos estudados em 10 grupos. O acesso BGH-6 sofreu um progresso de doença menor que os demais, caracterizando-se, assim, sua maior resistência à enfermidade. A regressão inversa possibilitou estimar o tempo até a ocorrência de 5% da severidade da requeima do tomateiro. Pela técnica não-paramétrica de Kaplan-Meier, foi possível estimar as curvas de sobrevivência dos acessos de tomateiro pertencentes aos grupos 1, 2, 4, 6 e 8. Utilizando o teste Logrank, pode-se concluir que a maioria das comparações duas a duas foi significativa (p<0,05), exceto nas comparações dos grupos 2x4, 4x8 e 6x8. O uso dos modelos probabilísticos Exponencial, Weibull e Log-normal possibilitou a estimação das curvas de sobrevivência nos grupos 2, 4, 6 e 8, exceto no grupo 4, em que o modelo Weibull não foi adequado. Comparando os modelos probabilísticos com a técnica não-paramétrica, as curvas dos modelos probabilísticos dos grupos 2 e 4 apresentaram ajustes satisfatórios com relação à curva estimada por Kaplan-Meier.
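As a concrete illustration of the pipeline described above, the sketch below fits the exponential disease-progress model Y = y0·exp(rX) by least squares on log(Y), inverts it to obtain the time to 5% severity, and computes a plain Kaplan-Meier curve with right censoring. All numbers are invented; the centroid clustering and the Logrank test of the thesis are not reproduced (the latter is available, for example, as logrank_test in the lifelines package).

```python
import numpy as np

# Exponential disease-progress model Y = y0 * exp(r * X), fitted by ordinary
# least squares on log(Y); the severity values below are made up.
days = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
severity = np.array([0.4, 1.1, 2.9, 8.0, 21.0])        # % of tissue affected

r, log_y0 = np.polyfit(days, np.log(severity), 1)
t5 = (np.log(5.0) - log_y0) / r                        # inverse regression
print(f"y0 = {np.exp(log_y0):.3f}, r = {r:.3f}, days to 5% severity = {t5:.1f}")

def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t); events = 1 observed, 0 right-censored."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    surv, curve = 1.0, []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        surv *= 1.0 - deaths / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical times (days until 5% severity) for one accession group;
# event = 0 marks plants that never reached 5% during the assessment period.
g1_times = np.array([12.0, 14.0, 15.0, 18.0, 21.0, 30.0])
g1_events = np.array([1, 1, 1, 1, 1, 0])
print(kaplan_meier(g1_times, g1_events))
```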
|
423 |
A Distribuição Fréchet generalizada. / The Generalized Fréchet Distribution. MACHADO, Elizabete Cardoso. 08 August 2018 (has links)
Previous issue date: 2013-09 / Capes / Neste trabalho fizemos um estudo sobre a classe de distribuições generalizadas exponencializadas, a distribuição Fréchet generalizada e a distribuição Weibull inversa log-generalizada. Obtemos algumas propriedades da distribuição Fréchet generalizada. Uma nova distribuição é proposta: a distribuição log-Fréchet generalizada. Esta distribuição é uma extensão da distribuição Fréchet. Outra proposta deste trabalho é introduzir um modelo de regressão log-Fréchet generalizada com censura Tipo I baseado na distribuição log-Fréchet generalizada. / In this work, we study the exponentiated generalized class of distributions, the generalized Fréchet distribution and the log-generalized inverse Weibull distribution. We obtain some properties of the generalized Fréchet distribution. Furthermore, a new distribution is proposed: the generalized log-Fréchet distribution. This new distribution is an extension of the Fréchet distribution. Another purpose of this work is to introduce a generalized log-Fréchet regression model with Type-I censoring based on the generalized log-Fréchet distribution.
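A small sketch of the distributions mentioned above: the baseline Fréchet CDF and the exponentiated-generalized construction F = (1 − (1 − G)^a)^b, which is one common way the "exponentiated generalized" class is written. The exact parameterization adopted in the dissertation may differ, and the parameter values below are arbitrary.

```python
import numpy as np

def frechet_cdf(x, alpha, sigma, mu=0.0):
    """CDF of the Fréchet (inverse Weibull) distribution, defined for x > mu."""
    x = np.asarray(x, float)
    out = np.zeros_like(x)
    pos = x > mu
    out[pos] = np.exp(-((x[pos] - mu) / sigma) ** (-alpha))
    return out

def exponentiated_generalized_cdf(G, a, b):
    """Exponentiated-generalized construction applied to a baseline CDF value G:
    F(x) = (1 - (1 - G(x))**a)**b, with a, b > 0 extra shape parameters."""
    return (1.0 - (1.0 - G) ** a) ** b

x = np.linspace(0.1, 10.0, 50)
G = frechet_cdf(x, alpha=2.0, sigma=1.5)
F = exponentiated_generalized_cdf(G, a=1.8, b=0.7)   # one "generalized Fréchet" CDF
```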
|
424 |
Extensões dos modelos de sobrevivência referente a distribuição Weibull. Vigas, Valdemiro Piedade. 07 March 2014 (has links)
Previous issue date: 2014-03-07 / Financiadora de Estudos e Projetos / In this dissertation, two models of probability distributions for the lifetimes until the occurrence of an event produced by a specific cause for elements in a population are reviewed. The first reviewed model is the Weibull-Poisson (WP) proposed by Louzada et al. (2011a). This model generalizes the exponential-Poisson distribution proposed by Kus (2007) and the Weibull distribution. The second, called the long-term model, has been proposed by several authors and considers that the population is not homogeneous with respect to the risk of occurrence of the event by the cause under study. The population has a sub-population consisting of elements that are not subject to the event from the specific cause under study; these elements are considered immune or cured. For the elements that are at risk, the minimum value of the time to the occurrence of the event is observed. In the review of the WP, the expressions of the survival function, quantile function, probability density function and hazard function are detailed, as well as the expression of the non-central moments of order k and the distribution of the order statistics. From this review we propose, in an original way, simulation studies to analyze the frequentist properties of the maximum likelihood estimators of the parameters of this distribution. We also present results on inference about the parameters of this distribution, both when the data set consists of complete lifetime observations and when it may contain censored observations. Furthermore, we present, in an original way, a location-scale regression model for the case in which T has the WP distribution. Another original contribution of this dissertation is to propose the long-term Weibull-Poisson (LWP) distribution, which is also studied in the situation in which covariates are included in the analysis. We also describe the functions that characterize this distribution (distribution function, quantile function, probability density function and hazard function), as well as the expression of the moment of order k and the density function of the order statistics. A simulation study of this distribution is carried out via maximum likelihood estimation. Applications to real data sets illustrate the applicability of the two considered models. / Nesta dissertação são revistos dois modelos de distribuições de probabilidade para os tempos de vida até a ocorrência do evento provocado por uma causa específica para elementos em uma população. O primeiro modelo revisto é o denominado Weibull-Poisson (WP) que foi proposto por Louzada et al. (2011a), esse modelo generaliza as distribuições exponencial Poisson proposta por Kus (2007) e Weibull. O segundo, denominado modelo de longa duração, foi proposto por vários autores e considera que a população não é homogênea em relação ao risco de ocorrência do evento pela causa em estudo. A população possui uma sub-população constituída de elementos que não estão sujeitos ao evento pela causa especifica em estudo, sendo considerados como imunes ou curados. Em relação à parcela dos elementos que estão em risco observa-se o valor mínimo dos tempos da ocorrência do evento.
Na revisão sobre a WP são detalhadas as expressões da função de sobrevivência, da função quantil, da função densidade de probabilidade e da função de risco, bem como a expressão dos momentos não centrais de ordem k e a distribuição de estatísticas de ordem. A partir desta revisão, é proposta de forma original, estudos de simulação com o objetivo de analisar as propriedades frequentistas dos estimadores de máxima verossimilhança dos parâmetros desta distribuição. E apresenta-se resultados relativos à inferência sobre os parâmetros desta distribuição, tanto no caso em que o conjunto de dados consta de observações completas de tempos de vida, como no caso em que ele possa conter observações censuradas. Alem disso, apresentamos de forma original neste trabalho um modelo de regressão na forma de locação e escala quando T tem distribuição WP. Outra contribuição original dessa dissertação é propor a distribuição de longa duração Weibull-Poisson (LWP), alem de estudar a LWP na situação em que as covariáveis são incluídas na análise. Realizou-se também a descrição das funções que caracterizam essa distribuição (função distribuição, função quantil, função densidade de probabilidade e função de risco). Assim como a descrição da expressão do momento de ordem k e da função densidade da estatística de ordem. É feito um estudo por simulação desta distribuição via máxima verossimilhança. Aplicações à conjuntos de dados reais ilustram a utilidade dos dois modelos considerados.
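The sketch below illustrates the long-term (cure-fraction) idea with maximum likelihood on censored data, using S_pop(t) = p + (1 − p)·S0(t). It deliberately uses a plain Weibull baseline instead of the Weibull-Poisson distribution reviewed in the dissertation, and the data are simulated, so it is only a template for how such a fit is set up.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_logS(t, shape, scale):
    return -(t / scale) ** shape

def weibull_logf(t, shape, scale):
    return (np.log(shape) - np.log(scale)
            + (shape - 1) * (np.log(t) - np.log(scale))
            - (t / scale) ** shape)

def neg_loglik(params, t, delta):
    """Long-term (cure-fraction) model S_pop(t) = p + (1 - p) * S0(t) with a
    Weibull baseline S0; delta = 1 for observed events, 0 for censored times."""
    shape, scale, logit_p = params
    if shape <= 0 or scale <= 0:
        return np.inf
    p = 1.0 / (1.0 + np.exp(-logit_p))          # cured (immune) proportion in (0, 1)
    S0 = np.exp(weibull_logS(t, shape, scale))
    f0 = np.exp(weibull_logf(t, shape, scale))
    loglik = delta * np.log((1 - p) * f0) + (1 - delta) * np.log(p + (1 - p) * S0)
    return -np.sum(loglik)

# Simulated data: 30% long-term survivors, Weibull event times, random censoring.
rng = np.random.default_rng(1)
n = 300
cured = rng.random(n) < 0.3
t_event = rng.weibull(1.5, n) * 10.0
t_cens = rng.uniform(5, 30, n)
t = np.where(cured, t_cens, np.minimum(t_event, t_cens))
delta = (~cured & (t_event <= t_cens)).astype(float)

fit = minimize(neg_loglik, x0=[1.0, 8.0, 0.0], args=(t, delta), method="Nelder-Mead")
print(fit.x)          # estimated shape, scale and logit of the cured fraction
```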
|
425 |
Modelos de regressão quando a função de taxa de falha não é monótona e o modelo probabilístico beta Weibull modificada / Regression models when the failure rate function is not monotone and the new beta modified Weibull model. Giovana Oliveira Silva. 05 February 2009 (has links)
Em aplicações na área de análise de sobrevivência, é freqüente a ocorrência de função de taxa de falha em forma de U ou unimodal, isto e, funções não-monótonas. Os modelos de regressão comumente usados para dados de sobrevivência são log-Weibull, função de taxa de falha monótona, e log-logística, função de taxa de falha decrescente ou unimodal. Um dos objetivos deste trabalho e propor os modelos de regressão, em forma de locação e escala, log-Weibull estendida que apresenta função de taxa de falha em forma de U e log- Burr XII que tem como caso particular o modelo de regressão log-logística. Considerando dados censurados, foram utilizados três métodos para estimação dos parâmetros, a saber, máxima verossimilhança, bayesiana e jackkinife. Para esses modelos foram calculadas algumas medidas de diagnósticos de influência local e global. Adicionalmente, desenvolveu-se uma análise de resíduos baseada no resíduo tipo martingale. Para diferentes parâmetros taxados, tamanhos de amostra e porcentagens de censuras, várias simulações foram feitas para avaliar a distribuição empírica do resíduo tipo martingale e compará-la com a distribuição normal padrão. Esses estudos sugerem que a distribuição empírica do resíduo tipo martingale para o modelo de regressão log-Weibull estendida com dados censurados aproxima-se de uma distribuição normal padrão quando comparados com outros resíduos considerados neste estudo. Para o modelo de regressão log-Burr XII, foi proposta uma modificação no resíduo tipo martingale baseada no estudo de simulação para obter concordância com a distribuição normal padrão. Conjuntos de dados reais foram utilizados para ilustrar a metodologia desenvolvida. Também pode ocorrer que em algumas aplicações a suposição de independência dos tempos de sobrevivência não é válida. Assim, outro objetivo deste trabalho é introduzir um modelo de regressão log-Burr XII com efeito aleatório para o qual foi proposto um método de estimação para os parâmetros baseado no algoritmo EM por Monte Carlo. Por fim, foi desenvolvido um novo modelo probabilístico denominado de beta Weibull modificado que apresenta cinco parâmetros. A vantagem desse novo modelo é a flexibilidade em acomodar várias formas da função de taxa de falha, por exemplo, U e unimodal, e mostrou-se útil na discriminação entre alguns modelos probabilísticos alternativos. O método de máxima verossimilhança e proposto para estimar os parâmetros desta distribuição. A matriz de informação observada foi calculada. Um conjunto de dados reais é usado para ilustrar a aplicação da nova distribuição / In survival analysis applications, the failure rate function may have frequently unimodal or bathtub shape, that is, non-monotone functions. The regression models commonly used for survival studies are log-Weibull, monotone failure rate function shape, and log-logistic, decreased or unimodal failure rate function shape. In the first part of this thesis, we propose location-scale regression models based on an extended Weibull distribution for modeling data with bathtub-shaped failure rate function and on a Burr XII distribution as an alternative to the log-logistic regression model. Assuming censored data, we consider a classical analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed models. For these models, we derived the appropriate matrices for assessing the local influence on the parameter estimates under diferent perturbation schemes, and we also presented some ways to perform global influence. 
Additionally, we developed a residual analysis based on the martingale-type residual. For different parameter settings, sample sizes and censoring percentages, various simulation studies were performed, and the empirical distribution of the martingale-type residual was compared with the standard normal distribution. These studies suggest that, for the log-extended Weibull regression model with censored data, the empirical distribution of the martingale-type residual shows closer agreement with the standard normal distribution than the other residuals considered. For the log-Burr XII regression model, a modification of the martingale-type residual was proposed, based on simulation studies, in order to obtain agreement with the standard normal distribution. Some applications to real data illustrate the usefulness of the methodology developed. Since in some applications the assumption of independence of the survival times is not valid, a log-Burr XII regression model with random effects was also introduced, for which a parameter estimation method based on the Monte Carlo EM algorithm was proposed. Finally, a five-parameter distribution, the so-called beta modified Weibull distribution, is defined and studied. The advantage of this new distribution is its flexibility in accommodating several shapes of the failure rate function, for instance bathtub-shaped and unimodal, and it is also suitable for testing the goodness-of-fit of some special sub-models. The method of maximum likelihood is used for estimating the model parameters, and the observed information matrix is calculated. A real data set is used to illustrate the application of the new distribution.
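For the residual analysis described above, a minimal sketch of the standard martingale residual r = δ + log Ŝ(t) and its deviance-type transform is given below. The specific modified residual proposed for the log-Burr XII model is not reproduced, and the survival probabilities and censoring indicators are simulated rather than coming from a fitted model.

```python
import numpy as np

def martingale_residuals(surv_prob, delta):
    """Martingale residual r = delta + log(S_hat(t)); S_hat can come from any
    fitted survival regression model (log-Weibull, log-Burr XII, ...)."""
    return delta + np.log(surv_prob)

def deviance_type_residuals(surv_prob, delta):
    """Deviance transform of the martingale residual, usually closer to N(0, 1)."""
    rm = martingale_residuals(surv_prob, delta)
    inner = rm + np.where(delta > 0, delta * np.log(delta - rm), 0.0)
    return np.sign(rm) * np.sqrt(-2.0 * inner)

# Illustration with simulated survival probabilities and censoring indicators.
rng = np.random.default_rng(0)
S_hat = rng.uniform(0.05, 0.99, 500)
delta = rng.integers(0, 2, 500).astype(float)
rd = deviance_type_residuals(S_hat, delta)
print(np.mean(rd), np.std(rd))     # rough check against a standard normal
```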
|
426 |
Implementation of a Log Agent in Microsoft Azure : and packaging it to Azure Marketplace / Implementering av en Log Agent i Microsoft Azure : och paketering till Azure Marketplace. Bui, Michael; Pedersen, Magnus. January 2015 (has links)
Cloud computing is still in an early stage of development and Microsoft is now investing a considerable amount of resources in the cloud. Microsoft Azure is a cloud platform developed by Microsoft that is continuously evolving: new features are constantly being added and old features are being updated. Integration Software, a company that focuses on products for system integration, strongly believes that cloud-based solutions will have a significant impact on its future, which is why selling and developing solutions and services for the cloud is strategically important for the company. The objective of this dissertation is to investigate Microsoft Azure in general and Azure Marketplace in particular. The investigation consisted of implementing a Microsoft Azure application, integrating it with Azure Marketplace, and evaluating the expenses of running the application. The purpose of this project is to gain practical experience working with new techniques and to help Integration Software better understand Azure Marketplace. The application is a Log Agent which fetches data from an external source and resends the data to an external party (Integration Manager). Our first intention was to package and deploy the application to a newly updated Azure Marketplace, but the new Azure Marketplace was never released during this dissertation, so we deployed the application to the existing version of Azure Marketplace instead. This was, however, not fully successful: we encountered problems in deploying the application to Azure Marketplace. The evaluation of the cost of running an Azure application was not carried out due to lack of time.
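A minimal sketch of the fetch-and-forward behaviour of such a Log Agent is shown below. The endpoints, payload format and polling interval are hypothetical; the actual agent was built for Microsoft Azure and Integration Manager, whose APIs and authentication are not described in the abstract.

```python
import time
import requests

# Hypothetical endpoints: an external log source and a receiving party.
SOURCE_URL = "https://example.com/api/logs"                    # assumed source
SINK_URL = "https://example.com/integration-manager/events"    # assumed receiver

def fetch_and_forward(session: requests.Session) -> int:
    """Pull new log entries from the source and re-send them to the sink."""
    resp = session.get(SOURCE_URL, timeout=10)
    resp.raise_for_status()
    entries = resp.json()                # assume the source returns a JSON list
    for entry in entries:
        session.post(SINK_URL, json=entry, timeout=10).raise_for_status()
    return len(entries)

if __name__ == "__main__":
    with requests.Session() as s:
        while True:                      # simple polling loop
            try:
                print(f"forwarded {fetch_and_forward(s)} log entries")
            except requests.RequestException as exc:
                print(f"transfer failed, will retry: {exc}")
            time.sleep(30)
```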
|
427 |
Modèles de régression multivariés pour la comparaison de populations en IRM de diffusion / Multivariate regression models for group comparison in diffusion tensor MRI. Bouchon, Alix. 28 September 2016 (has links)
L'IRM de diffusion (IRMd) est une modalité d'imagerie qui permet d'étudier in vivo la structure des faisceaux de la substance blanche grâce à la caractérisation des propriétés de diffusion des molécules d'eau dans le cerveau. Les travaux de cette thèse se sont concentrés sur la comparaison de groupes d'individus en IRMd. Le but est d'identifier les zones de la substance blanche dont les propriétés structurelles sont statistiquement différentes entre les deux populations ou significativement corrélées avec certaines variables explicatives. L’enjeu est de pouvoir localiser et caractériser les lésions causées par une pathologie et de comprendre les mécanismes sous-jacents. Pour ce faire, nous avons proposé dans cette thèse des méthodes d'analyse basées voxel reposant sur le Modèle Linéaire Général (MLG) et ses extensions multivariées et sur des variétés, qui permettent d'effectuer des tests statistiques intégrant explicitement des variables explicatives. En IRMd, la diffusion des molécules d'eau peut être modélisée par un tenseur d'ordre deux représenté par une matrice symétrique définie-positive de dimension trois. La principale contribution de cette thèse a été de montrer la plus-value de considérer, dans le MLG, l'information complète du tenseur par rapport à un unique descripteur scalaire caractérisant la diffusion (fraction d’anisotropie ou diffusion moyenne), comme cela est généralement fait dans les études en neuro-imagerie. Plusieurs stratégies d’extension du MLG aux tenseurs ont été comparées, que ce soit en termes d’hypothèse statistique (homoscédasticité vs hétéroscédasticité), de métrique utilisée pour l’estimation des paramètres (Euclidienne, Log-Euclidienne et Riemannienne), ou de prise en compte de l’information du voisinage spatial. Nous avons également étudié l'influence de certains prétraitements comme le filtrage et le recalage. Enfin, nous avons proposé une méthode de caractérisation des zones détectées afin d’en faciliter l’interprétation physiopathologique. Les validations ont été menées sur données synthétiques ainsi que sur une base d’images issues d’une cohorte de patients atteints de Neuromyélite optique de Devic. / Diffusion Tensor MRI (DT-MRI) is an imaging modality that allows to study in vivo the structure of white matter fibers through the characterization of diffusion properties of water molecules in the brain. This work focused on group comparison in DT-MRI. The aim is to identify white matter regions whose structural properties are statistically different between two populations or significantly correlated with some explanatory variables. The challenge is to locate and characterize lesions caused by a disease and to understand the underlying mechanisms. To this end, we proposed several voxel-based strategies that rely on the General Linear Model (GLM) and its multivariate and manifold-based extensions, to perform statistical tests that explicitly incorporate explanatory variables. In DT-MRI, diffusion of water molecules can be modeled by a second order tensor represented by a three dimensional symmetric and positive definite matrix. The main contribution of this thesis was to demonstrate the added value of considering the full tensor information as compared to a single scalar index characterizing some diffusion properties (fractional anisotropy or mean diffusion) in the GLM, as it is usually done in neuroimaging studies. 
Several strategies for extending the GLM to tensors were compared, in terms of the statistical hypothesis (homoscedasticity vs. heteroscedasticity), the metric used for parameter estimation (Euclidean, Log-Euclidean and Riemannian), and the way spatial neighborhood information is taken into account. We also studied the influence of pre-processing steps such as filtering and registration. Finally, we proposed a method for characterizing the detected regions in order to facilitate their physiopathological interpretation. Validations were conducted on synthetic data as well as on a cohort of patients suffering from Neuromyelitis Optica.
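As a sketch of the Log-Euclidean route described above, the code below maps each 3x3 diffusion tensor to R^6 via the matrix logarithm and fits a homoscedastic voxel-wise GLM by ordinary least squares. The heteroscedastic, Riemannian and neighborhood-aware variants and the test statistics of the thesis are not reproduced, and the tensors and design matrix are random toy data.

```python
import numpy as np

def log_euclidean_vec(D):
    """Map a 3x3 SPD diffusion tensor to R^6 via the matrix logarithm
    (Log-Euclidean framework); off-diagonals are weighted by sqrt(2) so the
    Euclidean norm of the vector equals the Frobenius norm of log(D)."""
    w, V = np.linalg.eigh(D)
    L = V @ np.diag(np.log(w)) @ V.T
    s2 = np.sqrt(2.0)
    return np.array([L[0, 0], L[1, 1], L[2, 2],
                     s2 * L[0, 1], s2 * L[0, 2], s2 * L[1, 2]])

def voxelwise_glm(tensors, X):
    """Multivariate GLM at one voxel: regress the 6 log-tensor components on
    the design matrix X (group indicator, covariates, ...) by ordinary least
    squares, i.e. assuming homoscedastic Euclidean errors in log space."""
    Y = np.vstack([log_euclidean_vec(D) for D in tensors])   # n_subjects x 6
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return beta, resid

# Toy example: 20 subjects in 2 groups, random near-isotropic tensors.
rng = np.random.default_rng(0)
tensors = []
for _ in range(20):
    A = rng.normal(scale=0.05, size=(3, 3))
    tensors.append(np.eye(3) * 0.8e-3 + (A + A.T) * 1e-4)    # stays SPD
X = np.column_stack([np.ones(20), np.repeat([0, 1], 10)])
beta, resid = voxelwise_glm(tensors, X)
```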
|
428 |
Funções de predição espacial de propriedades do solo / Spatial prediction functions of soil properties. Rosa, Alessandro Samuel. 27 January 2012 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / The possibility of mapping soil properties using soil spatial prediction functions (SSPFe) is a reality. But is it possible for SSPFe to estimate soil properties such as the particle-size distribution (psd) on a young, unstable and geologically complex geomorphologic surface? What would be considered good performance in such a situation, and what alternatives do we have to improve it? With the present study I try to find answers to these questions. To do so I used a set of 339 soil samples from a small catchment in the hillslope areas of central Rio Grande do Sul. Multiple linear regression models were built using land-surface parameters (elevation, convergence index, stream power index). The SSPFe explained more than half of the data variance, a performance similar to that of the conventional soil mapping approach. For some size fractions the SSPFe performance can reach 70%. The largest uncertainties are observed in areas of greater geological heterogeneity; therefore, significant improvements in the predictions can only be achieved if accurate geological data are made available. Meanwhile, SSPFe built on land-surface parameters are efficient in estimating the psd of soils in regions of complex geology. However, there are still questions that I couldn't answer. Is soil mapping important to solve the main social and environmental issues of our time? If our activities were subjected to social control, as in a direct democracy, would they be worthy of receiving any attention? / A possibilidade de mapear as propriedades dos solos através do uso de funções de
predição espacial de solos (FPESe) é uma realidade. Mas seria possível construir FPESe para
estimar propriedades como a distribuição do tamanho de partículas do solo (dtp) em um
superfície geomorfológica jovem e instável, com elevada complexidade geológica e
pedológica? O que seria considerado um bom desempenho nessas condições e que
alternativas temos para melhorá-lo? Com esse trabalho tento encontrar respostas para essas
questões. Para isso utilizei um conjunto de 339 amostras de solo de uma pequena bacia
hidrográfica de encosta da região Central do RS. Modelos de regressão linear múltiplos foram
construídos com atributos de terreno (elevação, índice de convergência, índice de potência de
escoamento). As FPESe explicaram mais da metade da variância dos dados. Tal desempenho
é semelhante àquele da abordagem tradicional de mapeamento de solos. Para algumas frações
de tamanho o desempenho das FPESe pode chegar a 70%. As maiores incertezas ocorrem nas
áreas de maior heterogeneidade geológica. Assim, melhorias significativas nas predições
somente poderão ser alcançadas se dados geológicos acurados forem disponibilizados.
Enquanto isso, FPESe construídas a partir de atributos de terreno são eficientes em estimar a
dtp de solos de regiões com geologia complexa e elevada instabilidade. Mas restam dúvidas
que não consegui resolver! O mapeamento de solos é importante para a resolução dos
principais problemas sociais e ambientais do nosso tempo? E se nossas atividades estivessem
submetidas ao controle da população como em uma democracia direta, seriam elas dignas de
receber atenção?
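A minimal sketch of an SSPFe of the kind described above, i.e. a multiple linear regression of a particle-size fraction on terrain attributes. The terrain values, the response and the coefficients are simulated for illustration; they are not the 339 samples or the fitted models of the study.

```python
import numpy as np

# Hypothetical terrain attributes for n sampling points: elevation (m),
# convergence index and stream power index, as named in the abstract.
rng = np.random.default_rng(42)
n = 339
elevation = rng.uniform(60, 480, n)
convergence = rng.normal(0, 20, n)
stream_power = rng.lognormal(1.0, 0.8, n)

# Simulate a clay-content response (%) with noise, then fit the SSPFe as a
# multiple linear regression of the response on the terrain attributes.
clay = 5 + 0.05 * elevation - 0.10 * convergence + rng.normal(0, 6, n)

X = np.column_stack([np.ones(n), elevation, convergence, stream_power])
beta, *_ = np.linalg.lstsq(X, clay, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((clay - pred) ** 2) / np.sum((clay - np.mean(clay)) ** 2)
print(f"coefficients: {beta}, R^2 = {r2:.2f}")   # roughly 0.5 with these settings
```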
|
429 |
Etude d'injections de Sobolev critiques dans les espaces d'Orlicz et applications / Study of critical Sobolev embeddings into Orlicz spaces and applications. Ben Ayed, Inès. 28 December 2015 (has links)
Dans cette thèse, on s'est attaché d'une part à décrire le défaut de compacité de l'injection de Sobolev critique dans les différentes classes d'espaces d'Orlicz, et d'autre part à étudier l'équation de Klein-Gordon avec une non-linéarité exponentielle. Ce travail se divise en trois parties. L'objectif de la première partie est de caractériser le défaut de compacité de l'injection de Sobolev de $H^2_{rad}(R^4)$ dans l'espace d'Orlicz $\mathcal{L}(R^4)$. Le but de la deuxième partie est double : tout d'abord, on a décrit le défaut de compacité de l'injection de Sobolev de $H^1(R^2)$ dans les différentes classes d'espaces d'Orlicz, ensuite on a étudié une famille d'équations de Klein-Gordon non linéaires à croissance exponentielle. Cette étude inclut à la fois les problèmes d'existence globale, de complétude asymptotique et d'étude qualitative pour le problème de Cauchy associé. La troisième partie est dédiée à l'analyse des solutions de l'équation de Klein-Gordon 2D issues d'une suite de données de Cauchy bornée dans $H^1_{rad}(R^2) \times L^2_{rad}(R^2)$. Basée sur les décompositions en profils, cette analyse a été conduite dans le cadre de la norme d'Orlicz. / In this thesis, we focused on the one hand on the description of the lack of compactness of the critical Sobolev embedding into different classes of Orlicz spaces, and on the other hand on the study of the nonlinear Klein-Gordon equation with exponential nonlinearity. This work is divided into three parts. The aim of the first part is to characterize the lack of compactness of the Sobolev embedding of $H^2_{rad}(R^4)$ into the Orlicz space $\mathcal{L}(R^4)$. The aim of the second part is twofold: firstly, we describe the lack of compactness of the Sobolev embedding of $H^1(R^2)$ into different classes of Orlicz spaces; secondly, we investigate a family of nonlinear Klein-Gordon equations with exponential nonlinearity. This study includes the global existence problem, asymptotic completeness and the qualitative study of the associated Cauchy problem. The third part is dedicated to the analysis of the solutions to the 2D Klein-Gordon equation associated with a sequence of bounded Cauchy data in $H^1_{rad}(R^2) \times L^2_{rad}(R^2)$. Based on profile decompositions, this analysis was conducted in the framework of the Orlicz norm.
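For reference, an Orlicz space as used above is defined through the Luxemburg norm written below. The choice of Young function φ(s) = e^{s²} − 1 is the one commonly used in this Trudinger-Moser type setting and is stated here as an assumption, since the abstract does not spell it out.

```latex
% Luxemburg norm defining the Orlicz space L^\phi; the Young function
% \phi(s) = e^{s^2} - 1 is an assumed convention, not stated in the abstract.
\[
  \|u\|_{L^{\phi}(\mathbb{R}^d)}
    \;=\; \inf\Big\{\lambda > 0 \;:\;
      \int_{\mathbb{R}^d} \phi\!\Big(\frac{|u(x)|}{\lambda}\Big)\,dx \le 1\Big\},
  \qquad \phi(s) = e^{s^{2}} - 1 .
\]
```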
|
430 |
Modelos estocásticos utilizados no planejamento da operação de sistemas hidrotérmicos / Stochastic models used in planning the operation of hydrothermal systems. Danilo Alvares da Silva. 20 May 2013 (links)
Algumas abordagens para o problema de Planejamento Ótimo da Operação de Sistemas Hidrotérmicos (POOSH) utilizam modelos estocásticos para representar as vazões afluentes dos reservatórios do sistema. Essas abordagens utilizam, em geral, técnicas de Programação Dinâmica Estocástica (PDE) para resolver o POOSH. Por outro lado, muitos autores têm defendido o uso dos modelos determinísticos ou, particularmente, a Programação Dinâmica Determinística (PDD) por representar de forma individualizada a interação entre as usinas hidroelétricas do sistema. Nesse contexto, esta dissertação tem por objetivo comparar o desempenho da solução do POOSH obtida via PDD com a solução obtida pela PDE, que emprega um modelo Markoviano periódico, com distribuição condicional Log-Normal Truncada para representar as vazões. Além disso, é realizada a análise com abordagem bayesiana, no modelo de vazões, para estimação dos parâmetros e previsões das vazões afluentes. Comparamos as performances simulando a operação das usinas hidroelétricas de Furnas e Sobradinho, considerando séries de vazões geradas artificialmente / Some approaches for problem of Optimal Operation Planning of Hydrothermal Systems (OOPHS) use stochastic models to represent the inflows in the reservoirs that compose the system. These approaches typically use the Stochastic Dynamic Programming (SDP) to solve the OOPHS. On the other hand, many authors defend the use of deterministic models and, particularly, the Deterministic Dynamic Programming (DDP) since it individually represents the interaction between the hydroelectric plants. In this context, this dissertation aims to compare the performance of the OOPHS solution obtained via DDP with the one given by SDP, which employs a periodic Markovian model with conditional Truncated Log-Normal distribution to represent the inflows. Furthermore, it is performed a bayesian approach analysis, in the inflow model, for estimating the parameters and forecasting the inflows. We have compared the performances of the DDP and SDP solutions by simulating the hydroelectric plants of Furnas and Sobradinho, employing artificially generated series
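The sketch below shows the backward recursion of a stochastic dynamic program for a single reservoir with log-normal inflows, purely to make the SDP mechanics concrete. It uses i.i.d. truncated log-normal inflows rather than the periodic Markovian model of the dissertation, handles spill and deficit only crudely via clipping, and its grid, cost function and parameters are invented rather than those of Furnas or Sobradinho.

```python
import numpy as np

rng = np.random.default_rng(7)
T = 12                                   # monthly stages
storage = np.linspace(0.0, 100.0, 21)    # discretized reservoir volume
release = np.linspace(0.0, 40.0, 21)     # discretized turbined outflow
demand = 30.0                            # load target per stage

def sample_inflows(n=200, mu=2.5, sigma=0.5, cap=60.0):
    """Log-normal inflow scenarios, truncated at `cap` by clipping."""
    return np.minimum(rng.lognormal(mu, sigma, n), cap)

def stage_cost(u):
    """Thermal complementation cost for the demand not met by hydro release u."""
    return np.maximum(demand - u, 0.0) ** 2

future = np.zeros_like(storage)          # terminal cost-to-go
policy = np.zeros((T, storage.size))

for t in reversed(range(T)):             # backward recursion over stages
    q = sample_inflows()
    new_future = np.empty_like(storage)
    for i, s in enumerate(storage):
        best = np.inf
        for u in release:
            s_next = np.clip(s + q - u, storage[0], storage[-1])  # per scenario
            cost = stage_cost(u) + np.interp(s_next, storage, future)
            exp_cost = np.mean(cost)     # expectation over inflow scenarios
            if exp_cost < best:
                best, policy[t, i] = exp_cost, u
        new_future[i] = best
    future = new_future
```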
|