  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Evaluation probabiliste du risque lié à l'exposition à des aflatoxines et des fumonisines dû à la consommation de tortillas de maïs à la ville de Veracruz / Evaluación probabilistica de riesgo por exposición a aflatoxinas y fumonisinas por consumo de tortillas en la ciudad de Veracruz / Probabilistic risk assessment for aflatoxins and fumonisins exposition through the consumption of maize tortillas in Veracruz City, Mexico

Wall Martinez, Hiram Alejandro, 20 October 2016
One of the chemical hazards most frequently reported by the WHO is the contamination of cereals with mycotoxins, mainly aflatoxins and fumonisins. The Mexican standard NOM-188-SSA1-2002 establishes that the aflatoxin concentration in grain should not exceed 20 µg/kg; however, concentrations above 200 µg/kg have been reported in Mexican maize. Although nixtamalization has been documented to remove more than 90 % of fumonisins and 80 to 95 % of aflatoxins, the residual amount can still be significant: aflatoxin concentrations up to 100 µg/kg have been reported in tortillas, which represents a real risk given the high consumption of tortillas in Mexico (325 g/day). The JECFA (2001) established a maximum tolerable intake of 2 µg/kg bw/day for fumonisins and recommends reducing aflatoxin exposure to levels "as low as reasonably achievable", with the threshold of 1 ng/kg bw/day not to be reached. In this thesis, three random and representative sampling campaigns, each covering 40 tortillerias, were carried out in Veracruz City. Maize intake and body weight of the population were estimated from a consumption questionnaire. Mycotoxin analyses were performed by HPLC-FD with immunoaffinity columns, following the European standards UNE-EN ISO 14123:2008 for aflatoxins and UNE-EN 13585:2002 for fumonisins, at CIRAD (Montpellier, France). The statistical analysis followed a probabilistic approach developed in collaboration with the Université de Bretagne Occidentale (Brest, France), building probability density functions (PDFs) and propagating them by the Monte Carlo method; the representativeness of the sample was validated by checking population quotas after the initial random sampling. The fitted mean body weight was 74.15 kg for men (consistent with the value reported by CANAIVE) and 65.83 kg for women; aflatoxin contamination of tortillas was 0.54 to 1.96 µg/kg and fumonisin contamination 65.46 to 136.00 µg/kg; tortilla consumption corresponded to 148.3 g of maize per person per day; the resulting daily intake was 0.94 to 3.14 ng/kg bw/day for aflatoxins and 146.24 to 314.99 ng/kg bw/day for fumonisins. The most contaminated samples came from tortillerias performing nixtamalization in situ. The exposure assessment showed that up to 60 % of the population may exceed the JECFA (2001) guidance level for aflatoxins (1 ng/kg bw/day), whereas fumonisin exposure remained below 5 % of the tolerable intake owing to the low contamination of maize by these mycotoxins. The results suggest that the population of Veracruz City could be at dietary risk from eating aflatoxin-contaminated maize tortillas. Extending this study to rural communities, where higher maize consumption probably increases the risk, is advisable.
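The exposure calculation described in this abstract (contamination × consumption ÷ body weight, propagated by Monte Carlo) can be sketched as follows; the input distributions and all parameter values here are illustrative assumptions, not the distributions fitted in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo draws

# Illustrative input distributions (assumed, not the thesis's fitted PDFs):
contamination = rng.lognormal(mean=np.log(1.0), sigma=0.6, size=n)  # aflatoxin in tortillas, ug/kg
consumption = rng.normal(loc=148.3, scale=40.0, size=n).clip(min=0)  # maize intake, g/day
weight = rng.normal(loc=70.0, scale=12.0, size=n).clip(min=30)       # body weight, kg

# Daily intake in ng per kg body weight per day.
# (ug/kg * g/day) needs *1e3 for ug->ng and /1e3 for g->kg, so the factors cancel:
intake = contamination * consumption / weight  # ng/kg bw/day

threshold = 1.0  # JECFA guidance level for aflatoxin, ng/kg bw/day
fraction_at_risk = (intake > threshold).mean()
print(f"median intake: {np.median(intake):.2f} ng/kg bw/day")
print(f"fraction above {threshold} ng/kg bw/day: {fraction_at_risk:.1%}")
```

The propagated `intake` sample can then be summarized by any quantile or exceedance probability of interest, which is exactly what a probabilistic risk assessment reports instead of a single deterministic estimate.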
42

A Probabilistic Cost to Benefit Assessment of a Next Generation Electric Power Distribution System

January 2016
abstract: This thesis provides a cost-benefit assessment of the proposed next-generation distribution system, the Future Renewable Electric Energy Delivery and Management (FREEDM) system. A probabilistic study is conducted to determine the payback period for an investment in the FREEDM distribution system; the stochastic study supports a detailed estimate of the probability density function and statistics of the payback period. The thesis also identifies several parameters associated with the FREEDM system, which are used in the cost-benefit study to evaluate the investment and several direct and indirect benefits. Different topologies are selected to represent the FREEDM test bed; given the cost of high-speed fault isolation devices, the topology is chosen to minimize the number of such devices subject to an enhanced-reliability constraint. A case study is also performed to assess the economic impact of energy storage devices in the solid state transformers, which would allow the fault isolation devices to be replaced by conventional circuit breakers. A reliability study of the FREEDM distribution system examines the customer-centric reliability index, System Average Interruption Frequency Index (SAIFI), which is found to be close to 0.125, and compares this value with the SAIFI of a representative U.S. distribution system. The payback period is also determined by a theoretical approach, and the results are compared with the Monte Carlo simulation outcomes to understand its variation. The payback period is close to 60 years, but if an annual rebate is considered it reduces to 20 years, showing that the FREEDM system has significant potential that cannot be overlooked. Several direct and indirect benefits arising from the FREEDM system are also discussed in this thesis. / Master's Thesis, Electrical Engineering, 2016
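A minimal sketch of the kind of Monte Carlo payback analysis this abstract describes; all cost and benefit figures below are made-up placeholders, not the thesis's data, and the payback is the simple undiscounted ratio of cost to annual net benefit.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000  # Monte Carlo draws

# Illustrative (assumed) distributions for capital cost and annual net benefit, in $M:
capital_cost = rng.triangular(left=80, mode=100, right=130, size=n)
annual_benefit = rng.triangular(left=1.2, mode=1.7, right=2.4, size=n)
annual_rebate = 3.5  # assumed flat annual incentive, $M/yr

# Simple (undiscounted) payback period in years, with and without the rebate:
payback_plain = capital_cost / annual_benefit
payback_rebate = capital_cost / (annual_benefit + annual_rebate)

print(f"mean payback, no rebate:   {payback_plain.mean():.1f} years")
print(f"mean payback, with rebate: {payback_rebate.mean():.1f} years")
```

Because the whole `payback_plain` sample is retained, its empirical density and tail probabilities (e.g. the chance the payback exceeds the equipment lifetime) come for free, which is the advantage of the probabilistic treatment over a single point estimate.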
43

Análise de frequência de secas utilizando análise de agrupamento e distribuições de probabilidades.

MELO, Valneli da Silva. 17 July 2018
The objective of this study was to fit probability density functions to drought severity and duration data in three sub-regions of the Brazilian semiarid region (SAB). Monthly precipitation totals from 320 rain gauges for the period 1984-2014, kindly provided by the National Water Agency (ANA), were used. The "RUN" method was applied to the monthly totals to obtain drought severity and duration series, and cluster analysis was then used to regionalize severity and duration into homogeneous sub-regions. Probability distribution functions (PDFs) were then fitted for each sub-region. The Gamma, GEV and Logistic distributions gave the best fits to drought severity, while the GEV, three-parameter Weibull and two-parameter Gamma models fitted drought duration best in homogeneous sub-regions 1, 2 and 3, respectively. Fitting the bivariate Normal distribution jointly to severity and duration was unsuccessful in every sub-region of the SAB. / Funded by CAPES.
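Distribution fitting of the sort described above can be sketched with `scipy.stats`; the drought-severity sample here is synthetic, and AIC is used as a simple stand-in for whatever goodness-of-fit criterion the thesis applied.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic drought-severity sample (assumed gamma-like; the real data came from
# the RUN method applied to monthly rainfall totals):
severity = rng.gamma(shape=2.0, scale=3.0, size=500)

candidates = {
    "gamma": stats.gamma,
    "genextreme": stats.genextreme,  # the GEV family in scipy
    "weibull_min": stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(severity)                 # maximum-likelihood fit
    ll = np.sum(dist.logpdf(severity, *params))  # log-likelihood at the MLE
    results[name] = 2 * len(params) - 2 * ll     # Akaike information criterion

best = min(results, key=results.get)
print("AIC by model:", {k: round(v, 1) for k, v in results.items()})
print("best fit:", best)
```

The same loop applied per homogeneous sub-region, once to severity and once to duration, reproduces the comparison structure of the study.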
44

Classe de distribuições de Marshall-Olkin generalizada exponenciada.

BARROS, Kleber Napoleão Nunes de Oliveira 19 December 2014
This work generalizes the Marshall-Olkin family of distributions by adding parameters, producing a new, more flexible class: the Generalized Exponentiated Marshall-Olkin Weibull (GEMOW) distribution. Its probability density function and the associated hazard function were studied, with promising results. Several quantities were derived for the proposed distribution, including moments, the moment generating function, the quantile function and median, and the Bonferroni and Lorenz curves. A simulation study was carried out, and the bootstrap resampling procedure was employed to obtain standard errors of the model parameter estimators. The new distribution, its sub-models and competing distributions were applied to earthquake magnitude data from the Fiji archipelago and to glass fibre strength data. A regression model for censored data was also obtained and applied to data from an AIDS study, and a Bayesian model was implemented for carbon fibre failure data. The results show that the GEMOW distribution fits the applied datasets better than the competing distributions.
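The bootstrap procedure for standard errors mentioned above can be sketched as follows, using a plain two-parameter Weibull on synthetic data in place of the GEMOW distribution (whose density is not given in this record).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic strength/lifetime-type data (assumed Weibull; stands in for the
# fibre-strength datasets used in the thesis):
data = rng.weibull(a=1.5, size=200) * 2.0

def fit_params(sample):
    # Fit a two-parameter Weibull (location fixed at 0) by maximum likelihood
    shape, loc, scale = stats.weibull_min.fit(sample, floc=0)
    return np.array([shape, scale])

# Nonparametric bootstrap: resample with replacement, refit, take the spread
# of the refitted estimates as the standard errors of the estimators.
B = 200
boot = np.array([fit_params(rng.choice(data, size=data.size, replace=True))
                 for _ in range(B)])
se = boot.std(axis=0, ddof=1)
print(f"bootstrap SE: shape {se[0]:.3f}, scale {se[1]:.3f}")
```

Swapping `fit_params` for the MLE of any other parametric family (including a generalized Marshall-Olkin model) leaves the bootstrap loop unchanged, which is why the procedure is attractive when analytic standard errors are awkward.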
45

Mobile Application Development with Image Applications Using Xamarin

GAJJELA, VENKATA SARATH; DUPATI, SURYA DEEPTHI. January 2018
Image enhancement improves the appearance of an image by increasing the dominance of some features or by decreasing the ambiguity between different regions of the image. Image enhancement techniques are widely used in image processing applications where the subjective quality of images matters for human interpretation. Many images lack clarity because of fog, low light and other daylight effects, so they must be enhanced before the objects in them can be recognized clearly. Histogram-based enhancement works by equalizing the histogram of the image and thereby increasing its dynamic range. The histogram equalization algorithm was implemented and tested on images affected by low light, fog and poor colour contrast, and succeeded in producing enhanced images. The technique uses the normalized histogram values as a probability density function. Starting from MATLAB code for histogram equalization, the method was ported to an application program interface (API) using the Xamarin software. The mobile application developed with Xamarin runs efficiently and has a shorter execution time than the equivalent application developed in Android Studio. The application was successfully debugged in both its Android and iOS versions. The focus of this thesis is to develop a mobile image enhancement application with Xamarin for low-light and foggy images.
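Histogram equalization as described, treating the normalized histogram as a probability density function and remapping intensities through its cumulative sum, can be sketched in a few lines; this is a generic NumPy implementation, not the thesis's MATLAB or Xamarin code.

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Classic histogram equalization for an 8-bit grayscale image.

    The normalized histogram is treated as a probability density function;
    its cumulative sum (the CDF) becomes the intensity remapping function.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    pdf = hist / img.size                        # probability of each gray level
    cdf = np.cumsum(pdf)                         # cumulative distribution function
    lut = np.round(cdf * 255).astype(np.uint8)   # lookup table for levels 0..255
    return lut[img]

# Toy example: a dark, low-contrast image gets stretched across the full range
dark = np.clip(np.random.default_rng(3).normal(60, 10, (64, 64)), 0, 255).astype(np.uint8)
eq = equalize_histogram(dark)
print("input range:", dark.min(), "-", dark.max())
print("output range:", eq.min(), "-", eq.max())
```

For colour images the same lookup-table step is typically applied to the luminance channel only, so hues are preserved while contrast is stretched.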
46

Estudo de modelos estatisticos utilizados na caracterização de tecidos por ultra-som / A study of statistical models used for ultrasonic tissue characterization

Vivas, Gustavo de Castro 08 April 2006
Advisors: Eduardo Tavares Costa, Ricardo Grossi Dantas / Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Ultrasound medical diagnosis has become a reference in many clinical examinations, especially B-mode imaging, which represents tissue and organ anatomy non-invasively, in real time and without ionizing radiation. However, speckle, an artifact inherent to systems that use coherent sources, such as ultrasound systems, degrades image quality and can significantly reduce a physician's ability to detect lesions. Ultrasonic tissue characterization aims to extract clinically relevant information about the actual characteristics of the biological structure under investigation that cannot easily be obtained by visual inspection. This dissertation presents a comparative study of the main statistical distribution models found in the literature and commonly adopted for ultrasonic tissue characterization, using the probability density functions that best represent the brightness pattern of a given region of an ultrasound image. The results indicated the versatility of the compound (K-Nakagami) distribution in modeling the different scattering conditions found in tissues, making it a strong candidate for ultrasonic tissue characterization. However, using the concept of equivalent scatterers, it could be shown that the statistical approach does not supply conclusive quantitative parameters about the investigated structure, but rather a joint contribution of several factors, among them the density and the amplitude distribution of the acoustic scatterers. / Master's in Electrical Engineering, Biomedical Engineering area
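A small sketch of the statistical-modeling idea behind this record: under fully developed speckle the echo envelope is Rayleigh distributed, which the Nakagami model recovers as shape m = 1. The data here are synthetic and the simple Nakagami model stands in for the thesis's compound K-Nakagami distribution, which is not implemented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic envelope data: fully developed speckle gives a Rayleigh envelope,
# i.e. the Nakagami distribution with shape m = 1 (an assumed toy case):
envelope = stats.rayleigh.rvs(scale=1.0, size=2000, random_state=rng)

# Fit the Nakagami model (location fixed at 0, as for an amplitude envelope)
m, loc, scale = stats.nakagami.fit(envelope, floc=0)
print(f"estimated Nakagami shape m = {m:.2f}")
print("m near 1 indicates Rayleigh (fully developed) speckle;")
print("m < 1 suggests pre-Rayleigh and m > 1 post-Rayleigh scattering conditions")
```

Estimating `m` within a sliding window over a B-mode image is one common way such a fitted shape parameter is turned into a tissue-characterization map.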
47

COMPUTATIONAL METHODS FOR RANDOM DIFFERENTIAL EQUATIONS: THEORY AND APPLICATIONS

Navarro Quiles, Ana 01 March 2018
Ever since the early contributions of Isaac Newton, Gottfried Wilhelm Leibniz, and Jacob and Johann Bernoulli in the seventeenth century, difference and differential equations have demonstrated their capability to model complex problems of great interest in Engineering, Physics, Chemistry, Epidemiology, Economics, etc. From a practical standpoint, however, applying difference or differential equations requires setting their inputs (coefficients, source term, initial and boundary conditions) from sampled data, which carry uncertainty stemming from measurement errors. In addition, random external factors can affect the system under study. It is therefore more advisable to treat the input data as random variables or stochastic processes rather than as deterministic constants or functions. Under this consideration, random difference and differential equations appear. This thesis solves, from a probabilistic point of view, different types of random difference and differential equations, applying mainly the Random Variable Transformation method. This technique is a useful tool for obtaining the probability density function of a random vector that results from mapping another random vector whose probability density function is known. The goal of this dissertation is the computation of the first probability density function of the solution stochastic process in different problems based on random difference or differential equations. The interest in determining the first probability density function is justified because this deterministic function characterizes the one-dimensional probabilistic information, such as the mean, variance, asymmetry and kurtosis, of the solution of the corresponding random difference or differential equation. It also allows one to determine the probability of any event of interest that involves the solution. In addition, in some cases the theoretical study is completed by showing its application to modelling problems with real data, where the problem of estimating parametric statistical distributions of the inputs is addressed in the context of random difference and differential equations. / Navarro Quiles, A. (2018). COMPUTATIONAL METHODS FOR RANDOM DIFFERENTIAL EQUATIONS: THEORY AND APPLICATIONS [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/98703
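The Random Variable Transformation method at the heart of this thesis can be illustrated symbolically: for a monotone map Y = g(X), the density of Y is f_X(g^{-1}(y)) |d g^{-1}/dy|. The exponential input and the map Y = exp(-X) below are an assumed toy case (the value at t = 1 of y' = -X y, y(0) = 1 with random rate X), not one of the thesis's problems.

```python
import sympy as sp

x, y, lmbda = sp.symbols("x y lambda", positive=True)

f_X = lmbda * sp.exp(-lmbda * x)   # density of X ~ Exponential(lambda)
g_inv = -sp.log(y)                 # inverse map x = g^{-1}(y) for Y = exp(-X)
jac = sp.Abs(sp.diff(g_inv, y))    # |d g^{-1}(y) / dy|

# RVT formula: f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}/dy|
f_Y = sp.simplify(f_X.subs(x, g_inv) * jac)
print(f_Y)  # equals lambda * y**(lambda - 1) on (0, 1), a Beta(lambda, 1) density

# Sanity check: the transformed density integrates to one over (0, 1)
total = sp.integrate(f_Y, (y, 0, 1))
print(total)
```

The same substitution-plus-Jacobian computation, applied to the closed-form solution of a random difference or differential equation, is what yields the first probability density function of the solution process.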
48

Computational methods for random differential equations: probability density function and estimation of the parameters

Calatayud Gregori, Julia 05 March 2020
[EN] Mathematical models based on deterministic differential equations do not take into account the inherent uncertainty of the physical phenomenon (in a wide sense) under study. In addition, inaccuracies in the collected data often arise due to errors in the measurements. It thus becomes necessary to treat the input parameters of the model as random quantities, in the form of random variables or stochastic processes. This gives rise to the study of random ordinary and partial differential equations. The computation of the probability density function of the stochastic solution is important for uncertainty quantification of the model output. Although such computation is a difficult objective in general, certain stochastic expansions for the model coefficients allow faithful representations for the stochastic solution, which permits approximating its density function. In this regard, Karhunen-Loève and generalized polynomial chaos expansions become powerful tools for the density approximation. Also, methods based on discretizations from finite difference numerical schemes permit approximating the stochastic solution, therefore its probability density function. The main part of this dissertation aims at approximating the probability density function of important mathematical models with uncertainties in their formulation. Specifically, in this thesis we study, in the stochastic sense, the following models that arise in different scientific areas: in Physics, the model for the damped pendulum; in Biology and Epidemiology, the models for logistic growth and Bertalanffy, as well as epidemiological models; and in Thermodynamics, the heat partial differential equation. We rely on Karhunen-Loève and generalized polynomial chaos expansions and on finite difference schemes for the density approximation of the solution. These techniques are only applicable when we have a forward model in which the input parameters have certain probability distributions already set. 
When the model coefficients are estimated from collected data, we have an inverse problem. The Bayesian inference approach allows estimating the probability distribution of the model parameters from their prior probability distribution and the likelihood of the data. Uncertainty quantification for the model output is then carried out using the posterior predictive distribution. In this regard, the last part of the thesis shows the estimation of the distributions of the model parameters from experimental data on bacteria growth. To do so, a hybrid method that combines Bayesian parameter estimation and generalized polynomial chaos expansions is used. / This work has been supported by the Spanish Ministerio de Economía y Competitividad grant MTM2017–89664–P. / Calatayud Gregori, J. (2020). Computational methods for random differential equations: probability density function and estimation of the parameters [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/138396 / TESIS / Premios Extraordinarios de tesis doctorales
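The inverse problem described in the abstract above can be illustrated with a minimal grid-based Bayesian sketch: estimating the growth rate of a logistic model from noisy observations. All numerical values (carrying capacity, observation times, noise level, prior range) are illustrative assumptions, not values from the thesis, and a grid posterior stands in for the thesis's hybrid Bayesian/polynomial-chaos method.

```python
import numpy as np

# Hedged sketch: grid-based Bayesian estimation of a logistic growth rate r
# from noisy observations. All parameter values below are assumptions made
# for illustration only.
rng = np.random.default_rng(0)

K, x0 = 10.0, 0.5            # assumed known carrying capacity and initial value
t_obs = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
r_true, sigma = 1.2, 0.2     # ground truth used only to simulate data

def logistic(r, t):
    """Closed-form solution of x'(t) = r x (1 - x/K) with x(0) = x0."""
    e = np.exp(r * t)
    return K * x0 * e / (K + x0 * (e - 1.0))

data = logistic(r_true, t_obs) + rng.normal(0.0, sigma, t_obs.size)

# Uniform prior on a grid of candidate rates; Gaussian likelihood for the data
r_grid = np.linspace(0.0, 3.0, 601)
pred = logistic(r_grid[:, None], t_obs[None, :])       # predictions per candidate r
log_like = -0.5 * np.sum(((data - pred) / sigma) ** 2, axis=1)

post = np.exp(log_like - log_like.max())               # unnormalised posterior
dr = r_grid[1] - r_grid[0]
post /= post.sum() * dr                                # normalise on the grid

r_map = r_grid[np.argmax(post)]                        # posterior mode
```

The posterior mode should recover the simulated rate closely; in the full thesis setting the point estimate would be replaced by the posterior predictive distribution of the model output.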
49

Currents Induced on Wired I.T. Networks by Randomly Distributed Mobile Phones - A Computational Study

Excell, Peter S., Abd-Alhameed, Raed, Vaul, John A. January 2006 (has links)
No / The probability density and exceedance probability functions of the induced currents in a screened cable connecting two enclosures, resulting from the close presence of single and multiple mobile phones working at 900 MHz, are investigated. The analysis of the problem is undertaken using the Method of Moments, but due to weak coupling, the impedance matrix was modified to reduce the memory and time requirements of the problem, enabling it to be executed multiple times. The empirical probability density functions (PDFs) and exceedance probabilities for the induced currents are presented. The form of the PDFs is seen to be quite well approximated by a log-normal distribution for a single source and by a Weibull distribution for multiple sources.
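The empirical exceedance probability used in this kind of study is simply the fraction of Monte Carlo samples above a threshold. A minimal sketch, with log-normal samples standing in for the single-source case and Weibull samples for the multi-source case; the distribution parameters and the current threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hedged sketch: empirical exceedance probability P(I > i0) from simulated
# induced-current samples. Parameters below are assumptions for illustration.
rng = np.random.default_rng(1)

# Single-source case: log-normally distributed induced current (arbitrary units)
single = rng.lognormal(mean=0.0, sigma=0.5, size=50_000)

# Multi-source case: Weibull-distributed induced current (assumed shape/scale)
multi = 1.5 * rng.weibull(a=2.0, size=50_000)

def exceedance(samples, threshold):
    """Empirical probability that the induced current exceeds `threshold`."""
    return float(np.mean(samples > threshold))

p_single = exceedance(single, 1.0)
p_multi = exceedance(multi, 1.0)
```

For the log-normal parameters chosen here the exceedance at the median is 0.5 by construction, which gives a quick sanity check on the estimator.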
50

Stochastic distribution tracking control for stochastic non-linear systems via probability density function vectorisation

Liu, Y., Zhang, Qichun, Yue, H. 08 February 2022 (has links)
Yes / This paper presents a new control strategy for stochastic distribution shape tracking in non-Gaussian stochastic non-linear systems. The objective can be summarised as adjusting the probability density function (PDF) of the system output to any given desired distribution. To achieve this objective, the system output PDF has first been formulated analytically, which is time-variant. Then, PDF vectorisation has been implemented to simplify the model description. Using the vector-based representation, system identification and control design have been performed to achieve PDF tracking. In practice, the PDF evolution is difficult to implement in real time; thus a data-driven extension has also been discussed in this paper, where the vector-based model can be obtained using kernel density estimation (KDE) with real-time data. Furthermore, the stability of the presented control design has been analysed, which is validated by a numerical example. As an extension, multi-output stochastic systems have also been discussed for joint PDF tracking using the proposed algorithm, and perspectives for advanced controllers have been discussed. The main contribution of this paper is to propose: (1) a new sampling-based PDF transformation to reduce the modelling complexity, (2) a data-driven approach for online implementation without model pre-training, and (3) a feasible framework to integrate the existing control methods. / This paper is partly supported by National Science Foundation of China under Grants (61603262 and 62073226), Liaoning Province Natural Science Joint Foundation in Key Areas (2019-KF-03-08), Natural Science Foundation of Liaoning Province (20180550418), Liaoning BaiQianWan Talents Program, i5 Intelligent Manufacturing Institute Fund of Shenyang Institute of Technology (i5201701), Central Government Guides Local Science and Technology Development Funds of Liaoning Province (2021JH6/10500137).
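The "PDF vectorisation" idea in the abstract above, combined with KDE, can be sketched as follows: estimate the output PDF from sampled data with a Gaussian kernel estimator and evaluate it on a fixed grid, so the time-varying PDF becomes a finite-dimensional vector. The data, bandwidth rule, and grid below are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Hedged sketch of PDF vectorisation via Gaussian KDE. The synthetic output
# data, Silverman bandwidth, and grid support are assumptions for illustration.
rng = np.random.default_rng(2)

samples = rng.normal(loc=2.0, scale=0.5, size=5_000)  # stand-in output data

grid = np.linspace(0.0, 4.0, 101)                     # fixed support for the PDF vector
h = 1.06 * samples.std() * samples.size ** (-1 / 5)   # Silverman's rule of thumb

# Gaussian KDE evaluated on the grid: each grid point collects kernel mass
# from every sample, giving one entry of the PDF vector.
diffs = (grid[:, None] - samples[None, :]) / h
pdf_vector = np.exp(-0.5 * diffs**2).sum(axis=1) / (samples.size * h * np.sqrt(2 * np.pi))
```

In a control loop, this vector would be recomputed from the latest window of output data and compared entry-wise with the desired target PDF, which is what makes the vector-based representation convenient for tracking.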
