About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Adaptable Design Improvements For Electromagnetic Shock Wave Lithotripters And Techniques For Controlling Cavitation

Smith, Nathan Birchard January 2012 (has links)
In this dissertation the aim was to gain a better mechanistic understanding of how shock wave lithotripsy (SWL) breaks stones, in order to guide design improvements to a modern electromagnetic (EM) shock wave lithotripter. To accomplish this goal, experimental studies were designed to isolate the mechanisms of fragmentation, and models for wave propagation, fragmentation, and stone motion were developed. In the initial study, a representative EM lithotripter was characterized and tested for in vitro stone comminution efficiency at a variety of field positions and doses, using phantom kidney stones of variable hardness and different fluid media to isolate the contribution of cavitation. Parametric analysis of the acoustic field measurements alongside the comminution results revealed a logarithmic correlation between the average peak pressure incident on the stone surface and comminution efficiency. For a given stone type, the correlations converged to an average peak pressure threshold for fragmentation that was independent of the fluid medium in use. The correlation of average peak pressure with efficacy supports the rationale for the acoustic lens modifications, which were pursued to simultaneously enhance beam width and optimize the pulse profile of the lithotripter shock wave (LSW) via in situ pulse superposition, improving stone fragmentation by stress waves and cavitation, respectively. In parallel, a numerical model for wave propagation was used to investigate how critical parameters vary with changes in lens geometry. A new lens design was agreed upon on the basis of high-speed imaging and stone comminution experiments against the original lens at a fixed acoustic energy setting. The results demonstrate that the new lens has improved efficacy away from the focus, where stones may move due to respiration, fragmentation, acoustic radiation forces, or voluntary patient movements.
Using traditional theory of brittle fragmentation and the newfound understanding of the correlation between average peak pressure and stone comminution, the entire set of comminution data for the lens comparison was modeled with a Weibull-style distribution function. This model linked both the average peak pressure and the shock wave dose to efficacy, including their respective threshold parameters, and its coefficients correlated with cavitation activity. The model was then used to predict stone comminution efficiency under mimicked respiratory motions in vitro, and the predictions compared favorably with simulated motion studies using both the new and original lenses. Under a variety of mimicked respiratory motions, the new lens produced statistically higher stone comminution efficiency than the original lens. These results were confirmed in vivo in a swine model, where the new lens produced statistically higher stone comminution after 1,000 and 2,000 shocks. Finally, a mechanistic investigation into the effects of cavitation with the original lens was conducted using an integrated, self-focusing annular ring transducer specially designed for tandem pulse lithotripsy. Cavitation and stone comminution efficiency were progressively enhanced by tandem pulsing as the source energies of both the primary LSW and the trailing pressure pulse increased, suggesting that cavitation and stress waves act synergistically to enhance the efficacy of kidney stone fragmentation. / Dissertation
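A Weibull-style dose-response model of the kind described above can be sketched numerically. The functional form and every parameter value below (the pressure and dose thresholds, scale, and shape) are illustrative assumptions, not the fitted values from the dissertation:

```python
import math

def comminution_efficiency(p_avg, dose, p_th=8.0, d_th=150.0,
                           alpha=2500.0, beta=1.2):
    """Illustrative Weibull-style dose-response curve for stone comminution.

    p_avg : average peak pressure incident on the stone (MPa)
    dose  : number of shock waves delivered
    p_th, d_th : hypothetical pressure and dose thresholds below which
                 no fragmentation occurs
    alpha, beta : hypothetical scale and shape parameters
    Returns the predicted comminuted mass fraction in [0, 1].
    """
    if p_avg <= p_th or dose <= d_th:
        return 0.0
    x = (p_avg - p_th) * (dose - d_th) / alpha
    return 1.0 - math.exp(-x ** beta)
```

The threshold behavior mirrors the abstract's finding that efficacy vanishes below an average peak pressure threshold, while efficiency grows with both pressure and shock wave dose above it.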

Effect of Chlorine Dioxide Gas Treatment on Bacterial Inactivation Inoculated on Spinach Leaves and on Pigment Content

Yang, Wenbo, Ms. 19 May 2015 (has links)
No description available.

Metanálise caso a caso sob a perspectiva bayesiana / Meta-analysis case by case using Bayesian approach

Martins, Camila Bertini 29 November 2013 (has links)
The role of meta-analysis, which uses statistics to summarize published studies with a common objective, grows more essential every day with the advance of science and the desire to enroll the smallest possible number of human subjects in clinical trials, which are in many cases unnecessary. Synthesizing the available information eases understanding and supports robust conclusions. The growth in clinical studies, for example, increases the demand for meta-analyses, and hence for more sophisticated techniques. The aim of this work was therefore to propose a Bayesian methodology for conducting meta-analyses. The proposed procedure mixes the posterior distributions of the parameter of interest from each study in the meta-analysis; that is, the proposed meta-analytic measure is a probability distribution rather than a simple summary statistic. The methodology can be used with any prior distribution and any likelihood function, and the meta-analytic measure can be computed for problems ranging from simple to sophisticated. Examples involving different probability distributions and survival data are presented. When a sufficient statistic is available for the parameter of interest, the posterior distribution depends on the data only through that statistic, so in many cases the dimension of the problem is reduced without loss of information. Some computations used the Metropolis-Hastings simulation method. The statistical software used in this work was R.
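The mixture-of-posteriors idea can be illustrated with a toy conjugate example. The study counts, the Beta(1, 1) priors, and the equal study weights below are all assumptions made for illustration; the thesis's measure applies to any prior and any likelihood:

```python
import random

# Hypothetical per-study data: (events, trials); Beta(1, 1) prior per study.
studies = [(12, 40), (30, 95), (7, 25)]

# Each study's posterior for the event probability is Beta(1 + s, 1 + n - s).
posteriors = [(1 + s, 1 + n - s) for s, n in studies]

def sample_meta_measure(rng, k=20000):
    """Draw from the meta-analytic measure: an equal-weight mixture of the
    study posteriors (pick a study at random, then draw from its posterior)."""
    return [rng.betavariate(*rng.choice(posteriors)) for _ in range(k)]

rng = random.Random(42)
draws = sample_meta_measure(rng)
meta_mean = sum(draws) / len(draws)
```

Because the measure is a full distribution, any summary (mean, quantiles, credible intervals) can be read off the draws rather than being the end product itself.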

High Hydrostatic Pressure Induced Inactivation Kinetics of E. coli O157:H7 and S. aureus in Carrot Juice and Analysis of Cell Volume Change

Pilavtepe, Mutlu 01 December 2007 (has links) (PDF)
The main objective of this study was to determine the pressure-induced inactivation mechanism of pressure-resistant Escherichia coli O157:H7 933 and Staphylococcus aureus 485 in a low-acid food. First, inactivation curves of the pathogens were obtained at 200 to 400 MPa and 40 °C in peptone water and carrot juice. First-order and Weibull models were fitted; the Weibull model described the inactivation curves of both pathogens more accurately than the first-order model, revealing that food systems can exert either a protective or a sensitizing effect on microorganisms. Carrot juice had a protective effect on E. coli O157:H7 but a sensitizing effect on S. aureus, attributable to naturally occurring constituents or phytoalexins in carrot roots that can be toxic. Second, scanning electron microscopy (SEM) and fluorescence microscopy images of the studied pathogens were taken. Purpose-built software was used to analyze the SEM images and calculate the change in the view area and volume of cells, and membrane integrity of pressurized cells was examined from the fluorescence microscopy images. The increase in the average view area and volume of both pathogens was significant at the highest pressure levels studied. This increase can be explained by modification of membrane properties, i.e., disruption or increased permeability, loss of membrane integrity, denaturation of membrane-bound proteins, and a pressure-induced phase transition of the membrane lipid bilayer. The change in volume and view area of the microorganisms adds another dimension to the understanding of the inactivation mechanisms of microbial cells under high hydrostatic pressure (HHP).
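The two survival-curve models compared in this study can be sketched on the log10 scale. The Mafart-style parameterization of the Weibull model and all parameter values are assumptions for illustration; on synthetic concave data the Weibull model fits where a straight first-order line cannot:

```python
def log_first_order(t, D):
    # first-order kinetics: straight survival line, D = decimal reduction time
    return -t / D

def log_weibull(t, delta, p):
    # Weibull (Mafart-style) model: concave (p < 1) or convex (p > 1) curves
    return -((t / delta) ** p)

# Synthetic concave survival data generated from the Weibull model (p < 1).
times = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
obs = [log_weibull(t, delta=1.5, p=0.6) for t in times]

def sse(pred):
    # sum of squared errors against the observed log10 survivor counts
    return sum((o - q) ** 2 for o, q in zip(obs, pred))

# Best first-order line through the origin (least-squares slope).
slope = sum(t * o for t, o in zip(times, obs)) / sum(t * t for t in times)
sse_first = sse([slope * t for t in times])
sse_weib = sse([log_weibull(t, 1.5, 0.6) for t in times])
```

The lack-of-fit gap between `sse_first` and `sse_weib` is the kind of evidence behind the study's conclusion that the Weibull model describes the curves more accurately.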

Técnicas não-paramétricas e paramétricas usadas na análise de sobrevivência de Chrysoperla externa (Neuroptera: Chrysopidae) / Non-Parametric and Parametric Techniques used in the survival analysis of Chrysoperla externa (Neuroptera: Chrysopidae)

Miranda, Marconi Silva 13 March 2012 (has links)
In survival analysis, the response variable is the time until an event of interest occurs, called the failure time. Another characteristic of survival analysis is that it incorporates incomplete sample data, for which the occurrence of the event was not observed for some reason; such data are said to be censored. The objective of this work was to compare parametric and non-parametric techniques for estimating the survival time of C. externa (Neuroptera: Chrysopidae), a predatory insect that feeds on other insects and on mites, under the effect of three commercial neem-based products: Neempro (10 g of azadirachtin L-1), Organic neem (3.3 g of azadirachtin L-1) and Natuneem (1.5 g of azadirachtin L-1). To this end, survival functions for the different concentrations of each product were estimated with the non-parametric Kaplan-Meier method and compared by the logrank test, and also with parametric techniques using the exponential, Weibull and log-normal models. A further study was conducted to select the most parsimonious model, using the likelihood ratio test (LRT) and the Akaike information criterion (AIC). The estimates of the selected parametric model were used to determine the survival functions at the concentrations of the three products, for comparison with the non-parametric Kaplan-Meier estimator. Once the best model was defined, the median survival time of C. externa was calculated at the tested concentrations of the products. Under the conditions described in this experiment, it can be concluded that the concentrations of the neem-based products influence the survival of C. externa: the higher the concentration, the shorter the survival time, and among the products evaluated Neempro proved the least lethal to the natural predator.
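The non-parametric estimator used in this work is the Kaplan-Meier product-limit estimator, which handles right-censored observations directly. A minimal sketch (the data below are made up for illustration):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function.

    times  : observed times (failure or censoring)
    events : 1 if the failure was observed, 0 if right-censored
    Returns a list of (t, S(t)) steps at each distinct failure time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t, deaths, leaving = data[i][0], 0, 0
        # group all subjects sharing this time; count observed failures
        while i < len(data) and data[i][0] == t:
            leaving += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
        at_risk -= leaving
    return curve
```

Censored subjects only shrink the risk set; they never multiply a factor into the survival estimate, which is exactly why the estimator can use incomplete observations.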

Mapas da transmutação : modelagem, propriedades estruturais, estimação e aplicações / Transmutation maps : modeling, structural properties, estimation and applications

Granzotto, Daniele Cristina Tita 05 December 2016 (has links)
Initially, we use quadratic transmutation maps to compose a new probability model: the transmuted log-logistic distribution. Transmutation maps are a convenient way of constructing new distributions, in particular survival ones: they comprise the functional composition of the cumulative distribution function of one distribution with the inverse cumulative distribution (quantile) function of another. A comprehensive description of properties such as moments, quantiles and order statistics, along with a survival study and classical and Bayesian estimation methods, is also part of this work. Focusing on survival analysis, the study includes two practical situations commonly found: the presence of regression variables, through the transmuted log-logistic regression model, and the presence of right censoring. In a second stage, searching for a model more flexible than the transmuted one, we present its generalization, the cubic-rank transmuted distributions. Using the methodology presented in this first generalization, two models were considered to compose the new cubic transmuted distributions: the log-logistic and Weibull models.
Faced with problems presented by the transmuted classes of quadratic and cubic order (such as the restricted parametric space of the transmutation parameter), we propose a new family of distributions. This family, which we call e-transmuted or e-extended, is as simple as the transmuted model, since it adds a single parameter to the base model, yet more flexible than the class of transmuted models, which is a particular case of the proposed family. In addition, the new family presents important properties such as orthogonality between the baseline model parameters and the e-transmutation parameter, along with an unrestricted parametric space for the e-transmutation parameter, which is defined on the whole real line. Simulation studies and real-data applications were performed for all proposed models and generalizations.
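The quadratic-rank transmutation map referred to above has the standard form G(x) = (1 + λ)F(x) − λF(x)², with |λ| ≤ 1. A minimal sketch applied to a log-logistic baseline, with illustrative parameter values:

```python
def loglogistic_cdf(x, alpha=1.0, beta=2.0):
    # baseline log-logistic CDF; alpha and beta are illustrative values
    return 0.0 if x <= 0 else 1.0 / (1.0 + (x / alpha) ** (-beta))

def transmuted_cdf(x, lam, base=loglogistic_cdf):
    """Quadratic-rank transmuted CDF:
    G(x) = (1 + lam) * F(x) - lam * F(x)**2,
    with the restricted parametric space -1 <= lam <= 1 noted in the abstract."""
    assert -1.0 <= lam <= 1.0, "transmutation parameter is restricted"
    f = base(x)
    return (1.0 + lam) * f - lam * f * f
```

Setting `lam = 0` recovers the baseline distribution exactly, which is the sense in which the transmuted family nests its base model, and the hard `|lam| <= 1` restriction is the limitation the proposed e-transmuted family removes.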

Estimação e teste de hipótese baseados em verossimilhanças perfiladas / Point estimation and hypothesis test based on profile likelihoods

Silva, Michel Ferreira da 20 May 2005 (has links)
The profile likelihood function is not a genuine likelihood function, and profile maximum likelihood estimators are typically inefficient and inconsistent. Additionally, the null distribution of the likelihood ratio test statistic can be poorly approximated by the asymptotic chi-squared distribution in finite samples when there are nuisance parameters. It is thus important to obtain adjustments to the profile likelihood. Several authors, including Barndorff-Nielsen (1983, 1994), Cox and Reid (1987, 1992), McCullagh and Tibshirani (1990) and Stern (1997), have proposed modifications to the profile likelihood function; these adjustments incorporate a term into the profile likelihood prior to estimation and have the effect of reducing the biases of the score and information functions. This work reviews these adjustments and the approximations to the Barndorff-Nielsen (1983, 1994) adjustment described in Severini (2000a), presenting their derivations and properties. To illustrate their application, the adjustments are derived in the context of the two-parameter exponential family. Monte Carlo simulation results are presented to assess the performance of the maximum likelihood estimators and likelihood ratio tests based on these functions. Applications are also presented for models outside the two-parameter exponential family: the GA0(alpha, gamma, L) family of distributions, used to model radar image data, and the Weibull model, widely used in reliability engineering, considering both complete and censored data; numerical results analogous to the simulations for the two-parameter exponential family were obtained to assess the quality of the adjustments.
It is noteworthy that, for the GA0(alpha, gamma, L) family, the approximation of the distribution of the signed likelihood ratio statistic by the standard normal distribution was evaluated. Additionally, for the Weibull model, distributional results concerning the maximum likelihood estimators and the likelihood ratio statistics for complete and censored data were derived and are presented in an appendix.
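The basic profiling operation that these adjustments correct can be sketched for a normal sample, where the nuisance variance is maximized out for each value of the mean. The data values below are made up for illustration:

```python
import math

def profile_loglik_mean(x, mu):
    """Profile log-likelihood for the mean of a normal sample: the nuisance
    variance is replaced by its conditional MLE, mean((x - mu)**2)."""
    n = len(x)
    s2 = sum((xi - mu) ** 2 for xi in x) / n
    return -0.5 * n * (math.log(2 * math.pi * s2) + 1.0)

x = [4.1, 5.3, 4.8, 5.9, 5.0, 4.4]
xbar = sum(x) / len(x)

# The profile likelihood peaks at the sample mean; but treating it as a
# genuine likelihood also yields the biased variance MLE (divisor n, not
# n - 1), the kind of defect the adjustments are designed to reduce.
grid = [xbar - 0.5, xbar, xbar + 0.5]
best = max(grid, key=lambda m: profile_loglik_mean(x, m))
```

In this conjugate-free toy case the defect is mild; with many nuisance parameters (the Neyman-Scott setting) the uncorrected profile likelihood can be badly inconsistent, which motivates the adjusted versions reviewed in the dissertation.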

Modelos de riscos aplicados à análise de sobrevivência / Hazard models on survival analysis

Perdona, Gleici da Silva Castro 25 August 2006 (has links)
Making special assumptions about the hazard function has been the strategy adopted by several authors to obtain general and comprehensive models for the analysis of both survival and reliability data. In this study, models applied to survival and reliability data are considered. The purpose is to propose more flexible and/or more comprehensive models that generalize existing ones, to study their properties, and to propose comparisons between models via hypothesis tests. Three classes of models whose structure is based on the hazard function (hazard models) are considered in this thesis. The first class is a particular case of the extended hazard model (Louzada-Neto, 1999) and comprises models that relate the scale parameter to covariates, through a relationship that may be log-linear or log-non-linear; a particular model in which the scale parameter depends on the covariates log-non-linearly is considered. The second class comprises models for competing-risks data, with or without information about which type of risk was responsible for the failure of a piece of equipment or the death of a patient. The third class, proposed in this thesis, concerns long-term survival models. All the procedures are illustrated on real data sets.
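A hazard model with a covariate-dependent scale parameter can be sketched as follows. The Weibull baseline and the particular log-non-linear (quadratic) link, along with all coefficient values, are assumptions for illustration, not the thesis's fitted model:

```python
import math

def weibull_hazard(t, scale, shape):
    # Weibull hazard: h(t) = (shape / scale) * (t / scale) ** (shape - 1)
    return (shape / scale) * (t / scale) ** (shape - 1)

def scale_from_covariate(z, b0=0.2, b1=0.8, b2=-0.3):
    # hypothetical log-non-linear link: log(scale) = b0 + b1*z + b2*z**2
    return math.exp(b0 + b1 * z + b2 * z * z)
```

With shape = 1 the hazard is constant (the exponential special case); shape > 1 gives an increasing hazard, and the covariate z shifts the whole curve through the scale parameter.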

Modeling Microbial Inactivation Subjected to Nonisothermal and Non-thermal Food Processing Technologies

Gabriella Mendes Candido De Oliveira (7451486) 17 October 2019 (has links)
Modeling microbial inactivation has a great influence on the optimization, control and design of food processes. In the area of food safety, modeling is a valuable tool for characterizing survival curves and for supporting food safety decisions. The modeling of microbial behavior rests on the premise that the response of a microbial population to environmental factors is reproducible, and that from past observations it is possible to predict how these microorganisms will respond in other, similar environments. The use of mathematical models has therefore become an attractive and relevant tool in the food industry.
This research provides tools to relate the inactivation of microorganisms of public-health importance to the processing conditions used in nonisothermal and non-thermal food processing technologies. Current models employ simple approaches that do not capture the realistic behavior of microbial inactivation. This oversight raises a number of fundamental and practical issues, such as excessive or insufficient processing, which can result in quality problems (when foods are over-processed) or safety problems (when foods are under-processed). Given these issues, there is an urgent need for reliable models that accurately describe the inactivation of dangerous microbial cells under more realistic processing conditions and that account for variability in the microbial population, for instance in its resistance to lethal agents. To address this need, this dissertation combined mathematical tools with microbiological science to develop models that, by resembling realistic and practical processing conditions, provide a better estimate of the efficacy of food processes; the objective of the approach is to relate the processing conditions to microbial inactivation. The development of the modeling approach went through all phases of a modeling cycle: planning, data collection, formulation of the model according to the data analysis, and validation under conditions different from those used to develop it.
A non-linear ordinary differential equation was used to describe the inactivation curves, under the hypothesis that the momentary inactivation rate is not constant but depends on the instantaneous processing conditions. The inactivation rate was related to key process parameters to describe the inactivation kinetics under more realistic processing conditions. From the solution of the differential equation and the optimization algorithm, safety inferences about the microbial response can be retrieved, such as the critical value of the lethal variable beyond which microbial inactivation increases. For nonisothermal processes such as microwave heating, time-temperature profiles were modeled and incorporated into the inactivation rate equation, and the critical temperature required to increase microbial inactivation was obtained from the optimization analysis. For non-thermal processes such as cold plasma, the time-varying concentration of reactive gas species was incorporated into the inactivation rate equation, allowing estimation of the critical gas concentration above which microbial inactivation becomes effective. For pulsed electric fields (PEF), the energy density is the integral parameter that groups the wide range of PEF process parameters, such as the electric field strength, the treatment time and the electrical conductivity of the sample; the literature has shown that all of these parameters affect microbial inactivation. It was hypothesized that the inactivation rate is a function of the energy density and that significant microbial inactivation begins above a threshold value.
The differential equation was solved numerically using the Runge-Kutta method (ode45 in MATLAB®), and the lsqcurvefit function in MATLAB® estimated the kinetic parameters. The modeling approach, whether for samples subjected to nonisothermal or to non-thermal food processes, was validated using data published in the literature and/or other samples and treatment conditions. The modeling approaches developed in this dissertation are expected to assist the food industry in the development and validation work required to achieve the level of microbial reduction demanded by regulatory agencies, and to support food-safety decision-making, such as identifying the minimal critical parameter that may increase microbial inactivation. Finally, this dissertation contributes in depth to the field of food safety and engineering, with the ultimate outcome of a broad and highly positive impact on human health by helping to ensure the consumption of safe food products.
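The structure of this approach (a momentary inactivation rate driven by a time-varying lethal variable, integrated with a Runge-Kutta scheme) can be sketched in Python as a stand-in for the MATLAB workflow. The temperature profile, the critical temperature of 50 °C, and the rate constant are all hypothetical values chosen for illustration:

```python
import math

def temp_profile(t):
    # hypothetical nonisothermal come-up curve (deg C), microwave-style heating
    return 25.0 + 45.0 * (1.0 - math.exp(-t / 2.0))

def inact_rate(T, T_crit=50.0, k=0.08):
    # assumed momentary rate: zero below a critical temperature, then
    # increasing linearly with the instantaneous temperature
    return k * max(0.0, T - T_crit)

def log10_survivors(t_end, dt=0.01):
    """Integrate d(ln N/N0)/dt = -rate(T(t)) with a classical 4th-order
    Runge-Kutta step (a stand-in for MATLAB's ode45); returns log10(N/N0)."""
    f = lambda t: -inact_rate(temp_profile(t))
    y, t = 0.0, 0.0
    while t < t_end:
        h = min(dt, t_end - t)
        k1, k2 = f(t), f(t + h / 2.0)
        k3, k4 = k2, f(t + h)   # rate is independent of y, so k3 == k2
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        t += h
    return y / math.log(10.0)
```

With these assumed parameters no inactivation occurs until the profile crosses the critical temperature, which is precisely the kind of critical-lethal-variable inference the dissertation retrieves from its optimization analysis.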
