51

Desenvolvimento e validação de metodologia para determinação de metais em amostras de água por espectrometria de emissão óptica com plasma de argônio (ICP-OES) / Development and validation of a methodology for the determination of metals in water samples by argon plasma optical emission spectrometry (ICP-OES)

FAUSTINO, MAINARA G. 08 April 2016 (has links)
To comply with the environmental legislation of the Conselho Nacional do Meio Ambiente (CONAMA), Resolution CONAMA 357/2005, it is necessary to develop methodologies that perform analytical measurements correctly, and quality control requires method validation. To meet these legal requirements, the present work developed a methodology for the determination of 12 metallic elements (Al, Ba, Ca, Cd, Cr, Cu, Fe, K, Mg, Mn, Na and Ni) in water, evaluating 14 sampling points distributed across the Guarapiranga Reservoir in the State of São Paulo: GU000-01 (23°46'49.6"S-46°47'22.0"W), GU000-02 (23°45'29.5"S-46°46'18.7"W), GU000-03 (23°44'52.2"S-46°46'13.6"W), GU106-04 (23°44'44.6"S-46°45'25.8"W), GU000-05 (23°44'57.5"S-46°45'24.2"W), GU107-06 (23°45'01.2"S-46°43'61.5"W), GU108-07 (23°43'64.7"S-46°43'42.3"W), GU000-08 (23°42'96.9"S-46°43'61.2"W), GU109-09 (23°43'04.6"S-46°43'34.0"W), GU105-10 (23°42'89.9"S-46°44'68.7"W), GU108-11 (23°42'53.4"S-46°43'44.9"W), GU103-12 (23°41'88.5"S-46°44'67.3"W), GU102-13 (23°41'58.0"S-46°43'57.3"W) and GU000-14 (23°40'78.2"S-46°43'55.0"W). The methodology was validated on the basis of the guide of the Instituto Nacional de Metrologia, Qualidade e Tecnologia (INMETRO), Orientação sobre Validação de Métodos Analíticos (DOQ-CGCRE-008). The following parameters were evaluated: selectivity, working range/linearity, limits of detection and quantification, trueness/recovery, precision, robustness and measurement uncertainty. The argon plasma optical emission spectrometry (ICP-OES) technique was used. The selectivity test showed that the matrix does not interfere with the analytical calibration curves; the working range was linear for samples with and without the matrix of interest, with correlation coefficients (r) between 0.9965 and 1.0; the detection and quantification limits of the method meet the maximum values allowed by Resolution CONAMA 357/2005; and the repeatability and recovery tests showed the method to be precise, accurate and robust. The measurement uncertainty of the method was then estimated; the expanded uncertainty ranged from 3 to 18% of the measured concentration. The validated methodology was applied to assess the distribution of the 12 elements in the waters of the Guarapiranga Reservoir. High values of Ca, Na and K were observed at all sampling points, indicating that these elements are part of the geological character of the area. Fe and Al exceeded the legal limits at reservoir points GU000-01, GU000-02 and GU000-03. With the validation parameter tests and the statistical calculations applied, it was possible to develop and apply a methodology fit for its intended use. / Dissertation (Master's Degree in Nuclear Technology) / IPEN/D / Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
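The linearity and detection/quantification limits reported in this abstract can be illustrated with a short calculation on calibration data. The sketch below uses hypothetical standards and blank readings and the common conventions LOD = 3·s_blank/slope and LOQ = 10·s_blank/slope; these values and conventions are assumptions for illustration, not figures from the dissertation.

```python
import numpy as np

# Hypothetical calibration data for one element (e.g. Fe): concentration (mg/L) vs. emission intensity.
conc = np.array([0.0, 0.1, 0.2, 0.5, 1.0, 2.0])
signal = np.array([2.1, 150.3, 298.7, 745.2, 1490.8, 2985.4])

# Linear least-squares calibration: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Correlation coefficient r of the calibration curve (linearity check).
r = np.corrcoef(conc, signal)[0, 1]

# Ten replicate blank readings (hypothetical) to estimate the blank standard deviation.
blanks = np.array([1.8, 2.3, 2.0, 1.9, 2.2, 2.1, 2.4, 1.7, 2.0, 2.2])
s_blank = blanks.std(ddof=1)

# Common criteria: LOD = 3*s_blank/slope, LOQ = 10*s_blank/slope (concentration units).
lod = 3 * s_blank / slope
loq = 10 * s_blank / slope

print(f"r = {r:.4f}, slope = {slope:.1f}")
print(f"LOD = {lod:.4f} mg/L, LOQ = {loq:.4f} mg/L")
```

The same routine would be run per element and per emission line, and the resulting limits compared against the maxima set by Resolution CONAMA 357/2005.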
52

Influência do enxerto de pele humana irradiada na regeneração tecidual de camundongos nude / Influence of irradiated human skin grafts on tissue regeneration in nude mice

MIRANDA, JURANDIR T. de 11 November 2016 (has links)
In recent decades there has been growing interest in radiosterilized human skin grafts, mainly for application to extensive and deep burns, because these grafts show rapid adherence and lower antigenic potential compared with the other treatments in use. The aim of this study was to evaluate the histoarchitecture of human skin grafts, irradiated at doses of 25 kGy and 50 kGy or non-irradiated, during the tissue repair process in nude mice grafted on the dorsal region. Three groups of animals received irradiated (25 kGy and 50 kGy) or non-irradiated human skin grafts and were euthanized on the 3rd, 7th and 21st day after surgery. After routine histological processing, tissue samples were stained with hematoxylin and eosin (HE) to quantify keratinocytes, fibroblasts, defense cells and blood vessels, and immunofluorescence (IF) was performed to determine the expression of human type I collagen and of mouse type I and type III collagen. Quantification of both cells and collagen types was carried out by image analysis using the Image-Pro Plus 6.0 software. The histological results showed that irradiated human skin, when grafted, increases the number of cells at the healing site over time, particularly at the 25 kGy dose, and also promotes better dispersion of these cells. On the 21st day, all three grafted groups had part of the graft incorporated into the healing process. The non-irradiated group showed the greatest graft incorporation (43%) but the lowest production of mouse type III collagen (22%), whereas the irradiated-graft groups showed less incorporation (6% and 15%) but greater production of mouse type III collagen (35% and 28% for 25 kGy and 50 kGy, respectively). The study concludes that the 25 kGy group shows greater cell proliferation and vessel formation, as well as better remodeling of the healing region. / Dissertation (Master's Degree in Nuclear Technology) / IPEN/D / Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
53

Analýza vybrané firmy pomocí statistických metod / Analysis of a Selected Company Using Statistical Methods

Poláček, Lukáš January 2012 (has links)
The master's thesis provides an aggregate review of the economic situation of a joint-stock company, primarily using statistical methods. The goal is to analyse and compare the data by these means and to draw conclusions and suggestions for improvement. From its historical data and from forecasts of future conditions, the company will gain a clearer picture of its development and future direction.
54

Analýza ekonomických ukazatelů pomocí statistických metod / Analysis of Economic Indicators Using Statistical Methods

Barva, David January 2013 (has links)
The master's thesis evaluates the economic situation of a company using statistical methods. It is based on the company's financial statements, which are first analysed economically and then subjected to statistical analysis. Statistical methods applied to the historical data are used to estimate future trends, and solutions are proposed that could lead to the company's financial stability and sustainable management.
55

Analýza vybrané firmy pomocí časových řad / Analysis of a Selected Company Using Time Series

Poláček, Lukáš January 2013 (has links)
The thesis analyses the economic indicators of the joint-stock company Jihomoravská plynárenská using statistical methods. The goal is to evaluate the economic indicators and to draw conclusions and suggestions for improvement. The proposal part of the thesis deals with pricing issues and the search for new opportunities. The thesis also contains the theoretical background needed to fully understand the analytical part, the analysis itself, and recommendations which, once put into practice, will contribute to improving the present condition.
56

Uplatnění statistických metod při zpracování dat / The Use of Statistical Methods for Data Processing

Matuškovič, Marián January 2015 (has links)
This master's thesis focuses on the application of statistical methods to data processing. The first part describes the theoretical foundations underlying the practical part, including the theory of time series methods and regression analysis. The next part presents the statistical and financial analysis together with the design of an application that automates the use of regression analysis to predict the future development of the company's economic situation.
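As a rough illustration of the regression-based forecasting described in abstracts 54 to 56, the sketch below fits a first-order trend to hypothetical yearly values of an economic indicator and extrapolates one period ahead; the data, the indicator and the linear-trend choice are assumptions for illustration, not material from the theses.

```python
import numpy as np

# Hypothetical yearly values of an economic indicator (e.g. return on assets, in %).
years = np.array([2010, 2011, 2012, 2013, 2014])
values = np.array([4.2, 4.8, 5.1, 5.9, 6.3])

# First-order regression trend: value = b1 * year + b0 (ordinary least squares).
b1, b0 = np.polyfit(years, values, 1)

# Point forecast for the next period by extrapolating the fitted trend.
next_year = years[-1] + 1
forecast = b1 * next_year + b0

# Coefficient of determination as a rough check of how well the trend fits.
residuals = values - (b1 * years + b0)
r2 = 1 - residuals.var() / values.var()

print(f"trend: {b1:.3f} per year, forecast for {next_year}: {forecast:.2f} (R^2 = {r2:.3f})")
```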
57

On Identifying Signatures of Positive Selection in Human Populations: A Dissertation

Crisci, Jessica L. 25 June 2013 (has links)
As sequencing technology continues to produce better quality genomes at decreasing costs, there has been a recent surge in the variety of data that we are now able to analyze. This is particularly true with regards to our understanding of the human genome—where the last decade has seen data advances in primate epigenomics, ancient hominid genomics, and a proliferation of human polymorphism data from multiple populations. In order to utilize such data, however, it has become critical to develop increasingly sophisticated tools spanning both bioinformatics and statistical inference. In population genetics particularly, new statistical approaches for analyzing population data are constantly being developed—unfortunately, often without proper model testing and evaluation of type-I and type-II error. Because the common Wright-Fisher assumptions underlying such models are generally violated in natural populations, this statistical testing is critical. Thus, my dissertation has two distinct but related themes: 1) evaluating methods of statistical inference in population genetics, and 2) utilizing these methods to analyze the evolutionary history of humans and our closest relatives. The resulting collection of work has not only provided important biological insights (including some of the first strong evidence of selection on human-specific epigenetic modifications (Shulha, Crisci, Reshetov, Tushir et al. 2012, PLoS Bio), and a characterization of human-specific genetic changes distinguishing modern humans from Neanderthals (Crisci et al. 2011, GBE)), but also important insights into the performance of population genetic methodologies, which will motivate the future development of improved approaches for statistical inference (Crisci et al., in review).
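Since the abstract highlights that Wright-Fisher assumptions underlie the statistical models being evaluated, a minimal sketch of neutral Wright-Fisher drift is included below; the population size, starting allele frequency and number of generations are arbitrary illustrative choices, not parameters from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)

def wright_fisher(n_individuals, p0, generations):
    """Neutral Wright-Fisher drift for a single biallelic locus.

    Each generation, 2N gene copies are drawn binomially from the current
    allele frequency (random mating, no selection, no mutation).
    """
    freqs = [p0]
    p = p0
    for _ in range(generations):
        copies = rng.binomial(2 * n_individuals, p)  # sampled allele copies
        p = copies / (2 * n_individuals)
        freqs.append(p)
    return np.array(freqs)

# One hypothetical trajectory: N = 500 diploids, starting frequency 0.2, 200 generations.
trajectory = wright_fisher(n_individuals=500, p0=0.2, generations=200)
print(f"final allele frequency: {trajectory[-1]:.3f}")
```

Repeating such simulations with the assumptions deliberately violated (for example, with fluctuating population size) is one way to gauge the type-I and type-II error of inference methods of the kind the dissertation evaluates.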
58

GENERATIVE MODELS WITH MARGINAL CONSTRAINTS

Bingjing Tang (16380291) 16 June 2023 (has links)
Generative models form powerful tools for learning data distributions and simulating new samples. Recent years have seen significant advances in the flexibility and applicability of such models, with Bayesian approaches like nonparametric Bayesian models and deep neural network models such as Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs) finding use in a wide range of domains. However, the black-box nature of these models means that they are often hard to interpret, and they often carry modeling implications that are inconsistent with side knowledge arising from domain expertise. This thesis studies situations where the modeler has side knowledge represented as probability distributions on functionals of the objects being modeled, and it develops methods to incorporate this particular kind of side knowledge into flexible generative models. The dissertation covers three main parts.

The first part focuses on incorporating a special case of the aforementioned side knowledge into flexible nonparametric Bayesian models. Practitioners often have additional distributional information about a subset of the coordinates of the observations being modeled, and the flexibility of nonparametric Bayesian models usually makes them incompatible with this side information. This inconsistency motivates methods for incorporating such side knowledge into flexible nonparametric Bayesian models. We design a specialized generative process to build in this side knowledge and propose a novel sigmoid Gaussian process conditional model. We also develop a corresponding posterior sampling method based on data augmentation to overcome a doubly intractable problem. We illustrate the efficacy of the proposed constrained nonparametric Bayesian model in a variety of real-world scenarios, including modeling environmental and earthquake data.

The second part discusses neural network approaches to satisfying the more general side knowledge; here the generative models considered broaden to black-box models. We formulate the side-knowledge incorporation problem as a constrained divergence minimization problem and propose two scalable neural network approaches as its solution. We demonstrate their practicality using various synthetic and real examples.

The third part concentrates on a specific generative model of individual pixels of fMRI data constructed from a latent group image. Two kinds of side knowledge are usually available about the latent group image: its spatial structure and its partial activation zones. The former can be captured by modeling the prior for the group image with Markov random fields; the latter, often obtained from previous related studies, is left for future research. We propose a novel Bayesian model with Markov random fields and aim to estimate the maximum a posteriori for the group image. We also derive a variational Bayes algorithm to overcome local optima in the optimization.
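To give a concrete, if greatly simplified, picture of the constrained divergence minimization described in the second part, the sketch below adds a moment-matching penalty to a toy generator's data-fit objective so that one coordinate's marginal matches a prescribed mean and variance; the generator, the penalty form and all numerical values are assumptions for illustration and are not the methods proposed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generator": samples z ~ N(0, I) and maps them through an affine transform.
def generate(theta, n):
    W, b = theta
    z = rng.standard_normal((n, 2))
    return z @ W.T + b

# Side knowledge (hypothetical): the first coordinate of the generated objects
# should have marginal mean 1.0 and variance 4.0.
TARGET_MEAN, TARGET_VAR = 1.0, 4.0

def objective(theta, data, lam=10.0):
    """Data-fit term plus a penalty enforcing the marginal constraint."""
    samples = generate(theta, n=2000)
    # Crude data-fit term: squared distance between sample and data moments.
    fit = np.sum((samples.mean(0) - data.mean(0)) ** 2) \
        + np.sum((np.cov(samples.T) - np.cov(data.T)) ** 2)
    # Marginal constraint on coordinate 0 expressed as a moment-matching penalty.
    m, v = samples[:, 0].mean(), samples[:, 0].var()
    penalty = (m - TARGET_MEAN) ** 2 + (v - TARGET_VAR) ** 2
    return fit + lam * penalty

data = rng.multivariate_normal([1.0, 0.0], [[4.0, 0.5], [0.5, 1.0]], size=2000)
theta = (np.eye(2), np.zeros(2))
print(f"objective at the initial parameters: {objective(theta, data):.3f}")
```

In practice such an objective would be minimized over the generator parameters (for example with gradient-based optimization) rather than merely evaluated as here.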
59

Causal Inference in the Face of Assumption Violations

Yuki Ohnishi (18423810) 26 April 2024 (has links)
This dissertation advances the field of causal inference by developing methodologies in the face of assumption violations. Traditional causal inference methodologies hinge on a core set of assumptions, which are often violated in the complex landscape of modern experiments and observational studies. This dissertation proposes novel methodologies designed to address the challenges posed by single or multiple assumption violations. By applying these innovative approaches to real-world datasets, this research uncovers valuable insights that were previously inaccessible with existing methods.

First, three significant sources of complications in causal inference that are increasingly of interest are interference among individuals, nonadherence of individuals to their assigned treatments, and unintended missing outcomes. Interference exists if the outcome of an individual depends not only on its assigned treatment, but also on the assigned treatments for other units. It commonly arises when limited controls are placed on the interactions of individuals with one another during the course of an experiment. Treatment nonadherence frequently occurs in human subject experiments, as it can be unethical to force an individual to take their assigned treatment. Clinical trials, in particular, typically have subjects that do not adhere to their assigned treatments due to adverse side effects or intercurrent events. Missing values also commonly occur in clinical studies. For example, some patients may drop out of the study due to the side effects of the treatment. Failing to account for these considerations will generally yield unstable and biased inferences on treatment effects even in randomized experiments, but existing methodologies lack the ability to address all these challenges simultaneously. We propose a novel Bayesian methodology to fill this gap.

My subsequent research further addresses one of the limitations of the first project: a set of assumptions about interference structures that may be too restrictive in some practical settings. We introduce a concept of the "degree of interference" (DoI), a latent variable capturing the interference structure. This concept allows for handling arbitrary, unknown interference structures to facilitate inference on causal estimands.

While randomized experiments offer a solid foundation for valid causal analysis, people are also interested in conducting causal inference using observational data due to the cost and difficulty of randomized experiments and the wide availability of observational data. Nonetheless, using observational data to infer causality requires us to rely on additional assumptions. A central assumption is that of ignorability, which posits that the treatment is randomly assigned based on the variables (covariates) included in the dataset. While crucial, this assumption is often debatable, especially when treatments are assigned sequentially to optimize future outcomes. For instance, marketers typically adjust subsequent promotions based on responses to earlier ones and speculate on how customers might have reacted to alternative past promotions. This speculative behavior introduces latent confounders, which must be carefully addressed to prevent biased conclusions.

In the third project, we investigate these issues by studying sequences of promotional emails sent by a US retailer. We develop a novel Bayesian approach for causal inference from longitudinal observational data that accommodates noncompliance and latent sequential confounding.

Finally, we formulate the causal inference problem for the privatized data. In the era of digital expansion, the secure handling of sensitive data poses an intricate challenge that significantly influences research, policy-making, and technological innovation. As the collection of sensitive data becomes more widespread across academic, governmental, and corporate sectors, addressing the complex balance between making data accessible and safeguarding private information requires the development of sophisticated methods for analysis and reporting, which must include stringent privacy protections. Currently, the gold standard for maintaining this balance is differential privacy. Local differential privacy is a differential privacy paradigm in which individuals first apply a privacy mechanism to their data (often by adding noise) before transmitting the result to a curator. The noise for privacy results in additional bias and variance in their analyses. Thus, it is of great importance for analysts to incorporate the privacy noise into valid inference.

In this final project, we develop methodologies to infer causal effects from locally privatized data under randomized experiments. We present frequentist and Bayesian approaches and discuss the statistical properties of the estimators, such as consistency and optimality under various privacy scenarios.
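As a concrete illustration of the local-privacy setting in the final part, the sketch below has each individual perturb a bounded outcome with a Laplace mechanism before aggregation and then estimates the average treatment effect from the privatized values in a simulated randomized experiment; because the added noise has mean zero, the plain difference in means stays unbiased but becomes noisier. The mechanism, privacy budget and simulated data are illustrative assumptions, not the estimators developed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(7)

def laplace_mechanism(x, epsilon, sensitivity):
    """Locally privatize a bounded value by adding Laplace(sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    return x + rng.laplace(loc=0.0, scale=scale, size=np.shape(x))

# Simulated randomized experiment: outcomes bounded in [0, 1], true effect 0.2.
n = 5000
treat = rng.binomial(1, 0.5, size=n)
outcome = np.clip(0.4 + 0.2 * treat + rng.normal(0, 0.1, size=n), 0.0, 1.0)

# Each individual privatizes their own outcome before sending it to the curator.
epsilon = 1.0                      # per-individual privacy budget (assumed)
private_outcome = laplace_mechanism(outcome, epsilon, sensitivity=1.0)

# Difference-in-means ATE estimate computed on the privatized data.
ate_hat = private_outcome[treat == 1].mean() - private_outcome[treat == 0].mean()
print(f"estimated ATE from privatized outcomes: {ate_hat:.3f} (truth = 0.200)")
```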
60

ROBUST INFERENCE FOR HETEROGENEOUS TREATMENT EFFECTS WITH APPLICATIONS TO NHANES DATA

Ran Mo (20329047) 10 January 2025 (has links)
Estimating the conditional average treatment effect (CATE) using data from the National Health and Nutrition Examination Survey (NHANES) provides valuable insights into the heterogeneous impacts of health interventions across diverse populations, facilitating public health strategies that consider individual differences in health behaviors and conditions. However, estimating CATE with NHANES data faces challenges often encountered in observational studies, such as outliers, heavy-tailed error distributions, skewed data, model misspecification, and the curse of dimensionality. To address these challenges, this dissertation presents three consecutive studies that thoroughly explore robust methods for estimating heterogeneous treatment effects.

The first study introduces an outlier-resistant estimation method by incorporating M-estimation, replacing the L2 loss in the traditional inverse propensity weighting (IPW) method with a robust loss function. To assess the robustness of our approach, we investigate its influence function and breakdown point. Additionally, we derive the asymptotic properties of the proposed estimator, enabling valid inference for the proposed outlier-resistant estimator of CATE.

The method proposed in the first study relies on a symmetry assumption that is commonly required by standard outlier-resistant methods. To remove this assumption while maintaining unbiasedness, the second study employs the adaptive Huber loss, which dynamically adjusts the robustification parameter based on the sample size to achieve an optimal tradeoff between bias and robustness. The robustification parameter is explicitly derived from theoretical results, making it unnecessary to rely on time-consuming data-driven methods for its selection. We also derive concentration and Berry-Esseen inequalities to precisely quantify the convergence rates as well as finite-sample performance.

In both previous studies, the propensity scores were estimated parametrically, which is sensitive to model misspecification. The third study extends the robust estimator from the first project by plugging in a kernel-based nonparametric estimate of the propensity score with sufficient dimension reduction (SDR). Specifically, we adopt a robust minimum average variance estimation (rMAVE) for the central mean space under the potential outcome framework. Together with higher-order kernels, the resulting CATE estimation gains enhanced efficiency.

In all three studies, theoretical results are derived and confidence intervals are constructed for inference based on these findings. The properties of the proposed estimators are verified through extensive simulations. Additionally, applying these methods to NHANES data validates the estimators' ability to handle diverse and contaminated datasets, further demonstrating their effectiveness in real-world scenarios.
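To illustrate the first study's idea of replacing the L2 loss in inverse propensity weighting with a robust alternative, the sketch below regresses IPW pseudo-outcomes on a covariate under a Huber loss using simulated data with a few gross outliers; the data-generating process, the use of the true propensity score, the linear CATE model and the fixed Huber constant are all assumptions for illustration, not the estimator developed in the dissertation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated observational data: one covariate, heterogeneous effect tau(x) = 1 + x.
n = 2000
x = rng.normal(size=n)
propensity = 1 / (1 + np.exp(-x))             # true propensity score, taken as known here
t = rng.binomial(1, propensity)
y = 0.5 * x + t * (1 + x) + rng.normal(scale=1.0, size=n)
y[rng.random(n) < 0.02] += 25                 # a few gross outliers

# IPW pseudo-outcome: its conditional mean given x equals the CATE.
psi = t * y / propensity - (1 - t) * y / (1 - propensity)

def huber(r, c=1.345):
    """Huber loss: quadratic near zero, linear in the tails (fixed tuning constant)."""
    return np.where(np.abs(r) <= c, 0.5 * r**2, c * (np.abs(r) - 0.5 * c))

X = np.column_stack([np.ones(n), x])          # linear CATE model tau(x) = b0 + b1*x

def objective(beta):
    return huber(psi - X @ beta).sum()

fit = minimize(objective, x0=np.zeros(2), method="BFGS")
print(f"estimated CATE model: tau(x) = {fit.x[0]:.2f} + {fit.x[1]:.2f} * x (truth: 1 + x)")
```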
