About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
341

Estimativa de idade através das linhas incrementais de cemento / Age estimation through incremental lines in cementum

Dias, Paulo Eduardo Miamoto 11 June 2010
Age estimation by counting the incremental lines in cementum (LC) and adding them to the average eruption age of the analyzed tooth is considered an accurate and reliable method by some authors, while others reject it, stating that there is no strong correlation between estimated and actual age. The aim of this study was to evaluate the technique and check the influence of oral pathologies on age estimation, analyzing both the number of LC and the correlation between cementum thickness and actual age. Thirty-one undecalcified ground cross sections of approximately 30 μm, from 25 freshly extracted teeth, were prepared, observed, photographed and measured under light microscopy. The line images were enhanced with the ImageJ 1.43s software, and the counts were made by one observer and two control observers. There was a moderate correlation of 0.58 for the entire sample, with a mean error of 9.7 years. For teeth with periodontal alterations, the correlation was 0.03, with a mean error of 22.6 years. For teeth without periodontal alterations, the correlation was 0.74, with a mean error of 1.6 years. Cementum thickness correlated with actual age at 0.69 for the entire sample, 0.25 for teeth with periodontal problems and 0.75 for teeth without periodontal problems. The LC technique combined with the measurement of cementum thickness proved reliable for teeth without periodontal pathologies; for teeth with periodontal pathologies or an unknown clinical history, parallel macroscopic examinations are recommended for comparison.
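The arithmetic behind the technique is simple: estimated age = line count + average eruption age of the tooth, then validated against actual age via correlation and mean error. A minimal sketch (all numbers below are hypothetical, not the study's data):

```python
import numpy as np

eruption_age = 10.5  # assumed average eruption age (years) for the tooth type
line_counts = np.array([22, 30, 41, 27])        # cementum lines per specimen (hypothetical)
actual_ages = np.array([34.0, 39.5, 52.0, 36.5])

estimated_ages = eruption_age + line_counts     # age estimate per specimen
mean_error = np.mean(np.abs(estimated_ages - actual_ages))
r = np.corrcoef(estimated_ages, actual_ages)[0, 1]  # Pearson correlation
print(f"mean error: {mean_error:.1f} years, r = {r:.2f}")
```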
342

CuT-REMD : uma nova abordagem para predição de estruturas terciárias de proteínas baseada em raio de corte incremental / CuT-REMD : a novel approach for tertiary protein structure prediction based on incremental cutoff

Paes, Thiago Lipinski 27 March 2017
Among the main computational techniques currently applied to the study of proteins, classical molecular dynamics plays an important role, especially its variant called replica exchange molecular dynamics (REMD), which provides efficient conformational sampling. Regular secondary structure elements of proteins are formed and maintained via stabilization by hydrogen bonds within helices and between the strands of a β-sheet. Packing of these structural elements, allowed by the flexible turns and loops connecting them, leads to the formation of a structure that, in successful cases, represents the native, functional state of a protein. Ionic, dipole, van der Waals and hydrophobic interactions, as well as hydrogen bonding, are fundamental to these events. Most of these forces are strong up to a distance of 4.0 Å; hence, these (from 0.0 to 4.0 Å) are the distances involved in the formation of local structures that can further propagate and form whole elements of secondary structure. Common simulation practice, however, is to keep the cutoff fixed at values greater than or equal to 8.0 Å. This thesis presents CuT-REMD, a novel replica exchange molecular dynamics approach based on a running cutoff (varying from 4.0 Å to 8.0 Å), testing the hypothesis that such an approach can enhance tertiary protein structure prediction. The method was first shown to be reproducible, to follow a Boltzmann distribution and to sample structures different from those of conventional REMD, using the human villin headpiece protein (PDB ID: 1UNC) as a case study with nine different simulation protocols, each run in triplicate. Based on these results, a standard protocol was chosen as the CuT-REMD protocol and applied to a set of nine additional proteins, with results compared against conventional REMD; the incremental cutoff proved an effective approach to improve the quality and speed of protein structure prediction via REMD. Applied to this test set, although of limited size, CuT-REMD performed well against ab initio methods, in most cases either being the best prediction method or coming close to the best ones. This also made it possible to compare CuT-REMD with de novo methods; despite the greater difficulty, CuT-REMD maintained good performance, even surpassing certain servers for all tested proteins. The results are encouraging and raise new questions to be addressed in future work.
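The two ingredients the abstract combines can be sketched schematically: the standard Metropolis criterion for exchanging REMD replicas, plus a nonbonded cutoff that grows over the course of the simulation. The linear schedule and epoch count below are illustrative assumptions, not the thesis protocol:

```python
import numpy as np

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def swap_accepted(E_i, E_j, T_i, T_j, rng):
    """Metropolis criterion for swapping configurations between two
    replicas with potential energies E_i, E_j at temperatures T_i, T_j."""
    delta = (1.0 / (K_B * T_i) - 1.0 / (K_B * T_j)) * (E_j - E_i)
    return delta <= 0.0 or rng.random() < np.exp(-delta)

def running_cutoff(epoch, n_epochs, start=4.0, stop=8.0):
    """Incremental nonbonded cutoff in Angstrom, grown linearly from
    'start' to 'stop' across epochs (an assumed schedule shape)."""
    frac = min(epoch, n_epochs - 1) / max(n_epochs - 1, 1)
    return start + (stop - start) * frac

rng = np.random.default_rng(0)
print(running_cutoff(0, 9), running_cutoff(8, 9))   # 4.0 ... 8.0
print(swap_accepted(-120.0, -118.5, 300.0, 310.0, rng))
```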
343

Zobecněné náhodné mozaiky, jejich vlastnosti, simulace a aplikace / Generalized random tessellations, their properties, simulation and applications

Jahn, Daniel January 2019
The past few years have seen advances in the modelling of polycrystalline materials using parametric tessellation models from stochastic geometry. A promising class of tessellations, the Gibbs-type tessellation, allows the user to specify a great variety of properties through the energy function. This text focuses solely on tetrahedrizations, three-dimensional tessellations composed of tetrahedra. The existing results for two-dimensional Delaunay triangulations are extended to the case of three-dimensional Laguerre tetrahedrizations. We provide a proof of existence, a C++ implementation of the MCMC simulation, and estimation of the model's parameters through maximum pseudolikelihood.
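A Gibbs-type model weights configurations by exp(−E(x)) for a user-chosen energy E, so MCMC simulation only needs energy differences. Below is a minimal Metropolis sketch in Python (the thesis implementation is in C++); the pairwise energy is a runnable stand-in, since the real energy is a functional of the Laguerre tetrahedrization induced by the generator points:

```python
import numpy as np

def toy_energy(points):
    """Placeholder energy penalizing close pairs of generator points;
    the thesis's energy is defined on the induced tetrahedrization."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return float(np.sum(1.0 / (d[iu] + 0.1)))

def metropolis_step(points, energy, sigma, rng):
    """One Metropolis move for p(x) proportional to exp(-energy(x)):
    perturb one generator point, accept with probability min(1, exp(-dE))."""
    proposal = points.copy()
    i = rng.integers(len(points))
    proposal[i] += rng.normal(scale=sigma, size=points.shape[1])
    dE = energy(proposal) - energy(points)
    if dE <= 0.0 or rng.random() < np.exp(-dE):
        return proposal
    return points

rng = np.random.default_rng(1)
pts = rng.random((20, 3))          # 20 generator points in the unit cube
for _ in range(1000):
    pts = metropolis_step(pts, toy_energy, sigma=0.05, rng=rng)
```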
344

Construção de ferramenta computacional para estimação de custos na presença de censura utilizando o método da Ponderação pela Probabilidade Inversa / Construction of a computational tool for cost estimation in the presence of censoring using the Inverse Probability Weighting method

Sientchkovski, Paula Marques January 2016
Introduction: Cost data needed in cost-effectiveness analysis (CEA) are often obtained from primary longitudinal studies. In this context, censoring is common: cost data are missing from a certain point onward because individuals leave the study before it is finalized. The idea of Inverse Probability Weighting (IPW) has been extensively studied in the literature related to this problem, but the availability of computational tools for this context is unknown. Objective: To develop computational tools in Excel and R to estimate costs by the IPW method as proposed by Bang and Tsiatis (2000), in order to deal with the problem of censoring in cost data. Methods: By creating spreadsheets in Excel and programs in R, and using hypothetical databases covering diverse situations, we seek to give the researcher a better understanding of the use of the IPW estimator and the interpretation of its results. Results: By enabling the application of the IPW method in an intuitive way, the developed tools proved to facilitate cost estimation in the presence of censoring, making it possible to calculate the incremental cost-effectiveness ratio (ICER) from more accurate cost data. Conclusion: Besides providing a practical understanding of the method, the developed tools allow it to be applied on a larger scale, and may be considered a satisfactory alternative to the difficulties posed by censoring in CEA.
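A minimal Python sketch of the simple weighted (Bang-Tsiatis) mean-cost estimator the tools implement: each uncensored subject's total cost is weighted by the inverse of the Kaplan-Meier censoring survival at their follow-up time. Ties are ignored for simplicity, and the thesis's own implementations are in Excel and R, so this is only an illustration of the estimator:

```python
import numpy as np

def censoring_survival(x, delta):
    """Kaplan-Meier estimate of K(t) = P(C >= t), treating censoring
    (delta == 0) as the event; evaluated at each subject's time x_i."""
    order = np.argsort(x)
    cens = 1 - delta[order]
    n = len(x)
    at_risk = n - np.arange(n)                # risk-set sizes after sorting
    surv = np.cumprod(1.0 - cens / at_risk)   # KM product over sorted times
    K = np.ones(n)
    K[1:] = surv[:-1]                         # left-continuous version K(t-)
    out = np.empty(n)
    out[order] = K
    return out

def ipw_mean_cost(cost, x, delta):
    """Simple weighted estimator: mean of delta_i * M_i / K(X_i)."""
    K = censoring_survival(np.asarray(x, float), np.asarray(delta, int))
    return float(np.mean(np.asarray(cost, float) * delta / K))

# toy usage: follow-up times, completeness indicators, accumulated costs
x = np.array([2.0, 3.5, 1.2, 4.0, 2.8])
delta = np.array([1, 0, 1, 1, 0])   # 1 = complete observation, 0 = censored
cost = np.array([10e3, 7e3, 4e3, 15e3, 6e3])
print(ipw_mean_cost(cost, x, delta))
```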
345

Estratégias incrementais em combinação de filtros adaptativos. / Incremental strategies in combination of adaptive filters.

Lopes, Wilder Bezerra 14 February 2012
In this work a new strategy for the combination of adaptive filters is introduced and studied. Inspired by incremental schemes and cooperative adaptive filtering, the standard convex combination of parallel, independent filters is rearranged into a series-cooperative configuration, while preserving computational complexity. Two new algorithms are derived, employing the Recursive Least-Squares (RLS) and Least-Mean-Squares (LMS) algorithms as the component filters. To assess the performance of the incremental structure, a tracking and steady-state mean-square analysis is derived. The analysis is carried out assuming the combiners are fixed, so that the universality of the new structure may be studied decoupled from the supervisor's dynamics. The resulting analytical model shows good agreement with simulation results.
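For reference, a minimal sketch of the baseline the abstract starts from: a convex combination of a fast and a slow LMS filter with a fixed combiner λ, matching the fixed-combiner assumption of the analysis. The series-cooperative rearrangement itself is not reproduced here:

```python
import numpy as np

def convex_combo_lms(x, d, M=8, mu_fast=0.05, mu_slow=0.005, lam=0.7):
    """Convex combination y = lam*y1 + (1-lam)*y2 of two independently
    adapted LMS filters; lam is held fixed, as in the mean-square analysis."""
    w1, w2 = np.zeros(M), np.zeros(M)
    y = np.zeros(len(d))
    for n in range(M - 1, len(d)):
        u = x[n - M + 1:n + 1][::-1]      # regressor: most recent sample first
        y1, y2 = w1 @ u, w2 @ u
        y[n] = lam * y1 + (1.0 - lam) * y2
        w1 += mu_fast * (d[n] - y1) * u   # independent LMS updates
        w2 += mu_slow * (d[n] - y2) * u
    return y

# toy system-identification run
rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
w_true = rng.standard_normal(8)
d = np.convolve(x, w_true)[:2000] + 0.01 * rng.standard_normal(2000)
y = convex_combo_lms(x, d)
```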
347

Cost-effectiveness Analysis of Preimplantation Genetic Screening

Moye, William Andrew 01 January 2018
In vitro fertilization (IVF) is used to help infertile couples achieve a live birth. Clinical studies have suggested that multiple consecutive cycles of IVF can increase the live birth rate significantly. Others have documented improved live birth rates from the use of new laboratory techniques for preimplantation genetic screening (PGS), which seeks to determine the ploidy of the embryo prior to implantation into the woman. To date, no study has examined the cost-effectiveness of using IVF in conjunction with PGS compared to IVF alone for 3 consecutive cycles in achieving a live birth. This study, grounded in protection motivation theory, compared the incremental cost-effectiveness ratios (ICERs) of the two intervention arms based on the clinical probabilities of each outcome. Costs were obtained from secondary sources, such as the literature and government databases. The model was constructed using a decision-analytical approach that allowed for z-test statistical analysis of the outcomes, with the ICER as the dependent variable and the 2 interventions as the independent variables. The robustness of the model was tested through univariate and probabilistic sensitivity analyses, stratified by age group. The results showed that PGS with IVF was cost-effective for women aged under 40 and women aged 40-42, but not for women over 42. Based on a willingness-to-pay threshold of $100,000, IVF with PGS was the most cost-effective strategy in all age groups. The positive social change implication of this study is that understanding the costs associated with a new technology to achieve a live birth can help guide the clinical treatment of these patients.
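The core comparison reduces to a ratio of cost and effectiveness differences judged against a willingness-to-pay threshold. A sketch with invented inputs (these numbers are illustrative, not the study's estimates):

```python
# hypothetical decision-model outputs for one age stratum
cost_pgs, eff_pgs = 61_000.0, 0.62   # mean cost ($) and P(live birth), IVF + PGS
cost_ivf, eff_ivf = 53_000.0, 0.51   # mean cost ($) and P(live birth), IVF alone

icer = (cost_pgs - cost_ivf) / (eff_pgs - eff_ivf)  # $ per additional live birth
wtp = 100_000.0                                     # willingness-to-pay threshold
verdict = "cost-effective" if icer <= wtp else "not cost-effective"
print(f"ICER = ${icer:,.0f} per additional live birth -> {verdict}")
```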
348

Inkrementell responsanalys : Vilka kunder bör väljas vid riktad marknadsföring? / Incremental response analysis : Which customers should be selected in direct marketing?

Karlsson, Jonas, Karlsson, Roger January 2013
If customers respond differently to a campaign, it is worthwhile to find those customers who respond most positively and direct the campaign towards them. This can be done using so-called incremental response analysis, where respondents to a campaign are compared with respondents in a control group. Customers with the highest increased response from the campaign are selected, which may increase the company's return. Incremental response analysis is applied to historical data from the mobile operator Tre. The thesis investigates which method best explains the incremental response, that is, which method best finds those of Tre's customers who give the highest incremental response, and which characteristics are important. The analysis is based on various classification methods such as logistic regression, Lasso regression and decision trees. RMSE, the root mean square error of the deviation between observed and predicted incremental response, is used to measure the incremental response prediction error. The classification methods are evaluated by the Hosmer-Lemeshow test and AUC (Area Under the Curve). Bayesian logistic regression is also used to examine the uncertainty in the parameter estimates. Lasso regression performs best in terms of predicted incremental response, compared to the decision tree, ordinary logistic regression and Bayesian logistic regression. Variables that significantly affect the incremental response according to the Lasso regression are age and how long the customer has had their subscription.
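One standard way to operationalize the campaign-vs-control comparison is the two-model approach sketched below: fit a response model per group and rank customers by the difference in predicted response probabilities. Whether the thesis uses exactly this decomposition is not stated in the abstract:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def two_model_uplift(X_treat, y_treat, X_ctrl, y_ctrl, X_new):
    """Incremental response = P(respond | treated) - P(respond | control),
    estimated with separate logistic regressions per group."""
    m_t = LogisticRegression(max_iter=1000).fit(X_treat, y_treat)
    m_c = LogisticRegression(max_iter=1000).fit(X_ctrl, y_ctrl)
    return (m_t.predict_proba(X_new)[:, 1]
            - m_c.predict_proba(X_new)[:, 1])

# toy usage: rank customers and target the k with the largest uplift
rng = np.random.default_rng(3)
Xt, Xc, Xn = (rng.standard_normal((200, 4)) for _ in range(3))
yt = (rng.random(200) < 0.3).astype(int)   # responses, campaign group
yc = (rng.random(200) < 0.2).astype(int)   # responses, control group
uplift = two_model_uplift(Xt, yt, Xc, yc, Xn)
targets = np.argsort(-uplift)[:20]
```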
349

Diseño incremental de e-servicios: estudio teórico, propuesta metodológica y casos prácticos / Incremental design of e-services: theoretical study, methodological proposal and practical cases

Huerta Vásquez, Eduardo Andrés 11 October 2012
This thesis presents the development and results of a research project proposing an incremental method for designing services intended for knowledge management environments on the Internet, called e-services. The method is based on the principles of collaborative design, by which all organizational profiles contribute specific design tasks, and on the idea of incremental progression of projects, increasing the artifact's formal features and functionality at each stage of production. The thesis begins with a theoretical study presenting the main concepts and features of three main areas: 1) new business models and the scenarios that emerge from them, such as so-called "open innovation" and "living labs"; 2) research in the design field, including paradigms such as "collaborative design", which governs the development of this doctoral research; 3) the theory of distributed cognition, which studies the cognitive development of subjects in different environments using different types of artifacts. The second part of the thesis presents the results of an empirical exploration based on the concepts, features and phases of the action research method. This part presents the incremental design method (the main contribution of the thesis) and explains how it was tested on two projects undertaken by the organization in which the research was conducted, each contributing to a different environment in the knowledge management area. The projects are the development, following the proposed method, of an e-health service to support the treatment of dysphagia, and the systematization of the incremental design method by implementing a workflow tool useful in the everyday activity of the multidisciplinary research group behind the thesis. The empirical exploration yields qualitative and quantitative results intended to validate the proposed methodology, which are presented in the thesis document. Finally, the last chapter presents general and specific conclusions, the contributions this work makes to the scientific community, and suggestions for future research that may follow from the experience presented here.
350

Statistical Inference for Costs and Incremental Cost-Effectiveness Ratios with Censored Data

Chen, Shuai May 2012
Cost-effectiveness analysis is widely conducted in the economic evaluation of new treatment options. In many clinical and observational studies of costs, data are often censored. Censoring brings challenges to both medical cost estimation and cost-effectiveness analysis. Although methods have been proposed for estimating mean costs with censored data, they are often derived from theory and it is not always easy to understand how they work. We provide an alternative method for estimating the mean cost more efficiently, based on a replace-from-the-right algorithm, and show that this estimator is equivalent to an existing estimator based on the inverse probability weighting principle and semiparametric efficiency theory; we thereby provide an intuitive explanation for a theoretically derived mean-cost estimator. In many applications it is also important to estimate the survival function of costs. We propose a generalized redistribute-to-the-right algorithm for estimating the survival function of costs with censored data, and show that it is equivalent to a simple weighted survival estimator of costs based on inverse probability weighting techniques. Motivated by this redistribute-to-the-right principle, we also develop a survival estimator for costs that has the desirable property of being monotone and is more efficient, although not always consistent. We conduct simulations to compare our method with existing survival estimators for costs, and find that its bias seems quite small; it may therefore be considered a candidate survival estimator for costs in real settings where censoring is heavy and cost history information is available. Finally, we consider a special situation in cost-effectiveness analysis in which the terminating events for survival time and costs are different. Traditional methods for statistical inference cannot deal with such data. We propose a new method for deriving the confidence interval for the incremental cost-effectiveness ratio in this situation, based on counting processes and the general theory for missing data processes. Simulation studies show that our method performs very well in some practical settings, and it has great potential to be applied in real settings where different terminating events exist for survival time and costs.
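A sketch of the "simple weighted survival estimator of costs" the abstract refers to: reweight the complete observations by inverse probability weights and read off an empirical survival curve of costs. The weights would come from an IPW construction like the one sketched under entry 344 above; the values below are hypothetical:

```python
import numpy as np

def weighted_cost_survival(cost, delta, K, grid):
    """IPW-weighted survival curve of costs, S(x) = P(cost > x), using
    only complete observations (delta == 1) with weights 1/K(X_i)."""
    w = delta / K
    w = w / w.sum()                  # normalize weights to sum to one
    return np.array([float(np.sum(w * (cost > x))) for x in grid])

# toy usage, reusing delta and K(X_i) from an IPW construction as in entry 344
cost = np.array([10e3, 7e3, 4e3, 15e3, 6e3])
delta = np.array([1, 0, 1, 1, 0])
K = np.array([0.9, 0.8, 1.0, 0.6, 0.85])  # hypothetical censoring-survival values
print(weighted_cost_survival(cost, delta, K, grid=[5e3, 8e3, 12e3]))
```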
