221 |
Distribuições de limiar de dose e suas causas e consequências em Terapia Fotodinâmica / Threshold dose distributions and their causes and consequences in photodynamic therapy / Clara Maria Gonçalves de Faria, 22 February 2017
O princípio de terapia fotodinâmica (TFD) foi introduzido por volta de 1900, mas posteriormente investigado como candidato para tratamento de câncer na década de 1970. Desde então, existem diversos trabalhos a respeito do assunto in vitro, in vivo e em estudos clínicos, e grandes avanços foram alcançados. Entretanto, alguns desafios ainda não foram superados, como a variabilidade dos resultados. Este trabalho consiste na investigação de suas causas, em que o principal objetivo é avançar o estado da arte em TFD. Para isso, foi usado um modelo de distribuição de limiar de dose para avaliar resistência em TFD in vitro. As distribuições de limiar de dose são obtidas pela derivação da curva de dose-resposta experimental. Elas são caracterizadas pela sua largura e pela dose que corresponde ao pico, que se relacionam à homogeneidade e à resistência intrínseca da população, respectivamente. Na seção 1, é apresentada a avaliação e comparação de dados obtidos de resultados publicados na literatura e, na seção 2, de experimentos realizados pela autora em diferentes linhagens celulares. Da análise da primeira etapa, foi observado que a largura da distribuição é proporcional à dose do pico e foi possível investigar a dependência do resultado da TFD com a linhagem celular, dado um fotossensibilizador (FS). Foi interessante, também, notar que as distribuições de limiar de dose correspondem a curvas de atividade de marcadores celulares de apoptose, como função da dose de luz, para a maior parte das condições analisadas. Dos experimentos realizados pela autora, foi visto que as células normais são as mais resistentes ao dano, seguidas das células de câncer resistentes e de sua linhagem parental, e que sua resposta foi a mais homogênea. Essas observações foram corroboradas pelas imagens obtidas de microscopia de fluorescência para avaliação da captação de FS, que mostraram que as células tumorais acumulam mais FS que as outras. Portanto, foi mostrado o potencial de se aplicar distribuições de limiar de dose na análise de resultados de TFD in vitro; trata-se de uma poderosa ferramenta que fornece mais informações que as curvas de dose-resposta padrão. / The principle of photodynamic therapy (PDT) was introduced around 1900 but was only further investigated as a candidate for cancer treatment in the 1970s. Since then, several in vitro, in vivo, and clinical studies on the subject have been published and great advances have been achieved. However, some challenges have not yet been overcome, such as the variability of results. This work investigates its causes, with the main goal of advancing the state of the art of PDT. To that end, a threshold dose distribution model was used to evaluate cell resistance to PDT in vitro. The threshold distributions are obtained by differentiating the experimental dose-response curve. They are characterized by their width and by the dose that corresponds to the peak, which relate to the homogeneity and the intrinsic resistance of the population, respectively. Section 1 presents the evaluation and comparison of data obtained from results published in the literature and Section 2 presents experiments performed by the author on different cell lines. From the analysis in the first part, it was observed that the width of the distribution is proportional to the peak dose, and it was possible to investigate how the PDT outcome depends on the cell line for a fixed photosensitizer (PS).
It was also interesting to note that the threshold distributions corresponded to the activity curves of apoptotic cell markers as a function of light dose for most of the conditions analyzed. From the experiments performed by the author, it was seen that the normal cell line was the most resistant one, followed by the resistant cancer cells and their parental cell line, and that its response was the most homogeneous. Those findings were supported by the fluorescence microscopy images obtained to evaluate PS uptake, which showed that the tumor cells accumulated more PS than the other ones. Therefore, the potential of applying threshold distributions to analyze PDT results in vitro was demonstrated; it is a powerful tool that provides more information than standard dose-response curves.
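As a rough illustration of the threshold-dose analysis described in this abstract, the sketch below (Python/NumPy, with a synthetic dose-response curve standing in for measured in vitro data) differentiates the survival curve with respect to light dose and summarizes the resulting threshold distribution by its peak dose and width; it is a minimal reading of the method, not the author's code.

```python
import numpy as np

# Synthetic dose-response data (surviving fraction vs. light dose, J/cm^2);
# a real analysis would use the measured in vitro curve instead.
dose = np.linspace(0.0, 30.0, 121)
survival = 1.0 / (1.0 + np.exp((dose - 12.0) / 2.5))   # illustrative sigmoid decay

# Threshold dose distribution: the (negative) derivative of the dose-response curve.
threshold_dist = -np.gradient(survival, dose)
threshold_dist /= np.trapz(threshold_dist, dose)        # normalize to unit area

# Characterize the distribution by its peak dose and width (FWHM).
peak_dose = dose[np.argmax(threshold_dist)]
half_max = threshold_dist.max() / 2.0
above = dose[threshold_dist >= half_max]
fwhm = above.max() - above.min()

print(f"peak dose = {peak_dose:.1f} J/cm^2, FWHM = {fwhm:.1f} J/cm^2")
```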
|
222 |
Como o comportamento animal pode influenciar a distribuição das espécies / The influence of animal behavior on species distributions / Lima, Herlander Correia de, 21 March 2018
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq / Research in animal personality is increasing our understanding of what prevents a species from
colonizing new areas, which is one of the outstanding questions in biogeography. Some
behavioral types can perform better than others in specific stages involved in range expansion.
For example, a high exploratory behavior increases the chances of finding new resources in novel
environments. However, inconsistent results in the literature have hindered a definitive recognition of the
role of animal personality in species distributions. I collected data available in the literature and
performed a Bayesian meta-analysis to assess which behavioral types are driving range expansion
in the following biogeographical processes: dispersal, migration and invasion. I used several
moderators to try to discern context-dependencies in effect sizes. A hierarchical model, with
effect sizes nested within studies, revealed that more exploratory and bolder behaviors facilitate
range expansion. Also, I found that invasive individuals are more likely to be more exploratory
and more active than natives, while dispersers are generally bolder and more exploratory than
non-dispersers. Yet, the low study sample size obtained for analysis stresses the need to conduct
more primary studies. Results highlight the role of behavioral traits in species distributions and
increase our knowledge about which ecological characteristics might prepare species to endure
the current global environmental challenges. / A pesquisa em personalidade animal está aumentando o nosso conhecimento sobre o que
impede uma espécie de colonizar novas áreas, sendo esta uma das principais questões em
biogeografia. Alguns tipos de comportamento podem resultar em melhor desempenho que outros
em estágios específicos da expansão do território. Por exemplo, um comportamento mais
exploratório facilita a descoberta de recursos em um novo meio. Contudo, resultados
inconsistentes na literatura estão dificultando um reconhecimento do papel da personalidade
animal na distribuição das espécies. Coletei dados da literatura e realizei uma meta-análise
bayesiana para determinar que tipos de comportamento são responsáveis pela expansão do
território através dos processos biogeográficos de: dispersão, invasão e migração. Fiz ainda uso
de vários moderadores na tentativa de identificar dependências de contexto nos tamanhos de efeito.
Em um modelo hierárquico, usando tamanhos de efeito aninhados dentro dos estudos, mostro que
um comportamento mais ousado e mais exploratório facilita o sucesso na expansão do território.
Para além disso, eu demonstro que invasores são mais exploratórios e mais ativos que nativos, e
dispersores são mais exploratórios e ousados que não-dispersores. Contudo, o baixo tamanho
amostral obtido para as análises demonstra a necessidade de conduzir mais estudos primários. Os
resultados realçam o papel dos traços comportamentais na distribuição das espécies e aumentam
o nosso conhecimento sobre que características ecológicas podem preparar as espécies para
resistir aos desafios das mudanças ambientais.
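A minimal sketch of the kind of hierarchical model described in the English abstract, assuming PyMC and entirely hypothetical effect sizes and study assignments (the thesis does not specify its software here): effect sizes are nested within studies, and the overall mean effect and between-study heterogeneity are estimated jointly.

```python
import numpy as np
import pymc as pm

# Hypothetical standardized effect sizes (e.g., exploration of dispersers vs. non-dispersers),
# their sampling variances, and the study each effect size comes from.
g     = np.array([0.42, 0.31, 0.55, 0.12, 0.60, 0.25, 0.05, 0.48])
var_g = np.array([0.04, 0.06, 0.05, 0.09, 0.03, 0.07, 0.08, 0.05])
study = np.array([0, 0, 1, 1, 2, 2, 3, 3])              # effect sizes nested within 4 studies

with pm.Model() as model:
    mu    = pm.Normal("mu", mu=0.0, sigma=1.0)           # overall mean effect
    tau   = pm.HalfNormal("tau", sigma=1.0)              # between-study standard deviation
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=4)  # study-level true effects
    pm.Normal("g_obs", mu=theta[study], sigma=np.sqrt(var_g), observed=g)
    idata = pm.sample(2000, tune=1000, target_accept=0.9, random_seed=1)

print(idata.posterior["mu"].mean().item(), idata.posterior["tau"].mean().item())
```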
|
223 |
Discriminação Salarial por Raça e Gênero no Mercado de Trabalho das Regiões Nordeste e Sudeste do Brasil: uma Aplicação de Simulações Contrafactuais e Regressão Quantílica / Wage Discrimination by Race and Gender in the Labor Market of the Northeast and Southeast Regions of Brazil: an Application of Counterfactual Simulations and Quantile Regression / Jaqueline Nogueira Cambota, 10 July 2005
Universidade Federal do Ceará / Este artigo analisa a discriminação salarial por raça e gênero nas distribuições de salário segundo os setores de atividade, comparando as regiões Nordeste e Sudeste do Brasil. Para este objetivo, utilizaram-se os dados da Pnad 2002 e aplicaram-se uma metodologia semiparamétrica (estimador kernel) e outra paramétrica (regressão quantílica). Na primeira, realizaram-se simulações contrafactuais para verificar como seria a distribuição de salários dos trabalhadores negros (mulheres) caso eles tivessem a mesma escolaridade dos trabalhadores brancos (homens). Essas simulações mostram que existe discriminação contra mulheres e negros no mercado de trabalho em ambas as regiões. O método kernel mostrou, em uma representação visualmente clara, que a discriminação contra a raça negra é maior no Sudeste para todos os setores de atividade, enquanto não se conseguiu identificar em qual região a discriminação contra mulheres é maior, visto que ela depende do setor considerado. Em relação à regressão quantílica, os resultados mostraram que a discriminação salarial cresce para salários maiores. / This paper analyses wage discrimination by race and gender in the wage distributions according to sectors of activity, comparing the Northeast and Southeast regions of Brazil. For this aim, we use data from the 2002 Pnad and apply a semi-parametric method (kernel estimator) and a parametric one (quantile regression). In the first, we perform counterfactual exercises to examine what the wage distribution of black (women) workers would be if they had the same schooling as white (men) workers. The results show that there is discrimination against women and black workers in the labor market of both regions. The kernel method provided a visually clear representation showing that discrimination against black workers is greater in the Southeast for all sectors of activity, while it could not identify in which region discrimination against women is greater, because this depends on the sector considered. The quantile regression showed that wage discrimination increases for higher wages.
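The sketch below illustrates, on hypothetical microdata and assuming Python with SciPy and statsmodels, the two ingredients named in this abstract: a kernel density estimate of the wage distributions by group and quantile regressions at several quantiles. The paper's full counterfactual reweighting exercise is not reproduced.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical microdata standing in for Pnad 2002: log wages, schooling, gender.
n = 2000
schooling = rng.integers(0, 16, size=n)
female = rng.integers(0, 2, size=n)
log_wage = 0.5 + 0.09 * schooling - 0.25 * female + rng.normal(0, 0.6, size=n)

# Semi-parametric side: kernel density estimates of the two wage distributions.
grid = np.linspace(log_wage.min(), log_wage.max(), 200)
kde_women = gaussian_kde(log_wage[female == 1])(grid)
kde_men   = gaussian_kde(log_wage[female == 0])(grid)

# Parametric side: quantile regressions at several points of the wage distribution.
X = sm.add_constant(np.column_stack([schooling, female]))
for q in (0.10, 0.50, 0.90):
    res = sm.QuantReg(log_wage, X).fit(q=q)
    print(q, res.params)   # the gender coefficient traces the wage gap along the distribution
```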
|
224 |
Implementação e aceite de sistema de radioterapia de feixe modulado dinâmico com o uso de colimador secundário de múltiplas folhas / Commissioning and implementation of dynamic intensity-modulated radiation therapy using a secondary multi-leaf collimator (MLC) / Paulo José Cecilio, 14 October 2008
A radioterapia de feixe de intensidade modulada (IMRT) no seu modo dinâmico é uma forma de radioterapia tridimensional (3D) na qual se modula o feixe de forma a obter a irradiação com campos que possuem perfil variável. Os campos são gerados por um sistema de otimização matemática e transformados em sequências de movimento ou abertura de lâminas dos colimadores terciários de múltiplas folhas (MLC) ou em feixe colimado helicoidal, reproduzindo a fluência de radiação adequada. No processo, o operador atribui valores limitantes de dose ao alvo e aos órgãos de risco circunvizinhos para que o sistema de planejamento inverso realize a otimização possível. Após a aprovação do plano de tratamento, o mesmo deve ser conferido por meio de um controle de qualidade (CQ), em que são verificadas as doses que deverão ser administradas ao paciente, comprovando-se as doses obtidas e aprovadas no plano do sistema de planejamento (SPC). Para esse controle, os mesmos feixes e campos são medidos em termos de dose absorvida e perfis, por meio de dosimetria, na qual se comprova que não há erro físico ou dosimétrico no plano que irá tratar o paciente, com diferença aceitável de até 5%, também utilizada como tolerância para a aprovação dos 460 casos avaliados nesta tese. Foram apresentadas as metodologias de aceitação no primeiro serviço a utilizá-la no Brasil e os testes de controle de qualidade de dois serviços de radioterapia, um de agosto de 2001 a maio de 2006 e o outro de outubro de 2007 a maio de 2008, com um controle de qualidade que permitiu os respectivos tratamentos clínicos, com dados de 4 anos, ou seja, 460 casos com 3935 campos de tratamento verificados individualmente por dosimetria. Isto possibilitou o aperfeiçoamento da metodologia e a garantia da qualidade nos tratamentos de IMRT dinâmico destes pacientes. / Intensity-modulated radiation therapy (IMRT) in its dynamic (sliding-window) mode modulates the beamlets of each field, so that each field is delivered with a variable fluence profile. The multiple fields are obtained by mathematical optimization in a dedicated treatment planning system, and the resulting fields are generated by leaf sequencing using the multi-leaf collimator (MLC) or a helical collimated beam. The optimization is an interactive process between the operator and the planning system, in which the dose prescription for the target and the dose limits for the organs at risk are entered to obtain an acceptable beam fluence; this process is called inverse planning. The plan approved by the physician should be checked by means of dosimetry in order to assure correct dose delivery; this is the main task of a quality control (QC) program. The QC is performed by measuring the total absorbed dose and the profile of each field planned for the patient. The acceptance level of 5% for the total dose was used for all 460 cases and 3935 fields analyzed, from August 2001 to May 2006 at the Albert Einstein Hospital and from October 2007 to May 2008 at the Centro Infantil Dr. Boldrini. This work analyzes the QC of the treatment plans of all patients treated with dynamic IMRT. Over four years, the methodologies were continually improved and upgraded for each tumor site, thus assuring the required quality of all dynamic IMRT treatments.
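A minimal sketch of the per-field dose check described above, assuming Python/NumPy and made-up planned and measured dose values: each field is compared against the 5% acceptance level used in the thesis.

```python
import numpy as np

# Hypothetical planned vs. measured doses (cGy) for the fields of one IMRT plan.
planned  = np.array([120.0, 95.0, 110.0, 130.0, 105.0])
measured = np.array([118.2, 97.1, 104.0, 131.5, 103.9])

percent_diff = 100.0 * (measured - planned) / planned
passed = np.abs(percent_diff) <= 5.0          # 5% acceptance level

for i, (d, ok) in enumerate(zip(percent_diff, passed), start=1):
    print(f"field {i}: {d:+.1f}%  ->  {'PASS' if ok else 'FAIL'}")
print("plan approved" if passed.all() else "plan requires review")
```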
|
225 |
As distribuições Kumaraswamy-log-logística e Kumaraswamy-logística / The Kumaraswamy-log-logistic and Kumaraswamy-logistic distributions / Tiago Viana Flor de Santana, 18 October 2010
Neste trabalho apresentam-se duas novas distribuições de probabilidade, obtidas de dois métodos de generalização da distribuição log-logística com dois parâmetros (LL(α, γ)). O primeiro método, descrito em Marshall e Olkin (1997), torna a nova distribuição, agora com três parâmetros e denominada distribuição log-logística modificada (LLM(v, α, γ)), mais flexível; porém, não muda a forma geral da função de taxa de falha, e o novo parâmetro v não influencia o cálculo da assimetria e da curtose. O segundo método utiliza a classe de distribuições Kumaraswamy, proposta por Cordeiro e Castro (2010), para construir a nova distribuição de probabilidade, denominada distribuição Kumaraswamy log-logística (Kw-LL(a, b, α, γ)), a qual considera dois novos parâmetros a e b, obtendo ganho nas formas da função de taxa de falha, que agora, além de modelar dados em que a função de taxa de falha tem forma decrescente e unimodal, modela as formas crescente e de U. Também foram propostas as distribuições logística modificada (LM(v, µ, σ)) e Kumaraswamy logística (Kw-L(a, b, µ, σ)) para a variável Y = log(T), em que T ~ LLM(v, α, γ) no caso da distribuição logística modificada e T ~ Kw-LL(a, b, α, γ) no caso da distribuição Kw-L, com reparametrização α = exp(µ) e γ = 1/σ. Da mesma forma que na distribuição LLM, não há ganho quanto à forma da função de taxa de falha da distribuição logística modificada, e o parâmetro v não contribui para o cálculo da assimetria e da curtose desta distribuição. O modelo de regressão de locação e escala foi proposto para ambas as distribuições. Por fim, utilizaram-se dois conjuntos de dados para exemplificar o ganho das novas distribuições Kw-LL e Kw-L em relação às distribuições log-logística e logística. O primeiro conjunto refere-se a dados de tempo até a soro-reversão de 143 crianças expostas ao HIV por via vertical, nascidas no Hospital das Clínicas da Faculdade de Medicina de Ribeirão Preto no período de 1995 a 2001, cujas mães não foram tratadas. O segundo conjunto de dados refere-se ao tempo até a falha de um tipo de isolante elétrico fluido submetido a sete níveis de voltagem constante. / In this work, two new probability distributions are presented, obtained from two methods of generalizing the two-parameter log-logistic distribution (LL(α, γ)). The first method, described in Marshall and Olkin (1997), makes the new distribution, now with three parameters and called the modified log-logistic distribution (LLM(v, α, γ)), more flexible, but it does not change the general shape of the failure rate function, and the new parameter v does not influence the calculation of skewness and kurtosis. The second method uses the Kumaraswamy class of distributions proposed by Cordeiro and Castro (2010) to build a new probability distribution, called the Kumaraswamy log-logistic distribution (Kw-LL(a, b, α, γ)), which introduces two new parameters a and b and gains flexibility in the forms of the failure rate function: besides modeling data in which the failure rate function has a decreasing or unimodal shape, it also models increasing and U-shaped forms. The modified logistic (LM(v, µ, σ)) and Kumaraswamy logistic (Kw-L(a, b, µ, σ)) distributions were also proposed for the variable Y = log(T), where T ~ LLM(v, α, γ) in the case of the modified logistic distribution and T ~ Kw-LL(a, b, α, γ) in the case of the Kw-L distribution, with the reparametrization α = exp(µ) and γ = 1/σ.
As with the LLM distribution, there is no gain in the shape of the failure rate function of the modified logistic distribution, and the parameter v does not contribute to the calculation of the skewness and kurtosis of the distribution. Location-scale regression models were proposed for both distributions. As an illustration, two datasets were used to exemplify the gain of the new Kw-LL and Kw-L distributions over the log-logistic and logistic distributions. The first dataset refers to the time until seroreversion of 143 children exposed to HIV through vertical transmission, born at the Hospital das Clínicas of the Ribeirão Preto Medical School between 1995 and 2001, whose mothers were not treated. The second dataset refers to the time until failure of a type of electrical insulating fluid subjected to seven constant voltage levels.
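A minimal sketch of the Kw-LL construction, assuming Python/NumPy and illustrative parameter values: the Kumaraswamy generator F(t) = 1 - [1 - G(t)^a]^b is applied to the log-logistic baseline CDF G(t), and the failure rate (hazard) function follows from the density. The symbols α and γ follow the notation reconstructed above, which is an assumption about the original thesis notation.

```python
import numpy as np

def loglogistic_cdf(t, alpha, gamma):
    """Baseline log-logistic CDF G(t) with scale alpha and shape gamma."""
    return 1.0 / (1.0 + (t / alpha) ** (-gamma))

def loglogistic_pdf(t, alpha, gamma):
    g = (gamma / alpha) * (t / alpha) ** (gamma - 1.0)
    return g / (1.0 + (t / alpha) ** gamma) ** 2

def kw_ll_cdf(t, a, b, alpha, gamma):
    """Kumaraswamy log-logistic CDF: F(t) = 1 - [1 - G(t)^a]^b."""
    G = loglogistic_cdf(t, alpha, gamma)
    return 1.0 - (1.0 - G ** a) ** b

def kw_ll_pdf(t, a, b, alpha, gamma):
    G = loglogistic_cdf(t, alpha, gamma)
    g = loglogistic_pdf(t, alpha, gamma)
    return a * b * g * G ** (a - 1.0) * (1.0 - G ** a) ** (b - 1.0)

def kw_ll_hazard(t, a, b, alpha, gamma):
    """Failure-rate function h(t) = f(t) / [1 - F(t)]; the extra parameters a, b
    allow decreasing, unimodal, increasing, and U-shaped forms."""
    return kw_ll_pdf(t, a, b, alpha, gamma) / (1.0 - kw_ll_cdf(t, a, b, alpha, gamma))

t = np.linspace(0.1, 10.0, 5)
print(kw_ll_hazard(t, a=0.5, b=2.0, alpha=2.0, gamma=1.5))   # illustrative parameters only
```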
|
226 |
Star formation in the Gould Belt: a submillimetre perspective / Mowat, Christopher, January 2018
This thesis presents my work characterising star formation in Gould Belt molecular clouds using submillimetre observations from SCUBA-2 on the James Clerk Maxwell Telescope (JCMT). I use these observations alongside data from previously published surveys using instruments including the Spitzer Space Telescope. I investigate the effect of including submillimetre data on the numbers, classifications and lifetimes of Young Stellar Objects (YSOs) in Gould Belt molecular clouds, particularly protostars. Following a literature review, I use SCUBA-2 450 and 850 μm observations to characterise star formation in the Lupus I molecular cloud. A total of eleven previously identified YSOs are detected with SCUBA-2, as well as eleven starless cores. Two cores have masses greater than the Jeans mass, and one has a virial parameter of 1.1 ± 0.4, meaning these cores could be unstable against collapse. I use submillimetre emission to calculate disk masses, and find that one YSO has a disk mass greater than the minimum-mass solar nebula. I find that Lupus I has a high percentage of both protostars and Very Low Luminosity Objects (VeLLOs). I also fit YSO Spectral Energy Distributions (SEDs) with models, allowing protostellar envelope masses and temperatures to be calculated, and interstellar extinction to be constrained for some YSOs. The signs of recent and future star formation support the hypothesis that a shock has triggered a star-forming event in Lupus I. I also use SCUBA-2 data in conjunction with archival Spitzer and Herschel data to produce SEDs for five new candidate First Hydrostatic Cores (FHSCs) in Serpens South. These observations were then fit with models by the first author of this work, Alison Young. This work was able to identify two of the FHSC candidates as probable FHSCs, and to constrain the rotation rate and inclination of one of them. I use JCMT Gould Belt Survey (GBS) observations of ten molecular clouds to produce an updated catalogue of protostars in these clouds. I use the FellWalker algorithm to find individual sources in the SCUBA-2 maps, and match them to the Spitzer YSO catalogue of Dunham et al. (2015). I use bolometric temperature to classify 362 out of 592 candidates as Class 0 or Class I protostars, a factor of two increase compared to the Spitzer catalogue due to improved submillimetre coverage. I find protostellar lifetimes of 0.59–0.89 Myr, approximately 25% longer than previously estimated. I also calculate protostellar luminosities, envelope masses, and envelope temperatures, and examine their distributions. Finally, I newly identify 19 protostars as VeLLOs, and increase the number of known VeLLOs in these clouds by a factor of two.
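As a hedged illustration of the bolometric-temperature classification mentioned above, the sketch below (Python/NumPy, with an entirely hypothetical SED) uses the flux-weighted mean frequency definition of T_bol with the constant 1.25e-11 K s and the commonly adopted 70 K / 650 K class boundaries; the thesis may use slightly different conventions.

```python
import numpy as np

def bolometric_temperature(nu_hz, flux_jy):
    """Flux-weighted mean frequency scaled to a temperature,
    T_bol = 1.25e-11 K s * <nu>; SED points are assumed sorted in frequency."""
    mean_nu = np.trapz(nu_hz * flux_jy, nu_hz) / np.trapz(flux_jy, nu_hz)
    return 1.25e-11 * mean_nu

def classify(t_bol, class0_cut=70.0, class1_cut=650.0):
    # Commonly adopted T_bol boundaries; an assumption, not the thesis's exact cuts.
    if t_bol <= class0_cut:
        return "Class 0"
    if t_bol <= class1_cut:
        return "Class I"
    return "Class II or later"

# Hypothetical SED: wavelengths (micron) converted to frequencies, made-up fluxes (Jy).
wavelengths_um = np.array([3.6, 4.5, 8.0, 24.0, 70.0, 160.0, 450.0, 850.0])
fluxes_jy = np.array([0.002, 0.004, 0.01, 0.08, 1.2, 2.5, 3.0, 1.1])
nu = 2.998e14 / wavelengths_um          # Hz
order = np.argsort(nu)

t_bol = bolometric_temperature(nu[order], fluxes_jy[order])
print(f"T_bol = {t_bol:.0f} K -> {classify(t_bol)}")
```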
|
227 |
Median and Mode Approximation for Skewed Unimodal Continuous Distributions using Taylor Series Expansion / Dula, Mark; Mogusu, Eunice; Strasser, Sheryl; Liu, Ying; Zheng, Shimin, 06 April 2016
Background: Measures of central tendency are among the foundational concepts of statistics, the most commonly used being the mean, median, and mode. While these are all very simple to calculate when data conform to a unimodal symmetric distribution, either discrete or continuous, measures of central tendency are more challenging to calculate for data distributed asymmetrically, and there is a gap in the current statistical literature on computing the median and mode for most skewed unimodal continuous distributions. For example, for a standardized normal distribution the mean, median, and mode are all equal to 0, and for a more general normal distribution the mode and median are still equal to the mean. Unfortunately, the mean is highly affected by extreme values. If the distribution is skewed either positively or negatively, the mean is pulled in the direction of the skew; however, the median and mode are more robust statistics and are not pulled as far as the mean. The traditional response is to provide an estimate of the median and mode, as current methodological approaches are limited in determining their exact values once the mean is pulled away. Methods: The purpose of this study is to test a new statistical method, utilizing the first- and second-order derivatives in a Taylor series expansion, for approximating the median and mode of skewed unimodal continuous distributions. Specifically, to compute the approximate mode, the first-order derivative of the sum of the first three terms in the Taylor series expansion is set to zero and the equation is solved for the unknown. To compute the approximate median, the integral from negative infinity to the median is set equal to one half and the equation is solved for the median. Finally, to evaluate the accuracy of the derived formulae for computing the mode and median of skewed unimodal continuous distributions, a simulation study will be conducted with respect to skew-normal distributions, skew t-distributions, skew exponential distributions, and others, with various parameters. Conclusions: This study may have a great impact on the advancement of current central tendency measurement, a gold-standard tool in public health and social science research, and may answer an important question concerning the precision of median and mode estimates for skewed unimodal continuous distributions of data. If this method proves to be an accurate approximation of the median and mode, then it should become the method of choice when measures of central tendency are required.
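A minimal sketch of the mode approximation described in the Methods, assuming Python/SciPy and a skew-normal example: truncating the Taylor expansion of the density about a point x0 after the quadratic term and setting its derivative to zero gives x ≈ x0 - f'(x0)/f''(x0), which is compared here with a brute-force grid search. The exact median from the quantile function is printed only as a reference, not as the paper's Taylor-based approximation.

```python
import numpy as np
from scipy.stats import skewnorm

a = 4.0                      # skewness parameter of the skew-normal example
dist = skewnorm(a)
pdf = dist.pdf

def second_order_mode(x0, h=1e-4):
    """Mode approximation from the first three Taylor terms of the pdf about x0:
    d/dx [f(x0) + f'(x0)(x-x0) + 0.5 f''(x0)(x-x0)^2] = 0  =>  x = x0 - f'(x0)/f''(x0)."""
    f1 = (pdf(x0 + h) - pdf(x0 - h)) / (2.0 * h)              # central-difference f'
    f2 = (pdf(x0 + h) - 2.0 * pdf(x0) + pdf(x0 - h)) / h**2   # central-difference f''
    return x0 - f1 / f2

approx_mode = second_order_mode(x0=dist.mean())   # expand about the easily computed mean
grid = np.linspace(-3, 3, 200001)
grid_mode = grid[np.argmax(pdf(grid))]            # brute-force reference mode
exact_median = dist.ppf(0.5)                      # reference value for the median

print(f"Taylor-approximated mode: {approx_mode:.4f}")
print(f"grid-search mode:         {grid_mode:.4f}")
print(f"median (exact via ppf):   {exact_median:.4f}")
```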
|
228 |
Creating, Validating, and Using Synthetic Power Flow Cases: A Statistical Approach to Power System Analysis / January 2019
abstract: Synthetic power system test cases offer a wealth of new data for research and development purposes, as well as an avenue through which new kinds of analyses and questions can be examined. This work provides both a methodology for creating and validating synthetic test cases and a few use cases showing how access to synthetic data enables otherwise impossible analyses.
First, the question of how synthetic cases may be generated automatically, and how synthetic samples should be validated to assess whether they are sufficiently "real", is considered. Transmission and distribution levels are treated separately, due to the different nature of the two systems. Distribution systems are constructed by sampling distributions observed in a dataset from the Netherlands. For transmission systems, only first-order statistics, such as generator limits or line ratings, are sampled statistically. The task of constructing an optimal power flow case from the sample sets is left to an optimization problem built on top of the optimal power flow formulation.
Secondly, attention is turned to some examples where synthetic models are used to inform analysis and modeling tasks. Co-simulation of transmission and multiple distribution systems is considered, where distribution feeders are allowed to couple transmission substations. Next, a distribution power flow method is parametrized to better account for losses. Numerical values for the parametrization can be statistically supported thanks to the ability to generate thousands of feeders on command. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2019
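A minimal sketch of the statistical sampling step for transmission parameters, assuming Python/NumPy and made-up "observed" values standing in for a real dataset; the OPF-based optimization that assembles the sampled parameters into a coherent case is not shown.

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up "observed" transmission parameters standing in for a real dataset.
observed_line_ratings_mva = np.array([120, 150, 90, 300, 220, 180, 400, 250])
observed_gen_pmax_mw      = np.array([50, 120, 300, 80, 600, 150])

def sample_lognormal_like(observed, size):
    """Sample new values from a lognormal fitted to the observed data (first-order
    statistics only; correlations and network structure are handled elsewhere)."""
    logs = np.log(observed)
    return np.exp(rng.normal(logs.mean(), logs.std(), size=size))

synthetic_case = {
    "line_rating_mva": sample_lognormal_like(observed_line_ratings_mva, size=20),
    "gen_pmax_mw": sample_lognormal_like(observed_gen_pmax_mw, size=8),
}
# These sampled parameters would feed an OPF-based optimization that assembles
# a full synthetic power flow case.
print({k: np.round(v, 1) for k, v in synthetic_case.items()})
```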
|
229 |
Evaluating equating properties for mixed-format tests / He, Yi, 01 May 2011
Mixed-format tests containing both multiple-choice (MC) items and constructed-response (CR) items are used in many testing programs. The use of multiple formats presents a number of measurement challenges, one of which is how to adequately equate mixed-format tests under the common-item nonequivalent groups (CINEG) design, especially when, due to practical constraints, the common-item set contains only MC items. The purpose of this dissertation was to evaluate how equating properties were preserved for mixed-format tests under the CINEG design.
Real data analyses were conducted on 22 equating linkages of 39 mixed-format tests from the Advanced Placement (AP) Examination program. Four equating methods were used: the frequency estimation (FE) method, the chained equipercentile (CE) method, item response theory (IRT) true score equating, and IRT observed score equating. In addition, cubic spline postsmoothing was used with the FE and CE methods. The factors of investigation were the correlation between MC and CR scores, the proportion of common items, the proportion of MC-item score points, and the similarity between alternate forms. Results were evaluated using three equating properties: first-order equity, second-order equity, and the same distributions property.
The main findings from this dissertation were as follows: (1) Between the two IRT equating methods, true score equating better preserved first-order equity than observed score equating, and observed score equating better preserved second-order equity and the same distributions property than true score equating. (2) Between the two traditional methods, CE better preserved first-order equity than FE, but in terms of preserving second-order equity and the same distributions property, CE and FE produced similar results. (3) Smoothing helped to improve the preservation of second-order equity and the same distributions property. (4) A higher MC-CR correlation was associated with better preservation of first-order equity for both IRT methods. (5) A higher MC-CR correlation was associated with better preservation of second-order equity for IRT true score equating. (6) A higher MC-CR correlation was associated with better preservation of the same distributions property for IRT observed score equating. (7) The proportion of common items, the proportion of MC score points, and the similarity between forms were not found to be associated with the preservation of the equating properties. These results are interpreted in the context of research literature in this area and suggestions for future research are provided.
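A minimal sketch of the basic equipercentile step that underlies the FE and CE methods discussed above, assuming Python/NumPy and synthetic score distributions; the CINEG weighting, smoothing, and IRT methods of the dissertation are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic total scores on two mixed-format forms (0-60 points).
x_scores = np.clip(rng.normal(35, 8, 5000).round(), 0, 60)
y_scores = np.clip(rng.normal(37, 7, 5000).round(), 0, 60)

def percentile_rank(scores, points):
    """Mid-percentile rank of each score point in an observed distribution."""
    return np.array([(np.mean(scores < p) + 0.5 * np.mean(scores == p)) * 100 for p in points])

points = np.arange(0, 61)
pr_x = percentile_rank(x_scores, points)

# Equipercentile equating: map each Form X score to the Form Y score with the same percentile rank.
equated = np.quantile(y_scores, pr_x / 100.0)

for s in (20, 30, 40, 50):
    print(f"Form X score {s} -> Form Y equivalent {equated[s]:.1f}")
```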
|
230 |
Methods for Meta-Analyses of Rare Events, Sparse Data, and Heterogeneity / Zabriskie, Brinley, 01 May 2019
The vast and complex wealth of information available to researchers often leads to a systematic review, which involves a detailed and comprehensive plan and search strategy with the goal of identifying, appraising, and synthesizing all relevant studies on a particular topic. A meta-analysis, ideally conducted as part of a comprehensive systematic review, statistically synthesizes evidence from multiple independent studies to produce one overall conclusion. The increasingly widespread use of meta-analysis has led to growing interest in meta-analytic methods for rare events and sparse data. Conventional approaches tend to perform very poorly in such settings. Recent work in this area has provided options for sparse data, but these are still often hampered when heterogeneity across the available studies differs based on treatment group. Heterogeneity arises when participants in a study are more correlated than participants across studies, often stemming from differences in the administration of the treatment, study design, or measurement of the outcome. We propose several new exact methods that accommodate this common contingency, providing more reliable statistical tests when such patterns of heterogeneity are observed. First, we develop a permutation-based approach that can also be used as a basis for computing exact confidence intervals when estimating the effect size. Second, we extend the permutation-based approach to the network meta-analysis setting. Third, we develop a new exact confidence distribution approach for effect size estimation. We show that these new methods perform markedly better than traditional methods when events are rare and heterogeneity is present.
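A simplified sketch of the permutation idea, assuming Python/NumPy and hypothetical two-arm study tables: arm labels are permuted within each study (a coarser scheme than the subject-level permutations developed in the dissertation) and a pooled risk-difference statistic is recomputed to build an exact-style reference distribution.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical rare-event data: (events, total) per arm for each study.
treatment = np.array([[1, 120], [0, 80], [2, 200], [1, 150]])
control   = np.array([[4, 118], [2, 85], [5, 195], [3, 160]])

def pooled_risk_difference(trt, ctl):
    return trt[:, 0].sum() / trt[:, 1].sum() - ctl[:, 0].sum() / ctl[:, 1].sum()

observed = pooled_risk_difference(treatment, control)

n_perm = 20000
count_extreme = 0
for _ in range(n_perm):
    swap = rng.integers(0, 2, size=len(treatment)).astype(bool)  # permute arm labels within studies
    trt = np.where(swap[:, None], control, treatment)
    ctl = np.where(swap[:, None], treatment, control)
    if abs(pooled_risk_difference(trt, ctl)) >= abs(observed):
        count_extreme += 1

p_value = (count_extreme + 1) / (n_perm + 1)   # add-one correction for an exact-style test
print(f"observed pooled risk difference = {observed:.4f}, permutation p = {p_value:.4f}")
```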
|