  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Variability of two sampling methods in plaque samples

Hsu, Kuei-Ling C. January 2008 (has links) (PDF)
Thesis (M.S.)--University of Alabama at Birmingham, 2008. / Title from first page of PDF file (viewed Feb. 16, 2009). Includes bibliographical references.
162

Hematological changes in patients with severe malaria from Bangkok Hospital for Tropical Diseases /

Pirker-Krassnig, Daniela Karoline, Polrat Wilairatana, January 2006 (has links) (PDF)
Thematic Paper (M.C.T.M. Clinical Tropical Medicine)--Mahidol University, 2006. / LICL has E-Thesis 0012 ; please contact computer services. LIRV has E-Thesis 0012 ; please contact circulation services.
163

A quantitative bacteriologic evaluation of teeth with necrotic pulp chambers a thesis submitted in partial fulfillment ... endodontics /

Lindemann, Michael B. January 1977 (has links)
Thesis (M.S.)--University of Michigan, 1977.
164

Incidence and clinical relevance of abnormal complete blood counts in survivors of childhood cancer

Long, Zsofia Banhegyi. January 2005 (has links) (PDF)
Thesis (M.D. with Distinction in Research) -- University of Texas Southwestern Medical Center at Dallas, 2005. / Vita. Bibliography: 25-28.
166

Community assault and non-community assault among adults in Khayelitsha: A case count and comparison of injury severity

Forgus, Sheron 23 July 2015 (has links)
An article from this thesis is available in the repository at http://hdl.handle.net/10019.1/97621 / Background: Community Assault (CA) or vigilantism is rife in the township of Khayelitsha. Anecdotal evidence suggests that victims of CA are worse off than other assault cases. However, scientific data on the rate and severity of CA cases is lacking for South Africa. Aims and Objectives: To contribute to CA prevention and management strategies, by estimating the rate of CA among adults in Khayelitsha and comparing the injury severity and survival probability between cases of CA and other assault (non-CA) cases. Methods: We studied 4 health centres in Khayelitsha during July - December 2012. A consecutive case-series was conducted to capture all CA cases during this period and a retrospective folder review was performed on all cases of CA as well as on a control group of non-CA cases to compare injury severity and estimate survival probability. Results: One hundred and forty-eight adult cases of CA occurred (case rate 1.1/1000 person-years) over the study period. The Injury Severity Scores (ISS) in the CA group were significantly higher than in the non-CA group (P<0.001), with a median (Inter Quartile Range) ISS of 3 in CA cases (2-6) and 1 in non-CA cases (1-2). Comparison between the two groups showed that a GCS<15 (20.1% versus 5.4%), referral to the tertiary hospital (33.8% versus 22.6%), and crush syndrome (25.7% versus 0%) were all more common in CA cases. Survival probabilities were similar in both groups: 99.2% in the CA group versus 99.3% in the non-CA group. Conclusion: The rate of CA among adults in Khayelitsha is high, and the severity of injuries sustained by CA victims is substantially higher than in other assault cases.
167

Graph-based algorithms for transistor count minimization in VLSI circuit EDA tools / Algoritmos baseados em grafos para minimização de transistors em ferramentas EDA para circuitos VLSI

Matos, Jody Maick Araujo de January 2014 (has links)
This master's thesis introduces a set of graph-based algorithms for obtaining reduced transistor count VLSI circuits using simple cells. The algorithms focus on minimizing the node count of AIG representations and on mapping the optimized AIG to simple cells (NAND2 and NOR2) with a minimal number of inverters.
Because of the node-count minimization, the optimized AIG typically exhibits a high degree of logic sharing, which can yield intermediate circuits containing cells whose fanout is infeasible in current technology nodes. To fix these occurrences, the intermediate circuit is passed through a fanout-limitation algorithm. The proposed algorithms were applied to a set of benchmark circuits, and the results show the usefulness of the method: the resulting circuits have, on average, 32% fewer transistors than the previous reference on transistor count using simple cells. Additionally, when the presented results are compared in terms of transistor count against works advocating complex cells, they show that previous approaches are sometimes far from the minimum transistor count that can be obtained through efficient use of a reduced cell library composed of only a few simple cells. In many cases, the simple-cell circuits obtained with the proposed algorithms have a lower transistor count than previously published results using complex (static CMOS and PTL) cells.
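The node-count minimization described in this abstract depends on structural sharing in the AIG. As a rough illustration only (not the thesis' actual algorithms), a toy AND-Inverter Graph with structural hashing shows how structurally identical AND nodes are stored once, so logic sharing falls out of the representation; all names here are invented for the sketch.

```python
class AIG:
    """Toy AND-Inverter Graph with structural hashing.

    A literal is a (node_id, inverted) pair, so inversion is free
    on an edge; only AND nodes are counted.
    """

    def __init__(self):
        self.nodes = {}    # (lit_a, lit_b) -> AND node id
        self.next_id = 1

    def new_input(self):
        nid = self.next_id
        self.next_id += 1
        return (nid, False)

    def and_gate(self, a, b):
        key = tuple(sorted((a, b)))   # canonical order: AND(a,b) == AND(b,a)
        if key not in self.nodes:     # structural hashing: reuse shared nodes
            self.nodes[key] = self.next_id
            self.next_id += 1
        return (self.nodes[key], False)

    def invert(self, lit):
        return (lit[0], not lit[1])


g = AIG()
x, y, z = g.new_input(), g.new_input(), g.new_input()
f1 = g.and_gate(g.and_gate(x, y), z)   # (x AND y) AND z
f2 = g.and_gate(g.and_gate(y, x), z)   # same function, operands swapped
print(len(g.nodes))                    # 2 AND nodes, not 4: x AND y is shared
```

Under a mapping like the one the abstract sketches, each AND node would then become a NAND2 plus an inverter (or a NOR2 on inverted inputs), with inverter count minimized afterwards; the shared node is mapped once rather than twice.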
168

The game of poker: a real situation for giving meaning to combinatorics concepts / O jogo de pôquer : uma situação real para dar sentido aos conceitos de combinatória

Chilela, Ricardo Rodrigues January 2013 (has links)
This research was conducted to understand how the teaching and learning of Combinatorics takes place, in the particular case of problems involving counting groupings of objects, which teachers and students consider difficult, and to design and try out a didactic proposal with the potential to bring something new to the process. Based on Vergnaud's Theory of Conceptual Fields, the schemes of a group of high school students were outlined: they solve direct-counting problems, but not problems that require multiplication and division.
Analysis of other related work suggests that teaching has better chances when it starts from problem solving rather than from formulas and definitions. As a result of this study, a teaching sequence starting from the experience of the poker game was organized and put into practice. The deck (without jokers) is understood as a set of 52 objects, from which groupings of 5 objects ("hands") are formed. The problems posed by the game can be solved with the four arithmetic operations. At the end, the students' schemes had evolved: they began to use multiplication meaningfully and to organize their solutions graphically in an adequate way. Errors in the use of division still appeared, however, and these were analyzed in order to offer the teacher/reader an understanding of the students' difficulties.
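The abstract's claim that the poker problems can be solved with the four arithmetic operations can be illustrated by the classic count of 5-card hands dealt from the 52-card deck (no jokers); this worked example is an illustration, not taken from the thesis.

```python
# Number of distinct 5-card poker hands from a 52-card deck,
# using only multiplication and division: count the ordered
# deals, then divide out the orderings of the same 5 cards.
ordered_draws = 52 * 51 * 50 * 49 * 48   # ordered ways to deal 5 cards
orderings_per_hand = 5 * 4 * 3 * 2 * 1   # arrangements of the same 5 cards
hands = ordered_draws // orderings_per_hand
print(hands)  # 2598960
```

The division step is exactly where the abstract reports lingering student errors: forgetting to divide out the 5! orderings counts each hand 120 times.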
169

Effects of template mass, complexity, and analysis method on the ability to correctly determine the number of contributors to DNA mixtures

Alfonse, Lauren Elizabeth 08 April 2016 (has links)
In traditional forensic DNA casework, the inclusion or exclusion of individuals who may have contributed to an item of evidence may be dependent upon the assumption on the number of individuals from which the evidence arose. Typically, the determination of the minimum number of contributors (NOC) to a mixture is achieved by counting the number of alleles observed above a given analytical threshold (AT); this technique is known as maximum allele count (MAC). However, advances in polymerase chain reaction (PCR) chemistries and improvements in analytical sensitivities have led to an increase in the detection of complex, low template DNA (LtDNA) mixtures for which MAC is an inadequate means of determining the actual NOC. Despite the addition of highly polymorphic loci to multiplexed PCR kits and the advent of interpretation software that deconvolves DNA mixtures, a gap remains in the DNA analysis pipeline, where an effective method of determining the NOC needs to be established. The emergence of NOCIt, a computational tool that provides a probability distribution on the NOC, may serve as a promising alternative to traditional, threshold-based methods. Utilizing user-provided calibration data consisting of single source samples of known genotype, NOCIt calculates the a posteriori probability (APP) that an evidentiary sample arose from 0 to 5 contributors. The software models baseline noise, reverse and forward stutter proportions, stutter and allele dropout rates, and allele heights. This information is then utilized to determine whether the evidentiary profile originated from one or many contributors. In short, NOCIt provides information not only on the likely NOC, but on whether more than one value may be deemed probable. In the latter case, it may be necessary to modify downstream interpretation steps such that multiple values for the NOC are considered or the conclusion that most favors the defense is adopted.
Phase I of this study focused on establishing the minimum number of single source samples needed to calibrate NOCIt. Once determined, the performance of NOCIt was evaluated and compared to that of two other methods: the maximum likelihood estimator (MLE) -- accessed via the forensim R package, and MAC. Fifty (50) single source samples proved to be sufficient to calibrate NOCIt, and results indicate NOCIt was the most accurate method of the three. Phase II of this study explored the effects of template mass and sample complexity on the accuracy of NOCIt. Data showed that the accuracy decreased as the NOC increased: for 1- and 5-contributor samples, the accuracy was 100% and 20%, respectively. The minimum template mass from any one contributor required to consistently estimate the true NOC was 0.07 ng -- the equivalent of approximately 10 cells' worth of DNA. Phase III further explored NOCIt and was designed to assess its robustness. Because the efficacy of determining the NOC may be affected by the PCR kit utilized, the results obtained from NOCIt analysis of 1-, 2-, 3-, 4-, and 5-contributor mixtures amplified with AmpFLSTR® Identifiler® Plus and PowerPlex® 16 HS were compared. A positive correlation was observed for all NOCIt outputs between kits. Additionally, NOCIt was found to result in increased accuracies when analyzed with 1-, 3-, and 4-contributor samples amplified with Identifiler® Plus and with 5-contributor samples amplified with PowerPlex® 16 HS. The accuracy rates obtained for 2-contributor samples were equivalent between kits; therefore, the effect of amplification kit type on the ability to determine the NOC was not substantive. Cumulatively, the data indicate that NOCIt is an improvement over traditional methods of determining the NOC and results in high accuracy rates with samples containing sufficient quantities of DNA.
Further, the results of investigations into the effect of template mass on the ability to determine the NOC may serve as a caution that forensic DNA samples containing low-target quantities may need to be interpreted using multiple or different assumptions on the number of contributors, as the assumption on the number of contributors is known to affect the conclusion in certain casework scenarios. As a significant degree of inaccuracy was observed for all methods of determining the NOC at severe low template amounts, the data presented also challenge the notion that any DNA sample can be utilized for comparison purposes. This suggests that the ability to detect extremely complex, LtDNA mixtures may not be commensurate with the ability to accurately interpret such mixtures, despite critical advances in software-based analysis. In addition to the availability of advanced comparison algorithms, limitations on the interpretability of complex, LtDNA mixtures may also be dependent on the amount of biological material present on an evidentiary substrate.
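The maximum allele count (MAC) method that this abstract treats as the traditional baseline can be sketched as follows: count alleles above the analytical threshold at each locus, and since a diploid contributor shows at most two alleles per locus, the minimum NOC is the ceiling of the largest per-locus count divided by two. The locus names, peak heights, and threshold below are hypothetical, for illustration only.

```python
import math

def min_contributors_mac(profile, analytical_threshold):
    """Minimum number of contributors by maximum allele count (MAC).

    `profile` maps locus name -> {allele label: peak height (RFU)}.
    Only peaks at or above the analytical threshold are counted;
    the minimum NOC is ceil(n / 2) for the largest per-locus count n.
    """
    max_alleles = 0
    for peaks in profile.values():
        n = sum(1 for height in peaks.values() if height >= analytical_threshold)
        max_alleles = max(max_alleles, n)
    return math.ceil(max_alleles / 2)

# Hypothetical 2-locus profile; the 40 RFU peak falls below threshold.
profile = {
    "D8S1179": {"12": 900, "13": 850, "14": 120, "15": 40},
    "TH01":    {"6": 700, "9.3": 650},
}
print(min_contributors_mac(profile, analytical_threshold=50))  # 2
```

The sketch also makes the abstract's criticism concrete: allele sharing and dropout in low template mixtures shrink the observed allele counts, so MAC systematically underestimates the true NOC, which is the gap NOCIt's probabilistic model addresses.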
170

Clinical judgement vs. evidence-based practice: two models to predict postoperative hematocrit following uncomplicated hysterectomy

Mayer, Sarah A. 13 July 2017 (has links)
BACKGROUND: Hysterectomies are one of the most frequently performed surgical procedures in the United States. There are a wide variety of diagnoses that require a patient to obtain this procedure, but the majority of hysterectomies are performed for benign indications. Currently, gynecologists do not follow a standardized protocol surrounding postoperative laboratory ordering, and healthcare professionals can order a wide range of tests as often as they choose. Extraneous laboratory orders are disruptive to the patients’ well-being and risk their health following surgery. These orders are costly for hospital systems, take up precious time of hospital employees, and influence the course of patient treatment only in extremely rare circumstances. There are few studies that develop exclusion criteria for patients who may not require a laboratory test following surgery. Though systems to predict postoperative hematocrit have been created, they are complicated and difficult to use. The few studies that were performed are yet to be accepted by the medical community, in part because of their limited scope. This study will be the first to incorporate the results of robotic surgery in the analysis. OBJECTIVE: The purpose of this study is to determine concrete parameters to indicate that a patient is in need of postoperative laboratory work and at risk for anemia or transfusion. We aim to develop two comprehensive models that guide surgical practitioners to identify the cases which do not require laboratory data. METHODS: A total of 1027 gynecologic surgeries were performed at Saint Francis Hospital and Medical Center between April 1, 2014 and May 31, 2016. This retrospective study extracted data from EPIC EMR according to 42 variables preconceived to be the leading indicators of postoperative hematocrit and overall healing. 
Five healthcare professionals were surveyed to identify the variables that influence their postsurgical patient assessments and their decisions to order blood testing. This information was developed into score sheets with differing levels of stringency. Correlation highlighted 14 of the initial 42 variables as contributors to postoperative hematocrit and an equation model was built. Stepwise linear regression was used for univariate and multivariate analyses, from which we created our equation to predict all patients’ postoperative hematocrit. RESULTS: Out of the 1027 initial cases, a total of 602 cases were identified as hysterectomies for benign indications. Survey data gave the highest value to urine output and heart rate as key indicators of postoperative anemia. From the survey data, two clinical scoring sheets with differing stringency were created to guide practitioner laboratory ordering. These sheets gave parameters of heart rate and urine output the largest correlative weight in determining postoperative hematocrit. However, based on regression analysis, parameters of age (AGE), body mass index (BMI), preoperative platelet count (PPC), estimated blood loss during surgery (IO EBL), preoperative hematocrit (PHCT) and postoperative fluid bolus orders (POSTOP FB) proved to be the key variables impacting postoperative hematocrit (POSTOP HCT). These items were translated into the equation: POSTOP HCT = 22.51 – 0.40*POSTOP FB – 0.01*IO EBL + 0.25*PHCT + 0.09*BMI + 0.06*AGE – 0.01*PPC (R-squared = 0.310). CONCLUSIONS: This study aims to decrease superfluous laboratory testing, as well as to contribute to a larger conversation considering the potential merits of clinical judgement in a data-driven healthcare system. We have created a number of comparable strategies in order to reduce the number of unnecessary blood draws: two clinical scoring sheets and an equation. The score sheets indicate when to order additional testing.
These sheets are representative of a range of surgical practitioners’ conventional clinical judgement. The equation serves as an evidence-based guide for determining postoperative hematocrit following benign gynecologic surgery. These predictive mechanisms will be validated and a superior method determined as our research continues with prospective application. We eventually expect to use the most accurate mechanism to reduce postoperative blood testing following all surgeries.
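The regression equation quoted in this abstract translates directly into code. The coefficients and variable names come from the abstract; the example inputs are hypothetical, and the units for the platelet count are not stated in the abstract.

```python
def predicted_postop_hct(age, bmi, preop_platelets, io_ebl, preop_hct, postop_boluses):
    """Predicted postoperative hematocrit (POSTOP HCT) from the
    abstract's regression equation (R-squared = 0.310).

    Arguments mirror the abstract's variables: AGE (years),
    BMI (kg/m^2), PPC (preoperative platelet count), IO EBL
    (estimated blood loss during surgery), PHCT (preoperative
    hematocrit, %), POSTOP FB (postoperative fluid bolus orders).
    """
    return (22.51
            - 0.40 * postop_boluses
            - 0.01 * io_ebl
            + 0.25 * preop_hct
            + 0.09 * bmi
            + 0.06 * age
            - 0.01 * preop_platelets)

# Hypothetical patient: age 45, BMI 30, platelets 250, 100 units of
# estimated blood loss, preoperative hematocrit 40%, no fluid boluses.
print(round(predicted_postop_hct(45, 30, 250, 100, 40, 0), 2))  # 34.41
```

Note the modest R-squared of 0.310: the equation is presented in the abstract as a guide for deciding when laboratory work is unnecessary, not as a precise substitute for measurement.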
