  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Poetry of the American suburbs

Monacell, Peter. January 2004 (has links)
Thesis (M.A.)--University of Missouri-Columbia, 2004. / Typescript. Includes bibliographical references (leaves 67-80). Also available on the Internet.
52

Revealing social networks' missed behavior: detecting reactions and time-aware analyses / Revelando o comportamento perdido em redes sociais: detectando reações e análises temporais

Samuel Martins Barbosa Neto 29 May 2017 (has links)
Online communities provide a fertile ground for analyzing people's behavior and improving our understanding of social processes. For instance, when modeling social interaction online, it is important to understand when people are reacting to each other. Also, since both people and communities change over time, we argue that analyses of online communities that take time into account will lead to deeper and more accurate results. In many cases, however, user behavior can easily be missed: users react to content in many more ways than explicit indicators capture (such as likes on Facebook or replies on Twitter), and poorly aggregated temporal data can hide, misrepresent, and even lead to wrong conclusions about how users are evolving. To address the problem of detecting non-explicit responses, we present a new approach that uses tf-idf similarity between a user's own tweets and recent tweets by people they follow. Based on a month's worth of posting data from 449 ego networks on Twitter, this method demonstrates that at least 11% of reactions are likely not captured by the explicit reply and retweet mechanisms. Further, these uncaptured reactions are not evenly distributed between users: some users, who create replies and retweets without using the official interface mechanisms, are much more responsive to followees than they appear. This suggests that detecting non-explicit responses is an important consideration in mitigating biases and building more accurate models when using these markers to study social interaction and information diffusion. We also address the problem of user evolution in Reddit, based on comment and submission data from 2007 to 2014. Even using one of the simplest temporal divisions between users (yearly cohorts), we find wide differences in people's behavior, including comment activity, effort, and survival. Furthermore, not accounting for time can lead us to misinterpret important phenomena. For instance, we observe that average comment length decreases over any fixed period of time, but comment length in each cohort of users steadily increases during the same period after an abrupt initial drop, an example of Simpson's Paradox. Dividing cohorts into sub-cohorts based on survival time in the community provides further insights; in particular, longer-lived users start at a higher activity level and make more and shorter comments than those who leave earlier. These findings give more insight into user evolution in Reddit in particular, and raise a number of interesting questions around studying online behavior going forward.
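The core detection idea described in the abstract (tf-idf similarity between a user's tweet and recent followee tweets) can be sketched as follows. The tweets, the similarity threshold, and the simple whitespace tokenizer are all hypothetical placeholders, not the thesis's actual data or tuned parameters:

```python
import math
from collections import Counter

# Toy corpus: a user's tweet and recent tweets from accounts they follow.
# All text is hypothetical; the 449-network Twitter dataset is not reproduced here.
followee_tweets = [
    "the new transit plan will cut commute times downtown",
    "great pasta recipe tonight with garlic and basil",
    "transit funding vote happens tomorrow at city hall",
]
user_tweet = "glad the transit plan finally cuts commute times"

def tf(text):
    """Term frequency of each word in a single document."""
    words = text.lower().split()
    return {w: c / len(words) for w, c in Counter(words).items()}

def idf(docs):
    """Inverse document frequency over the whole corpus."""
    n = len(docs)
    vocab = {w for d in docs for w in d.lower().split()}
    return {w: math.log(n / sum(1 for d in docs if w in d.lower().split()))
            for w in vocab}

def tfidf_vec(text, idf_map):
    return {w: f * idf_map.get(w, 0.0) for w, f in tf(text).items()}

def cosine(a, b):
    dot = sum(v * b.get(w, 0.0) for w, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

idf_map = idf(followee_tweets + [user_tweet])
scores = [cosine(tfidf_vec(user_tweet, idf_map), tfidf_vec(t, idf_map))
          for t in followee_tweets]
best = max(range(len(scores)), key=scores.__getitem__)

# Flag a likely non-explicit reaction when similarity exceeds a chosen cutoff.
THRESHOLD = 0.15  # illustrative; in practice this would be tuned empirically
if scores[best] > THRESHOLD:
    print("possible non-explicit reaction to:", followee_tweets[best])
```

Here the user's tweet shares several content words with the first followee tweet, so that pairing scores highest and is flagged as a candidate uncaptured reaction.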
53

Jämförelse av ejektionsfraktion vid myokardscintigrafi i vila (GSPECT) och 2D ekokardiografi

Kochai, Fahrie January 2020 (has links)
Ejection fraction (EF) is a measure of global systolic left ventricular function. EF is an important parameter because therapeutic and prognostic decisions are based, among other things, on the left ventricular ejection fraction, which is normally ≥ 55% of end-diastolic volume. Left ventricular EF can be determined non-invasively by two-dimensional echocardiography (2D-ECHO) and by gated single photon emission computed tomography at rest (GSPECT). With 2D-ECHO, EF was obtained using the biplane Simpson method applied in the four-chamber and two-chamber views at end diastole and end systole. With GSPECT, EF values were obtained from the QPS and QGS images after automatic delineation of the endocardial contours carried out with Hermes Medical Solutions software. The purpose of this study was to determine whether there is a significant difference in measured EF between GSPECT at rest and 2D-ECHO. The study included 30 participants. Participants' data were de-identified and obtained from previously performed 2D-ECHO and GSPECT examinations at rest, carried out no more than 6 months apart by licensed biomedical scientists. The study yielded p < 0.001 and r = 0.65, indicating a strong and significant correlation; however, agreement between the methods is limited, as the scatter plot gives r² = 0.42. The Bland-Altman plot indicates good agreement between the methods, as do the means and standard deviations for 2D-ECHO and GSPECT, 50.1 ± 10.8 and 49.1 ± 15.1 respectively. The study showed good overall correlation between EF derived from 2D-ECHO and GSPECT. In conclusion, the methods are comparable when measuring EF.
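The biplane Simpson method mentioned in this record stacks elliptical disks whose orthogonal diameters come from the two apical views; EF then follows from the end-diastolic and end-systolic volumes. A minimal numerical sketch, with entirely synthetic disk diameters (not patient data):

```python
import math

def simpson_biplane_volume(d4, d2, length, n_disks=20):
    """Modified Simpson (method of disks): sum n elliptical disks whose
    orthogonal diameters d4[i], d2[i] come from the apical four- and
    two-chamber views; each disk has thickness length / n_disks."""
    assert len(d4) == len(d2) == n_disks
    h = length / n_disks
    return sum(math.pi / 4.0 * a * b * h for a, b in zip(d4, d2))

# Hypothetical diameters in cm (20 disks, half-sine chamber profile).
edv_d4 = [4.0 * math.sin(math.pi * (i + 0.5) / 20) for i in range(20)]
edv_d2 = list(edv_d4)
esv_d4 = [3.0 * math.sin(math.pi * (i + 0.5) / 20) for i in range(20)]
esv_d2 = list(esv_d4)

edv = simpson_biplane_volume(edv_d4, edv_d2, length=9.0)   # end diastole
esv = simpson_biplane_volume(esv_d4, esv_d2, length=7.5)   # end systole
ef = 100.0 * (edv - esv) / edv
print(f"EDV {edv:.1f} mL, ESV {esv:.1f} mL, EF {ef:.1f}%")
```

With these toy diameters the computed EF lands near the 55% normality cutoff discussed in the abstract, which illustrates why small measurement differences between modalities matter clinically.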
54

Martin Simpson, “Grablegung und Auferstehung” (Übersetzung)

Böhnke, Dietmar 11 July 2019 (has links)
No description available.
55

The Signifying Storyteller: Harriette Simpson Arnow’s “The Goat Who Was a Cow”

Sutton, Matthew D. 01 January 2018 (has links)
No description available.
56

[en] ACCURATE VOLUME RENDERING BASED ON ADAPTIVE NUMERICAL INTEGRATION / [pt] VISUALIZAÇÃO VOLUMÉTRICA PRECISA BASEADA EM INTEGRAÇÃO NUMÉRICA ADAPTATIVA

LEONARDO QUATRIN CAMPAGNOLO 28 January 2016 (has links)
[en] One of the main challenges in volume rendering algorithms is how to compute the volume rendering integral accurately while maintaining good performance. Commonly, numerical methods use equidistant samples to approximate the integral and do not include any error-estimation strategy to control accuracy. As a solution, adaptive numerical methods can be used, because they can adapt the integration step size according to an estimated numerical error. On CPU, adaptive integration algorithms are usually implemented recursively. On GPU, however, it is desirable to eliminate recursive algorithms. In this work, an adaptive and iterative integration strategy is presented to evaluate the volume rendering integral for regular volumes, maintaining control of the step size for both the internal and external integrals. A set of computational experiments was performed, comparing both accuracy and efficiency against Riemann summation with a uniform step size. The proposed algorithm generates accurate results with competitive performance. The comparisons were made using both CPU and GPU implementations.
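The iterative (non-recursive) adaptive idea described in the abstract can be sketched in a scalar setting. This is not the thesis's actual renderer: the error estimate here compares one trapezoid step against two half steps, and the extinction profile along the ray is a made-up Gaussian "feature":

```python
import math

def adaptive_integrate(f, a, b, tol=1e-6, h0=0.5, h_min=1e-4):
    """Iterative adaptive integration: halve the step when the local error
    estimate exceeds the tolerance, grow it back when the estimate is
    comfortably small. No recursion, so the loop maps naturally to GPU-style
    control flow."""
    t, total, h = a, 0.0, h0
    while t < b:
        h = min(h, b - t)
        one = 0.5 * h * (f(t) + f(t + h))                       # one step
        half = 0.25 * h * (f(t) + 2.0 * f(t + 0.5 * h) + f(t + h))  # two halves
        err = abs(half - one)
        if err <= tol * h or h <= h_min:
            total += half
            t += h
            if err < 0.1 * tol * h:
                h *= 2.0   # step is overly cautious; grow it
        else:
            h *= 0.5       # too coarse near a feature; refine and retry
    return total

# Toy extinction profile along a ray: a sharp Gaussian feature in the volume.
tau = lambda s: math.exp(-((s - 2.0) ** 2) / 0.02)
approx = adaptive_integrate(tau, 0.0, 4.0)
exact = math.sqrt(math.pi * 0.02)  # closed-form Gaussian integral
print(approx, exact)
```

The step size shrinks automatically around the narrow feature and expands again in the flat regions, which is exactly the behavior a uniform-step Riemann sum cannot provide without oversampling everywhere.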
57

Model Based Analysis of Clonal Developments Allows for Early Detection of Monoclonal Conversion and Leukemia

Baldow, Christoph, Thielecke, Lars, Glauche, Ingmar 28 March 2017 (has links) (PDF)
The availability of several methods to unambiguously mark individual cells has strongly fostered the understanding of clonal developments in hematopoiesis and other stem-cell-driven regenerative tissues. While cellular barcoding is the method of choice for experimental studies, patients who underwent gene therapy carry a unique insertional mark within the transplanted cells, originating from the integration of the retroviral vector. Close monitoring of such patients gives access to their clonal dynamics; however, early detection of events that predict monoclonal conversion, and potentially the onset of leukemia, would be beneficial for treatment. We developed a simple mathematical model of a self-stabilizing hematopoietic stem cell population to generate a wide range of possible clonal developments, reproducing typical, experimentally and clinically observed scenarios. We use the resulting model scenarios to suggest and test a set of statistical measures that should allow for the interpretation and classification of relevant clonal dynamics. Apart from assessing several established diversity indices, we suggest a measure that quantifies the extent to which the increase in the size of one clone is attributable to the total loss in size of all other clones. By evaluating the change in relative clone sizes between consecutive measurements, the suggested measure, referred to as the maximum relative clonal expansion (mRCE), proves to be highly sensitive in detecting rapidly expanding cell clones prior to their dominant manifestation. This predictive potential makes the mRCE a suitable means for the early recognition of leukemogenesis, especially in closely monitored gene therapy patients. Our model-based approach illustrates how simulation studies can actively support the design and evaluation of preclinical strategies for the analysis and risk evaluation of clonal developments.
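One plausible reading of the mRCE definition above (the exact formula is in the paper and may differ) is: for each clone, the gain in relative abundance between two consecutive measurements, normalized by the total relative loss of all shrinking clones, taking the maximum over clones. A sketch with hypothetical clone-size data:

```python
def mrce(prev, curr):
    """Maximum relative clonal expansion (sketch, following the verbal
    definition): how much of the total relative loss of shrinking clones
    is absorbed by the single fastest-growing clone."""
    p = [x / sum(prev) for x in prev]          # relative sizes before
    c = [x / sum(curr) for x in curr]          # relative sizes after
    deltas = [ci - pi for pi, ci in zip(p, c)]
    total_loss = sum(-d for d in deltas if d < 0)
    if total_loss == 0:
        return 0.0
    return max(max(d, 0.0) / total_loss for d in deltas)

# Hypothetical time series: clone 0 expands at all other clones' expense,
# the pattern that precedes monoclonal conversion.
t0 = [100, 100, 100, 100]
t1 = [400, 80, 60, 60]
converting = mrce(t0, t1)   # one clone absorbs the entire loss -> 1.0

# A balanced step: several clones drift, no single dominant gainer.
stable = mrce([100, 100, 100, 100], [120, 110, 90, 80])  # ~0.67
print(converting, stable)
```

A value near 1 signals that a single clone accounts for essentially all displacement of the others, which is the early-warning pattern the measure is designed to catch.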
58

UTFÖRANDE AV EJEKTIONFRAKTIONSMÄTNING MED HJÄLP AV SIMPSON METOD AV EN STUDENT OCH EN ERFAREN BIOMEDICINSK ANALYTIKER / PERFORMANCE OF EJECTION FRACTION MEASUREMENT WITH SIMPSON METHOD BY A STUDENT AND AN EXPERIENCED BIOMEDICAL SCIENTIST

Flamarz, Diana January 2020 (has links)
Echocardiography is an important and common method for examining the heart, used to assess cardiac function when investigating heart disease. In an echocardiographic examination, the heart's flow velocities, contractility (pumping capacity), wall thickness, and inner diameter can be examined. All of these examinations rely on interpretation of the ultrasonic waves that the transducer transmits and receives. The transducer consists of piezoelectric crystals that can both transmit and receive ultrasonic waves with frequencies above 20 kHz. The purpose of the study was to compare measurement of the left ventricular ejection fraction (LVEF) between an experienced biomedical scientist (BMA) and a student, and to see how image quality affects the result. The measurements were performed using the Simpson method on apical four-chamber and apical two-chamber images. The results were analyzed with a paired t-test to see whether there is a significant difference between the performance of the BMA and the student. The study included 30 patients, both heart-healthy and cardiac patients of both genders. The results showed a significant difference in LVEF measurements between the BMA and the student, with lower values measured by the student.
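The paired t-test used in this comparison can be sketched with the standard library alone. The LVEF readings below are invented for illustration (the thesis used 30 real patients); the test statistic is mean difference over its standard error:

```python
import math
from statistics import mean, stdev

# Hypothetical paired LVEF readings (%) for the same 10 studies.
bma =     [58, 62, 55, 60, 49, 65, 57, 53, 61, 59]
student = [55, 60, 54, 56, 47, 62, 55, 50, 58, 57]

diffs = [a - b for a, b in zip(bma, student)]
n = len(diffs)
# Paired t statistic: mean of the per-patient differences divided by the
# standard error of that mean.
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Compare |t| against the two-sided 5% critical value for n-1 = 9 df (2.262);
# exceeding it indicates a significant BMA-vs-student difference.
print(f"mean diff {mean(diffs):.2f}, t = {t_stat:.2f}")
```

In this toy data the student reads systematically lower, so the positive mean difference drives a large t value, mirroring the direction of the study's finding.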
59

Computerised methods for selecting a small number of single nucleotide polymorphisms that enable bacterial strain discrimination

Robertson, Gail Alexandra January 2006 (has links)
The possibility of identifying single nucleotide polymorphisms (SNPs) useful for rapid bacterial typing was investigated. Neisseria meningitidis was chosen for modelling the approach, since informative SNPs could be found in the sequence data available for multi-locus sequence typing (MLST) at http://www.mlst.net. The hypothesis tested was that a small number of SNPs located within the seven gene fragments sequenced for MLST provide information equivalent to MLST. Preliminary investigations revealed that a small number of SNPs could be used to discriminate sequence types (STs) of clinical interest with high resolution. Laboratory procedures demonstrated that SNP fingerprinting of N. meningitidis isolates is achievable, and further tests showed that laboratory identification of a defining SNP in the genome of isolates was a practical method of obtaining relevant typing information. Identifying the most discriminating SNPs among the ever-increasing amount of MLST sequence data created the need for computer-based assistance. Two methods of SNP selection devised by the author of this thesis were translated into computer-based algorithms by contributing team members, and software for two computer programs was produced. The algorithms facilitate the optimal selection of SNPs useful for (1) distinguishing specific STs and (2) differentiating non-specific STs. Input information can be obtained from the MLST database, and consequently the programs can be applied to any bacterial species for which MLST data have been entered. The two algorithms were designed to serve contrasting purposes. The first was to determine the ST identity of isolates from an outbreak of disease; in this case, isolates would be tested for membership in any of the STs known to be associated with disease. It was shown that one SNP per ST could distinguish each of four hyperinvasive STs of N. meningitidis from between 92.5% and 97.5% of all other STs; with two SNPs per ST, between 96.7% and 99.0% discrimination is achieved. The SNPs were selected from MLST loci with the assistance of the first algorithm, which scores SNPs according to the number of base mismatches in a sequence alignment between an allele of an ST of interest and alleles belonging to all other STs at a specified locus. The second purpose was to determine whether or not isolates from different sources belong to the same ST, regardless of their actual ST identity. It was shown that with seven SNPs, four sample STs of N. meningitidis could, on average, be discriminated from 97.1% of all other STs. These SNPs were selected with the aid of the second algorithm, which scores SNPs at MLST loci by the relative frequency of each nucleotide base in a sequence alignment, as a measure of the extent of their polymorphism. A third algorithm for selecting SNPs is also discussed; by altering the method of scoring SNPs, it is possible to overcome the limitations inherent in the two algorithms used for finding SNPs. In addition, the third approach caters for finding SNPs that distinguish members of a complex from non-members.
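The first algorithm's scoring rule (count base mismatches between the target ST's allele and every other ST's allele at each alignment position) can be sketched directly. The aligned sequences below are invented toy alleles, not real N. meningitidis MLST data:

```python
# Toy aligned allele sequences at one MLST locus; the first is the allele of
# the ST of interest. All bases are hypothetical.
target = "ACGTACGT"
others = ["ACGAACGT", "ACGAACTT", "TCGAACGT", "ACGAGCGT"]

def score_positions(target, others):
    """Algorithm-1 style scoring: each alignment position gets one point per
    other allele whose base differs from the target's base there."""
    return [sum(1 for o in others if o[i] != target[i])
            for i in range(len(target))]

scores = score_positions(target, others)
best = max(range(len(scores)), key=scores.__getitem__)
discrim = scores[best] / len(others)  # fraction of other STs distinguished
print(f"best SNP at position {best}, distinguishes {discrim:.0%} of other STs")
```

Here position 3 differs from every other allele, so that single SNP alone distinguishes the target ST from all four others, the same effect the thesis quantifies as 92.5-97.5% discrimination per SNP at real loci.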
60

Speaking out : class, race, and gender in the writings of Ruth McEnery Stuart, Edith Summers Kelley, and Harriette Simpson Arnow /

Reynolds, Claire E. January 2008 (has links)
Thesis (Ph.D.) -- University of Rhode Island, 2008. / Typescript. Includes bibliographical references (leaves 161-168).
