151

ECG event detection & recognition using time-frequency analysis / Ανίχνευση & αναγνώριση συμβάντων ΗΚΓ με ανάλυση χρόνου-συχνότητας

Νεοφύτου, Νεόφυτος 09 July 2013 (has links)
Electrocardiography (ECG) has been established as one of the most useful diagnostic tools in medicine and is critical in the management of various heart conditions. Automated or semi-automated ECG analysis algorithms are expected to play an important role in the utilization of ECG data. The correct identification of the QRS complexes is a fundamental step in every ECG analysis method. A major problem often encountered in automatic QRS detection is the presence of artifacts in the ECG data, which cause considerable alterations to the signal. Common filters can smooth the effect of the artifacts, but they cannot eliminate them because their spectral content overlaps with that of the signal components. The objective of this thesis was to develop a method, based on time-frequency analysis, able to automatically detect and remove artifacts in order to increase the reliability of automatic QRS detection. The ECG data used for this purpose were taken from the Physionet library, specifically from the MIMIC II database. The data in this database were acquired from ICU patients and contain various types of rhythms as well as artifacts. First, a graphical user interface (GUI) was developed for manually annotating ECG data and was used to create the ground truth for testing the methods developed. The time-frequency analysis of the ECG data was based on a time-varying autoregressive (AR) model whose solutions were obtained using Burg's method. Several factors that affect the effectiveness of the method were investigated in order to optimize the algorithm experimentally. The implemented algorithm performs three main functions: "Artifact Hypothesis Testing," "Artifact Detection and Removal," and "QRS Complex Detection." The first step, "Artifact Hypothesis Testing," examines whether the signal contains any artifact and achieves a correct classification rate of 95.56%. The second step, "Artifact Detection and Removal," detects and removes artifact regions with an accuracy of 95.60%, evaluated per signal sample classified as artifact or not. The final step, "QRS Complex Detection," correctly identified 92% of QRS complexes (322 of 335 annotated QRS complexes). Finally, the proposed method was compared with one of the most commonly used methods in ECG analysis, wavelet transform analysis (WTA), on exactly the same dataset. WTA reached an overall score of 65.3%, mainly because of the large number of false-positive detections in artifact regions.
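The core computation in the approach above is a short-window, time-varying AR spectrum solved with Burg's recursion. As a minimal sketch (not the thesis's implementation), the following Python assumes a single-lead ECG array `ecg` sampled at `fs` Hz; the window length, hop, model order and frequency grid are illustrative choices, not tuned values from the study.

```python
import numpy as np

def arburg(x, order):
    """Estimate AR coefficients and residual power with Burg's recursion."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])                  # AR polynomial with leading 1
    e = np.dot(x, x) / len(x)            # prediction error power
    f, b = x.copy(), x.copy()            # forward / backward prediction errors
    for _ in range(order):
        fm, bm = f[1:], b[:-1]
        k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([a, [0.0]])[::-1]
        e *= 1.0 - k * k
        f, b = fm + k * bm, bm + k * fm
    return a, e

def ar_psd(a, e, freqs, fs):
    """Evaluate the AR power spectrum at the given frequencies (Hz)."""
    w = 2.0 * np.pi * freqs / fs
    denom = np.exp(-1j * np.outer(w, np.arange(len(a)))) @ a
    return e / np.abs(denom) ** 2

def burg_tf_map(ecg, fs, win_s=0.25, hop_s=0.05, order=8, nfreq=64):
    """Sliding-window Burg spectrogram: one AR spectrum per analysis window."""
    n, hop = int(win_s * fs), int(hop_s * fs)
    freqs = np.linspace(0.5, fs / 2.0, nfreq)
    spectra = []
    for start in range(0, len(ecg) - n, hop):
        seg = ecg[start:start + n]
        seg = seg - seg.mean()           # remove the local baseline
        a, e = arburg(seg, order)
        spectra.append(ar_psd(a, e, freqs, fs))
    return np.array(spectra), freqs      # shape: (windows, nfreq)
```

A QRS detector could then threshold the energy of this map in a low-frequency band, and windows whose spectra deviate sharply from their neighbours could be flagged as artifact hypotheses, in the spirit of the three-stage pipeline described in the abstract.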
152

La génération des connaissances et la conception des artefacts visuels : le cas de l'aménagement des espaces de travail dans les entreprises / Knowledge generation and the design of visual artifacts: the case of workspace planning in companies

Chouki, Mourad 26 November 2012 (has links)
Today, the growth of project-based and networked ways of working is profoundly changing how workspaces and offices are designed, and the firms that design these spaces often bring architects and designers together. The thesis addresses the following question: how is new knowledge generated in the design activities of workspace planning? The first chapter reviews work on design activities (Lebahar, Hatchuel, Visser, etc.); the second covers theories of knowledge generation (Nonaka, Tsoukas, Engeström, etc.). To answer the research question, we carried out long-term observation (an ethnomethodological approach) within two Parisian companies specializing in workspace planning, Génie des Lieux and Workspace CBRE. We show that difficulties of mutual understanding arise in design activities. These obstacles are due to relational tacit knowledge in Harry Collins's sense, which can be made explicit through the design of intermediate objects. We also show how designing two- and three-dimensional artifacts generates new and useful knowledge in a workspace-planning project.
153

Designprinciper för digitala DevOps-bedömningsmodeller / Design principles for digital DevOps assessment models

Sandberg, Tobias, Svensson, Tobias January 2018 (has links)
Today, IT organizations have a great need to work with continuous improvement in order to remain competitive. An important part of continuous improvement is assessing and evaluating the current situation so that effective actions can be taken, and for this assessment organizations can use standards and models for process assessment. In the IT sector, many companies strive to improve their work processes and to bring their development and operations departments together, an effort commonly referred to as DevOps. The problem we address is the lack of simple digital tools contextualized for organizations that want to improve collaboration between these departments. The existing class of DevOps assessment systems contains inadequate models and therefore does not support development and operations practitioners in assessing their organization to provide a basis for improvement. In order to improve the possibilities for DevOps organizations while creating new knowledge, we designed and evaluated a digital assessment model that can be used in practice. To fulfill this purpose we used Action Design Research, a method particularly suited to creating IT-related models in a real context.
The results confirm that existing assessment models are insufficient and that the problem is generalizable as a class of problems. An operational digital model is also presented for assessing organizations in a DevOps context. In developing the artifact, three general design principles were identified which developers and practitioners should follow when designing future DevOps assessment models: (i) use a rating scale divided into four capability levels, (ii) keep the statements used in the model changeable and adaptable, since organizations are unique, and (iii) develop the model so that development and operations can perform the evaluation together.
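As an illustration only (not the artifact built in the study), the three principles can be read as constraints on a very small data model. The Python sketch below uses hypothetical statement texts and level labels to show a four-level scale, editable statements, and joint scoring by development and operations.

```python
from dataclasses import dataclass, field
from statistics import mean

# Principle (i): a rating scale with exactly four capability levels (labels are assumptions).
LEVELS = {1: "Initial", 2: "Managed", 3: "Measured", 4: "Optimized"}

@dataclass
class Statement:
    text: str                                    # Principle (ii): statements are plain data, easy to adapt
    scores: dict = field(default_factory=dict)   # Principle (iii): one score per role

@dataclass
class Assessment:
    statements: list

    def rate(self, index, role, level):
        if level not in LEVELS:
            raise ValueError("level must be between 1 and 4")
        self.statements[index].scores[role] = level

    def summary(self):
        # Average the joint ratings for every statement that has been scored.
        return {s.text: mean(s.scores.values()) for s in self.statements if s.scores}

model = Assessment([
    Statement("Deployments to production are fully automated."),
    Statement("Development and operations share responsibility for incidents."),
])
model.rate(0, "development", 3)
model.rate(0, "operations", 2)   # both sides rate the same statement together
print(model.summary())
```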
154

Memória da produção editorial científica da EDUFRN: 1962 a 1980 / Memory of the scientific editorial production of EDUFRN: 1962 to 1980

Pereira, Francisca Sirleide 29 March 2012 (has links)
It presents the results of the quantitative and qualitative research "Memories of the scientific editorial production of EDUFRN: 1962-1980," carried out in the Programa de Pós-graduação em Ciência da Informação da Universidade Federal da Paraíba (PPGCI/UFPB). It analyses the scientific editorial production of EDUFRN, understanding it as a memory artifact. We applied the method of documentary research according to Jardim (1998) and thematic oral history according to Aróstegui (2006), combining them with Bardin's (1977) content and documentary analysis. To reach our objective we drew on the theories of collective memory of Halbwachs (2006) and Ricoeur (2007), of science communication of Targino (1999) and Bueno (2007), and of documentation of Jardim (1995), among others. We worked with the conception of memory artifacts of González de Gómez (2009) and Azevedo Neto and Freire (2007), and concluded that the scientific works published by the press, referred to in this work as documents, are full of information and memories. For this reason we call them memorialistic artifacts and suggest, among other things, that such artifacts be preserved as public cultural heritage.
155

Informações epistolares: memórias em envelopes / Epistolary information: memories in envelopes

Andrade, Brenda Alves de 28 February 2014 (has links)
With the general objective of analyzing the handwritten letters of the Correios Santa Claus Program, 2009 edition, as information artifacts, selecting aspects of the individual and collective memory of the letter writers from Paraíba, the study analyzed 243 handwritten letters that were discarded at the end of the campaign. The research adopted as its theoretical perspective the work of Deleuze and Guattari (1992) for the concept of epistolary information, Foucault (1992) and Gomes (2004) on self-writing, and Halbwachs (2006) for selecting the aspects of memory. Methodologically, we adopted a quantitative-qualitative approach within documentary research and used the assumptions of Bardin's (1977) content analysis to support the analysis. First, we addressed the construction of the concept of epistolary information within Information Science, placing letters in the field as artifacts of information and memory, since they are full of information capable of revealing aspects of thought and representations of society. As a result, it was verified that the articulation of information statements and letters constitutes a possible concept of epistolary information.
The analysis revealed personal aspects of the writers, unveiling the individual memory of each participant; when these were considered from a general, social and collective standpoint, it was possible to (re)construct the writers' collective memory and to perceive the 2009 Correios Santa Claus Campaign and its product as a construct of social reality. The letters therefore constitute meaningful information sources, capable of changing what is known about the socioeconomic structures of the individuals who make up the study sample.
156

Mobiltelefonen som social artefakt inom familjen : en studie med föräldrarna i fokus / The Cell Phone as a social artefact within the family : a study focused on parents

Nemback, Joakim January 2008 (has links)
Studies of modern communication media such as cell phones and so-called instant messengers are popular today. This study examined these technologies, focusing on parents as a target group and on what is important to them in everyday communication. Three exploratory focus groups made clear that the children, and the link to them, were by far the most important thing for the parents, and that this link today largely runs through the mobile phone. In a follow-up study a mobile instant messenger called My Friends was introduced and used by two families for two weeks to see how it would change communication within the family. The focus was on how problems of availability were handled. It turned out that My Friends indicated availability more clearly than the cell phone did, and that it was also consulted before contact was made through other media. Emphasis was also placed on finding out what kind of communication the different media (phone calls, texting and instant messaging) support. Three variables appeared to be important when choosing a medium:
• the urgency of the errand
• the goal of the errand
• the need for precision
The better two people knew each other, the more the last of these was affected, while the first two remained unchanged.
157

Impact of artifact correction methods on R-R interbeat signals for quantifying heart rate variability (HRV) according to linear and nonlinear methods / Impactos das correções de artefatos em sinais de intervalos R-R para a quantificação da variabilidade da frequência cardíaca (HRV) de acordo com métodos lineares e não lineares

Anderson Ivan Rincon Soler 10 March 2016 (has links)
The analysis of heart rate variability (HRV) uses time series containing the distances between successive heartbeats in order to assess autonomic regulation of the cardiovascular system. These series are obtained from the analysis of the electrocardiogram (ECG) signal, which can be affected by different types of artifacts, leading to incorrect interpretations of the HRV signals. The classic approach to dealing with these artifacts is to apply correction methods, some of them based on interpolation, substitution or statistical techniques. However, few studies show the accuracy and performance of these correction methods on real HRV signals. This study aims to determine the performance of several linear and nonlinear correction methods on HRV signals with induced artifacts by quantifying their linear and nonlinear HRV parameters. As part of the methodology, ECG signals from rats recorded by telemetry were used to generate real heart rate variability series free of errors. Missing points (beats) were simulated in these series in different quantities in order to emulate a real experimental situation as accurately as possible. To compare recovery efficiency, deletion (DEL), linear interpolation (LI), cubic spline interpolation (CI), a moving average window (MAW) and nonlinear predictive interpolation (NPI) were used as correction methods for the series with induced artifacts. The accuracy of each correction method was assessed by measuring the mean value of the series (AVNN), the standard deviation (SDNN), the root mean square of successive differences (RMSSD), Lomb's periodogram (LSP), detrended fluctuation analysis (DFA), multiscale entropy (MSE) and symbolic dynamics (SD) on each HRV signal with and without artifacts. The results show that at low levels of missing points the performance of all correction techniques is very similar, with very close values for each HRV parameter. At higher loss levels, however, only the NPI method yields HRV parameters with low error and few significant differences compared with the values calculated for the same signals without missing points.
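The evaluation protocol (induce beat losses in a clean tachogram, correct, and recompute the time-domain indices) can be sketched briefly. The following Python is an illustration under simplified assumptions: a synthetic Gaussian R-R series stands in for the telemetry recordings, only the DEL and cubic-spline corrections are shown (the nonlinear predictive interpolation is not reproduced), and the threshold for flagging merged intervals is an arbitrary choice.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)

def hrv_metrics(rr_ms):
    """Time-domain indices of an R-R series in milliseconds: AVNN, SDNN, RMSSD."""
    rr = np.asarray(rr_ms, dtype=float)
    return {"AVNN": rr.mean(),
            "SDNN": rr.std(ddof=1),
            "RMSSD": np.sqrt(np.mean(np.diff(rr) ** 2))}

def drop_beats(beat_times, loss=0.10):
    """Remove a fraction of beats; each removal merges two adjacent R-R intervals."""
    keep = np.ones(len(beat_times), dtype=bool)
    idx = rng.choice(np.arange(1, len(beat_times) - 1),
                     size=int(loss * len(beat_times)), replace=False)
    keep[idx] = False
    return beat_times[keep]

def correct_del(rr):
    """DEL: simply discard the implausibly long (merged) intervals."""
    return rr[rr < 1.5 * np.median(rr)]

def correct_spline(beat_times):
    """CI: fit a cubic spline to the plausible intervals and resample on an even grid."""
    rr = np.diff(beat_times)
    good = rr < 1.5 * np.median(rr)
    spline = CubicSpline(beat_times[1:][good], rr[good])
    grid = np.arange(beat_times[1], beat_times[-1], np.median(rr))
    return spline(grid)

# Synthetic "clean" tachogram: 1000 beats around 150 ms (rat heart-rate scale).
rr_clean = 150.0 + 10.0 * rng.standard_normal(1000)      # ms
beat_times = np.cumsum(rr_clean)                          # ms
lossy_times = drop_beats(beat_times, loss=0.10)
rr_lossy = np.diff(lossy_times)

print("clean :", hrv_metrics(rr_clean))
print("DEL   :", hrv_metrics(correct_del(rr_lossy)))
print("spline:", hrv_metrics(correct_spline(lossy_times)))
```

Running this with different loss fractions gives a feel for how quickly the simpler corrections distort SDNN and RMSSD as the proportion of missing beats grows.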
158

Salas de controle: do artefato ao instrumento / Control rooms: from artifact to instrument

Adson Eduardo Resende 09 May 2011 (has links)
Designing workspaces requires the designer to reconcile conflicts between the partial logics of the various users of the same artifact. Understanding the relations between the many subsystems that make up the work activity and its environments inevitably leads to the need to develop design methods able to address the demands inherent in this complexity. In this light, what must be considered at design time is the existence of an interface between the work artifact and the user. It is the full exercise of this interface that allows users to build their experience. The design requirements are found within that experience, and to recover it and bring the design needs to the surface we must resort to methods suited to the current conditions of design practice and to the real situations in which artifacts are used. The evolution from artifact to instrument results from the association of artifacts with their users' utilization schemes, a reflection of their experience. Methodologies such as Ergonomic Work Analysis and Post-Occupancy Evaluation, supported and guided by activity theory, can help build a conscious reflection on the complexity and the variables that arise in the use of work environments. The core of the case study was following the ongoing activity in the control room of a subway system. The observations and surveys carried out identified a gap between the design of the room and the operators' real work and utilization schemes.
The design reflects a conception process that needs to be improved by definitively incorporating the characteristics of the activity and the experience of its users.
159

Influência da região anatômica na formação de artefatos metálicos produzidos por implantes dentários em imagens de tomografia computadorizada de feixe cônico / Effect of anatomical region on the formation of metal artifacts produced by dental implants in cone beam computed tomographic images

Machado, Alessiana Helena 21 July 2017 (has links)
The objective of the present study was to compare, quantitatively, the metal artifacts produced in cone beam computed tomography (CBCT) images by dental implants placed in different maxillomandibular regions. A total of 200 implants selected from CBCT examinations were divided into four groups: Group 1 (n = 50), implants located in the anterior maxilla; Group 2 (n = 50), implants located in the posterior maxilla; Group 3 (n = 50), implants located in the anterior mandible; and Group 4 (n = 50), implants located in the posterior mandible. The implants were further classified as isolated or adjacent to other implants. Three axial slices were selected for each sampled implant (apical, middle and cervical), and on each slice the artifacts produced by the implants were counted. The Mann-Whitney U test was used to compare the variables between groups, and the Kruskal-Wallis and Student-Newman-Keuls tests were used to compare the axial slices. The mandible showed a greater number of artifacts than the maxilla (apical slice: p = 0.0024; middle slice: p < 0.0001). The anterior region produced more artifacts than the posterior region (apical slice: p = 0.0105; middle slice: p < 0.0316).
There was no significant difference in the number of artifacts between isolated and adjacent implants, and the cervical slice was the most affected by artifacts. It can be concluded that dental implants always produce metal artifacts in CBCT images, and that these artifacts are affected by the anatomical location in the dental arch.
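The group comparisons described above are plain nonparametric tests on per-implant artifact counts. A minimal Python sketch with scipy.stats is given below; the counts are synthetic placeholders, not the study's data, and the Student-Newman-Keuls post-hoc test is omitted because it is not part of scipy.

```python
import numpy as np
from scipy.stats import mannwhitneyu, kruskal

rng = np.random.default_rng(1)

# Hypothetical artifact counts per implant for one axial slice, 50 implants per group
# (synthetic values chosen only to make the script run).
groups = {
    "anterior_maxilla":   rng.poisson(8, 50),
    "posterior_maxilla":  rng.poisson(7, 50),
    "anterior_mandible":  rng.poisson(11, 50),
    "posterior_mandible": rng.poisson(9, 50),
}

# Jaw comparison (maxilla vs mandible) with the Mann-Whitney U test.
maxilla  = np.concatenate([groups["anterior_maxilla"], groups["posterior_maxilla"]])
mandible = np.concatenate([groups["anterior_mandible"], groups["posterior_mandible"]])
print("maxilla vs mandible  :", mannwhitneyu(maxilla, mandible, alternative="two-sided"))

# Region comparison (anterior vs posterior), again with Mann-Whitney U.
anterior  = np.concatenate([groups["anterior_maxilla"], groups["anterior_mandible"]])
posterior = np.concatenate([groups["posterior_maxilla"], groups["posterior_mandible"]])
print("anterior vs posterior:", mannwhitneyu(anterior, posterior, alternative="two-sided"))

# Comparison of the three axial slices (apical, middle, cervical) with Kruskal-Wallis.
apical, middle, cervical = (rng.poisson(lam, 200) for lam in (6, 8, 12))
print("axial slices         :", kruskal(apical, middle, cervical))
```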
160

Quantificação de artefatos metálicos produzidos por implantes dentários em imagens de tomografia computadorizada de feixe cônico obtidas com diferentes protocolos de aquisição / Quantification of metallic artifacts produced by dental implants in CBCT images obtained using different acquisition protocols

Fardim, Karolina Aparecida Castilho 08 August 2018 (has links)
The aim of this study was to quantify, in cone beam computed tomography (CBCT) images obtained with different protocols, the metal artifacts produced by titanium implants placed in different regions of the mandible. The implants were placed in four different regions (incisor, canine, premolar and molar) of a phantom and scanned with CBCT while varying the position of the object inside the FOV (central, anterior, posterior, right and left), the FOV size (6 x 13 and 12 x 13 cm) and the voxel size (0.25 and 0.30 mm). One axial slice at the cervical level of each implant was selected for quantification. The Kruskal-Wallis and Student-Newman-Keuls tests were used to compare the tooth regions and the different positions of the phantom inside the FOV, the Wilcoxon test was used to compare the FOV and voxel sizes, and factorial ANOVA was used to evaluate the interaction between the study variables. The incisor region showed the largest number of artifacts compared with the other regions (p = 0.0315). There was no significant difference for the position of the phantom inside the FOV (p = 0.7418). The smaller FOV produced more artifacts (p < 0.0001), and when comparing images produced at different resolutions the smaller voxel produced more artifacts (p < 0.0001). Metal artifacts are therefore influenced by FOV and voxel size, as well as by the anatomical region, while varying the location of the phantom inside the FOV did not change the number of artifacts.
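The paired protocol comparisons (the same implants imaged with two FOV sizes and two voxel sizes) map naturally onto the Wilcoxon signed-rank test, with Kruskal-Wallis for the four tooth regions. A brief Python sketch follows; all counts are synthetic placeholders and the factorial ANOVA step is not reproduced here.

```python
import numpy as np
from scipy.stats import wilcoxon, kruskal

rng = np.random.default_rng(2)
n_implants = 20   # hypothetical number of paired measurements

# Paired artifact counts for the same implants under two acquisition settings
# (synthetic values; the direction of the differences is only an assumption).
fov_small = rng.poisson(14, n_implants)
fov_large = np.maximum(fov_small - rng.poisson(3, n_implants), 0)

voxel_025 = rng.poisson(15, n_implants)
voxel_030 = np.maximum(voxel_025 - rng.poisson(2, n_implants), 0)

print("FOV 6x13 vs 12x13  :", wilcoxon(fov_small, fov_large))
print("voxel 0.25 vs 0.30 :", wilcoxon(voxel_025, voxel_030))

# Tooth-region comparison (incisor, canine, premolar, molar) with Kruskal-Wallis.
regions = [rng.poisson(lam, n_implants) for lam in (16, 12, 11, 10)]
print("tooth regions      :", kruskal(*regions))
```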
