131 |
ALGORITHMES DE COMPRESSION D'IMAGES ET CODES DE CONTOURS / Image compression algorithms and contour codes. Ahronovitz, Ehoud. 30 September 1985
The large volume of digital image representations raises, among other issues, the problem of encoding images in condensed form. We have developed a method for constructing a highly compressed code from a two-colour image. It extracts all the objects in the image in a single scan while attaching to each object characteristics of its shape. Different compression treatments are then tailored to each type of object. Thanks to a display technique based on simple principles, algebraic operations are possible on the extracted contours. Suitability for any image format and any capture precision is inherent in the very principle of the method. We propose a decomposition into independent modules, with a view to later parallelisation. The software, in a single-processor version, was implemented on an SM90 (CNET) under the Unix system, and gives results that are often better than, or comparable to, those obtained with currently known methods.
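The contour-code scheme itself is not reproduced here, but the underlying idea of condensing a two-colour image can be illustrated with a far simpler baseline, run-length encoding of each scanline (names and structure are illustrative only, not the thesis's method):

```python
def rle_encode(row):
    """Run-length encode one scanline of a bicolor image.

    Returns (first_value, run_lengths): the value of the first pixel
    and the lengths of the alternating runs.
    """
    if not row:
        return (0, [])
    runs = [1]
    for prev, cur in zip(row, row[1:]):
        if cur == prev:
            runs[-1] += 1
        else:
            runs.append(1)
    return (row[0], runs)

def rle_decode(first_value, runs):
    """Invert rle_encode: expand alternating runs back into pixels."""
    row, value = [], first_value
    for length in runs:
        row.extend([value] * length)
        value = 1 - value  # bicolor image: runs alternate 0/1
    return row
```

For a bicolor image only the first value and the run lengths need to be stored per scanline, which is already a large saving on images with long uniform runs; contour codes go further by exploiting two-dimensional shape coherence.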
|
132 |
System for vessel characterization: development and evaluation with application to deep vein thrombosis diagnosis. Guerrero, Julian. 11 1900
A system for vessel characterization aimed at detecting deep vein thrombosis (DVT) in the lower limbs has been developed and evaluated using ultrasound image processing, location and force sensors measurements, blood flow information and a protocol based on the current clinical standard, compression ultrasound. The goal is to provide an objective and repeatable system to measure DVT in a rapid and standardized manner, as this has been suggested in the literature as an approach to improve overall detection of the disease.
The system uses a spatial Kalman filter-based algorithm with an elliptical model in the measurement equation to detect vessel contours in transverse ultrasound images and estimate ellipse parameters, and temporal constant velocity Kalman filters for tracking vessel location in real-time. The vessel characterization also comprises building a 3-D vessel model and performing compression and blood flow assessments to calculate measures that indicate the possibility of DVT in a vessel. A user interface designed for assessing a vessel for DVT was also developed.
The system and components were implemented and tested in simulations, laboratory settings, and clinical settings. Contour detection results are good, with mean and rms errors ranging from 1.47-3.64 and 3.69-9.67 pixels, respectively, in simulated and patient images, and parameter estimation errors of 5%. Experiments showed errors of 3-5 pixels for the tracking approaches. The measures for DVT were evaluated, independently and integrated in the system. The complete system was evaluated, with sensitivity of 67-100% and specificity of 50-89.5%. System learnability and memorability were evaluated in a separate user study, with good results.
Contributions include a segmentation approach using a full parameter ellipse model in an extended Kalman filter, incorporating multiple measurements, an alternate sampling method for faster parameter convergence and application-specific initialization, and a tracking approach that includes a sub-sampled sum of absolutes similarity calculation and a method to detect vessel bifurcations using flow data. Further contributions include an integrated system for DVT detection that can combine ultrasound B-mode, colour flow and elastography images for vessel characterization, a system interface design focusing on usability that was evaluated with medical professionals, and system evaluations through multiple patient studies.
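The temporal tracking stage described above uses constant-velocity Kalman filters; the following is a minimal one-dimensional sketch for a single vessel-centre coordinate (the scalar state and the noise values q and r are assumptions made for illustration, not the thesis's actual filter):

```python
def cv_kalman(measurements, dt=1.0, q=1e-3, r=0.25):
    """Track position with a constant-velocity model.

    State x = [position, velocity]; measurement z = position.
    Returns the list of filtered (position, velocity) estimates.
    """
    # state estimate and covariance (2x2 matrices handled by hand)
    x = [measurements[0], 0.0]
    P = [[1.0, 0.0], [0.0, 1.0]]
    out = []
    for z in measurements:
        # predict: x <- F x with F = [[1, dt], [0, 1]], P <- F P F^T + qI
        xp = [x[0] + dt * x[1], x[1]]
        Pp = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
               P[0][1] + dt * P[1][1]],
              [P[1][0] + dt * P[1][1],
               P[1][1] + q]]
        # update with H = [1, 0]
        S = Pp[0][0] + r                    # innovation covariance
        K = [Pp[0][0] / S, Pp[1][0] / S]    # Kalman gain
        y = z - xp[0]                       # innovation
        x = [xp[0] + K[0] * y, xp[1] + K[1] * y]
        P = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
             [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
        out.append(tuple(x))
    return out
```

With a linearly moving target the filter converges to the true position and velocity; the thesis extends this idea to a full ellipse-parameter state in an extended Kalman filter.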
|
133 |
Virus recognition in electron microscope images using higher order spectral features. Ong, Hannah Chien Leing. January 2006
Virus recognition by visual examination of electron microscope (EM) images is time consuming and requires highly trained and experienced medical specialists. For these reasons, it is not suitable for screening large numbers of specimens. The objective of this research was to develop a reliable and robust pattern recognition system that could be trained to detect and classify different types of viruses from two-dimensional images obtained from an EM. This research evaluated the use of radial spectra of higher order spectral invariants to capture variations in textures and differences in symmetries of different types of viruses in EM images. The technique exploits invariant properties of the higher order spectral features, statistical techniques of feature averaging, and soft decision fusion in a unique manner applicable to the problem when a large number of particles were available for recognition, but were not easily registered on an individual basis due to the low signal to noise ratio. Experimental evaluations were carried out using EM images of viruses, and a high statistical reliability with low misclassification rates was obtained, showing that higher order spectral features are effective in classifying viruses from digitized electron micrographs. With the use of digital imaging in electron microscopes, this method can be fully automated.
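Higher order spectra such as the bispectrum underlie these features; a direct single-segment estimate for a 1-D signal can be sketched as follows (this shows the raw quantity only, not the thesis's invariant feature pipeline):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for short signals)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def bispectrum(x):
    """B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)) for one data segment.

    In practice B is averaged over many segments to reduce variance;
    a single segment is enough to show the symmetry B(f1,f2) = B(f2,f1).
    """
    X = dft(x)
    n = len(X)
    return [[X[f1] * X[f2] * X[(f1 + f2) % n].conjugate()
             for f2 in range(n)]
            for f1 in range(n)]
```

Because the bispectrum retains phase relations between frequency triples, invariants derived from it can capture texture and symmetry information that the power spectrum discards, which is what makes it attractive for low-SNR particle images.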
|
134 |
Automatic emotion recognition: an investigation of acoustic and prosodic parameters. Sethu, Vidhyasaharan. Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW. January 2009
An essential step to achieving human-machine speech communication with the naturalness of communication between humans is developing a machine capable of recognising emotions from speech. This thesis presents research addressing this problem by making use of acoustic and prosodic information. At the feature level, novel group delay and weighted frequency features are proposed. The group delay features are shown to emphasise information pertaining to formant bandwidths and to be indicative of emotions. The weighted frequency feature, based on the recently introduced empirical mode decomposition, is proposed as a compact representation of the spectral energy distribution and is shown to outperform other estimates of energy distribution. Feature-level comparisons suggest that detailed spectral measures are highly indicative of emotions while exhibiting greater speaker specificity. Moreover, it is shown that all features are characteristic of the speaker and require some sort of normalisation prior to use in a multi-speaker situation. A novel technique for normalising speaker-specific variability in features is proposed, which leads to significant improvements in the performance of systems trained and tested on data from different speakers. This technique is also used to investigate the amount of speaker-specific variability in different features. A preliminary study of phonetic variability suggests that phoneme-specific traits are not modelled by the emotion models and that speaker variability is a more significant problem in the investigated setup. Finally, a novel approach to emotion modelling that takes into account temporal variations of speech parameters is analysed. An explicit model of the glottal spectrum is incorporated into the framework of the traditional source-filter model, and the parameters of this combined model are used to characterise speech signals.
An automatic emotion recognition system that takes into account the shape of the contours of these parameters as they vary with time is shown to outperform a system that models only the parameter distributions. The novel approach is also empirically shown to be on par with human emotion classification performance.
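The thesis's normalisation technique is novel and not reproduced here, but the baseline it improves upon, per-speaker z-score normalisation of each feature, can be sketched as follows (function and variable names are illustrative):

```python
from collections import defaultdict
from statistics import mean, pstdev

def speaker_znorm(features):
    """Normalise a feature to zero mean, unit variance per speaker.

    features: list of (speaker_id, feature_value) pairs.
    Returns the pairs with normalised values, in the original order.
    """
    by_speaker = defaultdict(list)
    for spk, v in features:
        by_speaker[spk].append(v)
    stats = {spk: (mean(vs), pstdev(vs) or 1.0)  # guard zero variance
             for spk, vs in by_speaker.items()}
    return [(spk, (v - stats[spk][0]) / stats[spk][1])
            for spk, v in features]
```

After this transform each speaker's feature distribution has zero mean and unit variance, so a classifier trained on one set of speakers is less biased by speaker identity when tested on another.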
|
135 |
Numerical Laplace transformation methods for integrating linear parabolic partial differential equations. Ngounda, Edgard. 12 1900
Thesis (MSc (Applied Mathematics))--University of Stellenbosch, 2009. / ENGLISH ABSTRACT: In recent years the Laplace inversion method has emerged as a viable alternative for the numerical solution of PDEs. Effective methods for the numerical inversion are based on approximation of the Bromwich integral. In this thesis, a numerical study is undertaken to compare the efficiency of the Laplace inversion method with that of more conventional time integrators; in particular, we consider the method of lines based on MATLAB's ODE15s and the Crank-Nicolson method. Our study includes an introductory chapter on the Laplace inversion method. We then proceed with spectral methods for the space discretization, introducing the interpolation polynomial and the concept of a differentiation matrix for approximating derivatives of a function. Next, the numerical differentiation formulas (NDFs) implemented in ODE15s, as well as the well-known second-order Crank-Nicolson method, are derived. In the Laplace method, the Bromwich integral is computed with the trapezoidal rule over a hyperbolic contour. Enhancements to the computational efficiency of these methods include the LU and Hessenberg decompositions. To compare the three methods, we consider two criteria: the number of linear system solves per unit of accuracy, and the CPU time per unit of accuracy. The numerical results demonstrate that the new method, i.e. the Laplace inversion method, converges exponentially, compared with the linear convergence rate of ODE15s and the Crank-Nicolson method. This exponential convergence yields high accuracy with only a few linear system solves, and in terms of computational cost the Laplace inversion method is likewise more efficient than ODE15s and the Crank-Nicolson method. Finally, we apply the inversion method, with satisfactory results, to the axial dispersion model and to the heat equation in two dimensions.
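The trapezoidal rule over a hyperbolic contour can be sketched as follows; the contour parameters h, mu and alpha are commonly quoted values from the hyperbolic-contour literature and should be read as assumptions rather than the thesis's own choices:

```python
import cmath
import math

def invert_laplace(F, t, N=24):
    """Approximate f(t) from F(s) by the trapezoidal rule on the
    hyperbolic contour s(x) = mu*(1 + sin(i*x - alpha)).

    Assumes F is analytic to the right of the contour and real on the
    real axis, so conjugate symmetry halves the sum over the nodes.
    """
    # contour parameters (assumed values from the literature)
    h = 1.081792 / N
    mu = 4.492075 * N / t
    alpha = 1.172104
    total = 0.0
    for k in range(N + 1):
        x = k * h
        s = mu * (1 + cmath.sin(1j * x - alpha))       # contour point
        ds = 1j * mu * cmath.cos(1j * x - alpha)       # s'(x)
        term = (cmath.exp(s * t) * F(s) * ds).imag
        total += 0.5 * term if k == 0 else term        # trapezoid weights
    return h * total / math.pi
```

Because the integrand decays double-exponentially along the contour, a couple of dozen nodes already give very high accuracy, which is the source of the exponential convergence reported above.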
|
136 |
VisualMet: um sistema para visualização e exploração de dados meteorológicos / VisualMet: a system for visualizing and exploring meteorological data. Manssour, Isabel Harb. January 1996
The weather forecast centers deal with a great volume of complex multivariate data, which usually have to be understood within a short time. Scientific visualization techniques can be used to support both daily forecasting and meteorological research. This work reports the architecture and facilities of a system, named VisualMet, that was implemented based on a case study of the tasks accomplished by the meteorologists responsible for the 8th Meteorological District, in the South of Brazil. This center collects meteorological data three times a day from 32 local stations and receives similar data from both the National Institute of Meteorology, located in Brasilia, and the National Meteorological Center, located in the United States of America. Such data result from observations of variables like temperature, pressure, wind velocity, and type of clouds. The tasks of the meteorologists and the classes of application data were observed to define system requirements. The architecture and implementation of VisualMet follow the tool-oriented approach and the object-oriented paradigm, respectively. Data taken from meteorological stations are instances of a class named Entity. Three other classes of tools which support the meteorologists' tasks are modeled. Objects in the system are presented to the user through two windows, "Entities Base" and "Tools Base". The current implementation of the "Tools Base" contains mapping tools (to produce contour maps, icon maps and graphs), recording tools (to save and load images generated by the system) and a query tool (to read variable values of selected stations). Special attention is given to the contour map tool, where the multiquadric method was used for data interpolation. 
The results of applying the multiquadric method to interpolate data for the construction of contour maps are also discussed. Before describing these results in detail, the work presents a study on interpolation methods for scattered data. The images obtained with the contour map tool are discussed and compared with the maps drawn manually by the meteorologists of the 8th Meteorological District. Possible extensions to this work are also presented.
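The multiquadric method interpolates scattered data with radial basis functions phi(r) = sqrt(r^2 + c^2); a small 1-D sketch follows (the shape parameter c and the dense Gaussian-elimination solve are choices made for this example, not details taken from the thesis):

```python
import math

def multiquadric_interpolant(points, values, c=1.0):
    """Return an interpolant g with g(points[i]) == values[i].

    g(x) = sum_j w_j * sqrt((x - x_j)^2 + c^2); the weights come from
    a dense collocation solve (1-D nodes kept for simplicity).
    """
    phi = lambda r: math.sqrt(r * r + c * c)
    n = len(points)
    # augmented collocation matrix: A[i][j] = phi(|x_i - x_j|), rhs = values
    A = [[phi(points[i] - points[j]) for j in range(n)] + [values[i]]
         for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for j in range(col, n + 1):
                A[row][j] -= f * A[col][j]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (A[i][n] - sum(A[i][j] * w[j]
                              for j in range(i + 1, n))) / A[i][i]
    return lambda x: sum(w[j] * phi(x - points[j]) for j in range(n))
```

The multiquadric collocation matrix is nonsingular for distinct nodes, so the interpolant reproduces the station values exactly; contour maps are then drawn from the interpolated field.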
|
138 |
Haptic Perception, Decision-making, and Learning for Manipulation with Artificial Hands. January 2016
abstract: Robotic systems are outmatched by the abilities of the human hand to perceive and manipulate the world. Human hands are able to physically interact with the world to perceive, learn, and act to accomplish tasks. Limitations of robotic systems to interact with and manipulate the world diminish their usefulness. In order to advance robot end effectors, specifically artificial hands, rich multimodal tactile sensing is needed. In this work, a multi-articulating, anthropomorphic robot testbed was developed for investigating tactile sensory stimuli during finger-object interactions. The artificial finger is controlled by a tendon-driven remote actuation system that allows for modular control of any tendon-driven end effector and capabilities for both speed and strength. The artificial proprioception system enables direct measurement of joint angles and tendon tensions while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. Next, attention was focused on real-time artificial perception for decision-making. A robotic system needs to perceive its environment in order to make decisions. Specific actions such as “exploratory procedures” can be employed to classify and characterize object features. Prior work on offline perception was extended to develop an anytime predictive model that returns the probability of having touched a specific feature of an object based on minimally processed sensor data. Developing models for anytime classification of features facilitates real-time action-perception loops. Finally, by combining real-time action-perception with reinforcement learning, a policy was learned to complete a functional contour-following task: closing a deformable ziplock bag. The approach relies only on proprioceptive and localized tactile data. 
A Contextual Multi-Armed Bandit (C-MAB) reinforcement learning algorithm was implemented to maximize cumulative rewards within a finite time period by balancing exploration versus exploitation of the action space. Performance of the C-MAB learner was compared to a benchmark Q-learner that eventually returns the optimal policy. To assess robustness and generalizability, the learned policy was tested on variations of the original contour-following task. The work presented contributes to the full range of tools necessary to advance the abilities of artificial hands with respect to dexterity, perception, decision-making, and learning. / Dissertation/Thesis / Doctoral Dissertation Mechanical Engineering 2016
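In its simplest form, a contextual bandit keeps one value estimate per (context, action) pair and balances exploration against exploitation with an epsilon-greedy rule; the toy sketch below illustrates that mechanism only and is not the thesis's C-MAB implementation (names and the deterministic reward are illustrative):

```python
import random

def train_cmab(reward, n_contexts, n_actions, steps=2000,
               epsilon=0.1, seed=0):
    """Epsilon-greedy contextual bandit with sample-average estimates.

    reward(context, action) -> float; returns the learned value table.
    """
    rng = random.Random(seed)
    q = [[0.0] * n_actions for _ in range(n_contexts)]
    counts = [[0] * n_actions for _ in range(n_contexts)]
    for _ in range(steps):
        ctx = rng.randrange(n_contexts)
        if rng.random() < epsilon:              # explore
            a = rng.randrange(n_actions)
        else:                                   # exploit current estimate
            a = max(range(n_actions), key=lambda i: q[ctx][i])
        r = reward(ctx, a)
        counts[ctx][a] += 1
        q[ctx][a] += (r - q[ctx][a]) / counts[ctx][a]  # running mean
    return q
```

After training, the greedy action per context matches the best true reward; in the thesis the "context" is the tactile percept and the reward comes from progress on the contour-following task.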
|
139 |
Interfaces rodoviário-urbanas na produção da cidade: estudo de caso do contorno rodoviário de João Pessoa-PB / Highway-urban interfaces in the production of the city: a case study of the highway bypass of João Pessoa-PB. Castro, Alexandre Augusto Bezerra da Cunha. 03 April 2014
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / Highways play an important role in the process of structuring cities by facilitating interurban journeys. Where necessary, bypasses were built to connect the city to the highway. However, once absorbed into the urban fabric, these stretches of bypass develop dynamics of their own with the intra-urban space in which they are inserted, changing accessibility, morphology, and the use and occupation of urban land. These interfaces arise during the city's growth process and during the subsequent stages. Within this context, this master's research analyses the dynamics between the implementation of the BR-230 highway bypass and the production of the intra-urban space of the city of João Pessoa, Paraíba, between 1963 and 2013, in terms of morphology and the use and occupation of urban land. To this end, we employed Panerai's method of the evolutionary logic of the urban fabric, which divides urban evolution into three stages: overcoming limits, growth, and combination and conflict. The method was combined with analytical tools such as space syntax and Geographic Information System software. The results show that the highway drove the spread of the city to the south, producing a sprawling, tentacular urban grid; this sector grew the most in that period. We were also able to identify a horizontal, fragmented pattern of spatial growth, as well as changes in accessibility based on the morphology and on the pattern of use and occupation of the spaces bordering the highway.
|
140 |
Concordância entre o débito cardíaco estimado através das técnicas de termodiluição transpulmonar e de análise de contorno de pulso e a técnica de termodiluição de artéria pulmonar em cães anestesiados com isoflurano / Agreement between cardiac output estimated by the transpulmonary thermodilution and pulse contour analysis techniques and the pulmonary artery thermodilution technique in dogs anesthetized with isoflurane. Garofalo, Natache Arouca. January 2016
Advisor: Francisco José Teixeira-Neto / Abstract: Background and objectives: Cardiac output (CO) measurements by transpulmonary thermodilution (TPTDCO) and by pulse contour analysis calibrated with transpulmonary thermodilution (PCACO) are less invasive alternatives to pulmonary artery thermodilution (PATDCO). However, hemodynamic instability could affect the performance of these methods. 
The objective of Phase I of the study was to determine whether the use of 10 mL of thermal indicator (physiological saline at ≤ 5 °C) for TPTDCO (measured in the femoral artery) would improve the agreement and trending ability with PATDCO in comparison with 5 mL of indicator. During Phase II, the aim was to verify whether changes in systemic vascular resistance (SVR) would alter the agreement and trending ability between PCACO and PATDCO. Methods: In eight adult dogs (20.8-31.5 kg), simultaneous TPTDCO and PATDCO measurements (averaged from 3 repetitions) using 5 and 10 mL of thermal indicator were obtained during isoflurane anesthesia combined or not with intravenous remifentanil (0.3 and 0.6 μg/kg/min) or dobutamine (2.5 and 5.0 μg/kg/min) (Phase I). During Phase II, triplicate PCACO and PATDCO measurements were recorded before and during phenylephrine (1.0 μg/kg/min) or nitroprusside (1.0 μg/kg/min) induced changes in SVR. The accuracy and precision of the agreement were evaluated by the Bland-Altman method for multiple measurements (Phase I) and for single measurements per subject (Phase II). The ability of the test methods (PCACO and TPTDCO) to detect changes... (Complete abstract: click electronic access below) / Doctorate
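The Bland-Altman analysis referred to above computes, from paired measurements, the bias (mean difference) and the 95% limits of agreement (bias plus or minus 1.96 times the SD of the differences); a sketch of the single-measurement-per-subject case (the multiple-measurements variant adjusts the variance estimate and is not shown):

```python
import math

def bland_altman(method_a, method_b):
    """Return (bias, lower_loa, upper_loa) for paired measurements.

    bias = mean(a - b); limits of agreement = bias +/- 1.96 * SD,
    using the sample (n-1) standard deviation of the differences.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

In CO-method comparison studies the resulting limits are typically judged against a predefined clinical acceptability criterion, such as a maximum percentage error relative to mean CO.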
|