About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Regional Mechanical Function Changes Remain after Ventricular Pacing Cessation: Evidence of Mechanical Cardiac Memory

Skorinko, Jeremy Kenneth 27 March 2010 (has links)
Every year, 400,000 to 600,000 people in the United States die from sudden cardiac death. Sudden cardiac death is often caused by irregular electrical impulses, or arrhythmias, in the heart. Arrhythmias can be corrected through pharmacological therapies, device therapies, or both. One type of device therapy, the pacemaker, is implanted in the heart to correct arrhythmias. Cardiac memory is defined as changes in the T-wave that persist after a period of ventricular pacing, upon return to normal activation pathways. During ventricular pacing, regional stroke work in the areas closest to the pacing electrode is significantly decreased. We hypothesized that the region around the pacing site would retain altered mechanical function after cessation of pacing, in effect exhibiting a mechanical cardiac memory. To test this hypothesis, nine canine models were implanted with pacing electrodes in both the atrium and the ventricle. After a forty-minute stabilization period, baseline data were obtained during atrial pacing. Cardiac memory was induced in five canine models through a two-hour period of ventricular pacing followed immediately by atrial pacing. The remaining canine models served as controls, undergoing atrial pacing for two hours. A high-density mapper (HDM) was used to determine mechanical function in a region centered approximately 1 cm away from the pacing electrode. No differences in global function (tau, developed pressure, dP/dtmax, dP/dtmin) were found after two hours of ventricular pacing upon return to normal activation pathways. Regional stroke work in an area close to the electrode differed significantly between baseline (5.7 ± 2.6%), ventricular pacing (-3.8 ± 0.9%, p<0.05), and the period after two hours of ventricular pacing upon return to normal activation pathways (2.4 ± 1.6%, p<0.05). Systolic area contraction was also significantly different between baseline (5.0 ± 6.6%) and the period after two hours of ventricular pacing upon return to normal activation pathways (0.2 ± 7.4%, p<0.05). Diastolic twist and diastolic twist rates showed no significant differences. Finally, contractile principal strain increased in magnitude after induction of cardiac memory (-2.6 ± 0.3%) compared to baseline (-1.1 ± 0.5%, p<0.05). These findings suggest that there is a mechanical correlate of electrical cardiac memory.
2

Application of the HDM Model in the Evaluation of Road Projects in Peru: the "Bagua Chica - Flor de la Esperanza" Road

Lluncor Yataco, Gianmarco Xavier January 2012 (has links)
1.1 Topic: Application of the HDM III model in the evaluation of road projects in Peru: the "Bagua Chica - Flor de la Esperanza" road, located in the department of Amazonas. 1.2 Context: The objective of this study is to analyze the applicability of the HDM III model to roads in Peru by reviewing the study of the best construction alternative for the "Bagua Chica - Flor de la Esperanza" road: either the first alternative, the alignment of the detailed engineering design, or the second alternative, the alignment of the proposed variant. For this purpose, the vehicle operating costs incurred by vehicles traveling along each alternative are evaluated.
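A minimal sketch of the kind of comparison this evaluation rests on: discounting vehicle operating costs (VOC) over an analysis period for each alternative and preferring the cheaper one. All traffic volumes, lengths, unit costs, growth and discount rates below are hypothetical placeholders, not values from the thesis.

# Hedged illustration: compare two road alternatives by discounted vehicle
# operating costs, the criterion described above. All numbers are made up.

def discounted_voc(annual_traffic, length_km, voc_per_veh_km, growth, rate, years):
    """Present value of vehicle operating costs over the analysis period."""
    total = 0.0
    for t in range(1, years + 1):
        traffic_t = annual_traffic * (1 + growth) ** (t - 1)
        cost_t = traffic_t * length_km * voc_per_veh_km
        total += cost_t / (1 + rate) ** t
    return total

# Alternative 1: detailed-design alignment; Alternative 2: proposed variant (illustrative data).
alt1 = discounted_voc(annual_traffic=150_000, length_km=42.0,
                      voc_per_veh_km=0.55, growth=0.03, rate=0.10, years=20)
alt2 = discounted_voc(annual_traffic=150_000, length_km=38.5,
                      voc_per_veh_km=0.60, growth=0.03, rate=0.10, years=20)
print(f"PV of VOC, alternative 1: {alt1:,.0f}")
print(f"PV of VOC, alternative 2: {alt2:,.0f}")
print("Preferred on VOC alone:", "alternative 1" if alt1 < alt2 else "alternative 2")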
3

Design and implementation of the Hybrid Detector for Microdosimetry (HDM): Challenges in readout architecture and experimental results

Pierobon, Enrico 05 December 2023 (has links)
This thesis introduces an innovative approach for enhancing the characterization of radiation field quality through microdosimetry. Over the past 30 years, clinical results have shown that ion therapy may be a superior treatment option for several types of cancer, including recurrent cancers, compared to conventional radiation. Despite these promising results, several treatment uncertainties related to biological and physical processes still prevent the full exploitation of particle therapy. Among the physical characterizations, it is paramount to measure the quality of the irradiating field in order to link the biological effect to its physical description. In this way, uncertainties in treatment can be reduced and outcomes optimized. One tool for studying the radiation field that has become increasingly important in the last decade is microdosimetry. In recent years, microdosimetry has proven to be a superior tool for describing radiation quality, especially when compared to the standard reference quantities used in the clinic today. In microdosimetry, the fundamental quantity is the lineal energy y, defined as the energy deposited in the detector divided by the Mean Chord Length (MCL), an approximation used to estimate the track length traveled by radiation in the detector that is valid in an isotropic, uniform radiation field. As a consequence, microdosimeters have evolved toward obtaining the best possible estimate of the energy released, without improving the accuracy of the MCL approximation. Measuring the Real Track Length (RTL) traveled by the particle inside the detector could provide a better description of radiation quality. From a biological perspective, it is critical whether a large amount of energy is released over a long particle track or is extremely dense over a short one: the denser the energy deposition, the more complex, and therefore the more significant, the induced biological damage is likely to be. For these reasons, a novel approach to microdosimetry is presented that considers the RTL in the description of radiation quality. The first chapter of the thesis presents standard microdosimetry and its main quantities. Special emphasis is given to the microdosimeter used in this work, the Tissue Equivalent Proportional Counter (TEPC), a gas microdosimeter that is equivalent, in terms of energy deposition, to 2 µm of tissue. A comprehensive characterization of the TEPC response to different ions and energies can be found in the literature. A topic missing from the literature is the investigation of the TEPC response to clinical protons at different particle rates. A section is therefore dedicated to the TEPC response to pileup. Pileup occurs when two or more energy deposition events are processed together, disrupting the normal signal processing. By exposing the TEPC to particle rates ranging from a few particles per second to 10^6 particles per second, it was possible to estimate the distortion of the acquired spectra due to pileup. In parallel, Monte Carlo simulations were used to reproduce the effect of pileup on microdosimetric spectra. Using a quantitative approach, the experimental spectra measured at different particle rates and the spectra simulated at different pileup probabilities were matched based on a similarity criterion. In this way, it was possible to build a particle rate versus pileup curve for the TEPC, used to quantify the pileup probability contribution.
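A minimal sketch of the distinction drawn above: lineal energy computed per event with the MCL approximation versus a value based on a measured track length. The event energies, track lengths, and the spherical-site assumption are illustrative placeholders, not data from this work.

# Hedged sketch: lineal energy with the mean-chord-length (MCL) approximation
# versus a per-event value using the real track length (RTL).
# All event data below are made-up placeholders for illustration.
import numpy as np

energy_kev = np.array([3.2, 15.8, 1.1, 42.0, 7.5])    # energy imparted per event (keV)
real_track_um = np.array([1.3, 2.0, 0.6, 1.9, 1.5])   # measured track length per event (um)

# MCL of a convex site is 4V/S; for a sphere of diameter d this is 2d/3
# (a spherical 2 um site is assumed here purely for illustration).
site_diameter_um = 2.0
mcl_um = 2.0 / 3.0 * site_diameter_um

y_mcl = energy_kev / mcl_um          # standard lineal energy, keV/um
y_rtl = energy_kev / real_track_um   # track-length-based variant

print("y  (MCL):", np.round(y_mcl, 2))
print("y  (RTL):", np.round(y_rtl, 2))
print(f"frequency means: {y_mcl.mean():.2f} keV/um (MCL), {y_rtl.mean():.2f} keV/um (RTL)")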
More generally, this approach could be extended to other microdosimeters. Acquiring data under pileup conditions is sometimes inevitable, and some microdosimeters are more susceptible to high particle rates than others. With this part of the thesis, I aim to provide a tool for acquiring microdosimetric spectra even under pileup conditions. A description of the TEPC acquisition chain is provided in the next section. This is an important topic, as any further integration or improvement will require the modification of at least one element of the acquisition chain. The typical data analysis carried out on microdosimetric spectra is then presented, together with the calibration procedure of the TEPC detector, which is based on Monte Carlo simulation using Geant4. Finally, I provide an overview of Mandarina, a Graphical User Interface (GUI) written in C# and developed specifically to analyze the experimental microdosimetric data. Using this software, users can build microdosimetric spectra starting from the raw acquired data. In addition, the software makes it possible to modify key acquisition parameters and provides real-time feedback on how the microdosimetric spectra change under these modifications. I then introduce the concept of the Hybrid Detector for Microdosimetry (HDM). HDM is composed of a commercial TEPC and four layers of Low Gain Avalanche Detectors (LGADs). LGADs are silicon detectors featuring internal gain obtained by exploiting the avalanche effect, which makes them suitable for detecting particles over a broad range of energy release in silicon. A detailed description of how LGADs detect ionizing radiation is provided in this work. In HDM, the LGADs are used as a tracking component capable of reconstructing the particle trajectories inside the TEPC. In this way, instead of relying on the MCL approximation to calculate the value of y, it is possible to define a new quantity, yr, which differs from the standard y in that it uses the real track length instead of the mean chord length approximation. Next, a preliminary Geant4-based study for optimizing the detector geometry is discussed. The tracking capability and the microdosimetric spectra simulated with the estimated track length were assessed and are presented in this thesis. To realize HDM experimentally, the acquisition chain of the TEPC must be upgraded, since the original acquisition system cannot directly integrate the tracking information from the LGAD strips. A chapter of this work is dedicated to the implementation of the new acquisition system, which allows the digitization of the time-series signal produced by the detector. The system is based on an Eclypse Z7 FPGA development board, which can host up to four Analog-to-Digital Converters (ADCs). Following a bottom-up approach, this chapter first describes the main characteristics of the signal to be digitized. An overview of the Eclypse Z7 development board and its main capabilities is then provided. Finally, the controller in charge of driving the ADC is described. Since the Zynq combines programmable logic with a processor, both the Programmable Logic (PL) and the Processing System (PS) need to be programmed. The PL is responsible for driving the ADC at a low level, controlling the triggering and the data flow to the PS. The PS hosts a custom Linux distribution tasked with supervising the acquisition by setting the main parameters, such as the number of samples to acquire and the trigger condition and its position with respect to the acquisition window.
The PS is also responsible for safely storing the data on an SD card connected to the Eclypse Z7. Since the system is fully customizable, the acquisition can be synchronized with other devices and thus integrated with other systems. In the specific case of HDM, a correspondence between the energy release and the LGAD-based tracking component needs to be implemented. Once the time series is properly acquired, the data analysis needs to be developed. A specific section of the thesis is dedicated to this important task, as correct processing of the signals is a requirement for obtaining robust microdosimetric spectra. The time-series processing features a classification algorithm that identifies artifacts in the acquired signals, such as saturation, double hits and noisy signals. Once the time series are correctly processed and the relevant information is extracted, the microdosimetric spectra can be calculated. In this acquisition chain, the detector signal is processed with three different levels of gain, yielding the same signal at three different amplifications. In this way, it is possible to span a large dynamic range while maintaining the resolution required in microdosimetry. The three signals must then be joined together to cover the required dynamic range; this process goes under the name of intercalibration and has a dedicated section in the chapter. Once the signals are intercalibrated, a calibration must be applied. The new calibration process developed within this work differs from the previously adopted method based on Monte Carlo simulation and is described in detail. Finally, the spectra obtained with the new acquisition are compared to those obtained with the original acquisition chain. The next chapter is dedicated to the LGAD readout. Again following a bottom-up approach, an introduction to the LGAD signal is provided. This readout chain is already partially available, since it has been developed by INFN Turin (INFN-TO, Istituto Nazionale di Fisica Nucleare). For the first stage of signal processing, two main components developed by INFN-TO are available: the ABACUS chip and the ESA_ABACUS printed circuit board (PCB). The ABACUS chip is an ASIC (application-specific integrated circuit) designed to directly process the small signals coming from the LGAD strips. At each activation of an LGAD strip, a digital signal is generated. Each ABACUS is capable of handling up to 24 LGAD strips and can adjust the threshold of each channel within a limited range. Threshold adjustment is required to separate signal from noise, as the channels are not expected to share a common threshold because of their channel-specific noise. The ESA_ABACUS PCB has been developed to physically host up to six ABACUS chips plus the LGAD sensor. It is equipped with an internal DAC (Digital-to-Analog Converter) used to set a common threshold for all 24 channels managed by one ABACUS chip. In this way, a common threshold can be selected using the on-board DAC, and the ABACUS chip is then used to satisfy the specific needs of each channel. Programming the thresholds requires the specific serial communication protocols defined by the manufacturer, which had to be integrated into the acquisition system. To meet these requirements, I developed an FPGA-based readout system capable of processing the signals from the ABACUS chip and setting the threshold for each channel.
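The intercalibration step mentioned above, joining low-, medium-, and high-gain copies of the same pulse into one quantity spanning a wide dynamic range, can be sketched as follows. The gain values and the saturation level are illustrative assumptions, not the settings used in the thesis.

# Hedged sketch of intercalibration: the same detector pulse is read out at
# three gains; each event uses the highest-gain channel that is not saturated,
# and all channels are mapped back to a common (low-gain-equivalent) scale.
# Gains and the ADC saturation level are made-up placeholders.

ADC_SATURATION = 60000                              # counts at which a channel clips (assumed)
GAINS = {"high": 100.0, "mid": 10.0, "low": 1.0}    # relative amplifications (assumed)

def intercalibrate(event):
    """event: raw ADC amplitude per gain channel -> amplitude on the low-gain scale."""
    for name in ("high", "mid", "low"):             # prefer the highest usable gain
        amplitude = event[name]
        if amplitude < ADC_SATURATION:
            return amplitude / GAINS[name]
    return event["low"] / GAINS["low"]              # everything clipped: fall back to low gain

events = [
    {"high": 1200, "mid": 130, "low": 15},          # small pulse: high gain usable
    {"high": 65000, "mid": 6400, "low": 650},       # high gain clipped: use mid
    {"high": 65000, "mid": 65000, "low": 4200},     # only low gain usable
]
print([round(intercalibrate(e), 1) for e in events])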
I describe the implementation of this system in detail in a dedicated chapter, again following a bottom-up approach that starts from the PL and moves to the PS. In a specific section, I show how the communication protocol was implemented and tested, and how the fast digital pulses coming from the ABACUS chip are processed in the PL. I also describe how the PS side was built. As in the case of the new TEPC acquisition, a Linux system runs on the PS, which makes it easier for the end user to work with the acquired data and the threshold controls. The movement of data from the PL to the PS is accomplished using Direct Memory Access (DMA). This is a critical component because it allows fast (within one clock cycle) data transfer from the PL to the user in the PS. The implementation of such an architecture is quite complex and demands knowledge of both advanced electronics and Linux systems; in fact, DMA requires the implementation of a Linux kernel driver to move the data correctly. This process is described in a dedicated section of this thesis. With this FPGA design, it was possible to acquire the signals from 24 LGAD strips and control their thresholds. An experimental campaign was conducted at the proton therapy center in Trento, where the whole acquisition system was tested extensively. The results are reported in a dedicated section of this thesis. All signals from protons with energies ranging from 70 to 228 MeV were correctly discriminated, showing that the readout system can operate with protons at clinical energies. Thermal tests were also conducted on the acquisition setups, since some thermal drifts of the baseline were observed during the experimental campaign; the test results are shown in a dedicated section. The thesis closes with a chapter discussing the results achieved and future perspectives.
4

Procedimento para análise de sensibilidade do Programa HDM-4 / Procedure for sensitivity analysis of the HDM-4 software

Nunes, Diego Frinhani 13 April 2012 (has links)
This Master of Science dissertation had two main objectives: to outline a sensitivity analysis procedure that can be widely used by users of the HDM-4 pavement management program, and to analyze the influence of changes to the pre-defined scenario of the case study, notably design and traffic considerations. The choice of the Elementary Effects Method was preceded by a literature review on sensitivity analysis, and the procedure was outlined through a case study based on a pre-defined scenario (a two-lane highway with asphalt pavement on a granular base), analyzing the variation of 61 factors over 300 trajectories, for a total of 18,600 HDM-4 simulations. The results indicated that a sample of 30 trajectories is adequate, since increasing this number does not bring significant gains and decreasing it harms the analysis. With the procedure outlined, it was then verified whether modifying the pre-defined scenario, that is, changing factors outside the group of 61 factors under study, would interfere with the final result of the procedure. Changes in design considerations (Annual Monetary Discount Rate and Analysis Period) and in traffic considerations (Percentage of Commercial Vehicles and Annual Traffic Growth Rate) were studied; both affected the final result of the sensitivity analysis, with the changes in traffic considerations proving more influential than the changes in design considerations.
In addition, a sensitivity analysis by the Elementary Effects Method with 250 trajectories identified the three most influential factors in the HDM-4 economic analysis, within the universe studied, as the Annual Average Daily Traffic of Motorized Vehicles, the Calibration Factor for Roughness Progression, and the Calibration Factor for the Ratio of the Adjusted Structural Number between Wet and Dry Seasons.
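A minimal sketch of the Elementary Effects (Morris) screening idea used above: build random one-at-a-time trajectories, compute one elementary effect per factor per trajectory, and rank factors by the mean absolute effect mu*. The toy model stands in for an HDM-4 run; factor count, ranges, and coefficients are illustrative assumptions, not the study's setup.

# Hedged sketch of the Elementary Effects (Morris) method with a toy model
# in place of "run HDM-4 and return an economic indicator".
import numpy as np

def toy_model(x):
    # placeholder response; not an HDM-4 relationship
    return 4.0 * x[0] + 2.0 * x[1] ** 2 + 0.5 * x[2] + 0.1 * x[0] * x[3]

def morris_screening(model, k, trajectories, levels=4, seed=None):
    rng = np.random.default_rng(seed)
    delta = levels / (2.0 * (levels - 1))            # standard Morris step
    effects = [[] for _ in range(k)]
    for _ in range(trajectories):
        grid = np.arange(levels // 2) / (levels - 1) # start levels that keep x + delta <= 1
        x = rng.choice(grid, size=k)                 # random starting point in [0, 1]^k
        y = model(x)
        for i in rng.permutation(k):                 # perturb factors one at a time
            x_new = x.copy()
            x_new[i] = x[i] + delta
            y_new = model(x_new)
            effects[i].append((y_new - y) / delta)   # one elementary effect for factor i
            x, y = x_new, y_new
    mu_star = [float(np.mean(np.abs(e))) for e in effects]
    sigma = [float(np.std(e)) for e in effects]
    return mu_star, sigma

mu_star, sigma = morris_screening(toy_model, k=4, trajectories=30, seed=0)
ranking = sorted(range(4), key=lambda i: mu_star[i], reverse=True)
for rank, i in enumerate(ranking, start=1):
    print(f"rank {rank}: factor x{i}, mu* = {mu_star[i]:.2f}, sigma = {sigma[i]:.2f}")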
5

Papel da sinalização purinérgica nos efeitos anti-inflamatórios do treinamento aeróbio em um modelo de asma induzido por ácaro de poeira doméstica / Role of purinergic signaling in the anti-inflammatory effects of aerobic training in a model of asthma induced by house dust mite

Oliveira Junior, Manoel Carneiro de 07 March 2017 (has links)
Asthma is a chronic inflammatory airway disease in which purinergic signaling has a central role, controlling the activation of structural and hematopoietic cells. Aerobic exercise (AE) has anti-inflammatory effects on the airways, but the cellular and molecular mechanisms involved are not known. This project therefore evaluated the effects of AE on the purinergic signaling response in a model of asthma induced by house dust mite (HDM). AE was performed on a treadmill at moderate intensity, 5x/week, for 4 weeks, starting after 3 weeks of prior HDM administration. HDM (Dermatophagoides pteronyssinus; 100 mg/mouse) was administered 3x/week for 7 weeks. The results demonstrated that AE reduced adenosine triphosphate (ATP) accumulation (p<0.001) and the levels of IL-1beta, IL-4, IL-5, CXCL1/KC, IL-13, IL-17, IL-23, IL-33 and TNF-alpha (p<0.001), while increasing IL-1ra, IL-2 and IL-10 in bronchoalveolar lavage (BAL). The total numbers of leukocytes, eosinophils, lymphocytes and neutrophils in BAL, and the numbers of eosinophils, neutrophils and lymphocytes in the airway wall, were reduced by AE (p<0.01). Airway collagen, elastin, smooth muscle and mucus were reduced by AE (p<0.01), as were TGF-beta, IGF-1 and VEGF levels (p<0.001). Lung mechanics (resistance, elastance, GTIS, HTIS, RAW) and airway hyperresponsiveness (AHR) to methacholine were improved by AE (p<0.01). IL-4, IL-5 and IL-13 production by lymph nodes, splenocytes and bone marrow cells was also reduced by AE. The expression of P2X7, P2Y2 and P2Y6 by peribronchial leukocytes (p<0.01) and by airway epithelial cells (p<0.01) was reduced by AE. In conclusion, AE reduces the asthma phenotype by inhibiting purinergic signaling in a model of HDM-induced asthma.
6

Implementation of a Pavement Management System with the HDM-4 Tool for Road Network No. 5, Ancón - Huacho - Pativilca Section

Montoya Goicochea, Jorge Eduardo January 2007 (has links)
When planning maintenance and rehabilitation in a multi-year program, pavement engineers face the decision of determining which sections of the road should be repaired, when the repairs should be carried out, and which types of repair or treatment should be used. This is a significant problem because of the wide range of possible combinations between the large number of homogeneous sections existing along the road and the different possible repair treatments.
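A minimal sketch of one common way this combinatorial programming problem is approached: rank candidate section/treatment/year combinations by an economic indicator and select them under a budget constraint. The sections, treatments, costs, and benefits below are invented for illustration and are not results from this thesis.

# Hedged sketch: budget-constrained selection of maintenance works by
# benefit/cost ratio. All data are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Candidate:
    section: str
    treatment: str
    year: int
    cost: float      # agency cost (monetary units)
    benefit: float   # discounted benefit, e.g. road-user cost savings

candidates = [
    Candidate("km 10-15", "overlay 50 mm",  2026, cost=1.2e6, benefit=3.0e6),
    Candidate("km 15-22", "reseal",         2026, cost=0.4e6, benefit=0.7e6),
    Candidate("km 22-30", "reconstruction", 2027, cost=3.5e6, benefit=4.2e6),
    Candidate("km 10-15", "reseal",         2027, cost=0.3e6, benefit=0.5e6),
]

budget = 2.0e6
chosen, spent = [], 0.0
# Greedy ranking by benefit/cost; a full optimization would also handle
# multi-year budgets and interactions between treatments over time.
for c in sorted(candidates, key=lambda c: c.benefit / c.cost, reverse=True):
    if spent + c.cost <= budget and all(c.section != s.section for s in chosen):
        chosen.append(c)
        spent += c.cost
for c in chosen:
    print(f"{c.year}: {c.section} -> {c.treatment} (B/C = {c.benefit / c.cost:.2f})")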
7

Comparative analysis of performance prediction models for flexible pavements

Deise Menezes Nascimento 01 June 2005 (has links)
Pavement performance prediction models are important tools in pavement management systems, essential for planning maintenance and rehabilitation activities and for budgeting the resources needed to preserve highways. The aim of this work is to compare performance prediction models, developed through empirical and empirical-mechanistic analyses, that predict the evolution of flexible pavement condition over time and/or accumulated traffic. The models analyzed were developed by researchers and by Brazilian and international road agencies, and include the deterioration models used by HDM-4 (Highway Development and Management), the pavement management computer program developed by the World Bank. The research compares the actual performance of pavement sections, obtained from the LTPP (Long-Term Pavement Performance) database of the FHWA (Federal Highway Administration), with the behavior predicted by the models developed by Queiroz (1981), Paterson (1987), Marcon (1996) and Yshiba (2003). The behavior of the LTPP-FHWA test sections is analyzed through a factorial design: Analysis of Variance (ANOVA) is used to determine the significance level of pre-selected factors (independent variables: traffic, age and corrected structural number) and to model the pavement performance of those sections (dependent variables: roughness and rutting).
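A minimal sketch of the kind of comparison described above: evaluate a candidate roughness-progression model against observed section data. The model form, coefficients, and "observed" values are illustrative assumptions, not the models or LTPP data used in this work.

# Hedged sketch: compare observed roughness with predictions of a generic
# empirical progression model IRI(t) = IRI0 + a*age + b*ESAL/SNC.
# Coefficients and observations are invented for illustration only.
import numpy as np

def predicted_iri(iri0, age_years, cum_esal_millions, snc, a=0.08, b=0.25):
    return iri0 + a * age_years + b * cum_esal_millions / snc

# one row per test section: (IRI0, age, cumulative ESALs in millions, SNC, observed IRI)
sections = np.array([
    [1.2,  5, 0.8, 4.0, 1.70],
    [1.5, 10, 2.5, 3.5, 2.60],
    [1.1,  8, 1.2, 5.0, 1.85],
    [1.8, 15, 4.0, 3.0, 3.40],
])
pred = predicted_iri(sections[:, 0], sections[:, 1], sections[:, 2], sections[:, 3])
obs = sections[:, 4]
bias = float(np.mean(pred - obs))
rmse = float(np.sqrt(np.mean((pred - obs) ** 2)))
print(f"bias = {bias:+.2f} m/km, RMSE = {rmse:.2f} m/km")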
8

An adaptation of the HDM III deterioration models to the central region of the state of São Paulo

Valéria Maria Sestini 04 April 1997 (has links)
Three highway sections in the central region of the state of São Paulo, divided into a total of nineteen homogeneous subsections, were evaluated in order to compare the pavement conditions predicted by the HDM III model with those observed in situ. Roughness was measured by the rod and level method, and recoverable deflections were measured with the Benkelman beam and a Falling Weight Deflectometer (FWD). The surface distress survey was based on the DNER-ES 128/83 specification and the DNER-PRO 08/94 procedure. Video recordings of the selected sections were made throughout the evaluation period. Traffic counts were carried out on each section and the equivalent standard axle load factors were established from 1993 weighing data. With these data, some of the HDM III deterioration models were tested for the specific pavements of the evaluated sections. Since adaptation of these models to the study region proved necessary, deterioration factors to be applied directly in the HDM III model were defined through regression analysis. Because the HDM III model uses deflections measured with the Benkelman beam, a correlation was established between these results and the recoverable pavement deflections measured with FWD equipment.
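A minimal sketch of the kind of correlation mentioned at the end of the abstract: a least-squares fit relating Benkelman beam deflections to FWD deflections. The paired deflection values below are invented for illustration and are not the measurements from this study.

# Hedged sketch: linear correlation D_Benkelman = a + b * D_FWD fitted by
# ordinary least squares. Deflection pairs (in 0.01 mm) are made up.
import numpy as np

d_fwd = np.array([28.0, 35.0, 42.0, 55.0, 61.0, 74.0, 90.0])
d_benkelman = np.array([33.0, 41.0, 50.0, 66.0, 70.0, 88.0, 104.0])

b, a = np.polyfit(d_fwd, d_benkelman, 1)          # slope, intercept
pred = a + b * d_fwd
ss_res = np.sum((d_benkelman - pred) ** 2)
ss_tot = np.sum((d_benkelman - d_benkelman.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"D_BB = {a:.2f} + {b:.3f} * D_FWD   (R^2 = {r2:.3f})")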
