  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Uso de varfarina : nível de informação e adesão ao tratamento em pacientes da atenção primária à saúde / Warfarin use: level of information and treatment adherence among primary health care patients

Souza, Thais Furtado de, January 2016
Warfarin is the most frequently prescribed oral anticoagulant, yet it is difficult to manage in clinical practice. Although its efficacy is well established, it is considered a potentially dangerous drug and is associated with fatal medication errors in primary health care. To ensure patient safety, its use requires monitoring of anticoagulation levels, so medication adherence and patient information about the care required during therapy are important. The aim of this study was to assess patients' level of information about the prescription, the level of information provided by the health care team, medication adherence, and anticoagulation control as measured by the International Normalized Ratio (INR). A cross-sectional study, drawn from a prospective cohort, was conducted with 60 patients seen in primary health care in the city of Ijuí. A questionnaire was used to assess the information provided to patients by the health team, the Eight-Item Morisky Medication Adherence Scale was used to measure adherence, and the prothrombin time test was used to obtain the INR value. According to the adopted criteria, the level of information provided by the health care team was insufficient and treatment adherence was poor, with most patients outside the appropriate therapeutic range. Improving the quality of the information provided to patients, promoting medication adherence, and strengthening anticoagulation monitoring are needed to ensure patient safety during treatment.
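Anticoagulation control of the kind assessed here is usually summarized as the share of INR readings that fall inside the therapeutic interval. As a hypothetical illustration (the 2.0–3.0 interval is the range commonly cited for warfarin therapy, and the sample readings are invented, not data from the study), such a check can be sketched as:

```python
def fraction_in_range(inr_values, low=2.0, high=3.0):
    """Share of INR readings inside the therapeutic interval [low, high]."""
    in_range = [v for v in inr_values if low <= v <= high]
    return len(in_range) / len(inr_values)

# Invented readings for one patient over six visits:
readings = [1.4, 2.2, 2.8, 3.6, 1.9, 2.5]
share = fraction_in_range(readings)  # 3 of 6 readings in range -> 0.5
```

A patient whose share stays low despite good adherence would prompt a review of dosing and of the guidance given by the care team.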
92

Variações na extensão da cobertura de gelo do Nevado Cololo, Bolívia / Variations in the extent of the ice cover of Nevado Cololo, Bolivia

Oliveira, Ana Maria Sanches Dorneles Ferreira de, January 2013
This study presents glacier fluctuation patterns for Nevado Cololo, Bolivia, over the period 1975–2011, determined from orbital, cartographic and climatic data. The Nevado Cololo ice masses are representative of Andean outer-tropics glaciers, which are subject to alternating humid (November–April) and dry (May–October) atmospheric conditions. This seasonality is determined by the latitudinal oscillation of the Intertropical Convergence Zone (ITCZ) and disturbed by non-seasonal ENSO events. The positive phase, El Niño, contributes negatively to the mass balance of these glaciers and was frequent during the investigated period. This work used TM/Landsat-5 imagery to determine the ice cover in 1989, 1997, 2008 and 2011. Applying the Normalized Difference Snow Index (NDSI), which exploits the opposite spectral characteristics of ice masses in the visible and near-infrared regions, the Nevado Cololo glaciers were delimited. Based on information from a topographic chart, a Digital Elevation Model (DEM) was obtained by interpolating elevation points with the ordinary kriging geostatistical method. The information derived from remote sensing and cartography was incorporated into a Geographic Information System (GIS) to obtain glacier parameters. The analysis of precipitation and temperature time series used data from the Global Precipitation Climatology Centre (GPCC)/NOAA, the Climate Research Unit Time Series (CRUTS)/University of East Anglia, and two meteorological stations. The climatic data show no statistically significant trends, but there is a weak reduction in precipitation during November, December and April, a condition that may indicate reduced cloudiness during the summer. By 2011, only 48 of the 122 glaciers identified in 1975 remained. Small glaciers (< 0.1 km²) with low maximum elevations were the most affected, and currently there are no glaciers below 4,626 m a.s.l. The ice cover was 24.77 km² in 2011, 42.02% less than in 1975. Surface loss occurred on all slopes, regardless of orientation, but east-facing glaciers were the most affected. Even the largest glacier of Nevado Cololo, on the SW face, lost 21.6% of its total area, and its front retreated about 1 km over the 36-year period. Proportionally, the number of glaciers with an average slope between 30° and 40° increased. The reduction in ice thickness is attested by glacier fragmentation and by bedrock outcrops in their inner parts. The mass loss of the studied glaciers was probably caused by the intensification of ablation processes.
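The NDSI used to delimit the glaciers is a simple band ratio between a visible band, where snow and ice reflect strongly, and a shortwave-infrared band, where they absorb. A minimal per-pixel sketch (the reflectance values are invented, and the 0.4 threshold is an assumption, though it is a commonly used cut-off for snow and ice mapping):

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index for one pixel:
    snow/ice reflects strongly in the green band and absorbs in SWIR."""
    return (green - swir) / (green + swir)

# (green, swir) reflectance pairs for three illustrative pixels:
pixels = [(0.80, 0.10), (0.30, 0.25), (0.65, 0.05)]
ice_mask = [ndsi(g, s) > 0.4 for g, s in pixels]  # [True, False, True]
```

Thresholding the index over a full scene yields the binary ice mask whose area changes are tracked between the 1989, 1997, 2008 and 2011 images.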
93

Garantia da qualidade de ensaios mecânicos de materiais metálicos / Quality assurance of mechanical testing of metallic materials

Fabrício, Daniel Antônio Kapper, January 2015
The implementation of Quality Management Systems in laboratory environments has become an increasingly frequent necessity, due to customer demand for reliable and traceable test results. In Brazil, the General Coordination for Accreditation of Inmetro (Cgcre) is the body responsible for the accreditation of laboratories according to the ISO/IEC 17025 standard, and the Physical Metallurgy Laboratory (LAMEF) of the Federal University of Rio Grande do Sul (UFRGS) has been accredited to perform tests on metallic materials since 2010. In 2013, during a process of accreditation scope extension for mechanical testing, nonconformities with the quality assurance requirements of ISO/IEC 17025 and NIT-DICLA-026 were identified. To comply with these requirements, analyses were carried out through Proficiency Testing and through internal quality monitoring using the Normalized Error and Analysis of Variance methods. When deviations from the acceptance criteria were identified, corrective actions were taken, aiming at the continuous improvement of the measurement systems. The results demonstrated that the systematic implementation of statistical methods for monitoring testing quality was fundamental to LAMEF's accreditation scope extension, which was consolidated in early 2014.
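The Normalized Error method mentioned above compares a laboratory result with a reference value, weighting the difference by the combined expanded uncertainties; |En| ≤ 1 is the usual acceptance criterion in proficiency testing. A sketch with invented tensile-strength numbers (not data from the thesis):

```python
import math

def normalized_error(x_lab, x_ref, u_lab, u_ref):
    """En = (x_lab - x_ref) / sqrt(u_lab**2 + u_ref**2),
    where u_lab and u_ref are the expanded uncertainties of each result."""
    return (x_lab - x_ref) / math.sqrt(u_lab ** 2 + u_ref ** 2)

# Illustrative proficiency-testing round (values in MPa, not real data):
en = normalized_error(x_lab=415.0, x_ref=410.0, u_lab=6.0, u_ref=8.0)
satisfactory = abs(en) <= 1.0  # True: |0.5| <= 1
```

A result with |En| > 1 would trigger the corrective-action cycle described in the abstract.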
94

Determinação experimental da distribuição de dose absorvida em diferentes qualidades de feixes mamográficos / Experimental determination of absorbed dose distribution in different qualities of mammographic x-ray beams.

Josilene Cerqueira Santos, 20 December 2017
The mean glandular dose is the quantity used for dosimetry in mammography. It depends strongly on properties of the breast under evaluation, such as its glandularity and compressed thickness. It also depends on properties of the X-ray spectrum, such as the anode/filter combination and the tube voltage, which modify the half-value layer (HVL) of the beam. Characterizing the X-ray beam through direct measurement of its spectrum is a highly complex procedure and is difficult to carry out on mammography systems, owing to the architecture of the equipment and the high photon fluence rates characteristic of these beams. Nevertheless, these spectra are the most complete source of quantitative and qualitative information about the beam. The general objective of this work is to estimate glandular dose distributions at different depths of breast tissue-equivalent materials (bTEM) using X-ray spectra measured on mammography units, employing radiographic techniques commonly used in breast cancer screening. The behavior of the mean glandular dose normalized to the incident air kerma (DgNp) was evaluated as a function of parameters related to the breast (glandularity and compressed thickness) and to the spectra (half-value layer, tube voltage, target/filter combination). First, an experimental methodology was developed to measure X-ray spectra on mammography units. Then, the following methods for calculating DgNp from these spectra were proposed: Method I, which calculates DgNp from incident spectra; Method II, which uses incident and transmitted spectra; and Method III, which uses incident and transmitted spectra to estimate the depth-dose distribution. Finally, for comparison, the DgNp distribution was also estimated using thermoluminescent dosimeters (TLDs). The methodology developed for measuring spectra proved effective for the proper positioning and alignment of the detector in the beam and, consequently, for the measurement of direct spectra. The experimental incident spectra showed good agreement with simulated spectra. The results showed well-behaved distributions of these coefficients (DgNp), with linear or exponential trends with respect to the analyzed parameters. For a given spectrum, DgNp values decreased exponentially with the thickness of the breast tissue-equivalent material and varied linearly with glandularity. In addition, these coefficients increase linearly with the half-value layer and, consequently, with the effective energy. From the DgNp distributions (obtained by Method III), it was possible to estimate the dose in the whole breast volume with a maximum difference of 5.2% from the values obtained with Method II. The variation of DgNp with depth obtained with TLDs was quite consistent with the results observed using Method III. It is concluded that it is possible to evaluate the glandular dose in mammography using X-ray spectra and that the proposed methodology has the potential to be applied as an alternative procedure for mammographic dosimetry.
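The quantities above combine multiplicatively: the mean glandular dose is the normalized coefficient DgNp times the incident air kerma, and the reported exponential decrease with depth can be sketched as a simple attenuation model. All numeric values below are hypothetical placeholders, not values from the thesis:

```python
import math

def mean_glandular_dose(dgn_p, incident_air_kerma_mgy):
    """MGD = DgNp (dimensionless, mGy per mGy) * incident air kerma (mGy)."""
    return dgn_p * incident_air_kerma_mgy

def dgn_at_depth(dgn_surface, depth_cm, mu_eff_per_cm):
    """Exponential depth dependence of the kind reported for DgNp:
    D(z) = D0 * exp(-mu_eff * z). mu_eff here is a hypothetical
    effective attenuation coefficient, not a fitted value."""
    return dgn_surface * math.exp(-mu_eff_per_cm * depth_cm)

mgd = mean_glandular_dose(dgn_p=0.2, incident_air_kerma_mgy=7.0)  # 1.4 mGy
deep = dgn_at_depth(dgn_surface=0.2, depth_cm=2.0, mu_eff_per_cm=0.35)
```

Integrating such a depth profile over the phantom thickness is, in spirit, what Method III does with measured incident and transmitted spectra.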
96

Evaluation of Idempotency & Block Size of Data on the Performance of Normalized Compression Distance Algorithm

Mandhapati, Venkata Srikanth; Bajwa, Kamran Ali, January 2012
Normalized compression distance (NCD) is a similarity metric used here for analyzing the type of file fragments. The performance of NCD depends on the underlying compression algorithm. We studied three compressors: bzip2, gzip and ppmd. The compression ratio of ppmd is better than that of bzip2, and the compression ratio of bzip2 is better than that of gzip; we evaluated which of the three is preferable from the viewpoint of idempotency. We then applied NCD, together with k-nearest-neighbour classification, to randomly selected public corpus data with different block sizes (512, 1024, 1536 and 2048 bytes). The performance of the compressors bzip2 and gzip for the NCD algorithm is also compared from the perspective of idempotency. Objectives: In this study we investigated the combined effect of two parameters, compression ratio versus idempotency and varying data block size, on the performance of NCD. The objective is to determine whether, for better NCD performance, a compressor should be selected on the basis of better compression ratio or of better idempotency. The purpose of using different block sizes was to evaluate whether NCD performance improves when the block size of the data used to build the datasets is varied. Methods: Experiments were performed to test the hypotheses and evaluate the effect of compression ratio versus idempotency and of data block size on the performance of NCD. Results: The null hypotheses of the main experiment were retained, showing that there is no statistically significant difference in NCD performance when the data block size is varied, and likewise none when a compressor is selected for NCD on the basis of better compression ratio rather than better idempotency. Conclusions: As the experiments were unable to reject the null hypotheses of the main experiment, no conclusion could be drawn about the effect of the independent variables on the dependent variable; that is, there is no statistically significant effect of compression ratio versus idempotency, or of varying data block size, on the performance of NCD.
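The NCD itself is compact enough to state directly: NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is compressed length. A sketch using Python's standard-library bzip2 bindings (one of the compressors the thesis studies); idempotency here means C(xx) ≈ C(x), which is what drives NCD(x, x) toward 0:

```python
import bz2

def c(data: bytes) -> int:
    """Compressed length of the input under bzip2."""
    return len(bz2.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte strings."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 50
b = b"pack my box with five dozen liquor jugs " * 50
# A compressor with good idempotency keeps NCD(a, a) close to 0,
# while dissimilar inputs score higher:
self_distance = ncd(a, a)
cross_distance = ncd(a, b)
```

Swapping `bz2` for `gzip` (or an external ppmd binary) changes only `c`, which is exactly the degree of freedom the experiments vary.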
97

Eddy-covariance carbon balance, photosynthetic capacity and vegetation indices in a harvested boreal jack pine stand

Hawthorne, Iain, 05 1900
Eddy-covariance (EC) CO₂ flux data were analysed and annual carbon (C) balances estimated for a four-year period (2004-2007) following clearcut harvesting of a boreal jack pine stand in northern Saskatchewan. The site was a source of C to the atmosphere in all years, with annual net ecosystem productivity (NEP) increasing from -153 g C m⁻² yr⁻¹ in 2004 to -63 g C m⁻² yr⁻¹ in 2007. This increase was mainly due to gross primary productivity (GPP) increasing significantly from 78 to 200 g C m⁻² yr⁻¹, while ecosystem respiration (R) increased only slightly from 231 to 263 g C m⁻² yr⁻¹ over the same period. In the 2006 growing season (GS), a field campaign was conducted to investigate the relationships between monthly destructive measurements of leaf area index (LAI) and daily measurements of the normalized difference vegetation index (NDVI) and photosynthetic capacity (Amax). The latter was derived from 5-day, 16-day, 30-day and annual Michaelis-Menten light-response analyses using daytime measurements of NEP and incident photosynthetically active radiation. Digital-camera data were used to evaluate the potential of using the rectilinear-lens vegetation index (RLVI) as a surrogate for NDVI of a young forest stand. Results showed that LAI was linearly related to NDVI and RLVI, which was largely the result of changes in the deciduous vegetation component across the GS. These results indicate that RLVI could be used as a surrogate for NDVI up to the GS maximum LAI of 0.91 m² m⁻² observed in 2006. Measured mean (± 1 S.D.) GS LAI was 0.67 (± 0.24) m² m⁻² in 2006. LAI accounted for the majority of the variability in Amax at the 30-day time scale, while at shorter time scales air temperature was the dominant control. For 2004 to 2007, mean spring estimates of LAI were 0.25, 0.29, 0.38 (compared to 0.40 m² m⁻² from measurements) and 0.41 m² m⁻², respectively. Results suggest that a steady increase in the jack pine LAI component accounted for the annual increases in GPP, and hence NEP, over the four years. / Faculty of Land and Food Systems / Graduate
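The Michaelis-Menten light-response analysis used to derive Amax fits NEP as a rectangular hyperbola of incident light, with Amax the asymptotic photosynthetic capacity. A minimal sketch of the fitted function (the parameter values are illustrative, not the stand's fitted values):

```python
def nep_light_response(q, amax, alpha, r):
    """Rectangular-hyperbola (Michaelis-Menten) light response:
    NEP = amax * alpha * q / (alpha * q + amax) - r,
    where q is incident PAR, alpha the apparent quantum yield,
    and r the ecosystem respiration term."""
    return amax * alpha * q / (alpha * q + amax) - r

# In darkness NEP equals -r; at high light it approaches amax - r.
dark = nep_light_response(q=0.0, amax=10.0, alpha=0.05, r=2.0)      # -2.0
bright = nep_light_response(q=2000.0, amax=10.0, alpha=0.05, r=2.0)
```

Fitting this curve to daytime NEP vs. PAR over 5-day to annual windows yields the Amax series that the campaign relates to LAI, NDVI and air temperature.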
98

Simulace proudění vzduchu a stanovení trvalé tlakové ztráty pro normalizovanou clonu / Simulation and determination of permanent pressure loss for the normalized orifice plate

Šimberský, Michal, January 2014
This master's thesis deals with the simulation of flow through a normalized (standard) orifice plate. It describes flow measurement with flowmeters that reduce the cross-section of the pipe and thereby cause a pressure difference between the upstream and downstream sides of the meter. This is followed by a description of methods for modeling turbulent flow and of the flow-simulation software from Ansys. The theoretical part is followed by a practical part focused on determining the permanent pressure loss caused by the orifice plate and on verifying the required straight pipeline lengths between the orifice and an obstacle.
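For the permanent pressure loss investigated here, ISO 5167 gives a relation in terms of the diameter ratio β = d/D, the discharge coefficient C, and the measured differential pressure Δp. A sketch of that relation (the numeric inputs are illustrative assumptions, not values from the thesis):

```python
import math

def permanent_pressure_loss(dp, beta, c):
    """Permanent pressure loss of an orifice plate (ISO 5167 relation):
    loss = dp * (sqrt(1 - beta^4 * (1 - C^2)) - C * beta^2)
              / (sqrt(1 - beta^4 * (1 - C^2)) + C * beta^2).
    dp: differential pressure across the taps [Pa],
    beta: orifice-to-pipe diameter ratio d/D,
    c: discharge coefficient."""
    s = math.sqrt(1.0 - beta ** 4 * (1.0 - c ** 2))
    return dp * (s - c * beta ** 2) / (s + c * beta ** 2)

# Roughly 73% of a 10 kPa differential pressure is lost permanently
# for beta = 0.5 and a typical C of about 0.61:
loss = permanent_pressure_loss(dp=10_000.0, beta=0.5, c=0.61)
```

Comparing a CFD-derived pressure drop far downstream against this standard relation is one natural way to validate the simulated loss.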
99

Geoelectrical imaging for interpretation of geological conditions affecting quarry operations

Magnusson, Mimmi K., January 2008
Determination of the subsurface geology is very important for the rock quarry industry. It is primarily done by drilling and by mapping the surface geology. However, in Sweden the bedrock is often completely covered by Quaternary sediments, making the prediction of subsurface geology quite difficult. Incorrect prediction of rock-mass quality can lead to economic problems for the quarry. By performing geophysical measurements, a more complete understanding of the subsurface geology can be obtained. This study shows that 2D parallel data sampling makes a 3D inversion of the dataset possible, which greatly enhances the visualization of the subsurface. Furthermore, the electrical resistivity technique together with the induced polarization method proved very efficient in detecting fracture frequency, identifying major fracture zones, and revealing variations in rock-mass quality, all of which can affect aggregate quality. With this technique not only the rock-mass quality but also the thickness of the overburden is determined. Implementation of geophysics can be a valuable tool for the quarry industry, resulting in substantial economic benefits.
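Resistivity measurements like those described above are reduced to apparent resistivity through a geometric factor that depends on the electrode layout; for the common Wenner array with spacing a the factor is 2πa. A sketch with invented field readings (the array choice is an assumption for illustration; the study does not state its configuration here):

```python
import math

def wenner_apparent_resistivity(spacing_m, delta_v, current_a):
    """Apparent resistivity [ohm-m] for a Wenner array:
    rho_a = 2 * pi * a * (dV / I)."""
    return 2.0 * math.pi * spacing_m * delta_v / current_a

# Invented reading: a = 10 m, dV = 0.5 V, I = 0.1 A
rho_a = wenner_apparent_resistivity(spacing_m=10.0, delta_v=0.5, current_a=0.1)
# Low apparent resistivity can flag water-bearing fracture zones;
# high values suggest intact crystalline rock.
```

Inverting grids of such readings, measured along parallel 2D lines, is what produces the 3D resistivity model of the quarry subsurface.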
100

Learning, Detection, Representation, Indexing and Retrieval of Multi-agent Events in Videos

Hakeem, Asaad, 01 January 2007
The world that we live in is a complex network of agents and their interactions, which are termed events. An instance of an event is composed of directly measurable low-level actions (which I term sub-events) having a temporal order. Also, the agents can act independently (e.g. voting) as well as collectively (e.g. scoring a touchdown in a football game) to perform an event. With the dawn of the new millennium, low-level vision tasks such as segmentation, object classification, and tracking have become fairly robust, but a representational gap still exists between low-level measurements and high-level understanding of video sequences. This dissertation is an effort to bridge that gap, in which I propose novel learning, detection, representation, indexing and retrieval approaches for multi-agent events in videos. In order to achieve the goal of high-level understanding of videos, I first apply statistical learning techniques to model multi-agent events. For that purpose, I use the training videos to model the events by estimating the conditional dependencies between sub-events. Thus, given a video sequence, I track the people (heads and hand regions) and objects using a mean-shift tracker. An underlying rule-based system detects the sub-events from the tracked trajectories of the people and objects, based on their relative motion. Next, an event model is constructed by estimating the sub-event dependencies, that is, how frequently sub-event B occurs given that sub-event A has occurred. The advantages of such an event model are two-fold: first, I do not require prior knowledge of the number of agents involved in an event; second, no assumptions are made about the length of an event. Secondly, after learning the event models, I detect events in a novel video using graph clustering techniques. To that end, I construct a graph of temporally ordered sub-events occurring in the novel video.
Next, using the learnt event model, I estimate a weight matrix of conditional dependencies between sub-events in the novel video. Applying the Normalized Cut graph clustering technique to the estimated weight matrix then facilitates detecting events in the novel video. The principal assumption made in this work is that events are composed of highly correlated chains of sub-events that have high conditional dependency (association) within a cluster and relatively low conditional dependency (disassociation) between clusters. Thirdly, in order to represent the detected events, I propose an extension of the CASE representation of natural languages. I extend CASE to allow the representation of temporal structure between sub-events. Also, in order to capture both multi-agent and multi-threaded events, I introduce a hierarchical CASE representation of events in terms of sub-events and case-lists. The essence of the proposition is that, based on the temporal relationships of the agent motions and a description of their states, it is possible to build a formal description of an event. Furthermore, I recognize the importance of representing the variations in the temporal order of sub-events that may occur in an event, and encode the temporal probabilities directly into my event representation. The proposed extended representation with probabilistic temporal encoding is termed P-CASE, which allows a plausible means of interface between users and the computer. Using the P-CASE representation, I automatically encode the event ontology from training videos. This offers a significant advantage, since domain experts do not have to go through the tedious task of determining the structure of events by browsing all the videos. Finally, I utilize the event representation for indexing and retrieval of events. Given the different instances of a particular event, I index the events using the P-CASE representation.
Next, given a query in the P-CASE representation, event retrieval is performed using a two-level search. At the first level, a maximum likelihood estimate of the query event against the different indexed event models is computed, which provides the best-matching event model. At the second level, a matching score is obtained for all the event instances belonging to the best-matching event model, using a weighted Jaccard similarity measure. Extensive experimentation was conducted on the detection, representation, indexing and retrieval of multi-agent events in videos from the meeting, surveillance, and railroad-monitoring domains. To that end, the Semoran system was developed, which accepts user input in any of three forms for event retrieval: predefined queries in the P-CASE representation, custom queries in the P-CASE representation, or query by example video. The system then searches the entire database and returns the matched videos to the user. I used seven standard video datasets from the computer vision community, as well as my own videos, to test the robustness of the proposed methods.
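The second-level matching score is a weighted Jaccard similarity; for non-negative feature vectors this is the ratio of the sums of element-wise minima and maxima. A sketch (the example vectors are invented, standing in for per-instance sub-event weights):

```python
def weighted_jaccard(x, y):
    """Weighted Jaccard similarity: sum(min(xi, yi)) / sum(max(xi, yi)).
    Returns 1.0 for two all-zero vectors by convention."""
    num = sum(min(a, b) for a, b in zip(x, y))
    den = sum(max(a, b) for a, b in zip(x, y))
    return num / den if den else 1.0

# Hypothetical sub-event weight vectors for a query instance
# and one indexed event instance:
score = weighted_jaccard([0.9, 0.1, 0.0], [0.6, 0.1, 0.3])  # 0.7 / 1.3
```

Identical vectors score 1.0 and disjoint supports score 0.0, so ranking indexed instances by this score directly orders the retrieval results.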
