  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Extension of Nonequilibrium Work Theorems with Applications to Diffusion and Permeation in Biological Systems

Holland, Bryan W. 05 September 2012 (has links)
Nonequilibrium work methods for determining the potential of mean force (PMF), w(z), have recently gained popularity as an alternative to standard equilibrium-based methods. Introduced by Kosztin et al., the forward-reverse (FR) method is a bidirectional work method in that it requires the work to be sampled in both forward and reverse directions along the reaction coordinate z. This bidirectional sampling leads to much faster convergence than other nonequilibrium methods such as the Jarzynski equality, and the calculation itself is extremely simple, making the FR method an attractive way of determining the PMF. Presented here is an extension to the FR method that deals with sampling problems along essentially irreversible reaction coordinates. By oscillating a particle as it is steered along a reaction coordinate, both forward and reverse work samples are obtained as the particle progresses. Dubbed the oscillating forward-reverse (OFR) method, this new method overcomes the issue of irreversibility that is present in numerous soft-matter and biological systems, particularly in the stretching or unfolding of proteins. The data analysis for the OFR method is non-trivial, however, and to this end a software package named the 'OFR Analysis Tool' has been created. This software performs all of the complicated analysis necessary, as well as a complete error analysis that accounts for correlations in the data, thus streamlining the use of the OFR method for potential end users. Another attractive feature of the FR method is that the dissipative work is collected at the same time as the free energy changes, making it possible to also calculate local diffusion coefficients, D(z), from the same simulation as the PMF through the Stokes-Nernst-Einstein relation F_drag = -γv, with γ = k_B T / D. While working with the OFR method, however, the D(z) results never matched known values or those obtained through other methods, including the mean square displacement (or Einstein) method.
After a reformulation of the procedure to obtain D(z), i.e. by including the correct path length and particle speeds, results were obtained that were much closer to the correct values. The results, however, showed very little variation over the length of the reaction coordinate, even when D(z) was known to vary drastically. It seemed that the highly variable and discontinuous velocity of a particle steered by the "stiff-spring" method was incompatible with the macroscopic definition of the drag coefficient, γ, which requires an at most slowly varying velocity so that the assumption of a linearly related dissipative work remains valid at all times. To address this, a new dynamic constraint steering protocol (DCP) was developed to replace the previously used "stiff-spring" method, now referred to as a dynamic restraint protocol (DRP). We present here the results for diffusion in bulk water, and both the PMF and diffusion results from the permeation of a water molecule through a DPPC membrane. We also consider the issue of ergodicity and sampling, and propose that to obtain an accurate w(z) (and D(z)) from even a moderately complex system, the final result should be a weighted average obtained from numerous pulls. An additional utility of the FR and OFR methods is that the permeability across lipid bilayers can be calculated from w(z) and D(z) using the inhomogeneous solubility-diffusion (ISD) model. As tests, the permeability was first calculated for H2O and O2 through DPPC. From the simulations, the permeability coefficients for H2O were found to be 0.129 ± 0.075 cm/s and 0.141 ± 0.043 cm/s, at 323 K and 350 K respectively, while the permeability coefficients for O2 were 114 ± 40 cm/s and 101 ± 27 cm/s, again at 323 K and 350 K respectively. As a final, more challenging system, the permeability of tyramine – a positively charged trace amine at physiological pH – was calculated.
The final value of P = 0.89 ± 0.24 Å/ns is more than an order of magnitude lower than that obtained from experiment (22 ± 4 Å/ns), although it is clear that the permeability as calculated through the ISD model is extremely sensitive to the PMF, as scaling the PMF by ~20% allowed the simulation and experimental values to agree within uncertainty. With accurate predictions for free energies and permeabilities, the OFR method could potentially be used for many valuable endeavors such as rational drug design.
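The permeability calculation described above follows the inhomogeneous solubility-diffusion (ISD) model, in which the membrane resistance is the integral across the bilayer of exp(w(z)/k_B T)/D(z). A minimal numerical sketch of that standard ISD expression (hypothetical flat profiles and trapezoidal integration; not code from the thesis):

```python
import math

def isd_permeability(z, w, D, T=323.0):
    """ISD model: 1/P = integral over z of exp(w(z)/(kB*T)) / D(z) dz.
    Units assumed: z in cm, w in J, D in cm^2/s -> P in cm/s."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    resistance = 0.0
    for i in range(len(z) - 1):
        f0 = math.exp(w[i] / (kB * T)) / D[i]
        f1 = math.exp(w[i + 1] / (kB * T)) / D[i + 1]
        resistance += 0.5 * (f0 + f1) * (z[i + 1] - z[i])  # trapezoid rule
    return 1.0 / resistance

# Toy flat profile: w = 0 everywhere, constant D, so P reduces to D / L.
z = [i * 1e-8 for i in range(101)]   # 0 .. 1e-6 cm
w = [0.0] * 101                      # flat PMF
D = [1e-5] * 101                     # cm^2/s
P = isd_permeability(z, w, D)        # expect D / L = 10 cm/s
```

With a flat PMF the integral collapses to L/D, which makes the sketch easy to sanity-check before supplying real w(z) and D(z) profiles.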
22

Elucidating mechanisms of gene regulation. Integration of high-throughput sequencing data for studying the epigenome

Althammer, Sonja Daniela 27 April 2012 (has links)
The recent advent of High-Throughput Sequencing (HTS) methods has triggered a revolution in gene regulation studies. The demand to process the immense amount of emerging data for insight into the regulatory mechanisms of the cell has never been higher. We address this issue by describing methods to analyze, integrate and interpret HTS data from different sources. In particular, we developed and benchmarked Pyicos, a powerful toolkit that offers flexibility, versatility and efficient memory usage. We applied it to ChIP-Seq data on the progesterone receptor in breast cancer cells to gain insight into the regulatory mechanisms of hormones. Moreover, we embedded Pyicos into a pipeline that integrates HTS data from different sources, using data sets from ENCODE to systematically calculate signal changes between two cell lines. We thus created a model that accurately predicts the regulatory outcome of gene expression, based on epigenetic changes in a gene locus. Finally, we provide the processed data to the scientific community in a BioMart database.
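The signal-change step described in the abstract — comparing normalized HTS signal at a gene locus between two cell lines — can be sketched as a log-ratio computation (hypothetical locus names and pseudocount; Pyicos's actual interface is not shown here):

```python
import math

def log2_fold_changes(signal_a, signal_b, pseudocount=1.0):
    """Per-locus log2 ratio of normalized HTS signal between two cell
    lines; the pseudocount guards against division by zero at empty loci."""
    return {locus: math.log2((signal_b[locus] + pseudocount) /
                             (signal_a[locus] + pseudocount))
            for locus in signal_a}

# Toy signals for two hypothetical loci in cell lines A and B.
changes = log2_fold_changes({"LOCUS1": 3.0, "LOCUS2": 15.0},
                            {"LOCUS1": 15.0, "LOCUS2": 3.0})
```

Such per-locus changes would then feed a predictive model of expression outcome, as the pipeline in the abstract does with ENCODE data.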
23

A bug report analysis and search tool

Cavalcanti, Yguaratã Cerqueira 31 January 2009 (has links)
Software maintenance and evolution are activities characterized by their enormous cost and slow execution. Nevertheless, they are unavoidable for ensuring software quality: almost every successful piece of software drives users to request changes and improvements. Sommerville is even more emphatic, stating that change in software projects is a fact of life. Moreover, different studies have asserted over the years that maintenance and evolution are the most expensive activities of the development cycle, accounting for up to about 90% of its costs. All these peculiarities of the maintenance and evolution phase lead academia and industry to constantly investigate new ways to reduce the cost of these activities. In this context, Software Configuration Management (SCM) is a set of activities and standards for managing software evolution and maintenance; SCM defines how all modifications are recorded and processed, their impact on the whole system, and other procedures. Different support tools exist for these SCM tasks, such as version control systems and bug trackers. However, some problems can arise from their use, for example the automatic assignment of a developer to a bug report and the duplication of bug reports. This dissertation investigates the bug report duplication problem that results from the use of bug trackers in software development projects. The problem is characterized by the submission of two or more bug reports that describe the same software issue; its main consequences are work overload in the search and analysis of bug reports, and poor use of the time allocated to that activity.
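A common baseline for the bug-report duplication problem described above is textual similarity between report summaries. A minimal sketch using word-set (Jaccard) overlap — a standard baseline, not necessarily the dissertation's method:

```python
def jaccard(a, b):
    """Word-set overlap between two bug report summaries, in [0, 1].
    Scores near 1 flag likely duplicates for an analyst to review."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Two hypothetical report summaries describing the same crash.
score = jaccard("crash when saving file",
                "app crash when saving a file")
```

A bug-report search tool would rank existing reports by such a score against an incoming report, surfacing candidates before a new entry is filed.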
24

Forenzní analýza malware / Forensic Malware Analysis

Král, Benjamin January 2018 (has links)
This master's thesis describes the methodologies used in forensic malware analysis, including methods of static and dynamic analysis. Based on these methods, a tool intended for Computer Security Incident Response Teams (CSIRTs) is designed to allow fast analysis of, and decisions regarding, malware samples in security incident investigations. The design of this tool is thoroughly described, along with the requirements on which it is based. Based on the design, the ForensIRT tool is implemented and then used to analyze a sample of the Cridex malware to demonstrate its capabilities. Finally, the analysis results are compared with those of other comparable malware forensics tools.
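A minimal illustration of one static-analysis triage step of the kind such a tool automates — hashing a sample for identification and cross-referencing (illustrative only; this is not ForensIRT's API):

```python
import hashlib

def static_triage(sample_bytes):
    """Record size and SHA-256 of a sample; the digest can be looked up
    against known-malware databases before any dynamic analysis runs."""
    return {
        "size": len(sample_bytes),
        "sha256": hashlib.sha256(sample_bytes).hexdigest(),
    }

# A fake stand-in for real sample bytes (real PE files start with b"MZ").
report = static_triage(b"MZ...fake sample...")
```

Dynamic analysis would follow only for samples the static stage cannot classify, which is what makes this ordering fast for incident response.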
25

Nástroj pro podporu provádění analýzy finančního zdraví firmy / Tool for Execution of a Firm's Financial Health Analysis

Kubiš, Jan January 2018 (has links)
The topic of this thesis is the financial analysis of both a company and an investment. It also describes the software for carrying out such analysis, which was implemented within the thesis. The first part describes financial analysis in general: definitions, motivation, definition of the input data set, and an enumeration of common methods and indicators, their meaning and the interpretation of their values. The second part describes the design, implementation and functionality of a tool whose purpose is to automate selected areas of financial analysis.
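As an illustration of the kind of indicator such a tool computes and interprets, a common liquidity ratio can be sketched as follows (the thesis's exact indicator set is not specified in the abstract):

```python
def current_ratio(current_assets, current_liabilities):
    """Current ratio = current assets / current liabilities.
    Values around 1.5-2.5 are conventionally read as healthy liquidity."""
    return current_assets / current_liabilities

# Hypothetical balance-sheet figures.
r = current_ratio(500_000, 250_000)  # ratio of 2.0
```

An automated tool would evaluate many such indicators from the input data set and map each value to an interpretation band, as the second part of the thesis describes.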
26

Barrieren des Demografiemanagements überwinden und Wandel erfolgreich gestalten: Barriereanalysetool / Overcoming Barriers to Demography Management and Successfully Shaping Change: Barrier Analysis Tool

Geithner, Silke, Brückner, Franziska, Möller, Luisa, Schirmer, Frank January 2016 (has links)
The brochure "Barrieren des Demografiemanagements überwinden und Wandel erfolgreich gestalten" includes, in the form of its barrier analysis tool, a self-check with which companies can test how much awareness of demography management exists in their organization and which demography-management instruments are in use. From the evaluation of the self-check, companies can read off how relevant demographic developments (e.g. the skilled-labor shortage, an ageing workforce) currently are for them. In addition, practical recommendations and tips for further activities in each topic area are presented. Contents: Foreword; How can the analysis tool be used?; 1 Instruments of demography management; 2 Barriers; 2.1 Supporters of demography management; 2.2 Perception of demographic challenges; 2.3 Use of demography-specific resources; 2.4 Generation management; 2.5 Development and adaptation of HR management; 2.6 Position of HR management in the company; 3 Conclusion; 4 Extended literature summary.
27

Development of training material for a process analysis tool in the paper industry / Utveckling av ett utbildningsmaterial för ett processanalysverktyg inom pappersindustrin

Kristoffersson, Sara January 2020 (has links)
Paperboard is accessible to everyone, for example as packaging for foodstuffs or beauty products. Paperboard consists of several layers of pulp and comes in different qualities depending on the material's area of use. Within paperboard production, the behavior of the process is analyzed to find ways to decrease product variations and reach the desired product results. Process analyses are made continuously to improve production and avoid defects in the paperboard. The company Holmen has recently introduced, on trial, a new process analysis tool named Wedge at its paperboard mill in Workington, UK; Wedge is a software tool for analyzing the paperboard process. Holmen's vision was to develop training material for the tool that employees could use for educational purposes. The purpose of this degree project was to develop training material on the process analysis tool, i.e. the software Wedge, for novice learners. Initially, the aim was to examine and identify the employees' learning with the software tool at the Workington mill; based on that, training material was developed that could serve as a self-directed learning resource. The study was conducted with qualitative methods, including a group interview with development engineers and a one-to-one interview with the training manager at the Workington mill, together with observations of training sessions and an evaluation questionnaire on the training material. From these results, a thematic analysis was conducted in which the identified themes were interpreted from the perspectives of cognitive learning and adult learning. The analysis of the interviews and questionnaire responses indicates that the training material is suitable and pedagogical for novice learners. The development engineers argue, among other things, that computer-based training should contain 'step-by-step' examples of work-related situations, and that the training must be organized so that new information does not become overwhelming and unintelligible. It is also important that new knowledge can be acquired both visually and through text-based instructions, providing learners with various kinds of teaching aids, since people take different approaches to learning. The training material, based on e-learning, was therefore designed as a first lesson in how to use and navigate the process analysis tool. It comprises four interactive videos with incremental learning of how the tool can be used in the paper industry.
28

Safety in the Urban Space / Trygghet i stadsrummet : Analysverktyg för att främja upplevd trygghet genom fysisk utformning och kollektiv kartläggning

Ouertani, Mayssa January 2022 (has links)
Creating safe environments is a necessity within urban planning. In each element of the urban planning process, it is essential to reflect on how the proposed plans can affect safety within the chosen environment. The purpose of this thesis is therefore to develop an analysis tool, grounded in scientific research, for assessing the perceived safety that is mediated through environmental design. The report aims to answer the following research questions: Which theoretical perspectives and scientific research can serve as a basis for developing an analysis tool that seeks to increase perceived safety through environmental design? How can perceived safety through environmental design be assessed using the analysis tool? The method consists of a literature study, giving a broad and global view of the data and results of previous research on safety. International studies and articles were primarily used, with a few elements from the Swedish context, in order to obtain a broad mapping of the different contexts in which safety in the physical environment has been investigated. The search engines used were the Royal Institute of Technology's Primo, Google and Google Scholar; through these, the theoretical perspectives and the scientific research underlying the work were found. The analysis tool consists of a checklist covering six categories: Lighting and mobility, Maintenance of the physical environment, Technical monitoring, Natural monitoring, Physical design and orientation, and Vegetation in the physical environment. Each category contains a set of claims that the user of the tool rates on an assessment scale from 1 to 5. To take a position on the claims, the user must perform a site visit and observe the surroundings.
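A plausible scoring step for the checklist described above — aggregating the 1-to-5 ratings into a mean score per category — might look like this (the thesis does not prescribe an aggregation method, so this is an assumption):

```python
def category_scores(ratings):
    """Mean of the 1-5 claim ratings per checklist category; low means
    would flag categories where perceived safety needs attention."""
    return {cat: sum(vals) / len(vals) for cat, vals in ratings.items()}

# Hypothetical site-visit ratings for two of the six categories.
scores = category_scores({
    "Lighting and mobility": [4, 5, 3],
    "Vegetation in the physical environment": [2, 2, 4],
})
```

Averaging is the simplest choice; a weighted scheme could instead emphasize categories the literature ties most strongly to perceived safety.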
29

Développement et mise à l'essai d'un outil pour analyser des albums jeunesse afin d'élaborer un répertoire d'oeuvres québécoises propices au travail interprétatif / Development and testing of a tool for analyzing picturebooks in order to compile a collection of Quebec works conducive to interpretive work

Turgeon, Elaine 02 1900 (has links)
This research examines literary reading in the context of primary education and focuses specifically on picturebooks that foster the development of interpretive skills. It gives an account of research and development pursuing three objectives. The first objective was to develop a tool for analyzing the narrative devices of picturebooks and identifying elements conducive to interpretive work; the second was to test the analysis tool in order to measure its reliability and validity; and the final objective was to use the tool to put together a collection of Quebec picturebooks that favour the development of interpretive skills in primary students. To achieve these objectives, we adopted a methodology that first analysed the needs, namely the users and purposes of the analysis tool, and then designed and developed an initial version. This version was tested with experts to determine its validity. A second version was then produced and its reliability evaluated using coders. Lastly, a third and a fourth version were produced in order to assemble a collection of Quebec picturebooks conducive to interpretive work. Although our test did not allow an altogether satisfactory conclusion about the objectivity of the tool's indicators, an analysis of the experts' comments confirms that the indicators are highly relevant, which suggests that the tool we developed, and the collection of fifteen Quebec picturebooks favouring the development of interpretive skills in primary students that we assembled, may prove useful and relevant first tools for schools.
30

Uma abordagem para visualização e análise baseada em clustering de dados espaço-temporais. / An approach to visualization and analysis based on clustering of spatiotemporal data.

OLIVEIRA, Maxwell Guimarães de. 04 August 2018 (has links)
Nowadays, a considerable amount of spatiotemporal data is available through various media, especially on the Internet. The visualization of spatiotemporal data is a complex task that requires a set of suitable visual resources which, together, enable users to interpret the data correctly. Apart from visualization techniques, techniques of knowledge discovery in databases have proven relevant for the exploratory analysis of relationships in spatiotemporal data. A survey of the state of the art in spatiotemporal data visualization leads to the conclusion that the area is still short of solutions for viewing and analyzing such data: many approaches cover only spatial issues, ignoring the temporal characteristics of the data. In this context, the main objective of this work is to improve the user experience in spatiotemporal visualization and analysis, going beyond the visualization of raw spatiotemporal data and also considering the visualization of spatiotemporal data derived from a knowledge discovery process, more specifically clustering algorithms. This goal is achieved by defining an innovative approach to the analysis and visualization of spatiotemporal data, and its implementation, called GeoSTAT (Geographic SpatioTemporal Analysis Tool), which incorporates important points observed in the main existing approaches and, above all, adds visualization techniques geared to the temporal dimension and to the use of clustering algorithms, highlighting characteristics of spatiotemporal data that have so far been little explored. The work is validated through two case studies, each dealing with spatiotemporal data from a specific domain, to demonstrate the end-user experience with the visualization techniques combined in the proposed approach.
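As a toy stand-in for the clustering step the approach integrates, events with coordinates (x, y, t) can be grouped using separate spatial and temporal thresholds, so that the temporal dimension is not ignored (illustrative; this is not GeoSTAT's implementation):

```python
def st_clusters(points, eps_space=1.0, eps_time=1.0):
    """Greedy single-pass grouping of (x, y, t) events: a point joins the
    first existing cluster containing a neighbor within both the spatial
    and the temporal threshold; otherwise it starts a new cluster."""
    clusters = []
    for x, y, t in points:
        home = None
        for cluster in clusters:
            if any(abs(x - cx) <= eps_space and abs(y - cy) <= eps_space
                   and abs(t - ct) <= eps_time
                   for cx, cy, ct in cluster):
                home = cluster
                break
        if home is None:
            home = []
            clusters.append(home)
        home.append((x, y, t))
    return clusters

# Two events close in space and time form one cluster; a distant event is separate.
groups = st_clusters([(0.0, 0.0, 0.0), (0.5, 0.2, 0.4), (10.0, 10.0, 5.0)])
```

Treating the time gap as its own threshold, rather than folding t into the spatial distance, is exactly the kind of temporal awareness the abstract argues purely spatial approaches lack.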
