About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Digitizing the Parthenon using 3D Scanning : Managing Huge Datasets

Lundgren, Therese January 2004 (has links)
Digitizing objects and environments from the real world has become an important part of creating realistic computer graphics. Through the use of structured lighting and laser time-of-flight measurements, the capture of geometric models is now a common process, and the resulting visualizations give viewers new possibilities for both visual and intellectual experiences. This thesis presents the reconstruction of the Parthenon temple and its environment in Athens, Greece, using a 3D laser-scanning technique. In order to reconstruct a realistic model from 3D scanning data, the acquired datasets have to be processed in several phases: the data have to be organized, registered and integrated, in addition to pre- and post-processing. This thesis describes the development of a suitable and efficient data processing pipeline for the given data. The approach differs from previous scanning projects in that this large-scale object is digitized at very high resolution; in particular, the issue of managing and processing huge datasets is described. Finally, the processing of the datasets in the different phases and the resulting 3D model of the Parthenon are presented and evaluated.
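The registration phase mentioned in this abstract is the step that aligns overlapping scans into a common coordinate frame. As a rough illustration only (the abstract does not specify the registration algorithm used in the thesis), the sketch below shows a single point-to-point ICP iteration in Python; the function name and the use of NumPy/SciPy are assumptions made for the example.

```python
# Minimal sketch of one rigid-registration step (point-to-point ICP),
# assuming two roughly pre-aligned scans given as N x 3 NumPy arrays.
import numpy as np
from scipy.spatial import cKDTree

def icp_step(source, target):
    """One ICP iteration: match closest points, solve for the best
    rigid transform (Kabsch/SVD), and apply it to the source scan."""
    # 1. Correspondences: nearest target point for each source point.
    tree = cKDTree(target)
    _, idx = tree.query(source)
    matched = target[idx]

    # 2. Best-fit rotation and translation between the matched sets (SVD).
    src_c = source - source.mean(axis=0)
    tgt_c = matched - matched.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = matched.mean(axis=0) - R @ source.mean(axis=0)

    # 3. Apply the transform to the source scan.
    return source @ R.T + t

# Usage: iterate until the alignment stops improving.
# aligned = scan_a
# for _ in range(30):
#     aligned = icp_step(aligned, scan_b)
```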
2

XML Integrated Environment For Service-Oriented Data Management

Maarouf, Marwan Younes 12 June 2007 (has links)
No description available.
3

A Versatile Sensor Data Processing Framework for Resource Technology

Kaever, Peter, Oertel, Wolfgang, Renno, Axel, Seidel, Peter, Meyer, Markus, Reuter, Markus, König, Stefan 28 June 2021 (has links)
Extending experimental infrastructures with novel sensors opens up the possibility of gaining qualitatively new insights. To exploit this information fully, the entire processing chain from data readout to application-oriented evaluation has to be covered. Extending existing scientific instruments involves the structural and temporal integration of the new sensor data into the existing system. Thanks to its flexible design, the framework presented here has the potential to integrate different sensor types into different high-performance platforms. Two different integration approaches demonstrate this flexibility: one aims at increasing the sensitivity of a secondary ion mass spectrometry facility, the other at providing a prototype for the analysis of recyclates. The very different hardware prerequisites and application requirements formed the basis for the development of a flexible software framework. To provide complex and powerful application building blocks, a software technology was developed that combines modular pipeline structures with sensor and output interfaces and a knowledge base with corresponding configuration and processing modules. / Novel sensors with the ability to collect qualitatively new information offer the potential to improve experimental infrastructure and methods in the field of research technology. In order to get full access to this information, the entire range from detector readout and data transfer over proper data and knowledge models up to complex application functions has to be covered. The extension of existing scientific instruments comprises the integration of diverse sensor information into existing hardware, based on the expansion of pivotal event schemes and data models. Due to its flexible approach, the proposed framework has the potential to integrate additional sensor types and offers migration capabilities to high-performance computing platforms. Two different implementation setups prove the flexibility of this approach: one extends the material analysis capabilities of a secondary ion mass spectrometry device, the other implements a functional prototype for the online analysis of recyclates. Both setups can be regarded as two complementary parts of a highly topical and ground-breaking scientific application field. The requirements and possibilities resulting from different hardware concepts on the one hand and diverse application fields on the other are the basis for the development of a versatile software framework. In order to support complex and efficient application functions under heterogeneous and flexible technical conditions, a software technology is proposed that offers modular processing pipeline structures with internal and external data interfaces, backed by a knowledge base with respective configuration and conclusion mechanisms.
Contents: 1. Introduction; 2. Hardware Architecture and Application Background; 3. Software Concept; 4. Experimental Results; 5. Conclusion and Outlook
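The modular pipeline structure with sensor and output interfaces described in this abstract can be pictured with a small, hypothetical sketch in Python; the class and method names below are illustrative assumptions and do not reflect the framework's actual API.

```python
# Hypothetical sketch of a modular sensor-processing pipeline: events flow
# from a sensor interface through configurable stages to an output sink.
from typing import Callable, Iterable, List

class SensorSource:
    """Wraps a raw readout function so different detectors plug in uniformly."""
    def __init__(self, read_fn: Callable[[], Iterable[dict]]):
        self.read_fn = read_fn

    def events(self) -> Iterable[dict]:
        return self.read_fn()

class Pipeline:
    """Chains processing stages; each stage maps one event dict to another."""
    def __init__(self, stages: List[Callable[[dict], dict]], sink: Callable[[dict], None]):
        self.stages = stages
        self.sink = sink

    def run(self, source: SensorSource) -> None:
        for event in source.events():
            for stage in self.stages:
                event = stage(event)
            self.sink(event)

# Example wiring: a fake detector, one calibration stage, and a print sink.
def calibrate(event: dict) -> dict:
    return {**event, "energy_keV": 0.5 * event["counts"]}

fake_detector = SensorSource(lambda: ({"channel": i, "counts": 10 * i} for i in range(3)))
Pipeline([calibrate], sink=print).run(fake_detector)
```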
4

Au-delà de la volumétrie en morphométrie basée sur les déformations : application au dimorphisme sexuel durant l'adolescence / Beyond volumetry in longitudinal deformation-based morphometry : application to sexual dimorphism during adolescence

Hadj-Hamou, Mehdi 14 December 2016 (has links)
Analysing brain morphological changes in time series of images is an important topic in neuroimaging. Although the development of longitudinal databases has helped reduce inter-individual variability, there remain many biases that must be avoided when estimating longitudinal evolutions. Moreover, when intra-subject changes are very small compared to inter-subject variability, it is crucial to know whether existing methods can capture the longitudinal changes without bias. In most studies, longitudinal changes are reduced to their scalar volumetric component to make their analysis easier. However, brain changes are generally not purely volumetric, and in this multivariate case the interpretation is more difficult. This thesis addresses these problems along three main axes. First, we propose a longitudinal processing pipeline based on deformation-based morphometry that aims to robustly estimate longitudinal changes. To avoid introducing bias, we detail the entire sequence of processing steps. In addition to this contribution, we integrate a modification of the non-linear registration algorithm that consists in masking the similarity term while preserving the symmetry of the formulation. This contribution increases the robustness of the results with respect to intensity artifacts located at the brain boundary and thus increases the sensitivity of the statistical study performed on the longitudinal deformations. / Analysing the progression of brain morphological changes in time series of images is an important topic in neuroimaging. Although the development of longitudinal databases has helped reduce inter-individual variability, there still exist numerous biases that need to be avoided when capturing longitudinal evolutions. Moreover, when the intra-subject changes are very small with respect to the inter-subject variability, it is crucial to know whether the available methods can capture the longitudinal change without bias. In most studies, these longitudinal changes are limited to scalar volumetric changes in order to ease their analysis. However, brain changes are not limited to volumetry, and in this multivariate case the interpretation is more difficult. This thesis addresses these problems along three main axes. First, we propose a longitudinal deformation-based morphometry processing pipeline to robustly estimate the longitudinal changes. We detail the whole sequence of processing steps, as they are key to avoiding added bias. In addition to this contribution, we integrate a modification of the non-linear registration algorithm that masks the similarity term while keeping the symmetry of the formulation. This change increases the robustness of the results with respect to intensity artifacts located at the brain boundaries and leads to increased sensitivity of the statistical study on the longitudinal deformations. The proposed processing pipeline is based on freely available software and tools, so that it is fully reproducible.
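For illustration, one generic way to write a masked, symmetry-preserving similarity term of the kind this abstract describes is sketched below in LaTeX; this is an assumed form for exposition, not necessarily the exact criterion used in the thesis.

```latex
% Illustrative masked, symmetric sum-of-squared-differences term:
% I and J are the two time-point images, \phi the estimated transformation,
% M a binary mask restricting the comparison to reliable brain voxels.
\[
  E_{\mathrm{sim}}(\phi) =
  \int_{\Omega} M(x)\,\bigl(I(\phi(x)) - J(x)\bigr)^{2}\,dx
  + \int_{\Omega} M(x)\,\bigl(J(\phi^{-1}(x)) - I(x)\bigr)^{2}\,dx
\]
% Swapping I and J while replacing \phi by \phi^{-1} leaves the term
% unchanged, which is the symmetry the pipeline aims to preserve.
```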
