  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Compression multimodale du signal et de l’image en utilisant un seul codeur / Multimodal compression of digital signal and image data using a unique encoder

Zeybek, Emre 24 March 2011 (has links)
The objective of this thesis is to study and analyze a new compression strategy whose principle is to compress data from multiple modalities jointly, using a single encoder. This approach is called "Multimodal Compression": an image and an audio signal, for instance, can be compressed together by a single image encoder (e.g. a standard codec), without the need to integrate an audio codec. The basic idea developed in this thesis is to insert the samples of a signal in place of certain pixels of the "carrier" image while preserving the quality of the information after encoding and decoding. This technique should not be confused with watermarking or steganography, since Multimodal Compression does not aim to conceal one piece of information within another. Its two main objectives are, on the one hand, to improve compression performance in terms of rate-distortion and, on the other, to optimize the use of the hardware resources of a given embedded system (e.g. faster encoding/decoding). Throughout this report we study and analyze variants of Multimodal Compression whose core consists of designing mixing and separation functions applied before encoding and after decoding. Validation is carried out on standard images and signals as well as on specific data such as biomedical images and signals. The work concludes with an extension of the Multimodal Compression strategy to video.
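The insertion step described above — replacing certain carrier pixels with signal samples, then recovering both streams after decoding — can be sketched in a few lines. This is a minimal illustration, not the thesis's actual mixing function: the names `mix` and `separate` are invented, and a real Multimodal Compression chain would pass the mixed image through an image codec and reconstruct the missing pixels by interpolation rather than transmitting them side-channel as done here.

```python
def mix(image_pixels, audio_samples, step):
    """Insert audio samples in place of every `step`-th pixel.
    Returns the mixed carrier plus the overwritten pixels so that the
    demo separation below is lossless; a real codec would instead
    interpolate the missing pixels after decoding."""
    mixed = list(image_pixels)
    replaced = []
    for i, sample in enumerate(audio_samples):
        pos = i * step
        replaced.append(mixed[pos])
        mixed[pos] = sample
    return mixed, replaced

def separate(mixed, replaced, n_samples, step):
    """Undo `mix`: extract the audio samples and restore the pixels."""
    image = list(mixed)
    audio = []
    for i in range(n_samples):
        pos = i * step
        audio.append(image[pos])
        image[pos] = replaced[i]
    return image, audio
```

With a 100-pixel carrier and 10 audio samples at `step=8`, the round trip recovers both the image and the signal exactly before any lossy coding is applied.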
12

Efficient search of an underwater area based on probability

Pukitis Furhoff, Hampus January 2019 (has links)
Today more and more types of autonomous robots and vehicles are being developed. Most of them rely on the global positioning system and/or communication with other robots and vehicles to determine their global position. However, these are not viable options for today's autonomous underwater vehicles (AUVs), since radio waves do not travel well in water. Instead, various techniques for determining an AUV's position are used, each of which comes with a margin of error. This thesis examines the problem of efficiently performing a local search within this margin of error with the objective of finding a docking station or a buoy. To solve this problem, research was conducted on search theory and how it has previously been applied in this context. It was found that classical Bayesian search theory had rarely been used here, since it would require too much processing power to be a viable option on the embedded systems that AUVs carry. Instead, various heuristics were used to obtain solutions that were still viable for the situations in which they were applied, even if they were perhaps not optimal. Based on this, the search strategies Spiral, Greedy, Look-ahead and Quadtree were developed and evaluated in a simulator. Their mean time to detection (MTTD) was compared, as well as the average time it took each strategy to compute a search. Look-ahead was the best of the four strategies with respect to MTTD, and based on this it is suggested that it be implemented and evaluated in a real AUV.
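As a rough illustration of the heuristic strategies compared above, a minimal greedy search over a cell-probability grid might look like the sketch below. All names are illustrative: the thesis's actual Greedy strategy and its MTTD measurement live in the simulator and differ in detail (e.g. travel time rather than visit count).

```python
def greedy_search(prob, start):
    """Greedily visit the unvisited 4-neighbour with the highest
    probability mass; fall back to the globally best unvisited cell
    when all neighbours are exhausted. Returns the visit order."""
    rows, cols = len(prob), len(prob[0])
    visited = {start}
    order = [start]
    cur = start
    while len(visited) < rows * cols:
        r, c = cur
        nbrs = [(r + dr, c + dc)
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < rows and 0 <= c + dc < cols
                and (r + dr, c + dc) not in visited]
        if nbrs:
            cur = max(nbrs, key=lambda rc: prob[rc[0]][rc[1]])
        else:
            cur = max(((i, j) for i in range(rows) for j in range(cols)
                       if (i, j) not in visited),
                      key=lambda rc: prob[rc[0]][rc[1]])
        visited.add(cur)
        order.append(cur)
    return order

def mean_time_to_detection(prob, order):
    """Expected number of cell visits before the target is found,
    assuming perfect detection in the visited cell."""
    return sum((t + 1) * prob[r][c] for t, (r, c) in enumerate(order))
```

On a uniform grid every complete visit order gives the same MTTD, which is why the thesis's comparison only becomes interesting on non-uniform probability maps.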
13

[pt] MODELAGEM DE OBJETOS GEOLÓGICOS: IA PARA DETECÇÃO AUTOMÁTICA DE FALHAS E GERAÇÃO DE MALHAS DE QUADRILÁTEROS / [en] MODELING OF GEOBODIES: AI FOR SEISMIC FAULT DETECTION AND ALL-QUADRILATERAL MESH GENERATION

AXELLE DANY JULIETTE POCHET 14 December 2018 (has links)
Safe oil exploration requires good numerical modeling of the subsurface geobodies, which includes, among other steps, seismic interpretation and mesh generation. This thesis presents a study in these two areas. The first study is a contribution to seismic data interpretation, examining the possibilities of automatic seismic fault detection using deep learning methods. In particular, we apply Convolutional Neural Networks (CNNs) directly to seismic amplitude maps, with the particularity of training on synthetic data with the goal of classifying real data. In the second study, we propose a new two-dimensional all-quadrilateral meshing algorithm for geomechanical domains, based on an innovative quadtree approach: we define new subdivision patterns to efficiently adapt the mesh to any input geometry. The resulting meshes are suited for Finite Element Method (FEM) simulations.
14

DCMS: A Data Analytics and Management System for Molecular Simulation

Berrada, Meryem 16 March 2015 (has links)
Although molecular simulation (MS) systems are a major research tool in multiple scientific and engineering fields, there is still a lack of systems for effective data management and fast data retrieval and processing. This is mainly because MS generates very large amounts of data: a system usually encompasses millions of records, and one query usually runs over tens of thousands of time frames. For this purpose, we designed and developed a new application, DCMS (a Data Analytics and Management System for Molecular Simulation), that aims to speed up the process of new discovery in the medical and physics fields. DCMS stores simulation data in a database and provides users with a user-friendly interface to upload, retrieve, query, and analyze MS data without having to deal with raw data. In addition, we created a new indexing scheme, the Time-Parameterized Spatial (TPS) tree, to accelerate query processing through indexes that take advantage of the locality relationships between atoms. The tree was implemented directly inside the PostgreSQL kernel, on top of the SP-GiST platform. Along with this new tree, two new data types were defined, as well as new algorithms for five point-retrieval queries.
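A uniform spatial grid gives the flavor of the locality-aware indexing described above. This is only a stand-in: the TPS tree itself is an SP-GiST index inside PostgreSQL and additionally parameterizes over time, whereas this sketch merely hashes atom positions into coarse cells so a box query only inspects nearby buckets.

```python
from collections import defaultdict

def build_grid(atoms, cell):
    """Hash (frame, atom_id, x, y, z) records into a coarse spatial grid
    so that nearby atoms land in the same bucket."""
    grid = defaultdict(list)
    for rec in atoms:
        _, _, x, y, z = rec
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(rec)
    return grid

def range_query(grid, cell, lo, hi):
    """Return records inside the axis-aligned box [lo, hi], visiting
    only the grid buckets the box overlaps."""
    out = []
    for i in range(int(lo[0] // cell), int(hi[0] // cell) + 1):
        for j in range(int(lo[1] // cell), int(hi[1] // cell) + 1):
            for k in range(int(lo[2] // cell), int(hi[2] // cell) + 1):
                for rec in grid.get((i, j, k), ()):
                    if all(l <= v <= h for l, v, h in zip(lo, rec[2:], hi)):
                        out.append(rec)
    return out
```

The point of any such structure, grid or tree, is that a query over a small box touches a small fraction of the buckets instead of scanning every atom in every frame.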
15

Détection et correction des intersections entre courbes B-splines. Application a la généralisation cartographique.

Guilbert, Eric 08 November 2004 (has links) (PDF)
This thesis presents a method for detecting and correcting visual and singular intersections between B-spline curves, adapted to the generalization of nautical charts. In the first part, we address the detection of intersections. The proposed method first partitions the plane; the curves are distributed among the cells without numerical computation, so the partitioning is fast and robust. The intersections are then computed using subdivision schemes. The second part deals with correcting conflicts by deformation under cartographic constraints. We present a first method in which the control polygon is treated as a bar network deformed by the application of external forces. A second method is then presented in which the displacement is represented by a snake subject to energies defined according to the conflicts. The shape parameters are set automatically.
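The subdivision-scheme approach to intersection detection can be illustrated on Bézier control polygons (a single B-spline span): recursively split each curve with de Casteljau's algorithm and prune pairs of pieces whose control-polygon bounding boxes do not overlap. This is a generic sketch of the technique under those assumptions, not the thesis's implementation.

```python
def split(ctrl, t=0.5):
    """One de Casteljau step: split a Bezier control polygon in two."""
    left, right = [ctrl[0]], [ctrl[-1]]
    pts = ctrl
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
        left.append(pts[0])
        right.append(pts[-1])
    return left, right[::-1]

def bbox(ctrl):
    xs = [p[0] for p in ctrl]
    ys = [p[1] for p in ctrl]
    return min(xs), min(ys), max(xs), max(ys)

def boxes_overlap(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def size(box):
    return max(box[2] - box[0], box[3] - box[1])

def intersects(c1, c2, eps=1e-3):
    """Subdivide until the boxes separate (no intersection) or both
    shrink below eps (intersection located to within eps)."""
    b1, b2 = bbox(c1), bbox(c2)
    if not boxes_overlap(b1, b2):
        return False
    if size(b1) < eps and size(b2) < eps:
        return True
    l1, r1 = split(c1)
    l2, r2 = split(c2)
    return any(intersects(a, b, eps)
               for a in (l1, r1) for b in (l2, r2))
```

The convex-hull property of Bézier curves guarantees the curve stays inside its control polygon's box, which is what makes the pruning sound.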
16

Trois applications de la fragmentation et du calcul poissonien à la combinatoire

Joseph, Adrien 30 June 2011 (has links) (PDF)
This thesis is devoted to the study of three combinatorial models arising in probability theory. We first consider the height of fragmentation trees. For a fixed dislocation measure, two quite different regimes can appear depending on the capacity of the vertices: above a critical capacity the heights share the same asymptotics, while below this critical parameter the trees grow ever taller as the rupture threshold decreases. We then present results obtained with Nicolas Curien on the quadtree, making explicit the asymptotic behavior of the mean cost of partial-match queries; fragmentation theory again plays a key role. Finally, we study large random graphs that are critical for the configuration model. Under certain hypotheses, we prove that, suitably rescaled, the sequences of sizes of the connected components of these graphs converge in a certain sense to a nontrivial random sequence that we characterize. The situation differs markedly according to whether the degree distribution of a vertex has a finite third moment or is a power law with exponent between 3 and 4.
17

An Adaptive Well-Balanced Positivity Preserving Central-Upwind Scheme for the Shallow Water Equations Over Quadtree Grids

Ghazizadeh Fard, Seyed Mohammad Ali 17 April 2020 (has links)
Shallow water equations are widely used to model water flows in hydrodynamics and civil engineering. They are complex and, except for some simplified cases, admit no analytical solution, so the partial differential equations of the shallow water system have been the subject of various numerical analyses and studies in past decades. In this study, we construct a stable and robust finite volume scheme for the shallow water equations over quadtree grids. Quadtree grids are two-dimensional semi-structured Cartesian grids with applications in several fields of engineering, such as computational fluid dynamics. They refine or coarsen where required in the computational domain, which can reduce the computational cost of some problems. An accurate and robust numerical scheme on such grids must provide a balance between the flux and source terms, preserve the positivity of the water height and water surface, and be capable of regenerating the grid according to the conditions of the problem and the computed solution. The proposed scheme uses a piecewise constant approximation and employs a high-order Runge-Kutta method to make the solution high-order in space and time. Hence, in this thesis we develop an adaptive well-balanced positivity-preserving scheme for the shallow water system over quadtree grids, utilizing several techniques. We demonstrate the formulation of the proposed scheme over one of the possible configurations of quadtree cells. Six numerical benchmark tests confirm the ability of the scheme to solve problems accurately and to capture small perturbations.
Furthermore, we extend the proposed scheme to coupled variable-density shallow water flows, establishing an extended method focused on eliminating nonphysical oscillations while retaining the well-balanced, positivity-preserving, and adaptive properties of the scheme. Four further numerical benchmark tests show that this extension is accurate, stable, and robust.
18

Regionenbasierte Partitionierung bei fraktaler Bildkompression mit Quadtrees

Ochotta, Tilo 20 October 2017 (has links)
Fractal image coding is a powerful method for compressing image data. This thesis investigates two different approaches to the partitioning of the image to be coded. Both are region-based, highly adaptive image-partitioning methods in which the image is first decomposed into base blocks that are then merged into suitable regions. In the first method, already studied in depth in earlier work, the base partition consists of square blocks of equal size. In the second method under investigation, the base blocks are produced by a quadtree decomposition and therefore have different sizes. After applying a corresponding region-merging procedure, the resulting partitions differ both in structure and in the number of bits required to store them. On the one hand, the region-based partitions built on quadtrees have a more rectilinear structure and can therefore be compressed better with arithmetic coding than region-based partitions built on uniform base blocks. On the other hand, the quadtree-based approach yields measurably worse decoded image quality for the same number of regions. These differences are examined and explained in this thesis. Existing approaches from the literature are taken up, and further techniques leading to more efficient partition storage are presented. Experiments have shown that, with the proposed innovations, the quadtree-based approach gives slightly better reconstruction-error results than the uniform approach; the values achieved represent the best results to date for fractal image compression in the spatial domain.
The quadtree scheme is also advantageous for fast encoding: it achieves better image quality in shorter coding time than the uniform approach.
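The quadtree base partitioning step can be sketched generically: split a square block while some homogeneity measure exceeds a threshold, yielding variable-size base blocks for a subsequent region-merging pass. The pixel-variance criterion below is illustrative only; the abstract does not specify the thesis's actual splitting criterion.

```python
def quadtree_partition(img, x, y, size, thresh, min_size, out):
    """Split the size x size block at (x, y) while its pixel variance
    exceeds `thresh`; append leaf blocks (x, y, size) to `out`."""
    vals = [img[r][c] for r in range(y, y + size) for c in range(x, x + size)]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    if var > thresh and size > min_size:
        h = size // 2
        for dx in (0, h):
            for dy in (0, h):
                quadtree_partition(img, x + dx, y + dy, h, thresh, min_size, out)
    else:
        out.append((x, y, size))
    return out
```

A homogeneous image stays a single block, while an image with one distinct quadrant splits once into four base blocks, which is exactly the adaptivity the region-merging stage then exploits.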
19

Multi-Resolution Obstacle Mapping with Rapidly-Exploring Random Tree Path Planning for Unmanned Air Vehicles

Millar, Brett Wayne 08 April 2011 (has links) (PDF)
Unmanned air vehicles (UAVs) have become an important area of research. UAVs are used in many environments that may contain previously unknown obstacles or sources of danger. This research addresses the problem of obstacle mapping and path planning while the UAV is in flight. Online obstacle mapping is achieved through a multi-resolution map: as sensor information is received, a quadtree is built to hold it, based on the uncertainty associated with each measurement. Once a quadtree map of obstacles is built, we want online path re-planning to occur as quickly as possible. We introduce the idea of a quadtree rapidly-exploring random tree (RRT), used as the online path re-planning algorithm. This approach takes a variable-sized step instead of the fixed step size usually used in the RRT algorithm, with the step size determined by the structure of the quadtree: the step grows larger or smaller with the size of the area represented by the quadtree cell it passes through. Finally, this approach is tested in a simulation environment. The results show that the quadtree RRT requires fewer steps on average than a standard RRT to find a path through an area, and also has a smaller variance in the number of steps taken by the path planning algorithm.
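The variable-step idea can be sketched as follows: a toy quadtree subdivides around obstacle points, and one RRT extension step moves a distance equal to the edge length of the leaf containing the nearest tree node, so steps are long in open space and short near obstacles. Class names and the refinement rule here are illustrative, not the thesis's.

```python
import math

class QuadCell:
    """Minimal quadtree node: cells containing an obstacle point are
    subdivided up to max_depth; leaves give the local step size."""
    def __init__(self, x, y, size, depth, obstacles, max_depth=4):
        self.x, self.y, self.size = x, y, size
        self.children = []
        hit = any(self.contains(ox, oy) for ox, oy in obstacles)
        if hit and depth < max_depth:
            h = size / 2
            self.children = [QuadCell(x + dx * h, y + dy * h, h,
                                      depth + 1, obstacles, max_depth)
                             for dx in (0, 1) for dy in (0, 1)]

    def contains(self, px, py):
        return (self.x <= px < self.x + self.size
                and self.y <= py < self.y + self.size)

    def leaf_at(self, px, py):
        for c in self.children:
            if c.contains(px, py):
                return c.leaf_at(px, py)
        return self

def extend(tree_nodes, sample, root):
    """One RRT extension with a variable step: step length equals the
    edge length of the quadtree leaf containing the nearest node."""
    near = min(tree_nodes, key=lambda n: math.dist(n, sample))
    step = root.leaf_at(*near).size
    d = math.dist(near, sample)
    if d == 0:
        return near
    t = min(1.0, step / d)
    new = (near[0] + t * (sample[0] - near[0]),
           near[1] + t * (sample[1] - near[1]))
    tree_nodes.append(new)
    return new
```

A node in a large free leaf thus covers ground quickly, which is the intuition behind the quadtree RRT needing fewer steps on average than a fixed-step RRT.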
20

Automated Adaptive Data Center Generation For Meshless Methods

Mitteff, Eric 01 January 2006 (has links)
Meshless methods have recently received much attention but have yet to reach their full potential, as the required problem setup (i.e. collocation point distribution) is still significant and far from automated. The distribution of points still closely resembles the nodes of finite-volume-type meshes, and the free parameter, c, of the radial basis expansion functions (RBF) must still be tailored specifically to each problem. The localized meshless collocation method investigated requires a local influence region, or topology, used as the expansion medium to produce the required field derivatives. Tests have shown that a regular Cartesian point distribution produces optimal results; hence, in order to maintain a locally Cartesian point distribution, a recursive quadtree scheme is herein proposed. The quadtree method allows modeling of irregular geometries and refinement of regions of interest, and it lends itself to full automation, thus reducing problem setup effort. Furthermore, the construction of the localized expansion regions is closely tied to the point distribution process and is hence incorporated into the automated sequence. This also allows the RBF free parameter to be optimized on a local basis to achieve a desired level of accuracy in the expansion. In addition, an optimized auto-segmentation process is adopted to distribute and balance the problem loads throughout a parallel computational environment while minimizing communication requirements.
