
Consistent and Communication-Efficient Range-Only Decentralized Collaborative Localization using Covariance Intersection

Sjödahl Wennergren, Erik; Lundberg, Björn (January 2024)
High-accuracy localization is vital for many applications and is a fundamental prerequisite for enabling autonomous missions. Modern navigation systems often rely heavily on Global Navigation Satellite Systems (GNSS) to achieve high localization accuracy over extended periods of time, which has created a need for alternative localization methods that can be used in GNSS-disturbed environments. One popular alternative that has emerged is Collaborative Localization (CL), a method in which agents of a swarm combine knowledge of their own state with relative measurements of other agents to achieve a localization accuracy better than what a single agent can achieve on its own. Performing this in a decentralized manner introduces the challenge of accounting for unknown inter-agent correlations, which typically requires conservative fusion methods such as Covariance Intersection (CI) to preserve consistency. Many existing CL algorithms that utilize CI assume agents have perception systems capable of identifying the relative position of other swarm members. These algorithms therefore do not work in systems where, e.g., agents can only measure range to each other. Other CI algorithms that support more generic measurement models can require large amounts of data to be exchanged when agents communicate, which can cause problems in bandwidth-limited systems. This thesis develops a consistent decentralized collaborative localization algorithm based on CI that supports range-only measurements between agents and requires a communication effort that is constant in the number of agents in the swarm. The algorithm, referred to as the PSCI algorithm, was found to maintain satisfactory performance in various scenarios, but exhibits slightly increased sensitivity to the measurement geometry compared to an existing, more communication-heavy, CI-based algorithm.
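To illustrate the kind of fusion step this abstract refers to, here is a minimal covariance-intersection sketch. This is illustrative only, not the thesis's PSCI formulation: the grid search over the weight and the trace cost function are assumptions (the abstract itself notes that the choice of CI cost function matters).

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    """Fuse two estimates with unknown cross-correlation using CI.

    Sweeps the weight omega over [0, 1] and keeps the fused covariance
    with the smallest trace. Other cost functions (e.g. determinant)
    are also common; CI guarantees consistency for any omega.
    """
    Ia, Ib = np.linalg.inv(Pa), np.linalg.inv(Pb)
    best_x, best_P = None, None
    for w in np.linspace(0.0, 1.0, n_grid):
        # Fused information is a convex combination of the two informations
        P = np.linalg.inv(w * Ia + (1.0 - w) * Ib)
        if best_P is None or np.trace(P) < np.trace(best_P):
            best_x = P @ (w * Ia @ xa + (1.0 - w) * Ib @ xb)
            best_P = P
    return best_x, best_P
```

Because the endpoints omega = 0 and omega = 1 recover the two input estimates, the optimized fused covariance never has a larger trace than the better of the two inputs.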
Moreover, the thesis highlights the impact of linearization errors in range-only CL systems and shows that performing CI-fusion before the range-observation measurement update, with a clever choice of CI cost function, can reduce linearization errors for the PSCI algorithm. A comparison between the PSCI algorithm and an already existing algorithm, referred to as the Cross-Covariance Approximation (CCA) algorithm, has further been conducted through a sensitivity analysis with respect to communication rate and the number of GNSS agents. The simulation results indicate that the PSCI algorithm exhibits diminishing improvement in Root Mean Square Error (RMSE) with increased communication rates, while the RMSE of the CCA algorithm reaches a local minimum, subsequently showing overconfidence with higher rates. Lastly, evaluation under a varying number of GNSS agents indicates that cooperative benefits for the PSCI filter are marginal when uncertainty levels are uniform across agents. However, the PSCI algorithm demonstrates superior performance improvements with an increased number of GNSS agents compared to the CCA algorithm, attributed to the overconfidence of the latter.
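The linearization errors discussed above arise because a range measurement is a nonlinear function of position. A minimal EKF-style range update for a 2D position sketch makes the linearization explicit; this is a generic textbook update, not the thesis's algorithm, and the function name and 2D state layout are assumptions.

```python
import numpy as np

def range_update(x, P, other_pos, z, sigma_r):
    """Update a 2D position estimate from one range measurement.

    x: [px, py] state estimate, P: 2x2 covariance,
    other_pos: assumed-known position of the other agent,
    z: measured range, sigma_r: range noise standard deviation.
    """
    diff = x - other_pos
    r_pred = np.linalg.norm(diff)
    # Jacobian of h(x) = ||x - other_pos||; this linearization is the
    # source of the errors the abstract discusses
    H = (diff / r_pred).reshape(1, 2)
    S = H @ P @ H.T + sigma_r ** 2          # innovation covariance (1x1)
    K = P @ H.T / S                          # Kalman gain (2x1)
    x_new = x + (K * (z - r_pred)).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```

The Jacobian H depends on the current estimate, so a large prior error yields a poor linearization point; fusing with CI first (as the thesis proposes) tightens the estimate before this step.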

Information aggregation for the localization of a mobile robot using an imperfect map

Delobel, Laurent (04 May 2018)
Most large modern cities today suffer from pollution and traffic jams. A possible solution to this problem would be to restrict personal-car access to city centers in favor of a public transport system of autonomous shuttles powered by a pollution-free energy source. These shuttles could serve users on demand, being rerouted according to their calls, and could also transport employees across large industrial facilities or within regulated-access critical infrastructure. To perform such a task, a vehicle must be able to localize itself within its area of operation. Most popular localization methods in such environments are based on "Simultaneous Localization and Mapping" (SLAM). These methods dynamically construct a map of the environment and locate the vehicle within that map.
Although these methods have demonstrated their robustness, most implementations do not use a map representation that can easily be shared between vehicles (owing to map size, structure, etc.). Moreover, these methods frequently ignore pre-existing information, such as an existing city map, and instead construct the map from scratch. To go beyond these limitations, we propose to use pre-existing high-level semantic maps, such as OpenStreetMap, as a prior map in which the vehicle localizes itself. Such maps can contain the locations of roads, traffic signs, traffic lights, building walls, etc. They almost always come with some degree of imprecision (mostly in position) and may contain errors: real landmarks may be missing from the map data, or elements stored in the map may no longer exist. To manage such imperfections in the map data, and to allow an autonomous vehicle to localize itself against it, we propose a new paradigm. First, to manage the classical over-convergence problem in data fusion under strong correlations (as with a Kalman filter), together with the map scalability problem, we propose to manage the whole map with a Split Covariance Intersection filter. We also propose to remove landmarks that are still present in the map data but possibly absent in reality, by estimating their probability of existence from vehicle sensor detections and deleting those whose probability falls too low. Finally, we propose to periodically scan the full sensor data for potential new landmarks not yet included in the map, and to add them to it. Experiments show the feasibility of such a dynamic, high-level map updated on-the-fly.
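The landmark pruning step described above can be sketched as a Bayesian existence update. This is a generic formulation, not the thesis's exact score: the detection and false-alarm probabilities and the function name are assumptions.

```python
def update_existence(p_exist, detected, p_d=0.9, p_fa=0.05):
    """Bayesian update of a landmark's existence probability.

    p_exist: prior probability that the landmark exists,
    detected: whether the sensor reported it on this pass,
    p_d: probability of detecting an existing landmark,
    p_fa: probability of a false alarm at its location.
    """
    if detected:
        num = p_d * p_exist
        den = num + p_fa * (1.0 - p_exist)
    else:
        num = (1.0 - p_d) * p_exist
        den = num + (1.0 - p_fa) * (1.0 - p_exist)
    return num / den
```

Repeated misses drive the probability toward zero, at which point the landmark can be deleted from the map data; repeated detections drive it toward one, supporting the integration of newly discovered landmarks.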
