191 |
GPS and IMU Sensor Fusion to Improve Velocity Accuracy. Laurell, Adam; Karlsson, Erik; Naqqar, Yousuf. January 2022
The project explores how the accuracy of GPS velocity data can be improved by using sensor fusion with an extended Kalman filter. The proposed solution is a sensor fusion between the GPS and IMU of the system, where the extended Kalman filter was used to estimate the velocity from the sensor data. The hardware used for data acquisition was a Pixhawk 4 (PX4), which has an IMU consisting of accelerometers, gyroscopes and magnetometers. The PX4's corresponding GPS module was also used to collect accurate velocity data. The data was logged using Simulink and later processed with MATLAB. The sensor fusion using the extended Kalman filter gave good estimates under constant acceleration but had problems estimating under varying acceleration. This was initially planned to be solved using smoothing filters, which are an essential part of the fusion process, but these were never implemented due to time constraints. The constructed filter acts as a foundation for future improvement. Other methods such as the unscented Kalman filter, particle filter and neural networks could also be explored to improve the velocity estimation, as these filters are known to perform better. However, most of these alternatives need more computing power and are generally harder to implement than the extended Kalman filter. This project would be beneficial to QTAGG, since increasing the velocity resolution and accuracy of the system can open up possibilities for better optimization. It is also a commonly implemented solution, with many state-of-the-art implementations available.
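The predict/update cycle described above can be illustrated with a minimal sketch, here reduced to a one-dimensional, linear velocity model: IMU acceleration drives the prediction and GPS velocity drives the update. The thesis uses a full extended Kalman filter over the PX4's sensor suite; the single-state model and the noise values below are illustrative assumptions, not thesis values.

```python
# Minimal sketch: fuse IMU acceleration (prediction) with GPS velocity (update).
# The state is a single velocity v; Q and R are assumed noise levels.
class VelocityFilter:
    def __init__(self, q=0.5, r=0.2):
        self.v = 0.0   # velocity estimate (m/s)
        self.P = 1.0   # estimate variance
        self.Q = q     # process noise driven by IMU acceleration error
        self.R = r     # GPS velocity measurement noise

    def predict(self, accel, dt):
        self.v += accel * dt        # v_k = v_{k-1} + a * dt
        self.P += self.Q * dt

    def update(self, gps_velocity):
        K = self.P / (self.P + self.R)           # Kalman gain
        self.v += K * (gps_velocity - self.v)    # correct with the GPS innovation
        self.P *= 1.0 - K

f = VelocityFilter()
for accel, gps_v in [(0.3, 0.05), (0.3, 0.32), (0.2, 0.55)]:  # synthetic IMU/GPS samples
    f.predict(accel, dt=0.1)
    f.update(gps_v)
print(round(f.v, 3))
```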
|
192 |
Building an Efficient Occupancy Grid Map Based on Lidar Data Fusion for Autonomous Driving Applications. Salem, Marwan. January 2019
The Localization and Map building module is a core building block for designing an autonomous vehicle. It describes the vehicle's ability to create an accurate model of its surroundings while maintaining its own position in the environment at the same time. In this thesis work, we contribute to the autonomous driving research area by providing a proof of concept of integrating SLAM solutions into commercial vehicles, improving the robustness of the Localization and Map building module. The proposed system applies Bayesian inference theory within the occupancy grid mapping framework and utilizes a Rao-Blackwellized Particle Filter for estimating the vehicle trajectory. The work has been done at Scania CV, where a heavy-duty vehicle equipped with a multiple-Lidar sensor architecture was used. Low-level sensor fusion of the different Lidars was performed and a parallelized implementation of the algorithm was achieved using a GPU. When tested on datasets frequently used in the community, the implemented algorithm outperformed the scan-matching technique and showed acceptable performance in comparison to another state-of-the-art RBPF implementation that adopts some improvements to the algorithm. The performance of the complete system was evaluated under a designed set of real scenarios. The proposed system showed a significant improvement in terms of the estimated trajectory and provided accurate occupancy representations of the vehicle surroundings. The fusion module was found to build more informative occupancy grids than the grids obtained from individual sensors. / The module responsible for both localization and map building is one of the core components of an autonomous driving system. It describes the vehicle's ability to create a model of the surroundings and to maintain a position relative to that environment. In this thesis we contribute to autonomous driving research with a proof of concept for integrating SLAM solutions into commercial vehicles, which improves the robustness of the localization and map-building module. The proposed system uses Bayesian inference applied within a framework that builds a map consisting of a grid describing the degree of occupancy. To estimate the trajectory the vehicle will travel, the framework uses an RBPF (Rao-Blackwellized particle filter). The thesis work was carried out at Scania CV, where a heavy-duty vehicle equipped with several lidar sensors was used. Low-level sensor fusion was applied to the different lidar sensors, and a parallelized implementation of the algorithm was realized on a GPU. When the algorithm was run on datasets commonly used in the community, the implemented algorithm gave considerably better results than the scan-matching technique and showed acceptable results compared with another high-performing RBPF implementation that adds some improvements to the algorithm. The performance of the complete system was evaluated with a number of purpose-designed realistic scenarios. The proposed system shows a clear improvement in the estimated trajectory and also provides an accurate representation of the surroundings. The sensor fusion yields a better and more informative representation than using only the individual lidar sensors.
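The Bayesian inference step inside the occupancy grid framework mentioned above is usually carried out per cell in log-odds form. The following is a minimal sketch of that update, assuming a simple inverse sensor model with illustrative probabilities (about 0.7 at a lidar hit, about 0.4 along the beam); it is not the thesis implementation.

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

class OccupancyGrid:
    """Per-cell Bayesian occupancy update kept in log-odds form."""
    def __init__(self, shape, p_prior=0.5):
        self.l_prior = logodds(p_prior)
        self.L = np.full(shape, self.l_prior)    # log-odds of occupancy per cell

    def update_cell(self, idx, p_occ_given_z):
        # Inverse sensor model (assumed values): ~0.7 at a lidar hit,
        # ~0.4 for cells the beam passed through (evidence for free space).
        self.L[idx] += logodds(p_occ_given_z) - self.l_prior

    def probability(self):
        return 1.0 - 1.0 / (1.0 + np.exp(self.L))

grid = OccupancyGrid((100, 100))
grid.update_cell((50, 50), 0.7)   # beam endpoint: likely occupied
grid.update_cell((50, 49), 0.4)   # along the beam: likely free
print(grid.probability()[50, 48:51])
```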
|
193 |
Real-Time Forward Urban Environment Perception for an Autonomous Ground Vehicle Using Computer Vision and LIDAR. Greco, Christopher Richard. 17 March 2008
The field of autonomous vehicle research is growing rapidly. The Congressional mandate for the military to use unmanned vehicles has, in large part, sparked this growth. In conjunction with this mandate, DARPA sponsored the Urban Challenge, a competition to create fully autonomous vehicles that can operate in urban settings. An extremely important feature of autonomous vehicles, especially in urban locations, is their ability to perceive their environment. The research presented in this thesis is directed toward providing an autonomous vehicle with real-time data that efficiently and compactly represents its forward environment as it navigates an urban area. The information extracted from the environment for this application consists of stop line locations, lane information, and obstacle locations, using a single camera and a LIDAR scanner. A road/non-road binary mask is first segmented. From the road information in the mask, the current traveling lane of the vehicle is detected using a minimum distance transform and tracked between frames. The stop lines and obstacles are detected from the non-road information in the mask. Stop lines are detected using a variation of vertical profiling, and obstacles are detected using shape descriptors. A laser rangefinder is used in conjunction with the camera in a primitive form of sensor fusion to create a list of obstacles in the forward environment. Obstacle boundaries, lane points, and stop line centers are then translated from image coordinates to UTM coordinates using a homography transform created during the camera calibration procedure. A novel system for rapid camera calibration was also implemented. Algorithms investigated during the development phase of the project are included in the text to explain design decisions and to provide direction to researchers who will continue the work in this field. The results were promising: the system performed these tasks fairly accurately at a rate of about 20 frames per second on an Intel Core2 Duo processor with 2 GB of RAM.
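The image-to-UTM step mentioned above amounts to applying a planar homography to pixel coordinates. A minimal sketch follows; the 3x3 matrix H and its values are made-up placeholders, since the real transform would come from the camera calibration procedure described in the thesis.

```python
import numpy as np

# H maps image pixels (u, v) to ground-plane UTM coordinates (easting, northing).
# The values below are placeholders; the real H comes from calibration.
H = np.array([[0.05,  0.00,   444000.0],
              [0.00, -0.07,  4422000.0],
              [0.00,  0.0001,      1.0]])

def image_to_utm(pixels, H):
    """pixels: (N, 2) array of (u, v) image points -> (N, 2) UTM points."""
    uv1 = np.hstack([pixels, np.ones((len(pixels), 1))])   # homogeneous coordinates
    ground = (H @ uv1.T).T
    return ground[:, :2] / ground[:, 2:3]                  # divide out the scale w

stop_line_center = np.array([[320.0, 410.0]])
print(image_to_utm(stop_line_center, H))
```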
|
194 |
Presence detection by means of RF waveform classification. Lengdell, Max. January 2022
This master thesis investigates the possibility of automatically labeling and classifying radio waves for presence detection, where the objective is to obtain information about the number of people in a room based on channel estimates. Labeling data for machine learning is a time-consuming and tedious process. To address this, two approaches are evaluated. One develops a framework to generate labels with the aid of computer vision AI. The other relies on unsupervised learning classifiers complemented with heuristics to generate the labels. The investigation also studies the performance of the classifiers as a function of the TX/RX configuration, SNR, number of consecutive samples in a feature vector, bandwidth and frequency band. When someone moves in a room the propagation environment changes and induces variations in the channel estimates, compared to when the room is empty. These variations are the fundamental concept exploited in this thesis. Two methods are suggested to perform classification without the need for training data. The first uses random trees embeddings to construct a random forest without labels, and the second uses statistical bootstrapping with a random forest classifier. The labels used for annotation indicate whether there were zero, one or two people in the room. The performance of binary and non-binary classification is evaluated for the two blind detection models, as well as for the unsupervised learning techniques K-means and self-organizing maps. For classification, both the supervised and unsupervised approaches use random forest classifiers. Results show that random forest classifiers perform well for this kind of problem, and that random trees embeddings are able to extract relational data that could be used for automatic labeling of the data.
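One way to realize the label-free pipeline sketched above is to embed the channel-estimate feature vectors with random trees embeddings, cluster the embedding into the desired number of classes, and train an ordinary random forest on the resulting pseudo-labels. The sketch below uses scikit-learn with random data standing in for the thesis features; the cluster count of three matches the zero/one/two-person labels, but everything else (feature dimensions, tree parameters) is an assumption rather than the thesis configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier, RandomTreesEmbedding

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 64))   # stand-in for channel-estimate feature vectors

# 1) Unsupervised embedding: each sample is encoded by the tree leaves it lands in.
embedding = RandomTreesEmbedding(n_estimators=100, max_depth=5, random_state=0)
Z = embedding.fit_transform(X)   # sparse, high-dimensional representation

# 2) Cluster the embedded samples into pseudo-classes (three: zero/one/two people).
pseudo_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

# 3) Train an ordinary random forest on the automatically generated labels.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, pseudo_labels)
print(clf.predict(X[:5]))
```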
|
195 |
Environment Perception for Autonomous Driving: A 1/10 Scale Implementation of Low Level Sensor Fusion Using Occupancy Grid Mapping. Rawat, Pallav. January 2019
Autonomous Driving has recently gained a lot of recognition and provides challenging research aimed at making transportation safer, more convenient and more efficient. This emerging technology also has widespread applications and implications beyond all current expectations in other fields of robotics. Environment perception is one of the big challenges for autonomous robots. Although many single-sensor approaches have been developed, different sensor types have different operational characteristics and failure modes, and therefore complement each other. Different sensors provide different sets of data, which creates difficulties in combining the information into a unified picture. The proposed solution consists of low-level sensor fusion of LIDAR and stereo camera data using an occupancy grid framework. Bayesian inference theory is utilized and a real-time system has been implemented on a 1/10 scale robot vehicle. The result of the thesis shows that it is possible to use a 2D LIDAR and a stereo camera to build a map of the environment. The implementation focuses on practical issues such as the blind spots of individual sensors. Overall, the fused occupancy grid gives better results than the occupancy grids from individual sensors. Sensor confidence is higher for the camera, since the mapping frequency of a 2D LIDAR is low. / Autonomous driving has recently gained much recognition and offers challenging research opportunities with the goals of making transportation safer, more convenient and more efficient. The emerging technology also has applications and implications in other areas of robotics to an extent that far exceeds expectations. Perceiving the surrounding environment is one of the major challenges for autonomous robots. Although many methods have been developed in which a single sensor is used, the best results have been achieved through a combination of sensors. Different sensors provide different sets of data, which makes it difficult to combine the information into a unified picture. The proposed solution consists of low-level sensor fusion of LIDAR and stereo camera using an occupancy grid framework. Bayesian inference theory has been used and a real-time system has been implemented on a 1/10 scale robot vehicle. The result of the thesis shows that it is possible to use a 2D LIDAR and a stereo camera to build a map of the surroundings. The implementation focuses on practical problems such as the blind spots of these sensors. Overall, the fused grid gives better results than those from the individual sensors. Sensor reliability is higher for the camera, since the 2D LIDAR maps at a much lower frequency.
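When both sensors produce occupancy grids over the same area, the fusion step reduces to combining per-cell evidence; in the common log-odds formulation the contributions simply add under an independence assumption. A minimal sketch follows, with made-up cell probabilities standing in for the LIDAR and stereo camera grids (not the thesis data or its exact fusion rule).

```python
import numpy as np

def to_logodds(p):
    return np.log(p / (1.0 - p))

def to_prob(l):
    return 1.0 - 1.0 / (1.0 + np.exp(l))

# Made-up per-cell occupancy probabilities standing in for the two sensor grids.
p_lidar  = np.array([[0.5, 0.7], [0.3, 0.5]])
p_camera = np.array([[0.6, 0.8], [0.5, 0.4]])
prior = 0.5

# Independent evidence adds in log-odds (the prior is subtracted so it is not counted twice).
l_fused = (to_logodds(p_lidar) - to_logodds(prior)) \
        + (to_logodds(p_camera) - to_logodds(prior)) + to_logodds(prior)

print(to_prob(l_fused))   # agreement reinforces a cell, disagreement moderates it
```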
|
196 |
Localization For Autonomous Driving using Statistical Filtering: A GPS aided navigation approach with EKF and UKF / Lokalisering för autonom körning med statistisk filtrering: En GPS-stödd navigeringsmetod med EKF och UKF. Singh, Devrat. January 2022
A critical requirement for safe autonomous driving is to have an accurate state estimate of the vehicle. One of the most ubiquitous yet reliable ways to achieve this is through the integration of the onboard Inertial Navigation System (INS) and the Global Navigation Satellite System (GNSS). This integration can be further assisted through fusion of information from other onboard sensors. On top of that, a ground vehicle enforces its own set of rules, through non-holonomic constraints, which along with other vehicle dynamics can aid the state estimation. In this project, a sequential probabilistic inference approach has been followed that fuses the high-frequency, short-term-accurate INS estimates with low-frequency, drift-free GPS observations. The fusion of GPS and IMU has been realized through a modular, asynchronous, loosely coupled framework, capable of incorporating additional observation sources to facilitate the state estimation and tracking process. Besides GPS and IMU, the applied strategy makes use of wheel speed sensor measurements, non-holonomic constraints, and online estimation of IMU sensor biases as well as the wheel speed scaling factor. These augmentations have been shown to increase the robustness of the localization module during periods of GPS outage. The Extended Kalman Filter (EKF) has seen extensive usage for such sensor fusion tasks; however, its performance can be limited by the propagation of the covariance through linearization of the underlying non-linear model. The Unscented Kalman Filter (UKF) avoids Jacobian-based linearization. Instead, it uses a carefully chosen set of sample points in order to accurately map the probability distribution. Correspondingly, the surrounding literature also indicates that the UKF outperforms the EKF in such tasks. Therefore, the present thesis also seeks to evaluate these claims. The EKF and SRUKF (Square Root UKF) instances of the developed algorithm have been tested on real sensor logs recorded from a Scania test vehicle. With no GPS outage, the implemented localization algorithm performs within a position RMSE of 60 cm. The robustness of the localization algorithm to GPS outages is evaluated by simulating 0-90% lengths of GPS unavailability during the estimation process. Additionally, to unfold the impact of parameters, the individual modules within the suggested framework were isolated and analysed with respect to their contribution to the algorithm's localization performance. Of these, the online estimation of IMU sensor biases proved to be critical for increasing the robustness of the suggested localization algorithm to GPS shortage, especially for the EKF. In terms of the comparison, both the EKF and the SRUKF performed with similar capability; however, the UKF showed better results for higher levels of GPS cuts. / A critical requirement for safe autonomous driving is to have an accurate state estimate of the vehicle. One of the most common yet reliable approaches to this task is the integration of the onboard inertial navigation system (INS) with satellite navigation (GNSS). This integration can be further supported by fusing information from other onboard sensors. In addition, a ground vehicle maintains its own set of rules, through non-holonomic constraints, which together with other vehicle dynamics can aid the state estimation. In this project, a sequential probabilistic inference approach has been followed that fuses the high-frequency, short-term-accurate INS estimates with low-frequency, drift-free GPS observations. The fusion of GPS and IMU has been realized through a modular, asynchronous, loosely coupled framework that can be extended with additional observation sources to facilitate the state estimation and tracking process. Besides GPS and IMU, the applied strategy uses wheel speed sensor measurements, non-holonomic constraints and online estimation of IMU sensor biases as well as the wheel speed scaling factor. These additions have been shown to increase the robustness of the localization module during periods without a GPS signal. The Extended Kalman Filter (EKF) has seen extensive use for such sensor fusion tasks, but its performance can be limited by the propagation of the covariance through linearization of the underlying non-linear model. The Unscented Kalman Filter (UKF) avoids the issue of Jacobian-based linearization. Instead, it uses a carefully chosen set of sample points to accurately map the probability distribution. Correspondingly, the surrounding literature also indicates that the UKF outperforms the EKF in such tasks. Therefore, this thesis also seeks to evaluate these claims. The EKF and SRUKF (Square Root UKF) instances of the developed algorithm have been tested on sensor logs recorded from a Scania test vehicle. Without GPS outages, the implemented localization algorithm performs within a position RMSE of 60 cm. The robustness of the localization algorithm to GPS outages is evaluated by simulating 0-90% lengths of GPS unavailability during the estimation process. In addition, the individual modules within the proposed framework were isolated and analysed with respect to their contribution to the algorithm's localization performance. Of all of these, the online estimation of IMU sensor biases proved to be decisive for increasing the robustness of the proposed localization algorithm against GPS shortage, especially for the EKF. As for the comparison, both the EKF and the SRUKF performed with similar capability, but the UKF showed better results for longer periods without a GPS signal.
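The sigma-point machinery that lets the UKF sidestep Jacobians can be sketched compactly. The following is a generic scaled unscented transform, not the thesis code; the parameter defaults, the toy state and the motion function are illustrative assumptions.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through f with 2n+1 sigma points."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)          # matrix square root

    sigma = np.vstack([mean, mean + S.T, mean - S.T])            # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))             # mean weights
    wc = wm.copy()                                               # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)

    Y = np.array([f(s) for s in sigma])              # propagated sigma points
    y_mean = wm @ Y
    y_cov = (wc[:, None] * (Y - y_mean)).T @ (Y - y_mean)
    return y_mean, y_cov

# Toy nonlinear propagation of a 2D state; values are illustrative only.
f = lambda x: np.array([x[0] + np.cos(x[1]), x[1]])
m, P = unscented_transform(np.array([0.0, 0.1]), np.eye(2) * 0.01, f)
print(m, P)
```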
|
197 |
An Intelligent System for Small Unmanned Aerial Vehicle Traffic Management. Cook, Brandon M. 28 June 2021
No description available.
|
198 |
Automotive Radar for Localization in GNSS-Denied Environments. Otake, Bianca. January 2021
Precise and robust automotive localization is a must for autonomous vehicles. Radar is a cheap and robust sensor, and this project aimed to find a method for using automotive radar to localize globally. By using radar data to build occupancy grids based on other state-of-the-art radar localization methods and applying image correlation techniques, a localization precision below 20 cm could be achieved, delivering poses at a frequency higher than 0.5 Hz along with a characterization of the uncertainty. This was accomplished by using an improved sensor model for the occupancy grid mapping, filtering the radar data, and performing the image correlation in the Fourier domain. The presented results are better than the state-of-the-art radar localization methods in terms of both precision and frequency, although not in terms of heading estimation. The work provides a foundation for future investigations and improvements of radar as a sensor for localization. / Accurate and robust vehicle localization is a must for the autonomous vehicles of the future. Radar is a cheap and robust sensor, and this project was carried out with the aim of finding a method for using automotive radar to localize globally. By using radar data to build occupancy grids based on the best current radar localization methods and applying image correlation techniques, a localization precision better than 20 cm could be achieved, providing poses at a frequency higher than 0.5 Hz together with a characterization of the uncertainty. This was done by using an improved sensor model for the occupancy grid mapping, filtering the radar data, and using image correlation in the Fourier domain. The presented results are better than the latest radar localization methods, both in terms of precision and frequency, but not in terms of heading. The work provides a foundation for future investigations of radar as a sensor for localization.
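Image correlation in the Fourier domain, as used above, is typically done with phase correlation: the normalized cross-power spectrum of two grids peaks at their relative shift. A minimal sketch with a synthetic grid follows; the thesis's exact formulation, sub-cell refinement and rotation handling may differ.

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the (row, col) shift of grid a relative to grid b."""
    R = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    R /= np.abs(R) + 1e-12                      # keep only the phase
    corr = np.fft.ifft2(R).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the grid back to negative offsets.
    return tuple(s - n if s > n // 2 else s for s, n in zip(peak, a.shape))

grid = np.zeros((64, 64))
grid[20:30, 20:30] = 1.0                        # a block of occupied cells
shifted = np.roll(np.roll(grid, 3, axis=0), -5, axis=1)
print(phase_correlation(shifted, grid))         # expected roughly (3, -5)
```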
|
199 |
ROBOTIC-ASSISTED BEATING HEART SURGERY. Bebek, Ozkan. 25 January 2008
No description available.
|
200 |
Machine Learning for Road Following by Autonomous Mobile Robots. Warren, Emily Amanda. 25 September 2008
No description available.
|