131

Triangulation Based Fusion of Sonar Data with Application in Mobile Robot Mapping and Localization

Wijk, Olle January 2001 (has links)
No description available.
132

Sensor Fusion Navigation for Sounding Rocket Applications / Navigering med Sensorfusion i en Sondraket

Nilsson, Mattias, Vinkvist, Rikard January 2008 (has links)
One of Saab Space’s products is the S19 guidance system for sounding rockets. Today this system is based on an inertial navigation system that blindly calculates the position of the rocket by integrating sensor readings with unknown bias. The purpose of this thesis is to integrate a Global Positioning System (GPS) receiver into the guidance system to increase precision and robustness. There are mainly two problems involved in this integration. One is to integrate the GPS with sensor fusion into the existing guidance system. The second is to get the GPS satellite tracking to work under extremely high dynamics. The first of the two problems is solved by using an Extended Kalman filter (EKF) with two different linearizations: one uses Euler angles and the other quaternions. The integration technique implemented in this thesis is a loose integration between the GPS receiver and the inertial navigation system. The main task of the EKF is to estimate the bias of the inertial navigation system sensors and correct it to eliminate drift in the position. The solution is verified by computing the position of a car using a GPS and an inertial measurement unit. Different solutions to the GPS tracking problem are proposed in a pre-study. / One of Saab Space’s products is the S19 navigation system that guides sounding rockets. Until now the system has been based on an inertial navigation system that blindly computes the position by integrating the inertial sensors, which have unknown biases. The aim of this thesis is to integrate a GPS with the inertial navigation system to increase robustness and precision. The work can essentially be divided into two problems: integrating a GPS receiver with the existing navigation system using sensor fusion, and making the satellite tracking work under extremely high dynamic conditions. The first of the two problems is solved with an Extended Kalman filter (EKF) with two different linearizations. The first linearization uses Euler angles and is well established; the second uses quaternions. The integration technique implemented in this thesis is a loose integration between the GPS receiver and the inertial navigation system. The main purpose of the EKF is to estimate the biases of the inertial navigation system's sensors and correct them to eliminate drift in the position. The solution is verified by computing the position of a car using a GPS and an inertial measurement unit. Different solutions to the satellite tracking problem are proposed in a pre-study.
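The loosely coupled GPS/INS scheme summarized in this abstract can be illustrated with a small sketch. The code below is not the thesis implementation: it reduces the problem to one dimension with a single accelerometer bias state, and all noise levels are assumed values. It only shows how GPS position fixes let a Kalman filter estimate the inertial sensor bias and remove the resulting position drift.

```python
# Illustrative 1-D loosely coupled GPS/INS filter (assumed models and noise values).
import numpy as np

dt = 0.01                                # inertial sampling period [s]
F = np.array([[1, dt, -0.5 * dt**2],     # state: [position, velocity, accel bias]
              [0, 1,  -dt],
              [0, 0,   1]])
B = np.array([[0.5 * dt**2], [dt], [0]]) # input: measured specific force
H = np.array([[1.0, 0.0, 0.0]])          # GPS measures position only (loose coupling)
Q = np.diag([1e-6, 1e-4, 1e-8])          # process noise (assumption)
R = np.array([[4.0]])                    # GPS position variance (assumption)

x = np.zeros((3, 1))
P = np.eye(3)

def ins_predict(x, P, accel_meas):
    """Propagate with the raw accelerometer reading; the bias state absorbs the drift."""
    x = F @ x + B * accel_meas
    P = F @ P @ F.T + Q
    return x, P

def gps_update(x, P, gps_pos):
    """Correct position, velocity and bias with a GPS position fix."""
    y = np.array([[gps_pos]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(3) - K @ H) @ P
    return x, P
```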
133

Tillståndsskattning i robotmodell med accelerometrar / State estimation in a robot model using accelerometers

Ankelhed, Daniel, Stenlind, Lars January 2005 (has links)
The purpose of this report is to evaluate different methods for identifying states in robot models. Both linear and non-linear filters exist among these methods and are compared to each other. Advantages, disadvantages and problems that can occur during tuning and running are presented. Additional measurements from accelerometers are added, and their use with the above-mentioned methods for state estimation is evaluated. The evaluation of the methods in this report is mainly based on simulations in Matlab, although some experiments have been performed on laboratory equipment. The conclusion indicates that simple non-linear models with few states can be more accurately estimated with a Kalman filter than with an extended Kalman filter, as long as only linear measurements are used. When non-linear measurements are used, an extended Kalman filter is more accurate than a Kalman filter. Non-linear measurements are introduced through accelerometers with non-linear measurement equations. Using accelerometers generally leads to better state estimation when the measurement equations have a simple relation to the model.
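The distinction the abstract draws between linear dynamics and non-linear accelerometer measurements can be sketched as follows. This is not the thesis code: the joint model, the constant-rate motion assumption and the noise levels are all assumptions made purely for illustration of an EKF measurement update with a non-linear measurement equation.

```python
# Minimal EKF sketch for a single joint with a gravity-sensitive accelerometer.
import numpy as np

dt, g = 0.005, 9.81
F = np.array([[1, dt], [0, 1]])          # state: [angle, angular rate] (assumed model)
Q = np.diag([1e-7, 1e-5])
R = np.array([[0.05]])

def h(x):
    """Accelerometer along the link senses the gravity component: non-linear in the angle."""
    return np.array([[g * np.sin(x[0, 0])]])

def H_jac(x):
    """Jacobian of h, evaluated at the predicted state."""
    return np.array([[g * np.cos(x[0, 0]), 0.0]])

def ekf_step(x, P, z):
    # time update (linear, constant-rate dynamics assumed)
    x = F @ x
    P = F @ P @ F.T + Q
    # measurement update (linearize h around the prediction)
    H = H_jac(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - h(x))
    P = (np.eye(2) - K @ H) @ P
    return x, P
```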
134

Multiple Platform Bias Error Estimation / Estimering av Biasfel med Multipla Plattformar

Wiklund, Åsa January 2004 (has links)
Sensor fusion has long been recognized as a means to improve target tracking. Sensor fusion deals with merging several signals into one to obtain a better and more reliable result. To get an improved and more reliable result, the incoming data must be trusted to be correct and not contain unknown systematic errors. This thesis tries to find and estimate the size of the systematic errors that appear in a multi-platform environment where data is shared among the units. To be more precise, the error estimated within the scope of this thesis appears when platforms cannot determine their positions correctly and share target tracking data with their own corrupted position as a basis for determining the target's position. The algorithms developed in this thesis use Kalman filter theory, including the extended Kalman filter and the information filter, to estimate the platform location bias error. Three algorithms are developed with satisfactory results. Depending on time constraints and computational demands, either one of the algorithms could be preferred.
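The core idea, that shared target reports expose a platform's own position bias, can be shown with a deliberately simplified sketch. This is not one of the thesis algorithms: it estimates one platform's 2-D position bias against a reference platform assumed bias-free, with assumed noise values, just to show the Kalman filter structure of such a bias estimator.

```python
# Hedged illustration: estimate a platform position bias from target-report discrepancies.
import numpy as np

Q = np.eye(2) * 1e-6           # bias modelled as a slow random walk (assumption)
R = np.eye(2) * 25.0           # combined noise of the two target reports (assumption)
H = np.eye(2)                  # the report discrepancy observes the bias directly

bias = np.zeros((2, 1))
P = np.eye(2) * 100.0

def bias_update(bias, P, target_from_biased, target_from_reference):
    """One combined predict/update step: the discrepancy between the two
    geo-referenced target reports equals the platform bias plus noise."""
    z = target_from_biased - target_from_reference      # 2x1 discrepancy
    y = z - H @ bias
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    bias = bias + K @ y
    P = (np.eye(2) - K @ H) @ P + Q                     # random-walk growth folded in
    return bias, P
```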
135

Radar and Thermopile Sensor Fusion for Pedestrian Detection

Rouhani, Shahin January 2005 (has links)
During the last decades, great steps have been taken to decrease passenger fatalities in cars. Systems such as ABS and airbags have been developed for this purpose alone, but not much effort has been put into pedestrian safety. In traffic today, pedestrians are among the most endangered participants, and in recent years there has been an increased demand for pedestrian safety from the European Enhanced Vehicle safety Committee; the European New Car Assessment Programme has therefore developed tests in which pedestrian safety is rated. With this, detection of pedestrians has arisen as a part of automotive safety research. This thesis surveys some of the research available in the area and gives a brief introduction to some of the sensors readily available. The objective of this work is to detect pedestrians in front of a vehicle by using thermoelectric infrared sensors fused with short-range radar sensors, while minimizing missed detections and false alarms. Extensive work has already been performed with the thermoelectric infrared sensors for this sole purpose, and this thesis is based on that work. Information is provided about the sensors used and how they are set up during this work. The methods used for classifying objects are given, along with the assumptions made about pedestrians in this system. A basic tracking algorithm is used to track radar-detected objects in order to provide the fusion system with better data. The approach chosen for the sensor fusion is central-level fusion, where the probabilities for a pedestrian from the radars and the thermoelectric infrared sensors are combined using Dempster-Shafer theory and accumulated over time in the occupancy grid framework. Theories that are extensively used in this thesis are explained in detail and discussed in separate chapters. Finally, the experiments undertaken and the results attained with the presented system are shown. A comparison is made with the previous detection system, which uses only thermoelectric infrared sensors and on which this work builds. Conclusions regarding what this system is capable of are drawn, with its inherent strengths and weaknesses.
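The Dempster-Shafer combination step mentioned in the abstract can be sketched for a single grid cell. This is only an illustration, not the thesis configuration: the frame of discernment is {pedestrian, not-pedestrian} plus an "unknown" mass on the whole frame, and the sensor masses below are invented values.

```python
# Dempster's rule of combination on (m_pedestrian, m_not_pedestrian, m_unknown).
def combine(m1, m2):
    p1, n1, u1 = m1
    p2, n2, u2 = m2
    conflict = p1 * n2 + n1 * p2            # contradictory evidence
    k = 1.0 - conflict                      # normalization factor
    p = (p1 * p2 + p1 * u2 + u1 * p2) / k
    n = (n1 * n2 + n1 * u2 + u1 * n2) / k
    u = (u1 * u2) / k
    return (p, n, u)

# One occupancy-grid cell, starting fully uncertain, fused with both sensors over time.
cell = (0.0, 0.0, 1.0)
radar_mass = (0.3, 0.1, 0.6)        # weak pedestrian evidence from radar (assumed)
thermopile_mass = (0.6, 0.1, 0.3)   # stronger evidence from the IR sensor (assumed)
for m in (radar_mass, thermopile_mass):
    cell = combine(cell, m)
print(cell)                         # accumulated belief that the cell holds a pedestrian
```

In the occupancy grid framework described above, this combination would be repeated per cell at each time step, so that evidence accumulates spatially as well as over time.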
136

Kalman Filter Based Fusion Of Camera And Inertial Sensor Measurements For Body State Estimation

Aslan Aydemir, Gokcen 01 September 2009 (has links) (PDF)
The focus of the present thesis is on the joint use of cameras and inertial sensors, a recent area of active research. Within this scope, the performance of body state estimation is investigated with isolated inertial sensors, isolated cameras and finally with a fusion of the two types of sensors within a Kalman filtering framework. The study consists of both simulation and real hardware experiments. The body state estimation problem is restricted to a single-axis rotation where the turn angle and turn rate are estimated. This experimental setup provides a simple but effective means of assessing the benefits of the fusion process. Additionally, a sensitivity analysis is carried out in the simulation experiments to explore the sensitivity of the estimation performance to varying levels of calibration errors. It is shown by experiments that state estimation is more robust to calibration errors when the sensors are used jointly. For the fusion of the sensors, the indirect Kalman filter is considered as well as the direct-form Kalman filter. This comparative study allows the contribution of an accurate system dynamical model to the final state estimates to be assessed. The simulation and real hardware experiments effectively show that the fusion of the sensors eliminates the unbounded error growth characteristic of inertial sensors, while the final state estimation outperforms the use of cameras alone. Overall it is demonstrated that the Kalman-based fusion results in bounded-error, high-performance estimation of the body state. The results are promising and suggest that these benefits can be extended to body state estimation for multiple degrees of freedom.
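The indirect (error-state) formulation contrasted with the direct filter in this abstract can be sketched for the single-axis case. The code below is a generic illustration under assumed noise values, not the thesis implementation: a gyro is dead-reckoned for the turn angle, and an occasional camera angle fix corrects the accumulated angle error and the gyro bias through an error-state Kalman filter.

```python
# Single-axis error-state (indirect) Kalman filter sketch: gyro + camera.
import numpy as np

dt = 0.01
F = np.array([[1, dt], [0, 1]])   # error state: [angle error, gyro bias]
Q = np.diag([1e-8, 1e-10])        # assumed process noise
H = np.array([[1.0, 0.0]])        # (integrated angle - camera angle) observes the angle error
R = np.array([[1e-4]])            # assumed camera measurement noise

angle = 0.0                       # nominal state, integrated from the raw gyro
dx = np.zeros((2, 1))             # error-state estimate
P = np.eye(2) * 1e-3

def propagate(angle, dx, P, gyro_rate):
    angle += gyro_rate * dt       # dead reckoning on the raw (biased) rate
    dx = F @ dx                   # the bias feeds the angle error
    P = F @ P @ F.T + Q
    return angle, dx, P

def camera_fix(angle, dx, P, cam_angle):
    z = np.array([[angle - cam_angle]])   # residual between dead reckoning and camera
    y = z - H @ dx
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    dx = dx + K @ y
    P = (np.eye(2) - K @ H) @ P
    angle -= dx[0, 0]             # fold the estimated angle error back into the nominal state
    dx[0, 0] = 0.0                # and reset it (indirect-filter convention)
    return angle, dx, P
```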
140

Automatic geo-referencing by integrating camera vision and inertial measurements

Randeniya, Duminda I. B 01 June 2007 (has links)
An alternative sensor system to an inertial measurement unit (IMU) is essential for intelligent land navigation systems when the vehicle travels in a GPS-deprived environment. The sensor system used to update the IMU for a reliable navigation solution has to be a passive system that does not depend on any outside signal. This dissertation presents the results of an effort in which position and orientation data from vision and inertial sensors are integrated. Information from a sequence of images captured by a monocular camera attached to a survey vehicle at a maximum frequency of 3 frames per second was used to correct the inertial system installed in the same vehicle for its inherent error accumulation. Specifically, the rotations and translations estimated from point correspondences tracked through a sequence of images were used in the integration. For such an effort, two types of tasks need to be performed. The first task is calibration: estimating the intrinsic properties of the vision sensors (cameras), such as the focal length and lens distortion parameters, and determining the transformation between the camera and the inertial systems. Calibration of a two-sensor system under indoor conditions does not provide an appropriate and practical transformation for use in outdoor maneuvers due to invariable differences between outdoor and indoor conditions. Also, the use of custom calibration objects in outdoor operational conditions is not feasible due to the larger field of view, which requires relatively large calibration objects. Hence calibration becomes one of the critical issues, particularly if the integrated system is used in Intelligent Transportation Systems applications. In order to successfully estimate the rotations and translations from the vision system, the calibration has to be performed prior to the integration process. The second task is the effective fusion of the inertial and vision sensor systems. The automated algorithm that identifies point correspondences in images enables its use in real-time autonomous driving maneuvers. In order to verify the accuracy of the established correspondences, independent constraints such as epipolar lines and correspondence flow directions were used. A pre-filter was also utilized to smooth out the noise associated with the vision sensor (camera) measurements. A novel approach was used to obtain the geodetic coordinates, i.e. latitude, longitude and altitude, from the normalized translations determined from the vision sensor. Finally, the position estimates based on the vision sensor were integrated with those of the inertial system in a decentralized format using a Kalman filter. The vision/inertial integrated position estimates compare successfully with those from 1) the inertial/GPS system output and 2) an actual survey performed on the same roadway. This comparison demonstrates that vision can in fact be used successfully to supplement the inertial measurements during potential GPS outages. The derived intrinsic properties and the transformation between the individual sensors are also verified during two separate test runs on an actual roadway section.
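The decentralized combination of vision-derived and inertial position estimates described in this abstract can be illustrated with a minimal fusion step. This sketch is not the dissertation's filter: it simply merges two local position estimates by inverse-covariance (information) weighting, ignoring cross-correlations between them, and all numbers are assumed values.

```python
# Hedged sketch of a decentralized fusion step for two position estimates.
import numpy as np

def fuse(x_ins, P_ins, x_vis, P_vis):
    """Inverse-covariance weighted combination of two independent estimates."""
    P_ins_inv = np.linalg.inv(P_ins)
    P_vis_inv = np.linalg.inv(P_vis)
    P_fused = np.linalg.inv(P_ins_inv + P_vis_inv)
    x_fused = P_fused @ (P_ins_inv @ x_ins + P_vis_inv @ x_vis)
    return x_fused, P_fused

x_ins = np.array([[100.2], [50.1], [12.0]])   # inertial position estimate (local frame, m)
P_ins = np.eye(3) * 9.0                        # drifting, hence less certain (assumed)
x_vis = np.array([[99.8], [49.9], [11.8]])    # vision-derived position estimate
P_vis = np.eye(3) * 1.0                        # assumed vision covariance
x, P = fuse(x_ins, P_ins, x_vis, P_vis)        # fused estimate leans toward the vision fix
```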
