About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Shooter Localization in a Wireless Sensor Network / Lokalisering av skytt i ett trådlöst sensornätverk

Wilsson, Olof January 2009
Shooter localization systems are used to detect and locate the origin of gunfire. A wireless sensor network is one possible implementation of such a system, but it is sensitive to synchronization errors: localization techniques that rely on timing give less accurate, or even useless, results if the synchronization errors are too large. This thesis focuses on the influence of synchronization errors on the ability to localize a shooter using a wireless sensor network. A localization algorithm is developed and implemented, and the effect of synchronization errors is studied. The algorithm is evaluated using numerical experiments, simulations, and data from real gunshots collected at field trials. The results indicate that the developed algorithm localizes a shooter with quite good accuracy; however, the performance is strongly influenced by the geographical configuration of the network as well as by the synchronization error.
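The timing-based localization this abstract describes can be illustrated with a small sketch. The four-microphone layout, nominal speed of sound, and brute-force grid search below are assumptions for illustration, not the thesis's actual algorithm; the point is that time differences of arrival cancel the unknown firing time, so only relative clock errors between nodes degrade the estimate.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, nominal value
SENSORS = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]

def arrival_times(shooter, t0=0.0, clock_errors=None):
    """Simulate muzzle-blast arrival times, with optional per-node sync errors."""
    errs = clock_errors or [0.0] * len(SENSORS)
    return [t0 + math.dist(shooter, s) / SPEED_OF_SOUND + e
            for s, e in zip(SENSORS, errs)]

def localize(times):
    """Grid search minimizing TDOA residuals; differencing against sensor 0
    cancels the unknown firing time t0."""
    best, best_cost = None, float("inf")
    for x in range(101):
        for y in range(101):
            pred = [math.dist((x, y), s) / SPEED_OF_SOUND for s in SENSORS]
            cost = sum(((times[i] - times[0]) - (pred[i] - pred[0])) ** 2
                       for i in range(1, len(SENSORS)))
            if cost < best_cost:
                best, best_cost = (float(x), float(y)), cost
    return best

# Perfect synchronization: the shooter position is recovered exactly.
est = localize(arrival_times((30.0, 70.0)))
```

Feeding nonzero `clock_errors` into `arrival_times` shows the degradation the thesis studies: the minimum of the cost surface moves away from the true position.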
82

State estimation of RC cars for the purpose of drift control / Tillståndsskattning på RC-bilar för driftreglering

Liljestrand, Jonatan January 2011
High-precision state estimation is crucial when executing drift control, and high-speed control close to the stability limit, on electric RC-scale cars. In this thesis the estimation is made possible through recursive Bayesian filtering, more precisely the extended Kalman filter. By modelling the dynamics of the car and using the model together with position measurements and control input signals, it is possible to perform state estimation and prediction with high accuracy, even for non-measured states. The focus is on real-time, online estimation of the so-called slip angles of the front and rear tyres, because of their impact on the car's behaviour. With the extended information given to the system controller, higher levels of controllability can be reached. This is useful not only for higher speeds and drift control, but also as a way to study future anti-skid safety measures for ground vehicles.
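The slip angles mentioned above can be written down for a standard single-track ("bicycle") model. The formulas and the axle distances below are textbook assumptions for illustration; the thesis's exact vehicle model is not given here.

```python
import math

def slip_angles(vx, vy, yaw_rate, steer, a=0.16, b=0.16):
    """Front/rear tyre slip angles (rad) from estimated body-frame states.

    vx, vy: longitudinal/lateral velocity (m/s); steer: front steer angle (rad);
    a, b: distances (m) from centre of gravity to front/rear axle
    (hypothetical RC-car values).
    """
    alpha_f = steer - math.atan2(vy + a * yaw_rate, vx)
    alpha_r = -math.atan2(vy - b * yaw_rate, vx)
    return alpha_f, alpha_r

# Straight-line driving: no lateral velocity or yaw rate -> zero slip at both axles
af, ar = slip_angles(vx=2.0, vy=0.0, yaw_rate=0.0, steer=0.0)
```

In an EKF setup of the kind the abstract describes, `vx`, `vy` and `yaw_rate` would be filter states, making the slip angles available to the controller even though they are never measured directly.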
83

Indoor Positioning using Sensor-fusion in Android Devices

Shala, Ubejd, Rodriguez, Angel January 2011
This project examines the level of accuracy that can be achieved in precision positioning using the built-in sensors of an Android smartphone. The project focuses on estimating the position of the phone inside a building where the GPS signal is weak or unavailable. The approach is sensor fusion: the position is determined by combining data from the device's different sensors, such as the accelerometer, gyroscope and wireless adapter. The results show that the technique is promising for future handheld indoor navigation systems that can be used in malls, museums, large office buildings, hospitals, etc.
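A minimal pedestrian-style dead-reckoning sketch in the spirit of this approach: accelerometer-magnitude peaks mark step events, the integrated gyro gives heading, and each step advances the position. The step length, sample interval and detection threshold are assumptions; the project's actual fusion is more elaborate.

```python
import math

def dead_reckon(accel_mag, gyro_z, dt=0.02, step_len=0.7, thresh=11.0):
    """Accumulate 2D position from step events and integrated heading.

    accel_mag: accelerometer magnitudes (m/s^2); gyro_z: yaw rates (rad/s).
    """
    x = y = heading = 0.0
    above = False
    for a, w in zip(accel_mag, gyro_z):
        heading += w * dt                  # integrate yaw rate into heading
        if a > thresh and not above:       # rising edge = one detected step
            x += step_len * math.cos(heading)
            y += step_len * math.sin(heading)
            above = True
        elif a <= thresh:
            above = False
    return x, y

# Two steps straight ahead with no rotation: expect ~1.4 m along x.
accel = [9.8, 12.0, 9.8, 9.8, 12.0, 9.8]
gyro = [0.0] * len(accel)
pos = dead_reckon(accel, gyro)
```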
84

Sensor fusion between a Synthetic Attitude and Heading Reference System and GPS / Sensorfusion mellan ett Syntetiskt attityd- och kursreferenssystem och GPS

Rosander, Regina January 2003
Sensor fusion deals with merging several signals into one, extracting a better and more reliable result. Traditionally the Kalman filter is used for this purpose, and aircraft navigation has benefited tremendously from its use. This thesis considers the merging of two navigation systems: the GPS positioning system and the Saab-developed Synthetic Attitude and Heading Reference System (SAHRS). The purpose is to find a model for such a fusion and to investigate whether the fusion improves the overall navigation performance. The non-linear nature of the navigation equations leads to the use of the extended Kalman filter, and the model is evaluated against both simulated and real data. The results show that this strategy indeed works, but problems arise when the GPS signal is lost.
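The core idea of fusing two sources can be illustrated by the simplest special case of the Kalman measurement update: inverse-variance weighting of two scalar estimates. The sensor variances below are illustrative assumptions, not values from the thesis.

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance fusion of two scalar estimates.

    The fused variance is always smaller than either input variance,
    which is why combining sensors pays off.
    """
    w = var2 / (var1 + var2)            # weight on the first estimate
    x = w * x1 + (1.0 - w) * x2
    var = var1 * var2 / (var1 + var2)
    return x, var

# A drifting dead-reckoned position (var 4.0) fused with GPS (var 1.0):
# the result sits close to the more trusted GPS measurement.
x, var = fuse(102.0, 4.0, 100.0, 1.0)
```

When GPS "falls away", as the abstract puts it, the second term disappears and the estimate reverts to the drifting dead-reckoned solution, which is exactly the failure mode observed.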
85

Robust Automotive Positioning: Integration of GPS and Relative Motion Sensors / Robust fordonspositionering: Integration av GPS och sensorer för relativ rörelse

Kronander, Jon January 2004
Automotive positioning systems relying exclusively on the input from a GPS receiver, which is a line-of-sight sensor, tend to be sensitive to situations with limited sky visibility. Such situations include urban environments with tall buildings, parking structures, tree cover, tunnels and bridges. In these situations, the system has to rely on the integration of relative motion sensors to estimate vehicle position. However, these sensor measurements are generally affected by errors such as offsets and scale factors, which cause the resulting position accuracy to deteriorate rapidly once GPS input is lost. The approach in this thesis is to use a GPS receiver in combination with low-cost sensor equipment to produce a robust positioning module, capable of handling situations where GPS input is corrupted or unavailable. The working principle is to calibrate the relative motion sensors while GPS is available, to improve accuracy during GPS outages. To fuse the GPS information with the sensor outputs, different models have been proposed and evaluated on real data sets. These models tend to be nonlinear and have therefore been processed in an extended Kalman filter structure. Experiments show that the proposed solutions can compensate for most of the errors associated with the relative motion sensors, and that the resulting positioning accuracy is improved accordingly.
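The calibrate-while-GPS-is-available principle can be sketched as an ordinary least-squares fit of a relative-motion sensor's scale factor and offset against GPS-derived references. The linear error model and the synthetic data below are assumptions for illustration; the thesis uses an EKF rather than batch regression.

```python
def fit_scale_offset(raw, ref):
    """Least-squares fit of ref ~ scale * raw + offset."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(ref) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, ref))
    sxx = sum((x - mx) ** 2 for x in raw)
    scale = sxy / sxx
    return scale, my - scale * mx

# Simulated driving segments: the wheel-speed distance has a true scale
# factor of 1.05 and a 0.5 m offset per segment relative to GPS distance.
raw = [10.0, 20.0, 30.0, 40.0]
gps = [1.05 * r + 0.5 for r in raw]
scale, offset = fit_scale_offset(raw, gps)
```

Once `scale` and `offset` are known, the corrected sensor `scale * raw + offset` keeps drifting slowly instead of rapidly when GPS drops out.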
86

Modeling and Estimation of Dynamic Tire Properties

Narby, Erik January 2006
Information about dynamic tire properties has always been important for drivers of wheel-driven vehicles. With the increasing number of systems in modern vehicles designed to measure and control vehicle behavior, information regarding dynamic tire properties has grown even more important. In this thesis a number of methods for modeling and estimating dynamic tire properties have been implemented and evaluated. The more general issue of estimating model parameters in linear and non-linear vehicle models is also addressed. We conclude that the slope of the tire slip curve seems to depend on the stiffness of the road surface, and introduce the term combined stiffness. We also show that it is possible to estimate both longitudinal and lateral combined stiffness using only standard vehicle sensors.
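The "combined stiffness" above is the slope of the tire force versus slip curve in its linear region, so a regression through the origin recovers it from (slip, force) samples. The sample values below are synthetic, not thesis data.

```python
def combined_stiffness(slips, forces):
    """Least-squares slope through the origin: force ~ C * slip."""
    return sum(s * f for s, f in zip(slips, forces)) / sum(s * s for s in slips)

# Points in the linear region of a hypothetical slip curve, C = 50 kN per
# unit slip; real data from wheel-speed and force estimates would be noisy.
slips = [0.01, 0.02, 0.03, 0.04]
forces = [50000.0 * s for s in slips]
C = combined_stiffness(slips, forces)
```

On a softer surface the same computation would return a smaller `C`, which is the surface dependence the abstract points out.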
87

3D Multi-Field Multi-Scale Features From Range Data In Spacecraft Proximity Operations

Flewelling, Brien Roy May 2012
A fundamental problem in spacecraft proximity operations is the determination of the six-degree-of-freedom relative navigation solution between the observer reference frame and a reference frame tied to a proximal body. In the most unconstrained case, the proximal body may be uncontrolled, and the observer spacecraft has no a priori information about the body. A spacecraft in this scenario must simultaneously map the generally poorly known body being observed and safely navigate relative to it. Simultaneous localization and mapping (SLAM) is a difficult problem which has been the focus of research in recent years. The most promising approaches extract local features in 2D or 3D measurements and track them in subsequent observations by matching a descriptor. These methods exist both for active sensors such as Light Detection and Ranging (LIDAR) or laser RADAR (LADAR), and for passive sensors such as CCD and CMOS camera systems. This dissertation presents a method for fusing the time-of-flight (ToF) range data inherent to scanning LIDAR systems with the passive light-field measurements of optical systems, extracting features which exploit information from each sensor, and solving the unique SLAM problem inherent to spacecraft proximity operations. Scale-space analysis is extended to unstructured 3D point clouds by means of an approximation to the Laplace-Beltrami operator, which computes the scale space on a manifold embedded in 3D object space using Gaussian convolutions based on a geodesic distance weighting. The construction of the scale space is shown to be equivalent both to the application of the diffusion equation to the surface data and to the surface evolution process which results from mean curvature flow. Geometric features are localized in regions of high spatial curvature or large diffusion displacements at multiple scales. The extracted interest points are associated with a local multi-field descriptor constructed from measured data in the object space.
Defining features in object space instead of image space is shown to be an important step, making possible the simultaneous consideration of co-registered texture and the associated geometry. These descriptors, known as Multi-Field Diffusion Flow Signatures, encode the shape and multi-texture information of local neighborhoods in textured range data. Multi-Field Diffusion Flow Signatures display utility in difficult space scenarios, including high-contrast and saturating lighting conditions, bland and repeating textures, and non-Lambertian surfaces. The effectiveness and utility of Multi-Field Multi-Scale (MFMS) Features described by Multi-Field Diffusion Flow Signatures is evaluated using real data from proximity operation experiments performed at the Land Air and Space Robotics (LASR) Laboratory at Texas A&M University.
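A crude stand-in for the diffusion process described above: one smoothing step moves each point toward the Gaussian-weighted mean of the other points. Euclidean distances replace the geodesic weighting the dissertation uses, so this only approximates the idea on a small synthetic cloud.

```python
import math

def diffuse(points, sigma=1.0, step=0.5):
    """One explicit diffusion step on a 3D point cloud.

    Each point moves a fraction `step` of the way toward the Gaussian-weighted
    mean of its neighbours; repeating this builds a scale space where large
    displacements flag high-curvature (feature-rich) regions.
    """
    out = []
    for i, p in enumerate(points):
        wsum = [0.0, 0.0, 0.0]
        wtot = 0.0
        for j, q in enumerate(points):
            if i == j:
                continue
            w = math.exp(-math.dist(p, q) ** 2 / (2 * sigma ** 2))
            wtot += w
            for k in range(3):
                wsum[k] += w * q[k]
        mean = [c / wtot for c in wsum]
        out.append(tuple(p[k] + step * (mean[k] - p[k]) for k in range(3)))
    return out

# A spike lifted off a flat 3x3 plane is pulled back toward the surface.
cloud = [(float(x), float(y), 0.0) for x in range(3) for y in range(3)]
cloud[4] = (1.0, 1.0, 1.0)    # centre point raised 1 m above the plane
smoothed = diffuse(cloud)
```

The centre point's large displacement relative to its flat neighbours is exactly the kind of diffusion-displacement cue used for interest-point localization.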
88

Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors

Hol, Jeroen Diederik January 2008
This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to camera-only systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data, and reduces the need for high-frequency vision updates.

The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation, for which state-space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground-truth data is used to move the sensor unit. In both cases the system performed well.

Calibration of the relative position and orientation of the camera and the inertial sensor turns out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these quantities has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.
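A toy version of the relative-orientation calibration problem, reduced to a single 2D angle recovered in closed form from paired direction measurements. The thesis solves the full 3D camera-to-IMU problem by gray-box identification; everything below, including the data, is an illustrative assumption.

```python
import math

def relative_angle(cam_dirs, imu_dirs):
    """Angle (rad) that rotates camera directions onto IMU directions.

    Uses the circular mean of the per-pair angle differences, which is the
    least-squares answer for angles and is robust to wrap-around.
    """
    s = sum(math.sin(b - a) for a, b in zip(cam_dirs, imu_dirs))
    c = sum(math.cos(b - a) for a, b in zip(cam_dirs, imu_dirs))
    return math.atan2(s, c)

# Paired bearings (rad) seen by the two sensors; true misalignment 0.3 rad.
cam = [0.0, 0.5, 1.0, 2.0]
imu = [a + 0.3 for a in cam]
est = relative_angle(cam, imu)
```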
89

Investigation of Inertial Navigation for Localization in Underground Mines

Svensson, John January 2015
This thesis project considers the potential use of inertial navigation on a consumer-grade tablet mounted in a vehicle in an underground mine. The goal is to identify which sensors and techniques are useful and to design a navigation algorithm based on those results. The navigation algorithm is intended to work alongside the current received signal strength indication (RSSI) positioning system. Testing of the gyroscope, accelerometer and magnetometer sensors suggests that, while dead reckoning is likely not precise enough, an orientation filter can be designed that can be used for navigation. A complementary orientation filter using the gyroscope and accelerometer is then designed that shows better results than the default sensor fusion solutions available in Android. The filter is expandable and could include magnetometer data in the future. Based on the outputs of this filter, a navigation algorithm based on angle matching with map information is proposed. Precise positioning in an underground mine can be crucial to employee safety, and may also bring production benefits.
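A complementary orientation filter of the kind described can be sketched in a few lines: the gyro is integrated for short-term accuracy while the accelerometer's gravity direction corrects long-term drift. The gain, sample rate and bias values below are illustrative assumptions, not the thesis's tuning.

```python
import math

def complementary_filter(gyro, accel, dt=0.02, alpha=0.98):
    """Estimate pitch (rad) from gyro rates (rad/s) and accel (ax, az) pairs.

    alpha close to 1 trusts the integrated gyro on short timescales;
    the (1 - alpha) accelerometer term removes slow drift.
    """
    pitch = 0.0
    for w, (ax, az) in zip(gyro, accel):
        accel_pitch = math.atan2(ax, az)                  # gravity-referenced pitch
        pitch = alpha * (pitch + w * dt) + (1 - alpha) * accel_pitch
    return pitch

# Stationary sensor tilted 0.1 rad with a constant gyro bias of 0.005 rad/s:
# pure integration would drift by 0.2 rad over these samples, but the
# accelerometer term holds the estimate near the true tilt.
n = 2000
gyro = [0.005] * n
accel = [(math.sin(0.1), math.cos(0.1))] * n
pitch = complementary_filter(gyro, accel)
```

This bias rejection is the property that makes such a filter outperform raw dead reckoning on consumer-grade hardware.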
90

Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS

Hol, Jeroen D. January 2011
The usage of inertial sensors has traditionally been confined primarily to the aviation and marine industries due to their associated cost and bulkiness. During the last decade, however, inertial sensors have undergone a rather dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace in many applications and can even be found in many consumer products, for instance smartphones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent given the reduced accuracy of MEMS inertial sensors. This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors into a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurements in known units. Furthermore, whenever multiple sensors are combined, additional calibration issues arise, since the measurements are seldom acquired in the same physical location or expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS. Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided.
Furthermore, this thesis contains ideas on using an optimization-based sensor fusion method for a multi-segment inertial tracking system used for human motion capture, as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver. The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferromagnetic object, the relative position and orientation of a rigidly connected camera and IMU, and the clock parameters and receiver positions of an indoor UWB positioning system. / MATRIS (Markerless real-time Tracking for Augmented Reality Image), a sixth framework programme funded by the European Union / CADICS (Control, Autonomy, and Decision-making in Complex Systems), a Linnaeus Centre funded by the Swedish Research Council (VR) / Strategic Research Center MOVIII, funded by the Swedish Foundation for Strategic Research (SSF)
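The UWB positioning mentioned above rests on multilateration: ranges to receivers at known positions determine the transmitter's location. Below is a minimal noise-free 2D sketch with a hypothetical anchor layout; the thesis additionally estimates the receivers' clock parameters, which are omitted here.

```python
import math

def trilaterate(anchors, ranges):
    """2D position from ranges to three known anchors.

    Subtracting the first range equation from the others removes the
    quadratic terms, leaving a 2x2 linear system solved by Cramer's rule.
    Exact for noise-free ranges.
    """
    (x0, y0), r0 = anchors[0], ranges[0]
    rows = []
    for (xi, yi), ri in zip(anchors[1:3], ranges[1:3]):
        a = 2 * (xi - x0)
        b = 2 * (yi - y0)
        c = r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2
        rows.append((a, b, c))
    (a1, b1, c1), (a2, b2, c2) = rows
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # hypothetical UWB receivers
true = (3.0, 4.0)
ranges = [math.dist(true, a) for a in anchors]
pos = trilaterate(anchors, ranges)
```

With real UWB hardware the ranges come from time-of-arrival measurements, which is why the receiver clock calibration treated in the thesis is a prerequisite for this step.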
