61

Sensor Fusion in Smartphones : with Application to Car Racing Performance Analysis / Sensorfusion i Smartphones : med Tillämpning Inom Bilkörningsanalys

Wallin, Jonas, Zachrisson, Joakim January 2013 (has links)
Today's smartphones are equipped with a variety of different sensors such as GPS receivers, accelerometers, gyroscopes and magnetometers, making smartphones viable tools in many applications. The computational capacity of smartphones allows for software applications running advanced signal processing algorithms. Thus, attaching a smartphone inside a car makes it possible to estimate the kinematics of the vehicle by fusing information from the different sensors inside the smartphone. Fusing information from different sources to improve estimation quality is a well-known problem, and many methods and algorithms exist in this area. This thesis approaches the sensor fusion problem of estimating the kinematics of cars using smartphones, for the purpose of analysing driving performance. Different varieties of the coordinated turn model for describing the vehicle dynamics are investigated. Different measurement models are also evaluated, where bias errors of the sensors are taken into consideration. Pre-filtering and construction of pseudo-measurements are also considered, which allows the use of state-space models of lower dimension. / Dagens smartphones är utrustade med en rad olika typer av sensorer såsom GPS mottagare, accelerometrar, gyroskop och magnetometrar vilket medför ett stort användningsområde. Beräkningskapaciteten hos smartphones gör det möjligt för mjukvaruapplikationer att använda sig av avancerade algoritmer för signalbehandling. Det är därför möjligt att placera en smartphone inuti en bil och skatta bilens kinematik genom att kombinera informationen från de olika sensorerna. Att fusionera information från olika källor för att erhålla bättre skattningar är ett välkänt område där det finns många metoder och algoritmer utvecklade. Detta examensarbete behandlar sensorfusionsproblemet att skatta bilars kinematik med hjälp av smartphones för syftet att kunna analysera körprestanda. Olika varianter av en coordinated turn modell för att beskriva bilens dynamik undersöks. Dessutom testas olika modeller för sensorerna där hänsyn till exempelvis biasfel tas. Förbehandling av data och pseudomätningar testas också vilket gör det möjligt att använda tillståndsmodeller med låg dimension.
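As an illustration of the kind of motion model the abstract refers to, the sketch below shows one prediction step of a basic coordinated turn model in Python. The state layout and the straight-line fallback are assumptions for this example, not the exact model variants evaluated in the thesis.

```python
import numpy as np

def ct_predict(state, dt):
    """One prediction step of a coordinated turn model (illustrative).

    state = [x, y, v, psi, omega]: position, speed, heading, yaw rate.
    """
    x, y, v, psi, omega = state
    if abs(omega) > 1e-6:
        x_new = x + (v / omega) * (np.sin(psi + omega * dt) - np.sin(psi))
        y_new = y + (v / omega) * (np.cos(psi) - np.cos(psi + omega * dt))
    else:  # straight-line limit when the yaw rate is (nearly) zero
        x_new = x + v * dt * np.cos(psi)
        y_new = y + v * dt * np.sin(psi)
    return np.array([x_new, y_new, v, psi + omega * dt, omega])
```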
62

Vision and Radar Sensor Fusion for Advanced Driver Assistance Systems / Vision och Radar Sensorfusion för Avancerade Förarassistanssystem

Andersson Naesseth, Christian January 2013 (has links)
The World Health Organization predicts that by the year 2030, road traffic injuries will be one of the top five leading causes of death. Many of these deaths and injuries can be prevented by driving cars properly equipped with state-of-the-art safety and driver assistance systems. Some examples are auto-brake and auto-collision avoidance, which are becoming more and more popular on the market today. A recent study by a Swedish insurance company has shown that on roads with speeds up to 50 km/h an auto-brake system can reduce personal injuries by up to 64 percent. In fact, in an estimated 40 percent of crashes, the auto-brake reduced the effects to the degree that no personal injury was sustained. For these so-called Advanced Driver Assistance Systems to be really effective, it is imperative that they have good situational awareness. It is important that they have adequate information about the vehicle's immediate surroundings. Where are other cars, pedestrians or motorcycles relative to our own vehicle? How fast are they driving and in which lane? How is our own vehicle driving? Are there objects in the way of our own vehicle's intended path? These and many more questions can be answered by a properly designed system for situational awareness. In this thesis we design and evaluate, both quantitatively and qualitatively, sensor fusion algorithms for multi-target tracking. We use a combination of camera and radar information to perform fusion and find relevant objects in a cluttered environment. The combination of these two sensors is very interesting because of their complementary attributes. The radar system has high range resolution but poor bearing resolution. The camera system, on the other hand, has a very high bearing resolution. This is very promising, with the potential to substantially increase the accuracy of the tracking system compared to using just one of the two. We have also designed algorithms for path prediction and a first threat awareness logic, both of which are qualitatively evaluated.
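A minimal sketch of why the two sensors complement each other: fuse an accurate radar range with an accurate camera bearing into one Cartesian position with a first-order covariance. The function and noise parameters are hypothetical; a full system of the kind described above additionally needs multi-target data association and track filtering.

```python
import numpy as np

def fuse_range_bearing(r_radar, sigma_r, phi_camera, sigma_phi):
    """Combine radar range (accurate) and camera bearing (accurate)
    into a Cartesian position estimate and its covariance (illustrative)."""
    x = r_radar * np.cos(phi_camera)
    y = r_radar * np.sin(phi_camera)
    # Jacobian of the polar-to-Cartesian map, used to propagate the
    # (assumed independent) measurement noise of the two sensors.
    J = np.array([[np.cos(phi_camera), -r_radar * np.sin(phi_camera)],
                  [np.sin(phi_camera),  r_radar * np.cos(phi_camera)]])
    R_polar = np.diag([sigma_r**2, sigma_phi**2])
    P = J @ R_polar @ J.T
    return np.array([x, y]), P
```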
63

A Methodology For Real-time Sensor-based Blockage Assessment Of Building Structures During Earthquakes

Ergin, Tuluhan 01 February 2013 (has links) (PDF)
During and after earthquakes, occupants inside a damaged building should be evacuated rapidly and safely, whereas related units outside the building (e.g. first responders) should know its current condition. Obviously, this information should be as accurate as possible and accessed in a timely manner in order to speed up the evacuation. Unfortunately, the absence of such information during evacuation and emergency response operations results in an increased number of casualties. Hence, there arises a need for an approach that makes rapid damage and blockage assessment in buildings possible. This study focuses on sensor-based, real-time blockage assessment of buildings during earthquakes and is based on the idea that the blocked units of a building (e.g. corridors) can be assessed with the help of different types of sensors. The number and locations of these sensors are arranged in such a way that it becomes possible to picture the current condition of the building. The sensors utilized in this study are accelerometers, ultrasonic range finders, gyro sensors, closed cable circuits and video cameras. The research steps of this thesis include (1) examination of the damage indicators which can cause blockage, (2) assessment of the monitoring devices, (3) presentation of the experimental studies conducted in order to assess the blockage condition of a corridor unit, (4) proposal of a sensor fusion approach, and (5) presentation of the case study performed as an implementation of the blockage assessment. The findings of this research can be used in future studies on sensor-based blockage assessment.
64

MALLS - Mobile Automatic Launch and Landing Station for VTOL UAVs

Gising, Andreas January 2008 (has links)
The market for vertical takeoff and landing unmanned aerial vehicles, VTOL UAVs, is growing rapidly. To meet the demand for VTOL UAVs in offshore applications, CybAero has developed a novel concept for landing on moving objects called MALLS, Mobile Automatic Launch and Landing Station. MALLS can tilt its helipad and is supposed to align either to the horizontal plane, with an operator-adjusted offset, or to the helicopter skids. Doing so eliminates the gyroscopic forces otherwise induced in the rotor disc as the helicopter is forced to change attitude when the skids align to the ground during landing, or when standing on a jolting boat with the rotor spun up. This master's thesis project is an attempt to bring the concept of MALLS closer to a quarter-scale implementation. The main focus lies on the development of the measurement methods for obtaining the references needed by MALLS: the horizontal plane and the plane of the helicopter skids. The control of MALLS is also discussed. The measurement methods developed have been verified by tested implementations or simulations. The theories behind them include, among other things, signal filtering, Kalman filtering, sensor fusion and search algorithms. The project has led to the MALLS prototype being able to align its helipad to the horizontal plane, and a method for measuring the relative attitude between the helipad and the helicopter skids has been developed. Suggestions for future improvements are also presented.
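One common way to estimate tilt relative to the horizontal plane is to blend gyro and accelerometer data, for example with a complementary filter as sketched below. The filter structure and gain are illustrative assumptions, not necessarily the method implemented in the thesis.

```python
import numpy as np

def accel_roll(ay, az):
    """Roll angle implied by gravity as seen by the accelerometer."""
    return np.arctan2(ay, az)

def complementary_tilt(angle_prev, gyro_rate, ay, az, dt, alpha=0.98):
    """One complementary-filter step: the gyro gives a smooth short-term
    angle, the accelerometer a drift-free long-term reference to the
    horizontal plane (the gain alpha is an assumed value)."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_roll(ay, az)
```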
65

Shooter Localization in a Wireless Sensor Network / Lokalisering av skytt i ett trådlöst sensornätverk

Wilsson, Olof January 2009 (has links)
Shooter localization systems are used to detect and locate the origin of gunfire. A wireless sensor network is one possible implementation of such a system. A wireless sensor network is sensitive to synchronization errors: localization techniques that rely on timing will give less accurate or even useless results if the synchronization errors are too large. This thesis focuses on the influence of synchronization errors on the ability to localize a shooter using a wireless sensor network. A localization algorithm is developed and implemented, and the effect of synchronization errors is studied. The localization algorithm is evaluated using numerical experiments, simulations, and data from real gunshots collected at field trials. The results indicate that the developed localization algorithm is able to localize a shooter with quite good accuracy. However, the localization performance is to a high degree influenced by the geographical configuration of the network as well as the synchronization error. / Skottlokaliseringssystem används för att upptäcka och lokalisera ursprunget för avlossade skott. Ett trådlöst sensornätverk är ett sätt att utforma ett sådant system. Trådlösa sensornätverk är känsliga för synkroniseringsfel. Lokaliseringsmetoder som bygger på tidsobservationer kommer med för stora synkroniseringsfel ge dåliga eller helt felaktiga resultat. Detta examensarbete fokuserar på vilken inverkan synkroniseringsfel har på möjligheterna att lokalisera en skytt i ett trådlöst sensornätverk. En lokaliseringsalgoritm utvecklas och förmågan att korrekt lokalisera en skytt vid olika synkroniseringsfel undersöks. Lokaliseringsalgoritmen prövas med numeriska experiment, simuleringar och även för data från riktiga skottljud, insamlade vid fältförsök. Resultaten visar att lokaliseringsalgoritmen fungerar tillfredställande, men att lokaliseringsförmågan till stor del påverkas av synkroniseringsfel men även av sensornätverkets geografiska utseende.
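For intuition, localization from times of arrival of the muzzle blast can be posed as a nonlinear least-squares problem, as in the hypothetical sketch below. The thesis additionally has to handle synchronization errors and the supersonic shock wave, which are omitted here.

```python
import numpy as np
from scipy.optimize import least_squares

def locate_shooter(sensor_pos, toa, c=343.0):
    """Estimate the 2D shooter position and firing time from muzzle-blast
    times of arrival (illustrative sketch, perfectly synchronized nodes).

    sensor_pos: (N, 2) node positions in metres, toa: (N,) arrival times in s.
    """
    def residuals(theta):
        x, y, t0 = theta
        dist = np.linalg.norm(sensor_pos - np.array([x, y]), axis=1)
        return toa - (t0 + dist / c)

    # Initial guess: network centroid and the earliest arrival time.
    x0 = np.append(sensor_pos.mean(axis=0), toa.min())
    return least_squares(residuals, x0).x
```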
66

State estimation of RC cars for the purpose of drift control / Tillståndsskattning på RC-bilar för driftreglering

Liljestrand, Jonatan January 2011 (has links)
High-precision state estimation is crucial when executing drift control and high-speed control close to the stability limit on electric RC-scale cars. In this thesis the estimation is made possible through recursive Bayesian filtering, more precisely the extended Kalman filter. By modelling the dynamics of the car and using the model together with position measurements and control input signals, it is possible to do state estimation and prediction with high accuracy, even for non-measured states. The focus is on real-time, on-line estimation of the so-called slip angles of the front and rear tyres, because of their impact on the car's behaviour. With the extended information given to the system controller, higher levels of controllability can be reached. This can be used not only for higher speeds and drift control, but also opens the possibility to study future anti-skid safety measures for ground vehicles.
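The slip angles mentioned above are commonly defined through the single-track (bicycle) model; a textbook version is sketched below. Variable names and sign conventions are assumptions and may differ from those used in the thesis.

```python
import numpy as np

def slip_angles(vx, vy, yaw_rate, steer, lf, lr):
    """Front and rear tyre slip angles from the single-track model.

    vx, vy: body-frame longitudinal and lateral velocities,
    lf, lr: distances from the centre of gravity to the front/rear axle,
    steer: front wheel steering angle (all angles in radians).
    """
    alpha_f = steer - np.arctan2(vy + lf * yaw_rate, vx)
    alpha_r = -np.arctan2(vy - lr * yaw_rate, vx)
    return alpha_f, alpha_r
```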
67

Sensor fusion between a Synthetic Attitude and Heading Reference System and GPS / Sensorfusion mellan ett Syntetiskt attityd- och kursreferenssystem och GPS

Rosander, Regina January 2003 (has links)
Sensor fusion deals with the merging of several signals into one, extracting a better and more reliable result. Traditionally the Kalman filter is used for this purpose, and aircraft navigation has benefited tremendously from its use. This thesis considers the merging of two navigation systems, the GPS positioning system and the Saab-developed Synthetic Attitude and Heading Reference System (SAHRS). The purpose is to find a model for such a fusion and to investigate whether the fusion will improve the overall navigation performance. The non-linear nature of the navigation equations leads to the use of the extended Kalman filter, and the model is evaluated against both simulated and real data. The results show that this strategy indeed works, but problems arise when the GPS signal is lost.
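At the core of such a fusion is the extended Kalman filter measurement update, in which, for example, GPS observations correct the SAHRS state. A generic, illustrative version is sketched below; the actual state and measurement models are defined in the thesis.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Generic extended Kalman filter measurement update (illustrative).

    x, P: prior state and covariance; z: measurement (e.g. GPS position);
    h: nonlinear measurement function; H: its Jacobian evaluated at x;
    R: measurement noise covariance.
    """
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_post = x + K @ y
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post
```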
68

Robust Automotive Positioning: Integration of GPS and Relative Motion Sensors / Robust fordonspositionering: Integration av GPS och sensorer för relativ rörelse

Kronander, Jon January 2004 (has links)
Automotive positioning systems relying exclusively on the input from a GPS receiver, which is a line-of-sight sensor, tend to be sensitive to situations with limited sky visibility. Such situations include urban environments with tall buildings, the interiors of parking structures, areas underneath trees, tunnels, and passages under bridges. In these situations, the system has to rely on the integration of relative motion sensors to estimate the vehicle position. However, these sensor measurements are generally affected by errors such as offsets and scale factors, which cause the resulting position accuracy to deteriorate rapidly once GPS input is lost. The approach in this thesis is to use a GPS receiver in combination with low-cost sensor equipment to produce a robust positioning module. The module should be capable of handling situations where GPS input is corrupted or unavailable. The working principle is to calibrate the relative motion sensors when GPS is available, in order to improve the accuracy during GPS outages. To fuse the GPS information with the sensor outputs, different models have been proposed and evaluated on real data sets. These models tend to be nonlinear and have therefore been processed in an extended Kalman filter structure. Experiments show that the proposed solutions can compensate for most of the errors associated with the relative motion sensors, and that the resulting positioning accuracy is improved accordingly.
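To make the offset and scale factor errors concrete, the sketch below shows a hypothetical wheel-speed error model and a simple least-squares calibration against GPS-derived speed collected while GPS is available. The models actually used in the thesis are more elaborate and run inside an extended Kalman filter.

```python
import numpy as np

def corrected_speed(raw_speed, scale, offset):
    """Relative-motion (e.g. wheel-speed) measurement corrected for a
    scale factor and an offset, the two error terms named in the abstract."""
    return scale * raw_speed + offset

def calibrate(raw_speeds, gps_speeds):
    """Least-squares fit of scale and offset against GPS-derived speed,
    to be reused during GPS outages (illustrative batch version)."""
    A = np.column_stack([raw_speeds, np.ones_like(raw_speeds)])
    scale, offset = np.linalg.lstsq(A, gps_speeds, rcond=None)[0]
    return scale, offset
```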
69

3D Multi-Field Multi-Scale Features From Range Data In Spacecraft Proximity Operations

Flewelling, Brien Roy May 2012 (has links)
A fundamental problem in spacecraft proximity operations is the determination of the 6-degree-of-freedom relative navigation solution between the observer reference frame and a reference frame tied to a proximal body. In the most unconstrained case, the proximal body may be uncontrolled, and the observer spacecraft has no a priori information on the body. A spacecraft in this scenario must simultaneously map the generally poorly known body being observed, and safely navigate relative to it. Simultaneous localization and mapping (SLAM) is a difficult problem which has been the focus of research in recent years. The most promising approaches extract local features in 2D or 3D measurements and track them in subsequent observations by means of matching a descriptor. These methods exist for both active sensors, such as Light Detection and Ranging (LIDAR) or laser RADAR (LADAR), and passive sensors, such as CCD and CMOS camera systems. This dissertation presents a method for fusing time-of-flight (ToF) range data inherent to scanning LIDAR systems with the passive light field measurements of optical systems, extracting features which exploit information from each sensor, and solving the unique SLAM problem inherent to spacecraft proximity operations. Scale space analysis is extended to unstructured 3D point clouds by means of an approximation to the Laplace-Beltrami operator, which computes the scale space on a manifold embedded in 3D object space using Gaussian convolutions based on a geodesic distance weighting. The construction of the scale space is shown to be equivalent both to the application of the diffusion equation to the surface data and to the surface evolution process which results from mean curvature flow. Geometric features are localized in regions of high spatial curvature or large diffusion displacements at multiple scales. The extracted interest points are associated with a local multi-field descriptor constructed from measured data in the object space. Defining features in object space instead of image space is shown to be an important step, making the simultaneous consideration of co-registered texture and the associated geometry possible. These descriptors, known as Multi-Field Diffusion Flow Signatures, encode the shape and multi-texture information of local neighborhoods in textured range data. Multi-Field Diffusion Flow Signatures display utility in difficult space scenarios including high-contrast and saturating lighting conditions, bland and repeating textures, as well as non-Lambertian surfaces. The effectiveness and utility of Multi-Field Multi-Scale (MFMS) Features described by Multi-Field Diffusion Flow Signatures is evaluated using real data from proximity operation experiments performed at the Land, Air and Space Robotics (LASR) Laboratory at Texas A&M University.
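As a rough illustration of the scale-space idea, the sketch below performs one Gaussian diffusion step on an unstructured point cloud, using Euclidean k-nearest-neighbour distances as a stand-in for the geodesic weighting described above; the per-point displacement then acts as a crude curvature/interest indicator. This is an assumption-laden simplification of the actual Laplace-Beltrami approximation in the dissertation.

```python
import numpy as np
from scipy.spatial import cKDTree

def diffuse_point_cloud(points, sigma, k=20):
    """One Gaussian diffusion (smoothing) step on an (N, 3) point cloud.

    Euclidean neighbour distances approximate geodesic ones, which is
    only reasonable at small scales; purely for illustration.
    """
    tree = cKDTree(points)
    dists, idx = tree.query(points, k=k)
    w = np.exp(-dists**2 / (2.0 * sigma**2))   # Gaussian weights per neighbour
    w /= w.sum(axis=1, keepdims=True)          # normalise per point
    smoothed = np.einsum('nk,nkd->nd', w, points[idx])
    # Large displacements between scales indicate high curvature regions,
    # where interest points would be localised.
    displacement = np.linalg.norm(smoothed - points, axis=1)
    return smoothed, displacement
```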
70

Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors

Hol, Jeroen Diederik January 2008 (has links)
This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to ‘camera-only’ systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data and reduces the need for high-frequency vision updates.

The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation, for which state space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground truth data is used to move the sensor unit. In both cases the system performed well.

Calibration of the relative position and orientation of the camera and the inertial sensor turns out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.
