81

State estimation of RC cars for the purpose of drift control / Tillståndsskattning på RC-bilar för driftreglering

Liljestrand, Jonatan January 2011 (has links)
High precision state estimation is crucial when executing drift control and high speed control close to the stability limit on electric RC scale cars. In this thesis the estimation is made possible through recursive Bayesian filtering, more precisely the extended Kalman filter. By modelling the dynamics of the car and using the model together with position measurements and control input signals, it is possible to estimate and predict with high accuracy even states that are not measured directly. Focus is on real-time, on-line estimation of the so-called slip angles of the front and rear tyres, because of their impact on the car’s behaviour. With the extended information given to the system controller, higher levels of controllability could be reached. This can be used not only for higher speeds and drift control, but also to study future anti-skid safety measures for ground vehicles.
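The following is a minimal sketch, in Python, of the kind of extended Kalman filter loop described in this abstract, with the slip angle recovered from estimated body-frame velocities. The state layout, motion model, noise levels and slip-angle definition are illustrative assumptions, not the model developed in the thesis.

```python
import numpy as np

# Illustrative EKF. State: [x, y, yaw, vx_body, vy_body]; measurement: [x, y]
# from an external positioning system. All models and noise values are assumptions.

def motion_model(x, u, dt):
    """Propagate the state one step; u = [yaw_rate] taken from the control input."""
    px, py, yaw, vx, vy = x
    px += (vx * np.cos(yaw) - vy * np.sin(yaw)) * dt
    py += (vx * np.sin(yaw) + vy * np.cos(yaw)) * dt
    yaw += u[0] * dt
    return np.array([px, py, yaw, vx, vy])

def numerical_jacobian(func, x, *args, eps=1e-6):
    """Finite-difference Jacobian of func with respect to x."""
    fx = func(x, *args)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (func(x + dx, *args) - fx) / eps
    return J

def ekf_step(x, P, u, z, dt, Q, R):
    # Time update (prediction) using the nonlinear motion model
    F = numerical_jacobian(motion_model, x, u, dt)
    x = motion_model(x, u, dt)
    P = F @ P @ F.T + Q
    # Measurement update with a position fix
    H = np.array([[1., 0., 0., 0., 0.],
                  [0., 1., 0., 0., 0.]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def slip_angle(x):
    """Body slip angle follows directly from the estimated body-frame velocities."""
    return np.arctan2(x[4], x[3])

x, P = np.array([0., 0., 0., 1., 0.]), np.eye(5) * 0.1
Q, R = np.eye(5) * 1e-3, np.eye(2) * 1e-2
x, P = ekf_step(x, P, u=[0.2], z=np.array([0.05, 0.01]), dt=0.05, Q=Q, R=R)
print(slip_angle(x))
```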
82

Indoor Positioning using Sensor-fusion in Android Devices

Shala, Ubejd, Rodriguez, Angel January 2011 (has links)
This project examines the level of accuracy that can be achieved in precision positioning by using built-in sensors in an Android smartphone. The project is focused in estimating the position of the phone inside a building where the GPS signal is bad or unavailable. The approach is sensor-fusion: by using data from the device’s different sensors, such as accelerometer, gyroscope and wireless adapter, the position is determined. The results show that the technique is promising for future handheld indoor navigation systems that can be used in malls, museums, large office buildings, hospitals, etc.
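Below is a small illustrative sketch of one way such a sensor-fusion scheme can be arranged: heading from integrated gyroscope rate, position from step-based dead reckoning, and occasional corrections from a Wi-Fi (wireless adapter) position fix. The step length, blend weight and synthetic data are assumptions, not values from the thesis.

```python
import numpy as np

# Assumed scheme: gyro-integrated heading + step-based dead reckoning,
# periodically pulled toward a Wi-Fi position fix.

def dead_reckon(pos, heading, gyro_z, dt, step_detected, step_len=0.7):
    heading += gyro_z * dt                 # integrate yaw rate for heading
    if step_detected:                      # advance one assumed step length
        pos = pos + step_len * np.array([np.cos(heading), np.sin(heading)])
    return pos, heading

def fuse_wifi(pos, wifi_pos, weight=0.3):
    """Blend the dead-reckoned position with a Wi-Fi fix; weight reflects relative trust."""
    return (1.0 - weight) * pos + weight * np.asarray(wifi_pos)

pos, heading = np.zeros(2), 0.0
for k in range(100):
    gyro_z = 0.01                          # rad/s, synthetic gyro reading
    step = (k % 2 == 0)                    # pretend a step is detected every other sample
    pos, heading = dead_reckon(pos, heading, gyro_z, dt=0.5, step_detected=step)
    if k % 20 == 0:                        # sparse Wi-Fi fix (here faked near the estimate)
        pos = fuse_wifi(pos, wifi_pos=pos + np.random.randn(2) * 0.5)
print(pos, heading)
```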
83

Sensor fusion between a Synthetic Attitude and Heading Reference System and GPS / Sensorfusion mellan ett Syntetiskt attityd- och kursreferenssystem och GPS

Rosander, Regina January 2003 (has links)
Sensor fusion deals with the merging of several signals into one, extracting a better and more reliable result. Traditionally the Kalman filter is used for this purpose, and aircraft navigation has benefited tremendously from its use. This thesis considers the fusion of two navigation systems, the GPS positioning system and the Saab developed Synthetic Attitude and Heading Reference System (SAHRS). The purpose is to find a model for such a fusion and to investigate whether the fusion will improve the overall navigation performance. The non-linear nature of the navigation equations leads to the use of the extended Kalman filter, and the model is evaluated against both simulated and real data. The results show that this strategy indeed works, but problems arise when the GPS signal is lost.
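The fragment below only illustrates the structural point made here: the measurement update is applied when a GPS fix is present, while the SAHRS-driven prediction carries the estimate through outages. The predict and update callables stand for generic extended Kalman filter steps; their form and names are assumptions, not the thesis model.

```python
# Illustrative fusion logic only; predict() and update() are assumed generic EKF steps.
def fuse(x, P, sahrs_input, gps_fix, dt, predict, update):
    x, P = predict(x, P, sahrs_input, dt)   # always propagate using SAHRS data
    if gps_fix is not None:                 # the GPS signal may be lost
        x, P = update(x, P, gps_fix)        # correct with the GPS position when available
    return x, P
```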
84

Robust Automotive Positioning: Integration of GPS and Relative Motion Sensors / Robust fordonspositionering: Integration av GPS och sensorer för relativ rörelse

Kronander, Jon January 2004 (has links)
Automotive positioning systems relying exclusively on the input from a GPS receiver, which is a line-of-sight sensor, tend to be sensitive to situations with limited sky visibility. Such situations include: urban environments with tall buildings; inside parking structures; underneath trees; in tunnels and under bridges. In these situations, the system has to rely on integration of relative motion sensors to estimate vehicle position. However, these sensor measurements are generally affected by errors such as offsets and scale factors, which cause the resulting position accuracy to deteriorate rapidly once GPS input is lost. The approach in this thesis is to use a GPS receiver in combination with low cost sensor equipment to produce a robust positioning module. The module should be capable of handling situations where GPS input is corrupted or unavailable. The working principle is to calibrate the relative motion sensors while GPS is available in order to improve the accuracy during GPS outages. To fuse the GPS information with the sensor outputs, different models have been proposed and evaluated on real data sets. These models tend to be nonlinear, and have therefore been processed in an Extended Kalman Filter structure. Experiments show that the proposed solutions can compensate for most of the errors associated with the relative motion sensors, and that the resulting positioning accuracy is improved accordingly.
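As a hedged illustration of the working principle, the sketch below calibrates an odometer scale factor against GPS-derived distance and a gyro bias at standstill while GPS is available, so that the corrected sensors can be used on their own during outages. The recursive forgetting factor and example values are assumptions, not the models evaluated in the thesis.

```python
# Assumed calibration idea: compare odometer distance with GPS distance to estimate a
# scale factor, and average the gyro output at standstill to estimate its bias.
class RelativeSensorCalibration:
    def __init__(self, forgetting=0.99):
        self.scale = 1.0       # odometer scale factor estimate
        self.gyro_bias = 0.0   # rad/s
        self.lam = forgetting  # recursive forgetting factor (assumption)

    def update_scale(self, odo_distance, gps_distance):
        if odo_distance > 1e-3:
            ratio = gps_distance / odo_distance
            self.scale = self.lam * self.scale + (1 - self.lam) * ratio

    def update_gyro_bias(self, gyro_rate, standstill):
        if standstill:  # at standstill the true yaw rate is zero
            self.gyro_bias = self.lam * self.gyro_bias + (1 - self.lam) * gyro_rate

    def corrected(self, odo_distance, gyro_rate):
        return self.scale * odo_distance, gyro_rate - self.gyro_bias

cal = RelativeSensorCalibration()
cal.update_scale(odo_distance=10.0, gps_distance=10.3)   # while GPS is available
cal.update_gyro_bias(gyro_rate=0.002, standstill=True)
print(cal.corrected(odo_distance=5.0, gyro_rate=0.01))   # used during GPS outages
```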
85

Modeling and Estimation of Dynamic Tire Properties

Narby, Erik January 2006 (has links)
Information about dynamic tire properties has always been important for drivers of wheel driven vehicles. With the increasing number of systems in modern vehicles designed to measure and control the behavior of the vehicle, information regarding dynamic tire properties has grown even more important. In this thesis a number of methods for modeling and estimating dynamic tire properties have been implemented and evaluated. The more general issue of estimating model parameters in linear and non-linear vehicle models is also addressed. We conclude that the slope of the tire slip curve seems to depend on the stiffness of the road surface and introduce the term combined stiffness. We also show that it is possible to estimate both longitudinal and lateral combined stiffness using only standard vehicle sensors.
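A minimal sketch of the underlying estimation idea follows, assuming that in the small-slip region the normalized tire force is proportional to slip, so the combined stiffness is the slope fitted by least squares. The synthetic data and the no-intercept fit are illustrative assumptions, not the methods implemented in the thesis.

```python
import numpy as np

# Assumed linear region of the slip curve: force ~ stiffness * slip.
rng = np.random.default_rng(0)
slip = rng.uniform(0.0, 0.05, 200)                               # small longitudinal slip
true_stiffness = 25.0
force = true_stiffness * slip + rng.normal(0, 0.05, slip.size)   # noisy normalized force

# Ordinary least squares without intercept gives the slope, i.e. the combined stiffness.
stiffness_hat = np.sum(slip * force) / np.sum(slip * slip)
print(f"estimated combined stiffness: {stiffness_hat:.2f}")
```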
86

3D Multi-Field Multi-Scale Features From Range Data In Spacecraft Proximity Operations

Flewelling, Brien Roy May 2012 (has links)
A fundamental problem in spacecraft proximity operations is the determination of the 6 degree of freedom relative navigation solution between the observer reference frame and a reference frame tied to a proximal body. For the most unconstrained case, the proximal body may be uncontrolled, and the observer spacecraft has no a priori information on the body. A spacecraft in this scenario must simultaneously map the generally poorly known body being observed, and safely navigate relative to it. Simultaneous localization and mapping (SLAM) is a difficult problem which has been the focus of research in recent years. The most promising approaches extract local features in 2D or 3D measurements and track them in subsequent observations by means of matching a descriptor. These methods exist for both active sensors such as Light Detection and Ranging (LIDAR) or laser RADAR (LADAR), and passive sensors such as CCD and CMOS camera systems. This dissertation presents a method for fusing time of flight (ToF) range data inherent to scanning LIDAR systems with the passive light field measurements of optical systems, extracting features which exploit information from each sensor, and solving the unique SLAM problem inherent to spacecraft proximity operations. Scale space analysis is extended to unstructured 3D point clouds by means of an approximation to the Laplace-Beltrami operator which computes the scale space on a manifold embedded in 3D object space using Gaussian convolutions based on a geodesic distance weighting. The construction of the scale space is shown to be equivalent to both the application of the diffusion equation to the surface data, as well as the surface evolution process which results from mean curvature flow. Geometric features are localized in regions of high spatial curvature or large diffusion displacements at multiple scales. The extracted interest points are associated with a local multi-field descriptor constructed from measured data in the object space. Defining features in object space instead of image space is shown to be an important step, making the simultaneous consideration of co-registered texture and the associated geometry possible. These descriptors, known as Multi-Field Diffusion Flow Signatures, encode the shape and multi-texture information of local neighborhoods in textured range data. Multi-Field Diffusion Flow Signatures display utility in difficult space scenarios including high contrast and saturating lighting conditions, bland and repeating textures, as well as non-Lambertian surfaces. The effectiveness and utility of Multi-Field Multi-Scale (MFMS) Features described by Multi-Field Diffusion Flow Signatures is evaluated using real data from proximity operation experiments performed at the Land Air and Space Robotics (LASR) Laboratory at Texas A&M University.
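The sketch below gives a rough feel for one scale-space step on an unstructured point cloud: a per-point signal is smoothed with Gaussian weights over a k-nearest-neighbour graph, a crude stand-in for the geodesic-weighted convolutions described above, and differences between scales flag candidate interest points. The kernel width, neighbour count and random cloud are assumptions made only for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

# Crude approximation of one diffusion (scale-space) step on a point cloud:
# Gaussian-weighted averaging over k nearest neighbours in object space.
def diffuse(points, signal, k=10, sigma=0.1):
    tree = cKDTree(points)
    dist, idx = tree.query(points, k=k)           # neighbour distances and indices
    w = np.exp(-(dist ** 2) / (2 * sigma ** 2))   # Gaussian weights per neighbour
    w /= w.sum(axis=1, keepdims=True)
    return (w * signal[idx]).sum(axis=1)          # weighted average = one diffusion step

points = np.random.rand(1000, 3)                          # synthetic cloud (assumption)
signal = points[:, 2] + 0.05 * np.random.randn(1000)      # e.g. a texture or height channel
coarse = diffuse(points, signal)                          # one scale
coarser = diffuse(points, coarse)                         # next scale
response = np.abs(coarse - coarser)                       # large values ~ candidate interest points
print(response.max())
```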
87

Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors

Hol, Jeroen Diederik January 2008 (has links)
This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to ‘camera-only’ systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data and reduces the need for high frequency vision updates.

The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation, for which state space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground truth data is used to move the sensor unit. In both cases the system performed well.

Calibration of the relative position and orientation of the camera and the inertial sensor turns out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.
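In the spirit of the gray-box calibration mentioned here, the sketch below estimates a relative camera-IMU orientation as the rotation that best aligns direction observations made in both frames, using nonlinear least squares. The synthetic direction pairs and this particular residual are assumptions, not the algorithm developed in the thesis.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Assumed setup: 50 unit directions observed in the IMU frame and, with noise,
# in the camera frame; estimate the rotation between the two frames.
rng = np.random.default_rng(1)
R_true = Rotation.from_euler("xyz", [5, -3, 10], degrees=True)
dirs_imu = rng.normal(size=(50, 3))
dirs_imu /= np.linalg.norm(dirs_imu, axis=1, keepdims=True)
dirs_cam = R_true.apply(dirs_imu) + 0.01 * rng.normal(size=(50, 3))

def residuals(rotvec):
    """Misalignment of rotated IMU directions versus camera directions."""
    R = Rotation.from_rotvec(rotvec)
    return (R.apply(dirs_imu) - dirs_cam).ravel()

sol = least_squares(residuals, x0=np.zeros(3))
print(Rotation.from_rotvec(sol.x).as_euler("xyz", degrees=True))  # close to [5, -3, 10]
```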
88

Investigation of Inertial Navigation for Localization in Underground Mines

Svensson, John January 2015 (has links)
This thesis project considers the potential use of inertial navigation on a consumer grade tablet mounted in a vehicle in an underground mine. The goal is to identify which sensors and techniques are useful and to design a navigation algorithm based on those results. The navigation algorithm is intended to work alongside the current received signal strength indication (RSSI) positioning system. Testing of the gyroscope, accelerometer and magnetometer sensors suggests that, while dead reckoning is likely not precise enough, an orientation filter can be designed that can be used for navigation. A complementary orientation filter using the gyroscope and accelerometer is then designed that shows better results than the default sensor fusion solutions available in Android. The filter is expandable and can be extended to include magnetometer data in the future. Based on the outputs of this filter, a navigation algorithm based on angle matching with map information is proposed. Precise positioning in an underground mine can be crucial to employee safety, and may also bring production benefits.
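A minimal complementary filter of the kind described can be sketched as follows: the gyroscope is integrated for short-term orientation and the accelerometer's gravity direction corrects the low-frequency drift. The pitch-only state, blend constant and synthetic data are simplifying assumptions, not the thesis implementation.

```python
import numpy as np

# Assumed single-axis (pitch) complementary filter: gyro for high frequencies,
# accelerometer-derived tilt for low frequencies.
def complementary_pitch(pitch, gyro_y, accel, dt, alpha=0.98):
    ax, ay, az = accel
    pitch_gyro = pitch + gyro_y * dt                        # integrate gyro rate
    pitch_acc = np.arctan2(-ax, np.sqrt(ay**2 + az**2))     # tilt from gravity direction
    return alpha * pitch_gyro + (1 - alpha) * pitch_acc     # blend the two estimates

pitch = 0.0
for _ in range(500):
    gyro_y = 0.001 + 0.02 * np.random.randn()               # rad/s with noise and a small bias
    accel = np.array([0.0, 0.0, 9.81]) + 0.2 * np.random.randn(3)
    pitch = complementary_pitch(pitch, gyro_y, accel, dt=0.01)
print(np.degrees(pitch))   # stays near zero despite the gyro bias
```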
89

Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS

Hol, Jeroen D. January 2011 (has links)
The usage of inertial sensors has traditionally been confined primarily to the aviation and marine industry due to their associated cost and bulkiness. During the last decade, however, inertial sensors have undergone a rather dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace for many applications and can even be found in many consumer products, for instance smart phones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent due to the reduced accuracy of MEMS inertial sensors. This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors and obtain a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurements in known units. Furthermore, whenever multiple sensors are combined, additional calibration issues arise, since the measurements are seldom acquired in the same physical location and expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS. Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided. Furthermore, this thesis contains ideas on using an optimization based sensor fusion method for a multi-segment inertial tracking system used for human motion capture, as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver. The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferro-magnetic object, the relative position and orientation of a rigidly connected camera and IMU, as well as the clock parameters and receiver positions of an indoor UWB positioning system. / MATRIS (Markerless real-time Tracking for Augmented Reality Image), a sixth framework programme funded by the European Union / CADICS (Control, Autonomy, and Decision-making in Complex Systems), a Linnaeus Center funded by the Swedish Research Council (VR) / Strategic Research Center MOVIII, funded by the Swedish Foundation for Strategic Research (SSF)
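As one hedged example of the calibration problems listed here, the sketch below estimates a constant (hard-iron) magnetometer offset by fitting a sphere to field measurements collected while rotating the sensor. The linear least-squares formulation and synthetic data are assumptions; the thesis treats a richer distortion model.

```python
import numpy as np

# Synthetic magnetometer samples: constant field magnitude, unknown offset, noise.
rng = np.random.default_rng(2)
true_offset = np.array([12.0, -7.0, 3.0])
dirs = rng.normal(size=(300, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
m = 50.0 * dirs + true_offset + 0.5 * rng.normal(size=(300, 3))   # measured field, uT

# Sphere fit: |m - c|^2 = r^2  =>  2 m.c + (r^2 - |c|^2) = |m|^2, linear in c and d.
A = np.hstack([2 * m, np.ones((m.shape[0], 1))])
b = np.sum(m**2, axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
offset_hat = sol[:3]          # estimated hard-iron offset
print(offset_hat)             # should be close to true_offset
```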
90

ENHANCED GRAIN CROP YIELD MONITOR ACCURACY THROUGH SENSOR FUSION AND POST-PROCESSING ALGORITHMS

Veal, Matthew Wayne 01 January 2006 (has links)
Yield monitors have become an indispensable part of precision agriculture systems because of their ability to measure the yield variability. Accurate yield monitor data availability is essential for the assessment of farm practices. The current technology of measuring grain yields is prone to errors that can be attributed to mass flow variations caused by the mechanisms within a grain combine. Because of throughput variations, there are doubts regarding the correlation between the mass flow measurement and the actual grain volume produced at a specific location. Another inaccuracy observed in yield monitor data can be attributed to inexact cut-width values entered by the machine operator. To effectively address these yield monitor errors, two crop mass flow sensing devices were developed and used to correct yield monitor data. The two quantities associated with crop material mass flow that were sensed were tension on the feeder housing drive chain and the hydraulic pressure on the threshing cylinder's variable speed drive. Both sensing approaches were capable of detecting zero mass flow conditions better than the traditional grain mass flow sensor. The alternative sensors also operate without being adversely affected by material transport delays. The feeder housing-based sensor was more sensitive to variations in crop material throughput than the hydraulic pressure sensor. Crop mass flow is not a surrogate for grain mass flow because of a weak relationship (R² < 0.60) between the two quantities. The crop mass flow signal does denote the location and magnitude of material throughput variations into the combine. This delineation was used to redistribute grain mass flow by aligning grain and crop mass flow transitions using sensor fusion techniques. Significant improvements (α = 0.05) in the yield distribution profile were found after the correction was applied. To address the cut-width entry error, a GIS-based post-processing algorithm was developed to calculate the true harvest area for each yield monitor data point. Based on the results of this method, a combine operator can introduce yield calculation errors of 15%. When these two correction methods are applied to yield monitor data, the result is yield maps with dramatically improved yield estimates and enhanced spatial accuracy.
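The alignment step can be pictured with the rough sketch below: the lag between the crop mass flow signal and the delayed grain mass flow signal is estimated by cross-correlation, and the grain signal is shifted so the throughput transitions line up. The synthetic signals and fixed lag are illustrative assumptions, not combine data or the dissertation's sensor fusion procedure.

```python
import numpy as np

# Synthetic signals: grain mass flow is a scaled, delayed, noisy copy of crop mass flow.
rng = np.random.default_rng(3)
n, true_lag = 600, 25                          # grain flow lags crop flow by 25 samples
crop = 1.0 + np.abs(np.cumsum(rng.normal(0, 0.1, n)))
grain = np.roll(0.4 * crop, true_lag) + rng.normal(0, 0.05, n)

def estimate_lag(reference, delayed, max_lag=100):
    """Pick the lag that maximizes the cross-correlation between the two signals."""
    r = reference - reference.mean()
    d = delayed - delayed.mean()
    m = len(reference)
    scores = [np.dot(r[: m - L], d[L:]) for L in range(max_lag)]
    return int(np.argmax(scores))

lag = estimate_lag(crop, grain)
aligned_grain = np.roll(grain, -lag)           # grain flow re-aligned with crop flow transitions
print(lag)                                     # should be close to true_lag
```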
