
A DSP embedded optical navigation system

Gunnam, Kiran Kumar 30 September 2004
Spacecraft missions such as spacecraft docking and formation flying require high-precision relative position and attitude data. Although the Global Positioning System can provide this capability near the Earth, deep space missions require the use of alternative technologies. One such technology is the vision-based navigation (VISNAV) sensor system developed at Texas A&M University. VISNAV comprises an electro-optical sensor combined with light sources, or beacons. This patented sensor has an analog detector in the focal plane with a rise time of a few microseconds. Accuracies better than one part in 2000 of the field of view have been obtained. This research presents a new approach involving simultaneous activation of beacons with frequency-division multiplexing as part of the VISNAV sensor system. In addition, it discusses the synchronous demodulation process using digital heterodyning and decimating filter banks on a low-power fixed-point DSP, which improves the accuracy of the sensor measurements and the reliability of the system. This research also presents an optimal and computationally efficient six-degree-of-freedom estimation algorithm using a new measurement model based on the attitude representation of Modified Rodrigues Parameters.
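The demodulation idea in the abstract can be illustrated with a toy example. The sketch below separates two frequency-multiplexed beacon tones by digital heterodyning (I/Q mixing to baseband) followed by block-averaging as a crude decimating low-pass filter; every sample rate, beacon frequency, and amplitude here is invented for illustration and is not an actual VISNAV parameter.

```python
import numpy as np

# Hypothetical parameters for illustration -- not the actual VISNAV rates.
fs = 40_000.0                 # detector sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
f1, f2 = 1_000.0, 1_500.0     # beacon modulation frequencies, Hz
a1, a2 = 0.8, 0.3             # amplitude contributed by each beacon

# Both beacons are active simultaneously; the detector sees their sum.
detector = a1 * np.cos(2 * np.pi * f1 * t) + a2 * np.cos(2 * np.pi * f2 * t)

def demodulate(signal, f, fs, decim=400):
    """Recover one beacon's amplitude: I/Q heterodyne to baseband,
    then block-average (a crude decimating low-pass filter)."""
    t = np.arange(len(signal)) / fs
    i = signal * np.cos(2 * np.pi * f * t)        # in-phase mix
    q = -signal * np.sin(2 * np.pi * f * t)       # quadrature mix
    n = (len(signal) // decim) * decim
    i_lp = i[:n].reshape(-1, decim).mean(axis=1)  # decimate by averaging
    q_lp = q[:n].reshape(-1, decim).mean(axis=1)
    return float(2 * np.hypot(i_lp, q_lp).mean())

print(f"{demodulate(detector, f1, fs):.3f}")  # 0.800
print(f"{demodulate(detector, f2, fs):.3f}")  # 0.300
```

Because each 400-sample averaging block spans a whole number of periods of every mixing product, the unwanted tones cancel exactly and each beacon's amplitude is recovered despite both being on at once; this is the benefit of frequency multiplexing over time-multiplexed beacon activation.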

Robot navigation in sensor space

Keeratipranon, Narongdech January 2009
This thesis investigates the problem of robot navigation using only landmark bearings. The proposed system allows a robot to move to a ground target location specified by the sensor values observed at this ground target position. The control actions are computed based on the difference between the current landmark bearings and the target landmark bearings. No Cartesian coordinates with respect to the ground are computed by the control system. The robot navigates using solely information from the bearing sensor space.

Most existing robot navigation systems require a ground frame (a 2D Cartesian coordinate system) in order to navigate from a ground point A to a ground point B. The commonly used sensors such as laser range scanners, sonar, infrared, and vision do not directly provide the 2D ground coordinates of the robot. The existing systems use the sensor measurements to localise the robot with respect to a map, a set of 2D coordinates of the objects of interest. It is more natural to navigate between the points in the sensor space corresponding to A and B without requiring the Cartesian map and the localisation process.

Research on animals has revealed how insects are able to exploit very limited computational and memory resources to successfully navigate to a desired destination without computing Cartesian positions. For example, a honeybee balances the left and right optical flows to navigate in a narrow corridor. Unlike many other ants, Cataglyphis bicolor does not secrete pheromone trails in order to find its way home but instead uses the sun as a compass to keep track of its home direction vector. The home vector can be inaccurate, so the ant also uses landmark recognition. More precisely, it takes snapshots and compass headings of some landmarks. To return home, the ant tries to line up the landmarks exactly as they were before it started wandering.

This thesis introduces a navigation method based on reflex actions in sensor space.
The sensor vector is made of the bearings of some landmarks, and the reflex action is a gradient descent with respect to the distance in sensor space between the current sensor vector and the target sensor vector. Our theoretical analysis shows that, except for some fully characterised pathological cases, any point is reachable from any other point by reflex action in the bearing sensor space, provided the environment contains three landmarks and is free of obstacles.

The trajectories of a robot using reflex navigation, like other image-based visual control strategies, do not necessarily correspond to the shortest paths on the ground, because it is the sensor error that is minimised, not the distance moved on the ground. However, we show that the use of a sequence of waypoints in sensor space can address this problem. In order to identify relevant waypoints, we train a Self-Organising Map (SOM) from a set of observations uniformly distributed with respect to the ground. This SOM provides a sense of location to the robot, and allows a form of path planning in sensor space. The proposed navigation system is analysed theoretically, and evaluated both in simulation and with experiments on a real robot.
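The reflex-navigation idea can be sketched in a toy simulation. Below, a point robot takes constant-length steps against a finite-difference gradient of the squared bearing-sensor-space distance between its current landmark bearings and those recorded at the target. The landmark layout, start and target positions, step size, and iteration count are all invented for illustration and are not the thesis's experimental setup; ground coordinates appear only to simulate the bearing sensor, not in the control law itself.

```python
import numpy as np

# Illustrative landmark layout (three landmarks, obstacle-free plane).
landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])

def bearings(pos):
    """Sensor vector: bearing angle from pos to each landmark."""
    d = landmarks - pos
    return np.arctan2(d[:, 1], d[:, 0])

def sensor_error(pos, target_b):
    """Squared distance in bearing-sensor space, with angle wrapping."""
    diff = bearings(pos) - target_b
    diff = (diff + np.pi) % (2 * np.pi) - np.pi   # wrap into (-pi, pi]
    return float(diff @ diff)

target = np.array([7.0, 3.0])
target_b = bearings(target)        # sensor values recorded at the goal
pos = np.array([4.0, 1.0])         # robot start position

eps, speed = 1e-5, 0.05
for _ in range(2000):
    # Reflex action: step against the finite-difference gradient of the
    # sensor-space error. No ground coordinates enter this computation.
    g = np.array([
        (sensor_error(pos + [eps, 0.0], target_b)
         - sensor_error(pos - [eps, 0.0], target_b)) / (2 * eps),
        (sensor_error(pos + [0.0, eps], target_b)
         - sensor_error(pos - [0.0, eps], target_b)) / (2 * eps),
    ])
    norm = np.linalg.norm(g)
    if norm < 1e-9:
        break
    pos = pos - speed * g / norm   # constant-speed move downhill

print(np.linalg.norm(pos - target))  # small residual: robot hovers near goal
```

With a fixed step length the robot oscillates within about one step of the goal rather than stopping exactly on it, and, as the abstract notes, the descent path need not be the shortest ground path, since it is the sensor-space error that shrinks monotonically, not the ground distance.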
