1 |
Analytical approach to multi-objective joint inference control for fixed wing unmanned aerial vehicles. Casey, Julian L. 15 December 2020 (has links)
No description available.
|
2 |
Vision-based Strategies for Landing of Fixed Wing Unmanned Aerial Vehicles. Marianandam, Peter Arun. January 2015 (has links) (PDF)
Vision-based conventional landing of a fixed wing UAV is addressed in this thesis. The work includes mathematical modeling, an interface to software for rendering the outside scenery, image processing techniques, control law development and outdoor experimentation.
This research focuses on detecting the lines or edges that flank the landing site and using them as visual cues to extract geometrical parameters, such as the line co-ordinates and the line slopes, that are mapped to the control law to align and conventionally land the fixed wing UAV. Pre-processing and image processing techniques such as Canny edge detection and the Hough transform have been used to detect the runway lines or the edges of a landing strip.
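As an illustration of this kind of detection pipeline, the sketch below applies Canny edge detection followed by a standard Hough transform to a camera frame using OpenCV; the function name, thresholds and parameter values are illustrative assumptions, not the settings used in the thesis.

```python
import cv2
import numpy as np

def detect_runway_lines(frame_bgr):
    """Detect candidate runway edge lines in a camera frame.

    A minimal Canny + Hough pipeline; thresholds are placeholders and
    would need tuning for real imagery.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress noise before edge detection
    edges = cv2.Canny(blurred, 50, 150)           # binary edge map
    # Standard Hough transform: each detected line is a (rho, theta) pair
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    line_params = []
    if lines is not None:
        for rho, theta in lines[:, 0]:
            line_params.append((rho, np.degrees(theta)))  # line position and slope angle
    return line_params
```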
A Vision-in-the-Loop Simulation (VILS) setup has been developed on a personal computer or laptop, without any external camera, equipment or networking cables, that enables visual servoing to perform vision-based studies and simulation. UAV mass, inertia, engine and aerodynamic data from the literature have been used along with the UAV 6DOF equations to represent the UAV mathematical model. The UAV model is interfaced to a software package using UDP data packets via ports, which renders the outside scenery in accordance with the UAV's translation and orientation. Snapshots of the outside scenery, accessed through an internet URL using the 'http' protocol, are image-processed to detect the lines and the line parameters for the control.
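A minimal sketch of how such a loop could be wired up is shown below, assuming a hypothetical renderer that accepts the UAV state as a UDP text packet and serves the rendered frame at an HTTP URL; the addresses, packet layout and function names are assumptions for illustration, not the interface used in the thesis.

```python
import socket
import urllib.request

import cv2
import numpy as np

# Hypothetical endpoints: the actual renderer, ports, packet layout and
# screenshot URL are specific to the software used in the thesis.
RENDER_UDP_ADDR = ("127.0.0.1", 5500)
SCREENSHOT_URL = "http://127.0.0.1:8080/screenshot"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_uav_state(lat, lon, alt_m, roll_deg, pitch_deg, yaw_deg):
    """Push the 6DOF model's position and attitude to the scenery renderer over UDP."""
    packet = f"{lat},{lon},{alt_m},{roll_deg},{pitch_deg},{yaw_deg}\n"
    sock.sendto(packet.encode("ascii"), RENDER_UDP_ADDR)

def grab_rendered_frame():
    """Fetch the current outside-scenery snapshot through the HTTP URL and decode it."""
    with urllib.request.urlopen(SCREENSHOT_URL, timeout=1.0) as resp:
        buf = np.frombuffer(resp.read(), dtype=np.uint8)
    return cv2.imdecode(buf, cv2.IMREAD_COLOR)   # BGR frame for the vision pipeline
```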
The VILS setup has been used to simulate UAV alignment to the runway and landing. Vision-based alignment is achieved by rolling the UAV so that a landing strip that is off center is brought to the center of the image plane. A two-stage proportional aileron control input using the line co-ordinates, first bringing the midpoint of the top ends of the runway lines to the center of the image and then bringing the midpoint of the bottom ends of the runway lines to the center of the image, has been demonstrated through simulation.
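The fragment below sketches a proportional aileron command of this two-stage form, driven by the pixel offset of the runway-line midpoints from the image center; the gain, deflection limit and function signature are illustrative assumptions.

```python
def aileron_command(top_endpoints, bottom_endpoints, image_width, stage, k_p=0.002):
    """Two-stage proportional aileron command from the detected runway-line endpoints.

    Stage 1 drives the midpoint of the two lines' top ends to the image center;
    stage 2 repeats the correction with the bottom ends.  Gain and deflection
    limit are illustrative placeholders.
    """
    center_x = image_width / 2.0
    if stage == 1:
        mid_x = (top_endpoints[0][0] + top_endpoints[1][0]) / 2.0
    else:
        mid_x = (bottom_endpoints[0][0] + bottom_endpoints[1][0]) / 2.0
    error_px = mid_x - center_x            # lateral offset of the runway in the image plane
    delta_a = k_p * error_px               # proportional aileron deflection (rad)
    return max(-0.35, min(0.35, delta_a))  # saturate to a plausible deflection limit
```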
A vision-based control for landing has been developed, consisting of an elevator command that is commensurate with the acceptable range of glide slope, followed by a flare command until touchdown, which is a function of the flare height and the height estimated from the 3rd-order polynomial of the runway slope obtained by characterization. The feasibility of using the algorithms on a semi-prepared or unprepared landing strip with no visible runway lines has also been demonstrated. Landing on an empty tract of land and in poor visibility conditions, by synthetically drawing the runway lines based on a single 3rd-order slope vs. height polynomial solution, is also presented. A fixed-area and a dynamic-area search for the Hough peaks in the Hough accumulator array for the correct detection of lines are addressed.
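A sketch of this glide-and-flare logic is given below, with the slope-vs-height characterization fitted as a 3rd-order polynomial; the coefficients, flare height and gains are placeholder assumptions, not values or laws from the thesis.

```python
import numpy as np

def fit_slope_to_height(slopes, heights):
    """Offline characterization: fit a 3rd-order polynomial mapping the
    image-plane runway-line slope to height above the runway."""
    return np.poly1d(np.polyfit(slopes, heights, deg=3))

def elevator_command(height_est_m, flare_height_m=3.0, glide_cmd=-0.05, k_flare=0.02):
    """Elevator command: hold the approach glide slope, then flare near the ground.

    The constant glide command and the linear flare law are placeholders for
    the scheduled elevator and flare functions described in the thesis.
    """
    if height_est_m > flare_height_m:
        return glide_cmd                   # stay within the acceptable glide-slope range
    # progressively pitch up as the estimated height decreases, until touchdown
    return glide_cmd + k_flare * (flare_height_m - height_est_m)
```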
A novel technique for crosswind landing, quite different from conventional techniques, has been introduced, using only the aileron control input for correcting the drift. Three different strategies using the line co-ordinates and the line slopes, with varying levels of accuracy have been presented and compared.
Data from about 125 landings of a manned, instrumented prototype aircraft have been analysed to corroborate the findings of this research. Outdoor experiments have also been conducted to verify the feasibility of using the line detection algorithm in a realistic scenario and to generate experimental evidence for the findings of this research. Computation time estimates are presented to establish the feasibility of using vision for the problem of conventional landing.
The thesis concludes with the findings and direction for future work.
|