In recent years, international competition, the soaring cost of land and labor, an aging population, a low birth rate, and other factors have eroded the competitiveness of Taiwan's traditional industries. Manufacturing processes still require human monitoring, yet a single worker cannot endure such a repetitive workload, and hiring more workers to share it is not economical. We therefore expect robots to replace human labor in the manufacturing process.
With advances in science and technology, a mobile robot must be equipped with intelligent judgment. One example is obstacle avoidance, which prevents damage caused by collisions with obstacles. A typical office contains tables, chairs, and electrical equipment; in addition, people walking about act as dynamic obstacles. The robot must detect them and respond effectively and immediately, so that staff can maintain their work efficiency and the robot can pass safely through complex environments. This is the primary topic of discussion.
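The abstract does not name a specific avoidance method, so as a minimal illustration only, the sketch below assumes a simple reactive scheme over range-sensor readings: the robot steers toward the most open direction, or stops when everything nearby is blocked. The function name, field of view, and safety distance are all hypothetical.

```python
import math

def avoid_heading(ranges, fov=math.pi, safe_dist=0.5):
    """Pick a steering heading from range-sensor readings (meters).

    ranges: distances from evenly spaced beams spanning `fov` radians,
    centered on the robot's current heading. Returns the heading
    (radians, 0 = straight ahead) of the most open clear direction,
    or None if every direction is blocked within safe_dist.
    """
    best = None
    for i, d in enumerate(ranges):
        if d > safe_dist and (best is None or d > ranges[best]):
            best = i
    if best is None:
        return None  # fully blocked: caller should stop or back up
    # Map beam index to an angle in [-fov/2, +fov/2].
    return -fov / 2 + best * fov / (len(ranges) - 1)
```

For example, with readings `[0.2, 1.0, 3.0, 1.0, 0.2]` the center beam is the most open, so the robot keeps heading straight (angle 0); if all readings fall below the safety distance, the function returns `None` and the robot should stop.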
Another important function is path planning, including patrolling and global path planning, which lets the mobile robot reach a specified target successfully.
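The thesis does not specify the planning algorithm here, so purely as an illustrative sketch, the following implements breadth-first search on a 4-connected occupancy grid, one of the simplest forms of global path planning. The grid encoding and function name are assumptions, not the author's method.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid via BFS.

    grid: list of strings, '#' = obstacle, '.' = free cell.
    start, goal: (row, col) tuples. Returns the list of cells from
    start to goal inclusive, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []               # walk parents back to start
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

BFS guarantees a shortest path in moves on a uniform-cost grid; a practical planner for larger maps would typically use A* with a distance heuristic instead.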
Remote monitoring lets users know the actual status of the mobile robot, for example by recording patrol information and specifying the type of motion to perform.
Therefore, this thesis presents a project on integrated intelligent indoor mobile robots, covering obstacle avoidance, path planning, and remote monitoring in unknown environments.
Identifier | oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0903112-133520 |
Date | 03 September 2012 |
Creators | Chen, Guan-Yan |
Contributors | Kao-Shing Hwang, Jau-Woei Perng, Chi-Cheng Cheng, Kuo-Yang Tu |
Publisher | NSYSU |
Source Sets | NSYSU Electronic Thesis and Dissertation Archive |
Language | Cholon |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0903112-133520 |
Rights | user_define, Copyright information available at source archive |