Random Tree Generator on Collision Map for Multi-agent Collision Free Speed Planning
다개체의 무충돌 속도 계획을 위한 충돌맵 위의 랜덤 트리 생성기

Master's thesis — Seoul National University Graduate School, Department of Electrical and Computer Engineering, College of Engineering, August 2018. Advisor: 이범희.
This thesis proposes a collision-avoidance control technique for robots operating in a multi-agent environment. The key algorithm for driving control is a random tree generator on the collision map: it grows a random tree on a collision map that reflects the SCC (simple continuous-curvature) path model and generates a suitable velocity profile.

Until now, collision-avoidance algorithms on the collision map have not considered how velocity and acceleration constraints change along the path. An actual robot system, however, follows curved paths as well as straight ones, and on a curved path the robot must keep its velocity and acceleration below the maximums allowed on a straight path in order to travel stably. Therefore, the velocity and acceleration changes imposed by the geometry of the path must be reflected on the collision map to obtain a realistic speed plan.
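As a toy illustration (ours, not from the thesis), a curvature-dependent speed cap of this kind can be written as follows; the function name and the lateral-acceleration bound `a_lat_max` are assumptions for the example:

```python
import math

def curvature_limited_speed(kappa, v_max_straight, a_lat_max):
    """Illustrative speed cap on a path segment with curvature kappa (1/m).

    On a straight segment (kappa == 0) the robot may use its full speed;
    on a curve, the lateral-acceleration bound a_lat = v**2 * |kappa|
    caps the speed at sqrt(a_lat_max / |kappa|).
    """
    if kappa == 0.0:
        return v_max_straight
    return min(v_max_straight, math.sqrt(a_lat_max / abs(kappa)))

# A tight curve (radius 2 m, kappa = 0.5) forces a lower speed than a
# straight segment: about 0.894 m/s here versus 2.0 m/s when straight.
```

Applying such a cap per segment is what makes the speed limits on the collision map vary along the distance axis instead of staying constant.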

Conventional collision-avoidance algorithms on the collision map use a time-delay method or a minimum-time-delay method. In a multi-agent environment, however, the accumulated time delay grows as the number of robots increases. An algorithm is therefore needed that finds a velocity profile avoiding collisions on the collision map without delaying time.
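The accumulation effect can be seen in a toy model (our illustration, not the thesis's simulator): if each lower-priority robot must absorb one fixed conflict delay per higher-priority robot ahead of it, the k-th robot's arrival is postponed by k delays:

```python
def arrival_times_with_delay(travel_times, conflict_delay):
    """Toy model of the time-delay method: robot k (in priority order)
    waits one conflict_delay for each of the k higher-priority robots,
    so its delay grows linearly with the number of robots."""
    return [k * conflict_delay + t for k, t in enumerate(travel_times)]

# With 12 robots each needing 10 s of travel and a 2 s conflict delay,
# the lowest-priority robot arrives 22 s later than its undelayed schedule.
```

The linear growth in k is exactly the accumulation effect the proposed method is designed to remove.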

To meet this need, a random tree method originally designed to find paths on a map is modified and applied to the collision map. The random tree method locates a route to the destination by randomly growing a tree from the origin on the map. We propose a random tree generator for collision avoidance by applying this method to the collision-map space, whose axes are time and traveled distance. A simulator of a multi-agent environment was built, and the proposed algorithm was compared with conventional collision-avoidance algorithms to verify its validity.
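A minimal sketch of such a generator, assuming a simple RRT-style loop in (time, distance) space; the function name, the constant time step, and the endpoint-only collision test are simplifications of ours, not the thesis implementation:

```python
import random

def grow_random_tree(goal_dist, t_max, v_max, in_collision,
                     step=0.5, iters=4000, seed=0):
    """Grow a random tree on the collision map (time on one axis,
    traveled distance on the other). Each edge is a constant-speed
    segment of duration `step` with speed in [0, v_max]; a node is
    rejected if in_collision(t, s) reports a collision region there.
    A real planner would check the whole edge, not just the endpoint.
    """
    rng = random.Random(seed)
    nodes = [(0.0, 0.0)]             # root: time 0, distance 0
    parent = {0: None}
    for _ in range(iters):
        # sample a point of the collision-map space
        t_s, s_s = rng.uniform(0.0, t_max), rng.uniform(0.0, goal_dist)
        # extend from the existing node nearest to the sample
        i = min(range(len(nodes)),
                key=lambda j: (t_s - nodes[j][0]) ** 2 + (s_s - nodes[j][1]) ** 2)
        t0, s0 = nodes[i]
        if t0 + step > t_max:
            continue                 # time budget exhausted on this branch
        v = rng.uniform(0.0, v_max)  # candidate speed for this edge
        t1, s1 = t0 + step, min(s0 + v * step, goal_dist)
        if in_collision(t1, s1):
            continue
        nodes.append((t1, s1))
        parent[len(nodes) - 1] = i
        if s1 >= goal_dist:          # destination distance reached
            path, j = [], len(nodes) - 1
            while j is not None:
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]        # (time, distance) points of the profile
    return None
```

The returned list of (time, distance) points is piecewise linear, so its slopes directly give a velocity profile; supplying a collision predicate built from the other robots' schedules makes the tree thread between the collision regions instead of waiting them out.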

First, we modeled the robots with the SCC path model and derived velocity and acceleration constraints for each path segment. To construct the multi-agent environment, we set the priority, origin, destination, route, and path-dependent speed and acceleration limits of 12 robots. To avoid collisions between robots, we used the existing extended collision map method. We then applied both the existing collision-avoidance algorithms and the proposed random tree generator method in the simulator.
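To illustrate what a collision test on the map might look like, suppose the higher-priority robots' schedules have been reduced to rectangular regions of the (time, distance) plane that this robot must not enter; the function and parameter names are ours, and real collision regions on an (extended) collision map are generally not rectangular:

```python
def make_collision_predicate(regions):
    """Build a collision test for the collision map. `regions` is a list
    of (t_in, t_out, s_in, s_out) rectangles: another robot holds the
    shared stretch [s_in, s_out] of this robot's route during
    [t_in, t_out], so being at distance s at time t inside a rectangle
    means both robots occupy the stretch simultaneously."""
    def in_collision(t, s):
        return any(t_in <= t <= t_out and s_in <= s <= s_out
                   for (t_in, t_out, s_in, s_out) in regions)
    return in_collision

# pred = make_collision_predicate([(2.0, 4.0, 1.0, 3.0)])
# pred(3.0, 2.0) is inside the region; pred(5.0, 2.0) is not.
```

A velocity profile is collision-free when its (time, distance) curve never enters any of these regions, which is exactly the condition a tree grown on the map must respect.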

The results show that the proposed method generates a velocity profile on the collision map of each robot. The characteristics of the tree-generation step under the SCC path model are analyzed. We also compare the arrival times of the proposed method with those of the conventional time-delay algorithm and confirm that the time-accumulation effect does not appear when the random tree generator method is applied in the multi-agent environment. The proposed method is therefore time-efficient.

In conclusion, the random tree generator method can be applied to collision maps with various velocity and acceleration constraints, and it eliminates the accumulative time-delay effect. We expect the proposed method to serve as a speed-planning method for mutual collision avoidance not only in mobile robots such as unmanned cars but also in multi-agent robot systems such as airport surveillance systems and hospital support robot systems.