DynaBARN: Benchmarking Metric Ground Navigation in Dynamic Environments (2022)
Anirudh Nair, Fulin Jiang, Kang Hou, Zifan Xu, Shuozhe Li, Xuesu Xiao, and Peter Stone
Safely avoiding dynamic obstacles while moving toward a goal is a fundamental capability of autonomous mobile robots. Current benchmarks for dynamic obstacle avoidance do not provide a way to vary how obstacles move; instead, each uses a single method to determine obstacle motion, e.g., constant velocity, the social force model, or Optimal Reciprocal Collision Avoidance (ORCA). Relying on a single method restricts the variety of scenarios in which a robot navigation system can be trained and/or evaluated, limiting its robustness to dynamic obstacles with different speeds, trajectory smoothness, acceleration/deceleration characteristics, etc., which we collectively call motion profiles. In this paper, we present a simulation testbed, DynaBARN, to evaluate a robot navigation system's ability to navigate in environments containing obstacles with different motion profiles, which are systematically generated by a set of difficulty metrics. Additionally, we provide a demonstration collection pipeline that records navigation trials controlled by human users, both to compare against autonomous navigation performance and to support developing navigation systems via learning from demonstration. Finally, we report results of four classical and learning-based navigation systems in DynaBARN, which can serve as baselines for future studies. We release DynaBARN open source as a standardized benchmark for future autonomous navigation research in environments with different dynamic obstacles. The code and environments are available at https://github.com/aninair1905/DynaBARN.
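To give a concrete sense of what a "motion profile" might parameterize, the sketch below generates a 1-D obstacle speed sequence from a maximum speed, an acceleration limit, and a smoothness parameter. This is purely illustrative and is not DynaBARN's actual implementation; the function name and the sinusoidal target-speed model are assumptions for the example.

```python
import math

def motion_profile_speeds(duration, dt, max_speed, max_accel, smoothness):
    """Illustrative 1-D obstacle speed sequence under a simple motion profile.

    Hypothetical model: the target speed follows a sinusoid whose period is
    controlled by `smoothness` (larger values give slower, smoother variation),
    and the realized speed is clipped so the per-step change never exceeds
    the acceleration limit `max_accel`.
    """
    speeds = [0.0]
    t = dt
    while t <= duration:
        # Target speed oscillates between 0 and max_speed.
        target = max_speed * 0.5 * (1.0 + math.sin(2.0 * math.pi * t / smoothness))
        prev = speeds[-1]
        # Limit acceleration/deceleration between consecutive timesteps.
        step = max(-max_accel * dt, min(max_accel * dt, target - prev))
        speeds.append(prev + step)
        t += dt
    return speeds
```

Sweeping `max_speed`, `max_accel`, and `smoothness` in a generator like this yields obstacles ranging from slow, smooth drifters to fast, jerky movers, which is the kind of variation the benchmark's difficulty metrics are meant to capture.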
Citation:
In Proceedings of the 2022 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), November 2022.
Peter Stone (Faculty): pstone [at] cs.utexas.edu