Peter Stone's Selected Publications



Watch Where You're Going! Gaze and Head Orientation as Predictors for Social Robot Navigation

Blake Holman, Abrar Anwar, Akash Singh, Mauricio Tec, Justin Hart, and Peter Stone.
In Proceedings of the International Conference on Robotics and Automation (ICRA), May 2021.

Download

[PDF] (5.7MB)

Abstract

Mobile robots deployed in human-populated environments must be able to safely and comfortably navigate in close proximity to people. Head orientation and gaze are both mechanisms which help people to interpret where other people intend to walk, which in turn enables them to coordinate their movement. Head orientation has previously been leveraged to develop classifiers which are able to predict the goal of a person's walking motion. Gaze is believed to generally precede head orientation, with a person quickly moving their eyes to a target and then following it with a turn of their head. This study leverages state-of-the-art virtual reality technology to place participants into a simulated environment in which their gaze and motion can be observed. The results of this study indicate that position, velocity, head orientation, and gaze can all be used as predictive features of the goal of a person's walking motion. The results also indicate that gaze both precedes head orientation and can be used to predict the goal of a person's walking motion at a higher level of accuracy earlier in their walking trajectory. These findings can be leveraged in the design of social navigation systems for mobile robots.

BibTeX Entry

@InProceedings{ICRA21-hart,
  author = {
	Blake Holman and
	Abrar Anwar and
	Akash Singh and
	Mauricio Tec and
	Justin Hart and
	Peter Stone},
  title = {Watch Where You're Going! Gaze and Head Orientation as Predictors for Social Robot Navigation},
  booktitle = {Proceedings of the International Conference on Robotics and Automation (ICRA)},
  location = {Xi'an, China},
  month = {May},
  year = {2021},
  abstract = {
Mobile robots deployed in human-populated environments must be able to safely
and comfortably navigate in close proximity to people. Head orientation and gaze
are both mechanisms which help people to interpret where other people intend to 
walk, which in turn enables them to coordinate their movement. Head orientation 
has previously been leveraged to develop classifiers which are able to predict 
the goal of a person's walking motion. Gaze is believed to generally precede 
head orientation, with a person quickly moving their eyes to a target and then 
following it with a turn of their head. This study leverages state-of-the-art 
virtual reality technology to place participants into a simulated environment 
in which their gaze and motion can be observed. The results of this study 
indicate that position, velocity, head orientation, and gaze can all be used as 
predictive features of the goal of a person's walking motion. The results also 
indicate that gaze both precedes head orientation and can be used to predict the
goal of a person's walking motion at a higher level of accuracy earlier in their
walking trajectory. These findings can be leveraged in the design of social 
navigation systems for mobile robots.
},
}

Generated by bib2html.pl (written by Patrick Riley) on Mon Mar 25, 2024 00:05:13