Journal article
IEEE Robotics and Automation Letters, 2023
APA
Eltouny, K. A., Liu, W., Tian, S., Zheng, M., & Liang, X. (2023). DE-TGN: Uncertainty-Aware Human Motion Forecasting Using Deep Ensembles. IEEE Robotics and Automation Letters.
Chicago/Turabian
Eltouny, Kareem A., Wansong Liu, Sibo Tian, Minghui Zheng, and Xiao Liang. “DE-TGN: Uncertainty-Aware Human Motion Forecasting Using Deep Ensembles.” IEEE Robotics and Automation Letters (2023).
MLA
Eltouny, Kareem A., et al. “DE-TGN: Uncertainty-Aware Human Motion Forecasting Using Deep Ensembles.” IEEE Robotics and Automation Letters, 2023.
BibTeX
@article{eltouny2023detgn,
title = {DE-TGN: Uncertainty-Aware Human Motion Forecasting Using Deep Ensembles},
year = {2023},
journal = {IEEE Robotics and Automation Letters},
author = {Eltouny, Kareem A. and Liu, Wansong and Tian, Sibo and Zheng, Minghui and Liang, Xiao}
}
Ensuring the safety of human workers in a collaborative environment with robots is of utmost importance. Although accurate pose prediction models can help prevent collisions between human workers and robots, they are still susceptible to critical errors. In this study, we propose a novel approach called deep ensembles of temporal graph neural networks (DE-TGN) that not only accurately forecasts human motion but also provides a measure of prediction uncertainty. By leveraging deep ensembles and employing stochastic Monte Carlo dropout sampling, we construct a volumetric field representing a range of potential future human poses based on covariance ellipsoids. To validate our framework, we conducted experiments using three motion capture datasets, including Human3.6M, and two human-robot interaction scenarios, achieving state-of-the-art prediction error. Moreover, we discovered that deep ensembles not only enable us to quantify uncertainty but also improve the accuracy of our predictions.
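The general idea of turning a set of ensemble forecasts into per-joint covariance ellipsoids can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the use of NumPy, and the specific pose-array layout are all assumptions. Each "sample" stands for one forecast drawn from an ensemble member, optionally with Monte Carlo dropout active at inference time.

```python
import numpy as np

def ensemble_pose_uncertainty(predictions):
    """Summarize ensemble motion forecasts as a mean pose plus
    per-joint covariance ellipsoids.

    predictions: array of shape (n_samples, n_joints, 3), where each
    sample is one forecast of 3D joint positions from an ensemble
    member (possibly with Monte Carlo dropout enabled).
    Returns the mean pose, per-joint 3x3 covariances, and the
    eigenvalues/eigenvectors defining each covariance ellipsoid.
    """
    n_samples, n_joints, _ = predictions.shape
    mean_pose = predictions.mean(axis=0)          # (n_joints, 3)
    covariances = np.empty((n_joints, 3, 3))
    for j in range(n_joints):
        # rowvar=False: rows are samples, columns are x/y/z coordinates
        covariances[j] = np.cov(predictions[:, j, :], rowvar=False)
    # Eigendecomposition yields each ellipsoid's principal axes;
    # scaling an axis by, e.g., 2*sqrt(eigenvalue) covers roughly 95%
    # of the sampled positions along that direction.
    eigvals, eigvecs = np.linalg.eigh(covariances)
    return mean_pose, covariances, eigvals, eigvecs
```

Stacking these ellipsoids over every joint and every future time step yields the kind of volumetric "range of potential future poses" described in the abstract, which a planner could then treat as a keep-out region.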