Robotics: Science and Systems XV

DensePhysNet: Learning Dense Physical Object Representations Via Multi-Step Dynamic Interactions

Zhenjia Xu, Jiajun Wu, Andy Zeng, Joshua Tenenbaum, Shuran Song

Abstract:

We study the problem of learning physical object representations for robot manipulation. Understanding object physics is critical for successful manipulation, yet challenging, because physical properties can rarely be inferred from an object's static appearance. In this paper, we propose DensePhysNet, a system that actively executes a sequence of dynamic interactions (e.g., sliding and colliding), and uses a deep predictive model over its visual observations to learn dense, pixel-wise representations that reflect the physical properties of the observed objects. Our experiments in both simulated and real-world settings demonstrate that the learned representations carry rich physical information and can be directly decoded into physical object properties such as friction and mass. The dense representation enables DensePhysNet to generalize well to novel scenes with more objects than seen during training. By encoding object physics, the learned representation also leads to more accurate and efficient manipulation in downstream tasks than the state of the art.
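To make the idea of decoding physical properties from a dense representation concrete, here is a minimal numpy sketch. All names, shapes, and weights are illustrative placeholders (not from the paper): a learned encoder is assumed to have produced a pixel-wise feature map, and a linear readout maps each pixel's feature vector to two scalar properties (e.g., friction and mass), which are then averaged over an object's segmentation mask.

```python
import numpy as np

# Hypothetical shapes for a dense feature map produced by an encoder
# (illustrative only; the paper learns this representation from
# multi-step dynamic interactions).
H, W, C = 8, 8, 16          # spatial resolution and feature channels
rng = np.random.default_rng(0)

features = rng.standard_normal((H, W, C))   # dense, pixel-wise representation
mask = np.zeros((H, W), dtype=bool)         # object segmentation mask
mask[2:5, 3:6] = True                       # a 3x3 object region

# A linear readout maps each pixel's feature vector to two scalar
# physical properties; in practice this decoder would be trained,
# here the weights are random placeholders.
W_readout = rng.standard_normal((C, 2))
b_readout = np.zeros(2)

per_pixel = features @ W_readout + b_readout   # shape (H, W, 2)

# Aggregate per-pixel predictions over the object mask to obtain one
# (friction, mass) estimate per object.
friction, mass = per_pixel[mask].mean(axis=0)
print(per_pixel.shape, per_pixel[mask].shape)
```

Because every pixel carries its own feature vector, the same readout applies unchanged to scenes with any number of objects, which is what allows generalization beyond the training object count.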

Bibtex:

@INPROCEEDINGS{Song-RSS-19, 
    AUTHOR    = {Zhenjia Xu AND Jiajun Wu AND Andy Zeng AND Joshua Tenenbaum AND Shuran Song}, 
    TITLE     = {DensePhysNet: Learning Dense Physical Object Representations Via Multi-Step Dynamic Interactions}, 
    BOOKTITLE = {Proceedings of Robotics: Science and Systems}, 
    YEAR      = {2019}, 
    ADDRESS   = {Freiburg im Breisgau, Germany}, 
    MONTH     = {June}, 
    DOI       = {10.15607/RSS.2019.XV.046} 
}