
2019

Visual Robot Task Planning


Abstract

Prospection is key to solving challenging problems in new environments, but it has not been deeply explored as applied to task planning for perception-driven robotics. We propose visual robot task planning, in which we take an input image and must generate a sequence of high-level actions and associated observations that achieve some task. In this paper, we describe a neural network architecture and associated planning algorithm that (1) learns a representation of the world that can generate prospective futures, (2) uses this generative model to simulate the result of sequences of high-level actions in a variety of environments, and (3) evaluates these actions via a variant of Monte Carlo Tree Search to find a viable solution to a particular problem. Our approach allows us to visualize intermediate motion goals and learn to plan complex activity from visual information, and we use it to generate and visualize task plans on held-out examples of a block-stacking simulation.
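
To make the described pipeline concrete, the sketch below illustrates the kind of planning loop the abstract outlines: a learned model predicts the observation that would follow each high-level action, and a Monte Carlo Tree Search variant searches over action sequences toward a goal. This is an illustrative sketch only, not the paper's implementation; the names PredictiveModel, goal_score, ACTIONS, and plan, as well as the toy dynamics and random scoring, are hypothetical placeholders standing in for the learned neural network components described in the paper.

```python
"""Illustrative sketch of prospection-based planning: tree search over
predicted future observations. All names and dynamics are placeholders."""
import math
import random

# Hypothetical discrete high-level action set (e.g., for block stacking).
ACTIONS = ["grasp_red", "grasp_blue", "place_on_stack", "align", "retract"]


class PredictiveModel:
    """Placeholder for the learned model mapping (observation, action) -> predicted observation."""

    def predict(self, observation, action):
        # The real system would roll out a neural generative model here;
        # this dummy just appends the action to a tuple of past steps.
        return observation + (action,)


def goal_score(observation):
    """Placeholder evaluator: how close a predicted observation is to the task goal, in [0, 1]."""
    return random.random()


class Node:
    def __init__(self, observation, parent=None, action=None):
        self.observation = observation
        self.parent = parent
        self.action = action
        self.children = {}   # action -> Node
        self.visits = 0
        self.value = 0.0


def ucb(child, parent_visits, c=1.4):
    """Upper confidence bound used to balance exploration and exploitation."""
    if child.visits == 0:
        return float("inf")
    return child.value / child.visits + c * math.sqrt(math.log(parent_visits) / child.visits)


def plan(model, root_observation, iterations=200, horizon=4):
    root = Node(root_observation)
    for _ in range(iterations):
        # Selection: descend by UCB while the node is fully expanded.
        node, depth = root, 0
        while len(node.children) == len(ACTIONS) and depth < horizon:
            node = max(node.children.values(), key=lambda ch: ucb(ch, node.visits))
            depth += 1
        # Expansion: simulate one untried action with the predictive model.
        if depth < horizon:
            action = random.choice([a for a in ACTIONS if a not in node.children])
            node.children[action] = Node(model.predict(node.observation, action),
                                         parent=node, action=action)
            node = node.children[action]
        # Evaluation and backpropagation of the goal score.
        reward = goal_score(node.observation)
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent
    # Read out the action sequence along the most-visited branch.
    actions, node = [], root
    while node.children:
        node = max(node.children.values(), key=lambda ch: ch.visits)
        actions.append(node.action)
    return actions


if __name__ == "__main__":
    print(plan(PredictiveModel(), root_observation=("initial_image",)))
```

In the paper's setting, the predicted observations themselves serve as intermediate motion goals that can be visualized; the placeholder predict call above stands in for that generative step.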

Citation

@inproceedings{Paxton_2019,
  doi       = {10.1109/icra.2019.8793736},
  url       = {https://doi.org/10.1109/icra.2019.8793736},
  year      = {2019},
  month     = {may},
  publisher = {IEEE},
  author    = {Paxton, Chris and Barnoy, Yotam and Katyal, Kapil and Arora, Raman and Hager, Gregory D.},
  title     = {Visual Robot Task Planning},
  booktitle = {2019 International Conference on Robotics and Automation (ICRA)}
}
