Generalizable Learning for Natural Language Instruction Following on Physical Robots

Author
Blukis, Valts
Abstract
Robot applications in unstructured human-inhabited environments, such as households, assistive scenarios, and collaborative industrial tasks, require a human-robot interaction interface through which humans can instruct robots what to do. Natural language is an accessible, expressive, effective, and efficient interface for specifying a wide array of instructions and goals to robots. This thesis presents a modular and interpretable representation learning approach to following natural language instructions on physical robots by mapping raw visual observations to continuous control. The approach reconsiders the full robotics stack, addressing perception, planning, mapping, and control challenges, all with a focus on enabling behavior specification in natural language. The presented approach achieved the first demonstration of a physical robot that could follow natural language instructions by mapping RGB images to continuous control. The thesis also introduces a hierarchical approach to following mobile manipulation instructions, showing promising early results on a virtual mobile manipulation benchmark.
Description
219 pages
Date Issued
2021-12
Subject
Deep learning; Machine learning; Natural language processing; Neural networks; Robotics; Semantic mapping
Committee Chair
Artzi, Yoav
Committee Member
Snavely, Noah; Ju, Wendy Guang-wen; Lee, Daniel Dongyuel
Degree Discipline
Computer Science
Degree Name
Ph. D., Computer Science
Degree Level
Doctor of Philosophy
Rights
Attribution-NonCommercial 4.0 International
Type
dissertation or thesis