Towards Natural and Robust Human-Robot Interaction Using Sketch and Speech

Abstract

For centuries, we have dreamt of intelligent machines that could someday co-exist with humans as autonomous agents, working for, with, and sometimes even against us. Since Karel Čapek's play R.U.R. (Rossum's Universal Robots) was written in 1920 [1], robots have permeated science fiction books, movies, and television, giving rise to famous characters such as Robbie in I, Robot [2], Johnny 5 in Short Circuit [3], and C-3PO in Star Wars [4]. However, the fields of robotics and artificial intelligence are still a long way from producing fully autonomous machines like Rosie from The Jetsons [5] that can behave and interact as humans do.

Today, getting computer agents to perform even the simplest of tasks requires designing an interface that can translate what the human wants into what the computer can do. Traditionally, this has been accomplished by constraining human users to communicate in a specific and unambiguous way, such as pressing buttons or selecting options from a menu. This type of interaction is rigid and unnatural, and is far from how humans communicate with one another. In recent years, there has been growing interest in the development of more natural and flexible human-robot interfaces that allow humans to communicate with machines through speech, drawing, gesturing, and other modalities. These methods are still in their infancy, and while they offer more human-like interaction with computers, ensuring that the user's intentions are correctly interpreted places limits on the flexibility of expression such systems allow. For example, despite recent advances in speech recognition technology, natural language interfaces are still largely confined to simple applications in which the speaker's intentions are disambiguated through the use of pre-defined phrases (e.g., "Call home"), or do not need to be interpreted at all, as in data entry or speech-to-text processing.

In this dissertation, a number of algorithms are proposed with the aim of allowing users to communicate naturally with a semi-autonomous robot while placing as few restrictions on the user's input as possible. The methods presented here reside in the domains of sketch and speech, which are flexible in their expressiveness and take advantage of how humans communicate with each other. The application considered in this work is mobile robot navigation, i.e., instructing a semi-autonomous robot to move to a specific location within its environment, where it will presumably undertake some useful task. By allowing the user to speak and sketch naturally, the burden of recognition is shifted from the human to the machine, allowing the user to focus attention on the task at hand. This dissertation develops a probabilistic framework for sketch and speech recognition, the model for which is learned from training data so that recognition is accurate and robust. It also introduces a method for qualitative navigation, allowing the human user to give navigation instructions using an approximate sketched map. These approaches encourage the robot to understand how humans communicate, rather than forcing the human to conform to a communication structure designed for the robot, taking a small step towards truly natural human-robot interaction.
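
The abstract summarizes the approach at a high level; the dissertation itself specifies the actual models. Purely as an illustrative sketch of what a probabilistic framework fusing sketch and speech evidence can look like, the Python snippet below applies Bayes' rule over a handful of candidate navigation goals under a conditional-independence assumption. The goal labels, priors, and likelihood values are invented for the example and are not taken from the dissertation.

# Hypothetical illustration only: generic Bayesian fusion of sketch and speech
# evidence over candidate navigation goals. All numbers below are invented and
# are NOT taken from the dissertation's trained models.

def fuse_evidence(prior, sketch_likelihood, speech_likelihood):
    """Return a normalized posterior over goals, treating the sketch and
    speech observations as conditionally independent given the goal."""
    unnormalized = {
        goal: prior[goal] * sketch_likelihood[goal] * speech_likelihood[goal]
        for goal in prior
    }
    total = sum(unnormalized.values())
    return {goal: p / total for goal, p in unnormalized.items()}

if __name__ == "__main__":
    goals = ["doorway", "desk", "charging_station"]

    # Uniform prior over candidate goals.
    prior = {g: 1.0 / len(goals) for g in goals}

    # P(sketched stroke | goal): e.g., the drawn stroke ends nearest the desk.
    sketch_likelihood = {"doorway": 0.2, "desk": 0.7, "charging_station": 0.1}

    # P(utterance | goal): e.g., the spoken phrase mentions "the desk".
    speech_likelihood = {"doorway": 0.1, "desk": 0.8, "charging_station": 0.1}

    posterior = fuse_evidence(prior, sketch_likelihood, speech_likelihood)
    for goal, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"{goal}: {p:.3f}")

In the dissertation's setting, the likelihood terms would come from models learned from training data rather than the hand-picked numbers used here.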

Date Issued

2012-01-31

Committee Chair

Campbell, Mark

Committee Member

Kress-Gazit, Hadas
Lipson, Hod

Degree Discipline

Mechanical Engineering

Degree Name

Ph.D., Mechanical Engineering

Degree Level

Doctor of Philosophy

Types

dissertation or thesis
