Simple item record

dc.contributor.author: Donald, Bruce Randall [en_US]
dc.contributor.author: Jennings, James [en_US]
dc.date.accessioned: 2007-04-23T17:56:22Z
dc.date.available: 2007-04-23T17:56:22Z
dc.date.issued: 1991-12 [en_US]
dc.identifier.citation: http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR91-1248 [en_US]
dc.identifier.uri: https://hdl.handle.net/1813/7088
dc.description.abstract: We consider how a robot may interpret its sensors and direct its actions so as to gain more information about the world, and to accomplish manipulation tasks. The key difficulty is uncertainty, in the form of noise in sensors, error in control, and unmodelled or unknown aspects of the environment. Our research focuses on general techniques for coping with uncertainty: specifically, techniques to sense the state of the task, adapt to changes, and reason to select actions that gain information and achieve the goal. Sensors yield partial information about the world. When we interrogate the environment through our sensors, we in effect view a projection of the world onto the space of possible sensor values. We investigate the structure of this sensor space and its relationship to the world. We observe that sensors partition the world into perceptual equivalence classes, which can serve as natural "landmarks." By analyzing the properties of these equivalence classes we develop a "lattice" and a "bundle" structure for the information available to the robot through sensing and action. This yields a framework in which we develop and characterize algorithms for sensor-based planning and reasoning. [en_US]
dc.format.extent: 9198807 bytes
dc.format.extent: 2402063 bytes
dc.format.mimetype: application/pdf
dc.format.mimetype: application/postscript
dc.language.iso: en_US [en_US]
dc.publisher: Cornell University [en_US]
dc.subject: computer science [en_US]
dc.subject: technical report [en_US]
dc.title: Sensor Interpretation and Task-Directed Planning Using Perceptual Equivalence Classes [en_US]
dc.type: technical report [en_US]
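
The abstract's central notion, that a sensor projects the world onto the space of possible sensor values and thereby partitions it into perceptual equivalence classes, can be illustrated with a minimal sketch. The Python fragment below is not taken from the report; the discretized world states, the coarse range sensor, and the partition routine are illustrative assumptions. It simply groups world states that yield the same sensor reading, which is the sense in which states are perceptually equivalent and can act as "landmarks."

from collections import defaultdict

def perceptual_partition(world_states, sensor):
    """Group world states by the sensor value they project to.

    Two states fall in the same perceptual equivalence class exactly
    when the robot's sensor cannot distinguish between them.
    """
    classes = defaultdict(list)
    for state in world_states:
        classes[sensor(state)].append(state)
    return dict(classes)

# Hypothetical example: a 1-D world of positions 0..9 and a coarse range
# sensor that only reports which of three zones the robot occupies.
world_states = range(10)
coarse_range_sensor = lambda x: min(x // 4, 2)   # readings 0, 1, 2

if __name__ == "__main__":
    for reading, states in perceptual_partition(world_states, coarse_range_sensor).items():
        print(f"sensor reading {reading}: equivalence class {states}")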

