Sensor Interpretation and Task-Directed Planning Using Perceptual Equivalence Classes
dc.contributor.author | Donald, Bruce Randall | en_US |
dc.contributor.author | Jennings, James | en_US |
dc.date.accessioned | 2007-04-23T17:56:22Z | |
dc.date.available | 2007-04-23T17:56:22Z | |
dc.date.issued | 1991-12 | en_US |
dc.identifier.citation | http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR91-1248 | en_US |
dc.identifier.uri | https://hdl.handle.net/1813/7088 | |
dc.description.abstract | We consider how a robot may interpret its sensors and direct its actions so as to gain more information about the world, and to accomplish manipulation tasks. The key difficulty is uncertainty, in the form of noise in sensors, error in control, and unmodelled or unknown aspects of the environment. Our research focuses on general techniques for coping with uncertainty, specifically, to sense the state of the task, adapt to changes, and reason to select actions to gain information and achieve the goal. Sensors yield partial information about the world. When we interrogate the environment through our sensors, we in effect view a projection of the world onto the space of possible sensor values. We investigate the structure of this sensor space and its relationship to the world. We observe that sensors partition the world into perceptual equivalence classes, which can serve as natural "landmarks." By analyzing the properties of these equivalence classes we develop a "lattice" and a "bundle" structure for the information available to the robot through sensing and action. This yields a framework in which we develop and characterize algorithms for sensor-based planning and reasoning. | en_US |
dc.format.extent | 9198807 bytes | |
dc.format.extent | 2402063 bytes | |
dc.format.mimetype | application/pdf | |
dc.format.mimetype | application/postscript | |
dc.language.iso | en_US | en_US |
dc.publisher | Cornell University | en_US |
dc.subject | computer science | en_US |
dc.subject | technical report | en_US |
dc.title | Sensor Interpretation and Task-Directed Planning Using Perceptual Equivalence Classes | en_US |
dc.type | technical report | en_US |