A handheld device for a large multimodal haptic dataset
The integration of multimodal tactile sensing with computer vision has the potential to improve robot performance on manipulation tasks by allowing robots to rapidly identify task-relevant objects. While vision-based object identification supports the inference of physical properties, tactile perception complements it by enabling the inference of material properties. Prior research applying data-driven methods to haptic perception has shown promising results; however, a major limitation has been the lack of suitable training data. In this work, we present a handheld data acquisition device that collects and logs data across six sensing modalities as it interacts with an object. We also outline a data collection procedure in which human data collectors grasp objects in their homes using this device. This effort aims to contribute an extensive multimodal haptic dataset of everyday household objects.
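As a minimal illustration of the kind of logging such a device might perform, the sketch below defines a hypothetical per-grasp record with placeholder modality channels and appends it to a JSON-lines log. The schema, field names, and modality labels are assumptions for illustration only; the actual six sensing modalities and storage format are defined by the device and procedure described in the paper.

```python
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class HapticSample:
    """One timestamped record of a grasp interaction (hypothetical schema)."""
    object_id: str      # identifier assigned to the grasped household object
    timestamp: float    # wall-clock time of the reading
    # Placeholder channels; the real device logs six sensing modalities.
    modalities: dict = field(default_factory=dict)

def log_sample(sample: HapticSample, path: str) -> None:
    """Append one sample as a single JSON line to the log file."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(sample)) + "\n")

# Example: log one reading from a grasp of a hypothetical object "mug_01".
sample = HapticSample(
    object_id="mug_01",
    timestamp=time.time(),
    modalities={"modality_1": [0.1, 0.2], "modality_2": 0.5},
)
log_sample(sample, "grasp_log.jsonl")
```

A line-per-sample format like this keeps logging append-only, which suits in-home collection where recordings may be interrupted mid-session.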