Textile identification using fingertip motion and 3D force sensors in an open-source gripper

Description

We propose a human-inspired exploratory motion in which a robot gripper rubs its fingertips together to gather tactile information about a grasped textile and recognize it. Our method not only recognizes different materials but also distinguishes between one and multiple layers of the same material. The motion can be performed with an open-source, 3D-printable gripper, without moving either the robot or the object. We also propose a set of features extracted from this back-and-forth exploratory motion that achieves a recognition rate of over 94% when distinguishing 18 materials with an easily trained SVM. We compare this performance against frequency-based features as well as a deep-learning-based classifier.
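To illustrate the classification stage described above, the following is a minimal sketch of training an SVM on features computed from 3D fingertip force recordings of the rubbing motion. The feature set (simple time-domain statistics), window shape, and SVM hyperparameters here are illustrative assumptions and are not the paper's actual feature set or settings.

    # Sketch: SVM textile classification from 3D force signals recorded
    # during the fingertip-rubbing motion. Features below are hypothetical
    # time-domain statistics, not the features proposed in the paper.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def extract_features(force_window):
        """force_window: (T, 3) array of x/y/z fingertip forces for one rub."""
        return np.concatenate([
            force_window.mean(axis=0),                            # mean force per axis
            force_window.std(axis=0),                             # variability per axis
            np.abs(np.diff(force_window, axis=0)).mean(axis=0),   # mean absolute slope per axis
        ])

    def train_classifier(recordings, labels):
        """recordings: list of (T, 3) force windows; labels: textile classes (e.g. 18)."""
        X = np.vstack([extract_features(w) for w in recordings])
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
        clf.fit(X, labels)
        return clf

At test time, the same extract_features would be applied to a new rubbing recording and passed to clf.predict; kernel choice and regularization would in practice be tuned by cross-validation.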
