The team equipped a KUKA robotic arm with a tactile sensor called GelSight, developed by Ted Adelson's group at CSAIL. The information collected by GelSight was then fed to an AI so it could learn the relationship between visual and tactile information.
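The core idea, learning a mapping between paired visual and tactile signals, can be illustrated with a toy sketch. This is not the authors' model (which is a neural network trained on real GelSight data); it is a minimal, hypothetical example using a linear least-squares fit on simulated feature vectors, just to show what "learning the relationship between visual and tactile information" means in practice.

```python
import numpy as np

# Toy illustration (not the authors' method): learn a linear map from
# "visual" feature vectors to "tactile" feature vectors using paired data.
rng = np.random.default_rng(0)
true_map = rng.normal(size=(4, 3))    # hidden visual-to-tactile relation
visual = rng.normal(size=(100, 4))    # 100 paired training examples
tactile = visual @ true_map           # simulated tactile readings

# A least-squares fit stands in for the neural network in the article.
learned_map, *_ = np.linalg.lstsq(visual, tactile, rcond=None)

# Given a new visual observation, predict the tactile response.
new_visual = rng.normal(size=(1, 4))
predicted_tactile = new_visual @ learned_map
print(np.allclose(predicted_tactile, new_visual @ true_map))  # True
```

With noiseless simulated data the fit recovers the hidden mapping exactly; the real system faces far noisier, higher-dimensional signals, which is why a deep model and a large paired dataset are needed.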
To teach the AI how to identify objects by touch, the team collected a dataset of paired visual and tactile examples.
"By looking at the scene, our model can feel like touching a flat surface or a sharp edge," says Yunzhu Li, CSAIL PhD student and lead author of a new article on the system. "By blindly groping, our model can predict interaction with the environment only from tactile feelings, and combining these two senses could strengthen the robot and reduce the data we need for tasks manipulating and grasping objects."
For now, the robot can only identify objects in a controlled environment. The next step is to build a larger dataset so that the robot can work in more diverse settings.
"Methods like these can be very useful for robotics if you need to answer questions like" Is this object difficult or difficult "" soft? "Or" If I lift this cup by the handle, how good will my grip be? "This is a very challenging problem because the signals are so different and this model has great capabilities," says Andrew Owens, a postdoctoral fellow at the University of California at Berkeley.