Putting on clothes may seem like a basic skill, but it's one that AI finds particularly challenging, as Georgia Tech researchers discovered while developing virtual robots that can dress themselves. Using reinforcement learning, the team taught the bots the beginnings of the motor skills needed to put on jackets, shirts, and other upper-body clothing, with mixed results. Part of the challenge is that the robots also need haptic perception: the ability to literally feel the clothes around their arms and hands, as humans do, to keep from getting tangled in, say, a sleeve, or, worse, tearing it, as one robot managed to do in the trials. These models are unlikely to get a physical-world counterpart that could help people who need assistance dressing anytime soon (the engineering required for that is a separate challenge), but it's a reminder that the robots aren't taking over just yet.