This AI-Powered Smart Glove Can Identify Objects by Touch

MIT's AI-powered smart glove (Image credit: MIT CSAIL)

Scientists at MIT have developed a smart glove that can recognize objects by touch alone.

In the dark, humans can identify an object such as a pair of glasses or a phone just by touching it. For years, scientists have been trying to teach robots how to grip different objects without crushing or dropping them.


“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback,” says Subramanian Sundaram PhD ’18, a former CSAIL graduate student. “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”

In a paper published in Nature, the researchers describe a low-cost device called the Scalable Tactile Glove (STAG). The smart glove is equipped with about 550 tiny sensors covering nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes these signals to "learn" a dataset of pressure-signal patterns associated with specific objects. The system then uses that dataset to classify objects and predict their weights by feel alone, with no visual input needed, MIT CSAIL reports.
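The idea of learning object identity from tactile pressure patterns can be illustrated with a toy sketch. Everything below is hypothetical and much simpler than the actual STAG pipeline (which records real pressure frames from ~550 sensors and trains a deep convolutional network): here we mock up synthetic 32×32 "pressure frames" for three made-up object classes and fit a single-layer softmax classifier to them.

```python
import numpy as np

# Illustrative sketch only: synthetic "pressure frames" stand in for real
# STAG sensor readings, and a single-layer softmax classifier stands in
# for the paper's deep network. All sizes and classes are hypothetical.

rng = np.random.default_rng(0)
N_CLASSES, FRAME, N_PER_CLASS = 3, 32, 40

def make_frame(cls):
    """Synthetic pressure frame: each 'object' presses a different patch."""
    frame = rng.normal(0.0, 0.05, (FRAME, FRAME))   # sensor noise floor
    patch = slice(cls * 8, cls * 8 + 10)            # class-specific contact area
    frame[patch, patch] += 1.0                      # simulated grip pressure
    return frame.ravel()

X = np.stack([make_frame(c) for c in range(N_CLASSES) for _ in range(N_PER_CLASS)])
y = np.repeat(np.arange(N_CLASSES), N_PER_CLASS)

# Train a linear softmax classifier by gradient descent on cross-entropy.
W = np.zeros((FRAME * FRAME, N_CLASSES))
onehot = np.eye(N_CLASSES)[y]
for _ in range(200):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.01 * X.T @ (p - onehot) / len(X)

accuracy = ((X @ W).argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because each synthetic class presses a distinct region of the frame, even this linear model separates them easily; the real difficulty STAG's network addresses is that genuine grasps of different objects produce overlapping, pose-dependent pressure patterns.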

The researchers compiled a dataset using STAG for 26 common objects including a soda can, scissors, tennis ball, spoon, pen, and mug. Using the dataset, the system predicted the objects’ identities with up to 76 percent accuracy. The system can also predict the correct weights of most objects within about 60 grams.

Similar sensor-based gloves available today can cost thousands of dollars and often contain only around 50 sensors, capturing far less information. STAG, by contrast, produces high-resolution data yet is made from commercially available materials costing about $10 in total.

MIT CSAIL building
MIT Computer Science and Artificial Intelligence Lab (Image credit: CSAIL)

The tactile sensing system could be combined with traditional computer vision and image-based datasets to give robots a more human-like understanding of how to interact with objects, according to the MIT CSAIL report.


