A digitally embroidered smart glove developed at MIT can facilitate piano lessons and human-robot teleoperation using a machine learning agent that adapts to how different users respond to touch. Credit: Alex Shipps/MIT CSAIL
You’ve probably met someone who identifies as a visual or auditory learner, but others absorb knowledge through a different modality: touch. Being able to understand touch interactions is especially important for tasks like learning delicate surgical procedures and playing musical instruments, but unlike video and audio, touch is difficult to record and transfer.
To address this challenge, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and elsewhere have developed an embroidered smart glove capable of capturing, reproducing, and relaying tactile instructions.
To complement the wearable, the team also developed a simple machine learning agent that adapts to how different users react to tactile feedback, thereby optimizing their experience. The new system could potentially help teach people physical skills, improve responsive teleoperation of robots, and facilitate virtual reality training.
An open access article describing the work was published in Nature Communications on January 29.
Will I be able to play the piano?
To create their smart glove, the researchers used a digital embroidery machine to seamlessly integrate tactile sensors and haptic actuators (devices that provide tactile feedback) into the textile. The same technology is present in smartphones, where haptic responses are triggered by tapping on the touchscreen.
For example, if you tap on an iPhone app, you will feel a slight vibration coming from that specific part of your screen. Likewise, the new adaptive wearable sends feedback to different parts of your hand to indicate the optimal movements to perform different skills.
The smart glove could, for example, teach users to play the piano. In a demonstration, an expert was tasked with recording a simple melody on a section of keys, using the smart glove to capture the sequence in which they pressed the keys. A machine learning agent then converted this sequence into haptic feedback, which was fed into the students’ gloves for them to follow.
With their hands hovering over this same section, actuators vibrated on the fingers corresponding to the keys below. The pipeline optimizes these directions for each user, taking into account the subjective nature of touch interactions.
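In spirit, the capture-and-playback step described above amounts to turning a timestamped key-press recording into per-finger vibration cues. The sketch below is a hypothetical illustration of that idea, not the paper's actual pipeline; the names (`Keypress`, `haptic_schedule`) and the lead-time and gain parameters are invented for the example, with `gain` standing in for the per-user intensity the learning agent would tune.

```python
# Hypothetical sketch of the capture-to-haptics idea described in the
# article. All names and parameters here are illustrative, not from
# the paper.

from dataclasses import dataclass


@dataclass
class Keypress:
    time_s: float   # when the expert pressed the key, in seconds
    finger: int     # 0 = thumb ... 4 = pinky


def haptic_schedule(recording, lead_time_s=0.3, gain=1.0):
    """Convert an expert's key-press sequence into vibration cues.

    Each cue fires slightly before the corresponding press so the
    student has time to react; `gain` stands in for the per-user
    intensity an adaptive agent would tune.
    """
    return [
        {"time_s": max(0.0, press.time_s - lead_time_s),
         "finger": press.finger,
         "intensity": gain}
        for press in recording
    ]


# A three-note melody recorded by the expert's glove:
melody = [Keypress(0.0, 1), Keypress(0.5, 2), Keypress(1.0, 3)]
cues = haptic_schedule(melody, lead_time_s=0.3, gain=0.8)
```

Each cue tells one actuator when to vibrate and how strongly, which is the shape of instruction the students' gloves would replay.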
“Humans engage in a wide variety of tasks by constantly interacting with the world around them,” says Yiyue Luo MS ’20, lead author of the paper, a Ph.D. student in the Department of Electrical Engineering and Computer Science (EECS) at MIT and a CSAIL affiliate. “We don’t usually share these physical interactions with others. Instead, we often learn by observing their movements, such as playing the piano and dancing.”
“The main challenge in conveying touch interactions is that everyone perceives haptic feedback differently,” adds Luo. “This obstacle inspired us to develop a machine learning agent that learns to generate adaptive haptics for individuals’ gloves, presenting them with a more practical approach to learning optimal movement.”
The wearable system is customized to fit the specifications of a user’s hand via a digital manufacturing method. A computer produces a cut pattern based on the individual’s hand measurements; an embroidery machine then sews in the sensors and haptic actuators. Within 10 minutes, the soft, fabric-based wearable is ready to wear. Initially trained on the haptic responses of 12 users, the adaptive machine learning model needs only 15 seconds of data from a new user to personalize feedback.
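One way to picture the quick personalization step is as fitting a simple perception model from a short calibration session. The sketch below uses a least-squares scale factor as a stand-in for the paper's learning agent; the linear model, function names, and the calibration data are all assumptions made for illustration, not the authors' method.

```python
# Illustrative sketch of per-user haptic calibration. The linear
# perception model and the short calibration set stand in for the
# paper's adaptive learning agent; none of this is the actual method.

def fit_scale(nominal, perceived):
    """Least-squares scale factor mapping nominal -> perceived intensity."""
    numerator = sum(n * p for n, p in zip(nominal, perceived))
    denominator = sum(n * n for n in nominal)
    return numerator / denominator


def personalize(cue_intensity, scale, target=1.0):
    """Rescale a cue so this user perceives roughly `target` strength."""
    return cue_intensity * target / scale


# A brief calibration session (stand-in for the ~15 seconds of
# new-user data): this user reports cues at half their nominal strength.
nominal = [0.2, 0.4, 0.6, 0.8]
perceived = [0.1, 0.2, 0.3, 0.4]

scale = fit_scale(nominal, perceived)
adjusted = personalize(0.5, scale)
```

Under this toy model, a user who feels cues at half strength gets cues boosted to twice their nominal intensity, capturing the idea that the same vibration is perceived differently by different people.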
In two other experiments, tactile directions with time-sensitive feedback were delivered to users wearing the gloves while they played laptop games. In a rhythm game, players learned to follow a narrow, winding path to hit a goal zone, and in a racing game, drivers collected coins and kept their vehicle balanced all the way to the finish line.
Luo’s team found that participants achieved the highest game scores with optimized haptics, compared with unoptimized haptics or no haptics at all.
“This work is the first step toward creating personalized AI agents that continuously capture data about the user and the environment,” says senior author Wojciech Matusik, professor of electrical engineering and computer science at MIT and leader of the Computational Design and Fabrication Group within CSAIL. “These agents then help them perform complex tasks, learn new skills and promote better behaviors.”
Bringing realistic experiences to digital settings
In robotic teleoperation, researchers found that their gloves could transfer sensations of force to robotic arms, helping them perform more delicate grasping tasks.
“It’s a bit like trying to teach a robot to behave like a human,” Luo explains. In one case, the MIT team used human teleoperators to teach a robot how to grip different types of bread without deforming them. By teaching optimal grip, humans could precisely control robotic systems in environments such as manufacturing, where these machines could collaborate more safely and effectively with their operators.
“The technology that powers the embroidered smart glove is an important innovation for robots,” says Daniela Rus, the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT, director of CSAIL and an author of the paper.
“With its ability to capture tactile interactions at high resolution, similar to those of human skin, this sensor allows robots to perceive the world through touch. The seamless integration of tactile sensors into textiles bridges the gap between physical actions and digital feedback, offering vast potential in responsive robot teleoperation and immersive virtual reality training.”
Likewise, the interface could create more immersive experiences in virtual reality. Wearing smart gloves would add tactile sensations to digital environments in video games, where players could sense their surroundings to avoid obstacles. Additionally, the interface would provide a more personalized and tactile experience in virtual training courses used by surgeons, firefighters and pilots, where precision is paramount.
Although these wearable devices could provide a more convenient experience for users, Luo and the group believe they could expand their wearable technology beyond fingers. With stronger haptic feedback, the interfaces could guide feet, hips, and other body parts that are less sensitive than hands.
Luo also noted that with a more complex artificial intelligence agent, the team’s technology could assist with more demanding tasks, such as manipulating clay or flying an airplane. Currently, the interface can only help with simple movements like pressing a key or grabbing an object. In the future, the MIT system could incorporate more user data and produce more conformal, tight-fitting wearables to better account for how hand movements influence haptic perception.
More information:
Yiyue Luo et al, Adaptive tactile interaction transfer via digitally embroidered smart gloves, Nature Communications (2024). DOI: 10.1038/s41467-024-45059-8
Provided by the Massachusetts Institute of Technology
This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and education.
Citation: Adaptive smart glove can teach new physical skills (February 20, 2024) retrieved February 20, 2024 from
This document is subject to copyright. Apart from fair use for private study or research purposes, no part may be reproduced without written permission. The content is provided for information only.