The development of affordable, high-performance sensors may have crucial implications for robotics research, as they could improve perception and help boost robot manipulation and navigation. In recent years, engineers have introduced a wide range of advanced tactile sensors that can improve robots’ ability to detect tactile signals, using the information they collect to guide their actions.
Researchers at New York University have presented AnySkin, an inexpensive and durable sensor that is easy to assemble and integrate into robotic systems. The sensor, presented in a paper pre-published on arXiv, is much more accessible than many other tactile sensors introduced in recent years and could thus open up new opportunities for robotics research.
“Touch is fundamental to the way humans interact with the world around them, but in contemporary robotics, the sense of touch lags far behind vision, and I’ve been trying to understand why for the past few years,” Raunaq Bhirangi, co-author of the paper, told Tech Xplore.
“The most common reasons we’ve heard from roboticists are: ‘It’s too difficult to fit into my setup,’ ‘How can I train a neural network with this?’ and ‘I have to use the same copy of the sensor for data collection and evaluation. What if it tears halfway through?’ AnySkin was expressly designed to address each of these concerns.”
AnySkin, the new magnetic touch sensor designed by Bhirangi and colleagues, is an updated version of a sensor the researchers presented in a previous paper, called ReSkin. The new sensor builds on ReSkin’s simple design, but it also features better signal consistency and physical separation between the device’s electronics and its sensing interface.
AnySkin can be assembled in just seconds and can be used to learn artificial neural network models with very little or no preprocessing. Compared to ReSkin, it also collects touch signals with greater consistency and can be easily and quickly repaired in the event of accidental damage.
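Because the sensor’s output can be fed to a neural network with little or no preprocessing, the learning pipeline can be very simple. The sketch below is a minimal, hypothetical illustration of this idea: a two-layer MLP consuming a raw magnetometer reading directly. The 15-value input dimension is an assumption based on the five triaxial magnetometers of the earlier ReSkin design, and the layer sizes are arbitrary placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def tactile_encoder(x, w1, b1, w2, b2):
    # Two-layer MLP applied to raw magnetometer values.
    # No filtering, calibration, or feature engineering beforehand.
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h @ w2 + b2                # feature embedding

# Assumed dimensions: 15 raw values in, 64-dim embedding out.
n_readings, hidden, embed = 15, 128, 64
w1 = rng.standard_normal((n_readings, hidden)) * 0.1
b1 = np.zeros(hidden)
w2 = rng.standard_normal((hidden, embed)) * 0.1
b2 = np.zeros(embed)

raw = rng.standard_normal((8, n_readings))  # batch of 8 raw sensor readings
features = tactile_encoder(raw, w1, b1, w2, b2)
print(features.shape)  # (8, 64)
```

In a real system, the weights would of course be trained end to end on task data rather than randomly initialized; the point is only that the raw signal enters the network as-is.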
“If you’re trying to teach your robot to do exciting tasks and accidentally tear the skin, you can replace it in 10 seconds and continue your experiment,” Bhirangi said. “AnySkin consists of two main components: the skin and the electronics. The skin is a magnetic elastomer made by curing a mixture of magnetic particles with silicone, followed by magnetization using a pulse magnetizer.”
The sensor’s self-adhering design allows for greater flexibility in how it is integrated: it can simply be stretched and slipped onto various surfaces to equip them with sensing capabilities.
The sensor is also very versatile, as it can be easily manufactured and assembled in different shapes. AnySkin can likewise be peeled off a surface and replaced if damaged.
In initial tests, the researchers found that their sensor worked remarkably well, with performance comparable to other well-established touch sensors. Notably, they also observed that different AnySkin sensors exhibit very similar performance and sensing responses, suggesting that they could be reliably replicated and deployed at large scale.
“We used machine learning to train robot models end-to-end, which take the raw signal from AnySkin as well as images from different points of view and use this information to perform very precise tasks: locating a power strip and inserting a plug into the first socket, locating a credit card machine and sliding a card through it, locating a USB port and inserting a USB stick into it,” Bhirangi said.
“While it was interesting to see that we could perform these precise tasks even when the locations of the power strip, card machine and USB port were varied, what was even more exciting was the fact that you could swap the skin, and our trained models would continue to work well. This type of generalizability opens up a multitude of possibilities.”
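The policies described above combine the raw tactile signal with camera images. A minimal sketch of that fusion pattern, under assumed dimensions (15 tactile values, a 512-dim image feature, a 7-DoF action), might look as follows; the encoders here are stand-in linear layers, whereas a real policy would use learned CNN and MLP encoders:

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(x, w):
    # Stand-in linear encoder with ReLU; placeholder for a learned encoder.
    return np.maximum(x @ w, 0.0)

# Hypothetical dimensions, not taken from the paper.
w_tac = rng.standard_normal((15, 32)) * 0.05     # tactile encoder weights
w_img = rng.standard_normal((512, 64)) * 0.05    # image-feature encoder weights
w_policy = rng.standard_normal((32 + 64, 7)) * 0.05  # fused-feature policy head

tactile = rng.standard_normal(15)     # one raw AnySkin reading
image_feat = rng.standard_normal(512) # one flattened camera feature

# Concatenate the two modality embeddings, then map to an action command.
fused = np.concatenate([encode(tactile, w_tac), encode(image_feat, w_img)])
action = fused @ w_policy
print(action.shape)  # (7,)
```

Training all three weight matrices jointly on demonstrations is what makes the pipeline end-to-end: no hand-designed tactile features sit between the sensor and the policy.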
In the future, AnySkin could be integrated into a wider range of robotic systems and tested in additional scenarios. The researchers believe it would be well suited to collecting large amounts of touch data and using it to train large-scale deep learning models similar to those underlying computer vision and natural language processing (NLP).
“We now plan to integrate AnySkin into different robot configurations, beyond simple robot grippers to multi-fingered robot hands, data collection devices such as the Robot Utility Models stick and sensory gloves,” Bhirangi added. “We are also looking for better ways to leverage tactile information to improve visuotactile control for fine robot manipulation.”
More information:
Raunaq Bhirangi et al, AnySkin: Plug-and-play Skin Sensing for Robotic Touch, arXiv (2024). DOI: 10.48550/arxiv.2409.08276
© 2024 Science X Network
Citation: Low-cost tactile sensor shows promise for large-scale robotic applications (October 15, 2024), retrieved October 15, 2024.
This document is subject to copyright. Except for fair use for private study or research purposes, no part may be reproduced without written permission. The content is provided for informational purposes only.