Flow of the data-processing pipeline. Credit: Electronics (2025). DOI: 10.3390/electronics14081605
Recent progress in robotics and machine learning has enabled the automation of many real-world tasks, including various manufacturing and industrial processes. Among other applications, robotics and artificial intelligence (AI) systems have been successfully used to automate certain stages of clothing manufacturing.
Researchers at Laurentian University in Canada recently set out to explore the possibility of fully automating the knitting of clothes. To do this, they developed a model that converts images of fabric into complete instructions that knitting robots can read and follow. Their model, described in a paper published in Electronics, was shown to successfully produce patterns for garments knitted with a single yarn or with multiple yarns.
"Our paper takes up the challenge of automating knitting by converting images of fabric into machine-readable instructions," Xingyu Zheng and Mengcheng Lau, co-authors of the paper, told Tech Xplore.
"Traditional methods require manual labeling, which is labor-intensive and limits scalability. Motivated by this gap, our objective was to develop a deep-learning system that inverts knitted fabric images back into knitting instructions, enabling greater personalization and scalability in textile manufacturing."
The deep-learning-based approach developed by Zheng, Lau and their colleagues tackles the problem of producing knitting instructions in two main steps. The first is called the "generation phase," while the second is the "inference phase."
"In the generation phase, an AI model converts real fabric images into clean synthetic representations, then interprets these synthetic images to predict simplified knitting instructions, called front labels," said Haoliang Sheng and Songpu Cai, co-authors of the paper. "In the inference phase, another model uses the front labels to infer the complete, machine-ready knitting instructions."
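The two-phase data flow described above can be sketched as a toy pipeline. This is a minimal illustrative sketch, not the authors' implementation: the actual stages are neural networks, and every function name here (`refine`, `img_to_front_label`, `infer_full_label`) is hypothetical, with the learned models stubbed out as simple placeholder functions.

```python
# Illustrative sketch of the two-phase pipeline. All function names are
# hypothetical; the paper's learned models are replaced by trivial stubs
# so only the data flow (image -> front label -> full label) is shown.

def refine(real_image):
    """Generation phase, step 1: map a real fabric photo to a clean
    synthetic representation (stubbed as an identity pass)."""
    return real_image

def img_to_front_label(synthetic_image):
    """Generation phase, step 2: predict the simplified 'front label'
    grid of stitch symbols from the synthetic image (stubbed)."""
    return [["knit" for _ in row] for row in synthetic_image]

def infer_full_label(front_label):
    """Inference phase: expand the front label into complete instructions
    covering both the visible front layer and the hidden rear layer."""
    return {"front": front_label,
            "rear": [["purl" for _ in row] for row in front_label]}

def image_to_instructions(real_image):
    """End-to-end pipeline: generation phase, then inference phase."""
    synthetic = refine(real_image)
    front = img_to_front_label(synthetic)
    return infer_full_label(front)

# Toy 2x2 "image": each entry stands in for a patch of fabric pixels.
full = image_to_instructions([[0, 1], [1, 0]])
print(full["front"])  # simplified front layer
print(full["rear"])   # hidden rear layer completed in the inference phase
```

The split mirrors the design choice reported in the article: the hard perception problem (photo to clean symbolic grid) is solved first, so the second model only reasons over symbols rather than raw pixels.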
Image illustrating the study's pipeline. It begins with a real image of knitted fabric, followed by the generation phase, where the "refiner" and "img2prog" modules produce a simplified front label. Then, in the inference phase, the "residual model" generates the complete knitting instructions. Credit: Sheng et al.
The complete knitting instructions produced by the model. The final full label includes both the visible front layer and the hidden rear layer, ensuring that the output is ready for direct use by knitting machines. Credit: Sheng et al.
More samples generated by the model. Credit: Sheng et al.
The researchers' new pattern-generation model has several valuable features and advantages. Most notably, it can produce knitting patterns using two or more yarns, accurately incorporate rare stitches, and generalize readily to new fabric styles.
The researchers tested their proposed system in a series of experiments, using it to produce patterns for around 5,000 textile samples made of both natural and synthetic fibers. They found that it worked remarkably well, generating precise knitting instructions for most of these samples.
"Our model reached an accuracy of more than 97% in converting images into knitting instructions, considerably surpassing existing methods," said Sheng and Cai.
"Our system also effectively handled the complexity of multicolored yarns and rare stitches, which were major limitations of previous approaches. In terms of applications, our method enables fully automated textile production, reducing time and labor costs."
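An accuracy figure like the 97% quoted above is typically computed by comparing the predicted instruction grid against a ground-truth grid cell by cell. The following is a minimal sketch of such a stitch-level metric, under the assumption that instructions are stored as 2D grids of stitch symbols; the article does not specify the authors' exact metric, and the stitch names used here are hypothetical.

```python
# Hypothetical stitch-level accuracy: the fraction of grid cells where the
# predicted stitch instruction matches the ground truth. This is an assumed
# metric for illustration, not necessarily the one used in the paper.

def stitch_accuracy(pred, truth):
    """Compare two same-shaped 2D grids of stitch symbols cell by cell."""
    cells = [(p, t) for pred_row, true_row in zip(pred, truth)
                    for p, t in zip(pred_row, true_row)]
    return sum(p == t for p, t in cells) / len(cells)

# Toy 2x2 example: one of four cells is mispredicted.
pred  = [["knit", "purl"], ["knit", "tuck"]]
truth = [["knit", "purl"], ["knit", "knit"]]
print(stitch_accuracy(pred, truth))  # 0.75
```

Averaged over a test set of thousands of samples, a metric of this kind yields the overall conversion accuracy reported by the authors.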
The new model developed by Lau, Zheng, Sheng and Cai could soon be tested and improved further. Eventually, it could be deployed in real-world settings, potentially supporting the automated mass production of personalized knitted clothing. Paired with robotic knitting systems, the model could also allow designers to rapidly prototype their designs or test new patterns without manually creating machine-readable templates.
"In the future, we plan to address imbalances in the data set, particularly for rare stitches, through advanced augmentation techniques," added Lau and Zheng.
"We also aim to incorporate color recognition to improve structural and visual fidelity. Extending the system to handle variable input and output sizes is another objective, allowing it to adapt dynamically to different fabrics. Finally, we intend to extend our pipeline to complex 3D garments and to explore cross-domain applications such as weaving and embroidery."
More information:
Haoliang Sheng et al, Knitting Robots: A Deep Learning Approach for Reverse-Engineering Fabric Patterns, Electronics (2025). DOI: 10.3390/electronics14081605
© 2025 Science X Network
Citation: System converts fabric images into complete machine-readable knitting instructions (2025, May 2) retrieved May 4, 2025 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.