Manhattan Tribune

Training algorithm breaks barriers of deep physical neural networks

manhattantribune.com by manhattantribune.com
7 December 2023
in Science


AI-generated concept image (DALL-E 3) depicting light waves passing through a physical system. Credit: LWE/EPFL

EPFL researchers have developed an algorithm to train an analog neural network as accurately as a digital network, enabling the development of more efficient alternatives to power-hungry deep learning hardware.

With their ability to process large amounts of data through algorithmic “learning” rather than traditional programming, it often seems that the potential of deep neural networks like ChatGPT is limitless. But as the scope and impact of these systems have grown, so have their size, complexity and energy consumption, the latter of which is significant enough to raise concerns about their contribution to global carbon emissions.

While we often think of technological progress in terms of moving from analog to digital, researchers are now looking for answers to this problem in physical alternatives to digital deep neural networks. One of these researchers is Romain Fleury from the Wave Engineering Laboratory at EPFL’s Faculty of Engineering.

In an article published in Science, he and his colleagues describe an algorithm for training physical systems that offers improved speed, improved robustness, and reduced power consumption compared to other methods.

“We successfully tested our training algorithm on three wave-based physics systems that use sound waves, light waves and microwaves to transport information rather than electrons. But our versatile approach can be used to train any physical system,” explains first author and LWE researcher Ali Momeni.

A “more biologically plausible” approach

Neural network training involves helping systems learn to generate optimal parameter values for a task such as image or speech recognition. This traditionally involves two steps: a forward pass, where data is sent through the network and an error function is calculated based on the output, and a backward pass (also known as backpropagation, or BP), where the gradient of the error function with respect to all network parameters is calculated.

Over repeated iterations, the system updates based on these two calculations to return increasingly precise values. The problem? In addition to being very energy intensive, BP is poorly suited to physical systems. In fact, training physical systems usually requires a digital twin for the BP step, which is inefficient and carries a risk of lag between reality and simulation.
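To make the two steps concrete, here is a minimal numerical sketch of conventional digital training: a forward pass that measures the error, and a backward pass that propagates its gradient to every weight. This is an illustration of standard backpropagation only, not the paper's code; the network size, data, and learning rate are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # 4 samples, 3 input features
y = rng.normal(size=(4, 1))          # toy regression targets
W1 = 0.1 * rng.normal(size=(3, 5))   # layer 1 weights
W2 = 0.1 * rng.normal(size=(5, 1))   # layer 2 weights

losses = []
for _ in range(300):
    # Forward pass: data flows through the network; an error is computed.
    h = np.tanh(x @ W1)
    pred = h @ W2
    err = pred - y
    losses.append(0.5 * np.mean(err ** 2))

    # Backward pass (BP): the gradient of the error with respect to every
    # weight is computed layer by layer, from the output back to the input.
    g_pred = err / len(x)
    g_W2 = h.T @ g_pred
    g_h = g_pred @ W2.T
    g_W1 = x.T @ (g_h * (1 - h ** 2))  # chain rule through tanh

    # Update step: weights move against the gradient.
    W1 -= 0.1 * g_W1
    W2 -= 0.1 * g_W2
```

Over the iterations the recorded loss shrinks, which is exactly the "increasingly precise values" behavior described above; the catch, as the article notes, is that the backward pass needs an exact differentiable model of the system, which a physical device does not provide.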

The scientists proposed replacing the BP step with a second pass through the physical system to update each network layer locally. In addition to reducing power consumption and eliminating the need for a digital twin, this method better reflects human learning.

“The structure of neural networks is inspired by the brain, but the brain is unlikely to learn via BP,” says Momeni. “The idea here is that if we train each physical layer locally, we can use our actual physical system instead of first building a digital model of it. So we developed a more biologically plausible approach.”
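The flavor of such layer-local, BP-free training can be sketched numerically. The toy below is not PhyLL itself (the data, objective, and all parameters here are invented for illustration): each layer is updated using only its own inputs and outputs, obtained from extra forward passes on "positive" and "negative" examples, with no backward pass through the stack and no digital twin.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(x, weights):
    # Plain stacked forward pass through all layers.
    for W in weights:
        x = np.tanh(x @ W)
    return x

def goodness(h):
    # Mean squared activation norm of a layer's output.
    return np.sum(h ** 2, axis=1).mean()

# Positive data concentrates along one direction; negatives are isotropic.
v = rng.normal(size=8); v /= np.linalg.norm(v)
pos = rng.normal(size=(64, 1)) @ v[None, :] * 2 + 0.1 * rng.normal(size=(64, 8))
neg = rng.normal(size=(64, 8))

weights = [0.1 * rng.normal(size=(8, 8)) for _ in range(2)]
init = [W.copy() for W in weights]

xp, xn = pos, neg
for W in weights:
    for _ in range(200):
        hp, hn = np.tanh(xp @ W), np.tanh(xn @ W)
        # Local objective: raise goodness on positives, lower it on
        # negatives. The gradient uses only this layer's own input and
        # output -- no error signal travels backward through other layers.
        g = (xp.T @ (2 * hp * (1 - hp ** 2)) / len(xp)
             - xn.T @ (2 * hn * (1 - hn ** 2)) / len(xn))
        W += 0.05 * g
    # The next layer trains on outputs of the (now frozen) layer below.
    xp, xn = np.tanh(xp @ W), np.tanh(xn @ W)

gap_before = goodness(forward(pos, init)) - goodness(forward(neg, init))
gap_after = goodness(forward(pos, weights)) - goodness(forward(neg, weights))
```

The appeal for physical hardware is that every quantity in the update rule is available locally at each layer, so the real system can supply the forward passes itself instead of being mirrored by a simulation.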

The EPFL researchers, together with Philipp del Hougne of CNRS IETR and Babak Rahmani of Microsoft Research, used their physical local learning algorithm (PhyLL) to train experimental acoustic and microwave systems and a modeled optical system to classify data such as vowels and images. In addition to matching the accuracy of BP-based training, the method proved robust and adaptable compared with the state of the art, even in systems exposed to unpredictable external perturbations.

An analog future?

Although the LWE approach constitutes the first BP-free training of deep physical neural networks, some numerical parameter updates are still necessary. “This is a hybrid training approach, but our goal is to reduce the number crunching as much as possible,” says Momeni.

The researchers now hope to implement their algorithm on a small-scale optical system with the ultimate goal of increasing the scalability of the network.

“In our experiments we used neural networks with up to 10 layers, but would this still work with 100 layers with billions of parameters? This is the next step and will require overcoming the technical limitations of physical systems.”

More information:
Ali Momeni et al, Backpropagation-free training of deep physical neural networks, Science (2023). DOI: 10.1126/science.adi8474

Provided by the Ecole Polytechnique Fédérale de Lausanne



© 2023 Manhattan Tribune -By Millennium Press
