Taking inspiration from the human brain, researchers have developed a new synaptic transistor capable of higher-level thinking.
Designed by researchers at Northwestern University, Boston College and the Massachusetts Institute of Technology (MIT), the device simultaneously processes and stores information, just like the human brain. In new experiments, the researchers demonstrated that the transistor goes beyond simple machine learning tasks to categorize data and is capable of performing associative learning.
Although previous studies have exploited similar strategies to develop brain-like computing devices, those devices could not operate outside of cryogenic temperatures. The new device, in contrast, is stable at room temperature. It also operates at high speeds, consumes very little power, and retains stored information even when the power is turned off, making it ideal for real-world applications.
The study, titled “Moiré synaptic transistor with neuromorphic functionality at room temperature,” was published Dec. 20 in the journal Nature.
“The brain has a fundamentally different architecture than a digital computer,” said Mark C. Hersam of Northwestern, who co-led the research.
“In a digital computer, data flows between a microprocessor and memory, which consumes a lot of power and creates a bottleneck when trying to multitask. In the brain, by contrast, memory and information processing are co-located and fully integrated, resulting in orders of magnitude greater energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain.”
Hersam is the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He is also chairman of the Department of Materials Science and Engineering, director of the Materials Science and Engineering Research Center, and a member of the International Institute of Nanotechnology. Hersam co-led the research with Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT.
Recent advances in artificial intelligence (AI) have prompted researchers to develop computers that work more like the human brain. Conventional digital computing systems have separate processing and storage units, causing data-intensive tasks to consume large amounts of energy. As smart devices continually collect large amounts of data, researchers strive to discover new ways to process it all without consuming an increasing amount of energy.
Currently, the memory resistor, or “memristor”, is the most developed technology capable of combining processing and memory functions. But memristors still suffer from energy-expensive switching.
“For several decades, the paradigm in electronics has been to build everything from transistors and use the same silicon architecture,” Hersam said.
“Significant progress has been made by simply packing more and more transistors into integrated circuits. There is no denying the success of that strategy, but it comes at the cost of high power consumption, especially in today’s big data era, where digital computing is on track to overwhelm the grid. We need to rethink computing hardware, especially for AI and machine learning tasks.”
To rethink this paradigm, Hersam and his team explored new advances in the physics of moiré patterns, a type of geometric design that appears when two patterns overlap. When two-dimensional materials are stacked, new properties emerge that do not exist in a single layer. And when these layers are twisted to form a moiré pattern, unprecedented tuning of electronic properties becomes possible.
For the new device, the researchers combined two different types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When stacked and deliberately twisted, the materials formed a moiré pattern. By rotating one layer relative to the other, the researchers could obtain different electronic properties in each graphene layer, even though the layers are separated only by atomic-scale dimensions. With the right choice of twist angle, the researchers exploited moiré physics for neuromorphic functionality at room temperature.
“With twist as a new design parameter, the number of permutations is vast,” Hersam said. “Graphene and hexagonal boron nitride are very similar structurally but just different enough to achieve exceptionally strong moiré effects.”
To test the transistor, Hersam and his team trained it to recognize similar, but not identical, patterns. Earlier in 2023, Hersam introduced a new nanoelectronic device capable of analyzing and categorizing data in an energy-efficient way, but his new synaptic transistor takes machine learning and AI even further.
“If AI is supposed to mimic human thinking, one of the most basic tasks would be to classify data, which is simply sorting it into boxes,” Hersam said. “Our goal is to advance AI technology toward higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, which is why we tested our new devices in more complicated conditions to verify their advanced capabilities.”
The researchers first showed the device a pattern: 000 (three zeros in a row). Then they asked the AI to identify similar patterns, such as 111 or 101. “If we trained it to detect 000 and then gave it 111 and 101, it knows that 111 is more like 000 than 101,” Hersam explained. “000 and 111 are not exactly the same, but both consist of three identical digits in a row. Recognizing that similarity is a form of higher-level cognition known as associative learning.”
In experiments, the new synaptic transistor successfully recognized similar patterns, displaying its associative memory. Even when the researchers threw curve balls, such as giving it incomplete patterns, the device was still able to demonstrate associative learning.
“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam said. “Imagine if you are operating an autonomous vehicle and weather conditions deteriorate. The vehicle might not be able to interpret more complex sensor data as well as a human driver. But even when we gave imperfect input to our transistor, it could still identify the correct answer.”
More information:
Mark Hersam, Moiré synaptic transistor with neuromorphic functionality at room temperature, Nature (2023). DOI: 10.1038/s41586-023-06791-1. www.nature.com/articles/s41586-023-06791-1
Provided by Northwestern University
Citation: New brain-like transistor performs energy-efficient associative learning at room temperature (December 20, 2023) retrieved December 20, 2023 from