Research
Wiry synapse
Neuronal networks in the human brain are superior to conventional computers in many ways. Researchers have developed an electronic component that functions similarly to the synapse of a nerve cell. In the future, networks consisting of such circuit elements could work as efficiently as the brain.
The numbers are impressive: with an energy consumption of just 20 watts, our brain accomplishes an estimated 10,000 billion so-called binary arithmetic operations per second. In comparison, computers consume significantly more energy – about 225 watts on average for a standard PC – and are therefore far less efficient than the human brain. No wonder that researchers around the globe are trying to copy the way the “biological computer” works.
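To put these figures in perspective, a short back-of-the-envelope calculation helps. The following Python sketch simply takes the round numbers quoted above (roughly 10^13 operations per second at 20 watts for the brain, 225 watts for a PC) as given and works out what they imply:

    # Rough efficiency comparison using the round figures quoted above.
    brain_power_w = 20.0        # watts consumed by the brain (article figure)
    brain_ops_per_s = 1e13      # "10,000 billion" binary operations per second
    pc_power_w = 225.0          # average power draw of a standard PC (article figure)

    ops_per_joule = brain_ops_per_s / brain_power_w
    print(f"Brain: about {ops_per_joule:.0e} operations per joule")

    # A 225 W machine would need this rate just to match the brain's efficiency:
    print(f"Break-even PC rate: about {ops_per_joule * pc_power_w:.0e} operations per second")

In other words, the brain delivers on the order of 500 billion operations per joule of energy.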
One of them is Dr. Ilia Valov from the Peter Grünberg Institute (PGI-7). Together with colleagues from Aachen and Turin, he has developed a new miniature electronic component that can process and store information and receive multiple signals in parallel. A network of many such units on one chip would be a so-called neuromorphic processor that functions similarly to the brain.
Adaptive role model
In the brain, nerve cells – neurons – are connected to form a huge network: a neuronal network. The contact points of the neurons are called synapses. “They transmit signals, process and store information,” explains Valov. In doing so, the synapses can, for example, adapt their size or efficiency as required – experts refer to this property as synaptic plasticity. Among other things, the ability to learn and to forget is based on this. The synapses thus combine several functions and are versatile.
Classic electronic components are not able to do this: they are either memory units or processing units, and the two types are always physically separated from each other. Transferring data between them therefore requires more time and energy than in a neuronal network. In addition, conventional computer hardware is not adaptable: it does not change its structure in response to completed tasks in order to process them faster in the future. Until now, artificial intelligence (AI) has generally relied on such classical processor technology, which merely imitates the decentralised, self-learning operating principle of neuronal networks by means of sophisticated software. “This approach is quite inefficient in terms of energy consumption and space requirements,” explains Valov. It would be better to simulate the operating principle of the brain using a network of artificial synapses.
Expert in miniature electrochemical components: Ilia Valov
The synapse-like component of the Jülich team consists of a zinc oxide wire about one ten-thousandth of a millimetre (roughly 100 nanometres) in diameter that connects a platinum electrode to a silver electrode. When current flows through the wire, its electrical resistance changes depending on the strength and direction of the current. What makes this special: in contrast to conventional transistors, the last resistance value is retained after the current is switched off. This way, information can be stored. If the current is switched on again, the changed resistance value causes a different current flow. The components therefore behave similarly to biological synapses: they change their structure on the basis of the signals they receive and, as a result, will forward future signals differently.
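The behaviour described here can be illustrated with a deliberately simplified toy model, sketched below in Python. It is not the physics of the Jülich device; the resistance limits and the sensitivity constant are freely chosen for illustration. An internal state is pushed up or down by the strength and direction of the drive current, the resistance follows that state, and the state stays where it is once the current is switched off.

    # Toy memristor: the internal state w (between 0 and 1) is driven by the
    # applied current; the resistance interpolates between two assumed limits.
    R_OFF, R_ON = 1e5, 1e2   # illustrative high/low resistance values in ohms
    ETA = 1.0                # illustrative sensitivity of the state to charge

    def step(w, current, dt=1.0):
        """Move the state according to the strength and direction of the current."""
        return min(max(w + ETA * current * dt, 0.0), 1.0)

    def resistance(w):
        """Resistance drops as the conductive state builds up."""
        return R_OFF + (R_ON - R_OFF) * w

    w = 0.0
    for _ in range(1000):            # positive current: the resistance falls step by step
        w = step(w, current=1e-3)
    print(resistance(w))             # low resistance

    for _ in range(1000):            # current switched off: nothing changes
        w = step(w, current=0.0)
    print(resistance(w))             # same value - this is the stored information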
In technical jargon, such components are called memristors – a portmanteau of “memory” and “resistor” (electrical resistance). “The special thing about our memristor is that it combines various functions such as saving, learning and forgetting,” explains Valov. Previous memristors could each reproduce only one of these properties.
Fast and adaptive
In order to also design the hardware according to the biological model, it would have to be possible to connect the new components to form a functional network that can perform certain tasks. “This would enable parallel data processing and storage, for example, which significantly speeds up calculations,” explains Valov. In addition, such a neuromorphic processor could – similar to the brain – learn independently, that is, perform certain tasks better and faster after a training phase.
The researchers are already working on linking individual artificial synapses to form a larger network. Valov emphasises, however, that it will take quite some time before processors are actually built with memristors. So for the time being, the brain remains unique.
Janosch Deeg
Neuronal synapse
An electrical signal triggers several processes in the transmitting neuron: vesicles, small bubbles filled with messenger substances called neurotransmitters, merge with the membrane of the synapse. The neurotransmitters move into the synaptic cleft and dock onto the receptors of the receiving neuron, changing its electrical resistance and causing a signal to be transmitted. This is how information is stored and processed. The more often two neurons communicate, the more pronounced their connection becomes – for example, through the release of more neurotransmitters or an increase in receptor density. This is called synaptic plasticity.
Artificial synapse (memristor)
A positive voltage at the silver electrode (= transmitter) causes silver ions on the nanowire to move towards the platinum electrode (= receiver). The silver ions form a conductive bridge between the electrodes, which lowers the resistance. A negative voltage, on the other hand, increases the resistance. Through repeated electrical impulses, the resistance can be controlled purposefully; after the current is switched off, the silver bridge either remains on the wire (saving) or dissolves again (forgetting). In this way, the component imitates the plasticity of the neuronal synapse.
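A small extension of the toy model sketched earlier can mimic this saving-versus-forgetting behaviour. It is again only an illustration with freely chosen numbers: the stability threshold and decay rate are assumptions, not device parameters. Repeated positive pulses build the filament state up, negative pulses weaken it, and a filament that has not grown past the threshold dissolves on its own.

    # Extension of the toy model: voltage pulses build up or weaken the filament
    # state; below an assumed stability threshold the filament decays by itself.
    STABILITY_THRESHOLD = 0.5   # assumed: only strong filaments persist
    DECAY = 0.01                # assumed spontaneous decay per time step

    def apply_pulse(w, polarity, strength=0.1):
        """A positive pulse grows the filament, a negative pulse shrinks it."""
        return min(max(w + polarity * strength, 0.0), 1.0)

    def rest(w, steps):
        """With no voltage applied, only weak filaments dissolve (forgetting)."""
        for _ in range(steps):
            if w < STABILITY_THRESHOLD:
                w = max(w - DECAY, 0.0)
        return w

    w = 0.0
    for _ in range(3):              # only a few pulses: a weak, volatile filament
        w = apply_pulse(w, polarity=+1)
    w = rest(w, steps=100)
    print(w)                        # 0.0 -> the information is forgotten

    for _ in range(8):              # repeated pulses: a stable filament
        w = apply_pulse(w, polarity=+1)
    w = rest(w, steps=100)
    print(w)                        # about 0.8 -> the information is saved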
© 2022 Forschungszentrum Jülich