A new supercomputer aims to faithfully imitate the human brain: it could help unlock the secrets of the mind and advance AI


A supercomputer coming online in April 2024 will rival the estimated rate of operations of the human brain, according to researchers in Australia. The machine, called DeepSouth, is capable of performing 228 trillion operations per second.

It is the world’s first supercomputer capable of simulating networks of neurons and synapses (key biological structures that make up our nervous system) on the scale of the human brain.

DeepSouth belongs to an approach known as neuromorphic computing, which aims to mimic the biological processes of the human brain. It will be run from the International Centre for Neuromorphic Systems at Western Sydney University.

Our brain is the most amazing computing machine we know. By distributing its computing power across billions of small units (neurons) that interact through trillions of connections (synapses), the brain can rival the world’s most powerful supercomputers while consuming only about as much energy as a refrigerator light bulb.

Meanwhile, supercomputers typically take up a lot of space and need large amounts of electrical power to operate. The world’s most powerful supercomputer, the Hewlett Packard Enterprise Frontier, can perform just over one quintillion operations per second. It covers 680 square meters (7,300 square feet) and requires 22.7 megawatts (MW) to operate.

Our brains can perform the same number of operations per second using only 20 watts of power, while weighing just 1.3kg to 1.4kg. Among other things, neuromorphic computing aims to unlock the secrets of this astonishing efficiency.

Transistors at the limit

On June 30, 1945, the mathematician and physicist John von Neumann described the design of a new machine, the Electronic Discrete Variable Automatic Computer (EDVAC). This effectively defined the modern electronic computer as we know it.

My smartphone, the laptop I’m using to write this article, and the world’s most powerful supercomputer share the same fundamental structure introduced by von Neumann almost 80 years ago. These all have distinct memory and processing units, where data and instructions are stored in memory and computed by a processor.

For decades, the number of transistors on a microchip doubled about every two years, an observation known as Moore’s Law. This allowed us to have smaller and cheaper computers.
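That doubling can be illustrated with simple arithmetic. The sketch below is purely illustrative: it starts from the well-known transistor count of the Intel 4004 (about 2,300 in 1971) and assumes a clean doubling every two years, which real chip generations only roughly follow.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every two years.
# The 1971 starting point (Intel 4004, ~2,300 transistors) is a well-known figure;
# the projection assumes a perfectly regular doubling, which is an idealisation.

def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count assuming one doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

count_1971 = 2_300  # Intel 4004
print(f"{transistors(count_1971, 1971, 2021):,.0f}")  # prints 77,175,193,600
```

Fifty years of doubling turns a few thousand transistors into tens of billions, which is roughly the scale of today's largest chips.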

However, transistor sizes are now approaching the atomic scale. At these small sizes, excessive heat generation is a problem, as is a phenomenon called quantum tunneling, which interferes with the operation of transistors. This is slowing down and will eventually stop the miniaturization of transistors.

To overcome this problem, scientists are exploring new approaches to computing, starting with the powerful computer we all carry hidden in our heads: the human brain. Our brains do not function according to John von Neumann’s computer model. They do not have separate computing and memory areas.

Instead, they work by connecting billions of nerve cells that communicate information in the form of electrical impulses. Information can pass from one neuron to the next through a junction called a synapse. The organization of neurons and synapses in the brain is flexible, scalable and efficient.

Thus, in the brain (and unlike a computer), memory and computation are governed by the same neurons and synapses. Since the late 1980s, scientists have been studying this model with the intention of importing it into computer science.
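The spiking behaviour described above is often abstracted as a "leaky integrate-and-fire" neuron: the cell accumulates incoming current, leaks charge over time, and emits a spike once a threshold is crossed. The sketch below is a textbook simplification with made-up parameter values, not the model DeepSouth actually runs.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a textbook abstraction of
# spiking, with illustrative parameters (leak factor, threshold, input current).

def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """Advance the membrane potential one time step; return (new_v, spiked)."""
    v = v * leak + input_current  # leak toward rest, then integrate the input
    if v >= threshold:            # fire a spike and reset the potential
        return 0.0, True
    return v, False

v, spikes = 0.0, 0
for t in range(20):
    v, spiked = lif_step(v, input_current=0.3)
    spikes += spiked
print(spikes)  # prints 5
```

With a steady input, the neuron settles into a regular rhythm: charge up over a few steps, spike, reset, repeat.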


Imitation of life

Neuromorphic computers are based on intricate networks of simple processing elements (which act like the neurons and synapses of the brain). The main advantage of this is that these machines are inherently “parallel”.

This means that, just like neurons and synapses, virtually all of a computer’s processors can operate simultaneously and communicate in tandem.

Additionally, because the calculations performed by individual neurons and synapses are very simple compared with those in traditional computers, power consumption is much lower. Although neurons are sometimes thought of as processing units and synapses as memory units, both contribute to processing and storage. In other words, the data is already located where the computation needs it.

This speeds up the brain’s computation overall, because there is no separation between memory and processor, a separation which in classical (von Neumann) machines causes a slowdown. It also avoids the need to perform a dedicated data-access operation from a main memory component, which occurs in conventional computer systems and consumes a considerable amount of power.
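The parallel, lock-step character of such a system can be mimicked in software by updating an entire population of simple neurons in one vectorised operation. This is only an analogy: on neuromorphic hardware each neuron is a physical unit updating simultaneously, whereas here NumPy merely applies the same update to every element at once, and all the parameter values are invented for illustration.

```python
import numpy as np

# Illustrative sketch: one vectorised step updates every neuron's state "at once",
# loosely mimicking the lock-step parallelism of neuromorphic hardware.
# Parameters (leak 0.9, threshold 1.0, input weights) are made up for the demo.

rng = np.random.default_rng(0)
n = 1_000
v = np.zeros(n)                        # membrane potentials, stored with the "neurons"
weights = rng.uniform(0.05, 0.3, n)    # one toy input synapse per neuron

total_spikes = 0
for _ in range(50):
    v = 0.9 * v + weights              # every neuron leaks and integrates together
    spiked = v >= 1.0                  # boolean mask of neurons crossing threshold
    total_spikes += int(spiked.sum())
    v[spiked] = 0.0                    # spiking neurons reset

print(total_spikes)
```

Neurons whose input is strong enough spike periodically while the rest stay quiet, all without ever shuttling the state array to a separate "processor": the data lives where the update happens.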

The principles we just described are the main inspiration behind DeepSouth, but it is not the only neuromorphic system currently active. Also worth mentioning is the Human Brain Project (HBP), funded by an EU initiative. The HBP was operational from 2013 to 2023 and gave rise to BrainScaleS, a machine located in Heidelberg, Germany, that emulates the way neurons and synapses work.

BrainScaleS can simulate the way neurons “spike”, that is, the way an electrical impulse travels along a neuron in our brain. This makes BrainScaleS an ideal candidate for investigating the mechanics of cognitive processes and, in the future, the mechanisms underlying severe neurological and neurodegenerative diseases.

Because they are designed to mimic real brains, neuromorphic computers could be the beginning of a game-changer. By offering sustainable and affordable computing power and allowing researchers to evaluate models of neurological systems, they are an ideal platform for a variety of applications. They have the potential to improve our understanding of the brain and offer new approaches to artificial intelligence.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Domenico Vicinanza does not work for, consult with, own shares in, or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.
