Jeddah - Asharq Al-Awsat

KAUST Researchers Create Chip Mimicking Human Brain

An undated image of the human brain taken through scanning technology. REUTERS/Sage Center for the Study of the Mind, University of California, Santa Barbara/Handout

Researchers at the King Abdullah University of Science and Technology (KAUST) are presenting new ways to improve the speed and performance of artificial intelligence (AI) by building a neural network that mimics the biology of the human brain on a microchip.

KAUST researchers have laid the groundwork for more efficient hardware-based artificial intelligence computing systems by developing a biomimicking “spiking” neural network on a microchip.

Neural Networks

Since the invention of the computer, scientists have hoped to create a computing system that mimics the human brain, making people's lives easier by handing tasks that demand time and effort over to machines and devices.

In the 1940s, scientists created the so-called Artificial Neural Networks (ANNs), a key pillar in the development and advancement of AI technologies; artificial intelligence is the practice of teaching machines to mimic human cognitive abilities and patterns. Inspired by biological neural networks, ANNs are mathematical models built to store data, scientific knowledge, and experimental information, which are then used to train machines to produce solutions to specific problems.

Artificial intelligence technology is rapidly evolving, with a flood of new applications including advanced automation, data mining and interpretation, healthcare, and marketing. These systems are built around a mathematical artificial neural network (ANN), which is made up of layers of decision-making nodes.

Labeled data is first fed into the system to “train” the model to respond in a specific way, after which the decision-making rules are locked in, and the model is deployed on standard computing hardware.
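
As a rough illustration of the "layers of decision-making nodes" idea (a minimal sketch in Python, not the researchers' code; the layer sizes, weights, and input below are arbitrary placeholders):

import numpy as np

# Toy two-layer feedforward ANN: each layer is a set of "decision-making
# nodes" that weight their inputs and pass the result on.
# All numbers here are arbitrary, for illustration only.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # 3 input features -> 4 hidden nodes
W2 = rng.normal(size=(2, 4))   # 4 hidden nodes  -> 2 output nodes

def forward(x):
    h = np.maximum(0.0, W1 @ x)    # hidden layer: weighted sum + ReLU
    return W2 @ h                  # output layer: weighted sum (class scores)

x = np.array([0.5, -1.0, 2.0])     # features of one hypothetical example
print(forward(x))

In training, labeled examples like x are used to adjust W1 and W2 until the outputs match the labels; after that the weights are locked in and the model is deployed.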

While this method works, it is a clumsy approximation of our brains’ far more complex, powerful, and efficient neural networks.

Spiking Neural Network (SNN)

“An ANN is an abstract mathematical model that bears little resemblance to real nervous systems and necessitates a lot of computing power,” explains Wenzhe Guo, a Ph.D. student on the research team.

“In contrast, a spiking neural network is built and operates in the same way as the biological nervous system, and it can process information faster and more efficiently,” he adds.

Third-generation ‘spiking’ neural networks were designed to bridge the gap between neuroscience and machine learning; they rely on a model that closely resembles how biological neurons compute. SNNs mimic the structure of the nervous system as a network of synapses that transmit information in the form of action potentials, or spikes, via ion channels as they occur.

This event-driven behavior, implemented mathematically as a “leaky integrate-and-fire” model, makes SNNs extremely energy efficient. Furthermore, the structure of interconnected nodes allows a high degree of parallelization, which increases processing power and efficiency, and it is amenable to direct implementation in computing hardware as a neuromorphic chip.
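
To make "leaky integrate-and-fire" concrete, here is a minimal single-neuron sketch in Python; the time constant, threshold, and input current are illustrative assumptions, not the parameters of the KAUST design. The membrane potential leaks back toward rest, integrates incoming current, and only emits an event (a spike) when it crosses the threshold:

import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron. Constants are illustrative.
dt, tau = 1.0, 20.0                        # time step and membrane time constant
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0  # rest, firing threshold, reset level

def simulate_lif(input_current):
    v, spikes = v_rest, []
    for t, i_in in enumerate(input_current):
        v += (dt / tau) * (-(v - v_rest) + i_in)   # leak + integrate
        if v >= v_thresh:                          # threshold crossed: fire
            spikes.append(t)
            v = v_reset                            # reset after the spike
    return spikes

print(simulate_lif(np.full(100, 1.5)))   # spike times for a constant drive

Between spikes the neuron produces no output at all, which is what makes the event-driven scheme so frugal with energy.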

“We used a standard low-cost FPGA (Field-programmable gate array) microchip and implemented a spike-timing-dependent plasticity model, which is a biological learning rule discovered in our brain,” Guo explains.

Importantly, this biological learning rule requires no teaching signals or labels, allowing the neuromorphic computing system to learn patterns in real-world data without supervised training.
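
As a rough sketch of the spike-timing-dependent plasticity idea (illustrative Python; the learning rates and time constant are assumptions, not the values used on the chip), a synapse is strengthened when the presynaptic spike arrives just before the postsynaptic one and weakened when the order is reversed:

import numpy as np

# Pair-based STDP weight update. A_plus, A_minus and tau are illustrative.
A_plus, A_minus, tau = 0.01, 0.012, 20.0

def stdp_update(w, t_pre, t_post):
    delta = t_post - t_pre
    if delta > 0:                         # pre before post -> potentiation
        w += A_plus * np.exp(-delta / tau)
    else:                                 # post before pre -> depression
        w -= A_minus * np.exp(delta / tau)
    return float(np.clip(w, 0.0, 1.0))    # keep the weight bounded

print(stdp_update(0.5, t_pre=10.0, t_post=15.0))   # weight increases
print(stdp_update(0.5, t_pre=15.0, t_post=10.0))   # weight decreases

Because the update depends only on the relative timing of spikes, no labels or teaching signals ever enter the rule, which is why the system can learn patterns directly from raw data.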

“Because SNN models are very complex,” Guo explains, “our main challenge was to tailor the neural network settings for optimal performance. We then designed the optimal hardware architecture while keeping cost, speed, and energy consumption in mind.”

The brain-on-a-chip developed by the team was found to be more than 20 times faster and 200 times more energy efficient than other neural network platforms.

“Our ultimate goal is to create a brain-like hardware computing system that is compact, fast, and low-energy. The next step is to collaborate to improve the design and optimize product packaging, miniaturize the chip, and customize it for various industrial applications,” Guo explains.
