Shaman Shetty

Neural-Inspired Computing Architectures: From Biology to Silicon

Introduction

Global energy and ecological pressures demand new approaches to computing. Traditional computing techniques invented many decades ago have carried the world into the modern era, but their gains in processing efficiency and energy consumption are approaching their limits. Meanwhile, Mother Nature has spent the past 3 billion years refining her own solutions to these concerns. The human brain processes information on about 20 watts of power, less than an ordinary light bulb requires, and that level of efficiency suggests an entirely new outlook on computing and, more particularly, on how information can be processed.

Neural-Inspired Computing constitutes a drastic reconstruction of computer architecture, transforming computers from a sequential processing model into massively parallel, distributed systems akin to the workings of the human brain. The promise is not a routine increase in performance but order-of-magnitude leaps in energy efficiency, together with the ability to learn and adapt.

What is Neural-Inspired Computing?

Neural-Inspired Computing, also known as neuromorphic computing, refers to computer hardware and architectures that replicate the biological neural structures of the human brain. In contrast to standard computers, where memory and processing are distinct from each other, these systems unify the two in a single structure, much as biological neurons integrate memory and processing at their synapses.

Core Principles

Neural-inspired computing is anchored on a few core principles:

Massively parallel processing: instead of moving data through a rigid, linear pipeline with memory and processing split apart, neural-inspired systems distribute computation across many simple units running concurrently, akin to the brain's vast web of neural connections.

Event-driven operation: components switch on only when information is made available. Because computation happens only when needed, these event-driven designs can yield impressive energy efficiency compared with traditional processors that clock continuously.

Co-located memory and computation: merging storage with calculation avoids constantly shuttling data between a separate memory and processor.

Fault tolerance: some components of the system may fail, yet it continues to work thanks to redundancy, providing robust and reliable operation. Distributing information processing across many components, similar to biological systems, makes the architecture inherently resilient.
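The event-driven principle can be illustrated with a toy simulation in which a neuron consumes compute only when a spike event reaches it, rather than updating on every clock tick. All names, weights, and thresholds below are illustrative assumptions, not a real neuromorphic API:

```python
from collections import defaultdict
import heapq

THRESHOLD = 1.0  # illustrative firing threshold

def run_events(events, connections, weight=0.6):
    """Process (time, neuron) spike events in time order; a neuron fires
    when its accumulated input crosses THRESHOLD, scheduling downstream
    events. Idle neurons consume no compute at all."""
    queue = list(events)
    heapq.heapify(queue)
    potential = defaultdict(float)
    fired = []
    while queue:
        t, n = heapq.heappop(queue)
        potential[n] += weight
        if potential[n] >= THRESHOLD:  # only active neurons do work
            potential[n] = 0.0
            fired.append((t, n))
            for target in connections.get(n, []):
                heapq.heappush(queue, (t + 1, target))  # 1-tick delay
    return fired

# Two input spikes drive neuron "a", which is wired to neuron "b".
fired = run_events([(0, "a"), (1, "a")], {"a": ["b"], "b": []})
print(fired)  # neuron "a" fires after its second input arrives
```

The key design point is that nothing happens between events: where a conventional processor would poll or clock every unit on every cycle, the event queue touches only the neurons that actually receive input.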

History & Development

Neural-inspired computing has a rich history of conceptual and technological advances, stretching back more than eight decades to its first conceptualization.

The groundwork for neural-inspired computation was laid in the 1940s by McCulloch and Pitts and was furthered by Rosenblatt's early implementations of neural networks, which led to the perceptron in 1957.

The shift in the 1980s from software simulations to analog CMOS VLSI circuits that model neurobiology, pioneered by Carver Mead, was a massive leap that inspired a wave of cutting-edge design. These insights, combined with the development of learning algorithms in the late 1990s and early 2000s, ushered in a new epoch of neuromorphic circuit design.

Progress has accelerated dramatically since 2010, with breakthroughs including memristors as artificial synapses and the development of large-scale neuromorphic chips, such as IBM's TrueNorth and the Intel Loihi processor.

Key Concepts and Algorithms

Spiking Neural Networks (SNNs)

Spiking neural networks are the third generation of neural networks, more biologically realistic than the first two generations of artificial neural networks. In SNNs, neurons communicate through discrete spikes instead of continuous values, which can be more energy-efficient and is closer to how biological neurons operate.

The dynamics of a spiking neuron are described by the membrane potential equation:

τ(dV/dt) = -(V-V_rest) + R∑I_syn(t)

Here V denotes the membrane potential, τ denotes the membrane time constant, V_rest the resting potential, R is the membrane resistance, and I_syn the synaptic currents.

Spike-Timing-Dependent Plasticity (STDP)

STDP is a biologically inspired learning rule whereby connections between neurons are potentiated or depressed according to the relative timing of their spikes. Such temporal learning rules underpin a wide variety of neural-inspired systems:

Δw = A+ * exp(-Δt/τ+) if Δt > 0
Δw = -A- * exp(Δt/τ-) if Δt < 0

Here, Δw is the change in synaptic weight, Δt is the time difference between pre- and post-synaptic spikes, and A+/- and τ+/- are parameters that shape the learning window.
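The STDP window above translates directly into code. A minimal sketch, with parameter values chosen purely for illustration (they are not from the article or any specific chip):

```python
import math

# Illustrative STDP parameters: amplitudes A+/- and time constants τ+/- (ms).
A_PLUS, A_MINUS = 0.1, 0.12
TAU_PLUS, TAU_MINUS = 20.0, 20.0

def stdp_delta_w(dt_ms):
    """Weight change Δw for a spike pair separated by Δt = t_post - t_pre.

    Δt > 0: the pre-synaptic spike precedes the post-synaptic spike,
    so the synapse is strengthened (potentiation).
    Δt < 0: post precedes pre, so the synapse is weakened (depression).
    """
    if dt_ms > 0:
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    elif dt_ms < 0:
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)
    return 0.0

print(stdp_delta_w(10.0))   # positive: potentiation
print(stdp_delta_w(-10.0))  # negative: depression
```

Note how the exponential makes the update largest for spike pairs that are close together in time, so causally related spikes dominate learning.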

Hardware Implementations

IBM TrueNorth


IBM's TrueNorth chip is a landmark in neural-inspired computing, featuring:
- 1 million digital neurons
- 256 million synapses
- Extremely low power consumption (on the order of 70 mW)
- Event-driven operation

Intel Loihi


Intel's Loihi processor takes the success of TrueNorth to the next level with features such as:
- On-chip learning capability
- Hierarchical connectivity
- Programmable neural parameters
- Support for various neural coding schemes

Software Frameworks and Algorithms

Modern neural-inspired computing relies on sophisticated software frameworks that implement a variety of neuron models and learning rules. The Leaky Integrate-and-Fire (LIF) model is a cornerstone that balances biological realism with computational efficiency:
dV/dt = (V_rest - V + RI)/τ

When V reaches the threshold V_th:
- The neuron emits a spike
- V resets to V_reset
- A refractory period begins
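The LIF dynamics above can be simulated with simple Euler integration. A minimal sketch: all numerical values (resting potential, time constant, resistance, step size) are illustrative assumptions, not parameters of any particular chip or framework, and the refractory period is omitted for brevity:

```python
# Leaky integrate-and-fire neuron via Euler integration of
# dV/dt = (V_rest - V + R*I)/tau, with threshold-and-reset spiking.
V_REST, V_RESET, V_TH = -65.0, -70.0, -50.0  # membrane potentials, mV
TAU = 10.0   # membrane time constant, ms
R = 10.0     # membrane resistance, MOhm
DT = 0.1     # integration step, ms

def simulate_lif(input_current, steps):
    """Drive the neuron with a constant current; return spike times in ms."""
    v = V_REST
    spikes = []
    for step in range(steps):
        dv = (V_REST - v + R * input_current) / TAU
        v += dv * DT                # Euler update of the membrane potential
        if v >= V_TH:               # threshold crossing: emit a spike
            spikes.append(step * DT)
            v = V_RESET             # reset (refractory period omitted)
    return spikes

spikes = simulate_lif(input_current=2.0, steps=5000)  # 2 nA for 500 ms
print(f"{len(spikes)} spikes in 500 ms")
```

With these numbers the steady-state potential under a 2 nA drive is V_rest + R*I = -45 mV, above the -50 mV threshold, so the neuron fires regularly; with zero input it stays at rest and never spikes.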

Applications and Use Cases


Neural-inspired computing has great potential in the following areas:

Robotics and Autonomous Systems: the event-driven, low-latency nature of neural-inspired systems makes them well suited to real-time processing of sensory information, enabling more efficient and responsive robots.

Pattern Recognition: these architectures excel at detecting patterns in complex, noisy data streams, which makes computer vision and speech recognition natural application areas.

Brain-Computer Interfaces: neural-inspired systems can interface more naturally with biological neural systems, promising breakthroughs in neuroprosthetics and neural rehabilitation.

Future Research Areas

Neural-inspired computing remains a cutting-edge field. Important research directions include:

  1. Novel Materials and Devices: Development of new materials and devices that better emulate synaptic behavior, such as phase-change memory and spintronic devices.

  2. Scalability: how to build and interconnect ever-larger networks of artificial neurons.

  3. Learning Algorithms: more efficient learning rules with biological plausibility, for better performance.

Recent Major Developments

[These are not purely neural-inspired technologies, but they are what led me to explore this topic.]

NVIDIA's Blackwell Architecture:
NVIDIA's latest chip, the 200-billion-transistor B200 'Blackwell', incorporates several principles inspired by neural computing, though it is not a purely neuromorphic processor like IBM's TrueNorth or Intel's Loihi.
What makes NVIDIA's technology interesting is that it strikes a middle ground between traditional computing and neural-inspired computing: its chips process information in parallel across thousands of cores, somewhat similar to how neurons work in parallel in our brains. However, they do this using digital circuits rather than the analog, spike-based approach used in pure neuromorphic computing.

Elon Musk's Neuralink:
Neuralink is a brain-computer interface (BCI) that uses a coin-sized implant to monitor and stimulate brain activity. The implant is designed to be fully implantable and cosmetically invisible.
Neuralink has developed custom chips that need to process neural signals in real-time, which requires some neural-inspired computing principles. They need to handle massive amounts of parallel inputs (from many neurons simultaneously) while consuming very little power - exactly the kind of problem that neural-inspired computing aims to solve.
What makes this exciting is that as the technology advances, we will learn more about real biological neural networks, which in turn should drive major advances in neural-inspired computing.

Conclusion

Neural-inspired computing is a fundamental reconceptualization of computer architecture, envisioning systems that approach the efficiency and adaptability of biological brains. Although much remains to be done, the field is moving rapidly, fueled by innovation in materials science, neuroscience, and computer engineering.

Resources for Further Learning

  1. "Neuromorphic Engineering: From Neural Systems to Brain-Like Engineered Systems" by Giacomo Indiveri and Timothy K. Horiuchi

  2. "Spiking Neuron Models: Single Neurons, Populations, Plasticity" by Wulfram Gerstner

  3. The Neuromorphic Computing Platform of the Human Brain Project
