Understanding Neuromorphic Computing: The Next Generation of AI


Neuromorphic computing, as the name suggests, uses a model inspired by the workings of the brain. The brain makes an appealing model for computing: it is compact enough to fit inside a head, unlike most supercomputers, which fill entire rooms. Brains also need far less energy than supercomputers. The human brain uses around 20 watts, whereas the Fugaku supercomputer uses about 28 megawatts. And while supercomputers need large cooling systems, the brain sits neatly inside the skull at 37°C. So, with traditional models of computing struggling to keep pace, harnessing the techniques our brains use could lead to vastly more powerful computers in the future.

The need for neuromorphic systems

Most hardware today is based on the von Neumann architecture, which separates memory from processing. Because von Neumann chips must shuttle information back and forth between memory and the CPU, they waste time and energy, a problem known as the von Neumann bottleneck. Over time, von Neumann architectures will find it increasingly hard to deliver the compute-power increases we need.

Hence, to keep up, a new type of non-von Neumann architecture will be required: a neuromorphic architecture. Both neuromorphic systems and quantum computing have been proposed as solutions, but neuromorphic, i.e., brain-inspired, computing is likely to be commercialized sooner.

Benefits of the brain over the von Neumann system

  • Unlike the serial computing of the von Neumann system, brains use massively parallel computing. 
  • They are also more fault-tolerant than computers.

Researchers are hoping to replicate both of these advantages in neuromorphic systems.

How can we make a computer that works like the human brain?

Neuromorphic computing models how the brain works through spiking neural networks. Conventional computing is based on transistors that are either on or off, one or zero. Spiking neurons, by contrast, convey information both temporally (when a spike occurs) and spatially (which neuron fires), as the brain does, producing far richer outputs than a single binary value. Neuromorphic systems can be either digital or analog, with the role of synapses played by either software or memristors.
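As a rough illustration of the spiking idea, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron in Python. The threshold, leak factor, and input values are arbitrary assumptions for illustration, not parameters of any particular neuromorphic chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# All constants (threshold, leak, inputs) are illustrative assumptions.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the spike train (0/1 per time step) for a stream of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # leaky integration of input
        if potential >= threshold:               # fire once the threshold is crossed
            spikes.append(1)
            potential = 0.0                      # reset after a spike
        else:
            spikes.append(0)
    return spikes

# The same total input arriving at different times yields different spike timing,
# which is exactly the temporal dimension binary logic lacks:
print(simulate_lif([0.5, 0.5, 0.5, 0.0, 0.0]))  # → [0, 0, 1, 0, 0]
print(simulate_lif([0.0, 0.0, 0.5, 0.5, 0.5]))  # → [0, 0, 0, 0, 1]
```

Unlike a transistor's single on/off state, the neuron's output depends on the timing of its inputs as well as their magnitude.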

Memristors can also mimic a synapse's ability to store information as well as transmit it. Because a memristor can hold a range of values rather than only one and zero, it can reproduce the varying strength of the connection between two neurons. Changing the weights of these artificial synapses is what allows brain-based systems to learn. Other candidate synapse technologies include phase-change memory, resistive RAM, spin-transfer torque magnetic RAM, and conductive bridge RAM. Researchers are also exploring other ways to model the brain's synapse, such as quantum dots and graphene.
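To see how an analog weight differs from a binary bit, the sketch below models a memristor-like synapse whose stored value is nudged up or down by a simple Hebbian-style rule ("fire together, wire together"). The learning rate, bounds, and update rule are illustrative assumptions, not a model of any real memristor device:

```python
# Sketch of a memristor-like synapse: a weight that takes a continuous
# range of values (unlike a binary transistor) and is adjusted to learn.
# Learning rate, bounds, and update rule are illustrative assumptions.

class AnalogSynapse:
    def __init__(self, weight=0.5, learning_rate=0.1):
        self.weight = weight              # conductance-like value kept in [0, 1]
        self.learning_rate = learning_rate

    def transmit(self, pre_spike):
        """Scale the presynaptic spike (0 or 1) by the stored weight."""
        return pre_spike * self.weight

    def update(self, pre_spike, post_spike):
        """Hebbian-style rule: strengthen when both sides fire, weaken otherwise."""
        if pre_spike and post_spike:
            self.weight += self.learning_rate
        elif pre_spike or post_spike:
            self.weight -= self.learning_rate
        self.weight = min(1.0, max(0.0, self.weight))  # stay within physical bounds

synapse = AnalogSynapse()
for _ in range(3):
    synapse.update(pre_spike=1, post_spike=1)  # correlated activity strengthens
print(round(synapse.weight, 2))  # → 0.8
```

The key point is that the weight drifts smoothly through intermediate values, which is what a multi-level memristor offers in hardware and a plain on/off transistor cannot.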

Possible uses of neuromorphic systems

  • For heavy computing tasks, edge devices like smartphones currently rely on the cloud: the query is sent off, processed remotely, and the answer returned to the device. With neuromorphic systems, that query could be handled on the device itself, with no back-and-forth. 
  • But perhaps the most significant driving force for investment in neuromorphic computing is the promise it holds for AI. Today's AI tends to be heavily rules-based, trained on datasets to produce a particular outcome. The human brain handles such tasks differently: our grey matter is far more comfortable with ambiguity and flexibility.
  • It’s expected that the next generation of AI could tackle more brain-like problems, such as constraint satisfaction.

Making this successful will pose many challenges: fundamental changes to computing norms, hardware, and software, including rethinking programming languages themselves. Much also remains to be learned about the brain, and researchers are working on both fronts as they refine neuromorphic designs.

Sources:

  • https://www.intel.com/content/www/us/en/research/neuromorphic-computing.html
  • https://www.zdnet.com/article/what-is-neuromorphic-computing-everything-you-need-to-know-about-how-it-will-change-the-future-of-computing/
