For starters
Today’s topic continues my interest in the combination of biology and computing, and it gripped me so much that I decided to break my usual schedule and post it mid-week; I hope you guys are okay with that. It all started with an article I read on Tuesday (I’ll talk about it later), and then, after talking to some friends who know more about the topic, I decided to connect the dots and present to you: Neuromorphic computing.
Okay, sounds like “Alien”. What is it?
Neuromorphic computing is a young area of computing that mimics the way the human brain works. It is still in its early stages but has already shown promise for applications such as pattern recognition and machine learning. Neuromorphic computers are designed to exploit the same massively parallel organization as the brain, which lets them perform complex computations very quickly.
The human brain is itself a fantastic computational machine. It can recognize patterns, learn from experience, and make decisions based on incomplete information. These abilities are largely due to the brain's massive parallelism – its ability to process huge amounts of data simultaneously. Researchers across the globe and big companies like Intel saw an open opportunity and jumped on it, which is why we’re seeing the first neuromorphic chips today and why the technology is advancing at quite a steady pace.
What’s the novelty?
Conventional computers are limited by their largely serial architecture, which processes instructions essentially one at a time. This bottleneck severely limits their ability to match the speed and flexibility of the human brain. Neuromorphic computing aims to overcome this limitation by emulating the brain’s massively parallel architecture. If this research succeeds, we’d be able to build what is, in effect, a parallel computer: one that handles many tasks simultaneously and is far better at pattern recognition.
In a neuromorphic computer, each processing unit (a neuron) is connected to many other units through synapses. This connectivity lets neurons communicate with each other directly, without going through a centralized control unit, which makes neuromorphic computers much faster and more flexible than conventional computers.
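To make that a bit more concrete, here’s a minimal, purely illustrative Python sketch of this kind of organization: a few leaky integrate-and-fire neurons wired to each other through weighted synapses, passing spikes directly to one another with no central controller in the loop. Every name and parameter here is a toy choice of mine, not the design of any real neuromorphic chip.

```python
import random

class Neuron:
    """A toy leaky integrate-and-fire neuron that talks to its peers directly."""
    def __init__(self, name, threshold=1.0, leak=0.9):
        self.name = name
        self.threshold = threshold   # membrane potential needed to fire
        self.leak = leak             # fraction of the potential kept each step
        self.potential = 0.0
        self.synapses = []           # outgoing connections: (target neuron, weight)

    def connect(self, target, weight):
        self.synapses.append((target, weight))

    def receive(self, current):
        self.potential += current

    def step(self):
        """Leak, and if the threshold is crossed, spike straight to connected neurons."""
        self.potential *= self.leak
        if self.potential >= self.threshold:
            self.potential = 0.0     # reset after firing
            for target, weight in self.synapses:
                target.receive(weight)
            return True              # this neuron spiked
        return False

# Wire up a tiny random network: every neuron gets synapses to a couple of others.
neurons = [Neuron(f"n{i}") for i in range(5)]
for n in neurons:
    for target in random.sample([m for m in neurons if m is not n], 2):
        n.connect(target, weight=random.uniform(0.3, 0.8))

# Inject some input and let activity propagate, with no central controller in sight.
neurons[0].receive(1.5)
for t in range(5):
    spikes = [n.name for n in neurons if n.step()]
    print(f"step {t}: spiked -> {spikes}")
```

The point of the sketch is the wiring, not the numbers: each neuron only knows about its own synapses, and activity spreads through those direct connections rather than being orchestrated by one central processor.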
Potential usage
One potential application for neuromorphic computing is pattern recognition. The human brain is extremely good at recognizing patterns, even when those patterns are obscured or incomplete. Neuromorphic computers could be used to automatically identify objects in images or videos, for example.
Another potential application is machine learning. Machine learning algorithms need lots of data to learn effectively – often more than a single computer can handle efficiently. If multiple neuromorphic computers were used together, though, they could share data and learn from each other, which would let machine learning algorithms scale up and become more powerful than ever before.
How it all started
The reason I wrote this post is an article by Quanta Magazine describing new findings in neuromorphic chip design. In short, it tells the story of researchers who created NeuRRAM, a new chip built on RRAM (resistive RAM). It tackles one of the key problems of neuromorphic computing: computational capacity. These chips are superior in energy efficiency, but they were long overlooked because they couldn’t compete in raw “power”. Now that the research by Weier Wan, Philip Wong and Gert Cauwenberghs is done, we can see new dimensions in which such a chip can be used. The article points out that work on the problems resolved in this paper goes all the way back to 1964, but only now has a conclusive answer been given and the technology been optimized enough to be sizeably better on all fronts.
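For a rough intuition of why resistive memory is such a good fit here, below is a tiny Python sketch of the compute-in-memory idea behind chips like NeuRRAM, as I understand it: the weights live in the memory array as conductances, input voltages are applied to the rows, and the currents that pile up on each column amount to a matrix-vector multiplication computed right where the data is stored. The numbers and the function name are illustrative assumptions of mine, not anything taken from the paper.

```python
import numpy as np

def crossbar_mvm(conductances, input_voltages):
    """Toy model of an RRAM crossbar doing a matrix-vector multiply in place.

    Each weight is stored as a conductance G[i][j]; applying voltage V[i] to
    row i drives a current G[i][j] * V[i] into column j (Ohm's law), and the
    column wire sums those currents (Kirchhoff's current law). Reading the
    column currents therefore yields V @ G without ever moving the weights
    out of memory.
    """
    return input_voltages @ conductances  # column currents = dot products

# Hypothetical 3x2 crossbar: 3 input rows, 2 output columns.
G = np.array([[0.2, 0.7],
              [0.5, 0.1],
              [0.9, 0.4]])
V = np.array([1.0, 0.0, 0.5])   # input voltages applied to the rows

print(crossbar_mvm(G, V))       # -> [0.65, 0.9]
```

No data shuttles back and forth between memory and a separate processor, which is where the energy-efficiency advantage the article talks about comes from.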
Afterthoughts
I won’t give any home reading today, but I’d highly advise reading the article and the follow-ups mentioned in it, because chances are these chips are something we’re going to see much more of in the news over the next few years.
Thanks for reading,
Always yours,
Sean