Neuromorphic computing: The next big leap

May 15, 2020 | Articles

The human brain is one of the most complex and efficient systems nature has created. Our brains have enormous potential and are responsible for every innovation and discovery mankind prides itself on. Their ability to support rich cognitive function with low energy consumption and fast computation has inspired scientists for years. Can you imagine a simulation of a brain? The potential it could unlock, its efficiency, its range of applications. This may sound like a dream to some of you, but in reality it is not far away. Neuromorphic computing is a field with the potential to open doors to many future possibilities.

What Is Neuromorphic Computing?

Neuromorphic computing uses neural networks to parallel how the brain performs its functions, including making decisions and memorizing information. “Neuromorphic” can be translated as “taking the form of the brain”. Ideally, a neuromorphic device will have functions analogous to parts of the human brain.

AI uses rules and logic to draw reasoned conclusions within a specific, well-defined domain. This first-generation AI proved its worth in monitoring, reading data and performing predefined tasks with precision. The second generation integrates deep learning, which can sense, analyse and perceive.

Abstraction is the next big leap for AI. Interpretation and adaptation, in addition to finding context and understanding, will open up a whole new world of possibilities. Neuromorphic computing uses probabilistic computing to deal with the uncertainty and ambiguity inherent in real-time learning and decision making.
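The probabilistic computing mentioned above can be illustrated, in a very simplified form, with a Bayesian update: a belief is revised as noisy evidence arrives, instead of being treated as a hard true/false value. The sensor reliabilities below are made-up illustrative numbers, not values from any neuromorphic system.

```python
# A minimal sketch of reasoning under uncertainty: Bayesian updating of a
# belief about a binary event from a noisy binary sensor.

def bayes_update(prior, p_detect, p_false_alarm, observed):
    """Return P(event | observation) for a binary event and a noisy sensor."""
    if observed:
        likelihood_event = p_detect          # P(reading | event)
        likelihood_no_event = p_false_alarm  # P(reading | no event)
    else:
        likelihood_event = 1.0 - p_detect
        likelihood_no_event = 1.0 - p_false_alarm
    numerator = likelihood_event * prior
    evidence = numerator + likelihood_no_event * (1.0 - prior)
    return numerator / evidence

belief = 0.5  # start undecided
for reading in [True, True, False, True]:  # a noisy stream of observations
    belief = bayes_update(belief, p_detect=0.9, p_false_alarm=0.2,
                          observed=reading)
print(round(belief, 3))
```

Even with one contradictory reading in the stream, the belief ends up well above 0.5: uncertainty is accumulated gradually rather than decided by a single input.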

Origin of Neuromorphic Computing

Surprisingly, the concept of a neuromorphic chip was first described by Caltech professor Carver Mead back in 1990. Mead suggested that analog chips could mimic the electrical activity of neurons and synapses in the brain. Conventional chips keep transmissions at a fixed voltage, and power dissipation becomes a serious problem when today’s systems perform complex machine learning tasks. In comparison, neuromorphic chips would have low energy consumption: to provide a better operating environment and lower energy use, the chip would be event driven and operate only when needed.

Several companies have invested in this brain-inspired computing. Qualcomm made strides in the field in 2014 with a robot built around a neuromorphic chip; the robot performed tasks that would otherwise have required a specially programmed computer. IBM’s SyNAPSE chip, also introduced in 2014, has a neuromorphic-style architecture and consumes an incredibly low 70 mW of power. These companies also explore the technology for research purposes.

In 2012, Intel proposed a design in which Magnetic Tunnel Junctions act like the cell body of the neuron and Domain Wall Magnets act as the synapses. The CMOS detection and transmission unit is comparable to the axon of a biological neuron, which transmits electrical signals.

Advantages

Aside from low power consumption, neuromorphic devices are better at tasks that require pattern matching, such as those in self-driving cars. They can cater to applications that require the imitation of ‘thinking’. The architecture has a high degree of parallelism and can handle deep memory hierarchies in a uniform way. Today, parallelizing a task across more complicated neural network problems and scaling out across multiple nodes is limited to a few hundred nodes; neuromorphic computing would unlock a much higher degree of scalability.

Research Focuses

The key research challenge is matching the flexibility and learning ability of the human brain while also matching its energy efficiency. Spiking neural networks (SNNs) are a viable model that mimics the natural neural networks in the human brain. The components of a neuromorphic system must be logically analogous to neurons. By encoding information in both the signals and their timing, SNNs simulate the learning process by dynamically remapping synapses.
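As a rough illustration of the spiking units that SNNs are built from, here is a minimal leaky integrate-and-fire (LIF) neuron in plain Python. The threshold, leak and input values are arbitrary illustrative choices, not parameters of any real neuromorphic hardware.

```python
# A minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# leaks over time, integrates incoming current, and emits a spike (then
# resets) whenever it crosses a threshold. Events, not continuous values,
# carry the information.

def lif_run(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Simulate one LIF neuron over a list of input currents.
    Returns the spike train (1 = spike, 0 = silent)."""
    v = 0.0  # membrane potential
    spikes = []
    for i in input_current:
        v = leak * v + i      # leaky integration of the input
        if v >= threshold:    # threshold crossing -> emit a spike
            spikes.append(1)
            v = reset         # reset the potential after spiking
        else:
            spikes.append(0)
    return spikes

# A constant small input: the neuron integrates, spikes, resets, repeats.
print(lif_run([0.3] * 10))  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Note how the *timing* of the spikes, not their amplitude, encodes the strength of the input: a stronger input would simply make the neuron spike sooner and more often.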

Multi-Disciplinary Computing

Neuromorphic computing sits at the intersection of diverse disciplines such as computational neuroscience, machine learning, microelectronics, and computer architecture, spanning everything from experimentation and development to identifying applications that solve real-world problems.

The uncertainty and noise inherent in natural data are a key challenge for the advancement of AI. Algorithms must become adept at tasks based on natural data, which humans handle intuitively but computer systems find difficult.

Being able to understand and compute with uncertainty will enable intelligent applications across diverse AI domains. In medical imaging, for example, uncertainty measures can prioritize which images a radiologist should review first, with specific regions of concern highlighted.
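One simple, widely used way to quantify such uncertainty, shown here purely as an illustrative sketch, is the entropy of a classifier's output distribution: the higher the entropy, the less certain the prediction, and the sooner a human expert might want to take a look. The probability vectors below are invented examples.

```python
import math

def predictive_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability vector.
    0 bits = completely certain; log2(n) bits = completely undecided."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

confident = [0.97, 0.02, 0.01]  # model is nearly sure of class 0
uncertain = [0.40, 0.35, 0.25]  # model cannot really decide

# The uncertain prediction has much higher entropy, so it would be
# flagged for review before the confident one.
print(predictive_entropy(confident) < predictive_entropy(uncertain))  # True
```

Ranking inputs by a score like this is one common way to route only the ambiguous cases to a human, rather than all of them.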

The current state of AI enables systems to recognize and respond to their surroundings, for example to help avoid collisions. Fully autonomous driving, however, requires algorithms that account for uncertainty as well: decision making must involve understanding the environment and predicting future events, and the perception and understanding tasks need to be aware of the uncertainty inherent in them.

Examples of Neuromorphic Engineering Projects

Today, there are several academic and commercial experiments under way to produce working, reproducible neuromorphic models, including:

SpiNNaker is a supercomputer built from large numbers of small, low-power cores, designed to simulate cortical microcircuits. In August 2018, SpiNNaker ran the largest neural network simulation to date, involving around 80,000 neurons connected by 300 million synapses.

Intel is experimenting with a neuromorphic chip architecture called Loihi. Announced in 2017, it is designed to implement a spiking neural network, adding more brain-like characteristics.

Neuromorphic computing is a new and emerging field with a plethora of opportunities for the future. If this article has piqued your interest, go ahead and delve deeper. You could even look at this field as a career choice and become a pioneer of a new phase in artificial intelligence!

Written By Jyotsna Rajaraman

Hi! I'm Jyotsna, an electronics and communication undergrad who likes reading, writing, talking and learning. So when it comes to teaching, all my favorite things come together! I hope my articles have helped and taught you something. 🙂
