EBRAINS Researchers Introduce A Robot Whose Internal Workings Mimic Human Brain

The human brain contains roughly 86 billion neurons that process information from the senses and body and send messages back to the body. Human intelligence therefore remains one of the most intriguing capabilities AI scientists are looking to replicate. 

A team of researchers at the new EBRAINS research infrastructure is building robots whose internal workings mimic the brain, an approach the team expects will yield new insights into neural mechanisms.

Led by Cyriel Pennartz, a Professor of Cognition and Systems Neurosciences at the University of Amsterdam, the team brings together cognitive neuroscientists, computational modelers, and roboticists to create complex neural network architectures for perception. The architecture is built on real-life data from rats. The scientists believe that understanding the brain, and applying that knowledge, will help create better robots. 

They named the model “MultiPrednet.” It consists of a visual input module, a tactile input module, and a third module that merges the two. With this model, they were able to replicate how the brain makes predictions across different senses, enabling the network to predict how something would feel from how it looks, and vice versa.
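To make the three-module idea concrete, here is a minimal sketch of such a cross-modal architecture in Python with NumPy. The class name, layer sizes, and the use of simple linear encoders with an averaging merge layer are all illustrative assumptions, not the actual MultiPrednet implementation; the point is only that a shared latent code lets one modality be decoded from the other.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_module(n_in, n_latent):
    """Random linear encoder/decoder pair for one sensory modality."""
    return {
        "enc": rng.normal(0.0, 0.1, (n_latent, n_in)),
        "dec": rng.normal(0.0, 0.1, (n_in, n_latent)),
    }

class MultiModalPredNet:
    """Toy three-module network (hypothetical simplification):
    a visual module, a tactile module, and a merge step over a
    shared latent space that allows cross-modal prediction."""

    def __init__(self, n_vis=16, n_tac=8, n_latent=4):
        self.vis = init_module(n_vis, n_latent)
        self.tac = init_module(n_tac, n_latent)

    def merge(self, z_vis, z_tac):
        # Shared representation: here simply the mean of both codes.
        return 0.5 * (z_vis + z_tac)

    def predict_touch_from_vision(self, x_vis):
        z = self.vis["enc"] @ x_vis   # encode the visual input
        return self.tac["dec"] @ z    # decode it as a tactile pattern

    def predict_vision_from_touch(self, x_tac):
        z = self.tac["enc"] @ x_tac   # encode the tactile input
        return self.vis["dec"] @ z    # decode it as a visual pattern
```

Because both decoders read from the same latent space, a visual observation can be turned into an expected tactile pattern, which mirrors the “predict how something would feel from looking at it” behavior described above.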

Like our brains, these networks train by constantly making predictions about the environment, comparing those predictions with the actual sensory inputs, and then adjusting the network to minimize future error signals.
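That predict-compare-adjust loop can be sketched in a few lines. The following is an illustrative predictive-coding toy in NumPy, not the EBRAINS code: a linear model predicts each sensory input, the prediction error is computed, and both the latent codes and the weights are nudged to shrink that error. The function name, learning rate, and factorization structure are assumptions made for the example.

```python
import numpy as np

def train_predictive_coder(inputs, n_latent=4, lr=0.01, epochs=1000):
    """Minimal predictive-coding loop: predict the sensory input,
    compare with the actual input, adjust to reduce the error."""
    rng = np.random.default_rng(1)
    n_samples, n_in = inputs.shape
    W = rng.normal(0.0, 0.1, (n_in, n_latent))        # decoder weights
    Z = rng.normal(0.0, 0.1, (n_samples, n_latent))   # latent codes

    for _ in range(epochs):
        pred = Z @ W.T        # 1. predict the sensory input
        err = inputs - pred   # 2. compare with the actual input
        Z += lr * err @ W     # 3. adjust latents to reduce the error
        W += lr * err.T @ Z   # 4. adjust weights to reduce the error

    return np.mean(err ** 2)  # final prediction error
```

After training, the mean squared prediction error is lower than at the start, which is the sense in which the network has learned to anticipate its inputs.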

The researchers, together with Martin Pearson at Bristol Robotics Lab, evaluated how MultiPrednet performs in a body. For this, they integrated it into Whiskeye, a rodent-like robot that autonomously explores its environment.

When testing MultiPrednet on navigation and recognition of familiar scenes, the team found that the brain-based model performs markedly better than conventional deep learning systems. 

The team recreated the robots as a simulation on the Neurorobotics Platform of the EBRAINS research infrastructure, allowing them to perform long-duration or even parallel experiments under controlled conditions, thereby accelerating their study. In addition, they plan to employ High Performance and Neuromorphic Computing Platforms soon to create detailed models of control and perception.


The code and tools used in the study are openly available on EBRAINS, allowing researchers worldwide to run their own experiments. 

HBP’s Scientific Research Director, Katrin Amunts, stresses that to understand cognition, the brain must be studied as part of a body acting in an environment. Connecting cognitive neuroscience with robotics lets researchers examine closely how the brain behaves in a body. 

The innovations that connect insights from brain science to AI and robotics will be advantageous in building future robots, says Pawel Swieboda, EBRAINS CEO and HBP’s Director-General.

Source: https://techxplore.com/news/2021-06-robot-ebrains-combine-vision.html

MultiPrednet model: https://search.kg.ebrains.eu/instances/Model/2164c2b9bbb66b42ce358d108b5081ce
