MIT Researchers Working On Analog Deep Learning Introduce New Hardware Powered By Ultra-Fast Protonic Programmable Resistors That Use Far Less Energy

The amount of time, effort, and resources needed to train increasingly complicated neural network models is soaring as more machine learning experiments are run. To combat this, a new branch of artificial intelligence called “analog deep learning” is on the rise, promising faster processing with far less energy consumption. Just as transistors are the essential components of digital computers, programmable resistors are the fundamental building blocks of analog deep learning. By repeating arrays of programmable resistors in intricate layers, researchers have built networks of analog artificial “neurons” and “synapses” that can perform calculations much like a digital neural network. Such a network can then be trained on complex AI tasks such as image recognition and natural language processing.

A multidisciplinary team of MIT researchers set out to increase the speed of a particular kind of artificial analog synapse they had previously created. By using a practical inorganic material in the fabrication process, they made their devices run a million times faster than earlier iterations, which is also roughly a million times faster than the synapses in the human brain. This inorganic material also makes the resistor exceptionally energy efficient. Unlike the materials used in the earlier version of their device, the new material is compatible with silicon fabrication techniques. This change has made it possible to fabricate devices at the nanometer scale and may open the door to their integration into commercial computing hardware for deep-learning applications.

Analog deep learning is faster and more efficient than its digital counterpart for two key reasons. First, computation is carried out in memory, so massive amounts of data never have to be repeatedly transported from memory to a processor. Second, analog processors carry out their operations in parallel. A protonic programmable resistor is the main component of MIT’s new analog processor technology, which performs analog machine learning by varying the electrical conductivity of these protonic resistors. Learning in the human brain occurs through the strengthening and weakening of synapses, the connections between neurons; deep neural networks have long used this analogy, with training procedures adjusting the network weights.
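The in-memory, parallel computation described above can be illustrated with a toy model of a resistive crossbar: each stored weight is a conductance, input activations are applied as voltages, and by Ohm’s and Kirchhoff’s laws each output wire’s current is the dot product of the voltage vector with that column of conductances. This is a minimal sketch of the general crossbar principle, not the MIT team’s actual device; all values below are illustrative.

```python
import numpy as np

# Conductances (siemens) stored in a 3x2 crossbar: these ARE the weights,
# so the matrix-vector product happens where the data lives (in memory).
G = np.array([[1e-6, 2e-6],
              [3e-6, 1e-6],
              [2e-6, 4e-6]])

# Input activations encoded as voltages (volts) applied to the row wires.
V = np.array([0.5, 1.0, 0.2])

# Ohm's law gives each cell's current V_i * G_ij; Kirchhoff's current law
# sums those currents on each column wire -- every column computes its
# dot product simultaneously, in one analog step.
I = V @ G  # output currents, one per column

print(I)
```

The key point: the multiply-accumulate that dominates neural-network inference falls out of basic circuit physics, with no data movement between a memory and a separate arithmetic unit.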

The motion of protons governs the device’s conductance: protons conducted through an electrolyte into the channel increase it. To create a programmable protonic resistor that is both extremely fast and highly energy efficient, the team investigated various electrolyte materials, ultimately settling on inorganic phosphosilicate glass (PSG), which is essentially silicon dioxide. PSG is an ideal solid electrolyte for this application because it exhibits strong proton conductivity at room temperature without requiring water. It also enables rapid proton transport, since it contains many nanometer-sized pores whose surfaces act as pathways for proton diffusion.
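A rough behavioral model of such a device: voltage pulses drive protons through the electrolyte into or out of the active layer, and each pulse nudges the channel conductance up or down, which is how an analog weight would be programmed. The class name, conductance range, and step size below are made up for illustration; they are not parameters from the paper.

```python
class ProtonicResistorModel:
    """Toy model: conductance rises or falls with each programming pulse,
    mimicking protons being pushed into or pulled out of the channel."""

    def __init__(self, g_min=1e-6, g_max=1e-5, step=1e-7):
        self.g_min, self.g_max, self.step = g_min, g_max, step
        self.g = g_min  # start in the low-conductance state

    def pulse(self, polarity):
        """polarity=+1 injects protons (potentiate), -1 removes them (depress);
        conductance is clamped to the device's physical range."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))
        return self.g

r = ProtonicResistorModel()
for _ in range(10):
    r.pulse(+1)   # ten potentiating pulses
print(r.g)        # conductance has risen by ten steps
```

In this picture, “training” the analog network means applying pulse sequences that move each resistor’s conductance toward its target weight.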

PSG can also withstand extremely powerful pulsed electric fields, which let protons move at tremendous speeds and make the device a million times faster than the team’s previous version. Because the protons do not damage the material, the resistor can run for millions of cycles without breaking down. Additionally, it functions efficiently at ambient temperature, making it appropriate for integration into computing hardware.

The researchers next intend to reengineer the programmable resistors for high-volume manufacturing, studying the properties of resistor arrays and scaling them up so they can be embedded into systems. They also plan to study the materials to remove bottlenecks that limit the voltage needed to efficiently drive protons into, through, and out of the electrolyte. The researchers believe their work will be central to future innovation; they acknowledge that the road ahead will be difficult but are very optimistic about the prospects. The MIT-IBM Watson AI Lab also contributes to the funding of their research.

This article is written as a summary article by Marktechpost Staff based on the research paper 'Nanosecond protonic programmable resistors for analog deep learning'. All credit for this research goes to the researchers on this project. Check out the paper and reference article.


Khushboo Gupta is a consulting intern at Marktechpost. She is currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Goa. She is passionate about the fields of Machine Learning, Natural Language Processing, and Web Development. She enjoys learning more about the technical field by participating in several challenges.