Google DeepMind Introduces GNoME: A New Deep Learning Tool that Dramatically Increases the Speed and Efficiency of Discovery by Predicting the Stability of New Materials

Inorganic crystals are essential to many contemporary technologies, including computer chips, batteries, and solar panels. Stability matters because an unstable crystal decomposes instead of enabling a new technology, and finding each new, stable crystal has traditionally taken months of meticulous experimentation.

Researchers have long relied on costly trial-and-error experiments that yielded only limited results, seeking new crystal structures by modifying known crystals or trying new combinations of elements. Computational approaches, spearheaded by the Materials Project and others, have found 28,000 novel materials over the past decade. Until now, however, a major limitation has been the inability of emerging AI-guided techniques to reliably predict which materials are experimentally viable.

Researchers from Google DeepMind and the Lawrence Berkeley National Laboratory have published two papers in Nature demonstrating the potential of AI predictions for autonomous material synthesis. The work reports the discovery of 2.2 million new crystals, equivalent to approximately 800 years' worth of knowledge. Their new deep learning tool, Graph Networks for Materials Exploration (GNoME), predicts the stability of novel materials, greatly improving the speed and efficiency of discovery. GNoME exemplifies the promise of AI in the large-scale discovery and development of novel materials. Separate yet contemporaneous efforts by scientists in laboratories around the world have already produced 736 of these novel structures.

GNoME has roughly doubled the number of technically viable materials known. Among its 2.2 million predictions, 380,000 are stable and therefore show the greatest promise for experimental synthesis. These candidates include materials with the potential to create next-generation batteries that improve the efficiency of electric vehicles, and superconductors that could power supercomputers.

GNoME is a state-of-the-art graph neural network (GNN) model. Because a GNN's input is a graph whose connections are analogous to the bonds between atoms, GNNs are well suited to discovering novel crystalline materials.
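To make the graph analogy concrete, here is a minimal, hypothetical sketch of turning a crystal fragment into graph input: atoms become nodes (labeled by element) and pairs of atoms within a distance cutoff become edges carrying the interatomic distance as a feature. The function name, cutoff, and the NaCl-like coordinates below are illustrative assumptions, not GNoME's actual API or featurization.

```python
from itertools import combinations

def build_crystal_graph(atoms, cutoff=3.0):
    """atoms: list of (element, (x, y, z)) tuples; returns (nodes, edges).

    Nodes are element labels; edges connect atom pairs closer than
    `cutoff` (in angstroms), with the distance kept as an edge feature.
    """
    nodes = [element for element, _ in atoms]
    edges = []
    for (i, (_, pi)), (j, (_, pj)) in combinations(enumerate(atoms), 2):
        dist = sum((a - b) ** 2 for a, b in zip(pi, pj)) ** 0.5
        if dist <= cutoff:
            edges.append((i, j, round(dist, 3)))  # (node_i, node_j, distance)
    return nodes, edges

# A tiny NaCl-like square fragment (coordinates in angstroms, illustrative)
atoms = [
    ("Na", (0.0, 0.0, 0.0)),
    ("Cl", (2.8, 0.0, 0.0)),
    ("Na", (2.8, 2.8, 0.0)),
    ("Cl", (0.0, 2.8, 0.0)),
]
nodes, edges = build_crystal_graph(atoms, cutoff=3.0)
print(nodes)  # ['Na', 'Cl', 'Na', 'Cl']
print(edges)  # only nearest-neighbour Na-Cl pairs survive the cutoff
```

A GNN then learns by passing messages along these edges, which is why the representation maps so naturally onto atomic structure.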

GNoME was initially trained on data about crystal structures and their stability that is publicly available through the Materials Project. Its efficiency was significantly improved by 'active learning' as a training method: the researchers generated new crystal candidates and predicted their stability with GNoME, then repeatedly checked the model's performance across progressive training cycles using Density Functional Theory (DFT), a well-established computational method in physics, chemistry, and materials science for understanding atomic structures and evaluating crystal stability. The resulting high-quality data was fed back into model training.
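The cycle described above can be sketched as a toy active-learning loop. Everything here is a stand-in: `surrogate_score`, `dft_check`, the random candidate generator, and the toy stability threshold are hypothetical placeholders for the GNoME network, full DFT relaxations, and real crystal candidates.

```python
import random

random.seed(0)

def dft_check(candidate):
    """Expensive oracle (stand-in for a DFT calculation): True if 'stable'."""
    return candidate["energy"] < 0.1  # toy criterion, e.g. energy above hull

def surrogate_score(candidate, model_error):
    """Cheap model prediction of stability; noisier when model_error is large."""
    return candidate["energy"] + random.gauss(0, model_error)

training_data = []
model_error = 0.5  # crude proxy for the surrogate model's error

for cycle in range(3):  # progressive training cycles
    # 1. Generate new crystal candidates (here: random toy energies)
    candidates = [{"energy": random.uniform(-0.5, 0.5)} for _ in range(100)]
    # 2. Rank them with the cheap surrogate and shortlist the most promising
    shortlist = sorted(candidates, key=lambda c: surrogate_score(c, model_error))[:10]
    # 3. Verify the shortlist with the expensive oracle (DFT stand-in)
    labeled = [(c, dft_check(c)) for c in shortlist]
    # 4. Feed the verified, high-quality labels back into training;
    #    here "retraining" is just shrinking the surrogate's error
    training_data.extend(labeled)
    model_error *= 0.5
    hits = sum(stable for _, stable in labeled)
    print(f"cycle {cycle}: {hits}/10 shortlisted candidates verified stable")
```

The design point is the division of labor: the cheap surrogate filters a huge candidate pool so the expensive oracle is only spent on the shortlist, and each round of verified labels makes the surrogate a better filter.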

The findings show that the precision of GNoME's stability predictions rose from approximately 50% to 80%, measured against an external benchmark set by earlier state-of-the-art models. Improvements to the model's efficiency boosted the discovery rate from below 10% to over 80%; gains of this kind can substantially reduce the computing power needed per discovery.

The autonomous lab produced 41 novel materials using ingredients from the Materials Project and stability information from GNoME, paving the way for further advances in AI-driven materials synthesis.

GNoME's predictions have been released to the scientific community. The researchers are contributing the 380,000 stable materials to the Materials Project, which is analyzing the compounds and adding them to its online database. With these resources, they hope the community will study inorganic crystals further and realize the potential of machine learning as a guide for experiments.

Check out Paper 1, Paper 2, and the reference article. All credit for this research goes to the researchers of this project.

Dhanshree Shenwai is a computer science engineer with solid experience at FinTech companies covering the financial, cards & payments, and banking domains, and a keen interest in applications of AI. She is enthusiastic about exploring new technologies and advancements that make everyone's life easier in today's evolving world.