Researchers at the University of Edinburgh and Zhejiang University have developed a way to combine deep neural networks (DNNs) into a new system that learns to generate adaptive skills.
Deep neural networks learn functions by being trained repeatedly on many examples. They are used in a wide range of AI applications, from identifying faces in a crowd to assessing a loan applicant's creditworthiness.
The scientists combined several DNNs, each developed for a different task, into a single system that inherits the capabilities of all of its constituent networks. According to the reports, the new system is more than the sum of its parts: it can learn functions that none of the individual DNNs could perform alone. The researchers call it a multi-expert learning architecture (MELA), because it derives adaptive skills from a combination of representative expert skills.
Building the system began with training separate DNNs for different functions. For instance, one DNN was taught to make a robot trot and another to navigate around obstacles. Once trained, the DNNs were connected to a neural gating network that gradually learned to call on each expert whenever a situation required its particular skill. The resulting system can carry out all of the skills of the combined DNNs.
As MELA discovered its constituent experts and their respective capabilities, it learned by trial and error to use them together in ways it was never explicitly taught. MELA blends the expert DNNs, dynamically synthesizing a new DNN to generate adaptive responses to unfamiliar situations. For example, it discovered how to combine getting up after a fall with walking on a slippery floor, and how to keep functioning when one of its motors broke.
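The blending idea described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: a gating network maps the robot's observation to a set of blending coefficients, and a new policy network is synthesized on the fly as the weighted sum of the experts' parameters. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax, so blend weights are positive and sum to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

class MELASketch:
    """Toy multi-expert blend (illustrative, single-layer policies).

    expert_weights: list of (obs_dim, act_dim) matrices, one per trained expert.
    gate_weights:   (obs_dim, n_experts) matrix for the gating network.
    """

    def __init__(self, expert_weights, gate_weights):
        self.expert_weights = expert_weights
        self.gate_weights = gate_weights

    def act(self, obs):
        # Gating network: state-dependent blending coefficients.
        alpha = softmax(obs @ self.gate_weights)
        # Synthesize a new policy by blending the experts' parameters.
        blended = sum(a * W for a, W in zip(alpha, self.expert_weights))
        # The synthesized network maps the observation to an action.
        return np.tanh(obs @ blended), alpha

# Usage: three random "experts" blended by a random gate.
rng = np.random.default_rng(0)
experts = [rng.standard_normal((4, 2)) for _ in range(3)]
gate = rng.standard_normal((4, 3))
action, alpha = MELASketch(experts, gate).act(rng.standard_normal(4))
```

Because the coefficients come from a softmax, the synthesized policy always lies in the convex hull of the expert parameters, which is one simple way a system could interpolate between, say, trotting and fall recovery as conditions change.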
This method combines the benefits of pre-trained expert skills with fast online synthesis of adaptive policies, generating responsive motor skills during dynamic tasks. The work marks a significant step in robotics research, offering a new model in which a robot can cope with problems it has never experienced before.