Facebook’s ‘Expire-Span’ Tool Enables Machine Learning Models to Forget Irrelevant Data

Facebook has recently developed an AI tool that allows machine learning models to preserve certain information while forgetting the rest. It claims that the tool, Expire-Span, can predict the information most relevant to the task at hand, thereby allowing AI systems to process data at larger scales.

Unlike humans, AI models conventionally memorize information without distinction, so giving software the ability to decide what to forget is challenging. State-of-the-art models usually struggle with large quantities of information, such as books or videos, and incur high computing costs. This can also lead to problems such as catastrophic forgetting (also called catastrophic interference), a situation where AI systems fail to recall what they have learned from earlier training data.


Many of the solutions proposed for this problem focus on compression: historical information is compressed into smaller chunks. However, the resulting blurry versions of memory can significantly hurt the accuracy of a model's predictions.

Facebook has proposed an alternative: Expire-Span, a model that gradually forgets irrelevant information. The model works by first predicting which information is most relevant based on context. Next, it assigns each piece of information an expiration date, so that when the date passes, the data is deleted.
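The expire-and-delete idea can be illustrated with a toy memory cache. This is a simplified sketch, not Facebook's implementation: the `relevance` score here is a hypothetical stand-in for the relevance prediction that the real model learns from data.

```python
# Toy sketch of expiration-based forgetting. In Expire-Span the expiration
# date is predicted by the network; here `relevance` is passed in by hand.

class ExpiringMemory:
    def __init__(self, max_span=100):
        self.max_span = max_span
        self.items = []  # list of (content, expiry_step) pairs

    def add(self, content, relevance, step):
        # More relevant items receive later expiration dates
        expiry = step + int(relevance * self.max_span)
        self.items.append((content, expiry))

    def tick(self, step):
        # Drop every memory whose expiration date has passed
        self.items = [(c, e) for c, e in self.items if e > step]


mem = ExpiringMemory()
mem.add("key fact", relevance=0.9, step=0)  # expires at step 90
mem.add("filler", relevance=0.1, step=0)    # expires at step 10
mem.tick(step=50)
print([c for c, _ in mem.items])            # only "key fact" remains
```

Deleting expired items outright is what lets the memory stay small as the input stream grows, instead of storing everything.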

According to Facebook, Expire-Span has achieved leading results on a benchmark for character-level language modeling and showed improved efficiency across long-context workloads in language modeling, reinforcement learning, object collision, and algorithmic tasks.

Expire-Span attempts to induce intrinsic forgetting in AI, capturing something like the brain's forgetting process in software form. To calculate an expiration date for each word, image, video frame, or other piece of information, Expire-Span decides how long that item should be retained in memory each time a new piece of data is presented. The decay is gradual rather than abrupt, which is essential to keeping important information intact without blurring it. Expire-Span makes these predictions based on context learned from the data and influenced by the surrounding memories.

Expire-Span can scale to tens of thousands of pieces of information while retaining less than a thousand bits of it. The researchers now plan to investigate how the underlying techniques might help incorporate different types of memories into AI systems. Expire-Span could eventually help AI systems retain the information that matters most for long-range tasks.


Code: https://github.com/facebookresearch/transformer-sequential

Paper: https://arxiv.org/pdf/2105.06548.pdf

Source: https://ai.facebook.com/blog/teaching-ai-how-to-forget-at-scale/
