Facebook AI Introduces A New Self-Supervised Learning Framework For Model Selection And Hyperparameter Tuning For Large-Scale Forecasting

Source: https://arxiv.org/pdf/2102.05740.pdf

Researchers at Facebook AI have recently released a new self-supervised learning framework for model selection (SSL-MS) and hyperparameter tuning (SSL-HPT), which provides accurate forecasts with less computational time and resources. The SSL-HPT algorithm estimates hyperparameters 6-20x faster when compared with baseline (search-based) algorithms, producing accurate forecasting results in numerous applications.

Forecasting is one of the core data science and machine learning tasks performed today. Producing fast, reliable, and accurate forecasts over large amounts of time-series data is therefore crucial for managing many businesses.

Time series analysis is used to find trends and forecast future values. A slight difference in hyperparameters in this type of analysis could lead to very different forecast results for a given model and have serious consequences. Therefore, it’s essential to select optimal hyperparameter values. 
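To make the sensitivity concrete, here is a minimal sketch (not from the paper) using simple exponential smoothing: changing only the smoothing parameter `alpha` shifts the one-step-ahead forecast substantially on the same data.

```python
# Minimal sketch: simple exponential smoothing, showing how a change in
# the smoothing hyperparameter alpha alters the forecast for one series.

def ses_forecast(series, alpha):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

series = [10, 12, 11, 15, 14, 20, 18, 25]

for alpha in (0.2, 0.8):
    print(f"alpha={alpha}: forecast={ses_forecast(series, alpha):.2f}")
# On this upward-trending series, the two alphas disagree by several units.
```

The gap between the two forecasts is exactly the kind of divergence that makes careful hyperparameter selection essential.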

Most of the existing hyperparameter tuning methods are based on one fundamental component: searching. This makes them computationally expensive and challenging to apply to fast or scalable time-series hyperparameter tuning. The new SSL-HPT framework uses time-series features as inputs and produces optimal hyperparameters in less time without losing accuracy.
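The cost contrast can be sketched as follows. The model, feature set, and grid below are illustrative assumptions, not the paper's: a search-based tuner fits the model once per candidate setting, while an SSL-HPT-style tuner would replace the whole loop with a single feature extraction plus one prediction.

```python
# Hedged sketch: search-based tuning (many model fits per series) versus
# an SSL-HPT-style approach (one feature pass + one prediction).

def fit_and_score(series, alpha):
    # Simple exponential smoothing; score = mean absolute one-step error.
    level, err, n = series[0], 0.0, 0
    for y in series[1:]:
        err += abs(y - level)
        n += 1
        level = alpha * y + (1 - alpha) * level
    return err / n

def grid_search(series, grid):
    # Search-based tuning: one full model fit per candidate value.
    return min(grid, key=lambda a: fit_and_score(series, a))

series = [10, 12, 11, 15, 14, 20, 18, 25]
grid = [i / 10 for i in range(1, 10)]  # 9 candidate alphas -> 9 fits
best = grid_search(series, grid)
print("grid-search alpha:", best)

# SSL-HPT-style alternative (conceptual; names are hypothetical):
#   features = extract_features(series)               # trend, seasonality, ...
#   alpha = pretrained_regressor.predict(features)    # single inference, no loop
```

With millions of series, eliminating the per-series search loop is where the reported 6-20x speedup comes from.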

Workflow:

The team developed the self-supervised learning framework with two primary tasks for the forecasting domain: SSL-MS and SSL-HPT.

SSL-MS: The self-supervised learning framework for SSL-MS consists of the following three steps:

  1. Offline training data preparation: The researchers first obtain time-series features for each time series and the best-performing model for each time series through offline exhaustive hyperparameter tuning.
  2. Offline training: They train a classifier (self-supervised learner) with the data obtained from step 1. The data input feature is the time series feature, and the label is the best performing model.
  3. Online model prediction: In their online services, they extract features for new time-series data and then make an inference with the pre-trained classifier, such as a random forest model.
Figure 1: The workflow of SSL-MS
Source: https://ai.facebook.com/blog/large-scale-forecasting-self-supervised-learning-framework-for-hyper-parameter-tuning/
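The three steps above can be sketched with synthetic data. The feature extractor and model labels here are illustrative placeholders; the actual system uses richer time-series features and labels derived from exhaustive offline tuning.

```python
# Minimal sketch of the SSL-MS workflow on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def extract_features(series):
    # Toy stand-in for real time-series features (trend, seasonality, ...).
    return [np.mean(series), np.std(series), series[-1] - series[0]]

# Step 1: offline training data preparation -- features per series, plus the
# best-performing model label (here random placeholders; in practice these
# labels come from exhaustive offline tuning).
train_series = [rng.normal(size=50).cumsum() for _ in range(100)]
X = np.array([extract_features(s) for s in train_series])
y = rng.choice(["arima", "prophet", "holt_winters"], size=100)

# Step 2: offline training of the self-supervised learner (a classifier).
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Step 3: online model prediction for a new series -- one inference, no search.
new_series = rng.normal(size=50).cumsum()
print("recommended model:", clf.predict([extract_features(new_series)])[0])
```

The key property is that the expensive exhaustive tuning happens once, offline, in step 1; online serving only pays for feature extraction and one classifier call.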

SSL-HPT: The SSL-MS workflow extends naturally to SSL-HPT. For each time series, given a model, all hyperparameter settings within a predefined parameter space are explored offline, and the best-performing setting is chosen as the output 𝑌. The input 𝑋 uses the same time-series features as in SSL-MS. Once the self-supervised learner is trained, it can directly predict hyperparameters and produce forecasting results for any new time-series data.

Figure 2: The workflow of SSL-HPT
Source: https://ai.facebook.com/blog/large-scale-forecasting-self-supervised-learning-framework-for-hyper-parameter-tuning/
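A hedged sketch of the SSL-HPT variant follows: the same style of features as in the SSL-MS sketch, but the target 𝑌 is a continuous hyperparameter vector rather than a model label, so the learner is a regressor. All data and the two-dimensional hyperparameter space are synthetic assumptions.

```python
# Hedged sketch of SSL-HPT: regress from time-series features to a
# hyperparameter vector. Data and hyperparameter space are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def extract_features(series):
    # Toy stand-in for real time-series features.
    return [np.mean(series), np.std(series), series[-1] - series[0]]

train_series = [rng.normal(size=50).cumsum() for _ in range(100)]
X = np.array([extract_features(s) for s in train_series])
# Placeholder "best" hyperparameters per series (e.g., two smoothing
# parameters in [0, 1]); in practice these come from offline tuning.
Y = rng.uniform(0.0, 1.0, size=(100, 2))

# Random forests handle multi-output regression directly.
reg = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, Y)

# For any new series: one feature pass + one prediction, no search loop.
new_series = rng.normal(size=50).cumsum()
params = reg.predict([extract_features(new_series)])[0]
print("predicted hyperparameters:", params)
```

Because the regressor only averages over hyperparameter values it saw during training, its predictions stay inside the predefined parameter space.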

The team empirically evaluated the algorithms on both internal and external data sets. They note that SSL frameworks can significantly improve model selection and hyperparameter tuning efficiency, reducing the running time by 6-20x with comparable forecasting accuracy. This framework finds applications in many tasks, including capacity planning and management, demand forecasting, energy prediction, and anomaly detection.

This new approach is independent of specific forecasting models and algorithms. Preliminary analysis shows that the framework can be extended to model recommendation and can enhance the Bayesian optimization algorithm in Facebook's Ax library.

Source: https://ai.facebook.com/blog/large-scale-forecasting-self-supervised-learning-framework-for-hyper-parameter-tuning/

Paper: https://arxiv.org/abs/2102.05740
