Fairseq: A Fast, Extensible Toolkit for Sequence Modeling

Fairseq(-py) is a sequence modeling toolkit written in Python and developed at Facebook AI Research (FAIR). It allows researchers and developers to train custom models for translation, summarization, language modeling, and other text generation tasks. The toolkit provides reference implementations of various sequence-to-sequence models, including Long Short-Term Memory (LSTM) networks and a novel convolutional neural network (CNN) architecture that can generate translations faster than comparable recurrent neural network (RNN) models.
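
For example, pre-trained translation models can be loaded through PyTorch's torch.hub interface. The snippet below is a minimal sketch based on the pre-trained models advertised in the fairseq repository; the entry-point name 'transformer.wmt19.en-de' and the checkpoint/tokenizer arguments are assumptions and may vary between fairseq releases.

import torch

# Load a pre-trained English-to-German Transformer from the fairseq hub.
# The entry-point name and keyword arguments follow the fairseq README and
# may differ depending on the release fetched by torch.hub.
# (The 'moses' tokenizer and 'fastbpe' BPE require the sacremoses and fastBPE packages.)
en2de = torch.hub.load(
    'pytorch/fairseq',
    'transformer.wmt19.en-de',
    checkpoint_file='model1.pt',
    tokenizer='moses',
    bpe='fastbpe',
)
en2de.eval()  # disable dropout for inference

# Translate a single sentence.
print(en2de.translate('Machine learning is great!'))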

Features

The Fairseq toolkit provides reference implementations of various sequence-to-sequence models, including:

  • Convolutional Neural Networks (CNN)
  • LightConv and DynamicConv models
  • Long Short-Term Memory (LSTM) networks
  • Transformer (self-attention) networks
  • Non-autoregressive Transformers

It also supports multi-GPU (distributed) training on a single machine or across multiple machines; a short sketch of listing the bundled pre-trained models follows below.
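
The pre-trained reference models that fairseq registers with torch.hub can be listed programmatically, which is a quick way to see which of the architectures above ship with ready-made checkpoints. This is a minimal sketch; it only assumes PyTorch is installed and that the pytorch/fairseq GitHub repository is reachable.

import torch

# Query the model entry points that fairseq exposes through torch.hub.
# The exact names returned depend on the fairseq revision that torch.hub fetches.
for name in torch.hub.list('pytorch/fairseq'):
    print(name)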

GitHub: https://github.com/pytorch/fairseq

Installation: https://ai.facebook.com/tools/fairseq/

Paper: https://arxiv.org/pdf/1904.01038.pdf

Video: https://www.youtube.com/watch?v=OtgDdWtHvto

Installation

Requirements

First, install PyTorch.

Then install fairseq-py:

pip install fairseq

On macOS:

CFLAGS="-stdlib=libc++" pip install fairseq
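
After installation, a quick sanity check from Python confirms that the package imports correctly. This is a minimal sketch; it assumes the installed release exposes a __version__ attribute, which current fairseq packages do.

import fairseq

# Print the installed fairseq version to verify the installation.
print(fairseq.__version__)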

Full Documentation: https://fairseq.readthedocs.io/en/latest/
