GPT-3: A New Breakthrough in Language Generation


OpenAI has released GPT-3, a language generator and the successor to GPT-2. The newly developed AI has been made available to a small group of selected outside software developers for testing.

GPT-2, released a year earlier, could produce convincing streams of text in a range of styles when prompted with an opening sentence. The differentiating factor of GPT-3 is its 175 billion parameters (the values that a neural network tries to optimize during training), compared with GPT-2's 1.5 billion. GPT-3 is the largest language model built to date.
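To put those two parameter counts in perspective, a quick back-of-the-envelope calculation shows the jump in scale. The fp16 storage assumption below is illustrative and not a figure from the article:

```python
# Rough comparison of the two models' scale, using the
# parameter counts quoted in the article.
GPT2_PARAMS = 1.5e9    # GPT-2: 1.5 billion parameters
GPT3_PARAMS = 175e9    # GPT-3: 175 billion parameters

scale_up = GPT3_PARAMS / GPT2_PARAMS
print(f"GPT-3 has roughly {scale_up:.0f}x the parameters of GPT-2")

# Assuming 2 bytes (16-bit floats) per parameter -- an assumption
# for illustration -- the weights alone would occupy:
weights_gb = GPT3_PARAMS * 2 / 1e9
print(f"~{weights_gb:.0f} GB just to store the weights")
```

That is roughly a 117x increase in parameters, which is why the model cannot run on ordinary consumer hardware.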

The main features of GPT-3 are:

  1. The AI can generate short stories, songs, press releases, technical manuals, and more.
  2. GPT-3 can imitate the writing styles of particular writers.
  3. It can also produce other kinds of text, including guitar tabs and computer code.
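All of these capabilities are driven the same way: the model is given a plain-text prompt, often containing a few worked examples, and it continues the pattern. A minimal sketch of assembling such a few-shot prompt (the helper name, format, and examples here are illustrative, not from the article or any specific API):

```python
def build_prompt(examples, query):
    """Assemble a few-shot prompt: worked examples, then the new query."""
    parts = []
    for question, answer in examples:
        parts.append(f"Q: {question}\nA: {answer}")
    # Leave the final answer blank for the model to complete.
    parts.append(f"Q: {query}\nA:")
    return "\n\n".join(parts)

# Illustrative task: show the model two translations, ask for a third.
demo = [
    ("Translate 'bonjour' to English.", "hello"),
    ("Translate 'gracias' to English.", "thank you"),
]
prompt = build_prompt(demo, "Translate 'danke' to English.")
print(prompt)
```

The finished string would then be sent to the model, which generates the text that follows the final "A:".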

But like every other system, it has certain disadvantages too.

  1. A lot of fine-tuning is required to remove the hateful, sexist, and racist language GPT-3 produces at times.
  2. It is an engineering marvel rather than a smart AI; hence some of its text lacks common sense.

GPT-3 appears to be very good at synthesizing text it has found elsewhere on the web, making it a kind of vast, varied scrapbook assembled from countless pieces of text that it glues together in unusual and impressive ways on demand.

GPT-3 still has a long way to go, but it is a giant leap for language-generating AI systems.

GitHub: https://github.com/openai/gpt-3

Paper: https://arxiv.org/abs/2005.14165

Source: https://www.technologyreview.com/2020/07/20/1005454/openai-machine-learning-language-generator-gpt-3-nlp/
