OpenAI Releases the 1.5 Billion Parameter GPT-2: The Text-Generating Model

Image Source: https://openai.com/blog/better-language-models/

OpenAI announced the release of its 1.5-billion-parameter language model, GPT-2. GPT-2 has been in the news as the "scary" AI text generator, with warnings about its potential misuse for fake news stories and the like, but it has also received praise for its performance.

GPT-2 automatically generates text from a limited prompt. 'Talk to Transformer' is a quick adaptation of GPT-2 that you can try out yourself.
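Under the hood, GPT-2 generates text one token at a time, sampling each next token from the model's predicted distribution, commonly restricted to the top-k most likely candidates. Below is a minimal sketch of that top-k sampling step; the logits and the 5-token vocabulary are made-up illustrations, not OpenAI's actual code:

```python
import numpy as np

def top_k_sample(logits, k=40, temperature=1.0, rng=None):
    """Sample one token id from the k highest-scoring logits."""
    rng = rng or np.random.default_rng(0)
    logits = np.asarray(logits, dtype=np.float64) / temperature
    top = np.argsort(logits)[-k:]        # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                 # softmax over the top-k only
    return int(rng.choice(top, p=probs))

# Hypothetical logits over a tiny 5-token vocabulary
token_id = top_k_sample([2.0, 0.5, -1.0, 3.0, 0.0], k=2)
print(token_id)  # one of the two highest-scoring tokens: 0 or 3
```

A full text generator repeats this step, appending each sampled token to the prompt and feeding the result back into the model until a stop condition is reached.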


So does releasing the full 1.5-billion-parameter text-generating model reduce the threat posed by its earlier versions?

Check out OpenAI’s report and related resources:

Image source: https://talktotransformer.com/

Github: https://github.com/openai/gpt-2

Dataset: https://github.com/openai/gpt-2-output-dataset

Web: https://openai.com/blog/gpt-2-1-5b-release/

OpenAI’s report: https://d4mucfpksywv.cloudfront.net/papers/GPT_2_Report.pdf  
