OpenAI Releases the 1.5-Billion-Parameter GPT-2: The Text-Generating Model

OpenAI has announced the release of its 1.5-billion-parameter language model, GPT-2. GPT-2 made headlines as a potentially dangerous AI text generator that could be misused to produce fake news stories and similar content, but it has also drawn praise for its performance.

GPT-2 automatically generates text from a short prompt. 'Talk to Transformer' was quick to adopt GPT-2, and you can try the model out for yourself there.

So does releasing the full 1.5-billion-parameter model reduce the risks associated with its earlier, smaller versions?


Check out OpenAI's report for more details:

Image source: https://talktotransformer.com/

Github: https://github.com/openai/gpt-2

Dataset: https://github.com/openai/gpt-2-output-dataset

Web: https://openai.com/blog/gpt-2-1-5b-release/

OpenAI’s report: https://d4mucfpksywv.cloudfront.net/papers/GPT_2_Report.pdf  

Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.
