
Does OpenAI's GPT-2 Neural Network Pose a Threat to Democracy?

Undoubtedly, OpenAI's neural network for deep text generation was built with good intentions, but can its sheer power become a recipe for disaster?

GPT-2 started life as a next-word predictor, just like the ones behind Gmail's suggestions and the virtual keyboards on our mobile devices. For that purpose it was trained on a massive 40GB dataset collected from text-heavy sites around the web, mostly news sites.
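
As a rough illustration, here is what that next-word prediction step looks like with the publicly released small GPT-2 model and the Hugging Face transformers library; the library, model size and prompt are illustrative assumptions, not details from the original write-up.

# A minimal sketch of next-word prediction with the small GPT-2 model,
# using the Hugging Face transformers library. Model size and prompt
# are illustrative choices, not details taken from the article.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The weather today is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(input_ids).logits  # shape: (1, sequence_length, vocab_size)

# The scores at the last position rank every candidate next token;
# picking the highest-scoring one gives the model's single best guess.
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode([next_token_id]))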

The dataset was fed into the neural network to build a linguistic model that serves as the basis for predicting the next word. That, of course, means it can generate text, at least as far as the next word goes. It turns out, however, that the model became so good at this that it also learned to generate complete, meaningful sentences.
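
And since a next-word predictor can simply be run in a loop, the same model will happily extend a prompt into whole sentences. A minimal, self-contained sketch, again assuming the Hugging Face transformers library, with sampling settings that are purely illustrative:

# A sketch of letting GPT-2 extend a prompt into full sentences by
# predicting one word after another. Sampling parameters are illustrative.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode("The weather today is", return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=40,                        # total length in tokens, prompt included
    do_sample=True,                       # sample rather than always taking the top word
    top_k=50,                             # consider only the 50 likeliest next tokens
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))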

Read the full article on i-programmer.
