
Does OpenAI's GPT-2 Neural Network Pose a Threat to Democracy?

OpenAI's Neural Network for deep text generation was undoubtedly built with good intentions, but could its sheer power become a recipe for disaster?

GPT-2 started life as a next-word predictor, much like the ones behind Gmail's suggestions or the virtual keyboards on our mobile devices. For that purpose it was trained on a massive 40GB dataset of text collected from around the web, mostly from news sites.

The dataset was fed into the Neural Network in order to build a linguistic model as the basis for predicting the next word. That, of course, means it can generate text, at least as far as the next word goes. However, it turns out that the model became so good at this that it also learned to generate complete, meaningful sentences.
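The step from next-word prediction to full text generation is easy to demonstrate in code. Below is a minimal sketch using the Hugging Face transformers library and the publicly released small GPT-2 model; the library, the prompt and the sampling parameters are illustrative assumptions, not anything the article specifies.

    # Sketch: next-word prediction vs. text generation with GPT-2.
    # Assumes the Hugging Face transformers library and PyTorch are installed;
    # prompt and sampling settings below are arbitrary examples.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The news spread quickly across"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    # Single prediction step: score every vocabulary token as a
    # candidate continuation and pick the most likely one.
    with torch.no_grad():
        logits = model(input_ids).logits
    next_id = int(logits[0, -1].argmax())
    print("Most likely next token:", tokenizer.decode(next_id))

    # Repeating that step token by token is what turns a next-word
    # predictor into a text generator.
    output = model.generate(
        input_ids,
        max_length=40,
        do_sample=True,
        top_k=50,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

Run in a loop like this, the "next word" machinery produces whole passages, which is exactly the capability the article is worried about.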

Full article on I Programmer
