Does OpenAI's GPT-2 Neural Network Pose a Threat to Democracy?

OpenAI's neural network for deep text generation was undoubtedly built with good intentions, but could its sheer power become a recipe for disaster?

GPT-2 started life as a next-word predictor, much like Gmail's suggestions or the virtual keyboards on our mobile devices. For that purpose it was trained on a massive 40GB dataset collected from text-heavy sites around the web, mostly news sites.

The dataset was fed into the neural network to build a language model as a basis for predicting the next word. That, of course, means it can generate text, at least as far as the next word goes. However, it turns out the model became so good at this task that it also learned to generate complete, meaningful sentences:

full article on i-programmer
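To make the next-word-prediction idea concrete, here is a minimal illustrative sketch in Python. It uses a simple bigram frequency model, not GPT-2's transformer architecture: the point is only to show what "predicting the word that follows" means at the smallest possible scale. The corpus and function names are hypothetical.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """For each word, count which words follow it and how often."""
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Toy "training data"; GPT-2 instead learned from 40GB of web text.
corpus = "the cat sat on the mat and the cat slept"
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" most often here
```

A model like GPT-2 does the same job, but instead of raw bigram counts it conditions on the whole preceding context, which is what lets it chain predictions into coherent sentences and paragraphs.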
