Reddit Bans "Involuntary Porn" Made With Artificial Intelligence
Weeks ago, a disturbing subreddit called "deepfakes" went viral for hosting adult content edited with artificial intelligence. The videos seemingly featured popular actresses (and the occasional actor) in adult films, though none of those celebrities had actually participated in the content or given their consent. Now Reddit is squashing them.
Reddit is calling these "deepfaked" videos involuntary pornography, and it has just banned the Deepfakes subreddit that was sharing them. In a statement published on the site, a Reddit administrator explained that the company has implemented new site-wide rules specifically targeting such content; previously, those rules were folded into a single broader policy covering multiple types of unacceptable content.
Under the new policy, Reddit says that porn created and/or posted without permission is banned, including depictions "that have been faked." This covers what the company calls involuntary pornography, in which someone takes the face of a celebrity (or anyone else) and uses AI to seamlessly superimpose it over the face of a performer in an adult film.
The "Deepfakes" subreddit stirred huge controversy after going viral, raising questions about both ethics and legality. For many people, it was the first demonstration of AI's ability to generate content in which a person appears to be present even though they're not.
The technology itself isn't new, though. We saw a creepy example of this fake-footage tech last year from the University of Washington, where researchers built a tool that uses AI to generate a video of someone "saying" something they didn't actually say. Former president Barack Obama was the example subject.
SOURCE: Reddit