Reddit bans deepfake porn videos

Reddit has banned “fake porn” – imagery and videos that superimpose a subject’s face over an explicit photo or video without the person’s permission.

The move follows the development of artificial intelligence software that made it relatively easy to create clips featuring computer-generated versions of celebrities’ faces.

Reddit had become one of the most popular places to share and discuss so-called deepfake videos.

It has also reworded its rules covering sexual content involving minors.

The discussion site had been under growing pressure to act after other platforms – including Twitter, Gfycat and Pornhub – introduced their own deepfake bans.

However, it may cause unease among some Reddit users who already feared the platform was becoming less “open” after its closure of two alt-right forums in 2017.

What are deepfakes?

Deepfakes are made with artificial intelligence software that superimposes a computer-generated version of a subject’s face onto another person in a video, matching the original performer’s expressions.

The algorithm involved is believed to have been developed last year.

To work, the software requires a selection of photos of the subject’s face taken from different angles – about 500 images are suggested for a good result – and a video clip to merge them with.

The practice was brought to the wider public’s attention in December, when the news site Motherboard reported how it had been used to create fake videos of the Wonder Woman actress Gal Gadot.

But it became much more common after January’s release of a software tool called FakeApp, which made it possible to carry out the process with a single button press.

Not all of the clips generated have been pornographic in nature – many feature spoofs of US President Donald Trump, and one user has specialised in placing his wife’s face in Hollywood film scenes.

But much of the material generated has merged female pop singers and actresses’ faces with footage of women engaged in sexual acts.

Reddit’s new involuntary pornography policy states that it “prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked”.

The Deepfakes subreddit – which had more than 91,000 subscribers – was among the first forums to be deleted after the new policy was published.

But the site has also taken down the Celebfakes forum, which focused on faked still images and had existed for about seven years.

In recent days, there had been concern among the Deepfakes community itself that some users had been using the technology to create clips that featured the faces of under-16s – in effect child abuse imagery.

Reddit has also clarified its policy over this practice.

“Reddit prohibits any sexual or suggestive content involving minors or someone who appears to be a minor,” it said.

“Depending on the context, this can in some cases include depictions of minors that are fully clothed and not engaged in overtly sexual acts. If you are unsure about a piece of content involving a minor or someone who appears to be a minor, do not post it.”
