Deepfake porn: where AI goes to die

Karl D. Stephan
mercatornet.com
2024-09-23

In Dante's Inferno, Hell is imagined as a conical pit with ever-deepening rings dedicated to the torment of worse and worse sinners. At the very bottom is Satan himself, constantly gnawing on Judas, the betrayer of Jesus Christ.

While much of Dante's mediaeval imagery would be lost on most people today, we still recognize a connection in language between lowness and badness. Calling deepfake porn the nadir of how artificial intelligence is used expresses my opinion of it, and also the opinion of the women whose faces have been stolen and applied to pornographic images.

A recent article by Eliza Strickland in IEEE Spectrum shows both the magnitude of the problem and the largely ineffective measures that have been taken to mitigate this evil--for evil it is.

With the latest AI-powered software, it can take less than half an hour to use a single photograph of a woman's face to produce a 60-second porn video that makes it look like the victim was a willing participant in whatever debauchery the original video portrayed. A 2024 research paper cites a survey of 16,000 adults in ten countries, in which 2.2 percent of respondents reported having been victims of "non-consensual synthetic intimate imagery," which is apparently just a more technical way of saying "deepfake porn." The US was one of the ten countries included, and 1.1 percent of the US respondents reported being victimized. Because virtually all the victims are women, and assuming men and women were represented equally in the survey, that works out to roughly one out of every fifty women in the US having been a victim of deepfake porn.
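For readers who want to check that last figure, here is the back-of-envelope arithmetic the sentence implies, on the stated assumption that women make up half of the US sample and that essentially all victims are women (assumptions, not figures reported directly by the survey):

\[
\frac{0.011}{0.5} = 0.022 \approx \frac{1}{45},
\]

or, rounding loosely, about one woman in fifty.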
