You Could Become a Porn Star Without Even Knowing It Thanks to a New AI App

(ANTIMEDIA) — We are smack dab in the middle of what future generations will likely refer to as the Golden Age of algorithmic artificial intelligence. Startups galore, all competing for a paper-thin slice of what analysts say will soon be a $14 billion cake. Currently, the major players in AI development are the governments of the U.S., China, and Russia — all involved in a push that has been described as the next great “arms” race — and Amazon, Google, and Facebook, which enjoy a strange symbiotic relationship with the U.S. government. It’s a high-stakes game with nothing less than global military and economic dominance hanging in the balance, and yet, unsurprisingly, if you search for AI news online right now, you’re likely to hear about porn. More specifically, celebrity sex videos and the newest incarnation of ‘revenge’ porn.

That’s right, artificial intelligence has finally trickled down into the ultimate recession-proof industry. The newest craze is something called “deepfakes”: videos made with a machine-learning algorithm that lets porn enthusiasts swap the faces of celebrities and acquaintances into their favorite videos. The trend started in a subreddit and quickly went viral, leading a user named ‘deepfakeapp’ to create FakeApp, a user-friendly program that makes producing deepfakes easier and less time-consuming. The goal, he says, is to make it possible to create deepfakes without expensive equipment or technical training, a democratization of ‘revenge porn’ that has left ethicists gasping for air.

While several websites hosting deepfakes have recently begun purging them, the nature of the Internet makes completely eliminating them a veritable game of whack-a-mole. In a couple years, says Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, the videos will look flawless.

Deepfakes are predicated on artificial intelligence — namely, neural network mapping. But they’re surprisingly easy to make. All you need are a few open-source tools like Instagram Scraper and the Chrome extension DownAlbum to cull photos from publicly available Facebook or Instagram accounts. Then you buy a good GPU [graphics processing unit, the kind that high-end 3D video games require] with CUDA support [NVIDIA’s parallel computing platform and programming model], or you can rent cloud GPUs through services like Google Cloud Platform. From beginning to end, including data extraction and frame-by-frame editing, you can make a ‘deepfake’ in eight to twelve hours.
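
For the technically curious, the “neural network mapping” behind deepfakes is usually described as an autoencoder with a single shared encoder and a separate decoder for each face. The sketch below, in PyTorch, is purely illustrative: the layer sizes, class names, and dimensions are our own assumptions, not code from FakeApp or any other real tool.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

# One shared encoder, two identity-specific decoders. Decoder A is trained
# to reconstruct person A's faces and decoder B to reconstruct person B's,
# both from codes produced by the same shared encoder.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# The swap: encode a frame of person A, then decode it with person B's
# decoder, rendering B's face with A's pose and expression.
fake_frame = decoder_b(encoder(torch.rand(1, 3, 64, 64)))

Because the shared encoder is forced to reconstruct both identities, its latent code ends up capturing pose, lighting, and expression rather than identity; that division of labor, learned over hours of GPU training, is why the swap can look plausible frame after frame.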

For even more refinement, deepfake producers can use sites like Porn World Doppelganger, Porn Star By Face, and FindPornFace to find porn stars with body types and facial dimensions similar to those of the person they’re replicating.

The most popular targets so far on Reddit and Pornhub have been celebrities like Gal Gadot, Maisie Williams, Taylor Swift, Jessica Alba, Daisy Ridley, and Emma Watson. One of the most celebrated deepfakes is a CGI Princess Leia from Rogue One: A Star Wars Story, which user Derpfakes says took him 20 minutes to create.

Of course, celebrity porn fakes aren’t new. A more troubling wrinkle in the deepfake story, however, is that users are substituting the faces of classmates, friends, and former girlfriends. In some cases, the practice has even been termed a new form of ‘revenge porn.’

In a deepfakes chat room on Discord, one user even bragged about having made a deepfake of a girl he’d had a crush on in high school.


As the technology advances, says Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia’s School of Engineering, it will become impossible to distinguish between AI-generated fake porn and the real thing, presenting a major ethical challenge, as virtually anyone could wake up one day to find themselves circulating on the Internet as a porn star.

“What is new is the fact that it’s now available to everybody, or will be… It’s destabilizing,” Johnson asserts. “The whole business of trust and reliability is undermined by this stuff.”

As Anti-Media recently reported, new AI deep learning applications have made it possible to essentially fabricate video footage that looks completely real, making it even more likely that deepfakes will continue to evolve.

Some deepfake makers acknowledge that what they’re doing is wrong but counter that it’s part of a larger trend.

“What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on,” said one deepfaker, waxing metaphysical. “If anything can be real, nothing is real. Even legitimate homemade sex movies used as revenge porn can be waved off as fakes as this system becomes more relevant.”

Do people who find themselves in a deepfake porn video have any legal recourse? While celebrities can sue for misappropriation or intellectual property violations, others may find themselves in a grey area of revenge porn law. However, fearing backlash, platforms have already begun to crack down, eliminating deepfakes from forums on Reddit and from Gfycat. In this way, deepfakes drift into the same murky territory as the “fake news” narrative: with the government unable to legislate the morality of the creators, it falls to websites and hosts to help ‘censor’ content. Incredibly, Gizmodo even claimed that deepfake trolls are flocking to Russian social networks.

Of course, not everybody is using deepfake technology for porn. A user named Z3ROCOOL22 turned Argentine President Mauricio Macri into Hitler.

And what would a viral Internet meme be without a ubiquitous Nicolas Cage deepfake?

This article originally appeared on our Steemit blog.

Creative Commons / Anti-Media

