Examining the merits of non-consensual and voyeuristic technology.
A little over a century ago, only a decade passed between German physics professor Wilhelm Röntgen's discovery of X-rays and George W. Macdonald's patent for "X-ray spectacles." The latter counted the ability to "see through clothes" among its numerous dubious benefits.
So it makes sense that, 100 years later, it took the developers of "deepfake" apps like Nudifier half as long to figure out how a photo snapped on a cell phone could be altered via artificial intelligence to make its subject appear naked.
On one hand, what may seem like harmless fun also opens up space for discussion of something far more sinister, unseemly, and damning about the human condition. How do we solve for humankind gone off the rails of decency? In America, at least, a precedent is already on the books.
Boyhood fantasies come to life
The idea that technology developers are obsessed with building real-life versions of things they saw advertised in comic books feels like it involves too many layers of negative stereotyping: it conjures the image of nerdy, horny teenage males who have watched Weird Science one too many times and then realized technological advancements could turn their far-fetched fantasies into reality.
However, that stereotype is not far from the truth.
As Vice's Motherboard reported in June 2019, the maker of the now-deleted DeepNude app was inspired by memories of old comic book adverts for "X-ray specs," which promised they could be used to see through people's clothes.
Moreover, when the developer, known only as "Alberto," said that, "[l]ike everyone, I was fascinated by the idea that they could really exist and this memory remained," the notion almost became laughable.
However, one stops laughing when "Alberto" adds that "the technology is already (within everyone's reach)…So if someone has bad intentions, having DeepNude doesn't change much…If I don't do it, someone else will do it in a year."
As The Verge reported:
[Alberto] compared the software to Photoshop, saying this can be used to achieve the same results as DeepNude ‘after half hours of youtube tutorial.’
This raises the question: what exactly have we done as a society by allowing technology to progress faster than lawmaking? And once it is no longer possible to shove the proverbial toothpaste back in the tube, what do we do next?
Tech outpacing the law
It would be draconian, and at odds with the First Amendment, to prosecute everyone who, without consent, has created AI-generated nudes available through digital or online means. Deepfakes and illicitly revealed nude photos and pornographic videos are already widespread on the Internet.
However, because apps like Nudifier and DeepNude are technologically capable of creating nude images of minors, there exists an airtight legal avenue: consent requirements can provide the standing to regulate them. Yes, these apps enjoy a "free speech" precedent allowing them to exist in America, but they can absolutely be regulated.
In 1988, the United States Congress passed the "2257 regulations," aka the Child Protection and Obscenity Enforcement Act of 1988, into law. These regulations require producers of sexually explicit material to obtain proof of age for every model they shoot and to retain those records.
Moreover, the act bans unlawfully "digitizing an image of a visual depiction of sexually explicit conduct." It also makes it illegal to engage in "assembling, manufacturing, publishing, duplicating, reproducing, or reissuing a book, magazine, periodical, film, videotape, digital image, or picture, or other matter intended for commercial distribution, that contains a visual depiction of sexually explicit conduct."
Furthermore, and key to DeepNude-style apps, it is illegal to "insert on a computer site or service a digital image of, or otherwise manag[e] the sexually explicit content of, a computer site or service that contains a visual depiction of sexually explicit conduct."
Using this act as precedent, requiring these apps to present a scrolling legal agreement with a "check yes if you agree" box, a "Consent Act" of sorts, just as so many American-based websites already do, would be a welcome start. Yes, apps like these purport to edit photos for the purpose of parody and entertainment, and they use no sexually explicit images in their promotion, but injustice anywhere is a threat to justice everywhere.
Voyeurism, objectification, and consent
Forcing anyone who wants to use an app like Nudifier or DeepNude to provide identification and proof of age, possibly pay a premium for the "nudifying" service, or disclose exactly where the naked image will be used is a tradeoff.
Of course, one of the selling points of apps like these is that the developers offer their users complete anonymity. For example, Nudifier's advertising states that it will "require no personal information" and lets users pay in cryptocurrency.
"Consent Acts" would not necessarily change that. But there would need to be provisions for subpoenaing information when necessary, as a concession given what has previously existed in this wholly deregulated realm. In this case, an ounce of prevention is worth a pound of cure.
A recent article in the UK's Huffington Post referred to pornographic deepfakes as a form of misogyny. And while the United States Department of Defense's Defense Advanced Research Projects Agency, or DARPA, is hard at work developing machine-learning algorithms that can detect these inherently harmful deepfakes, it remains entirely possible that the technology moves at a speed lawmakers cannot match in attempting to abate its growth.
Yes, it's entirely likely that we have, as a society, progressed past the point of no return. But when the digital age overwhelms the niceties of humankind, it's wise not to fall prey to the mob haranguing for swift, incendiary, and reactionary change.
Rather, we should take a second, contemplate where a historical silver lining exists, and then tug it tightly around the metaphorical neck of the problem and its problematic ne'er-do-wells. That is how we secure the kind of progressive change that lets our best, if flawed, humanity endure.
Image sources: KlausHausmann, Marco Verch