The aptly named “thispersondoesnotexist.com” generates faces randomly. The ostensible purpose of this deepfake face generator is a little unclear, but I’m assuming the developers wanted to show how real deepfake faces can look.
The result is… creepy. With a click of the mouse, you can generate faces of people who never lived.
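You don’t even need the mouse click. Here’s a minimal sketch in Python, assuming the site still serves a fresh generated image on each plain request to its root URL:

```python
# Pull one generated face without a browser.
# Assumption: thispersondoesnotexist.com returns a new face image
# for every GET request to its root URL.
import requests

response = requests.get(
    "https://thispersondoesnotexist.com",
    headers={"User-Agent": "Mozilla/5.0"},  # some hosts reject requests with no user agent
    timeout=10,
)
response.raise_for_status()

# Every request returns a different face; save this one to disk.
with open("nobody.jpg", "wb") as f:
    f.write(response.content)

print(f"Saved {len(response.content)} bytes of a person who never existed.")
```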
“Deepfake” is a wonderfully ominous term combining “deep learning” and “fake.”
Deepfake face generators create an uncanny valley effect that is more emotional than visual.
When confronted with the traditional “uncanny valley,” you see something humanoid yet strange, something that evokes disgust and awe all at once. It’s the mechanism of the horror movie: the stilted walk and too-long hair of the girl from “The Ring,” the 360-degree head spin of “The Exorcist.”
Thispersondoesnotexist.com and other deepfake face generators don’t create the same feeling at first sight. These are the first three deepfake faces I generated with the website. The whole time, I felt kind of like a necromancer summoning zombies.
At worst, each of my unholy golems looks like a poorly taken headshot.
You can point to some “uncanny” inaccuracies: a bent lip, marks on the skin, an uneven mouth. But if you refresh enough, you’ll get someone who may have just taken a bad picture.
Except it’s not someone.
The eyes reflect no real light, because no photo was ever taken. The face shows no real experience, because no life was ever lived.
The Deep End of Deepfakes
The uncanny valley of creating fake people with deepfake face generators is spiritual. But the uses for them are practical.
Zao, an app developed in China, lets users swap their faces into famous movie scenes.
https://twitter.com/AllanXia/status/1168049059413643265?s=20
Deepfakes can both create new people and put real people into scenes and situations they never encountered.
This sends the imagination down a wonderfully dystopian cascade: deepfake porn is already on the rise, deepfake political speeches could sway elections, deepfake business meetings could tank stock prices and whole companies.
What we might coin the “chaos potential” of deepfake face generators and video apps is limitless. Meanwhile, smart speakers like Alexa are soaking up voice data to figure out how AI can create what we could call deepfake conversations.
In a fun twist, it seems we’re making the simulation hypothesis come true on our own. As deepfakes spread and our digital entertainment, news, and conversations all take place under the assumption that any given interaction could be a “deepfake” interaction, we’re building the simulation ourselves.