I recently had the misfortune to be Deepfaked, after a set of heated arguments on a cat-related gossip forum. Several of the posters on the forum were former members of the Anonymous hacking group and had a high degree of technical proficiency, so once my personal information was out there it was a very simple matter for them to find my online dating profile, pull out one of the pictures, and animate it to music. The whole process took only a few minutes, shockingly enough.
What I found a little concerning was the subtlety of the Deepfake. While the animated portion was crudely done, the pranksters had also photoshopped a tarp into the background that had not been there in the original photo. It seemed obvious to me that they were trying to set me up for some sort of false accusation in the future, distracting me from the obvious modification to the image with a subtler one they hoped I wouldn’t notice. This was not surprising: intelligence agencies have been doing this kind of thing for over a decade already, and the only new element is that the tools and techniques to create these kinds of fake videos have finally trickled down to the general public.
What will society look like now that consensus reality itself is debatable? It’s going to be very different. Everything from government to business to relationships will change completely.
Imagine this scenario. You’re gaming with some online friends on a Friday night while your wife goes out to a birthday party for one of her friends. The following morning, somebody anonymously texts you a video showing your wife making out with another man at the party. Is it real, or a Deepfake? How would you know? A lot of men might say that they would trust their wives, but infidelity is hardly rare: surveys consistently find that somewhere between one in ten and one in four married Americans admits to cheating, and self-reported numbers are almost certainly an undercount. So how do you cope in a low-trust world where anybody can fabricate a video and completely shatter the trust you have in your loved ones?
Here’s another example. You get into an argument with one of your subordinates at work. Shortly thereafter, a recording surfaces of you making all sorts of racist and offensive comments. Your workplace fires or suspends you. How do you prove that it wasn’t really you making those comments?
Last example. You interview with somebody online and are offered a prestigious job overseas. You quit your job, fly to the city where the company is supposedly based, and rent an apartment to begin your new career, only to discover that the job doesn’t exist and neither does the person who interviewed you. What do you do in this situation? How do we cope in a post-truth world?
Some people - typically wealthy, connected people - say that the solution is to outlaw Deepfakes. This is a typical position for the extremely privileged to take, because it solves the problem for them and for nobody poorer than them, and it’s typical of such entitled people to care only about themselves. After all, wealthy celebrities like Taylor Swift can afford to sue anybody who uses their likeness, but a poor 19-year-old college student doesn’t have the time or resources to sue everybody who posts Deepfake porn of them to the internet. A law making Deepfakes illegal would only benefit elites who have the resources to file endless lawsuits, while leaving the rest of us completely vulnerable to this kind of manipulation.
I think that the only solution to the post-truth era is to take the opposite approach. Instead of making Deepfakes illegal, we should spam as many Deepfakes as possible, until people learn not to trust everything they see on the internet. If your manager fires you because somebody else posted a Deepfake of you saying racist things, you should make several Deepfakes of him saying racist things and get him fired. Then make some Deepfakes of the executives at your company cheating on their wives, and give them a messy and expensive divorce. After all, if their gullibility and trusting nature ruins your life, it’s only fair that they should experience the same consequences.

The sad thing is that most people in our society don’t care about the truth, and only bother to research things thoroughly when getting things wrong would directly impact them. Therefore the solution to the Deepfake problem is to give them skin in the game. In fact, I think that if a company fires you based on something that later turns out to be a Deepfake, you should be allowed to sue them for wrongful termination. Our management class is full of gullible morons who usually don’t bother fact-checking accusations, and a law like this would encourage them to be decent human beings for a change and research false allegations made against their employees instead of immediately firing people based on the optics. This is why I try to spam fake Taylor Swift porn on the internet whenever possible.

The uncomfortable truth is that a lot of human beings are fundamentally selfish. Our narcissistic elites - the people at the very top of society - don’t genuinely care about the rest of us: they only pretend to. This means that if we plebs ever have problems that need to be solved, the best way to force our worthless elites to get off their asses and solve our problems is to make them directly experience those problems themselves.
When we force our elites to have skin in the game and feel the pain that those underneath them have to deal with, it’s surprising how quickly we can make societal change happen. But let’s not just limit this to celebrities. Literally anybody who’s gullible enough to believe whatever they read online without questioning it deserves to have Deepfakes made of them. If you participate in an online lynch mob on social media, then you’re guilty of whatever mistakes the lynch mob makes, and I have zero sympathy when somebody else uses Deepfakes to frame you and make you a target of that same lynch mob.
From a practical perspective, this solution makes sense because there really isn’t a way to stop Deepfakes. There’s no “Internet police” with the time to comb through everything online hunting down these crimes (although given how widespread cybercrime is, perhaps there really should be). Even if such an enforcement agency existed, there will always be rogue states like North Korea that are willing to host Deepfakes, as well as darkweb sites that pop up faster than they can be removed. So rather than trying to treat Deepfakes as a supply-side problem, we should address the demand side. If literally everybody on planet Earth has been put into a Deepfake video at some point, then people will learn to stop believing Deepfakes. So go wild. Deepfake your neighbors, your bosses, your government officials, even random Instagram influencers. We as a society desperately need to become less gullible, and if everybody is Deepfaked, then effectively nobody is.
While this is an elegant solution (and indeed the only solution) to widespread societal naivete about Deepfakes, I still haven’t answered the original question: what will society look like when consensus reality is debatable?
I think the best answer is that in the future, trust will be a scarce and valuable commodity. Catch somebody lying to you even once, and everything they say afterwards will lack credibility, because you have no reason to trust their word over a persuasively generated Deepfake. Couples will need to share their phone passwords and allow each other open access to texts and emails, so that if one of them is ever Deepfaked in an infidelity video, the other partner can check for themselves whether it’s true. Corporations will no longer be able to fire employees for saying controversial things on social media, because if the managers at these corporations are stupid enough to do so, their competitors can simply Deepfake all their best employees being racist in order to poach their talent and drive them out of business. Politicians will have to be extremely honest (for a change), because if they are ever publicly caught in even a single lie, any Deepfakes the opposition makes of them will carry far more credibility. Like the faeries of old, government officials will no longer be able to lie to people openly; they will only be able to deceive by selectively telling the literal truth while twisting it in a misleading way. It will be a very interesting and wacky time.
In short, for heroic free-speech supporters who think like me, the post-truth future looks like a utopia, while for people with an evil authoritarian mindset who hate free speech, it looks more like a nightmare. I’m proud that, in some small way, I could help accelerate this future and bring it to you faster - with the assistance of the lovely Taylor Swift, of course.
Thanks for making my point so elegantly, Taytay!