Culture Feature

"In Event of Moon Disaster": Nixon DeepFake Offers an Eerie Glimpse of a Moon Landing Disaster

A team at MIT joined forces with AI specialists to show us what Richard Nixon's famous contingency speech would have looked like.

Popular conspiracy theories would have you believe that NASA—with the help of director Stanley Kubrick—faked the Apollo 11 moon landing on a soundstage in Los Feliz.

In reality, there is abundant evidence to the contrary—from the parabolic arcs of moon dust to orbital pictures and retroreflectors—proving that Neil Armstrong and Buzz Aldrin really did land on the moon. On July 20th, 1969—51 years ago today—they ventured onto the surface and took some pictures, but due to damage sustained in their touch-and-go landing, they were stranded there. They never came back…

Video: "To Make a Deepfake: Richard Nixon, a Moon Disaster, and our Information Ecosystem at Risk" (www.youtube.com)

At least, that's what happened in the universe of In Event of Moon Disaster, a new short film highlighting the dangers of DeepFake technology. Piecing together Apollo 11 footage with an AI version of Richard Nixon delivering the very real "moon disaster" speech (written for the contingency of Armstrong and Aldrin not making it home), In Event of Moon Disaster lets us peer into an alternate timeline where the challenges of landing—which very nearly turned into an actual disaster—left two men to die on the moon.

The film, which premiered in November 2019 as an art installation—playing on a TV in a replica 1960s living room—is now available online at moondisaster.org, along with resources detailing how it was made. A group at MIT teamed up with two AI companies, using machine learning to recreate Nixon's voice and face and map a convincing Nixon clone onto the movements of an actor delivering the unused speech.

Their fake Nixon isn't quite perfect. As his haunting speech kicks off—"Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace"—keen observers will notice that some of his movements are slightly jerky and that his voice catches certain sounds with an odd electronic inflection. But these issues are subtle. If you weren't looking for flaws, you could easily chalk them up to Nixon being nervous or the recording being degraded.

To an unwitting audience, the footage could easily be passed off as a newly discovered recording of Nixon practicing his contingency speech, or as an artifact of some Mandela-effect blending of realities. But the filmmakers' purpose is to help prepare us for a future in which reality will feel far more permeable to this sort of strangeness.

Of course, in some ways this is not a new phenomenon. It's not even the first time that Nixon has been convincingly placed in a situation that never happened. But unlike the footage of Nixon chatting with Tom Hanks in Forrest Gump, In Event of Moon Disaster didn't require a professional voice actor to mimic Nixon, nor an expensive team of meticulous digital effects artists—and the result is considerably better.

Video: "Forrest Gump (9/10) Best Movie Quote - Watergate Scandal (1994)" (www.youtube.com)

The threat of DeepFakes—and related voice technology—is that anyone with access to enough images and recordings of their subject can already do a decent job of producing a digital clone that will do and say whatever the creator wants. Pump enough information into the software, and the AI does most of the work for you.
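To get a feel for why the barrier is so low, here is a minimal sketch, in PyTorch, of the classic hobbyist face-swap recipe. This is an illustration of the general technique, not the filmmakers' actual pipeline, and every name and layer size in it is made up for the example: a single shared encoder learns faces in general, while one decoder per person learns to reconstruct that specific face. Once trained, a "swap" is just running person A's face through the encoder and out through person B's decoder.

# Minimal sketch of the shared-encoder / per-identity-decoder setup behind
# classic hobbyist deepfakes (illustrative only; layer sizes are arbitrary).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Compresses a 64x64 face crop into a compact "face in general" code.
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    # Rebuilds one specific person's face from the shared code.
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(), # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # trained only on face crops of person A
decoder_b = Decoder()  # trained only on face crops of person B

params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(faces_a, faces_b):
    # Each person is reconstructed through their own decoder; the encoder is shared.
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# After enough training, the "deepfake" is simply:
#   swapped = decoder_b(encoder(face_of_a))   # A's expression, B's face

Real tools add face detection, alignment, and blending back into each video frame, and use much larger networks, but the core trick is no more exotic than this: feed in enough face crops of both people, and the rest is compute time.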

DeepFakes and Disinformation

If this can be done with fake footage of a President from the 1960s, what's to stop malicious users from producing compelling evidence of a modern politician engaging in crimes or voicing reprehensible positions on important issues?

Consider how much of a scandal it became when excerpts of Hillary Clinton's paid Goldman Sachs speeches were leaked—including the assertion that it's important for a politician to have "both a public and a private position." Now imagine if, instead of transcripts, we had gotten cell phone footage of Clinton announcing that her private position was to make guns illegal and abortions mandatory. Or imagine if Donald Trump had been able to effectively claim that the infamous "grab 'em by the p***y" Access Hollywood tape had been faked.

We are approaching the point where our politics will be heavily influenced by the mere existence of DeepFake technology. The Internet, with its open access to unverified information, has already made objective truth feel increasingly fragile.

What's creeping in to replace it is a vast spectrum of niche perspectives that seek only confirmation of their beliefs—communities of thousands of like minds finding each other in order to manufacture consensus on the minute details of obscure issues the rest of the world isn't even aware of.

This is how some cryptic nonsense posted on 4chan explodes into the millions-strong mass delusion of QAnon—wherein JFK Jr. is alive and well, fighting the deep state pedophile ring in the guise of...a man who looks nothing like JFK Jr. It's how even the use of face masks to contain the spread of a respiratory pandemic is treated as an insidious ploy to kill people.

There is no room for authority or expertise when everyone has a community that will tell them exactly what they want to hear. But when this kind of conspiratorial thinking can be bolstered by zero-budget, moderate-effort video "evidence" that only the most careful viewers will have reason to doubt—and that no authority will be trusted to counter—how will the notion of objective truth even survive?

This new arena of disinformation will make conspiracies about the moon landing look quaint and marginal by comparison. Will it be possible to parse events that took place in the real world from a proliferation of digital hoaxes designed to confirm our darkest suspicions?

As the 2020 election approaches, we can expect to see the first round of this kind of subterfuge undermining the process. If we're lucky, efforts to ban the spread of DeepFakes on a number of major platforms will successfully limit their reach, but all it takes is for Donald Trump to tweet one video of Joe Biden professing his loyalty to Xi Jinping in Mandarin, and the seal will be broken.

Will Twitter delete it? Will any of Trump's fans believe it isn't real? And can a divided population—living in separate, increasingly manufactured realities—continue to coexist? The alternate reality of In Event of Moon Disaster allows us to play with this kind of fracture—to safely feel a hint of the strangeness and uncertainty that this new technology may soon make commonplace.

Video: "Home Stallone [DeepFake]" (www.youtube.com)

For now, it still takes a talented team of researchers and specialists to produce footage this convincing, but a cursory search of YouTube turns up countless examples of how eerily well amateur hobbyists can swap someone's face onto another person's performance.

As the software improves and the knowledge of how to use it spreads, the barrier to manufacturing hoaxes of this quality and beyond will all but disappear. Will our society be strong enough to withstand that new threat?