Lee McIntyre: Post-Truth & the Perfect Fraud of Deepfake Video

By Flora Stadler

I remember once, back when I was a Very Serious Poet recently graduated from liberal arts college, I was talking to my mother about graduate school and she asked if I would consider journalism.

“That’s also writing,” she’d said.

“It’s not the same thing at all,” I replied, all righteous.

But 10 years later—after falling into a job as an editor at a local magazine and becoming a true believer in the connection between news and democracy—I did go to graduate school for journalism. There, I got an unexpected education in media literacy: I learned not only how to report news, but how to consume it—how to understand the sometimes subtle difference between fact and implication, the importance of sources, the way language and even statistics can be manipulated.

But nothing could’ve prepared me for where we are now. News media, social media—it’s overwhelming and, honestly, discouraging.

In this bloody election cycle, I’m going to fall back on reporting and bring some nonfiction focus into the Get Lit online salon. I need truth to counter this confusing, divisive and downright Orwellian time, as we learn to consume new forms of media responsibly in the age of post-truth politics.

In my quest to separate facts from alternative facts, I found the refreshingly straightforward Lee McIntyre, Research Fellow at the Center for Philosophy and History of Science at Boston University and author of Post-Truth, a very good (and kind of scary) book examining the path that led us here, and how we can tread carefully on it.

McIntyre isn’t just a nonfiction author. He also had interesting things to say about his fiction writing, a genre he described as “ethical thriller,” and how it impacted his work on this book: “Quite a number of philosophers read mysteries, all the way back to Wittgenstein. And one day, I was reading [John] Grisham and I thought, ‘I can do this, how hard can it be?’ It’s very hard, very hard indeed. But it’s made me better at writing nonfiction because some of the techniques used in fiction, like show don’t tell, really work in nonfiction. Sometimes what’s the most persuasive is to tell somebody a story. So Post-Truth would’ve been a very different book if I hadn’t written fiction… I had to tell a story that made sense in the overall arc of the argument.”

But let’s get to the juicy nonfiction, and to the root of my question:

Since you’ve written the book Post-Truth, what’s the one thing you’re seeing now that worries you most, and what should we be doing as news/social media producers and consumers to mitigate it?

“The most common question I get is, ‘Where do we go now? What can we do next?’ In order to answer the question, you have to play the ball where it lies, and it’s a slightly different place right now than it was when I turned in the manuscript for the book [in May 2017].

By far, the thing that I’m the most worried about is the extent to which post-truth is a precursor for an authoritarian government. We’re seeing more and more that lying by public officials—not just in the U.S., but in other countries—is a precursor for oppression and silencing of journalists and, in some cases, violence against journalists. Political leaders do that when they’re getting ready to take over, getting ready to accrete more power, and they want to shut up the truth-tellers.

The reason I’m more worried about this than I was a year-and-a-half ago is not only that political events have changed, but that fake news has changed. At the time I wrote Post-Truth, the main kind of fake news we were concerned with was the type being produced in Russian troll farms and other places: naked attempts to insert false stories into the news stream and get us to believe them. That still exists, but now Trump is making false accusations of fake news against legitimate news organizations, which can have a kind of ricochet effect.

And with the technological advances, it’s now possible to produce audio and audiovisual fake news. There’s a company called Lyrebird that has a very legitimate business: What they try to do is take a one-minute audio sample and use that to create a vocal dictionary. So they put it through a speech synthesizer and get you to say anything you need to say. Now, if you’re Stephen Hawking, that’s a good thing. But if you’re someone trying to produce fake news, that’s a terrible thing, because you can use someone’s voice to try to get them to say whatever you need to say.

There’s another project out of Stanford, a video project called Face2Face. What they can do, which is straight out of Orwell, is take a person who’s being videotaped on a screen, put a mask on a person who’s standing just offstage, and through a technology called ‘face capture’ make the image that’s being projected on the screen conform to the facial expressions of the person wearing the face mask, rather than the person on the screen. In real time, they’ll be able to digitally alter the facial expression of the person who is giving the speech. When that happens, I don’t know what happens next. At that point, it may be that people are so cynical and demoralized that they think everything is fake. And that puts us at enormous risk for authoritarian rule.

I think the only thing that we can do is be aware that it exists. We were enormously behind the curve on text- and picture-based fake news, and it’s taken us two years to get to the level of public education and literacy where people are beginning to be skeptical about news. I think it will take that long or longer for people to get to a similar level of skepticism about the new technology, which is called Deepfake tech. It is so compelling that even if you know it’s fake, it appeals to a certain cognitive bias that we have in our brain, where we might enjoy watching it. I remember after the Parkland tragedy, there was a fake photo that went around of Emma González ripping the Constitution in half. It was completely Photoshopped and fake. The sad part of it is that even people who knew it was fake shared it and loved it. So just imagine what will happen when we can share audio and video. Even if it’s fake, it’s still going to appeal to that cognitive bias that we have.”

Lee McIntyre is a Research Fellow at the Center for Philosophy and History of Science at Boston University and ethics instructor at Harvard Extension School. His many wonderful books on truth, science and philosophy can be ordered at Binnacle Books in Beacon.