Should we worry about deepfakes and an “epistemic apocalypse”?
It’s getting harder to believe your eyes and ears on the internet. Artificial intelligence tools can generate convincing images, videos and voices. Chatbots can spit out confident misinformation. And for $8 a month, Twitter users can basically impersonate anyone they’d like on the site.
The specter of an internet full of fakes has a lot of people worried about an “epistemic apocalypse”: a total breakdown of our ability to perceive truth and reality.
It’s something Joshua Habgood-Coote, a research fellow at the School of Philosophy, Religion and History of Science at the University of Leeds in England, has written about. The following is an edited transcript of his conversation with Marketplace’s Meghan McCarty Carino.
Joshua Habgood-Coote: I feel like there’s been a lot of very catastrophic discourse about the effects of various new kinds of technology on our epistemic practices, which is to say the practices which we use to gain knowledge about the world, to find out truths beyond what we can see and hear in our immediate environment. And the thought is: Every new technology is going to catastrophically disrupt our practices for generating knowledge, and we’re going to completely lose track of truth wholesale. So partly, I want to engage with that discourse and do some critical thinking about whether those kinds of very catastrophic predictions are really justified.
Meghan McCarty Carino: What can we learn from how people have responded to some of the recent examples of AI fakes?
Habgood-Coote: I think there are a couple of very interesting cases. So the image of the so-called “drip pope” — so this was a set of images that were produced of Pope [Francis] wearing some kind of designer coats, which were extremely fashionable. And I think a bunch of people did think that these were genuine images of the pope.
McCarty Carino: I bought it (laughs).
Habgood-Coote: Yeah. So you might think, “OK, I got fooled this one time, I’m not going to get fooled again and again.” The question is: What are the stakes of believing that this is a genuine image of the pope? Actually, very low.
McCarty Carino: Right, which maybe is why I bought it. Whereas with those fake photos of Trump being arrested, I think people were naturally a little bit more suspicious.
Habgood-Coote: Yeah, exactly. So there’s also a whole set of social practices around checking images of politicians, and you can look to other sources for what’s been going on with Trump today.
McCarty Carino: Are there examples from the past of how society has adapted to new epistemic challenges like this?
Habgood-Coote: It’s important to look at the history of deepfakes. There was a whole lot of practice in the late 19th century of people touching up portraits to make them look better. Actually, at the end of the 19th century in journalism, all photographs that were printed in newspapers in the United States were, to a greater or lesser extent, changed and modified to make them look better. And what they did was come up with a set of social norms against faking. So this isn’t to say that the problems around deepfakes and other kinds of faking aren’t going to be serious, but we risk losing the long history of these problems, and throwing away possible solutions, if we forget that history.
McCarty Carino: It’s interesting talking about the role of technology in all of this, because we’ve talked a lot about it with social media — the idea that it has allowed for this proliferation of misinformation and threatened democracies and all of these things. But I think there is a discussion to be had about whether the way people are using social media is a symptom of, as you say, social practices that were happening before social media enabled them, or whether the technology itself instigated this.
Habgood-Coote: Yeah, so I kind of got into this topic by thinking about discourse around fake news and post-truth around 2017. And it was pretty clear that a lot of the discourse around fake news and post-truth was acquiring the character of a moral panic: “There’s this terrible problem, society is going to collapse, democracy is going to collapse.” And none of the people who were writing about real problems with misinformation and propaganda were paying attention to the longer history of declining trust, not just in media sources but in various kinds of social, quote-unquote, authority sources, and the political causes of those phenomena. So there’s a way in which the cart is before the horse.
McCarty Carino: Yeah, it’s kind of a chicken-and-egg thing.
Habgood-Coote: Yeah, we see people believing false things and notice that it’s connected to technology, so we think, “Oh, the technology has got to be the cause. That’s the thing that’s driving social change,” rather than doing some socially informed, politically informed history and thinking about the way in which our political context has led to a decline of trust in media sources, and also the economic context. So in a way, social media is much more important insofar as it’s had economic effects on local, national and state media, rather than terrible effects on the way that people engage with reality. It’s taken away the funding model for lots of forms of media which — they weren’t perfect, but they had some degree of reliability in keeping track of the world.
McCarty Carino: There seem to be two main fears when we talk about this. One, that people will be very easily deceived by these new synthetic media. And the second, following from the first — that people will become so skeptical of everything they see that no one will believe anything and there will be no shared reality. Is there a risk of that kind of skepticism?
Habgood-Coote: The conditions that are more likely to cause that kind of very generalized skepticism are produced by people talking about how desperately terrible the effects of manipulated media are going to be. You don’t become skeptical about all videos you see after you’ve seen one deepfake video. You might if you’ve read a book that says, “Oh, the epistemic apocalypse is coming,” and tells you, in a quite persuasive way, about all of the ways in which we’re losing contact with shared reality. So there’s a really horrible self-fulfilling prophecy, where the people talking about the possibility of this very general skepticism, or loss of contact with reality, are actually more likely to be causing the thing they’re worried about than the technology they’re worrying about. So I think there’s real reason to be measured and cautious when we’re thinking about these really apocalyptic scenarios, because raising the worry can create the thing you’re worried about.
Related links: More insight from Meghan McCarty Carino
If you’re wondering about that deep history of deepfakes Joshua Habgood-Coote mentioned, he’s posted some examples of early manipulated photographs on his Twitter feed. They include a so-called photo from a 1910 livestock auction where the horses and cows appear quite a bit larger than they should. Another from 1907 shows a supposed ear of corn that is so enormous, it spans an entire railroad car. The images are pulled from a book by Mia Fineman called “Faking It: Manipulated Photography Before Photoshop.”
Last week, we talked about all the AI music fakes, including the fake Drake and The Weeknd song that went viral, and many, many more.