Author Jeanette Winterson has been reading and writing about artificial intelligence and its relationship to humans for years. But, as she says in the introduction to her new book, she felt like she wasn’t seeing the big picture of how technology is subtly changing human relationships.
Her book, a collection of essays called “12 Bytes: How We Got Here. Where We Might Go Next,” explores these themes. Winterson goes back to the first computers of the Industrial Revolution and imagines how AI will shape our love and sex lives in the future.
Winterson wants to understand the implications that AI has for humans — the good and the bad, especially when it comes to robotics. And in that space, Winterson said, a lot of the investment is going toward the good. The following is an edited transcript of our conversation.
Jeanette Winterson: Most of the money around friendly robotics is going in two directions. One is in little iPals, really, or the little robots that you can use around the house that will help you. And that’s great because they’ll be patient with your kids, teaching them whatever they need to learn. It’s better than sitting there in front of the TV. They will be fantastic with older people. You can say the same thing to a robot 500 times and it won’t get impatient with you. There are real benefits there. But one of the things that worries me on the other side of that is this growing trade in sex bots, which are really kind of femalettes, 1950s-style. And the question is, well, it looks like a funky, new technology, but it’s built on the old platform of gender, money and power. So that is problematic.
Kimberly Adams: But there are also uses for AI right now, where we have computers managing relationships between people. And I wonder how you think having computers and technology as the intermediary within human relationships is going to affect how we relate to each other moving forward.
Winterson: Well, the question is who’s done the programming? Once the datasets are fed in, the machine looks like it’s neutral or unbiased or objective. But we know that isn’t true — that it’s only as good as the stuff that’s going in. But one of the things that I think has been helpful is that when we’ve looked at the datasets as amplified and magnified by AI bias, we can see our own prejudices. And it’s a rather uncomfortable reflecting mirror.
Adams: After doing all of this research and writing this book, how do you balance your excitement about the future with the fear?
Winterson: For me, we’ve got to be realistic, not indulge in magical thinking. It’s the same with climate breakdown. This is real, this is happening, this is now. But if we always go for the doom-laden, Armageddon, dystopian, Götterdämmerung approach that it’s all going to collapse and we’re going with it, then that will happen. The way we live, it’s not a force like gravity that we can’t escape. It’s propositional, we can change it. And for me, that is the ultimate power that humans have: that we can change the story because we are the story.
Adams: How has writing this book changed the way that you consume tech in your own life?
Winterson: I’ve always been pretty careful. I think it partly is being an analog human, because I was born before any of this started in any significant way. So I feel comfortable using it, but I don’t want it everywhere. I’m somebody who shuts down the computer and the phone about 9 o’clock at night. I don’t use Siri or Alexa, because I know they’re listening in. You’re just going to have a conversation in front of your system about dining room chairs, and then you’ll find that all the adverts popping up the next time you open your phone are for dining room chairs. That’s not a level of intrusion that I really want, and I wouldn’t want to live inside the Internet of Things in a smart, connected house, where my fridge will lock me out if it’s a diet day and the toaster won’t toast me any bread if it’s a no-carb day, you know. These things sound like jokes, but they can happen. Your bed will be monitoring your sleep and sending the results to your doctor. Are you fit to drive? Did you have any sex? Are you sure? There are self-driving cars that can actually drive you anywhere, and people don’t really realize this yet. And if you don’t pay the installments, the car just won’t start. You know, telematics is a big thing. You won’t be able to start your car. So there are all sorts of ways that this can prove problematic. So because I live in the deep country, in the middle of nowhere, I’m trying to keep myself a little bit off-grid, even though, as we said at the beginning, young people look at me and say, “Are you crazy?”
Adams: Although it’s kind of lovely that you try to be off-grid and you just wrote a book on technology called “12 Bytes.”
Winterson: Yes, I know. But I’m sitting here by my fire. My stove is going with wood that I chopped myself. I grow a lot of my own food. I sound like one of these prepper [survivalist] communities. Maybe this is the problem. Maybe that’s where my real alignment lies. But I think I could manage pretty well out here with food and fuel, and I’ve got a bicycle. I’ve got some hands, so I would manage. Part of me loves that life. But I also know that this life began to disappear for most people all over the world when the Industrial Revolution came in and we stopped being an agricultural society. I know the life I’m living is not practical, and it’s not sustainable for most people. Cities are great. Cities are our future. There you can harness resources. People can live densely, but they don’t have to live lives that are just like an anthill. We can make cities better, beautiful and workable, and smart cities are a way forward. So that’s why I’m torn. I can see it from both sides, and that’s what I wanted to put in the book. And that’s why I call it a debate. Who do we want to be? How do we want to live? What choices would we like to make for our children, for ourselves now? And can we please be in that conversation?
Winterson’s book focuses a lot on how AI and robots interact with us in our personal lives, but what about at work?
A team at the University of Georgia is working with the U.S. military on research looking at the best way to help robots and humans interact on teams, digging into questions like, how do you get humans to trust a robot when it comes to life-and-death decisions on the battlefield?
And TechCrunch reports that Facebook, or now Meta, is working on building more sensitive robots, developing a new kind of artificial skin that will help robots of the future better replicate a sense of touch. Researchers hope that will make assistive robots more gentle in their physical responses, like not gripping someone’s arm too tightly or pressing too hard on a button.
In other Facebook news, the company said it’s shutting down its facial-recognition system and deleting more than a billion users’ data in the process. Privacy advocates have been complaining about this for years. Facebook is not, however, getting rid of the software that powered the system.