Whether it’s YouTube video recommendations, robotic toys or talking digital assistants like Amazon’s Alexa, there is a debate about how much tech products are designed to addict us or even trick us into thinking they’re more real than they are. When products are created to make kids attach to them like a real-life friend, what happens to that child’s empathy?
Marketplace Tech host Molly Wood continues our series on the grand bargain of tech and what it means for kids. Molly asked MIT professor Sherry Turkle, author of the book “Alone Together,” how robots and their attempts at empathy affect the kids they’re targeting. The following is an edited transcript of their conversation.
Sherry Turkle: Well, convincing us that they’re capable of empathy, actually, isn’t a very hard job because we are cheap dates in this department. If a robot knows your name, if it’s able to tell if you’re happy or sad and sort of ooh and aah in the appropriate ways — we really have machines that can do that and convince us that they understand us.
Molly Wood: How much are we, as a society, talking to robots right now?
Turkle: Well, we’re talking to Alexa, we’re talking to Siri, we’re talking to Echo and mostly we’re talking to them about our playlists and our pizza and our, you know. But in my research, I find that actually people start to chat with them about other things. And so there actually is a sort of pent-up demand for robot conversation because people are lonely. And people like having sort of the illusion of friendship without the demands of intimacy. But I think that the choice point really is whether or not we’re going to let our children grow up socialized to have these intimate conversations with machines that pretend to be their friends.
Wood: I guess it feels complicated to me because I don’t want to tell my child to be mean to an object, necessarily. Or is the idea to never have an Alexa in the house?
Turkle: Well, I think we need to work with the designers so we’re less tempted to think of Alexa as a buddy. You know, actually Mattel pulled one of the toys that it had put on the market called Aristotle, which was designed to be a kind of Alexa for babies. Mattel would own everything that the child said to it, and privacy groups and congresspeople and childhood advocates — and I was part of this campaign — said, “Hold on a second, this makes no sense. Mattel is going to own everything your child says to this toy. That can’t be right.” But the product designer of that toy said something very profound. He said, “You know, we know that children are going to have deep relationships with this toy, and we just hope they’re the right ones.” And it showed that these designers know the power of the relationships that are going to be formed, and they have no idea what those relationships are going to be.