Deciding who receives an organ donation is a morally fraught process. And the organ transplant network in the United States has been criticized for its outdated technology that has led to some deadly mistakes.
But one part of that system could potentially serve as a model of success, according to David G. Robinson, author of the new book “Voices in the Code: A Story About People, Their Values, and the Algorithm They Made.”
Since 1987, kidney recipients have been chosen by an algorithm. Robinson said it could be an ethical “model” for other algorithms and potentially artificial intelligence.
Marketplace’s Kimberly Adams recently spoke with Robinson about how this algorithm works and how a long process of input from nonexperts and people directly affected by kidney transplant decisions has shaped it into what it is today.
The following is an edited transcript of their conversation.
David G. Robinson: The algorithm will produce a prioritized list of recipients. And it’ll depend on lots of details, like who has the right blood type, who is nearby and ready for surgery. But there’s also a moral piece to this, because you could say, “We’re gonna maximize the total number of years of people’s lives.” When they redesigned the system, that was what the doctors wanted to do. But the thing about that is, it’s not very fair, because it means that the youngest and healthiest would be the ones always first in line. It would become very hard for older candidates to get organs. Also, from a race equity point of view, because many of the social determinants of health are concentrated in communities of color, [there are] disadvantages in those areas. So people looked at this and they said, “It also matters whether everyone gets a fair shot at having a kidney.”
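To make that tradeoff concrete, here is a minimal, hypothetical sketch in Python. It is not the actual UNOS formula; the field names and weights are invented for illustration. The idea is that a score built on expected medical benefit alone favors the youngest, healthiest patients, while adding a fairness factor such as time already spent waiting changes the order of the list.

```python
# Hypothetical illustration only -- not the real UNOS kidney algorithm.
# Each candidate gets a composite score: expected benefit (which favors
# younger, healthier patients) plus waiting time (a fairness factor).
# The fields and weights below are invented for illustration.

def priority_score(candidate, benefit_weight=1.0, waiting_weight=1.0):
    """Combine medical utility with a fairness factor."""
    return (benefit_weight * candidate["expected_benefit_years"]
            + waiting_weight * candidate["years_waiting"])

def ranked_list(candidates):
    """Return candidates sorted from highest to lowest priority."""
    return sorted(candidates, key=priority_score, reverse=True)

candidates = [
    {"name": "A", "expected_benefit_years": 30, "years_waiting": 1},
    {"name": "B", "expected_benefit_years": 12, "years_waiting": 12},
    {"name": "C", "expected_benefit_years": 20, "years_waiting": 2},
]

# On benefit alone the order would be A, C, B; counting waiting time
# moves B, who has waited twelve years, ahead of C.
for c in ranked_list(candidates):
    print(c["name"], priority_score(c))
```

Changing the weights is exactly the kind of moral choice Robinson describes: it is a decision about how much fairness counts against raw medical utility, not a technical detail.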
Kimberly Adams: Is there a human decision in any part of that? Or is it straight to the list that the algorithm creates?
Robinson: Once we build the machine that makes these lists, then the making of the list is automatic. And of course, we debate about, OK, how to build the machine. But then, once there’s a list, those are not final decisions about who gets an organ. They are offers that go out to the medical team for each patient. And sometimes patients might say, “OK, I got an offer, but I think I’ll get a better one later so I’m going to wait,” or “I’m not ready right now,” or “I’m out of town.” I mean, any number of things. And when those things happen, the organ will then go to the next person on the list. So there’s a big human ingredient before we actually get to the operating room.
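The offer cascade Robinson describes can be sketched as a simple loop. This is purely illustrative, with hypothetical names: the organ is offered down the ranked list, each medical team may decline for any reason, and the human decision is modeled as a callback.

```python
# Hypothetical sketch of the offer cascade, not real UNOS code.
# Offers go down the ranked list; each medical team may decline
# (patient is out of town, holding out for a better match, etc.),
# and the organ then goes to the next person on the list.

def place_organ(ranked_patients, accepts):
    """Offer the organ in priority order; return who accepts, or None.

    `accepts` stands in for the human decision: it returns True
    if the patient's team takes the offer.
    """
    for patient in ranked_patients:
        if accepts(patient):
            return patient
    return None  # organ goes unplaced on this list

waitlist = ["patient_1", "patient_2", "patient_3"]
# Suppose the top patient's team declines ("I'll get a better offer later"):
result = place_organ(waitlist, accepts=lambda p: p != "patient_1")
print(result)  # patient_2
```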
Adams: This sounds quite different from the way that other algorithms are designed, especially when it comes to these public-facing programs. How did this process come about to develop this kind of algorithm for determining who gets kidney transplants?
Robinson: It actually goes back even to before there were transplants, even when it was just a matter of who could get dialysis. There was a doctor named Belding Scribner in the 1960s, who said, “I don’t want to be, as a doctor, deciding who lives and who dies. That’s a question for the wider community to weigh in on.” That tradition stayed with us the whole time, and now when they design this algorithm, they have patients, they have loved ones, they have organ donors who get a voice in deciding how this thing is going to work.
Adams: I wanted to bring up an anecdote from your book where a data scientist from the United Network for Organ Sharing, the national nonprofit that manages kidney and other organ donation decisions, called you up and asked you a very specific question. Can you talk about that?
Robinson: Yeah. They said, “David, when we are calculating the score that each patient gets, how many decimal places do you think we should use?”
Adams: And this is the score that determines whether or not they’ll get a kidney?
Robinson: Exactly, right. And this happens for each organ, for lungs and hearts too, so it’s hugely important. Whoever has the higher score, it can be a matter of life and death. But what the data scientist who called me explained was that they can go out to 15 decimal places, if they want to, to calculate a tiny difference in the scores. And I sort of thought, “Well, why don’t you go ahead and do that? Use all the data you can to be as precise as possible.” But what they said was those tiny differences in the numbers don’t reflect a real difference medically between one patient and the next. So even if, in the 15th decimal place, there’s a tiny difference, realistically, that just means those are two clinically equivalent patients, patients who are equally ready to benefit from a transplant. And their point was maybe that’s an ethical decision. You know, we don’t want to flip a coin if we don’t have to. But we also don’t want to pretend that the medical data resolves this hard choice.
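Robinson’s point can be shown in a few lines of Python. This is an assumed illustration, not the transplant network’s actual code: rounding scores to a coarser, clinically meaningful precision makes near-identical patients tie, and the tie is then broken by an explicit, deliberate rule (here, waiting time) rather than by noise in the 15th decimal place. The precision and the tie-breaker are both assumptions made up for the example.

```python
# Hypothetical illustration of the decimal-places decision.
# At full precision, two clinically equivalent patients get
# artificially different scores; rounding to a coarser precision
# makes them tie, so an explicit tie-breaker decides instead of
# numerical noise. Precision and tie-breaker are assumptions.

PRECISION = 2  # assumed: differences beyond 2 decimals aren't meaningful

def clinical_score(raw_score):
    return round(raw_score, PRECISION)

patients = [
    {"name": "P1", "raw": 87.6543216, "years_waiting": 3},
    {"name": "P2", "raw": 87.6543201, "years_waiting": 7},
]

# At full precision P1 edges out P2 by a meaningless sliver.
# After rounding they tie, and the explicit tie-breaker (waiting
# time) puts P2 first: sort by (rounded score, waiting time).
ranked = sorted(patients,
                key=lambda p: (clinical_score(p["raw"]), p["years_waiting"]),
                reverse=True)
print([p["name"] for p in ranked])  # ['P2', 'P1']
```

The design choice is the one Robinson names: deciding how many decimal places are “real” is a moral judgment about when two patients should be treated as equals, not a technical optimization.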
Adams: What did that call sort of reveal to you about how data scientists and the folks who design these algorithms are thinking about ethics?
Robinson: So the thing that was so exciting was this is really a moment of moral humility for the data scientists who are in this process, right? They were raising their hands to say, “This is not a technical question. This is a moral question, and it belongs to a wider community than just the technical experts. So we’re going to make space at the table for other voices beyond just the data people to decide how this software is going to work.”
Adams: Are you seeing this sort of more democratic process of developing algorithms, this community input in these, you know, life-changing decisions in other algorithmic models that are affecting people’s day-to-day lives?
Robinson: Yes, I think there’s a real push to do this; it’s been proposed in recent legislation, both in the U.S. and in Europe. There have been advocacy efforts, some of which I was involved in, to try to get community input into these courtroom algorithms. The thing you’d be deciding there is who gets to go home while their case is working its way through the system, and who has to wait in jail because we’ve decided that they’re dangerous. And there’s a lot of thought that, “Oh, we ought to try this.” What’s missing is a lot of good examples of how this can be done practically. And I think that’s what this gives us. That’s why I find it exciting: it proves that, yes, democracy with respect to some complicated piece of software, getting people into the room, can be messy and can take some time, but bottom line, it can work.
Adams: But at the end of the day, it is still an algorithm making a very human decision. And I wonder if you think people are ever going to be OK with that?
Robinson: I think you’re exactly right, that it’s never OK. It’s never perfect. There are never enough organs to go around. And in fact, they’re still working to try and make it better. But I think when you include other voices in the code, there are surprises, and you can end up in a better place.
I mentioned at the top of the show our coverage of flaws in the tech used by the United Network for Organ Sharing in the U.S. You can read more about that in our earlier coverage.
David Robinson adapted some of the themes from his book, which is out now, into a shorter article for Slate, in case you want to get a preview.
Robinson also mentioned other areas where algorithms were being used in significant public decisions, like earlier this year, when we covered an AP investigation into a Pennsylvania county that used algorithms to help decide which families officials should investigate for possible child neglect and abuse.
That process was criticized for potentially worsening racial disparities in the child welfare system there.
And Wired has an article about how algorithms are being used in bail hearings, a process that has also been criticized for exacerbating racial inequities.