Viruses don’t discriminate, but health care often does
This week on Marketplace Tech, we’re reporting on the technology that will help us pivot to a post-pandemic world, and some that might create even more inequality. A paper published in June in the New England Journal of Medicine looked at how artificial intelligence is used to determine treatments and care. It found that many algorithms used in medicine include race as a variable.
Here’s an example. It’s hard to test directly how well a person’s kidneys are working. So, instead, many doctors use an algorithm to estimate kidney function. The algorithm uses several factors, including race, to make these guesses. And that is a problem.
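To make concrete how a race variable can shift a clinical estimate, here is a minimal sketch modeled on the 2009 CKD-EPI creatinine equation, one widely used formula for estimating kidney function. The article doesn’t name a specific equation, so the coefficients below are an assumption drawn from that published formula, not from the paper being discussed:

```python
def estimated_gfr(creatinine, age, female, black):
    """Sketch of the 2009 CKD-EPI eGFR equation (mL/min/1.73 m^2).

    Coefficients are from the published 2009 formula; treat this as an
    illustration of how a race multiplier works, not clinical software.
    """
    kappa = 0.7 if female else 0.9     # sex-specific creatinine divisor
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(creatinine / kappa, 1.0) ** alpha
            * max(creatinine / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race coefficient: same labs, higher estimate
    return egfr

# Identical patient, identical lab result -- only the race flag differs.
without_flag = estimated_gfr(creatinine=1.2, age=55, female=False, black=False)
with_flag = estimated_gfr(creatinine=1.2, age=55, female=False, black=True)
```

Because the race term is a flat 15.9% multiplier, two patients with identical lab values get different kidney-function estimates, which can change whether a Black patient crosses the threshold for specialist referral or transplant listing.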
“Race is a social and not a biological construct,” said Leo Eisenstein, a physician at NYU and Bellevue hospitals and one of the authors of the paper. “We’re seeing differences in outcomes, not because people who are Black have essentially different bodies, but their experience of being Black in this country is essentially different.”
But encoded in the world of health care is the idea that Black and brown bodies are different. As a result, “there’s an excuse for why there are any inequalities of health in the notion that these are produced naturally by racial differences,” said Dorothy Roberts, a professor at the University of Pennsylvania who wrote a book about race in science and medicine.
“People will say Black race predicts for some bad outcome, when it’s actually structural racism that’s operating to put people in a vulnerable position where they are at risk,” she said.
Roberts talks about one study that examined why Black women in Chicago started dying from breast cancer at a much higher rate than white women beginning in the 1990s.
“The researchers concluded it was not because Black women’s health got worse. It was because there had been huge advances in breast cancer detection and treatment over those 20 years, and the best machines are located in private hospitals, where more advantaged people go,” Roberts said.
Roberts said that history makes her deeply worried about the distribution of lifesaving technology to treat or prevent COVID-19. “There are ways in which racism is built into these rationing guidelines, which are based on a premise that scarce resources shouldn’t be wasted on someone who might die at the hospital,” Roberts said. “Now, because we live in a society that is structured by racism and racial capitalism, this tool is systematically going to discriminate against Black patients because they have already experienced a society that is set up in a way to produce lower life expectancy for them.”
Related links: More insight from Molly Wood
The nonprofit science and tech magazine Undark has a long piece about how systemic bias is probably leading to the disproportionately higher death rates among Black and brown people in the U.S., not just because of comorbidities associated with poverty or increased exposure to pollution, but because researchers found Black and Indigenous people may also be getting turned away from hospitals or denied testing until it’s too late.
Monday, I talked about vaccine innovations. One of the leading vaccine research companies, Moderna, announced that it will start one of the biggest vaccine trials in the world, enrolling some 30,000 people to test its COVID-19 candidate. The Moderna vaccine uses the method I talked about, called mRNA: genetic instructions that the body’s own cells use to produce a fragment of the virus, which then prompts the immune system to make antibodies against it.