We have far too many examples in recent years of hate speech sparking riots, mobs and individual attacks. One group at particular risk is the transgender community.
This year, at least 35 people in the trans community were “fatally shot or killed by other violent means,” according to the Human Rights Campaign. And a recent report from the HRC Foundation found highly organized online attacks against hospitals and health care providers in 21 states targeting facilities and doctors that provide gender-affirming care.
Marketplace’s Kimberly Adams spoke with Erin Reed, a legislative researcher and a trans rights activist who’s been following these patterns for the last three years. The following is an edited transcript of their conversation:
Erin Reed: What we’ve seen, and what I’ve seen, is an increase in the school boards, hospitals and administrators that get targeted on online platforms then facing actual real-life violence. I have seen a number of gender-affirming care programs at hospitals paused or shut down because of the violent threats that people are receiving. And these threats often first show up on online platforms through accounts like Libs of TikTok, Matt Walsh, etc. This materializes into what I believe is real-world harm.
Kimberly Adams: I just want to get a little specific about what you’re talking about. It looks like some of these online attacks are focused on hospitals, because they might be doing what medical officials view as necessary, which is gender-affirming care, and on schools, which may make decisions about bathrooms and who gets to participate in which sports.
Reed: Yes, yes, exactly. So what we normally will see is one of these accounts target a hospital because it has a gender-affirming care program. These are medical decisions made between doctors and their patients, following the best medical guidelines that are out there. Nonetheless, whenever an account targets the hospital, rather than going through the political process, we see extrapolitical action and violence against these groups of people. And same thing with school districts. A very good example of this would be Kiel, Wisconsin. The school district in Kiel, Wisconsin, was investigating the bullying and harassment of a trans student under Title IX. And as a result of that, it got blown up on Twitter. We had multiple hate accounts say, “Look at what’s going on: Title IX is being used against people for misgendering a trans person.” And then as a result of that, many of the schools in Kiel received bomb threats over the course of a month. Schools were shut down. It made news, and after that the Title IX complaints were dropped. So we do see real-world violence and violent responses come from hate on our digital platforms.
Adams: And what are the platforms doing about all this?
Reed: Prior to [Elon Musk’s] takeover of Twitter, we did see some inroads being made. We saw discussions being had around the role of hate accounts in spurring this violence in real life. I know, for instance, the account Libs of TikTok was banned from Twitter for about a week, shortly after sending a hate mob against Vanderbilt Children’s Hospital and Boston Children’s Hospital. And while we don’t know exactly why [Chaya Raichik] (owner of Libs of TikTok) was banned, when she returned, she had stopped tweeting about the hospitals. And so this was a sign of hope that maybe things were going to change. But that’s all gone now. Since Musk’s takeover of Twitter, we have seen Elon Musk step in to defend Libs of TikTok. We’ve seen Elon Musk step in to defend Shopify for hosting Libs of TikTok and the products it sells calling LGBTQ people groomers. And so it’s this kind of rhetoric — the grooming rhetoric against LGBTQ people, the attacks on children’s hospitals, the attacks on school districts — that no longer can be reported and no longer will be removed. And in fact, it’s being defended by some of the rich CEOs who own the platforms.
Adams: Can you give a concrete example of how some of this online rhetoric has translated into actual policy?
Reed: With Vanderbilt Children’s Hospital, we first saw Libs of TikTok blast the hospital. Matt Walsh then took up the attacks on Vanderbilt Children’s Hospital and decided to have a rally in Nashville, Tennessee. And it was essentially a rally to shut down the gender-affirming care program at Vanderbilt Children’s Hospital. This two-pronged approach of political rallying plus violent threats then led to one of the first bills targeting all gender-affirming care in the state of Tennessee. And not just surgeries; we’re talking puberty blockers, hormone therapy, etc. And so we are seeing this sort of translation: Libs of TikTok, Matt Walsh, in-person rallies, as well as in-person violent threats, and then legislation in Tennessee.
Adams: Are there any good examples of the reverse happening — online actions or organizing, changing legislation that the LGBTQ community sees as harmful?
Reed: Absolutely. There is a lot of very powerful digital activism that people like myself try to lead. In Ohio, there have been multiple anti-trans bills proposed — things like the sports ban, HB151, and the medical ban, HB454, which would have medically detransitioned all transgender youth in the state. And I have personally posted TikToks of people testifying against these bills. And they went viral. And then people asked me, “How can I show up? And how can I go and testify?” And this digital activism collectively led to what was essentially almost every single person coming out against the bill in public. And both of these bills failed. So we do see the ability to organize online against bills like this. And that is one thing that I’m trying to make sure that we can do effectively.
Adams: Moving forward, what do you hope changes when it comes to the way that platforms deal with these online threats and legislation against the trans community?
Reed: You know, there are guardrails that have to be established against hate. And if you don’t establish those guardrails, you will, number one, make your platform uninviting to members of any minority community; and number two, you will kill the ability of your platform to reach advertisers and the ability of advertisers to reach people on your platform. Companies don’t want to see their content placed alongside hate. And so I think that platforms have a responsibility, a social responsibility, to ensure that hate content is not left unmoderated. And this goes both for the ability of hate content to spur on anti-trans legislation and for its ability to spur on anti-LGBTQ violence. And I think that we’ve got so many examples of that having happened before.