For Black History Month, we’re looking at the history of Blackness on the internet. Through most of that history, Black women in particular have been disproportionately harassed and abused.
And then ignored when they tried to report that abuse or point out how tech might be misused to further oppress people online and offline. Ignored by tech companies and, it must be said, by journalists, too.
Researcher and writer Sydette Harry wrote about this in Wired in a piece called “Listening to Black Women: The Innovation Tech Can’t Figure Out.” I asked her: “What could the internet look like right now if we had listened to Black women all along?” The following is an edited transcript of our conversation.
Sydette Harry: The internet could look like a place that people actually lived in. I think one of the biggest problems of what the internet looks like now is there is very little resemblance, in a lot of ways, to the lives we actually have to lead. We don’t have sex workers, we don’t have poor people, we don’t have languages and accessibility that looks like the world. And had people listened to Black women, and all marginalized people, at the start of the internet, we would have designed, we would have made, we would have moderated and created an internet that fit more into our lives.
Molly Wood: I am tempted to ask you, also, what democracy might look like, but that seems a little far afield from this.
Harry: Is it, though? Is it actually far afield of it? One of the jokes is right now we’re in the middle of a pandemic, in the middle of an attempted coup, and the world is going through upheaving changes. And we’re conducting a lot of this through the internet. That is the function of democratic, nondemocratic governments right now, in many ways. And one of the biggest examples and biggest issues that we’re having to deal with is how unequal access is to the internet, but also in general to democracy.
Wood: You talk about the relationship between tech companies and journalists. Tech journalists, hello, should be holding these companies accountable but also have not historically seen Black people as their audience. Neither the companies nor the journalists.
Harry: No. I truly don’t think they have.
Wood: Is it about representation?
Harry: I think representation is a part of it. I also think it’s about access. If you think that reporting on tech is kind of the horse race or a discussion of those in the know, it becomes about access. I think that when you talk about tech culture, and tech culture has continually focused on content farms and influencers and money and brand-new platforms, which is OK, but you haven’t looked at how have organizers created archives? How have they preserved dying mediums, how have they preserved gaps and transformed gaps in how we know technology to work? Some of the most interesting work to me has been dance archives, has been film archives, not because I’m necessarily really concerned with performance. I think other people do that fantastically. But because you’re watching as people just given the tools are completely transforming what we understand history to be by digging into their mamas’ gardens and universities. But I’m more likely to get an in-depth look at what kids in L.A. are doing, kids who come from parents who are in television and film and are in these houses that have already been financed. That’s the gap. There’s that thing of access versus that thing of concern. You can always get someone who needs more publicity to talk to you, you can always get someone who’s impressed by your credentials and your connections to talk to you. When you are talking about people who are doing the fundamental things that are necessary to their community, that are deeply connected, where this isn’t just a job but their survival, the embrace and bringing about of their culture, it takes more. And it does need people who are more familiar with that culture. But it also takes a really … shifting of what you think is important. And that’s not just representation. I think that’s structure as well.
Wood: Do you see any momentum for these conversations? You said there is resistance. Do you see any of that resistance starting to change? Or does there need to be a bigger effort, a federal-level audit?
Harry: I am always wary of the idea that the way to bring about more change is to get more laws and get more government involved. Because the other part about this is that the people that the government is talking to and the people who are going to be involved are the people the government has oversurveilled and overpoliced and overincarcerated as part of the fundamental system of America. I think that there is a big push because there is so much brilliance, and we have a lower threshold of technological access. You have this ability to make something on your phone, to have access to music, to share, and often circumvent in some ways, the general structures of gatekeeping.
There were the Golden Globe announcements [this week]. And it was after #OscarsSoWhite, after all of this talk of diversity, it was still kind of remarkably undiverse. And what really stunned me was that the conversation picked up within an hour of how uninterested people were, how unexcited they were about these things, about how this wasn’t diverse. And it used to take months. It used to take years. Now, it’s within an hour. And you cannot, especially for [the] streaming system and other places that depend on our excitement, on our chatter, on people feeling connected, see yourself constantly told that people aren’t excited about this, people are not excited by nondiversity, and not change. There will be change. I think that there’s going to be a constant push, if not because of the political aim of “We want to see more,” [but because] people are turning on and turning off things really quickly. They’re subscribing and unsubscribing really quickly, because there’s not a belief that you can’t go somewhere else. And that, I think, is very exciting. I’m excited about people trying to make their own art. What I want to push for is, this is an opportune moment. This is an excellent moment for us to sit down and truly reexamine what we want the internet to be in our lives, what it is now, what we see going forward and what are the truly imaginative and innovative things we can do.
Wood: You mentioned preservation, and that Black experiences with technology haven’t actually been preserved for centuries. How important is that to creating an internet that accurately reflects what you said: life, real life?
Harry: They’re inextricable. I think what we’re looking at right now, in a lot of ways, is the exact problems we see when that doesn’t happen.
Wood: I guess, I mean, if we just started, are we going to create an internet that in 10 years will be a better reflection of real life than the one that was originally created with this very narrow lens?
Harry: Yes. If we started right now, we would probably create a better internet and a better experience for people on the internet within six to eight months. The question is, do they believe that this is necessary? Are they willing to resource it?
Here is one thing that has not happened: we are in a pandemic, and people have constantly been speaking to each other, in really heartwarming and thought-provoking ways, about how time has really been knocked out of whack for them, and they don’t know what to do. And they are using platforms and communication styles more and more. And not a single platform has really considered bringing back a chronological timeline. Can you imagine what it would feel like to know that the interactions you have with platforms were helping you understand how you’re moving through time, simply by letting you control that? Is it going to be easy? No. Are companies that are worth billions of dollars and have lots and lots of engineers able to do that? I believe they can. I truly believe they can. They got us [Instagram] Reels in how many months, even though we don’t like them? But these are the things that are truly part of what we could be doing.
That is both the optimism and pessimism. If you actually sat down and said, “No, we are seriously concerned with making this better for you, and we’re going to put in systems and steps to do so. We’re not going to try and make it only these people or those people. We’re going to sit down and listen.” People will do a lot of what they’re doing now, which is very much offering you free study time, free market time, willingness to be open. The problem is that you don’t have that. The actual reach-outs and the actual calls and the designs and the bringing-in of people, and the recognition of that has not happened. I mean, think about it this way. Last year, we were looking at all of these black squares. “We promise to listen to Black women. And we promise to up our diversity. And we started fellowships,” and things like that from media companies — The New York Times, NPR, LA Times, all of these places. Be honest with me: thinking about what was said last year, all of these corporate statements, versus what’s been instituted this year, what would you think the relationship is?
Wood: I mean, I live this every day, I can tell you. It’s not good. We know it’s not good. And then we know that when Timnit Gebru at Google tried to take advantage of the moment to speak the truth, she got fired. I don’t even know what the word is — “slow,” at best? I mean, I hear you say the word “listening” over and over, and I just think it couldn’t have been a more brilliant headline, “The Innovation Tech Can’t Figure Out.” But it does sometimes feel like the innovation that people can’t figure out, that the system can’t figure out?
Harry: Or the innovation that tech won’t figure out, or people won’t figure out?
Wood: Or they won’t figure out. And that newsrooms … listen, we all have to own it — that newsrooms won’t figure out.
Harry: Really specifically, OpenNews did a recent study, it’s a beautiful piece of work, that finds that the person most likely to leave the newsroom is a Black woman between 25 and 44 [years old]. And they followed that up by doing exit interviews with journalists, and the exit interviews they’re doing are with a wide swath of people. And these are people that I’ve talked to and even worked with. And these are really amazing, wonderful people of all ages and races. And one of the things they say over and over again is that tech journalism refuses to accept that if you can’t talk about racism, you’re not going to be doing your job. And they often cite the experiences of Black women other than themselves, so they see the examples.
Wood: I think a lot about the way that scale plays into it, the way that we’re sort of realizing that we built these systems, that were built by the system, and as they grow, and they encompass people, we’re all growing into them in a way. And in a way, that says, “This has not been working. You didn’t build it right. Now we have to change it.” The change part is hard, the grafting it on after the fact is hard. But that we’re really in the adolescence of social media. And it’s an awkward, painful, angry time that hopefully will mature into proper adults.
Harry: Like adolescence is, as anybody who’s dealt with a teenager, or been a teenager, can tell you. People often say, “Well, how do we build it to scale?” Rather than, “How do we build it to purpose?” If you look at how you build it to scale, it’s always going to be, “OK, can I get as many as I want?” If you look at how you always build it to purpose, your product benchmarks, your user benchmarks are always, “Is this doing the thing I want?” Rather than, “Is it grabbing all of the people or the most amount of people?” And then it also introduces the idea that you might not be having a one-size-fits-all solution. Because that scale always goes, “Well, we have to do the same thing for everyone.” And very rarely, in almost any other part of tech or any other part of product design, do you say that. But for some reason, when it comes to safety and these issues of diversity, that becomes the focus, like we have to do the same thing for everyone. And I’m like, meh. You make sure for people who want to use your products, you can customize everything. But when you are using people as the product, it’s got to be one size fits all? Are you sure about that?
Wood: I mean, I think I mean to say we got to scale with no purpose. And now we have to graft purpose on, which is totally doable, right? We got a whole bunch of email, and then spam came in and we fixed it. It all can be fixed, but as you point out, it takes a lot of will and a lot of discomfort. And I don’t know. I have hope for the adult that is the internet.
Harry: Always. And I think that the other thing to remember is that the internet is ultimately made by people. And the same thing that gives us sorrow, that people can be terrible, has a flip side: people can be wonderful. There have been moments where we’ve all looked at humanity and gone, “We’re not salvageable. Screw it. We’re done.” But we’ve also had moments of, “I have never been more honored and more amazed at the capacity and wonderfulness and determination and vision of people.” Often the people not honored. And as the child of a Black woman, as a Black woman, first-generation immigrant from Far Rockaway, New York, people who are not honored but who are able to build and sustain and create these things that have supported the world for generations. I believe we can. I believe that it is available to us. Whether or not we choose to go that way or the other, that’s the thing we’re actually watching.
Related links: More insight from Molly Wood
Of course, you should read Sydette Harry’s full piece. There, she cites an NBC news story about how Twitter, back in August of last year, took down a bunch of accounts that were pretending to be those of Black people tweeting about leaving the Democratic Party. Twitter was like, “Wow, this whole impersonating Black people is kind of a thing.” And as researchers and activists pointed out in the NBC piece — and in lots of articles and tweets and comments for lots of years before that — it has been a thing for arguably a decade and was a big part of disinformation before the 2016 election. That was when Russian troll farms targeted and posed as Black users on Facebook and Twitter to try to encourage them not to vote for Hillary Clinton. What would things look like, indeed.
And one of our frequent guests, Joan Donovan, wrote a piece for The New York Times in 2019 about a targeted harassment campaign aimed at Black feminists in 2013. She writes that the campaign became a blueprint for what would eventually become the Gamergate harassment campaign, which became a blueprint for extremists and Russians and conspiracists practicing the disinformation, conspiracy spreading and frankly, digital terrorism that we seem to think is just how we live now and was apparently inevitable.
Had social media platforms and owners of message boards responded differently in 2013, Donovan writes, “we would be living in a very different world today.”