When Wikipedia started in 2001, it was meant to be an online encyclopedia free for everyone to use and contribute to. Over the years, the question of Wikipedia’s accuracy has been hotly debated, and many newsrooms forbid journalists from relying on it as their only source.
At any given time, its army of volunteer editors might be fighting a raging battle to make sure that a page contains the truth. That’s happening this week on Wikipedia entries about the coronavirus. Considering the state of information online, Wikipedia’s goal of providing free information, with no incentive beyond the information itself, is reassuring, assuming it can beat back the trolls.
I spoke with Katherine Maher, the chief executive officer of the Wikimedia Foundation, which oversees Wikipedia, and asked her how often the site gets hit by misinformation campaigns. The following is an edited transcript of our conversation.
Katherine Maher: There are efforts at times when groups of individuals will try to manipulate the quality of the information towards a particular aim. What is interesting is that it actually only takes one person who is tracking that article page to be able to shut that down. What that means is that we actually have the ability within the system to self-regulate when there’s an effort to actively manipulate information.
Molly Wood: Is it possible that some of the concern going forward is less about errors and more about omission?
Maher: Absolutely. I think that the editor balance is roughly, best numbers, 80% male and 20% identify as women. What that leads to is implicit bias in terms of what gets covered and what is deemed as notable. Right now, the content on most Wikipedias, particularly in the subject area of biographies, is skewed towards men. That’s one of those places where Wikipedia editors are actually really interested in trying to correct the record. There’s a number of people across the Wikimedia community and groups that are actively oriented to writing women into history. The reason this matters so much is a lot of those products that are coming out of the technology sector today are trained on Wikipedia. If you have a database that overrepresents men, or overrepresents certain concepts, like science, to be associated with men, what you’re actually doing is encoding those biases into the world around you.
Wood: How do you see the role of Wikimedia right now, having evolved in a significant way as politics and culture shift around the idea of truth, around what a fact is?
Maher: It’s funny, when I started at Wikimedia Foundation, I always thought, “This job would be so interesting.” I never expected it to be quite so relevant, relevant to the conversation around the future technology, relevant to the conversation around the future of information, relevant to the conversation around the future of trust. I think what has changed for us is we’ve gone from being a website on the internet to a way in which the world records itself. We’ve become much more of a central record of information that underpins the knowledge infrastructure of the web. It belongs to everyone, and we have a responsibility to make sure that [it] stays that way.
Wood: I want to shift to how you guys pay the bills. There’s the public radio model, the public contribution and the donation model. Is that the primary way that you accrue revenue?
Maher: We have a lot in common with public radio in that we, every year at the end of the year, ask our readers to donate to Wikipedia. We’re incredibly fortunate that around 8 million people donate every year an average of $15. That makes up about 85% of our total contributions. The remainder are primarily from major donors, the people who give a little bit more than $15, and philanthropic foundations. We are entirely nonprofit and entirely donor-driven as an organization. What’s interesting to me is, actually, fewer than 1% of our total readers give. Although we think of ourselves as very much a public-supported organization given how many people read Wikipedia all over the globe, it’s still a pretty exceptional group of individuals who make sure that it keeps running year in and year out.
Related links: More insight from Molly Wood
Last month, Wikipedia announced it had more than 6 million articles in English on the site. The 6 millionth article is by a woman and about a woman — a 19th-century school teacher and author.
You can hear or read more from Katherine Maher in a podcast she did back in 2018. She talked about some of these same issues, as well as how they had to ban Congress from editing Wikipedia because “Congress is notorious for vandalizing Wikipedia.”
She also made an interesting argument in the New York Times last year about not overrelying on artificial intelligence to solve our problems, but instead using it to augment the work of humans rather than replace them.
Speaking of AI, there’s a fascinating story in Motherboard this week about how a little army of about 1,600 bots is writing articles for Wikipedia in various languages. That might be the reason that the second largest Wikipedia in the world is written in Cebuano, a language spoken in the southern Philippines. This is a fact that I learned on Wikipedia. Even though only about 15 million people speak the language, there are more than 5 million articles on Cebuano Wikipedia. Apparently most of those articles are the work of a single bot. There is some controversy about this within the Wikimedia community, but Motherboard said it reviewed about 1,000 of the articles the bot wrote and found them “surprisingly well constructed.”