The right wing is coming for Wikipedia

The Heritage Foundation says it will "identify and target" Wikipedia editors over alleged bias. What does that mean for Wikipedia’s future?

Guests

Molly White, Wikipedia editor for around 20 years. Independent writer.

Stephen Harrison, freelance journalist who’s been writing about Wikipedia for 8 years. Tech lawyer. Author of the novel The Editors.

Also Featured

Rachel Goodman, special counsel and team manager for Free Expression and the Right to Dissent at Protect Democracy.

Transcript

Part I

MEGHNA CHAKRABARTI: Roughly a week ago, on September 11, the day after Charlie Kirk was fatally shot, Wikipedia users noticed something new on the page for his widow Erika Kirk. At the top of the page was a gray box with a red stripe.

ANTHONY PETERZZZ: Okay, so I'm going to show you something really strange about Erika Kirk's Wikipedia page. It has been nominated for deletion. That's fricking disturbing. That's really disturbing.

CHAKRABARTI: A Wikipedia editor with the username 'E pluribus unum, y'all' was recommending Erika Kirk's page be deleted. The editor wrote, "Article was created in the aftermath of Charlie Kirk's killing yesterday, but coverage otherwise is either limited in nature (i.e. her participation in beauty pageants) or inherited from her husband."

NICOLE APPROVES: This is on Fox News: Wikipedia debates deleting the profile page of Charlie Kirk's wife Erika. Editors are split on whether Charlie Kirk's widow has enough independent coverage to warrant an entry. The liberal attempts to basically erase conservatives. You're all pathetic.

JESSE ON FIRE: Why would they do that, if she's not notorious enough? Why are you even worried about it? And everyone knows who she is. You just don't like her. Okay? You are reprehensible, disgusting, evil people who are trying to cancel her and get rid of her because you are leftist.

CHAKRABARTI: Stephen Harrison joins us now.

He's a freelance journalist who's covered Wikipedia for almost a decade, and he's also a tech lawyer. He is with us from Dallas, Texas. Stephen, welcome to On Point.

STEPHEN HARRISON: Hello, Meghna. Thanks for having me.

CHAKRABARTI: Okay, so I've actually got Wikipedia open here in front of me, and I'm on the page for Erika Kirk.

And right now, that marked-for-deletion box is actually gone. It's not on the page today, which is Thursday, September 18, 10 a.m. Eastern time. Now it says the article is currently protected from editing. What does that mean?

HARRISON: So the article right now is what's called fully protected, and that means that only Wikipedia's volunteer administrators can edit it for the next two days until September 20th.

And the reason that was done is because there were some instances of vandalism and concerns about editing disputes. And if you scroll up to the top of the page, maybe about four tabs over, you can see the word "Talk," and that's where you can see the editorial discussion that's been taking place behind the scenes.

CHAKRABARTI: Okay, let's click on that. Talk ... Okay. So here it says the article was nominated for deletion on September 11th, 2025. Why did that happen?

HARRISON: One Wikipedia editor nominated Erika's article for deletion on the grounds that she wasn't what's called independently notable, that she had inherited her notability or her importance from her husband.

But one of the things I think is not really covered in the Fox News framing of events: that framing suggests that Wikipedia is trying to erase Charlie Kirk's widow, and the insinuation is that Wikipedia editors are heartless. It's trying to create a certain picture.

But in reality, the nomination of an article for deletion is very common. If your listeners tried to create a Wikipedia article right now, it would very likely be nominated for deletion, and then there'd be a whole discussion among the editors about whether it should in fact be kept.

And so I think while there's a move to cast this as politically divisive, it's actually just really commonplace. And if you scroll through the Talk page, you can see that a lot of the discussion was based on Wikipedia policies like notability and whether there's been independent press coverage of Erika Kirk.

And scrolling through the Talk page, you can see a lot of the Wikipedia editors expressing their sympathy and remorse for the horrible tragedy that Erika went through. And I think ultimately what's interesting about it is that the Fox News piece was talking about deleting Erika Kirk from Wikipedia.

Not only was she not deleted, she was kept. And if you review the discussion, about 150 Wikipedia editors participated and they decided by a margin of four to one to keep her page. And so she will remain on Wikipedia.

CHAKRABARTI: It's interesting though, because as you just said, this consensus emerged around whether or not a person is independently notable.

What are the criteria for that?

HARRISON: Regardless of where it is, Erika Kirk is going to appear on Wikipedia in some form. It's a question of whether she's part of Charlie Kirk's Wikipedia page or whether she has significant coverage, as shown by media sources, to have her own page, like a profile of her specifically.

And one of the things that the editors point to in the discussion is that the press coverage of Erika had really gone up since Charlie Kirk's assassination, right? And so that changing media environment also influenced the Wikipedia discussion, and the Wikipedia editors go where the sources go.

And that's why it was decided that her page was independently notable and that it should be kept.

CHAKRABARTI: I have to say, the disagreement over what to do about the page is right there. If you just click on Talk, anyone can look at it. Pretty robust. And I think it's worth going into just a little bit more detail, Stephen, because the discussion that these editors themselves are having over the page tells us a lot about the basis, whether fact or fiction, of the criticisms of Wikipedia.

Like for example, as you said, there is an editor who says: Please pay respect to her. She lost her husband today. She deserves a Wikipedia article. The very next comment from another editor says: Losing a husband is not a reason to get a person a Wikipedia page, nor is a Wikipedia page a show of respect.

And then another editor comes in and says: Pay respect to the wife of a white supremacist. And then someone else adds: Why does his ideology, is it valid in that? And then a third person comes in and says: Oh, there we have Wikipedia's cancel culture at work.

No respect for someone because of his adversarial thoughts, and therefore no respect for the wife of the person murdered. And then it goes into: She does have plenty of qualifiers for notoriety before her marriage to Charlie Kirk, et cetera, et cetera. And it goes on to the consensus that emerges, as you said: at least five editors say, I agree that she should have her own page.

There are a lot of independent reasons and independent sources mentioning her, et cetera. She should have it. What does the tone of that conversation tell you about how decisions are made for Wikipedia pages, Stephen?

HARRISON: I think first and foremost is that the discussions are transparent.

Like you said, you can click through and see everything, and it's on the record. And so I think that's helpful. It's not happening in some back room somewhere in Washington. You can see it on Wikipedia. I do think that, as with a lot of things in our society, there's trauma, and looking at those comments, I saw some of that trauma expressed.

In the back and forth among the editors, as you said, a consensus emerges. And I think that some of the Wikipedia editors who chimed in there may not typically contribute all that much to Wikipedia. When it became a story and got picked up, that probably drove editors who might not normally participate in what's called an AfD, an article-for-deletion discussion, to contribute to the page.

And sometimes that's referred to as canvassing. But again, what the administrator did is look at the entirety of the arguments, not just from a numbers perspective, but at who was making arguments based on Wikipedia policy, and not just the political or cultural points that were raised. Who's making the best arguments based on policy? And he decided that, on those merits, Erika Kirk should have a page. For example, she was Miss Arizona in 2012, I believe. She had some independent notability.

So of course, while there is a lot of passion that comes into these things, I think the ultimate decision, when Wikipedia is working well, and it doesn't always work, is based on the policies of the site, like notability.

CHAKRABARTI: What I find most interesting is, not only do they come to the conclusion that Erika Kirk, both in relation to her marriage to Charlie Kirk and in her own independent standing, is notable enough for her page to be kept up and not deleted. But as we mentioned, you see a lot of conversation here about the Wikipedia editors' own conscientiousness about the criticisms of Wikipedia. We mentioned someone saying there's Wikipedia's cancel culture at work.

Someone says she's a notable person who deserves her own page. If this is deleted, it's going to be seen as a political move. Someone else says, Wikipedia is seen as elitist. All of these criticisms that the Wikipedia editors know of are coming from now very powerful right-wing groups such as the Heritage Foundation and even the House Oversight Committee.

We'll talk about that in a second. But I want to actually just give voice to how the criticisms are discussed in popular media. This is conservative commentator Steven Crowder, who has accused Wikipedia of bias. Here's Crowder on his show Louder with Crowder a couple of years ago, January 2022.

CROWDER: What's scary is when you have a very small group of people who are forming a consensus, and then they are saying, no opposing point of view from outside of our consensus is allowed, because we've already achieved consensus. This is the most terrifying response you can get. Because that's a response that can be used for anything, and it's a response that Wikipedia, or in another case, Facebook, YouTube, Google, Alphabet could use for any issue.

CHAKRABARTI: That's, once again, Steven Crowder there. Stephen Harrison, we have about a minute before our first break. Earlier this year, the Heritage Foundation said it would, quote, identify and target Wikipedia editors it accuses of bias. Tell me more about that.

HARRISON: Yeah, this was very unusual, and I think it really sprang from some concerns about how Wikipedia was describing the Israel and Palestine conflict. And I did see this as very much an escalation, because instead of engaging with the Wikipedia arguments about how a particular subject should be reflected on the site, the Heritage Foundation said that it would use tools like identifying usernames and certain text patterns to try to figure out who these editors were, and to identify and punish them personally, as opposed to debating them on Wikipedia. So that was, I would say, a pretty dangerous escalation.

Part II

CHAKRABARTI: I also just want to note specifically that we did reach out to the Heritage Foundation requesting an interview. They declined, but a Heritage Foundation spokesman told us via email, quote: Targeting Wikipedia editors would require publicizing the information.

And the foundation has not posted anything of the sort. However, the statement goes on: If you're writing about Wikipedia's left-wing bias, I hope you report that some of the site's ghoulish editors propose deleting the page for Erika Kirk within days of her husband's assassination. End quote. With Stephen in the first part of the conversation, we did just that.

We forensically went through the conversation that was happening among the Wikipedia editors there. And again, On Point listeners, I encourage you to go to wikipedia.org. Look up Erika Kirk's name, and as Stephen said, in the upper left-hand corner of the page, you will see a little place to click that says, Talk.

Click on that, and you too can see the entirety of the conversation that happened around Erika Kirk's page. Stephen also mentioned a lot of controversy over Wikipedia entries about Israel and Gaza. Let's listen to some of that criticism. This is Rabbi Pesach Wolicki, an Orthodox Jewish leader who's been critical of Wikipedia on social media.

Here's that.

RABBI PESACH WOLICKI: Most Wikipedia pages are open source. You can go in and edit them. You can add information, you could add other sources. But this Wikipedia page is locked. It is impossible to edit. There's no way to change this page that claims falsely that Israel has killed Gazans at the aid distribution sites.

CHAKRABARTI: Okay, and here's another one. This is writer Ashley Rindsberg. And Rindsberg has claimed a group of around 40 Wikipedia editors orchestrated a campaign to, quote, de-legitimize Israel, present radical Islamist groups in a favorable light, and position fringe academic views on the Israel-Palestine conflict as mainstream, end quote.

Rindsberg explained in an interview with the Atlas Society, November 2024.

ASHLEY RINDSBERG: They made something like 850,000 edits across nearly 10,000 articles in the Israel-Palestine space. Things like downplaying allegations of rape, or even removing allegations of rape on October 7th.

They were whitewashing Hezbollah, and they were also making a very concerted effort to sever any ties between the Jewish people and Israel in hundreds of articles. So the idea there would be to delegitimize Israel and try to show that the Jewish people don't really have any place in Israel whatsoever, because all this stuff ends up on the front page of Google, not just the front page, but the top result.

Millions and millions of people around the world are absorbing a perspective that is actually tied back to groups of radical editors.

CHAKRABARTI: Once again, that's writer Ashley Rindsberg. Joining us now is Molly White. Molly has been a Wikipedia editor for around 20 years. Molly, welcome to On Point.

MOLLY WHITE: Hi, Meghna. Thank you for having me.

CHAKRABARTI: I just also want to note that we did reach out directly to the Wikimedia Foundation, that's the nonprofit that operates Wikipedia. We requested an interview. The foundation declined our request, but Molly independently agreed. So first of all, Molly, let's just tackle this controversy over Wikipedia entries on Israel and Gaza. You heard there the allegation that Wikipedia editors are very deliberately trying to shape a narrative around this, one that supporters of Israel very vociferously disagree with.

WHITE: With Wikipedia and its volunteer editors, there are always issues around editors trying to insert viewpoints into articles or highlight specific viewpoints that they may hold.

And the Wikimedia editing community is always working very hard to try to ensure that articles stay balanced when it comes to perspectives. And that means not necessarily choosing which version of events is correct, but representing all viewpoints in proportion to their prominence in reliable sources.

And when it comes to highly contentious topic areas like Israel and Palestine, that can be very challenging. But the Wikimedia editing community has been dealing with issues of bias in these editing areas for years, and when there have been allegations of editors trying to push viewpoints that are not appropriate in terms of bias or accuracy or sourcing, those are handled by the editing community fairly comprehensively.

CHAKRABARTI: How?

WHITE: It can range dramatically, from individual editors working things out on article Talk pages, like Stephen explained to you earlier, to broader community discussions if there is a more serious conflict happening, all the way up to what's known as the Arbitration Committee on Wikipedia, which is the last resort for some of the most intractable disputes among editors or behavior that violates Wikipedia policies.

And there have been multiple arbitration cases dealing with the issue of Israel-Palestine that have resulted in various outcomes, including limiting which editors can edit those pages directly.

And removing editors from the topic area if they were determined to have violated Wikipedia policy on any side of the issue.

CHAKRABARTI: What's interesting, Molly, is that many critics of Wikipedia are now pointing to exactly that process you just outlined as one of the reasons why they think bias emerges on Wikipedia pages, because Wikipedia is no longer what it once was at its founding, when just about anybody could jump in and change a page.

So let's listen once again to Ashley Rindsberg. You heard him before. He says, in his opinion, the site's bureaucracy and what he calls, quote, almost parliamentarian rules make it hard for an average person to make an edit. And this, again, is from a 2024 interview with the Atlas Society.

ASHLEY RINDSBERG: Wikipedia is a bureaucracy in many ways, a lot of procedures, and there's a lot of rules to be followed, and there's a lot of committees that have to be consulted for things to get to a point where a decision is actually made, things can take a lot of time.

And when you have that kind of bottleneck, especially when we're dealing with a website that has nearly 7 million articles on it, you're just naturally going to have to restrict what can be decided upon. There is just only so much that the arbitrators can handle, and this is what we're seeing today, is how we got here.

CHAKRABARTI: Molly, how do you respond to that?

WHITE: I think there are some legitimate criticisms there. It is more challenging to edit Wikipedia these days than it once was, because of the policies and processes that exist. That can be a bit of a burden for a new editor to understand.

But ultimately, I think that you can't have it both ways. There are people, like Ashley Rindsberg, criticizing Wikipedia for not doing enough, not instituting enough policies and community intervention to shape content the way they believe it needs to be shaped, while also criticizing Wikipedia for having too many policies and too much oversight.

I don't understand how you can have it both ways.

CHAKRABARTI: Okay. Molly and Stephen, let's return back to the latest push now against Wikipedia, again, from very prominent right-wing groups like the Heritage Foundation and the House Oversight Committee, the Republican members thereof. We'll talk about the House in just a second.

But I want to note that, again, the Heritage Foundation did not agree to speak with us on the air, but the spokesman I quoted earlier referred us to an organization called The Oversight Project.

Now, the Oversight Project did not respond to our request for an interview, but there is a publicly available PDF version of a PowerPoint presentation. You can find it easily online, and it's called Wikipedia Editor Targeting, from the Oversight Project and the Heritage Foundation.

And it says, identify and target Wikipedia editors abusing their position by analyzing, and this is how they're going to identify editors, text patterns, usernames, and technical data through data breach analysis, fingerprinting, human intelligence, and technical targeting, which includes things like cross-article comparisons, behavioral patterns online, and historical comparisons. They're really deploying a lot of tools to identify who Wikipedia editors are.

Molly, first of all, and then Stephen, I promise I'll come back to you.

Tell me, has this had any kind of impact on you already?

WHITE: Personally, not so much, just because I am very public about who I am. It's no difficulty to identify my real-life identity. But I would say these types of threats have a dramatic chilling effect on the volunteer editing community. These are people who are volunteering their time to try to improve this public resource. They're not being compensated for it. They are taking time out of their day, and now they are facing the threat that an extremely aggressive, politically extreme organization is going to paint a target on their backs for participating in this project.

And we've seen the damage and sort of chaos that can be introduced into people's lives when they're publicly targeted by groups like the Heritage Foundation. And I worry that members of our volunteer editing community will see this and say, you know what? It's not worth it. I don't have the capacity to put up with these types of threats.

Even if they're not editing in those topic areas, the definition that the Heritage Foundation employed, just the vague idea of abusive editors, is so vague as to encompass just about anybody. I think ultimately these types of threats have a dramatic chilling effect on our editing community, regardless of whether or not they follow through, or whether or not editors are ultimately broadly being targeted.

CHAKRABARTI: Stephen Harrison, this document discusses doing things like creating sock puppet accounts that would, quote: reveal patterns and provoke reactions from editors. It also talks about using geolocation, searching through hacked databases for things like username reuse, as I mentioned, and even going so far as using facial recognition software to learn the real identities of Wikipedia editors.

Again, the Heritage Foundation would not speak to us, but Stephen, have you been able to determine any more of what the Heritage Foundation intends to do with this information if it finds it?

HARRISON: Yeah. It's so interesting because I wrote about this idea of going after the editors in my book called The Editors.

And I think that they also talked about social engineering, the idea that if you could trick volunteer editors into revealing personal information about themselves, that might be another opportunity to get at them. I mean, we have to address the irony that this is harassment and cancel culture, and you would think that the Heritage Foundation, which has been so against that idea, would be against it here.

But of course, it's exactly that. And it reminds me of how some Wikipedia editors in mainland China used tools that were able to look at IP addresses in order to intimidate Wikipedia editors who were based in Hong Kong. This was at the time of the Hong Kong protests.

And so it seems like this is taking a tactic from other authoritarian regimes and putting the safety of Wikipedia volunteers at risk.

CHAKRABARTI: Molly, did you want to add to that?

WHITE: I think that's absolutely right. This is ultimately an extremely chilling type of behavior. And for an organization and a sort of political group that claims to be pro-free speech, pro-free expression, anti-censorship, it is deeply ironic that they are now doing exactly what they condemn.

CHAKRABARTI: Now, let's talk about how these criticisms and the targeting that's coming from, let's say, advocacy groups like Heritage is now bleeding into the United States Congress.

Because on August 27th, just a couple of weeks ago, the House Oversight Committee sent a letter to Wikipedia's CEO, saying, quote:

Committee on Oversight and Government Reform is investigating the efforts of foreign operations and individuals at academic institutions subsidized by U.S. taxpayer dollars to influence U.S. public opinion.

And the Committee says:

We seek your assistance in obtaining documents and communications regarding individuals or specific accounts serving as Wikipedia volunteer editors who violated Wikipedia platform policies, as well as your own efforts to thwart intentional organized efforts to inject bias into important and sensitive topics.

That's the first graph there. Stephen, first of all, and then Molly, I'm going to hear from you on this. The House is trying to use its leverage over U.S. taxpayer dollars going to American colleges and universities, but I'm not quite sure how that works if Wikipedia itself doesn't directly receive funding.

Do you want to talk about this letter a little bit?

HARRISON: Yeah. I mean, they mentioned in the letter efforts by foreign operatives to sway U.S. opinion. And I would say if that's in fact happening, I would want to know that, if there are efforts by puppet farms in Russia or China to influence what's on Wikipedia.

But I think that what we have here is set against this background of conservative criticism. And I think within the same week, Mike Lee tweeted something inflammatory, like the editors are putting the wicked in Wikipedia, right? We know where they're coming from on this, right?

And so I think that there's a lot of concern, I think justifiably, that it doesn't really feel like oversight from the committee. It feels more like targeting.

CHAKRABARTI: Molly, your thoughts.

WHITE: I would agree with that. And ultimately, Congress does not have the constitutional authority to investigate a website like Wikipedia for editorial decisions.

That is basic First Amendment stuff. And it is somewhat absurd to see the Oversight Committee claiming to be evaluating whether editors are following Wikipedia policy. That is not their domain. But I do think that this is ultimately of a kind with the similar intimidation tactics that we're seeing from Heritage and other groups, because there were demands for private information about editor disputes and identifying information about editors, including private logs of their IP addresses.

This is really, I think, just intensifying concern among Wikipedia editors that not only might the Heritage Foundation or some advocacy group target you for your volunteer editing activities, but you might even have members of Congress who have been willing to engage in similar behavior, to publicly identify and target people they view as politically opposed.

CHAKRABARTI: This also comes at the same time as we've seen multiple lawsuits, for example, filed by President Donald Trump against media organizations.

The latest is a $15 billion lawsuit filed this week against the New York Times.

Part III

CHAKRABARTI: Let's listen to some competing views here from some of the earliest voices from Wikipedia. Larry Sanger helped start the site. He left Wikipedia in 2002. Since then, he has frequently criticized the website for, in his view, being too left wing. Here he is doing an interview on Glenn Greenwald's show System Update, August 2023.

LARRY SANGER: When it shifted from the neutral point of view to a, I guess I would call it a scientistic point of view, any sort of controversial issues in science, the establishment view on that topic was pushed very heavily. That happened like in, I don't know, 2006, 2008. By the time Trump became president, yeah, it was almost as bad as it is now. It's amazing. No encyclopedia, to my knowledge, has been as biased as Wikipedia has been.

CHAKRABARTI: Here's a different view.

Wikipedia co-founder Jimmy Wales was on the Lex Fridman Podcast in June of 2023, so that same summer. And Fridman asked whether Wales thinks the site has a left-leaning political bias.

JIMMY WALES: I don't think so, not broadly, I think you can always point to specific entries and talk about specific biases, but that's part of the process of Wikipedia.

Anyone can come and challenge and go on about that. It's certainly true that some people have quite fringe viewpoints, and who knows, in the full rush of history, in 500 years they might be considered to be pathbreaking geniuses. But at the moment they have quite fringe views, and they're just unhappy that Wikipedia doesn't report on their fringe views as being mainstream.

CHAKRABARTI: That's Wikipedia co-founder Jimmy Wales. Molly, I want to talk with you about some of your own reporting and editing on Wikipedia, specifically regarding Elon Musk. But before that, a quick question here. This idea of mainstream or reliable resources, reliable sources as being one of the criteria of whether information makes it onto Wikipedia.

Isn't that, right there, a criticism I hear of NPR all the time, or public radio, that it's just subject to bias based on what you consider to be mainstream or reliable? Could that same bias not apply to Wikipedia?

WHITE: I think that's true. There is the possibility that bias in sources may be reflected on Wikipedia.

And this is something we've grappled with for a very long time as a project. Because, for example, if women or people of color are underrepresented in historical texts, then they are necessarily going to be underrepresented on Wikipedia, which relies on those sources. The same thing is true of communities that rely on oral histories, which are very challenging to cite.

So I think there are very serious concerns around if material that should be included in Wikipedia is not being included. But ultimately, I don't think that the serious concerns there are really in the area of right-wing American politics. There is no shortage of coverage of even very extreme or fringe views.

And if there is coverage, Wikipedia will describe that. That is why Wikipedia describes things like the flat earth conspiracy theory, right? It is not accepted science, but if it is described in reliable sources, it will also be described as a fringe theory in Wikipedia.

I think ultimately the complaint is that these fringe theories are not being treated as equivalent to the accepted science, and that is the type of complaint we've heard a lot of from Larry Sanger, who has been very vocal about his belief that very fringe theories should be treated with the same weight as mainstream scientific consensus.

CHAKRABARTI: Stephen, I wonder if part of how we arrived here has to do with how Wikipedia has actually become a very important source of information for a ton of people. And that sort of, in a sense, is ironically in opposition to the founding idea of Wikipedia, that it's this open online encyclopedia where, you know, anybody could edit or add information.

And through the sort of wonderful chaos of the masses coming together, the belief was that any and every idea should get equal weight. If that was the case, I don't think Wikipedia would've become so important to so many people as a reliable source.

And in fact, because things are not getting equal weight, and as Molly's describing, some sources are considered more reliable than others, Wikipedia has become way more popular. And you heard critics earlier say it's also showing up on, like, page one of Google searches, at the top.

So the thing that's made it successful is exactly the same thing that's led to the current, like, very intense criticism of it.

HARRISON: I think that's right. I think we have a beginning and a middle to the story, and we might be figuring out the end. And by beginning, I mean in the early days there were all these concerns about Wikipedia being authorial anarchy. Stephen Colbert used the word Wikiality, and the idea was, you can't trust this.

How could you possibly trust the encyclopedia that, quote, anybody can write? Maybe around 2015 there were a lot of observations, and I should say from both sides of the political spectrum, that wow, Wikipedia is working pretty well. And I think the idea was that there are so many eyes on the project, and that's what's making it better, that just having this visibility, people editing, and more contributions makes it better.

And Elon Musk, who as you said is now a big critic of Wikipedia, said in 2017 that Wikipedia is great, it keeps getting better and better. So he liked it back then. And now, in 2025, we have these concerns about political bias. I think a lot of the urgency of the arguments comes from the fact that Wikipedia articles appear so much in Google search results and in AI summaries.

And so they're very prevalent, and I think that's why there's a lot of discussion now about, okay, when Wikipedia editors decide what is a reliable source, what are they choosing from? Does that mean mainstream media is a reliable source? Does that mean excluding, or not putting as much weight on, fringe sources? And so I think that's why we're at this point, in terms of what sources are represented and mirrored on Wikipedia.

CHAKRABARTI: Molly, let me just follow up with you on that. Does Wikipedia, do Wikipedia editors consider stories that have shown up on The Daily Wire, on Fox News, things like that, as reliable sources as well?

WHITE: It's a little bit complicated, but Wikipedia editors, broadly speaking, evaluate every source on a case-by-case basis, which, as you can imagine, can become very time consuming and repetitive if there are very prominent publications that are being discussed over and over again.

And so there have been larger discussions about, generally speaking, is this publication usually reliable, usually unreliable, or does it need to be treated with caution in some subject areas but not others? And those are the types of conversations that Wikipedia editors have had about a broad range of sources, including sources that are on the political right, like Fox News, the Daily Wire, sources that are probably perceived as more centrist and then of course left-wing sources. And there have been concerns about the general reliability of sources like Fox News, which has resulted in a guideline that these sources need to be treated with caution.

And that their reliability is generally not up to Wikipedia's standards. This has been portrayed in the past as Wikipedia banning these sources. You'll often see claims that Wikipedia has banned Fox News or the New York Post or some publication like that.

That's not accurate, but, like I said, these sources are treated on a case-by-case basis, and you can search Wikipedia and you will see citations to Fox News and to the New York Post. Ultimately, we are very cautious about sources that have a reputation for poor fact checking or for publishing tabloid-like material.

CHAKRABARTI: Let's talk for a second about Elon Musk. You've written quite extensively, Molly, about Elon Musk's targeting of Wikipedia. At the end of December of last year, he tweeted, quote: Stop donating to Wokepedia, as he called it. He's also been training Grok, the AI chatbot run by Musk's company xAI, to fact check what he calls biased sources on the internet, like Wikipedia. And he explained the training process at an event hosted by the podcast All-In. All-In's hosts have been critical of Wikipedia.

ELON MUSK: Grok is using heavy amounts of inference compute, let's say, to look at, as an example, a Wikipedia page and say, what is true, partially true, or false, or missing in this page? Now rewrite the page to remove the falsehoods. Correct the half-truths and add the missing context.

CHAKRABARTI: Interesting. Musk there saying that he'd ask Grok to rewrite pages to include things that are actual truths, he says, or to remove falsehoods. Molly, what has Musk's criticism of Wikipedia been, and tell us more about the actions he's taken against it.

WHITE: I think you heard an allusion to it there, which is that a lot of these grievances come from people who are upset that they personally are not being described as they might like on their Wikipedia pages.

And I think ultimately that's actually where a lot of Elon Musk's own criticisms came from. He, as Stephen alluded to early on, was very complimentary about Wikipedia. He said, I love Wikipedia. But over time he began to complain about how his own article discussed him. He was upset that it didn't describe him as a founder of Tesla because he wasn't a founder of Tesla, but he wishes to be described as such.

And that I think ultimately morphed into his embrace of much broader complaints about Wikipedia as a biased source or as one that does not give enough attention and weight to more fringe sources like he might wish it to. And that has really turned into this campaign by Musk against Wikipedia, where he has encouraged people not to donate or use the site at all.

He has tried to suggest creating alternatives to Wikipedia and has ultimately been very aggressive against Wikipedia as an organization.

CHAKRABARTI: He has the money. He has the power, the informational influence through X, formerly Twitter. And the U.S. Congress also has the power, vis-a-vis the government, to make life very miserable for Wikipedia, just as these groups have for other media organizations.

I also want to point out that we did reach out to all 26 Republican members of the U.S. House Oversight Committee requesting interviews. Six of them directly declined. The rest did not respond. Okay. In the last few minutes of the conversation, Stephen, I want to turn back to something you said earlier about a pattern that we've seen internationally, and now in the United States, of going after independent sources of information.

Rachel Goodman says the attacks on Wikipedia do indeed fit into that pattern, the authoritarian's playbook. She's special counsel and team manager for free expression and the right to dissent at Protect Democracy.

RACHEL GOODMAN: Authoritarians seek to systematically dismantle independent sources of fact-based information.

When you control the narrative, you control power. The authoritarian playbook is straightforward here. Neutralize any institution that might contradict your version of reality.

CHAKRABARTI: And Goodman says, agreeing on what's true is essential for a functioning democracy.

GOODMAN: Democracy needs a shared foundation of truth.

And we as Americans can disagree on solutions. We should, but if we can't agree on basic facts, we can't self-govern. And that is what authoritarians count on. When independent voices are silenced, only one story remains: their story.

CHAKRABARTI: That's Rachel Goodman. She is special counsel for free expression and the right to dissent at Protect Democracy.

Stephen Harrison, I want you to tell us a little bit more about what we might look forward to based on how we've seen this playbook play out in other countries. You'd mentioned China and the attack on independent sources there. Could the same thing happen here? What happens in those places?

HARRISON: Yeah, we've seen authoritarian regimes around the world go after Wikipedia for the reasons that your expert mentioned. I think China blocked Wikipedia, conveniently around the 30th anniversary of the Tiananmen Square massacre, and kept its people from seeing that information.

Russia has fined Wikipedia repeatedly. Saudi Arabia has arrested Wikipedia editors for their contributions to the site. So we see this pattern in terms of governments that go after Wikipedia. And I think that I agree with your expert, it has to do with the political subordination of the truth, right?

Those in power want to declare the truth and use a tool like social media to just push it out there, rather than have this independent, nonprofit, free resource that is curating sources and describing a different view of the truth. And the view of Wikipedia editors is very different from the view of authoritarians. Wikipedia editors say, hey, the best way to get at reality, or the truth, is to use independent, reliable sources.

And that's different than Elon Musk who wants to just push out his own narrative.

The first draft of this transcript was created by Descript, an AI transcription tool. An On Point producer then thoroughly reviewed, corrected, and reformatted the transcript before publication. The use of this AI tool creates the capacity to provide these transcripts.
