
Deplatforming reduced misinformation, research shows.

In the week after the January 6, 2021, insurrection, Twitter suspended about 70,000 accounts linked to the right-wing QAnon conspiracy movement, citing their role in spreading disinformation that fueled real-world violence.

A new study finds that the measure had an immediate and widespread impact on the overall spread of false information on the social media site, which has since been purchased by Elon Musk and renamed X.

The study, published Tuesday in the journal Nature, suggests that if social media companies want to reduce misinformation, banning repeat spreaders could be more effective than suppressing individual posts.

The mass suspension significantly reduced the sharing of links to “low credibility” websites among Twitter users who followed the suspended accounts. It also prompted a number of other purveyors of disinformation to leave the site voluntarily.

Moderating social media content has fallen out of favor in some circles, especially at X, where Musk has reinstated numerous banned accounts, including that of former President Donald Trump. But with the 2024 elections approaching, the research shows that it is possible to curb the spread of online lies, if platforms have the will to do so.

“There was a spillover effect,” said Kevin M. Esterling, a professor of political science and public policy at the University of California at Riverside and co-author of the study. “It wasn’t just a reduction in de-platformed users themselves, but it reduced circulation on the platform as a whole.”


Twitter also famously suspended Trump on January 8, 2021, citing the risk that his tweets could incite further violence — a move that Facebook and YouTube soon followed. While Trump’s suspension by itself may have reduced misinformation, the study’s findings hold up even if his account is taken out of the equation, said co-author David Lazer, a professor of political science and computer and information science at Northeastern University.

The study drew on a sample of approximately 500,000 Twitter users who were active at the time. In particular, it focused on 44,734 users who had tweeted at least one link to a website appearing on lists of fake news or low-credibility news sources. Among those users, the ones who followed accounts banned in the QAnon purge were less likely to share such links after the deplatforming than those who did not.

Websites the study classified as low credibility included Gateway Pundit, Breitbart, and Judicial Watch. The study’s other co-authors were Stefan McCabe of George Washington University, Diogo Ferrari of the University of California at Riverside, and Jon Green of Duke University.

Musk has touted X’s “Community Notes” fact-checking feature as an alternative to enforcing online speech rules. He has said he prefers to limit the reach of problematic posts rather than removing them or banning accounts altogether.

A study published last year in the journal Science Advances found that Facebook’s efforts to remove anti-vaccine content did not reduce overall engagement with such content on the platform.

Trying to moderate misinformation by targeting individual posts is “like sticking your finger in a dike,” Esterling said. Because there are so many of them, by the time one is suppressed or deleted, it may already have been seen by millions.

Lazer added: “I’m not advocating deplatforming, but it does have potential effectiveness in that identifying people who repeatedly share disinformation is much easier than chasing individual pieces of content.”

It is still unclear whether disinformation is an important driver of political attitudes or election outcomes. Another paper published Tuesday in Nature found that most social media users don’t actually see much disinformation, but that it is “concentrated in a narrow fringe with strong motivations to seek out such information.”

Lazer agreed that disinformation often concentrates in a “seedy area” of larger online platforms, rather than permeating “the entire city.” But, he added, these fringe groups “sometimes gather and storm the Capitol.”

Anika Collier Navaroli, a senior fellow at Columbia’s Tow Center for Digital Journalism and a former senior policy official at Twitter, said the findings support the case she was trying to make to Twitter’s leaders at the time.

Navaroli noted that the company compiled the list of QAnon-affiliated accounts before January 6.

“We already knew who they were,” she said. “People just had to die for the harm to be seen as real.”
