When a video was removed from our YouTube channel without warning earlier this month, we began to think about how journalists should responsibly cover those making false claims, and how social media sites handle that coverage.

A Colorado Times Recorder reporter recorded the video while covering a “Stop the Steal” rally last December. At the event, speakers touted patently false election fraud conspiracy theories in front of the Colorado State Capitol building.

The most egregious claims in the video come from rally organizer Dave Roach, of the U.S. Election Integrity Project, who said China “funded the voting machines” and that “hundreds of thousands of ballots counterfeited in China” were sent to swing states like Colorado.

A spokesman for the U.S. Election Integrity Project, an election fraud conspiracy group, promotes false information at a Dec. 12, 2020 rally.

YouTube removed the video from the site nearly three months after it was posted. In its place was a message saying the video had been removed for violating the site’s community guidelines.

Upon realizing the video had been taken down, we submitted an appeal form explaining that the video was original news reporting on election misinformation, intended to debunk the lies being spread. YouTube responded to the appeal within minutes, and soon after, the video reappeared on our channel.

To find out more about YouTube’s misinformation policies and enforcement practices, and the broader role of journalists in this landscape, I spoke with Jiore Craig and Kelsey Suter, specialists in online election misinformation at the Washington, D.C.-based political consulting firm GQR.

Craig leads GQR’s Digital practice and works with media organizations to help them responsibly report on social media trends and confront misinformation. She says that in recent years, journalists have found themselves in the middle of a conversation about digital misinformation policies in a rapidly evolving social media landscape.

“It’s on local journalists to decide how they are going to represent what is occurring without giving validation to some of those ideas,” she says. “I think it’s a question, I don’t think there’s a strict rule.”

Suter and Craig say journalists face a complicated situation: each social media site has its own policies on misinformation, and enforcement of those policies is inconsistent.

“Each platform takes a different approach to how they will moderate what kind of content exists on their platform,” Craig says. “As a blanket statement, none of them are doing a good job.”

According to YouTube’s factsheet on misinformation, content that “deliberately seeks to spread disinformation that could suppress voting or otherwise interfere with democratic or civic processes” violates its community guidelines.

However, these community guidelines are notoriously nebulous. YouTube describes its policies as a “living set of enforcement guidelines,” and admits that its specific policies and practices are not made available to the public. Furthermore, YouTube acknowledges that it relies on computer algorithms and artificial intelligence to help identify problematic content.

From YouTube’s Handbook on Fighting Disinformation

Suter, who monitors and analyzes disinformation at GQR and helped organizations respond to disinformation during the 2020 election, says it is not uncommon for media coverage of misinformation to be removed.

“Media coverage or reporting that is trying to expose a problem will sometimes get flagged within a policy regulating the content it is trying to call out.”

“Enforcement is extremely patchy,” she added. “These rules are not being applied evenly.”

YouTube has come under scrutiny for its handling of misinformation in recent months. Last November, four Democratic senators asked YouTube to remove content containing misinformation from its platform and answer questions about its policies regarding election misinformation. 

Leslie Miller, VP of Government Affairs and Public Policy at YouTube, defended the company’s enforcement practices in a blog post, saying the site “enforces policies consistently without regard to a video’s political viewpoint,” and that the company has introduced reforms to help curb the spread of misinformation on its platform.

However, Craig says YouTube and other social media sites have a history of making superficial changes to ease public pressure without addressing the underlying issue of problematic content’s reach on their platforms.

“Ultimately in our country we want to maintain the First Amendment and we want freedom of speech to be intact,” says Craig. “We should talk more not so much about what content should exist or not, but more about how much reach those groups are able to get while the platforms profit off of that reach.”