If you visited Google's YouTube in the days after last week's election, you may have come across a video from a One America News Network anchor declaring victory for President Trump with the baseless claim that Democrats are "casting Republican ballots."
Or a clip on Mr. Trump's own YouTube channel in which he says he would easily win if all "legal votes" were counted. Or a video claiming that Real Clear Politics, a political news site, had "retracted" its call of Pennsylvania for President-elect Joseph R. Biden Jr.
All of these videos spread unfounded misinformation challenging the validity of the election results. Yet they remain available on YouTube, even as they are widely shared on Facebook and other social media platforms.
This is not a mistake or an oversight in the enforcement of YouTube's policies. YouTube has said its policies are working as intended. Its light-touch approach differs from that of Twitter and Facebook, which cracked down on election misinformation and labeled content they considered misleading.
"Disinformation is spread on YouTube, but they are not at all transparent about how they deal with it," said Lisa Kaplan, founder of the Alethea Group, a company that helps combat election-related misinformation. The nature of the platform, she said, has helped YouTube stay out of the spotlight.
Videos are harder to analyze than text, Ms. Kaplan said, and they aren't shared in the same way as Facebook posts and tweets.
Before the election, YouTube said it would not allow misleading information about voting, but the policy was mostly limited to voting procedures – how to vote, who is eligible to vote or to run as a candidate, and claims that could discourage voting. The guidelines did not extend to views about the outcome of the election or the vote-counting process. That means videos spreading misinformation about the results of the vote are allowed.
"The majority of election-related searches show results from authoritative sources and we are reducing the spread of harmful election-related misinformation," Andrea Faville, a YouTube spokeswoman, said in a statement. "Like other companies, we allow discussions about the election results and the vote counting process, and we continue to closely monitor new developments."
The company removes content that violates its policies, particularly content intended to incite violence, but Ms. Faville declined to say how many videos YouTube had removed.
YouTube's actions are opaque. Its most powerful tool is an algorithm trained to keep borderline content – videos that come close to breaking its rules without clearly violating them – from appearing high in search results or recommendations. But YouTube does not disclose which videos it has flagged as borderline, so users are left to guess whether the company will take action.
Even when YouTube takes steps to make videos harder to find on its own site, that does not prevent users from widely distributing them elsewhere. As a result, many YouTube videos have found new life on Facebook. According to BuzzSumo, a web analytics tool, the video spreading the falsehood that Real Clear Politics had retracted its projection of Mr. Biden winning Pennsylvania had about 1.5 million views on YouTube and had been shared about 67,000 times on Facebook as of Tuesday afternoon.
YouTube barred some videos from carrying ads. For the One America News video posted last Wednesday, YouTube said it removed ads because the content undermined "confidence in elections with demonstrably false information." That left YouTube in the awkward position of acknowledging the video's potential for harm while continuing to host it.
YouTube has also attached an information label to all election-related videos. On Saturday, the label was changed from a warning that the election results might not be final to a statement that "the AP called the presidential race for Joe Biden," with a link to a Google page showing the results. YouTube said the information panel has been displayed "billions of times."