A Facebook-funded entity the tech giant set up to distance itself from tricky and potentially damaging content moderation decisions has announced the first batch of cases it will look into.
In a press release on its website, the Facebook Oversight Board (FOB) reports that it looked through more than 20,000 filings before settling on six cases – one of which was referred to it directly by Facebook.
The six cases it started with are:
Facebook submission: 2020-006-FB-FBR
A case from France, where a user posted a video and accompanying text to a COVID-19 Facebook group. The post relates to allegations about the French agency responsible for regulating health products, claiming it had authorized promotional mail for remdesivir. The user criticizes the lack of a health strategy in France and says that "(Didier) Raoult's cure" is being used elsewhere to save lives. Facebook removed the content for violating its violence and incitement policy. The video in question received at least 50,000 views and 1,000 shares.
According to the FOB, Facebook said in its referral that the case is "an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic".
Of the five user-submitted cases selected by the FOB, the majority (three cases) relate to removals under its hate speech policy.
One case apiece concerns Facebook's adult nudity guidelines and its policy on dangerous individuals and organizations.
Below are the board's descriptions of the five user-submitted cases:
2020-001-FB-UA: A user posted a screenshot of two tweets by former Malaysian Prime Minister Dr. Mahathir Mohamad, in which the former Prime Minister stated that "Muslims have a right to be angry and to kill millions of French people for the massacres of the past" and "[b]ut by and large the Muslims have not applied the 'eye for an eye' law. Muslims don't. The French shouldn't. Instead the French should teach their people to respect other people's feelings." The user did not add a caption alongside the screenshots. Facebook removed the post for violating its hate speech guidelines. In his appeal to the Oversight Board, the user stated that he wanted to draw attention to the former Prime Minister's "horrible words".
2020-002-FB-UA: A user posted two widely shared photos of a deceased child lying fully clothed on a beach at the water's edge. The accompanying text (in Burmese) asks why, unlike the recent cartoon-related killings in France, there has been no retaliation against China for its treatment of Uighur Muslims. The post also refers to the Syrian refugee crisis. Facebook removed the content for violating its hate speech guidelines. In his appeal to the Oversight Board, the user noted that the post was meant to disagree with people who believe the killer is right, and to emphasize that human lives matter more than religious ideologies.
2020-003-FB-UA: A user posted alleged historical photos showing churches in Baku, Azerbaijan. The accompanying text says that Baku was built by Armenians and asks where the churches have gone. The user stated that Armenians are restoring mosques on their land because it is part of their history, and said that the "т.а.з.и.к.и" destroy churches and have no history. The user stated that he was against "Azerbaijani aggression" and "vandalism". The content was removed for violating Facebook's hate speech policy. The user stated in his appeal to the Oversight Board that he intended to demonstrate the destruction of cultural and religious monuments.
2020-004-IG-UA: A user in Brazil posted a picture on Instagram with a title in Portuguese indicating a need to raise awareness of the signs of breast cancer. Eight photos within the picture showed breast cancer symptoms, with explanations of the symptoms underneath. Five of the photos contained visible and uncovered female nipples; the remaining three showed female breasts with the nipples either out of frame or covered by a hand. Facebook removed the post for violating its adult nudity and sexual activity guidelines. The post has a pink background, and the user indicated in a statement to the board that it was shared as part of the national "Pink October" breast cancer awareness campaign.
2020-005-FB-UA: A user in the US was prompted by Facebook's "On This Day" feature to re-share a "memory" in the form of a post the user had made two years earlier. The user re-shared the content. The post (in English) is an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, on the need to appeal to emotions and instincts rather than intellect, and on the unimportance of truth. Facebook removed the content for violating its policy on dangerous individuals and organizations. The user stated in his appeal to the Oversight Board that the quote is important because the user believes the current US presidency is following a fascist model.
Public comments on the cases can be submitted through the FOB website, but only for seven days (closing at 8:00 a.m. Eastern Standard Time on Tuesday, December 8, 2020).
The FOB "expects" to make a decision on each case – and "for Facebook to have acted on this decision" – within 90 days. So the first "results" from the FOB, which only began reviewing cases in October, almost certainly won't land until 2021.
Panels made up of five FOB members – including at least one from the region "implicated in the content" – are responsible for deciding whether the specific content in question should stay down or be restored.
Facebook's outsourcing of a fantastically tiny subset of content moderation considerations to a body dubbed an "oversight board" has attracted a lot of criticism (including inspiring a mirrored unofficial entity that calls itself the Real Oversight Board) – and no little cynicism.
Not least because it is entirely funded by Facebook; structured as Facebook wanted it structured; and with members selected via a system devised by Facebook.
If it's a radical change you're looking for, the FOB isn't for you.
The board is also not empowered to change Facebook's policies – it can only make recommendations (which Facebook is free to ignore entirely).
Its remit does not extend to examining how Facebook's attention-harvesting business model affects the types of content its algorithms amplify or suppress.
And the narrow focus on content takedowns – rather than content that is already allowed on the social network – skews its purview, as we have pointed out before.
So you won't find the board asking tough questions about why hate groups continue to thrive and recruit on Facebook, for example, or digging into how much Facebook's algorithmic amplification has boosted the antivax movement. By design, the FOB is focused on symptoms rather than the platform-scale sickness of Facebook itself. Outsourcing a fantastically tiny subset of content moderation decisions couldn't mean anything else.
With this Facebook-commissioned pantomime of accountability, the tech giant hopes to generate a helpful pipeline of distracting publicity – one focused on specific and "nuanced" content decisions – that deflects simpler but more punchy questions about the exploitative and abusive nature of Facebook's business itself, and the legality of its mass surveillance of internet users, as lawmakers around the world grapple with how to rein in tech giants.
The company wants the FOB to reframe the discussion about the culture wars (and worse) that Facebook's business model fuels as a societal problem – pushing a self-serving "solution" to algorithmically fueled social division in the form of a few hand-picked professionals opining on individual pieces of content, leaving Facebook free to continue defining the shape of the attention economy on a global scale.