About three weeks ago, Facebook announced that it would step up its efforts to combat hate speech and misinformation in Myanmar ahead of the general election on November 8, 2020. Today the company shared more details about what it is doing to prevent the spread of hate speech and misinformation, including adding Burmese-language warning screens to flag information that third-party fact-checkers have found to be incorrect.
In November 2018, Facebook admitted that it had not done enough to prevent its platform from being used to "promote division and incite offline violence" in Myanmar.
This is an understatement considering that Facebook has been accused by human rights groups, including the United Nations Human Rights Council, of facilitating the spread of hate speech in Myanmar against Rohingya Muslims, the targets of a brutally violent ethnic cleansing campaign. A 2018 New York Times investigation found that members of the military in Myanmar, a predominantly Buddhist country, incited the genocide against the Rohingya and used Facebook, one of the most widely used online services in the country, as a tool for a "systematic campaign" of hate speech against the minority.
In its announcement a few weeks ago, Facebook said it would expand its misinformation policy and remove content intended to suppress voters or undermine the integrity of the electoral process, working with three fact-checking partners in Myanmar – BOOM, AFP Fact Check and Fact Crescendo. It also said it would flag potentially misleading images and apply a message-forwarding limit it had first introduced in Sri Lanka in June 2019.
Facebook also said it took action on 280,000 pieces of content in Myanmar in Q2 2020 that violated its community standards on hate speech, 97.8% of which were detected by its systems before being reported. That is up from the 51,000 pieces of content it took action on in the first quarter.
However, as TechCrunch's Natasha Lomas has noted, without greater visibility into the content on Facebook's platform, including country-specific factors such as whether hate speech posting increases in Myanmar as the election approaches, it is impossible to know how much hate speech slips under the radar of Facebook's detection systems and reaches local eyeballs.
Facebook's latest announcement, posted to its Newsroom today, doesn't answer those questions. Instead, the company provided more details on its preparations for Myanmar's general election.
The company said it will use technology to identify "new words and phrases related to hate speech" in the country and either remove posts containing those words or "reduce their circulation".
It will also introduce Burmese-language warning screens for misinformation that its fact-checkers have identified as incorrect, make reliable information about elections and voting more visible, and promote "digital literacy training" in Myanmar through programs such as an ongoing monthly television show called "Tea Talks" and the rollout of its social media analysis tool CrowdTangle to newsrooms.