Having failed to curb the hate speech and misinformation that fed a genocide in Myanmar, Facebook now says it plans to take proactive content moderation measures following the military coup taking place in the country.
In an internal message posted Monday afternoon and seen by BuzzFeed News, Rafael Frankel, director of public policy for the Asia-Pacific region, told employees that the social network was watching the “volatile situation” in Myanmar “with great concern” and outlined a series of measures to crack down on people using the platform to spread misinformation or threaten violence.
As part of these measures, Facebook has designated Myanmar as a “high-risk temporary location” for two weeks, allowing the company to remove content and events from the country that include “any call to carry weapons.” The social network previously applied this designation in Washington, DC, after the uprising at the U.S. Capitol on January 6.
The social network, which had promoted its efforts to protect the integrity of Myanmar’s national elections in November, also said it would protect posts criticizing the army and its coup and would monitor reports of Myanmar pages and accounts that the military hacked or took control of.
“Myanmar’s November elections were an important moment in the country’s transition to democracy, although it was not without its challenges, as highlighted by international human rights groups,” Frankel wrote. “This turn of events recalls darker days in Myanmar’s past and reminds us of fundamental rights that should never be taken for granted.”
Facebook’s moves come after General Min Aung Hlaing, Myanmar’s army chief, took control of the country’s government on Monday and arrested its elected leader, Aung San Suu Kyi, along with other members of the National League for Democracy (NLD) party. After the election, in which the NLD won most of the seats in Myanmar’s parliament, military-backed opposition groups called the results fraudulent and demanded a revolt.
On Tuesday, the US State Department officially designated the army’s takeover in Myanmar as a coup, triggering financial sanctions.
“After reviewing all the facts, we have assessed that the actions of the Burmese army on February 1, after having dismissed the duly elected head of government, constituted a military coup,” a State Department official said in a briefing, using the name the U.S. government applies to the country.
In a statement to BuzzFeed News, Facebook confirmed the actions it outlined in Frankel’s post and said it would remove content that praised or supported the coup.
“We are putting the safety of people in Myanmar first and removing content that violates our rules on violence, hate speech and harmful misinformation,” Frankel said. “This includes removing misinformation that delegitimizes the outcome of the November elections.”
Facebook is taking action in a country where it had previously faced international condemnation for its handling of the displacement and genocide of Rohingya Muslims that began in 2016. In 2018, UN investigators found that senior Myanmar military officials had used Facebook, which had no content moderators in the country, to stoke fear and spread hate speech.
In their report, UN investigators concluded that “the extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.”
In Monday’s post, Frankel said Facebook deployed “several product interventions that were used in the past in Myanmar and during the U.S. election, to ensure that the platform is not used to spread misinformation, incite violence, or coordinate harm.”
Frankel wrote that the company is working to secure the accounts of activists and journalists “who are at risk or who have been arrested” and removing content that threatens them or calls for violence against them. The company will also protect “critical information about what is happening on the ground,” given the restrictions imposed on the country’s media.
Facebook’s work is an ongoing effort. On Tuesday afternoon, the company removed a page belonging to Myanmar’s military television channel, following reporting by the Wall Street Journal. Although the company had banned a page for Myawaddy TV in 2018 during a crackdown on hundreds of Myanmar army-linked accounts, a new page had reappeared and amassed 33,000 likes.
Facebook has frequently been criticized for enabling the growth of violent and extremist groups and for its ineffectiveness at curbing misinformation. Most recently, a tech watchdog group accused the company of helping foment the riots that led to the deadly insurrection attempt in the United States.
“[Facebook] has spent the past year failing to eliminate conspiracy theories related to extremist activities and President Trump-led election falsehoods, which have radicalized a wide swath of the population and led many down a dangerous path,” the Tech Transparency Project (TTP) said in a report.
The report documented specific threats made on Facebook by pro-Trump and militia groups before and after Joe Biden’s election victory in November.