Facebook fails to take responsibility for its network’s role in the Burmese genocide

Facebook recently released the findings of an independent study it commissioned into its role in the Burmese genocide. Essentially, the company acknowledges that it didn’t do enough to prevent its network from “being used to foment division and incite offline violence.” But it has assured both the public and its investors that it has made the necessary changes and is doing as much as it can to stop this from happening again.

However, while the report does show some progress in Facebook’s approach to moderating its platform, it makes no firm commitment to regular audits like this in the future.

Facebook’s poor handling of the Burmese crisis has been criticized by numerous activists and by the United Nations. In May 2018, a group of activists from Myanmar, Syria, and six other countries made three specific demands of the social network: sustained transparency, an independent and worldwide public audit, and a public commitment that Facebook would enforce equal standards everywhere it is active.

Unfortunately, Facebook’s report does not live up to these demands. It was conducted by Business for Social Responsibility, an independent nonprofit based in California, so it does qualify as independent, but it falls short of the worldwide audit the activists asked for. Facebook says it agrees there is value in publishing transparent information about its enforcement efforts, pointing to its records detailing its moderation in Myanmar. However, it has offered no assurance that it will keep publishing transparency reports once this scandal is forgotten.

It is difficult to properly evaluate whether the tech giant carried out the activists’ final demand, a commitment to enforce its standards on a global scale. Since every nation is different, it is hard to apply worldwide moderation standards without misunderstanding each country’s unique context.

For example, because Burma was isolated from the outside world until 2011, it is one of the biggest online communities that hasn’t standardized on Unicode for its text. According to Facebook, the Zawgyi encoding widely used there makes it very difficult to detect potentially offensive posts. Facebook claims it has removed Zawgyi as an option for new users in order to push Myanmar toward Unicode.
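To see why the encoding matters, here is a minimal sketch, in Python, of the kind of check a moderation pipeline might need to run before any hate-speech classifier can read a post: Zawgyi reuses Myanmar Unicode code points with different meanings, so text has to be identified and normalized first. The specific code points and threshold below are simplified assumptions for illustration only, not Facebook’s actual detection method.

```python
# Minimal illustration (not Facebook's actual pipeline) of why non-Unicode
# Zawgyi text complicates automated moderation: the same code points carry
# different meanings, so text must be identified and converted before a
# hate-speech classifier can read it reliably.

# Code points that standard Unicode Burmese rarely produces but that Zawgyi
# fonts repurpose for glyph variants. The exact set and the 0.05 threshold
# are simplified assumptions for illustration only.
ZAWGYI_HINT_CODEPOINTS = {0x105A, 0x1060, 0x1061, 0x1086, 0x108F, 0x1090}

MYANMAR_BLOCK = range(0x1000, 0x10A0)  # the Unicode Myanmar block


def looks_like_zawgyi(text: str) -> bool:
    """Rough heuristic: flag text that is probably Zawgyi-encoded."""
    myanmar_chars = [c for c in text if ord(c) in MYANMAR_BLOCK]
    if not myanmar_chars:
        return False

    hints = sum(1 for c in myanmar_chars if ord(c) in ZAWGYI_HINT_CODEPOINTS)

    # In Zawgyi, the vowel sign E (U+1031) is stored *before* its consonant,
    # so it can appear at the start of a word; valid Unicode never does this.
    leading_e = any(word and word[0] == "\u1031" for word in text.split())

    return leading_e or hints / len(myanmar_chars) > 0.05


if __name__ == "__main__":
    # A post flagged here would need Zawgyi-to-Unicode conversion before
    # moderation (the conversion itself is out of scope for this sketch).
    sample = "\u1031\u1000"  # U+1031 before a consonant: Zawgyi-style ordering
    print(looks_like_zawgyi(sample))  # True
```

The point of the sketch is simply that every post from Myanmar needs this extra normalization step before automated tools can work at all, which is the difficulty Facebook describes.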

Facebook has a team of 99 native Burmese speakers who can address the specific issues on the platform. The company claims it has already blocked more than 64,000 pieces of content for violating its hate speech policies. Many of Burma’s civil rights groups counter that Facebook’s moderation team took credit for identifying posts that the groups had reported themselves. However, Facebook released data showing it had removed 63% of hateful posts before users reported them manually.

Following the incidents in Burma, the network changed its ‘credible violence’ policy to cover posts containing misinformation that may lead to violence or physical harm. Facebook also says it will likely establish another moderation policy to handle human rights abuses on its platform.

Given that each country’s situation is unique, the report shows that Facebook clearly failed to understand the context of the violence in Burma. As Burma’s 2020 elections near, many experts urge Facebook to devote special attention to the country and the 20 million users it has there.