On election day, Facebook released conclusions from an independent study of how the platform was used to spread disinformation, foment division, and incite violence in Myanmar. Activists had called for sustained transparency, independent audits, and a commitment to equal enforcement of standards. The report suggests Facebook has struggled to understand Myanmar’s recent violence and has difficulty serving communities like Myanmar that do not use Unicode, the standard text encoding shared across languages.
- As a private company, does Facebook have an obligation to police disinformation?
- Has Facebook made oppressive regimes more effective?
Facebook has released the conclusions of an independent assessment regarding its role in the recent genocidal violence in Myanmar. In short, the company admits that it previously wasn’t doing enough to prevent its network from “being used to foment division and incite offline violence,” but it argues it’s already begun making the changes necessary to prevent it from happening again. However, while the report shows that the company has made progress in how transparent it is about moderation, it stops short of making any firm commitments about audits like this in the future – a key demand from activists.
Facebook’s handling of the Myanmar crisis has been criticized by everyone from activists to the United Nations. Back in May, a coalition of activists from Myanmar, Syria, and six other countries made three specific demands of the social network: sustained transparency, an independent and worldwide public audit, and a public commitment to equal enforcement of standards across every territory in which Facebook is active.
Compared to these demands, Facebook’s report is a mixed bag. Since it was conducted by Business for Social Responsibility, an independent nonprofit organization based in San Francisco, it certainly qualifies as independent, but it stops short of the worldwide audit that the coalition called for. Although Facebook says it agrees on the value of transparently publishing data about enforcement efforts and points to a recent example covering its Myanmar moderation (it also posted a similar report about Iran), it makes no specific commitments about how regularly it will publish these reports in the future.
The coalition’s final demand – that Facebook equally enforce its standards worldwide – is much more difficult to evaluate. Every country is unique, and applying identical standards worldwide risks missing crucial pieces of context. For example, Facebook notes that Myanmar is one of the largest online communities that hasn’t standardized on Unicode for its text, a legacy of the country’s long isolation from the outside world. Instead, much of the country uses Zawgyi, a font with its own nonstandard character encoding, which Facebook says makes offending posts much harder to detect. Facebook wants Myanmar to transition to Unicode, and it says it has removed Zawgyi as an option for new users.
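To see why Zawgyi complicates automated moderation, consider that Zawgyi reuses Myanmar Unicode code points but stores some characters in visual rather than logical order. A well-documented case: the vowel sign E (U+1031) is stored *after* its consonant in standard Unicode but *before* it in Zawgyi. The sketch below (a hypothetical illustration, not Facebook’s actual pipeline) shows how a naive keyword filter built on Unicode spellings would miss the Zawgyi form of the same visible word:

```python
# The Burmese syllable "မေ" (consonant MA + vowel sign E) looks identical
# to a reader, but the two encodings store the code points in different orders.
unicode_text = "\u1019\u1031"  # Unicode: MA (U+1019) then E (U+1031), logical order
zawgyi_text = "\u1031\u1019"   # Zawgyi: E stored before MA, visual order

# A hypothetical blocklist entry written in Unicode spelling:
banned_term = "\u1019\u1031"

print(banned_term in unicode_text)  # True  -- Unicode text is caught
print(banned_term in zawgyi_text)   # False -- same word in Zawgyi slips through
```

In practice, tools such as Google’s open-source myanmar-tools library detect which encoding a string is likely using and convert Zawgyi to Unicode before any matching, which is one reason standardizing on Unicode matters for moderation at scale.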
Facebook has also created a team dedicated to addressing Myanmar’s specific issues on the platform, and that team includes 99 native Myanmar speakers. The company says it has already taken action on around 64,000 pieces of content from the country for violating its hate speech policies, proactively identifying 63 percent of these posts before they were reported manually. Similar claims about Facebook’s ability to automatically flag content have previously drawn criticism from Myanmar civil society groups, which say they were the ones who uncovered many of the posts Facebook’s systems took credit for identifying.