Facebook, Myanmar and Genocide

In the West, most of the concerns regarding social media and “Fake News” center on potential election tampering by foreign agents and the echo chamber created by algorithms that feed users’ pre-existing beliefs. These are big topics, to be sure. And they get a lot of attention in the news you see every day.

They are, however, nowhere near the most important global case study regarding the power and potential toxicity of social media; for that, you have to look at Facebook's enabling role in the Rohingya genocide in Myanmar. We've been following this issue for months, but recent events make it topical today. Here is the story:

Background: the Rohingya people live in western Myanmar, on the Bay of Bengal in an area bordering Bangladesh. They are predominantly Muslim in a majority-Buddhist country, and they belong to a different ethnic group than the majority population. Despite living in Myanmar for centuries, they are not recognized as legitimate citizens and are therefore legally stateless. Most estimates put the total Rohingya population at 1-2 million people.

There have been rolling crises in Rakhine State (the region where most Rohingya live) since at least 2012. Things went downhill quickly in 2016 after an attack by Muslim militants on three border posts, with the military government arming local Buddhist groups and moving the army into the region. By the end of 2017, the region was in full-blown conflict, with thousands of Rohingya dead from extrajudicial killings and as many as 400,000 fleeing the country as refugees.

Yesterday, a United Nations report called for Myanmar’s military leaders to be investigated for genocide, war crimes, and crimes against humanity. US Secretary of State Mike Pompeo said on Sunday that Myanmar was guilty of “abhorrent ethnic cleansing”. The US has also imposed sanctions on several military leaders.

Why Facebook matters to all this: the social media site is wildly popular in Myanmar, reaching more than 90% of Internet users with an estimated 30 million accounts in a country of 50 million people. Most of those accounts date from just the last few years, as local mobile/smartphone adoption hit its stride.

The United Nations report singled out Facebook on two counts:

  • It called the platform a "useful instrument of hate, in a context where for most users Facebook IS the Internet. Although improved in recent months, Facebook's response has been slow and ineffective."
  • Even more remarkable, given the company's supposed technological prowess, the report also stated: "The Mission regrets that Facebook is unable to provide country-specific data about the spread of hate speech on its platform, which is imperative to assess the adequacy of its response."

Facebook finally took concrete action in addressing the crisis (on the same day the UN report was released) by banning – for the first time ever – a state official: Senior General Min Aung Hlaing, commander in chief of the armed forces. The company also banned 19 other individuals and 52 Facebook pages followed by almost 12 million people. It will retain the data and content from these accounts, presumably for potential international law enforcement action in the future.

The upshot to all this: that Facebook needed a UN investigation and report before acting aggressively in this case shows how broad its blind spots remain. The company's boilerplate promise to "do better in the future" seems insufficient in the wake of all that has occurred. And it is a reminder that the Internet enables global business models that can starkly change the lives of millions of people for the worse as well as for the better.

In terms of what it means for the stock: while we do not make equity recommendations, it is hard to see the company in the same light after studying this case. Yes, Myanmar is a small country, and many investors might have trouble finding it on a world map. But the Rohingya crisis was world news as it unfolded last year.

How could a company with Facebook's resources – human and technological – not have had a better handle on what was happening? And why did it take so long to act? The company's sluggish response does not bode well for those hoping it can better police its platform any time soon.