Decades of ethnic and religious tensions, a sudden explosion of internet access, and a company that had trouble identifying and removing the most hateful posts.
It all added up to a perfect storm in Myanmar, where the United Nations says Facebook had a “determining role” in whipping up anger against the Rohingya minority.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” Yanghee Lee, UN special rapporteur on human rights in Myanmar, said in March.
The company admits failures and has moved to address the problems. But how did Facebook’s dream of a more open and connected world go wrong in one south-east Asian country?
Enter Facebook
“Nowadays, everyone can use the internet,” says Thet Swei Win, director of Synergy, an organisation that works to promote social harmony between ethnic groups in Myanmar.
That wasn’t the case in Myanmar five years ago.
Outside influence had been kept to a minimum during the decades when the military dominated the country. But with the release of Aung San Suu Kyi, and her election as Myanmar’s de facto leader, the government began to liberalise business – including, crucially, the telecoms sector.
The effect was dramatic, according to Elizabeth Mearns of BBC Media Action, the BBC’s international development charity.
“A SIM card was about $200 [before the changes],” she says. “In 2013, they opened up access to other telecom companies and the SIM cards dropped to $2. Suddenly it became incredibly accessible.”
Image caption: For many in Myanmar, Facebook is synonymous with the internet
And after they bought an inexpensive phone and a cheap SIM card, there was one app that everybody in Myanmar wanted: Facebook. The reason? Google and some of the other big online portals didn’t support Burmese text, but Facebook did.
“People were immediately buying internet accessible smart phones and they wouldn’t leave the shop unless the Facebook app had been downloaded onto their phones,” Mearns says.
Thet Swei Win believes that because the bulk of the population had little prior internet experience, they were especially vulnerable to propaganda and misinformation.
“We have no internet literacy,” he told Trending. “We have no proper education on how to use the internet, how to filter the news, how to use the internet effectively. We did not have that kind of knowledge.”
Ethnic tensions
Out of a population of about 50 million, around 18 million in Myanmar are regular Facebook users.
But Facebook and the telecoms companies which gave millions their first access to the internet do not appear to have been ready to grapple with the ethnic and religious tensions inside the country.
The enmity goes deep. Rohingyas are denied Burmese citizenship. Many in the Buddhist ruling class do not even consider them a distinct ethnic group – instead they refer to them as “Bengalis”, a term that deliberately emphasises their separateness from the rest of the country.
Last year’s military operation in the north-west Rakhine state was designed, the government says, to root out militants. It resulted in more than 700,000 people fleeing for neighbouring Bangladesh – something that the United Nations calls the world’s fastest growing refugee crisis.
A UN report has said top military figures in Myanmar must be investigated for genocide in Rakhine state and crimes against humanity in other areas. But the government of Myanmar has rejected those allegations.
Facebook ‘weaponised’
The combination of ethnic tensions and a booming social media market was toxic. Since the beginning of mass internet use in Myanmar, inflammatory posts against the Rohingya have regularly appeared on Facebook.
Thet Swei Win said he was horrified by the anti-Rohingya material he has seen being shared. “Facebook is being weaponised,” he told BBC Trending.
In August, a Reuters investigation found more than 1,000 Burmese posts, comments and pornographic images attacking the Rohingya and other Muslims.
“To be honest, I thought we might find at best a couple of hundred examples. I thought that would make the point,” says Reuters investigative reporter Steve Stecklow, who worked with Burmese-speaking colleagues on the story.
Stecklow says some of the material was extremely violent and graphic.
“It was sickening to read and I had to keep saying to people ‘Are you OK? Do you want to take a break?'”
Image caption: Some posts on Facebook expressed the hope that fleeing Rohingya refugees would drown at sea
“When I sent it to Facebook, I put a warning on the email saying I just want you to know these are very disturbing things,” he says. “What was so remarkable was that [some of] this had been on Facebook for five years and it wasn’t until we notified them in August that it was removed.”
Several of the posts catalogued by Stecklow and his team described Rohingyas as dogs or pigs.
“This is a way of dehumanising a group,” Stecklow says. “Then when things like genocide happen, potentially there may not be a public uproar or outcry as people don’t even view these people as people.”
Lack of staff
The material that the Reuters team found clearly contravened Facebook’s community guidelines, the rules that dictate what is and is not allowed on the platform. All of the posts were removed after the investigation, although the BBC has since found similar material still live on the site.
So why did the social network fail to grasp how it was being used to spread propaganda?
One reason, according to Mearns, Stecklow and others, was that the company had difficulty with interpreting certain words.
For example, one particular racial slur – “kalar” – can be a highly derogatory term used against Muslims, or have a much more innocent meaning: “chickpea”.
In 2017, Stecklow says, the company banned the term, but later revoked the ban because of the word’s dual meaning.
There were also software problems which meant that many mobile phone users in Myanmar had difficulties reading Facebook’s instructions for how to report worrying material.
But there was also a much more fundamental issue – the lack of Burmese-speaking content monitors. According to the Reuters report, the company had just one such employee in 2014, a number that had increased to four the following year.
The company now has 60 and hopes to have around 100 Burmese speakers by the end of this year.
Multiple warnings
Following the explosion in Facebook use in Myanmar, the company did receive multiple warnings from individuals about how the platform was being used to spread anti-Rohingya hate speech.
In 2013, Australian documentary maker Aela Callan raised concerns with a senior Facebook manager. The next year, a doctoral student named Matt Schissler had a series of interactions with employees, which resulted in some content being removed.
And in 2015, tech entrepreneur David Madden travelled to Facebook’s headquarters in California to give managers a presentation on how he had seen the platform used to stir up hate in Myanmar.
“They were warned so many times,” Madden told Reuters. “It couldn’t have been presented to them more clearly, and they didn’t take the necessary steps.”
Accounts removed
A Facebook spokeswoman told Trending via email that the company was committed to hiring more content moderators but was also taking a number of other steps to tackle the problems in Myanmar.
“In the last year, for example, we have established a team of product, policy and operations experts to roll out better reporting tools, a new policy to tackle misinformation that has the potential to contribute to offline harm, faster response times on reported content, and improved proactive detection of hate speech,” the spokeswoman said.
Since last year, the company has taken some publicly visible action. In August, Facebook removed 18 accounts and 52 pages linked to Burmese officials. One account on Instagram, which Facebook owns, was also closed. The company said it “found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country.”
The spokeswoman said deleting the accounts was “not a decision we took lightly.”
“Staying ahead of the bad means always looking for how people can misuse technology – and doing everything you can to prevent that misuse from happening in the first place. That’s our responsibility now and it’s something that weighs heavily on all of us.”
Image caption: Radical Buddhist monk Wirathu's Facebook page was removed earlier this year
Between them, the accounts and pages were followed by almost 12 million people.
In January this year, Facebook also removed the account of Ashin Wirathu, a radical monk famed for angry speeches that stoked hostility towards Muslims.
‘Too slow’
In a statement, Facebook has admitted that in Myanmar it was “too slow to prevent misinformation and hate”, and acknowledged that countries that are new to the internet and social media are susceptible to the spreading of hate.
The subject of hate speech on the platform came up in early September, when Facebook’s chief operating officer, Sheryl Sandberg, testified in front of a US Senate committee.
Image caption: Sheryl Sandberg says Facebook is committed to tackling hate speech
“Hate is against our policies and we take strong measures to take it down. We also publish publicly what our hate-speech standards are,” she said. “We care tremendously about civil rights.”
When Facebook chief executive Mark Zuckerberg appeared in front of Congress in April, he was asked specifically about events in Myanmar, and said that in addition to hiring more Burmese speakers, the company was also working with local groups to identify “specific hate figures” and creating a team that would help identify similar issues in Myanmar and other countries in the future.
Elizabeth Mearns from BBC Media Action believes that while it is Facebook's role in Myanmar that is currently under scrutiny, the situation is just one example of a far wider issue.
“We are definitely now in a situation where content on social media is directly affecting people’s real life. It’s affecting the way people vote. It’s affecting the way people behave towards each other, and it’s creating violence and conflict,” she says.
“The international community understands now, I think, that it needs to step up and understand technology. And understand what’s happening on social media in their countries or in other countries.”
You can follow BBC Trending on Twitter @BBCtrending, and find us on Facebook. All our stories are at bbc.com/trending.