On the eve of Facebook CEO Mark Zuckerberg appearing before a US Senate panel, organisations in Burma have pushed back on his defence of the company’s efforts to mitigate hate speech online, with the groups sharing more damning material as evidence of the social network’s role in the Rohingya crisis.
After thanking him for his emailed response to their previous open letter, released last week, the six Burmese civil society and cyber-security groups stated, “This doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the US or Europe.”
Two senators have spoken with the group of CSOs that emailed Zuckerberg and pledged to raise Facebook’s role in allowing hate speech to flourish on its platform in Burma at the congressional hearings scheduled for Tuesday and Wednesday.
Senator John Kennedy of Louisiana has already stated that Facebook needs to do more to stop the spread of fake news.
“Some people respond when they see the light. Others have to feel the heat,” he said ahead of Zuckerberg’s testimony this week.
Much of the questioning is expected to focus on Facebook’s role in Russian meddling in the US election, but in examples shared with DVB, evidence has mounted of the company’s slow response in Burma to removing not only hate speech but also personal threats.
One post detected on 4 December was a list of activists “to be assassinated.” The post included the names of people who were vocal about the Rohingya issue or well-known Muslim activists and included links to the activists’ Facebook accounts.
Facebook replied promptly, within 25 minutes of a security analyst in Burma reporting the post, stating that it was “having this looked into.” Yet despite follow-up reports to Facebook identifying both the post’s original source and the multiple locations where it was being shared, the group of CSOs has pointed out that these posts have not been removed and remain public.
Another post shared with DVB was a graphic photo of a dead woman with her clothes ripped, suggesting she may have been raped. Burmese CSOs came across the post on 3 June last year and reported it to Facebook at about 10:30 that evening. Having received no reply from Facebook by email, and with no established emergency response system in place, the CSOs in desperation also reported the case over Skype to a Facebook staff member.
This post was “eerily similar” to the 2012 rape case of Buddhist woman Thida Htwe, which was blamed for inciting mob violence in Rakhine State, the CSOs told DVB.
“The situation was extremely tense in northern Rakhine State at the time and we had good reason to believe that such dangerous content could spark a mob,” they said.
It was not until 11:56 the next morning that Facebook said its monitors had removed the post.
In their email to Zuckerberg ahead of his appearance before US lawmakers, the Burmese CSOs made a list of 11 recommendations and requests for information to be shared to prove that Facebook is indeed building a more effective emergency response system for hate speech.
Some of these included: What percentage of reported posts took more than 48 hours to be taken down? How many fake accounts have you identified and removed? Do you have a target for review times?
Specific details about Burmese-language monitoring also remain unclear. The CSOs have demanded that Facebook be more transparent and share how many Burmese-reading reviewers it had, in total, as of March 2018, as well as how many the company plans to have by the end of the year.