FILE - This Feb. 19, 2014, file photo shows a Facebook app icon on a smartphone in New York. Facebook is admitting that it didn't do enough to prevent its platform from being used to incite violence and spread hate in Myanmar. A Facebook executive said in a blog post late Monday, Nov. 5, 2018, that the company "can and should do more." (AP Photo/Patrick Sison, File)
November 6, 2018, 22:54:07
NEW YORK (AP) — Facebook is admitting that it didn't do enough to prevent its services from being used to incite violence and spread hate in Myanmar.
The company "can and should do more" to protect human rights and ensure it isn't used to foment division and spread offline violence in the country, Alex Warofka, a product policy manager, said in a blog post.
Facebook commissioned the nonprofit Business for Social Responsibility to study the company's role in Myanmar and released the group's 62-page report late Monday.
Facebook has come under heavy criticism for allowing its platform to be used to inflame ethnic and religious conflict in the country, particularly against minority Rohingya Muslims. The report confirms this and offers recommendations, including preparing for "massive chaos and manipulation" in the country's 2020 parliamentary elections.
"Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence," the report says. "A minority of users is seeking to use Facebook as a platform to undermine democracy and incite offline violence, including serious crimes under international law."
The Myanmar report comes as Facebook and other social media companies face a host of problems in dealing with people, groups and nations intent on using their services for malicious purposes, whether that's inciting violence, spreading hate messages, propaganda and misinformation, or meddling with elections around the world.
Facebook is focused on rooting out misinformation in the U.S., but it's also dealing with people using its platforms to incite violence in Sri Lanka, India and elsewhere. Late Monday, Facebook said it shut down 30 Facebook accounts and 85 Instagram accounts for suspected "coordinated inauthentic behavior" linked to foreign groups attempting to interfere in Tuesday's U.S. midterm elections.
Facebook and smartphones entered Myanmar quickly, and the report notes that this has led to a "steep learning curve for users, policymakers, and civil society." The report notes that Facebook "is the internet" for many in Myanmar and that it has played an important role in supporting freedom of expression and helping activists organize.
At the same time, the report said, hate and harassment are leading to self-censorship among "vulnerable groups such as political activists, human rights defenders, women, and minorities."
Facebook released the report on the eve of the U.S. midterm elections, prompting critics to question its timing when so many people are focused on other news. Facebook says the report was focused on "Myanmar stakeholders," for whom the U.S. elections are not a priority. It also said it had promised to share the results of the assessment once it had them.
The report does acknowledge that Facebook has made progress, but adds that there is "more to do." In August, the company banned Myanmar's military chief and 19 other individuals and organizations from its service to prevent the spread of hate and misinformation.
Facebook doesn't have any employees permanently based in Myanmar, but it makes "regular trips" there with a range of employees. The company says that having employees based there could pose risks to them and increase the Myanmar government's ability to request data on users.