SAN FRANCISCO – Facebook, which faces growing criticism over posts that have provoked violence in some countries, said on Wednesday that it would begin removing misinformation that could lead to people being physically harmed.
The policy expands Facebook's rules on what kinds of false information it will remove, and is largely a response to episodes in Sri Lanka, Myanmar, and India in which rumors that spread on Facebook led to real-world attacks on ethnic minorities.
"We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline," said Tessa Lyons, a product manager at Facebook. "We have a broader responsibility to not just reduce that type of content but remove it."
Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims, an ethnic minority in Myanmar, by allowing anti-Muslim hate speech and false news to spread on its platform.
Riots broke out in Sri Lanka after false news set the country's majority Buddhist community against Muslims. Nearly identical rumors on social media have also led to attacks in India and Mexico. In many cases the rumors did not include an explicit call for violence, but heightened tensions.
The new rules do not apply to Facebook's other major social media offerings, Instagram and WhatsApp, which have also been used to spread false news. In India, for example, false rumors on WhatsApp about child kidnappers have led to mob violence.
In an interview released on Wednesday by the technology news site Recode, Mark Zuckerberg, Facebook's chief executive, tried to explain how the company would distinguish between offensive speech – the example he used was people denying the Holocaust – and posts promoting false information that could lead to physical harm.
"I think there is a terrible situation where there is underlying sectarian violence and intention," Mr. Zuckerberg told Recode's Kara Swisher, who will be joining The New York Times later this summer. "It is clearly the responsibility of all of the players who are involved there."
While the social media company already has rules barring direct threats of violence and hate speech, it has been reluctant to remove rumors that do not directly violate its content policies.
Under the new rules, Facebook said it would partner with local civil society groups to identify misinformation for removal. The rules are already in effect in Sri Lanka, and Ms. Lyons said the company hoped to introduce them soon in Myanmar and then expand elsewhere.
Mr. Zuckerberg's example of Holocaust denial quickly created an online furor, and on Wednesday afternoon he clarified his comments in an email to Ms. Swisher. "I personally find Holocaust denial deeply offensive, and I absolutely didn't intend to defend the intent of people who deny that," he wrote.
He went on to outline Facebook's current policies on misinformation. Posts that violate the company's community standards, which prohibit, among other things, hate speech, nudity, and direct threats of violence, are removed immediately.
The company has also begun flagging posts deemed false by independent fact-checkers. Facebook "down-ranks" those posts, effectively pushing them lower in each user's News Feed so they are not widely promoted across the platform.
The company has also added information boxes under news stories proven false, suggesting other sources of information for readers. But extending the new rules to the United States and other countries where offensive speech is legally protected may prove difficult, as the company has long used free-speech law as a guiding principle for content moderation. Facebook also faces pressure from conservative groups who argue that the company unfairly targets users with conservative views.
Asked in an interview how Facebook would distinguish misinformation that could cause harm, which will be removed, from material that is merely objectionable, which will simply be down-ranked, Ms. Lyons said, "There's not always a really clear line."
"All of this is challenging – that is why we are iterating," she said. "That is why we take feedback seriously."
Follow Sheera Frenkel on Twitter: @sheeraf