
In search of Facebook's heroes, only find victims



Photo: Police guard a restaurant in Ampara, Sri Lanka, which was attacked by a Buddhist mob. Credit: Adam Dean for The New York Times

Times Insider delivers behind-the-scenes insights into how news, features, and opinion come together at The New York Times.

MURUTHALAWA, Sri Lanka – Every story needs a protagonist. When we came to Sri Lanka to report on the flood of Facebook-related violence in developing countries, we thought we had found ours in a young Buddhist monk from a small mountain town.

But we would soon learn that the monk's story was morally more complicated than we expected – and it made our story about Facebook bigger and darker than we had originally planned.

A week before, mobs of Sinhalese-speaking Buddhists, the country's majority group, had risen up, inspired by anti-Muslim misinformation and hate speech on Facebook. They destroyed mosques and Muslim homes in several cities and killed one man.

But here was the monk we had heard about, Nelligala Dhammaratne, who had confronted the mob in the street. Relying on his religious authority and his influence in the community, he exhorted its members to disperse. The local mosque was saved.

Mr. Dhammaratne was supposed to be a heroic contrast to the villains of our story: the extremists who dominated social media here and in other developing countries and used it to incite people into paranoid outbursts of rage that sometimes turned deadly.

We thought we had a story about good people and villains fighting for control of the vast information spaces Facebook opens up in countries like Sri Lanka. This was in line with Facebook's own philosophy that the platform merely amplifies a society's existing tendencies, for good and for evil. In this view, Facebook has a responsibility to mitigate the bad, but it can hardly be blamed for it.

But Mr. Dhammaratne was not what we expected. We spent time with him at his hilltop temple, overlooking green hills, and found that Facebook's role in societies with weak institutions went far beyond merely amplifying good and evil.

Yes, he had stopped the mob. But he also recited as unvarnished truth some of the very anti-Muslim rumors that had spread in viral Facebook memes before the attack: Muslim shopkeepers put chemicals in bras to sterilize Buddhist women. Muslim doctors sterilized Buddhist patients without their knowledge. Muslims in the government were secret extremists.

How did he know this? "The whole country has heard of it" on Facebook, he replied. Had he discussed it with his congregants in the run-up to the violence? Of course.

We came to understand that Facebook's algorithmic newsfeed, by promoting the content that draws the most engagement from users, does not merely reinforce existing prejudices or empower extremists.

Primal emotions, such as anger and fear – which, studies show, perform best on the algorithm – can change the way people see and relate to one another. In countries with weak institutions where Facebook is widespread, misinformation can run rampant. And in societies with a history of deep social mistrust, it can be deadly.

Mr. Dhammaratne, whom we had expected to show how gatekeepers protect society from these forces, proved instead to be an example of how even the well-intentioned can be overwhelmed by them. This would not be a story of heroes and villains, we learned, but of victims on both sides of the violence.

Facebook, engineered to deliver dopamine hits like a casino slot machine, had unintentionally fed Sri Lankans a daily diet of the emotionally charged rumors and us-versus-them hate speech that its algorithm rewards best.

How could they – suddenly plugged into one of the most sophisticated media platforms in history, one that presented itself as a portal for news – be expected to resist?

"Sinhalese who were perfectly alright were provoked," Mr. Dhammaratne said, referring to emotionally charged Facebook content that made it difficult for members of his community to distinguish truth from rumor.

We would see this pattern – otherwise well-meaning bystanders provoked by anger and fear online – play out in a number of cases we followed in developing countries.

"It is the ability of these apps to spread news, especially audio and video, so quickly and to groups so large, that this wave of violence arose," Mr. Dhammaratne said.

He said the government had been right to temporarily block social media: "That's how it was controlled."

But Facebook has become too central to everyday life to simply shut off, he added. It needs to be regulated or better controlled, though he did not know exactly how. Otherwise, it could happen again.
