
Facebook finally explains on 27 pages why it prohibits some content



(REUTERS / Dado Ruvic / Illustration / File photo)

SAN FRANCISCO – One of Facebook's biggest challenges is its role as the police officer of free expression for its two billion users.

Now the social network is opening up about how it decides which posts to take down – and why. On Tuesday, the company released for the first time the 27 pages of guidelines, called community standards, that it gives to its workforce of thousands of human censors. They cover dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would give users the opportunity to appeal its decisions.

The move adds a new level of transparency to a process that users, the public and advocates have criticized as arbitrary and opaque. The newly published guidelines offer guidance on topics including how to tell the difference between humor, sarcasm and hate speech. They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breastfeeding or address breast cancer.

"We want people to know our standards and we want to bring clarity to people," Monika Bickert, Facebook director of global policy administration, said in an interview. She added that she hoped that publication of the guidelines would stimulate dialogue. "We try to strike the line between security and the ability to really express oneself."

The company's censors, known as content moderators, have been chastised by civil rights groups for mistakenly taking down posts from minorities who were sharing their stories of being victims of racial slurs. The moderators have struggled to tell the difference between someone posting a slur as an attack and someone using the slur to tell the story of their own victimization.

In another case, moderators removed the iconic Vietnam War photo of a girl fleeing a napalm attack, saying the girl's nudity violated its policies. (The photo was restored after protests from news organizations.) Moderators have deleted posts from activists and journalists in Burma and in disputed areas such as Palestine and Kashmir, and they banned the pro-Trump activists Diamond and Silk as "unsafe to the community."

The publication of the guidelines is part of a wave of transparency that Facebook hopes will quiet its many critics. The company has also opened up about its political ads and tightened its privacy controls after coming under fire for its lax approach to protecting consumer data.

The company is under investigation by the U.S. Federal Trade Commission over the misuse of data by a Trump-affiliated consulting firm, Cambridge Analytica, and Facebook chief executive Mark Zuckerberg recently testified on the subject before Congress. Bickert said discussions about sharing the guidelines began last fall and were not tied to the Cambridge controversy.

The company's content policies began in 2005 and dealt with nudity and Holocaust denial in the first few years; since 2008 they have grown into today's 27 pages. As Facebook's reach has expanded to almost a third of the world's population, Bickert's team has grown significantly and is expected to grow even more in the coming year. A diverse team of 7,500 reviewers in places such as Austin, Dublin and the Philippines assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they make complex decisions in applying Facebook's rules.

Bickert also employs high-level experts, including a human rights lawyer, a rape counselor, a counterterrorism expert from West Point and a Ph.D. researcher with expertise in European extremist organizations, as part of her content review team.

Activists and users have been particularly frustrated by the lack of an appeals process when their posts are taken down. (Facebook users can appeal the suspension of an entire account, but not the removal of individual posts.) The Washington Post has previously documented how people in this situation likened it to being thrown into "Facebook jail" – without being given a reason why.

Zahra Billoo, executive director of the Council on American-Islamic Relations for the San Francisco Bay Area, said adding an appeals process and opening up the guidelines would be a "positive development," but said the social network still has a way to go if it wants to remain a relevant and safe space.

Billoo said at least a dozen white supremacist pages are still on the platform, even though its policies prohibit hate speech and Zuckerberg told Congress this month that Facebook does not allow hate groups.

"One persistent issue that many Muslim communities have asked is how to make Facebook better protect users from hatred and not be abducted by white racists, right-wing activists, Republicans or Russians to fight Muslims, LGBT and undocumented people, "she said.

Billoo herself was censored by Facebook two weeks after Donald Trump's election, when she posted a picture of a handwritten letter sent to a San Jose mosque that read: "He will do to you Muslims what Hitler did to the Jews."

Bickert's team has been working for years to develop a software system that can classify the reasons a post was deleted, so that users can get clearer information – and so that Facebook can track, for example, how many hate speech posts were taken down in a given year, or whether certain groups have their posts removed more often than others.

Currently, people whose posts are taken down receive a generic message saying that they violated Facebook's community standards. Following Tuesday's announcement, people will be told whether their posts violated policies on nudity, hate speech or violence. A Facebook executive said the teams were working to build more tools. "We want to provide more details and information about why content has been removed," said Ellen Silver, vice president of community operations at Facebook. "We have more to do there, and we're committed to those improvements."

Facebook's content moderation is still driven by people, but the company uses technology to assist in the work. It currently uses software to identify duplicate reports, a time-saving technique that spares reviewers from having to assess the same piece of content over and over when it has been flagged by many people at once. Software can also identify the language of a post and some of its topics, helping route the post to the reviewer with the greatest expertise.

The company can recognize images that have been posted before, but it cannot recognize new ones. For example, if a terrorist organization re-uploads a beheading video that Facebook has already taken down, Facebook's systems will notice it almost immediately, Silver said, but they cannot identify new beheading videos. The majority of content flagged by the community is reviewed within 24 hours, she said.
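As a rough illustration of that general idea – not Facebook's actual system – the sketch below shows how a fingerprint database of previously removed content can flag exact re-uploads almost instantly while leaving brand-new material unrecognized. The function names and the use of SHA-256 fingerprinting are assumptions made for the example.

```python
# Minimal sketch of hash-based re-upload detection (generic illustration only;
# not Facebook's system). Previously removed content is fingerprinted, so an
# exact re-upload matches immediately -- but new content never does.
import hashlib

# Hypothetical store of fingerprints for content reviewers already removed.
known_removed_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Compute a simple exact-match fingerprint (SHA-256) of the uploaded bytes."""
    return hashlib.sha256(content).hexdigest()

def record_removal(content: bytes) -> None:
    """Called after reviewers take a piece of content down."""
    known_removed_hashes.add(fingerprint(content))

def is_known_removed(content: bytes) -> bool:
    """True only if this exact content was removed before."""
    return fingerprint(content) in known_removed_hashes

# Usage: a video is removed once, then re-uploaded byte-for-byte.
original = b"...video bytes..."
record_removal(original)
print(is_known_removed(original))      # True  -- caught almost immediately
print(is_known_removed(b"new video"))  # False -- new material is not recognized
```

Real systems typically use perceptual rather than exact hashes so that re-encoded or slightly altered copies still match, but the limitation described above is the same: only content that has already been seen and fingerprinted can be caught this way.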

Every two weeks, employees and executives meet to make decisions on the most difficult issues from around the world, discussing the pros and cons of potential policies. Teams that present must prepare research on each side of an issue, a list of possible solutions and a recommendation. They are encouraged to list the organizations outside Facebook that they consulted.

In an interview, Bickert and Silver acknowledged that Facebook will continue to make mistakes in its judgments. "It's the scale at which we work," Silver said. "Even if accuracy were 99 percent, that's still a lot of mistakes."

