
Facebook publishes new content guidelines, as it deals with a transparency crisis



Hoping to unpack one of the great mysteries of our time – why Facebook prohibits some content but not other content – the social media giant told the world on Tuesday "what's allowed on Facebook and what's not."

It only needed 27 pages.

"Every day, people come to Facebook to tell their stories, see the world through the eyes of others, and connect with friends and causes," the company said on its website. "We recognize how important it is for Facebook to be a place where people feel empowered to communicate, and we take seriously our role in keeping abuse off our service, which is why we have […]

According to Facebook, these standards apply "worldwide" to "all types of content." They are designed to be comprehensive – "for example, content that might not be considered hate speech may still be removed for violating our bullying policies."

Here's what you should know about Facebook's latest efforts to explain to its users how it works:

What is the objective of these Community standards?

"To encourage expression and create a safe environment."

What are they based on?

"We base our policies on input from our community and from experts in fields such as technology and public safety."

Facebook says its policies are "rooted in three principles." What are they?

The first is "safety."

Facebook says: "People need to feel safe in order to build community. We are committed to removing content that causes real harm, including (but not limited to) physical, financial and emotional injury."

What is the second?

"Voice: Our mission is all about embracing diverse views. We err on the side of allowing content, even when some find it objectionable, unless removing that content can prevent a specific harm. Moreover, at times we will allow content that might otherwise violate our standards if we feel that it is newsworthy, significant, or important to the public interest. We do this only after weighing the public interest value of the content against the risk of real-world harm."

And the third?

They call it "equity," and they say their guidelines are designed to reflect a global and diverse community that "transcends regions, cultures and languages." As a result, "our community standards can sometimes appear less nuanced than we would like, leading to an outcome that is at odds with their underlying purpose. For that reason, in some cases, and when we are provided with additional context, we make a decision based on the spirit, rather than the letter, of the policy."

How can users report what they consider to be offensive content?

Facebook says, "We make it easy for people to report potentially offensive content – including pages, groups, profiles, individual pieces of content and/or comments – to us for review. We also give people the ability to block, unfollow or hide people and posts, so that they can control their own experience on Facebook."

What are the consequences for someone who posted offensive content on Facebook?

Facebook is somewhat vague on this one, stating, "The consequences for violating our community standards vary depending on the severity of the violation and a person's history on the platform. For instance, we may warn someone for a first violation, but if they continue to violate our policies, we may restrict their ability to post on Facebook or disable their profile. We may also notify law enforcement when we believe there is a genuine risk of physical harm or a direct threat to public safety."

Who are the people who check out this questionable content?

While publishing the 27-page document is a first for the Menlo Park company, the new, more detailed rules about what is and is not kosher on the platform are enforced by thousands of paid human censors who police the platform, looking for everything from hate speech and violent imagery to terrorist propaganda and disinformation.

What else is in the guidelines?

Facebook also offers guidance on a variety of topics, including ways to distinguish between humor, sarcasm and hate speech – explaining, for example, that images of female nipples are generally taboo, but that exceptions are made for images that promote breastfeeding or breast cancer awareness.

What do Facebook spokespeople say about the new standards?

"We want people to know our standards, and we want to give people clarity," Monika Bickert, Facebook's head of global policy management, told The Washington Post, adding that she hoped the publication would spark a global conversation online. "We are trying to strike the line between safety and giving people the ability to really express themselves."

What is the story that led to this step?

As The Post writes, Facebook's censors, or "content moderators," have been chastised by civil rights groups for mistakenly taking down posts from minorities who shared stories of being victims of racial slurs. Moderators had trouble distinguishing between someone using a slur as an attack and someone using the slur to tell the story of their own victimization.

What is the context for Facebook?

The publication of the guidelines is part of what The Post calls "a wave of transparency" that Facebook hopes will quiet its many critics. The company has also made political ads more transparent and tightened its privacy controls after coming under fire for its lax approach to protecting consumer data. That issue exploded into a public relations nightmare for Facebook earlier this month, when it emerged that personal information Facebook had collected on tens of millions of its users had been harvested by a Trump-affiliated consulting firm called Cambridge Analytica. The revelation forced Facebook chief Mark Zuckerberg to testify about the issue before Congress.

So, what about those 27 pages? Is that significant?

The company's content policies began humbly in 2005, addressing matters such as nudity and Holocaust denial in its early years. By comparison, those early guidelines fit on a single page.
