
Undercover report shows the Facebook moderation sausage being made – TechCrunch



An undercover reporter with the UK's Channel 4 visited a content moderation outsourcing firm in Dublin and came away with troubling findings, including about how the company handles underage users.

It is valuable for journalists to hold big companies (and their smaller contractors) accountable, but most of what is exposed here concerns how the process is carried out rather than problems with the process itself. I'm no Facebook fan, but in the matter of moderation I think the company is sincere, if hugely unprepared.

The points raised are addressed in a letter from Facebook to the filmmakers: some content is not in violation of the company's standards and may be informative; underage users and content have special requirements; popular pages need to be handled differently from small ones, whether they belong to radical partisans or celebrities (or both); hate speech is a delicate and complex matter that often needs to be reviewed multiple times; and so on.

These answers don't fully resolve the concerns, but they are the kind of unsatisfying yet workable compromises this job produces. The deeper problem is that the company dragged its feet for years on taking responsibility for content, and as a result its moderation resources are simply overtaxed. Why do you think it's outsourcing the work?

This is, by the way, a horrible job.

Facebook says in a blog post that it is working on doubling its "safety and security" staff to 20,000, of which 6,500 will be on moderation duty.

Even with a staff of thousands, the judgments that need to be made are often highly subjective, and the volume of content is enormous. That may not be good enough, but it does not mean the system does not work at all.

For now, the process will be done by thousands of people imperfectly executing the task; automated processes are useful, but they are no replacement for the real thing. The result is a huge international group of moderators, overworked and cynical by profession, doing a messy and at times inadequate job of it.

