Facebook boss Mark Zuckerberg – whose company is mired in controversy over everything from user privacy to the amplification of extremist content and even alleged complicity in genocide – responded to growing criticism of the tech industry on Saturday by calling for more regulation, in an op-ed published in the Washington Post (and on his own Facebook page).
Zuckerberg divided the areas where he now says regulation could be useful into four: harmful content, election integrity, privacy, and data portability. Somewhat surprisingly, he offered details of what that might look like.
Zuckerberg first wrote that platforms have a "responsibility to keep people safe on our services," and that internet companies should be accountable for enforcing standards on harmful content. He noted Facebook's need to identify and remove violent or hateful speech, but also called for a "more standardized approach" across the industry, including third-party oversight: "One idea is for third parties to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what's prohibited and require companies to build systems for keeping harmful content to a minimum."
Considering that it took Facebook years to admit that white nationalism and white separatism are in fact the same thing as white supremacy, independent oversight of content decisions is probably not such a bad idea.
Regarding election integrity, Zuckerberg said the company has already taken steps, such as requiring political ad buyers to verify their real-world identities and creating a database of political advertising, but suggested that what is really needed is a comprehensive overhaul of campaign finance law:
… Deciding whether an ad is political isn't always easy. Our systems would be more effective if regulation created common standards for verifying political actors.
Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we've seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.
These are all things that are likely true, but they also dodge the question of why Facebook is vulnerable to political manipulation at all, and whether Facebook's entire information economy is in fact the problem. In the US at least, this kind of change would require a major overhaul of campaign finance and disclosure laws that is unlikely to materialize for years, if at all in the foreseeable future.
There is also the fact that the company has in the past done its best to win exemptions from ad disclosure rules, and has generally been embroiled in ethics issues around advertising, such as when the Department of Housing and Urban Development hit it with allegations of enabling housing discrimination.
On privacy, Zuckerberg called on the US to pass legislation similar to the European Union's comprehensive General Data Protection Regulation, saying he would prefer to see a "common global framework" (as opposed to a patchwork of laws in each nation). He also called for data portability, which he described as a free flow of information between services, though he pointed to Facebook Login as an example – arguably more a way the company has extended its tracking reach across the web than anything that protects user rights:
If you share data with one service, you should be able to move it to another. This gives people choices and enables developers to innovate and compete.
This is important for the Internet – and for creating services people want. It's why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who's responsible for protecting information when it moves between services.
(As TechCrunch has noted, Facebook itself has dragged its feet on data portability, making friend lists difficult to export for use on other social networks.)
This is, however, a major reversal from a year ago, when Zuckerberg was publicly on the fence about whether regulation was necessary at all, calling the GDPR basically good, but perhaps only for Europe, and suggesting that self-regulation was the better approach. What seems to have changed in the meantime is that outside political pressure on Facebook has continued to grow: it and other tech companies have faced increasing hostility from both the public and elected officials, including regulatory threats and antitrust saber-rattling. For example, after the Christchurch massacre was live-streamed on Facebook, the Australian government threatened to pass laws imposing fines – and potentially jail time for platform executives – if companies fail to remove terrorist content expeditiously.
In other words, Zuckerberg et al. may now believe that GDPR-style rules on issues like content moderation are inevitable, and that Facebook is best served by getting ahead of them.
Take, for example, Philip Morris, the cigarette titan that came out in favor of regulating the tobacco industry: an article in BMJ's Tobacco Control described its intent as enhancing "its legitimacy by defining itself as socially responsible and altering the litigation environment." Facebook's sins are of a far lesser order than the cigarette industry's, but in general, industries do not support regulation unless it either helps their bottom line or helps them avoid even tougher rules. Note that Facebook has reportedly been expanding its roster of lobbyists in Washington, which could come in handy when legislators decide what to do about it. As Bloomberg noted:
Facebook has an incentive to play a strong role in the debate over regulating tech companies' data practices. The company's rapid revenue growth and billions in profits are fueled by collecting countless data points about its users and making them easily accessible to advertisers.
… Zuckerberg has been working this year to frame Facebook's most critical issues as broader problems – of the internet as a whole, not just his company. His willingness to embrace regulation could deflect tough questions away from Facebook, or at least buy it more time to resolve them.
Zuckerberg also sidestepped arguably the most potent criticism of Facebook: that it and its Silicon Valley cousins like Google and Amazon are so big, so powerful and so entrenched that the right regulatory response is to break them up – which in Facebook's case could mean spinning off subsidiaries like WhatsApp and Instagram, or at least carving off chunks. Don't expect him to raise that one himself.
Mark Zuckerberg's full statement follows below:
Technology is a major part of our lives, and companies like Facebook have immense responsibilities. Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe. But if we were starting from scratch, we wouldn't ask companies to make these judgments alone.
I believe we need a more active role for governments and regulators. By updating the rules for the Internet, we can preserve what's best about it – the freedom for people to express themselves and for entrepreneurs to build new things – while also protecting society from broader harms.
From what I've learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.
First, harmful content. Facebook gives everyone a way to use their voice, and that creates real benefits – from sharing experiences to growing movements. As part of this, we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we'll always make mistakes and decisions that people disagree with.
Lawmakers often tell me we have too much power over speech, and I agree. I believe we shouldn't make so many important decisions about speech on our own. So we're creating an independent body so people can appeal our decisions. We're also working with governments, including French officials, on ensuring the effectiveness of content review systems.
Internet companies should be accountable for enforcing standards on harmful content. It's impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services – all with their own policies and processes – we need a more standardized approach.
One idea is for third parties to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what's prohibited and require companies to build systems for keeping harmful content to a minimum.
Facebook already publishes transparency reports on how effectively we're removing harmful content. I believe every major Internet service should do this quarterly, because it's just as important as financial reporting. Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.
Second, legislation is important for protecting elections. Facebook has already made significant changes around political ads: advertisers in many countries must verify their identities before purchasing political ads. We built a searchable archive that shows who pays for ads, what other ads they ran and which audiences saw the ads. However, deciding whether an ad is political isn't always easy. Our systems would be more effective if regulation created common standards for verifying political actors.
Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we've seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.
Third, effective privacy and data protection needs a globally harmonized framework. People around the world have called for comprehensive privacy regulation in line with the European Union's General Data Protection Regulation, and I agree. I believe it would be good for the Internet if more countries adopted regulation such as GDPR as a common framework.
New privacy regulation in the United States and around the world should build on the protections GDPR provides. It should protect your right to choose how your information is used, while enabling companies to use information for safety purposes and to provide services. It shouldn't require data to be stored locally, which would make it more vulnerable to unwarranted access. And it should establish a way to hold companies such as Facebook accountable by imposing sanctions when we make mistakes.
I also believe a common global framework – rather than regulation that varies significantly by country and state – will ensure that the Internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.
As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and on how it should apply to new technologies such as artificial intelligence.
Finally, regulation should ensure the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choices and enables developers to innovate and compete.
This is important for the Internet – and for creating services people want. It's why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who's responsible for protecting information when it moves between services.
This also requires common standards, which is why we support a standard data transfer format and the open source Data Transfer Project.
I believe Facebook has a responsibility to help address these issues, and I'm looking forward to discussing them with lawmakers around the world. We've built advanced systems for finding harmful content, stopping election interference and making ads more transparent. But people shouldn't have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there's more to discuss.
The rules governing the Internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people's lives. It's time to update these rules to define clear responsibilities for people, companies and governments going forward.