Do Conspiracy Theory Videos Hurt YouTube?



Louie Veleski has some interesting opinions. He believes spirits exist and that people have never been to the moon. Veleski, from Melbourne, Australia, shares his views on his YouTube channel, Better Mankind, which earns him up to $5,400 a month.

Conspiracy theories, it turns out, are very profitable for the YouTube-inclined entrepreneur. On his channel, Peladophobian, Ryan Silvey, 18 and also from Australia, posts videos like "School Is Illuminati" and "Donald Trump Is Vladimir Putin." Although satirical, the videos can turn up alongside other contrarian or esoteric posts in search results. On average, Silvey earns more than $7,500 per month from the ads that some of his 628,000 subscribers see.

YouTube also makes a bundle. About 55 percent of the money companies pay for the 30-second ads that run before popular videos goes to the content creators; the rest goes to Alphabet, the site's parent company. In 2017, Alphabet took in more than $110 billion (compared with $90 billion in 2016). Nearly 90 percent of that came from ads, a growing share of them on YouTube.

YouTube, founded in 2005, is the dominant video platform on the internet. People around the world watch about a billion educational videos on the site every day, and a growing number use it as a news source. But media reports have implicated YouTube in the spread of fake news and extremism, often because of conspiracy videos that push false information. Now that Facebook is under government scrutiny and may face regulation, YouTube is taking action to protect its integrity. And that could mean the end of the conspiracy video.


Concern about these videos may seem exaggerated. One post claimed that a geomagnetic storm on March 18 would "[disrupt] satellites, GPS navigation, and power grids all over the planet." Some news outlets took the claim as fact until U.S. scientific authorities refuted it. The video was misleading but probably harmless.

But others may have played a role in recent tragedies. The man who drove a car into pedestrians on London Bridge in June 2017 and stabbed patrons at nearby bars may have watched YouTube videos of a Salafist preacher. Following the rally last August in Charlottesville, Virginia, by the so-called alt-right, The New Republic called the platform "the world leader in white supremacy." After the shooting in Las Vegas in October 2017, The Wall Street Journal reported that the site's algorithm was suggesting videos claiming the event was a false flag. By the time the algorithm was changed, the first five results for a search for "Las Vegas shooting" included a video claiming that government agents were responsible for the attack.

"From my experience in disinformation space," wrote Jonathan Albright, the research director of the Tow Center for Digital Journalism, in an essay on Medium, "all roads eventually seem to lead to YouTube." [19455912]  YouTube logo screen "title =" [19659002] On March 23, 2018, in Istanbul, the YouTube and Netflix app logos will be displayed on a TV screen, turkey. YouTube has voiced criticism for its role in disseminating misinformation and conspiracy theories, but the platform is trying new strategies to promote reliable sources. Chris McGrath / Getty Images

Addressing the problem is tricky, because what counts as a conspiracy theory is not always clear, says YouTube. Do predictions for 2018, including that Italy's Vesuvius will erupt and kill hundreds of thousands of people, qualify? What about Shane Dawson, who routinely posts conspiracy videos on his channel but does not necessarily endorse what he discusses? One video claiming, among other things, that aliens could be linked to the disappearance of Malaysia Airlines Flight 370 opened with the disclaimer that "these are just theories" and "they are not meant to hurt or harm any company."

The difficulty of pinpointing whether a post is groundless is part of the problem. Without a definition, YouTube's algorithm cannot filter such videos out of search results. That is a problem for Alphabet, which worries that the spread of conspiracy videos on YouTube could backfire. Misinformation that makes it into the lists of top recommended videos could ultimately drive away the customers who watch YouTube videos. "Our brands may also be negatively affected by the use of our products or services," says Alphabet's 2017 annual report, "to disseminate information that is deemed misleading."

Brian Stauffer/theispot