YouTube cracks down on QAnon videos that target individuals or groups


A worker walks past YouTube offices, in King's Cross, London, Britain, September 11, 2020. REUTERS/Toby Melville

With just weeks to go before the 2020 presidential election, YouTube has confirmed it is expanding efforts to crack down on harmful conspiracies being shared on its platform, with a specific focus on prohibiting “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

“One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate,” the company wrote in a blog post.

While YouTube’s statement notes that the company has removed hundreds of thousands of QAnon-related videos, today’s move is not quite as drastic as those taken by other major Silicon Valley players. Last week, Facebook expanded on earlier efforts to ban QAnon pages and groups that discussed violence by effectively banning QAnon-centric accounts entirely. And over the summer, Twitter began banning thousands of QAnon accounts, preventing even more from appearing in users’ recommendations, and blocking URLs associated with QAnon content from being shared on the platform.




