In the wake of the coronavirus crisis, many social media companies have announced staff reductions to comply with social distancing norms.
That’s why Silicon Valley tech titans like Google, which owns YouTube, have placed a heavier reliance on algorithms rather than human censors to “help with some of the work normally done by reviewers.”
YouTube said in a blog post that it is switching to “automated systems” that “will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem.”
There’s just one problem. In the very next sentence, YouTube contradicts itself, admitting it will remove content even if that content doesn’t violate its community standards: “As we do this, users and creators may see increased video removals, including some videos that may not violate policies.”
YouTube also sent out a tweet on Monday confirming the new policy:
With fewer people to review content, our automated systems will be stepping in to keep YouTube safe. More videos will be removed than normal during this time, including content that does not violate our Community Guidelines.
We know this will be hard for all of you.
— TeamYouTube (@TeamYouTube) March 16, 2020
And although YouTube is blaming the robots for the erroneous flagging of legitimate content, tech news site Fast Company says otherwise, explaining that “Google and YouTube aren’t entrusting COVID-19 to an algorithm.”
If this is true, it means that YouTube is actively taking down videos that don’t violate its community guidelines, rendering those ‘community standards’ null and void – a phenomenon that many would argue is far worse than ‘fake news’.