Google and Facebook: Coronavirus Outbreak To Result In Less Content Moderation
The ongoing pandemic has hit both the human population and the economy hard. More people have shifted to working from home, and recent usage numbers show that translates into more time spent on social media.
These social media platforms depend on contract workers to filter out content that violates each platform's social and cultural standards. But due to the lockdown, content moderators won't be able to do that work at full capacity.
Facebook Inc., the parent company of Facebook, WhatsApp, and Instagram, and YouTube have both warned that there'll be less content moderation and slower customer support.
Facebook Inc. mostly depends on contract workers, and it won't grant them remote access to its systems, citing privacy and legal concerns.
So, most of the work will have to be handled by the company's own employees with assistance from its AI software, which has improved continuously since its inception but is still not fully reliable. The shift toward automated moderation has already sparked an outcry among some Facebook users who have seen even harmless posts nixed.
Google has also turned its focus to automated content moderation systems across its services, including YouTube.
On YouTube, creators should expect more videos to be axed and a slower appeal process.
Typically, Google works with over 10,000 contract workers, while Facebook depends on about 15,000 contract workers for content moderation.