Facebook is developing its own chips that can analyse and filter video content in real time, paving the way to potentially replace human moderators as streaming on the social network grows exponentially.
Users of Facebook Live are posting inappropriate content constantly. Some abuse the service on purpose, while others are simply ignorant of the terms and conditions. Either way, many of the videos being streamed on Facebook need strict monitoring and filtering.
Yann LeCun, Facebook's chief artificial intelligence scientist, said during a talk at the Viva Technology conference in Paris that the company is making its own chips for filtering video content. "Let's imagine someone uses Facebook Live to film their own suicide or murder. You'd like to be able to take down that kind of content as it happens," said LeCun.
Bloomberg reports that, due to Facebook's massive user base, conventional filtering methods simply aren't enough anymore, as they require too much energy and computing power.
Facebook currently uses Intel's CPUs for its A.I. requirements, according to Kim Hazelwood, engineering manager at Facebook. Switching to its own chips, specialised and optimised for video filtering, could expedite the identification of bad actors on Facebook Live, including when a person commits suicide in a livestream or performs other acts of violence toward themselves or others.
This is not the first time we've heard chatter of Facebook looking inward for bespoke silicon. In April 2018, reports suggested Facebook was following in the footsteps of Apple and Google by developing its own chipsets.
"Facebook has worked on hardware before: It makes its own server designs, motherboards, its own communications chips for data centres. So this is not completely new for Facebook," said LeCun. It's true. Facebook already uses A.I. to detect and eliminate extremist propaganda, fake accounts, and hate speech. But the technology in its current form isn't sophisticated enough to handle every pressing issue on the social network.