By: The Guardian
June 16, 2020
“It’s much easier to build an AI system that can detect a nipple than it is to determine what is linguistically hate speech.”
The Facebook founder Mark Zuckerberg made that comment in 2018 while discussing how the company tackles content deemed inappropriate or, in Facebook's terms, found to be violating its community standards.
Facebook’s artificial intelligence technology for identifying nudity gets it right more often than not. Between January and March this year, Facebook removed 39.5m pieces of content for adult nudity or sexual activity, and 99.2% of it was removed automatically, without a user reporting it.
In the same period, there were 2.5m appeals against removal, and 613,000 pieces of content were restored.