For apps that let people bare their souls and selfies, social networks can be rather opaque about how they handle user data. But in the wake of the Cambridge Analytica scandal, Facebook is pulling back the curtain on its platform in an attempt to be more transparent.
In a blog post published on Tuesday, the social media giant revealed numbers for its enforcement of community standards. Although the company’s guidelines have been public for years — and internal guidelines for how it would enforce them were publicized last month — this is the first time it has posted a progress report.
According to the Community Standards Enforcement Report, Facebook beefed up its ability to zero in on false accounts. The system can find and nix fakes “within minutes of registration,” wrote Guy Rosen, vice president of product management. Between January and March 2018, the company disabled 583 million fake accounts. And on Monday, Facebook announced that it had suspended roughly 200 apps pending further investigation into their user data practices.
The platform, which has two billion users, took down 837 million pieces of spam over the first three months of 2018, practically all of which was found and flagged before users reported it. That figure speaks to a concern Congress raised in April, when lawmakers grilled chief executive officer and cofounder Mark Zuckerberg about why users needed to report suspicious activity before the company took action.
Still, Rosen acknowledged that it’s not a perfect system and identified one major area for improvement: identifying hate speech.
“Our technology still doesn’t work that well [for hate speech] and so it needs to be checked by our review teams,” he said. “We removed 2.5 million pieces of hate speech in Q1 2018 — 38 percent of which was flagged by our technology.”
Though Facebook shares were down 1.6 percent to $183.73 in midday trading on Monday, the company has recovered practically all of the losses stemming from the scandal kicked off by Cambridge Analytica, which may have compromised data on as many as 87 million users. But the work to mend its image and its relationship with the public is ongoing, with Facebook doubling down on transparency — and artificial intelligence tools — to help patch things up.
Facebook, Instagram’s parent company, made no mention of its photo-sharing arm in its latest report.