A letter sent Monday to Facebook chief executive officer Mark Zuckerberg from more than three dozen attorneys general made one thing perfectly clear: They believe an Instagram for children is such a bad idea that Facebook should abandon its plans to build one.
In March, reports began swirling that the social media giant was looking to develop a new version of the photo- and video-sharing network aimed at users under 13 years old, a potential audience that current company policy forbids from participating in the platform. Facebook representatives confirmed that the company was indeed exploring a kid-oriented offering, but the matter has apparently raised alarm at the National Association of Attorneys General.
The organization, a bipartisan group of 44 state AGs from across the country, made the case against social media use by kids in general, citing research about its potentially harmful effects on young people. But it also made a pointed argument against the company in particular, in rather scathing terms.
“Use of social media can be detrimental to the health and well-being of children, who are not equipped to navigate the challenges of having a social media account,” they wrote. “Facebook has historically failed to protect the welfare of children on its platforms. The attorneys general have an interest in protecting our youngest citizens, and Facebook’s plans to create a platform where kids under the age of 13 are encouraged to share content online is contrary to that interest.”
The AGs noted that studies have correlated social media use with low self-esteem, self-harm and even thoughts of suicide among children. These platforms also remain a breeding ground for cyberbullying, they said, along with other risks, including dangerous actors using them to target and abuse children. Protecting privacy is also paramount when it comes to such young users.
This is concerning enough, but the AGs seem to believe that Facebook is the last company that should be entrusted to create a safe environment for kids.
Naturally, the tech company disagrees.
As a Facebook spokesperson wrote in response to a WWD request for comment, “As every parent knows, kids are already online. We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing.”
The company is consulting experts in child development, child safety and mental health, as well as privacy advocates, the person said, and “in addition, we commit today to not showing ads in any Instagram experience we develop for people under the age of 13.” Whether that’s just for now or forever isn’t clear.
Facebook has been pledging for weeks that it won’t target kids with ads. The promise stands out, because advertising is the lifeblood of the parent company. But it raises the question of why Facebook wants to build an offering that will surely become a magnet for attention and scrutiny if it can’t generate revenue from it.
One potential scenario: The company could use the kids’ data to inform ads aimed at their parents. If that’s the case, Facebook would still track children’s behavior, capture the information and generate insights based on it, and the public would have to trust the company to safeguard all of it. That’s a lot to ask, especially of a tech company that has been roiled by privacy scandals for years.
In response to a follow-up request by WWD, Facebook did not directly address the specifics above or other ways in which the children’s data could be used. Instead, the company explained that it’s still very early in the process, so it hasn’t figured out all the details of the new platform yet. In other words, parent-targeting is not off the table.
There are numerous other reasons why the company would be interested in young people, of course. Marketers know well the power of attaching brands to childhood memories, as it can create an intimate bond that lasts a lifetime. Meanwhile, hooking ever-younger users also has a short-term effect, further plumping up Instagram’s already massive user base of more than a billion people.
Whatever its business interests, the company insists that safety and privacy will be fundamental parts of its decision-making process. And indeed they have to be, as the very notion of an Instagram for kids is, on its face, already horrifying critics who now include most of the nation’s attorneys general.
Notably, it’s not the first time Facebook has made a play for younger users. When it rolled out Messenger for Kids in 2017, the company similarly explained that it developed the service alongside child development experts, so youngsters could have a secure way to communicate with parent-approved contacts.
Then in 2019, a technical bug let children join Messenger groups with strangers. Facebook addressed the glitch and went on to open the app up to more than 70 new markets last year.