Meta Platforms Inc.’s Instagram will begin automatically routing new users younger than 16 to a version of the social media platform that limits inappropriate content.
The company’s latest feature comes as lawmakers have gotten more vocal about protecting children online. Testimony last year from a whistleblower at Meta’s Facebook that the company was prioritizing profit over the health and safety of kids prompted a flurry of legislative action on Capitol Hill to hold big tech companies accountable.
New underage Instagram users will be defaulted to the “less” version of the platform, which reduces sexual, graphic, and violent content that doesn’t violate community guidelines but is considered inappropriate for minors. They retain the option to switch to the “standard” version of the platform. Individuals must be at least 13 years old to use Instagram.
Existing underage Instagram users will have a choice between the “less” or “standard” version, but the company said it will send a prompt encouraging them to opt for the restrictions.
“This is an exercise to empower teenagers and it’s constantly iterating. If we don’t see great adoption, we can change how this works,” said Jeanne Moran, policy communications manager at Meta. The company wouldn’t change the feature itself, only its approach to making teenagers aware of it, Moran added.
The Senate Commerce, Science, and Transportation Committee last month unanimously advanced a measure (S. 3663) cosponsored by Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) that would require tech companies to offer easy-to-use safeguards to control the experience and personal data of minors online.
The committee also advanced a second proposal (S. 1628) from Sens. Ed Markey (D-Mass.) and Bill Cassidy (R-La.) that would update the Children’s Online Privacy Protection Act, or COPPA, to make the collection, use, and disclosure of kids’ data safer.
Another comprehensive privacy proposal (H.R. 8152), which has advanced out of the House Energy and Commerce Committee, includes similar protections for kids.
Meta says it builds defaults, protections, and tools for kids that complement legislative proposals, though the company hasn’t taken an official position on any bill.
Instagram Head Adam Mosseri answered senators’ concerns at a hearing last year by saying the company is developing “a new experience” for kids that will make it harder for them to encounter sensitive content.
‘More Needs to Be Done’
Josh Golin, executive director of kids advocacy group Fairplay, said Instagram’s effort won’t lead to significant change.
“The choice to see extreme or harmful content on Instagram should not be left up to the children using the platform — the adults who run Instagram ought to take on that responsibility themselves,” Golin said.
Jim Steyer, founder and CEO of kids advocacy group Common Sense Media, called it a delayed step in the right direction.
“Defaulting young users to a safer version of the platform is a substantial move that could help lessen the amount of harmful content teens see on their feeds,” Steyer said. “However, the efforts to create a safer platform for young users are more complicated than this one step and more needs to be done.”
To contact the reporter on this story: Maria Curi in Washington at firstname.lastname@example.org