Instagram rolls out restrictive new privacy settings for teenagers
Instagram is changing the default privacy settings for many U.S. teenagers, part of an effort to keep them safer and give parents more control over how their kids interact online.
The new settings will make teen accounts private by default, limit whom those users can send private messages to, and place teens in the “most restrictive” tier for viewing sensitive content. That means the app will block teens from seeing sensitive photos and videos, including posts that show people fighting or certain cosmetic procedures.
These more restrictive settings will be turned on automatically for all Instagram users under 18 years of age, the company said Tuesday, though 16- and 17-year-olds can change them on their own. If a younger teen tries to evade the new restrictions by changing their birthday on the service, Meta said it will use artificial intelligence technology to try to “proactively find these teens and place them” into more restrictive accounts.
The restrictions for those under 16 can be relaxed if a parent provides permission via their own Instagram account.
The widespread changes come after years of criticism that Instagram, which is owned by Facebook parent company Meta Platforms Inc., has failed to adequately protect young people online. Meta was sued last year by a group of more than 30 states alleging the company’s apps are harming young people, and Chief Executive Officer Mark Zuckerberg appeared at a congressional hearing earlier this year on child safety where Meta was criticized for enabling child sexual exploitation. Zuckerberg has fought in court to avoid facing personal liability for any alleged harms.
In 2021, a Facebook whistleblower went public with hundreds of pages of internal Meta documents, including the company’s own research finding that Instagram had a negative impact on the mental health of some teen girls.
Instagram boss Adam Mosseri said the new policies and restrictions have been in the works for the better part of a year, and that they weren’t designed to appease angry lawmakers. “Honestly, it’s not designed for any of them,” he said. “My hope is that it’s received well by parents and by teens because that’s exactly who it’s designed for.”
The new account settings represent the company’s most aggressive effort to date to protect younger users. Teens will only be able to receive messages from people they already follow or are already connected to, and can only be tagged or mentioned by users they follow. Teens will also receive a notification to leave the app after 60 minutes of use each day.
Parents will also be able to see which accounts their teen is messaging, but won’t be able to read the actual messages.
Mosseri said the idea behind the increased restrictions is similar to the plan Meta had for “Instagram Youth,” a proposal for a version of the app for kids under 13. That plan was scrapped in 2021, and Mosseri said there is no plan to revive it.
It’s possible these new “Teen Accounts,” as Meta is calling them, will make their way to other company apps, including Facebook. “It’s faster to start on one app and then learn and iterate and fast follow across the rest of the family” of apps, Mosseri said.
The new restrictions will go live for all users under 18 in the U.S., U.K., Canada and Australia within the next 60 days, and Meta plans to deploy them to the rest of the European Union later this year. They’ll be rolled out globally starting in early 2025.
©2024 Bloomberg L.P. Visit bloomberg.com. Distributed by Tribune Content Agency, LLC.