Meta Platforms is implementing new privacy and parental controls for Instagram accounts of users under 18 to address concerns about social media’s impact on young people. Starting Tuesday, Meta will automatically convert these accounts to “Teen Accounts,” which will have enhanced privacy settings.
Key changes include:
– Default Privacy: Teen accounts will be private by default, meaning users can be messaged or tagged only by accounts they follow or are already connected to.
– Sensitive Content Settings: These will be set to the most restrictive levels available.
– Parental Controls: Users under 16 will need parental permission to change default settings. Parents will have tools to monitor their children’s interactions and limit app usage.
The move follows studies linking social media use to increased depression, anxiety, and learning disabilities among young users. Meta, along with other major platforms like ByteDance’s TikTok and Google’s YouTube, faces numerous lawsuits regarding the addictive nature of social media and its effects on children.
Meta’s update comes three years after it halted development on a version of Instagram designed specifically for teenagers, in response to safety concerns raised by lawmakers and advocacy groups.
In July, the U.S. Senate advanced two bills, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, aimed at holding social media companies accountable for their platforms’ impact on young users.
As part of the update, Instagram will prompt users under 18 to close the app after 60 minutes of use each day and will enable a default sleep mode that silences notifications overnight. Meta plans to roll out these changes within 60 days in the U.S., UK, Canada, and Australia, with a global rollout starting in January.