Meta Expands Teen Safety Protections Across Platforms
Meta has announced it will expand its Teen Account protections to Facebook and Messenger while adding new restrictions for Instagram’s younger users, as the company continues to address concerns about teen safety across its platforms.
Building on the Instagram Teen Accounts launched last year, Meta is implementing additional safeguards that require parental permission for users under 16 to access certain features. Without parental approval, these teens will be unable to go Live or to turn off the feature that blurs images containing suspected nudity in direct messages.
The company reports that since implementing Teen Accounts, 97% of users aged 13-15 have maintained the default restrictions, suggesting strong adoption of the safety measures.
Teen Accounts Expanding to Facebook and Messenger
Meta will begin rolling out Teen Accounts to Facebook and Messenger users in the United States, United Kingdom, Australia, and Canada, with plans to expand to additional regions soon. These accounts will offer protections similar to those on Instagram, including limits on inappropriate content and unwanted contact.
The expansion follows what Meta describes as a positive reception to Instagram Teen Accounts, which had at least 54 million active accounts globally as of April 8, 2025.
According to survey research conducted by Ipsos and commissioned by Meta, 94% of U.S. parents find Teen Accounts helpful, and 85% believe they make it easier to help teens have positive experiences on Instagram. Over 90% of parents surveyed view the default protections as beneficial.
Teen Social Media Usage Remains High
The new measures come as teen social media usage continues at high levels. A Pew Research Center survey conducted in the fall of 2024 found that 46% of U.S. teens report being online “almost constantly,” unchanged from recent years, but nearly double the 24% reported a decade ago.
The same research shows platform preferences among teens, with YouTube used by 90%, followed by TikTok (63%), Instagram (61%), and Snapchat (55%). Facebook usage among teens has declined significantly over the past decade, dropping from 71% in 2014-15 to 32% in 2024.
Safety Advocates Call for Additional Measures
While child safety organizations have welcomed the expanded protections, The Guardian reports that some advocate for more comprehensive approaches.
UK child protection charity NSPCC stated, “For these changes to be truly effective, they must be combined with proactive measures so dangerous content doesn’t proliferate on Instagram, Facebook and Messenger in the first place.”
The announcement coincides with the implementation of the UK’s Online Safety Act, which requires tech platforms to shield users under 18 from potentially harmful content and take steps to prevent illegal material from appearing on their services.
Meta indicates the new Instagram protections will become available in the coming months as the company continues its efforts to create safer online environments for teenage users.
Dragomir is a Serbian freelance blog writer and translator. He is passionate about covering insightful stories and exploring topics such as influencer marketing, the creator economy, technology, business, and cyber fraud.