Meta has introduced a series of new tools and updates designed to enhance the safety of teens and children on its platforms, including Instagram and Facebook. These enhancements feature improvements to direct messaging for teen accounts, broader protections against nudity, and stronger safeguards for adult-managed accounts that predominantly feature minors.
Teen accounts will now show users more contextual details about their messaging contacts, such as safety tips and the account's creation date, along with a new streamlined option to block and report a message sender in a single action. In June alone, teens used Meta's safety notices to block one million accounts and reported another million after seeing these notices.
To address cross-border sextortion scams, Meta is rolling out a "Location Notice" feature on Instagram, which flags when someone a user is chatting with is located in another country—a common tactic scammers use against young people. Over 10% of users who saw the notice interacted with it to learn more about sharing their location.
Nudity protection, a key feature that automatically blurs potentially nude images in direct messages, remains highly used. As of June, 99% of users, including teens, kept this setting enabled. The feature has also decreased the sharing of explicit content, with nearly 45% of recipients choosing not to forward such images after receiving a warning.
For accounts operated by adults that prominently feature minors—such as those managed by parents or talent agents—Meta is implementing Teen Account protections. These include activating strict messaging controls and filtering offensive comments through Hidden Words, aiming to prevent unwanted or inappropriate contact before it occurs.
Moreover, Meta plans to make these accounts less discoverable to suspicious users by reducing their visibility in searches and recommendations. This initiative complements earlier measures like disabling gift acceptance and paid subscription features for such accounts.
In ongoing efforts to combat inappropriate interactions, Meta has removed nearly 135,000 Instagram accounts for posting sexual comments or soliciting sexual images from adult-managed accounts involving children under 13. Additionally, over 500,000 related Facebook and Instagram accounts linked to those original profiles have been taken down. Meta continues to collaborate with other technology companies through the Tech Coalition’s Lantern program to prevent harmful actors from surfacing on other platforms.