Almost 20% of teenagers aged 13 to 15 reported seeing “nudity or sexual images on Instagram” that they did not want to view, according to a court document. The filing, publicly released on Friday as part of a federal lawsuit in California and reviewed by Reuters, contains excerpts from a March 2025 deposition of Instagram head Adam Mosseri.
Mosseri testified that the company generally does not share survey results, noting that self-reports are often unreliable. Meta, Instagram’s parent company, faces criticism from leaders around the world who say its platforms harm young users. In the United States, thousands of lawsuits allege that Meta’s products are designed to be addictive and have contributed to a mental health crisis among minors.
The statistic about explicit images was derived from surveys asking users about their experiences on Instagram, not from a direct review of posts, Meta spokesperson Andy Stone said. At the end of 2025, Meta announced plans to remove images and videos “containing nudity or explicit sexual activity, including those generated by AI,” with some exceptions for educational or medical content.
About 8% of users aged 13 to 15 also reported seeing someone harm themselves or threaten to do so on Instagram, according to the deposition. Most sexually explicit images are shared through private messages, and Mosseri emphasized user privacy as a constraint on moderation, saying, “A lot of people don’t want us reading their messages.”