Meta executives moved forward with plans to implement end-to-end encryption for messaging features linked to Facebook and Instagram, despite internal warnings that doing so could impede efforts to detect and report child exploitation to authorities. Internal documents filed in a New Mexico court case reveal that in March 2019, Monika Bickert, Meta’s head of content policy, expressed concern in internal chats, stating, “We’re about to do something irresponsible.” This internal communication coincided with CEO Mark Zuckerberg preparing to announce the initiative publicly.
The newly uncovered documents, released publicly on Friday and previously unreported, include emails, messages, and briefing materials obtained through discovery in the lawsuit filed by New Mexico Attorney General Raul Torrez. These materials provide insight into Meta’s assessment of the potential consequences of encryption and how top policy and safety officials viewed these risks at the time.
Torrez’s lawsuit alleges that Meta gave predators unimpeded access to underage users, connecting them with potential victims and contributing to cases of real-world abuse and human trafficking. The case has just entered a jury trial, the first of its kind against Meta.
The revelations arrive amid widespread legal challenges and regulatory scrutiny worldwide concerning the platforms’ impact on young users’ well-being. In addition to the New Mexico case concerning child safety, over 40 attorneys general are investigating claims that Meta’s products negatively affect youth mental health. Several school districts have also filed lawsuits, and Zuckerberg himself testified last week in a California case involving a teenager harmed by Meta’s services.
The court documents specifically criticize Meta for allegedly misrepresenting the safety implications of its 2019 plan to default-enable end-to-end encryption on Messenger, a service that connects with Facebook and later expanded to include Instagram direct messaging.
Increased Risks
End-to-end encryption (E2EE)—which ensures that only the sender and recipient can decode messages—is a common privacy feature in many messaging apps like Apple’s iMessage, Google Messages, and WhatsApp. However, child safety advocates, including the National Center for Missing & Exploited Children (NCMEC), have warned that incorporating this technology into open social networks can significantly increase risks to children, who may connect with strangers and become vulnerable to exploitation.
The court filings reveal that high-level safety officials at Meta shared these concerns internally. While Zuckerberg publicly claimed the company was addressing encryption challenges, internal communications show top safety policy leaders warning that the company was making false claims about its ability to prevent and respond to safety threats within encrypted chats. Monika Bickert stated, “I’m not very invested in helping him sell this,” referring to Zuckerberg’s promotion of encryption on privacy grounds. She emphasized that with E2EE, “there is no way to find terror attack planning or child exploitation” proactively, making it difficult to report cases to law enforcement.
A 2019 briefing estimated that if Messenger had been encrypted that year, the company would have reported only 6.4 million instances of child nudity and exploitation imagery to NCMEC, compared with 18.4 million without encryption, a 65% decrease. A subsequent update noted that Meta would have been unable to proactively provide data in more than 600 child exploitation cases, 1,454 sextortion cases, and 152 terrorism cases, including nine threats of school shootings.
Additional Safety Measures
Andy Stone, a Meta spokesperson, explained in response to Reuters that concerns raised by Bickert and Antigone Davis, Meta’s Global Head of Safety, led to the development of new safety tools prior to the 2023 rollout of encrypted messaging on Facebook and Instagram. Although messages are encrypted by default, users can still report harmful content, which the company reviews and can refer to law enforcement if necessary.
Stone added, “The concerns raised in 2019 are precisely why we developed new safety features designed to detect and prevent abuse within encrypted chats.” Among these features are specialized accounts for minors that prevent adults from initiating contact with underage users they don’t know.
Safety officials also expressed worries about grooming and exploitation on Meta’s semi-public social media platforms and subsequent abuse through private messaging. In an internal 2019 email, Davis noted, “Facebook allows pedophiles to find each other and kids via social graph with an easy transition to Messenger,” pointing out that WhatsApp—Meta’s encrypted messaging platform—posed less risk because it isn’t directly integrated with social media. She cautioned that enabling E2EE on Messenger would be far more dangerous than what had been observed on WhatsApp.