Meta shut down internal research into how Facebook and Instagram affect mental health after finding causal evidence that its products harm users' well-being, according to unredacted filings in a class action lawsuit brought by U.S. school districts against Meta and other social media companies.
In 2020, Meta partnered with survey firm Nielsen on an effort dubbed "Project Mercury" to assess the effects of temporarily deactivating Facebook and Instagram, according to documents obtained through discovery. The results were unwelcome for the company: people who stopped using the platforms for a week reported less depression, anxiety, loneliness, and social comparison. Rather than publishing these results or pursuing follow-up research, Meta halted the work, internally attributing the negative findings to the prevailing media narrative about the company.
Internally, however, staff members assured Nick Clegg, then Meta's head of global public policy, that the research's conclusions were valid. One unnamed researcher reportedly noted that "the Nielsen study clearly shows a causal impact on social comparison," while another staff member worried that suppressing the negative findings resembled the tobacco industry's practice of concealing the harms of smoking.
Despite documented evidence of a causal relationship between Meta’s platforms and adverse mental health outcomes, the company allegedly told Congress it lacked the ability to measure whether its products harmed teenage girls. A spokesperson, Andy Stone, stated that the study was halted due to flawed methodology and emphasized Meta’s efforts to improve safety, claiming that the company has listened to parents, researched critical issues, and implemented meaningful protections for teens over the past decade.
The lawsuit accuses Meta, along with Google, TikTok, and Snapchat, of concealing the risks their platforms pose. Plaintiffs argue that these companies intentionally hid known dangers from users, parents, and educators. Specific allegations against Meta include designing youth safety features to be ineffective, delaying action against child predators out of concern for growth, and prioritizing engagement metrics over safety. Internal documents suggest Meta set a high bar for enforcement, requiring 17 violations before removing accounts involved in sex trafficking, and knowingly served harmful content to teens by optimizing for engagement.
Mark Zuckerberg reportedly told staff in 2021 that child safety was not his top priority, as he was focused on other initiatives such as building the metaverse. Internal safety efforts were reportedly delayed or downplayed, with pressure placed on staff to justify inaction. Stone rejected these claims, asserting that Meta's safety measures are effective and that accounts flagged for sex trafficking are removed promptly.
Meta has filed a motion to prevent the unsealing of the internal documents cited in the lawsuit, arguing that the plaintiffs' request is overly broad. A hearing is scheduled for January 26 in the U.S. District Court for the Northern District of California.