
During recent testimony in a New Mexico child safety trial, Mark Zuckerberg, CEO of Meta Platforms, systematically downplayed the significance of his company's own internal research on social media addiction and its impact on young users. His testimony, part of a broader lawsuit alleging child-safety failures and addictive platform design, emphasized that the owners of mobile operating systems and app stores, namely Apple and Google, are better positioned to handle age verification for children.
In pre-recorded testimony from March, Zuckerberg was confronted directly with these internal findings. Asked repeatedly about Meta's understanding of social media addiction and other issues studied by its own teams, his general approach was to cast doubt on the certainty or practical implications of the research. One internal document, for example, found that Facebook users come to associate posting with feedback, which could lead them to seek that reward by visiting the site more often. Zuckerberg responded that he was "not sure if that's actually how it works in practice," though he agreed the question was "summarizing what they appear to be saying."
Another internal chart presented to Zuckerberg reportedly showed that around 20 percent of 11-year-olds were monthly active users on Instagram. This figure is significant given the platform's stated minimum age of 13. While acknowledging the graph's content, Zuckerberg said he was "not familiar with what methodology we were using to estimate this." He added that "if we had direct knowledge that any given person was under the age of 13, that we would have them removed from our services."
The CEO also dismissed findings from a company researcher who noted "there is increasing scientific evidence, particularly in the US, … that the average net effect of Facebook on people's well being is slightly negative." Zuckerberg countered this by stating, "my understanding is that the general consensus view is not that." This consistent pattern of questioning internal research aligns with Meta's broader defense strategy in these child safety trials.
Zuckerberg's testimony in New Mexico mirrored his appearance in a separate social media addiction trial in Los Angeles. In both instances, he frequently rejected the "characterization" of questions posed by opposing counsel, emphasizing that Meta's goal is to make its apps "useful," not merely to maximize the time users spend on them.
This isn't the first time a Meta executive has attempted to minimize internal findings. In 2021, whistleblower Frances Haugen disclosed documents that indicated Facebook's researchers had found Instagram negatively impacted some teen girls' self-perception. Instagram chief Adam Mosseri, whose recorded testimony was heard a day prior to Zuckerberg's, similarly referred to some of these disclosures as based on "problematic research." He claimed, "Most research is surveys. We run hundreds of surveys every month."
Despite downplaying internal research findings in court, Meta has concurrently introduced new child safety measures. Instagram, for instance, has unveiled a new flagging system to alert parents if their child repeatedly searches for self-harm or suicide content within a short period. These alerts are designed to provide parents with information and expert resources to support their teens during sensitive conversations. This move, following Zuckerberg's various testimonies, indicates an ongoing effort to address concerns about user safety, even while challenging certain legal characterizations of its platforms.
| Research Finding | Zuckerberg/Meta's Stance |
|---|---|
| Facebook posting associated with feedback, leading users to seek rewards by visiting more often. | "Not sure if that's actually how it works in practice"; "agree you're summarizing what they appear to be saying." |
| Around 20% of 11-year-olds are monthly Instagram users. | "I agree that the graph says that, I am not familiar with what methodology we were using." |
| Average net effect of Facebook on well-being is slightly negative. | "My understanding is that the general consensus view is not that." |
| Instagram made some teen girls feel worse about themselves. | "Problematic research," "Most research is surveys." |
**For Developers**
Zuckerberg's testimony highlights the increasing legal scrutiny on platform design, particularly features that could be perceived as addictive. Consider how feedback loops and engagement mechanisms in your applications are structured and their potential impact on user behavior, especially for younger audiences.
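As a purely illustrative sketch of the kind of guardrail this scrutiny points toward, the snippet below shows a hypothetical session-time check that surfaces a break nudge, rather than another engagement prompt, once a young user passes a daily cap. All names here (`SessionTracker`, `DAILY_LIMIT_MINUTES`) are invented for this example and do not reflect any real platform's API.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical guardrail: cap daily in-app time for users under 18
# and trigger a break reminder instead of a further engagement prompt.
DAILY_LIMIT_MINUTES = 60

@dataclass
class SessionTracker:
    birth_year: int
    minutes_by_day: dict = field(default_factory=dict)

    def record(self, day: date, minutes: int) -> None:
        # Accumulate session minutes per calendar day.
        self.minutes_by_day[day] = self.minutes_by_day.get(day, 0) + minutes

    def is_minor(self, today: date) -> bool:
        # Coarse birth-year check; real systems need verified age signals,
        # which is precisely the contested question in these trials.
        return today.year - self.birth_year < 18

    def should_nudge(self, today: date) -> bool:
        # Nudge once a minor exceeds the daily cap.
        used = self.minutes_by_day.get(today, 0)
        return self.is_minor(today) and used >= DAILY_LIMIT_MINUTES
```

The design choice worth noting is that the check gates on both age and accumulated time, so the same event stream that powers engagement features can also power a protective limit.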
**For Founders & Product Managers**
The ongoing trials against Meta underscore the critical importance of internal research integrity and transparency. Discrepancies between internal findings and public statements can lead to significant legal and reputational challenges. Ensure your organization has clear policies for addressing sensitive research insights.
**For Parents & Consumers**
The revelations about internal Meta research and Zuckerberg's responses suggest a complex and sometimes evasive stance on platform impacts. While new features like Instagram's self-harm flagging system offer some protection, parents should remain vigilant and utilize available tools to monitor their children's online activity, rather than relying solely on platform self-regulation.
**For Investors in Social Media Platforms**
The legal battles and the cost of defending against widespread allegations of harm, including the need for new safety features, represent a non-trivial operational cost and potential liability. This could impact future earnings and growth projections as platforms face increased regulatory pressure and potential litigation expenses.
The New Mexico Attorney General sued Meta in 2023, alleging that the company designed its platforms to be addictive and facilitated predators' access to minors. The lawsuit draws on internal Meta documents to show how the company's applications affect teens and other users, and it is one of several ongoing legal challenges Meta faces over the safety and well-being of its younger users.