Wednesday was a historic day: for the first time, Mark Zuckerberg testified before a jury, taking the stand under oath to answer allegations that Meta knowingly designed and promoted products that hooked young users, including children, despite internal warnings about the risks.
While Zuckerberg repeatedly sidestepped and dodged questions, to the point that the judge instructed him to answer directly, he can't deflect his way out of this one. The evidence in this social media trial speaks for itself.
The plaintiff’s attorney, Mark Lanier, focused on three central themes in his questioning: 1) addicting users; 2) allowing underage users access to the platform; and 3) making business decisions that put profits over safety.
Zuckerberg was presented with a 2015 email in which he stated that his goal for 2016 was to increase users' time spent on the platform by 12%. He argued that Meta's growth targets reflect an aim to give users something useful, not to addict them, and said the company does not seek to attract children as users.
When asked whether he believes people tend to use something more if it’s addictive, he dismissed the premise. “I don’t think that applies here,” he said.
But it absolutely does apply. Meta’s entire business model is built on user engagement. Social media appears “free,” but a child’s time, attention and data are the product being sold. More hours with eyes glued to the screen mean more advertisements to sell. The user is the product. The incentive is to keep users engaged as much as possible.
Earlier in the trial, Stanford University addiction expert Dr. Anna Lembke testified that social media meets the clinical criteria for addiction.
Lanier also questioned Zuckerberg extensively on Meta’s age-verification policies. He showed an internal Meta email from 2015 estimating that 4 million children under 13 were using Instagram — approximately 30% of U.S. children ages 10 to 12. One in three preteens.
Zuckerberg said the company removes identified underage users and includes terms about age requirements during the sign-up process. Lanier responded, “You expect a 9-year-old to read all of the fine print? That’s your basis for swearing under oath that children under 13 are not allowed?”
Zuckerberg added that some children “lie about their age in order to use the services.” During this exchange, he also said, “I don’t see why this is so complicated … we have rules and people broadly understand that.”
Waving his hand and saying “we have rules” is not an adequate defense. These are minors. It is the company’s responsibility to ensure the platform is effectively age-gated; otherwise, its stated age policy is meaningless.
In practice, age verification on most social media platforms relies largely on self-reported birthdates. A child can enter a false age, click to accept the terms and conditions and gain access within minutes. Critics argue that without meaningful safeguards, age restrictions amount to little more than an honor system.
Age of access is a key issue in this trial. The plaintiff, K.G.M., who got on Instagram at age 9, alleges that her social media use as a child and teenager led to body dysmorphia, suicidal thoughts, anxiety, addiction and depression. Her use of the app from such a young age, through a period of significant brain development between ages 10 and 12, is central to the harms she alleges.
Instagram should never have allowed her on the platform at age 9, the plaintiff argues. Whether the jury ultimately agrees remains to be seen, but the case places responsibility for those decisions squarely on Meta’s leadership.
Lanier ended his questioning by unrolling — with the help of six others — a 50-foot collage of every selfie K.G.M. had posted on Instagram, many with beauty filters. He asked Zuckerberg whether Meta ever investigated her account for unhealthy behavior. Zuckerberg did not answer.
Earlier, Lanier pressed Zuckerberg about his decision to allow beauty filters that mimicked plastic surgery after 18 internal experts warned they were harmful to teenage girls and could contribute to body dysmorphia, according to internal documents. Zuckerberg and Adam Mosseri, head of Instagram, ultimately reversed a temporary ban and allowed the filters on the platform. Plaintiffs contend that decision exposed vulnerable young users to tools linked to body dysmorphia and other mental health struggles.
Zuckerberg defended the decision by saying that after lifting the ban, Instagram did not create its own filters or recommend them to users. He added, “I think oftentimes telling people that they can’t express themselves like that is overbearing.”
What spin. Removing plastic surgery filters that harm young girls is, in his words, "overbearing." Many parents would call it putting reasonable safeguards in place.
While Zuckerberg has publicly said Meta cares about children’s safety — telling Congress in 2024 that “Our job is to make sure that we build tools to help keep people safe” and that “We are on the side of parents everywhere working hard to raise their kids” — the internal evidence presented at trial suggests otherwise.
Though he would not admit in court that he knew his products were addictive or targeted teens, he didn’t need to. The jury — and the public — can weigh his answers against the internal documents and decide for themselves.
