Instagram chief denies social media can be ‘clinically addictive’ in landmark case

Adam Mosseri, the head of Instagram, testified on Wednesday that he does not think users can be “clinically addicted” to the social media app.
Mosseri is the first executive to testify in the landmark social media addiction trial against YouTube and Instagram parent company Meta in a suit brought by a now 20-year-old woman identified as Kaley. The woman alleges the companies intentionally developed addictive features to hook young users, which she claims harmed her mental health.
The lawsuit is the first of more than 1,500 similar cases to go to trial and could serve as a test of whether the social media giants can be held responsible for claims that they’ve harmed young users’ mental health.
Mark Lanier, a lawyer for the plaintiff, questioned Mosseri on Wednesday about whether Instagram chooses profits over the health and safety of minors and whether Mosseri oversees an app that hooks younger users.
It was a rare look into how Mosseri views Instagram’s business. Mosseri said he hadn’t testified in a trial like this before.
Mosseri said that he didn’t think that it was possible to be addicted to Instagram but that “problematic use” was possible, though it varies from person to person. Mosseri compared it to “watching TV for longer than you feel good about.” (Mosseri conceded in questioning that he is not a doctor.)
“It’s relative,” he said. “Yes, for an individual, there’s such a thing as using Instagram more than you feel good about.”
Mosseri became head of Instagram in 2018 after joining the company then known as Facebook in 2008. In 2021, Facebook whistleblower Frances Haugen leaked a trove of internal documents indicating the company knew Instagram could have a “toxic” effect on teen girls. The same year, CNN reported that Instagram promoted accounts encouraging extreme dieting and eating disorders to teen users. The company acknowledged at the time that those accounts violated its rules.
Mosseri told a Senate committee in December 2021 that he was in favor of greater online safety regulation but also “committed” to making the platform safe, even if parents didn’t use parental control tools.
On Wednesday, Mosseri said that any implication that Instagram specifically targets teen users to maximize profits was false.
“We make less money from teens than any other demographic on the platform,” he said, while being questioned by one of Meta’s lawyers, Phyllis Jones. “Teens don’t click on ads and they don’t have much expendable income.”
Instagram has since rolled out additional safety and well-being features, most notably “teen accounts,” which apply default content restrictions and privacy protections for teen users. Meta has previously said “we strongly disagree” with the allegations in Kaley’s lawsuit.
Kaley began using Instagram at the age of nine, according to Lanier, although the app’s minimum age is 13. (Instagram has more recently begun rolling out AI age verification technology to identify younger users who sign up with an inaccurate birthdate, although the technology’s accuracy is unclear.)
Lanier, in his opening statement Monday, called out features such as “infinite scroll and autoplay” and the “like” button, which Lanier equated to a “chemical hit” that teens looking for validation from their peers grow to crave. Kaley’s lawsuit also alleges that “beauty filters” that can alter a user’s face contributed to body dysmorphia and that she experienced bullying and sextortion on Instagram.
On Wednesday, Lanier asked if Mosseri was aware that Kaley had once spent over 16 hours in a single day on Instagram.
“That sounds like problematic use,” Mosseri said.
Ahead of Mosseri’s testimony, a group of parents and family members who say they lost loved ones because of harms from social media gathered outside the courtroom around 1 am in blue rain slickers in hopes of getting one of the limited seats for public observers in the courtroom, a video from one of the parents shows. “We’re never going to stop fighting, that’s why we’re here,” said Julianna Arnold, whose 17-year-old daughter Coco died after being given a pill laced with fentanyl by an older man whom Instagram connected her to.
Instagram’s ‘beauty’ filters
Lanier questioned Mosseri at length about Instagram’s beauty filters, especially those that alter users’ faces in ways that some view as promoting cosmetic surgery.
Lanier pointed to internal documents from 2019 in which Meta executives debated whether to ban such filters. One email said experts were “unanimous on the harm there.”
“We are talking about encouraging young girls into body dysmorphia,” another email from a Meta executive read.
At first, Instagram decided to ban all filters that distort faces, Mosseri said. But it later altered the decision.
Instagram filters that promote plastic surgery, such as adding the appearance of facial scars in common procedure areas, were banned, Mosseri said. But the company decided to lift the ban on Instagram filters that alter facial features, such as enlarging a person’s lips or slimming their nose, instead deciding to stop recommending them, Mosseri said.
At the time of the policy change, Kaley was 14 years old, Lanier said.
Profits over safety?
Lanier also grilled Mosseri on his salary. Mosseri said that his base salary is “about $900,000 per year” but that his compensation can be more than $10 million or, in some years, more than $20 million, including bonuses and stock options.
Lanier questioned whether Mosseri’s decisions on product features, such as “beauty filters,” were motivated by ensuring growth at the company, thus benefiting his compensation. He showed another internal email suggesting that removing such filters would “limit our ability to be competitive in Asian markets (including India).”
“I was never concerned with any of these things affecting our stock price,” Mosseri replied.
Lanier also referenced an unreleased, internal Meta study called “Project Myst.” During his opening statement, Lanier said that the study found evidence that children who had experienced “adverse effects” were most likely to get addicted to Instagram. The study also found that parents were powerless to stop the addiction, he said.
Mosseri said he recognized the study, but didn’t remember anything specific about it. “I was a supporter, I am generally a supporter of research,” he said.
Meta lawyer Paul Schmidt argued during his opening statement that Kaley’s difficult family life during childhood, rather than Instagram, was responsible for her mental health challenges. He showed portions of pre-trial testimony from two therapists who he said worked with Kaley, suggesting they did not believe Instagram was central to her challenges. A Meta spokesperson reiterated the company’s argument in a new statement on Wednesday.
“The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles. The evidence will show she faced many significant, difficult challenges well before she ever used social media,” the statement said.
Matthew Bergman, an attorney representing Kaley, said in a statement Wednesday afternoon that Mosseri’s testimony indicated Instagram’s executives “made a conscious decision to put growth over the safety of minors.”
“The evidence shows that Instagram knew the risks its product posed to young users, yet continued to deploy features engineered to keep kids online longer, even when those features exposed them to significant danger,” he said.
The jury will likely not hear many arguments related to Instagram or YouTube’s content because of Section 230, a federal law that shields tech companies from liability over content that their users post. Ahead of Mosseri’s testimony, Superior Court Judge Carolyn Kuhl directed the parties not to question him about Instagram’s content safety features or the content Kaley was exposed to while using it, citing that law.
By CNN
