The family of a New York adolescent has sued Instagram and its parent company, Meta, accusing them of deliberately designing addictive products for children.

In a class action complaint filed in California on Monday, lawyers for a 13-year-old girl identified only as “AA” accused the social media giant of exploiting compulsive design to keep youngsters scrolling despite knowing it could harm their mental health.

The complaint seeks at least $5 billion in damages for the millions of young people who use Instagram every day in the United States, as well as a court order prohibiting Instagram from providing many of its core features to users under the age of 18.

The company has yet to respond to the complaint in court, but it is expected to contest it. A Meta spokesperson offered only a boilerplate statement the company has used in response to similar complaints in the past, saying its apps include numerous safeguards and features designed to keep children safe.

The case cites various internal documents made public by Meta whistleblower Frances Haugen in 2021, which it says show the company repeatedly disregarded internal evidence that Instagram was harming underage users.

“This country prohibits minors from having access to other addictive products, such as tobacco and alcohol, due to the physical and psychological harm they can cause,” the lawsuit states. “Social media is no exception, and Meta’s own documents demonstrate that the company is aware of the harm its products cause.

“Nonetheless, Meta has done little to improve its social media offerings or to restrict young users’ access to them. In fact, a child can sign up for Meta’s dangerous products in minutes, with no parental or guardian supervision or approval…

“Meta’s conduct has harmed [the] plaintiff and [other children], and will continue to harm them unless and until it is stopped.”

‘Overwhelmed by anxiety’
According to Monday’s lawsuit, “AA” is a 13-year-old New Yorker who began using Instagram at the age of 10 and now spends approximately five hours a day on the photo-sharing app, including up to an hour before bedtime. The lawsuit claims Meta did little to verify her age, despite Instagram’s rules prohibiting users under the age of 13.

As a result, the lawsuit claims, she is “unable to put her phone away” and “constantly checks Instagram while doing her homework,” prompting her to stay up late and rush through her assignments.

It further alleges that she becomes “overwhelmed with anxiety” when she does not check her notifications, and that she has “internalized the belief that her friends are constantly ignoring her” when they do not like or comment on her posts.

The case cites internal documents indicating that Meta prioritized recruiting more minors to its services in order to offset the waning popularity of its best-known platform, Facebook, and the gradual aging of its existing user base.

In pursuit of that goal, the complaint claims, Meta repeatedly dismissed internal and external warnings that its apps were having a particularly harmful impact on young users, such as prompting them to constantly compare themselves to others.

Among the features singled out for criticism is Instagram’s algorithmically sorted, infinitely scrolling feed, which the complaint says operates like a slot machine, enticing users to keep refreshing in the hope of an unpredictable reward.

Furthermore, external and internal studies found evidence that Instagram’s automated recommendation system was amplifying users’ most harmful inclinations, for example by detecting an interest in eating disorder content and recommending more of the same.

Internal research also revealed that these recommendation algorithms appeared to promote so-called “negative appearance comparison” (NAC) content, which made users feel jealous or bad about themselves, and that this content was harmful to young people’s wellbeing.

Other internal research and reports raised concerns about the volume of notifications, the psychological impact of displaying how many likes each post had received, and AI filters that made users appear to have undergone plastic surgery.

Meanwhile, Meta was well aware that an estimated 4 million under-13s in the United States were using its services in violation of its policies, and that its age verification mechanisms posed few obstacles to underage users.

Nonetheless, the lawsuit claims that Meta routinely dismissed these concerns and turned down opportunities to fix or mitigate known issues, often at the personal request of CEO Mark Zuckerberg.

“Instead of warning parents and young users of the dangers of Instagram, Meta has gone to great lengths to solicit increased numbers of young users to join and spend more time on their platforms,” the plaintiff’s lawyers wrote.
