‘Behavioral cocaine.’ CT and 41 other states sue Meta for addicting kids  

Connecticut and dozens of other states sued Meta Platforms Inc., the parent company of Instagram and Facebook, accusing it of purposely addicting younger users.

John Craven and Associated Press

Oct 24, 2023, 11:22 PM

Pretty much everyone is on Instagram and Facebook. But on Tuesday, Connecticut and dozens of other states sued their parent company, Meta Platforms Inc., accusing it of purposely addicting younger users.
“It's as if we are taking ‘behavioral cocaine,’” said Connecticut Attorney General William Tong. “Just like Big Tobacco, Meta targets millions of American and Connecticut children.”
“LIKE A GAMBLER AT A SLOT MACHINE”
The federal lawsuit claims that “Meta manipulates dopamine releases in its young users, inducing them to engage repeatedly with its platforms – much like a gambler at a slot machine.”
Tong said Meta’s own documents prove it, but those documents are blacked out in the lawsuit.
The suit, filed by 33 states in federal court in California, claims that Meta routinely collects data on children under 13 without their parents' consent, in violation of federal law. In addition, nine attorneys general are filing lawsuits in their respective states.
“Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame,” said New York Attorney General Letitia James in a statement. “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
INVESTIGATION STARTED IN 2021
The states launched their investigation two years ago, after damning newspaper reports based on Meta's own research found that the company knew about the harms Instagram can cause teenagers, especially teen girls, when it comes to mental health and body image issues. One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse, and 17% of teen girls saying it makes eating disorders worse.
Following the first reports, a consortium of news organizations, including The Associated Press, published their own findings based on leaked documents from whistleblower Frances Haugen, who has testified before Congress and a British parliamentary committee about what she found.
“The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people,” Haugen told Congress in October 2021.
META RESPONDS
A Meta spokesperson said the company is aggressively addressing teen safety, pointing to stricter age verification and parental supervision tools.
“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families,” the company said in a statement. “We're disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
In 2021, Facebook officials detailed their efforts to Congress.
“We've built AI [Artificial Intelligence] to identify suicide content on our platform and rapidly respond with resources,” Antigone Davis, Facebook’s global safety director, testified in September 2021. “We've launched tools to help control time spent on our apps.”
But the lawsuit accused Meta of blowing smoke.
The suit alleges that “Meta appears to be expanding the use of these practices into new platforms,” including its virtual reality metaverse, WhatsApp and Facebook Messenger.
AGE RESTRICTIONS
To comply with federal regulation, social media companies ban kids under 13 from signing up to their platforms — but children have been shown to easily get around the bans, both with and without their parents’ consent, and many younger kids have social media accounts. The states' complaint says Meta knowingly violated this law, the Children’s Online Privacy Protection Act, by collecting data on children without informing and getting permission from their parents.
Other measures social platforms have taken to address concerns about children’s mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to keep watching.
Other social media platforms are not part of Tuesday's lawsuit, but Tong said the states are also investigating TikTok.

