Do you know what your kids are watching on social media?
You could soon have a lot more control over what they see under a proposal that Connecticut lawmakers considered on Wednesday.
But critics raised privacy and censorship concerns – and some doubt the law would even work.
“ON SCREENS SO OFTEN”
Buffy Cautela spent Wednesday babysitting two pre-teens, and a trip to the aquarium was a welcome break from social media.
“They’re on screens so often,” she said. “The older one, she’s on, like, a Chromebook all day.”
And those sites can recommend harmful content to kids.
“It’s a lot of peer pressure, you know?” said Nelsy Batista, of Norwalk. “So let’s see if they can do something about it.”
NEW SOCIAL MEDIA PROPOSAL
While Cautela was visiting the aquarium, Connecticut lawmakers considered a new bill designed to protect kids on social media platforms. The proposal, from Gov. Ned Lamont and state Attorney General William Tong, would require sites like TikTok and Instagram to verify users’ ages. Those under 18 would see a warning every three hours saying, “While social media may have benefits for some young users, social media is associated with significant mental health harms.”
Under the bill, kids could not receive notifications after 9 p.m., or content suggestions in general, without parental consent. It’s based on similar laws in New York and California.
COULD IT BACKFIRE?
But could the protections backfire – and actually hurt underage users? At a hearing on Wednesday, critics raised concerns about censorship.
“Unfortunately, a lot of LGBTQ kids do not come from supportive families, and they do turn to online spaces,” said Brianna January, with the Chamber of Progress.
The tech industry said removing algorithms could expose kids to even more harmful content.
“Algorithms are used not only to recommend content, but also to filter and downrank material that may be age-inappropriate or harmful,” said Nathan Trail, with the Information Technology Industry Council. “Limiting these tools could increase the likelihood that teens encounter inappropriate content because platforms may be restricted from using systems that currently help curate safer experiences.”
Supporters insisted that the proposal is “content-neutral.”
“This bill is not prohibiting a minor from searching for the content they want to view, but is restricting the tech companies from using addictive algorithms to decide what the minor sees,” said state Rep. Gary Turco (D-Newington).
Other lawmakers are skeptical that age verification can even work. “You’re really sort of leaving it up to the social media companies to determine whether somebody is under 18,” said state Rep. Dave Rutigliano (R-Trumbull).
Tong’s office said tech companies are creating ways to verify users’ age without requiring photo identification, which has raised privacy concerns among critics.
“They can use facial tech – facial recognition technology,” said Rebecca Borne, an assistant state attorney general. “They use the behavior of the individual on the platform.”
Regardless of the technology, Cautela thinks kids will find a way around the law.
“These kids are so smart,” she said. “I work at a middle school too, and they hack into the school and they’re able to unblock websites.”
WHAT’S NEXT?
A similar bill failed last year, but this year’s version is a priority for Lamont, who is running for reelection.
It faces a vote in the General Assembly’s General Law Committee in the next few weeks. If the law ultimately passes, it is likely to face legal challenges.