
Senators aim to rewrite child safety rules on social media

Ranking member Sen. Marsha Blackburn, R-Tenn., speaks during a Senate Subcommittee on Consumer Protection, Product Safety, and Data Security hearing about online child safety in October. (Samuel Corum / Getty Images file photo)

Senators are introducing a bill aimed at keeping kids safe online amid mounting frustrations that popular apps including Instagram and YouTube don't do enough to protect their youngest users.

The bipartisan Kids Online Safety Act, introduced by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., is a grab bag of new rules and safeguards covering some of the biggest concerns that have emerged among lawmakers in the last year, as child safety has become a rare point of cross-party agreement.

"Big Tech has brazenly failed children and betrayed its trust, putting profits above safety," Blumenthal said in a statement. "The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content — and hold Big Tech accountable for deeply dangerous algorithms."

The bill would require apps to create stricter safety measures for users under 16 by default, including tools to protect against stalking, exploitation, addiction and "rabbit holes of dangerous material." They would have to build parental supervision tools and dedicated channels to report harm. Kids would be able to turn off recommendations based on algorithms that use their personal data.

Tech companies would have a "duty of care" to protect kids from content that promotes self-harm, suicide, eating disorders, substance abuse and sexual exploitation. They would be barred from showing ads to kids for products that are illegal to sell to them, like alcohol and tobacco.

The bill follows a series of contentious hearings on Capitol Hill over the role of social media in fueling a teenage mental health crisis and exposing kids to harms from bullying to drug abuse to predators.

"Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech's unwillingness to change," Blackburn said in a statement. She said the bill would set "necessary safety guiderails" and "give parents more peace of mind."

Concerns over kids' safety escalated last year with news that Facebook parent Meta was building a version of its Instagram app for 10-to-12-year-olds, and reached a crescendo with subsequent revelations from Facebook whistleblower Frances Haugen that the company's own research showed Instagram can be toxic for some of its youngest users. Lawmakers have also grilled executives from other apps popular with kids, including Snapchat, TikTok and Google's YouTube.

Attracting the next generation of users is a matter of existential importance for social media companies, and in particular for Meta, which is seeing growth slow at Facebook, the world's biggest social network.

Under pressure from lawmakers, regulators and advocacy groups, Instagram paused development of its kids' product last fall. But the app's head, Adam Mosseri, told lawmakers that the company still believes building an app for kids, with parental supervision, is the right thing to do.

Meta has said it supports new regulations on tech companies, and Mosseri has said the industry should come together to propose safety standards for kids on social media.

Editor's note: Meta pays NPR to license NPR content.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.