It’s About Time: Bringing Federal Policy Up to Speed with Our Youths’ Online Experience
A Digital Progress Institute Explainer
July 26, 2022
by Kate Forscey, Contributing Fellow at The Digital Progress Institute and Principal of KRF Strategies
The Internet hasn’t been a safe space for kids. From online sexual predation to bullying and harassment, from the promotion of eating disorders among teen girls to the encouragement of suicidal ideation, our children and teenagers are confronting a host of challenges once confined to adult life. The Wall Street Journal’s Facebook Files revealed Facebook’s own research showing that “Teens blame Instagram for increases in the rate of anxiety and depression”—or as the Journal put it: “Facebook Knows Instagram Is Toxic for Teen Girls.” Meanwhile, TikTok’s Blackout Challenge has led to the deaths of at least seven children, one only eight years old, swept up by a viral challenge daring kids to choke themselves until they blacked out, all while the company profited.
These reports provide widespread evidence of how dangerous online experiences can be amplified in online echo chambers and spill into the lives of our youth—with tragic consequences in the physical world.
For parents, turning off the spigot is no simple matter. Entirely denying access to social media can have a crushing effect on teens’ social lives. What is more, social media platforms rely on addictive features that prey on our natural physiology, keeping us coming back to their sites for another hit of dopamine; that’s especially true for kids, who don’t have the same self-control as adults.
That’s what makes the Senate Commerce Committee’s markup of two bipartisan bills on Wednesday so important. The first bill, the Children and Teens’ Online Privacy Protection Act (S.1628), introduced by Senators Ed Markey and Bill Cassidy, is a relatively straightforward kids’ privacy bill. It expands the privacy protections already enshrined in federal law for children to cover minors (those aged 12-16); limits how online operators can collect data from children and minors; restricts targeted advertising; and codifies into law the promise of platforms like Snapchat, TikTok, and YouTube to give parents the option to erase any personal information collected about their kids.
The second bill, the Kids Online Safety Act (S.3663), introduced by Senators Richard Blumenthal and Marsha Blackburn, is a solid piece of legislation, aiming to give kids and their parents the tools needed to protect against the serious harms minors face online. Although it’s a strong, thoughtful bill to make the Internet a safer space for young users, it’s venturing into new waters that require a deeper dive. So let’s take the plunge.
Getting to Know the Kids Online Safety Act
An overarching theme running throughout the bill is first and foremost transparency—something the evidence shows is terribly lacking in Silicon Valley. Specifically:
- Platforms must disclose their policies and practices, provide access to safeguards, and receive acknowledgement of that disclosure before letting a minor use their platform (sec. 5(a)).
- If a platform targets ads at minors, it must disclose information about the product being advertised and why it’s targeting minors, as well as how it obtained the minor’s personal data that informed this targeting in the first place (sec. 5(c)).
- Platforms must publish an annual independent audit to assess risks to minors, including the platforms’ compliance with federal protections and the meaningful steps they are taking to mitigate harm to minors (sec. 6(a)).
- Platforms must also give access to their data to academics and public interest organizations so they can independently verify their claims and to enable further research on harms to minors resulting from online activity (sec. 7).
Another major theme of the bill is user empowerment. Platforms must give parents the ability to safeguard children online rather than just trusting the platforms to do the right thing (as noted, trusting platforms alone has not worked). That means:
- New parental controls to protect the information of minors, disable addictive product features, and turn off algorithmic recommendations (secs. 4(a)-(b)).
- Turning parental controls on by default for minors, with notice to the minor, so that kids are protected as a matter of course while still allowing parents to turn off those features for minors who don’t need them (sec. 4(b)(3)-(4)).
- New reporting controls so that parents can more easily identify harmful behaviors, plus a dedicated channel for parents to report harms to their kids directly to the platform. The platform must establish an internal process to receive and respond to reports in a reasonable and timely manner (sec. 4(c)).
Perhaps most interesting is that the bill imposes a new duty of care on platforms: They must act in the best interests of the minors who use their services. Although duties of care are relatively common in the real world (think of minimum height requirements for rollercoasters or minimum age requirements for bars), they’re relatively rare online—in part because Section 230 of the Communications Act grants social media immunity from most such liability. And the duty is tailored to the harms demonstrated over the course of five congressional hearings:
- Platforms must protect minors from harms such as materials that promote self-harm, addiction-like behaviors, online bullying or harassment, sexual exploitation, and promotion of behavior unlawful for minors (e.g., gambling, tobacco use, drug use, or alcohol use) (sec. 3).
Two last notable features of the bill are worth mentioning. When it comes to enforcement, both the Federal Trade Commission and State Attorneys General will be on the beat (sec. 10). And when it comes to implementation, the bill creates a Kids Online Safety Council, made up of parents, experts, representatives from tech, federal agencies, State Attorneys General, and youth voices (sec. 11).
What Comes Next
So that’s an overview of the Kids Online Safety Act as it stands today, but we still have some way to go to get it (and the Children and Teens’ Online Privacy Protection Act) across the finish line. Notably, the bill will need to be voted out of committee, teed up for cloture on the Senate floor, and voted out—not to mention passed by the House (a much easier task) and signed by President Biden. And the clock is running out before the upcoming midterm elections.
That said, the Kids Online Safety Act has a lot going for it. It appears to have the bipartisan support needed to advance through the full Committee. It takes a narrowly tailored approach in that it prioritizes protecting our most vulnerable populations from some of the greatest dangers they face today. As Facebook whistleblower Frances Haugen put it, “Whatever your party affiliation, well-being of kids matters.” And the fact that Senator Cantwell has scheduled the bill for markup is an auspicious sign in and of itself.
What is more, the House Energy and Commerce Committee last week advanced its own bipartisan American Data Privacy and Protection Act (H.R. 8152) by a vote of 53-2. Although much broader in scope, that bill also includes provisions specifically prohibiting platforms from tracking, predicting, and manipulating minors online. That bodes well for the Kids Online Safety Act when it gets to the House. So pay close attention to what happens tomorrow at the Senate Commerce markup, which can be viewed virtually starting at 10 am. Fingers crossed, this may be a banner year for bringing our laws up to speed with our youths’ online experience.