Online Safety Measures for Children Face Broad Lobbying Opposition


By Oma Seddiq and Kate Ackley

  • Lawmakers seek to pass kids’ safety and privacy legislation
  • Critics say bills would undermine protections for all users


Congressional efforts to strengthen children’s safety and privacy online amid a youth mental health crisis are running into opposition from lobbyists who warn legislative proposals would be counterproductive.

“What is so frustrating here is that the special interests, these big tech platforms, have outsized influence, and parents and kids are just kind of left out there to grapple with the harms of mental health concerns and anxiety,” said Rep. Kathy Castor (D-Fla.), who’s introduced a child privacy bill (H.R. 2801). “It’s just so overdue for some action by the Congress.”

Advocates for kids’ mental health have decried social media algorithms that they say promote eating disorders, self-harm, and substance abuse. Families and schools have sued major platforms, including Snap Inc.’s Snapchat, Meta Platforms Inc.’s Instagram, and TikTok Inc., over alleged harm to teens’ well-being.

A motley collection of LGBTQ+ advocates, free speech crusaders, and technology nonprofits maintains that the legislation, while laudable, risks creating more online surveillance and censorship that threaten the privacy and safety of all internet users.

“The bills are, on balance, gonna hurt more people than they’re gonna help,” said India McKinney, director of federal affairs at the Electronic Frontier Foundation, a digital rights group that’s urged Congress to block the legislation.

Teenagers check their smartphones outside the Natural History Museum in Washington on April 8, 2015. Photo: Nicholas Kamm/AFP via Getty Images

Behind the scenes, social media giants’ recent federal lobbying disclosures show a spending blitz over privacy and safety concerns. Lawmakers are digging in against the criticisms — and the tech industry — in hopes of approving legislation in the coming months, especially after past failures to address the problem.

“The crisis in mental health for young people is so great that Congress would have made an historic mistake if it did not act — and act now,” said Sen. Ed Markey (D-Mass.), a sponsor of one major child privacy bill (S. 1418). “I’m going to continue to drive this message until we get it passed before the end of this year.”

Safety Proposals

Youth suicide has surged in the past decade, making it the second leading cause of death among those aged 10 to 24 in 2021, according to the Centers for Disease Control and Prevention. At the same time, kids increasingly are online. Almost all teens had access to a smartphone in 2022 — up 22 percentage points from 2015, a Pew Research Center survey found.

Researchers can’t conclude that social media causes depression and anxiety among youth, but growing evidence suggests that excessive online use, cyberbullying and harassment, and violent content could worsen kids’ mental health.

“Every year you let this go unattended to by Congress is another year that kids are receiving inappropriate content, being taken to places that are unhealthy for them, when we know how to stop it,” said Danny Weiss, chief advocacy officer at Common Sense Media, which has been pushing for kids’ privacy legislation.

Common Sense Media spent at least $45,000 on federal lobbying in the first half of the year on issues “related to the impact of social media and technology on kids and families, including privacy and platform accountability,” according to disclosures. Last year, the group spent around $140,000 in total.

States such as California and Utah have moved ahead with their own laws creating online guardrails for children. In Congress, lawmakers have unveiled several approaches to improve kids’ experience online and have made tweaks in response to concerns about worsening privacy and safety.

See also: Health Data, Social Media ‘Top of Mind’ in Privacy Law Expansion

‘Aggravating’ and ‘Exploiting’

Markey’s bill, co-led with Republican Sen. Bill Cassidy (La.), would block internet platforms from gathering information from teenagers without their consent, updating a decades-old law that applied only to children younger than 13. The legislation, the Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, would also ban websites from targeting kids and teens with ads.

The Kids Online Safety Act, or KOSA, a similar measure Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) are championing, would establish a duty of care for social media companies to keep children safe from bullying and harassment, and from content that promotes suicide, substance abuse, eating disorders, and sexual exploitation.

It would also require platforms to create safeguards to protect kids’ data, limit personalized ads, and disable so-called addictive product features, such as videos playing automatically. Parents of kids younger than 13 would get tools to manage their child’s time spent online, and teenagers could opt in to those tools. Thirty-nine senators have signed on to the bill (S. 1409), which would be enforced by the Federal Trade Commission and state attorneys general.

“We are in the midst of a mental health crisis. It hasn’t been created alone by social media, but social media is aggravating it and exploiting it,” Blumenthal, sitting beside Blackburn, told a group of mothers fighting for KOSA’s passage at a June meeting. “They know what they’re doing, and they’re continuing to do it because it makes money for them.”

Separately, the Senate Judiciary Committee recently approved a handful of bills that seek to curb online child sexual exploitation (S. 1199, S. 412, S. 1207). Sens. Brian Schatz (D-Hawaii) and Tom Cotton (R-Ark.) also proposed an age verification requirement for social media use, blocking users younger than 13 (S. 1291).

Lobbying Push

Critics say such measures are misguided and flawed, arguing they could infringe on privacy and free speech rights, increase collection of user data, chip away at encryption protections, and push social media platforms, fearing liability, to over-moderate and block content that isn’t necessarily harmful.

“It gives the government way too much power,” McKinney, of the Electronic Frontier Foundation, said. “Giving the government power to decide what is safe and what is helpful and assuming that children are not at risk from the government — that’s also something that I take issue with.” The foundation spent $20,000 on federal lobbying in the first quarter of this year and $95,000 for all of last year, according to filings.

Online platforms would have difficulty complying with these rules without removing information that could be useful to teens or adults struggling with eating disorders or depression, for example, said Cody Venzke, senior policy counsel for surveillance, privacy, and technology at the American Civil Liberties Union.

The ACLU disclosed spending $250,000 on federal lobbying in the first three months of this year, less than it spent in the same period last year, and reported lobbying on KOSA, free speech, and privacy issues, among others, a filing shows.

Restricting content for kids and teens could also undermine the freedom to remain anonymous online, as companies trying to verify users’ ages could “intrude on the constitutional rights of both kids and adults alike,” Venzke said.

Online Community

LGBTQ+ youth report cyberbullying and harassment at higher rates than other young people, and they can find support and community on social media, said David Stacy, the Human Rights Campaign’s vice president of government affairs. The Human Rights Campaign spent about $305,000 on federal lobbying in the first quarter — considerably more than it spent in the same period last year — on issues including KOSA, according to a lobbying filing.

“We’re coming at it from, how do we address these problems and not limit access to some of the really positive things online?” Stacy said.

The opponents want lawmakers to modify the bills to mitigate such risks, or to block the legislation.

“They’re misunderstanding or having too optimistic a view as to how the legislation will play itself out,” said Samir Jain, vice president of policy at the Center for Democracy and Technology. The nonpartisan digital rights group’s website lists Microsoft Corp.’s corporate vice president and deputy general counsel among its board members.

Lawmakers don’t consider the bills an overreach.

“We have clearly, in the past, regulated communications with children through things like cartoons, or through marketing products,” Cassidy, the top Republican on the Senate Health, Education, Labor and Pensions Committee, said. “Now of course you can go overboard, but I don’t think we’re anywhere close to going overboard.”

Tech Industry Efforts

Groups such as NetChoice and the Computer and Communications Industry Association, which represent Alphabet Inc.’s Google, Meta, and other major tech companies, have similarly criticized the child safety bills. The platforms declined to comment on particular bills but, as public pressure intensifies, have touted their own steps to firm up protections for young users and their commitments to keeping kids safe online.

Meta, owner of Instagram and Facebook, disclosed shelling out almost $4.6 million for the first quarter of the year on matters of “internet security policy and internet privacy issues, federal privacy legislation, and freedom of expression on the internet, including connectivity, spectrum and access issues, local media issues,” according to its first-quarter report. Last year, Meta spent $19.2 million on federal lobbying, disclosures show.

“We want young people to have safe, positive experiences online,” a Meta spokesperson said in a statement. “That’s why we’ve built safety and privacy directly into teen experiences, and have developed more than 30 tools to support teens and families online, including parental supervision tools and age-appropriate protections across our technologies. We refer to research, feedback from parents, teens, experts, and academics to inform our approach, and we’ll continue evaluating proposed legislation and working with policymakers on these important issues.”

TikTok and Snap Inc. boosted their federal lobbying as lawmakers eyed their industry. Snap’s lobbying rose to almost $700,000 in 2022 and is poised to exceed that in 2023: the company spent $340,000 in the first three months of the year, the most it’s ever spent in one quarter, on issues “related to transparency, safety, and privacy,” according to lobbying disclosures filed with Congress.

As it increased its federal lobbying, Snap has sought to enhance protections on Snapchat by requiring users to be at least 13 to create an account, giving parents access to see who their kids are friends with, and alerting teens when they’ve accepted a friend request from an adult who may be a stranger.

Boosting Spending

TikTok has similarly announced new tools aimed at making the platform safer for teens, including parental controls, content filtering, and plans to launch a youth council where kids can share their views about their experiences on the app.

TikTok parent company ByteDance is on track this year to increase its spending on federal lobbying, disclosing almost $1.6 million in the first quarter. It spent $4.9 million last year. ByteDance disclosed lobbying on issues “related to privacy, data security, data localization, protecting children, intermediary liability, and platform/content moderation, including federal privacy legislation,” in the first quarter of this year, disclosures show.

Google, owner of the widely popular YouTube, has kept its lobbying expenditures relatively flat in recent years. The company reported $2.8 million for the first quarter of the year and disclosed lobbying on bills on children’s online privacy and safety. A Google representative didn’t return requests for comment.

Despite the industry’s skepticism, lawmakers say it’s just a matter of time until they advance legislation to protect kids online.

“No matter how much parents can be aware of it, we don’t have the ability to deal with this on our own,” Deb Schmill told Blackburn and Blumenthal in the June meeting. Schmill said her 18-year-old daughter, Becca, was cyberbullied, and died of fentanyl poisoning from drugs she and a friend purchased from a dealer they found on Facebook.

“We need help and we need social media companies to be responsible,” she said.

To contact the reporters on this story: Oma Seddiq at oseddiq@bloombergindustry.com; Kate Ackley at kackley@bloombergindustry.com

To contact the editors responsible for this story: Anna Yukhananov at ayukhananov@bloombergindustry.com; Robin Meszoly at rmeszoly@bgov.com
