The chief executives of Facebook, Google, and Twitter are set to testify before Congress this week, as lawmakers on both sides of the aisle prepare to press the companies over the spread of false information that contributed to the deadly Jan. 6 Capitol attack.
Two House Energy and Commerce subcommittees will hold a joint hearing Thursday focused on the growth of misinformation and disinformation shared on Facebook, Twitter, and Alphabet Inc.'s Google and YouTube. The hearing will give lawmakers an opportunity to air their grievances and discuss legislative efforts targeting the broad liability shield that protects the companies from the legal ramifications of dangerous content on their platforms.
The panel’s Democrats, led by Chair Frank Pallone (D-N.J.), have sent letters of inquiry on four occasions just this year to the companies pressing for information on the platforms’ roles in promoting misinformation—false information shared regardless of the intent to mislead—and disinformation—intentionally false information shared with the intent to mislead.
During the Covid-19 pandemic, false information spread quickly on social media platforms, ranging from videos claiming the pandemic was fake or planned to claims that the Covid vaccine includes microchips. Disinformation was also spread online by groups supporting the QAnon conspiracy theory that former President Donald Trump won the November 2020 election, which contributed to Trump supporters’ attempt to overturn the formal election count in Congress on Jan. 6.
Building Their Case
Pallone and other Democrats sent letters on Feb. 23 and March 8 to Facebook CEO Mark Zuckerberg questioning the platform’s role in the sharing of misinformation and disinformation ahead of the Jan. 6 Capitol attack. They also wrote on March 3 to Google CEO Sundar Pichai about increased extremist content on the platform.
Committee Democrats sent letters last July and again on Feb. 2 to Facebook, Google, and Twitter demanding answers as part of an ongoing investigation into the companies’ handling of Covid-19 vaccine misinformation on their sites. They also sent separate letters specifically calling out Facebook for vaccine misinformation on its platform.
“For far too long, big tech has failed to acknowledge the role they’ve played in fomenting and elevating blatantly false information to its online audiences,” Pallone, Consumer Protection and Commerce Chair Jan Schakowsky (D-Ill.), and Communications and Technology Chair Mike Doyle (D-Pa.) said in a statement.
The committee’s top Republican, Cathy McMorris Rodgers (Wash.), led the panel’s GOP members in letters to all three companies on March 11 asking them to explain their content removal policies and whether they’ve coordinated on content moderation decision making.
The Republicans separately requested public feedback before the hearing on how to hold the companies accountable, saying the platforms have “broken any sense of trust that they can be fair stewards for speech and the truth.” The panel received thousands of responses to the request for public comments, according to a Republican committee aide.
Legislation Targets Liability Shield
The letters underscore the pressure the chief executives will face in front of the panels on Thursday, as an effort from both Democrats and Republicans to limit tech companies’ liability shield gains momentum. Lawmakers have set their sights on an overhaul of Section 230 of the Communications Decency Act of 1996, which prevents the companies from being sued over the majority of content on their platforms.
There’s partisan disagreement, however, on how to change the law. Democrats argue companies need to do more to remove extremist and hate content. Republicans claim the companies are going too far in censoring conservative voices, pointing to the ban on Trump’s accounts following the attack on the Capitol.
Legislation has been introduced this year to curb the liability protections for tech companies, but efforts in the House have so far been partisan. Several proposals are expected to be discussed during this week’s hearing.
Schakowsky will discuss her Online Consumer Protection Act—likely to be unveiled this week—which would remove liability protections if platforms violate their terms of service and allow for FTC enforcement and consumer lawsuits.
“To me, this hearing is really a call to action,” Schakowsky said in an email ahead of the hearing. “We need to make these companies more accountable to the American people,” which is what she said her bill would do.
Energy and Commerce Health Subcommittee Chair Anna Eshoo (D-Calif.) may discuss her Protecting Americans from Dangerous Algorithms Act, which she plans to reintroduce with Rep. Tom Malinowski (D-N.J.) a day before the hearing. The bill would limit liability shield protections if a platform’s algorithm is used to amplify or recommend content that incites hate speech, violence, or acts of terrorism.
Rep. Yvette Clarke (D-N.Y.) is working on a more narrowly focused bill known as the Civil Rights Modernization Act, which she could highlight at the hearing. The bill would amend Section 230 to ensure federal civil rights laws apply to tech companies’ targeted advertisements in an effort to stop the spread of hate speech online.
Even with Democratic control of Congress, any bill would need bipartisan support in the Senate to clear the 60-vote threshold to advance. But if the frustration with large tech companies continues to build among lawmakers and the public, Section 230 could see reforms in the near future.
Clarke stressed the importance of holding tech accountable for extremist speech, “so it doesn’t get to the point of harm to the American people or American institutions.”
To contact the reporter on this story: Rebecca Kern in Washington at firstname.lastname@example.org