Lawmakers vow stricter regulations on social media platforms to combat misinformation
House lawmakers promised that stricter regulations on social media platforms are now inevitable as the CEOs of Facebook, Twitter, and Google faced intense scrutiny from Democrats and Republicans alike at a hearing on Thursday. The hearing was aimed at addressing misinformation that spread on the platforms and contributed to the January 6 riot at the U.S. Capitol.
Democratic Congressman Frank Pallone Jr., chair of the House Energy and Commerce Committee, said that Facebook, Twitter, and Google "played a role in fomenting insurrection" and accused the platforms of handing a "megaphone" to extremists who spread misinformation.
"Your business model itself has become the problem and the time for self-regulation is over," Pallone Jr. said. "It's time we legislate and hold you accountable and that is what we are going to do."
Democratic Congressman Michael Doyle of Pennsylvania, who chairs the subcommittee on commerce and technology, told Facebook's Mark Zuckerberg, Twitter's Jack Dorsey, and Google's Sundar Pichai that their companies "failed to protect" users from the consequences of their creations and that their platforms are responsible for the deadly riot at the U.S. Capitol in January.
"That attack and the movement that motivated it started and was nourished on your platform," Doyle said. "Your platforms suggested groups for people to join, videos they should watch, and posts they should like, driving this movement forward with terrifying speed and efficiency."
Doyle said lawmakers want to hold the companies accountable and called for "audit authority" of their technologies. "We will legislate to stop this. The stakes are simply too high," Doyle said.
Republican Congressman Bill Bilirakis of Florida echoed similar sentiments, saying that the committee knows "how to get things done when we come together."
"We can do this with you or without you and we will," Bilirakis said regarding legislation to regulate the companies.
Lawmakers on both sides asked the three CEOs whether they should be held responsible for their roles in the January 6 attack at the U.S. Capitol, but all three skirted the question.
Facebook's Mark Zuckerberg said the people who participated in the insurrection should be the ones held responsible.
"President Trump gave a speech rejecting the results and calling on people to fight," Zuckerberg said. "I believe that the former president should be responsible for his words and that the people who broke the law should be responsible for their actions."
Zuckerberg acknowledged that not every piece of misinformation leading up to the attack at the Capitol was caught but argued that the company made its services "inhospitable to those who might do harm."
Pallone Jr. and Republican Congresswoman Cathy McMorris Rodgers both criticized the social media companies' use of algorithms to curate content and suggest posts for users to engage with.
"The dirty truth is that they are relying on algorithms to purposely promote conspiratorial, divisive, or extremist content so that they can take more money in ad dollars," Pallone Jr. said. "This is because the more outrageous and extremist the content, the more engagement and views these companies get from their users and more views equals more money."
McMorris Rodgers said the algorithms used by social platforms are harmful for the mental health of children and added that she doesn't want artificial intelligence manipulating kids.
Zuckerberg fired back, saying the claim that algorithms feed users content to make them angry is not true. "The division we see today is primarily the result of a political and media environment that drives Americans apart and we need to reckon with that," Zuckerberg said.
Facebook's chief also proposed some changes he wants to see to Section 230 of the 1996 Communications Decency Act. The federal law provides immunity to online platforms from liability for content that others post on their sites.
Zuckerberg said it is important that the entire law not be repealed, but called for large platforms to issue regular reports on each category of harmful content and how effective they are at removing it. He cautioned that changes to Section 230 might have a different impact on smaller platforms that don't have the same kind of resources as Facebook for content moderation.
"It would be reasonable to condition immunity for the larger platforms on having a generally effective system in place to moderate clearly illegal types of content," Zuckerberg said.
Ahead of the hearing, all three companies tried to highlight the work they have done in recent months to curb the spread of misinformation and harmful content on their sites.
Google said it has taken down 850,000 videos from YouTube related to dangerous or misleading COVID-19 medical information and blocked nearly 100 million COVID-related ads in 2020.
Facebook pointed out that it has referred billions of users to authoritative public health and election security sources. A Facebook spokesperson told CBS News the company removed 2 million posts containing misinformation about COVID-19 in February alone.
And Twitter highlighted that it has removed more than 22,000 tweets and challenged nearly 12 million accounts worldwide for posting COVID-19 related misinformation.