Experts tell Senate panel mis- and disinformation campaigns are growing more sophisticated


One day after Facebook made global headlines with an announcement it had detected a new, coordinated, politically divisive foreign influence campaign of suspected Russian origin on its platform, a panel of experts told the Senate Intelligence Committee that influence campaigns stretch far beyond Facebook, target much more than elections, and increasingly originate from actors other than Russia.

"Over the past decade, disinformation, misinformation, and social media hoaxes have evolved from a nuisance into high-stakes information war," said Renee DiResta, Director of Research at New Knowledge, which specializes in identifying disinformation on social media.  "This will be one of the defining threats of our generation," she warned.

Mis- and disinformation campaigns are growing increasingly sophisticated, and the twin challenges of identifying manipulated content while still protecting anonymous free speech will rapidly grow thornier, the experts agreed.

Wednesday's hearing, which focused on foreign influence campaigns on social media and included five witnesses from specialized tech companies and policy think-tanks, was convened by the committee as part of its ongoing investigation into Russian interference in the 2016 election.

Though the committee noted the progress it had made – both in its investigation and in coaxing change and increased transparency from some social media companies – a number of senators acknowledged its work was far from finished.

"I'm concerned that even after 18 months of study, we are still only scratching the surface when it comes to Russia's information warfare," said Vice Chairman Mark Warner, D-Virginia, calling known activity by the Internet Research Company "a small fraction of the total Russian effort on social media."

The panelists agreed unanimously that Russia's interference efforts had not ceased since 2016 and that, in some cases, the Kremlin had "stepped on the gas," in the words of Dr. John Kelly of Graphika, a firm that analyzes social media.

"They are targeting both sides of our political spectrum simultaneously, both before the 2016 election and right now," Kelly said, adding that automated accounts on the far left and far right "produce as many as 25 to 30 times the number of messages per day on average" as compared to genuine political accounts.

"The extremes are screaming while the majority whispers," Kelly said.

Although well-known platforms like Facebook and Twitter have received outsized attention as more has come to light about the IRA's activities, a host of other social media entities have been leveraged in influence operations, members of the panel said. DiResta named YouTube, Gmail, Reddit, Tumblr, Medium, Pinterest and the video game Pokémon Go, among others.

Laura Rosenberger, director of the German Marshall Fund's Alliance for Securing Democracy, added that some platforms were used to target specific communities – Tumblr, for example, was used to target African Americans. Facebook and Twitter "represent only a segment of the broader information ecosystem," she said.

The number and variety of platforms, and the sheer diffuseness of the content, made it impossible to quantify how many Americans had actually been reached by manipulated material, the experts agreed.

Both Rosenberger and the Oxford Internet Institute's Philip Howard said other authoritarian governments besides Russia – and principally China – were actively investing in disinformation efforts. "We believe they have capacity, but as of yet they haven't set American voters in their sights," Howard said.

Beyond nation states, Kelly warned, there is also a "growing black market" of actors skilled in social media manipulation that countries other than Russia are hiring for their own political purposes. "This is a critical challenge," he said.

The panelists also stressed that there was growing evidence of influence operations targeting specific industries, with some actors disseminating anti-fracking messaging in areas with strong oil interests and stoking fear about GMOs in agricultural and farming communities.

In most cases, rancor and distrust, not advocacy, remain the goal, the experts said. "You don't just have one roomful of people who are running right-wing trolls and left-wing trolls," Kelly said. "It's the same people at the same computers."

"They're trying to play us like marionettes" he said.

At the hearing's conclusion, committee chairman Richard Burr, R-North Carolina, said he remained "optimistic" about the potential for progress, including on potential legislation. But he also generated headlines – and later, memes – of his own by invoking a popular image of a cartoon dog sitting placidly in a room consumed by fire.

"Some feel that we as a society are sitting in a burning room, calmly drinking a cup of coffee, telling ourselves, 'This is fine,'" Burr said. "That's not fine, and that's not the case. We should no longer be talking about 'if' the Russians attempted to interfere with American society."

"They've been doing it since the days of the Soviet Union, and they're still doing it today," Burr said.

The committee will hear from executives of Google, Facebook and Twitter on similar topics on September 5.
