Euro leaders urge Facebook, Google to scrub terrorist content

NYPD's John Miller: "Juggernaut of propaganda" coming from terror groups (video, 2:05)

British Prime Minister Theresa May will be joined by the leaders of France and Italy in calling for terrorist content to be removed from the internet within one to two hours after it has been posted.

World leaders and internet giants including Facebook, Google and Microsoft are set to meet at the UN General Assembly in New York on Wednesday to discuss preventing terrorist use of the internet.

May will call upon the companies to go "further and faster" in making sure terrorist material is removed, as well as preventing it from being posted in the first place.

The meeting comes only a week after the UK suffered its fifth terrorist attack of 2017, in which 30 people were injured in an explosion on the London Underground. Groups like the self-described Islamic State are accused by politicians of using the internet to radicalize terrorists and instruct them on how to carry out attacks.

Members of the emergency services work outside Parsons Green underground station in west London on September 15, 2017, following an incident on a tube carriage at the station. (Getty)

The British government, along with the European Union and other political institutions, has repeatedly called upon internet companies to do more in the fight against terrorism. For their part, the companies have consistently tried to step up by funding research, establishing anti-extremism programs and putting artificial intelligence to work at identifying content automatically.

Twitter announced on Tuesday that in the first half of this year it had removed 299,649 accounts for the promotion of terrorism, 75 percent of which were pulled down before their first tweet.

Google on Wednesday announced a $5 million innovation fund for countering hate and extremism, with the first $1.3 million grant being awarded to the Institute for Strategic Dialogue, a counter-extremist organization in the UK.

"By funding experts like ISD, we hope to support sustainable solutions to extremism both online and offline," said Kent Walker, senior vice president for policy, legal, trust and safety at Google.org in a blog post. "We don't have all the answers, but we're committed to playing our part."

Walker will represent Google, as well as an industry body founded in June called the Global Internet Forum to Counter Terrorism, at the UN meeting on Wednesday. Along with representatives from Facebook and Microsoft, he will hear May call for a new target of one to two hours for removing terrorist material from the web, the window in which most of that material is disseminated.

"Terrorist groups are aware that links to their propaganda are being removed more quickly, and are placing a greater emphasis on disseminating content at speed in order to stay ahead," the British prime minister will say, according to prepared remarks.

The EU established a code of conduct earlier this year that the internet companies signed, promising to pull down illegal content within 24 hours of it being posted. Last week the European Commission was reported to be considering legislation on the issue if it wasn't happy with the progress made by spring 2018.

"Industry needs to go further and faster in automating the detection and removal of terrorist content online, and developing technological solutions which prevent it being uploaded in the first place," May is set to say Wednesday. "This is a global problem that transcends national interests."
