YouTube says it's taken steps to remove terrorism-related content
The video sharing site YouTube says that over the last couple of months, it has taken steps to remove terrorism-related content, CBS News' Jeff Pegues reports.
One example is Anwar al-Awlaki. The radical American-born cleric was killed in 2011 by a U.S. drone strike in Yemen, but his teachings, which encourage violence, have lived on in tens of thousands of videos online.
At one time, there were more than 70,000 videos of al-Awlaki on YouTube, but now there are just over 18,000.
YouTube says it has stepped up its use of technology to flag terrorism-related videos, expanded its work with "counter-extremist groups to help identify content that may be being used to radicalize," and done more to amplify voices speaking out against terrorism.
In 2015, internet giant Google said that YouTube was so inundated that staff couldn't filter all terror-related content, complicating the struggle to halt the publication of terrorist propaganda and hostage videos.
Google Public Policy Manager Verity Harding said that about 300 hours of video material was being uploaded to YouTube every minute, making it virtually impossible for the company to filter all images.
By last year, though, Facebook, Microsoft, Twitter and YouTube had all announced they would join forces to more quickly identify the worst terrorist propaganda and prevent it from spreading online.
The new program creates a database of unique digital "fingerprints" to help automatically identify videos or images the companies could remove.
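The idea behind such a shared database can be sketched in a few lines. This is only an illustration under a simplifying assumption: it fingerprints files with a cryptographic hash (SHA-256), whereas the companies' actual systems use perceptual hashes designed to survive re-encoding and cropping; the class and method names here are invented for the example.

```python
import hashlib

class FingerprintDatabase:
    """Illustrative sketch of a shared content-fingerprint database.

    Assumption for this example: a plain SHA-256 hash stands in for the
    perceptual fingerprints real hash-sharing systems use.
    """

    def __init__(self):
        self._known = set()

    def fingerprint(self, content: bytes) -> str:
        # Compute a unique digital "fingerprint" of the file's bytes.
        return hashlib.sha256(content).hexdigest()

    def flag(self, content: bytes) -> None:
        # One participating company flags a known propaganda file...
        self._known.add(self.fingerprint(content))

    def is_known(self, content: bytes) -> bool:
        # ...and any company can automatically recognize a re-upload.
        return self.fingerprint(content) in self._known

db = FingerprintDatabase()
db.flag(b"flagged propaganda video bytes")
print(db.is_known(b"flagged propaganda video bytes"))  # True
print(db.is_known(b"unrelated video bytes"))           # False
```

Because only the fingerprints are shared, companies can match re-uploads across platforms without exchanging the underlying videos themselves.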
Social media has increasingly become a tool for recruiting and radicalization by the Islamic State of Iraq and Syria (ISIS). Its use by terror groups and supporters has added to the threat from so-called lone-wolf attacks and decreased the time from "flash to bang" -- or radicalization to violence -- with little or no time for law enforcement to follow evidentiary trails before an attack.