Alphabet Inc.’s Google says it is creating new policies and practices to suppress terrorism-related videos, a response to U.K. lawmakers who have said the internet is a petri dish for radical ideology.
Google will increase its use of technology to identify extremist and terrorism-related videos across its sites, which include YouTube, and will boost the number of people who screen for terrorism-related content, Google’s general counsel Kent Walker wrote in an editorial in the Financial Times on Sunday. The company will also be more aggressive in putting warnings on, and limiting the reach of, content that, while not officially forbidden, is still inflammatory.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done,” Walker wrote.
Google and other social media and search sites are facing pressure to quickly remove posts by terrorist groups such as Islamic State, also known as ISIS. Since an attack in London this month that killed seven people and injured 48, U.K. officials have focused on sites seen as enabling extremists to recruit followers, coordinate attacks and spread propaganda. Lawmakers have proposed new laws to regulate how social media platforms counter extremism online.
“We cannot allow this ideology the safe space it needs to breed,” U.K. Prime Minister Theresa May said earlier this month. The proposed legislation would force social networks to make user data available to domestic security forces.
In response, a spokesperson for the U.K. Home Office called on companies to work toward implementing technology that can identify, remove and even prevent extremist content from being widely distributed on their sites.
SOURCE: Lindsey Rupp and John Lauerman