GOOGLE HAS BEEN QUICK TO FOLLOW in Facebook's footsteps with the announcement that it's also going to use AI to better tackle terrorist content.
"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all," Kent Walker, general counsel at Google, said in a blog post.
"Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services."
In a bid to better crack down on such content, Google has announced that it'll plough more investment into its machine learning technology to improve its ability to automatically detect and remove terrorist content while keeping innocent videos, such as a BBC News report, online.
Because AI ain't all that just yet, Google has said it will also hire a bunch of new staffers to keep the computers in check. The firm said it will "greatly increase" the number of independent experts in YouTube's Trusted Flagger programme, noting: "Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech."
Step three of Google's four-step plan is to take a tougher stance on videos that don't blatantly violate its policies. For example, videos that contain inflammatory religious or supremacist content will be placed behind a warning and will be prevented from getting ad revenue, comments or viewing recommendations.
Finally, Google said that, through a partnership with Jigsaw, YouTube will implement its 'Redirect Method' more broadly across Europe.
"This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining," Google explains.
Google's announcement that it will crack down on terrorist material comes just days after UK prime minister by default Theresa May said that social media firms should be fined for failing to remove extremist content. µ