PRIME MINISTER BY DEFAULT

Theresa May is not backing down in her bid to regulate the internet more tightly and has said that social media firms should be fined for failing to remove extremist content.
Under proposals agreed by the UK and French governments, firms such as Facebook, Google and Twitter could face fines if they fail to remove extremist content and terrorist material.
May said that the two countries would work together to ensure the internet could not be used as a 'safe space' for terrorists and criminals, in the wake of terrorist attacks in Westminster, Manchester and London Bridge.
"The counter-terrorism co-operation between British and French intelligence agencies is already strong, but President Macron and I agree that more should be done to tackle the terrorist threat online," she said.
"In the UK we are already working with social media companies to halt the spread of extremist material and poisonous propaganda that is warping young minds.
"And today I can announce that the UK and France will work together to encourage corporations to do more and abide by their social responsibility to step up their efforts to remove harmful content from their networks, including exploring the possibility of creating a new legal liability for tech companies if they fail to remove unacceptable content.
"We are united in our total condemnation of terrorism and our commitment to stamp out this evil."
The two countries will be drawing up the new legislation, and will also work in tandem with technology companies to explore whether new tools to identify and remove material online automatically could be deployed.
Further details, including the potential size of the fines, are likely to be announced at a joint press conference in Paris later on Tuesday.
After the London Bridge attack, social media companies said they were working to ensure terrorist content is removed as swiftly as possible.
"We want Facebook to be a hostile environment for terrorists," said Simon Milner, Facebook's director of policy said.
"Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it and if we become aware of an emergency involving imminent harm to someone's safety, we notify law enforcement," he added.
Twitter and Google, which owns YouTube, also said they were working on taking terrorist and harmful content off their respective networks.