BOFFINS at the University of Cambridge have established a centre to explore the dangers that advances in technology might pose to humanity.
The Centre for the Study of Existential Risk (CSER) will look at how developments in biotechnology, nanotechnology and artificial intelligence could potentially pose "extinction-level" risks to the human species.
While few would deny the benefits technology has brought, the centre will analyse whether further advances will help humanity survive or could instead lead to our extinction.
"At some point, this century or next, we may well be facing one of the major shifts in human history - perhaps even cosmic history - when intelligence escapes the constraints of biology," said Huw Price, a professor of philosophy and one of the CSER's three founders.
"We need to take seriously the possibility that there might be a 'Pandora's box' moment with artificial general intelligence (AGI) that, if missed, could be disastrous. With so much at stake, we need to do a better job of understanding the risks of potentially catastrophic technologies."
Jaan Tallinn, a former software engineer who co-founded Skype, will be working with Price on the project. The third CSER founder is the cosmologist Lord Martin Rees, former Master of Trinity College and former President of the Royal Society.
Price spoke of his hope that many other researchers will join the CSER project. "We hope that CSER will be a place where world-class minds from a variety of disciplines can collaborate in exploring technological risks in both the near and far future," he added.
The INQUIRER contacted the CSER founders to find out how much funding the research project will have but has yet to receive a response.
The launch of the centre is planned for next year.